Bug 1719558

Summary: libvirtd crashed when starting a VM with an emulatorpin configuration
Product: Red Hat Enterprise Linux 7
Reporter: Fangge Jin <fjin>
Component: libvirt
Assignee: Libvirt Maintainers <libvirt-maint>
Status: CLOSED DUPLICATE
QA Contact: Virtualization Bugs <virt-bugs>
Severity: high
Docs Contact:
Priority: unspecified
Version: 7.7
CC: abologna, jiyan, lmen
Target Milestone: rc
Keywords: Regression
Target Release: ---
Hardware: x86_64
OS: Linux
Whiteboard:
Fixed In Version:
Doc Type: If docs needed, set a value
Doc Text:
Story Points: ---
Clone Of:
Environment:
Last Closed: 2019-06-12 07:47:17 UTC
Type: Bug
Regression: ---
Mount Type: ---
Documentation: ---
CRM:
Verified Versions:
Category: ---
oVirt Team: ---
RHEL 7.3 requirements from Atomic Host:
Cloudforms Team: ---
Target Upstream Version:
Embargoed:
Attachments:
libvirtd log (flags: none)

Description Fangge Jin 2019-06-12 06:12:32 UTC
Created attachment 1579638 [details]
libvirtd log

Description of problem:
libvirtd crashed when starting a VM with an emulatorpin configuration

Version-Release number of selected component:
libvirt-4.5.0-21.virtcov.el7.x86_64

How reproducible:
100%

Steps to Reproduce:
0. The host has 4 CPUs online:
# lscpu
Architecture:          x86_64
CPU op-mode(s):        32-bit, 64-bit
Byte Order:            Little Endian
CPU(s):                4
On-line CPU(s) list:   0-3
Thread(s) per core:    1
Core(s) per socket:    4
Socket(s):             1
NUMA node(s):          1
...

1. Prepare a VM XML with an emulatorpin configuration (a sketch of the pinning this element requests follows the steps below):
...
  <vcpu placement='static' cpuset='0-3' current='8'>16</vcpu>
  <cputune>
    <emulatorpin cpuset='3'/>
  </cputune>
...

2. Create the VM:
# virsh create min-rep.xml
error: Disconnected from qemu:///system due to end of file
error: Failed to create domain from min-rep.xml
error: End of file while reading data: Input/output error

3. Check the backtrace (the "end of file" errors above indicate that libvirtd itself died; frame #0 is analyzed under Additional info below):
Thread 1 (Thread 0x7f76aae70700 (LWP 15977)):
#0  0x00007f76bcb1ad99 in virBitmapCopy (dst=0x0, src=0x7f7684038e90) at util/virbitmap.c:164
#1  0x00007f766c3505a2 in qemuProcessInitCpuAffinity (vm=vm@entry=0x7f7640188930) at qemu/qemu_process.c:2395
#2  0x00007f766c35ba3b in qemuProcessLaunch (conn=conn@entry=0x7f7684000a00, driver=driver@entry=0x7f7640100d80, vm=vm@entry=0x7f7640188930, asyncJob=asyncJob@entry=QEMU_ASYNC_JOB_START, incoming=incoming@entry=0x0,
    snapshot=snapshot@entry=0x0, vmop=vmop@entry=VIR_NETDEV_VPORT_PROFILE_OP_CREATE, flags=flags@entry=17) at qemu/qemu_process.c:6521
#3  0x00007f766c3635ef in qemuProcessStart (conn=conn@entry=0x7f7684000a00, driver=driver@entry=0x7f7640100d80, vm=0x7f7640188930, updatedCPU=updatedCPU@entry=0x0, asyncJob=asyncJob@entry=QEMU_ASYNC_JOB_START,
    migrateFrom=migrateFrom@entry=0x0, migrateFd=migrateFd@entry=-1, migratePath=migratePath@entry=0x0, snapshot=snapshot@entry=0x0, vmop=vmop@entry=VIR_NETDEV_VPORT_PROFILE_OP_CREATE, flags=17, flags@entry=1)
    at qemu/qemu_process.c:6806
#4  0x00007f766c3e1cd9 in qemuDomainCreateXML (conn=0x7f7684000a00,
    xml=0x7f7684001d50 "<domain type='kvm'>\n  <name>rhel7.6</name>\n  <uuid>df899f5c-db94-48b2-867a-e0c266b59b7a</uuid>\n  <genid>001b2039-ca77-4352-ab4a-433521eabf48</genid>\n  <title>A short description - rhel7.6 full xml - o"..., flags=0) at qemu/qemu_driver.c:1745
#5  0x00007f76bcdd91d5 in virDomainCreateXML (conn=0x7f7684000a00,
    xmlDesc=0x7f7684001d50 "<domain type='kvm'>\n  <name>rhel7.6</name>\n  <uuid>df899f5c-db94-48b2-867a-e0c266b59b7a</uuid>\n  <genid>001b2039-ca77-4352-ab4a-433521eabf48</genid>\n  <title>A short description - rhel7.6 full xml - o"..., flags=0) at libvirt-domain.c:176
#6  0x000055c7008f5d77 in remoteDispatchDomainCreateXML (ret=0x7f7684001d20, args=0x7f7684001c60, rerr=0x7f76aae6fbc0, msg=0x55c701931e60, client=0x55c701932620, server=0x55c701910e30)
    at remote/remote_daemon_dispatch_stubs.h:4575
#7  remoteDispatchDomainCreateXMLHelper (server=0x55c701910e30, client=0x55c701932620, msg=0x55c701931e60, rerr=0x7f76aae6fbc0, args=0x7f7684001c60, ret=0x7f7684001d20) at remote/remote_daemon_dispatch_stubs.h:4553
#8  0x00007f76bcce1215 in virNetServerProgramDispatchCall (msg=0x55c701931e60, client=0x55c701932620, server=0x55c701910e30, prog=0x55c70192f650) at rpc/virnetserverprogram.c:437
#9  virNetServerProgramDispatch (prog=0x55c70192f650, server=server@entry=0x55c701910e30, client=client@entry=0x55c701932620, msg=0x55c701931e60) at rpc/virnetserverprogram.c:304
#10 0x00007f76bcce9bea in virNetServerProcessMsg (srv=srv@entry=0x55c701910e30, client=0x55c701932620, prog=<optimized out>, msg=0x55c701931e60) at rpc/virnetserver.c:143
#11 0x00007f76bcce9f51 in virNetServerHandleJob (jobOpaque=<optimized out>, opaque=0x55c701910e30) at rpc/virnetserver.c:164
#12 0x00007f76bcbc7b6c in virThreadPoolWorker (opaque=opaque@entry=0x55c701904e00) at util/virthreadpool.c:167
#13 0x00007f76bcbc691a in virThreadHelper (data=<optimized out>) at util/virthread.c:206
#14 0x00007f76b9ee1ea5 in start_thread (arg=0x7f76aae70700) at pthread_create.c:307
#15 0x00007f76b9c0a8cd in clone () at ../sysdeps/unix/sysv/linux/x86_64/clone.S:111
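
For context on step 1: the <emulatorpin cpuset='3'/> element asks libvirt to restrict QEMU's emulator thread(s) to host CPU 3, which libvirt applies through its virProcessSetAffinity() wrapper. Below is a minimal stand-alone sketch of that kind of pinning, done with the underlying Linux sched_setaffinity() call directly rather than any libvirt code:

/* Sketch: pin the calling thread to host CPU 3, the effect that
 * <emulatorpin cpuset='3'/> requests for the emulator thread(s). */
#define _GNU_SOURCE
#include <sched.h>
#include <stdio.h>

int main(void)
{
    cpu_set_t set;

    CPU_ZERO(&set);
    CPU_SET(3, &set);                       /* cpuset='3' -> only CPU 3 */

    /* pid 0 means "the calling thread" */
    if (sched_setaffinity(0, sizeof(set), &set) < 0) {
        perror("sched_setaffinity");
        return 1;
    }
    printf("pinned to CPU 3\n");
    return 0;
}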

Actual results:
libvirtd crashed

Expected results:
The VM starts successfully

Additional info:
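
The crash is a NULL-pointer dereference: frame #0 shows virBitmapCopy() entered with dst=0x0, so its first read of the destination bitmap faults and libvirtd dies with SIGSEGV. The snippet below is a simplified stand-in for that failure mode, not the real libvirt code; the actual fix presumably makes qemuProcessInitCpuAffinity() supply a valid destination bitmap rather than adding a guard like this one, but the guard shows exactly where the fault occurs:

/* Simplified stand-in for frame #0: with dst == NULL, reading
 * dst->nbits dereferences a NULL pointer (dst=0x0 in the backtrace).
 * Types and functions here are illustrative, not libvirt's. */
#include <stddef.h>
#include <stdio.h>
#include <string.h>

typedef struct {
    size_t nbits;              /* number of bits in the map */
    size_t map_len;            /* number of words in map[] */
    unsigned long *map;
} bitmap_sketch;

static int
bitmap_copy(bitmap_sketch *dst, const bitmap_sketch *src)
{
    if (!dst || !src)          /* without this, the next line faults */
        return -1;
    if (dst->nbits != src->nbits)
        return -1;
    memcpy(dst->map, src->map, src->map_len * sizeof(*src->map));
    return 0;
}

int main(void)
{
    unsigned long bits = 0x8;                /* bit 3 set: CPU 3 */
    bitmap_sketch src = { 4, 1, &bits };     /* host CPUs 0-3 */

    /* The crashing path passed a destination that was never set up;
     * with the guard this fails cleanly instead of killing the daemon. */
    if (bitmap_copy(NULL, &src) < 0)
        fprintf(stderr, "refusing to copy into a NULL bitmap\n");
    return 0;
}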

Comment 3 Andrea Bolognani 2019-06-12 07:47:17 UTC

*** This bug has been marked as a duplicate of bug 1718172 ***