Bug 877110
Summary: segmentation fault in qemuDomainObjEndJob
Product: Fedora
Component: libvirt
Version: 18
Status: CLOSED CURRENTRELEASE
Reporter: Richard W.M. Jones <rjones>
Assignee: Libvirt Maintainers <libvirt-maint>
QA Contact: Fedora Extras Quality Assurance <extras-qa>
Severity: unspecified
Priority: unspecified
Hardware: Unspecified
OS: Unspecified
CC: aprishchepo, berrange, clalancette, crobinso, dyasny, itamar, jdenemar, jforbes, jyang, laine, libvirt-maint, veillard, virt-maint
Doc Type: Bug Fix
Type: Bug
Last Closed: 2013-03-04 17:16:14 UTC
Description
Richard W.M. Jones
2012-11-15 17:39:11 UTC
I hit this bug again today. The stack trace is:

[New LWP 3334]
[New LWP 3337]
[New LWP 3335]
[New LWP 3336]
[New LWP 3338]
[New LWP 4380]
[New LWP 3330]
[New LWP 4381]
[New LWP 3331]
[New LWP 3332]
[New LWP 3333]
[New LWP 3339]
[New LWP 12058]
[New LWP 3329]
[Thread debugging using libthread_db enabled]
Using host libthread_db library "/lib64/libthread_db.so.1".
warning: Skipping deprecated .gdb_index section in /var/cache/abrt-di/usr/lib/debug/usr/lib64/librbd.so.1.0.0.debug. Do "set use-deprecated-index-sections on" before the file is read to use the section anyway.
Core was generated by `/usr/sbin/libvirtd --timeout=30'.
Program terminated with signal 11, Segmentation fault.
#0 qemuDomainObjBeginJobInternal (driver=driver@entry=0x7f728c06a4e0, driver_locked=driver_locked@entry=true, obj=obj@entry=0x7f72414866b0, job=job@entry=QEMU_JOB_DESTROY, asyncJob=asyncJob@entry=QEMU_ASYNC_JOB_NONE) at qemu/qemu_domain.c:771
771 priv->jobs_queued++;

Thread 14 (Thread 0x7f729e1ba840 (LWP 3329)):
#0 0x000000328c6e998d in poll () at ../sysdeps/unix/syscall-template.S:81 No locals.
#1 0x00000032b045f8bb in poll (__timeout=-1, __nfds=9, __fds=<optimized out>) at /usr/include/bits/poll2.h:46 No locals.
#2 virEventPollRunOnce () at util/event_poll.c:630 fds = 0x1765880 ret = <optimized out> timeout = -1 nfds = 9 __func__ = "virEventPollRunOnce" __FUNCTION__ = "virEventPollRunOnce"
#3 0x00000032b045e5c7 in virEventRunDefaultImpl () at util/event.c:247 __func__ = "virEventRunDefaultImpl"
#4 0x00000032b054c0ad in virNetServerRun (srv=srv@entry=0x1445a60) at rpc/virnetserver.c:748 timerid = 1 timerActive = 0 i = <optimized out> __FUNCTION__ = "virNetServerRun" __func__ = "virNetServerRun"
#5 0x000000000040c763 in main (argc=<optimized out>, argv=<optimized out>) at libvirtd.c:1583 srv = 0x1445a60 remote_config_file = 0x1445ed0 "/home/rjones/.config/libvirt/libvirtd.conf" statuswrite = -1 ret = 1 pid_file_fd = 4 pid_file = 0x1445f10 "/run/user/1000/libvirt/libvirtd.pid" sock_file = 0x1445ea0 "/run/user/1000/libvirt/libvirt-sock" sock_file_ro = 0x0 timeout = 30 verbose = 0 godaemon = 0 ipsock = 0 config = 0x1446210 privileged = <optimized out> implicit_conf = <optimized out> run_dir = 0x1445e80 "/run/user/1000/libvirt" old_umask = <optimized out> opts = {{name = 0x4338ca "verbose", has_arg = 0, flag = 0x7fff76856ee8, val = 1}, {name = 0x4338d2 "daemon", has_arg = 0, flag = 0x7fff76856eec, val = 1}, {name = 0x4338d9 "listen", has_arg = 0, flag = 0x7fff76856ef0, val = 1}, {name = 0x433a21 "config", has_arg = 1, flag = 0x0, val = 102}, {name = 0x433975 "timeout", has_arg = 1, flag = 0x0, val = 116}, {name = 0x4338e0 "pid-file", has_arg = 1, flag = 0x0, val = 112}, {name = 0x4338e9 "version", has_arg = 0, flag = 0x0, val = 129}, {name = 0x4338f1 "help", has_arg = 0, flag = 0x0, val = 63}, {name = 0x0, has_arg = 0, flag = 0x0, val = 0}} __func__ = "main" __FUNCTION__ = "main"

Thread 13 (Thread 0x7f72947fd700 (LWP 12058)):
#0 pthread_cond_wait@@GLIBC_2.3.2 () at ../nptl/sysdeps/unix/sysv/linux/x86_64/pthread_cond_wait.S:165 No locals.
#1 0x00000032b0470256 in virCondWait (c=c@entry=0x1445c10, m=m@entry=0x1445be8) at util/threads-pthread.c:117 ret = <optimized out>
#2 0x00000032b047069b in virThreadPoolWorker (opaque=opaque@entry=0x16293d0) at util/threadpool.c:103 data = 0x0 pool = 0x1445bb0 cond = 0x1445c10 priority = false job = 0x0
#3 0x00000032b0470089 in virThreadHelper (data=<optimized out>) at util/threads-pthread.c:161 args = 0x0 local = {func = 0x32b04704f0 <virThreadPoolWorker>, opaque = 0x16293d0}
#4 0x000000328ca07d15 in start_thread (arg=0x7f72947fd700) at pthread_create.c:308 __res = <optimized out> pd = 0x7f72947fd700 now = <optimized out> unwind_buf = {cancel_jmp_buf = {{jmp_buf = {140130094405376, -971735137444366802, 0, 21409488, 140130094405376, 22, 893574156048463406, -943876536309451218}, mask_was_saved = 0}}, priv = {pad = {0x0, 0x0, 0x0, 0x0}, data = {prev = 0x0, cleanup = 0x0, canceltype = 0}}} not_first_call = 0 pagesize_m1 = <optimized out> sp = <optimized out> freesize = <optimized out>
#5 0x000000328c6f246d in clone () at ../sysdeps/unix/sysv/linux/x86_64/clone.S:114 No locals.

Thread 12 (Thread 0x7f729958f700 (LWP 3339)):
#0 pthread_cond_wait@@GLIBC_2.3.2 () at ../nptl/sysdeps/unix/sysv/linux/x86_64/pthread_cond_wait.S:165 No locals.
#1 0x00000032b0470256 in virCondWait (c=c@entry=0x1445ca8, m=m@entry=0x1445be8) at util/threads-pthread.c:117 ret = <optimized out>
#2 0x00000032b04706bb in virThreadPoolWorker (opaque=opaque@entry=0x1436d50) at util/threadpool.c:103 data = 0x0 pool = 0x1445bb0 cond = 0x1445ca8 priority = true job = 0x0
#3 0x00000032b0470089 in virThreadHelper (data=<optimized out>) at util/threads-pthread.c:161 args = 0x0 local = {func = 0x32b04704f0 <virThreadPoolWorker>, opaque = 0x1436d50}
#4 0x000000328ca07d15 in start_thread (arg=0x7f729958f700) at pthread_create.c:308 __res = <optimized out> pd = 0x7f729958f700 now = <optimized out> unwind_buf = {cancel_jmp_buf = {{jmp_buf = {140130175743744, -971735137444366802, 0, 217101504512, 140130175743744, 21256800, 893549716610807342, -943876536309451218}, mask_was_saved = 0}}, priv = {pad = {0x0, 0x0, 0x0, 0x0}, data = {prev = 0x0, cleanup = 0x0, canceltype = 0}}} not_first_call = 0 pagesize_m1 = <optimized out> sp = <optimized out> freesize = <optimized out>
#5 0x000000328c6f246d in clone () at ../sysdeps/unix/sysv/linux/x86_64/clone.S:114 No locals.

Thread 11 (Thread 0x7f729c595700 (LWP 3333)):
#0 pthread_cond_wait@@GLIBC_2.3.2 () at ../nptl/sysdeps/unix/sysv/linux/x86_64/pthread_cond_wait.S:165 No locals.
#1 0x00000032b0470256 in virCondWait (c=c@entry=0x1445c10, m=m@entry=0x1445be8) at util/threads-pthread.c:117 ret = <optimized out>
#2 0x00000032b047069b in virThreadPoolWorker (opaque=opaque@entry=0x1436cc0) at util/threadpool.c:103 data = 0x0 pool = 0x1445bb0 cond = 0x1445c10 priority = false job = 0x0
#3 0x00000032b0470089 in virThreadHelper (data=<optimized out>) at util/threads-pthread.c:161 args = 0x0 local = {func = 0x32b04704f0 <virThreadPoolWorker>, opaque = 0x1436cc0}
#4 0x000000328ca07d15 in start_thread (arg=0x7f729c595700) at pthread_create.c:308 __res = <optimized out> pd = 0x7f729c595700 now = <optimized out> unwind_buf = {cancel_jmp_buf = {{jmp_buf = {140130226099968, -971735137444366802, 0, 217101504512, 140130226099968, 21256800, 893556301869413934, -943876536309451218}, mask_was_saved = 0}}, priv = {pad = {0x0, 0x0, 0x0, 0x0}, data = {prev = 0x0, cleanup = 0x0, canceltype = 0}}} not_first_call = 0 pagesize_m1 = <optimized out> sp = <optimized out> freesize = <optimized out>
#5 0x000000328c6f246d in clone () at ../sysdeps/unix/sysv/linux/x86_64/clone.S:114 No locals.

Thread 10 (Thread 0x7f729cd96700 (LWP 3332)):
#0 pthread_cond_wait@@GLIBC_2.3.2 () at ../nptl/sysdeps/unix/sysv/linux/x86_64/pthread_cond_wait.S:165 No locals.
#1 0x00000032b0470256 in virCondWait (c=c@entry=0x1445c10, m=m@entry=0x1445be8) at util/threads-pthread.c:117 ret = <optimized out>
#2 0x00000032b047069b in virThreadPoolWorker (opaque=opaque@entry=0x1436ba0) at util/threadpool.c:103 data = 0x0 pool = 0x1445bb0 cond = 0x1445c10 priority = false job = 0x0
#3 0x00000032b0470089 in virThreadHelper (data=<optimized out>) at util/threads-pthread.c:161 args = 0x0 local = {func = 0x32b04704f0 <virThreadPoolWorker>, opaque = 0x1436ba0}
#4 0x000000328ca07d15 in start_thread (arg=0x7f729cd96700) at pthread_create.c:308 __res = <optimized out> pd = 0x7f729cd96700 now = <optimized out> unwind_buf = {cancel_jmp_buf = {{jmp_buf = {140130234492672, -971735137444366802, 0, 217101504512, 140130234492672, 21256800, 893557400844170798, -943876536309451218}, mask_was_saved = 0}}, priv = {pad = {0x0, 0x0, 0x0, 0x0}, data = {prev = 0x0, cleanup = 0x0, canceltype = 0}}} not_first_call = 0 pagesize_m1 = <optimized out> sp = <optimized out> freesize = <optimized out>
#5 0x000000328c6f246d in clone () at ../sysdeps/unix/sysv/linux/x86_64/clone.S:114 No locals.

Thread 9 (Thread 0x7f729d597700 (LWP 3331)):
#0 pthread_cond_wait@@GLIBC_2.3.2 () at ../nptl/sysdeps/unix/sysv/linux/x86_64/pthread_cond_wait.S:165 No locals.
#1 0x00000032b0470256 in virCondWait (c=c@entry=0x1445c10, m=m@entry=0x1445be8) at util/threads-pthread.c:117 ret = <optimized out>
#2 0x00000032b047069b in virThreadPoolWorker (opaque=opaque@entry=0x1436a10) at util/threadpool.c:103 data = 0x0 pool = 0x1445bb0 cond = 0x1445c10 priority = false job = 0x0
#3 0x00000032b0470089 in virThreadHelper (data=<optimized out>) at util/threads-pthread.c:161 args = 0x0 local = {func = 0x32b04704f0 <virThreadPoolWorker>, opaque = 0x1436a10}
#4 0x000000328ca07d15 in start_thread (arg=0x7f729d597700) at pthread_create.c:308 __res = <optimized out> pd = 0x7f729d597700 now = <optimized out> unwind_buf = {cancel_jmp_buf = {{jmp_buf = {140130242885376, -971735137444366802, 0, 217101504512, 140130242885376, 21256800, 893558499818927662, -943876536309451218}, mask_was_saved = 0}}, priv = {pad = {0x0, 0x0, 0x0, 0x0}, data = {prev = 0x0, cleanup = 0x0, canceltype = 0}}} not_first_call = 0 pagesize_m1 = <optimized out> sp = <optimized out> freesize = <optimized out>
#5 0x000000328c6f246d in clone () at ../sysdeps/unix/sysv/linux/x86_64/clone.S:114 No locals.

Thread 8 (Thread 0x7f721b492700 (LWP 4381)):
#0 pthread_cond_wait@@GLIBC_2.3.2 () at ../nptl/sysdeps/unix/sysv/linux/x86_64/pthread_cond_wait.S:165 No locals.
#1 0x00000032b0470256 in virCondWait (c=c@entry=0x1445c10, m=m@entry=0x1445be8) at util/threads-pthread.c:117 ret = <optimized out>
#2 0x00000032b047069b in virThreadPoolWorker (opaque=opaque@entry=0x1640190) at util/threadpool.c:103 data = 0x0 pool = 0x1445bb0 cond = 0x1445c10 priority = false job = 0x0
#3 0x00000032b0470089 in virThreadHelper (data=<optimized out>) at util/threads-pthread.c:161 args = 0x0 local = {func = 0x32b04704f0 <virThreadPoolWorker>, opaque = 0x1640190}
#4 0x000000328ca07d15 in start_thread (arg=0x7f721b492700) at pthread_create.c:308 __res = <optimized out> pd = 0x7f721b492700 now = <optimized out> unwind_buf = {cancel_jmp_buf = {{jmp_buf = {140128060778240, -971735137444366802, 0, 217101504512, 140128060778240, 23332408, 893826641606538798, -943876536309451218}, mask_was_saved = 0}}, priv = {pad = {0x0, 0x0, 0x0, 0x0}, data = {prev = 0x0, cleanup = 0x0, canceltype = 0}}} not_first_call = 0 pagesize_m1 = <optimized out> sp = <optimized out> freesize = <optimized out>
#5 0x000000328c6f246d in clone () at ../sysdeps/unix/sysv/linux/x86_64/clone.S:114 No locals.

Thread 7 (Thread 0x7f729dd98700 (LWP 3330)):
#0 pthread_cond_wait@@GLIBC_2.3.2 () at ../nptl/sysdeps/unix/sysv/linux/x86_64/pthread_cond_wait.S:165 No locals.
#1 0x00000032b0470256 in virCondWait (c=c@entry=0x1445c10, m=m@entry=0x1445be8) at util/threads-pthread.c:117 ret = <optimized out>
#2 0x00000032b047069b in virThreadPoolWorker (opaque=opaque@entry=0x1436710) at util/threadpool.c:103 data = 0x0 pool = 0x1445bb0 cond = 0x1445c10 priority = false job = 0x0
#3 0x00000032b0470089 in virThreadHelper (data=<optimized out>) at util/threads-pthread.c:161 args = 0x0 local = {func = 0x32b04704f0 <virThreadPoolWorker>, opaque = 0x1436710}
#4 0x000000328ca07d15 in start_thread (arg=0x7f729dd98700) at pthread_create.c:308 __res = <optimized out> pd = 0x7f729dd98700 now = <optimized out> unwind_buf = {cancel_jmp_buf = {{jmp_buf = {140130251278080, -971735137444366802, 0, 217101504512, 140130251278080, 21256800, 893559598793684526, -943876536309451218}, mask_was_saved = 0}}, priv = {pad = {0x0, 0x0, 0x0, 0x0}, data = {prev = 0x0, cleanup = 0x0, canceltype = 0}}} not_first_call = 0 pagesize_m1 = <optimized out> sp = <optimized out> freesize = <optimized out>
#5 0x000000328c6f246d in clone () at ../sysdeps/unix/sysv/linux/x86_64/clone.S:114 No locals.

Thread 6 (Thread 0x7f721bc93700 (LWP 4380)):
#0 pthread_cond_wait@@GLIBC_2.3.2 () at ../nptl/sysdeps/unix/sysv/linux/x86_64/pthread_cond_wait.S:165 No locals.
#1 0x00000032b0470256 in virCondWait (c=c@entry=0x1445c10, m=m@entry=0x1445be8) at util/threads-pthread.c:117 ret = <optimized out>
#2 0x00000032b047069b in virThreadPoolWorker (opaque=opaque@entry=0x163fbe0) at util/threadpool.c:103 data = 0x0 pool = 0x1445bb0 cond = 0x1445c10 priority = false job = 0x0
#3 0x00000032b0470089 in virThreadHelper (data=<optimized out>) at util/threads-pthread.c:161 args = 0x0 local = {func = 0x32b04704f0 <virThreadPoolWorker>, opaque = 0x163fbe0}
#4 0x000000328ca07d15 in start_thread (arg=0x7f721bc93700) at pthread_create.c:308 __res = <optimized out> pd = 0x7f721bc93700 now = <optimized out> unwind_buf = {cancel_jmp_buf = {{jmp_buf = {140128069170944, -971735137444366802, 0, 217101504512, 140128069170944, 23331160, 893827740581295662, -943876536309451218}, mask_was_saved = 0}}, priv = {pad = {0x0, 0x0, 0x0, 0x0}, data = {prev = 0x0, cleanup = 0x0, canceltype = 0}}} not_first_call = 0 pagesize_m1 = <optimized out> sp = <optimized out> freesize = <optimized out>
#5 0x000000328c6f246d in clone () at ../sysdeps/unix/sysv/linux/x86_64/clone.S:114 No locals.

Thread 5 (Thread 0x7f7299d90700 (LWP 3338)):
#0 pthread_cond_wait@@GLIBC_2.3.2 () at ../nptl/sysdeps/unix/sysv/linux/x86_64/pthread_cond_wait.S:165 No locals.
#1 0x00000032b0470256 in virCondWait (c=c@entry=0x1445ca8, m=m@entry=0x1445be8) at util/threads-pthread.c:117 ret = <optimized out>
#2 0x00000032b04706bb in virThreadPoolWorker (opaque=opaque@entry=0x1436a10) at util/threadpool.c:103 data = 0x0 pool = 0x1445bb0 cond = 0x1445ca8 priority = true job = 0x0
#3 0x00000032b0470089 in virThreadHelper (data=<optimized out>) at util/threads-pthread.c:161 args = 0x0 local = {func = 0x32b04704f0 <virThreadPoolWorker>, opaque = 0x1436a10}
#4 0x000000328ca07d15 in start_thread (arg=0x7f7299d90700) at pthread_create.c:308 __res = <optimized out> pd = 0x7f7299d90700 now = <optimized out> unwind_buf = {cancel_jmp_buf = {{jmp_buf = {140130184136448, -971735137444366802, 0, 217101504512, 140130184136448, 21256800, 893550815585564206, -943876536309451218}, mask_was_saved = 0}}, priv = {pad = {0x0, 0x0, 0x0, 0x0}, data = {prev = 0x0, cleanup = 0x0, canceltype = 0}}} not_first_call = 0 pagesize_m1 = <optimized out> sp = <optimized out> freesize = <optimized out>
#5 0x000000328c6f246d in clone () at ../sysdeps/unix/sysv/linux/x86_64/clone.S:114 No locals.

Thread 4 (Thread 0x7f729ad92700 (LWP 3336)):
#0 pthread_cond_wait@@GLIBC_2.3.2 () at ../nptl/sysdeps/unix/sysv/linux/x86_64/pthread_cond_wait.S:165 No locals.
#1 0x00000032b0470256 in virCondWait (c=c@entry=0x1445ca8, m=m@entry=0x1445be8) at util/threads-pthread.c:117 ret = <optimized out>
#2 0x00000032b04706bb in virThreadPoolWorker (opaque=opaque@entry=0x1436a10) at util/threadpool.c:103 data = 0x0 pool = 0x1445bb0 cond = 0x1445ca8 priority = true job = 0x0
#3 0x00000032b0470089 in virThreadHelper (data=<optimized out>) at util/threads-pthread.c:161 args = 0x0 local = {func = 0x32b04704f0 <virThreadPoolWorker>, opaque = 0x1436a10}
#4 0x000000328ca07d15 in start_thread (arg=0x7f729ad92700) at pthread_create.c:308 __res = <optimized out> pd = 0x7f729ad92700 now = <optimized out> unwind_buf = {cancel_jmp_buf = {{jmp_buf = {140130200921856, -971735137444366802, 0, 217101504512, 140130200921856, 21256800, 893544204557153838, -943876536309451218}, mask_was_saved = 0}}, priv = {pad = {0x0, 0x0, 0x0, 0x0}, data = {prev = 0x0, cleanup = 0x0, canceltype = 0}}} not_first_call = 0 pagesize_m1 = <optimized out> sp = <optimized out> freesize = <optimized out>
#5 0x000000328c6f246d in clone () at ../sysdeps/unix/sysv/linux/x86_64/clone.S:114 No locals.

Thread 3 (Thread 0x7f729b593700 (LWP 3335)):
#0 pthread_cond_wait@@GLIBC_2.3.2 () at ../nptl/sysdeps/unix/sysv/linux/x86_64/pthread_cond_wait.S:165 No locals.
#1 0x00000032b0470256 in virCondWait (c=c@entry=0x1445ca8, m=m@entry=0x1445be8) at util/threads-pthread.c:117 ret = <optimized out>
#2 0x00000032b04706bb in virThreadPoolWorker (opaque=opaque@entry=0x1436cc0) at util/threadpool.c:103 data = 0x0 pool = 0x1445bb0 cond = 0x1445ca8 priority = true job = 0x0
#3 0x00000032b0470089 in virThreadHelper (data=<optimized out>) at util/threads-pthread.c:161 args = 0x0 local = {func = 0x32b04704f0 <virThreadPoolWorker>, opaque = 0x1436cc0}
#4 0x000000328ca07d15 in start_thread (arg=0x7f729b593700) at pthread_create.c:308 __res = <optimized out> pd = 0x7f729b593700 now = <optimized out> unwind_buf = {cancel_jmp_buf = {{jmp_buf = {140130209314560, -971735137444366802, 0, 217101504512, 140130209314560, 21256800, 893545303531910702, -943876536309451218}, mask_was_saved = 0}}, priv = {pad = {0x0, 0x0, 0x0, 0x0}, data = {prev = 0x0, cleanup = 0x0, canceltype = 0}}} not_first_call = 0 pagesize_m1 = <optimized out> sp = <optimized out> freesize = <optimized out>
#5 0x000000328c6f246d in clone () at ../sysdeps/unix/sysv/linux/x86_64/clone.S:114 No locals.

Thread 2 (Thread 0x7f729a591700 (LWP 3337)):
#0 pthread_cond_wait@@GLIBC_2.3.2 () at ../nptl/sysdeps/unix/sysv/linux/x86_64/pthread_cond_wait.S:165 No locals.
#1 0x00000032b0470256 in virCondWait (c=c@entry=0x1445ca8, m=m@entry=0x1445be8) at util/threads-pthread.c:117 ret = <optimized out>
#2 0x00000032b04706bb in virThreadPoolWorker (opaque=opaque@entry=0x1436cc0) at util/threadpool.c:103 data = 0x0 pool = 0x1445bb0 cond = 0x1445ca8 priority = true job = 0x0
#3 0x00000032b0470089 in virThreadHelper (data=<optimized out>) at util/threads-pthread.c:161 args = 0x0 local = {func = 0x32b04704f0 <virThreadPoolWorker>, opaque = 0x1436cc0}
#4 0x000000328ca07d15 in start_thread (arg=0x7f729a591700) at pthread_create.c:308 __res = <optimized out> pd = 0x7f729a591700 now = <optimized out> unwind_buf = {cancel_jmp_buf = {{jmp_buf = {140130192529152, -971735137444366802, 0, 217101504512, 140130192529152, 21256800, 893543105582396974, -943876536309451218}, mask_was_saved = 0}}, priv = {pad = {0x0, 0x0, 0x0, 0x0}, data = {prev = 0x0, cleanup = 0x0, canceltype = 0}}} not_first_call = 0 pagesize_m1 = <optimized out> sp = <optimized out> freesize = <optimized out>
#5 0x000000328c6f246d in clone () at ../sysdeps/unix/sysv/linux/x86_64/clone.S:114 No locals.
Thread 1 (Thread 0x7f729bd94700 (LWP 3334)):
#0 qemuDomainObjBeginJobInternal (driver=driver@entry=0x7f728c06a4e0, driver_locked=driver_locked@entry=true, obj=obj@entry=0x7f72414866b0, job=job@entry=QEMU_JOB_DESTROY, asyncJob=asyncJob@entry=QEMU_ASYNC_JOB_NONE) at qemu/qemu_domain.c:771 priv = 0x0 now = <optimized out> then = <optimized out> nested = false __func__ = "qemuDomainObjBeginJobInternal" __FUNCTION__ = "qemuDomainObjBeginJobInternal"
#1 0x00007f7295e62b1a in qemuDomainObjBeginJobWithDriver (driver=driver@entry=0x7f728c06a4e0, obj=obj@entry=0x7f72414866b0, job=job@entry=QEMU_JOB_DESTROY) at qemu/qemu_domain.c:906 __FUNCTION__ = "qemuDomainObjBeginJobWithDriver"
#2 0x00007f7295ead814 in qemuDomainDestroyFlags (dom=<optimized out>, flags=<optimized out>) at qemu/qemu_driver.c:1970 driver = 0x7f728c06a4e0 vm = 0x7f72414866b0 ret = -1 event = 0x0 priv = 0x7f72414867b0 __FUNCTION__ = "qemuDomainDestroyFlags"
#3 0x00000032b04ef0a1 in virDomainDestroyFlags (domain=domain@entry=0x7f7281975d50, flags=1) at libvirt.c:2264 ret = <optimized out> conn = 0x7f725242a730 __func__ = "virDomainDestroyFlags" __FUNCTION__ = "virDomainDestroyFlags"
#4 0x0000000000415b3d in remoteDispatchDomainDestroyFlags (server=<optimized out>, msg=<optimized out>, args=0x7f7281975d90, rerr=0x7f729bd93c70, client=<optimized out>) at remote_dispatch.h:1329 dom = 0x7f7281975d50 priv = <optimized out>
#5 remoteDispatchDomainDestroyFlagsHelper (server=<optimized out>, client=<optimized out>, msg=<optimized out>, rerr=0x7f729bd93c70, args=0x7f7281975d90, ret=<optimized out>) at remote_dispatch.h:1307 __func__ = "remoteDispatchDomainDestroyFlagsHelper"
#6 0x00000032b054f5d2 in virNetServerProgramDispatchCall (msg=0x163c840, client=0x1642330, server=0x1445a60, prog=0x14688f0) at rpc/virnetserverprogram.c:431 ret = 0x7f7281a51ce0 "" rv = -1 i = <optimized out> arg = 0x7f7281975d90 "\300^\227\201r\177" dispatcher = 0x64bce0 <remoteProcs+11232> rerr = {code = 0, domain = 0, message = 0x0, level = 0, dom = 0x0, str1 = 0x0, str2 = 0x0, str3 = 0x0, int1 = 0, int2 = 0, net = 0x0}
#7 virNetServerProgramDispatch (prog=0x14688f0, server=server@entry=0x1445a60, client=0x1642330, msg=0x163c840) at rpc/virnetserverprogram.c:304 rerr = {code = 0, domain = 0, message = 0x0, level = 0, dom = 0x0, str1 = 0x0, str2 = 0x0, str3 = 0x0, int1 = 0, int2 = 0, net = 0x0} __func__ = "virNetServerProgramDispatch" __FUNCTION__ = "virNetServerProgramDispatch"
#8 0x00000032b054b701 in virNetServerProcessMsg (msg=<optimized out>, prog=<optimized out>, client=<optimized out>, srv=0x1445a60) at rpc/virnetserver.c:170 ret = -1
#9 virNetServerHandleJob (jobOpaque=<optimized out>, opaque=0x1445a60) at rpc/virnetserver.c:191 srv = 0x1445a60 job = 0x1643640 __func__ = "virNetServerHandleJob"
#10 0x00000032b04705fe in virThreadPoolWorker (opaque=opaque@entry=0x1436710) at util/threadpool.c:144 data = 0x0 pool = 0x1445bb0 cond = 0x1445c10 priority = false job = 0x16436a0
#11 0x00000032b0470089 in virThreadHelper (data=<optimized out>) at util/threads-pthread.c:161 args = 0x0 local = {func = 0x32b04704f0 <virThreadPoolWorker>, opaque = 0x1436710}
#12 0x000000328ca07d15 in start_thread (arg=0x7f729bd94700) at pthread_create.c:308 __res = <optimized out> pd = 0x7f729bd94700 now = <optimized out> unwind_buf = {cancel_jmp_buf = {{jmp_buf = {140130217707264, -971735137444366802, 0, 217101504512, 140130217707264, 21256800, 893546402506667566, -943876536309451218}, mask_was_saved = 0}}, priv = {pad = {0x0, 0x0, 0x0, 0x0}, data = {prev = 0x0, cleanup = 0x0, canceltype = 0}}} not_first_call = 0 pagesize_m1 = <optimized out> sp = <optimized out> freesize = <optimized out>
#13 0x000000328c6f246d in clone () at ../sysdeps/unix/sysv/linux/x86_64/clone.S:114 No locals.
From To Syms Read Shared Object Library
0x000000328ea00be0 0x000000328ea01224 Yes /lib64/libvirt-qemu.so.0
0x00000032b04514c0 0x00000032b064d49c Yes /lib64/libvirt.so.0
0x00000032ae801370 0x00000032ae802eac Yes /lib64/libcap-ng.so.0
0x00000032abe012b0 0x00000032abe065a8 Yes /lib64/libyajl.so.2
0x00000032aa6134a0 0x00000032aa630dd0 Yes /lib64/libnl-route-3.so.200
0x00000032aaa07cd0 0x00000032aaa11a84 Yes /lib64/libnl-3.so.200
0x00000032a5e028f0 0x00000032a5e08130 Yes /lib64/libaudit.so.1
0x00000032a6e06d40 0x00000032a6e2b428 Yes /lib64/libdevmapper.so.1.02
0x0000003291e07ab0 0x0000003291e312e4 Yes /lib64/libdbus-1.so.3
0x0000003292e2e870 0x0000003292f17470 Yes /lib64/libxml2.so.2
0x00000032a9e029e0 0x00000032a9e06260 Yes /lib64/libnuma.so.1
0x00000032aa203530 0x00000032aa20872c Yes /lib64/libavahi-common.so.3
0x00000032a9a039d0 0x00000032a9a0c0d4 Yes /lib64/libavahi-client.so.3
0x00000032a6218a70 0x00000032a62932fc Yes /lib64/libgnutls.so.26
0x000000329fe07200 0x000000329fe52e48 Yes /lib64/libgcrypt.so.11
0x00000032a5a04ba0 0x00000032a5a15d34 Yes (*) /lib64/libsasl2.so.2
0x00000032a8e05500 0x00000032a8e1ee78 Yes /lib64/libssh2.so.1
0x00000032a8209ad0 0x00000032a824e2c4 Yes /lib64/libcurl.so.4
0x00000032aec0d860 0x00000032aec27ad0 Yes /lib64/libwsman.so.1
0x00000032af403230 0x00000032af40681c Yes /lib64/libwsman_client.so.1
0x00000032af002020 0x00000032af003cf4 Yes /lib64/libwsman_curl_client_transport.so.1
0x000000328e606170 0x000000328e6175d4 Yes /lib64/libselinux.so.1
0x000000328d6022a0 0x000000328d60557c Yes /lib64/librt.so.1
0x000000328ca05790 0x000000328ca104b4 Yes /lib64/libpthread.so.0
0x00000032a4e00f10 0x00000032a4e01824 Yes /lib64/libutil.so.1
0x000000328d200ed0 0x000000328d2019f0 Yes /lib64/libdl.so.2
0x000000328c61f1a0 0x000000328c760940 Yes /lib64/libc.so.6
0x000000328de02a40 0x000000328de12168 Yes /lib64/libgcc_s.so.1
0x000000328c200b20 0x000000328c21a3d9 Yes /lib64/ld-linux-x86-64.so.2
0x000000328ce055b0 0x000000328ce6fd68 Yes /lib64/libm.so.6
0x00000032a2204140 0x00000032a2232704 Yes /lib64/libsepol.so.1
0x00000032922033e0 0x000000329220b910 Yes /lib64/libudev.so.1
0x000000328da02190 0x000000328da0e640 Yes /lib64/libz.so.1
0x00000032926030f0 0x0000003292619340 Yes /lib64/liblzma.so.5
0x00000032a1200990 0x00000032a1200ee8 Yes /lib64/libgpg-error.so.0
0x00000032a6601d00 0x00000032a660ca28 Yes /lib64/libtasn1.so.3
0x00000032a5602ed0 0x00000032a560cb7c Yes /lib64/libp11-kit.so.0
0x000000328ee03a30 0x000000328ee1200c Yes /lib64/libresolv.so.2
0x00000032a0200ed0 0x00000032a020610c Yes /lib64/libcrypt.so.1
0x00000032a4a16f30 0x00000032a4a4d074 Yes /lib64/libssl.so.10
0x00000032a0a61fc0 0x00000032a0b44af8 Yes /lib64/libcrypto.so.10
0x00000032a7e03010 0x00000032a7e07528 Yes /lib64/libidn.so.11
0x00000032a76036d0 0x00000032a760ab24 Yes /lib64/liblber-2.4.so.2
0x00000032a720f090 0x00000032a723d998 Yes /lib64/libldap-2.4.so.2
0x00000032a060ad90 0x00000032a0638264 Yes /lib64/libgssapi_krb5.so.2
0x000000329f61b690 0x000000329f693410 Yes /lib64/libkrb5.so.3
0x000000329fa044d0 0x000000329fa1c938 Yes /lib64/libk5crypto.so.3
0x000000329da01560 0x000000329da02144 Yes /lib64/libcom_err.so.2
0x00000032a1e0a140 0x00000032a1e2b738 Yes /lib64/libssl3.so
0x00000032a1a09d40 0x00000032a1a22400 Yes /lib64/libsmime3.so
0x000000329ee19530 0x000000329eefc8a4 Yes /lib64/libnss3.so
0x000000329de0bc00 0x000000329de19ac4 Yes /lib64/libnssutil3.so
0x000000329e600ff0 0x000000329e601f18 Yes /lib64/libplds4.so
0x000000329e201510 0x000000329e202bf4 Yes /lib64/libplc4.so
0x000000329ea0d350 0x000000329ea2cdf0 Yes /lib64/libnspr4.so
0x000000328e201db0 0x000000328e2452e8 Yes /lib64/libpcre.so.1
0x0000003291600da0 0x0000003291601bfa Yes /lib64/libsystemd-daemon.so.0
0x00000032a1603660 0x00000032a1647570 Yes /lib64/libfreebl3.so
0x000000329f202b50 0x000000329f2080cc Yes /lib64/libkrb5support.so.0
0x00000032a0e01190 0x00000032a0e01b44 Yes /lib64/libkeyutils.so.1
0x00007f729dfa9d50 0x00007f729dfb3d0c Yes /usr/lib64/pkcs11/gnome-keyring-pkcs11.so
0x00007f729dd9b1e0 0x00007f729dda267c Yes /lib64/libnss_files.so.2
0x00007f7298b7a0c0 0x00007f7298b87fcc Yes /usr/lib64/libvirt/connection-driver/libvirt_driver_network.so
0x00007f729894fd30 0x00007f729896925c Yes /usr/lib64/libvirt/connection-driver/libvirt_driver_storage.so
0x00007f729871bc20 0x00007f729873957c Yes /lib64/libblkid.so.1
0x00007f72984dac90 0x00007f7298505c1c Yes /lib64/librbd.so.1
0x00007f729805e130 0x00007f72982098ec Yes /lib64/librados.so.2
0x0000003294201510 0x0000003294202a8c Yes /lib64/libuuid.so.1
0x00007f7297b26210 0x00007f7297c7394c Yes /lib64/libcryptopp.so.6
0x000000329025bb80 0x00000032902c10bb Yes /lib64/libstdc++.so.6
0x00007f7297655d10 0x00007f72976601ec Yes /usr/lib64/libvirt/connection-driver/libvirt_driver_nodedev.so
0x000000328f202350 0x000000328f206530 Yes /lib64/libpciaccess.so.0
0x00007f7297444790 0x00007f729744d16c Yes /usr/lib64/libvirt/connection-driver/libvirt_driver_secret.so
0x00007f7297223cc0 0x00007f7297237fec Yes /usr/lib64/libvirt/connection-driver/libvirt_driver_nwfilter.so
0x00007f7296fc4720 0x00007f7296fe18fc Yes /lib64/libpcap.so.1
0x00007f7296db0790 0x00007f7296db8ecc Yes /usr/lib64/libvirt/connection-driver/libvirt_driver_interface.so
0x00007f7296b9f490 0x00007f7296ba839c Yes /lib64/libnetcf.so.1
0x00007f72969590e0 0x00007f7296988d40 Yes /lib64/libaugeas.so.0
0x00007f7296741d90 0x00007f729674ed5c Yes /lib64/libexslt.so.0
0x00000032acc0ac50 0x00000032acc2f7d4 Yes /lib64/libxslt.so.1
0x00007f729652d860 0x00007f729653a03c Yes /lib64/libfa.so.1
0x00007f7296301540 0x00007f729632014c Yes /usr/lib64/libvirt/connection-driver/libvirt_driver_xen.so
0x00007f72960ef060 0x00007f72960f13d8 Yes /lib64/libxenstore.so.3.0
0x00007f7295e451b0 0x00007f7295ec098c Yes /usr/lib64/libvirt/connection-driver/libvirt_driver_qemu.so
0x00007f7295c067d0 0x00007f7295c1b63c Yes /usr/lib64/libvirt/connection-driver/libvirt_driver_lxc.so
0x00007f72959e9f70 0x00007f72959f6a0c Yes /usr/lib64/libvirt/connection-driver/libvirt_driver_uml.so
0x00007f72957df260 0x00007f72957e1654 Yes (*) /usr/lib64/sasl2/libcrammd5.so
0x00007f72955da160 0x00007f72955dbd64 Yes (*) /usr/lib64/sasl2/libanonymous.so
0x00007f72953ccb70 0x00007f72953d52d4 Yes (*) /usr/lib64/sasl2/libdigestmd5.so
0x00007f72951c4890 0x00007f72951c8c54 Yes (*) /usr/lib64/sasl2/libgssapiv2.so
0x00007f7294fbf1b0 0x00007f7294fc1004 Yes (*) /usr/lib64/sasl2/libplain.so
0x00007f7294db8500 0x00007f7294dbb298 Yes (*) /usr/lib64/sasl2/libsasldb.so
0x00007f7294a322c0 0x00007f7294b71968 Yes /lib64/libdb-5.3.so
0x00007f72947ff150 0x00007f7294800ef4 Yes (*) /usr/lib64/sasl2/liblogin.so
(*): Shared library is missing debugging information.

$1 = 0x0
No symbol "__glib_assert_msg" in current context.

rax 0x1 1
rbx 0x7f72414866b0 140128698263216
rcx 0x2 2
rdx 0x7f72414866b0 140128698263216
rsi 0x1 1
rdi 0x7f729bd93a08 140130217703944
rbp 0x7f728c06a400 0x7f728c06a400
rsp 0x7f729bd93990 0x7f729bd93990
r8 0x0 0
r9 0x0 0
r10 0x0 0
r11 0x328c688d27 217104026919
r12 0x1 1
r13 0x7f72414867b0 140128698263472
r14 0x7f729bd93c10 140130217704464
r15 0x0 0
rip 0x7f7295e61992 0x7f7295e61992 <qemuDomainObjBeginJobInternal+50>
eflags 0x297 [ CF PF AF SF IF ]
cs 0x33 51
ss 0x2b 43
ds 0x0 0
es 0x0 0
fs 0x0 0
gs 0x0 0

Dump of assembler code for function qemuDomainObjBeginJobInternal:
   0x00007f7295e61960 <+0>: push %r15
   0x00007f7295e61962 <+2>: push %r14
   0x00007f7295e61964 <+4>: push %r13
   0x00007f7295e61966 <+6>: push %r12
   0x00007f7295e61968 <+8>: push %rbp
   0x00007f7295e61969 <+9>: push %rbx
   0x00007f7295e6196a <+10>: mov %rdx,%rbx
   0x00007f7295e6196d <+13>: sub $0x88,%rsp
   0x00007f7295e61974 <+20>: mov 0x70(%rdx),%r15
   0x00007f7295e61978 <+24>: cmp $0x8,%ecx
   0x00007f7295e6197b <+27>: sete %bpl
   0x00007f7295e6197f <+31>: mov %rdi,0x58(%rsp)
   0x00007f7295e61984 <+36>: lea 0x78(%rsp),%rdi
   0x00007f7295e61989 <+41>: mov %ecx,0x60(%rsp)
   0x00007f7295e6198d <+45>: mov %r8d,0x6c(%rsp)
=> 0x00007f7295e61992 <+50>: addl $0x1,0x16c(%r15)
   0x00007f7295e6199a <+58>: mov %sil,0x67(%rsp)
   0x00007f7295e6199f <+63>: callq 0x7f7295e44470 <virTimeMillisNow@plt>
   0x00007f7295e619a4 <+68>: test %eax,%eax
   0x00007f7295e619a6 <+70>: js 0x7f7295e61d40 <qemuDomainObjBeginJobInternal+992>
   0x00007f7295e619ac <+76>: mov 0x78(%rsp),%r13
   0x00007f7295e619b1 <+81>: mov %rbx,%rdi
   0x00007f7295e619b4 <+84>: callq 0x7f7295e45120 <virObjectRef@plt>
   0x00007f7295e619b9 <+89>: add $0x7530,%r13
   0x00007f7295e619c0 <+96>: cmpb $0x0,0x67(%rsp)
   0x00007f7295e619c5 <+101>: jne 0x7f7295e61ba0 <qemuDomainObjBeginJobInternal+576>
   0x00007f7295e619cb <+107>: mov 0x60(%rsp),%ecx
   0x00007f7295e619cf <+111>: mov $0x1,%r14d
   0x00007f7295e619d5 <+117>: lea 0x10(%rbx),%r12
   0x00007f7295e619d9 <+121>: sub $0x1,%ecx
   0x00007f7295e619dc <+124>: shl %cl,%r14d
   0x00007f7295e619df <+127>: mov %ecx,0x68(%rsp)
   0x00007f7295e619e3 <+131>: lea 0x38(%r15),%rcx
   0x00007f7295e619e7 <+135>: movslq %r14d,%r14
   0x00007f7295e619ea <+138>: mov %rcx,0x50(%rsp)
   0x00007f7295e619ef <+143>: mov 0x58(%rsp),%rdx
   0x00007f7295e619f4 <+148>: mov 0x12c(%rdx),%eax
   0x00007f7295e619fa <+154>: test %eax,%eax
   0x00007f7295e619fc <+156>: je 0x7f7295e61a07 <qemuDomainObjBeginJobInternal+167>
   0x00007f7295e619fe <+158>: cmp 0x16c(%r15),%eax
   0x00007f7295e61a05 <+165>: jl 0x7f7295e61a37 <qemuDomainObjBeginJobInternal+215>
   0x00007f7295e61a07 <+167>: test %bpl,%bpl
   0x00007f7295e61a0a <+170>: jne 0x7f7295e61a18 <qemuDomainObjBeginJobInternal+184>
   0x00007f7295e61a0c <+172>: mov 0x68(%r15),%eax
   0x00007f7295e61a10 <+176>: test %eax,%eax
   0x00007f7295e61a12 <+178>: jne 0x7f7295e61b70 <qemuDomainObjBeginJobInternal+528>
   0x00007f7295e61a18 <+184>: mov 0x30(%r15),%r11d
   0x00007f7295e61a1c <+188>: test %r11d,%r11d
   0x00007f7295e61a1f <+191>: je 0x7f7295e61c70 <qemuDomainObjBeginJobInternal+784>
   0x00007f7295e61a25 <+197>: mov %r13,%rdx
   0x00007f7295e61a28 <+200>: mov %r12,%rsi
   0x00007f7295e61a2b <+203>: mov %r15,%rdi
   0x00007f7295e61a2e <+206>: callq 0x7f7295e407b0 <virCondWaitUntil@plt>
   0x00007f7295e61a33 <+211>: test %eax,%eax
   0x00007f7295e61a35 <+213>: jns 0x7f7295e61a18 <qemuDomainObjBeginJobInternal+184>
   0x00007f7295e61a37 <+215>: mov 0x6c(%r15),%ecx
   0x00007f7295e61a3b <+219>: mov 0x34(%r15),%edx
   0x00007f7295e61a3f <+223>: mov 0x68(%r15),%edi
   0x00007f7295e61a43 <+227>: mov %edx,0x40(%rsp)
   0x00007f7295e61a47 <+231>: mov %ecx,0x48(%rsp)
   0x00007f7295e61a4b <+235>: callq 0x7f7295e44cc0 <qemuDomainAsyncJobTypeToString@plt>
   0x00007f7295e61a50 <+240>: mov 0x30(%r15),%edi
   0x00007f7295e61a54 <+244>: mov %rax,%r14
   0x00007f7295e61a57 <+247>: callq 0x7f7295e41130 <qemuDomainJobTypeToString@plt>
   0x00007f7295e61a5c <+252>: mov 0x6c(%rsp),%edi
   0x00007f7295e61a60 <+256>: mov %rax,%r13
   0x00007f7295e61a63 <+259>: mov 0x48(%rbx),%rax
   0x00007f7295e61a67 <+263>: mov 0x18(%rax),%r12
   0x00007f7295e61a6b <+267>: callq 0x7f7295e44cc0 <qemuDomainAsyncJobTypeToString@plt>
   0x00007f7295e61a70 <+272>: mov 0x60(%rsp),%edi
   0x00007f7295e61a74 <+276>: mov %rax,%rbp
   0x00007f7295e61a77 <+279>: callq 0x7f7295e41130 <qemuDomainJobTypeToString@plt>
   0x00007f7295e61a7c <+284>: mov 0x40(%rsp),%edx
   0x00007f7295e61a80 <+288>: mov 0x48(%rsp),%ecx
   0x00007f7295e61a84 <+292>: lea 0x6586d(%rip),%r9 # 0x7f7295ec72f8
   0x00007f7295e61a8b <+299>: lea 0x65e61(%rip),%rdi # 0x7f7295ec78f3
   0x00007f7295e61a92 <+306>: xor %r8d,%r8d
   0x00007f7295e61a95 <+309>: mov %rbp,0x8(%rsp)
   0x00007f7295e61a9a <+314>: mov %rax,(%rsp)
   0x00007f7295e61a9e <+318>: mov $0x3,%esi
   0x00007f7295e61aa3 <+323>: xor %eax,%eax
   0x00007f7295e61aa5 <+325>: mov %edx,0x28(%rsp)
   0x00007f7295e61aa9 <+329>: lea 0x66390(%rip),%rdx # 0x7f7295ec7e40 <__func__.23056>
   0x00007f7295e61ab0 <+336>: mov %ecx,0x30(%rsp)
   0x00007f7295e61ab4 <+340>: mov %r14,0x20(%rsp)
   0x00007f7295e61ab9 <+345>: mov $0x346,%ecx
   0x00007f7295e61abe <+350>: mov %r13,0x18(%rsp)
   0x00007f7295e61ac3 <+355>: mov %r12,0x10(%rsp)
   0x00007f7295e61ac8 <+360>: callq 0x7f7295e410f0 <virLogMessage@plt>
   0x00007f7295e61acd <+365>: callq 0x7f7295e40740 <__errno_location@plt>
   0x00007f7295e61ad2 <+370>: cmpl $0x6e,(%rax)
   0x00007f7295e61ad5 <+373>: mov %rax,%rbp
   0x00007f7295e61ad8 <+376>: je 0x7f7295e61c20 <qemuDomainObjBeginJobInternal+704>
   0x00007f7295e61ade <+382>: mov 0x58(%rsp),%rax
   0x00007f7295e61ae3 <+387>: mov 0x12c(%rax),%edx
   0x00007f7295e61ae9 <+393>: test %edx,%edx
   0x00007f7295e61aeb <+395>: je 0x7f7295e61afa <qemuDomainObjBeginJobInternal+410>
   0x00007f7295e61aed <+397>: cmp 0x16c(%r15),%edx
   0x00007f7295e61af4 <+404>: jl 0x7f7295e61bd0 <qemuDomainObjBeginJobInternal+624>
   0x00007f7295e61afa <+410>: lea 0x65e79(%rip),%rsi # 0x7f7295ec797a
   0x00007f7295e61b01 <+417>: lea 0x74646(%rip),%rdi # 0x7f7295ed614e
   0x00007f7295e61b08 <+424>: mov $0x5,%edx
   0x00007f7295e61b0d <+429>: callq 0x7f7295e414a0 <dcgettext@plt>
   0x00007f7295e61b12 <+434>: mov %rax,(%rsp)
   0x00007f7295e61b16 <+438>: mov 0x0(%rbp),%esi
   0x00007f7295e61b19 <+441>: lea 0x701b7(%rip),%r9 # 0x7f7295ed1cd7
   0x00007f7295e61b20 <+448>: lea 0x66339(%rip),%rcx # 0x7f7295ec7e60 <__FUNCTION__.23057>
   0x00007f7295e61b27 <+455>: lea 0x65dca(%rip),%rdx # 0x7f7295ec78f8
   0x00007f7295e61b2e <+462>: mov $0x352,%r8d
   0x00007f7295e61b34 <+468>: mov $0xa,%edi
   0x00007f7295e61b39 <+473>: xor %eax,%eax
   0x00007f7295e61b3b <+475>: callq 0x7f7295e44780 <virReportSystemErrorFull@plt>
   0x00007f7295e61b40 <+480>: subl $0x1,0x16c(%r15)
   0x00007f7295e61b48 <+488>: cmpb $0x0,0x67(%rsp)
   0x00007f7295e61b4d <+493>: jne 0x7f7295e61bb0 <qemuDomainObjBeginJobInternal+592>
   0x00007f7295e61b4f <+495>: mov %rbx,%rdi
   0x00007f7295e61b52 <+498>: callq 0x7f7295e43f10 <virObjectUnref@plt>
   0x00007f7295e61b57 <+503>: mov $0xffffffff,%eax
   0x00007f7295e61b5c <+508>: add $0x88,%rsp
   0x00007f7295e61b63 <+515>: pop %rbx
   0x00007f7295e61b64 <+516>: pop %rbp
   0x00007f7295e61b65 <+517>: pop %r12
   0x00007f7295e61b67 <+519>: pop %r13
   0x00007f7295e61b69 <+521>: pop %r14
   0x00007f7295e61b6b <+523>: pop %r15
   0x00007f7295e61b6d <+525>: retq
   0x00007f7295e61b6e <+526>: xchg %ax,%ax
   0x00007f7295e61b70 <+528>: test %r14,0x78(%r15)
   0x00007f7295e61b74 <+532>: jne 0x7f7295e61a18 <qemuDomainObjBeginJobInternal+184>
0x00007f7295e61b7a <+538>: mov 0x50(%rsp),%rdi 0x00007f7295e61b7f <+543>: mov %r13,%rdx 0x00007f7295e61b82 <+546>: mov %r12,%rsi 0x00007f7295e61b85 <+549>: callq 0x7f7295e407b0 <virCondWaitUntil@plt> 0x00007f7295e61b8a <+554>: test %eax,%eax 0x00007f7295e61b8c <+556>: jns 0x7f7295e61a0c <qemuDomainObjBeginJobInternal+172> 0x00007f7295e61b92 <+562>: jmpq 0x7f7295e61a37 <qemuDomainObjBeginJobInternal+215> 0x00007f7295e61b97 <+567>: nopw 0x0(%rax,%rax,1) 0x00007f7295e61ba0 <+576>: mov 0x58(%rsp),%rdi 0x00007f7295e61ba5 <+581>: callq 0x7f7295e41e20 <qemuDriverUnlock@plt> 0x00007f7295e61baa <+586>: jmpq 0x7f7295e619cb <qemuDomainObjBeginJobInternal+107> 0x00007f7295e61baf <+591>: nop 0x00007f7295e61bb0 <+592>: mov %rbx,%rdi 0x00007f7295e61bb3 <+595>: callq 0x7f7295e42070 <virDomainObjUnlock@plt> 0x00007f7295e61bb8 <+600>: mov 0x58(%rsp),%rdi 0x00007f7295e61bbd <+605>: callq 0x7f7295e41e70 <qemuDriverLock@plt> 0x00007f7295e61bc2 <+610>: mov %rbx,%rdi 0x00007f7295e61bc5 <+613>: callq 0x7f7295e422b0 <virDomainObjLock@plt> 0x00007f7295e61bca <+618>: jmpq 0x7f7295e61b4f <qemuDomainObjBeginJobInternal+495> 0x00007f7295e61bcf <+623>: nop 0x00007f7295e61bd0 <+624>: lea 0x657a1(%rip),%rsi # 0x7f7295ec7378 0x00007f7295e61bd7 <+631>: lea 0x74570(%rip),%rdi # 0x7f7295ed614e 0x00007f7295e61bde <+638>: mov $0x5,%edx 0x00007f7295e61be3 <+643>: callq 0x7f7295e414a0 <dcgettext@plt> 0x00007f7295e61be8 <+648>: lea 0x700e8(%rip),%r9 # 0x7f7295ed1cd7 0x00007f7295e61bef <+655>: lea 0x6626a(%rip),%rcx # 0x7f7295ec7e60 <__FUNCTION__.23057> 0x00007f7295e61bf6 <+662>: lea 0x65cfb(%rip),%rdx # 0x7f7295ec78f8 0x00007f7295e61bfd <+669>: mov %rax,(%rsp) 0x00007f7295e61c01 <+673>: mov $0x34f,%r8d 0x00007f7295e61c07 <+679>: mov $0x9,%esi 0x00007f7295e61c0c <+684>: mov $0xa,%edi 0x00007f7295e61c11 <+689>: xor %eax,%eax 0x00007f7295e61c13 <+691>: callq 0x7f7295e41a70 <virReportErrorHelper@plt> 0x00007f7295e61c18 <+696>: jmpq 0x7f7295e61b40 <qemuDomainObjBeginJobInternal+480> 0x00007f7295e61c1d <+701>: 
nopl (%rax) 0x00007f7295e61c20 <+704>: lea 0x65729(%rip),%rsi # 0x7f7295ec7350 0x00007f7295e61c27 <+711>: lea 0x74520(%rip),%rdi # 0x7f7295ed614e 0x00007f7295e61c2e <+718>: mov $0x5,%edx 0x00007f7295e61c33 <+723>: callq 0x7f7295e414a0 <dcgettext@plt> 0x00007f7295e61c38 <+728>: lea 0x70098(%rip),%r9 # 0x7f7295ed1cd7 0x00007f7295e61c3f <+735>: lea 0x6621a(%rip),%rcx # 0x7f7295ec7e60 <__FUNCTION__.23057> 0x00007f7295e61c46 <+742>: lea 0x65cab(%rip),%rdx # 0x7f7295ec78f8 0x00007f7295e61c4d <+749>: mov %rax,(%rsp) 0x00007f7295e61c51 <+753>: mov $0x34a,%r8d 0x00007f7295e61c57 <+759>: mov $0x44,%esi 0x00007f7295e61c5c <+764>: mov $0xa,%edi 0x00007f7295e61c61 <+769>: xor %eax,%eax 0x00007f7295e61c63 <+771>: callq 0x7f7295e41a70 <virReportErrorHelper@plt> 0x00007f7295e61c68 <+776>: jmpq 0x7f7295e61b40 <qemuDomainObjBeginJobInternal+480> 0x00007f7295e61c6d <+781>: nopl (%rax) 0x00007f7295e61c70 <+784>: test %bpl,%bpl 0x00007f7295e61c73 <+787>: jne 0x7f7295e61c88 <qemuDomainObjBeginJobInternal+808> 0x00007f7295e61c75 <+789>: mov 0x68(%r15),%r10d 0x00007f7295e61c79 <+793>: test %r10d,%r10d 0x00007f7295e61c7c <+796>: je 0x7f7295e61c88 <qemuDomainObjBeginJobInternal+808> 0x00007f7295e61c7e <+798>: test %r14,0x78(%r15) 0x00007f7295e61c82 <+802>: je 0x7f7295e619ef <qemuDomainObjBeginJobInternal+143> 0x00007f7295e61c88 <+808>: cmpl $0x7,0x60(%rsp) 0x00007f7295e61c8d <+813>: movl $0x0,0x30(%r15) 0x00007f7295e61c95 <+821>: movl $0x0,0x34(%r15) 0x00007f7295e61c9d <+829>: je 0x7f7295e61d66 <qemuDomainObjBeginJobInternal+1030> 0x00007f7295e61ca3 <+835>: mov 0x68(%r15),%edi 0x00007f7295e61ca7 <+839>: callq 0x7f7295e44cc0 <qemuDomainAsyncJobTypeToString@plt> 0x00007f7295e61cac <+844>: mov 0x60(%rsp),%edi 0x00007f7295e61cb0 <+848>: mov %rax,%rbp 0x00007f7295e61cb3 <+851>: callq 0x7f7295e41130 <qemuDomainJobTypeToString@plt> 0x00007f7295e61cb8 <+856>: lea 0x65c88(%rip),%r9 # 0x7f7295ec7947 0x00007f7295e61cbf <+863>: lea 0x6617a(%rip),%rdx # 0x7f7295ec7e40 <__func__.23056> 0x00007f7295e61cc6 
<+870>: lea 0x65c26(%rip),%rdi # 0x7f7295ec78f3 0x00007f7295e61ccd <+877>: mov %rax,(%rsp) 0x00007f7295e61cd1 <+881>: xor %r8d,%r8d 0x00007f7295e61cd4 <+884>: mov $0x327,%ecx 0x00007f7295e61cd9 <+889>: mov $0x1,%esi 0x00007f7295e61cde <+894>: xor %eax,%eax 0x00007f7295e61ce0 <+896>: mov %rbp,0x8(%rsp) 0x00007f7295e61ce5 <+901>: callq 0x7f7295e410f0 <virLogMessage@plt> 0x00007f7295e61cea <+906>: mov 0x60(%rsp),%ecx 0x00007f7295e61cee <+910>: mov %ecx,0x30(%r15) 0x00007f7295e61cf2 <+914>: callq 0x7f7295e406f0 <virThreadSelfID@plt> 0x00007f7295e61cf7 <+919>: mov %eax,0x34(%r15) 0x00007f7295e61cfb <+923>: cmpb $0x0,0x67(%rsp) 0x00007f7295e61d00 <+928>: jne 0x7f7295e61d4a <qemuDomainObjBeginJobInternal+1002> 0x00007f7295e61d02 <+930>: mov 0x68(%rsp),%ecx 0x00007f7295e61d06 <+934>: mov $0x42,%edx 0x00007f7295e61d0b <+939>: xor %eax,%eax 0x00007f7295e61d0d <+941>: bt %ecx,%edx 0x00007f7295e61d10 <+944>: jae 0x7f7295e61b5c <qemuDomainObjBeginJobInternal+508> 0x00007f7295e61d16 <+950>: mov 0x58(%rsp),%rsi 0x00007f7295e61d1b <+955>: mov 0x58(%rsp),%rdi 0x00007f7295e61d20 <+960>: mov %rbx,%rdx 0x00007f7295e61d23 <+963>: mov %eax,0x48(%rsp) 0x00007f7295e61d27 <+967>: add $0x130,%rsi 0x00007f7295e61d2e <+974>: sub $0xffffffffffffff80,%rdi 0x00007f7295e61d32 <+978>: callq 0x7f7295e615c0 <qemuDomainObjSaveJob> 0x00007f7295e61d37 <+983>: mov 0x48(%rsp),%eax 0x00007f7295e61d3b <+987>: jmpq 0x7f7295e61b5c <qemuDomainObjBeginJobInternal+508> 0x00007f7295e61d40 <+992>: mov $0xffffffff,%eax 0x00007f7295e61d45 <+997>: jmpq 0x7f7295e61b5c <qemuDomainObjBeginJobInternal+508> 0x00007f7295e61d4a <+1002>: mov %rbx,%rdi 0x00007f7295e61d4d <+1005>: callq 0x7f7295e42070 <virDomainObjUnlock@plt> 0x00007f7295e61d52 <+1010>: mov 0x58(%rsp),%rdi 0x00007f7295e61d57 <+1015>: callq 0x7f7295e41e70 <qemuDriverLock@plt> 0x00007f7295e61d5c <+1020>: mov %rbx,%rdi 0x00007f7295e61d5f <+1023>: callq 0x7f7295e422b0 <virDomainObjLock@plt> 0x00007f7295e61d64 <+1028>: jmp 0x7f7295e61d02 
<qemuDomainObjBeginJobInternal+930> 0x00007f7295e61d66 <+1030>: mov 0x6c(%rsp),%edi 0x00007f7295e61d6a <+1034>: callq 0x7f7295e44cc0 <qemuDomainAsyncJobTypeToString@plt> 0x00007f7295e61d6f <+1039>: lea 0x65bed(%rip),%r9 # 0x7f7295ec7963 0x00007f7295e61d76 <+1046>: lea 0x660c3(%rip),%rdx # 0x7f7295ec7e40 <__func__.23056> 0x00007f7295e61d7d <+1053>: lea 0x65b6f(%rip),%rdi # 0x7f7295ec78f3 0x00007f7295e61d84 <+1060>: xor %r8d,%r8d 0x00007f7295e61d87 <+1063>: mov $0x32c,%ecx 0x00007f7295e61d8c <+1068>: mov $0x1,%esi 0x00007f7295e61d91 <+1073>: mov %rax,(%rsp) 0x00007f7295e61d95 <+1077>: xor %eax,%eax 0x00007f7295e61d97 <+1079>: callq 0x7f7295e410f0 <virLogMessage@plt> 0x00007f7295e61d9c <+1084>: mov %r15,%rdi 0x00007f7295e61d9f <+1087>: callq 0x7f7295e60cc0 <qemuDomainObjResetAsyncJob> 0x00007f7295e61da4 <+1092>: mov 0x6c(%rsp),%eax 0x00007f7295e61da8 <+1096>: mov %eax,0x68(%r15) 0x00007f7295e61dac <+1100>: callq 0x7f7295e406f0 <virThreadSelfID@plt> 0x00007f7295e61db1 <+1105>: mov %eax,0x6c(%r15) 0x00007f7295e61db5 <+1109>: mov 0x78(%rsp),%rax 0x00007f7295e61dba <+1114>: mov %rax,0x80(%r15) 0x00007f7295e61dc1 <+1121>: jmpq 0x7f7295e61cfb <qemuDomainObjBeginJobInternal+923> End of assembler dump. This was with libvirt-0.10.2.2-3.fc18.x86_64. What looks like an instance of the same bug was also reported on the mailing list here: https://www.redhat.com/archives/libvir-list/2013-January/msg00020.html One possibly relevant point in common there is that both libguest & the mailing list reporter are using transient domains. Interesting. So qemuDomainObjBeginJobInternal gets NULL from obj->privateData at the very beginning of the function, while qemuDomainDestroyFlags saw a non-NULL value just few moments before. However, between reading that value and calling qemuDomainObjBeginJobWithDriver, qemuDomainDestroyFlags calls to qemuProcessKill(), which among other things unlocks the vm object and locks it again. I bet the vm->privateData gets set to NULL somewhere at that point. 
Created attachment 682368 [details]
Crash reproducer
The attached demo program can reproduce the problem faster than I could with libguestfs' test-parallel demo.
# gcc -Wall -o demo demo.c `pkg-config --cflags --libs libvirt` -lpthread
Confirmed that current GIT is affected too.

Using systemtap I was able to determine the trace of the last API call which freed a virDomainObjPtr instance:

  6.152 Del 7fb264002760: virDomainObj
   0x7fb2b6579bd1 : virObjectUnref+0x11a/0x1d0 [...home/berrange/src/virt/libvirt/src/.libs/libvirt.so.0.1000.1]
   0x4cb359 : 0x4cb359 [/home/berrange/src/virt/libvirt/daemon/.libs/lt-libvirtd+0xcb359/0x21a000]

Address 0x4cb359 corresponds to qemu_process.c:1094, which is qemuProcessHandleMonitorDestroy.

The object address 7fb264002760 corresponds to the object being accessed when the NULL pointer is dereferenced:

  #0 0x0000000000505e08 in qemuDomainObjBeginJobInternal (driver=0x7fb29806d310, driver_locked=true, obj=0x7fb264002760, job=QEMU_JOB_DESTROY, asyncJob=QEMU_ASYNC_JOB_NONE) at qemu/qemu_domain.c:772
  #1 0x0000000000506421 in qemuDomainObjBeginJobWithDriver (driver=0x7fb29806d310, obj=0x7fb264002760, job=QEMU_JOB_DESTROY) at qemu/qemu_domain.c:906
  #2 0x000000000047004e in qemuDomainDestroyFlags (dom=0x7fb2980ee060, flags=0) at qemu/qemu_driver.c:2142
  #3 0x0000000000470208 in qemuDomainDestroy (dom=0x7fb2980ee060) at qemu/qemu_driver.c:2183
  #4 0x00007fb2b662e918 in virDomainDestroy (domain=0x7fb2980ee060) at libvirt.c:2224

So we clearly have a reference counting problem here.

commit 81621f3e6e45e8681cc18ae49404736a0e772a11
Author: Daniel P. Berrange <berrange>
Date:   Fri Jan 18 14:33:51 2013 +0000

    Fix race condition when destroying guests

    When running virDomainDestroy, we need to make sure that no other
    background thread cleans up the domain while we're doing our work.
    This can happen if we release the domain object while in the middle
    of work, because the monitor might detect EOF in this window. For
    this reason we have a 'beingDestroyed' flag to stop the monitor from
    doing its normal cleanup.
    Unfortunately this flag was only being used to protect
    qemuDomainBeginJob, and not qemuProcessKill. This left open a race
    condition where either libvirtd could crash, or alternatively report
    bogus error messages to the caller about the domain already having
    been destroyed.

    Signed-off-by: Daniel P. Berrange <berrange>

I just backported the above patch and pushed it to libvirt's v0.10.2-maint branch, so it will automatically be pulled in the next time a new F18 build is done from that branch.

*** Bug 852490 has been marked as a duplicate of this bug. ***

libvirt-0.10.2.3-1.fc18 has been submitted as an update for Fedora 18.
https://admin.fedoraproject.org/updates/libvirt-0.10.2.3-1.fc18

*** Bug 905187 has been marked as a duplicate of this bug. ***

*** Bug 902887 has been marked as a duplicate of this bug. ***

libvirt-0.10.2.3-1.fc18 has been pushed to the Fedora 18 stable repository. If problems still persist, please make note of it in this bug report.