Bug 741139
| Summary: | spice mode - display freezes | | |
| --- | --- | --- | --- |
| Product: | [Fedora] Fedora | Reporter: | Richard Keech <rkeech> |
| Component: | virt-manager | Assignee: | Cole Robinson <crobinso> |
| Status: | CLOSED WORKSFORME | QA Contact: | Fedora Extras Quality Assurance <extras-qa> |
| Severity: | high | Docs Contact: | |
| Priority: | unspecified | | |
| Version: | 15 | CC: | berrange, cfeller, crobinso, dougsland, dpierce, erik-fedora, hbrock, jforbes, nobody, virt-maint |
| Target Milestone: | --- | | |
| Target Release: | --- | | |
| Hardware: | Unspecified | | |
| OS: | Linux | | |
| Whiteboard: | | | |
| Fixed In Version: | | Doc Type: | Bug Fix |
| Doc Text: | | Story Points: | --- |
| Clone Of: | | Environment: | |
| Last Closed: | 2012-01-25 21:51:30 UTC | Type: | --- |
| Regression: | --- | Mount Type: | --- |
| Documentation: | --- | CRM: | |
| Verified Versions: | | Category: | --- |
| oVirt Team: | --- | RHEL 7.3 requirements from Atomic Host: | |
| Cloudforms Team: | --- | Target Upstream Version: | |
| Embargoed: | | | |
Description
Richard Keech, 2011-09-26 00:30:45 UTC
Thanks for the report. Please run the following commands:

```
yum install yum-utils gdb
debuginfo-install virt-manager
```

Run virt-manager, reproduce the freeze, and keep the app running. Then:

```
ps axwww | grep virt-manager | grep -v grep
```

The first number that command shows is virt-manager's pid. Attach the debugger and dump all thread stacks:

```
gdb -p $pid
thread apply all bt
```

Copy that output to this bug report. Virt-manager doesn't have a debuginfo package because it's not a compiled executable; it runs on top of Python.

---

An initial backtrace shows that it's hung in poll(), which is called from libvirt. I've installed debuginfo for libvirt, python, and glib2, and this is what I get:

```
Thread 2 (Thread 0x7fef21764700 (LWP 29778)):
#0  0x000000342ecd7423 in __GI___poll (fds=<optimized out>, nfds=<optimized out>, timeout=<optimized out>) at ../sysdeps/unix/sysv/linux/poll.c:87
#1  0x0000003430c42d24 in g_main_context_poll (n_fds=1, fds=0x7fef1c001150, priority=<optimized out>, timeout=-1, context=0x2b2dce0) at gmain.c:3405
#2  g_main_context_iterate (context=0x2b2dce0, block=<optimized out>, dispatch=1, self=<optimized out>) at gmain.c:3087
#3  0x0000003430c4360d in g_main_loop_run (loop=0x23d0ef0) at gmain.c:3300
#4  0x00007fef28dc1564 in gdbus_shared_thread_func (data=<optimized out>) at gdbusprivate.c:276
#5  0x0000003430c683a6 in g_thread_create_proxy (data=0x2b2ddd0) at gthread.c:1955
#6  0x000000342f407b31 in start_thread (arg=0x7fef21764700) at pthread_create.c:305
#7  0x000000342ecdfd2d in clone () at ../sysdeps/unix/sysv/linux/x86_64/clone.S:115

Thread 1 (Thread 0x7fef3235c720 (LWP 29777)):
#0  0x000000342ecd7423 in __GI___poll (fds=<optimized out>, nfds=<optimized out>, timeout=<optimized out>) at ../sysdeps/unix/sysv/linux/poll.c:87
#1  0x0000003430c42d24 in g_main_context_poll (n_fds=16, fds=0x376b780, priority=<optimized out>, timeout=537, context=0x239d570) at gmain.c:3405
#2  g_main_context_iterate (context=0x239d570, block=<optimized out>, dispatch=1, self=<optimized out>) at gmain.c:3087
#3  0x0000003430c4360d in g_main_loop_run (loop=0x2e63780) at gmain.c:3300
#4  0x000000343914c007 in gtk_main () from /usr/lib64/libgtk-x11-2.0.so.0
#5  0x00007fef291dffa6 in ?? () from /usr/lib64/python2.7/site-packages/gtk-2.0/gtk/_gtk.so
#6  0x00000034348df592 in call_function (oparg=<optimized out>, pp_stack=0x7ffffc08bc68) at /usr/src/debug/Python-2.7.1/Python/ceval.c:4056
#7  PyEval_EvalFrameEx (f=<optimized out>, throwflag=<optimized out>) at /usr/src/debug/Python-2.7.1/Python/ceval.c:2722
#8  0x00000034348e0098 in fast_function (nk=<optimized out>, na=0, n=<optimized out>, pp_stack=0x7ffffc08bda8, func=<function at remote 0x1fe6ed8>) at /usr/src/debug/Python-2.7.1/Python/ceval.c:4158
#9  call_function (oparg=<optimized out>, pp_stack=0x7ffffc08bda8) at /usr/src/debug/Python-2.7.1/Python/ceval.c:4093
#10 PyEval_EvalFrameEx (f=<optimized out>, throwflag=<optimized out>) at /usr/src/debug/Python-2.7.1/Python/ceval.c:2722
```

---

Note, I haven't seen this fault for a couple of weeks now. Could it be that recent errata have fixed this problem?

---

Probably, closing as WORKSFORME. Thanks for all the info though.
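Both backtraces above bottom out in poll() inside the GLib main loop, which is why an idle main loop and a deadlocked one look identical at the C level: with timeout=-1 (the gdbus thread), poll() blocks until a file descriptor becomes ready, while the GTK thread polls with a finite timeout (537 ms) and wakes up even when nothing happens. A minimal Python sketch of that blocking behavior, using `select.poll` (not part of virt-manager; shown only as an illustration):

```python
import os
import select
import time

# A pipe with nothing written to it: the read end never becomes ready.
r, w = os.pipe()

p = select.poll()
p.register(r, select.POLLIN)

# With a finite timeout (milliseconds), poll() returns an empty event
# list once the timeout expires, like the GTK thread (timeout=537).
# With p.poll(-1) it would block forever, like the gdbus thread.
start = time.monotonic()
events = p.poll(200)
elapsed = time.monotonic() - start

print(events)           # empty list: no fd became ready
print(elapsed >= 0.15)  # we really waited for the timeout to expire

os.close(r)
os.close(w)
```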
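As the backtrace shows, gdb alone only reveals C-level frames for a Python application: frame #5 is an anonymous `??` inside `_gtk.so`, and the interpreter frames are just repeated `PyEval_EvalFrameEx` calls. On modern Python 3 the stdlib `faulthandler` module can dump the Python-level stack of every thread without gdb; it did not exist on the Python 2.7 used in this report, so the sketch below is purely illustrative of the technique:

```python
import faulthandler
import tempfile
import threading
import time

# Park a worker thread in a sleep so there is more than one stack to
# dump, loosely mirroring the two-thread backtrace above.
worker = threading.Thread(target=time.sleep, args=(2,), daemon=True)
worker.start()
time.sleep(0.2)  # give the worker time to enter sleep()

# faulthandler writes to a real file descriptor, so use a temp file.
with tempfile.TemporaryFile(mode="w+") as f:
    faulthandler.dump_traceback(file=f, all_threads=True)
    f.seek(0)
    dump = f.read()

# The dumping (main) thread is labeled "Current thread"; each stack
# lists "File ..., line N in ..." frames at the Python level.
print("Current thread" in dump)
```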