Description of problem:
I'm not sure of the exact process, but I created a VM, installed RHEL 6.1 on it, installed rhev-agent-2.3.14-1.el6.x86_64.rpm, and booted the machine. After the VM came up, the error below appeared in the log, and the VM stopped when I tried to suspend it.

Version-Release number of selected component (if applicable):
qemu-kvm-0.12.1.2-2.184.el6.x86_64

How reproducible:
Unknown

Steps to Reproduce:
See the Description.

Actual results:
The VM stopped, with an error in the log.

Expected results:
The VM is suspended.

Additional info:

Thread-48289::DEBUG::2011-09-26 16:59:58,053::resourceManager::821::ResourceManager.Owner::(cancelAll) Owner.cancelAll requests {}
Thread-48289::DEBUG::2011-09-26 16:59:58,053::resourceManager::517::ResourceManager::(releaseResource) Trying to release resource 'Storage.cbd3dd15-9125-4447-b30b-3bb4d6ae465d'
Thread-48289::DEBUG::2011-09-26 16:59:58,053::resourceManager::532::ResourceManager::(releaseResource) Released resource 'Storage.cbd3dd15-9125-4447-b30b-3bb4d6ae465d' (0 active users)
Thread-48289::DEBUG::2011-09-26 16:59:58,054::resourceManager::537::ResourceManager::(releaseResource) Resource 'Storage.cbd3dd15-9125-4447-b30b-3bb4d6ae465d' is free, finding out if anyone is waiting for it.
Thread-48289::DEBUG::2011-09-26 16:59:58,054::resourceManager::544::ResourceManager::(releaseResource) No one is waiting for resource 'Storage.cbd3dd15-9125-4447-b30b-3bb4d6ae465d', Clearing records.
Thread-48289::DEBUG::2011-09-26 16:59:58,054::task::495::TaskManager.Task::(_debug) Task 2b8f833b-57b7-4192-adb5-a4161adddda9: ref 0 aborting False
Thread-48289::INFO::2011-09-26 16:59:58,055::dispatcher::100::Storage.Dispatcher.Protect::(run) Run and protect: getVolumeSize, Return response: {'status': {'message': 'OK', 'code': 0}, 'truesize': '3221225472', 'apparentsize': '3221225472'}
Dummy-28011::DEBUG::2011-09-26 16:59:59,361::storage_mailbox::623::Storage.Misc.excCmd::(_checkForMail) 'dd if=/rhev/data-center/d419efb5-b015-4898-9b60-39ecd6641acd/mastersd/dom_md/inbox iflag=direct,fullblock count=1 bs=1024000' (cwd None)
Dummy-28011::DEBUG::2011-09-26 16:59:59,435::storage_mailbox::623::Storage.Misc.excCmd::(_checkForMail) SUCCESS: <err> = '1+0 records in\n1+0 records out\n1024000 bytes (1.0 MB) copied, 0.0375999 s, 27.2 MB/s\n'; <rc> = 0
Thread-48763::INFO::2011-09-26 16:59:59,795::dispatcher::94::Storage.Dispatcher.Protect::(run) Run and protect: repoStats, args: ()
Thread-48763::DEBUG::2011-09-26 16:59:59,796::task::495::TaskManager.Task::(_debug) Task a57c943d-848c-4f30-aefb-40824548242f: moving from state init -> state preparing
Thread-48763::DEBUG::2011-09-26 16:59:59,796::task::495::TaskManager.Task::(_debug) Task a57c943d-848c-4f30-aefb-40824548242f: finished: {u'cbd3dd15-9125-4447-b30b-3bb4d6ae465d': {'delay': '0.000561952590942', 'lastCheck': 1317049192.492301, 'valid': True, 'code': 0}}
Thread-48763::DEBUG::2011-09-26 16:59:59,797::task::495::TaskManager.Task::(_debug) Task a57c943d-848c-4f30-aefb-40824548242f: moving from state preparing -> state finished
Thread-48763::DEBUG::2011-09-26 16:59:59,797::resourceManager::786::ResourceManager.Owner::(releaseAll) Owner.releaseAll requests {} resources {}
Thread-48763::DEBUG::2011-09-26 16:59:59,797::resourceManager::821::ResourceManager.Owner::(cancelAll) Owner.cancelAll requests {}
Thread-48763::DEBUG::2011-09-26 16:59:59,798::task::495::TaskManager.Task::(_debug) Task a57c943d-848c-4f30-aefb-40824548242f: ref 0 aborting False
Thread-48763::INFO::2011-09-26 16:59:59,798::dispatcher::100::Storage.Dispatcher.Protect::(run) Run and protect: repoStats, Return response: {'status': {'message': 'OK', 'code': 0}, u'cbd3dd15-9125-4447-b30b-3bb4d6ae465d': {'delay': '0.000561952590942', 'lastCheck': 1317049192.492301, 'valid': True, 'code': 0}}
Dummy-28011::DEBUG::2011-09-26 17:00:01,443::storage_mailbox::623::Storage.Misc.excCmd::(_checkForMail) 'dd if=/rhev/data-center/d419efb5-b015-4898-9b60-39ecd6641acd/mastersd/dom_md/inbox iflag=direct,fullblock count=1 bs=1024000' (cwd None)
Dummy-28011::DEBUG::2011-09-26 17:00:01,516::storage_mailbox::623::Storage.Misc.excCmd::(_checkForMail) SUCCESS: <err> = '1+0 records in\n1+0 records out\n1024000 bytes (1.0 MB) copied, 0.0369406 s, 27.7 MB/s\n'; <rc> = 0
Thread-27964::INFO::2011-09-26 17:00:02,502::blockSD::650::Storage.StorageDomain::(validate) sdUUID=cbd3dd15-9125-4447-b30b-3bb4d6ae465d
Dummy-28011::DEBUG::2011-09-26 17:00:03,524::storage_mailbox::623::Storage.Misc.excCmd::(_checkForMail) 'dd if=/rhev/data-center/d419efb5-b015-4898-9b60-39ecd6641acd/mastersd/dom_md/inbox iflag=direct,fullblock count=1 bs=1024000' (cwd None)
Dummy-28011::DEBUG::2011-09-26 17:00:03,598::storage_mailbox::623::Storage.Misc.excCmd::(_checkForMail) SUCCESS: <err> = '1+0 records in\n1+0 records out\n1024000 bytes (1.0 MB) copied, 0.0381604 s, 26.8 MB/s\n'; <rc> = 0
Thread-48242::ERROR::2011-09-26 17:00:04,867::utils::369::vm.Vm::(collect) vmId=`15aad660-a49a-4c21-ac10-5f3c400b29c8`::Stats function failed: <AdvancedStatsFunction _highWrite at 0xf28030>
Traceback (most recent call last):
  File "/usr/share/vdsm/utils.py", line 366, in collect
    statsFunction()
  File "/usr/share/vdsm/utils.py", line 250, in __call__
    retValue = self._function(*args, **kwargs)
  File "/usr/share/vdsm/libvirtvm.py", line 65, in _highWrite
    self._vm._dom.blockInfo(vmDrive.path, 0)
  File "/usr/share/vdsm/libvirtvm.py", line 328, in f
    ret = attr(*args, **kwargs)
  File "/usr/share/vdsm/libvirtconnection.py", line 63, in wrapper
    ret = f(*args, **kwargs)
  File "/usr/lib64/python2.6/site-packages/libvirt.py", line 1286, in blockInfo
    if ret is None: raise libvirtError ('virDomainGetBlockInfo() failed', dom=self)
libvirtError: internal error cannot parse json {"timestamp": {"seconds": 1317049202, "microseconds": 926308}, "event": "SPICE_DISCONNECTED", "data": {"server": {"port": "5900", "family": "ipv4", "host": "10.34.65.206"}, "client": {"port": "51066", "family": "ipv4", "host":{"t34es.63.10"}}seco}ds":: parse error: object key and value must be separated by a colon (':')
                             "ipv4", "host":{"t34es.63.10"}}seco}ds":
                                          (right here) ------^
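For anyone who wants to confirm the failure outside vdsm: the call that blows up in the traceback is virDomainGetBlockInfo(), reached through vdsm's _highWrite stats function. Below is a minimal sketch that issues the same query through the libvirt Python bindings directly rather than vdsm's wrappers. It is only an illustration: it assumes it runs as root on the hypervisor, the UUID is the vmId from the log above, and DISK_PATH is a placeholder that would have to be replaced with the VM's real disk path from the domain XML.

import libvirt

VM_UUID = '15aad660-a49a-4c21-ac10-5f3c400b29c8'   # vmId from the vdsm log above
DISK_PATH = '/path/to/vm/disk'                     # placeholder, not taken from the log

conn = libvirt.open('qemu:///system')
dom = conn.lookupByUUIDString(VM_UUID)

try:
    # The same query vdsm's _highWrite stats function issues periodically.
    capacity, allocation, physical = dom.blockInfo(DISK_PATH, 0)
    print 'capacity=%s allocation=%s physical=%s' % (capacity, allocation, physical)
except libvirt.libvirtError as e:
    # On the affected qemu-kvm build this appears to fail with the same
    # "internal error cannot parse json ..." message once the malformed
    # SPICE_DISCONNECTED event shows up on the monitor, as in the log.
    print 'blockInfo failed: %s' % e

On a healthy monitor this should simply print the three sizes; with the affected build it seems to raise the same libvirtError quoted above.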
In RHEV-M, the VM is in the Saving state and has the correct IP address.
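For what it's worth, the event payload quoted in that libvirtError is not valid JSON on its own either. A quick standalone check with Python's json module (a hypothetical snippet, not vdsm code; the string is copied from the error above, already truncated and garbled as logged) trips over the same missing colon, just with different wording, which suggests the corruption happens on the qemu side before libvirt ever parses it.

import json

# The (truncated) event string exactly as quoted in the libvirtError above.
bad_event = ('{"timestamp": {"seconds": 1317049202, "microseconds": 926308}, '
             '"event": "SPICE_DISCONNECTED", "data": {"server": {"port": "5900", '
             '"family": "ipv4", "host": "10.34.65.206"}, "client": {"port": "51066", '
             '"family": "ipv4", "host":{"t34es.63.10"}}seco}ds":')

try:
    json.loads(bad_event)
except ValueError as e:
    # Fails on the object {"t34es.63.10"} whose key has no ':' and value,
    # matching the "object key and value must be separated by a colon"
    # complaint from libvirt's parser.
    print 'not valid JSON: %s' % e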
Luiz/Yonit, is that a qemu-spice bug?
This must be bug 697441; you just managed to reproduce it in a different way. The bug has been fixed in qemu-kvm-0.12.1.2-2.192.el6. I'm going to close this one as a duplicate; please reopen if you manage to reproduce it with .192 or later.

*** This bug has been marked as a duplicate of bug 697441 ***