Note: This bug is displayed in read-only format because the product is no longer active in Red Hat Bugzilla.

Bug 741333

Summary: JSON invalid after vm booted with rhev-agent
Product: Red Hat Enterprise Linux 6
Reporter: Jaroslav Henner <jhenner>
Component: qemu-kvm
Assignee: Luiz Capitulino <lcapitulino>
Status: CLOSED DUPLICATE
QA Contact: Virtualization Bugs <virt-bugs>
Severity: medium
Docs Contact:
Priority: unspecified
Version: 6.1
CC: acathrow, alevy, mkenneth, tburke, virt-maint, yhalperi
Target Milestone: rc
Target Release: ---
Hardware: x86_64
OS: Linux
Whiteboard:
Fixed In Version:
Doc Type: Bug Fix
Doc Text:
Story Points: ---
Clone Of:
Environment:
Last Closed: 2011-09-27 12:41:50 UTC
Type: ---
Regression: ---
Mount Type: ---
Documentation: ---
CRM:
Verified Versions:
Category: ---
oVirt Team: ---
RHEL 7.3 requirements from Atomic Host:
Cloudforms Team: ---
Target Upstream Version:
Embargoed:

Description Jaroslav Henner 2011-09-26 15:38:19 UTC
Description of problem:
I'm not sure of the exact steps I took, but roughly:
I created a VM, installed RHEL 6.1 on it, installed rhev-agent-2.3.14-1.el6.x86_64.rpm, and booted the machine. After it came up, the following error appeared in the log. The VM then stopped when I tried to suspend it.

Version-Release number of selected component (if applicable):
qemu-kvm-0.12.1.2-2.184.el6.x86_64


How reproducible:
unknown

Steps to Reproduce:
See the Description above.
  
Actual results:
vm stopped, error in log

Expected results:
vm suspended


Additional info:
Thread-48289::DEBUG::2011-09-26 16:59:58,053::resourceManager::821::ResourceManager.Owner::(cancelAll) Owner.cancelAll requests {}
Thread-48289::DEBUG::2011-09-26 16:59:58,053::resourceManager::517::ResourceManager::(releaseResource) Trying to release resource 'Storage.cbd3dd15-9125-4447-b30b-3bb4d6ae465d'
Thread-48289::DEBUG::2011-09-26 16:59:58,053::resourceManager::532::ResourceManager::(releaseResource) Released resource 'Storage.cbd3dd15-9125-4447-b30b-3bb4d6ae465d' (0 active users)
Thread-48289::DEBUG::2011-09-26 16:59:58,054::resourceManager::537::ResourceManager::(releaseResource) Resource 'Storage.cbd3dd15-9125-4447-b30b-3bb4d6ae465d' is free, finding out if anyone is waiting for it.
Thread-48289::DEBUG::2011-09-26 16:59:58,054::resourceManager::544::ResourceManager::(releaseResource) No one is waiting for resource 'Storage.cbd3dd15-9125-4447-b30b-3bb4d6ae465d', Clearing records.
Thread-48289::DEBUG::2011-09-26 16:59:58,054::task::495::TaskManager.Task::(_debug) Task 2b8f833b-57b7-4192-adb5-a4161adddda9: ref 0 aborting False
Thread-48289::INFO::2011-09-26 16:59:58,055::dispatcher::100::Storage.Dispatcher.Protect::(run) Run and protect: getVolumeSize, Return response: {'status': {'message': 'OK', 'code': 0}, 'truesize': '3221225472', 'apparentsize': '3221225472'}
Dummy-28011::DEBUG::2011-09-26 16:59:59,361::storage_mailbox::623::Storage.Misc.excCmd::(_checkForMail) 'dd if=/rhev/data-center/d419efb5-b015-4898-9b60-39ecd6641acd/mastersd/dom_md/inbox iflag=direct,fullblock count=1 bs=1024000' (cwd None)
Dummy-28011::DEBUG::2011-09-26 16:59:59,435::storage_mailbox::623::Storage.Misc.excCmd::(_checkForMail) SUCCESS: <err> = '1+0 records in\n1+0 records out\n1024000 bytes (1.0 MB) copied, 0.0375999 s, 27.2 MB/s\n'; <rc> = 0
Thread-48763::INFO::2011-09-26 16:59:59,795::dispatcher::94::Storage.Dispatcher.Protect::(run) Run and protect: repoStats, args: ()
Thread-48763::DEBUG::2011-09-26 16:59:59,796::task::495::TaskManager.Task::(_debug) Task a57c943d-848c-4f30-aefb-40824548242f: moving from state init -> state preparing
Thread-48763::DEBUG::2011-09-26 16:59:59,796::task::495::TaskManager.Task::(_debug) Task a57c943d-848c-4f30-aefb-40824548242f: finished: {u'cbd3dd15-9125-4447-b30b-3bb4d6ae465d': {'delay': '0.000561952590942', 'lastCheck': 1317049192.492301, 'valid': True, 'code': 0}}
Thread-48763::DEBUG::2011-09-26 16:59:59,797::task::495::TaskManager.Task::(_debug) Task a57c943d-848c-4f30-aefb-40824548242f: moving from state preparing -> state finished
Thread-48763::DEBUG::2011-09-26 16:59:59,797::resourceManager::786::ResourceManager.Owner::(releaseAll) Owner.releaseAll requests {} resources {}
Thread-48763::DEBUG::2011-09-26 16:59:59,797::resourceManager::821::ResourceManager.Owner::(cancelAll) Owner.cancelAll requests {}
Thread-48763::DEBUG::2011-09-26 16:59:59,798::task::495::TaskManager.Task::(_debug) Task a57c943d-848c-4f30-aefb-40824548242f: ref 0 aborting False
Thread-48763::INFO::2011-09-26 16:59:59,798::dispatcher::100::Storage.Dispatcher.Protect::(run) Run and protect: repoStats, Return response: {'status': {'message': 'OK', 'code': 0}, u'cbd3dd15-9125-4447-b30b-3bb4d6ae465d': {'delay': '0.000561952590942', 'lastCheck': 1317049192.492301, 'valid': True, 'code': 0}}
Dummy-28011::DEBUG::2011-09-26 17:00:01,443::storage_mailbox::623::Storage.Misc.excCmd::(_checkForMail) 'dd if=/rhev/data-center/d419efb5-b015-4898-9b60-39ecd6641acd/mastersd/dom_md/inbox iflag=direct,fullblock count=1 bs=1024000' (cwd None)
Dummy-28011::DEBUG::2011-09-26 17:00:01,516::storage_mailbox::623::Storage.Misc.excCmd::(_checkForMail) SUCCESS: <err> = '1+0 records in\n1+0 records out\n1024000 bytes (1.0 MB) copied, 0.0369406 s, 27.7 MB/s\n'; <rc> = 0
Thread-27964::INFO::2011-09-26 17:00:02,502::blockSD::650::Storage.StorageDomain::(validate) sdUUID=cbd3dd15-9125-4447-b30b-3bb4d6ae465d
Dummy-28011::DEBUG::2011-09-26 17:00:03,524::storage_mailbox::623::Storage.Misc.excCmd::(_checkForMail) 'dd if=/rhev/data-center/d419efb5-b015-4898-9b60-39ecd6641acd/mastersd/dom_md/inbox iflag=direct,fullblock count=1 bs=1024000' (cwd None)
Dummy-28011::DEBUG::2011-09-26 17:00:03,598::storage_mailbox::623::Storage.Misc.excCmd::(_checkForMail) SUCCESS: <err> = '1+0 records in\n1+0 records out\n1024000 bytes (1.0 MB) copied, 0.0381604 s, 26.8 MB/s\n'; <rc> = 0
Thread-48242::ERROR::2011-09-26 17:00:04,867::utils::369::vm.Vm::(collect) vmId=`15aad660-a49a-4c21-ac10-5f3c400b29c8`::Stats function failed: <AdvancedStatsFunction _highWrite at 0xf28030>
Traceback (most recent call last):
  File "/usr/share/vdsm/utils.py", line 366, in collect
    statsFunction()
  File "/usr/share/vdsm/utils.py", line 250, in __call__
    retValue = self._function(*args, **kwargs)
  File "/usr/share/vdsm/libvirtvm.py", line 65, in _highWrite
    self._vm._dom.blockInfo(vmDrive.path, 0)
  File "/usr/share/vdsm/libvirtvm.py", line 328, in f
    ret = attr(*args, **kwargs)
  File "/usr/share/vdsm/libvirtconnection.py", line 63, in wrapper
    ret = f(*args, **kwargs)
  File "/usr/lib64/python2.6/site-packages/libvirt.py", line 1286, in blockInfo
    if ret is None: raise libvirtError ('virDomainGetBlockInfo() failed', dom=self)
libvirtError: internal error cannot parse json {"timestamp": {"seconds": 1317049202, "microseconds": 926308}, "event": "SPICE_DISCONNECTED", "data": {"server": {"port": "5900", "family": "ipv4", "host": "10.34.65.206"}, "client": {"port": "51066", "family": "ipv4", "host":{"t34es.63.10"}}seco}ds":: parse error: object key and value must be separated by a colon (':')
          "ipv4", "host":{"t34es.63.10"}}seco}ds":
                     (right here) ------^
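The payload quoted in the error is indeed rejected by any strict JSON parser, and the failure matches the reported message about a missing colon. A minimal illustrative check using Python's json module (the "fixed" host value below is a made-up placeholder, not taken from this bug):

```python
import json

# The corrupted "client" object exactly as quoted in the libvirt error above.
bad = '{"port": "51066", "family": "ipv4", "host":{"t34es.63.10"}}'

# A well-formed client object would presumably mirror the "server" object's
# shape ("host" mapping to a plain string); this host value is a placeholder.
good = '{"port": "51066", "family": "ipv4", "host": "10.34.65.206"}'

try:
    json.loads(bad)
    parsed = True
except ValueError as err:  # JSONDecodeError subclasses ValueError
    parsed = False
    print("parse failed:", err)

print(json.loads(good)["host"])
```

The parser stops inside `{"t34es.63.10"}`: it reads the string as an object key and then expects a colon, which is the same complaint yajl raised on the libvirt side.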

Comment 1 Jaroslav Henner 2011-09-26 15:41:48 UTC
In RHEV-M, the VM is in the Saving state and has the correct IP address.

Comment 3 Dor Laor 2011-09-27 09:52:16 UTC
Luiz/Yonit, is that a qemu-spice bug?

Comment 4 Luiz Capitulino 2011-09-27 12:41:50 UTC
This must be bug 697441; you just managed to reproduce it in a different way.

The bug has been fixed in qemu-kvm-0.12.1.2-2.192.el6. I'm going to close it as a duplicate, please reopen if you manage to reproduce it with .192 or later.
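As a quick sanity check before reopening, the installed build's release field can be compared against the fixed one. A minimal sketch using the two builds named in this bug (reported 0.12.1.2-2.184.el6, fixed 0.12.1.2-2.192.el6); this only handles the numeric release field and is not a full RPM version comparison:

```python
def release_tuple(release):
    """Split a release field like '2.184' into integers for comparison."""
    return tuple(int(part) for part in release.split("."))

# Release fields from this bug report.
reported = release_tuple("2.184")  # build the reporter ran
fixed = release_tuple("2.192")     # build carrying the fix

# True only if the installed build already contains the fix.
has_fix = reported >= fixed
print(has_fix)
```

In practice, a tool such as `rpmdev-vercmp` (from rpmdevtools) does a proper epoch-version-release comparison; the sketch above is only meant to show that 2.184 predates 2.192.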

*** This bug has been marked as a duplicate of bug 697441 ***