Bug 1419931 - Failed to destroy partially-initialized VM with port mirroring
Summary: Failed to destroy partially-initialized VM with port mirroring
Keywords:
Status: CLOSED CURRENTRELEASE
Alias: None
Product: ovirt-engine
Classification: oVirt
Component: BLL.Virt
Version: 4.0.6.3
Hardware: Unspecified
OS: Unspecified
Priority: medium
Severity: high
Target Milestone: ovirt-4.1.1
Target Release: 4.1.1.2
Assignee: Dan Kenigsberg
QA Contact: Israel Pinto
URL:
Whiteboard:
Depends On:
Blocks: 1430658
 
Reported: 2017-02-07 12:32 UTC by Israel Pinto
Modified: 2017-04-21 09:41 UTC (History)
8 users

Fixed In Version:
Doc Type: If docs needed, set a value
Doc Text:
Clone Of:
: 1430658
Environment:
Last Closed: 2017-04-21 09:41:17 UTC
oVirt Team: Network
rule-engine: ovirt-4.1+


Attachments
engine.log (2.52 MB, application/zip)
2017-02-07 12:32 UTC, Israel Pinto


Links
System ID Private Priority Status Summary Last Updated
oVirt gerrit 72263 0 master MERGED virt-net: attempt to unset port mirroring only if the VM was properly initialized 2017-02-15 06:26:42 UTC
oVirt gerrit 72319 0 ovirt-4.1 MERGED virt-net: attempt to unset port mirroring only if the VM was properly initialized 2017-02-15 12:20:20 UTC
oVirt gerrit 72742 0 ovirt-4.0 ABANDONED virt-net: attempt to unset port mirroring only if the VM was properly initialized 2017-02-26 12:08:52 UTC

Description Israel Pinto 2017-02-07 12:32:30 UTC
Created attachment 1248390 [details]
engine.log

Description of problem:
Failed to start the VM because the DestroyVDSCommand failed and tasks were not cleaned up.
Got the error: Operation Failed: [Desktop already exists]

Version-Release number of selected component (if applicable):
Engine: 4.0.6-9
Host:
libvirt-2.0.0-10.el7_3.4.x86_64
Red Hat Enterprise Linux Server 7.3 (Maipo)
qemu-img-rhev-2.6.0-28.el7_3.3.x86_64
vdsm-4.18.21.1-1.el7ev.x86_64
qemu-kvm-rhev-2.6.0-28.el7_3.3.x86_64
vdsm-cli-4.18.21.1-1.el7ev.noarch

In automation test:
https://rhev-jenkins.rhev-ci-vms.eng.rdu2.redhat.com/job/4.0v4-GE-Tier2-x86_64/37


Additional info: all logs are at jenkins 

art log: https://rhev-jenkins.rhev-ci-vms.eng.rdu2.redhat.com/job/4.0v4-GE-Tier2-x86_64/37/artifact/logs/art_test_runner.log.2/*view*/

VM id:
VM name:

-- Failed to stop VM
2017-02-06 06:28:45,935 ERROR [org.ovirt.engine.core.vdsbroker.vdsbroker.DestroyVDSCommand] (DefaultQuartzScheduler4) [3d3c11bf] Command 'DestroyVDSCommand(HostName = host_mixed_1, DestroyVmVDSCommandParameters:{runAsync='true', hostId='db68f77f-551f-465e-bc8d-7d1711d27f0e', vmId='2707de71-e71f-4a33-b057-7205b06d548c', force='false', secondsToWait='0', gracefully='false', reason='', ignoreNoVm='true'})' execution failed: VDSGenericException: VDSErrorException: Failed to DestroyVDS, error = name, code = -32603

-- Tasks are not clean:
2017-02-06 06:52:19,530 INFO  [org.ovirt.engine.core.bll.tasks.AsyncTaskManager] (DefaultQuartzScheduler7) [79a2be2f] Setting new tasks map. The map contains now 2 tasks
2017-02-06 06:52:20,073 INFO  [org.ovirt.engine.core.vdsbroker.vdsbroker.DestroyVDSCommand] (ForkJoinPool-1-worker-0) [] Failed to destroy VM '6836887d-491d-40a8-bc50-bc79b4c6847d' because VM does not exist, ignoring
2017-02-06 06:52:20,073 INFO  [org.ovirt.engine.core.vdsbroker.vdsbroker.DestroyVDSCommand] (ForkJoinPool-1-worker-0) [] FINISH, DestroyVDSCommand, log id: 3e8199b4
2017-02-06 06:52:20,073 INFO  [org.ovirt.engine.core.vdsbroker.monitoring.VmAnalyzer] (ForkJoinPool-1-worker-0) [] VM '6836887d-491d-40a8-bc50-bc79b4c6847d'(memory_hotplug_test) moved from 'PoweringDown' --> 'Down'
2017-02-06 06:52:20,087 INFO  [org.ovirt.engine.core.dal.dbbroker.auditloghandling.AuditLogDirector] (ForkJoinPool-1-worker-0) [] Correlation ID: null, Call Stack: null, Custom Event ID: -1, Message: VM memory_hotplug_test is down. Exit message: Admin shut down from the engine
2017-02-06 06:52:20,099 INFO  [org.ovirt.engine.core.bll.ProcessDownVmCommand] (org.ovirt.thread.pool-6-thread-10) [3b41252b] Running command: ProcessDownVmCommand internal: true.
2017-02-06 06:52:23,252 INFO  [org.ovirt.engine.core.vdsbroker.monitoring.VmAnalyzer] (DefaultQuartzScheduler6) [7946888f] VM '2707de71-e71f-4a33-b057-7205b06d548c' was reported as Down on VDS 'db68f77f-551f-465e-bc8d-7d1711d27f0e'(host_mixed_1)
2017-02-06 06:52:23,253 INFO  [org.ovirt.engine.core.vdsbroker.vdsbroker.DestroyVDSCommand] (DefaultQuartzScheduler6) [7946888f] START, DestroyVDSCommand(HostName = host_mixed_1, DestroyVmVDSCommandParameters:{runAsync='true', hostId='db68f77f-551f-465e-bc8d-7d1711d27f0e', vmId='2707de71-e71f-4a33-b057-7205b06d548c', force='false', secondsToWait='0', gracefully='false', reason='', ignoreNoVm='true'}), log id: 76917a68
2017-02-06 06:52:23,493 WARN  [org.ovirt.engine.core.vdsbroker.vdsbroker.DestroyVDSCommand] (DefaultQuartzScheduler6) [7946888f] Unexpected return value: StatusForXmlRpc [code=-32603, message=name]
2017-02-06 06:52:23,493 WARN  [org.ovirt.engine.core.vdsbroker.vdsbroker.DestroyVDSCommand] (DefaultQuartzScheduler6) [7946888f] Unexpected return value: StatusForXmlRpc [code=-32603, message=name]
2017-02-06 06:52:23,494 ERROR [org.ovirt.engine.core.vdsbroker.vdsbroker.DestroyVDSCommand] (DefaultQuartzScheduler6) [7946888f] Failed in 'DestroyVDS' method
2017-02-06 06:52:23,494 WARN  [org.ovirt.engine.core.vdsbroker.vdsbroker.DestroyVDSCommand] (DefaultQuartzScheduler6) [7946888f] Unexpected return value: StatusForXmlRpc [code=-32603, message=name]
2017-02-06 06:52:23,497 ERROR [org.ovirt.engine.core.dal.dbbroker.auditloghandling.AuditLogDirector] (DefaultQuartzScheduler6) [7946888f] Correlation ID: null, Call Stack: null, Custom Event ID: -1, Message: VDSM host_mixed_1 command failed: name
2017-02-06 06:52:23,497 INFO  [org.ovirt.engine.core.vdsbroker.vdsbroker.DestroyVDSCommand] (DefaultQuartzScheduler6) [7946888f] Command 'org.ovirt.engine.core.vdsbroker.vdsbroker.DestroyVDSCommand' return value 'StatusOnlyReturnForXmlRpc [status=StatusForXmlRpc [code=-32603, message=name]]'
2017-02-06 06:52:23,497 INFO  [org.ovirt.engine.core.vdsbroker.vdsbroker.DestroyVDSCommand] (DefaultQuartzScheduler6) [7946888f] HostName = host_mixed_1
2017-02-06 06:52:23,497 ERROR [org.ovirt.engine.core.vdsbroker.vdsbroker.DestroyVDSCommand] (DefaultQuartzScheduler6) [7946888f] Command 'DestroyVDSCommand(HostName = host_mixed_1, DestroyVmVDSCommandParameters:{runAsync='true', hostId='db68f77f-551f-465e-bc8d-7d1711d27f0e', vmId='2707de71-e71f-4a33-b057-7205b06d548c', force='false', secondsToWait='0', gracefully='false', reason='', ignoreNoVm='true'})' execution failed: VDSGenericException: VDSErrorException: Failed to DestroyVDS, error = name, code = -32603
2017-02-06 06:52:23,497 INFO  [org.ovirt.engine.core.vdsbroker.vdsbroker.DestroyVDSCommand] (DefaultQuartzScheduler6) [7946888f] FINISH, DestroyVDSCommand, log id: 76917a68

-- Failed to start VM:
Correlation-Id: vms_syncAction_a5421d31-08d5-4c1c

2017-02-06 06:52:29,123 INFO  [org.ovirt.engine.core.bll.RunVmCommand] (default task-13) [vms_syncAction_a5421d31-08d5-4c1c] Lock Acquired to object 'EngineLock:{exclusiveLocks='[2707de71-e71f-4a33-b057-7205b06d548c=<VM, ACTION_TYPE_FAILED_OBJECT_LOCKED>]', sharedLocks='null'}'
2017-02-06 06:52:29,153 INFO  [org.ovirt.engine.core.vdsbroker.IsVmDuringInitiatingVDSCommand] (default task-13) [vms_syncAction_a5421d31-08d5-4c1c] START, IsVmDuringInitiatingVDSCommand( IsVmDuringInitiatingVDSCommandParameters:{runAsync='true', vmId='2707de71-e71f-4a33-b057-7205b06d548c'}), log id: 42d98748
2017-02-06 06:52:29,153 INFO  [org.ovirt.engine.core.vdsbroker.IsVmDuringInitiatingVDSCommand] (default task-13) [vms_syncAction_a5421d31-08d5-4c1c] FINISH, IsVmDuringInitiatingVDSCommand, return: false, log id: 42d98748
2017-02-06 06:52:29,179 INFO  [org.ovirt.engine.core.bll.RunVmCommand] (default task-13) [vms_syncAction_a5421d31-08d5-4c1c] Running command: RunVmCommand internal: false. Entities affected :  ID: 2707de71-e71f-4a33-b057-7205b06d548c Type: VMAction group RUN_VM with role type USER
2017-02-06 06:52:29,200 INFO  [org.ovirt.engine.core.vdsbroker.UpdateVmDynamicDataVDSCommand] (default task-13) [vms_syncAction_a5421d31-08d5-4c1c] START, UpdateVmDynamicDataVDSCommand( UpdateVmDynamicDataVDSCommandParameters:{runAsync='true', hostId='null', vmId='00000000-0000-0000-0000-000000000000', vmDynamic='org.ovirt.engine.core.common.businessentities.VmDynamic@8ca3d3c'}), log id: c37e7e6
2017-02-06 06:52:29,202 INFO  [org.ovirt.engine.core.vdsbroker.UpdateVmDynamicDataVDSCommand] (default task-13) [vms_syncAction_a5421d31-08d5-4c1c] FINISH, UpdateVmDynamicDataVDSCommand, log id: c37e7e6
2017-02-06 06:52:29,204 INFO  [org.ovirt.engine.core.vdsbroker.CreateVmVDSCommand] (default task-13) [vms_syncAction_a5421d31-08d5-4c1c] START, CreateVmVDSCommand( CreateVmVDSCommandParameters:{runAsync='true', hostId='db68f77f-551f-465e-bc8d-7d1711d27f0e', vmId='2707de71-e71f-4a33-b057-7205b06d548c', vm='VM [golden_env_mixed_virtio_1_0]'}), log id: 35ff52e4
2017-02-06 06:52:29,205 INFO  [org.ovirt.engine.core.vdsbroker.vdsbroker.CreateVDSCommand] (default task-13) [vms_syncAction_a5421d31-08d5-4c1c] START, CreateVDSCommand(HostName = host_mixed_1, CreateVmVDSCommandParameters:{runAsync='true', hostId='db68f77f-551f-465e-bc8d-7d1711d27f0e', vmId='2707de71-e71f-4a33-b057-7205b06d548c', vm='VM [golden_env_mixed_virtio_1_0]'}), log id: a7cbd60
2017-02-06 06:52:29,209 INFO  [org.ovirt.engine.core.vdsbroker.vdsbroker.VmInfoBuilderBase] (default task-13) [vms_syncAction_a5421d31-08d5-4c1c] Bootable disk '3fe8f139-7816-490f-b24a-be388eef46ba' set to index '0'
2017-02-06 06:52:29,223 INFO  [org.ovirt.engine.core.vdsbroker.vdsbroker.CreateVDSCommand] (default task-13) [vms_syncAction_a5421d31-08d5-4c1c] org.ovirt.engine.core.vdsbroker.vdsbroker.CreateVDSCommand pitReinjection=false,memGuaranteedSize=1024,smpThreadsPerCore=1,cpuType=Conroe,vmId=2707de71-e71f-4a33-b057-7205b06d548c,acpiEnable=true,vmType=kvm,smp=1,smpCoresPerSocket=1,emulatedMachine=pc-i440fx-rhel7.2.0,smartcardEnable=false,guestNumaNodes=[{memory=1024, cpus=0, nodeIndex=0}],transparentHugePages=true,displayNetwork=ovirtmgmt,vmName=golden_env_mixed_virtio_1_0,maxVCpus=16,kvmEnable=true,devices=[{address={bus=0x00, domain=0x0000, function=0x0, slot=0x02, type=pci}, type=video, specParams={vgamem=16384, heads=1, vram=8192, ram=65536}, device=qxl, deviceId=e8c506a5-5166-4e3d-a48d-f16de57a6987}, {type=graphics, specParams={fileTransferEnable=true, spiceSecureChannels=smain,sinputs,scursor,splayback,srecord,sdisplay,ssmartcard,susbredir, spiceSslCipherSuite=DEFAULT, copyPasteEnable=true}, device=spice, deviceId=5b81434f-c369-4f4a-81f7-5c0c10117506}, {iface=ide, shared=false, path=, address={bus=1, controller=0, unit=0, type=drive, target=0}, readonly=true, index=2, type=disk, specParams={path=}, device=cdrom, deviceId=ba99dc23-0ce4-491a-9739-b9facb43874d}, {shared=false, address={bus=0x00, domain=0x0000, function=0x0, slot=0x0a, type=pci}, imageID=3fe8f139-7816-490f-b24a-be388eef46ba, format=cow, index=0, optional=false, type=disk, deviceId=3fe8f139-7816-490f-b24a-be388eef46ba, domainID=27963afe-1e3c-4f5a-bba6-6e56907830f7, propagateErrors=off, iface=virtio, readonly=false, bootOrder=1, poolID=bf722ca4-b458-4779-820f-23cb0ce42787, volumeID=673dd480-5f33-4f1a-bc44-344cb881a8fe, specParams={}, device=disk}, {filter=vdsm-no-mac-spoofing, nicModel=pv, address={bus=0x00, domain=0x0000, function=0x0, slot=0x03, type=pci}, type=interface, specParams={inbound={}, outbound={}}, device=bridge, linkActive=true, deviceId=6e98ba79-7c26-4b32-84ac-085ca3209d82, 
macAddr=00:1a:4a:16:88:c9, network=ovirtmgmt}, {address={bus=0x00, domain=0x0000, function=0x0, slot=0x04, type=pci}, type=sound, specParams={}, device=ich6, deviceId=e55a3f92-ecda-411f-84b3-b5ceabcbb7b8}, {address={bus=0x00, function=0x0, domain=0x0000, slot=0x0b, type=pci}, type=balloon, specParams={model=virtio}, device=memballoon, deviceId=e90ece78-f6bc-4dfe-91a0-e200411bc50c}, {address={bus=0x00, domain=0x0000, function=0x0, slot=0x08, type=pci}, index=0, model=virtio-scsi, type=controller, specParams={}, device=scsi, deviceId=c9ece41a-ad6d-49f5-860a-836edc083860}, {address={bus=0x00, domain=0x0000, function=0x0, slot=0x09, type=pci}, type=controller, specParams={}, device=virtio-serial, deviceId=11a72aea-80a6-4ffd-9b37-0ba27322487f}],custom={device_5102d33a-6133-41a4-a62d-b3f884d64c04=VmDevice:{id='VmDeviceId:{deviceId='5102d33a-6133-41a4-a62d-b3f884d64c04', vmId='2707de71-e71f-4a33-b057-7205b06d548c'}', device='ide', type='CONTROLLER', bootOrder='0', specParams='[]', address='{slot=0x01, bus=0x00, domain=0x0000, type=pci, function=0x1}', managed='false', plugged='true', readOnly='false', deviceAlias='ide', customProperties='[]', snapshotId='null', logicalName='null', usingScsiReservation='false', hostDevice='null'}, device_5102d33a-6133-41a4-a62d-b3f884d64c04device_d46102ca-1ac5-4a3b-a1ca-d9669e9d9b2bdevice_83488c13-ccf3-44e5-bdaf-dcea6b5a3747device_66860fd0-bd54-4c56-bcc3-e123a7ab5a05=VmDevice:{id='VmDeviceId:{deviceId='66860fd0-bd54-4c56-bcc3-e123a7ab5a05', vmId='2707de71-e71f-4a33-b057-7205b06d548c'}', device='spicevmc', type='CHANNEL', bootOrder='0', specParams='[]', address='{bus=0, controller=0, type=virtio-serial, port=3}', managed='false', plugged='true', readOnly='false', deviceAlias='channel2', customProperties='[]', snapshotId='null', logicalName='null', usingScsiReservation='false', hostDevice='null'}, 
device_5102d33a-6133-41a4-a62d-b3f884d64c04device_d46102ca-1ac5-4a3b-a1ca-d9669e9d9b2b=VmDevice:{id='VmDeviceId:{deviceId='d46102ca-1ac5-4a3b-a1ca-d9669e9d9b2b', vmId='2707de71-e71f-4a33-b057-7205b06d548c'}', device='unix', type='CHANNEL', bootOrder='0', specParams='[]', address='{bus=0, controller=0, type=virtio-serial, port=1}', managed='false', plugged='true', readOnly='false', deviceAlias='channel0', customProperties='[]', snapshotId='null', logicalName='null', usingScsiReservation='false', hostDevice='null'}, device_5102d33a-6133-41a4-a62d-b3f884d64c04device_d46102ca-1ac5-4a3b-a1ca-d9669e9d9b2bdevice_83488c13-ccf3-44e5-bdaf-dcea6b5a3747=VmDevice:{id='VmDeviceId:{deviceId='83488c13-ccf3-44e5-bdaf-dcea6b5a3747', vmId='2707de71-e71f-4a33-b057-7205b06d548c'}', device='unix', type='CHANNEL', bootOrder='0', specParams='[]', address='{bus=0, controller=0, type=virtio-serial, port=2}', managed='false', plugged='true', readOnly='false', deviceAlias='channel1', customProperties='[]', snapshotId='null', logicalName='null', usingScsiReservation='false', hostDevice='null'}},display=qxl,timeOffset=0,nice=0,maxMemSize=4194304,maxMemSlots=16,bootMenuEnable=false,memSize=1024
2017-02-06 06:52:30,228 ERROR [org.ovirt.engine.core.vdsbroker.vdsbroker.CreateVDSCommand] (default task-13) [vms_syncAction_a5421d31-08d5-4c1c] Failed in 'CreateVDS' method
2017-02-06 06:52:30,232 ERROR [org.ovirt.engine.core.dal.dbbroker.auditloghandling.AuditLogDirector] (default task-13) [vms_syncAction_a5421d31-08d5-4c1c] Correlation ID: null, Call Stack: null, Custom Event ID: -1, Message: VDSM host_mixed_1 command failed: Virtual machine already exists
2017-02-06 06:52:30,232 INFO  [org.ovirt.engine.core.vdsbroker.vdsbroker.CreateVDSCommand] (default task-13) [vms_syncAction_a5421d31-08d5-4c1c] Command 'org.ovirt.engine.core.vdsbroker.vdsbroker.CreateVDSCommand' return value 'org.ovirt.engine.core.vdsbroker.vdsbroker.OneVmReturnForXmlRpc@4be73cd2'
2017-02-06 06:52:30,232 INFO  [org.ovirt.engine.core.vdsbroker.vdsbroker.CreateVDSCommand] (default task-13) [vms_syncAction_a5421d31-08d5-4c1c] HostName = host_mixed_1
2017-02-06 06:52:30,232 ERROR [org.ovirt.engine.core.vdsbroker.vdsbroker.CreateVDSCommand] (default task-13) [vms_syncAction_a5421d31-08d5-4c1c] Command 'CreateVDSCommand(HostName = host_mixed_1, CreateVmVDSCommandParameters:{runAsync='true', hostId='db68f77f-551f-465e-bc8d-7d1711d27f0e', vmId='2707de71-e71f-4a33-b057-7205b06d548c', vm='VM [golden_env_mixed_virtio_1_0]'})' execution failed: VDSGenericException: VDSErrorException: Failed to CreateVDS, error = Virtual machine already exists, code = 4
2017-02-06 06:52:30,232 INFO  [org.ovirt.engine.core.vdsbroker.vdsbroker.CreateVDSCommand] (default task-13) [vms_syncAction_a5421d31-08d5-4c1c] FINISH, CreateVDSCommand, log id: a7cbd60
2017-02-06 06:52:30,232 ERROR [org.ovirt.engine.core.vdsbroker.CreateVmVDSCommand] (default task-13) [vms_syncAction_a5421d31-08d5-4c1c] VDS::create Failed creating vm 'golden_env_mixed_virtio_1_0' in vds = 'db68f77f-551f-465e-bc8d-7d1711d27f0e' error = 'org.ovirt.engine.core.vdsbroker.vdsbroker.VDSErrorException: VDSGenericException: VDSErrorException: Failed to CreateVDS, error = Virtual machine already exists, code = 4'
2017-02-06 06:52:30,232 INFO  [org.ovirt.engine.core.vdsbroker.CreateVmVDSCommand] (default task-13) [vms_syncAction_a5421d31-08d5-4c1c] FINISH, CreateVmVDSCommand, return: Down, log id: 35ff52e4
2017-02-06 06:52:30,239 INFO  [org.ovirt.engine.core.bll.RunVmCommand] (default task-13) [vms_syncAction_a5421d31-08d5-4c1c] Lock freed to object 'EngineLock:{exclusiveLocks='[2707de71-e71f-4a33-b057-7205b06d548c=<VM, ACTION_TYPE_FAILED_OBJECT_LOCKED>]', sharedLocks='null'}'
2017-02-06 06:52:30,239 ERROR [org.ovirt.engine.core.bll.RunVmCommand] (default task-13) [vms_syncAction_a5421d31-08d5-4c1c] Command 'org.ovirt.engine.core.bll.RunVmCommand' failed: EngineException: org.ovirt.engine.core.vdsbroker.vdsbroker.VDSErrorException: VDSGenericException: VDSErrorException: Failed to CreateVDS, error = Virtual machine already exists, code = 4 (Failed with error exist and code 4)
2017-02-06 06:52:30,242 ERROR [org.ovirt.engine.core.dal.dbbroker.auditloghandling.AuditLogDirector] (default task-13) [vms_syncAction_a5421d31-08d5-4c1c] Correlation ID: vms_syncAction_a5421d31-08d5-4c1c, Job ID: c7ba1ba5-7532-4461-b39e-cdb734488ca6, Call Stack: null, Custom Event ID: -1, Message: Failed to run VM golden_env_mixed_virtio_1_0 (User: admin@internal-authz).
2017-02-06 06:52:30,244 ERROR [org.ovirt.engine.api.restapi.resource.AbstractBackendResource] (default task-13) [] Operation Failed: [Desktop already exists]

Comment 1 Tomas Jelinek 2017-02-08 09:12:43 UTC
Not sure what this has to do with the tasks. What I see is this:
- try to stop a VM
- stopping the VM fails due to an internal JSON-RPC error (for this I'd like to see the VDSM logs to see what happened)
- the VM is still running, try to start it again
- starting the VM fails since the VM is still running

The issue I see here is the failed stop attempt.
Can you please attach the VDSM logs? I was not able to dig them out from jenkins...

Comment 2 Israel Pinto 2017-02-08 09:23:43 UTC
This VM has been down for about 30 minutes, and trying to start it fails with
"Desktop already exists", which is very weird.
I see that tasks are not cleaned up and this VM is reported as failed to stop/destroy. I guess that can be related to the fact that it failed to run.

I will try to add all the logs, but there is a size limitation ...

Comment 3 Yaniv Kaul 2017-02-08 10:03:28 UTC
(In reply to Israel Pinto from comment #2)
> This VM is down for about 30 min and when trying to start it is failed with
> "Desktop already exist", which is very wired.
> I see that tasks are not cleaned and this VM is reported as failed to
> stop/destroy. I guess that can be relate to the fact it failed to run.
> 
> I will try to add all logs but there is limitation of size ...

Use Google drive.

Comment 4 Israel Pinto 2017-02-08 10:59:20 UTC
From the failed stop you can see that it was running on
hostId='db68f77f-551f-465e-bc8d-7d1711d27f0e' name = host_mixed_1
ip = 10.35.117.48

-- Engine log failed to stop VM:
2017-02-06 06:28:45,935 ERROR [org.ovirt.engine.core.vdsbroker.vdsbroker.DestroyVDSCommand] (DefaultQuartzScheduler4) [3d3c11bf] Command 'DestroyVDSCommand(HostName = host_mixed_1, DestroyVmVDSCommandParameters:{runAsync='true', hostId='db68f77f-551f-465e-bc8d-7d1711d27f0e', vmId='2707de71-e71f-4a33-b057-7205b06d548c', force='false', secondsToWait='0', gracefully='false', reason='', ignoreNoVm='true'})' execution failed: VDSGenericException: VDSErrorException: Failed to DestroyVDS, error = name, code = -32603

vdsm log (see vdsm.log.1 in host-logs):
1. There are a lot of internal server errors:
jsonrpc.Executor/5::DEBUG::2017-02-06 04:07:48,846::API::314::vds::(destroy) About to destroy VM 2707de71-e71f-4a33-b057-7205b06d548c
jsonrpc.Executor/5::DEBUG::2017-02-06 04:07:48,846::vm::4217::virt.vm::(destroy) vmId=`2707de71-e71f-4a33-b057-7205b06d548c`::destroy Called
jsonrpc.Executor/5::ERROR::2017-02-06 04:07:48,846::__init__::550::jsonrpc.JsonRpcServer::(_handle_request) Internal server error
Traceback (most recent call last):
  File "/usr/lib/python2.7/site-packages/yajsonrpc/__init__.py", line 545, in _handle_request
    res = method(**params)
  File "/usr/lib/python2.7/site-packages/vdsm/rpc/Bridge.py", line 195, in _dynamicMethod
    result = fn(*methodArgs)
  File "/usr/share/vdsm/API.py", line 320, in destroy
    res = v.destroy(gracefulAttempts)
  File "/usr/share/vdsm/virt/vm.py", line 4219, in destroy
    result = self.doDestroy(gracefulAttempts)
  File "/usr/share/vdsm/virt/vm.py", line 4237, in doDestroy
    return self.releaseVm(gracefulAttempts)
  File "/usr/share/vdsm/virt/vm.py", line 4125, in releaseVm
    nic.name)
AttributeError: name

2. And at the same time as the stop on the engine, the same error:
jsonrpc.Executor/4::DEBUG::2017-02-06 06:28:44,928::API::314::vds::(destroy) About to destroy VM 2707de71-e71f-4a33-b057-7205b06d548c
jsonrpc.Executor/4::DEBUG::2017-02-06 06:28:44,929::vm::4217::virt.vm::(destroy) vmId=`2707de71-e71f-4a33-b057-7205b06d548c`::destroy Called
jsonrpc.Executor/4::ERROR::2017-02-06 06:28:44,929::__init__::550::jsonrpc.JsonRpcServer::(_handle_request) Internal server error
Traceback (most recent call last):
  File "/usr/lib/python2.7/site-packages/yajsonrpc/__init__.py", line 545, in _handle_request
    res = method(**params)
  File "/usr/lib/python2.7/site-packages/vdsm/rpc/Bridge.py", line 195, in _dynamicMethod
    result = fn(*methodArgs)
  File "/usr/share/vdsm/API.py", line 320, in destroy
    res = v.destroy(gracefulAttempts)
  File "/usr/share/vdsm/virt/vm.py", line 4219, in destroy
    result = self.doDestroy(gracefulAttempts)
  File "/usr/share/vdsm/virt/vm.py", line 4237, in doDestroy
    return self.releaseVm(gracefulAttempts)
  File "/usr/share/vdsm/virt/vm.py", line 4125, in releaseVm
    nic.name)
AttributeError: name
jsonrpc.Executor/4::INFO::2017-02-06 06:28:44,930::__init__::513::jsonrpc.JsonRpcServer::(_serveRequest) RPC call VM.destroy failed (error -32603) in 0.01 seconds
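The tracebacks above show `releaseVm` unconditionally dereferencing `nic.name` while tearing down port mirroring; on a VM that was never fully initialized (e.g. qemu crashed right after startup), libvirt never assigned the interface its device name, so the attribute does not exist and destroy fails with `AttributeError: name`. A minimal sketch of the kind of defensive guard the linked gerrit patches introduce (the `Nic` class and function names here are illustrative, not the actual vdsm code):

```python
class Nic:
    """Stands in for a vdsm network device object.

    On a fully initialized VM, libvirt assigns each interface a device
    name (e.g. 'vnet0'); on a partially initialized VM that attribute is
    never set, so touching nic.name raises AttributeError.
    """
    def __init__(self, port_mirroring=(), name=None):
        self.portMirroring = list(port_mirroring)
        if name is not None:
            self.name = name  # only set once the VM came fully up


def unset_port_mirroring(nics):
    """Unset port mirroring only for nics that were properly initialized."""
    unset = []
    for nic in nics:
        # The buggy code accessed nic.name unconditionally; guard it instead.
        name = getattr(nic, 'name', None)
        if name is None:
            continue  # VM never fully initialized; nothing to tear down
        for network in nic.portMirroring:
            # The real code would call unsetPortMirroring(network, name) here.
            unset.append((network, name))
    return unset
```

With this guard, destroying a partially-initialized VM simply skips the mirroring teardown instead of raising and leaving the domain behind on the host, which is what caused the later "Virtual machine already exists" failure.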

 

google drive logs:
https://drive.google.com/drive/folders/0BxBXxc5waimaUkU3RjdwY3VNdGc

Comment 5 Yaniv Kaul 2017-02-13 19:09:49 UTC
Michal - can someone look at this?

Comment 6 Michal Skrivanek 2017-02-13 19:52:33 UTC
This is an exception in networking code. Dan, any recent changes in that area?

Comment 7 Dan Kenigsberg 2017-02-13 20:36:52 UTC
vdsm-4.18.21.1-1.el7ev.x86_64 is not so recent... (2 months old)

Neither I nor `git log --oneline ovirt/ovirt-4.0 -- vdsm/virt/vmdevices/network.py` recalls any related change.

Eddy, can you look deeper into the vdsm.log to tell how we could have a nameless nic?

Comment 8 Dan Kenigsberg 2017-02-14 12:35:23 UTC
I don't see it as a blocker. It pops up in a very rare case (a qemu crash right after startup) of a rather rare use case (port mirroring).

Comment 9 Israel Pinto 2017-02-22 11:23:56 UTC
Verified with:
Red Hat Virtualization Manager Version: 4.1.1.2-0.1.el7

