Bug 973751 - [REST-API] deleting a disk logs the error "Could not find sub-resource in the collection resource"
Keywords:
Status: CLOSED CURRENTRELEASE
Alias: None
Product: Red Hat Enterprise Virtualization Manager
Classification: Red Hat
Component: ovirt-engine-restapi
Version: 3.2.0
Hardware: x86_64
OS: Linux
Priority: unspecified
Severity: medium
Target Milestone: ---
Target Release: 3.3.0
Assignee: Ravi Nori
QA Contact: yeylon@redhat.com
URL:
Whiteboard: infra
Depends On:
Blocks:
 
Reported: 2013-06-12 15:43 UTC by vvyazmin@redhat.com
Modified: 2016-04-18 06:52 UTC (History)
CC List: 9 users

Fixed In Version: is2
Doc Type: Bug Fix
Doc Text:
Clone Of:
Environment:
Last Closed:
oVirt Team: Infra
Target Upstream Version:


Attachments
## Logs rhevm (262.63 KB, application/x-gzip)
2013-06-12 15:43 UTC, vvyazmin@redhat.com


Links
oVirt gerrit 15632 (Private: 0, Priority: None, Status: None, Summary: None, Last Updated: Never)

Description vvyazmin@redhat.com 2013-06-12 15:43:27 UTC
Created attachment 760233 [details]
## Logs rhevm

Description of problem: When deleting a disk, the engine logs the error "Could not find sub-resource in the collection resource".

Version-Release number of selected component (if applicable):
RHEVM 3.2 - SF17.5 environment:

RHEVM: rhevm-3.2.0-11.30.el6ev.noarch
VDSM: vdsm-4.10.2-22.0.el6ev.x86_64
LIBVIRT: libvirt-0.10.2-18.el6_4.5.x86_64
QEMU & KVM: qemu-kvm-rhev-0.12.1.2-2.355.el6_4.5.x86_64
SANLOCK: sanlock-2.6-2.el6.x86_64
rhevm-sdk-3.2.0.11-1.el6ev.noarch

How reproducible:
100%

Steps to Reproduce:
1. Create disk via PythonSDK
2. Wait until the disk reaches status "OK"
3. Delete disk via PythonSDK
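
Step 2 above (waiting for the disk to reach status "OK") can be sketched as a small polling helper; `wait_for_disk_state` and its parameters are illustrative, not part of the SDK:

```python
import time

def wait_for_disk_state(get_state, desired="ok", timeout=120.0,
                        interval=1.0, _sleep=time.sleep):
    """Poll get_state() until it returns `desired` or `timeout` elapses.

    get_state is any zero-argument callable; with the oVirt Python SDK
    it could be, for example:
        lambda: API.disks.get(diskName).get_status().get_state()
    Returns True once the desired state is seen, False on timeout.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if get_state() == desired:
            return True
        _sleep(interval)
    return False
```

The `_sleep` parameter is injectable only so the helper can be exercised without real delays; in normal use the defaults apply.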
  
Actual results:
The disk is deleted successfully, but an error appears in engine.log.

Expected results:
The delete action completes without errors.

Impact on user:
none

Workaround:

Additional info:

def deleteDisk(diskName):
    """
    Delete a disk.
     * diskName - name of the disk to remove
    """
    disk = API.disks.get(diskName)
    if disk is None:
        LOGGER.warn("Disk '%s' is None, test will fail", diskName)
    else:
        LOGGER.info("Removing disk '%s'", diskName)
        disk.delete()
    return True
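
A more defensive variant of the helper above would also surface the missing-disk case through the return value. This is only an illustrative sketch: `delete_disk` and its `api` parameter are hypothetical names standing in for the module-level `API` and `LOGGER` objects the original snippet uses:

```python
import logging

LOGGER = logging.getLogger(__name__)

def delete_disk(api, disk_name):
    """Delete a disk by name; return True only if a delete was issued.

    `api` is any object exposing disks.get(name) -> disk-or-None, where
    the returned disk exposes delete(), as the oVirt Python SDK does.
    """
    disk = api.disks.get(disk_name)
    if disk is None:
        LOGGER.warning("Disk '%s' not found, nothing to delete", disk_name)
        return False
    LOGGER.info("Removing disk '%s'", disk_name)
    disk.delete()
    return True
```

Returning False for a missing disk lets the caller distinguish "nothing to delete" from a successful delete, instead of always returning True.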

/var/log/ovirt-engine/engine.log

2013-06-12 18:29:12,329 ERROR [org.ovirt.engine.api.restapi.resource.AbstractBackendCollectionResource] (ajp-/127.0.0.1:8702-9) Could not find sub-resource in the collection resource
2013-06-12 18:29:12,371 INFO  [org.ovirt.engine.core.bll.RemoveDiskCommand] (ajp-/127.0.0.1:8702-9) [62266fd] Lock Acquired to object EngineLock [exclusiveLocks= key: ff171806-9a62-410f-9c1c-532dddd6c7b3 value: DISK, sharedLocks= ]
2013-06-12 18:29:12,414 INFO  [org.ovirt.engine.core.bll.RemoveDiskCommand] (ajp-/127.0.0.1:8702-9) [62266fd] Running command: RemoveDiskCommand internal: false. Entities affected :  ID: ff171806-9a62-410f-9c1c-532dddd6c7b3 Type: Disk
2013-06-12 18:29:12,442 INFO  [org.ovirt.engine.core.bll.RemoveImageCommand] (ajp-/127.0.0.1:8702-9) [62266fd] Running command: RemoveImageCommand internal: true. Entities affected :  ID: 3aa46a5b-42d1-494b-87c3-a3478692060e Type: Storage
2013-06-12 18:29:12,450 INFO  [org.ovirt.engine.core.bll.RemoveImageCommand] (ajp-/127.0.0.1:8702-9) [62266fd] Lock freed to object EngineLock [exclusiveLocks= key: ff171806-9a62-410f-9c1c-532dddd6c7b3 value: DISK, sharedLocks= ]
2013-06-12 18:29:12,451 INFO  [org.ovirt.engine.core.vdsbroker.irsbroker.DeleteImageGroupVDSCommand] (ajp-/127.0.0.1:8702-9) [62266fd] START, DeleteImageGroupVDSCommand( storagePoolId = f5b6630b-4b7a-4e8c-952a-c2aa3b7fe1d5, ignoreFailoverLimit = false, compatabilityVersion = 3.2, storageDomainId = 3aa46a5b-42d1-494b-87c3-a3478692060e, imageGroupId = ff171806-9a62-410f-9c1c-532dddd6c7b3, postZeros = false, forceDelete = false), log id: 68f682fc
2013-06-12 18:29:14,023 INFO  [org.ovirt.engine.core.vdsbroker.irsbroker.DeleteImageGroupVDSCommand] (ajp-/127.0.0.1:8702-9) [62266fd] FINISH, DeleteImageGroupVDSCommand, log id: 68f682fc
2013-06-12 18:29:14,028 INFO  [org.ovirt.engine.core.bll.EntityAsyncTask] (ajp-/127.0.0.1:8702-9) [62266fd] EntityAsyncTask::Adding EntityMultiAsyncTasks object for entity ff171806-9a62-410f-9c1c-532dddd6c7b3
2013-06-12 18:29:14,028 INFO  [org.ovirt.engine.core.bll.EntityMultiAsyncTasks] (ajp-/127.0.0.1:8702-9) [62266fd] EntityMultiAsyncTasks::AttachTask: Attaching task 60902964-1fbf-48f6-92e0-d43e5d758ada to entity ff171806-9a62-410f-9c1c-532dddd6c7b3.
2013-06-12 18:29:14,040 INFO  [org.ovirt.engine.core.bll.AsyncTaskManager] (ajp-/127.0.0.1:8702-9) [62266fd] Adding task 60902964-1fbf-48f6-92e0-d43e5d758ada (Parent Command RemoveDisk, Parameters Type org.ovirt.engine.core.common.asynctasks.AsyncTaskParameters), polling hasn't started yet..
2013-06-12 18:29:14,121 WARN  [org.ovirt.engine.core.compat.backendcompat.PropertyInfo] (ajp-/127.0.0.1:8702-9) Unable to get value of property: glusterVolume for class org.ovirt.engine.core.bll.RemoveDiskCommand
2013-06-12 18:29:14,122 WARN  [org.ovirt.engine.core.compat.backendcompat.PropertyInfo] (ajp-/127.0.0.1:8702-9) Unable to get value of property: vds for class org.ovirt.engine.core.bll.RemoveDiskCommand
2013-06-12 18:29:14,132 INFO  [org.ovirt.engine.core.bll.SPMAsyncTask] (ajp-/127.0.0.1:8702-9) [62266fd] BaseAsyncTask::StartPollingTask: Starting to poll task 60902964-1fbf-48f6-92e0-d43e5d758ada.
2013-06-12 18:29:14,132 INFO  [org.ovirt.engine.core.bll.EntityMultiAsyncTasks] (ajp-/127.0.0.1:8702-9) [62266fd] EntityMultiAsyncTasks::StartPollingTask: Current Action Type for entity ff171806-9a62-410f-9c1c-532dddd6c7b3 is RemoveDisk (determined by task 60902964-1fbf-48f6-92e0-d43e5d758ada)

/var/log/vdsm/vdsm.log

Comment 2 vvyazmin@redhat.com 2013-06-13 07:13:53 UTC
FYI: No problems occur when I delete the disk via the UI.

Comment 4 vvyazmin@redhat.com 2013-07-15 15:48:46 UTC
No issues found.

Verified on RHEVM 3.3 - IS5 environment:

RHEVM: rhevm-3.3.0-0.7.master.el6ev.noarch
VDSM: vdsm-4.11.0-121.git082925a.el6.x86_64
LIBVIRT: libvirt-0.10.2-18.el6_4.9.x86_64
QEMU & KVM: qemu-kvm-rhev-0.12.1.2-2.355.el6_4.5.x86_64
SANLOCK: sanlock-2.6-2.el6.x86_64

Comment 8 Itamar Heim 2014-01-21 22:22:49 UTC
Closing - RHEV 3.3 Released

Comment 9 Itamar Heim 2014-01-21 22:23:58 UTC
Closing - RHEV 3.3 Released

Comment 10 Itamar Heim 2014-01-21 22:27:44 UTC
Closing - RHEV 3.3 Released

