Bug 1251956 - Live storage migration is broken
Summary: Live storage migration is broken
Status: CLOSED CURRENTRELEASE
Alias: None
Product: Red Hat Enterprise Virtualization Manager
Classification: Red Hat
Component: ovirt-engine
Version: 3.6.0
Hardware: Unspecified
OS: Unspecified
Priority: urgent
Severity: urgent
Target Milestone: ovirt-3.6.0-rc
Target Release: 3.6.0
Assignee: Daniel Erez
QA Contact: Kevin Alon Goldblatt
URL:
Whiteboard:
Keywords: Regression
Duplicates: 1256786
Depends On:
Blocks: 1058757
 
Reported: 2015-08-10 12:02 UTC by Arik
Modified: 2016-03-10 12:01 UTC
CC List: 12 users

Clone Of:
Last Closed:


Attachments (Terms of Use)
engine log (6.78 MB, text/plain)
2015-08-10 12:04 UTC, Arik


External Trackers
Tracker ID Priority Status Summary Last Updated
oVirt gerrit 44647 master MERGED core: avoid disks lock check on create snapshot while LSM Never
oVirt gerrit 44741 ovirt-engine-3.6 MERGED core: avoid disks lock check on create snapshot while LSM Never

Description Arik 2015-08-10 12:02:26 UTC
Description of problem:
Unable to migrate a disk between different storage domains while the VM is up.

Version-Release number of selected component (if applicable):


How reproducible:
100%

Steps to Reproduce:
1. Run a VM
2. Try to move one of its disks to a different storage domain

Actual results:
The operation fails and the disk remains locked in the DB.

Expected results:
The disk should move and then the lock should be released.

Additional info:
It seems that the problem is that CreateAllSnapshotsFromVmCommand tries to lock the disk while it is already locked (exclusively) by LiveMigrateVmDisksCommand.
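
A minimal sketch of the conflict described above (these are hypothetical stand-in classes, not the actual ovirt-engine EngineLock/CommandBase code): the parent LSM command takes an exclusive disk lock, and the internal snapshot command's validation then treats that same lock as a blocker. The merged fix (per the "avoid disks lock check on create snapshot while LSM" patches) amounts to skipping the disks-lock check when snapshot creation runs as an internal step of live storage migration.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical stand-in for the engine's in-memory lock manager.
class LockManager {
    private final Map<String, String> exclusiveLocks = new HashMap<>();

    // Acquire an exclusive lock on a disk; fails if already held.
    boolean acquireExclusive(String diskId, String owner) {
        if (exclusiveLocks.containsKey(diskId)) {
            return false;
        }
        exclusiveLocks.put(diskId, owner);
        return true;
    }

    boolean isLocked(String diskId) {
        return exclusiveLocks.containsKey(diskId);
    }
}

public class LockConflictSketch {
    public static void main(String[] args) {
        LockManager locks = new LockManager();
        String diskId = "63213433-b3e9-4fab-809a-60972897baea";

        // LiveMigrateVmDisksCommand acquires the exclusive DISK lock.
        boolean lsmAcquired = locks.acquireExclusive(diskId, "LiveMigrateVmDisksCommand");
        System.out.println("LSM lock acquired: " + lsmAcquired); // true

        // Buggy behavior: the internal snapshot validation sees the disk as
        // locked (even though its own parent holds the lock), so it fails.
        boolean snapshotCanDoActionBuggy = !locks.isLocked(diskId);
        System.out.println("Snapshot CanDoAction (buggy): " + snapshotCanDoActionBuggy); // false

        // Fixed behavior: skip the disks-lock check when snapshot creation
        // is an internal step of live storage migration.
        boolean calledFromLsm = true;
        boolean snapshotCanDoActionFixed = calledFromLsm || !locks.isLocked(diskId);
        System.out.println("Snapshot CanDoAction (fixed): " + snapshotCanDoActionFixed); // true
    }
}
```

This matches the failure sequence in the attached engine log: the exclusive lock is acquired, CanDoAction of CreateAllSnapshotsFromVm fails with ACTION_TYPE_FAILED_DISKS_LOCKED, and the lock is then freed without the migration having run.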

Comment 1 Arik 2015-08-10 12:04:40 UTC
Created attachment 1061017 [details]
engine log

The relevant part is:
Call Stack: null, Custom Event ID: -1, Message: VM windows7 started on Host bamba
2015-08-10 14:45:11,441 INFO  [org.ovirt.engine.core.bll.MoveDisksCommand] (default task-27) [37ac4b8c] Running command: MoveDisksCommand internal: false. Entities affected :  ID: 63213433-b3e9-4fab-809a-60972897baea Type: DiskAction group CONFIGURE_DISK_STORAGE with role type USER
2015-08-10 14:45:11,660 INFO  [org.ovirt.engine.core.bll.lsm.LiveMigrateVmDisksCommand] (default task-27) [37ac4b8c] Lock Acquired to object 'EngineLock:{exclusiveLocks='[63213433-b3e9-4fab-809a-60972897baea=<DISK, ACTION_TYPE_FAILED_DISK_IS_BEING_MIGRATED$DiskName windows7>]', sharedLocks='[564dffd1-06ca-ccae-a533-97aac010ea3d=<VM, ACTION_TYPE_FAILED_OBJECT_LOCKED>]'}'
2015-08-10 14:45:11,784 INFO  [org.ovirt.engine.core.bll.lsm.LiveMigrateVmDisksCommand] (org.ovirt.thread.pool-8-thread-7) [37ac4b8c] Running command: LiveMigrateVmDisksCommand Task handler: LiveSnapshotTaskHandler internal: false. Entities affected :  ID: 63213433-b3e9-4fab-809a-60972897baea Type: DiskAction group DISK_LIVE_STORAGE_MIGRATION with role type USER
2015-08-10 14:45:12,080 WARN  [org.ovirt.engine.core.bll.CreateAllSnapshotsFromVmCommand] (org.ovirt.thread.pool-8-thread-7) [42da1270] CanDoAction of action 'CreateAllSnapshotsFromVm' failed for user admin@internal. Reasons: VAR__ACTION__CREATE,VAR__TYPE__SNAPSHOT,ACTION_TYPE_FAILED_DISKS_LOCKED,$diskAliases windows7
2015-08-10 14:45:12,151 INFO  [org.ovirt.engine.core.bll.lsm.LiveMigrateVmDisksCommand] (org.ovirt.thread.pool-8-thread-7) [42da1270] Lock freed to object 'EngineLock:{exclusiveLocks='[63213433-b3e9-4fab-809a-60972897baea=<DISK, ACTION_TYPE_FAILED_DISK_IS_BEING_MIGRATED$DiskName windows7>]', sharedLocks='[564dffd1-06ca-ccae-a533-97aac010ea3d=<VM, ACTION_TYPE_FAILED_OBJECT_LOCKED>]'}

Comment 2 Carlos Mestre González 2015-08-10 14:35:10 UTC
Can confirm it happens with the latest build, ovirt-engine-3.6.0-0.0.master.20150804111407.git122a3a0.el6.noarch. I tried moving between iSCSI domains on RHEL 7.1.

Comment 3 Allon Mureinik 2015-09-02 13:10:47 UTC
*** Bug 1256786 has been marked as a duplicate of this bug. ***

Comment 4 Carlos Mestre González 2015-09-03 08:21:06 UTC
I encountered another issue in the latest build, 3.6.0-10: the operation succeeds but the job "Migrating ..." is never marked as FINISHED. Should I open a new BZ or add the details here?

Comment 5 Daniel Erez 2015-09-03 08:28:44 UTC
(In reply to Carlos Mestre González from comment #4)
> I encountered another issue in the latest build, 3.6.0-10: the operation
> succeeds but the job "Migrating ..." is never marked as FINISHED. Should I
> open a new BZ or add the details here?

It's an issue in the task monitoring, so please open a new BZ.
Can you please also attach the relevant logs and screenshots?

Comment 6 Kevin Alon Goldblatt 2015-09-16 09:14:03 UTC
Verified this with the following version:
----------------------------------------------------
rhevm-3.6.0-0.12.master.el6.noarch
vdsm-4.17.3-1.el7ev.noarch

Verified using the following scenario:
---------------------------------------------------
Steps to Reproduce:
1. Run a VM
2. Try to move one of its disks to a different storage domain >>>>> OPERATION WORKS FINE

Moving to VERIFIED!

Comment 7 Allon Mureinik 2016-03-10 10:39:19 UTC
RHEV 3.6.0 has been released, setting status to CLOSED CURRENTRELEASE

Comment 8 Allon Mureinik 2016-03-10 10:39:24 UTC
RHEV 3.6.0 has been released, setting status to CLOSED CURRENTRELEASE

Comment 9 Allon Mureinik 2016-03-10 10:45:10 UTC
RHEV 3.6.0 has been released, setting status to CLOSED CURRENTRELEASE

Comment 10 Allon Mureinik 2016-03-10 12:01:48 UTC
RHEV 3.6.0 has been released, setting status to CLOSED CURRENTRELEASE

