Bug 894035
Summary: engine: storage live migration is reported as failed in engine although action was successful

Product: Red Hat Enterprise Virtualization Manager
Component: ovirt-engine
Reporter: Dafna Ron <dron>
Assignee: Daniel Erez <derez>
QA Contact: Dafna Ron <dron>
Status: CLOSED CURRENTRELEASE
Severity: urgent
Priority: unspecified
Version: 3.2.0
Target Milestone: ---
Target Release: 3.2.0
Keywords: Regression
CC: amureini, derez, dyasny, hateya, iheim, lpeer, Rhev-m-bugs, sgrinber, yeylon, ykaul
Hardware: x86_64
OS: Linux
Whiteboard: storage
oVirt Team: Storage
Fixed In Version: sf5
Doc Type: Bug Fix
Type: Bug
Bug Blocks: 915537
Comments:

According to the vdsm log, an 'Input/output' error occurred right before LiveMigrateDiskCommand ended with failure (2013-01-10 16:56:52 in the vdsm log). Can you please check whether it is 100% reproducible?

Tested several times; it is 100% reproducible.

devel-ack delayed until we have a suggested solution, in order to assess the risk. Daniel, do we have such a thing?

Patch sent: http://gerrit.ovirt.org/#/c/11202/ (Change-Id: I37926ba2ae87c334e6f2d17b65b91290fe2c3d39)

Verified on sf5.

3.2 has been released.
Created attachment 676387 [details]
logs

Description of problem:
When a disk is live migrated, the engine reports the action as failed, but vdsm reports the disk as moved. The UI also shows the disk on the second domain (so the database has been updated with the target domain as well).

The only relevant entries in the logs are:

engine.log:
2013-01-10 16:49:48,067 ERROR [org.ovirt.engine.core.bll.lsm.LiveMigrateDiskCommand] (pool-3-thread-50) [11b8bd23] Ending command with failure: org.ovirt.engine.core.bll.lsm.LiveMigrateDiskCommand

vdsm.log:
Thread-6242::DEBUG::2013-01-10 16:49:47,581::taskManager::96::TaskManager::(getTaskStatus) Return. Response: {'code': 0, 'message': '1 jobs completed successfully', 'taskState': 'finished', 'taskResult': 'success', 'taskID': '9ec406a8-0bfe-465f-94e6-3bd19165f04c'}

Version-Release number of selected component (if applicable):
sf3
vdsm-4.10.2-3.0.el6ev.x86_64
libvirt-0.10.2-14.el6.x86_64

How reproducible:
100%

Steps to Reproduce:
1. Move a disk of a running VM (live storage migration).

Actual results:
Although the task was successful, it is reported as failed.

Expected results:
The task should not be reported as failed.

Additional info:
logs and dbdump attached.
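The vdsm response quoted above reports the task as finished and successful, so the discrepancy lies in how the engine interprets that status. As an illustration only (this is a hypothetical sketch, not actual ovirt-engine or vdsm code; the function name `task_succeeded` is made up), a success check keyed on the fields that appear in the logged response might look like:

```python
# Hypothetical sketch: evaluate a vdsm-style task status dict like the
# one in the vdsm.log excerpt above. Not real ovirt code.

def task_succeeded(response):
    """Return True if the task status dict reports a finished, successful task."""
    # The logged response carries explicit state/result fields; a correct
    # consumer should treat 'finished' + 'success' as success rather than
    # reporting the command as failed.
    return (response.get('taskState') == 'finished'
            and response.get('taskResult') == 'success')

# The exact response from the vdsm log (taskID rejoined):
status = {'code': 0,
          'message': '1 jobs completed successfully',
          'taskState': 'finished',
          'taskResult': 'success',
          'taskID': '9ec406a8-0bfe-465f-94e6-3bd19165f04c'}

print(task_succeeded(status))  # True for the response above
```

Per the description, the engine ended LiveMigrateDiskCommand with failure despite receiving exactly such a success response; the fix referenced in the Gerrit patch addresses that engine-side handling.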