Bug 1263937
Summary: RHEV-H 3.6 fails to install on multipath device

| Field | Value | Field | Value |
|---|---|---|---|
| Product: | Red Hat Enterprise Virtualization Manager | Reporter: | Fabian Deutsch <fdeutsch> |
| Component: | ovirt-node | Assignee: | Fabian Deutsch <fdeutsch> |
| Status: | CLOSED ERRATA | QA Contact: | Fabian Deutsch <fdeutsch> |
| Severity: | urgent | Docs Contact: | |
| Priority: | urgent | | |
| Version: | 3.6.0 | CC: | cshao, fdeutsch, gklein, huzhao, lsurette, mgoldboi, ycui, ykaul |
| Target Milestone: | ovirt-3.6.0-rc3 | Keywords: | AutomationBlocker, OtherQA, TestBlocker, TestOnly |
| Target Release: | 3.6.0 | | |
| Hardware: | Unspecified | | |
| OS: | Unspecified | | |
| Whiteboard: | | | |
| Fixed In Version: | ovirt-node-3.3.0-0.13.20151008git03eefb5.el7ev.noarch | Doc Type: | Bug Fix |
| Doc Text: | | Story Points: | --- |
| Clone Of: | | Environment: | |
| Last Closed: | 2016-03-09 14:38:37 UTC | Type: | Bug |
| Regression: | --- | Mount Type: | --- |
| Documentation: | --- | CRM: | |
| Verified Versions: | | Category: | --- |
| oVirt Team: | Node | RHEL 7.3 requirements from Atomic Host: | |
| Cloudforms Team: | --- | Target Upstream Version: | |
| Embargoed: | | | |
| Bug Depends On: | 1259831, 1271118 | | |
| Bug Blocks: | | | |
| Attachments: | | | |
Description (Fabian Deutsch, 2015-09-17 06:40:44 UTC)
Can't reproduce this issue on multipath iSCSI machine.

Test version:
RHEV-H 7.2 for RHEV 3.6 (rhev-hypervisor-7-7.2-20150831.0)
ovirt-node-3.3.0-0.2.20150722git7eba125.el7ev.noarch

Test machine:
hp-z600-03: multipath iSCSI

Test result:
RHEV-H 3.6 install on multipath iSCSI machine - pass.

Created attachment 1074415 [details]: pass.png
Created attachment 1074500 [details]: Multipath disk is detected as two separate ones
It seems that sometimes (a race!) the mpath disk is not assembled correctly and the raw disks appear instead. When the raw disks appear, the installation fails; when the mpath disk appears, the installation completes.

Created attachment 1074514 [details]: sosreport from a failed run
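The failure mode described above is that two SCSI paths to the same LUN show up as independent raw disks instead of one assembled multipath device. As a hypothetical illustration (not part of the bug report, and not how ovirt-node actually detects this), one could spot that state by looking for duplicate WWIDs among disks that multipath has not claimed:

```python
from collections import defaultdict

def find_unassembled_paths(disks):
    """Return WWIDs that appear on two or more raw disks not claimed
    by multipath -- the symptom of the race described in this bug.

    disks: list of (device_name, wwid, is_mpath_member) tuples, e.g.
    as could be gathered from `lsblk -o NAME,WWN` and sysfs. All names
    and the tuple shape here are illustrative assumptions.
    """
    by_wwid = defaultdict(list)
    for name, wwid, is_mpath_member in disks:
        if not is_mpath_member:
            by_wwid[wwid].append(name)
    # Two unclaimed disks sharing a WWID means the mpath device
    # was not assembled and the installer sees the raw paths.
    return {wwid: names for wwid, names in by_wwid.items() if len(names) > 1}

# Example: sda and sdb are two paths to the same LUN, left unclaimed.
print(find_unassembled_paths([
    ("sda", "wwid-A", False),
    ("sdb", "wwid-A", False),
    ("sdc", "wwid-B", True),
]))  # {'wwid-A': ['sda', 'sdb']}
```

In the healthy case every shared WWID is claimed by a multipath map, so the function returns an empty dict.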
I also can't reproduce, even with multiple attempts at reproducing a race (with both high and low vCPU counts).

Assigning back, since you can reproduce it.

It seems to be a symptom of bug 1259831.

It turned out that device-mapper-multipath-0.4.9-85.el7.x86_64 fixes this issue, as described in comment 8.

Can't reproduce this issue on multipath FC machine.

Test version:
RHEV-H 7.2 for RHEV 3.6 (rhev-hypervisor-7-7.2-20150831.0)
ovirt-node-3.3.0-0.2.20150722git7eba125.el7ev.noarch

Test machine:
hp-dl385pg8-11.qe.lab.eng.nay.redhat.com: multipath FC

Test result:
RHEV-H 3.6 clean install on multipath FC machine - pass.

Additional info:
RHEV-H 3.6 dirty install on multipath FC machine failed; see bug 1264317.

Created attachment 1080929 [details]: clean install successful on multipath FC disk
This bug is related to a race, so the reproducibility might be low.

(In reply to Fabian Deutsch from comment #12)
> This bug is related to a race, so the reproducibility might be low.

Hi fabiand, the bug's status is ON_QA now, but we can't reproduce this bug in our environment, so I can't verify it. Could you help us verify this bug? If you can, I will add the keyword "OtherQA" to this bug. Thanks!

Yes, I can test this bug. I'll mark it as VERIFIED if I don't see it in my further testing.

(In reply to Fabian Deutsch from comment #14)
> Yes, I can test this bug. I'll mark it as VERIFIED if I don't see it in my
> further testing.

Thanks!

(In reply to Fabian Deutsch from comment #14)
> Yes, I can test this bug. I'll mark it as VERIFIED if I don't see it in my
> further testing.

Fabian, did you see this issue recently? Can you verify it before 3.6 Beta 1?

No, I have not seen it lately. Let me verify this bug.

Since the problem described in this bug report should be resolved in a recent advisory, it has been closed with a resolution of ERRATA.

For information on the advisory, and where to find the updated files, follow the link below. If the solution does not work for you, open a new bug report.

https://rhn.redhat.com/errata/RHBA-2016-0378.html