Description of problem:
RHEV-H 3.6 fails to install on a multipath device; in my case, a VM with two disks that share the same serial.

Version-Release number of selected component (if applicable):
RHEV 3.6
ovirt-node-3.3.0-0.5.20150917git82fb4d4.el7ev

How reproducible:
Always

Steps to Reproduce:
1. Create a VM with two disks and set the same serial for both of them
2. Install RHEV-H on one disk
3. Installation fails

Actual results:

Expected results:

Additional info:
A similar problem occurs if you install RHEV-H on both disks
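For step 1, on a libvirt-managed VM the two disks can be given an identical serial through the domain XML. This is only an illustrative sketch: the image paths and the serial value are made up, not taken from the report. With matching serials, the guest's multipath layer should treat the two disks as two paths to one device.

```xml
<!-- Hypothetical libvirt disk definitions; paths and serial are examples. -->
<disk type='file' device='disk'>
  <driver name='qemu' type='qcow2'/>
  <source file='/var/lib/libvirt/images/disk1.qcow2'/>
  <target dev='sda' bus='scsi'/>
  <serial>0001</serial>
</disk>
<disk type='file' device='disk'>
  <driver name='qemu' type='qcow2'/>
  <source file='/var/lib/libvirt/images/disk2.qcow2'/>
  <target dev='sdb' bus='scsi'/>
  <serial>0001</serial>
</disk>
```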
Can't reproduce this issue on a multipath iSCSI machine.

Test version:
RHEV-H 7.2 for RHEV 3.6 (rhev-hypervisor-7-7.2-20150831.0)
ovirt-node-3.3.0-0.2.20150722git7eba125.el7ev.noarch

Test machine:
hp-z600-03: multipath iSCSI

Test result:
RHEV-H 3.6 install on a multipath iSCSI machine - pass.
Created attachment 1074415 [details] pass.png
Created attachment 1074500 [details] Multipath disk is detected as two separate ones
It seems that sometimes (race!) the mpath disk is not assembled correctly and the raw disks appear instead. When the raw disks appear, the installation fails; when the mpath disk appears, the installation completes.
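Whether the race was hit can be checked from a shell on the affected host by looking at the block device tree. This is a hedged sketch, not a procedure from the report: the `has_mpath` helper is a hypothetical name, and the device names in the comments are examples.

```shell
# Sketch: decide from `lsblk -rno NAME,TYPE` output whether a multipath
# device was assembled. If only raw "disk" entries (e.g. sda, sdb) show up
# and no "mpath" entry exists, the race was lost and installation fails.
has_mpath() {
    # Reads lsblk-style "NAME TYPE" lines on stdin; succeeds if any line
    # has type "mpath".
    grep -q ' mpath$'
}

# On a real host you would pipe live output, e.g.:
#   lsblk -rno NAME,TYPE | has_mpath && echo "mpath assembled" || echo "raw disks only"
```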
Created attachment 1074514 [details] sosreport from a failed run
I also can't reproduce this, even with multiple attempts at triggering the race (with both high and low vCPU counts). Assigning back, since you can reproduce it.
It seems to be a symptom of bug 1259831.
It turned out that device-mapper-multipath-0.4.9-85.el7.x86_64 fixes this issue, as described in comment 8.
Can't reproduce this issue on a multipath FC machine.

Test version:
RHEV-H 7.2 for RHEV 3.6 (rhev-hypervisor-7-7.2-20150831.0)
ovirt-node-3.3.0-0.2.20150722git7eba125.el7ev.noarch

Test machine:
hp-dl385pg8-11.qe.lab.eng.nay.redhat.com: multipath FC

Test result:
RHEV-H 3.6 clean install on a multipath FC machine - pass.

Additional info:
RHEV-H 3.6 dirty install on a multipath FC machine failed. Bug 1264317.
Created attachment 1080929 [details] clean install successful on multipath FC disk
This bug is related to a race, so its reproducibility might be low.
(In reply to Fabian Deutsch from comment #12)
> This bug is related to a race, so its reproducibility might be low.

Hi fabiand,

The bug's status is ON_QA now, but we can't reproduce this bug in our environment, so I can't verify it. Could you help us verify it? If you can, I will add the keyword "OtherQA" to this bug. Thanks!
Yes, I can test this bug. I'll mark it as VERIFIED if I don't see it in my further testing.
(In reply to Fabian Deutsch from comment #14)
> Yes, I can test this bug. I'll mark it as VERIFIED if I don't see it in my further testing.

Thanks!
(In reply to Fabian Deutsch from comment #14)
> Yes, I can test this bug. I'll mark it as VERIFIED if I don't see it in my further testing.

Fabian, did you see this issue recently? Can you verify it before 3.6 Beta 1?
No, I have not seen it lately. Let me verify this bug.
Since the problem described in this bug report should be resolved in a recent advisory, it has been closed with a resolution of ERRATA.

For information on the advisory, and where to find the updated files, follow the link below. If the solution does not work for you, open a new bug report.

https://rhn.redhat.com/errata/RHBA-2016-0378.html