Bug 1251608

Summary: Cannot install RHEV-H 6.6-20150603.0.el6ev from CD/ISO (not via RHEV-M)
Product: Red Hat Enterprise Virtualization Manager
Reporter: Robert McSwain <rmcswain>
Component: rhev-hypervisor
Assignee: Anatoly Litovsky <tlitovsk>
Status: CLOSED INSUFFICIENT_DATA
QA Contact: Virtualization Bugs <virt-bugs>
Severity: urgent
Priority: unspecified
Version: 3.5.3
CC: cshao, cwu, dfediuck, ecohen, fdeutsch, gklein, huiwa, huzhao, leiwang, lsurette, pstehlik, rmcswain, tlitovsk, yaniwang, ycui, yeylon
Target Milestone: ---
Target Release: ---
Hardware: x86_64
OS: Linux
Whiteboard: node
Doc Type: Bug Fix
Story Points: ---
Last Closed: 2015-09-16 10:42:57 UTC
Type: Bug
Regression: ---
Mount Type: ---
Documentation: ---
Category: ---
oVirt Team: Node
Cloudforms Team: ---
Bug Depends On: 1236738

Description Robert McSwain 2015-08-07 21:50:25 UTC
Description of problem:
Cannot upgrade to the most recent release with the error as shown in 6__7acce5d6-1f02-41ee-bd82-631ebc3bd093__.png:

An error appears in the UI: CalledProcessError()
Press ENTER  to log out ...
or enter 's' to drop to shell

Version-Release number of selected component (if applicable):
Red Hat Enterprise Virtualization Hypervisor release 6.6 (20150603.0.el6ev)

How reproducible:
Unknown; however, the system is a Dell PowerEdge C2100 (Version: A00).

Steps to Reproduce:
1. Install Hypervisor 6.6-20150603.0.el6ev
2. Choose U.S. English
3. Local FibreChannel (as shown in image 3__c29651ad-f1b7-4e0b-b3e2-16509117c2be__.png and 4__b57b7969-b62f-487d-9bdd-15d9a3701a25__.png)
4. Choose defaults for Storage Volumes
5. Observe the error:
An error appears in the UI: CalledProcessError()
Press ENTER  to log out ...
or enter 's' to drop to shell

Actual results:
An error appears in the UI: CalledProcessError()
Press ENTER  to log out ...
or enter 's' to drop to shell

Expected results:
Successful installation.

Additional info:
Images coming in a private update

Comment 2 Anatoly Litovsky 2015-08-09 11:16:17 UTC
I can't reproduce this locally with the latest node.
Please provide a link to the machine.

Comment 3 Fabian Deutsch 2015-08-10 09:37:45 UTC
We often see the CalledProcessError in the screenshots, which is a good indication of what might be going wrong, and in ovirt-node.log we can see:

2015-07-24 17:17:51,507       INFO Exception:
Traceback (most recent call last):
  File "/usr/lib/python2.6/site-packages/ovirt/node/app.py", line 304, in run
  File "/usr/lib/python2.6/site-packages/ovirt/node/ui/urwid_builder.py", line 441, in run
  File "/usr/lib64/python2.6/site-packages/urwid/main_loop.py", line 271, in run
  File "/usr/lib64/python2.6/site-packages/urwid/raw_display.py", line 241, in run_wrapper
  File "/usr/lib64/python2.6/site-packages/urwid/main_loop.py", line 336, in _run
  File "/usr/lib64/python2.6/site-packages/urwid/main_loop.py", line 707, in run
  File "/usr/lib64/python2.6/site-packages/urwid/main_loop.py", line 786, in _loop
  File "/usr/lib64/python2.6/site-packages/urwid/main_loop.py", line 387, in _update
  File "/usr/lib64/python2.6/site-packages/urwid/main_loop.py", line 487, in process_input
  File "/usr/lib64/python2.6/site-packages/urwid/container.py", line 1102, in keypress
  File "/usr/lib64/python2.6/site-packages/urwid/container.py", line 1559, in keypress
  File "/usr/lib64/python2.6/site-packages/urwid/container.py", line 2240, in keypress
  File "/usr/lib64/python2.6/site-packages/urwid/container.py", line 1102, in keypress
  File "/usr/lib64/python2.6/site-packages/urwid/decoration.py", line 618, in keypress
  File "/usr/lib64/python2.6/site-packages/urwid/container.py", line 1559, in keypress
  File "/usr/lib64/python2.6/site-packages/urwid/container.py", line 1559, in keypress
  File "/usr/lib64/python2.6/site-packages/urwid/decoration.py", line 618, in keypress
  File "/usr/lib64/python2.6/site-packages/urwid/container.py", line 1102, in keypress
  File "/usr/lib/python2.6/site-packages/ovirt/node/ui/widgets.py", line 759, in keypress
  File "/usr/lib64/python2.6/site-packages/urwid/container.py", line 1559, in keypress
  File "/usr/lib64/python2.6/site-packages/urwid/decoration.py", line 833, in keypress
  File "/usr/lib64/python2.6/site-packages/urwid/container.py", line 2240, in keypress
  File "/usr/lib64/python2.6/site-packages/urwid/container.py", line 1559, in keypress
  File "/usr/lib64/python2.6/site-packages/urwid/decoration.py", line 618, in keypress
  File "/usr/lib64/python2.6/site-packages/urwid/wimp.py", line 534, in keypress
  File "/usr/lib64/python2.6/site-packages/urwid/widget.py", line 463, in _emit
  File "/usr/lib64/python2.6/site-packages/urwid/signals.py", line 120, in emit
  File "/usr/lib/python2.6/site-packages/ovirt/node/ui/widgets.py", line 542, in on_click_cb
  File "/usr/lib64/python2.6/site-packages/urwid/signals.py", line 120, in emit
  File "/usr/lib/python2.6/site-packages/ovirt/node/ui/urwid_builder.py", line 111, in on_widget_click_cb
  File "/usr/lib/python2.6/site-packages/ovirt/node/base.py", line 103, in __call__
  File "/usr/lib/python2.6/site-packages/ovirt/node/base.py", line 85, in emit
  File "/usr/lib/python2.6/site-packages/ovirt/node/ui/__init__.py", line 201, in __call__
  File "/usr/lib/python2.6/site-packages/ovirt/node/app.py", line 176, in call_on_ui_save
  File "/usr/lib/python2.6/site-packages/ovirt/node/plugins.py", line 406, in _on_ui_save
  File "/usr/lib/python2.6/site-packages/ovirt/node/installer/core/storage_vol_page.py", line 155, in on_merge
  File "/usr/lib/python2.6/site-packages/ovirt/node/ui/__init__.py", line 807, in to_next_plugin
  File "/usr/lib/python2.6/site-packages/ovirt/node/ui/__init__.py", line 802, in to_nth
  File "/usr/lib/python2.6/site-packages/ovirt/node/ui/__init__.py", line 787, in to_plugin
  File "/usr/lib/python2.6/site-packages/ovirt/node/app.py", line 243, in switch_to_plugin
  File "/usr/lib/python2.6/site-packages/ovirt/node/installer/core/confirmation_page.py", line 66, in ui_content
  File "/usr/lib/python2.6/site-packages/ovirt/node/installer/core/confirmation_page.py", line 128, in _storage_tagged
  File "/usr/lib/python2.6/site-packages/ovirt/node/utils/system.py", line 1170, in pv_names
  File "/usr/lib/python2.6/site-packages/ovirt/node/utils/system.py", line 1179, in _query_vgs
  File "/usr/lib/python2.6/site-packages/ovirt/node/utils/process.py", line 166, in check_output
  File "/usr/lib/python2.6/site-packages/ovirt/node/utils/process.py", line 225, in pipe
CalledProcessError: Command '['lvm', 'vgs', '--noheadings', '-o', 'pv_name', u'Found duplicate PV XXXXXXXXXXXXXXXXX: using /dev/sdb2 not /dev/sda2']' returned non-zero exit status 3
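The failure mode in the traceback can be sketched in isolation. This is a minimal, hypothetical stand-in for the `lvm vgs` call made by ovirt-node's `process.check_output`, not the actual ovirt-node code: a shell command plays the role of lvm printing the "Found duplicate PV" warning and exiting with status 3, which `subprocess` surfaces as a CalledProcessError.

```python
import subprocess

def query_pv_names():
    # Stand-in for ovirt-node's call to `lvm vgs --noheadings -o pv_name`:
    # any non-zero exit status is raised as CalledProcessError.
    return subprocess.check_output(
        ["sh", "-c",
         "echo 'Found duplicate PV: using /dev/sdb2 not /dev/sda2' >&2; exit 3"]
    )

try:
    query_pv_names()
except subprocess.CalledProcessError as e:
    # Matches the "returned non-zero exit status 3" seen in the log above.
    print("lvm exited with status", e.returncode)
```

Since the installer's confirmation page calls this unconditionally while building its UI, a single duplicate-PV warning from a dirty disk is enough to abort the whole TUI with the opaque `CalledProcessError()` message.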

Comment 4 Huijuan Zhao 2015-08-10 10:36:45 UTC
I cannot reproduce this bug.

Version-Release number of selected component:
Red Hat Enterprise Virtualization Hypervisor release 6.6 (20150603.0.el6ev)
ovirt-node-3.2.3-3.el6.noarch


Steps to Reproduce:
1. Install Hypervisor 6.6-20150603.0.el6ev
2. Choose U.S. English
3. Local FibreChannel (test machine info: hp-dl385pg8-11.qe.lab.eng.nay.redhat.com, 360050763008084e6e00000000000004c, 30G)
4. Choose defaults for Storage Volumes
5. Successful installation

Expected results:
Successful installation.

Actual results:
Successful installation.

Additional info:
Please try a clean setup again; the problem may not recur.

Comment 5 Fabian Deutsch 2015-08-10 14:57:22 UTC
To my understanding, the bug can be reproduced when the data LV is on a multipath device that is not being assembled correctly.

Untested reproduction steps:

1. Install RHEV-H on multipath device, make sure the data / installation disk is a multipath device
2. Upgrade RHEV-H

Comment 6 Ying Cui 2015-08-11 05:13:42 UTC
The customer ticket includes version info for three RHEV-H releases. The device is single-path (see picture 4.png), and the controller model is an MR9260-8i MegaRAID on a Dell PowerEdge C2100.

rhevh 6.3 20130129.0.el6_3
rhevh 6.6 20150128.0.el6
rhevh 6.6 20150603.0.el6

1. Uninstall rhevh 6.3 20130129.0.el6_3
2. Install rhevh 6.6 20150128.0.el6; at this point the devices /dev/sdb and /dev/sdb2 appear to be dirty
3. Reinstall rhevh 6.6 20150603.0.el6 (see pic VoKhxIK.jpg)

Comment 8 Fabian Deutsch 2015-08-11 07:29:54 UTC
Robert, it looks like the disks are dirty, and contain data from a previous installation. Can you try to manually clean up the disk?

Also: RHEV-H does not support "software RAID" (non-hardware RAID), please make sure that any software RAID functionality is turned off.

Once the disks are clean and any software RAID is disabled, please retry the installation.
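The manual cleanup suggested above can be sketched as follows. This is a hedged example demonstrated on a disk image file rather than a real disk; the device name `/dev/sdb` in the comments is illustrative only, and these commands destroy on-disk metadata, so double-check the target before running them on real hardware.

```shell
# Use a disk image as a safe stand-in for the dirty disk.
truncate -s 4M disk.img

# Simulate stale metadata left behind by a previous installation.
printf 'stale-lvm-metadata' | dd of=disk.img bs=1 seek=512 conv=notrunc 2>/dev/null

# Zero the first MiB to clear the old partition table and LVM PV labels.
dd if=/dev/zero of=disk.img bs=1M count=1 conv=notrunc 2>/dev/null

# On a real device you would instead run, for example:
#   wipefs -a /dev/sdb      # remove filesystem/RAID/LVM signatures
#   partprobe /dev/sdb      # ask the kernel to re-read the partition table
```

Clearing the stale PV labels removes the "Found duplicate PV" condition that made `lvm vgs` exit non-zero during installation.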

Comment 9 Huijuan Zhao 2015-08-11 10:22:28 UTC
Still cannot reproduce this bug.

Version-Release number of selected component:
Red Hat Enterprise Virtualization Hypervisor release 6.6 (20150128.0.el6ev)
Red Hat Enterprise Virtualization Hypervisor release 6.6 (20150603.0.el6ev)


Steps to Reproduce:
1. Uninstall.
2. Install Hypervisor 6.6-20150128.0.el6ev
3. Reinstall Hypervisor 6.6-20150603.0.el6ev
4. Choose U.S. English
5. Local FibreChannel
6. Choose defaults for Storage Volumes
7. Successful installation

Environment to Reproduce:
Reproduce it in 2 test machines:
1. Dell R210 (we do not have a Dell C2100, so using a Dell R210), machine info: https://intel-x3470-8-1-ipmi.englab.nay.redhat.com/login.html
2. hp-bl460cg7-02, machine info: iLO2: 10.66.73.1

Expected results:
Successful installation.

Actual results:
Successful installation.

Additional info:
Please check the installation media and hardware information.

Comment 10 Fabian Deutsch 2015-09-16 10:42:57 UTC
The likely cause was pointed out in comment 9.

If there are still questions please re-open this bug.

Comment 11 Red Hat Bugzilla 2023-09-14 03:03:23 UTC
The needinfo request[s] on this closed bug have been removed as they have been unresolved for 1000 days