Bug 1370458 - LVM error reported during installation; cannot continue to install RHVH
Summary: LVM error reported during installation; cannot continue to install RHVH
Keywords:
Status: CLOSED CURRENTRELEASE
Alias: None
Product: Red Hat Enterprise Linux 7
Classification: Red Hat
Component: anaconda
Version: 7.2
Hardware: Unspecified
OS: Unspecified
Priority: high
Severity: high
Target Milestone: pre-dev-freeze
Target Release: ---
Assignee: Anaconda Maintenance Team
QA Contact: Release Test Team
URL:
Whiteboard:
Depends On:
Blocks: ovirt-node-ng-platform
 
Reported: 2016-08-26 11:30 UTC by Huijuan Zhao
Modified: 2016-12-23 05:24 UTC
CC List: 13 users

Fixed In Version:
Doc Type: If docs needed, set a value
Doc Text:
Clone Of:
Environment:
Last Closed: 2016-12-23 05:24:27 UTC
Target Upstream Version:
Embargoed:


Attachments
screenshot of error (203.46 KB, application/x-gzip), 2016-08-26 11:30 UTC, Huijuan Zhao
logs in /var/log/ on RHVH (23.17 KB, application/x-gzip), 2016-08-26 11:32 UTC, Huijuan Zhao

Description Huijuan Zhao 2016-08-26 11:30:58 UTC
Created attachment 1194309 [details]
screenshot of error

Description of problem:
An LVM error is reported during installation, and the installation cannot continue.

Version-Release number of selected component (if applicable):
RHVH-4.0-20160822.8-RHVH-x86_64-dvd1.iso


How reproducible:
Tested 2 times on machine hp-dl385g8-03; reproduced 2 times with RHVH-4.0-20160822.8.iso.
Tested 10+ times on other machines (including Dell/HP, iSCSI/FC/EFI); did not encounter this issue with RHVH-4.0-20160822.8.iso.

Steps to Reproduce:
1. Install RHVH-4.0-20160822.8-RHVH-x86_64-dvd1.iso via virtual CD-ROM on machine hp-dl385g8-03 (1 local disk, 3 FC LUNs)
2. During installation, once the anaconda GUI starts and shows the Welcome to RHVH 4.0 page, take no further action in the GUI; after waiting a short while (about 5 seconds), the error is displayed in the anaconda GUI.


Actual results:
After step 2, the message "An unknown error has occurred" is displayed and the RHVH installation cannot continue.
Please refer to the attachments for detailed information.


Expected results:
After step 2, no error should appear and RHVH should install successfully.

Additional info:
Access to the environment (machine console) is available if necessary; I will email the environment details to you later.

Comment 1 Huijuan Zhao 2016-08-26 11:32:10 UTC
Created attachment 1194310 [details]
logs in /var/log/ on RHVH

Comment 2 Huijuan Zhao 2016-08-26 11:33:58 UTC
The logs (tmplog385g8-03.tar.gz) in /tmp/ on RHVH are too big to upload to Bugzilla; please use link [1] to get them:
[1] http://10.66.10.3:8000/tmplog385g8-03.tar.gz

Comment 3 Fabian Deutsch 2016-08-26 12:43:36 UTC
Moving this to anaconda as well, as the bug does not originate from the %post part.

Comment 4 Ying Cui 2016-08-26 12:48:59 UTC
From anaconda.log:
<snip>
11:04:22,363 CRIT anaconda: Traceback (most recent call last):

  File "/usr/lib64/python2.7/site-packages/pyanaconda/threads.py", line 227, in run
    threading.Thread.run(self, *args, **kwargs)

  File "/usr/lib64/python2.7/threading.py", line 764, in run
    self.__target(*self.__args, **self.__kwargs)

  File "/usr/lib64/python2.7/site-packages/pyanaconda/packaging/__init__.py", line 1234, in _runThread
    threadMgr.wait(THREAD_STORAGE)

  File "/usr/lib64/python2.7/site-packages/pyanaconda/threads.py", line 112, in wait
    self.raise_if_error(name)

  File "/usr/lib64/python2.7/site-packages/pyanaconda/threads.py", line 227, in run
    threading.Thread.run(self, *args, **kwargs)

  File "/usr/lib64/python2.7/threading.py", line 764, in run
    self.__target(*self.__args, **self.__kwargs)

  File "/usr/lib/python2.7/site-packages/blivet/__init__.py", line 184, in storageInitialize
    storage.reset()

  File "/usr/lib/python2.7/site-packages/blivet/__init__.py", line 489, in reset
    self.devicetree.populate(cleanupOnly=cleanupOnly)

  File "/usr/lib/python2.7/site-packages/blivet/devicetree.py", line 2228, in populate
    self._populate()

  File "/usr/lib/python2.7/site-packages/blivet/devicetree.py", line 2295, in _populate
    self.addUdevDevice(dev)

  File "/usr/lib/python2.7/site-packages/blivet/devicetree.py", line 1285, in addUdevDevice
    self.handleUdevDeviceFormat(info, device)

  File "/usr/lib/python2.7/site-packages/blivet/devicetree.py", line 1981, in handleUdevDeviceFormat
    self.handleUdevLVMPVFormat(info, device)

  File "/usr/lib/python2.7/site-packages/blivet/devicetree.py", line 1629, in handleUdevLVMPVFormat
    self.handleVgLvs(vg_device)

  File "/usr/lib/python2.7/site-packages/blivet/devicetree.py", line 1566, in handleVgLvs
    addLV(lv)

  File "/usr/lib/python2.7/site-packages/blivet/devicetree.py", line 1549, in addLV
    lv_device.setup()

  File "/usr/lib/python2.7/site-packages/blivet/devices/storage.py", line 413, in setup
    self._setup(orig=orig)

  File "/usr/lib/python2.7/site-packages/blivet/devices/lvm.py", line 690, in _setup
    lvm.lvactivate(self.vg.name, self._name)

  File "/usr/lib/python2.7/site-packages/blivet/devicelibs/lvm.py", line 538, in lvactivate
    raise LVMError("lvactivate failed for %s: %s" % (lv_name, msg))

LVMError: lvactivate failed for swap: running lvm lvchange -a y --config  devices { preferred_names=["^/dev/mapper/", "^/dev/md/", "^/dev/sd"] }  r4b/swap failed

</snip>
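
For context, the traceback shows blivet's storage scan (devicetree.populate) discovering an existing LVM PV on the attached storage, walking its VG/LVs, and trying to activate the "swap" LV of VG "r4b" via "lvm lvchange -a y"; the non-zero exit of that command is what surfaces as the LVMError. The snippet below is only a minimal sketch of that last step, not the actual blivet code (the LVM_CONFIG constant and the error text are illustrative):

# Minimal sketch (not the real blivet implementation) of the failing step:
# build the "lvm lvchange -a y --config ..." command and raise LVMError
# when it exits non-zero, matching the behaviour seen in the traceback.
import subprocess

class LVMError(Exception):
    pass

# Device-name filter matching the one visible in the error message (illustrative constant).
LVM_CONFIG = 'devices { preferred_names=["^/dev/mapper/", "^/dev/md/", "^/dev/sd"] }'

def lvactivate(vg_name, lv_name):
    argv = ["lvm", "lvchange", "-a", "y", "--config", LVM_CONFIG,
            "%s/%s" % (vg_name, lv_name)]
    rc = subprocess.call(argv)  # exit status of lvchange
    if rc != 0:
        raise LVMError("lvactivate failed for %s: running %s failed"
                       % (lv_name, " ".join(argv)))

# In this bug the equivalent call was lvactivate("r4b", "swap"), i.e.
# activating the "swap" LV of VG "r4b" found on the attached storage.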

Comment 5 Fabian Deutsch 2016-08-26 12:55:48 UTC
Ying, does it work if you manually try to activate the swap partition from the console?

Comment 6 Ying Cui 2016-09-18 10:01:51 UTC
Huijuan, see comment 5; could you give it a try?

Comment 7 Huijuan Zhao 2016-09-20 02:18:07 UTC
(In reply to Fabian Deutsch from comment #5)
> Ying, does it work if you manually try to activate the swap partition from
> the console?

Fabian, do you mean manually activating the swap partition on the partitioning page during installation? This bug occurs before entering the partitioning page.

Comment 8 Fabian Deutsch 2016-10-28 09:18:33 UTC
The flow I was thinking about:

1. Boot installer
2. Switch to console and run: lvm lvchange -a y --config 'devices { preferred_names=["^/dev/mapper/", "^/dev/md/", "^/dev/sd"] }' r4b/swap

After step 2, check the return code (a sketch of this check follows).
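
As a hedged sketch of that check (run from a console during installation, e.g. on tty2): invoke the same lvchange command that blivet runs and inspect its return code; 0 means activation succeeded, non-zero reproduces the failure anaconda hit. The VG/LV name "r4b/swap" is taken from the error message above; adjust it for the machine being tested.

# Sketch of the manual check from comment 8 (assumes the "lvm" binary is
# available on the installer console, as it normally is).
import subprocess

config = 'devices { preferred_names=["^/dev/mapper/", "^/dev/md/", "^/dev/sd"] }'
rc = subprocess.call(["lvm", "lvchange", "-a", "y", "--config", config, "r4b/swap"])
print("lvchange returned %d" % rc)  # non-zero reproduces the LVMError path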

Comment 9 Jan Stodola 2016-12-19 08:34:58 UTC
Huijuan,
log files from comment 2 are not accessible, could you please retest and upload /tmp/anaconda-tb-* when you encounter the issue again?
Could you also retest with RHEL-7.3?
Thank you

Comment 10 Huijuan Zhao 2016-12-20 05:55:12 UTC
(In reply to Jan Stodola from comment #9)
> Huijuan,
> log files from comment 2 are not accessible, could you please retest and
> upload /tmp/anaconda-tb-* when you encounter the issue again?

Jan, please refer to the link below for the log files:
http://10.66.8.184/cshao/log/tmplog385g8-03.tar.gz

> Could you also retest with RHEL-7.3?
> Thank you

And I will retest with RHEL-7.3 as soon as possible, thanks.

Comment 12 Huijuan Zhao 2016-12-23 05:24:27 UTC
No such issue in RHEL-7.3 (RHEL-7.3-20161019.0-Server-x86_64-dvd1.iso).
No such issue in RHVH-4.0.6 (RHVH-4.0-20161214.0-RHVH-x86_64-dvd1.iso) either.

So this issue has been fixed in the latest RHVH-4.0.6 (RHVH-4.0-20161214.0-RHVH-x86_64-dvd1.iso); I will close this bug.

