Bug 593871

Summary: cannot activate VG with missing PV(s)
Product: Red Hat Enterprise Linux 6
Reporter: Brian Lane <bcl>
Component: anaconda
Assignee: Hans de Goede <hdegoede>
Status: CLOSED CURRENTRELEASE
QA Contact: Release Test Team <release-test-team-automation>
Severity: urgent
Priority: low
Version: 6.0
CC: atodorov, hdegoede, snagar
Target Milestone: rc
Hardware: All
OS: Linux
Fixed In Version: anaconda-13.21.44-1
Doc Type: Bug Fix
Last Closed: 2010-07-02 20:49:52 UTC

Attachments:
traceback (no flags)

Description Brian Lane 2010-05-19 22:02:24 UTC
Created attachment 415273 [details]
traceback

Description of problem:
Created an updates image with all changes since v13.21.39-1 and booted the 20100512 Workstation DVD with that updates image.

After formatting some of the partitions, the install fails with the error: cannot activate VG with missing PV(s)


How reproducible:
Always

I modified my updates.img to include only the change I was trying to test, and it ran fine. If I create an updates image with all changes since 13.21.43, it contains both my mount sanity changes and the storage module changes, and the install fails. If I remove the storage changes, leaving only my changes since 13.21.43, it runs correctly.

Comment 2 Hans de Goede 2010-05-20 09:23:01 UTC
Oh, brown paper bag bug. I made two innocent-looking changes to the patchset in question after some discussions with dlehman (but this is entirely my fault), and I did not retest.

The exception you are seeing used to be thrown based on this check:
if len(self.parents) < self.pvCount:

Which I changed to:
if not self.complete:

Which boils down to:
if len(self.parents) != self.pvCount:

But self.pvCount never gets set for non-pre-existing VGs, so it is always 0. We didn't trip over this before because of the (wrong) < check in the original code.
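
For illustration only, here is a minimal, self-contained Python sketch of that failure mode; FakeVG and its attributes are hypothetical stand-ins for anaconda's LVMVolumeGroupDevice in storage/devices.py, not the actual implementation:

# Hypothetical, simplified illustration of the faulty check.
class FakeVG:
    def __init__(self, parents, exists=False, pv_count=0):
        self.parents = parents          # PVs the VG is built from
        # pvCount is only populated when scanning a pre-existing VG;
        # for a VG that is about to be created it stays 0.
        self.pvCount = pv_count if exists else 0

    @property
    def complete(self):
        # The changed check: equality instead of the old '<' comparison.
        return len(self.parents) == self.pvCount

    def setup(self):
        if not self.complete:
            raise RuntimeError("cannot activate VG with missing PV(s)")

# A new (non-existing) VG with two PVs: pvCount is 0, so
# len(parents) != pvCount and setup() raises even though no PV is missing.
vg = FakeVG(parents=["sda2", "sdb1"], exists=False)
try:
    vg.setup()
except RuntimeError as e:
    print(e)   # -> cannot activate VG with missing PV(s)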

Patch coming up.

Note to PM and QA: this bug makes any install involving LVM fail, so can we get acks please?

Comment 3 Hans de Goede 2010-05-20 09:53:51 UTC
Fixed in anaconda-13.21.44-1, moving to MODIFIED.

Comment 4 Alexander Todorov 2010-05-20 09:55:04 UTC
The traceback is:

anaconda 13.21.39 exception report
Traceback (most recent call first):
  File "/tmp/updates/storage/devices.py", line 1930, in setup
    raise DeviceError("cannot activate VG with missing PV(s)", self.name)
  File "/tmp/updates/storage/devices.py", line 1969, in create
    self.setup()
  File "/tmp/updates/storage/deviceaction.py", line 203, in execute
    self.device.create(intf=intf)
  File "/tmp/updates/storage/devicetree.py", line 701, in processActions
    action.execute(intf=self.intf)
  File "/tmp/updates/storage/__init__.py", line 292, in doIt
    self.devicetree.processActions()
  File "/tmp/updates/packages.py", line 109, in turnOnFilesystems
    anaconda.id.storage.doIt()
  File "/usr/lib/anaconda/dispatch.py", line 205, in moveStep
    rc = stepFunc(self.anaconda)
  File "/usr/lib/anaconda/dispatch.py", line 126, in gotoNext
    self.moveStep()
  File "/tmp/updates/gui.py", line 1423, in setScreen
    self.anaconda.dispatch.gotoNext()
  File "/tmp/updates/gui.py", line 1336, in nextClicked
    self.setScreen ()
DeviceError: ('cannot activate VG with missing PV(s)', 'VolGroup')

Comment 5 Brian Lane 2010-05-20 15:52:44 UTC
Perfect! This fixes the problem for me. I retested with the same setup; the only difference was that the updates image included this morning's commits, and it works fine.

Comment 7 Alexander Todorov 2010-05-25 16:13:21 UTC
With snapshot #5 (0523.0) I did two installs:

1) Using defaults (LVM) with 2 disks
2) Using all space (LVM) with only the second disk

There was no error. Moving to VERIFIED.

Comment 8 releng-rhel@redhat.com 2010-07-02 20:49:52 UTC
Red Hat Enterprise Linux Beta 2 is now available and should resolve
the problem described in this bug report. This report is therefore being closed
with a resolution of CURRENTRELEASE. You may reopen this bug report if the
solution does not work for you.