Bug 1330161
| Summary: | blivet.errors.DeviceError: ('array is not fully defined', 'home') |
|---|---|
| Product: | Fedora |
| Component: | mdadm |
| Version: | 24 |
| Hardware: | ppc64 |
| OS: | Unspecified |
| Status: | CLOSED CURRENTRELEASE |
| Severity: | unspecified |
| Priority: | unspecified |
| Reporter: | Menanteau Guy <menantea> |
| Assignee: | Nigel Croxon <ncroxon> |
| QA Contact: | Fedora Extras Quality Assurance <extras-qa> |
| CC: | agk, anaconda-maint-list, dan, dledford, g.kaviyarasu, Jes.Sorensen, jonathan, menantea, ncroxon, vanmeeuwen+fedora, xni |
| Whiteboard: | abrt_hash:92f71cdc726e5d0be411a39aee9953dea19ca19e63c9900fadf9c212b5777008 |
| Doc Type: | Bug Fix |
| Last Closed: | 2017-07-20 13:45:46 UTC |
| Bug Blocks: | 1071880, 1320886 |
Description
Menanteau Guy
2016-04-25 13:32:14 UTC
Created attachment 1150477: anaconda-tb
Created attachment 1150478: anaconda.log
Created attachment 1150479: dnf.log
Created attachment 1150480: environ
Created attachment 1150481: lsblk_output
Created attachment 1150482: lvm.log
Created attachment 1150483: nmcli_dev_list
Created attachment 1150484: os_info
Created attachment 1150485: program.log
Created attachment 1150486: storage.log
Created attachment 1150487: syslog
Created attachment 1150488: ifcfg.log
Created attachment 1150489: packaging.log
Note that this problem is not present on ppc64le. It seems likely that udev is not reporting the array's UUID when we ask for it after creating the array. On the failed install machine, I can get the UUID through the following command:
[anaconda root@localhost tmp]# mdadm --examine --scan
ARRAY /dev/md/home metadata=1.2 UUID=67a04a5b:38a1de07:9c3eba89:3e1aaed3 name=localhost:home
But in program.log I don't see the UUID in the output of the mdadm --detail command.
Also, if I run the command manually, I get:
[anaconda root@localhost tmp]# mdadm --detail --test /dev/md/home
/dev/md/home:
Version : 1.2
Creation Time : Fri Apr 29 11:02:57 2016
Raid Level : raid1
Array Size : 10485760 (10.00 GiB 10.74 GB)
Used Dev Size : 10485760 (10.00 GiB 10.74 GB)
Raid Devices : 2
Total Devices : 2
Persistence : Superblock is persistent
Intent Bitmap : Internal
Update Time : Fri Apr 29 10:13:58 2016
State : active
Active Devices : 2
Working Devices : 2
Failed Devices : 0
Spare Devices : 0
Number Major Minor RaidDevice State
0 8 5 0 active sync /dev/sda5
1 8 18 1 active sync /dev/sdb2
There is no UUID!
I expected additional lines such as:
Name : localhost:home (local to host localhost)
UUID : 67a04a5b:38a1de07:9c3eba89:3e1aaed3
Events : ...
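udev imports array properties from mdadm in key=value form, so if the UUID is missing from the --detail output, the MD_UUID property presumably never reaches udev, and blivet then fails with "array is not fully defined". A quick way to see what udev would receive (a sketch; that the udev md rules import the --detail --export output is my assumption about the wiring):

```
# Key=value form of --detail; roughly what the udev md rules import
# (assumption about the wiring). On a healthy array this includes MD_UUID:
mdadm --detail --export /dev/md/home
# expected on a working system (values from the --examine output above):
#   MD_LEVEL=raid1
#   MD_UUID=67a04a5b:38a1de07:9c3eba89:3e1aaed3
#   MD_NAME=localhost:home
```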
I transferred the bug to the mdadm component because I reproduced the problem on an already-installed ppc64 VM when I created RAID1 partitions manually.
To compare outputs, I did the same process on a ppc64 VM and a ppc64le VM.
First I installed the ppc64 VM and the ppc64le VM.
Both machines were installed with two 20 GB disks and the following partitions:
Device Boot Start End Sectors Size Id Type
/dev/sda1 * 2048 10239 8192 4M 41 PPC PReP Boot
/dev/sda2 10240 1034239 1024000 500M 83 Linux
/dev/sda3 1034240 15734783 14700544 7G 8e Linux LVM
Device Boot Start End Sectors Size Id Type
/dev/sdb1 2048 14702591 14700544 7G 8e Linux LVM
Then I manually created an extended partition, sda4, using fdisk:
/dev/sda4 15734784 41943039 26208256 12.5G 5 Extended
and one 10 GB RAID1 partition on each disk, sda5 and sdb2:
/dev/sda5 15736832 36708351 20971520 10G fd Linux raid autodetect
/dev/sdb2 14702592 35674111 20971520 10G fd Linux raid autodetect
Then I used this command:
mdadm --create /dev/md/home --run --level=raid1 --raid-devices=2 --metadata=default --bitmap=internal /dev/sda5 /dev/sdb2
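For reference, the manual steps above condense into a short non-interactive script (a sketch: sfdisk stands in for the interactive fdisk session, and I assume sfdisk --append numbers sda5 as a logical partition inside the new extended partition, as fdisk did):

```
# Append the extended partition and the two RAID members; each stdin
# line is a start,size,type triplet taken from the fdisk layout above
# (assumption: --append creates sda5 as a logical partition).
echo '15734784,26208256,5'  | sfdisk --append /dev/sda   # sda4, extended
echo '15736832,20971520,fd' | sfdisk --append /dev/sda   # sda5, RAID member
echo '14702592,20971520,fd' | sfdisk --append /dev/sdb   # sdb2, RAID member

# Create the RAID1 array exactly as in the report.
mdadm --create /dev/md/home --run --level=raid1 --raid-devices=2 \
      --metadata=default --bitmap=internal /dev/sda5 /dev/sdb2
```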
When I then ran the following command:
mdadm --detail /dev/md/home
I got different output on ppc64 and ppc64le.
Specifically, on ppc64le I get the UUID:
/dev/md/home:
Version : 1.2
Creation Time : Tue May 3 04:11:21 2016
Raid Level : raid1
Array Size : 10477568 (9.99 GiB 10.73 GB)
Used Dev Size : 10477568 (9.99 GiB 10.73 GB)
Raid Devices : 2
Total Devices : 2
Persistence : Superblock is persistent
Intent Bitmap : Internal
Update Time : Tue May 3 04:30:43 2016
State : clean, resyncing
Active Devices : 2
Working Devices : 2
Failed Devices : 0
Spare Devices : 0
Name : home
UUID : 1afa7914:13b82dee:89aa091f:01ce6bd8
Events : 225
Number Major Minor RaidDevice State
0 8 5 0 active sync /dev/sda5
1 8 18 1 active sync /dev/sdb2
whereas on ppc64 I don't get the UUID:
/dev/md/home:
Version : 1.2
Creation Time : Mon May 2 12:05:12 2016
Raid Level : raid1
Array Size : 10477568 (9.99 GiB 10.73 GB)
Used Dev Size : 10477568 (9.99 GiB 10.73 GB)
Raid Devices : 2
Total Devices : 2
Persistence : Superblock is persistent
Intent Bitmap : Internal
Update Time : Mon May 2 12:19:34 2016
State : clean
Active Devices : 2
Working Devices : 2
Failed Devices : 0
Spare Devices : 0
Number Major Minor RaidDevice State
0 8 5 0 active sync /dev/sda5
1 8 18 1 active sync /dev/sdb2
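Since mdadm --examine --scan did report the UUID on the failed ppc64 install machine, the superblock on disk appears intact; comparing the two query paths side by side isolates the fault to the running-array (--detail) path (a sketch, assuming --examine behaves the same on this VM):

```
# Superblock path: read the 1.2 metadata directly from a member disk.
mdadm --examine /dev/sda5 | grep 'Array UUID'

# Running-array path: query the assembled device; on ppc64 this is the
# path that comes back without the Name/UUID/Events lines.
mdadm --detail /dev/md/home | grep -E 'Name|UUID|Events' || echo 'not reported'
```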
Unable to install Fedora on RAID disks; the problem is still present in F24 RC1. None of the MD code is processor-specific.
Tested on Fedora release 26.
On the "Installation Destination" panel, I selected 2 blank disks (2 x 20 GB)
and chose "I will configure partitioning".
Then I used "Click here to create them automatically".
I got a /boot partition of 1024 MiB,
a / partition of 15 GB,
and a swap partition of 3.98 GB.
I then reduced the / partition size to 10 GB (LVM)
and clicked "Update Settings".
Then I created a /home partition of 10 GB (LVM),
changed it to select "RAID" in "Device Type" and "RAID1" in "RAID Level",
and clicked "Update Settings" again.
After the reboot, mdadm --detail /dev/md127 shows:
/dev/md127:
Version : 1.2
Creation Time : Thu Jul 20 08:32:00 2017
Raid Level : raid1
Array Size : 10485760 (10.00 GiB 10.74 GB)
Used Dev Size : 10485760 (10.00 GiB 10.74 GB)
Raid Devices : 2
Total Devices : 2
Persistence : Superblock is persistent
Intent Bitmap : Internal
Update Time : Thu Jul 20 08:59:02 2017
State : clean
Active Devices : 2
Working Devices : 2
Failed Devices : 0
Spare Devices : 0
Name : localhost:home
UUID : 8adf2e4c:f32ded4a:2b8a9005:10ee1149
Events : 29
Number Major Minor RaidDevice State
0 8 3 0 active sync /dev/sda3
1 8 18 1 active sync /dev/sdb2
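To confirm the UUID now also propagates through udev (the path the installer relies on), something like the following can be checked on the installed system (a sketch):

```
mdadm --examine --scan                                          # superblock view
mdadm --detail --export /dev/md127 | grep MD_UUID               # running-array view
udevadm info --query=property --name=/dev/md127 | grep MD_UUID  # udev database view
```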
I confirm it is working fine now on an F26 ppc64 VM. I am closing this problem.