From Bugzilla Helper:
User-Agent: Mozilla/4.76 [en] (X11; U; Linux 2.4.2-0.1.28enterprise i686)
I have been testing Red Hat Wolverine on my Dell PowerEdge 1300 server.
Long story but I currently have everything running off a single 20Gb IDE
drive transitioning towards three Ultra160 9Gb drives off an Adaptec U2W
controller. I am moving that all to software RAID as soon as I can get it
booting, which I currently can't. I created two software RAID partitions,
/dev/md0 and /dev/md1. /dev/md0 is RAID 1 and /dev/md1 is RAID 5, and they
both are reiserfs partitions. I used "find <old partition name> -xdev |
cpio -pm <new partition name>" to move things from the IDE drive to the
RAID partitions (which incidentally causes another problem - dmesg shows
messages about running out of memory and the kernel starts killing
processes, like httpd and mysqld, but the copy finishes fine...). I set up
lilo for a bootable RAID configuration and rebooted. I haven't gotten the
boot process past the LI prompt, but I can boot from floppy disk.
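For comparison, a minimal lilo.conf for booting a RAID 1 root usually looks something like the sketch below. This is not the reporter's actual config; the kernel and initrd paths are guesses based on the versions mentioned in the report:

```
# /etc/lilo.conf sketch -- hypothetical, not the reporter's file.
# boot=/dev/md0 tells the RAID-aware lilo to write the boot record
# to the underlying disks of the RAID 1 set.
boot=/dev/md0
root=/dev/md0
read-only
image=/boot/vmlinuz-2.4.2-0.1.28enterprise
    label=linux
    initrd=/boot/initrd-2.4.2-0.1.28enterprise.img
```

A stopping "LI" prompt classically means the second-stage loader was found but could not be loaded, which is why the boot= target and disk geometry are the first things to check after any change like this.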
This works fine: the RAM disk loads, then the kernel image loads. It
attempts to mount /dev/md0, and then the kernel panics with the following:
swapper (pid 1) used obsolete MD ioctl, upgrade your software to use new
ictls <this is verbatim>
kernel panic: unable to mount device (09:00) on VFS <from my flawed memory>
The only problem, unless I'm mistaken, is that no software besides the
kernel is loaded yet. So what software needs to be upgraded? Is there a
newer version of raidtools available?
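For reference, raidtools 0.90 reads /etc/raidtab. A sketch matching the layout described (RAID 1 /dev/md0, RAID 5 /dev/md1 across the three U2W disks) might look like the following; the sdX partition names are hypothetical, since the report doesn't give them:

```
# /etc/raidtab sketch -- device names are hypothetical.
raiddev /dev/md0
    raid-level            1
    nr-raid-disks         2
    persistent-superblock 1
    chunk-size            64
    device                /dev/sda1
    raid-disk             0
    device                /dev/sdb1
    raid-disk             1

raiddev /dev/md1
    raid-level            5
    nr-raid-disks         3
    persistent-superblock 1
    parity-algorithm      left-symmetric
    chunk-size            64
    device                /dev/sda2
    raid-disk             0
    device                /dev/sdb2
    raid-disk             1
    device                /dev/sdc2
    raid-disk             2
```

For a bootable root RAID, persistent-superblock 1 matters: the kernel's RAID autodetection at boot needs a persistent superblock on each member, plus partition type 0xfd on the member partitions.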
I've upgraded to kernel 2.4.2-0.1.28enterprise, raidtools-0.90-13,
initscripts-5.78, lilo-21.4.4-13 and reiserfs-utils-3.x.0f-1 from RawHide
after I ran into these initial problems with Wolverine, but nothing has
made a difference.
Steps to Reproduce:
1. Boot from root RAID 1, /dev/md0
Actual Results: As described, the kernel panics. The error message
appears to be coming from linux/drivers/md/md.c, around line 2884, which
appears to be the default case when no other ioctl matches.
Expected Results: Should be able to boot from root RAID.
Actually, one thing I haven't tried upgrading yet is SysVinit - I've just
done that and will reboot with that change after I submit this. If it
fixes the problem, I'll add a comment to this report.
1) Reiserfs doesn't work on software raid and isn't reliable on hardware raid.
2) Reiserfs is not supported by Red Hat and is included in the beta to assess
the quality of reiserfs.
The other thing: we fixed several boot-raid bugs recently, and this is now
covered by our testing; I assume this is fixed.
Reiserfs recently got fixed to at least "work" with raid, although you still