Bug 1599668 - Current merge function causes issues when stacked on top of RAID50 [rhel-7.5.z]
Status: CLOSED ERRATA
Product: Red Hat Enterprise Linux 7
Classification: Red Hat
Component: kmod-kvdo
Version: 7.6
Hardware: Unspecified
OS: Linux
Priority: high
Severity: high
Target Milestone: rc
Target Release: ---
Assigned To: bjohnsto
QA Contact: Jakub Krysl
Docs Contact: Marek Suchanek
Keywords: ZStream
Depends On: 1593444
Blocks:
Reported: 2018-07-10 06:58 EDT by Jaroslav Reznik
Modified: 2018-08-16 10:19 EDT
CC: 5 users

See Also:
Fixed In Version: 6.1.0.178-16
Doc Type: If docs needed, set a value
Doc Text:
Previously, creating a VDO volume on top of a RAID 50 array caused the system to halt unexpectedly. With this update, the problem has been fixed, and creating a VDO volume on RAID 50 no longer crashes the system.
Story Points: ---
Clone Of: 1593444
Environment:
Last Closed: 2018-08-16 10:19:04 EDT
Type: ---
Regression: ---
Mount Type: ---
Documentation: ---
CRM:
Verified Versions:
Category: ---
oVirt Team: ---
RHEL 7.3 requirements from Atomic Host:
Cloudforms Team: ---


Attachments


External Trackers
Red Hat Product Errata RHBA-2018:2450 (Priority: None, Status: None, Summary: None, Last Updated: 2018-08-16 10:19 EDT)

Description Jaroslav Reznik 2018-07-10 06:58:12 EDT
This bug has been copied from bug #1593444 and has been proposed to be backported to 7.5 z-stream (EUS).
Comment 3 Jakub Krysl 2018-07-23 08:57:29 EDT
kernel-3.10.0-862.10.2.el7
kmod-kvdo-kmod-kvdo-6.1.0.171-17.el7_5

# vdo create -n vdo1 --device /dev/md50 --activate=enabled --compression=enabled --deduplication=enabled --sparseIndex=enabled --vdoLogicalSize=20T --verbose
Creating VDO vdo1
    grep MemAvailable /proc/meminfo
    pvcreate -qq --test /dev/md50
    modprobe kvdo
    vdoformat --uds-checkpoint-frequency=0 --uds-memory-size=0.25 --uds-sparse --logical-size=20T /dev/md50
    vdodumpconfig /dev/md50
Starting VDO vdo1
    dmsetup status vdo1
    grep MemAvailable /proc/meminfo
    modprobe kvdo
    vdodumpconfig /dev/md50
    dmsetup create vdo1 --uuid VDO-aa204d8f-f552-48b6-a31d-083405eca2c5 --table '0 42949672960 vdo /dev/md50 4096 disabled 0 32768 16380 on auto vdo1 ack=1,bio=4,bioRotationInterval=64,cpu=2,hash=1,logical=1,physical=1'
    dmsetup status vdo1
Starting compression on VDO vdo1
    dmsetup message vdo1 0 compression on
    dmsetup status vdo1
VDO instance 82 volume is ready at /dev/mapper/vdo1
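For context, the RAID 50 stack in the lsblk output below (three 8-member RAID 5 arrays striped together as RAID 0, with VDO on top) could be assembled roughly as follows. This is a sketch, not taken from the report: the device names follow the lsblk tree, and the commands are printed as a dry run (drop the leading "echo" to execute for real, which requires root).

```shell
# Dry-run sketch of the md50 stack (names assumed from the lsblk tree).
# Three RAID 5 arrays over the vg-lv* logical volumes, 8 members each:
echo mdadm --create /dev/md51 --level=5 --raid-devices=8 /dev/mapper/vg-lv{1..8}
echo mdadm --create /dev/md52 --level=5 --raid-devices=8 /dev/mapper/vg-lv{9..16}
echo mdadm --create /dev/md53 --level=5 --raid-devices=8 /dev/mapper/vg-lv{17..24}
# RAID 0 across the three RAID 5 arrays yields the RAID 50 device md50:
echo mdadm --create /dev/md50 --level=0 --raid-devices=3 /dev/md51 /dev/md52 /dev/md53
```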

# lsblk
NAME                        MAJ:MIN RM   SIZE RO TYPE  MOUNTPOINT
sdb                           8:16   0     2T  0 disk
├─vg-lv1                    253:3    0  39.1G  0 lvm
│ └─md51                      9:51   0 273.2G  0 raid5
│   └─md50                    9:50   0 819.3G  0 raid0
│     └─vdo1                253:27   0    20T  0 vdo
├─vg-lv2                    253:4    0  39.1G  0 lvm
│ └─md51                      9:51   0 273.2G  0 raid5
│   └─md50                    9:50   0 819.3G  0 raid0
│     └─vdo1                253:27   0    20T  0 vdo
├─vg-lv3                    253:5    0  39.1G  0 lvm
│ └─md51                      9:51   0 273.2G  0 raid5
│   └─md50                    9:50   0 819.3G  0 raid0
│     └─vdo1                253:27   0    20T  0 vdo
├─vg-lv4                    253:6    0  39.1G  0 lvm
│ └─md51                      9:51   0 273.2G  0 raid5
│   └─md50                    9:50   0 819.3G  0 raid0
│     └─vdo1                253:27   0    20T  0 vdo
├─vg-lv5                    253:7    0  39.1G  0 lvm
│ └─md51                      9:51   0 273.2G  0 raid5
│   └─md50                    9:50   0 819.3G  0 raid0
│     └─vdo1                253:27   0    20T  0 vdo
├─vg-lv6                    253:8    0  39.1G  0 lvm
│ └─md51                      9:51   0 273.2G  0 raid5
│   └─md50                    9:50   0 819.3G  0 raid0
│     └─vdo1                253:27   0    20T  0 vdo
├─vg-lv7                    253:9    0  39.1G  0 lvm
│ └─md51                      9:51   0 273.2G  0 raid5
│   └─md50                    9:50   0 819.3G  0 raid0
│     └─vdo1                253:27   0    20T  0 vdo
├─vg-lv8                    253:10   0  39.1G  0 lvm
│ └─md51                      9:51   0 273.2G  0 raid5
│   └─md50                    9:50   0 819.3G  0 raid0
│     └─vdo1                253:27   0    20T  0 vdo
├─vg-lv9                    253:11   0  39.1G  0 lvm
│ └─md52                      9:52   0 273.2G  0 raid5
│   └─md50                    9:50   0 819.3G  0 raid0
│     └─vdo1                253:27   0    20T  0 vdo
├─vg-lv10                   253:12   0  39.1G  0 lvm
│ └─md52                      9:52   0 273.2G  0 raid5
│   └─md50                    9:50   0 819.3G  0 raid0
│     └─vdo1                253:27   0    20T  0 vdo
├─vg-lv11                   253:13   0  39.1G  0 lvm
│ └─md52                      9:52   0 273.2G  0 raid5
│   └─md50                    9:50   0 819.3G  0 raid0
│     └─vdo1                253:27   0    20T  0 vdo
├─vg-lv12                   253:14   0  39.1G  0 lvm
│ └─md52                      9:52   0 273.2G  0 raid5
│   └─md50                    9:50   0 819.3G  0 raid0
│     └─vdo1                253:27   0    20T  0 vdo
├─vg-lv13                   253:15   0  39.1G  0 lvm
│ └─md52                      9:52   0 273.2G  0 raid5
│   └─md50                    9:50   0 819.3G  0 raid0
│     └─vdo1                253:27   0    20T  0 vdo
├─vg-lv14                   253:16   0  39.1G  0 lvm
│ └─md52                      9:52   0 273.2G  0 raid5
│   └─md50                    9:50   0 819.3G  0 raid0
│     └─vdo1                253:27   0    20T  0 vdo
├─vg-lv15                   253:17   0  39.1G  0 lvm
│ └─md52                      9:52   0 273.2G  0 raid5
│   └─md50                    9:50   0 819.3G  0 raid0
│     └─vdo1                253:27   0    20T  0 vdo
├─vg-lv16                   253:18   0  39.1G  0 lvm
│ └─md52                      9:52   0 273.2G  0 raid5
│   └─md50                    9:50   0 819.3G  0 raid0
│     └─vdo1                253:27   0    20T  0 vdo
├─vg-lv17                   253:19   0  39.1G  0 lvm
│ └─md53                      9:53   0 273.2G  0 raid5
│   └─md50                    9:50   0 819.3G  0 raid0
│     └─vdo1                253:27   0    20T  0 vdo
├─vg-lv18                   253:20   0  39.1G  0 lvm
│ └─md53                      9:53   0 273.2G  0 raid5
│   └─md50                    9:50   0 819.3G  0 raid0
│     └─vdo1                253:27   0    20T  0 vdo
├─vg-lv19                   253:21   0  39.1G  0 lvm
│ └─md53                      9:53   0 273.2G  0 raid5
│   └─md50                    9:50   0 819.3G  0 raid0
│     └─vdo1                253:27   0    20T  0 vdo
├─vg-lv20                   253:22   0  39.1G  0 lvm
│ └─md53                      9:53   0 273.2G  0 raid5
│   └─md50                    9:50   0 819.3G  0 raid0
│     └─vdo1                253:27   0    20T  0 vdo
├─vg-lv21                   253:23   0  39.1G  0 lvm
│ └─md53                      9:53   0 273.2G  0 raid5
│   └─md50                    9:50   0 819.3G  0 raid0
│     └─vdo1                253:27   0    20T  0 vdo
├─vg-lv22                   253:24   0  39.1G  0 lvm
│ └─md53                      9:53   0 273.2G  0 raid5
│   └─md50                    9:50   0 819.3G  0 raid0
│     └─vdo1                253:27   0    20T  0 vdo
├─vg-lv23                   253:25   0  39.1G  0 lvm
│ └─md53                      9:53   0 273.2G  0 raid5
│   └─md50                    9:50   0 819.3G  0 raid0
│     └─vdo1                253:27   0    20T  0 vdo
└─vg-lv24                   253:26   0  39.1G  0 lvm
  └─md53                      9:53   0 273.2G  0 raid5
    └─md50                    9:50   0 819.3G  0 raid0
      └─vdo1                253:27   0    20T  0 vdo
Comment 4 Jakub Krysl 2018-07-23 08:58:35 EDT
(In reply to Jakub Krysl from comment #3)
> kmod-kvdo-kmod-kvdo-6.1.0.171-17.el7_5
sorry, should have been kmod-kvdo-6.1.0.181-17.el7_5
Comment 9 errata-xmlrpc 2018-08-16 10:19:04 EDT
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory, and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHBA-2018:2450
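To check whether an installed kmod-kvdo already carries the fix (Fixed In Version: 6.1.0.178-16), a version comparison along these lines can be used. This is a sketch: vdo_fix_present is a hypothetical helper, and the rpm query shown in the comment assumes the package is installed on the system being checked.

```shell
# Returns success when the installed VERSION-RELEASE ($1) sorts at or above
# the fixed VERSION-RELEASE ($2), using GNU sort's version ordering.
vdo_fix_present() {
    [ "$(printf '%s\n' "$2" "$1" | sort -V | head -n1)" = "$2" ]
}

# On a live system (hypothetical invocation; requires rpm):
#   vdo_fix_present "$(rpm -q --qf '%{VERSION}-%{RELEASE}' kmod-kvdo)" 6.1.0.178-16

vdo_fix_present "6.1.0.178-16.el7_5" "6.1.0.178-16" && echo "fix present"
# prints "fix present"
```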
