Bug 593576 - udisks-lvm-pv-export is pegging all CPU's at 100% when used with iSCSI luns
Status: CLOSED RAWHIDE
Product: Red Hat Enterprise Linux 6
Classification: Red Hat
Component: udisks
Version: 6.0
Hardware: All
OS: Linux
Priority: low    Severity: medium
Target Milestone: rc
Target Release: ---
Assigned To: David Zeuthen
QA Contact: desktop-bugs@redhat.com
Depends On:
Blocks:
Reported: 2010-05-19 04:15 EDT by Justin Clift
Modified: 2013-03-05 23:02 EST (History)
1 user

See Also:
Fixed In Version:
Doc Type: Bug Fix
Doc Text:
Story Points: ---
Clone Of:
Environment:
Last Closed: 2010-06-01 11:07:14 EDT
Type: ---
Regression: ---
Mount Type: ---
Documentation: ---
CRM:
Verified Versions:
Category: ---
oVirt Team: ---
RHEL 7.3 requirements from Atomic Host:
Cloudforms Team: ---


Attachments
Screenshot showing all cpu's at 100 percent (295.99 KB, image/png)
2010-05-19 04:17 EDT, Justin Clift
Output from "udisks --dump" (785.79 KB, text/plain)
2010-05-19 04:18 EDT, Justin Clift
Output from "udevadm info --export-db" (987.42 KB, text/plain)
2010-05-19 04:18 EDT, Justin Clift
Output from "cat /proc/self/mountinfo" (1003 bytes, text/plain)
2010-05-19 04:19 EDT, Justin Clift
Output from "cat /etc/fstab" (1.01 KB, text/plain)
2010-05-19 04:20 EDT, Justin Clift
Output from "cat /etc/multipath.conf" (5.63 KB, text/plain)
2010-05-19 04:20 EDT, Justin Clift

Description Justin Clift 2010-05-19 04:15:46 EDT
Description of problem:

On RHEL 6 (beta) x86_64, after adding many remote iSCSI luns on a server, udisks-lvm-pv-export pegs all of the CPUs at 100% from the next boot onwards.


Version-Release number of selected component (if applicable):

udisks-1.0.0-0.git20100115.3.el6.x86_64
gvfs-1.4.3-6.el6.x86_64
libatasmart-0.17-3.el6.x86_64


How reproducible:

Every time. :(


Steps to Reproduce:
1. Add many iSCSI luns using iscsiadm as per normal (i.e. http://kbase.redhat.com/faq/docs/DOC-6388)
2. Configure user_friendly_names for these luns in /etc/multipath.conf
3. Enable the iscsid and iscsi init scripts at system start (run level 3)
4. Reboot
  
Actual results:

After rebooting, all of the CPUs in the system are at 100% utilisation and remain that way.

All other processes on the server (i.e. virtual machines) take a serious performance hit.  The system is too slow to use.


Expected results:

udisks-lvm-pv-export should not be consuming the CPUs like this, and the server should be able to perform its intended purpose.


Additional info:

A few minutes after a reboot, these messages appeared in /var/log/messages:

May 19 17:37:44 host2 udevd[589]: worker [1515] unexpectedly returned with status 0x0100
May 19 17:37:44 host2 udevd[589]: worker [1515] failed while handling '/devices/platform/host6/session1/target6:0:0/6:0:0:1/block/sdb/sdb2'
May 19 17:37:44 host2 udevd[589]: worker [1518] unexpectedly returned with status 0x0100
May 19 17:37:44 host2 udevd[589]: worker [1518] failed while handling '/devices/platform/host6/session1/target6:0:0/6:0:0:3/block/sdg/sdg2'
May 19 17:37:44 host2 udevd[589]: worker [1512] unexpectedly returned with status 0x0100
May 19 17:37:44 host2 udevd[589]: worker [1512] failed while handling '/devices/platform/host7/session2/target7:0:0/7:0:0:2/block/sdd/sdd2'
May 19 17:37:44 host2 udevd[589]: worker [1525] unexpectedly returned with status 0x0100
May 19 17:37:44 host2 udevd[589]: worker [1525] failed while handling '/devices/platform/host8/session3/target8:0:0/8:0:0:2/block/sdh/sdh2'
May 19 17:37:44 host2 udevd[589]: worker [1116] unexpectedly returned with status 0x0100
May 19 17:37:44 host2 udevd[589]: worker [1116] failed while handling '/devices/platform/host6/session1/target6:0:0/6:0:0:4/block/sdm/sdm2'
May 19 17:37:44 host2 udevd[589]: worker [1514] unexpectedly returned with status 0x0100
May 19 17:37:44 host2 udevd[589]: worker [1514] failed while handling '/devices/platform/host9/session4/target9:0:0/9:0:0:1/block/sdn/sdn2'
May 19 17:37:44 host2 udevd[589]: worker [1503] unexpectedly returned with status 0x0100
May 19 17:37:44 host2 udevd[589]: worker [1503] failed while handling '/devices/platform/host6/session1/target6:0:0/6:0:0:5/block/sdp/sdp2'
May 19 17:37:44 host2 udevd[589]: worker [1616] unexpectedly returned with status 0x0100
May 19 17:37:44 host2 udevd[589]: worker [1616] failed while handling '/devices/platform/host8/session3/target8:0:0/8:0:0:8/block/sdy/sdy2'
May 19 17:37:44 host2 udevd[589]: worker [1645] unexpectedly returned with status 0x0100
May 19 17:37:44 host2 udevd[589]: worker [1645] failed while handling '/devices/platform/host8/session3/target8:0:0/8:0:0:9/block/sdab/sdab2'
May 19 17:37:44 host2 udevd[589]: worker [1527] unexpectedly returned with status 0x0100
May 19 17:37:44 host2 udevd[589]: worker [1527] failed while handling '/devices/platform/host6/session1/target6:0:0/6:0:0:6/block/sdw/sdw2'
May 19 17:37:44 host2 udevd[589]: worker [2626] unexpectedly returned with status 0x0100
May 19 17:37:44 host2 udevd[589]: worker [2626] failed while handling '/devices/platform/host6/session1/target6:0:0/6:0:0:11/block/sdas/sdas2'
May 19 17:37:44 host2 udevd[589]: worker [1544] unexpectedly returned with status 0x0100
May 19 17:37:44 host2 udevd[589]: worker [1544] failed while handling '/devices/platform/host9/session4/target9:0:0/9:0:0:4/block/sdaa/sdaa2'
May 19 17:37:44 host2 udevd[589]: worker [1541] unexpectedly returned with status 0x0100
May 19 17:37:44 host2 udevd[589]: worker [1541] failed while handling '/devices/platform/host9/session4/target9:0:0/9:0:0:3/block/sdv/sdv2'
May 19 17:37:44 host2 udevd[589]: worker [2587] unexpectedly returned with status 0x0100
May 19 17:37:44 host2 udevd[589]: worker [2587] failed while handling '/devices/platform/host7/session2/target7:0:0/7:0:0:8/block/sdz/sdz2'
May 19 17:37:44 host2 udevd[589]: worker [2498] unexpectedly returned with status 0x0100
May 19 17:37:44 host2 udevd[589]: worker [2498] failed while handling '/devices/platform/host6/session1/target6:0:0/6:0:0:8/block/sdag/sdag2'
May 19 17:37:44 host2 udevd[589]: worker [1526] unexpectedly returned with status 0x0100
May 19 17:37:44 host2 udevd[589]: worker [1526] failed while handling '/devices/platform/host8/session3/target8:0:0/8:0:0:7/block/sdu/sdu2'
May 19 17:37:44 host2 udevd[589]: worker [1532] unexpectedly returned with status 0x0100
May 19 17:37:44 host2 udevd[589]: worker [1532] failed while handling '/devices/platform/host7/session2/target7:0:0/7:0:0:11/block/sdai/sdai2'
May 19 17:37:44 host2 udevd[589]: worker [1543] unexpectedly returned with status 0x0100
May 19 17:37:44 host2 udevd[589]: worker [1543] failed while handling '/devices/platform/host7/session2/target7:0:0/7:0:0:10/block/sdae/sdae2'
May 19 17:37:44 host2 udevd[589]: worker [1551] unexpectedly returned with status 0x0100
May 19 17:37:44 host2 udevd[589]: worker [1551] failed while handling '/devices/platform/host8/session3/target8:0:0/8:0:0:10/block/sdah/sdah2'
May 19 17:37:44 host2 udevd[589]: worker [2742] unexpectedly returned with status 0x0100
May 19 17:37:44 host2 udevd[589]: worker [2742] failed while handling '/devices/platform/host6/session1/target6:0:0/6:0:0:9/block/sdal/sdal2'
May 19 17:37:44 host2 udevd[589]: worker [1582] unexpectedly returned with status 0x0100
May 19 17:37:44 host2 udevd[589]: worker [1582] failed while handling '/devices/platform/host7/session2/target7:0:0/7:0:0:7/block/sdx/sdx2'
May 19 17:37:44 host2 udevd[589]: worker [2933] unexpectedly returned with status 0x0100
May 19 17:37:44 host2 udevd[589]: worker [2933] failed while handling '/devices/platform/host6/session1/target6:0:0/6:0:0:10/block/sdao/sdao2'
May 19 17:37:44 host2 udevd[589]: worker [1522] unexpectedly returned with status 0x0100
May 19 17:37:44 host2 udevd[589]: worker [1522] failed while handling '/devices/platform/host7/session2/target7:0:0/7:0:0:9/block/sdad/sdad2'
May 19 17:37:44 host2 udevd[589]: worker [1556] unexpectedly returned with status 0x0100
May 19 17:37:44 host2 udevd[589]: worker [1556] failed while handling '/devices/platform/host8/session3/target8:0:0/8:0:0:11/block/sdaj/sdaj2'
May 19 17:37:44 host2 udevd[589]: worker [1581] unexpectedly returned with status 0x0100
May 19 17:37:44 host2 udevd[589]: worker [1581] failed while handling '/devices/platform/host9/session4/target9:0:0/9:0:0:5/block/sdaf/sdaf2'
May 19 17:37:44 host2 udevd[589]: worker [1530] unexpectedly returned with status 0x0100
May 19 17:37:44 host2 udevd[589]: worker [1530] failed while handling '/devices/platform/host9/session4/target9:0:0/9:0:0:6/block/sdak/sdak2'
May 19 17:37:44 host2 udevd[589]: worker [1562] unexpectedly returned with status 0x0100
May 19 17:37:44 host2 udevd[589]: worker [1562] failed while handling '/devices/platform/host9/session4/target9:0:0/9:0:0:8/block/sday/sday2'
May 19 17:37:44 host2 udevd[589]: worker [1539] unexpectedly returned with status 0x0100
May 19 17:37:44 host2 udevd[589]: worker [1539] failed while handling '/devices/platform/host9/session4/target9:0:0/9:0:0:9/block/sdbe/sdbe2'
May 19 17:37:44 host2 udevd[589]: worker [1578] unexpectedly returned with status 0x0100
May 19 17:37:44 host2 udevd[589]: worker [1578] failed while handling '/devices/platform/host9/session4/target9:0:0/9:0:0:11/block/sdbo/sdbo2'
May 19 17:37:44 host2 udevd[589]: worker [1549] unexpectedly returned with status 0x0100
May 19 17:37:44 host2 udevd[589]: worker [1549] failed while handling '/devices/platform/host9/session4/target9:0:0/9:0:0:10/block/sdbk/sdbk2'

30 in total.
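The failure messages follow a fixed pattern, so the distinct failed workers (and the device paths they were handling) can be tallied with a short regex-based helper. This is a quick sketch, not part of the original report; the pattern string is an assumption based on the messages shown above:

```python
import re

# Matches lines like:
#   ... udevd[589]: worker [1515] failed while handling '/devices/.../block/sdb/sdb2'
FAIL_RE = re.compile(r"worker \[(\d+)\] failed while handling '([^']+)'")

def count_failed_workers(log_text):
    """Return (number of distinct failed worker PIDs, list of device paths)."""
    workers = set()
    devices = []
    for match in FAIL_RE.finditer(log_text):
        workers.add(match.group(1))   # worker PID
        devices.append(match.group(2))  # sysfs path of the device being handled
    return len(workers), devices
```

Running this over the excerpt above should report 30 distinct workers, matching the process count below.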

Counting the number of udisks-lvm-pv-export processes in the process table shows exactly the same number:

  $ ps -ef | grep /lib/udev/udisks-lvm-pv-export | grep -v grep
  root      1778     1 21 17:34 ?        00:06:23 /lib/udev/udisks-lvm-pv-export Elnyq9-4oms-YgvI-Z2b3-KzVK-FRv2-RtZGJG
  root      2126     1 18 17:34 ?        00:05:35 /lib/udev/udisks-lvm-pv-export 2M9W5t-T1az-QV1q-TuQA-fpyq-nnzO-twT750
  root      2142     1 22 17:34 ?        00:06:36 /lib/udev/udisks-lvm-pv-export gVLqdy-un9F-9vVV-UmIS-3lZy-EArU-9Nzaal
  root      2159     1 18 17:34 ?        00:05:27 /lib/udev/udisks-lvm-pv-export gVLqdy-un9F-9vVV-UmIS-3lZy-EArU-9Nzaal
  root      2203     1 19 17:34 ?        00:05:50 /lib/udev/udisks-lvm-pv-export UgSM6N-RStV-VGjd-OeoG-9Wua-U0ns-Q2pKS3
  root      2235     1 35 17:34 ?        00:10:23 /lib/udev/udisks-lvm-pv-export Elnyq9-4oms-YgvI-Z2b3-KzVK-FRv2-RtZGJG
  root      2310     1 18 17:34 ?        00:05:38 /lib/udev/udisks-lvm-pv-export W0tFe8-D1gd-nZut-R8pR-1mR8-C7zI-r7yirl
  root      2591     1 42 17:34 ?        00:12:30 /lib/udev/udisks-lvm-pv-export TEMHT5-lXhd-YQen-l3Qm-ytaA-drK0-0Cc1PO
  root      2665     1 30 17:34 ?        00:09:06 /lib/udev/udisks-lvm-pv-export 26oZZI-aJ04-VimC-8yPx-2hyn-f1PJ-yjmd9D
  root      2787     1 20 17:34 ?        00:06:00 /lib/udev/udisks-lvm-pv-export HyDF9q-1tvO-1I6d-Go9h-e5J3-7cyQ-hSfAhf
  root      2865     1 22 17:34 ?        00:06:42 /lib/udev/udisks-lvm-pv-export 60sZpi-Ohl8-SAMg-kzEx-rctI-htbJ-18sYaI
  root      2872     1 31 17:34 ?        00:09:23 /lib/udev/udisks-lvm-pv-export 26oZZI-aJ04-VimC-8yPx-2hyn-f1PJ-yjmd9D
  root      2928     1 51 17:34 ?        00:15:25 /lib/udev/udisks-lvm-pv-export 2M9W5t-T1az-QV1q-TuQA-fpyq-nnzO-twT750
  root      3206     1 19 17:34 ?        00:05:46 /lib/udev/udisks-lvm-pv-export 20b20Z-SsJ9-RMf3-FNgn-zLNk-amj9-vh3PK9
  root      3260     1 43 17:34 ?        00:12:48 /lib/udev/udisks-lvm-pv-export nyU2Qh-7UT4-J9OI-4Kw2-l3OO-ZWGu-u7gcUG
  root      3309     1 19 17:34 ?        00:05:47 /lib/udev/udisks-lvm-pv-export nyU2Qh-7UT4-J9OI-4Kw2-l3OO-ZWGu-u7gcUG
  root      3322     1 18 17:34 ?        00:05:36 /lib/udev/udisks-lvm-pv-export 60sZpi-Ohl8-SAMg-kzEx-rctI-htbJ-18sYaI
  root      3324     1 21 17:34 ?        00:06:16 /lib/udev/udisks-lvm-pv-export UgSM6N-RStV-VGjd-OeoG-9Wua-U0ns-Q2pKS3
  root      3332     1 34 17:34 ?        00:10:13 /lib/udev/udisks-lvm-pv-export 20b20Z-SsJ9-RMf3-FNgn-zLNk-amj9-vh3PK9
  root      3431     1 26 17:34 ?        00:07:47 /lib/udev/udisks-lvm-pv-export TEMHT5-lXhd-YQen-l3Qm-ytaA-drK0-0Cc1PO
  root      3608     1 20 17:34 ?        00:06:01 /lib/udev/udisks-lvm-pv-export 26oZZI-aJ04-VimC-8yPx-2hyn-f1PJ-yjmd9D
  root      4036     1 18 17:34 ?        00:05:29 /lib/udev/udisks-lvm-pv-export TEMHT5-lXhd-YQen-l3Qm-ytaA-drK0-0Cc1PO
  root      4038     1 17 17:34 ?        00:05:15 /lib/udev/udisks-lvm-pv-export 60sZpi-Ohl8-SAMg-kzEx-rctI-htbJ-18sYaI
  root      4041     1 34 17:34 ?        00:10:10 /lib/udev/udisks-lvm-pv-export 26oZZI-aJ04-VimC-8yPx-2hyn-f1PJ-yjmd9D
  root      4043     1 22 17:34 ?        00:06:36 /lib/udev/udisks-lvm-pv-export TEMHT5-lXhd-YQen-l3Qm-ytaA-drK0-0Cc1PO
  root      4045     1 20 17:34 ?        00:06:10 /lib/udev/udisks-lvm-pv-export 20b20Z-SsJ9-RMf3-FNgn-zLNk-amj9-vh3PK9
  root      4049     1 22 17:34 ?        00:06:43 /lib/udev/udisks-lvm-pv-export W0tFe8-D1gd-nZut-R8pR-1mR8-C7zI-r7yirl
  root      4291     1 29 17:34 ?        00:08:44 /lib/udev/udisks-lvm-pv-export HyDF9q-1tvO-1I6d-Go9h-e5J3-7cyQ-hSfAhf
  root      4292     1 20 17:34 ?        00:05:59 /lib/udev/udisks-lvm-pv-export 20b20Z-SsJ9-RMf3-FNgn-zLNk-amj9-vh3PK9
  root      4349     1 19 17:34 ?        00:05:42 /lib/udev/udisks-lvm-pv-export 60sZpi-Ohl8-SAMg-kzEx-rctI-htbJ-18sYaI
  $ ps -ef | grep udisks-lvm-pv-export | grep -v grep | wc -l
  30
  $
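Note that several of the processes above carry the same PV UUID argument, i.e. multiple exporters were spawned for the same physical volume. The duplication is easy to see by grouping the `ps` output by its last column. A sketch (added for illustration; it assumes the column layout shown above, with the PV UUID as the final field):

```python
from collections import Counter

def pv_uuid_counts(ps_lines):
    """Count udisks-lvm-pv-export processes per PV UUID (last ps column)."""
    counts = Counter()
    for line in ps_lines:
        if "udisks-lvm-pv-export" not in line:
            continue
        parts = line.split()
        counts[parts[-1]] += 1  # the PV UUID is the command's only argument
    return counts
```

Applied to the listing above, this would show UUIDs such as 26oZZI-... with four exporter processes each.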

This looks related: udisks-lvm-pv-export appears not to handle a failure condition well, and then goes into an endless loop.

The system in question has 1 local boot drive (500GB SATA) and 91 remote iSCSI luns attached.
Comment 1 Justin Clift 2010-05-19 04:17:11 EDT
Created attachment 415046 [details]
Screenshot showing all cpu's at 100 percent
Comment 2 Justin Clift 2010-05-19 04:18:02 EDT
Created attachment 415047 [details]
Output from "udisks --dump"
Comment 3 Justin Clift 2010-05-19 04:18:56 EDT
Created attachment 415048 [details]
Output from "udevadm info --export-db"
Comment 4 Justin Clift 2010-05-19 04:19:44 EDT
Created attachment 415049 [details]
Output from "cat /proc/self/mountinfo"
Comment 5 Justin Clift 2010-05-19 04:20:16 EDT
Created attachment 415050 [details]
Output from "cat /etc/fstab"
Comment 7 Justin Clift 2010-05-19 04:20:56 EDT
Created attachment 415051 [details]
Output from "cat /etc/multipath.conf"
Comment 8 Justin Clift 2010-05-19 04:37:32 EDT
As a workaround, removing the "udisks" package (and the packages that rely on it) and then rebooting returned things to normal.
Comment 9 David Zeuthen 2010-06-01 11:07:14 EDT
Known problem and why current builds have this feature turned off:

* Mon Mar 22 2010 David Zeuthen <davidz@redhat.com> - 1.0.0-1%{?dist}
- Update to release 1.0.0
- Build without LVM2 and Multipath support for now (#548870, #548874)
- Related: rhbz#575889
