Bug 1016218 - overkill of 'socket connect' messages when lvmetad is not running
Summary: overkill of 'socket connect' messages when lvmetad is not running
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: Red Hat Enterprise Linux 6
Classification: Red Hat
Component: lvm2
Version: 6.5
Hardware: x86_64
OS: Linux
Priority: low
Severity: low
Target Milestone: rc
: ---
Assignee: Peter Rajnoha
QA Contact: Cluster QE
URL:
Whiteboard:
Depends On:
Blocks: 1085119
 
Reported: 2013-10-07 17:50 UTC by Corey Marthaler
Modified: 2018-12-06 15:20 UTC
11 users

Fixed In Version: lvm2-2.02.107-1.el6
Doc Type: Bug Fix
Doc Text:
Cause: LVM tools tried to connect to lvmetad even when run with the --sysinit option, which denotes early system initialization where lvmetad may not be running yet. Consequence: The error message "lvmetad.socket: connect failed: No such file or directory" was issued multiple times during system initialization. Fix: LVM tools now check for the --sysinit option to detect that they are running during early system initialization. Result: If the lvmetad socket is absent while an LVM command runs with --sysinit, no error messages about the missing socket are issued and the tools silently fall back to non-lvmetad mode.
Clone Of:
: 1085119 (view as bug list)
Environment:
Last Closed: 2014-10-14 08:24:43 UTC
Target Upstream Version:
Embargoed:


Attachments


Links
System ID Private Priority Status Summary Last Updated
Red Hat Knowledge Base (Solution) 650503 0 None None None Never
Red Hat Product Errata RHBA-2014:1387 0 normal SHIPPED_LIVE lvm2 bug fix and enhancement update 2014-10-14 01:39:47 UTC

Description Corey Marthaler 2013-10-07 17:50:14 UTC
Description of problem:
Isn't the one "WARNING: Failed to connect to lvmetad: ..." message enough to get the point across when lvmetad isn't running?


    # If lvmetad has been running while use_lvmetad was 0, it MUST be stopped
    # before changing use_lvmetad to 1 and started again afterwards.
    use_lvmetad = 1
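
For context, the restart ordering that this lvm.conf comment requires would look roughly like the following on RHEL 6 sysvinit. This is an illustrative sketch only; the lvm2-lvmetad service name is assumed from the initscript discussed in this bug, and the sed edit stands in for manually changing /etc/lvm/lvm.conf.

```shell
# Illustrative: enable lvmetad on RHEL 6 (sysvinit), honoring the
# required ordering from the lvm.conf comment.
service lvm2-lvmetad stop    # must be stopped while use_lvmetad is still 0
sed -i 's/use_lvmetad = 0/use_lvmetad = 1/' /etc/lvm/lvm.conf
service lvm2-lvmetad start   # started again only after the change
```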

[root@harding-03 ~]# pvs
  /var/run/lvm/lvmetad.socket: connect failed: No such file or directory
  WARNING: Failed to connect to lvmetad: No such file or directory. Falling back to internal scanning.
  /var/run/lvm/lvmetad.socket: connect failed: No such file or directory
  /var/run/lvm/lvmetad.socket: connect failed: No such file or directory
  /var/run/lvm/lvmetad.socket: connect failed: No such file or directory
  /var/run/lvm/lvmetad.socket: connect failed: No such file or directory
  /var/run/lvm/lvmetad.socket: connect failed: No such file or directory
  /var/run/lvm/lvmetad.socket: connect failed: No such file or directory
  /var/run/lvm/lvmetad.socket: connect failed: No such file or directory
  /var/run/lvm/lvmetad.socket: connect failed: No such file or directory
  /var/run/lvm/lvmetad.socket: connect failed: No such file or directory
  /var/run/lvm/lvmetad.socket: connect failed: No such file or directory
  /var/run/lvm/lvmetad.socket: connect failed: No such file or directory
  /var/run/lvm/lvmetad.socket: connect failed: No such file or directory
  /var/run/lvm/lvmetad.socket: connect failed: No such file or directory
  /var/run/lvm/lvmetad.socket: connect failed: No such file or directory
  /var/run/lvm/lvmetad.socket: connect failed: No such file or directory
  /var/run/lvm/lvmetad.socket: connect failed: No such file or directory
  /var/run/lvm/lvmetad.socket: connect failed: No such file or directory
  /var/run/lvm/lvmetad.socket: connect failed: No such file or directory
  /var/run/lvm/lvmetad.socket: connect failed: No such file or directory
  /var/run/lvm/lvmetad.socket: connect failed: No such file or directory
  /var/run/lvm/lvmetad.socket: connect failed: No such file or directory
  /var/run/lvm/lvmetad.socket: connect failed: No such file or directory
  /var/run/lvm/lvmetad.socket: connect failed: No such file or directory
  /var/run/lvm/lvmetad.socket: connect failed: No such file or directory
  /var/run/lvm/lvmetad.socket: connect failed: No such file or directory
  /var/run/lvm/lvmetad.socket: connect failed: No such file or directory
  PV         VG           Fmt  Attr PSize  PFree
  /dev/sda2  vg_harding03 lvm2 a--  92.67g    0 
[root@harding-03 ~]# lvs
  /var/run/lvm/lvmetad.socket: connect failed: No such file or directory
  WARNING: Failed to connect to lvmetad: No such file or directory. Falling back to internal scanning.
  /var/run/lvm/lvmetad.socket: connect failed: No such file or directory
  /var/run/lvm/lvmetad.socket: connect failed: No such file or directory
  /var/run/lvm/lvmetad.socket: connect failed: No such file or directory
  /var/run/lvm/lvmetad.socket: connect failed: No such file or directory
  /var/run/lvm/lvmetad.socket: connect failed: No such file or directory
  /var/run/lvm/lvmetad.socket: connect failed: No such file or directory
  /var/run/lvm/lvmetad.socket: connect failed: No such file or directory
  /var/run/lvm/lvmetad.socket: connect failed: No such file or directory
  /var/run/lvm/lvmetad.socket: connect failed: No such file or directory
  /var/run/lvm/lvmetad.socket: connect failed: No such file or directory
  /var/run/lvm/lvmetad.socket: connect failed: No such file or directory
  /var/run/lvm/lvmetad.socket: connect failed: No such file or directory
  /var/run/lvm/lvmetad.socket: connect failed: No such file or directory
  /var/run/lvm/lvmetad.socket: connect failed: No such file or directory
  /var/run/lvm/lvmetad.socket: connect failed: No such file or directory
  /var/run/lvm/lvmetad.socket: connect failed: No such file or directory
  LV      VG           Attr       LSize  Pool Origin Data%  Move Log Cpy%Sync Convert
  lv_home vg_harding03 -wi-ao---- 33.36g                                             
  lv_root vg_harding03 -wi-ao---- 50.00g                                             
  lv_swap vg_harding03 -wi-ao----  9.31g                                             


Version-Release number of selected component (if applicable):
2.6.32-410.el6.x86_64
lvm2-2.02.100-4.el6    BUILT: Fri Sep 27 09:05:32 CDT 2013
lvm2-libs-2.02.100-4.el6    BUILT: Fri Sep 27 09:05:32 CDT 2013
lvm2-cluster-2.02.100-4.el6    BUILT: Fri Sep 27 09:05:32 CDT 2013
udev-147-2.48.el6    BUILT: Fri Aug  9 06:09:50 CDT 2013
device-mapper-1.02.79-4.el6    BUILT: Fri Sep 27 09:05:32 CDT 2013
device-mapper-libs-1.02.79-4.el6    BUILT: Fri Sep 27 09:05:32 CDT 2013
device-mapper-event-1.02.79-4.el6    BUILT: Fri Sep 27 09:05:32 CDT 2013
device-mapper-event-libs-1.02.79-4.el6    BUILT: Fri Sep 27 09:05:32 CDT 2013
cmirror-2.02.100-4.el6    BUILT: Fri Sep 27 09:05:32 CDT 2013

Comment 1 Peter Rajnoha 2013-10-08 08:28:00 UTC
This patch added more log messages for daemon handling code (daemon_open):
https://git.fedorahosted.org/cgit/lvm2.git/commit/?id=981962b339123338335da5a3cda822117808c591

Those log messages are issued because at the time "vgchange -aay --sysinit" is called in rc.sysinit, it is simply too early to contact lvmetad; the lvm2-lvmetad initscript starts a bit later.

What we could do here is automatically fall back to non-lvmetad mode when the --sysinit option is used, as that option is intended for the early boot stage where some services are not yet started (hmm, didn't we have this before; isn't this a regression with --sysinit + use_lvmetad=1?).
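
The proposed fallback can be sketched as a small decision function. This is not lvm2 source; the function name, the "yes"/"no" sysinit flag, and the explicit socket argument are all illustrative. It only shows the intended behaviour: with --sysinit, a missing lvmetad socket causes a silent fallback to direct scanning instead of repeated connect errors.

```shell
# Illustrative sketch of the proposed --sysinit fallback behaviour.
# choose_scan_mode SYSINIT [SOCKET] prints "lvmetad" or "direct".
choose_scan_mode() {
    local sysinit="$1" socket="${2:-/var/run/lvm/lvmetad.socket}"
    if [ ! -S "$socket" ]; then
        if [ "$sysinit" = "yes" ]; then
            # Early boot: lvmetad is expected to be absent, so fall
            # back silently with no warning at all.
            echo "direct"
        else
            echo "WARNING: Failed to connect to lvmetad. Falling back to internal scanning." >&2
            echo "direct"
        fi
    else
        echo "lvmetad"
    fi
}

choose_scan_mode yes /nonexistent/socket   # prints "direct", no warning
```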

Comment 2 Peter Rajnoha 2013-11-26 14:02:14 UTC
This patch should help here:

https://git.fedorahosted.org/cgit/lvm2.git/commit/?id=8d5cff5b9bb217942c009dca49d0bffb51e85004

Now it prints only a single, more informative message instead of uselessly retrying the connection to the nonexistent daemon/service again and again:
  "lvmetad is not active yet, using direct activation during sysinit"

Comment 5 Nenad Peric 2014-07-15 14:30:26 UTC
Marking VERIFIED. 

[root@virt-122 ~]# vgchange -an seven ten
  /var/run/lvm/lvmetad.socket: connect failed: No such file or directory
  WARNING: Failed to connect to lvmetad. Falling back to internal scanning.
  0 logical volume(s) in volume group "seven" now active
  0 logical volume(s) in volume group "ten" now active
[root@virt-122 ~]# vgchange -aay --sysinit seven
  lvmetad is not active yet, using direct activation during sysinit
  1 logical volume(s) in volume group "seven" now active
[root@virt-122 ~]# 


[root@virt-122 ~]# lvs
  /var/run/lvm/lvmetad.socket: connect failed: No such file or directory
  WARNING: Failed to connect to lvmetad. Falling back to internal scanning.
  LV      VG         Attr       LSize   Pool Origin Data%  Meta%  Move Log          Cpy%Sync Convert
  mirror1 seven      mwi-a-m--- 100.00m                                mirror1_mlog 100.00          
  mirror2 ten        mwi---m--- 100.00m                                mirror2_mlog                 
  lv_root vg_virt122 -wi-ao----   6.71g                                                             
  lv_swap vg_virt122 -wi-ao---- 816.00m                                                             
[root@virt-122 ~]# pvs
  /var/run/lvm/lvmetad.socket: connect failed: No such file or directory
  WARNING: Failed to connect to lvmetad. Falling back to internal scanning.
  PV         VG         Fmt  Attr PSize  PFree 
  /dev/sda              lvm2 ---  15.00g 15.00g
  /dev/sdb   seven      lvm2 a--  15.00g 14.90g
  /dev/sdc   seven      lvm2 a--  15.00g 14.90g
  /dev/sdd   seven      lvm2 a--  15.00g 14.99g
  /dev/sde   ten        lvm2 a--  15.00g 14.90g
  /dev/sdf   ten        lvm2 a--  15.00g 14.89g
  /dev/sdg   seven      lvm2 a--  15.00g 15.00g
  /dev/sdh   seven      lvm2 a--  15.00g 15.00g
  /dev/vda2  vg_virt122 lvm2 a--   7.51g     0 



Tested with:

lvm2-2.02.107-2.el6    BUILT: Fri Jul 11 15:47:33 CEST 2014
lvm2-libs-2.02.107-2.el6    BUILT: Fri Jul 11 15:47:33 CEST 2014
lvm2-cluster-2.02.107-2.el6    BUILT: Fri Jul 11 15:47:33 CEST 2014
udev-147-2.55.el6    BUILT: Wed Jun 18 13:30:21 CEST 2014
device-mapper-1.02.86-2.el6    BUILT: Fri Jul 11 15:47:33 CEST 2014
device-mapper-libs-1.02.86-2.el6    BUILT: Fri Jul 11 15:47:33 CEST 2014
device-mapper-event-1.02.86-2.el6    BUILT: Fri Jul 11 15:47:33 CEST 2014
device-mapper-event-libs-1.02.86-2.el6    BUILT: Fri Jul 11 15:47:33 CEST 2014
device-mapper-persistent-data-0.3.2-1.el6    BUILT: Fri Apr  4 15:43:06 CEST 2014
cmirror-2.02.107-2.el6    BUILT: Fri Jul 11 15:47:33 CEST 2014

Comment 6 errata-xmlrpc 2014-10-14 08:24:43 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory, and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

http://rhn.redhat.com/errata/RHBA-2014-1387.html

