Bug 1116562 - lvmetad socket is down although it is configured as enabled
Summary: lvmetad socket is down although it is configured as enabled
Keywords:
Status: CLOSED WONTFIX
Alias: None
Product: Red Hat OpenStack
Classification: Red Hat
Component: openstack-cinder
Version: 5.0 (RHEL 7)
Hardware: x86_64
OS: Linux
Priority: low
Severity: low
Target Milestone: ---
Target Release: 6.0 (Juno)
Assignee: Eric Harney
QA Contact: Dafna Ron
URL:
Whiteboard:
Depends On:
Blocks:
 
Reported: 2014-07-06 10:50 UTC by bkopilov
Modified: 2016-04-27 01:10 UTC
CC List: 3 users

Fixed In Version:
Doc Type: Bug Fix
Doc Text:
Clone Of:
Environment:
Last Closed: 2014-08-26 18:57:45 UTC
Target Upstream Version:
Embargoed:



Description bkopilov 2014-07-06 10:50:29 UTC
Description of problem:
After booting RHEL 7, OpenStack Cinder sometimes cannot configure LVM:

Command: sudo cinder-rootwrap /etc/cinder/rootwrap.conf lvcreate -T -L 19.57g cinder-volumes/cinder-volumes-pool
Exit code: 5
Stdout: ''
Stderr: '  /run/lvm/lvmetad.socket: connect failed: No such file or directory\n  WARNING: Failed to connect to lvmetad: No such file or directory. Falling back to internal scanning.\n  /run/lvm/lvmetad.socket: connect failed: No such file or directory\n  /run/lvm/lvmetad.socket: connect failed: No such file or directory\n  /run/lvm/lvmetad.socket: connect failed: No such file or directory\n  /run/lvm


It happens after VM boot and has been reproduced many times. The issue was found before installing OpenStack.

Sergey Gotliv helped with the troubleshooting; the root cause is that the lvmetad socket is down although it is configured as enabled, so all LVM commands trigger errors.
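
For reference, a quick way to confirm the state on an affected host (standard systemd and LVM commands; the unit names and paths are the RHEL 7 defaults, exact output will vary):

# check whether the lvmetad socket and service units are active and enabled
systemctl status lvm2-lvmetad.socket lvm2-lvmetad.service
systemctl is-enabled lvm2-lvmetad.socket

# confirm lvm.conf expects the daemon (use_lvmetad = 1 is the RHEL 7 default)
grep use_lvmetad /etc/lvm/lvm.conf

# the socket file the LVM commands are trying to connect to
ls -l /run/lvm/lvmetad.socket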

Version-Release number of selected component (if applicable):


How reproducible:
Intermittent. Run the automation and boot a VM from the VM cloud; the issue appears after VM boot.

Steps to Reproduce:
The automation script:
1. Boot a RHEL 7 VM.
2. Update the yum repos for OpenStack and RHEL 7.
3. Install OpenStack.

Actual results:
Unable to configure Cinder.

Expected results:


Additional info:

Comment 2 bkopilov 2014-07-06 10:56:15 UTC

Log messages:

Hi,

Adding Sergey.

Can you please check?

Thanks,
Benny

----- Original Message -----
From: "Benny Kopilov" <bkopilov>
To: "rh-openstack-dev" <rh-openstack-dev>
Cc: "rhos-qe-dept" <rhos-qe-dept>
Sent: Wednesday, July 2, 2014 10:24:56 AM
Subject: Re: Error encountered during initialization of driver: LVMISCSIDriver

Adding the automation run and logs:

http://jenkins.rhev.lab.eng.brq.redhat.com/view/RHOS-STORAGE-QE/view/RHOS-5.0-RHEL7.0/job/rhos-5-rhel-7.0-cinder-thinlvm/56/

One note: sometimes the driver initializes successfully with the same configuration.

----- Original Message -----
From: "Benny Kopilov" <bkopilov>
To: "rh-openstack-dev" <rh-openstack-dev>
Cc: "rhos-qe-dept" <rhos-qe-dept>
Sent: Wednesday, July 2, 2014 10:04:34 AM
Subject: Error encountered during initialization of driver: LVMISCSIDriver


Hi,

I am running an automated test for thin LVM, and it looks like driver initialization fails.
The LVM tests pass successfully.

RHOS 5.0 with RHEL 7.
Can you please explain what's wrong?


From the log under cinder/:

2014-07-02 02:36:45.550 21737 DEBUG cinder.openstack.common.service [-] ******************************************************************************** log_opt_values /usr/lib/python2.7/site-packages/oslo/config/cfg.py:1955
2014-07-02 02:36:45.927 21937 DEBUG cinder.openstack.common.processutils [req-06c62275-5522-410c-a400-83411d2065e4 - - - - -] Running cmd (subprocess): sudo cinder-rootwrap /etc/cinder/rootwrap.conf env LC_ALL=C lvs --noheadings --unit=g -o vg_name,name,size --nosuffix cinder-volumes execute /usr/lib/python2.7/site-packages/cinder/openstack/common/processutils.py:142
2014-07-02 02:36:46.441 21937 DEBUG cinder.openstack.common.processutils [req-06c62275-5522-410c-a400-83411d2065e4 - - - - -] Running cmd (subprocess): sudo cinder-rootwrap /etc/cinder/rootwrap.conf env LC_ALL=C vgs --version execute /usr/lib/python2.7/site-packages/cinder/openstack/common/processutils.py:142
2014-07-02 02:36:47.041 21937 DEBUG cinder.openstack.common.processutils [req-06c62275-5522-410c-a400-83411d2065e4 - - - - -] Running cmd (subprocess): sudo cinder-rootwrap /etc/cinder/rootwrap.conf env LC_ALL=C vgs --noheadings --unit=g -o name,size,free,lv_count,uuid --separator : --nosuffix cinder-volumes execute /usr/lib/python2.7/site-packages/cinder/openstack/common/processutils.py:142
2014-07-02 02:36:47.575 21937 DEBUG cinder.brick.local_dev.lvm [req-06c62275-5522-410c-a400-83411d2065e4 - - - - -] Created thin pool 'cinder-volumes/cinder-volumes-pool' with size 19.57g of total 20.6g create_thin_pool /usr/lib/python2.7/site-packages/cinder/brick/local_dev/lvm.py:463
2014-07-02 02:36:47.576 21937 DEBUG cinder.openstack.common.processutils [req-06c62275-5522-410c-a400-83411d2065e4 - - - - -] Running cmd (subprocess): sudo cinder-rootwrap /etc/cinder/rootwrap.conf lvcreate -T -L 19.57g cinder-volumes/cinder-volumes-pool execute /usr/lib/python2.7/site-packages/cinder/openstack/common/processutils.py:142
2014-07-02 02:36:49.270 21937 DEBUG cinder.openstack.common.processutils [req-06c62275-5522-410c-a400-83411d2065e4 - - - - -] Result was 5 execute /usr/lib/python2.7/site-packages/cinder/openstack/common/processutils.py:167
2014-07-02 02:36:49.271 21937 ERROR cinder.volume.manager [req-06c62275-5522-410c-a400-83411d2065e4 - - - - -] Error encountered during initialization of driver: LVMISCSIDriver
2014-07-02 02:36:49.271 21937 ERROR cinder.volume.manager [req-06c62275-5522-410c-a400-83411d2065e4 - - - - -] Unexpected error while running command.
Command: sudo cinder-rootwrap /etc/cinder/rootwrap.conf lvcreate -T -L 19.57g cinder-volumes/cinder-volumes-pool
Exit code: 5
Stdout: ''
Stderr: '  /run/lvm/lvmetad.socket: connect failed: No such file or directory\n  WARNING: Failed to connect to lvmetad: No such file or directory. Falling back to internal scanning.\n  /run/lvm/lvmetad.socket: connect failed: No such file or directory\n  /run/lvm/lvmetad.socket: connect failed: No such file or directory\n  /run/lvm/lvmetad.socket: connect failed: No such file or directory\n  /run/lvm/lvmetad.socket: connect failed: No such file or directory\n  /run/lvm/lvmetad.socket: connect failed: No such file or directory\n  /run/lvm/lvmetad.socket: connect failed: No such file or directory\n  /run/lvm/lvmetad.socket: connect failed: No such file or directory\n  /run/lvm/lvmetad.socket: connect failed: No such file or directory\n  /run/lvm/lvmetad.socket: connect failed: No such file or directory\n  /run/lvm/lvmetad.socket: connect failed: No such file or directory\n  /run/lvm/lvmetad.socket: connect failed: No such file or directory\n  /run/lvm/lvmetad.socket: connect failed: No such file or directory\n  /run/lvm/lvmetad.socket: connect failed: No such file or directory\n  Thin pool cinder-volumes-pool already exists in Volume group cinder-volumes.\n'
2014-07-02 02:36:49.271 21937 TRACE cinder.volume.manager Traceback (most recent call last):
2014-07-02 02:36:49.271 21937 TRACE cinder.volume.manager   File "/usr/lib/python2.7/site-packages/cinder/volume/manager.py", line 243, in init_host
2014-07-02 02:36:49.271 21937 TRACE cinder.volume.manager     self.driver.check_for_setup_error()
2014-07-02 02:36:49.271 21937 TRACE cinder.volume.manager   File "/usr/lib/python2.7/site-packages/cinder/volume/drivers/lvm.py", line 83, in check_for_setup_error
2014-07-02 02:36:49.271 21937 TRACE cinder.volume.manager     executor=self._execute)
2014-07-02 02:36:49.271 21937 TRACE cinder.volume.manager   File "/usr/lib/python2.7/site-packages/cinder/brick/local_dev/lvm.py", line 91, in __init__
2014-07-02 02:36:49.271 21937 TRACE cinder.volume.manager     self.create_thin_pool(pool_name)
2014-07-02 02:36:49.271 21937 TRACE cinder.volume.manager   File "/usr/lib/python2.7/site-packages/cinder/brick/local_dev/lvm.py", line 467, in create_thin_pool
2014-07-02 02:36:49.271 21937 TRACE cinder.volume.manager     run_as_root=True)
2014-07-02 02:36:49.271 21937 TRACE cinder.volume.manager   File "/usr/lib/python2.7/site-packages/cinder/utils.py", line 136, in execute
2014-07-02 02:36:49.271 21937 TRACE cinder.volume.manager     return processutils.execute(*cmd, **kwargs)
2014-07-02 02:36:49.271 21937 TRACE cinder.volume.manager   File "/usr/lib/python2.7/site-packages/cinder/openstack/common/processutils.py", line 173, in execute
2014-07-02 02:36:49.271 21937 TRACE cinder.volume.manager     cmd=' '.join(cmd))
2014-07-02 02:36:49.271 21937 TRACE cinder.volume.manager ProcessExecutionError: Unexpected error while running command.
2014-07-02 02:36:49.271 21937 TRACE cinder.volume.manager Command: sudo cinder-rootwrap /etc/cinder/rootwrap.conf lvcreate -T -L 19.57g cinder-volumes/cinder-volumes-pool
2014-07-02 02:36:49.271 21937 TRACE cinder.volume.manager Exit code: 5
2014-07-02 02:36:49.271 21937 TRACE cinder.volume.manager Stdout: ''
2014-07-02 02:36:49.271 21937 TRACE cinder.volume.manager Stderr: '  /run/lvm/lvmetad.socket: connect failed: No such file or directory\n  WARNING: Failed to connect to lvmetad: No such file or directory. Falling back to internal scanning.\n  /run/lvm/lvmetad.socket: connect failed: No such file or directory\n  /run/lvm/lvmetad.socket: connect failed: No such file or directory\n  /run/lvm/lvmetad.socket: connect failed: No such file or directory\n  /run/lvm/lvmetad.socket: connect failed: No such file or directory\n  /run/lvm/lvmetad.socket: connect failed: No such file or directory\n  /run/lvm/lvmetad.socket: connect failed: No such file or directory\n  /run/lvm/lvmetad.socket: connect failed: No such file or directory\n  /run/lvm/lvmetad.socket: connect failed: No such file or directory\n  /run/lvm/lvmetad.socket: connect failed: No such file or directory\n  /run/lvm/lvmetad.socket: connect failed: No such file or directory\n  /run/lvm/lvmetad.socket: connect failed: No such file or directory\n  /run/lvm/lvmetad.socket: connect failed: No such file or directory\n  /run/lvm/lvmetad.socket: connect failed: No such file or directory\n  Thin pool cinder-volumes-pool already exists in Volume group cinder-volumes.\n'
2014-07-02 02:36:49.271 21937 TRACE cinder.volume.manager
2014-07-02 02:36:49.276 21937 DEBUG cinder.openstack.common.lockutils [-] Got semaphore "dbapi_backend" for method "__get_backend"... inner /usr/lib/python2.7/site-packages/cinder/openstack/common/lockutils.py:191
2014-07-02 02:36:49.649 21937 DEBUG cinder.service [-] Creating RPC server for service cinder-volume start /usr/lib/python2.7/site-packages/cinder/service.py:113
2014-07-02 02:36:49.650 21937 DEBUG stevedore.extension [-] found extension EntryPoint.parse('blocking = oslo.messaging._executors.impl_blocking:BlockingExecutor') _load_plugins /usr/lib/python2.7/site-packages/stevedore/extension.py:156
2014-07-02 02:36:49.650 21937 DEBUG stevedore.extension [-] found extension EntryPoint.parse('eventlet = oslo.messaging._executors.impl_eventlet:EventletExecutor') _load_plugins /usr/lib/python2.7/site-packages/stevedore/extension.py:156
2014-07-02 02:36:49.672 21937 INFO oslo.messaging._drivers.impl_rabbit [-] Connected to AMQP server on 192.168.2.46:5672
2014-07-02 02:37:44.693 21937 DEBUG cinder.openstack.common.periodic_task [req-a0bf5319-1f6c-43a7-b7ff-5b4f3ffd627c - - - - -] Running periodic task VolumeManager._publish_service_capabilities run_periodic_tasks /usr/lib/python2.7/site-packages/cinder/openstack/common/periodic_task.py:178
2014-07-02 02:37:44.693 21937 DEBUG cinder.openstack.common.periodic_task [req-a0bf5319-1f6c-43a7-b7ff-5b4f3ffd627c - - - - -] Running periodic task VolumeManager._report_driver_status run_periodic_tasks /usr/lib/python2.7/site-packages/cinder/openstack/common/periodic_task.py:178
2014-07-02 02:37:44.694 21937 INFO cinder.volume.manager [req-a0bf5319-1f6c-43a7-b7ff-5b4f3ffd627c - - - - -] Updating volume status
2014-07-02 02:37:44.694 21937 WARNING cinder.volume.manager [req-a0bf5319-1f6c-43a7-b7ff-5b4f3ffd627c - - - - -] Unable to update stats, LVMISCSIDriver -2.0.0  driver is uninitialized.
2014-07-02 02:38:44.695 21937 DEBUG cinder.openstack.common.periodic_task [-] Running periodic task VolumeManager._publish_service_capabilities run_periodic_tasks /usr/lib/python2.7/site-packages/cinder/openstack/common/periodic_task.py:178
2014-07-02 02:38:44.696 21937 DEBUG cinder.openstack.common.periodic_task [-] Running periodic task VolumeManager._report_driver_status run_periodic_tasks /usr/lib/python2.7/site-packages/cinder/openstack/common/periodic_task.py:178
2014-07-02 02:38:44.696 21937 INFO cinder.volume.manager [-] Updating volume status
2014-07-02 02:38:44.696 21937 WARNING cinder.volume.manager [-] Unable to update stats, LVMISCSIDriver -2.0.0  driver is uninitialized.
2014-07-02 02:39:44.700 21937 DEBUG cinder.openstack.common.periodic_task [-] Running periodic task VolumeManager._publish_service_capabilities run_periodic_tasks /usr/lib/python2.7/site-packages/cinder/openstack/common/periodic_task.py:178
2014-07-02 02:39:44.701 21937 DEBUG cinder.openstack.common.periodic_task [-] Running periodic task VolumeManager._report_driver_status run_periodic_tasks /usr/lib/python2.7/site-packages/cinder/openstack/common/periodic_task.py:178
2014-07-02 02:39:44.701 21937 INFO cinder.volume.manager [-] Updating volume status
2014-07-02 02:39:44.702 21937 WARNING cinder.volume.manager [-] Unable to update stats, LVMISCSIDriver -2.0.0  driver is uninitialized.

Comment 3 bkopilov 2014-07-06 10:58:29 UTC
Workaround for this issue:
sudo systemctl start lvm2-lvmetad.socket

or

sudo systemctl start lvm2-lvmetad.service
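
To make the workaround persist across reboots (standard systemctl usage; assuming the default unit names shipped by lvm2 on RHEL 7):

sudo systemctl enable lvm2-lvmetad.socket
sudo systemctl start lvm2-lvmetad.socket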

Comment 4 Sergey Gotliv 2014-07-06 11:13:46 UTC
This is not a Cinder issue! 

It is somewhere in between LVM and your automated tests.

In RHEL 7.0, LVM is configured by default to work with the LVM metadata daemon (lvm2-lvmetad), i.e. use_lvmetad = 1 in lvm.conf, but in your environment lvm2-lvmetad.socket is down, so all LVM commands print these connection errors.

It also looks like your test is trying to create a thin pool that already exists, which is a second, separate error in the log (see the lvs check after the quoted stderr below).

2014-07-02 02:36:49.271 21937 TRACE cinder.volume.manager Stderr: '  /run/lvm/lvmetad.socket: connect failed: No such file or directory\n  WARNING: Failed to connect to lvmetad: No such file or directory. Falling back to internal scanning.\n  /run/lvm/lvmetad.socket: connect failed: No such file or directory\n  /run/lvm/lvmetad.socket: connect failed: No such file or directory\n  /run/lvm/lvmetad.socket: connect failed: No such file or directory\n  /run/lvm/lvmetad.socket: connect failed: No such file or directory\n  /run/lvm/lvmetad.socket: connect failed: No such file or directory\n  /run/lvm/lvmetad.socket: connect failed: No such file or directory\n  /run/lvm/lvmetad.socket: connect failed: No such file or directory\n  /run/lvm/lvmetad.socket: connect failed: No such file or directory\n  /run/lvm/lvmetad.socket: connect failed: No such file or directory\n  /run/lvm/lvmetad.socket: connect failed: No such file or directory\n  /run/lvm/lvmetad.socket: connect failed: No such file or directory\n  /run/lvm/lvmetad.socket: connect failed: No such file or directory\n  /run/lvm/lvmetad.socket: connect failed: No such file or directory\n  Thin pool cinder-volumes-pool already exists in Volume group cinder-volumes.\n'
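
The duplicate-pool error can be confirmed independently of Cinder with a plain lvs query against the volume group named in the log (a quick sketch; the names are taken from the log above):

# list LVs in the cinder-volumes VG with their attributes
sudo lvs -o vg_name,lv_name,lv_attr cinder-volumes
# an lv_attr starting with 't' (e.g. 'twi-...') for cinder-volumes-pool means
# the thin pool already exists, so a second lvcreate -T fails.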

Comment 5 Sergey Gotliv 2014-08-26 18:57:45 UTC
Closed based on comment #4; this is not a Cinder issue.

