Bug 1844348 - Unable to call volumeEmptyCheck in vdsm-gluster due to errors in vdsm-gluster
Summary: Unable to call volumeEmptyCheck in vdsm-gluster due to errors in vdsm-gluster
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: Red Hat Gluster Storage
Classification: Red Hat Storage
Component: rhhi
Version: rhgs-3.5
Hardware: x86_64
OS: Linux
Priority: high
Severity: high
Target Milestone: ---
Target Release: RHHI-V 1.8
Assignee: Kaustav Majumder
QA Contact: SATHEESARAN
URL:
Whiteboard:
Depends On: 1842767
Blocks: RHHI-V-1.8-Engineering-Inflight-BZs
 
Reported: 2020-06-05 07:34 UTC by SATHEESARAN
Modified: 2020-08-04 14:52 UTC
CC List: 5 users

Fixed In Version:
Doc Type: No Doc Update
Doc Text:
Clone Of: 1842767
Environment:
Last Closed: 2020-08-04 14:52:32 UTC
Embargoed:




Links
System ID Private Priority Status Summary Last Updated
Red Hat Product Errata RHEA-2020:3314 0 None None None 2020-08-04 14:52:54 UTC

Description SATHEESARAN 2020-06-05 07:34:32 UTC
+++ This bug was initially created as a clone of Bug #1842767 +++

Description of problem:
volumeEmptyCheck in vdsm-gluster does not work; the call fails with an error.


Version-Release number of selected component (if applicable):


How reproducible:


Steps to Reproduce:
1. Set up an oVirt-Gluster HCI cluster.
2. Run vdsm-client --gluster-enabled GlusterVolume volumeEmptyCheck volumeName=data on one of the hosts.

Actual results:
The volume empty check issued through vdsm-client fails.
Errors:
vdsm-client: Command GlusterVolume.volumeEmptyCheck with args {'volumeName': 'data'} failed:
(code=4574, message=Failed to Check if gluster volume is empty: rc=1 out=[b'/usr/bin/python3: No module named gluster.gfapi\n'] err=())

Expected results:
No errors should be observed.

Additional info:
Similar to other functionality in gfapi.py that throws errors and needs rework.

Excerpt from the supervdsm log:
MainProcess|jsonrpc/7::ERROR::2020-06-02 10:28:52,973::supervdsm_server::97::SuperVdsm.ServerCallback::(wrapper) Error in volumeEmptyCheck
Traceback (most recent call last):
  File "/usr/lib/python3.6/site-packages/vdsm/gluster/gfapi.py", line 266, in volumeEmptyCheck
    out = commands.run(command, env=env)
  File "/usr/lib/python3.6/site-packages/vdsm/common/commands.py", line 101, in run
    raise cmdutils.Error(args, p.returncode, out, err)
vdsm.common.cmdutils.Error: Command ['/usr/bin/python3', '-m', 'gluster.gfapi', '-v', 'data', '-p', '24007', '-H', 'localhost', '-t', 'tcp', '-c', 'readdir'] failed with rc=1 out=b'' err=b'/usr/bin/python3: No module named gluster.gfapi\n'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.6/site-packages/vdsm/supervdsm_server.py", line 95, in wrapper
    res = func(*args, **kwargs)
  File "/usr/lib/python3.6/site-packages/vdsm/gluster/gfapi.py", line 268, in volumeEmptyCheck
    raise ge.GlusterVolumeEmptyCheckFailedException(e.rc, [e.err])
vdsm.gluster.exception.GlusterVolumeEmptyCheckFailedException: Failed to Check if gluster volume is empty: rc=1 out=[b'/usr/bin/python3: No module named gluster.gfapi\n'] err=()
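As a rough reconstruction of the failing call path (names and the command line are taken from the traceback above; vdsm's commands.run()/cmdutils helpers are replaced here with plain subprocess, so treat this as an illustrative sketch, not vdsm's actual implementation):

```python
# Sketch of the failing path in vdsm/gluster/gfapi.py, reconstructed
# from the traceback above. vdsm's commands.run() / cmdutils.Error are
# replaced with plain subprocess; illustrative only, not the real code.
import subprocess


class GlusterVolumeEmptyCheckFailedException(Exception):
    """Mirrors the error message format seen in the supervdsm log."""

    def __init__(self, rc, err):
        super().__init__(
            "Failed to Check if gluster volume is empty: "
            "rc=%s out=%s err=()" % (rc, err))


def volume_empty_check(volume, host="localhost", port="24007"):
    # This is the exact command line shown in the traceback; it fails
    # when /usr/bin/python3 cannot import the 'gluster.gfapi' module.
    command = ["/usr/bin/python3", "-m", "gluster.gfapi",
               "-v", volume, "-p", port, "-H", host,
               "-t", "tcp", "-c", "readdir"]
    p = subprocess.run(command, stdout=subprocess.PIPE,
                       stderr=subprocess.PIPE)
    if p.returncode != 0:
        # On the affected hosts, stderr carries
        # '/usr/bin/python3: No module named gluster.gfapi' and rc=1.
        raise GlusterVolumeEmptyCheckFailedException(
            p.returncode, [p.stderr])
    return p.stdout
```

The key point is that the child interpreter, not supervdsm itself, must be able to import gluster.gfapi.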

--- Additional comment from Yaniv Kaul on 2020-06-02 07:59:25 UTC ---

Hmmm, are you running with libgfapi?

--- Additional comment from Kaustav Majumder on 2020-06-02 09:36:34 UTC ---

(In reply to Yaniv Kaul from comment #1)
> Hmmm, are you running with libgfapi?

Checked both with and without libgfapi; both give the same errors.

--- Additional comment from Sahina Bose on 2020-06-03 12:58:30 UTC ---

(In reply to Yaniv Kaul from comment #1)
> Hmmm, are you running with libgfapi?

gfapi is also used in vdsm to check volume size - this has no relation to the gfapi driver in qemu. The failure seen here is from the internal usage of gfapi to query volume size.
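A quick way to confirm this on a host is to check whether the interpreter that vdsm invokes can import gluster.gfapi at all (a generic diagnostic sketch, not part of vdsm):

```python
# Diagnostic sketch: check whether a module is importable by this
# interpreter. 'No module named gluster.gfapi' in the traceback means
# this check would report False when run under /usr/bin/python3.
import importlib.util


def module_available(name):
    """Return True if 'name' can be imported, False otherwise."""
    try:
        return importlib.util.find_spec(name) is not None
    except ModuleNotFoundError:
        # Raised when a parent package (e.g. 'gluster') is missing.
        return False


if __name__ == "__main__":
    print("gluster.gfapi importable:", module_available("gluster.gfapi"))
```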

Comment 3 SATHEESARAN 2020-07-14 07:24:05 UTC
Verified with vdsm-4.40.22-1.el8ev.x86_64

There are no exceptions seen in supervdsm.log or vdsm.log related to 'vdsm.gluster.exception.GlusterVolumeEmptyCheckFailedException'.

Comment 4 errata-xmlrpc 2020-08-04 14:52:32 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory (RHHI for Virtualization 1.8 bug fix and enhancement update), and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHEA-2020:3314

