Bug 2042318

Summary: CephFS: Log Failure details if subvolume clone fails.
Product: [Red Hat Storage] Red Hat OpenShift Data Foundation
Component: csi-driver
Version: 4.9
Target Release: ODF 4.12.0
Hardware: Unspecified
OS: Unspecified

Reporter: Madhu Rajanna <mrajanna>
Assignee: Niels de Vos <ndevos>
QA Contact: Avi Liani <alayani>
CC: assingh, madam, mmuench, muagarwa, ocs-bugs, odf-bz-bot, tdesala

Status: CLOSED ERRATA
Severity: unspecified
Priority: unspecified
Target Milestone: ---
Doc Type: No Doc Update
Type: Bug
Last Closed: 2023-01-31 00:19:18 UTC
Cloned to: 2042320 (view as bug list)
Bug Depends On: 2042320

Comment 6 Avi Liani 2022-06-29 15:19:27 UTC
The fix is not available yet in ceph-csi; https://github.com/ceph/ceph-csi/pull/3201 needs to be merged.

Also, @mrajanna, please provide information on how to verify this.

What do we need to look for in the logs, and in which logs?

Comment 7 Madhu Rajanna 2022-06-29 15:24:32 UTC
Assigning this to Niels, as he has a PR in ceph-csi to log the clone failure message.

> The fix is not available yet in ceph-csi; https://github.com/ceph/ceph-csi/pull/3201 needs to be merged.

> Also, @mrajanna, please provide information on how to verify this.

Create a large PVC and clone it.

> What do we need to look for in the logs, and in which logs?

In the rbd-provisioner rbd-plugin container, you will see the `clone failed` error message.
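The verification flow described above could be sketched with manifests like the following. These are illustrative only: the PVC names, size, and storage class (`ocs-storagecluster-cephfs` is the usual ODF CephFS class) are my placeholders, not values from this bug report.

```yaml
# Large source PVC on the CephFS storage class (names/sizes are placeholders).
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: source-pvc
  namespace: openshift-storage
spec:
  accessModes:
    - ReadWriteMany
  storageClassName: ocs-storagecluster-cephfs
  resources:
    requests:
      storage: 500Gi
---
# Clone of the source PVC, created via dataSource.
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: cloned-pvc
  namespace: openshift-storage
spec:
  accessModes:
    - ReadWriteMany
  storageClassName: ocs-storagecluster-cephfs
  dataSource:
    kind: PersistentVolumeClaim
    name: source-pvc
  resources:
    requests:
      storage: 500Gi
```

After filling `source-pvc` with data and applying the clone, the provisioner container logs can be checked for the clone failure message.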

Comment 8 Niels de Vos 2022-06-29 15:55:30 UTC
Mudit, Ceph-CSI in ODF-4.11 uses go-ceph v0.15. The new failure logging is only available from go-ceph v0.16. Can this be moved to ODF-4.12? Backporting a rebase of go-ceph is not appropriate at this time in the ODF-4.11 planning.

Comment 9 Mudit Agarwal 2022-06-29 17:03:38 UTC
Not a 4.11 blocker, moving it out.

Comment 10 Niels de Vos 2022-07-08 08:50:03 UTC
https://github.com/red-hat-storage/ceph-csi/pull/110 includes the change for ODF-4.12.

Comment 16 Avi Liani 2022-11-16 10:37:17 UTC
Tested on :

OCP version : 4.12.0-0.nightly-2022-10-25-210451

        OCS versions
        ==============

                NAME                              DISPLAY                       VERSION   REPLACES   PHASE
                mcg-operator.v4.12.0              NooBaa Operator               4.12.0               Succeeded
                ocs-operator.v4.12.0              OpenShift Container Storage   4.12.0               Succeeded
                odf-csi-addons-operator.v4.12.0   CSI Addons                    4.12.0               Succeeded
                odf-operator.v4.12.0              OpenShift Data Foundation     4.12.0               Succeeded
                
                ODF (OCS) build :                     full_version: 4.12.0-86
                
        Rook versions
        ===============

                rook: v4.12.0-0.75728a5a9c8e47917cae3cf7ea5c2db26590154b
                go: go1.18.4
 

I created a volume on 50% of the storage capacity, filled it with data, and then tried to create a clone of it.

The clone is stuck in the Pending state, and in the csi-cephfsplugin container log I see:

# oc -n openshift-storage logs csi-cephfsplugin-provisioner-5566474757-z5hdt -c csi-cephfsplugin|grep failed
E1103 15:12:10.480021       1 volumemounter.go:88] failed to run ceph-fuse exec: "ceph-fuse": executable file not found in $PATH
E1116 10:21:32.097011       1 controllerserver.go:144] ID: 22 Req-ID: pvc-aa5247ce-34c5-4a64-962b-7504a78aa198 failed to create clone from subvolume csi-vol-8035ecb3-0614-472c-925c-fb3e6d2000f9: clone from snapshot is pending
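For automated checks, a small helper could scan the provisioner log for these clone failure lines. This is a hypothetical sketch (the function and pattern are my own, not part of ceph-csi), based on the message format shown above.

```python
import re

# Hypothetical helper (not part of ceph-csi): extract clone failure
# details from csi-cephfsplugin provisioner log text. The pattern matches
# lines like "failed to create clone from subvolume <name>: <reason>".
CLONE_ERROR = re.compile(
    r"failed to create clone from subvolume (?P<subvol>\S+): (?P<reason>.+)"
)

def find_clone_errors(log_text):
    """Return (subvolume, reason) tuples for every clone failure line."""
    results = []
    for line in log_text.splitlines():
        m = CLONE_ERROR.search(line)
        if m:
            results.append((m.group("subvol"), m.group("reason")))
    return results

# Example using the log line captured during verification above.
log = (
    "E1116 10:21:32.097011 1 controllerserver.go:144] ID: 22 "
    "Req-ID: pvc-aa5247ce-34c5-4a64-962b-7504a78aa198 "
    "failed to create clone from subvolume "
    "csi-vol-8035ecb3-0614-472c-925c-fb3e6d2000f9: "
    "clone from snapshot is pending"
)
print(find_clone_errors(log))
```

Running this against the captured line yields the subvolume name and the reason `clone from snapshot is pending`, which matches the state seen while the clone PVC was Pending.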

Comment 19 errata-xmlrpc 2023-01-31 00:19:18 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory (Red Hat OpenShift Data Foundation 4.12.0 enhancement and bug fix update), and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHBA-2023:0551