Bug 2046766

Summary: [IBM Z]: csi-rbdplugin pods failed to come up due to ImagePullBackOff from the "csiaddons" registry
Product: [Red Hat Storage] Red Hat OpenShift Data Foundation
Reporter: Sravika <sbalusu>
Component: ocs-operator
Assignee: Madhu Rajanna <mrajanna>
Status: CLOSED ERRATA
QA Contact: Sravika <sbalusu>
Severity: medium
Priority: unspecified
Version: 4.10
CC: dahorak, madam, mrajanna, muagarwa, nberry, ocs-bugs, odf-bz-bot, sostapov
Target Release: ODF 4.10.0
Hardware: Unspecified
OS: Unspecified
Fixed In Version: 4.10.0-132
Type: Bug
Last Closed: 2022-04-13 18:52:21 UTC

Description Sravika 2022-01-27 09:26:20 UTC
Description of problem (please be as detailed as possible and provide log
snippets):

With the latest ODF 4.10.0-122 deployment, the csi-rbdplugin pods fail to come up due to ImagePullBackOff from the "csiaddons" registry.

csi-rbdplugin-jmqtj                                               3/4     ImagePullBackOff   0          13h
csi-rbdplugin-provisioner-84bd488586-9wxd4                        6/7     ImagePullBackOff   0          13h
csi-rbdplugin-provisioner-84bd488586-blb9f                        6/7     ImagePullBackOff   0          13h
csi-rbdplugin-r562z                                               3/4     ImagePullBackOff   0          13h
csi-rbdplugin-x6tlb                                               3/4     ImagePullBackOff   0          13h
csi-rbdplugin-z6jfj                                               3/4     ImagePullBackOff   0          13h
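The stuck pods can be pulled out of a saved listing with a one-line awk filter. A minimal sketch over sample lines only; in practice the input would be the real output of `oc get po -n openshift-storage`:

```shell
# Sample of a saved pod listing (NAME READY STATUS RESTARTS AGE).
cat <<'EOF' > /tmp/pods.txt
csi-cephfsplugin-2fvrd   3/3   Running            0   14h
csi-rbdplugin-jmqtj      3/4   ImagePullBackOff   0   13h
csi-rbdplugin-r562z      3/4   ImagePullBackOff   0   13h
EOF

# Print the names of pods whose STATUS column is ImagePullBackOff.
awk '$3 == "ImagePullBackOff" {print $1}' /tmp/pods.txt
```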

#  oc -n openshift-storage describe po csi-rbdplugin-jmqtj | grep Image
    Image:         quay.io/rhceph-dev/openshift-ose-csi-node-driver-registrar@sha256:80f8e188497b4499c885fa09facc83f8dae5c6ae4b368787d5cd30781e8bbbf3
    Image ID:      quay.io/rhceph-dev/openshift-ose-csi-node-driver-registrar@sha256:433c18ed10beadde3a217d5aa0af867f40eaacea1d2eeca4a59343d5dfcc1bc3
    Image:         quay.io/rhceph-dev/odf4-cephcsi-rhel8@sha256:e32d9abfa39c76bdc8e35697df4bfd18d7655c03aff24ebd345f6f1b0806f2d3
    Image ID:      quay.io/rhceph-dev/odf4-cephcsi-rhel8@sha256:a690c4dfd1ef855f155c3617e91d90fb3e0f44d785c8381967dddef6abae854d
    Image:         quay.io/csiaddons/k8s-sidecar:v0.2.1
    Image ID:
      Reason:       ImagePullBackOff
    Image:         quay.io/rhceph-dev/odf4-cephcsi-rhel8@sha256:e32d9abfa39c76bdc8e35697df4bfd18d7655c03aff24ebd345f6f1b0806f2d3
    Image ID:      quay.io/rhceph-dev/odf4-cephcsi-rhel8@sha256:a690c4dfd1ef855f155c3617e91d90fb3e0f44d785c8381967dddef6abae854d
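The failing image can also be picked out mechanically: a container that never pulled successfully shows an `Image:` line with no resolved `Image ID:`. A minimal sketch over a trimmed sample of the describe output above (digests shortened in the sample; real input would be `oc -n openshift-storage describe po <pod>`):

```shell
# Trimmed sample of `oc describe pod` output; "..." stands in for full digests.
cat <<'EOF' > /tmp/describe.txt
    Image:         quay.io/rhceph-dev/odf4-cephcsi-rhel8@sha256:e32d...
    Image ID:      quay.io/rhceph-dev/odf4-cephcsi-rhel8@sha256:a690...
    Image:         quay.io/csiaddons/k8s-sidecar:v0.2.1
    Image ID:
EOF

# Remember each Image:, and print it when the following Image ID: is empty
# (an empty "Image ID:" line has only 2 fields: "Image" and "ID:").
awk '/^ *Image:/ {img=$2} /^ *Image ID:/ && NF == 2 {print img}' /tmp/describe.txt
```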

Version of all relevant components (if applicable):
odf-csi-addons-operator.v4.10.0
ocs-operator.v4.10.0 -  4.10.0-122 


Does this issue impact your ability to continue to work with the product
(please explain in detail what the user impact is)?


Is there any workaround available to the best of your knowledge?


Rate from 1 - 5 the complexity of the scenario you performed that caused this
bug (1 - very simple, 5 - very complex)?


Is this issue reproducible?
Yes

Can this issue be reproduced from the UI?
Yes

If this is a regression, please provide more details to justify this:


Steps to Reproduce:
1. Deploy OpenShift 4.10
2. Deploy ODF 4.10


Actual results:

The csi-rbdplugin pods fail to come up due to ImagePullBackOff from the "csiaddons" registry.

# oc get po -n openshift-storage
NAME                                                              READY   STATUS             RESTARTS   AGE
csi-addons-controller-manager-657d6fb664-4zz2x                    2/2     Running            0          14h
csi-cephfsplugin-2fvrd                                            3/3     Running            0          14h
csi-cephfsplugin-bjrkx                                            3/3     Running            0          14h
csi-cephfsplugin-fw9sn                                            3/3     Running            0          14h
csi-cephfsplugin-provisioner-584b8cd888-8ct5z                     6/6     Running            0          14h
csi-cephfsplugin-provisioner-584b8cd888-jkf8f                     6/6     Running            0          14h
csi-cephfsplugin-vg7sw                                            3/3     Running            0          14h
csi-rbdplugin-jmqtj                                               3/4     ImagePullBackOff   0          14h
csi-rbdplugin-provisioner-84bd488586-9wxd4                        6/7     ImagePullBackOff   0          14h
csi-rbdplugin-provisioner-84bd488586-blb9f                        6/7     ImagePullBackOff   0          14h
csi-rbdplugin-r562z                                               3/4     ImagePullBackOff   0          14h
csi-rbdplugin-x6tlb                                               3/4     ImagePullBackOff   0          14h
csi-rbdplugin-z6jfj                                               3/4     ImagePullBackOff   0          14h
noobaa-core-0                                                     1/1     Running            0          14h
noobaa-db-pg-0                                                    1/1     Running            0          14h
noobaa-endpoint-7455995b54-hq2pj                                  1/1     Running            0          14h
noobaa-operator-798c977867-m4j5q                                  1/1     Running            0          14h
ocs-metrics-exporter-556bf6cbdd-ttbds                             1/1     Running            0          14h
ocs-operator-6c85698f4b-rm8rj                                     1/1     Running            0          14h
odf-console-847cc544db-4bzqc                                      1/1     Running            0          14h
odf-operator-controller-manager-7ccc48fb44-xjhwm                  2/2     Running            0          14h
rook-ceph-crashcollector-master-2.ocsm4205001.lnxne.boe-79wsgf2   1/1     Running            0          14h
rook-ceph-crashcollector-worker-0.ocsm4205001.lnxne.boe-7b4g27j   1/1     Running            0          14h
rook-ceph-crashcollector-worker-1.ocsm4205001.lnxne.boe-7dnj2tz   1/1     Running            0          14h
rook-ceph-crashcollector-worker-2.ocsm4205001.lnxne.boe-ccf7q7z   1/1     Running            0          14h
rook-ceph-crashcollector-worker-3.ocsm4205001.lnxne.boe-76qkfdf   1/1     Running            0          14h
rook-ceph-mds-ocs-storagecluster-cephfilesystem-a-7dc4c458vxtxf   2/2     Running            0          14h
rook-ceph-mds-ocs-storagecluster-cephfilesystem-b-5748db87dqc6q   2/2     Running            0          14h
rook-ceph-mgr-a-6686dd9645-fsgwq                                  3/3     Running            0          14h
rook-ceph-mgr-b-f8c467b58-bfw8k                                   3/3     Running            0          14h
rook-ceph-mon-a-7d5cf45d95-bpxfw                                  2/2     Running            0          14h
rook-ceph-mon-b-58f7cf76cf-wgmfp                                  2/2     Running            0          14h
rook-ceph-mon-c-58fb6d4d86-ddbdl                                  2/2     Running            0          14h
rook-ceph-mon-d-54d9d6fddd-zw9m5                                  2/2     Running            0          14h
rook-ceph-mon-e-5c64cb6fc-w52cv                                   2/2     Running            0          14h
rook-ceph-operator-859c88995b-cj9mz                               1/1     Running            0          14h
rook-ceph-osd-0-6fdb578465-s9wpm                                  2/2     Running            0          14h
rook-ceph-osd-1-5c74ccdc94-kslh5                                  2/2     Running            0          14h
rook-ceph-osd-2-78cf96dd56-v7dcl                                  2/2     Running            0          14h
rook-ceph-osd-3-7f4fb4b78c-jc974                                  2/2     Running            0          14h
rook-ceph-osd-prepare-1b78713c124ae7f86858e38398eae586-pxdsp      0/1     Completed          0          14h
rook-ceph-osd-prepare-4d8e4f116f15a1a5ebbaee85790b79d1-kkqqh      0/1     Completed          0          14h
rook-ceph-osd-prepare-a7ae894a9272c0fde57c70f04d46582b-c5zbp      0/1     Completed          0          14h
rook-ceph-osd-prepare-e778038dc9e84de7fce5bd1afa5fa71a-xf59t      0/1     Completed          0          14h
rook-ceph-rgw-ocs-storagecluster-cephobjectstore-a-7b69cc54rw2w   2/2     Running            0          14h
rook-ceph-rgw-ocs-storagecluster-cephobjectstore-a-7b69cc5zdtjv   2/2     Running            0          14h


Expected results:
The csi-rbdplugin pods should be provisioned and reach the Running state.

Additional info:

Must-gather logs
https://drive.google.com/file/d/1Cd2emUj8g2tYaLDtXAPP1mCUnY5nNIsT/view?usp=sharing

Comment 8 Sravika 2022-03-09 09:15:29 UTC
This BZ is not reproducible on the latest versions of ODF and can be moved to done.

Comment 10 errata-xmlrpc 2022-04-13 18:52:21 UTC
Since the problem described in this bug report should be resolved in a recent advisory, it has been closed with a resolution of ERRATA.

For information on the advisory (Important: Red Hat OpenShift Data Foundation 4.10.0 enhancement, security & bug fix update), and where to find the updated files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHSA-2022:1372