Bug 2183683 - [ODF 4.11] Deployment of ODF 4.9 over external mode failing with: panic: assignment to entry in nil map in ocs-operator logs
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: Red Hat OpenShift Data Foundation
Classification: Red Hat Storage
Component: ocs-operator
Version: 4.11
Hardware: Unspecified
OS: Unspecified
Priority: unspecified
Severity: high
Target Milestone: ---
Target Release: ODF 4.11.7
Assignee: Malay Kumar parida
QA Contact: Shivam Durgbuns
URL:
Whiteboard:
Depends On: 2183501 2183684 2183685
Blocks:
Reported: 2023-04-01 04:18 UTC by Malay Kumar parida
Modified: 2023-08-09 17:00 UTC
CC List: 6 users

Fixed In Version:
Doc Type: No Doc Update
Doc Text:
Clone Of: 2183501
Environment:
Last Closed: 2023-04-26 10:01:19 UTC
Embargoed:




Links
Github red-hat-storage/ocs-operator pull 1978 (open): Bug 2183683: [release-4.11] Initialize rookCephOperatorCM if data is nil (last updated 2023-04-03 08:28:58 UTC)
Red Hat Product Errata RHSA-2023:2023 (last updated 2023-04-26 10:01:29 UTC)
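
The PR title above points at the fix: initialize the ConfigMap's Data map before writing to it. A minimal sketch of that guard pattern, assuming a corev1.ConfigMap variable named rookCephOperatorCM as in the PR title (the key name is hypothetical, and this is not the actual PR diff):

    package main

    import (
        "fmt"

        corev1 "k8s.io/api/core/v1"
    )

    func main() {
        // A ConfigMap fetched from the API server can carry a nil Data
        // map when it holds no keys; writing to it then panics.
        rookCephOperatorCM := &corev1.ConfigMap{}

        // Guard named in the PR title: initialize Data if it is nil.
        if rookCephOperatorCM.Data == nil {
            rookCephOperatorCM.Data = map[string]string{}
        }

        rookCephOperatorCM.Data["EXAMPLE_KEY"] = "example-value" // hypothetical key
        fmt.Println(rookCephOperatorCM.Data)
    }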

Description Malay Kumar parida 2023-04-01 04:18:14 UTC
+++ This bug was initially created as a clone of Bug #2183501 +++

Description of problem (please be as detailed as possible and provide log
snippets):
In ocs-operator logs I see:
2023-03-31T09:39:47.418626879Z panic: assignment to entry in nil map
2023-03-31T09:39:47.418626879Z 
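
For context, this is Go's standard failure mode when writing to a map that was never initialized. A minimal self-contained repro of the same class of panic (illustrative only, not the operator's actual code path):

    package main

    func main() {
        // A declared but uninitialized map is nil: reads return zero
        // values, but any write panics at runtime.
        var m map[string]string
        m["key"] = "value" // panic: assignment to entry in nil map
    }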

Version of all relevant components (if applicable):
quay.io/rhceph-dev/ocs-registry:4.9.15-1
OCP 4.9 nightly

Does this issue impact your ability to continue to work with the product
(please explain in detail what the user impact is)?
Yes, blocking deployment with external cluster on ODF 4.9

Is there any workaround available to the best of your knowledge?
Nope


Rate from 1 - 5 the complexity of the scenario you performed that caused this
bug (1 - very simple, 5 - very complex)?
1

Is this issue reproducible?
Yes

Can this issue be reproduced from the UI?
Haven't tried

If this is a regression, please provide more details to justify this:
Yes

Steps to Reproduce:
1. Install ODF in external mode


Actual results:
Failed deployment:
ocs-operator-796dff676f-5mp87                     0/1     CrashLoopBackOff   13 (4m22s ago)   51m
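
On a live cluster, the panic trace can be read from the crashed container's previous logs (general oc usage, not a command taken from this report):

    oc logs -n openshift-storage ocs-operator-796dff676f-5mp87 --previous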

Expected results:
Have successful deployment

Additional info:
Logs in private comment

--- Additional comment from Petr Balogh on 2023-03-31 11:46:47 UTC ---

OCS-operator log:
http://magna002.ceph.redhat.com/ocsci-jenkins/openshift-clusters/j-229vu1ce33-t1/j-229vu1ce33-t1_20230331T085049/logs/failed_testcase_ocs_logs_1680254576/test_deployment_ocs_logs/ocs_must_gather/quay-io-rhceph-dev-ocs-must-gather-sha256-82081d40487b4fc9325ac6f687edabba5dbb109b2d7028fd8a34c134d92ecb7c/namespaces/openshift-storage/pods/ocs-operator-796dff676f-5mp87/ocs-operator/ocs-operator/logs/previous.log

Must gather log:
http://magna002.ceph.redhat.com/ocsci-jenkins/openshift-clusters/j-229vu1ce33-t1/j-229vu1ce33-t1_20230331T085049/logs/failed_testcase_ocs_logs_1680254576/test_deployment_ocs_logs/

Jenkins job:
https://ocs4-jenkins-csb-odf-qe.apps.ocp-c1.prod.psi.redhat.com/job/qe-trigger-vsphere-upi-1az-rhcos-external-3m-3w-tier1/229/
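
For reference, ODF must-gather output like the above is collected by pointing oc adm must-gather at the OCS must-gather image (general usage; the image tag here is illustrative):

    oc adm must-gather --image=quay.io/rhceph-dev/ocs-must-gather:latest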

--- Additional comment from RHEL Program Management on 2023-03-31 11:46:57 UTC ---

This bug, having no release flag set previously, now has the release flag 'odf-4.13.0' set to '?', and so is being proposed to be fixed in the ODF 4.13.0 release. Note that the 3 Acks (pm_ack, devel_ack, qa_ack), if any were previously set while the release flag was missing, have now been reset, since the Acks are to be set against a release flag.

--- Additional comment from RHEL Program Management on 2023-03-31 11:46:57 UTC ---

This bug report has Keywords: Regression or TestBlocker.
Since no regressions or test blockers are allowed between releases, it is also being proposed as a blocker for this release. Please resolve ASAP.

Comment 12 errata-xmlrpc 2023-04-26 10:01:19 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory (Important: Red Hat OpenShift Data Foundation
4.11.7 Bug Fix and security update), and where to find the updated files,
follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHSA-2023:2023

