Description of problem (please be as detailed as possible and provide log snippets):
ocs-operator.v4.9.0-53.ci fails to install

Version of all relevant components (if applicable):
openshift installer (4.9.0-0.nightly-2021-08-07-175228)
ocs-operator.v4.9.0-53.ci

Does this issue impact your ability to continue to work with the product
(please explain in detail what is the user impact)?
Yes

Is there any workaround available to the best of your knowledge?
No

Rate from 1 - 5 the complexity of the scenario you performed that caused this
bug (1 - very simple, 5 - very complex)?
1

Is this issue reproducible?
1/1

Can this issue reproduce from the UI?
Not Tried

If this is a regression, please provide more details to justify this:
Yes

Steps to Reproduce:
1. Install OCS using ocs-ci
2. Check that ocs-operator is installed
3.

Actual results:
ocs-operator fails to install; the CSV keeps cycling to the Failed phase with
"needs reinstall: missing deployment with name=noobaa-operator"

Expected results:
ocs-operator should be in the Succeeded phase

Additional info:

> olm logs:

{"level":"error","ts":1628575464.7899,"logger":"controllers.operatorcondition","msg":"Error ensuring OperatorCondition Deployment EnvVars","request":"openshift-storage/ocs-operator.v4.9.0-53.ci","error":"Deployment.apps \"noobaa-operator\" not found","stacktrace":"sigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).reconcileHandler\n\t/build/vendor/sigs.k8s.io/controller-runtime/pkg/internal/controller/controller.go:298\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).processNextWorkItem\n\t/build/vendor/sigs.k8s.io/controller-runtime/pkg/internal/controller/controller.go:253\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).Start.func2.2\n\t/build/vendor/sigs.k8s.io/controller-runtime/pkg/internal/controller/controller.go:214"}
{"level":"error","ts":1628575464.78995,"logger":"controller-runtime.manager.controller.operatorcondition","msg":"Reconciler error","reconciler group":"operators.coreos.com","reconciler kind":"OperatorCondition","name":"ocs-operator.v4.9.0-53.ci","namespace":"openshift-storage","error":"Deployment.apps \"noobaa-operator\" not found","stacktrace":"sigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).processNextWorkItem\n\t/build/vendor/sigs.k8s.io/controller-runtime/pkg/internal/controller/controller.go:253\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).Start.func2.2\n\t/build/vendor/sigs.k8s.io/controller-runtime/pkg/internal/controller/controller.go:214"}
{"level":"error","ts":1628575464.8411217,"logger":"controllers.operator","msg":"Could not update Operator status","request":"/ocs-operator.openshift-storage","error":"Operation cannot be fulfilled on operators.operators.coreos.com \"ocs-operator.openshift-storage\": the object has been modified; please apply your changes to the latest version and try again","stacktrace":"sigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).reconcileHandler\n\t/build/vendor/sigs.k8s.io/controller-runtime/pkg/internal/controller/controller.go:298\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).processNextWorkItem\n\t/build/vendor/sigs.k8s.io/controller-runtime/pkg/internal/controller/controller.go:253\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).Start.func2.2\n\t/build/vendor/sigs.k8s.io/controller-runtime/pkg/internal/controller/controller.go:214"}
E0810 06:04:24.844594  1 queueinformer_operator.go:290] sync {"update" "openshift-storage/ocs-operator.v4.9.0-53.ci"} failed: Deployment.apps "noobaa-operator" is invalid: [spec.template.spec.containers[0].env[5].name: Required value, spec.template.spec.containers[0].env[6].name: Required value]
time="2021-08-10T06:04:25Z" level=warning msg="needs reinstall: missing deployment with name=noobaa-operator" csv=ocs-operator.v4.9.0-53.ci id=dm8IS namespace=openshift-storage phase=Failed strategy=deployment
I0810 06:04:25.786186  1 event.go:282] Event(v1.ObjectReference{Kind:"ClusterServiceVersion", Namespace:"openshift-storage", Name:"ocs-operator.v4.9.0-53.ci", UID:"178c0a12-f536-418f-b87a-fbf2e8301ac7", APIVersion:"operators.coreos.com/v1alpha1", ResourceVersion:"44758", FieldPath:""}): type: 'Normal' reason: 'NeedsReinstall' installing: missing deployment with name=noobaa-operator
time="2021-08-10T06:04:26Z" level=info msg="checking packageserver"
time="2021-08-10T06:04:26Z" level=info msg="checking packageserver"
time="2021-08-10T06:04:27Z" level=warning msg="needs reinstall: missing deployment with name=noobaa-operator" csv=ocs-operator.v4.9.0-53.ci id=GEPhI namespace=openshift-storage phase=Failed strategy=deployment
I0810 06:04:27.383653  1 event.go:282] Event(v1.ObjectReference{Kind:"ClusterServiceVersion", Namespace:"openshift-storage", Name:"ocs-operator.v4.9.0-53.ci", UID:"178c0a12-f536-418f-b87a-fbf2e8301ac7", APIVersion:"operators.coreos.com/v1alpha1", ResourceVersion:"44758", FieldPath:""}): type: 'Normal' reason: 'NeedsReinstall' installing: missing deployment with name=noobaa-operator
time="2021-08-10T06:04:27Z" level=info msg="error updating ClusterServiceVersion status: Operation cannot be fulfilled on clusterserviceversions.operators.coreos.com \"ocs-operator.v4.9.0-53.ci\": the object has been modified; please apply your changes to the latest version and try again" csv=ocs-operator.v4.9.0-53.ci id=Nnlks namespace=openshift-storage phase=Failed
E0810 06:04:27.426255  1 queueinformer_operator.go:290] sync {"update" "openshift-storage/ocs-operator.v4.9.0-53.ci"} failed: error updating ClusterServiceVersion status: Operation cannot be fulfilled on clusterserviceversions.operators.coreos.com "ocs-operator.v4.9.0-53.ci": the object has been modified; please apply your changes to the latest version and try again

> $ oc describe csv ocs-operator.v4.9.0-53.ci

Name:         ocs-operator.v4.9.0-53.ci
Namespace:    openshift-storage
Status:
  Cleanup:
  Conditions:
    Last Transition Time:  2021-08-10T06:07:40Z
    Last Update Time:      2021-08-10T06:07:40Z
    Message:               all requirements found, attempting install
    Phase:                 InstallReady
    Reason:                AllRequirementsMet
    Last Transition Time:  2021-08-10T06:07:42Z
    Last Update Time:      2021-08-10T06:07:42Z
    Message:               install strategy failed: Deployment.apps "noobaa-operator" is invalid: [spec.template.spec.containers[0].env[5].name: Required value, spec.template.spec.containers[0].env[6].name: Required value]
Events:
  Type     Reason                  Age                  From                        Message
  ----     ------                  ----                 ----                        -------
  Normal   RequirementsUnknown     30m (x2 over 30m)    operator-lifecycle-manager  requirements not yet checked
  Normal   RequirementsNotMet      30m (x2 over 30m)    operator-lifecycle-manager  one or more requirements couldn't be found
  Normal   NeedsReinstall          29m (x5 over 29m)    operator-lifecycle-manager  installing: waiting for deployment ocs-operator to become ready: deployment "ocs-operator" not available: Deployment does not have minimum availability.
  Warning  InstallComponentFailed  29m (x8 over 29m)    operator-lifecycle-manager  install strategy failed: Deployment.apps "noobaa-operator" is invalid: [spec.template.spec.containers[0].env[5].name: Required value, spec.template.spec.containers[0].env[6].name: Required value]
  Normal   AllRequirementsMet      10m (x427 over 29m)  operator-lifecycle-manager  all requirements found, attempting install
  Normal   NeedsReinstall          5s (x690 over 29m)   operator-lifecycle-manager  installing: missing deployment with name=noobaa-operator

Job: https://ocs4-jenkins-csb-ocsqe.apps.ocp4.prod.psi.redhat.com/job/qe-deploy-ocs-cluster/5151/console
must gather logs: http://magna002.ceph.redhat.com/ocsci-jenkins/openshift-clusters/vavuthu53-fails/vavuthu53-fails_20210810T045333/logs/failed_testcase_ocs_logs_1628571967/deployment_ocs_logs/
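For reference, step 2 above can be verified from the CLI; a minimal sketch (the CSV name is the one from this build and would differ for other builds):

$ oc get csv -n openshift-storage
$ oc get csv ocs-operator.v4.9.0-53.ci -n openshift-storage -o jsonpath='{.status.phase}{"\n"}'
# Expected on a healthy install: Succeeded
# Observed in this run: the CSV cycles between InstallReady and Failed (see the events above)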
Adding NI on Danny and Liran for https://bugzilla.redhat.com/show_bug.cgi?id=1991822#c5
This is indeed a result of the CSV changes for 4.9. I'll issue a fix.
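For anyone hitting the same "env[...].name: Required value" rejection: the env list that OLM tries to apply for the noobaa-operator deployment lives in the CSV's install strategy, so it can be inspected directly. A rough sketch (it assumes the deployment entry is named noobaa-operator, as the error message suggests):

# Print the container env entries from the CSV's install strategy for the
# noobaa-operator deployment; any entry missing a "name" key is what the
# API server rejects with "Required value".
$ oc -n openshift-storage get csv ocs-operator.v4.9.0-53.ci \
    -o jsonpath='{.spec.install.spec.deployments[?(@.name=="noobaa-operator")].spec.template.spec.containers[0].env}'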
Update:
===========

> Tested with ocs-registry:4.9.0-91.ci and openshift installer (4.9.0-0.nightly-2021-08-16-154237) on the vSphere platform.
> During the verification phase, the CSV is still in the Installing phase.

$ oc get csv
NAME                        DISPLAY                       VERSION       REPLACES   PHASE
ocs-operator.v4.9.0-91.ci   OpenShift Container Storage   4.9.0-91.ci              Installing

> $ oc describe csv ocs-operator.v4.9.0-91.ci

Name:         ocs-operator.v4.9.0-91.ci
Namespace:    openshift-storage
Labels:       olm.api.1cf66995ee5bab83=provided
              olm.api.345775c37f11b6ca=provided
              olm.api.38cd97520e769cdd=provided
    Last Transition Time:  2021-08-17T13:01:51Z
    Last Update Time:      2021-08-17T13:01:51Z
    Message:               installing: waiting for deployment ocs-operator to become ready: deployment "ocs-operator" not available: Deployment does not have minimum availability.
    Phase:                 Installing
    Reason:                InstallWaiting
    Last Transition Time:  2021-08-17T13:01:51Z
    Last Update Time:      2021-08-17T13:01:51Z
    Message:               installing: waiting for deployment ocs-operator to become ready: deployment "ocs-operator" not available: Deployment does not have minimum availability.
    Phase:                 Installing
    Reason:                InstallWaiting
Events:
  Type     Reason               Age                   From                        Message
  ----     ------               ----                  ----                        -------
  Normal   RequirementsUnknown  114m (x3 over 115m)   operator-lifecycle-manager  requirements not yet checked
  Normal   RequirementsNotMet   114m (x2 over 114m)   operator-lifecycle-manager  one or more requirements couldn't be found
  Normal   InstallWaiting       114m (x2 over 114m)   operator-lifecycle-manager  installing: waiting for deployment rook-ceph-operator to become ready: deployment "rook-ceph-operator" not available: Deployment does not have minimum availability.
  Normal   InstallSucceeded     114m                  operator-lifecycle-manager  install strategy completed with no errors
  Warning  ComponentUnhealthy   113m (x2 over 113m)   operator-lifecycle-manager  installing: waiting for deployment ocs-operator to become ready: deployment "ocs-operator" not available: Deployment does not have minimum availability.
  Normal   NeedsReinstall       113m (x2 over 113m)   operator-lifecycle-manager  installing: waiting for deployment ocs-operator to become ready: deployment "ocs-operator" not available: Deployment does not have minimum availability.
  Normal   AllRequirementsMet   113m (x5 over 114m)   operator-lifecycle-manager  all requirements found, attempting install
  Normal   InstallSucceeded     113m (x4 over 114m)   operator-lifecycle-manager  waiting for install components to report healthy
  Normal   InstallWaiting       113m (x4 over 114m)   operator-lifecycle-manager  installing: waiting for deployment ocs-operator to become ready: deployment "ocs-operator" not available: Deployment does not have minimum availability.
  Warning  InstallCheckFailed   115s (x38 over 108m)  operator-lifecycle-manager  install timeout

> olm logs

$ oc logs olm-operator-657ccf864b-jrps2 -n openshift-operator-lifecycle-manager
{"level":"error","ts":1629205615.7535024,"logger":"controllers.operator","msg":"Could not update Operator status","request":"/ocs-operator.openshift-storage","error":"Operation cannot be fulfilled on operators.operators.coreos.com \"ocs-operator.openshift-storage\": the object has been modified; please apply your changes to the latest version and try again","stacktrace":"sigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).reconcileHandler\n\t/build/vendor/sigs.k8s.io/controller-runtime/pkg/internal/controller/controller.go:298\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).processNextWorkItem\n\t/build/vendor/sigs.k8s.io/controller-runtime/pkg/internal/controller/controller.go:253\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).Start.func2.2\n\t/build/vendor/sigs.k8s.io/controller-runtime/pkg/internal/controller/controller.go:214"}
time="2021-08-17T13:06:55Z" level=info msg="install strategy successful" csv=ocs-operator.v4.9.0-91.ci id=y9WC8 namespace=openshift-storage phase=Installing strategy=deployment
{"level":"error","ts":1629205615.8934119,"logger":"controllers.operator","msg":"Could not update Operator status","request":"/ocs-operator.openshift-storage","error":"Operation cannot be fulfilled on operators.operators.coreos.com \"ocs-operator.openshift-storage\": the object has been modified; please apply your changes to the latest version and try again","stacktrace":"sigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).reconcileHandler\n\t/build/vendor/sigs.k8s.io/controller-runtime/pkg/internal/controller/controller.go:298\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).processNextWorkItem\n\t/build/vendor/sigs.k8s.io/controller-runtime/pkg/internal/controller/controller.go:253\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).Start.func2.2\n\t/build/vendor/sigs.k8s.io/controller-runtime/pkg/internal/controller/controller.go:214"}

Job: https://ocs4-jenkins-csb-ocsqe.apps.ocp4.prod.psi.redhat.com/job/qe-deploy-ocs-cluster/5292/consoleFull

must gather: http://magna002.ceph.redhat.com/ocsci-jenkins/openshift-clusters/vavuthu4-ocs49/vavuthu4-ocs49_20210817T101113/logs/failed_testcase_ocs_logs_1629196536/test_deployment_ocs_logs/
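Since the blocker in this run is the ocs-operator deployment never reaching minimum availability, the next step would normally be to look at that deployment and its pods directly. A sketch (the label selector is an assumption; use whatever Selector "oc describe deploy ocs-operator" reports):

$ oc -n openshift-storage get deploy ocs-operator
$ oc -n openshift-storage describe deploy ocs-operator
# Check why the operator pods are not Ready (label selector assumed)
$ oc -n openshift-storage get pods -l name=ocs-operator
$ oc -n openshift-storage logs deploy/ocs-operator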
correcting the status
Is this still failing on Noobaa? I can't see any messages that say so, or am I missing something?

This looks more similar to https://bugzilla.redhat.com/show_bug.cgi?id=1994261
(In reply to Mudit Agarwal from comment #13)
> Is this still failing on Noobaa? I can't see any messages that say so, or
> am I missing something?
>
> This looks more similar to
> https://bugzilla.redhat.com/show_bug.cgi?id=1994261

It's not failing on Noobaa, but the CSV is still in the Installing phase.
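A quick way to confirm that from the cluster (sketch; the CSV status fields name the deployment OLM is still waiting on):

$ oc -n openshift-storage get csv ocs-operator.v4.9.0-91.ci \
    -o jsonpath='{.status.phase}{"  "}{.status.reason}{"  "}{.status.message}{"\n"}'
$ oc -n openshift-storage get deploy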
If it's not failing on Noobaa, then this is a different issue (please open one) or a duplicate of BZ #1994261.
This is not the same issue; moving it back to ON_QA. Please open a new issue.
Raised a new bug for the ocs-operator issue (https://bugzilla.redhat.com/show_bug.cgi?id=1994687), hence closing this issue.
Since the problem described in this bug report should be resolved in a recent advisory, it has been closed with a resolution of ERRATA. For information on the advisory (Moderate: Red Hat OpenShift Data Foundation 4.9.0 enhancement, security, and bug fix update), and where to find the updated files, follow the link below. If the solution does not work for you, open a new bug report. https://access.redhat.com/errata/RHSA-2021:5086