Bug 2124379
| Summary: | ODF4.12 Installation, ocs-operator.v4.12.0 and mcg-operator.v4.12.0 failed | | |
|---|---|---|---|
| Product: | [Red Hat Storage] Red Hat OpenShift Data Foundation | Reporter: | Oded <oviner> |
| Component: | ocs-operator | Assignee: | umanga <uchapaga> |
| Status: | CLOSED CURRENTRELEASE | QA Contact: | Oded <oviner> |
| Severity: | urgent | Docs Contact: | |
| Priority: | unspecified | | |
| Version: | 4.12 | CC: | mparida, muagarwa, nberry, ocs-bugs, odf-bz-bot, sostapov, tnielsen, uchapaga, vavuthu |
| Target Milestone: | --- | | |
| Target Release: | ODF 4.12.0 | | |
| Hardware: | Unspecified | | |
| OS: | Unspecified | | |
| Whiteboard: | | | |
| Fixed In Version: | | Doc Type: | No Doc Update |
| Doc Text: | | Story Points: | --- |
| Clone Of: | | Environment: | |
| Cloned To: | 2124591, 2124593 (view as bug list) | | |
| Last Closed: | 2023-02-08 14:06:28 UTC | Type: | Bug |
| Regression: | --- | Mount Type: | --- |
| Documentation: | --- | CRM: | |
| Verified Versions: | | Category: | --- |
| oVirt Team: | --- | RHEL 7.3 requirements from Atomic Host: | |
| Cloudforms Team: | --- | Target Upstream Version: | |
| Embargoed: | | | |
| Bug Depends On: | | | |
| Bug Blocks: | 2124591, 2124593 | | |
|
Description
Oded, 2022-09-05 22:26:20 UTC
Keeping this BZ for ocs-metrics-exporter; two BZs have been cloned, one for rook and another for noobaa.

https://github.com/red-hat-storage/ocs-operator/pull/1813 removes privileged access from ocs-metrics-exporter and should fix these SCC errors. Any of the latest 4.12 builds can be used to test.

Bug fixed. PR validation job passed: https://github.com/red-hat-storage/ocs-ci/pull/6573/files
OCP Version: 4.12.0-0.nightly-2022-10-18-192348
ODF Version: 4.12.0-77
Provider: VMware

ODF 4.12 installation failed on AWS_UPI_RHEL without the workaround:

```
$ kubectl label --overwrite ns openshift-storage \
    pod-security.kubernetes.io/enforce=privileged \
    pod-security.kubernetes.io/warn=baseline \
    pod-security.kubernetes.io/audit=baseline
```

Setup:
OCP Version: 4.12
ODF Version: 4.12
Provider: AWS_RHEL_UPI
OCP MG: http://magna002.ceph.redhat.com/ocsci-jenkins/openshift-clusters/jnk-pr6573b3719/jnk-pr6573b3719_20221024T112000/logs/failed_testcase_ocs_logs_1666610685/deployment_ocs_logs/
Jenkins Job: https://ocs4-jenkins-csb-odf-qe.apps.ocp-c1.prod.psi.redhat.com/job/qe-trigger-test-pr/3719/testReport/tests.ecosystem.deployment/test_deployment/test_deployment/

This is now fixed in OLM, please try with the latest build.

Bug reproduced on the latest version:
OCP Version: 4.12
ODF Version: 4.12
Provider: AWS_UPI
Jenkins Job: https://ocs4-jenkins-csb-odf-qe.apps.ocp-c1.prod.psi.redhat.com/job/qe-trigger-test-pr/3733/testReport/tests.ecosystem.deployment/test_deployment/test_deployment/

Deployment failed on setup with:

```
ocs_ci.ocs.exceptions.CommandFailed: Error during execution of command: oc -n default create -f /tmp/POD_43ysp0qc -o yaml.
Error is Error from server (Forbidden): error when creating "/tmp/POD_43ysp0qc": pods "rhel-ansible" is forbidden: violates PodSecurity "restricted:latest": host namespaces (hostNetwork=true, hostPID=true), privileged (container "rhel" must not set securityContext.privileged=true), allowPrivilegeEscalation != false (container "rhel" must set securityContext.allowPrivilegeEscalation=false), unrestricted capabilities (container "rhel" must set securityContext.capabilities.drop=["ALL"]), runAsNonRoot != true (pod or container "rhel" must set securityContext.runAsNonRoot=true), runAsUser=0 (container "rhel" must not set runAsUser=0), seccompProfile (pod or container "rhel" must set securityContext.seccompProfile.type to "RuntimeDefault" or "Localhost")
```

Hi Oded, I see you are trying to create a pod in the default namespace, and that is the cause of the error. With the latest OLM changes, only namespaces prefixed with openshift- are labelled automatically; OLM does not touch any other namespace, and the namespace in question here is not an openshift-* namespace, hence the error. I am not sure how installation works across the different methods, but it seems that if we want to use the default namespace, we have to label it beforehand (see the sketch below).
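For illustration, a minimal sketch of that pre-labelling, assuming the same privileged/baseline label values used in the openshift-storage workaround above are also what the pod in the default namespace needs (adjust if not):

```
# Sketch only: pre-label the default namespace the same way the
# openshift-storage workaround above does, so privileged pods such
# as rhel-ansible are admitted there. Label values are copied from
# the workaround; they are an assumption, not a verified fix.
$ kubectl label --overwrite ns default \
    pod-security.kubernetes.io/enforce=privileged \
    pod-security.kubernetes.io/warn=baseline \
    pod-security.kubernetes.io/audit=baseline
```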
Bug fixed. The rhel-ansible pod is part of the OCS-CI infrastructure.
PR validation job passed on AWS_UPI: https://ocs4-jenkins-csb-odf-qe.apps.ocp-c1.prod.psi.redhat.com/job/qe-trigger-test-pr/3744/
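As a side note, the PodSecurity error quoted above enumerates exactly what the restricted:latest profile demands. A minimal, hypothetical pod that would pass that profile might look like the sketch below; the pod name and image are illustrative assumptions, and rhel-ansible itself cannot be rewritten this way since it genuinely needs hostNetwork/hostPID and root, which is why labelling the namespace was the fix.

```
# Hypothetical sketch of a pod satisfying PodSecurity "restricted:latest",
# assembled from the violation list in the error above. Not the rhel-ansible
# pod: that pod requires hostNetwork/hostPID and root, so it cannot comply.
$ kubectl apply -f - <<'EOF'
apiVersion: v1
kind: Pod
metadata:
  name: restricted-example            # illustrative name
  namespace: default
spec:
  securityContext:
    runAsNonRoot: true                # addresses: runAsNonRoot != true
    runAsUser: 1000                   # addresses: runAsUser=0
    seccompProfile:
      type: RuntimeDefault            # addresses: seccompProfile unset
  containers:
  - name: example
    image: registry.access.redhat.com/ubi8/ubi-minimal   # illustrative image
    command: ["sleep", "infinity"]
    securityContext:
      allowPrivilegeEscalation: false # addresses: allowPrivilegeEscalation != false
      capabilities:
        drop: ["ALL"]                 # addresses: unrestricted capabilities
EOF
```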