Bug 2176738
| Summary: | All ODF pods show in PodSecurityViolation alerts | | |
|---|---|---|---|
| Product: | [Red Hat Storage] Red Hat OpenShift Data Foundation | Reporter: | James Biao <jbiao> |
| Component: | odf-operator | Assignee: | Nitin Goyal <nigoyal> |
| Status: | CLOSED NOTABUG | QA Contact: | Elad <ebenahar> |
| Severity: | unspecified | Docs Contact: | |
| Priority: | unspecified | ||
| Version: | 4.11 | CC: | hnallurv, jansingh, muagarwa, odf-bz-bot |
| Target Milestone: | --- | ||
| Target Release: | --- | ||
| Hardware: | Unspecified | ||
| OS: | Unspecified | ||
| Whiteboard: | |||
| Fixed In Version: | | Doc Type: | If docs needed, set a value |
| Doc Text: | | Story Points: | --- |
| Clone Of: | | Environment: | |
| Last Closed: | 2023-08-08 07:34:43 UTC | Type: | Bug |
| Regression: | --- | Mount Type: | --- |
| Documentation: | --- | CRM: | |
| Verified Versions: | | Category: | --- |
| oVirt Team: | --- | RHEL 7.3 requirements from Atomic Host: | |
| Cloudforms Team: | --- | Target Upstream Version: | |
| Embargoed: | |||
Closing this as it was a customer bug and the case is closed too.
Description of problem (please be detailed as possible and provide log snippets):

Viewing alerts in OpenShift shows all ODF pods in alert for PodSecurityViolation.

```
oc adm must-gather -- /usr/bin/gather_audit_logs

zgrep -h pod-security.kubernetes.io/audit-violations \
  must-gather.local.2089141812275895646/quay*/audit_logs/kube-apiserver/*log.gz \
  | jq -r 'select((.annotations["pod-security.kubernetes.io/audit-violations"] != null)
      and (.objectRef.resource=="pods"))
      | .objectRef.namespace + " " + .objectRef.name + " " + .objectRef.resource' \
  | sort | uniq -c
```

The output from this was:

```
      1 openshift-must-gather-nqd7q pods
      1 openshift-storage noobaa-db-pg-0 pods
     26 openshift-storage pods
```

The customer only has ODF pods running under the openshift-storage namespace.

Version of all relevant components (if applicable):
ODF 4.11
OCP 4.11

Does this issue impact your ability to continue to work with the product (please explain in detail what is the user impact)?
The customer will not be able to safely upgrade to 4.12. This also seems like a potential security risk.

Is there any workaround available to the best of your knowledge?
Not sure.

Rate from 1 - 5 the complexity of the scenario you performed that caused this bug (1 - very simple, 5 - very complex)?

Can this issue be reproduced?
Always, in the customer environment.

Can this issue be reproduced from the UI?

If this is a regression, please provide more details to justify this:

Steps to Reproduce:
1. Deploy OCP 4.11
2. Install ODF 4.11
3. Check PodSecurityViolation alerts

Actual results:
ODF pods have PodSecurityViolation alerts.

Expected results:
ODF pods are not in PodSecurityViolation alerts.

Additional info:
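For reference, the zgrep/jq pipeline above can be expressed as a short Python sketch. This assumes, as the jq filter does, that each audit log line is a JSON event with `annotations` and `objectRef` fields; the sample events below are hypothetical stand-ins for real kube-apiserver audit entries, not taken from the customer's must-gather.

```python
import json
from collections import Counter

def count_pod_violations(lines):
    """Count pod audit events that carry a pod-security
    audit-violations annotation, grouped by
    "namespace name resource", like the jq | sort | uniq -c pipeline."""
    counts = Counter()
    for line in lines:
        event = json.loads(line)
        ann = event.get("annotations") or {}
        ref = event.get("objectRef") or {}
        # Mirror the jq select(): annotation present AND resource == "pods"
        if (ann.get("pod-security.kubernetes.io/audit-violations") is not None
                and ref.get("resource") == "pods"):
            key = " ".join([ref.get("namespace", ""),
                            ref.get("name", ""),
                            ref.get("resource", "")])
            counts[key] += 1
    return counts

# Hypothetical sample events mimicking audit log lines
sample = [
    json.dumps({
        "annotations": {
            "pod-security.kubernetes.io/audit-violations": "would violate ..."
        },
        "objectRef": {"namespace": "openshift-storage",
                      "name": "noobaa-db-pg-0", "resource": "pods"},
    }),
    json.dumps({
        "annotations": {},
        "objectRef": {"namespace": "default",
                      "name": "ok-pod", "resource": "pods"},
    }),
]
print(count_pod_violations(sample))
```

The second sample event has no violation annotation, so only the first is counted.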