Legitimate bug. At first glance, we just need to schedule the scan pods on nodes where os=linux. That needs testing, though; I'm trying to get access to a cluster with Windows machines. Since all the nodes appear to be in the same pool, one question is how remediations would work, since they are certainly Linux-specific.
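For reference, constraining a pod to Linux nodes is just a nodeSelector on the standard `kubernetes.io/os` node label. A minimal sketch — the pod name, namespace, and image below are placeholders, not the operator's actual objects:

```yaml
# Sketch only: illustrates the nodeSelector mechanism, not the
# operator's real scan pod template. The well-known kubernetes.io/os
# label makes the scheduler consider only Linux nodes.
apiVersion: v1
kind: Pod
metadata:
  name: example-scan-pod        # placeholder name
  namespace: openshift-compliance
spec:
  nodeSelector:
    kubernetes.io/os: linux     # standard well-known node label
  containers:
  - name: scanner
    image: example.io/scanner:latest   # placeholder image
```

With this selector in place, Windows nodes are simply never candidates for scheduling, so no scan pod gets created (and stuck) on them.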
[Bug_verification] Looks good. The node scans are now scheduled only on Linux nodes, which prevents scan pod creation on the Windows nodes, and the ComplianceScan completed successfully.

Verified on: 4.9.26-x86_64 + compliance-operator.v0.1.49
Cluster Profile: IPI on vSphere 7.0 & OVN & WindowsContainer

# oc get clusterversion
NAME      VERSION   AVAILABLE   PROGRESSING   SINCE   STATUS
version   4.9.26    True        False         124m    Cluster version is 4.9.26

# oc project openshift-compliance
Now using project "openshift-compliance" on server "https://api.winc-pdhamdhe29.qe.devcluster.openshift.com:6443".

# oc get csv
NAME                              DISPLAY                            VERSION    REPLACES   PHASE
compliance-operator.v0.1.49       Compliance Operator                0.1.49                Succeeded
elasticsearch-operator.5.3.6-25   OpenShift Elasticsearch Operator   5.3.6-25              Succeeded

# oc get pods
NAME                                              READY   STATUS    RESTARTS        AGE
compliance-operator-75c6c56599-2j6hr              1/1     Running   1 (5m38s ago)   6m19s
ocp4-openshift-compliance-pp-56dd949976-pbkcf     1/1     Running   0               5m2s
rhcos4-openshift-compliance-pp-7595d55cfb-68lzl   1/1     Running   0               5m2s

# oc get nodes
NAME                                 STATUS   ROLES    AGE    VERSION
winc-pdhamdhe29-9t9x6-master-0       Ready    master   131m   v1.22.5+5c84e52
winc-pdhamdhe29-9t9x6-master-1       Ready    master   131m   v1.22.5+5c84e52
winc-pdhamdhe29-9t9x6-master-2       Ready    master   131m   v1.22.5+5c84e52
winc-pdhamdhe29-9t9x6-worker-mxjr2   Ready    worker   121m   v1.22.5+5c84e52
winc-pdhamdhe29-9t9x6-worker-v8rlk   Ready    worker   122m   v1.22.5+5c84e52
winworker-4zssl                      Ready    worker   94m    v1.22.1-1739+c8538fcbd98efa
winworker-wm7sc                      Ready    worker   89m    v1.22.1-1739+c8538fcbd98efa

# oc get nodes -l beta.kubernetes.io/os=windows
NAME              STATUS   ROLES    AGE   VERSION
winworker-4zssl   Ready    worker   95m   v1.22.1-1739+c8538fcbd98efa
winworker-wm7sc   Ready    worker   90m   v1.22.1-1739+c8538fcbd98efa

# oc create -f - << EOF
> apiVersion: compliance.openshift.io/v1alpha1
> kind: ScanSettingBinding
> metadata:
>   name: my-ssb-r
> profiles:
> - name: ocp4-cis
>   kind: Profile
>   apiGroup: compliance.openshift.io/v1alpha1
> - name: ocp4-cis-node
>   kind: Profile
>   apiGroup: compliance.openshift.io/v1alpha1
> settingsRef:
>   name: default
>   kind: ScanSetting
>   apiGroup: compliance.openshift.io/v1alpha1
> EOF
scansettingbinding.compliance.openshift.io/my-ssb-r created

# oc get pods -w
NAME                                                          READY   STATUS      RESTARTS      AGE
compliance-operator-75c6c56599-2j6hr                          1/1     Running     1 (11m ago)   12m
ocp4-cis-api-checks-pod                                       2/2     Running     0             28s
ocp4-cis-node-master-rs-7b98d5f8b7-x4g8l                      1/1     Running     0             29s
ocp4-cis-node-master-winc-pdhamdhe29-9t9x6-master-0-pod       1/2     NotReady    0             29s
ocp4-cis-node-master-winc-pdhamdhe29-9t9x6-master-1-pod       2/2     Running     0             29s
ocp4-cis-node-master-winc-pdhamdhe29-9t9x6-master-2-pod       2/2     Running     0             29s
ocp4-cis-node-worker-rs-67dc795855-6djq9                      1/1     Running     0             25s
ocp4-cis-node-worker-winc-pdhamdhe29-9t9x6-worker-mxjr2-pod   2/2     Running     0             25s
ocp4-cis-node-worker-winc-pdhamdhe29-9t9x6-worker-v8rlk-pod   2/2     Running     0             25s
ocp4-cis-rs-cbf64b487-f6n8m                                   1/1     Running     0             28s
ocp4-openshift-compliance-pp-56dd949976-pbkcf                 1/1     Running     0             10m
rhcos4-openshift-compliance-pp-7595d55cfb-68lzl               1/1     Running     0             10m
ocp4-cis-node-worker-winc-pdhamdhe29-9t9x6-worker-mxjr2-pod   1/2     NotReady    0             27s
ocp4-cis-api-checks-pod                                       1/2     NotReady    0             35s
ocp4-cis-node-worker-winc-pdhamdhe29-9t9x6-worker-v8rlk-pod   1/2     NotReady    0             32s
ocp4-cis-node-worker-winc-pdhamdhe29-9t9x6-worker-mxjr2-pod   0/2     Completed   0             33s
ocp4-cis-node-master-winc-pdhamdhe29-9t9x6-master-1-pod       1/2     NotReady    0             37s
ocp4-cis-node-master-winc-pdhamdhe29-9t9x6-master-0-pod       0/2     Completed   0             37s
ocp4-cis-node-master-winc-pdhamdhe29-9t9x6-master-2-pod       1/2     NotReady    0             37s
ocp4-cis-node-worker-winc-pdhamdhe29-9t9x6-worker-v8rlk-pod   0/2     Completed   0             38s

# oc get pods -w
NAME                                                          READY   STATUS            RESTARTS      AGE
aggregator-pod-ocp4-cis-node-worker                           0/1     Init:0/1          0             1s
compliance-operator-75c6c56599-2j6hr                          1/1     Running           1 (11m ago)   12m
ocp4-cis-api-checks-pod                                       0/2     Completed         0             44s
ocp4-cis-node-master-rs-7b98d5f8b7-x4g8l                      1/1     Running           0             45s
ocp4-cis-node-master-winc-pdhamdhe29-9t9x6-master-0-pod       0/2     Completed         0             45s
ocp4-cis-node-master-winc-pdhamdhe29-9t9x6-master-1-pod       1/2     NotReady          0             45s
ocp4-cis-node-master-winc-pdhamdhe29-9t9x6-master-2-pod       1/2     NotReady          0             45s
ocp4-cis-node-worker-rs-67dc795855-6djq9                      1/1     Running           0             41s
ocp4-cis-node-worker-winc-pdhamdhe29-9t9x6-worker-mxjr2-pod   0/2     Completed         0             41s
ocp4-cis-node-worker-winc-pdhamdhe29-9t9x6-worker-v8rlk-pod   0/2     Completed         0             41s
ocp4-cis-rs-cbf64b487-f6n8m                                   1/1     Running           0             44s
ocp4-openshift-compliance-pp-56dd949976-pbkcf                 1/1     Running           0             11m
rhcos4-openshift-compliance-pp-7595d55cfb-68lzl               1/1     Running           0             11m
ocp4-cis-node-master-winc-pdhamdhe29-9t9x6-master-1-pod       0/2     Completed         0             46s
ocp4-cis-node-master-winc-pdhamdhe29-9t9x6-master-2-pod       0/2     Completed         0             46s
aggregator-pod-ocp4-cis-node-worker                           0/1     Init:0/1          0             3s
aggregator-pod-ocp4-cis-node-worker                           0/1     PodInitializing   0             5s
aggregator-pod-ocp4-cis-node-worker                           1/1     Running           0             6s
aggregator-pod-ocp4-cis-node-master                           0/1     Pending           0             0s
aggregator-pod-ocp4-cis-node-master                           0/1     Pending           0             0s
aggregator-pod-ocp4-cis-node-master                           0/1     Pending           0             0s
aggregator-pod-ocp4-cis-node-master                           0/1     Init:0/1          0             0s
aggregator-pod-ocp4-cis                                       0/1     Pending           0             0s
aggregator-pod-ocp4-cis                                       0/1     Pending           0             0s
aggregator-pod-ocp4-cis                                       0/1     Pending           0             0s
aggregator-pod-ocp4-cis                                       0/1     Init:0/1          0             0s
aggregator-pod-ocp4-cis                                       0/1     Init:0/1          0             2s
aggregator-pod-ocp4-cis-node-master                           0/1     Init:0/1          0             2s

# oc get pods -w
NAME                                                          READY   STATUS        RESTARTS      AGE
aggregator-pod-ocp4-cis                                       1/1     Running       0             5s
aggregator-pod-ocp4-cis-node-master                           1/1     Running       0             5s
aggregator-pod-ocp4-cis-node-worker                           0/1     Completed     0             15s
compliance-operator-75c6c56599-2j6hr                          1/1     Running       1 (11m ago)   12m
ocp4-cis-api-checks-pod                                       0/2     Completed     0             58s
ocp4-cis-node-master-rs-7b98d5f8b7-x4g8l                      1/1     Running       0             59s
ocp4-cis-node-master-winc-pdhamdhe29-9t9x6-master-0-pod       0/2     Completed     0             59s
ocp4-cis-node-master-winc-pdhamdhe29-9t9x6-master-1-pod       0/2     Completed     0             59s
ocp4-cis-node-master-winc-pdhamdhe29-9t9x6-master-2-pod       0/2     Completed     0             59s
ocp4-cis-node-worker-rs-67dc795855-6djq9                      1/1     Running       0             55s
ocp4-cis-node-worker-winc-pdhamdhe29-9t9x6-worker-mxjr2-pod   0/2     Completed     0             55s
ocp4-cis-node-worker-winc-pdhamdhe29-9t9x6-worker-v8rlk-pod   0/2     Completed     0             55s
ocp4-cis-rs-cbf64b487-f6n8m                                   1/1     Running       0             58s
ocp4-openshift-compliance-pp-56dd949976-pbkcf                 1/1     Running       0             11m
rhcos4-openshift-compliance-pp-7595d55cfb-68lzl               1/1     Running       0             11m
ocp4-cis-node-worker-rs-67dc795855-6djq9                      1/1     Terminating   0             56s
ocp4-cis-node-worker-rs-67dc795855-6djq9                      0/1     Terminating   0             57s
ocp4-cis-node-worker-rs-67dc795855-6djq9                      0/1     Terminating   0             57s
ocp4-cis-node-worker-rs-67dc795855-6djq9                      0/1     Terminating   0             57s

# oc get suite
NAME       PHASE   RESULT
my-ssb-r   DONE    NON-COMPLIANT

# oc get scan
NAME                   PHASE   RESULT
ocp4-cis               DONE    NON-COMPLIANT
ocp4-cis-node-master   DONE    NON-COMPLIANT
ocp4-cis-node-worker   DONE    NON-COMPLIANT

# oc get pods
NAME                                                          READY   STATUS      RESTARTS      AGE
aggregator-pod-ocp4-cis                                       0/1     Completed   0             42s
aggregator-pod-ocp4-cis-node-master                           0/1     Completed   0             42s
aggregator-pod-ocp4-cis-node-worker                           0/1     Completed   0             52s
compliance-operator-75c6c56599-2j6hr                          1/1     Running     1 (12m ago)   13m
ocp4-cis-api-checks-pod                                       0/2     Completed   0             95s
ocp4-cis-node-master-winc-pdhamdhe29-9t9x6-master-0-pod       0/2     Completed   0             96s
ocp4-cis-node-master-winc-pdhamdhe29-9t9x6-master-1-pod       0/2     Completed   0             96s
ocp4-cis-node-master-winc-pdhamdhe29-9t9x6-master-2-pod       0/2     Completed   0             96s
ocp4-cis-node-worker-winc-pdhamdhe29-9t9x6-worker-mxjr2-pod   0/2     Completed   0             92s
ocp4-cis-node-worker-winc-pdhamdhe29-9t9x6-worker-v8rlk-pod   0/2     Completed   0             92s
ocp4-openshift-compliance-pp-56dd949976-pbkcf                 1/1     Running     0             11m
rhcos4-openshift-compliance-pp-7595d55cfb-68lzl               1/1     Running     0             11m

# oc get nodes
NAME                                 STATUS   ROLES    AGE    VERSION
winc-pdhamdhe29-9t9x6-master-0       Ready    master   138m   v1.22.5+5c84e52
winc-pdhamdhe29-9t9x6-master-1       Ready    master   138m   v1.22.5+5c84e52
winc-pdhamdhe29-9t9x6-master-2       Ready    master   138m   v1.22.5+5c84e52
winc-pdhamdhe29-9t9x6-worker-mxjr2   Ready    worker   128m   v1.22.5+5c84e52
winc-pdhamdhe29-9t9x6-worker-v8rlk   Ready    worker   128m   v1.22.5+5c84e52
winworker-4zssl                      Ready    worker   101m   v1.22.1-1739+c8538fcbd98efa
winworker-wm7sc                      Ready    worker   96m    v1.22.1-1739+c8538fcbd98efa

# oc get nodes -l beta.kubernetes.io/os=windows
NAME              STATUS   ROLES    AGE    VERSION
winworker-4zssl   Ready    worker   101m   v1.22.1-1739+c8538fcbd98efa
winworker-wm7sc   Ready    worker   96m    v1.22.1-1739+c8538fcbd98efa

# oc describe suite my-ssb-r |tail -40
Status:
  Conditions:
    Last Transition Time:  2022-03-29T06:41:13Z
    Message:               Compliance suite run is done running the scans
    Reason:                NotRunning
    Status:                False
    Type:                  Processing
    Last Transition Time:  2022-03-29T06:41:13Z
    Message:               Compliance suite run is done and has results
    Reason:                Done
    Status:                True
    Type:                  Ready
  Phase:   DONE
  Result:  NON-COMPLIANT
  Scan Statuses:
    Name:    ocp4-cis
    Phase:   DONE
    Result:  NON-COMPLIANT
    Results Storage:
      Name:       ocp4-cis
      Namespace:  openshift-compliance
    Warnings:  could not fetch /apis/flowcontrol.apiserver.k8s.io/v1alpha1/flowschemas/catch-all: the server could not find the requested resource
could not fetch /apis/logging.openshift.io/v1/namespaces/openshift-logging/clusterlogforwarders/instance: clusterlogforwarders.logging.openshift.io "instance" not found
could not fetch /apis/apps/v1/namespaces/openshift-sdn/daemonsets/sdn: daemonsets.apps "sdn" not found
    Name:    ocp4-cis-node-master
    Phase:   DONE
    Result:  NON-COMPLIANT
    Results Storage:
      Name:       ocp4-cis-node-master
      Namespace:  openshift-compliance
    Name:    ocp4-cis-node-worker
    Phase:   DONE
    Result:  NON-COMPLIANT
    Results Storage:
      Name:       ocp4-cis-node-worker
      Namespace:  openshift-compliance
Events:
  Type    Reason           Age                    From       Message
  ----    ------           ----                   ----       -------
  Normal  ResultAvailable  3m24s (x6 over 3m25s)  suitectrl  The result is: NON-COMPLIANT
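As an extra check on pod placement, the scheduling can also be confirmed directly from the pods themselves. A sketch (requires the live cluster, so no output is shown here; the pod name is taken from the transcript above, and the claim that scan pods carry a Linux node selector is an assumption to be verified, not a documented guarantee):

```shell
# Sketch: print the nodeSelector of one scan pod; expect it to
# restrict scheduling to Linux nodes (e.g. kubernetes.io/os=linux).
oc -n openshift-compliance get pod ocp4-cis-api-checks-pod \
  -o jsonpath='{.spec.nodeSelector}{"\n"}'

# The NODE column should list only the Linux masters/workers,
# never winworker-4zssl or winworker-wm7sc.
oc -n openshift-compliance get pods -o wide
```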
Since the problem described in this bug report should be resolved in a recent advisory, it has been closed with a resolution of ERRATA. For information on the advisory (OpenShift Compliance Operator bug fix and enhancement update), and where to find the updated files, follow the link below. If the solution does not work for you, open a new bug report. https://access.redhat.com/errata/RHBA-2022:1148