Description of problem:

After remediations are applied, the ComplianceCheckResults still report FAIL
for some rules even though the remediation configuration has been applied at
the node level.

Example:

$ oc get complianceremediation worker-scan-audit-rules-privileged-commands-at
NAME                                             STATE
worker-scan-audit-rules-privileged-commands-at   Applied

After rescan:

$ oc get compliancecheckresult worker-scan-audit-rules-privileged-commands-at
NAME                                             STATUS   SEVERITY
worker-scan-audit-rules-privileged-commands-at   FAIL     medium

Version-Release number of selected component (if applicable):
4.7.0-0.nightly-2020-12-21-131655

How reproducible:
Always

Steps to Reproduce:

1. Install the Compliance Operator.

2. Label one RHCOS worker node out of all the workers:

$ oc label node ip-10-0-151-127.us-east-2.compute.internal node-role.kubernetes.io/wscan=

3. Create a MachineConfigPool for the labeled node:

$ oc create -f - <<EOF
apiVersion: machineconfiguration.openshift.io/v1
kind: MachineConfigPool
metadata:
  name: wscan
spec:
  machineConfigSelector:
    matchExpressions:
      - {key: machineconfiguration.openshift.io/role, operator: In, values: [worker,wscan]}
  nodeSelector:
    matchLabels:
      node-role.kubernetes.io/wscan: ""
EOF

4. Create a ComplianceSuite with autoApplyRemediations enabled:
$ oc create -f - <<EOF
{
    "kind": "List",
    "apiVersion": "v1",
    "metadata": {},
    "items": [
        {
            "apiVersion": "compliance.openshift.io/v1alpha1",
            "kind": "ComplianceSuite",
            "metadata": {
                "name": "worker-compliancesuite",
                "namespace": "openshift-compliance"
            },
            "spec": {
                "autoApplyRemediations": true,
                "scans": [
                    {
                        "content": "ssg-rhcos4-ds.xml",
                        "contentImage": "quay.io/complianceascode/ocp4:latest",
                        "debug": true,
                        "name": "worker-scan",
                        "noExternalResources": false,
                        "nodeSelector": {
                            "node-role.kubernetes.io/wscan": ""
                        },
                        "profile": "xccdf_org.ssgproject.content_profile_moderate",
                        "rawResultStorage": {
                            "rotation": 0,
                            "size": ""
                        },
                        "rule": "",
                        "scanType": ""
                    }
                ],
                "schedule": "0 1 * * *"
            }
        }
    ]
}
EOF

Actual results:

102 remediations were applied, but only 73 of the remediated rules flipped to
PASS after the rescan was triggered (PASS went from 46 to 119).

Before remediation:

$ oc get compliancecheckresults.compliance.openshift.io | grep PASS | wc -l
46
$ oc get compliancecheckresults.compliance.openshift.io | grep FAIL | wc -l
194
$ oc get compliancecheckresults.compliance.openshift.io | grep SKIP | wc -l
1

After remediation:

$ oc get mc -l compliance.openshift.io/scan-name=worker-compliancesuite,machineconfiguration.openshift.io/role=wscan | wc -l
103
$ oc get complianceremediation | grep Applied | wc -l
102
$ oc get compliancecheckresults.compliance.openshift.io | grep PASS | wc -l
119
$ oc get compliancecheckresults.compliance.openshift.io | grep FAIL | wc -l
121

$ oc get mc -l compliance.openshift.io/scan-name=worker-compliancesuite,machineconfiguration.openshift.io/role=wscan | grep privileged
75-worker-scan-audit-rules-privileged-commands-at                    3.1.0   179m
75-worker-scan-audit-rules-privileged-commands-chage                 3.1.0   179m
75-worker-scan-audit-rules-privileged-commands-chsh                  3.1.0   179m
75-worker-scan-audit-rules-privileged-commands-crontab               3.1.0   179m
75-worker-scan-audit-rules-privileged-commands-gpasswd               3.1.0   179m
75-worker-scan-audit-rules-privileged-commands-mount                 3.1.0   179m
75-worker-scan-audit-rules-privileged-commands-newgidmap             3.1.0   179m
75-worker-scan-audit-rules-privileged-commands-newgrp                3.1.0   179m
75-worker-scan-audit-rules-privileged-commands-newuidmap             3.1.0   179m
75-worker-scan-audit-rules-privileged-commands-pam-timestamp-check   3.1.0   179m
75-worker-scan-audit-rules-privileged-commands-passwd                3.1.0   179m
75-worker-scan-audit-rules-privileged-commands-postdrop              3.1.0   179m
75-worker-scan-audit-rules-privileged-commands-postqueue             3.1.0   179m
75-worker-scan-audit-rules-privileged-commands-pt-chown              3.1.0   179m
75-worker-scan-audit-rules-privileged-commands-ssh-keysign           3.1.0   179m
75-worker-scan-audit-rules-privileged-commands-su                    3.1.0   179m
75-worker-scan-audit-rules-privileged-commands-sudo                  3.1.0   179m
75-worker-scan-audit-rules-privileged-commands-sudoedit              3.1.0   179m
75-worker-scan-audit-rules-privileged-commands-umount                3.1.0   179m
75-worker-scan-audit-rules-privileged-commands-unix-chkpwd           3.1.0   179m
75-worker-scan-audit-rules-privileged-commands-userhelper            3.1.0   179m
75-worker-scan-audit-rules-privileged-commands-usernetctl            3.1.0   179m
75-worker-scan-sysctl-kernel-unprivileged-bpf-disabled               3.1.0   179m

$ oc get compliancecheckresults.compliance.openshift.io | grep privileged
worker-scan-audit-rules-privileged-commands                       FAIL   medium
worker-scan-audit-rules-privileged-commands-at                    FAIL   medium
worker-scan-audit-rules-privileged-commands-chage                 FAIL   medium
worker-scan-audit-rules-privileged-commands-chsh                  FAIL   medium
worker-scan-audit-rules-privileged-commands-crontab               FAIL   medium
worker-scan-audit-rules-privileged-commands-gpasswd               FAIL   medium
worker-scan-audit-rules-privileged-commands-mount                 FAIL   medium
worker-scan-audit-rules-privileged-commands-newgidmap             FAIL   medium
worker-scan-audit-rules-privileged-commands-newgrp                FAIL   medium
worker-scan-audit-rules-privileged-commands-newuidmap             FAIL   medium
worker-scan-audit-rules-privileged-commands-pam-timestamp-check   FAIL   medium
worker-scan-audit-rules-privileged-commands-passwd                FAIL   medium
worker-scan-audit-rules-privileged-commands-postdrop              FAIL   medium
worker-scan-audit-rules-privileged-commands-postqueue             FAIL   medium
worker-scan-audit-rules-privileged-commands-pt-chown              FAIL   medium
worker-scan-audit-rules-privileged-commands-ssh-keysign           FAIL   medium
worker-scan-audit-rules-privileged-commands-su                    FAIL   medium
worker-scan-audit-rules-privileged-commands-sudo                  FAIL   medium
worker-scan-audit-rules-privileged-commands-sudoedit              FAIL   medium
worker-scan-audit-rules-privileged-commands-umount                FAIL   medium
worker-scan-audit-rules-privileged-commands-unix-chkpwd           FAIL   medium
worker-scan-audit-rules-privileged-commands-userhelper            FAIL   medium
worker-scan-audit-rules-privileged-commands-usernetctl            FAIL   medium
worker-scan-sysctl-kernel-unprivileged-bpf-disabled               PASS   medium

$ oc get mc -l compliance.openshift.io/scan-name=worker-compliancesuite,machineconfiguration.openshift.io/role=wscan | grep execution
75-worker-scan-audit-rules-execution-chcon        3.1.0   3h
75-worker-scan-audit-rules-execution-restorecon   3.1.0   3h
75-worker-scan-audit-rules-execution-semanage     3.1.0   3h
75-worker-scan-audit-rules-execution-setfiles     3.1.0   3h
75-worker-scan-audit-rules-execution-setsebool    3.1.0   3h
75-worker-scan-audit-rules-execution-seunshare    3.1.0   3h

$ oc get compliancecheckresults.compliance.openshift.io | grep execution
worker-scan-audit-rules-execution-chcon        FAIL   medium
worker-scan-audit-rules-execution-restorecon   FAIL   medium
worker-scan-audit-rules-execution-semanage     FAIL   medium
worker-scan-audit-rules-execution-setfiles     FAIL   medium
worker-scan-audit-rules-execution-setsebool    FAIL   medium
worker-scan-audit-rules-execution-seunshare    FAIL   medium

Expected results:

ComplianceCheckResults show PASS after the remediations are applied.

Additional info:

For example, the remediation for this rule has been applied and the audit rule
is present under /etc/audit/rules.d on the node, yet the rescan still reports
the rule as failed:

$ oc describe complianceremediations worker-scan-audit-rules-privileged-commands-at | tail
      Files:
        Contents:
          Source:     data:,-a%20always%2Cexit%20-F%20path%3D/usr/bin/at%20-F%20perm%3Dx%20-F%20auid%3E%3D1000%20-F%20auid%21%3Dunset%20-F%20key%3Dprivileged%0A
        Mode:         420
        Overwrite:    true
        Path:         /etc/audit/rules.d/75-usr_bin_at_execution.rules
  Outdated:
Status:
  Application State:  Applied
Events:               <none>

$ oc debug node/ip-10-0-176-58.us-east-2.compute.internal
Creating debug
namespace/openshift-debug-node-55rhc ...
Starting pod/ip-10-0-176-58us-east-2computeinternal-debug ...
To use host binaries, run `chroot /host`
Pod IP: 10.0.176.58
If you don't see a command prompt, try pressing enter.
sh-4.4# chroot /host
sh-4.4# cat /etc/audit/rules.d/75-usr_bin_at_execution.rules
-a always,exit -F path=/usr/bin/at -F perm=x -F auid>=1000 -F auid!=unset -F key=privileged

$ oc get complianceremediation worker-scan-audit-rules-privileged-commands-at
NAME                                             STATE
worker-scan-audit-rules-privileged-commands-at   Applied
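As a side note, the percent-encoded `Source` field in the remediation's
MachineConfig is a plain Ignition `data:` URL, so it can be decoded with a
short standard-library sketch to confirm it matches the file seen on the node
(the URL below is copied from the `oc describe complianceremediations` output
above):

```python
from urllib.parse import unquote

# "data:," URL from the remediation's MachineConfig Files.Contents.Source.
source = ("data:,-a%20always%2Cexit%20-F%20path%3D/usr/bin/at"
          "%20-F%20perm%3Dx%20-F%20auid%3E%3D1000"
          "%20-F%20auid%21%3Dunset%20-F%20key%3Dprivileged%0A")

# Strip the "data:," prefix and percent-decode the payload.
rule = unquote(source[len("data:,"):])
print(rule, end="")
# -a always,exit -F path=/usr/bin/at -F perm=x -F auid>=1000 -F auid!=unset -F key=privileged
```

The decoded content is byte-for-byte the audit rule found in
/etc/audit/rules.d/75-usr_bin_at_execution.rules on the node, so the
remediation itself was written correctly; the problem is in how the rescan
evaluates it.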
Could you please retest this scenario with today's builds of both the content
and the operator as built for 4.6?

I suspect that this is either fixed by
https://github.com/openshift/compliance-operator/commit/995b63f41a9d67a693139d651e3f419d9a27092f
(this is a 4.6 commit) or
https://github.com/ComplianceAsCode/content/commit/92cbab2e7b9dfa7422b601a1c15fa7be219c5045
(master, but we build even our 4.6 content images from master).

If the bug is no longer visible with the 4.6 builds, I think we can mark this
BZ as a duplicate of https://bugzilla.redhat.com/show_bug.cgi?id=1907414
Has this been re-tested as requested in comment #1? The bug that Jakub
suggested this is a duplicate of was marked VERIFIED today.
Verification passed with 4.7.0-0.nightly-2021-02-06-084550 and
compliance-operator.v0.1.26.

$ oc get ip
NAME            CSV                           APPROVAL    APPROVED
install-sb9s8   compliance-operator.v0.1.26   Automatic   true
$ oc get csv
NAME                          DISPLAY               VERSION   REPLACES   PHASE
compliance-operator.v0.1.26   Compliance Operator   0.1.26               Succeeded

1. Label the node:

$ oc label node xiyuan071-4s8kj-worker-a-tsdzj.c.openshift-qe.internal node-role.kubernetes.io/wscan=
node/xiyuan071-4s8kj-worker-a-tsdzj.c.openshift-qe.internal labeled

2. Create the MachineConfigPool:

$ oc create -f - <<EOF
apiVersion: machineconfiguration.openshift.io/v1
kind: MachineConfigPool
metadata:
  name: wscan
spec:
  machineConfigSelector:
    matchExpressions:
      - {key: machineconfiguration.openshift.io/role, operator: In, values: [worker,wscan]}
  nodeSelector:
    matchLabels:
      node-role.kubernetes.io/wscan: ""
EOF
machineconfigpool.machineconfiguration.openshift.io/wscan created

3. Create the ComplianceSuite and check the results. (A suite with
"autoApplyRemediations": true was created first, then deleted and recreated
with "autoApplyRemediations": false so that remediation could be triggered
manually in step 4.)

$ oc delete compliancesuite --all
compliancesuite.compliance.openshift.io "worker-compliancesuite" deleted
$ oc create -f - <<EOF
{
    "kind": "List",
    "apiVersion": "v1",
    "metadata": {},
    "items": [
        {
            "apiVersion": "compliance.openshift.io/v1alpha1",
            "kind": "ComplianceSuite",
            "metadata": {
                "name": "worker-compliancesuite",
                "namespace": "openshift-compliance"
            },
            "spec": {
                "autoApplyRemediations": false,
                "scans": [
                    {
                        "content": "ssg-rhcos4-ds.xml",
                        "contentImage": "quay.io/complianceascode/ocp4:latest",
                        "debug": true,
                        "name": "worker-scan",
                        "noExternalResources": false,
                        "nodeSelector": {
                            "node-role.kubernetes.io/wscan": ""
                        },
                        "profile": "xccdf_org.ssgproject.content_profile_moderate",
                        "rawResultStorage": {
                            "rotation": 0,
                            "size": ""
                        },
                        "rule": "",
                        "scanType": ""
                    }
                ],
                "schedule": "0 1 * * *"
            }
        }
    ]
}
EOF
compliancesuite.compliance.openshift.io/worker-compliancesuite created

$ oc get compliancesuite
NAME                     PHASE   RESULT
worker-compliancesuite   DONE    NON-COMPLIANT
$ oc get compliancecheckresults.compliance.openshift.io | grep PASS | wc -l
46
$ oc get compliancecheckresults.compliance.openshift.io | grep FAIL | wc -l
195
$ oc get compliancecheckresults.compliance.openshift.io | grep SKIP | wc -l
0
$ oc get compliancecheckresults.compliance.openshift.io | grep INFO | wc -l
4

4. Trigger remediation and check how many rules were auto-remediated:

$ oc annotate compliancesuites/worker-compliancesuite compliance.openshift.io/apply-remediations=
compliancesuite.compliance.openshift.io/worker-compliancesuite annotated
$ oc get mc | grep 75-worker-scan | wc -l
102

5. Trigger a rescan and check the result:

$ oc annotate compliancescan/worker-scan compliance.openshift.io/rescan=
compliancescan.compliance.openshift.io/worker-scan annotated
$ oc get compliancesuite
NAME                     PHASE   RESULT
worker-compliancesuite   DONE    NON-COMPLIANT
$ oc get compliancecheckresults.compliance.openshift.io | grep PASS | wc -l
147
$ oc get compliancecheckresults.compliance.openshift.io | grep FAIL | wc -l
94
$ oc get compliancecheckresults.compliance.openshift.io | grep SKIP | wc -l
0
$ oc get compliancecheckresults.compliance.openshift.io | grep INFO | wc -l
4

So the PASS count increased by 101, the FAIL count decreased by 101, and one
INFO-level rule had its remediation applied but remains at INFO level after
the auto-remediation was applied:

$ oc get compliancecheckresults.compliance.openshift.io | grep INFO
worker-scan-bios-disable-usb-boot             INFO   unknown
worker-scan-coreos-vsyscall-kernel-argument   INFO   medium
worker-scan-sshd-limit-user-access            INFO   unknown
worker-scan-wireless-disable-in-bios          INFO   unknown
$ oc get mc | grep vsyscall-kernel-argument
75-worker-scan-coreos-vsyscall-kernel-argument   3.1.0   20m
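For the record, the delta arithmetic above can be double-checked with a short
Python sketch (the counts are copied from the command output in steps 3 and 5;
the per-status totals before and after the rescan should be conserved):

```python
# ComplianceCheckResult counts before remediation (step 3) and after
# the remediations were applied and the scan re-run (step 5).
before = {"PASS": 46, "FAIL": 195, "SKIP": 0, "INFO": 4}
after = {"PASS": 147, "FAIL": 94, "SKIP": 0, "INFO": 4}

# Per-status deltas: PASS rises by the number of effective remediations
# and FAIL falls by the same amount.
delta = {k: after[k] - before[k] for k in before}
print(delta)  # {'PASS': 101, 'FAIL': -101, 'SKIP': 0, 'INFO': 0}

# The total number of checks is the same across the rescan.
print(sum(before.values()) == sum(after.values()))  # True
```

This matches the statement above: 101 checks moved from FAIL to PASS, and the
INFO-level results stayed INFO.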
Since the problem described in this bug report should be resolved in a recent
advisory, it has been closed with a resolution of ERRATA.

For information on the advisory (OpenShift Container Platform 4.7
compliance-operator image update), and where to find the updated files, follow
the link below. If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHBA-2021:0435