Bug 1907414 - [OCP v46] Not all remediations get applied through machineConfig although the status of all rules shows Applied in ComplianceRemediations object
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: OpenShift Container Platform
Classification: Red Hat
Component: Compliance Operator
Version: 4.6.z
Hardware: Unspecified
OS: Unspecified
Priority: unspecified
Severity: medium
Target Milestone: ---
Target Release: 4.6.z
Assignee: Jakub Hrozek
QA Contact: Prashant Dhamdhere
URL:
Whiteboard:
Depends On: 1907410
Blocks:
 
Reported: 2020-12-14 13:25 UTC by Prashant Dhamdhere
Modified: 2021-01-19 13:54 UTC (History)
CC: 5 users

Fixed In Version:
Doc Type: If docs needed, set a value
Doc Text:
Clone Of: 1907410
Environment:
Last Closed: 2021-01-19 13:53:52 UTC
Target Upstream Version:
Embargoed:




Links
System ID Private Priority Status Summary Last Updated
Red Hat Product Errata RHSA-2021:0190 0 None None None 2021-01-19 13:54:18 UTC

Description Prashant Dhamdhere 2020-12-14 13:25:22 UTC
+++ This bug was initially created as a clone of Bug #1907410 +++

Description of problem:

Not all remediations get applied through the MachineConfig, although the status of all rules shows Applied
in the ComplianceRemediations objects.

    $ oc get pods
     
    NAME                                                        READY   STATUS      RESTARTS   AGE
    aggregator-pod-worker-scan                                  0/1     Completed   0          2m33s
    compliance-operator-8d6f976cf-9zlrm                         1/1     Running     0          145m
    ocp4-openshift-compliance-pp-7cd9f6b64f-wrr7k               1/1     Running     0          15m
    rhcos4-openshift-compliance-pp-999fd896f-j7d4p              1/1     Running     0          15m
    worker-scan-ip-10-0-57-244.us-east-2.compute.internal-pod   0/2     Completed   0          6m14s
    worker-scan-ip-10-0-72-126.us-east-2.compute.internal-pod   0/2     Completed   0          6m13s


    $ cat /tmp/e2e-test-compliance-zsiwoivl-db68w-uqpxgui1isc-config.json
     
    {
        "kind": "List",
        "apiVersion": "v1",
        "metadata": {},
        "items": [
            {
                "apiVersion": "compliance.openshift.io/v1alpha1",
                "kind": "ComplianceSuite",
                "metadata": {
                    "name": "worker-compliancesuite",
                    "namespace": "openshift-compliance"
                },
                "spec": {
                    "autoApplyRemediations": true,
                    "scans": [
                        {
                            "content": "ssg-rhcos4-ds.xml",
                            "contentImage": "quay.io/complianceascode/ocp4:latest",
                            "debug": true,
                            "name": "worker-scan",
                            "noExternalResources": false,
                            "nodeSelector": {
                                "node-role.kubernetes.io/wscan": ""
                            },
                            "profile": "xccdf_org.ssgproject.content_profile_moderate",
                            "rawResultStorage": {
                                "rotation": 0,
                                "size": ""
                            },
                            "rule": "",
                            "scanType": ""
                        }
                    ],
                    "schedule": "0 1 * * *"
                }
            }
        ]
    }


    $ oc get compliancesuite

    NAME                     PHASE   RESULT
    worker-compliancesuite   DONE    NON-COMPLIANT

    $ oc get compliancesuite worker-compliancesuite -o yaml
     
    apiVersion: compliance.openshift.io/v1alpha1
    kind: ComplianceSuite
    metadata:
      creationTimestamp: "2020-12-09T06:42:02Z"
      finalizers:
      - suite.finalizers.compliance.openshift.io
      generation: 2
      managedFields:
      - apiVersion: compliance.openshift.io/v1alpha1
        fieldsType: FieldsV1
        fieldsV1:
          f:spec:
            .: {}
            f:autoApplyRemediations: {}
            f:schedule: {}
       ....
        manager: compliance-operator
        operation: Update
        time: "2020-12-09T06:46:31Z"
      name: worker-compliancesuite
      namespace: openshift-compliance
      resourceVersion: "115397"
      selfLink: /apis/compliance.openshift.io/v1alpha1/namespaces/openshift-compliance/compliancesuites/worker-compliancesuite
      uid: f137ce65-93d1-4889-8188-d95196e54642
    spec:
      autoApplyRemediations: true
      scans:
      - content: ssg-rhcos4-ds.xml
        contentImage: quay.io/complianceascode/ocp4:latest
        debug: true
        name: worker-scan
        nodeSelector:
          node-role.kubernetes.io/wscan: ""
        profile: xccdf_org.ssgproject.content_profile_moderate
        rawResultStorage:
          pvAccessModes:
          - ReadWriteOnce
          rotation: 3
          size: 1Gi
        scanTolerations:
        - effect: NoSchedule
          key: node-role.kubernetes.io/master
          operator: Exists
        scanType: Node
      schedule: 0 1 * * *
    status:
      phase: DONE
      result: NON-COMPLIANT
      scanStatuses:
      - name: worker-scan
        phase: DONE
        result: NON-COMPLIANT
        resultsStorage:
          name: worker-scan
          namespace: openshift-compliance

The ComplianceRemediations output shows all rules as Applied, but some of those rules 
are missing from the MachineConfig:

$ oc get complianceremediations |tail
worker-scan-sysctl-net-ipv4-conf-default-send-redirects           Applied
worker-scan-sysctl-net-ipv4-icmp-echo-ignore-broadcasts           Applied
worker-scan-sysctl-net-ipv4-icmp-ignore-bogus-error-responses     Applied
worker-scan-sysctl-net-ipv4-tcp-syncookies                        Applied
worker-scan-sysctl-net-ipv6-conf-all-accept-ra                    Applied
worker-scan-sysctl-net-ipv6-conf-all-accept-redirects             Applied
worker-scan-sysctl-net-ipv6-conf-all-accept-source-route          Applied
worker-scan-sysctl-net-ipv6-conf-default-accept-ra                Applied
worker-scan-sysctl-net-ipv6-conf-default-accept-redirects         Applied
worker-scan-sysctl-net-ipv6-conf-default-accept-source-route      Applied

$ oc get mc
NAME                                               GENERATEDBYCONTROLLER                      IGNITIONVERSION   AGE
00-master                                          d6b5d1922d848885cf5d2737306ab14323b7783a   3.2.0             3h13m
00-worker                                          d6b5d1922d848885cf5d2737306ab14323b7783a   3.2.0             3h13m
01-master-container-runtime                        d6b5d1922d848885cf5d2737306ab14323b7783a   3.2.0             3h13m
01-master-kubelet                                  d6b5d1922d848885cf5d2737306ab14323b7783a   3.2.0             3h13m
01-worker-container-runtime                        d6b5d1922d848885cf5d2737306ab14323b7783a   3.2.0             3h13m
01-worker-kubelet                                  d6b5d1922d848885cf5d2737306ab14323b7783a   3.2.0             3h13m
75-worker-scan-worker-compliancesuite                                                         3.1.0             2m25s


$ oc get mc 75-worker-scan-worker-compliancesuite -o yaml |grep "worker-scan-sysctl-net-ipv6-conf-default-accept-source-route"
$ oc get mc 75-worker-scan-worker-compliancesuite -o yaml |grep "worker-scan-sysctl-net-ipv6-conf-default-accept-redirects"
$ oc get mc 75-worker-scan-worker-compliancesuite -o yaml |grep "worker-scan-sysctl-net-ipv6-conf-default-accept-ra"
    remediation/worker-scan-sysctl-net-ipv6-conf-default-accept-ra: "2"
          f:remediation/worker-scan-sysctl-net-ipv6-conf-default-accept-ra: {}
$ oc get mc 75-worker-scan-worker-compliancesuite -o yaml |grep "worker-scan-sysctl-net-ipv6-conf-all-accept-source-route"
$ oc get mc 75-worker-scan-worker-compliancesuite -o yaml |grep "worker-scan-sysctl-net-ipv6-conf-all-accept-redirects"
$ oc get mc 75-worker-scan-worker-compliancesuite -o yaml |grep "worker-scan-sysctl-net-ipv6-conf-all-accept-ra"
    remediation/worker-scan-sysctl-net-ipv6-conf-all-accept-ra: "2"
          f:remediation/worker-scan-sysctl-net-ipv6-conf-all-accept-ra: {}


Version-Release number of selected component (if applicable):

4.7.0-0.nightly-2020-12-14-035110

How reproducible:

Always

Steps to Reproduce:

1. Deploy Compliance Operator
2. Create ComplianceSuite object CR
   $ oc create -f /tmp/e2e-test-compliance-zsiwoivl-db68w-uqpxgui1isc-config.json
3. Monitor scan pods
   $ oc get pods -w -n openshift-compliance
4. Check for compliance scan result through compliancesuite object
   $ oc get compliancesuite
5. Check complianceRemediations output which shows all rules are Applied 
   $ oc get complianceremediations
6. Check machineconfig and verify all rules are available in it
   $ oc get mc
   $ oc get mc 75-worker-scan-worker-compliancesuite -o yaml


Actual results:

Not all remediations get applied through the MachineConfig, although the status of all rules shows Applied
in the ComplianceRemediations objects.

Expected results:

All remediations should also be applied through the MachineConfig, so that the Applied status 
shown in the ComplianceRemediations objects matches reality.

Additional info:

Inspecting the created MachineConfig (the 75-XXXX one), it seems some remediations are simply missing.
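On a live cluster, the missing remediations can be enumerated by comparing the remediation names against the `remediation/<name>` annotations on the generated MachineConfig. The following is a minimal, self-contained sketch of that comparison: the two name lists are hard-coded from the output above rather than fetched with `oc`, so only the `comm` logic itself is illustrated.

```shell
# Remediations the ComplianceRemediations objects report as Applied
# (sample names taken from the `oc get complianceremediations |tail` output).
sort > /tmp/applied.txt <<'EOF'
worker-scan-sysctl-net-ipv6-conf-all-accept-ra
worker-scan-sysctl-net-ipv6-conf-all-accept-redirects
worker-scan-sysctl-net-ipv6-conf-all-accept-source-route
worker-scan-sysctl-net-ipv6-conf-default-accept-ra
worker-scan-sysctl-net-ipv6-conf-default-accept-redirects
worker-scan-sysctl-net-ipv6-conf-default-accept-source-route
EOF
# Only the -accept-ra rules appeared as remediation/ annotations on the MC
# (see the grep output above).
sort > /tmp/in_mc.txt <<'EOF'
worker-scan-sysctl-net-ipv6-conf-all-accept-ra
worker-scan-sysctl-net-ipv6-conf-default-accept-ra
EOF
# comm -23 prints lines only in the first file: remediations marked Applied
# that never made it into the MachineConfig.
comm -23 /tmp/applied.txt /tmp/in_mc.txt
```

With the sample data this prints the four redirects/source-route remediations, matching the empty grep results in the report.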

Comment 1 Nathan Kinder 2021-01-06 19:20:33 UTC
This has been fixed upstream for the 4.6 branch as a part of the following PR:

https://github.com/openshift/compliance-operator/pull/527

The specific commit for 4.6 is:

https://github.com/openshift/compliance-operator/commit/995b63f41a9d67a693139d651e3f419d9a27092f

Comment 4 Prashant Dhamdhere 2021-01-08 15:44:24 UTC
Bug verification failed on the latest version of the Compliance Operator, v0.1.24. The auto-remediation 
does not get applied, and all remediation rules go into an Error state with the message below:

Error Message:      not applying remediation that doesn't have a matching MachineconfigPool. Scan: worker-scan


[1] Applied compliancesuite CR through json file 

$ oc get csv
NAME                          DISPLAY               VERSION   REPLACES                      PHASE
compliance-operator.v0.1.24   Compliance Operator   0.1.24    compliance-operator.v0.1.17   Succeeded

$ oc get pods
NAME                                              READY   STATUS    RESTARTS   AGE
compliance-operator-67c6f76f54-v7wjx              1/1     Running   0          2m16s
ocp4-openshift-compliance-pp-5bc4b87f99-qch92     1/1     Running   0          76s
rhcos4-openshift-compliance-pp-78d9c5d499-c8c5w   1/1     Running   0          76s

$ cat /tmp/e2e-test-compliance-zsiwoivl-db68w-uqpxgui1isc-config.json
{
        "kind": "List",
        "apiVersion": "v1",
        "metadata": {},
        "items": [
            {
                "apiVersion": "compliance.openshift.io/v1alpha1",
                "kind": "ComplianceSuite",
                "metadata": {
                    "name": "worker-compliancesuite",
                    "namespace": "openshift-compliance"
                },
                "spec": {
                    "autoApplyRemediations": true,
                    "scans": [
                        {
                            "content": "ssg-rhcos4-ds.xml",
                            "contentImage": "quay.io/complianceascode/ocp4:latest",
                            "debug": true,
                            "name": "worker-scan",
                            "noExternalResources": false,
                            "nodeSelector": {
                                "node-role.kubernetes.io/wscan": ""
                            },
                            "profile": "xccdf_org.ssgproject.content_profile_moderate",
                            "rawResultStorage": {
                                "rotation": 0,
                                "size": ""
                            },
                            "rule": "",
                            "scanType": ""
                        }
                    ],
                    "schedule": "0 1 * * *"
                }
            }
        ]
    }

$ oc create -f /tmp/e2e-test-compliance-zsiwoivl-db68w-uqpxgui1isc-config.json
compliancesuite.compliance.openshift.io/worker-compliancesuite created

$ oc get pods -w
NAME                                                       READY   STATUS      RESTARTS   AGE
aggregator-pod-worker-scan                                 0/1     Completed   0          4m43s
compliance-operator-67c6f76f54-v7wjx                       1/1     Running     0          14m
ocp4-openshift-compliance-pp-5bc4b87f99-qch92              1/1     Running     0          13m
rhcos4-openshift-compliance-pp-78d9c5d499-c8c5w            1/1     Running     0          13m
worker-scan-ip-10-0-53-7.us-east-2.compute.internal-pod    0/2     Completed   0          7m53s
worker-scan-ip-10-0-58-26.us-east-2.compute.internal-pod   0/2     Completed   0          7m53s
worker-scan-ip-10-0-76-91.us-east-2.compute.internal-pod   0/2     Completed   0          7m53s

$ oc get compliancesuite
NAME                     PHASE   RESULT
worker-compliancesuite   DONE    NON-COMPLIANT

$ oc get mc
NAME                                               GENERATEDBYCONTROLLER                      IGNITIONVERSION   AGE
00-master                                          eab9c35dfbeb0d21be6e1db3887acbbb93592d34   3.1.0             8h
00-worker                                          eab9c35dfbeb0d21be6e1db3887acbbb93592d34   3.1.0             8h
01-master-container-runtime                        eab9c35dfbeb0d21be6e1db3887acbbb93592d34   3.1.0             8h
01-master-kubelet                                  eab9c35dfbeb0d21be6e1db3887acbbb93592d34   3.1.0             8h
01-worker-container-runtime                        eab9c35dfbeb0d21be6e1db3887acbbb93592d34   3.1.0             8h
01-worker-kubelet                                  eab9c35dfbeb0d21be6e1db3887acbbb93592d34   3.1.0             8h
99-master-fips                                                                                3.1.0             9h
99-master-generated-registries                     eab9c35dfbeb0d21be6e1db3887acbbb93592d34   3.1.0             8h
99-master-ssh                                                                                 3.1.0             9h
99-worker-fips                                                                                3.1.0             9h
99-worker-generated-registries                     eab9c35dfbeb0d21be6e1db3887acbbb93592d34   3.1.0             8h
99-worker-ssh                                                                                 3.1.0             9h
rendered-master-154a7dae66c15fef0545dc9af517c1dd   eab9c35dfbeb0d21be6e1db3887acbbb93592d34   3.1.0             8h
rendered-master-758ebd417045a6b5a48c8a9ce51fcd29   eab9c35dfbeb0d21be6e1db3887acbbb93592d34   3.1.0             8h
rendered-worker-16ae54de226db0ba7841781a6e574756   eab9c35dfbeb0d21be6e1db3887acbbb93592d34   3.1.0             8h
rendered-worker-636a9c3209570438fe833281745313af   eab9c35dfbeb0d21be6e1db3887acbbb93592d34   3.1.0             8h

$ oc get complianceremediations |tail
worker-scan-sysctl-net-ipv4-conf-default-send-redirects           Error
worker-scan-sysctl-net-ipv4-icmp-echo-ignore-broadcasts           Error
worker-scan-sysctl-net-ipv4-icmp-ignore-bogus-error-responses     Error
worker-scan-sysctl-net-ipv4-tcp-syncookies                        Error
worker-scan-sysctl-net-ipv6-conf-all-accept-ra                    Error
worker-scan-sysctl-net-ipv6-conf-all-accept-redirects             Error
worker-scan-sysctl-net-ipv6-conf-all-accept-source-route          Error
worker-scan-sysctl-net-ipv6-conf-default-accept-ra                Error
worker-scan-sysctl-net-ipv6-conf-default-accept-redirects         Error
worker-scan-sysctl-net-ipv6-conf-default-accept-source-route      Error

$ oc describe complianceremediations worker-scan-sysctl-net-ipv4-conf-default-send-redirects |tail -5
  Outdated:
Status:
  Application State:  Error
  Error Message:      not applying remediation that doesn't have a matching MachineconfigPool. Scan: worker-scan
Events:               <none>


[2] Applied compliancesuite CR through yaml and noticed the same issue


$ oc create -f - <<EOF
> apiVersion: compliance.openshift.io/v1alpha1
> kind: ComplianceSuite
> metadata:
>   name: example-compliancesuite
> spec:
>   autoApplyRemediations: true
>   schedule: "0 1 * * *"
>   scans:
>     - name: wscan-scan
>       profile: xccdf_org.ssgproject.content_profile_moderate
>       content: ssg-rhcos4-ds.xml
>       contentImage: quay.io/complianceascode/ocp4:latest
>       debug: true
>       nodeSelector:
>         node-role.kubernetes.io/wscan: ""
> EOF
compliancesuite.compliance.openshift.io/example-compliancesuite created


$ oc get pods
NAME                                                       READY   STATUS      RESTARTS   AGE
aggregator-pod-worker-scan                                 0/1     Completed   0          29m
aggregator-pod-wscan-scan                                  0/1     Completed   0          7m58s
compliance-operator-67c6f76f54-v7wjx                       1/1     Running     0          39m
ocp4-openshift-compliance-pp-5bc4b87f99-qch92              1/1     Running     0          38m
rhcos4-openshift-compliance-pp-78d9c5d499-c8c5w            1/1     Running     0          38m
worker-scan-ip-10-0-53-7.us-east-2.compute.internal-pod    0/2     Completed   0          32m
worker-scan-ip-10-0-58-26.us-east-2.compute.internal-pod   0/2     Completed   0          32m
worker-scan-ip-10-0-76-91.us-east-2.compute.internal-pod   0/2     Completed   0          32m
wscan-scan-ip-10-0-53-7.us-east-2.compute.internal-pod     0/2     Completed   0          10m
wscan-scan-ip-10-0-58-26.us-east-2.compute.internal-pod    0/2     Completed   0          10m
wscan-scan-ip-10-0-76-91.us-east-2.compute.internal-pod    0/2     Completed   0          10m


$ oc get compliancesuite
NAME                      PHASE   RESULT
example-compliancesuite   DONE    NON-COMPLIANT
worker-compliancesuite    DONE    NON-COMPLIANT



$ oc get complianceremediations |head
NAME                                                              STATE
worker-scan-audit-rules-dac-modification-chmod                    Error
worker-scan-audit-rules-dac-modification-chown                    Error
worker-scan-audit-rules-dac-modification-fchmod                   Error
worker-scan-audit-rules-dac-modification-fchmodat                 Error
worker-scan-audit-rules-dac-modification-fchown                   Error
worker-scan-audit-rules-dac-modification-fchownat                 Error
worker-scan-audit-rules-dac-modification-fremovexattr             Error
worker-scan-audit-rules-dac-modification-fsetxattr                Error
worker-scan-audit-rules-dac-modification-lchown                   Error


$ oc get mc
NAME                                               GENERATEDBYCONTROLLER                      IGNITIONVERSION   AGE
00-master                                          eab9c35dfbeb0d21be6e1db3887acbbb93592d34   3.1.0             9h
00-worker                                          eab9c35dfbeb0d21be6e1db3887acbbb93592d34   3.1.0             9h
01-master-container-runtime                        eab9c35dfbeb0d21be6e1db3887acbbb93592d34   3.1.0             9h
01-master-kubelet                                  eab9c35dfbeb0d21be6e1db3887acbbb93592d34   3.1.0             9h
01-worker-container-runtime                        eab9c35dfbeb0d21be6e1db3887acbbb93592d34   3.1.0             9h
01-worker-kubelet                                  eab9c35dfbeb0d21be6e1db3887acbbb93592d34   3.1.0             9h
99-master-fips                                                                                3.1.0             9h
99-master-generated-registries                     eab9c35dfbeb0d21be6e1db3887acbbb93592d34   3.1.0             9h
99-master-ssh                                                                                 3.1.0             9h
99-worker-fips                                                                                3.1.0             9h
99-worker-generated-registries                     eab9c35dfbeb0d21be6e1db3887acbbb93592d34   3.1.0             9h
99-worker-ssh                                                                                 3.1.0             9h
rendered-master-154a7dae66c15fef0545dc9af517c1dd   eab9c35dfbeb0d21be6e1db3887acbbb93592d34   3.1.0             8h
rendered-master-758ebd417045a6b5a48c8a9ce51fcd29   eab9c35dfbeb0d21be6e1db3887acbbb93592d34   3.1.0             9h
rendered-worker-16ae54de226db0ba7841781a6e574756   eab9c35dfbeb0d21be6e1db3887acbbb93592d34   3.1.0             8h
rendered-worker-636a9c3209570438fe833281745313af   eab9c35dfbeb0d21be6e1db3887acbbb93592d34   3.1.0             9h

$ oc get mcp
NAME     CONFIG                                             UPDATED   UPDATING   DEGRADED   MACHINECOUNT   READYMACHINECOUNT   UPDATEDMACHINECOUNT   DEGRADEDMACHINECOUNT   AGE
master   rendered-master-154a7dae66c15fef0545dc9af517c1dd   True      False      False      3              3                   3                     0                      9h
worker   rendered-worker-16ae54de226db0ba7841781a6e574756   True      False      False      5              5                   5                     0                      9h

Comment 5 Jakub Hrozek 2021-01-08 19:17:02 UTC
The above sounds like a typo to me. Your scan targets the "wscan" pool:

    "nodeSelector": {
        "node-role.kubernetes.io/wscan": ""
    },

but according to the oc get mcp output, you only have master and worker pools. This is also reported in the remediation statuses:
  Error Message:      not applying remediation that doesn't have a matching MachineconfigPool. Scan: worker-scan

Would you mind retesting yet again with a scan that matches the cluster pools?
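(For completeness: if the intent really is to scan a dedicated "wscan" pool, a matching MachineConfigPool has to exist so the remediations have somewhere to land. The following is a hypothetical sketch of the standard custom-pool pattern, not something tested in this report; it assumes the target nodes have already been labelled, e.g. `oc label node <node-name> node-role.kubernetes.io/wscan=`.)

```shell
# Hypothetical sketch: create a "wscan" pool that inherits worker
# MachineConfigs and selects the wscan-labelled nodes.
oc create -f - <<EOF
apiVersion: machineconfiguration.openshift.io/v1
kind: MachineConfigPool
metadata:
  name: wscan
spec:
  machineConfigSelector:
    matchExpressions:
      - key: machineconfiguration.openshift.io/role
        operator: In
        values: ["worker", "wscan"]
  nodeSelector:
    matchLabels:
      node-role.kubernetes.io/wscan: ""
EOF
```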

Comment 6 Prashant Dhamdhere 2021-01-11 07:19:03 UTC
I labelled all RHCOS worker nodes with the wscan label before performing the scan; that is also how 
I reproduced this issue when reporting the bug (comment #1), and the remediations got applied then. 
Let me retest with the worker nodeSelector; I expect it won't report the remediation error, 
since a worker MachineConfigPool is available by default on a cluster.

Comment 7 Prashant Dhamdhere 2021-01-11 11:39:26 UTC
LGTM. The remediations get applied without error with the worker nodeSelector. 
A MachineConfig gets created for each remediation rule, and the status of 
these rules shows Applied in the ComplianceRemediations objects.


Verified on :

4.6.0-0.nightly-2021-01-10-033123
compliance-operator.v0.1.24


$ oc get csv
NAME                          DISPLAY               VERSION   REPLACES                      PHASE
compliance-operator.v0.1.24   Compliance Operator   0.1.24    compliance-operator.v0.1.17   Succeeded

$ oc get pods
NAME                                              READY   STATUS    RESTARTS   AGE
compliance-operator-67c6f76f54-94v2h              1/1     Running   0          3m8s
ocp4-openshift-compliance-pp-5bc4b87f99-sggdx     1/1     Running   0          2m8s
rhcos4-openshift-compliance-pp-78d9c5d499-42vj8   1/1     Running   0          2m8s


$ cat /tmp/e2e-test-compliance-zsiwoivl-db68w-uqpxgui1isc-config.json
{
        "kind": "List",
        "apiVersion": "v1",
        "metadata": {},
        "items": [
            {
                "apiVersion": "compliance.openshift.io/v1alpha1",
                "kind": "ComplianceSuite",
                "metadata": {
                    "name": "worker-compliancesuite",
                    "namespace": "openshift-compliance"
                },
                "spec": {
                    "autoApplyRemediations": true,
                    "scans": [
                        {
                            "content": "ssg-rhcos4-ds.xml",
                            "contentImage": "quay.io/complianceascode/ocp4:latest",
                            "debug": true,
                            "name": "worker-scan",
                            "noExternalResources": false,
                            "nodeSelector": {
                                "node-role.kubernetes.io/worker": ""
                            },
                            "profile": "xccdf_org.ssgproject.content_profile_moderate",
                            "rawResultStorage": {
                                "rotation": 0,
                                "size": ""
                            },
                            "rule": "",
                            "scanType": ""
                        }
                    ],
                    "schedule": "0 1 * * *"
                }
            }
        ]
    }


$ oc create -f /tmp/e2e-test-compliance-zsiwoivl-db68w-uqpxgui1isc-config.json
compliancesuite.compliance.openshift.io/worker-compliancesuite created

$ oc get mcp
NAME     CONFIG                                             UPDATED   UPDATING   DEGRADED   MACHINECOUNT   READYMACHINECOUNT   UPDATEDMACHINECOUNT   DEGRADEDMACHINECOUNT   AGE
master   rendered-master-deca888a5e272c20ad2187d8eb6b35fc   True      False      False      3              3                   3                     0                      129m
worker   rendered-worker-9daad341dd3d13a1c6fb4d08c1b13be2   True      False      False      3              3                   3                     0                      129m

$ oc get pods
NAME                                                        READY   STATUS      RESTARTS   AGE
aggregator-pod-worker-scan                                  0/1     Completed   0          65s
compliance-operator-67c6f76f54-94v2h                        1/1     Running     0          8m4s
ocp4-openshift-compliance-pp-5bc4b87f99-sggdx               1/1     Running     0          7m4s
rhcos4-openshift-compliance-pp-78d9c5d499-42vj8             1/1     Running     0          7m4s
worker-scan-ip-10-0-57-140.us-east-2.compute.internal-pod   0/2     Completed   0          4m45s
worker-scan-ip-10-0-61-171.us-east-2.compute.internal-pod   0/2     Completed   0          4m45s
worker-scan-ip-10-0-75-65.us-east-2.compute.internal-pod    0/2     Completed   0          4m45s

$ oc get compliancesuite
NAME                     PHASE   RESULT
worker-compliancesuite   DONE    NON-COMPLIANT

$ oc get mc -l compliance.openshift.io/scan-name=worker-compliancesuite | head
NAME                                                                 GENERATEDBYCONTROLLER   IGNITIONVERSION   AGE
75-worker-scan-audit-rules-dac-modification-chmod                                            3.1.0             119s
75-worker-scan-audit-rules-dac-modification-chown                                            3.1.0             107s
75-worker-scan-audit-rules-dac-modification-fchmod                                           3.1.0             119s
75-worker-scan-audit-rules-dac-modification-fchmodat                                         3.1.0             109s
75-worker-scan-audit-rules-dac-modification-fchown                                           3.1.0             110s
75-worker-scan-audit-rules-dac-modification-fchownat                                         3.1.0             117s
75-worker-scan-audit-rules-dac-modification-fremovexattr                                     3.1.0             119s
75-worker-scan-audit-rules-dac-modification-fsetxattr                                        3.1.0             114s
75-worker-scan-audit-rules-dac-modification-lchown                                           3.1.0             112s

$ oc get mc -l compliance.openshift.io/scan-name=worker-compliancesuite |wc -l
103

$ oc get complianceremediations |head
NAME                                                              STATE
worker-scan-audit-rules-dac-modification-chmod                    Applied
worker-scan-audit-rules-dac-modification-chown                    Applied
worker-scan-audit-rules-dac-modification-fchmod                   Applied
worker-scan-audit-rules-dac-modification-fchmodat                 Applied
worker-scan-audit-rules-dac-modification-fchown                   Applied
worker-scan-audit-rules-dac-modification-fchownat                 Applied
worker-scan-audit-rules-dac-modification-fremovexattr             Applied
worker-scan-audit-rules-dac-modification-fsetxattr                Applied
worker-scan-audit-rules-dac-modification-lchown                   Applied

$ oc get complianceremediations |wc -l
103

Comment 9 errata-xmlrpc 2021-01-19 13:53:52 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory (Moderate: OpenShift Container Platform 4.6 compliance-operator security and bug fix update), and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHSA-2021:0190

