Bug 1969620 - In RHOCP 4.7, Compliance Operator scan results in XCCDF format have the target name missing, only for the ocp4-cis-node profile
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: OpenShift Container Platform
Classification: Red Hat
Component: Compliance Operator
Version: 4.7
Hardware: x86_64
OS: Linux
Priority: low
Severity: low
Target Milestone: ---
Target Release: 4.9.0
Assignee: Jakub Hrozek
QA Contact: Prashant Dhamdhere
URL:
Whiteboard:
Depends On:
Blocks:
 
Reported: 2021-06-08 18:42 UTC by Sayali Bhavsar
Modified: 2024-12-20 20:12 UTC
CC: 4 users

Fixed In Version:
Doc Type: Enhancement
Doc Text:
Feature: Scan results in the ARF format now include the host name of the scanned system in the `<target>` XML element, as well as the Kubernetes node name in the `<fact>` element whose `name` attribute is `urn:xccdf:fact:identifier`.
Reason: This is what the standard prescribes, and it helps users associate node scans with their respective results.
Result: Users can now process ARF scan results for node scans (platform scans are unchanged) with tools that require the host name to be in the appropriate place in the ARF results.
Clone Of:
Environment:
Last Closed: 2021-11-10 07:37:22 UTC
Target Upstream Version:
Embargoed:




Links:
- GitHub: openshift/compliance-operator pull 688 (last updated 2021-08-30 08:16:58 UTC)
- Red Hat Product Errata RHBA-2021:4530 (last updated 2021-11-10 07:37:28 UTC)

Description Sayali Bhavsar 2021-06-08 18:42:31 UTC
Description of problem:
When you run a compliance scan for the profile ocp4-cis-node, the extracted raw results (XCCDF format) have the <target> element set to `Unknown` instead of the host name.

Version-Release number of selected component (if applicable):
compliance-operator 0.1.32 provided by Red Hat Inc.

$ oc version
Client Version: 4.7.13
Server Version: 4.7.13
Kubernetes Version: v1.20.0+df9c838

How reproducible:
Always

Steps to Reproduce:

$ cat scansettingbinding
apiVersion: compliance.openshift.io/v1alpha1
kind: ScanSettingBinding
metadata:
  name: cis-compliance
profiles:
  - name: ocp4-cis-node
    kind: Profile
    apiGroup: compliance.openshift.io/v1alpha1
  - name: ocp4-cis
    kind: Profile
    apiGroup: compliance.openshift.io/v1alpha1
settingsRef:
  name: default
  kind: ScanSetting
  apiGroup: compliance.openshift.io/v1alpha1
scansettingbinding.compliance.openshift.io/cis-compliance created

$ oc get compliancesuites
NAME             PHASE   RESULT
cis-compliance   DONE    NON-COMPLIANT
nist-moderate    DONE    NON-COMPLIANT

$ oc get compliancescan 
NAME                   PHASE   RESULT
ocp4-cis               DONE    NON-COMPLIANT
ocp4-cis-node-master   DONE    NON-COMPLIANT
ocp4-cis-node-worker   DONE    NON-COMPLIANT
ocp4-moderate          DONE    NON-COMPLIANT

$ oc get pod
NAME                                                    READY   STATUS      RESTARTS   AGE
aggregator-pod-ocp4-cis                                 0/1     Completed   0          12m
aggregator-pod-ocp4-cis-node-master                     0/1     Completed   0          11m
aggregator-pod-ocp4-cis-node-worker                     0/1     Completed   0          11m
aggregator-pod-ocp4-moderate                            0/1     Completed   0          31m
compliance-operator-54b7c6fd4f-ngvwc                    1/1     Running     0          5h14m
ocp4-cis-api-checks-pod                                 0/2     Completed   0          12m
ocp4-moderate-api-checks-pod                            0/2     Completed   0          32m
ocp4-openshift-compliance-pp-b455d578-kp7x9             1/1     Running     0          5h13m
openscap-pod-26821489e3dc8f835d846cf8fe82138f1c554bd8   0/2     Completed   0          12m
openscap-pod-29153eec4f43688c57bf15695261645b3cbefee9   0/2     Completed   0          12m
openscap-pod-481663aadf3a764e83b5dd379a16588145ec7cdc   0/2     Completed   0          12m
openscap-pod-bbfe971ab33f06df43a6e4ca5f7283a0306ed9aa   0/2     Completed   0          12m
openscap-pod-c7328aa080ae31d4378640a4a032d2cf8b83bbb5   0/2     Completed   0          12m
openscap-pod-ec2323315e80bab73bf8830cd986c4edb0935129   0/2     Completed   0          12m
rhcos4-openshift-compliance-pp-6b49d75c9d-dzgmh         1/1     Running     0          5h13m

$ oc get pv
NAME                                       CAPACITY   ACCESS MODES   RECLAIM POLICY   STATUS   CLAIM                                       STORAGECLASS   REASON   AGE
pvc-4f28be22-5ffa-4f10-98f9-06ea0bf27913   1Gi        RWO            Delete           Bound    openshift-compliance/ocp4-cis-node-worker   gp2                     13m
pvc-b0993012-98d7-4b23-bd43-06825b14309f   1Gi        RWO            Delete           Bound    openshift-compliance/ocp4-moderate          gp2                     54m
pvc-d7bca6b9-fcd0-4397-a4a4-5fc355331f16   1Gi        RWO            Delete           Bound    openshift-compliance/ocp4-cis-node-master   gp2                     13m
pvc-d8c49f17-7f4c-41f0-88ad-33617d5eae8c   1Gi        RWO            Delete           Bound    openshift-compliance/ocp4-cis               gp2                     13m

$ oc get pvc
NAME                   STATUS   VOLUME                                     CAPACITY   ACCESS MODES   STORAGECLASS   AGE
ocp4-cis               Bound    pvc-d8c49f17-7f4c-41f0-88ad-33617d5eae8c   1Gi        RWO            gp2            14m
ocp4-cis-node-master   Bound    pvc-d7bca6b9-fcd0-4397-a4a4-5fc355331f16   1Gi        RWO            gp2            14m
ocp4-cis-node-worker   Bound    pvc-4f28be22-5ffa-4f10-98f9-06ea0bf27913   1Gi        RWO            gp2            14m
ocp4-moderate          Bound    pvc-b0993012-98d7-4b23-bd43-06825b14309f   1Gi        RWO            gp2            55m

$ oc get pvc/ocp4-cis-node-master
NAME                   STATUS   VOLUME                                     CAPACITY   ACCESS MODES   STORAGECLASS   AGE
ocp4-cis-node-master   Bound    pvc-d7bca6b9-fcd0-4397-a4a4-5fc355331f16   1Gi        RWO            gp2            14m

$ cat pvextract.yaml
apiVersion: "v1"
kind: Pod
metadata:
  name: pv-master-extract
spec:
  containers:
    - name: pv-extract-pod
      image: registry.access.redhat.com/ubi8/ubi
      command: ["sleep", "3000"]
      volumeMounts:
        - mountPath: "/ocp4-cis-node-master-results"
          name: workers-scan-vol
  volumes:
    - name: workers-scan-vol
      persistentVolumeClaim:
        claimName: ocp4-cis-node-master
pod/pv-master-extract created

$ cat pvextract.yaml
apiVersion: "v1"
kind: Pod
metadata:
  name: pv-worker-extract
spec:
  containers:
    - name: pv-extract-pod
      image: registry.access.redhat.com/ubi8/ubi
      command: ["sleep", "3000"]
      volumeMounts:
        - mountPath: "/ocp4-cis-node-worker-results"
          name: workers-scan-vol
  volumes:
    - name: workers-scan-vol
      persistentVolumeClaim:
        claimName: ocp4-cis-node-worker
pod/pv-worker-extract created


$ oc exec pods/pv-master-extract -- ls /ocp4-cis-node-master-results/0
openscap-pod-26821489e3dc8f835d846cf8fe82138f1c554bd8.xml.bzip2
openscap-pod-bbfe971ab33f06df43a6e4ca5f7283a0306ed9aa.xml.bzip2
openscap-pod-ec2323315e80bab73bf8830cd986c4edb0935129.xml.bzip2
$ oc cp pv-master-extract:/ocp4-cis-node-master-results .
tar: Removing leading `/' from member names

$ oc exec pv-worker-extract -- ls /ocp4-cis-node-worker-results/0
openscap-pod-29153eec4f43688c57bf15695261645b3cbefee9.xml.bzip2
openscap-pod-481663aadf3a764e83b5dd379a16588145ec7cdc.xml.bzip2
openscap-pod-c7328aa080ae31d4378640a4a032d2cf8b83bbb5.xml.bzip2
$ oc cp pv-worker-extract:/ocp4-cis-node-worker-results .
tar: Removing leading `/' from member names


[sbhavsar@sbhavsar ~]$ oc get cm 
NAME                                                    DATA   AGE
compliance-operator-lock                                0      6h29m
kube-root-ca.crt                                        1      6h31m
ocp4-cis-api-checks-pod                                 3      88m
ocp4-cis-node-master-openscap-container-entrypoint      1      89m
ocp4-cis-node-master-openscap-env-map                   4      89m
ocp4-cis-node-master-openscap-env-map-platform          3      89m
ocp4-cis-node-worker-openscap-container-entrypoint      1      89m
ocp4-cis-node-worker-openscap-env-map                   4      89m
ocp4-cis-node-worker-openscap-env-map-platform          3      89m
ocp4-cis-openscap-container-entrypoint                  1      89m
ocp4-cis-openscap-env-map                               4      89m
ocp4-cis-openscap-env-map-platform                      3      89m
ocp4-moderate-api-checks-pod                            3      107m
ocp4-moderate-openscap-container-entrypoint             1      108m
ocp4-moderate-openscap-env-map                          4      108m
ocp4-moderate-openscap-env-map-platform                 3      108m
openscap-pod-26821489e3dc8f835d846cf8fe82138f1c554bd8   3      88m
openscap-pod-29153eec4f43688c57bf15695261645b3cbefee9   3      88m
openscap-pod-481663aadf3a764e83b5dd379a16588145ec7cdc   3      88m
openscap-pod-bbfe971ab33f06df43a6e4ca5f7283a0306ed9aa   3      88m
openscap-pod-c7328aa080ae31d4378640a4a032d2cf8b83bbb5   3      88m
openscap-pod-ec2323315e80bab73bf8830cd986c4edb0935129   3      88m

$ oc extract cm/openscap-pod-26821489e3dc8f835d846cf8fe82138f1c554bd8 --confirm
$ head -10 results
<?xml version="1.0" encoding="UTF-8"?>
<TestResult xmlns="http://checklists.nist.gov/xccdf/1.2" id="xccdf_org.open-scap_testresult_xccdf_org.ssgproject.content_profile_cis-node" start-time="2021-06-08T17:05:59+00:00" end-time="2021-06-08T17:06:01+00:00" version="0.1.56" test-system="cpe:/a:redhat:openscap:1.3.4">
          <benchmark href="/content/ssg-ocp4-ds.xml" id="xccdf_org.ssgproject.content_benchmark_OCP-4"/>
          <title>OSCAP Scan Result</title>
          <profile idref="xccdf_org.ssgproject.content_profile_cis-node"/>
          <target>Unknown</target>
          <target-facts>
            <fact name="urn:xccdf:fact:identifier" type="string">chroot:///host</fact>
            <fact name="urn:xccdf:fact:scanner:name" type="string">OpenSCAP</fact>
            <fact name="urn:xccdf:fact:scanner:version" type="string">1.3.4</fact>

$ oc extract cm/openscap-pod-29153eec4f43688c57bf15695261645b3cbefee9 --confirm
results
warnings
exit-code
$ head -10 results
<?xml version="1.0" encoding="UTF-8"?>
<TestResult xmlns="http://checklists.nist.gov/xccdf/1.2" id="xccdf_org.open-scap_testresult_xccdf_org.ssgproject.content_profile_cis-node" start-time="2021-06-08T17:06:32+00:00" end-time="2021-06-08T17:06:32+00:00" version="0.1.56" test-system="cpe:/a:redhat:openscap:1.3.4">
          <benchmark href="/content/ssg-ocp4-ds.xml" id="xccdf_org.ssgproject.content_benchmark_OCP-4"/>
          <title>OSCAP Scan Result</title>
          <profile idref="xccdf_org.ssgproject.content_profile_cis-node"/>
          <target>Unknown</target>
          <target-facts>
            <fact name="urn:xccdf:fact:identifier" type="string">chroot:///host</fact>
            <fact name="urn:xccdf:fact:scanner:name" type="string">OpenSCAP</fact>
            <fact name="urn:xccdf:fact:scanner:version" type="string">1.3.4</fact>

Actual results:
<target>Unknown</target>

Expected results:
<target>ocp4-cis-node-master-results</target>
<target>ocp4-cis-node-worker-results</target>

Additional info:
Raw results from the other ComplianceScans, `ocp4-cis` and `ocp4-moderate`, do show the target:

$ oc extract cm/ocp4-moderate-api-checks-pod --confirm
exit-code
results
warnings
$ head -9 results
<?xml version="1.0" encoding="UTF-8"?>
<TestResult xmlns="http://checklists.nist.gov/xccdf/1.2" id="xccdf_org.open-scap_testresult_xccdf_org.ssgproject.content_profile_moderate" start-time="2021-06-08T16:46:44+00:00" end-time="2021-06-08T16:46:44+00:00" version="0.1.56" test-system="cpe:/a:redhat:openscap:1.3.4">
          <benchmark href="/content/ssg-ocp4-ds.xml" id="xccdf_org.ssgproject.content_benchmark_OCP-4"/>
          <title>OSCAP Scan Result</title>
          <identity authenticated="false" privileged="false"/>
          <profile idref="xccdf_org.ssgproject.content_profile_moderate"/>
          <target>ocp4-moderate-api-checks-pod</target>
          <target-address>127.0.0.1</target-address>

$ oc extract cm/ocp4-cis-api-checks-pod --confirm
warnings
exit-code
results
$ head -9 results
<?xml version="1.0" encoding="UTF-8"?>
<TestResult xmlns="http://checklists.nist.gov/xccdf/1.2" id="xccdf_org.open-scap_testresult_xccdf_org.ssgproject.content_profile_cis" start-time="2021-06-08T17:06:11+00:00" end-time="2021-06-08T17:06:12+00:00" version="0.1.56" test-system="cpe:/a:redhat:openscap:1.3.4">
          <benchmark href="/content/ssg-ocp4-ds.xml" id="xccdf_org.ssgproject.content_benchmark_OCP-4"/>
          <title>OSCAP Scan Result</title>
          <identity authenticated="false" privileged="false"/>
          <profile idref="xccdf_org.ssgproject.content_profile_cis"/>
          <target>ocp4-cis-api-checks-pod</target>
          <target-address>127.0.0.1</target-address>
          <target-address>10.130.0.61</target-address>

RELEVANT LINKS:
https://docs.openshift.com/container-platform/4.7/security/compliance_operator/compliance-scans.html#running-compliance-scans_compliance-operator-scans
https://github.com/openshift/compliance-operator#extracting-raw-results

Comment 3 Jakub Hrozek 2021-06-09 11:15:53 UTC
I agree this is slightly irritating. Normally openscap, which the CO runs under the hood, tries to read /etc/hostname, but that file is not present on RHCOS.

That said, the node that was scanned is present as an annotation:
Namespace:    openshift-compliance
Labels:       compliance.openshift.io/scan-name=workers-cis
              complianceoperator.openshift.io/scan-result=
Annotations:  compliance-remediations/processed:
              compliance.openshift.io/scan-error-msg:
              compliance.openshift.io/scan-result: NON-COMPLIANT
              openscap-scan-result/node: ip-10-0-140-145.eu-north-1.compute.internal
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

Data
====
exit-code:
----
2
results:
----
<?xml version="1.0" encoding="UTF-8"?>
<TestResult xmlns="http://checklists.nist.gov/xccdf/1.2" id="xccdf_org.open-scap_testresult_xccdf_org.ssgproject.content_profile_cis-node" start-time="2021-06-09T11:06:02+00:00" end-time="2021-06-09T11:06:03+00:00" version="0.1.57" test-s>
          <benchmark href="/content/ssg-ocp4-ds.xml" id="xccdf_org.ssgproject.content_benchmark_OCP-4"/>
          <title>OSCAP Scan Result</title>
          <profile idref="xccdf_org.ssgproject.content_profile_cis-node"/>
          <target>Unknown</target>
          <target-facts>
            <fact name="urn:xccdf:fact:identifier" type="string">chroot:///host</fact>
            <fact name="urn:xccdf:fact:scanner:name" type="string">OpenSCAP</fact>
            <fact name="urn:xccdf:fact:scanner:version" type="string">1.3.4</fact>
          </target-facts>

Would using this annotation help the customer?
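
As a sketch of that workaround on the consumer side, the node name could be read from the annotation rather than from <target>; `get_scanned_node` below is a hypothetical helper operating on a ConfigMap as a dict (e.g. parsed from `oc get cm -o json`):

```python
def get_scanned_node(configmap):
    """Return the scanned node's name from the result ConfigMap's
    openscap-scan-result/node annotation, or None if absent."""
    annotations = configmap.get("metadata", {}).get("annotations", {})
    return annotations.get("openscap-scan-result/node")

# Hypothetical ConfigMap fragment mirroring the annotations shown above:
cm = {
    "metadata": {
        "annotations": {
            "compliance.openshift.io/scan-result": "NON-COMPLIANT",
            "openscap-scan-result/node": "ip-10-0-140-145.eu-north-1.compute.internal",
        }
    }
}
print(get_scanned_node(cm))
```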

There are some environment variables that can tune things like the name (https://github.com/OpenSCAP/openscap/blob/maint-1.3/docs/manual/manual.adoc#list-of-accepted-environment-variables). I'll try checking whether the node name can be set this way.

Comment 4 Jakub Hrozek 2021-06-09 11:33:31 UTC
OK, the variables alone would probably not help. There is OSCAP_EVALUATION_TARGET, which sets urn:xccdf:fact:identifier under target-facts, for example:

          <target>Unknown</target>
          <target-facts>
            <fact name="urn:xccdf:fact:identifier" type="string">chroot:///host</fact>
            <fact name="urn:xccdf:fact:scanner:name" type="string">OpenSCAP</fact>
            <fact name="urn:xccdf:fact:scanner:version" type="string">1.3.4</fact>
          </target-facts>

Using this variable we could set urn:xccdf:fact:identifier to whatever we want, perhaps the node name. Would that help the customer?

Comment 11 Evgeny Kolesnikov 2021-06-14 08:36:48 UTC
According to the standard, <target> should hold the FQDN, or the hostname if the FQDN is not available. We can adjust the scanner to use OSCAP_EVALUATION_TARGET if it is set and the hostname is absent (i.e. in place of 'Unknown'). What do you think about this idea?

BTW is there any other way to get the hostname of the RHCOS node besides reading it from /etc/hostname?

Comment 12 Jakub Hrozek 2021-06-14 10:19:06 UTC
(In reply to Evgeny Kolesnikov from comment #11)
> Acc. to the standard <target> should hold the FQDN or hostname (if FQDN is
> not available). We can adjust the scanner to place OSCAP_EVALUATION_TARGET
> if it is set and hostname is absent (i.e. in place of 'unknown'). What do
> you think about this idea?
> 
> BTW is there any other way to get the hostname of the RHCOS node besides
> reading it from /etc/hostname?

/proc/sys/kernel/hostname is what you can take a look at in absence of /etc/hostname.

But I suspect for OCP environment, especially public clouds, the hostname is not as relevant as the node name (which is the 'name' attribute of the 'Node' API object in kubernetes). The reason simply being that it's the node name, not the host name that the admin uses when interacting with the cluster nodes. So having the hostname settable by whoever is orchestrating the scan would be more useful.

Although poking at another source would be a big improvement over 'unknown', no argument about that.
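
The fallback order discussed in these comments could look roughly like the sketch below. This is not the shipped implementation; the paths are parameterized so the logic is testable, and the OSCAP_EVALUATION_TARGET handling is the proposal from comment #11:

```python
import os

def resolve_target(etc_hostname="/etc/hostname",
                   proc_hostname="/proc/sys/kernel/hostname",
                   env=os.environ):
    """Pick a <target> value: /etc/hostname first, then the kernel
    hostname, then OSCAP_EVALUATION_TARGET, then 'Unknown'."""
    for path in (etc_hostname, proc_hostname):
        try:
            with open(path) as f:
                name = f.read().strip()
            if name:
                return name
        except OSError:
            # File missing or unreadable (e.g. /etc/hostname on RHCOS).
            pass
    return env.get("OSCAP_EVALUATION_TARGET", "Unknown")
```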

Comment 13 Jakub Hrozek 2021-06-14 10:22:40 UTC
btw if you think it would be more in line with the standard to put the hostname into <target> and the nodeName into one of the facts, that would also work I guess..

Comment 14 Evgeny Kolesnikov 2021-06-14 20:02:30 UTC
One thing bothers me: why can't the hostname of the node be equal to its node name? And if it can, why isn't it?

Comment 15 Jakub Hrozek 2021-06-15 07:47:09 UTC
(In reply to Evgeny Kolesnikov from comment #14)
> One thing bothers me, why the hostname of the node can't be equal to
> nodename of the node? Why it is not if it can?

It can, but it doesn't have to be. In the end, I don't mind which one we use that much. As I said in comment #13, using the hostname for the target and putting nodeName into the facts (which can be done using an env variable already) is OK.

Comment 29 Prashant Dhamdhere 2021-09-24 07:57:13 UTC
[Bug_Verification]

Looks good to me. Now the ComplianceScan XCCDF-format result shows the relevant nodeName in the target element and in the urn:xccdf:fact:identifier fact for node checks.

Verified On:
4.9.0-0.nightly-2021-09-23-142241 + compliance-operator.v0.1.41


$ oc get clusterversion
NAME      VERSION                             AVAILABLE   PROGRESSING   SINCE   STATUS
version   4.9.0-0.nightly-2021-09-23-142241   True        False         3h48m   Cluster version is 4.9.0-0.nightly-2021-09-23-142241

$ oc get csv
NAME                             DISPLAY                            VERSION   REPLACES   PHASE
compliance-operator.v0.1.41      Compliance Operator                0.1.41               Succeeded
elasticsearch-operator.5.2.2-7   OpenShift Elasticsearch Operator   5.2.2-7              Succeeded

$ oc get pods -w
NAME                                              READY   STATUS    RESTARTS       AGE
compliance-operator-c769749d5-8pjgm               1/1     Running   1 (175m ago)   176m
ocp4-openshift-compliance-pp-64dbd7c98f-dwhdz     1/1     Running   0              175m
rhcos4-openshift-compliance-pp-66575dc885-lb9vh   1/1     Running   0              175m

$ oc create -f - <<EOF
> apiVersion: compliance.openshift.io/v1alpha1
> kind: ScanSettingBinding
> metadata:
>   name: cis-compliance
> profiles:
>   - name: ocp4-cis-node
>     kind: Profile
>     apiGroup: compliance.openshift.io/v1alpha1
>   - name: ocp4-cis
>     kind: Profile
>     apiGroup: compliance.openshift.io/v1alpha1
> settingsRef:
>   name: default
>   kind: ScanSetting
>   apiGroup: compliance.openshift.io/v1alpha1
> EOF
scansettingbinding.compliance.openshift.io/cis-compliance created



$ oc get pods 
NAME                                                    READY   STATUS      RESTARTS     AGE
aggregator-pod-ocp4-cis                                 0/1     Completed   0            2m48s
aggregator-pod-ocp4-cis-node-master                     0/1     Completed   0            2m58s
aggregator-pod-ocp4-cis-node-worker                     0/1     Completed   0            2m28s
compliance-operator-c769749d5-8pjgm                     1/1     Running     1 (3h ago)   3h1m
ocp4-cis-api-checks-pod                                 0/2     Completed   0            3m30s
ocp4-cis-node-master-pdhamdhe0924-8nndq-master-0-pod    0/2     Completed   0            3m31s
ocp4-cis-node-master-pdhamdhe0924-8nndq-master-1-pod    0/2     Completed   0            3m31s
ocp4-cis-node-master-pdhamdhe0924-8nndq-master-2-pod    0/2     Completed   0            3m30s
ocp4-openshift-compliance-pp-64dbd7c98f-dwhdz           1/1     Running     0            3h
openscap-pod-a66b87aad261fa091e6791413811e72bb0d232f9   0/2     Completed   0            3m28s
openscap-pod-b4a2f4787f65c320edd9a57a0cfbb9c7bfbc24f7   0/2     Completed   0            3m28s
openscap-pod-f64ff847b6f0c2421082c3eee0268d1815f0b70e   0/2     Completed   0            3m28s
rhcos4-openshift-compliance-pp-66575dc885-lb9vh         1/1     Running     0            3h

$ oc get compliancescan 
NAME                   PHASE   RESULT
ocp4-cis               DONE    NON-COMPLIANT
ocp4-cis-node-master   DONE    NON-COMPLIANT
ocp4-cis-node-worker   DONE    NON-COMPLIANT

$ oc get suite
NAME             PHASE   RESULT
cis-compliance   DONE    NON-COMPLIANT

$ oc get pv
NAME                                       CAPACITY   ACCESS MODES   RECLAIM POLICY   STATUS   CLAIM                                       STORAGECLASS      REASON   AGE
pvc-1727589d-2aa1-44d0-9046-e9d00c8852fa   1Gi        RWO            Delete           Bound    openshift-compliance/ocp4-cis               managed-premium            4m27s
pvc-4a254d2e-0d81-4908-8e68-3fa23458aa98   1Gi        RWO            Delete           Bound    openshift-compliance/ocp4-cis-node-worker   managed-premium            4m25s
pvc-8f7a719e-9571-47fe-8b25-f18fb690ab7f   1Gi        RWO            Delete           Bound    openshift-compliance/ocp4-cis-node-master   managed-premium            4m27s

$ oc get pvc
NAME                   STATUS   VOLUME                                     CAPACITY   ACCESS MODES   STORAGECLASS      AGE
ocp4-cis               Bound    pvc-1727589d-2aa1-44d0-9046-e9d00c8852fa   1Gi        RWO            managed-premium   4m40s
ocp4-cis-node-master   Bound    pvc-8f7a719e-9571-47fe-8b25-f18fb690ab7f   1Gi        RWO            managed-premium   4m49s
ocp4-cis-node-worker   Bound    pvc-4a254d2e-0d81-4908-8e68-3fa23458aa98   1Gi        RWO            managed-premium   4m33s


$ oc get cm
NAME                                                    DATA   AGE
compliance-operator-lock                                0      3h5m
kube-root-ca.crt                                        1      3h5m
ocp4-cis-api-checks-pod                                 3      6m48s
ocp4-cis-node-master-openscap-container-entrypoint      1      7m46s
ocp4-cis-node-master-openscap-env-map                   4      7m46s
ocp4-cis-node-master-openscap-env-map-platform          3      7m46s
ocp4-cis-node-master-pdhamdhe0924-8nndq-master-0-pod    3      6m52s
ocp4-cis-node-master-pdhamdhe0924-8nndq-master-1-pod    3      6m55s
ocp4-cis-node-master-pdhamdhe0924-8nndq-master-2-pod    3      6m52s
ocp4-cis-node-worker-openscap-container-entrypoint      1      7m50s
ocp4-cis-node-worker-openscap-env-map                   4      7m50s
ocp4-cis-node-worker-openscap-env-map-platform          3      7m49s
ocp4-cis-openscap-container-entrypoint                  1      7m38s
ocp4-cis-openscap-env-map                               4      7m38s
ocp4-cis-openscap-env-map-platform                      3      7m38s
openscap-pod-a66b87aad261fa091e6791413811e72bb0d232f9   3      6m53s
openscap-pod-b4a2f4787f65c320edd9a57a0cfbb9c7bfbc24f7   3      6m56s
openscap-pod-f64ff847b6f0c2421082c3eee0268d1815f0b70e   3      6m57s
openshift-service-ca.crt                                1      3h5m


$ oc get nodes
NAME                                         STATUS   ROLES    AGE     VERSION
pdhamdhe0924-8nndq-master-0                  Ready    master   4h11m   v1.22.0-rc.0+af080cb
pdhamdhe0924-8nndq-master-1                  Ready    master   4h11m   v1.22.0-rc.0+af080cb
pdhamdhe0924-8nndq-master-2                  Ready    master   4h11m   v1.22.0-rc.0+af080cb
pdhamdhe0924-8nndq-worker-centralus1-bkhnz   Ready    worker   4h2m    v1.22.0-rc.0+af080cb
pdhamdhe0924-8nndq-worker-centralus2-9xh96   Ready    worker   3h46m   v1.22.0-rc.0+af080cb
pdhamdhe0924-8nndq-worker-centralus3-6qzwq   Ready    worker   4h2m    v1.22.0-rc.0+af080cb


$ for pod in $(oc get cm -lcompliance.openshift.io/scan-name=ocp4-cis-node-master,complianceoperator.openshift.io/scan-result= --no-headers |awk '{print $1}'); do echo -e "\n\n <<<<< Check target in raw result for '$pod' >>>>>"; oc extract cm/$pod --confirm; grep "target>pdhamdhe\|identifier" results; done


 <<<<< Check target in raw result for 'ocp4-cis-node-master-pdhamdhe0924-8nndq-master-0-pod' >>>>>
exit-code
results
warnings
          <target>pdhamdhe0924-8nndq-master-0</target>
            <fact name="urn:xccdf:fact:identifier" type="string">pdhamdhe0924-8nndq-master-0</fact>


 <<<<< Check target in raw result for 'ocp4-cis-node-master-pdhamdhe0924-8nndq-master-1-pod' >>>>>
exit-code
results
warnings
          <target>pdhamdhe0924-8nndq-master-1</target>
            <fact name="urn:xccdf:fact:identifier" type="string">pdhamdhe0924-8nndq-master-1</fact>


 <<<<< Check target in raw result for 'ocp4-cis-node-master-pdhamdhe0924-8nndq-master-2-pod' >>>>>
exit-code
results
warnings
          <target>pdhamdhe0924-8nndq-master-2</target>
            <fact name="urn:xccdf:fact:identifier" type="string">pdhamdhe0924-8nndq-master-2</fact>


$ for pod in $(oc get cm -lcompliance.openshift.io/scan-name=ocp4-cis-node-worker,complianceoperator.openshift.io/scan-result= --no-headers |awk '{print $1}'); do echo -e "\n\n <<<<< Check target in raw result for '$pod' >>>>>"; oc extract cm/$pod --confirm; oc get pods $pod -oyaml |grep "nodeName"; grep "target>pdhamdhe\|identifier" results; done


 <<<<< Check target in raw result for 'openscap-pod-a66b87aad261fa091e6791413811e72bb0d232f9' >>>>>
results
warnings
exit-code
  nodeName: pdhamdhe0924-8nndq-worker-centralus1-bkhnz
          <target>pdhamdhe0924-8nndq-worker-centralus1-bkhnz</target>
            <fact name="urn:xccdf:fact:identifier" type="string">pdhamdhe0924-8nndq-worker-centralus1-bkhnz</fact>


 <<<<< Check target in raw result for 'openscap-pod-b4a2f4787f65c320edd9a57a0cfbb9c7bfbc24f7' >>>>>
exit-code
results
warnings
  nodeName: pdhamdhe0924-8nndq-worker-centralus2-9xh96
          <target>pdhamdhe0924-8nndq-worker-centralus2-9xh96</target>
            <fact name="urn:xccdf:fact:identifier" type="string">pdhamdhe0924-8nndq-worker-centralus2-9xh96</fact>


 <<<<< Check target in raw result for 'openscap-pod-f64ff847b6f0c2421082c3eee0268d1815f0b70e' >>>>>
exit-code
results
warnings
  nodeName: pdhamdhe0924-8nndq-worker-centralus3-6qzwq
          <target>pdhamdhe0924-8nndq-worker-centralus3-6qzwq</target>
            <fact name="urn:xccdf:fact:identifier" type="string">pdhamdhe0924-8nndq-worker-centralus3-6qzwq</fact>
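
The pattern verified above (node name in both <target> and the identifier fact) can also be pulled out programmatically rather than with grep; a minimal sketch, assuming only the XCCDF 1.2 namespace that appears in these raw results:

```python
import xml.etree.ElementTree as ET

XCCDF_NS = "http://checklists.nist.gov/xccdf/1.2"

def extract_target(xml_text):
    """Return (<target> text, urn:xccdf:fact:identifier fact) from an
    XCCDF TestResult document."""
    root = ET.fromstring(xml_text)
    target = root.findtext(f"{{{XCCDF_NS}}}target")
    fact = None
    for f in root.iter(f"{{{XCCDF_NS}}}fact"):
        if f.get("name") == "urn:xccdf:fact:identifier":
            fact = f.text
            break
    return target, fact

# Small sample mirroring the verified results above:
sample = """<TestResult xmlns="http://checklists.nist.gov/xccdf/1.2" id="r1">
  <target>pdhamdhe0924-8nndq-master-0</target>
  <target-facts>
    <fact name="urn:xccdf:fact:identifier" type="string">pdhamdhe0924-8nndq-master-0</fact>
  </target-facts>
</TestResult>"""

print(extract_target(sample))
```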

Comment 31 errata-xmlrpc 2021-11-10 07:37:22 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory (OpenShift Compliance Operator bug fix and enhancement update), and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHBA-2021:4530

