Bug 2082431 - The remediation doesn’t work for rule ocp4-kubelet-configure-tls-cipher-suites
Summary: The remediation doesn’t work for rule ocp4-kubelet-configure-tls-cipher-suites
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: OpenShift Container Platform
Classification: Red Hat
Component: Compliance Operator
Version: 4.11
Hardware: Unspecified
OS: Unspecified
Priority: high
Severity: high
Target Milestone: ---
Target Release: ---
Assignee: Vincent Shen
QA Contact:
URL:
Whiteboard:
Depends On:
Blocks:
 
Reported: 2022-05-06 06:49 UTC by xiyuan
Modified: 2022-06-06 14:39 UTC
CC List: 6 users

Fixed In Version:
Doc Type: If docs needed, set a value
Doc Text:
Clone Of:
Environment:
Last Closed: 2022-06-06 14:39:50 UTC
Target Upstream Version:
Embargoed:




Links
System ID Private Priority Status Summary Last Updated
Github ComplianceAsCode compliance-operator pull 36 0 None open Bug 2082431: Fix MachineConfig base64 encoding issue on OVN cluster 2022-05-17 03:28:11 UTC
Red Hat Product Errata RHBA-2022:4657 0 None None None 2022-06-06 14:39:55 UTC

Description xiyuan 2022-05-06 06:49:17 UTC
Description of problem:
When remediation is triggered for rule ocp4-kubelet-configure-tls-cipher-suites on an OVN cluster, the MCPs get paused and no remediation is applied, because of the error "encoded kubeletconfig 99-worker-generated-kubelet does not contain data:text/plain, prefix":
# oc get mcp
NAME     CONFIG                                             UPDATED   UPDATING   DEGRADED   MACHINECOUNT   READYMACHINECOUNT   UPDATEDMACHINECOUNT   DEGRADEDMACHINECOUNT   AGE
master   rendered-master-2e2cabcb4c0dc68e0d5f65ef299351f4   False     False      False      3              0                   0                     0                      86m
worker   rendered-worker-7957dbecb967c03d7d8923a9a249f7e4   False     False      False      3              0                   0                     0                      86m
In the compliance-operator pod log, you will see the error below:
{"level":"info","ts":1651814895.169528,"logger":"suitectrl","msg":"All scans are in Done phase. Post-processing remediations","Request.Namespace":"openshift-compliance","Request.Name":"ocp4-kubelet-configure-tls-cipher-test"}
{"level":"error","ts":1651814895.1696122,"logger":"suitectrl","msg":"Retriable error","Request.Namespace":"openshift-compliance","Request.Name":"ocp4-kubelet-configure-tls-cipher-test","error":"encoded kubeletconfig 99-worker-generated-kubelet does not contain data:text/plain, prefix","stacktrace":"github.com/openshift/compliance-operator/pkg/controller/compliancesuite.(*ReconcileComplianceSuite).Reconcile\n\t/go/src/github.com/openshift/compliance-operator/pkg/controller/compliancesuite/compliancesuite_controller.go:181\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).reconcileHandler\n\t/go/src/github.com/openshift/compliance-operator/vendor/sigs.k8s.io/controller-runtime/pkg/internal/controller/controller.go:235\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).processNextWorkItem\n\t/go/src/github.com/openshift/compliance-operator/vendor/sigs.k8s.io/controller-runtime/pkg/internal/controller/controller.go:209\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).worker\n\t/go/src/github.com/openshift/compliance-operator/vendor/sigs.k8s.io/controller-runtime/pkg/internal/controller/controller.go:188\nk8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1\n\t/go/src/github.com/openshift/compliance-operator/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:155\nk8s.io/apimachinery/pkg/util/wait.BackoffUntil\n\t/go/src/github.com/openshift/compliance-operator/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:156\nk8s.io/apimachinery/pkg/util/wait.JitterUntil\n\t/go/src/github.com/openshift/compliance-operator/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:133\nk8s.io/apimachinery/pkg/util/wait.Until\n\t/go/src/github.com/openshift/compliance-operator/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:90"}
{"level":"error","ts":1651814895.169669,"logger":"controller","msg":"Reconciler error","controller":"compliancesuite-controller","name":"ocp4-kubelet-configure-tls-cipher-test","namespace":"openshift-compliance","error":"encoded kubeletconfig 99-worker-generated-kubelet does not contain data:text/plain, prefix","stacktrace":"sigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).processNextWorkItem\n\t/go/src/github.com/openshift/compliance-operator/vendor/sigs.k8s.io/controller-runtime/pkg/internal/controller/controller.go:209\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).worker\n\t/go/src/github.com/openshift/compliance-operator/vendor/sigs.k8s.io/controller-runtime/pkg/internal/controller/controller.go:188\nk8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1\n\t/go/src/github.com/openshift/compliance-operator/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:155\nk8s.io/apimachinery/pkg/util/wait.BackoffUntil\n\t/go/src/github.com/openshift/compliance-operator/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:156\nk8s.io/apimachinery/pkg/util/wait.JitterUntil\n\t/go/src/github.com/openshift/compliance-operator/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:133\nk8s.io/apimachinery/pkg/util/wait.Until\n\t/go/src/github.com/openshift/compliance-operator/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:90"}

Version-Release number of selected component (if applicable):
4.11.0-0.nightly-2022-05-05-015322 + latest CO upstream code
How reproducible:
Always

Steps to Reproduce:
1. Enable debug mode on the default-auto-apply ScanSetting:
$ oc patch ss default-auto-apply -p '{"debug":true}' --type='merge'
scansetting.compliance.openshift.io/default-auto-apply patched
2. Create a tailored profile that enables only rule ocp4-kubelet-configure-tls-cipher-suites:
$ oc create -f - << EOF
apiVersion: compliance.openshift.io/v1alpha1
kind: TailoredProfile
metadata:
  name: test-node                        
  namespace: openshift-compliance
spec:                                         
  description: set value for ocp4-kubelet-configure-tls-cipher-suites
  title: set value for ocp4-kubelet-configure-tls-cipher-suites
  enableRules:
    - name: ocp4-kubelet-configure-tls-cipher-suites
      rationale: Node
EOF
tailoredprofile.compliance.openshift.io/test-node created
3. Create a ScanSettingBinding with the tailored profile and the default-auto-apply ScanSetting:
 $ oc create -f - << EOF
apiVersion: compliance.openshift.io/v1alpha1
kind: ScanSettingBinding
metadata:
  name: ocp4-kubelet-configure-tls-cipher-test
profiles:
  - apiGroup: compliance.openshift.io/v1alpha1
    kind: TailoredProfile
    name: test-node
settingsRef:
  apiGroup: compliance.openshift.io/v1alpha1
  kind: ScanSetting
  name: default-auto-apply
EOF
scansettingbinding.compliance.openshift.io/ocp4-kubelet-configure-tls-cipher-test created


4. Check the scan result and the remediation result.

Actual results:
The scan was triggered successfully, the MC was created, and the MCPs were paused. However, no remediation is applied. The MC data can only be decoded with a base64 decoder, not a URL decoder.
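The decoder mismatch can be checked by hand. A minimal sketch follows; the `oc` command is shown only as a comment (it assumes the cluster from this report), and the demonstration itself runs on a literal payload built in the snippet:

```shell
# On the affected cluster the file source would come from:
#   oc get mc 99-worker-generated-kubelet \
#     -o jsonpath='{.spec.config.storage.files[0].contents.source}'
# That string starts with "data:text/plain;charset=utf-8;base64,", so only
# a base64 decoder recovers the kubelet.conf JSON. Demonstrated on a
# literal payload here:
payload=$(printf '{"tlsMinVersion":"VersionTLS12"}' | base64)
src="data:text/plain;charset=utf-8;base64,${payload}"
printf '%s' "${src#*base64,}" | base64 -d; echo
```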


# oc get suite
NAME                                     PHASE   RESULT
ocp4-kubelet-configure-tls-cipher-test   DONE    NON-COMPLIANT
$ oc get mc
NAME                                               GENERATEDBYCONTROLLER                      IGNITIONVERSION   AGE
00-master                                          4b065b8da3741daae8002fd207f181586e46096e   3.2.0             118m
00-worker                                          4b065b8da3741daae8002fd207f181586e46096e   3.2.0             118m
01-master-container-runtime                        4b065b8da3741daae8002fd207f181586e46096e   3.2.0             118m
01-master-kubelet                                  4b065b8da3741daae8002fd207f181586e46096e   3.2.0             118m
01-worker-container-runtime                        4b065b8da3741daae8002fd207f181586e46096e   3.2.0             118m
01-worker-kubelet                                  4b065b8da3741daae8002fd207f181586e46096e   3.2.0             118m
99-master-fips                                                                                3.2.0             123m
99-master-generated-kubelet                        4b065b8da3741daae8002fd207f181586e46096e   3.2.0             53m
99-master-generated-registries                     4b065b8da3741daae8002fd207f181586e46096e   3.2.0             118m
99-master-ssh                                                                                 3.2.0             123m
99-worker-fips                                                                                3.2.0             123m
99-worker-generated-kubelet                        4b065b8da3741daae8002fd207f181586e46096e   3.2.0             53m
99-worker-generated-registries                     4b065b8da3741daae8002fd207f181586e46096e   3.2.0             118m
99-worker-ssh                                                                                 3.2.0             123m
rendered-master-003986438a720b5c31f5ce5a14222223   4b065b8da3741daae8002fd207f181586e46096e   3.2.0             118m
rendered-master-2e2cabcb4c0dc68e0d5f65ef299351f4   4b065b8da3741daae8002fd207f181586e46096e   3.2.0             97m
rendered-master-a2837a394800413eceab6849b02400dd   4b065b8da3741daae8002fd207f181586e46096e   3.2.0             52m
rendered-worker-63ba8511634f668c087b1925abc7df18   4b065b8da3741daae8002fd207f181586e46096e   3.2.0             118m
rendered-worker-7957dbecb967c03d7d8923a9a249f7e4   4b065b8da3741daae8002fd207f181586e46096e   3.2.0             97m
rendered-worker-dbbc92d0c39f24c116bda9c31e25cd4e   4b065b8da3741daae8002fd207f181586e46096e   3.2.0             53m
$ oc get mcp
NAME     CONFIG                                             UPDATED   UPDATING   DEGRADED   MACHINECOUNT   READYMACHINECOUNT   UPDATEDMACHINECOUNT   DEGRADEDMACHINECOUNT   AGE
master   rendered-master-2e2cabcb4c0dc68e0d5f65ef299351f4   False     False      False      3              0                   0                     0                      120m
worker   rendered-worker-7957dbecb967c03d7d8923a9a249f7e4   False     False      False      3              0                   0                     0                      120m
$ oc get mc 99-worker-generated-kubelet -o=jsonpath={.spec.config.storage.files} | jq -r
[
  {
    "contents": {
      "compression": "",
      "source": "data:text/plain;charset=utf-8;base64,ewogICJraW5kIjogIkt1YmVsZXRDb25maWd1cmF0aW9uIiwKICAiYXBpVmVyc2lvbiI6ICJrdWJlbGV0LmNvbmZpZy5rOHMuaW8vdjFiZXRhMSIsCiAgInN0YXRpY1BvZFBhdGgiOiAiL2V0Yy9rdWJlcm5ldGVzL21hbmlmZXN0cyIsCiAgInN5bmNGcmVxdWVuY3kiOiAiMHMiLAogICJmaWxlQ2hlY2tGcmVxdWVuY3kiOiAiMHMiLAogICJodHRwQ2hlY2tGcmVxdWVuY3kiOiAiMHMiLAogICJ0bHNDaXBoZXJTdWl0ZXMiOiBbCiAgICAiVExTX0VDREhFX1JTQV9XSVRIX0FFU18yNTZfR0NNX1NIQTM4NCIsCiAgICAiVExTX0VDREhFX0VDRFNBX1dJVEhfQUVTXzI1Nl9HQ01fU0hBMzg0IiwKICAgICJUTFNfRUNESEVfUlNBX1dJVEhfQUVTXzEyOF9HQ01fU0hBMjU2IiwKICAgICJUTFNfRUNESEVfRUNEU0FfV0lUSF9BRVNfMTI4X0dDTV9TSEEyNTYiCiAgXSwKICAidGxzTWluVmVyc2lvbiI6ICJWZXJzaW9uVExTMTIiLAogICJyb3RhdGVDZXJ0aWZpY2F0ZXMiOiB0cnVlLAogICJzZXJ2ZXJUTFNCb290c3RyYXAiOiB0cnVlLAogICJhdXRoZW50aWNhdGlvbiI6IHsKICAgICJ4NTA5IjogewogICAgICAiY2xpZW50Q0FGaWxlIjogIi9ldGMva3ViZXJuZXRlcy9rdWJlbGV0LWNhLmNydCIKICAgIH0sCiAgICAid2ViaG9vayI6IHsKICAgICAgImNhY2hlVFRMIjogIjBzIgogICAgfSwKICAgICJhbm9ueW1vdXMiOiB7CiAgICAgICJlbmFibGVkIjogZmFsc2UKICAgIH0KICB9LAogICJhdXRob3JpemF0aW9uIjogewogICAgIndlYmhvb2siOiB7CiAgICAgICJjYWNoZUF1dGhvcml6ZWRUVEwiOiAiMHMiLAogICAgICAiY2FjaGVVbmF1dGhvcml6ZWRUVEwiOiAiMHMiCiAgICB9CiAgfSwKICAiY2x1c3RlckRvbWFpbiI6ICJjbHVzdGVyLmxvY2FsIiwKICAiY2x1c3RlckROUyI6IFsKICAgICIxNzIuMzAuMC4xMCIKICBdLAogICJzdHJlYW1pbmdDb25uZWN0aW9uSWRsZVRpbWVvdXQiOiAiMHMiLAogICJub2RlU3RhdHVzVXBkYXRlRnJlcXVlbmN5IjogIjBzIiwKICAibm9kZVN0YXR1c1JlcG9ydEZyZXF1ZW5jeSI6ICIwcyIsCiAgImltYWdlTWluaW11bUdDQWdlIjogIjBzIiwKICAidm9sdW1lU3RhdHNBZ2dQZXJpb2QiOiAiMHMiLAogICJzeXN0ZW1DZ3JvdXBzIjogIi9zeXN0ZW0uc2xpY2UiLAogICJjZ3JvdXBSb290IjogIi8iLAogICJjZ3JvdXBEcml2ZXIiOiAic3lzdGVtZCIsCiAgImNwdU1hbmFnZXJSZWNvbmNpbGVQZXJpb2QiOiAiMHMiLAogICJydW50aW1lUmVxdWVzdFRpbWVvdXQiOiAiMHMiLAogICJtYXhQb2RzIjogMjUwLAogICJwb2RQaWRzTGltaXQiOiA0MDk2LAogICJrdWJlQVBJUVBTIjogNTAsCiAgImt1YmVBUElCdXJzdCI6IDEwMCwKICAic2VyaWFsaXplSW1hZ2VQdWxscyI6IGZhbHNlLAogICJldmljdGlvblByZXNzdXJlVHJhbnNpdGlvblBlcmlvZCI6ICIwcyIsCiAgImZlYXR1cmVHYXRlcyI6IHsKICAgICJBUElQcmlvcml0eUFuZEZhaXJuZXNzIj
ogdHJ1ZSwKICAgICJDU0lNaWdyYXRpb25BV1MiOiBmYWxzZSwKICAgICJDU0lNaWdyYXRpb25BenVyZUZpbGUiOiBmYWxzZSwKICAgICJDU0lNaWdyYXRpb25HQ0UiOiBmYWxzZSwKICAgICJDU0lNaWdyYXRpb252U3BoZXJlIjogZmFsc2UsCiAgICAiRG93bndhcmRBUElIdWdlUGFnZXMiOiB0cnVlLAogICAgIlBvZFNlY3VyaXR5IjogdHJ1ZSwKICAgICJSb3RhdGVLdWJlbGV0U2VydmVyQ2VydGlmaWNhdGUiOiB0cnVlCiAgfSwKICAibWVtb3J5U3dhcCI6IHt9LAogICJjb250YWluZXJMb2dNYXhTaXplIjogIjUwTWkiLAogICJzeXN0ZW1SZXNlcnZlZCI6IHsKICAgICJlcGhlbWVyYWwtc3RvcmFnZSI6ICIxR2kiCiAgfSwKICAibG9nZ2luZyI6IHsKICAgICJmbHVzaEZyZXF1ZW5jeSI6IDAsCiAgICAidmVyYm9zaXR5IjogMCwKICAgICJvcHRpb25zIjogewogICAgICAianNvbiI6IHsKICAgICAgICAiaW5mb0J1ZmZlclNpemUiOiAiMCIKICAgICAgfQogICAgfQogIH0sCiAgInNodXRkb3duR3JhY2VQZXJpb2QiOiAiMHMiLAogICJzaHV0ZG93bkdyYWNlUGVyaW9kQ3JpdGljYWxQb2RzIjogIjBzIgp9Cg=="
    },
    "overwrite": true,
    "path": "/etc/kubernetes/kubelet.conf"
  }
]
$ cat must-gather.local.1490530716913679833/quay-io-compliance-operator-must-gather-sha256-9a995dd0583444e74b92c1ee4f7e049627e0655d721fb0608469b69ab47c3c2d/openshift-compliance/pods/compliance-operator-576d6c5fb-pppkw.log | tail -n 10
{"level":"info","ts":1651814895.155579,"logger":"suitectrl","msg":"Reconciling ComplianceSuite","Request.Namespace":"openshift-compliance","Request.Name":"ocp4-kubelet-configure-tls-cipher-test"}
{"level":"info","ts":1651814895.1556447,"logger":"suitectrl","msg":"Not updating scan, the phase is the same","Request.Namespace":"openshift-compliance","Request.Name":"ocp4-kubelet-configure-tls-cipher-test","ComplianceScan.Name":"test-node-master","ComplianceScan.Phase":"DONE"}
{"level":"info","ts":1651814895.1556854,"logger":"suitectrl","msg":"Generating events for suite","Request.Namespace":"openshift-compliance","Request.Name":"ocp4-kubelet-configure-tls-cipher-test"}
{"level":"info","ts":1651814895.1557424,"logger":"suitectrl","msg":"Scan is up to date","Request.Namespace":"openshift-compliance","Request.Name":"ocp4-kubelet-configure-tls-cipher-test","ComplianceScan.Name":"test-node-master"}
{"level":"info","ts":1651814895.1557584,"logger":"suitectrl","msg":"Not updating scan, the phase is the same","Request.Namespace":"openshift-compliance","Request.Name":"ocp4-kubelet-configure-tls-cipher-test","ComplianceScan.Name":"test-node-worker","ComplianceScan.Phase":"DONE"}
{"level":"info","ts":1651814895.1557653,"logger":"suitectrl","msg":"Generating events for suite","Request.Namespace":"openshift-compliance","Request.Name":"ocp4-kubelet-configure-tls-cipher-test"}
{"level":"info","ts":1651814895.1557899,"logger":"suitectrl","msg":"Scan is up to date","Request.Namespace":"openshift-compliance","Request.Name":"ocp4-kubelet-configure-tls-cipher-test","ComplianceScan.Name":"test-node-worker"}
{"level":"info","ts":1651814895.169528,"logger":"suitectrl","msg":"All scans are in Done phase. Post-processing remediations","Request.Namespace":"openshift-compliance","Request.Name":"ocp4-kubelet-configure-tls-cipher-test"}
{"level":"error","ts":1651814895.1696122,"logger":"suitectrl","msg":"Retriable error","Request.Namespace":"openshift-compliance","Request.Name":"ocp4-kubelet-configure-tls-cipher-test","error":"encoded kubeletconfig 99-worker-generated-kubelet does not contain data:text/plain, prefix","stacktrace":"github.com/openshift/compliance-operator/pkg/controller/compliancesuite.(*ReconcileComplianceSuite).Reconcile\n\t/go/src/github.com/openshift/compliance-operator/pkg/controller/compliancesuite/compliancesuite_controller.go:181\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).reconcileHandler\n\t/go/src/github.com/openshift/compliance-operator/vendor/sigs.k8s.io/controller-runtime/pkg/internal/controller/controller.go:235\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).processNextWorkItem\n\t/go/src/github.com/openshift/compliance-operator/vendor/sigs.k8s.io/controller-runtime/pkg/internal/controller/controller.go:209\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).worker\n\t/go/src/github.com/openshift/compliance-operator/vendor/sigs.k8s.io/controller-runtime/pkg/internal/controller/controller.go:188\nk8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1\n\t/go/src/github.com/openshift/compliance-operator/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:155\nk8s.io/apimachinery/pkg/util/wait.BackoffUntil\n\t/go/src/github.com/openshift/compliance-operator/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:156\nk8s.io/apimachinery/pkg/util/wait.JitterUntil\n\t/go/src/github.com/openshift/compliance-operator/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:133\nk8s.io/apimachinery/pkg/util/wait.Until\n\t/go/src/github.com/openshift/compliance-operator/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:90"}
{"level":"error","ts":1651814895.169669,"logger":"controller","msg":"Reconciler error","controller":"compliancesuite-controller","name":"ocp4-kubelet-configure-tls-cipher-test","namespace":"openshift-compliance","error":"encoded kubeletconfig 99-worker-generated-kubelet does not contain data:text/plain, prefix","stacktrace":"sigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).processNextWorkItem\n\t/go/src/github.com/openshift/compliance-operator/vendor/sigs.k8s.io/controller-runtime/pkg/internal/controller/controller.go:209\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).worker\n\t/go/src/github.com/openshift/compliance-operator/vendor/sigs.k8s.io/controller-runtime/pkg/internal/controller/controller.go:188\nk8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1\n\t/go/src/github.com/openshift/compliance-operator/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:155\nk8s.io/apimachinery/pkg/util/wait.BackoffUntil\n\t/go/src/github.com/openshift/compliance-operator/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:156\nk8s.io/apimachinery/pkg/util/wait.JitterUntil\n\t/go/src/github.com/openshift/compliance-operator/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:133\nk8s.io/apimachinery/pkg/util/wait.Until\n\t/go/src/github.com/openshift/compliance-operator/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:90"}

$ oc logs --selector k8s-app=machine-config-operator --all-containers -nopenshift-machine-config-operator
E0506 05:42:51.717908       1 sync.go:723] Error syncing Required MachineConfigPools: "error required pool master is not ready, retrying. Status: (total: 3, ready 0, updated: 0, unavailable: 0, degraded: 0)"
I0506 05:42:51.871459       1 event.go:285] Event(v1.ObjectReference{Kind:"", Namespace:"", Name:"machine-config", UID:"0d3e1975-48ca-44b4-b5cd-eb5a7a1bd7dd", APIVersion:"", ResourceVersion:"", FieldPath:""}): type: 'Warning' reason: 'OperatorDegraded: RequiredPoolsFailed' Failed to resync 4.11.0-0.nightly-2022-05-05-015322 because: error during syncRequiredMachineConfigPools: [timed out waiting for the condition, error required pool master is not ready, retrying. Status: (total: 3, ready 0, updated: 0, unavailable: 0, degraded: 0)]
E0506 05:52:58.671991       1 sync.go:723] Error syncing Required MachineConfigPools: "error required pool master is not ready, retrying. Status: (total: 3, ready 0, updated: 0, unavailable: 0, degraded: 0)"
I0506 05:52:58.826064       1 event.go:285] Event(v1.ObjectReference{Kind:"", Namespace:"", Name:"machine-config", UID:"0d3e1975-48ca-44b4-b5cd-eb5a7a1bd7dd", APIVersion:"", ResourceVersion:"", FieldPath:""}): type: 'Warning' reason: 'OperatorDegraded: RequiredPoolsFailed' Failed to resync 4.11.0-0.nightly-2022-05-05-015322 because: error during syncRequiredMachineConfigPools: [timed out waiting for the condition, error required pool master is not ready, retrying. Status: (total: 3, ready 0, updated: 0, unavailable: 0, degraded: 0)]
E0506 06:03:05.657720       1 sync.go:723] Error syncing Required MachineConfigPools: "error required pool master is not ready, retrying. Status: (total: 3, ready 0, updated: 0, unavailable: 0, degraded: 0)"
I0506 06:03:05.803430       1 event.go:285] Event(v1.ObjectReference{Kind:"", Namespace:"", Name:"machine-config", UID:"0d3e1975-48ca-44b4-b5cd-eb5a7a1bd7dd", APIVersion:"", ResourceVersion:"", FieldPath:""}): type: 'Warning' reason: 'OperatorDegraded: RequiredPoolsFailed' Failed to resync 4.11.0-0.nightly-2022-05-05-015322 because: error during syncRequiredMachineConfigPools: [timed out waiting for the condition, error required pool master is not ready, retrying. Status: (total: 3, ready 0, updated: 0, unavailable: 0, degraded: 0)]
E0506 06:13:12.619371       1 sync.go:723] Error syncing Required MachineConfigPools: "error required pool master is not ready, retrying. Status: (total: 3, ready 0, updated: 0, unavailable: 0, degraded: 0)"
I0506 06:13:12.763462       1 event.go:285] Event(v1.ObjectReference{Kind:"", Namespace:"", Name:"machine-config", UID:"0d3e1975-48ca-44b4-b5cd-eb5a7a1bd7dd", APIVersion:"", ResourceVersion:"", FieldPath:""}): type: 'Warning' reason: 'OperatorDegraded: RequiredPoolsFailed' Failed to resync 4.11.0-0.nightly-2022-05-05-015322 because: error during syncRequiredMachineConfigPools: [timed out waiting for the condition, error required pool master is not ready, retrying. Status: (total: 3, ready 0, updated: 0, unavailable: 0, degraded: 0)]
E0506 06:23:19.563992       1 sync.go:723] Error syncing Required MachineConfigPools: "error required pool master is not ready, retrying. Status: (total: 3, ready 0, updated: 0, unavailable: 0, degraded: 0)"
I0506 06:23:19.710073       1 event.go:285] Event(v1.ObjectReference{Kind:"", Namespace:"", Name:"machine-config", UID:"0d3e1975-48ca-44b4-b5cd-eb5a7a1bd7dd", APIVersion:"", ResourceVersion:"", FieldPath:""}): type: 'Warning' reason: 'OperatorDegraded: RequiredPoolsFailed' Failed to resync 4.11.0-0.nightly-2022-05-05-015322 because: error during syncRequiredMachineConfigPools: [timed out waiting for the condition, error required pool master is not ready, retrying. Status: (total: 3, ready 0, updated: 0, unavailable: 0, degraded: 0)]

Expected results:
The scan is triggered successfully, and the remediation is applied successfully.

Additional info:
The MCPs can be recovered by manually patching them to unpause:
$ oc patch mcp worker --type=merge -p '{"spec": {"paused": false}}'
machineconfigpool.machineconfiguration.openshift.io/worker patched
$ oc patch mcp master --type=merge -p '{"spec": {"paused": false}}'
machineconfigpool.machineconfiguration.openshift.io/master patched

Comment 1 Vincent Shen 2022-05-13 04:56:07 UTC
It seems the issue is caused by https://github.com/openshift/compliance-operator/pull/814; initially we were not expecting the rendered MC to use base64 encoding.
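In other words, the regression is a decoder/encoder mismatch: the rendered MC now carries a base64 payload, while the suite controller still expects a plain data:text/plain, URL-encoded prefix. A hypothetical helper, for illustration only (the operator itself is Go), sketching the two encodings the controller has to accept:

```shell
# decode_mc_source: choose a decoder based on the data: URL prefix.
# Hypothetical helper, not the operator's actual code.
decode_mc_source() {
  case "$1" in
    data:*base64,*)            # new style: base64 payload
      printf '%s' "${1#*base64,}" | base64 -d ;;
    data:text/plain,*|data:,*) # old style: URL-encoded plain text
      # (percent-decoding elided; the sample payload has no escapes)
      printf '%s' "${1#*,}" ;;
    *)
      echo "unrecognized data URL prefix" >&2; return 1 ;;
  esac
}

# Both forms decode to the same KubeletConfiguration snippet:
b64=$(printf '{"kind":"KubeletConfiguration"}' | base64)
decode_mc_source "data:text/plain;charset=utf-8;base64,${b64}"; echo
decode_mc_source 'data:text/plain,{"kind":"KubeletConfiguration"}'; echo
```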

Comment 2 xiyuan 2022-05-16 07:08:18 UTC
Hi Vincent,
The base64 decoder issue occurs on OVN clusters only. However, the MCP pause issue for remediation (it always needs a manual unpause, or it stays stuck in the paused status) is a common issue that can be reproduced easily on all platforms. Could you help raise the priority of this bug? Thanks.

Comment 3 Vincent Shen 2022-05-17 03:27:42 UTC
Hi Xiyuan,

I have suggested a patch PR here: https://github.com/ComplianceAsCode/compliance-operator/pull/36. I will discuss this with the team to see if we can have this in the upcoming release.

Best,
Vincent

Comment 4 xiyuan 2022-05-19 10:48:50 UTC
Verification passed with 4.11.0-0.nightly-2022-05-18-171831 and the latest CO code.
# git log | head
commit fb0f0469cb50a89158b193e928d856f59c0e14b7
Merge: 3a94f273 aa900230
Author: Juan Osorio Robles <jaosorior>
Date:   Tue May 17 12:17:35 2022 +0300

    Merge pull request #36 from Vincent056/bugfix_kc
    
    Bug 2082431: Fix MachineConfig base64 encoding issue on OVN cluster

commit aa9002305b2f857151cdd269190a92315bf1018b

$ make deploy-local
...
deployment.apps/compliance-operator created
role.rbac.authorization.k8s.io/compliance-operator created
clusterrole.rbac.authorization.k8s.io/compliance-operator created
role.rbac.authorization.k8s.io/resultscollector created
role.rbac.authorization.k8s.io/api-resource-collector created
role.rbac.authorization.k8s.io/resultserver created
role.rbac.authorization.k8s.io/remediation-aggregator created
clusterrole.rbac.authorization.k8s.io/remediation-aggregator created
role.rbac.authorization.k8s.io/rerunner created
role.rbac.authorization.k8s.io/profileparser created
clusterrole.rbac.authorization.k8s.io/api-resource-collector created
rolebinding.rbac.authorization.k8s.io/compliance-operator created
clusterrolebinding.rbac.authorization.k8s.io/compliance-operator created
rolebinding.rbac.authorization.k8s.io/resultscollector created
rolebinding.rbac.authorization.k8s.io/remediation-aggregator created
clusterrolebinding.rbac.authorization.k8s.io/remediation-aggregator created
clusterrolebinding.rbac.authorization.k8s.io/api-resource-collector created
rolebinding.rbac.authorization.k8s.io/api-resource-collector created
rolebinding.rbac.authorization.k8s.io/rerunner created
rolebinding.rbac.authorization.k8s.io/profileparser created
rolebinding.rbac.authorization.k8s.io/resultserver created
serviceaccount/compliance-operator created
serviceaccount/resultscollector created
serviceaccount/remediation-aggregator created
serviceaccount/rerunner created
serviceaccount/api-resource-collector created
serviceaccount/profileparser created
serviceaccount/resultserver created
clusterrolebinding.rbac.authorization.k8s.io/compliance-operator-metrics created
clusterrole.rbac.authorization.k8s.io/compliance-operator-metrics created
W0519 15:50:50.428000   25441 warnings.go:70] would violate PodSecurity "restricted:latest": unrestricted capabilities (container "compliance-operator" must set securityContext.capabilities.drop=["ALL"]), runAsNonRoot != true (pod or container "compliance-operator" must set securityContext.runAsNonRoot=true), seccompProfile (pod or container "compliance-operator" must set securityContext.seccompProfile.type to "RuntimeDefault" or "Localhost")
deployment.apps/compliance-operator triggers updated

$  oc get pod
NAME                                              READY   STATUS    RESTARTS      AGE
compliance-operator-8595cc98df-qh2ml              1/1     Running   1 (60m ago)   61m
ocp4-openshift-compliance-pp-7599c78b-642x8       1/1     Running   0             59m
rhcos4-openshift-compliance-pp-758b8f6d54-m4z76   1/1     Running   0             59m
$ oc create -f -<<EOF
> apiVersion: compliance.openshift.io/v1alpha1
> kind: ScanSettingBinding
> metadata:
>   name: my-ssb-r
> profiles:
>   - name: ocp4-moderate-node
>     kind: Profile
>     apiGroup: compliance.openshift.io/v1alpha1
> settingsRef:
>   name: default-auto-apply
>   kind: ScanSetting
>   apiGroup: compliance.openshift.io/v1alpha1
> EOF
scansettingbinding.compliance.openshift.io/my-ssb-r created
$ oc get suite -w
NAME       PHASE       RESULT
my-ssb-r   LAUNCHING   NOT-AVAILABLE
my-ssb-r   LAUNCHING   NOT-AVAILABLE
my-ssb-r   RUNNING     NOT-AVAILABLE
my-ssb-r   RUNNING     NOT-AVAILABLE
my-ssb-r   AGGREGATING   NOT-AVAILABLE
my-ssb-r   AGGREGATING   NOT-AVAILABLE
my-ssb-r   DONE          NON-COMPLIANT
my-ssb-r   DONE          NON-COMPLIANT
 
$ oc get mcp -w
NAME     CONFIG                                             UPDATED   UPDATING   DEGRADED   MACHINECOUNT   READYMACHINECOUNT   UPDATEDMACHINECOUNT   DEGRADEDMACHINECOUNT   AGE
master   rendered-master-16cb661b1f9966208976f789a1ac8941   False     True       False      3              2                   2                     0                      5h33m
worker   rendered-worker-29d920e599951fbb9ab779b6f26468b7   True      False      False      3              3                   3                     0                      5h33m
worker   rendered-worker-29d920e599951fbb9ab779b6f26468b7   True      False      False      3              3                   3                     0                      5h34m
master   rendered-master-16cb661b1f9966208976f789a1ac8941   False     True       False      3              2                   2                     0                      5h34m
worker   rendered-worker-29d920e599951fbb9ab779b6f26468b7   True      False      False      3              3                   3                     0                      5h34m
master   rendered-master-16cb661b1f9966208976f789a1ac8941   False     True       False      3              2                   2                     0                      5h34m
master   rendered-master-16cb661b1f9966208976f789a1ac8941   False     True       False      3              2                   2                     0                      5h35m
worker   rendered-worker-29d920e599951fbb9ab779b6f26468b7   True      False      False      3              3                   3                     0                      5h35m

$ oc compliance rerun-now scansettingbinding my-ssb-r
Rerunning scans from 'my-ssb-r': ocp4-moderate-node-master, ocp4-moderate-node-worker
Re-running scan 'openshift-compliance/ocp4-moderate-node-master'
Re-running scan 'openshift-compliance/ocp4-moderate-node-worker'
$ oc get suite -w
NAME       PHASE       RESULT
my-ssb-r   LAUNCHING   NOT-AVAILABLE
my-ssb-r   LAUNCHING   NOT-AVAILABLE
my-ssb-r   RUNNING     NOT-AVAILABLE
my-ssb-r   RUNNING     NOT-AVAILABLE
my-ssb-r   AGGREGATING   NOT-AVAILABLE
my-ssb-r   AGGREGATING   NOT-AVAILABLE
my-ssb-r   DONE          NON-COMPLIANT
my-ssb-r   DONE          NON-COMPLIANT
$ oc get cr
NAME                                                                                  STATE
ocp4-moderate-node-master-directory-access-var-log-kube-audit                         Applied
ocp4-moderate-node-master-directory-access-var-log-oauth-audit                        Applied
ocp4-moderate-node-master-directory-access-var-log-ocp-audit                          Applied
ocp4-moderate-node-master-kubelet-configure-event-creation                            Applied
ocp4-moderate-node-master-kubelet-configure-tls-cipher-suites                         Applied
ocp4-moderate-node-master-kubelet-enable-iptables-util-chains                         Applied
ocp4-moderate-node-master-kubelet-enable-protect-kernel-defaults                      Applied
ocp4-moderate-node-master-kubelet-enable-protect-kernel-sysctl                        Applied
ocp4-moderate-node-master-kubelet-eviction-thresholds-set-hard-imagefs-available      Applied
ocp4-moderate-node-master-kubelet-eviction-thresholds-set-hard-imagefs-available-1    Applied
ocp4-moderate-node-master-kubelet-eviction-thresholds-set-hard-imagefs-inodesfree     Applied
ocp4-moderate-node-master-kubelet-eviction-thresholds-set-hard-imagefs-inodesfree-1   Applied
ocp4-moderate-node-master-kubelet-eviction-thresholds-set-hard-memory-available       Applied
ocp4-moderate-node-master-kubelet-eviction-thresholds-set-hard-memory-available-1     Applied
ocp4-moderate-node-master-kubelet-eviction-thresholds-set-hard-nodefs-available       Applied
ocp4-moderate-node-master-kubelet-eviction-thresholds-set-hard-nodefs-available-1     Applied
ocp4-moderate-node-master-kubelet-eviction-thresholds-set-hard-nodefs-inodesfree      Applied
ocp4-moderate-node-master-kubelet-eviction-thresholds-set-hard-nodefs-inodesfree-1    Applied
ocp4-moderate-node-master-kubelet-eviction-thresholds-set-soft-imagefs-available      Applied
ocp4-moderate-node-master-kubelet-eviction-thresholds-set-soft-imagefs-available-1    Applied
ocp4-moderate-node-master-kubelet-eviction-thresholds-set-soft-imagefs-available-2    Applied
ocp4-moderate-node-master-kubelet-eviction-thresholds-set-soft-imagefs-inodesfree     Applied
ocp4-moderate-node-master-kubelet-eviction-thresholds-set-soft-imagefs-inodesfree-1   Applied
ocp4-moderate-node-master-kubelet-eviction-thresholds-set-soft-imagefs-inodesfree-2   Applied
ocp4-moderate-node-master-kubelet-eviction-thresholds-set-soft-memory-available       Applied
ocp4-moderate-node-master-kubelet-eviction-thresholds-set-soft-memory-available-1     Applied
ocp4-moderate-node-master-kubelet-eviction-thresholds-set-soft-memory-available-2     Applied
ocp4-moderate-node-master-kubelet-eviction-thresholds-set-soft-nodefs-available       Applied
ocp4-moderate-node-master-kubelet-eviction-thresholds-set-soft-nodefs-available-1     Applied
ocp4-moderate-node-master-kubelet-eviction-thresholds-set-soft-nodefs-available-2     Applied
ocp4-moderate-node-master-kubelet-eviction-thresholds-set-soft-nodefs-inodesfree      Applied
ocp4-moderate-node-master-kubelet-eviction-thresholds-set-soft-nodefs-inodesfree-1    Applied
ocp4-moderate-node-master-kubelet-eviction-thresholds-set-soft-nodefs-inodesfree-2    Applied
ocp4-moderate-node-worker-kubelet-configure-event-creation                            Applied
ocp4-moderate-node-worker-kubelet-configure-tls-cipher-suites                         Applied
ocp4-moderate-node-worker-kubelet-enable-iptables-util-chains                         Applied
ocp4-moderate-node-worker-kubelet-enable-protect-kernel-defaults                      Applied
ocp4-moderate-node-worker-kubelet-enable-protect-kernel-sysctl                        Applied
ocp4-moderate-node-worker-kubelet-eviction-thresholds-set-hard-imagefs-available      Applied
ocp4-moderate-node-worker-kubelet-eviction-thresholds-set-hard-imagefs-available-1    Applied
ocp4-moderate-node-worker-kubelet-eviction-thresholds-set-hard-imagefs-inodesfree     Applied
ocp4-moderate-node-worker-kubelet-eviction-thresholds-set-hard-imagefs-inodesfree-1   Applied
ocp4-moderate-node-worker-kubelet-eviction-thresholds-set-hard-memory-available       Applied
ocp4-moderate-node-worker-kubelet-eviction-thresholds-set-hard-memory-available-1     Applied
ocp4-moderate-node-worker-kubelet-eviction-thresholds-set-hard-nodefs-available       Applied
ocp4-moderate-node-worker-kubelet-eviction-thresholds-set-hard-nodefs-available-1     Applied
ocp4-moderate-node-worker-kubelet-eviction-thresholds-set-hard-nodefs-inodesfree      Applied
ocp4-moderate-node-worker-kubelet-eviction-thresholds-set-hard-nodefs-inodesfree-1    Applied
ocp4-moderate-node-worker-kubelet-eviction-thresholds-set-soft-imagefs-available      Applied
ocp4-moderate-node-worker-kubelet-eviction-thresholds-set-soft-imagefs-available-1    Applied
ocp4-moderate-node-worker-kubelet-eviction-thresholds-set-soft-imagefs-available-2    Applied
ocp4-moderate-node-worker-kubelet-eviction-thresholds-set-soft-imagefs-inodesfree     Applied
ocp4-moderate-node-worker-kubelet-eviction-thresholds-set-soft-imagefs-inodesfree-1   Applied
ocp4-moderate-node-worker-kubelet-eviction-thresholds-set-soft-imagefs-inodesfree-2   Applied
ocp4-moderate-node-worker-kubelet-eviction-thresholds-set-soft-memory-available       Applied
ocp4-moderate-node-worker-kubelet-eviction-thresholds-set-soft-memory-available-1     Applied
ocp4-moderate-node-worker-kubelet-eviction-thresholds-set-soft-memory-available-2     Applied
ocp4-moderate-node-worker-kubelet-eviction-thresholds-set-soft-nodefs-available       Applied
ocp4-moderate-node-worker-kubelet-eviction-thresholds-set-soft-nodefs-available-1     Applied
ocp4-moderate-node-worker-kubelet-eviction-thresholds-set-soft-nodefs-available-2     Applied
ocp4-moderate-node-worker-kubelet-eviction-thresholds-set-soft-nodefs-inodesfree      Applied
ocp4-moderate-node-worker-kubelet-eviction-thresholds-set-soft-nodefs-inodesfree-1    Applied
ocp4-moderate-node-worker-kubelet-eviction-thresholds-set-soft-nodefs-inodesfree-2    Applied

$ oc logs pod/compliance-operator-8595cc98df-h8xpr --all-containers | grep -i error
{"level":"info","ts":1652956208.4256825,"logger":"metrics","msg":"Registering metric: compliance_scan_error_total"}
{"level":"error","ts":1652956584.335884,"logger":"suitectrl","msg":"Could not pause pool","Request.Namespace":"openshift-compliance","Request.Name":"my-ssb-r","MachineConfigPool.Name":"worker","error":"Operation cannot be fulfilled on machineconfigpools.machineconfiguration.openshift.io \"worker\": the object has been modified; please apply your changes to the latest version and try again","stacktrace":"github.com/openshift/compliance-operator/pkg/controller/compliancesuite.(*ReconcileComplianceSuite).applyRemediation\n\t/go/src/github.com/openshift/compliance-operator/pkg/controller/compliancesuite/compliancesuite_controller.go:578\ngithub.com/openshift/compliance-operator/pkg/controller/compliancesuite.(*ReconcileComplianceSuite).reconcileRemediations\n\t/go/src/github.com/openshift/compliance-operator/pkg/controller/compliancesuite/compliancesuite_controller.go:486\ngithub.com/openshift/compliance-operator/pkg/controller/compliancesuite.(*ReconcileComplianceSuite).Reconcile\n\t/go/src/github.com/openshift/compliance-operator/pkg/controller/compliancesuite/compliancesuite_controller.go:180\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).reconcileHandler\n\t/go/src/github.com/openshift/compliance-operator/vendor/sigs.k8s.io/controller-runtime/pkg/internal/controller/controller.go:235\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).processNextWorkItem\n\t/go/src/github.com/openshift/compliance-operator/vendor/sigs.k8s.io/controller-runtime/pkg/internal/controller/controller.go:209\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).worker\n\t/go/src/github.com/openshift/compliance-operator/vendor/sigs.k8s.io/controller-runtime/pkg/internal/controller/controller.go:188\nk8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1\n\t/go/src/github.com/openshift/compliance-operator/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:155\nk8s.io/apimachinery/pkg/util/wait.BackoffUntil\n\t/go/src/github.com/openshift/co
mpliance-operator/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:156\nk8s.io/apimachinery/pkg/util/wait.JitterUntil\n\t/go/src/github.com/openshift/compliance-operator/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:133\nk8s.io/apimachinery/pkg/util/wait.Until\n\t/go/src/github.com/openshift/compliance-operator/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:90"}
{"level":"error","ts":1652956584.340368,"logger":"suitectrl","msg":"Retriable error","Request.Namespace":"openshift-compliance","Request.Name":"my-ssb-r","error":"Operation cannot be fulfilled on machineconfigpools.machineconfiguration.openshift.io \"worker\": the object has been modified; please apply your changes to the latest version and try again","stacktrace":"github.com/openshift/compliance-operator/pkg/controller/compliancesuite.(*ReconcileComplianceSuite).Reconcile\n\t/go/src/github.com/openshift/compliance-operator/pkg/controller/compliancesuite/compliancesuite_controller.go:181\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).reconcileHandler\n\t/go/src/github.com/openshift/compliance-operator/vendor/sigs.k8s.io/controller-runtime/pkg/internal/controller/controller.go:235\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).processNextWorkItem\n\t/go/src/github.com/openshift/compliance-operator/vendor/sigs.k8s.io/controller-runtime/pkg/internal/controller/controller.go:209\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).worker\n\t/go/src/github.com/openshift/compliance-operator/vendor/sigs.k8s.io/controller-runtime/pkg/internal/controller/controller.go:188\nk8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1\n\t/go/src/github.com/openshift/compliance-operator/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:155\nk8s.io/apimachinery/pkg/util/wait.BackoffUntil\n\t/go/src/github.com/openshift/compliance-operator/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:156\nk8s.io/apimachinery/pkg/util/wait.JitterUntil\n\t/go/src/github.com/openshift/compliance-operator/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:133\nk8s.io/apimachinery/pkg/util/wait.Until\n\t/go/src/github.com/openshift/compliance-operator/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:90"}
{"level":"error","ts":1652956584.340567,"logger":"controller","msg":"Reconciler error","controller":"compliancesuite-controller","name":"my-ssb-r","namespace":"openshift-compliance","error":"Operation cannot be fulfilled on machineconfigpools.machineconfiguration.openshift.io \"worker\": the object has been modified; please apply your changes to the latest version and try again","stacktrace":"sigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).processNextWorkItem\n\t/go/src/github.com/openshift/compliance-operator/vendor/sigs.k8s.io/controller-runtime/pkg/internal/controller/controller.go:209\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).worker\n\t/go/src/github.com/openshift/compliance-operator/vendor/sigs.k8s.io/controller-runtime/pkg/internal/controller/controller.go:188\nk8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1\n\t/go/src/github.com/openshift/compliance-operator/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:155\nk8s.io/apimachinery/pkg/util/wait.BackoffUntil\n\t/go/src/github.com/openshift/compliance-operator/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:156\nk8s.io/apimachinery/pkg/util/wait.JitterUntil\n\t/go/src/github.com/openshift/compliance-operator/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:133\nk8s.io/apimachinery/pkg/util/wait.Until\n\t/go/src/github.com/openshift/compliance-operator/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:90"}
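The "Operation cannot be fulfilled ... the object has been modified" messages above are Kubernetes optimistic-concurrency conflicts, which the suite controller treats as retriable rather than fatal. A generic sketch of that retry pattern (the names `Conflict`, `update_with_retry`, and `try_update` are illustrative, not operator APIs):

```python
import random
import time

class Conflict(Exception):
    """Stand-in for a 409 conflict on a stale resourceVersion."""
    pass

def update_with_retry(try_update, retries=5, base_delay=0.01):
    # Re-attempt the update after a conflict; a real controller would
    # re-read the latest object version before each retry.
    for attempt in range(retries):
        try:
            return try_update()
        except Conflict:
            # Jittered exponential backoff between attempts.
            time.sleep(base_delay * (2 ** attempt) * random.random())
    raise RuntimeError("exhausted retries on conflict")

# Simulate two conflicting writers before the pause succeeds.
state = {"failures": 2}
def try_update():
    if state["failures"] > 0:
        state["failures"] -= 1
        raise Conflict("object has been modified")
    return "paused"

print(update_with_retry(try_update))  # paused
```

This is why the log labels the second message "Retriable error": the reconcile is simply requeued and the pool pause eventually goes through.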

Comment 7 xiyuan 2022-05-26 11:14:43 UTC
Verification passed with 4.11.0-0.nightly-2022-05-25-193227 and compliance-operator.v0.1.52.
$ oc get ip
NAME            CSV                           APPROVAL    APPROVED
install-prbqr   compliance-operator.v0.1.52   Automatic   true
$ oc get csv
NAME                           DISPLAY                            VERSION   REPLACES   PHASE
compliance-operator.v0.1.52    Compliance Operator                0.1.52               Succeeded
elasticsearch-operator.5.4.2   OpenShift Elasticsearch Operator   5.4.2                Succeeded


1. Create a ScanSettingBinding (ssb) with a TailoredProfile:
$ oc create -f - << EOF
> apiVersion: compliance.openshift.io/v1alpha1
> kind: TailoredProfile
> metadata:
>   name: test-node                        
>   namespace: openshift-compliance
> spec:                                         
>   description: set value for ocp4-kubelet-configure-tls-cipher-suites
>   title: set value for ocp4-kubelet-configure-tls-cipher-suites
>   enableRules:
>     - name: ocp4-kubelet-configure-tls-cipher-suites
>       rationale: Node
> EOF
tailoredprofile.compliance.openshift.io/test-node created
$ oc create -f - << EOF
> apiVersion: compliance.openshift.io/v1alpha1
> kind: ScanSettingBinding
> metadata:
>   name: ocp4-kubelet-configure-tls-cipher-test
> profiles:
>   - apiGroup: compliance.openshift.io/v1alpha1
>     kind: TailoredProfile
>     name: test-node
> settingsRef:
>   apiGroup: compliance.openshift.io/v1alpha1
>   kind: ScanSetting
>   name: default-auto-apply
> EOF
scansettingbinding.compliance.openshift.io/ocp4-kubelet-configure-tls-cipher-test created
$ oc get suite -w
NAME                                     PHASE     RESULT
ocp4-kubelet-configure-tls-cipher-test   RUNNING   NOT-AVAILABLE
ocp4-kubelet-configure-tls-cipher-test   RUNNING   NOT-AVAILABLE
ocp4-kubelet-configure-tls-cipher-test   AGGREGATING   NOT-AVAILABLE
ocp4-kubelet-configure-tls-cipher-test   AGGREGATING   NOT-AVAILABLE
ocp4-kubelet-configure-tls-cipher-test   DONE          NON-COMPLIANT
ocp4-kubelet-configure-tls-cipher-test   DONE          NON-COMPLIANT

$ oc get mc 99-worker-generated-kubelet -o=jsonpath={.spec.config.storage.files[0].contents.source}
data:text/plain;charset=utf-8;base64,ewogICJraW5kIjogIkt1YmVsZXRDb25maWd1cmF0aW9uIiwKICAiYXBpVmVyc2lvbiI6ICJrdWJlbGV0LmNvbmZpZy5rOHMuaW8vdjFiZXRhMSIsCiAgInN0YXRpY1BvZFBhdGgiOiAiL2V0Yy9rdWJlcm5ldGVzL21hbmlmZXN0cyIsCiAgInN5bmNGcmVxdWVuY3kiOiAiMHMiLAogICJmaWxlQ2hlY2tGcmVxdWVuY3kiOiAiMHMiLAogICJodHRwQ2hlY2tGcmVxdWVuY3kiOiAiMHMiLAogICJ0bHNDaXBoZXJTdWl0ZXMiOiBbCiAgICAiVExTX0VDREhFX1JTQV9XSVRIX0FFU18yNTZfR0NNX1NIQTM4NCIsCiAgICAiVExTX0VDREhFX0VDRFNBX1dJVEhfQUVTXzI1Nl9HQ01fU0hBMzg0IiwKICAgICJUTFNfRUNESEVfUlNBX1dJVEhfQUVTXzEyOF9HQ01fU0hBMjU2IiwKICAgICJUTFNfRUNESEVfRUNEU0FfV0lUSF9BRVNfMTI4X0dDTV9TSEEyNTYiCiAgXSwKICAidGxzTWluVmVyc2lvbiI6ICJWZXJzaW9uVExTMTIiLAogICJyb3RhdGVDZXJ0aWZpY2F0ZXMiOiB0cnVlLAogICJzZXJ2ZXJUTFNCb290c3RyYXAiOiB0cnVlLAogICJhdXRoZW50aWNhdGlvbiI6IHsKICAgICJ4NTA5IjogewogICAgICAiY2xpZW50Q0FGaWxlIjogIi9ldGMva3ViZXJuZXRlcy9rdWJlbGV0LWNhLmNydCIKICAgIH0sCiAgICAid2ViaG9vayI6IHsKICAgICAgImNhY2hlVFRMIjogIjBzIgogICAgfSwKICAgICJhbm9ueW1vdXMiOiB7CiAgICAgICJlbmFibGVkIjogZmFsc2UKICAgIH0KICB9LAogICJhdXRob3JpemF0aW9uIjogewogICAgIndlYmhvb2siOiB7CiAgICAgICJjYWNoZUF1dGhvcml6ZWRUVEwiOiAiMHMiLAogICAgICAiY2FjaGVVbmF1dGhvcml6ZWRUVEwiOiAiMHMiCiAgICB9CiAgfSwKICAiY2x1c3RlckRvbWFpbiI6ICJjbHVzdGVyLmxvY2FsIiwKICAiY2x1c3RlckROUyI6IFsKICAgICIxNzIuMzAuMC4xMCIKICBdLAogICJzdHJlYW1pbmdDb25uZWN0aW9uSWRsZVRpbWVvdXQiOiAiMHMiLAogICJub2RlU3RhdHVzVXBkYXRlRnJlcXVlbmN5IjogIjBzIiwKICAibm9kZVN0YXR1c1JlcG9ydEZyZXF1ZW5jeSI6ICIwcyIsCiAgImltYWdlTWluaW11bUdDQWdlIjogIjBzIiwKICAidm9sdW1lU3RhdHNBZ2dQZXJpb2QiOiAiMHMiLAogICJzeXN0ZW1DZ3JvdXBzIjogIi9zeXN0ZW0uc2xpY2UiLAogICJjZ3JvdXBSb290IjogIi8iLAogICJjZ3JvdXBEcml2ZXIiOiAic3lzdGVtZCIsCiAgImNwdU1hbmFnZXJSZWNvbmNpbGVQZXJpb2QiOiAiMHMiLAogICJydW50aW1lUmVxdWVzdFRpbWVvdXQiOiAiMHMiLAogICJtYXhQb2RzIjogMjUwLAogICJwb2RQaWRzTGltaXQiOiA0MDk2LAogICJrdWJlQVBJUVBTIjogNTAsCiAgImt1YmVBUElCdXJzdCI6IDEwMCwKICAic2VyaWFsaXplSW1hZ2VQdWxscyI6IGZhbHNlLAogICJldmljdGlvblByZXNzdXJlVHJhbnNpdGlvblBlcmlvZCI6ICIwcyIsCiAgImZlYXR1cmVHYXRlcyI6IHsKICAgICJBUElQcmlvcml0eUFuZEZhaXJuZXNzIjogdHJ1ZSwKICAgICJ
DU0lNaWdyYXRpb25BV1MiOiBmYWxzZSwKICAgICJDU0lNaWdyYXRpb25BenVyZUZpbGUiOiBmYWxzZSwKICAgICJDU0lNaWdyYXRpb25HQ0UiOiBmYWxzZSwKICAgICJDU0lNaWdyYXRpb252U3BoZXJlIjogZmFsc2UsCiAgICAiRG93bndhcmRBUElIdWdlUGFnZXMiOiB0cnVlLAogICAgIlBvZFNlY3VyaXR5IjogdHJ1ZSwKICAgICJSb3RhdGVLdWJlbGV0U2VydmVyQ2VydGlmaWNhdGUiOiB0cnVlCiAgfSwKICAibWVtb3J5U3dhcCI6IHt9LAogICJjb250YWluZXJMb2dNYXhTaXplIjogIjUwTWkiLAogICJzeXN0ZW1SZXNlcnZlZCI6IHsKICAgICJlcGhlbWVyYWwtc3RvcmFnZSI6ICIxR2kiCiAgfSwKICAibG9nZ2luZyI6IHsKICAgICJmbHVzaEZyZXF1ZW5jeSI6IDAsCiAgICAidmVyYm9zaXR5IjogMCwKICAgICJvcHRpb25zIjogewogICAgICAianNvbiI6IHsKICAgICAgICAiaW5mb0J1ZmZlclNpemUiOiAiMCIKICAgICAgfQogICAgfQogIH0sCiAgInNodXRkb3duR3JhY2VQZXJpb2QiOiAiMHMiLAogICJzaHV0ZG93bkdyYWNlUGVyaW9kQ3JpdGljYWxQb2RzIjogIjBzIgp9Cg==
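The fix is visible in the output above: the file source now carries the `data:text/plain;charset=utf-8;base64,` prefix that the operator's check requires (the original failure was "encoded kubeletconfig ... does not contain data:text/plain, prefix"). A minimal sketch of that prefix check and decode, using a small stand-in payload instead of the full 99-worker-generated-kubelet blob (`decode_kubelet_config` is a hypothetical helper, not operator code):

```python
import base64
import json

EXPECTED_PREFIX = "data:text/plain;charset=utf-8;base64,"

def decode_kubelet_config(source: str) -> dict:
    # Mirror the prefix check that failed on OVN clusters before the fix.
    if not source.startswith("data:text/plain"):
        raise ValueError(
            "encoded kubeletconfig does not contain data:text/plain prefix"
        )
    payload = source.split("base64,", 1)[1]
    return json.loads(base64.b64decode(payload))

# Small stand-in for the real MachineConfig file source.
sample = EXPECTED_PREFIX + base64.b64encode(
    json.dumps({
        "kind": "KubeletConfiguration",
        "tlsCipherSuites": [
            "TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384",
            "TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384",
            "TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256",
            "TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256",
        ],
    }).encode()
).decode()

cfg = decode_kubelet_config(sample)
print(cfg["tlsCipherSuites"][0])  # TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384
```

You can run the same decode against a live cluster by piping the `oc get mc ... -o=jsonpath=...` output shown above through `base64 -d` after stripping the prefix.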


$ oc get mcp -w
NAME     CONFIG                                             UPDATED   UPDATING   DEGRADED   MACHINECOUNT   READYMACHINECOUNT   UPDATEDMACHINECOUNT   DEGRADEDMACHINECOUNT   AGE
master   rendered-master-88653b9010cf401816f02bbd0ca6067e   False     True       False      3              0                   0                     0                      4h21m
worker   rendered-worker-8b12364529c4327d42a8be257d60994b   False     True       False      3              0                   0                     0                      4h21m
worker   rendered-worker-8b12364529c4327d42a8be257d60994b   False     True       False      3              0                   1                     0                      4h23m
worker   rendered-worker-8b12364529c4327d42a8be257d60994b   False     True       False      3              1                   1                     0                      4h23m
...
master   rendered-master-02e4f67b541447fc98be1b4058db1142   True      False      False      3              3                   3                     0                      4h53m
worker   rendered-worker-0e244d0bc1579c3ed29531b1fb49336a   True      False      False      3              3                   3                     0                      4h53m
$ oc compliance rerun-now scansettingbindings ocp4-kubelet-configure-tls-cipher-test
Rerunning scans from 'ocp4-kubelet-configure-tls-cipher-test': test-node-master, test-node-worker
Re-running scan 'openshift-compliance/test-node-master'
Re-running scan 'openshift-compliance/test-node-worker'

$ oc get ccr
NAME                                                   STATUS   SEVERITY
test-node-master-kubelet-configure-tls-cipher-suites   PASS     medium
test-node-worker-kubelet-configure-tls-cipher-suites   PASS     medium
$ oc get ccr test-node-master-kubelet-configure-tls-cipher-suites  -o=jsonpath={.instructions}
Run the following command on the kubelet node(s):
$ sudo grep tlsCipherSuites /etc/kubernetes/kubelet.conf
Verify that the set of ciphers contains only the following:

TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,
TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,
TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,
TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384


$ oc debug node/ip-10-0-152-227.us-east-2.compute.internal -- chroot /host cat /etc/kubernetes/kubelet.conf
W0526 18:57:53.619851     609 warnings.go:70] would violate PodSecurity "restricted:latest": host namespaces (hostNetwork=true, hostPID=true), hostPath volumes (volume "host"), privileged (container "container-00" must not set securityContext.privileged=true), allowPrivilegeEscalation != false (container "container-00" must set securityContext.allowPrivilegeEscalation=false), unrestricted capabilities (container "container-00" must set securityContext.capabilities.drop=["ALL"]), restricted volume types (volume "host" uses restricted volume type "hostPath"), runAsNonRoot != true (pod or container "container-00" must set securityContext.runAsNonRoot=true), runAsUser=0 (container "container-00" must not set runAsUser=0), seccompProfile (pod or container "container-00" must set securityContext.seccompProfile.type to "RuntimeDefault" or "Localhost")
Starting pod/ip-10-0-152-227us-east-2computeinternal-debug ...
To use host binaries, run `chroot /host`
{
  "kind": "KubeletConfiguration",
  "apiVersion": "kubelet.config.k8s.io/v1beta1",
  "staticPodPath": "/etc/kubernetes/manifests",
  "syncFrequency": "0s",
  "fileCheckFrequency": "0s",
  "httpCheckFrequency": "0s",
  "tlsCipherSuites": [
    "TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384",
    "TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384",
    "TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256",
    "TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256"
  ],
  "tlsMinVersion": "VersionTLS12",
  "rotateCertificates": true,
  "serverTLSBootstrap": true,
  "authentication": {
    "x509": {
      "clientCAFile": "/etc/kubernetes/kubelet-ca.crt"
    },
    "webhook": {
      "cacheTTL": "0s"
    },
    "anonymous": {
      "enabled": false
    }
  },
  "authorization": {
    "webhook": {
      "cacheAuthorizedTTL": "0s",
      "cacheUnauthorizedTTL": "0s"
    }
  },
  "clusterDomain": "cluster.local",
  "clusterDNS": [
    "172.30.0.10"
  ],
  "streamingConnectionIdleTimeout": "0s",
  "nodeStatusUpdateFrequency": "0s",
  "nodeStatusReportFrequency": "0s",
  "imageMinimumGCAge": "0s",
  "volumeStatsAggPeriod": "0s",
  "systemCgroups": "/system.slice",
  "cgroupRoot": "/",
  "cgroupDriver": "systemd",
  "cpuManagerReconcilePeriod": "0s",
  "runtimeRequestTimeout": "0s",
  "maxPods": 250,
  "podPidsLimit": 4096,
  "kubeAPIQPS": 50,
  "kubeAPIBurst": 100,
  "serializeImagePulls": false,
  "evictionPressureTransitionPeriod": "0s",
  "featureGates": {
    "APIPriorityAndFairness": true,
    "CSIMigrationAWS": false,
    "CSIMigrationAzureFile": false,
    "CSIMigrationGCE": false,
    "CSIMigrationvSphere": false,
    "DownwardAPIHugePages": true,
    "PodSecurity": true,
    "RotateKubeletServerCertificate": true
  },
  "memorySwap": {},
  "containerLogMaxSize": "50Mi",
  "systemReserved": {
    "ephemeral-storage": "1Gi"
  },
  "logging": {
    "flushFrequency": 0,
    "verbosity": 0,
    "options": {
      "json": {
        "infoBufferSize": "0"
      }
    }
  },
  "shutdownGracePeriod": "0s",
  "shutdownGracePeriodCriticalPods": "0s"
}
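Note that the node's `tlsCipherSuites` order differs from the order in the rule instructions; what matters is set equality. A small sketch of that comparison (`ciphers_compliant` is an assumed helper for illustration, not part of the operator):

```python
# The four suites required by kubelet-configure-tls-cipher-suites.
EXPECTED = {
    "TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256",
    "TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256",
    "TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384",
    "TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384",
}

def ciphers_compliant(configured) -> bool:
    # Compare as sets so ordering differences do not matter, while any
    # extra or missing suite still fails the check.
    return set(configured) == EXPECTED

# Order taken from the node's kubelet.conf above; the sets match.
print(ciphers_compliant([
    "TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384",
    "TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384",
    "TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256",
    "TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256",
]))  # True
```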

Comment 9 errata-xmlrpc 2022-06-06 14:39:50 UTC
Since the problem described in this bug report should be resolved in a recent advisory, it has been closed with a resolution of ERRATA.

For information on the advisory (OpenShift Compliance Operator bug fix update), and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHBA-2022:4657

