Bug 1960554 - Remove rbacv1beta1 handling code
Summary: Remove rbacv1beta1 handling code
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: OpenShift Container Platform
Classification: Red Hat
Component: Cluster Version Operator
Version: 4.8
Hardware: Unspecified
OS: Unspecified
Priority: unspecified
Severity: medium
Target Milestone: ---
Target Release: 4.8.0
Assignee: Vadim Rutkovsky
QA Contact: Yang Yang
URL:
Whiteboard:
Depends On:
Blocks: 1961341
 
Reported: 2021-05-14 08:22 UTC by Vadim Rutkovsky
Modified: 2021-07-27 23:08 UTC
CC: 5 users

Fixed In Version:
Doc Type: No Doc Update
Doc Text:
Clone Of:
Environment:
Last Closed: 2021-07-27 23:08:28 UTC
Target Upstream Version:




Links:
- GitHub openshift/baremetal-operator pull 147 (closed): Bug 1960554: config: use rbacv1 instead of rbacv1beta1 (last updated 2021-05-17 17:47:16 UTC)
- GitHub openshift/cluster-autoscaler-operator pull 205 (closed): Bug 1960554: manifests: use v1 for RBAC (last updated 2021-05-18 19:54:49 UTC)
- GitHub openshift/cluster-version-operator pull 565 (closed): Bug 1960554: Remove rbacv1beta1 support (last updated 2021-05-14 17:33:30 UTC)
- Red Hat Product Errata RHSA-2021:2438 (last updated 2021-07-27 23:08:45 UTC)

Description Vadim Rutkovsky 2021-05-14 08:22:55 UTC
Most components in 4.8 now use rbacv1, so the CVO should no longer carry a duplicated codebase for rbacv1beta1 handling.
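
For context, a quick way to check which RBAC group versions a cluster still serves (a sketch, assuming a logged-in oc session; on 4.8 / Kubernetes 1.21 both v1 and the deprecated v1beta1 are expected to be listed, since rbac/v1beta1 is not removed until Kubernetes 1.22):

$ oc api-versions | grep rbac.authorization.k8s.io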

Comment 2 Yang Yang 2021-05-17 09:59:15 UTC
Attempting to verify with 4.8.0-0.nightly-2021-05-15-141455

# oc get clusterversion
NAME      VERSION                             AVAILABLE   PROGRESSING   SINCE   STATUS
version   4.8.0-0.nightly-2021-05-15-141455   True        False         51m     Cluster version is 4.8.0-0.nightly-2021-05-15-141455

The cluster installs successfully.

# oc adm release extract --to 4.8
Extracted release payload created at 2021-05-15T14:16:30Z

# grep -r 'apiVersion: rbac' 4.8 | grep v1beta1
4.8/0000_50_cluster-autoscaler-operator_03_rbac.yaml:apiVersion: rbac.authorization.k8s.io/v1beta1
4.8/0000_50_cluster-autoscaler-operator_03_rbac.yaml:apiVersion: rbac.authorization.k8s.io/v1beta1

Yeah, most components use rbacv1, except for the 2 resources above ^^.

# grep v1beta1 0000_50_cluster-autoscaler-operator_03_rbac.yaml -a5
  - subjectaccessreviews
  verbs: ["create"]

---
kind: Role
apiVersion: rbac.authorization.k8s.io/v1beta1
metadata:
  name: cluster-autoscaler-operator
  namespace: openshift-machine-api
  annotations:
    include.release.openshift.io/ibm-cloud-managed: "true"
--
    - patch
    - delete

---
kind: RoleBinding
apiVersion: rbac.authorization.k8s.io/v1beta1
metadata:
  name: cluster-autoscaler-operator
  namespace: openshift-machine-api
  annotations:
    include.release.openshift.io/ibm-cloud-managed: "true"


# oc get role cluster-autoscaler-operator -n openshift-machine-api -oyaml | grep rbac
apiVersion: rbac.authorization.k8s.io/v1

# oc get RoleBinding cluster-autoscaler-operator -n openshift-machine-api -oyaml | grep rbac
apiVersion: rbac.authorization.k8s.io/v1
  apiGroup: rbac.authorization.k8s.io

It's weird that the cluster-autoscaler-operator Role and RoleBinding use rbacv1beta1 in the manifests but show up as rbacv1 in the cluster.
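
This looks like ordinary API-server version conversion rather than a CVO problem: RBAC objects are stored once and served at every group version the server still offers, so an object applied from a v1beta1 manifest reads back as v1 when v1 is requested. A quick way to see both views (a sketch, not taken from this bug; the fully-qualified resource form pins the version being read, and the v1beta1 read only works while that version is still served):

# oc get roles.v1.rbac.authorization.k8s.io cluster-autoscaler-operator -n openshift-machine-api -o jsonpath='{.apiVersion}{"\n"}'
# oc get roles.v1beta1.rbac.authorization.k8s.io cluster-autoscaler-operator -n openshift-machine-api -o jsonpath='{.apiVersion}{"\n"}'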

Comment 3 W. Trevor King 2021-05-17 17:50:02 UTC
I'm pretty sure we didn't actually need baremetal-operator#147, e.g. see comment 2 where the only v1beta1 RBAC manifests were from the autoscaler.  But since the PR has merged linking this bug, I won't unlink it again.

Comment 5 Yang Yang 2021-05-18 08:09:16 UTC
Verifying it with 4.8.0-0.nightly-2021-05-17-231618

# oc get clusterversion
NAME      VERSION                             AVAILABLE   PROGRESSING   SINCE   STATUS
version   4.8.0-0.nightly-2021-05-17-231618   True        False         55m     Cluster version is 4.8.0-0.nightly-2021-05-17-231618

# oc adm release extract --to 4.8
Extracted release payload created at 2021-05-17T23:18:24Z

# grep -r 'apiVersion: rbac' 4.8 | grep v1beta1
(no output)

All of the components are using rbacv1

# oc get clusterrole cluster-baremetal-operator -oyaml | grep rbac
apiVersion: rbac.authorization.k8s.io/v1

# oc get role cluster-autoscaler-operator -n openshift-machine-api -oyaml | grep rbac
apiVersion: rbac.authorization.k8s.io/v1

# oc get RoleBinding cluster-autoscaler-operator -n openshift-machine-api -oyaml | grep rbac
apiVersion: rbac.authorization.k8s.io/v1
  apiGroup: rbac.authorization.k8s.io

Randomly checked 3 resources ^^; all of them use rbacv1 in the cluster.

# oc logs pod/cluster-version-operator-7b7cbf6574-gm8px -n openshift-cluster-version > cvo1.log

# grep 'rbac.authorization.k8s.io/v1beta1' cvo1.log
I0518 06:37:06.380135       1 request.go:591] Throttling request took 1.654489087s, request: GET:https://api-int.yangyang0518-2.qe.devcluster.openshift.com:6443/apis/rbac.authorization.k8s.io/v1beta1?timeout=32s
I0518 06:44:08.784898       1 request.go:591] Throttling request took 1.772947083s, request: GET:https://api-int.yangyang0518-2.qe.devcluster.openshift.com:6443/apis/rbac.authorization.k8s.io/v1beta1?timeout=32s
I0518 06:48:20.525289       1 request.go:591] Throttling request took 814.104328ms, request: GET:https://api-int.yangyang0518-2.qe.devcluster.openshift.com:6443/apis/rbac.authorization.k8s.io/v1beta1?timeout=32s
I0518 06:54:26.368170       1 request.go:591] Throttling request took 106.155145ms, request: GET:https://api-int.yangyang0518-2.qe.devcluster.openshift.com:6443/apis/rbac.authorization.k8s.io/v1beta1?timeout=32s
I0518 07:43:44.929603       1 request.go:591] Throttling request took 71.028255ms, request: GET:https://api-int.yangyang0518-2.qe.devcluster.openshift.com:6443/apis/rbac.authorization.k8s.io/v1beta1?timeout=32s
I0518 07:47:04.881058       1 request.go:591] Throttling request took 64.153968ms, request: GET:https://api-int.yangyang0518-2.qe.devcluster.openshift.com:6443/apis/rbac.authorization.k8s.io/v1beta1?timeout=32s
I0518 07:56:55.580297       1 request.go:591] Throttling request took 95.953741ms, request: GET:https://api-int.yangyang0518-2.qe.devcluster.openshift.com:6443/apis/rbac.authorization.k8s.io/v1beta1?timeout=32s

Searching the CVO log, there are still requests directed at rbacv1beta1.

Comment 6 Vadim Rutkovsky 2021-05-18 09:12:56 UTC
(In reply to Yang Yang from comment #5)
> # oc logs pod/cluster-version-operator-7b7cbf6574-gm8px -n openshift-cluster-version > cvo1.log
> 
> # grep 'rbac.authorization.k8s.io/v1beta1' cvo1.log
> I0518 06:37:06.380135       1 request.go:591] Throttling request took 1.654489087s, request: GET:https://api-int.yangyang0518-2.qe.devcluster.openshift.com:6443/apis/rbac.authorization.k8s.io/v1beta1?timeout=32s
> [...]
> Searching the CVO log, there are still requests directed at rbacv1beta1.


If I understood correctly, these requests are still being made while getting existing ClusterRole/Role objects etc., since the getter can't be restricted to a particular APIVersion. I don't think we can do anything about those.
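
One way to see why those GETs can still show up (a sketch, not something run for this bug): the rbac.authorization.k8s.io group still advertises v1beta1 on 4.8, so any client doing group-version discovery will touch that endpoint even if every manifest it applies uses v1:

# oc get --raw /apis/rbac.authorization.k8s.io | jq '.versions'
# oc get --raw /apis/rbac.authorization.k8s.io/v1beta1 >/dev/null && echo "v1beta1 still served"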

Comment 7 Yang Yang 2021-05-18 11:01:32 UTC
A request is sent to rbacv1beta1 while syncing these resources:

clusterrole "system:openshift:scc:hostnetwork"
configmap "openshift-cluster-machine-approver/kube-rbac-proxy"
customresourcedefinition "kubestorageversionmigrators.operator.openshift.io"
customresourcedefinition "catalogsources.operators.coreos.com"
servicemonitor "openshift-machine-api/cluster-autoscaler-operator"
servicemonitor "openshift-cluster-samples-operator/cluster-samples-operator" 
rolebinding "openshift-config-managed/console-configmap-reader"


The CVO log shows the following while syncing rolebinding "openshift-config-managed/console-configmap-reader":

I0518 08:17:04.684321       1 sync_worker.go:760] Running sync for rolebinding "openshift-config-managed/console-configmap-reader" (441 of 676)
I0518 08:17:04.685298       1 request.go:591] Throttling request took 108.121989ms, request: GET:https://api-int.yangyang0518-2.qe.devcluster.openshift.com:6443/apis/apiextensions.k8s.io/v1?timeout=32s
I0518 08:17:04.710132       1 request.go:591] Throttling request took 132.950336ms, request: GET:https://api-int.yangyang0518-2.qe.devcluster.openshift.com:6443/apis/rbac.authorization.k8s.io/v1beta1?timeout=32s
I0518 08:17:04.733621       1 sync_worker.go:772] Done syncing for rolebinding "openshift-config-managed/console-configmap-reader" (441 of 676)

Request is sent to rbacv1beta1 while syncing rolebinding "openshift-config-managed/console-configmap-reader"

# oc get rolebinding console-configmap-reader -n openshift-config-managed -oyaml
apiVersion: rbac.authorization.k8s.io/v1
kind: RoleBinding
metadata:
  annotations:
    include.release.openshift.io/ibm-cloud-managed: "true"
    include.release.openshift.io/self-managed-high-availability: "true"
    include.release.openshift.io/single-node-developer: "true"
  creationTimestamp: "2021-05-18T06:34:49Z"
  name: console-configmap-reader
  namespace: openshift-config-managed
  resourceVersion: "16313"
  uid: 542a2ce6-2037-45b8-a79f-ccbbe01c3ceb
roleRef:
  apiGroup: rbac.authorization.k8s.io
  kind: Role
  name: console-configmap-reader
subjects:
- kind: ServiceAccount
  name: console
  namespace: openshift-console


rolebinding "openshift-config-managed/console-configmap-reader" is using rbacv1. 

I'm confused about why the request is sent to rbacv1beta1 for a few resources. It doesn't fail resource creation though; if that's acceptable to you, I'm okay with moving this to verified.
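
One detail worth noting about the throttled log lines (an observation, not a definitive explanation): the URLs stop at the group-version root, /apis/rbac.authorization.k8s.io/v1beta1, with no resource or object name appended, which is the shape of a discovery request rather than a read or write of a v1beta1 object. A quick filter to confirm that, assuming the same cvo1.log:

# grep -o 'GET:[^ ]*rbac.authorization.k8s.io/v1beta1[^ ]*' cvo1.log | sort | uniq -c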

Comment 8 W. Trevor King 2021-05-18 20:33:07 UTC
Poking at 4.8.0-0.nightly-2021-05-18-164623 [1]:

$ curl -s https://gcsweb-ci.apps.ci.l2s4.p1.openshiftapps.com/gcs/origin-ci-test/logs/periodic-ci-openshift-release-master-nightly-4.8-e2e-aws-serial/1394700146947657728/artifacts/e2e-aws-serial/gather-audit-logs/artifacts/audit-logs.tar | tar xz --strip-components=1
$ zgrep -h '"apiGroup":"rbac.authorization.k8s.io"' audit_logs/*/*.gz 2>/dev/null | jq -c 'select(.objectRef.apiVersion == "v1beta1") | {objectRef: (.objectRef | {apiVersion, namespace, name}), username: .user.username, verb}' | sort | uniq -c
      1 {"objectRef":{"apiVersion":"v1beta1","namespace":"etcdstoragepathtestnamespace","name":"role2"},"username":"system:admin","verb":"create"}
      1 {"objectRef":{"apiVersion":"v1beta1","namespace":"etcdstoragepathtestnamespace","name":"role2"},"username":"system:admin","verb":"delete"}
      1 {"objectRef":{"apiVersion":"v1beta1","namespace":"etcdstoragepathtestnamespace","name":"roleb2"},"username":"system:admin","verb":"create"}
      1 {"objectRef":{"apiVersion":"v1beta1","namespace":"etcdstoragepathtestnamespace","name":"roleb2"},"username":"system:admin","verb":"delete"}
      1 {"objectRef":{"apiVersion":"v1beta1","namespace":null,"name":"crole2"},"username":"system:admin","verb":"create"}
      1 {"objectRef":{"apiVersion":"v1beta1","namespace":null,"name":"crole2"},"username":"system:admin","verb":"delete"}
      1 {"objectRef":{"apiVersion":"v1beta1","namespace":null,"name":"croleb2"},"username":"system:admin","verb":"create"}
      1 {"objectRef":{"apiVersion":"v1beta1","namespace":null,"name":"croleb2"},"username":"system:admin","verb":"delete"}

That's despite having some v1beta1 throttling messages:

$ curl -s https://gcsweb-ci.apps.ci.l2s4.p1.openshiftapps.com/gcs/origin-ci-test/logs/periodic-ci-openshift-release-master-nightly-4.8-e2e-aws-serial/1394700146947657728/artifacts/e2e-aws-serial/gather-extra/artifacts/pods/openshift-cluster-version_cluster-version-operator-68d6885b7d-dkrgl_cluster-version-operator.log | grep 'rbac.*v1beta1'
I0518 18:14:13.641153       1 request.go:591] Throttling request took 847.565617ms, request: GET:https://api-int.ci-op-xp8w76ib-eb421.origin-ci-int-aws.dev.rhcloud.com:6443/apis/rbac.authorization.k8s.io/v1beta1?timeout=32s
I0518 18:17:25.416498       1 request.go:591] Throttling request took 500.701033ms, request: GET:https://api-int.ci-op-xp8w76ib-eb421.origin-ci-int-aws.dev.rhcloud.com:6443/apis/rbac.authorization.k8s.io/v1beta1?timeout=32s
I0518 18:21:16.086884       1 request.go:591] Throttling request took 1.222964081s, request: GET:https://api-int.ci-op-xp8w76ib-eb421.origin-ci-int-aws.dev.rhcloud.com:6443/apis/rbac.authorization.k8s.io/v1beta1?timeout=32s
I0518 19:17:13.947750       1 request.go:591] Throttling request took 498.278752ms, request: GET:https://api-int.ci-op-xp8w76ib-eb421.origin-ci-int-aws.dev.rhcloud.com:6443/apis/rbac.authorization.k8s.io/v1beta1?timeout=32s

So I'm fine with this going VERIFIED even though I don't quite understand where the v1beta1 throttling messages come from.

[1]: https://prow.ci.openshift.org/view/gs/origin-ci-test/logs/periodic-ci-openshift-release-master-nightly-4.8-e2e-aws-serial/1394700146947657728

Comment 9 Yang Yang 2021-05-19 01:10:59 UTC
Thanks Trevor for the confirmation. Based on comment#7 and comment#8, moving it to verified state.

Comment 12 errata-xmlrpc 2021-07-27 23:08:28 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory (Moderate: OpenShift Container Platform 4.8.2 bug fix and security update), and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHSA-2021:2438

