Description of problem:
The OpenShift console doesn't allow an unprivileged user to access the CRD list.

Version-Release number of selected component (if applicable):

How reproducible:

Steps to Reproduce:
1. Create a basic user
2. Log in to the console with the basic user's credentials
3. Try to access the Knative event source CRDs using the API `/api/kubernetes/apis/apiextensions.k8s.io/v1beta1/customresourcedefinitions?labelSelector=duck.knative.dev/source=true`

Actual results:
The console doesn't allow listing the CRDs and fails with a 403 Forbidden error.

Expected results:
The console service account should allow an unprivileged user to get this information without read access to CRDs.

Additional info:
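As a minimal illustration (not console code) of what the `labelSelector` in the URL above does: the API server returns only CRDs carrying the label `duck.knative.dev/source=true`. The sample CRD objects below are hypothetical.

```python
def filter_event_source_crds(crds):
    """Keep only CRDs labeled as Knative event sources
    (duck.knative.dev/source=true), mirroring the labelSelector
    in the reproduction URL."""
    return [
        crd for crd in crds
        if crd.get("metadata", {}).get("labels", {}).get("duck.knative.dev/source") == "true"
    ]

# Hypothetical sample data, for illustration only
crds = [
    {"metadata": {"name": "pingsources.sources.knative.dev",
                  "labels": {"duck.knative.dev/source": "true"}}},
    {"metadata": {"name": "widgets.example.com", "labels": {}}},
]

print([c["metadata"]["name"] for c in filter_event_source_crds(crds)])
# → ['pingsources.sources.knative.dev']
```

The RBAC check happens before the label filter is applied, which is why a user without cluster-wide `list` on CRDs gets a 403 regardless of the selector.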
Created attachment 1681047 [details] serviceaccount-console
Created attachment 1681047 [details] serviceaccount-console

Checked on a 4.5 cluster with payload 4.5.0-0.ci-2020-04-22-212726.

Go to the service accounts page in the openshift-console project and click into the console service account. There is an error in the "Secrets" section:

Oh no! Something went wrong.
TypeError: Cannot read property 'reduce' of null
(In reply to Yanping Zhang from comment #4)
> Checked on a 4.5 cluster with payload 4.5.0-0.ci-2020-04-22-212726.
> Go to the service accounts page in the openshift-console project, click into
> the console service account, there is an error in the "Secrets" section:
> Oh no! Something went wrong.
> TypeError: Cannot read property 'reduce' of null

This is an unrelated error. Can you open a separate Bugzilla? This bug is about getting Knative event sources. I'm updating the title to clarify and changing the component to Dev Console. (Not being able to list CRDs directly is expected and enforced by RBAC on the cluster.)
(In reply to Yanping Zhang from comment #4)
> Checked on a 4.5 cluster with payload 4.5.0-0.ci-2020-04-22-212726.
> Go to the service accounts page in the openshift-console project, click into
> the console service account, there is an error in the "Secrets" section:
> Oh no! Something went wrong.
> TypeError: Cannot read property 'reduce' of null

I'm not able to reproduce this error. I get a "Forbidden" message when logged in as `cluster-reader`, and I can see the secrets as `cluster-admin`. Can you give precise steps to reproduce?
Moving back to ON_QA to verify the event sources fix.
Checked on an OCP 4.5 cluster with payload 4.5.0-0.nightly-2020-05-19-041951. The error in Comment 4 doesn't exist now.

Installed the Knative Event Operator and the OpenShift Serverless Operator, then accessed "api/kubernetes/apis/apiextensions.k8s.io/v1beta1/customresourcedefinitions?labelSelector=duck.knative.dev/source=true" as cluster-admin and as a basic user separately.

The basic user got a 403 Forbidden response:
message "customresourcedefinitions.apiextensions.k8s.io is forbidden: User \"yanpzhan\" cannot list resource \"customresourcedefinitions\" in API group \"apiextensions.k8s.io\" at the cluster scope"
reason "Forbidden"

The cluster admin could get the items.
That's the wrong endpoint. The endpoint to check is `/api/console/knative-event-sources`:

https://github.com/openshift/console/blob/master/pkg/server/server.go#L315

Moving back to ON_QA.
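The design behind that endpoint: rather than granting users read access to CRDs, the console backend performs the list itself under its own service account and returns only the event-source CRDs to any logged-in user. The real handler is Go (at the server.go line linked above); the sketch below is a conceptual Python rendering, and the names `PrivilegedClient` and `handle_knative_event_sources` are hypothetical.

```python
SOURCE_LABEL = "duck.knative.dev/source"

class PrivilegedClient:
    """Stands in for the console service account's Kubernetes client;
    seeded with fake CRD objects for illustration."""
    def __init__(self, crds):
        self._crds = crds

    def list_crds(self, label_selector):
        # Simplified selector handling: a single key=value pair
        key, _, value = label_selector.partition("=")
        return [c for c in self._crds
                if c["metadata"].get("labels", {}).get(key) == value]

def handle_knative_event_sources(client):
    # The caller's own RBAC is never consulted here: the lookup
    # runs entirely under the console's privileged client.
    items = client.list_crds(f"{SOURCE_LABEL}=true")
    return {"kind": "CustomResourceDefinitionList", "items": items}

client = PrivilegedClient([
    {"metadata": {"name": "pingsources.sources.knative.dev",
                  "labels": {SOURCE_LABEL: "true"}}},
    {"metadata": {"name": "machines.machine.openshift.io", "labels": {}}},
])
print(len(handle_knative_event_sources(client)["items"]))  # → 1
```

This is why testing the raw `apiextensions.k8s.io` API as a basic user still returns 403, while `/api/console/knative-event-sources` succeeds.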
Thanks for your instruction, Sam. Checked on an OCP 4.5 cluster with payload 4.5.0-0.nightly-2020-05-20-183547. Installed the Knative Event Operator and the OpenShift Serverless Operator, and created a Knative Eventing instance. A basic user accessing the endpoint "/api/console/knative-event-sources" gets the items successfully: {"kind":"CustomResourceDefinitionList","apiVersion":"apiextensions.k8s.io/v1","metadata":{"selfLink":"/apis/apiextensions.k8s.io/v1/customresourcedefinitions","resourceVersion":"143235"},"items":[{"metadata":{"name":"apiserversources.sources.eventing.knative.dev"},"spec":{"group":"sources.eventing.knative.dev","names":{"plural":"apiserversources","singular":"apiserversource","kind":"ApiServerSource","listKind":"ApiServerSourceList","categories":["all","knative","eventing","sources"]},"versions":[{"name":"v1alpha1","served":true,"storage":true}]}},{"metadata":{"name":"apiserversources.sources.knative.dev"},"spec":{"group":"sources.knative.dev","names":{"plural":"apiserversources","singular":"apiserversource","kind":"ApiServerSource","listKind":"ApiServerSourceList","categories":["all","knative","eventing","sources"]},"versions":[{"name":"v1alpha1","served":true,"storage":true},{"name":"v1alpha2","served":false,"storage":false}]}},{"metadata":{"name":"containersources.sources.eventing.knative.dev"},"spec":{"group":"sources.eventing.knative.dev","names":{"plural":"containersources","singular":"containersource","kind":"ContainerSource","listKind":"ContainerSourceList","categories":["all","knative","eventing","sources"]},"versions":[{"name":"v1alpha1","served":true,"storage":true}]}},{"metadata":{"name":"cronjobsources.sources.eventing.knative.dev"},"spec":{"group":"sources.eventing.knative.dev","names":{"plural":"cronjobsources","singular":"cronjobsource","kind":"CronJobSource","listKind":"CronJobSourceList","categories":["all","knative","eventing","sources"]},"versions":[{"name":"v1alpha1","served":true,"storage":true}]}},{"metadata":{"name":"pingsources.sources.knative.dev"},
"spec":{"group":"sources.knative.dev","names":{"plural":"pingsources","singular":"pingsource","kind":"PingSource","listKind":"PingSourceList","categories":["all","knative","eventing","sources"]},"versions":[{"name":"v1alpha1","served":true,"storage":true},{"name":"v1alpha2","served":true,"storage":false}]}},{"metadata":{"name":"sinkbindings.sources.eventing.knative.dev"},"spec":{"group":"sources.eventing.knative.dev","names":{"plural":"sinkbindings","singular":"sinkbinding","kind":"SinkBinding","listKind":"SinkBindingList","categories":["all","knative","eventing","sources","bindings"]},"versions":[{"name":"v1alpha1","served":true,"storage":true}]}},{"metadata":{"name":"sinkbindings.sources.knative.dev"},"spec":{"group":"sources.knative.dev","names":{"plural":"sinkbindings","singular":"sinkbinding","kind":"SinkBinding","listKind":"SinkBindingList","categories":["all","knative","eventing","sources","bindings"]},"versions":[{"name":"v1alpha1","served":true,"storage":true},{"name":"v1alpha2","served":true,"storage":false}]}}]}
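As a quick illustration of consuming that response, a client (e.g. the Dev Console event-source catalog) can pull the event-source Kind names out of the returned `CustomResourceDefinitionList`. The JSON below is an abbreviated sample of the response above, not the full payload.

```python
import json

# Abbreviated sample of the /api/console/knative-event-sources response
response = json.loads("""
{"kind":"CustomResourceDefinitionList","apiVersion":"apiextensions.k8s.io/v1",
 "items":[
   {"metadata":{"name":"apiserversources.sources.knative.dev"},
    "spec":{"group":"sources.knative.dev",
            "names":{"kind":"ApiServerSource","plural":"apiserversources"}}},
   {"metadata":{"name":"pingsources.sources.knative.dev"},
    "spec":{"group":"sources.knative.dev",
            "names":{"kind":"PingSource","plural":"pingsources"}}}
 ]}
""")

# Extract the Kind of each event-source CRD
kinds = sorted(item["spec"]["names"]["kind"] for item in response["items"])
print(kinds)  # → ['ApiServerSource', 'PingSource']
```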
Since the problem described in this bug report should be resolved in a recent advisory, it has been closed with a resolution of ERRATA.

For information on the advisory, and where to find the updated files, follow the link below. If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHBA-2020:2409