Logs in openshift-authentication namespace:

$ oc logs -n openshift-authentication oauth-openshift-6697765b49-zlrdb
Command "openshift-osinserver" is deprecated, will be removed in 4.0
I0628 01:17:27.622333 1 clientca.go:93] [0] "/tmp/requestheader-client-ca-file838365080" client-ca certificate: "aggregator-signer" [] issuer="<self>" (2019-06-28 00:53:52 +0000 UTC to 2019-06-29 00:53:52 +0000 UTC (now=2019-06-28 01:17:27.622311638 +0000 UTC))
I0628 01:17:27.623195 1 clientca.go:93] [0] "/tmp/client-ca-file969237981" client-ca certificate: "admin-kubeconfig-signer" [] issuer="<self>" (2019-06-28 00:53:40 +0000 UTC to 2029-06-25 00:53:40 +0000 UTC (now=2019-06-28 01:17:27.623182144 +0000 UTC))
I0628 01:17:27.623229 1 clientca.go:93] [1] "/tmp/client-ca-file969237981" client-ca certificate: "kube-csr-signer_@1561684182" [] issuer="kubelet-signer" (2019-06-28 01:09:41 +0000 UTC to 2019-06-29 00:53:56 +0000 UTC (now=2019-06-28 01:17:27.623214745 +0000 UTC))
I0628 01:17:27.623252 1 clientca.go:93] [2] "/tmp/client-ca-file969237981" client-ca certificate: "kubelet-signer" [] issuer="<self>" (2019-06-28 00:53:56 +0000 UTC to 2019-06-29 00:53:56 +0000 UTC (now=2019-06-28 01:17:27.623240091 +0000 UTC))
I0628 01:17:27.623274 1 clientca.go:93] [3] "/tmp/client-ca-file969237981" client-ca certificate: "kube-apiserver-to-kubelet-signer" [] issuer="<self>" (2019-06-28 00:53:57 +0000 UTC to 2020-06-27 00:53:57 +0000 UTC (now=2019-06-28 01:17:27.623260456 +0000 UTC))
I0628 01:17:27.623294 1 clientca.go:93] [4] "/tmp/client-ca-file969237981" client-ca certificate: "kube-control-plane-signer" [] issuer="<self>" (2019-06-28 00:53:56 +0000 UTC to 2020-06-27 00:53:56 +0000 UTC (now=2019-06-28 01:17:27.62328116 +0000 UTC))
I0628 01:17:27.637467 1 secure_serving.go:66] Forcing use of http/1.1 only
I0628 01:17:27.638408 1 serving.go:196] [0] "/var/config/system/secrets/v4-0-config-system-serving-cert/tls.crt" serving certificate: "oauth-openshift.openshift-authentication.svc" [serving] validServingFor=[oauth-openshift.openshift-authentication.svc,oauth-openshift.openshift-authentication.svc.cluster.local] issuer="openshift-service-serving-signer@1561684183" (2019-06-28 01:17:01 +0000 UTC to 2021-06-27 01:17:02 +0000 UTC (now=2019-06-28 01:17:27.638389236 +0000 UTC))
I0628 01:17:27.638435 1 serving.go:196] [1] "/var/config/system/secrets/v4-0-config-system-serving-cert/tls.crt" serving certificate: "openshift-service-serving-signer@1561684183" [] issuer="<self>" (2019-06-28 01:09:42 +0000 UTC to 2020-06-27 01:09:43 +0000 UTC (now=2019-06-28 01:17:27.63842563 +0000 UTC))
I0628 01:17:27.638457 1 secure_serving.go:136] Serving securely on 0.0.0.0:6443
I0628 01:17:27.638478 1 serving.go:78] Starting DynamicLoader
I0628 01:17:27.638594 1 clientca.go:59] Starting DynamicCA: /tmp/requestheader-client-ca-file838365080
I0628 01:17:27.638594 1 clientca.go:59] Starting DynamicCA: /tmp/client-ca-file969237981
E0628 02:11:54.715069 1 access.go:177] osin: error=unauthorized_client, internal_error=<nil> get_client=client check failed, client_id=kibana-proxy
E0628 02:11:58.802983 1 access.go:177] osin: error=unauthorized_client, internal_error=<nil> get_client=client check failed, client_id=kibana-proxy
By standing up a 4.1 cluster and then swapping in 4.2 images, I was able to replicate the issue. The problem is that the secret recorded in the kibana-proxy OAuthClient does not match the oauth-secret value in the kibana-proxy secret. Deleting both oauthclient/kibana-proxy and secret/kibana-proxy forces the CLO to recreate the entries, and the issue goes away. This may serve as the short-term workaround.
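The mismatch and the workaround can be checked with standard oc commands. A minimal sketch, assuming the kibana-proxy secret lives in the openshift-logging namespace and stores its value under an oauth-secret key (both assumptions based on the description above):

# Compare the secret recorded in the OAuthClient with the one held in the logging namespace:
$ oc get oauthclient kibana-proxy -o jsonpath='{.secret}'; echo
$ oc get secret kibana-proxy -n openshift-logging -o jsonpath='{.data.oauth-secret}' | base64 -d; echo

# Short-term workaround: delete both objects and let the CLO recreate them in sync.
$ oc delete oauthclient kibana-proxy
$ oc delete secret kibana-proxy -n openshift-logging

If the two values printed above differ, the osin unauthorized_client errors shown in the logs are expected.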
Fixed in https://github.com/openshift/cluster-logging-operator/pull/217. Tested by removing the file from the CLO pod, which causes the CLO to regenerate the secrets.
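One way to emulate that test, for reference; the label selector and the file path here are hypothetical, since the comment names neither:

$ CLO_POD=$(oc get pods -n openshift-logging -l name=cluster-logging-operator -o jsonpath='{.items[0].metadata.name}')
# Remove the cached secret file inside the CLO pod; /tmp/ocp-clo/kibana-proxy-secret is a made-up path:
$ oc exec -n openshift-logging "$CLO_POD" -- rm /tmp/ocp-clo/kibana-proxy-secret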
After deleting oauthclient/kibana-proxy and waiting for a new oauthclient/kibana-proxy to be created, I can log in to the Kibana console successfully.
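One possible sequence for reproducing that check (watching the resource is just one way to wait for the operator to recreate it):

$ oc delete oauthclient kibana-proxy
$ oc get oauthclient kibana-proxy -w
# Once the OAuthClient reappears, log in to the Kibana console again.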
Verified with ose-cluster-logging-operator-v4.2.0-201907311819
Since the problem described in this bug report should be resolved in a recent advisory, it has been closed with a resolution of ERRATA. For information on the advisory, and where to find the updated files, follow the link below. If the solution does not work for you, open a new bug report. https://access.redhat.com/errata/RHBA-2019:2922