Bug 1746695 - Normal users could not view project logs in Kibana.
Summary: Normal users could not view project logs in Kibana.
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: OpenShift Container Platform
Classification: Red Hat
Component: Logging
Version: 4.2.0
Hardware: Unspecified
OS: Unspecified
Priority: unspecified
Severity: high
Target Milestone: ---
Target Release: 4.2.0
Assignee: Jeff Cantrill
QA Contact: Anping Li
URL:
Whiteboard:
Depends On:
Blocks: 1746482
 
Reported: 2019-08-29 06:45 UTC by Qiaoling Tang
Modified: 2019-10-16 06:40 UTC
CC List: 5 users

Fixed In Version:
Doc Type: No Doc Update
Doc Text:
Clone Of:
Environment:
Last Closed: 2019-10-16 06:38:14 UTC
Target Upstream Version:
Embargoed:


Attachments
content of .kibana.* in ES (546.91 KB, text/plain)
2019-08-29 06:45 UTC, Qiaoling Tang
no flags


Links
System ID / Status / Summary / Last Updated
Github openshift cluster-logging-operator pull 241 / closed / Bug 1746695: Add scope user:list-projects back to proxy / 2020-10-13 01:14:15 UTC
Github openshift origin-aggregated-logging pull 1735 / closed / Bug 1738411: Bump openshift-elasticsearch-client to 5.6.13.10 / 2020-10-13 01:14:15 UTC
Red Hat Product Errata RHBA-2019:2922 / 2019-10-16 06:40:24 UTC

Description Qiaoling Tang 2019-08-29 06:45:12 UTC
Created attachment 1609274 [details]
content of .kibana.* in ES

Description of problem:
Create a user, then log into the cluster and create projects test1 and test2 to generate logs. Then log into Kibana with this user: there is no default index pattern for the normal user, and the error message is: [object Object]: [security_exception] no permissions for [indices:data/read/field_caps] and User [name=CN=system.logging.kibana,OU=OpenShift,O=Logging, roles=[]]

I tried to create the index pattern project.test2.* manually and got the same error. After refreshing the page, Kibana showed a blank page.

After this, I logged into Kibana with a cluster-admin user. That user could view all the logs, and I could also see the index pattern created by the normal user.

It seems that the cluster-admin and the normal user are using the same .kibana.* settings, because when I log into Kibana again with the normal user, the index patterns are .operations.*, .all, project.*, .orphaned.* and project.test2.*.
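(The attached .kibana.* content can be dumped with a query of roughly this shape, assuming the ES image ships the es_util helper; the pod name is a placeholder:)

$ oc exec -n openshift-logging -c elasticsearch <es-pod> -- es_util --query='.kibana.*/_search?pretty'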

I also tried to curl ES directly with the normal user's token, and that works as expected; roughly along these lines (the pod name below is a placeholder, and oc is logged in as the normal user so oc whoami -t returns that user's token):
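$ token=$(oc whoami -t)
$ oc exec -n openshift-logging -c elasticsearch <es-pod> -- \
    curl -s -k -H "Authorization: Bearer $token" \
    "https://localhost:9200/project.test2.*/_count"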

$ oc logs -c kibana-proxy kibana-66599964b7-jwxpw 
2019/08/29 05:45:53 oauthproxy.go:200: mapping path "/" => upstream "http://localhost:5601/"
2019/08/29 05:45:53 oauthproxy.go:227: OAuthProxy configured for  Client ID: system:serviceaccount:openshift-logging:kibana
2019/08/29 05:45:53 oauthproxy.go:237: Cookie settings: name:_oauth_proxy secure(https):true httponly:true expiry:168h0m0s domain:<default> refresh:disabled
2019/08/29 05:45:53 http.go:58: HTTP: listening on 127.0.0.1:4180
2019/08/29 05:45:53 http.go:96: HTTPS: listening on [::]:3000
2019/08/29 05:54:41 provider.go:554: Performing OAuth discovery against https://172.30.0.1/.well-known/oauth-authorization-server
2019/08/29 05:54:41 provider.go:594: 200 GET https://172.30.0.1/.well-known/oauth-authorization-server {
  "issuer": "https://oauth-openshift.apps.qitang.test.com",
  "authorization_endpoint": "https://oauth-openshift.apps.qitang.test.com/oauth/authorize",
  "token_endpoint": "https://oauth-openshift.apps.qitang.test.com/oauth/token",
  "scopes_supported": [
    "user:check-access",
    "user:full",
    "user:info",
    "user:list-projects",
    "user:list-scoped-projects"
  ],
  "response_types_supported": [
    "code",
    "token"
  ],
  "grant_types_supported": [
    "authorization_code",
    "implicit"
  ],
  "code_challenge_methods_supported": [
    "plain",
    "S256"
  ]
}
2019/08/29 05:54:57 provider.go:554: Performing OAuth discovery against https://172.30.0.1/.well-known/oauth-authorization-server
2019/08/29 05:54:57 provider.go:594: 200 GET https://172.30.0.1/.well-known/oauth-authorization-server {
  "issuer": "https://oauth-openshift.apps.qitang.test.com",
  "authorization_endpoint": "https://oauth-openshift.apps.qitang.test.com/oauth/authorize",
  "token_endpoint": "https://oauth-openshift.apps.qitang.test.com/oauth/token",
  "scopes_supported": [
    "user:check-access",
    "user:full",
    "user:info",
    "user:list-projects",
    "user:list-scoped-projects"
  ],
  "response_types_supported": [
    "code",
    "token"
  ],
  "grant_types_supported": [
    "authorization_code",
    "implicit"
  ],
  "code_challenge_methods_supported": [
    "plain",
    "S256"
  ]
}
2019/08/29 05:54:57 provider.go:594: 200 GET https://172.30.0.1/apis/user.openshift.io/v1/users/~ {"kind":"User","apiVersion":"user.openshift.io/v1","metadata":{"name":"testuser-0","selfLink":"/apis/user.openshift.io/v1/users/testuser-0","uid":"97cbbe85-ca1e-11e9-948f-0a580a820027","resourceVersion":"122411","creationTimestamp":"2019-08-29T05:33:55Z"},"identities":["flexy-htpasswd-provider:testuser-0"],"groups":["system:authenticated","system:authenticated:oauth"]}
2019/08/29 05:54:57 oauthproxy.go:675: 10.129.2.6:42362 authentication complete Session{testuser-0 token:true}

ES pod log:
[2019-08-29T05:55:55,449][ERROR][i.f.e.p.a.OpenShiftTokenAuthentication] Error handling request
com.google.common.util.concurrent.UncheckedExecutionException: ElasticsearchSecurityException[Unable to retrieve users's project list]
	at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2207) ~[guava-19.0.jar:?]
	at com.google.common.cache.LocalCache.get(LocalCache.java:3953) ~[guava-19.0.jar:?]
	at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3957) ~[guava-19.0.jar:?]
	at com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4875) ~[guava-19.0.jar:?]
	at io.fabric8.elasticsearch.plugin.OpenshiftRequestContextFactory.create(OpenshiftRequestContextFactory.java:143) ~[openshift-elasticsearch-plugin-5.6.13.8-redhat-1.jar:?]
	at io.fabric8.elasticsearch.plugin.auth.OpenShiftTokenAuthentication.extractCredentials(OpenShiftTokenAuthentication.java:92) [openshift-elasticsearch-plugin-5.6.13.8-redhat-1.jar:?]
	at com.floragunn.searchguard.auth.BackendRegistry.authenticate(BackendRegistry.java:428) [search-guard-5-5.6.13.19-2-redhat-1.jar:?]
	at com.floragunn.searchguard.filter.SearchGuardRestFilter.checkAndAuthenticateRequest(SearchGuardRestFilter.java:113) [search-guard-5-5.6.13.19-2-redhat-1.jar:?]
	at com.floragunn.searchguard.filter.SearchGuardRestFilter.access$000(SearchGuardRestFilter.java:45) [search-guard-5-5.6.13.19-2-redhat-1.jar:?]
	at com.floragunn.searchguard.filter.SearchGuardRestFilter$1.handleRequest(SearchGuardRestFilter.java:69) [search-guard-5-5.6.13.19-2-redhat-1.jar:?]
	at io.fabric8.elasticsearch.plugin.acl.DynamicACLFilter$1.handleRequest(DynamicACLFilter.java:94) [openshift-elasticsearch-plugin-5.6.13.8-redhat-1.jar:?]
	at org.elasticsearch.rest.RestController.dispatchRequest(RestController.java:262) [elasticsearch-5.6.13.redhat-1.jar:5.6.13.redhat-1]
	at org.elasticsearch.rest.RestController.dispatchRequest(RestController.java:200) [elasticsearch-5.6.13.redhat-1.jar:5.6.13.redhat-1]
	at com.floragunn.searchguard.ssl.http.netty.ValidatingDispatcher.dispatchRequest(ValidatingDispatcher.java:60) [search-guard-ssl-5.6.13.23-redhat-1.jar:5.6.13.23-redhat-1]
	at org.elasticsearch.http.netty4.Netty4HttpServerTransport.dispatchRequest(Netty4HttpServerTransport.java:505) [transport-netty4-client-5.6.13.redhat-1.jar:5.6.13.redhat-1]
	at org.elasticsearch.http.netty4.Netty4HttpRequestHandler.channelRead0(Netty4HttpRequestHandler.java:80) [transport-netty4-client-5.6.13.redhat-1.jar:5.6.13.redhat-1]
	at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105) [netty-transport-4.1.13.Final.jar:4.1.13.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362) [netty-transport-4.1.13.Final.jar:4.1.13.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348) [netty-transport-4.1.13.Final.jar:4.1.13.Final]
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340) [netty-transport-4.1.13.Final.jar:4.1.13.Final]
	at org.elasticsearch.http.netty4.pipelining.HttpPipeliningHandler.channelRead(HttpPipeliningHandler.java:68) [transport-netty4-client-5.6.13.redhat-1.jar:5.6.13.redhat-1]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362) [netty-transport-4.1.13.Final.jar:4.1.13.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348) [netty-transport-4.1.13.Final.jar:4.1.13.Final]
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340) [netty-transport-4.1.13.Final.jar:4.1.13.Final]
	at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102) [netty-codec-4.1.13.Final.jar:4.1.13.Final]
	at io.netty.handler.codec.MessageToMessageCodec.channelRead(MessageToMessageCodec.java:111) [netty-codec-4.1.13.Final.jar:4.1.13.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362) [netty-transport-4.1.13.Final.jar:4.1.13.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348) [netty-transport-4.1.13.Final.jar:4.1.13.Final]
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340) [netty-transport-4.1.13.Final.jar:4.1.13.Final]
	at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102) [netty-codec-4.1.13.Final.jar:4.1.13.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362) [netty-transport-4.1.13.Final.jar:4.1.13.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348) [netty-transport-4.1.13.Final.jar:4.1.13.Final]
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340) [netty-transport-4.1.13.Final.jar:4.1.13.Final]
	at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102) [netty-codec-4.1.13.Final.jar:4.1.13.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362) [netty-transport-4.1.13.Final.jar:4.1.13.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348) [netty-transport-4.1.13.Final.jar:4.1.13.Final]
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340) [netty-transport-4.1.13.Final.jar:4.1.13.Final]
	at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:310) [netty-codec-4.1.13.Final.jar:4.1.13.Final]
	at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:284) [netty-codec-4.1.13.Final.jar:4.1.13.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362) [netty-transport-4.1.13.Final.jar:4.1.13.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348) [netty-transport-4.1.13.Final.jar:4.1.13.Final]
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340) [netty-transport-4.1.13.Final.jar:4.1.13.Final]
	at io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:86) [netty-transport-4.1.13.Final.jar:4.1.13.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362) [netty-transport-4.1.13.Final.jar:4.1.13.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348) [netty-transport-4.1.13.Final.jar:4.1.13.Final]
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340) [netty-transport-4.1.13.Final.jar:4.1.13.Final]
	at io.netty.handler.ssl.SslHandler.unwrap(SslHandler.java:1273) [netty-handler-4.1.13.Final.jar:4.1.13.Final]
	at io.netty.handler.ssl.SslHandler.decode(SslHandler.java:1084) [netty-handler-4.1.13.Final.jar:4.1.13.Final]
	at io.netty.handler.codec.ByteToMessageDecoder.decodeRemovalReentryProtection(ByteToMessageDecoder.java:489) [netty-codec-4.1.13.Final.jar:4.1.13.Final]
	at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:428) [netty-codec-4.1.13.Final.jar:4.1.13.Final]
	at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:265) [netty-codec-4.1.13.Final.jar:4.1.13.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362) [netty-transport-4.1.13.Final.jar:4.1.13.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348) [netty-transport-4.1.13.Final.jar:4.1.13.Final]
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340) [netty-transport-4.1.13.Final.jar:4.1.13.Final]
	at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1334) [netty-transport-4.1.13.Final.jar:4.1.13.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362) [netty-transport-4.1.13.Final.jar:4.1.13.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348) [netty-transport-4.1.13.Final.jar:4.1.13.Final]
	at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:926) [netty-transport-4.1.13.Final.jar:4.1.13.Final]
	at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:134) [netty-transport-4.1.13.Final.jar:4.1.13.Final]
	at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:644) [netty-transport-4.1.13.Final.jar:4.1.13.Final]
	at io.netty.channel.nio.NioEventLoop.processSelectedKeysPlain(NioEventLoop.java:544) [netty-transport-4.1.13.Final.jar:4.1.13.Final]
	at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:498) [netty-transport-4.1.13.Final.jar:4.1.13.Final]
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:458) [netty-transport-4.1.13.Final.jar:4.1.13.Final]
	at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858) [netty-common-4.1.13.Final.jar:4.1.13.Final]
	at java.lang.Thread.run(Thread.java:748) [?:1.8.0_222]
Caused by: org.elasticsearch.ElasticsearchSecurityException: Unable to retrieve users's project list
	at io.fabric8.elasticsearch.plugin.OpenshiftAPIService.projectNames(OpenshiftAPIService.java:93) ~[openshift-elasticsearch-plugin-5.6.13.8-redhat-1.jar:?]
	at io.fabric8.elasticsearch.plugin.OpenshiftRequestContextFactory$1.run(OpenshiftRequestContextFactory.java:187) ~[openshift-elasticsearch-plugin-5.6.13.8-redhat-1.jar:?]
	at io.fabric8.elasticsearch.plugin.OpenshiftRequestContextFactory$1.run(OpenshiftRequestContextFactory.java:183) ~[openshift-elasticsearch-plugin-5.6.13.8-redhat-1.jar:?]
	at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_222]
	at io.fabric8.elasticsearch.plugin.OpenshiftRequestContextFactory.listProjectsFor(OpenshiftRequestContextFactory.java:183) ~[openshift-elasticsearch-plugin-5.6.13.8-redhat-1.jar:?]
	at io.fabric8.elasticsearch.plugin.OpenshiftRequestContextFactory.load(OpenshiftRequestContextFactory.java:117) ~[openshift-elasticsearch-plugin-5.6.13.8-redhat-1.jar:?]
	at io.fabric8.elasticsearch.plugin.OpenshiftRequestContextFactory.load(OpenshiftRequestContextFactory.java:58) ~[openshift-elasticsearch-plugin-5.6.13.8-redhat-1.jar:?]
	at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3542) ~[guava-19.0.jar:?]
	at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2323) ~[guava-19.0.jar:?]
	at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2286) ~[guava-19.0.jar:?]
	at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2201) ~[guava-19.0.jar:?]
	... 64 more


Version-Release number of selected component (if applicable):
ose-logging-kibana5-v4.2.0-201908271819
ose-oauth-proxy-v4.2.0-201908251340
ose-logging-elasticsearch5-v4.2.0-201908271819
ose-cluster-logging-operator-v4.2.0-201908271819

$ oc get clusterversion
NAME      VERSION                             AVAILABLE   PROGRESSING   SINCE   STATUS
version   4.2.0-0.nightly-2019-08-28-152644   True        False         5h28m   Cluster version is 4.2.0-0.nightly-2019-08-28-152644


How reproducible:
Always

Steps to Reproduce:
1. deploy logging via OLM
2. log into the OCP cluster as a normal user, then create some projects to generate some logs (example commands below)
3. log into the Kibana console as the normal user
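Example commands for steps 2 and 3 (the user and project names match the ones above; the log generator is just an illustration):

$ oc login -u testuser-0
$ oc new-project test1
$ oc -n test1 run log-gen --image=busybox -- sh -c 'while true; do echo generating logs; sleep 5; done'
$ oc new-project test2

Then open the Kibana route in a browser and log in as the same user.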

Actual results:
Normal users cannot view their project logs in Kibana.

Expected results:
All users should be able to view the logs of the projects they have access to in Kibana.

Additional info:

Comment 1 Jeff Cantrill 2019-08-30 14:24:14 UTC
I believe this is related to a missing token when making requests to the OpenShift API server, fixed in https://github.com/fabric8io/openshift-elasticsearch-plugin/pull/186. I saw the same issue while investigating test failures for https://github.com/openshift/origin-aggregated-logging/pull/1735.
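For context, the call that fails here is the plugin's project lookup for the user (OpenshiftAPIService.projectNames in the stack trace above). An equivalent request from the command line looks roughly like this, with the user's token as a placeholder and the in-cluster API address taken from the proxy log:

$ curl -s -k -H "Authorization: Bearer $USER_TOKEN" \
    https://172.30.0.1/apis/project.openshift.io/v1/projects

Without the token, that lookup fails, which is what would surface as the "Unable to retrieve users's project list" error in the ES log.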

Comment 3 Anping Li 2019-09-03 07:33:48 UTC
The fix is not in openshift/ose-logging-elasticsearch5/images/v4.2.0-201909020729. Waiting for another image.

Comment 5 Anping Li 2019-09-03 13:26:17 UTC
The ES plugin version:

sh-4.2$ /usr/share/elasticsearch/bin/plugin list -v
Plugins directory: /usr/share/elasticsearch/plugins
openshift-elasticsearch
- Plugin information:
Name: openshift-elasticsearch
Description: Openshift Elasticsearch Plugin
Version: 5.6.13.10-redhat-1
Native Controller: false
 * Classname: io.fabric8.elasticsearch.plugin.OpenShiftElasticSearchPlugin
prometheus-exporter
- Plugin information:
Name: prometheus-exporter
Description: Export Elasticsearch metrics to Prometheus
Version: 5.6.13.2
Native Controller: false
 * Classname: org.elasticsearch.plugin.prometheus.PrometheusExporterPlugin

Comment 6 Jeff Cantrill 2019-09-04 20:10:20 UTC
@anli, I don't understand #c5. Are you saying there is still a failure even though the correct plugin version is present?

Comment 7 Jeff Cantrill 2019-09-10 19:03:47 UTC
Confirmed the need to:

* reintroduce the ability for the proxy to list projects (restore the user:list-projects scope; an illustrative form of the proxy argument is shown below): https://github.com/openshift/cluster-logging-operator/pull/234/files#diff-5ff1cbe659b99e0e73d8ba484249c27cL480
* log the user out and back in, at which point they will be prompted to grant the list-projects permission
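A sketch of what the kibana-proxy (oauth-proxy) scope argument ends up containing once the project-listing scope is restored; the exact flag list here is illustrative, not copied from the operator manifest, though the scope names match those in the OAuth discovery output above:

--scope=user:info user:check-access user:list-projects

With that scope missing, the proxy's token cannot list the user's projects, which lines up with the ES error above about being unable to retrieve the user's project list.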

Comment 10 Anping Li 2019-09-16 05:09:18 UTC
Verified in v4.2.0-201909151553

Comment 11 errata-xmlrpc 2019-10-16 06:38:14 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory, and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHBA-2019:2922

