Created attachment 1196206 [details] es_log
Created attachment 1196207 [details] fluentd_log
Created attachment 1196208 [details] deployer_log
Created attachment 1196209 [details] kibana_log
Exception seen in ES log:

[2016-08-30 23:34:49,622][ERROR][com.floragunn.searchguard.filter.SearchGuardActionFilter] Attempt from null to _all indices for indices:data/read/search and User [name=system.logging.kibana, roles=[]]
[2016-08-30 23:34:49,623][ERROR][com.floragunn.searchguard.filter.SearchGuardActionFilter] Forbidden while apply() due to com.floragunn.searchguard.authorization.ForbiddenException: Attempt from null to _all indices for indices:data/read/search and User [name=system.logging.kibana, roles=[]] for action indices:data/read/search
com.floragunn.searchguard.authorization.ForbiddenException: Attempt from null to _all indices for indices:data/read/search and User [name=system.logging.kibana, roles=[]]
    at com.floragunn.searchguard.filter.SearchGuardActionFilter.apply0(SearchGuardActionFilter.java:199)
    at com.floragunn.searchguard.filter.SearchGuardActionFilter.apply(SearchGuardActionFilter.java:90)
    at org.elasticsearch.action.support.TransportAction$RequestFilterChain.proceed(TransportAction.java:165)
    at com.floragunn.searchguard.filter.FLSActionFilter.applySecure(FLSActionFilter.java:76)
    at com.floragunn.searchguard.filter.AbstractActionFilter.apply(AbstractActionFilter.java:97)
    at org.elasticsearch.action.support.TransportAction$RequestFilterChain.proceed(TransportAction.java:165)
    at com.floragunn.searchguard.filter.DLSActionFilter.applySecure(DLSActionFilter.java:73)
    at com.floragunn.searchguard.filter.AbstractActionFilter.apply(AbstractActionFilter.java:97)
    at org.elasticsearch.action.support.TransportAction$RequestFilterChain.proceed(TransportAction.java:165)
    at com.floragunn.searchguard.filter.RequestActionFilter.applySecure(RequestActionFilter.java:94)
    at com.floragunn.searchguard.filter.AbstractActionFilter.apply(AbstractActionFilter.java:97)
    at org.elasticsearch.action.support.TransportAction$RequestFilterChain.proceed(TransportAction.java:165)
    at org.elasticsearch.action.support.ActionFilter$Simple.apply(ActionFilter.java:64)
    at org.elasticsearch.action.support.TransportAction$RequestFilterChain.proceed(TransportAction.java:165)
    at io.fabric8.elasticsearch.plugin.ActionForbiddenActionFilter.apply(ActionForbiddenActionFilter.java:48)
    at org.elasticsearch.action.support.TransportAction$RequestFilterChain.proceed(TransportAction.java:165)
    at org.elasticsearch.action.support.TransportAction.execute(TransportAction.java:82)
    at org.elasticsearch.client.node.NodeClient.execute(NodeClient.java:98)
    at org.elasticsearch.client.FilterClient.execute(FilterClient.java:66)
    at org.elasticsearch.rest.BaseRestHandler$HeadersAndContextCopyClient.execute(BaseRestHandler.java:92)
    at org.elasticsearch.client.support.AbstractClient.search(AbstractClient.java:334)
    at org.elasticsearch.rest.action.search.RestSearchAction.handleRequest(RestSearchAction.java:81)
    at org.elasticsearch.rest.BaseRestHandler.handleRequest(BaseRestHandler.java:53)
    at org.elasticsearch.rest.RestController.executeHandler(RestController.java:225)
    at org.elasticsearch.rest.RestController$RestHandlerFilter.process(RestController.java:299)
    at org.elasticsearch.rest.RestController$ControllerFilterChain.continueProcessing(RestController.java:280)
    at io.fabric8.elasticsearch.plugin.KibanaUserReindexFilter.process(KibanaUserReindexFilter.java:76)
    at org.elasticsearch.rest.RestController$ControllerFilterChain.continueProcessing(RestController.java:283)
    at com.floragunn.searchguard.rest.DefaultRestFilter.processSecure(DefaultRestFilter.java:37)
    at com.floragunn.searchguard.rest.AbstractACRestFilter.process(AbstractACRestFilter.java:198)
    at org.elasticsearch.rest.RestController$ControllerFilterChain.continueProcessing(RestController.java:283)
    at io.fabric8.elasticsearch.plugin.acl.DynamicACLFilter.process(DynamicACLFilter.java:162)
    at org.elasticsearch.rest.RestController$ControllerFilterChain.continueProcessing(RestController.java:283)
    at org.elasticsearch.rest.RestController.dispatchRequest(RestController.java:180)
    at org.elasticsearch.http.HttpServer.internalDispatchRequest(HttpServer.java:121)
    at org.elasticsearch.http.HttpServer$Dispatcher.dispatchRequest(HttpServer.java:83)
    at org.elasticsearch.http.netty.NettyHttpServerTransport.dispatchRequest(NettyHttpServerTransport.java:329)
    at org.elasticsearch.http.netty.HttpRequestHandler.messageReceived(HttpRequestHandler.java:65)
    at org.elasticsearch.common.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70)
    at org.elasticsearch.common.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
    at org.elasticsearch.common.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791)
    at org.elasticsearch.common.netty.channel.SimpleChannelHandler.messageReceived(SimpleChannelHandler.java:142)
    at com.floragunn.searchguard.http.netty.MutualSSLHandler.messageReceived(MutualSSLHandler.java:80)
    at org.elasticsearch.common.netty.channel.SimpleChannelHandler.handleUpstream(SimpleChannelHandler.java:88)
    at org.elasticsearch.common.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
    at org.elasticsearch.common.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791)
    at org.elasticsearch.http.netty.pipelining.HttpPipeliningHandler.messageReceived(HttpPipeliningHandler.java:60)
    at org.elasticsearch.common.netty.channel.SimpleChannelHandler.handleUpstream(SimpleChannelHandler.java:88)
    at org.elasticsearch.common.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
    at org.elasticsearch.common.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791)
    at org.elasticsearch.common.netty.handler.codec.http.HttpChunkAggregator.messageReceived(HttpChunkAggregator.java:145)
    at org.elasticsearch.common.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70)
    at org.elasticsearch.common.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
    at org.elasticsearch.common.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791)
    at org.elasticsearch.common.netty.handler.codec.http.HttpContentDecoder.messageReceived(HttpContentDecoder.java:108)
    at org.elasticsearch.common.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70)
    at org.elasticsearch.common.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
    at org.elasticsearch.common.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791)
    at org.elasticsearch.common.netty.channel.Channels.fireMessageReceived(Channels.java:296)
    at org.elasticsearch.common.netty.handler.codec.frame.FrameDecoder.unfoldAndFireMessageReceived(FrameDecoder.java:459)
    at org.elasticsearch.common.netty.handler.codec.replay.ReplayingDecoder.callDecode(ReplayingDecoder.java:536)
    at org.elasticsearch.common.netty.handler.codec.replay.ReplayingDecoder.messageReceived(ReplayingDecoder.java:435)
    at org.elasticsearch.common.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70)
    at org.elasticsearch.common.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
    at org.elasticsearch.common.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791)
    at org.elasticsearch.common.netty.OpenChannelsHandler.handleUpstream(OpenChannelsHandler.java:74)
    at org.elasticsearch.common.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
    at org.elasticsearch.common.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791)
    at org.elasticsearch.common.netty.channel.Channels.fireMessageReceived(Channels.java:296)
    at org.elasticsearch.common.netty.handler.codec.frame.FrameDecoder.unfoldAndFireMessageReceived(FrameDecoder.java:462)
    at org.elasticsearch.common.netty.handler.codec.frame.FrameDecoder.callDecode(FrameDecoder.java:443)
    at org.elasticsearch.common.netty.handler.codec.frame.FrameDecoder.messageReceived(FrameDecoder.java:303)
    at org.elasticsearch.common.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70)
    at org.elasticsearch.common.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
    at org.elasticsearch.common.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:559)
    at org.elasticsearch.common.netty.channel.Channels.fireMessageReceived(Channels.java:268)
    at org.elasticsearch.common.netty.channel.Channels.fireMessageReceived(Channels.java:255)
    at org.elasticsearch.common.netty.channel.socket.nio.NioWorker.read(NioWorker.java:88)
    at org.elasticsearch.common.netty.channel.socket.nio.AbstractNioWorker.process(AbstractNioWorker.java:108)
    at org.elasticsearch.common.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:337)
    at org.elasticsearch.common.netty.channel.socket.nio.AbstractNioWorker.run(AbstractNioWorker.java:89)
    at org.elasticsearch.common.netty.channel.socket.nio.NioWorker.run(NioWorker.java:178)
    at org.elasticsearch.common.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108)
    at org.elasticsearch.common.netty.util.internal.DeadLockProofWorker$1.run(DeadLockProofWorker.java:42)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Eric, could this be related to https://github.com/openshift/origin-aggregated-logging/pull/208#issuecomment-243267008 ?
I don't believe so. That issue was specific to the upgrade_logging branch I'm working in, which uses Kibana 4.5.4. When I rolled back to the version of Kibana used for 3.3, my records showed up in Kibana while using the journald log driver. There is also a different error message in the ES logs; this one appears to be ACL related...
> # see if anything is in elasticsearch
> $ oc exec logging-kibana-1-m2bg9 -- curl -s -k --cert /etc/kibana/keys/cert --key /etc/kibana/keys/key https://logging-es:9200/_search | python -mjson.tool
> defaulting container name to kibana, use 'oc describe po/logging-kibana-1-m2bg9' cmd to see all containers in this pod
> {
>     "error": "ForbiddenException[Attempt from null to _all indices for indices:data/read/search and User [name=system.logging.kibana, roles=[]]]",
>     "status": 403
> }

This will not work - access control does not allow searches at https://logging-es:9200/_search

If you want to see everything, try adding an index pattern like this:

$ oc exec logging-kibana-1-m2bg9 -c kibana -- curl -s -k --cert /etc/kibana/keys/cert --key /etc/kibana/keys/key 'https://logging-es:9200/*2016*/_search' | python -mjson.tool

I also added `-c kibana` (and quoted the URL so the shell does not expand the wildcard) because recent versions of openshift will print that error message if you do not specify the container to use with oc exec.
Oh, sorry for my mistake. I just realized that I had actually configured the journald log driver for docker on the master machine instead of the node machine, which is why I gave comment #9. Please kindly ignore it. After re-configuring the journald log driver on the node machine, the log entries show up in the Kibana UI. Could you please help transfer this back to ON_QA? I will then close it.
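For reference, this is roughly what switching Docker to the journald log driver on the *node* host looks like. This is a hedged sketch: on RHEL-based hosts of this era Docker read its daemon options from /etc/sysconfig/docker, but the file path, the existing contents of the OPTIONS line, and the restart command may differ in your environment, so review before running.

```shell
# Sketch: enable the journald log driver for Docker on the node host
# (not the master). Assumes /etc/sysconfig/docker contains a line like
#   OPTIONS='--selinux-enabled ...'
# Prepend --log-driver=journald to the existing options.
sudo sed -i "s/^OPTIONS='/OPTIONS='--log-driver=journald /" /etc/sysconfig/docker

# Restart the daemon so the new driver takes effect for new containers.
sudo systemctl restart docker

# Verify: docker info reports the active driver on its "Logging Driver" line.
docker info | grep -i 'logging driver'
```

Note that already-running containers keep the driver they were started with; fluentd will only pick up journald entries for containers (re)created after the restart.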
Set to verified according to comment #10, please ignore comment #9.

Images tested with:
brew-pulp-docker01.web.prod.ext.phx2.redhat.com:8888/openshift3/logging-elasticsearch  3.3.0  6e39e59e8b0e  2 hours ago   426 MB
brew-pulp-docker01.web.prod.ext.phx2.redhat.com:8888/openshift3/logging-deployer       3.3.0  de84ad1448af  27 hours ago  760.1 MB
brew-pulp-docker01.web.prod.ext.phx2.redhat.com:8888/openshift3/logging-kibana         3.3.0  ad2713df85a7  27 hours ago  266.9 MB
brew-pulp-docker01.web.prod.ext.phx2.redhat.com:8888/openshift3/logging-fluentd        3.3.0  74505c2dd791  28 hours ago  238.7 MB
brew-pulp-docker01.web.prod.ext.phx2.redhat.com:8888/openshift3/logging-auth-proxy     3.3.0  196ecb30fc93  3 weeks ago   229.2 MB
brew-pulp-docker01.web.prod.ext.phx2.redhat.com:8888/openshift3/logging-curator        3.3.0  2c88e1273c11  6 weeks ago   253.8 MB
Since the problem described in this bug report should be resolved in a recent advisory, it has been closed with a resolution of ERRATA. For information on the advisory, and where to find the updated files, follow the link below. If the solution does not work for you, open a new bug report. https://access.redhat.com/errata/RHBA-2016:1933