Description

Greg Rodriguez II
2018-05-01 23:33:47 UTC
Bug 1422008 was previously closed without resolution. The customer reports that the issue persists. Requesting a new RFE.
Description of problem:
Long lines read by fluentd from the Docker json-file logs are split into several documents when sent to Elasticsearch.
The maximum chunk size appears to be 16 KB, so an 85 KB message ends up as 6 separate documents (ceil(85/16) = 6).
Fluentd runs with its default configuration (Docker json-file log driver).
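The json-file driver writes each log line as one JSON object per line, and lines longer than its internal buffer come out as multiple consecutive entries. A minimal Python sketch of that chunking, assuming a 16384-byte buffer (the exact constant is an assumption inferred from the observed 16 KB splits):

```python
import json

CHUNK = 16 * 1024  # assumed json-file buffer size, inferred from the 16 KB splits

def jsonfile_entries(line: str):
    """Simulate how the json-file driver emits one long line as several entries."""
    entries = []
    for i in range(0, len(line), CHUNK):
        chunk = line[i:i + CHUNK]
        entries.append(json.dumps({"log": chunk, "stream": "stdout",
                                   "time": "2018-05-01T23:33:47Z"}))
    return entries

# An 85 KB line is emitted as 6 entries: ceil(85001 / 16384) == 6.
entries = jsonfile_entries("x" * 85000 + "\n")
print(len(entries))  # -> 6
```

Each of these 6 entries is then tailed by fluentd as an independent record, which matches the 6 documents seen in Kibana.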
Version-Release number of selected component (if applicable):
OCP v3.3.1.7
How reproducible:
100%
Steps to Reproduce:
1. oc debug dc/cakephp
2. Generate a file (longlog.txt, attached) whose entire content is a single long line.
3. cat longlog.txt
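Step 2 can be scripted; the 85 KB size and the longlog.txt name come from the report, while the plain 'x' characters are a stand-in, since the exact content of the attachment is not shown here:

```python
# Write one 85 KB line (85000 'x' characters) followed by a single newline,
# matching the message size reported in the description.
with open("longlog.txt", "w") as f:
    f.write("x" * 85000 + "\n")
```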
Actual results:
The message is split into 6 separate documents, visible in Kibana
Expected results:
A single message should have been generated
Additional info:
* I tried indexing the document into Elasticsearch manually and it is not split
* oc logs does not show anything
* fluentd logs do not show anything
* docker logs shows the entire message
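For reference, split json-file entries can be stitched back together on the consumer side: if only the final chunk of a long line ends with a newline, a reader can buffer entries until it sees one. This is a hedged sketch of that heuristic (the newline-terminated-final-chunk behavior is an assumption about how json-file splitting presents itself, similar to what concat-style fluentd plugins rely on), not the actual fluentd configuration:

```python
import json

def reassemble(raw_entries):
    """Join consecutive json-file entries whose 'log' field lacks a trailing
    newline; assumes only the final chunk of a split line ends with '\n'."""
    buf, lines = "", []
    for raw in raw_entries:
        buf += json.loads(raw)["log"]
        if buf.endswith("\n"):
            lines.append(buf)
            buf = ""
    if buf:  # trailing partial line that never got its newline
        lines.append(buf)
    return lines

# Two chunks of one long line followed by a short line collapse to two messages.
entries = [
    json.dumps({"log": "x" * 10}),        # partial chunk (no newline)
    json.dumps({"log": "x" * 5 + "\n"}),  # final chunk of the same line
    json.dumps({"log": "short\n"}),       # ordinary short line
]
print([len(l) for l in reassemble(entries)])  # -> [16, 6]
```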
Previous RFE 1422008 closed without resolution - https://bugzilla.redhat.com/show_bug.cgi?id=1422008