Bug 1904380
Summary: Forwarding logs to Kafka using chained certificates fails with error "state=error: certificate verify failed (unable to get local issuer certificate)"

Product: OpenShift Container Platform
Component: Logging
Version: 4.6
Target Release: 4.6.z
Hardware: Unspecified
OS: Unspecified
Reporter: Oscar Casal Sanchez <ocasalsa>
Assignee: Sergey Yedrikov <syedriko>
QA Contact: Anping Li <anli>
Status: CLOSED ERRATA
Severity: medium
Priority: high
CC: anli, aos-bugs, jcantril, jdelft, jwennerberg, kpelc, periklis, syedriko
Whiteboard: logging-core
Clones: 1939693 (view as bug list)
Last Closed: 2021-04-20 19:20:20 UTC
Type: Bug
Doc Type: Bug Fix
Doc Text:
Previously, forwarding logs to Kafka using chained certificates failed with the error "state=error: certificate verify failed (unable to get local issuer certificate)." Logs could not be forwarded to a Kafka broker with a certificate signed by an intermediate CA. This happened because the fluentd Kafka plugin could only handle a single CA certificate supplied in the ca-bundle.crt entry of the corresponding secret. The current release fixes this issue by enabling the fluentd Kafka plugin to handle multiple CA certificates supplied in the ca-bundle.crt entry of the corresponding secret. Now, logs can be forwarded to a Kafka broker with a certificate signed by an intermediate CA.
Description (Oscar Casal Sanchez, 2020-12-04 09:16:28 UTC)
@anli I can give you all the info for a test case if you need one.

Created attachment 1768094 [details]
Kafka config and certificates

@anli The test case: Unpack the latest release from https://kafka.apache.org and the bz_1904380_testcase.tar.gz attachment. In kafka_2.13-2.7.0/config/server.properties, you care about this part:

listeners=PLAINTEXT://:9092,SSL://:9093
ssl.keystore.location=/home/syedriko/bz/1904380/pki/server_root_ca/server_intermediate_ca/server.pkcs12
ssl.keystore.password=server
ssl.truststore.location=/home/syedriko/bz/1904380/pki/client_root_ca/client_intermediate_ca/client_ca_bundle.jks
ssl.truststore.password=client_ca_bundle

Fix the paths to point to where you unpacked the attachment. Similarly, in kafka_2.13-2.7.0/client-ssl.properties, fix the paths to the certs. Give the example in https://kafka.apache.org/quickstart a go, just run the producer and consumer over TLS:

kafka-console-producer.sh --bootstrap-server localhost:9093 --topic test --producer.config client-ssl.properties
kafka-console-consumer.sh --bootstrap-server localhost:9093 --topic test --consumer.config client-ssl.properties

Created attachment 1768104 [details]
Kafka in-cluster certs
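For reference, a minimal client-ssl.properties for the TLS quickstart above might look like the following sketch. The paths, filenames, and passwords here are illustrative assumptions, not taken from the attachment; only the property keys are standard Kafka client settings.

```properties
# Hypothetical client-ssl.properties; adjust paths and passwords to
# wherever the bz_1904380_testcase.tar.gz attachment was unpacked.
security.protocol=SSL
ssl.truststore.location=/path/to/pki/server_root_ca/server_intermediate_ca/server_ca_bundle.jks
ssl.truststore.password=changeit
ssl.keystore.location=/path/to/pki/client_root_ca/client_intermediate_ca/client.pkcs12
ssl.keystore.password=changeit
```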
@anli The certs in the bz_1904380_testcase.tar.gz attachment are for running Kafka on the localhost. I added another attachment with in-cluster certs, with different SANs:
[syedriko@localhost ~]$ diff ~/bz/1904380/pki_clo/server_root_ca/server_intermediate_ca/server.conf ~/bz/1904380/pki/server_root_ca/server_intermediate_ca/server.conf
17c17
< subjectAltName = DNS.1:kafka.openshift-logging.svc.cluster.local
---
> subjectAltName = IP.1:127.0.0.1,IP.2:0:0:0:0:0:0:0:1,DNS.1:localhost
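As a quick way to see the "unable to get local issuer certificate" failure outside of fluentd and Kafka, the chain-verification behavior can be reproduced with openssl alone. This is a sketch; the CA names and file names below are made up for illustration.

```shell
# Illustrative sketch (hypothetical CA names): a server cert signed by an
# intermediate CA only verifies when the trust bundle contains the full chain.

# Root CA (self-signed)
openssl req -x509 -newkey rsa:2048 -nodes -keyout root.key -out root.pem \
  -subj "/CN=demo-root-ca" -days 1

# Intermediate CA, signed by the root
openssl req -newkey rsa:2048 -nodes -keyout inter.key -out inter.csr \
  -subj "/CN=demo-intermediate-ca"
echo "basicConstraints=CA:TRUE" > inter.ext
openssl x509 -req -in inter.csr -CA root.pem -CAkey root.key -CAcreateserial \
  -extfile inter.ext -out inter.pem -days 1

# Server cert, signed by the intermediate
openssl req -newkey rsa:2048 -nodes -keyout server.key -out server.csr \
  -subj "/CN=kafka.example.com"
openssl x509 -req -in server.csr -CA inter.pem -CAkey inter.key -CAcreateserial \
  -out server.pem -days 1

# Trusting only the root fails: the intermediate issuer cannot be found,
# which is the same error reported in this bug
openssl verify -CAfile root.pem server.pem 2>&1 | grep "unable to get local issuer"

# Trusting the chained bundle (root + intermediate) succeeds
cat root.pem inter.pem > ca-bundle.crt
openssl verify -CAfile ca-bundle.crt server.pem
```

This mirrors the bug: a client that loads only the first certificate from ca-bundle.crt effectively verifies against the root alone and fails for broker certificates issued by an intermediate CA.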
Hello,

Do we have any news about this? Any issues doing the QA? Or are we missing something?

Best regards,
Oscar

Verified on clusterlogging.4.6.0-202104030104.p0

Background information for release note:
Cause: The fluentd Kafka plugin could only handle a single CA certificate supplied in the ca-bundle.crt entry of the corresponding secret.
Consequence: Logs could not be forwarded to a Kafka broker with a certificate signed by an intermediate CA.
Fix: Enable the fluentd Kafka plugin to handle multiple CA certificates supplied in the ca-bundle.crt entry of the corresponding secret.
Result: Logs can be forwarded to a Kafka broker with a certificate signed by an intermediate CA.

Since the problem described in this bug report should be resolved in a recent advisory, it has been closed with a resolution of ERRATA. For information on the advisory (OpenShift Container Platform 4.6.25 extras update), and where to find the updated files, follow the link below. If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHBA-2021:1155
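The fix in the release note above boils down to handling every certificate in ca-bundle.crt rather than only the first one. As a rough illustration of what parsing a chained PEM bundle involves (file names and contents below are made-up placeholders, not the fluentd plugin's actual code), the bundle can be split into its individual certificates like this:

```shell
# Hypothetical sketch: split a chained ca-bundle.crt into one file per
# certificate, which is conceptually what a TLS client must do to trust
# every CA in the chain rather than only the first entry.

# Create a sample two-certificate bundle (placeholder contents, not real certs)
cat > ca-bundle.crt <<'EOF'
-----BEGIN CERTIFICATE-----
placeholder-root-ca
-----END CERTIFICATE-----
-----BEGIN CERTIFICATE-----
placeholder-intermediate-ca
-----END CERTIFICATE-----
EOF

# Each BEGIN line starts a new output file: ca-cert-1.pem, ca-cert-2.pem, ...
awk '/-----BEGIN CERTIFICATE-----/ {n++} {print > ("ca-cert-" n ".pem")}' ca-bundle.crt

ls ca-cert-*.pem
```

A client that only reads the first PEM block would trust just ca-cert-1.pem; the fixed behavior corresponds to loading all of the split certificates into the trust store.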