Bug 1885241 - kube-rbac-proxy: Logging is broken due to mix of k8s.io/klog v1 and v2
Summary: kube-rbac-proxy: Logging is broken due to mix of k8s.io/klog v1 and v2
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: OpenShift Container Platform
Classification: Red Hat
Component: Monitoring
Version: 4.6
Hardware: Unspecified
OS: Unspecified
Priority: unspecified
Severity: medium
Target Milestone: ---
Target Release: 4.7.0
Assignee: Pawel Krupa
QA Contact: hongyan li
URL:
Whiteboard:
Depends On:
Blocks:
 
Reported: 2020-10-05 13:41 UTC by Sergiusz Urbaniak
Modified: 2021-02-24 15:23 UTC (History)
CC List: 7 users

Fixed In Version:
Doc Type: No Doc Update
Doc Text:
Clone Of:
Environment:
Last Closed: 2021-02-24 15:23:10 UTC
Target Upstream Version:
Embargoed:


Attachments: None


Links
System ID                                 Status  Summary                                      Last Updated
Github brancz/kube-rbac-proxy pull 92     closed  *: move to klog v2                           2021-01-13 08:19:09 UTC
Github brancz/kube-rbac-proxy pull 95     closed  *: update client-go and other dependencies   2021-01-13 08:19:50 UTC
Github openshift/kube-rbac-proxy pull 32  closed  Bug 1885241: Bump to master (post v0.7.0)    2021-01-13 08:19:09 UTC
Github openshift/kube-rbac-proxy pull 33  closed  Bug 1885241: finish moving to klog v2        2021-01-13 08:19:13 UTC
Red Hat Product Errata RHSA-2020:5633     None    None                                         2021-02-24 15:23:35 UTC

Description Sergiusz Urbaniak 2020-10-05 13:41:21 UTC
This bug was initially created as a copy of Bug #1883461

I am copying this bug because: 



k8s.io/klog moved to v2 in client-go and friends. kube-rbac-proxy does not use v2 yet, and hence we lose one half of the logging output (the v2 half).
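
A minimal Go sketch of this failure mode (illustrative only, not code from kube-rbac-proxy; the klogv1/klogv2 aliases are chosen here for clarity): only the klog version whose flags get registered honors --v, while the other keeps its compiled-in defaults.

package main

import (
	"flag"

	klogv1 "k8s.io/klog"    // v1, typically pulled in transitively by older dependencies
	klogv2 "k8s.io/klog/v2" // v2, used directly by the binary
)

func main() {
	// Only klog v2 registers --v, --logtostderr, etc. on the flag set.
	klogv2.InitFlags(nil)
	flag.Parse()

	// Both packages also start their own flushDaemon goroutine in init(),
	// which is why two of them appear in the stack traces below.
	klogv2.V(4).Info("printed when running with --v=4")
	klogv1.V(4).Info("dropped: v1 never parsed --v, so its verbosity is still 0")
}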

Comment 2 Junqi Zhao 2020-10-13 09:04:25 UTC
Tested with 4.7.0-0.ci-2020-10-12-222453 using a non-existent config file; we can see klog/v2 in the log:
sh-4.4$ kube-rbac-proxy --config-file=asfd --v=4
I1013 09:02:38.607167      64 main.go:159] Reading config file: asfd
F1013 09:02:38.607220      64 main.go:162] Failed to read resource-attribute file: open asfd: no such file or directory
goroutine 1 [running]:
k8s.io/klog/v2.stacks(0xc00012a001, 0xc00051e280, 0x78, 0x99)
	/go/src/github.com/brancz/kube-rbac-proxy/vendor/k8s.io/klog/v2/klog.go:996 +0xb9
k8s.io/klog/v2.(*loggingT).output(0x1e95820, 0xc000000003, 0x0, 0x0, 0xc000322230, 0x1e00259, 0x7, 0xa2, 0x0)
	/go/src/github.com/brancz/kube-rbac-proxy/vendor/k8s.io/klog/v2/klog.go:945 +0x191
k8s.io/klog/v2.(*loggingT).printf(0x1e95820, 0x3, 0x0, 0x0, 0x14f7501, 0x2a, 0xc000607dc0, 0x1, 0x1)
	/go/src/github.com/brancz/kube-rbac-proxy/vendor/k8s.io/klog/v2/klog.go:733 +0x17a
k8s.io/klog/v2.Fatalf(...)
	/go/src/github.com/brancz/kube-rbac-proxy/vendor/k8s.io/klog/v2/klog.go:1463
main.main()
	/go/src/github.com/brancz/kube-rbac-proxy/main.go:162 +0x2fba

goroutine 18 [chan receive]:
k8s.io/klog.(*loggingT).flushDaemon(0x1e95740)
	/go/src/github.com/brancz/kube-rbac-proxy/vendor/k8s.io/klog/klog.go:1010 +0x8b
created by k8s.io/klog.init.0
	/go/src/github.com/brancz/kube-rbac-proxy/vendor/k8s.io/klog/klog.go:411 +0xd8

goroutine 19 [chan receive]:
k8s.io/klog/v2.(*loggingT).flushDaemon(0x1e95820)
	/go/src/github.com/brancz/kube-rbac-proxy/vendor/k8s.io/klog/v2/klog.go:1131 +0x8b
created by k8s.io/klog/v2.init.0
	/go/src/github.com/brancz/kube-rbac-proxy/vendor/k8s.io/klog/v2/klog.go:416 +0xd8

Comment 3 Junqi Zhao 2020-10-13 09:08:58 UTC
Looking at Comment 2, I think it is still a mix of k8s.io/klog v1 and v2; please correct me if I am wrong:
goroutine 18 [chan receive]:
k8s.io/klog.(*loggingT).flushDaemon(0x1e95740)
	/go/src/github.com/brancz/kube-rbac-proxy/vendor/k8s.io/klog/klog.go:1010 +0x8b
created by k8s.io/klog.init.0
	/go/src/github.com/brancz/kube-rbac-proxy/vendor/k8s.io/klog/klog.go:411 +0xd8

goroutine 19 [chan receive]:
k8s.io/klog/v2.(*loggingT).flushDaemon(0x1e95820)
	/go/src/github.com/brancz/kube-rbac-proxy/vendor/k8s.io/klog/v2/klog.go:1131 +0x8b
created by k8s.io/klog/v2.init.0
	/go/src/github.com/brancz/kube-rbac-proxy/vendor/k8s.io/klog/v2/klog.go:416 +0xd8
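
One way to confirm which klog major versions the build still pulls in (a suggested check from the repository root, not something recorded in this bug) is to inspect the module requirements; klog v2 lives at the module path k8s.io/klog/v2, so matching 'k8s.io/klog@' finds only v1 edges:

$ go list -m all | grep 'k8s.io/klog'
$ go mod graph | grep 'k8s.io/klog@'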

Comment 4 Pawel Krupa 2020-10-13 10:33:51 UTC
Great catch. It seems that one of the libraries used by kube-rbac-proxy is pinned to an older version that hasn't moved to klog v2 yet.

I created an upstream PR at https://github.com/brancz/kube-rbac-proxy/pull/95 and will port it downstream after it is merged.
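
For reference, the code side of such a move is mostly mechanical, since klog v1 and v2 are source-compatible at the call sites; a minimal sketch (assumed shape, not the actual PR diff):

package main

import (
	"flag"

	// Before the fix this import would have been "k8s.io/klog";
	// the call sites below compile unchanged against v2.
	klog "k8s.io/klog/v2"
)

func main() {
	klog.InitFlags(nil) // register --v, --logtostderr, ... for v2
	flag.Parse()

	klog.Infof("Reading config file: %s", "example") // same shape as the main.go:159 line above
}

The remaining work is in go.mod: bumping client-go and the other dependencies (the upstream PR above) so that no transitive dependency still requires k8s.io/klog v1.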

Comment 6 hongyan li 2020-10-23 08:40:48 UTC
The fix is in 4.7.0-0.nightly-2020-10-23-024149 or later payloads.

Comment 7 hongyan li 2020-10-23 10:38:13 UTC
Logging is back to normal; there is no v1 log output now.

sh-4.4$ kube-rbac-proxy --config-file=asfd --v=4
I1023 10:36:55.879406   16106 main.go:159] Reading config file: asfd
F1023 10:36:55.879467   16106 main.go:162] Failed to read resource-attribute file: open asfd: no such file or directory
goroutine 1 [running]:
k8s.io/klog/v2.stacks(0xc000010001, 0xc00013e140, 0x78, 0x99)
	/go/src/github.com/brancz/kube-rbac-proxy/vendor/k8s.io/klog/v2/klog.go:996 +0xb9
k8s.io/klog/v2.(*loggingT).output(0x22a52c0, 0xc000000003, 0x0, 0x0, 0xc000372690, 0x21fc6a1, 0x7, 0xa2, 0x0)
	/go/src/github.com/brancz/kube-rbac-proxy/vendor/k8s.io/klog/v2/klog.go:945 +0x191
k8s.io/klog/v2.(*loggingT).printf(0x22a52c0, 0x3, 0x0, 0x0, 0x177e3bc, 0x2a, 0xc000397dc0, 0x1, 0x1)
	/go/src/github.com/brancz/kube-rbac-proxy/vendor/k8s.io/klog/v2/klog.go:733 +0x17a
k8s.io/klog/v2.Fatalf(...)
	/go/src/github.com/brancz/kube-rbac-proxy/vendor/k8s.io/klog/v2/klog.go:1463
main.main()
	/go/src/github.com/brancz/kube-rbac-proxy/main.go:162 +0x2fba

goroutine 6 [chan receive]:
k8s.io/klog/v2.(*loggingT).flushDaemon(0x22a52c0)
	/go/src/github.com/brancz/kube-rbac-proxy/vendor/k8s.io/klog/v2/klog.go:1131 +0x8b
created by k8s.io/klog/v2.init.0
	/go/src/github.com/brancz/kube-rbac-proxy/vendor/k8s.io/klog/v2/klog.go:416 +0xd8
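
The same check can be made mechanical (assuming the invocation above): v1 stack frames reference vendor/k8s.io/klog/klog.go, without the /v2 path element, so grepping the crash output for that file should now match nothing.

sh-4.4$ kube-rbac-proxy --config-file=asfd --v=4 2>&1 | grep -c 'k8s.io/klog/klog.go'
0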

Comment 11 errata-xmlrpc 2021-02-24 15:23:10 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory (Moderate: OpenShift Container Platform 4.7.0 security, bug fix, and enhancement update), and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHSA-2020:5633

