Bug 1907728 - [invalid bearer token, square/go-jose: error in cryptographic primitive, token lookup failed] in some kube-rbac-proxy containers
Summary: [invalid bearer token, square/go-jose: error in cryptographic primitive, token lookup failed] in some kube-rbac-proxy containers
Keywords:
Status: CLOSED INSUFFICIENT_DATA
Alias: None
Product: OpenShift Container Platform
Classification: Red Hat
Component: Monitoring
Version: 4.7
Hardware: Unspecified
OS: Unspecified
Priority: low
Severity: medium
Target Milestone: ---
Target Release: ---
Assignee: Pawel Krupa
QA Contact: Junqi Zhao
URL:
Whiteboard:
Depends On:
Blocks:
 
Reported: 2020-12-15 04:16 UTC by Junqi Zhao
Modified: 2022-03-21 19:53 UTC
CC List: 10 users

Fixed In Version:
Doc Type: If docs needed, set a value
Doc Text:
Clone Of:
Environment:
Last Closed: 2021-03-30 15:21:07 UTC
Target Upstream Version:
Embargoed:


Attachments
kube-rbac-proxy container logs (2.76 KB, text/plain)
2020-12-15 04:16 UTC, Junqi Zhao
daemonset node-exporter yaml file (11.37 KB, text/plain)
2020-12-15 04:18 UTC, Junqi Zhao

Description Junqi Zhao 2020-12-15 04:16:43 UTC
Created attachment 1739231 [details]
kube-rbac-proxy container logs

Description of problem:
Checking the kube-rbac-proxy container logs of the node-exporter pods shows "[invalid bearer token, square/go-jose: error in cryptographic primitive, token lookup failed]" in the kube-rbac-proxy container of some node-exporter pods, but not all of them:
# for i in $(oc -n openshift-monitoring get po | grep node-exporter| awk '{print $1}'); do echo $i; oc -n openshift-monitoring logs $i -c kube-rbac-proxy | grep "error in cryptographic primitive" | head -n 5; echo -e "\n"; done
node-exporter-4r4bl
E1215 02:15:49.829960    2724 proxy.go:73] Unable to authenticate the request due to an error: [invalid bearer token, square/go-jose: error in cryptographic primitive, token lookup failed]
E1215 02:16:04.816119    2724 proxy.go:73] Unable to authenticate the request due to an error: [invalid bearer token, square/go-jose: error in cryptographic primitive, token lookup failed]
E1215 02:16:19.825445    2724 proxy.go:73] Unable to authenticate the request due to an error: [invalid bearer token, square/go-jose: error in cryptographic primitive, token lookup failed]
E1215 02:16:34.815418    2724 proxy.go:73] Unable to authenticate the request due to an error: [invalid bearer token, square/go-jose: error in cryptographic primitive, token lookup failed]
E1215 02:16:49.819390    2724 proxy.go:73] Unable to authenticate the request due to an error: [invalid bearer token, square/go-jose: error in cryptographic primitive, token lookup failed]


node-exporter-7nmng
E1215 02:15:48.825964    5629 proxy.go:73] Unable to authenticate the request due to an error: [invalid bearer token, square/go-jose: error in cryptographic primitive, token lookup failed]
E1215 02:16:03.759746    5629 proxy.go:73] Unable to authenticate the request due to an error: [invalid bearer token, square/go-jose: error in cryptographic primitive, token lookup failed]
E1215 02:16:18.753322    5629 proxy.go:73] Unable to authenticate the request due to an error: [invalid bearer token, square/go-jose: error in cryptographic primitive, token lookup failed]
E1215 02:16:33.753387    5629 proxy.go:73] Unable to authenticate the request due to an error: [invalid bearer token, square/go-jose: error in cryptographic primitive, token lookup failed]
E1215 02:16:48.753577    5629 proxy.go:73] Unable to authenticate the request due to an error: [invalid bearer token, square/go-jose: error in cryptographic primitive, token lookup failed]


node-exporter-cv8vn


node-exporter-mj7jw


node-exporter-prbxw


node-exporter-ztnkd
****************************************
# oc -n openshift-monitoring get pod -o wide | grep -E "node-exporter-4r4bl|node-exporter-7nmng"
node-exporter-4r4bl                            2/2     Running   0          111m   10.0.1.5      whu47az2-62hg9-worker-usgovvirginia-lrf92   <none>           <none>
node-exporter-7nmng                            2/2     Running   0          124m   10.0.0.8      whu47az2-62hg9-master-2                     <none>           <none>

Debugging on the node running node-exporter-7nmng (whu47az2-62hg9-master-2):
# oc debug node/whu47az2-62hg9-master-2
Creating debug namespace/openshift-debug-node-xgbsh ...
Starting pod/whu47az2-62hg9-master-2-debug ...
To use host binaries, run `chroot /host`
Pod IP: 10.0.0.8
If you don't see a command prompt, try pressing enter.
sh-4.4# chroot /host
sh-4.4# journalctl | grep node-exporter-7nmng   
Dec 15 01:59:25 whu47az2-62hg9-master-2 hyperkube[1628]: I1215 01:59:25.635086    1628 kubelet.go:1834] SyncLoop (ADD, "api"): "node-exporter-7nmng_openshift-monitoring(81e0d5de-9e17-4d91-8968-46fb0ba6d482)"
Dec 15 01:59:25 whu47az2-62hg9-master-2 hyperkube[1628]: I1215 01:59:25.724433    1628 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "sys" (UniqueName: "kubernetes.io/host-path/81e0d5de-9e17-4d91-8968-46fb0ba6d482-sys") pod "node-exporter-7nmng" (UID: "81e0d5de-9e17-4d91-8968-46fb0ba6d482")
Dec 15 01:59:25 whu47az2-62hg9-master-2 hyperkube[1628]: I1215 01:59:25.724498    1628 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/81e0d5de-9e17-4d91-8968-46fb0ba6d482-node-exporter-tls") pod "node-exporter-7nmng" (UID: "81e0d5de-9e17-4d91-8968-46fb0ba6d482")
Dec 15 01:59:25 whu47az2-62hg9-master-2 hyperkube[1628]: I1215 01:59:25.724537    1628 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "root" (UniqueName: "kubernetes.io/host-path/81e0d5de-9e17-4d91-8968-46fb0ba6d482-root") pod "node-exporter-7nmng" (UID: "81e0d5de-9e17-4d91-8968-46fb0ba6d482")
Dec 15 01:59:25 whu47az2-62hg9-master-2 hyperkube[1628]: I1215 01:59:25.724570    1628 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "node-exporter-token-njg2s" (UniqueName: "kubernetes.io/secret/81e0d5de-9e17-4d91-8968-46fb0ba6d482-node-exporter-token-njg2s") pod "node-exporter-7nmng" (UID: "81e0d5de-9e17-4d91-8968-46fb0ba6d482")
Dec 15 01:59:25 whu47az2-62hg9-master-2 hyperkube[1628]: I1215 01:59:25.724669    1628 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "proc" (UniqueName: "kubernetes.io/host-path/81e0d5de-9e17-4d91-8968-46fb0ba6d482-proc") pod "node-exporter-7nmng" (UID: "81e0d5de-9e17-4d91-8968-46fb0ba6d482")
Dec 15 01:59:25 whu47az2-62hg9-master-2 hyperkube[1628]: I1215 01:59:25.724742    1628 reconciler.go:224] operationExecutor.VerifyControllerAttachedVolume started for volume "node-exporter-textfile" (UniqueName: "kubernetes.io/empty-dir/81e0d5de-9e17-4d91-8968-46fb0ba6d482-node-exporter-textfile") pod "node-exporter-7nmng" (UID: "81e0d5de-9e17-4d91-8968-46fb0ba6d482")
Dec 15 01:59:25 whu47az2-62hg9-master-2 hyperkube[1628]: I1215 01:59:25.825195    1628 reconciler.go:269] operationExecutor.MountVolume started for volume "root" (UniqueName: "kubernetes.io/host-path/81e0d5de-9e17-4d91-8968-46fb0ba6d482-root") pod "node-exporter-7nmng" (UID: "81e0d5de-9e17-4d91-8968-46fb0ba6d482")
Dec 15 01:59:25 whu47az2-62hg9-master-2 hyperkube[1628]: I1215 01:59:25.825272    1628 reconciler.go:269] operationExecutor.MountVolume started for volume "node-exporter-token-njg2s" (UniqueName: "kubernetes.io/secret/81e0d5de-9e17-4d91-8968-46fb0ba6d482-node-exporter-token-njg2s") pod "node-exporter-7nmng" (UID: "81e0d5de-9e17-4d91-8968-46fb0ba6d482")
Dec 15 01:59:25 whu47az2-62hg9-master-2 hyperkube[1628]: I1215 01:59:25.825305    1628 operation_generator.go:663] MountVolume.SetUp succeeded for volume "root" (UniqueName: "kubernetes.io/host-path/81e0d5de-9e17-4d91-8968-46fb0ba6d482-root") pod "node-exporter-7nmng" (UID: "81e0d5de-9e17-4d91-8968-46fb0ba6d482")
Dec 15 01:59:25 whu47az2-62hg9-master-2 hyperkube[1628]: I1215 01:59:25.825317    1628 reconciler.go:269] operationExecutor.MountVolume started for volume "proc" (UniqueName: "kubernetes.io/host-path/81e0d5de-9e17-4d91-8968-46fb0ba6d482-proc") pod "node-exporter-7nmng" (UID: "81e0d5de-9e17-4d91-8968-46fb0ba6d482")
Dec 15 01:59:25 whu47az2-62hg9-master-2 hyperkube[1628]: I1215 01:59:25.825362    1628 reconciler.go:269] operationExecutor.MountVolume started for volume "node-exporter-textfile" (UniqueName: "kubernetes.io/empty-dir/81e0d5de-9e17-4d91-8968-46fb0ba6d482-node-exporter-textfile") pod "node-exporter-7nmng" (UID: "81e0d5de-9e17-4d91-8968-46fb0ba6d482")
Dec 15 01:59:25 whu47az2-62hg9-master-2 hyperkube[1628]: I1215 01:59:25.825368    1628 operation_generator.go:663] MountVolume.SetUp succeeded for volume "proc" (UniqueName: "kubernetes.io/host-path/81e0d5de-9e17-4d91-8968-46fb0ba6d482-proc") pod "node-exporter-7nmng" (UID: "81e0d5de-9e17-4d91-8968-46fb0ba6d482")
Dec 15 01:59:25 whu47az2-62hg9-master-2 hyperkube[1628]: I1215 01:59:25.825387    1628 reconciler.go:269] operationExecutor.MountVolume started for volume "sys" (UniqueName: "kubernetes.io/host-path/81e0d5de-9e17-4d91-8968-46fb0ba6d482-sys") pod "node-exporter-7nmng" (UID: "81e0d5de-9e17-4d91-8968-46fb0ba6d482")
Dec 15 01:59:25 whu47az2-62hg9-master-2 hyperkube[1628]: I1215 01:59:25.825426    1628 reconciler.go:269] operationExecutor.MountVolume started for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/81e0d5de-9e17-4d91-8968-46fb0ba6d482-node-exporter-tls") pod "node-exporter-7nmng" (UID: "81e0d5de-9e17-4d91-8968-46fb0ba6d482")
Dec 15 01:59:25 whu47az2-62hg9-master-2 hyperkube[1628]: I1215 01:59:25.825473    1628 operation_generator.go:663] MountVolume.SetUp succeeded for volume "sys" (UniqueName: "kubernetes.io/host-path/81e0d5de-9e17-4d91-8968-46fb0ba6d482-sys") pod "node-exporter-7nmng" (UID: "81e0d5de-9e17-4d91-8968-46fb0ba6d482")
Dec 15 01:59:25 whu47az2-62hg9-master-2 hyperkube[1628]: E1215 01:59:25.825591    1628 nestedpendingoperations.go:301] Operation for "{volumeName:kubernetes.io/secret/81e0d5de-9e17-4d91-8968-46fb0ba6d482-node-exporter-tls podName:81e0d5de-9e17-4d91-8968-46fb0ba6d482 nodeName:}" failed. No retries permitted until 2020-12-15 01:59:26.32556863 +0000 UTC m=+83.203733528 (durationBeforeRetry 500ms). Error: "MountVolume.SetUp failed for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/81e0d5de-9e17-4d91-8968-46fb0ba6d482-node-exporter-tls\") pod \"node-exporter-7nmng\" (UID: \"81e0d5de-9e17-4d91-8968-46fb0ba6d482\") : secret \"node-exporter-tls\" not found"
Dec 15 01:59:25 whu47az2-62hg9-master-2 hyperkube[1628]: I1215 01:59:25.826141    1628 operation_generator.go:663] MountVolume.SetUp succeeded for volume "node-exporter-textfile" (UniqueName: "kubernetes.io/empty-dir/81e0d5de-9e17-4d91-8968-46fb0ba6d482-node-exporter-textfile") pod "node-exporter-7nmng" (UID: "81e0d5de-9e17-4d91-8968-46fb0ba6d482")
Dec 15 01:59:25 whu47az2-62hg9-master-2 hyperkube[1628]: I1215 01:59:25.839502    1628 operation_generator.go:663] MountVolume.SetUp succeeded for volume "node-exporter-token-njg2s" (UniqueName: "kubernetes.io/secret/81e0d5de-9e17-4d91-8968-46fb0ba6d482-node-exporter-token-njg2s") pod "node-exporter-7nmng" (UID: "81e0d5de-9e17-4d91-8968-46fb0ba6d482")
Dec 15 01:59:26 whu47az2-62hg9-master-2 hyperkube[1628]: I1215 01:59:26.327367    1628 reconciler.go:269] operationExecutor.MountVolume started for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/81e0d5de-9e17-4d91-8968-46fb0ba6d482-node-exporter-tls") pod "node-exporter-7nmng" (UID: "81e0d5de-9e17-4d91-8968-46fb0ba6d482")
Dec 15 01:59:26 whu47az2-62hg9-master-2 hyperkube[1628]: E1215 01:59:26.327672    1628 nestedpendingoperations.go:301] Operation for "{volumeName:kubernetes.io/secret/81e0d5de-9e17-4d91-8968-46fb0ba6d482-node-exporter-tls podName:81e0d5de-9e17-4d91-8968-46fb0ba6d482 nodeName:}" failed. No retries permitted until 2020-12-15 01:59:27.32763635 +0000 UTC m=+84.205801348 (durationBeforeRetry 1s). Error: "MountVolume.SetUp failed for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/81e0d5de-9e17-4d91-8968-46fb0ba6d482-node-exporter-tls\") pod \"node-exporter-7nmng\" (UID: \"81e0d5de-9e17-4d91-8968-46fb0ba6d482\") : secret \"node-exporter-tls\" not found"
Dec 15 01:59:27 whu47az2-62hg9-master-2 hyperkube[1628]: I1215 01:59:27.331515    1628 reconciler.go:269] operationExecutor.MountVolume started for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/81e0d5de-9e17-4d91-8968-46fb0ba6d482-node-exporter-tls") pod "node-exporter-7nmng" (UID: "81e0d5de-9e17-4d91-8968-46fb0ba6d482")
Dec 15 01:59:27 whu47az2-62hg9-master-2 hyperkube[1628]: E1215 01:59:27.331852    1628 nestedpendingoperations.go:301] Operation for "{volumeName:kubernetes.io/secret/81e0d5de-9e17-4d91-8968-46fb0ba6d482-node-exporter-tls podName:81e0d5de-9e17-4d91-8968-46fb0ba6d482 nodeName:}" failed. No retries permitted until 2020-12-15 01:59:29.331816395 +0000 UTC m=+86.209981393 (durationBeforeRetry 2s). Error: "MountVolume.SetUp failed for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/81e0d5de-9e17-4d91-8968-46fb0ba6d482-node-exporter-tls\") pod \"node-exporter-7nmng\" (UID: \"81e0d5de-9e17-4d91-8968-46fb0ba6d482\") : secret \"node-exporter-tls\" not found"
Dec 15 01:59:29 whu47az2-62hg9-master-2 hyperkube[1628]: I1215 01:59:29.340835    1628 reconciler.go:269] operationExecutor.MountVolume started for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/81e0d5de-9e17-4d91-8968-46fb0ba6d482-node-exporter-tls") pod "node-exporter-7nmng" (UID: "81e0d5de-9e17-4d91-8968-46fb0ba6d482")
Dec 15 01:59:29 whu47az2-62hg9-master-2 hyperkube[1628]: E1215 01:59:29.341251    1628 nestedpendingoperations.go:301] Operation for "{volumeName:kubernetes.io/secret/81e0d5de-9e17-4d91-8968-46fb0ba6d482-node-exporter-tls podName:81e0d5de-9e17-4d91-8968-46fb0ba6d482 nodeName:}" failed. No retries permitted until 2020-12-15 01:59:33.341204661 +0000 UTC m=+90.219369859 (durationBeforeRetry 4s). Error: "MountVolume.SetUp failed for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/81e0d5de-9e17-4d91-8968-46fb0ba6d482-node-exporter-tls\") pod \"node-exporter-7nmng\" (UID: \"81e0d5de-9e17-4d91-8968-46fb0ba6d482\") : secret \"node-exporter-tls\" not found"
Dec 15 01:59:33 whu47az2-62hg9-master-2 hyperkube[1628]: I1215 01:59:33.360496    1628 reconciler.go:269] operationExecutor.MountVolume started for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/81e0d5de-9e17-4d91-8968-46fb0ba6d482-node-exporter-tls") pod "node-exporter-7nmng" (UID: "81e0d5de-9e17-4d91-8968-46fb0ba6d482")
Dec 15 01:59:33 whu47az2-62hg9-master-2 hyperkube[1628]: E1215 01:59:33.360732    1628 nestedpendingoperations.go:301] Operation for "{volumeName:kubernetes.io/secret/81e0d5de-9e17-4d91-8968-46fb0ba6d482-node-exporter-tls podName:81e0d5de-9e17-4d91-8968-46fb0ba6d482 nodeName:}" failed. No retries permitted until 2020-12-15 01:59:41.360708879 +0000 UTC m=+98.238873777 (durationBeforeRetry 8s). Error: "MountVolume.SetUp failed for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/81e0d5de-9e17-4d91-8968-46fb0ba6d482-node-exporter-tls\") pod \"node-exporter-7nmng\" (UID: \"81e0d5de-9e17-4d91-8968-46fb0ba6d482\") : secret \"node-exporter-tls\" not found"
Dec 15 01:59:41 whu47az2-62hg9-master-2 hyperkube[1628]: I1215 01:59:41.396641    1628 reconciler.go:269] operationExecutor.MountVolume started for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/81e0d5de-9e17-4d91-8968-46fb0ba6d482-node-exporter-tls") pod "node-exporter-7nmng" (UID: "81e0d5de-9e17-4d91-8968-46fb0ba6d482")
Dec 15 01:59:41 whu47az2-62hg9-master-2 hyperkube[1628]: E1215 01:59:41.396880    1628 nestedpendingoperations.go:301] Operation for "{volumeName:kubernetes.io/secret/81e0d5de-9e17-4d91-8968-46fb0ba6d482-node-exporter-tls podName:81e0d5de-9e17-4d91-8968-46fb0ba6d482 nodeName:}" failed. No retries permitted until 2020-12-15 01:59:57.39684631 +0000 UTC m=+114.275011308 (durationBeforeRetry 16s). Error: "MountVolume.SetUp failed for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/81e0d5de-9e17-4d91-8968-46fb0ba6d482-node-exporter-tls\") pod \"node-exporter-7nmng\" (UID: \"81e0d5de-9e17-4d91-8968-46fb0ba6d482\") : secret \"node-exporter-tls\" not found"
Dec 15 01:59:57 whu47az2-62hg9-master-2 hyperkube[1628]: I1215 01:59:57.463999    1628 reconciler.go:269] operationExecutor.MountVolume started for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/81e0d5de-9e17-4d91-8968-46fb0ba6d482-node-exporter-tls") pod "node-exporter-7nmng" (UID: "81e0d5de-9e17-4d91-8968-46fb0ba6d482")
Dec 15 01:59:57 whu47az2-62hg9-master-2 hyperkube[1628]: I1215 01:59:57.477181    1628 operation_generator.go:663] MountVolume.SetUp succeeded for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/81e0d5de-9e17-4d91-8968-46fb0ba6d482-node-exporter-tls") pod "node-exporter-7nmng" (UID: "81e0d5de-9e17-4d91-8968-46fb0ba6d482")
Dec 15 01:59:57 whu47az2-62hg9-master-2 hyperkube[1628]: I1215 01:59:57.483900    1628 kuberuntime_manager.go:425] No sandbox for pod "node-exporter-7nmng_openshift-monitoring(81e0d5de-9e17-4d91-8968-46fb0ba6d482)" can be found. Need to start a new one
Dec 15 01:59:57 whu47az2-62hg9-master-2 crio[1590]: time="2020-12-15 01:59:57.484482455Z" level=info msg="Running pod sandbox: openshift-monitoring/node-exporter-7nmng/POD" id=f7d299a6-2dad-40cf-93b8-50fd3c26dfaf name=/runtime.v1alpha2.RuntimeService/RunPodSandbox
Dec 15 01:59:57 whu47az2-62hg9-master-2 crio[1590]: time="2020-12-15 01:59:57.863315711Z" level=info msg="Ran pod sandbox 24fe725ae5dc2c63726f96b6b6bf991b88644ffd2d08a92be8c1a5bb5b7aa5e8 with infra container: openshift-monitoring/node-exporter-7nmng/POD" id=f7d299a6-2dad-40cf-93b8-50fd3c26dfaf name=/runtime.v1alpha2.RuntimeService/RunPodSandbox
Dec 15 01:59:58 whu47az2-62hg9-master-2 hyperkube[1628]: I1215 01:59:58.056684    1628 kubelet.go:1872] SyncLoop (PLEG): "node-exporter-7nmng_openshift-monitoring(81e0d5de-9e17-4d91-8968-46fb0ba6d482)", event: &pleg.PodLifecycleEvent{ID:"81e0d5de-9e17-4d91-8968-46fb0ba6d482", Type:"ContainerStarted", Data:"24fe725ae5dc2c63726f96b6b6bf991b88644ffd2d08a92be8c1a5bb5b7aa5e8"}
Dec 15 01:59:59 whu47az2-62hg9-master-2 crio[1590]: time="2020-12-15 01:59:59.809806361Z" level=info msg="Creating container: openshift-monitoring/node-exporter-7nmng/init-textfile" id=1ef2ad81-1c87-4eb8-b13d-98be89eb21dc name=/runtime.v1alpha2.RuntimeService/CreateContainer
Dec 15 01:59:59 whu47az2-62hg9-master-2 crio[1590]: time="2020-12-15 01:59:59.946973273Z" level=info msg="Created container cdb9ed5fab7231ad764d7392948183d48a2a4ffb9f5e0f073a3a021c806d8740: openshift-monitoring/node-exporter-7nmng/init-textfile" id=1ef2ad81-1c87-4eb8-b13d-98be89eb21dc name=/runtime.v1alpha2.RuntimeService/CreateContainer
Dec 15 01:59:59 whu47az2-62hg9-master-2 crio[1590]: time="2020-12-15 01:59:59.960819336Z" level=info msg="Started container cdb9ed5fab7231ad764d7392948183d48a2a4ffb9f5e0f073a3a021c806d8740: openshift-monitoring/node-exporter-7nmng/init-textfile" id=a3e443c3-5ed6-481f-ac90-e3f372b7b844 name=/runtime.v1alpha2.RuntimeService/StartContainer
Dec 15 02:00:00 whu47az2-62hg9-master-2 hyperkube[1628]: I1215 02:00:00.061786    1628 kubelet.go:1872] SyncLoop (PLEG): "node-exporter-7nmng_openshift-monitoring(81e0d5de-9e17-4d91-8968-46fb0ba6d482)", event: &pleg.PodLifecycleEvent{ID:"81e0d5de-9e17-4d91-8968-46fb0ba6d482", Type:"ContainerStarted", Data:"cdb9ed5fab7231ad764d7392948183d48a2a4ffb9f5e0f073a3a021c806d8740"}
Dec 15 02:00:01 whu47az2-62hg9-master-2 hyperkube[1628]: I1215 02:00:01.064148    1628 kubelet.go:1872] SyncLoop (PLEG): "node-exporter-7nmng_openshift-monitoring(81e0d5de-9e17-4d91-8968-46fb0ba6d482)", event: &pleg.PodLifecycleEvent{ID:"81e0d5de-9e17-4d91-8968-46fb0ba6d482", Type:"ContainerDied", Data:"cdb9ed5fab7231ad764d7392948183d48a2a4ffb9f5e0f073a3a021c806d8740"}
Dec 15 02:00:01 whu47az2-62hg9-master-2 crio[1590]: time="2020-12-15 02:00:01.070878380Z" level=info msg="Creating container: openshift-monitoring/node-exporter-7nmng/node-exporter" id=e1725058-a3ad-4d16-a619-56812699c83f name=/runtime.v1alpha2.RuntimeService/CreateContainer
Dec 15 02:00:01 whu47az2-62hg9-master-2 crio[1590]: time="2020-12-15 02:00:01.242726021Z" level=info msg="Created container 89d2a7eb9bff0b7764a5e6e2387d446647d576b1c44479487cc3e0f30f74e8af: openshift-monitoring/node-exporter-7nmng/node-exporter" id=e1725058-a3ad-4d16-a619-56812699c83f name=/runtime.v1alpha2.RuntimeService/CreateContainer
Dec 15 02:00:01 whu47az2-62hg9-master-2 crio[1590]: time="2020-12-15 02:00:01.254698318Z" level=info msg="Started container 89d2a7eb9bff0b7764a5e6e2387d446647d576b1c44479487cc3e0f30f74e8af: openshift-monitoring/node-exporter-7nmng/node-exporter" id=a0b64187-af48-4885-aef5-5b6fa0d044be name=/runtime.v1alpha2.RuntimeService/StartContainer
Dec 15 02:00:01 whu47az2-62hg9-master-2 crio[1590]: time="2020-12-15 02:00:01.275683540Z" level=info msg="Creating container: openshift-monitoring/node-exporter-7nmng/kube-rbac-proxy" id=459b33ad-b7c1-46a7-9ebc-b64852611034 name=/runtime.v1alpha2.RuntimeService/CreateContainer
Dec 15 02:00:01 whu47az2-62hg9-master-2 crio[1590]: time="2020-12-15 02:00:01.454898556Z" level=info msg="Created container f0cff2d1bfcbda7a87523004ed81387332e25f46386eb76f1e519f761daaa528: openshift-monitoring/node-exporter-7nmng/kube-rbac-proxy" id=459b33ad-b7c1-46a7-9ebc-b64852611034 name=/runtime.v1alpha2.RuntimeService/CreateContainer
Dec 15 02:00:01 whu47az2-62hg9-master-2 crio[1590]: time="2020-12-15 02:00:01.472228943Z" level=info msg="Started container f0cff2d1bfcbda7a87523004ed81387332e25f46386eb76f1e519f761daaa528: openshift-monitoring/node-exporter-7nmng/kube-rbac-proxy" id=b4761e94-3946-4f55-86a1-fd529698ca18 name=/runtime.v1alpha2.RuntimeService/StartContainer
Dec 15 02:00:02 whu47az2-62hg9-master-2 hyperkube[1628]: I1215 02:00:02.069091    1628 kubelet.go:1872] SyncLoop (PLEG): "node-exporter-7nmng_openshift-monitoring(81e0d5de-9e17-4d91-8968-46fb0ba6d482)", event: &pleg.PodLifecycleEvent{ID:"81e0d5de-9e17-4d91-8968-46fb0ba6d482", Type:"ContainerStarted", Data:"f0cff2d1bfcbda7a87523004ed81387332e25f46386eb76f1e519f761daaa528"}
Dec 15 02:00:02 whu47az2-62hg9-master-2 hyperkube[1628]: I1215 02:00:02.069149    1628 kubelet.go:1872] SyncLoop (PLEG): "node-exporter-7nmng_openshift-monitoring(81e0d5de-9e17-4d91-8968-46fb0ba6d482)", event: &pleg.PodLifecycleEvent{ID:"81e0d5de-9e17-4d91-8968-46fb0ba6d482", Type:"ContainerStarted", Data:"89d2a7eb9bff0b7764a5e6e2387d446647d576b1c44479487cc3e0f30f74e8af"}
Dec 15 03:47:58 whu47az2-62hg9-master-2 hyperkube[1628]: I1215 03:47:58.674952    1628 logs.go:317] Finish parsing log file "/var/log/pods/openshift-monitoring_node-exporter-7nmng_81e0d5de-9e17-4d91-8968-46fb0ba6d482/node-exporter/0.log"
Dec 15 03:47:58 whu47az2-62hg9-master-2 hyperkube[1628]: I1215 03:47:58.901702    1628 logs.go:317] Finish parsing log file "/var/log/pods/openshift-monitoring_node-exporter-7nmng_81e0d5de-9e17-4d91-8968-46fb0ba6d482/kube-rbac-proxy/0.log"
Dec 15 03:52:53 whu47az2-62hg9-master-2 hyperkube[1628]: I1215 03:52:53.671318    1628 logs.go:317] Finish parsing log file "/var/log/pods/openshift-monitoring_node-exporter-7nmng_81e0d5de-9e17-4d91-8968-46fb0ba6d482/kube-rbac-proxy/0.log"
Dec 15 03:53:34 whu47az2-62hg9-master-2 hyperkube[1628]: I1215 03:53:34.304673    1628 logs.go:317] Finish parsing log file "/var/log/pods/openshift-monitoring_node-exporter-7nmng_81e0d5de-9e17-4d91-8968-46fb0ba6d482/node-exporter/0.log"
Dec 15 03:53:34 whu47az2-62hg9-master-2 hyperkube[1628]: I1215 03:53:34.569126    1628 logs.go:317] Finish parsing log file "/var/log/pods/openshift-monitoring_node-exporter-7nmng_81e0d5de-9e17-4d91-8968-46fb0ba6d482/kube-rbac-proxy/0.log"
Dec 15 03:54:47 whu47az2-62hg9-master-2 hyperkube[1628]: I1215 03:54:47.744752    1628 logs.go:317] Finish parsing log file "/var/log/pods/openshift-monitoring_node-exporter-7nmng_81e0d5de-9e17-4d91-8968-46fb0ba6d482/kube-rbac-proxy/0.log"
sh-4.4# cat /var/log/pods/openshift-monitoring_node-exporter-7nmng_81e0d5de-9e17-4d91-8968-46fb0ba6d482/kube-rbac-proxy/0.log
2020-12-15T02:00:01.511761565+00:00 stderr F I1215 02:00:01.511612    5629 main.go:190] Valid token audiences: 
2020-12-15T02:00:01.511843272+00:00 stderr F I1215 02:00:01.511761    5629 main.go:278] Reading certificate files
2020-12-15T02:00:01.512203505+00:00 stderr F I1215 02:00:01.512160    5629 main.go:311] Starting TCP socket on [10.0.0.8]:9100
2020-12-15T02:00:01.513020280+00:00 stderr F I1215 02:00:01.512988    5629 main.go:318] Listening securely on [10.0.0.8]:9100
2020-12-15T02:15:48.826021176+00:00 stderr F E1215 02:15:48.825964    5629 proxy.go:73] Unable to authenticate the request due to an error: [invalid bearer token, square/go-jose: error in cryptographic primitive, token lookup failed]
2020-12-15T02:16:03.759799154+00:00 stderr F E1215 02:16:03.759746    5629 proxy.go:73] Unable to authenticate the request due to an error: [invalid bearer token, square/go-jose: error in cryptographic primitive, token lookup failed]
2020-12-15T02:16:18.753378224+00:00 stderr F E1215 02:16:18.753322    5629 proxy.go:73] Unable to authenticate the request due to an error: [invalid bearer token, square/go-jose: error in cryptographic primitive, token lookup failed]
2020-12-15T02:16:33.753445413+00:00 stderr F E1215 02:16:33.753387    5629 proxy.go:73] Unable to authenticate the request due to an error: [invalid bearer token, square/go-jose: error in cryptographic primitive, token lookup failed]
2020-12-15T02:16:48.753630729+00:00 stderr F E1215 02:16:48.753577    5629 proxy.go:73] Unable to authenticate the request due to an error: [invalid bearer token, square/go-jose: error in cryptographic primitive, token lookup failed]
2020-12-15T02:17:03.745744995+00:00 stderr F E1215 02:17:03.745514    5629 proxy.go:73] Unable to authenticate the request due to an error: [invalid bearer token, square/go-jose: error in cryptographic primitive, token lookup failed]
2020-12-15T02:17:18.757029704+00:00 stderr F E1215 02:17:18.756973    5629 proxy.go:73] Unable to authenticate the request due to an error: [invalid bearer token, square/go-jose: error in cryptographic primitive, token lookup failed]
2020-12-15T02:17:33.755012158+00:00 stderr F E1215 02:17:33.754963    5629 proxy.go:73] Unable to authenticate the request due to an error: [invalid bearer token, square/go-jose: error in cryptographic primitive, token lookup failed]
2020-12-15T02:17:48.759355879+00:00 stderr F E1215 02:17:48.759308    5629 proxy.go:73] Unable to authenticate the request due to an error: [invalid bearer token, square/go-jose: error in cryptographic primitive, token lookup failed]
2020-12-15T02:18:03.753719763+00:00 stderr F E1215 02:18:03.753670    5629 proxy.go:73] Unable to authenticate the request due to an error: [invalid bearer token, square/go-jose: error in cryptographic primitive, token lookup failed]
2020-12-15T02:18:18.756052230+00:00 stderr F E1215 02:18:18.756007    5629 proxy.go:73] Unable to authenticate the request due to an error: [invalid bearer token, square/go-jose: error in cryptographic primitive, token lookup failed]
2020-12-15T02:18:33.760305549+00:00 stderr F E1215 02:18:33.760256    5629 proxy.go:73] Unable to authenticate the request due to an error: [invalid bearer token, square/go-jose: error in cryptographic primitive, token lookup failed]
2020-12-15T02:18:48.756544804+00:00 stderr F E1215 02:18:48.756497    5629 proxy.go:73] Unable to authenticate the request due to an error: [invalid bearer token, square/go-jose: error in cryptographic primitive, token lookup failed]
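
The journalctl excerpt above also shows the node-exporter-tls secret mount failing for roughly 30 seconds after the pod was scheduled before finally succeeding, while the authentication errors only start about 15 minutes later. A minimal sketch of follow-up checks, assuming the standard openshift-monitoring object names (prometheus-k8s-0, the in-pod service account token path) and using the node-exporter endpoint 10.0.1.5:9100 from the pod listing above; run the curl from a host that can reach the node network, and adjust names for the affected cluster:

# confirm the node-exporter-tls secret now exists and note when it was created
oc -n openshift-monitoring get secret node-exporter-tls -o jsonpath='{.metadata.creationTimestamp}{"\n"}'
# read the token Prometheus presents when scraping node-exporter (assumed pod name prometheus-k8s-0)
TOKEN=$(oc -n openshift-monitoring exec prometheus-k8s-0 -c prometheus -- cat /var/run/secrets/kubernetes.io/serviceaccount/token)
# replay a scrape against one affected endpoint with that token and compare the HTTP status
# with the errors that kube-rbac-proxy keeps logging for incoming requests
curl -ks -o /dev/null -w '%{http_code}\n' -H "Authorization: Bearer $TOKEN" https://10.0.1.5:9100/metrics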

Version-Release number of selected component (if applicable):
node_exporter version=1.0.1
4.7.0-0.nightly-2020-12-14-165231

How reproducible:
First time seeing this error; reproducibility unknown.

Steps to Reproduce:
1. Check the kube-rbac-proxy container logs of the node-exporter pods (a sketch follows the steps below).
2.
3.
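
A minimal sketch of step 1, mirroring the command from the description but counting matches per pod (assumes access to the openshift-monitoring namespace):

for i in $(oc -n openshift-monitoring get po -o name | grep node-exporter); do
  echo "$i"
  oc -n openshift-monitoring logs "$i" -c kube-rbac-proxy | grep -c "error in cryptographic primitive"
done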

Actual results:


Expected results:


Additional info:

Comment 1 Junqi Zhao 2020-12-15 04:18:59 UTC
Created attachment 1739232 [details]
daemonset node-exporter yaml file

