Bug 1618581 - [free-int] failed to start up kibana-proxy container
Summary: [free-int] failed to start up kibana-proxy container
Keywords:
Status: CLOSED DUPLICATE of bug 1615275
Alias: None
Product: OpenShift Online
Classification: Red Hat
Component: Logging
Version: 3.x
Hardware: Unspecified
OS: Unspecified
Priority: medium
Severity: medium
Target Milestone: ---
Target Release: ---
Assignee: Jeff Cantrill
QA Contact: Anping Li
URL:
Whiteboard:
Depends On:
Blocks:
 
Reported: 2018-08-17 02:52 UTC by Junqi Zhao
Modified: 2018-08-17 18:51 UTC (History)
1 user

Fixed In Version:
Doc Type: If docs needed, set a value
Doc Text:
Clone Of:
Environment:
Last Closed: 2018-08-17 18:51:11 UTC
Target Upstream Version:
Embargoed:



Description Junqi Zhao 2018-08-17 02:52:24 UTC
Description of problem:
On the free-int cluster, the logging-kibana pod is in CrashLoopBackOff; the kibana-proxy container log shows the error: cookie_secret must be 16, 24, or 32 bytes to create an AES cipher when pass_access_token == true or cookie_refresh != 0, but is 152 bytes

NAME                                       READY     STATUS             RESTARTS   AGE
logging-kibana-16-deploy                   0/1       Error              0          13d
logging-kibana-17-2cz7k                    1/2       CrashLoopBackOff   3743       13d

# oc describe pod logging-kibana-17-2cz7k -n logging
Containers:
  kibana:
    Container ID:   docker://84b7d726dab5b168efecc3440637a755e56719cc4e5ac931cab2dee8d4e24346
    Image:          registry.reg-aws.openshift.com:443/openshift3/ose-logging-kibana5:v3.11.0-0.10.0
    Image ID:       docker-pullable://registry.reg-aws.openshift.com:443/openshift3/ose-logging-kibana5@sha256:60e17fd97bdd7f9e05661a49431f28f4de924e382b8b4b08fbf39a68cba0ee54
    Port:           <none>
    Host Port:      <none>
    State:          Running
      Started:      Thu, 16 Aug 2018 17:32:14 +0000
    Last State:     Terminated
      Reason:       Error
      Exit Code:    137
      Started:      Thu, 16 Aug 2018 17:24:26 +0000
      Finished:     Thu, 16 Aug 2018 17:31:50 +0000
    Ready:          True
    Restart Count:  3
  kibana-proxy:
    Container ID:  docker://4cb150bc9d908e1fe750db1b0b9c055dbd01fcb6b79e536ebd1aefd0961de644
    Image:         registry.reg-aws.openshift.com:443/openshift3/oauth-proxy:v3.11.0-0.10.0
    Image ID:      docker-pullable://registry.reg-aws.openshift.com:443/openshift3/oauth-proxy@sha256:8b6ddce1939e729e62c68e8081077c08a4a1d8b9e3845e07e4850fb06cc40123
    Port:          3000/TCP
    Host Port:     0/TCP
    Args:
      --upstream-ca=/var/run/secrets/kubernetes.io/serviceaccount/ca.crt
      --https-address=:3000
      -provider=openshift
      -client-id=kibana-proxy
      -client-secret-file=/secret/oauth-secret
      -cookie-secret-file=/secret/session-secret
      -upstream=http://localhost:5601
      -scope=user:info user:check-access user:list-projects
      --tls-cert=/secret/server-cert
      --tls-key=/secret/server-key
      -pass-access-token
      -skip-provider-button
    State:          Waiting
      Reason:       CrashLoopBackOff
    Last State:     Terminated
      Reason:       Error
      Exit Code:    1

# oc logs -c kibana-proxy logging-kibana-17-2cz7k -n logging
2018/08/17 02:39:24 provider.go:476: Performing OAuth discovery against https://172.30.0.1/.well-known/oauth-authorization-server
2018/08/17 02:39:24 provider.go:522: 200 GET https://172.30.0.1/.well-known/oauth-authorization-server {
  "issuer": "https://api.free-int.openshift.com",
  "authorization_endpoint": "https://api.free-int.openshift.com/oauth/authorize",
  "token_endpoint": "https://api.free-int.openshift.com/oauth/token",
  "scopes_supported": [
    "user:check-access",
    "user:full",
    "user:info",
    "user:list-projects",
    "user:list-scoped-projects"
  ],
  "response_types_supported": [
    "code",
    "token"
  ],
  "grant_types_supported": [
    "authorization_code",
    "implicit"
  ],
  "code_challenge_methods_supported": [
    "plain",
    "S256"
  ]
}
2018/08/17 02:39:24 main.go:127: Invalid configuration:
  cookie_secret must be 16, 24, or 32 bytes to create an AES cipher when pass_access_token == true or cookie_refresh != 0, but is 152 bytes. note: cookie secret was base64 decoded from "hRJxuODfUAd2SA0dRPKi0KpBpeyHlwKuHcdgjhH5AVQlsYtwoiMijW2nMFnaYfoX1zdtMy0x3zqAaWC8RgSJu3oPzCPFJmJzCNP8KaWEh3jUEapmoIe7dfo4vDKfeqG6JFtMTQN2BvyF1Z9HmetH4zfvLJjWaK1cObLZseqJtkWYmcBb38GkdJkYstyEGXn0q9T6rdSi"
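Judging from the message above, oauth-proxy base64-decodes the contents of the mounted session-secret file and requires the decoded result to be exactly 16, 24, or 32 bytes (a valid AES key length) when -pass-access-token is enabled. As a sketch (assuming a standard base64-encoded secret file), a conforming secret could be generated and verified like this:

```shell
# Generate a session secret that base64-decodes to 32 bytes
# (one of the AES key lengths oauth-proxy accepts: 16, 24, or 32).
openssl rand -base64 32 > session-secret

# Verify the decoded length is 32 bytes.
base64 -d session-secret | wc -c
```

The 152-byte decoded value in the error suggests the deployed session-secret was generated as a long random string rather than as a base64-encoded AES key of a supported length.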


Version-Release number of selected component (if applicable):
logging version: v3.11.0-0.10.0


OpenShift Master:v3.11.0-0.16.0 
Kubernetes Master:v1.11.0+d4cacc0 
OpenShift Web Console:v3.11.0-0.16.0 

How reproducible:
Always

Steps to Reproduce:
1. Check the logging pods' status

Actual results:
The kibana-proxy container fails to start up.

Expected results:
The kibana-proxy container should start up successfully.

Additional info:

Comment 1 Jeff Cantrill 2018-08-17 18:51:11 UTC

*** This bug has been marked as a duplicate of bug 1615275 ***

