Note: This bug is displayed in read-only format because the product is no longer active in Red Hat Bugzilla.

Bug 1619724

Summary: Installation failed on task [openshift_console : Waiting for console rollout to complete]
Product: OpenShift Container Platform
Component: Installer
Version: 3.11.0
Target Release: 3.11.0
Hardware: Unspecified
OS: Unspecified
Severity: medium
Priority: unspecified
Status: CLOSED NOTABUG
Reporter: Weibin Liang <weliang>
Assignee: Scott Dodson <sdodson>
QA Contact: Johnny Liu <jialiu>
CC: aos-bugs, jokerman, mmccomas, weliang
Type: Bug
Last Closed: 2018-08-27 15:42:36 UTC
Attachments:
Failed testing logs

Description Weibin Liang 2018-08-21 15:10:35 UTC
Description of problem:
Installation fails on task [openshift_console : Waiting for console rollout to complete] when the installer is run with os_sdn_network_plugin_name: cni and openshift_web_console_install: false.


Version-Release number of the following components:
rpm -q openshift-ansible
openshift-ansible-3.11.0-0.19.0.git.0.ebd1bf9None.noarch.rpm
openshift-ansible-docs-3.11.0-0.19.0.git.0.ebd1bf9None.noarch.rpm
openshift-ansible-playbooks-3.11.0-0.19.0.git.0.ebd1bf9None.noarch.rpm
openshift-ansible-roles-3.11.0-0.19.0.git.0.ebd1bf9None.noarch.rpm
rpm -q ansible
ansible-2.4.2.0-2.el7.noarch
ansible --version
ansible 2.4.2.0
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible
  python version = 2.7.5 (default, May 31 2018, 09:41:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-28)]


How reproducible:
Every time

Steps to Reproduce:
Run openshift-ansible with the following variables set:
os_sdn_network_plugin_name: cni
openshift_ansible_vars:
  openshift_web_console_install: false
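For reference, when openshift-ansible is run directly rather than through the Flexy job wrapper above, variables like these normally live in the [OSEv3:vars] section of the INI inventory. A minimal sketch (surrounding host groups omitted):

```ini
[OSEv3:vars]
# Use the CNI network plugin instead of the default openshift-ovs-subnet
os_sdn_network_plugin_name=cni
# Skip installing the web console
openshift_web_console_install=false
```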


Actual results:
TASK [openshift_console : Waiting for console rollout to complete] *************
Tuesday 21 August 2018  22:23:47 +0800 (0:00:00.209)       0:21:03.066 ******** 
fatal: [host-8-240-215.host.centralci.eng.rdu2.redhat.com]: FAILED! => {"changed": false, "cmd": ["oc", "rollout", "status", "deployment/console", "--config=/etc/origin/master/admin.kubeconfig", "-n", "openshift-console"], "delta": "0:09:58.610895", "end": "2018-08-21 10:33:46.028484", "msg": "non-zero return code", "rc": 1, "start": "2018-08-21 10:23:47.417589", "stderr": "error: deployment \"console\" exceeded its progress deadline", "stderr_lines": ["error: deployment \"console\" exceeded its progress deadline"], "stdout": "Waiting for deployment \"console\" rollout to finish: 0 of 1 updated replicas are available...", "stdout_lines": ["Waiting for deployment \"console\" rollout to finish: 0 of 1 updated replicas are available..."]}
...ignoring
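The rollout error itself only says the progress deadline was exceeded. A few standard oc commands (a diagnostic sketch, not taken from the original report; run against the affected cluster) usually show why the console pod never became available:

```shell
# Inspect the console pods; look for ImagePullBackOff, CrashLoopBackOff, or Pending
oc get pods -n openshift-console -o wide --config=/etc/origin/master/admin.kubeconfig

# Check deployment conditions and events for the failure reason
oc describe deployment console -n openshift-console --config=/etc/origin/master/admin.kubeconfig

# Pod logs, if the container started at all
oc logs deployment/console -n openshift-console --config=/etc/origin/master/admin.kubeconfig
```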

Expected results:
openshift-ansible installation finished without any errors.

Additional info:
https://openshift-qe-jenkins.rhev-ci-vms.eng.rdu2.redhat.com/job/Launch%20Environment%20Flexy/47009/consoleFull

Comment 1 Weibin Liang 2018-08-21 17:28:21 UTC
If only os_sdn_network_plugin_name: cni is set, without openshift_web_console_install: false, installation instead fails on TASK [openshift_web_console : Verify that the console is running]:

TASK [openshift_web_console : Verify that the console is running] **************
Tuesday 21 August 2018  23:49:06 +0800 (0:00:00.103)       0:12:45.450 ******** 
FAILED - RETRYING: Verify that the console is running (60 retries left).
FAILED - RETRYING: Verify that the console is running (59 retries left).
...
FAILED - RETRYING: Verify that the console is running (1 retries left).

fatal: [host-8-252-70.host.centralci.eng.rdu2.redhat.com]: FAILED! => {"attempts": 60, "changed": false, "results": {"cmd": "/usr/bin/oc get deployment webconsole -o json -n openshift-web-console", "results": [{"apiVersion": "extensions/v1beta1", "kind": "Deployment", "metadata": {"annotations": {"deployment.kubernetes.io/revision": "1", "kubectl.kubernetes.io/last-applied-configuration": "{\"apiVersion\":\"apps/v1beta1\",\"kind\":\"Deployment\",\"metadata\":{\"annotations\":{},\"labels\":{\"app\":\"openshift-web-console\",\"webconsole\":\"true\"},\"name\":\"webconsole\",\"namespace\":\"openshift-web-console\"},\"spec\":{\"replicas\":1,\"strategy\":{\"rollingUpdate\":{\"maxUnavailable\":\"100%\"},\"type\":\"RollingUpdate\"},\"template\":{\"metadata\":{\"labels\":{\"app\":\"openshift-web-console\",\"webconsole\":\"true\"},\"name\":\"webconsole\"},\"spec\":{\"containers\":[{\"command\":[\"/usr/bin/origin-web-console\",\"--audit-log-path=-\",\"-v=0\",\"--config=/var/webconsole-config/webconsole-config.yaml\"],\"image\":\"registry.dev.redhat.io/openshift3/ose-web-console:v3.11.0\",\"imagePullPolicy\":\"IfNotPresent\",\"livenessProbe\":{\"exec\":{\"command\":[\"/bin/sh\",\"-c\",\"if [[ ! 
-f /tmp/webconsole-config.hash ]]; then \\\\\\n  md5sum /var/webconsole-config/webconsole-config.yaml \\u003e /tmp/webconsole-config.hash; \\\\\\nelif [[ $(md5sum /var/webconsole-config/webconsole-config.yaml) != $(cat /tmp/webconsole-config.hash) ]]; then \\\\\\n  echo 'webconsole-config.yaml has changed.'; \\\\\\n  exit 1; \\\\\\nfi \\u0026\\u0026 curl -k -f https://0.0.0.0:8443/console/\"]}},\"name\":\"webconsole\",\"ports\":[{\"containerPort\":8443}],\"readinessProbe\":{\"httpGet\":{\"path\":\"/healthz\",\"port\":8443,\"scheme\":\"HTTPS\"}},\"resources\":{\"requests\":{\"cpu\":\"100m\",\"memory\":\"100Mi\"}},\"volumeMounts\":[{\"mountPath\":\"/var/serving-cert\",\"name\":\"serving-cert\"},{\"mountPath\":\"/var/webconsole-config\",\"name\":\"webconsole-config\"}]}],\"nodeSelector\":{\"node-role.kubernetes.io/master\":\"true\"},\"serviceAccountName\":\"webconsole\",\"volumes\":[{\"name\":\"serving-cert\",\"secret\":{\"defaultMode\":288,\"secretName\":\"webconsole-serving-cert\"}},{\"configMap\":{\"defaultMode\":288,\"name\":\"webconsole-config\"},\"name\":\"webconsole-config\"}]}}}}\n"}, "creationTimestamp": "2018-08-21T15:49:07Z", "generation": 1, "labels": {"app": "openshift-web-console", "webconsole": "true"}, "name": "webconsole", "namespace": "openshift-web-console", "resourceVersion": "3227", "selfLink": "/apis/extensions/v1beta1/namespaces/openshift-web-console/deployments/webconsole", "uid": "bd087f87-a559-11e8-a7c0-fa163eabc8f6"}, "spec": {"progressDeadlineSeconds": 600, "replicas": 1, "revisionHistoryLimit": 2, "selector": {"matchLabels": {"app": "openshift-web-console", "webconsole": "true"}}, "strategy": {"rollingUpdate": {"maxSurge": "25%", "maxUnavailable": "100%"}, "type": "RollingUpdate"}, "template": {"metadata": {"creationTimestamp": null, "labels": {"app": "openshift-web-console", "webconsole": "true"}, "name": "webconsole"}, "spec": {"containers": [{"command": ["/usr/bin/origin-web-console", "--audit-log-path=-", "-v=0", 
"--config=/var/webconsole-config/webconsole-config.yaml"], "image": "registry.dev.redhat.io/openshift3/ose-web-console:v3.11.0", "imagePullPolicy": "IfNotPresent", "livenessProbe": {"exec": {"command": ["/bin/sh", "-c", "if [[ ! -f /tmp/webconsole-config.hash ]]; then \\\n  md5sum /var/webconsole-config/webconsole-config.yaml > /tmp/webconsole-config.hash; \\\nelif [[ $(md5sum /var/webconsole-config/webconsole-config.yaml) != $(cat /tmp/webconsole-config.hash) ]]; then \\\n  echo 'webconsole-config.yaml has changed.'; \\\n  exit 1; \\\nfi && curl -k -f https://0.0.0.0:8443/console/"]}, "failureThreshold": 3, "periodSeconds": 10, "successThreshold": 1, "timeoutSeconds": 1}, "name": "webconsole", "ports": [{"containerPort": 8443, "protocol": "TCP"}], "readinessProbe": {"failureThreshold": 3, "httpGet": {"path": "/healthz", "port": 8443, "scheme": "HTTPS"}, "periodSeconds": 10, "successThreshold": 1, "timeoutSeconds": 1}, "resources": {"requests": {"cpu": "100m", "memory": "100Mi"}}, "terminationMessagePath": "/dev/termination-log", "terminationMessagePolicy": "File", "volumeMounts": [{"mountPath": "/var/serving-cert", "name": "serving-cert"}, {"mountPath": "/var/webconsole-config", "name": "webconsole-config"}]}], "dnsPolicy": "ClusterFirst", "nodeSelector": {"node-role.kubernetes.io/master": "true"}, "restartPolicy": "Always", "schedulerName": "default-scheduler", "securityContext": {}, "serviceAccount": "webconsole", "serviceAccountName": "webconsole", "terminationGracePeriodSeconds": 30, "volumes": [{"name": "serving-cert", "secret": {"defaultMode": 288, "secretName": "webconsole-serving-cert"}}, {"configMap": {"defaultMode": 288, "name": "webconsole-config"}, "name": "webconsole-config"}]}}}, "status": {"conditions": [{"lastTransitionTime": "2018-08-21T15:49:07Z", "lastUpdateTime": "2018-08-21T15:49:07Z", "message": "Deployment has minimum availability.", "reason": "MinimumReplicasAvailable", "status": "True", "type": "Available"}, {"lastTransitionTime": 
"2018-08-21T15:59:08Z", "lastUpdateTime": "2018-08-21T15:59:08Z", "message": "ReplicaSet \"webconsole-6b8bdf69cf\" has timed out progressing.", "reason": "ProgressDeadlineExceeded", "status": "False", "type": "Progressing"}], "observedGeneration": 1, "replicas": 1, "unavailableReplicas": 1, "updatedReplicas": 1}}], "returncode": 0}, "state": "list"}
...ignoring

Comment 2 Samuel Padgett 2018-08-22 20:13:00 UTC
I don't have access to https://openshift-qe-jenkins.rhev-ci-vms.eng.rdu2.redhat.com/job/Launch%20Environment%20Flexy/47009/consoleFull

Can you include the rest of the log from "Waiting for console rollout to complete" until the end?

If you want to disable both consoles (developer and admin), you need to set both:

openshift_web_console_install: false
openshift_console_install: false
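In the INI inventory these would sit alongside the other cluster variables; a minimal sketch of the [OSEv3:vars] fragment (assuming the standard openshift-ansible inventory layout):

```ini
[OSEv3:vars]
# Disable the developer (web) console
openshift_web_console_install=false
# Disable the admin console as well
openshift_console_install=false
```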

Comment 3 Weibin Liang 2018-08-23 13:17:42 UTC
Installation passed log:
https://openshift-qe-jenkins.rhev-ci-vms.eng.rdu2.redhat.com/job/Launch%20Environment%20Flexy/47308/console


Installation failed log: 
https://openshift-qe-jenkins.rhev-ci-vms.eng.rdu2.redhat.com/job/Launch%20Environment%20Flexy/47307/console

The difference between them is that the failed run set:
os_sdn_network_plugin_name: cni 
openshift_web_console_install: false
openshift_console_install: false

Comment 4 Samuel Padgett 2018-08-23 13:18:46 UTC
I can't access these log files. Can you attach them to the Bugzilla?

Comment 6 Weibin Liang 2018-08-24 18:58:31 UTC
Created attachment 1478611 [details]
Failed testing logs

Comment 7 Samuel Padgett 2018-08-24 19:43:29 UTC
Updating component to Installer since this fails when both console installs are disabled.

Comment 8 Scott Dodson 2018-08-27 14:07:16 UTC
https://openshift-qe-jenkins.rhev-ci-vms.eng.rdu2.redhat.com/job/Launch%20Environment%20Flexy/47307/console shows a failure in service catalog rollout after the web console bits were skipped as expected earlier in the test run.

The logs from comment 6 show failure logging into registry.dev.redhat.io well before the console playbooks should be skipped.

Logs from comment 0 are unfortunately unavailable. Right now it seems as if this bug cannot be reproduced.

Comment 9 Weibin Liang 2018-08-27 15:42:36 UTC
Cannot reproduce the failure logging into registry.dev.redhat.io with the latest v3.11.0-0.22.0.