Bug 1721650 - Install is failing for 4.2 nightly
Summary: Install is failing for 4.2 nightly
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: OpenShift Container Platform
Classification: Red Hat
Component: Installer
Version: 4.2.0
Hardware: Unspecified
OS: Unspecified
Priority: unspecified
Severity: urgent
Target Milestone: ---
Target Release: 4.2.0
Assignee: Abhinav Dahiya
QA Contact: sheng.lao
URL:
Whiteboard:
Depends On:
Blocks:
 
Reported: 2019-06-18 19:32 UTC by Eric Paris
Modified: 2019-10-16 06:32 UTC (History)
CC List: 12 users

Fixed In Version:
Doc Type: If docs needed, set a value
Doc Text:
Clone Of: 1721343
Environment:
Last Closed: 2019-10-16 06:32:02 UTC
Target Upstream Version:
Embargoed:


Links
Red Hat Product Errata RHBA-2019:2922 (last updated 2019-10-16 06:32:16 UTC)

Description Eric Paris 2019-06-18 19:32:18 UTC
+++ This bug was initially created as a clone of Bug #1721343 +++

Please see 4.2 nightlies at https://openshift-release.svc.ci.openshift.org/

Install is failing with the following error; more details can be found here: https://prow.svc.ci.openshift.org/view/gcs/origin-ci-test/logs/release-openshift-ocp-installer-e2e-aws-4.2/1


level=info msg="Waiting up to 30m0s for the Kubernetes API at https://api.ci-op-8qphyj32-e9160.origin-ci-int-aws.dev.rhcloud.com:6443..."
level=info msg="API v1.14.0+da2e2c0 up"
level=info msg="Waiting up to 30m0s for bootstrapping to complete..."
level=info msg="Pulling debug logs from the bootstrap machine"
level=info msg="Bootstrap gather logs captured here \"/tmp/artifacts/installer/log-bundle-20190617233221.tar.gz\""
level=fatal msg="failed to wait for bootstrapping to complete: timed out waiting for the condition"
2019/06/17 23:33:09 Container setup in pod e2e-aws failed, exit code 1, reason Error
Another process exited
2019/06/17 23:33:09 Container test in pod e2e-aws failed, exit code 1, reason Error
2019/06/17 23:44:13 Copied 3.21Mi of artifacts from e2e-aws to /logs/artifacts/e2e-aws
2019/06/17 23:44:19 Ran for 57m12s
---
Container test exited with code 1, reason Error
---
Another process exited
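
For context, the "timed out waiting for the condition" text in the fatal message matches the standard timeout error from the Kubernetes apimachinery wait helpers, which the installer polls with while waiting for bootstrap completion. A minimal Go sketch of such a wait loop; the condition function and the short interval/timeout values are illustrative, not the installer's actual code:

package main

import (
	"fmt"
	"time"

	"k8s.io/apimachinery/pkg/util/wait"
)

// bootstrapComplete is a hypothetical stand-in for the installer's real check
// that the bootstrap process has finished.
func bootstrapComplete() (bool, error) {
	// (false, nil) means "not done yet, keep polling".
	return false, nil
}

func main() {
	// The installer log above waits up to 30m0s; short values are used here
	// so the example finishes quickly.
	err := wait.PollImmediate(1*time.Second, 5*time.Second, bootstrapComplete)
	if err != nil {
		// wait.ErrWaitTimeout prints as "timed out waiting for the condition",
		// which is the text wrapped in the fatal installer message above.
		fmt.Printf("failed to wait for bootstrapping to complete: %v\n", err)
	}
}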

--- Additional comment from Maciej Szulik on 2019-06-18 10:17:11 UTC ---

Looking at the logs, it appears the release is broken:

F0618 06:05:24.136906       1 start.go:22] error: the config map openshift-config-managed/release-verification has an invalid key "verifier-public-key-redhat" that must be a GPG public key: openpgp: invalid data: tag byte does not have MSB set: openpgp: invalid data: tag byte does not have MSB set
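
For context, "openpgp: invalid data: tag byte does not have MSB set" is the structural parse error that Go's golang.org/x/crypto/openpgp package returns when data does not decode as OpenPGP packets, i.e. the config map value under "verifier-public-key-redhat" could not be parsed as a GPG public key. A minimal sketch that reproduces the same error on a made-up value; the parsing flow is illustrative, not the cluster-version-operator's exact code:

package main

import (
	"fmt"
	"strings"

	"golang.org/x/crypto/openpgp"
)

func main() {
	// Hypothetical config map value: plain text rather than OpenPGP key data.
	badValue := "not a gpg public key"

	// Parsing it as a binary key ring fails with the same structural error
	// seen in the log above. (A proper ASCII-armored public key would be
	// read with openpgp.ReadArmoredKeyRing instead.)
	if _, err := openpgp.ReadKeyRing(strings.NewReader(badValue)); err != nil {
		fmt.Println(err) // openpgp: invalid data: tag byte does not have MSB set
	}
}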

--- Additional comment from Sudha Ponnaganti on 2019-06-18 13:20:05 UTC ---

@luke.meyer - Can you take a look at this?

--- Additional comment from Eric Paris on 2019-06-18 13:31:30 UTC ---

Wild guess: I think this is likely https://github.com/openshift/cluster-update-keys/pull/15, not an ART problem.

--- Additional comment from Eric Paris on 2019-06-18 15:14:32 UTC ---

https://github.com/openshift/cluster-update-keys/pull/15

Comment 1 sheng.lao 2019-07-11 11:13:34 UTC
Verified with version 4.2.0-0.nightly-2019-07-11-071248

Comment 2 errata-xmlrpc 2019-10-16 06:32:02 UTC
Since the problem described in this bug report should be resolved in a recent advisory, it has been closed with a resolution of ERRATA.

For information on the advisory, and where to find the updated files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHBA-2019:2922

