Note: This bug is displayed in read-only format because the product is no longer active in Red Hat Bugzilla.

Bug 1754569

Summary: [build-cop] release-openshift-origin-installer-e2e-aws-upgrade-4.2-to-4.3 failure
Product: OpenShift Container Platform
Reporter: Lokesh Mandvekar <lsm5>
Component: Release
Assignee: Luke Meyer <lmeyer>
Status: CLOSED ERRATA
QA Contact: sheng.lao <shlao>
Severity: unspecified
Docs Contact:
Priority: unspecified
Version: 4.2.0
CC: aos-bugs, calfonso, eparis, jokerman, kgarriso, shlao, trankin
Target Milestone: ---
Target Release: 4.3.0
Hardware: Unspecified
OS: Unspecified
Whiteboard:
Fixed In Version:
Doc Type: If docs needed, set a value
Doc Text:
Story Points: ---
Clone Of:
Environment:
Last Closed: 2020-01-23 11:06:54 UTC
Type: Bug
Regression: ---
Mount Type: ---
Documentation: ---
CRM:
Verified Versions:
Category: ---
oVirt Team: ---
RHEL 7.3 requirements from Atomic Host:
Cloudforms Team: ---
Target Upstream Version:
Embargoed:

Description Lokesh Mandvekar 2019-09-23 15:17:32 UTC
Description of problem:

See: https://prow.svc.ci.openshift.org/view/gcs/origin-ci-test/logs/release-openshift-origin-installer-e2e-aws-upgrade-4.2-to-4.3/10

Seems like this is an issue with not being able to fetch the tarball.

   % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed

  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100    58  100    58    0     0    229      0 --:--:-- --:--:-- --:--:--   229
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed

  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100    58  100    58    0     0    231      0 --:--:-- --:--:-- --:--:--   236
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/usr/lib64/python2.7/json/__init__.py", line 290, in load
    **kw)
  File "/usr/lib64/python2.7/json/__init__.py", line 338, in loads
    return _default_decoder.decode(s)
  File "/usr/lib64/python2.7/json/decoder.py", line 366, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/lib64/python2.7/json/decoder.py", line 384, in raw_decode
    raise ValueError("No JSON object could be decoded")
ValueError: No JSON object could be decoded
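
The traceback is what Python's json module raises when the body piped into json.load is not valid JSON at all, e.g. when the curl fetch returned an error page, a truncated download, or an empty response instead of the expected release metadata. A minimal sketch of the failure mode (in Python 3, where the same error surfaces as json.JSONDecodeError, a ValueError subclass; the helper name is hypothetical, not from the CI script):

```python
import json


def parse_release_metadata(body):
    """Parse a fetched response body as JSON, failing with context.

    Hypothetical helper for illustration: the CI job piped curl output
    straight into json.load, so any non-JSON body surfaced as a bare
    "No JSON object could be decoded" ValueError with no hint of what
    was actually fetched.
    """
    try:
        return json.loads(body)
    except ValueError as exc:  # json.JSONDecodeError subclasses ValueError
        raise ValueError(
            "response was not JSON (first 80 bytes: %r): %s"
            % (body[:80], exc)
        )


# An empty or HTML body reproduces the failure seen in the log:
try:
    parse_release_metadata("")
except ValueError as exc:
    print(exc)
```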

Comment 1 Kirsten Garrison 2019-09-23 16:56:37 UTC
This BZ has nothing to do with MCO, unsure why it is filed as such. Moving to Unknown component so you can reassign.

Comment 2 Kirsten Garrison 2019-09-23 17:32:12 UTC
Spoke to EParis, putting this into Test Infra as the initial tarball never loaded and entire thing failed in 5s. Probably a one-off failure though.

Comment 3 Lokesh Mandvekar 2019-09-23 18:00:36 UTC
Sorry about the wrong component choice. Btw, this is currently causing a lot of failures for 4.3 related upgrade and rollback tests. https://prow.svc.ci.openshift.org/?job=release-*-upgrade*&state=failure

Comment 4 Lokesh Mandvekar 2019-09-23 18:13:28 UTC
(In reply to Lokesh Mandvekar from comment #3)
> Sorry about the wrong component choice. Btw, this is currently causing a lot
> of failures for 4.3 related upgrade and rollback tests.
> https://prow.svc.ci.openshift.org/?job=release-*-upgrade*&state=failure

Couple of recent failures for release-openshift-origin-installer-e2e-aws-upgrade (see: https://prow.svc.ci.openshift.org/view/gcs/origin-ci-test/logs/release-openshift-origin-installer-e2e-aws-upgrade/7465 )  have just a blank line error log. Haven't seen those occur in the past few days though.

Comment 5 Eric Paris 2019-09-23 19:01:03 UTC
4.2 to 4.3 is known to fail. Please keep filing bugs about it though. The PR referenced above should make the error a bit more clear. (It's going to fail because it's looking for a 'stable' 4.2 release, and we don't have a stable 4.2 release, and that's a known failure.)

Comment 6 Lokesh Mandvekar 2019-09-23 19:24:09 UTC
(In reply to Lokesh Mandvekar from comment #4)
> (In reply to Lokesh Mandvekar from comment #3)
> > Sorry about the wrong component choice. Btw, this is currently causing a lot
> > of failures for 4.3 related upgrade and rollback tests.
> > https://prow.svc.ci.openshift.org/?job=release-*-upgrade*&state=failure
> 
> Couple of recent failures for
> release-openshift-origin-installer-e2e-aws-upgrade (see:
> https://prow.svc.ci.openshift.org/view/gcs/origin-ci-test/logs/release-
> openshift-origin-installer-e2e-aws-upgrade/7465 )  have just a blank line
> error log. Haven't seen those occur in the past few days though.

Some change in the most recent failed run at https://prow.svc.ci.openshift.org/view/gcs/origin-ci-test/logs/release-openshift-origin-installer-e2e-aws-upgrade/7466
Ran for about 1h47m41s. The failures look similar to Bug 1754523

Comment 7 Lokesh Mandvekar 2019-09-23 19:26:25 UTC
Reverting status.

Comment 8 Eric Paris 2019-09-23 19:28:06 UTC
The original report in this BZ was about the release-openshift-origin-installer-e2e-aws-upgrade-4.2-to-4.3 job and curl failure

Comment #6 is about the release-openshift-origin-installer-e2e-aws job and has nothing to do with the curl/json failure. Was that comment a mistake?

Comment 9 Steve Kuznetsov 2019-09-23 19:29:10 UTC
Test Infra has been publishing candidate 4.3 images for months. This is a bug against the release tooling for not testing any and not publishing a candidate release before branching.

Comment 10 Lokesh Mandvekar 2019-09-23 19:37:01 UTC
(In reply to Eric Paris from comment #8)
> The original report in this BZ was about the
> release-openshift-origin-installer-e2e-aws-upgrade-4.2-to-4.3 job and curl
> failure
> 
> Comment #6 is about releae-openshift-origin-installer-e2e-aws job and
> nothing to do with curl/json failure.  Was that comment a mistake?

Yes. I'll check if the new issue has been filed prior. Sorry about the mixup.

Comment 15 errata-xmlrpc 2020-01-23 11:06:54 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory, and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHBA-2020:0062