Bug 1696393 - Overcloud deploy fails while pulling ceph-container from ceph container repository
Summary: Overcloud deploy fails while pulling ceph-container from ceph container repository
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: Red Hat OpenStack
Classification: Red Hat
Component: ansible-role-container-registry
Version: 15.0 (Stein)
Hardware: Unspecified
OS: Unspecified
Priority: medium
Severity: medium
Target Milestone: beta
Target Release: 15.0 (Stein)
Assignee: John Fulton
QA Contact: Aharon Canan
URL:
Whiteboard:
Depends On:
Blocks:
 
Reported: 2019-04-04 18:24 UTC by Alistair Tonner
Modified: 2019-09-26 10:49 UTC
CC: 5 users

Fixed In Version: ansible-role-container-registry-1.0.1-0.20190608060402.80af0d2.el8ost
Doc Type: If docs needed, set a value
Doc Text:
Clone Of:
Environment:
Last Closed: 2019-09-21 11:21:05 UTC
Target Upstream Version:
Embargoed:


Attachments


Links
System                  ID              Private  Priority  Status  Summary  Last Updated
Launchpad               1823821         0        None      None    None     2019-04-09 01:11:21 UTC
OpenStack gerrit        651051          0        None      None    None     2019-04-09 01:10:27 UTC
Red Hat Product Errata  RHEA-2019:2811  0        None      None    None     2019-09-21 11:21:23 UTC

Description Alistair Tonner 2019-04-04 18:24:29 UTC
Description of problem: Overcloud deploy fails to pull from the ceph repository with KeyError: 'layers'


Version-Release number of selected component (if applicable):

RHEL8 - 1845
RHOS_TRUNK-15.0-RHEL-8-20190403.n.3

ansible.noarch                                2.7.6-1.el8                                          @rhelosp-15.0-trunk
ansible-pacemaker.noarch                      1.0.4-0.20190129114541.0e4d7c0.el8ost                @rhelosp-15.0-trunk
ansible-role-atos-hsm.noarch                  0.1.1-0.20190306173142.f6f9c3f.el8ost                @rhelosp-15.0-trunk
ansible-role-chrony.noarch                    0.0.1-0.20190327040343.068668b.el8ost                @rhelosp-15.0-trunk
ansible-role-container-registry.noarch        1.0.1-0.20190219021249.d6a749a.el8ost                @rhelosp-15.0-trunk
ansible-role-redhat-subscription.noarch       1.0.2-0.20190215212927.13bf86d.el8ost                @rhelosp-15.0-trunk
ansible-role-thales-hsm.noarch                0.2.1-0.20190306204553.08b5efa.el8ost                @rhelosp-15.0-trunk
ansible-role-tripleo-modify-image.noarch      1.0.1-0.20190402220346.012209a.el8ost                @rhelosp-15.0-trunk
ansible-tripleo-ipsec.noarch                  9.0.1-0.20190220162047.f60ad6c.el8ost                @rhelosp-15.0-trunk
openstack-tripleo-common-containers.noarch    10.6.1-0.20190404000356.3398bec.el8ost               @rhelosp-15.0-trunk
openstack-tripleo-common.noarch               10.6.1-0.20190404000356.3398bec.el8ost               @rhelosp-15.0-trunk
openstack-tripleo-heat-templates.noarch       10.4.1-0.20190403221322.0d98720.el8ost               @rhelosp-15.0-trunk
openstack-tripleo-image-elements.noarch       10.3.1-0.20190325204940.253fe88.el8ost               @rhelosp-15.0-trunk
openstack-tripleo-puppet-elements.noarch      10.2.1-0.20190327211339.0f6cacb.el8ost               @rhelosp-15.0-trunk
openstack-tripleo-validations.noarch          10.3.1-0.20190403171315.a4c40f2.el8ost               @rhelosp-15.0-trunk
puppet-tripleo.noarch                         10.3.1-0.20190403180925.81d7714.el8ost               @rhelosp-15.0-trunk
python3-heat-agent-ansible.noarch             1.8.1-0.20190402070337.ad2a5d1.el8ost                @rhelosp-15.0-trunk
python3-tripleoclient-heat-installer.noarch   11.3.1-0.20190403170353.73cc438.el8ost               @rhelosp-15.0-trunk
python3-tripleoclient.noarch                  11.3.1-0.20190403170353.73cc438.el8ost               @rhelosp-15.0-trunk
python3-tripleo-common.noarch                 10.6.1-0.20190404000356.3398bec.el8ost               @rhelosp-15.0-trunk


How reproducible:

   


Steps to Reproduce:
  Run the attached script to deploy the overcloud.


Actual results:

In /var/log/tripleo-container-image-prepare.log:

Using config files: ['/tmp/tmpbpv5mulo']
imagename: quay.io/rhceph-dev/rhceph-4.0-rhel-8:latest
Image prepare failed: 'layers'
Traceback (most recent call last):
  File "/usr/bin/tripleo-container-image-prepare", line 131, in <module>
    env, roles_data, cleanup=args.cleanup, dry_run=args.dry_run)
  File "/usr/lib/python3.6/site-packages/tripleo_common/image/kolla_builder.py", line 203, in container_images_prepare_multi
    uploader.upload()
  File "/usr/lib/python3.6/site-packages/tripleo_common/image/image_uploader.py", line 181, in upload
    uploader.run_tasks()
  File "/usr/lib/python3.6/site-packages/tripleo_common/image/image_uploader.py", line 1570, in run_tasks
    result = uploader.upload_image(first_task)
  File "/usr/lib/python3.6/site-packages/tripleo_common/image/image_uploader.py", line 886, in upload_image
    source_layers = [l['digest'] for l in manifest['layers']]
KeyError: 'layers'
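
The failing lookup assumes a Docker schema 2 manifest, which lists its blob digests under 'layers'; schema 1 manifests (what quay.io was serving here, see comment 7) keep them under 'fsLayers' with 'blobSum' keys instead, so the indexing raises KeyError. A minimal sketch of a schema-tolerant lookup, assuming the manifest is already parsed to a dict (hypothetical helper name, not the shipped fix; the actual fix is in the linked gerrit review):

# Hypothetical sketch: collect layer digests from either manifest schema.
# Schema 2 stores them under 'layers' with 'digest' keys; schema 1 uses
# 'fsLayers' with 'blobSum' keys.
def source_layer_digests(manifest):
    if 'layers' in manifest:        # schema 2
        return [l['digest'] for l in manifest['layers']]
    if 'fsLayers' in manifest:      # schema 1
        return [l['blobSum'] for l in manifest['fsLayers']]
    raise ValueError("manifest has neither 'layers' nor 'fsLayers'")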


Expected results:

  Containers deployed, overcloud deploys successfully


Additional info:

  Also noted in /var/log/containers/mistral/overcloud/ansible.log:

fatal: [controller-0]: FAILED! => {"changed": true, "cmd": "podman pull 192.168.24.1:8787/rhosp15/openstack-cinder-volume:20190403.1

(for each controller)

Comment 1 Artem Hrechanychenko 2019-04-04 18:26:50 UTC
Also got that issue today on titan46

Comment 7 Steve Baker 2019-04-08 20:20:55 UTC
quay.io serves v1 manifests instead of the v2 manifests served by most other registries. The container image prepare has support for v1, but it looks like there is a bug in that support. I'll take a look.
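
For anyone reproducing the diagnosis, the schema a registry hands back can be checked directly. A rough sketch, assuming the spec-standard bearer-token flow advertised in the WWW-Authenticate challenge (registry behavior varies; this is illustrative, not what tripleo-common does):

import re
import requests

V2 = 'application/vnd.docker.distribution.manifest.v2+json'

def fetch_manifest(registry, repo, tag):
    # Ask for schema 2; a registry that only has schema 1 (as quay.io did
    # here) answers with a schemaVersion 1 document anyway.
    url = 'https://%s/v2/%s/manifests/%s' % (registry, repo, tag)
    resp = requests.get(url, headers={'Accept': V2})
    if resp.status_code == 401:
        # Standard token dance: the WWW-Authenticate challenge names the
        # token endpoint (realm) and service.
        fields = dict(re.findall(r'(\w+)="([^"]*)"',
                                 resp.headers['WWW-Authenticate']))
        token = requests.get(fields['realm'], params={
            'service': fields.get('service', ''),
            'scope': 'repository:%s:pull' % repo}).json()['token']
        resp = requests.get(url, headers={
            'Accept': V2, 'Authorization': 'Bearer %s' % token})
    resp.raise_for_status()
    return resp.json()

m = fetch_manifest('quay.io', 'rhceph-dev/rhceph-4.0-rhel-8', 'latest')
print(m.get('schemaVersion'), 'layers' in m, 'fsLayers' in m)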

Comment 10 John Fulton 2019-04-16 13:04:47 UTC
I hit this with ceph-ansible-4.0.0-0.beta1.95.g6d506dba.el8cp.noarch today, though it was working yesterday; all I did was redeploy. I then tried to podman pull the image from the CLI on my undercloud and saw:

Error determining manifest MIME type for docker://quay.io/rhceph-dev/rhceph-4.0-rhel-8:latest: pinging docker registry returned: Get https://quay.io/v2/: EOF

More details in:

 http://paste.openstack.org/show/749351/
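
For what it's worth, the "pinging docker registry" step in that error is just a GET against /v2/. A quick probe like the sketch below (illustrative only) separates a registry or network outage, where the connection drops with EOF as above, from the manifest-format bug earlier in this report, where the registry answers normally:

import requests

# A 200 or 401 from /v2/ means the registry endpoint itself is alive; an
# EOF/connection error at this step, as in the podman output above, points
# at the registry or the network, not at manifest parsing.
try:
    resp = requests.get('https://quay.io/v2/', timeout=10)
    print('registry answered:', resp.status_code)
except requests.exceptions.RequestException as exc:
    print('registry unreachable:', exc)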

Comment 20 errata-xmlrpc 2019-09-21 11:21:05 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory, and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHEA-2019:2811

