Bug 1779517 - error pulling image 10.101.52.5:8787/rhosp15-rhel8/openstack-cinder-*:15.0-79
Summary: error pulling image 10.101.52.5:8787/rhosp15-rhel8/openstack-cinder-*:15.0-79
Keywords:
Status: CLOSED WORKSFORME
Alias: None
Product: Red Hat OpenStack
Classification: Red Hat
Component: openstack-tripleo-common
Version: 15.0 (Stein)
Hardware: x86_64
OS: Linux
Priority: medium
Severity: urgent
Target Milestone: ---
Target Release: ---
Assignee: Adriano Petrich
QA Contact: Alexander Chuzhoy
URL:
Whiteboard:
Depends On:
Blocks:
 
Reported: 2019-12-04 06:20 UTC by Salman Khan
Modified: 2020-01-13 14:53 UTC
CC: 4 users

Fixed In Version:
Doc Type: If docs needed, set a value
Doc Text:
Clone Of:
Environment:
Last Closed: 2020-01-13 14:53:19 UTC
Target Upstream Version:
Embargoed:


Attachments
  overcloud deployment ansible.log (64.59 KB, application/x-xz), 2019-12-04 06:20 UTC, Salman Khan, no flags
  overcloud deployment ansible.log (3.17 MB, text/plain), 2020-01-02 13:33 UTC, Salman Khan, msalmanmasood: review+
  ceph ansible deployment log trying to pull from different registry (4.98 MB, text/plain), 2020-01-02 13:34 UTC, Salman Khan, no flags
  triple-container-image-prepare.log failing due to unauthorized errors (1.02 MB, text/plain), 2020-01-09 12:18 UTC, Salman Khan, no flags
  overcloud deployment ansible.log (1.56 MB, text/plain), 2020-01-13 05:27 UTC, Salman Khan, no flags

Description Salman Khan 2019-12-04 06:20:25 UTC
Created attachment 1641897: overcloud deployment ansible.log

Description of problem:


Version-Release number of selected component (if applicable):


How reproducible:

Fully virtual setup comprising 3x controller, 2x compute, and 4x Ceph nodes, running on a KVM host with vBMC providing IPMI for the Ironic drivers to manage the overcloud nodes.

Steps to Reproduce:
1. Install a fresh undercloud on RHEL 8.
2. Enable the core projects for deployment.
3. During overcloud deployment, only the controllers fail while pulling a newer version of the cinder containers than the one downloaded into the undercloud local registry.

Actual results:

ansible.log:

~~~
2019-12-04 08:38:27,981 p=220978 u=mistral |  TASK [tripleo-container-tag : Pull 10.101.52.5:8787/rhosp15-rhel8/openstack-cinder-volume:15.0-79 image] ***
2019-12-04 08:38:27,981 p=220978 u=mistral |  Wednesday 04 December 2019  08:38:27 +0300 (0:00:01.814)       0:06:14.333 **** 
2019-12-04 08:38:28,424 p=220978 u=mistral |  skipping: [devpod-cephstore-00] => {"changed": false, "skip_reason": "Conditional result was False"}
2019-12-04 08:38:28,526 p=220978 u=mistral |  skipping: [devpod-cephstore-01] => {"changed": false, "skip_reason": "Conditional result was False"}
2019-12-04 08:38:28,632 p=220978 u=mistral |  skipping: [devpod-cephstore-02] => {"changed": false, "skip_reason": "Conditional result was False"}
2019-12-04 08:38:28,734 p=220978 u=mistral |  skipping: [devpod-cephstore-03] => {"changed": false, "skip_reason": "Conditional result was False"}
2019-12-04 08:38:28,834 p=220978 u=mistral |  skipping: [devpod-compute-00] => {"changed": false, "skip_reason": "Conditional result was False"}
2019-12-04 08:38:28,865 p=220978 u=mistral |  skipping: [devpod-compute-01] => {"changed": false, "skip_reason": "Conditional result was False"}
2019-12-04 08:38:30,712 p=220978 u=mistral |  fatal: [devpod-controller-00]: FAILED! => {"changed": true, "cmd": "podman pull 10.101.52.5:8787/rhosp15-rhel8/openstack-cinder-volume:15.0-79", "delta": "0:00:02.387111", "end": "2019-12-04 08:38:30.681593", "msg": "non-zero return code", "rc": 125, "start": "2019-12-04 08:38:28.294482", "stderr": "Trying to pull 10.101.52.5:8787/rhosp15-rhel8/openstack-cinder-volume:15.0-79...Getting image source signatures
Copying blob sha256:ba293c69720e135eb642e9481565ef0e7f1fc5d579e0e9026fbc755c340481b3
Copying blob sha256:93dce31b122d9fb675286d4236a78d957c57cae88361f7a39b13457fc8082a0f
Copying blob sha256:f5b4a35604737933303cb3b31d867225c251ca1c4ee085fd136e0c813ee29bfd
Copying blob sha256:79fcffea81d5cd8344890e0bef152fb3f838864c655f3a571d409ffe28fe703f
Copying blob sha256:7b9dc4a79650db3e40784fbb40882ab602dadaf2fb4a445223cd3646b55c2a4f
Copying blob sha256:2c1b88e43dfdd9aaeb726f65cad06bfa000d3b7e7943803a7dacf15ceff461c8
Failed
error pulling image "10.101.52.5:8787/rhosp15-rhel8/openstack-cinder-volume:15.0-79": unable to pull 10.101.52.5:8787/rhosp15-rhel8/openstack-cinder-volume:15.0-79: unable to pull image: Error reading blob sha256:ba293c69720e135eb642e9481565ef0e7f1fc5d579e0e9026fbc755c340481b3: Digest did not match, expected sha256:ba293c69720e135eb642e9481565ef0e7f1fc5d579e0e9026fbc755c340481b3, got sha256:e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855", "stderr_lines": ["Trying to pull 10.101.52.5:8787/rhosp15-rhel8/openstack-cinder-volume:15.0-79...Getting image source signatures", "Copying blob sha256:ba293c69720e135eb642e9481565ef0e7f1fc5d579e0e9026fbc755c340481b3", "Copying blob sha256:93dce31b122d9fb675286d4236a78d957c57cae88361f7a39b13457fc8082a0f", "Copying blob sha256:f5b4a35604737933303cb3b31d867225c251ca1c4ee085fd136e0c813ee29bfd", "Copying blob sha256:79fcffea81d5cd8344890e0bef152fb3f838864c655f3a571d409ffe28fe703f", "Copying blob sha256:7b9dc4a79650db3e40784fbb40882ab602dadaf2fb4a445223cd3646b55c2a4f", "Copying blob sha256:2c1b88e43dfdd9aaeb726f65cad06bfa000d3b7e7943803a7dacf15ceff461c8", "Failed", "error pulling image "10.101.52.5:8787/rhosp15-rhel8/openstack-cinder-volume:15.0-79": unable to pull 10.101.52.5:8787/rhosp15-rhel8/openstack-cinder-volume:15.0-79: unable to pull image: Error reading blob sha256:ba293c69720e135eb642e9481565ef0e7f1fc5d579e0e9026fbc755c340481b3: Digest did not match, expected sha256:ba293c69720e135eb642e9481565ef0e7f1fc5d579e0e9026fbc755c340481b3, got sha256:e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"], "stdout": "", "stdout_lines": []}
~~~

Expected results:

The overcloud deploys successfully.

Additional info:

Will be provided if needed.

Comment 1 Salman Khan 2019-12-04 06:22:29 UTC
containers-prepare-parameter.yaml


~~~
parameter_defaults:
  ContainerImagePrepare:
  - tag_from_label: "{version}-{release}"
    push_destination: true
    excludes:
    - nova-api
    set:
      ceph_image: rhceph-4-rhel8
      ceph_namespace: registry.redhat.io/rhceph-beta
      ceph_tag: 4-4
      name_prefix: openstack-
      name_suffix: ''
      namespace: registry.redhat.io/rhosp15-rhel8
      neutron_driver: ovn
      tag: latest
      barbican_tag: latest
      #barbican_namespace: registry.redhat.io/rhosp15-rhel8
      barbican_api_image: barbican-api
      barbican_keystone_image: barbican-keystone
      barbican_worker_image: barbican-worker
  - push_destination: true
    includes:
    - nova-api
    set:
      namespace: registry.redhat.io/rhosp15-rhel8
      tag: 15.0-69
  ContainerImageRegistryCredentials:
    registry.redhat.io:
      USER : PASS
~~~
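
For comparison, the stock version of this file can be regenerated on the undercloud and diffed against the customization above. A minimal sketch, assuming the documented OSP 15 tripleoclient syntax and that the files live in the stack user's home directory:

~~~
# Regenerate the default ContainerImagePrepare parameters with a local push
# destination; this writes a new file and does not touch the one in use.
openstack tripleo container image prepare default \
  --local-push-destination \
  --output-env-file ~/containers-prepare-parameter.default.yaml

# Compare the generated defaults against the customized file shown above.
diff -u ~/containers-prepare-parameter.default.yaml ~/containers-prepare-parameter.yaml
~~~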

Comment 2 Salman Khan 2019-12-04 18:20:09 UTC
The local registry contains the following:

[root@devpod-controller-00 heat-admin]# podman search "10.101.52.5:8787/rhosp15-rhel8/openstack-cinder"
INDEX       NAME                                                        DESCRIPTION   STARS   OFFICIAL   AUTOMATED
52.5:8787   10.101.52.5:8787/rhosp15-rhel8/openstack-cinder-volume                    0                  
52.5:8787   10.101.52.5:8787/rhosp15-rhel8/openstack-cinder-api                       0                  
52.5:8787   10.101.52.5:8787/rhosp15-rhel8/openstack-cinder-backup                    0                  
52.5:8787   10.101.52.5:8787/rhosp15-rhel8/openstack-cinder-scheduler                 0                  
[root@devpod-controller-00 heat-admin]# 


The strange part: during deployment, why does it try to pull a newer version than what is available in the local registry?

I did attempt to update the local registry with:

sudo openstack tripleo container image prepare   -e ~/containers-prepare-parameter.yaml

Any thoughts?
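
For reference, one way to see exactly which tags the local registry is serving (as opposed to what podman has cached) is to query it directly. A sketch, assuming the undercloud image-serve registry on 10.101.52.5:8787 exposes the standard Docker Registry v2 API over plain HTTP:

~~~
# List the repositories the local registry serves.
curl -s http://10.101.52.5:8787/v2/_catalog

# List the tags available for cinder-volume; compare this against the
# 15.0-79 tag the deployment is trying to pull.
curl -s http://10.101.52.5:8787/v2/rhosp15-rhel8/openstack-cinder-volume/tags/list
~~~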

Comment 3 Salman Khan 2019-12-04 18:30:29 UTC
Installed TripleO packages:


(undercloud) [stack@smanager ~]$ rpm -qa | grep tripleo
python3-tripleoclient-heat-installer-11.5.1-0.20190829110437.9b9b5aa.el8ost.noarch
ansible-role-tripleo-modify-image-1.1.1-0.20190711170427.eabaed0.el8ost.noarch
openstack-tripleo-common-containers-10.8.2-0.20190913130445.4754dea.el8ost.noarch
openstack-tripleo-image-elements-10.4.1-0.20190705161217.2c8a6a5.el8ost.noarch
openstack-tripleo-validations-10.5.1-0.20190830060432.585c119.el8ost.noarch
python3-tripleo-common-10.8.2-0.20190913130445.4754dea.el8ost.noarch
openstack-tripleo-heat-templates-10.6.2-0.20190923210442.7db107a.el8ost.noarch
python3-tripleoclient-11.5.1-0.20190829110437.9b9b5aa.el8ost.noarch
openstack-tripleo-puppet-elements-10.3.2-0.20190820220452.5453b89.el8ost.noarch
openstack-tripleo-common-10.8.2-0.20190913130445.4754dea.el8ost.noarch
ansible-tripleo-ipsec-9.1.1-0.20190513190404.ffe104c.el8ost.noarch
puppet-tripleo-10.5.1-0.20190812120435.ed6c6b0.el8ost.noarch

Comment 4 Salman Khan 2019-12-05 12:19:31 UTC
The same issue is reported here too:

https://bugzilla.redhat.com/show_bug.cgi?id=1684301

Comment 5 Alex Schultz 2019-12-12 22:55:53 UTC
`podman images` does not show the images in the registry. The registry images are kept in /var/lib/image-serve/v2/*. You could try clearing out the directory and rerunning the prepare command to refetch.  If you do not specify a specific tag, it will always grab the latest version available (which may be newer than what you have).

I'm not sure why the digests are not matching, so you could try clearing out /var/lib/image-serve/v2/ and rerunning. 

You can clear out the registry by running the following as root:

rm -rf /var/lib/image-serve/v2/*
touch /var/lib/image-serve/v2/_catalog
echo "{}" > /var/lib/image-serve/v2/index.json


In OSP16 we will be providing commands via the tripleoclient to do basic list/push/delete operations for the image registry but they are not available in OSP15.
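
For what it's worth, the "got" digest in the failure above (sha256:e3b0c442...) is the SHA-256 of zero-byte input, which would point at an empty or truncated blob being served. A quick check sketch, assuming the blobs are stored as /var/lib/image-serve/v2/<namespace>/<image>/blobs/sha256:<digest>.gz on the undercloud:

~~~
# Check whether the layer the controllers fail on exists and is non-empty in
# the undercloud registry store (path layout assumed, adjust as needed).
BLOB=/var/lib/image-serve/v2/rhosp15-rhel8/openstack-cinder-volume/blobs/sha256:ba293c69720e135eb642e9481565ef0e7f1fc5d579e0e9026fbc755c340481b3.gz

sudo ls -l "$BLOB"      # a 0-byte file here would explain the e3b0c442... digest
sudo sha256sum "$BLOB"  # expected to match the digest in the file name, assuming
                        # image-serve stores and serves the layer bytes verbatim
~~~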

Comment 6 Salman Khan 2019-12-15 13:30:09 UTC
Yes, it does show images:

sudo podman images                                                                                                                                             
REPOSITORY                                                           TAG       IMAGE ID       CREATED        SIZE
10.101.52.5:8787/rhosp15-rhel8/openstack-nova-compute-ironic         15.0-79   c1c208d2f0b9   2 weeks ago    1.97 GB
10.101.52.5:8787/rhosp15-rhel8/openstack-mistral-api                 15.0-77   eae0fdf86878   4 weeks ago    1.1 GB
10.101.52.5:8787/rhosp15-rhel8/openstack-ironic-conductor            15.0-76   0fe92327a0a3   4 weeks ago    917 MB
10.101.52.5:8787/rhosp15-rhel8/openstack-nova-placement-api          15.0-81   bc34f2c2f1a1   4 weeks ago    1.04 GB
10.101.52.5:8787/rhosp15-rhel8/openstack-zaqar-wsgi                  15.0-76   f921dd1f2e0a   4 weeks ago    727 MB
10.101.52.5:8787/rhosp15-rhel8/openstack-swift-account               15.0-75   62227d473b3c   4 weeks ago    841 MB
10.101.52.5:8787/rhosp15-rhel8/openstack-neutron-dhcp-agent          15.0-77   a1df540430fe   4 weeks ago    1.05 GB
10.101.52.5:8787/rhosp15-rhel8/openstack-keystone                    15.0-77   8a0ee7c7d0f7   4 weeks ago    776 MB
10.101.52.5:8787/rhosp15-rhel8/openstack-glance-api                  15.0-74   ae5421b66ad1   4 weeks ago    1.02 GB
10.101.52.5:8787/rhosp15-rhel8/openstack-mistral-engine              15.0-75   8ad69f7824bd   4 weeks ago    1.08 GB
10.101.52.5:8787/rhosp15-rhel8/openstack-swift-container             15.0-76   4b9d4573f9c7   4 weeks ago    841 MB
10.101.52.5:8787/rhosp15-rhel8/openstack-mistral-executor            15.0-75   a6b6ed1c908f   4 weeks ago    1.37 GB
10.101.52.5:8787/rhosp15-rhel8/openstack-neutron-openvswitch-agent   15.0-75   8c7320362d5f   4 weeks ago    907 MB
10.101.52.5:8787/rhosp15-rhel8/openstack-heat-api-cfn                15.0-76   b5a4549f9024   4 weeks ago    763 MB
10.101.52.5:8787/rhosp15-rhel8/openstack-heat-api                    15.0-76   c97cdbab903f   4 weeks ago    763 MB
10.101.52.5:8787/rhosp15-rhel8/openstack-ironic-api                  15.0-75   215753453722   4 weeks ago    794 MB
10.101.52.5:8787/rhosp15-rhel8/openstack-neutron-server              15.0-77   c5519bf27f50   4 weeks ago    1.02 GB
10.101.52.5:8787/rhosp15-rhel8/openstack-neutron-l3-agent            15.0-77   1d48a0f844af   4 weeks ago    1.05 GB
10.101.52.5:8787/rhosp15-rhel8/openstack-swift-object                15.0-76   cc5121b760cc   4 weeks ago    841 MB
10.101.52.5:8787/rhosp15-rhel8/openstack-nova-scheduler              15.0-79   ecce23c7490c   4 weeks ago    1.19 GB
10.101.52.5:8787/rhosp15-rhel8/openstack-swift-proxy-server          15.0-77   852279d6f91a   4 weeks ago    900 MB
10.101.52.5:8787/rhosp15-rhel8/openstack-ironic-pxe                  15.0-76   507466cbff56   4 weeks ago    800 MB
10.101.52.5:8787/rhosp15-rhel8/openstack-ironic-neutron-agent        15.0-75   fd4b23e55cc4   4 weeks ago    907 MB
10.101.52.5:8787/rhosp15-rhel8/openstack-heat-engine                 15.0-76   66a5a95d5e90   4 weeks ago    763 MB
10.101.52.5:8787/rhosp15-rhel8/openstack-nova-conductor              15.0-80   4de00ab219f7   4 weeks ago    1.02 GB
10.101.52.5:8787/rhosp15-rhel8/openstack-mistral-event-engine        15.0-78   81bee0471be4   4 weeks ago    1.08 GB
10.101.52.5:8787/rhosp15-rhel8/openstack-ironic-inspector            15.0-79   dba5e5a1d79f   4 weeks ago    700 MB
10.101.52.5:8787/rhosp15-rhel8/openstack-tempest                     15.0-79   51ce92283f9b   4 weeks ago    933 MB
10.101.52.5:8787/rhosp15-rhel8/openstack-mariadb                     15.0-89   81ed7d3cef4b   4 weeks ago    776 MB
10.101.52.5:8787/rhosp15-rhel8/openstack-rabbitmq                    15.0-87   d5b619e4dc4b   4 weeks ago    604 MB
10.101.52.5:8787/rhosp15-rhel8/openstack-memcached                   15.0-82   35fb07b3facc   4 weeks ago    444 MB
10.101.52.5:8787/rhosp15-rhel8/openstack-keepalived                  15.0-83   2b7805bf2b5c   4 weeks ago    437 MB
10.101.52.5:8787/rhosp15-rhel8/openstack-haproxy                     15.0-86   4f5b7d9858c3   4 weeks ago    561 MB
10.101.52.5:8787/rhosp15-rhel8/openstack-cron                        15.0-84   6a73b400c5ca   4 weeks ago    423 MB
10.101.52.5:8787/rhosp15-rhel8/openstack-iscsid                      15.0-84   46e1c3bad8f3   4 weeks ago    443 MB
10.101.52.5:8787/rhosp15-rhel8/openstack-nova-api                    15.0-69   13ac76a8f700   2 months ago   1.08 GB


Is there a specific version or registry setting for any service (nova, cinder, neutron, glance, ceph, barbican, etc.) that needs to be specified in the registry prep YAML?

Thanks

Comment 7 Alex Schultz 2019-12-16 16:18:36 UTC
'podman images' does not show the contents of the registry; it shows the containers fetched into the local podman storage. If you run container image prepare, it'll fetch all the containers, but that isn't necessarily what is available in the registry (there might be more or less available).
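
For reference, the two stores can be inspected side by side on the undercloud to make the distinction concrete. A sketch, with the image-serve path taken from comment 5 and the per-image directory layout assumed:

~~~
# What podman has cached locally (this is what 'podman images' lists):
sudo podman images | grep cinder

# What the undercloud registry itself holds on disk and can therefore serve:
sudo ls /var/lib/image-serve/v2/rhosp15-rhel8/ | grep cinder
sudo ls /var/lib/image-serve/v2/rhosp15-rhel8/openstack-cinder-volume/
~~~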

Comment 8 Salman Khan 2019-12-22 18:31:39 UTC
Thanks @Alex, but it didn't work here.

Comment 9 Salman Khan 2019-12-30 12:19:10 UTC
Failed
error pulling image "10.101.52.5:8787/rhosp15-rhel8/openstack-cinder-volume:15.0-81": unable to pull 10.101.52.5:8787/rhosp15-rhel8/openstack-cinder-volume:15.0-81: unable to pull image: Error reading blob sha256:ba293c69720e135eb642e9481565ef0e7f1fc5d579e0e9026fbc755c340481b3: Digest did not match, expected sha256:ba293c69720e135eb642e9481565ef0e7f1fc5d579e0e9026fbc755c340481b3, got sha256:e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855", "stderr_lines": ["Trying to pull 10.101.52.5:8787/rhosp15-rhel8/openstack-cinder-volume:15.0-81...Getting image source signatures", "Copying blob sha256:ba293c69720e135eb642e9481565ef0e7f1fc5d579e0e9026fbc755c340481b3", "Copying blob sha256:93dce31b122d9fb675286d4236a78d957c57cae88361f7a39b13457fc8082a0f", "Copying blob sha256:cd15afa6044b66b35397f6380e67ea1f4308fe23f1623ec93d5b1f063913f2d9", "Copying blob sha256:d83568c8e965abc71ad53529b66ac86220df8e1b337e760741ab19ba48d0b593", "Copying blob sha256:bf7b57ec3a027cf3478d0bd2f14a9bbd32a4ab8741f5e4128093e383a5ae0de5", "Copying blob sha256:87ddaac82d7b1de496c39a0ff9b2e356af460d79979021a9a32a0329c7d7f903", "Failed", "error pulling image "10.101.52.5:8787/rhosp15-rhel8/openstack-cinder-volume:15.0-81": unable to pull 10.101.52.5:8787/rhosp15-rhel8/openstack-cinder-volume:15.0-81: unable to pull image: Error reading blob sha256:ba293c69720e135eb642e9481565ef0e7f1fc5d579e0e9026fbc755c340481b3: Digest did not match, expected sha256:ba293c69720e135eb642e9481565ef0e7f1fc5d579e0e9026fbc755c340481b3, got sha256:e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"], "stdout": "", "stdout_lines": []}
2019-12-30 14:47:26,252 p=469671 u=mistral |  fatal: [devpod-controller-02]: FAILED! => {"changed": true, "cmd": "podman pull 10.101.52.5:8787/rhosp15-rhel8/openstack-cinder-volume:15.0-81", "delta": "0:00:02.002822", "end": "2019-12-30 14:47:26.224039", "msg": "non-zero return code", "rc": 125, "start": "2019-12-30 14:47:24.221217", "stderr": "Trying to pull 10.101.52.5:8787/rhosp15-rhel8/openstack-cinder-volume:15.0-81...Getting image source signatures
Copying blob sha256:ba293c69720e135eb642e9481565ef0e7f1fc5d579e0e9026fbc755c340481b3
Copying blob sha256:93dce31b122d9fb675286d4236a78d957c57cae88361f7a39b13457fc8082a0f
Copying blob sha256:cd15afa6044b66b35397f6380e67ea1f4308fe23f1623ec93d5b1f063913f2d9
Copying blob sha256:d83568c8e965abc71ad53529b66ac86220df8e1b337e760741ab19ba48d0b593
Copying blob sha256:bf7b57ec3a027cf3478d0bd2f14a9bbd32a4ab8741f5e4128093e383a5ae0de5
Copying blob sha256:87ddaac82d7b1de496c39a0ff9b2e356af460d79979021a9a32a0329c7d7f903
Failed
error pulling image "10.101.52.5:8787/rhosp15-rhel8/openstack-cinder-volume:15.0-81": unable to pull 10.101.52.5:8787/rhosp15-rhel8/openstack-cinder-volume:15.0-81: unable to pull image: Error reading blob sha256:ba293c69720e135eb642e9481565ef0e7f1fc5d579e0e9026fbc755c340481b3: Digest did not match, expected sha256:ba293c69720e135eb642e9481565ef0e7f1fc5d579e0e9026fbc755c340481b3, got sha256:e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855", "stderr_lines": ["Trying to pull 10.101.52.5:8787/rhosp15-rhel8/openstack-cinder-volume:15.0-81...Getting image source signatures", "Copying blob sha256:ba293c69720e135eb642e9481565ef0e7f1fc5d579e0e9026fbc755c340481b3", "Copying blob sha256:93dce31b122d9fb675286d4236a78d957c57cae88361f7a39b13457fc8082a0f", "Copying blob sha256:cd15afa6044b66b35397f6380e67ea1f4308fe23f1623ec93d5b1f063913f2d9", "Copying blob sha256:d83568c8e965abc71ad53529b66ac86220df8e1b337e760741ab19ba48d0b593", "Copying blob sha256:bf7b57ec3a027cf3478d0bd2f14a9bbd32a4ab8741f5e4128093e383a5ae0de5", "Copying blob sha256:87ddaac82d7b1de496c39a0ff9b2e356af460d79979021a9a32a0329c7d7f903", "Failed", "error pulling image "10.101.52.5:8787/rhosp15-rhel8/openstack-cinder-volume:15.0-81": unable to pull 10.101.52.5:8787/rhosp15-rhel8/openstack-cinder-volume:15.0-81: unable to pull image: Error reading blob sha256:ba293c69720e135eb642e9481565ef0e7f1fc5d579e0e9026fbc755c340481b3: Digest did not match, expected sha256:ba293c69720e135eb642e9481565ef0e7f1fc5d579e0e9026fbc755c340481b3, got sha256:e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"], "stdout": "", "stdout_lines": []}
2019-12-30 14:47:26,254 p=469671 u=mistral |  NO MORE HOSTS LEFT *************************************************************
2019-12-30 14:47:26,262 p=469671 u=mistral |  PLAY RECAP *********************************************************************
2019-12-30 14:47:26,262 p=469671 u=mistral |  devpod-cephstore-00        : ok=93   changed=21   unreachable=0    failed=0    skipped=137  rescued=0    ignored=1   
2019-12-30 14:47:26,262 p=469671 u=mistral |  devpod-cephstore-01        : ok=92   changed=21   unreachable=0    failed=0    skipped=137  rescued=0    ignored=1   
2019-12-30 14:47:26,263 p=469671 u=mistral |  devpod-cephstore-02        : ok=92   changed=21   unreachable=0    failed=0    skipped=137  rescued=0    ignored=1   
2019-12-30 14:47:26,263 p=469671 u=mistral |  devpod-cephstore-03        : ok=92   changed=21   unreachable=0    failed=0    skipped=137  rescued=0    ignored=1   
2019-12-30 14:47:26,263 p=469671 u=mistral |  devpod-compute-00          : ok=122  changed=34   unreachable=0    failed=0    skipped=155  rescued=0    ignored=2   
2019-12-30 14:47:26,263 p=469671 u=mistral |  devpod-compute-01          : ok=122  changed=34   unreachable=0    failed=0    skipped=155  rescued=0    ignored=2   
2019-12-30 14:47:26,263 p=469671 u=mistral |  devpod-controller-00       : ok=141  changed=49   unreachable=0    failed=1    skipped=162  rescued=0    ignored=2   
2019-12-30 14:47:26,264 p=469671 u=mistral |  devpod-controller-01       : ok=141  changed=49   unreachable=0    failed=1    skipped=162  rescued=0    ignored=2   
2019-12-30 14:47:26,264 p=469671 u=mistral |  devpod-controller-02       : ok=141  changed=49   unreachable=0    failed=1    skipped=162  rescued=0    ignored=2   
2019-12-30 14:47:26,264 p=469671 u=mistral |  undercloud                 : ok=29   changed=7    unreachable=0    failed=0    skipped=26   rescued=0    ignored=0   
2019-12-30 14:47:26,265 p=469671 u=mistral |  Monday 30 December 2019  14:47:26 +0300 (0:00:02.624)       0:10:47.730 ******* 
2019-12-30 14:47:26,266 p=469671 u=mistral |  ===============================================================================

Comment 10 Salman Khan 2019-12-30 12:21:46 UTC
2019-12-18 15:45:53,064 p=791675 u=root |  fatal: [devpod-cephstore-00]: FAILED! => changed=false 
  attempts: 3
  cmd:
  - timeout
  - --foreground
  - -s
  - KILL
  - 300s
  - podman
  - pull
  - 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4
  delta: '0:00:00.115007'
  end: '2019-12-18 15:45:53.026359'
  msg: non-zero return code
  rc: 125
  start: '2019-12-18 15:45:52.911352'
  stderr: |-
    Trying to pull 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4...Failed
    error pulling image "10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4": unable to pull 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4: unable to pull image: Error determining manifest MIME type for docker://10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4: pinging docker registry returned: Get https://10.101.52.7:8787/v2/: http: server gave HTTP response to HTTPS client
  stderr_lines: <omitted>
  stdout: ''
  stdout_lines: <omitted>
2019-12-18 15:45:53,066 p=791675 u=root |  NO MORE HOSTS LEFT *************************************************************
2019-12-18 15:45:53,069 p=791675 u=root |  PLAY RECAP *********************************************************************
2019-12-18 15:45:53,070 p=791675 u=root |  devpod-cephstore-00        : ok=41   changed=0    unreachable=0    failed=1    skipped=94   rescued=0    ignored=0   
2019-12-18 15:45:53,070 p=791675 u=root |  devpod-cephstore-01        : ok=41   changed=0    unreachable=0    failed=1    skipped=94   rescued=0    ignored=0   
2019-12-18 15:45:53,070 p=791675 u=root |  devpod-cephstore-02        : ok=41   changed=0    unreachable=0    failed=1    skipped=94   rescued=0    ignored=0   
2019-12-18 15:45:53,070 p=791675 u=root |  devpod-cephstore-03        : ok=41   changed=0    unreachable=0    failed=1    skipped=94   rescued=0    ignored=0   
2019-12-18 15:45:53,071 p=791675 u=root |  devpod-compute-00          : ok=35   changed=0    unreachable=0    failed=1    skipped=98   rescued=0    ignored=0   
2019-12-18 15:45:53,071 p=791675 u=root |  devpod-compute-01          : ok=31   changed=0    unreachable=0    failed=0    skipped=79   rescued=0    ignored=0   
2019-12-18 15:45:53,071 p=791675 u=root |  devpod-controller-00       : ok=41   changed=0    unreachable=0    failed=1    skipped=96   rescued=0    ignored=0   
2019-12-18 15:45:53,071 p=791675 u=root |  devpod-controller-01       : ok=39   changed=0    unreachable=0    failed=1    skipped=95   rescued=0    ignored=0   
2019-12-18 15:45:53,071 p=791675 u=root |  devpod-controller-02       : ok=39   changed=0    unreachable=0    failed=1    skipped=95   rescued=0    ignored=0   
2019-12-18 15:45:53,072 p=791675 u=root |  Wednesday 18 December 2019  15:45:53 +0300 (0:00:32.945)       0:02:51.875 **** 
2019-12-18 15:45:53,072 p=791675 u=root |  =============================================================================== 
2019-12-18 15:45:53,073 p=791675 u=root |  ceph-container-common : pulling 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4 image -- 32.95s
2019-12-18 15:45:53,073 p=791675 u=root |  gather and delegate facts ---------------------------------------------- 25.20s
2019-12-18 15:45:53,073 p=791675 u=root |  ceph-validate : validate provided configuration ------------------------- 5.19s
2019-12-18 15:45:53,073 p=791675 u=root |  ceph-validate : get devices information --------------------------------- 1.89s
2019-12-18 15:45:53,073 p=791675 u=root |  ceph-facts : resolve device link(s) ------------------------------------- 1.85s
2019-12-18 15:45:53,074 p=791675 u=root |  ceph-facts : set_fact _monitor_address to monitor_address_block ipv4 ---- 1.71s
2019-12-18 15:45:53,074 p=791675 u=root |  gather facts ------------------------------------------------------------ 1.47s
2019-12-18 15:45:53,074 p=791675 u=root |  check for python -------------------------------------------------------- 1.39s
2019-12-18 15:45:53,074 p=791675 u=root |  ceph-facts : include facts.yml ------------------------------------------ 1.39s
2019-12-18 15:45:53,074 p=791675 u=root |  ceph-container-common : remove ceph udev rules -------------------------- 1.35s
2019-12-18 15:45:53,074 p=791675 u=root |  ceph-handler : include check_running_containers.yml --------------------- 1.30s
2019-12-18 15:45:53,074 p=791675 u=root |  ceph-facts : get current fsid ------------------------------------------- 1.29s
2019-12-18 15:45:53,075 p=791675 u=root |  ceph-container-common : include fetch_image.yml ------------------------- 1.20s
2019-12-18 15:45:53,075 p=791675 u=root |  ceph-container-common : include prerequisites.yml ----------------------- 1.20s
2019-12-18 15:45:53,075 p=791675 u=root |  ceph-validate : include check_system.yml -------------------------------- 1.20s
2019-12-18 15:45:53,075 p=791675 u=root |  ceph-facts : check if podman binary is present -------------------------- 1.18s
2019-12-18 15:45:53,075 p=791675 u=root |  ceph-facts : create a local fetch directory if it does not exist -------- 1.12s
2019-12-18 15:45:53,075 p=791675 u=root |  check if podman binary is present --------------------------------------- 1.11s
2019-12-18 15:45:53,075 p=791675 u=root |  ceph-validate : include check_ipaddr_mon.yml ---------------------------- 1.08s
2019-12-18 15:45:53,076 p=791675 u=root |  ceph-facts : check if it is atomic host --------------------------------- 1.07s

Comment 11 Salman Khan 2020-01-02 13:33:12 UTC
Created attachment 1649178: overcloud deployment ansible.log

The local registry is listening on 10.101.52.5, so why is it pulling from 10.101.52.7?

       "TASK [ceph-container-common : pulling 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4 image] ***",
        "Wednesday 01 January 2020  17:08:46 +0300 (0:00:00.668)       0:02:17.550 ***** ",
        "FAILED - RETRYING: pulling 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4 image (3 retries left).",
        "FAILED - RETRYING: pulling 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4 image (2 retries left).",
        "FAILED - RETRYING: pulling 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4 image (1 retries left).",
        "fatal: [devpod-controller-00]: FAILED! => changed=false ",
        "  attempts: 3",
        "  - pull",
        "  - 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4",
        "  delta: '0:00:00.090218'",
        "  end: '2020-01-01 17:09:18.008030'",
        "  start: '2020-01-01 17:09:17.917812'",
        "  stderr: |-",
        "    Trying to pull 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4...Failed",
        "    error pulling image "10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4": unable to pull 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4: unable to pull image: Error determining manifest MIME type for docker://10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4: pinging docker registry returned: Get https://10.101.52.7:8787/v2/: http: server gave HTTP response to HTTPS client",

Comment 12 Salman Khan 2020-01-02 13:34:23 UTC
Created attachment 1649179: ceph ansible deployment log trying to pull from different registry

2019-12-08 12:52:30,159 p=841797 u=root |  TASK [ceph-container-common : pulling 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4 image] ***
2019-12-08 12:52:30,160 p=841797 u=root |  Sunday 08 December 2019  12:52:30 +0300 (0:00:00.626)       0:02:16.426 ******* 
2019-12-08 12:52:30,716 p=841797 u=root |  FAILED - RETRYING: pulling 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4 image (3 retries left).
2019-12-08 12:52:30,718 p=841797 u=root |  FAILED - RETRYING: pulling 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4 image (3 retries left).
2019-12-08 12:52:30,776 p=841797 u=root |  FAILED - RETRYING: pulling 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4 image (3 retries left).
2019-12-08 12:52:30,910 p=841797 u=root |  FAILED - RETRYING: pulling 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4 image (3 retries left).
2019-12-08 12:52:30,974 p=841797 u=root |  FAILED - RETRYING: pulling 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4 image (3 retries left).
2019-12-08 12:52:31,009 p=841797 u=root |  FAILED - RETRYING: pulling 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4 image (3 retries left).
2019-12-08 12:52:31,069 p=841797 u=root |  FAILED - RETRYING: pulling 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4 image (3 retries left).
2019-12-08 12:52:31,176 p=841797 u=root |  FAILED - RETRYING: pulling 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4 image (3 retries left).
2019-12-08 12:52:41,108 p=841797 u=root |  FAILED - RETRYING: pulling 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4 image (2 retries left).
2019-12-08 12:52:41,132 p=841797 u=root |  FAILED - RETRYING: pulling 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4 image (2 retries left).
2019-12-08 12:52:41,164 p=841797 u=root |  FAILED - RETRYING: pulling 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4 image (2 retries left).
2019-12-08 12:52:41,351 p=841797 u=root |  FAILED - RETRYING: pulling 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4 image (2 retries left).
2019-12-08 12:52:41,375 p=841797 u=root |  FAILED - RETRYING: pulling 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4 image (2 retries left).
2019-12-08 12:52:41,393 p=841797 u=root |  FAILED - RETRYING: pulling 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4 image (2 retries left).
2019-12-08 12:52:41,448 p=841797 u=root |  FAILED - RETRYING: pulling 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4 image (2 retries left).
2019-12-08 12:52:41,555 p=841797 u=root |  FAILED - RETRYING: pulling 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4 image (2 retries left).
2019-12-08 12:52:51,534 p=841797 u=root |  FAILED - RETRYING: pulling 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4 image (1 retries left).
2019-12-08 12:52:51,544 p=841797 u=root |  FAILED - RETRYING: pulling 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4 image (1 retries left).
2019-12-08 12:52:51,552 p=841797 u=root |  FAILED - RETRYING: pulling 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4 image (1 retries left).
2019-12-08 12:52:51,753 p=841797 u=root |  FAILED - RETRYING: pulling 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4 image (1 retries left).
2019-12-08 12:52:51,783 p=841797 u=root |  FAILED - RETRYING: pulling 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4 image (1 retries left).
2019-12-08 12:52:51,862 p=841797 u=root |  FAILED - RETRYING: pulling 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4 image (1 retries left).
2019-12-08 12:52:51,924 p=841797 u=root |  FAILED - RETRYING: pulling 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4 image (1 retries left).
2019-12-08 12:52:52,786 p=841797 u=root |  FAILED - RETRYING: pulling 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4 image (1 retries left).
2019-12-08 12:53:01,935 p=841797 u=root |  fatal: [devpod-controller-02]: FAILED! => changed=false 
  attempts: 3
  cmd:
  - timeout
  - --foreground
  - -s
  - KILL
  - 300s
  - podman
  - pull
  - 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4
  delta: '0:00:00.093123'
  end: '2019-12-08 12:53:01.897362'
  msg: non-zero return code
  rc: 125
  start: '2019-12-08 12:53:01.804239'
  stderr: |-
    Trying to pull 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4...Failed

Comment 13 Salman Khan 2020-01-02 13:35:58 UTC
2019-12-08 12:52:30,159 p=841797 u=root |  TASK [ceph-container-common : pulling 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4 image] ***
2019-12-08 12:52:30,160 p=841797 u=root |  Sunday 08 December 2019  12:52:30 +0300 (0:00:00.626)       0:02:16.426 ******* 
2019-12-08 12:52:30,716 p=841797 u=root |  FAILED - RETRYING: pulling 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4 image (3 retries left).
2019-12-08 12:52:30,718 p=841797 u=root |  FAILED - RETRYING: pulling 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4 image (3 retries left).
2019-12-08 12:52:30,776 p=841797 u=root |  FAILED - RETRYING: pulling 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4 image (3 retries left).
2019-12-08 12:52:30,910 p=841797 u=root |  FAILED - RETRYING: pulling 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4 image (3 retries left).
2019-12-08 12:52:30,974 p=841797 u=root |  FAILED - RETRYING: pulling 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4 image (3 retries left).
2019-12-08 12:52:31,009 p=841797 u=root |  FAILED - RETRYING: pulling 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4 image (3 retries left).
2019-12-08 12:52:31,069 p=841797 u=root |  FAILED - RETRYING: pulling 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4 image (3 retries left).
2019-12-08 12:52:31,176 p=841797 u=root |  FAILED - RETRYING: pulling 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4 image (3 retries left).
2019-12-08 12:52:41,108 p=841797 u=root |  FAILED - RETRYING: pulling 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4 image (2 retries left).
2019-12-08 12:52:41,132 p=841797 u=root |  FAILED - RETRYING: pulling 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4 image (2 retries left).
2019-12-08 12:52:41,164 p=841797 u=root |  FAILED - RETRYING: pulling 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4 image (2 retries left).
2019-12-08 12:52:41,351 p=841797 u=root |  FAILED - RETRYING: pulling 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4 image (2 retries left).
2019-12-08 12:52:41,375 p=841797 u=root |  FAILED - RETRYING: pulling 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4 image (2 retries left).
2019-12-08 12:52:41,393 p=841797 u=root |  FAILED - RETRYING: pulling 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4 image (2 retries left).
2019-12-08 12:52:41,448 p=841797 u=root |  FAILED - RETRYING: pulling 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4 image (2 retries left).
2019-12-08 12:52:41,555 p=841797 u=root |  FAILED - RETRYING: pulling 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4 image (2 retries left).
2019-12-08 12:52:51,534 p=841797 u=root |  FAILED - RETRYING: pulling 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4 image (1 retries left).
2019-12-08 12:52:51,544 p=841797 u=root |  FAILED - RETRYING: pulling 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4 image (1 retries left).
2019-12-08 12:52:51,552 p=841797 u=root |  FAILED - RETRYING: pulling 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4 image (1 retries left).
2019-12-08 12:52:51,753 p=841797 u=root |  FAILED - RETRYING: pulling 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4 image (1 retries left).
2019-12-08 12:52:51,783 p=841797 u=root |  FAILED - RETRYING: pulling 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4 image (1 retries left).
2019-12-08 12:52:51,862 p=841797 u=root |  FAILED - RETRYING: pulling 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4 image (1 retries left).
2019-12-08 12:52:51,924 p=841797 u=root |  FAILED - RETRYING: pulling 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4 image (1 retries left).
2019-12-08 12:52:52,786 p=841797 u=root |  FAILED - RETRYING: pulling 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4 image (1 retries left).
2019-12-08 12:53:01,935 p=841797 u=root |  fatal: [devpod-controller-02]: FAILED! => changed=false 
  attempts: 3
  cmd:
  - timeout
  - --foreground
  - -s
  - KILL
  - 300s
  - podman
  - pull
  - 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4
  delta: '0:00:00.093123'
  end: '2019-12-08 12:53:01.897362'
  msg: non-zero return code
  rc: 125
  start: '2019-12-08 12:53:01.804239'
  stderr: |-
    Trying to pull 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4...Failed
    error pulling image "10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4": unable to pull 10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4: unable to pull image: Error determining manifest MIME type for docker://10.101.52.7:8787/rhceph-beta/rhceph-4-rhel8:4-4: pinging docker registry returned: Get https://10.101.52.7:8787/v2/: http: server gave HTTP response to HTTPS client

Comment 14 Salman Khan 2020-01-09 12:18:21 UTC
Created attachment 1650952: triple-container-image-prepare.log failing due to unauthorized errors

It's repeatedly failing for the new deployment!

Comment 15 Alex Schultz 2020-01-10 15:10:09 UTC
Still attempting to reproduce. I'm unable to get the same error, so I don't know what's going on based on the provided information. If you can provide a sosreport from the undercloud, that might be helpful.

Comment 16 Alex Schultz 2020-01-10 21:22:38 UTC
I cannot reproduce any MIME issues using the containers-prepare-parameter.yaml from comment 1. I would recommend opening a support ticket and providing the system logs as well as the deployment command and templates. As mentioned in comment 5, the way to reset the image registry in OSP15 is to run:

sudo rm -rf /var/lib/image-serve/v2/*
sudo touch /var/lib/image-serve/v2/_catalog
echo "{}" | sudo tee /var/lib/image-serve/v2/index.json

You can then re-populate the registry by running the 'openstack tripleo container image prepare' command from comment 2:

sudo openstack tripleo container image prepare   -e ~/containers-prepare-parameter.yaml

Since we store the files on disk, my thought on the MIME errors is that it might be file system corruption or a network issue where data is being dropped somehow.
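
If corruption or dropped data is the suspicion, one cheap pre-check before re-populating is to scan the registry store for zero-length blobs, since an empty blob served to podman would produce exactly the e3b0c442... digest seen in the pull failures. A sketch, using the /var/lib/image-serve/v2 path from comment 5:

~~~
# Any zero-length blob in the store would fail digest verification on the
# pulling side; list them (if any) before resetting the registry.
sudo find /var/lib/image-serve/v2 -name 'sha256:*' -size 0 -ls
~~~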

Comment 17 Salman Khan 2020-01-12 07:56:09 UTC
So I attempted the manual download of the container images:

$ sudo openstack tripleo container image prepare   -e ~/containers-prepare-parameter.yaml

which updated the rest but ended up with the following failures:

Failure: id 7182b1899a52a7d5f71125a106d43252f237a0fe, status 401, reason Unauthorized text {"errors":[{"code":"UNAUTHORIZED","message":"Access to the requested resource is not authorized"}]}
Write Failure: 401 Client Error: Unauthorized for url: https://registry.redhat.io/v2/rhosp15-rhel8/openstack-gnocchi-metricd/blobs/sha256:0b4c003815df562f6716b723ea98e3c3e6f7256077f446855d1595881899a8e9
Broken layer found and removed: /var/lib/image-serve/v2/rhosp15-rhel8/openstack-gnocchi-metricd/blobs/sha256:0b4c003815df562f6716b723ea98e3c3e6f7256077f446855d1595881899a8e9.gz
Failure: id 7182b1899a52a7d5f71125a106d43252f237a0fe, status 401, reason Unauthorized text {"errors":[{"code":"UNAUTHORIZED","message":"Access to the requested resource is not authorized"}]}
Write Failure: 401 Client Error: Unauthorized for url: https://registry.redhat.io/v2/rhosp15-rhel8/openstack-gnocchi-metricd/blobs/sha256:0b4c003815df562f6716b723ea98e3c3e6f7256077f446855d1595881899a8e9
Broken layer found and removed: /var/lib/image-serve/v2/rhosp15-rhel8/openstack-gnocchi-metricd/blobs/sha256:0b4c003815df562f6716b723ea98e3c3e6f7256077f446855d1595881899a8e9.gz
Failure: id 7182b1899a52a7d5f71125a106d43252f237a0fe, status 401, reason Unauthorized text {"errors":[{"code":"UNAUTHORIZED","message":"Access to the requested resource is not authorized"}]}
Write Failure: 401 Client Error: Unauthorized for url: https://registry.redhat.io/v2/rhosp15-rhel8/openstack-gnocchi-metricd/blobs/sha256:0b4c003815df562f6716b723ea98e3c3e6f7256077f446855d1595881899a8e9
Broken layer found and removed: /var/lib/image-serve/v2/rhosp15-rhel8/openstack-gnocchi-metricd/blobs/sha256:0b4c003815df562f6716b723ea98e3c3e6f7256077f446855d1595881899a8e9.gz
Failure: id 7182b1899a52a7d5f71125a106d43252f237a0fe, status 401, reason Unauthorized text {"errors":[{"code":"UNAUTHORIZED","message":"Access to the requested resource is not authorized"}]}
Write Failure: 401 Client Error: Unauthorized for url: https://registry.redhat.io/v2/rhosp15-rhel8/openstack-gnocchi-metricd/blobs/sha256:0b4c003815df562f6716b723ea98e3c3e6f7256077f446855d1595881899a8e9
Broken layer found and removed: /var/lib/image-serve/v2/rhosp15-rhel8/openstack-gnocchi-metricd/blobs/sha256:0b4c003815df562f6716b723ea98e3c3e6f7256077f446855d1595881899a8e9.gz
Failure: id 7182b1899a52a7d5f71125a106d43252f237a0fe, status 401, reason Unauthorized text {"errors":[{"code":"UNAUTHORIZED","message":"Access to the requested resource is not authorized"}]}
Write Failure: 401 Client Error: Unauthorized for url: https://registry.redhat.io/v2/rhosp15-rhel8/openstack-gnocchi-metricd/blobs/sha256:0b4c003815df562f6716b723ea98e3c3e6f7256077f446855d1595881899a8e9
Broken layer found and removed: /var/lib/image-serve/v2/rhosp15-rhel8/openstack-gnocchi-metricd/blobs/sha256:0b4c003815df562f6716b723ea98e3c3e6f7256077f446855d1595881899a8e9.gz
Completed upload for image registry.redhat.io/rhosp15-rhel8/openstack-cinder-scheduler:15.0-81
Completed upload for image registry.redhat.io/rhosp15-rhel8/openstack-ceilometer-notification:15.0-81
Failure: id e65aaa916d750284f3be453a8d5f5acb8479255e, status 401, reason Unauthorized text {"errors":[{"code":"UNAUTHORIZED","message":"Access to the requested resource is not authorized"}]}
Write Failure: 401 Client Error: Unauthorized for url: https://registry.redhat.io/v2/rhosp15-rhel8/openstack-aodh-notifier/blobs/sha256:79d6925c4e5071e39c69ab44ba57ed0957eeb68cfadd33f43673170fffce4f95
Broken layer found and removed: /var/lib/image-serve/v2/rhosp15-rhel8/openstack-aodh-notifier/blobs/sha256:79d6925c4e5071e39c69ab44ba57ed0957eeb68cfadd33f43673170fffce4f95.gz
Failure: id e65aaa916d750284f3be453a8d5f5acb8479255e, status 401, reason Unauthorized text {"errors":[{"code":"UNAUTHORIZED","message":"Access to the requested resource is not authorized"}]}
Write Failure: 401 Client Error: Unauthorized for url: https://registry.redhat.io/v2/rhosp15-rhel8/openstack-aodh-notifier/blobs/sha256:79d6925c4e5071e39c69ab44ba57ed0957eeb68cfadd33f43673170fffce4f95
Broken layer found and removed: /var/lib/image-serve/v2/rhosp15-rhel8/openstack-aodh-notifier/blobs/sha256:79d6925c4e5071e39c69ab44ba57ed0957eeb68cfadd33f43673170fffce4f95.gz
Failure: id e65aaa916d750284f3be453a8d5f5acb8479255e, status 401, reason Unauthorized text {"errors":[{"code":"UNAUTHORIZED","message":"Access to the requested resource is not authorized"}]}
Write Failure: 401 Client Error: Unauthorized for url: https://registry.redhat.io/v2/rhosp15-rhel8/openstack-aodh-notifier/blobs/sha256:79d6925c4e5071e39c69ab44ba57ed0957eeb68cfadd33f43673170fffce4f95
Broken layer found and removed: /var/lib/image-serve/v2/rhosp15-rhel8/openstack-aodh-notifier/blobs/sha256:79d6925c4e5071e39c69ab44ba57ed0957eeb68cfadd33f43673170fffce4f95.gz
Failure: id e65aaa916d750284f3be453a8d5f5acb8479255e, status 401, reason Unauthorized text {"errors":[{"code":"UNAUTHORIZED","message":"Access to the requested resource is not authorized"}]}
Write Failure: 401 Client Error: Unauthorized for url: https://registry.redhat.io/v2/rhosp15-rhel8/openstack-aodh-notifier/blobs/sha256:79d6925c4e5071e39c69ab44ba57ed0957eeb68cfadd33f43673170fffce4f95
Broken layer found and removed: /var/lib/image-serve/v2/rhosp15-rhel8/openstack-aodh-notifier/blobs/sha256:79d6925c4e5071e39c69ab44ba57ed0957eeb68cfadd33f43673170fffce4f95.gz
Failure: id e65aaa916d750284f3be453a8d5f5acb8479255e, status 401, reason Unauthorized text {"errors":[{"code":"UNAUTHORIZED","message":"Access to the requested resource is not authorized"}]}
Write Failure: 401 Client Error: Unauthorized for url: https://registry.redhat.io/v2/rhosp15-rhel8/openstack-aodh-notifier/blobs/sha256:79d6925c4e5071e39c69ab44ba57ed0957eeb68cfadd33f43673170fffce4f95
Broken layer found and removed: /var/lib/image-serve/v2/rhosp15-rhel8/openstack-aodh-notifier/blobs/sha256:79d6925c4e5071e39c69ab44ba57ed0957eeb68cfadd33f43673170fffce4f95.gz
Completed upload for image registry.redhat.io/rhosp15-rhel8/openstack-aodh-evaluator:15.0-80
Completed upload for image registry.redhat.io/rhosp15-rhel8/openstack-aodh-api:15.0-79
Completed upload for image registry.redhat.io/rhosp15-rhel8/openstack-aodh-listener:15.0-80
Completed upload for image registry.redhat.io/rhosp15-rhel8/openstack-swift-proxy-server:15.0-81
Completed upload for image registry.redhat.io/rhosp15-rhel8/openstack-swift-object:15.0-80
Completed upload for image registry.redhat.io/rhosp15-rhel8/openstack-swift-container:15.0-80
Completed upload for image registry.redhat.io/rhosp15-rhel8/openstack-swift-account:15.0-79
Completed upload for image registry.redhat.io/rhosp15-rhel8/openstack-cinder-backup:15.0-81
Failure: id 60c5d7d402e830810c9d25b279cf818609cdd722, status 401, reason Unauthorized text {"errors":[{"code":"UNAUTHORIZED","message":"Access to the requested resource is not authorized"}]}
Write Failure: 401 Client Error: Unauthorized for url: https://registry.redhat.io/v2/rhosp15-rhel8/openstack-gnocchi-api/blobs/sha256:f6bf9ba68a4b5d8453bcff59508fae612e59cc9f5f4d762dfac9125ce026b6b7
Broken layer found and removed: /var/lib/image-serve/v2/rhosp15-rhel8/openstack-gnocchi-api/blobs/sha256:f6bf9ba68a4b5d8453bcff59508fae612e59cc9f5f4d762dfac9125ce026b6b7.gz
Write Failure: 401 Client Error: Unauthorized for url: https://registry.redhat.io/v2/rhosp15-rhel8/openstack-aodh-notifier/blobs/sha256:79d6925c4e5071e39c69ab44ba57ed0957eeb68cfadd33f43673170fffce4f95
Broken layer found and removed: /var/lib/image-serve/v2/rhosp15-rhel8/openstack-aodh-notifier/blobs/sha256:79d6925c4e5071e39c69ab44ba57ed0957eeb68cfadd33f43673170fffce4f95.gz
Failure: id e65aaa916d750284f3be453a8d5f5acb8479255e, status 401, reason Unauthorized text {"errors":[{"code":"UNAUTHORIZED","message":"Access to the requested resource is not authorized"}]}
Write Failure: 401 Client Error: Unauthorized for url: https://registry.redhat.io/v2/rhosp15-rhel8/openstack-aodh-notifier/blobs/sha256:79d6925c4e5071e39c69ab44ba57ed0957eeb68cfadd33f43673170fffce4f95
Broken layer found and removed: /var/lib/image-serve/v2/rhosp15-rhel8/openstack-aodh-notifier/blobs/sha256:79d6925c4e5071e39c69ab44ba57ed0957eeb68cfadd33f43673170fffce4f95.gz
Failure: id e65aaa916d750284f3be453a8d5f5acb8479255e, status 401, reason Unauthorized text {"errors":[{"code":"UNAUTHORIZED","message":"Access to the requested resource is not authorized"}]}
Write Failure: 401 Client Error: Unauthorized for url: https://registry.redhat.io/v2/rhosp15-rhel8/openstack-aodh-notifier/blobs/sha256:79d6925c4e5071e39c69ab44ba57ed0957eeb68cfadd33f43673170fffce4f95
Broken layer found and removed: /var/lib/image-serve/v2/rhosp15-rhel8/openstack-aodh-notifier/blobs/sha256:79d6925c4e5071e39c69ab44ba57ed0957eeb68cfadd33f43673170fffce4f95.gz
Failure: id e65aaa916d750284f3be453a8d5f5acb8479255e, status 401, reason Unauthorized text {"errors":[{"code":"UNAUTHORIZED","message":"Access to the requested resource is not authorized"}]}
Write Failure: 401 Client Error: Unauthorized for url: https://registry.redhat.io/v2/rhosp15-rhel8/openstack-aodh-notifier/blobs/sha256:79d6925c4e5071e39c69ab44ba57ed0957eeb68cfadd33f43673170fffce4f95
Broken layer found and removed: /var/lib/image-serve/v2/rhosp15-rhel8/openstack-aodh-notifier/blobs/sha256:79d6925c4e5071e39c69ab44ba57ed0957eeb68cfadd33f43673170fffce4f95.gz
Completed upload for image registry.redhat.io/rhosp15-rhel8/openstack-aodh-evaluator:15.0-80
Completed upload for image registry.redhat.io/rhosp15-rhel8/openstack-aodh-api:15.0-79
Completed upload for image registry.redhat.io/rhosp15-rhel8/openstack-aodh-listener:15.0-80
Completed upload for image registry.redhat.io/rhosp15-rhel8/openstack-swift-proxy-server:15.0-81
Completed upload for image registry.redhat.io/rhosp15-rhel8/openstack-swift-object:15.0-80
Completed upload for image registry.redhat.io/rhosp15-rhel8/openstack-swift-container:15.0-80
Completed upload for image registry.redhat.io/rhosp15-rhel8/openstack-swift-account:15.0-79
Completed upload for image registry.redhat.io/rhosp15-rhel8/openstack-cinder-backup:15.0-81
Failure: id 60c5d7d402e830810c9d25b279cf818609cdd722, status 401, reason Unauthorized text {"errors":[{"code":"UNAUTHORIZED","message":"Access to the requested resource is not authorized"}]}
Write Failure: 401 Client Error: Unauthorized for url: https://registry.redhat.io/v2/rhosp15-rhel8/openstack-gnocchi-api/blobs/sha256:f6bf9ba68a4b5d8453bcff59508fae612e59cc9f5f4d762dfac9125ce026b6b7
Broken layer found and removed: /var/lib/image-serve/v2/rhosp15-rhel8/openstack-gnocchi-api/blobs/sha256:f6bf9ba68a4b5d8453bcff59508fae612e59cc9f5f4d762dfac9125ce026b6b7.gz
Failure: id 60c5d7d402e830810c9d25b279cf818609cdd722, status 401, reason Unauthorized text {"errors":[{"code":"UNAUTHORIZED","message":"Access to the requested resource is not authorized"}]}
Write Failure: 401 Client Error: Unauthorized for url: https://registry.redhat.io/v2/rhosp15-rhel8/openstack-gnocchi-api/blobs/sha256:f6bf9ba68a4b5d8453bcff59508fae612e59cc9f5f4d762dfac9125ce026b6b7
Broken layer found and removed: /var/lib/image-serve/v2/rhosp15-rhel8/openstack-gnocchi-api/blobs/sha256:f6bf9ba68a4b5d8453bcff59508fae612e59cc9f5f4d762dfac9125ce026b6b7.gz
Failure: id 60c5d7d402e830810c9d25b279cf818609cdd722, status 401, reason Unauthorized text {"errors":[{"code":"UNAUTHORIZED","message":"Access to the requested resource is not authorized"}]}
Write Failure: 401 Client Error: Unauthorized for url: https://registry.redhat.io/v2/rhosp15-rhel8/openstack-gnocchi-api/blobs/sha256:f6bf9ba68a4b5d8453bcff59508fae612e59cc9f5f4d762dfac9125ce026b6b7
Broken layer found and removed: /var/lib/image-serve/v2/rhosp15-rhel8/openstack-gnocchi-api/blobs/sha256:f6bf9ba68a4b5d8453bcff59508fae612e59cc9f5f4d762dfac9125ce026b6b7.gz
Failure: id 60c5d7d402e830810c9d25b279cf818609cdd722, status 401, reason Unauthorized text {"errors":[{"code":"UNAUTHORIZED","message":"Access to the requested resource is not authorized"}]}
Write Failure: 401 Client Error: Unauthorized for url: https://registry.redhat.io/v2/rhosp15-rhel8/openstack-gnocchi-api/blobs/sha256:f6bf9ba68a4b5d8453bcff59508fae612e59cc9f5f4d762dfac9125ce026b6b7
Broken layer found and removed: /var/lib/image-serve/v2/rhosp15-rhel8/openstack-gnocchi-api/blobs/sha256:f6bf9ba68a4b5d8453bcff59508fae612e59cc9f5f4d762dfac9125ce026b6b7.gz
Failure: id 60c5d7d402e830810c9d25b279cf818609cdd722, status 401, reason Unauthorized text {"errors":[{"code":"UNAUTHORIZED","message":"Access to the requested resource is not authorized"}]}
Write Failure: 401 Client Error: Unauthorized for url: https://registry.redhat.io/v2/rhosp15-rhel8/openstack-gnocchi-api/blobs/sha256:f6bf9ba68a4b5d8453bcff59508fae612e59cc9f5f4d762dfac9125ce026b6b7
Broken layer found and removed: /var/lib/image-serve/v2/rhosp15-rhel8/openstack-gnocchi-api/blobs/sha256:f6bf9ba68a4b5d8453bcff59508fae612e59cc9f5f4d762dfac9125ce026b6b7.gz
Completed upload for image registry.redhat.io/rhosp15-rhel8/openstack-gnocchi-statsd:15.0-80
Exception occured while running the command
Traceback (most recent call last):
  File "/usr/lib/python3.6/site-packages/tripleo_common/image/image_export.py", line 71, in export_stream
    for chunk in layer_stream:
  File "/usr/lib/python3.6/site-packages/tripleo_common/image/image_uploader.py", line 1278, in _layer_stream_registry
    cls.check_status(session=session, request=blob_req)
  File "/usr/lib/python3.6/site-packages/tripleo_common/image/image_uploader.py", line 525, in check_status
    request.raise_for_status()
  File "/usr/lib/python3.6/site-packages/requests/models.py", line 940, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: https://registry.redhat.io/v2/rhosp15-rhel8/openstack-aodh-notifier/blobs/sha256:79d6925c4e5071e39c69ab44ba57ed0957eeb68cfadd33f43673170fffce4f95

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.6/site-packages/tripleoclient/command.py", line 32, in run
    super(Command, self).run(parsed_args)
  File "/usr/lib/python3.6/site-packages/osc_lib/command/command.py", line 41, in run
    return super(Command, self).run(parsed_args)
  File "/usr/lib/python3.6/site-packages/cliff/command.py", line 184, in run
    return_code = self.take_action(parsed_args) or 0
  File "/usr/lib/python3.6/site-packages/tripleoclient/v1/container_image.py", line 657, in take_action
    cleanup=parsed_args.cleanup)
  File "/usr/lib/python3.6/site-packages/tripleo_common/image/kolla_builder.py", line 213, in container_images_prepare_multi
    uploader.upload()
  File "/usr/lib/python3.6/site-packages/tripleo_common/image/image_uploader.py", line 237, in upload
    uploader.run_tasks()
  File "/usr/lib/python3.6/site-packages/tripleo_common/image/image_uploader.py", line 1823, in run_tasks
    for result in p.map(upload_task, self.upload_tasks):
  File "/usr/lib64/python3.6/concurrent/futures/_base.py", line 586, in result_iterator
    yield fs.pop().result()
  File "/usr/lib64/python3.6/concurrent/futures/_base.py", line 432, in result
    return self.__get_result()
  File "/usr/lib64/python3.6/concurrent/futures/_base.py", line 384, in __get_result
    raise self._exception
  File "/usr/lib64/python3.6/concurrent/futures/thread.py", line 56, in run
    result = self.fn(*self.args, **self.kwargs)
  File "/usr/lib/python3.6/site-packages/tripleo_common/image/image_uploader.py", line 1874, in upload_task
    return uploader.upload_image(task)
  File "/usr/lib/python3.6/site-packages/tripleo_common/image/image_uploader.py", line 1130, in upload_image
    target_session=target_session
  File "/usr/lib/python3.6/site-packages/tenacity/__init__.py", line 292, in wrapped_f
    return self.call(f, *args, **kw)
  File "/usr/lib/python3.6/site-packages/tenacity/__init__.py", line 358, in call
    do = self.iter(retry_state=retry_state)
  File "/usr/lib/python3.6/site-packages/tenacity/__init__.py", line 319, in iter
    return fut.result()
  File "/usr/lib64/python3.6/concurrent/futures/_base.py", line 425, in result
    return self.__get_result()
  File "/usr/lib64/python3.6/concurrent/futures/_base.py", line 384, in __get_result
    raise self._exception
  File "/usr/lib/python3.6/site-packages/tenacity/__init__.py", line 361, in call
    result = fn(*args, **kwargs)
  File "/usr/lib/python3.6/site-packages/tripleo_common/image/image_uploader.py", line 1373, in _copy_registry_to_registry
    raise e
  File "/usr/lib64/python3.6/concurrent/futures/thread.py", line 56, in run
    result = self.fn(*self.args, **self.kwargs)
  File "/usr/lib/python3.6/site-packages/tenacity/__init__.py", line 292, in wrapped_f
    return self.call(f, *args, **kw)
  File "/usr/lib/python3.6/site-packages/tenacity/__init__.py", line 358, in call
    do = self.iter(retry_state=retry_state)
  File "/usr/lib/python3.6/site-packages/tenacity/__init__.py", line 331, in iter
    raise retry_exc.reraise()
  File "/usr/lib/python3.6/site-packages/tenacity/__init__.py", line 167, in reraise
    raise self.last_attempt.result()
  File "/usr/lib64/python3.6/concurrent/futures/_base.py", line 425, in result
    return self.__get_result()
  File "/usr/lib64/python3.6/concurrent/futures/_base.py", line 384, in __get_result
    raise self._exception
  File "/usr/lib/python3.6/site-packages/tenacity/__init__.py", line 361, in call
    result = fn(*args, **kwargs)
  File "/usr/lib/python3.6/site-packages/tripleo_common/image/image_uploader.py", line 1309, in _copy_layer_registry_to_registry
    layer_stream, target_session)
  File "/usr/lib/python3.6/site-packages/tripleo_common/image/image_uploader.py", line 1608, in _copy_stream_to_registry
    target_url, layer, layer_stream, verify_digest=verify_digest)
  File "/usr/lib/python3.6/site-packages/tripleo_common/image/image_export.py", line 83, in export_stream
    raise IOError(write_error)
OSError: Write Failure: 401 Client Error: Unauthorized for url: https://registry.redhat.io/v2/rhosp15-rhel8/openstack-aodh-notifier/blobs/sha256:79d6925c4e5071e39c69ab44ba57ed0957eeb68cfadd33f43673170fffce4f95
Write Failure: 401 Client Error: Unauthorized for url: https://registry.redhat.io/v2/rhosp15-rhel8/openstack-aodh-notifier/blobs/sha256:79d6925c4e5071e39c69ab44ba57ed0957eeb68cfadd33f43673170fffce4f95
Completed upload for image registry.redhat.io/rhosp15-rhel8/openstack-cinder-volume:15.0-83
Completed upload for image registry.redhat.io/rhosp15-rhel8/openstack-cinder-api:15.0-82
sys:1: ResourceWarning: unclosed <socket.socket fd=10, family=AddressFamily.AF_INET, type=2049, proto=6, laddr=('10.101.52.5', 37092)>
sys:1: ResourceWarning: unclosed <socket.socket fd=42, family=AddressFamily.AF_INET, type=2049, proto=6, laddr=('10.101.52.5', 37130)>
sys:1: ResourceWarning: unclosed <ssl.SSLSocket fd=47, family=AddressFamily.AF_INET, type=2049, proto=6, laddr=('192.168.122.5', 46900), raddr=('92.123.208.17', 443)>
sys:1: ResourceWarning: unclosed <ssl.SSLSocket fd=16, family=AddressFamily.AF_INET, type=2049, proto=6, laddr=('192.168.122.5', 46902), raddr=('92.123.208.17', 443)>
sys:1: ResourceWarning: unclosed <ssl.SSLSocket fd=48, family=AddressFamily.AF_INET, type=2049, proto=6, laddr=('192.168.122.5', 46890), raddr=('92.123.208.17', 443)>
sys:1: ResourceWarning: unclosed <socket.socket fd=51, family=AddressFamily.AF_INET, type=2049, proto=6, laddr=('10.101.52.5', 37260)>
sys:1: ResourceWarning: unclosed <socket.socket fd=61, family=AddressFamily.AF_INET, type=2049, proto=6, laddr=('10.101.52.5', 37274)>
sys:1: ResourceWarning: unclosed <ssl.SSLSocket fd=57, family=AddressFamily.AF_INET, type=2049, proto=6, laddr=('192.168.122.5', 47016), raddr=('92.123.208.17', 443)>
sys:1: ResourceWarning: unclosed <ssl.SSLSocket fd=58, family=AddressFamily.AF_INET, type=2049, proto=6, laddr=('192.168.122.5', 47014), raddr=('92.123.208.17', 443)>
sys:1: ResourceWarning: unclosed <ssl.SSLSocket fd=63, family=AddressFamily.AF_INET, type=2049, proto=6, laddr=('192.168.122.5', 47022), raddr=('92.123.208.17', 443)>
sys:1: ResourceWarning: unclosed <ssl.SSLSocket fd=69, family=AddressFamily.AF_INET, type=2049, proto=6, laddr=('192.168.122.5', 47026), raddr=('92.123.208.17', 443)>
sys:1: ResourceWarning: unclosed <ssl.SSLSocket fd=74, family=AddressFamily.AF_INET, type=2049, proto=6, laddr=('192.168.122.5', 47044), raddr=('92.123.208.17', 443)>
sys:1: ResourceWarning: unclosed <ssl.SSLSocket fd=55, family=AddressFamily.AF_INET, type=2049, proto=6, laddr=('192.168.122.5', 47038), raddr=('92.123.208.17', 443)>
sys:1: ResourceWarning: unclosed <ssl.SSLSocket fd=75, family=AddressFamily.AF_INET, type=2049, proto=6, laddr=('192.168.122.5', 47046), raddr=('92.123.208.17', 443)>
sys:1: ResourceWarning: unclosed <ssl.SSLSocket fd=35, family=AddressFamily.AF_INET, type=2049, proto=6, laddr=('192.168.122.5', 47036), raddr=('92.123.208.17', 443)>
sys:1: ResourceWarning: unclosed <ssl.SSLSocket fd=71, family=AddressFamily.AF_INET, type=2049, proto=6, laddr=('192.168.122.5', 47040), raddr=('92.123.208.17', 443)>
sys:1: ResourceWarning: unclosed <socket.socket fd=33, family=AddressFamily.AF_INET, type=2049, proto=6, laddr=('10.101.52.5', 49262)>
sys:1: ResourceWarning: unclosed <socket.socket fd=9, family=AddressFamily.AF_INET, type=2049, proto=6, laddr=('10.101.52.5', 53760)>
sys:1: ResourceWarning: unclosed <socket.socket fd=5, family=AddressFamily.AF_INET, type=2049, proto=6, laddr=('10.101.52.5', 59888)>


Observations:

- Why are some container images reported with Unauthorized (401) client errors while most of them are uploaded without any issue?
- Why is there a version difference for the openstack-cinder-* container images, i.e. the tags downloaded into the undercloud local registry are older than the tags the overcloud deployment later tries to pull?

I believe that if the latest version tag is present in the image preparation YAML file, then only that latest tag should be downloaded/pulled into the local registry on each attempt, as sketched below.
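
For reference, this is roughly what I mean: a minimal sketch of an image preparation entry with an explicitly pinned tag (values are illustrative, not taken from my actual environment file):

~~~
# containers-prepare-parameter.yaml (illustrative sketch, not my actual file)
parameter_defaults:
  ContainerImagePrepare:
  - push_destination: true          # serve the images from the undercloud local registry
    set:
      namespace: registry.redhat.io/rhosp15-rhel8
      name_prefix: openstack-
      name_suffix: ''
      tag: '15.0-79'                # pinning the tag here should, I believe, make both the
                                    # local download and the overcloud pull use the same tag
~~~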

Please advise.

Thanks.

Comment 19 Salman Khan 2020-01-13 05:27:34 UTC
Created attachment 1651727 [details]
overcloud deployment ansible.log

Note that a different version of the openstack-cinder-* images is being pulled on the overcloud than the one that was downloaded into the undercloud local registry.
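
If it helps, this is a quick way to compare what the undercloud local registry actually serves with what the failing controller tries to pull (assuming the local registry exposes the standard v2 tags endpoint; the IP, image name and tag are taken from the logs above):

~~~
# list the cinder-volume tags currently available in the undercloud local registry
curl -s http://10.101.52.5:8787/v2/rhosp15-rhel8/openstack-cinder-volume/tags/list

# the exact pull the failing controller attempts, as seen in ansible.log
podman pull 10.101.52.5:8787/rhosp15-rhel8/openstack-cinder-volume:15.0-79
~~~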

Comment 20 Alex Schultz 2020-01-13 14:53:19 UTC
Please go through Red Hat Support to continue troubleshooting. We are unable to reproduce this, and the logs provided are not sufficient to troubleshoot. Another possible cause is the use of a caching proxy, which would result in incorrect metadata being fetched.
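
For example, a quick check for a proxy sitting between the undercloud and registry.redhat.io (illustrative commands, not a definitive diagnosis) would be:

~~~
# look for proxy settings that could affect registry metadata fetched on the undercloud
env | grep -i proxy
grep -ri proxy /etc/environment /etc/profile.d/ 2>/dev/null
~~~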

