Bug 2236691 - Default images for downstream RHCS 7.0 monitoring stack being pulled from quay instead of redhat registry
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: Red Hat Ceph Storage
Classification: Red Hat Storage
Component: Cephadm
Version: 7.0
Hardware: x86_64
OS: Linux
Priority: unspecified
Severity: urgent
Target Milestone: ---
Target Release: 7.0
Assignee: Adam King
QA Contact: Mohit Bisht
Docs Contact: Rivka Pollack
URL:
Whiteboard:
Depends On:
Blocks:
Reported: 2023-09-01 09:07 UTC by Vinayak Papnoi
Modified: 2023-12-13 15:22 UTC
CC: 7 users

Fixed In Version: ceph-18.2.0-69
Doc Type: No Doc Update
Doc Text:
Clone Of:
Environment:
Last Closed: 2023-12-13 15:22:39 UTC
Embargoed:




Links
System ID Private Priority Status Summary Last Updated
Red Hat Issue Tracker RHCEPH-7300 0 None None None 2023-09-01 09:09:32 UTC
Red Hat Product Errata RHBA-2023:7780 0 None None None 2023-12-13 15:22:42 UTC

Description Vinayak Papnoi 2023-09-01 09:07:12 UTC
Description of problem:

While deploying RHCS 7.0 with the monitoring stack enabled, the monitoring images are pulled from quay.io instead of the Red Hat registry.

# podman images
REPOSITORY                                            TAG                                                        IMAGE ID      CREATED       SIZE
registry-proxy.engineering.redhat.com/rh-osbs/rhceph  ceph-7.0-rhel-9-containers-candidate-52437-20230830172452  1ceac0fe4e9f  39 hours ago  1.05 GB
quay.io/ceph/ceph-grafana                             9.4.7                                                      2c41d148cca3  4 months ago  647 MB
quay.io/prometheus/prometheus                         v2.43.0                                                    a07b618ecd1d  5 months ago  235 MB
quay.io/prometheus/alertmanager                       v0.25.0                                                    c8568f914cd2  8 months ago  66.5 MB
quay.io/prometheus/node-exporter                      v1.5.0                                                     0da6a335fe13  9 months ago  23.9 MB

# ceph orch ps
NAME                                                  HOST                                    PORTS             STATUS         REFRESHED  AGE  MEM USE  MEM LIM  VERSION         IMAGE ID      CONTAINER ID  
alertmanager.ceph-vpapnoi-70-mfoa4d-node1-installer   ceph-vpapnoi-70-mfoa4d-node1-installer  *:9093,9094       running (23h)     4m ago  23h    22.7M        -  0.25.0          c8568f914cd2  0833d7d29cc0  
ceph-exporter.ceph-vpapnoi-70-mfoa4d-node1-installer  ceph-vpapnoi-70-mfoa4d-node1-installer                    running (23h)     4m ago  23h    18.3M        -  18.2.0-2.el9cp  1ceac0fe4e9f  142caa51e813  
ceph-exporter.ceph-vpapnoi-70-mfoa4d-node2            ceph-vpapnoi-70-mfoa4d-node2                              running (23h)    92s ago  23h    16.8M        -  18.2.0-2.el9cp  1ceac0fe4e9f  79d2d2002056  
ceph-exporter.ceph-vpapnoi-70-mfoa4d-node3            ceph-vpapnoi-70-mfoa4d-node3                              running (23h)    91s ago  23h    16.8M        -  18.2.0-2.el9cp  1ceac0fe4e9f  b0009cac2897  
crash.ceph-vpapnoi-70-mfoa4d-node1-installer          ceph-vpapnoi-70-mfoa4d-node1-installer                    running (23h)     4m ago  23h    6899k        -  18.2.0-2.el9cp  1ceac0fe4e9f  b19f598ac7ef  
crash.ceph-vpapnoi-70-mfoa4d-node2                    ceph-vpapnoi-70-mfoa4d-node2                              running (23h)    92s ago  23h    6895k        -  18.2.0-2.el9cp  1ceac0fe4e9f  0518c2fc359b  
crash.ceph-vpapnoi-70-mfoa4d-node3                    ceph-vpapnoi-70-mfoa4d-node3                              running (23h)    91s ago  23h    6903k        -  18.2.0-2.el9cp  1ceac0fe4e9f  874cf13c621e  
grafana.ceph-vpapnoi-70-mfoa4d-node1-installer        ceph-vpapnoi-70-mfoa4d-node1-installer  *:3000            running (23h)     4m ago  23h    84.3M        -  9.4.7           2c41d148cca3  e638fa0c6fde  
mds.cephfs.ceph-vpapnoi-70-mfoa4d-node2.rdexvw        ceph-vpapnoi-70-mfoa4d-node2                              running (23h)    92s ago  23h    24.8M        -  18.2.0-2.el9cp  1ceac0fe4e9f  10fa0073a1b5  
mds.cephfs.ceph-vpapnoi-70-mfoa4d-node3.hcpbog        ceph-vpapnoi-70-mfoa4d-node3                              running (23h)    91s ago  23h    24.5M        -  18.2.0-2.el9cp  1ceac0fe4e9f  67261c7f84d8  
mgr.ceph-vpapnoi-70-mfoa4d-node1-installer.zzxfko     ceph-vpapnoi-70-mfoa4d-node1-installer  *:9283,8765,8443  running (23h)     4m ago  23h     548M        -  18.2.0-2.el9cp  1ceac0fe4e9f  c12bd96eb74c  
mgr.ceph-vpapnoi-70-mfoa4d-node3.wnrfng               ceph-vpapnoi-70-mfoa4d-node3            *:8443,9283,8765  running (23h)    91s ago  23h     438M        -  18.2.0-2.el9cp  1ceac0fe4e9f  e10e49d0a27c  
mon.ceph-vpapnoi-70-mfoa4d-node1-installer            ceph-vpapnoi-70-mfoa4d-node1-installer                    running (23h)     4m ago  23h     395M    2048M  18.2.0-2.el9cp  1ceac0fe4e9f  1ddcd773d975  
mon.ceph-vpapnoi-70-mfoa4d-node2                      ceph-vpapnoi-70-mfoa4d-node2                              running (23h)    92s ago  23h     385M    2048M  18.2.0-2.el9cp  1ceac0fe4e9f  30675e931a3c  
mon.ceph-vpapnoi-70-mfoa4d-node3                      ceph-vpapnoi-70-mfoa4d-node3                              running (23h)    91s ago  23h     384M    2048M  18.2.0-2.el9cp  1ceac0fe4e9f  fd27ed61a15a  
node-exporter.ceph-vpapnoi-70-mfoa4d-node1-installer  ceph-vpapnoi-70-mfoa4d-node1-installer  *:9100            running (23h)     4m ago  23h    15.0M        -  1.5.0           0da6a335fe13  d7a08c60f252  
node-exporter.ceph-vpapnoi-70-mfoa4d-node2            ceph-vpapnoi-70-mfoa4d-node2            *:9100            running (23h)    92s ago  23h    14.8M        -  1.5.0           0da6a335fe13  c669e8dffde7  
node-exporter.ceph-vpapnoi-70-mfoa4d-node3            ceph-vpapnoi-70-mfoa4d-node3            *:9100            running (23h)    91s ago  23h    16.3M        -  1.5.0           0da6a335fe13  f1eb2421e8d9  
osd.0                                                 ceph-vpapnoi-70-mfoa4d-node1-installer                    running (23h)     4m ago  23h    60.7M    4096M  18.2.0-2.el9cp  1ceac0fe4e9f  111160b95a60  
osd.1                                                 ceph-vpapnoi-70-mfoa4d-node1-installer                    running (23h)     4m ago  23h    61.7M    4096M  18.2.0-2.el9cp  1ceac0fe4e9f  60911711417d  
osd.2                                                 ceph-vpapnoi-70-mfoa4d-node1-installer                    running (23h)     4m ago  23h    62.7M    4096M  18.2.0-2.el9cp  1ceac0fe4e9f  9981b3c3f2b2  
osd.3                                                 ceph-vpapnoi-70-mfoa4d-node1-installer                    running (23h)     4m ago  23h    63.7M    4096M  18.2.0-2.el9cp  1ceac0fe4e9f  92d23cd3084e  
osd.4                                                 ceph-vpapnoi-70-mfoa4d-node1-installer                    running (23h)     4m ago  23h    62.7M    4096M  18.2.0-2.el9cp  1ceac0fe4e9f  538cd18c3827  
osd.5                                                 ceph-vpapnoi-70-mfoa4d-node1-installer                    running (23h)     4m ago  23h    61.2M    4096M  18.2.0-2.el9cp  1ceac0fe4e9f  6f075e0e98d4  
prometheus.ceph-vpapnoi-70-mfoa4d-node1-installer     ceph-vpapnoi-70-mfoa4d-node1-installer  *:9095            running (23h)     4m ago  23h    78.9M        -  2.43.0          a07b618ecd1d  e6b216ffe84b  
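The image sources shown above come from cephadm's per-service defaults, which are exposed as mgr/cephadm config options. A minimal sketch for inspecting which monitoring images a cluster will deploy (assuming a running cluster with the `ceph` CLI available; output values shown in comments are the upstream defaults, not this cluster's values):

```shell
# Show the monitoring-stack image each cephadm service will use.
# If unset, cephadm falls back to its compiled-in default, which
# upstream points at quay.io -- the behavior reported in this bug.
ceph config get mgr mgr/cephadm/container_image_prometheus
ceph config get mgr mgr/cephadm/container_image_alertmanager
ceph config get mgr mgr/cephadm/container_image_node_exporter
ceph config get mgr mgr/cephadm/container_image_grafana
```

If these return quay.io URLs on a downstream build, the downstream defaults were not compiled in, which matches the symptom above.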


Version-Release number of selected component (if applicable):
18.2.0-2.el9cp

How reproducible:
Always

Steps to Reproduce:
1. Deploy an RHCS 7.0 cluster with monitoring stack
2. Check podman images

Actual results:
Monitoring stack images are pulled from quay.io.

Expected results:
Monitoring stack images must be pulled from the Red Hat registry.

Additional info:
Ref: https://bugzilla.redhat.com/show_bug.cgi?id=2121067
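Until a fixed build is available, the quay.io defaults can be overridden per service and the affected daemons redeployed. A hedged sketch (the registry.redhat.io image path below is illustrative only; substitute the actual downstream image for each service):

```shell
# Point cephadm at a downstream image instead of the quay.io default.
# NOTE: the image path here is a placeholder, not a verified RHCS 7.0 path.
ceph config set mgr mgr/cephadm/container_image_prometheus \
    registry.redhat.io/<downstream-prometheus-image>:<tag>

# Redeploy the service so running daemons pick up the new image.
ceph orch redeploy prometheus
```

The same pattern applies to `container_image_alertmanager`, `container_image_node_exporter`, and `container_image_grafana`, each followed by an `ceph orch redeploy` of the matching service.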

Comment 14 errata-xmlrpc 2023-12-13 15:22:39 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory (Red Hat Ceph Storage 7.0 Bug Fix update), and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHBA-2023:7780

