Bug 2071676

Summary: Adding rgw using site-container.yml with limit fails
Product: [Red Hat Storage] Red Hat Ceph Storage
Reporter: Ameena Suhani S H <amsyedha>
Component: Ceph-Ansible
Assignee: Guillaume Abrioux <gabrioux>
Status: CLOSED ERRATA
QA Contact: Ameena Suhani S H <amsyedha>
Severity: medium
Docs Contact:
Priority: unspecified
Version: 3.3
CC: aschoen, ceph-eng-bugs, ceph-qe-bugs, gmeno, nthomas, tserlin, ykaul
Target Milestone: ---
Flags: amsyedha: automate_bug+
Target Release: 3.3z8
Hardware: Unspecified
OS: Unspecified
Whiteboard:
Fixed In Version: ceph-ansible-3.2.59-1.el7cp
Doc Type: If docs needed, set a value
Doc Text:
Story Points: ---
Clone Of:
Environment:
Last Closed: 2022-04-19 10:20:59 UTC
Type: Bug
Regression: ---
Mount Type: ---
Documentation: ---
CRM:
Verified Versions:
Category: ---
oVirt Team: ---
RHEL 7.3 requirements from Atomic Host:
Cloudforms Team: ---
Target Upstream Version:
Embargoed:

Description Ameena Suhani S H 2022-04-04 14:17:58 UTC
Description of problem:
Adding an RGW using site-container.yml with --limit fails at the task below.
Logs: http://magna002.ceph.redhat.com/cephci-jenkins/cephci-run-QS8UIV/config_roll_over_rgw_0.log


TASK [ceph-osd : wait for all osd to be up] ************************************
2022-04-02 10:12:36,808 - ceph.ceph - INFO - fatal: [ceph-suhu-qs8uiv-node12]: FAILED! => {
    "msg": "The conditional check '(wait_for_all_osds_up.stdout | default('{}') | from_json)[\"osdmap\"][\"osdmap\"][\"num_osds\"] | int > 0' failed. The error was: error while evaluating conditional ((wait_for_all_osds_up.stdout | default('{}') | from_json)[\"osdmap\"][\"osdmap\"][\"num_osds\"] | int > 0): 'dict object' has no attribute 'osdmap'"
}
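The failure mode in the log can be reproduced in plain Python (a minimal sketch, not ceph-ansible code): when the play runs with --limit and the registered command output is empty, the Jinja2 `default('{}')` filter substitutes an empty JSON object, which has no "osdmap" key, so the nested lookup fails (surfacing in Ansible as "'dict object' has no attribute 'osdmap'"):

```python
import json

# Hypothetical: on a limited run the "wait for all osd to be up" command
# output is empty, so default('{}') yields an empty JSON object.
stdout = ""
parsed = json.loads(stdout or "{}")

try:
    # Mirrors the conditional's nested lookup:
    # (... | from_json)["osdmap"]["osdmap"]["num_osds"] | int > 0
    num_osds = parsed["osdmap"]["osdmap"]["num_osds"]
except KeyError as exc:
    # Equivalent of Ansible's "'dict object' has no attribute 'osdmap'"
    num_osds = None
    print(f"lookup failed: missing key {exc}")
```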
Version-Release number of selected component (if applicable):
3.2.58-1.el7cp.noarch


How reproducible:
2/2

Steps to Reproduce:
1. Install an RHCS 3 cluster.
2. Add OSDs to the cluster.
3. Try to add an RGW using the site-container.yml playbook with --limit.
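Step 3 above corresponds to an invocation like the following (the inventory file name and the `rgws` group are assumptions; substitute the actual host or group being added):

```shell
# Hypothetical reproduction command; inventory and group names are assumptions.
ansible-playbook -i hosts site-container.yml --limit rgws
```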

Actual results:
The playbook fails.

Expected results:
The playbook should complete successfully and the RGW should be deployed.

Comment 5 Ameena Suhani S H 2022-04-06 03:36:05 UTC
Verified using 3.2.59-1.el7cp.noarch

http://magna002.ceph.redhat.com/cephci-jenkins/cephci-run-97Q854/

Comment 7 errata-xmlrpc 2022-04-19 10:20:59 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory (Important: Red Hat Ceph Storage 3 Security and Bug Fix update), and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHSA-2022:1394