Bug 1336875

Summary: [ceph-ansible] : UBUNTU : rgw installation fails in task 'TASK: [ceph.ceph-common | get ceph version]'
Product: [Red Hat Storage] Red Hat Storage Console Reporter: Rachana Patel <racpatel>
Component: ceph-ansible Assignee: Andrew Schoen <aschoen>
Status: CLOSED ERRATA QA Contact: sds-qe-bugs
Severity: high Docs Contact:
Priority: unspecified    
Version: 2    CC: adeza, aschoen, ceph-eng-bugs, kdreyer, nthomas, racpatel, sankarshan, tchandra
Target Milestone: ---   
Target Release: 2   
Hardware: x86_64   
OS: Linux   
Whiteboard:
Fixed In Version: ceph-ansible-1.0.5-15.el7scon Doc Type: If docs needed, set a value
Doc Text:
Story Points: ---
Clone Of: Environment:
Last Closed: 2016-08-23 19:50:48 UTC Type: Bug
Regression: --- Mount Type: ---
Documentation: --- CRM:
Verified Versions: Category: ---
oVirt Team: --- RHEL 7.3 requirements from Atomic Host:
Cloudforms Team: --- Target Upstream Version:
Embargoed:

Description Rachana Patel 2016-05-17 16:38:19 UTC
Created attachment 1158406 [details]
complete output

Description of problem:
=======================
During RGW installation using ceph-ansible, the installation fails when it tries to get the ceph version on the RGW node, because the ceph packages have not been installed on that node.

TASK: [ceph.ceph-common | get ceph version] *********************************** 
<magna067> REMOTE_MODULE command ceph --version
failed: [magna067] => {"changed": false, "cmd": "ceph --version", "failed": true, "rc": 2}
msg: [Errno 2] No such file or directory
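
The failing task is the ceph version check in the ceph.ceph-common role, which simply runs 'ceph --version' through the command module (as the REMOTE_MODULE line above shows). A minimal sketch of what such a task looks like is below; the exact task text and registered variable name are assumptions for illustration, not copied from the ceph-ansible source:

    # Sketch only: approximate form of the version check that fails on the
    # rgw node, since no ceph binary has been installed there yet.
    - name: get ceph version
      command: ceph --version
      changed_when: false
      register: ceph_version

Because the RGW node never receives the ceph/radosgw packages (see comment 4), the binary is missing and the command exits with ENOENT, which is the "[Errno 2] No such file or directory" error shown above.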

 


Version-Release number of selected component (if applicable):
=============================================================



How reproducible:
=================
always


Steps to Reproduce:
====================
1. Perform the prerequisites on all Ubuntu nodes.
2. Update the inventory file with one MON node, three OSD nodes, and one RGW node:
[mons]
magna051

[osds]
magna051
magna057
magna066

[rgws]
magna067

3. Run the command below:

[ubuntu@magna044 ceph-ansible]$ ansible-playbook site.yml -vv -i  /etc/ansible/hosts  --extra-vars '{"ceph_stable": true, "ceph_origin": "distro", "ceph_stable_rh_storage": true, "monitor_interface": "eth0", "journal_collocation": true, "devices": ["/dev/sdb", "/dev/sdc"], "journal_size": 100, "public_network": "xx.xx.xx.xx/21"}' 
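
The same settings can also be kept in a group_vars file instead of being passed as --extra-vars; a minimal sketch, assuming the usual ceph-ansible group_vars/all layout, with the values copied from the command above (the public network stays masked as in the original):

    # group_vars/all (sketch; equivalent to the --extra-vars above)
    ceph_stable: true
    ceph_origin: distro
    ceph_stable_rh_storage: true
    monitor_interface: eth0
    journal_collocation: true
    devices:
      - /dev/sdb
      - /dev/sdc
    journal_size: 100
    public_network: xx.xx.xx.xx/21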

Actual results:
===============
TASK: [ceph.ceph-common | get ceph version] *********************************** 
<magna067> REMOTE_MODULE command ceph --version
failed: [magna067] => {"changed": false, "cmd": "ceph --version", "failed": true, "rc": 2}
msg: [Errno 2] No such file or directory

FATAL: all hosts have already failed -- aborting

PLAY RECAP ******************************************************************** 
           to retry, use: --limit @/home/ubuntu/site.retry

magna051                   : ok=245  changed=14   unreachable=0    failed=0   
magna057                   : ok=157  changed=9    unreachable=0    failed=0   
magna066                   : ok=157  changed=10   unreachable=0    failed=0   
magna067                   : ok=22   changed=2    unreachable=0    failed=1   



Expected results:
=================
The ceph/radosgw packages should be installed on the RGW node and the playbook should complete without failures.


Additional info:
================

Comment 2 Rachana Patel 2016-05-17 16:44:03 UTC
Version-Release number of selected component (if applicable):
=============================================================
10.2.1-2redhat1xenial
ceph-ansible-1.0.5-12.el7scon.noarch

Comment 3 Alfredo Deza 2016-05-19 14:39:09 UTC
*** Bug 1336661 has been marked as a duplicate of this bug. ***

Comment 4 Andrew Schoen 2016-05-19 15:06:55 UTC
The radosgw package was not being installed by ceph-ansible when using RHCS on Debian systems. I've filed a PR upstream to fix this.

https://github.com/ceph/ceph-ansible/pull/797
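
For reference, the fix boils down to having ceph-ansible install the radosgw package via apt on Debian-family hosts when the Red Hat Storage repositories are in use. A minimal sketch of such a task follows; the task name and exact condition are assumptions for illustration, see the PR above for the actual change:

    # Sketch only -- not the literal diff from the PR.
    - name: install radosgw on debian-based hosts (red hat storage)
      apt: name=radosgw state=present update_cache=yes
      when: ansible_os_family == 'Debian' and ceph_stable_rh_storage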

Comment 8 Tejas 2016-06-14 10:41:54 UTC
Verified in build:
ceph-ansible-1.0.5-19.el7scon.noarch
ceph version 10.2.1-8redhat1xenial (0abd18aa5da0dde0f01404d8ac10876cb3691bb3)

Moving to verified state.

Comment 10 errata-xmlrpc 2016-08-23 19:50:48 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory, and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHEA-2016:1754