Description of problem:
Adding a new OSD node to an existing containerized Ceph cluster fails in the ceph-ansible task "ceph-mds : create filesystem pools".

Version-Release number of selected component (if applicable):
ceph-ansible-3.2.4-1.el7cp.noarch
ceph version 12.2.8-52.el7cp (3af3ca15b68572a357593c261f95038d02f46201) luminous (stable)

How reproducible:
2/2

Steps to Reproduce:
1. On the existing ceph cluster, add an entry for the new OSD node in the inventory file, then run the ansible playbook with the '--limit osds' option.

Actual results:
Command issued:
cd /usr/share/ceph-ansible ; ANSIBLE_STDOUT_CALLBACK=debug; ansible-playbook -vv -i hosts site.yml --limit osds

2019-01-29 20:22:26,607 - ceph.ceph - INFO - TASK [ceph-mds : create filesystem pools] **************************************
task path: /usr/share/ceph-ansible/roles/ceph-mds/tasks/create_mds_filesystems.yml:4
2019-01-29 20:22:26,607 - ceph.ceph - INFO - Tuesday 29 January 2019 15:22:25 -0500 (0:00:00.127) 0:13:29.170 *******
2019-01-29 20:22:26,858 - ceph.ceph - INFO - failed: [ceph-ansible-1548782152719-node4-osdmds -> ceph-ansible-1548782152719-node11-pool] (item={u'name': u'cephfs_data', u'pgs': u'8'}) => {"changed": false, "cmd": "ceph --cluster ceph osd pool create cephfs_data 8", "item": {"name": "cephfs_data", "pgs": "8"}, "msg": "[Errno 2] No such file or directory", "rc": 2}
2019-01-29 20:22:27,057 - ceph.ceph - INFO - failed: [ceph-ansible-1548782152719-node4-osdmds -> ceph-ansible-1548782152719-node11-pool] (item={u'name': u'cephfs_metadata', u'pgs': u'8'}) => {"changed": false, "cmd": "ceph --cluster ceph osd pool create cephfs_metadata 8", "item": {"name": "cephfs_metadata", "pgs": "8"}, "msg": "[Errno 2] No such file or directory", "rc": 2}

Expected results:
The new OSD should be added successfully.

Additional info:
Ansible log for adding the new OSD: http://magna002.ceph.redhat.com/cephci-jenkins/cephci-run-1548782152719/config_roll_over_1.log
Entire suite log: http://magna002.ceph.redhat.com/cephci-jenkins/cephci-run-1548782152719/
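
For anyone reproducing this, a minimal sketch of the inventory change that triggers the failure (host names below are placeholders, not the hosts from the log above): the new OSD host is appended to the [osds] group of the existing inventory, and the playbook is re-run limited to that group.

# /usr/share/ceph-ansible/hosts -- illustrative inventory, host names are placeholders
[mons]
mon-node1

[mdss]
mds-node1

[osds]
osd-node1
osd-node2
# newly added OSD host:
osd-node3-new

# then re-run the playbook restricted to the osds group:
cd /usr/share/ceph-ansible
ansible-playbook -vv -i hosts site.yml --limit osds

The "[Errno 2] No such file or directory" (rc 2) comes from the 'ceph osd pool create' command that the ceph-mds role delegates to the monitor node; in a containerized deployment the ceph CLI only exists inside the monitor container, so the error is consistent with the command being run on the bare host rather than inside the container, although the log excerpt alone does not confirm that root cause.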
*** Bug 1670661 has been marked as a duplicate of this bug. ***
Since the problem described in this bug report should be resolved in a recent advisory, it has been closed with a resolution of ERRATA. For information on the advisory, and where to find the updated files, follow the link below. If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHSA-2019:0911