Bug 1732157 - ceph-osd: OpenStack pool creation fails with "unable to exec into ceph-mon-controller-0: no container with name or ID ceph-mon-controller-0 found: no such container"
Summary: ceph-osd: OpenStack pool creation fails with "unable to exec into ceph-mon-controller-0: no container with name or ID ceph-mon-controller-0 found: no such container"
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: Red Hat Ceph Storage
Classification: Red Hat Storage
Component: Ceph-Ansible
Version: 4.0
Hardware: x86_64
OS: Linux
Priority: urgent
Severity: medium
Target Milestone: rc
Target Release: 4.0
Assignee: Dimitri Savineau
QA Contact: Vasishta
URL:
Whiteboard:
Depends On:
Blocks: 1642481
 
Reported: 2019-07-22 19:57 UTC by Dimitri Savineau
Modified: 2020-01-31 12:47 UTC
CC List: 12 users

Fixed In Version: ceph-ansible-4.0.5-1.el8cp.noarch.rpm
Doc Type: If docs needed, set a value
Doc Text:
Clone Of: 1722066
Environment:
Last Closed: 2020-01-31 12:46:52 UTC
Embargoed:


Links
Github ceph/ceph-ansible pull 4258 (closed): ceph-osd: check container engine rc for pools - last updated 2020-06-23 08:17:33 UTC
Github ceph/ceph-ansible pull 4279 (closed): ceph-osd: check container engine rc for pools (bp #4258) - last updated 2020-06-23 08:17:33 UTC
Red Hat Product Errata RHBA-2020:0312 - last updated 2020-01-31 12:47:23 UTC

Description Dimitri Savineau 2019-07-22 19:57:33 UTC
--- Additional comment from Artem Hrechanychenko on 2019-07-18 09:31:19 UTC ---


(undercloud) [stack@undercloud-0 ~]$ rpm -qa ceph-ansible
ceph-ansible-4.0.0-0.1.rc10.el8cp.noarch


 "failed: [ceph-2 -> 192.168.24.8] (item=[{'application': 'openstack_gnocchi', 'name': 'metrics', 'pg_num': 32, 'rule_name': 'replicated_rule'}, {'msg': 'non-zero return code', 'cmd': ['podman', 'exec', 'ceph-mon-controller-0', 'ce
ph', '--cluster', 'ceph', 'osd', 'pool', 'get', 'metrics', 'size'], 'stdout': '', 'stderr': 'unable to exec into ceph-mon-controller-0: no container with name or ID ceph-mon-controller-0 found: no such container', 'rc': 125, 'start': '201
9-07-17 16:49:47.920625', 'end': '2019-07-17 16:49:47.966148', 'delta': '0:00:00.045523', 'changed': True, 'failed': False, 'invocation': {'module_args': {'_raw_params': 'podman exec ceph-mon-controller-0 ceph --cluster ceph osd pool get
metrics size\\n', 'warn': True, '_uses_shell': False, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lin
es': ['unable to exec into ceph-mon-controller-0: no container with name or ID ceph-mon-controller-0 found: no such container'], 'failed_when_result': False, 'item': {'application': 'openstack_gnocchi', 'name': 'metrics', 'pg_num': 32, 'r
ule_name': 'replicated_rule'}, 'ansible_loop_var': 'item'}]) => changed=false ",
"  delta: '0:00:00.053923'",
        "  end: '2019-07-17 16:49:49.504360'",
        "        podman exec ceph-mon-controller-0 ceph --cluster ceph osd pool create metrics 32 32 replicated_rule 1",
        "  - application: openstack_gnocchi",
        "    - metrics",
        "    delta: '0:00:00.045523'",
        "    end: '2019-07-17 16:49:47.966148'",
        "          podman exec ceph-mon-controller-0 ceph --cluster ceph osd pool get metrics size",
        "      application: openstack_gnocchi",
        "      name: metrics",
        "    start: '2019-07-17 16:49:47.920625'",
        "  start: '2019-07-17 16:49:49.450437'",

[heat-admin@ceph-2 ~]$ sudo podman ps -a
CONTAINER ID  IMAGE                                                COMMAND               CREATED       STATUS           PORTS  NAMES
77e3cf880b9c  192.168.24.1:8787/rhosp15/openstack-cron:20190711.1  dumb-init --singl...  23 hours ago  Up 23 hours ago         logrotate_crond
9947cb175aed  192.168.24.1:8787/ceph/rhceph-4.0-rhel8:latest       /opt/ceph-contain...  23 hours ago  Up 23 hours ago         ceph-osd-8
6321d76031e1  192.168.24.1:8787/ceph/rhceph-4.0-rhel8:latest       /opt/ceph-contain...  23 hours ago  Up 23 hours ago         ceph-osd-5
00ddb30cbf84  192.168.24.1:8787/ceph/rhceph-4.0-rhel8:latest       /opt/ceph-contain...  23 hours ago  Up 23 hours ago         ceph-osd-14
b83a4a18df38  192.168.24.1:8787/ceph/rhceph-4.0-rhel8:latest       /opt/ceph-contain...  23 hours ago  Up 23 hours ago         ceph-osd-11
47242e9e34b7  192.168.24.1:8787/ceph/rhceph-4.0-rhel8:latest       /opt/ceph-contain...  23 hours ago  Up 23 hours ago         ceph-osd-1

Comment 1 Giridhar Ramaraju 2019-08-05 13:12:17 UTC
Updating the QA Contact to Hemant. Hemant will be rerouting these to the appropriate QE Associate.

Regards,
Giri

Comment 2 Giridhar Ramaraju 2019-08-05 13:13:04 UTC
Updating the QA Contact to Hemant. Hemant will be rerouting these to the appropriate QE Associate.

Regards,
Giri

Comment 10 Yogev Rabl 2020-01-21 18:57:09 UTC
Verified

Comment 12 errata-xmlrpc 2020-01-31 12:46:52 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory, and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHBA-2020:0312

