openstack overcloud node clean reports that all nodes were cleaned, but one node is stuck in clean_wait.

Environment:
python2-ironicclient-2.5.0-0.20180810135843.fb94fb8.el7ost.noarch
python2-ironic-inspector-client-3.3.0-0.20180810080932.53bf4e8.el7ost.noarch
puppet-ironic-13.3.1-0.20180831191239.61387eb.el7ost.noarch
instack-undercloud-9.3.1-0.20180831000258.e464799.el7ost.noarch

On the bare-metal setup, ran:

(undercloud) [stack@hardprov-fx2-1 ~]$ openstack overcloud node clean --all-manageable
Waiting for messages on queue 'tripleo' with no timeout.
Cleaned 7 node(s)

(undercloud) [stack@hardprov-fx2-1 ~]$ openstack baremetal node list
+--------------------------------------+--------------+---------------+-------------+--------------------+-------------+
| UUID                                 | Name         | Instance UUID | Power State | Provisioning State | Maintenance |
+--------------------------------------+--------------+---------------+-------------+--------------------+-------------+
| 67188f50-6daa-415a-83b9-0965219f5e99 | controller-0 | None          | power off   | manageable         | False       |
| dabcceea-2cdf-405a-ac90-dce370ee296e | controller-1 | None          | power on    | clean wait         | False       |
| b5a15206-d9a3-4c46-8f99-7024c022a713 | controller-2 | None          | power off   | manageable         | False       |
| 908453f7-a02d-45de-8ea5-849625e6d47e | compute-0    | None          | power off   | manageable         | False       |
| 7865d6d6-b4c3-4e0b-af21-bd8b9709fe87 | ceph-0      | None          | power off   | manageable         | False       |
| a788aa01-bb73-4ed5-9a91-ab9505f4477c | ironic-0     | None          | power off   | manageable         | False       |
| 644d3223-207c-4c3a-a070-84c3b1c9f052 | ironic-1     | None          | power off   | manageable         | False       |
+--------------------------------------+--------------+---------------+-------------+--------------------+-------------+
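As a workaround while the underlying bug is open, a node stuck in clean wait can usually be recovered manually through the standard ironic state machine. A hedged sketch (not part of this report; the node name controller-1 is taken from the listing above, and it assumes the node really is wedged rather than still cleaning):

```shell
# Check the stuck node's current provision state and last error
openstack baremetal node show controller-1 -f value -c provision_state -c last_error

# Abort the in-progress clean; the node should transition to "clean failed"
openstack baremetal node abort controller-1

# Clear maintenance mode if ironic set it during the failure,
# then move the node back to "manageable" so cleaning can be retried
openstack baremetal node maintenance unset controller-1
openstack baremetal node manage controller-1
```

After the node is back in manageable, `openstack overcloud node clean` can be re-run against it.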
Created attachment 1485184 [details] sosreports
*** Bug 1665564 has been marked as a duplicate of this bug. ***
Since the problem described in this bug report should be resolved in a recent advisory, it has been closed with a resolution of ERRATA. For information on the advisory, and where to find the updated files, follow the link below. If the solution does not work for you, open a new bug report. https://access.redhat.com/errata/RHBA-2019:0446
*** Bug 1671897 has been marked as a duplicate of this bug. ***