Note: This bug is displayed in read-only format because the product is no longer active in Red Hat Bugzilla.

Bug 1550210

Summary: Unable to launch BM instances in OC after minor update.
Product: Red Hat OpenStack
Reporter: Alexander Chuzhoy <sasha>
Component: openstack-nova
Assignee: OSP DFG:Compute <osp-dfg-compute>
Status: CLOSED DUPLICATE
QA Contact: OSP DFG:Compute <osp-dfg-compute>
Severity: high
Docs Contact:
Priority: high
Version: 12.0 (Pike)
CC: berrange, bfournie, dasmith, eglynn, jhakimra, kchamart, sbauza, sferdjao, sgordon, srevivo, vromanso
Target Milestone: z2
Keywords: ZStream
Target Release: 12.0 (Pike)
Hardware: Unspecified
OS: Unspecified
Whiteboard:
Fixed In Version:
Doc Type: If docs needed, set a value
Doc Text:
Story Points: ---
Clone Of:
Environment:
Last Closed: 2018-02-28 20:46:56 UTC
Type: Bug
Regression: ---
Mount Type: ---
Documentation: ---
CRM:
Verified Versions:
Category: ---
oVirt Team: ---
RHEL 7.3 requirements from Atomic Host:
Cloudforms Team: ---
Target Upstream Version:
Embargoed:

Description Alexander Chuzhoy 2018-02-28 19:27:55 UTC
Unable to launch bare metal (BM) instances in the overcloud (OC) after a minor update.

Environment:
instack-undercloud-7.4.3-5.el7ost.noarch
openstack-tripleo-heat-templates-7.0.3-22.el7ost.noarch
openstack-puppet-modules-11.0.0-1.el7ost.noarch
puppet-nova-11.4.0-2.el7ost.noarch
openstack-nova-compute-16.0.2-9.el7ost.noarch
openstack-nova-conductor-16.0.2-9.el7ost.noarch
python-ironicclient-1.17.0-1.el7ost.noarch
openstack-ironic-common-9.1.2-3.el7ost.noarch
openstack-nova-placement-api-16.0.2-9.el7ost.noarch
python-ironic-inspector-client-2.1.0-1.el7ost.noarch
python-nova-16.0.2-9.el7ost.noarch
openstack-ironic-api-9.1.2-3.el7ost.noarch
openstack-nova-api-16.0.2-9.el7ost.noarch
openstack-ironic-conductor-9.1.2-3.el7ost.noarch
openstack-nova-scheduler-16.0.2-9.el7ost.noarch
openstack-ironic-inspector-6.0.0-3.el7ost.noarch
python-novaclient-9.1.1-1.el7ost.noarch
puppet-ironic-11.3.0-2.el7ost.noarch
openstack-nova-common-16.0.2-9.el7ost.noarch
python-ironic-lib-2.10.0-1.el7ost.noarch

Steps to reproduce:
1. Deploy the overcloud with Ironic using the GA version:
openstack overcloud deploy \
--templates /usr/share/openstack-tripleo-heat-templates \
--stack overcloud \
--libvirt-type kvm \
--ntp-server clock.redhat.com \
--environment-file /usr/share/openstack-tripleo-heat-templates/environments/services-docker/ironic.yaml \
-e /home/stack/virt/config_lvm.yaml \
-e /usr/share/openstack-tripleo-heat-templates/environments/network-isolation.yaml \
-e /home/stack/virt/network/network-environment.yaml \
-e /home/stack/virt/hostnames.yml \
-e /home/stack/virt/debug.yaml \
-e /home/stack/virt/ironic.yaml \
-e /home/stack/virt/nodes_data.yaml \
-e /home/stack/virt/docker-images.yaml

2. Perform a minor update to the z1 version.

3. Try to launch bare metal instances in the overcloud.
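
As a concrete sketch of step 3 (the flavor, image, and network names below are placeholders, not the values from this environment), the launch command has roughly this shape:

```shell
# Sketch only: compose the server-create call used to launch a bare metal
# instance in the overcloud. FLAVOR/IMAGE/NET are hypothetical placeholders.
FLAVOR=baremetal
IMAGE=bm-guest-image
NET=baremetal-net
CMD="openstack server create --flavor $FLAVOR --image $IMAGE --nic net-id=$NET instance1"
# Printed rather than executed here, since running it needs overcloudrc credentials.
echo "$CMD"
```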

Result:
(overcloud) [stack@undercloud-0 ~]$ nova list
+--------------------------------------+---------------------+--------+------------+-------------+---------------------------+
| ID                                   | Name                | Status | Task State | Power State | Networks                  |
+--------------------------------------+---------------------+--------+------------+-------------+---------------------------+
| 173a91d5-3fb4-4076-afc7-18f077793952 | instance1           | ERROR  | -          | NOSTATE     |                           |
| 283a7135-e1fe-4819-8404-546ac5637d58 | instance2           | ERROR  | -          | NOSTATE     |                           |
+--------------------------------------+---------------------+--------+------------+-------------+---------------------------+


(overcloud) [stack@undercloud-0 ~]$ openstack compute service list
+----+------------------+--------------------------+----------+---------+-------+----------------------------+
| ID | Binary           | Host                     | Zone     | Status  | State | Updated At                 |
+----+------------------+--------------------------+----------+---------+-------+----------------------------+
|  2 | nova-scheduler   | controller-1.localdomain | internal | enabled | up    | 2018-02-28T19:26:31.000000 |
|  5 | nova-consoleauth | controller-1.localdomain | internal | enabled | up    | 2018-02-28T19:26:28.000000 |
|  8 | nova-scheduler   | controller-2.localdomain | internal | enabled | up    | 2018-02-28T19:26:27.000000 |
| 11 | nova-consoleauth | controller-2.localdomain | internal | enabled | up    | 2018-02-28T19:26:23.000000 |
| 14 | nova-conductor   | controller-1.localdomain | internal | enabled | up    | 2018-02-28T19:26:32.000000 |
| 17 | nova-conductor   | controller-2.localdomain | internal | enabled | up    | 2018-02-28T19:26:29.000000 |
| 29 | nova-compute     | compute-1.localdomain    | nova     | enabled | up    | 2018-02-28T19:26:31.000000 |
| 32 | nova-compute     | compute-0.localdomain    | nova     | enabled | up    | 2018-02-28T19:26:32.000000 |
| 44 | nova-scheduler   | controller-0.localdomain | internal | enabled | up    | 2018-02-28T19:26:28.000000 |
| 47 | nova-consoleauth | controller-0.localdomain | internal | enabled | up    | 2018-02-28T19:26:25.000000 |
| 50 | nova-conductor   | controller-0.localdomain | internal | enabled | up    | 2018-02-28T19:26:31.000000 |
| 62 | nova-compute     | controller-2.localdomain | nova     | enabled | up    | 2018-02-28T19:26:28.000000 |
| 65 | nova-compute     | controller-1.localdomain | nova     | enabled | up    | 2018-02-28T19:26:32.000000 |
| 68 | nova-compute     | controller-0.localdomain | nova     | enabled | up    | 2018-02-28T19:26:25.000000 |
+----+------------------+--------------------------+----------+---------+-------+----------------------------+



Note: Ironic node cleaning succeeded.

2018-02-28 18:53:46.818 21 ERROR nova.conductor.manager [req-4103a281-d07d-42d2-a75f-74530f2a9f51 d5a1d5d367e04f56a367ab383d2b44e3 5338edeebb7d465a9d540e57bdd16737 - default default] Failed to schedule instances: NoValidHost_Remote: No valid host was found. There are not enough hosts available.
2018-02-28 18:53:46.818 21 ERROR nova.conductor.manager Traceback (most recent call last):
2018-02-28 18:53:46.818 21 ERROR nova.conductor.manager   File "/usr/lib/python2.7/site-packages/nova/conductor/manager.py", line 1036, in schedule_and_build_instances
2018-02-28 18:53:46.818 21 ERROR nova.conductor.manager     instance_uuids)
2018-02-28 18:53:46.818 21 ERROR nova.conductor.manager   File "/usr/lib/python2.7/site-packages/nova/conductor/manager.py", line 627, in _schedule_instances
2018-02-28 18:53:46.818 21 ERROR nova.conductor.manager     request_spec, instance_uuids)
2018-02-28 18:53:46.818 21 ERROR nova.conductor.manager   File "/usr/lib/python2.7/site-packages/nova/scheduler/utils.py", line 586, in wrapped
2018-02-28 18:53:46.818 21 ERROR nova.conductor.manager     return func(*args, **kwargs)
2018-02-28 18:53:46.818 21 ERROR nova.conductor.manager   File "/usr/lib/python2.7/site-packages/nova/scheduler/client/__init__.py", line 52, in select_destinations
2018-02-28 18:53:46.818 21 ERROR nova.conductor.manager     instance_uuids)
2018-02-28 18:53:46.818 21 ERROR nova.conductor.manager   File "/usr/lib/python2.7/site-packages/nova/scheduler/client/__init__.py", line 37, in __run_method
2018-02-28 18:53:46.818 21 ERROR nova.conductor.manager     return getattr(self.instance, __name)(*args, **kwargs)
2018-02-28 18:53:46.818 21 ERROR nova.conductor.manager   File "/usr/lib/python2.7/site-packages/nova/scheduler/client/query.py", line 33, in select_destinations
2018-02-28 18:53:46.818 21 ERROR nova.conductor.manager     instance_uuids)
2018-02-28 18:53:46.818 21 ERROR nova.conductor.manager   File "/usr/lib/python2.7/site-packages/nova/scheduler/rpcapi.py", line 137, in select_destinations
2018-02-28 18:53:46.818 21 ERROR nova.conductor.manager     return cctxt.call(ctxt, 'select_destinations', **msg_args)
2018-02-28 18:53:46.818 21 ERROR nova.conductor.manager   File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/client.py", line 169, in call
2018-02-28 18:53:46.818 21 ERROR nova.conductor.manager     retry=self.retry)
2018-02-28 18:53:46.818 21 ERROR nova.conductor.manager   File "/usr/lib/python2.7/site-packages/oslo_messaging/transport.py", line 123, in _send
2018-02-28 18:53:46.818 21 ERROR nova.conductor.manager     timeout=timeout, retry=retry)
2018-02-28 18:53:46.818 21 ERROR nova.conductor.manager   File "/usr/lib/python2.7/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 578, in send
2018-02-28 18:53:46.818 21 ERROR nova.conductor.manager     retry=retry)
2018-02-28 18:53:46.818 21 ERROR nova.conductor.manager   File "/usr/lib/python2.7/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 569, in _send
2018-02-28 18:53:46.818 21 ERROR nova.conductor.manager     raise result
2018-02-28 18:53:46.818 21 ERROR nova.conductor.manager NoValidHost_Remote: No valid host was found. There are not enough hosts available.
2018-02-28 18:53:46.818 21 ERROR nova.conductor.manager Traceback (most recent call last):
2018-02-28 18:53:46.818 21 ERROR nova.conductor.manager 
2018-02-28 18:53:46.818 21 ERROR nova.conductor.manager   File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/server.py", line 232, in inner
2018-02-28 18:53:46.818 21 ERROR nova.conductor.manager     return func(*args, **kwargs)
2018-02-28 18:53:46.818 21 ERROR nova.conductor.manager 
2018-02-28 18:53:46.818 21 ERROR nova.conductor.manager   File "/usr/lib/python2.7/site-packages/nova/scheduler/manager.py", line 149, in select_destinations
2018-02-28 18:53:46.818 21 ERROR nova.conductor.manager     alloc_reqs_by_rp_uuid, provider_summaries)
2018-02-28 18:53:46.818 21 ERROR nova.conductor.manager 
2018-02-28 18:53:46.818 21 ERROR nova.conductor.manager   File "/usr/lib/python2.7/site-packages/nova/scheduler/filter_scheduler.py", line 110, in select_destinations
2018-02-28 18:53:46.818 21 ERROR nova.conductor.manager     raise exception.NoValidHost(reason=reason)
2018-02-28 18:53:46.818 21 ERROR nova.conductor.manager 
2018-02-28 18:53:46.818 21 ERROR nova.conductor.manager NoValidHost: No valid host was found. There are not enough hosts available.
2018-02-28 18:53:46.818 21 ERROR nova.conductor.manager 
2018-02-28 18:53:46.818 21 ERROR nova.conductor.manager
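
When NoValidHost is raised only for ironic-backed instances after an update, a common first triage pass (a sketch; these commands assume the baremetal CLI plugin is installed and overcloudrc is sourced) is to confirm the nodes are still schedulable:

```shell
# Triage checklist for NoValidHost on bare metal instances. The commands are
# listed (not executed) because they require a live overcloud to run against.
TRIAGE=$(cat <<'EOF'
openstack baremetal node list        # nodes should be "available" and not in maintenance
openstack hypervisor list            # each ironic node should appear as a hypervisor
openstack hypervisor stats show      # non-zero vcpus/memory means resources are exposed
EOF
)
echo "$TRIAGE"
```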

Comment 2 Alexander Chuzhoy 2018-02-28 20:46:56 UTC

*** This bug has been marked as a duplicate of bug 1522872 ***