Description of problem:

OSPd14 fails during overcloud deployment with a "no valid hosts found" error (nova-conductor.log); the overcloud VMs end up in ERROR state.

nova-placement-api.log on the undercloud reports:

INFO nova.api.openstack.placement.requestlog [req-8832ccd2-1eca-48c7-86d3-eafd9a53a56b f33da22f865c4bc9b83aec61391bd9ce d004eb2f3b3b4c88b88daf108f76cef5 - default default] 192.168.24.1 "GET /placement/resource_providers?in_tree=28ad31d1-4317-4c61-83db-cbe5683e12c8" status: 500 len: 280 microversion: 1.14
DEBUG nova.api.openstack.placement.requestlog [req-1d7a5d48-cc04-49c0-bd9a-92c1d4b99223 f33da22f865c4bc9b83aec61391bd9ce d004eb2f3b3b4c88b88daf108f76cef5 - default default] Starting request: 192.168.24.1 "GET /placement/resource_providers?in_tree=f7ef599c-ecfe-4fc0-aee8-87e6126257a4" __call__ /usr/lib/python2.7/site-packages/nova/api/openstack/placement/requestlog.py:38
ERROR nova.api.openstack.placement.fault_wrap [req-1d7a5d48-cc04-49c0-bd9a-92c1d4b99223 f33da22f865c4bc9b83aec61391bd9ce d004eb2f3b3b4c88b88daf108f76cef5 - default default] Placement API unexpected error: 'MIMEAccept' object has no attribute 'acceptable_offers': AttributeError: 'MIMEAccept' object has no attribute 'acceptable_offers'
ERROR nova.api.openstack.placement.fault_wrap Traceback (most recent call last):
ERROR nova.api.openstack.placement.fault_wrap   File "/usr/lib/python2.7/site-packages/nova/api/openstack/placement/fault_wrap.py", line 40, in __call__
ERROR nova.api.openstack.placement.fault_wrap     return self.application(environ, start_response)
ERROR nova.api.openstack.placement.fault_wrap   File "/usr/lib/python2.7/site-packages/webob/dec.py", line 131, in __call__
ERROR nova.api.openstack.placement.fault_wrap     resp = self.call_func(req, *args, **self.kwargs)
ERROR nova.api.openstack.placement.fault_wrap   File "/usr/lib/python2.7/site-packages/webob/dec.py", line 196, in call_func
ERROR nova.api.openstack.placement.fault_wrap     return self.func(req, *args, **kwargs)
ERROR nova.api.openstack.placement.fault_wrap   File "/usr/lib/python2.7/site-packages/microversion_parse/middleware.py", line 80, in __call__
ERROR nova.api.openstack.placement.fault_wrap     response = req.get_response(self.application)
ERROR nova.api.openstack.placement.fault_wrap   File "/usr/lib/python2.7/site-packages/webob/request.py", line 1316, in send
ERROR nova.api.openstack.placement.fault_wrap     application, catch_exc_info=False)
ERROR nova.api.openstack.placement.fault_wrap   File "/usr/lib/python2.7/site-packages/webob/request.py", line 1280, in call_application
ERROR nova.api.openstack.placement.fault_wrap     app_iter = application(self.environ, start_response)
ERROR nova.api.openstack.placement.fault_wrap   File "/usr/lib/python2.7/site-packages/nova/api/openstack/placement/handler.py", line 209, in __call__
ERROR nova.api.openstack.placement.fault_wrap     return dispatch(environ, start_response, self._map)
ERROR nova.api.openstack.placement.fault_wrap   File "/usr/lib/python2.7/site-packages/nova/api/openstack/placement/handler.py", line 146, in dispatch
ERROR nova.api.openstack.placement.fault_wrap     return handler(environ, start_response)
ERROR nova.api.openstack.placement.fault_wrap   File "/usr/lib/python2.7/site-packages/webob/dec.py", line 131, in __call__
ERROR nova.api.openstack.placement.fault_wrap     resp = self.call_func(req, *args, **self.kwargs)
ERROR nova.api.openstack.placement.fault_wrap   File "/usr/lib/python2.7/site-packages/nova/api/openstack/placement/wsgi_wrapper.py", line 29, in call_func
ERROR nova.api.openstack.placement.fault_wrap     super(PlacementWsgify, self).call_func(req, *args, **kwargs)
ERROR nova.api.openstack.placement.fault_wrap   File "/usr/lib/python2.7/site-packages/webob/dec.py", line 196, in call_func
ERROR nova.api.openstack.placement.fault_wrap     return self.func(req, *args, **kwargs)
ERROR nova.api.openstack.placement.fault_wrap   File "/usr/lib/python2.7/site-packages/nova/api/openstack/placement/util.py", line 73, in decorated_function
ERROR nova.api.openstack.placement.fault_wrap     best_matches = req.accept.acceptable_offers(types)
ERROR nova.api.openstack.placement.fault_wrap AttributeError: 'MIMEAccept' object has no attribute 'acceptable_offers'

nova-compute.log on the undercloud:

DEBUG nova.compute.resource_tracker [req-823b8c68-bab7-459d-8ad8-88900060e8a6 - - - - -] Auditing locally available compute resources for undercloud-0.redhat.local (node: c40592fd-6b81-4279-8496-8a3c5da28f52) update_available_resource /usr/lib/python2.7/site-packages/nova/compute/resource_tracker.py:669
4269 DEBUG nova.virt.ironic.driver [req-823b8c68-bab7-459d-8ad8-88900060e8a6 - - - - -] Using cache for node c40592fd-6b81-4279-8496-8a3c5da28f52, age: 0.0220968723297 _node_from_cache /usr/lib/python2.7/site-packages/nova/virt/ironic/driver.py:837
4269 DEBUG nova.compute.resource_tracker [req-823b8c68-bab7-459d-8ad8-88900060e8a6 - - - - -] Hypervisor/Node resource view: name=c40592fd-6b81-4279-8496-8a3c5da28f52 free_ram=32768MB free_disk=39GB free_vcpus=8 pci_devices=None _report_hypervisor_resource_view /usr/lib/python2.7/site-packages/nova/compute/resource_tracker.py:808
4269 DEBUG oslo_concurrency.lockutils [req-823b8c68-bab7-459d-8ad8-88900060e8a6 - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker._update_available_resource" :: waited 0.000s inner /usr/lib/python2.7/site-packages/oslo_concurrency/lockutils.py:273
4269 ERROR nova.scheduler.client.report [req-823b8c68-bab7-459d-8ad8-88900060e8a6 - - - - -] [req-c42355d3-3143-438a-be70-f23193277d53] Failed to retrieve resource provider tree from placement API for UUID f7ef599c-ecfe-4fc0-aee8-87e6126257a4. Got 500: {"errors": [{"status": 500, "request_id": "req-c42355d3-3143-438a-be70-f23193277d53", "detail": "The server has either erred or is incapable of performing the requested operation.\n\n 'MIMEAccept' object has no attribute 'acceptable_offers' ", "title": "Internal Server Error"}]}.
4269 DEBUG oslo_concurrency.lockutils [req-823b8c68-bab7-459d-8ad8-88900060e8a6 - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker._update_available_resource" :: held 0.007s inner /usr/lib/python2.7/site-packages/oslo_concurrency/lockutils.py:285
4269 ERROR nova.compute.manager [req-823b8c68-bab7-459d-8ad8-88900060e8a6 - - - - -] Error updating resources for node c40592fd-6b81-4279-8496-8a3c5da28f52.: ResourceProviderRetrievalFailed: Failed to get resource provider with UUID f7ef599c-ecfe-4fc0-aee8-87e6126257a4
4269 ERROR nova.compute.manager Traceback (most recent call last):
4269 ERROR nova.compute.manager   File "/usr/lib/python2.7/site-packages/nova/compute/manager.py", line 7457, in _update_available_resource_for_node
4269 ERROR nova.compute.manager     rt.update_available_resource(context, nodename)
4269 ERROR nova.compute.manager   File "/usr/lib/python2.7/site-packages/nova/compute/resource_tracker.py", line 686, in update_available_resource
4269 ERROR nova.compute.manager     self._update_available_resource(context, resources)
4269 ERROR nova.compute.manager   File "/usr/lib/python2.7/site-packages/oslo_concurrency/lockutils.py", line 274, in inner
4269 ERROR nova.compute.manager     return f(*args, **kwargs)
4269 ERROR nova.compute.manager   File "/usr/lib/python2.7/site-packages/nova/compute/resource_tracker.py", line 710, in _update_available_resource
4269 ERROR nova.compute.manager     self._init_compute_node(context, resources)
4269 ERROR nova.compute.manager   File "/usr/lib/python2.7/site-packages/nova/compute/resource_tracker.py", line 561, in _init_compute_node
4269 ERROR nova.compute.manager     self._update(context, cn)
4269 ERROR nova.compute.manager   File "/usr/lib/python2.7/site-packages/nova/compute/resource_tracker.py", line 883, in _update
4269 ERROR nova.compute.manager     context, compute_node.uuid, name=compute_node.hypervisor_hostname)
4269 ERROR nova.compute.manager   File "/usr/lib/python2.7/site-packages/nova/scheduler/client/__init__.py", line 37, in __run_method
4269 ERROR nova.compute.manager     return getattr(self.instance, __name)(*args, **kwargs)
4269 ERROR nova.compute.manager   File "/usr/lib/python2.7/site-packages/nova/scheduler/client/report.py", line 990, in get_provider_tree_and_ensure_root
4269 ERROR nova.compute.manager     parent_provider_uuid=parent_provider_uuid)
4269 ERROR nova.compute.manager   File "/usr/lib/python2.7/site-packages/nova/scheduler/client/report.py", line 653, in _ensure_resource_provider
4269 ERROR nova.compute.manager     rps_to_refresh = self._get_providers_in_tree(context, uuid)
4269 ERROR nova.compute.manager   File "/usr/lib/python2.7/site-packages/nova/scheduler/client/report.py", line 66, in wrapper
4269 ERROR nova.compute.manager     return f(self, *a, **k)
4269 ERROR nova.compute.manager   File "/usr/lib/python2.7/site-packages/nova/scheduler/client/report.py", line 520, in _get_providers_in_tree
4269 ERROR nova.compute.manager     raise exception.ResourceProviderRetrievalFailed(uuid=uuid)
4269 ERROR nova.compute.manager ResourceProviderRetrievalFailed: Failed to get resource provider with UUID f7ef599c-ecfe-4fc0-aee8-87e6126257a4

How reproducible: always

Steps to Reproduce:
1. Deploy OSPd14 1:1:1:1 using InfraRed, puddle 2018-06-28.1

Additional info:
openstack-nova-common-18.0.0-0.20180625215857.9a8a98b.el7ost.noarch
python-novaclient-9.1.1-1.el7ost.noarch
openstack-nova-api-18.0.0-0.20180625215857.9a8a98b.el7ost.noarch
openstack-nova-conductor-18.0.0-0.20180625215857.9a8a98b.el7ost.noarch
puppet-nova-13.1.1-0.20180625065737.29d307b.el7ost.noarch
python-nova-18.0.0-0.20180625215857.9a8a98b.el7ost.noarch
openstack-nova-scheduler-18.0.0-0.20180625215857.9a8a98b.el7ost.noarch
openstack-nova-placement-api-18.0.0-0.20180625215857.9a8a98b.el7ost.noarch
openstack-nova-compute-18.0.0-0.20180625215857.9a8a98b.el7ost.noarch
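The placement traceback pinpoints the failure: placement's content negotiation calls req.accept.acceptable_offers(types), a method WebOb only gained in 1.8, while on WebOb 1.7.x the parsed Accept header is a MIMEAccept object that lacks it. A minimal stand-alone sketch of the failure mode (the MIMEAccept class below is only a stand-in for webob.acceptparse.MIMEAccept, not the real implementation):

```python
# Stand-in mimicking WebOb 1.7.x MIMEAccept: it offers best_match()
# but has no acceptable_offers() method.
class MIMEAccept(object):
    def __init__(self, header):
        self.header = header

    def best_match(self, offers, default_match=None):
        # Simplified: real WebOb matches against the parsed header.
        return offers[0] if offers else default_match


accept = MIMEAccept('application/json')
try:
    # This is what placement's util.py does on every request;
    # on WebOb < 1.8 it blows up exactly as in the log above.
    accept.acceptable_offers(['application/json', 'text/html'])
except AttributeError as exc:
    print(exc)  # 'MIMEAccept' object has no attribute 'acceptable_offers'
```

Because this runs inside the fault_wrap middleware, every placement request returns HTTP 500, which is why the resource tracker cannot retrieve any resource provider.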
Created attachment 1455363 [details] nova-compute.log

Created attachment 1455365 [details] nova-placement-api.log
This looks related to upstream change https://review.openstack.org/#/c/575127/, made as part of https://bugs.launchpad.net/nova/+bug/1773225 .
At first glance this looks like a WebOb package version issue: the upstream gate runs with WebOb==1.8.2, per https://github.com/openstack/requirements/blob/d5a3c58f7195517a6083032e41b702c2a0aca431/upper-constraints.txt#L16
The upstream lower constraint is still WebOb==1.7.1, so this could be an upstream bug: either the lower constraint should be raised, or placement should be made more conservative by checking which WebOb API is available.
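One way to make placement "more conservative" in the sense above is to probe for the new API and fall back to the old one. This is only an illustrative sketch, not the actual upstream patch; the two stand-in classes mimic the 1.7 and 1.8 WebOb generations for demonstration:

```python
def best_accept_match(accept, offers):
    """Pick the best offered media type, tolerating both WebOb APIs:
    1.8 added acceptable_offers(), 1.7's MIMEAccept only has best_match().
    """
    if hasattr(accept, 'acceptable_offers'):  # WebOb >= 1.8
        matches = accept.acceptable_offers(offers)
        return matches[0][0] if matches else None
    return accept.best_match(offers)          # WebOb < 1.8


# Stand-ins for the two WebOb generations, for illustration only.
class OldAccept(object):        # mimics 1.7.x MIMEAccept
    def best_match(self, offers, default_match=None):
        return offers[0] if offers else default_match


class NewAccept(object):        # mimics 1.8.x Accept classes
    def acceptable_offers(self, offers):
        # Real WebOb returns a list of (offer, qvalue) tuples,
        # best first; here every offer is accepted at q=1.0.
        return [(o, 1.0) for o in offers]


offers = ['application/json', 'text/html']
print(best_accept_match(OldAccept(), offers))   # application/json
print(best_accept_match(NewAccept(), offers))   # application/json
```

The hasattr() probe keeps a single code path working across both library versions, at the cost of exercising a deprecated API on old WebOb.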
An upstream Nova change is under review to allow WebOb 1.7 to work with placement.
Yes, I can confirm we are not hitting this bug in CI with python2-webob-1.8.1-1.el7ost present on the undercloud.
Since the problem described in this bug report should be resolved in a recent advisory, it has been closed with a resolution of ERRATA. For information on the advisory, and where to find the updated files, follow the link below. If the solution does not work for you, open a new bug report. https://access.redhat.com/errata/RHEA-2019:0045