Description of problem:

Neutron started to return 500 errors for simple subnet queries. From the server log:

2022-03-18 13:10:00.593 23 ERROR oslo_middleware.catch_errors [-] An error occurred during processing the request: GET /v2.0/subnets?name=ostest-kkmms-nodes&tags=openshiftClusterID%3Dostest-kkmms HTTP/1.0
Accept: application/json
Accept-Encoding: gzip, deflate
Content-Type: text/plain
Host: 172.18.0.82:9696
User-Agent: kuryr-k8s-controller keystoneauth1/4.4.0 python-requests/2.20.0 CPython/3.6.8
X-Auth-Token: *****
X-Forwarded-Port: 9696
X-Forwarded-Proto: http: oslo_cache.exception.QueueEmpty: Unable to get a connection from pool id 140291074137672 after 10 seconds.
2022-03-18 13:10:00.593 23 ERROR oslo_middleware.catch_errors Traceback (most recent call last):
2022-03-18 13:10:00.593 23 ERROR oslo_middleware.catch_errors   File "/usr/lib/python3.6/site-packages/oslo_cache/_memcache_pool.py", line 137, in acquire
2022-03-18 13:10:00.593 23 ERROR oslo_middleware.catch_errors     conn = self.get(timeout=self._connection_get_timeout)
2022-03-18 13:10:00.593 23 ERROR oslo_middleware.catch_errors   File "/usr/lib/python3.6/site-packages/eventlet/queue.py", line 322, in get
2022-03-18 13:10:00.593 23 ERROR oslo_middleware.catch_errors     return waiter.wait()
2022-03-18 13:10:00.593 23 ERROR oslo_middleware.catch_errors   File "/usr/lib/python3.6/site-packages/eventlet/queue.py", line 141, in wait
2022-03-18 13:10:00.593 23 ERROR oslo_middleware.catch_errors     return get_hub().switch()
2022-03-18 13:10:00.593 23 ERROR oslo_middleware.catch_errors   File "/usr/lib/python3.6/site-packages/eventlet/hubs/hub.py", line 298, in switch
2022-03-18 13:10:00.593 23 ERROR oslo_middleware.catch_errors     return self.greenlet.switch()
2022-03-18 13:10:00.593 23 ERROR oslo_middleware.catch_errors queue.Empty
2022-03-18 13:10:00.593 23 ERROR oslo_middleware.catch_errors
2022-03-18 13:10:00.593 23 ERROR oslo_middleware.catch_errors During handling of the above exception, another exception occurred:
2022-03-18 13:10:00.593 23 ERROR oslo_middleware.catch_errors
2022-03-18 13:10:00.593 23 ERROR oslo_middleware.catch_errors Traceback (most recent call last):
2022-03-18 13:10:00.593 23 ERROR oslo_middleware.catch_errors   File "/usr/lib/python3.6/site-packages/oslo_middleware/catch_errors.py", line 40, in __call__
2022-03-18 13:10:00.593 23 ERROR oslo_middleware.catch_errors     response = req.get_response(self.application)
2022-03-18 13:10:00.593 23 ERROR oslo_middleware.catch_errors   File "/usr/lib/python3.6/site-packages/webob/request.py", line 1314, in send
2022-03-18 13:10:00.593 23 ERROR oslo_middleware.catch_errors     application, catch_exc_info=False)
2022-03-18 13:10:00.593 23 ERROR oslo_middleware.catch_errors   File "/usr/lib/python3.6/site-packages/webob/request.py", line 1278, in call_application
2022-03-18 13:10:00.593 23 ERROR oslo_middleware.catch_errors     app_iter = application(self.environ, start_response)
2022-03-18 13:10:00.593 23 ERROR oslo_middleware.catch_errors   File "/usr/lib/python3.6/site-packages/webob/dec.py", line 129, in __call__
2022-03-18 13:10:00.593 23 ERROR oslo_middleware.catch_errors     resp = self.call_func(req, *args, **kw)
2022-03-18 13:10:00.593 23 ERROR oslo_middleware.catch_errors   File "/usr/lib/python3.6/site-packages/webob/dec.py", line 193, in call_func
2022-03-18 13:10:00.593 23 ERROR oslo_middleware.catch_errors     return self.func(req, *args, **kwargs)
2022-03-18 13:10:00.593 23 ERROR oslo_middleware.catch_errors   File "/usr/lib/python3.6/site-packages/osprofiler/web.py", line 112, in __call__
2022-03-18 13:10:00.593 23 ERROR oslo_middleware.catch_errors     return request.get_response(self.application)
2022-03-18 13:10:00.593 23 ERROR oslo_middleware.catch_errors   File "/usr/lib/python3.6/site-packages/webob/request.py", line 1314, in send
2022-03-18 13:10:00.593 23 ERROR oslo_middleware.catch_errors     application, catch_exc_info=False)
2022-03-18 13:10:00.593 23 ERROR oslo_middleware.catch_errors   File "/usr/lib/python3.6/site-packages/webob/request.py", line 1278, in call_application
2022-03-18 13:10:00.593 23 ERROR oslo_middleware.catch_errors     app_iter = application(self.environ, start_response)
2022-03-18 13:10:00.593 23 ERROR oslo_middleware.catch_errors   File "/usr/lib/python3.6/site-packages/webob/dec.py", line 129, in __call__
2022-03-18 13:10:00.593 23 ERROR oslo_middleware.catch_errors     resp = self.call_func(req, *args, **kw)
2022-03-18 13:10:00.593 23 ERROR oslo_middleware.catch_errors   File "/usr/lib/python3.6/site-packages/webob/dec.py", line 193, in call_func
2022-03-18 13:10:00.593 23 ERROR oslo_middleware.catch_errors     return self.func(req, *args, **kwargs)
2022-03-18 13:10:00.593 23 ERROR oslo_middleware.catch_errors   File "/usr/lib/python3.6/site-packages/keystonemiddleware/auth_token/__init__.py", line 338, in __call__
2022-03-18 13:10:00.593 23 ERROR oslo_middleware.catch_errors     response = self.process_request(req)
2022-03-18 13:10:00.593 23 ERROR oslo_middleware.catch_errors   File "/usr/lib/python3.6/site-packages/keystonemiddleware/auth_token/__init__.py", line 659, in process_request
2022-03-18 13:10:00.593 23 ERROR oslo_middleware.catch_errors     resp = super(AuthProtocol, self).process_request(request)
2022-03-18 13:10:00.593 23 ERROR oslo_middleware.catch_errors   File "/usr/lib/python3.6/site-packages/keystonemiddleware/auth_token/__init__.py", line 411, in process_request
2022-03-18 13:10:00.593 23 ERROR oslo_middleware.catch_errors     allow_expired=allow_expired)
2022-03-18 13:10:00.593 23 ERROR oslo_middleware.catch_errors   File "/usr/lib/python3.6/site-packages/keystonemiddleware/auth_token/__init__.py", line 445, in _do_fetch_token
2022-03-18 13:10:00.593 23 ERROR oslo_middleware.catch_errors     data = self.fetch_token(token, **kwargs)
2022-03-18 13:10:00.593 23 ERROR oslo_middleware.catch_errors   File "/usr/lib/python3.6/site-packages/keystonemiddleware/auth_token/__init__.py", line 736, in fetch_token
2022-03-18 13:10:00.593 23 ERROR oslo_middleware.catch_errors     cached = self._token_cache.get(token)
2022-03-18 13:10:00.593 23 ERROR oslo_middleware.catch_errors   File "/usr/lib/python3.6/site-packages/keystonemiddleware/auth_token/_cache.py", line 226, in get
2022-03-18 13:10:00.593 23 ERROR oslo_middleware.catch_errors     with self._cache_pool.reserve() as cache:
2022-03-18 13:10:00.593 23 ERROR oslo_middleware.catch_errors   File "/usr/lib64/python3.6/contextlib.py", line 81, in __enter__
2022-03-18 13:10:00.593 23 ERROR oslo_middleware.catch_errors     return next(self.gen)
2022-03-18 13:10:00.593 23 ERROR oslo_middleware.catch_errors   File "/usr/lib/python3.6/site-packages/keystonemiddleware/auth_token/_cache.py", line 104, in reserve
2022-03-18 13:10:00.593 23 ERROR oslo_middleware.catch_errors     with self._pool.acquire() as client:
2022-03-18 13:10:00.593 23 ERROR oslo_middleware.catch_errors   File "/usr/lib64/python3.6/contextlib.py", line 81, in __enter__
2022-03-18 13:10:00.593 23 ERROR oslo_middleware.catch_errors     return next(self.gen)
2022-03-18 13:10:00.593 23 ERROR oslo_middleware.catch_errors   File "/usr/lib/python3.6/site-packages/oslo_cache/_memcache_pool.py", line 142, in acquire
2022-03-18 13:10:00.593 23 ERROR oslo_middleware.catch_errors     {'id': id(self), 'seconds': self._connection_get_timeout})
2022-03-18 13:10:00.593 23 ERROR oslo_middleware.catch_errors oslo_cache.exception.QueueEmpty: Unable to get a connection from pool id 140291074137672 after 10 seconds.
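The call chain bottoms out in oslo.cache's memcached connection pool: keystonemiddleware reserves a client from a fixed-size pool, and acquire() waits on a queue for up to the configured timeout (10 seconds here) before giving up with QueueEmpty. A minimal sketch of that blocking-pool pattern, using the stdlib queue.Queue instead of eventlet and a stand-in QueueEmpty class; the names and structure are illustrative, not oslo.cache's actual implementation:

```python
import contextlib
import queue


class QueueEmpty(Exception):
    """Stand-in for oslo_cache.exception.QueueEmpty (illustrative only)."""


class ConnectionPool:
    """Sketch of the blocking-pool pattern seen in the traceback: a fixed
    number of connections sits in a queue; acquire() waits up to a timeout
    and fails loudly instead of blocking the request forever."""

    def __init__(self, size, conn_get_timeout):
        self._connection_get_timeout = conn_get_timeout
        self._pool = queue.Queue()
        for _ in range(size):
            self._pool.put(object())  # placeholder for a memcached client

    @contextlib.contextmanager
    def acquire(self):
        try:
            # Blocks until a connection is free or the timeout elapses.
            conn = self._pool.get(timeout=self._connection_get_timeout)
        except queue.Empty:
            raise QueueEmpty(
                'Unable to get a connection from pool id %d after %s seconds.'
                % (id(self), self._connection_get_timeout))
        try:
            yield conn
        finally:
            # Return the connection so other requests can use it.
            self._pool.put(conn)
```

Under load, every API request that needs token validation competes for these pool slots, so a pool that is undersized for the concurrency (or whose connections are not returned promptly) surfaces as 500s on otherwise unrelated calls, as seen here.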
2022-03-18 13:10:00.593 23 ERROR oslo_middleware.catch_errors
2022-03-18 13:10:00.594 23 INFO neutron.wsgi [-] 172.17.1.10 "GET /v2.0/subnets?name=ostest-kkmms-nodes&tags=openshiftClusterID%3Dostest-kkmms HTTP/1.1" status: 500 len: 380 time: 10.6359427
2022-03-18 13:10:00.595 25 DEBUG neutron.wsgi [-] (25) accepted ('172.17.1.67', 35364) server /usr/lib/python3.6/site-packages/eventlet/wsgi.py:985
2022-03-18 13:10:00.595 23 ERROR oslo_middleware.catch_errors [-] An error occurred during processing the request: GET /v2.0/subnets?name=ostest-kkmms-nodes&tags=openshiftClusterID%3Dostest-kkmms HTTP/1.0
Accept: application/json

Version-Release number of selected component (if applicable):
16.2_20220304.1

How reproducible:
Not sure

Steps to Reproduce:
1. We had the PerfScale team running cloud density tests on OpenShift with Kuryr deployed on top of OpenStack.

Actual results:
Kuryr is unable to make these queries to get information about the subnet.

Expected results:
Kuryr is able to make these queries and proceed.

Additional info:
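For reference, the failing request is an ordinary Neutron subnet list filtered by name and tag. A small sketch that builds the same URL the log shows, using only the stdlib; the endpoint and names are taken from the log above, and the helper is illustrative, not kuryr's actual code:

```python
from urllib.parse import urlencode

# Endpoint taken from the Host header in the log above.
NEUTRON_URL = 'http://172.18.0.82:9696'


def subnet_query_url(base_url, name, cluster_id):
    """Build the subnet-list URL kuryr-k8s-controller issues, per the log:
    GET /v2.0/subnets?name=...&tags=openshiftClusterID%3D..."""
    params = urlencode({
        'name': name,
        'tags': 'openshiftClusterID=%s' % cluster_id,  # '=' encodes as %3D
    })
    return '%s/v2.0/subnets?%s' % (base_url, params)
```

Nothing about this request is unusual; it only fails because keystonemiddleware cannot validate the token without a memcached connection.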
Since the problem described in this bug report should be resolved in a recent advisory, it has been closed with a resolution of ERRATA. For information on the advisory (Red Hat OpenStack Platform 16.2.5 (Train) bug fix and enhancement advisory), and where to find the updated files, follow the link below. If the solution does not work for you, open a new bug report. https://access.redhat.com/errata/RHBA-2023:1763