Description of problem:

While attempting to delete a Load Balancer, its provisioning status moves to PENDING_DELETE and stays there, so the deletion never finalizes. The following tracebacks were found in the logs for that specific load balancer (a minimal illustration of the underlying failure mode is sketched under "Additional info" below):

2021-07-17 13:49:26.131 19 INFO octavia.api.v2.controllers.load_balancer [req-b8b3cbd8-3014-4c45-9680-d4c67346ed1c - 1e38d4dfbfb7427787725df69fabc22b - default default] Sending delete Load Balancer 19d8e465-c704-40a9-b1fd-5b0824408e5d to provider ovn
2021-07-17 13:49:26.139 19 DEBUG ovn_octavia_provider.helper [-] Handling request lb_delete with info {'id': '19d8e465-c704-40a9-b1fd-5b0824408e5d', 'cascade': True} request_handler /usr/lib/python3.6/site-packages/ovn_octavia_provider/helper.py:303
2021-07-17 13:49:26.196 19 ERROR ovn_octavia_provider.helper [-] Exception occurred during deletion of loadbalancer: RuntimeError: dictionary changed size during iteration
2021-07-17 13:49:26.196 19 ERROR ovn_octavia_provider.helper Traceback (most recent call last):
2021-07-17 13:49:26.196 19 ERROR ovn_octavia_provider.helper   File "/usr/lib/python3.6/site-packages/ovn_octavia_provider/helper.py", line 907, in lb_delete
2021-07-17 13:49:26.196 19 ERROR ovn_octavia_provider.helper     status = self._lb_delete(loadbalancer, ovn_lb, status)
2021-07-17 13:49:26.196 19 ERROR ovn_octavia_provider.helper   File "/usr/lib/python3.6/site-packages/ovn_octavia_provider/helper.py", line 960, in _lb_delete
2021-07-17 13:49:26.196 19 ERROR ovn_octavia_provider.helper     for ls in self._find_lb_in_table(ovn_lb, 'Logical_Switch'):
2021-07-17 13:49:26.196 19 ERROR ovn_octavia_provider.helper   File "/usr/lib/python3.6/site-packages/ovn_octavia_provider/helper.py", line 289, in _find_lb_in_table
2021-07-17 13:49:26.196 19 ERROR ovn_octavia_provider.helper     return [item for item in self.ovn_nbdb_api.tables[table].rows.values()
2021-07-17 13:49:26.196 19 ERROR ovn_octavia_provider.helper   File "/usr/lib/python3.6/site-packages/ovn_octavia_provider/helper.py", line 289, in <listcomp>
2021-07-17 13:49:26.196 19 ERROR ovn_octavia_provider.helper     return [item for item in self.ovn_nbdb_api.tables[table].rows.values()
2021-07-17 13:49:26.196 19 ERROR ovn_octavia_provider.helper   File "/usr/lib64/python3.6/_collections_abc.py", line 761, in __iter__
2021-07-17 13:49:26.196 19 ERROR ovn_octavia_provider.helper     for key in self._mapping:
2021-07-17 13:49:26.196 19 ERROR ovn_octavia_provider.helper RuntimeError: dictionary changed size during iteration
2021-07-17 13:49:26.196 19 ERROR ovn_octavia_provider.helper
2021-07-17 13:49:26.446 13 DEBUG octavia.common.keystone [req-267feb7e-2235-43d9-bec8-88ff532b9019 - 1e38d4dfbfb7427787725df69fabc22b - default default] Request path is / and it does not require keystone authentication process_request /usr/lib/python3.6/site-packages/octavia/common/keystone.py:77
2021-07-17 13:49:26.554 19 DEBUG ovn_octavia_provider.helper [-] Updating status to octavia: {'loadbalancers': [{'id': '19d8e465-c704-40a9-b1fd-5b0824408e5d', 'provisioning_status': 'ERROR', 'operating_status': 'ERROR'}], 'listeners': [{'id': '0806594a-4ed7-4889-81fa-6fd8d02b0d80', 'provisioning_status': 'DELETED', 'operating_status': 'OFFLINE'}], 'pools': [{'id': 'b8a98db0-6d2e-4745-b533-d2eb3548d1b9', 'provisioning_status': 'DELETED'}], 'members': [{'id': '08464181-728b-425a-b690-d3eb656f7e0a', 'provisioning_status': 'DELETED'}]} _update_status_to_octavia /usr/lib/python3.6/site-packages/ovn_octavia_provider/helper.py:32
2021-07-17 13:50:14.860 13 DEBUG ovn_octavia_provider.helper [-] Handling request lb_delete with info {'id': '19d8e465-c704-40a9-b1fd-5b0824408e5d', 'cascade': True} request_handler /usr/lib/python3.6/site-packages/ovn_octavia_provider/helper.py:303
2021-07-17 13:50:14.862 13 ERROR ovsdbapp.backend.ovs_idl.command [-] Error executing command (LsGetCommand): ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Logical_Switch with name=neutron-4f5f73bd-3efc-4bbe-a16b-850408be7654
2021-07-17 13:50:14.862 13 ERROR ovsdbapp.backend.ovs_idl.command Traceback (most recent call last):
2021-07-17 13:50:14.862 13 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.6/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 39, in execute
2021-07-17 13:50:14.862 13 ERROR ovsdbapp.backend.ovs_idl.command     self.run_idl(None)
2021-07-17 13:50:14.862 13 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.6/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 329, in run_idl
2021-07-17 13:50:14.862 13 ERROR ovsdbapp.backend.ovs_idl.command     self.result = self.api.lookup(self.table, self.record)
2021-07-17 13:50:14.862 13 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.6/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 208, in lookup
2021-07-17 13:50:14.862 13 ERROR ovsdbapp.backend.ovs_idl.command     return self._lookup(table, record)
2021-07-17 13:50:14.862 13 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.6/site-packages/ovsdbapp/backend/ovs_idl/__init__.py", line 268, in _lookup
2021-07-17 13:50:14.862 13 ERROR ovsdbapp.backend.ovs_idl.command     row = idlutils.row_by_value(self, rl.table, rl.column, record)
2021-07-17 13:50:14.862 13 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.6/site-packages/ovsdbapp/backend/ovs_idl/idlutils.py", line 114, in row_by_value
2021-07-17 13:50:14.862 13 ERROR ovsdbapp.backend.ovs_idl.command     raise RowNotFound(table=table, col=column, match=match)
2021-07-17 13:50:14.862 13 ERROR ovsdbapp.backend.ovs_idl.command ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Logical_Switch with name=neutron-4f5f73bd-3efc-4bbe-a16b-850408be7654
2021-07-17 13:50:14.862 13 ERROR ovsdbapp.backend.ovs_idl.command
2021-07-17 13:50:14.862 13 WARNING ovn_octavia_provider.helper [-] LogicalSwitch neutron-4f5f73bd-3efc-4bbe-a16b-850408be7654 could not be found. Cannot delete Load Balancer from it: ovsdbapp.backend.ovs_idl.idlutils.RowNotFound: Cannot find Logical_Switch with name=neutron-4f5f73bd-3efc-4bbe-a16b-850408be7654

The Load Balancer then remains stuck in PENDING_DELETE:

[stack@mecha-central ~]$ openstack loadbalancer list |grep 19d8e465-c704-40a9-b1fd-5b0824408e5d
/usr/lib/python3.6/site-packages/oslo_utils/fnmatch.py:23: DeprecationWarning: Using the oslo.utils's 'fnmatch' module is deprecated, please use the stdlib 'fnmatch' module.
  "Using the oslo.utils's 'fnmatch' module is deprecated, "
| 19d8e465-c704-40a9-b1fd-5b0824408e5d | e2e-ephemeral-1239-6605/csi-hostpath-resizer | 1e38d4dfbfb7427787725df69fabc22b | 172.30.87.92 | PENDING_DELETE | ERROR | ovn |

Version-Release number of selected component (if applicable):

Using tripleo master.

[stack@mecha-central ~]$ sudo podman images |grep octavia
quay.io/tripleomaster/openstack-octavia-api             current-tripleo  4ffe82287b58  3 days ago  1.45 GB
quay.io/tripleomaster/openstack-octavia-health-manager  current-tripleo  7af589231899  3 days ago  1.3 GB
quay.io/tripleomaster/openstack-octavia-worker          current-tripleo  2c5bc4bdcb11  3 days ago  1.3 GB
quay.io/tripleomaster/openstack-octavia-housekeeping    current-tripleo  78efd3fe05c3  3 days ago  1.3 GB

How reproducible:

Steps to Reproduce:
1.
2.
3.

Actual results:

Expected results:

Additional info:
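For reference, the RuntimeError in the first traceback is the generic CPython error raised when a dict changes size while a view over it is being iterated; in the provider's case the ovn_nbdb_api tables are updated by the OVSDB IDL thread while _find_lb_in_table filters rows.values(). A minimal, standalone illustration of that failure mode (this is not the provider code itself):

# Minimal illustration only; it does not involve ovsdbapp. In the provider,
# the mutation comes from the OVSDB IDL thread applying a DB update while
# rows.values() is being iterated by the list comprehension.
rows = {'lb-a': 'row-a', 'lb-b': 'row-b'}

try:
    # Mutating the dict while a live view is being iterated raises the same
    # "dictionary changed size during iteration" RuntimeError.
    for key in rows:
        rows['lb-c'] = 'row-c'
except RuntimeError as exc:
    print(exc)  # dictionary changed size during iteration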
Thanks for the report, Maysa. It looks like there is a race condition in the way we're querying the DB. I created an upstream bug to track this, along with a potential fix. Will update based on testing and review.
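For illustration, a hedged sketch of the kind of mitigation that avoids this race: filter over a snapshot of the rows instead of the live view, and retry if the IDL thread still mutates the table mid-copy. The class and constructor are stand-ins, the membership check on item.load_balancer is an assumption based on the traceback, and this is not necessarily the patch that merged upstream.

# Sketch only: names beyond those visible in the traceback (ovn_nbdb_api,
# _find_lb_in_table) and the filtering predicate are assumptions.
import tenacity


class OvnProviderHelperSketch:

    def __init__(self, ovn_nbdb_api):
        self.ovn_nbdb_api = ovn_nbdb_api

    @tenacity.retry(
        retry=tenacity.retry_if_exception_type(RuntimeError),
        wait=tenacity.wait_fixed(0.1),
        stop=tenacity.stop_after_attempt(5),
        reraise=True)
    def _find_lb_in_table(self, lb, table):
        # Snapshot the rows first; if the IDL thread still changes the dict
        # size during the copy, the decorator retries instead of letting the
        # RuntimeError abort the whole lb_delete flow.
        rows = list(self.ovn_nbdb_api.tables[table].rows.values())
        return [item for item in rows if lb in item.load_balancer]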
Just for the record: to allow the cleanup of any Load Balancer stuck in a PENDING_* state with Kuryr, as an operator we had to interact with the database directly and move the Load Balancer to ERROR state, then let Kuryr handle the deletion and recreation. More info at https://access.redhat.com/solutions/4251821
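A rough sketch of that kind of database intervention, assuming the standard Octavia schema (load_balancer table, provisioning_status column); the connection URL is a placeholder and the linked solution article remains the authoritative procedure:

# Hedged sketch, not the documented Red Hat procedure: force a stuck load
# balancer out of PENDING_DELETE by setting provisioning_status to ERROR in
# the Octavia database. Take the real connection URL from octavia.conf
# ([database]/connection); table and column names assume the standard schema.
from sqlalchemy import create_engine, text

LB_ID = '19d8e465-c704-40a9-b1fd-5b0824408e5d'
DB_URL = 'mysql+pymysql://octavia:PASSWORD@controller/octavia'  # placeholder

engine = create_engine(DB_URL)
with engine.begin() as conn:
    conn.execute(
        text("UPDATE load_balancer SET provisioning_status = 'ERROR' "
             "WHERE id = :lb_id"),
        {'lb_id': LB_ID},
    )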
Since the problem described in this bug report should be resolved in a recent advisory, it has been closed with a resolution of ERRATA. For information on the advisory (Release of components for Red Hat OpenStack Platform 17.0 (Wallaby)), and where to find the updated files, follow the link below. If the solution does not work for you, open a new bug report. https://access.redhat.com/errata/RHEA-2022:6543