Bug 2045082

Summary: oslo_config.cfg.DuplicateOptError: duplicate option: host
Product: Red Hat OpenStack
Reporter: Pavel Sedlák <psedlak>
Component: python-networking-ovn
Assignee: Rodolfo Alonso <ralonsoh>
Status: CLOSED ERRATA
QA Contact: Eran Kuris <ekuris>
Severity: urgent
Docs Contact:
Priority: urgent
Version: 16.1 (Train)
CC: afazekas, apevec, bbonguar, froyo, gthiemon, ihrachys, itbrown, lhh, lpeer, ltomasbo, majopela, mgarciac, njohnston, oblaut, ralonsoh, rhos-maint, scohen, shrjoshi, spower
Target Milestone: z8
Keywords: AutomationBlocker, Triaged
Target Release: 16.1 (Train on RHEL 8.2)
Hardware: Unspecified
OS: Unspecified
Whiteboard:
Fixed In Version: python-networking-ovn-7.3.1-1.20220113183502.el8ost
Doc Type: No Doc Update
Doc Text:
Story Points: ---
Clone Of: 1972254
Environment:
Last Closed: 2022-03-24 11:03:21 UTC
Type: ---
Regression: ---
Mount Type: ---
Documentation: ---
CRM:
Verified Versions:
Category: ---
oVirt Team: ---
RHEL 7.3 requirements from Atomic Host:
Cloudforms Team: ---
Target Upstream Version:
Embargoed:
Bug Depends On: 1972254
Bug Blocks:

Description Pavel Sedlák 2022-01-25 15:09:39 UTC
+++ This bug was initially created as a clone of Bug #1972254 +++

Description of problem:

/controller-1/var/log/containers/octavia/octavia.log.gz
2021-06-15 10:39:46.022 12 ERROR octavia.api.drivers.driver_factory [-] Unable to load provider driver ovn due to: duplicate option: host: oslo_config.cfg.DuplicateOptError: duplicate option: host

Version-Release number of selected component (if applicable):
RHOS-16.2-RHEL-8-20210614.n.1
tag: 16.2_20210614.1

Actual results:
ERROR message in the Octavia log.
`openstack loadbalancer list` returns a 503 response:
http://10.0.0.111:9876 "GET /v2.0/lbaas/loadbalancers HTTP/1.1" 503 None

Expected results:

Octavia service responding.
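For context, oslo.config raises DuplicateOptError when an option with the same name is registered twice on the same ConfigOpts object with conflicting definitions; here the Octavia API process and the neutron code imported by the OVN provider driver both end up registering a 'host' option. The following is a minimal, self-contained reproducer of that oslo.config behaviour (illustrative only, not the actual RHOSP configuration):

from oslo_config import cfg

conf = cfg.ConfigOpts()

# First registration of 'host', e.g. by the service that loads the driver.
conf.register_opts([cfg.StrOpt('host', default='octavia-host')])

# Re-registering an identical Opt is a no-op, but registering a *different*
# definition under the same name raises DuplicateOptError, which is the
# error seen in octavia.log above.
try:
    conf.register_opts([cfg.HostAddressOpt('host', help='neutron-style host option')])
except cfg.DuplicateOptError as exc:
    print('caught:', exc)  # caught: duplicate option: host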

--- Additional comment from Gregory Thiemonge on 2021-06-15 17:22:36 CEST ---

More logs from controller-0/var/log/extra/podman/containers/octavia_api/log/httpd/octavia_wsgi_error.log.gz

[Tue Jun 15 13:25:31.451063 2021] [wsgi:error] [pid 13] [remote 172.17.1.88:59100] mod_wsgi (pid=13): Failed to exec Python script file '/var/www/cgi-bin/octavia/app'.
[Tue Jun 15 13:25:31.451113 2021] [wsgi:error] [pid 13] [remote 172.17.1.88:59100] mod_wsgi (pid=13): Exception occurred processing WSGI script '/var/www/cgi-bin/octavia/app'.
[Tue Jun 15 13:25:31.455622 2021] [wsgi:error] [pid 13] [remote 172.17.1.88:59100] Traceback (most recent call last):
[Tue Jun 15 13:25:31.455757 2021] [wsgi:error] [pid 13] [remote 172.17.1.88:59100]   File "/usr/lib/python3.6/site-packages/octavia/api/drivers/driver_factory.py", line 44, in get_driver
[Tue Jun 15 13:25:31.455771 2021] [wsgi:error] [pid 13] [remote 172.17.1.88:59100]     invoke_on_load=True).driver
[Tue Jun 15 13:25:31.455782 2021] [wsgi:error] [pid 13] [remote 172.17.1.88:59100]   File "/usr/lib/python3.6/site-packages/stevedore/driver.py", line 61, in __init__
[Tue Jun 15 13:25:31.455789 2021] [wsgi:error] [pid 13] [remote 172.17.1.88:59100]     warn_on_missing_entrypoint=warn_on_missing_entrypoint
[Tue Jun 15 13:25:31.455798 2021] [wsgi:error] [pid 13] [remote 172.17.1.88:59100]   File "/usr/lib/python3.6/site-packages/stevedore/named.py", line 81, in __init__
[Tue Jun 15 13:25:31.455803 2021] [wsgi:error] [pid 13] [remote 172.17.1.88:59100]     verify_requirements)
[Tue Jun 15 13:25:31.455811 2021] [wsgi:error] [pid 13] [remote 172.17.1.88:59100]   File "/usr/lib/python3.6/site-packages/stevedore/extension.py", line 203, in _load_plugins
[Tue Jun 15 13:25:31.455816 2021] [wsgi:error] [pid 13] [remote 172.17.1.88:59100]     self._on_load_failure_callback(self, ep, err)
[Tue Jun 15 13:25:31.455825 2021] [wsgi:error] [pid 13] [remote 172.17.1.88:59100]   File "/usr/lib/python3.6/site-packages/stevedore/extension.py", line 195, in _load_plugins
[Tue Jun 15 13:25:31.455830 2021] [wsgi:error] [pid 13] [remote 172.17.1.88:59100]     verify_requirements,
[Tue Jun 15 13:25:31.455838 2021] [wsgi:error] [pid 13] [remote 172.17.1.88:59100]   File "/usr/lib/python3.6/site-packages/stevedore/named.py", line 158, in _load_one_plugin
[Tue Jun 15 13:25:31.455844 2021] [wsgi:error] [pid 13] [remote 172.17.1.88:59100]     verify_requirements,
[Tue Jun 15 13:25:31.455852 2021] [wsgi:error] [pid 13] [remote 172.17.1.88:59100]   File "/usr/lib/python3.6/site-packages/stevedore/extension.py", line 223, in _load_one_plugin
[Tue Jun 15 13:25:31.455858 2021] [wsgi:error] [pid 13] [remote 172.17.1.88:59100]     plugin = ep.resolve()
[Tue Jun 15 13:25:31.455867 2021] [wsgi:error] [pid 13] [remote 172.17.1.88:59100]   File "/usr/lib/python3.6/site-packages/pkg_resources/__init__.py", line 2324, in resolve
[Tue Jun 15 13:25:31.455873 2021] [wsgi:error] [pid 13] [remote 172.17.1.88:59100]     module = __import__(self.module_name, fromlist=['__name__'], level=0)
[Tue Jun 15 13:25:31.455882 2021] [wsgi:error] [pid 13] [remote 172.17.1.88:59100]   File "/usr/lib/python3.6/site-packages/networking_ovn/octavia/ovn_driver.py", line 43, in <module>
[Tue Jun 15 13:25:31.455887 2021] [wsgi:error] [pid 13] [remote 172.17.1.88:59100]     from networking_ovn.ovsdb import impl_idl_ovn
[Tue Jun 15 13:25:31.455895 2021] [wsgi:error] [pid 13] [remote 172.17.1.88:59100]   File "/usr/lib/python3.6/site-packages/networking_ovn/ovsdb/impl_idl_ovn.py", line 36, in <module>
[Tue Jun 15 13:25:31.455900 2021] [wsgi:error] [pid 13] [remote 172.17.1.88:59100]     from networking_ovn.ovsdb import ovsdb_monitor
[Tue Jun 15 13:25:31.455908 2021] [wsgi:error] [pid 13] [remote 172.17.1.88:59100]   File "/usr/lib/python3.6/site-packages/networking_ovn/ovsdb/ovsdb_monitor.py", line 32, in <module>
[Tue Jun 15 13:25:31.455913 2021] [wsgi:error] [pid 13] [remote 172.17.1.88:59100]     from networking_ovn.common import hash_ring_manager
[Tue Jun 15 13:25:31.455938 2021] [wsgi:error] [pid 13] [remote 172.17.1.88:59100]   File "/usr/lib/python3.6/site-packages/networking_ovn/common/hash_ring_manager.py", line 26, in <module>
[Tue Jun 15 13:25:31.455944 2021] [wsgi:error] [pid 13] [remote 172.17.1.88:59100]     from neutron import service
[Tue Jun 15 13:25:31.455952 2021] [wsgi:error] [pid 13] [remote 172.17.1.88:59100]   File "/usr/lib/python3.6/site-packages/neutron/service.py", line 37, in <module>
[Tue Jun 15 13:25:31.455957 2021] [wsgi:error] [pid 13] [remote 172.17.1.88:59100]     from neutron.common import config
[Tue Jun 15 13:25:31.455965 2021] [wsgi:error] [pid 13] [remote 172.17.1.88:59100]   File "/usr/lib/python3.6/site-packages/neutron/common/config.py", line 50, in <module>
[Tue Jun 15 13:25:31.455969 2021] [wsgi:error] [pid 13] [remote 172.17.1.88:59100]     common_config.register_core_common_config_opts()
[Tue Jun 15 13:25:31.455978 2021] [wsgi:error] [pid 13] [remote 172.17.1.88:59100]   File "/usr/lib/python3.6/site-packages/neutron/conf/common.py", line 160, in register_core_common_config_opts
[Tue Jun 15 13:25:31.455983 2021] [wsgi:error] [pid 13] [remote 172.17.1.88:59100]     cfg.register_opts(core_opts)
[Tue Jun 15 13:25:31.455991 2021] [wsgi:error] [pid 13] [remote 172.17.1.88:59100]   File "/usr/lib/python3.6/site-packages/oslo_config/cfg.py", line 2055, in __inner
[Tue Jun 15 13:25:31.455996 2021] [wsgi:error] [pid 13] [remote 172.17.1.88:59100]     result = f(self, *args, **kwargs)
[Tue Jun 15 13:25:31.456005 2021] [wsgi:error] [pid 13] [remote 172.17.1.88:59100]   File "/usr/lib/python3.6/site-packages/oslo_config/cfg.py", line 2317, in register_opts
[Tue Jun 15 13:25:31.456010 2021] [wsgi:error] [pid 13] [remote 172.17.1.88:59100]     self.register_opt(opt, group, clear_cache=False)
[Tue Jun 15 13:25:31.456018 2021] [wsgi:error] [pid 13] [remote 172.17.1.88:59100]   File "/usr/lib/python3.6/site-packages/oslo_config/cfg.py", line 2059, in __inner
[Tue Jun 15 13:25:31.456023 2021] [wsgi:error] [pid 13] [remote 172.17.1.88:59100]     return f(self, *args, **kwargs)
[Tue Jun 15 13:25:31.456031 2021] [wsgi:error] [pid 13] [remote 172.17.1.88:59100]   File "/usr/lib/python3.6/site-packages/oslo_config/cfg.py", line 2306, in register_opt
[Tue Jun 15 13:25:31.456036 2021] [wsgi:error] [pid 13] [remote 172.17.1.88:59100]     if _is_opt_registered(self._opts, opt):
[Tue Jun 15 13:25:31.456044 2021] [wsgi:error] [pid 13] [remote 172.17.1.88:59100]   File "/usr/lib/python3.6/site-packages/oslo_config/cfg.py", line 368, in _is_opt_registered
[Tue Jun 15 13:25:31.456049 2021] [wsgi:error] [pid 13] [remote 172.17.1.88:59100]     raise DuplicateOptError(opt.name)
[Tue Jun 15 13:25:31.456072 2021] [wsgi:error] [pid 13] [remote 172.17.1.88:59100] oslo_config.cfg.DuplicateOptError: duplicate option: host
[Tue Jun 15 13:25:31.456085 2021] [wsgi:error] [pid 13] [remote 172.17.1.88:59100] 
[Tue Jun 15 13:25:31.456088 2021] [wsgi:error] [pid 13] [remote 172.17.1.88:59100] During handling of the above exception, another exception occurred:
[Tue Jun 15 13:25:31.456091 2021] [wsgi:error] [pid 13] [remote 172.17.1.88:59100] 
[Tue Jun 15 13:25:31.456096 2021] [wsgi:error] [pid 13] [remote 172.17.1.88:59100] Traceback (most recent call last):
[Tue Jun 15 13:25:31.456118 2021] [wsgi:error] [pid 13] [remote 172.17.1.88:59100]   File "/var/www/cgi-bin/octavia/app", line 52, in <module>
[Tue Jun 15 13:25:31.456122 2021] [wsgi:error] [pid 13] [remote 172.17.1.88:59100]     application = setup_app()
[Tue Jun 15 13:25:31.456127 2021] [wsgi:error] [pid 13] [remote 172.17.1.88:59100]   File "/usr/lib/python3.6/site-packages/octavia/api/app.py", line 51, in setup_app
[Tue Jun 15 13:25:31.456130 2021] [wsgi:error] [pid 13] [remote 172.17.1.88:59100]     _init_drivers()
[Tue Jun 15 13:25:31.456134 2021] [wsgi:error] [pid 13] [remote 172.17.1.88:59100]   File "/usr/lib/python3.6/site-packages/octavia/api/app.py", line 43, in _init_drivers
[Tue Jun 15 13:25:31.456137 2021] [wsgi:error] [pid 13] [remote 172.17.1.88:59100]     driver_factory.get_driver(provider)
[Tue Jun 15 13:25:31.456148 2021] [wsgi:error] [pid 13] [remote 172.17.1.88:59100]   File "/usr/lib/python3.6/site-packages/octavia/api/drivers/driver_factory.py", line 49, in get_driver
[Tue Jun 15 13:25:31.456151 2021] [wsgi:error] [pid 13] [remote 172.17.1.88:59100]     raise exceptions.ProviderNotFound(prov=provider)
[Tue Jun 15 13:25:31.456161 2021] [wsgi:error] [pid 13] [remote 172.17.1.88:59100] octavia.common.exceptions.ProviderNotFound: Provider 'ovn' was not found.
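The traceback shows the failure path: Octavia's driver_factory asks stevedore to load the 'ovn' provider entry point, the import of networking_ovn fails with the DuplicateOptError above, and the failed load is then surfaced as ProviderNotFound. A rough sketch of that loading pattern, assuming the 'octavia.api.drivers' namespace and the callback shape visible in the traceback (not the actual Octavia source):

from stevedore import driver

def get_driver(provider='ovn'):
    def _log_load_failure(manager, entrypoint, exc):
        # stevedore invokes this when importing the entry point raises,
        # e.g. the DuplicateOptError hit during 'import networking_ovn...'.
        print('Unable to load provider driver %s due to: %s' % (entrypoint.name, exc))

    mgr = driver.DriverManager(
        namespace='octavia.api.drivers',
        name=provider,
        invoke_on_load=True,
        on_load_failure_callback=_log_load_failure,
    )
    # With the import failing, no extension is loaded and the caller maps
    # the failure to ProviderNotFound, as seen at driver_factory.py line 49.
    return mgr.driver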

--- Additional comment from Brian Haley on 2021-06-15 18:55:37 CEST ---

We found the upstream change that caused this [0] and proposed a revert. Re-assigning to the networking-ovn component, as we'll probably need to cherry-pick directly to 16.2 instead of waiting for the upstream revert to merge.

[0] https://review.opendev.org/c/openstack/networking-ovn/+/790218

--- Additional comment from Itzik Brown on 2021-06-21 15:54:50 CEST ---

It blocks OpenShift on OpenStack with 16.2.

--- Additional comment from errata-xmlrpc on 2021-09-15 09:15:55 CEST ---

Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory (Red Hat OpenStack Platform (RHOSP) 16.2 enhancement advisory), and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHEA-2021:3483

Comment 9 Pavel Sedlák 2022-02-02 22:17:51 UTC
Requests to OC neutron are failing with: "HttpException: 503: Server Error for url: http://10.0.0.116:9696/v2.0/networks, 503 Service Unavailable: No server is available to handle this request."

neutron@controller-0 server.log shows these CRITICAL and ERROR entries:
> 2022-02-02 14:45:17.882 7 INFO neutron.service [req-5e76c186-2908-4edf-90a0-09105bcb3f0f - - - - -] Neutron service started, listening on 172.17.1.146:9696
> 2022-02-02 14:45:17.883 7 DEBUG neutron_lib.callbacks.manager [req-5e76c186-2908-4edf-90a0-09105bcb3f0f - - - - -] Notify callbacks ['networking_ovn.ml2.mech_driver.OVNMechanismDriver.pre_fork_initialize--9223372036586778410'] for process, before_spawn _notify_loop /usr/lib/python3.6/site-packages/neutron_lib/callbacks/manager.py:193
> 2022-02-02 14:45:17.884 29 DEBUG neutron_lib.callbacks.manager [-] Notify callbacks ['networking_ovn.ml2.mech_driver.OVNMechanismDriver.post_fork_initialize-267997439'] for process, after_init _notify_loop /usr/lib/python3.6/site-packages/neutron_lib/callbacks/manager.py:193
> 2022-02-02 14:45:17.889 28 DEBUG neutron_lib.callbacks.manager [-] Notify callbacks ['networking_ovn.ml2.mech_driver.OVNMechanismDriver.post_fork_initialize-267997439'] for process, after_init _notify_loop /usr/lib/python3.6/site-packages/neutron_lib/callbacks/manager.py:193
> 2022-02-02 14:45:17.898 7 DEBUG neutron_lib.callbacks.manager [req-5e76c186-2908-4edf-90a0-09105bcb3f0f - - - - -] Callback networking_ovn.ml2.mech_driver.OVNMechanismDriver.pre_fork_initialize--9223372036586778410 raised 'str' object has no attribute 'register_all' _notify_loop /usr/lib/python3.6/site-packages/neutron_lib/callbacks/manager.py:210
> 2022-02-02 14:45:17.898 7 DEBUG neutron_lib.callbacks.manager [req-5e76c186-2908-4edf-90a0-09105bcb3f0f - - - - -] Notify callbacks [] for process, abort_spawn _notify_loop /usr/lib/python3.6/site-packages/neutron_lib/callbacks/manager.py:193
> 2022-02-02 14:45:17.900 7 CRITICAL neutron [req-5e76c186-2908-4edf-90a0-09105bcb3f0f - - - - -] Unhandled error: neutron_lib.callbacks.exceptions.CallbackFailure: Callback networking_ovn.ml2.mech_driver.OVNMechanismDriver.pre_fork_initialize--9223372036586778410 failed with "'str' object has no attribute 'register_all'"
> 2022-02-02 14:45:17.900 7 ERROR neutron Traceback (most recent call last):
> 2022-02-02 14:45:17.900 7 ERROR neutron   File "/usr/bin/neutron-server", line 10, in <module>
> 2022-02-02 14:45:17.900 7 ERROR neutron     sys.exit(main())
> 2022-02-02 14:45:17.900 7 ERROR neutron   File "/usr/lib/python3.6/site-packages/neutron/cmd/eventlet/server/__init__.py", line 19, in main
> 2022-02-02 14:45:17.900 7 ERROR neutron     server.boot_server(wsgi_eventlet.eventlet_wsgi_server)
> 2022-02-02 14:45:17.900 7 ERROR neutron   File "/usr/lib/python3.6/site-packages/neutron/server/__init__.py", line 68, in boot_server
> 2022-02-02 14:45:17.900 7 ERROR neutron     server_func()
> 2022-02-02 14:45:17.900 7 ERROR neutron   File "/usr/lib/python3.6/site-packages/neutron/server/wsgi_eventlet.py", line 23, in eventlet_wsgi_server
> 2022-02-02 14:45:17.900 7 ERROR neutron     neutron_api = service.serve_wsgi(service.NeutronApiService)
> 2022-02-02 14:45:17.900 7 ERROR neutron   File "/usr/lib/python3.6/site-packages/neutron/service.py", line 94, in serve_wsgi
> 2022-02-02 14:45:17.900 7 ERROR neutron     registry.publish(resources.PROCESS, events.BEFORE_SPAWN, service)
> 2022-02-02 14:45:17.900 7 ERROR neutron   File "/usr/lib/python3.6/site-packages/neutron_lib/callbacks/registry.py", line 60, in publish
> 2022-02-02 14:45:17.900 7 ERROR neutron     _get_callback_manager().publish(resource, event, trigger, payload=payload)
> 2022-02-02 14:45:17.900 7 ERROR neutron   File "/usr/lib/python3.6/site-packages/neutron_lib/callbacks/manager.py", line 149, in publish
> 2022-02-02 14:45:17.900 7 ERROR neutron     return self.notify(resource, event, trigger, payload=payload)
> 2022-02-02 14:45:17.900 7 ERROR neutron   File "/usr/lib/python3.6/site-packages/neutron_lib/db/utils.py", line 108, in _wrapped
> 2022-02-02 14:45:17.900 7 ERROR neutron     raise db_exc.RetryRequest(e)
> 2022-02-02 14:45:17.900 7 ERROR neutron   File "/usr/lib/python3.6/site-packages/oslo_utils/excutils.py", line 220, in __exit__
> 2022-02-02 14:45:17.900 7 ERROR neutron     self.force_reraise()
> 2022-02-02 14:45:17.900 7 ERROR neutron   File "/usr/lib/python3.6/site-packages/oslo_utils/excutils.py", line 196, in force_reraise
> 2022-02-02 14:45:17.900 7 ERROR neutron     six.reraise(self.type_, self.value, self.tb)
> 2022-02-02 14:45:17.900 7 ERROR neutron   File "/usr/lib/python3.6/site-packages/six.py", line 675, in reraise
> 2022-02-02 14:45:17.900 7 ERROR neutron     raise value
> 2022-02-02 14:45:17.900 7 ERROR neutron   File "/usr/lib/python3.6/site-packages/neutron_lib/db/utils.py", line 103, in _wrapped
> 2022-02-02 14:45:17.900 7 ERROR neutron     return function(*args, **kwargs)
> 2022-02-02 14:45:17.900 7 ERROR neutron   File "/usr/lib/python3.6/site-packages/neutron_lib/callbacks/manager.py", line 174, in notify
> 2022-02-02 14:45:17.900 7 ERROR neutron     raise exceptions.CallbackFailure(errors=errors)
> 2022-02-02 14:45:17.900 7 ERROR neutron neutron_lib.callbacks.exceptions.CallbackFailure: Callback networking_ovn.ml2.mech_driver.OVNMechanismDriver.pre_fork_initialize--9223372036586778410 failed with "'str' object has no attribute 'register_all'"
> 2022-02-02 14:45:17.900 7 ERROR neutron 
> 2022-02-02 14:45:17.914 27 INFO networking_ovn.ovsdb.impl_idl_ovn [-] Getting OvsdbNbOvnIdl for WorkerService with retry
> 2022-02-02 14:45:17.919 26 INFO networking_ovn.ovsdb.impl_idl_ovn [-] Getting OvsdbNbOvnIdl for WorkerService with retry
> 2022-02-02 14:45:17.924 29 INFO networking_ovn.ovsdb.impl_idl_ovn [-] Getting OvsdbNbOvnIdl for WorkerService with retry
> 2022-02-02 14:45:17.925 28 INFO networking_ovn.ovsdb.impl_idl_ovn [-] Getting OvsdbNbOvnIdl for WorkerService with retry
> 2022-02-02 14:45:17.972 26 DEBUG ovsdbapp.backend.ovs_idl [-] Created index name autocreate_indices /usr/lib/python3.6/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:104
> 2022-02-02 14:45:17.972 26 DEBUG ovsdbapp.backend.ovs_idl [-] Created index name autocreate_indices /usr/lib/python3.6/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:104
> 2022-02-02 14:45:17.972 26 DEBUG ovsdbapp.backend.ovs_idl [-] Created index name autocreate_indices /usr/lib/python3.6/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:104
> 2022-02-02 14:45:17.973 26 DEBUG ovsdbapp.backend.ovs_idl [-] Created index name autocreate_indices /usr/lib/python3.6/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:121
> 2022-02-02 14:45:17.973 26 DEBUG ovsdbapp.backend.ovs_idl [-] Created index name autocreate_indices /usr/lib/python3.6/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:121
> 2022-02-02 14:45:17.973 26 DEBUG ovsdbapp.backend.ovs_idl [-] Created index name autocreate_indices /usr/lib/python3.6/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:121
> 2022-02-02 14:45:17.973 26 DEBUG ovsdbapp.backend.ovs_idl [-] Created index target autocreate_indices /usr/lib/python3.6/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:121
> 2022-02-02 14:45:17.973 26 DEBUG ovsdbapp.backend.ovs_idl [-] Created index name autocreate_indices /usr/lib/python3.6/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:121
> 2022-02-02 14:45:17.973 26 DEBUG ovsdbapp.backend.ovs_idl [-] Created index name autocreate_indices /usr/lib/python3.6/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:121
> 2022-02-02 14:45:17.974 26 DEBUG ovsdbapp.backend.ovs_idl [-] Created index name autocreate_indices /usr/lib/python3.6/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:121
> 2022-02-02 14:45:17.974 26 DEBUG ovsdbapp.backend.ovs_idl [-] Created index name autocreate_indices /usr/lib/python3.6/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:121
> 2022-02-02 14:45:17.974 26 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:172.17.1.97:6641: connecting...
> 2022-02-02 14:45:17.975 26 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:172.17.1.97:6641: connected
> 2022-02-02 14:45:17.977 29 DEBUG ovsdbapp.backend.ovs_idl [-] Created index name autocreate_indices /usr/lib/python3.6/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:104
> 2022-02-02 14:45:17.978 29 DEBUG ovsdbapp.backend.ovs_idl [-] Created index name autocreate_indices /usr/lib/python3.6/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:104
> 2022-02-02 14:45:17.978 29 DEBUG ovsdbapp.backend.ovs_idl [-] Created index name autocreate_indices /usr/lib/python3.6/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:104
> 2022-02-02 14:45:17.978 29 DEBUG ovsdbapp.backend.ovs_idl [-] Created index name autocreate_indices /usr/lib/python3.6/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:121
> 2022-02-02 14:45:17.978 29 DEBUG ovsdbapp.backend.ovs_idl [-] Created index name autocreate_indices /usr/lib/python3.6/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:121
> 2022-02-02 14:45:17.978 29 DEBUG ovsdbapp.backend.ovs_idl [-] Created index name autocreate_indices /usr/lib/python3.6/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:121
> 2022-02-02 14:45:17.979 29 DEBUG ovsdbapp.backend.ovs_idl [-] Created index target autocreate_indices /usr/lib/python3.6/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:121
> 2022-02-02 14:45:17.979 29 DEBUG ovsdbapp.backend.ovs_idl [-] Created index name autocreate_indices /usr/lib/python3.6/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:121
> 2022-02-02 14:45:17.979 29 DEBUG ovsdbapp.backend.ovs_idl [-] Created index name autocreate_indices /usr/lib/python3.6/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:121
> 2022-02-02 14:45:17.979 27 DEBUG ovsdbapp.backend.ovs_idl [-] Created index name autocreate_indices /usr/lib/python3.6/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:104
> 2022-02-02 14:45:17.979 29 DEBUG ovsdbapp.backend.ovs_idl [-] Created index name autocreate_indices /usr/lib/python3.6/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:121
> 2022-02-02 14:45:17.979 27 DEBUG ovsdbapp.backend.ovs_idl [-] Created index name autocreate_indices /usr/lib/python3.6/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:104
> 2022-02-02 14:45:17.979 29 DEBUG ovsdbapp.backend.ovs_idl [-] Created index name autocreate_indices /usr/lib/python3.6/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:121
> 2022-02-02 14:45:17.979 27 DEBUG ovsdbapp.backend.ovs_idl [-] Created index name autocreate_indices /usr/lib/python3.6/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:104
> 2022-02-02 14:45:17.979 27 DEBUG ovsdbapp.backend.ovs_idl [-] Created index name autocreate_indices /usr/lib/python3.6/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:121
> 2022-02-02 14:45:17.980 27 DEBUG ovsdbapp.backend.ovs_idl [-] Created index name autocreate_indices /usr/lib/python3.6/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:121
> 2022-02-02 14:45:17.980 29 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:172.17.1.97:6641: connecting...
> 2022-02-02 14:45:17.980 27 DEBUG ovsdbapp.backend.ovs_idl [-] Created index name autocreate_indices /usr/lib/python3.6/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:121
> 2022-02-02 14:45:17.980 27 DEBUG ovsdbapp.backend.ovs_idl [-] Created index target autocreate_indices /usr/lib/python3.6/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:121
> 2022-02-02 14:45:17.980 29 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:172.17.1.97:6641: connected
> 2022-02-02 14:45:17.980 27 DEBUG ovsdbapp.backend.ovs_idl [-] Created index name autocreate_indices /usr/lib/python3.6/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:121
> 2022-02-02 14:45:17.980 27 DEBUG ovsdbapp.backend.ovs_idl [-] Created index name autocreate_indices /usr/lib/python3.6/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:121
> 2022-02-02 14:45:17.980 27 DEBUG ovsdbapp.backend.ovs_idl [-] Created index name autocreate_indices /usr/lib/python3.6/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:121
> 2022-02-02 14:45:17.981 27 DEBUG ovsdbapp.backend.ovs_idl [-] Created index name autocreate_indices /usr/lib/python3.6/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:121
> 2022-02-02 14:45:17.981 27 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:172.17.1.97:6641: connecting...
> 2022-02-02 14:45:17.981 27 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:172.17.1.97:6641: connected
> 2022-02-02 14:45:17.992 26 DEBUG networking_ovn.common.hash_ring_manager [-] Disallow caching, nodes 0<4 _wait_startup_before_caching /usr/lib/python3.6/site-packages/networking_ovn/common/hash_ring_manager.py:61
> 2022-02-02 14:45:17.996 26 ERROR networking_ovn.ovsdb.ovsdb_monitor [-] HashRing is empty, error: Hash Ring returned empty when hashing "b'30290467-ab7e-4463-900b-f67ad3cb4945'". This should never happen in a normal situation, please check the status of your cluster: networking_ovn.common.exceptions.HashRingIsEmpty: Hash Ring returned empty when hashing "b'30290467-ab7e-4463-900b-f67ad3cb4945'". This should never happen in a normal situation, please check the status of your cluster
> 2022-02-02 14:45:18.024 26 DEBUG networking_ovn.common.hash_ring_manager [-] Disallow caching, nodes 0<4 _wait_startup_before_caching /usr/lib/python3.6/site-packages/networking_ovn/common/hash_ring_manager.py:61
> 2022-02-02 14:45:18.025 29 DEBUG networking_ovn.common.hash_ring_manager [-] Disallow caching, nodes 0<4 _wait_startup_before_caching /usr/lib/python3.6/site-packages/networking_ovn/common/hash_ring_manager.py:61
> 2022-02-02 14:45:18.028 26 ERROR networking_ovn.ovsdb.ovsdb_monitor [-] HashRing is empty, error: Hash Ring returned empty when hashing "b'3d8ad4fb-9a0c-4969-b793-78e0545b6ba5'". This should never happen in a normal situation, please check the status of your cluster: networking_ovn.common.exceptions.HashRingIsEmpty: Hash Ring returned empty when hashing "b'3d8ad4fb-9a0c-4969-b793-78e0545b6ba5'". This should never happen in a normal situation, please check the status of your cluster
> 2022-02-02 14:45:18.028 27 DEBUG networking_ovn.common.hash_ring_manager [-] Disallow caching, nodes 0<4 _wait_startup_before_caching /usr/lib/python3.6/site-packages/networking_ovn/common/hash_ring_manager.py:61

So it seems the fix did not work as intended?
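The new traceback points at the pre_fork_initialize callback that networking-ovn subscribes to the PROCESS/BEFORE_SPAWN event published by neutron-server (neutron/service.py:94 above). Below is a simplified sketch of that callback wiring using the public neutron_lib API; the callback body is hypothetical and only indicates where the "'str' object has no attribute 'register_all'" error would originate:

from neutron_lib.callbacks import events, registry, resources

# Minimal sketch (not the networking_ovn source): neutron-server publishes
# PROCESS/BEFORE_SPAWN, the driver's pre_fork_initialize receives it, and any
# exception raised inside the callback is collected by neutron_lib and
# re-raised as CallbackFailure, which is what aborts the server start.
@registry.has_registry_receivers
class FakeOVNMechanismDriver(object):

    @registry.receives(resources.PROCESS, [events.BEFORE_SPAWN])
    def pre_fork_initialize(self, resource, event, trigger, payload=None):
        # In the failing run, code here ended up operating on a plain string
        # instead of the expected object, hence
        # "'str' object has no attribute 'register_all'".
        print('pre_fork_initialize called for %s/%s' % (resource, event))

driver = FakeOVNMechanismDriver()
# Mirrors neutron/service.py:94 from the traceback.
registry.publish(resources.PROCESS, events.BEFORE_SPAWN, trigger=None)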

Comment 12 Eran Kuris 2022-02-03 15:59:16 UTC
(In reply to Pavel Sedlák from comment #9)
> Requests to OC neutron are failing with: "HttpException: 503: Server Error
> for url: http://10.0.0.116:9696/v2.0/networks, 503 Service Unavailable: No
> server is available to handle this request."
> 
> neutron@controller-0 server.log shows these CRITICAL and ERROR entries:
> > 2022-02-02 14:45:17.900 7 CRITICAL neutron [req-5e76c186-2908-4edf-90a0-09105bcb3f0f - - - - -] Unhandled error: neutron_lib.callbacks.exceptions.CallbackFailure: Callback networking_ovn.ml2.mech_driver.OVNMechanismDriver.pre_fork_initialize--9223372036586778410 failed with "'str' object has no attribute 'register_all'"
> > [...]
> 
> So it seems the fix did not work as intended?

Pavel, are you going to report another bug?

The original issue reported here does not appear to be reproduced in the phase 2 jobs, and this BZ is set to ON_QA.
I think the new issue needs to be reported in a separate BZ.
WDYT?

Comment 15 Eran Kuris 2022-02-06 08:24:42 UTC
According to the phase 2 jobs, it looks like the issue was fixed in:
core_puddle: RHOS-16.1-RHEL-8-20220203.n.1

Comment 21 errata-xmlrpc 2022-03-24 11:03:21 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory (Red Hat OpenStack Platform 16.1.8 bug fix and enhancement advisory), and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHBA-2022:0986