Description of problem:

Deploying a CentOS Stream 9 based host with a CentOS Stream 8 based engine fails. On the host side, vdsm.log shows:

2021-09-17 06:58:14,053+0000 INFO (jsonrpc/5) [api.network] START setupNetworks(networks={'ovirtmgmt': {'ipv6autoconf': False, 'nic': 'enp1s0', 'bridged': 'true', 'bootproto': 'dhcp', 'defaultRoute': True, 'dhcpv6': False, 'STP': 'no', 'mtu': 1500, 'switch': 'legacy'}}, bondings={}, options={'connectivityTimeout': 120, 'commitOnSuccess': True, 'connectivityCheck': 'true'}) from=::ffff:10.46.8.230,40598, flow_id=1e231b4a-a7e3-4c2c-bfd8-d32efb6d7294 (api:48)

2021-09-17 06:59:14,053+0000 WARN (vdsm.Scheduler) [Executor] Worker blocked: <Worker name=jsonrpc/5 running <Task <JsonRpcTask {'jsonrpc': '2.0', 'method': 'Host.setupNetworks', 'params': {'networks': {'ovirtmgmt': {'ipv6autoconf': False, 'nic': 'enp1s0', 'bridged': 'true', 'bootproto': 'dhcp', 'defaultRoute': True, 'dhcpv6': False, 'STP': 'no', 'mtu': 1500, 'switch': 'legacy'}}, 'bondings': {}, 'options': {'connectivityTimeout': 120, 'commitOnSuccess': True, 'connectivityCheck': 'true'}}, 'id': '53f12e41-daef-42c4-a172-d377c05f0be7'} at 0x7f32e0728220> timeout=60, duration=60.00 at 0x7f32e07280d0> task#=12 at 0x7f32f8077340>, traceback:
File: "/usr/lib64/python3.9/threading.py", line 930, in _bootstrap
  self._bootstrap_inner()
File: "/usr/lib64/python3.9/threading.py", line 973, in _bootstrap_inner
  self.run()
File: "/usr/lib64/python3.9/threading.py", line 910, in run
  self._target(*self._args, **self._kwargs)
File: "/usr/lib/python3.9/site-packages/vdsm/common/concurrent.py", line 260, in run
  ret = func(*args, **kwargs)
File: "/usr/lib/python3.9/site-packages/vdsm/executor.py", line 301, in _run
  self._execute_task()
File: "/usr/lib/python3.9/site-packages/vdsm/executor.py", line 315, in _execute_task
  task()
File: "/usr/lib/python3.9/site-packages/vdsm/executor.py", line 391, in __call__
  self._callable()
File: "/usr/lib/python3.9/site-packages/yajsonrpc/__init__.py", line 264, in __call__
  self._handler(self._ctx, self._req)
File: "/usr/lib/python3.9/site-packages/yajsonrpc/__init__.py", line 307, in _serveRequest
  response = self._handle_request(req, ctx)
File: "/usr/lib/python3.9/site-packages/yajsonrpc/__init__.py", line 349, in _handle_request
  res = method(**params)
File: "/usr/lib/python3.9/site-packages/vdsm/rpc/Bridge.py", line 198, in _dynamicMethod
  result = fn(*methodArgs)
File: "<decorator-gen-501>", line 2, in setupNetworks
File: "/usr/lib/python3.9/site-packages/vdsm/common/api.py", line 50, in method
  ret = func(*args, **kwargs)
File: "/usr/lib/python3.9/site-packages/vdsm/API.py", line 1577, in setupNetworks
  supervdsm.getProxy().setupNetworks(networks, bondings, options)
File: "/usr/lib/python3.9/site-packages/vdsm/common/supervdsm.py", line 56, in __call__
  return callMethod()
File: "/usr/lib/python3.9/site-packages/vdsm/common/supervdsm.py", line 53, in <lambda>
  getattr(self._supervdsmProxy._svdsm, self._funcName)(*args,
File: "<string>", line 2, in setupNetworks
File: "/usr/lib64/python3.9/multiprocessing/managers.py", line 810, in _callmethod
  kind, result = conn.recv()
File: "/usr/lib64/python3.9/multiprocessing/connection.py", line 259, in recv
  buf = self._recv_bytes()
File: "/usr/lib64/python3.9/multiprocessing/connection.py", line 423, in _recv_bytes
  buf = self._recv(4)
File: "/usr/lib64/python3.9/multiprocessing/connection.py", line 388, in _recv
  chunk = read(handle, remaining) (executor:362)

2021-09-17 07:00:15,654+0000 ERROR (jsonrpc/5) [vds] connectivity check failed (API:1584)
Traceback (most recent call last):
  File "/usr/lib/python3.9/site-packages/vdsm/API.py", line 1577, in setupNetworks
    supervdsm.getProxy().setupNetworks(networks, bondings, options)
  File "/usr/lib/python3.9/site-packages/vdsm/common/supervdsm.py", line 56, in __call__
    return callMethod()
  File "/usr/lib/python3.9/site-packages/vdsm/common/supervdsm.py", line 53, in <lambda>
    getattr(self._supervdsmProxy._svdsm, self._funcName)(*args,
  File "<string>", line 2, in setupNetworks
  File "/usr/lib64/python3.9/multiprocessing/managers.py", line 825, in _callmethod
    raise convert_to_error(kind, result)
vdsm.network.errors.ConfigNetworkError: (10, 'connectivity check failed')

2021-09-17 07:00:15,655+0000 INFO (jsonrpc/5) [api.network] FINISH setupNetworks return={'status': {'code': 10, 'message': 'connectivity check failed'}} from=::ffff:10.46.8.230,40598, flow_id=1e231b4a-a7e3-4c2c-bfd8-d32efb6d7294 (api:54)

2021-09-17 07:00:15,655+0000 INFO (jsonrpc/5) [jsonrpc.JsonRpcServer] RPC call Host.setupNetworks failed (error 10) in 121.60 seconds (__init__:311)

On the engine side:

2021-09-17 06:58:13,521Z INFO [org.ovirt.engine.core.bll.network.host.HostSetupNetworksCommand] (default task-5) [1e231b4a-a7e3-4c2c-bfd8-d32efb6d7294] Before acquiring lock-timeout 'EngineLock:{exclusiveLocks='[HOST_NETWORK72a01c31-8ae6-42b9-9b72-ad0692fef08f=HOST_NETWORK]', sharedLocks=''}'

2021-09-17 06:58:13,523Z INFO [org.ovirt.engine.core.bll.network.host.HostSetupNetworksCommand] (default task-5) [1e231b4a-a7e3-4c2c-bfd8-d32efb6d7294] Lock-timeout acquired to object 'EngineLock:{exclusiveLocks='[HOST_NETWORK72a01c31-8ae6-42b9-9b72-ad0692fef08f=HOST_NETWORK]', sharedLocks=''}'

2021-09-17 06:58:13,549Z INFO [org.ovirt.engine.core.bll.network.host.HostSetupNetworksCommand] (default task-5) [1e231b4a-a7e3-4c2c-bfd8-d32efb6d7294] Running command: HostSetupNetworksCommand internal: false.
Entities affected : ID: 72a01c31-8ae6-42b9-9b72-ad0692fef08f Type: VDSAction group CONFIGURE_HOST_NETWORK with role type ADMIN

2021-09-17 06:58:13,549Z INFO [org.ovirt.engine.core.bll.network.host.HostSetupNetworksCommand] (default task-5) [1e231b4a-a7e3-4c2c-bfd8-d32efb6d7294] Before acquiring lock in order to prevent monitoring for host 'stream9' from data-center 'Default'

2021-09-17 06:58:13,549Z INFO [org.ovirt.engine.core.bll.network.host.HostSetupNetworksCommand] (default task-5) [1e231b4a-a7e3-4c2c-bfd8-d32efb6d7294] Lock acquired, from now a monitoring of host will be skipped for host 'stream9' from data-center 'Default'

2021-09-17 06:58:13,550Z INFO [org.ovirt.engine.core.vdsbroker.vdsbroker.HostSetupNetworksVDSCommand] (default task-5) [1e231b4a-a7e3-4c2c-bfd8-d32efb6d7294] START, HostSetupNetworksVDSCommand(HostName = stream9, HostSetupNetworksVdsCommandParameters:{hostId='72a01c31-8ae6-42b9-9b72-ad0692fef08f', vds='Host[stream9,72a01c31-8ae6-42b9-9b72-ad0692fef08f]', rollbackOnFailure='true', commitOnSuccess='true', connectivityTimeout='120', networks='[HostNetwork:{defaultRoute='true', bonding='false', networkName='ovirtmgmt', vdsmName='ovirtmgmt', nicName='enp1s0', vlan='null', vmNetwork='true', stp='false', properties='null', ipv4BootProtocol='DHCP', ipv4Address='null', ipv4Netmask='null', ipv4Gateway='null', ipv6BootProtocol='NONE', ipv6Address='null', ipv6Prefix='null', ipv6Gateway='null', nameServers='null'}]', removedNetworks='[]', bonds='[]', removedBonds='[]', clusterSwitchType='LEGACY', managementNetworkChanged='true'}), log id: 4afc556c

2021-09-17 06:58:13,551Z INFO [org.ovirt.engine.core.vdsbroker.vdsbroker.HostSetupNetworksVDSCommand] (default task-5) [1e231b4a-a7e3-4c2c-bfd8-d32efb6d7294] FINISH, HostSetupNetworksVDSCommand, return: , log id: 4afc556c

2021-09-17 06:58:16,059Z WARN [org.ovirt.engine.core.vdsbroker.vdsbroker.PollVDSCommand] (default task-5) [1e231b4a-a7e3-4c2c-bfd8-d32efb6d7294] Unexpected return value: Status [code=-1, message=Internal timeout occurred]

2021-09-17 06:58:16,059Z ERROR [org.ovirt.engine.core.vdsbroker.vdsbroker.PollVDSCommand] (default task-5) [1e231b4a-a7e3-4c2c-bfd8-d32efb6d7294] Failed in 'PollVDS' method

...

2021-09-17 07:01:13,904Z ERROR [org.ovirt.engine.core.dal.dbbroker.auditloghandling.AuditLogDirector] (default task-5) [1e231b4a-a7e3-4c2c-bfd8-d32efb6d7294] EVENT_ID: VDS_BROKER_COMMAND_FAILURE(10,802), VDSM stream9 command HostSetupNetworksVDS failed: Message timeout which can be caused by communication issues

2021-09-17 07:01:13,904Z ERROR [org.ovirt.engine.core.vdsbroker.vdsbroker.HostSetupNetworksVDSCommand] (default task-5) [1e231b4a-a7e3-4c2c-bfd8-d32efb6d7294] Command 'HostSetupNetworksVDSCommand(HostName = stream9, HostSetupNetworksVdsCommandParameters:{hostId='72a01c31-8ae6-42b9-9b72-ad0692fef08f', vds='Host[stream9,72a01c31-8ae6-42b9-9b72-ad0692fef08f]', rollbackOnFailure='true', commitOnSuccess='true', connectivityTimeout='120', networks='[HostNetwork:{defaultRoute='true', bonding='false', networkName='ovirtmgmt', vdsmName='ovirtmgmt', nicName='enp1s0', vlan='null', vmNetwork='true', stp='false', properties='null', ipv4BootProtocol='DHCP', ipv4Address='null', ipv4Netmask='null', ipv4Gateway='null', ipv6BootProtocol='NONE', ipv6Address='null', ipv6Prefix='null', ipv6Gateway='null', nameServers='null'}]', removedNetworks='[]', bonds='[]', removedBonds='[]', clusterSwitchType='LEGACY', managementNetworkChanged='true'})' execution failed: VDSGenericException: VDSNetworkException: Message timeout which can be caused by communication issues

2021-09-17 07:01:13,904Z ERROR [org.ovirt.engine.core.vdsbroker.vdsbroker.HostSetupNetworksVDSCommand] (default task-5) [1e231b4a-a7e3-4c2c-bfd8-d32efb6d7294] Error: VDSGenericException: VDSNetworkException: Message timeout which can be caused by communication issues

2021-09-17 07:01:13,904Z INFO [org.ovirt.engine.core.bll.network.host.HostSetupNetworksCommand] (default task-5) [1e231b4a-a7e3-4c2c-bfd8-d32efb6d7294] Host setup networks finished. Lock released. Monitoring can run now for host 'stream9' from data-center 'Default'

2021-09-17 07:01:13,904Z ERROR [org.ovirt.engine.core.bll.network.host.HostSetupNetworksCommand] (default task-5) [1e231b4a-a7e3-4c2c-bfd8-d32efb6d7294] Command 'org.ovirt.engine.core.bll.network.host.HostSetupNetworksCommand' failed: EngineException: org.ovirt.engine.core.vdsbroker.vdsbroker.VDSNetworkException: VDSGenericException: VDSNetworkException: Message timeout which can be caused by communication issues (Failed with error VDS_NETWORK_ERROR and code 5022)

2021-09-17 07:01:13,917Z INFO [org.ovirt.engine.core.bll.network.host.HostSetupNetworksCommand] (default task-5) [1e231b4a-a7e3-4c2c-bfd8-d32efb6d7294] Lock freed to object 'EngineLock:{exclusiveLocks='[HOST_NETWORK72a01c31-8ae6-42b9-9b72-ad0692fef08f=HOST_NETWORK]', sharedLocks=''}'

Version-Release number of selected component (if applicable):
vdsm-4.50-24.git0a932c0e7.el9.x86_64
nmstate-2.0.0-0.3.alpha2.el9.noarch
NetworkManager-1.32.10-2.el9.x86_64
ovirt-engine-4.5.0-0.0.master.20210914212845.git140ef22272e.el8.noarch
The connectivity check fails because ovirtmgmt receives a different IP address than the NIC had before the bridge was set up. This is related to a systemd behavioral change [0].

Possible workarounds:

1) Configure a static IP on the host before host deploy.

2) Add an additional systemd config file, /etc/systemd/network/00-MAC.link, containing:

[Match]
OriginalName=*

[Link]
MACAddressPolicy=none

Regards,
Ales

[0] https://bugzilla.redhat.com/1921094
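For reference, workaround 2) can be scripted roughly as below. This is a sketch, not part of the fix: the DEST variable defaults to a local demo directory so it can run without root, but on a real host the file must land in /etc/systemd/network/, and the .link policy only takes effect after the device is re-initialized (a reboot is the simplest way before re-running host deploy).

```shell
# Hedged sketch of workaround 2): disable systemd's MACAddressPolicy for all
# links so a bridge such as ovirtmgmt keeps the MAC of its underlying NIC and
# DHCP hands out the same IP address as before.
# NOTE: DEST defaults to a local demo directory here; on a host use
# DEST=/etc/systemd/network (requires root).
DEST="${DEST:-./systemd-network-demo}"
mkdir -p "$DEST"

cat > "$DEST/00-MAC.link" <<'EOF'
[Match]
OriginalName=*

[Link]
MACAddressPolicy=none
EOF

# .link files are applied by udev when the device is initialized, so reboot
# the host (or re-trigger udev for the NIC) before retrying host deploy.
cat "$DEST/00-MAC.link"
```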
Setup networks is working again on el9s.
This bugzilla is included in the oVirt 4.5.0 release, published on April 20th 2022. Since the problem described in this bug report should be resolved in the oVirt 4.5.0 release, it has been closed with a resolution of CURRENT RELEASE. If the solution does not work for you, please open a new bug report.