Bug 1731214
| Summary: | After creating a bunch of VMs we get to a point where no VMs can be created due to VirtualInterfaceCreateException: Virtual Interface creation failed | | |
|---|---|---|---|
| Product: | Red Hat OpenStack | Reporter: | David Hill <dhill> |
| Component: | openstack-neutron | Assignee: | Lucas Alvares Gomes <lmartins> |
| Status: | CLOSED DUPLICATE | QA Contact: | Eran Kuris <ekuris> |
| Severity: | urgent | Docs Contact: | |
| Priority: | urgent | | |
| Version: | 13.0 (Queens) | CC: | amuller, chrisw, lmartins, njohnston, pveiga, schari, scohen |
| Target Milestone: | --- | Keywords: | Reopened, Scale, TestBlocker |
| Target Release: | --- | | |
| Hardware: | x86_64 | | |
| OS: | Linux | | |
| Whiteboard: | | | |
| Fixed In Version: | | Doc Type: | If docs needed, set a value |
| Doc Text: | | Story Points: | --- |
| Clone Of: | | Environment: | |
| Last Closed: | 2022-06-30 10:46:08 UTC | Type: | Bug |
| Regression: | --- | Mount Type: | --- |
| Documentation: | --- | CRM: | |
| Verified Versions: | | Category: | --- |
| oVirt Team: | --- | RHEL 7.3 requirements from Atomic Host: | |
| Cloudforms Team: | --- | Target Upstream Version: | |
| Embargoed: | | | |
Description
David Hill
2019-07-18 16:10:49 UTC
```
[root@overcloud-control-0 neutron]# tail -F server.log | grep ERROR
2019-07-18 16:46:28.378 28 ERROR ovsdbapp.backend.ovs_idl.vlog [-] tcp:10.10.10.10:6642: no response to inactivity probe after 5 seconds, disconnecting: Exception: Could not retrieve schema from tcp:10.10.10.10:6642: Connection refused
2019-07-18 16:46:30.810 40 ERROR ovsdbapp.backend.ovs_idl.vlog [-] tcp:10.10.10.10:6642: no response to inactivity probe after 5 seconds, disconnecting: Exception: Could not retrieve schema from tcp:10.10.10.10:6642: Connection refused
2019-07-18 16:46:43.816 40 ERROR ovsdbapp.backend.ovs_idl.vlog [-] tcp:10.10.10.10:6642: no response to inactivity probe after 5 seconds, disconnecting: Exception: Could not retrieve schema from tcp:10.10.10.10:6642: Connection refused
2019-07-18 17:11:14.222 53 ERROR ovsdbapp.backend.ovs_idl.transaction [-] OVSDB Error: {"details":"Transaction causes multiple rows in \"Logical_Switch_Port\" table to have identical values (\"2e74e4b2-f0ab-493b-bcdc-aa54273374b1\") for index on column \"name\". First row, with UUID f85f5e43-bada-41e8-b3e0-9ec6edc04689, was inserted by this transaction. Second row, with UUID 65449685-0cad-45d3-9dcc-ad957615c606, existed in the database before this transaction and was not modified by the transaction.","error":"constraint violation"}
2019-07-18 17:11:14.223 53 ERROR ovsdbapp.backend.ovs_idl.transaction [req-e16e3e31-96df-484f-897e-db6221b13867 - - - - -] Traceback (most recent call last):
    raise RuntimeError(msg)
RuntimeError: OVSDB Error: {"details":"Transaction causes multiple rows in \"Logical_Switch_Port\" table to have identical values (\"2e74e4b2-f0ab-493b-bcdc-aa54273374b1\") for index on column \"name\". First row, with UUID f85f5e43-bada-41e8-b3e0-9ec6edc04689, was inserted by this transaction. Second row, with UUID 65449685-0cad-45d3-9dcc-ad957615c606, existed in the database before this transaction and was not modified by the transaction.","error":"constraint violation"}
2019-07-18 17:11:14.223 53 ERROR networking_ovn.common.maintenance [req-e16e3e31-96df-484f-897e-db6221b13867 - - - - -] Failed to fix resource 2e74e4b2-f0ab-493b-bcdc-aa54273374b1 (type: ports): RuntimeError: OVSDB Error: {"details":"Transaction causes multiple rows in \"Logical_Switch_Port\" table to have identical values (\"2e74e4b2-f0ab-493b-bcdc-aa54273374b1\") for index on column \"name\". First row, with UUID f85f5e43-bada-41e8-b3e0-9ec6edc04689, was inserted by this transaction. Second row, with UUID 65449685-0cad-45d3-9dcc-ad957615c606, existed in the database before this transaction and was not modified by the transaction.","error":"constraint violation"}
2019-07-18 17:11:14.223 53 ERROR networking_ovn.common.maintenance Traceback (most recent call last):
2019-07-18 17:11:14.223 53 ERROR networking_ovn.common.maintenance   File "/usr/lib/python2.7/site-packages/networking_ovn/common/maintenance.py", line 232, in check_for_inconsistencies
2019-07-18 17:11:14.223 53 ERROR networking_ovn.common.maintenance     self._fix_create_update(row)
2019-07-18 17:11:14.223 53 ERROR networking_ovn.common.maintenance   File "/usr/lib/python2.7/site-packages/networking_ovn/common/maintenance.py", line 160, in _fix_create_update
2019-07-18 17:11:14.223 53 ERROR networking_ovn.common.maintenance     res_map['ovn_create'](n_obj)
2019-07-18 17:11:14.223 53 ERROR networking_ovn.common.maintenance   File "/usr/lib/python2.7/site-packages/networking_ovn/common/ovn_client.py", line 350, in create_port
2019-07-18 17:11:14.223 53 ERROR networking_ovn.common.maintenance     self.add_txns_to_sync_port_dns_records(txn, port)
2019-07-18 17:11:14.223 53 ERROR networking_ovn.common.maintenance   File "/usr/lib64/python2.7/contextlib.py", line 24, in __exit__
2019-07-18 17:11:14.223 53 ERROR networking_ovn.common.maintenance     self.gen.next()
2019-07-18 17:11:14.223 53 ERROR networking_ovn.common.maintenance   File "/usr/lib/python2.7/site-packages/networking_ovn/ovsdb/impl_idl_ovn.py", line 139, in transaction
2019-07-18 17:11:14.223 53 ERROR networking_ovn.common.maintenance     yield t
2019-07-18 17:11:14.223 53 ERROR networking_ovn.common.maintenance   File "/usr/lib64/python2.7/contextlib.py", line 24, in __exit__
2019-07-18 17:11:14.223 53 ERROR networking_ovn.common.maintenance     self.gen.next()
2019-07-18 17:11:14.223 53 ERROR networking_ovn.common.maintenance   File "/usr/lib/python2.7/site-packages/ovsdbapp/api.py", line 102, in transaction
2019-07-18 17:11:14.223 53 ERROR networking_ovn.common.maintenance     del self._nested_txns_map[cur_thread_id]
2019-07-18 17:11:14.223 53 ERROR networking_ovn.common.maintenance   File "/usr/lib/python2.7/site-packages/ovsdbapp/api.py", line 59, in __exit__
2019-07-18 17:11:14.223 53 ERROR networking_ovn.common.maintenance     self.result = self.commit()
2019-07-18 17:11:14.223 53 ERROR networking_ovn.common.maintenance   File "/usr/lib/python2.7/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 62, in commit
2019-07-18 17:11:14.223 53 ERROR networking_ovn.common.maintenance     raise result.ex
2019-07-18 17:11:14.223 53 ERROR networking_ovn.common.maintenance RuntimeError: OVSDB Error: {"details":"Transaction causes multiple rows in \"Logical_Switch_Port\" table to have identical values (\"2e74e4b2-f0ab-493b-bcdc-aa54273374b1\") for index on column \"name\". First row, with UUID f85f5e43-bada-41e8-b3e0-9ec6edc04689, was inserted by this transaction. Second row, with UUID 65449685-0cad-45d3-9dcc-ad957615c606, existed in the database before this transaction and was not modified by the transaction.","error":"constraint violation"}
2019-07-18 17:11:14.223 53 ERROR networking_ovn.common.maintenance
```

It looks like setting this on all the nodes (controller + computes) solved the issue:

```
ovs-vsctl set open . external_ids:ovn-remote-probe-interval=180000
```

This issue is related to https://bugzilla.redhat.com/show_bug.cgi?id=1723463
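As a minimal sketch of applying the workaround (not a verified procedure, just the command above plus a read-back check), on each controller and compute node:

```
# Raise the OVSDB inactivity probe from the 5 s seen in the log to 3 minutes.
# The value is in milliseconds; this must be set on every node, not just one.
ovs-vsctl set open . external_ids:ovn-remote-probe-interval=180000

# Read the value back from the Open_vSwitch table to confirm it was stored.
ovs-vsctl get open . external_ids:ovn-remote-probe-interval
```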
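Separately, a hedged diagnostic sketch for the constraint violation itself: the commands below assume the OVN northbound database listens on tcp:10.10.10.10:6641 (the tcp:10.10.10.10:6642 endpoint in the log is the southbound DB), which is the usual port but is not confirmed in this report.

```
# Show the pre-existing Logical_Switch_Port row whose name collides with the
# one the maintenance task keeps trying to insert.
ovn-nbctl --db=tcp:10.10.10.10:6641 find Logical_Switch_Port \
    name=2e74e4b2-f0ab-493b-bcdc-aa54273374b1

# Inspect that row by the UUID reported in the OVSDB error.
ovn-nbctl --db=tcp:10.10.10.10:6641 list Logical_Switch_Port \
    65449685-0cad-45d3-9dcc-ad957615c606
```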