Bug 2048418 - OSP 13, neutron ml2-ovs, DPDK - stop/wait/start of nova instance sometimes deletes tbr and yields Error executing command: RowNotFound: Cannot find Bridge with name=
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: Red Hat OpenStack
Classification: Red Hat
Component: openstack-neutron
Version: 13.0 (Queens)
Hardware: All
OS: All
Priority: high
Severity: high
Target Milestone: async
Target Release: 13.0 (Queens)
Assignee: Miguel Lavalle
QA Contact: Eran Kuris
URL:
Whiteboard:
Depends On:
Blocks:
 
Reported: 2022-01-31 08:49 UTC by Rohini Diwakar
Modified: 2023-10-12 16:58 UTC
CC: 14 users

Fixed In Version: openstack-neutron-12.1.1-42.6.el7ost
Doc Type: No Doc Update
Doc Text:
Clone Of:
Environment:
Last Closed: 2023-06-28 15:07:35 UTC
Target Upstream Version:
Embargoed:


Attachments


Links:
- Red Hat Bugzilla 1655177 (last updated 2022-02-01 15:56:09 UTC)
- Red Hat Issue Tracker OSP-12374 (last updated 2022-01-31 08:52:12 UTC)
- Red Hat Product Errata RHBA-2023:3902 (last updated 2023-06-28 15:07:45 UTC)

Description Rohini Diwakar 2022-01-31 08:49:38 UTC
Description of problem:
After running "openstack server stop" followed by "openstack server start", instances sometimes did not come back up and stayed in SHUTOFF state.

The issue is the same as the one described in BZ-1810451.
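The failure is intermittent, so hitting it means cycling the instance repeatedly. A minimal reproducer sketch (the server name and iteration count are placeholders; nothing guarantees the race triggers on any particular iteration):

```shell
#!/bin/sh
# Repeatedly stop a server, wait until nova reports SHUTOFF, then start
# it again, to try to hit the race that deletes the trunk bridge.
# Server name and iteration count are placeholders.
cycle_server() {
    server="$1"
    iterations="$2"
    i=1
    while [ "$i" -le "$iterations" ]; do
        openstack server stop "$server"
        # Wait for the instance to be fully stopped before restarting.
        until [ "$(openstack server show -f value -c status "$server")" = "SHUTOFF" ]; do
            sleep 2
        done
        openstack server start "$server"
        # When the re-plug fails, the instance stays in SHUTOFF; check
        # nova-compute.log and openvswitch-agent.log at that point.
        i=$((i + 1))
    done
}
```

Run as, for example, `cycle_server demo-vm 20`; when an iteration fails, the instance remains in SHUTOFF and the errors shown in the logs below appear.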

In nova-compute.log,

2022-01-29 01:24:29.080 62 ERROR vif_plug_ovs.linux_net [-] Unable to execute ['ovs-vsctl', '--timeout=120', '--', 'set', 'interface', u'vhu7003efbf-21', 'mtu_request=9000']. Exception: Unexpected error while running command.
Command: ovs-vsctl --timeout=120 -- set interface vhu7003efbf-21 mtu_request=9000
Exit code: 1
Stdout: u''
Stderr: u'ovs-vsctl: no row "vhu7003efbf-21" in table Interface\n'
2022-01-29 01:24:29.082 8 ERROR os_vif [req-be2864cb-073c-4e8a-88ad-d07c2d28e094 7223b4f08dc7459d8563318e17a8019f 541dedf4c78243cf9d1c624a3427c91f - default default] Failed to plug vif VIFVHostUser(active=True,address=fa:16:3e:a2:4b:5d,has_traffic_filtering=False,id=7003efbf-2185-412d-b43b-714592b7212b,mode='server',network=Network(0fde5720-ef2b-4038-8739-e7ede246e192),path='/var/lib/vhost_sockets/vhu7003efbf-21',plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='vhu7003efbf-21'): AgentError: Error during following call to agent: ['ovs-vsctl', '--timeout=120', '--', 'set', 'interface', u'vhu7003efbf-21', 'mtu_request=9000']
2022-01-29 01:24:29.082 8 ERROR os_vif Traceback (most recent call last):
2022-01-29 01:24:29.082 8 ERROR os_vif   File "/usr/lib/python2.7/site-packages/os_vif/init.py", line 77, in plug
2022-01-29 01:24:29.082 8 ERROR os_vif     plugin.plug(vif, instance_info)
2022-01-29 01:24:29.082 8 ERROR os_vif   File "/usr/lib/python2.7/site-packages/vif_plug_ovs/ovs.py", line 213, in plug
2022-01-29 01:24:29.082 8 ERROR os_vif     self._plug_vhostuser(vif, instance_info)
2022-01-29 01:24:29.082 8 ERROR os_vif   File "/usr/lib/python2.7/site-packages/vif_plug_ovs/ovs.py", line 148, in _plug_vhostuser
2022-01-29 01:24:29.082 8 ERROR os_vif     vif, vif_name, instance_info, *args)
2022-01-29 01:24:29.082 8 ERROR os_vif   File "/usr/lib/python2.7/site-packages/vif_plug_ovs/ovs.py", line 119, in _create_vif_port
2022-01-29 01:24:29.082 8 ERROR os_vif     *kwargs)
2022-01-29 01:24:29.082 8 ERROR os_vif   File "/usr/lib/python2.7/site-packages/oslo_privsep/priv_context.py", line 207, in _wrap
2022-01-29 01:24:29.082 8 ERROR os_vif     return self.channel.remote_call(name, args, kwargs)
2022-01-29 01:24:29.082 8 ERROR os_vif   File "/usr/lib/python2.7/site-packages/oslo_privsep/daemon.py", line 202, in remote_call
2022-01-29 01:24:29.082 8 ERROR os_vif     raise exc_type(*result[2])
2022-01-29 01:24:29.082 8 ERROR os_vif AgentError: Error during following call to agent: ['ovs-vsctl', '--timeout=120', '--', 'set', 'interface', u'vhu7003efbf-21', 'mtu_request=9000']
2022-01-29 01:24:29.082 8 ERROR os_vif 
2022-01-29 01:24:29.085 8 ERROR nova.virt.libvirt.driver [req-be2864cb-073c-4e8a-88ad-d07c2d28e094 7223b4f08dc7459d8563318e17a8019f 541dedf4c78243cf9d1c624a3427c91f - default default] [instance: 3a4ce996-476c-4db3-bdb4-f99dae7dcb72] Failed to start libvirt guest: InternalError: Failure running os_vif plugin plug method: Failed to plug VIF VIFVHostUser(active=True,address=fa:16:3e:a2:4b:5d,has_traffic_filtering=False,id=7003efbf-2185-412d-b43b-714592b7212b,mode='server',network=Network(0fde5720-ef2b-4038-8739-e7ede246e192),path='/var/lib/vhost_sockets/vhu7003efbf-21',plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='vhu7003efbf-21'). Got error: Error during following call to agent: ['ovs-vsctl', '--timeout=120', '--', 'set', 'interface', u'vhu7003efbf-21', 'mtu_request=9000']

Around the same time in openvswitch-agent.log,

2022-01-29 01:24:25.018 409276 DEBUG neutron.agent.linux.async_process [-] Output received from [ovsdb-client monitor tcp:127.0.0.1:6640 Interface name,ofport,external_ids --format=json]: {"data":[["aef86567-1835-43e9-ad53-4bc70ffef38e","delete","vhu7003efbf-21",1,["map",[["attached-mac","fa:16:3e:a2:4b:5d"],["bridge_name","tbr-6fa74ed9-4"],["iface-id","7003efbf-2185-412d-b43b-714592b7212b"],["iface-status","active"],["subport_ids","["1a892b9d-5bed-43ad-a13e-f85b9c3c9b23"]"],["trunk_id","6fa74ed9-4e10-4ed8-a9ab-25fe0a8e1f98"],["vm-uuid","3a4ce996-476c-4db3-bdb4-f99dae7dcb72"]]]]],"headings":["row","action","name","ofport","external_ids"]} _read_stdout /usr/lib/python2.7/site-packages/neutron/agent/linux/async_process.py:260
[...]
2022-01-29 01:24:34.586 409276 ERROR ovsdbapp.backend.ovs_idl.command [-] Error executing command: RowNotFound: Cannot find Bridge with name=tbr-6fa74ed9-4
2022-01-29 01:24:34.586 409276 ERROR ovsdbapp.backend.ovs_idl.command Traceback (most recent call last):
2022-01-29 01:24:34.586 409276 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python2.7/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 38, in execute
2022-01-29 01:24:34.586 409276 ERROR ovsdbapp.backend.ovs_idl.command     self.run_idl(None)
2022-01-29 01:24:34.586 409276 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python2.7/site-packages/ovsdbapp/schema/open_vswitch/commands.py", line 335, in run_idl
2022-01-29 01:24:34.586 409276 ERROR ovsdbapp.backend.ovs_idl.command     br = idlutils.row_by_value(self.api.idl, 'Bridge', 'name', self.bridge)
2022-01-29 01:24:34.586 409276 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python2.7/site-packages/ovsdbapp/backend/ovs_idl/idlutils.py", line 63, in row_by_value
2022-01-29 01:24:34.586 409276 ERROR ovsdbapp.backend.ovs_idl.command     raise RowNotFound(table=table, col=column, match=match)
2022-01-29 01:24:34.586 409276 ERROR ovsdbapp.backend.ovs_idl.command RowNotFound: Cannot find Bridge with name=tbr-6fa74ed9-4
2022-01-29 01:24:34.586 409276 ERROR ovsdbapp.backend.ovs_idl.command 
2022-01-29 01:24:34.587 409276 ERROR neutron.services.trunk.drivers.openvswitch.agent.ovsdb_handler [-] Cannot obtain interface list for bridge tbr-6fa74ed9-4: Cannot find Bridge with name=tbr-6fa74ed9-4: RowNotFound: Cannot find Bridge with name=tbr-6fa74ed9-4
2022-01-29 01:24:34.588 409276 ERROR ovsdbapp.backend.ovs_idl.command [-] Error executing command: RowNotFound: Cannot find Bridge with name=tbr-6fa74ed9-4
2022-01-29 01:24:34.588 409276 ERROR ovsdbapp.backend.ovs_idl.command Traceback (most recent call last):
2022-01-29 01:24:34.588 409276 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python2.7/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 38, in execute
2022-01-29 01:24:34.588 409276 ERROR ovsdbapp.backend.ovs_idl.command     self.run_idl(None)
2022-01-29 01:24:34.588 409276 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python2.7/site-packages/ovsdbapp/schema/open_vswitch/commands.py", line 325, in run_idl
2022-01-29 01:24:34.588 409276 ERROR ovsdbapp.backend.ovs_idl.command     br = idlutils.row_by_value(self.api.idl, 'Bridge', 'name', self.bridge)
2022-01-29 01:24:34.588 409276 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python2.7/site-packages/ovsdbapp/backend/ovs_idl/idlutils.py", line 63, in row_by_value
2022-01-29 01:24:34.588 409276 ERROR ovsdbapp.backend.ovs_idl.command     raise RowNotFound(table=table, col=column, match=match)
2022-01-29 01:24:34.588 409276 ERROR ovsdbapp.backend.ovs_idl.command RowNotFound: Cannot find Bridge with name=tbr-6fa74ed9-4
2022-01-29 01:24:34.588 409276 ERROR ovsdbapp.backend.ovs_idl.command 
2022-01-29 01:24:34.588 409276 ERROR neutron.services.trunk.drivers.openvswitch.agent.ovsdb_handler [-] Failed to store metadata for trunk 6fa74ed9-4e10-4ed8-a9ab-25fe0a8e1f98: Cannot find Bridge with name=tbr-6fa74ed9-4: RowNotFound: Cannot find Bridge with name=tbr-6fa74ed9-4
2022-01-29 01:24:34.588 409276 DEBUG neutron.services.trunk.drivers.openvswitch.agent.trunk_manager [-] Trunk bridge with ID 6fa74ed9-4e10-4ed8-a9ab-25fe0a8e1f98 does not exist. remove_trunk /usr/lib/python2.7/site-packages/neutron/services/trunk/drivers/openvswitch/agent/trunk_manager.py:231
2022-01-29 01:24:34.589 409276 DEBUG neutron.services.trunk.drivers.openvswitch.agent.ovsdb_handler [-] Deleted resources associated to trunk: 6fa74ed9-4e10-4ed8-a9ab-25fe0a8e1f98 handle_trunk_remove /usr/lib/python2.7/site-packages/neutron/services/trunk/drivers/openvswitch/agent/ovsdb_handler.py:241
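The sequence above shows the race: the agent sees the delete event for the vhu port, fails to look up the trunk bridge (RowNotFound), and then tears down the trunk's resources even though nova is about to re-plug the port. A minimal, self-contained sketch of a defensive pattern for this situation (this is an illustration with stand-in classes, not the actual neutron/ovsdbapp code):

```python
# Schematic illustration of the race and a guarded handler. RowNotFound
# and FakeOvsdb are stand-ins for the real ovsdbapp types; this is not
# the code of the shipped fix.

class RowNotFound(Exception):
    """Stand-in for ovsdbapp.backend.ovs_idl.idlutils.RowNotFound."""


class FakeOvsdb:
    """Stand-in for the OVSDB IDL, whose cached view can lag behind
    the live database (the source of the transient lookup failure)."""

    def __init__(self, real_bridges, cached_bridges=None):
        self.real = set(real_bridges)
        self.cached = set(real_bridges if cached_bridges is None
                          else cached_bridges)

    def list_ifaces(self, bridge):
        # The IDL answers from its (possibly stale) cached view.
        if bridge not in self.cached:
            raise RowNotFound(f"Cannot find Bridge with name={bridge}")
        return []  # interface listing elided in this sketch

    def bridge_exists(self, bridge):
        # Authoritative re-check against the live database.
        return bridge in self.real


def handle_port_delete(ovsdb, bridge, remove_trunk):
    """Handle a vhu-port delete event for a trunk bridge.

    Instead of treating any RowNotFound as "the trunk is gone" and
    removing its resources (the behaviour seen in the log above), only
    tear down the trunk once the bridge is confirmed absent; a transient
    lookup failure during an instance restart is left for a later event
    to retry.
    """
    try:
        return ovsdb.list_ifaces(bridge)
    except RowNotFound:
        if not ovsdb.bridge_exists(bridge):
            remove_trunk(bridge)  # bridge really gone: clean up
        # else: stale view during a restart; do nothing and retry later
        return None
```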

Version-Release number of selected component (if applicable):
RHOSP13


Additional info:
$ rpm -qa | grep openstack-neutron
openstack-neutron-l2gw-agent-12.0.2-0.20200729125107.fe947e3.el7ost.noarch
openstack-neutron-metering-agent-12.1.1-35.1.el7ost.noarch
openstack-neutron-common-12.1.1-35.1.el7ost.noarch
openstack-neutron-ml2-12.1.1-35.1.el7ost.noarch
openstack-neutron-sriov-nic-agent-12.1.1-35.1.el7ost.noarch
openstack-neutron-12.1.1-35.1.el7ost.noarch
openstack-neutron-lbaas-12.0.1-0.20190803015156.b86fcef.el7ost.noarch
openstack-neutron-linuxbridge-12.1.1-35.1.el7ost.noarch
openstack-neutron-lbaas-ui-4.0.1-0.20200422165059.16919a9.el7ost.noarch
openstack-neutron-openvswitch-12.1.1-35.1.el7ost.noarch

Comment 27 errata-xmlrpc 2023-06-28 15:07:35 UTC
Since the problem described in this bug report should be resolved in a recent advisory, it has been closed with a resolution of ERRATA.

For information on the advisory (Red Hat OpenStack Platform 13.0 bug fix and enhancement advisory), and where to find the updated files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHBA-2023:3902

