Bug 1967993 - tempest.api.compute.servers.test_server_actions.ServerActionsTestJSON.test_reboot_server_hard fails w/ timeout
Keywords:
Status: CLOSED DUPLICATE of bug 1965897
Alias: None
Product: Red Hat OpenStack
Classification: Red Hat
Component: openstack-nova
Version: 16.2 (Train)
Hardware: x86_64
OS: Linux
Priority: unspecified
Severity: low
Target Milestone: ---
Target Release: ---
Assignee: OSP DFG:Compute
QA Contact: OSP DFG:Compute
URL:
Whiteboard:
Depends On:
Blocks:
 
Reported: 2021-06-04 16:32 UTC by wes hayutin
Modified: 2023-03-21 19:44 UTC
CC List: 7 users

Fixed In Version:
Doc Type: If docs needed, set a value
Doc Text:
Clone Of:
Environment:
Last Closed: 2021-06-07 12:55:35 UTC
Target Upstream Version:
Embargoed:


Attachments: none


Links:
Red Hat Issue Tracker OSP-4392 (Priority: None, Status: None, Last Updated: 2022-08-17 13:49:46 UTC)

Description wes hayutin 2021-06-04 16:32:52 UTC
Description of problem:

Skipping this test. This bug documents the skip and allows for further debugging if needed. This may not be a consistent error.

https://sf.hosted.upshift.rdu2.redhat.com/logs/94/211494/41/check/periodic-tripleo-ci-rhel-8-bm_envD-3ctlr_1comp-featureset035-rhos-16.2/0a9bee9/logs/undercloud/var/log/tempest/stestr_results.html

WARN: failed: route add -net "0.0.0.0/0" gw "10.100.0.1"
OK
checking http://169.254.169.254/2009-04-04/instance-id
failed 1/20: up 20.75. request failed
failed 2/20: up 70.19. request failed
failed 3/20: up 119.52. request failed
failed 4/20: up 168.91. request failed
failed 5/20: up 218.25. request failed
failed 6/20: up 267.62. request failed

2021-06-04 13:38:14,253 334976 INFO     [tempest.lib.common.rest_client] Request (ServerActionsTestJSON:tearDown): 200 GET https://10.9.122.93:13774/v2.1/servers/950baa43-c0d3-4d4a-83b9-8593eca9d377 2.227s
2021-06-04 13:38:14,254 334976 DEBUG    [tempest.lib.common.rest_client] Request - Headers: {'Content-Type': 'application/json', 'Accept': 'application/json', 'X-OpenStack-Nova-API-Version': '2.1', 'X-Auth-Token': '<omitted>'}
        Body: None
    Response - Headers: {'date': 'Fri, 04 Jun 2021 13:38:12 GMT', 'server': 'Apache', 'content-length': '1724', 'openstack-api-version': 'compute 2.1', 'x-openstack-nova-api-version': '2.1', 'vary': 'OpenStack-API-Version,X-OpenStack-Nova-API-Version,Accept-Encoding', 'x-openstack-request-id': 'req-7cf22697-205a-4a74-ac92-7ec21279222d', 'x-compute-request-id': 'req-7cf22697-205a-4a74-ac92-7ec21279222d', 'connection': 'close', 'content-type': 'application/json', 'status': '200', 'content-location': 'https://10.9.122.93:13774/v2.1/servers/950baa43-c0d3-4d4a-83b9-8593eca9d377'}
        Body: b'{"server": {"id": "950baa43-c0d3-4d4a-83b9-8593eca9d377", "name": "tempest-ServerActionsTestJSON-server-44596450", "status": "ACTIVE", "tenant_id": "c2428407831f45a1b8fdd9dc8e131303", "user_id": "eae567d8d5f64c1d8885f171fb7f3e6c", "metadata": {}, "hostId": "924d71b07b07d99c8ffbdcf5f672ecb2a06d9b85dd409334d0e1fc02", "image": {"id": "f635a7c6-468b-41c1-9797-16906ee64625", "links": [{"rel": "bookmark", "href": "https://10.9.122.93:13774/images/f635a7c6-468b-41c1-9797-16906ee64625"}]}, "flavor": {"id": "39a0c386-097b-42d7-b556-ee9e0e1f714d", "links": [{"rel": "bookmark", "href": "https://10.9.122.93:13774/flavors/39a0c386-097b-42d7-b556-ee9e0e1f714d"}]}, "created": "2021-06-04T13:28:03Z", "updated": "2021-06-04T13:32:49Z", "addresses": {"tempest-ServerActionsTestJSON-2135000759-network": [{"version": 4, "addr": "10.100.0.10", "OS-EXT-IPS:type": "fixed", "OS-EXT-IPS-MAC:mac_addr": "fa:16:3e:13:8b:f6"}, {"version": 4, "addr": "10.9.122.89", "OS-EXT-IPS:type": "floating", "OS-EXT-IPS-MAC:mac_addr": "fa:16:3e:13:8b:f6"}]}, "accessIPv4": "", "accessIPv6": "", "links": [{"rel": "self", "href": "https://10.9.122.93:13774/v2.1/servers/950baa43-c0d3-4d4a-83b9-8593eca9d377"}, {"rel": "bookmark", "href": "https://10.9.122.93:13774/servers/950baa43-c0d3-4d4a-83b9-8593eca9d377"}], "OS-DCF:diskConfig": "MANUAL", "progress": 0, "OS-EXT-AZ:availability_zone": "nova", "config_drive": "", "key_name": "tempest-keypair-855034991", "OS-SRV-USG:launched_at": "2021-06-04T13:28:17.000000", "OS-SRV-USG:terminated_at": null, "security_groups": [{"name": "tempest-securitygroup--30906850"}], "OS-EXT-STS:task_state": null, "OS-EXT-STS:vm_state": "active", "OS-EXT-STS:power_state": 1, "os-extended-volumes:volumes_attached": []}}'

Traceback (most recent call last):
  File "/usr/lib/python3.6/site-packages/tempest/lib/common/ssh.py", line 113, in _get_ssh_connection
    sock=proxy_chan)
  File "/usr/lib/python3.6/site-packages/paramiko/client.py", line 362, in connect
    raise NoValidConnectionsError(errors)
paramiko.ssh_exception.NoValidConnectionsError: [Errno None] Unable to connect to port 22 on 10.9.122.89

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.6/site-packages/tempest/api/compute/servers/test_server_actions.py", line 162, in test_reboot_server_hard
    self._test_reboot_server('HARD')
  File "/usr/lib/python3.6/site-packages/tempest/api/compute/servers/test_server_actions.py", line 133, in _test_reboot_server
    boot_time = linux_client.get_boot_time()
  File "/usr/lib/python3.6/site-packages/tempest/common/utils/linux/remote_client.py", line 85, in get_boot_time
    boot_secs = self.exec_command(cmd)
  File "/usr/lib/python3.6/site-packages/tempest/lib/common/utils/linux/remote_client.py", line 59, in wrapper
    six.reraise(*original_exception)
  File "/usr/lib/python3.6/site-packages/six.py", line 675, in reraise
    raise value
  File "/usr/lib/python3.6/site-packages/tempest/lib/common/utils/linux/remote_client.py", line 32, in wrapper
    return function(self, *args, **kwargs)
  File "/usr/lib/python3.6/site-packages/tempest/lib/common/utils/linux/remote_client.py", line 107, in exec_command
    return self.ssh_client.exec_command(cmd)
  File "/usr/lib/python3.6/site-packages/tempest/lib/common/ssh.py", line 159, in exec_command
    ssh = self._get_ssh_connection()
  File "/usr/lib/python3.6/site-packages/tempest/lib/common/ssh.py", line 129, in _get_ssh_connection
    password=self.password)
tempest.lib.exceptions.SSHTimeout: Connection to the 10.9.122.89 via SSH timed out.
User: cirros, Password: O7^bhec+EpVpckw
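
For reference, the test fails at the point where tempest opens an SSH connection to the guest (user cirros on the floating IP 10.9.122.89) to read its boot time, which it later compares after the hard reboot. A minimal sketch of that SSH check, assuming paramiko is available locally and reusing the IP and credentials printed in the traceback above (adjust as needed):

# Rough reproduction of the SSH step that times out in the test.
# Assumptions: paramiko installed; floating IP and cirros credentials are the
# ones from the SSHTimeout message above.
import time
import paramiko

FLOATING_IP = '10.9.122.89'
USERNAME = 'cirros'
PASSWORD = 'O7^bhec+EpVpckw'  # from the traceback above

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect(FLOATING_IP, username=USERNAME, password=PASSWORD,
               timeout=30, allow_agent=False, look_for_keys=False)

# Same idea as tempest's get_boot_time(): derive boot time from /proc/uptime.
_, stdout, _ = client.exec_command('cut -f1 -d. /proc/uptime')
uptime_secs = int(stdout.read().decode().strip())
print('guest booted at:', time.ctime(time.time() - uptime_secs))
client.close()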

Comment 1 wes hayutin 2021-06-04 17:07:04 UTC
[instance: 5000c59e-f9dd-4c25-9529-f281b1e53ff3] detaching network adapter failed.: libvirt.libvirtError: internal error: End of file from qemu monitor
2021-06-04 13:01:20.475 ERROR /var/log/containers/nova/nova-compute.log.1: 8 ERROR nova.virt.libvirt.driver [instance: 5000c59e-f9dd-4c25-9529-f281b1e53ff3] Traceback (most recent call last):
2021-06-04 13:01:20.475 ERROR /var/log/containers/nova/nova-compute.log.1: 8 ERROR nova.virt.libvirt.driver [instance: 5000c59e-f9dd-4c25-9529-f281b1e53ff3]   File "/usr/lib/python3.6/site-packages/nova/virt/libvirt/driver.py", line 2373, in detach_interface
2021-06-04 13:01:20.475 ERROR /var/log/containers/nova/nova-compute.log.1: 8 ERROR nova.virt.libvirt.driver [instance: 5000c59e-f9dd-4c25-9529-f281b1e53ff3]     supports_device_missing_error_code=supports_device_missing)
2021-06-04 13:01:20.475 ERROR /var/log/containers/nova/nova-compute.log.1: 8 ERROR nova.virt.libvirt.driver [instance: 5000c59e-f9dd-4c25-9529-f281b1e53ff3]   File "/usr/lib/python3.6/site-packages/nova/virt/libvirt/guest.py", line 466, in detach_device_with_retry
2021-06-04 13:01:20.475 ERROR /var/log/containers/nova/nova-compute.log.1: 8 ERROR nova.virt.libvirt.driver [instance: 5000c59e-f9dd-4c25-9529-f281b1e53ff3]     _try_detach_device(conf, persistent, live)
2021-06-04 13:01:20.475 ERROR /var/log/containers/nova/nova-compute.log.1: 8 ERROR nova.virt.libvirt.driver [instance: 5000c59e-f9dd-4c25-9529-f281b1e53ff3]   File "/usr/lib/python3.6/site-packages/nova/virt/libvirt/guest.py", line 455, in _try_detach_device
2021-06-04 13:01:20.475 ERROR /var/log/containers/nova/nova-compute.log.1: 8 ERROR nova.virt.libvirt.driver [instance: 5000c59e-f9dd-4c25-9529-f281b1e53ff3]     ctx.reraise = True
2021-06-04 13:01:20.475 ERROR /var/log/containers/nova/nova-compute.log.1: 8 ERROR nova.virt.libvirt.driver [instance: 5000c59e-f9dd-4c25-9529-f281b1e53ff3]   File "/usr/lib/python3.6/site-packages/oslo_utils/excutils.py", line 220, in __exit__
2021-06-04 13:01:20.475 ERROR /var/log/containers/nova/nova-compute.log.1: 8 ERROR nova.virt.libvirt.driver [instance: 5000c59e-f9dd-4c25-9529-f281b1e53ff3]     self.force_reraise()
2021-06-04 13:01:20.475 ERROR /var/log/containers/nova/nova-compute.log.1: 8 ERROR nova.virt.libvirt.driver [instance: 5000c59e-f9dd-4c25-9529-f281b1e53ff3]   File "/usr/lib/python3.6/site-packages/oslo_utils/excutils.py", line 196, in force_reraise
2021-06-04 13:01:20.475 ERROR /var/log/containers/nova/nova-compute.log.1: 8 ERROR nova.virt.libvirt.driver [instance: 5000c59e-f9dd-4c25-9529-f281b1e53ff3]     six.reraise(self.type_, self.value, self.tb)
2021-06-04 13:01:20.475 ERROR /var/log/containers/nova/nova-compute.log.1: 8 ERROR nova.virt.libvirt.driver [instance: 5000c59e-f9dd-4c25-9529-f281b1e53ff3]   File "/usr/lib/python3.6/site-packages/six.py", line 675, in reraise
2021-06-04 13:01:20.475 ERROR /var/log/containers/nova/nova-compute.log.1: 8 ERROR nova.virt.libvirt.driver [instance: 5000c59e-f9dd-4c25-9529-f281b1e53ff3]     raise value
2021-06-04 13:01:20.475 ERROR /var/log/containers/nova/nova-compute.log.1: 8 ERROR nova.virt.libvirt.driver [instance: 5000c59e-f9dd-4c25-9529-f281b1e53ff3]   File "/usr/lib/python3.6/site-packages/nova/virt/libvirt/guest.py", line 408, in _try_detach_device
2021-06-04 13:01:20.475 ERROR /var/log/containers/nova/nova-compute.log.1: 8 ERROR nova.virt.libvirt.driver [instance: 5000c59e-f9dd-4c25-9529-f281b1e53ff3]     self.detach_device(conf, persistent=persistent, live=live)
2021-06-04 13:01:20.475 ERROR /var/log/containers/nova/nova-compute.log.1: 8 ERROR nova.virt.libvirt.driver [instance: 5000c59e-f9dd-4c25-9529-f281b1e53ff3]   File "/usr/lib/python3.6/site-packages/nova/virt/libvirt/guest.py", line 516, in detach_device
2021-06-04 13:01:20.475 ERROR /var/log/containers/nova/nova-compute.log.1: 8 ERROR nova.virt.libvirt.driver [instance: 5000c59e-f9dd-4c25-9529-f281b1e53ff3]     self._domain.detachDeviceFlags(device_xml, flags=flags)
2021-06-04 13:01:20.475 ERROR /var/log/containers/nova/nova-compute.log.1: 8 ERROR nova.virt.libvirt.driver [instance: 5000c59e-f9dd-4c25-9529-f281b1e53ff3]   File "/usr/lib/python3.6/site-packages/eventlet/tpool.py", line 190, in doit
2021-06-04 13:01:20.475 ERROR /var/log/containers/nova/nova-compute.log.1: 8 ERROR nova.virt.libvirt.driver [instance: 5000c59e-f9dd-4c25-9529-f281b1e53ff3]     result = proxy_call(self._autowrap, f, *args, **kwargs)
2021-06-04 13:01:20.475 ERROR /var/log/containers/nova/nova-compute.log.1: 8 ERROR nova.virt.libvirt.driver [instance: 5000c59e-f9dd-4c25-9529-f281b1e53ff3]   File "/usr/lib/python3.6/site-packages/eventlet/tpool.py", line 148, in proxy_call
2021-06-04 13:01:20.475 ERROR /var/log/containers/nova/nova-compute.log.1: 8 ERROR nova.virt.libvirt.driver [instance: 5000c59e-f9dd-4c25-9529-f281b1e53ff3]     rv = execute(f, *args, **kwargs)
2021-06-04 13:01:20.475 ERROR /var/log/containers/nova/nova-compute.log.1: 8 ERROR nova.virt.libvirt.driver [instance: 5000c59e-f9dd-4c25-9529-f281b1e53ff3]   File "/usr/lib/python3.6/site-packages/eventlet/tpool.py", line 129, in execute
2021-06-04 13:01:20.475 ERROR /var/log/containers/nova/nova-compute.log.1: 8 ERROR nova.virt.libvirt.driver [instance: 5000c59e-f9dd-4c25-9529-f281b1e53ff3]     six.reraise(c, e, tb)
2021-06-04 13:01:20.475 ERROR /var/log/containers/nova/nova-compute.log.1: 8 ERROR nova.virt.libvirt.driver [instance: 5000c59e-f9dd-4c25-9529-f281b1e53ff3]   File "/usr/lib/python3.6/site-packages/six.py", line 675, in reraise
2021-06-04 13:01:20.475 ERROR /var/log/containers/nova/nova-compute.log.1: 8 ERROR nova.virt.libvirt.driver [instance: 5000c59e-f9dd-4c25-9529-f281b1e53ff3]     raise value
2021-06-04 13:01:20.475 ERROR /var/log/containers/nova/nova-compute.log.1: 8 ERROR nova.virt.libvirt.driver [instance: 5000c59e-f9dd-4c25-9529-f281b1e53ff3]   File "/usr/lib/python3.6/site-packages/eventlet/tpool.py", line 83, in tworker
2021-06-04 13:01:20.475 ERROR /var/log/containers/nova/nova-compute.log.1: 8 ERROR nova.virt.libvirt.driver [instance: 5000c59e-f9dd-4c25-9529-f281b1e53ff3]     rv = meth(*args, **kwargs)
2021-06-04 13:01:20.475 ERROR /var/log/containers/nova/nova-compute.log.1: 8 ERROR nova.virt.libvirt.driver [instance: 5000c59e-f9dd-4c25-9529-f281b1e53ff3]   File "/usr/lib64/python3.6/site-packages/libvirt.py", line 1534, in detachDeviceFlags
2021-06-04 13:01:20.475 ERROR /var/log/containers/nova/nova-compute.log.1: 8 ERROR nova.virt.libvirt.driver [instance: 5000c59e-f9dd-4c25-9529-f281b1e53ff3]     raise libvirtError('virDomainDetachDeviceFlags() failed')
2021-06-04 13:01:20.475 ERROR /var/log/containers/nova/nova-compute.log.1: 8 ERROR nova.virt.libvirt.driver [instance: 5000c59e-f9dd-4c25-9529-f281b1e53ff3] libvirt.libvirtError: internal error: End of file from qemu monitor

https://sf.hosted.upshift.rdu2.redhat.com/logs/94/211494/41/check/periodic-tripleo-ci-rhel-8-bm_envD-3ctlr_1comp-featureset035-rhos-16.2/0a9bee9/logs/overcloud-novacompute-0/var/log/extra/errors.txt.txt
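
"End of file from qemu monitor" generally means the qemu process behind the guest went away while libvirt was talking to it, so a first sanity check on the compute node is whether the domain is still defined and running. A diagnostic sketch with libvirt-python, assuming it is run somewhere qemu:///system is reachable (e.g. inside the nova_libvirt container) and using the instance UUID from the traceback above:

# Diagnostic sketch only: check whether the domain from comment 1 still exists
# and what state it is in. Assumes libvirt-python and a reachable libvirtd.
import libvirt

INSTANCE_UUID = '5000c59e-f9dd-4c25-9529-f281b1e53ff3'  # from the traceback above

conn = libvirt.openReadOnly('qemu:///system')
try:
    dom = conn.lookupByUUIDString(INSTANCE_UUID)
    state, reason = dom.state()
    # state 1 == VIR_DOMAIN_RUNNING, 5 == VIR_DOMAIN_SHUTOFF, 6 == VIR_DOMAIN_CRASHED
    print('state:', state, 'reason:', reason, 'active:', bool(dom.isActive()))
except libvirt.libvirtError as err:
    print('domain not found or libvirt error:', err)
finally:
    conn.close()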

Comment 2 wes hayutin 2021-06-04 17:29:51 UTC
2021-06-04 13:28:17.047 8 INFO nova.virt.libvirt.driver [-] [instance: 950baa43-c0d3-4d4a-83b9-8593eca9d377] Instance spawned successfully.
https://sf.hosted.upshift.rdu2.redhat.com/logs/94/211494/41/check/periodic-tripleo-ci-rhel-8-bm_envD-3ctlr_1comp-featureset035-rhos-16.2/0a9bee9/logs/overcloud-novacompute-0/var/log/containers/nova/

2021-06-04 13:29:51.025 8 WARNING nova.compute.manager [req-69401e2a-be18-48c5-bbc0-f6aff638b239 1304f0c3ebd343ecb71577edd51feacd 7892e0c99aef4adf91a0c7be2c2bfdc6 - default default] [instance: 950baa43-c0d3-4d4a-83b9-8593eca9d377] Received unexpected event network-vif-plugged-ac68c2e6-6c0c-4f2c-9810-71ac1176db6e for instance with vm_state active and task_state None.

Comment 3 wes hayutin 2021-06-04 17:33:01 UTC
There are a lot of OVSDB errors on the controller that could be contributing to this:

https://sf.hosted.upshift.rdu2.redhat.com/logs/94/211494/41/check/periodic-tripleo-ci-rhel-8-bm_envD-3ctlr_1comp-featureset035-rhos-16.2/0a9bee9/logs/overcloud-controller-0/var/log/extra/errors.txt.txt

2021-06-04 13:27:04.340 ERROR /var/log/containers/neutron/server.log.1: 19 ERROR ovsdbapp.backend.ovs_idl.command [-] Error executing command: RuntimeError: OVSDB Error: The transaction failed because the IDL has been configured to require a database lock but didn't get it yet or has already lost it
2021-06-04 13:27:04.340 ERROR /var/log/containers/neutron/server.log.1: 19 ERROR ovsdbapp.backend.ovs_idl.command Traceback (most recent call last):
2021-06-04 13:27:04.340 ERROR /var/log/containers/neutron/server.log.1: 19 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.6/site-packages/ovsdbapp/api.py", line 111, in transaction
2021-06-04 13:27:04.340 ERROR /var/log/containers/neutron/server.log.1: 19 ERROR ovsdbapp.backend.ovs_idl.command     yield self._nested_txns_map[cur_thread_id]
2021-06-04 13:27:04.340 ERROR /var/log/containers/neutron/server.log.1: 19 ERROR ovsdbapp.backend.ovs_idl.command KeyError: 140507849345328
2021-06-04 13:27:04.340 ERROR /var/log/containers/neutron/server.log.1: 19 ERROR ovsdbapp.backend.ovs_idl.command 
2021-06-04 13:27:04.340 ERROR /var/log/containers/neutron/server.log.1: 19 ERROR ovsdbapp.backend.ovs_idl.command During handling of the above exception, another exception occurred:
2021-06-04 13:27:04.340 ERROR /var/log/containers/neutron/server.log.1: 19 ERROR ovsdbapp.backend.ovs_idl.command 
2021-06-04 13:27:04.340 ERROR /var/log/containers/neutron/server.log.1: 19 ERROR ovsdbapp.backend.ovs_idl.command Traceback (most recent call last):
2021-06-04 13:27:04.340 ERROR /var/log/containers/neutron/server.log.1: 19 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.6/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 42, in execute
2021-06-04 13:27:04.340 ERROR /var/log/containers/neutron/server.log.1: 19 ERROR ovsdbapp.backend.ovs_idl.command     t.add(self)
2021-06-04 13:27:04.340 ERROR /var/log/containers/neutron/server.log.1: 19 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib64/python3.6/contextlib.py", line 88, in __exit__
2021-06-04 13:27:04.340 ERROR /var/log/containers/neutron/server.log.1: 19 ERROR ovsdbapp.backend.ovs_idl.command     next(self.gen)
2021-06-04 13:27:04.340 ERROR /var/log/containers/neutron/server.log.1: 19 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.6/site-packages/networking_ovn/ovsdb/impl_idl_ovn.py", line 197, in transaction
2021-06-04 13:27:04.340 ERROR /var/log/containers/neutron/server.log.1: 19 ERROR ovsdbapp.backend.ovs_idl.command     yield t
2021-06-04 13:27:04.340 ERROR /var/log/containers/neutron/server.log.1: 19 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib64/python3.6/contextlib.py", line 88, in __exit__
2021-06-04 13:27:04.340 ERROR /var/log/containers/neutron/server.log.1: 19 ERROR ovsdbapp.backend.ovs_idl.command     next(self.gen)
2021-06-04 13:27:04.340 ERROR /var/log/containers/neutron/server.log.1: 19 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.6/site-packages/ovsdbapp/api.py", line 119, in transaction
2021-06-04 13:27:04.340 ERROR /var/log/containers/neutron/server.log.1: 19 ERROR ovsdbapp.backend.ovs_idl.command     del self._nested_txns_map[cur_thread_id]
2021-06-04 13:27:04.340 ERROR /var/log/containers/neutron/server.log.1: 19 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.6/site-packages/ovsdbapp/api.py", line 69, in __exit__
2021-06-04 13:27:04.340 ERROR /var/log/containers/neutron/server.log.1: 19 ERROR ovsdbapp.backend.ovs_idl.command     self.result = self.commit()
2021-06-04 13:27:04.340 ERROR /var/log/containers/neutron/server.log.1: 19 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.6/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 62, in commit
2021-06-04 13:27:04.340 ERROR /var/log/containers/neutron/server.log.1: 19 ERROR ovsdbapp.backend.ovs_idl.command     raise result.ex
2021-06-04 13:27:04.340 ERROR /var/log/containers/neutron/server.log.1: 19 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.6/site-packages/ovsdbapp/backend/ovs_idl/connection.py", line 128, in run
2021-06-04 13:27:04.340 ERROR /var/log/containers/neutron/server.log.1: 19 ERROR ovsdbapp.backend.ovs_idl.command     txn.results.put(txn.do_commit())
2021-06-04 13:27:04.340 ERROR /var/log/containers/neutron/server.log.1: 19 ERROR ovsdbapp.backend.ovs_idl.command   File "/usr/lib/python3.6/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 115, in do_commit
2021-06-04 13:27:04.340 ERROR /var/log/containers/neutron/server.log.1: 19 ERROR ovsdbapp.backend.ovs_idl.command     raise RuntimeError(msg)
2021-06-04 13:27:04.340 ERROR /var/log/containers/neutron/server.log.1: 19 ERROR ovsdbapp.backend.ovs_idl.command RuntimeError: OVSDB Error: The transaction failed because the IDL has been configured to require a database lock but didn't get it yet or has already lost it
2021-06-04 13:27:04.340 ERROR /var/log/containers/neutron/server.log.1: 19 ERROR ovsdbapp.backend.ovs_idl.command
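
To see whether these lock errors cluster around the tempest run (roughly 13:28 to 13:38 UTC), something like the following buckets them per minute. A sketch only; it assumes the neutron server log (or the errors.txt extract linked above) has been downloaded locally as server.log.1:

# Count OVSDB "database lock" transaction failures per minute so they can be
# lined up against the test failure window. Path is a local download of the
# neutron server log; adjust as needed.
import collections
import re

LOCK_ERR = re.compile(
    r'(\d{4}-\d{2}-\d{2} \d{2}:\d{2}):\d{2}.*'
    r'OVSDB Error: The transaction failed because the IDL has been configured')

per_minute = collections.Counter()
with open('server.log.1') as fh:
    for line in fh:
        match = LOCK_ERR.search(line)
        if match:
            per_minute[match.group(1)] += 1

for minute in sorted(per_minute):
    print(minute, per_minute[minute])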

Comment 4 wes hayutin 2021-06-04 17:34:17 UTC
Also, the database connection was lost (reported by oslo_db in nova-conductor) around the time the test failed:

2021-06-04 12:37:25.615 ERROR /var/log/containers/nova/nova-conductor.log: 7 ERROR oslo_db.sqlalchemy.engines [-] Database connection was found disconnected; reconnecting: oslo_db.exception.DBConnectionError: (pymysql.err.OperationalError) (2013, 'Lost connection to MySQL server during query')

2021-06-04 12:37:25.615 ERROR /var/log/containers/nova/nova-conductor.log: 7 ERROR oslo_db.sqlalchemy.engines Traceback (most recent call last):
2021-06-04 12:37:25.615 ERROR /var/log/containers/nova/nova-conductor.log: 7 ERROR oslo_db.sqlalchemy.engines   File "/usr/lib64/python3.6/site-packages/sqlalchemy/engine/base.py", line 1244, in _execute_context
2021-06-04 12:37:25.615 ERROR /var/log/containers/nova/nova-conductor.log: 7 ERROR oslo_db.sqlalchemy.engines     cursor, statement, parameters, context
2021-06-04 12:37:25.615 ERROR /var/log/containers/nova/nova-conductor.log: 7 ERROR oslo_db.sqlalchemy.engines   File "/usr/lib64/python3.6/site-packages/sqlalchemy/engine/default.py", line 552, in do_execute
2021-06-04 12:37:25.615 ERROR /var/log/containers/nova/nova-conductor.log: 7 ERROR oslo_db.sqlalchemy.engines     cursor.execute(statement, parameters)
2021-06-04 12:37:25.615 ERROR /var/log/containers/nova/nova-conductor.log: 7 ERROR oslo_db.sqlalchemy.engines   File "/usr/lib/python3.6/site-packages/pymysql/cursors.py", line 163, in execute
2021-06-04 12:37:25.615 ERROR /var/log/containers/nova/nova-conductor.log: 7 ERROR oslo_db.sqlalchemy.engines     result = self._query(query)
2021-06-04 12:37:25.615 ERROR /var/log/containers/nova/nova-conductor.log: 7 ERROR oslo_db.sqlalchemy.engines   File "/usr/lib/python3.6/site-packages/pymysql/cursors.py", line 321, in _query
2021-06-04 12:37:25.615 ERROR /var/log/containers/nova/nova-conductor.log: 7 ERROR oslo_db.sqlalchemy.engines     conn.query(q)
2021-06-04 12:37:25.615 ERROR /var/log/containers/nova/nova-conductor.log: 7 ERROR oslo_db.sqlalchemy.engines   File "/usr/lib/python3.6/site-packages/pymysql/connections.py", line 505, in query
2021-06-04 12:37:25.615 ERROR /var/log/containers/nova/nova-conductor.log: 7 ERROR oslo_db.sqlalchemy.engines     self._affected_rows = self._read_query_result(unbuffered=unbuffered)
2021-06-04 12:37:25.615 ERROR /var/log/containers/nova/nova-conductor.log: 7 ERROR oslo_db.sqlalchemy.engines   File "/usr/lib/python3.6/site-packages/pymysql/connections.py", line 724, in _read_query_result
2021-06-04 12:37:25.615 ERROR /var/log/containers/nova/nova-conductor.log: 7 ERROR oslo_db.sqlalchemy.engines     result.read()
2021-06-04 12:37:25.615 ERROR /var/log/containers/nova/nova-conductor.log: 7 ERROR oslo_db.sqlalchemy.engines   File "/usr/lib/python3.6/site-packages/pymysql/connections.py", line 1069, in read
2021-06-04 12:37:25.615 ERROR /var/log/containers/nova/nova-conductor.log: 7 ERROR oslo_db.sqlalchemy.engines     first_packet = self.connection._read_packet()
2021-06-04 12:37:25.615 ERROR /var/log/containers/nova/nova-conductor.log: 7 ERROR oslo_db.sqlalchemy.engines   File "/usr/lib/python3.6/site-packages/pymysql/connections.py", line 646, in _read_packet
2021-06-04 12:37:25.615 ERROR /var/log/containers/nova/nova-conductor.log: 7 ERROR oslo_db.sqlalchemy.engines     packet_header = self._read_bytes(4)
2021-06-04 12:37:25.615 ERROR /var/log/containers/nova/nova-conductor.log: 7 ERROR oslo_db.sqlalchemy.engines   File "/usr/lib/python3.6/site-packages/pymysql/connections.py", line 699, in _read_bytes
2021-06-04 12:37:25.615 ERROR /var/log/containers/nova/nova-conductor.log: 7 ERROR oslo_db.sqlalchemy.engines     CR.CR_SERVER_LOST, "Lost connection to MySQL server during query")
2021-06-04 12:37:25.615 ERROR /var/log/containers/nova/nova-conductor.log: 7 ERROR oslo_db.sqlalchemy.engines pymysql.err.OperationalError: (2013, 'Lost connection to MySQL server during query')
2021-06-04 12:37:25.615 ERROR /var/log/containers/nova/nova-conductor.log: 7 ERROR oslo_db.sqlalchemy.engines
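
If this keeps recurring, it may be worth probing the database from a controller to see whether connections are actually being dropped rather than just recycled. A rough sketch with pymysql; the host and credentials below are placeholders, not values from this environment:

# Hypothetical connectivity probe: repeatedly run a trivial query against the
# Galera/MySQL endpoint and report any dropped connections. DB_HOST, DB_USER
# and DB_PASSWORD are placeholders; substitute the real overcloud values.
import time
import pymysql

DB_HOST = '192.0.2.10'   # placeholder
DB_USER = 'nova'         # placeholder
DB_PASSWORD = 'secret'   # placeholder

for attempt in range(10):
    try:
        conn = pymysql.connect(host=DB_HOST, user=DB_USER,
                               password=DB_PASSWORD, connect_timeout=5)
        with conn.cursor() as cur:
            cur.execute('SELECT 1')
            cur.fetchone()
        conn.close()
        print(attempt, 'ok')
    except pymysql.err.OperationalError as err:
        print(attempt, 'connection problem:', err)
    time.sleep(5)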

Comment 5 Kashyap Chamarthy 2021-06-07 12:55:35 UTC
This one, too (just like https://bugzilla.redhat.com/show_bug.cgi?id=1967995), is a duplicate of the following:

https://bugzilla.redhat.com/show_bug.cgi?id=1965897 — 16.2 / 17 line jobs are failing on tempest tests with "ERROR tempest.lib.common.ssh paramiko.ssh_exception.NoValidConnectionsError: [Errno None] Unable to connect to port 22 on <IP>"

*** This bug has been marked as a duplicate of bug 1965897 ***

