Bug 1967993
| Summary: | tempest.api.compute.servers.test_server_actions.ServerActionsTestJSON.test_reboot_server_hard fails w/ timeout | | |
|---|---|---|---|
| Product: | Red Hat OpenStack | Reporter: | wes hayutin <whayutin> |
| Component: | openstack-nova | Assignee: | OSP DFG:Compute <osp-dfg-compute> |
| Status: | CLOSED DUPLICATE | QA Contact: | OSP DFG:Compute <osp-dfg-compute> |
| Severity: | low | Docs Contact: | |
| Priority: | unspecified | | |
| Version: | 16.2 (Train) | CC: | dasmith, eglynn, jhakimra, kchamart, sbauza, sgordon, vromanso |
| Target Milestone: | --- | Target Release: | --- |
| Hardware: | x86_64 | OS: | Linux |
| Fixed In Version: | | Doc Type: | If docs needed, set a value |
| Doc Text: | | Story Points: | --- |
| Clone Of: | | Environment: | |
| Last Closed: | 2021-06-07 12:55:35 UTC | Type: | Bug |
| Regression: | --- | Embargoed: | |
Description
wes hayutin
2021-06-04 16:32:52 UTC
From /var/log/containers/nova/nova-compute.log.1 on overcloud-novacompute-0:

2021-06-04 13:01:20.475 ERROR nova.virt.libvirt.driver [instance: 5000c59e-f9dd-4c25-9529-f281b1e53ff3] detaching network adapter failed.: libvirt.libvirtError: internal error: End of file from qemu monitor
2021-06-04 13:01:20.475 ERROR nova.virt.libvirt.driver [instance: 5000c59e-f9dd-4c25-9529-f281b1e53ff3] Traceback (most recent call last):
  File "/usr/lib/python3.6/site-packages/nova/virt/libvirt/driver.py", line 2373, in detach_interface
    supports_device_missing_error_code=supports_device_missing)
  File "/usr/lib/python3.6/site-packages/nova/virt/libvirt/guest.py", line 466, in detach_device_with_retry
    _try_detach_device(conf, persistent, live)
  File "/usr/lib/python3.6/site-packages/nova/virt/libvirt/guest.py", line 455, in _try_detach_device
    ctx.reraise = True
  File "/usr/lib/python3.6/site-packages/oslo_utils/excutils.py", line 220, in __exit__
    self.force_reraise()
  File "/usr/lib/python3.6/site-packages/oslo_utils/excutils.py", line 196, in force_reraise
    six.reraise(self.type_, self.value, self.tb)
  File "/usr/lib/python3.6/site-packages/six.py", line 675, in reraise
    raise value
  File "/usr/lib/python3.6/site-packages/nova/virt/libvirt/guest.py", line 408, in _try_detach_device
    self.detach_device(conf, persistent=persistent, live=live)
  File "/usr/lib/python3.6/site-packages/nova/virt/libvirt/guest.py", line 516, in detach_device
    self._domain.detachDeviceFlags(device_xml, flags=flags)
  File "/usr/lib/python3.6/site-packages/eventlet/tpool.py", line 190, in doit
    result = proxy_call(self._autowrap, f, *args, **kwargs)
  File "/usr/lib/python3.6/site-packages/eventlet/tpool.py", line 148, in proxy_call
    rv = execute(f, *args, **kwargs)
  File "/usr/lib/python3.6/site-packages/eventlet/tpool.py", line 129, in execute
    six.reraise(c, e, tb)
  File "/usr/lib/python3.6/site-packages/six.py", line 675, in reraise
    raise value
  File "/usr/lib/python3.6/site-packages/eventlet/tpool.py", line 83, in tworker
    rv = meth(*args, **kwargs)
  File "/usr/lib64/python3.6/site-packages/libvirt.py", line 1534, in detachDeviceFlags
    raise libvirtError('virDomainDetachDeviceFlags() failed')
libvirt.libvirtError: internal error: End of file from qemu monitor

Full error log: https://sf.hosted.upshift.rdu2.redhat.com/logs/94/211494/41/check/periodic-tripleo-ci-rhel-8-bm_envD-3ctlr_1comp-featureset035-rhos-16.2/0a9bee9/logs/overcloud-novacompute-0/var/log/extra/errors.txt.txt

2021-06-04 13:28:17.047 8 INFO nova.virt.libvirt.driver [-] [instance: 950baa43-c0d3-4d4a-83b9-8593eca9d377] Instance spawned successfully.
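The failing call in the traceback is the detachDeviceFlags() invocation that nova's guest wrapper makes via python-libvirt; "End of file from qemu monitor" typically means the QEMU monitor connection went away (for example because the qemu process exited or the domain was being torn down) while the detach was in flight. A minimal python-libvirt sketch of the same call, which can help reproduce the detach by hand on the compute node, is below; the interface XML, MAC and bridge are hypothetical placeholders, only the instance UUID is taken from the log:

```python
import libvirt

# UUID from the failing instance in the log; the interface XML is a placeholder.
INSTANCE_UUID = "5000c59e-f9dd-4c25-9529-f281b1e53ff3"
INTERFACE_XML = """
<interface type='bridge'>
  <mac address='fa:16:3e:00:00:01'/>
  <source bridge='qbr-example'/>
  <model type='virtio'/>
</interface>
"""

conn = libvirt.open("qemu:///system")
try:
    dom = conn.lookupByUUIDString(INSTANCE_UUID)
    flags = libvirt.VIR_DOMAIN_AFFECT_LIVE | libvirt.VIR_DOMAIN_AFFECT_CONFIG
    # This is the call that raised "internal error: End of file from qemu monitor"
    # above (nova reaches it through guest.detach_device()).
    dom.detachDeviceFlags(INTERFACE_XML, flags)
except libvirt.libvirtError as exc:
    print("detach failed:", exc.get_error_message())
finally:
    conn.close()
```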
Nova compute logs: https://sf.hosted.upshift.rdu2.redhat.com/logs/94/211494/41/check/periodic-tripleo-ci-rhel-8-bm_envD-3ctlr_1comp-featureset035-rhos-16.2/0a9bee9/logs/overcloud-novacompute-0/var/log/containers/nova/

2021-06-04 13:29:51.025 8 WARNING nova.compute.manager [req-69401e2a-be18-48c5-bbc0-f6aff638b239 1304f0c3ebd343ecb71577edd51feacd 7892e0c99aef4adf91a0c7be2c2bfdc6 - default default] [instance: 950baa43-c0d3-4d4a-83b9-8593eca9d377] Received unexpected event network-vif-plugged-ac68c2e6-6c0c-4f2c-9810-71ac1176db6e for instance with vm_state active and task_state None.

There are a lot of OVSDB errors on the controller that could be contributing to this:
https://sf.hosted.upshift.rdu2.redhat.com/logs/94/211494/41/check/periodic-tripleo-ci-rhel-8-bm_envD-3ctlr_1comp-featureset035-rhos-16.2/0a9bee9/logs/overcloud-controller-0/var/log/extra/errors.txt.txt

From /var/log/containers/neutron/server.log.1:

2021-06-04 13:27:04.340 ERROR ovsdbapp.backend.ovs_idl.command [-] Error executing command: RuntimeError: OVSDB Error: The transaction failed because the IDL has been configured to require a database lock but didn't get it yet or has already lost it
Traceback (most recent call last):
  File "/usr/lib/python3.6/site-packages/ovsdbapp/api.py", line 111, in transaction
    yield self._nested_txns_map[cur_thread_id]
KeyError: 140507849345328

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.6/site-packages/ovsdbapp/backend/ovs_idl/command.py", line 42, in execute
    t.add(self)
  File "/usr/lib64/python3.6/contextlib.py", line 88, in __exit__
    next(self.gen)
  File "/usr/lib/python3.6/site-packages/networking_ovn/ovsdb/impl_idl_ovn.py", line 197, in transaction
    yield t
  File "/usr/lib64/python3.6/contextlib.py", line 88, in __exit__
    next(self.gen)
  File "/usr/lib/python3.6/site-packages/ovsdbapp/api.py", line 119, in transaction
    del self._nested_txns_map[cur_thread_id]
  File "/usr/lib/python3.6/site-packages/ovsdbapp/api.py", line 69, in __exit__
    self.result = self.commit()
  File "/usr/lib/python3.6/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 62, in commit
    raise result.ex
  File "/usr/lib/python3.6/site-packages/ovsdbapp/backend/ovs_idl/connection.py", line 128, in run
    txn.results.put(txn.do_commit())
  File "/usr/lib/python3.6/site-packages/ovsdbapp/backend/ovs_idl/transaction.py", line 115, in do_commit
    raise RuntimeError(msg)
RuntimeError: OVSDB Error: The transaction failed because the IDL has been configured to require a database lock but didn't get it yet or has already lost it
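Per the error message, this failure mode shows up when the OVSDB IDL is configured to require the database lock but does not hold it at commit time (for example after a leader change). A common client-side mitigation is to retry the transaction once the lock is reacquired; the sketch below illustrates that retry pattern in plain Python with hypothetical names (it is not the actual neutron/ovsdbapp API):

```python
import time

class OvsdbLockLost(RuntimeError):
    """Stand-in for the RuntimeError ovsdbapp raises when the IDL lock is missing."""

def retry_on_lock_loss(run_txn, attempts=5, delay=0.5):
    """Retry an OVSDB transaction a few times if the DB lock was lost mid-commit.

    run_txn: a zero-argument callable that builds and commits one transaction.
    """
    for attempt in range(1, attempts + 1):
        try:
            return run_txn()
        except OvsdbLockLost:
            if attempt == attempts:
                raise
            # The lock may come back to this worker after a leader change;
            # back off briefly and try again.
            time.sleep(delay * attempt)

# Example usage with a dummy transaction that fails twice, then succeeds.
calls = {"n": 0}
def dummy_txn():
    calls["n"] += 1
    if calls["n"] < 3:
        raise OvsdbLockLost("OVSDB Error: lock not held")
    return "committed"

print(retry_on_lock_loss(dummy_txn))  # -> committed
```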
Also, the conductor's database connection was lost around the time the test failed. From /var/log/containers/nova/nova-conductor.log:

2021-06-04 12:37:25.615 ERROR oslo_db.sqlalchemy.engines [-] Database connection was found disconnected; reconnecting: oslo_db.exception.DBConnectionError: (pymysql.err.OperationalError) (2013, 'Lost connection to MySQL server during query')
Traceback (most recent call last):
  File "/usr/lib64/python3.6/site-packages/sqlalchemy/engine/base.py", line 1244, in _execute_context
    cursor, statement, parameters, context
  File "/usr/lib64/python3.6/site-packages/sqlalchemy/engine/default.py", line 552, in do_execute
    cursor.execute(statement, parameters)
  File "/usr/lib/python3.6/site-packages/pymysql/cursors.py", line 163, in execute
    result = self._query(query)
  File "/usr/lib/python3.6/site-packages/pymysql/cursors.py", line 321, in _query
    conn.query(q)
  File "/usr/lib/python3.6/site-packages/pymysql/connections.py", line 505, in query
    self._affected_rows = self._read_query_result(unbuffered=unbuffered)
  File "/usr/lib/python3.6/site-packages/pymysql/connections.py", line 724, in _read_query_result
    result.read()
  File "/usr/lib/python3.6/site-packages/pymysql/connections.py", line 1069, in read
    first_packet = self.connection._read_packet()
  File "/usr/lib/python3.6/site-packages/pymysql/connections.py", line 646, in _read_packet
    packet_header = self._read_bytes(4)
  File "/usr/lib/python3.6/site-packages/pymysql/connections.py", line 699, in _read_bytes
    CR.CR_SERVER_LOST, "Lost connection to MySQL server during query")
pymysql.err.OperationalError: (2013, 'Lost connection to MySQL server during query')
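As the log shows, oslo_db detects the disconnect and reconnects on its own, so the query is retried rather than lost. For context, the usual way to make such transient MySQL/Galera drops less visible is connection validation and recycling at the SQLAlchemy engine level; a minimal sketch follows, shown with plain SQLAlchemy for illustration (a real deployment would set the equivalent oslo.db options in nova.conf, and the URL below is a placeholder):

```python
from sqlalchemy import create_engine, text

# Placeholder URL; a real nova deployment points at the Galera VIP.
DB_URL = "mysql+pymysql://nova:secret@203.0.113.10/nova"

engine = create_engine(
    DB_URL,
    pool_pre_ping=True,   # validate each pooled connection before use
    pool_recycle=3600,    # recycle connections before proxies/Galera drop idle ones
)

with engine.connect() as conn:
    # If the pooled connection was silently dropped, pre_ping replaces it
    # instead of surfacing "Lost connection to MySQL server during query".
    print(conn.execute(text("SELECT 1")).scalar())
```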
during query") 2021-06-04 12:37:25.615 ERROR /var/log/containers/nova/nova-conductor.log: 7 ERROR oslo_db.sqlalchemy.engines pymysql.err.OperationalError: (2013, 'Lost connection to MySQL server during query') 2021-06-04 12:37:25.615 ERROR /var/log/containers/nova/nova-conductor.log: 7 ERROR oslo_db.sqlalchemy.engines This one too (just as this was: https://bugzilla.redhat.com/show_bug.cgi?id=1967995) is a dupe of the following: https://bugzilla.redhat.com/show_bug.cgi?id=1965897 — 16.2 / 17 line jobs are failing on tempest tests with "ERROR tempest.lib.common.ssh paramiko.ssh_exception.NoValidConnectionsError: [Errno None] Unable to connect to port 22 on <IP>" *** This bug has been marked as a duplicate of bug 1965897 *** |