Bug 1723668 - VDSM command Get Host Statistics failed: Internal JSON-RPC error: {'reason': '[Errno 19] vnet<x> is not present in the system'}
Summary: VDSM command Get Host Statistics failed: Internal JSON-RPC error: {'reason': ...
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: Red Hat Enterprise Virtualization Manager
Classification: Red Hat
Component: vdsm
Version: 4.3.0
Hardware: x86_64
OS: Linux
Priority: medium
Severity: high
Target Milestone: ovirt-4.4.0
Target Release: ---
Assignee: Bell Levin
QA Contact: Michael Burman
URL:
Whiteboard:
Duplicates: 1722746 1821040 (view as bug list)
Depends On: 1722746
Blocks: 1744572
 
Reported: 2019-06-25 06:31 UTC by Juan Orti
Modified: 2023-10-14 04:25 UTC
CC List: 12 users

Fixed In Version:
Doc Type: Bug Fix
Doc Text:
Previously, while a virtual machine was shutting down, the VDSM command Get Host Statistics occasionally failed with an Internal JSON-RPC error `{'reason': '[Errno 19] vnet<x> is not present in the system'}`. This happened because the shutdown could remove an interface while statistics were being gathered. The current release fixes this issue by no longer reporting such failures.
Clone Of: 1722746
Clones: 1744572 (view as bug list)
Environment:
Last Closed: 2020-08-04 13:27:06 UTC
oVirt Team: Network
Target Upstream Version:
Embargoed:




Links
System ID Private Priority Status Summary Last Updated
Red Hat Knowledge Base (Solution) 4242981 0 Troubleshoot None Errors command Get Host Statistics failed: Internal JSON-RPC error: {'reason': '[Errno 19] vnet111 is not present in the... 2019-06-25 06:35:07 UTC
Red Hat Product Errata RHEA-2020:3246 0 None None None 2020-08-04 13:27:38 UTC
oVirt gerrit 101385 0 'None' MERGED net: Fix error if net removed during stats request 2021-01-26 00:36:05 UTC

Description Juan Orti 2019-06-25 06:31:19 UTC
+++ This bug was initially created as a clone of Bug #1722746 +++

Description of problem:
Occasionally this kind of messages can be seen in the manager:

VDSM rhvh01.example.com command Get Host Statistics failed: Internal JSON-RPC error: {'reason': '[Errno 19] vnet111 is not present in the system'}


This is logged in vdsm.log:

2019-06-17 22:54:57,723+0200 INFO  (jsonrpc/4) [api.virt] START shutdown(delay=u'30', message=u'System Administrator has initiated shutdown of this Virtual Machine. Virtual Machine is shutting down.', reboot=False, timeout=None, force=False) from=::ffff:10.0.0.2,43422, flow_id=515ed70d-84e4-41ca-b0bd-166413a99c26, vmId=96328f4c-b30e-4b54-95a9-a437fabeebe5 (api:48)
2019-06-17 22:54:57,724+0200 INFO  (jsonrpc/4) [virt.vm] (vmId='96328f4c-b30e-4b54-95a9-a437fabeebe5') Starting powerdown (vmpowerdown:70)
2019-06-17 22:54:57,725+0200 INFO  (jsonrpc/4) [api.virt] FINISH shutdown return={'status': {'message': 'Machine shutting down', 'code': 0}} from=::ffff:10.0.0.2,43422, flow_id=515ed70d-84e4-41ca-b0bd-166413a99c26, vmId=96328f4c-b30e-4b54-95a9-a437fabeebe5 (api:54)
2019-06-17 22:54:57,725+0200 INFO  (jsonrpc/4) [jsonrpc.JsonRpcServer] RPC call VM.shutdown succeeded in 0.00 seconds (__init__:312)
2019-06-17 22:55:08,255+0200 INFO  (jsonrpc/5) [api.host] START getAllVmStats() from=::ffff:10.0.0.2,43422 (api:48)
2019-06-17 22:55:08,378+0200 INFO  (jsonrpc/5) [api.host] FINISH getAllVmStats return={'status': {'message': 'Done', 'code': 0}, 'statsList': (suppressed)} from=::ffff:10.0.0.2,43422 (api:54)
2019-06-17 22:55:08,405+0200 INFO  (jsonrpc/5) [jsonrpc.JsonRpcServer] RPC call Host.getAllVmStats succeeded in 0.16 seconds (__init__:312)
2019-06-17 22:55:11,173+0200 INFO  (jsonrpc/2) [api.host] START getStats() from=::ffff:10.0.0.2,43422 (api:48)
2019-06-17 22:55:11,476+0200 INFO  (jsonrpc/2) [api.host] FINISH getStats error=[Errno 19] vnet111 is not present in the system from=::ffff:10.0.0.2,43422 (api:52)
2019-06-17 22:55:11,476+0200 ERROR (jsonrpc/2) [jsonrpc.JsonRpcServer] Internal server error (__init__:350)
Traceback (most recent call last):
  File "/usr/lib/python2.7/site-packages/yajsonrpc/__init__.py", line 345, in _handle_request
    res = method(**params)
  File "/usr/lib/python2.7/site-packages/vdsm/rpc/Bridge.py", line 194, in _dynamicMethod
    result = fn(*methodArgs)
  File "<string>", line 2, in getStats
  File "/usr/lib/python2.7/site-packages/vdsm/common/api.py", line 50, in method
    ret = func(*args, **kwargs)
  File "/usr/lib/python2.7/site-packages/vdsm/API.py", line 1428, in getStats
    multipath=True)}
  File "/usr/lib/python2.7/site-packages/vdsm/host/api.py", line 50, in get_stats
    decStats = stats.produce(first_sample, last_sample)
  File "/usr/lib/python2.7/site-packages/vdsm/host/stats.py", line 72, in produce
    stats.update(get_interfaces_stats())
  File "/usr/lib/python2.7/site-packages/vdsm/host/stats.py", line 154, in get_interfaces_stats
    return net_api.network_stats()
  File "/usr/lib/python2.7/site-packages/vdsm/network/api.py", line 63, in network_stats
    return netstats.report()
  File "/usr/lib/python2.7/site-packages/vdsm/network/netstats.py", line 32, in report
    stats = link_stats.report()
  File "/usr/lib/python2.7/site-packages/vdsm/network/link/stats.py", line 34, in report
    stats[i.device] = i.statistics()
  File "/usr/lib/python2.7/site-packages/vdsm/network/link/iface.py", line 232, in statistics
    'state': 'up' if self.is_oper_up() else 'down',
  File "/usr/lib/python2.7/site-packages/vdsm/network/link/iface.py", line 193, in is_oper_up
    properties = self.properties()
  File "/usr/lib/python2.7/site-packages/vdsm/network/link/iface.py", line 165, in properties
    info = link.get_link(self._dev)
  File "/usr/lib/python2.7/site-packages/vdsm/network/netlink/link.py", line 38, in get_link
    name)
IOError: [Errno 19] vnet111 is not present in the system
2019-06-17 22:55:11,478+0200 INFO  (jsonrpc/2) [jsonrpc.JsonRpcServer] RPC call Host.getStats failed (error -32603) in 0.31 seconds (__init__:312)

Version-Release number of selected component (if applicable):
vdsm-4.30.13-4.el7ev.x86_64


How reproducible:
Occasionally, when powering off VMs

Steps to Reproduce:
1. Power off VMs
2. When this coincides with a getStats call on the host, the error can sometimes be seen

Actual results:
A "Get Host Statistics failed" error is reported in the manager.

Expected results:
No errors when gathering statistics while a VM is in the process of shutting down.

Additional info:
This can be seen in engine.log:

2019-06-17 22:54:57,712+02 INFO  [org.ovirt.engine.core.bll.ShutdownVmCommand] (default task-10675) [515ed70d-84e4-41ca-b0bd-166413a99c26] Running command: ShutdownVmCommand internal: false. Entities affected :  ID: 96328f4c-b30e-4b54-95a9-a437fabeebe5 Type: VMAction group SHUT_DOWN_VM with role type USER
2019-06-17 22:54:57,715+02 INFO  [org.ovirt.engine.core.bll.ShutdownVmCommand] (default task-10675) [515ed70d-84e4-41ca-b0bd-166413a99c26] Entered (VM 'VM01').
2019-06-17 22:54:57,717+02 INFO  [org.ovirt.engine.core.bll.ShutdownVmCommand] (default task-10675) [515ed70d-84e4-41ca-b0bd-166413a99c26] Sending shutdown command for VM 'VM01'.
2019-06-17 22:54:57,718+02 INFO  [org.ovirt.engine.core.vdsbroker.DestroyVmVDSCommand] (default task-10675) [515ed70d-84e4-41ca-b0bd-166413a99c26] START, DestroyVmVDSCommand( DestroyVmVDSCommandParameters:{hostId='3c3226f2-30d5-4068-bda5-51241cd8355b', vmId='96328f4c-b30e-4b54-95a9-a437fabeebe5', secondsToWait='30', gracefully='true', reason='', ignoreNoVm='false'}), log id: 5330cafc
2019-06-17 22:54:57,721+02 INFO  [org.ovirt.engine.core.vdsbroker.vdsbroker.DestroyVDSCommand] (default task-10675) [515ed70d-84e4-41ca-b0bd-166413a99c26] START, DestroyVDSCommand(HostName = rhvh01.example.com, DestroyVmVDSCommandParameters:{hostId='3c3226f2-30d5-4068-bda5-51241cd8355b', vmId='96328f4c-b30e-4b54-95a9-a437fabeebe5', secondsToWait='30', gracefully='true', reason='', ignoreNoVm='false'}), log id: 600f9351
2019-06-17 22:54:57,725+02 INFO  [org.ovirt.engine.core.vdsbroker.vdsbroker.DestroyVDSCommand] (default task-10675) [515ed70d-84e4-41ca-b0bd-166413a99c26] FINISH, DestroyVDSCommand, log id: 600f9351
2019-06-17 22:54:57,728+02 INFO  [org.ovirt.engine.core.vdsbroker.DestroyVmVDSCommand] (default task-10675) [515ed70d-84e4-41ca-b0bd-166413a99c26] FINISH, DestroyVmVDSCommand, return: PoweringDown, log id: 5330cafc
2019-06-17 22:54:57,755+02 INFO  [org.ovirt.engine.core.dal.dbbroker.auditloghandling.AuditLogDirector] (default task-10675) [515ed70d-84e4-41ca-b0bd-166413a99c26] EVENT_ID: USER_INITIATED_SHUTDOWN_VM(73), VM shutdown initiated by user01@example.local-authz on VM VM01 (Host: rhvh01.example.com).

2019-06-17 22:55:11,478+02 WARN  [org.ovirt.engine.core.vdsbroker.vdsbroker.GetStatsAsyncVDSCommand] (EE-ManagedThreadFactory-engineScheduled-Thread-20) [] Unexpected return value: Status [code=-32603, message=Internal JSON-RPC error: {'reason': '[Errno 19] vnet111 is not present in the system'}]
2019-06-17 22:55:11,478+02 ERROR [org.ovirt.engine.core.vdsbroker.vdsbroker.GetStatsAsyncVDSCommand] (EE-ManagedThreadFactory-engineScheduled-Thread-20) [] Failed in 'Get Host Statistics' method
2019-06-17 22:55:11,479+02 WARN  [org.ovirt.engine.core.vdsbroker.vdsbroker.GetStatsAsyncVDSCommand] (EE-ManagedThreadFactory-engineScheduled-Thread-20) [] Unexpected return value: Status [code=-32603, message=Internal JSON-RPC error: {'reason': '[Errno 19] vnet111 is not present in the system'}]
2019-06-17 22:55:11,482+02 ERROR [org.ovirt.engine.core.dal.dbbroker.auditloghandling.AuditLogDirector] (EE-ManagedThreadFactory-engineScheduled-Thread-20) [] EVENT_ID: VDS_BROKER_COMMAND_FAILURE(10,802), VDSM rhvh01.example.com command Get Host Statistics failed: Internal JSON-RPC error: {'reason': '[Errno 19] vnet111 is not present in the system'}
2019-06-17 22:55:11,482+02 ERROR [org.ovirt.engine.core.vdsbroker.monitoring.HostMonitoring] (EE-ManagedThreadFactory-engineScheduled-Thread-20) [] Unable to GetStats: VDSErrorException: VDSGenericException: VDSErrorException: Failed to Get Host Statistics, error = Internal JSON-RPC error: {'reason': '[Errno 19] vnet111 is not present in the system'}, code = -32603

2019-06-17 22:56:10,292+02 INFO  [org.ovirt.engine.core.vdsbroker.monitoring.VmAnalyzer] (EE-ManagedThreadFactory-engineScheduled-Thread-74) [] VM '96328f4c-b30e-4b54-95a9-a437fabeebe5' was reported as Down on VDS '3c3226f2-30d5-4068-bda5-51241cd8355b'(rhvh01.example.com)
2019-06-17 22:56:10,293+02 INFO  [org.ovirt.engine.core.vdsbroker.vdsbroker.DestroyVDSCommand] (EE-ManagedThreadFactory-engineScheduled-Thread-74) [] START, DestroyVDSCommand(HostName = rhvh01.example.com, DestroyVmVDSCommandParameters:{hostId='3c3226f2-30d5-4068-bda5-51241cd8355b', vmId='96328f4c-b30e-4b54-95a9-a437fabeebe5', secondsToWait='0', gracefully='false', reason='', ignoreNoVm='true'}), log id: 54f67c5f
2019-06-17 22:56:10,299+02 INFO  [org.ovirt.engine.core.vdsbroker.vdsbroker.DestroyVDSCommand] (EE-ManagedThreadFactory-engineScheduled-Thread-74) [] FINISH, DestroyVDSCommand, log id: 54f67c5f
2019-06-17 22:56:10,299+02 INFO  [org.ovirt.engine.core.vdsbroker.monitoring.VmAnalyzer] (EE-ManagedThreadFactory-engineScheduled-Thread-74) [] VM '96328f4c-b30e-4b54-95a9-a437fabeebe5'(VM01) moved from 'PoweringDown' --> 'Down'
2019-06-17 22:56:10,318+02 INFO  [org.ovirt.engine.core.dal.dbbroker.auditloghandling.AuditLogDirector] (EE-ManagedThreadFactory-engineScheduled-Thread-74) [] EVENT_ID: VM_DOWN(61), VM VM01 is down. Exit message: Admin shut down from the engine

Comment 3 Edward Haas 2019-06-25 08:50:55 UTC
Statistics gathering is not atomic, so entities that exist at the start of the collection may not exist for its whole duration. As a result, the gathering can fail just because a single interface disappeared mid-run.

We could make this tolerable by absorbing such errors at the interface statistics level.
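The absorption described above can be sketched roughly as follows. This is an illustrative sketch, not the actual VDSM patch (gerrit 101385); `ifaces` and `read_stats` are hypothetical stand-ins for VDSM's interface listing and per-device statistics reader.

```python
import errno


def safe_link_stats(ifaces, read_stats):
    """Collect per-interface stats, skipping interfaces that vanish mid-run.

    `ifaces` is an iterable of device names; `read_stats` reads one
    device's statistics and may raise IOError if the device is gone.
    """
    stats = {}
    for dev in ifaces:
        try:
            stats[dev] = read_stats(dev)
        except IOError as e:
            # ENODEV (errno 19): the device disappeared between listing
            # and sampling, e.g. a vnet removed by a VM shutdown.
            # Skip it instead of failing the whole getStats call.
            if e.errno != errno.ENODEV:
                raise
    return stats
```

With this shape, a vnet that is torn down mid-collection is simply absent from the report rather than aborting Host.getStats with an internal server error.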

Comment 6 Michael Burman 2019-08-01 07:17:43 UTC
*** Bug 1722746 has been marked as a duplicate of this bug. ***

Comment 10 RHV bug bot 2019-10-22 17:25:57 UTC
WARN: Bug status wasn't changed from MODIFIED to ON_QA due to the following reason:

[Found non-acked flags: '{}', ]

For more info please contact: rhv-devops

Comment 14 Michael Burman 2019-10-24 10:13:03 UTC
Verified on - vdsm-4.40.0-127.gitc628cce.el8ev.x86_64

Comment 15 RHV bug bot 2019-11-19 11:53:18 UTC
WARN: Bug status (VERIFIED) wasn't changed but the following should be fixed:

[Found non-acked flags: '{}', ]

For more info please contact: rhv-devops

Comment 22 Dominik Holler 2020-04-07 07:38:37 UTC
*** Bug 1821040 has been marked as a duplicate of this bug. ***

Comment 27 errata-xmlrpc 2020-08-04 13:27:06 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory (RHV RHEL Host (ovirt-host) 4.4), and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHEA-2020:3246

Comment 28 Mai Ling 2023-06-15 12:00:50 UTC
I've experienced a similar error, caused by the name of the kernel network device changing from `eno1` to ASCII code 03. I never found out what caused that change (RAM errors? It went away after a reboot, and there were no relevant OS logs), and vdsm.log kept repeating this error:

2023-06-12 17:40:34,378+0300 ERROR (jsonrpc/1) [jsonrpc.JsonRpcServer] Internal server error (__init__:354)
Traceback (most recent call last):
  File "/usr/lib/python3.6/site-packages/yajsonrpc/__init__.py", line 349, in _handle_request
    res = method(**params)
  File "/usr/lib/python3.6/site-packages/vdsm/rpc/Bridge.py", line 198, in _dynamicMethod
    result = fn(*methodArgs)
  File "<decorator-gen-471>", line 2, in getStats
  File "/usr/lib/python3.6/site-packages/vdsm/common/api.py", line 50, in method
    ret = func(*args, **kwargs)
  File "/usr/lib/python3.6/site-packages/vdsm/API.py", line 1456, in getStats
    multipath=True)
  File "/usr/lib/python3.6/site-packages/vdsm/host/api.py", line 49, in get_stats
    decStats = stats.produce(first_sample, last_sample)
  File "/usr/lib/python3.6/site-packages/vdsm/host/stats.py", line 72, in produce
    stats.update(get_interfaces_stats())
  File "/usr/lib/python3.6/site-packages/vdsm/host/stats.py", line 154, in get_interfaces_stats
    return net_api.network_stats()
  File "/usr/lib/python3.6/site-packages/vdsm/network/api.py", line 61, in network_stats
    return netstats.report()
  File "/usr/lib/python3.6/site-packages/vdsm/network/netstats.py", line 32, in report
    stats = link_stats.report()
  File "/usr/lib/python3.6/site-packages/vdsm/network/link/stats.py", line 33, in report
    for iface_properties in iface.list():
  File "/usr/lib/python3.6/site-packages/vdsm/network/link/iface.py", line 230, in list
    properties['type'] = get_alternative_type(properties['name'])
  File "/usr/lib/python3.6/site-packages/vdsm/network/link/iface.py", line 254, in get_alternative_type
    driver_name = ethtool.driver_name(device)
  File "/usr/lib/python3.6/site-packages/vdsm/network/ethtool.py", line 51, in driver_name
    fcntl.ioctl(sock, SIOCETHTOOL, data)
OSError: [Errno 19] No such device

I thought commenting here, since this bug comes up in search results for error 19, might help some other one-in-a-million unlucky person who hits such an experience.
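For readers landing here from a search: on Linux, errno 19 is ENODEV ("No such device"), which the kernel returns when an ioctl or netlink lookup names an interface that no longer exists; both tracebacks in this bug end in it. A quick way to check the mapping:

```python
import errno
import os

# On Linux, errno 19 maps to ENODEV ("No such device"); this is the
# errno behind both the vnet<x> traceback and the ethtool ioctl failure.
print(errno.errorcode[19], '-', os.strerror(19))
```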

Comment 29 Red Hat Bugzilla 2023-10-14 04:25:05 UTC
The needinfo request[s] on this closed bug have been removed as they have been unresolved for 120 days

