Bug 1261918 - [scale] RHEV-M UI shows an incorrect number of VMs running on a host.
Summary: [scale] RHEV-M UI shows an incorrect number of VMs running on a host.
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: Red Hat Enterprise Virtualization Manager
Classification: Red Hat
Component: vdsm
Version: 3.5.4
Hardware: x86_64
OS: Linux
Priority: medium
Severity: high
Target Milestone: ovirt-3.6.0-rc3
Target Release: 3.6.0
Assignee: Francesco Romani
QA Contact: Yuri Obshansky
URL:
Whiteboard:
Depends On:
Blocks:
 
Reported: 2015-09-10 12:39 UTC by Yuri Obshansky
Modified: 2019-09-12 08:57 UTC
CC List: 9 users

Fixed In Version:
Doc Type: Bug Fix
Doc Text:
Clone Of:
Environment:
Last Closed: 2016-03-09 19:45:26 UTC
oVirt Team: Virt
Target Upstream Version:


Attachments
UI screenshot (67.31 KB, image/png)
2015-09-10 12:40 UTC, Yuri Obshansky
Engine log (1.78 MB, application/x-gzip)
2015-09-10 12:40 UTC, Yuri Obshansky
vdsm log (803.86 KB, application/x-gzip)
2015-09-10 12:41 UTC, Yuri Obshansky
vdsm 1 log (928.47 KB, application/x-xz)
2015-09-10 12:41 UTC, Yuri Obshansky


Links
System ID Priority Status Summary Last Updated
Red Hat Product Errata RHBA-2016:0362 normal SHIPPED_LIVE vdsm 3.6.0 bug fix and enhancement update 2016-03-09 23:49:32 UTC
oVirt gerrit 46005 master MERGED vm: always hold confLock when updating pauseCode Never
oVirt gerrit 46771 ovirt-3.6 MERGED vm: always hold confLock when updating pauseCode Never

Description Yuri Obshansky 2015-09-10 12:39:06 UTC
Description of problem:
During a REST API load test, the RHEV-M UI reported more running VMs than actually exist in the cluster.
The cluster contains only 100 VMs,
but the UI shows 126 VMs.
Running # vdsClient -s 0 getVdsStats | grep vmCount returns:
    vmCount = 52
The load test runs 100 concurrent threads that simulate REST API calls against the VMs
(GET various info, POST stopVM, and POST startVM).

Version-Release number of selected component (if applicable):
3.5.4-1.2.el6ev
vdsm-4.16.13.1-1.el6ev

How reproducible:
100%

Steps to Reproduce:
1. Start the load test and let it run for some time.
2. Check the VM count in the RHEV-M UI under Hosts/Virtual Machines.
3. Run # vdsClient -s 0 getVdsStats | grep vmCount

Actual results:
UI shows 126 VMs
# vdsClient -s 0 getVdsStats | grep vmCount
vmCount = 52
The actual number of VMs is 100.

Expected results:
UI shows 52 VMs
# vdsClient -s 0 getVdsStats | grep vmCount
vmCount = 52
and never more VMs than actually exist in the setup.

Additional info:

Comment 1 Yuri Obshansky 2015-09-10 12:40:03 UTC
Created attachment 1072161 [details]
UI screenshot

Comment 2 Yuri Obshansky 2015-09-10 12:40:40 UTC
Created attachment 1072163 [details]
Engine log

Comment 3 Yuri Obshansky 2015-09-10 12:41:17 UTC
Created attachment 1072164 [details]
vdsm log

Comment 4 Yuri Obshansky 2015-09-10 12:41:44 UTC
Created attachment 1072166 [details]
vdsm 1 log

Comment 5 Yuri Obshansky 2015-09-10 12:45:18 UTC
[root@dhcp-yuri ovirt-engine]# rpm -qa | grep ovirt-engine
rhevm-setup-plugin-ovirt-engine-common-3.5.4-1.2.el6ev.noarch
rhevm-setup-plugin-ovirt-engine-3.5.4-1.2.el6ev.noarch

[root@ucs1-b420-1 vdsm]# rpm -qa | grep vdsm
vdsm-cli-4.16.13.1-1.el6ev.noarch
vdsm-python-zombiereaper-4.16.13.1-1.el6ev.noarch
vdsm-xmlrpc-4.16.13.1-1.el6ev.noarch
vdsm-yajsonrpc-4.16.13.1-1.el6ev.noarch
vdsm-4.16.13.1-1.el6ev.x86_64
vdsm-python-4.16.13.1-1.el6ev.noarch
vdsm-jsonrpc-4.16.13.1-1.el6ev.noarch

Comment 6 Francesco Romani 2015-09-10 12:50:29 UTC
This VDSM is old (4.16.13), but it already includes the fix for https://bugzilla.redhat.com/show_bug.cgi?id=1143968; hence patch 46005.

Comment 7 Michal Skrivanek 2015-09-29 11:55:03 UTC
Missed the oVirt 3.6.0 branch-off, but this is not an oVirt 3.6 GA blocker.

Comment 9 Francesco Romani 2016-01-19 15:31:57 UTC
This bug was caused by a race condition; I don't think it deserves mention in the documentation.

Comment 10 Yuri Obshansky 2016-02-01 13:52:22 UTC
The fix was verified; the bug no longer reproduces.

[root@host25-rack06 ~]# rpm -qa | grep ovirt-engine
rhevm-setup-plugin-ovirt-engine-common-3.6.3-0.1.el6.noarch
ovirt-engine-extension-aaa-jdbc-1.0.5-1.el6ev.noarch
rhevm-setup-plugin-ovirt-engine-3.6.3-0.1.el6.noarch

[root@host02-rack06 ~]# rpm -qa | grep vdsm
vdsm-python-4.17.19-0.el7ev.noarch
vdsm-hook-vmfex-dev-4.17.19-0.el7ev.noarch
vdsm-jsonrpc-4.17.19-0.el7ev.noarch
vdsm-yajsonrpc-4.17.19-0.el7ev.noarch
vdsm-xmlrpc-4.17.19-0.el7ev.noarch
vdsm-cli-4.17.19-0.el7ev.noarch
vdsm-4.17.19-0.el7ev.noarch
vdsm-infra-4.17.19-0.el7ev.noarch 

Can be closed.

Comment 12 errata-xmlrpc 2016-03-09 19:45:26 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory, and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://rhn.redhat.com/errata/RHBA-2016-0362.html

