Description of problem:
During a REST API load test I noticed that the number of running VMs reported by the RHEV-M UI is higher than the number of VMs that actually exist in the cluster. The cluster contains only 100 VMs, but the UI shows 126 VMs.

On the host I run:
# vdsClient -s 0 getVdsStats | grep vmCount
	vmCount = 52

The load test is 100 concurrent threads simulating REST API calls to VMs (GET of various info, POST stopVM and POST startVM); a minimal sketch of such a test is given under "Additional info" below.

Version-Release number of selected component (if applicable):
3.5.4-1.2.el6ev
vdsm-4.16.13.1-1.el6ev

How reproducible:
100%

Steps to Reproduce:
1. Start the load test and let it run for some time.
2. Check the VM count in the RHEV-M UI under Hosts / Virtual Machines.
3. Run # vdsClient -s 0 getVdsStats | grep vmCount

Actual results:
UI shows 126 VMs.
# vdsClient -s 0 getVdsStats | grep vmCount
	vmCount = 52
The actual number of VMs is 100.

Expected results:
UI shows 52 VMs.
# vdsClient -s 0 getVdsStats | grep vmCount
	vmCount = 52
And no more VMs than actually exist in the setup.

Additional info:
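Below is a minimal sketch of the kind of load test described above, not the exact harness used for this report. It assumes the RHEV/oVirt v3 XML REST API with basic authentication; the engine URL, credentials, VM UUIDs, thread count, and run duration are placeholders to adjust for your setup.

#!/usr/bin/env python
# Minimal sketch of a load test like the one described above (illustrative,
# not the exact harness used for this report). Assumes the RHEV/oVirt v3
# XML REST API and basic auth; URL, credentials and VM UUIDs are placeholders.
import random
import threading
import time

import requests

ENGINE = "https://rhevm.example.com/api"        # placeholder engine URL
AUTH = ("admin@internal", "password")           # placeholder credentials
HEADERS = {"Content-Type": "application/xml"}
VM_IDS = ["<vm-uuid-1>", "<vm-uuid-2>"]         # placeholder VM UUIDs
THREADS = 100                                   # concurrency used in the report
DURATION = 600                                  # seconds to keep the load running

def worker(deadline):
    while time.time() < deadline:
        vm = random.choice(VM_IDS)
        op = random.choice(["get", "stop", "start"])
        try:
            if op == "get":
                requests.get("%s/vms/%s" % (ENGINE, vm), auth=AUTH, verify=False)
            else:
                # In the v3 API, start/stop are POSTs of an empty <action/> body.
                requests.post("%s/vms/%s/%s" % (ENGINE, vm, op),
                              data="<action/>", headers=HEADERS,
                              auth=AUTH, verify=False)
        except requests.RequestException:
            pass  # keep the load going even if individual calls fail

deadline = time.time() + DURATION
threads = [threading.Thread(target=worker, args=(deadline,)) for _ in range(THREADS)]
for t in threads:
    t.start()
for t in threads:
    t.join()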
Created attachment 1072161 [details] UI screenshot
Created attachment 1072163 [details] Engine log
Created attachment 1072164 [details] vdsm log
Created attachment 1072166 [details] vdsm 1 log
[root@dhcp-yuri ovirt-engine]# rpm -qa | grep ovirt-engine
rhevm-setup-plugin-ovirt-engine-common-3.5.4-1.2.el6ev.noarch
rhevm-setup-plugin-ovirt-engine-3.5.4-1.2.el6ev.noarch

[root@ucs1-b420-1 vdsm]# rpm -qa | grep vdsm
vdsm-cli-4.16.13.1-1.el6ev.noarch
vdsm-python-zombiereaper-4.16.13.1-1.el6ev.noarch
vdsm-xmlrpc-4.16.13.1-1.el6ev.noarch
vdsm-yajsonrpc-4.16.13.1-1.el6ev.noarch
vdsm-4.16.13.1-1.el6ev.x86_64
vdsm-python-4.16.13.1-1.el6ev.noarch
vdsm-jsonrpc-4.16.13.1-1.el6ev.noarch
This VDSM is old (4.16.13), but it already includes the fix for https://bugzilla.redhat.com/show_bug.cgi?id=1143968, namely patch 46005.
This missed the oVirt 3.6.0 branch-off, but it is not an oVirt 3.6 GA blocker.
This BZ was caused by a race; I don't think it deserves a mention in the documentation.
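The comment above does not detail the race. Purely as an illustration (a hypothetical sketch, not the actual RHEV-M engine code), the snippet below shows how a non-atomic check-then-act between two threads reporting the same VM on different hosts can leave the VM counted twice, which is the kind of race that inflates a UI VM count.

# Illustrative only -- a hypothetical check-then-act race, NOT the actual
# RHEV-M code. Two threads reporting the same VM on different hosts can both
# see it as "not tracked" and both add it, inflating the total VM count.
import threading

host_vms = {"host1": set(), "host2": set()}   # hypothetical per-host bookkeeping
lock = threading.Lock()

def report_vm_racy(host, vm_id):
    # The membership check and the insert are separate steps, so two
    # concurrent callers can both pass the check before either inserts.
    if all(vm_id not in vms for vms in host_vms.values()):
        host_vms[host].add(vm_id)

def report_vm_locked(host, vm_id):
    # Holding one lock across the whole check-and-move keeps the count consistent.
    with lock:
        for vms in host_vms.values():
            vms.discard(vm_id)
        host_vms[host].add(vm_id)

def total_vm_count():
    return sum(len(vms) for vms in host_vms.values())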
Verified: the bug no longer reproduces on the following versions.

[root@host25-rack06 ~]# rpm -qa | grep ovirt-engine
rhevm-setup-plugin-ovirt-engine-common-3.6.3-0.1.el6.noarch
ovirt-engine-extension-aaa-jdbc-1.0.5-1.el6ev.noarch
rhevm-setup-plugin-ovirt-engine-3.6.3-0.1.el6.noarch

[root@host02-rack06 ~]# rpm -qa | grep vdsm
vdsm-python-4.17.19-0.el7ev.noarch
vdsm-hook-vmfex-dev-4.17.19-0.el7ev.noarch
vdsm-jsonrpc-4.17.19-0.el7ev.noarch
vdsm-yajsonrpc-4.17.19-0.el7ev.noarch
vdsm-xmlrpc-4.17.19-0.el7ev.noarch
vdsm-cli-4.17.19-0.el7ev.noarch
vdsm-4.17.19-0.el7ev.noarch
vdsm-infra-4.17.19-0.el7ev.noarch

Can be closed.
Since the problem described in this bug report should be resolved in a recent advisory, it has been closed with a resolution of ERRATA. For information on the advisory, and where to find the updated files, follow the link below. If the solution does not work for you, open a new bug report. https://rhn.redhat.com/errata/RHBA-2016-0362.html