Bug 1209394

Summary: domstats cannot be used when no options are given
Product: Red Hat Enterprise Linux 7
Reporter: Luyao Huang <lhuang>
Component: libvirt
Assignee: Peter Krempa <pkrempa>
Status: CLOSED ERRATA
QA Contact: Virtualization Bugs <virt-bugs>
Severity: medium
Priority: medium
Version: 7.2
CC: dyuan, hliu, honzhang, mzhan, rbalakri
Target Milestone: rc
Target Release: ---
Hardware: x86_64
OS: Linux
Fixed In Version: libvirt-1.2.15-1.el7
Doc Type: Bug Fix
Type: Bug
Last Closed: 2015-11-19 06:27:26 UTC

Description Luyao Huang 2015-04-07 09:27:14 UTC
Description of problem:
domstats cannot be used when no options are given

Version-Release number of selected component (if applicable):
libvirt-1.2.14-1.el7.x86_64

How reproducible:
100%

Steps to Reproduce:

# virsh domstats
error: An error occurred, but the cause is unknown


Actual results:
error: An error occurred, but the cause is unknown

Expected results:
virsh domstats should report statistics for the remaining domains even when one domain's disks are inaccessible, instead of failing with an unknown error.

Additional info:

I found this error in libvirtd.log:

2015-04-07 08:19:18.180+0000: 13321: error : virStorageFileBackendGlusterInit:628 : failed to initialize gluster connection to server: '/var/run/gluster/glusterd.socket': Transport endpoint is not connected

and querying that VM directly reproduces the error:

# virsh domstats r6-qcow2
error: failed to initialize gluster connection to server: '/var/run/gluster/glusterd.socket': Transport endpoint is not connected

Comment 1 Peter Krempa 2015-04-16 07:40:52 UTC
Fixed upstream:

commit 25aa7035d373ab17eb0ba773ad521e0dbfb7449d
Author: Peter Krempa <pkrempa>
Date:   Wed Apr 15 18:14:30 2015 +0200

    qemu: bulk stats: Ignore errors from missing/inaccessible disks
    
    Rather than erroring out make the best attempt to retrieve other data if
    disks are inaccessible or missing. The failure will still be logged
    though.
    
    Since the bulk stats API is called on multiple domains an error like
    this makes the API unusable. This regression was introduced by commit
    596a13713420e01b20ce3dc3fdbe06d073682675

v1.2.14-198-g25aa703

Comment 3 vivian zhang 2015-07-15 07:50:57 UTC
I can reproduce this bug with build libvirt-1.2.14-1.el7.x86_64:

# virsh domstats
error: An error occurred, but the cause is unknown


Verified with build libvirt-1.2.17-2.el7.x86_64:

# virsh list --all
 Id    Name                           State
----------------------------------------------------
 6     vm1                            running
 -     mig-11                         shut off
 -     mig-12                         shut off
 -     mig-13                         shut off
 -     mig-14                         shut off
 -     mig-15                         shut off
 -     mig-3                          shut off
 -     mig-4                          shut off
 -     mig-5                          shut off
 -     mig-6                          shut off
 -     mig-7                          shut off
 -     mig-8                          shut off
 -     mig-9                          shut off
 -     r                              shut off
 -     r7                             shut off
 -     rhel7                          shut off
 -     rhel7-ovmf                     shut off
 -     rhelvm1                        shut off
 -     rhevm                          shut off
 -     test                           shut off
 -     vm2                            shut off
 -     win7                           shut off
 -     win71                          shut off


# virsh domstats
Domain: 'win7'
  state.state=5
  state.reason=0
  balloon.maximum=1048576
  vcpu.current=1
  vcpu.maximum=1
  block.count=1
  block.0.name=hda
  block.0.path=/var/lib/libvirt/images/kvm-win7-x86_64-qcow2.img
  block.0.allocation=15386755072
  block.0.capacity=16106127360
  block.0.physical=15386738688

Domain: 'mig-7'
  state.state=5
  state.reason=0
  balloon.maximum=262144
  vcpu.current=1
  vcpu.maximum=1
  block.count=1
  block.0.name=vda
  block.0.path=/mnt/mig-7

Domain: 'mig-5'
  state.state=5
  state.reason=0
  balloon.maximum=262144
  vcpu.current=1
  vcpu.maximum=1
  block.count=1
  block.0.name=vda
  block.0.path=/mnt/mig-5

Domain: 'vm2'
  state.state=5
  state.reason=0
  balloon.maximum=1048576
  vcpu.current=1
  vcpu.maximum=1
  block.count=1
  block.0.name=hda
  block.0.path=/home/wzhang/VirtualMachines/rhel6.img
  block.0.allocation=3855093760
  block.0.capacity=8388608000
  block.0.physical=3855089664

Domain: 'r7'
  state.state=5
  state.reason=0
  balloon.maximum=1048576
  vcpu.current=1
  vcpu.maximum=1
  block.count=1
  block.0.name=vda
  block.0.path=/var/lib/libvirt/images/rhel7.0-2.qcow2
  block.0.allocation=3981164544
  block.0.capacity=8589934592
  block.0.physical=8591507456

Domain: 'mig-11'
  state.state=5
  state.reason=0
  balloon.maximum=262144
  vcpu.current=1
  vcpu.maximum=1
  block.count=1
  block.0.name=vda
  block.0.path=/mnt/mig-11

Domain: 'mig-12'
  state.state=5
  state.reason=0
  balloon.maximum=262144
  vcpu.current=1
  vcpu.maximum=1
  block.count=1
  block.0.name=vda
  block.0.path=/mnt/mig-12

Domain: 'test'
  state.state=5
  state.reason=0
  balloon.maximum=1048576
  vcpu.current=2
  vcpu.maximum=2
  block.count=1
  block.0.name=vda
  block.0.path=/var/lib/libvirt/images/test.img
  block.0.allocation=109219840
  block.0.capacity=8589934592
  block.0.physical=7409324032

Domain: 'rhel7'
  state.state=5
  state.reason=0
  balloon.maximum=1048576
  vcpu.current=8
  vcpu.maximum=8
  block.count=1
  block.0.name=vda
  block.0.path=/var/lib/libvirt/images/test.rhel7.s2
  block.0.allocation=64540672
  block.0.capacity=8589934592
  block.0.physical=971572224

Domain: 'mig-3'
  state.state=5
  state.reason=0
  balloon.maximum=262144
  vcpu.current=1
  vcpu.maximum=1
  block.count=1
  block.0.name=vda
  block.0.path=/mnt/mig-3

Domain: 'mig-13'
  state.state=5
  state.reason=0
  balloon.maximum=262144
  vcpu.current=1
  vcpu.maximum=1
  block.count=1
  block.0.name=vda
  block.0.path=/mnt/mig-13

Domain: 'mig-9'
  state.state=5
  state.reason=0
  balloon.maximum=262144
  vcpu.current=1
  vcpu.maximum=1
  block.count=1
  block.0.name=vda
  block.0.path=/mnt/mig-9

Domain: 'win71'
  state.state=5
  state.reason=0
  balloon.maximum=1048576
  vcpu.current=1
  vcpu.maximum=1
  block.count=1
  block.0.name=hda
  block.0.path=/var/lib/libvirt/images/kvm-win7-x86_64-qcow2.1.img
  block.0.allocation=15423389696
  block.0.capacity=16106127360
  block.0.physical=15423373312

Domain: 'mig-8'
  state.state=5
  state.reason=0
  balloon.maximum=262144
  vcpu.current=1
  vcpu.maximum=1
  block.count=1
  block.0.name=vda
  block.0.path=/mnt/mig-8

Domain: 'mig-15'
  state.state=5
  state.reason=0
  balloon.maximum=262144
  vcpu.current=1
  vcpu.maximum=1
  block.count=1
  block.0.name=vda
  block.0.path=/mnt/mig-15

Domain: 'mig-14'
  state.state=5
  state.reason=0
  balloon.maximum=262144
  vcpu.current=1
  vcpu.maximum=1
  block.count=1
  block.0.name=vda
  block.0.path=/mnt/mig-14

Domain: 'r'
  state.state=5
  state.reason=0
  balloon.maximum=1048576
  vcpu.current=1
  vcpu.maximum=1
  block.count=1
  block.0.name=vda
  block.0.path=/var/lib/libvirt/images/rhel7.0-1.qcow2

Domain: 'mig-4'
  state.state=5
  state.reason=0
  balloon.maximum=262144
  vcpu.current=1
  vcpu.maximum=1
  block.count=1
  block.0.name=vda
  block.0.path=/mnt/mig-4

Domain: 'rhel7-ovmf'
  state.state=5
  state.reason=0
  balloon.maximum=1024000
  vcpu.current=8
  vcpu.maximum=8
  block.count=3
  block.0.name=hdc
  block.0.path=var/lib/libvirt/images/test.img
  block.0.allocation=109219840
  block.0.capacity=7409324032
  block.0.physical=7409324032
  block.1.name=fda
  block.1.path=/var/lib/libvirt/images/floppy.img
  block.2.name=vda
  block.2.path=/var/lib/libvirt/images/r71_gpt.img

Domain: 'rhelvm1'
  state.state=5
  state.reason=0
  balloon.maximum=1048576
  vcpu.current=1
  vcpu.maximum=1
  block.count=1
  block.0.name=hda
  block.0.path=/var/lib/libvirt/images/test.img
  block.0.allocation=109219840
  block.0.capacity=8589934592
  block.0.physical=7409324032

Domain: 'rhevm'
  state.state=5
  state.reason=0
  balloon.maximum=1048576
  vcpu.current=1
  vcpu.maximum=1
  block.count=1
  block.0.name=hda
  block.0.path=/var/lib/libvirt/images/rhevm.img

Domain: 'mig-6'
  state.state=5
  state.reason=0
  balloon.maximum=262144
  vcpu.current=1
  vcpu.maximum=1
  block.count=1
  block.0.name=vda
  block.0.path=/mnt/mig-6

Domain: 'vm1'
  state.state=1
  state.reason=1
  cpu.time=4719913425
  cpu.user=460000000
  cpu.system=1460000000
  balloon.current=1048576
  balloon.maximum=1048576
  vcpu.current=1
  vcpu.maximum=1
  vcpu.0.state=1
  vcpu.0.time=4490000000
  net.count=2
  net.0.name=vnet0
  net.0.rx.bytes=664
  net.0.rx.pkts=9
  net.0.rx.errs=0
  net.0.rx.drop=0
  net.0.tx.bytes=0
  net.0.tx.pkts=0
  net.0.tx.errs=0
  net.0.tx.drop=0
  net.1.name=vnet1
  net.1.rx.bytes=612
  net.1.rx.pkts=8
  net.1.rx.errs=0
  net.1.rx.drop=0
  net.1.tx.bytes=0
  net.1.tx.pkts=0
  net.1.tx.errs=0
  net.1.tx.drop=0
  block.count=2
  block.0.name=hda
  block.0.path=/mnt/wzhang/rhel7.0-3.qcow2
  block.0.rd.reqs=2151
  block.0.rd.bytes=1101312
  block.0.rd.times=1546714619
  block.0.wr.reqs=0
  block.0.wr.bytes=0
  block.0.wr.times=0
  block.0.fl.reqs=0
  block.0.fl.times=0
  block.0.allocation=0
  block.0.capacity=9663676416
  block.0.physical=1516363776
  block.1.name=vda
  block.1.path=/mnt/wzhang/rhel7.qcow2
  block.1.rd.reqs=0
  block.1.rd.bytes=0
  block.1.rd.times=0
  block.1.wr.reqs=0
  block.1.wr.bytes=0
  block.1.wr.times=0
  block.1.fl.reqs=0
  block.1.fl.times=0
  block.1.allocation=0
  block.1.capacity=8589934592
  block.1.physical=3988996096

Listing stats for all domains succeeded; no errors found.

Moving to VERIFIED.

Comment 5 errata-xmlrpc 2015-11-19 06:27:26 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory, and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://rhn.redhat.com/errata/RHBA-2015-2202.html