Bug 1429288

Summary: RHVH 4.0.7 does not come back up on the RHEVM 4.0 side after upgrade
Product: Red Hat Enterprise Virtualization Manager
Reporter: jianwu <jiawu>
Component: rhev-hypervisor-ng
Assignee: Yuval Turgeman <yturgema>
Status: CLOSED ERRATA
QA Contact: jianwu <jiawu>
Severity: urgent
Docs Contact:
Priority: unspecified
Version: 4.0.7
CC: bugs, cshao, dfediuck, dguo, eheftman, gklein, jiawu, leiwang, lsurette, mgoldboi, rbalakri, rbarry, Rhev-m-bugs, srevivo, weiwang, yaniwang, ycui, ykaul, ylavi, yturgema, yzhao
Target Milestone: ovirt-4.1.1
Keywords: Regression, TestBlocker, ZStream
Target Release: ---
Hardware: Unspecified
OS: Unspecified
Whiteboard:
Fixed In Version: imgbased-0.9.17-0.1.el7ev
Doc Type: Bug Fix
Doc Text:
Previously, VDSM was not configured after upgrading Red Hat Virtualization Host (RHVH). As a result, RHVH could not run together with a Manager running an older version. In this release, VDSM is configured successfully by running vdsm-tool configure --force on boot.
Story Points: ---
Clone Of:
Clones: 1429594 (view as bug list)
Environment:
Last Closed: 2017-04-20 19:03:48 UTC
Type: Bug
Regression: ---
Mount Type: ---
Documentation: ---
CRM:
Verified Versions:
Category: ---
oVirt Team: Node
RHEL 7.3 requirements from Atomic Host:
Cloudforms Team: ---
Target Upstream Version:
Embargoed:
Bug Depends On:    
Bug Blocks: 1429594    
Attachments:
Description | Flags
4.0.7 related log about upgrade issue | none
related picture about this issue | none
another picture about vdsmd.service | none
journalctl output | none

Description jianwu 2017-03-06 02:41:00 UTC
Created attachment 1260242 [details]
4.0.7 related log about upgrade issue

Description of problem:

After registering an old RHVH 4.0 build to RHEVM 4.0 and upgrading to the new RHVH 4.0.7 build from the engine side, the host does not come back up.

Version-Release number of selected component (if applicable):
Before upgrade:
rhvh-4.0-0.20160919.1
After upgrade:
redhat-virtualization-host-4.0-20170302.0.x86_64 (new build)
imgbased-0.8.15-0.1.el7ev.noarch
kernel-3.10.0-514.10.2.el7.x86_64


How reproducible:
100% 

Steps to Reproduce:
1. Install the RHVH 4.0 old build via the anaconda GUI
2. Reboot and set up a local repo
3. Register to RHEVM 4.0 and upgrade from the RHEVM side
4. After the upgrade completes, reboot into the new system
5. Check the RHVH status on the RHEVM side
6. Check the vdsm status: # service vdsmd status

Actual results:
After step 5, RHVH 4.0.7 does not come up on the RHEVM 4.0 side.
After step 6:
# service vdsmd status
   vdsmd.service - Virtual Desktop Server Manager
   Loaded: loaded (/usr/lib/systemd/system/vdsmd.service; enabled; vendor preset: enabled)
   Active: failed (Result: start-limit) since Sun 2017-03-05 19:00:20 CST; 23min ago
  Process: 5023 ExecStartPre=/usr/libexec/vdsm/vdsmd_init_common.sh --pre-start (code=exited, status=1/FAILURE)
.........................................

Expected results:
After step 5, RHVH 4.0.7 comes up on the RHEVM 4.0 side.

Additional info:
1. The same issue occurs when upgrading via "yum update".
2. RHVH 4.0.7 comes up on the RHEVM side when installed directly.

Comment 1 jianwu 2017-03-06 02:43:34 UTC
Created attachment 1260243 [details]
related picture about this issue

Comment 2 jianwu 2017-03-06 02:44:11 UTC
Created attachment 1260244 [details]
another picture about vdsmd.service

Comment 3 jianwu 2017-03-06 03:00:21 UTC
This bug blocks upgrade testing on the RHEVM side, and the same issue happens via "yum update", so I could not check the status on the RHEVM side or run the related tests.

Comment 4 Ryan Barry 2017-03-06 03:20:23 UTC
Can you please provide journalctl output?

Comment 5 jianwu 2017-03-06 04:22:44 UTC
Created attachment 1260262 [details]
journalctl output

Comment 6 jianwu 2017-03-06 04:24:27 UTC
(In reply to Ryan Barry from comment #4)
> Can you please provide journalctl output?

Hi,
I have uploaded attachment 1260262 [details] with the journalctl output; please check it.

Thanks

Comment 7 Ryan Barry 2017-03-06 08:04:32 UTC
Thanks jianwu.

I'm not sure this is unexpected behavior, though it's not nice.

We've carefully avoided adding a 'vdsm-tool configure --force' as part of NGN booting (which vintage RHV-H had).

The version of RHV-H used for upgrading was particularly old, but I'm not sure how vdsm handles this behind the scenes.

The version of lvmlocal.conf from the old version did not have the necessary configuration. In fact, it appears to have matched the installed version exactly:

# mount
/dev/mapper/rhvh_dhcp--10--229-rhvh--4.0--0.20160919.0+1 on /tmp/a type xfs (rw,relatime,seclabel,attr2,inode64,logbsize=256k,sunit=512,swidth=512,noquota)
# diff -u /tmp/a/etc/lvm/lvmlocal.conf /tmp/a/usr/share/factory/etc/lvm/lvmlocal.conf
(no output: the files are identical)

In this case, we took the lvmlocal.conf from the new image, but this is also not what vdsm expects. Differences between /etc/lvm/lvmlocal.conf before and after "vdsm-tool configure --force" are significant.

I don't see anything in the relevant topic branch which looks like vdsm should keep/handle this on upgrades either. I'm not very familiar with how vdsm handles upgrades, though.

A quick glance at the specfile shows:

if ! %{_bindir}/vdsm-tool is-configured >/dev/null 2>&1; then
   %{_bindir}/vdsm-tool configure --force >/dev/null 2>&1
fi

I'd like to match this if possible.

Yuvalt:

Can you please add a method in plugins/osupdater which matches this?
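
For reference, the specfile logic quoted above could be mirrored roughly like this. This is only a minimal Python sketch: the function name and the injectable run parameter are illustrative, not the actual imgbased osupdater API.

```python
import subprocess

def ensure_vdsm_configured(run=subprocess.run):
    """Mirror the vdsm specfile snippet: force-configure only when
    'vdsm-tool is-configured' reports an unconfigured state.

    'run' is injectable so the decision logic can be exercised
    without a real vdsm-tool binary; by default it shells out.
    """
    check = run(["vdsm-tool", "is-configured"],
                stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
    if check.returncode != 0:
        # Not configured: apply the forced configuration, as the
        # specfile scriptlet does.
        run(["vdsm-tool", "configure", "--force"],
            stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
        return True
    return False
```

The point of the guard is the same as in the scriptlet: configure is only forced when is-configured fails, so an already-configured host is left untouched.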

Comment 10 jianwu 2017-03-13 11:09:32 UTC
Hi, 
I have verified this issue on redhat-virtualization-host-4.1-20170308.1.x86_64

Version-Release number of selected component (if applicable):
Before upgrade:
rhvh-4.0-0.20160919.1
After upgrade:
redhat-virtualization-host-4.1-20170308.1.x86_64 (new build)
imgbased-0.9.17-0.1.el7ev


Test Results:

The new RHVH build comes up again on the RHEVM 4.0 side.

So I think this bug is fixed in this build; I will change the status to VERIFIED.

Comment 11 Emma Heftman 2017-04-12 13:15:23 UTC
Hi Yuval
Can you please confirm that this doc text is technically accurate?

Previously, VDSM was not configured after upgrading Red Hat Virtualization Host (RHVH). As a result, RHVH could not run together with a Manager running an older version. In this release, VDSM is successfully configured by running vdsm-tool configure --force on boot, and RHVH 4.1 can run next to a 4.0 Manager.

Comment 12 Yuval Turgeman 2017-04-13 07:34:47 UTC
Hi Emma, sounds good, but I don't think it's a 4.1 thing; the bug was reported against RHVH 4.0.7.

Comment 13 errata-xmlrpc 2017-04-20 19:03:48 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory, and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHEA-2017:1114