Bug 1035797

Summary: TRACKING - [RHEVH] RHEVH upgrade failed ... ImportError: No module named puppet.puppet_page .
Product: Red Hat Enterprise Virtualization Manager
Reporter: Martin Pavlik <mpavlik>
Component: ovirt-node
Assignee: Fabian Deutsch <fdeutsch>
Status: CLOSED DUPLICATE
QA Contact: Pavel Stehlik <pstehlik>
Severity: high
Docs Contact:
Priority: urgent
Version: 3.3.0
CC: acathrow, alonbl, bazulay, dougsland, gklein, gouyang, iheim, Rhev-m-bugs, yeylon
Target Milestone: ---
Keywords: TestOnly
Target Release: 3.3.0
Hardware: All
OS: Linux
Whiteboard: node
Fixed In Version:
Doc Type: Bug Fix
Doc Text:
Story Points: ---
Clone Of:
Cloned To: 1036061 (view as bug list)
Environment:
Last Closed: 2013-12-10 12:28:29 UTC
Type: Bug
Regression: ---
Mount Type: ---
Documentation: ---
CRM:
Verified Versions:
Category: ---
oVirt Team: Node
RHEL 7.3 requirements from Atomic Host:
Cloudforms Team: ---
Target Upstream Version:
Bug Depends On:
Bug Blocks: 1036061
Attachments:
log_collector (flags: none)

Description Martin Pavlik 2013-11-28 13:52:04 UTC
Created attachment 830244 [details]
log_collector

Description of problem:
RHEVM reports that RHEVH upgrade failed 

Host rhevh_bug installation failed. Unexpected error during execution: Traceback (most recent call last): File "/etc/ovirt-config-boot.d/puppet_autoinstall.py", line 20, in <module> from ovirt.node.setup.puppet.puppet_page import * ImportError: No module named puppet.puppet_page .

The upgrade was from rhev-hypervisor6-6.5-20131121.0.el6ev
to RHEV Hypervisor - 6.5 - 20131127.0.el6,
performed via RHEVM.

Version-Release number of selected component (if applicable):
Red Hat Enterprise Virtualization Manager Version: 3.3.0-0.37.beta1.el6ev


How reproducible:


Steps to Reproduce:
1. Add a host running rhev-hypervisor6-6.5-20131121.0.el6ev to a 3.3 cluster.
2. Install the rpm providing RHEV Hypervisor - 6.5 - 20131127.0.el6 on the RHEVM machine.
3. Put the host into maintenance and run the upgrade.

Actual results:
The install is reported as failed; despite the reported failure, the host is actually upgraded and comes UP after clicking Activate.

Expected results:
Successful upgrade with no errors.

Additional info:
2013-11-28 14:28:14,535 ERROR [org.ovirt.engine.core.utils.ssh.SSHDialog] (pool-4-thread-48) SSH error running command root.66.13:'/usr/share/vdsm-reg/vdsm-upgrade': java.lang.RuntimeException: Unexpected error during execution: Traceback (most recent call last):
  File "/etc/ovirt-config-boot.d/puppet_autoinstall.py", line 20, in <module>
    from ovirt.node.setup.puppet.puppet_page import *
ImportError: No module named puppet.puppet_page

	at org.ovirt.engine.core.utils.ssh.SSHDialog.executeCommand(SSHDialog.java:330) [utils.jar:]
	at org.ovirt.engine.core.bll.OVirtNodeUpgrade.execute(OVirtNodeUpgrade.java:203) [bll.jar:]
	at org.ovirt.engine.core.bll.InstallVdsCommand.upgradeNode(InstallVdsCommand.java:243) [bll.jar:]
	at org.ovirt.engine.core.bll.InstallVdsCommand.executeCommand(InstallVdsCommand.java:103) [bll.jar:]
	at org.ovirt.engine.core.bll.CommandBase.executeWithoutTransaction(CommandBase.java:1134) [bll.jar:]
	at org.ovirt.engine.core.bll.CommandBase.executeActionInTransactionScope(CommandBase.java:1219) [bll.jar:]
	at org.ovirt.engine.core.bll.CommandBase.runInTransaction(CommandBase.java:1895) [bll.jar:]
	at org.ovirt.engine.core.utils.transaction.TransactionSupport.executeInSuppressed(TransactionSupport.java:174) [utils.jar:]
	at org.ovirt.engine.core.utils.transaction.TransactionSupport.executeInScope(TransactionSupport.java:116) [utils.jar:]
	at org.ovirt.engine.core.bll.CommandBase.execute(CommandBase.java:1239) [bll.jar:]
	at org.ovirt.engine.core.bll.CommandBase.executeAction(CommandBase.java:362) [bll.jar:]
	at org.ovirt.engine.core.bll.MultipleActionsRunner.executeValidatedCommand(MultipleActionsRunner.java:175) [bll.jar:]
	at org.ovirt.engine.core.bll.MultipleActionsRunner.RunCommands(MultipleActionsRunner.java:156) [bll.jar:]
	at org.ovirt.engine.core.bll.MultipleActionsRunner$1.run(MultipleActionsRunner.java:94) [bll.jar:]
	at org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil$InternalWrapperRunnable.run(ThreadPoolUtil.java:71) [utils.jar:]
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471) [rt.jar:1.7.0_45]
	at java.util.concurrent.FutureTask.run(FutureTask.java:262) [rt.jar:1.7.0_45]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) [rt.jar:1.7.0_45]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) [rt.jar:1.7.0_45]
	at java.lang.Thread.run(Thread.java:744) [rt.jar:1.7.0_45]
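
The failure comes from the unconditional wildcard import at the top of /etc/ovirt-config-boot.d/puppet_autoinstall.py: when the ovirt.node.setup.puppet package is absent from the image, the whole boot hook aborts with the ImportError above, which RHEVM surfaces as a failed upgrade. A minimal defensive sketch of how such a hook could skip a missing optional plugin instead of dying (illustrative only; PUPPET_AVAILABLE and run_autoinstall are hypothetical names, not the actual fix, which is carried in bug 1036061):

```python
# Sketch: guard an optional plugin import so a missing module is
# skipped gracefully instead of aborting the upgrade hook.
# The module path mirrors the traceback; this is not the real fix.
try:
    from ovirt.node.setup.puppet.puppet_page import *  # noqa: F401,F403
    PUPPET_AVAILABLE = True
except ImportError:
    # Plugin not shipped in this image; remember that and carry on.
    PUPPET_AVAILABLE = False


def run_autoinstall():
    """Run the puppet auto-install step only when the plugin is present."""
    if not PUPPET_AVAILABLE:
        return "skipped: puppet plugin not installed"
    return "running puppet autoinstall"
```

With this guard, a host image built without the puppet plugin would complete the hook (and the upgrade) cleanly rather than propagating the ImportError back through vdsm-upgrade to the engine.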

Comment 1 Fabian Deutsch 2013-11-29 10:07:09 UTC
This looks like a bug in the ovirt-node codebase. It has been cloned so the fix can be tracked and tested separately later on.

Comment 2 Fabian Deutsch 2013-12-10 11:26:24 UTC
Moving this bug to ovirt-host-deploy (please correct), because this should be tracked by some RHEV-M team member.

Comment 3 Alon Bar-Lev 2013-12-10 11:52:23 UTC
(In reply to Fabian Deutsch from comment #2)
> Moving this bug to ovirt-host-deploy (please correct), because this should
> be tracked by some RHEV-M team member.

Why?

This failing plugins/puppet_autoinstall.py[1] file is at ovirt-node.

[1] http://gerrit.ovirt.org/gitweb?p=ovirt-node.git;a=blob;f=plugins/puppet_autoinstall.py;hb=HEAD

Comment 4 Fabian Deutsch 2013-12-10 11:59:26 UTC
(In reply to Alon Bar-Lev from comment #3)
> (In reply to Fabian Deutsch from comment #2)
> > Moving this bug to ovirt-host-deploy (please correct), because this should
> > be tracked by some RHEV-M team member.
> 
> Why?
> 
> This failing plugins/puppet_autoinstall.py[1] file is at ovirt-node.
> 
> [1]
> http://gerrit.ovirt.org/gitweb?p=ovirt-node.git;a=blob;f=plugins/
> puppet_autoinstall.py;hb=HEAD

Yes, that is why we cloned this bug to bug 1036061 (the ovirt-node bug that will carry the fix).
This bug (as stated in the title and in comment 1) is merely for testing and tracking.

Please move it to some RHEV-M component or close it if you don't need it.

Comment 5 Alon Bar-Lev 2013-12-10 12:05:48 UTC
I am unsure why there should be two bugs for the same issue.

The faulty component is ovirt-node; there is no reason to move the bug to any other component.

If you think someone should monitor this and we need a duplicate bug for some reason, please find that person so they can set themselves as assignee.

If not, please mark one bug as duplicate of the other.

Comment 6 Fabian Deutsch 2013-12-10 12:28:29 UTC
As with other components, it can be helpful to track the high-level bug in a TestOnly bug, to ensure that a fix in a low-level component really resolved the issue.

*** This bug has been marked as a duplicate of bug 1036061 ***