Bug 1112370
Summary: yum packager is active while running with offline packager

| Field | Value | Field | Value |
|---|---|---|---|
| Product | [oVirt] otopi | Reporter | Marina Kalinin <mkalinin> |
| Component | Plugins.packagers | Assignee | Alon Bar-Lev <alonbl> |
| Status | CLOSED CURRENTRELEASE | QA Contact | Jiri Belka <jbelka> |
| Severity | urgent | Docs Contact | |
| Priority | urgent | | |
| Version | 1.0.0 | CC | aberezin, bazulay, dougsland, iheim, jbelka, oourfali, pstehlik, rbalakri, Rhev-m-bugs, yeylon |
| Target Milestone | --- | Keywords | ZStream |
| Target Release | 1.2.2 | Flags | alonbl: devel_ack+ |
| Hardware | Unspecified | | |
| OS | Unspecified | | |
| Whiteboard | infra | | |
| Fixed In Version | ovirt-engine-3.5.0_rc1 | Doc Type | Bug Fix |
| Doc Text | | Story Points | --- |
| Clone Of | | | |
| | 1114657 (view as bug list) | Environment | |
| Last Closed | 2015-02-11 20:42:05 UTC | Type | Bug |
| Regression | --- | Mount Type | --- |
| Documentation | --- | CRM | |
| Verified Versions | | Category | --- |
| oVirt Team | Infra | RHEL 7.3 requirements from Atomic Host | |
| Cloudforms Team | --- | Target Upstream Version | |
| Embargoed | | | |
| Bug Depends On | | | |
| Bug Blocks | 1114657, 1142923, 1156165 | | |
Description
Marina Kalinin
2014-06-23 18:29:23 UTC
From /var/log/ovirt-engine/host-deploy/ovirt-20140623131027-10.10.183.71-621e3055.log:

~~~
2014-06-23 17:10:29 DEBUG otopi.context context._executeMethod:138 Stage internal_packages METHOD otopi.plugins.otopi.packagers.yumpackager.Plugin._internal_packages_end
2014-06-23 17:10:29 DEBUG otopi.plugins.otopi.packagers.yumpackager yumpackager.verbose:88 Yum Building transaction
2014-06-23 17:10:29 ERROR otopi.plugins.otopi.packagers.yumpackager yumpackager.error:97 Yum Cannot retrieve repository metadata (repomd.xml) for repository: plugin_repo. Please verify its path and try again
2014-06-23 17:10:29 DEBUG otopi.context context._executeMethod:152 method exception
Traceback (most recent call last):
  File "/tmp/ovirt-ZlmhmkkfLD/pythonlib/otopi/context.py", line 142, in _executeMethod
    method['method']()
  File "/tmp/ovirt-ZlmhmkkfLD/otopi-plugins/otopi/packagers/yumpackager.py", line 237, in _internal_packages_end
    if self._miniyum.buildTransaction():
  File "/tmp/ovirt-ZlmhmkkfLD/pythonlib/otopi/miniyum.py", line 910, in buildTransaction
    rc, msg = self._yb.buildTransaction()
  File "/usr/lib/python2.6/site-packages/yum/__init__.py", line 979, in buildTransaction
  File "/usr/lib/python2.6/site-packages/yum/depsolve.py", line 737, in resolveDeps
  File "/usr/lib/python2.6/site-packages/yum/__init__.py", line 899, in <lambda>
  File "/usr/lib/python2.6/site-packages/yum/depsolve.py", line 110, in _getTsInfo
  File "/usr/lib/python2.6/site-packages/yum/__init__.py", line 887, in <lambda>
  File "/usr/lib/python2.6/site-packages/yum/__init__.py", line 669, in _getSacks
  File "/usr/lib/python2.6/site-packages/yum/repos.py", line 308, in populateSack
  File "/usr/lib/python2.6/site-packages/yum/yumRepo.py", line 165, in populate
  File "/usr/lib/python2.6/site-packages/yum/yumRepo.py", line 223, in _check_db_version
  File "/usr/lib/python2.6/site-packages/yum/yumRepo.py", line 1256, in _check_db_version
  File "/usr/lib/python2.6/site-packages/yum/yumRepo.py", line 1455, in <lambda>
  File "/usr/lib/python2.6/site-packages/yum/yumRepo.py", line 1451, in _getRepoXML
RepoError: Cannot retrieve repository metadata (repomd.xml) for repository: plugin_repo. Please verify its path and try again
2014-06-23 17:10:29 ERROR otopi.context context._executeMethod:161 Failed to execute stage 'Environment packages setup': Cannot retrieve repository metadata (repomd.xml) for repository: plugin_repo. Please verify its path and try again
2014-06-23 17:10:29 DEBUG otopi.transaction transaction.abort:131 aborting 'Yum Transaction'
~~~

From engine.log:

~~~
2014-06-23 13:10:27,899 ERROR [org.ovirt.engine.core.utils.ssh.SSHDialog] (org.ovirt.thread.pool-4-thread-38) SSH error running command root@10.10.183.71:'umask 0077; MYTMP="$(mktemp -t ovirt-XXXXXXXXXX)"; trap "chmod -R u+rwX \"${MYTMP}\" > /dev/null 2>&1; rm -fr \"${MYTMP}\" > /dev/null 2>&1" 0; rm -fr "${MYTMP}" && mkdir "${MYTMP}" && tar --warning=no-timestamp -C "${MYTMP}" -x && "${MYTMP}"/setup DIALOG/dialect=str:machine DIALOG/customization=bool:True':
java.io.IOException: Command returned failure code 1 during SSH session 'root@10.10.183.71'
    at org.ovirt.engine.core.utils.ssh.SSHClient.executeCommand(SSHClient.java:527) [utils.jar:]
    at org.ovirt.engine.core.utils.ssh.SSHDialog.executeCommand(SSHDialog.java:318) [utils.jar:]
    at org.ovirt.engine.core.bll.VdsDeploy.execute(VdsDeploy.java:1046) [bll.jar:]
    at org.ovirt.engine.core.bll.InstallVdsCommand.installHost(InstallVdsCommand.java:241) [bll.jar:]
    at org.ovirt.engine.core.bll.InstallVdsCommand.executeCommand(InstallVdsCommand.java:156) [bll.jar:]
    at org.ovirt.engine.core.bll.ApproveVdsCommand.executeCommand(ApproveVdsCommand.java:45) [bll.jar:]
    at org.ovirt.engine.core.bll.CommandBase.executeWithoutTransaction(CommandBase.java:1133) [bll.jar:]
    at org.ovirt.engine.core.bll.CommandBase.executeActionInTransactionScope(CommandBase.java:1218) [bll.jar:]
    at org.ovirt.engine.core.bll.CommandBase.runInTransaction(CommandBase.java:1894) [bll.jar:]
    at org.ovirt.engine.core.utils.transaction.TransactionSupport.executeInSuppressed(TransactionSupport.java:174) [utils.jar:]
    at org.ovirt.engine.core.utils.transaction.TransactionSupport.executeInScope(TransactionSupport.java:116) [utils.jar:]
    at org.ovirt.engine.core.bll.CommandBase.execute(CommandBase.java:1238) [bll.jar:]
    at org.ovirt.engine.core.bll.CommandBase.executeAction(CommandBase.java:351) [bll.jar:]
    at org.ovirt.engine.core.bll.MultipleActionsRunner.executeValidatedCommand(MultipleActionsRunner.java:189) [bll.jar:]
    at org.ovirt.engine.core.bll.MultipleActionsRunner.runCommands(MultipleActionsRunner.java:156) [bll.jar:]
    at org.ovirt.engine.core.bll.MultipleActionsRunner$2.run(MultipleActionsRunner.java:165) [bll.jar:]
    at org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil$InternalWrapperRunnable.run(ThreadPoolUtil.java:97) [utils.jar:]
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471) [rt.jar:1.7.0_55]
    at java.util.concurrent.FutureTask.run(FutureTask.java:262) [rt.jar:1.7.0_55]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) [rt.jar:1.7.0_55]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) [rt.jar:1.7.0_55]
    at java.lang.Thread.run(Thread.java:744) [rt.jar:1.7.0_55]
~~~

For reference: https://bugzilla.redhat.com/show_bug.cgi?id=1086687 (a request to improve logging).

I can ssh to the host, both with and without ssh keys:

~~~
# ssh -i /etc/pki/ovirt-engine/keys/engine_id_rsa root@10.10.183.71   -> does not require a password
# ssh root@10.10.183.71                                               -> requires a password
~~~

Please attach the entire log; node detection is probably not working properly.

My fault, broken when we introduced the install while in version lock. But the root cause was committed a year ago! Strange nobody found it:

packagers: yum: disable self if not active

This breaks the offline packager if it exists; introduced in 2b4f0bdccf9, dated 2013-07-16. Not sure how it passed all QA cycles.
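The packager-selection logic implied by the fix above can be sketched in plain Python. This is an illustrative model, not otopi's actual API: the names `Packager` and `select_packager` are invented for the sketch. The point is that exactly one packager may stay active, and when an offline packager is present it must win, with yum being deactivated, rather than yum staying enabled and trying to reach the network (which is what produced the `repomd.xml` error above).

```python
# Illustrative model of "one active packager" selection; names are
# hypothetical, not otopi's real plugin API.

class Packager:
    def __init__(self, name, priority, active=True):
        self.name = name
        self.priority = priority  # lower value wins
        self.active = active

def select_packager(packagers):
    """Pick exactly one active packager and deactivate the rest.

    The offline packager carries a higher precedence (lower priority
    value) than yum, so when both are present only the offline one
    runs its stages; yum must not build a transaction at all.
    """
    candidates = [p for p in packagers if p.active]
    if not candidates:
        raise RuntimeError('no active packager')
    chosen = min(candidates, key=lambda p: p.priority)
    for p in packagers:
        if p is not chosen:
            p.active = False  # the buggy code skipped this for yum
    return chosen

if __name__ == '__main__':
    yum = Packager('yum', priority=50)
    offline = Packager('offline', priority=10)
    chosen = select_packager([yum, offline])
    print(chosen.name)  # offline
    print(yum.active)   # False
```

In this model the bug corresponds to the deactivation step being skipped: both packagers stayed "active", so yum still ran `buildTransaction()` during host deploy on a host with no usable repositories.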
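The one-liner quoted in the engine.log excerpt above bootstraps the host over SSH using a mktemp/trap pattern: create a private temp path, install an EXIT trap that removes it, then unpack and run the deploy bundle inside it. A standalone sketch of the same cleanup pattern (the payload step is a stand-in for the real `tar -x && "${MYTMP}"/setup`):

```shell
# Same cleanup pattern as the engine's remote command: a private temp
# path plus an EXIT trap, so the bundle is removed even on failure.
umask 0077
MYTMP="$(mktemp -t ovirt-XXXXXXXXXX)"
trap 'chmod -R u+rwX "${MYTMP}" > /dev/null 2>&1; rm -fr "${MYTMP}" > /dev/null 2>&1' 0
# Replace the temp file with a directory to receive the unpacked bundle.
rm -fr "${MYTMP}" && mkdir "${MYTMP}"

# Stand-in for streaming a tarball over stdin and running its setup script.
printf 'payload ran\n' > "${MYTMP}/out"
RESULT="$(cat "${MYTMP}/out")"
echo "${RESULT}"
```

Because the trap fires on exit code 0 (EXIT), the temp directory disappears whether `setup` succeeds or, as in this bug, fails with exit code 1, which is why only the logs survive for diagnosis.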
Bug-Url: https://bugzilla.redhat.com/show_bug.cgi?id=1112370
Change-Id: I2ddd0ac38828bc0536026df465fc17557ac20ce7
Signed-off-by: Alon Bar-Lev <alonbl>

Updating version: the issue has existed ever since 3.2.0; probably some ovirt-node behaves differently when accessing the yum API. The same solution applies.

This BZ needs d/s RHEVM with fixed otopi. We don't have a d/s engine yet, and adding RHEVH into oVirt is not supported.

(In reply to Jiri Belka from comment #14)
> This BZ needs d/s RHEVM with fixed otopi. We don't have d/s engine yet and
> adding RHEVH into oVirt is not supported.

I do not understand how bug#1111303 is related. Also, I do not understand why the latest otopi within brew does not solve this.

This is for 3.5; we don't have a 3.5 engine d/s. IIRC one cannot add RHEVH into oVirt, which is still 3.5 for us now. Mixing 3.5 otopi into RHEVM 3.4 is a hack and we do not (usually) verify BZs like this. BZ1111303 is really not related; I thought it was a bug for the d/s engine. I think we should wait for a real d/s engine, or do you think that using 3.5 otopi in a 3.4 engine would be OK for verification?

(In reply to Jiri Belka from comment #16)
> This is for 3.5, we don't have 3.5 engine d/s. IIRC one cannot add RHEVH
> into oVirt which is still 3.5 for us now.
>
> Mixing 3.5 otopi into RHEVM 3.4 is a hack and we do not verify BZs like this
> (usually).
>
> I think we should wait for real d/s engine or do you think that using 3.5
> otopi in 3.4 engine would be OK for verification?

It will be OK. You can wait for the downstream 3.5 build if you like, like every other bug.

Why have you added blocking bug#1116416? And I do not understand this statement from comment #14:

> We don't have d/s engine yet and adding RHEVH into oVirt is not supported.

If you wait for the 3.5 downstream build, just ignore all bugs until you have that version. It cannot be that something is unsupported in software that is not yet available.
ok

rhevm-3.5.0-0.10.master.el6ev.noarch
3.2 20140118.1.3.el6_5 -> 3.2 20140225.0.3.2.el6_5
from 3.2 rhevm to 3.5 rhevm (3.2 cluster)