Bug 1112370 - yum packager is active while running with offline packager
Summary: yum packager is active while running with offline packager
Keywords:
Status: CLOSED CURRENTRELEASE
Alias: None
Product: otopi
Classification: oVirt
Component: Plugins.packagers
Version: 1.0.0
Hardware: Unspecified
OS: Unspecified
Priority: urgent
Severity: urgent
Target Milestone: ---
Target Release: 1.2.2
Assignee: Alon Bar-Lev
QA Contact: Jiri Belka
URL:
Whiteboard: infra
Depends On:
Blocks: 1114657 rhev3.5beta 1156165
 
Reported: 2014-06-23 18:29 UTC by Marina Kalinin
Modified: 2019-04-28 10:40 UTC
CC List: 10 users

Fixed In Version: ovirt-engine-3.5.0_rc1
Doc Type: Bug Fix
Doc Text:
Clone Of:
: 1114657
Environment:
Last Closed: 2015-02-11 20:42:05 UTC
oVirt Team: Infra
Embargoed:
alonbl: devel_ack+




Links
Red Hat Knowledge Base (Solution) 969093
oVirt gerrit 29089 (master, MERGED): packagers: yum: disable self if not active
oVirt gerrit 29090 (otopi-1.2, MERGED): packagers: yum: disable self if not active

Description Marina Kalinin 2014-06-23 18:29:23 UTC
Description of problem:
I cannot approve or register a RHEV-H 3.2 host on any version of RHEV-M (under a 3.2 compatibility mode cluster, of course): neither 3.2 nor 3.4, and I think the same applies to 3.3.


Version-Release number of selected component (if applicable):
rhevm-3.4.0-0.21.el6ev.noarch
host:
- rhev-hypervisor6-6.5-20140118.1.3.2.el6_5, available here:
https://rhn.redhat.com/rhn/software/channel/downloads/Download.do?cid=12564
or my custom build on top of it with latest openssl and gnutls.
- vdsm-4.10.2-30.1.el6ev


How reproducible:
On a clean install, registering the host succeeds most of the time.
But once the host has been upgraded - no luck.

Steps to Reproduce - last variation that reproduced twice:
1. Upgrade a 3.2 version of RHEV-H. This also happens if you take the latest 3.2 from RHN and reinstall the host with the same ISO.
2. Remove the host from the current RHEV-M.
3. Try adding the host to a new instance of RHEV-M.
Initiate this either from the TUI or from the Admin Portal -> Add New Host.

Actual results:
Failure.
From the events log:
>Failed to install Host ibm-x3550m4-16.gsslab.rdu2.redhat.com. Yum Cannot retrieve repository metadata (repomd.xml) for repository: plugin_repo. Please verify its path and try again.
>Host ibm-x3550m4-16.gsslab.rdu2.redhat.com installation failed. Command returned failure code 1 during SSH session 'root.183.71'.

More logs are coming in the next update.

Expected results:
Host should be registered successfully.


Additional info:
I need help debugging the problem, even if it ends up not being fixed on the grounds that this is a 3.2 RHEV-H.

Comment 1 Marina Kalinin 2014-06-23 18:34:18 UTC
From /var/log/ovirt-engine/host-deploy/ovirt-20140623131027-10.10.183.71-621e3055.log 
~~~
2014-06-23 17:10:29 DEBUG otopi.context context._executeMethod:138 Stage internal_packages METHOD otopi.plugins.otopi.packagers.yumpackager.Plugin._internal_packages_end
2014-06-23 17:10:29 DEBUG otopi.plugins.otopi.packagers.yumpackager yumpackager.verbose:88 Yum Building transaction
2014-06-23 17:10:29 ERROR otopi.plugins.otopi.packagers.yumpackager yumpackager.error:97 Yum Cannot retrieve repository metadata (repomd.xml) for repository: plugin_repo. Please verify its path and try again
2014-06-23 17:10:29 DEBUG otopi.context context._executeMethod:152 method exception
Traceback (most recent call last):
  File "/tmp/ovirt-ZlmhmkkfLD/pythonlib/otopi/context.py", line 142, in _executeMethod
    method['method']()
  File "/tmp/ovirt-ZlmhmkkfLD/otopi-plugins/otopi/packagers/yumpackager.py", line 237, in _internal_packages_end
    if self._miniyum.buildTransaction():
  File "/tmp/ovirt-ZlmhmkkfLD/pythonlib/otopi/miniyum.py", line 910, in buildTransaction
    rc, msg = self._yb.buildTransaction()
  File "/usr/lib/python2.6/site-packages/yum/__init__.py", line 979, in buildTransaction
  File "/usr/lib/python2.6/site-packages/yum/depsolve.py", line 737, in resolveDeps
  File "/usr/lib/python2.6/site-packages/yum/__init__.py", line 899, in <lambda>
  File "/usr/lib/python2.6/site-packages/yum/depsolve.py", line 110, in _getTsInfo
  File "/usr/lib/python2.6/site-packages/yum/__init__.py", line 887, in <lambda>
  File "/usr/lib/python2.6/site-packages/yum/__init__.py", line 669, in _getSacks
  File "/usr/lib/python2.6/site-packages/yum/repos.py", line 308, in populateSack
  File "/usr/lib/python2.6/site-packages/yum/yumRepo.py", line 165, in populate
  File "/usr/lib/python2.6/site-packages/yum/yumRepo.py", line 223, in _check_db_version
  File "/usr/lib/python2.6/site-packages/yum/yumRepo.py", line 1256, in _check_db_version
  File "/usr/lib/python2.6/site-packages/yum/yumRepo.py", line 1455, in <lambda>
  File "/usr/lib/python2.6/site-packages/yum/yumRepo.py", line 1451, in _getRepoXML
RepoError: Cannot retrieve repository metadata (repomd.xml) for repository: plugin_repo. Please verify its path and try again
2014-06-23 17:10:29 ERROR otopi.context context._executeMethod:161 Failed to execute stage 'Environment packages setup': Cannot retrieve repository metadata (repomd.xml) for repository: plugin_repo. Please verify its path and try again
2014-06-23 17:10:29 DEBUG otopi.transaction transaction.abort:131 aborting 'Yum Transaction'
~~~
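The traceback above shows the yum packager plugin still running during the internal_packages stage and asking yum to build a transaction; on RHEV-H there are no usable yum repositories, so fetching repomd.xml for the plugin_repo repository fails. As a rough, hedged illustration of the failing call only (the real code paths are otopi's yumpackager.py and miniyum.py quoted in the traceback), assuming the RHEL 6 python-yum API:
~~~
# Stripped-down illustration of the call that fails above; otopi's miniyum
# wraps a yum.YumBase object and calls buildTransaction() on it.
import yum

yb = yum.YumBase()
try:
    # Building the transaction forces yum to fetch repomd.xml for each
    # enabled repository; on a RHEV-H host this raises RepoError for
    # the plugin_repo repository.
    rc, msgs = yb.buildTransaction()
    print('transaction result: %s %s' % (rc, msgs))
except yum.Errors.RepoError as e:
    print('Cannot retrieve repository metadata: %s' % e)
~~~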

From engine.log:
~~~
2014-06-23 13:10:27,899 ERROR [org.ovirt.engine.core.utils.ssh.SSHDialog] (org.ovirt.thread.pool-4-thread-38) SSH error running command root.183.71:'umask 0077; MYTMP="$(mktemp -t ovirt-XXXXXXXXXX)"; trap "chmod -R u+rwX \"${MYTMP}\" > /dev/null 2>&1; rm -fr \"${MYTMP}\" > /dev/null 2>&1" 0; rm -fr "${MYTMP}" && mkdir "${MYTMP}" && tar --warning=no-timestamp -C "${MYTMP}" -x &&  "${MYTMP}"/setup DIALOG/dialect=str:machine DIALOG/customization=bool:True': java.io.IOException: Command returned failure code 1 during SSH session 'root.183.71'
        at org.ovirt.engine.core.utils.ssh.SSHClient.executeCommand(SSHClient.java:527) [utils.jar:]
        at org.ovirt.engine.core.utils.ssh.SSHDialog.executeCommand(SSHDialog.java:318) [utils.jar:]
        at org.ovirt.engine.core.bll.VdsDeploy.execute(VdsDeploy.java:1046) [bll.jar:]
        at org.ovirt.engine.core.bll.InstallVdsCommand.installHost(InstallVdsCommand.java:241) [bll.jar:]
        at org.ovirt.engine.core.bll.InstallVdsCommand.executeCommand(InstallVdsCommand.java:156) [bll.jar:]
        at org.ovirt.engine.core.bll.ApproveVdsCommand.executeCommand(ApproveVdsCommand.java:45) [bll.jar:]
        at org.ovirt.engine.core.bll.CommandBase.executeWithoutTransaction(CommandBase.java:1133) [bll.jar:]
        at org.ovirt.engine.core.bll.CommandBase.executeActionInTransactionScope(CommandBase.java:1218) [bll.jar:]
        at org.ovirt.engine.core.bll.CommandBase.runInTransaction(CommandBase.java:1894) [bll.jar:]
        at org.ovirt.engine.core.utils.transaction.TransactionSupport.executeInSuppressed(TransactionSupport.java:174) [utils.jar:]
        at org.ovirt.engine.core.utils.transaction.TransactionSupport.executeInScope(TransactionSupport.java:116) [utils.jar:]
        at org.ovirt.engine.core.bll.CommandBase.execute(CommandBase.java:1238) [bll.jar:]
        at org.ovirt.engine.core.bll.CommandBase.executeAction(CommandBase.java:351) [bll.jar:]
        at org.ovirt.engine.core.bll.MultipleActionsRunner.executeValidatedCommand(MultipleActionsRunner.java:189) [bll.jar:]
        at org.ovirt.engine.core.bll.MultipleActionsRunner.runCommands(MultipleActionsRunner.java:156) [bll.jar:]
        at org.ovirt.engine.core.bll.MultipleActionsRunner$2.run(MultipleActionsRunner.java:165) [bll.jar:]
        at org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil$InternalWrapperRunnable.run(ThreadPoolUtil.java:97) [utils.jar:]
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471) [rt.jar:1.7.0_55]
        at java.util.concurrent.FutureTask.run(FutureTask.java:262) [rt.jar:1.7.0_55]
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) [rt.jar:1.7.0_55]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) [rt.jar:1.7.0_55]
        at java.lang.Thread.run(Thread.java:744) [rt.jar:1.7.0_55]
~~~
For reference: https://bugzilla.redhat.com/show_bug.cgi?id=1086687 - request to improve logging.

I can ssh to the host, with and without ssh keys:
# ssh -i /etc/pki/ovirt-engine/keys/engine_id_rsa root.183.71  -> does not require password.
# ssh root.183.71  -> requires password

Comment 2 Alon Bar-Lev 2014-06-23 18:35:53 UTC
Please attach the entire log; node detection is probably not working properly.

Comment 6 Alon Bar-Lev 2014-06-23 19:42:44 UTC
My fault; this broke when we introduced installing while in version lock. But the root cause was committed a year ago! Strange that nobody found it.

    packagers: yum: disable self if not active
    
    this breaks offline packager if exists, introduced at:
    2b4f0bdccf9 since 2013-16-07, not sure how it passed all QA cycles.
    
    Bug-Url: https://bugzilla.redhat.com/show_bug.cgi?id=1112370
    Change-Id: I2ddd0ac38828bc0536026df465fc17557ac20ce7
    Signed-off-by: Alon Bar-Lev <alonbl>
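
The merged changes on gerrit (29089 for master, 29090 for otopi-1.2) implement the pattern described in the commit message: the yum packager disables itself when it is not the active packager, so its stages become no-ops when the offline packager is in use. Below is a hedged sketch of that pattern for an otopi plugin; it is not the actual merged code, and the 'PACKAGER/yumEnabled' environment key is a hypothetical name used only for illustration.
~~~
# Hedged sketch of "disable self if not active" for an otopi plugin.
# NOT the merged change; 'PACKAGER/yumEnabled' is a hypothetical key.
from otopi import plugin, util


@util.export
class Plugin(plugin.PluginBase):
    def __init__(self, context):
        super(Plugin, self).__init__(context=context)
        self._enabled = False

    @plugin.event(
        stage=plugin.Stages.STAGE_INIT,
    )
    def _init(self):
        # If another packager (e.g. the offline packager used on RHEV-H)
        # owns package handling, the yum packager must stay inactive.
        self._enabled = self.environment.get(
            'PACKAGER/yumEnabled',  # hypothetical key
            True,
        )

    @plugin.event(
        stage=plugin.Stages.STAGE_INTERNAL_PACKAGES,
        condition=lambda self: self._enabled,
    )
    def _internal_packages_end(self):
        # Only reached when the yum packager is actually active; without
        # the condition above, this is where the traceback in comment 1
        # starts (miniyum building a transaction against repositories
        # that do not exist on RHEV-H).
        if self._miniyum.buildTransaction():  # _miniyum set up elsewhere
            self.logger.debug('yum transaction built')
~~~
With the condition in place, a host using the offline packager never enters _internal_packages_end, which is what the failing RHEV-H deployment needed.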

Comment 8 Alon Bar-Lev 2014-06-25 19:56:33 UTC
Updating the version; the issue has existed ever since 3.2.0. Probably some ovirt-node versions behave differently when accessing the yum API. The same solution applies.

Comment 14 Jiri Belka 2014-08-07 08:20:21 UTC
This BZ needs d/s RHEVM with fixed otopi. We don't have d/s engine yet and adding RHEVH into oVirt is not supported.

Comment 15 Alon Bar-Lev 2014-08-07 18:48:09 UTC
(In reply to Jiri Belka from comment #14)
> This BZ needs d/s RHEVM with fixed otopi. We don't have d/s engine yet and
> adding RHEVH into oVirt is not supported.

I do not understand how bug#1111303 is related.

Also, I do not understand why latest otopi within brew does not solve this.

Comment 16 Jiri Belka 2014-08-08 06:35:48 UTC
This is for 3.5, we don't have 3.5 engine d/s. IIRC one cannot add RHEVH into oVirt which is still 3.5 for us now.

Mixing 3.5 otopi into RHEVM 3.4 is a hack and we do not verify BZs like this (usually).

BZ 1111303 is really not related; I thought it was a bug for the d/s engine.

I think we should wait for real d/s engine or do you think that using 3.5 otopi in 3.4 engine would be OK for verification?

Comment 17 Alon Bar-Lev 2014-08-08 07:24:45 UTC
(In reply to Jiri Belka from comment #16)
> This is for 3.5, we don't have 3.5 engine d/s. IIRC one cannot add RHEVH
> into oVirt which is still 3.5 for us now.
> 
> Mixing 3.5 otopi into RHEVM 3.4 is a hack and we do not verify BZs like this
> (usually).
>
> I think we should wait for real d/s engine or do you think that using 3.5
> otopi in 3.4 engine would be OK for verification?

Will be OK.
You can wait for the downstream 3.5 build if you like, like every other bug.

Why have you added blocking bug#1116416?

And I do not understand this statement from comment#14:

>  We don't have d/s engine yet and adding RHEVH into oVirt is not supported.

If you wait for the 3.5 downstream build, just ignore all bugs until you have this version. It cannot be that something is unsupported in software that is not available.

Comment 18 Jiri Belka 2014-09-05 14:13:58 UTC
ok rhevm-3.5.0-0.10.master.el6ev.noarch

3.2 20140118.1.3.el6_5 -> 3.2 20140225.0.3.2.el6_5, from 3.2 rhevm to 3.5 rhevm (3.2 cluster)

