Bug 1788090 - Failed to reinstall a host on upgraded 4.4: Task Copy vdsm and QEMU CSRs failed to execute
Summary: Failed to reinstall a host on upgraded 4.4: Task Copy vdsm and QEMU CSRs failed to execute
Keywords:
Status: CLOSED CURRENTRELEASE
Alias: None
Product: ovirt-engine
Classification: oVirt
Component: ovirt-host-deploy-ansible
Version: 4.4.0
Hardware: Unspecified
OS: Unspecified
Priority: unspecified
Severity: medium
Target Milestone: ovirt-4.4.0
Target Release: ---
Assignee: Dana
QA Contact: Petr Matyáš
URL:
Whiteboard:
Depends On:
Blocks: 1751324
 
Reported: 2020-01-06 11:52 UTC by Petr Matyáš
Modified: 2020-05-20 20:03 UTC
CC: 6 users

Fixed In Version:
Doc Type: No Doc Update
Doc Text:
Clone Of:
Environment:
Last Closed: 2020-05-20 20:03:21 UTC
oVirt Team: Infra
Embargoed:
pm-rhel: ovirt-4.4+
pm-rhel: blocker?


Attachments
host deploy log (353.25 KB, text/plain), 2020-01-06 11:52 UTC, Petr Matyáš
Engine logs (1.22 MB, application/gzip), 2020-02-20 13:01 UTC, Yedidyah Bar David
Host logs (2.92 MB, application/gzip), 2020-02-20 13:02 UTC, Yedidyah Bar David


Links
System ID Private Priority Status Summary Last Updated
oVirt gerrit 107314 0 master MERGED ansible-runner: Amend selinux configuration 2021-01-23 07:42:08 UTC

Description Petr Matyáš 2020-01-06 11:52:58 UTC
Created attachment 1650086 [details]
host deploy log

Description of problem:
I'm trying to reinstall a rhel7 host in my 4.4 setup and it's failing on copying the vdsm and QEMU CSRs; however, I don't see anything failing in the logs.

Version-Release number of selected component (if applicable):
ovirt-engine-4.4.0-0.13.master.el7.noarch
ovirt-host-deploy-common-1.9.0-0.0.master.20191128124417.gitd2b9fa5.el7ev.noarch
vdsm-4.30.38-1.el7ev.x86_64

How reproducible:
always

Steps to Reproduce:
1. Have a rhel7 host in a 4.4 setup that is not an HE host
2. Reinstall it with HE deploy
3.

Actual results:
The reinstall fails on copying the vdsm and QEMU CSRs.

Expected results:
The reinstall should succeed.

Additional info:

Comment 1 Petr Matyáš 2020-01-06 12:12:11 UTC
This also fails for host certificate re-enrollment, with a much clearer exception:
2020-01-06 13:08:23,381+01 ERROR [org.ovirt.engine.core.common.utils.ansible.AnsibleExecutor] (EE-ManagedExecutorService-commandCoordinator-Thread-3) [4a57945d-f618-4801-b5be-89ad85a5b748] Exception: Task Copy vdsm and QEMU CSRs failed to execute:
2020-01-06 13:08:23,413+01 ERROR [org.ovirt.engine.core.bll.hostdeploy.HostEnrollCertificateInternalCommand] (EE-ManagedExecutorService-commandCoordinator-Thread-3) [4a57945d-f618-4801-b5be-89ad85a5b748] Command 'org.ovirt.engine.core.bll.hostdeploy.HostEnrollCertificateInternalCommand' failed: Task Copy vdsm and QEMU CSRs failed to execute:
2020-01-06 13:08:23,413+01 ERROR [org.ovirt.engine.core.bll.hostdeploy.HostEnrollCertificateInternalCommand] (EE-ManagedExecutorService-commandCoordinator-Thread-3) [4a57945d-f618-4801-b5be-89ad85a5b748] Exception: org.ovirt.engine.core.common.utils.ansible.AnsibleRunnerCallException: Task Copy vdsm and QEMU CSRs failed to execute:
        at deployment.engine.ear.bll.jar//org.ovirt.engine.core.common.utils.ansible.AnsibleRunnerHTTPClient.processEvents(AnsibleRunnerHTTPClient.java:213)
        at deployment.engine.ear.bll.jar//org.ovirt.engine.core.common.utils.ansible.AnsibleExecutor.runCommand(AnsibleExecutor.java:155)
        at deployment.engine.ear.bll.jar//org.ovirt.engine.core.common.utils.ansible.AnsibleExecutor.runCommand(AnsibleExecutor.java:58)
        at deployment.engine.ear.bll.jar//org.ovirt.engine.core.common.utils.ansible.AnsibleExecutor.runCommand(AnsibleExecutor.java:45)
        at deployment.engine.ear.bll.jar//org.ovirt.engine.core.bll.hostdeploy.HostEnrollCertificateInternalCommand.executeCommand(HostEnrollCertificateInternalCommand.java:68)
        at deployment.engine.ear.bll.jar//org.ovirt.engine.core.bll.CommandBase.executeWithoutTransaction(CommandBase.java:1168)
        at deployment.engine.ear.bll.jar//org.ovirt.engine.core.bll.CommandBase.executeActionInTransactionScope(CommandBase.java:1326)
        at deployment.engine.ear.bll.jar//org.ovirt.engine.core.bll.CommandBase.runInTransaction(CommandBase.java:2005)
        at org.ovirt.engine.core.utils//org.ovirt.engine.core.utils.transaction.TransactionSupport.executeInSuppressed(TransactionSupport.java:164)
        at org.ovirt.engine.core.utils//org.ovirt.engine.core.utils.transaction.TransactionSupport.executeInScope(TransactionSupport.java:103)
        at deployment.engine.ear.bll.jar//org.ovirt.engine.core.bll.CommandBase.execute(CommandBase.java:1386)
        at deployment.engine.ear.bll.jar//org.ovirt.engine.core.bll.CommandBase.executeAction(CommandBase.java:420)
        at deployment.engine.ear.bll.jar//org.ovirt.engine.core.bll.executor.DefaultBackendActionExecutor.execute(DefaultBackendActionExecutor.java:13)
        at deployment.engine.ear.bll.jar//org.ovirt.engine.core.bll.Backend.runAction(Backend.java:451)
        at deployment.engine.ear.bll.jar//org.ovirt.engine.core.bll.Backend.runAction(Backend.java:697)
        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.base/java.lang.reflect.Method.invoke(Method.java:566)
        at org.jboss.as.ee.6.GA-redhat-00001//org.jboss.as.ee.component.ManagedReferenceMethodInterceptor.processInvocation(ManagedReferenceMethodInterceptor.java:52)
        at org.jboss.invocation.1.Final-redhat-1//org.jboss.invocation.InterceptorContext.proceed(InterceptorContext.java:422)
        at org.jboss.invocation.1.Final-redhat-1//org.jboss.invocation.InterceptorContext$Invocation.proceed(InterceptorContext.java:509)
        at org.jboss.as.weld.common.6.GA-redhat-00001//org.jboss.as.weld.interceptors.Jsr299BindingsInterceptor.delegateInterception(Jsr299BindingsInterceptor.java:78)
        at org.jboss.as.weld.common.6.GA-redhat-00001//org.jboss.as.weld.interceptors.Jsr299BindingsInterceptor.doMethodInterception(Jsr299BindingsInterceptor.java:88)
        at org.jboss.as.weld.common.6.GA-redhat-00001//org.jboss.as.weld.interceptors.Jsr299BindingsInterceptor.processInvocation(Jsr299BindingsInterceptor.java:101)
        at org.jboss.as.ee.6.GA-redhat-00001//org.jboss.as.ee.component.interceptors.UserInterceptorFactory$1.processInvocation(UserInterceptorFactory.java:63)
        at org.jboss.invocation.1.Final-redhat-1//org.jboss.invocation.InterceptorContext.proceed(InterceptorContext.java:422)
        at org.jboss.invocation.1.Final-redhat-1//org.jboss.invocation.InterceptorContext$Invocation.proceed(InterceptorContext.java:509)
        at deployment.engine.ear.bll.jar//org.ovirt.engine.core.bll.interceptors.CorrelationIdTrackerInterceptor.aroundInvoke(CorrelationIdTrackerInterceptor.java:13)
....
        at org.jboss.invocation.1.Final-redhat-1//org.jboss.invocation.InterceptorContext.run(InterceptorContext.java:438)
        at org.wildfly.security.elytron-private.5.Final-redhat-00001//org.wildfly.security.manager.WildFlySecurityManager.doChecked(WildFlySecurityManager.java:631)
        at org.jboss.invocation.1.Final-redhat-1//org.jboss.invocation.AccessCheckingInterceptor.processInvocation(AccessCheckingInterceptor.java:57)
        at org.jboss.invocation.1.Final-redhat-1//org.jboss.invocation.InterceptorContext.proceed(InterceptorContext.java:422)
        at org.jboss.invocation.1.Final-redhat-1//org.jboss.invocation.ChainedInterceptor.processInvocation(ChainedInterceptor.java:53)
        at org.jboss.as.ee.6.GA-redhat-00001//org.jboss.as.ee.component.ViewService$View.invoke(ViewService.java:198)
        at org.jboss.as.ee.6.GA-redhat-00001//org.jboss.as.ee.component.ViewDescription$1.processInvocation(ViewDescription.java:185)
        at org.jboss.as.ee.6.GA-redhat-00001//org.jboss.as.ee.component.ProxyInvocationHandler.invoke(ProxyInvocationHandler.java:81)
        at deployment.engine.ear.bll.jar//org.ovirt.engine.core.bll.interfaces.BackendCommandObjectsHandler$$$view4.runAction(Unknown Source)
        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.base/java.lang.reflect.Method.invoke(Method.java:566)
        at org.jboss.weld.core.6.Final-redhat-00003//org.jboss.weld.util.reflection.Reflections.invokeAndUnwrap(Reflections.java:410)
        at org.jboss.weld.core.6.Final-redhat-00003//org.jboss.weld.module.ejb.EnterpriseBeanProxyMethodHandler.invoke(EnterpriseBeanProxyMethodHandler.java:134)
        at org.jboss.weld.core.6.Final-redhat-00003//org.jboss.weld.bean.proxy.EnterpriseTargetBeanInstance.invoke(EnterpriseTargetBeanInstance.java:56)
        at org.jboss.weld.core.6.Final-redhat-00003//org.jboss.weld.module.ejb.InjectionPointPropagatingEnterpriseTargetBeanInstance.invoke(InjectionPointPropagatingEnterpriseTargetBeanInstance.java:68)
        at org.jboss.weld.core.6.Final-redhat-00003//org.jboss.weld.bean.proxy.ProxyMethodHandler.invoke(ProxyMethodHandler.java:106)
        at deployment.engine.ear.bll.jar//org.ovirt.engine.core.bll.BackendCommandObjectsHandler$BackendInternal$BackendLocal$2049259618$Proxy$_$$_Weld$EnterpriseProxy$.runAction(Unknown Source)
        at deployment.engine.ear.bll.jar//org.ovirt.engine.core.bll.tasks.CommandExecutor.executeCommand(CommandExecutor.java:60)
        at deployment.engine.ear.bll.jar//org.ovirt.engine.core.bll.tasks.CommandExecutor.lambda$executeAsyncCommand$0(CommandExecutor.java:49)
        at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
        at org.glassfish.javax.enterprise.concurrent.0.redhat-1//org.glassfish.enterprise.concurrent.internal.ManagedFutureTask.run(ManagedFutureTask.java:141)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
        at java.base/java.lang.Thread.run(Thread.java:834)
        at org.glassfish.javax.enterprise.concurrent.0.redhat-1//org.glassfish.enterprise.concurrent.ManagedThreadFactoryImpl$ManagedThread.run(ManagedThreadFactoryImpl.java:250)

2020-01-06 13:08:23,462+01 ERROR [org.ovirt.engine.core.dal.dbbroker.auditloghandling.AuditLogDirector] (EE-ManagedExecutorService-commandCoordinator-Thread-3) [4a57945d-f618-4801-b5be-89ad85a5b748] EVENT_ID: HOST_CERTIFICATION_ENROLLMENT_FAILED(882), Failed to enroll certificate for host slot-11 (User: admin@internal-authz).

Comment 2 Yedidyah Bar David 2020-01-06 13:42:32 UTC
(In reply to Petr Matyáš from comment #0)
> Created attachment 1650086 [details]
> host deploy log
> 
> Description of problem:
> I'm trying to reinstall rhel7 host in my 4.4 setup and it's failing on

You mean rhel7 with oVirt/RHV 4.3? 4.4 requires el8.

> copying vdsm and qemu csrs, however I don't see anything failing in logs.
> 
> Version-Release number of selected component (if applicable):
> ovirt-engine-4.4.0-0.13.master.el7.noarch

Not sure what this version is, but upstream master does not use ovirt-host-deploy since ce9a4f0 (merged Dec 8).

> ovirt-host-deploy-common-1.9.0-0.0.master.20191128124417.gitd2b9fa5.el7ev.
> noarch
> vdsm-4.30.38-1.el7ev.x86_64
> 
> How reproducible:
> always
> 
> Steps to Reproduce:
> 1. have a rhel7 host in 4.4 setup without being HE host

As I said, that's not supported. Is it a 4.3 host?

> 2. reinstall with HE deploy

OK, I think that's at least a reasonable flow to ask for:

1. Have 4.3 engine+hosts
2. Update engine to 4.4
3. Reinstall host with HE deploy

I think that, if at all, it will only be supported (in RHV) via an RHVH image, because we only have a single channel for RHV 4 hosts.

So please try it with an up-to-date engine. If it fails, we still need to decide whether we want to support it - it might be decided that for HE, only 4.4 (el8) hosts will be supported.

Thanks.

Comment 3 Petr Matyáš 2020-01-06 13:48:59 UTC
Yes, I have 4.3 hosts in a 4.3 cluster on a 4.4 engine, which should still work.
I can't upgrade the hosts to rhel8 because the hardware doesn't support it (until we make some upgrades).

The same issue occurs for installation on a rhel7 host with 4.3 repos.

Also, the flow you provided is exactly what I did.

I can retry whenever we might get a new build.

Comment 4 RHEL Program Management 2020-01-16 09:40:23 UTC
This bug report has Keywords: Regression or TestBlocker.
Since no regressions or test blockers are allowed between releases, it is also being identified as a blocker for this release. Please resolve ASAP.

Comment 6 RHV bug bot 2020-01-24 19:49:15 UTC
INFO: Bug status wasn't changed from MODIFIED to ON_QA due to the following reason:

[No relevant external trackers attached]

For more info please contact: infra

Comment 7 Petr Matyáš 2020-01-27 12:32:27 UTC
This is still happening on my setup with the engine updated to ovirt-engine-4.4.0-0.17.master.el7.noarch; I even updated all the packages currently present on the host.

Comment 8 Michal Skrivanek 2020-01-27 13:19:23 UTC
with the same error as in comment #1?

Comment 9 Petr Matyáš 2020-01-27 13:22:38 UTC
Yes, exactly the same.

Comment 10 Michal Skrivanek 2020-01-27 14:48:00 UTC
This is related to upgrade: we now use a different SELinux security context for certificate requests, and if an old request file with the same name already exists, Ansible fails to remove/replace it because of an SELinux denial.

This is not a problem for clean installs.

The workaround is to remove the old requests (a bulk remove of everything in /etc/pki/ovirt-engine/requests is good enough).
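
A minimal sketch of that workaround, run as root on the engine machine. The backup step and its path are my own precaution, not part of the comment above; the CSR files are copied over from the host again on the next deploy/reinstall, so removing them should be safe, as the comment suggests:

  # Optional safety net: keep a copy of the old CSRs before removing them (path is illustrative).
  mkdir -p /root/old-ovirt-requests
  cp -a /etc/pki/ovirt-engine/requests/. /root/old-ovirt-requests/
  # Bulk-remove the stale requests; the next host deploy writes fresh CSR files.
  rm -f /etc/pki/ovirt-engine/requests/*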

Comment 11 Michal Skrivanek 2020-01-27 14:49:10 UTC
Also, this is likely not related to the host version.

Comment 12 Yedidyah Bar David 2020-02-20 13:00:49 UTC
Failed for me as well now. host-deploy log has:

2020-02-20 13:22:50 IST - TASK [ovirt-host-deploy-vdsm-certificates : Copy vdsm and QEMU CSRs] ***********
2020-02-20 13:22:53 IST - 
2020-02-20 13:22:53 IST - {
  "status" : "OK",
  "msg" : "",
  "data" : {
    "uuid" : "e1f104fb-f9d4-4cf3-a4c9-62fdfda27167",
    "counter" : 109,
    "stdout" : "",
    "start_line" : 108,
    "end_line" : 108,
    "runner_ident" : "290c35fa-53d3-11ea-a36c-001a4a231728",
    "event" : "runner_on_failed",
    "event_data" : {
      "playbook" : "ovirt-host-deploy.yml",
      "playbook_uuid" : "ed1a0da8-d63a-4740-820d-b3a3d97a38ab",
      "play" : "all",
      "play_uuid" : "001a4a23-1728-896a-bcbf-000000000006",
      "play_pattern" : "all",
      "task" : "Copy vdsm and QEMU CSRs",
      "task_uuid" : "001a4a23-1728-896a-bcbf-0000000001f4",
      "task_action" : "copy",
      "task_args" : "",
      "task_path" : "/usr/share/ovirt-engine/ansible-runner-service-project/project/roles/ovirt-host-deploy-vdsm-certificates/tasks/main.yml:30",
      "role" : "ovirt-host-deploy-vdsm-certificates",
      "host" : "didi-centos8-host.lab.eng.tlv2.redhat.com",
      "remote_addr" : "didi-centos8-host.lab.eng.tlv2.redhat.com",
      "res" : {
        "results" : [ {
          "diff" : [ ],
          "msg" : "Failed to replace file: b'/var/lib/ovirt-engine/.ansible/tmp/ansible-tmp-1582197769.8334732-74078116791820/source' to /etc/pki/ovirt-engine/requests/didi-centos8-host.lab.
eng.tlv2.redhat.com.req: [Errno 13] Permission denied: b'/etc/pki/ovirt-engine/requests/.ansible_tmpi0i_8473didi-centos8-host.lab.eng.tlv2.redhat.com.req'",
          "exception" : "Traceback (most recent call last):\n  File \"/tmp/ansible_copy_payload_koi4spdm/ansible_copy_payload.zip/ansible/module_utils/basic.py\", line 2252, in atomic_move\n
    os.rename(b_src, b_dest)\nOSError: [Errno 18] Invalid cross-device link: b'/var/lib/ovirt-engine/.ansible/tmp/ansible-tmp-1582197769.8334732-74078116791820/source' -> b'/etc/pki/ovirt-en
gine/requests/didi-centos8-host.lab.eng.tlv2.redhat.com.req'\n\nDuring handling of the above exception, another exception occurred:\n\nTraceback (most recent call last):\n  File \"/usr/lib64
/python3.6/shutil.py\", line 550, in move\n    os.rename(src, real_dst)\nOSError: [Errno 18] Invalid cross-device link: b'/var/lib/ovirt-engine/.ansible/tmp/ansible-tmp-1582197769.8334732-74
078116791820/source' -> b'/etc/pki/ovirt-engine/requests/.ansible_tmpi0i_8473didi-centos8-host.lab.eng.tlv2.redhat.com.req'\n\nDuring handling of the above exception, another exception occur
red:\n\nTraceback (most recent call last):\n  File \"/tmp/ansible_copy_payload_koi4spdm/ansible_copy_payload.zip/ansible/module_utils/basic.py\", line 2296, in atomic_move\n    shutil.move(b
_src, b_tmp_dest_name)\n  File \"/usr/lib64/python3.6/shutil.py\", line 564, in move\n    copy_function(src, real_dst)\n  File \"/usr/lib64/python3.6/shutil.py\", line 264, in copy2\n    cop
ystat(src, dst, follow_symlinks=follow_symlinks)\n  File \"/usr/lib64/python3.6/shutil.py\", line 229, in copystat\n    _copyxattr(src, dst, follow_symlinks=follow)\n  File \"/usr/lib64/pyth
on3.6/shutil.py\", line 165, in _copyxattr\n    os.setxattr(dst, name, value, follow_symlinks=follow_symlinks)\nPermissionError: [Errno 13] Permission denied: b'/etc/pki/ovirt-engine/request
s/.ansible_tmpi0i_8473didi-centos8-host.lab.eng.tlv2.redhat.com.req'\n\nDuring handling of the above exception, another exception occurred:\n\nTraceback (most recent call last):\n  File \"/t
mp/ansible_copy_payload_koi4spdm/ansible_copy_payload.zip/ansible/module_utils/basic.py\", line 2300, in atomic_move\n    shutil.copy2(b_src, b_tmp_dest_name)\n  File \"/usr/lib64/python3.6/
shutil.py\", line 264, in copy2\n    copystat(src, dst, follow_symlinks=follow_symlinks)\n  File \"/usr/lib64/python3.6/shutil.py\", line 229, in copystat\n    _copyxattr(src, dst, follow_sy
mlinks=follow)\n  File \"/usr/lib64/python3.6/shutil.py\", line 165, in _copyxattr\n    os.setxattr(dst, name, value, follow_symlinks=follow_symlinks)\nPermissionError: [Errno 13] Permission
 denied: b'/etc/pki/ovirt-engine/requests/.ansible_tmpi0i_8473didi-centos8-host.lab.eng.tlv2.redhat.com.req'\n",
          "failed" : true,

The engine is the CI build result of a pending patch to use el8 [1][2]. The engine inside it is ovirt-engine-4.4.0-0.0.master.20200218121717.git14967178f54.el8.noarch.

Since this is more or less the first attempt to use such an appliance, it might be decided that it's a bug in the appliance build, in which case an update to [1] would be needed (e.g. some selinux issue). I didn't check further yet.

[1] https://gerrit.ovirt.org/107003
[2] https://jenkins.ovirt.org/job/ovirt-appliance_standard-check-patch/223/
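
Reading the traceback above: Ansible's copy module first tries os.rename(), which fails with EXDEV because the temporary file and /etc/pki/ovirt-engine/requests are on different filesystems, and then falls back to a copy that preserves extended attributes via os.setxattr(), which is the call that gets the "Permission denied" (the SELinux denial described in comment 10). A hedged sketch of how one might confirm the SELinux side on the engine, using standard tools (the output in any given environment will differ):

  # Compare the labels on the existing request files with what the policy expects for that path.
  ls -lZ /etc/pki/ovirt-engine/requests/
  matchpathcon /etc/pki/ovirt-engine/requests/
  # Look for the matching AVC denial and a short explanation of why it was denied.
  ausearch -m AVC -ts recent | audit2why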

Comment 13 Yedidyah Bar David 2020-02-20 13:01:43 UTC
Created attachment 1664360 [details]
Engine logs

Comment 14 Yedidyah Bar David 2020-02-20 13:02:23 UTC
Created attachment 1664361 [details]
Host logs

Comment 15 Evgeny Slutsky 2020-02-23 11:32:31 UTC
(In reply to Yedidyah Bar David from comment #12)
> Failed for me as well now. host-deploy log has:
> [...]
+1
I've also encountered it in a clean hosted-engine 4.4 installation (host installed from scratch); the engine appliance is based on el8.

Comment 16 Evgeny Slutsky 2020-02-24 12:36:21 UTC
Temporarily setting `setenforce 0` on the engine before executing ovirt_host fixes the issue:
https://github.com/oVirt/ovirt-ansible-hosted-engine-setup/blob/7fbf90156b423f41814482c76cd34293b556cb2b/tasks/bootstrap_local_vm/05_add_host.yml#L111
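
A sketch of that temporary workaround on the engine VM (this drops SELinux to permissive for the whole window, so treat it as a diagnostic workaround rather than a fix):

  setenforce 0    # permissive while the host is being added / reinstalled
  # ... run the add-host / reinstall flow here ...
  setenforce 1    # restore enforcing mode afterwards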

Comment 17 Evgeny Slutsky 2020-02-26 08:36:58 UTC
Proposing a workaround patch in ovirt-ansible-hosted-engine-setup: https://github.com/oVirt/ovirt-ansible-hosted-engine-setup/pull/292

Comment 18 Yedidyah Bar David 2020-03-01 10:28:53 UTC
With 107314, 'Reinstall' worked for me from the web UI (on an engine left running after a failed HE deploy).

Comment 19 Petr Matyáš 2020-03-20 11:05:49 UTC
Verified on ovirt-engine-4.4.0-0.25.master.el8ev.noarch

Comment 20 Sandro Bonazzola 2020-05-20 20:03:21 UTC
This bugzilla is included in the oVirt 4.4.0 release, published on May 20th 2020.

Since the problem described in this bug report should be resolved in the oVirt 4.4.0 release, it has been closed with a resolution of CURRENT RELEASE.

If the solution does not work for you, please open a new bug report.

