Bug 1855959 - OVA export or import of VM or template failed with Internal server error
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: Red Hat Enterprise Virtualization Manager
Classification: Red Hat
Component: ovirt-engine
Version: 4.4.0
Hardware: Unspecified
OS: Unspecified
Priority: high
Severity: high
Target Milestone: ovirt-4.4.1-1
Target Release: ---
Assignee: Martin Necas
QA Contact: Nisim Simsolo
URL:
Whiteboard:
Duplicates: 1856300
Depends On: 1851998
Blocks: 1836303 1845458
 
Reported: 2020-07-11 09:55 UTC by Nisim Simsolo
Modified: 2020-08-04 16:23 UTC
CC List: 8 users

Fixed In Version: rhv 4.4.1-12
Doc Type: No Doc Update
Doc Text:
Clone Of:
Environment:
Last Closed: 2020-08-04 16:23:10 UTC
oVirt Team: Virt
Target Upstream Version:
Embargoed:


Attachments
engine.log (161.03 KB, application/x-xz), 2020-07-11 09:57 UTC, Nisim Simsolo
vdsm.log (395.73 KB, application/x-xz), 2020-07-11 09:58 UTC, Nisim Simsolo


Links
Red Hat Product Errata RHBA-2020:3317 (last updated 2020-08-04 16:23:18 UTC)

Description Nisim Simsolo 2020-07-11 09:55:19 UTC
Description of problem:
OVA export or import of VM or template failed with internal server error in engine.log:

* OVA import:
2020-07-11 12:48:47,442+03 ERROR [org.ovirt.engine.core.bll.GetVmFromOvaQuery] (default task-9) [7904c5ad-58d2-4abe-afef-41e2585270ad] Query 'GetVmFromOvaQuery' failed: EngineException: Failed to query OVA info (Failed with error GeneralException and code 100)
2020-07-11 12:48:47,443+03 ERROR [org.ovirt.engine.core.bll.GetVmFromOvaQuery] (default task-9) [7904c5ad-58d2-4abe-afef-41e2585270ad] Exception: org.ovirt.engine.core.common.errors.EngineException: EngineException: Failed to query OVA info (Failed with error GeneralException and code 100)
[...]
-----------------------------
*OVA export:
2020-07-11 12:49:37,054+03 ERROR [org.ovirt.engine.core.common.utils.ansible.AnsibleExecutor] (default task-9) [3e0b01c6-dd9b-4dad-8f76-4a4935841d81] Exception: Internal server error
2020-07-11 12:49:37,055+03 WARN  [org.ovirt.engine.core.bll.exportimport.ExportVmToOvaCommand] (default task-9) [3e0b01c6-dd9b-4dad-8f76-4a4935841d81] Validation of action 'ExportVmToOva' failed for user admin@internal-authz. Reasons: VAR__ACTION__EXPORT,VAR__TYPE__VM,ACTION_TYPE_FAILED_INVALID_OVA_DESTINATION_FOLDER,$vdsName rose07,$directory /home/nisim/rhv_ova2
2020-07-11 12:49:37,056+03 INFO  [org.ovirt.engine.core.bll.exportimport.ExportVmToOvaCommand] (default task-9) [3e0b01c6-dd9b-4dad-8f76-4a4935841d81] Lock freed to object 'EngineLock:{exclusiveLocks='[c5653231-315d-4a5e-949c-3e3f51216142=VM]', sharedLocks=''}'


Version-Release number of selected component (if applicable):
ovirt-engine-4.4.1.8-0.7.el8ev
vdsm-4.40.22-1.el8ev.x86_64
libvirt-daemon-6.0.0-25.module+el8.2.1+7154+47ffd890.x86_64
qemu-kvm-4.2.0-29.module+el8.2.1+7297+a825794d.x86_64

How reproducible:
100%

Steps to Reproduce:
1. Try to export VM as OVA
2. Try to import VM OVA
3. Try to export template as OVA
4. Try to import template OVA

Actual results:
All failed with internal server error.

Expected results:
OVA export/import of VM or template should succeed.

Additional info:
engine.log and vdsm.log attached.

Comment 1 Nisim Simsolo 2020-07-11 09:57:54 UTC
Created attachment 1700681 [details]
engine.log

Comment 2 Nisim Simsolo 2020-07-11 09:58:23 UTC
Created attachment 1700682 [details]
vdsm.log

Comment 3 Liran Rotenberg 2020-07-12 08:25:48 UTC
The problem is a conflict between ansible-runner-service and the SELinux policy, which breaks every ansible-runner-service flow.

ansible-runner-1.4.5-1.el8ar.noarch
ansible-runner-service-1.0.2-1.el8ev.noarch
python3-ansible-runner-1.4.5-1.el8ar.noarch

The ansible-runner-service config file under /etc/ansible-runner-service/config.yaml is:
version: 1
playbooks_root_dir: '/usr/share/ovirt-engine/ansible-runner-service-project'
ssh_private_key: '/etc/pki/ovirt-engine/keys/engine_id_rsa'
port: 50001
target_user: root
log_path: '/var/log/ovirt-engine'
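
A quick sanity check that the service itself is up and answering (the systemd unit name and the /api path are my assumptions from the ansible-runner-service packaging, and the curl may need the engine client certificate rather than -k):

# systemctl status ansible-runner-service
# curl -k https://localhost:50001/api

If the service is running but the playbook calls still fail, the problem is further down, as it turned out to be here.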

This environment is post-upgrade, with SELinux enforcing.

From journalctl:
Jul 12 10:35:03 nsimsolo41.scl.lab.tlv.redhat.com setroubleshoot[59139]: SELinux is preventing httpd from open access on the file /var/log/ovirt-engine/ansible-runner-service.log. For compl>
Jul 12 10:35:03 nsimsolo41.scl.lab.tlv.redhat.com platform-python[59139]: SELinux is preventing httpd from open access on the file /var/log/ovirt-engine/ansible-runner-service.log.
                                                                          
                                                                          *****  Plugin catchall (100. confidence) suggests   **************************
                                                                          
                                                                          If you believe that httpd should be allowed open access on the ansible-runner-service.log file by default.
                                                                          Then you should report this as a bug.
                                                                          You can generate a local policy module to allow this access.
                                                                          Do
                                                                          allow this access for now by executing:
                                                                          # ausearch -c 'httpd' --raw | audit2allow -M my-httpd
                                                                          # semodule -X 300 -i my-httpd.pp

The service couldn't write to its log, which prevented it from running the playbook. Therefore we see the traceback in the engine, which got a wrong result from the service.
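
To confirm it is this denial and not something else, the raw AVC records and the file label can be checked with standard SELinux tooling (nothing RHV-specific here):

# ausearch -m AVC -ts recent | grep ansible-runner-service.log
# ls -Z /var/log/ovirt-engine/ansible-runner-service.log

The file carries the generic var_log_t label (see below), which the httpd domain is not allowed to open, matching the journalctl message.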

Workaround:
1. Change SELinux to permissive (# setenforce 0).
2. Remove the logging from config.yaml (drop the line log_path: '/var/log/ovirt-engine') - I didn't try this one. See the command sketch below.
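
The same two options as commands (the sed one-liner and the service restart for option 2 are untested assumptions, per the note above):

# setenforce 0

or:

# sed -i '/^log_path:/d' /etc/ansible-runner-service/config.yaml
# systemctl restart ansible-runner-service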

After changing SELinux to permissive, everything works again. I tried switching back to enforcing and it still works, but after rebooting the machine the error shows up again.
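
Since setenforce 0 does not persist across reboots, the setroubleshoot suggestion quoted above is the persistent route: build a local policy module from the recorded denials and install it (same commands as in the journal output; the module name my-httpd is arbitrary):

# ausearch -c 'httpd' --raw | audit2allow -M my-httpd
# semodule -X 300 -i my-httpd.pp

semodule -i loads the module into the policy store, so the allow rule is reapplied on every boot.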

The current log is:
-rw-r--r--. 1 ovirt ovirt system_u:object_r:var_log_t:s0     5747228 Jul 12 11:11 ansible-runner-service.log
 
Moving it to infra for further investigation.

Comment 6 Martin Perina 2020-07-13 09:15:58 UTC
Marking as TestOnly to test the OVA export flow, because the real fix will be included as part of BZ1851998.

Comment 7 Michal Skrivanek 2020-07-14 07:22:08 UTC
*** Bug 1856300 has been marked as a duplicate of this bug. ***

Comment 9 Nisim Simsolo 2020-07-27 10:11:17 UTC
Verified:
ovirt-engine-4.4.1.10-0.1.el8ev
vdsm-4.40.22-1.el8ev.x86_64
qemu-kvm-4.2.0-29.module+el8.2.1+7297+a825794d.x86_64
libvirt-daemon-6.0.0-25.module+el8.2.1+7154+47ffd890.x86_64

Comment 11 errata-xmlrpc 2020-08-04 16:23:10 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory (RHV Manager (ovirt-engine) 4.4 (0-day)), and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHBA-2020:3317

