Bug 1855959
| Summary: | OVA export or import of VM or template failed with Internal server error | | |
|---|---|---|---|
| Product: | Red Hat Enterprise Virtualization Manager | Reporter: | Nisim Simsolo <nsimsolo> |
| Component: | ovirt-engine | Assignee: | Martin Necas <mnecas> |
| Status: | CLOSED ERRATA | QA Contact: | Nisim Simsolo <nsimsolo> |
| Severity: | high | Docs Contact: | |
| Priority: | high | | |
| Version: | 4.4.0 | CC: | bugs, dfodor, lrotenbe, mavital, mperina, mtessun, nsimsolo, pmatyas |
| Target Milestone: | ovirt-4.4.1-1 | Keywords: | Regression, TestOnly |
| Target Release: | --- | | |
| Hardware: | Unspecified | | |
| OS: | Unspecified | | |
| Whiteboard: | | | |
| Fixed In Version: | rhv 4.4.1-12 | Doc Type: | No Doc Update |
| Doc Text: | | Story Points: | --- |
| Clone Of: | | Environment: | |
| Last Closed: | 2020-08-04 16:23:10 UTC | Type: | Bug |
| Regression: | --- | Mount Type: | --- |
| Documentation: | --- | CRM: | |
| Verified Versions: | | Category: | --- |
| oVirt Team: | Virt | RHEL 7.3 requirements from Atomic Host: | |
| Cloudforms Team: | --- | Target Upstream Version: | |
| Embargoed: | | | |
| Bug Depends On: | 1851998 | | |
| Bug Blocks: | 1836303, 1845458 | | |
| Attachments: | | | |
Created attachment 1700681 [details]
engine.log
Created attachment 1700682 [details]
vdsm.log
The problem is a conflict between ansible-runner-service and the SELinux policy, which breaks every ansible-runner-service flow.
ansible-runner-1.4.5-1.el8ar.noarch
ansible-runner-service-1.0.2-1.el8ev.noarch
python3-ansible-runner-1.4.5-1.el8ar.noarch
The ansible-runner-service config file at /etc/ansible-runner-service/config.yaml is:

```yaml
version: 1
playbooks_root_dir: '/usr/share/ovirt-engine/ansible-runner-service-project'
ssh_private_key: '/etc/pki/ovirt-engine/keys/engine_id_rsa'
port: 50001
target_user: root
log_path: '/var/log/ovirt-engine'
```
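The config above is a flat `key: value` mapping, so it can be sanity-checked with a short stdlib-only sketch (a hand-rolled parser is used here only to keep the example dependency-free; which keys are strictly required by ansible-runner-service is an assumption, not taken from its documentation):

```python
# Minimal sketch, assuming the config stays a flat "key: value" YAML mapping.
# The set of "required" keys below is an assumption for illustration.

def parse_flat_yaml(text: str) -> dict:
    """Parse a flat key: value YAML document into a dict of strings."""
    config = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition(":")
        config[key.strip()] = value.strip().strip("'\"")
    return config

def validate_runner_config(config: dict) -> list:
    """Return a list of human-readable problems; empty means the config looks sane."""
    problems = []
    for key in ("playbooks_root_dir", "ssh_private_key", "port", "target_user"):
        if key not in config:
            problems.append(f"missing required key: {key}")
    for key in ("playbooks_root_dir", "ssh_private_key", "log_path"):
        if key in config and not config[key].startswith("/"):
            problems.append(f"{key} is not an absolute path: {config[key]}")
    return problems

CONFIG_TEXT = """\
version: 1
playbooks_root_dir: '/usr/share/ovirt-engine/ansible-runner-service-project'
ssh_private_key: '/etc/pki/ovirt-engine/keys/engine_id_rsa'
port: 50001
target_user: root
log_path: '/var/log/ovirt-engine'
"""

config = parse_flat_yaml(CONFIG_TEXT)
```

Note that the config itself is well-formed here; the failure in this bug is the SELinux label on the `log_path` target, not the configuration.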
The environment was recently upgraded; SELinux is in enforcing mode.
From journalctl:

```
Jul 12 10:35:03 nsimsolo41.scl.lab.tlv.redhat.com setroubleshoot[59139]: SELinux is preventing httpd from open access on the file /var/log/ovirt-engine/ansible-runner-service.log. For compl>
Jul 12 10:35:03 nsimsolo41.scl.lab.tlv.redhat.com platform-python[59139]: SELinux is preventing httpd from open access on the file /var/log/ovirt-engine/ansible-runner-service.log.
```
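When triaging logs like the ones above, the interesting bits are the source process and the denied file. As an illustrative sketch (the regex assumes the standard "SELinux is preventing &lt;proc&gt; from &lt;access&gt; access on the file &lt;path&gt;" phrasing of setroubleshoot messages; this is not an official parser):

```python
import re

# Illustrative sketch: pull the source process, access type, and denied file
# out of a setroubleshoot denial message. Assumes the standard phrasing
# "SELinux is preventing <proc> from <access> access on the file <path>".

DENIAL_RE = re.compile(
    r"SELinux is preventing (?P<process>\S+) from (?P<access>\w+) access "
    r"on the file (?P<path>\S+?)\.?$"
)

def parse_denial(message: str):
    """Return (process, access, path) for a setroubleshoot denial, or None."""
    match = DENIAL_RE.search(message)
    if not match:
        return None
    return match.group("process"), match.group("access"), match.group("path")
```

Running it on the second journalctl line above yields `("httpd", "open", "/var/log/ovirt-engine/ansible-runner-service.log")`: the httpd-labeled gunicorn worker of ansible-runner-service is being denied access to its own log.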
```
***** Plugin catchall (100. confidence) suggests **************************
If you believe that httpd should be allowed open access on the ansible-runner-service.log file by default.
Then you should report this as a bug.
You can generate a local policy module to allow this access.
Do
allow this access for now by executing:
# ausearch -c 'httpd' --raw | audit2allow -M my-httpd
# semodule -X 300 -i my-httpd.pp
```
The service couldn't write to the log, which prevented it from running the playbook. Therefore we see the traceback in the engine, which gets a wrong result from the service.
Workaround:
1. Change SELinux to permissive (`# setenforce 0`).
2. Remove the logging from config.yaml (delete the line `log_path: '/var/log/ovirt-engine'`) - I didn't try this one.

After changing SELinux to permissive, everything works again. I tried changing back to enforcing and it still works, but after rebooting the machine the error shows up again.
The current log is:

```
-rw-r--r--. 1 ovirt ovirt system_u:object_r:var_log_t:s0 5747228 Jul 12 11:11 ansible-runner-service.log
```
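The generic `var_log_t` type in the listing above is the label httpd is denied access to. A small sketch for spotting this known-bad case in `ls -Z` style output (what the *correct* label should be is part of the real fix tracked in BZ 1851998, so this only detects the generic label, not the right one):

```python
# Small sketch: extract the SELinux type (the third field of the
# user:role:type:level context) from a long-listing line like the ls -Z
# output above, and flag the generic var_log_t label seen in this bug.

def selinux_type(ls_line: str):
    """Return the SELinux type from an ls -Z style line, or None."""
    for field in ls_line.split():
        parts = field.split(":")
        if len(parts) >= 4 and parts[0].endswith("_u"):
            return parts[2]  # user:role:type:level -> type
    return None

def has_generic_log_label(ls_line: str) -> bool:
    """True when the file carries the generic var_log_t label from this bug."""
    return selinux_type(ls_line) == "var_log_t"
```

On the listing above this returns `var_log_t`, confirming the log never received a type the confined httpd domain may open.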
Moving it to infra for further investigation.
Marking as TestOnly to test the OVA export flow, because the real fix will be included as part of BZ 1851998.

*** Bug 1856300 has been marked as a duplicate of this bug. ***

Verified:
ovirt-engine-4.4.1.10-0.1.el8ev
vdsm-4.40.22-1.el8ev.x86_64
qemu-kvm-4.2.0-29.module+el8.2.1+7297+a825794d.x86_64
libvirt-daemon-6.0.0-25.module+el8.2.1+7154+47ffd890.x86_64

Since the problem described in this bug report should be resolved in a recent advisory, it has been closed with a resolution of ERRATA. For information on the advisory (RHV Manager (ovirt-engine) 4.4 (0-day)), and where to find the updated files, follow the link below. If the solution does not work for you, open a new bug report.
https://access.redhat.com/errata/RHBA-2020:3317
Description of problem:
OVA export or import of a VM or template fails with an internal server error in engine.log:

* OVA import:

```
2020-07-11 12:48:47,442+03 ERROR [org.ovirt.engine.core.bll.GetVmFromOvaQuery] (default task-9) [7904c5ad-58d2-4abe-afef-41e2585270ad] Query 'GetVmFromOvaQuery' failed: EngineException: Failed to query OVA info (Failed with error GeneralException and code 100)
2020-07-11 12:48:47,443+03 ERROR [org.ovirt.engine.core.bll.GetVmFromOvaQuery] (default task-9) [7904c5ad-58d2-4abe-afef-41e2585270ad] Exception: org.ovirt.engine.core.common.errors.EngineException: EngineException: Failed to query OVA info (Failed with error GeneralException and code 100)
.
.
```

* OVA export:

```
2020-07-11 12:49:37,054+03 ERROR [org.ovirt.engine.core.common.utils.ansible.AnsibleExecutor] (default task-9) [3e0b01c6-dd9b-4dad-8f76-4a4935841d81] Exception: Internal server error
2020-07-11 12:49:37,055+03 WARN [org.ovirt.engine.core.bll.exportimport.ExportVmToOvaCommand] (default task-9) [3e0b01c6-dd9b-4dad-8f76-4a4935841d81] Validation of action 'ExportVmToOva' failed for user admin@internal-authz. Reasons: VAR__ACTION__EXPORT,VAR__TYPE__VM,ACTION_TYPE_FAILED_INVALID_OVA_DESTINATION_FOLDER,$vdsName rose07,$directory /home/nisim/rhv_ova2
2020-07-11 12:49:37,056+03 INFO [org.ovirt.engine.core.bll.exportimport.ExportVmToOvaCommand] (default task-9) [3e0b01c6-dd9b-4dad-8f76-4a4935841d81] Lock freed to object 'EngineLock:{exclusiveLocks='[c5653231-315d-4a5e-949c-3e3f51216142=VM]', sharedLocks=''}'
```

Version-Release number of selected component (if applicable):
ovirt-engine-4.4.1.8-0.7.el8ev
vdsm-4.40.22-1.el8ev.x86_64
libvirt-daemon-6.0.0-25.module+el8.2.1+7154+47ffd890.x86_64
qemu-kvm-4.2.0-29.module+el8.2.1+7297+a825794d.x86_64

How reproducible:
100%

Steps to Reproduce:
1. Try to export a VM as OVA
2. Try to import a VM OVA
3. Try to export a template as OVA
4. Try to import a template OVA

Actual results:
All fail with an internal server error.
Expected results:
OVA export/import of a VM or template should succeed.

Additional info:
engine.log and vdsm.log attached.