Description of problem:
When importing an OVA of a VM with Q35/UEFI, the import failed with:

2022-04-19 09:28:43,001+03 ERROR [org.ovirt.engine.core.bll.exportimport.ExtractOvaCommand] (EE-ManagedScheduledExecutorService-engineScheduledThreadPool-Thread-89) [12a71e5b-2334-4bf8-9aac-a360611cc180] Command 'org.ovirt.engine.core.bll.exportimport.ExtractOvaCommand' failed: Duplicate key nvram (attempted merging values =0=QlpoO............

Version-Release number of selected component (if applicable):
ovirt-engine-4.5.0.2-0.7.el8ev
vdsm-4.50.0.12-1.el8ev.x86_64
qemu-kvm-6.2.0-11.module+el8.6.0+14707+5aa4b42d.x86_64
libvirt-daemon-8.0.0-5.module+el8.6.0+14480+c0a3aa0f.x86_64

How reproducible:
100%

Steps to Reproduce:
1. Create a new VM with the Q35 chipset and UEFI firmware, and install RHEL 8 on that VM.
2. Export the VM as an OVA.
3. Import the OVA.

Actual results:
Import failed.

Expected results:
Import should succeed.

Additional info:
engine.log, vdsm.log, and OVA logs attached.
This bug report has Keywords: Regression or TestBlocker. Since no regressions or test blockers are allowed between releases, it is also being identified as a blocker for this release. Please resolve ASAP.
Note that the duplicate key error doesn't refer to the database, as I initially suspected, but to the Java code that collects the items from the stream into a map.
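For illustration, here is a minimal sketch of how Java's `Collectors.toMap` produces exactly this error when the same key arrives twice from a stream. The class name and entry values are hypothetical, not the engine's actual code; only the `Collectors.toMap` behavior itself is the point.

```java
import java.util.Map;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class DuplicateKeyDemo {
    // Collects key/value pairs into a map. Without a merge function,
    // Collectors.toMap throws IllegalStateException on a duplicate key.
    static String collectOrError(Stream<String[]> entries) {
        try {
            Map<String, String> m = entries.collect(
                    Collectors.toMap(e -> e[0], e -> e[1]));
            return "ok: " + m.size() + " entries";
        } catch (IllegalStateException e) {
            // On Java 9+ the message reads like:
            // "Duplicate key nvram (attempted merging values ... and ...)"
            return e.getMessage();
        }
    }

    public static void main(String[] args) {
        // Same key twice, as in the failing import.
        System.out.println(collectOrError(Stream.of(
                new String[]{"nvram", "value-from-run-1"},
                new String[]{"nvram", "value-from-run-2"})));
    }
}
```

This matches the engine log: the stream handed the collector two entries with the key `nvram`.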
I could reproduce the error with the OVA provided by Nisim in my environment. However, after upgrading to ovirt-engine 4.5.1-0.2.master.20220419102739.git021b0f27b7.el8 and ansible-runner 2.1.3-1.el8, the import succeeds. I was told there was a different but similar problem with ansible-runner, fixed recently.
(In reply to Milan Zamazal from comment #7) > I could reproduce the error with OVA provided by Nisim in my environment. > However, after upgrading to ovirt-engine > 4.5.1-0.2.master.20220419102739.git021b0f27b7.el8 and ansible-runner > 2.1.3-1.el8, the import succeeds. I was told there was a different but > similar problem, fixed recently, with ansible-runner. So move it back to ON_QE for retest; NEW is not going to help anyone.
Reassigned. Issue still exists with:
ovirt-engine-4.5.0.4-0.1.el8ev
ansible-runner-2.1.3-1.el8ev.noarch
vdsm-4.50.0.13-1.el8ev.x86_64
qemu-kvm-6.2.0-11.module+el8.6.0+14707+5aa4b42d.x86_64
libvirt-daemon-8.0.0-5.module+el8.6.0+14480+c0a3aa0f.x86_64

engine.log attached; see 2022-04-24 21:44:44,840+03 ERROR [org.ovirt.engine.core.bll.exportimport.ExtractOvaCommand]
Target release should be set once a package build is known to fix an issue. Since this bug is not in MODIFIED, the target version has been reset. Please use the target milestone to plan a fix for an oVirt release.
I could reproduce the problem once out of multiple attempts on 4.5.0.4. So I can confirm the problem is still present, but I don't know yet under which circumstances it occurs or what causes it.
There is a difference in the ovirt-ova-external-data ansible logs in /var/log/ovirt-engine/ova/. If the import succeeds, there is:

2022-04-25 11:37:20 UTC - TASK [python-ver-detect : Run import yaml on py3] ******************************
2022-04-25 11:37:21 UTC - TASK [ovirt-ova-external-data : Run query script] ******************************
2022-04-25 11:37:21 UTC - {
  ...
  "stdout" : "",
  "stderr" : "",
  ...
2022-04-25 11:37:21 UTC - TASK [python-ver-detect : Set facts] *******************************************
2022-04-25 11:37:23 UTC - TASK [ovirt-ova-external-data : Run query script] ******************************
2022-04-25 11:37:23 UTC - {
  ...
  "stdout" : ";nvram==0=...\r\n",
  "stdout_lines" : [ ";nvram==0=" ],
  ...
}

In case the import failed (just once in my setup):

2022-04-25 11:26:44 UTC - TASK [python-ver-detect : Run import yaml on py3] ******************************
2022-04-25 11:26:45 UTC - TASK [ovirt-ova-external-data : Run query script] ******************************
2022-04-25 11:26:45 UTC - {
  "stdout" : ";nvram==0=...\r\n",
  "stdout_lines" : [ ";nvram==0=..." ],
}
2022-04-25 11:26:45 UTC - TASK [python-ver-detect : Set facts] *******************************************
2022-04-25 11:26:47 UTC - TASK [ovirt-ova-external-data : Run query script] ******************************
2022-04-25 11:26:47 UTC - {
  ...
  "stdout" : ";nvram==0=...\r\n",
  "stdout_lines" : [ ";nvram==0=..." ],
  ...
}

So it looks like the OVA retrieval always runs twice, but sometimes produces the output only on the second run and sometimes on both runs. I don't think it should run twice -- a possible problem in running ansible-runner?
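The double run would explain the engine error: if both runs of the query script emit the `;nvram=...` entry and the combined key/value pairs are collected into a map without a merge function, the second `nvram` entry triggers exactly the "Duplicate key" failure. Below is a hypothetical sketch of this, not the engine's actual parsing code (`parseExternalData` and the sample values are made up; the `;key=value` split is a simplification). Using the `Collectors.toMap` overload with a merge function is one possible defensive approach, not necessarily what the actual fix does.

```java
import java.util.Arrays;
import java.util.Map;
import java.util.stream.Collectors;

public class ExternalDataSketch {
    // Splits ";key=value;key=value" style script output into a map.
    // The merge function (first, second) -> second keeps the last value,
    // so a duplicated run of the query script no longer aborts the import.
    static Map<String, String> parseExternalData(String stdout) {
        return Arrays.stream(stdout.split(";"))
                .filter(s -> s.contains("="))
                .map(s -> s.split("=", 2))
                .collect(Collectors.toMap(
                        p -> p[0], p -> p[1], (first, second) -> second));
    }

    public static void main(String[] args) {
        // Simulates the failing case: both runs printed the nvram entry.
        String twice = ";nvram==0=AAA;nvram==0=AAA";
        System.out.println(parseExternalData(twice));
    }
}
```

Without the third argument to `Collectors.toMap`, the same input throws `IllegalStateException: Duplicate key nvram ...`, matching the log excerpt in the description.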
Probably, but this makes it more difficult for me to understand how it can be fixed by https://github.com/oVirt/ovirt-engine/pull/237
Well, it will ;-) Moving to 4.5.1, as the workaround is relatively simple, even if weird.
(In reply to Michal Skrivanek from comment #18) > well, it will;-) So can you please explain to me how, on https://github.com/oVirt/ovirt-engine/pull/237?
Verified with:
ovirt-engine-4.5.0.6-0.7.el8ev
ansible-runner-2.1.3-1.el8ev.noarch
vdsm-4.50.0.13-1.el8ev.x86_64
qemu-kvm-6.2.0-11.module+el8.6.0+14707+5aa4b42d.x86_64
libvirt-daemon-8.0.0-5.module+el8.6.0+14480+c0a3aa0f.x86_64

Verification scenario:
1. Export a VM with Q35/UEFI as an OVA.
2. Import the OVA many times, both as a single import and as multiple concurrent imports. Verify the import succeeds, run the VM, and verify the VM is running and the OS is available. Observe engine.log and vdsm.log and verify there are no errors.
3. Repeat steps 1-2 for Q35/SecureBoot.
4. Repeat steps 1-2 for Q35/BIOS.
5. Repeat steps 1-2 for I440FX/BIOS.
6. Repeat steps 1-2 for a Q35/UEFI Windows VM.