Bug 1795672 - Failed to deploy HE 4.4.
Summary: Failed to deploy HE 4.4.
Keywords:
Status: CLOSED CURRENTRELEASE
Alias: None
Product: ovirt-engine-dwh
Classification: oVirt
Component: General
Version: 4.4.0
Hardware: x86_64
OS: Linux
Priority: unspecified
Severity: urgent
Target Milestone: ovirt-4.4.0
Target Release: ---
Assignee: Shirly Radco
QA Contact: Nikolai Sednev
URL:
Whiteboard:
Depends On: 1804672
Blocks: 1527077 1569593 1641694 1664479 1665138 1686575 1695523 1727581 1768511
 
Reported: 2020-01-28 15:39 UTC by Nikolai Sednev
Modified: 2020-05-20 20:01 UTC
CC List: 17 users

Fixed In Version: ovirt-engine-dwh-4.4.0.1-1.el8ev
Clone Of:
Environment:
Last Closed: 2020-05-20 20:01:54 UTC
oVirt Team: Integration
Embargoed:
pm-rhel: ovirt-4.4+
pm-rhel: blocker?


Attachments
sosreport from alma04 (6.41 MB, application/x-xz)
2020-01-28 15:39 UTC, Nikolai Sednev, no flags
sosreport from puma18 (5.48 MB, application/x-xz)
2020-03-25 09:35 UTC, Nikolai Sednev, no flags

Description Nikolai Sednev 2020-01-28 15:39:50 UTC
Created attachment 1656034 [details]
sosreport from alma04

Description of problem:
Failed to deploy HE 4.4.

[ ERROR ] fatal: [localhost]: FAILED! => {"attempts": 180, "changed": true, "cmd": ["hosted-engine", "--vm-status", "--json"], "delta": "0:00:00.397463", "end": "2020-01-28 10:24:43.801109", "rc": 0, "start": "2020-01-28 10:24:43.403646", "stderr": "", "stderr_lines": [], "stdout": "{\"1\": {\"host-id\": 1, \"host-ts\": 12573, \"score\": 0, \"engine-status\": {\"vm\": \"down_unexpected\", \"health\": \"bad\", \"detail\": \"Down\", \"reason\": \"bad vm status\"}, \"hostname\": \"alma04.qa.lab.tlv.redhat.com\", \"maintenance\": false, \"stopped\": false, \"crc32\": \"c578ad26\", \"conf_on_shared_storage\": true, \"local_conf_timestamp\": 12573, \"extra\": \"metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=12573 (Tue Jan 28 10:24:41 2020)\\nhost-id=1\\nscore=0\\nvm_conf_refresh_time=12573 (Tue Jan 28 10:24:41 2020)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=EngineUnexpectedlyDown\\nstopped=False\\ntimeout=Wed Dec 31 22:33:03 1969\\n\", \"live-data\": true}, \"global_maintenance\": false}", "stdout_lines": ["{\"1\": {\"host-id\": 1, \"host-ts\": 12573, \"score\": 0, \"engine-status\": {\"vm\": \"down_unexpected\", \"health\": \"bad\", \"detail\": \"Down\", \"reason\": \"bad vm status\"}, \"hostname\": \"alma04.qa.lab.tlv.redhat.com\", \"maintenance\": false, \"stopped\": false, \"crc32\": \"c578ad26\", \"conf_on_shared_storage\": true, \"local_conf_timestamp\": 12573, \"extra\": \"metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=12573 (Tue Jan 28 10:24:41 2020)\\nhost-id=1\\nscore=0\\nvm_conf_refresh_time=12573 (Tue Jan 28 10:24:41 2020)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=EngineUnexpectedlyDown\\nstopped=False\\ntimeout=Wed Dec 31 22:33:03 1969\\n\", \"live-data\": true}, \"global_maintenance\": false}"]}
[ INFO  ] TASK [ovirt.hosted_engine_setup : Check VM status at virt level]
[ INFO  ] TASK [ovirt.hosted_engine_setup : Fail if engine VM is not running]
[ ERROR ] fatal: [localhost]: FAILED! => {"changed": false, "msg": "Engine VM is not running, please check vdsm logs"}
[ ERROR ] Failed to execute stage 'Closing up': Failed executing ansible-playbook
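
For triage, the fields the retry loop keys off are buried in that stdout blob. A minimal sketch for pulling them out on the host (the python3 one-liner is my own parsing choice, not part of the setup tooling):

hosted-engine --vm-status --json | python3 -c '
import json, sys
status = json.load(sys.stdin)
for key, host in status.items():
    if not isinstance(host, dict):   # skips the top-level global_maintenance flag
        continue
    es = host["engine-status"]
    print(key, host["hostname"], "score:", host["score"], "vm:", es["vm"], "health:", es["health"])
'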

Version-Release number of selected component (if applicable):
ovirt-hosted-engine-setup-2.4.0-1.el8ev.noarch
ovirt-hosted-engine-ha-2.4.0-1.el8ev.noarch
rhvm-appliance.x86_64 2:4.4-20200116.0.el8ev  
@rhv-4.4.0                      
Linux 4.18.0-167.el8.x86_64 #1 SMP Sun Dec 15 01:24:23 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux               
Red Hat Enterprise Linux release 8.2 Beta (Ootpa)


How reproducible:
100%

Steps to Reproduce:
1. Deploy HE over NFS.

Actual results:
Deployment fails.

Expected results:
Should succeed.

Additional info:

Comment 1 Evgeny Slutsky 2020-02-11 11:27:31 UTC
Is this still reproducible?

Comment 2 Nikolai Sednev 2020-02-18 10:48:48 UTC
(In reply to Evgeny Slutsky from comment #1)
> Is this still reproducible?

Yes.
[ ERROR ] fatal: [localhost]: FAILED! => {"attempts": 180, "changed": true, "cmd": ["hosted-engine", "--vm-status", "--json"], "delta": "0:00:00.396908", "end": "2020-02-18 12:43:23.732684", "rc": 0, "start": "2020-02-18 12:43:23.335776", "stderr": "", "stderr_lines": [], "stdout": "{\"1\": {\"host-id\": 1, \"host-ts\": 6344, \"score\": 0, \"engine-status\": {\"vm\": \"down_unexpected\", \"health\": \"bad\", \"detail\": \"Down\", \"reason\": \"bad vm status\"}, \"hostname\": \"alma03.qa.lab.tlv.redhat.com\", \"maintenance\": false, \"stopped\": false, \"crc32\": \"f2360116\", \"conf_on_shared_storage\": true, \"local_conf_timestamp\": 6344, \"extra\": \"metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=6344 (Tue Feb 18 12:43:14 2020)\\nhost-id=1\\nscore=0\\nvm_conf_refresh_time=6344 (Tue Feb 18 12:43:14 2020)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=EngineUnexpectedlyDown\\nstopped=False\\ntimeout=Thu Jan  1 03:49:22 1970\\n\", \"live-data\": true}, \"global_maintenance\": false}", "stdout_lines": ["{\"1\": {\"host-id\": 1, \"host-ts\": 6344, \"score\": 0, \"engine-status\": {\"vm\": \"down_unexpected\", \"health\": \"bad\", \"detail\": \"Down\", \"reason\": \"bad vm status\"}, \"hostname\": \"alma03.qa.lab.tlv.redhat.com\", \"maintenance\": false, \"stopped\": false, \"crc32\": \"f2360116\", \"conf_on_shared_storage\": true, \"local_conf_timestamp\": 6344, \"extra\": \"metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=6344 (Tue Feb 18 12:43:14 2020)\\nhost-id=1\\nscore=0\\nvm_conf_refresh_time=6344 (Tue Feb 18 12:43:14 2020)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=EngineUnexpectedlyDown\\nstopped=False\\ntimeout=Thu Jan  1 03:49:22 1970\\n\", \"live-data\": true}, \"global_maintenance\": false}"]}
[ INFO  ] TASK [ovirt.hosted_engine_setup : Check VM status at virt level]
[ INFO  ] TASK [ovirt.hosted_engine_setup : Fail if engine VM is not running]
[ ERROR ] fatal: [localhost]: FAILED! => {"changed": false, "msg": "Engine VM is not running, please check vdsm logs"}
[ ERROR ] Failed to execute stage 'Closing up': Failed executing ansible-playbook
[ INFO  ] Stage: Clean up
[ INFO  ] Cleaning temporary resources
[ INFO  ] TASK [ovirt.hosted_engine_setup : Execute just a specific set of steps]
[ INFO  ] ok: [localhost]
[ INFO  ] TASK [ovirt.hosted_engine_setup : Force facts gathering]
[ INFO  ] ok: [localhost]
[ INFO  ] TASK [ovirt.hosted_engine_setup : Fetch logs from the engine VM]
[ INFO  ] ok: [localhost]
[ INFO  ] TASK [ovirt.hosted_engine_setup : Set destination directory path]
[ INFO  ] ok: [localhost]
[ INFO  ] TASK [ovirt.hosted_engine_setup : Create destination directory]
[ INFO  ] changed: [localhost]
[ INFO  ] TASK [ovirt.hosted_engine_setup : include_tasks]
[ INFO  ] ok: [localhost]
[ INFO  ] TASK [ovirt.hosted_engine_setup : Find the local appliance image]
[ INFO  ] ok: [localhost]
[ INFO  ] TASK [ovirt.hosted_engine_setup : Set local_vm_disk_path]
[ INFO  ] ok: [localhost]
[ INFO  ] TASK [ovirt.hosted_engine_setup : Give the vm time to flush dirty buffers]
[ INFO  ] ok: [localhost -> localhost]
[ INFO  ] TASK [ovirt.hosted_engine_setup : Copy engine logs]
[ INFO  ] changed: [localhost]
[ INFO  ] TASK [ovirt.hosted_engine_setup : include_tasks]
[ INFO  ] ok: [localhost]
[ INFO  ] TASK [ovirt.hosted_engine_setup : Remove local vm dir]
[ INFO  ] changed: [localhost]
[ INFO  ] TASK [ovirt.hosted_engine_setup : Remove temporary entry in /etc/hosts for the local VM]
[ INFO  ] ok: [localhost]
[ INFO  ] Generating answer file '/var/lib/ovirt-hosted-engine-setup/answers/answers-20200218124417.conf'
[ INFO  ] Stage: Pre-termination
[ INFO  ] Stage: Termination
[ ERROR ] Hosted Engine deployment failed: please check the logs for the issue, fix accordingly or re-deploy from scratch.
          Log file is located at /var/log/ovirt-hosted-engine-setup/ovirt-hosted-engine-setup-20200218112915-u14bi9.log

Comment 3 Nikolai Sednev 2020-02-18 11:52:02 UTC
Tested on:
ovirt-hosted-engine-ha-2.4.2-1.el8ev.noarch
ovirt-hosted-engine-setup-2.4.1-1.el8ev.noarch
Red Hat Enterprise Linux release 8.2 Beta (Ootpa)
Linux 4.18.0-167.el8.x86_64 #1 SMP Sun Dec 15 01:24:23 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux
Compose: rhv-4.4.0-20 (repodata timestamp 2020-02-09 00:41)

Comment 4 Evgeny Slutsky 2020-02-19 10:20:31 UTC
Looking at the environment, this looks like a permission error:

vdsm-client VM getStats vmID=d2f6dfd2-4f88-46a9-b713-aa74ee7e5465
[
    {
        "exitCode": 1,
        "exitMessage": "internal error: child reported (status=125): unable to open /var/run/vdsm/storage/74b3c4fa-c4cd-4c9b-a251-c099cc63ed51/3d092b2d-9668-4099-bf90-6020dfded169/3ddaeade-8b68-4b69-9f0f-ca3ca04e8d19: Permission denied",
        "exitReason": 1,
        "status": "Down",
        "statusTime": "4380676080",
        "vmId": "d2f6dfd2-4f88-46a9-b713-aa74ee7e5465"
    }
]



- Disabling SELinux didn't help.

lrwxrwxrwx. 1 vdsm kvm  165 Feb 19 12:08 3d092b2d-9668-4099-bf90-6020dfded169 -> /rhev/data-center/mnt/yellow-vdsb.qa.lab.tlv.redhat.com:_Compute__NFS_nsednev__he__1/74b3c4fa-c4cd-4c9b-a251-c099cc63ed51/images/3d092b2d-9668-4099-bf90-6020dfded169


[root@alma03 3d092b2d-9668-4099-bf90-6020dfded169]# stat  /var/run/vdsm/storage/74b3c4fa-c4cd-4c9b-a251-c099cc63ed51/3d092b2d-9668-4099-bf90-6020dfded169/3ddaeade-8b68-4b69-9f0f-ca3ca04e8d19
  File: /var/run/vdsm/storage/74b3c4fa-c4cd-4c9b-a251-c099cc63ed51/3d092b2d-9668-4099-bf90-6020dfded169/3ddaeade-8b68-4b69-9f0f-ca3ca04e8d19
  Size: 63350767616	Blocks: 121634960  IO Block: 1048576 regular file
Device: 30h/48d	Inode: 15528627761  Links: 1
Access: (0660/-rw-rw----)  Uid: (   36/    vdsm)   Gid: (   36/     kvm)
Context: system_u:object_r:nfs_t:s0
Access: 2020-02-18 12:27:11.104496879 +0200
Modify: 2020-02-18 12:25:14.678009051 +0200
Change: 2020-02-18 12:25:14.678009051 +0200
 Birth: -
[root@alma03 3d092b2d-9668-4099-bf90-6020dfded169]# id qemu
uid=107(qemu) gid=107(qemu) groups=107(qemu),11(cdrom),36(kvm)
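
Given the 0660 vdsm:kvm mode and qemu's kvm group membership shown above, the open should succeed locally, so a useful next check is whether the qemu user can actually read the image over NFS. A minimal sketch, run as root on the host, reusing the image path from the stat output above:

IMG=/var/run/vdsm/storage/74b3c4fa-c4cd-4c9b-a251-c099cc63ed51/3d092b2d-9668-4099-bf90-6020dfded169/3ddaeade-8b68-4b69-9f0f-ca3ca04e8d19
# head -c 1 forces a real open()+read() with qemu's uid/gids, as qemu-kvm would
runuser -u qemu -- head -c 1 "$IMG" >/dev/null && echo "qemu can read it" || echo "qemu: denied"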

Comment 5 Evgeny Slutsky 2020-02-19 10:26:07 UTC
After running libvirtd in debug mode, I got the same errors:

https://paste.centos.org/view/dfeae27e

Comment 6 Nikolai Sednev 2020-02-19 12:25:35 UTC
My deployment was not over root-squashed NFS storage, so I don't see how this bug depends on bug 1804672.

Comment 7 Peter Krempa 2020-02-19 12:36:57 UTC
Evgeny, did we talk about a different bug?

Comment 8 Evgeny Slutsky 2020-02-19 12:39:45 UTC
Nope, it's the same bug.
Root cannot access the files on this storage:
touch /var/run/vdsm/storage/74b3c4fa-c4cd-4c9b-a251-c099cc63ed51/3d092b2d-9668-4099-bf90-6020dfded169/test
touch: cannot touch '/var/run/vdsm/storage/74b3c4fa-c4cd-4c9b-a251-c099cc63ed51/3d092b2d-9668-4099-bf90-6020dfded169/test': Permission denied

Comment 12 Nikolai Sednev 2020-02-19 14:09:08 UTC
HE 4.3.9 deployment on the same storage works just fine:
ovirt-hosted-engine-ha-2.3.6-1.el7ev.noarch
ovirt-hosted-engine-setup-2.3.12-1.el7ev.noarch
libvirt-client-4.5.0-23.el7_7.5.x86_64
libvirt-lock-sanlock-4.5.0-23.el7_7.5.x86_64

HE 4.4 deployment fails:
ovirt-hosted-engine-ha-2.4.2-1.el8ev.noarch
ovirt-hosted-engine-setup-2.4.1-1.el8ev.noarch
libvirt-client-6.0.0-5.module+el8.2.0+5765+64816f89.x86_64
libvirt-lock-sanlock-6.0.0-5.module+el8.2.0+5765+64816f89.x86_64

It's not a storage-side issue.

Comment 13 Nir Soffer 2020-02-23 16:03:23 UTC
Looks like a duplicate of bug 1776843.

Nikolai, can you share output of:
- "cat /etc/exports" on the NFS server
- "exportfs -v" on the NFS server
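
(For reference while gathering those: a frequently used export layout for oVirt NFS domains looks like the sketch below. The path and host wildcard are placeholders, and anonuid/anongid=36 map squashed requests to vdsm:kvm. This is an illustration, not this server's actual configuration.)

# /etc/exports on the NFS server (illustrative placeholder path)
/exports/he_storage  *(rw,sync,no_subtree_check,anonuid=36,anongid=36)
exportfs -rav   # re-export everything
exportfs -v     # verify the effective options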

Comment 14 Nikolai Sednev 2020-02-24 12:39:35 UTC
(In reply to Nir Soffer from comment #13)
> Looks like a duplicate of bug 1776843.
> 
> Nikolai, can you share output of:
> - "cat /etc/exports" on the NFS server
> - "exportfs -v" on the NFS server

Sorry, not at the moment. The environment was only provided to me for 3 days and I no longer have it.

Comment 15 Nikolai Sednev 2020-02-24 12:44:14 UTC
Again, 4.3.9 deployments work just fine on that NFS storage, so from the storage side I don't see anything that would prevent 4.4 from working either.

Comment 17 Nikolai Sednev 2020-02-24 14:36:39 UTC
Please also take a look at https://bugzilla.redhat.com/show_bug.cgi?id=1467813; it might be helpful.

Comment 24 Nikolai Sednev 2020-02-29 11:24:34 UTC
NFS deployment with these components:
rhvm-appliance.x86_64 2:4.4-20200123.0.el8ev rhv-4.4.0                                               
sanlock-3.8.0-2.el8.x86_64
qemu-kvm-4.2.0-12.module+el8.2.0+5858+afd073bc.x86_64
vdsm-4.40.5-1.el8ev.x86_64
libvirt-client-6.0.0-7.module+el8.2.0+5869+c23fe68b.x86_64
ovirt-hosted-engine-setup-2.4.2-2.el8ev.noarch
ovirt-hosted-engine-ha-2.4.2-1.el8ev.noarch
Linux 4.18.0-183.el8.x86_64 #1 SMP Sun Feb 23 20:50:47 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux
Red Hat Enterprise Linux release 8.2 Beta (Ootpa)

Engine is:
Red Hat Enterprise Linux Server release 7.8 Beta (Maipo)
Linux 3.10.0-1123.el7.x86_64 #1 SMP Tue Jan 14 03:44:38 EST 2020 x86_64 x86_64 x86_64 GNU/Linux

Result - Pass.

Comment 25 Nikolai Sednev 2020-03-25 08:17:49 UTC
Hitting repository issues again; due to the errors described below, I was unable to install ovirt-hosted-engine-setup.
Updating Subscription Management repositories.
Unable to read consumer identity
This system is not registered to Red Hat Subscription Management. You can use subscription-manager to register.
Last metadata expiration check: 0:00:04 ago on Mon 23 Mar 2020 04:03:15 PM IST.
Error: 
 Problem: package ovirt-host-4.4.0-0.5.beta.el8ev.x86_64 requires ovirt-provider-ovn-driver, but none of the providers can be installed
  - package ovirt-hosted-engine-setup-2.4.2-2.el8ev.noarch requires ovirt-host >= 4.4.0, but none of the providers can be installed
  - package ovirt-provider-ovn-driver-1.2.29-1.el8ev.noarch requires python3-openvswitch >= 2.7, but none of the providers can be installed
  - conflicting requests
  - nothing provides python3-openvswitch2.11 needed by rhv-python-openvswitch-1:2.11-7.el8ev.noarch
(try to add '--skip-broken' to skip uninstallable packages or '--nobest' to use not only best candidate packages)
[root@puma18 ~]#  yum install -y ovirt-hosted-engine-setup
Updating Subscription Management repositories.
Unable to read consumer identity
This system is not registered to Red Hat Subscription Management. You can use subscription-manager to register.
Last metadata expiration check: 0:00:37 ago on Mon 23 Mar 2020 04:03:15 PM IST.
Error: 
 Problem: package ovirt-host-4.4.0-0.5.beta.el8ev.x86_64 requires ovirt-provider-ovn-driver, but none of the providers can be installed
  - package ovirt-hosted-engine-setup-2.4.2-2.el8ev.noarch requires ovirt-host >= 4.4.0, but none of the providers can be installed
  - package ovirt-provider-ovn-driver-1.2.29-1.el8ev.noarch requires python3-openvswitch >= 2.7, but none of the providers can be installed
  - conflicting requests
  - nothing provides python3-openvswitch2.11 needed by rhv-python-openvswitch-1:2.11-7.el8ev.noarch

Moving back to assigned.

Comment 26 Michal Skrivanek 2020-03-25 08:27:01 UTC
You need the fast-datapath repos; they're part of the compose. Please retest.
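
(On a host registered to RHSM, that typically means enabling the fast-datapath repo before retrying. The repo ID below is my assumption of the usual RHEL 8 channel name, since this particular host is unregistered and consumes the compose directly.)

subscription-manager repos --enable=fast-datapath-for-rhel-8-x86_64-rpms
yum clean all
yum install -y ovirt-hosted-engine-setup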

Comment 27 Nikolai Sednev 2020-03-25 09:18:18 UTC
(In reply to Michal Skrivanek from comment #26)
> You need the fast-datapath repos; they're part of the compose. Please retest.

The new repos you provided solved the issue. I was able to install the ovirt-hosted-engine-setup installer:
ovirt-hosted-engine-ha-2.4.2-1.el8ev.noarch
ovirt-hosted-engine-setup-2.4.2-2.el8ev.noarch

Comment 28 Nikolai Sednev 2020-03-25 09:23:51 UTC
During deployment it failed with this error:
[ ERROR ] fatal: [localhost -> nsednev-he-4.scl.lab.tlv.redhat.com]: FAILED! => {"changed": true, "cmd": ["engine-setup", "--accept-defaults", "--config-append=/root/ovirt-engine-answers", "--offline"], "delta": "0:00:22.842075", "end": "2020-03-25 11:21:10.763447", "msg": "non-zero return code", "rc": 1, "start": "2020-03-25 11:20:47.921372", "stderr": "\nERROR: Python 2 is disabled in RHEL8.\n\n- For guidance on porting to Python 3, see the\n    Conservative Python3 Porting Guide:\n    http://portingguide.readthedocs.io/\n\n- If you need Python 2 at runtime:\n   - Use the python27 module\n\n- If you do not have access to BZ#1533919:\n   - Use the python27 module\n\n- If you need to use Python 2 only at RPM build time:\n   - File a bug blocking BZ#1533919:\n       https://bugzilla.redhat.com/show_bug.cgi?id=1533919\n   - Set the environment variable RHEL_ALLOW_PYTHON2_FOR_BUILD=1\n       (Note that if you do not file the bug as above,\n       this workaround will break without warning in the future.)\n\n- If you need to use Python 2 only for tests:\n   - File a bug blocking BZ#1533919:\n       https://bugzilla.redhat.com/show_bug.cgi?id=1533919\n     (If your test tool does not have a Bugzilla component,\n        feel free to use `python2`.)\n   - Use /usr/bin/python2-for-tests instead of python2 to run\n       your tests.\n       (Note that if you do not file the bug as above,\n       this workaround will break without warning in the future.)\n\nFor details, see https://hurl.corp.redhat.com/rhel8-py2\n\nFatal Python error: Python 2 is disabled\n\nERROR: Python 2 is disabled in RHEL8.\n\n- For guidance on porting to Python 3, see the\n    Conservative Python3 Porting Guide:\n    http://portingguide.readthedocs.io/\n\n- If you need Python 2 at runtime:\n   - Use the python27 module\n\n- If you do not have access to BZ#1533919:\n   - Use the python27 module\n\n- If you need to use Python 2 only at RPM build time:\n   - File a bug blocking BZ#1533919:\n       https://bugzilla.redhat.com/show_bug.cgi?id=1533919\n   - Set the environment variable RHEL_ALLOW_PYTHON2_FOR_BUILD=1\n       (Note that if you do not file the bug as above,\n       this workaround will break without warning in the future.)\n\n- If you need to use Python 2 only for tests:\n   - File a bug blocking BZ#1533919:\n       https://bugzilla.redhat.com/show_bug.cgi?id=1533919\n     (If your test tool does not have a Bugzilla component,\n        feel free to use `python2`.)\n   - Use /usr/bin/python2-for-tests instead of python2 to run\n       your tests.\n       (Note that if you do not file the bug as above,\n       this workaround will break without warning in the future.)\n\nFor details, see https://hurl.corp.redhat.com/rhel8-py2\n\nFatal Python error: Python 2 is disabled\n/usr/share/otopi/otopi-functions: line 47: [: : integer expression expected", "stderr_lines": ["", "ERROR: Python 2 is disabled in RHEL8.", "", "- For guidance on porting to Python 3, see the", "    Conservative Python3 Porting Guide:", "    http://portingguide.readthedocs.io/", "", "- If you need Python 2 at runtime:", "   - Use the python27 module", "", "- If you do not have access to BZ#1533919:", "   - Use the python27 module", "", "- If you need to use Python 2 only at RPM build time:", "   - File a bug blocking BZ#1533919:", "       https://bugzilla.redhat.com/show_bug.cgi?id=1533919", "   - Set the environment variable RHEL_ALLOW_PYTHON2_FOR_BUILD=1", "       (Note that if you do not file the bug as above,", "       this workaround will break 
without warning in the future.)", "", "- If you need to use Python 2 only for tests:", "   - File a bug blocking BZ#1533919:", "       https://bugzilla.redhat.com/show_bug.cgi?id=1533919", "     (If your test tool does not have a Bugzilla component,", "        feel free to use `python2`.)", "   - Use /usr/bin/python2-for-tests instead of python2 to run", "       your tests.", "       (Note that if you do not file the bug as above,", "       this workaround will break without warning in the future.)", "", "For details, see https://hurl.corp.redhat.com/rhel8-py2", "", "Fatal Python error: Python 2 is disabled", "", "ERROR: Python 2 is disabled in RHEL8.", "", "- For guidance on porting to Python 3, see the", "    Conservative Python3 Porting Guide:", "    http://portingguide.readthedocs.io/", "", "- If you need Python 2 at runtime:", "   - Use the python27 module", "", "- If you do not have access to BZ#1533919:", "   - Use the python27 module", "", "- If you need to use Python 2 only at RPM build time:", "   - File a bug blocking BZ#1533919:", "       https://bugzilla.redhat.com/show_bug.cgi?id=1533919", "   - Set the environment variable RHEL_ALLOW_PYTHON2_FOR_BUILD=1", "       (Note that if you do not file the bug as above,", "       this workaround will break without warning in the future.)", "", "- If you need to use Python 2 only for tests:", "   - File a bug blocking BZ#1533919:", "       https://bugzilla.redhat.com/show_bug.cgi?id=1533919", "     (If your test tool does not have a Bugzilla component,", "        feel free to use `python2`.)", "   - Use /usr/bin/python2-for-tests instead of python2 to run", "       your tests.", "       (Note that if you do not file the bug as above,", "       this workaround will break without warning in the future.)", "", "For details, see https://hurl.corp.redhat.com/rhel8-py2", "", "Fatal Python error: Python 2 is disabled", "/usr/share/otopi/otopi-functions: line 47: [: : integer expression expected"], "stdout": "[ INFO  ] Stage: Initializing\n[ INFO  ] Stage: Environment setup\n          Configuration files: /etc/ovirt-engine-setup.conf.d/10-packaging-wsp.conf, /etc/ovirt-engine-setup.conf.d/10-packaging.conf, /root/ovirt-engine-answers\n          Log file: /var/log/ovirt-engine/setup/ovirt-engine-setup-20200325112049-tq8gq9.log\n          Version: otopi-1.9.0 (otopi-1.9.0-1.el8ev)\n[ INFO  ] Stage: Environment packages setup\n[ INFO  ] Stage: Programs detection\n[ INFO  ] Stage: Environment setup (late)\n[ INFO  ] Stage: Environment customization\n         \n          --== PRODUCT OPTIONS ==--\n         \n          Set up Cinderlib integration\n          (Currently in tech preview)\n          (Yes, No) [No]: \n          Configure ovirt-provider-ovn (Yes, No) [Yes]: \n         \n          * Please note * : Data Warehouse is required for the engine.\n          If you choose to not configure it on this host, you have to configure\n          it on a remote host, and then configure the engine on this host so\n          that it can access the database of the remote Data Warehouse host.\n          Configure Data Warehouse on this host (Yes, No) [Yes]: \n         \n          --== PACKAGES ==--\n         \n         \n          --== NETWORK CONFIGURATION ==--\n         \n          Host fully qualified DNS name of this server [nsednev-he-4.scl.lab.tlv.redhat.com]: \n[ INFO  ] firewalld will be configured as firewall manager.\n         \n          --== DATABASE CONFIGURATION ==--\n         \n          Where is the DWH database located? 
(Local, Remote) [Local]: \n          Setup can configure the local postgresql server automatically for the DWH to run. This may conflict with existing applications.\n          Would you like Setup to automatically configure postgresql and create DWH database, or prefer to perform that manually? (Automatic, Manual) [Automatic]: \n         \n          --== OVIRT ENGINE CONFIGURATION ==--\n         \n          Use default credentials (admin@internal) for ovirt-provider-ovn (Yes, No) [Yes]: \n         \n          --== STORAGE CONFIGURATION ==--\n         \n         \n          --== PKI CONFIGURATION ==--\n         \n          Organization name for certificate [scl.lab.tlv.redhat.com]: \n         \n          --== APACHE CONFIGURATION ==--\n         \n         \n          --== SYSTEM CONFIGURATION ==--\n         \n         \n          --== MISC CONFIGURATION ==--\n         \n          Please choose Data Warehouse sampling scale:\n          (1) Basic\n          (2) Full\n          (1, 2)[1]: \n         \n          --== END OF CONFIGURATION ==--\n         \n[ INFO  ] Stage: Setup validation\n         \n          --== CONFIGURATION PREVIEW ==--\n         \n          Application mode                        : both\n          Default SAN wipe after delete           : False\n          Host FQDN                               : nsednev-he-4.scl.lab.tlv.redhat.com\n          Firewall manager                        : firewalld\n          Update Firewall                         : True\n          Set up Cinderlib integration            : False\n          Configure local Engine database         : True\n          Set application as default page         : True\n          Configure Apache SSL                    : True\n          Engine database host                    : localhost\n          Engine database port                    : 5432\n          Engine database secured connection      : False\n          Engine database host name validation    : False\n          Engine database name                    : engine\n          Engine database user name               : engine\n          Engine installation                     : True\n          PKI organization                        : scl.lab.tlv.redhat.com\n          Set up ovirt-provider-ovn               : True\n          Configure WebSocket Proxy               : True\n          DWH installation                        : True\n          DWH database host                       : localhost\n          DWH database port                       : 5432\n          Configure local DWH database            : True\n          Configure VMConsole Proxy               : True\n[ INFO  ] Stage: Transaction setup\n[ INFO  ] Stopping engine service\n[ INFO  ] Stopping ovirt-fence-kdump-listener service\n[ INFO  ] Stopping dwh service\n[ INFO  ] Stopping vmconsole-proxy service\n[ INFO  ] Stopping websocket-proxy service\n[ INFO  ] Stage: Misc configuration (early)\n[ INFO  ] Stage: Package installation\n[ INFO  ] Stage: Misc configuration\n[ INFO  ] Upgrading CA\n[ INFO  ] Initializing PostgreSQL\n[ INFO  ] Creating PostgreSQL 'engine' database\n[ INFO  ] Configuring PostgreSQL\n[ INFO  ] Creating PostgreSQL 'ovirt_engine_history' database\n[ INFO  ] Configuring PostgreSQL\n[ INFO  ] Creating CA: /etc/pki/ovirt-engine/ca.pem\n[ INFO  ] Creating CA: /etc/pki/ovirt-engine/qemu-ca.pem\n[ INFO  ] Updating OVN SSL configuration\n[ INFO  ] Creating/refreshing DWH database schema\n[ ERROR ] Failed to execute stage 'Misc configuration': Command '/usr/share/ovirt-engine-dwh/dbscripts/schema.sh' 
failed to execute\n[ INFO  ] Rolling back DWH database schema\n[ INFO  ] Clearing DWH database ovirt_engine_history\n[ INFO  ] Stage: Clean up\n          Log file is located at /var/log/ovirt-engine/setup/ovirt-engine-setup-20200325112049-tq8gq9.log\n[ INFO  ] Generating answer file '/var/lib/ovirt-engine/setup/answers/20200325112110-setup.conf'\n[ INFO  ] Stage: Pre-termination\n[ INFO  ] Stage: Termination\n[ ERROR ] Execution of setup failed", "stdout_lines": ["[ INFO  ] Stage: Initializing", "[ INFO  ] Stage: Environment setup", "          Configuration files: /etc/ovirt-engine-setup.conf.d/10-packaging-wsp.conf, /etc/ovirt-engine-setup.conf.d/10-packaging.conf, /root/ovirt-engine-answers", "          Log file: /var/log/ovirt-engine/setup/ovirt-engine-setup-20200325112049-tq8gq9.log", "          Version: otopi-1.9.0 (otopi-1.9.0-1.el8ev)", "[ INFO  ] Stage: Environment packages setup", "[ INFO  ] Stage: Programs detection", "[ INFO  ] Stage: Environment setup (late)", "[ INFO  ] Stage: Environment customization", "         ", "          --== PRODUCT OPTIONS ==--", "         ", "          Set up Cinderlib integration", "          (Currently in tech preview)", "          (Yes, No) [No]: ", "          Configure ovirt-provider-ovn (Yes, No) [Yes]: ", "         ", "          * Please note * : Data Warehouse is required for the engine.", "          If you choose to not configure it on this host, you have to configure", "          it on a remote host, and then configure the engine on this host so", "          that it can access the database of the remote Data Warehouse host.", "          Configure Data Warehouse on this host (Yes, No) [Yes]: ", "         ", "          --== PACKAGES ==--", "         ", "         ", "          --== NETWORK CONFIGURATION ==--", "         ", "          Host fully qualified DNS name of this server [nsednev-he-4.scl.lab.tlv.redhat.com]: ", "[ INFO  ] firewalld will be configured as firewall manager.", "         ", "          --== DATABASE CONFIGURATION ==--", "         ", "          Where is the DWH database located? (Local, Remote) [Local]: ", "          Setup can configure the local postgresql server automatically for the DWH to run. This may conflict with existing applications.", "          Would you like Setup to automatically configure postgresql and create DWH database, or prefer to perform that manually? 
(Automatic, Manual) [Automatic]: ", "         ", "          --== OVIRT ENGINE CONFIGURATION ==--", "         ", "          Use default credentials (admin@internal) for ovirt-provider-ovn (Yes, No) [Yes]: ", "         ", "          --== STORAGE CONFIGURATION ==--", "         ", "         ", "          --== PKI CONFIGURATION ==--", "         ", "          Organization name for certificate [scl.lab.tlv.redhat.com]: ", "         ", "          --== APACHE CONFIGURATION ==--", "         ", "         ", "          --== SYSTEM CONFIGURATION ==--", "         ", "         ", "          --== MISC CONFIGURATION ==--", "         ", "          Please choose Data Warehouse sampling scale:", "          (1) Basic", "          (2) Full", "          (1, 2)[1]: ", "         ", "          --== END OF CONFIGURATION ==--", "         ", "[ INFO  ] Stage: Setup validation", "         ", "          --== CONFIGURATION PREVIEW ==--", "         ", "          Application mode                        : both", "          Default SAN wipe after delete           : False", "          Host FQDN                               : nsednev-he-4.scl.lab.tlv.redhat.com", "          Firewall manager                        : firewalld", "          Update Firewall                         : True", "          Set up Cinderlib integration            : False", "          Configure local Engine database         : True", "          Set application as default page         : True", "          Configure Apache SSL                    : True", "          Engine database host                    : localhost", "          Engine database port                    : 5432", "          Engine database secured connection      : False", "          Engine database host name validation    : False", "          Engine database name                    : engine", "          Engine database user name               : engine", "          Engine installation                     : True", "          PKI organization                        : scl.lab.tlv.redhat.com", "          Set up ovirt-provider-ovn               : True", "          Configure WebSocket Proxy               : True", "          DWH installation                        : True", "          DWH database host                       : localhost", "          DWH database port                       : 5432", "          Configure local DWH database            : True", "          Configure VMConsole Proxy               : True", "[ INFO  ] Stage: Transaction setup", "[ INFO  ] Stopping engine service", "[ INFO  ] Stopping ovirt-fence-kdump-listener service", "[ INFO  ] Stopping dwh service", "[ INFO  ] Stopping vmconsole-proxy service", "[ INFO  ] Stopping websocket-proxy service", "[ INFO  ] Stage: Misc configuration (early)", "[ INFO  ] Stage: Package installation", "[ INFO  ] Stage: Misc configuration", "[ INFO  ] Upgrading CA", "[ INFO  ] Initializing PostgreSQL", "[ INFO  ] Creating PostgreSQL 'engine' database", "[ INFO  ] Configuring PostgreSQL", "[ INFO  ] Creating PostgreSQL 'ovirt_engine_history' database", "[ INFO  ] Configuring PostgreSQL", "[ INFO  ] Creating CA: /etc/pki/ovirt-engine/ca.pem", "[ INFO  ] Creating CA: /etc/pki/ovirt-engine/qemu-ca.pem", "[ INFO  ] Updating OVN SSL configuration", "[ INFO  ] Creating/refreshing DWH database schema", "[ ERROR ] Failed to execute stage 'Misc configuration': Command '/usr/share/ovirt-engine-dwh/dbscripts/schema.sh' failed to execute", "[ INFO  ] Rolling back DWH database schema", "[ INFO  ] Clearing DWH database ovirt_engine_history", "[ INFO  ] Stage: Clean 
up", "          Log file is located at /var/log/ovirt-engine/setup/ovirt-engine-setup-20200325112049-tq8gq9.log", "[ INFO  ] Generating answer file '/var/lib/ovirt-engine/setup/answers/20200325112110-setup.conf'", "[ INFO  ] Stage: Pre-termination", "[ INFO  ] Stage: Termination", "[ ERROR ] Execution of setup failed"]}
[ INFO  ] TASK [ovirt.engine-setup : Clean temporary files]
[ INFO  ] changed: [localhost -> nsednev-he-4.scl.lab.tlv.redhat.com]
[ ERROR ] Failed to execute stage 'Closing up': Failed executing ansible-playbook
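
(The actionable line in that wall of output is '/usr/share/ovirt-engine-dwh/dbscripts/schema.sh' failing during 'Misc configuration'. To see the underlying schema error, grep the engine-setup log referenced above, either on the engine VM or in the copy the cleanup playbook fetches back to the host; the exact filename differs per run.)

grep -B 2 -A 10 'schema.sh' /var/log/ovirt-engine/setup/ovirt-engine-setup-*.log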

Adding sosreport from the host.

Comment 29 Nikolai Sednev 2020-03-25 09:33:35 UTC
Tested on:

ovirt-hosted-engine-ha-2.4.2-1.el8ev.noarch
ovirt-hosted-engine-setup-2.4.2-2.el8ev.noarch
Red Hat Enterprise Linux release 8.2 Beta (Ootpa)
Linux 4.18.0-190.el8.x86_64 #1 SMP Wed Mar 18 09:34:40 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux

Comment 30 Nikolai Sednev 2020-03-25 09:35:14 UTC
Created attachment 1673353 [details]
sosreport from puma18

Comment 32 Michal Skrivanek 2020-03-27 13:33:38 UTC
Fixed in DWH - ovirt-engine-dwh-4.4.0.1-1.el8ev

Comment 33 RHEL Program Management 2020-03-27 13:33:47 UTC
This bug report has Keywords: Regression or TestBlocker.
Since no regressions or test blockers are allowed between releases, it is also being identified as a blocker for this release. Please resolve ASAP.

Comment 35 Nikolai Sednev 2020-03-29 12:31:19 UTC
rhvm-appliance-4.4-20200326.0.el8ev.x86_64 doesn't contain ovirt-engine-dwh-4.4.0.1-1.el8ev.
Please supply updated appliance for verification.

Comment 36 Yuval Turgeman 2020-03-29 13:20:33 UTC
(In reply to Nikolai Sednev from comment #35)
> rhvm-appliance-4.4-20200326.0.el8ev.x86_64 doesn't contain
> ovirt-engine-dwh-4.4.0.1-1.el8ev.
> Please supply updated appliance for verification.

It does:

$ getmanifest.py rhvm-appliance-4.4-20200326.0.el8ev | grep ovirt-engine-dwh
ovirt-engine-dwh-4.4.0.1-1.el8ev
ovirt-engine-dwh-setup-4.4.0.1-1.el8ev

Comment 38 Yuval Turgeman 2020-03-29 14:09:09 UTC
This is a deployment issue; perhaps Anton or Dusan can assist.

Comment 39 Michal Skrivanek 2020-03-30 11:09:49 UTC
You need to enable the virt:8.2 module.

Comment 40 Nikolai Sednev 2020-03-30 11:35:37 UTC
(In reply to Michal Skrivanek from comment #39)
> You need to enable the virt:8.2 module.

It was already enabled anyway, and it didn't work:
yum module -y reset virt
yum module -y enable virt:8.2
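
(To double-check what dnf actually has enabled, as opposed to what was requested, something like this helps; output formatting varies slightly across dnf versions.)

dnf module list virt        # the enabled stream is flagged with [e]
dnf module list --enabled   # every enabled module stream on the host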

Comment 41 Michal Skrivanek 2020-03-30 12:19:45 UTC
Problem: package ovirt-hosted-engine-setup-2.4.4-1.el8ev.noarch requires qemu-img >= 15:3.1.0-20.module+el8+2888+cdc893a8, but none of the providers can be installed
  - conflicting requests
  - package qemu-img-15:4.2.0-16.module+el8.2.0+6092+4f2391c1.x86_64 is filtered out by modular filtering

This clearly points to the module not being enabled despite the package being visible. Please clarify what didn't work.

Comment 42 Michal Skrivanek 2020-03-30 17:37:39 UTC
[sorry, reset state by mistake]

Comment 43 Nikolai Sednev 2020-03-31 07:40:00 UTC
(In reply to Michal Skrivanek from comment #41)
> Problem: package ovirt-hosted-engine-setup-2.4.4-1.el8ev.noarch requires
> qemu-img >= 15:3.1.0-20.module+el8+2888+cdc893a8, but none of the providers
> can be installed
>   - conflicting requests
>   - package qemu-img-15:4.2.0-16.module+el8.2.0+6092+4f2391c1.x86_64 is
> filtered out by modular filtering
> 
> This clearly points to the module not being enabled despite the package
> being visible. Please clarify what didn't work.

Installation of the ovirt-hosted-engine-setup-2.4.4-1.el8ev.noarch package didn't work, as reported in comment #37 (2020-03-29 14:05:45 UTC). Something within the repositories was broken, and it was fixed yesterday.

Deployment of HE 4.4 on NFS succeeded with rhv-openvswitch-2.11-7.el8ev.noarch; the service was running fine on both the engine and the host.
No dpdk package was installed on the host at all, although it is visible from yum as dpdk.x86_64 19.11-4.el8 in rhel-8-appstream-rpms.

Tested on:
rhvm-appliance.x86_64 2:4.4-20200326.0.el8ev
ovirt-hosted-engine-setup-2.4.4-1.el8ev.noarch
ovirt-hosted-engine-ha-2.4.2-1.el8ev.noarch
Red Hat Enterprise Linux release 8.2 Beta (Ootpa)
Linux 4.18.0-193.el8.x86_64 #1 SMP Fri Mar 27 14:35:58 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux

Engine:
ovirt-engine-setup-base-4.4.0-0.26.master.el8ev.noarch
ovirt-engine-4.4.0-0.26.master.el8ev.noarch
openvswitch2.11-2.11.0-48.el8fdp.x86_64
Linux 4.18.0-192.el8.x86_64 #1 SMP Tue Mar 24 14:06:40 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux
Red Hat Enterprise Linux release 8.2 Beta (Ootpa)

Comment 44 Sandro Bonazzola 2020-05-20 20:01:54 UTC
This bugzilla is included in the oVirt 4.4.0 release, published on May 20th 2020.

Since the problem described in this bug report should be
resolved in the oVirt 4.4.0 release, it has been closed with a resolution of CURRENT RELEASE.

If the solution does not work for you, please open a new bug report.

