Created attachment 1534077 [details]
sosreport from alma03

Description of problem:
I've tried to remotely deploy HE using these components:
ovirt-ansible-hosted-engine-setup-1.0.11-0.1.master.20190206102238.el7.noarch
ovirt-ansible-repositories-1.1.4-2.el7ev.noarch
ovirt-ansible-engine-setup-1.1.7-1.el7ev.noarch
ansible-2.7.7-1.el7ae.noarch
ovirt-hosted-engine-ha-2.3.1-1.el7ev.noarch
ovirt-hosted-engine-setup-2.3.3-1.el7ev.noarch
Linux 3.10.0-957.5.1.el7.x86_64 #1 SMP Wed Dec 19 10:46:58 EST 2018 x86_64 x86_64 x86_64 GNU/Linux
Red Hat Enterprise Linux Server release 7.6 (Maipo)

Tried to deploy from the remote server puma19 using the same components and this yml (hosted_engine_deploy_remotehost.yml):

---
- name: Deploy oVirt hosted engine
  hosts: alma03.qa.lab.tlv.redhat.com
  vars:
    he_bridge_if: enp5s0f0
    he_fqdn: nsednev-he-1.qa.lab.tlv.redhat.com
    he_vm_mac_addr: 00:16:3e:7b:b8:53
    he_domain_type: nfs
    he_storage_domain_addr: yellow-vdsb.qa.lab.tlv.redhat.com
    he_storage_domain_path: /Compute_NFS/nsednev_he_1/
  roles:
    - role: ovirt.hosted_engine_setup
      he_appliance_password: somepassword
      he_admin_password: anotherpassword

puma19 ~]# ansible-playbook -i alma03.qa.lab.tlv.redhat.com, /root/hosted_engine_deploy_remotehost.yml

Deployment failed with:
fatal: [alma03.qa.lab.tlv.redhat.com -> nsednev-he-1.qa.lab.tlv.redhat.com]: FAILED! => {"changed": false, "elapsed": 195, "msg": "timed out waiting for ping module test success: Failed to connect to the host via ssh: ssh: connect to host nsednev-he-1.qa.lab.tlv.redhat.com port 22: Connection timed out"}

[root@alma03 ~]# ssh nsednev-he-1.qa.lab.tlv.redhat.com
The authenticity of host 'nsednev-he-1.qa.lab.tlv.redhat.com (192.168.122.167)' can't be established.
ECDSA key fingerprint is SHA256:mesSsrMXMmJEUJIK2BftyiS5OZcASQBPWnQOHcMmYuY.
ECDSA key fingerprint is MD5:da:e3:30:0a:6c:c6:b8:c7:35:70:7f:94:ab:26:5b:e5.
Are you sure you want to continue connecting (yes/no)?
yes
Warning: Permanently added 'nsednev-he-1.qa.lab.tlv.redhat.com,192.168.122.167' (ECDSA) to the list of known hosts.
root@nsednev-he-1.qa.lab.tlv.redhat.com's password:
[root@nsednev-he-1 ~]#

Now it's working. I'm just wondering if we need a longer timeout.

Version-Release number of selected component (if applicable):
ovirt-ansible-hosted-engine-setup-1.0.11-0.1.master.20190206102238.el7.noarch
ovirt-ansible-repositories-1.1.4-2.el7ev.noarch
ovirt-ansible-engine-setup-1.1.7-1.el7ev.noarch
ansible-2.7.7-1.el7ae.noarch
ovirt-hosted-engine-ha-2.3.1-1.el7ev.noarch
ovirt-hosted-engine-setup-2.3.3-1.el7ev.noarch

How reproducible:
100%

Steps to Reproduce:
1. Prepare a yml as follows:

cat /home/nsednev/Downloads/logs/hosted_engine_deploy_remotehost.yml
---
- name: Deploy oVirt hosted engine
  hosts: alma03.qa.lab.tlv.redhat.com
  vars:
    he_bridge_if: enp5s0f0
    he_fqdn: nsednev-he-1.qa.lab.tlv.redhat.com
    he_vm_mac_addr: 00:16:3e:7b:b8:53
    he_domain_type: nfs
    he_storage_domain_addr: yellow-vdsb.qa.lab.tlv.redhat.com
    he_storage_domain_path: /Compute_NFS/nsednev_he_1/
  roles:
    - role: ovirt.hosted_engine_setup
      he_appliance_password: somepassword
      he_admin_password: anotherpassword

2. Deploy HE from a remote server with the components described above to a host with the same components, using the yml from step 1.

Actual results:
Deployment fails.

Expected results:
Deployment should succeed.

Additional info:
Sosreport from alma03 (the host on which I was running the HE deployment from the remote server puma19).
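The failure above is the role's post-boot reachability check: it repeatedly probes the new engine VM over SSH and gives up after a fixed window ("elapsed": 195 seconds here), even though the VM became reachable by hand shortly afterwards. As a plain-Ansible illustration only (this is a sketch, not the role's actual task, and the timeout values are assumptions), a more generous window would look like:

```yaml
# Hypothetical standalone check mirroring what the role does internally:
# wait until the engine VM answers over SSH before continuing.
- name: Wait for the engine VM to answer over SSH
  hosts: nsednev-he-1.qa.lab.tlv.redhat.com
  gather_facts: false
  tasks:
    - name: Wait up to 10 minutes instead of ~3
      wait_for_connection:
        delay: 5       # seconds to wait before the first probe
        sleep: 10      # seconds between retries
        timeout: 600   # total window; the failing run gave up at ~195s
```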
First of all, you have a mistake in the playbook: the password variable lines that are in the 'roles:' section should be in the 'vars:' section.

Second, this is not the recommended way to execute the role. The recommended way is:

ansible-playbook -i host123.localdomain, hosted_engine_deploy.yml --extra-vars='@he_deployment.json' --extra-vars='@passwords.yml' --ask-vault-pass

More info: https://github.com/oVirt/ovirt-ansible-hosted-engine-setup

I think it would be better to verify the way we instruct the user.

Moving back to ON_QA.
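For the recommended invocation above, the passwords live in a separate vault-encrypted file rather than in the playbook itself. A minimal sketch of such a file, using the variable names from this report (the file name passwords.yml comes from the command line above; the values are placeholders):

```yaml
# passwords.yml -- keep credentials out of the playbook, then encrypt with:
#   ansible-vault encrypt passwords.yml
# The --ask-vault-pass flag on the ansible-playbook command line prompts
# for the vault password at run time.
he_appliance_password: somepassword
he_admin_password: anotherpassword
```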
(In reply to Ido Rosenzwig from comment #1)
> First of all, you have a mistake in the playbook - the password variables
> lines that are in the 'roles:' section should be in the 'vars:' section.
>
> Second, this is not the recommended way to execute the role.
> The recommended way is to do this:
>
> ansible-playbook -i host123.localdomain, hosted_engine_deploy.yml
> --extra-vars='@he_deployment.json' --extra-vars='@passwords.yml'
> --ask-vault-pass
> more info: https://github.com/oVirt/ovirt-ansible-hosted-engine-setup
>
> I think it will be better to verify the way we instruct the user.
>
> moving back to ON_QA

The deployment procedure was discussed with Simone prior to verification and approved. After I got the failure, we decided to cover it with this bug.

Moving back to ASSIGNED.
Moving to 4.3.2, since this is not identified as a blocker for 4.3.1.
puma18 ~]# ansible-playbook -i alma03.qa.lab.tlv.redhat.com, /root/hosted_engine_deploy.yml

Deployment was successful over NFS with the following contents of the yml:

---
- name: Deploy oVirt hosted engine
  hosts: alma03.qa.lab.tlv.redhat.com
  vars:
    he_bridge_if: enp5s0f0
    he_fqdn: nsednev-he-1.qa.lab.tlv.redhat.com
    he_vm_mac_addr: 00:16:3e:7b:b8:53
    he_domain_type: nfs
    he_storage_domain_addr: yellow-vdsb.qa.lab.tlv.redhat.com
    he_storage_domain_path: /Compute_NFS/nsednev_he_1/
    he_mem_size_MB: 4096
    he_ansible_host_name: alma03.qa.lab.tlv.redhat.com
    he_appliance_password: somepassword
    he_admin_password: someanotherpassword
  roles:
    - role: ovirt.hosted_engine_setup

Tested on:
ovirt-hosted-engine-setup-2.3.6-1.el7ev.noarch
ovirt-hosted-engine-ha-2.3.1-1.el7ev.noarch
rhvm-appliance-4.3-20190305.1.el7.x86_64
ansible-2.7.8-1.el7ae.noarch
ovirt-ansible-hosted-engine-setup-1.0.12-1.el7ev.noarch
Linux 3.10.0-957.10.1.el7.x86_64 #1 SMP Thu Feb 7 07:12:53 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux
Red Hat Enterprise Linux Server release 7.6 (Maipo)

Moving to VERIFIED.
This bugzilla is included in the oVirt 4.3.2 release, published on March 19th 2019.

Since the problem described in this bug report should be resolved in the oVirt 4.3.2 release, it has been closed with a resolution of CURRENT RELEASE.

If the solution does not work for you, please open a new bug report.