Bug 1807060 - hosted-engine-setup deployment fails with ovirt-engine-4.4-el8 appliance - nic `eth0` hardcoded.
Status: CLOSED ERRATA
Product: Red Hat Enterprise Virtualization Manager
Classification: Red Hat
Component: ovirt-ansible-roles
Version: 4.4.0
Hardware: Unspecified
OS: Unspecified
Priority: urgent
Severity: urgent
Target Milestone: ovirt-4.4.0
Target Release: 4.4.0
Assignee: Asaf Rachmani
QA Contact: Nikolai Sednev
 
Reported: 2020-02-25 14:43 UTC by Evgeny Slutsky
Modified: 2020-08-04 13:23 UTC
CC List: 5 users

Fixed In Version: ovirt-ansible-hosted-engine-setup-1.1.1
Doc Type: No Doc Update
Last Closed: 2020-08-04 13:23:46 UTC
oVirt Team: Integration

Links
Github oVirt ovirt-ansible-hosted-engine-setup pull 304 (closed): configure engine VM to start with consistent interface name: eth0 (last updated 2020-11-09 20:01:21 UTC)

Description Evgeny Slutsky 2020-02-25 14:43:12 UTC
versions:

ovirt-hosted-engine-setup-2.4.2-0.0.master.20200206064524.git75e3e01.el8.noarch
ovirt-ansible-hosted-engine-setup-1.0.36-0.1.master.20200224135826.el8.noarch
ovirt-engine-appliance-4.4-20200224174113.1.el8.x86_64



Steps to reproduce:
1. Deploy hosted engine on CentOS 8:

yum install https://resources.ovirt.org/pub/yum-repo/ovirt-release-master.rpm -y
yum install ovirt-engine-appliance -y
yum install ovirt-hosted-engine-setup -y
yum install firewalld -y

Run ovirt-hosted-engine-setup with NFS storage.
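For reference, the deployment can also be driven non-interactively from an answer file. The following is only a minimal sketch for an NFS-backed run; the OVEHOSTED_* key names and the NFS export path are illustrative and should be checked against an answers file from a previous run (under /var/lib/ovirt-hosted-engine-setup/answers/):

# illustrative answers file; nfs-server.example.com:/exports/he is a placeholder
cat > /root/he-answers.conf <<'EOF'
[environment:default]
OVEHOSTED_STORAGE/domainType=str:nfs
OVEHOSTED_STORAGE/storageDomainConnection=str:nfs-server.example.com:/exports/he
OVEHOSTED_NETWORK/fqdn=str:engine.es.localvms.com
EOF

hosted-engine --deploy --config-append=/root/he-answers.conf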



When deploying hosted-engine with the EL8-based engine appliance, the deployment fails with this error:

 INFO  ] TASK [ovirt.hosted_engine_setup : Check engine VM health]
[ ERROR ] fatal: [localhost]: FAILED! => {"attempts": 180, "changed": true, "cmd": ["hosted-engine", "--vm-status", "--json"], "delta": "0:00:00.280504", "end": "2020-02-25 12:15:44.182497", "rc": 0, "start": "2020-02-25 12:15:43.901993", "stderr": "", "stderr_lines": [], "stdout": "{\"1\": {\"host-id\": 1, \"host-ts\": 5263, \"score\": 3400, \"engine-status\": {\"vm\": \"up\", \"health\": \"bad\", \"detail\": \"Powering up\", \"reason\": \"bad vm status\"}, \"hostname\": \"hosted-engine\", \"maintenance\": false, \"stopped\": false, \"crc32\": \"ee76a6c1\", \"conf_on_shared_storage\": true, \"local_conf_timestamp\": 5263, \"extra\": \"metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=5263 (Tue Feb 25 12:15:43 2020)\\nhost-id=1\\nscore=3400\\nvm_conf_refresh_time=5263 (Tue Feb 25 12:15:43 2020)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=EngineStarting\\nstopped=False\\n\", \"live-data\": true}, \"global_maintenance\": false}", "stdout_lines": ["{\"1\": {\"host-id\": 1, \"host-ts\": 5263, \"score\": 3400, \"engine-status\": {\"vm\": \"up\", \"health\": \"bad\", \"detail\": \"Powering up\", \"reason\": \"bad vm status\"}, \"hostname\": \"hosted-engine\", \"maintenance\": false, \"stopped\": false, \"crc32\": \"ee76a6c1\", \"conf_on_shared_storage\": true, \"local_conf_timestamp\": 5263, \"extra\": \"metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=5263 (Tue Feb 25 12:15:43 2020)\\nhost-id=1\\nscore=3400\\nvm_conf_refresh_time=5263 (Tue Feb 25 12:15:43 2020)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=EngineStarting\\nstopped=False\\n\", \"live-data\": true}, \"global_maintenance\": false}"]}
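The failing task simply polls hosted-engine --vm-status --json until the engine reports good health; the same field can be inspected by hand (the "1" key is the host id shown in the output above):

# print only the engine-status block that the health check waits on
hosted-engine --vm-status --json \
  | python3 -c 'import json,sys; print(json.load(sys.stdin)["1"]["engine-status"])'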




In the log file:

2020-02-24 16:02:28,221+0000 INFO ansible task start {'status': 'OK', 'ansible_type': 'task', 'ansible_playbook': '/usr/share/ovirt-hosted-engine-setup/ansible/trigger_role.yml', 'ansible_task': "ovirt.hosted_engine_setup : Fail if Engine IP is different from engine's he_fqdn resolved IP"}
2020-02-24 16:02:28,222+0000 DEBUG ansible on_any args TASK: ovirt.hosted_engine_setup : Fail if Engine IP is different from engine's he_fqdn resolved IP kwargs is_conditional:False 
2020-02-24 16:02:28,222+0000 DEBUG ansible on_any args localhostTASK: ovirt.hosted_engine_setup : Fail if Engine IP is different from engine's he_fqdn resolved IP kwargs 
2020-02-24 16:02:28,678+0000 DEBUG var changed: host "localhost" var "ansible_play_hosts" type "<class 'list'>" value: "[]"
2020-02-24 16:02:28,679+0000 DEBUG var changed: host "localhost" var "ansible_play_batch" type "<class 'list'>" value: "[]"
2020-02-24 16:02:28,679+0000 DEBUG var changed: host "localhost" var "play_hosts" type "<class 'list'>" value: "[]"
2020-02-24 16:02:28,679+0000 ERROR ansible failed {
    "ansible_host": "localhost",
    "ansible_playbook": "/usr/share/ovirt-hosted-engine-setup/ansible/trigger_role.yml",
    "ansible_result": {
        "_ansible_no_log": false,
        "changed": false,
        "msg": "Engine VM IP address is  while the engine's he_fqdn engine.es.localvms.com resolves to 192.168.100.103. If you are using DHCP, check your DHCP reservation configuration"
    },
    "ansible_task": "Fail if Engine IP is different from engine's he_fqdn resolved IP",
    "ansible_type": "task",
    "status": "FAILED",
    "task_duration": 0
}
2020-02-24 16:02:28,679+0000 DEBUG ansible on_any args <ansible.executor.task_result.TaskResult object at 0x7f74dc1c3320> kwargs ignore_errors:None 
2020-02-24 16:02:28,682+0000 INFO ansible stats {
    "ansible_playbook": "/usr/share/ovirt-hosted-engine-setup/ansible/trigger_role.yml",
    "ansible_playbook_duration": "25:02 Minutes",
    "ansible_result": "type: <class 'dict'>\nstr: {'localhost': {'ok': 123, 'failures': 1, 'unreachable': 0, 'changed': 51, 'skipped': 14, 'rescued': 1, 'ignored': 1}}",
    "ansible_type": "finish",
    "status": "FAILED"
}
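The playbook then aborts on the task above: the engine VM IP it read back is empty, while he_fqdn resolves to 192.168.100.103. Both sides of that comparison can be checked by hand, for example:

# what he_fqdn resolves to (run on the host)
getent ahosts engine.es.localvms.com | awk 'NR==1 {print $1}'

# what is actually configured on the engine VM (run on its console)
ip -4 -o addr show scope global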

On the engine VM console, there is no `eth0` interface:

1: lo: <LOOPBACK,UP,LOWER_UP> mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000
    link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00
    inet 127.0.0.1/8 scope host lo
       valid_lft forever preferred_lft forever
    inet6 ::1/128 scope host 
       valid_lft forever preferred_lft forever
2: ens3: <BROADCAST,MULTICAST,UP,LOWER_UP> mtu 1500 qdisc mq state UP group default qlen 1000
    link/ether 52:54:00:45:ca:93 brd ff:ff:ff:ff:ff:ff
    inet 192.168.100.103/24 brd 192.168.100.255 scope global dynamic noprefixroute ens3
       valid_lft 2590sec preferred_lft 2590sec
    inet6 fe80::3821:a71a:7110:bca1/64 scope link noprefixroute 
       valid_lft forever preferred_lft forever

In the journal:

[root@engine ~]# journalctl | grep -i eth0
Feb 25 12:59:02 engine.es.localvms.com kernel: virtio_net virtio0 ens3: renamed from eth0
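This confirms the root cause: predictable interface naming renames the NIC to ens3, while the setup hardcodes eth0 (the linked PR 304 configures the engine VM to start with a consistent eth0 name). As a manual workaround, and not necessarily what the fix itself does, one way to get eth0 back is to disable predictable naming on the engine VM's kernel command line:

# workaround sketch only; run inside the engine VM, then reboot it
grubby --update-kernel=ALL --args="net.ifnames=0 biosdevname=0"
reboot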

Comment 4 Nikolai Sednev 2020-04-01 14:55:48 UTC
Worked for me on a fresh and clean environment; I successfully deployed HE 4.4 over NFS.

Tested on a host with these components:
rhvm-appliance.x86_64 2:4.4-20200326.0.el8ev
ovirt-hosted-engine-setup-2.4.4-1.el8ev.noarch
ovirt-hosted-engine-ha-2.4.2-1.el8ev.noarch
Red Hat Enterprise Linux release 8.2 Beta (Ootpa)
Linux 4.18.0-193.el8.x86_64 #1 SMP Fri Mar 27 14:35:58 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux

Engine:
ovirt-engine-setup-base-4.4.0-0.26.master.el8ev.noarch
ovirt-engine-4.4.0-0.26.master.el8ev.noarch
openvswitch2.11-2.11.0-48.el8fdp.x86_64
Linux 4.18.0-192.el8.x86_64 #1 SMP Tue Mar 24 14:06:40 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux
Red Hat Enterprise Linux release 8.2 Beta (Ootpa)
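A quick way to verify the fix on a deployed engine VM is to check that the primary NIC now comes up as eth0, for example:

# run inside the engine VM; should list eth0 with the expected address
ip -o addr show eth0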

Comment 10 errata-xmlrpc 2020-08-04 13:23:46 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory (RHV Engine and Host Common Packages 4.4), and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHEA-2020:3309

