Bug 1809569 - [duplicate dns entries] HE deployment is Failing on RHVH-4.4
Summary: [duplicate dns entries] HE deployment is Failing on RHVH-4.4
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: Red Hat Gluster Storage
Classification: Red Hat Storage
Component: rhhi
Version: rhhiv-1.8
Hardware: x86_64
OS: Linux
Priority: urgent
Severity: urgent
Target Milestone: ---
Target Release: RHHI-V 1.8
Assignee: Gobinda Das
QA Contact: milind
URL:
Whiteboard:
Depends On: 1809640
Blocks: RHHI-V-1.8-Engineering-Inflight-BZs
 
Reported: 2020-03-03 12:25 UTC by milind
Modified: 2020-08-04 14:51 UTC (History)
4 users

Fixed In Version:
Doc Type: No Doc Update
Doc Text:
Clone Of:
Environment:
Last Closed: 2020-08-04 14:51:33 UTC
Embargoed:


Attachments (Terms of Use)
HE log file (2.90 MB, text/plain)
2020-03-03 13:19 UTC, milind


Links
System ID Private Priority Status Summary Last Updated
Red Hat Product Errata RHEA-2020:3314 0 None None None 2020-08-04 14:51:55 UTC

Internal Links: 1809640

Comment 2 milind 2020-03-03 12:55:48 UTC
Description of problem:
HE Deployment is Failing in RHVH-4.4
"Host node.example.com installation failed. Failed to configure management network on the host."

------------------------------------------------
Version-Release number of selected component (if applicable):
rhv-openvswitch-ovn-host-2.11-6.el8ev.noarch
openvswitch-selinux-extra-policy-1.0-19.el8fdp.noarch
openvswitch2.11-2.11.0-48.el8fdp.x86_64
rhv-openvswitch-2.11-6.el8ev.noarch
rhv-openvswitch-ovn-common-2.11-6.el8ev.noarch
rhv-python-openvswitch-2.11-6.el8ev.noarch
python3-openvswitch2.11-2.11.0-48.el8fdp.x86_64
----------------------------------------------
How reproducible:
Always

-------------------------------------------
Steps to Reproduce:
1. From Cockpit, click Hyperconverged and deploy Gluster; this step succeeds.
2. Deploy HE; this step fails.
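Since the failure is attributed to duplicate DNS entries, a quick sanity check of the host's name-resolution data can help confirm the state before deployment. A minimal sketch (a hypothetical helper, not part of the deployment tooling) that flags hostnames listed more than once in an /etc/hosts-style file:

```python
# Hypothetical check: report hostnames that appear on more than one
# line of an /etc/hosts-style file. Duplicate records for the host
# can break management-network configuration during HE deployment.
from collections import Counter

def duplicate_hostnames(hosts_text):
    names = []
    for line in hosts_text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments and blanks
        if not line:
            continue
        names.extend(line.split()[1:])  # fields after the IP are names
    return sorted(n for n, c in Counter(names).items() if c > 1)

# Example: node.example.com is listed under two different addresses
sample = "10.0.0.1 node.example.com node\n10.0.0.2 node.example.com\n"
print(duplicate_hostnames(sample))  # ['node.example.com']
```

An empty result means no duplicates in that file; DNS-server records would still need a separate forward/reverse lookup check.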

---------------------
Actual results:
HE deployment fails while configuring the management network on the host.
-------------------------------------------
Expected results:
The deployment should complete successfully.
---------------------------------------------------------
Additional info:

2020-03-03 08:51:49,964 p=1021659 u=root n=ovirt-hosted-engine-setup-ansible | ansible ok {'status': 'OK', 'ansible_type': 'task', 'ansible_playbook': '/usr/share/ovirt-hosted-engine-setup/ansible/trigger_role.yml', 'ansible_host': 'localhost', 'ansible_task': 'Wait for the host to be up', 'task_duration': 595}
2020-03-03 08:51:49,964 p=1021659 u=root n=ovirt-hosted-engine-setup-ansible | ansible on_any args <ansible.executor.task_result.TaskResult object at 0x7fa9b17cd278> kwargs 
2020-03-03 08:51:50,540 p=1021659 u=root n=ovirt-hosted-engine-setup-ansible | ansible task start {'status': 'OK', 'ansible_type': 'task', 'ansible_playbook': '/usr/share/ovirt-hosted-engine-setup/ansible/trigger_role.yml', 'ansible_task': 'ovirt.hosted_engine_setup : debug'}
2020-03-03 08:51:50,541 p=1021659 u=root n=ovirt-hosted-engine-setup-ansible | ansible on_any args TASK: ovirt.hosted_engine_setup : debug kwargs is_conditional:False 
2020-03-03 08:51:50,541 p=1021659 u=root n=ovirt-hosted-engine-setup-ansible | ansible on_any args localhostTASK: ovirt.hosted_engine_setup : debug kwargs 
2020-03-03 08:51:51,136 p=1021659 u=root n=ovirt-hosted-engine-setup-ansible | ansible ok {'status': 'OK', 'ansible_type': 'task', 'ansible_playbook': '/usr/share/ovirt-hosted-engine-setup/ansible/trigger_role.yml', 'ansible_host': 'localhost', 'ansible_task': '', 'task_duration': 1}
2020-03-03 08:51:51,136 p=1021659 u=root n=ovirt-hosted-engine-setup-ansible | ansible on_any args <ansible.executor.task_result.TaskResult object at 0x7fa9b177fa20> kwargs 
2020-03-03 08:51:51,727 p=1021659 u=root n=ovirt-hosted-engine-setup-ansible | ansible task start {'status': 'OK', 'ansible_type': 'task', 'ansible_playbook': '/usr/share/ovirt-hosted-engine-setup/ansible/trigger_role.yml', 'ansible_task': 'ovirt.hosted_engine_setup : set_fact'}
2020-03-03 08:51:51,727 p=1021659 u=root n=ovirt-hosted-engine-setup-ansible | ansible on_any args TASK: ovirt.hosted_engine_setup : set_fact kwargs is_conditional:False 
2020-03-03 08:51:51,727 p=1021659 u=root n=ovirt-hosted-engine-setup-ansible | ansible on_any args localhostTASK: ovirt.hosted_engine_setup : set_fact kwargs 
2020-03-03 08:51:52,279 p=1021659 u=root n=ovirt-hosted-engine-setup-ansible | var changed: host "localhost" var "host_id" type "<class 'ansible.utils.unsafe_proxy.AnsibleUnsafeText'>" value: ""52f73d41-0dc2-4d7c-99c3-4faa4f061a21""
2020-03-03 08:51:52,280 p=1021659 u=root n=ovirt-hosted-engine-setup-ansible | ansible ok {'status': 'OK', 'ansible_type': 'task', 'ansible_playbook': '/usr/share/ovirt-hosted-engine-setup/ansible/trigger_role.yml', 'ansible_host': 'localhost', 'ansible_task': '', 'task_duration': 1}
2020-03-03 08:51:52,280 p=1021659 u=root n=ovirt-hosted-engine-setup-ansible | ansible on_any args <ansible.executor.task_result.TaskResult object at 0x7fa9b17a4b70> kwargs 
2020-03-03 08:51:52,837 p=1021659 u=root n=ovirt-hosted-engine-setup-ansible | ansible task start {'status': 'OK', 'ansible_type': 'task', 'ansible_playbook': '/usr/share/ovirt-hosted-engine-setup/ansible/trigger_role.yml', 'ansible_task': 'ovirt.hosted_engine_setup : Collect error events from the Engine'}
2020-03-03 08:51:52,837 p=1021659 u=root n=ovirt-hosted-engine-setup-ansible | ansible on_any args TASK: ovirt.hosted_engine_setup : Collect error events from the Engine kwargs is_conditional:False 
2020-03-03 08:51:52,838 p=1021659 u=root n=ovirt-hosted-engine-setup-ansible | ansible on_any args localhostTASK: ovirt.hosted_engine_setup : Collect error events from the Engine kwargs 
2020-03-03 08:51:53,914 p=1021659 u=root n=ovirt-hosted-engine-setup-ansible | var changed: host "localhost" var "ovirt_events" type "<class 'list'>" value: "[
    {
        "cluster": {
            "href": "/ovirt-engine/api/clusters/5958d582-5d2a-11ea-9c23-004554194801",
            "id": "5958d582-5d2a-11ea-9c23-004554194801",
            "name": "Default"
        },
        "code": 505,
        "correlation_id": "2f69a817",
        "custom_id": -1,
        "description": "Host node.example.com installation failed. Failed to configure management network on the host.",
        "flood_rate": 0,
        "host": {
            "href": "/ovirt-engine/api/hosts/52f73d41-0dc2-4d7c-99c3-4faa4f061a21",
            "id": "52f73d41-0dc2-4d7c-99c3-4faa4f061a21",
            "name": "node.example.com"
        },
        "href": "/ovirt-engine/api/events/114",
        "id": "114",
        "index": 114,
        "origin": "oVirt",
        "severity": "error",
        "time": "2020-03-03 08:51:49.309000+00:00",
        "user": {
            "href": "/ovirt-engine/api/users/88fdc342-5d2a-11ea-b6c8-004554194801",
            "id": "88fdc342-5d2a-11ea-b6c8-004554194801",
            "name": "admin@internal-authz"
        }
    },
    {
        "code": 9000,
        "custom_id": -1,
        "description": "Failed to verify Power Management configuration for Host node.example.com.",
        "flood_rate": 0,
        "host": {
            "href": "/ovirt-engine/api/hosts/52f73d41-0dc2-4d7c-99c3-4faa4f061a21",
            "id": "52f73d41-0dc2-4d7c-99c3-4faa4f061a21",
            "name": "node.example.com"
        },
        "href": "/ovirt-engine/api/events/6",
        "id": "6",
        "index": 6,
        "origin": "oVirt",
        "severity": "alert",
        "time": "2020-03-03 08:41:48.684000+00:00"
    }
]"
2020-03-03 08:51:53,914 p=1021659 u=root n=ovirt-hosted-engine-setup-ansible | var changed: host "localhost" var "error_events" type "<class 'dict'>" value: "{
    "ansible_facts": {
        "ovirt_events": [
            {
                "cluster": {
                    "href": "/ovirt-engine/api/clusters/5958d582-5d2a-11ea-9c23-004554194801",
                    "id": "5958d582-5d2a-11ea-9c23-004554194801",
                    "name": "Default"
                },
                "code": 505,
                "correlation_id": "2f69a817",
                "custom_id": -1,
                "description": "Host node.example.com installation failed. Failed to configure management network on the host.",
                "flood_rate": 0,
                "host": {
                    "href": "/ovirt-engine/api/hosts/52f73d41-0dc2-4d7c-99c3-4faa4f061a21",
                    "id": "52f73d41-0dc2-4d7c-99c3-4faa4f061a21",
                    "name": "node.example.com"
                },
                "href": "/ovirt-engine/api/events/114",
                "id": "114",
                "index": 114,
                "origin": "oVirt",
                "severity": "error",
                "time": "2020-03-03 08:51:49.309000+00:00",
                "user": {
                    "href": "/ovirt-engine/api/users/88fdc342-5d2a-11ea-b6c8-004554194801",
                    "id": "88fdc342-5d2a-11ea-b6c8-004554194801",
                    "name": "admin@internal-authz"
                }
            },
            {
                "code": 9000,
                "custom_id": -1,
                "description": "Failed to verify Power Management configuration for Host node.example.com.",
                "flood_rate": 0,
                "host": {
                    "href": "/ovirt-engine/api/hosts/52f73d41-0dc2-4d7c-99c3-4faa4f061a21",
                    "id": "52f73d41-0dc2-4d7c-99c3-4faa4f061a21",
                    "name": "node.example.com"
                },
                "href": "/ovirt-engine/api/events/6",
                "id": "6",
                "index": 6,
                "origin": "oVirt",
                "severity": "alert",
                "time": "2020-03-03 08:41:48.684000+00:00"
            }
        ]
    },
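The error events collected above can also be filtered programmatically. A short sketch (the event structure is assumed from the JSON logged above, not taken from the oVirt SDK) that extracts the descriptions of error-severity events:

```python
# Sketch: pick out error-severity descriptions from an ovirt_events
# list shaped like the one in the ansible log above (structure assumed
# from the logged JSON).
import json

logged = '''[
  {"code": 505, "severity": "error",
   "description": "Host node.example.com installation failed. Failed to configure management network on the host."},
  {"code": 9000, "severity": "alert",
   "description": "Failed to verify Power Management configuration for Host node.example.com."}
]'''

events = json.loads(logged)
errors = [e["description"] for e in events if e["severity"] == "error"]
print(errors[0])  # the code-505 management-network failure
```

Filtering on severity this way separates the actual failure (code 505) from the routine power-management alert (code 9000).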

Comment 3 milind 2020-03-03 13:19:52 UTC
Created attachment 1667196 [details]
HE log file

Comment 8 milind 2020-03-17 09:05:41 UTC
Hi Michael,
I will update once we have RHVH 4.4
Thanks

Comment 9 Michael Burman 2020-04-26 06:51:17 UTC
(In reply to milind from comment #8)
> Hi Michael,
> I will update once we have RHVH 4.4
> Thanks

Hi,
RHVH 4.4 should already be available for testing. Thanks.

Comment 10 SATHEESARAN 2020-05-05 02:27:03 UTC
RHHI-V 1.8 deployment with 3 nodes works well with the workaround from Bug 1823423.
The particular issue reported in this bug is not seen.

The builds used for the verification are:
RHVH-4.4-20200417.0-RHVH-x86_64-dvd1.iso 
rhvm-appliance-4.4-20200417.0.el8ev.x86_64.rpm 

@Milind, could you also verify this bug with a single-node RHHI-V 1.8 deployment?

Comment 11 milind 2020-05-05 11:35:47 UTC
As the HE deployment completed successfully, marking this bug as VERIFIED.

Comment 14 errata-xmlrpc 2020-08-04 14:51:33 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory (RHHI for Virtualization 1.8 bug fix and enhancement update), and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHEA-2020:3314

