Bug 1709752

Summary: Ansible creates duplicate host entries when 2 interfaces are used
Product: Red Hat Satellite
Reporter: S.Schwiedel <stefan.schwiedel>
Component: Ansible
Assignee: satellite6-bugs <satellite6-bugs>
Status: CLOSED CURRENTRELEASE
QA Contact: Lukas Pramuk <lpramuk>
Severity: medium
Priority: unspecified
Version: 6.4.2
CC: ahumbe, dchaudha, jalviso, mmccune, oprazak, pcreech, satellite6-bugs, sthirugn, wclark
Target Milestone: 6.7.0
Keywords: Triaged
Target Release: Unused
Hardware: All
OS: Linux
Clones: 1769878 1769890 (view as bug list)
Last Closed: 2019-11-11 14:37:13 UTC
Type: Bug
Attachments:
- The patch has been created from the upstream patch and has been successfully tested on Satellite 6.5.2 (flags: none)
- hotfix RPM for Satellite 6.5.2 (flags: none)

Description S.Schwiedel 2019-05-14 09:09:31 UTC
Description of problem:
If a host has 2 network interfaces and the secondary interface is used for remote execution, an Ansible run will create a second host entry in Satellite with the hostname of the second interface, but with the IP and MAC of the primary interface.

Overwriting /etc/rhsm/facts/katello.facts or /etc/hosts doesn't work.
Adding an entry "gather_subset = min" to /etc/foreman-proxy/ansible.cfg doesn't work either.

Version-Release number of selected component (if applicable):
ansible-2.6.14-1.el7ae.noarch
tfm-rubygem-foreman_ansible-2.2.9-8.el7sat.noarch
tfm-rubygem-hammer_cli_foreman_ansible-0.1.1-1.el7sat.noarch
rubygem-smart_proxy_ansible-2.0.2-3.el7sat.noarch
ansiblerole-insights-client-1.5-1.el7sat.noarch
tfm-rubygem-foreman_ansible_core-2.1.1-1.el7sat.noarch

How reproducible:
Delete the second host entry, run Ansible, and check the host entries; the duplicate reappears.

Steps to Reproduce:
1. Create a host with two interfaces.
2. Configure only the second interface for remote execution (a hammer sketch of this setup follows below).
3. Run Ansible.
4. Check the host entries; both names appear.
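
A hypothetical hammer sketch of the setup in steps 1-2; the host name and interface IDs are placeholders, and the --execution option is assumed from hammer's host interface subcommands:

# Placeholders throughout: testhost.example.com and the interface IDs.
# --execution (mark an interface for remote execution) is assumed from
# hammer_cli_foreman's "host interface update" options.
hammer host interface list --host testhost.example.com
hammer host interface update --host testhost.example.com --id 1 --execution false
hammer host interface update --host testhost.example.com --id 2 --execution true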

Actual results:
Two hosts are registered after the Ansible run when the host has two interfaces and the second interface is configured for remote execution.

Expected results:
Only one host, with the hostname of the primary interface, is registered to Satellite.

Additional info:
Ansible version 2.6.14

Possibly caused by the following component:
/opt/theforeman/tfm/root/usr/share/gems/gems/foreman_ansible-2.2.9/app/services/foreman_ansible/fact_importer.rb
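
For orientation only, a simplified, hypothetical Ruby sketch of the failure mode; this is not the actual foreman_ansible code, merely an illustration of how matching hosts purely on the reported ansible_fqdn would create a second record when facts arrive via the secondary interface:

# Hypothetical illustration -- not the real FactImporter.
HOSTS = ['host-primary.example.com']       # stand-in for existing host records

def find_or_create_host(facts)
  fqdn = facts['ansible_fqdn']             # FQDN of the interface Ansible used
  HOSTS << fqdn unless HOSTS.include?(fqdn)  # no match -> second host entry
  fqdn
end

# Facts gathered over the secondary interface report its FQDN:
find_or_create_host('ansible_fqdn' => 'host-secondary.example.com')
puts HOSTS.inspect                         # two entries for one machine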

Comment 3 S.Schwiedel 2019-05-14 09:31:56 UTC
I found a workaround:

Change the job template and add a filter that gathers facts only for the interface ens192. This should be changed to a regex, but at least it prevents duplicate entries.
Anyway, a filter should not be required by default.

---
- hosts: all
  tasks:
    - name: Display all parameters known for the Foreman host
      debug:
        var: foreman_params
    - name: 'Collect only facts returned by facter'
      setup:
        filter:
          - 'ansible_ens192'
  roles:
<%- if @host.all_ansible_roles.present? -%>
<%=   @host.all_ansible_roles.map { |role| "    - #{role.name.strip}" }.join("\n") %>
<%- end -%>
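
A note on the filter syntax: on the Ansible release used here (2.6.x), the setup module's filter parameter takes a single shell-glob string (list values were only accepted in later Ansible releases), which may be why the list form above has no effect. A working form would presumably be:

    - name: 'Collect only facts for ens192'
      setup:
        filter: 'ansible_ens192'   # single fnmatch pattern, not a YAML list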

Comment 4 S.Schwiedel 2019-05-14 10:51:22 UTC
The filter doesn't work. It seems to be a mistake.

Comment 5 S.Schwiedel 2019-05-14 11:38:23 UTC
The issue is that the main job template "Ansible Roles - Ansible Default - Ansible" causes this.

I made a 100% identical copy of it, and the copy did not create a duplicate entry. No filter required.

The fact ansible_fqdn is correct with both templates, but the result is different.

"Ansible Roles - Ansible Default - Ansible": creates a second host entry.

"Copy of Ansible Role Job Template": doesn't create a second host entry.

Comment 6 S.Schwiedel 2019-05-14 11:56:15 UTC
However, when the default template is not used, Ansible doesn't report back to Satellite: "Last report" is not updated.

Comment 7 Ondřej Pražák 2019-06-12 13:29:23 UTC
Connecting redmine issue https://projects.theforeman.org/issues/25803 from this bug

Comment 8 Bryan Kearney 2019-06-12 14:05:41 UTC
Moving this bug to POST for triage into Satellite 6 since the upstream issue https://projects.theforeman.org/issues/25803 has been resolved.

Comment 10 S.Schwiedel 2019-10-17 05:42:39 UTC
Created attachment 1626665 [details]
The patch has been created from the upstream patch and has been successfully tested on Satellite 6.5.2

I've created a patch for Satellite 6.5.2 and version tfm-rubygem-foreman_ansible-2.2.14-3.el7sat.noarch.

The following upstream revision was the source:
https://projects.theforeman.org/projects/ansible/repository/revisions/2f99d6d71d9cf32eb93d30b60da8ba841099853a
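
For reference, applying such a patch by hand might look roughly like this; the patch file name is a placeholder, the gem path is assumed from the pattern quoted in the description (adjusted to the 2.2.14 gem), and the -p level depends on how the patch was generated:

# Hypothetical manual application on a Satellite 6.5.2 server;
# /root/bz1709752.patch is a placeholder for the attached patch file.
cd /opt/theforeman/tfm/root/usr/share/gems/gems/foreman_ansible-2.2.14/
patch -p1 < /root/bz1709752.patch
foreman-maintain service restart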

Comment 12 wclark 2019-10-25 18:16:44 UTC
Created attachment 1629317 [details]
hotfix RPM for Satellite 6.5.2

Comment 13 wclark 2019-10-25 18:21:13 UTC
Created Hotfix RPM for Satellite 6.5.2.

To install it:

1. Download the hotfix RPM attached to this case and copy it to the Satellite server

2. # yum localinstall /path/to/hotfix/rpm

3. # foreman-maintain service restart
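
A standard follow-up check (a generic rpm query, not specific to this hotfix) to confirm the installed version:

# rpm -q tfm-rubygem-foreman_ansible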

Comment 14 Mike McCune 2019-11-11 14:37:13 UTC
The resolution to this bug was included in tfm-rubygem-foreman_ansible-3.0.7.1-1.el7sat.noarch as delivered in 6.6.0.

If the scenario described in this bug recurs on 6.6.0 or later, please re-open with details.