Bug 2127901 - [ansible-freeipa] ipareplica: Add undeploy cleanup
Summary: [ansible-freeipa] ipareplica: Add undeploy cleanup
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: Red Hat Enterprise Linux 8
Classification: Red Hat
Component: ansible-freeipa
Version: 8.4
Hardware: Unspecified
OS: Unspecified
Priority: unspecified
Severity: unspecified
Target Milestone: rc
Target Release: ---
Assignee: Thomas Woerner
QA Contact: Varun Mylaraiah
Docs Contact: Filip Hanzelka
URL:
Whiteboard:
Depends On:
Blocks: 2127903
 
Reported: 2022-09-19 10:33 UTC by Thomas Woerner
Modified: 2023-11-14 16:14 UTC
CC List: 2 users

Fixed In Version: ansible-freeipa-1.10.0-1.el8
Doc Type: Enhancement
Doc Text:
.The `ipaserver_remove_on_server` and `ipaserver_ignore_topology_disconnect` options are now available in the `ipaserver` role

If removing a replica from an Identity Management (IdM) topology by using the `remove_server_from_domain` option of the `ipaserver` `ansible-freeipa` role leads to a disconnected topology, you must now specify which part of the domain you want to preserve. Specifically, you must do the following:

* Specify the `ipaserver_remove_on_server` value to identify which part of the topology you want to preserve.
* Set `ipaserver_ignore_topology_disconnect` to True.

Note that if removing a replica from IdM by using the `remove_server_from_domain` option preserves a connected topology, neither of these options is required. A minimal playbook sketch illustrating these options follows the field list below.
Clone Of:
: 2127903 (view as bug list)
Environment:
Last Closed: 2023-11-14 15:26:23 UTC
Type: Bug
Target Upstream Version:
Embargoed:
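
The Doc Text above names the two new `ipaserver` role options. The following is a minimal playbook sketch for a removal that would disconnect the topology; only `ipaserver_remove_on_server` and `ipaserver_ignore_topology_disconnect` come from the Doc Text, while the hostnames, the `ipaserver` hosts group, the password placeholder, and the `ipaserver_remove_from_domain` variable name are illustrative assumptions.

---
# Hypothetical sketch: uninstall an IPA server and remove it from the IdM domain
# even though the removal disconnects part of the topology.
- name: Uninstall IPA server and remove it from the domain
  hosts: ipaserver
  become: true
  vars:
    ipaadmin_password: SomeADMINpassword                  # assumed placeholder
    ipaserver_remove_from_domain: true                    # assumed name of the removal switch
    ipaserver_remove_on_server: replica4.ipadomain.test   # part of the topology to preserve
    ipaserver_ignore_topology_disconnect: true            # required when the removal disconnects the topology
  roles:
    - role: ipaserver
      state: absent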


Attachments


Links
System ID Private Priority Status Summary Last Updated
Red Hat Issue Tracker FREEIPA-6764 0 None None None 2022-09-19 10:37:26 UTC
Red Hat Issue Tracker FREEIPA-8769 0 None None None 2022-09-19 10:52:16 UTC
Red Hat Issue Tracker RHELPLAN-134295 0 None None None 2022-09-19 10:52:30 UTC
Red Hat Product Errata RHBA-2023:6926 0 None None None 2023-11-14 15:26:56 UTC

Description Thomas Woerner 2022-09-19 10:33:32 UTC
When a replica is undeployed, the information about the replica is still present in the topology. A connection to another replica or to the server should be made to completely remove the replica from the topology.
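
For illustration, a minimal undeploy playbook of the kind exercised in the verification in Comment 6 below. This is a sketch assuming `ipareplica_*` counterparts of the options named in the Doc Text; the hosts group, the password placeholder, and the exact variable spellings are assumptions derived from the `_remove_from_domain`, `_remove_on_server`, and `_ignore_topology_disconnect` facts in the log, not confirmed names.

---
# Hypothetical sketch of the undeploy cleanup: uninstall the replica and also
# delete its server entry from the topology via another, surviving server.
- name: Playbook to unconfigure IPA replica
  hosts: ipareplicas
  become: true
  vars:
    ipaadmin_password: SomeADMINpassword                    # assumed placeholder
    ipareplica_remove_from_domain: true                     # also remove the replica's entry from the domain
    ipareplica_remove_on_server: replica4.ipadomain.test    # run the removal against this surviving server
    ipareplica_ignore_topology_disconnect: true             # proceed even if the topology becomes disconnected
  roles:
    - role: ipareplica
      state: absent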

Comment 1 Thomas Woerner 2023-04-05 15:51:37 UTC
Upstream PR: https://github.com/freeipa/ansible-freeipa/pull/1068

Comment 6 Varun Mylaraiah 2023-04-20 02:22:08 UTC
Verified

ansible-2.9.27-1.el8ae.noarch
ansible-freeipa-1.10.0-1.el8.noarch


 PASSED ansible_freeipa_tests/replica/test_idm_deploy_replica.py::TestReplicaUndeployment::test_replica_remove_without_topology_disconnected
 PASSED ansible_freeipa_tests/replica/test_idm_deploy_replica.py::TestReplicaUndeployment::test_replica_remove_with_topology_disconnected
 PASSED ansible_freeipa_tests/replica/test_idm_deploy_replica.py::TestReplicaUndeployment::test_replica_uninstall

ansible_freeipa_tests/replica/test_idm_deploy_replica.py::TestReplicaUndeployment::test_replica_remove_with_topology_disconnected
------------------------------ Captured log call -------------------------------
INFO     pytest_multihost.host.Host.replica4.OpenSSHTransport:transport.py:397 RUN ['kinit', 'admin']
DEBUG    pytest_multihost.host.Host.replica4.cmd9:transport.py:519 RUN ['kinit', 'admin']
DEBUG    pytest_multihost.host.Host.replica4.cmd9:transport.py:563 bash: line 1: cd: /root/multihost_tests: No such file or directory
DEBUG    pytest_multihost.host.Host.replica4.cmd9:transport.py:563 bash: line 2: /root/multihost_tests/env.sh: No such file or directory
DEBUG    pytest_multihost.host.Host.replica4.cmd9:transport.py:563 Password for admin: 
DEBUG    pytest_multihost.host.Host.replica4.cmd9:transport.py:217 Exit code: 0
INFO     pytest_multihost.host.Host.replica4.OpenSSHTransport:transport.py:397 RUN ['ipa', 'server-show', 'replica3.ipadomain.test']
DEBUG    pytest_multihost.host.Host.replica4.cmd10:transport.py:519 RUN ['ipa', 'server-show', 'replica3.ipadomain.test']
DEBUG    pytest_multihost.host.Host.replica4.cmd10:transport.py:563 bash: line 1: cd: /root/multihost_tests: No such file or directory
DEBUG    pytest_multihost.host.Host.replica4.cmd10:transport.py:563 bash: line 2: /root/multihost_tests/env.sh: No such file or directory
DEBUG    pytest_multihost.host.Host.replica4.cmd10:transport.py:563   Server name: replica3.ipadomain.test
DEBUG    pytest_multihost.host.Host.replica4.cmd10:transport.py:563   Managed suffixes: domain
DEBUG    pytest_multihost.host.Host.replica4.cmd10:transport.py:563   Min domain level: 1
DEBUG    pytest_multihost.host.Host.replica4.cmd10:transport.py:563   Max domain level: 1
DEBUG    pytest_multihost.host.Host.replica4.cmd10:transport.py:563   Enabled server roles: IPA master
DEBUG    pytest_multihost.host.Host.replica4.cmd10:transport.py:217 Exit code: 0
INFO     pytest_multihost.host.Host.replica4.OpenSSHTransport:transport.py:397 RUN ['kdestroy', '-A']
DEBUG    pytest_multihost.host.Host.replica4.cmd11:transport.py:519 RUN ['kdestroy', '-A']
DEBUG    pytest_multihost.host.Host.replica4.cmd11:transport.py:563 bash: line 1: cd: /root/multihost_tests: No such file or directory
DEBUG    pytest_multihost.host.Host.replica4.cmd11:transport.py:563 bash: line 2: /root/multihost_tests/env.sh: No such file or directory
DEBUG    pytest_multihost.host.Host.replica4.cmd11:transport.py:217 Exit code: 0
INFO     pytest_multihost.host.Host.ansible.OpenSSHTransport:transport.py:433 PUT inventory/replicas.hosts
DEBUG    pytest_multihost.host.Host.ansible.cmd30:transport.py:519 RUN ['tee', 'inventory/replicas.hosts']
DEBUG    pytest_multihost.host.Host.ansible.cmd30:transport.py:217 Exit code: 0
INFO     pytest_multihost.host.Host.ansible.OpenSSHTransport:transport.py:433 PUT install-replicas.yaml
DEBUG    pytest_multihost.host.Host.ansible.cmd31:transport.py:519 RUN ['tee', 'install-replicas.yaml']
DEBUG    pytest_multihost.host.Host.ansible.cmd31:transport.py:217 Exit code: 0
INFO     pytest_multihost.host.Host.ansible.OpenSSHTransport:transport.py:397 RUN ['ansible-playbook', '--ssh-extra-args="-o StrictHostKeyChecking=no"', '-vv', '-i', 'inventory/replicas.hosts', 'install-replicas.yaml']
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:519 RUN ['ansible-playbook', '--ssh-extra-args="-o StrictHostKeyChecking=no"', '-vv', '-i', 'inventory/replicas.hosts', 'install-replicas.yaml']
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 bash: line 1: cd: /root/multihost_tests: No such file or directory
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 bash: line 2: /root/multihost_tests/env.sh: No such file or directory
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 ansible-playbook 2.9.27
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563   config file = /etc/ansible/ansible.cfg
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563   configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563   ansible python module location = /usr/lib/python3.6/site-packages/ansible
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563   executable location = /usr/bin/ansible-playbook
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563   python version = 3.6.8 (default, Jan 23 2023, 22:31:05) [GCC 8.5.0 20210514 (Red Hat 8.5.0-18)]
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 Using /etc/ansible/ansible.cfg as config file
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 Skipping callback 'actionable', as we already have a stdout callback.
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 Skipping callback 'counter_enabled', as we already have a stdout callback.
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 Skipping callback 'debug', as we already have a stdout callback.
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 Skipping callback 'dense', as we already have a stdout callback.
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 Skipping callback 'dense', as we already have a stdout callback.
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 Skipping callback 'full_skip', as we already have a stdout callback.
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 Skipping callback 'json', as we already have a stdout callback.
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 Skipping callback 'minimal', as we already have a stdout callback.
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 Skipping callback 'null', as we already have a stdout callback.
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 Skipping callback 'oneline', as we already have a stdout callback.
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 Skipping callback 'selective', as we already have a stdout callback.
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 Skipping callback 'skippy', as we already have a stdout callback.
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 Skipping callback 'stderr', as we already have a stdout callback.
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 Skipping callback 'unixy', as we already have a stdout callback.
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 Skipping callback 'yaml', as we already have a stdout callback.
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 PLAYBOOK: install-replicas.yaml ************************************************
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 1 plays in install-replicas.yaml
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 PLAY [Playbook to unconfigure IPA replica] *************************************
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 TASK [Gathering Facts] *********************************************************
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 task path: /root/install-replicas.yaml:2
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 ok: [replica3.ipadomain.test]
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 META: ran handlers
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 TASK [ipareplica : Import variables specific to distribution] ******************
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 task path: /usr/share/ansible/roles/ipareplica/tasks/main.yml:4
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 ok: [replica3.ipadomain.test] => (item=/usr/share/ansible/roles/ipareplica/vars/RedHat-8.yml) => {"ansible_facts": {"ipareplica_packages": ["@idm:DL1/server"], "ipareplica_packages_adtrust": ["@idm:DL1/adtrust"], "ipareplica_packages_dns": ["@idm:DL1/dns"], "ipareplica_packages_firewalld": ["firewalld"]}, "ansible_included_var_files": ["/usr/share/ansible/roles/ipareplica/vars/RedHat-8.yml"], "ansible_loop_var": "item", "changed": false, "item": "/usr/share/ansible/roles/ipareplica/vars/RedHat-8.yml"}
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 TASK [ipareplica : Install IPA replica] ****************************************
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 task path: /usr/share/ansible/roles/ipareplica/tasks/main.yml:19
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 skipping: [replica3.ipadomain.test] => {"changed": false, "skip_reason": "Conditional result was False"}
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 TASK [ipareplica : Uninstall IPA replica] **************************************
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 task path: /usr/share/ansible/roles/ipareplica/tasks/main.yml:23
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 included: /usr/share/ansible/roles/ipareplica/tasks/uninstall.yml for replica3.ipadomain.test
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 TASK [ipareplica : Set parameters] *********************************************
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 task path: /usr/share/ansible/roles/ipareplica/tasks/uninstall.yml:4
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 ok: [replica3.ipadomain.test] => {"ansible_facts": {"_ignore_last_of_role": false, "_ignore_topology_disconnect": true, "_remove_from_domain": true, "_remove_on_server": "replica4.ipadomain.test"}, "changed": false}
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 TASK [Uninstall - Uninstall replica] *******************************************
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 task path: /usr/share/ansible/roles/ipareplica/tasks/uninstall.yml:11
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 TASK [ipaserver : Import variables specific to distribution] *******************
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 task path: /usr/share/ansible/roles/ipaserver/tasks/main.yml:4
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 ok: [replica3.ipadomain.test] => (item=/usr/share/ansible/roles/ipaserver/vars/RedHat-8.yml) => {"ansible_facts": {"ipaserver_packages": ["@idm:DL1/server"], "ipaserver_packages_adtrust": ["@idm:DL1/adtrust"], "ipaserver_packages_dns": ["@idm:DL1/dns"], "ipaserver_packages_firewalld": ["firewalld"]}, "ansible_included_var_files": ["/usr/share/ansible/roles/ipaserver/vars/RedHat-8.yml"], "ansible_loop_var": "item", "changed": false, "item": "/usr/share/ansible/roles/ipaserver/vars/RedHat-8.yml"}
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 TASK [ipaserver : Install IPA server] ******************************************
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 task path: /usr/share/ansible/roles/ipaserver/tasks/main.yml:19
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 skipping: [replica3.ipadomain.test] => {"changed": false, "skip_reason": "Conditional result was False"}
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 TASK [ipaserver : Uninstall IPA server] ****************************************
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 task path: /usr/share/ansible/roles/ipaserver/tasks/main.yml:23
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 included: /usr/share/ansible/roles/ipaserver/tasks/uninstall.yml for replica3.ipadomain.test
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 TASK [ipaserver : Uninstall - Set server hostname for removal] *****************
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 task path: /usr/share/ansible/roles/ipaserver/tasks/uninstall.yml:4
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 ok: [replica3.ipadomain.test] => {"ansible_facts": {"_remove_hostname": "replica3.ipadomain.test"}, "changed": false}
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 TASK [ipaserver : Uninstall - Fail on missing ipaadmin_password for server removal] ***
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 task path: /usr/share/ansible/roles/ipaserver/tasks/uninstall.yml:12
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 skipping: [replica3.ipadomain.test] => {"changed": false, "skip_reason": "Conditional result was False"}
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 TASK [Uninstall - Fail on missing ipaserver_remove_on_server with ipaserver_ignore_topology_disconnect] ***
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 task path: /usr/share/ansible/roles/ipaserver/tasks/uninstall.yml:17
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 skipping: [replica3.ipadomain.test] => {"changed": false, "skip_reason": "Conditional result was False"}
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 TASK [ipaserver : Uninstall - Get connected server] ****************************
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 task path: /usr/share/ansible/roles/ipaserver/tasks/uninstall.yml:23
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 skipping: [replica3.ipadomain.test] => {"changed": false, "skip_reason": "Conditional result was False"}
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 TASK [ipaserver : Uninstall - Server del "replica3.ipadomain.test"] ************
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 task path: /usr/share/ansible/roles/ipaserver/tasks/uninstall.yml:32
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 changed: [replica3.ipadomain.test -> replica4.ipadomain.test] => {"changed": true}
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 TASK [ipaserver : Uninstall - Uninstall IPA server] ****************************
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 task path: /usr/share/ansible/roles/ipaserver/tasks/uninstall.yml:45
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 changed: [replica3.ipadomain.test] => {"changed": true, "cmd": ["/usr/sbin/ipa-server-install", "--uninstall", "-U", "--ignore-topology-disconnect"], "delta": "0:00:35.734056", "end": "2023-04-19 07:41:47.841374", "failed_when_result": false, "rc": 0, "start": "2023-04-19 07:41:12.107318", "stderr": "Lookup failed: Preferred host replica3.ipadomain.test does not provide CA.\nForcing removal of replica3.ipadomain.test\nIgnoring topology connectivity errors.\nFailed to cleanup replica3.ipadomain.test DNS entries: no matching entry found\nYou may need to manually remove them from the tree\nRemoving seLinux file context /dev/shm/slapd-IPADOMAIN-TEST with label dirsrv_tmpfs_t.\nRemoving seLinux file context /etc/dirsrv/slapd-IPADOMAIN-TEST with label dirsrv_config_t.\nRemoving seLinux file context /etc/dirsrv/slapd-IPADOMAIN-TEST/schema with label dirsrv_config_t.\nRemoving seLinux file context /var/lib/dirsrv/slapd-IPADOMAIN-TEST/bak with label dirsrv_var_lib_t.\nRemoving seLinux file context /var/lib/dirsrv/slapd-IPADOMAIN-TEST/db with label dirsrv_var_lib_t.\nRemoving seLinux file context /var/lib/dirsrv/slapd-IPADOMAIN-TEST/ldif with label dirsrv_var_lib_t.\nRemoving seLinux file context /var/log/dirsrv/slapd-IPADOMAIN-TEST with label dirsrv_var_log_t.\nRemoving seLinux file context /var/run/dirsrv with label dirsrv_var_run_t.\nRemoving seLinux file context /var/run/lock/dirsrv/slapd-IPADOMAIN-TEST with label dirsrv_var_lock_t.\nRemoving Kerberos service principals from /etc/krb5.keytab\nDisabling client Kerberos and LDAP configurations\nRedundant SSSD configuration file /etc/sssd/sssd.conf was moved to /etc/sssd/sssd.conf.deleted\nRestoring client configuration files\nRestoring ipadomain.test as NIS domain.\nnscd daemon is not installed, skip configuration\nnslcd daemon is not installed, skip configuration\nSystemwide CA database updated.\nClient uninstall complete.\nThe ipa-client-install command was successful\nThe ipa-server-install command was successful", "stderr_lines": ["Lookup failed: Preferred host replica3.ipadomain.test does not provide CA.", "Forcing removal of replica3.ipadomain.test", "Ignoring topology connectivity errors.", "Failed to cleanup replica3.ipadomain.test DNS entries: no matching entry found", "You may need to manually remove them from the tree", "Removing seLinux file context /dev/shm/slapd-IPADOMAIN-TEST with label dirsrv_tmpfs_t.", "Removing seLinux file context /etc/dirsrv/slapd-IPADOMAIN-TEST with label dirsrv_config_t.", "Removing seLinux file context /etc/dirsrv/slapd-IPADOMAIN-TEST/schema with label dirsrv_config_t.", "Removing seLinux file context /var/lib/dirsrv/slapd-IPADOMAIN-TEST/bak with label dirsrv_var_lib_t.", "Removing seLinux file context /var/lib/dirsrv/slapd-IPADOMAIN-TEST/db with label dirsrv_var_lib_t.", "Removing seLinux file context /var/lib/dirsrv/slapd-IPADOMAIN-TEST/ldif with label dirsrv_var_lib_t.", "Removing seLinux file context /var/log/dirsrv/slapd-IPADOMAIN-TEST with label dirsrv_var_log_t.", "Removing seLinux file context /var/run/dirsrv with label dirsrv_var_run_t.", "Removing seLinux file context /var/run/lock/dirsrv/slapd-IPADOMAIN-TEST with label dirsrv_var_lock_t.", "Removing Kerberos service principals from /etc/krb5.keytab", "Disabling client Kerberos and LDAP configurations", "Redundant SSSD configuration file /etc/sssd/sssd.conf was moved to /etc/sssd/sssd.conf.deleted", "Restoring client configuration files", "Restoring ipadomain.test as NIS domain.", 
"nscd daemon is not installed, skip configuration", "nslcd daemon is not installed, skip configuration", "Systemwide CA database updated.", "Client uninstall complete.", "The ipa-client-install command was successful", "The ipa-server-install command was successful"], "stdout": "Updating DNS system records\n--------------------------------------------\nDeleted IPA server \"replica3.ipadomain.test\"\n--------------------------------------------\nShutting down all IPA services\nUnconfiguring web server\nUnconfiguring krb5kdc\nUnconfiguring kadmin\nUnconfiguring directory server\nUnconfiguring ipa-custodia\nUnconfiguring ipa-otpd\nRemoving IPA client configuration", "stdout_lines": ["Updating DNS system records", "--------------------------------------------", "Deleted IPA server \"replica3.ipadomain.test\"", "--------------------------------------------", "Shutting down all IPA services", "Unconfiguring web server", "Unconfiguring krb5kdc", "Unconfiguring kadmin", "Unconfiguring directory server", "Unconfiguring ipa-custodia", "Unconfiguring ipa-otpd", "Removing IPA client configuration"]}
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 META: ran handlers
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 META: ran handlers
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 PLAY RECAP *********************************************************************
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 replica3.ipadomain.test    : ok=9    changed=2    unreachable=0    failed=0    skipped=5    rescued=0    ignored=0   
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:563 
DEBUG    pytest_multihost.host.Host.ansible.cmd32:transport.py:217 Exit code: 0
INFO     pytest_multihost.host.Host.replica4.OpenSSHTransport:transport.py:397 RUN ['kinit', 'admin']
DEBUG    pytest_multihost.host.Host.replica4.cmd12:transport.py:519 RUN ['kinit', 'admin']
DEBUG    pytest_multihost.host.Host.replica4.cmd12:transport.py:563 bash: line 1: cd: /root/multihost_tests: No such file or directory
DEBUG    pytest_multihost.host.Host.replica4.cmd12:transport.py:563 bash: line 2: /root/multihost_tests/env.sh: No such file or directory
DEBUG    pytest_multihost.host.Host.replica4.cmd12:transport.py:563 Password for admin: 
DEBUG    pytest_multihost.host.Host.replica4.cmd12:transport.py:217 Exit code: 0
INFO     pytest_multihost.host.Host.replica4.OpenSSHTransport:transport.py:397 RUN ['ipa', 'server-find']
DEBUG    pytest_multihost.host.Host.replica4.cmd13:transport.py:519 RUN ['ipa', 'server-find']
DEBUG    pytest_multihost.host.Host.replica4.cmd13:transport.py:563 bash: line 1: cd: /root/multihost_tests: No such file or directory
DEBUG    pytest_multihost.host.Host.replica4.cmd13:transport.py:563 bash: line 2: /root/multihost_tests/env.sh: No such file or directory
DEBUG    pytest_multihost.host.Host.replica4.cmd13:transport.py:563 ---------------------
DEBUG    pytest_multihost.host.Host.replica4.cmd13:transport.py:563 2 IPA servers matched
DEBUG    pytest_multihost.host.Host.replica4.cmd13:transport.py:563 ---------------------
DEBUG    pytest_multihost.host.Host.replica4.cmd13:transport.py:563   Server name: master.ipadomain.test
DEBUG    pytest_multihost.host.Host.replica4.cmd13:transport.py:563   Min domain level: 1
DEBUG    pytest_multihost.host.Host.replica4.cmd13:transport.py:563   Max domain level: 1
DEBUG    pytest_multihost.host.Host.replica4.cmd13:transport.py:563 
DEBUG    pytest_multihost.host.Host.replica4.cmd13:transport.py:563   Server name: replica4.ipadomain.test
DEBUG    pytest_multihost.host.Host.replica4.cmd13:transport.py:563   Min domain level: 1
DEBUG    pytest_multihost.host.Host.replica4.cmd13:transport.py:563   Max domain level: 1
DEBUG    pytest_multihost.host.Host.replica4.cmd13:transport.py:563 ----------------------------
DEBUG    pytest_multihost.host.Host.replica4.cmd13:transport.py:563 Number of entries returned 2
DEBUG    pytest_multihost.host.Host.replica4.cmd13:transport.py:563 ----------------------------
DEBUG    pytest_multihost.host.Host.replica4.cmd13:transport.py:217 Exit code: 0
INFO     pytest_multihost.host.Host.replica4.OpenSSHTransport:transport.py:397 RUN ['kdestroy', '-A']
DEBUG    pytest_multihost.host.Host.replica4.cmd14:transport.py:519 RUN ['kdestroy', '-A']
DEBUG    pytest_multihost.host.Host.replica4.cmd14:transport.py:563 bash: line 1: cd: /root/multihost_tests: No such file or directory
DEBUG    pytest_multihost.host.Host.replica4.cmd14:transport.py:563 bash: line 2: /root/multihost_tests/env.sh: No such file or directory
DEBUG    pytest_multihost.host.Host.replica4.cmd14:transport.py:217 Exit code: 0

Comment 9 errata-xmlrpc 2023-11-14 15:26:23 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory (ansible-freeipa bug fix and enhancement update), and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHBA-2023:6926

