Description of problem:
Running the "v3_7 upgrade_control_plane" playbook failed on the 'docker : Fix SELinux Permissions on /var/lib/containers' task because the restorecon command was not found.

Version-Release number of the following components:
rpm -q openshift-ansible
openshift-ansible-3.7.23-1.git.0.bc406aa.el7.noarch
rpm -q ansible
ansible-2.4.1.0-1.el7.noarch
ansible --version
ansible 2.4.1.0
  config file = /etc/ansible/ansible.cfg
  configured module search path = Default w/o overrides

How reproducible:
Repeatedly, until modifying the specific playbooks to prepend commands with "/sbin/", or updating PATH via the environment directive. Prepended the restorecon and swapoff commands with "/sbin/" in the files:
/usr/share/ansible/openshift-ansible/roles/docker/tasks/main.yml
/usr/share/ansible/openshift-ansible/roles/openshift_node/tasks/main.yml
/usr/share/ansible/openshift-ansible/roles/openshift_node_upgrade/tasks/main.yml

Steps to Reproduce:
1. ansible-playbook /usr/share/ansible/openshift-ansible/playbooks/byo/openshift-cluster/upgrades/v3_7/upgrade_control_plane.yml

Actual results:
TASK [docker : Fix SELinux Permissions on /var/lib/containers] ******************************************************************
fatal: [denqca3osmas01.qic.tiaa-cref.org]: FAILED! => {"changed": false, "cmd": "restorecon -R /var/lib/containers/", "failed": true, "msg": "[Errno 2] No such file or directory", "rc": 2}

NO MORE HOSTS LEFT **************************************************************************************************************

Expected results:
The upgrade_control_plane playbook upgrades the master nodes.

Additional info:
Seen during a LAB cluster upgrade from 3.6.173.0.5 to 3.7.23, using ansible-2.4.1.0 and atomic-openshift-utils-3.7.23, and also in the 3.5 -> 3.6 upgrade (see BZ #1546247). Customer reports this has been an issue since the 3.3 install.
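The failure mode can be simulated without an affected host — a sketch, where a temp directory stands in for /sbin (the directory that actually holds restorecon on these hosts):

```shell
#!/bin/sh
# Simulate a host where restorecon lives only in an sbin-style directory;
# a temp dir stands in for /sbin so the sketch runs anywhere.
sbin=$(mktemp -d)
printf '#!/bin/sh\nexit 0\n' > "$sbin/restorecon"
chmod +x "$sbin/restorecon"

# A non-tty SSH session PATH that lacks the sbin dirs cannot resolve a
# bare "restorecon"; Ansible surfaces the failed exec() as
# "[Errno 2] No such file or directory":
PATH=/usr/bin:/bin command -v restorecon >/dev/null 2>&1 \
  || echo "restorecon: not found without /sbin on PATH"

# With the sbin directory back on PATH, the same lookup succeeds:
PATH="$sbin:/usr/bin:/bin" command -v restorecon
```

On a real host, `ssh <host> 'echo $PATH'` shows the PATH a non-interactive session actually receives, which is what Ansible's tasks see.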
This is likely due to the customer's "secured non-tty ssh session environment" not including /sbin in the PATH. See customer Support Case https://access.redhat.com/support/cases/#/case/02037120 for logs, ansible.cfg, and the hosts inventory file.
*** Bug 1546872 has been marked as a duplicate of this bug. ***
After discussion, the solution we're going to implement is to document the expected PATH and request that admins ensure the PATH is correct for both the ansible_ssh_user and root. This is the default on RHEL 7. We feel it's incorrect to alter all code paths to use explicit paths, since those paths may change between OS releases, and it's undesirable to write conditional logic for every binary we use.
The default PATH for the root user in RHEL is `/sbin:/bin:/usr/sbin:/usr/bin`. If the user has these directories in their PATH, the openshift-ansible installer should succeed without issue.
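A quick way to verify that requirement on a host is a PATH membership check — a sketch, where check_path is an illustrative helper and not part of openshift-ansible:

```shell
#!/bin/sh
# check_path PATHSTRING: report any of the directories the installer
# relies on (the RHEL 7 root default) that are missing from PATHSTRING.
required="/sbin /bin /usr/sbin /usr/bin"
check_path() {
    missing=""
    for d in $required; do
        case ":$1:" in
            *":$d:"*) ;;                    # directory present
            *) missing="$missing $d" ;;     # directory absent
        esac
    done
    [ -z "$missing" ] && return 0
    echo "missing:$missing"
    return 1
}

check_path "/sbin:/bin:/usr/sbin:/usr/bin"  # complete default: silent, rc 0
check_path "/usr/bin:/bin" || true          # prints the sbin dirs it lacks
```

Running it against `$(ssh <host> 'echo $PATH')` for both the ansible_ssh_user and root would flag hosts that will hit this failure before an upgrade is attempted.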
https://github.com/openshift/openshift-docs/pull/7869
*** Bug 1552454 has been marked as a duplicate of this bug. ***
Commits pushed to master at https://github.com/openshift/openshift-docs

https://github.com/openshift/openshift-docs/commit/3a768745495f1d3316151a68fb665209db3a7acf
Bug 1546254- Add required path for openshift-ansible installer to succeed.

https://github.com/openshift/openshift-docs/commit/f6b25ec3f153e899e9bb7ffd41bc1cb8e9cb8bcc
Merge pull request #7869 from fabianvf/bz1546254
Bug 1546254- Add required path for openshift-ansible installer to succeed.
LGTM. https://github.com/openshift/openshift-docs/pull/8581
Since the problem described in this bug report should be resolved in a recent advisory, it has been closed with a resolution of ERRATA.

For information on the advisory, and where to find the updated files, follow the link below. If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHBA-2018:2009