Description of problem:
Remove glusterfs client packages while uninstalling glusterfs. The uninstall playbook wipes the entire storage and deletes a few directories, but it does not remove the glusterfs packages. As a result, running the installation playbook again after an uninstall fails.
This happens because the installer skips the glusterfs-fuse package installation, since the packages are already installed on the nodes. The installer then fails to create the log files, because the parent directories, which are provided by the glusterfs package, were removed.
This is the error the installer throws:
"msg": "Error mounting /tmp/openshift-glusterfs-registry-nn90GP: ERROR: failed to create logfile \"/var/log/glusterfs/tmp-openshift-glusterfs-registry-nn90GP.log\" (No such file or directory)\nERROR: failed to open logfile /var/log/glusterfs/tmp-openshift-glusterfs-registry-nn90GP.log\nMount failed. Please check the log file for more detail
This fails because the /var/log/glusterfs directory is not present. This directory is created by the glusterfs package:
[root@localhost ~]# rpm -qf /var/log/glusterfs/
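Judging from the error message above, the mount helper appears to derive the log file name from the mountpoint path: the leading slash is dropped and the remaining slashes are replaced with dashes, then the result is placed under /var/log/glusterfs. A small sketch of that mapping:

```shell
# Reconstruct the logfile path the mount helper tried to create for the
# mountpoint reported in the error (slash-to-dash mapping is inferred
# from the error text, not from glusterfs source).
mountpoint="/tmp/openshift-glusterfs-registry-nn90GP"
logfile="/var/log/glusterfs/$(echo "${mountpoint#/}" | tr '/' '-').log"
echo "$logfile"
# -> /var/log/glusterfs/tmp-openshift-glusterfs-registry-nn90GP.log
```

Since the glusterfs package owns /var/log/glusterfs, deleting the directory without removing the package leaves every mount attempt unable to open its logfile.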
During installation, this is the task that installs the glusterfs-fuse package:
- name: Install GlusterFS storage plugin dependencies
  until: result is succeeded
The other packages are installed as dependencies of the glusterfs-fuse and glusterfs packages:
[root@localhost ~]# rpm -q glusterfs-fuse --requires
config(glusterfs-fuse) = 3.12.2-18.el7
glusterfs(x86-64) = 3.12.2-18.el7
glusterfs-client-xlators(x86-64) = 3.12.2-18.el7
[root@localhost ~]# rpm -q glusterfs --requires
glusterfs-libs(x86-64) = 3.12.2-18.el7
Running the uninstall playbook followed by an install playbook reproduces the failure.
Removing the glusterfs packages manually and then running the install playbook works as expected: it installs the glusterfs* packages and creates all the necessary files and directories.
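As a stopgap until the fix lands, a minimal pre-flight sketch for each affected node (the directory path and package glob are taken from this report; the yum command is left commented out because it is destructive):

```shell
#!/bin/sh
# Prints "ok" when the glusterfs-owned log directory is present,
# "stale" when packages were left installed but their directories removed.
check_glusterfs_state() {
    if [ -d "$1" ]; then echo ok; else echo stale; fi
}

if [ "$(check_glusterfs_state /var/log/glusterfs)" = "stale" ]; then
    echo "glusterfs dirs missing; remove leftover packages, then rerun install"
    # yum -y remove 'glusterfs*'   # destructive: run manually on each node
fi
```

After removing the packages, rerunning the install playbook reinstalls glusterfs-fuse and recreates the missing directories.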
Created a PR for this:
Version-Release number of the following components:
rpm -q openshift-ansible
There appear to be no active cases related to this bug. As such we're closing this bug in order to focus on bugs that are still tied to active customer cases. Please re-open this bug if you feel it was closed in error or a new active case is attached.