Bug 1542406 - [CNS] Deploying CNS with the glusterfs_registry group configured as registry storage fails due to a missing attribute
Status: CLOSED ERRATA
Product: OpenShift Container Platform
Classification: Red Hat
Component: Installer
Version: 3.9.0
Hardware: Unspecified
OS: Unspecified
Priority: high
Severity: high
Target Milestone: ---
Target Release: 3.9.0
Assigned To: Michael Gugino
QA Contact: Wenkai Shi
 
Reported: 2018-02-06 05:05 EST by Wenkai Shi
Modified: 2018-04-09 09:44 EDT
CC List: 6 users

Last Closed: 2018-03-28 10:26:32 EDT
Type: Bug




External Trackers:
Red Hat Product Errata RHBA-2018:0489 (Last Updated: 2018-03-28 10:26 EDT)
Description Wenkai Shi 2018-02-06 05:05:08 EST
Description of problem:
Deploying CNS with the glusterfs_registry group configured as registry storage fails due to a missing attribute.
The installer fails in TASK [openshift_hosted : Mount registry volume].

Version-Release number of the following components:
openshift-ansible-3.9.0-0.38.0.git.0.57e1184.el7

How reproducible:
100%

Steps to Reproduce:
1. Deploy CNS with the glusterfs_registry group configured as registry storage:
# cat hosts
[OSEv3:children]
masters
nodes
glusterfs_registry
[OSEv3:vars]
...
openshift_hosted_registry_storage_kind=glusterfs
...
[nodes]
...
glusterfs1.example.com
glusterfs2.example.com
glusterfs3.example.com
[glusterfs_registry]
glusterfs1.example.com glusterfs_devices="['/dev/sdb']"
glusterfs2.example.com glusterfs_devices="['/dev/sdb']"
glusterfs3.example.com glusterfs_devices="['/dev/sdb']"

Actual results:
# ansible-playbook -i hosts -v /usr/share/ansible/openshift-ansible/playbooks/deploy_cluster.yml
...
TASK [openshift_hosted : Mount registry volume] ********************************
Tuesday 06 February 2018  09:48:33 +0000 (0:00:00.420)       0:36:10.525 ****** 
fatal: [ec2-xx-xx-xx-xx.compute-1.amazonaws.com]: FAILED! => {"msg": "The task includes an option with an undefined variable. The error was: 'dict object' has no attribute 'hosted'\n\nThe error appears to have been in '/usr/share/ansible/openshift-ansible/roles/openshift_hosted/tasks/storage/glusterfs.yml': line 34, column 3, but may\nbe elsewhere in the file depending on the exact syntax problem.\n\nThe offending line appears to be:\n\n\n- name: Mount registry volume\n  ^ here\n\nexception type: <class 'ansible.errors.AnsibleUndefinedVariable'>\nexception: 'dict object' has no attribute 'hosted'"}
...

Expected results:
The installer should succeed.

Additional info:
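For context on the failure above, the traceback points at the "Mount registry volume" task in roles/openshift_hosted/tasks/storage/glusterfs.yml, which dereferences the nested fact openshift.hosted.registry.storage.glusterfs.path (see comment 2). A minimal sketch of the kind of task that fails this way is shown below; the mount module usage and the src value are illustrative assumptions, not the actual role code:

# Hypothetical sketch, not the actual role task. Any Jinja2 reference to
# openshift.hosted raises AnsibleUndefinedVariable ("'dict object' has no
# attribute 'hosted'") once the 'hosted' key is no longer set on the
# 'openshift' fact.
- name: Mount registry volume
  mount:
    state: mounted
    fstype: glusterfs
    src: "{{ registry_volume_source }}"  # illustrative variable name
    path: "{{ openshift.hosted.registry.storage.glusterfs.path }}"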
Comment 1 Scott Dodson 2018-02-06 09:36:11 EST
Which version of ansible was used in this test? This seems like a regression.
Comment 2 Jose A. Rivera 2018-02-06 09:55:57 EST
I swear we already fixed this... Scott, has something changed for 3.9 that would make a reference to "openshift.hosted.registry.storage.glusterfs.path" no longer valid?
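A quick way to see what the openshift fact actually contains at that point is a debug task placed just before the failing mount task; this is a diagnostic sketch, not part of openshift-ansible or the original report:

# Hypothetical diagnostic: print the fact tree so a missing 'hosted' key
# is visible in the play output.
- name: Inspect the openshift fact
  debug:
    var: openshift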
Comment 3 Wenkai Shi 2018-02-07 01:11:08 EST
Sorry for missing the ansible version: ansible-2.4.3.0-1.el7ae
It can also be reproduced with ansible-2.4.2.0-2.el7.
Comment 4 Michael Gugino 2018-02-21 18:05:46 EST
This is a regression caused by: https://github.com/openshift/openshift-ansible/pull/6969

PR Created: https://github.com/openshift/openshift-ansible/pull/7244
Comment 5 Michael Gugino 2018-02-22 11:42:10 EST
PR Merged.
Comment 7 Wenkai Shi 2018-02-28 03:01:01 EST
Verified with openshift-ansible-3.9.1-1.git.0.9862628.el7; the registry volume is mounted successfully.
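For anyone re-verifying, spot checks along these lines can confirm the hosted registry is backed by a GlusterFS volume; the project and deployment names assume the default hosted registry (docker-registry in the default project):

# Hypothetical spot checks, assuming the default hosted registry layout:
# list the volumes attached to the registry deployment and the bound PVCs.
oc set volume dc/docker-registry -n default
oc get pvc -n default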
Comment 10 errata-xmlrpc 2018-03-28 10:26:32 EDT
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory, and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHBA-2018:0489
