Description of problem:
When deploying logging and specifying gluster as the dynamic storage provider, the generated PVCs are missing the storage class.

Version-Release number of the following components:
* openshift-ansible-3.11.123
* ansible-2.6

How reproducible:
Consistently

Steps to Reproduce:
1. Deploy logging using glusterfs dynamic storage for persistence:

# +++++++++++++++++++++++
# Cluster Logging
# +++++++++++++++++++++++
openshift_logging_install_logging: true
openshift_logging_es_cluster_size: 3
openshift_logging_elasticsearch_storage_type: pvc
openshift_logging_es_pvc_storage_class_name: glusterfs-registry
openshift_logging_es_pvc_size: 20Gi
openshift_logging_es_pvc_dynamic: true
openshift_logging_es_number_of_shards: 3
openshift_logging_es_number_of_replicas: 1
openshift_logging_es_nodeselector: {node-role.kubernetes.io/infra: "true"}

Actual results:
The playbook runs successfully, but the Elasticsearch PVCs are missing the gluster storage-class reference even though it was specified in the Ansible variables:

$ oc get pvc
NAME           STATUS    VOLUME    CAPACITY   ACCESS MODES   STORAGECLASS   AGE
logging-es-0   Pending                                                      1d
logging-es-1   Pending                                                      1d
logging-es-2   Pending                                                      1d

Expected results:
The Elasticsearch PVCs correctly reference the gluster storage class and bind automatically.
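For reference, a correctly generated claim would carry the storage class from `openshift_logging_es_pvc_storage_class_name` in its spec. A minimal sketch of what the PVC should contain (the exact labels and annotations produced by the playbook may differ; the namespace shown is an assumption):

```yaml
# Sketch of the expected Elasticsearch PVC; actual metadata generated by
# openshift-ansible may differ.
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: logging-es-0
  namespace: openshift-logging   # assumed logging namespace
spec:
  accessModes:
    - ReadWriteOnce
  resources:
    requests:
      storage: 20Gi
  # The field missing from the generated PVCs, which leaves them Pending:
  storageClassName: glusterfs-registry
```

With `storageClassName` absent and no default StorageClass configured, the claims have nothing to provision against and stay Pending, as shown above.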
*** Bug 1737605 has been marked as a duplicate of this bug. ***
*** Bug 1745983 has been marked as a duplicate of this bug. ***
The PVCs can be created when using a non-default StorageClass. Moving to verified on openshift3/ose-ansible:v3.11.

[311]$ oc get pvc
NAME           STATUS    VOLUME                                     CAPACITY   ACCESS MODES   STORAGECLASS      AGE
logging-es-0   Bound     pvc-e7ddf6b8-c96d-11e9-8b2f-0edabec3cc80   10Gi       RWO            bogus-glusterfs   4m
logging-es-1   Bound     pvc-0a0dce92-c96e-11e9-8b2f-0edabec3cc80   10Gi       RWO            bogus-glusterfs   3m

[311]$ oc get sc
NAME              PROVISIONER             AGE
bogus-glusterfs   kubernetes.io/aws-ebs   27m
gp2 (default)     kubernetes.io/aws-ebs   15m
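For anyone reproducing this verification: the non-default `bogus-glusterfs` class seen in the `oc get sc` output could have been created with something like the following sketch (the provisioner matches the output above; parameters are assumptions):

```yaml
# Hypothetical StorageClass used only to verify that a non-default class
# name is propagated to the logging PVCs; it is AWS EBS-backed, not gluster.
apiVersion: storage.k8s.io/v1
kind: StorageClass
metadata:
  name: bogus-glusterfs
provisioner: kubernetes.io/aws-ebs
parameters:
  type: gp2   # assumed EBS volume type
```

The point of the test is that `openshift_logging_es_pvc_storage_class_name: bogus-glusterfs` now lands in the PVC spec, so the claims bind against the named class rather than falling through to the default.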
Since the problem described in this bug report should be resolved in a recent advisory, it has been closed with a resolution of ERRATA. For information on the advisory, and where to find the updated files, follow the link below. If the solution does not work for you, open a new bug report. https://access.redhat.com/errata/RHBA-2019:2580