Bug 1530183 - "v3.7" should be replaced with "v3.9" in openshift-ansible code
Keywords:
Status: CLOSED CURRENTRELEASE
Alias: None
Product: OpenShift Container Platform
Classification: Red Hat
Component: Installer
Version: 3.9.0
Hardware: Unspecified
OS: Unspecified
Priority: high
Severity: medium
Target Milestone: ---
Target Release: 3.9.0
Assignee: Vadim Rutkovsky
QA Contact: Johnny Liu
URL:
Whiteboard:
Duplicates: 1518806 1523063 1532097 1532100
Depends On:
Blocks:
 
Reported: 2018-01-02 08:24 UTC by Johnny Liu
Modified: 2018-06-18 18:27 UTC
CC List: 12 users

Fixed In Version: openshift-ansible-3.9.0-0.42.0.git.0.1a9a61b.el7
Doc Type: If docs needed, set a value
Doc Text:
Clone Of:
Environment:
Last Closed: 2018-06-18 14:39:30 UTC


Attachments

Description Johnny Liu 2018-01-02 08:24:27 UTC
Description of problem:
3.9 testing has now started, but many playbooks still refer to "v3.7" as the default value, as shown in the grep output below. In particular, openshift_service_catalog, ansible_service_broker, template_service_broker, logging, metrics, and openshift_prometheus need to be updated to "v3.9".

$ find . -type f|grep -v README|grep -v ansible.spec|grep -v 'v3.7'|grep -v upgrades|xargs grep v3.7
./playbooks/aws/provisioning_vars.yml.example:openshift_release: # v3.7
./utils/src/ooinstall/cli_installer.py:        'major_playbook': 'v3_7/upgrade.yml',
./utils/src/ooinstall/cli_installer.py:        'minor_playbook': 'v3_7/upgrade.yml',
./inventory/hosts.example:openshift_release=v3.7
./inventory/hosts.example:#openshift_image_tag=v3.7.0
./inventory/hosts.example:#openshift_metrics_image_version=v3.7
./inventory/hosts.example:#openshift_metrics_image_version=v3.7
./inventory/hosts.example:#openshift_logging_image_version=v3.7.0
./inventory/hosts.example:#openshift_service_catalog_image_version=v3.7
Binary file ./.git/objects/pack/pack-ebfdf27d9754d5f343abfe0313ddca391f53c462.pack matches
Binary file ./.git/index matches
./roles/openshift_prometheus/vars/openshift-enterprise.yml:l_openshift_prometheus_image_version: "{{ openshift_prometheus_image_version | default('v3.7') }}"
./roles/openshift_prometheus/vars/openshift-enterprise.yml:l_openshift_prometheus_proxy_image_version: "{{ openshift_prometheus_proxy_image_version | default('v3.7') }}"
./roles/openshift_prometheus/vars/openshift-enterprise.yml:l_openshift_prometheus_alertmanager_image_version: "{{ openshift_prometheus_alertmanager_image_version | default('v3.7') }}"
./roles/openshift_prometheus/vars/openshift-enterprise.yml:l_openshift_prometheus_alertbuffer_image_version: "{{ openshift_prometheus_alertbuffer_image_version | default('v3.7') }}"
./roles/template_service_broker/vars/openshift-enterprise.yml:__template_service_broker_version: "v3.7"
./roles/openshift_logging_curator/vars/openshift-enterprise.yml:__openshift_logging_curator_image_version: "{{ openshift_logging_image_version | default ('v3.7') }}"
./roles/openshift_logging_elasticsearch/vars/openshift-enterprise.yml:__openshift_logging_elasticsearch_image_version: "{{ openshift_logging_image_version | default ('v3.7') }}"
./roles/openshift_logging_elasticsearch/vars/openshift-enterprise.yml:__openshift_logging_elasticsearch_proxy_image_version: "{{ openshift_logging_image_version | default ('v3.7') }}"
./roles/openshift_logging_eventrouter/vars/openshift-enterprise.yml:__openshift_logging_eventrouter_image_version: "{{ openshift_logging_image_version | default ('v3.7') }}"
./roles/openshift_examples/examples-sync.sh:ORIGIN_VERSION=${1:-v3.7}
./roles/openshift_logging_fluentd/vars/openshift-enterprise.yml:__openshift_logging_fluentd_image_version: "{{ openshift_logging_image_version | default ('v3.7') }}"
./roles/openshift_logging_mux/vars/openshift-enterprise.yml:__openshift_logging_mux_image_version: "{{ openshift_logging_image_version | default ('v3.7') }}"
./roles/openshift_logging_kibana/vars/openshift-enterprise.yml:__openshift_logging_kibana_image_version: "{{ openshift_logging_image_version | default ('v3.7') }}"
./roles/openshift_logging_kibana/vars/openshift-enterprise.yml:__openshift_logging_kibana_proxy_image_version: "{{ openshift_logging_image_version | default ('v3.7') }}"
./roles/openshift_health_checker/test/docker_image_availability_test.py:            openshift_image_tag="v3.7.0-alpha.0",
./roles/openshift_health_checker/test/docker_image_availability_test.py:        "registry.example.com/spam/registry-console:v3.7",
./roles/openshift_health_checker/test/docker_image_availability_test.py:            openshift_image_tag="v3.7.0-alpha.0",
./roles/openshift_health_checker/openshift_checks/docker_image_availability.py:        # enterprise template just uses v3.6, v3.7, etc
./roles/ansible_service_broker/vars/openshift-enterprise.yml:__ansible_service_broker_image_tag: v3.7
./roles/ansible_service_broker/vars/openshift-enterprise.yml:__ansible_service_broker_registry_tag: v3.7
./roles/openshift_metrics/vars/openshift-enterprise.yml:__openshift_metrics_image_version: "v3.7"
./roles/openshift_service_catalog/vars/openshift-enterprise.yml:__openshift_service_catalog_image_version: "v3.7"
./roles/openshift_facts/library/openshift_facts.py:            examples_content_version = 'v3.7'
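
The change requested here amounts to bumping the hardcoded defaults in the enterprise vars files listed above. As an illustrative sketch only (the actual fix landed in PR 6712 and may be structured differently), one of the openshift_prometheus entries would change like this:

```yaml
# roles/openshift_prometheus/vars/openshift-enterprise.yml -- sketch of the requested bump
# before:
#   l_openshift_prometheus_image_version: "{{ openshift_prometheus_image_version | default('v3.7') }}"
# after:
l_openshift_prometheus_image_version: "{{ openshift_prometheus_image_version | default('v3.9') }}"
```

The Jinja2 `default` filter only applies when `openshift_prometheus_image_version` is undefined, so user-supplied overrides are unaffected; only the fallback version string changes.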


Version-Release number of the following components:
openshift-ansible master branch
$ git describe
openshift-ansible-3.9.0-0.11.0


How reproducible:

Steps to Reproduce:
1.
2.
3.

Actual results:
Please include the entire output from the last TASK line through the end of output if an error is generated

Expected results:

Additional info:
Please attach logs from ansible-playbook with the -vvv flag

Comment 1 Vadim Rutkovsky 2018-01-12 14:12:39 UTC
*** Bug 1532097 has been marked as a duplicate of this bug. ***

Comment 2 Vadim Rutkovsky 2018-01-12 14:13:03 UTC
*** Bug 1532100 has been marked as a duplicate of this bug. ***

Comment 3 Vadim Rutkovsky 2018-01-15 11:12:15 UTC
Working on this in https://github.com/openshift/openshift-ansible/pull/6712

Comment 4 Scott Dodson 2018-01-17 14:44:36 UTC
*** Bug 1518806 has been marked as a duplicate of this bug. ***

Comment 5 Scott Dodson 2018-01-17 14:45:20 UTC
*** Bug 1516564 has been marked as a duplicate of this bug. ***

Comment 7 Vadim Rutkovsky 2018-01-22 17:22:06 UTC
*** Bug 1523063 has been marked as a duplicate of this bug. ***

Comment 8 Vadim Rutkovsky 2018-02-12 07:58:36 UTC
Fix available in openshift-ansible-3.9.0-0.42.0.git.0.1a9a61b.el7

Comment 9 Johnny Liu 2018-02-13 08:16:37 UTC
Verified this bug with openshift-ansible-3.9.0-0.42.0.git.0.1a9a61b.el7.noarch, and it PASSED.

The default image tag now falls back to `openshift_image_tag`.
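
Schematically, the post-fix pattern described here would look like the following (a sketch assuming the variable names from the grep output above; the exact layout in openshift-ansible-3.9.0-0.42.0 may differ):

```yaml
# sketch: default now falls through to openshift_image_tag
# instead of a hardcoded 'v3.7'/'v3.9' string
l_openshift_prometheus_image_version: "{{ openshift_prometheus_image_version | default(openshift_image_tag) }}"
```

This removes the need to bump each role's vars file on every release, since the fallback tracks the cluster-wide image tag.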

# find . -type f|grep -v README|grep -v ansible.spec|grep  -v example|grep -v "v3.7"|grep -v upgrades|xargs grep v3.7
./utils/src/ooinstall/cli_installer.py:        'major_playbook': 'v3_7/upgrade.yml',
./utils/src/ooinstall/cli_installer.py:        'minor_playbook': 'v3_7/upgrade.yml',
Binary file ./.git/objects/pack/pack-ebfdf27d9754d5f343abfe0313ddca391f53c462.pack matches
Binary file ./.git/index matches
./roles/openshift_bootstrap_autoapprover/files/openshift-bootstrap-controller.yaml:        image: openshift/node:v3.7.0-rc.0
./roles/openshift_health_checker/test/docker_image_availability_test.py:            openshift_image_tag="v3.7.0-alpha.0",
./roles/openshift_health_checker/test/docker_image_availability_test.py:        "registry.example.com/spam/registry-console:v3.7",
./roles/openshift_health_checker/test/docker_image_availability_test.py:            openshift_image_tag="v3.7.0-alpha.0",
./roles/openshift_health_checker/openshift_checks/docker_image_availability.py:        # enterprise template just uses v3.6, v3.7, etc
./roles/openshift_facts/library/openshift_facts.py:            examples_content_version = 'v3.7'

# oc describe po webconsole-777f89fcd5-rgmfr -n openshift-web-console | grep Image
    Image:         registry.reg-aws.openshift.com:443/openshift3/ose-web-console:v3.9.0

# oc describe po apiserver-xnnxq -n kube-service-catalog | grep Image
    Image:         registry.reg-aws.openshift.com:443/openshift3/ose-service-catalog:v3.9.0


# oc describe po controller-manager-rlvmm -n kube-service-catalog | grep Image
    Image:         registry.reg-aws.openshift.com:443/openshift3/ose-service-catalog:v3.9.0

# oc describe po asb-1-fbbp7 -n openshift-ansible-service-broker | grep Image
    Image:          registry.reg-aws.openshift.com:443/openshift3/ose-ansible-service-broker:v3.9.0

# oc describe po apiserver-8jtg4 -n openshift-template-service-broker |grep Image
    Image:         registry.reg-aws.openshift.com:443/openshift3/ose-template-service-broker:v3.9.0

