Bug 1570596 - The check that openshift_release should be in openshift_image_tag/openshift_pkg_version did not take effect because openshift_release was hardcoded to 3.10
Summary: The check that openshift_release should be in openshift_image_tag/openshift_p...
Keywords:
Status: CLOSED WONTFIX
Alias: None
Product: OpenShift Container Platform
Classification: Red Hat
Component: Cluster Version Operator
Version: 3.10.0
Hardware: Unspecified
OS: Unspecified
Priority: low
Severity: low
Target Milestone: ---
Target Release: 3.10.0
Assignee: Michael Gugino
QA Contact: liujia
URL:
Whiteboard:
Depends On:
Blocks:
 
Reported: 2018-04-23 10:01 UTC by liujia
Modified: 2018-05-21 15:36 UTC
CC: 7 users

Fixed In Version:
Doc Type: If docs needed, set a value
Doc Text:
Clone Of:
Environment:
Last Closed: 2018-04-26 13:12:49 UTC
Target Upstream Version:



Description liujia 2018-04-23 10:01:11 UTC
Description of problem:
Run an upgrade against a containerized OCP with openshift_release=v3.9 and openshift_image_tag=v3.10.0-0.27.0 set in the inventory file. The upgrade should have exited at task [assert openshift_release in openshift_image_tag] during the pre-check stage.

The root cause appears to be that openshift_release set in the hosts file did not take effect; it was hardcoded to 3.10.
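Based on the task name, the pre-check amounts to a substring containment test. Below is a minimal Python sketch of that logic, assuming the role's Jinja2 expression is a plain `in` check; the function name and the `v`-prefix stripping are illustrative, not taken from the actual role source:

```python
# Hypothetical sketch of the "assert openshift_release in openshift_image_tag"
# pre-check. Variable names mirror the inventory variables.

def release_matches(openshift_release: str, openshift_image_tag: str) -> bool:
    """Return True when the release string appears inside the image tag."""
    # Strip a leading "v" so "v3.9" and "3.9" compare the same way.
    return openshift_release.lstrip("v") in openshift_image_tag

# What the user configured: release v3.9 vs. a 3.10 image tag.
# This mismatch is what the pre-check should have caught.
print(release_matches("v3.9", "v3.10.0-0.27.0"))   # mismatch -> False

# What actually happened: openshift_release was hardcoded to "3.10",
# so the assertion compared "3.10" against the tag and passed.
print(release_matches("3.10", "v3.10.0-0.27.0"))   # match -> True
```

With the hardcoded value in place, the assertion can never fail for a 3.10 image tag, which is why the log below shows "All assertions passed" despite the inventory setting openshift_release=v3.9.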


TASK [openshift_version : assert openshift_release in openshift_image_tag] **************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_version/tasks/first_master.yml:38
ok: [x] => {
    "changed": false,
    "msg": "All assertions passed"
}

TASK [openshift_version : assert openshift_release in openshift_pkg_version] ************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_version/tasks/first_master.yml:45
ok: [x] => {
    "changed": false,
    "msg": "All assertions passed"
}

TASK [openshift_version : debug] ********************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_version/tasks/first_master.yml:53
ok: [x] => {
    "openshift_release": "3.10"
}

TASK [openshift_version : debug] ********************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_version/tasks/first_master.yml:55
ok: [x] => {
    "openshift_image_tag": "v3.10.0-0.27.0"
}

TASK [openshift_version : debug] ********************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_version/tasks/first_master.yml:57
ok: [x] => {
    "openshift_pkg_version": "-3.10*"
}

TASK [openshift_version : debug] ********************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_version/tasks/first_master.yml:59
ok: [x] => {
    "openshift_version": "3.10"
}




Version-Release number of the following components:
openshift-ansible-3.10.0-0.27.0.git.0.abed3b7.el7.noarch

How reproducible:
always

Steps to Reproduce:
1. Container/RPM install OCP v3.9
2. Upgrade the above OCP with the following variables set in the inventory file:
openshift_release=v3.9
openshift_image_tag=v3.10.0-0.27.0 (or openshift_pkg_version=-3.10.0-0.27.0)

Actual results:
Upgrade finished.

Expected results:
The upgrade should abort when the pre-check finds that openshift_release is not contained in openshift_image_tag/openshift_pkg_version.

Additional info:
Please attach logs from ansible-playbook with the -vvv flag

Comment 1 Michael Gugino 2018-04-23 13:17:13 UTC
Marking priority as low.  This actually accomplishes what the user probably wants.  I believe we were setting openshift_release during upgrades; I'll look into whether the logic needs an update there.

Comment 2 Scott Dodson 2018-04-26 13:12:49 UTC
Setting either of these values to a 3.10 version seems like a clear indication of intent to upgrade. I don't think we'll get to fixing this.

Comment 3 liujia 2018-04-27 01:07:10 UTC
@Scott @Michael

For this issue, the root cause was that openshift_release set in the hosts file did not take effect. In v3.10, openshift_release was hardcoded to 3.10 regardless of what the user set the variable to. So I want to confirm: is that what we expected? Does that mean "openshift_release" can be removed from the inventory file?

Comment 4 liujia 2018-04-27 01:08:55 UTC
(In reply to liujia from comment #3)
> @Scott @Michael
> 
> For this issue, the root cause was that openshift_release set in the hosts
> file did not take effect. In v3.10, openshift_release was hardcoded to 3.10
> regardless of what the user set the variable to. So I want to confirm: is
> that what we expected? Does that mean "openshift_release" can be removed
> from the inventory file?

From 3.10 on, we do not support users setting the "openshift_release" variable. Am I right?

Comment 5 Johnny Liu 2018-05-07 10:21:46 UTC
Tried this scenario with a fresh install; no such issue.

Comment 6 Michael Gugino 2018-05-21 15:36:57 UTC
openshift_release is still recommended for installs and upgrades. Though, if it is not set for installs, we default openshift_release to the actual release (in this case, 3.10).

For upgrades, we hard-set openshift_release in a playbook.  Setting it will probably not have much of an effect, but I recommend setting it correctly before upgrading.
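A minimal inventory fragment reflecting that recommendation for a 3.9-to-3.10 upgrade; the group name follows the standard openshift-ansible inventory layout, and the image tag value is taken from this report (both are illustrative, not prescribed by this bug):

```ini
# Illustrative inventory snippet: set openshift_release to the release
# you are upgrading to, matching the image tag / package version.
[OSEv3:vars]
openshift_release=v3.10
openshift_image_tag=v3.10.0-0.27.0
```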

Probably in the future we will remove it entirely and just hard-code it; our ability to support multiple releases per openshift-ansible release has been removed due to drastic changes in installer behavior.

