Bug 1464000 - [3.5]Failed to add a specified version's node to an existed containerized cluster
Status: CLOSED NOTABUG
Product: OpenShift Container Platform
Classification: Red Hat
Component: Installer
Version: 3.5.1
Hardware: Unspecified
OS: Unspecified
Priority: high
Severity: high
Assigned To: Scott Dodson
QA Contact: Johnny Liu
Depends On:
Blocks:
Reported: 2017-06-22 04:59 EDT by liujia
Modified: 2017-06-22 05:03 EDT (History)
4 users

See Also:
Fixed In Version:
Doc Type: If docs needed, set a value
Doc Text:
Story Points: ---
Clone Of:
Environment:
Last Closed: 2017-06-22 05:03:25 EDT
Type: Bug
Regression: ---
Mount Type: ---
Documentation: ---
CRM:
Verified Versions:
Category: ---
oVirt Team: ---
RHEL 7.3 requirements from Atomic Host:
Cloudforms Team: ---


Attachments: None
Description liujia 2017-06-22 04:59:09 EDT
Description of problem:
Failed to add a node with a specified version to an existing containerized cluster: the openshift_version role set the wrong OpenShift version, and the openshift_image_tag defined in the hosts file was not honored.

TASK [openshift_version : Set precise containerized version to configure if openshift_release specified] ***
changed: [x.x.x.x] => {
    "changed": true,
    "cmd": [
        "docker",
        "run",
        "--rm",
        "openshift3/ose:v3.5",
        "version"
    ],
    "delta": "0:00:01.270633",
    "end": "2017-06-21 23:24:18.446576",
    "invocation": {
        "module_args": {
            "_raw_params": "docker run --rm openshift3/ose:v3.5 version",
            "_uses_shell": false,
            "chdir": null,
            "creates": null,
            "executable": null,
            "removes": null,
            "warn": true
        },
        "module_name": "command"
    },
    "rc": 0,
    "start": "2017-06-21 23:24:17.175943",
    "warnings": []
}

STDOUT:

openshift v3.5.5.28
kubernetes v1.5.2+43a9be4
etcd 3.1.0
...
....
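The task above shows what goes wrong: instead of honoring `openshift_image_tag`, the installer runs `docker run --rm openshift3/ose:v3.5 version` and derives the precise version from the image's output. A minimal sketch of that derivation step (the function name and regex are assumptions for illustration, not the installer's actual code):

```python
import re

def parse_openshift_version(stdout):
    """Extract the precise version (e.g. '3.5.5.28') from the
    'openshift vX.Y.Z...' line of `openshift version` output."""
    for line in stdout.splitlines():
        m = re.match(r"openshift v([\d.]+)", line)
        if m:
            return m.group(1)
    return None

# Given the STDOUT captured in the task above, the derived version is
# '3.5.5.28', which is then used to build the excluder package name
# 'atomic-openshift-docker-excluder-3.5.5.28*'.
stdout = "openshift v3.5.5.28\nkubernetes v1.5.2+43a9be4\netcd 3.1.0"
print(parse_openshift_version(stdout))
```

Because the v3.5 floating tag resolves to 3.5.5.28, the subsequent excluder install looks for a 3.5.5.28 package that the configured 3.5.5.27 repos cannot provide.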

TASK [openshift_excluder : Install docker excluder] ****************************
fatal: [x.x.x.x]: FAILED! => {
    "changed": false,
    "failed": true,
    "invocation": {
        "module_args": {
            "conf_file": null,
            "disable_gpg_check": false,
            "disablerepo": null,
            "enablerepo": null,
            "exclude": null,
            "install_repoquery": true,
            "list": null,
            "name": [
                "atomic-openshift-docker-excluder-3.5.5.28*"
            ],
            "state": "present",
            "update_cache": false,
            "validate_certs": true
        }
    },
    "rc": 126,
    "results": [
        "No package matching 'atomic-openshift-docker-excluder-3.5.5.28*' found available, installed or updated"
    ]
}

MSG:

No package matching 'atomic-openshift-docker-excluder-3.5.5.28*' found available, installed or updated



Version-Release number of selected component (if applicable):
atomic-openshift-utils-3.5.83-1.git.0.2ed3a4d.el7.noarch

How reproducible:
always

Steps to Reproduce:
1. Perform a containerized install of an OCP 3.5 HA environment (v3.5.5.27); only the OpenShift v3.5.5.27 repo is configured on these nodes (no 3.5.5.28 repo).
2. Edit the inventory file to add:
openshift_image_tag=v3.5.5.27
3. Add a node to the above cluster:
# ansible-playbook -i hosts-1498011002 /usr/share/ansible/openshift-ansible/playbooks/byo/openshift-node/scaleup.yml
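For reference, a minimal scaleup inventory fragment of the kind edited in step 2 might look like the following (host names, group layout, and the other variables are illustrative assumptions, not taken from the actual hosts-1498011002 file):

```ini
[OSEv3:children]
masters
nodes
new_nodes

[OSEv3:vars]
ansible_user=root
containerized=true
openshift_release=v3.5
# Pin the precise image tag the new node should run:
openshift_image_tag=v3.5.5.27

[new_nodes]
node2.example.com
```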

Actual results:
Scaleup failed.

Expected results:
The node is added to the cluster successfully, and the OpenShift version of the new node matches the `openshift_image_tag` defined in the inventory hosts file.

Additional info:
Comment 1 liujia 2017-06-22 05:03:25 EDT
A wrong inventory file was used; this is not a bug.
