Bug 1638816 - playbooks/ovirt/openshift-cluster/ovirt-vm-infra.yml fails with "The conditional check 'ip_cond' failed..."
Summary: playbooks/ovirt/openshift-cluster/ovirt-vm-infra.yml fails with "The conditi...
Keywords:
Status: CLOSED CURRENTRELEASE
Alias: None
Product: Red Hat Enterprise Virtualization Manager
Classification: Red Hat
Component: ovirt-openshift-extensions
Version: 4.3.0
Hardware: Unspecified
OS: Unspecified
Priority: unspecified
Severity: high
Target Milestone: ovirt-4.3.0
Assignee: Roy Golan
QA Contact: Jan Zmeskal
URL: https://github.com/oVirt/ovirt-ansibl...
Whiteboard:
Depends On: 1629601 1639167
Blocks:
 
Reported: 2018-10-12 13:53 UTC by Jiri Belka
Modified: 2019-05-31 08:49 UTC
CC List: 6 users

Fixed In Version:
Doc Type: No Doc Update
Doc Text:
Clone Of:
Environment:
Last Closed: 2019-05-31 08:49:39 UTC
oVirt Team: Infra
Target Upstream Version:
Embargoed:


Links
System ID Private Priority Status Summary Last Updated
Github https://github.com/oVirt/ovirt-openshift-extensions/issues/73 0 None None None 2018-11-01 09:00:27 UTC
Red Hat Bugzilla 1639167 0 unspecified CLOSED 'VmsModule' object has no attribute '_get_minor' 2021-02-22 00:41:40 UTC

Internal Links: 1639167

Description Jiri Belka 2018-10-12 13:53:09 UTC
Description of problem:

I tried to deploy OCP on RHV/oVirt. The deployment itself works, but the playbook fails with the error below; this bug tracks that issue for OCP on RHV/oVirt:

---%>---
TASK [oVirt.vm-infra : Wait for VMs IP] ****************************************************************************************************************************************************************************************************************************************
task path: /usr/share/ansible/roles/oVirt.vm-infra/tasks/vm_state_present.yml:107
<127.0.0.1> ESTABLISH LOCAL CONNECTION FOR USER: root
<127.0.0.1> EXEC /bin/sh -c 'echo ~root && sleep 0'
<127.0.0.1> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp/ansible-tmp-1539350992.74-143349631837496 `" && echo ansible-tmp-1539350992.74-143349631837496="` echo /root/.ansible/tmp/ansible-tmp-1539350992.74-143349631837496 `" ) && sleep 0'
Using module file /usr/lib/python2.7/site-packages/ansible/modules/cloud/ovirt/ovirt_vms_facts.py
<127.0.0.1> PUT /root/.ansible/tmp/ansible-local-19806gXM1aq/tmpfC7YxR TO /root/.ansible/tmp/ansible-tmp-1539350992.74-143349631837496/ovirt_vms_facts.py
<127.0.0.1> EXEC /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1539350992.74-143349631837496/ /root/.ansible/tmp/ansible-tmp-1539350992.74-143349631837496/ovirt_vms_facts.py && sleep 0'
<127.0.0.1> EXEC /bin/sh -c '/usr/bin/python2 /root/.ansible/tmp/ansible-tmp-1539350992.74-143349631837496/ovirt_vms_facts.py && sleep 0'
<127.0.0.1> EXEC /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1539350992.74-143349631837496/ > /dev/null 2>&1 && sleep 0'
fatal: [localhost]: FAILED! => {
    "msg": "The conditional check 'ip_cond' failed. The error was: template error while templating string: no filter named 'ovirtvmipv4'. String: {% if ovirt_vms | ovirtvmipv4 | length > 0 %} True {% else %} False {% endif %}"
}

TASK [oVirt.vm-infra : Logout from oVirt] **************************************************************************************************************************************************************************************************************************************
task path: /usr/share/ansible/roles/oVirt.vm-infra/tasks/main.yml:40
skipping: [localhost] => {
    "changed": false, 
    "skip_reason": "Conditional result was False"
}
        to retry, use: --limit @/usr/share/ansible/openshift-ansible/playbooks/ovirt/openshift-cluster/ovirt-vm-infra.retry

PLAY RECAP *********************************************************************************************************************************************************************************************************************************************************************
localhost                  : ok=81   changed=5    unreachable=0    failed=1   
---%<---

According to omachace@, this happens because "it cannot find filter ovirtvmipv4".
See https://github.com/oVirt/ovirt-ansible-vm-infra/commit/b00367e07837c5637c6c4f483755f6331fa26576
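
For context, ovirtvmipv4 is not a built-in Jinja2 filter; it is a custom Ansible filter plugin that the oVirt.vm-infra role expects to find on the control node, and the "Wait for VMs IP" task cannot even template its ip_cond conditional without it. The sketch below is not the role's actual code (the function body and the assumed facts layout with reported_devices/ips are illustrative only); it just shows how such a plugin is wired up and why a role build that does not ship it fails with "no filter named 'ovirtvmipv4'":

# Minimal, hypothetical sketch of an Ansible filter plugin exposing an
# 'ovirtvmipv4'-style filter (NOT the actual ovirt-ansible-vm-infra code).
# Ansible only picks it up if a file like this ships in the role's
# filter_plugins/ directory; without it, Jinja2 cannot resolve the name.

def ovirtvmipv4(vms):
    """Return the IPv4 addresses reported for a list of VM facts, e.g. the
    ovirt_vms list returned by ovirt_vms_facts (assumed facts layout)."""
    ips = []
    for vm in vms or []:
        for device in vm.get('reported_devices', []):
            for ip in device.get('ips', []):
                if ip.get('version') == 'v4':
                    ips.append(ip.get('address'))
    return ips

class FilterModule(object):
    """Entry point Ansible looks for in filter plugin files."""
    def filters(self):
        return {'ovirtvmipv4': ovirtvmipv4}

With a filter of that name registered, the ip_cond expression from the error message simply checks whether at least one VM has reported an IPv4 address yet, which is what the "Wait for VMs IP" retry loop polls for.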

Version-Release number of selected component (if applicable):
openshift-ansible-3.11.16-1.git.0.4ac6f81.el7.noarch
ovirt-ansible-image-template-1.1.7-1.el7ev.noarch
ovirt-ansible-vm-infra-1.1.9-1.el7ev.noarch


How reproducible:
100%

Steps to Reproduce:
1. Run playbooks/ovirt/openshift-cluster/ovirt-vm-infra.yml with the 4.2.6 oVirt Ansible RPMs

Actual results:
The play creates the VMs but fails at the end.

Expected results:
The playbook should finish without errors.

Additional info:
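A small standalone repro of the templating failure (a sketch only; the dummy filter body is an assumption, not the role's code) shows that Jinja2 refuses to even compile ip_cond until a filter with that name is registered, which is exactly what a new enough role build provides:

from jinja2 import Environment, TemplateAssertionError

env = Environment()
ip_cond = ("{% if ovirt_vms | ovirtvmipv4 | length > 0 %} True "
           "{% else %} False {% endif %}")

try:
    # Without the filter registered, Jinja2 rejects the expression at
    # compile time - the same failure the play hits while evaluating ip_cond.
    env.from_string(ip_cond)
except TemplateAssertionError as exc:
    print(exc)  # no filter named 'ovirtvmipv4'

# Registering a filter under that name (effectively what installing a role
# RPM that ships the plugin does for Ansible) lets the same template compile
# and render. Dummy implementation for the demo only:
env.filters['ovirtvmipv4'] = lambda vms: [
    ip.get('address')
    for vm in (vms or [])
    for dev in vm.get('reported_devices', [])
    for ip in dev.get('ips', [])
    if ip.get('version') == 'v4'
]
print(env.from_string(ip_cond).render(ovirt_vms=[]).strip())  # -> False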

Comment 1 Jiri Belka 2018-10-15 08:23:18 UTC
Even with updated RPMs of the oVirt Ansible roles, the playbook still fails, this time with a different error, because of BZ1638816 (broken Ansible 2.6 compatibility).

Comment 2 Jiri Belka 2018-10-15 10:09:21 UTC
Workaround: use oVirt Ansible RPMs newer than 1.1.9 together with Ansible 2.7 on the bastion host; with these, the playbooks/ovirt/openshift-cluster/ovirt-vm-infra.yml run finishes successfully. After that, I would recommend downgrading Ansible back to 2.6, as this is the version currently described in the official OCP docs.

Comment 3 Ondra Machacek 2018-10-16 12:03:23 UTC
Should work OK with ovirt-ansible-vm-infra-1.1.11, and any Ansible version >= 2.5. ovirt-ansible-vm-infra-1.1.11 will be released as part of 4.2.7.

Comment 4 Roy Golan 2018-11-01 08:56:57 UTC
The ovirt-openshift-extensions-ci container image installs the oVirt 4.2 release RPM, but for some reason it is not picking up ovirt-ansible-vm-infra 1.1.11.

Comment 5 Roy Golan 2018-11-01 09:00:27 UTC
Obviously the RPM was updated, but the container image was never rebuilt after the update. I'm spinning up a new build of the container.

https://github.com/oVirt/ovirt-openshift-extensions/issues/73

Comment 8 Jan Zmeskal 2019-04-10 14:48:00 UTC
This was a regression in the oVirt.vm-infra role with Ansible 2.7 and the oVirt SDK 4.2. We now use Ansible 2.7 with the oVirt SDK 4.3, so the issue is no longer present.

