Bug 1841072 - [Kuryr][UPI on OSP] security group fails to delete due to same-name conflict
Summary: [Kuryr][UPI on OSP] security group fails to delete due to same-name conflict
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: OpenShift Container Platform
Classification: Red Hat
Component: Installer
Version: 4.4
Hardware: Unspecified
OS: Unspecified
Priority: high
Severity: high
Target Milestone: ---
Target Release: 4.5.0
Assignee: Maysa Macedo
QA Contact: weiwei jiang
URL:
Whiteboard:
Depends On:
Blocks: 1842378
 
Reported: 2020-05-28 09:44 UTC by weiwei jiang
Modified: 2020-07-13 17:42 UTC (History)
2 users

Fixed In Version:
Doc Type: Bug Fix
Doc Text:
Cause: OpenStack cannot delete a resource by name when more than one resource shares that name. Consequence: The UPI down-security-groups.yaml playbook can fail to complete when more than one security group exists with the same name. Fix: The playbook now uses the unique security group ID when specifying the group to be deleted. Result: All the security groups are deleted and the down-security-groups.yaml playbook finishes successfully.
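The fixed logic can be sketched as two Ansible tasks: list the groups by ID, then delete each ID. This is an illustrative sketch, not the verbatim playbook; the task names mirror the log output above, but the variable names (`cluster_id_tag`, `security_groups`) are assumptions.

```yaml
# Sketch of the fixed down-security-groups.yaml logic (illustrative, not verbatim).
- name: List security groups
  # Collect unique IDs (-c ID) instead of possibly duplicated names (-c Name).
  command:
    cmd: "openstack security group list --tags {{ cluster_id_tag }} -f value -c ID"
  register: security_groups

- name: Remove the cluster security groups
  # Deleting by ID is unambiguous even when several groups share a name.
  command:
    cmd: "openstack security group delete {{ item.1 }}"
  with_indexed_items: "{{ security_groups.stdout_lines }}"
```

The verified run in comment #9 shows exactly this shape: the list task emits `-c ID` and the delete task receives a UUID per loop item.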
Clone Of:
Environment:
Last Closed: 2020-07-13 17:42:23 UTC
Target Upstream Version:


Attachments


Links
System ID Priority Status Summary Last Updated
Github openshift installer pull 3682 None closed Bug 1841072: [UPI] Rely on security group ID when deleting it 2020-07-06 18:36:57 UTC
Red Hat Product Errata RHBA-2020:2409 None None None 2020-07-13 17:42:44 UTC

Description weiwei jiang 2020-05-28 09:44:38 UTC
Description of problem:
When trying to destroy an OCP cluster with the kuryr networkType, the following error occurred:

TASK [Remove the cluster security groups] **************************************
task path: /root/jenkins/workspace/Remove VMs/cucushift/private-openshift-misc/v3-launch-templates/functionality-testing/aos-4_5/hosts/upi_on_openstack-scripts/down-01_security-groups.yaml:18
<localhost> ESTABLISH LOCAL CONNECTION FOR USER: root
<localhost> EXEC /bin/sh -c 'echo ~root && sleep 0'
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir /root/.ansible/tmp/ansible-tmp-1590657557.7999048-800705-130657195955622 && echo ansible-tmp-1590657557.7999048-800705-130657195955622="` echo /root/.ansible/tmp/ansible-tmp-1590657557.7999048-800705-130657195955622 `" ) && sleep 0'
Using module file /usr/lib/python3.6/site-packages/ansible/modules/commands/command.py
<localhost> PUT /root/.ansible/tmp/ansible-local-8006723cyph40k/tmpnl31uh2s TO /root/.ansible/tmp/ansible-tmp-1590657557.7999048-800705-130657195955622/AnsiballZ_command.py
<localhost> EXEC /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1590657557.7999048-800705-130657195955622/ /root/.ansible/tmp/ansible-tmp-1590657557.7999048-800705-130657195955622/AnsiballZ_command.py && sleep 0'
<localhost> EXEC /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1590657557.7999048-800705-130657195955622/AnsiballZ_command.py && sleep 0'
<localhost> EXEC /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1590657557.7999048-800705-130657195955622/ > /dev/null 2>&1 && sleep 0'
<localhost> EXEC /bin/sh -c 'echo ~root && sleep 0'
failed: [localhost] (item=[0, 'openshift-operator-lifecycle-manager/olm-operator-metrics']) => {
    "ansible_loop_var": "item",
    "changed": true,
    "cmd": [
        "openstack",
        "security",
        "group",
        "delete",
        "openshift-operator-lifecycle-manager/olm-operator-metrics"
    ],
    "delta": "0:00:03.575276",
    "end": "2020-05-28 05:19:21.557325",
    "invocation": {
        "module_args": {
            "_raw_params": "openstack security group delete openshift-operator-lifecycle-manager/olm-operator-metrics",
            "_uses_shell": false,
            "argv": null,
            "chdir": null,
            "creates": null,
            "executable": null,
            "removes": null,
            "stdin": null,
            "stdin_add_newline": true,
            "strip_empty_ends": true,
            "warn": true
        }
    },
    "item": [
        0,
        "openshift-operator-lifecycle-manager/olm-operator-metrics"
    ],
    "msg": "non-zero return code",
    "rc": 1,
    "start": "2020-05-28 05:19:17.982049",
    "stderr": "Failed to delete group with name or ID 'openshift-operator-lifecycle-manager/olm-operator-metrics': More than one SecurityGroup exists with the name 'openshift-operator-lifecycle-manager/olm-operator-metrics'.\n1 of 1 groups failed to delete.",
    "stderr_lines": [
        "Failed to delete group with name or ID 'openshift-operator-lifecycle-manager/olm-operator-metrics': More than one SecurityGroup exists with the name 'openshift-operator-lifecycle-manager/olm-operator-metrics'.",
        "1 of 1 groups failed to delete."
    ],
    "stdout": "",
    "stdout_lines": []
}
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir /root/.ansible/tmp/ansible-tmp-1590657561.6021688-800705-55737478790461 && echo ansible-tmp-1590657561.6021688-800705-55737478790461="` echo /root/.ansible/tmp/ansible-tmp-1590657561.6021688-800705-55737478790461 `" ) && sleep 0'
Using module file /usr/lib/python3.6/site-packages/ansible/modules/commands/command.py
<localhost> PUT /root/.ansible/tmp/ansible-local-8006723cyph40k/tmp63yuqg4h TO /root/.ansible/tmp/ansible-tmp-1590657561.6021688-800705-55737478790461/AnsiballZ_command.py
<localhost> EXEC /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1590657561.6021688-800705-55737478790461/ /root/.ansible/tmp/ansible-tmp-1590657561.6021688-800705-55737478790461/AnsiballZ_command.py && sleep 0'
<localhost> EXEC /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1590657561.6021688-800705-55737478790461/AnsiballZ_command.py && sleep 0'
<localhost> EXEC /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1590657561.6021688-800705-55737478790461/ > /dev/null 2>&1 && sleep 0'
failed: [localhost] (item=[1, 'openshift-operator-lifecycle-manager/olm-operator-metrics']) => {
    "ansible_loop_var": "item",
    "changed": true,
    "cmd": [
        "openstack",
        "security",
        "group",
        "delete",
        "openshift-operator-lifecycle-manager/olm-operator-metrics"
    ],
    "delta": "0:00:03.512790",
    "end": "2020-05-28 05:19:25.292774",
    "invocation": {
        "module_args": {
            "_raw_params": "openstack security group delete openshift-operator-lifecycle-manager/olm-operator-metrics",
            "_uses_shell": false,
            "argv": null,
            "chdir": null,
            "creates": null,
            "executable": null,
            "removes": null,
            "stdin": null,
            "stdin_add_newline": true,
            "strip_empty_ends": true,
            "warn": true
        }
    },
    "item": [
        1,
        "openshift-operator-lifecycle-manager/olm-operator-metrics"
    ],
    "msg": "non-zero return code",
    "rc": 1,
    "start": "2020-05-28 05:19:21.779984",
    "stderr": "Failed to delete group with name or ID 'openshift-operator-lifecycle-manager/olm-operator-metrics': More than one SecurityGroup exists with the name 'openshift-operator-lifecycle-manager/olm-operator-metrics'.\n1 of 1 groups failed to delete.",
    "stderr_lines": [
        "Failed to delete group with name or ID 'openshift-operator-lifecycle-manager/olm-operator-metrics': More than one SecurityGroup exists with the name 'openshift-operator-lifecycle-manager/olm-operator-metrics'.",
        "1 of 1 groups failed to delete."
    ],
    "stdout": "",
    "stdout_lines": []
}

Version-Release number of the following components:
4.5.0-0.nightly-2020-05-28-023530

How reproducible:
Not sure

Steps to Reproduce:
1. Install UPI on OSP with Kuryr as the networkType.
2. After the installation succeeds, try to destroy the cluster.
3. Check whether the destroy completes cleanly.

Actual results:
TASK [List security groups] ****************************************************
task path: /root/jenkins/workspace/Remove VMs/cucushift/private-openshift-misc/v3-launch-templates/functionality-testing/aos-4_5/hosts/upi_on_openstack-scripts/down-01_security-groups.yaml:13
<localhost> ESTABLISH LOCAL CONNECTION FOR USER: root
<localhost> EXEC /bin/sh -c 'echo ~root && sleep 0'
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir /root/.ansible/tmp/ansible-tmp-1590657553.613817-800685-75109734100296 && echo ansible-tmp-1590657553.613817-800685-75109734100296="` echo /root/.ansible/tmp/ansible-tmp-1590657553.613817-800685-75109734100296 `" ) && sleep 0'
Using module file /usr/lib/python3.6/site-packages/ansible/modules/commands/command.py
<localhost> PUT /root/.ansible/tmp/ansible-local-8006723cyph40k/tmp83k82a8a TO /root/.ansible/tmp/ansible-tmp-1590657553.613817-800685-75109734100296/AnsiballZ_command.py
<localhost> EXEC /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1590657553.613817-800685-75109734100296/ /root/.ansible/tmp/ansible-tmp-1590657553.613817-800685-75109734100296/AnsiballZ_command.py && sleep 0'
<localhost> EXEC /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1590657553.613817-800685-75109734100296/AnsiballZ_command.py && sleep 0'
<localhost> EXEC /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1590657553.613817-800685-75109734100296/ > /dev/null 2>&1 && sleep 0'
changed: [localhost] => {
    "changed": true,
    "cmd": [
        "openstack",
        "security",
        "group",
        "list",
        "--tags",
        "openshiftClusterID=wj45uos528z-mkgz6",
        "-f",
        "value",
        "-c",
        "Name"
    ],
    "delta": "0:00:03.759990",
    "end": "2020-05-28 05:19:17.725389",
    "invocation": {
        "module_args": {
            "_raw_params": "openstack security group list --tags openshiftClusterID=wj45uos528z-mkgz6 -f value -c Name",
            "_uses_shell": false,
            "argv": null,
            "chdir": null,
            "creates": null,
            "executable": null,
            "removes": null,
            "stdin": null,
            "stdin_add_newline": true,
            "strip_empty_ends": true,
            "warn": true
        }
    },
    "rc": 0,
    "start": "2020-05-28 05:19:13.965399",
    "stderr": "",
    "stderr_lines": [],
    "stdout": "openshift-operator-lifecycle-manager/olm-operator-metrics\nopenshift-operator-lifecycle-manager/olm-operator-metrics",
    "stdout_lines": [
        "openshift-operator-lifecycle-manager/olm-operator-metrics",
        "openshift-operator-lifecycle-manager/olm-operator-metrics"
    ]
}

TASK [Remove the cluster security groups] **************************************
task path: /root/jenkins/workspace/Remove VMs/cucushift/private-openshift-misc/v3-launch-templates/functionality-testing/aos-4_5/hosts/upi_on_openstack-scripts/down-01_security-groups.yaml:18
<localhost> ESTABLISH LOCAL CONNECTION FOR USER: root
<localhost> EXEC /bin/sh -c 'echo ~root && sleep 0'
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir /root/.ansible/tmp/ansible-tmp-1590657557.7999048-800705-130657195955622 && echo ansible-tmp-1590657557.7999048-800705-130657195955622="` echo /root/.ansible/tmp/ansible-tmp-1590657557.7999048-800705-130657195955622 `" ) && sleep 0'
Using module file /usr/lib/python3.6/site-packages/ansible/modules/commands/command.py
<localhost> PUT /root/.ansible/tmp/ansible-local-8006723cyph40k/tmpnl31uh2s TO /root/.ansible/tmp/ansible-tmp-1590657557.7999048-800705-130657195955622/AnsiballZ_command.py
<localhost> EXEC /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1590657557.7999048-800705-130657195955622/ /root/.ansible/tmp/ansible-tmp-1590657557.7999048-800705-130657195955622/AnsiballZ_command.py && sleep 0'
<localhost> EXEC /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1590657557.7999048-800705-130657195955622/AnsiballZ_command.py && sleep 0'
<localhost> EXEC /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1590657557.7999048-800705-130657195955622/ > /dev/null 2>&1 && sleep 0'
<localhost> EXEC /bin/sh -c 'echo ~root && sleep 0'
failed: [localhost] (item=[0, 'openshift-operator-lifecycle-manager/olm-operator-metrics']) => {
    "ansible_loop_var": "item",
    "changed": true,
    "cmd": [
        "openstack",
        "security",
        "group",
        "delete",
        "openshift-operator-lifecycle-manager/olm-operator-metrics"
    ],
    "delta": "0:00:03.575276",
    "end": "2020-05-28 05:19:21.557325",
    "invocation": {
        "module_args": {
            "_raw_params": "openstack security group delete openshift-operator-lifecycle-manager/olm-operator-metrics",
            "_uses_shell": false,
            "argv": null,
            "chdir": null,
            "creates": null,
            "executable": null,
            "removes": null,
            "stdin": null,
            "stdin_add_newline": true,
            "strip_empty_ends": true,
            "warn": true
        }
    },
    "item": [
        0,
        "openshift-operator-lifecycle-manager/olm-operator-metrics"
    ],
    "msg": "non-zero return code",
    "rc": 1,
    "start": "2020-05-28 05:19:17.982049",
    "stderr": "Failed to delete group with name or ID 'openshift-operator-lifecycle-manager/olm-operator-metrics': More than one SecurityGroup exists with the name 'openshift-operator-lifecycle-manager/olm-operator-metrics'.\n1 of 1 groups failed to delete.",
    "stderr_lines": [
        "Failed to delete group with name or ID 'openshift-operator-lifecycle-manager/olm-operator-metrics': More than one SecurityGroup exists with the name 'openshift-operator-lifecycle-manager/olm-operator-metrics'.",
        "1 of 1 groups failed to delete."
    ],
    "stdout": "",
    "stdout_lines": []
}
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir /root/.ansible/tmp/ansible-tmp-1590657561.6021688-800705-55737478790461 && echo ansible-tmp-1590657561.6021688-800705-55737478790461="` echo /root/.ansible/tmp/ansible-tmp-1590657561.6021688-800705-55737478790461 `" ) && sleep 0'
Using module file /usr/lib/python3.6/site-packages/ansible/modules/commands/command.py
<localhost> PUT /root/.ansible/tmp/ansible-local-8006723cyph40k/tmp63yuqg4h TO /root/.ansible/tmp/ansible-tmp-1590657561.6021688-800705-55737478790461/AnsiballZ_command.py
<localhost> EXEC /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1590657561.6021688-800705-55737478790461/ /root/.ansible/tmp/ansible-tmp-1590657561.6021688-800705-55737478790461/AnsiballZ_command.py && sleep 0'
<localhost> EXEC /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1590657561.6021688-800705-55737478790461/AnsiballZ_command.py && sleep 0'
<localhost> EXEC /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1590657561.6021688-800705-55737478790461/ > /dev/null 2>&1 && sleep 0'
failed: [localhost] (item=[1, 'openshift-operator-lifecycle-manager/olm-operator-metrics']) => {
    "ansible_loop_var": "item",
    "changed": true,
    "cmd": [
        "openstack",
        "security",
        "group",
        "delete",
        "openshift-operator-lifecycle-manager/olm-operator-metrics"
    ],
    "delta": "0:00:03.512790",
    "end": "2020-05-28 05:19:25.292774",
    "invocation": {
        "module_args": {
            "_raw_params": "openstack security group delete openshift-operator-lifecycle-manager/olm-operator-metrics",
            "_uses_shell": false,
            "argv": null,
            "chdir": null,
            "creates": null,
            "executable": null,
            "removes": null,
            "stdin": null,
            "stdin_add_newline": true,
            "strip_empty_ends": true,
            "warn": true
        }
    },
    "item": [
        1,
        "openshift-operator-lifecycle-manager/olm-operator-metrics"
    ],
    "msg": "non-zero return code",
    "rc": 1,
    "start": "2020-05-28 05:19:21.779984",
    "stderr": "Failed to delete group with name or ID 'openshift-operator-lifecycle-manager/olm-operator-metrics': More than one SecurityGroup exists with the name 'openshift-operator-lifecycle-manager/olm-operator-metrics'.\n1 of 1 groups failed to delete.",
    "stderr_lines": [
        "Failed to delete group with name or ID 'openshift-operator-lifecycle-manager/olm-operator-metrics': More than one SecurityGroup exists with the name 'openshift-operator-lifecycle-manager/olm-operator-metrics'.",
        "1 of 1 groups failed to delete."
    ],
    "stdout": "",
    "stdout_lines": []
}

Expected results:
The cluster should be destroyed without any errors.

Additional info:
Please attach logs from ansible-playbook with the -vvv flag

Comment 1 Maysa Macedo 2020-05-28 10:51:53 UTC
Hi Weiwei,
Could you tell me whether this is with OSP13 or OSP16, and if OSP16, whether it uses the ovn-octavia provider?
Also, I believe the playbooks being run are outdated: the name is down-01_security-groups.yaml, while the 4.5 upstream playbook is down-security-groups.yaml.

Comment 5 weiwei jiang 2020-05-29 05:39:09 UTC
Also reproduced with  4.4.0-0.nightly-2020-05-28-042707

(shiftstack) [stack@undercloud-0 ~]$ openstack security group list 
+--------------------------------------+------------------------------------------------------------+------------------------+----------------------------------+------------------------------------------+
| ID                                   | Name                                                       | Description            | Project                          | Tags                                     |
+--------------------------------------+------------------------------------------------------------+------------------------+----------------------------------+------------------------------------------+
| 486f25f2-19d7-42ee-8a4c-49723c203b31 | openshift-dns/dns-default                                  |                        | 15580ce70f6a462d84168895c93eba76 | ['openshiftClusterID=wj44uos528y-62nph'] |
| 4e919807-909e-42a1-b03a-e059c8319138 | openshift-operator-lifecycle-manager/packageserver-service |                        | 15580ce70f6a462d84168895c93eba76 | ['openshiftClusterID=wj44uos528y-62nph'] |
| 602ef0ca-202d-4ab3-9030-85958803f24d | openshift-dns/dns-default                                  |                        | 15580ce70f6a462d84168895c93eba76 | ['openshiftClusterID=wj44uos528y-62nph'] |
| 6f22656b-d782-4939-93f8-5f2b045388fd | openshift-operator-lifecycle-manager/packageserver-service |                        | 15580ce70f6a462d84168895c93eba76 | ['openshiftClusterID=wj44uos528y-62nph'] |
| 84ad1d67-9268-4ec2-86b2-8ab74325bcc2 | openshift-operator-lifecycle-manager/packageserver-service |                        | 15580ce70f6a462d84168895c93eba76 | ['openshiftClusterID=wj44uos528y-62nph'] |
| a242ff18-1e74-4fdf-88bf-360a0ffe4e41 | openshift-dns/dns-default                                  |                        | 15580ce70f6a462d84168895c93eba76 | ['openshiftClusterID=wj44uos528y-62nph'] |
| a53651e2-486e-4359-b32c-0718c44c99a1 | default                                                    | Default security group | 15580ce70f6a462d84168895c93eba76 | []                                       |
+--------------------------------------+------------------------------------------------------------+------------------------+----------------------------------+------------------------------------------+

Comment 6 Luis Tomas Bolivar 2020-05-29 06:28:42 UTC
(In reply to weiwei jiang from comment #5)
> Also reproduced with  4.4.0-0.nightly-2020-05-28-042707
> 

This seems to be a duplicate of https://bugzilla.redhat.com/show_bug.cgi?id=1839180 (which is the reason those SGs are there in the first place). However, we should delete the SGs by UUID rather than by name, so let's keep both bugs open.
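The UUID-based deletion Luis describes can be sketched in shell. As a safety measure this sketch only echoes the delete commands rather than running them; the UUIDs are the duplicated groups from the listing in comment #5, used here purely as illustration.

```shell
#!/bin/sh
# Illustrative only: delete security groups by UUID, never by (ambiguous) name.
# These UUIDs are the duplicated openshift-dns/dns-default groups from comment #5.
ids="486f25f2-19d7-42ee-8a4c-49723c203b31
602ef0ca-202d-4ab3-9030-85958803f24d
a242ff18-1e74-4fdf-88bf-360a0ffe4e41"

for id in $ids; do
  # Echo instead of executing, so the commands can be reviewed before running.
  echo "openstack security group delete $id"
done
```

Deleting by name would hit the same "More than one SecurityGroup exists" error shown in the logs above; each UUID, by contrast, identifies exactly one group.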

Comment 9 weiwei jiang 2020-06-02 01:40:47 UTC
Verified with 4.5.0-0.nightly-2020-06-01-043833; moving to VERIFIED.

ansible-playbook 2.8.10
  config file = /etc/ansible/ansible.cfg
  configured module search path = ['/home/jenkins/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /home/jenkins/venv/lib64/python3.6/site-packages/ansible
  executable location = /home/jenkins/venv/bin/ansible-playbook
  python version = 3.6.8 (default, Sep 26 2019, 11:57:09) [GCC 4.8.5 20150623 (Red Hat 4.8.5-39)]
Using /etc/ansible/ansible.cfg as config file
host_list declined parsing /home/jenkins/workspace/Remove VMs/cucushift/workdir/install-dir/inventory.yaml as it did not pass it's verify_file() method
script declined parsing /home/jenkins/workspace/Remove VMs/cucushift/workdir/install-dir/inventory.yaml as it did not pass it's verify_file() method
Parsed /home/jenkins/workspace/Remove VMs/cucushift/workdir/install-dir/inventory.yaml inventory source with yaml plugin

PLAYBOOK: down-01_security-groups.yaml *****************************************
2 plays in /home/jenkins/workspace/Remove VMs/cucushift/private-openshift-misc/v3-launch-templates/functionality-testing/aos-4_5/hosts/upi_on_openstack-scripts/down-01_security-groups.yaml
Read vars_file 'metadata.json'
Read vars_file 'metadata.json'
Read vars_file 'metadata.json'

PLAY [localhost] ***************************************************************
META: ran handlers
Read vars_file 'metadata.json'

TASK [Compute resource names] **************************************************
task path: /home/jenkins/workspace/Remove VMs/cucushift/private-openshift-misc/v3-launch-templates/functionality-testing/aos-4_5/hosts/upi_on_openstack-scripts/common.yaml:8
ok: [localhost] => {
    "ansible_facts": {
        "cluster_id_tag": "openshiftClusterID=wj45uos601a-khm7f",
        "os_bootstrap_ignition": "wj45uos601a-khm7f-bootstrap-ignition.json",
        "os_bootstrap_server_name": "wj45uos601a-khm7f-bootstrap",
        "os_compute_server_name": "wj45uos601a-khm7f-worker",
        "os_compute_trunk_name": "wj45uos601a-khm7f-worker-trunk",
        "os_cp_server_group_name": "wj45uos601a-khm7f-master",
        "os_cp_server_name": "wj45uos601a-khm7f-master",
        "os_cp_trunk_name": "wj45uos601a-khm7f-master-trunk",
        "os_network": "wj45uos601a-khm7f-network",
        "os_port_api": "wj45uos601a-khm7f-api-port",
        "os_port_bootstrap": "wj45uos601a-khm7f-bootstrap-port",
        "os_port_ingress": "wj45uos601a-khm7f-ingress-port",
        "os_port_master": "wj45uos601a-khm7f-master-port",
        "os_port_worker": "wj45uos601a-khm7f-worker-port",
        "os_router": "wj45uos601a-khm7f-external-router",
        "os_sg_master": "wj45uos601a-khm7f-master",
        "os_sg_worker": "wj45uos601a-khm7f-worker",
        "os_subnet": "wj45uos601a-khm7f-nodes",
        "os_svc_network": "wj45uos601a-khm7f-kuryr-service-network",
        "os_svc_subnet": "wj45uos601a-khm7f-kuryr-service-subnet",
        "subnet_pool": "wj45uos601a-khm7f-kuryr-pod-subnetpool"
    },
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY [all] *********************************************************************
META: ran handlers

TASK [List security groups] ****************************************************
task path: /home/jenkins/workspace/Remove VMs/cucushift/private-openshift-misc/v3-launch-templates/functionality-testing/aos-4_5/hosts/upi_on_openstack-scripts/down-01_security-groups.yaml:13
<localhost> ESTABLISH LOCAL CONNECTION FOR USER: default
<localhost> EXEC /bin/sh -c 'echo ~default && sleep 0'
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /home/jenkins/.ansible/tmp/ansible-tmp-1590998784.255829-214004381048309 `" && echo ansible-tmp-1590998784.255829-214004381048309="` echo /home/jenkins/.ansible/tmp/ansible-tmp-1590998784.255829-214004381048309 `" ) && sleep 0'
Using module file /home/jenkins/venv/lib64/python3.6/site-packages/ansible/modules/commands/command.py
<localhost> PUT /home/jenkins/.ansible/tmp/ansible-local-1469slka4mko/tmpl3_byk2u TO /home/jenkins/.ansible/tmp/ansible-tmp-1590998784.255829-214004381048309/AnsiballZ_command.py
<localhost> EXEC /bin/sh -c 'chmod u+x /home/jenkins/.ansible/tmp/ansible-tmp-1590998784.255829-214004381048309/ /home/jenkins/.ansible/tmp/ansible-tmp-1590998784.255829-214004381048309/AnsiballZ_command.py && sleep 0'
<localhost> EXEC /bin/sh -c '/home/jenkins/venv/bin/python3 /home/jenkins/.ansible/tmp/ansible-tmp-1590998784.255829-214004381048309/AnsiballZ_command.py && sleep 0'
<localhost> EXEC /bin/sh -c 'rm -f -r /home/jenkins/.ansible/tmp/ansible-tmp-1590998784.255829-214004381048309/ > /dev/null 2>&1 && sleep 0'
changed: [localhost] => {
    "changed": true,
    "cmd": [
        "openstack",
        "security",
        "group",
        "list",
        "--tags",
        "openshiftClusterID=wj45uos601a-khm7f",
        "-f",
        "value",
        "-c",
        "ID"
    ],
    "delta": "0:00:03.398370",
    "end": "2020-06-01 08:06:28.079851",
    "invocation": {
        "module_args": {
            "_raw_params": "openstack security group list --tags openshiftClusterID=wj45uos601a-khm7f -f value -c ID",
            "_uses_shell": false,
            "argv": null,
            "chdir": null,
            "creates": null,
            "executable": null,
            "removes": null,
            "stdin": null,
            "stdin_add_newline": true,
            "strip_empty_ends": true,
            "warn": true
        }
    },
    "rc": 0,
    "start": "2020-06-01 08:06:24.681481",
    "stderr": "",
    "stderr_lines": [],
    "stdout": "07e86af4-b946-4ff3-b97b-4583f1e27435\n52eef20b-87be-455d-9896-1ae11f87ed2e\nb62f3afb-3c17-4000-8d6e-3f3ee72628eb",
    "stdout_lines": [
        "07e86af4-b946-4ff3-b97b-4583f1e27435",
        "52eef20b-87be-455d-9896-1ae11f87ed2e",
        "b62f3afb-3c17-4000-8d6e-3f3ee72628eb"
    ]
}

TASK [Remove the cluster security groups] **************************************
task path: /home/jenkins/workspace/Remove VMs/cucushift/private-openshift-misc/v3-launch-templates/functionality-testing/aos-4_5/hosts/upi_on_openstack-scripts/down-01_security-groups.yaml:18
<localhost> ESTABLISH LOCAL CONNECTION FOR USER: default
<localhost> EXEC /bin/sh -c 'echo ~default && sleep 0'
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /home/jenkins/.ansible/tmp/ansible-tmp-1590998788.1467671-40313162048972 `" && echo ansible-tmp-1590998788.1467671-40313162048972="` echo /home/jenkins/.ansible/tmp/ansible-tmp-1590998788.1467671-40313162048972 `" ) && sleep 0'
Using module file /home/jenkins/venv/lib64/python3.6/site-packages/ansible/modules/commands/command.py
<localhost> PUT /home/jenkins/.ansible/tmp/ansible-local-1469slka4mko/tmp16bey5g9 TO /home/jenkins/.ansible/tmp/ansible-tmp-1590998788.1467671-40313162048972/AnsiballZ_command.py
<localhost> EXEC /bin/sh -c 'chmod u+x /home/jenkins/.ansible/tmp/ansible-tmp-1590998788.1467671-40313162048972/ /home/jenkins/.ansible/tmp/ansible-tmp-1590998788.1467671-40313162048972/AnsiballZ_command.py && sleep 0'
<localhost> EXEC /bin/sh -c '/home/jenkins/venv/bin/python3 /home/jenkins/.ansible/tmp/ansible-tmp-1590998788.1467671-40313162048972/AnsiballZ_command.py && sleep 0'
<localhost> EXEC /bin/sh -c 'rm -f -r /home/jenkins/.ansible/tmp/ansible-tmp-1590998788.1467671-40313162048972/ > /dev/null 2>&1 && sleep 0'
changed: [localhost] => (item=[0, '07e86af4-b946-4ff3-b97b-4583f1e27435']) => {
    "ansible_loop_var": "item",
    "changed": true,
    "cmd": [
        "openstack",
        "security",
        "group",
        "delete",
        "07e86af4-b946-4ff3-b97b-4583f1e27435"
    ],
    "delta": "0:00:03.707514",
    "end": "2020-06-01 08:06:32.057192",
    "invocation": {
        "module_args": {
            "_raw_params": "openstack security group delete 07e86af4-b946-4ff3-b97b-4583f1e27435",
            "_uses_shell": false,
            "argv": null,
            "chdir": null,
            "creates": null,
            "executable": null,
            "removes": null,
            "stdin": null,
            "stdin_add_newline": true,
            "strip_empty_ends": true,
            "warn": true
        }
    },
    "item": [
        0,
        "07e86af4-b946-4ff3-b97b-4583f1e27435"
    ],
    "rc": 0,
    "start": "2020-06-01 08:06:28.349678",
    "stderr": "",
    "stderr_lines": [],
    "stdout": "",
    "stdout_lines": []
}
<localhost> EXEC /bin/sh -c 'echo ~default && sleep 0'
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /home/jenkins/.ansible/tmp/ansible-tmp-1590998792.1007419-161347845822234 `" && echo ansible-tmp-1590998792.1007419-161347845822234="` echo /home/jenkins/.ansible/tmp/ansible-tmp-1590998792.1007419-161347845822234 `" ) && sleep 0'
Using module file /home/jenkins/venv/lib64/python3.6/site-packages/ansible/modules/commands/command.py
<localhost> PUT /home/jenkins/.ansible/tmp/ansible-local-1469slka4mko/tmp9jo7u902 TO /home/jenkins/.ansible/tmp/ansible-tmp-1590998792.1007419-161347845822234/AnsiballZ_command.py
<localhost> EXEC /bin/sh -c 'chmod u+x /home/jenkins/.ansible/tmp/ansible-tmp-1590998792.1007419-161347845822234/ /home/jenkins/.ansible/tmp/ansible-tmp-1590998792.1007419-161347845822234/AnsiballZ_command.py && sleep 0'
<localhost> EXEC /bin/sh -c '/home/jenkins/venv/bin/python3 /home/jenkins/.ansible/tmp/ansible-tmp-1590998792.1007419-161347845822234/AnsiballZ_command.py && sleep 0'
<localhost> EXEC /bin/sh -c 'rm -f -r /home/jenkins/.ansible/tmp/ansible-tmp-1590998792.1007419-161347845822234/ > /dev/null 2>&1 && sleep 0'
changed: [localhost] => (item=[1, '52eef20b-87be-455d-9896-1ae11f87ed2e']) => {
    "ansible_loop_var": "item",
    "changed": true,
    "cmd": [
        "openstack",
        "security",
        "group",
        "delete",
        "52eef20b-87be-455d-9896-1ae11f87ed2e"
    ],
    "delta": "0:00:03.334130",
    "end": "2020-06-01 08:06:35.628018",
    "invocation": {
        "module_args": {
            "_raw_params": "openstack security group delete 52eef20b-87be-455d-9896-1ae11f87ed2e",
            "_uses_shell": false,
            "argv": null,
            "chdir": null,
            "creates": null,
            "executable": null,
            "removes": null,
            "stdin": null,
            "stdin_add_newline": true,
            "strip_empty_ends": true,
            "warn": true
        }
    },
    "item": [
        1,
        "52eef20b-87be-455d-9896-1ae11f87ed2e"
    ],
    "rc": 0,
    "start": "2020-06-01 08:06:32.293888",
    "stderr": "",
    "stderr_lines": [],
    "stdout": "",
    "stdout_lines": []
}
<localhost> EXEC /bin/sh -c 'echo ~default && sleep 0'
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /home/jenkins/.ansible/tmp/ansible-tmp-1590998795.666621-54872755158345 `" && echo ansible-tmp-1590998795.666621-54872755158345="` echo /home/jenkins/.ansible/tmp/ansible-tmp-1590998795.666621-54872755158345 `" ) && sleep 0'
Using module file /home/jenkins/venv/lib64/python3.6/site-packages/ansible/modules/commands/command.py
<localhost> PUT /home/jenkins/.ansible/tmp/ansible-local-1469slka4mko/tmphmapbxva TO /home/jenkins/.ansible/tmp/ansible-tmp-1590998795.666621-54872755158345/AnsiballZ_command.py
<localhost> EXEC /bin/sh -c 'chmod u+x /home/jenkins/.ansible/tmp/ansible-tmp-1590998795.666621-54872755158345/ /home/jenkins/.ansible/tmp/ansible-tmp-1590998795.666621-54872755158345/AnsiballZ_command.py && sleep 0'
<localhost> EXEC /bin/sh -c '/home/jenkins/venv/bin/python3 /home/jenkins/.ansible/tmp/ansible-tmp-1590998795.666621-54872755158345/AnsiballZ_command.py && sleep 0'
<localhost> EXEC /bin/sh -c 'rm -f -r /home/jenkins/.ansible/tmp/ansible-tmp-1590998795.666621-54872755158345/ > /dev/null 2>&1 && sleep 0'
changed: [localhost] => (item=[2, 'b62f3afb-3c17-4000-8d6e-3f3ee72628eb']) => {
    "ansible_loop_var": "item",
    "changed": true,
    "cmd": [
        "openstack",
        "security",
        "group",
        "delete",
        "b62f3afb-3c17-4000-8d6e-3f3ee72628eb"
    ],
    "delta": "0:00:03.677519",
    "end": "2020-06-01 08:06:39.552364",
    "invocation": {
        "module_args": {
            "_raw_params": "openstack security group delete b62f3afb-3c17-4000-8d6e-3f3ee72628eb",
            "_uses_shell": false,
            "argv": null,
            "chdir": null,
            "creates": null,
            "executable": null,
            "removes": null,
            "stdin": null,
            "stdin_add_newline": true,
            "strip_empty_ends": true,
            "warn": true
        }
    },
    "item": [
        2,
        "b62f3afb-3c17-4000-8d6e-3f3ee72628eb"
    ],
    "rc": 0,
    "start": "2020-06-01 08:06:35.874845",
    "stderr": "",
    "stderr_lines": [],
    "stdout": "",
    "stdout_lines": []
}
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
localhost                  : ok=3    changed=2    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0

Comment 10 errata-xmlrpc 2020-07-13 17:42:23 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory, and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHBA-2020:2409

