Bug 1821653 - [UPI] Wrong attempt to delete LB with same name and with status PENDING
Summary: [UPI] Wrong attempt to delete LB with same name and with status PENDING
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: OpenShift Container Platform
Classification: Red Hat
Component: Installer
Version: 4.5
Hardware: Unspecified
OS: Unspecified
Priority: high
Severity: medium
Target Milestone: ---
Target Release: 4.5.0
Assignee: Maysa Macedo
QA Contact: weiwei jiang
URL:
Whiteboard:
Depends On:
Blocks: 1822379
 
Reported: 2020-04-07 10:43 UTC by Maysa Macedo
Modified: 2020-07-13 17:26 UTC

Fixed In Version:
Doc Type: Bug Fix
Doc Text:
Cause: Octavia does not allow deletion of load balancers with provisioning status PENDING-*, and an OpenStack resource cannot be removed by name when more than one resource shares that name.
Consequence: The down-load-balancers.yaml playbook fails to delete all Octavia resources when load balancers have provisioning status PENDING-* or when more than one load balancer has the same name.
Fix: Deletion of load balancers with provisioning_status PENDING-* is skipped; the down-load-balancers.yaml playbook must be re-run once those load balancers have transitioned to `ACTIVE`. Additionally, the unique load balancer ID is used instead of the name when specifying the resource to be deleted.
Result: All load balancers are deleted.
Clone Of:
: 1822379 (view as bug list)
Environment:
Last Closed: 2020-07-13 17:25:52 UTC
Target Upstream Version:




Links
System ID Private Priority Status Summary Last Updated
Github openshift installer pull 3419 0 None closed Bug 1821653: Fix LB deletion for lbs with same name or status pending 2021-02-02 17:42:29 UTC
Red Hat Product Errata RHBA-2020:2409 0 None None None 2020-07-13 17:26:05 UTC

Description Maysa Macedo 2020-04-07 10:43:19 UTC
Description of problem:

When destroying a UPI cluster with Kuryr, load balancers may exist with duplicate names or with provisioning_status 'PENDING-*'. Both scenarios break load balancer deletion: Octavia rejects deleting a load balancer while it is in a PENDING-* state, and a resource cannot be removed by name when that name is not unique.
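A minimal sketch of the corrected selection logic described above, assuming load balancers are plain dicts shaped like the items in the playbook output later in this bug. The function name and the sample data are illustrative, not the playbook's actual module API:

```python
# Sketch: pick load balancers that are safe to delete, assuming each LB
# is a dict like the Ansible loop items shown in comment 4.
# select_deletable and the sample data are hypothetical illustrations.

def select_deletable(lbs, cluster_tag):
    """Return the IDs of load balancers belonging to the cluster that are
    not in a PENDING-* provisioning state (those must be retried later)."""
    return [
        lb["id"]                                   # delete by unique ID, never by name
        for lb in lbs
        if cluster_tag in lb.get("description", "")
        and not lb["provisioning_status"].startswith("PENDING")
    ]

lbs = [
    {"id": "9debd966", "name": "lb-a", "provisioning_status": "PENDING_CREATE",
     "description": "openshiftClusterID=demo"},
    {"id": "5f384f81", "name": "lb-a", "provisioning_status": "ACTIVE",
     "description": "openshiftClusterID=demo"},   # duplicate name, but ACTIVE
]

print(select_deletable(lbs, "openshiftClusterID=demo"))  # → ['5f384f81']
```

Deleting by ID sidesteps the duplicate-name problem entirely, while the PENDING-* filter defers the others to a later playbook run.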


Version-Release number of the following components:
rpm -q openshift-ansible
rpm -q ansible
ansible --version

How reproducible:

Steps to Reproduce:
1.
2.
3.

Actual results:
Please include the entire output from the last TASK line through the end of output if an error is generated

Expected results:

Additional info:
Please attach logs from ansible-playbook with the -vvv flag

Comment 3 weiwei jiang 2020-04-15 04:50:29 UTC
Blocked by https://bugzilla.redhat.com/show_bug.cgi?id=1823631

Comment 4 weiwei jiang 2020-04-24 09:35:37 UTC
Verified with 4.5.0-0.nightly-2020-04-21-103613.

TASK [Remove the cluster load balancers] ***************************************
task path: /root/jenkins/workspace/Remove VMs@2/cucushift/private-openshift-misc/v3-launch-templates/functionality-testing/aos-4_5/hosts/upi_on_openstack-scripts/down-03_load-balancers.yaml:69
skipping: [localhost] => (item={'provider': 'octavia', 'description': 'openshiftClusterID=wj45krr423c-9k64h', 'admin_state_up': True, 'pools': [], 'created_at': '2020-04-24T09:33:47', 'provisioning_status': 'PENDING_CREATE', 'updated_at': None, 'vip_qos_policy_id': None, 'vip_network_id': 'db9a5175-20ee-4596-a234-ebd78f0f902f', 'listeners': [], 'vip_port_id': 'c58e0ea3-9dd4-448f-9770-dcbe2c14a585', 'flavor_id': '', 'vip_address': '172.31.0.2', 'vip_subnet_id': 'f5dd54da-446c-49c7-9940-401b95fbfa01', 'project_id': '2e21d7246c8f4dbba4881c386e018ba5', 'id': '9debd966-bc0e-4d05-8ce6-1fca1feb40e0', 'operating_status': 'OFFLINE', 'name': 'awjiang-test-5'})  => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": {
        "admin_state_up": true,
        "created_at": "2020-04-24T09:33:47",
        "description": "openshiftClusterID=wj45krr423c-9k64h",
        "flavor_id": "",
        "id": "9debd966-bc0e-4d05-8ce6-1fca1feb40e0",
        "listeners": [],
        "name": "awjiang-test-5",
        "operating_status": "OFFLINE",
        "pools": [],
        "project_id": "2e21d7246c8f4dbba4881c386e018ba5",
        "provider": "octavia",
        "provisioning_status": "PENDING_CREATE",
        "updated_at": null,
        "vip_address": "172.31.0.2",
        "vip_network_id": "db9a5175-20ee-4596-a234-ebd78f0f902f",
        "vip_port_id": "c58e0ea3-9dd4-448f-9770-dcbe2c14a585",
        "vip_qos_policy_id": null,
        "vip_subnet_id": "f5dd54da-446c-49c7-9940-401b95fbfa01"
    },
    "skip_reason": "Conditional result was False"
}
skipping: [localhost] => (item={'provider': 'octavia', 'description': 'openshiftClusterID=wj45krr423c-9k64h', 'admin_state_up': True, 'pools': [], 'created_at': '2020-04-24T09:33:50', 'provisioning_status': 'PENDING_CREATE', 'updated_at': None, 'vip_qos_policy_id': None, 'vip_network_id': 'db9a5175-20ee-4596-a234-ebd78f0f902f', 'listeners': [], 'vip_port_id': 'aea35692-f1ce-4fc0-8df5-4189f166d802', 'flavor_id': '', 'vip_address': '172.31.0.8', 'vip_subnet_id': 'f5dd54da-446c-49c7-9940-401b95fbfa01', 'project_id': '2e21d7246c8f4dbba4881c386e018ba5', 'id': '5f384f81-2d41-427f-9d08-19aa085fa3bc', 'operating_status': 'OFFLINE', 'name': 'awjiang-test-3'})  => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": {
        "admin_state_up": true,
        "created_at": "2020-04-24T09:33:50",
        "description": "openshiftClusterID=wj45krr423c-9k64h",
        "flavor_id": "",
        "id": "5f384f81-2d41-427f-9d08-19aa085fa3bc",
        "listeners": [],
        "name": "awjiang-test-3",
        "operating_status": "OFFLINE",
        "pools": [],
        "project_id": "2e21d7246c8f4dbba4881c386e018ba5",
        "provider": "octavia",
        "provisioning_status": "PENDING_CREATE",
        "updated_at": null,
        "vip_address": "172.31.0.8",
        "vip_network_id": "db9a5175-20ee-4596-a234-ebd78f0f902f",
        "vip_port_id": "aea35692-f1ce-4fc0-8df5-4189f166d802",
        "vip_qos_policy_id": null,
        "vip_subnet_id": "f5dd54da-446c-49c7-9940-401b95fbfa01"
    },
    "skip_reason": "Conditional result was False"
}
META: ran handlers
META: ran handlers
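The "Conditional result was False" skips above are the expected post-fix behavior: both load balancers are in PENDING_CREATE, so the delete task passes over them. A minimal sketch of that condition, assuming each loop item is a dict like the ones Ansible printed (the helper name is illustrative, not the playbook's actual `when` expression):

```python
# Sketch of the skip condition reflected in the task output above.
# deletable() is a hypothetical stand-in for the playbook's conditional.

def deletable(item):
    """True only when the load balancer can be deleted right now."""
    return not item["provisioning_status"].startswith("PENDING")

items = [
    {"name": "awjiang-test-5", "provisioning_status": "PENDING_CREATE"},
    {"name": "awjiang-test-3", "provisioning_status": "PENDING_CREATE"},
]

# Both items fail the condition, so the task skips them; the playbook
# is re-run once they transition to ACTIVE.
print([deletable(i) for i in items])  # → [False, False]
```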

Comment 5 errata-xmlrpc 2020-07-13 17:25:52 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory, and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHBA-2020:2409

