Bug 1799393 - LB in error status continuously created without deleting previous ones
Summary: LB in error status continuously created without deleting previous ones
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: OpenShift Container Platform
Classification: Red Hat
Component: Networking
Version: 4.4
Hardware: Unspecified
OS: Unspecified
Priority: unspecified
Severity: medium
Target Milestone: ---
Target Release: 4.3.z
Assignee: Maysa Macedo
QA Contact: Jon Uriarte
URL:
Whiteboard:
Depends On: 1799178
Blocks:
 
Reported: 2020-02-06 16:56 UTC by Maysa Macedo
Modified: 2020-03-10 23:53 UTC
CC: 4 users

Fixed In Version:
Doc Type: If docs needed, set a value
Doc Text:
Clone Of: 1799178
Environment:
Last Closed: 2020-03-10 23:53:27 UTC
Target Upstream Version:
Embargoed:




Links
System ID Private Priority Status Summary Last Updated
Github openshift kuryr-kubernetes pull 160 0 None closed [release-4.3] Bug 1799393: Ensure LB with error status is only recreated after deleted 2020-03-04 21:13:15 UTC
Red Hat Product Errata RHBA-2020:0676 0 None None None 2020-03-10 23:53:43 UTC

Description Maysa Macedo 2020-02-06 16:56:21 UTC
+++ This bug was initially created as a clone of Bug #1799178 +++

Description of problem:

As soon as a load balancer creation is triggered, its status is
"PENDING_CREATE" and the creation of the other LB resources
(listeners, pools and members) is triggered as well. If in the
meantime the LB transitions to "ERROR" status, the creation of
those resources keeps being retried until provisioning of the LB
times out. The timeout marks the handler unhealthy, the controller
restarts, and a new LB is created without the previous one being
deleted.

2020-01-29 11:44:07.728 1 DEBUG kuryr_kubernetes.handlers.retry [-] Report handler unhealthy LoadBalancerHandler __call__ /usr/lib/python3.6/site-packages/kuryr_kubernetes/handlers/retry.py:89
2020-01-29 11:44:07.729 1 ERROR kuryr_kubernetes.handlers.logging [-] Failed to handle event {'type': 'ADDED', 'object': {'kind': 'Endpoints', 'apiVersion': 'v1', 'metadata': {'name': 'console', 'namespace': 'openshift-console', 'selfLink
': '/api/v1/namespaces/openshift-console/endpoints/console', 'uid': 'dd46526a-fa50-4ca8-bee1-3f7cd0c69025', 'resourceVersion': '50641', 'creationTimestamp': '2020-01-29T11:04:50Z', 'labels': {'app': 'console'}, 'annotations': {'endpoints.
kubernetes.io/last-change-trigger-time': '2020-01-29T11:04:50Z', 'openstack.org/kuryr-lbaas-spec': '{"versioned_object.data": {"ip": "172.30.67.45", "lb_ip": null, "ports": [{"versioned_object.data": {"name": "https", "port": 443, "protoc
ol": "TCP", "targetPort": "8443"}, "versioned_object.name": "LBaaSPortSpec", "versioned_object.namespace": "kuryr_kubernetes", "versioned_object.version": "1.1"}], "project_id": "968cd882ee5145d4a3e30b9612b0cae0", "security_groups_ids": [
"17045f45-f383-4e89-bc11-ad74510b70a7"], "subnet_id": "1711fbdc-63da-496d-b1a0-162585595226", "type": "ClusterIP"}, "versioned_object.name": "LBaaSServiceSpec", "versioned_object.namespace": "kuryr_kubernetes", "versioned_object.version":
 "1.0"}', 'openstack.org/kuryr-lbaas-state': '{"versioned_object.data": {"listeners": [], "loadbalancer": {"versioned_object.data": {"id": "75d509c8-8f4c-4da1-9174-09434c4e2bf6", "ip": "172.30.66.204", "name": "openshift-console/console",
 "port_id": "4bd5850d-1227-4693-a4d7-f7d5ec5c1de8", "project_id": "968cd882ee5145d4a3e30b9612b0cae0", "provider": "amphora", "security_groups": ["17045f45-f383-4e89-bc11-ad74510b70a7"], "subnet_id": "1711fbdc-63da-496d-b1a0-162585595226"}
, "versioned_object.name": "LBaaSLoadBalancer", "versioned_object.namespace": "kuryr_kubernetes", "versioned_object.version": "1.3"}, "members": [], "pools": [], "service_pub_ip_info": null}, "versioned_object.name": "LBaaSState", "versio
ned_object.namespace": "kuryr_kubernetes", "versioned_object.version": "1.0"}'}}, 'subsets': [{'addresses': [{'ip': '10.128.110.124', 'nodeName': 'ostest-frj2q-master-0', 'targetRef': {'kind': 'Pod', 'namespace': 'openshift-console', 'nam
e': 'console-6d8d877799-v7zxj', 'uid': 'b4011044-6bba-4dc5-8df0-cd0288fbde0f', 'resourceVersion': '37228'}}, {'ip': '10.128.110.229', 'nodeName': 'ostest-frj2q-master-1', 'targetRef': {'kind': 'Pod', 'namespace': 'openshift-console', 'nam
e': 'console-6d8d877799-hhnt5', 'uid': '56363132-f345-4bdd-ab1b-b0ee026911cd', 'resourceVersion': '39120'}}], 'ports': [{'name': 'https', 'port': 8443, 'protocol': 'TCP'}]}]}}: kuryr_kubernetes.exceptions.ResourceNotReady: Resource not re
ady: LBaaSLoadBalancer(id=ec08054e-8d30-4984-8b8c-07542768ce2a,ip=172.30.67.45,name='openshift-console/console',port_id=61033fd3-3612-40a1-b358-54dc21317685,project_id='968cd882ee5145d4a3e30b9612b0cae0',provider='amphora',security_groups=
[17045f45-f383-4e89-bc11-ad74510b70a7],subnet_id=1711fbdc-63da-496d-b1a0-162585595226)
2020-01-29 11:44:07.729 1 ERROR kuryr_kubernetes.handlers.logging Traceback (most recent call last):
2020-01-29 11:44:07.729 1 ERROR kuryr_kubernetes.handlers.logging File "/usr/lib/python3.6/site-packages/kuryr_kubernetes/handlers/logging.py", line 37, in __call__
2020-01-29 11:44:07.729 1 ERROR kuryr_kubernetes.handlers.logging self._handler(event)
2020-01-29 11:44:07.729 1 ERROR kuryr_kubernetes.handlers.logging File "/usr/lib/python3.6/site-packages/kuryr_kubernetes/handlers/retry.py", line 90, in __call__
2020-01-29 11:44:07.729 1 ERROR kuryr_kubernetes.handlers.logging self._handler.set_liveness(alive=False)
2020-01-29 11:44:07.729 1 ERROR kuryr_kubernetes.handlers.logging File "/usr/lib/python3.6/site-packages/oslo_utils/excutils.py", line 220, in __exit__
2020-01-29 11:44:07.729 1 ERROR kuryr_kubernetes.handlers.logging self.force_reraise()
2020-01-29 11:44:07.729 1 ERROR kuryr_kubernetes.handlers.logging File "/usr/lib/python3.6/site-packages/oslo_utils/excutils.py", line 196, in force_reraise
2020-01-29 11:44:07.729 1 ERROR kuryr_kubernetes.handlers.logging six.reraise(self.type_, self.value, self.tb)
2020-01-29 11:44:07.729 1 ERROR kuryr_kubernetes.handlers.logging File "/usr/lib/python3.6/site-packages/six.py", line 693, in reraise
2020-01-29 11:44:07.729 1 ERROR kuryr_kubernetes.handlers.logging raise value
2020-01-29 11:44:07.729 1 ERROR kuryr_kubernetes.handlers.logging File "/usr/lib/python3.6/site-packages/kuryr_kubernetes/handlers/retry.py", line 78, in __call__
2020-01-29 11:44:07.729 1 ERROR kuryr_kubernetes.handlers.logging self._handler(event)
2020-01-29 11:44:07.729 1 ERROR kuryr_kubernetes.handlers.logging File "/usr/lib/python3.6/site-packages/kuryr_kubernetes/handlers/k8s_base.py", line 75, in __call__
2020-01-29 11:44:07.729 1 ERROR kuryr_kubernetes.handlers.logging self.on_present(obj)
2020-01-29 11:44:07.729 1 ERROR kuryr_kubernetes.handlers.logging File "/usr/lib/python3.6/site-packages/kuryr_kubernetes/controller/handlers/lbaas.py", line 188, in on_present
2020-01-29 11:44:07.729 1 ERROR kuryr_kubernetes.handlers.logging if self._sync_lbaas_members(endpoints, lbaas_state, lbaas_spec):
2020-01-29 11:44:07.729 1 ERROR kuryr_kubernetes.handlers.logging File "/usr/lib/python3.6/site-packages/kuryr_kubernetes/controller/handlers/lbaas.py", line 275, in _sync_lbaas_members
2020-01-29 11:44:07.729 1 ERROR kuryr_kubernetes.handlers.logging if self._sync_lbaas_pools(endpoints, lbaas_state, lbaas_spec):
2020-01-29 11:44:07.729 1 ERROR kuryr_kubernetes.handlers.logging File "/usr/lib/python3.6/site-packages/kuryr_kubernetes/controller/handlers/lbaas.py", line 473, in _sync_lbaas_pools
2020-01-29 11:44:07.729 1 ERROR kuryr_kubernetes.handlers.logging if self._sync_lbaas_listeners(endpoints, lbaas_state, lbaas_spec):
2020-01-29 11:44:07.729 1 ERROR kuryr_kubernetes.handlers.logging File "/usr/lib/python3.6/site-packages/kuryr_kubernetes/controller/handlers/lbaas.py", line 533, in _sync_lbaas_listeners
2020-01-29 11:44:07.729 1 ERROR kuryr_kubernetes.handlers.logging if self._add_new_listeners(endpoints, lbaas_spec, lbaas_state):
2020-01-29 11:44:07.729 1 ERROR kuryr_kubernetes.handlers.logging File "/usr/lib/python3.6/site-packages/kuryr_kubernetes/controller/handlers/lbaas.py", line 562, in _add_new_listeners
2020-01-29 11:44:07.729 1 ERROR kuryr_kubernetes.handlers.logging service_type=lbaas_spec.type)
2020-01-29 11:44:07.729 1 ERROR kuryr_kubernetes.handlers.logging File "/usr/lib/python3.6/site-packages/kuryr_kubernetes/controller/drivers/lbaasv2.py", line 526, in ensure_listener
2020-01-29 11:44:07.729 1 ERROR kuryr_kubernetes.handlers.logging self._find_listener, _LB_STS_POLL_SLOW_INTERVAL)
2020-01-29 11:44:07.729 1 ERROR kuryr_kubernetes.handlers.logging File "/usr/lib/python3.6/site-packages/kuryr_kubernetes/controller/drivers/lbaasv2.py", line 846, in _ensure_provisioned
2020-01-29 11:44:07.729 1 ERROR kuryr_kubernetes.handlers.logging self._wait_for_provisioning(loadbalancer, remaining, interval)
2020-01-29 11:44:07.729 1 ERROR kuryr_kubernetes.handlers.logging File "/usr/lib/python3.6/site-packages/kuryr_kubernetes/controller/drivers/lbaasv2.py", line 887, in _wait_for_provisioning
2020-01-29 11:44:07.729 1 ERROR kuryr_kubernetes.handlers.logging raise k_exc.ResourceNotReady(loadbalancer)
2020-01-29 11:44:07.729 1 ERROR kuryr_kubernetes.handlers.logging kuryr_kubernetes.exceptions.ResourceNotReady: Resource not ready: LBaaSLoadBalancer(id=ec08054e-8d30-4984-8b8c-07542768ce2a,ip=172.30.67.45,name='openshift-console/console'
,port_id=61033fd3-3612-40a1-b358-54dc21317685,project_id='968cd882ee5145d4a3e30b9612b0cae0',provider='amphora',security_groups=[17045f45-f383-4e89-bc11-ad74510b70a7],subnet_id=1711fbdc-63da-496d-b1a0-162585595226)
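
The traceback above keeps ending in ResourceNotReady while the LB sits in
ERROR, which eventually restarts the controller. The linked upstream change
("Ensure LB with error status is only recreated after deleted") makes the
errored LB be removed before a retry recreates it. A minimal sketch of that
idea follows; it is not the actual patch, and get_status/release_lb are
hypothetical stand-ins for the Octavia driver helpers:

import time


class ResourceNotReady(Exception):
    """Stand-in for kuryr_kubernetes.exceptions.ResourceNotReady."""


def wait_for_provisioning(get_status, release_lb, lb, timeout=300, interval=5):
    """Poll the LB status until ACTIVE; on ERROR, delete it before giving up.

    get_status and release_lb are caller-supplied callables standing in for
    the Octavia driver helpers (illustrative only).
    """
    deadline = time.time() + timeout
    while time.time() < deadline:
        status = get_status(lb)
        if status == 'ACTIVE':
            return
        if status == 'ERROR':
            # Core of the fix: remove the errored LB first, so the retried
            # creation starts from scratch instead of leaving the old LB behind.
            release_lb(lb)
            raise ResourceNotReady(lb)
        time.sleep(interval)
    raise ResourceNotReady(lb)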

Version-Release number of selected component (if applicable):

OSP13

How reproducible:


Steps to Reproduce:
1.
2.
3.

Actual results:


Expected results:


Additional info:

Comment 4 Jon Uriarte 2020-02-24 15:13:44 UTC
Verified in 4.3.0-0.nightly-2020-02-20-235803 on top of OSP 16 (RHOS_TRUNK-16.0-RHEL-8-20200220.n.0 compose).

When an LB transitions to ERROR status while it is still in PENDING_CREATE, the errored LB is now deleted and a new one is created in its place.

$ oc new-project test
$ oc run --image kuryr/demo pod1
$ oc get pods -o wide
NAME            READY   STATUS      RESTARTS   AGE   IP               NODE                        NOMINATED NODE   READINESS GATES
pod1-1-deploy   0/1     Completed   0          56s   10.128.107.91    ostest-pshnn-worker-c7smg   <none>           <none>
pod1-1-pb6ch    1/1     Running     0          43s   10.128.107.215   ostest-pshnn-worker-qtckd   <none>           <none>

$ oc expose dc/pod1 --port 80 --target-port 8080
$ oc get svc                                                                                                                                                                    
NAME   TYPE        CLUSTER-IP       EXTERNAL-IP   PORT(S)   AGE
pod1   ClusterIP   172.30.93.248   <none>        80/TCP    4s

LB list:
+--------------------------------------+------------+----------------------------------+----------------+---------------------+----------+
| id                                   | name       | project_id                       | vip_address    | provisioning_status | provider |
+--------------------------------------+------------+----------------------------------+----------------+---------------------+----------+
| 6214b835-dd33-48f7-b11c-eb400a20baff | test/pod1  | fc1c99846fef46ddb4f6f9b61162213f | 172.30.93.248  | PENDING_CREATE      | octavia  |
+--------------------------------------+------------+----------------------------------+----------------+---------------------+----------+

SSH to one of the controllers and force the LB into ERROR status in the Octavia database:
$ sudo docker exec -it galera-bundle-podman-0 mysql
MariaDB [octavia]> use octavia;
MariaDB [octavia]> update load_balancer set provisioning_status='ERROR' where id='6214b835-dd33-48f7-b11c-eb400a20baff';

LB list:
+--------------------------------------+------------+----------------------------------+----------------+---------------------+----------+
| id                                   | name       | project_id                       | vip_address    | provisioning_status | provider |
+--------------------------------------+------------+----------------------------------+----------------+---------------------+----------+
| 6214b835-dd33-48f7-b11c-eb400a20baff | test/pod1  | fc1c99846fef46ddb4f6f9b61162213f | 172.30.93.248  | ERROR               | octavia  |
+--------------------------------------+------------+----------------------------------+----------------+---------------------+----------+

The errored LB is then moved to PENDING_DELETE:
+--------------------------------------+------------+----------------------------------+----------------+---------------------+----------+
| id                                   | name       | project_id                       | vip_address    | provisioning_status | provider |
+--------------------------------------+------------+----------------------------------+----------------+---------------------+----------+
| 6214b835-dd33-48f7-b11c-eb400a20baff | test/pod1  | fc1c99846fef46ddb4f6f9b61162213f | 172.30.93.248  | PENDING_DELETE      | octavia  |
+--------------------------------------+------------+----------------------------------+----------------+---------------------+----------+

A new one is created:
+--------------------------------------+------------+----------------------------------+----------------+---------------------+----------+
| id                                   | name       | project_id                       | vip_address    | provisioning_status | provider |
+--------------------------------------+------------+----------------------------------+----------------+---------------------+----------+
| 8e654842-b888-4093-a962-5dfd67c97dd1 | test/pod1  | fc1c99846fef46ddb4f6f9b61162213f | 172.30.93.248  | PENDING_CREATE      | octavia  |
+--------------------------------------+------------+----------------------------------+----------------+---------------------+----------+

The new LB then transitions to ACTIVE:
+--------------------------------------+------------+----------------------------------+----------------+---------------------+----------+
| id                                   | name       | project_id                       | vip_address    | provisioning_status | provider |
+--------------------------------------+------------+----------------------------------+----------------+---------------------+----------+
| 8e654842-b888-4093-a962-5dfd67c97dd1 | test/pod1  | fc1c99846fef46ddb4f6f9b61162213f | 172.30.93.248  | ACTIVE              | octavia  |
+--------------------------------------+------------+----------------------------------+----------------+---------------------+----------+
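
Optionally, instead of re-running openstack loadbalancer list by hand, the
transition can be watched with a small openstacksdk script. This is only a
convenience sketch; the cloud name ('overcloud') and the LB name ('test/pod1')
are assumptions matching this verification run:

import time

import openstack

# Cloud name is an assumption; use whatever clouds.yaml entry applies.
conn = openstack.connect(cloud='overcloud')

for _ in range(60):
    for lb in conn.load_balancer.load_balancers():
        if lb.name == 'test/pod1':
            print(lb.id, lb.provisioning_status)
    # Expected sequence: old LB ERROR -> PENDING_DELETE -> gone,
    # then a new LB PENDING_CREATE -> ACTIVE.
    time.sleep(10)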

Comment 6 errata-xmlrpc 2020-03-10 23:53:27 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory, and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHBA-2020:0676

