Bug 1898950 - When scaling replicas to zero, Octavia loadbalancer pool members are not updated accordingly
Summary: When scaling replicas to zero, Octavia loadbalancer pool members are not updated accordingly
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: OpenShift Container Platform
Classification: Red Hat
Component: Networking
Version: 3.11.0
Hardware: x86_64
OS: Linux
Priority: high
Severity: high
Target Milestone: ---
Target Release: 4.6.z
Assignee: Luis Tomas Bolivar
QA Contact: GenadiC
URL:
Whiteboard:
Depends On: 1897142
Blocks: 1900491
 
Reported: 2020-11-18 11:28 UTC by OpenShift BugZilla Robot
Modified: 2020-11-30 16:46 UTC

Fixed In Version:
Doc Type: If docs needed, set a value
Doc Text:
Clone Of:
: 1900491 (view as bug list)
Environment:
Last Closed: 2020-11-30 16:46:09 UTC
Target Upstream Version:




Links
- GitHub: openshift/kuryr-kubernetes pull 407 (closed) - [release-4.6] Bug 1898950: Ensure members are deleted from pools when there is no endpoints (last updated 2020-11-30 07:18:31 UTC)
- Red Hat Product Errata: RHBA-2020:5115 (last updated 2020-11-30 16:46:29 UTC)

Comment 3 rlobillo 2020-11-23 14:53:59 UTC
Verified on 4.6.0-0.nightly-2020-11-22-160856 over OSP 13 with the amphora provider (puddle: 2020-11-13.1).


Created a deployment with 3 replicas and a service using the files below:

$ cat demo_deployment.yaml 
apiVersion: apps/v1
kind: Deployment
metadata:
  name: demo
  labels:
    app: demo
spec:
  replicas: 3
  selector:
    matchLabels:
      app: demo
  template:
    metadata:
      labels:
        app: demo
    spec:
      containers:
      - name: demo
        image: kuryr/demo
        ports:
        - containerPort: 8080

$ cat demo_svc.yaml 
apiVersion: v1
kind: Service
metadata:
  name: demo
  labels:
    app: demo
spec:
  selector:
    app: demo
  ports:
  - port: 80
    protocol: TCP
    targetPort: 8080
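Assuming the two manifests are saved under the names shown above, the setup step can be scripted as a small helper (the `deploy_demo` function is a sketch, not from the report):

```shell
# Hypothetical helper: apply both manifests and wait for the rollout to finish.
# File names and the deployment name "demo" come from the listings above.
deploy_demo() {
  oc apply -f demo_deployment.yaml \
    && oc apply -f demo_svc.yaml \
    && oc rollout status deployment/demo --timeout=120s
}
```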

The result:

$ oc get all
NAME                       READY   STATUS    RESTARTS   AGE
pod/demo-66cdc7b66-mm4dj   1/1     Running   0          69s
pod/demo-66cdc7b66-pzptv   1/1     Running   0          69s
pod/demo-66cdc7b66-t5lnk   1/1     Running   0          69s

NAME           TYPE        CLUSTER-IP      EXTERNAL-IP   PORT(S)   AGE
service/demo   ClusterIP   172.30.247.29   <none>        80/TCP    46m

NAME                   READY   UP-TO-DATE   AVAILABLE   AGE
deployment.apps/demo   3/3     3            3           47m

NAME                             DESIRED   CURRENT   READY   AGE
replicaset.apps/demo-66cdc7b66   3         3         3       47m



And the corresponding Octavia load balancer:

$ openstack loadbalancer show test/demo                                                                                                                    
+---------------------+--------------------------------------+
| Field               | Value                                |
+---------------------+--------------------------------------+
| admin_state_up      | True                                 |
| created_at          | 2020-11-23T14:03:10                  |
| description         |                                      |
| flavor_id           | None                                 |
| id                  | eb93ed99-6f83-415e-9053-043966ab3bd8 |
| listeners           | c2a86359-3bb9-42d5-bea6-d74dac9423fb |
| name                | test/demo                            |
| operating_status    | ONLINE                               |
| pools               | 73a84f77-baa3-4260-8196-143955dc0cf7 |
| project_id          | b71e29afb4bc49adb192c68d438c78b3     |
| provider            | amphora                              |
| provisioning_status | ACTIVE                               |
| updated_at          | 2020-11-23T14:48:36                  |
| vip_address         | 172.30.247.29                        |
| vip_network_id      | d4be0a43-24a6-414b-b79d-24732745c5cc |
| vip_port_id         | b60a7d60-735f-4612-b38d-5b2e3b3259ef |
| vip_qos_policy_id   | None                                 |
| vip_subnet_id       | 09c9f8bb-4487-42e2-80af-ec491998ff7d |
+---------------------+--------------------------------------+
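The pool ID used in the next command can be pulled from the load balancer output in one step; a sketch using the openstack CLI's value formatter (`get_pool_id` is a hypothetical helper; "test/demo" follows Kuryr's namespace/service naming seen above):

```shell
# Sketch: print the pool ID of a Kuryr-managed load balancer,
# using the "pools" column shown in the table above.
get_pool_id() {
  openstack loadbalancer show "$1" -f value -c pools
}
```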


$ openstack loadbalancer member list 73a84f77-baa3-4260-8196-143955dc0cf7
+--------------------------------------+--------------------------------+----------------------------------+---------------------+----------------+---------------+------------------+--------+
| id                                   | name                           | project_id                       | provisioning_status | address        | protocol_port | operating_status | weight |
+--------------------------------------+--------------------------------+----------------------------------+---------------------+----------------+---------------+------------------+--------+
| 79996957-d9ec-471c-8f3b-0a9d7f710ac3 | test/demo-66cdc7b66-t5lnk:8080 | b71e29afb4bc49adb192c68d438c78b3 | ACTIVE              | 10.128.116.9   |          8080 | NO_MONITOR       |      1 |
| 12276dc9-f2de-4554-b5b2-86ed23bf30e1 | test/demo-66cdc7b66-mm4dj:8080 | b71e29afb4bc49adb192c68d438c78b3 | ACTIVE              | 10.128.116.159 |          8080 | NO_MONITOR       |      1 |
| 49808f54-85a7-431f-be3e-04affc9c7da4 | test/demo-66cdc7b66-pzptv:8080 | b71e29afb4bc49adb192c68d438c78b3 | ACTIVE              | 10.128.117.110 |          8080 | NO_MONITOR       |      1 |
+--------------------------------------+--------------------------------+----------------------------------+---------------------+----------------+---------------+------------------+--------+

Now, scaling to 0 replicas with the command below:

$ oc scale --replicas=0 deployment.apps/demo
deployment.apps/demo scaled


After a while, all the members are removed from the pool:

$ openstack loadbalancer member list 73a84f77-baa3-4260-8196-143955dc0cf7

(shiftstack) [stack@undercloud-0 ~]$
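The "after a while" check above can be made deterministic with a small polling loop; a sketch (`wait_for_empty_pool` is a hypothetical helper, and the retry count and 10-second interval are arbitrary choices):

```shell
# Sketch: poll the Octavia pool until Kuryr has removed every member,
# or give up after the given number of attempts.
wait_for_empty_pool() {
  pool="$1"; tries="${2:-30}"
  while [ "$tries" -gt 0 ]; do
    members=$(openstack loadbalancer member list "$pool" -f value -c id | wc -l)
    [ "$members" -eq 0 ] && return 0
    tries=$((tries - 1))
    sleep 10
  done
  return 1
}
```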

Comment 5 errata-xmlrpc 2020-11-30 16:46:09 UTC
Since the problem described in this bug report should be resolved in a recent advisory, it has been closed with a resolution of ERRATA.

For information on the advisory (OpenShift Container Platform 4.6.6 bug fix update), and where to find the updated files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHBA-2020:5115

