Bug 1522713 - healthmonitor did not take effect
Summary: healthmonitor did not take effect
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: Red Hat OpenStack
Classification: Red Hat
Component: openstack-neutron-lbaas
Version: 12.0 (Pike)
Hardware: x86_64
OS: Linux
Severity: high
Priority: high
Target Milestone: z1
Target Release: 12.0 (Pike)
Assignee: Carlos Goncalves
QA Contact: Alexander Stafeyev
URL:
Whiteboard:
Depends On: 1512375
Blocks: 1522711
 
Reported: 2017-12-06 10:01 UTC by Carlos Goncalves
Modified: 2022-08-16 11:36 UTC (History)
CC: 8 users

Fixed In Version: openstack-neutron-lbaas-11.0.1-6.el7ost
Doc Type: If docs needed, set a value
Doc Text:
Clone Of: 1512375
Environment:
Last Closed: 2018-01-30 20:02:19 UTC
Target Upstream Version:
Embargoed:
nmagnezi: needinfo-




Links
System ID Private Priority Status Summary Last Updated
OpenStack gerrit 521250 0 None None None 2017-12-06 10:01:52 UTC
Red Hat Issue Tracker OSP-4790 0 None None None 2022-08-16 11:36:43 UTC
Red Hat Product Errata RHBA-2018:0245 0 normal SHIPPED_LIVE openstack-neutron bug fix advisory 2018-02-16 03:55:18 UTC

Description Carlos Goncalves 2017-12-06 10:01:53 UTC
+++ This bug was initially created as a clone of Bug #1512375 +++

Description of problem:

Create a health monitor and add it to a pool, then shut down or delete one member. The member's operating status is not set to inactive even after the maximum number of failed health-check retries is reached for that instance.
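For context, the behavior the report expects can be sketched as follows. This is an illustrative model only, not the neutron-lbaas implementation; the function name and parameters (e.g. max_retries, mirroring the healthmonitor's --max-retries option) are hypothetical:

```python
# Illustrative sketch of the fall-threshold logic a health monitor is
# expected to apply: after `max_retries` consecutive failed checks, a
# member should no longer be reported as ONLINE.
# NOT the actual neutron-lbaas code.

def update_member_status(operating_status, check_ok, failures, max_retries):
    """Return (new_status, new_failure_count) after one health check."""
    if check_ok:
        return "ONLINE", 0
    failures += 1
    if failures >= max_retries:
        # Expected behavior: member is marked inactive/offline.
        return "OFFLINE", failures
    return operating_status, failures

# A member whose VM was deleted fails every probe; with max_retries=3
# it should leave ONLINE state after the third consecutive failure.
status, failures = "ONLINE", 0
for _ in range(3):
    status, failures = update_member_status(status, False, failures, 3)
```

The bug is that the real member's operating_status never makes this transition, as the reproduction below shows.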



Version-Release number of selected component (if applicable):
RHOSP10


How reproducible:


Steps to Reproduce:
1. Create Lbaasv2: 

$ nova list
+--------------------------------------+-------+--------+------------+-------------+---------------------+
| ID                                   | Name  | Status | Task State | Power State | Networks            |
+--------------------------------------+-------+--------+------------+-------------+---------------------+
| e7e10318-ac85-4191-8a56-fed46f2422f9 | node1 | ACTIVE | -          | Running     | private=10.10.1.103 |
| 8d691b56-cedb-4723-82f4-65938e2f71a8 | node2 | ACTIVE | -          | Running     | private=10.10.1.102 |
+--------------------------------------+-------+--------+------------+-------------+---------------------+

$ neutron lbaas-loadbalancer-status  041b24a3-55e1-4880-b04f-cf22ab057b30
{
    "loadbalancer": {
        "name": "lb1", 
        "provisioning_status": "ACTIVE", 
        "listeners": [
            {
                "name": "listener1", 
                "provisioning_status": "ACTIVE", 
                "pools": [
                    {
                        "name": "pool1", 
                        "provisioning_status": "ACTIVE", 
                        "healthmonitor": {
                            "provisioning_status": "ACTIVE", 
                            "type": "HTTP", 
                            "id": "083f068f-3ee1-471b-a6ba-2438648078c0", 
                            "name": ""
                        }, 
                        "members": [
                            {
                                "name": "", 
                                "provisioning_status": "ACTIVE", 
                                "address": "10.10.1.103", 
                                "protocol_port": 80, 
                                "id": "97e4de09-7e88-41fb-9e7f-5b061fd4d44f", 
                                "operating_status": "ONLINE"
                            }, 
                            {
                                "name": "", 
                                "provisioning_status": "ACTIVE", 
                                "address": "10.10.1.102", 
                                "protocol_port": 80, 
                                "id": "9bdc780f-2553-40cf-82bf-aa43ab94ce8a", 
                                "operating_status": "ONLINE"
                            }
                        ], 
                        "id": "17c63199-b63e-4f22-83a3-baea98345923", 
                        "operating_status": "ONLINE"
                    }
                ], 
                "l7policies": [], 
                "id": "2c1f8311-d834-45af-8efb-05ff1684e2a6", 
                "operating_status": "ONLINE"
            }
        ], 
        "pools": [
            {
                "name": "pool1", 
                "provisioning_status": "ACTIVE", 
                "healthmonitor": {
                    "provisioning_status": "ACTIVE", 
                    "type": "HTTP", 
                    "id": "083f068f-3ee1-471b-a6ba-2438648078c0", 
                    "name": ""
                }, 
                "members": [
                    {
                        "name": "", 
                        "provisioning_status": "ACTIVE", 
                        "address": "10.10.1.103", 
                        "protocol_port": 80, 
                        "id": "97e4de09-7e88-41fb-9e7f-5b061fd4d44f", 
                        "operating_status": "ONLINE"
                    }, 
                    {
                        "name": "", 
                        "provisioning_status": "ACTIVE", 
                        "address": "10.10.1.102", 
                        "protocol_port": 80, 
                        "id": "9bdc780f-2553-40cf-82bf-aa43ab94ce8a", 
                        "operating_status": "ONLINE"
                    }
                ], 
                "id": "17c63199-b63e-4f22-83a3-baea98345923", 
                "operating_status": "ONLINE"
            }
        ], 
        "id": "041b24a3-55e1-4880-b04f-cf22ab057b30", 
        "operating_status": "ONLINE"
    }
}
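To spot a stale member quickly, the JSON returned by lbaas-loadbalancer-status can be flattened with a small helper. This is a convenience sketch (the helper name is hypothetical); it assumes the tree structure shown in the output above:

```python
import json

def member_statuses(status_tree):
    """Map member address -> operating_status from a
    lbaas-loadbalancer-status result (structure as shown above)."""
    result = {}
    for pool in status_tree["loadbalancer"]["pools"]:
        for member in pool["members"]:
            result[member["address"]] = member["operating_status"]
    return result

# Minimal document mirroring the relevant part of the output above:
doc = json.loads("""
{"loadbalancer": {"pools": [{"members": [
    {"address": "10.10.1.103", "operating_status": "ONLINE"},
    {"address": "10.10.1.102", "operating_status": "ONLINE"}]}]}}
""")
statuses = member_statuses(doc)
```

Running this against the status output after step 2 makes the problem visible at a glance: 10.10.1.103 still maps to ONLINE even though its VM is gone.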



2. Delete one of the members:
  
$ nova delete e7e10318-ac85-4191-8a56-fed46f2422f9

$ nova list
+--------------------------------------+-------+--------+------------+-------------+---------------------+
| ID                                   | Name  | Status | Task State | Power State | Networks            |
+--------------------------------------+-------+--------+------------+-------------+---------------------+
| 8d691b56-cedb-4723-82f4-65938e2f71a8 | node2 | ACTIVE | -          | Running     | private=10.10.1.102 |
+--------------------------------------+-------+--------+------------+-------------+---------------------+


3. Check the LBaaS status again:

$ neutron lbaas-loadbalancer-status  041b24a3-55e1-4880-b04f-cf22ab057b30
{
    "loadbalancer": {
        "name": "lb1", 
        "provisioning_status": "ACTIVE", 
        "listeners": [
            {
                "name": "listener1", 
                "provisioning_status": "ACTIVE", 
                "pools": [
                    {
                        "name": "pool1", 
                        "provisioning_status": "ACTIVE", 
                        "healthmonitor": {
                            "provisioning_status": "ACTIVE", 
                            "type": "HTTP", 
                            "id": "083f068f-3ee1-471b-a6ba-2438648078c0", 
                            "name": ""
                        }, 
                        "members": [
                            {
                                "name": "", 
                                "provisioning_status": "ACTIVE", 
                                "address": "10.10.1.103", 
                                "protocol_port": 80, 
                                "id": "97e4de09-7e88-41fb-9e7f-5b061fd4d44f", 
                                "operating_status": "ONLINE"
                            }, 
                            {
                                "name": "", 
                                "provisioning_status": "ACTIVE", 
                                "address": "10.10.1.102", 
                                "protocol_port": 80, 
                                "id": "9bdc780f-2553-40cf-82bf-aa43ab94ce8a", 
                                "operating_status": "ONLINE"
                            }
                        ], 
                        "id": "17c63199-b63e-4f22-83a3-baea98345923", 
                        "operating_status": "ONLINE"
                    }
                ], 
                "l7policies": [], 
                "id": "2c1f8311-d834-45af-8efb-05ff1684e2a6", 
                "operating_status": "ONLINE"
            }
        ], 
        "pools": [
            {
                "name": "pool1", 
                "provisioning_status": "ACTIVE", 
                "healthmonitor": {
                    "provisioning_status": "ACTIVE", 
                    "type": "HTTP", 
                    "id": "083f068f-3ee1-471b-a6ba-2438648078c0", 
                    "name": ""
                }, 
                "members": [
                    {
                        "name": "", 
                        "provisioning_status": "ACTIVE", 
                        "address": "10.10.1.103", 
                        "protocol_port": 80, 
                        "id": "97e4de09-7e88-41fb-9e7f-5b061fd4d44f", 
                        "operating_status": "ONLINE"
                    }, 
                    {
                        "name": "", 
                        "provisioning_status": "ACTIVE", 
                        "address": "10.10.1.102", 
                        "protocol_port": 80, 
                        "id": "9bdc780f-2553-40cf-82bf-aa43ab94ce8a", 
                        "operating_status": "ONLINE"
                    }
                ], 
                "id": "17c63199-b63e-4f22-83a3-baea98345923", 
                "operating_status": "ONLINE"
            }
        ], 
        "id": "041b24a3-55e1-4880-b04f-cf22ab057b30", 
        "operating_status": "ONLINE"
    }
}


Actual results:
The operating_status for member 10.10.1.103 is still ONLINE even though the instance was deleted.

Expected results:
The operating_status for member 10.10.1.103 should be OFFLINE or ERROR.


Additional info:
Similar issues have been reported upstream:
https://bugs.launchpad.net/neutron/+bug/1548774
https://bugs.launchpad.net/octavia/+bug/1607309

--- Additional comment from Jakub Libosvar on 2017-11-13 09:35:13 EST ---

Nir is going to look at this one

Comment 9 errata-xmlrpc 2018-01-30 20:02:19 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory, and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHBA-2018:0245

