Bug 2094801 - Kuryr controller keeps restarting when handling IPs with leading zeros
Summary: Kuryr controller keeps restarting when handling IPs with leading zeros
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: OpenShift Container Platform
Classification: Red Hat
Component: Networking
Version: 4.11
Hardware: Unspecified
OS: Unspecified
Priority: high
Severity: high
Target Milestone: ---
Target Release: 4.11.0
Assignee: Michał Dulko
QA Contact: Itzik Brown
URL:
Whiteboard:
Depends On:
Blocks:
 
Reported: 2022-06-08 10:15 UTC by Itzik Brown
Modified: 2022-08-10 11:17 UTC
CC: 1 user

Fixed In Version:
Doc Type: No Doc Update
Doc Text:
Clone Of:
Environment:
Last Closed: 2022-08-10 11:16:53 UTC
Target Upstream Version:
Embargoed:




Links
System ID Private Priority Status Summary Last Updated
Github openshift kuryr-kubernetes pull 671 0 None open Bug 2094801: Strip leading zeros from "funny" Service IPs 2022-06-15 07:06:29 UTC
Red Hat Product Errata RHSA-2022:5069 0 None None None 2022-08-10 11:17:06 UTC

Description Itzik Brown 2022-06-08 10:15:59 UTC
Description of problem:
When running the following test:
"[sig-network] CVE-2021-29923 IPv4 Service Type ClusterIP with leading zeros should work interpreted as decimal [Suite:openshift/conformance/parallel] [Suite:k8s]

the Kuryr controller keeps restarting in a loop.
From the log:
2022-06-08 08:44:24.492 1 ERROR kuryr_kubernetes.handlers.logging [-] Failed to handle event [req-ab728a03-1b58-4af2-9d56-13d3b96fe6c2]: {'type': 'ADDED', 'object': {'apiVersion': 'openstack.org/v1', 'kind': 'KuryrLoadBalancer', 'metadata': {'creationTimestamp': '2022-06-08T08:38:11Z', 'finalizers': ['kuryr.openstack.org/kuryrloadbalancer-finalizers'], 'generation': 2, 'managedFields': [{'apiVersion': 'openstack.org/v1', 'fieldsType': 'FieldsV1', 'fieldsV1': {'f:metadata': {'f:finalizers': {'.': {}, 'v:"kuryr.openstack.org/kuryrloadbalancer-finalizers"': {}}, 'f:ownerReferences': {'.': {}, 'k:{"uid":"88749503-bbce-4a95-b262-5f3c59d831ef"}': {}}}, 'f:spec': {'.': {}, 'f:endpointSlices': {}, 'f:ip': {}, 'f:ports': {}, 'f:project_id': {}, 'f:provider': {}, 'f:security_groups_ids': {}, 'f:subnet_id': {}, 'f:timeout_client_data': {}, 'f:timeout_member_data': {}, 'f:type': {}}, 'f:status': {}}, 'manager': 'python-requests', 'operation': 'Update', 'time': '2022-06-08T08:40:45Z'}], 'name': 'funny-ip', 'namespace': 'e2e-funny-ips-3590', 'ownerReferences': [{'apiVersion': 'v1', 'kind': 'Service', 'name': 'funny-ip', 'uid': '88749503-bbce-4a95-b262-5f3c59d831ef'}], 'resourceVersion': '107583', 'uid': 'c1c9e054-b93a-4869-b92d-b3833b604734'}, 'spec': {'endpointSlices': [{'endpoints': [{'addresses': ['10.128.150.194'], 'conditions': {'ready': True}, 'targetRef': {'kind': 'Pod', 'name': 'funny-ip-7r5sf', 'namespace': 'e2e-funny-ips-3590', 'uid': '3870501e-00d1-49cb-bb42-b30fa5004652'}}], 'ports': [{'name': 'http', 'port': 9376, 'protocol': 'TCP'}]}], 'ip': '172.30.0.011', 'ports': [{'name': 'http', 'port': 7180, 'protocol': 'TCP', 'targetPort': '9376'}], 'project_id': '55760bdece1b45ff88cbe453e696e355', 'provider': 'ovn', 'security_groups_ids': ['e1edfb10-83af-4719-ae0b-6b6871d5fd8a'], 'subnet_id': '6d3d6514-9035-4c3b-8524-f0bc7476a474', 'timeout_client_data': 0, 'timeout_member_data': 0, 'type': 'ClusterIP'}, 'status': {}}}: openstack.exceptions.BadRequestException: 
BadRequestException: 400: Client Error for url: http://10.46.43.17:9876/v2.0/lbaas/loadbalancers, Invalid input for field/attribute vip_address. Value: '172.30.0.011'. Value should be IPv4 or IPv6 format
2022-06-08 08:44:24.492 1 ERROR kuryr_kubernetes.handlers.logging Traceback (most recent call last):
2022-06-08 08:44:24.492 1 ERROR kuryr_kubernetes.handlers.logging   File "/usr/lib/python3.6/site-packages/kuryr_kubernetes/handlers/logging.py", line 38, in __call__
2022-06-08 08:44:24.492 1 ERROR kuryr_kubernetes.handlers.logging     self._handler(event, *args, **kwargs)
2022-06-08 08:44:24.492 1 ERROR kuryr_kubernetes.handlers.logging   File "/usr/lib/python3.6/site-packages/kuryr_kubernetes/handlers/retry.py", line 85, in __call__
2022-06-08 08:44:24.492 1 ERROR kuryr_kubernetes.handlers.logging     self._handler(event, *args, retry_info=info, **kwargs)
2022-06-08 08:44:24.492 1 ERROR kuryr_kubernetes.handlers.logging   File "/usr/lib/python3.6/site-packages/kuryr_kubernetes/handlers/k8s_base.py", line 90, in __call__
2022-06-08 08:44:24.492 1 ERROR kuryr_kubernetes.handlers.logging     self.on_present(obj, *args, **kwargs)
2022-06-08 08:44:24.492 1 ERROR kuryr_kubernetes.handlers.logging   File "/usr/lib/python3.6/site-packages/kuryr_kubernetes/controller/handlers/loadbalancer.py", line 134, in on_present
2022-06-08 08:44:24.492 1 ERROR kuryr_kubernetes.handlers.logging     changed = self._sync_lbaas_members(loadbalancer_crd)
2022-06-08 08:44:24.492 1 ERROR kuryr_kubernetes.handlers.logging   File "/usr/lib/python3.6/site-packages/kuryr_kubernetes/controller/handlers/loadbalancer.py", line 358, in _sync_lbaas_members
2022-06-08 08:44:24.492 1 ERROR kuryr_kubernetes.handlers.logging     if self._sync_lbaas_pools(loadbalancer_crd):
2022-06-08 08:44:24.492 1 ERROR kuryr_kubernetes.handlers.logging   File "/usr/lib/python3.6/site-packages/kuryr_kubernetes/controller/handlers/loadbalancer.py", line 654, in _sync_lbaas_pools
2022-06-08 08:44:24.492 1 ERROR kuryr_kubernetes.handlers.logging     if self._sync_lbaas_listeners(loadbalancer_crd):
2022-06-08 08:44:24.492 1 ERROR kuryr_kubernetes.handlers.logging   File "/usr/lib/python3.6/site-packages/kuryr_kubernetes/controller/handlers/loadbalancer.py", line 732, in _sync_lbaas_listeners
2022-06-08 08:44:24.492 1 ERROR kuryr_kubernetes.handlers.logging     if self._sync_lbaas_loadbalancer(loadbalancer_crd):
2022-06-08 08:44:24.492 1 ERROR kuryr_kubernetes.handlers.logging   File "/usr/lib/python3.6/site-packages/kuryr_kubernetes/controller/handlers/loadbalancer.py", line 891, in _sync_lbaas_loadbalancer
2022-06-08 08:44:24.492 1 ERROR kuryr_kubernetes.handlers.logging     provider=loadbalancer_crd['spec'].get('provider'))
2022-06-08 08:44:24.492 1 ERROR kuryr_kubernetes.handlers.logging   File "/usr/lib/python3.6/site-packages/kuryr_kubernetes/controller/drivers/lbaasv2.py", line 168, in ensure_loadbalancer
2022-06-08 08:44:24.492 1 ERROR kuryr_kubernetes.handlers.logging     response = self._ensure_loadbalancer(request)
2022-06-08 08:44:24.492 1 ERROR kuryr_kubernetes.handlers.logging   File "/usr/lib/python3.6/site-packages/kuryr_kubernetes/controller/drivers/lbaasv2.py", line 797, in _ensure_loadbalancer
2022-06-08 08:44:24.492 1 ERROR kuryr_kubernetes.handlers.logging     result = self._create_loadbalancer(loadbalancer)
2022-06-08 08:44:24.492 1 ERROR kuryr_kubernetes.handlers.logging   File "/usr/lib/python3.6/site-packages/kuryr_kubernetes/controller/drivers/lbaasv2.py", line 543, in _create_loadbalancer
2022-06-08 08:44:24.492 1 ERROR kuryr_kubernetes.handlers.logging     response = lbaas.create_load_balancer(**request)
2022-06-08 08:44:24.492 1 ERROR kuryr_kubernetes.handlers.logging   File "/usr/lib/python3.6/site-packages/openstack/load_balancer/v2/_proxy.py", line 46, in create_load_balancer
2022-06-08 08:44:24.492 1 ERROR kuryr_kubernetes.handlers.logging     return self._create(_lb.LoadBalancer, **attrs)
2022-06-08 08:44:24.492 1 ERROR kuryr_kubernetes.handlers.logging   File "/usr/lib/python3.6/site-packages/openstack/proxy.py", line 463, in _create
2022-06-08 08:44:24.492 1 ERROR kuryr_kubernetes.handlers.logging     return res.create(self, base_path=base_path)
2022-06-08 08:44:24.492 1 ERROR kuryr_kubernetes.handlers.logging   File "/usr/lib/python3.6/site-packages/openstack/resource.py", line 1364, in create
2022-06-08 08:44:24.492 1 ERROR kuryr_kubernetes.handlers.logging     self._translate_response(response, has_body=has_body)
2022-06-08 08:44:24.492 1 ERROR kuryr_kubernetes.handlers.logging   File "/usr/lib/python3.6/site-packages/openstack/resource.py", line 1177, in _translate_response
2022-06-08 08:44:24.492 1 ERROR kuryr_kubernetes.handlers.logging     exceptions.raise_from_response(response, error_message=error_message)
2022-06-08 08:44:24.492 1 ERROR kuryr_kubernetes.handlers.logging   File "/usr/lib/python3.6/site-packages/openstack/exceptions.py", line 238, in raise_from_response
2022-06-08 08:44:24.492 1 ERROR kuryr_kubernetes.handlers.logging     http_status=http_status, request_id=request_id
2022-06-08 08:44:24.492 1 ERROR kuryr_kubernetes.handlers.logging openstack.exceptions.BadRequestException: BadRequestException: 400: Client Error for url: http://10.46.43.17:9876/v2.0/lbaas/loadbalancers, Invalid input for field/attribute vip_address. Value: '172.30.0.011'. Value should be IPv4 or IPv6 format
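
The 400 from Octavia pinpoints the root cause: the ClusterIP '172.30.0.011' is passed through verbatim as vip_address, which Octavia rejects as malformed. The title of the linked PR ("Strip leading zeros from 'funny' Service IPs") suggests normalizing each octet to its decimal value before building the request. A minimal sketch of that normalization (a hypothetical helper for illustration, not the actual kuryr-kubernetes patch) could look like:

```python
def strip_leading_zeros(ip):
    """Normalize an IPv4 address string by interpreting each octet as
    decimal and dropping leading zeros, e.g. '172.30.0.011' -> '172.30.0.11'.

    Hypothetical illustration of the fix described in the PR title;
    not the actual kuryr-kubernetes code.
    """
    # int(octet, 10) forces decimal interpretation, matching the
    # CVE-2021-29923 expectation that '011' means 11, not octal 9.
    return '.'.join(str(int(octet, 10)) for octet in ip.split('.'))


print(strip_leading_zeros('172.30.0.011'))  # -> 172.30.0.11
```

Forcing base 10 matters here: CVE-2021-29923 exists because some parsers treat leading-zero octets as octal, so a normalizer must interpret them as decimal to match Kubernetes' behavior.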

Version-Release number of selected component (if applicable):
4.11.0-0.nightly-2022-06-06-025509

How reproducible:
100%

Steps to Reproduce:
1. Run the conformance test listed above.

Actual results:
The Kuryr controller crashes and restarts in a loop; Octavia rejects the load balancer creation request with "Invalid input for field/attribute vip_address".

Expected results:
The Service IP with leading zeros is handled (interpreted as decimal) and the load balancer is created; the controller does not crash.

Additional info:

Comment 1 ShiftStack Bugwatcher 2022-06-09 07:05:33 UTC
Removing the Triaged keyword because:
* the priority assessment is missing

Comment 3 Itzik Brown 2022-06-20 07:23:53 UTC
Test passed on:
RHOS-16.2-RHEL-8-20220311.n.1
4.11.0-0.nightly-2022-06-15-222801

Comment 4 Itzik Brown 2022-06-20 13:06:30 UTC
Works with 4.11.0-0.nightly-2022-06-20-084444.

Comment 6 errata-xmlrpc 2022-08-10 11:16:53 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory (Important: OpenShift Container Platform 4.11.0 bug fix and security update), and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHSA-2022:5069

