Bug 2080966

Summary: Failed to reload listeners because of SELinux error
Product: Red Hat OpenStack
Component: openstack-octavia
Version: 17.0 (Wallaby)
Hardware: Unspecified
OS: Unspecified
Status: CLOSED ERRATA
Severity: high
Priority: high
Keywords: AutomationBlocker, Regression, Triaged
Target Milestone: Alpha
Target Release: 17.0
Reporter: Gregory Thiemonge <gthiemon>
Assignee: Gregory Thiemonge <gthiemon>
QA Contact: Bruna Bonguardo <bbonguar>
CC: apevec, cjeanner, ihrachys, jpichon, jschluet, lhh, lpeer, majopela, oschwart, scohen, tweining
Fixed In Version: openstack-octavia-8.0.2-0.20220506160857.3b1068e.el9ost, openstack-selinux-0.8.31-0.20220428171339.400a9a5.el9ost
Type: Bug
Last Closed: 2022-09-21 12:20:53 UTC

Description Gregory Thiemonge 2022-05-02 13:25:43 UTC
Description of problem:
Tests are failing in OSP17 with Octavia: Octavia cannot reload the listeners in the amphora.
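
The failure can be triggered by any operation that forces a listener reload on the amphora; the 'octavia-create-pool-flow' in the logs below corresponds to a pool creation. A minimal reproduction sketch from the user side (resource names and the subnet ID are placeholders):

$ openstack loadbalancer create --name lb1 --vip-subnet-id <subnet-id>
$ openstack loadbalancer listener create --name listener1 --protocol HTTP --protocol-port 80 lb1
# creating the pool makes the worker push a new haproxy config and reload the
# listener, which is the step that fails with InternalServerError
$ openstack loadbalancer pool create --name pool1 --lb-algorithm ROUND_ROBIN --listener listener1 --protocol HTTP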

CI logs (worker.log):

2022-04-28 18:33:41.348 18 DEBUG octavia.amphorae.drivers.haproxy.rest_api_driver [req-cb04087e-5400-49d8-a2bc-7c66c13d8e9c - 3c1743617d4142c5bd0bf77332d9fab3 - - -] request url loadbalancer/047b4ae5-739a-42c7-9ebe-7e42e53d4f20/03c9c508-51c3-4419-a346-436880921e9b/haproxy request /usr/lib/python3.9/site-packages/octavia/amphorae/drivers/haproxy/rest_api_driver.py:678
2022-04-28 18:33:41.349 18 DEBUG octavia.amphorae.drivers.haproxy.rest_api_driver [req-cb04087e-5400-49d8-a2bc-7c66c13d8e9c - 3c1743617d4142c5bd0bf77332d9fab3 - - -] request url https://172.24.1.172:9443/1.0/loadbalancer/047b4ae5-739a-42c7-9ebe-7e42e53d4f20/03c9c508-51c3-4419-a346-436880921e9b/haproxy request /usr/lib/python3.9/site-packages/octavia/amphorae/drivers/haproxy/rest_api_driver.py:681
2022-04-28 18:33:41.612 14 DEBUG octavia.amphorae.drivers.haproxy.rest_api_driver [req-4ab46e95-6a98-4456-8569-8240c7ea7b57 - aaaf8e8845e84605aedb8c56ab5aff3f - - -] Connected to amphora. Response: <Response [500]> request /usr/lib/python3.9/site-packages/octavia/amphorae/drivers/haproxy/rest_api_driver.py:701
2022-04-28 18:33:41.612 14 ERROR octavia.amphorae.drivers.haproxy.exceptions [req-4ab46e95-6a98-4456-8569-8240c7ea7b57 - aaaf8e8845e84605aedb8c56ab5aff3f - - -] Amphora agent returned unexpected result code 500 with response {'message': 'Error starting haproxy', 'details': 'Redirecting to /bin/systemctl start haproxy-56dd3d4b-9649-49ee-a554-b2b15339e6a3.service\nJob for haproxy-56dd3d4b-9649-49ee-a554-b2b15339e6a3.service failed because the control process exited with error code.\nSee "systemctl status haproxy-56dd3d4b-9649-49ee-a554-b2b15339e6a3.service" and "journalctl -xeu haproxy-56dd3d4b-9649-49ee-a554-b2b15339e6a3.service" for details.\n'}
2022-04-28 18:33:41.619 14 WARNING octavia.controller.worker.v1.controller_worker [-] Task 'octavia.controller.worker.v1.tasks.amphora_driver_tasks.ListenersUpdate' (42c4a026-725a-4a11-933e-70a9ed5fbb18) transitioned into state 'FAILURE' from state 'RUNNING'
3 predecessors (most recent first):
  Atom 'octavia.controller.worker.v1.tasks.database_tasks.MarkPoolPendingCreateInDB' {'intention': 'EXECUTE', 'state': 'SUCCESS', 'requires': {'pool': <octavia.common.data_models.Pool object at 0x7f7b0c38f940>}, 'provides': None}
  |__Atom 'octavia.controller.worker.v1.tasks.lifecycle_tasks.PoolToErrorOnRevertTask' {'intention': 'EXECUTE', 'state': 'SUCCESS', 'requires': {'pool': <octavia.common.data_models.Pool object at 0x7f7b0c38f940>, 'listeners': [], 'loadbalancer': <octavia.common.data_models.LoadBalancer object at 0x7f7b0c3514c0>}, 'provides': None}
     |__Flow 'octavia-create-pool-flow': octavia.amphorae.drivers.haproxy.exceptions.InternalServerError: Internal Server Error
2022-04-28 18:33:41.619 14 ERROR octavia.controller.worker.v1.controller_worker Traceback (most recent call last):
2022-04-28 18:33:41.619 14 ERROR octavia.controller.worker.v1.controller_worker   File "/usr/lib/python3.9/site-packages/taskflow/engines/action_engine/executor.py", line 53, in _execute_task
2022-04-28 18:33:41.619 14 ERROR octavia.controller.worker.v1.controller_worker     result = task.execute(**arguments)
2022-04-28 18:33:41.619 14 ERROR octavia.controller.worker.v1.controller_worker   File "/usr/lib/python3.9/site-packages/octavia/controller/worker/v1/tasks/amphora_driver_tasks.py", line 100, in execute
2022-04-28 18:33:41.619 14 ERROR octavia.controller.worker.v1.controller_worker     self.amphora_driver.update(loadbalancer)
2022-04-28 18:33:41.619 14 ERROR octavia.controller.worker.v1.controller_worker   File "/usr/lib/python3.9/site-packages/octavia/amphorae/drivers/haproxy/rest_api_driver.py", line 253, in update
2022-04-28 18:33:41.619 14 ERROR octavia.controller.worker.v1.controller_worker     self.update_amphora_listeners(loadbalancer, amphora)
2022-04-28 18:33:41.619 14 ERROR octavia.controller.worker.v1.controller_worker   File "/usr/lib/python3.9/site-packages/octavia/amphorae/drivers/haproxy/rest_api_driver.py", line 225, in update_amphora_listeners
2022-04-28 18:33:41.619 14 ERROR octavia.controller.worker.v1.controller_worker     self.clients[amphora.api_version].reload_listener(
2022-04-28 18:33:41.619 14 ERROR octavia.controller.worker.v1.controller_worker   File "/usr/lib/python3.9/site-packages/octavia/amphorae/drivers/haproxy/rest_api_driver.py", line 907, in _action
2022-04-28 18:33:41.619 14 ERROR octavia.controller.worker.v1.controller_worker     return exc.check_exception(r)
2022-04-28 18:33:41.619 14 ERROR octavia.controller.worker.v1.controller_worker   File "/usr/lib/python3.9/site-packages/octavia/amphorae/drivers/haproxy/exceptions.py", line 44, in check_exception
2022-04-28 18:33:41.619 14 ERROR octavia.controller.worker.v1.controller_worker     raise responses[status_code]()
2022-04-28 18:33:41.619 14 ERROR octavia.controller.worker.v1.controller_worker octavia.amphorae.drivers.haproxy.exceptions.InternalServerError: Internal Server Error


Another run (logs from the amphora):

May  2 14:18:36 amphora-a44a1007-6fe5-4213-88f8-a21e8a10c5f3 systemd[1]: Starting HAProxy Load Balancer...                                                                                                                                    
May  2 14:18:36 amphora-a44a1007-6fe5-4213-88f8-a21e8a10c5f3 ip[1163]: Cannot open network namespace "amphora-haproxy": Permission denied                                                                                                      
May  2 14:18:36 amphora-a44a1007-6fe5-4213-88f8-a21e8a10c5f3 systemd[1]: haproxy-a26aec91-a004-4d98-b84e-0499b286ed48.service: Main process exited, code=exited, status=255/EXCEPTION                                                          
May  2 14:18:36 amphora-a44a1007-6fe5-4213-88f8-a21e8a10c5f3 systemd[1]: haproxy-a26aec91-a004-4d98-b84e-0499b286ed48.service: Failed with result 'exit-code'.                                                                                
May  2 14:18:36 amphora-a44a1007-6fe5-4213-88f8-a21e8a10c5f3 systemd[1]: Failed to start HAProxy Load Balancer.
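
The "Permission denied" comes from the ip(8) helper that the generated haproxy unit uses to enter the amphora-haproxy network namespace; systemd runs it under the haproxy_t SELinux domain, which is what gets denied. A sketch of how to confirm this from a console on the amphora (these commands are a suggestion, not part of the original report):

# inspect the generated unit; its start commands are wrapped in "ip netns exec amphora-haproxy ..."
$ systemctl cat haproxy-a26aec91-a004-4d98-b84e-0499b286ed48.service
# restarting the unit reproduces the failure under the haproxy_t context
$ sudo systemctl restart haproxy-a26aec91-a004-4d98-b84e-0499b286ed48.service
# the corresponding denials land in the audit log
$ sudo ausearch -m AVC -c ip -ts recent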


SELinux AVC denials (captured with permissive=1):

type=AVC msg=audit(1651494369.458:466): avc:  denied  { read } for  pid=1705 comm="ip" dev="nsfs" ino=4026532274 scontext=system_u:system_r:haproxy_t:s0 tcontext=system_u:object_r:nsfs_t:s0 tclass=file permissive=1
type=AVC msg=audit(1651494369.458:466): avc:  denied  { open } for  pid=1705 comm="ip" path="/run/netns/amphora-haproxy" dev="nsfs" ino=4026532274 scontext=system_u:system_r:haproxy_t:s0 tcontext=system_u:object_r:nsfs_t:s0 tclass=file permissive=1
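
The denials show the haproxy_t domain being refused read/open on the nsfs file backing /run/netns/amphora-haproxy. The shipped fix is the openstack-selinux build listed in Fixed In Version; for triage on a disposable amphora, a local policy module generated from the audit log can confirm the denials are the root cause (the module name below is arbitrary):

$ sudo ausearch -m AVC -c ip | audit2allow -M octavia_haproxy_nsfs
$ sudo semodule -i octavia_haproxy_nsfs.pp
# with the module loaded, the haproxy-<uuid>.service unit starts in enforcing mode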


Version-Release number of selected component (if applicable):
17.0
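
To check whether a deployment already carries the fix, the installed builds can be compared against the Fixed In Version field above; where exactly to run the query is an assumption (the openstack-selinux package matters inside the amphora image, where the denials occur):

$ rpm -q openstack-selinux   # inside the amphora
$ rpm -q openstack-octavia   # wherever the Octavia services are installed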

Comment 13 Omer Schwartz 2022-05-19 09:26:54 UTC
The Octavia OSP17 jobs run on RHEL 9. The following build

https://rhos-ci-jenkins.lab.eng.tlv2.redhat.com/view/DFG/view/network/view/octavia/job/DFG-network-octavia-17.0_director-rhel-virthost-3cont_3comp-ipv4-geneve-actstby/45/

contains tests that show the fix works, so I am moving this BZ to VERIFIED.
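
For a manual re-check, the signal is a clean haproxy start inside the amphora with SELinux enforcing and no new AVC denials; a sketch, assuming console access to the amphora:

$ getenforce
Enforcing
$ sudo ausearch -m AVC -c ip -ts boot
<no matches>
$ sudo systemctl is-active 'haproxy-*.service'
active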

Comment 18 errata-xmlrpc 2022-09-21 12:20:53 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory (Release of components for Red Hat OpenStack Platform 17.0 (Wallaby)), and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHEA-2022:6543