Bug 2188231

Summary: [AWS] Many AVC failures seen with: denied { create } comm="rhc-worker-play" and comm="ansible-playboo".
Product: Red Hat Enterprise Linux 9
Component: rhc-worker-playbook
Version: 9.2
Hardware: x86_64
OS: Linux
Status: CLOSED MIGRATED
Severity: medium
Priority: unspecified
Reporter: libhe
Assignee: CSI Client Tools Bugs <csi-client-tools-bugs>
QA Contact: CSI Client Tools Bugs <csi-client-tools-bugs>
CC: cmarinea, linl, lvrabec, mmalik, ptoscano, redakkan, xiliang, ymao, zpytela
Target Milestone: rc
Target Release: ---
Keywords: MigratedToJIRA, Regression, Reopened
Last Closed: 2023-10-19 13:55:54 UTC
Type: Bug

Attachments:
os_tests.tests.test_general_check.TestGeneralCheck.test_check_avclog.debug

Description libhe 2023-04-20 07:52:04 UTC
Created attachment 1958486 [details]
os_tests.tests.test_general_check.TestGeneralCheck.test_check_avclog.debug

Description of problem:
Many AVC failures seen with: denied { create } comm="rhc-worker-play" and comm="ansible-playboo".

----
type=PROCTITLE msg=audit(04/19/2023 12:26:23.290:371) : proctitle=/usr/bin/python3 /usr/libexec/rhc/rhc-worker-playbook.worker 
type=SYSCALL msg=audit(04/19/2023 12:26:23.290:371) : arch=x86_64 syscall=openat success=no exit=EACCES(Permission denied) a0=AT_FDCWD a1=0x7fbc46f27050 a2=O_WRONLY|O_CREAT|O_EXCL|O_CLOEXEC a3=0x1a4 items=0 ppid=24199 pid=24518 auid=unset uid=root gid=root euid=root suid=root fsuid=root egid=root sgid=root fsgid=root tty=(none) ses=unset comm=rhc-worker-play exe=/usr/bin/python3.9 subj=system_u:system_r:rhcd_t:s0 key=(null) 
type=AVC msg=audit(04/19/2023 12:26:23.290:371) : avc:  denied  { write } for  pid=24518 comm=rhc-worker-play name=__pycache__ dev="nvme0n1p4" ino=25798780 scontext=system_u:system_r:rhcd_t:s0 tcontext=system_u:object_r:lib_t:s0 tclass=dir permissive=0 
----
type=PROCTITLE msg=audit(04/19/2023 12:26:37.251:393) : proctitle=/usr/bin/python3.11 /bin/ansible-playbook /tmp/tmpra9n_3m3/project/main.json 
type=SYSCALL msg=audit(04/19/2023 12:26:37.251:393) : arch=x86_64 syscall=openat success=no exit=EACCES(Permission denied) a0=AT_FDCWD a1=0x7f0ab341e2d0 a2=O_WRONLY|O_CREAT|O_EXCL|O_CLOEXEC a3=0x1a4 items=0 ppid=24518 pid=24948 auid=unset uid=root gid=root euid=root suid=root fsuid=root egid=root sgid=root fsgid=root tty=pts0 ses=unset comm=ansible-playboo exe=/usr/bin/python3.11 subj=system_u:system_r:rhcd_t:s0 key=(null) 
type=AVC msg=audit(04/19/2023 12:26:37.251:393) : avc:  denied  { write } for  pid=24948 comm=ansible-playboo name=__pycache__ dev="nvme0n1p4" ino=1354793 scontext=system_u:system_r:rhcd_t:s0 tcontext=system_u:object_r:lib_t:s0 tclass=dir permissive=0 
----
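Both denials show the confined rhcd_t domain being refused write access to __pycache__ directories labeled lib_t (apparently Python byte-compiling modules at playbook run time). For reference, such records can be decoded together with the policy's reasoning using audit2why; a minimal check, assuming policycoreutils-python-utils is installed:

  # ausearch -m AVC -ts today | audit2why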

Version-Release number of selected components (if applicable):
RHEL-9.2

How reproducible:
100%

Steps to Reproduce:
1. Launch an AWS instance with ami-0a7cc1e66703db662 (RHEL-9.2.0-20230419.48)
2. Check whether any AVC log entries exist (see also the note below):
   sudo ausearch -m AVC -ts today 
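   For more detail, each matching record can also be printed in the interpreted form quoted in the description above by adding ausearch's -i flag:
   sudo ausearch -m AVC -ts today -i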

Actual results:
Many AVC denials seen for comm="rhc-worker-play" and comm="ansible-playboo"
http://10.73.196.244/results/iscsi/os_tests/20230419/home/jenkins/workspace/aws_os_tests_x86_64-844/os_tests_result_i3en.6xlarge/attachments/TestGeneralCheck.os_tests.tests.test_general_check.TestGeneralCheck.test_check_avclog/os_tests.tests.test_general_check.TestGeneralCheck.test_check_avclog.debug
...

Expected results:
No AVC denials should be present

Additional info:

Comment 1 Zdenek Pytela 2023-04-20 11:41:38 UTC
Hello,

What is the rhc package version?

  # rpm -q rhc
  # semodule -lfull | grep rhcd

Comment 2 libhe 2023-04-20 13:35:40 UTC
[ec2-user@ip-10-0-25-4 ~]$ rpm -q rhc
rhc-0.2.2-1.el9.x86_64

[ec2-user@ip-10-0-25-4 ~]$ sudo semodule -lfull | grep rhcd
100 rhcd                         pp

Comment 3 Zdenek Pytela 2023-04-20 13:42:36 UTC
Are you aware of any SELinux- or rhc-related changes since installation time? The rhcd_t domain should be permissive:

rhel93# rpm -q rhc
rhc-0.2.2-1.el9.x86_64
rhel93# rpm -q rhc --scripts
postinstall scriptlet (using /bin/sh):
if [ -x /usr/sbin/selinuxenabled ] && /usr/sbin/selinuxenabled; then
    /usr/sbin/semanage permissive --add rhcd_t || true
fi
postuninstall scriptlet (using /bin/sh):
if [ $1 -eq 0 ]; then
    if [ -x /usr/sbin/selinuxenabled ] && /usr/sbin/selinuxenabled; then
        /usr/sbin/semanage permissive --delete rhcd_t || true
    fi
fi
rhel93# semodule -lfull | grep rhcd
400 permissive_rhcd_t            cil
100 rhcd                         pp
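If the permissive_rhcd_t entry is missing, as in comment 2, it can be re-added by hand with the same command the postinstall scriptlet runs; a minimal sketch, assuming SELinux is enabled and semanage is available from policycoreutils-python-utils:

  # semanage permissive --add rhcd_t
  # semodule -lfull | grep rhcd

The second command should then also list "400 permissive_rhcd_t cil", matching the expected output above.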

Comment 4 libhe 2023-04-20 14:05:52 UTC
No changes have been made to SELinux or rhc since installation with the latest AWS AMI.

Comment 5 Zdenek Pytela 2023-04-20 18:04:11 UTC
Can you also run this?

  $ rpm -q rhc --scripts

The only explanation which comes to my mind is that selinux was disabled when rhc was being installed. In that case, reinstallation should help:

  # dnf reinstall rhc
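If a disabled-SELinux install is indeed the cause, re-running the checks from comment 1 after the reinstall should confirm the fix; a quick verification sketch:

  # dnf reinstall -y rhc
  # semodule -lfull | grep rhcd
  # ausearch -m AVC -ts recent

The semodule output should again include the permissive_rhcd_t line, and no new AVC records for rhcd_t should appear afterwards.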

Comment 6 libhe 2023-04-21 02:34:41 UTC
(In reply to Zdenek Pytela from comment #5)
> Can you also run this?
> 
>   $ rpm -q rhc --scripts
> 
> The only explanation which comes to my mind is that selinux was disabled
> when rhc was being installed. In that case, reinstallation should help:

[ec2-user@ip-10-0-16-30 ~]$ rpm -q rhc --scripts
postinstall scriptlet (using /bin/sh):
if [ -x /usr/sbin/selinuxenabled ] && /usr/sbin/selinuxenabled; then
    /usr/sbin/semanage permissive --add rhcd_t || true
fi
postuninstall scriptlet (using /bin/sh):
if [ $1 -eq 0 ]; then
    if [ -x /usr/sbin/selinuxenabled ] && /usr/sbin/selinuxenabled; then
        /usr/sbin/semanage permissive --delete rhcd_t || true
    fi
fi

> 
>   # dnf reinstall rhc
Yes, reinstallation works.

Comment 7 libhe 2023-04-21 02:39:47 UTC
BTW, no such issue was observed with builds before 20230419.

Here is the result with build RHEL-9.2.0-20230418.20_x86_64
[ec2-user@ip-10-0-22-92 ~]$ sudo semodule -lfull | grep rhcd
100 rhcd                         pp          

[ec2-user@ip-10-0-22-92 ~]$ sudo ausearch -m AVC -ts today
<no matches>

[ec2-user@ip-10-0-22-92 ~]$ rpm -q rhc
rhc-0.2.2-1.el9.x86_64

Comment 8 Zdenek Pytela 2023-08-11 14:48:45 UTC
Switching the component, but I believe all issues like this have already been addressed.

Comment 12 Rehana 2023-09-06 09:25:02 UTC
Thanks for confirming that this is no longer an issue. Please feel free to file a new report if the problem appears again. Closing this for now.

Comment 14 RHEL Program Management 2023-10-19 13:51:40 UTC
Issue migration from Bugzilla to Jira is in process at this time. This will be the last message in Jira copied from the Bugzilla bug.

Comment 15 RHEL Program Management 2023-10-19 13:55:54 UTC
This BZ has been automatically migrated to the issues.redhat.com Red Hat Issue Tracker. All future work related to this report will be managed there.

Due to differences in account names between systems, some fields were not replicated.  Be sure to add yourself to Jira issue's "Watchers" field to continue receiving updates and add others to the "Need Info From" field to continue requesting information.

To find the migrated issue, look in the "Links" section for a direct link to the new issue location. The issue key will have an icon of 2 footprints next to it, and begin with "RHEL-" followed by an integer.  You can also find this issue by visiting https://issues.redhat.com/issues/?jql= and searching the "Bugzilla Bug" field for this BZ's number, e.g. a search like:

"Bugzilla Bug" = 1234567

In the event you have trouble locating or viewing this issue, you can file an issue by sending mail to rh-issues. You can also visit https://access.redhat.com/articles/7032570 for general account information.