This bug has been migrated to another issue tracking site. It has been closed here and may no longer be monitored.

If you would like to get updates for this issue, or to participate in it, you may do so at the Red Hat Issue Tracker.
RHEL Engineering is moving the tracking of its product development work on RHEL 6 through RHEL 9 to Red Hat Jira (issues.redhat.com). If you're a Red Hat customer, please continue to file support cases via the Red Hat customer portal. If you're not, please head to the "RHEL project" in Red Hat Jira and file new tickets there.

Individual Bugzilla bugs in the statuses "NEW", "ASSIGNED", and "POST" are being migrated throughout September 2023. Bugs of Red Hat partners with an assigned Engineering Partner Manager (EPM) are migrated in late September, as per pre-agreed dates. Bugs against the components "kernel", "kernel-rt", and "kpatch" are only migrated if still in "NEW" or "ASSIGNED".

If you cannot log in to RH Jira, please consult article #7032570. Failing that, please send an e-mail to the RH Jira admins at rh-issues@redhat.com to troubleshoot your issue as a user management inquiry. The email creates a ServiceNow ticket with Red Hat.

Individual Bugzilla bugs that are migrated will be moved to status "CLOSED", resolution "MIGRATED", and set with "MigratedToJIRA" in "Keywords". The link to the successor Jira issue will be found under "Links", have a little "two-footprint" icon next to it, and direct you to the "RHEL project" in Red Hat Jira (issue links are of the form "https://issues.redhat.com/browse/RHEL-XXXX", where "X" is a digit). This same link will be available in a blue banner at the top of the page informing you that this bug has been migrated.
Bug 2188231 - [AWS]Many AVC failures seen with : denied { create } comm="rhc-worker-play" and comm="ansible-playboo".
Summary: [AWS]Many AVC failures seen with : denied { create } comm="rhc-worker-play" a...
Keywords:
Status: CLOSED MIGRATED
Alias: None
Product: Red Hat Enterprise Linux 9
Classification: Red Hat
Component: rhc-worker-playbook
Version: 9.2
Hardware: x86_64
OS: Linux
Priority: unspecified
Severity: medium
Target Milestone: rc
Target Release: ---
Assignee: CSI Client Tools Bugs
QA Contact: CSI Client Tools Bugs
URL:
Whiteboard:
Depends On:
Blocks:
 
Reported: 2023-04-20 07:52 UTC by libhe
Modified: 2023-10-19 13:55 UTC
CC List: 9 users

Fixed In Version:
Doc Type: If docs needed, set a value
Doc Text:
Clone Of:
Environment:
Last Closed: 2023-10-19 13:55:54 UTC
Type: Bug
Target Upstream Version:
Embargoed:


Attachments
os_tests.tests.test_general_check.TestGeneralCheck.test_check_avclog.debug (271.25 KB, text/plain)
2023-04-20 07:52 UTC, libhe
no flags


Links
System ID Private Priority Status Summary Last Updated
Red Hat Issue Tracker   RHEL-14277 0 None Migrated None 2023-10-19 13:55:53 UTC
Red Hat Issue Tracker RHELPLAN-155271 0 None None None 2023-04-20 07:54:19 UTC

Description libhe 2023-04-20 07:52:04 UTC
Created attachment 1958486 [details]
os_tests.tests.test_general_check.TestGeneralCheck.test_check_avclog.debug

Description of problem:
Many AVC failures seen with: denied { create } comm="rhc-worker-play" and comm="ansible-playboo".

----
type=PROCTITLE msg=audit(04/19/2023 12:26:23.290:371) : proctitle=/usr/bin/python3 /usr/libexec/rhc/rhc-worker-playbook.worker 
type=SYSCALL msg=audit(04/19/2023 12:26:23.290:371) : arch=x86_64 syscall=openat success=no exit=EACCES(Permission denied) a0=AT_FDCWD a1=0x7fbc46f27050 a2=O_WRONLY|O_CREAT|O_EXCL|O_CLOEXEC a3=0x1a4 items=0 ppid=24199 pid=24518 auid=unset uid=root gid=root euid=root suid=root fsuid=root egid=root sgid=root fsgid=root tty=(none) ses=unset comm=rhc-worker-play exe=/usr/bin/python3.9 subj=system_u:system_r:rhcd_t:s0 key=(null) 
type=AVC msg=audit(04/19/2023 12:26:23.290:371) : avc:  denied  { write } for  pid=24518 comm=rhc-worker-play name=__pycache__ dev="nvme0n1p4" ino=25798780 scontext=system_u:system_r:rhcd_t:s0 tcontext=system_u:object_r:lib_t:s0 tclass=dir permissive=0 
----
type=PROCTITLE msg=audit(04/19/2023 12:26:37.251:393) : proctitle=/usr/bin/python3.11 /bin/ansible-playbook /tmp/tmpra9n_3m3/project/main.json 
type=SYSCALL msg=audit(04/19/2023 12:26:37.251:393) : arch=x86_64 syscall=openat success=no exit=EACCES(Permission denied) a0=AT_FDCWD a1=0x7f0ab341e2d0 a2=O_WRONLY|O_CREAT|O_EXCL|O_CLOEXEC a3=0x1a4 items=0 ppid=24518 pid=24948 auid=unset uid=root gid=root euid=root suid=root fsuid=root egid=root sgid=root fsgid=root tty=pts0 ses=unset comm=ansible-playboo exe=/usr/bin/python3.11 subj=system_u:system_r:rhcd_t:s0 key=(null) 
type=AVC msg=audit(04/19/2023 12:26:37.251:393) : avc:  denied  { write } for  pid=24948 comm=ansible-playboo name=__pycache__ dev="nvme0n1p4" ino=1354793 scontext=system_u:system_r:rhcd_t:s0 tcontext=system_u:object_r:lib_t:s0 tclass=dir permissive=0 
----
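
For triage, the directory referenced by each AVC record can be located by its inode and its SELinux label inspected. A minimal sketch using the inode from the first record above (25798780 on dev nvme0n1p4; this assumes that device is mounted as the root filesystem, and <path-from-find> is a placeholder for whatever path find prints):

  # find / -xdev -inum 25798780 2>/dev/null     # locate the __pycache__ directory by inode
  # ls -Zd <path-from-find>                     # show its SELinux context (expected tcontext: lib_t)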

Version-Release number of selected components (if applicable):
RHEL-9.2

How reproducible:
100%

Steps to Reproduce:
1. Launch an AWS instance with ami-0a7cc1e66703db662 (RHEL-9.2.0-20230419.48)
2. Check whether any AVC denials exist (see the sketch below):
   sudo ausearch -m AVC -ts today
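
For reference, a minimal triage sketch, assuming the audit and policycoreutils-python-utils packages (which provide ausearch and audit2allow) are installed:

   sudo ausearch -m AVC -ts today                    # list today's AVC denials
   sudo ausearch -m AVC -ts today | audit2allow -w   # explain why each access was denied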

Actual results:
Many AVC denials seen for comm="rhc-worker-play" and comm="ansible-playboo"
http://10.73.196.244/results/iscsi/os_tests/20230419/home/jenkins/workspace/aws_os_tests_x86_64-844/os_tests_result_i3en.6xlarge/attachments/TestGeneralCheck.os_tests.tests.test_general_check.TestGeneralCheck.test_check_avclog/os_tests.tests.test_general_check.TestGeneralCheck.test_check_avclog.debug
...

Expected results:
No AVC denials should be present

Additional info:

Comment 1 Zdenek Pytela 2023-04-20 11:41:38 UTC
Hello,

What is the rhc package version?

  # rpm -q rhc
  # semodule -lfull | grep rhcd

Comment 2 libhe 2023-04-20 13:35:40 UTC
[ec2-user@ip-10-0-25-4 ~]$ rpm -q rhc
rhc-0.2.2-1.el9.x86_64

[ec2-user@ip-10-0-25-4 ~]$ sudo semodule -lfull | grep rhcd
100 rhcd                         pp

Comment 3 Zdenek Pytela 2023-04-20 13:42:36 UTC
Are you aware of any SELinux- or rhc-related changes since installation? The rhcd_t domain should be permissive (a recovery sketch follows the listing below):

rhel93# rpm -q rhc
rhc-0.2.2-1.el9.x86_64
rhel93# rpm -q rhc --scripts
postinstall scriptlet (using /bin/sh):
if [ -x /usr/sbin/selinuxenabled ] && /usr/sbin/selinuxenabled; then
    /usr/sbin/semanage permissive --add rhcd_t || true
fi
postuninstall scriptlet (using /bin/sh):
if [ $1 -eq 0 ]; then
    if [ -x /usr/sbin/selinuxenabled ] && /usr/sbin/selinuxenabled; then
        /usr/sbin/semanage permissive --delete rhcd_t || true
    fi
fi
rhel93# semodule -lfull | grep rhcd
400 permissive_rhcd_t            cil
100 rhcd                         pp
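
If the permissive_rhcd_t entry is missing, a minimal recovery sketch (it mirrors what the postinstall scriptlet does and assumes the policycoreutils-python-utils package, which provides semanage, is installed):

  # semanage permissive --add rhcd_t    # re-add the permissive domain the scriptlet would have created
  # semodule -lfull | grep rhcd         # should now list both permissive_rhcd_t and rhcd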

Comment 4 libhe 2023-04-20 14:05:52 UTC
No changes were made to SELinux or rhc since installation with the latest AWS AMI.

Comment 5 Zdenek Pytela 2023-04-20 18:04:11 UTC
Can you also run this?

  $ rpm -q rhc --scripts

The only explanation that comes to mind is that SELinux was disabled when rhc was being installed. In that case, reinstallation should help:

  # dnf reinstall rhc
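
After reinstalling, a quick verification sketch (one would expect the permissive_rhcd_t entry to reappear and no new denials to be logged):

  # semodule -lfull | grep rhcd    # should list permissive_rhcd_t again
  # ausearch -m AVC -ts recent     # check for new denials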

Comment 6 libhe 2023-04-21 02:34:41 UTC
(In reply to Zdenek Pytela from comment #5)
> Can you also run this?
> 
>   $ rpm -q rhc --scripts
> 
> The only explanation which comes to my mind is that selinux was disabled
> when rhc was being installed. In that case, reinstallation should help:

[ec2-user@ip-10-0-16-30 ~]$ rpm -q rhc --scripts
postinstall scriptlet (using /bin/sh):
if [ -x /usr/sbin/selinuxenabled ] && /usr/sbin/selinuxenabled; then
    /usr/sbin/semanage permissive --add rhcd_t || true
fi
postuninstall scriptlet (using /bin/sh):
if [ $1 -eq 0 ]; then
    if [ -x /usr/sbin/selinuxenabled ] && /usr/sbin/selinuxenabled; then
        /usr/sbin/semanage permissive --delete rhcd_t || true
    fi
fi

> 
>   # dnf reinstall rhc
Yes, reinstallation works.

Comment 7 libhe 2023-04-21 02:39:47 UTC
BTW, with builds before 20230419, this issue was not observed.

Here is the result with build RHEL-9.2.0-20230418.20_x86_64
[ec2-user@ip-10-0-22-92 ~]$ sudo semodule -lfull | grep rhcd
100 rhcd                         pp          

[ec2-user@ip-10-0-22-92 ~]$ sudo ausearch -m AVC -ts today
<no matches>

[ec2-user@ip-10-0-22-92 ~]$ rpm -q rhc
rhc-0.2.2-1.el9.x86_64

Comment 8 Zdenek Pytela 2023-08-11 14:48:45 UTC
Switching the component, but I believe all issues like this have already been addressed.

Comment 12 Rehana 2023-09-06 09:25:02 UTC
Thanks for confirming that this is no longer an issue. Please feel free to file a new report if the problem appears again. Closing this for now.

Comment 14 RHEL Program Management 2023-10-19 13:51:40 UTC
Issue migration from Bugzilla to Jira is in progress at this time. This will be the last message in Jira copied from the Bugzilla bug.

Comment 15 RHEL Program Management 2023-10-19 13:55:54 UTC
This BZ has been automatically migrated to the issues.redhat.com Red Hat Issue Tracker. All future work related to this report will be managed there.

Due to differences in account names between systems, some fields were not replicated. Be sure to add yourself to the Jira issue's "Watchers" field to continue receiving updates, and add others to the "Need Info From" field to continue requesting information.

To find the migrated issue, look in the "Links" section for a direct link to the new issue location. The issue key will have an icon of 2 footprints next to it, and begin with "RHEL-" followed by an integer.  You can also find this issue by visiting https://issues.redhat.com/issues/?jql= and searching the "Bugzilla Bug" field for this BZ's number, e.g. a search like:

"Bugzilla Bug" = 1234567

In the event you have trouble locating or viewing this issue, you can file an issue by sending mail to rh-issues. You can also visit https://access.redhat.com/articles/7032570 for general account information.

