Bug 1331585 - [SELinux]: Cases in the pynfs test suite fail because of SELinux errors.
Summary: [SELinux]: Cases in the pynfs test suite fail because of SELinux errors.
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: Red Hat Enterprise Linux 6
Classification: Red Hat
Component: selinux-policy
Version: 6.8
Hardware: x86_64
OS: Linux
Priority: high
Severity: urgent
Target Milestone: rc
Target Release: ---
Assignee: Lukas Vrabec
QA Contact: Milos Malik
Docs Contact: Mirek Jahoda
URL:
Whiteboard:
Depends On:
Blocks: 1343211 1380695 1393267
 
Reported: 2016-04-28 21:36 UTC by Shashank Raj
Modified: 2017-03-21 09:46 UTC
CC: 19 users

Fixed In Version: selinux-policy-3.7.19-293.el6
Doc Type: Bug Fix
Doc Text:
Previously, with SELinux in enforcing mode, it was not possible to create a UNIX domain socket on Red Hat Gluster Storage volumes. As a consequence, the user could not store containers on the volumes. The relevant policy module has been updated, and the user is now able to store containers on Red Hat Gluster Storage volumes.
Clone Of: 1331559
: 1393267 (view as bug list)
Environment:
Last Closed: 2017-03-21 09:46:06 UTC
Target Upstream Version:
Embargoed:




Links
System ID Private Priority Status Summary Last Updated
Red Hat Product Errata RHBA-2017:0627 0 normal SHIPPED_LIVE selinux-policy bug fix update 2017-03-21 12:29:23 UTC

Description Shashank Raj 2016-04-28 21:36:05 UTC
+++ This bug was initially created as a clone of Bug #1331559 +++

Description of problem:

Cases in the pynfs test suite fail because of SELinux errors.

Version-Release number of selected component (if applicable):

nfs-ganesha-2.3.1-4

selinux-policy-3.13.1-60.el7_2.3.noarch
selinux-policy-targeted-3.13.1-60.el7_2.3.noarch
selinux-policy-devel-3.13.1-60.el7_2.3.noarch

How reproducible:

Always

Steps to Reproduce:

1. Configure nfs-ganesha on a 4 node cluster.
2. Create a dist-rep volume and enable ganesha on it
3. From the client start executing pynfs test suite on the volume

Observe that some of the cases fail and some of them are skipped:

LOOKSOCK st_lookup.testSocket                                     : FAILURE
           LOOKUP of /testvolume/tree/socket should return
           NFS4_OK, instead got NFS4ERR_NOENT

MKSOCK   st_create.testSocket                                     : FAILURE
           CREATE in empty dir should return NFS4_OK, instead got
           NFS4ERR_ACCESS


ACC1s    st_access.testReadSocket                                 : OMIT
           Dependency LOOKSOCK st_lookup.testSocket had status
           FAILURE.


RM1s     st_remove.testSocket                                     : OMIT
           Dependency MKSOCK st_create.testSocket had status
           FAILURE.

4. Observe that the AVC denials below are seen in /var/log/audit/audit.log on the nodes; these denials cause the failures in the pynfs test suite:

type=AVC msg=audit(1461894042.391:1523): avc:  denied  { create } for  pid=3648 comm="glusterfsd" name="block" scontext=system_u:system_r:glusterd_t:s0 tcontext=system_u:object_r:file_t:s0 tclass=blk_file

type=AVC msg=audit(1461894042.395:1524): avc:  denied  { create } for  pid=3596 comm="glusterfsd" name="char" scontext=system_u:system_r:glusterd_t:s0 tcontext=system_u:object_r:file_t:s0 tclass=chr_file

type=AVC msg=audit(1461894568.891:1525): avc:  denied  { create } for  pid=12946 comm="glusterfsd" name="MKCHAR" scontext=system_u:system_r:glusterd_t:s0 tcontext=system_u:object_r:file_t:s0 tclass=chr_file

type=AVC msg=audit(1461894568.896:1526): avc:  denied  { create } for  pid=12945 comm="glusterfsd" name="MKFIFO" scontext=system_u:system_r:glusterd_t:s0 tcontext=system_u:object_r:file_t:s0 tclass=fifo_file

type=AVC msg=audit(1461894568.908:1527): avc:  denied  { create } for  pid=12943 comm="glusterfsd" name="MKSOCK" scontext=system_u:system_r:glusterd_t:s0 tcontext=system_u:object_r:file_t:s0 tclass=sock_file

type=AVC msg=audit(1461894042.751:7314): avc:  denied  { create } for  pid=20049 comm="glusterfsd" name="socket" scontext=unconfined_u:system_r:glusterd_t:s0 tcontext=unconfined_u:object_r:file_t:s0 tclass=sock_file

type=AVC msg=audit(1461894042.754:7315): avc:  denied  { create } for  pid=28872 comm="glusterfsd" name="fifo" scontext=unconfined_u:system_r:glusterd_t:s0 tcontext=unconfined_u:object_r:file_t:s0 tclass=fifo_file

type=AVC msg=audit(1461894569.237:7316): avc:  denied  { create } for  pid=30659 comm="glusterfsd" name="MKBLK" scontext=unconfined_u:system_r:glusterd_t:s0 tcontext=unconfined_u:object_r:file_t:s0 tclass=blk_file

type=AVC msg=audit(1461896527.268:1693): avc:  denied  { setattr } for  pid=24468 comm="glusterfsd" name="block" dev=dm-41 ino=16779714 scontext=system_u:system_r:glusterd_t:s0 tcontext=system_u:object_r:file_t:s0 tclass=blk_file

type=AVC msg=audit(1461896527.268:1694): avc:  denied  { getattr } for  pid=24468 comm="glusterfsd" path="/bricks/brick2/b2/tree/block" dev=dm-41 ino=16779714 scontext=system_u:system_r:glusterd_t:s0 tcontext=system_u:object_r:file_t:s0 tclass=blk_file

type=AVC msg=audit(1461896527.268:1695): avc:  denied  { link } for  pid=24468 comm="glusterfsd" name="block" dev=dm-41 ino=16779714 scontext=system_u:system_r:glusterd_t:s0 tcontext=system_u:object_r:file_t:s0 tclass=blk_file

type=AVC msg=audit(1461896527.272:1697): avc:  denied  { setattr } for  pid=24465 comm="glusterfsd" name="char" dev=dm-51 ino=8388994 scontext=system_u:system_r:glusterd_t:s0 tcontext=system_u:object_r:file_t:s0 tclass=chr_file

type=AVC msg=audit(1461896527.272:1698): avc:  denied  { getattr } for  pid=24465 comm="glusterfsd" path="/bricks/brick0/b0/tree/char" dev=dm-51 ino=8388994 scontext=system_u:system_r:glusterd_t:s0 tcontext=system_u:object_r:file_t:s0 tclass=chr_file

type=AVC msg=audit(1461896527.272:1699): avc:  denied  { link } for  pid=24465 comm="glusterfsd" name="char" dev=dm-51 ino=8388994 scontext=system_u:system_r:glusterd_t:s0 tcontext=system_u:object_r:file_t:s0 tclass=chr_file

type=AVC msg=audit(1461897054.208:1707): avc:  denied  { setattr } for  pid=26454 comm="glusterfsd" name="MKFIFO" dev=dm-51 ino=62917347 scontext=system_u:system_r:glusterd_t:s0 tcontext=system_u:object_r:file_t:s0 tclass=fifo_file

type=AVC msg=audit(1461897054.208:1708): avc:  denied  { getattr } for  pid=26454 comm="glusterfsd" path="/bricks/brick0/b0/tmp/MKFIFO" dev=dm-51 ino=62917347 scontext=system_u:system_r:glusterd_t:s0 tcontext=system_u:object_r:file_t:s0 tclass=fifo_file

type=AVC msg=audit(1461897054.208:1709): avc:  denied  { link } for  pid=26454 comm="glusterfsd" name="MKFIFO" dev=dm-51 ino=62917347 scontext=system_u:system_r:glusterd_t:s0 tcontext=system_u:object_r:file_t:s0 tclass=fifo_file

type=AVC msg=audit(1461897054.219:1711): avc:  denied  { setattr } for  pid=26450 comm="glusterfsd" name="MKSOCK" dev=dm-41 ino=4502948 scontext=system_u:system_r:glusterd_t:s0 tcontext=system_u:object_r:file_t:s0 tclass=sock_file

type=AVC msg=audit(1461897054.219:1712): avc:  denied  { getattr } for  pid=26450 comm="glusterfsd" path="/bricks/brick2/b2/tmp/MKSOCK" dev=dm-41 ino=4502948 scontext=system_u:system_r:glusterd_t:s0 tcontext=system_u:object_r:file_t:s0 tclass=sock_file

type=AVC msg=audit(1461897054.219:1713): avc:  denied  { link } for  pid=26450 comm="glusterfsd" name="MKSOCK" dev=dm-41 ino=4502948 scontext=system_u:system_r:glusterd_t:s0 tcontext=system_u:object_r:file_t:s0 tclass=sock_file

type=AVC msg=audit(1461897176.745:1720): avc:  denied  { unlink } for  pid=26451 comm="glusterfsd" name="e7764b03-7bbe-4bbc-8625-b5e8e7aa3457" dev=dm-41 ino=4194565 scontext=system_u:system_r:glusterd_t:s0 tcontext=system_u:object_r:file_t:s0 tclass=blk_file

type=AVC msg=audit(1461897535.876:1721): avc:  denied  { unlink } for  pid=28513 comm="glusterfsd" name="0443e6f9-9d95-43f7-a9d7-2f9bee102ecd" dev=dm-41 ino=4194589 scontext=system_u:system_r:glusterd_t:s0 tcontext=system_u:object_r:file_t:s0 tclass=chr_file

type=AVC msg=audit(1461897535.999:1722): avc:  denied  { unlink } for  pid=29826 comm="glusterfsd" name="cecccfb9-6e81-429a-a9fa-3066ae4b375e" dev=dm-51 ino=62914835 scontext=system_u:system_r:glusterd_t:s0 tcontext=system_u:object_r:file_t:s0 tclass=fifo_file

type=AVC msg=audit(1461897538.841:1723): avc:  denied  { unlink } for  pid=29825 comm="glusterfsd" name="4a4759ae-7942-49ad-842c-b1eb0522268e" dev=dm-51 ino=62914836 scontext=system_u:system_r:glusterd_t:s0 tcontext=system_u:object_r:file_t:s0 tclass=sock_file

type=AVC msg=audit(1461896527.628:7555): avc:  denied  { setattr } for  pid=11943 comm="glusterfsd" name="socket" dev=dm-6 ino=50335810 scontext=unconfined_u:system_r:glusterd_t:s0 tcontext=unconfined_u:object_r:file_t:s0 tclass=sock_file

type=AVC msg=audit(1461896527.628:7556): avc:  denied  { getattr } for  pid=11943 comm="glusterfsd" path="/bricks/brick0/b0/tree/socket" dev=dm-6 ino=50335810 scontext=unconfined_u:system_r:glusterd_t:s0 tcontext=unconfined_u:object_r:file_t:s0 tclass=sock_file

type=AVC msg=audit(1461896527.629:7557): avc:  denied  { link } for  pid=11943 comm="glusterfsd" name="socket" dev=dm-6 ino=50335810 scontext=unconfined_u:system_r:glusterd_t:s0 tcontext=unconfined_u:object_r:file_t:s0 tclass=sock_file

type=AVC msg=audit(1461896527.632:7559): avc:  denied  { setattr } for  pid=11943 comm="glusterfsd" name="fifo" dev=dm-6 ino=50335811 scontext=unconfined_u:system_r:glusterd_t:s0 tcontext=unconfined_u:object_r:file_t:s0 tclass=fifo_file

type=AVC msg=audit(1461896527.632:7560): avc:  denied  { getattr } for  pid=11943 comm="glusterfsd" path="/bricks/brick0/b0/tree/fifo" dev=dm-6 ino=50335811 scontext=unconfined_u:system_r:glusterd_t:s0 tcontext=unconfined_u:object_r:file_t:s0 tclass=fifo_file
type=AVC msg=audit(1461896527.632:7561): avc:  denied  { link } for  pid=11943 comm="glusterfsd" name="fifo" dev=dm-6 ino=50335811 scontext=unconfined_u:system_r:glusterd_t:s0 tcontext=unconfined_u:object_r:file_t:s0 tclass=fifo_file

type=AVC msg=audit(1461897054.547:7569): avc:  denied  { setattr } for  pid=13915 comm="glusterfsd" name="MKBLK" dev=dm-16 ino=46139883 scontext=unconfined_u:system_r:glusterd_t:s0 tcontext=unconfined_u:object_r:file_t:s0 tclass=blk_file

type=AVC msg=audit(1461897054.547:7570): avc:  denied  { getattr } for  pid=13915 comm="glusterfsd" path="/bricks/brick2/b2/tmp/MKBLK" dev=dm-16 ino=46139883 scontext=unconfined_u:system_r:glusterd_t:s0 tcontext=unconfined_u:object_r:file_t:s0 tclass=blk_file

type=AVC msg=audit(1461897054.547:7571): avc:  denied  { link } for  pid=13915 comm="glusterfsd" name="MKBLK" dev=dm-16 ino=46139883 scontext=unconfined_u:system_r:glusterd_t:s0 tcontext=unconfined_u:object_r:file_t:s0 tclass=blk_file

type=AVC msg=audit(1461897177.101:7578): avc:  denied  { create } for  pid=13919 comm="glusterfsd" name="RM1c" scontext=unconfined_u:system_r:glusterd_t:s0 tcontext=unconfined_u:object_r:file_t:s0 tclass=chr_file

type=AVC msg=audit(1461897177.102:7579): avc:  denied  { setattr } for  pid=13919 comm="glusterfsd" name="RM1c" dev=dm-16 ino=46139911 scontext=unconfined_u:system_r:glusterd_t:s0 tcontext=unconfined_u:object_r:file_t:s0 tclass=chr_file

type=AVC msg=audit(1461897177.102:7580): avc:  denied  { getattr } for  pid=13919 comm="glusterfsd" path="/bricks/brick2/b2/tmp/RM1c" dev=dm-16 ino=46139911 scontext=unconfined_u:system_r:glusterd_t:s0 tcontext=unconfined_u:object_r:file_t:s0 tclass=chr_file

type=AVC msg=audit(1461897177.102:7581): avc:  denied  { link } for  pid=13919 comm="glusterfsd" name="RM1c" dev=dm-16 ino=46139911 scontext=unconfined_u:system_r:glusterd_t:s0 tcontext=unconfined_u:object_r:file_t:s0 tclass=chr_file

type=AVC msg=audit(1461897177.104:7582): avc:  denied  { unlink } for  pid=11926 comm="glusterfsd" name="08a1506f-32d0-4a77-a052-afd268ecaa76" dev=dm-16 ino=46139911 scontext=unconfined_u:system_r:glusterd_t:s0 tcontext=unconfined_u:object_r:file_t:s0 tclass=chr_file

type=AVC msg=audit(1461897177.144:7583): avc:  denied  { create } for  pid=15969 comm="glusterfsd" name="RM1f" scontext=unconfined_u:system_r:glusterd_t:s0 tcontext=unconfined_u:object_r:file_t:s0 tclass=fifo_file

type=AVC msg=audit(1461897177.144:7584): avc:  denied  { setattr } for  pid=15969 comm="glusterfsd" name="RM1f" dev=dm-11 ino=62914817 scontext=unconfined_u:system_r:glusterd_t:s0 tcontext=unconfined_u:object_r:file_t:s0 tclass=fifo_file

type=AVC msg=audit(1461897177.144:7585): avc:  denied  { getattr } for  pid=15969 comm="glusterfsd" path="/bricks/brick1/b1/tmp/RM1f" dev=dm-11 ino=62914817 scontext=unconfined_u:system_r:glusterd_t:s0 tcontext=unconfined_u:object_r:file_t:s0 tclass=fifo_file

type=AVC msg=audit(1461897177.144:7586): avc:  denied  { link } for  pid=15969 comm="glusterfsd" name="RM1f" dev=dm-11 ino=62914817 scontext=unconfined_u:system_r:glusterd_t:s0 tcontext=unconfined_u:object_r:file_t:s0 tclass=fifo_file

type=AVC msg=audit(1461897177.147:7587): avc:  denied  { unlink } for  pid=15969 comm="glusterfsd" name="65c5010e-48dd-4688-b54f-dc576965041f" dev=dm-11 ino=62914817 scontext=unconfined_u:system_r:glusterd_t:s0 tcontext=unconfined_u:object_r:file_t:s0 tclass=fifo_file

type=AVC msg=audit(1461897177.172:7588): avc:  denied  { unlink } for  pid=11926 comm="glusterfsd" name="480ecc03-d335-4f16-b220-2af6c2337415" dev=dm-16 ino=46139911 scontext=unconfined_u:system_r:glusterd_t:s0 tcontext=unconfined_u:object_r:file_t:s0 tclass=sock_file

type=AVC msg=audit(1461897427.746:7589): avc:  denied  { create } for  pid=17217 comm="glusterfsd" name="SATT12b" scontext=unconfined_u:system_r:glusterd_t:s0 tcontext=unconfined_u:object_r:file_t:s0 tclass=blk_file

type=AVC msg=audit(1461897427.747:7590): avc:  denied  { setattr } for  pid=17217 comm="glusterfsd" name="SATT12b" dev=dm-6 ino=37749018 scontext=unconfined_u:system_r:glusterd_t:s0 tcontext=unconfined_u:object_r:file_t:s0 tclass=blk_file

type=AVC msg=audit(1461897427.747:7591): avc:  denied  { getattr } for  pid=17217 comm="glusterfsd" path="/bricks/brick0/b0/tmp/SATT12b" dev=dm-6 ino=37749018 scontext=unconfined_u:system_r:glusterd_t:s0 tcontext=unconfined_u:object_r:file_t:s0 tclass=blk_file

type=AVC msg=audit(1461897427.747:7592): avc:  denied  { link } for  pid=17217 comm="glusterfsd" name="SATT12b" dev=dm-6 ino=37749018 scontext=unconfined_u:system_r:glusterd_t:s0 tcontext=unconfined_u:object_r:file_t:s0 tclass=blk_file

type=AVC msg=audit(1461897535.952:7593): avc:  denied  { unlink } for  pid=15970 comm="glusterfsd" name="89339ae6-7560-475f-940a-4ca77ad937a0" dev=dm-6 ino=37749018 scontext=unconfined_u:system_r:glusterd_t:s0 tcontext=unconfined_u:object_r:file_t:s0 tclass=blk_file


Actual results:

Cases in the pynfs test suite fail because of SELinux errors.

Expected results:

No AVC denials should be seen, and functionality should not be affected.

Additional info:
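The long denial list above is easier to digest in summary form. As a rough sketch (the sample file path here is illustrative, not from the report), the denied permission and target class can be pulled out of saved AVC lines with standard text tools:

```shell
# Save a couple of the AVC lines from the report into a sample file
# (/tmp/avc-sample.log is a hypothetical path used for illustration).
cat > /tmp/avc-sample.log <<'EOF'
type=AVC msg=audit(1461894042.391:1523): avc:  denied  { create } for  pid=3648 comm="glusterfsd" name="block" scontext=system_u:system_r:glusterd_t:s0 tcontext=system_u:object_r:file_t:s0 tclass=blk_file
type=AVC msg=audit(1461896527.268:1694): avc:  denied  { getattr } for  pid=24468 comm="glusterfsd" path="/bricks/brick2/b2/tree/block" dev=dm-41 ino=16779714 scontext=system_u:system_r:glusterd_t:s0 tcontext=system_u:object_r:file_t:s0 tclass=blk_file
EOF

# Extract the denied permission and the target class from each AVC line,
# then count the unique combinations.
grep 'avc:  denied' /tmp/avc-sample.log \
  | sed -n 's/.*denied  { \([a-z_]*\) }.*tclass=\([a-z_]*\).*/\1 \2/p' \
  | sort | uniq -c
```

Applied to the full audit.log, this collapses the denials into a handful of permission/class pairs (create, setattr, getattr, link, unlink against blk_file, chr_file, fifo_file, and sock_file), which is what the eventual policy fix needs to allow.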

Comment 2 Milos Malik 2016-05-05 07:11:31 UTC
I need to know the output of the following command:

# matchpathcon /bricks  /bricks/brick0

It seems that the bricks mount point is completely mislabeled. The following command may help, if file context equivalence is set correctly:

# restorecon -Rv /bricks

Comment 3 Shashank Raj 2016-05-05 15:44:58 UTC
I ran the tests again on a fresh setup with details as below:

[root@dhcp43-33 exports]# matchpathcon /bricks  /bricks/brick0
/bricks	system_u:object_r:default_t:s0
/bricks/brick0	system_u:object_r:glusterd_brick_t:s0

In enforcing mode:

There are 2 failed cases and around 50 dependent skipped cases; the failed cases are below:

LOOKSOCK st_lookup.testSocket                                     : FAILURE
           LOOKUP of /testvolume/tree/socket should return
           NFS4_OK, instead got NFS4ERR_NOENT

MKSOCK   st_create.testSocket                                     : FAILURE
           CREATE in empty dir should return NFS4_OK, instead got
           NFS4ERR_ACCESS

and the following AVC denials are observed in audit.log:

type=AVC msg=audit(1462480124.699:350): avc:  denied  { create } for  pid=16497 comm="glusterfsd" name="MKSOCK" scontext=unconfined_u:system_r:glusterd_t:s0 tcontext=unconfined_u:object_r:glusterd_brick_t:s0 tclass=sock_file

type=AVC msg=audit(1462480124.870:425): avc:  denied  { create } for  pid=15296 comm="glusterfsd" name="MKSOCK" scontext=unconfined_u:system_r:glusterd_t:s0 tcontext=unconfined_u:object_r:glusterd_brick_t:s0 tclass=sock_file

type=AVC msg=audit(1462479597.505:415): avc:  denied  { create } for  pid=11567 comm="glusterfsd" name="socket" scontext=unconfined_u:system_r:glusterd_t:s0 tcontext=unconfined_u:object_r:glusterd_brick_t:s0 tclass=sock_file

type=AVC msg=audit(1462479598.306:427): avc:  denied  { create } for  pid=9570 comm="glusterfsd" name="socket" scontext=unconfined_u:system_r:glusterd_t:s0 tcontext=unconfined_u:object_r:glusterd_brick_t:s0 tclass=sock_file

--------------------------------------------------------------------------

In permissive mode:

All the cases passed, with the AVC denials below in audit.log:

type=AVC msg=audit(1462481726.816:383): avc:  denied  { create } for  pid=25652 comm="glusterfsd" name="MKSOCK" scontext=unconfined_u:system_r:glusterd_t:s0 tcontext=unconfined_u:object_r:glusterd_brick_t:s0 tclass=sock_file

type=AVC msg=audit(1462481199.769:448): avc:  denied  { create } for  pid=13766 comm="glusterfsd" name="socket" scontext=unconfined_u:system_r:glusterd_t:s0 tcontext=unconfined_u:object_r:glusterd_brick_t:s0 tclass=sock_file


type=AVC msg=audit(1462481726.816:384): avc:  denied  { setattr } for  pid=25652 comm="glusterfsd" name="MKSOCK" dev=dm-16 ino=62915497 scontext=unconfined_u:system_r:glusterd_t:s0 tcontext=unconfined_u:object_r:glusterd_brick_t:s0 tclass=sock_file

type=AVC msg=audit(1462481199.770:449): avc:  denied  { setattr } for  pid=13766 comm="glusterfsd" name="socket" dev=dm-6 ino=8388888 scontext=unconfined_u:system_r:glusterd_t:s0 tcontext=unconfined_u:object_r:glusterd_brick_t:s0 tclass=sock_file

type=AVC msg=audit(1462481726.816:385): avc:  denied  { getattr } for  pid=25652 comm="glusterfsd" path="/bricks/brick2/b2/tmp/MKSOCK" dev=dm-16 ino=62915497 scontext=unconfined_u:system_r:glusterd_t:s0 tcontext=unconfined_u:object_r:glusterd_brick_t:s0 tclass=sock_file

type=AVC msg=audit(1462481199.770:450): avc:  denied  { getattr } for  pid=13766 comm="glusterfsd" path="/bricks/brick0/b0/tree/socket" dev=dm-6 ino=8388888 scontext=unconfined_u:system_r:glusterd_t:s0 tcontext=unconfined_u:object_r:glusterd_brick_t:s0 tclass=sock_file

type=AVC msg=audit(1462481726.816:386): avc:  denied  { link } for  pid=25652 comm="glusterfsd" name="MKSOCK" dev=dm-16 ino=62915497 scontext=unconfined_u:system_r:glusterd_t:s0 tcontext=unconfined_u:object_r:glusterd_brick_t:s0 tclass=sock_file

type=AVC msg=audit(1462481199.770:451): avc:  denied  { link } for  pid=13766 comm="glusterfsd" name="socket" dev=dm-6 ino=8388888 scontext=unconfined_u:system_r:glusterd_t:s0 tcontext=unconfined_u:object_r:glusterd_brick_t:s0 tclass=sock_file

type=AVC msg=audit(1462482205.624:393): avc:  denied  { unlink } for  pid=29560 comm="glusterfsd" name="e7049fab-60bc-4562-a1bc-575a2e12de0d" dev=dm-6 ino=62917648 scontext=unconfined_u:system_r:glusterd_t:s0 tcontext=unconfined_u:object_r:glusterd_brick_t:s0 tclass=sock_file

type=AVC msg=audit(1462482099.851:468): avc:  denied  { create } for  pid=27431 comm="glusterfsd" name="SATT6s" scontext=unconfined_u:system_r:glusterd_t:s0 tcontext=unconfined_u:object_r:glusterd_brick_t:s0 tclass=sock_file

type=AVC msg=audit(1462481848.917:458): avc:  denied  { unlink } for  pid=25033 comm="glusterfsd" name="4a878063-2595-4aa4-8768-e03ebd149b1e" dev=dm-16 ino=62914943 scontext=unconfined_u:system_r:glusterd_t:s0 tcontext=unconfined_u:object_r:glusterd_brick_t:s0 tclass=sock_file

type=AVC msg=audit(1462481849.717:470): avc:  denied  { unlink } for  pid=24276 comm="glusterfsd" name="4a878063-2595-4aa4-8768-e03ebd149b1e" dev=dm-16 ino=62914943 scontext=unconfined_u:system_r:glusterd_t:s0 tcontext=unconfined_u:object_r:glusterd_brick_t:s0 tclass=sock_file

type=AVC msg=audit(1462482099.852:469): avc:  denied  { setattr } for  pid=27431 comm="glusterfsd" name="SATT6s" dev=dm-6 ino=62917648 scontext=unconfined_u:system_r:glusterd_t:s0 tcontext=unconfined_u:object_r:glusterd_brick_t:s0 tclass=sock_file

type=AVC msg=audit(1462481200.570:460): avc:  denied  { create } for  pid=23443 comm="glusterfsd" name="socket" scontext=unconfined_u:system_r:glusterd_t:s0 tcontext=unconfined_u:object_r:glusterd_brick_t:s0 tclass=sock_file

type=AVC msg=audit(1462482099.852:470): avc:  denied  { getattr } for  pid=27431 comm="glusterfsd" path="/bricks/brick0/b0/tmp/SATT6s" dev=dm-6 ino=62917648 scontext=unconfined_u:system_r:glusterd_t:s0 tcontext=unconfined_u:object_r:glusterd_brick_t:s0 tclass=sock_file

Let me know if any other information is required.

Comment 4 Milos Malik 2016-05-06 06:52:21 UTC
Could you re-run your scenario after applying this workaround?

# cat bz1331561.te
policy_module(bz1331561, 1.0)

require {
  type glusterd_t;
  type glusterd_brick_t;
  class sock_file { create getattr setattr link unlink };
}

allow glusterd_t glusterd_brick_t : sock_file { create getattr setattr link unlink };

# make -f /usr/share/selinux/devel/Makefile 
Compiling targeted bz1331561 module
/usr/bin/checkmodule:  loading policy configuration from tmp/bz1331561.tmp
/usr/bin/checkmodule:  policy configuration loaded
/usr/bin/checkmodule:  writing binary representation (version 17) to tmp/bz1331561.mod
Creating targeted bz1331561.pp policy package
rm tmp/bz1331561.mod.fc tmp/bz1331561.mod
# semodule -i bz1331561.pp 
#
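Before compiling, a quick sanity check (a hypothetical convenience step, not part of the original comment) can confirm that the module's allow rule covers every permission seen in the sock_file AVCs:

```shell
# Write the workaround module source (same content as above) to a
# temporary copy for checking.
cat > /tmp/bz1331561.te <<'EOF'
policy_module(bz1331561, 1.0)

require {
  type glusterd_t;
  type glusterd_brick_t;
  class sock_file { create getattr setattr link unlink };
}

allow glusterd_t glusterd_brick_t : sock_file { create getattr setattr link unlink };
EOF

# Each permission denied in the AVC messages must appear in the allow rule;
# report any that are missing.
for perm in create getattr setattr link unlink; do
  grep '^allow' /tmp/bz1331561.te | grep -q "$perm" || echo "missing: $perm"
done
```

If the loop prints nothing, the rule matches the observed denials and the module can be built and loaded as shown above.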

Comment 5 Shashank Raj 2016-05-06 09:58:12 UTC
Verified the bug with the above workaround: all the cases pass without any issues, and no AVCs are seen in audit.log.

Comment 15 errata-xmlrpc 2017-03-21 09:46:06 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory, and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://rhn.redhat.com/errata/RHBA-2017-0627.html

