Bug 1202924 - httpd status shows several puppet error (warning?) messages
Summary: httpd status shows several puppet error (warning?) messages
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: Red Hat Satellite
Classification: Red Hat
Component: SELinux
Version: 6.1.0
Hardware: Unspecified
OS: Unspecified
Priority: low
Severity: low
Target Milestone: Unspecified
Assignee: Lukas Zapletal
QA Contact: Og Maciel
URL: http://projects.theforeman.org/issues...
Whiteboard:
Duplicates: 1219964 1364124 1416781
Depends On:
Blocks: 1122832 1479962
 
Reported: 2015-03-17 17:37 UTC by Og Maciel
Modified: 2021-06-10 10:52 UTC (History)
CC List: 27 users

Fixed In Version: foreman-selinux-1.15.4
Doc Type: Bug Fix
Doc Text:
Clone Of:
Environment:
Last Closed: 2018-02-21 16:54:17 UTC
Target Upstream Version:


Attachments
foreman-debug (215.01 KB, application/x-xz)
2015-03-17 17:37 UTC, Og Maciel
attached strace of puppet-master process (342.08 KB, application/x-gzip)
2017-02-27 14:21 UTC, Peter Vreman
selinux avs with dontaudit disabled (110.49 KB, text/x-vhdl)
2017-03-09 14:07 UTC, Peter Vreman
/var/log/messages during the reproducing (29.30 KB, text/plain)
2017-03-09 14:07 UTC, Peter Vreman


Links
System ID Private Priority Status Summary Last Updated
Foreman Issue Tracker 9805 0 Normal Closed Puppet master generates lots of invalid context messages 2020-03-31 10:03:09 UTC

Description Og Maciel 2015-03-17 17:37:56 UTC
Created attachment 1002886 [details]
foreman-debug

Description of problem:

httpd status shows several puppet error (warning?) messages. I am not sure what they really mean or what they could mean for this installation, but it looks like it is failing to get puppet to run against the Satellite instance itself.

Version-Release number of selected component (if applicable):

* Satellite-6.1.0-RHEL-7-20150311.1

How reproducible:


Steps to Reproduce:
1. Install latest Satellite 6.1 build
2. Check the status of httpd (service httpd status or systemctl status httpd -l)

Actual results:

Several puppet-related error messages, including a return code of 1 when 0 was expected.

Expected results:


Additional info:

# systemctl status httpd -l
httpd.service - The Apache HTTP Server
   Loaded: loaded (/usr/lib/systemd/system/httpd.service; enabled)
   Active: active (running) since Tue 2015-03-17 11:57:09 EDT; 30min ago
 Main PID: 19677 (httpd)
   Status: "Total requests: 0; Current requests/sec: 0; Current traffic:   0 B/sec"
   CGroup: /system.slice/httpd.service
           ├─19677 /usr/sbin/httpd -DFOREGROUND
           ├─19709 PassengerWatchdog
           ├─19712 PassengerHelperAgent
           ├─19719 PassengerLoggingAgent
           ├─19728 (wsgi:pulp)     -DFOREGROUND
           ├─19729 /usr/sbin/httpd -DFOREGROUND
           ├─19730 /usr/sbin/httpd -DFOREGROUND
           ├─19731 /usr/sbin/httpd -DFOREGROUND
           ├─19732 /usr/sbin/httpd -DFOREGROUND
           ├─19733 /usr/sbin/httpd -DFOREGROUND
           ├─19734 /usr/sbin/httpd -DFOREGROUND
           ├─19735 /usr/sbin/httpd -DFOREGROUND
           ├─19736 /usr/sbin/httpd -DFOREGROUND
           ├─20398 Passenger RackApp: /usr/share/foreman
           └─21073 Passenger RackApp: /etc/puppet/rack

Mar 17 12:02:03 ibm-x3250m4-06.lab.eng.rdu2.redhat.com puppet-master[21049]: /etc/selinux/targeted/contexts/files/file_contexts:  invalid context system_u:object_r:puppet_etc_t:s0
Mar 17 12:02:03 ibm-x3250m4-06.lab.eng.rdu2.redhat.com puppet-master[21049]: /etc/selinux/targeted/contexts/files/file_contexts:  invalid context system_u:object_r:puppet_var_lib_t:s0
Mar 17 12:02:03 ibm-x3250m4-06.lab.eng.rdu2.redhat.com puppet-master[21049]: /etc/selinux/targeted/contexts/files/file_contexts:  invalid context system_u:object_r:puppet_var_lib_t:s0
Mar 17 12:02:03 ibm-x3250m4-06.lab.eng.rdu2.redhat.com puppet-master[21049]: /etc/selinux/targeted/contexts/files/file_contexts:  invalid context system_u:object_r:puppet_var_lib_t:s0
Mar 17 12:02:03 ibm-x3250m4-06.lab.eng.rdu2.redhat.com puppet-master[21049]: /etc/selinux/targeted/contexts/files/file_contexts:  invalid context system_u:object_r:puppet_var_lib_t:s0
Mar 17 12:02:03 ibm-x3250m4-06.lab.eng.rdu2.redhat.com puppet-master[21049]: Starting Puppet master version 3.6.2
Mar 17 12:02:03 ibm-x3250m4-06.lab.eng.rdu2.redhat.com puppet-master[21073]: Failed to find ibm-x3250m4-06.lab.eng.rdu2.redhat.com via exec: Execution of '/etc/puppet/node.rb ibm-x3250m4-06.lab.eng.rdu2.redhat.com' returned 1:
Mar 17 12:02:09 ibm-x3250m4-06.lab.eng.rdu2.redhat.com puppet-master[21073]: Compiled catalog for ibm-x3250m4-06.lab.eng.rdu2.redhat.com in environment production in 0.03 seconds
Mar 17 12:02:44 ibm-x3250m4-06.lab.eng.rdu2.redhat.com puppet-master[21073]: Compiled catalog for ibm-x3250m4-06.lab.eng.rdu2.redhat.com in environment production in 0.00 seconds
Mar 17 12:05:02 ibm-x3250m4-06.lab.eng.rdu2.redhat.com puppet-master[21073]: Compiled catalog for ibm-x3250m4-06.lab.eng.rdu2.redhat.com in environment production in 0.00 seconds

Comment 1 Og Maciel 2015-03-17 17:48:08 UTC
fwiw

# katello-service restart
Shutting down Katello services...
Redirecting to /bin/systemctl stop  foreman-tasks.service
Redirecting to /bin/systemctl stop  httpd.service
Redirecting to /bin/systemctl stop  pulp_celerybeat.service
Redirecting to /bin/systemctl stop  pulp_workers.service
Redirecting to /bin/systemctl stop  pulp_resource_manager.service
Stopping elasticsearch (via systemctl):                    [  OK  ]
Redirecting to /bin/systemctl stop  mongod.service
Redirecting to /bin/systemctl stop  qpidd.service
Done.
Starting Katello services...
Redirecting to /bin/systemctl start  qpidd.service
Redirecting to /bin/systemctl start  mongod.service
Starting elasticsearch (via systemctl):                    [  OK  ]
Redirecting to /bin/systemctl start  pulp_resource_manager.service
Redirecting to /bin/systemctl start  pulp_workers.service
Redirecting to /bin/systemctl start  pulp_celerybeat.service
Redirecting to /bin/systemctl start  httpd.service
Redirecting to /bin/systemctl start  foreman-tasks.service
Done.


# systemctl status httpd -l
httpd.service - The Apache HTTP Server
   Loaded: loaded (/usr/lib/systemd/system/httpd.service; enabled)
   Active: active (running) since Tue 2015-03-17 13:42:22 EDT; 3min 21s ago
  Process: 13738 ExecStop=/bin/kill -WINCH ${MAINPID} (code=exited, status=0/SUCCESS)
 Main PID: 14117 (httpd)
   Status: "Total requests: 0; Current requests/sec: 0; Current traffic:   0 B/sec"
   CGroup: /system.slice/httpd.service
           ├─14117 /usr/sbin/httpd -DFOREGROUND
           ├─14148 PassengerWatchdog
           ├─14152 PassengerHelperAgent
           ├─14160 PassengerLoggingAgent
           ├─14172 (wsgi:pulp)     -DFOREGROUND
           ├─14173 /usr/sbin/httpd -DFOREGROUND
           ├─14174 /usr/sbin/httpd -DFOREGROUND
           ├─14175 /usr/sbin/httpd -DFOREGROUND
           ├─14176 /usr/sbin/httpd -DFOREGROUND
           ├─14177 /usr/sbin/httpd -DFOREGROUND
           ├─14178 /usr/sbin/httpd -DFOREGROUND
           ├─14179 /usr/sbin/httpd -DFOREGROUND
           ├─14180 /usr/sbin/httpd -DFOREGROUND
           ├─14270 /bin/bash /usr/bin/ruby193-ruby /usr/share/gems/gems/passenger-4.0.18/helper-scripts/rack-preloader.rb
           ├─14279 scl enable ruby193 v8314 ruby /usr/share/gems/gems/passenger-4.0.18/helper-scripts/rack-preloader.rb
           ├─14280 /bin/bash /var/tmp/sclEcPl9I
           ├─14286 Passenger AppPreloader: /usr/share/foreman
           ├─14353 Passenger RackApp: /usr/share/foreman
           └─14364 Passenger RackApp: /usr/share/foreman

Mar 17 13:42:23 ibm-x3250m4-06.lab.eng.rdu2.redhat.com pulp[14172]: pulp.server.agent.direct.services:INFO: AMQP broker configured: URL: ssl://ibm-x3250m4-06.lab.eng.rdu2.redhat.com|SSL: ca: /etc/pki/katello/certs/katello-default-ca.crt|key: None|certificate: /etc/pki/katello/qpid_client_striped.crt|host-validation: False
Mar 17 13:42:23 ibm-x3250m4-06.lab.eng.rdu2.redhat.com pulp[14172]: gofer.messaging.adapter.qpid.connection:INFO: connecting: URL: ssl://ibm-x3250m4-06.lab.eng.rdu2.redhat.com|SSL: ca: /etc/pki/katello/certs/katello-default-ca.crt|key: None|certificate: /etc/pki/katello/qpid_client_striped.crt|host-validation: False
Mar 17 13:42:23 ibm-x3250m4-06.lab.eng.rdu2.redhat.com pulp[14172]: gofer.messaging.adapter.qpid.connection:INFO: connected: ssl://ibm-x3250m4-06.lab.eng.rdu2.redhat.com
Mar 17 13:42:23 ibm-x3250m4-06.lab.eng.rdu2.redhat.com pulp[14172]: pulp.server.agent.direct.services:INFO: Task reply handler, started.
Mar 17 13:42:23 ibm-x3250m4-06.lab.eng.rdu2.redhat.com pulp[14172]: pulp.server.agent.direct.services:INFO: AMQP reply handler started
Mar 17 13:42:23 ibm-x3250m4-06.lab.eng.rdu2.redhat.com pulp[14172]: pulp.server.webservices.application:INFO: *************************************************************
Mar 17 13:42:23 ibm-x3250m4-06.lab.eng.rdu2.redhat.com pulp[14172]: pulp.server.webservices.application:INFO: The Pulp server has been successfully initialized
Mar 17 13:42:23 ibm-x3250m4-06.lab.eng.rdu2.redhat.com pulp[14172]: pulp.server.webservices.application:INFO: *************************************************************
Mar 17 13:42:23 ibm-x3250m4-06.lab.eng.rdu2.redhat.com pulp[14172]: gofer.messaging.adapter.qpid.connection:INFO: connecting: URL: ssl://ibm-x3250m4-06.lab.eng.rdu2.redhat.com|SSL: ca: /etc/pki/katello/certs/katello-default-ca.crt|key: None|certificate: /etc/pki/katello/qpid_client_striped.crt|host-validation: False
Mar 17 13:42:23 ibm-x3250m4-06.lab.eng.rdu2.redhat.com pulp[14172]: gofer.messaging.adapter.qpid.connection:INFO: connected: ssl://ibm-x3250m4-06.lab.eng.rdu2.redhat.com

Comment 3 Og Maciel 2015-03-17 18:25:46 UTC
fyi from /var/log/messages:

Mar 17 14:04:58 ibm-x3250m4-06 puppet-master[15099]: /etc/selinux/targeted/contexts/files/file_contexts:  invalid context system_u:object_r:puppet_var_lib_t:s0

Comment 4 Lukas Zapletal 2015-03-18 08:57:20 UTC
We have two issues here. The SELinux one is harmless; I see it upstream too, without any side effects. Reported here:

http://projects.theforeman.org/issues/9804

The initial node.rb run fails for a reason unknown to me. I am no Puppet expert; I reported it upstream and am linking it with this BZ, as it looks more important to me. But again, this is harmless, as the catalog is compiled several times after this error message. It must be some ordering issue in our installer. Looks harmless too.

http://projects.theforeman.org/issues/9805

Setting sat-future as this is nice-to-have.
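The second issue above comes down to exit-code handling: puppet's exec node terminus treats any non-zero exit from the ENC script as a failed node lookup, which is exactly the "returned 1" message in the log. A minimal simulation of that behavior, using a hypothetical stand-in script at /tmp/fake_node.rb (the real /etc/puppet/node.rb queries a live Foreman and cannot be run here):

```shell
# Hypothetical stand-in for /etc/puppet/node.rb: the real script asks Foreman
# for the node's classes; here we only simulate the failing exit status.
cat > /tmp/fake_node.rb <<'EOF'
#!/bin/sh
# Simulate the failure from the log above: exit 1, i.e. "node not found"
exit 1
EOF
chmod +x /tmp/fake_node.rb

# puppet-master only looks at the exit code: non-zero yields the
# "Failed to find <host> via exec: ... returned 1" message seen above
if /tmp/fake_node.rb dummy-host; then
    echo "node data retrieved"
else
    echo "node lookup failed with exit code $?"
fi
```

This matches the observation in the comment: the first lookup fails, but later runs succeed once the host is registered, so the catalog still compiles.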

Comment 6 Ranjan Kumar 2015-10-27 12:05:25 UTC
Output of sestatus:
$ sudo sestatus
[sudo] password for mertensb: 
SELinux status:                 enabled
SELinuxfs mount:                /sys/fs/selinux
SELinux root directory:         /etc/selinux
Loaded policy name:             targeted
Current mode:                   enforcing
Mode from config file:          enforcing
Policy MLS status:              enabled
Policy deny_unknown status:     allowed
Max kernel policy version:      28

Comment 7 Thom Carlin 2016-09-05 17:30:41 UTC
Also saw this in QCI 1.0/Sat 6.2

Comment 8 Peter Vreman 2016-10-05 12:56:46 UTC
This issue occurs more frequently, at least after every reboot.

vrempet@li-lc-1017 ~
$ grep -c /etc/selinux/targeted/contexts/files/file_contexts /var/log/messages
512

vrempet@li-lc-1017 ~
$ grep /etc/selinux/targeted/contexts/files/file_contexts /var/log/messages | tail -n 5
Oct  5 12:45:11 li-lc-1017 puppet-master[9026]: /etc/selinux/targeted/contexts/files/file_contexts:  invalid context system_u:object_r:puppet_etc_t:s0
Oct  5 12:45:38 li-lc-1017 puppet-master[9046]: /etc/selinux/targeted/contexts/files/file_contexts:  invalid context system_u:object_r:puppet_etc_t:s0
Oct  5 12:45:38 li-lc-1017 puppet-master[9046]: /etc/selinux/targeted/contexts/files/file_contexts:  invalid context system_u:object_r:puppet_etc_t:s0
Oct  5 12:45:38 li-lc-1017 puppet-master[9046]: /etc/selinux/targeted/contexts/files/file_contexts:  invalid context system_u:object_r:puppet_etc_t:s0
Oct  5 12:45:38 li-lc-1017 puppet-master[9046]: /etc/selinux/targeted/contexts/files/file_contexts:  invalid context system_u:object_r:puppet_etc_t:s0

vrempet@li-lc-1017 ~
$ uptime
 12:55:58 up 13 min,  1 user,  load average: 0.01, 0.51, 0.65

Comment 10 Peter Vreman 2016-12-28 08:33:57 UTC
On RHEL7.3 with Sat6.2.6

The number of messages also depends on the number of puppet modules and content views. As shown below, it is spamming syslog with 18,000 messages in 3 days.

$ head -n1 /var/log/messages
Dec 25 03:11:12 li-lc-1017 rsyslogd: [origin software="rsyslogd" swVersion="7.4.7" x-pid="1502" x-info="http://www.rsyslog.com"] rsyslogd was HUPed

$ tail -n1 /var/log/messages
Dec 28 08:31:03 li-lc-1017 dbus-daemon[1047]: dbus[1047]: [system] Successfully activated service 'org.freedesktop.problems'

$ grep -c /etc/selinux/targeted/contexts/files/file_contexts /var/log/messages
18312
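Since each spawned puppet-master logs one burst of these warnings, the spam can be grouped per PID to see how many instances were spawned. A small sketch, using a sample file in place of /var/log/messages (the sample lines are copied from this report):

```shell
# Stand-in for /var/log/messages, with lines taken from comment 8 above
cat > /tmp/sample_messages <<'EOF'
Oct  5 12:45:11 li-lc-1017 puppet-master[9026]: /etc/selinux/targeted/contexts/files/file_contexts:  invalid context system_u:object_r:puppet_etc_t:s0
Oct  5 12:45:38 li-lc-1017 puppet-master[9046]: /etc/selinux/targeted/contexts/files/file_contexts:  invalid context system_u:object_r:puppet_etc_t:s0
Oct  5 12:45:38 li-lc-1017 puppet-master[9046]: /etc/selinux/targeted/contexts/files/file_contexts:  invalid context system_u:object_r:puppet_var_lib_t:s0
EOF

# Count the "invalid context" lines per puppet-master PID: each group with
# the same PID corresponds to one spawned puppet-master instance
grep 'invalid context' /tmp/sample_messages \
  | sed 's/.*puppet-master\[\([0-9]*\)\].*/\1/' \
  | sort | uniq -c
```

Run against the real /var/log/messages, this shows whether the 18,000 lines come from many short-lived instances or a few long-running ones.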

Comment 11 Lukas Zapletal 2017-01-02 13:57:40 UTC
Peter, I have a new installation of Sat 6.2 with a few puppet modules installed from the Forge, and I don't see any messages yet. How do I reproduce this behavior? Indeed, 18,000 is too many.

Comment 12 Peter Vreman 2017-01-02 16:53:13 UTC
Lukas,
The issue happens for every puppet-master started from httpd. Because puppet-master instances are stopped and started by Rack/Passenger based on load, the warnings are logged again whenever a new puppet-master instance has to be spawned.

As shown below, an httpd.service restart does not trigger it, because puppet-master is only spawned on demand. When a puppet agent then connects, an instance is spawned and the warnings (172 in my case) are generated:

[crash] root@li-lc-1578:~# grep -c /etc/selinux/targeted/contexts/files/file_contexts /var/log/messages
1536
[crash] root@li-lc-1578:~# systemctl restart httpd.service
[crash] root@li-lc-1578:~# grep -c /etc/selinux/targeted/contexts/files/file_contexts /var/log/messages
1536
[crash] root@li-lc-1578:~# passenger-status
Version : 4.0.18
Date    : 2017-01-02 16:44:32 +0000
Instance: 28806
----------- General information -----------
Max pool size : 12
Processes     : 4
Requests in top-level queue : 0

----------- Application groups -----------
/usr/share/foreman#default:
  App root: /usr/share/foreman
  Requests in queue: 0
  * PID: 29152   Sessions: 0       Processed: 3       Uptime: 1m 44s
    CPU: 0%      Memory  : 224M    Last used: 1m 43s ago
  * PID: 29181   Sessions: 0       Processed: 0       Uptime: 1m 44s
    CPU: 0%      Memory  : 171M    Last used: 1m 44s ago
  * PID: 29218   Sessions: 0       Processed: 0       Uptime: 1m 42s
    CPU: 0%      Memory  : 164M    Last used: 1m 42s ago
  * PID: 29246   Sessions: 0       Processed: 0       Uptime: 1m 42s
    CPU: 0%      Memory  : 139M    Last used: 1m 42s ago

[crash] root@li-lc-1578:~# puppet agent -t --noop --tags=dummy
Info: Retrieving pluginfacts
Info: Retrieving plugin
Info: Loading facts
Info: Caching catalog for li-lc-1578.hag.hilti.com
Info: Applying configuration version '1483375504'
Notice: Finished catalog run in 0.89 seconds
[crash] root@li-lc-1578:~# passenger-status
Version : 4.0.18
Date    : 2017-01-02 16:45:12 +0000
Instance: 28806
----------- General information -----------
Max pool size : 12
Processes     : 5
Requests in top-level queue : 0

----------- Application groups -----------
/usr/share/foreman#default:
  App root: /usr/share/foreman
  Requests in queue: 0
  * PID: 29152   Sessions: 0       Processed: 7       Uptime: 2m 24s
    CPU: 2%      Memory  : 504M    Last used: 2s ago
  * PID: 29181   Sessions: 0       Processed: 0       Uptime: 2m 24s
    CPU: 0%      Memory  : 171M    Last used: 2m 24s ago
  * PID: 29218   Sessions: 0       Processed: 0       Uptime: 2m 22s
    CPU: 0%      Memory  : 164M    Last used: 2m 22s ago
  * PID: 29246   Sessions: 0       Processed: 0       Uptime: 2m 22s
    CPU: 0%      Memory  : 139M    Last used: 2m 22s ago

/etc/puppet/rack#default:
  App root: /etc/puppet/rack
  Requests in queue: 0
  * PID: 29384   Sessions: 0       Processed: 5       Uptime: 15s
    CPU: 27%     Memory  : 67M     Last used: 3s ago

[crash] root@li-lc-1578:~# grep -c /etc/selinux/targeted/contexts/files/file_contexts /var/log/messages
1708


Some more related info:
- KB https://access.redhat.com/solutions/2131031
- private BZ https://bugzilla.redhat.com/show_bug.cgi?id=1219964
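Until the policy itself is fixed, the log spam could in principle be suppressed locally with an rsyslog drop rule. A sketch for the rsyslog 7.x shown earlier in this report (the file name is an assumption, and note this only hides the symptom, not the cause):

```
# /etc/rsyslog.d/drop-puppet-invalid-context.conf  (hypothetical file name)
# Discard the repeated puppet-master "invalid context" warnings
:msg, contains, "file_contexts:  invalid context system_u:object_r:puppet" stop
```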

Comment 13 Lukas Zapletal 2017-01-03 16:38:58 UTC
I am unable to reproduce here: I installed some puppet modules from the Forge, added them to a host, checked in several times, and restarted all services, but still no errors. Out of curiosity, do you have any interesting output from

restorecon -Fvvn /

Comment 14 Peter Vreman 2017-01-04 10:09:52 UTC
Lukas,

The restorecon does nothing:

[crash] root@li-lc-1578:~# restorecon -Fvvn /
[crash] root@li-lc-1578:~#

I am on RHEL 7.3; it might be that the major update of the SELinux tools included there breaks things.

I am using Sat6.2.6 with foreman-selinux-1.11.0.2.
I also see audit2allow-reported denials:

#============= passenger_t ==============

#!!!! WARNING: 'unlabeled_t' is a base type.
allow passenger_t unlabeled_t:dir write;

#!!!! The file '/run/foreman/pids/dynflow_executor.pid' is mislabeled on your system.
#!!!! Fix with $ restorecon -R -v /run/foreman/pids/dynflow_executor.pid
allow passenger_t unlabeled_t:file { append getattr read write };
allow passenger_t var_run_t:dir { rmdir setattr };
allow passenger_t var_run_t:file { getattr unlink };
allow passenger_t var_run_t:sock_file { setattr unlink };


These look to be fixed in https://github.com/theforeman/foreman-selinux/commit/86fa6e79ef80b9dc1fbbbc9d1e86aa2646c3bcdf, but that commit is not part of Sat6.2.6.




I also tried compiling and loading the semodule myself to see how that worked:

[crash] root@li-lc-1578:/usr/share/doc/foreman-selinux-1.11.0.2# sed 's+@@VERSION@@+99+' -i foreman.te
[crash] root@li-lc-1578:/usr/share/doc/foreman-selinux-1.11.0.2# sh -x -e ./foreman.sh
++ dirname ./foreman.sh
+ DIRNAME=.
+ cd .
+ USAGE='./foreman.sh [ --update ]'
++ id -u
+ '[' 0 '!=' 0 ']'
+ '[' 0 -eq 1 ']'
+ '[' 0 -ge 2 ']'
+ echo 'Building and Loading Policy'
Building and Loading Policy
+ set -x
+ make -f /usr/share/selinux/devel/Makefile
Compiling targeted foreman module
foreman.te:435: Warning: miscfiles_read_certs() has been deprecated, please use miscfiles_read_generic_certs() instead.
/usr/bin/checkmodule:  loading policy configuration from tmp/foreman.tmp
/usr/bin/checkmodule:  policy configuration loaded
/usr/bin/checkmodule:  writing binary representation (version 17) to tmp/foreman.mod
Creating targeted foreman.pp policy package
rm tmp/foreman.mod.fc tmp/foreman.mod
+ /usr/sbin/semodule -i foreman.pp
Failed to resolve typeattributeset statement at /etc/selinux/targeted/tmp/modules/400/foreman/cil:117
/usr/sbin/semodule:  Failed!


After some testing I could build a loadable module with the following changes:

--- foreman.te.170104-1 2017-01-04 09:01:15.322621583 +0000
+++ foreman.te  2017-01-04 09:39:18.797808580 +0000
@@ -151,6 +151,7 @@
     type httpd_tmp_t;
     type ifconfig_exec_t;
     type init_t;
+    type initrc_exec_t;
     type passenger_t;
     type passenger_tmp_t;
     type proc_net_t;
@@ -424,9 +425,9 @@
 # is fixed.
 init_exec_script_files(passenger_t)

-ifndef(`distro_rhel7', `
-    consoletype_exec(passenger_t)
-')
+#ifndef(`distro_rhel7', `
+#    consoletype_exec(passenger_t)
+#')

After this the module loads without errors, but sadly the '/etc/selinux/targeted/contexts/files/file_contexts' errors did not go away.

Comment 15 Lukas Zapletal 2017-01-19 09:16:41 UTC
Peter, do you experience these on the Satellite-embedded Capsule or on an externally installed Capsule? So far I have only been testing on the embedded one; I think this could be the issue.

Comment 16 Peter Vreman 2017-01-19 14:29:07 UTC
Hoi Lukas,

I see the issue on the Satellite-embedded Capsule. I am not using external Capsules yet.

Peter

Comment 17 Lukas Zapletal 2017-01-20 16:57:29 UTC
Can you tell if the puppet policy is loaded?

semodule -l | grep pupp
puppet  1.4.0

Also, is this boolean turned on:

semanage boolean --list | grep passenger_run_puppetmaster
passenger_run_puppetmaster     (on   ,   on)  Allow passenger to run puppetmaster

Also list the rules:

sesearch --allow -t puppet_var_lib_t | grep passenger
   allow passenger_t puppet_var_lib_t : file { ioctl read write create getattr setattr lock append unlink link rename open } ; 
   allow passenger_t puppet_var_lib_t : dir { ioctl read write getattr lock add_name remove_name search open } ; 
   allow passenger_t puppet_var_lib_t : file { relabelfrom relabelto } ; 
   allow passenger_t puppet_var_lib_t : dir { create setattr relabelfrom relabelto rmdir } ; 


The puppet policy is in RHEL, not in Satellite; we only add some rules around Passenger because Satellite 6 runs the puppet master in Passenger by default. I wish I could help, but I need a reproducer; maybe we can find one this way. If you are able to isolate which action triggers it, that would be a great help. Is it applying a manifest? Or content promotion?

Maybe if you could tail production.log and audit.log together, we could see the flow of incoming calls and SELinux messages. tail will concatenate them into a single stream.

Comment 18 Lukas Zapletal 2017-01-31 15:15:58 UTC
*** Bug 1416781 has been marked as a duplicate of this bug. ***

Comment 19 Peter Vreman 2017-01-31 15:26:52 UTC
I executed the same commands:

[crash] root@li-lc-1578:~# semodule -l | grep pupp
puppet  1.4.0
[crash] root@li-lc-1578:~# semanage boolean --list | grep passenger_run_puppetmaster
passenger_run_puppetmaster     (on   ,   on)  Allow passenger to run puppetmaster
[crash] root@li-lc-1578:~# sesearch --allow -t puppet_var_lib_t | grep passenger
   allow passenger_t puppet_var_lib_t : file { ioctl read write create getattr setattr lock append unlink link rename open } ;
   allow passenger_t puppet_var_lib_t : dir { ioctl read write getattr lock add_name remove_name search open } ;
   allow passenger_t puppet_var_lib_t : file { relabelfrom relabelto } ;
   allow passenger_t puppet_var_lib_t : dir { create setattr relabelfrom relabelto rmdir } ;
[crash] root@li-lc-1578:~#

Comment 20 Peter Vreman 2017-01-31 15:27:16 UTC
Reproduction with tail of the log files:

[crash] root@li-lc-1578:~# systemctl restart httpd.service

[crash] root@li-lc-1578:~# sudo -u hoici hammer -c /opt/hoici/etc/sat6/hammer-hoici.yaml task list --search 'state != stopped'
-------------------------------------|------|-------|---------------------|----------|---------|---------|----------------------------|------------
ID                                   | NAME | OWNER | STARTED AT          | ENDED AT | STATE   | RESULT  | TASK ACTION                | TASK ERRORS
-------------------------------------|------|-------|---------------------|----------|---------|---------|----------------------------|------------
8310a1f9-7295-41fc-9b70-7f5233f4164c |      |       | 2017/01/31 13:48:49 |          | running | pending | Listen on candlepin events |
a3fc489b-8078-4327-a528-393062fa6ce9 |      |       | 2017/01/31 13:48:49 |          | running | pending | Monitor Event Queue        |
-------------------------------------|------|-------|---------------------|----------|---------|---------|----------------------------|------------

[crash] root@li-lc-1578:~# tail -f /var/log/audit/audit.log /var/log/foreman/production.log /var/log/messages

(In another console, trigger the startup of puppet-master)

vrempet@li-lc-1578 ~
$ sudo puppet agent -t --noop --tags=dummy
Info: Retrieving pluginfacts
Info: Retrieving plugin
Info: Loading facts
Info: Caching catalog for li-lc-1578.hag.hilti.com
Info: Applying configuration version '1485876177'
Notice: Finished catalog run in 0.84 seconds


Output of the tail -f:

==> /var/log/audit/audit.log <==
type=LOGIN msg=audit(1485876121.987:20811): pid=16312 uid=0 subj=system_u:system_r:crond_t:s0-s0:c0.c1023 old-auid=4294967295 auid=0 old-ses=4294967295 ses=2213 res=1
type=USER_START msg=audit(1485876122.004:20812): pid=16312 uid=0 auid=0 ses=2213 subj=system_u:system_r:crond_t:s0-s0:c0.c1023 msg='op=PAM:session_open grantors=pam_loginuid,pam_keyinit,pam_limits,pam_systemd acct="root" exe="/usr/sbin/crond" hostname=? addr=? terminal=cron res=success'
type=CRED_REFR msg=audit(1485876122.005:20813): pid=16312 uid=0 auid=0 ses=2213 subj=system_u:system_r:crond_t:s0-s0:c0.c1023 msg='op=PAM:setcred grantors=pam_env,pam_localuser,pam_unix acct="root" exe="/usr/sbin/crond" hostname=? addr=? terminal=cron res=success'
type=CRED_DISP msg=audit(1485876122.041:20814): pid=16312 uid=0 auid=0 ses=2213 subj=system_u:system_r:crond_t:s0-s0:c0.c1023 msg='op=PAM:setcred grantors=pam_env,pam_localuser,pam_unix acct="root" exe="/usr/sbin/crond" hostname=? addr=? terminal=cron res=success'
type=USER_END msg=audit(1485876122.043:20815): pid=16312 uid=0 auid=0 ses=2213 subj=system_u:system_r:crond_t:s0-s0:c0.c1023 msg='op=PAM:session_close grantors=pam_loginuid,pam_keyinit,pam_limits,pam_systemd acct="root" exe="/usr/sbin/crond" hostname=? addr=? terminal=cron res=success'
type=USER_CMD msg=audit(1485876140.233:20816): pid=16504 uid=0 auid=2331 ses=2208 subj=unconfined_u:unconfined_r:unconfined_t:s0-s0:c0.c1023 msg='cwd="/root" cmd=68616D6D6572202D63202F6F70742F686F6963692F6574632F736174362F68616D6D65722D686F6963692E79616D6C207461736B206C697374202D2D73656172636820737461746520213D2073746F70706564 terminal=pts/0 res=success'
type=CRED_ACQ msg=audit(1485876140.234:20817): pid=16504 uid=0 auid=2331 ses=2208 subj=unconfined_u:unconfined_r:unconfined_t:s0-s0:c0.c1023 msg='op=PAM:setcred grantors=pam_env,pam_localuser,pam_unix acct="hoici" exe="/usr/bin/sudo" hostname=? addr=? terminal=/dev/pts/0 res=success'
type=USER_START msg=audit(1485876140.235:20818): pid=16504 uid=0 auid=2331 ses=2208 subj=unconfined_u:unconfined_r:unconfined_t:s0-s0:c0.c1023 msg='op=PAM:session_open grantors=pam_keyinit,pam_limits acct="hoici" exe="/usr/bin/sudo" hostname=? addr=? terminal=/dev/pts/0 res=success'
type=USER_END msg=audit(1485876151.973:20819): pid=16504 uid=0 auid=2331 ses=2208 subj=unconfined_u:unconfined_r:unconfined_t:s0-s0:c0.c1023 msg='op=PAM:session_close grantors=pam_keyinit,pam_limits acct="hoici" exe="/usr/bin/sudo" hostname=? addr=? terminal=/dev/pts/0 res=success'
type=CRED_DISP msg=audit(1485876151.973:20820): pid=16504 uid=0 auid=2331 ses=2208 subj=unconfined_u:unconfined_r:unconfined_t:s0-s0:c0.c1023 msg='op=PAM:setcred grantors=pam_env,pam_localuser,pam_unix acct="hoici" exe="/usr/bin/sudo" hostname=? addr=? terminal=/dev/pts/0 res=success'

==> /var/log/foreman/production.log <==
2017-01-31 15:22:20 [app] [I] Started GET "/apidoc/v2.en.json" for 127.0.0.1 at 2017-01-31 15:22:20 +0000
2017-01-31 15:22:20 [app] [I] Processing by Apipie::ApipiesController#index as JSON
2017-01-31 15:22:20 [app] [I]   Parameters: {"version"=>"v2.en.json", "apipy"=>{}}
2017-01-31 15:22:20 [app] [I] Sent file /usr/share/foreman/public/apipie-cache/apidoc/v2.en.json (0.1ms)
2017-01-31 15:22:20 [app] [I] Completed 200 OK in 1ms (ActiveRecord: 0.0ms)
2017-01-31 15:22:24 [app] [I] Started GET "/foreman_tasks/api/tasks?search=state+%21%3D+stopped&page=1&per_page=1000&sort=%7B%7D" for 127.0.0.1 at 2017-01-31 15:22:24 +0000
2017-01-31 15:22:24 [app] [I] Processing by ForemanTasks::Api::TasksController#index as JSON
2017-01-31 15:22:24 [app] [I]   Parameters: {"search"=>"state != stopped", "page"=>"1", "per_page"=>"1000", "sort"=>"{}", "task"=>{}}
2017-01-31 15:22:24 [app] [I] Authorized user hoici(hoici )
2017-01-31 15:22:31 [app] [I] Completed 200 OK in 7842ms (Views: 0.8ms | ActiveRecord: 7733.5ms)

==> /var/log/messages <==
Jan 31 15:21:40 li-lc-1578 pulp: gofer.messaging.adapter.qpid.connection:INFO: opened: qpid+ssl://li-lc-1578.hag.hilti.com:5671
Jan 31 15:21:40 li-lc-1578 pulp: gofer.messaging.adapter.connect:INFO: connected: qpid+ssl://li-lc-1578.hag.hilti.com:5671
Jan 31 15:21:40 li-lc-1578 pulp: gofer.messaging.adapter.qpid.connection:INFO: opened: qpid+ssl://li-lc-1578.hag.hilti.com:5671
Jan 31 15:21:40 li-lc-1578 pulp: gofer.messaging.adapter.connect:INFO: connected: qpid+ssl://li-lc-1578.hag.hilti.com:5671
Jan 31 15:22:01 li-lc-1578 systemd[1]: Created slice user-0.slice.
Jan 31 15:22:01 li-lc-1578 systemd[1]: Starting user-0.slice.
Jan 31 15:22:02 li-lc-1578 systemd[1]: Started Session 2213 of user root.
Jan 31 15:22:02 li-lc-1578 systemd[1]: Starting Session 2213 of user root.
Jan 31 15:22:02 li-lc-1578 systemd[1]: Removed slice user-0.slice.
Jan 31 15:22:02 li-lc-1578 systemd[1]: Stopping user-0.slice.

==> /var/log/audit/audit.log <==
type=USER_CMD msg=audit(1485876169.684:20821): pid=16553 uid=2331 auid=2331 ses=2211 subj=unconfined_u:unconfined_r:unconfined_t:s0-s0:c0.c1023 msg='cwd="/users/vrempet" cmd=707570706574206167656E74202D74202D2D6E6F6F70202D2D746167733D64756D6D79 terminal=pts/1 res=success'
type=CRED_ACQ msg=audit(1485876169.684:20822): pid=16553 uid=0 auid=2331 ses=2211 subj=unconfined_u:unconfined_r:unconfined_t:s0-s0:c0.c1023 msg='op=PAM:setcred grantors=pam_env,pam_localuser,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=/dev/pts/1 res=success'
type=USER_START msg=audit(1485876169.685:20823): pid=16553 uid=0 auid=2331 ses=2211 subj=unconfined_u:unconfined_r:unconfined_t:s0-s0:c0.c1023 msg='op=PAM:session_open grantors=pam_keyinit,pam_limits acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=/dev/pts/1 res=success'

==> /var/log/messages <==
Jan 31 15:22:51 li-lc-1578 puppet-master[16563]: /etc/selinux/targeted/contexts/files/file_contexts:  invalid context system_u:object_r:puppet_etc_t:s0
Jan 31 15:22:51 li-lc-1578 puppet-master[16563]: /etc/selinux/targeted/contexts/files/file_contexts:  invalid context system_u:object_r:puppet_etc_t:s0
Jan 31 15:22:51 li-lc-1578 puppet-master[16563]: /etc/selinux/targeted/contexts/files/file_contexts:  invalid context system_u:object_r:puppet_etc_t:s0
Jan 31 15:22:51 li-lc-1578 puppet-master[16563]: /etc/selinux/targeted/contexts/files/file_contexts:  invalid context system_u:object_r:puppet_etc_t:s0
Jan 31 15:22:51 li-lc-1578 puppet-master[16563]: /etc/selinux/targeted/contexts/files/file_contexts:  invalid context system_u:object_r:puppet_var_lib_t:s0
Jan 31 15:22:51 li-lc-1578 puppet-master[16563]: /etc/selinux/targeted/contexts/files/file_contexts:  invalid context system_u:object_r:puppet_var_lib_t:s0
... (160 more lines) ....
Jan 31 15:22:52 li-lc-1578 puppet-master[16563]: /etc/selinux/targeted/contexts/files/file_contexts:  invalid context system_u:object_r:puppet_etc_t:s0
Jan 31 15:22:52 li-lc-1578 puppet-master[16563]: /etc/selinux/targeted/contexts/files/file_contexts:  invalid context system_u:object_r:puppet_etc_t:s0
Jan 31 15:22:52 li-lc-1578 puppet-master[16563]: /etc/selinux/targeted/contexts/files/file_contexts:  invalid context system_u:object_r:puppet_etc_t:s0
Jan 31 15:22:52 li-lc-1578 puppet-master[16563]: /etc/selinux/targeted/contexts/files/file_contexts:  invalid context system_u:object_r:puppet_etc_t:s0
Jan 31 15:22:52 li-lc-1578 puppet-master[16563]: Starting Puppet master version 3.8.6

==> /var/log/foreman/production.log <==
2017-01-31 15:22:52 [app] [I] Started GET "/node/li-lc-1578.hag.hilti.com?format=yml" for 10.92.14.128 at 2017-01-31 15:22:52 +0000
2017-01-31 15:22:52 [app] [I] Processing by HostsController#externalNodes as YML
2017-01-31 15:22:52 [app] [I]   Parameters: {"name"=>"li-lc-1578.hag.hilti.com"}
2017-01-31 15:22:53 [app] [I]   Rendered text template (0.0ms)
2017-01-31 15:22:53 [app] [I] Completed 200 OK in 396ms (Views: 1.4ms | ActiveRecord: 32.7ms)

==> /var/log/messages <==
Jan 31 15:22:53 li-lc-1578 puppet-agent[16554]: Retrieving pluginfacts
Jan 31 15:22:53 li-lc-1578 puppet-agent[16554]: Retrieving plugin
Jan 31 15:22:54 li-lc-1578 puppet-agent[16554]: Loading facts

==> /var/log/foreman/production.log <==
2017-01-31 15:22:56 [app] [I] Started POST "/api/hosts/facts" for 10.92.14.128 at 2017-01-31 15:22:56 +0000
2017-01-31 15:22:56 [app] [I] Processing by Api::V2::HostsController#facts as JSON
2017-01-31 15:22:56 [app] [I]   Parameters: {"facts"=>"[FILTERED]", "name"=>"li-lc-1578.hag.hilti.com", "certname"=>"li-lc-1578.hag.hilti.com", "apiv"=>"v2", :host=>{"name"=>"li-lc-1578.hag.hilti.com", "certname"=>"li-lc-1578.hag.hilti.com"}}
2017-01-31 15:22:56 [app] [I] Import facts for 'li-lc-1578.hag.hilti.com' completed. Added: 1, Updated: 8, Deleted 0 facts
2017-01-31 15:22:57 [app] [I] Completed 201 Created in 282ms (Views: 4.7ms | ActiveRecord: 38.4ms)
2017-01-31 15:22:57 [app] [I] Started GET "/node/li-lc-1578.hag.hilti.com?format=yml" for 10.92.14.128 at 2017-01-31 15:22:57 +0000
2017-01-31 15:22:57 [app] [I] Processing by HostsController#externalNodes as YML
2017-01-31 15:22:57 [app] [I]   Parameters: {"name"=>"li-lc-1578.hag.hilti.com"}
2017-01-31 15:22:57 [app] [I]   Rendered text template (0.0ms)
2017-01-31 15:22:57 [app] [I] Completed 200 OK in 124ms (Views: 0.5ms | ActiveRecord: 9.5ms)

==> /var/log/messages <==
Jan 31 15:22:57 li-lc-1578 puppet-master[16581]: (Scope(Class[Base_hoi])) [li-lc-1578] Hiera tier=crash, dc=LI, maingroup=Infra, subgroup=Sat6Sandbox
Jan 31 15:22:59 li-lc-1578 puppet-master[16581]: (Scope(Class[Mgt_kdump])) [li-lc-1578] type_kdump=nfs - crashkernel_size=256 - running_crashkernel=256
Jan 31 15:22:59 li-lc-1578 puppet-master[16581]: Compiled catalog for li-lc-1578.hag.hilti.com in environment KT_Hilti_Hostgroup_hgp_crash_sat_6_2_ci_407 in 2.34 seconds
Jan 31 15:23:00 li-lc-1578 puppet-agent[16554]: Caching catalog for li-lc-1578.hag.hilti.com

==> /var/log/audit/audit.log <==
type=USER_ACCT msg=audit(1485876181.054:20824): pid=16963 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:crond_t:s0-s0:c0.c1023 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_localuser acct="root" exe="/usr/sbin/crond" hostname=? addr=? terminal=cron res=success'
type=CRED_ACQ msg=audit(1485876181.054:20825): pid=16963 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:crond_t:s0-s0:c0.c1023 msg='op=PAM:setcred grantors=pam_env,pam_localuser,pam_unix acct="root" exe="/usr/sbin/crond" hostname=? addr=? terminal=cron res=success'
type=LOGIN msg=audit(1485876181.055:20826): pid=16963 uid=0 subj=system_u:system_r:crond_t:s0-s0:c0.c1023 old-auid=4294967295 auid=0 old-ses=4294967295 ses=2214 res=1

==> /var/log/messages <==
Jan 31 15:23:01 li-lc-1578 systemd[1]: Created slice user-0.slice.
Jan 31 15:23:01 li-lc-1578 systemd[1]: Starting user-0.slice.
Jan 31 15:23:01 li-lc-1578 systemd[1]: Started Session 2214 of user root.
Jan 31 15:23:01 li-lc-1578 systemd[1]: Starting Session 2214 of user root.

==> /var/log/audit/audit.log <==
type=USER_START msg=audit(1485876181.073:20827): pid=16963 uid=0 auid=0 ses=2214 subj=system_u:system_r:crond_t:s0-s0:c0.c1023 msg='op=PAM:session_open grantors=pam_loginuid,pam_keyinit,pam_limits,pam_systemd acct="root" exe="/usr/sbin/crond" hostname=? addr=? terminal=cron res=success'
type=CRED_REFR msg=audit(1485876181.073:20828): pid=16963 uid=0 auid=0 ses=2214 subj=system_u:system_r:crond_t:s0-s0:c0.c1023 msg='op=PAM:setcred grantors=pam_env,pam_localuser,pam_unix acct="root" exe="/usr/sbin/crond" hostname=? addr=? terminal=cron res=success'
type=CRED_DISP msg=audit(1485876181.114:20829): pid=16963 uid=0 auid=0 ses=2214 subj=system_u:system_r:crond_t:s0-s0:c0.c1023 msg='op=PAM:setcred grantors=pam_env,pam_localuser,pam_unix acct="root" exe="/usr/sbin/crond" hostname=? addr=? terminal=cron res=success'
type=USER_END msg=audit(1485876181.115:20830): pid=16963 uid=0 auid=0 ses=2214 subj=system_u:system_r:crond_t:s0-s0:c0.c1023 msg='op=PAM:session_close grantors=pam_loginuid,pam_keyinit,pam_limits,pam_systemd acct="root" exe="/usr/sbin/crond" hostname=? addr=? terminal=cron res=success'

==> /var/log/messages <==
Jan 31 15:23:01 li-lc-1578 systemd[1]: Removed slice user-0.slice.
Jan 31 15:23:01 li-lc-1578 systemd[1]: Stopping user-0.slice.
Jan 31 15:23:01 li-lc-1578 puppet-agent[16554]: Applying configuration version '1485876177'
Jan 31 15:23:01 li-lc-1578 puppet-agent[16554]: Finished catalog run in 0.84 seconds
Jan 31 15:23:02 li-lc-1578 puppet-master[16581]: /etc/selinux/targeted/contexts/files/file_contexts:  invalid context system_u:object_r:puppet_etc_t:s0
Jan 31 15:23:02 li-lc-1578 puppet-master[16581]: /etc/selinux/targeted/contexts/files/file_contexts:  invalid context system_u:object_r:puppet_etc_t:s0
Jan 31 15:23:02 li-lc-1578 puppet-master[16581]: /etc/selinux/targeted/contexts/files/file_contexts:  invalid context system_u:object_r:puppet_etc_t:s0
Jan 31 15:23:02 li-lc-1578 puppet-master[16581]: /etc/selinux/targeted/contexts/files/file_contexts:  invalid context system_u:object_r:puppet_etc_t:s0

==> /var/log/foreman/production.log <==
2017-01-31 15:23:02 [app] [I] Started POST "/api/reports" for 10.92.14.128 at 2017-01-31 15:23:02 +0000
2017-01-31 15:23:02 [app] [I] Processing by Api::V2::ReportsController#create as JSON
2017-01-31 15:23:02 [app] [I]   Parameters: {"report"=>"[FILTERED]", "apiv"=>"v2"}
2017-01-31 15:23:02 [app] [I] processing report for li-lc-1578.hag.hilti.com
2017-01-31 15:23:02 [app] [I] Imported report for li-lc-1578.hag.hilti.com in 0.09 seconds
2017-01-31 15:23:02 [app] [I]   Rendered api/v2/reports/create.json.rabl (21.8ms)
2017-01-31 15:23:02 [app] [I] Completed 201 Created in 394ms (Views: 30.9ms | ActiveRecord: 40.3ms)

==> /var/log/audit/audit.log <==
type=USER_END msg=audit(1485876182.837:20831): pid=16553 uid=0 auid=2331 ses=2211 subj=unconfined_u:unconfined_r:unconfined_t:s0-s0:c0.c1023 msg='op=PAM:session_close grantors=pam_keyinit,pam_limits acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=/dev/pts/1 res=success'
type=CRED_DISP msg=audit(1485876182.838:20832): pid=16553 uid=0 auid=2331 ses=2211 subj=unconfined_u:unconfined_r:unconfined_t:s0-s0:c0.c1023 msg='op=PAM:setcred grantors=pam_env,pam_localuser,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=/dev/pts/1 res=success'
^C
[crash] root@li-lc-1578:~#

Comment 21 Lukas Zapletal 2017-02-01 15:24:23 UTC
Adding a SELinux team member to the loop. Hey, I am struggling to identify what can be wrong: puppet master (a process started from httpd/mod_passenger) emits many of these "invalid context system_u:object_r:puppet_etc_t:s0" error messages, flooding the system log. I am unable to reproduce it, but it looks like this happens when the puppet master process starts up.

Puppet master is a tool that reads "cookbooks" or "recipes" in /etc/puppet and sends configuration to clients via HTTPS. We've verified that all files in /etc/puppet carry the expected label (puppet_etc_t).

Core policy provides this type along with rules for puppet agent and puppet master when started standalone, but in Satellite 6 we configure it with mod_passenger, so in foreman.te we add a few more rules around that:

https://github.com/theforeman/foreman-selinux/blob/develop/foreman.te#L254-L267

Now, I need your opinion and guidance on how to debug this. I am out of ideas; this looks like some kind of bug in SELinux itself. Please provide some commands that could rule this out.

The only idea I have left is to use audit or systemtap to print all files being read from /etc/puppet and compare that against the output, so we can at least identify which files these are. Thanks.

Comment 22 Johan Bergström 2017-02-27 09:33:55 UTC
I'm affected by this as well, on capsules as well as on the Satellite server.

capsule # zgrep -c /etc/selinux/targeted/contexts/files/file_contexts messages*
messages:3052
messages-20170101.gz:58500
messages-20170109.gz:78156
messages-20170115.gz:35412
messages-20170123.gz:56160
messages-20170205:77064
messages-20170212:84146
messages-20170219:49012
messages-20170227:69088

satellite # zgrep -c /etc/selinux/targeted/contexts/files/file_contexts messages*
messages:168
messages-20170206.gz:468
messages-20170212.gz:1892
messages-20170219.gz:504
messages-20170226.gz:336

Comment 23 Peter Vreman 2017-02-27 14:20:54 UTC
I was able to generate an strace. In that trace the error was generated 4 times.
From the trace it is visible that an attempt to open a selinuxfs pseudo-file fails, and in the next step the error is sent to syslog:

$ zgrep -A1 EACCES strace-puppet-master-selinuxfs-error.log.gz
[pid 10471] 13:07:47 open("/sys/fs/selinux/context", O_RDWR) = -1 EACCES (Permission denied)
[pid 10471] 13:07:47 sendto(7, "<27>Feb 27 13:07:47 puppet-maste"..., 145, MSG_NOSIGNAL, NULL, 0) = 145
--
[pid 10471] 13:07:47 open("/sys/fs/selinux/context", O_RDWR) = -1 EACCES (Permission denied)
[pid 10471] 13:07:47 sendto(7, "<27>Feb 27 13:07:47 puppet-maste"..., 145, MSG_NOSIGNAL, NULL, 0) = 145
--
[pid 10471] 13:07:47 open("/sys/fs/selinux/context", O_RDWR) = -1 EACCES (Permission denied)
[pid 10471] 13:07:47 sendto(7, "<27>Feb 27 13:07:47 puppet-maste"..., 145, MSG_NOSIGNAL, NULL, 0) = 145
--
[pid 10471] 13:07:47 open("/sys/fs/selinux/context", O_RDWR) = -1 EACCES (Permission denied)
[pid 10471] 13:07:47 sendto(7, "<27>Feb 27 13:07:47 puppet-maste"..., 145, MSG_NOSIGNAL, NULL, 0) = 145
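
The EACCES/sendto pattern above can also be pulled out of a full strace mechanically. A minimal sketch, assuming plain-text strace output; the `failed_context_opens` helper is illustrative, not part of any tool mentioned in this bug (the sample lines are copied from the trace above):

```python
import re

# Matches a failed open() of the selinuxfs "context" node, as in the
# strace excerpt above.
OPEN_EACCES = re.compile(r'open\("/sys/fs/selinux/context", O_RDWR\) = -1 EACCES')

def failed_context_opens(lines):
    """Return (line_index, followed_by_syslog_write) for every EACCES
    open of /sys/fs/selinux/context; the syslog write is the sendto()
    on the immediately following line."""
    hits = []
    for i, line in enumerate(lines):
        if OPEN_EACCES.search(line):
            nxt = lines[i + 1] if i + 1 < len(lines) else ""
            hits.append((i, "sendto(" in nxt))
    return hits

sample = [
    '[pid 10471] 13:07:47 open("/sys/fs/selinux/context", O_RDWR) = -1 EACCES (Permission denied)',
    '[pid 10471] 13:07:47 sendto(7, "<27>Feb 27 13:07:47 puppet-maste"..., 145, MSG_NOSIGNAL, NULL, 0) = 145',
]
print(failed_context_opens(sample))  # → [(0, True)]
```

Run against the attached trace, this confirms that every failed open is immediately followed by a syslog write of the same length.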

Comment 24 Peter Vreman 2017-02-27 14:21:47 UTC
Created attachment 1258111 [details]
attached strace of puppet-master process

Comment 25 Lukas Zapletal 2017-02-28 09:24:36 UTC
Thanks Peter, this is nice progress. Given that all these messages are the same length, this is indeed suspicious. Also, can you dig into the puppet master logs near this time (Feb 27 13:07:47) so we can compare the whole contents of these sendto statements? They could just be ordinary messages generated by puppet-master, but let's compare them using the time and length of the message so we can find them. With the hostname removed, the error report is 143 characters long; removing the PID makes it a little shorter depending on the PID length, so it's close.

What is also interesting is this:

[pid 10471] 13:07:42 stat("/etc/puppet/environments/KT_Hilti_Hostgroup_hgp_crash_sat_6_2_ci_407/modules/base_selinux/manifests/init.rb", 0x7fba390690b0) = -1 ENOENT (No such file or directory)
[pid 10471] 13:07:42 sendto(7, "<29>Feb 27 13:07:42 puppet-maste"..., 123, MSG_NOSIGNAL, NULL, 0) = 123

Can you check this file and its permissions and SELinux context?

/etc/puppet/environments/KT_Hilti_Hostgroup_hgp_crash_sat_6_2_ci_407/modules/base_selinux/manifests/init.rb

Another interesting one is:

[pid 10471] 13:07:43 sendto(7, "<29>Feb 27 13:07:43 puppet-maste"..., 162, MSG_NOSIGNAL, NULL, 0) = 162
[pid 10471] 13:07:43 stat("/var/lib/puppet/templates", 0x7fba39060ed0) = -1 ENOENT (No such file or directory)

Check this dir as well please:

/var/lib/puppet/templates

Just to rule out that there are some zombie symlinks or anything like that preventing SELinux from reading them.

Comment 26 Peter Vreman 2017-02-28 13:40:58 UTC
The syslog entries at '13:07:47' are exactly the 4 puppet-master SELinux errors:

Feb 27 13:07:46 li-lc-1578 puppet-agent[10522]: Applying configuration version '1488200860'
Feb 27 13:07:47 li-lc-1578 puppet-agent[10522]: Finished catalog run in 0.88 seconds
Feb 27 13:07:47 li-lc-1578 puppet-master[10466]: /etc/selinux/targeted/contexts/files/file_contexts:  invalid context system_u:object_r:puppet_etc_t:s0
Feb 27 13:07:47 li-lc-1578 puppet-master[10466]: /etc/selinux/targeted/contexts/files/file_contexts:  invalid context system_u:object_r:puppet_etc_t:s0
Feb 27 13:07:47 li-lc-1578 puppet-master[10466]: /etc/selinux/targeted/contexts/files/file_contexts:  invalid context system_u:object_r:puppet_etc_t:s0
Feb 27 13:07:47 li-lc-1578 puppet-master[10466]: /etc/selinux/targeted/contexts/files/file_contexts:  invalid context system_u:object_r:puppet_etc_t:s0
Feb 27 13:08:01 li-lc-1578 systemd[1]: Started Session 55340 of user root.


In the puppet environment directory (created by Pulp), the manifest contains an init.pp, not an init.rb. But this is just a normal puppet module that configures /etc/sysconfig/selinux on the clients; it is nothing special for the puppet-master.

[crash] root@li-lc-1578:~# ls -lZ /etc/puppet/environments/KT_Hilti_Hostgroup_hgp_crash_sat_6_2_ci_407/modules/base_selinux/manifests/init.rb
ls: cannot access /etc/puppet/environments/KT_Hilti_Hostgroup_hgp_crash_sat_6_2_ci_407/modules/base_selinux/manifests/init.rb: No such file or directory

[crash] root@li-lc-1578:~# ls -lZ /etc/puppet/environments/KT_Hilti_Hostgroup_hgp_crash_sat_6_2_ci_407/modules/base_selinux/manifests/init.pp
-rw-rw-r--. apache apache system_u:object_r:puppet_etc_t:s0 /etc/puppet/environments/KT_Hilti_Hostgroup_hgp_crash_sat_6_2_ci_407/modules/base_selinux/manifests/init.pp


The systemwide dir /var/lib/puppet/templates does not exist:

[crash] root@li-lc-1578:~# ls -lZ /var/lib/puppet/templates
ls: cannot access /var/lib/puppet/templates: No such file or directory

Comment 27 Lukas Zapletal 2017-02-28 14:30:03 UTC
Thanks Peter, I am asking our SELinux guys to give some analysis on comment 23. Something is preventing the puppet master from accessing /sys/fs/selinux/context.

Comment 28 Lukas Zapletal 2017-03-08 14:37:37 UTC
Hello Peter,

Got some feedback from the SELinux team. So first, check whether secure_mode_policyload appears in

semanage boolean -l -C

or

getsebool -a | grep secure_mode_policyload

If it's on, then please try to change the value to off

setsebool -P secure_mode_policyload off

To turn the value back on, you need to restart the system. Thanks.

Comment 29 Ben 2017-03-08 16:10:13 UTC
Datapoint:

I'm seeing the same puppet/SELinux messages issue.

[root@satellite1 ~]# semanage boolean -l -C
SELinux boolean                State  Default Description

httpd_tmp_exec                 (on   ,   on)  Allow httpd to tmp exec
httpd_setrlimit                (on   ,   on)  Allow httpd to setrlimit
httpd_can_network_connect      (on   ,   on)  Allow httpd to can network connect
pulp_manage_puppet             (on   ,   on)  Allow pulp to manage puppet
[root@satellite1 ~]#

So, no joy here.

Comment 30 Peter Vreman 2017-03-08 17:23:48 UTC
Output for me:

[crash] root@li-lc-1578:~# semanage boolean -l -C
SELinux boolean                State  Default Description

nagios_run_sudo                (on   ,   on)  Allow nagios/nrpe to call sudo from NRPE utils scripts.
pulp_manage_puppet             (on   ,   on)  Allow pulp to manage puppet
use_nfs_home_dirs              (on   ,   on)  Support NFS home directories
httpd_setrlimit                (on   ,   on)  Allow httpd daemon to change its resource limits
httpd_can_network_connect      (on   ,   on)  Allow HTTPD scripts and modules to connect to the network using TCP.
httpd_tmp_exec                 (on   ,   on)  Allow Apache to execute tmp content.
[crash] root@li-lc-1578:~# getsebool -a | grep secure_mode_policyload
secure_mode_policyload --> off

Comment 31 Johan Bergström 2017-03-08 21:26:14 UTC
Satellite & Capsule have same values:

# semanage boolean -l -C
SELinux boolean                State  Default Description

httpd_tmp_exec                 (on   ,   on)  Allow httpd to tmp exec
httpd_setrlimit                (on   ,   on)  Allow httpd to setrlimit
httpd_can_network_connect      (on   ,   on)  Allow httpd to can network connect
pulp_manage_puppet             (on   ,   on)  Allow pulp to manage puppet
# getsebool -a | grep secure_mode_policyload
secure_mode_policyload --> off

Comment 32 Lukas Zapletal 2017-03-09 09:46:50 UTC
Thank you, can I have your RHEL and Satellite versions? I already know that Peter is using Satellite 6.2 on RHEL7.3.

And more importantly, puppet (and master) version you are running with.

Comment 33 Lukas Zapletal 2017-03-09 09:51:05 UTC
This is Sat 6.2.8:

# rpm -q puppet mod_passenger httpd selinux-policy
puppet-3.8.6-2.el7sat.noarch
mod_passenger-4.0.18-22.el7sat.x86_64
httpd-2.4.6-45.el7.x86_64
selinux-policy-3.13.1-102.el7_3.15.noarch

Comment 34 Peter Vreman 2017-03-09 09:53:27 UTC
vrempet@li-lc-1578 ~
$ rpm -qa satellite redhat-release-server selinux-policy puppet mod_passenger httpd selinux-policy
puppet-3.8.6-2.el7sat.noarch
selinux-policy-3.13.1-102.el7_3.13.noarch
satellite-6.2.8-4.0.el7sat.noarch
httpd-2.4.6-45.el7.x86_64
mod_passenger-4.0.18-22.el7sat.x86_64
redhat-release-server-7.3-7.el7.x86_64

Comment 35 Ben 2017-03-09 10:19:37 UTC
# rpm -qa satellite redhat-release-server selinux-policy puppet mod_passenger
puppet-3.8.6-2.el7sat.noarch
selinux-policy-3.13.1-102.el7_3.15.noarch
redhat-release-server-7.3-7.el7.x86_64
mod_passenger-4.0.18-22.el7sat.x86_64
satellite-6.2.8-4.0.el7sat.noarch

Comment 36 Lukas Zapletal 2017-03-09 12:04:18 UTC
Thanks. One more thing: can you pastebin ps auxwww? I am only interested in the puppet, passenger, or httpd processes (grep for them), just to double-check the source domain of the puppet master.

Comment 37 Petr Lautrbach 2017-03-09 12:20:15 UTC
It seems that the puppet-master[10466] process, which tries to validate the 'system_u:object_r:puppet_etc_t:s0' context, is not allowed to do so.

While puppetmaster_t can do it:

# sesearch -A -s puppetmaster_t -c security
Found 1 semantic av rules:
   allow puppetmaster_t security_t : security check_context ; 

It's possible that the process is run with another type and there's no rule which would allow the 'check_context' permission for that type on the 'security' class.

Unfortunately these checks are dontaudit'ed in the default policy, so you can't see any such AVC denial messages in audit.log.

Could you please:

1. disable dontaudit rules

# semodule -DB

2. re-run the reproducer

3. attach the output of

# ausearch -m avc,user_avc,selinux_err -ts recent

or audit.log



There's an idea that the process is run with passenger_t and this type is indeed not allowed to check_context.
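
For reference, libselinux's security_check_context() validates a context string by writing it to the kernel's "context" node under selinuxfs — exactly the open("/sys/fs/selinux/context", O_RDWR) call that fails above. A minimal Python sketch of that mechanism (the selinuxfs parameter is added here only for illustration; on a real system it is /sys/fs/selinux):

```python
import os

def check_context(context: str, selinuxfs: str = "/sys/fs/selinux") -> bool:
    """Ask the kernel whether a security context string is valid by
    writing it to the selinuxfs 'context' node, the same mechanism
    libselinux's security_check_context() uses.  Returns False if the
    node cannot be opened (the EACCES case that produced the syslog
    flood here) or the kernel rejects the context."""
    path = os.path.join(selinuxfs, "context")
    try:
        fd = os.open(path, os.O_RDWR)
    except OSError:
        return False  # e.g. EACCES when the caller's domain lacks check_context
    try:
        os.write(fd, context.encode())
        return True
    except OSError:
        return False
    finally:
        os.close(fd)

# On a host without selinuxfs mounted (or from a denied domain) this is False:
print(check_context("system_u:object_r:puppet_etc_t:s0", selinuxfs="/nonexistent"))  # → False
```

This is why a perfectly valid context such as puppet_etc_t gets reported as "invalid": the validation itself is denied, and libselinux falls back to treating the context as invalid.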

Comment 38 Petr Lautrbach 2017-03-09 12:22:23 UTC
(In reply to Lukas Zapletal from comment #36)
> Thanks one more thing, can you pastebin ps auxwww, 

Or rather 'ps auxwwwZ' which would show SELinux labels on processes as well.

Comment 39 Peter Vreman 2017-03-09 14:06:17 UTC
Below are the details of the denial on the selinuxfs:

[crash] root@li-lc-1578:~# sealert -l 6dac971a-d8f8-4a22-891a-0e1ca764811c
SELinux is preventing /usr/bin/ruby from 'read, write' accesses on the file context.

*****  Plugin setenforce (91.4 confidence) suggests   ************************

If you believe /usr/bin/ruby tried to disable SELinux.
Then you may be under attack by a hacker, since confined applications should never need this access.
Do
contact your security administrator and report this issue.

*****  Plugin catchall (9.59 confidence) suggests   **************************

If you believe that ruby should be allowed read write access on the context file by default.
Then you should report this as a bug.
You can generate a local policy module to allow this access.
Do
allow this access for now by executing:
# ausearch -c 'ruby' --raw | audit2allow -M my-ruby
# semodule -i my-ruby.pp


Additional Information:
Source Context                system_u:system_r:passenger_t:s0
Target Context                system_u:object_r:security_t:s0
Target Objects                context [ file ]
Source                        ruby
Source Path                   /usr/bin/ruby
Port                          <Unknown>
Host                          li-lc-1578
Source RPM Packages           ruby-2.0.0.648-29.el7.x86_64
Target RPM Packages
Policy RPM                    selinux-policy-3.13.1-102.el7_3.13.noarch
Selinux Enabled               True
Policy Type                   targeted
Enforcing Mode                Enforcing
Host Name                     li-lc-1578
Platform                      Linux li-lc-1578 3.10.0-514.6.1.el7.x86_64 #1 SMP
                              Sat Dec 10 11:15:38 EST 2016 x86_64 x86_64
Alert Count                   100
First Seen                    2017-03-09 14:01:48 UTC
Last Seen                     2017-03-09 14:02:28 UTC
Local ID                      6dac971a-d8f8-4a22-891a-0e1ca764811c

Raw Audit Messages
type=AVC msg=audit(1489068148.691:53517): avc:  denied  { read write } for  pid=28819 comm="ruby" name="context" dev="selinuxfs" ino=5 scontext=system_u:system_r:passenger_t:s0 tcontext=system_u:object_r:security_t:s0 tclass=file


type=SYSCALL msg=audit(1489068148.691:53517): arch=x86_64 syscall=open success=no exit=EACCES a0=7f2847ba92e0 a1=2 a2=7f2847ba92f7 a3=f items=0 ppid=28792 pid=28819 auid=4294967295 uid=52 gid=52 euid=52 suid=52 fsuid=52 egid=52 sgid=52 fsgid=52 tty=(none) ses=4294967295 comm=ruby exe=/usr/bin/ruby subj=system_u:system_r:passenger_t:s0 key=(null)

Hash: ruby,passenger_t,security_t,file,read,write
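
Raw AVC records like the one above can be split into fields with a small parser. A sketch, assuming standard audit.log formatting; the `parse_avc` helper is hypothetical and the sample record is copied from the alert above:

```python
import re

# Pull the denied permissions and the source/target labels out of a
# raw type=AVC audit record.
AVC_RE = re.compile(
    r"avc:\s+denied\s+\{\s*(?P<perms>[^}]+?)\s*\}"
    r".*?scontext=(?P<scontext>\S+)"
    r"\s+tcontext=(?P<tcontext>\S+)"
    r"\s+tclass=(?P<tclass>\S+)"
)

def parse_avc(record: str):
    """Return {perms, scontext, tcontext, tclass} for an AVC denial
    record, or None for non-AVC lines."""
    m = AVC_RE.search(record)
    if not m:
        return None
    fields = m.groupdict()
    fields["perms"] = fields["perms"].split()
    return fields

record = (
    'type=AVC msg=audit(1489068148.691:53517): avc:  denied  { read write } '
    'for  pid=28819 comm="ruby" name="context" dev="selinuxfs" ino=5 '
    'scontext=system_u:system_r:passenger_t:s0 '
    'tcontext=system_u:object_r:security_t:s0 tclass=file'
)
print(parse_avc(record))
```

Applied to the record above, it yields scontext passenger_t, tcontext security_t, tclass file, and the denied permissions read and write — matching the sealert analysis.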

Comment 40 Peter Vreman 2017-03-09 14:07:06 UTC
Created attachment 1261582 [details]
selinux avs with dontaudit disabled

Comment 41 Peter Vreman 2017-03-09 14:07:46 UTC
Created attachment 1261583 [details]
/var/log/messages during the reproducing

Comment 42 Petr Lautrbach 2017-03-09 15:08:10 UTC
So it's most likely a problem with passenger_t, which is not allowed to check_context.

As a workaround, a simple SELinux module can be used until it's fixed correctly in the right place.

1. create a module:

# cat > passenger-allow-check-context.cil <<EOF
(allow passenger_t security_t (security (check_context)))
EOF

2. load the module

# semodule -i passenger-allow-check-context.cil

If it fixes the problem you can re-enable dontaudit rules:

# semodule -B

Note: do not forget to remove this module when an appropriate fix is delivered:

# semodule -r passenger-allow-check-context

Comment 43 Peter Vreman 2017-03-09 17:10:23 UTC
The proposed module does not work

[crash] root@li-lc-1578:~# semodule -l | grep passenger
passenger       1.1.1
passenger-allow-check-context   (null)

I still get the following alert (with dontaudit disabled):

[crash] root@li-lc-1578:~# sealert -l 6dac971a-d8f8-4a22-891a-0e1ca764811c
SELinux is preventing /usr/bin/ruby from 'read, write' accesses on the file context.

*****  Plugin setenforce (91.4 confidence) suggests   ************************

If you believe /usr/bin/ruby tried to disable SELinux.
Then you may be under attack by a hacker, since confined applications should never need this access.
Do
contact your security administrator and report this issue.

*****  Plugin catchall (9.59 confidence) suggests   **************************

If you believe that ruby should be allowed read write access on the context file by default.
Then you should report this as a bug.
You can generate a local policy module to allow this access.
Do
allow this access for now by executing:
# ausearch -c 'ruby' --raw | audit2allow -M my-ruby
# semodule -i my-ruby.pp


Additional Information:
Source Context                system_u:system_r:passenger_t:s0
Target Context                system_u:object_r:security_t:s0
Target Objects                context [ file ]
Source                        ruby
Source Path                   /usr/bin/ruby
Port                          <Unknown>
Host                          li-lc-1578
Source RPM Packages           ruby-2.0.0.648-29.el7.x86_64
Target RPM Packages
Policy RPM                    selinux-policy-3.13.1-102.el7_3.13.noarch
Selinux Enabled               True
Policy Type                   targeted
Enforcing Mode                Enforcing
Host Name                     li-lc-1578
Platform                      Linux li-lc-1578 3.10.0-514.6.1.el7.x86_64 #1 SMP
                              Sat Dec 10 11:15:38 EST 2016 x86_64 x86_64
Alert Count                   302
First Seen                    2017-03-09 14:01:48 UTC
Last Seen                     2017-03-09 17:01:04 UTC
Local ID                      6dac971a-d8f8-4a22-891a-0e1ca764811c

Raw Audit Messages
type=AVC msg=audit(1489078864.288:57132): avc:  denied  { read write } for  pid=23433 comm="ruby" name="context" dev="selinuxfs" ino=5 scontext=system_u:system_r:passenger_t:s0 tcontext=system_u:object_r:security_t:s0 tclass=file


type=SYSCALL msg=audit(1489078864.288:57132): arch=x86_64 syscall=open success=no exit=EACCES a0=7f1c54bff2e0 a1=2 a2=7f1c54bff2f7 a3=f items=0 ppid=1 pid=23433 auid=4294967295 uid=52 gid=52 euid=52 suid=52 fsuid=52 egid=52 sgid=52 fsgid=52 tty=(none) ses=4294967295 comm=ruby exe=/usr/bin/ruby subj=system_u:system_r:passenger_t:s0 key=(null)

Hash: ruby,passenger_t,security_t,file,read,write

Comment 44 Petr Lautrbach 2017-03-10 09:43:14 UTC
Apparently I was too fast with the module and didn't try it. I'm sorry.

I have a version which works for me now:

# cat > passenger-allow-check-context.cil <<EOF
(typeattributeset cil_gen_require passenger_t)
(typeattributeset cil_gen_require security_t)
(allow passenger_t security_t (file (append getattr ioctl lock open read write)))
(allow passenger_t security_t (security (check_context)))
EOF

# semodule -i passenger-allow-check-context.cil

It should fix problems related to passenger_t trying to do check_context.

Comment 45 Lukas Zapletal 2017-03-10 14:15:27 UTC
Peter can you get us ps auxwwwZ | grep passenger output? We have suspicion that the application is running in a different context.

Comment 46 Johan Bergström 2017-03-13 09:42:13 UTC
~]$ ps auxwwwZ | grep -E 'passenger|http|puppet' | grep -v grep
system_u:system_r:puppetagent_t:s0 root   1238  0.0  0.2 256216 45236 ?        Ssl  Feb11   1:52 /usr/bin/ruby /usr/bin/puppet agent --no-daemonize
system_u:system_r:passenger_t:s0 puppet   5239  1.1  0.3 290332 56812 ?        Sl   08:16   2:23 Passenger RackApp: /etc/puppet/rack
system_u:system_r:passenger_t:s0 puppet  10130  1.3  0.3 290344 55592 ?        Sl   11:19   0:17 Passenger RackApp: /etc/puppet/rack
system_u:system_r:httpd_t:s0    apache   12057  0.9  0.0 199228 11764 ?        S    08:49   1:41 /usr/sbin/httpd -DFOREGROUND
system_u:system_r:passenger_t:s0 puppet  13173  0.5  0.2 153860 47948 ?        Sl   11:37   0:01 Passenger AppPreloader: /etc/puppet/rack
system_u:system_r:passenger_t:s0 puppet  13233  1.6  0.3 290336 55396 ?        Sl   11:37   0:03 Passenger RackApp: /etc/puppet/rack
system_u:system_r:passenger_t:s0 puppet  13732  0.3  0.3 290432 54160 ?        Sl   11:39   0:00 Passenger RackApp: /etc/puppet/rack
system_u:system_r:passenger_t:s0 puppet  14005  0.0  0.1 105916 16720 ?        Ssl  11:41   0:00 ruby /etc/puppet/node.rb lus00103.lij.fi
system_u:system_r:httpd_t:s0    root     15969  0.0  0.0 196344  9772 ?        Ss   Mar06   0:32 /usr/sbin/httpd -DFOREGROUND
system_u:system_r:httpd_t:s0    apache   18383  0.1  0.4 968448 69576 ?        Sl   04:02   0:29 (wsgi:pulp)     -DFOREGROUND
system_u:system_r:httpd_t:s0    apache   18384  0.1  0.4 968452 69564 ?        Sl   04:02   0:29 (wsgi:pulp)     -DFOREGROUND
system_u:system_r:httpd_t:s0    apache   18385  0.1  0.4 1033984 69568 ?       Sl   04:02   0:30 (wsgi:pulp)     -DFOREGROUND
system_u:system_r:httpd_t:s0    apache   18386  0.0  0.2 622160 33868 ?        Sl   04:02   0:02 (wsgi:pulp-cont -DFOREGROUND
system_u:system_r:httpd_t:s0    apache   18387  0.0  0.2 687696 34244 ?        Sl   04:02   0:02 (wsgi:pulp-cont -DFOREGROUND
system_u:system_r:httpd_t:s0    apache   18388  0.0  0.2 687696 34184 ?        Sl   04:02   0:02 (wsgi:pulp-cont -DFOREGROUND
system_u:system_r:httpd_t:s0    apache   18389  0.0  0.3 870656 61928 ?        Sl   04:02   0:07 (wsgi:pulp_forg -DFOREGROUND
system_u:system_r:httpd_t:s0    apache   18390  0.0  0.3 805124 61928 ?        Sl   04:02   0:07 (wsgi:pulp_forg -DFOREGROUND
system_u:system_r:httpd_t:s0    apache   18391  0.0  0.3 805124 61928 ?        Sl   04:02   0:07 (wsgi:pulp_forg -DFOREGROUND
system_u:system_r:passenger_t:s0 root    18392  0.0  0.0 214780  1840 ?        Ssl  04:02   0:00 PassengerWatchdog
system_u:system_r:passenger_t:s0 root    18395  0.2  0.0 1226396 3832 ?        Sl   04:02   1:02 PassengerHelperAgent
system_u:system_r:passenger_t:s0 nobody  18402  0.0  0.0 217032  3880 ?        Sl   04:02   0:00 PassengerLoggingAgent
system_u:system_r:httpd_t:s0    apache   24323  0.9  0.0 199216 11760 ?        S    09:49   1:02 /usr/sbin/httpd -DFOREGROUND
system_u:system_r:httpd_t:s0    apache   26703  0.9  0.0 199220 11704 ?        S    10:02   0:55 /usr/sbin/httpd -DFOREGROUND
system_u:system_r:httpd_t:s0    apache   26947  0.9  0.0 199216 11432 ?        S    10:04   0:54 /usr/sbin/httpd -DFOREGROUND
system_u:system_r:httpd_t:s0    apache   26989  0.9  0.0 199120 11452 ?        S    10:04   0:52 /usr/sbin/httpd -DFOREGROUND
system_u:system_r:httpd_t:s0    apache   27016  0.9  0.0 199140 11404 ?        S    10:04   0:55 /usr/sbin/httpd -DFOREGROUND
system_u:system_r:httpd_t:s0    apache   27091  0.9  0.0 199236 11528 ?        S    10:04   0:53 /usr/sbin/httpd -DFOREGROUND
system_u:system_r:httpd_t:s0    apache   28052  0.8  0.0 199132 11308 ?        S    10:11   0:48 /usr/sbin/httpd -DFOREGROUND
system_u:system_r:httpd_t:s0    apache   28080  0.9  0.0 199188 11464 ?        S    10:11   0:50 /usr/sbin/httpd -DFOREGROUND
system_u:system_r:httpd_t:s0    apache   28375  0.8  0.0 199128 11316 ?        S    10:12   0:45 /usr/sbin/httpd -DFOREGROUND
system_u:system_r:httpd_t:s0    apache   29503  0.8  0.0 199164 11344 ?        S    10:16   0:42 /usr/sbin/httpd -DFOREGROUND
system_u:system_r:httpd_t:s0    apache   31470  0.8  0.0 199136 11340 ?        S    10:25   0:36 /usr/sbin/httpd -DFOREGROUND
system_u:system_r:httpd_t:s0    apache   31471  0.8  0.0 199132 11320 ?        S    10:25   0:40 /usr/sbin/httpd -DFOREGROUND
system_u:system_r:httpd_t:s0    apache   31472  0.8  0.0 199128 11340 ?        S    10:25   0:39 /usr/sbin/httpd -DFOREGROUND

Comment 47 Peter Vreman 2017-03-13 10:43:47 UTC
[crash] root@li-lc-1578:~# ps auxwwwZ | grep passenger
system_u:system_r:passenger_t:s0 root     2958  0.0  0.0 214780  1804 ?        Ssl  Mar12   0:00 PassengerWatchdog
system_u:system_r:passenger_t:s0 root     2961  1.1  0.0 1556052 8292 ?        Sl   Mar12  22:04 PassengerHelperAgent
system_u:system_r:passenger_t:s0 nobody   2967  0.0  0.0 217032  3864 ?        Sl   Mar12   0:00 PassengerLoggingAgent
system_u:system_r:passenger_t:s0 foreman  3138  0.0  0.0   9508  1320 ?        S    Mar12   0:00 /bin/bash /usr/bin/tfm-ruby /usr/share/gems/gems/passenger-4.0.18/helper-scripts/rack-preloader.rb
system_u:system_r:passenger_t:s0 foreman  3146  0.0  0.0   4300   560 ?        S    Mar12   0:00 scl enable tfm bash /tmp/tmp.y7El9nZ7O6
system_u:system_r:passenger_t:s0 foreman  3147  0.0  0.0   9504  1384 ?        S    Mar12   0:00 /bin/bash /var/tmp/sclJwlutb
system_u:system_r:passenger_t:s0 foreman  3188  0.0  0.0   9508  1140 ?        S    Mar12   0:00 bash /tmp/tmp.y7El9nZ7O6
system_u:system_r:passenger_t:s0 foreman  3189  0.0  0.4 701564 298192 ?       Sl   Mar12   0:22 Passenger AppPreloader: /usr/share/foreman
system_u:system_r:passenger_t:s0 foreman  3417  0.0  1.0 2306468 715708 ?      Sl   Mar12   1:42 Passenger RackApp: /usr/share/foreman
system_u:system_r:passenger_t:s0 foreman  3446  0.2  0.9 2241028 641864 ?      Sl   Mar12   4:14 Passenger RackApp: /usr/share/foreman
system_u:system_r:passenger_t:s0 foreman  3483  0.0  0.4 1905664 326704 ?      Sl   Mar12   0:06 Passenger RackApp: /usr/share/foreman
system_u:system_r:passenger_t:s0 foreman  3511  0.0  0.4 1905760 325856 ?      Sl   Mar12   0:05 Passenger RackApp: /usr/share/foreman
system_u:system_r:passenger_t:s0 foreman  3539  0.0  0.5 1905856 329080 ?      Sl   Mar12   0:05 Passenger RackApp: /usr/share/foreman
system_u:system_r:passenger_t:s0 foreman  3569  0.0  0.5 1905952 333672 ?      Sl   Mar12   0:05 Passenger RackApp: /usr/share/foreman
system_u:system_r:passenger_t:s0 foreman  3597  0.1  1.1 2372580 774808 ?      Sl   Mar12   3:01 Passenger RackApp: /usr/share/foreman
system_u:system_r:passenger_t:s0 foreman  3627  0.0  1.0 2302512 722672 ?      Sl   Mar12   0:15 Passenger RackApp: /usr/share/foreman
system_u:system_r:passenger_t:s0 foreman  6984  0.0  0.1 490240 112440 ?       Sl   Mar06   0:04 dynflow_executor_monitor
system_u:system_r:passenger_t:s0 foreman  6987  0.6  4.3 4745500 2839348 ?     Sl   Mar06  64:05 dynflow_executor
system_u:system_r:passenger_t:s0 puppet   9358  2.8  0.0 155292 49536 ?        Sl   10:42   0:01 Passenger AppPreloader: /etc/puppet/rack
system_u:system_r:passenger_t:s0 puppet   9374  8.7  0.1 291852 74700 ?        Sl   10:42   0:03 Passenger RackApp: /etc/puppet/rack
unconfined_u:unconfined_r:unconfined_t:s0-s0:c0.c1023 root 10293 0.0  0.0 112656 968 pts/1 S+ 10:43   0:00 grep --color=auto passenger

Comment 48 Peter Vreman 2017-03-13 10:44:59 UTC
Confirmed that the fix in https://bugzilla.redhat.com/show_bug.cgi?id=1202924#c44 works.

Comment 49 Lukas Zapletal 2017-03-13 15:22:37 UTC
Thank you, we will provide a short-term fix for Passenger 4, and in the future (Passenger 5) we will rebase on their policy.

Comment 50 Satellite Program 2017-03-13 16:19:38 UTC
Upstream bug assigned to lzap

Comment 52 Lukas Zapletal 2017-05-12 08:09:47 UTC
*** Bug 1219964 has been marked as a duplicate of this bug. ***

Comment 53 Lukas Zapletal 2017-05-12 08:09:52 UTC
*** Bug 1364124 has been marked as a duplicate of this bug. ***

Comment 54 Lukas Zapletal 2017-05-12 08:45:56 UTC
Final resolution is:

The error "file_contexts: invalid context" is harmless and does not affect operation of the puppet master. To get rid of it, use the following (RHEL 7.3 only):

cat > passenger-allow-check-context.cil <<EOF
(typeattributeset cil_gen_require passenger_t)
(typeattributeset cil_gen_require security_t)
(allow passenger_t security_t (file (append getattr ioctl lock open read write)))
(allow passenger_t security_t (security (check_context)))
EOF
semodule -i passenger-allow-check-context.cil
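As a quick sanity check before loading the module, the CIL file can be inspected for the two `allow` rules; a sketch, assuming the file was created with the heredoc above:

```shell
# Recreate the CIL policy exactly as in the workaround above.
cat > passenger-allow-check-context.cil <<EOF
(typeattributeset cil_gen_require passenger_t)
(typeattributeset cil_gen_require security_t)
(allow passenger_t security_t (file (append getattr ioctl lock open read write)))
(allow passenger_t security_t (security (check_context)))
EOF

# Both allow rules must be present; count them (expect 2).
grep -c '^(allow passenger_t security_t' passenger-allow-check-context.cil
```

After `semodule -i`, the module should also show up in `semodule -l | grep passenger-allow-check-context`, and `semodule -r passenger-allow-check-context` backs the workaround out again.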

On RHEL 6 it takes a few more commands:

yum install policycoreutils-python setools-console selinux-policy-devel
mkdir passenger-allow-check-context
cd passenger-allow-check-context
cat >passenger-allow-check-context.te <<'EOP'
policy_module(passenger-allow-check-context, 1.0)
gen_require(`
 type passenger_t;
')
selinux_validate_context(passenger_t)
EOP
make -f /usr/share/selinux/devel/Makefile
semodule -i passenger-allow-check-context.pp
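Before invoking `make`, it can help to sanity-check the generated `.te` file, since the backquote/single-quote pair around the `gen_require` block is easy to mangle when pasting. A minimal check (the heredoc delimiter is quoted here so the shell passes the backquote through literally):

```shell
# Recreate the policy source; <<'EOP' keeps the gen_require backquote literal.
cat > passenger-allow-check-context.te <<'EOP'
policy_module(passenger-allow-check-context, 1.0)
gen_require(`
 type passenger_t;
')
selinux_validate_context(passenger_t)
EOP

# The build needs both the module header and the required type declaration.
grep -q 'policy_module(passenger-allow-check-context' passenger-allow-check-context.te && echo "module header ok"
grep -q 'type passenger_t;' passenger-allow-check-context.te && echo "gen_require ok"
```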

Updated KBASE article: https://access.redhat.com/node/2576541 (currently a draft; it will be published soon).

Upstream patch is pending. Scheduling this bug to go into 6.3.

Thanks to all involved in this; we were unable to reproduce the issue ourselves.

Comment 55 Bryan Kearney 2017-05-23 14:53:04 UTC
This did not make 6.3. Moving to backlog.

Comment 61 Satellite Program 2017-08-31 16:19:42 UTC
Moving this bug to POST for triage into Satellite 6 since the upstream issue http://projects.theforeman.org/issues/9805 has been resolved.

Comment 62 Og Maciel 2017-11-28 20:07:21 UTC
Cannot reproduce it with satellite-6.3.0-21.0.beta.el7sat.noarch.

Comment 63 Satellite Program 2018-02-21 16:54:17 UTC
Since the problem described in this bug report should be resolved in a recent advisory, it has been closed with a resolution of ERRATA.

For information on the advisory, and where to find the updated files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHSA-2018:0336

