RDO tickets are now tracked in Jira https://issues.redhat.com/projects/RDO/issues/
Bug 1049895 - [RDO][OpenStack-SELinux]: AVCs left in messages after deployment of neutron-controller / neutron-compute using foreman.
Summary: [RDO][OpenStack-SELinux]: AVCs left in messages after deployment of neutron-controller / neutron-compute using foreman.
Keywords:
Status: CLOSED CURRENTRELEASE
Alias: None
Product: RDO
Classification: Community
Component: openstack-foreman-installer
Version: unspecified
Hardware: x86_64
OS: Linux
Priority: high
Severity: medium
Target Milestone: ---
Target Release: ---
Assignee: Jason Guiditta
QA Contact: yeylon@redhat.com
URL:
Whiteboard:
Depends On:
Blocks: 1053623
 
Reported: 2014-01-08 12:57 UTC by Omri Hochman
Modified: 2016-04-18 06:47 UTC
CC List: 8 users

Fixed In Version:
Doc Type: Bug Fix
Doc Text:
Clone Of:
Environment:
Last Closed:
Embargoed:


Attachments (Terms of Use)
messages.log (1.62 MB, text/plain)
2014-01-08 12:59 UTC, Omri Hochman
no flags

Description Omri Hochman 2014-01-08 12:57:09 UTC
[RDO][OpenStack-SELinux]: AVCs left in messages after deployment of neutron-controller using foreman.

Environment (RDO Jan 2014): 
------------
selinux-policy-3.7.19-231.el6.noarch
selinux-policy-targeted-3.7.19-231.el6.noarch
python-neutron-2014.1-0.1.b1.el6.noarch
openstack-neutron-2014.1-0.1.b1.el6.noarch
openstack-neutron-openvswitch-2014.1-0.1.b1.el6.noarch



Steps:
------
- Attempt to deploy neutron-controller using foreman_server.sh.

Results: 
----------
- Installation of neutron-controller finished successfully.
- AVC errors remain in /var/log/messages.


Jan  8 13:11:27 puma03 puppet-agent[14842]: Finished catalog run in 63.15 seconds
Jan  8 13:40:14 puma03 kernel: type=1400 audit(1389181214.663:409): avc:  denied  { read } for  pid=16589 comm="ip" path="/proc/16499/status" dev=proc ino=217472 scontext=unconfined_u:system_r:ifconfig_t:s0 tcontext=unconfined_u:system_r:initrc_t:s0 tclass=file
Jan  8 13:40:14 puma03 kernel: type=1400 audit(1389181214.664:410): avc:  denied  { read } for  pid=16590 comm="ip" path="/proc/16499/status" dev=proc ino=217472 scontext=unconfined_u:system_r:ifconfig_t:s0 tcontext=unconfined_u:system_r:initrc_t:s0 tclass=file
Jan  8 13:40:14 puma03 kernel: type=1400 audit(1389181214.667:411): avc:  denied  { read } for  pid=16591 comm="ip" path="/proc/16499/status" dev=proc ino=217472 scontext=unconfined_u:system_r:ifconfig_t:s0 tcontext=unconfined_u:system_r:initrc_t:s0 tclass=file
Jan  8 13:40:14 puma03 kernel: type=1400 audit(1389181214.669:412): avc:  denied  { read } for  pid=16592 comm="ip" path="/proc/16499/status" dev=proc ino=217472 scontext=unconfined_u:system_r:ifconfig_t:s0 tcontext=unconfined_u:system_r:initrc_t:s0 tclass=file
Jan  8 13:40:29 puma03 puppet-agent[16499]: (/Stage[main]/Apache/File[/etc/httpd/conf/httpd.conf]/content) content changed '{md5}716b40d1de69187bba4a36821f699591' to '{md5}ec999e9a079a8fad9e374a8e01df47f4'
Jan  8 13:40:29 puma03 kernel: type=1400 audit(1389181229.306:413): avc:  denied  { read write } for  pid=16803 comm="rsync" path="/tmp/puppet20140108-16499-1b1ril9-0" dev=dm-0 ino=524356 scontext=unconfined_u:system_r:rsync_t:s0 tcontext=unconfined_u:object_r:initrc_tmp_t:s0 tclass=file
Jan  8 13:40:29 puma03 kernel: type=1400 audit(1389181229.309:414): avc:  denied  { name_connect } for  pid=16803 comm="rsync" dest=873 scontext=unconfined_u:system_r:rsync_t:s0 tcontext=system_u:object_r:rsync_port_t:s0 tclass=tcp_socket
Jan  8 13:40:29 puma03 kernel: type=1400 audit(1389181229.492:415): avc:  denied  { read write } for  pid=16810 comm="rsync" path="/tmp/puppet20140108-16499-t5g2b2-0" dev=dm-0 ino=524356 scontext=unconfined_u:system_r:rsync_t:s0 tcontext=unconfined_u:object_r:initrc_tmp_t:s0 tclass=file
Jan  8 13:40:29 puma03 kernel: type=1400 audit(1389181229.492:416): avc:  denied  { name_connect } for  pid=16810 comm="rsync" dest=873 scontext=unconfined_u:system_r:rsync_t:s0 tcontext=system_u:object_r:rsync_port_t:s0 tclass=tcp_socket
Jan  8 13:40:30 puma03 puppet-agent[16499]: (/Stage[main]/Heat::Engine/Heat_config[DEFAULT/auth_encryption_key]/value) value changed 'cb750ac50459614a6554e2380cef35ec' to '%ENCRYPTION_KEY%'
Jan  8 13:40:30 puma03 kernel: type=1400 audit(1389181230.417:417): avc:  denied  { read write } for  pid=16820 comm="rsync" path="/tmp/puppet20140108-16499-189mvvj-0" dev=dm-0 ino=524356 scontext=unconfined_u:system_r:rsync_t:s0 tcontext=unconfined_u:object_r:initrc_tmp_t:s0 tclass=file
Jan  8 13:40:30 puma03 kernel: type=1400 audit(1389181230.418:418): avc:  denied  { name_connect } for  pid=16820 comm="rsync" dest=873 scontext=unconfined_u:system_r:rsync_t:s0 tcontext=system_u:object_r:rsync_port_t:s0 tclass=tcp_socket
Jan  8 13:40:32 puma03 puppet-agent[16499]: (/Stage[main]/Swift::Proxy/Service[swift-proxy]/ensure) ensure changed 'stopped' to 'running'
Jan  8 13:40:32 puma03 abrt: detected unhandled Python exception in '/usr/bin/swift-proxy-server'
Jan  8 13:40:32 puma03 abrtd: New client connected
Jan  8 13:40:32 puma03 abrtd: Directory 'pyhook-2014-01-08-13:40:32-16906' creation detected
Jan  8 13:40:32 puma03 abrt-server[16924]: Saved Python crash dump of pid 16906 to /var/spool/abrt/pyhook-2014-01-08-13:40:32-16906
Jan  8 13:40:32 puma03 abrtd: Package 'openstack-swift-proxy' isn't signed with proper key
Jan  8 13:40:32 puma03 abrtd: 'post-create' on '/var/spool/abrt/pyhook-2014-01-08-13:40:32-16906' exited with 1
Jan  8 13:40:32 puma03 abrtd: Deleting problem directory '/var/spool/abrt/pyhook-2014-01-08-13:40:32-16906'
Jan  8 13:40:32 puma03 puppet-agent[16499]: (/Stage[main]/Horizon/File_line[httpd_listen_on_bind_address_80]/ensure) created
Jan  8 13:40:43 puma03 puppet-agent[16499]: (/Stage[main]/Quickstack::Neutron::Controller/Exec[neutron-db-manage upgrade]/returns) executed successfully
Jan  8 13:41:32 puma03 puppet-agent[16499]: (/Stage[main]/Heat::Api_cfn/Service[heat-api-cfn]) Triggered 'refresh' from 1 events
Jan  8 13:41:33 puma03 puppet-agent[16499]: (/Stage[main]/Heat::Api/Service[heat-api]) Triggered 'refresh' from 1 events
Jan  8 13:41:33 puma03 puppet-agent[16499]: (/Stage[main]/Heat::Engine/Exec[heat-encryption-key-replacement]/returns) executed successfully
Jan  8 13:41:33 puma03 puppet-agent[16499]: (/Stage[main]/Heat::Engine/Service[heat-engine]) Triggered 'refresh' from 1 events
Jan  8 13:41:33 puma03 puppet-agent[16499]: (/Stage[main]/Heat::Api_cloudwatch/Service[heat-api-cloudwatch]) Triggered 'refresh' from 1 events
Jan  8 13:41:34 puma03 puppet-agent[16499]: (/Stage[main]/Apache/Service[httpd]) Triggered 'refresh' from 2 events
Jan  8 13:41:34 puma03 puppet-agent[16499]: Finished catalog run in 69.78 seconds
Jan  8 14:10:15 puma03 kernel: type=1400 audit(1389183015.645:419): avc:  denied  { read } for  pid=18313 comm="ip" path="/proc/18223/status" dev=proc ino=222926 scontext=unconfined_u:system_r:ifconfig_t:s0 tcontext=unconfined_u:system_r:initrc_t:s0 tclass=file
Jan  8 14:10:15 puma03 kernel: type=1400 audit(1389183015.647:420): avc:  denied  { read } for  pid=18314 comm="ip" path="/proc/18223/status" dev=proc ino=222926 scontext=unconfined_u:system_r:ifconfig_t:s0 tcontext=unconfined_u:system_r:initrc_t:s0 tclass=file
Jan  8 14:10:15 puma03 kernel: type=1400 audit(1389183015.649:421): avc:  denied  { read } for  pid=18315 comm="ip" path="/proc/18223/status" dev=proc ino=222926 scontext=unconfined_u:system_r:ifconfig_t:s0 tcontext=unconfined_u:system_r:initrc_t:s0 tclass=file
Jan  8 14:10:15 puma03 kernel: type=1400 audit(1389183015.650:422): avc:  denied  { read } for  pid=18316 comm="ip" path="/proc/18223/status" dev=proc ino=222926 scontext=unconfined_u:system_r:ifconfig_t:s0 tcontext=unconfined_u:system_r:initrc_t:s0 tclass=file
Jan  8 14:10:29 puma03 puppet-agent[18223]: (/Stage[main]/Apache/File[/etc/httpd/conf/httpd.conf]/content) content changed '{md5}716b40d1de69187bba4a36821f699591' to '{md5}ec999e9a079a8fad9e374a8e01df47f4'
Jan  8 14:10:29 puma03 kernel: type=1400 audit(1389183029.819:423): avc:  denied  { read write } for  pid=18527 comm="rsync" path="/tmp/puppet20140108-18223-5dcjp8-0" dev=dm-0 ino=524356 scontext=unconfined_u:system_r:rsync_t:s0 tcontext=unconfined_u:object_r:initrc_tmp_t:s0 tclass=file
Jan  8 14:10:29 puma03 kernel: type=1400 audit(1389183029.822:424): avc:  denied  { name_connect } for  pid=18527 comm="rsync" dest=873 scontext=unconfined_u:system_r:rsync_t:s0 tcontext=system_u:object_r:rsync_port_t:s0 tclass=tcp_socket
Jan  8 14:10:29 puma03 kernel: type=1400 audit(1389183029.948:425): avc:  denied  { read write } for  pid=18534 comm="rsync" path="/tmp/puppet20140108-18223-vhuzc5-0" dev=dm-0 ino=524356 scontext=unconfined_u:system_r:rsync_t:s0 tcontext=unconfined_u:object_r:initrc_tmp_t:s0 tclass=file
Jan  8 14:10:29 puma03 kernel: type=1400 audit(1389183029.948:426): avc:  denied  { name_connect } for  pid=18534 comm="rsync" dest=873 scontext=unconfined_u:system_r:rsync_t:s0 tcontext=system_u:object_r:rsync_port_t:s0 tclass=tcp_socket
Jan  8 14:10:30 puma03 puppet-agent[18223]: (/Stage[main]/Heat::Engine/Heat_config[DEFAULT/auth_encryption_key]/value) value changed '9467294f037a797e619afafc7b3106da' to '%ENCRYPTION_KEY%'
Jan  8 14:10:30 puma03 kernel: type=1400 audit(1389183030.796:427): avc:  denied  { read write } for  pid=18544 comm="rsync" path="/tmp/puppet20140108-18223-wfa1ed-0" dev=dm-0 ino=524356 scontext=unconfined_u:system_r:rsync_t:s0 tcontext=unconfined_u:object_r:initrc_tmp_t:s0 tclass=file
Jan  8 14:10:30 puma03 kernel: type=1400 audit(1389183030.797:428): avc:  denied  { name_connect } for  pid=18544 comm="rsync" dest=873 scontext=unconfined_u:system_r:rsync_t:s0 tcontext=system_u:object_r:rsync_port_t:s0 tclass=tcp_socket
Jan  8 14:10:32 puma03 puppet-agent[18223]: (/Stage[main]/Swift::Proxy/Service[swift-proxy]/ensure) ensure changed 'stopped' to 'running'
Jan  8 14:10:32 puma03 abrt: detected unhandled Python exception in '/usr/bin/swift-proxy-server'
Jan  8 14:10:32 puma03 abrtd: New client connected
Jan  8 14:10:32 puma03 abrtd: Directory 'pyhook-2014-01-08-14:10:32-18630' creation detected
Jan  8 14:10:32 puma03 abrt-server[18648]: Saved Python crash dump of pid 18630 to /var/spool/abrt/pyhook-2014-01-08-14:10:32-18630
Jan  8 14:10:33 puma03 abrtd: Package 'openstack-swift-proxy' isn't signed with proper key
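The denial records above repeat a small set of (comm, scontext) pairs: `ip` running as `ifconfig_t` and `rsync` running as `rsync_t`. A minimal triage sketch that summarizes those pairs from a syslog-style file — the helper and sample file here are illustrative (the sample reuses two lines from this report); on a real host you would point it at /var/log/messages:

```shell
# Summarize unique (comm, scontext) pairs from AVC denial lines.
LOGFILE=$(mktemp)
cat > "$LOGFILE" <<'EOF'
Jan  8 13:40:14 puma03 kernel: type=1400 audit(1389181214.663:409): avc:  denied  { read } for  pid=16589 comm="ip" path="/proc/16499/status" dev=proc ino=217472 scontext=unconfined_u:system_r:ifconfig_t:s0 tcontext=unconfined_u:system_r:initrc_t:s0 tclass=file
Jan  8 13:40:29 puma03 kernel: type=1400 audit(1389181229.309:414): avc:  denied  { name_connect } for  pid=16803 comm="rsync" dest=873 scontext=unconfined_u:system_r:rsync_t:s0 tcontext=system_u:object_r:rsync_port_t:s0 tclass=tcp_socket
EOF
# Extract the comm= and scontext= fields of each denial and count occurrences.
summary=$(grep 'avc: *denied' "$LOGFILE" \
  | sed -n 's/.*comm="\([^"]*\)".*scontext=\([^ ]*\).*/\1 \2/p' \
  | sort | uniq -c)
echo "$summary"
rm -f "$LOGFILE"
```

Each output row is a count followed by the command name and its source context, which is usually enough to see which confined domains are generating the noise.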

Comment 1 Omri Hochman 2014-01-08 12:59:56 UTC
Created attachment 847114 [details]
messages.log

I'm attaching /var/log/messages.

Note: there is no openstack-selinux package installed on the machine (might be a puppet-modules issue?).

Comment 2 Omri Hochman 2014-01-08 13:03:17 UTC
(It's RHEL 6.5.)

Adding Info regarding the *foreman-server* packages:

foreman-server$  rpm -qa | grep foreman
----------------------------------------
ruby193-rubygem-foreman_simplify-0.0.5-1.el6.noarch
foreman-proxy-1.3.0-1.el6.noarch
foreman-1.3.2-1.el6.noarch
foreman-mysql-1.3.2-1.el6.noarch
openstack-foreman-installer-1.0.1-2.el6.noarch
foreman-selinux-1.3.0-1.el6.noarch
foreman-mysql2-1.3.2-1.el6.noarch
rubygem-foreman_api-0.1.9-1.el6.noarch
foreman-installer-1.3.1-1.el6.noarch
selinux-policy-3.7.19-231.el6.noarch
selinux-policy-targeted-3.7.19-231.el6.noarch
puppet-server-3.4.2-1.el6.noarch
puppet-3.4.2-1.el6.noarch
packstack-modules-puppet-2013.2.1-0.27.dev936.el6.noarch

Comment 3 Omri Hochman 2014-01-08 13:38:53 UTC
Note: I've encountered the same AVCs in /var/log/messages on the neutron-compute machine after deployment by Foreman:

Jan  8 15:11:31 puma02 kernel: type=1400 audit(1389186691.782:4): avc:  denied  { read } for  pid=25666 comm="ip" path="/proc/25501/status" dev=proc ino=41702 scontext=unconfined_u:system_r:ifconfig_t:s0 tcontext=unconfined_u:system_r:initrc_t:s0 tclass=file
Jan  8 15:11:31 puma02 kernel: type=1400 audit(1389186691.784:5): avc:  denied  { read } for  pid=25667 comm="ip" path="/proc/25501/status" dev=proc ino=41702 scontext=unconfined_u:system_r:ifconfig_t:s0 tcontext=unconfined_u:system_r:initrc_t:s0 tclass=file
Jan  8 15:11:31 puma02 kernel: type=1400 audit(1389186691.791:6): avc:  denied  { read } for  pid=25668 comm="ip" path="/proc/25501/status" dev=proc ino=41702 scontext=unconfined_u:system_r:ifconfig_t:s0 tcontext=unconfined_u:system_r:initrc_t:s0 tclass=file
Jan  8 15:11:31 puma02 kernel: type=1400 audit(1389186691.793:7): avc:  denied  { read } for  pid=25669 comm="ip" path="/proc/25501/status" dev=proc ino=41702 scontext=unconfined_u:system_r:ifconfig_t:s0 tcontext=unconfined_u:system_r:initrc_t:s0 tclass=file
Jan  8 15:11:37 puma02 puppet-agent[25501]: Finished catalog run in 0.10

Comment 4 Omri Hochman 2014-01-08 13:50:33 UTC
The thing is: Foreman does not deploy openstack-selinux-0.1.3-2.el6ost.noarch on the neutron-controller / neutron-compute machines, as opposed to Packstack.

I checked a neutron machine deployed with Packstack: it has openstack-selinux installed and no AVCs in /var/log/messages.
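The comparison in this comment can be sketched as a small check run on each node — `check_node` and the sample file are illustrative names, not from the report; on a real host you would pass /var/log/messages:

```shell
# Does the node have the openstack-selinux package, and does its
# messages log contain AVC denials?
check_node() {
  # $1 = path to a syslog-style messages file
  if command -v rpm >/dev/null 2>&1 && rpm -q openstack-selinux >/dev/null 2>&1; then
    echo "openstack-selinux: installed"
  else
    echo "openstack-selinux: NOT installed"
  fi
  # grep -c counts matching lines; default to 0 if the file is missing.
  count=$(grep -c 'avc: *denied' "$1" 2>/dev/null)
  echo "AVC denials: ${count:-0}"
}

# Example run against a one-line sample taken from comment 3.
sample=$(mktemp)
echo 'Jan  8 15:11:31 puma02 kernel: type=1400 audit(1389186691.782:4): avc:  denied  { read } for  pid=25666 comm="ip" path="/proc/25501/status" scontext=unconfined_u:system_r:ifconfig_t:s0 tcontext=unconfined_u:system_r:initrc_t:s0 tclass=file' > "$sample"
result=$(check_node "$sample")
echo "$result"
rm -f "$sample"
```

On the Packstack-deployed node this would report the package as installed with zero denials; on the Foreman-deployed nodes described here it would report the opposite.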

Comment 5 Terry Wilson 2014-01-08 21:59:50 UTC
One thing to note: Fedora doesn't have an openstack-selinux package and also gets these errors.

Comment 6 Jason Guiditta 2014-01-15 17:01:44 UTC
Terry, is there a reason for this not being in the Fedora/RDO repo? And this implies that packstack would get AVCs on Fedora as well, correct? If yes to both, we may be able to at least partially solve this with a rebuild of the peel package for Fedora. Note, however, that openstack-foreman doesn't currently run on Fedora anyway, due to a combination of no testing and the fact that Foreman doesn't use SCLs on Fedora (which we currently depend on in the installer, though that can be fixed as time allows).

Comment 7 Terry Wilson 2014-01-15 17:06:23 UTC
Jason, I'm not sure why openstack-selinux doesn't exist in Fedora. And yes, packstack gets these errors as well.

Lon, do you know why fedora doesn't have an openstack-selinux package?

Comment 8 Lon Hohberger 2014-01-15 18:06:34 UTC
openstack-selinux only provides patches and bridges gaps on EL6 distributions which are updated far less frequently.

On Fedora, the selinux-policy package is very close to what is in the upstream selinux-policy git repository, so fixes belong there.

Comment 9 Lon Hohberger 2014-04-21 17:33:08 UTC
OK, so this simply needs to have openstack-selinux installed.

Comment 10 Martin Magr 2014-04-29 15:33:56 UTC
Package openstack-selinux is installed by Packstack on RHEL. So either we need to fix openstack-selinux, or Foreman should install the package in its manifests. Nothing can be done on the o-p-m side in this case.
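A minimal sketch of what such a manifest entry could look like — the resource placement is an assumption for illustration, not the actual Quickstack/Foreman code:

```puppet
# Hypothetical addition to the Foreman-side manifests: ensure the SELinux
# policy package is present on deployed Neutron nodes, as Packstack does.
package { 'openstack-selinux':
  ensure => installed,
}
```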

Comment 11 Omri Hochman 2014-06-25 19:04:56 UTC
Unable to reproduce.
It seems that openstack-foreman-installer-2.0.8-1.el6ost.noarch installs openstack-selinux-0.5.2-2.el7ost.noarch on the hosts, which solves this issue.

