Bug 1415167 - pam_acct_mgmt with pam_sss.so fails in unprivileged container unless selinux_provider = none is used
Summary: pam_acct_mgmt with pam_sss.so fails in unprivileged container unless selinux_...
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: Red Hat Enterprise Linux 7
Classification: Red Hat
Component: sssd
Version: 7.3
Hardware: Unspecified
OS: Unspecified
Priority: unspecified
Severity: unspecified
Target Milestone: rc
Target Release: ---
Assignee: Michal Zidek
QA Contact: Nikhil Dehadrai
URL:
Whiteboard:
Depends On:
Blocks: 1405326
 
Reported: 2017-01-20 12:52 UTC by Jan Pazdziora
Modified: 2020-05-02 18:36 UTC
CC List: 11 users

Fixed In Version: sssd-1.15.2-14.el7
Doc Type: If docs needed, set a value
Doc Text:
Clone Of:
Environment:
Last Closed: 2017-08-01 09:02:33 UTC
Target Upstream Version:
Embargoed:


Attachments


Links
- GitHub SSSD issue 4330 (closed): selinux_provider fails in a container if libsemanage is not available (last updated 2020-05-08 02:12:22 UTC)
- Red Hat Product Errata RHEA-2017:2294 (SHIPPED_LIVE): sssd bug fix and enhancement update (last updated 2017-08-01 12:39:55 UTC)

Description Jan Pazdziora 2017-01-20 12:52:51 UTC
Description of problem:

When pam_sss.so is used in an IPA-enrolled, unprivileged docker container to control access to services via HBAC, the pam_acct_mgmt call fails.
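
For illustration, a minimal sketch of the setup involved (the actual "webapp" PAM service file is not part of this report, so the stack below is an assumption): mod_intercept_form_submit authenticates against a PAM service whose account phase is handled by pam_sss.so, and it is that pam_acct_mgmt call which fails in the container.

# /etc/pam.d/webapp (hypothetical stack)
auth     required pam_sss.so
account  required pam_sss.so

# If pamtester is available in the container, the failing account phase can be
# exercised directly (the user name is illustrative):
pamtester webapp someuser acct_mgmt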

Version-Release number of selected component (if applicable):

On the host:

kernel-3.10.0-514.el7.x86_64
selinux-policy-3.13.1-102.el7.noarch

In the container:

libselinux-2.5-6.el7.x86_64
libselinux-utils-2.5-6.el7.x86_64
libselinux-python-2.5-6.el7.x86_64
sssd-1.14.0-43.el7_3.11.x86_64

How reproducible:

Deterministic.

Steps to Reproduce:
1. On RHEL machine, git clone https://pagure.io/webauthinfra.git ; cd webauthinfra
2. apply patch

diff --git a/src/Dockerfile.www b/src/Dockerfile.www
index 4d0d1d9..143e75c 100644
--- a/src/Dockerfile.www
+++ b/src/Dockerfile.www
@@ -1,5 +1,5 @@
-FROM fedora:24
-RUN dnf install -y /usr/sbin/ipa-client-install /usr/bin/ipsilon-client-install ipsilon-saml2 httpd mod_ssl mod_auth_gssapi mod_interc
+FROM rhel7
+RUN yum install --disablerepo='*' --enablerepo=rhel-7-server-rpms -y /usr/sbin/ipa-client-install /usr/bin/ipsilon-client-install ipsi
 COPY init-data ipa-client-enroll ipsilon-client-configure populate-data-volume www-setup-apache /usr/sbin/
 RUN chmod a+x /usr/sbin/init-data /usr/sbin/ipa-client-enroll /usr/sbin/ipsilon-client-configure /usr/sbin/populate-data-volume /usr/s
 COPY ipa-client-enroll.service ipsilon-client-configure.service populate-data-volume.service www-setup-apache.service /usr/lib/systemd
diff --git a/src/www-mod_wsgi-gssapi.conf b/src/www-mod_wsgi-gssapi.conf
index 77cf2cc..e3f586d 100644
--- a/src/www-mod_wsgi-gssapi.conf
+++ b/src/www-mod_wsgi-gssapi.conf
@@ -43,7 +43,7 @@ LoadModule lookup_identity_module modules/mod_lookup_identity.so
   InterceptFormPAMService webapp
   InterceptFormLogin username
   InterceptFormPassword password
-  InterceptGETOnSuccess on
+  # InterceptGETOnSuccess on
 
   LookupOutput env
   LookupUserAttr mail REMOTE_USER_EMAIL " "
diff --git a/src/www-proxy-gssapi.conf b/src/www-proxy-gssapi.conf
index efea3ce..f9f61e6 100644
--- a/src/www-proxy-gssapi.conf
+++ b/src/www-proxy-gssapi.conf
@@ -31,7 +31,7 @@ LoadModule lookup_identity_module modules/mod_lookup_identity.so
   InterceptFormPAMService webapp
   InterceptFormLogin username
   InterceptFormPassword password
-  InterceptGETOnSuccess on
+  # InterceptGETOnSuccess on
 
   LookupOutput headers
   LookupUserAttr mail X-REMOTE-USER-EMAIL " "

3. Enroll the RHEL host.
4. docker pull freeipa/freeipa-server:fedora-24 ; docker tag freeipa/freeipa-server:fedora-24 freeipa-server
5. Install docker-compose, for example via

curl -L https://github.com/docker/compose/releases/download/1.10.0/docker-compose-`uname -s`-`uname -m` > /usr/local/bin/docker-compose
chmod +x /usr/local/bin/docker-compose

6. docker-compose build
7. docker-compose up
8. Wait until the output shows

 client_1  | Usage:
client_1  |   ssh -X -i client-data/id_rsa -p 55022 developer@localhost firefox -no-remote
client_1  |   To kinit, in the browser started with ^^^ visit http://localhost/
client_1  |   or execute
client_1  |   cat ipa-data/admin-password | ssh -i client-data/id_rsa -p 55022 developer@localhost kinit admin

9. cat ipa-data/admin-password | docker exec -i webauthinfra_client_1 kinit admin
10. docker exec -ti webauthinfra_client_1 curl -si --negotiate -u : https://www.example.test/login/

Actual results:

HTTP/1.1 401 Unauthorized
Date: Fri, 20 Jan 2017 12:47:20 GMT
Server: Apache/2.4.6 (Red Hat Enterprise Linux) OpenSSL/1.0.1e-fips mod_auth_gssapi/1.4.0 mod_wsgi/3.4 Python/2.7.5
WWW-Authenticate: Negotiate
Content-Length: 123
Content-Type: text/html; charset=iso-8859-1

HTTP/1.1 401 Unauthorized
Date: Fri, 20 Jan 2017 12:47:20 GMT
Server: Apache/2.4.6 (Red Hat Enterprise Linux) OpenSSL/1.0.1e-fips mod_auth_gssapi/1.4.0 mod_wsgi/3.4 Python/2.7.5
WWW-Authenticate: Negotiate oYG3MIG0oAMKAQChCwYJKoZIhvcSAQICooGfBIGcYIGZBgkqhkiG9xIBAgICAG+BiTCBhqADAgEFoQMCAQ+iejB4oAMCARKicQRvkXo3+6SrWGyKnWk5shxakGTSeb42vQQ+XIvIUeUGGBkwfkLVUE5ko4ui5zi4Uigubo7EeH/+TqSYbuut92ijBoAuTxJNBjytX3e6PgItoF1wrwfLaFmxCD037BbG2zgUyeqWyQNgpI07zLR9SPpE
Content-Length: 123
Content-Type: text/html; charset=iso-8859-1

<html><meta http-equiv="refresh" content="0; URL=/login/?noext=1"><body>Kerberos authentication did not pass.</body></html>

When debug_level is set to 6 in /etc/sssd/sssd.conf in the webauthinfra_www_1 container and sssd is restarted, the sssd logs show

==> /var/log/sssd/selinux_child.log <==
(Fri Jan 20 12:49:50 2017) [[sssd[selinux_child[1201]]]] [main] (0x0400): selinux_child started.
(Fri Jan 20 12:49:50 2017) [[sssd[selinux_child[1201]]]] [main] (0x0400): context initialized
(Fri Jan 20 12:49:50 2017) [[sssd[selinux_child[1201]]]] [main] (0x0400): performing selinux operations
(Fri Jan 20 12:49:50 2017) [[sssd[selinux_child[1201]]]] [sss_semanage_init] (0x0020): SELinux policy not managed
(Fri Jan 20 12:49:50 2017) [[sssd[selinux_child[1201]]]] [get_seuser] (0x0020): Cannot create SELinux handle
(Fri Jan 20 12:49:50 2017) [[sssd[selinux_child[1201]]]] [sss_semanage_init] (0x0020): SELinux policy not managed
(Fri Jan 20 12:49:50 2017) [[sssd[selinux_child[1201]]]] [set_seuser] (0x0020): Cannot init SELinux management
(Fri Jan 20 12:49:50 2017) [[sssd[selinux_child[1201]]]] [main] (0x0020): Cannot set SELinux login context.
(Fri Jan 20 12:49:50 2017) [[sssd[selinux_child[1201]]]] [main] (0x0020): selinux_child failed!

==> /var/log/sssd/sssd_example.test.log <==
(Fri Jan 20 12:49:50 2017) [sssd[be[example.test]]] [read_pipe_handler] (0x0400): EOF received, client finished
(Fri Jan 20 12:49:50 2017) [sssd[be[example.test]]] [selinux_child_done] (0x0020): selinux_child_parse_response failed: [22][Invalid argument]
(Fri Jan 20 12:49:50 2017) [sssd[be[example.test]]] [dp_req_done] (0x0400): DP Request [PAM SELinux #3]: Request handler finished [0]: Success
(Fri Jan 20 12:49:50 2017) [sssd[be[example.test]]] [_dp_req_recv] (0x0400): DP Request [PAM SELinux #3]: Receiving request data.
(Fri Jan 20 12:49:50 2017) [sssd[be[example.test]]] [dp_req_destructor] (0x0400): DP Request [PAM SELinux #3]: Request removed.
(Fri Jan 20 12:49:50 2017) [sssd[be[example.test]]] [dp_req_destructor] (0x0400): Number of active DP request: 0
(Fri Jan 20 12:49:50 2017) [sssd[be[example.test]]] [child_sig_handler] (0x0020): child [1201] failed with status [1].

==> /var/log/sssd/sssd_pam.log <==
(Fri Jan 20 12:49:50 2017) [sssd[pam]] [pam_dp_process_reply] (0x0200): received: [4 (System error)][example.test]
(Fri Jan 20 12:49:50 2017) [sssd[pam]] [pam_reply] (0x0200): pam_reply called with result [4]: System error.
(Fri Jan 20 12:49:50 2017) [sssd[pam]] [pam_reply] (0x0200): blen: 29
(Fri Jan 20 12:49:50 2017) [sssd[pam]] [client_recv] (0x0200): Client disconnected!

Expected results:

HTTP/1.1 401 Unauthorized
Date: Fri, 20 Jan 2017 12:51:07 GMT
Server: Apache/2.4.6 (Red Hat Enterprise Linux) OpenSSL/1.0.1e-fips mod_auth_gssapi/1.4.0 mod_wsgi/3.4 Python/2.7.5
WWW-Authenticate: Negotiate
Content-Length: 123
Content-Type: text/html; charset=iso-8859-1

HTTP/1.1 302 Found
Date: Fri, 20 Jan 2017 12:51:08 GMT
Server: WSGIServer/0.1 Python/2.7.12
WWW-Authenticate: Negotiate oYG3MIG0oAMKAQChCwYJKoZIhvcSAQICooGfBIGcYIGZBgkqhkiG9xIBAgICAG+BiTCBhqADAgEFoQMCAQ+iejB4oAMCARKicQRvb+C80tVteOSSJCA9Ao8jCCvFAqe6Wa0uqey7u90j8Iz+V/Jx5ubMVypvP9SvIpT/DPya0Jhngo06JH+ND5RwkBSpEYHlm3jZZo/lJYKKo/qJrZlzvH9T5ZQGOykR9c4axUHxD2X+Vcmvrl6xXKd7
Vary: Cookie
X-Frame-Options: SAMEORIGIN
Content-Type: text/html; charset=utf-8
Location: /
Set-Cookie: csrftoken=T6M3M78mg0AYVi6qGg8IvCx8jln3SOt9BmVhox2wvGA3i34X13jre5pa6JCW7Mpr; expires=Fri, 19-Jan-2018 12:51:08 GMT; Max-Age=31449600; Path=/
Set-Cookie: sessionid=nusfx73ibstzjjtzqod1lwy1a949lc9t; expires=Fri, 03-Feb-2017 12:51:08 GMT; httponly; Max-Age=1209600; Path=/
Transfer-Encoding: chunked

Additional info:

The expected output can be achieved by setting selinux_provider = none in the [domain/*] section of /etc/sssd/sssd.conf in the webauthinfra_www_1 container.
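
For reference, a minimal sketch of that workaround (the domain section name is taken from the logs above; the rest of the section is omitted), followed by an sssd restart inside the container:

# /etc/sssd/sssd.conf in the webauthinfra_www_1 container (excerpt)
[domain/example.test]
selinux_provider = none

# assuming systemd runs in the container, as in the compose setup above:
docker exec -ti webauthinfra_www_1 systemctl restart sssd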

Comment 1 Jan Pazdziora 2017-01-20 12:53:52 UTC
Things fail even when webauthinfra is not patched and the full Fedora 24 stack is run, or when the www container is Fedora 25 on a Fedora 25 host.

Comment 4 Jakub Hrozek 2017-02-03 09:28:04 UTC
Upstream ticket:
https://fedorahosted.org/sssd/ticket/3297

Comment 5 Lukas Slebodnik 2017-02-03 12:24:38 UTC
The title is misleading. It is not possible to install the sssd-ipa package without libsemanage:

sh# rpm -q --requires sssd-ipa | grep semanag
libsemanage.so.1()(64bit)
libsemanage.so.1(LIBSEMANAGE_1.0)(64bit)
libsss_semanage.so()(64bit)

Comment 6 Michal Zidek 2017-02-03 12:39:08 UTC
Yes, sorry, I named the problem incorrectly at the meeting. The bug is reproducible if the policy is not managed with libsemanage or if there is an error communicating with libsemanage. libsemanage will be present on the system in both cases because it is a requirement of the SSSD package.

The case when the policy is not managed with libsemanage is not handled gracefully ATM.

Comment 7 Jan Pazdziora 2017-02-11 10:10:09 UTC
Note to self: workaround for webauthinfra pushed as 72448fd1afddcd5a90cb1c1c052360d67ac57e26.

Comment 8 Lukas Slebodnik 2017-04-06 12:03:43 UTC
master:
* 78a08d30b5fbf6e1e3b589e0cf67022e0c1faa33

sssd-1-14:
* 31e4bc07ea17e3e91df28260f6a517b9774b948e

sssd-1-13:
* 963acdfb8b40aca449cf61f85949b4d7bc5ee133

Comment 35 Nikhil Dehadrai 2017-06-22 18:07:00 UTC
IPA-server: ipa-server-4.5.0-19.el7.x86_64
IPA-client: ipa-client-4.5.0-19.el7.x86_64
sssd: sssd-1.15.2-50.el7.x86_64

Verified the bug with the following observations:
1) On a non-atomic host (in my case RHEL 7.4), set up a container with its own Django app, ipa-client, and web server using the RHEL 7.3 image.
2) With the RHEL 7.3.z packages, verified that after running kinit and issuing the curl command, the user is not authorized to access the login page.
3) After updating the image to the RHEL 7.4 packages, verified that after running kinit and issuing the curl command, the user is authorized to access the login page.

Refer to the console output and logs below:
#docker run -d -h webauth.testrelm.test --tmpfs /tmp --tmpfs /run -v /sys/fs/cgroup:/sys/fs/cgroup:ro --name ipatest1 webauth

#uncomment module inside /etc/httpd/conf.modules.d/55-authnz_pam.conf
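
For example, a sketch of that step (assuming the stock file ships the LoadModule line commented out, as the note above implies):

sed -i 's/^#[[:space:]]*LoadModule/LoadModule/' /etc/httpd/conf.modules.d/55-authnz_pam.conf
systemctl restart httpd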

#rpm -qa | grep sssd
sssd-common-1.14.0-43.el7_3.14.x86_64
sssd-common-pac-1.14.0-43.el7_3.14.x86_64
sssd-ad-1.14.0-43.el7_3.14.x86_64
sssd-ldap-1.14.0-43.el7_3.14.x86_64
sssd-1.14.0-43.el7_3.14.x86_64
sssd-krb5-common-1.14.0-43.el7_3.14.x86_64
sssd-ipa-1.14.0-43.el7_3.14.x86_64
sssd-krb5-1.14.0-43.el7_3.14.x86_64
sssd-proxy-1.14.0-43.el7_3.14.x86_64
python-sssdconfig-1.14.0-43.el7_3.14.noarch
sssd-client-1.14.0-43.el7_3.14.x86_64
python2-ipalib-4.4.0-14.el7_3.7.noarch
ipa-client-4.4.0-14.el7_3.7.x86_64
python-iniparse-0.4-9.el7.noarch
libipa_hbac-1.14.0-43.el7_3.14.x86_64
ipa-client-common-4.4.0-14.el7_3.7.noarch
python-libipa_hbac-1.14.0-43.el7_3.14.x86_64
python-ipaddress-1.0.16-2.el7.noarch
python2-ipaclient-4.4.0-14.el7_3.7.noarch
sssd-ipa-1.14.0-43.el7_3.14.x86_64
ipa-common-4.4.0-14.el7_3.7.noarch

#[root@webauth yum.repos.d]# ps -ef
UID         PID   PPID  C STIME TTY          TIME CMD
root          1      0  0 08:14 ?        00:00:00 /usr/sbin/init
root         15      1  0 08:14 ?        00:00:00 /usr/lib/systemd/systemd-journald
dbus         30      1  0 08:14 ?        00:00:00 /bin/dbus-daemon --system --address=systemd: --nofork --nopidfile --systemd-activation
root        606      1  0 08:41 ?        00:00:00 /usr/sbin/sssd -D -f
root        607    606  0 08:41 ?        00:00:00 /usr/libexec/sssd/sssd_be --domain testrelm.test --uid 0 --gid 0 --debug-to-files
root        608    606  0 08:41 ?        00:00:00 /usr/libexec/sssd/sssd_nss --uid 0 --gid 0 --debug-to-files
root        609    606  0 08:41 ?        00:00:00 /usr/libexec/sssd/sssd_sudo --uid 0 --gid 0 --debug-to-files
root        610    606  0 08:41 ?        00:00:00 /usr/libexec/sssd/sssd_pam --uid 0 --gid 0 --debug-to-files
root        611    606  0 08:41 ?        00:00:00 /usr/libexec/sssd/sssd_pac --uid 0 --gid 0 --debug-to-files
root        664      1  0 08:45 ?        00:00:00 /usr/sbin/httpd -DFOREGROUND
apache      665    664  0 08:45 ?        00:00:00 /usr/sbin/httpd -DFOREGROUND
apache      666    664  0 08:45 ?        00:00:00 /usr/sbin/httpd -DFOREGROUND
apache      667    664  0 08:45 ?        00:00:00 /usr/sbin/httpd -DFOREGROUND
apache      668    664  0 08:45 ?        00:00:00 /usr/sbin/httpd -DFOREGROUND
apache      669    664  0 08:45 ?        00:00:00 /usr/sbin/httpd -DFOREGROUND
apache      675    664  0 08:46 ?        00:00:00 /usr/sbin/httpd -DFOREGROUND
root        691      0  0 10:39 ?        00:00:00 bash
root        720      0  0 12:08 ?        00:00:00 bash
root        735    720  0 12:08 ?        00:00:00 tail /var/log/sssd/sssd_pam.log -f
root        748    691  0 12:18 ?        00:00:00 ps -ef

#Enable debug_level for sssd
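
A sketch of that step (the exact level and sections used during this verification are not recorded here; the original report used debug_level = 6):

# /etc/sssd/sssd.conf (excerpt)
[domain/testrelm.test]
debug_level = 6

[pam]
debug_level = 6

systemctl restart sssd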


#Run ipa-client-install in container (rhel 73z). 
ipa-client-install


#On rhel74 ipa server create http principal

[root@auto-hv-01-guest06 ~]# ipa service-find --principal http/webauth.testrelm.test
-----------------
1 service matched
-----------------
  Principal name: http/webauth.testrelm.test
  Principal alias: http/webauth.testrelm.test
  Keytab: True
----------------------------
Number of entries returned 1
----------------------------

On the container, retrieve the keytab for the principal and store it in /etc/http.keytab:
$ ipa-getkeytab -s auto-hv-01-guest06.testrelm.test --principal http/webauth.testrelm.test -k /etc/http.keytab

start the httpd daemon 

kinit foobar1

[root@webauth conf.modules.d]# curl  -L -v --negotiate -u : http://webauth.testrelm.test/login/
* About to connect() to webauth.testrelm.test port 80 (#0)
*   Trying 172.17.0.3...
* Connected to webauth.testrelm.test (172.17.0.3) port 80 (#0)
> GET /login/ HTTP/1.1
> User-Agent: curl/7.29.0
> Host: webauth.testrelm.test
> Accept: */*
>
< HTTP/1.1 401 Unauthorized
< Date: Thu, 22 Jun 2017 12:08:38 GMT
< Server: Apache/2.4.6 (Red Hat Enterprise Linux) mod_auth_gssapi/1.4.0 mod_auth_kerb/5.4 mod_wsgi/3.4 Python/2.7.5
< WWW-Authenticate: Negotiate
< Content-Length: 381
< Content-Type: text/html; charset=iso-8859-1
<
* Ignoring the response-body
* Connection #0 to host webauth.testrelm.test left intact
* Issue another request to this URL: 'http://webauth.testrelm.test/login/'
* Found bundle for host webauth.testrelm.test: 0x9eae80
* Re-using existing connection! (#0) with host webauth.testrelm.test
* Connected to webauth.testrelm.test (172.17.0.3) port 80 (#0)
* Server auth using GSS-Negotiate with user ''
> GET /login/ HTTP/1.1
> Authorization: Negotiate YIICZQYJKoZIhvcSAQICAQBuggJUMIICUKADAgEFoQMCAQ6iBwMFACAAAACjggFkYYIBYDCCAVygAwIBBaEPGw1URVNUUkVMTS5URVNUoigwJqADAgEDoR8wHRsESFRUUBsVd2ViYXV0aC50ZXN0cmVsbS50ZXN0o4IBGDCCARSgAwIBEqEDAgECooIBBgSCAQKW67pO02w0zYlpgkgTIw+9fKiplDbACLeSTcmUFfcTC4DNhOLTVrartOK24uu1HalIhGey1nAYlcmpsNN7mhB8LF5OsQyL5oODWmuNwp/ymH0kCV+WUC5ORWewUyFyxmCUhQNY1gmrarCnqh2lWR8Ggf3k8i0mkMgj/lTJ2do07zodw9LdXHjX3WZVpMIswNo1Jal/Ktp0qsTBQvBSMPemNg7p6SCwzVnJ7tIRtSTAnxwwaZuZvXki97ebq625RHDiGVa2OGAtlJNgBTKLWC65+ttWObFawsrvOuyWncVIHQM+O+e4xrpA966Hy1tIiHY8e0/UyVVmkNd3iWHoVYcLojukgdIwgc+gAwIBEqKBxwSBxG3M8cJkTwORpzo1972Su9LOsomrvTrRXG3+L64ecWnPvUbTC6SYeCJQPn1fYdZgmoo9t9BQ4K7Pwc+xFDUSW7kYV60hx9aw5o8M1d85OpisOGSGoKwUNo4rcpEf4eisnyh/rwsfNQ7h1mMrDt/hdeYUDF8p+t6J0iTdcpl0RFlmO1BGVWr38+L8ZFm42j7KBrReLfDGnHFlmUKQIiMRULfmZxWPJIk1uGpl5xBqFR3XbnLSG8JlDbLKp8RddpsCLojaW5E=
> User-Agent: curl/7.29.0
> Host: webauth.testrelm.test
> Accept: */*
>
< HTTP/1.1 401 Unauthorized
< Date: Thu, 22 Jun 2017 12:08:38 GMT
< Server: Apache/2.4.6 (Red Hat Enterprise Linux) mod_auth_gssapi/1.4.0 mod_auth_kerb/5.4 mod_wsgi/3.4 Python/2.7.5
* Authentication problem. Ignoring this.
< WWW-Authenticate: Negotiate YIGZBgkqhkiG9xIBAgICAG+BiTCBhqADAgEFoQMCAQ+iejB4oAMCARKicQRvtTNtAMWO0Gte4gQGARwhSa++PFelap4Cg1dHTwM8d1H9XCwL8zxnFxkZUrpykfT+f1LjocgQxKMDB3A/hG0SHZnZUnbARct9U09pplCkGNRwfil21tVd8g6NECr4HFmGhNHVp2eKi/Xj7n0H4V1l
< Content-Length: 381
< Content-Type: text/html; charset=iso-8859-1
<
<!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
<html><head>
<title>401 Unauthorized</title>
</head><body>
<h1>Unauthorized</h1>
<p>This server could not verify that you
are authorized to access the document
requested.  Either you supplied the wrong
credentials (e.g., bad password), or your
browser doesn't understand how to supply
the credentials required.</p>
</body></html>
* Connection #0 to host webauth.testrelm.test left intact

selinux_child.log
(Thu Jun 22 08:46:12 2017) [[sssd[selinux_child[676]]]] [main] (0x0400): selinux_child started.
(Thu Jun 22 08:46:12 2017) [[sssd[selinux_child[676]]]] [main] (0x0400): context initialized
(Thu Jun 22 08:46:12 2017) [[sssd[selinux_child[676]]]] [main] (0x0400): performing selinux operations
(Thu Jun 22 08:46:12 2017) [[sssd[selinux_child[676]]]] [sss_semanage_init] (0x0020): SELinux policy not managed
(Thu Jun 22 08:46:12 2017) [[sssd[selinux_child[676]]]] [get_seuser] (0x0020): Cannot create SELinux handle
(Thu Jun 22 08:46:12 2017) [[sssd[selinux_child[676]]]] [sss_semanage_init] (0x0020): SELinux policy not managed
(Thu Jun 22 08:46:12 2017) [[sssd[selinux_child[676]]]] [set_seuser] (0x0020): Cannot init SELinux management
(Thu Jun 22 08:46:12 2017) [[sssd[selinux_child[676]]]] [main] (0x0020): Cannot set SELinux login context.
(Thu Jun 22 08:46:12 2017) [[sssd[selinux_child[676]]]] [main] (0x0020): selinux_child failed!

pam_sssd.log
(Thu Jun 22 12:08:38 2017) [sssd[pam]] [pam_print_data] (0x0100): cli_pid: 666
(Thu Jun 22 12:08:38 2017) [sssd[pam]] [pam_print_data] (0x0100): logon name: foobar1
(Thu Jun 22 12:08:38 2017) [sssd[pam]] [sss_dp_issue_request] (0x0400): Issuing request for [0x56522d81cc60:3:foobar1@testrelm.test]
(Thu Jun 22 12:08:38 2017) [sssd[pam]] [sss_dp_get_account_msg] (0x0400): Creating request for [testrelm.test][0x3][BE_REQ_INITGROUPS][1][name=foobar1:-]
(Thu Jun 22 12:08:38 2017) [sssd[pam]] [sss_dp_internal_get_send] (0x0400): Entering request [0x56522d81cc60:3:foobar1@testrelm.test]
(Thu Jun 22 12:08:39 2017) [sssd[pam]] [pam_check_user_search] (0x0100): Requesting info for [foobar1]
(Thu Jun 22 12:08:39 2017) [sssd[pam]] [pam_check_user_search] (0x0400): Returning info for user [foobar1@testrelm.test]
(Thu Jun 22 12:08:39 2017) [sssd[pam]] [pd_set_primary_name] (0x0400): User's primary name is foobar1
(Thu Jun 22 12:08:39 2017) [sssd[pam]] [pam_dp_send_req] (0x0100): Sending request with the following data:
(Thu Jun 22 12:08:39 2017) [sssd[pam]] [pam_print_data] (0x0100): command: SSS_PAM_ACCT_MGMT
(Thu Jun 22 12:08:39 2017) [sssd[pam]] [pam_print_data] (0x0100): domain: testrelm.test
(Thu Jun 22 12:08:39 2017) [sssd[pam]] [pam_print_data] (0x0100): user: foobar1
(Thu Jun 22 12:08:39 2017) [sssd[pam]] [pam_print_data] (0x0100): service: fin-app-prod
(Thu Jun 22 12:08:39 2017) [sssd[pam]] [pam_print_data] (0x0100): tty: not set
(Thu Jun 22 12:08:39 2017) [sssd[pam]] [pam_print_data] (0x0100): ruser: not set
(Thu Jun 22 12:08:39 2017) [sssd[pam]] [pam_print_data] (0x0100): rhost: 172.17.0.3
(Thu Jun 22 12:08:39 2017) [sssd[pam]] [pam_print_data] (0x0100): authtok type: 0
(Thu Jun 22 12:08:39 2017) [sssd[pam]] [pam_print_data] (0x0100): newauthtok type: 0
(Thu Jun 22 12:08:39 2017) [sssd[pam]] [pam_print_data] (0x0100): priv: 0
(Thu Jun 22 12:08:39 2017) [sssd[pam]] [pam_print_data] (0x0100): cli_pid: 666
(Thu Jun 22 12:08:39 2017) [sssd[pam]] [pam_print_data] (0x0100): logon name: foobar1
(Thu Jun 22 12:08:39 2017) [sssd[pam]] [pam_dom_forwarder] (0x0100): pam_dp_send_req returned 0
(Thu Jun 22 12:08:39 2017) [sssd[pam]] [sss_dp_req_destructor] (0x0400): Deleting request: [0x56522d81cc60:3:foobar1@testrelm.test]
(Thu Jun 22 12:08:41 2017) [sssd[pam]] [pam_dp_process_reply] (0x0200): received: [4 (System error)][testrelm.test]
(Thu Jun 22 12:08:41 2017) [sssd[pam]] [pam_reply] (0x0200): pam_reply called with result [4]: System error.
(Thu Jun 22 12:08:41 2017) [sssd[pam]] [pam_reply] (0x0200): blen: 30
(Thu Jun 22 12:08:41 2017) [sssd[pam]] [client_recv] (0x0200): Client disconnected!


# Use the latest RHEL 7.4 repo inside the container, update the ipa-client/sssd packages, and re-run the curl command

[root@webauth yum.repos.d]# curl  -L -v --negotiate -u : http://webauth.testrelm.test/login/
* About to connect() to webauth.testrelm.test port 80 (#0)
*   Trying 172.17.0.3...
* Connected to webauth.testrelm.test (172.17.0.3) port 80 (#0)

pam_sssd.log
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [pam_print_data] (0x0100): logon name: foobar1
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [cache_req_send] (0x0400): CR #1: New request 'Initgroups by name'
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [cache_req_process_input] (0x0400): CR #1: Parsing input name [foobar1]
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [sss_parse_name_for_domains] (0x0200): name 'foobar1' matched expression for domain 'testrelm.test', user is foobar1
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [cache_req_set_name] (0x0400): CR #1: Setting name [foobar1]
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [cache_req_select_domains] (0x0400): CR #1: Performing a single domain search
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [cache_req_search_domains] (0x0400): CR #1: Search will check the cache and check the data provider
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [cache_req_set_domain] (0x0400): CR #1: Using domain [testrelm.test]
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [cache_req_prepare_domain_data] (0x0400): CR #1: Preparing input data for domain [testrelm.test] rules
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [cache_req_search_send] (0x0400): CR #1: Looking up foobar1
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [cache_req_search_ncache] (0x0400): CR #1: Checking negative cache for [foobar1]
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [cache_req_search_ncache] (0x0400): CR #1: [foobar1] is not present in negative cache
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [cache_req_search_cache] (0x0400): CR #1: Looking up [foobar1] in cache
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [cache_req_search_send] (0x0400): CR #1: Returning [foobar1] from cache
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [cache_req_search_ncache_filter] (0x0400): CR #1: This request type does not support filtering result by negative cache
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [cache_req_create_and_add_result] (0x0400): CR #1: Found 2 entries in domain testrelm.test
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [cache_req_done] (0x0400): CR #1: Finished: Success
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [pd_set_primary_name] (0x0400): User's primary name is foobar1
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [pam_dp_send_req] (0x0100): Sending request with the following data:
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [pam_print_data] (0x0100): command: SSS_PAM_ACCT_MGMT
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [pam_print_data] (0x0100): domain: testrelm.test
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [pam_print_data] (0x0100): user: foobar1
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [pam_print_data] (0x0100): service: fin-app-prod
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [pam_print_data] (0x0100): tty: not set
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [pam_print_data] (0x0100): ruser: not set
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [pam_print_data] (0x0100): rhost: 172.17.0.3
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [pam_print_data] (0x0100): authtok type: 0
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [pam_print_data] (0x0100): newauthtok type: 0
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [pam_print_data] (0x0100): priv: 0
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [pam_print_data] (0x0100): cli_pid: 668
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [pam_print_data] (0x0100): logon name: foobar1
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [pam_dom_forwarder] (0x0100): pam_dp_send_req returned 0
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [pam_dp_process_reply] (0x0200): received: [0 (Success)][testrelm.test]
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [pam_reply] (0x0200): pam_reply called with result [0]: Success.
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [filter_responses] (0x0100): [pam_response_filter] not available, not fatal.
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [pam_reply] (0x0200): blen: 30
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [client_recv] (0x0200): Client disconnected!

selinux_child.log
(Thu Jun 22 12:28:18 2017) [[sssd[selinux_child[872]]]] [sss_semanage_init] (0x0400): SELinux policy not managed via libsemanage
(Thu Jun 22 12:28:18 2017) [[sssd[selinux_child[872]]]] [pack_buffer] (0x0400): result [0]
(Thu Jun 22 12:28:18 2017) [[sssd[selinux_child[872]]]] [main] (0x0400): selinux_child completed successfully
(Thu Jun 22 12:28:18 2017) [[sssd[selinux_child[873]]]] [main] (0x0400): selinux_child started.
(Thu Jun 22 12:28:18 2017) [[sssd[selinux_child[873]]]] [main] (0x0400): context initialized
(Thu Jun 22 12:28:18 2017) [[sssd[selinux_child[873]]]] [main] (0x0400): performing selinux operations
(Thu Jun 22 12:28:18 2017) [[sssd[selinux_child[873]]]] [sss_semanage_init] (0x0400): SELinux policy not managed via libsemanage
(Thu Jun 22 12:28:18 2017) [[sssd[selinux_child[873]]]] [pack_buffer] (0x0400): result [0]
(Thu Jun 22 12:28:18 2017) [[sssd[selinux_child[873]]]] [main] (0x0400): selinux_child completed successfully

Thus, on the basis of the above observations, marking the status of the bug as "VERIFIED".

Comment 36 errata-xmlrpc 2017-08-01 09:02:33 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory, and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHEA-2017:2294

