Bug 1415167
| Summary: | pam_acct_mgmt with pam_sss.so fails in unprivileged container unless selinux_provider = none is used | | |
|---|---|---|---|
| Product: | Red Hat Enterprise Linux 7 | Reporter: | Jan Pazdziora <jpazdziora> |
| Component: | sssd | Assignee: | Michal Zidek <mzidek> |
| Status: | CLOSED ERRATA | QA Contact: | Nikhil Dehadrai <ndehadra> |
| Severity: | unspecified | Docs Contact: | |
| Priority: | unspecified | | |
| Version: | 7.3 | CC: | grajaiya, jhrozek, jpazdziora, ksiddiqu, lslebodn, mkosek, mzidek, nsoman, pbrezina, sgoveas, tomek |
| Target Milestone: | rc | | |
| Target Release: | --- | | |
| Hardware: | Unspecified | | |
| OS: | Unspecified | | |
| Whiteboard: | | | |
| Fixed In Version: | sssd-1.15.2-14.el7 | Doc Type: | If docs needed, set a value |
| Doc Text: | | Story Points: | --- |
| Clone Of: | | Environment: | |
| Last Closed: | 2017-08-01 09:02:33 UTC | Type: | Bug |
| Regression: | --- | Mount Type: | --- |
| Documentation: | --- | CRM: | |
| Verified Versions: | | Category: | --- |
| oVirt Team: | --- | RHEL 7.3 requirements from Atomic Host: | |
| Cloudforms Team: | --- | Target Upstream Version: | |
| Embargoed: | | | |
| Bug Depends On: | | | |
| Bug Blocks: | 1405326 | | |
Description
Jan Pazdziora, 2017-01-20 12:52:51 UTC
Things fail even when the webauthinfra is not patched and the full Fedora 24 stack is run, or when the www container is Fedora 25, on a Fedora 25 host.

Upstream ticket: https://fedorahosted.org/sssd/ticket/3297

The title is misleading. It is not possible to install the sssd-ipa package without libsemanage:

```
sh# rpm -q --requires sssd-ipa | grep semanag
libsemanage.so.1()(64bit)
libsemanage.so.1(LIBSEMANAGE_1.0)(64bit)
libsss_semanage.so()(64bit)
```

Yes, sorry, I named the problem wrongly at the meeting. The bug is reproducible if the policy is not managed with libsemanage, or if there is an error communicating with libsemanage. libsemanage will be present on the system in both cases because it is a requirement of the SSSD package. The case when the policy is not managed with libsemanage is not handled gracefully at the moment.

Note to self: workaround for webauthinfra pushed as 72448fd1afddcd5a90cb1c1c052360d67ac57e26.

* master: 78a08d30b5fbf6e1e3b589e0cf67022e0c1faa33
* sssd-1-14: 31e4bc07ea17e3e91df28260f6a517b9774b948e
* sssd-1-13: 963acdfb8b40aca449cf61f85949b4d7bc5ee133

Verified with:
* IPA server: ipa-server-4.5.0-19.el7.x86_64
* IPA client: ipa-client-4.5.0-19.el7.x86_64
* sssd: sssd-1.15.2-50.el7.x86_64

Verified the bug with the following observations:
1) On a non-atomic host (in my case RHEL 7.4), set up a container with its own Django app, ipa-client, and a web server, using a RHEL 7.3 image.
2) On RHEL 7.3.z, verified that after running the kinit command and initiating the curl command, the user is not authorized to access the login page.
3) After updating the image to RHEL 7.4, verified that after running the kinit command and initiating the curl command, the user is authorized to access the login page.
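The workaround named in the bug summary is to turn off SSSD's SELinux user-mapping step for the affected domain. A minimal sssd.conf sketch (the domain name is taken from this report's test realm; adjust for your own deployment):

```
# /etc/sssd/sssd.conf inside the unprivileged container
[domain/testrelm.test]
# Skip the libsemanage-backed selinux_child step, which fails when the
# SELinux policy store is not managed, e.g. in unprivileged containers.
selinux_provider = none
```

With this set, pam_acct_mgmt no longer depends on selinux_child succeeding, at the cost of not setting an SELinux login context for the user.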
Refer to the console output and logs below:

```
# docker run -d -h webauth.testrelm.test --tmpfs /tmp --tmpfs /run -v /sys/fs/cgroup:/sys/fs/cgroup:ro --name ipatest1 webauth
```

Uncomment the module inside /etc/httpd/conf.modules.d/55-authnz_pam.conf.

```
# rpm -qa | grep sssd
sssd-common-1.14.0-43.el7_3.14.x86_64
sssd-common-pac-1.14.0-43.el7_3.14.x86_64
sssd-ad-1.14.0-43.el7_3.14.x86_64
sssd-ldap-1.14.0-43.el7_3.14.x86_64
sssd-1.14.0-43.el7_3.14.x86_64
sssd-krb5-common-1.14.0-43.el7_3.14.x86_64
sssd-ipa-1.14.0-43.el7_3.14.x86_64
sssd-krb5-1.14.0-43.el7_3.14.x86_64
sssd-proxy-1.14.0-43.el7_3.14.x86_64
python-sssdconfig-1.14.0-43.el7_3.14.noarch
sssd-client-1.14.0-43.el7_3.14.x86_64
python2-ipalib-4.4.0-14.el7_3.7.noarch
ipa-client-4.4.0-14.el7_3.7.x86_64
python-iniparse-0.4-9.el7.noarch
libipa_hbac-1.14.0-43.el7_3.14.x86_64
ipa-client-common-4.4.0-14.el7_3.7.noarch
python-libipa_hbac-1.14.0-43.el7_3.14.x86_64
python-ipaddress-1.0.16-2.el7.noarch
python2-ipaclient-4.4.0-14.el7_3.7.noarch
sssd-ipa-1.14.0-43.el7_3.14.x86_64
ipa-common-4.4.0-14.el7_3.7.noarch
```

```
[root@webauth yum.repos.d]# ps -ef
UID        PID  PPID  C STIME TTY     TIME     CMD
root         1     0  0 08:14 ?       00:00:00 /usr/sbin/init
root        15     1  0 08:14 ?       00:00:00 /usr/lib/systemd/systemd-journald
dbus        30     1  0 08:14 ?       00:00:00 /bin/dbus-daemon --system --address=systemd: --nofork --nopidfile --systemd-activation
root       606     1  0 08:41 ?       00:00:00 /usr/sbin/sssd -D -f
root       607   606  0 08:41 ?       00:00:00 /usr/libexec/sssd/sssd_be --domain testrelm.test --uid 0 --gid 0 --debug-to-files
root       608   606  0 08:41 ?       00:00:00 /usr/libexec/sssd/sssd_nss --uid 0 --gid 0 --debug-to-files
root       609   606  0 08:41 ?       00:00:00 /usr/libexec/sssd/sssd_sudo --uid 0 --gid 0 --debug-to-files
root       610   606  0 08:41 ?       00:00:00 /usr/libexec/sssd/sssd_pam --uid 0 --gid 0 --debug-to-files
root       611   606  0 08:41 ?       00:00:00 /usr/libexec/sssd/sssd_pac --uid 0 --gid 0 --debug-to-files
root       664     1  0 08:45 ?       00:00:00 /usr/sbin/httpd -DFOREGROUND
apache     665   664  0 08:45 ?       00:00:00 /usr/sbin/httpd -DFOREGROUND
apache     666   664  0 08:45 ?       00:00:00 /usr/sbin/httpd -DFOREGROUND
apache     667   664  0 08:45 ?       00:00:00 /usr/sbin/httpd -DFOREGROUND
apache     668   664  0 08:45 ?       00:00:00 /usr/sbin/httpd -DFOREGROUND
apache     669   664  0 08:45 ?       00:00:00 /usr/sbin/httpd -DFOREGROUND
apache     675   664  0 08:46 ?       00:00:00 /usr/sbin/httpd -DFOREGROUND
root       691     0  0 10:39 ?       00:00:00 bash
root       720     0  0 12:08 ?       00:00:00 bash
root       735   720  0 12:08 ?       00:00:00 tail /var/log/sssd/sssd_pam.log -f
root       748   691  0 12:18 ?       00:00:00 ps -ef
```

Enable debug_level for sssd, then run ipa-client-install in the container (RHEL 7.3.z):

```
ipa-client-install
```

On the RHEL 7.4 IPA server, create the HTTP principal:

```
[root@auto-hv-01-guest06 ~]# ipa service-find --principal http/webauth.testrelm.test
-----------------
1 service matched
-----------------
  Principal name: http/webauth.testrelm.test
  Principal alias: http/webauth.testrelm.test
  Keytab: True
----------------------------
Number of entries returned 1
----------------------------
```

In the container, retrieve the principal and store it in /etc/http.keytab:

```
$ ipa-getkeytab -s auto-hv-01-guest06.testrelm.test --principal http/webauth.testrelm.test -k /etc/http.keytab
```

Start the httpd daemon, then run kinit and curl:

```
kinit foobar1
[root@webauth conf.modules.d]# curl -L -v --negotiate -u : http://webauth.testrelm.test/login/
* About to connect() to webauth.testrelm.test port 80 (#0)
*   Trying 172.17.0.3...
* Connected to webauth.testrelm.test (172.17.0.3) port 80 (#0)
> GET /login/ HTTP/1.1
> User-Agent: curl/7.29.0
> Host: webauth.testrelm.test
> Accept: */*
>
< HTTP/1.1 401 Unauthorized
< Date: Thu, 22 Jun 2017 12:08:38 GMT
< Server: Apache/2.4.6 (Red Hat Enterprise Linux) mod_auth_gssapi/1.4.0 mod_auth_kerb/5.4 mod_wsgi/3.4 Python/2.7.5
< WWW-Authenticate: Negotiate
< Content-Length: 381
< Content-Type: text/html; charset=iso-8859-1
<
* Ignoring the response-body
* Connection #0 to host webauth.testrelm.test left intact
* Issue another request to this URL: 'http://webauth.testrelm.test/login/'
* Found bundle for host webauth.testrelm.test: 0x9eae80
* Re-using existing connection! (#0) with host webauth.testrelm.test
* Connected to webauth.testrelm.test (172.17.0.3) port 80 (#0)
* Server auth using GSS-Negotiate with user ''
> GET /login/ HTTP/1.1
> Authorization: Negotiate YIICZQYJKoZIhvcSAQICAQBuggJUMIICUKADAgEFoQMCAQ6iBwMFACAAAACjggFkYYIBYDCCAVygAwIBBaEPGw1URVNUUkVMTS5URVNUoigwJqADAgEDoR8wHRsESFRUUBsVd2ViYXV0aC50ZXN0cmVsbS50ZXN0o4IBGDCCARSgAwIBEqEDAgECooIBBgSCAQKW67pO02w0zYlpgkgTIw+9fKiplDbACLeSTcmUFfcTC4DNhOLTVrartOK24uu1HalIhGey1nAYlcmpsNN7mhB8LF5OsQyL5oODWmuNwp/ymH0kCV+WUC5ORWewUyFyxmCUhQNY1gmrarCnqh2lWR8Ggf3k8i0mkMgj/lTJ2do07zodw9LdXHjX3WZVpMIswNo1Jal/Ktp0qsTBQvBSMPemNg7p6SCwzVnJ7tIRtSTAnxwwaZuZvXki97ebq625RHDiGVa2OGAtlJNgBTKLWC65+ttWObFawsrvOuyWncVIHQM+O+e4xrpA966Hy1tIiHY8e0/UyVVmkNd3iWHoVYcLojukgdIwgc+gAwIBEqKBxwSBxG3M8cJkTwORpzo1972Su9LOsomrvTrRXG3+L64ecWnPvUbTC6SYeCJQPn1fYdZgmoo9t9BQ4K7Pwc+xFDUSW7kYV60hx9aw5o8M1d85OpisOGSGoKwUNo4rcpEf4eisnyh/rwsfNQ7h1mMrDt/hdeYUDF8p+t6J0iTdcpl0RFlmO1BGVWr38+L8ZFm42j7KBrReLfDGnHFlmUKQIiMRULfmZxWPJIk1uGpl5xBqFR3XbnLSG8JlDbLKp8RddpsCLojaW5E=
> User-Agent: curl/7.29.0
> Host: webauth.testrelm.test
> Accept: */*
>
< HTTP/1.1 401 Unauthorized
< Date: Thu, 22 Jun 2017 12:08:38 GMT
< Server: Apache/2.4.6 (Red Hat Enterprise Linux) mod_auth_gssapi/1.4.0 mod_auth_kerb/5.4 mod_wsgi/3.4 Python/2.7.5
* Authentication problem. Ignoring this.
< WWW-Authenticate: Negotiate YIGZBgkqhkiG9xIBAgICAG+BiTCBhqADAgEFoQMCAQ+iejB4oAMCARKicQRvtTNtAMWO0Gte4gQGARwhSa++PFelap4Cg1dHTwM8d1H9XCwL8zxnFxkZUrpykfT+f1LjocgQxKMDB3A/hG0SHZnZUnbARct9U09pplCkGNRwfil21tVd8g6NECr4HFmGhNHVp2eKi/Xj7n0H4V1l
< Content-Length: 381
< Content-Type: text/html; charset=iso-8859-1
<
<!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
<html><head>
<title>401 Unauthorized</title>
</head><body>
<h1>Unauthorized</h1>
<p>This server could not verify that you are authorized to access the document requested. Either you supplied the wrong credentials (e.g., bad password), or your browser doesn't understand how to supply the credentials required.</p>
</body></html>
* Connection #0 to host webauth.testrelm.test left intact
```

selinux_child.log:

```
(Thu Jun 22 08:46:12 2017) [[sssd[selinux_child[676]]]] [main] (0x0400): selinux_child started.
(Thu Jun 22 08:46:12 2017) [[sssd[selinux_child[676]]]] [main] (0x0400): context initialized
(Thu Jun 22 08:46:12 2017) [[sssd[selinux_child[676]]]] [main] (0x0400): performing selinux operations
(Thu Jun 22 08:46:12 2017) [[sssd[selinux_child[676]]]] [sss_semanage_init] (0x0020): SELinux policy not managed
(Thu Jun 22 08:46:12 2017) [[sssd[selinux_child[676]]]] [get_seuser] (0x0020): Cannot create SELinux handle
(Thu Jun 22 08:46:12 2017) [[sssd[selinux_child[676]]]] [sss_semanage_init] (0x0020): SELinux policy not managed
(Thu Jun 22 08:46:12 2017) [[sssd[selinux_child[676]]]] [set_seuser] (0x0020): Cannot init SELinux management
(Thu Jun 22 08:46:12 2017) [[sssd[selinux_child[676]]]] [main] (0x0020): Cannot set SELinux login context.
(Thu Jun 22 08:46:12 2017) [[sssd[selinux_child[676]]]] [main] (0x0020): selinux_child failed!
```
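The selinux_child failure above is what surfaces to the web server as a PAM "System error": mod_authnz_pam drives pam_acct_mgmt against a pam_sss.so stack, and in the 1.14 packages the account step fails when the SELinux policy store is not managed. For context, a minimal sketch of the PAM service file such a setup would use (the fin-app-prod name is the service recorded in this report's pam_sssd.log; the exact stack is an assumption, not shown in the report):

```
# /etc/pam.d/fin-app-prod (sketch, assumed; only the account line is
# strictly needed for mod_authnz_pam's "Require pam-account fin-app-prod")
auth     required   pam_sss.so
account  required   pam_sss.so
```

On the httpd side, mod_authnz_pam invokes this service via a `Require pam-account fin-app-prod` directive in the protected location.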
pam_sssd.log (before the update):

```
(Thu Jun 22 12:08:38 2017) [sssd[pam]] [pam_print_data] (0x0100): cli_pid: 666
(Thu Jun 22 12:08:38 2017) [sssd[pam]] [pam_print_data] (0x0100): logon name: foobar1
(Thu Jun 22 12:08:38 2017) [sssd[pam]] [sss_dp_issue_request] (0x0400): Issuing request for [0x56522d81cc60:3:foobar1@testrelm.test]
(Thu Jun 22 12:08:38 2017) [sssd[pam]] [sss_dp_get_account_msg] (0x0400): Creating request for [testrelm.test][0x3][BE_REQ_INITGROUPS][1][name=foobar1:-]
(Thu Jun 22 12:08:38 2017) [sssd[pam]] [sss_dp_internal_get_send] (0x0400): Entering request [0x56522d81cc60:3:foobar1@testrelm.test]
(Thu Jun 22 12:08:39 2017) [sssd[pam]] [pam_check_user_search] (0x0100): Requesting info for [foobar1]
(Thu Jun 22 12:08:39 2017) [sssd[pam]] [pam_check_user_search] (0x0400): Returning info for user [foobar1@testrelm.test]
(Thu Jun 22 12:08:39 2017) [sssd[pam]] [pd_set_primary_name] (0x0400): User's primary name is foobar1
(Thu Jun 22 12:08:39 2017) [sssd[pam]] [pam_dp_send_req] (0x0100): Sending request with the following data:
(Thu Jun 22 12:08:39 2017) [sssd[pam]] [pam_print_data] (0x0100): command: SSS_PAM_ACCT_MGMT
(Thu Jun 22 12:08:39 2017) [sssd[pam]] [pam_print_data] (0x0100): domain: testrelm.test
(Thu Jun 22 12:08:39 2017) [sssd[pam]] [pam_print_data] (0x0100): user: foobar1
(Thu Jun 22 12:08:39 2017) [sssd[pam]] [pam_print_data] (0x0100): service: fin-app-prod
(Thu Jun 22 12:08:39 2017) [sssd[pam]] [pam_print_data] (0x0100): tty: not set
(Thu Jun 22 12:08:39 2017) [sssd[pam]] [pam_print_data] (0x0100): ruser: not set
(Thu Jun 22 12:08:39 2017) [sssd[pam]] [pam_print_data] (0x0100): rhost: 172.17.0.3
(Thu Jun 22 12:08:39 2017) [sssd[pam]] [pam_print_data] (0x0100): authtok type: 0
(Thu Jun 22 12:08:39 2017) [sssd[pam]] [pam_print_data] (0x0100): newauthtok type: 0
(Thu Jun 22 12:08:39 2017) [sssd[pam]] [pam_print_data] (0x0100): priv: 0
(Thu Jun 22 12:08:39 2017) [sssd[pam]] [pam_print_data] (0x0100): cli_pid: 666
(Thu Jun 22 12:08:39 2017) [sssd[pam]] [pam_print_data] (0x0100): logon name: foobar1
(Thu Jun 22 12:08:39 2017) [sssd[pam]] [pam_dom_forwarder] (0x0100): pam_dp_send_req returned 0
(Thu Jun 22 12:08:39 2017) [sssd[pam]] [sss_dp_req_destructor] (0x0400): Deleting request: [0x56522d81cc60:3:foobar1@testrelm.test]
(Thu Jun 22 12:08:41 2017) [sssd[pam]] [pam_dp_process_reply] (0x0200): received: [4 (System error)][testrelm.test]
(Thu Jun 22 12:08:41 2017) [sssd[pam]] [pam_reply] (0x0200): pam_reply called with result [4]: System error.
(Thu Jun 22 12:08:41 2017) [sssd[pam]] [pam_reply] (0x0200): blen: 30
(Thu Jun 22 12:08:41 2017) [sssd[pam]] [client_recv] (0x0200): Client disconnected!
```

Use the latest RHEL 7.4 repo inside the container, update the ipa-client and sssd packages, and re-run the curl command:

```
[root@webauth yum.repos.d]# curl -L -v --negotiate -u : http://webauth.testrelm.test/login/
* About to connect() to webauth.testrelm.test port 80 (#0)
*   Trying 172.17.0.3...
* Connected to webauth.testrelm.test (172.17.0.3) port 80 (#0)
```

pam_sssd.log (after the update):

```
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [pam_print_data] (0x0100): logon name: foobar1
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [cache_req_send] (0x0400): CR #1: New request 'Initgroups by name'
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [cache_req_process_input] (0x0400): CR #1: Parsing input name [foobar1]
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [sss_parse_name_for_domains] (0x0200): name 'foobar1' matched expression for domain 'testrelm.test', user is foobar1
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [cache_req_set_name] (0x0400): CR #1: Setting name [foobar1]
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [cache_req_select_domains] (0x0400): CR #1: Performing a single domain search
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [cache_req_search_domains] (0x0400): CR #1: Search will check the cache and check the data provider
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [cache_req_set_domain] (0x0400): CR #1: Using domain [testrelm.test]
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [cache_req_prepare_domain_data] (0x0400): CR #1: Preparing input data for domain [testrelm.test] rules
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [cache_req_search_send] (0x0400): CR #1: Looking up foobar1
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [cache_req_search_ncache] (0x0400): CR #1: Checking negative cache for [foobar1]
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [cache_req_search_ncache] (0x0400): CR #1: [foobar1] is not present in negative cache
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [cache_req_search_cache] (0x0400): CR #1: Looking up [foobar1] in cache
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [cache_req_search_send] (0x0400): CR #1: Returning [foobar1] from cache
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [cache_req_search_ncache_filter] (0x0400): CR #1: This request type does not support filtering result by negative cache
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [cache_req_create_and_add_result] (0x0400): CR #1: Found 2 entries in domain testrelm.test
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [cache_req_done] (0x0400): CR #1: Finished: Success
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [pd_set_primary_name] (0x0400): User's primary name is foobar1
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [pam_dp_send_req] (0x0100): Sending request with the following data:
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [pam_print_data] (0x0100): command: SSS_PAM_ACCT_MGMT
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [pam_print_data] (0x0100): domain: testrelm.test
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [pam_print_data] (0x0100): user: foobar1
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [pam_print_data] (0x0100): service: fin-app-prod
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [pam_print_data] (0x0100): tty: not set
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [pam_print_data] (0x0100): ruser: not set
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [pam_print_data] (0x0100): rhost: 172.17.0.3
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [pam_print_data] (0x0100): authtok type: 0
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [pam_print_data] (0x0100): newauthtok type: 0
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [pam_print_data] (0x0100): priv: 0
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [pam_print_data] (0x0100): cli_pid: 668
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [pam_print_data] (0x0100): logon name: foobar1
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [pam_dom_forwarder] (0x0100): pam_dp_send_req returned 0
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [pam_dp_process_reply] (0x0200): received: [0 (Success)][testrelm.test]
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [pam_reply] (0x0200): pam_reply called with result [0]: Success.
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [filter_responses] (0x0100): [pam_response_filter] not available, not fatal.
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [pam_reply] (0x0200): blen: 30
(Thu Jun 22 12:28:18 2017) [sssd[pam]] [client_recv] (0x0200): Client disconnected!
```

selinux_child.log:

```
(Thu Jun 22 12:28:18 2017) [[sssd[selinux_child[872]]]] [sss_semanage_init] (0x0400): SELinux policy not managed via libsemanage
(Thu Jun 22 12:28:18 2017) [[sssd[selinux_child[872]]]] [pack_buffer] (0x0400): result [0]
(Thu Jun 22 12:28:18 2017) [[sssd[selinux_child[872]]]] [main] (0x0400): selinux_child completed successfully
(Thu Jun 22 12:28:18 2017) [[sssd[selinux_child[873]]]] [main] (0x0400): selinux_child started.
(Thu Jun 22 12:28:18 2017) [[sssd[selinux_child[873]]]] [main] (0x0400): context initialized
(Thu Jun 22 12:28:18 2017) [[sssd[selinux_child[873]]]] [main] (0x0400): performing selinux operations
(Thu Jun 22 12:28:18 2017) [[sssd[selinux_child[873]]]] [sss_semanage_init] (0x0400): SELinux policy not managed via libsemanage
(Thu Jun 22 12:28:18 2017) [[sssd[selinux_child[873]]]] [pack_buffer] (0x0400): result [0]
(Thu Jun 22 12:28:18 2017) [[sssd[selinux_child[873]]]] [main] (0x0400): selinux_child completed successfully
```

Thus, on the basis of the above observations, marking the status of the bug as VERIFIED.
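The decisive difference between the two runs is the pam_reply result code recorded in sssd_pam.log: 4 (System error) before the update, 0 (Success) after. A small sketch of pulling those codes out of a log mechanically (the sample lines are taken verbatim from this report; the helper name is illustrative):

```python
import re

# Matches sssd_pam.log lines such as:
#   "... [pam_reply] (0x0200): pam_reply called with result [4]: System error."
PAM_REPLY = re.compile(r"\[pam_reply\].*result \[(\d+)\]: (.+?)\.$")

def pam_results(lines):
    """Yield (code, message) for every pam_reply result line in an sssd_pam.log."""
    for line in lines:
        m = PAM_REPLY.search(line)
        if m:
            yield int(m.group(1)), m.group(2)

log = [
    "(Thu Jun 22 12:08:41 2017) [sssd[pam]] [pam_reply] (0x0200): pam_reply called with result [4]: System error.",
    "(Thu Jun 22 12:28:18 2017) [sssd[pam]] [pam_reply] (0x0200): pam_reply called with result [0]: Success.",
]
print(list(pam_results(log)))  # -> [(4, 'System error'), (0, 'Success')]
```

Code 4 is PAM_SYSTEM_ERR in the Linux-PAM numbering, which is exactly what mod_authnz_pam turns into the 401 response seen in the failing curl run.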
Since the problem described in this bug report should be resolved in a recent advisory, it has been closed with a resolution of ERRATA. For information on the advisory, and where to find the updated files, follow the link below. If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHEA-2017:2294