Red Hat Bugzilla – Bug 1238458
Hammer ping output has no response for pulp_auth only on providing wrong credentials
Last modified: 2017-03-21 08:26:14 EDT
Description of problem:
pulp_auth has no response.

# hammer ping
candlepin:
    Status:          ok
    Server Response: Duration: 18ms
candlepin_auth:
    Status:          ok
    Server Response: Duration: 20ms
pulp:
    Status:          ok
    Server Response: Duration: 18ms
pulp_auth:
    Status:
    Server Response: Message:
elasticsearch:
    Status:          ok
    Server Response: Duration: 10ms
foreman_tasks:
    Status:          ok
    Server Response: Duration: 1ms

Version-Release number of selected component (if applicable):
6.0.8 on a RHEL 7 box. SELinux is permissive; it might have been enforcing at install time.

How reproducible:
Unable to reproduce locally.

Steps to Reproduce (as performed by the customer):
1. SELinux enforcing on RHEL 7
2. yum install katello
3. katello-installer
4. Set up LDAP
5. # hammer ping

Actual results:
# hammer ping
pulp_auth:
    Status:
    Server Response: Message:

Expected results:
pulp_auth:
    Status:          ok
    Server Response: Message: 10ms

Additional info:
When you log into the web UI > admin > about > back-end processes, pulp_auth says ok.
Since this issue was entered in Red Hat Bugzilla, the release flag has been set to ? to ensure that it is properly evaluated for this release.
I am not seeing the above behavior with the latest Satellite 6.1 GA builds (snap 15); therefore, going to move it over to QA for verification. Katello and Pulp rpms installed are:

[root@sat61 ~]# rpm -qa | grep katello
pulp-katello-0.5-1.el7sat.noarch
katello-installer-2.3.17-1.el7sat.noarch
rubygem-hammer_cli_katello-0.0.7.17-1.el7sat.noarch
katello-service-2.2.0.14-1.el7sat.noarch
ruby193-rubygem-katello-2.2.0.65-1.el7sat.noarch
katello-certs-tools-2.2.1-1.el7sat.noarch
katello-2.2.0.14-1.el7sat.noarch
katello-server-ca-1.0-1.noarch
katello-installer-base-2.3.17-1.el7sat.noarch
katello-common-2.2.0.14-1.el7sat.noarch
katello-debug-2.2.0.14-1.el7sat.noarch
katello-default-ca-1.0-1.noarch

[root@sat61 ~]# rpm -qa | grep pulp
pulp-selinux-2.6.0.15-1.el7sat.noarch
pulp-katello-0.5-1.el7sat.noarch
python-isodate-0.5.0-4.pulp.el7sat.noarch
python-pulp-docker-common-0.2.5-1.el7sat.noarch
python-pulp-common-2.6.0.15-1.el7sat.noarch
pulp-nodes-common-2.6.0.15-1.el7sat.noarch
pulp-server-2.6.0.15-1.el7sat.noarch
python-pulp-puppet-common-2.6.0.15-1.el7sat.noarch
python-pulp-rpm-common-2.6.0.15-1.el7sat.noarch
pulp-docker-plugins-0.2.5-1.el7sat.noarch
pulp-puppet-plugins-2.6.0.15-1.el7sat.noarch
pulp-rpm-plugins-2.6.0.15-1.el7sat.noarch
rubygem-smart_proxy_pulp-1.0.1.2-1.el7sat.noarch
python-kombu-3.0.24-10.pulp.el7sat.noarch
python-pulp-bindings-2.6.0.15-1.el7sat.noarch
pulp-nodes-parent-2.6.0.15-1.el7sat.noarch
pulp-puppet-tools-2.6.0.15-1.el7sat.noarch

[root@sat61 ~]# hammer ping
[Foreman] Username: admin
[Foreman] Password for admin:
candlepin:
    Status:          ok
    Server Response: Duration: 73ms
candlepin_auth:
    Status:          ok
    Server Response: Duration: 67ms
pulp:
    Status:          ok
    Server Response: Duration: 98ms
pulp_auth:
    Status:          ok
    Server Response: Duration: 87ms
elasticsearch:
    Status:          ok
    Server Response: Duration: 77ms
foreman_tasks:
    Status:          ok
    Server Response: Duration: 1ms
[root@sat61 ~]#
This works fine in Sat 6.1 GA Snap 15.
# hammer ping
[Foreman] Username: admin
[Foreman] Password for admin:
candlepin:
    Status:          ok
    Server Response: Duration: 13ms
candlepin_auth:
    Status:          ok
    Server Response: Duration: 14ms
pulp:
    Status:          ok
    Server Response: Duration: 172ms
pulp_auth:
    Status:          ok
    Server Response: Duration: 13ms
elasticsearch:
    Status:          ok
    Server Response: Duration: 14ms
foreman_tasks:
    Status:          ok
    Server Response: Duration: 0ms
Removing "blocks rel notes" because bug fix now verified.
FAILED QA: I accidentally found a way of replicating the bug. It appears when providing wrong credentials:

# hammer ping
[Foreman] Username: admin
[Foreman] Password for admin: wrongPasswd
candlepin:
    Status:          ok
    Server Response: Duration: 23ms
candlepin_auth:
    Status:          ok
    Server Response: Duration: 25ms
pulp:
    Status:          ok
    Server Response: Duration: 37ms
pulp_auth:
    Status:
    Server Response: Message:
elasticsearch:
    Status:          ok
    Server Response: Duration: 12ms
foreman_tasks:
    Status:          ok
    Server Response: Duration: 1ms

I'm also able to replicate this on the very latest compose:

# rpm -qa | grep katello
katello-default-ca-1.0-1.noarch
katello-common-2.2.0.14-1.el7sat.noarch
katello-service-2.2.0.14-1.el7sat.noarch
rubygem-hammer_cli_katello-0.0.7.17-1.el7sat.noarch
katello-installer-base-2.3.17-1.el7sat.noarch
ruby193-rubygem-katello-2.2.0.65-1.el7sat.noarch
katello-server-ca-1.0-1.noarch
katello-installer-2.3.17-1.el7sat.noarch
katello-debug-2.2.0.14-1.el7sat.noarch
katello-2.2.0.14-1.el7sat.noarch
pulp-katello-0.5-1.el7sat.noarch
katello-certs-tools-2.2.1-1.el7sat.noarch

# rpm -qa | grep pulp
pulp-puppet-plugins-2.6.0.15-1.el7sat.noarch
python-pulp-bindings-2.6.0.15-1.el7sat.noarch
pulp-puppet-tools-2.6.0.15-1.el7sat.noarch
python-pulp-rpm-common-2.6.0.15-1.el7sat.noarch
pulp-rpm-plugins-2.6.0.15-1.el7sat.noarch
python-isodate-0.5.0-4.pulp.el7sat.noarch
python-pulp-puppet-common-2.6.0.15-1.el7sat.noarch
pulp-server-2.6.0.15-1.el7sat.noarch
pulp-nodes-parent-2.6.0.15-1.el7sat.noarch
pulp-selinux-2.6.0.15-1.el7sat.noarch
python-kombu-3.0.24-10.pulp.el7sat.noarch
python-pulp-docker-common-0.2.5-1.el7sat.noarch
pulp-nodes-common-2.6.0.15-1.el7sat.noarch
rubygem-smart_proxy_pulp-1.0.1.2-1.el7sat.noarch
pulp-docker-plugins-0.2.5-1.el7sat.noarch
python-pulp-common-2.6.0.15-1.el7sat.noarch
pulp-katello-0.5-1.el7sat.noarch
Adding Blocks: 1190823 again
The issue is visible at the API level as well:

$ curl -ku admin:changeme https://<fqdn>/katello/api/v2/ping
{"status":"ok","services":{"elasticsearch":{"status":"ok","duration_ms":"8"},"foreman_tasks":{"status":"ok","duration_ms":"1"},"candlepin":{"status":"ok","duration_ms":"19"},"candlepin_auth":{"status":"ok","duration_ms":"21"},"pulp":{"status":"ok","duration_ms":"213"},"pulp_auth":{"status":"ok","duration_ms":"18"}}}

$ curl -k https://<fqdn>/katello/api/v2/ping
{"status":"FAIL","services":{"elasticsearch":{"status":"ok","duration_ms":"6"},"foreman_tasks":{"status":"ok","duration_ms":"0"},"candlepin":{"status":"ok","duration_ms":"205"},"candlepin_auth":{"status":"ok","duration_ms":"21"},"pulp":{"status":"ok","duration_ms":"59"},"pulp_auth":{}}}

Note that without credentials the overall status is FAIL and "pulp_auth" is an empty hash.
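The unauthenticated response above reports pulp_auth as an empty hash rather than omitting it or giving it a status. A minimal sketch of checking which services in that JSON lack a status field, using the response shape shown above:

```python
import json

# Ping response as returned by /katello/api/v2/ping without credentials
# (copied from the unauthenticated curl output above).
response = json.loads(
    '{"status":"FAIL","services":{'
    '"elasticsearch":{"status":"ok","duration_ms":"6"},'
    '"foreman_tasks":{"status":"ok","duration_ms":"0"},'
    '"candlepin":{"status":"ok","duration_ms":"205"},'
    '"candlepin_auth":{"status":"ok","duration_ms":"21"},'
    '"pulp":{"status":"ok","duration_ms":"59"},'
    '"pulp_auth":{}}}'
)

# Services reported without a "status" field -- these are the ones that
# hammer ends up rendering as blank Status/Server Response lines.
missing = [name for name, info in response["services"].items()
           if "status" not in info]
print(missing)  # -> ['pulp_auth']
```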
Created redmine issue http://projects.theforeman.org/issues/11701 from this bug
Upstream bug component is Content Management
Upstream bug component is Pulp
Using a Satellite-6.1.0-RHEL-7-20151002.0 compose:

Providing correct password:
---------------------------
[root@cloud-qe-10 ~]# hammer -u admin -p changeme ping
candlepin:
    Status:          ok
    Server Response: Duration: 15ms
candlepin_auth:
    Status:          ok
    Server Response: Duration: 16ms
pulp:
    Status:          ok
    Server Response: Duration: 24ms
pulp_auth:
    Status:          ok
    Server Response: Duration: 13ms
elasticsearch:
    Status:          ok
    Server Response: Duration: 8ms
foreman_tasks:
    Status:          ok
    Server Response: Duration: 0ms

Providing INcorrect password:
-----------------------------
[root@cloud-qe-10 ~]# hammer -u admin -p wrongpassword ping
candlepin:
    Status:          ok
    Server Response: Duration: 14ms
candlepin_auth:
    Status:          ok
    Server Response: Duration: 15ms
pulp:
    Status:          ok
    Server Response: Duration: 24ms
pulp_auth:
    Status:
    Server Response:
elasticsearch:
    Status:          ok
    Server Response: Duration: 6ms
foreman_tasks:
    Status:          ok
    Server Response: Duration: 0ms
Interesting:

# curl -k https://localhost/katello/api/v2/ping
{"status":"ok","services":
 {"elasticsearch":{"status":"ok","duration_ms":"6"},
  "foreman_tasks":{"status":"ok","duration_ms":"0"},
  "candlepin":{"status":"ok","duration_ms":"14"},
  "candlepin_auth":{"status":"ok","duration_ms":"16"},
  "pulp":{"status":"ok","duration_ms":"25"}}}

Here pulp_auth is not in the response at all, so it seems that the hammer CLI itself is adding pulp_auth to the output.
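A hypothetical sketch of how that blank block could arise (this is an illustration, not hammer-cli's actual code): if the CLI renders a hard-coded list of service names instead of the services the server actually returned, any service missing from the response comes out as a heading with empty fields.

```python
# Hypothetical illustration only -- hammer-cli is Ruby and its real
# rendering code may differ. The point: rendering a fixed service list
# against a response that omits pulp_auth yields a "pulp_auth:" heading
# with blank Status / Server Response lines, matching the bug output.
EXPECTED_SERVICES = ["candlepin", "candlepin_auth", "pulp",
                     "pulp_auth", "elasticsearch", "foreman_tasks"]

def render(services):
    lines = []
    for name in EXPECTED_SERVICES:        # fixed list, not response keys
        info = services.get(name, {})     # missing service -> empty dict
        lines.append("%s:" % name)
        lines.append("    Status:          %s" % info.get("status", ""))
        lines.append("    Server Response: %s" % info.get("message", ""))
    return "\n".join(lines)

# Server response without pulp_auth, as in the unauthenticated curl above:
output = render({"pulp": {"status": "ok", "message": "Duration: 25ms"}})
```

Running this leaves the `pulp_auth:` section present but empty, which is exactly the symptom reported.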
Fails QE since the output is not what we expected to see.
Fix will require 2 commits: https://github.com/theforeman/hammer-cli/commit/5bb4d24d59b4bb744d694b8da50d70854995e728 and an upcoming hammer-cli-katello PR
hammer-cli-katello PR: https://github.com/Katello/hammer-cli-katello/pull/324
The fix is to hide pulp_auth if the credentials are incorrect:

[root@qe-sat6-rhel67 ~]# hammer -u admin -p changeme ping
candlepin:
    Status:          ok
    Server Response: Duration: 37ms
candlepin_auth:
    Status:          ok
    Server Response: Duration: 38ms
pulp:
    Status:          ok
    Server Response: Duration: 40ms
pulp_auth:
    Status:          ok
    Server Response: Duration: 217ms
elasticsearch:
    Status:          ok
    Server Response: Duration: 13ms
foreman_tasks:
    Status:          ok
    Server Response: Duration: 1ms

[root@qe-sat6-rhel67 ~]# hammer -u admin -p wrongpassword ping
candlepin:
    Status:          ok
    Server Response: Duration: 36ms
candlepin_auth:
    Status:          ok
    Server Response: Duration: 34ms
pulp:
    Status:          ok
    Server Response: Duration: 35ms
elasticsearch:
    Status:          ok
    Server Response: Duration: 5ms
foreman_tasks:
    Status:          ok
    Server Response: Duration: 0ms

Also:

[root@qe-sat6-rhel67 ~]# curl -k https://localhost/katello/api/v2/ping
{"status":"ok","services":{"elasticsearch":{"status":"ok","duration_ms":"4"},"foreman_tasks":{"status":"ok","duration_ms":"0"},"candlepin":{"status":"ok","duration_ms":"34"},"candlepin_auth":{"status":"ok","duration_ms":"37"},"pulp":{"status":"ok","duration_ms":"36"}}}

[root@qe-sat6-rhel67 ~]# curl -k https://admin:changeme@localhost/katello/api/v2/ping
{"status":"ok","services":{"elasticsearch":{"status":"ok","duration_ms":"5"},"foreman_tasks":{"status":"ok","duration_ms":"0"},"candlepin":{"status":"ok","duration_ms":"34"},"candlepin_auth":{"status":"ok","duration_ms":"37"},"pulp":{"status":"ok","duration_ms":"32"},"pulp_auth":{"status":"ok","duration_ms":"20"}}}

This is now VERIFIED on the Satellite-6.1.0-RHEL-{6,7}-20151006.0 composes.
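The verified behavior above amounts to rendering only the services the server actually returned, so an absent pulp_auth simply does not appear. A hypothetical sketch of that post-fix rendering logic (illustration only, not the actual hammer-cli code):

```python
# Hypothetical sketch of the post-fix behavior: iterate over the services
# present in the server response rather than over a fixed list, so a
# service the server withheld (pulp_auth on bad credentials) is not
# rendered at all -- no blank heading.
def render(services):
    lines = []
    for name, info in services.items():
        lines.append("%s:" % name)
        lines.append("    Status:          %s" % info["status"])
        lines.append("    Server Response: Duration: %sms" % info["duration_ms"])
    return "\n".join(lines)

# Response without credentials omits pulp_auth entirely (as in the
# unauthenticated curl output above):
output = render({
    "pulp": {"status": "ok", "duration_ms": "36"},
    "elasticsearch": {"status": "ok", "duration_ms": "5"},
})
```

With this shape of loop, the wrong-password transcript above follows naturally: five services render normally and pulp_auth is simply skipped.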
Since the problem described in this bug report should be resolved in a recent advisory, it has been closed with a resolution of ERRATA. For information on the advisory, and where to find the updated files, follow the link below. If the solution does not work for you, open a new bug report. https://access.redhat.com/errata/RHBA-2015:1911