Bug 1238458

Summary: Hammer ping output has no response for pulp_auth only when wrong credentials are provided
Product: Red Hat Satellite Reporter: Kathryn Dixon <kdixon>
Component: Pulp Assignee: Justin Sherrill <jsherril>
Status: CLOSED ERRATA QA Contact: Og Maciel <omaciel>
Severity: high Docs Contact: Russell Dickenson <rdickens>
Priority: unspecified    
Version: 6.1.0 CC: bbuckingham, bkearney, chpeters, cwelton, jsherril, mmccune, omaciel, rdickens, rplevka, sthirugn
Target Milestone: Unspecified Keywords: ReleaseNotes, Triaged
Target Release: Unused   
Hardware: Unspecified   
OS: Unspecified   
URL: http://projects.theforeman.org/issues/11701
Whiteboard:
Fixed In Version: Doc Type: Known Issue
Doc Text:
Issue: The `pulp_auth` service does not provide a response to the Hammer `ping` command if incorrect credentials are provided. The following example illustrates the output when the `ping` command is run with invalid credentials:

---
# hammer ping
candlepin:
    Status:          ok
    Server Response: Duration: 18ms
candlepin_auth:
    Status:          ok
    Server Response: Duration: 20ms
pulp:
    Status:          ok
    Server Response: Duration: 18ms
pulp_auth:
    Status:
    Server Response: Message:
elasticsearch:
    Status:          ok
    Server Response: Duration: 10ms
foreman_tasks:
    Status:          ok
    Server Response: Duration: 1ms
---

Workaround: Unknown.
Story Points: ---
Clone Of: Environment:
Last Closed: 2015-10-15 18:20:17 UTC Type: Bug
Regression: --- Mount Type: ---
Documentation: --- CRM:
Verified Versions: Category: ---
oVirt Team: --- RHEL 7.3 requirements from Atomic Host:
Cloudforms Team: --- Target Upstream Version:
Embargoed:
Bug Depends On:    
Bug Blocks: 1190823    

Description Kathryn Dixon 2015-07-01 23:25:27 UTC
Description of problem: pulp_auth has no response.

# hammer ping

candlepin:
    Status:          ok
    Server Response: Duration: 18ms
candlepin_auth:
    Status:          ok
    Server Response: Duration: 20ms
pulp:
    Status:          ok
    Server Response: Duration: 18ms
pulp_auth:
    Status:
    Server Response: Message:
elasticsearch:
    Status:          ok
    Server Response: Duration: 10ms
foreman_tasks:
    Status:          ok
    Server Response: Duration: 1ms

Version-Release number of selected component (if applicable):

6.0.8 on a RHEL 7 box. SELinux is in permissive mode; at the time of install it might have been enforcing.


How reproducible:

Unable to reproduce.

Steps to Reproduce: 

The customer has done the following:
1. SELinux enforcing on RHEL 7
2. yum install katello
3. katello-installer
4. set up ldap
5. # hammer ping

Actual results:

# hammer ping

pulp_auth:
    Status: 
    Server Response: Message:

Expected results:

pulp_auth:
    Status:          ok
    Server Response: Duration: 10ms

Additional info:

When you log into the web UI (admin > about > back-end processes), pulp_auth says ok.
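
One way to compare what the web UI reports with what hammer is getting back is to query the Katello ping API directly (replace <fqdn> and the credentials with the values for this environment):

# curl -ku admin:<password> https://<fqdn>/katello/api/v2/ping

A healthy pulp_auth shows up in the JSON as "pulp_auth":{"status":"ok","duration_ms":"..."}; a blank or missing entry here would point at the server side rather than at the hammer CLI.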

Comment 1 RHEL Program Management 2015-07-01 23:43:34 UTC
Since this issue was entered in Red Hat Bugzilla, the release flag has been
set to ? to ensure that it is properly evaluated for this release.

Comment 3 Brad Buckingham 2015-08-04 19:22:03 UTC
I am not seeing the above behavior with the latest Satellite 6.1 GA builds (snap 15); therefore, going to move it over to QA for verification.

Katello and Pulp rpms installed are:

[root@sat61 ~]# rpm -qa|grep katello
pulp-katello-0.5-1.el7sat.noarch
katello-installer-2.3.17-1.el7sat.noarch
rubygem-hammer_cli_katello-0.0.7.17-1.el7sat.noarch
katello-service-2.2.0.14-1.el7sat.noarch
ruby193-rubygem-katello-2.2.0.65-1.el7sat.noarch
katello-certs-tools-2.2.1-1.el7sat.noarch
katello-2.2.0.14-1.el7sat.noarch
katello-server-ca-1.0-1.noarch
katello-installer-base-2.3.17-1.el7sat.noarch
katello-common-2.2.0.14-1.el7sat.noarch
katello-debug-2.2.0.14-1.el7sat.noarch
katello-default-ca-1.0-1.noarch

[root@sat61 ~]# rpm -qa|grep pulp
pulp-selinux-2.6.0.15-1.el7sat.noarch
pulp-katello-0.5-1.el7sat.noarch
python-isodate-0.5.0-4.pulp.el7sat.noarch
python-pulp-docker-common-0.2.5-1.el7sat.noarch
python-pulp-common-2.6.0.15-1.el7sat.noarch
pulp-nodes-common-2.6.0.15-1.el7sat.noarch
pulp-server-2.6.0.15-1.el7sat.noarch
python-pulp-puppet-common-2.6.0.15-1.el7sat.noarch
python-pulp-rpm-common-2.6.0.15-1.el7sat.noarch
pulp-docker-plugins-0.2.5-1.el7sat.noarch
pulp-puppet-plugins-2.6.0.15-1.el7sat.noarch
pulp-rpm-plugins-2.6.0.15-1.el7sat.noarch
rubygem-smart_proxy_pulp-1.0.1.2-1.el7sat.noarch
python-kombu-3.0.24-10.pulp.el7sat.noarch
python-pulp-bindings-2.6.0.15-1.el7sat.noarch
pulp-nodes-parent-2.6.0.15-1.el7sat.noarch
pulp-puppet-tools-2.6.0.15-1.el7sat.noarch

[root@sat61 ~]# hammer ping
[Foreman] Username: admin
[Foreman] Password for admin: 
candlepin:      
    Status:          ok
    Server Response: Duration: 73ms
candlepin_auth: 
    Status:          ok
    Server Response: Duration: 67ms
pulp:           
    Status:          ok
    Server Response: Duration: 98ms
pulp_auth:      
    Status:          ok
    Server Response: Duration: 87ms
elasticsearch:  
    Status:          ok
    Server Response: Duration: 77ms
foreman_tasks:  
    Status:          ok
    Server Response: Duration: 1ms

[root@sat61 ~]#

Comment 4 sthirugn@redhat.com 2015-08-05 18:53:58 UTC
This works fine in Sat 6.1 GA Snap 15.

Comment 5 sthirugn@redhat.com 2015-08-05 18:54:16 UTC
# hammer ping
[Foreman] Username: admin
[Foreman] Password for admin: 
candlepin:      
    Status:          ok
    Server Response: Duration: 13ms
candlepin_auth: 
    Status:          ok
    Server Response: Duration: 14ms
pulp:           
    Status:          ok
    Server Response: Duration: 172ms
pulp_auth:      
    Status:          ok
    Server Response: Duration: 13ms
elasticsearch:  
    Status:          ok
    Server Response: Duration: 14ms
foreman_tasks:  
    Status:          ok
    Server Response: Duration: 0ms

Comment 6 David O'Brien 2015-08-06 00:43:13 UTC
Removing "blocks rel notes" because bug fix now verified.

Comment 7 Roman Plevka 2015-08-09 13:25:40 UTC
FAILED QA:

I accidentally found a way of replicating the bug.
It appears when wrong credentials are provided:

# hammer ping
[Foreman] Username: admin
[Foreman] Password for admin: wrongPasswd
candlepin:      
    Status:          ok
    Server Response: Duration: 23ms
candlepin_auth: 
    Status:          ok
    Server Response: Duration: 25ms
pulp:           
    Status:          ok
    Server Response: Duration: 37ms
pulp_auth:      
    Status:          
    Server Response: Message:
elasticsearch:  
    Status:          ok
    Server Response: Duration: 12ms
foreman_tasks:  
    Status:          ok
    Server Response: Duration: 1ms


I'm also able to replicate this on the very latest compose:

# rpm -qa | grep katello
katello-default-ca-1.0-1.noarch
katello-common-2.2.0.14-1.el7sat.noarch
katello-service-2.2.0.14-1.el7sat.noarch
rubygem-hammer_cli_katello-0.0.7.17-1.el7sat.noarch
katello-installer-base-2.3.17-1.el7sat.noarch
ruby193-rubygem-katello-2.2.0.65-1.el7sat.noarch
katello-server-ca-1.0-1.noarch
katello-installer-2.3.17-1.el7sat.noarch
katello-debug-2.2.0.14-1.el7sat.noarch
katello-2.2.0.14-1.el7sat.noarch
pulp-katello-0.5-1.el7sat.noarch
katello-certs-tools-2.2.1-1.el7sat.noarch

# rpm -qa | grep pulp
pulp-puppet-plugins-2.6.0.15-1.el7sat.noarch
python-pulp-bindings-2.6.0.15-1.el7sat.noarch
pulp-puppet-tools-2.6.0.15-1.el7sat.noarch
python-pulp-rpm-common-2.6.0.15-1.el7sat.noarch
pulp-rpm-plugins-2.6.0.15-1.el7sat.noarch
python-isodate-0.5.0-4.pulp.el7sat.noarch
python-pulp-puppet-common-2.6.0.15-1.el7sat.noarch
pulp-server-2.6.0.15-1.el7sat.noarch
pulp-nodes-parent-2.6.0.15-1.el7sat.noarch
pulp-selinux-2.6.0.15-1.el7sat.noarch
python-kombu-3.0.24-10.pulp.el7sat.noarch
python-pulp-docker-common-0.2.5-1.el7sat.noarch
pulp-nodes-common-2.6.0.15-1.el7sat.noarch
rubygem-smart_proxy_pulp-1.0.1.2-1.el7sat.noarch
pulp-docker-plugins-0.2.5-1.el7sat.noarch
python-pulp-common-2.6.0.15-1.el7sat.noarch
pulp-katello-0.5-1.el7sat.noarch

Comment 8 sthirugn@redhat.com 2015-08-10 13:26:04 UTC
Adding Blocks: 1190823 again

Comment 9 Roman Plevka 2015-08-10 20:49:03 UTC
The issue can be traced to the API:

$ curl -ku admin:changeme https://<fqdn>/katello/api/v2/ping
{"status":"ok","services":{"elasticsearch":{"status":"ok","duration_ms":"8"},"foreman_tasks":{"status":"ok","duration_ms":"1"},"candlepin":{"status":"ok","duration_ms":"19"},"candlepin_auth":{"status":"ok","duration_ms":"21"},"pulp":{"status":"ok","duration_ms":"213"},"pulp_auth":{"status":"ok","duration_ms":"18"}}}

$ curl -k https://<fqdn>/katello/api/v2/ping
{"status":"FAIL","services":{"elasticsearch":{"status":"ok","duration_ms":"6"},"foreman_tasks":{"status":"ok","duration_ms":"0"},"candlepin":{"status":"ok","duration_ms":"205"},"candlepin_auth":{"status":"ok","duration_ms":"21"},"pulp":{"status":"ok","duration_ms":"59"},"pulp_auth":{}}}

Comment 13 Justin Sherrill 2015-09-04 17:22:09 UTC
Created redmine issue http://projects.theforeman.org/issues/11701 from this bug

Comment 14 Bryan Kearney 2015-09-04 18:04:04 UTC
Upstream bug component is Content Management

Comment 15 Bryan Kearney 2015-09-09 20:03:50 UTC
Upstream bug component is Pulp

Comment 18 Og Maciel 2015-10-05 20:14:19 UTC
Using a Satellite-6.1.0-RHEL-7-20151002.0 compose:

Providing correct password:
---------------------------

[root@cloud-qe-10 ~]# hammer -u admin -p changeme ping
candlepin:
    Status:          ok
    Server Response: Duration: 15ms
candlepin_auth:
    Status:          ok
    Server Response: Duration: 16ms
pulp:
    Status:          ok
    Server Response: Duration: 24ms
pulp_auth:
    Status:          ok
    Server Response: Duration: 13ms
elasticsearch:
    Status:          ok
    Server Response: Duration: 8ms
foreman_tasks:
    Status:          ok
    Server Response: Duration: 0ms

Providing INcorrect password:
-----------------------------

[root@cloud-qe-10 ~]# hammer -u admin -p wrongpassword ping
candlepin:
    Status:          ok
    Server Response: Duration: 14ms
candlepin_auth:
    Status:          ok
    Server Response: Duration: 15ms
pulp:
    Status:          ok
    Server Response: Duration: 24ms
pulp_auth:
    Status:
    Server Response:
elasticsearch:
    Status:          ok
    Server Response: Duration: 6ms
foreman_tasks:
    Status:          ok
    Server Response: Duration: 0ms

Comment 19 Og Maciel 2015-10-05 20:18:28 UTC
Interesting:

curl -k https://localhost/katello/api/v2/ping
{"status":"ok","services":
{"elasticsearch":{"status":"ok","duration_ms":"6"},
"foreman_tasks":{"status":"ok","duration_ms":"0"},
"candlepin":{"status":"ok","duration_ms":"14"},
"candlepin_auth":{"status":"ok","duration_ms":"16"},
"pulp":{"status":"ok","duration_ms":"25"}}}

It seems that the hammer CLI is adding pulp_auth to the output even though the API response does not include it.
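
One way to confirm that from the shell (the grep pattern is just a rough sketch of hammer's heading format):

# hammer -u admin -p wrongpassword ping | grep -oE '^[a-z_]+'
# curl -sk https://localhost/katello/api/v2/ping | python -m json.tool

The first command lists the section headings hammer prints, which include pulp_auth; the second pretty-prints the services the API actually returned, which (per the output above) do not. So the extra, empty section is being added on the CLI side.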

Comment 20 Og Maciel 2015-10-05 20:19:43 UTC
Fails QE since the output is not what we expected to see.

Comment 21 Justin Sherrill 2015-10-05 21:03:32 UTC
The fix will require two commits:

https://github.com/theforeman/hammer-cli/commit/5bb4d24d59b4bb744d694b8da50d70854995e728

and an upcoming hammer-cli-katello PR

Comment 22 Justin Sherrill 2015-10-05 21:08:26 UTC
hammer-cli-katello PR:   https://github.com/Katello/hammer-cli-katello/pull/324
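
Once both changes land, a quick regression check (a sketch, assuming the fix ends up hiding services the API did not report) is to run hammer with a deliberately wrong password and confirm the blank section is simply gone:

# hammer -u admin -p wrongpassword ping | grep -c '^pulp_auth:'

Before the fix this prints 1 (the empty pulp_auth block); afterwards it should print 0, with pulp_auth appearing only when valid credentials are supplied.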

Comment 25 Og Maciel 2015-10-06 15:48:20 UTC
The fix is to hide pulp_auth if credentials are incorrect:

[root@qe-sat6-rhel67 ~]# hammer -u admin -p changeme ping
candlepin:
    Status:          ok
    Server Response: Duration: 37ms
candlepin_auth:
    Status:          ok
    Server Response: Duration: 38ms
pulp:
    Status:          ok
    Server Response: Duration: 40ms
pulp_auth:
    Status:          ok
    Server Response: Duration: 217ms
elasticsearch:
    Status:          ok
    Server Response: Duration: 13ms
foreman_tasks:
    Status:          ok
    Server Response: Duration: 1ms

[root@qe-sat6-rhel67 ~]# hammer -u admin -p wrongpassword  ping
candlepin:
    Status:          ok
    Server Response: Duration: 36ms
candlepin_auth:
    Status:          ok
    Server Response: Duration: 34ms
pulp:
    Status:          ok
    Server Response: Duration: 35ms
elasticsearch:
    Status:          ok
    Server Response: Duration: 5ms
foreman_tasks:
    Status:          ok
    Server Response: Duration: 0ms

Also, checking the API directly:


[root@qe-sat6-rhel67 ~]# curl -k https://localhost/katello/api/v2/ping
{"status":"ok","services":{"elasticsearch":{"status":"ok","duration_ms":"4"},"foreman_tasks":{"status":"ok","duration_ms":"0"},"candlepin":{"status":"ok","duration_ms":"34"},"candlepin_auth":{"status":"ok","duration_ms":"37"},"pulp":{"status":"ok","duration_ms":"36"}}}
[root@qe-sat6-rhel67 ~]# curl -k https://admin:changeme@localhost/katello/api/v2/ping
{"status":"ok","services":{"elasticsearch":{"status":"ok","duration_ms":"5"},"foreman_tasks":{"status":"ok","duration_ms":"0"},"candlepin":{"status":"ok","duration_ms":"34"},"candlepin_auth":{"status":"ok","duration_ms":"37"},"pulp":{"status":"ok","duration_ms":"32"},"pulp_auth":{"status":"ok","duration_ms":"20"}}}


This is now VERIFIED on Satellite-6.1.0-RHEL-{6,7}-20151006.0 compose

Comment 27 errata-xmlrpc 2015-10-15 18:20:17 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory, and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHBA-2015:1911