
Bug 1245657

Summary: Tasks getting stuck on pulp side with state 'waiting' being assigned to queue 'None.dq'
Product: Red Hat Satellite
Reporter: Ivan Necas <inecas>
Component: Pulp
Assignee: Brian Bouterse <bmbouter>
Status: CLOSED CURRENTRELEASE
QA Contact: Katello QA List <katello-qa-list>
Severity: unspecified
Priority: unspecified
Version: 6.1.0
CC: bmbouter, chpeters, cwelton, daviddavis, dkliban, ggainey, ipanova, mhrivnak, mmccune, pcreech, rchan, ttereshc
Target Milestone: Unspecified
Target Release: Unused
Hardware: Unspecified
OS: Unspecified
Doc Type: Bug Fix
Type: Bug
Last Closed: 2015-08-12 16:04:18 UTC

Description Ivan Necas 2015-07-22 13:36:07 UTC
Description of problem:
It happened several times during the applicability regeneration task; the task stays in state 'waiting' and is assigned to queue 'None.dq':

{"exception"=>nil,
    "task_type"=>
     "pulp.server.managers.consumer.applicability.regenerate_applicability_for_consumers",
    "_href"=>"/pulp/api/v2/tasks/a74bb9e8-6465-43fa-807a-48045cef2ee4/",
    "task_id"=>"a74bb9e8-6465-43fa-807a-48045cef2ee4",
    "tags"=>["pulp:action:content_applicability_regeneration"],
    "finish_time"=>nil,
    "start_time"=>nil,
    "traceback"=>nil,
    "spawned_tasks"=>[],
    "progress_report"=>{},
    "queue"=>"None.dq",
    "state"=>"waiting",
    "worker_name"=>nil,
    "result"=>nil,
    "error"=>nil,
    "_id"=>{"$oid"=>"55add4a768cda8c1ffd4853b"},
    "id"=>"55add4a7506f9c24d186aff5"}],
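For reference, a stuck task of this kind can be picked out of a task list by filtering on the 'waiting'/'None.dq' combination shown in the dump above. A minimal sketch (the helper names and the input shape are illustrative, mirroring the dump, not part of the Pulp API):

```python
def is_stuck(task):
    """Return True if a task report matches this bug's signature:
    still 'waiting', never started, and assigned to the bogus
    'None.dq' queue instead of a real worker queue."""
    return (
        task.get("state") == "waiting"
        and task.get("start_time") is None
        and task.get("queue") == "None.dq"
    )

def stuck_tasks(task_reports):
    """Filter a list of task report dicts down to the stuck ones."""
    return [t for t in task_reports if is_stuck(t)]
```

The input would typically be the decoded JSON from a task-list query such as the one that produced the dump above.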

Comment 2 Ivan Necas 2015-07-22 13:48:40 UTC
Seems similar to https://bugzilla.redhat.com/show_bug.cgi?id=1237265

Comment 12 pulp-infra@redhat.com 2015-07-28 13:00:16 UTC
The Pulp upstream bug status is at NEW. Updating the external tracker on this bug.

Comment 13 pulp-infra@redhat.com 2015-07-28 13:00:18 UTC
The Pulp upstream bug priority is at Normal. Updating the external tracker on this bug.

Comment 14 pulp-infra@redhat.com 2015-07-28 16:30:19 UTC
The Pulp upstream bug status is at POST. Updating the external tracker on this bug.

Comment 15 Mike McCune 2015-07-28 16:57:11 UTC
https://github.com/celery/kombu/issues/498

Comment 16 pulp-infra@redhat.com 2015-07-28 17:00:23 UTC
The Pulp upstream bug status is at MODIFIED. Updating the external tracker on this bug.

Comment 17 Brian Bouterse 2015-07-28 17:26:16 UTC
QE: to verify this, do the following on an EL7 machine:

1. Set up a default installation of Sat6
2. Verify that python-kombu-3.0.24-10 is installed
3. Start Sat6 and sanity-check that it is working (i.e. sync+publish)
4. With Pulp running, list the connections by running `sudo -u apache qpid-stat --ssl-certificate=/etc/pki/katello/qpid_client_striped.crt -b amqps://s61.sat.lab.tlv.redhat.com:5671 -c`. You may need to install qpid-tools to run `qpid-stat`.
5. Verify that all of the connections show they are using ANONYMOUS. Ensure none of them are using PLAIN.
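Step 5 can be scripted. A rough sketch that scans `qpid-stat -c` output for rows whose SASL mechanism is PLAIN (the parsing is a heuristic over the tabular output shown later in this bug, not a qpid-tools API):

```python
def plain_connections(qpid_stat_output):
    """Return the connection rows from `qpid-stat -c` output that
    authenticated with the PLAIN SASL mechanism.  All Pulp
    connections should show ANONYMOUS once the fix is in place."""
    offenders = []
    for line in qpid_stat_output.splitlines():
        fields = line.split()
        # Connection rows start with the local/remote address pair,
        # e.g. "qpid.127.0.0.1:5671-127.0.0.1:50959"; header and
        # separator lines do not.
        if fields and fields[0].startswith("qpid.") and "PLAIN" in fields:
            offenders.append(line.strip())
    return offenders
```

An empty return value means step 5 passes; any returned rows are the connections still using PLAIN.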

Comment 19 Corey Welton 2015-07-30 02:07:07 UTC
QE notes:  substitute your hostname or localhost for the amqps:// url above.

Comment 20 Corey Welton 2015-07-30 02:09:14 UTC
Snap 15: Nearly everything is ANONYMOUS, but I am still seeing a couple PLAIN connections?


[root@qe-sat6-rhel71 ~]# sudo -u apache qpid-stat --ssl-certificate=/etc/pki/katello/qpid_client_striped.crt -b amqps://`hostname`:5671 -c
Connections
  connection                           cproc             cpid   mech       auth        connected   idle        msgIn  msgOut
  ============================================================================================================================
  qpid.127.0.0.1:5671-127.0.0.1:33545  celery            25342  ANONYMOUS  anonymous   4h 21m 4s   0s             1      2
  qpid.127.0.0.1:5671-127.0.0.1:42193  mod_wsgi          25400  ANONYMOUS  anonymous   4h 3m 8s    4s             2      1
  qpid.127.0.0.1:5671-127.0.0.1:46238  celery            25370  ANONYMOUS  anonymous   3h 53m 8s   4s             1      2
  qpid.127.0.0.1:5671-127.0.0.1:48021  celery            25342  ANONYMOUS  anonymous   20m 4s      19m 54s        6      4
  qpid.127.0.0.1:5671-127.0.0.1:48027  celery            25363  ANONYMOUS  anonymous   20m 4s      19m 54s        6      4
  qpid.127.0.0.1:5671-127.0.0.1:48608  qpid-stat         24351  ANONYMOUS  anonymous   0s          0s             1      0
  qpid.127.0.0.1:5671-127.0.0.1:50787  celery            25129  ANONYMOUS  anonymous   5h 48m 21s  5h 48m 14s     0      0
  qpid.127.0.0.1:5671-127.0.0.1:50788  celery            25129  ANONYMOUS  anonymous   5h 48m 21s  5h 48m 14s     0      0
  qpid.127.0.0.1:5671-127.0.0.1:50789  celery            25129  ANONYMOUS  anonymous   5h 48m 21s  0s             6   75.4k
  qpid.127.0.0.1:5671-127.0.0.1:50794  celery            25180  ANONYMOUS  anonymous   5h 48m 20s  14m 44s       12   2.61k
  qpid.127.0.0.1:5671-127.0.0.1:50796  celery            25180  ANONYMOUS  anonymous   5h 48m 20s  0s          18.2k     2
  qpid.127.0.0.1:5671-127.0.0.1:50797  celery            25246  ANONYMOUS  anonymous   5h 48m 20s  19m 54s       12    365
  qpid.127.0.0.1:5671-127.0.0.1:50798  celery            25244  ANONYMOUS  anonymous   5h 48m 20s  19m 54s       12     19
  qpid.127.0.0.1:5671-127.0.0.1:50799  celery            25246  ANONYMOUS  anonymous   5h 48m 20s  0s          11.5k     2
  qpid.127.0.0.1:5671-127.0.0.1:50800  celery            25244  ANONYMOUS  anonymous   5h 48m 20s  0s          10.4k     2
  qpid.127.0.0.1:5671-127.0.0.1:50801  celery            25242  ANONYMOUS  anonymous   5h 48m 20s  3h 53m 4s     12     13
  qpid.127.0.0.1:5671-127.0.0.1:50802  celery            25242  ANONYMOUS  anonymous   5h 48m 20s  0s          10.4k     2
  qpid.127.0.0.1:5671-127.0.0.1:50803  celery            25248  ANONYMOUS  anonymous   5h 48m 20s  14m 44s       12   4.84k
  qpid.127.0.0.1:5671-127.0.0.1:50804  celery            25248  ANONYMOUS  anonymous   5h 48m 20s  0s          24.9k     2
  qpid.127.0.0.1:5671-127.0.0.1:50806  mod_wsgi          25400  ANONYMOUS  anonymous   5h 48m 19s  4s             1      1
  qpid.127.0.0.1:5671-127.0.0.1:50807  mod_wsgi          25400  ANONYMOUS  anonymous   5h 48m 19s  4s             0      0
  qpid.127.0.0.1:5671-127.0.0.1:50808  -                        ANONYMOUS  anonymous   5h 48m 17s  5h 48m 14s     0      0
  qpid.127.0.0.1:5671-127.0.0.1:50926  ruby              25837  ANONYMOUS  anonymous   5h 45m 11s  0s             0   1.27k
  qpid.127.0.0.1:5671-127.0.0.1:50959  Qpid Java Client  26178  PLAIN      guest@QPID  5h 42m 26s  14s         1.27k     0
  qpid.127.0.0.1:5671-127.0.0.1:50972  Qpid Java Client  26178  PLAIN      guest@QPID  5h 41m 47s  14s            0   1.27k
  qpid.127.0.0.1:5671-127.0.0.1:51908  mod_wsgi          25400  ANONYMOUS  anonymous   5h 36m 37s  3h 40m 44s   476      4
  qpid.127.0.0.1:5671-127.0.0.1:51913  celery            25307  ANONYMOUS  anonymous   5h 36m 37s  14m 44s     5.19k     4
  qpid.127.0.0.1:5671-127.0.0.1:51960  celery            25397  ANONYMOUS  anonymous   5h 36m 28s  14m 44s      140      4
  qpid.127.0.0.1:5671-127.0.0.1:52049  mod_wsgi          25400  ANONYMOUS  anonymous   5h 36m 7s   2h 2m 54s    542      8
  qpid.127.0.0.1:5671-127.0.0.1:52325  mod_wsgi          25400  ANONYMOUS  anonymous   3h 40m 50s  56m 54s      599      4
  qpid.127.0.0.1:5671-127.0.0.1:52989  mod_wsgi          25400  ANONYMOUS  anonymous   4h 31m 4s   0s             2      1
  qpid.127.0.0.1:5671-127.0.0.1:55814  mod_wsgi          25400  ANONYMOUS  anonymous   5h 25m 58s  14m 44s      615      8
  qpid.127.0.0.1:5671-127.0.0.1:60962  mod_wsgi          25400  ANONYMOUS  anonymous   5h 7m 35s   2h 46m 34s   258      8


Bouncing this back to dev for clarification; we may be able to close it out afterwards, however.

Comment 21 Michael Hrivnak 2015-07-30 11:59:59 UTC
Pulp definitely is not using the "Qpid Java Client". I suspect those two connections are from candlepin, but I'll let Brian weigh in on whether they should be changed.

Comment 22 Brian Bouterse 2015-07-30 12:44:13 UTC
mhrivnak is right, those other connections are coming from another area in sat6, likely candlepin since they are the only Java client users that I know of.

They must also be using SSL connections, because the broker config requires it, so I think having them switch to ANONYMOUS would be safe and good. It will help them avoid any low-level SASL library prompting in cases where PLAIN is used and the username or password is blank or empty.

They can specify the mech list when they use the Qpid Java client so it should be straightforward for them to update.
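For the legacy Qpid Java client, the mechanism list is usually restricted through the connection URL's broker options. A hypothetical fragment (host, port, and client id are placeholders; the exact option syntax should be checked against the Qpid Java client documentation for the version candlepin ships):

```
amqp://:@clientid/?brokerlist='tcp://broker.example.com:5671?ssl='true'&sasl_mechs='ANONYMOUS''
```

With `sasl_mechs` pinned to ANONYMOUS, the client would stop negotiating PLAIN even when no username or password is supplied.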

Comment 23 Corey Welton 2015-07-30 13:21:54 UTC
Ok, so I suppose we can mark this bz verified and open a new one for the Java client. Thanks!

Comment 24 Bryan Kearney 2015-08-12 16:04:18 UTC
This bug was fixed in Satellite 6.1.1 which was delivered on 12 August, 2015.

Comment 25 pulp-infra@redhat.com 2015-10-22 15:00:58 UTC
The Pulp upstream bug status is at ON_QA. Updating the external tracker on this bug.

Comment 26 pulp-infra@redhat.com 2015-11-09 21:00:57 UTC
The Pulp upstream bug status is at CLOSED - CURRENTRELEASE. Updating the external tracker on this bug.