Red Hat Bugzilla – Attachment 895511 Details for Bug 1088481: qpidd restart needed to unblock celery processing
verifying screen log
screen.log (text/plain), 46.83 KB, created by mkovacik on 2014-05-14 14:39:16 UTC
Description: verifying screen log
Filename: screen.log
MIME Type: text/plain
Creator: mkovacik
Created: 2014-05-14 14:39:16 UTC
Size: 46.83 KB
>
>
>### inducing the issue
>[root@ec2-54-73-98-199 pulp_auto]# head -755 /usr/lib/python2.7/site-packages/kombu/transport/qpid.py | tail -4
> options['qpid.max_count'] = 10
> #options['qpid.auto_delete_timeout'] = AUTO_DELETE_TIMEOUT
> #if queue.startswith('celeryev') or queue.endswith('pidbox'):
> # options['qpid.policy_type'] = 'ring'
>[root@ec2-54-73-98-199 pulp_auto]# mongo pulp_database --eval "db.dropDatabase()"
>MongoDB shell version: 2.4.6
>connecting to: pulp_database
>[object Object]
>[root@ec2-54-73-98-199 pulp_auto]# sudo -u apache pulp-manage-db
>/usr/lib64/python2.7/site-packages/pymongo/mongo_replica_set_client.py:340: UserWarning: libevent version mismatch: system version is '2.0.21-stable' but this gevent is compiled against '2.0.18-stable'
> from gevent import Greenlet
>Loading content types.
>Content types loaded.
>Ensuring the admin role and user are in place.
>Admin role and user are in place.
>Beginning database migrations.
>Applying pulp.server.db.migrations version 1
>Migration to pulp.server.db.migrations version 1 complete.
>Applying pulp.server.db.migrations version 2
>Migration to pulp.server.db.migrations version 2 complete.
>Applying pulp.server.db.migrations version 3
>Migration to pulp.server.db.migrations version 3 complete.
>Applying pulp.server.db.migrations version 4
>Migration to pulp.server.db.migrations version 4 complete.
>Applying pulp.server.db.migrations version 5
>Migration to pulp.server.db.migrations version 5 complete.
>Applying pulp.server.db.migrations version 6
>Migration to pulp.server.db.migrations version 6 complete.
>Applying pulp.server.db.migrations version 7
>Migration to pulp.server.db.migrations version 7 complete.
>Applying pulp.server.db.migrations version 8
>Migration to pulp.server.db.migrations version 8 complete.
>Applying pulp.server.db.migrations version 9
>Migration to pulp.server.db.migrations version 9 complete.
>Applying pulp_puppet.plugins.migrations version 1
>Migration to pulp_puppet.plugins.migrations version 1 complete.
>Applying pulp_puppet.plugins.migrations version 2
>Migration to pulp_puppet.plugins.migrations version 2 complete.
>Applying pulp_rpm.plugins.migrations version 1
>Migration to pulp_rpm.plugins.migrations version 1 complete.
>Applying pulp_rpm.plugins.migrations version 2
>Migration to pulp_rpm.plugins.migrations version 2 complete.
>Applying pulp_rpm.plugins.migrations version 3
>Migration to pulp_rpm.plugins.migrations version 3 complete.
>Applying pulp_rpm.plugins.migrations version 4
>Migration to pulp_rpm.plugins.migrations version 4 complete.
>Applying pulp_rpm.plugins.migrations version 5
>Migration to pulp_rpm.plugins.migrations version 5 complete.
>Applying pulp_rpm.plugins.migrations version 6
>Migration to pulp_rpm.plugins.migrations version 6 complete.
>Applying pulp_rpm.plugins.migrations version 7
>Migration to pulp_rpm.plugins.migrations version 7 complete.
>Applying pulp_rpm.plugins.migrations version 8
>Migration to pulp_rpm.plugins.migrations version 8 complete.
>Applying pulp_rpm.plugins.migrations version 9
>Migration to pulp_rpm.plugins.migrations version 9 complete.
>Applying pulp_rpm.plugins.migrations version 10
>Migration to pulp_rpm.plugins.migrations version 10 complete.
>Applying pulp_rpm.plugins.migrations version 11
>Migration to pulp_rpm.plugins.migrations version 11 complete.
>Applying pulp_rpm.plugins.migrations version 12
>Migration to pulp_rpm.plugins.migrations version 12 complete.
>Applying pulp_rpm.plugins.migrations version 13
>Migration to pulp_rpm.plugins.migrations version 13 complete.
>Applying pulp_rpm.plugins.migrations version 14
>Migration to pulp_rpm.plugins.migrations version 14 complete.
>Applying pulp_rpm.plugins.migrations version 15
>Migration to pulp_rpm.plugins.migrations version 15 complete.
>Database migrations complete.
>[root@ec2-54-73-98-199 pulp_auto]# systemctl start `systemctl list-unit-files | egrep 'pulp|httpd|goferd' | cut -d\ -f1`
>[root@ec2-54-73-98-199 pulp_auto]# date
>Wed May 14 13:45:42 UTC 2014
>[root@ec2-54-73-98-199 pulp_auto]# qpid-stat -q
>Queues
> queue dur autoDel excl msg msgIn msgOut bytes bytesIn bytesOut cons bind
> ==========================================================================================================================================================================
> 183603f7-dbd7-4f91-bd0f-2e8d3f5aabcc:1.0 Y Y 0 2 2 0 172 172 1 2
> 1dfc73ea-87a4-46b1-afa6-0de6bd78b1bf:1.0 Y Y 0 0 0 0 0 0 1 2
> 411c4693-a747-40a1-8b5f-fe270cacaa76:1.0 Y Y 0 4 4 0 1.76k 1.76k 1 2
> 411c4693-a747-40a1-8b5f-fe270cacaa76:3.0 Y Y 0 6 6 0 2.08k 2.08k 1 2
> 577a4f72-d88b-4f96-9111-0691799b4782:1.0 Y Y 0 4 4 0 1.80k 1.80k 1 2
> 75f7a147-057a-4cdf-b1aa-6f09eff3afd8:1.0 Y Y 0 4 4 0 1.80k 1.80k 1 2
> 8078287f-ef22-40b3-8957-ae9982faef43:1.0 Y Y 0 2 2 0 172 172 1 2
> 8304b114-91ee-44ee-9b0b-abc190c53b08:1.0 Y Y 0 4 4 0 1.74k 1.74k 1 2
> 8304b114-91ee-44ee-9b0b-abc190c53b08:3.0 Y Y 0 7 7 0 2.15k 2.15k 1 2
> 948f072a-ff3b-453e-af82-e817231243b9:1.0 Y Y 0 0 0 0 0 0 1 2
> c4535718-70bb-4fb4-ae03-527c758b4a2a:1.0 Y Y 0 2 2 0 172 172 1 2
> celery Y 0 0 0 0 0 0 2 2
> celeryev.3c1226e5-092b-4593-832c-e9c00f544de8 Y 10 10 0 7.87k 7.87k 0 1 2
> celeryev.bda052ce-2991-4c77-9789-ecfcc7d82cb5 Y 10 10 0 7.87k 7.87k 0 1 2
> cf29b63d-e54c-4f50-b204-e06d93b79960:1.0 Y Y 0 9 9 0 3.83k 3.83k 1 2
> d589ca04-803f-4546-92ca-2c30a4d2becd:0.0 Y Y 0 0 0 0 0 0 1 2
> e6510bdb-a175-4eb0-94c5-8e2b53f65361:1.0 Y Y 0 10 10 0 3.99k 3.99k 1 2
> f089af33-b6a9-40c4-b282-d5436de9c453:1.0 Y Y 0 10 10 0 3.99k 3.99k 1 2
> f6e99597-5fc3-4afb-b74d-34bed9e9ab43:1.0 Y Y 0 8 8 0 3.62k 3.62k 1 2
> f6e99597-5fc3-4afb-b74d-34bed9e9ab43:3.0 Y Y 0 7 7 0 2.11k 2.11k 1 2
> pulp.task Y 0 0 0 0 0 0 3 1
> reserved_resource_worker-0@ec2-54-73-98-199.eu-west-1.compute.amazonaws.com Y 0 0 0 0 0 0 1 2
> reserved_resource_worker-0@ec2-54-73-98-199.eu-west-1.compute.amazonaws.com.celery.pidbox Y 5 5 0 3.12k 3.12k 0 1 2
> reserved_resource_worker-1@ec2-54-73-98-199.eu-west-1.compute.amazonaws.com.celery.pidbox Y 4 4 0 2.53k 2.53k 0 1 2
> resource_manager Y 0 0 0 0 0 0 1 2
> resource_manager@ec2-54-73-98-199.eu-west-1.compute.amazonaws.com.celery.pidbox Y 5 5 0 3.12k 3.12k 0 1 2
>
>### running nosetests would just hang for couple of minutes; put the command on bg
>[root@ec2-54-73-98-199 pulp_auto]# nosetests -vs tests/test_2_repo_importer_distributor.py
>/usr/lib/python2.7/site-packages/pulp_auto-0.1-py2.7.egg/pulp_auto/__init__.py:2: UserWarning: libevent version mismatch: system version is '2.0.21-stable' but this gevent is compiled against '2.0.18-stable'
>^Z
>[2]+ Stopped nosetests -vs tests/test_2_repo_importer_distributor.py
>[root@ec2-54-73-98-199 pulp_auto]# bg
>[2]+ nosetests -vs tests/test_2_repo_importer_distributor.py &
>[root@ec2-54-73-98-199 pulp_auto]# qpid-stat -q
>Queues
> queue dur autoDel excl msg msgIn msgOut bytes bytesIn bytesOut cons bind
> ==========================================================================================================================================================================
> 0e5f4de7-4b14-44aa-83c3-a308d12f4461:0.0 Y Y 0 0 0 0 0 0 1 2
> 183603f7-dbd7-4f91-bd0f-2e8d3f5aabcc:1.0 Y Y 0 2 2 0 172 172 1 2
> 1dfc73ea-87a4-46b1-afa6-0de6bd78b1bf:1.0 Y Y 0 0 0 0 0 0 1 2
> 411c4693-a747-40a1-8b5f-fe270cacaa76:1.0 Y Y 0 4 4 0 1.76k 1.76k 1 2
> 411c4693-a747-40a1-8b5f-fe270cacaa76:3.0 Y Y 0 6 6 0 2.08k 2.08k 1 2
> 577a4f72-d88b-4f96-9111-0691799b4782:1.0 Y Y 0 4 4 0 1.80k 1.80k 1 2
> 75f7a147-057a-4cdf-b1aa-6f09eff3afd8:1.0 Y Y 0 4 4 0 1.80k 1.80k 1 2
> 8078287f-ef22-40b3-8957-ae9982faef43:1.0 Y Y 0 2 2 0 172 172 1 2
> 8304b114-91ee-44ee-9b0b-abc190c53b08:1.0 Y Y 0 4 4 0 1.74k 1.74k 1 2
> 8304b114-91ee-44ee-9b0b-abc190c53b08:3.0 Y Y 0 7 7 0 2.15k 2.15k 1 2
> 948f072a-ff3b-453e-af82-e817231243b9:1.0 Y Y 0 0 0 0 0 0 1 2
> af2a0ba9-adec-4605-951e-20af3e9f2129:1.0 Y Y 0 9 9 0 3.90k 3.90k 1 2
> c4535718-70bb-4fb4-ae03-527c758b4a2a:1.0 Y Y 0 2 2 0 172 172 1 2
> celery Y 1 2 1 681 1.36k 681 2 2
> celeryev.3c1226e5-092b-4593-832c-e9c00f544de8 Y 10 10 0 7.87k 7.87k 0 1 2
> celeryev.bda052ce-2991-4c77-9789-ecfcc7d82cb5 Y 10 10 0 7.87k 7.87k 0 1 2
> cf29b63d-e54c-4f50-b204-e06d93b79960:1.0 Y Y 0 9 9 0 3.83k 3.83k 1 2
> e6510bdb-a175-4eb0-94c5-8e2b53f65361:1.0 Y Y 0 10 10 0 3.99k 3.99k 1 2
> e6e3ca58-be80-4bdf-a37e-2bbfc637d8b1:1.0 Y Y 0 4 4 0 1.76k 1.76k 1 2
> e907d46f-83a9-4b26-b56d-12fbc40e4e8a:1.0 Y Y 0 4 4 0 1.80k 1.80k 1 2
> f089af33-b6a9-40c4-b282-d5436de9c453:1.0 Y Y 0 10 10 0 3.99k 3.99k 1 2
> f6e99597-5fc3-4afb-b74d-34bed9e9ab43:1.0 Y Y 0 8 8 0 3.62k 3.62k 1 2
> f6e99597-5fc3-4afb-b74d-34bed9e9ab43:3.0 Y Y 0 8 8 0 2.21k 2.21k 1 2
> pulp.task Y 0 0 0 0 0 0 3 1
> reserved_resource_worker-0@ec2-54-73-98-199.eu-west-1.compute.amazonaws.com Y 0 0 0 0 0 0 1 2
> reserved_resource_worker-0@ec2-54-73-98-199.eu-west-1.compute.amazonaws.com.celery.pidbox Y 6 6 0 3.70k 3.70k 0 1 2
> reserved_resource_worker-1@ec2-54-73-98-199.eu-west-1.compute.amazonaws.com.celery.pidbox Y 5 5 0 3.12k 3.12k 0 1 2
> resource_manager Y 1 1 0 741 741 0 1 2
> resource_manager@ec2-54-73-98-199.eu-west-1.compute.amazonaws.com.celery.pidbox Y 6 6 0 3.70k 3.70k 0 1 2
>
>
>### checking what the processes are doing --- not much
>[root@ec2-54-73-98-199 pulp_auto]# ps -ef | grep [c]elery
>apache 8635 1 0 13:45 ? 00:00:01 /usr/bin/python /usr/bin/celery beat -A pulp.server.async.app
>apache 8636 1 0 13:45 ? 00:00:01 /usr/bin/python /usr/bin/celery worker -A pulp.server.async.app -n resource_manager@%h -Q resource_manager -c 1 --events
>apache 8637 1 0 13:45 ? 00:00:01 /usr/bin/python /usr/bin/celery worker -n reserved_resource_worker-0@%h -A pulp.server.async.app -c 1 --events
>apache 8638 1 0 13:45 ? 00:00:01 /usr/bin/python /usr/bin/celery worker -n reserved_resource_worker-1@%h -A pulp.server.async.app -c 1 --events
>apache 8705 8637 0 13:45 ? 00:00:00 /usr/bin/python /usr/bin/celery worker -n reserved_resource_worker-0@%h -A pulp.server.async.app -c 1 --events
>apache 8725 8636 0 13:45 ? 00:00:00 /usr/bin/python /usr/bin/celery worker -A pulp.server.async.app -n resource_manager@%h -Q resource_manager -c 1 --events
>apache 8733 8638 0 13:45 ? 00:00:00 /usr/bin/python /usr/bin/celery worker -n reserved_resource_worker-1@%h -A pulp.server.async.app -c 1 --events
>[root@ec2-54-73-98-199 pulp_auto]#
>
>[root@ec2-54-73-98-199 pulp_auto]# strace -p 8636
>Process 8636 attached
>select(36, [35], [], [], {0, 123605}) = 0 (Timeout)
>gettimeofday({1400075453, 778952}, NULL) = 0
>select(36, [35], [], [], {3, 0}) = 0 (Timeout)
>gettimeofday({1400075456, 782342}, NULL) = 0
>select(36, [35], [], [], {3, 0}) = 0 (Timeout)
>gettimeofday({1400075459, 785734}, NULL) = 0
>select(36, [35], [], [], {3, 0}^CProcess 8636 detached
> <detached ...>
>[root@ec2-54-73-98-199 pulp_auto]# strace -p 8635
>Process 8635 attached
>select(0, NULL, NULL, NULL, {28, 886318}^CProcess 8635 detached
> <detached ...>
>
>### killing the processes and nosetests
>[1]+ Stopped vi /usr/lib/python2.7/site-packages/kombu/transport/qpid.py
>[root@ec2-54-73-98-199 pulp_auto]# killall nosetests
>-bash: killall: command not found
>[root@ec2-54-73-98-199 pulp_auto]# kill %2
>[root@ec2-54-73-98-199 pulp_auto]# kill %2
>-bash: kill: (8840) - No such process
>[2]- Terminated nosetests -vs tests/test_2_repo_importer_distributor.py
>[root@ec2-54-73-98-199 pulp_auto]# systemctl stop `systemctl list-unit-files | egrep 'pulp|httpd|goferd|qpidd' | cut -d\ -f1`
>^C
>[root@ec2-54-73-98-199 pulp_auto]# ps -u apache
> PID TTY TIME CMD
> 8078 ? 00:00:00 systemd
> 8080 ? 00:00:00 (sd-pam)
> 8641 ? 00:00:01 httpd
> 8646 ? 00:00:00 httpd
> 8648 ? 00:00:01 httpd
> 8670 ? 00:00:00 httpd
>
>[root@ec2-54-73-98-199 pulp_auto]# ps -u apache -f
>UID PID PPID C STIME TTY TIME CMD
>apache 8078 1 0 11:05 ? 00:00:00 /usr/lib/systemd/systemd --user
>apache 8080 8078 0 11:05 ? 00:00:00 (sd-pam)
>
>
>[root@ec2-54-73-98-199 pulp_auto]# systemctl status `systemctl list-unit-files | egrep 'pulp|httpd|goferd|qpidd' | cut -d\ -f1`
>httpd.service - The Apache HTTP Server
> Loaded: loaded (/usr/lib/systemd/system/httpd.service; enabled)
> Active: failed (Result: signal) since Wed 2014-05-14 13:53:59 UTC; 1min 1s ago
> Process: 8890 ExecStop=/bin/kill -WINCH ${MAINPID} (code=exited, status=0/SUCCESS)
> Process: 8634 ExecStart=/usr/sbin/httpd $OPTIONS -DFOREGROUND (code=killed, signal=KILL)
> Main PID: 8634 (code=killed, signal=KILL)
> Status: "Total requests: 1; Current requests/sec: 0; Current traffic: 0 B/sec"
>
>May 14 13:53:32 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com pulp[8641]: qpid.messaging:WARNING: sleeping 64 seconds
>May 14 13:53:32 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com pulp[8646]: qpid.messaging:WARNING: recoverable error[attempt 6]: [Errno 111] Connection refused
>May 14 13:53:32 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com pulp[8646]: qpid.messaging:WARNING: sleeping 64 seconds
>May 14 13:53:32 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com pulp[8648]: qpid.messaging:WARNING: trying: ec2-54-73-98-199.eu-west-1.compute.amazonaws.com:5672
>May 14 13:53:32 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com pulp[8648]: qpid.messaging:WARNING: recoverable error[attempt 6]: [Errno 111] Connection refused
>May 14 13:53:32 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com pulp[8648]: qpid.messaging:WARNING: sleeping 64 seconds
>May 14 13:53:59 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com systemd[1]: httpd.service stopping timed out. Killing.
>May 14 13:53:59 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com systemd[1]: httpd.service: main process exited, code=killed, status=9/KILL
>May 14 13:53:59 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com systemd[1]: Stopped The Apache HTTP Server.
>May 14 13:53:59 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com systemd[1]: Unit httpd.service entered failed state.
>
>pulp_celerybeat.service - Pulp's Celerybeat
> Loaded: loaded (/usr/lib/systemd/system/pulp_celerybeat.service; enabled)
> Active: inactive (dead) since Wed 2014-05-14 13:52:29 UTC; 2min 31s ago
> Process: 8635 ExecStart=/usr/bin/celery beat -A pulp.server.async.app (code=exited, status=0/SUCCESS)
> Main PID: 8635 (code=exited, status=0/SUCCESS)
>
>May 14 13:52:29 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com systemd[1]: Stopping Pulp's Celerybeat...
>May 14 13:52:29 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com celery[8635]: celery beat v3.1.11 (Cipater) is starting.
>May 14 13:52:29 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com celery[8635]: __ - ... __ - _
>May 14 13:52:29 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com celery[8635]: Configuration ->
>May 14 13:52:29 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com celery[8635]: . broker -> qpid://guest@ec2-54-73-98-199.eu-west-1.compute.amazonaws.com:5672//
>May 14 13:52:29 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com celery[8635]: . loader -> celery.loaders.app.AppLoader
>May 14 13:52:29 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com celery[8635]: . scheduler -> pulp.server.async.scheduler.Scheduler
>May 14 13:52:29 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com celery[8635]: . logfile -> [stderr]@%INFO
>May 14 13:52:29 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com celery[8635]: . maxinterval -> now (0s)
>May 14 13:52:29 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com systemd[1]: Stopped Pulp's Celerybeat.
>
>pulp_resource_manager.service - Pulp Resource Manager
> Loaded: loaded (/usr/lib/systemd/system/pulp_resource_manager.service; enabled)
> Active: failed (Result: exit-code) since Wed 2014-05-14 13:52:31 UTC; 2min 30s ago
> Process: 8636 ExecStart=/usr/bin/celery worker -A pulp.server.async.app -n resource_manager@%%h -Q resource_manager -c 1 --events (code=exited, status=1/FAILURE)
> Main PID: 8636 (code=exited, status=1/FAILURE)
>
>May 14 13:52:30 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com celery[8636]: File "/usr/lib/python2.7/site-packages/qpid/messaging/endpoints.py", line 585, in _ewait
>May 14 13:52:30 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com celery[8636]: result = self.connection._ewait(lambda: self.error or predicate(), timeout)
>May 14 13:52:30 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com celery[8636]: File "/usr/lib/python2.7/site-packages/qpid/messaging/endpoints.py", line 224, in _ewait
>May 14 13:52:30 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com celery[8636]: self.check_error()
>May 14 13:52:30 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com celery[8636]: File "/usr/lib/python2.7/site-packages/qpid/messaging/endpoints.py", line 217, in check_error
>May 14 13:52:30 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com celery[8636]: raise e
>May 14 13:52:30 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com celery[8636]: qpid.messaging.exceptions.ConnectionError: connection aborted
>May 14 13:52:31 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com systemd[1]: pulp_resource_manager.service: main process exited, code=exited, status=1/FAILURE
>May 14 13:52:31 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com systemd[1]: Stopped Pulp Resource Manager.
>May 14 13:52:31 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com systemd[1]: Unit pulp_resource_manager.service entered failed state.
>
>pulp_worker-0.service - Pulp Worker #0
> Loaded: loaded (/run/systemd/system/pulp_worker-0.service; static)
> Active: failed (Result: exit-code) since Wed 2014-05-14 13:52:31 UTC; 2min 29s ago
> Process: 8637 ExecStart=/usr/bin/celery worker -n reserved_resource_worker-0@%%h -A pulp.server.async.app -c 1 --events (code=exited, status=1/FAILURE)
> Main PID: 8637 (code=exited, status=1/FAILURE)
>
>May 14 13:52:30 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com celery[8637]: result = self.connection._ewait(lambda: self.error or predicate(), timeout)
>May 14 13:52:30 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com celery[8637]: File "/usr/lib/python2.7/site-packages/qpid/messaging/endpoints.py", line 224, in _ewait
>May 14 13:52:30 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com celery[8637]: self.check_error()
>May 14 13:52:30 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com celery[8637]: File "/usr/lib/python2.7/site-packages/qpid/messaging/endpoints.py", line 217, in check_error
>May 14 13:52:30 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com celery[8637]: raise e
>May 14 13:52:30 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com celery[8637]: qpid.messaging.exceptions.ConnectionError: connection aborted
>May 14 13:52:31 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com systemd[1]: pulp_worker-0.service: main process exited, code=exited, status=1/FAILURE
>May 14 13:52:31 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com systemd[1]: Stopped Pulp Worker #0.
>May 14 13:52:31 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com systemd[1]: Unit pulp_worker-0.service entered failed state.
>May 14 13:52:31 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com systemd[1]: Stopped Pulp Worker #0.
>
>pulp_worker-1.service - Pulp Worker #1
> Loaded: loaded (/run/systemd/system/pulp_worker-1.service; static)
> Active: failed (Result: exit-code) since Wed 2014-05-14 13:52:31 UTC; 2min 29s ago
> Process: 8638 ExecStart=/usr/bin/celery worker -n reserved_resource_worker-1@%%h -A pulp.server.async.app -c 1 --events (code=exited, status=1/FAILURE)
> Main PID: 8638 (code=exited, status=1/FAILURE)
>
>May 14 13:52:30 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com celery[8638]: File "/usr/lib/python2.7/site-packages/qpid/messaging/endpoints.py", line 585, in _ewait
>May 14 13:52:30 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com celery[8638]: result = self.connection._ewait(lambda: self.error or predicate(), timeout)
>May 14 13:52:30 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com celery[8638]: File "/usr/lib/python2.7/site-packages/qpid/messaging/endpoints.py", line 224, in _ewait
>May 14 13:52:30 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com celery[8638]: self.check_error()
>May 14 13:52:30 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com celery[8638]: File "/usr/lib/python2.7/site-packages/qpid/messaging/endpoints.py", line 217, in check_error
>May 14 13:52:30 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com celery[8638]: raise e
>May 14 13:52:30 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com celery[8638]: qpid.messaging.exceptions.ConnectionError: connection aborted
>May 14 13:52:31 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com systemd[1]: pulp_worker-1.service: main process exited, code=exited, status=1/FAILURE
>May 14 13:52:31 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com systemd[1]: Stopped Pulp Worker #1.
>May 14 13:52:31 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com systemd[1]: Unit pulp_worker-1.service entered failed state.
>
>pulp_workers.service - Pulp Celery Workers
> Loaded: loaded (/usr/lib/systemd/system/pulp_workers.service; enabled)
> Active: inactive (dead) since Wed 2014-05-14 13:52:31 UTC; 2min 29s ago
> Process: 8893 ExecStop=/usr/libexec/pulp-manage-workers stop (code=exited, status=0/SUCCESS)
> Process: 8639 ExecStart=/usr/libexec/pulp-manage-workers start (code=exited, status=0/SUCCESS)
> Main PID: 8639 (code=exited, status=0/SUCCESS)
>
>May 14 13:45:37 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com systemd[1]: Started Pulp Celery Workers.
>May 14 13:52:29 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com systemd[1]: Stopping Pulp Celery Workers...
>May 14 13:52:31 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com systemd[1]: Stopped Pulp Celery Workers.
>
>qpidd.service - An AMQP message broker daemon.
> Loaded: loaded (/usr/lib/systemd/system/qpidd.service; enabled)
> Active: inactive (dead) since Wed 2014-05-14 13:52:29 UTC; 2min 31s ago
> Docs: man:qpidd(1)
> http://qpid.apache.org/
> Process: 8603 ExecStart=/usr/sbin/qpidd --config /etc/qpid/qpidd.conf (code=exited, status=0/SUCCESS)
> Main PID: 8603 (code=exited, status=0/SUCCESS)
>
>May 14 13:42:37 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com qpidd[8603]: 2014-05-14 13:42:37 [Broker] notice Broker running
>May 14 13:45:39 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com qpidd[8603]: 2014-05-14 13:45:39 [Store] notice Journal "celery": Created
>May 14 13:45:40 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com qpidd[8603]: 2014-05-14 13:45:40 [Store] notice Journal "resource_manager": Created
>May 14 13:45:40 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com qpidd[8603]: 2014-05-14 13:45:40 [Store] notice Journal "pulp.task": Created
>May 14 13:45:43 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com qpidd[8603]: 2014-05-14 13:45:43 [Store] notice Journal "reserved_resource_worker-0@ec2-54-73-98-199.eu-west-1.compute.amazonaws.com": Created
>May 14 13:45:48 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com qpidd[8603]: 2014-05-14 13:45:48 [Broker] warning Exchange celeryev cannot deliver to queue celeryev.3c1226e5-092b-4593-832c-e9c00f544de8: resource-limit-exceeded: reso...
>May 14 13:45:48 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com qpidd[8603]: 2014-05-14 13:45:48 [Broker] warning Exchange celeryev cannot deliver to queue celeryev.bda052ce-2991-4c77-9789-ecfcc7d82cb5: resource-limit-exceeded: reso...
>May 14 13:45:48 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com qpidd[8603]: 2014-05-14 13:45:48 [Protocol] error Execution exception: resource-limit-exceeded: resource-limit-exceeded: Maximum depth exceeded on celeryev.3c1226e5-092b...
>May 14 13:52:29 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com systemd[1]: Stopping An AMQP message broker daemon....
>May 14 13:52:29 ec2-54-73-98-199.eu-west-1.compute.amazonaws.com systemd[1]: Stopped An AMQP message broker daemon..
>Hint: Some lines were ellipsized, use -l to show in full.
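The qpidd journal above captures the failure mode being induced: with the ring policy commented out of the kombu qpid transport, the celeryev.* event queues are declared with only qpid.max_count=10, so after ten unconsumed events the broker refuses further deliveries ("resource-limit-exceeded: Maximum depth exceeded"), celery processing stalls, and the workers only exit with a ConnectionError once qpidd itself is stopped. The snippet below is not part of the attached log; it is a minimal sketch of that broker-side behaviour, assuming a local qpidd on the default port, the python-qpid (qpid.messaging) client, and a made-up queue name.

    # Hypothetical illustration (not from the log): a queue capped by
    # 'qpid.max_count' and declared without 'qpid.policy_type' rejects new
    # deliveries once it is full, which is what the celeryev.* queues above
    # ran into; declaring it with 'qpid.policy_type': 'ring' instead makes
    # the broker drop the oldest message and keep the event stream moving.
    from qpid.messaging import Connection, Message
    from qpid.messaging.exceptions import MessagingError

    ADDRESS = ("limit-demo; {create: always, node: {x-declare: {"
               "auto-delete: True, arguments: {'qpid.max_count': 10}}}}")

    conn = Connection.establish("localhost:5672")
    try:
        sender = conn.session().sender(ADDRESS)
        for i in range(11):  # one message more than the cap
            try:
                sender.send(Message("event %d" % i), sync=True)
            except MessagingError as exc:  # expected to fire around the 11th send
                print("send %d refused: %s" % (i, exc))
                break
    finally:
        try:
            conn.close()
        except MessagingError:
            pass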
>
>### Un-hacking the patch
>
>[root@ec2-54-73-98-199 pulp_auto]# #head -755 /usr/lib/python2.7/site-packages/kombu/transport/qpid.py | tail -4
>[root@ec2-54-73-98-199 pulp_auto]# fg
>vi /usr/lib/python2.7/site-packages/kombu/transport/qpid.py
>
>[1]+ Stopped vi /usr/lib/python2.7/site-packages/kombu/transport/qpid.py
>[root@ec2-54-73-98-199 pulp_auto]# head -755 /usr/lib/python2.7/site-packages/kombu/transport/qpid.py | tail -4
> options['qpid.max_count'] = 10
> options['qpid.auto_delete_timeout'] = AUTO_DELETE_TIMEOUT
> if queue.startswith('celeryev') or queue.endswith('pidbox'):
> options['qpid.policy_type'] = 'ring'
>[root@ec2-54-73-98-199 pulp_auto]# systemctl start qpidd
>[root@ec2-54-73-98-199 pulp_auto]# qpid-config queues | tail -n+4 | cut -d\ -f 1 | xargs -I{} qpid-config del queue --force {}
>Failed: Exception: Exception from Agent: {u'error_code': 7, u'error_text': 'not-found: Delete failed. No such queue: dc1a5232-f3ac-42ef-8f7c-22cd68644ba3:0.0 (/builddir/build/BUILD/qpid-0.24/cpp/src/qpid/broker/Broker.cpp:1348)'}
>[root@ec2-54-73-98-199 pulp_auto]# qpid-config queues
>Queue Name Attributes
>=================================================================
>42441e4b-43e6-43c7-89a9-0db64704e1f4:0.0 auto-del excl
>celery --durable --max-queue-count=10 --argument passive=False --argument exclusive=False --argument arguments=None
>[root@ec2-54-73-98-199 pulp_auto]# qpid-config queues | tail -n+4 | cut -d\ -f 1 | xargs -I{} qpid-config del queue --force {}
>[root@ec2-54-73-98-199 pulp_auto]# qpid-config queues
>Queue Name Attributes
>=================================================================
>c0a6f3d4-1225-4be7-88ae-db60b2769738:0.0 auto-del excl
>[root@ec2-54-73-98-199 pulp_auto]# systemctl restart qpidd
>[root@ec2-54-73-98-199 pulp_auto]# mongo pulp_database --eval "db.dropDatabase()"
>MongoDB shell version: 2.4.6
>connecting to: pulp_database
>[object Object]
>[root@ec2-54-73-98-199 pulp_auto]# sudo -u apache pulp-manage-db
>/usr/lib64/python2.7/site-packages/pymongo/mongo_replica_set_client.py:340: UserWarning: libevent version mismatch: system version is '2.0.21-stable' but this gevent is compiled against '2.0.18-stable'
> from gevent import Greenlet
>Loading content types.
>Content types loaded.
>Ensuring the admin role and user are in place.
>Admin role and user are in place.
>Beginning database migrations.
>Applying pulp.server.db.migrations version 1
>Migration to pulp.server.db.migrations version 1 complete.
>Applying pulp.server.db.migrations version 2
>Migration to pulp.server.db.migrations version 2 complete.
>Applying pulp.server.db.migrations version 3
>Migration to pulp.server.db.migrations version 3 complete.
>Applying pulp.server.db.migrations version 4
>Migration to pulp.server.db.migrations version 4 complete.
>Applying pulp.server.db.migrations version 5
>Migration to pulp.server.db.migrations version 5 complete.
>Applying pulp.server.db.migrations version 6
>Migration to pulp.server.db.migrations version 6 complete.
>Applying pulp.server.db.migrations version 7
>Migration to pulp.server.db.migrations version 7 complete.
>Applying pulp.server.db.migrations version 8
>Migration to pulp.server.db.migrations version 8 complete.
>Applying pulp.server.db.migrations version 9
>Migration to pulp.server.db.migrations version 9 complete.
>Applying pulp_puppet.plugins.migrations version 1
>Migration to pulp_puppet.plugins.migrations version 1 complete.
>Applying pulp_puppet.plugins.migrations version 2
>Migration to pulp_puppet.plugins.migrations version 2 complete.
>Applying pulp_rpm.plugins.migrations version 1
>Migration to pulp_rpm.plugins.migrations version 1 complete.
>Applying pulp_rpm.plugins.migrations version 2
>Migration to pulp_rpm.plugins.migrations version 2 complete.
>Applying pulp_rpm.plugins.migrations version 3
>Migration to pulp_rpm.plugins.migrations version 3 complete.
>Applying pulp_rpm.plugins.migrations version 4
>Migration to pulp_rpm.plugins.migrations version 4 complete.
>Applying pulp_rpm.plugins.migrations version 5
>Migration to pulp_rpm.plugins.migrations version 5 complete.
>Applying pulp_rpm.plugins.migrations version 6
>Migration to pulp_rpm.plugins.migrations version 6 complete.
>Applying pulp_rpm.plugins.migrations version 7
>Migration to pulp_rpm.plugins.migrations version 7 complete.
>Applying pulp_rpm.plugins.migrations version 8
>Migration to pulp_rpm.plugins.migrations version 8 complete.
>Applying pulp_rpm.plugins.migrations version 9
>Migration to pulp_rpm.plugins.migrations version 9 complete.
>Applying pulp_rpm.plugins.migrations version 10
>Migration to pulp_rpm.plugins.migrations version 10 complete.
>Applying pulp_rpm.plugins.migrations version 11
>Migration to pulp_rpm.plugins.migrations version 11 complete.
>Applying pulp_rpm.plugins.migrations version 12
>Migration to pulp_rpm.plugins.migrations version 12 complete.
>Applying pulp_rpm.plugins.migrations version 13
>Migration to pulp_rpm.plugins.migrations version 13 complete.
>Applying pulp_rpm.plugins.migrations version 14
>Migration to pulp_rpm.plugins.migrations version 14 complete.
>Applying pulp_rpm.plugins.migrations version 15
>Migration to pulp_rpm.plugins.migrations version 15 complete.
>Database migrations complete.
>[root@ec2-54-73-98-199 pulp_auto]# systemctl restart `systemctl list-unit-files | egrep 'pulp|httpd|goferd' | cut -d\ -f1`
>[root@ec2-54-73-98-199 pulp_auto]# qpid-config queues
>Queue Name Attributes
>==============================================================================================================
>0faf6128-7aab-404f-935a-646d7d2a189b:0.0 auto-del excl
>29887f5a-2219-47cf-89c9-cb11f74ee2ec:1.0 auto-del excl
>36cef6ed-995b-3114-bc06-5672fd711e69.reply.celery.pidbox auto-del --max-queue-count=10 --limit-policy=ring --argument exclusive=False --argument qpid.auto_delete_timeout=3 --argument passive=False --argument arguments={u'x-expires': 10000}
>3d2dff3b-12ca-442d-bc41-65c5ba7a3022:1.0 auto-del excl
>45178b08-e51f-44ab-8478-59e47cb4eb9f:1.0 auto-del excl
>4afb57ea-bce6-4b03-bc12-a09df4769e2c:1.0 auto-del excl
>4afb57ea-bce6-4b03-bc12-a09df4769e2c:3.0 auto-del excl
>8a19c6b2-3687-4502-8158-2d1cca428695:1.0 auto-del excl
>a2064cec-8c32-3474-91e3-e877053756b9.reply.celery.pidbox auto-del --max-queue-count=10 --limit-policy=ring --argument exclusive=False --argument qpid.auto_delete_timeout=3 --argument passive=False --argument arguments={u'x-expires': 10000}
>a5c540d4-3967-4829-9449-55bec2554fed:1.0 auto-del excl
>a5c540d4-3967-4829-9449-55bec2554fed:3.0 auto-del excl
>a868271f-3cce-4e4e-acc8-28038431cfb9:1.0 auto-del excl
>c6a00c1a-4cb8-47ab-97a4-93549fb72664:1.0 auto-del excl
>c70cdde3-5449-47c0-ab01-40ff5be12a4c:1.0 auto-del excl
>celery --durable --max-queue-count=10 --argument passive=False --argument exclusive=False --argument qpid.auto_delete_timeout=3 --argument arguments=None
>celeryev.0404e905-9163-4f31-ab50-15ec5833a216 auto-del --max-queue-count=10 --limit-policy=ring --argument exclusive=False --argument qpid.auto_delete_timeout=3 --argument passive=False --argument arguments={}
>celeryev.f3d35032-1414-42f8-b9f2-cfd2797de820 auto-del --max-queue-count=10 --limit-policy=ring --argument exclusive=False --argument qpid.auto_delete_timeout=3 --argument passive=False --argument arguments={}
>d4b8b5ce-0cc7-4b90-90bf-2a7d130b77d4:1.0 auto-del excl
>dd070594-a580-3e12-b734-ada4e7b463b5.reply.celery.pidbox auto-del --max-queue-count=10 --limit-policy=ring --argument exclusive=False --argument qpid.auto_delete_timeout=3 --argument passive=False --argument arguments={u'x-expires': 10000}
>e91c94ce-6d2b-4772-b8f3-624fe05e40db:1.0 auto-del excl
>ee5fa5c7-47f0-4786-a87f-050b1a36b47e:1.0 auto-del excl
>ff593894-aefd-4f56-bfc4-5f51a9a82fec:1.0 auto-del excl
>ff593894-aefd-4f56-bfc4-5f51a9a82fec:3.0 auto-del excl
>pulp.task --durable
>reserved_resource_worker-0@ec2-54-73-98-199.eu-west-1.compute.amazonaws.com --durable --max-queue-count=10 --argument passive=False --argument exclusive=False --argument qpid.auto_delete_timeout=3 --argument arguments=None
>reserved_resource_worker-0@ec2-54-73-98-199.eu-west-1.compute.amazonaws.com.celery.pidbox auto-del --max-queue-count=10 --limit-policy=ring --argument exclusive=False --argument qpid.auto_delete_timeout=3 --argument passive=False --argument arguments=None
>reserved_resource_worker-1@ec2-54-73-98-199.eu-west-1.compute.amazonaws.com.celery.pidbox auto-del --max-queue-count=10 --limit-policy=ring --argument exclusive=False --argument qpid.auto_delete_timeout=3 --argument passive=False --argument arguments=None
>resource_manager --durable --max-queue-count=10 --argument passive=False --argument exclusive=False --argument qpid.auto_delete_timeout=3 --argument arguments=None
>resource_manager@ec2-54-73-98-199.eu-west-1.compute.amazonaws.com.celery.pidbox auto-del --max-queue-count=10 --limit-policy=ring --argument exclusive=False --argument qpid.auto_delete_timeout=3 --argument passive=False --argument arguments=None
>[root@ec2-54-73-98-199 pulp_auto]# qpid-tool -q
>Usage: qpid-tool [OPTIONS] [[<username>/<password>@]<target-host>[:<tcp-port>]]
>
>qpid-tool: error: no such option: -q
>[root@ec2-54-73-98-199 pulp_auto]# qpid-stat -q
>Queues
> queue dur autoDel excl msg msgIn msgOut bytes bytesIn bytesOut cons bind
> ==========================================================================================================================================================================
> 1bfd2aca-71e8-42d7-a16e-4171b559f254:0.0 Y Y 0 0 0 0 0 0 1 2
> 29887f5a-2219-47cf-89c9-cb11f74ee2ec:1.0 Y Y 0 2 2 0 172 172 1 2
> 3d2dff3b-12ca-442d-bc41-65c5ba7a3022:1.0 Y Y 0 10 10 0 4.10k 4.10k 1 2
> 45178b08-e51f-44ab-8478-59e47cb4eb9f:1.0 Y Y 0 4 4 0 1.85k 1.85k 1 2
> 4afb57ea-bce6-4b03-bc12-a09df4769e2c:1.0 Y Y 0 4 4 0 1.77k 1.77k 1 2
> 4afb57ea-bce6-4b03-bc12-a09df4769e2c:3.0 Y Y 0 7 7 0 2.15k 2.15k 1 2
> 8a19c6b2-3687-4502-8158-2d1cca428695:1.0 Y Y 0 2 2 0 172 172 1 2
> a5c540d4-3967-4829-9449-55bec2554fed:1.0 Y Y 0 8 8 0 3.69k 3.69k 1 2
> a5c540d4-3967-4829-9449-55bec2554fed:3.0 Y Y 0 7 7 0 2.23k 2.23k 1 2
> a868271f-3cce-4e4e-acc8-28038431cfb9:1.0 Y Y 0 0 0 0 0 0 1 2
> c6a00c1a-4cb8-47ab-97a4-93549fb72664:1.0 Y Y 0 9 9 0 4.01k 4.01k 1 2
> c70cdde3-5449-47c0-ab01-40ff5be12a4c:1.0 Y Y 0 2 2 0 172 172 1 2
> celery Y 0 0 0 0 0 0 2 2
> celeryev.0404e905-9163-4f31-ab50-15ec5833a216 Y 10 66 56 7.90k 52.2k 44.3k 1 2
> celeryev.f3d35032-1414-42f8-b9f2-cfd2797de820 Y 10 66 56 7.90k 52.2k 44.3k 1 2
> d4b8b5ce-0cc7-4b90-90bf-2a7d130b77d4:1.0 Y Y 0 0 0 0 0 0 1 2
> e91c94ce-6d2b-4772-b8f3-624fe05e40db:1.0 Y Y 0 9 9 0 3.94k 3.94k 1 2
> ee5fa5c7-47f0-4786-a87f-050b1a36b47e:1.0 Y Y 0 4 4 0 1.85k 1.85k 1 2
> ff593894-aefd-4f56-bfc4-5f51a9a82fec:1.0 Y Y 0 4 4 0 1.80k 1.80k 1 2
> ff593894-aefd-4f56-bfc4-5f51a9a82fec:3.0 Y Y 0 5 5 0 2.04k 2.04k 1 2
> pulp.task Y 0 0 0 0 0 0 3 1
> reserved_resource_worker-0@ec2-54-73-98-199.eu-west-1.compute.amazonaws.com Y 0 0 0 0 0 0 1 2
> reserved_resource_worker-0@ec2-54-73-98-199.eu-west-1.compute.amazonaws.com.celery.pidbox Y 4 4 0 2.44k 2.44k 0 1 2
> reserved_resource_worker-1@ec2-54-73-98-199.eu-west-1.compute.amazonaws.com.celery.pidbox Y 2 2 0 1.27k 1.27k 0 1 2
> resource_manager Y 0 0 0 0 0 0 1 2
> resource_manager@ec2-54-73-98-199.eu-west-1.compute.amazonaws.com.celery.pidbox Y 4 4 0 2.44k 2.44k 0 1 2
>
># test cases back to normal
>[root@ec2-54-73-98-199 pulp_auto]# nosetests -vs tests/test_2_repo_importer_distributor.py
>/usr/lib/python2.7/site-packages/pulp_auto-0.1-py2.7.egg/pulp_auto/__init__.py:2: UserWarning: libevent version mismatch: system version is '2.0.21-stable' but this gevent is compiled against '2.0.18-stable'
>test_01_list_distributors (tests.test_2_repo_importer_distributor.DistributorTest) ... ok
>test_02_list_distributors_of_unexistant_repo (tests.test_2_repo_importer_distributor.DistributorTest) ... ok
>test_03_get_single_distributor (tests.test_2_repo_importer_distributor.DistributorTest) ... ok
>test_04_distributor_update (tests.test_2_repo_importer_distributor.DistributorTest) ... ok
>test_05_distributor_update_unexistent_repo (tests.test_2_repo_importer_distributor.DistributorTest) ... ok
>test_06_delete_distributor (tests.test_2_repo_importer_distributor.DistributorTest) ... ok
>test_01_list_importers (tests.test_2_repo_importer_distributor.ImporterTest) ... ok
>test_02_list_importer_of_unexistant_repo (tests.test_2_repo_importer_distributor.ImporterTest) ... ok
>test_03_get_single_importer (tests.test_2_repo_importer_distributor.ImporterTest) ... ok
>test_04_importer_update (tests.test_2_repo_importer_distributor.ImporterTest) ... ok
>test_05_importer_update_unexistent_repo (tests.test_2_repo_importer_distributor.ImporterTest) ... FAIL
>test_06_delete_importer (tests.test_2_repo_importer_distributor.ImporterTest) ... ok
>
>======================================================================
>FAIL: test_05_importer_update_unexistent_repo (tests.test_2_repo_importer_distributor.ImporterTest)
>----------------------------------------------------------------------
>Traceback (most recent call last):
> File "/usr/share/pulp_auto/tests/test_2_repo_importer_distributor.py", line 54, in test_05_importer_update_unexistent_repo
> self.assertPulp(code=404)
> File "/usr/share/pulp_auto/tests/pulp_test.py", line 54, in assertPulp
> code
> File "/usr/share/pulp_auto/tests/pulp_test.py", line 63, in assertEqual
> super(PulpTest, self).assertEqual(a, b, pprint.pformat("%r != %r" % (a, b)))
>AssertionError: '202 != 404'
>-------------------- >> begin captured logging << --------------------
>requests.packages.urllib3.connectionpool: DEBUG: "PUT /pulp/api/v2/repositories/ImporterTest_repo1/importers/yum_importer/ HTTP/1.1" 202 172
>pulp_auto.pulp: DEBUG: <pulp_auto.pulp.Pulp object at 0x2096850>.send(Request('PUT', '/repositories/ImporterTest_repo1/importers/yum_importer/', data='{"importer_config": {"num_units": 10}}', headers={'Content-Type': 'application/json'})) == <Response [202]>
>--------------------- >> end captured logging << ---------------------
>
>----------------------------------------------------------------------
>Ran 12 tests in 8.162s
>
>FAILED (failures=1)
>[root@ec2-54-73-98-199 pulp_auto]# head -755 /usr/lib/python2.7/site-packages/kombu/transport/qpid.py | tail -4
> options['qpid.max_count'] = 10
> options['qpid.auto_delete_timeout'] = AUTO_DELETE_TIMEOUT
> if queue.startswith('celeryev') or queue.endswith('pidbox'):
> options['qpid.policy_type'] = 'ring'
>[root@ec2-54-73-98-199 pulp_auto]#
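For reference, the four lines of /usr/lib/python2.7/site-packages/kombu/transport/qpid.py toggled above (checked each time with head -755 | tail -4) are what decide the qpid arguments a queue is declared with. The sketch below paraphrases that hunk as a standalone function; the function name and the AUTO_DELETE_TIMEOUT value are assumptions made for illustration (the constant itself is not shown in the log, though qpid.auto_delete_timeout=3 is visible in the qpid-config output), while the four option lines are taken from the attachment.

    # Standalone paraphrase of the restored hunk from kombu/transport/qpid.py.
    # The wrapper function name and the value of AUTO_DELETE_TIMEOUT are assumed;
    # only the four option assignments come from the log above.
    AUTO_DELETE_TIMEOUT = 3  # matches the qpid.auto_delete_timeout=3 seen in qpid-config


    def queue_declare_options(queue):
        """Return the qpid arguments used when declaring `queue`."""
        options = {}
        options['qpid.max_count'] = 10
        options['qpid.auto_delete_timeout'] = AUTO_DELETE_TIMEOUT
        # Celery event (celeryev.*) and broadcast/control (*.pidbox) queues get a
        # ring policy, so the broker overwrites the oldest entry instead of
        # raising resource-limit-exceeded when qpid.max_count is reached.
        if queue.startswith('celeryev') or queue.endswith('pidbox'):
            options['qpid.policy_type'] = 'ring'
        return options


    # e.g. queue_declare_options('celeryev.0404e905-9163-4f31-ab50-15ec5833a216')
    # includes 'qpid.policy_type': 'ring', while queue_declare_options('celery')
    # does not.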