Bug 1641864 - Missing module errors after upgrading to gofer-2.12.1-1
Summary: Missing module errors after upgrading to gofer-2.12.1-1
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: Red Hat Satellite
Classification: Red Hat
Component: katello-agent
Version: 6.4
Hardware: All
OS: Linux
Priority: high
Severity: high
Target Milestone: 6.5.0
Assignee: Jonathon Turel
QA Contact: Jan Hutař
URL:
Whiteboard:
Depends On:
Blocks:
 
Reported: 2018-10-23 03:01 UTC by Taft Sanders
Modified: 2019-11-05 22:35 UTC (History)
7 users

Fixed In Version:
Doc Type: If docs needed, set a value
Doc Text:
Clone Of:
Environment:
Last Closed: 2019-05-14 12:38:16 UTC
Target Upstream Version:
Embargoed:


Attachments


Links
System ID Private Priority Status Summary Last Updated
Red Hat Product Errata RHSA-2019:1222 0 None None None 2019-05-14 12:38:25 UTC

Description Taft Sanders 2018-10-23 03:01:48 UTC
Description of problem:
goferd appears to be running normally, but errors appear in the logs after upgrading goferd from the 6.3 version to the 6.4 version

Version-Release number of selected component (if applicable):
gofer-2.12.1-1

How reproducible:
Every time

Steps to Reproduce:
1. Fresh OS install (only tested on RHEL7)
2. Register to a Satellite and install katello-agent from sat-tools 6.3
3. Disable sat-tools-6.3 and enable sat-tools-6.4
4. Upgrade katello-agent
5. systemctl status goferd

Actual results:
systemd[1]: Starting Gofer Agent...
[INFO][Thread-1] gofer.rmi.store:108 - Using: /var/lib/gofer/messaging/pending/demo
[WARNING][MainThread] gofer.agent.plugin:647 - plugin:demo, DISABLED
[INFO][Thread-2] gofer.rmi.store:108 - Using: /var/lib/gofer/messaging/pending/katello
[ERROR][MainThread] gofer.agent.plugin:703 - plugin:katello, import failed
[ERROR][MainThread] gofer.agent.plugin:703 - Traceback (most recent call last):
[ERROR][MainThread] gofer.agent.plugin:703 -   File "/usr/lib/python2.7/site-packages/gofer/agent/plugin.py", line 687, in _load
[ERROR][MainThread] gofer.agent.plugin:703 -     plugin.impl = __import__(path, {}, {}, [path.split('.')[-1]])
[ERROR][MainThread] gofer.agent.plugin:703 - ImportError: No module named katello.agent.goferd.plugin
[INFO][MainThread] gofer.agent.main:92 - agent started.
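The traceback shows gofer loading the plugin by its dotted module path (the `__import__(path, ...)` call in plugin.py). A quick way to verify on a client whether that path actually resolves, mimicking gofer's load step (`plugin_importable` is a hypothetical helper, not part of gofer):

```python
import importlib

def plugin_importable(path):
    """Mimic gofer's plugin load step: try to import the dotted module path."""
    try:
        importlib.import_module(path)
        return True
    except ImportError:
        return False

# On an affected client, the path from the traceback fails to resolve until
# the package that ships the module is (re)installed:
# plugin_importable("katello.agent.goferd.plugin")
```

If this returns False while goferd reports "agent started", the daemon is up but the katello plugin never registered, which matches the behavior described below.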


Expected results:
No errors

Additional info:

Comment 1 Brad Buckingham 2018-10-24 19:21:41 UTC
Taft,

Thanks for the report.  

Did you notice any impact on the user workflows?  
e.g. Were you able to install/update/remove packages, groups, errata...etc?

Comment 2 Taft Sanders 2018-10-30 18:49:22 UTC
Brad,

Yeah, I can't push packages to a host via katello-agent:

Checked yum data on client to ensure connectivity to Satellite:
foreman-ssl_access_ssl.log:
10.12.213.129 - - [30/Oct/2018:14:22:17 -0400] "GET /rhsm/consumers/0b166023-f6bb-443e-97a5-a80d578e651b/release HTTP/1.1" 200 20 "-" "RHSM/1.0 (cmd=yum)"

- Initiated upgrade on dhclient package from the Satellite:
Id: 7dc24b0b-6a38-4712-bc83-1739bbaf6fb2

Label: Actions::Katello::Host::Package::Update

Status: running

Result: pending

Started at: 2018-10-30 18:23:50 UTC

Ended at:

Plan
Run
Finalize
Execution History
3: Actions::Pulp::Consumer::ContentUpdate (waiting for Pulp to start the task) [ 1408.27s / 12.40s ]
Started at: 2018-10-30 18:23:51 UTC

Ended at: 2018-10-30 18:47:19 UTC

Real time: 1408.27s

Execution time (excluding suspended state): 12.40s

Input:

---
consumer_uuid: 0b166023-f6bb-443e-97a5-a80d578e651b
type: rpm
args:
- dhclient-4.2.5-58.el7_4.4.x86_64
remote_user: admin
remote_cp_user: admin
Output:

---
pulp_tasks:
- exception: 
  task_type: 
  _href: "/pulp/api/v2/tasks/272d7d68-90bf-407b-bda1-8d41b4ad1c46/"
  task_id: 272d7d68-90bf-407b-bda1-8d41b4ad1c46
  tags:
  - pulp:consumer:0b166023-f6bb-443e-97a5-a80d578e651b
  - pulp:action:unit_update
  finish_time: 
  _ns: task_status
  start_time: 
  traceback: 
  spawned_tasks: []
  progress_report: {}
  queue: agent.dq2
  state: waiting
  worker_name: agent
  result: 
  error: 
  _id:
    "$oid": 5bd8a1b7882c103ec4f5ea40
  id: 5bd8a1b7882c103ec4f5ea40
poll_attempts:
  total: 107
  failed: 0
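The task above sat in state `waiting` on queue `agent.dq2` for over 1400 seconds, which is consistent with the consumer never picking it up because the katello goferd plugin failed to import. A minimal sketch of that stuck-task check, using the fields from the output (the helper and the 300-second threshold are assumptions for illustration):

```python
def task_stuck(task, elapsed_s, max_wait_s=300):
    """An agent-dispatched Pulp task still 'waiting' well past a reasonable
    pickup window suggests the consumer never received it (here, because the
    katello goferd plugin did not load)."""
    return task["state"] == "waiting" and elapsed_s > max_wait_s

# Values taken from the task output above:
stuck = task_stuck({"state": "waiting", "queue": "agent.dq2"}, elapsed_s=1408.27)
# stuck -> True
```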
==========================================================================================================================================
I'll leave this task running in case it's needed for further information. Also, all services on the Satellite are working:
[root@batman httpd]# foreman-maintain service status
Running Status Services
================================================================================
Get status of applicable services: 
Displaying the following service(s):

rh-mongodb34-mongod, postgresql, qdrouterd, qpidd, squid, pulp_celerybeat, pulp_resource_manager, pulp_streamer, pulp_workers, smart_proxy_dynflow_core, tomcat, dynflowd, goferd, httpd, puppetserver, foreman-proxy
| displaying rh-mongodb34-mongod                                                
● rh-mongodb34-mongod.service - High-performance, schema-free document-oriented database
   Loaded: loaded (/usr/lib/systemd/system/rh-mongodb34-mongod.service; enabled; vendor preset: disabled)
   Active: active (running) since Thu 2018-10-25 10:48:34 EDT; 5 days ago
  Process: 1115 ExecStart=/opt/rh/rh-mongodb34/root/usr/libexec/mongodb-scl-helper enable $RH_MONGODB34_SCLS_ENABLED -- /opt/rh/rh-mongodb34/root/usr/bin/mongod $OPTIONS run (code=exited, status=0/SUCCESS)
 Main PID: 1368 (mongod)
    Tasks: 81
   Memory: 6.6G
   CGroup: /system.slice/rh-mongodb34-mongod.service
           └─1368 /opt/rh/rh-mongodb34/root/usr/bin/mongod -f /etc/opt/rh/rh-mongodb34/mongod.conf run

Oct 30 09:11:00 batman.usersys.redhat.com mongod.27017[1368]: [conn53] command pulp_database.$cmd command: update { update: "workers", ordered: true, updates: [ { q: { _id: "reserved_resource_worker-6.redhat.com" }, u: { $set: { last_heartbeat: new Date(1540905060465) } }, multi: false, upsert: true } ] } numYields:0 reslen:59 locks:{ Global: { acquireCount: { r: 1, w: 1 } }, MMAPV1Journal: { acquireCount: { w: 3 }, acquireWaitCount: { w: 1 }, timeAcquiringMicros: { w: 444899 } }, Database: { acquireCount: { w: 1 } }, Collection: { acquireCount: { W: 1 }, acquireWaitCount: { W: 1 }, timeAcquiringMicros: { W: 202 } } } protocol:op_query 445ms
Oct 30 10:18:02 batman.usersys.redhat.com mongod.27017[1368]: [conn57] update pulp_database.workers query: { _id: "reserved_resource_worker-3.redhat.com" } planSummary: IDHACK update: { $set: { last_heartbeat: new Date(1540909082592) } } keysExamined:1 docsExamined:1 nMatched:1 nModified:1 numYields:0 locks:{ Global: { acquireCount: { r: 1, w: 1 } }, MMAPV1Journal: { acquireCount: { w: 2 }, acquireWaitCount: { w: 1 }, timeAcquiringMicros: { w: 102498 } }, Database: { acquireCount: { w: 1 } }, Collection: { acquireCount: { W: 1 } } } 102ms
Oct 30 10:18:02 batman.usersys.redhat.com mongod.27017[1368]: [conn57] command pulp_database.$cmd command: update { update: "workers", ordered: true, updates: [ { q: { _id: "reserved_resource_worker-3.redhat.com" }, u: { $set: { last_heartbeat: new Date(1540909082592) } }, multi: false, upsert: true } ] } numYields:0 reslen:59 locks:{ Global: { acquireCount: { r: 1, w: 1 } }, MMAPV1Journal: { acquireCount: { w: 2 }, acquireWaitCount: { w: 1 }, timeAcquiringMicros: { w: 102498 } }, Database: { acquireCount: { w: 1 } }, Collection: { acquireCount: { W: 1 } } } protocol:op_query 103ms
Oct 30 11:20:40 batman.usersys.redhat.com mongod.27017[1368]: [conn57] update pulp_database.workers query: { _id: "reserved_resource_worker-3.redhat.com" } planSummary: IDHACK update: { $set: { last_heartbeat: new Date(1540912840161) } } keysExamined:1 docsExamined:1 nMatched:1 nModified:1 numYields:0 locks:{ Global: { acquireCount: { r: 1, w: 1 } }, MMAPV1Journal: { acquireCount: { w: 2 }, acquireWaitCount: { w: 1 }, timeAcquiringMicros: { w: 233372 } }, Database: { acquireCount: { w: 1 } }, Collection: { acquireCount: { W: 1 } } } 233ms
Oct 30 11:20:40 batman.usersys.redhat.com mongod.27017[1368]: [conn57] command pulp_database.$cmd command: update { update: "workers", ordered: true, updates: [ { q: { _id: "reserved_resource_worker-3.redhat.com" }, u: { $set: { last_heartbeat: new Date(1540912840161) } }, multi: false, upsert: true } ] } numYields:0 reslen:59 locks:{ Global: { acquireCount: { r: 1, w: 1 } }, MMAPV1Journal: { acquireCount: { w: 2 }, acquireWaitCount: { w: 1 }, timeAcquiringMicros: { w: 233372 } }, Database: { acquireCount: { w: 1 } }, Collection: { acquireCount: { W: 1 } } } protocol:op_query 234ms
Oct 30 14:26:33 batman.usersys.redhat.com mongod.27017[1368]: [conn8052] received client metadata from 127.0.0.1:45190 conn8052: { application: { name: "MongoDB Shell" }, driver: { name: "MongoDB Internal Client", version: "3.4.9" }, os: { type: "Linux", name: "Red Hat Enterprise Linux Server release 7.5 (Maipo)", architecture: "x86_64", version: "Kernel 3.10.0-862.14.4.el7.x86_64" } }
Oct 30 14:26:55 batman.usersys.redhat.com mongod.27017[1368]: [conn8053] received client metadata from 127.0.0.1:45206 conn8053: { application: { name: "MongoDB Shell" }, driver: { name: "MongoDB Internal Client", version: "3.4.9" }, os: { type: "Linux", name: "Red Hat Enterprise Linux Server release 7.5 (Maipo)", architecture: "x86_64", version: "Kernel 3.10.0-862.14.4.el7.x86_64" } }
Oct 30 14:45:08 batman.usersys.redhat.com mongod.27017[1368]: [conn51] update pulp_database.workers query: { _id: "reserved_resource_worker-2.redhat.com" } planSummary: IDHACK update: { $set: { last_heartbeat: new Date(1540925108740) } } keysExamined:1 docsExamined:1 nMatched:1 nModified:1 numYields:0 locks:{ Global: { acquireCount: { r: 1, w: 1 } }, MMAPV1Journal: { acquireCount: { w: 2 }, acquireWaitCount: { w: 1 }, timeAcquiringMicros: { w: 113321 } }, Database: { acquireCount: { w: 1 } }, Collection: { acquireCount: { W: 1 } } } 113ms
Oct 30 14:45:08 batman.usersys.redhat.com mongod.27017[1368]: [conn51] command pulp_database.$cmd command: update { update: "workers", ordered: true, updates: [ { q: { _id: "reserved_resource_worker-2.redhat.com" }, u: { $set: { last_heartbeat: new Date(1540925108740) } }, multi: false, upsert: true } ] } numYields:0 reslen:59 locks:{ Global: { acquireCount: { r: 1, w: 1 } }, MMAPV1Journal: { acquireCount: { w: 2 }, acquireWaitCount: { w: 1 }, timeAcquiringMicros: { w: 113321 } }, Database: { acquireCount: { w: 1 } }, Collection: { acquireCount: { W: 1 } } } protocol:op_query 114ms
Oct 30 14:48:58 batman.usersys.redhat.com mongod.27017[1368]: [conn8066] received client metadata from 127.0.0.1:46250 conn8066: { application: { name: "MongoDB Shell" }, driver: { name: "MongoDB Internal Client", version: "3.4.9" }, os: { type: "Linux", name: "Red Hat Enterprise Linux Server release 7.5 (Maipo)", architecture: "x86_64", version: "Kernel 3.10.0-862.14.4.el7.x86_64" } }
| displaying postgresql                                                         
● postgresql.service - PostgreSQL database server
   Loaded: loaded (/etc/systemd/system/postgresql.service; enabled; vendor preset: disabled)
   Active: active (running) since Thu 2018-10-25 10:48:08 EDT; 5 days ago
  Process: 1179 ExecStart=/usr/bin/pg_ctl start -D ${PGDATA} -s -o -p ${PGPORT} -w -t 300 (code=exited, status=0/SUCCESS)
  Process: 1142 ExecStartPre=/usr/bin/postgresql-check-db-dir ${PGDATA} (code=exited, status=0/SUCCESS)
 Main PID: 1272 (postgres)
    Tasks: 48
   Memory: 1.3G
   CGroup: /system.slice/postgresql.service
           ├─1272 /usr/bin/postgres -D /var/lib/pgsql/data -p 5432
           ├─1366 postgres: logger process                        
           ├─1412 postgres: checkpointer process                  
           ├─1413 postgres: writer process                        
           ├─1414 postgres: wal writer process                    
           ├─1415 postgres: autovacuum launcher process           
           ├─1416 postgres: stats collector process               
           ├─1595 postgres: foreman foreman [local] idle          
           ├─1603 postgres: foreman foreman [local] idle          
           ├─1604 postgres: foreman foreman [local] idle          
           ├─1610 postgres: foreman foreman [local] idle          
           ├─2763 postgres: candlepin candlepin 127.0.0.1(46432) idl
           ├─2764 postgres: candlepin candlepin 127.0.0.1(46434) idl
           ├─2765 postgres: candlepin candlepin 127.0.0.1(46436) idl
           ├─2821 postgres: candlepin candlepin 127.0.0.1(46450) idl
           ├─2822 postgres: candlepin candlepin 127.0.0.1(46452) idl
           ├─2976 postgres: foreman foreman [local] idle          
           ├─2983 postgres: foreman foreman [local] idle          
           ├─2984 postgres: foreman foreman [local] idle          
           ├─2988 postgres: foreman foreman [local] idle          
           ├─3005 postgres: foreman foreman [local] idle          
           ├─3006 postgres: foreman foreman [local] idle          
           ├─4163 postgres: foreman foreman [local] idle          
           ├─4168 postgres: foreman foreman [local] idle          
           ├─4169 postgres: foreman foreman [local] idle          
           ├─4478 postgres: foreman foreman [local] idle          
           ├─5024 postgres: foreman foreman [local] idle          
           ├─5025 postgres: foreman foreman [local] idle          
           ├─5026 postgres: foreman foreman [local] idle          
           ├─5027 postgres: foreman foreman [local] idle          
           ├─5029 postgres: foreman foreman [local] idle          
           ├─5049 postgres: foreman foreman [local] idle          
           ├─5057 postgres: foreman foreman [local] idle          
           ├─5087 postgres: foreman foreman [local] idle          
           ├─5488 postgres: foreman foreman [local] idle          
           ├─6515 postgres: foreman foreman [local] idle          
           ├─7041 postgres: foreman foreman [local] idle          
           ├─7047 postgres: foreman foreman [local] idle          
           ├─7048 postgres: foreman foreman [local] idle          
           ├─7147 postgres: foreman foreman [local] idle          
           ├─8440 postgres: foreman foreman [local] idle          
           ├─8446 postgres: foreman foreman [local] idle          
           ├─8447 postgres: foreman foreman [local] idle          
           ├─8658 postgres: candlepin candlepin 127.0.0.1(54232) idl
           ├─8659 postgres: candlepin candlepin 127.0.0.1(54233) idl
           ├─8660 postgres: candlepin candlepin 127.0.0.1(54234) idl
           ├─8661 postgres: candlepin candlepin 127.0.0.1(54240) idl
           └─8662 postgres: candlepin candlepin 127.0.0.1(54242) idl

Warning: Journal has been rotated since unit was started. Log output is incomplete or unavailable.
| displaying qdrouterd                                                          
● qdrouterd.service - Qpid Dispatch router daemon
   Loaded: loaded (/usr/lib/systemd/system/qdrouterd.service; enabled; vendor preset: disabled)
   Active: active (running) since Thu 2018-10-25 18:41:42 EDT; 4 days ago
 Main PID: 26811 (qdrouterd)
    Tasks: 9
   Memory: 21.7M
   CGroup: /system.slice/qdrouterd.service
           └─26811 /usr/sbin/qdrouterd -c /etc/qpid-dispatch/qdrouterd.conf

Warning: Journal has been rotated since unit was started. Log output is incomplete or unavailable.
/ displaying qpidd                                                              
● qpidd.service - An AMQP message broker daemon.
   Loaded: loaded (/usr/lib/systemd/system/qpidd.service; enabled; vendor preset: disabled)
   Active: active (running) since Thu 2018-10-25 18:41:22 EDT; 4 days ago
     Docs: man:qpidd(1)
           http://qpid.apache.org/
 Main PID: 26751 (qpidd)
    Tasks: 10
   Memory: 45.6M
   CGroup: /system.slice/qpidd.service
           └─26751 /usr/sbin/qpidd --config /etc/qpid/qpidd.conf

Oct 30 14:48:59 batman.usersys.redhat.com qpidd[26751]: 2018-10-30 14:48:59 [Security] debug ACL: Lookup for id:anonymous action:access objectType:exchange name:celeryev with params { }
Oct 30 14:48:59 batman.usersys.redhat.com qpidd[26751]: 2018-10-30 14:48:59 [Security] debug ACL: Lookup for id:anonymous action:access objectType:exchange name:celeryev with params { }
Oct 30 14:48:59 batman.usersys.redhat.com qpidd[26751]: 2018-10-30 14:48:59 [Security] debug ACL: No successful match, defaulting to the decision mode allow
Oct 30 14:48:59 batman.usersys.redhat.com qpidd[26751]: 2018-10-30 14:48:59 [Broker] debug anonymous.22221275-7124-4151-8abc-db47c4249fa5:0: receiver marked completed: 418966 incomplete: { } unknown-completed: { [0,418966] }
Oct 30 14:48:59 batman.usersys.redhat.com qpidd[26751]: 2018-10-30 14:48:59 [Security] debug ACL: No successful match, defaulting to the decision mode allow
Oct 30 14:48:59 batman.usersys.redhat.com qpidd[26751]: 2018-10-30 14:48:59 [Broker] debug anonymous.22221275-7124-4151-8abc-db47c4249fa5:0: receiver marked completed: 418966 incomplete: { } unknown-completed: { [0,418966] }
Oct 30 14:48:59 batman.usersys.redhat.com qpidd[26751]: 2018-10-30 14:48:59 [Broker] debug : receive completed for msg seq=418967
Oct 30 14:48:59 batman.usersys.redhat.com qpidd[26751]: 2018-10-30 14:48:59 [Broker] debug : receive completed for msg seq=418967
Oct 30 14:48:59 batman.usersys.redhat.com qpidd[26751]: 2018-10-30 14:48:59 [Broker] debug anonymous.22221275-7124-4151-8abc-db47c4249fa5:0: receiver marked completed: 418967 incomplete: { } unknown-completed: { [0,418967] }
Oct 30 14:48:59 batman.usersys.redhat.com qpidd[26751]: 2018-10-30 14:48:59 [Broker] debug anonymous.22221275-7124-4151-8abc-db47c4249fa5:0: receiver marked completed: 418967 incomplete: { } unknown-completed: { [0,418967] }
/ displaying squid                                                              
● squid.service - Squid caching proxy
   Loaded: loaded (/usr/lib/systemd/system/squid.service; enabled; vendor preset: disabled)
   Active: active (running) since Thu 2018-10-25 10:47:51 EDT; 5 days ago
  Process: 2110 ExecReload=/usr/sbin/squid $SQUID_OPTS -k reconfigure -f $SQUID_CONF (code=exited, status=0/SUCCESS)
  Process: 1177 ExecStart=/usr/sbin/squid $SQUID_OPTS -f $SQUID_CONF (code=exited, status=0/SUCCESS)
  Process: 1106 ExecStartPre=/usr/libexec/squid/cache_swap.sh (code=exited, status=0/SUCCESS)
 Main PID: 1183 (squid)
    Tasks: 18
   Memory: 18.0M
   CGroup: /system.slice/squid.service
           ├─1183 /usr/sbin/squid -f /etc/squid/squid.conf
           └─1185 (squid-1) -f /etc/squid/squid.conf

Warning: Journal has been rotated since unit was started. Log output is incomplete or unavailable.
/ displaying pulp_celerybeat                                                    
● pulp_celerybeat.service - Pulp's Celerybeat
   Loaded: loaded (/usr/lib/systemd/system/pulp_celerybeat.service; enabled; vendor preset: disabled)
   Active: active (running) since Thu 2018-10-25 10:47:51 EDT; 5 days ago
 Main PID: 1168 (celery)
    Tasks: 5
   Memory: 50.8M
   CGroup: /system.slice/pulp_celerybeat.service
           └─1168 /usr/bin/python /usr/bin/celery beat --app=pulp.server.async.celery_instance.celery --scheduler=pulp.server.async.scheduler.Scheduler

Oct 30 13:15:15 batman.usersys.redhat.com pulp[1168]: celery.beat:INFO: Scheduler: Sending due task download_deferred_content (pulp.server.controllers.repository.queue_download_deferred)
Oct 30 13:25:15 batman.usersys.redhat.com pulp[1168]: celery.beat:INFO: Scheduler: Sending due task download_deferred_content (pulp.server.controllers.repository.queue_download_deferred)
Oct 30 13:35:15 batman.usersys.redhat.com pulp[1168]: celery.beat:INFO: Scheduler: Sending due task download_deferred_content (pulp.server.controllers.repository.queue_download_deferred)
Oct 30 13:45:15 batman.usersys.redhat.com pulp[1168]: celery.beat:INFO: Scheduler: Sending due task download_deferred_content (pulp.server.controllers.repository.queue_download_deferred)
Oct 30 13:55:15 batman.usersys.redhat.com pulp[1168]: celery.beat:INFO: Scheduler: Sending due task download_deferred_content (pulp.server.controllers.repository.queue_download_deferred)
Oct 30 14:05:15 batman.usersys.redhat.com pulp[1168]: celery.beat:INFO: Scheduler: Sending due task download_deferred_content (pulp.server.controllers.repository.queue_download_deferred)
Oct 30 14:15:15 batman.usersys.redhat.com pulp[1168]: celery.beat:INFO: Scheduler: Sending due task download_deferred_content (pulp.server.controllers.repository.queue_download_deferred)
Oct 30 14:25:15 batman.usersys.redhat.com pulp[1168]: celery.beat:INFO: Scheduler: Sending due task download_deferred_content (pulp.server.controllers.repository.queue_download_deferred)
Oct 30 14:35:15 batman.usersys.redhat.com pulp[1168]: celery.beat:INFO: Scheduler: Sending due task download_deferred_content (pulp.server.controllers.repository.queue_download_deferred)
Oct 30 14:45:15 batman.usersys.redhat.com pulp[1168]: celery.beat:INFO: Scheduler: Sending due task download_deferred_content (pulp.server.controllers.repository.queue_download_deferred)
/ displaying pulp_resource_manager                                              
● pulp_resource_manager.service - Pulp Resource Manager
   Loaded: loaded (/usr/lib/systemd/system/pulp_resource_manager.service; enabled; vendor preset: disabled)
   Active: active (running) since Thu 2018-10-25 10:47:51 EDT; 5 days ago
 Main PID: 1169 (celery)
    Tasks: 15
   Memory: 107.0M
   CGroup: /system.slice/pulp_resource_manager.service
           ├─1169 /usr/bin/python /usr/bin/celery worker -A pulp.server.async.app -n resource_manager@%h -Q resource_manager -c 1 --events --umask 18 --pidfile=/var/run/pulp/resource_manager.pid
           └─2292 /usr/bin/python /usr/bin/celery worker -A pulp.server.async.app -n resource_manager@%h -Q resource_manager -c 1 --events --umask 18 --pidfile=/var/run/pulp/resource_manager.pid

Oct 29 23:04:38 batman.usersys.redhat.com pulp[2292]: celery.app.trace:INFO: [a15691c8] Task pulp.server.async.tasks._queue_reserved_task[a15691c8-2163-4ff3-93a5-e919be6d1e7a] succeeded in 0.0613477489678s: None
Oct 29 23:04:39 batman.usersys.redhat.com pulp[2292]: celery.app.trace:INFO: [eefe2f39] Task pulp.server.async.tasks._queue_reserved_task[eefe2f39-7c81-492a-9ce0-8a1184ed826e] succeeded in 0.311454733019s: None
Oct 29 23:04:40 batman.usersys.redhat.com pulp[1169]: celery.worker.strategy:INFO: Received task: pulp.server.async.tasks._queue_reserved_task[52f13ae5-ce7c-4035-aec4-2f17d27188a7]
Oct 29 23:04:40 batman.usersys.redhat.com pulp[2292]: celery.app.trace:INFO: [52f13ae5] Task pulp.server.async.tasks._queue_reserved_task[52f13ae5-ce7c-4035-aec4-2f17d27188a7] succeeded in 0.0589094400057s: None
Oct 29 23:04:41 batman.usersys.redhat.com pulp[1169]: celery.worker.strategy:INFO: Received task: pulp.server.async.tasks._queue_reserved_task[d817fb9a-6f0d-408f-bf68-95527c705aa5]
Oct 29 23:04:41 batman.usersys.redhat.com pulp[2292]: celery.app.trace:INFO: [d817fb9a] Task pulp.server.async.tasks._queue_reserved_task[d817fb9a-6f0d-408f-bf68-95527c705aa5] succeeded in 0.0423962179921s: None
Oct 29 23:05:07 batman.usersys.redhat.com pulp[1169]: celery.worker.strategy:INFO: Received task: pulp.server.async.tasks._queue_reserved_task[77407a98-72c8-4c57-8c6c-bb3ab70b94b4]
Oct 29 23:05:07 batman.usersys.redhat.com pulp[2292]: celery.app.trace:INFO: [77407a98] Task pulp.server.async.tasks._queue_reserved_task[77407a98-72c8-4c57-8c6c-bb3ab70b94b4] succeeded in 0.0206409830134s: None
Oct 30 14:06:56 batman.usersys.redhat.com pulp[1169]: celery.worker.strategy:INFO: Received task: pulp.server.async.tasks._queue_reserved_task[0a0771ae-b1c8-405d-94f9-b7d01b2d030b]
Oct 30 14:06:56 batman.usersys.redhat.com pulp[2292]: celery.app.trace:INFO: [0a0771ae] Task pulp.server.async.tasks._queue_reserved_task[0a0771ae-b1c8-405d-94f9-b7d01b2d030b] succeeded in 0.0685426379787s: None
- displaying pulp_streamer                                                      
● pulp_streamer.service - The Pulp lazy content loading streamer
   Loaded: loaded (/usr/lib/systemd/system/pulp_streamer.service; enabled; vendor preset: disabled)
   Active: active (running) since Thu 2018-10-25 10:47:51 EDT; 5 days ago
 Main PID: 1175 (pulp_streamer)
    Tasks: 5
   Memory: 68.8M
   CGroup: /system.slice/pulp_streamer.service
           └─1175 /usr/bin/python /usr/bin/pulp_streamer --nodaemon --syslog --prefix=pulp_streamer --pidfile= --python /usr/share/pulp/wsgi/streamer.tac

Oct 30 14:39:25 batman.usersys.redhat.com pulp_streamer[1175]: requests.packages.urllib3.connectionpool:INFO: Starting new HTTPS connection (1): cdn.redhat.com
Oct 30 14:39:26 batman.usersys.redhat.com pulp_streamer[1175]: pulp.streamer.server:INFO: Trying URL: https://cdn.redhat.com/content/dist/rhel/server/7/7Server/x86_64/satellite/6.3/os/Packages/python-pulp-bindings-2.13.4.14-1.el7sat.noarch.rpm
Oct 30 14:39:26 batman.usersys.redhat.com pulp_streamer[1175]: requests.packages.urllib3.connectionpool:INFO: Starting new HTTPS connection (2): cdn.redhat.com
Oct 30 14:39:26 batman.usersys.redhat.com pulp_streamer[1175]: nectar.downloaders.threaded:INFO: Download succeeded: https://cdn.redhat.com/content/dist/rhel/server/7/7Server/x86_64/satellite/6.3/os/Packages/pulp-admin-client-2.13.4.14-1.el7sat.noarch.rpm.
Oct 30 14:39:26 batman.usersys.redhat.com pulp_streamer[1175]: [-] 127.0.0.1 - - [30/Oct/2018:18:39:25 +0000] "GET /var/lib/pulp/content/units/rpm/19/cb222d7ae1a89cae86fb5e60396f55fb5ca203dd32854462b9483952a453e1/pulp-admin-client-2.13.4.14-1.el7sat.noarch.rpm HTTP/1.1" 200 130300 "-" "urlgrabber/3.10 yum/3.4.3"
Oct 30 14:39:26 batman.usersys.redhat.com pulp_streamer[1175]: nectar.downloaders.threaded:INFO: Download succeeded: https://cdn.redhat.com/content/dist/rhel/server/7/7Server/x86_64/satellite/6.3/os/Packages/python-pulp-bindings-2.13.4.14-1.el7sat.noarch.rpm.
Oct 30 14:39:26 batman.usersys.redhat.com pulp_streamer[1175]: [-] 127.0.0.1 - - [30/Oct/2018:18:39:25 +0000] "GET /var/lib/pulp/content/units/rpm/3d/58958b0acc758ffc35aef81b0d26e1b67aef34d1673b1353250672c9ddad4b/python-pulp-bindings-2.13.4.14-1.el7sat.noarch.rpm HTTP/1.1" 200 115292 "-" "urlgrabber/3.10 yum/3.4.3"
Oct 30 14:39:26 batman.usersys.redhat.com pulp_streamer[1175]: pulp.streamer.server:INFO: Trying URL: https://cdn.redhat.com/content/dist/rhel/server/7/7Server/x86_64/satellite/6.3/os/Packages/python-pulp-client-lib-2.13.4.14-1.el7sat.noarch.rpm
Oct 30 14:39:27 batman.usersys.redhat.com pulp_streamer[1175]: nectar.downloaders.threaded:INFO: Download succeeded: https://cdn.redhat.com/content/dist/rhel/server/7/7Server/x86_64/satellite/6.3/os/Packages/python-pulp-client-lib-2.13.4.14-1.el7sat.noarch.rpm.
Oct 30 14:39:27 batman.usersys.redhat.com pulp_streamer[1175]: [-] 127.0.0.1 - - [30/Oct/2018:18:39:26 +0000] "GET /var/lib/pulp/content/units/rpm/d1/f7420e0af01ea50dafac60c15f3137bf7e69ae3b8380589bf85f46185cb95b/python-pulp-client-lib-2.13.4.14-1.el7sat.noarch.rpm HTTP/1.1" 200 204276 "-" "urlgrabber/3.10 yum/3.4.3"
- displaying pulp_workers                                                       
● pulp_workers.service - Pulp Celery Workers
   Loaded: loaded (/usr/lib/systemd/system/pulp_workers.service; enabled; vendor preset: disabled)
   Active: active (exited) since Thu 2018-10-25 10:48:01 EDT; 5 days ago
  Process: 1146 ExecStart=/usr/bin/python -m pulp.server.async.manage_workers start (code=exited, status=0/SUCCESS)
 Main PID: 1146 (code=exited, status=0/SUCCESS)
    Tasks: 0
   Memory: 0B
   CGroup: /system.slice/pulp_workers.service

Warning: Journal has been rotated since unit was started. Log output is incomplete or unavailable.
- displaying smart_proxy_dynflow_core                                           
● smart_proxy_dynflow_core.service - Foreman smart proxy dynflow core service
   Loaded: loaded (/usr/lib/systemd/system/smart_proxy_dynflow_core.service; enabled; vendor preset: disabled)
  Drop-In: /etc/systemd/system/smart_proxy_dynflow_core.service.d
           └─90-limits.conf
   Active: active (running) since Thu 2018-10-25 10:48:53 EDT; 5 days ago
     Docs: https://github.com/theforeman/smart_proxy_dynflow
  Process: 1104 ExecStart=/usr/bin/smart_proxy_dynflow_core -d -p /var/run/foreman-proxy/smart_proxy_dynflow_core.pid (code=exited, status=0/SUCCESS)
 Main PID: 2424 (ruby)
    Tasks: 12
   Memory: 63.6M
   CGroup: /system.slice/smart_proxy_dynflow_core.service
           └─2424 ruby /usr/bin/smart_proxy_dynflow_core -d -p /var/run/foreman-proxy/smart_proxy_dynflow_core.pid

Warning: Journal has been rotated since unit was started. Log output is incomplete or unavailable.
- displaying tomcat                                                             
● tomcat.service - Apache Tomcat Web Application Container
   Loaded: loaded (/usr/lib/systemd/system/tomcat.service; enabled; vendor preset: disabled)
   Active: active (running) since Thu 2018-10-25 10:47:51 EDT; 5 days ago
 Main PID: 1141 (java)
    Tasks: 160
   Memory: 1.3G
   CGroup: /system.slice/tomcat.service
           └─1141 /usr/lib/jvm/jre/bin/java -Xms1024m -Xmx4096m -classpath /usr/share/tomcat/bin/bootstrap.jar:/usr/share/tomcat/bin/tomcat-juli.jar:/usr/share/java/commons-daemon.jar -Dcatalina.base=/usr/share/tomcat -Dcatalina.home=/usr/share/tomcat -Djava.endorsed.dirs= -Djava.io.tmpdir=/var/cache/tomcat/temp -Djava.util.logging.config.file=/usr/share/tomcat/conf/logging.properties -Djava.util.logging.manager=org.apache.juli.ClassLoaderLogManager org.apache.catalina.startup.Bootstrap start

Warning: Journal has been rotated since unit was started. Log output is incomplete or unavailable.
- displaying dynflowd                                                           
● dynflowd.service - Foreman jobs daemon
   Loaded: loaded (/usr/lib/systemd/system/dynflowd.service; enabled; vendor preset: disabled)
   Active: active (running) since Thu 2018-10-25 10:49:15 EDT; 5 days ago
     Docs: https://theforeman.org
  Process: 1102 ExecStart=/usr/sbin/dynflowd start (code=exited, status=0/SUCCESS)
    Tasks: 19
   Memory: 1.1G
   CGroup: /system.slice/dynflowd.service
           ├─2670 dynflow_executor_monitor     
           └─2671 dynflow_executor             

Warning: Journal has been rotated since unit was started. Log output is incomplete or unavailable.
- displaying goferd                                                             
● goferd.service - Gofer Agent
   Loaded: loaded (/usr/lib/systemd/system/goferd.service; enabled; vendor preset: disabled)
   Active: active (running) since Thu 2018-10-25 10:47:51 EDT; 5 days ago
 Main PID: 1126 (python)
    Tasks: 8
   Memory: 137.2M
   CGroup: /system.slice/goferd.service
           └─1126 /usr/bin/python /usr/bin/goferd --foreground

Oct 30 14:44:29 batman.usersys.redhat.com goferd[1126]: [ERROR][worker-0] gofer.messaging.adapter.connect:33 - connect: proton+amqps://subscription.rhsm.redhat.com:5647, failed: Connection amqps://subscription.rhsm.redhat.com:5647 disconnected: Condition('proton:io', 'recv: Connection refused')
Oct 30 14:44:29 batman.usersys.redhat.com goferd[1126]: [INFO][worker-0] gofer.messaging.adapter.connect:35 - retry in 106 seconds
Oct 30 14:46:16 batman.usersys.redhat.com goferd[1126]: [INFO][worker-0] gofer.messaging.adapter.connect:28 - connecting: proton+amqps://subscription.rhsm.redhat.com:5647
Oct 30 14:46:16 batman.usersys.redhat.com goferd[1126]: [INFO][worker-0] gofer.messaging.adapter.proton.connection:87 - open: URL: amqps://subscription.rhsm.redhat.com:5647|SSL: ca: /etc/rhsm/ca/redhat-uep.pem|key: None|certificate: /etc/pki/consumer/bundle.pem|host-validation: None
Oct 30 14:46:16 batman.usersys.redhat.com goferd[1126]: [ERROR][worker-0] gofer.messaging.adapter.connect:33 - connect: proton+amqps://subscription.rhsm.redhat.com:5647, failed: Connection amqps://subscription.rhsm.redhat.com:5647 disconnected: Condition('proton:io', 'recv: Connection refused')
Oct 30 14:46:16 batman.usersys.redhat.com goferd[1126]: [INFO][worker-0] gofer.messaging.adapter.connect:35 - retry in 106 seconds
Oct 30 14:48:03 batman.usersys.redhat.com goferd[1126]: [INFO][worker-0] gofer.messaging.adapter.connect:28 - connecting: proton+amqps://subscription.rhsm.redhat.com:5647
Oct 30 14:48:03 batman.usersys.redhat.com goferd[1126]: [INFO][worker-0] gofer.messaging.adapter.proton.connection:87 - open: URL: amqps://subscription.rhsm.redhat.com:5647|SSL: ca: /etc/rhsm/ca/redhat-uep.pem|key: None|certificate: /etc/pki/consumer/bundle.pem|host-validation: None
Oct 30 14:48:03 batman.usersys.redhat.com goferd[1126]: [ERROR][worker-0] gofer.messaging.adapter.connect:33 - connect: proton+amqps://subscription.rhsm.redhat.com:5647, failed: Connection amqps://subscription.rhsm.redhat.com:5647 disconnected: Condition('proton:io', 'recv: Connection refused')
Oct 30 14:48:03 batman.usersys.redhat.com goferd[1126]: [INFO][worker-0] gofer.messaging.adapter.connect:35 - retry in 106 seconds
\ displaying httpd                                                              
● httpd.service - The Apache HTTP Server
   Loaded: loaded (/usr/lib/systemd/system/httpd.service; enabled; vendor preset: disabled)
   Active: active (running) since Thu 2018-10-25 10:48:12 EDT; 5 days ago
     Docs: man:httpd(8)
           man:apachectl(8)
  Process: 16029 ExecReload=/usr/sbin/httpd $OPTIONS -k graceful (code=exited, status=0/SUCCESS)
 Main PID: 1103 (httpd)
   Status: "Total requests: 0; Current requests/sec: 0; Current traffic:   0 B/sec"
    Tasks: 245
   Memory: 2.3G
   CGroup: /system.slice/httpd.service
           ├─ 1103 /usr/sbin/httpd -DFOREGROUND
           ├─ 1581 Passenger RackApp: /usr/share/foreman                                      
           ├─ 4153 Passenger RackApp: /usr/share/foreman                                      
           ├─ 7029 Passenger RackApp: /usr/share/foreman                                      
           ├─ 7955 /usr/sbin/httpd -DFOREGROUND
           ├─ 8430 Passenger RackApp: /usr/share/foreman                                      
           ├─16068 (wsgi:pulp)     -DFOREGROUND
           ├─16069 (wsgi:pulp)     -DFOREGROUND
           ├─16070 (wsgi:pulp)     -DFOREGROUND
           ├─16071 (wsgi:pulp-cont -DFOREGROUND
           ├─16072 (wsgi:pulp-cont -DFOREGROUND
           ├─16073 (wsgi:pulp-cont -DFOREGROUND
           ├─16074 (wsgi:pulp_forg -DFOREGROUND
           ├─16075 PassengerWatchdog
           ├─16078 PassengerHelperAgent
           ├─16086 PassengerLoggingAgent
           ├─16093 /usr/sbin/httpd -DFOREGROUND
           ├─16094 /usr/sbin/httpd -DFOREGROUND
           ├─16095 /usr/sbin/httpd -DFOREGROUND
           ├─16096 /usr/sbin/httpd -DFOREGROUND
           ├─16097 /usr/sbin/httpd -DFOREGROUND
           ├─16098 /usr/sbin/httpd -DFOREGROUND
           ├─16099 /usr/sbin/httpd -DFOREGROUND
           ├─16100 /usr/sbin/httpd -DFOREGROUND
           ├─16375 /bin/bash /usr/bin/tfm-ruby /usr/share/gems/gems/passenger-4.0.18/helper-scripts/rack-preloader.rb
           ├─16383 scl enable tfm bash /tmp/tmp.kjzeJ0xLns
           ├─16384 /bin/bash /var/tmp/sclkEy04T
           ├─16408 bash /tmp/tmp.kjzeJ0xLns
           ├─16409 Passenger AppPreloader: /usr/share/foreman                                 
           ├─16585 /usr/sbin/httpd -DFOREGROUND
           ├─16643 /usr/sbin/httpd -DFOREGROUND
           ├─17700 /usr/sbin/httpd -DFOREGROUND
           ├─17843 /usr/sbin/httpd -DFOREGROUND
           ├─17897 /usr/sbin/httpd -DFOREGROUND
           ├─18709 /usr/sbin/httpd -DFOREGROUND
           └─19706 /usr/sbin/httpd -DFOREGROUND

Oct 30 14:06:50 batman.usersys.redhat.com pulp[16069]: kombu.transport.qpid:INFO: Connected to qpid with SASL mechanism ANONYMOUS
Oct 30 14:06:55 batman.usersys.redhat.com pulp[16069]: kombu.transport.qpid:INFO: Connected to qpid with SASL mechanism ANONYMOUS
Oct 30 14:23:50 batman.usersys.redhat.com pulp[16068]: kombu.transport.qpid:INFO: Connected to qpid with SASL mechanism ANONYMOUS
Oct 30 14:23:51 batman.usersys.redhat.com pulp[16068]: gofer.messaging.adapter.connect:INFO: connecting: qpid+ssl://localhost:5671
Oct 30 14:23:51 batman.usersys.redhat.com pulp[16068]: gofer.messaging.adapter.qpid.connection:INFO: open: URL: ssl://localhost|SSL: ca: /etc/pki/katello/certs/katello-default-ca.crt|key: None|certificate: /etc/pki/katello/qpid_client_striped.crt|host-validation: False
Oct 30 14:23:51 batman.usersys.redhat.com pulp[16068]: gofer.messaging.adapter.qpid.connection:INFO: opened: qpid+ssl://localhost:5671
Oct 30 14:23:51 batman.usersys.redhat.com pulp[16068]: gofer.messaging.adapter.connect:INFO: connected: qpid+ssl://localhost:5671
Oct 30 14:23:51 batman.usersys.redhat.com pulp[16068]: gofer.messaging.adapter.connect:INFO: connecting: qpid+ssl://localhost:5671
Oct 30 14:23:51 batman.usersys.redhat.com pulp[16068]: gofer.messaging.adapter.connect:INFO: connected: qpid+ssl://localhost:5671
Oct 30 14:23:51 batman.usersys.redhat.com pulp[16068]: gofer.messaging.adapter.qpid.connection:INFO: closed: qpid+ssl://localhost:5671
\ displaying puppetserver                                                       
● puppetserver.service - puppetserver Service
   Loaded: loaded (/usr/lib/systemd/system/puppetserver.service; enabled; vendor preset: disabled)
   Active: active (running) since Thu 2018-10-25 10:49:08 EDT; 5 days ago
  Process: 1174 ExecStart=/opt/puppetlabs/server/apps/puppetserver/bin/puppetserver start (code=exited, status=0/SUCCESS)
 Main PID: 1224 (java)
    Tasks: 75
   Memory: 1.6G
   CGroup: /system.slice/puppetserver.service
           └─1224 /usr/bin/java -Xms2G -Xmx2G -XX:MaxPermSize=256m -Djava.security.egd=/dev/urandom -XX:OnOutOfMemoryError=kill -9 %p -cp /opt/puppetlabs/server/apps/puppetserver/puppet-server-release.jar:/opt/puppetlabs/server/apps/puppetserver/jruby-1_7.jar:/opt/puppetlabs/server/data/puppetserver/jars/* clojure.main -m puppetlabs.trapperkeeper.main --config /etc/puppetlabs/puppetserver/conf.d --bootstrap-config /etc/puppetlabs/puppetserver/services.d/,/opt/puppetlabs/server/apps/puppetserver/config/services.d/ --restart-file /opt/puppetlabs/server/data/puppetserver/restartcounter

Warning: Journal has been rotated since unit was started. Log output is incomplete or unavailable.
\ displaying foreman-proxy                                                      
● foreman-proxy.service - Foreman Proxy
   Loaded: loaded (/usr/lib/systemd/system/foreman-proxy.service; enabled; vendor preset: disabled)
   Active: active (running) since Thu 2018-10-25 10:48:33 EDT; 5 days ago
 Main PID: 1130 (ruby)
    Tasks: 8
   Memory: 827.4M
   CGroup: /system.slice/foreman-proxy.service
           └─1130 ruby /usr/share/foreman-proxy/bin/smart-proxy --no-daemonize

Oct 30 14:40:25 batman.usersys.redhat.com smart-proxy[1130]: dhcp-12-213-135.gsslab.rdu.redhat.com - - [30/Oct/2018:14:40:24 EDT] "POST /compliance/arf/1 HTTP/1.1" 200 0
Oct 30 14:40:25 batman.usersys.redhat.com smart-proxy[1130]: - -> /compliance/arf/1
Oct 30 14:44:59 batman.usersys.redhat.com smart-proxy[1130]: dhcp-12-213-135.gsslab.rdu.redhat.com - - [30/Oct/2018:14:44:58 EDT] "POST /compliance/arf/2 HTTP/1.1" 200 0
Oct 30 14:44:59 batman.usersys.redhat.com smart-proxy[1130]: - -> /compliance/arf/2
Oct 30 14:45:03 batman.usersys.redhat.com smart-proxy[1130]: dhcp-12-213-107.gsslab.rdu.redhat.com - - [30/Oct/2018:14:45:02 EDT] "POST /compliance/arf/2 HTTP/1.1" 200 0
Oct 30 14:45:03 batman.usersys.redhat.com smart-proxy[1130]: - -> /compliance/arf/2
Oct 30 14:45:22 batman.usersys.redhat.com smart-proxy[1130]: dhcp-12-213-107.gsslab.rdu.redhat.com - - [30/Oct/2018:14:45:22 EDT] "POST /compliance/arf/1 HTTP/1.1" 200 0
Oct 30 14:45:22 batman.usersys.redhat.com smart-proxy[1130]: - -> /compliance/arf/1
Oct 30 14:45:24 batman.usersys.redhat.com smart-proxy[1130]: dhcp-12-213-135.gsslab.rdu.redhat.com - - [30/Oct/2018:14:45:23 EDT] "POST /compliance/arf/1 HTTP/1.1" 200 0
Oct 30 14:45:24 batman.usersys.redhat.com smart-proxy[1130]: - -> /compliance/arf/1
\ All services are running                                            [OK]      
--------------------------------------------------------------------------------

Comment 3 Taft Sanders 2018-10-31 12:24:56 UTC
Brad,

I let the task sit overnight just to be sure it wasn't due to my infrastructure being slow. Here is the task this morning:

Action:

Actions::Pulp::Consumer::ContentUpdate
State: waiting for Pulp to start the task

Input:
{"consumer_uuid"=>"0b166023-f6bb-443e-97a5-a80d578e651b",
 "type"=>"rpm",
 "args"=>["dhclient-4.2.5-58.el7_4.4.x86_64"],
 "remote_user"=>"admin",
 "remote_cp_user"=>"admin"}
Output:
{"pulp_tasks"=>
  [{"exception"=>nil,
    "task_type"=>nil,
    "_href"=>"/pulp/api/v2/tasks/272d7d68-90bf-407b-bda1-8d41b4ad1c46/",
    "task_id"=>"272d7d68-90bf-407b-bda1-8d41b4ad1c46",
    "tags"=>
     ["pulp:consumer:0b166023-f6bb-443e-97a5-a80d578e651b",
      "pulp:action:unit_update"],
    "finish_time"=>nil,
    "_ns"=>"task_status",
    "start_time"=>nil,
    "traceback"=>nil,
    "spawned_tasks"=>[],
    "progress_report"=>{},
    "queue"=>"agent.dq2",
    "state"=>"waiting",
    "worker_name"=>"agent",
    "result"=>nil,
    "error"=>nil,
    "_id"=>{"$oid"=>"5bd8a1b7882c103ec4f5ea40"},
    "id"=>"5bd8a1b7882c103ec4f5ea40"}],
 "poll_attempts"=>{"total"=>3897, "failed"=>0}}

Comment 5 Jameer Pathan 2018-11-05 07:28:19 UTC
The reported issue is not reproducible with the following steps:

1. Installed Satellite 6.3
2. Registered a RHEL 7.5 client
3. Package install from Satellite to client: working fine
4. Upgraded Satellite 6.3 -> 6.4 via foreman-maintain
5. Re-validated step 3: working fine
6. Upgraded the client to the 6.4 tools bits
7. Performed step 3 again and did not see the issue.

Note: we explicitly tested package push and pull, and both worked fine before and after the 6.4 Satellite/client upgrade.

Comment 6 Jameer Pathan 2018-11-05 07:34:36 UTC
Additionally, the client is on gofer-2.12.1-1 after the upgrade.

Comment 7 Sachin Ghai 2018-11-05 07:41:29 UTC
@Taft: Hey, per comment 5 the issue is not reproducible. Could you please review the steps in comment 5 and confirm whether they match yours, or whether there is any deviation?

Comment 8 Jonathon Turel 2018-11-11 15:39:19 UTC
The root cause is that the new version of katello-agent was missing a 'Requires' on a specific version of katello-host-tools, which underwent some code reorganization in 6.4. In other words, the 3.3.5 katello-agent and the 3.3.1 katello-host-tools RPMs are not compatible.
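For illustration only, a hypothetical spec-file fragment showing the kind of versioned dependency the fixed katello-agent RPM would carry (the exact constraint shipped in the errata build may differ):

```
# Hypothetical RPM spec fragment -- the real version bound in the
# released package may differ from what is shown here.
Requires: katello-host-tools >= 3.3.5
```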

Until we release a new katello-agent RPM with the corrected dependency, the workaround is to run 'yum upgrade katello-host-tools' and then restart goferd with 'systemctl restart goferd'.
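The failing import can be checked directly with a short Python sketch. The module path comes from the traceback in the logs; `plugin_importable` is a helper name invented here for illustration, not part of gofer:

```python
def plugin_importable(path):
    """Return True if the gofer plugin module at `path` imports cleanly.

    Mirrors the __import__ call from gofer/agent/plugin.py shown in the
    traceback; the katello module path is taken from the log output above.
    """
    try:
        __import__(path, {}, {}, [path.split('.')[-1]])
        return True
    except ImportError:
        return False

# On an affected client this prints False until katello-host-tools is
# upgraded to a version that provides katello.agent.goferd.plugin.
print(plugin_importable("katello.agent.goferd.plugin"))
```

Running this on a client before and after the katello-host-tools upgrade distinguishes the missing-module case from unrelated goferd startup problems.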

Comment 11 Taft Sanders 2018-12-14 14:32:09 UTC
[root@drone73 ~]# systemctl status goferd -l
● goferd.service - Gofer Agent
   Loaded: loaded (/usr/lib/systemd/system/goferd.service; enabled; vendor preset: disabled)
   Active: active (running) since Fri 2018-12-14 09:16:20 EST; 9min ago
 Main PID: 31730 (python)
   CGroup: /system.slice/goferd.service
           └─31730 /usr/bin/python /usr/bin/goferd --foreground

Dec 14 09:16:20 drone73.usersys.redhat.com systemd[1]: Starting Gofer Agent...
Dec 14 09:16:20 drone73.usersys.redhat.com goferd[31730]: [INFO][Thread-1] gofer.rmi.store:108 - Using: /var/lib/gofer/messaging/pending/demo
Dec 14 09:16:20 drone73.usersys.redhat.com goferd[31730]: [WARNING][MainThread] gofer.agent.plugin:647 - plugin:demo, DISABLED
Dec 14 09:16:20 drone73.usersys.redhat.com goferd[31730]: [INFO][Thread-2] gofer.rmi.store:108 - Using: /var/lib/gofer/messaging/pending/katello
Dec 14 09:16:20 drone73.usersys.redhat.com goferd[31730]: [ERROR][MainThread] gofer.agent.plugin:703 - plugin:katello, import failed
Dec 14 09:16:20 drone73.usersys.redhat.com goferd[31730]: [ERROR][MainThread] gofer.agent.plugin:703 - Traceback (most recent call last):
Dec 14 09:16:20 drone73.usersys.redhat.com goferd[31730]: [ERROR][MainThread] gofer.agent.plugin:703 -   File "/usr/lib/python2.7/site-packages/gofer/agent/plugin.py", line 687, in _load
Dec 14 09:16:20 drone73.usersys.redhat.com goferd[31730]: [ERROR][MainThread] gofer.agent.plugin:703 -     plugin.impl = __import__(path, {}, {}, [path.split('.')[-1]])
Dec 14 09:16:20 drone73.usersys.redhat.com goferd[31730]: [ERROR][MainThread] gofer.agent.plugin:703 - ImportError: No module named katello.agent.goferd.plugin
Dec 14 09:16:20 drone73.usersys.redhat.com goferd[31730]: [INFO][MainThread] gofer.agent.main:92 - agent started.
[root@drone73 ~]# rpm -qa | grep katello-host
katello-host-tools-3.1.0-2.el7sat.noarch
katello-host-tools-fact-plugin-3.1.0-2.el7sat.noarch


Updating to the latest katello-host-tools:
[root@drone73 ~]# yum update katello-host-tools
Loaded plugins: enabled_repos_upload, package_upload, product-id, search-disabled-repos, subscription-manager
rhel-7-server-rpms                                                                             | 2.0 kB  00:00:00     
rhel-7-server-satellite-tools-6.4-rpms                                                         | 2.1 kB  00:00:00     
Resolving Dependencies
There are unfinished transactions remaining. You might consider running yum-complete-transaction, or "yum-complete-transaction --cleanup-only" and "yum history redo last", first to finish them. If those don't work you'll have to try removing/installing packages by hand (maybe package-cleanup can help).
--> Running transaction check
---> Package katello-host-tools.noarch 0:3.1.0-2.el7sat will be updated
---> Package katello-host-tools.noarch 0:3.3.5-3.el7sat will be an update
--> Finished Dependency Resolution

Dependencies Resolved

======================================================================================================================
 Package                   Arch          Version                  Repository                                     Size
======================================================================================================================
Updating:
 katello-host-tools        noarch        3.3.5-3.el7sat           rhel-7-server-satellite-tools-6.4-rpms         31 k

Transaction Summary
======================================================================================================================
Upgrade  1 Package

Total download size: 31 k
Is this ok [y/d/N]: y
Downloading packages:
Delta RPMs disabled because /usr/bin/applydeltarpm not installed.
katello-host-tools-3.3.5-3.el7sat.noarch.rpm                                                   |  31 kB  00:00:00     
Running transaction check
Running transaction test
Transaction test succeeded
Running transaction
  Updating   : katello-host-tools-3.3.5-3.el7sat.noarch                                                           1/2 
  Cleanup    : katello-host-tools-3.1.0-2.el7sat.noarch                                                           2/2 
Loaded plugins: product-id
Loaded plugins: product-id
Uploading Package Profile
rhel-7-server-rpms/7Server/x86_64/productid                                                    | 2.1 kB  00:00:00     
  Verifying  : katello-host-tools-3.3.5-3.el7sat.noarch                                                           1/2 
  Verifying  : katello-host-tools-3.1.0-2.el7sat.noarch                                                           2/2 

Updated:
  katello-host-tools.noarch 0:3.3.5-3.el7sat                                                                          

Complete!
Uploading Enabled Repositories Report
Loaded plugins: product-id
You have mail in /var/spool/mail/root
[root@drone73 ~]# rpm -qa | grep katello-agent
katello-agent-3.3.5-3.el7sat.noarch


No more errors with the latest katello-host-tools RPM. The issue appears to be fixed:
[root@drone73 ~]# rpm -qa | grep katello-agent
katello-agent-3.3.5-3.el7sat.noarch
[root@drone73 ~]# systemctl restart goferd
[root@drone73 ~]# systemctl status goferd -l
● goferd.service - Gofer Agent
   Loaded: loaded (/usr/lib/systemd/system/goferd.service; enabled; vendor preset: disabled)
   Active: active (running) since Fri 2018-12-14 09:29:15 EST; 5s ago
 Main PID: 32206 (python)
   CGroup: /system.slice/goferd.service
           └─32206 /usr/bin/python /usr/bin/goferd --foreground

Dec 14 09:29:16 drone73.usersys.redhat.com goferd[32206]: [INFO][worker-0] gofer.messaging.adapter.proton.connection:87 - open: URL: amqps://batman.usersys.redhat.com:5647|SSL: ca: /etc/rhsm/ca/katello-default-ca.pem|key: None|certificate: /etc/pki/consumer/bundle.pem|host-validation: None
Dec 14 09:29:16 drone73.usersys.redhat.com goferd[32206]: [INFO][MainThread] gofer.agent.main:92 - agent started.
Dec 14 09:29:16 drone73.usersys.redhat.com goferd[32206]: [INFO][worker-0] gofer.messaging.adapter.proton.connection:92 - opened: proton+amqps://batman.usersys.redhat.com:5647
Dec 14 09:29:16 drone73.usersys.redhat.com goferd[32206]: [INFO][worker-0] gofer.messaging.adapter.connect:30 - connected: proton+amqps://batman.usersys.redhat.com:5647
Dec 14 09:29:16 drone73.usersys.redhat.com goferd[32206]: [INFO][worker-0] gofer.messaging.adapter.proton.connection:131 - closed: proton+amqps://batman.usersys.redhat.com:5647
Dec 14 09:29:16 drone73.usersys.redhat.com goferd[32206]: [INFO][pulp.agent.7fc523c5-4916-4ead-94f4-fef139c1f65c] gofer.messaging.adapter.connect:28 - connecting: proton+amqps://batman.usersys.redhat.com:5647
Dec 14 09:29:16 drone73.usersys.redhat.com goferd[32206]: [INFO][pulp.agent.7fc523c5-4916-4ead-94f4-fef139c1f65c] gofer.messaging.adapter.proton.connection:87 - open: URL: amqps://batman.usersys.redhat.com:5647|SSL: ca: /etc/rhsm/ca/katello-default-ca.pem|key: None|certificate: /etc/pki/consumer/bundle.pem|host-validation: None
Dec 14 09:29:16 drone73.usersys.redhat.com goferd[32206]: [INFO][worker-0] gofer.agent.plugin:379 - plugin:katello, attached => pulp.agent.7fc523c5-4916-4ead-94f4-fef139c1f65c
Dec 14 09:29:17 drone73.usersys.redhat.com goferd[32206]: [INFO][pulp.agent.7fc523c5-4916-4ead-94f4-fef139c1f65c] gofer.messaging.adapter.proton.connection:92 - opened: proton+amqps://batman.usersys.redhat.com:5647
Dec 14 09:29:17 drone73.usersys.redhat.com goferd[32206]: [INFO][pulp.agent.7fc523c5-4916-4ead-94f4-fef139c1f65c] gofer.messaging.adapter.connect:30 - connected: proton+amqps://batman.usersys.redhat.com:5647


Looks fixed to me with the latest release of katello-host-tools.

Comment 15 errata-xmlrpc 2019-05-14 12:38:16 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory, and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHSA-2019:1222

