Red Hat Bugzilla – Attachment 1458445 Details for Bug 1546099 – Drop access logs and add qpid logs
Description: katello-service status command output
Filename: katello-service_status.txt
MIME Type: text/plain
Creator: Chris Brown
Created: 2018-07-12 13:56:26 UTC
Size: 28.26 KB
[root@ibm-x3550m3-10 foreman-debug-2cQxq]# katello-service status
systemctl status rh-mongodb34-mongod.service
● rh-mongodb34-mongod.service - High-performance, schema-free document-oriented database
   Loaded: loaded (/usr/lib/systemd/system/rh-mongodb34-mongod.service; enabled; vendor preset: disabled)
   Active: active (running) since Wed 2018-07-11 20:33:10 CEST; 19h ago
  Process: 19370 ExecStart=/opt/rh/rh-mongodb34/root/usr/libexec/mongodb-scl-helper enable $RH_MONGODB34_SCLS_ENABLED -- /opt/rh/rh-mongodb34/root/usr/bin/mongod $OPTIONS run (code=exited, status=0/SUCCESS)
 Main PID: 19375 (mongod)
    Tasks: 77
   CGroup: /system.slice/rh-mongodb34-mongod.service
           └─19375 /opt/rh/rh-mongodb34/root/usr/bin/mongod -f /etc/opt/rh/rh-mongodb34/mongod.conf run

Jul 12 02:33:13 ibm-x3550m3-10.lab.eng.brq.redhat.com mongod.27017[19375]: [conn306] build index on: pulp_database.consumer_history properties: { v: 2, key: { consumer_id: -1 }, name: "consumer_id_-1", ns: "pulp_database.consumer_history", background: true }
Jul 12 02:33:13 ibm-x3550m3-10.lab.eng.brq.redhat.com mongod.27017[19375]: [conn306] build index done. scanned 0 total records. 0 secs
Jul 12 02:33:13 ibm-x3550m3-10.lab.eng.brq.redhat.com mongod.27017[19375]: [conn306] build index on: pulp_database.consumer_history properties: { v: 2, key: { originator: -1 }, name: "originator_-1", ns: "pulp_database.consumer_history", background: true }
Jul 12 02:33:13 ibm-x3550m3-10.lab.eng.brq.redhat.com mongod.27017[19375]: [conn306] build index done. scanned 0 total records. 0 secs
Jul 12 02:33:13 ibm-x3550m3-10.lab.eng.brq.redhat.com mongod.27017[19375]: [conn306] build index on: pulp_database.consumer_history properties: { v: 2, key: { type: -1 }, name: "type_-1", ns: "pulp_database.consumer_history", background: true }
Jul 12 02:33:13 ibm-x3550m3-10.lab.eng.brq.redhat.com mongod.27017[19375]: [conn306] build index done. scanned 0 total records. 0 secs
Jul 12 02:33:13 ibm-x3550m3-10.lab.eng.brq.redhat.com mongod.27017[19375]: [conn306] build index on: pulp_database.repo_group_publish_results properties: { v: 2, unique: true, key: { id: -1 }, name: "id_-1", ns: "pulp_database.repo_group_publish_results", background: true }
Jul 12 02:33:13 ibm-x3550m3-10.lab.eng.brq.redhat.com mongod.27017[19375]: [conn306] build index done. scanned 0 total records. 0 secs
Jul 12 02:33:13 ibm-x3550m3-10.lab.eng.brq.redhat.com mongod.27017[19375]: [conn306] build index on: pulp_database.repo_publish_results properties: { v: 2, unique: true, key: { id: -1 }, name: "id_-1", ns: "pulp_database.repo_publish_results", background: true }
Jul 12 02:33:13 ibm-x3550m3-10.lab.eng.brq.redhat.com mongod.27017[19375]: [conn306] build index done. scanned 0 total records. 0 secs
systemctl status postgresql.service
● postgresql.service - PostgreSQL database server
   Loaded: loaded (/etc/systemd/system/postgresql.service; enabled; vendor preset: disabled)
   Active: active (running) since Wed 2018-07-11 20:33:11 CEST; 19h ago
  Process: 19347 ExecStop=/usr/bin/pg_ctl stop -D ${PGDATA} -s -m fast (code=exited, status=0/SUCCESS)
  Process: 19414 ExecStart=/usr/bin/pg_ctl start -D ${PGDATA} -s -o -p ${PGPORT} -w -t 300 (code=exited, status=0/SUCCESS)
  Process: 19407 ExecStartPre=/usr/bin/postgresql-check-db-dir ${PGDATA} (code=exited, status=0/SUCCESS)
 Main PID: 19416 (postgres)
    Tasks: 30
   CGroup: /system.slice/postgresql.service
           ├─ 5611 postgres: candlepin candlepin 127.0.0.1(34350) idl
           ├─ 5612 postgres: candlepin candlepin 127.0.0.1(34348) idl
           ├─12546 postgres: candlepin candlepin 127.0.0.1(58652) idl
           ├─12547 postgres: candlepin candlepin 127.0.0.1(58654) idl
           ├─12548 postgres: candlepin candlepin 127.0.0.1(58653) idl
           ├─12549 postgres: candlepin candlepin 127.0.0.1(58660) idl
           ├─12550 postgres: candlepin candlepin 127.0.0.1(58662) idl
           ├─19416 /usr/bin/postgres -D /var/lib/pgsql/data -p 5432
           ├─19417 postgres: logger process
           ├─19419 postgres: checkpointer process
           ├─19420 postgres: writer process
           ├─19421 postgres: wal writer process
           ├─19422 postgres: autovacuum launcher process
           ├─19423 postgres: stats collector process
           ├─20618 postgres: candlepin candlepin 127.0.0.1(54628) idl
           ├─20619 postgres: candlepin candlepin 127.0.0.1(54630) idl
           ├─20620 postgres: candlepin candlepin 127.0.0.1(54632) idl
           ├─21512 postgres: foreman foreman [local] idle
           ├─21533 postgres: foreman foreman [local] idle
           ├─21534 postgres: foreman foreman [local] idle
           ├─21535 postgres: foreman foreman [local] idle
           ├─21536 postgres: foreman foreman [local] idle
           ├─21554 postgres: foreman foreman [local] idle
           ├─24170 postgres: foreman foreman [local] idle
           ├─24184 postgres: foreman foreman [local] idle
           ├─24187 postgres: foreman foreman [local] idle
           ├─24193 postgres: foreman foreman [local] idle
           ├─28998 postgres: foreman foreman [local] idle
           ├─28999 postgres: foreman foreman [local] idle
           └─29000 postgres: foreman foreman [local] idle

Jul 11 20:33:10 ibm-x3550m3-10.lab.eng.brq.redhat.com systemd[1]: Starting PostgreSQL database server...
Jul 11 20:33:11 ibm-x3550m3-10.lab.eng.brq.redhat.com systemd[1]: Started PostgreSQL database server.
systemctl status qpidd.service
● qpidd.service - An AMQP message broker daemon.
   Loaded: loaded (/usr/lib/systemd/system/qpidd.service; enabled; vendor preset: disabled)
   Active: active (running) since Wed 2018-07-11 20:33:11 CEST; 19h ago
     Docs: man:qpidd(1)
           http://qpid.apache.org/
 Main PID: 19433 (qpidd)
    Tasks: 26
   CGroup: /system.slice/qpidd.service
           └─19433 /usr/sbin/qpidd --config /etc/qpid/qpidd.conf

Jul 11 20:33:11 ibm-x3550m3-10.lab.eng.brq.redhat.com systemd[1]: Started An AMQP message broker daemon..
Jul 11 20:33:11 ibm-x3550m3-10.lab.eng.brq.redhat.com systemd[1]: Starting An AMQP message broker daemon....
systemctl status qdrouterd.service
● qdrouterd.service - Qpid Dispatch router daemon
   Loaded: loaded (/usr/lib/systemd/system/qdrouterd.service; enabled; vendor preset: disabled)
   Active: active (running) since Wed 2018-07-11 20:38:39 CEST; 19h ago
 Main PID: 24745 (qdrouterd)
    Tasks: 25
   CGroup: /system.slice/qdrouterd.service
           └─24745 /usr/sbin/qdrouterd -c /etc/qpid-dispatch/qdrouterd.conf

Jul 11 20:38:39 ibm-x3550m3-10.lab.eng.brq.redhat.com qdrouterd[24745]: Wed Jul 11 20:38:39 2018 CONN_MGR (info) Configured Listener: 0.0.0.0:5647 proto=any, role=normal, sslProfile=server
Jul 11 20:38:39 ibm-x3550m3-10.lab.eng.brq.redhat.com qdrouterd[24745]: Wed Jul 11 20:38:39 2018 CONN_MGR (info) Configured Listener: 0.0.0.0:5646 proto=any, role=inter-router, sslProfile=server
Jul 11 20:38:39 ibm-x3550m3-10.lab.eng.brq.redhat.com qdrouterd[24745]: Wed Jul 11 20:38:39 2018 CONN_MGR (info) Configured Connector: localhost:5671 proto=any, role=route-container, sslProfile=client
Jul 11 20:38:39 ibm-x3550m3-10.lab.eng.brq.redhat.com qdrouterd[24745]: Wed Jul 11 20:38:39 2018 POLICY (info) Policy configured maxConnections: 65535, policyDir: '', access rules enabled: 'false'
Jul 11 20:38:39 ibm-x3550m3-10.lab.eng.brq.redhat.com qdrouterd[24745]: Wed Jul 11 20:38:39 2018 POLICY (info) Policy fallback defaultVhost is defined: '$default'
Jul 11 20:38:39 ibm-x3550m3-10.lab.eng.brq.redhat.com qdrouterd[24745]: Wed Jul 11 20:38:39 2018 SERVER (info) Operational, 24 Threads Running
Jul 11 20:38:41 ibm-x3550m3-10.lab.eng.brq.redhat.com qdrouterd[24745]: Wed Jul 11 20:38:41 2018 ROUTER_CORE (info) Link Route Activated 'linkRoute/0' on connection broker
Jul 11 20:38:41 ibm-x3550m3-10.lab.eng.brq.redhat.com qdrouterd[24745]: Wed Jul 11 20:38:41 2018 ROUTER_CORE (info) Link Route Activated 'linkRoute/1' on connection broker
Jul 11 20:38:41 ibm-x3550m3-10.lab.eng.brq.redhat.com qdrouterd[24745]: Wed Jul 11 20:38:41 2018 ROUTER_CORE (info) Link Route Activated 'linkRoute/2' on connection broker
Jul 11 20:38:41 ibm-x3550m3-10.lab.eng.brq.redhat.com qdrouterd[24745]: Wed Jul 11 20:38:41 2018 ROUTER_CORE (info) Link Route Activated 'linkRoute/3' on connection broker
systemctl status squid.service
● squid.service - Squid caching proxy
   Loaded: loaded (/usr/lib/systemd/system/squid.service; enabled; vendor preset: disabled)
   Active: active (running) since Wed 2018-07-11 20:33:11 CEST; 19h ago
  Process: 19318 ExecStop=/usr/sbin/squid -k shutdown -f $SQUID_CONF (code=exited, status=0/SUCCESS)
  Process: 18947 ExecReload=/usr/sbin/squid $SQUID_OPTS -k reconfigure -f $SQUID_CONF (code=exited, status=0/SUCCESS)
  Process: 19458 ExecStart=/usr/sbin/squid $SQUID_OPTS -f $SQUID_CONF (code=exited, status=0/SUCCESS)
  Process: 19452 ExecStartPre=/usr/libexec/squid/cache_swap.sh (code=exited, status=0/SUCCESS)
 Main PID: 19460 (squid)
    Tasks: 2
   CGroup: /system.slice/squid.service
           ├─19460 /usr/sbin/squid -f /etc/squid/squid.conf
           └─19462 (squid-1) -f /etc/squid/squid.conf

Jul 11 20:33:11 ibm-x3550m3-10.lab.eng.brq.redhat.com systemd[1]: Starting Squid caching proxy...
Jul 11 20:33:11 ibm-x3550m3-10.lab.eng.brq.redhat.com squid[19460]: Squid Parent: will start 1 kids
Jul 11 20:33:11 ibm-x3550m3-10.lab.eng.brq.redhat.com squid[19460]: Squid Parent: (squid-1) process 19462 started
Jul 11 20:33:11 ibm-x3550m3-10.lab.eng.brq.redhat.com systemd[1]: Started Squid caching proxy.
systemctl status tomcat.service
● tomcat.service - Apache Tomcat Web Application Container
   Loaded: loaded (/usr/lib/systemd/system/tomcat.service; enabled; vendor preset: disabled)
   Active: active (running) since Wed 2018-07-11 20:33:11 CEST; 19h ago
 Main PID: 19471 (java)
    Tasks: 178
   CGroup: /system.slice/tomcat.service
           └─19471 /usr/lib/jvm/jre/bin/java -Xms1024m -Xmx4096m -classpath /usr/share/tomcat/bin/bootstrap.jar:/usr/share/tomcat/bin/tomcat-juli.jar:/usr/share/java/commons-daemon.jar -Dcatalina.base=/usr/share/tomcat -Dcatalina.home=/usr/share/tomcat -Djava.endorsed.dirs= -Djava.io.tmpdir=/var/cache/tomcat/temp -Djava.util.logging.config.file=/usr/share/tomcat/conf/logging.properties -Djava.util.logging.manager=org.apache.juli.ClassLoaderLogManager org.apache.catalina.startup.Bootstrap start

Jul 11 20:33:29 ibm-x3550m3-10.lab.eng.brq.redhat.com server[19471]: Jul 11, 2018 8:33:29 PM org.apache.catalina.startup.HostConfig deployDirectory
Jul 11 20:33:29 ibm-x3550m3-10.lab.eng.brq.redhat.com server[19471]: INFO: Deployment of web application directory /var/lib/tomcat/webapps/candlepin has finished in 16,835 ms
Jul 11 20:33:29 ibm-x3550m3-10.lab.eng.brq.redhat.com server[19471]: Jul 11, 2018 8:33:29 PM org.apache.coyote.AbstractProtocol start
Jul 11 20:33:29 ibm-x3550m3-10.lab.eng.brq.redhat.com server[19471]: INFO: Starting ProtocolHandler ["http-bio-8080"]
Jul 11 20:33:29 ibm-x3550m3-10.lab.eng.brq.redhat.com server[19471]: Jul 11, 2018 8:33:29 PM org.apache.coyote.AbstractProtocol start
Jul 11 20:33:29 ibm-x3550m3-10.lab.eng.brq.redhat.com server[19471]: INFO: Starting ProtocolHandler ["http-bio-8443"]
Jul 11 20:33:29 ibm-x3550m3-10.lab.eng.brq.redhat.com server[19471]: Jul 11, 2018 8:33:29 PM org.apache.coyote.AbstractProtocol start
Jul 11 20:33:29 ibm-x3550m3-10.lab.eng.brq.redhat.com server[19471]: INFO: Starting ProtocolHandler ["ajp-bio-8009"]
Jul 11 20:33:29 ibm-x3550m3-10.lab.eng.brq.redhat.com server[19471]: Jul 11, 2018 8:33:29 PM org.apache.catalina.startup.Catalina start
Jul 11 20:33:29 ibm-x3550m3-10.lab.eng.brq.redhat.com server[19471]: INFO: Server startup in 16890 ms
systemctl status pulp_workers.service
● pulp_workers.service - Pulp Celery Workers
   Loaded: loaded (/usr/lib/systemd/system/pulp_workers.service; enabled; vendor preset: disabled)
   Active: active (exited) since Wed 2018-07-11 20:33:12 CEST; 19h ago
  Process: 19194 ExecStop=/usr/bin/python -m pulp.server.async.manage_workers stop (code=exited, status=0/SUCCESS)
  Process: 19490 ExecStart=/usr/bin/python -m pulp.server.async.manage_workers start (code=exited, status=0/SUCCESS)
 Main PID: 19490 (code=exited, status=0/SUCCESS)
    Tasks: 0
   CGroup: /system.slice/pulp_workers.service

Jul 11 20:33:12 ibm-x3550m3-10.lab.eng.brq.redhat.com systemd[1]: Starting Pulp Celery Workers...
Jul 11 20:33:12 ibm-x3550m3-10.lab.eng.brq.redhat.com systemd[1]: Started Pulp Celery Workers.
systemctl status pulp_celerybeat.service
● pulp_celerybeat.service - Pulp's Celerybeat
   Loaded: loaded (/usr/lib/systemd/system/pulp_celerybeat.service; enabled; vendor preset: disabled)
   Active: active (running) since Wed 2018-07-11 20:33:12 CEST; 19h ago
 Main PID: 19602 (celery)
    Tasks: 5
   CGroup: /system.slice/pulp_celerybeat.service
           └─19602 /usr/bin/python /usr/bin/celery beat --app=pulp.server.async.celery_instance.celery --scheduler=pulp.server.async.scheduler.Scheduler

Jul 12 14:23:14 ibm-x3550m3-10.lab.eng.brq.redhat.com pulp[19602]: celery.beat:INFO: Scheduler: Sending due task download_deferred_content (pulp.server.controllers.repository.queue_download_deferred)
Jul 12 14:33:13 ibm-x3550m3-10.lab.eng.brq.redhat.com pulp[19602]: celery.beat:INFO: Scheduler: Sending due task reap_expired_documents (pulp.server.db.reaper.queue_reap_expired_documents)
Jul 12 14:33:14 ibm-x3550m3-10.lab.eng.brq.redhat.com pulp[19602]: celery.beat:INFO: Scheduler: Sending due task download_deferred_content (pulp.server.controllers.repository.queue_download_deferred)
Jul 12 14:43:14 ibm-x3550m3-10.lab.eng.brq.redhat.com pulp[19602]: celery.beat:INFO: Scheduler: Sending due task download_deferred_content (pulp.server.controllers.repository.queue_download_deferred)
Jul 12 14:53:14 ibm-x3550m3-10.lab.eng.brq.redhat.com pulp[19602]: celery.beat:INFO: Scheduler: Sending due task download_deferred_content (pulp.server.controllers.repository.queue_download_deferred)
Jul 12 15:03:14 ibm-x3550m3-10.lab.eng.brq.redhat.com pulp[19602]: celery.beat:INFO: Scheduler: Sending due task download_deferred_content (pulp.server.controllers.repository.queue_download_deferred)
Jul 12 15:13:14 ibm-x3550m3-10.lab.eng.brq.redhat.com pulp[19602]: celery.beat:INFO: Scheduler: Sending due task download_deferred_content (pulp.server.controllers.repository.queue_download_deferred)
Jul 12 15:23:14 ibm-x3550m3-10.lab.eng.brq.redhat.com pulp[19602]: celery.beat:INFO: Scheduler: Sending due task download_deferred_content (pulp.server.controllers.repository.queue_download_deferred)
Jul 12 15:33:14 ibm-x3550m3-10.lab.eng.brq.redhat.com pulp[19602]: celery.beat:INFO: Scheduler: Sending due task download_deferred_content (pulp.server.controllers.repository.queue_download_deferred)
Jul 12 15:43:14 ibm-x3550m3-10.lab.eng.brq.redhat.com pulp[19602]: celery.beat:INFO: Scheduler: Sending due task download_deferred_content (pulp.server.controllers.repository.queue_download_deferred)
systemctl status smart_proxy_dynflow_core.service
● smart_proxy_dynflow_core.service - Foreman smart proxy dynflow core service
   Loaded: loaded (/usr/lib/systemd/system/smart_proxy_dynflow_core.service; enabled; vendor preset: disabled)
  Drop-In: /etc/systemd/system/smart_proxy_dynflow_core.service.d
           └─90-limits.conf
   Active: active (running) since Wed 2018-07-11 20:33:13 CEST; 19h ago
     Docs: https://github.com/theforeman/smart_proxy_dynflow
  Process: 19617 ExecStart=/usr/bin/smart_proxy_dynflow_core -d -p /var/run/foreman-proxy/smart_proxy_dynflow_core.pid (code=exited, status=0/SUCCESS)
 Main PID: 19886 (ruby)
    Tasks: 27
   CGroup: /system.slice/smart_proxy_dynflow_core.service
           └─19886 ruby /usr/bin/smart_proxy_dynflow_core -d -p /var/run/foreman-proxy/smart_proxy_dynflow_core.pid

Jul 11 20:33:12 ibm-x3550m3-10.lab.eng.brq.redhat.com systemd[1]: Starting Foreman smart proxy dynflow core service...
Jul 11 20:33:13 ibm-x3550m3-10.lab.eng.brq.redhat.com systemd[1]: Started Foreman smart proxy dynflow core service.
systemctl status pulp_streamer.service
● pulp_streamer.service - The Pulp lazy content loading streamer
   Loaded: loaded (/usr/lib/systemd/system/pulp_streamer.service; enabled; vendor preset: disabled)
   Active: active (running) since Wed 2018-07-11 20:33:13 CEST; 19h ago
 Main PID: 19914 (pulp_streamer)
    Tasks: 3
   CGroup: /system.slice/pulp_streamer.service
           └─19914 /usr/bin/python /usr/bin/pulp_streamer --nodaemon --syslog --prefix=pulp_streamer --pidfile= --python /usr/share/pulp/wsgi/streamer.tac

Jul 11 20:33:15 ibm-x3550m3-10.lab.eng.brq.redhat.com pulp_streamer[19914]: pulp.plugins.loader.manager:INFO: Loaded plugin puppet_importer for types: puppet_module
Jul 11 20:33:15 ibm-x3550m3-10.lab.eng.brq.redhat.com pulp_streamer[19914]: pulp.plugins.loader.manager:INFO: Loaded plugin yum_profiler for types: rpm,erratum
Jul 11 20:33:15 ibm-x3550m3-10.lab.eng.brq.redhat.com pulp_streamer[19914]: pulp.plugins.loader.manager:INFO: Loaded plugin puppet_whole_repo_profiler for types: puppet_module
Jul 11 20:33:15 ibm-x3550m3-10.lab.eng.brq.redhat.com pulp_streamer[19914]: pulp.plugins.loader.manager:INFO: Loaded plugin yum for types: rpm
Jul 11 20:33:15 ibm-x3550m3-10.lab.eng.brq.redhat.com pulp_streamer[19914]: pulp.plugins.loader.manager:INFO: Loaded plugin rhui for types: rpm
Jul 11 20:33:15 ibm-x3550m3-10.lab.eng.brq.redhat.com pulp_streamer[19914]: [-] Log opened.
Jul 11 20:33:15 ibm-x3550m3-10.lab.eng.brq.redhat.com pulp_streamer[19914]: [-] twistd 12.2.0 (/usr/bin/python 2.7.5) starting up.
Jul 11 20:33:15 ibm-x3550m3-10.lab.eng.brq.redhat.com pulp_streamer[19914]: [-] reactor class: twisted.internet.epollreactor.EPollReactor.
Jul 11 20:33:15 ibm-x3550m3-10.lab.eng.brq.redhat.com pulp_streamer[19914]: [-] Site starting on 8751
Jul 11 20:33:15 ibm-x3550m3-10.lab.eng.brq.redhat.com pulp_streamer[19914]: [-] Starting factory <twisted.web.server.Site instance at 0x7f0b89a291b8>
systemctl status foreman-proxy.service
● foreman-proxy.service - Foreman Proxy
   Loaded: loaded (/usr/lib/systemd/system/foreman-proxy.service; enabled; vendor preset: disabled)
   Active: active (running) since Wed 2018-07-11 20:37:46 CEST; 19h ago
 Main PID: 23890 (ruby)
    Tasks: 31
   CGroup: /system.slice/foreman-proxy.service
           └─23890 ruby /usr/share/foreman-proxy/bin/smart-proxy --no-daemonize

Jul 11 20:37:45 ibm-x3550m3-10.lab.eng.brq.redhat.com systemd[1]: Starting Foreman Proxy...
Jul 11 20:37:46 ibm-x3550m3-10.lab.eng.brq.redhat.com systemd[1]: Started Foreman Proxy.
Jul 12 08:23:25 ibm-x3550m3-10.lab.eng.brq.redhat.com smart-proxy[23890]: Running Foreman Ansible Core in non-SCL context
Jul 12 08:23:25 ibm-x3550m3-10.lab.eng.brq.redhat.com smart-proxy[23890]: ibm-x3550m3-10.lab.eng.brq.redhat.com - - [12/Jul/2018:08:23:25 CEST] "GET /pulp/status/disk_usage HTTP/1.1" 200 399
Jul 12 08:23:25 ibm-x3550m3-10.lab.eng.brq.redhat.com smart-proxy[23890]: - -> /pulp/status/disk_usage
systemctl status pulp_resource_manager.service
● pulp_resource_manager.service - Pulp Resource Manager
   Loaded: loaded (/usr/lib/systemd/system/pulp_resource_manager.service; enabled; vendor preset: disabled)
   Active: active (running) since Wed 2018-07-11 20:33:14 CEST; 19h ago
 Main PID: 20020 (celery)
    Tasks: 12
   CGroup: /system.slice/pulp_resource_manager.service
           ├─20020 /usr/bin/python /usr/bin/celery worker -A pulp.server.async.app -n resource_manager@%h -Q resource_manager -c 1 --events --umask 18 --pidfile=/var/run/pulp/resource_manager.pid
           └─20371 /usr/bin/python /usr/bin/celery worker -A pulp.server.async.app -n resource_manager@%h -Q resource_manager -c 1 --events --umask 18 --pidfile=/var/run/pulp/resource_manager.pid

Jul 11 20:33:16 ibm-x3550m3-10.lab.eng.brq.redhat.com celery[20020]: -------------- [queues]
Jul 11 20:33:16 ibm-x3550m3-10.lab.eng.brq.redhat.com celery[20020]: .> resource_manager exchange=resource_manager(direct) key=resource_manager
Jul 11 20:33:16 ibm-x3550m3-10.lab.eng.brq.redhat.com celery[20020]: .> resource_manager@ibm-x3550m3-10.lab.eng.brq.redhat.com.dq2 exchange=C.dq2(direct) key=resource_manager@ibm-x3550m3-10.lab.eng.brq.redhat.com
Jul 11 20:33:16 ibm-x3550m3-10.lab.eng.brq.redhat.com pulp[20371]: pulp.server.db.connection:INFO: Attempting to connect to localhost:27017
Jul 11 20:33:16 ibm-x3550m3-10.lab.eng.brq.redhat.com pulp[20371]: pulp.server.db.connection:INFO: Attempting to connect to localhost:27017
Jul 11 20:33:16 ibm-x3550m3-10.lab.eng.brq.redhat.com pulp[20020]: kombu.transport.qpid:INFO: Connected to qpid with SASL mechanism ANONYMOUS
Jul 11 20:33:16 ibm-x3550m3-10.lab.eng.brq.redhat.com pulp[20020]: celery.worker.consumer.connection:INFO: Connected to qpid://localhost:5671//
Jul 11 20:33:16 ibm-x3550m3-10.lab.eng.brq.redhat.com pulp[20020]: kombu.transport.qpid:INFO: Connected to qpid with SASL mechanism ANONYMOUS
Jul 11 20:33:17 ibm-x3550m3-10.lab.eng.brq.redhat.com pulp[20020]: celery.apps.worker:INFO: resource_manager@ibm-x3550m3-10.lab.eng.brq.redhat.com ready.
Jul 11 20:33:17 ibm-x3550m3-10.lab.eng.brq.redhat.com pulp[20371]: pulp.server.db.connection:INFO: Write concern for Mongo connection: {}
systemctl status httpd.service
● httpd.service - The Apache HTTP Server
   Loaded: loaded (/usr/lib/systemd/system/httpd.service; enabled; vendor preset: disabled)
   Active: active (running) since Wed 2018-07-11 20:37:44 CEST; 19h ago
     Docs: man:httpd(8)
           man:apachectl(8)
  Process: 23591 ExecStop=/bin/kill -WINCH ${MAINPID} (code=exited, status=0/SUCCESS)
 Main PID: 23625 (httpd)
   Status: "Total requests: 0; Current requests/sec: 0; Current traffic: 0 B/sec"
    Tasks: 208
   CGroup: /system.slice/httpd.service
           ├─23625 /usr/sbin/httpd -DFOREGROUND
           ├─23646 (wsgi:pulp) -DFOREGROUND
           ├─23647 (wsgi:pulp) -DFOREGROUND
           ├─23648 (wsgi:pulp) -DFOREGROUND
           ├─23649 (wsgi:pulp-cont -DFOREGROUND
           ├─23650 (wsgi:pulp-cont -DFOREGROUND
           ├─23651 (wsgi:pulp-cont -DFOREGROUND
           ├─23652 (wsgi:pulp_forg -DFOREGROUND
           ├─23653 PassengerWatchdog
           ├─23656 PassengerHelperAgent
           ├─23661 PassengerLoggingAgent
           ├─23671 /usr/sbin/httpd -DFOREGROUND
           ├─23672 /usr/sbin/httpd -DFOREGROUND
           ├─23673 /usr/sbin/httpd -DFOREGROUND
           ├─23674 /usr/sbin/httpd -DFOREGROUND
           ├─23675 /usr/sbin/httpd -DFOREGROUND
           ├─23676 /usr/sbin/httpd -DFOREGROUND
           ├─23677 /usr/sbin/httpd -DFOREGROUND
           ├─23678 /usr/sbin/httpd -DFOREGROUND
           ├─23991 /bin/bash /usr/bin/tfm-ruby /usr/share/gems/gems/passenger-4.0.18/helper-scripts/rack-preloader.rb
           ├─23999 scl enable tfm bash /tmp/tmp.prQS9Rg61R
           ├─24000 /bin/bash /var/tmp/sclzxsASA
           ├─24024 bash /tmp/tmp.prQS9Rg61R
           ├─24025 Passenger AppPreloader: /usr/share/foreman
           ├─24160 Passenger RackApp: /usr/share/foreman
           ├─25683 /usr/sbin/httpd -DFOREGROUND
           ├─25717 /usr/sbin/httpd -DFOREGROUND
           └─25718 /usr/sbin/httpd -DFOREGROUND

Jul 11 20:37:46 ibm-x3550m3-10.lab.eng.brq.redhat.com pulp[23648]: pulp.server.webservices.application:INFO: The Pulp server has been successfully initialized
Jul 11 20:37:46 ibm-x3550m3-10.lab.eng.brq.redhat.com pulp[23648]: pulp.server.webservices.application:INFO: *************************************************************
Jul 11 20:37:46 ibm-x3550m3-10.lab.eng.brq.redhat.com pulp[23647]: gofer.messaging.adapter.qpid.connection:INFO: opened: qpid+ssl://localhost:5671
Jul 11 20:37:46 ibm-x3550m3-10.lab.eng.brq.redhat.com pulp[23647]: gofer.messaging.adapter.connect:INFO: connected: qpid+ssl://localhost:5671
Jul 11 20:37:46 ibm-x3550m3-10.lab.eng.brq.redhat.com pulp[23646]: gofer.messaging.adapter.qpid.connection:INFO: opened: qpid+ssl://localhost:5671
Jul 11 20:37:46 ibm-x3550m3-10.lab.eng.brq.redhat.com pulp[23646]: gofer.messaging.adapter.connect:INFO: connected: qpid+ssl://localhost:5671
Jul 11 20:37:46 ibm-x3550m3-10.lab.eng.brq.redhat.com pulp[23648]: gofer.messaging.adapter.qpid.connection:INFO: opened: qpid+ssl://localhost:5671
Jul 11 20:37:46 ibm-x3550m3-10.lab.eng.brq.redhat.com pulp[23648]: gofer.messaging.adapter.connect:INFO: connected: qpid+ssl://localhost:5671
Jul 11 20:51:06 ibm-x3550m3-10.lab.eng.brq.redhat.com pulp[23648]: kombu.transport.qpid:INFO: Connected to qpid with SASL mechanism ANONYMOUS
Jul 12 03:47:26 ibm-x3550m3-10.lab.eng.brq.redhat.com pulp[23646]: kombu.transport.qpid:INFO: Connected to qpid with SASL mechanism ANONYMOUS
systemctl status puppetserver.service
● puppetserver.service - puppetserver Service
   Loaded: loaded (/usr/lib/systemd/system/puppetserver.service; enabled; vendor preset: disabled)
   Active: active (running) since Wed 2018-07-11 20:33:55 CEST; 19h ago
  Process: 19053 ExecStop=/opt/puppetlabs/server/apps/puppetserver/bin/puppetserver stop (code=exited, status=0/SUCCESS)
  Process: 20098 ExecStart=/opt/puppetlabs/server/apps/puppetserver/bin/puppetserver start (code=exited, status=0/SUCCESS)
 Main PID: 20112 (java)
    Tasks: 115
   CGroup: /system.slice/puppetserver.service
           └─20112 /usr/bin/java -Xms2G -Xmx2G -Djruby.logger.class=com.puppetlabs.jruby_utils.jruby.Slf4jLogger -Djava.security.egd=/dev/urandom -XX:OnOutOfMemoryError=kill -9 %p -cp /opt/puppetlabs/server/apps/puppetserver/puppet-server-release.jar:/opt/puppetlabs/server/apps/puppetserver/jruby-1_7.jar:/opt/puppetlabs/server/data/puppetserver/jars/* clojure.main -m puppetlabs.trapperkeeper.main --config /etc/puppetlabs/puppetserver/conf.d --bootstrap-config /etc/puppetlabs/puppetserver/services.d/,/opt/puppetlabs/server/apps/puppetserver/config/services.d/ --restart-file /opt/puppetlabs/server/data/puppetserver/restartcounter

Jul 11 20:33:15 ibm-x3550m3-10.lab.eng.brq.redhat.com systemd[1]: Starting puppetserver Service...
Jul 11 20:33:55 ibm-x3550m3-10.lab.eng.brq.redhat.com systemd[1]: Started puppetserver Service.
systemctl status dynflowd.service
● dynflowd.service - Foreman jobs daemon
   Loaded: loaded (/usr/lib/systemd/system/dynflowd.service; enabled; vendor preset: disabled)
   Active: active (running) since Wed 2018-07-11 20:34:00 CEST; 19h ago
     Docs: https://theforeman.org
  Process: 18999 ExecStop=/usr/sbin/dynflowd stop (code=exited, status=0/SUCCESS)
  Process: 20978 ExecStart=/usr/sbin/dynflowd start (code=exited, status=0/SUCCESS)
    Tasks: 33
   CGroup: /system.slice/dynflowd.service
           ├─21036 dynflow_executor
           └─21037 dynflow_executor_monitor

Jul 11 20:33:55 ibm-x3550m3-10.lab.eng.brq.redhat.com systemd[1]: Starting Foreman jobs daemon...
Jul 11 20:34:00 ibm-x3550m3-10.lab.eng.brq.redhat.com dynflowd[20978]: /usr/share/foreman/lib/foreman.rb:8: warning: already initialized constant Foreman::UUID_REGEXP
Jul 11 20:34:00 ibm-x3550m3-10.lab.eng.brq.redhat.com dynflowd[20978]: /usr/share/foreman/lib/foreman.rb:8: warning: previous definition of UUID_REGEXP was here
Jul 11 20:34:00 ibm-x3550m3-10.lab.eng.brq.redhat.com dynflowd[20978]: /usr/share/foreman/lib/core_extensions.rb:182: warning: already initialized constant ActiveSupport::MessageEncryptor::DEFAULT_CIPHER
Jul 11 20:34:00 ibm-x3550m3-10.lab.eng.brq.redhat.com dynflowd[20978]: /opt/theforeman/tfm-ror51/root/usr/share/gems/gems/activesupport-5.1.6/lib/active_support/message_encryptor.rb:22: warning: previous definition of DEFAULT_CIPHER was here
Jul 11 20:34:00 ibm-x3550m3-10.lab.eng.brq.redhat.com dynflowd[20978]: Dynflow Executor: start in progress
Jul 11 20:34:00 ibm-x3550m3-10.lab.eng.brq.redhat.com dynflowd[20978]: /opt/theforeman/tfm/root/usr/share/gems/gems/daemons-1.2.3/lib/daemons/daemonize.rb:108: warning: conflicting chdir during another chdir block
Jul 11 20:34:00 ibm-x3550m3-10.lab.eng.brq.redhat.com dynflowd[20978]: /opt/theforeman/tfm/root/usr/share/gems/gems/daemons-1.2.3/lib/daemons/daemonize.rb:75: warning: conflicting chdir during another chdir block
Jul 11 20:34:00 ibm-x3550m3-10.lab.eng.brq.redhat.com dynflowd[20978]: dynflow_executor: process with pid 21036 started.
Jul 11 20:34:00 ibm-x3550m3-10.lab.eng.brq.redhat.com systemd[1]: Started Foreman jobs daemon.
Success!
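For readers without a Satellite box at hand, the effect of the command captured above can be approximated directly with systemctl. A minimal sketch, assuming `katello-service status` is in effect a wrapper that runs `systemctl status` on each Katello-related unit in turn (the unit list below is taken from this output; the wrapper's real implementation and unit discovery may differ). The loop is a dry run that prints each command rather than executing it, so it works on machines without these units installed:

```shell
# Unit names as they appear in the attached output (assumption: this is
# the set katello-service iterates over on this particular host).
services="rh-mongodb34-mongod postgresql qpidd qdrouterd squid tomcat \
pulp_workers pulp_celerybeat smart_proxy_dynflow_core pulp_streamer \
foreman-proxy pulp_resource_manager httpd puppetserver dynflowd"

for svc in $services; do
    # Dry run: print the command instead of running it. Drop the `echo`
    # on a real Satellite host to get output like the attachment's.
    echo "systemctl status ${svc}.service"
done
```

Each iteration corresponds to one `systemctl status <unit>.service` block in the attachment above.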