Bug 1357472 - Upgrading from 6.2 (or along 6.2.Z) should not run remove_gutterball and elasticsearch_message
Summary: Upgrading from 6.2 (or along 6.2.Z) should not run remove_gutterball and elasticsearch_message
Keywords:
Status: CLOSED CURRENTRELEASE
Alias: None
Product: Red Hat Satellite
Classification: Red Hat
Component: Upgrades
Version: 6.2.0
Hardware: Unspecified
OS: Unspecified
Priority: high
Severity: high
Target Milestone: Unspecified
Assignee: satellite6-bugs
QA Contact: jcallaha
URL:
Whiteboard:
Duplicates: 1375734
Depends On:
Blocks: 1122832 1335807 Sat6_Upgrades 1416189
 
Reported: 2016-07-18 10:00 UTC by Lukas Pramuk
Modified: 2019-09-25 21:20 UTC
CC List: 11 users

Fixed In Version:
Doc Type: If docs needed, set a value
Doc Text:
Clone Of:
Clones: 1416189
Environment:
Last Closed: 2017-06-01 18:13:30 UTC
Target Upstream Version:
Embargoed:




Links:
- Foreman Issue Tracker 15906 (last updated 2016-07-29 14:32:34 UTC)
- Red Hat Bugzilla 1406068, CLOSED: "Do not log the expected actions as errors in installer log" (last updated 2021-02-22 00:41:40 UTC)

Internal Links: 1406068

Description Lukas Pramuk 2016-07-18 10:00:51 UTC
Description of problem:
Upgrading along the 6.2.Z stream should not run the remove_gutterball and elasticsearch_message steps; they are relevant only when upgrading from 6.1.

Hit when upgrading between GA snaps: the elasticsearch_message and remove_gutterball steps were run needlessly. The same presumably applies to 6.2.Z stream updates.
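
For illustration, a minimal sketch of gating each step on the version being upgraded from (the file path, helper names, and step table are assumptions for this sketch, not the installer's actual mechanism):

require 'rubygems'

# Hypothetical location where the installer would record the version it last
# upgraded from.
PREVIOUS_VERSION_FILE = '/etc/foreman-installer/previous_version'

def previous_minor
  v = File.exist?(PREVIOUS_VERSION_FILE) ? File.read(PREVIOUS_VERSION_FILE).strip : '6.1.0'
  Gem::Version.new(v).segments[0, 2].join('.')   # e.g. "6.2.0.21" -> "6.2"
end

# Each step names the only source minor release it applies to (nil = always run).
STEPS = {
  'elasticsearch_message' => '6.1',
  'remove_gutterball'     => '6.1',
  'migrate_pulp'          => nil,
}

STEPS.each do |name, only_from|
  next if only_from && previous_minor != only_from
  puts "Upgrade Step: #{name}..."
  # the real step body would run here
end

With such a gate, a 6.2 -> 6.2.Z run never reaches the 6.1-only steps at all.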

Version-Release number of selected component (if applicable):
satellite-6.2.0-20.1.el7sat.noarch

How reproducible:
always

Steps to Reproduce:
1. # yum upgrade    # upgrading between 6.2 snaps, not from Satellite 6.1
2. # satellite-installer -S satellite --upgrade
...
Upgrade Step: elasticsearch_message...
package elasticsearch is not installed
...
Upgrade Step: remove_gutterball...
package gutterball is not installed
...


Actual results:
Upgrade steps that do not apply are run needlessly.

Expected results:
Only relevant steps are run when upgrading along the 6.2.Z stream.


Additional info:

Comment 3 Justin Sherrill 2016-07-28 16:16:44 UTC
On a heavily loaded 6.1 system that was upgraded to 6.2, re-running

satellite-installer --scenario=satellite --upgrade

takes more than 6 hours. It seems the import_* steps are the ones taking the longest. A user cannot be expected to wait this long for every z-stream update. I'm going to re-purpose this bug more generically: only run the upgrade steps once.
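
A minimal sketch of that "run once" idea (the state-file path and helper names are assumptions, not the shipped implementation): completed step names are persisted, so a re-run of the installer skips them.

require 'set'

DONE_FILE = '/var/lib/foreman-installer/applied_upgrade_steps' # hypothetical path

def done_steps
  File.exist?(DONE_FILE) ? Set.new(File.readlines(DONE_FILE).map(&:chomp)) : Set.new
end

def run_once(name)
  return puts("Upgrade Step: #{name} already applied, skipping") if done_steps.include?(name)
  puts "Upgrade Step: #{name}..."
  yield
  File.open(DONE_FILE, 'a') { |f| f.puts(name) } # recorded only after success
end

run_once('import_rpms') do
  # the long-running import would go here
end

Recording the step only after the block succeeds means a failed run retries it next time, while a clean run never repeats it.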

Comment 4 Ivan Necas 2016-07-29 14:32:32 UTC
Created redmine issue http://projects.theforeman.org/issues/15906 from this bug

Comment 5 Peter Vreman 2016-08-22 08:38:00 UTC
As a user I agree with both issues:
- A red error saying that a correctly absent package is "not installed" is confusing.
- The import_rpms step takes a very long time.


Upgrade Step: import_rpms (this may take a while) ...
[ INFO 2016-08-22 07:59:00 verbose] Upgrade Step: import_rpms (this may take a while) ...
Importing Rpms

Upgrade Step: import_distributions (this may take a while) ...
[ INFO 2016-08-22 08:29:07 verbose] Upgrade Step: import_distributions (this may take a while) ...
Importing distribution data into repositories


For the import_XXXX tasks I also recommend providing a progress indicator. The remark "this may take a while" is subjective: is it 5 minutes or a few hours?

E.g.

import_rpms
10000/200000 rpms imported
20000/200000 rpms imported
30000/200000 rpms imported

Then the user can estimate how long it will take to finish.
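
A sketch of that counted output (names are assumed; the real importer would drive the loop):

def with_progress(label, total, every = 10_000)
  total.times do |i|
    yield i                                      # per-unit work, e.g. one RPM import
    done = i + 1
    puts "#{done}/#{total} #{label} imported" if done % every == 0 || done == total
  end
end

with_progress('rpms', 200_000) do |i|
  # import_rpm(i)  -- hypothetical per-RPM import call
end

Knowing the total up front is what turns "this may take a while" into an estimate.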

Comment 6 Brad Buckingham 2016-09-22 19:25:01 UTC
*** Bug 1375734 has been marked as a duplicate of this bug. ***

Comment 7 Mike McCune 2016-10-19 19:41:01 UTC
We are going to resolve this in 6.2.3.1, shipping ASAP via the bug I am closing this one as a duplicate of.

*** This bug has been marked as a duplicate of bug 1264597 ***

Comment 8 Lukas Pramuk 2016-10-20 11:31:37 UTC
This bug was about:
Upgrading along 6.2.Z stream should not run remove_gutterball and elasticsearch_message

As BZ #1264597 does not really fix it, I'm reopening this one and restoring the original title.

Comment 9 Lukas Pramuk 2016-10-20 11:39:05 UTC
@Sat6.3.0 Snap5

# satellite-installer --upgrade
...

Upgrade Step: elasticsearch_message...
package elasticsearch is not installed   <<< red alert

...

Upgrade Step: remove_gutterball...
package gutterball is not installed     <<< red alert

Upgrade completed!


These two steps should run only when upgrading from Katello 2.4 (the katello:upgrades:2.4 namespace), not when upgrading from/along Katello 3.0 or even 3.2!

Or do we want these steps to run every time, even on 6.4, when they are relevant only to 6.1?
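
For illustration, a Rakefile-style sketch of that scoping (the task bodies are assumptions; only the katello:upgrades:2.4 namespace itself comes from the comment above):

namespace :katello do
  namespace :upgrades do
    namespace :'2.4' do
      desc 'Only meaningful when coming from Katello 2.4 / Satellite 6.1'
      task :remove_gutterball do
        sh 'yum -y remove gutterball' if system('rpm -q gutterball')
      end
    end
    # 3.0 -> 3.2 steps would live under katello:upgrades:3.0; the installer
    # would invoke only the namespace matching the source version, so
    # 2.4-only steps never fire on a Z-stream update.
  end
end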

Comment 10 jcallaha 2017-06-01 18:13:30 UTC
This is no longer present in 6.2.10 Snap 2.


Upgrading...
Upgrade Step: stop_services...
celery multi v3.1.11 (Cipater)
> Stopping nodes...
	> resource_manager.eng.bos.redhat.com: QUIT -> 8615
> Waiting for 1 node -> 8615........
	> resource_manager.eng.bos.redhat.com: OK

celery multi v3.1.11 (Cipater)
> Stopping nodes...
	> reserved_resource_worker-0.eng.bos.redhat.com: QUIT -> 8793
	> reserved_resource_worker-1.eng.bos.redhat.com: QUIT -> 8814
> Waiting for 2 nodes -> 8793, 8814........
	> reserved_resource_worker-0.eng.bos.redhat.com: OK
> Waiting for 1 node -> 8814....
	> reserved_resource_worker-1.eng.bos.redhat.com: OK

Stopping httpd: [  OK  ]

Stopping smart_proxy_dynflow_core: [  OK  ]
celery init v10.0.
Using config script: /etc/default/pulp_resource_manager
celery init v10.0.
Using config script: /etc/default/pulp_workers
Stopping pulp_streamer... OK
Stopping tomcat6: waiting for processes 8121 to exit
killing 8121 which did not stop after 30 seconds[WARNING]
[  OK  ]
celery init v10.0.
Using configuration: /etc/default/pulp_workers, /etc/default/pulp_celerybeat
Stopping pulp_celerybeat... OK
Stopping foreman-proxy: [  OK  ]
Shutting down qdrouterd services: [  OK  ]
[  OK  ] Qpid AMQP daemon: [  OK  ]
Stopping squid: ................[  OK  ]
Success!

Upgrade Step: start_databases...
Starting postgresql service: [  OK  ]

Success!

Upgrade Step: update_http_conf...

Upgrade Step: fix_pulp_httpd_conf...
Upgrade Step: migrate_pulp...
7932

Attempting to connect to localhost:27017
Attempting to connect to localhost:27017
Write concern for Mongo connection: {}
/usr/lib/python2.6/site-packages/pulp/server/db/connection.py:159: DeprecationWarning: add_son_manipulator is deprecated
  _DATABASE.add_son_manipulator(NamespaceInjector())
Loading content types.
Loading type descriptors []
Parsing type descriptors
Validating type descriptor syntactic integrity
Validating type descriptor semantic integrity
Loading unit model: puppet_module = pulp_puppet.plugins.db.models:Module
Loading unit model: erratum = pulp_rpm.plugins.db.models:Errata
Loading unit model: distribution = pulp_rpm.plugins.db.models:Distribution
Loading unit model: package_group = pulp_rpm.plugins.db.models:PackageGroup
Loading unit model: package_category = pulp_rpm.plugins.db.models:PackageCategory
Loading unit model: iso = pulp_rpm.plugins.db.models:ISO
Loading unit model: package_environment = pulp_rpm.plugins.db.models:PackageEnvironment
Loading unit model: drpm = pulp_rpm.plugins.db.models:DRPM
Loading unit model: srpm = pulp_rpm.plugins.db.models:SRPM
Loading unit model: rpm = pulp_rpm.plugins.db.models:RPM
Loading unit model: yum_repo_metadata_file = pulp_rpm.plugins.db.models:YumMetadataFile
Loading unit model: docker_blob = pulp_docker.plugins.models:Blob
Loading unit model: docker_manifest = pulp_docker.plugins.models:Manifest
Loading unit model: docker_image = pulp_docker.plugins.models:Image
Loading unit model: docker_tag = pulp_docker.plugins.models:Tag
Updating the database with types []
/usr/lib/python2.6/site-packages/pulp/server/db/model/base.py:96: DeprecationWarning: ensure_index is deprecated. Use create_index instead.
  unique=unique, background=True)
Found the following type definitions that were not present in the update collection [node, puppet_module, docker_tag, repository, erratum, docker_blob, docker_manifest, yum_repo_metadata_file, package_group, package_category, iso, package_environment, drpm, distribution, rpm, srpm, docker_image]
Updating the database with types [puppet_module, docker_tag, erratum, docker_blob, docker_manifest, yum_repo_metadata_file, package_group, package_category, iso, package_environment, drpm, distribution, rpm, srpm, docker_image]
Found the following type definitions that were not present in the update collection [node, repository]
/usr/lib/python2.6/site-packages/pulp/plugins/types/database.py:277: DeprecationWarning: save is deprecated. Use insert_one or replace_one instead
  content_type_collection.save(content_type)
Content types loaded.
Ensuring the admin role and user are in place.
Admin role and user are in place.
Beginning database migrations.
Migration package pulp.server.db.migrations is up to date at version 24
Migration package pulp_docker.plugins.migrations is up to date at version 2
Migration package pulp_puppet.plugins.migrations is up to date at version 5
Migration package pulp_rpm.plugins.migrations is up to date at version 34
Loading unit model: puppet_module = pulp_puppet.plugins.db.models:Module
Loading unit model: erratum = pulp_rpm.plugins.db.models:Errata
Loading unit model: distribution = pulp_rpm.plugins.db.models:Distribution
Loading unit model: package_group = pulp_rpm.plugins.db.models:PackageGroup
Loading unit model: package_category = pulp_rpm.plugins.db.models:PackageCategory
Loading unit model: iso = pulp_rpm.plugins.db.models:ISO
Loading unit model: package_environment = pulp_rpm.plugins.db.models:PackageEnvironment
Loading unit model: drpm = pulp_rpm.plugins.db.models:DRPM
Loading unit model: srpm = pulp_rpm.plugins.db.models:SRPM
Loading unit model: rpm = pulp_rpm.plugins.db.models:RPM
Loading unit model: yum_repo_metadata_file = pulp_rpm.plugins.db.models:YumMetadataFile
Loading unit model: docker_blob = pulp_docker.plugins.models:Blob
Loading unit model: docker_manifest = pulp_docker.plugins.models:Manifest
Loading unit model: docker_image = pulp_docker.plugins.models:Image
Loading unit model: docker_tag = pulp_docker.plugins.models:Tag
Database migrations complete.

Upgrade Step: start_httpd...
[Thu Jun 01 13:54:33 2017] [warn] module passenger_module is already loaded, skipping
Starting httpd: [  OK  ]
Success!

Upgrade Step: start_qpidd...
Starting Qpid AMQP daemon: [  OK  ]
Starting qdrouterd services: [  OK  ]
Success!

Upgrade Step: start_pulp...
celery multi v3.1.11 (Cipater)
> Starting nodes...
	> reserved_resource_worker-0.eng.bos.redhat.com: OK
	> reserved_resource_worker-1.eng.bos.redhat.com: OK
celery multi v3.1.11 (Cipater)
> Starting nodes...
	> resource_manager.eng.bos.redhat.com: OK
celery init v10.0.
Using config script: /etc/default/pulp_workers
celery init v10.0.
Using configuration: /etc/default/pulp_workers, /etc/default/pulp_celerybeat
Starting pulp_celerybeat...
celery init v10.0.
Using config script: /etc/default/pulp_resource_manager
Success!

Upgrade Step: migrate_candlepin...
Migrating candlepin database
Liquibase Update Successful

Upgrade Step: start_tomcat...
Starting tomcat6: [  OK  ]
Success!

Upgrade Step: migrate_foreman...
/opt/theforeman/tfm/root/usr/share/gems/gems/foreman_theme_satellite-0.1.43/app/models/concerns/satellite_packages.rb:4: warning: already initialized constant Katello::Ping::PACKAGES
/opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.0.0.135/app/models/katello/ping.rb:7: warning: previous definition of PACKAGES was here
true

/opt/theforeman/tfm/root/usr/share/gems/gems/foreman_theme_satellite-0.1.43/app/models/concerns/satellite_packages.rb:4: warning: already initialized constant Katello::Ping::PACKAGES
/opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.0.0.135/app/models/katello/ping.rb:7: warning: previous definition of PACKAGES was here

/opt/theforeman/tfm/root/usr/share/gems/gems/foreman_theme_satellite-0.1.43/app/models/concerns/satellite_packages.rb:4: warning: already initialized constant Katello::Ping::PACKAGES
/opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.0.0.135/app/models/katello/ping.rb:7: warning: previous definition of PACKAGES was here
false

Upgrade Step: Running installer...
Installing             Done                                               [100%] [..........................................................................................................]
  The full log is at /var/log/foreman-installer/satellite.log
Upgrade Step: restart_services...
celery multi v3.1.11 (Cipater)
> Stopping nodes...
	> reserved_resource_worker-1.eng.bos.redhat.com: QUIT -> 25176
	> reserved_resource_worker-0.eng.bos.redhat.com: QUIT -> 25159
> Waiting for 2 nodes -> 25176, 25159........
	> reserved_resource_worker-1.eng.bos.redhat.com: OK
> Waiting for 1 node -> 25159....
	> reserved_resource_worker-0.eng.bos.redhat.com: OK

celery multi v3.1.11 (Cipater)
> Stopping nodes...
	> resource_manager.eng.bos.redhat.com: QUIT -> 25354
> Waiting for 1 node -> 25354.....
	> resource_manager.eng.bos.redhat.com: OK

celery multi v3.1.11 (Cipater)
> Starting nodes...
	> resource_manager.eng.bos.redhat.com: OK
/usr/lib/python2.6/site-packages/pulp/server/db/connection.py:159: DeprecationWarning: add_son_manipulator is deprecated
  _DATABASE.add_son_manipulator(NamespaceInjector())
/usr/lib/python2.6/site-packages/pulp/server/db/model/base.py:96: DeprecationWarning: ensure_index is deprecated. Use create_index instead.
  unique=unique, background=True)
celery multi v3.1.11 (Cipater)
> Starting nodes...
	> reserved_resource_worker-0.eng.bos.redhat.com: OK
	> reserved_resource_worker-1.eng.bos.redhat.com: OK
[Thu Jun 01 14:02:15 2017] [warn] module passenger_module is already loaded, skipping
Stopping httpd: [  OK  ]

celery init v10.0.
Using config script: /etc/default/pulp_workers
celery init v10.0.
Using configuration: /etc/default/pulp_workers, /etc/default/pulp_celerybeat
Stopping pulp_celerybeat... OK
Stopping foreman-proxy: [  OK  ]
Stopping tomcat6: waiting for processes 25467 to exit
[  OK  ]
Stopping smart_proxy_dynflow_core: [  OK  ]
Stopping pulp_streamer... OK
celery init v10.0.
Using config script: /etc/default/pulp_resource_manager
[  OK  ] Qpid AMQP daemon: [  OK  ]
Shutting down qdrouterd services: [  OK  ]
Stopping squid: ................[  OK  ]
Stopping mongod: [  OK  ]
Stopping postgresql service: [  OK  ]
Success!
Starting postgresql service: [  OK  ]
Starting mongod: [  OK  ]
Waiting for mongod to become available: [  OK  ]
Starting squid: .[  OK  ]
Starting qdrouterd services: [  OK  ]
Starting Qpid AMQP daemon: [  OK  ]
celery init v10.0.
Using config script: /etc/default/pulp_resource_manager
Starting pulp_streamer...
OK
Starting smart_proxy_dynflow_core: [  OK  ]
Starting tomcat6: [  OK  ]
Starting foreman-proxy: [  OK  ]
celery init v10.0.
Using configuration: /etc/default/pulp_workers, /etc/default/pulp_celerybeat
Starting pulp_celerybeat...
celery init v10.0.
Using config script: /etc/default/pulp_workers
Starting foreman-tasks: [  OK  ]
Starting httpd: [  OK  ]
Success!

Upgrade Step: db_seed...
/opt/theforeman/tfm/root/usr/share/gems/gems/foreman_theme_satellite-0.1.43/app/models/concerns/satellite_packages.rb:4: warning: already initialized constant Katello::Ping::PACKAGES
/opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.0.0.135/app/models/katello/ping.rb:7: warning: previous definition of PACKAGES was here
E, [2017-06-01T14:02:53.503656 #32683] ERROR -- /client-dispatcher: Could not find an executor for Dynflow::Dispatcher::Envelope[request_id: 2, sender_id: bce90ad9-d605-45a3-bb34-d09cf3473454, receiver_id: Dynflow::Dispatcher::UnknownWorld, message: Dynflow::Dispatcher::Event[execution_plan_id: 992e2753-7871-4a60-986e-8f4db2e66c13, step_id: 2, event: Actions::Candlepin::ListenOnCandlepinEvents::Reconnect[message: initialized...have not connected yet]]] (Dynflow::Error)
/opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.13.6/lib/dynflow/dispatcher/client_dispatcher.rb:66:in `dispatch_request'
/opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.13.6/lib/dynflow/dispatcher/client_dispatcher.rb:38:in `block in publish_request'
/opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.13.6/lib/dynflow/dispatcher/client_dispatcher.rb:108:in `track_request'
/opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.13.6/lib/dynflow/dispatcher/client_dispatcher.rb:37:in `publish_request'
/opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.13.6/lib/dynflow/actor.rb:6:in `on_message'
/opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-edge-0.2.0/lib/concurrent/actor/context.rb:46:in `on_envelope'
/opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-edge-0.2.0/lib/concurrent/actor/behaviour/executes_context.rb:7:in `on_envelope'
/opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-edge-0.2.0/lib/concurrent/actor/behaviour/abstract.rb:25:in `pass'
/opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-0.8.13.6/lib/dynflow/actor.rb:26:in `on_envelope'
/opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-edge-0.2.0/lib/concurrent/actor/behaviour/abstract.rb:25:in `pass'
/opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-edge-0.2.0/lib/concurrent/actor/behaviour/awaits.rb:15:in `on_envelope'
/opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-edge-0.2.0/lib/concurrent/actor/behaviour/abstract.rb:25:in `pass'
/opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-edge-0.2.0/lib/concurrent/actor/behaviour/sets_results.rb:14:in `on_envelope'
/opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-edge-0.2.0/lib/concurrent/actor/behaviour/abstract.rb:25:in `pass'
/opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-edge-0.2.0/lib/concurrent/actor/behaviour/buffer.rb:38:in `process_envelope'
/opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-edge-0.2.0/lib/concurrent/actor/behaviour/buffer.rb:31:in `process_envelopes?'
/opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-edge-0.2.0/lib/concurrent/actor/behaviour/buffer.rb:20:in `on_envelope'
/opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-edge-0.2.0/lib/concurrent/actor/behaviour/abstract.rb:25:in `pass'
/opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-edge-0.2.0/lib/concurrent/actor/behaviour/termination.rb:55:in `on_envelope'
/opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-edge-0.2.0/lib/concurrent/actor/behaviour/abstract.rb:25:in `pass'
/opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-edge-0.2.0/lib/concurrent/actor/behaviour/removes_child.rb:10:in `on_envelope'
/opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-edge-0.2.0/lib/concurrent/actor/behaviour/abstract.rb:25:in `pass'
/opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-edge-0.2.0/lib/concurrent/actor/behaviour/sets_results.rb:14:in `on_envelope'
/opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-edge-0.2.0/lib/concurrent/actor/core.rb:161:in `process_envelope'
/opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-edge-0.2.0/lib/concurrent/actor/core.rb:95:in `block in on_envelope'
/opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-edge-0.2.0/lib/concurrent/actor/core.rb:118:in `block (2 levels) in schedule_execution'
/opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-1.0.0/lib/concurrent/synchronization/mri_lockable_object.rb:38:in `block in synchronize'
/opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-1.0.0/lib/concurrent/synchronization/mri_lockable_object.rb:38:in `synchronize'
/opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-1.0.0/lib/concurrent/synchronization/mri_lockable_object.rb:38:in `synchronize'
/opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-edge-0.2.0/lib/concurrent/actor/core.rb:115:in `block in schedule_execution'
/opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-1.0.0/lib/concurrent/executor/serialized_execution.rb:18:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-1.0.0/lib/concurrent/executor/serialized_execution.rb:18:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-1.0.0/lib/concurrent/executor/serialized_execution.rb:96:in `work'
/opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-1.0.0/lib/concurrent/executor/serialized_execution.rb:77:in `block in call_job'
/opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-1.0.0/lib/concurrent/executor/ruby_thread_pool_executor.rb:333:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-1.0.0/lib/concurrent/executor/ruby_thread_pool_executor.rb:333:in `run_task'
/opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-1.0.0/lib/concurrent/executor/ruby_thread_pool_executor.rb:322:in `block (3 levels) in create_worker'
/opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-1.0.0/lib/concurrent/executor/ruby_thread_pool_executor.rb:305:in `loop'
/opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-1.0.0/lib/concurrent/executor/ruby_thread_pool_executor.rb:305:in `block (2 levels) in create_worker'
/opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-1.0.0/lib/concurrent/executor/ruby_thread_pool_executor.rb:304:in `catch'
/opt/theforeman/tfm/root/usr/share/gems/gems/concurrent-ruby-1.0.0/lib/concurrent/executor/ruby_thread_pool_executor.rb:304:in `block in create_worker'
/opt/theforeman/tfm/root/usr/share/gems/gems/logging-1.8.2/lib/logging/diagnostic_context.rb:323:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/logging-1.8.2/lib/logging/diagnostic_context.rb:323:in `block in create_with_logging_context'
Seeding /usr/share/foreman/db/seeds.d/03-auth_sources.rb
Seeding /usr/share/foreman/db/seeds.d/03-permissions.rb
Seeding /usr/share/foreman/db/seeds.d/03-roles.rb
Seeding /usr/share/foreman/db/seeds.d/04-admin.rb
Seeding /usr/share/foreman/db/seeds.d/05-taxonomies.rb
Seeding /usr/share/foreman/db/seeds.d/06-architectures.rb
Seeding /usr/share/foreman/db/seeds.d/07-provisioning_templates.rb
Seeding /usr/share/foreman/db/seeds.d/08-partition_tables.rb
Seeding /usr/share/foreman/db/seeds.d/10-installation_media.rb
Seeding /opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.0.0.135/db/seeds.d/101-locations.rb
Seeding /opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.0.0.135/db/seeds.d/102-organizations.rb
Seeding /opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.0.0.135/db/seeds.d/103-provisioning_templates.rb
Seeding /opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.0.0.135/db/seeds.d/104-proxy.rb
Seeding /opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.0.0.135/db/seeds.d/106-mail_notifications.rb
Seeding /opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.0.0.135/db/seeds.d/107-enable_dynflow.rb
Seeding /opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.0.0.135/db/seeds.d/108-ensure_sync_notification.rb
Seeding /usr/share/foreman/db/seeds.d/11-smart_proxy_features.rb
Seeding /usr/share/foreman/db/seeds.d/13-compute_profiles.rb
Seeding /usr/share/foreman/db/seeds.d/15-bookmarks.rb
Seeding /usr/share/foreman/db/seeds.d/16-mail_notifications.rb
Seeding /opt/theforeman/tfm/root/usr/share/gems/gems/foreman-tasks-0.7.14.14/db/seeds.d/20-foreman_tasks_permissions.rb
Seeding /opt/theforeman/tfm/root/usr/share/gems/gems/redhat_access-1.0.15/db/seeds.d/200-update-insights-roles.rb
Seeding /opt/theforeman/tfm/root/usr/share/gems/gems/redhat_access-1.0.15/db/seeds.d/201-add-insights-email-notifications.rb
Seeding /opt/theforeman/tfm/root/usr/share/gems/gems/foreman_bootdisk-6.1.0.4/db/seeds.d/50-bootdisk_templates.rb
Seeding /opt/theforeman/tfm/root/usr/share/gems/gems/foreman_discovery-5.0.0.9/db/seeds.d/50_discovery_templates.rb
Seeding /opt/theforeman/tfm/root/usr/share/gems/gems/foreman-tasks-0.7.14.14/db/seeds.d/60-dynflow_proxy_feature.rb
Seeding /opt/theforeman/tfm/root/usr/share/gems/gems/foreman_remote_execution-0.3.0.17/db/seeds.d/60-ssh_proxy_feature.rb
Seeding /opt/theforeman/tfm/root/usr/share/gems/gems/foreman_discovery-5.0.0.9/db/seeds.d/60_discovery_proxy_feature.rb
Seeding /opt/theforeman/tfm/root/usr/share/gems/gems/foreman-tasks-0.7.14.14/db/seeds.d/61-foreman_tasks_bookmarks.rb
Seeding /opt/theforeman/tfm/root/usr/share/gems/gems/foreman_remote_execution-0.3.0.17/db/seeds.d/70-job_templates.rb
Seeding /opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.0.0.135/db/seeds.d/75-job_templates.rb
Seeding /opt/theforeman/tfm/root/usr/share/gems/gems/foreman_remote_execution-0.3.0.17/db/seeds.d/90-bookmarks.rb
Seeding /opt/theforeman/tfm/root/usr/share/gems/gems/foreman_theme_satellite-0.1.43/db/seeds.d/990 - provisioning_templates.rb
Seeding /opt/theforeman/tfm/root/usr/share/gems/gems/foreman_openscap-0.5.3.22/db/seeds.d/openscap_feature.rb
Seeding /opt/theforeman/tfm/root/usr/share/gems/gems/foreman_openscap-0.5.3.22/db/seeds.d/openscap_policy_notification.rb
All seed files executed

Upgrade Step: correct_repositories (this may take a while) ...
/opt/theforeman/tfm/root/usr/share/gems/gems/foreman_theme_satellite-0.1.43/app/models/concerns/satellite_packages.rb:4: warning: already initialized constant Katello::Ping::PACKAGES
/opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.0.0.135/app/models/katello/ping.rb:7: warning: previous definition of PACKAGES was here
Processing Repository 1/4: Red Hat Satellite 6.1 for RHEL 6 Server RPMs x86_64 (1)
Processing Repository 2/4: Red Hat Satellite 6.1 for RHEL 7 Server RPMs x86_64 (2)
Processing Repository 3/4: Red Hat Satellite 6.2 for RHEL 6 Server RPMs x86_64 (3)
Processing Repository 4/4: Red Hat Satellite 6.2 for RHEL 7 Server RPMs x86_64 (4)

Upgrade Step: correct_puppet_environments (this may take a while) ...
/opt/theforeman/tfm/root/usr/share/gems/gems/foreman_theme_satellite-0.1.43/app/models/concerns/satellite_packages.rb:4: warning: already initialized constant Katello::Ping::PACKAGES
/opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.0.0.135/app/models/katello/ping.rb:7: warning: previous definition of PACKAGES was here

Upgrade Step: clean_backend_objects (this may take a while) ...
/opt/theforeman/tfm/root/usr/share/gems/gems/foreman_theme_satellite-0.1.43/app/models/concerns/satellite_packages.rb:4: warning: already initialized constant Katello::Ping::PACKAGES
/opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.0.0.135/app/models/katello/ping.rb:7: warning: previous definition of PACKAGES was here
0 orphaned consumer id(s) found.

Upgrade completed!

