Bug 1486760 - [RFE] Allow bootstrap.py to migrate a system from Capsule->Capsule, Satellite->Capsule, or Capsule->Satellite while preserving (most) host information.
Alias: None
Product: Red Hat Satellite
Classification: Red Hat
Component: Bootstrap
Version: 6.2.10
Hardware: x86_64
OS: Linux
Priority: medium
Target Milestone: Unspecified
Assignee: Rich Jerrido
QA Contact: Lukas Pramuk
Duplicates: 1527332
Depends On:
Reported: 2017-08-30 13:45 UTC by Mihir Lele
Modified: 2020-12-14 09:46 UTC (History)
CC: 7 users

Fixed In Version: katello-client-bootstrap-1.5.0-1.el7sat
Doc Type: If docs needed, set a value
Doc Text:
Clone Of:
Last Closed: 2018-02-21 16:54:17 UTC
Target Upstream Version:

Attachments

Links:
- GitHub: Katello/katello-client-bootstrap pull 227 ("Capsule migration option, take 2", closed) - last updated 2020-11-30 16:07:17 UTC
- Red Hat Knowledge Base (Solution) 2206071 - last updated 2017-12-19 11:43:37 UTC

Description Mihir Lele 2017-08-30 13:45:19 UTC
Description of problem:

Host parameters added on the Host are lost after migrating the host from the Satellite to the Capsule.

Version-Release number of selected component (if applicable): 6.2.11

How reproducible: Always

Steps to Reproduce:
1.  Register a host with the Satellite; add Puppet classes and host parameters to the host.
2.  Re-register the host to the Capsule using bootstrap.py.
3.  Check the host's "Host Parameters".
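
A hedged sketch of step 2's re-registration (the hostname, organization, host group, activation key, and credentials below are placeholders, not values from this report):

```shell
# Re-register the host against the Capsule (all values are examples).
# At the time of this report there was no dedicated migration flag, so
# re-running bootstrap.py recreates the host record on the Satellite,
# which is how the previously defined host parameters get lost.
./bootstrap.py \
  --login admin \
  --server cap.example.com \
  --organization "Default Organization" \
  --hostgroup RHEL7 \
  --activationkey rhel7 \
  --force
```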

Actual results:

The host parameters which were previously added on the host are lost.

Expected results:

The Satellite should retain the host parameters after the host is migrated from the Satellite to the Capsule.

Additional info:

Comment 1 Daniel Lobato Garcia 2017-09-05 12:15:22 UTC
Tested - I will file a bug in the bootstrap script issue tracker.

Comment 3 Daniel Lobato Garcia 2017-09-05 12:18:12 UTC

Comment 4 Rich Jerrido 2017-09-05 16:28:04 UTC
The current design of the bootstrap script does not preserve the original host record if it is run to move a host from a Satellite to a Capsule. In the usage as described in comment #0, the user effectively recreates the host entry in Satellite, which loses any parameters that were defined. 
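
In other words, preserving the record means updating the existing host via the Foreman API rather than recreating it. A hedged sketch of what such an update could look like (the hostname, proxy IDs, and credentials are illustrative, and the exact payload is an assumption, not bootstrap.py's actual code):

```shell
# Illustrative only: update the existing host record in place instead of
# recreating it. Hostnames, IDs, and credentials are placeholders.
payload='{"host":{"puppet_proxy_id":2,"puppet_ca_proxy_id":2,"content_facet_attributes":{"content_source_id":2}}}'
# e.g.:
# curl -u admin:changeme -H 'Content-Type: application/json' \
#      -X PUT "https://sat.example.com/api/hosts/h1.example.com" \
#      -d "$payload"
echo "$payload"
```

Because the host record survives, anything attached to it (parameters, Puppet classes, facts, reports) survives the migration too.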

Updating the $SUBJECT of this BZ to reflect the actual request.

Comment 5 Rich Jerrido 2017-12-19 11:43:38 UTC
*** Bug 1527332 has been marked as a duplicate of this bug. ***

Comment 6 Rich Jerrido 2018-01-23 19:18:34 UTC
Moving to POST as https://github.com/Katello/katello-client-bootstrap/pull/227 has been merged.

Comment 7 Lukas Pramuk 2018-02-15 14:48:48 UTC


0. Have a Satellite with two Capsules; the Puppet CAs on all of them have an autosign entry '*'.

1. Provision a host on the Satellite with a custom hostgroup parameter and a host parameter.

2. Migrate the host to a Capsule using bootstrap.py:

@HOST # ./bootstrap.py -l admin -p changeme --new-capsule --server cap.example.com

[RUNNING], [2018-02-13 17:37:01], [Calling Foreman API to update Puppet master and Puppet CA for h1.example.com to cap.example.com]
[WARNING], [2018-02-13 17:37:04], NON-FATAL: [New capsule doesn't have OpenSCAP capability, not switching / configuring openscap_proxy_id] failed to execute properly.
[RUNNING], [2018-02-13 17:37:04], [Calling Foreman API to update content source for h1.example.com to cap.example.com]
[RUNNING], [2018-02-13 17:37:04], [/usr/bin/systemctl enable rhsmcertd]

[SUCCESS], [2018-02-13 17:37:05], [/usr/bin/systemctl enable rhsmcertd], completed successfully.

[RUNNING], [2018-02-13 17:37:05], [/usr/bin/systemctl restart rhsmcertd]

[SUCCESS], [2018-02-13 17:37:05], [/usr/bin/systemctl restart rhsmcertd], completed successfully.

[RUNNING], [2018-02-13 17:37:05], [Stopping the Puppet agent for configuration update]
[RUNNING], [2018-02-13 17:37:05], [/usr/bin/systemctl stop puppet]

[SUCCESS], [2018-02-13 17:37:05], [/usr/bin/systemctl stop puppet], completed successfully.

[RUNNING], [2018-02-13 17:37:05], [Updating Puppet configuration]
[RUNNING], [2018-02-13 17:37:05], [sed -i '/^[[:space:]]*server.*/ s/=.*/= cap.example.com/' /etc/puppet/puppet.conf]

[SUCCESS], [2018-02-13 17:37:05], [sed -i '/^[[:space:]]*server.*/ s/=.*/= cap.example.com/' /etc/puppet/puppet.conf], completed successfully.

[RUNNING], [2018-02-13 17:37:05], [sed -i '/^[[:space:]]*ca_server.*/ s/=.*/= cap.example.com/' /etc/puppet/puppet.conf]

[SUCCESS], [2018-02-13 17:37:05], [Removing /var/lib/puppet/ssl], completed successfully.
[SUCCESS], [2018-02-13 17:37:05], [Removing /var/lib/puppet/client_data/catalog/h1.example.com.json], completed successfully.
[NOTIFICATION], [2018-02-13 17:37:05], [Running Puppet in noop mode to generate SSL certs]
[NOTIFICATION], [2018-02-13 17:37:05], [Visit the UI and approve this certificate via Infrastructure->Capsules]
[NOTIFICATION], [2018-02-13 17:37:05], [if auto-signing is disabled]
[RUNNING], [2018-02-13 17:37:05], [/usr/bin/puppet agent --test --noop --tags no_such_tag --waitforcert 10]
Info: Creating a new SSL key for h1.example.com
Info: Caching certificate for ca
Info: csr_attributes file loading from /etc/puppet/csr_attributes.yaml
Info: Creating a new SSL certificate request for h1.example.com
Info: Certificate Request fingerprint (SHA256): E7:B4:B9:48:B1:E4:85:E6:1F:78:C1:94:8D:E6:DA:EA:89:9D:89:5B:16:BC:14:55:E9:0B:36:1B:3E:4E:0F:16
Info: Caching certificate for ca
Info: Caching certificate for h1.example.com
Info: Caching certificate_revocation_list for ca
Info: Retrieving pluginfacts
Info: Retrieving plugin
Notice: /File[/var/lib/puppet/lib/facter/rh_certificates.rb]/ensure: removed
Info: Loading facts
Info: Caching catalog for h1.example.com
Info: Applying configuration version '1518543511'
Notice: Finished catalog run in 0.28 seconds
[SUCCESS], [2018-02-13 17:38:34], [/usr/bin/puppet agent --test --noop --tags no_such_tag --waitforcert 10], completed successfully.

[RUNNING], [2018-02-13 17:38:34], [/usr/bin/systemctl enable puppet]

[SUCCESS], [2018-02-13 17:38:34], [/usr/bin/systemctl enable puppet], completed successfully.

[RUNNING], [2018-02-13 17:38:34], [/usr/bin/systemctl restart puppet]

[SUCCESS], [2018-02-13 17:38:34], [/usr/bin/systemctl restart puppet], completed successfully.

[NOTIFICATION], [2018-02-13 17:38:34], [Puppet agent is not running; please start manually if required.]
[NOTIFICATION], [2018-02-13 17:38:34], [You also need to manually revoke the certificate on the old capsule.]
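
The final notification is a manual step bootstrap.py does not perform. A hedged sketch of the revocation, run on the OLD Capsule/Satellite that holds the Puppet CA (the hostname is an example; `puppet cert` is the Puppet 3/4 CLI used in this log):

```shell
# On the OLD Puppet CA, revoke and remove the migrated host's certificate.
puppet cert clean h1.example.com
# On Puppet Server 6+ the equivalent is:
# puppetserver ca clean --certname h1.example.com
```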

# hammer host info --name h1.example.com

Environment:              KT_Default_Organization_Library_Puppet_View_2
Puppet CA Id:             2
Puppet Master Id:         2
Cert name:                h1.example.com
Parameters:
    host_param_2 => two
All parameters:
    host_param_2 => two
    kt_activation_keys => rhel7
    hg_param_1 => one

    Content Source:
        ID:   2
        Name: cap.example.com

>>> @UI check: all params are retained and host is migrated to capsule

@HOST # grep -r cap.example.com /etc/rhsm/rhsm.conf /etc/yum.repos.d /etc/puppet

/etc/rhsm/rhsm.conf:hostname = cap.example.com
/etc/rhsm/rhsm.conf:baseurl= https://cap.example.com/pulp/repos
/etc/yum.repos.d/redhat.repo:baseurl = https://cap.example.com/pulp/repos/Default_Organization/Library/content/dist/rhel/server/7/$releasever/$basearch/os
/etc/yum.repos.d/redhat.repo:baseurl = https://cap.example.com/pulp/repos/Default_Organization/Library/custom/Internal_RHEL7/Tools_Puppet_4_RHEL7_x86_64
/etc/yum.repos.d/redhat.repo:baseurl = https://cap.example.com/pulp/repos/Default_Organization/Library/custom/Internal_RHEL7/Tools_6_3_RHEL7_x86_64
/etc/yum.repos.d/redhat.repo:baseurl = https://cap.example.com/pulp/repos/Default_Organization/Library/custom/Internal_RHEL7/RHEL_7_4
/etc/puppet/puppet.conf:ca_server       = cap.example.com
/etc/puppet/puppet.conf:server          = cap.example.com
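
The two puppet.conf lines above are the result of the sed edits shown in the bootstrap log. They can be exercised in isolation against a scratch file (the sample config below is an assumption about the file's shape, not a copy of the test host's config):

```shell
# Demonstrate the bootstrap.py puppet.conf rewrite on a scratch copy.
tmpconf=$(mktemp)
cat > "$tmpconf" <<'EOF'
[agent]
server          = sat.example.com
ca_server       = sat.example.com
EOF
# Point the agent at the new Capsule (same sed commands as in the log).
# Note the first pattern anchors on 'server' and so does not also
# rewrite the 'ca_server' line.
sed -i '/^[[:space:]]*server.*/ s/=.*/= cap.example.com/' "$tmpconf"
sed -i '/^[[:space:]]*ca_server.*/ s/=.*/= cap.example.com/' "$tmpconf"
grep -E '^(ca_)?server' "$tmpconf"
```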

# puppet agent -t
# yum -y install nc

>>> client-side checks show that the host is registered to, and reports to, the Capsule

Iterated over various migrations (Satellite->Capsule, Capsule->Satellite, Capsule->Capsule2), gradually upgraded to Puppet 4 along the way, and also tried with a host already upgraded to Puppet 4 - all worked.

Comment 9 Satellite Program 2018-02-21 16:54:17 UTC
Since the problem described in this bug report should be resolved in a recent advisory, it has been closed with a resolution of ERRATA.

For information on the advisory, and where to find the updated files, follow the link below. If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHSA-2018:0336
