Bug 1729267 - [RFE] Develop playbook for migration from FileStore to BlueStore for RHCS 4.0 upgrades
Summary: [RFE] Develop playbook for migration from FileStore to BlueStore for RHCS 4.0...
Keywords:
Status: CLOSED CURRENTRELEASE
Alias: None
Product: Red Hat Ceph Storage
Classification: Red Hat
Component: Ceph-Ansible
Version: 4.0
Hardware: Unspecified
OS: Unspecified
Priority: high
Severity: high
Target Milestone: rc
Target Release: 4.0
Assignee: Guillaume Abrioux
QA Contact: Ameena Suhani S H
Docs Contact: Aron Gunn
URL:
Whiteboard:
Duplicates: 1613735 1641167 1649620 1792958
Depends On: 1738576 1756366 1756371
Blocks: 1644347 1730176 1733577 1738236 1770319
 
Reported: 2019-07-11 17:50 UTC by Mike Hackett
Modified: 2020-01-28 05:26 UTC
CC List: 19 users

Fixed In Version: ceph-ansible-4.0.11-1.el8cp, ceph-ansible-4.0.11-1.el7cp
Doc Type: Enhancement
Doc Text:
.Ansible playbook for migrating OSDs from FileStore to BlueStore
A new Ansible playbook has been added to migrate OSDs from FileStore to BlueStore. The object store migration is not done as part of the upgrade process to {storage-product} {storage-product-current-release}. Do the migration after the upgrade completes. For details, see the link:{install-guide}#how-to-migrate-the-object-store-from-filestore-to-bluestore[_How to migrate the object store from FileStore to BlueStore_] section in the _Installation Guide_ for {storage-product} {storage-product-current-release}.
Clone Of:
Environment:
Last Closed: 2020-01-27 19:57:49 UTC
Target Upstream Version:


Attachments


Links
System ID Priority Status Summary Last Updated
Github ceph ceph-ansible pull 4472/commits/8724a734d319525f5f2c75f70f3e88268e595487 None None None 2020-02-12 16:01:27 UTC
Github ceph ceph-ansible pull 4871 None closed filestore-to-bluestore: umount partitions before zapping them 2020-02-12 16:01:28 UTC
Github ceph ceph-ansible pull 4892 None closed ceph_volume: support filestore to bluestore migration (bp #4889) 2020-02-12 16:01:28 UTC
Github ceph ceph-ansible pull 4971 None closed filestore-to-bluestore: fix osd_auto_discovery 2020-02-12 16:01:28 UTC
Github ceph ceph-ansible pull 4972 None closed filestore-to-bluestore: fix osd_auto_discovery (bp #4971) 2020-02-12 16:01:28 UTC
Github ceph ceph-ansible pull 4985 None closed filestore-to-bluestore: don't fail when with no PV 2020-02-12 16:01:28 UTC
Github ceph ceph-ansible pull 4986 None closed filestore-to-bluestore: don't fail when with no PV (bp #4985) 2020-02-12 16:01:29 UTC

Description Mike Hackett 2019-07-11 17:50:34 UTC
Description of problem:
With the release of RHCS 4.0 supporting only BlueStore, we will require a playbook to automate the migration of upgraded clusters that are currently on FileStore to BlueStore.


Version-Release number of selected component (if applicable):
RHCS 4.0
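The requested automation eventually landed as a dedicated infrastructure playbook (the linked PRs are all prefixed `filestore-to-bluestore:`). A hedged sketch of how such a migration run might look, assuming the upstream `infrastructure-playbooks/filestore-to-bluestore.yml` path and an example node name `osd-node-01` (exact paths, options, and host names may differ by ceph-ansible release):

```shell
# Before migrating, check which object store each OSD currently uses:
ceph osd count-metadata osd_objectstore

# Run the migration one OSD node at a time, limiting the play to that node:
ansible-playbook -i hosts \
    infrastructure-playbooks/filestore-to-bluestore.yml \
    --limit osd-node-01

# Afterwards, confirm a given OSD (here, osd.0) reports bluestore:
ceph osd metadata 0 | grep osd_objectstore
```

Running node by node keeps the cluster serving I/O while each node's OSDs are redeployed on BlueStore.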

Comment 4 Scott Ostapovicz 2019-07-17 12:05:20 UTC
The new code freeze date for this is September 2.

Comment 5 Giridhar Ramaraju 2019-08-05 13:09:43 UTC
Updating the QA Contact to Hemant. Hemant will reroute it to the appropriate QE Associate.

Regards,
Giri

Comment 6 Giridhar Ramaraju 2019-08-05 13:10:56 UTC
Updating the QA Contact to Hemant. Hemant will reroute it to the appropriate QE Associate.

Regards,
Giri

Comment 7 Tejas 2019-08-12 07:15:22 UTC
*** Bug 1613735 has been marked as a duplicate of this bug. ***

Comment 8 Guillaume Abrioux 2019-08-21 07:29:07 UTC
*** Bug 1641167 has been marked as a duplicate of this bug. ***

Comment 9 Dimitri Savineau 2019-09-06 16:19:41 UTC
*** Bug 1649620 has been marked as a duplicate of this bug. ***

Comment 24 Vasishta 2019-12-20 12:53:48 UTC
Hi Guillaume,

Tried https://patch-diff.githubusercontent.com/raw/ceph/ceph-ansible/pull/4874.diff

It works fine when the OSDs were configured as:

>> devices="['/dev/sdb','/dev/sdc','/dev/sdd']" osd_scenario="collocated" dmcrypt="true"

Regards,
Vasishta
QE, Ceph

Comment 26 Yaniv Kaul 2020-01-08 13:41:32 UTC
Is this going to be fixed for RHCS 4.0? What's the latest status? https://github.com/ceph/ceph-ansible/pull/4871 was merged almost 3 weeks ago - what else is missing?

Comment 38 Dimitri Savineau 2020-01-23 21:45:21 UTC
*** Bug 1792958 has been marked as a duplicate of this bug. ***

Comment 46 Federico Lucifredi 2020-01-27 19:57:49 UTC
Ok - the error message is a 4.1 issue (and we need a separate bug for it). This one is CLOSED because the RFE is for the migration script, and that is in.

