Bug 1793847 - Performance regression in scale deploy timings with config-download
Summary: Performance regression in scale deploy timings with config-download
Status: CLOSED DUPLICATE of bug 1767581
Alias: None
Product: Red Hat OpenStack
Classification: Red Hat
Component: python-tripleoclient
Version: 16.0 (Train)
Hardware: x86_64
OS: Linux
Target Milestone: ---
Assignee: RHOS Maint
QA Contact: Sasha Smolyak
Depends On:
Reported: 2020-01-22 04:42 UTC by Dave Wilson
Modified: 2020-06-05 14:09 UTC
CC: 6 users

Fixed In Version:
Doc Type: If docs needed, set a value
Doc Text:
Clone Of:
Last Closed: 2020-06-05 14:09:25 UTC
Target Upstream Version:

Attachments
ansible.log (2.70 MB, application/gzip)
2020-01-22 04:42 UTC, Dave Wilson

Description Dave Wilson 2020-01-22 04:42:53 UTC
Created attachment 1654441 [details]

Description of problem: Scale-out deploy timings for OSP16 within the RDU scale lab are taking considerably longer than those experienced with OSP13 (non-config-download).

Version-Release number of selected component (if applicable): OSP16 (RHOS_TRUNK-16.0-RHEL-8-20200113.n.0)

How reproducible:

Steps to Reproduce:
1. Allocation of 300 baremetal nodes in scale env
2. Scale out env in increments of 50
3. Default timeouts are hit and the deploy fails
4. Bump up the keystone token timeout
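Step 4 above refers to raising the keystone token lifetime so it outlives the long-running deploy. A minimal sketch of the relevant setting in keystone.conf (the raised value shown is illustrative, not taken from this report):

```ini
# keystone.conf on the undercloud -- illustrative values, not from this bug
[token]
# The default token lifetime is 3600 seconds; a multi-hour scale-out
# deploy can outlive the token, causing authentication failures mid-deploy.
expiration = 14400
```

In practice this would be applied through the undercloud's configuration tooling rather than by editing keystone.conf directly.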

Actual results: 
1. Scale out of ~250 nodes with config-download took 7.5 hr (excluding stack create)

Expected results:
1. Deploy timings equal to or less than those documented for the 500 baremetal node scale deploy with OSP13

Additional info:
1. Including ansible.log for the deploy of the 250 node scale out
2. In OSP13 the same scale out took ~2 hrs (although without Ceph, which accounts for 1.5 hr of the deploy)

Comment 3 Luke Short 2020-06-05 14:09:25 UTC
We have various new features coming in a future release of RHOSP 16 including `openstack overcloud deploy --limit` and a new Ansible strategy to help with the deployment time.
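As a hedged illustration of the `--limit` option mentioned above (role names and environment files here are hypothetical, not from this report), it restricts the config-download Ansible run to a subset of hosts instead of re-running against the entire overcloud:

```
# Illustrative only: re-run the deploy against a single role
# rather than all ~250 overcloud nodes. Names are hypothetical.
openstack overcloud deploy \
  --templates \
  -e ~/my-environment.yaml \
  --limit Compute
```

The value passed to `--limit` is forwarded to the underlying ansible-playbook run, so it accepts Ansible host-pattern syntax.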

*** This bug has been marked as a duplicate of bug 1767581 ***
