Description of problem:
Customer needs to be able to deploy and manage multiple Overclouds from a single Undercloud environment. They need to segregate multiple Overclouds within a single location without creating a separate Undercloud and provisioning network for each one.
*** Bug 1275654 has been marked as a duplicate of this bug. ***
This bug did not make the OSP 8.0 release. It is being deferred to OSP 10.
Quality Engineering Management has reviewed and declined this request.
You may appeal this decision by reopening this request.
This is a very complex request, mainly on the networking side. Removing the target for now due to the short Ocata release cycle.
This bugzilla has been removed from the release and needs to be reviewed for targeting another release.
Moving out of OSP 13. Delivering the containerized undercloud has higher priority.
Due to the ongoing migration of the underlying technology to containers, which for the undercloud is targeted for RHOSP 14, any new undercloud features will be pushed out until after this update. Therefore, we will re-evaluate this feature for the RHOSP 15 target.
We'll need more requirements around how to make this work.
Should the overclouds be separated by different tenants in the undercloud? Right now, the admin tenant is used for everything.
What if anything would be shared between multiple overclouds? The provisioning network may need to be shared. What about other networks?
Initial request is to deploy multiple standalone stacks. Let's keep it as simple as possible and make sure there is a good UX around it.
Single tenant is acceptable for the first iteration of this feature.
The first step should be independent deployments (a shared provisioning network, if needed, is OK). Further requirements on connecting multiple overclouds and sharing some resources (networks, data replication, etc.) will come in later iterations, with supporting requirements from the multi-site epic leads.
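As a sketch of what independent deployments from one undercloud might look like to the operator, assuming TripleO's `openstack overcloud deploy` command with its `--stack` option to name each Heat stack; the stack names and environment files below are hypothetical, not taken from this bug report:

```shell
# Deploy two independent overcloud stacks from the same undercloud,
# sharing the undercloud's provisioning network. Each stack gets its
# own name and its own parameter environment file (illustrative names).
source ~/stackrc

openstack overcloud deploy --templates \
    --stack overcloud-one \
    -e ~/overcloud-one-params.yaml

openstack overcloud deploy --templates \
    --stack overcloud-two \
    -e ~/overcloud-two-params.yaml
```

Each invocation creates a separate stack, so the overclouds can be updated or deleted independently while still being managed from the single undercloud.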
The automation is ready and the feature is verified.
Since the problem described in this bug report should be resolved in a recent advisory, it has been closed with a resolution of ERRATA.

For information on the advisory, and where to find the updated files, follow the link below.

If the solution does not work for you, open a new bug report.