| Summary: | Create an automated test for a full Beaker provisioning cycle & multihost tests | ||
|---|---|---|---|
| Product: | [Retired] Beaker | Reporter: | Nick Coghlan <ncoghlan> |
| Component: | tests | Assignee: | beaker-dev-list |
| Status: | CLOSED WONTFIX | QA Contact: | tools-bugs <tools-bugs> |
| Severity: | unspecified | Docs Contact: | |
| Priority: | unspecified | ||
| Version: | develop | CC: | dcallagh, ebaak, qwan, rjoost, tools-bugs |
| Target Milestone: | --- | Keywords: | FutureFeature |
| Target Release: | --- | ||
| Hardware: | Unspecified | ||
| OS: | Unspecified | ||
| Whiteboard: | |||
| Fixed In Version: | Doc Type: | Enhancement | |
| Doc Text: | Story Points: | --- | |
| Clone Of: | Environment: | ||
| Last Closed: | 2017-07-26 05:04:02 UTC | Type: | Bug |
| Regression: | --- | Mount Type: | --- |
| Documentation: | --- | CRM: | |
| Verified Versions: | Category: | --- | |
| oVirt Team: | --- | RHEL 7.3 requirements from Atomic Host: | |
| Cloudforms Team: | --- | Target Upstream Version: | |
|
Description
Nick Coghlan
2013-12-17 08:06:08 UTC
I am inclined to close this as WONTFIX, because I think the approach suggested here (create an entire virtualised Beaker lab, including test machines, and assert that they can be provisioned) would be costly to implement, and does not necessarily give us any better coverage than our current two-sided approach, which is:

* the Python-level unit test and integration test suite in Beaker's source tree, which is run inside dogfood and exercises all parts of Beaker itself with varying levels of mockery but *no* interaction with real hardware
* the workflow-selftest jobs, including the Jenkins job that invokes them (see bug 954265, bug 1299722), which run the self-tests to exercise all of Beaker's functionality on *real* hardware with as many different distro/arch combinations as possible