Bug 1054935
| Summary: | cpu host-model should permanently encode the CPU definition on first define | | |
|---|---|---|---|
| Product: | [Community] Virtualization Tools | Reporter: | Cole Robinson <crobinso> |
| Component: | libvirt | Assignee: | Libvirt Maintainers <libvirt-maint> |
| Status: | CLOSED DEFERRED | QA Contact: | |
| Severity: | unspecified | Docs Contact: | |
| Priority: | unspecified | | |
| Version: | unspecified | | |
| Target Milestone: | --- | | |
| Target Release: | --- | | |
| Hardware: | Unspecified | | |
| OS: | Unspecified | | |
| Whiteboard: | | | |
| Fixed In Version: | | Doc Type: | Bug Fix |
| Doc Text: | | Story Points: | --- |
| Clone Of: | | Environment: | |
| Last Closed: | 2018-10-09 14:54:49 UTC | Type: | Bug |
| Regression: | --- | Mount Type: | --- |
| Documentation: | --- | CRM: | |
| Verified Versions: | | Category: | --- |
| oVirt Team: | --- | RHEL 7.3 requirements from Atomic Host: | |
| Cloudforms Team: | --- | Target Upstream Version: | |
| Embargoed: | | | |
Description
Cole Robinson
2014-01-17 19:02:52 UTC
Still an issue with latest code.

This hasn't really turned out to be an issue anyone cares about in practice, so I don't think it needs explicit libvirt support. If apps care, they can grab the host-model CPU block from domcapabilities and define it into the XML themselves, which will accomplish the same goal.
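
For reference, a minimal sketch (not part of the original report) of what an application could do along those lines with libvirt-python: query domcapabilities, pull out the host-model CPU expansion, and turn it into a fixed `<cpu mode='custom'>` element to splice into the domain XML before defining it. The `qemu:///system` URI and the direct child-copying are assumptions; a real app would also want to handle the model's `fallback` attribute and hosts where host-model is not reported.

```python
# Sketch: snapshot the host-model CPU expansion from domcapabilities into a
# fixed <cpu mode='custom'> definition, so the CPU model no longer changes
# if the host (or libvirt's view of it) changes later.
import libvirt
import xml.etree.ElementTree as ET

conn = libvirt.open("qemu:///system")  # assumed connection URI

# Ask libvirt what "host-model" expands to on this host.
caps_xml = conn.getDomainCapabilities(None, None, None, None, 0)
caps = ET.fromstring(caps_xml)

# Assumes host-model is reported as supported on this host.
host_model = caps.find("./cpu/mode[@name='host-model']")

# Copy the <model>, <vendor> and <feature> children into a custom CPU element.
cpu = ET.Element("cpu", {"mode": "custom", "match": "exact"})
for child in host_model:
    cpu.append(child)

# This element can then be inserted into the domain XML passed to defineXML().
print(ET.tostring(cpu, encoding="unicode"))
```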