Summary:          HeatApiCloudwatch status for rhos 12
Product:          Red Hat OpenStack
Component:        openstack-tripleo-heat-templates
Version:          12.0 (Pike)
Target Release:   13.0 (Queens)
Reporter:         Gurenko Alex <agurenko>
Assignee:         Alex Schultz <aschultz>
QA Contact:       Ronnie Rasouli <rrasouli>
CC:               agurenko, aschultz, jschluet, mburns, ramishra, rhel-osp-director-maint, sbaker, shardy, srevivo, therve
Status:           CLOSED ERRATA
Fixed In Version: openstack-tripleo-heat-templates-8.0.0-0.20180122224017.el7ost
Last Closed:      2018-06-27 13:33:53 UTC
Type:             Bug
Description Gurenko Alex 2017-08-10 15:00:20 UTC
Description of problem: During an RHOS 12 deployment with containers, I'm getting an error that the HeatApiCloudwatch container does not exist. A brief comment at https://bugzilla.redhat.com/show_bug.cgi?id=1468256#c5 suggests it may be deprecated for RHOS 12. Should it be removed from all role definitions starting with RHOS 12, or will it be containerized and added later in the cycle?

How reproducible: 100%
Comment 1 Thomas Hervé 2017-09-28 10:41:13 UTC
I'm not sure what we should do here. CW is deprecated and will be removed in 13, but it doesn't really need to go in 12, even though the container is gone (unless that creates a problem with upgrades?). Nobody should have it deployed.
Comment 2 Gurenko Alex 2017-10-01 08:58:15 UTC
Based on what you said, it just needs to be removed from all references in both InfraRed (for 12+) and TripleO. As of right now it's still being generated and added to the composable deployment, for example:

  [stack@undercloud-0 ~]$ openstack overcloud role show Controller | grep HeatApiCloudwatch
  * OS::TripleO::Services::HeatApiCloudwatch
Comment 3 Alex Schultz 2017-10-02 14:21:36 UTC
We probably just need to remove the references from the roles files in THT/roles/*.yaml.
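A minimal sketch of the cleanup described above, run against a throwaway copy of a role file rather than the live templates (the sample role file contents and the working directory are illustrative assumptions; on a real undercloud the files live under /usr/share/openstack-tripleo-heat-templates/roles):

```shell
# Create a scratch directory with a sample role file; the service list
# below is a stand-in for a real THT roles/*.yaml file.
workdir=$(mktemp -d)
cat > "$workdir/Controller.yaml" <<'EOF'
- name: Controller
  ServicesDefault:
    - OS::TripleO::Services::HeatApi
    - OS::TripleO::Services::HeatApiCfn
    - OS::TripleO::Services::HeatApiCloudwatch
    - OS::TripleO::Services::HeatEngine
EOF

# Delete the deprecated service entry from every role file in place
sed -i '/OS::TripleO::Services::HeatApiCloudwatch/d' "$workdir"/*.yaml

# Confirm the entry is gone while the other Heat services remain
! grep -q HeatApiCloudwatch "$workdir/Controller.yaml" && echo "entry removed"
```

The same sed invocation pointed at the real roles directory would strip the reference from all role definitions at once, which matches the scope of the fix that was merged upstream.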
Comment 4 Alex Schultz 2017-11-29 22:04:59 UTC
Since this was merged upstream for 13, I've updated the bug to reflect its current status.
Comment 8 Gurenko Alex 2018-04-09 09:30:06 UTC
So I had a look at the default roles in a recent puddle and it's still there.

  [stack@undercloud-0 roles]$ cat /etc/rhosp-release
  Red Hat OpenStack Platform release 13.0 Beta (Queens)
  [stack@undercloud-0 roles]$ cat ~/core_puddle_version
  2018-04-03.3
  [stack@undercloud-0 roles]$ rpm -q openstack-tripleo-heat-templates
  openstack-tripleo-heat-templates-8.0.2-0.20180327213843.f25e2d8.el7ost.noarch
  [stack@undercloud-0 roles]$ grep -i heatapicloudwatch ./Controller*
  ./ControllerAllNovaStandalone.yaml:    - OS::TripleO::Services::HeatApiCloudwatch
  ./ControllerNoCeph.yaml:    - OS::TripleO::Services::HeatApiCloudwatch
  ./ControllerNovaStandalone.yaml:    - OS::TripleO::Services::HeatApiCloudwatch
  ./ControllerOpenstack.yaml:    - OS::TripleO::Services::HeatApiCloudwatch
  ./ControllerStorageNfs.yaml:    - OS::TripleO::Services::HeatApiCloudwatch
  ./Controller.yaml:    - OS::TripleO::Services::HeatApiCloudwatch
Comment 9 Thomas Hervé 2018-04-09 09:34:38 UTC
The role is still present for upgrade purposes, but the service shouldn't be there.
Comment 10 Gurenko Alex 2018-04-09 15:42:48 UTC
I don't see the cloudwatch container on a new deployment on the controllers, so it seems to be working as expected.

  [heat-admin@controller-0 ~]$ sudo docker ps | grep heat
  251cc64f9636  192.168.24.1:8787/rhosp13/openstack-heat-api:2018-04-03.3      "kolla_start"  14 minutes ago  Up 14 minutes            heat_api_cron
  a49c5733e1e0  192.168.24.1:8787/rhosp13/openstack-heat-api-cfn:2018-04-03.3  "kolla_start"  14 minutes ago  Up 14 minutes (healthy)  heat_api_cfn
  f2c81d9b0138  192.168.24.1:8787/rhosp13/openstack-heat-engine:2018-04-03.3   "kolla_start"  14 minutes ago  Up 14 minutes (healthy)  heat_engine
  971abc7bf307  192.168.24.1:8787/rhosp13/openstack-heat-api:2018-04-03.3      "kolla_start"  15 minutes ago  Up 15 minutes (healthy)  heat_api
Comment 12 errata-xmlrpc 2018-06-27 13:33:53 UTC
Since the problem described in this bug report should be resolved in a recent advisory, it has been closed with a resolution of ERRATA. For information on the advisory, and where to find the updated files, follow the link below. If the solution does not work for you, open a new bug report. https://access.redhat.com/errata/RHEA-2018:2086