Bug 1497408 - Failed to undeploy HOSA
Status: CLOSED ERRATA
Product: OpenShift Container Platform
Classification: Red Hat
Component: Installer
Version: 3.7.0
Hardware: Unspecified
OS: Unspecified
Priority: medium
Severity: high
Target Milestone: ---
Target Release: 3.9.0
Assigned To: John Sanda
QA Contact: Junqi Zhao
Depends On:
Blocks: 1548567
Reported: 2017-09-30 04:10 EDT by Junqi Zhao
Modified: 2018-03-28 10:07 EDT
CC: 5 users

See Also:
Fixed In Version:
Doc Type: If docs needed, set a value
Doc Text:
Story Points: ---
Clone Of:
Cloned To: 1548567
Environment:
Last Closed: 2018-03-28 10:07:14 EDT
Type: Bug
Regression: ---


Attachments
HOSA pods info (4.54 KB, text/plain), 2017-09-30 04:10 EDT, Junqi Zhao
Undeploy HOSA ansible log (736.69 KB, text/plain), 2017-09-30 04:11 EDT, Junqi Zhao


External Trackers
Red Hat Product Errata RHBA-2018:0489, last updated 2018-03-28 10:07 EDT

Description Junqi Zhao 2017-09-30 04:10:39 EDT
Created attachment 1332610 [details]
HOSA pods info

Description of problem:
Undeploying HOSA on its own fails: the hawkular-openshift-agent pods stay in ContainerCreating status. Describing the pods shows this error:
	MountVolume.SetUp failed for volume "hawkular-openshift-agent-token-f60xz" : secrets "hawkular-openshift-agent-token-f60xz" not found

On inspection, the secret and service account were deleted, but the configmap, daemonset, and clusterrole were not:
# oc get secret | grep hawkular-openshift-agent
# oc get template  | grep agent
No resources found.
# oc get configmap | grep hawkular-openshift-agent
hawkular-openshift-agent-configuration   2         18m
# oc get daemonset | grep hawkular-openshift-agent
hawkular-openshift-agent   2         2         0         2            0           <none>          18m
# oc get clusterrole | grep hawkular-openshift-agent
hawkular-openshift-agent
# oc get sa | grep hawkular-openshift-agent
# oc get secret | grep hawkular-openshift-agent
# oc get template  | grep hawkular-openshift-agent
No resources found.
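The leftover check above can be scripted by looping over the same object kinds; a minimal sketch, assuming `oc` is installed and logged in to the cluster (the function name `check_hosa_leftovers` is just for illustration):

```shell
#!/bin/sh
# Print which HOSA-related object kinds still exist after an undeploy.
# The kinds are the ones inspected above; the match is the literal
# "hawkular-openshift-agent" name prefix used by the metrics playbooks.
check_hosa_leftovers() {
  for kind in secret template configmap daemonset clusterrole sa; do
    if oc get "$kind" 2>/dev/null | grep -q hawkular-openshift-agent; then
      echo "leftover: $kind"
    fi
  done
}
```

In the state shown above, this would report leftover configmap, daemonset, and clusterrole objects.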



Version-Release number of the following components:
# rpm -qa | grep openshift-ansible
openshift-ansible-docs-3.7.0-0.134.0.git.0.6f43fc3.el7.noarch
openshift-ansible-filter-plugins-3.7.0-0.134.0.git.0.6f43fc3.el7.noarch
openshift-ansible-3.7.0-0.134.0.git.0.6f43fc3.el7.noarch
openshift-ansible-lookup-plugins-3.7.0-0.134.0.git.0.6f43fc3.el7.noarch
openshift-ansible-roles-3.7.0-0.134.0.git.0.6f43fc3.el7.noarch
openshift-ansible-callback-plugins-3.7.0-0.134.0.git.0.6f43fc3.el7.noarch
openshift-ansible-playbooks-3.7.0-0.134.0.git.0.6f43fc3.el7.noarch

HOSA image:
metrics-hawkular-openshift-agent:v3.7.0-0.135.0.0


How reproducible:
Always

Steps to Reproduce:
1. Deploy HOSA along with Metrics
2. After all the pods get ready, undeploy HOSA only

Actual results:
Failed to undeploy HOSA

Expected results:
HOSA should be undeployed successfully.

Additional info:
#Inventory file
[OSEv3:children]
masters
etcd

[masters]
${MASTER} openshift_public_hostname=${MASTER}

[etcd]
${ETCD} openshift_public_hostname=${ETCD}


[OSEv3:vars]
ansible_ssh_user=root
ansible_ssh_private_key_file="~/libra.pem"
deployment_type=openshift-enterprise
openshift_docker_additional_registries=brew-pulp-docker01.web.prod.ext.phx2.redhat.com:8888


# Undeploy HOSA
openshift_metrics_install_hawkular_agent=false
openshift_metrics_hawkular_hostname=hawkular-metrics.apps.0930-0ar.qe.rhcloud.com
openshift_metrics_image_prefix=brew-pulp-docker01.web.prod.ext.phx2.redhat.com:8888/openshift3/
openshift_metrics_image_version=v3.7
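With the inventory above, the undeploy is triggered by re-running the metrics playbook; a sketch of the invocation, assuming the playbook path shipped by the 3.7 openshift-ansible RPM (adjust the path for a git checkout, and note it moved in later releases):

```shell
#!/bin/sh
# Re-run the metrics playbook against the given inventory; with
# openshift_metrics_install_hawkular_agent=false set in the inventory,
# this takes the HOSA undeploy path.
run_metrics_playbook() {
  ansible-playbook -i "$1" \
    /usr/share/ansible/openshift-ansible/playbooks/byo/openshift-cluster/openshift-metrics.yml
}
```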
Comment 1 Junqi Zhao 2017-09-30 04:11 EDT
Created attachment 1332611 [details]
Undeploy HOSA ansible log
Comment 4 Junqi Zhao 2018-01-25 04:33:56 EST
Tested: HOSA can now be uninstalled successfully, although the hawkular-openshift-agent configmap is kept after uninstallation.

oc get configmap | grep hawkular-openshift-agent
hawkular-openshift-agent-configuration   2         19m

Env:
metrics-hawkular-openshift-agent/images/v3.9.0-0.24.0.0
# rpm -qa | grep openshift-ansible
openshift-ansible-playbooks-3.9.0-0.23.0.git.0.d53d7ed.el7.noarch
openshift-ansible-roles-3.9.0-0.23.0.git.0.d53d7ed.el7.noarch
openshift-ansible-3.9.0-0.23.0.git.0.d53d7ed.el7.noarch
openshift-ansible-docs-3.9.0-0.23.0.git.0.d53d7ed.el7.noarch
Comment 5 Joel Rosental R. 2018-01-25 05:09:02 EST
Would the current workaround for OCP 3.7 be just to remove the ds (oc delete ds hawkular-openshift-agent)?
Comment 6 Junqi Zhao 2018-01-25 19:35:11 EST
For OCP 3.7, you can use the following commands to delete HOSA:
oc project default
oc delete configmap hawkular-openshift-agent-configuration
oc delete daemonset hawkular-openshift-agent
oc delete clusterrole hawkular-openshift-agent
oc delete sa hawkular-openshift-agent
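For scripted cleanup, those same deletions can be wrapped in a small function; a sketch assuming `oc` is logged in with sufficient privileges (the `undeploy_hosa` name is illustrative):

```shell
#!/bin/sh
# Delete the HOSA objects the 3.7 uninstall leaves behind.
# --ignore-not-found keeps the cleanup idempotent if some of the
# objects were already removed.
undeploy_hosa() {
  oc project default || return 1
  for obj in configmap/hawkular-openshift-agent-configuration \
             daemonset/hawkular-openshift-agent \
             clusterrole/hawkular-openshift-agent \
             sa/hawkular-openshift-agent; do
    oc delete "$obj" --ignore-not-found
  done
}
```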
Comment 7 Junqi Zhao 2018-01-25 19:37:14 EST
Please change it to ON_QA; it is fixed in 3.9, see Comment 4.
Comment 8 John Sanda 2018-02-23 16:42:58 EST
Because it is not completely fixed for 3.9 and because the changes cover both 3.7 and 3.9, I am going to change the target release of this to 3.9. I will also clone this BZ to create a 3.7 BZ.
Comment 9 John Sanda 2018-02-23 16:43:25 EST
PR submitted - https://github.com/openshift/openshift-ansible/pull/7276
Comment 11 Junqi Zhao 2018-02-26 22:24:13 EST
Tested with openshift-ansible-3.9.0-0.53.0, HOSA can be undeployed successfully now.

# rpm -qa | grep openshift-ansible
openshift-ansible-roles-3.9.0-0.53.0.git.0.f8f01ef.el7.noarch
openshift-ansible-playbooks-3.9.0-0.53.0.git.0.f8f01ef.el7.noarch
openshift-ansible-3.9.0-0.53.0.git.0.f8f01ef.el7.noarch
openshift-ansible-docs-3.9.0-0.53.0.git.0.f8f01ef.el7.noarch
Comment 14 errata-xmlrpc 2018-03-28 10:07:14 EDT
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory, and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHBA-2018:0489
