Description of problem:
In CNS, multiple orphaned volumes were found: heketi volumes (besides heketidbstorage) that had no corresponding OpenShift PVs.
Attempting to delete one of these heketi volumes returned an unexpected error.
Version-Release number of selected component (if applicable):

How reproducible:
100% for this specific volume
Steps to Reproduce:
1. oc rsh <<heketi-pod>>
2. heketi-cli volume delete <<volume-id>>
Actual results:
Error: Cannot delete thin pool <<thin_pool_name>> on <<openshift_node_name>> because it is used by [-1] snapshot(s) or cloned volume(s)

Expected results:
Successful volume deletion
Will update the bug if this applies to multiple volumes.
This occurs for multiple volumes in this cluster.
Let's first determine what is in the thin pool in question. On the host where heketi created the brick that is failing to delete, run 'lvs', and get the ID of the brick (it should be in the heketi logs).
In addition, as a proactive measure, please get the heketi db dump (heketi-cli db dump) and the topology info (heketi-cli topology info) and attach them to the bug.
Item 1: on OpenShift node: lvs
Item 2: oc logs <<heketi_pod_name>>
Item 3: oc rsh <<heketi_pod_name>>
# heketi-cli db dump
Item 4: oc rsh <<heketi_pod_name>>
# heketi-cli topology info
The items above are being added as private attachments.
Clarification for comment 5:
Item 1: On OpenShift node: lvs > /tmp/lvs
Item 2: oc logs <<heketi_pod_name>> > /tmp/heketi.log
Item 3: oc rsh <<heketi_pod_name>> heketi-cli db dump > /tmp/heketi_db_dump
Item 4: oc rsh <<heketi_pod_name>> heketi-cli topology info > /tmp/heketi_topology_info
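The four collection steps above can be gathered into one reviewable script. This is only a sketch: <heketi_pod_name> is a placeholder, Item 1 (lvs) must run on the OpenShift node itself, and Items 2-4 run from a host with a configured `oc`. The commands are first collected into a variable so they can be inspected before being executed.

```shell
# Sketch of Items 1-4; <heketi_pod_name> is a placeholder, not a
# real pod name from this bug.
POD='<heketi_pod_name>'

# Collect the commands for review; run each one manually afterwards
# (Item 1 on the node, Items 2-4 wherever 'oc' is configured).
CMDS="lvs --all -o lv_all > /tmp/lvs
oc logs $POD > /tmp/heketi.log
oc rsh $POD heketi-cli db dump > /tmp/heketi_db_dump
oc rsh $POD heketi-cli topology info > /tmp/heketi_topology_info"

printf '%s\n' "$CMDS"
```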
Updated Item 1: lvs --all -o lv_all >/tmp/lvs
Does this provide you with additional useful information beyond a simple "lvs"?
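Once the lvs output is collected, a filter like the following can show which LVs live inside the failing thin pool. This is a sketch with made-up brick and pool names (heketi typically names bricks brick_<id> and thin pools tp_<id>, but the names below are hypothetical); on a real node you would pipe lvs itself through the same awk filter.

```shell
# Hypothetical sample of 'lvs --noheadings -o lv_name,pool_lv' output;
# the thin pool itself has an empty pool_lv column.
lvs_sample='brick_abc123 tp_abc123
tp_abc123
brick_def456 tp_def456'

# LVs that belong to thin pool tp_abc123 (the pool from the error,
# assumed name):
members=$(printf '%s\n' "$lvs_sample" | awk '$2 == "tp_abc123" {print $1}')
echo "$members"
```

If the pool shows only its own brick (or nothing unexpected), the "[-1] snapshot(s)" count would point at heketi's bookkeeping rather than actual LVM state.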
*** Bug 1603098 has been marked as a duplicate of this bug. ***
Updated doc text in the Doc Text field. Please review for technical accuracy.
I have updated the text to failed.
(In reply to Anjana from comment #28)
> I have updated the text to failed.
Looks OK to me.
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.
For information on the advisory, and where to find the updated
files, follow the link below.
If the solution does not work for you, open a new bug report.