Description of problem:
While a recycle PV's status is changing from Released to Available,
creating a new PVC causes the PV to become stuck in Released status.
Version-Release number of selected component (if applicable):
Steps to Reproduce:
1. Create a PV with the Recycle reclaim policy.
2. Create a PVC.
3. Check that the PV and PVC are bound.
4. Delete the PVC.
5. Check that the PV is in Released status.
6. Create another PVC.
7. Check the PV and PVC status.
Actual results:
The PV stays in Released status and the PVC stays in Pending status.
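The reproduction steps above can be sketched as a shell session. The file names are taken from the templates linked below; the exact waits between steps are an assumption, since the bug depends on creating the new PVC while the PV is still Released:

```
# 1-3. Create a recycle PV and a PVC, confirm they bind
oc create -f nfs-recycle-rwo.json
oc create -f claim-rwo.json
oc get pv nfs       # STATUS: Bound
oc get pvc nfsc     # STATUS: Bound

# 4-5. Delete the PVC; the PV enters Released while the recycler runs
oc delete pvc nfsc
oc get pv nfs       # STATUS: Released

# 6-7. Create another PVC before the PV returns to Available
oc create -f claim-rwo.json
oc get pv nfs       # stays Released (bug)
oc get pvc nfsc     # stays Pending (bug)
```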
$ oc get pv nfs --config=admin.kubeconfig ; echo ; oc get pvc nfsc
NAME   CAPACITY   ACCESSMODES   STATUS     CLAIM        REASON   AGE
nfs    5Gi        RWO           Released   lxiap/nfsc            2h

NAME   STATUS    VOLUME   CAPACITY   ACCESSMODES   AGE
nfsc   Pending                                     13m
Expected results:
The PV should become Available, then bind to the new PVC.
PV file can be found at https://github.com/openshift-qe/v3-testfiles/blob/master/persistent-volumes/nfs/nfs-recycle-rwo.json
PVC file can be found at https://github.com/openshift-qe/v3-testfiles/blob/master/persistent-volumes/nfs/claim-rwo.json
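For reference, the essential part of a recycle PV is spec.persistentVolumeReclaimPolicy set to "Recycle". A minimal sketch mirroring the linked template (the NFS server and path here are placeholders, not the actual file contents; see the URL above for those):

```
{
  "apiVersion": "v1",
  "kind": "PersistentVolume",
  "metadata": { "name": "nfs" },
  "spec": {
    "capacity": { "storage": "5Gi" },
    "accessModes": [ "ReadWriteOnce" ],
    "persistentVolumeReclaimPolicy": "Recycle",
    "nfs": { "server": "nfs.example.com", "path": "/exports" }
  }
}
```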
$ oc get pv nfs -o yaml
$ oc get pvc -o yaml
Logs in /var/log/messages
Feb 22 13:31:47 openshift-138 atomic-openshift-master-controllers: I0222 00:31:47.801870 1 persistentvolume_claim_binder_controller.go:338] Synchronizing PersistentVolumeClaim[nfsc]
Feb 22 13:31:47 openshift-138 atomic-openshift-master-controllers: I0222 00:31:47.835563 1 persistentvolume_recycler_controller.go:158] Recycler: checking PersistentVolume[nfs]
Feb 22 13:31:47 openshift-138 atomic-openshift-master-controllers: I0222 00:31:47.843327 1 persistentvolume_recycler_controller.go:189] PersistentVolume[nfs] phase Available - skipping: The volume is not in 'Released' phase
Feb 22 13:31:47 openshift-138 atomic-openshift-master-controllers: I0222 00:31:47.862330 1 persistentvolume_claim_binder_controller.go:412] PersistentVolumeClaim[nfsc] is bound
Feb 22 13:31:47 openshift-138 atomic-openshift-master-controllers: I0222 00:31:47.862376 1 persistentvolume_claim_binder_controller.go:338] Synchronizing PersistentVolumeClaim[nfsc]
Feb 22 13:31:47 openshift-138 atomic-openshift-master-controllers: E0222 00:31:47.867998 1 persistentvolume_claim_binder_controller.go:162] PVClaimBinder could not update claim nfsc: Unexpected error saving claim status: persistentvolumeclaims "nfsc" cannot be updated: the object has been modified; please apply your changes to the latest version and try again
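The "object has been modified" error in the last log line is Kubernetes' optimistic concurrency control: every object carries a resourceVersion, and a write based on a stale version is rejected so the caller can re-read the latest object and reapply its change. A minimal sketch of that pattern (the Store class, update_with_retry helper, and all values are illustrative, not the controller's actual code):

```python
class Conflict(Exception):
    pass

class Store:
    """Toy object store that rejects stale writes, mimicking the
    API server's resourceVersion check (illustrative only)."""
    def __init__(self, obj):
        self.obj = dict(obj)

    def get(self):
        return dict(self.obj)

    def update(self, new_obj):
        # Reject the write if it was based on an outdated read.
        if new_obj["resourceVersion"] != self.obj["resourceVersion"]:
            raise Conflict("the object has been modified; please apply "
                           "your changes to the latest version and try again")
        self.obj = {**new_obj,
                    "resourceVersion": self.obj["resourceVersion"] + 1}

def update_with_retry(store, mutate, attempts=3):
    """On conflict, re-read the latest object and reapply the change."""
    for _ in range(attempts):
        obj = store.get()
        mutate(obj)
        try:
            store.update(obj)
            return store.get()
        except Conflict:
            continue
    raise Conflict("gave up after %d attempts" % attempts)

store = Store({"resourceVersion": 1, "status": "Pending"})
snapshot = store.get()                              # read at version 1
store.update({**store.get(), "status": "Pending"})  # concurrent write; now 2
try:
    store.update({**snapshot, "status": "Bound"})   # stale write is rejected
except Conflict:
    print("conflict on stale write")
result = update_with_retry(store, lambda o: o.__setitem__("status", "Bound"))
print(result["status"], result["resourceVersion"])  # Bound 3
```

The claim binder hits this when the recycler updates the same PVC between the binder's read and write; a retry against the fresh object is the standard remedy.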
Recreated but no resolution found yet.
After updating to Origin HEAD and starting a fresh environment, I did not have any problems with the recycler using the WordPress example on GitHub.
I will test the specific version in this BZ.
I verified that recycling works in Origin HEAD.
There was another issue where newly created claims bound to existing volumes that should still have been released. I believe this BZ is a symptom of that problem.
Origin HEAD has the fix:
v3.1.1.x does not:
The latest rebase pulled in the fix from upstream.
Per Mark, this will be fixed by the latest rebase.
Tested on latest build with same PV and PVC templates:
The issue is still reproducible.
# oc get pv
NAME   LABELS   CAPACITY   ACCESSMODES   STATUS     CLAIM       REASON   AGE
nfs    <none>   5Gi        RWO           Released   jhou/nfsc            50m
# oc get pvc
NAME   LABELS   STATUS    VOLUME   CAPACITY   ACCESSMODES   AGE
nfsc   <none>   Pending                                     50m
Reproduced; working on a fix.
Fix posted upstream here: https://github.com/kubernetes/kubernetes/pull/23078
Will open an Origin PR.
Origin PR: https://github.com/openshift/origin/pull/8100
2nd upstream PR: https://github.com/kubernetes/kubernetes/pull/23548. Will create new origin PR once this merges.
*** Bug 1323613 has been marked as a duplicate of this bug. ***
We are blocked by bug https://bugzilla.redhat.com/show_bug.cgi?id=1324418, since the PV failed to recycle.
Once that bug is fixed and verified, we will re-check this one.
Following the same steps as in comment 0, the PV and PVC eventually bound. Moving bug to VERIFIED.
Verified on version,
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.
For information on the advisory, and where to find the updated
files, follow the link below.
If the solution does not work for you, open a new bug report.
*** Bug 1323596 has been marked as a duplicate of this bug. ***