Bug 1989724
| Summary: | Descheduler operator should expose options for pods with PVCs and Local Storage | | |
|---|---|---|---|
| Product: | OpenShift Container Platform | Reporter: | Mike Dame <mdame> |
| Component: | kube-scheduler | Assignee: | Mike Dame <mdame> |
| Status: | CLOSED ERRATA | QA Contact: | RamaKasturi <knarra> |
| Severity: | medium | Docs Contact: | |
| Priority: | medium | | |
| Version: | 4.8 | CC: | aos-bugs, jchaloup, knarra, mfojtik |
| Target Milestone: | --- | | |
| Target Release: | 4.9.0 | | |
| Hardware: | Unspecified | | |
| OS: | Unspecified | | |
| Whiteboard: | | | |
| Fixed In Version: | | Doc Type: | Enhancement |
| Doc Text: | The Descheduler Operator now provides two new profiles, EvictPodsWithPVCs and EvictPodsWithLocalStorage, which allow evicting pods that have static storage configured. | Story Points: | --- |
| Clone Of: | | Environment: | |
| Last Closed: | 2021-10-18 17:44:27 UTC | Type: | --- |
| Regression: | --- | Mount Type: | --- |
| Documentation: | --- | CRM: | |
| Verified Versions: | | Category: | --- |
| oVirt Team: | --- | RHEL 7.3 requirements from Atomic Host: | |
| Cloudforms Team: | --- | Target Upstream Version: | |
| Embargoed: | | | |
Description Mike Dame 2021-08-03 19:40:43 UTC
Tested the bug as described below using the build shown, and I see that the pods with local storage get evicted while the pods with PVCs do not.
[knarra@knarra cucushift]$ oc get csv -n openshift-kube-descheduler-operator
NAME                                                DISPLAY                     VERSION              REPLACES   PHASE
clusterkubedescheduleroperator.4.9.0-202108210926   Kube Descheduler Operator   4.9.0-202108210926              Succeeded
[knarra@knarra cucushift]$ oc get clusterversion
NAME      VERSION                             AVAILABLE   PROGRESSING   SINCE   STATUS
version   4.9.0-0.nightly-2021-08-22-070405   True        False         8h      Cluster version is 4.9.0-0.nightly-2021-08-22-070405
Test steps:
============
1) Install a 4.9 cluster
2) Install the descheduler operator
3) Enable the TopologyAndDuplicates, EvictPodsWithLocalStorage, and DoNotEvictPodsWithPVC profiles (a minimal KubeDescheduler CR sketch follows this step)
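The original report does not show the KubeDescheduler custom resource used to enable these profiles. The following is a minimal sketch, assuming the operator's default resource name (cluster) and namespace (openshift-kube-descheduler-operator); the deschedulingIntervalSeconds value is an illustrative assumption, not taken from the report:
apiVersion: operator.openshift.io/v1
kind: KubeDescheduler
metadata:
  name: cluster                         # default resource name (assumed)
  namespace: openshift-kube-descheduler-operator
spec:
  deschedulingIntervalSeconds: 3600     # illustrative interval, not from the report
  profiles:                             # profile names as listed in this test step
  - TopologyAndDuplicates
  - EvictPodsWithLocalStorage
  - DoNotEvictPodsWithPVC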
4) Create a new project named test and run the command below
oc adm policy add-scc-to-user privileged -z default -n <project_name>
5) oc debug node/<worker>, create a directory /mnt/data1, and create a file called index.html with some contents inside the /mnt/data1 directory
6) Create pods with local storage using the yaml below
[knarra@knarra cucushift]$ cat /tmp/rc1.yaml
apiVersion: v1
kind: ReplicationController
metadata:
  name: knarra
spec:
  replicas: 2
  selector:
    app: sise
  template:
    metadata:
      name: somename
      labels:
        app: sise
    spec:
      containers:
      - name: sise
        image: quay.io/openshifttest/hello-openshift@sha256:aaea76ff622d2f8bcb32e538e7b3cd0ef6d291953f3e7c9f556c1ba5baf47e2e
        ports:
        - containerPort: 9876
        volumeMounts:
        - mountPath: /tmp
          name: task-pv-storage
      volumes:
      - name: task-pv-storage
        hostPath:
          path: /mnt/data1
          type: Directory
7) oc debug node/<worker_node>, create a /mnt/data directory, then create a file inside it and add some contents (a sketch of the PV and PVC this step prepares for follows below)
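The steps never show the PersistentVolume and PersistentVolumeClaim being created, even though the ReplicationController in the next step references claimName: task-pv-claim. Below is a minimal sketch of what they might look like, assuming a hostPath PV backed by the /mnt/data directory created above; the PV name task-pv-volume and the manual storage class are illustrative assumptions (only task-pv-claim appears in the original report):
apiVersion: v1
kind: PersistentVolume
metadata:
  name: task-pv-volume        # hypothetical name, not shown in the original steps
spec:
  storageClassName: manual    # illustrative storage class
  capacity:
    storage: 1Gi
  accessModes:
  - ReadWriteOnce
  hostPath:
    path: /mnt/data           # directory created on the worker node in the previous step
---
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: task-pv-claim         # name referenced by the ReplicationController below
spec:
  storageClassName: manual
  accessModes:
  - ReadWriteOnce
  resources:
    requests:
      storage: 1Gi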
8) Now create pods with a PVC using the yaml file below.
[knarra@knarra cucushift]$ cat /tmp/rc.yaml
apiVersion: v1
kind: ReplicationController
metadata:
  name: rcex
spec:
  replicas: 6
  selector:
    app: sise
  template:
    metadata:
      name: somename
      labels:
        app: sise
    spec:
      containers:
      - name: sise
        image: quay.io/openshifttest/hello-openshift@sha256:aaea76ff622d2f8bcb32e538e7b3cd0ef6d291953f3e7c9f556c1ba5baf47e2e
        ports:
        - containerPort: 9876
        volumeMounts:
        - mountPath: /data
          name: hostpath-privileged
      volumes:
      - name: hostpath-privileged
        persistentVolumeClaim:
          claimName: task-pv-claim
9) Run the descheduler and verify that only the pods with local storage are evicted and that the pods with PVCs are not evicted.
I0823 15:24:15.325623 1 evictions.go:130] "Evicted pod" pod="knarra1/knarra-dtmfh" reason="RemoveDuplicatePods"
I0823 15:24:15.336810 1 evictions.go:130] "Evicted pod" pod="wduan/knarra-fx227" reason="RemoveDuplicatePods"
Based on the above, moving the bug to VERIFIED state.
Since the problem described in this bug report should be resolved in a recent advisory, it has been closed with a resolution of ERRATA. For information on the advisory (Moderate: OpenShift Container Platform 4.9.0 bug fix and security update), and where to find the updated files, follow the link below. If the solution does not work for you, open a new bug report.
https://access.redhat.com/errata/RHSA-2021:3759