Description of problem:
`oc status` should not report that a dc doesn't exist when its HPA was created from the web console.

Version-Release number of selected component (if applicable):
OpenShift Master: v3.9.0-0.24.0
Kubernetes Master: v1.9.1+a0ce1bc657
oc v3.9.0-0.31.0

How reproducible:
Always

Steps to Reproduce:
1. In project1:
   $ oc new-app openshift/hello-openshift --name myapp
   $ oc autoscale dc myapp --max=4
   $ oc status
2. In project2:
   $ oc new-app openshift/hello-openshift --name myapp
   Log in to the web console and add an HPA for dc myapp.
   $ oc status
3. In project1 & project2:
   $ oc get hpa myapp -o yaml

Actual results:
1. $ oc status
   No error info.
2. $ oc status
   Errors:
   * hpa/myapp is attempting to scale DeploymentConfig/myapp, which doesn't exist
   ...
3. HPA yaml in project1:
     scaleTargetRef:
       apiVersion: v1
       kind: DeploymentConfig
       name: hello-openshift
   HPA yaml in project2:
     scaleTargetRef:
       apiVersion: extensions/v1beta1
       kind: DeploymentConfig
       name: hello-openshift

Expected results:
`oc status` should work without the "doesn't exist" error, the same as for the CLI-created HPA.

Additional info:
CLI side fixed in: https://bugzilla.redhat.com/show_bug.cgi?id=1534956#c2
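For illustration only (not part of the original report): a minimal Python sketch of the mismatch described above. `oc status` could resolve a DeploymentConfig under the apiVersion the CLI wrote, but not under the one the web console wrote, so the target lookup failed. The dict and function names here are hypothetical, and the set of resolvable apiVersions is an assumption for this sketch.

```python
# Assumption for this sketch: apiVersions under which the status check
# could find a DeploymentConfig at the time.
RESOLVABLE = {"DeploymentConfig": {"v1", "apps.openshift.io/v1"}}

def target_resolves(scale_target_ref: dict) -> bool:
    """Return True if the scaleTargetRef points at an apiVersion/kind
    that the status check can resolve."""
    kind = scale_target_ref["kind"]
    return scale_target_ref["apiVersion"] in RESOLVABLE.get(kind, set())

# scaleTargetRef written by `oc autoscale` (project1) -- resolves fine.
cli_ref = {"apiVersion": "v1", "kind": "DeploymentConfig", "name": "myapp"}
# scaleTargetRef written by the web console (project2) -- triggers the
# "doesn't exist" error from `oc status`.
console_ref = {"apiVersion": "extensions/v1beta1", "kind": "DeploymentConfig", "name": "myapp"}

print(target_resolves(cli_ref))      # → True
print(target_resolves(console_ref))  # → False
```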
https://github.com/openshift/origin-web-console/pull/2748
The PR from comment #1 is replaced by https://github.com/openshift/origin-web-console/pull/2776
Commits pushed to master at https://github.com/openshift/origin-web-console

https://github.com/openshift/origin-web-console/commit/632b0157eec80eb771ee6bdbe60dc94614f9abb2
Bug 1540916: Set correct group in HPA scale target
Fixes https://bugzilla.redhat.com/show_bug.cgi?id=1540916

https://github.com/openshift/origin-web-console/commit/170e153636c6e122d4d45f495cc904dded5dc9c2
Merge pull request #2776 from spadgett/hpa-scale-target
Automatic merge from submit-queue.
Bug 1540916: Set correct group and version in HPA scale target
Fixes https://bugzilla.redhat.com/show_bug.cgi?id=1540916
I had to move the function inside the data service callback, so the diff is larger than it looks.
This changes the HPA scale target to use the correct group/version for the resource being scaled.
/assign @jwforres
/cc @DirectXMan12
Now an HPA created via the web console uses the correct api group and version (autoscaling/v1), and there are no errors in 'oc status'.
Moving to VERIFIED.
Checked on OpenShift Web Console: v3.9.0-0.47.0
Verification steps:
1. Create a ReplicaSet, Deployment, DeploymentConfig, and ReplicationController:
$ oc create -f https://raw.githubusercontent.com/openshift-qe/v3-testfiles/master/deployment/deployment1.json
$ oc create -f https://raw.githubusercontent.com/openshift-qe/v3-testfiles/master/deployment/hello-deployment-1.yaml
$ oc create -f https://raw.githubusercontent.com/openshift-qe/v3-testfiles/master/replicaSet/tc536601/replicaset.yaml
$ oc run myrunrc --image=aosqe/hello-openshift --generator=run-controller/v1
2. Log in to the web console and create an HPA for each resource above.
3. Check that the correct apiVersion was set in scaleTargetRef:
$ for i in `oc get hpa --no-headers=true | awk -F ' ' '{print $1}'`; do oc get hpa $i -o yaml | grep scaleTargetRef -A 3; done
  scaleTargetRef:
    apiVersion: extensions/v1beta1
    kind: ReplicaSet
    name: frontend
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: hello-openshift
  scaleTargetRef:
    apiVersion: v1
    kind: DeploymentConfig
    name: hooks
  scaleTargetRef:
    apiVersion: v1
    kind: ReplicationController
    name: myrunrc
4. Delete all the HPAs created from the web console, then create HPAs via the CLI:
$ oc autoscale ReplicaSet frontend --max=11
replicaset "frontend" autoscaled
$ oc autoscale Deployment hello-openshift --max=12
deployment "hello-openshift" autoscaled
$ oc autoscale DeploymentConfig hooks --max=13
deploymentconfig "hooks" autoscaled
$ oc autoscale ReplicationController myrunrc --max=14
replicationcontroller "myrunrc" autoscaled
$ for i in `oc get hpa --no-headers=true | awk -F ' ' '{print $1}'`; do oc get hpa $i -o yaml | grep scaleTargetRef -A 3; done
  scaleTargetRef:
    apiVersion: extensions/v1beta1
    kind: ReplicaSet
    name: frontend
  scaleTargetRef:
    apiVersion: extensions/v1beta1
    kind: Deployment
    name: hello-openshift
  scaleTargetRef:
    apiVersion: apps.openshift.io/v1
    kind: DeploymentConfig
    name: hooks
  scaleTargetRef:
    apiVersion: v1
    kind: ReplicationController
    name: myrunrc

The apiVersion is inconsistent between the CLI and the web console. Sorry for missing this the last time I verified the bug.
The new apigroup is apps/v1 for Deployment & ReplicaSet, and apps.openshift.io/v1 for DeploymentConfig & ReplicationController.
Sorry, for ReplicationController it should be v1.
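Putting the two comments above together, the canonical group/version per kind can be summarized as a small lookup (a sketch only; the dict name is mine, the values are taken from the comments above with the ReplicationController correction applied):

```python
# Canonical API group/version per kind, per the two comments above
# (ReplicationController corrected to the core "v1" group).
CANONICAL_API = {
    "Deployment": "apps/v1",
    "ReplicaSet": "apps/v1",
    "DeploymentConfig": "apps.openshift.io/v1",
    "ReplicationController": "v1",
}

for kind, api in sorted(CANONICAL_API.items()):
    print(f"{kind}: {api}")
```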
This won't cause a problem. The CLI sets an older API version for deployments in the HPA scaleRef for n-1 compatibility, but both values should work. The value the web console sets for deployments is not wrong. Ideally the console should be using `apps.openshift.io/v1` for deployment configs, but `v1` won't cause a problem. Since the original problem is fixed, I suggest opening a separate, low severity bug to track that change.
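To illustrate the point above that both values work: more than one apiVersion is accepted per kind in scaleTargetRef, so the CLI's older value (kept for n-1 compatibility) and the web console's value both resolve. A minimal sketch; the acceptance sets and names below are assumptions for illustration, not the actual scale-client logic.

```python
# Assumed acceptance sets for this sketch: apiVersions that resolve in
# scaleTargetRef for each kind (older value kept for n-1 compatibility).
ACCEPTED = {
    "Deployment": {"extensions/v1beta1", "apps/v1"},
    "DeploymentConfig": {"v1", "apps.openshift.io/v1"},
}

def compatible(kind: str, api_version: str) -> bool:
    """Return True if this kind/apiVersion pair would resolve."""
    return api_version in ACCEPTED.get(kind, set())

# CLI writes extensions/v1beta1 for Deployment, the console writes
# apps/v1 -- both are accepted, so neither causes an error.
print(compatible("Deployment", "extensions/v1beta1"))  # → True
print(compatible("Deployment", "apps/v1"))             # → True
# Likewise v1 works for DeploymentConfig even though
# apps.openshift.io/v1 would be the ideal value.
print(compatible("DeploymentConfig", "v1"))            # → True
```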
Thanks for the clarification. Moving to VERIFIED per the verification above.