Bug 1540916 - Should not show "dc doesn't exist" error when its HPA is created from the web console
Summary: Should not show "dc doesn't exist" error when its HPA is created from the web console
Keywords:
Status: CLOSED CURRENTRELEASE
Alias: None
Product: OpenShift Container Platform
Classification: Red Hat
Component: Management Console
Version: 3.9.0
Hardware: Unspecified
OS: Unspecified
Priority: medium
Severity: medium
Target Milestone: ---
Target Release: 3.9.0
Assignee: Samuel Padgett
QA Contact: Yadan Pei
URL:
Whiteboard:
Depends On:
Blocks: 1543043
 
Reported: 2018-02-01 10:24 UTC by shahan
Modified: 2018-03-27 09:46 UTC
CC: 6 users

Fixed In Version:
Doc Type: No Doc Update
Doc Text:
Clone Of:
Clones: 1543043
Environment:
Last Closed: 2018-03-08 14:24:04 UTC
Target Upstream Version:
Embargoed:




Links:
  Red Hat Product Errata RHBA-2018:0489 (normal, SHIPPED_LIVE)
  Summary: Red Hat OpenShift Container Platform 3.9 RPM Release Advisory
  Last Updated: 2018-03-28 18:06:38 UTC

Description shahan 2018-02-01 10:24:15 UTC
Description of problem:
 Should not show "dc doesn't exist" error when its HPA is created from the web console

Version-Release number of selected component (if applicable):
 OpenShift Master: v3.9.0-0.24.0
 Kubernetes Master: v1.9.1+a0ce1bc657

 oc v3.9.0-0.31.0

How reproducible:
Always

Steps to Reproduce:
1.
for project1:
$ oc new-app openshift/hello-openshift --name myapp
$ oc autoscale dc myapp --max=4
$ oc status
2.
for project2:
$ oc new-app openshift/hello-openshift --name myapp
Log in to the web console and add an HPA for dc/myapp
$ oc status

3. 
for project1 & project2:
 $ oc get hpa myapp -o yaml   # full YAML; a jsonpath shortcut for just the scale target is sketched below
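
If only the scale target is of interest, a jsonpath query should return it directly instead of the full YAML (a sketch relying on the standard -o jsonpath output option of oc get):

 $ oc get hpa myapp -o jsonpath='{.spec.scaleTargetRef}{"\n"}'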

Actual results:
1. $ oc status
no errors reported
 
2. $ oc status

Errors:
  * hpa/myapp is attempting to scale DeploymentConfig/myapp, which doesn't exist
...

3.
hpa yaml file in project1:

  scaleTargetRef:
    apiVersion: v1
    kind: DeploymentConfig
    name: hello-openshift

hpa yaml file in project2 (note the different scaleTargetRef apiVersion; the extensions group has no DeploymentConfig kind, which is presumably why the lookup fails):

  scaleTargetRef:
    apiVersion: extensions/v1beta1
    kind: DeploymentConfig
    name: hello-openshift
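
For comparison, after the web console fix the console-created HPA carries the same scale target as the CLI-created one (per the verification in comment 6; a sketch reusing the name from the YAML above):

  scaleTargetRef:
    apiVersion: v1
    kind: DeploymentConfig
    name: hello-openshift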

Expected results:
 HPAs created from the web console should work without the "doesn't exist" error, the same as HPAs created via the CLI.

Additional info:
 CLI fixed in: https://bugzilla.redhat.com/show_bug.cgi?id=1534956#c2

Comment 2 Samuel Padgett 2018-02-07 15:41:23 UTC
The PR from comment #1 is replaced by

https://github.com/openshift/origin-web-console/pull/2776

Comment 3 openshift-github-bot 2018-02-07 21:44:03 UTC
Commits pushed to master at https://github.com/openshift/origin-web-console

https://github.com/openshift/origin-web-console/commit/632b0157eec80eb771ee6bdbe60dc94614f9abb2
Bug 1540916: Set correct group in HPA scale target

Fixes https://bugzilla.redhat.com/show_bug.cgi?id=1540916

https://github.com/openshift/origin-web-console/commit/170e153636c6e122d4d45f495cc904dded5dc9c2
Merge pull request #2776 from spadgett/hpa-scale-target

Automatic merge from submit-queue.

Bug 1540916: Set correct group and version in HPA scale target

Fixes https://bugzilla.redhat.com/show_bug.cgi?id=1540916

I had to move the function inside the data service callback, so the diff looks larger than the actual change. This changes the HPA scale target to use the correct group/version for the resource being scaled.

/assign @jwforres 
/cc @DirectXMan12
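
A minimal command-line check of the fix, assuming an HPA named myapp created from the web console as in the original steps:

$ oc get hpa myapp -o yaml | grep scaleTargetRef -A 3
$ oc status    # should no longer report that the scale target doesn't exist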

Comment 5 Yadan Pei 2018-02-22 08:25:36 UTC
Now the HPA created via the web console uses the correct API group and version (autoscaling/v1), without errors in 'oc status'.

Move to VERIFIED

This was checked on OpenShift Web Console v3.9.0-0.47.0.

Comment 6 Yadan Pei 2018-03-07 09:18:43 UTC
Verification steps:

1. Create ReplicaSet, Deployment, DeploymentConfig, ReplicationController

$ oc create -f https://raw.githubusercontent.com/openshift-qe/v3-testfiles/master/deployment/deployment1.json
$ oc create -f https://raw.githubusercontent.com/openshift-qe/v3-testfiles/master/deployment/hello-deployment-1.yaml
$ oc create -f https://raw.githubusercontent.com/openshift-qe/v3-testfiles/master/replicaSet/tc536601/replicaset.yaml
$ oc run myrunrc --image=aosqe/hello-openshift --generator=run-controller/v1

2. Log in to the web console and create an HPA for each of the resources above

3. Check whether the correct apiVersion was set in scaleTargetRef
$ for i in `oc get hpa --no-headers=true | awk -F ' ' '{print $1}' `; do oc get hpa $i -o yaml | grep scaleTargetRef -A 3; done
  scaleTargetRef:
    apiVersion: extensions/v1beta1
    kind: ReplicaSet
    name: frontend
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: hello-openshift
  scaleTargetRef:
    apiVersion: v1
    kind: DeploymentConfig
    name: hooks
  scaleTargetRef:
    apiVersion: v1
    kind: ReplicationController
    name: myrunrc

4. Delete all HPAs created from the web console, and create HPAs via the CLI
[yapei@dhcp-140-3 test-scripts]$ oc autoscale ReplicaSet frontend --max=11
replicaset "frontend" autoscaled
[yapei@dhcp-140-3 test-scripts]$ oc autoscale Deployment hello-openshift --max=12
deployment "hello-openshift" autoscaled
[yapei@dhcp-140-3 test-scripts]$ oc autoscale DeploymentConfig hooks --max=13
deploymentconfig "hooks" autoscaled
[yapei@dhcp-140-3 test-scripts]$ oc autoscale ReplicationController myrunrc --max=14
replicationcontroller "myrunrc" autoscaled
[yapei@dhcp-140-3 test-scripts]$ for i in `oc get hpa --no-headers=true | awk -F ' ' '{print $1}' `; do oc get hpa $i -o yaml | grep scaleTargetRef -A 3; done
  scaleTargetRef:
    apiVersion: extensions/v1beta1
    kind: ReplicaSet
    name: frontend
  scaleTargetRef:
    apiVersion: extensions/v1beta1
    kind: Deployment
    name: hello-openshift
  scaleTargetRef:
    apiVersion: apps.openshift.io/v1
    kind: DeploymentConfig
    name: hooks
  scaleTargetRef:
    apiVersion: v1
    kind: ReplicationController
    name: myrunrc

Inconsistent apiVersion between the CLI and the web console. I'm sorry I forgot to check these the last time I verified the bug.
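
A tighter way to compare just the kind and target apiVersion across all HPAs (a sketch; same idea as the grep loops above, using the standard jsonpath support in oc get):

$ oc get hpa -o jsonpath='{range .items[*]}{.spec.scaleTargetRef.kind}{" "}{.spec.scaleTargetRef.apiVersion}{"\n"}{end}'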

Comment 7 Yadan Pei 2018-03-07 09:23:07 UTC
The new API group/version for Deployment & ReplicaSet is apps/v1.

For DeploymentConfig & ReplicationController it is apps.openshift.io/v1.

Comment 8 Yadan Pei 2018-03-07 09:23:47 UTC
Sorry, for ReplicationController it should be v1.

Comment 9 Samuel Padgett 2018-03-07 16:00:04 UTC
This won't cause a problem. The CLI sets an older API version for deployments in the HPA scaleTargetRef for n-1 compatibility, but both values should work. The value the web console sets for deployments is not wrong.

Ideally the console should be using `apps.openshift.io/v1` for deployment configs, but `v1` won't cause a problem. Since the original problem is fixed, I suggest opening a separate, low severity bug to track that change.
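
For reference, the ideal scale target for a deployment config that this comment describes would read as follows (a sketch of the suggested future state, reusing the hooks DC from comment 6):

  scaleTargetRef:
    apiVersion: apps.openshift.io/v1
    kind: DeploymentConfig
    name: hooks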

Comment 10 Xingxing Xia 2018-03-08 06:55:30 UTC
Thanks for the clarification. Moving to VERIFIED per the verification above.

