Bug 1540916 - should not show dc doesn't exist when its hpa created by web console
Status: CLOSED CURRENTRELEASE
Product: OpenShift Container Platform
Classification: Red Hat
Component: Management Console
Version: 3.9.0
Hardware: Unspecified
OS: Unspecified
Priority: medium
Severity: medium
Target Milestone: ---
Target Release: 3.9.0
Assigned To: Samuel Padgett
QA Contact: Yadan Pei
Keywords: Reopened
Depends On:
Blocks: 1543043
Reported: 2018-02-01 05:24 EST by shahan
Modified: 2018-03-27 05:46 EDT
CC: 6 users

See Also:
Fixed In Version:
Doc Type: No Doc Update
Doc Text:
Story Points: ---
Clone Of:
Clones: 1543043
Environment:
Last Closed: 2018-03-08 09:24:04 EST
Type: Bug
Regression: ---


External Trackers
Tracker: Red Hat Product Errata
Tracker ID: RHBA-2018:0489
Priority: normal
Status: SHIPPED_LIVE
Summary: Red Hat OpenShift Container Platform 3.9 RPM Release Advisory
Last Updated: 2018-03-28 14:06:38 EDT

Description shahan 2018-02-01 05:24:15 EST
Description of problem:
 'oc status' should not report that the dc doesn't exist when its HPA was created by the web console

Version-Release number of selected component (if applicable):
 OpenShift Master: v3.9.0-0.24.0
 Kubernetes Master: v1.9.1+a0ce1bc657

 oc v3.9.0-0.31.0

How reproducible:
Always

Steps to Reproduce:
1.
for project1:
$ oc new-app openshift/hello-openshift --name myapp
$ oc autoscale dc myapp --max=4
$ oc status
2.
for project2:
$ oc new-app openshift/hello-openshift --name myapp
log in to the web console and add an HPA for dc myapp
$ oc status

3. 
for project1 & project2:
 $ oc get hpa myapp -o yaml

Actual results:
1.$ oc status
no error info
 
2.$ oc status

Errors:
  * hpa/myapp is attempting to scale DeploymentConfig/myapp, which doesn't exist
...

3.
hpa yaml file in project1:

  scaleTargetRef:
    apiVersion: v1
    kind: DeploymentConfig
    name: hello-openshift

hpa yaml file in project2:

  scaleTargetRef:
    apiVersion: extensions/v1beta1
    kind: DeploymentConfig
    name: hello-openshift

Expected results:
 'oc status' should not report the "doesn't exist" error, matching the behavior of the CLI after its fix.

Additional info:
 cli fixed in: https://bugzilla.redhat.com/show_bug.cgi?id=1534956#c2
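
For reference, a minimal sketch of an HPA whose scaleTargetRef resolves to the DeploymentConfig, mirroring the working project1 output above (the names and max replicas follow the reproduction steps; minReplicas defaulting to 1 is an assumption):

  apiVersion: autoscaling/v1
  kind: HorizontalPodAutoscaler
  metadata:
    name: myapp
  spec:
    scaleTargetRef:
      apiVersion: v1          # the form the fixed CLI writes; extensions/v1beta1, as written by the console, does not resolve
      kind: DeploymentConfig
      name: myapp
    minReplicas: 1
    maxReplicas: 4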
Comment 2 Samuel Padgett 2018-02-07 10:41:23 EST
The PR from comment #1 is replaced by

https://github.com/openshift/origin-web-console/pull/2776
Comment 3 openshift-github-bot 2018-02-07 16:44:03 EST
Commits pushed to master at https://github.com/openshift/origin-web-console

https://github.com/openshift/origin-web-console/commit/632b0157eec80eb771ee6bdbe60dc94614f9abb2
Bug 1540916: Set correct group in HPA scale target

Fixes https://bugzilla.redhat.com/show_bug.cgi?id=1540916

https://github.com/openshift/origin-web-console/commit/170e153636c6e122d4d45f495cc904dded5dc9c2
Merge pull request #2776 from spadgett/hpa-scale-target

Automatic merge from submit-queue.

Bug 1540916: Set correct group and version in HPA scale target

Fixes https://bugzilla.redhat.com/show_bug.cgi?id=1540916

I had to move the function inside the data service callback, so the diff is larger than the actual change. This changes the HPA scale target to use the correct group/version for the resource being scaled.

/assign @jwforres 
/cc @DirectXMan12
Comment 5 Yadan Pei 2018-02-22 03:25:36 EST
HPAs created via the web console now use the correct API group and version (autoscaling/v1), and 'oc status' reports no errors.

Moving to VERIFIED.

Checked against OpenShift Web Console v3.9.0-0.47.0.
Comment 6 Yadan Pei 2018-03-07 04:18:43 EST
Verification steps:

1. Create ReplicaSet, Deployment, DeploymentConfig, ReplicationController

$ oc create -f https://raw.githubusercontent.com/openshift-qe/v3-testfiles/master/deployment/deployment1.json
$ oc create -f https://raw.githubusercontent.com/openshift-qe/v3-testfiles/master/deployment/hello-deployment-1.yaml
$ oc create -f https://raw.githubusercontent.com/openshift-qe/v3-testfiles/master/replicaSet/tc536601/replicaset.yaml
$ oc run myrunrc --image=aosqe/hello-openshift --generator=run-controller/v1

2. Log in to the web console and create an HPA for each of the resources above

3. Check whether the correct apiVersion was set in scaleTargetRef
$ for i in `oc get hpa --no-headers=true | awk -F ' ' '{print $1}' `; do oc get hpa $i -o yaml | grep scaleTargetRef -A 3; done
  scaleTargetRef:
    apiVersion: extensions/v1beta1
    kind: ReplicaSet
    name: frontend
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: hello-openshift
  scaleTargetRef:
    apiVersion: v1
    kind: DeploymentConfig
    name: hooks
  scaleTargetRef:
    apiVersion: v1
    kind: ReplicationController
    name: myrunrc

4. Delete all HPAs created from the web console, and create HPAs via the CLI
[yapei@dhcp-140-3 test-scripts]$ oc autoscale ReplicaSet frontend --max=11
replicaset "frontend" autoscaled
[yapei@dhcp-140-3 test-scripts]$ oc autoscale Deployment hello-openshift --max=12
deployment "hello-openshift" autoscaled
[yapei@dhcp-140-3 test-scripts]$ oc autoscale DeploymentConfig hooks --max=13
deploymentconfig "hooks" autoscaled
[yapei@dhcp-140-3 test-scripts]$ oc autoscale ReplicationController myrunrc --max=14
replicationcontroller "myrunrc" autoscaled
[yapei@dhcp-140-3 test-scripts]$ for i in `oc get hpa --no-headers=true | awk -F ' ' '{print $1}' `; do oc get hpa $i -o yaml | grep scaleTargetRef -A 3; done
  scaleTargetRef:
    apiVersion: extensions/v1beta1
    kind: ReplicaSet
    name: frontend
  scaleTargetRef:
    apiVersion: extensions/v1beta1
    kind: Deployment
    name: hello-openshift
  scaleTargetRef:
    apiVersion: apps.openshift.io/v1
    kind: DeploymentConfig
    name: hooks
  scaleTargetRef:
    apiVersion: v1
    kind: ReplicationController
    name: myrunrc

The apiVersion is inconsistent between the CLI and the web console. I'm sorry for not checking this the last time I verified the bug.
Comment 7 Yadan Pei 2018-03-07 04:23:07 EST
The new API group/version for Deployment & ReplicaSet is apps/v1.

For DeploymentConfig & ReplicationController it is apps.openshift.io/v1.
Comment 8 Yadan Pei 2018-03-07 04:23:47 EST
Sorry, ReplicationController should be v1
Comment 9 Samuel Padgett 2018-03-07 11:00:04 EST
This won't cause a problem. The CLI sets an older API version for deployments in the HPA scaleRef for n-1 compatibility, but both values should work. The value the web console sets for deployments is not wrong.

Ideally the console should be using `apps.openshift.io/v1` for deployment configs, but `v1` won't cause a problem. Since the original problem is fixed, I suggest opening a separate, low severity bug to track that change.
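
To make the equivalence concrete, here is a sketch of the two scaleTargetRef forms from comment 6 that both resolve the same DeploymentConfig (only the extensions/v1beta1 form from the original report fails to resolve):

  # written by the CLI (oc autoscale)
  scaleTargetRef:
    apiVersion: apps.openshift.io/v1
    kind: DeploymentConfig
    name: hooks

  # written by the web console; also resolves
  scaleTargetRef:
    apiVersion: v1
    kind: DeploymentConfig
    name: hooks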
Comment 10 Xingxing Xia 2018-03-08 01:55:30 EST
Thanks for the clarification. Moving to VERIFIED per the verification above.
