Bug 1370058 - client 3.3 can not scale dc on OSE 3.1 [compatibility]
Summary: client 3.3 can not scale dc on OSE 3.1 [compatibility]
Alias: None
Product: OpenShift Container Platform
Classification: Red Hat
Component: oc
Version: 3.3.0
Hardware: Unspecified
OS: Unspecified
Target Milestone: ---
Assignee: Solly Ross
QA Contact: Xingxing Xia
Depends On:
Reported: 2016-08-25 08:11 UTC by XiaochuanWang
Modified: 2017-01-03 08:43 UTC (History)
8 users

Fixed In Version:
Doc Type: If docs needed, set a value
Doc Text:
Clone Of:
Last Closed: 2016-09-19 15:18:21 UTC
Target Upstream Version:

Attachments

System ID Priority Status Summary Last Updated
Red Hat Product Errata RHBA-2016:1933 normal SHIPPED_LIVE Red Hat OpenShift Container Platform 3.3 Release Advisory 2016-09-27 13:24:36 UTC

Description XiaochuanWang 2016-08-25 08:11:55 UTC
Description of problem:
After creating an app, scaling up the dc fails when using the oc client against the server.

Version-Release number of selected component (if applicable):
Server: openshift v3.1.1.6-64-g80b61da
Client: oc v3.3.0.25+d2ac65e-dirty

How reproducible:

Steps to Reproduce:
1. Create an app
# oc new-app --image-stream=openshift/perl:5.20 --name=myapp --code=https://github.com/openshift/sti-perl --context-dir=5.20/test/sample-test-app/
or # oc new-app -f https://raw.githubusercontent.com/openshift/origin/master/examples/sample-app/application-template-stibuild.json
2. #  oc scale deploymentconfig myapp --replicas=3
error: Scaling the resource failed with: scale "myapp" is invalid: metadata.uid: invalid value '3de541e0-6a88-11e6-93cb-fa163eb1d16c', Details: field is immutable; Current resource version 40920

Actual results:
2. Failed to scale the dc; for more detail, please check the attachment.

Expected results:
deploymentconfig "myapp" scaled

Additional info:
Not reproduced on oc 3.1 against openshift 3.1
Not reproduced on oc 3.3 against openshift 3.2/3.3

Comment 2 Solly Ross 2016-08-25 18:59:04 UTC
agoldste@ (and I) did some digging, and the problem stems from three "issues":

1. OSE 3.1 has a ValidateScaleUpdate method, which calls ValidateObjectMetaUpdate.  This is called on Scale updates in OSE 3.1, against a "fake" Scale object with only Name, Namespace, and CreationTimestamp filled out in ObjectMeta.

2. OpenShift's scaler code used in `oc scale` for DeploymentConfigs fetches the entire DC to scale, and then generates a Scale object from it to submit.

3. OSE 3.3's helper method for generating Scale objects sets the UID, whereas OSE 3.1's does not (see https://github.com/openshift/origin/pull/6233).

Thus, OSE 3.3's oc submits a scale update with the UID set, and OSE 3.1 validates the ObjectMeta update against the fake "old" Scale, sees the filled-out UID as an attempt to change the UID, and bails.
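The failure mode above can be illustrated with a minimal sketch. This is not the actual OSE validation code (which lives in the upstream `ValidateObjectMetaUpdate` path); the types and function below are simplified stand-ins for illustration only.

```go
package main

import "fmt"

// objectMeta is a simplified stand-in for Kubernetes ObjectMeta.
type objectMeta struct {
	Name string
	UID  string
}

// validateMetaUpdate mimics the immutable-field check in
// ValidateObjectMetaUpdate: the UID may not change between the old
// and new objects.
func validateMetaUpdate(newMeta, oldMeta objectMeta) error {
	if newMeta.UID != oldMeta.UID {
		return fmt.Errorf("metadata.uid: invalid value %q: field is immutable", newMeta.UID)
	}
	return nil
}

func main() {
	// OSE 3.1 validates against a "fake" old Scale whose UID is empty.
	oldScale := objectMeta{Name: "myapp"}
	// An oc 3.3 client fills in the UID copied from the fetched DC.
	newScale := objectMeta{Name: "myapp", UID: "3de541e0-6a88-11e6-93cb-fa163eb1d16c"}

	if err := validateMetaUpdate(newScale, oldScale); err != nil {
		fmt.Println("error:", err)
	}
}
```

Because the fake old Scale has no UID, any UID sent by the client looks like a mutation of an immutable field, which matches the error seen in the reproduction steps.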

I think the right solution here is to not fetch the entire DeploymentConfig when attempting to submit a scale update, but instead to fetch just the Scale object itself.

Comment 3 Solly Ross 2016-08-25 19:07:25 UTC
It looks like we only fetch the DC to check whether dc.Spec.Test is set, and print an error (without actually failing the update) when it is. IMO, it seems worth it to drop the warning message there in order to do the right thing with the Scale update.
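The proposed flow can be sketched as follows. This is an illustrative simplification, not the actual client code from the PR; `fetchScale` and `updateScale` are hypothetical stubs standing in for GET and PUT on the /scale subresource.

```go
package main

import "fmt"

// Simplified stand-ins for the real API types.
type objectMeta struct {
	Name string
	UID  string
}

type scale struct {
	Meta     objectMeta
	Replicas int
}

// fetchScale stands in for GETting the Scale subresource: the client
// receives exactly the metadata the server will validate against.
func fetchScale(name string) scale {
	return scale{Meta: objectMeta{Name: name}, Replicas: 1}
}

// updateScale mimics the server-side immutable-field check on PUT.
func updateScale(current, desired scale) error {
	if desired.Meta.UID != current.Meta.UID {
		return fmt.Errorf("metadata.uid: invalid value %q: field is immutable", desired.Meta.UID)
	}
	return nil
}

func main() {
	// Fetch the Scale itself, change only the replica count, and
	// submit it back; the metadata round-trips unchanged, so the
	// immutable-field check passes.
	s := fetchScale("myapp")
	s.Replicas = 3
	if err := updateScale(fetchScale("myapp"), s); err != nil {
		fmt.Println("error:", err)
		return
	}
	fmt.Println(`deploymentconfig "myapp" scaled`)
}
```

The key point is that the Scale's metadata is round-tripped from the server rather than reconstructed from the DC, so no spurious UID change is introduced.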

Comment 4 Fabiano Franz 2016-08-25 19:39:46 UTC
Adding Michail, PTAL at the solution proposed by Solly.

Comment 5 Michail Kargakis 2016-08-26 09:23:07 UTC

Comment 6 Fabiano Franz 2016-08-26 23:04:28 UTC
Fixed in https://github.com/openshift/origin/pull/10684

Comment 7 XiaochuanWang 2016-08-29 09:10:03 UTC
The fix has not yet been merged into OSE as of oc v3.3.0.26.
Verified it on origin client oc v1.3.0-alpha.3+4250e53 against server openshift v3.1.1.6-64-g80b61da.

Comment 8 XiaochuanWang 2016-08-29 09:48:36 UTC
Better to verify it again on OSE 3.3 once the code is merged. To keep tracking this bug, marking it ON_QA.

Comment 9 XiaochuanWang 2016-08-30 01:52:10 UTC
Verified on oc v3.3.0.27 against openshift v3.1.1.6-64-g80b61da
# oc new-app -f https://raw.githubusercontent.com/openshift/origin/master/examples/sample-app/application-template-stibuild.json

# oc get pod
NAME                        READY     STATUS    RESTARTS   AGE
database-1-yv25d            1/1       Running   0          56s
ruby-sample-build-1-build   1/1       Running   0          1m

# oc scale dc/database --replicas=5
deploymentconfig "database" scaled

# oc get pod
NAME                        READY     STATUS      RESTARTS   AGE
database-1-biklu            1/1       Running     0          3m
database-1-due7c            1/1       Running     0          3m
database-1-ktq9u            1/1       Running     0          3m
database-1-xj4hv            1/1       Running     0          3m
database-1-yv25d            1/1       Running     0          4m
frontend-1-yfeq9            1/1       Running     0          2m
frontend-1-yfgh1            1/1       Running     0          2m
ruby-sample-build-1-build   0/1       Completed   0          4m
