Bug 890119 - [fork][model refactor]App can't auto scale up/down when enabled auto scaling
Summary: [fork][model refactor]App can't auto scale up/down when enabled auto scaling
Keywords:
Status: CLOSED CURRENTRELEASE
Alias: None
Product: OKD
Classification: Red Hat
Component: Pod
Version: 2.x
Hardware: Unspecified
OS: Unspecified
Priority: unspecified
Severity: medium
Target Milestone: ---
Assignee: Abhishek Gupta
QA Contact: libra bugs
URL:
Whiteboard:
Depends On:
Blocks:
 
Reported: 2012-12-25 08:23 UTC by Rony Gong 🔥
Modified: 2015-05-15 02:11 UTC
CC List: 4 users

Fixed In Version: fork_ami_refctr1_401+
Doc Type: Bug Fix
Doc Text:
Clone Of:
Environment:
Last Closed: 2013-02-13 23:38:13 UTC
Target Upstream Version:
Embargoed:


Attachments
The scale_event.log of this app (8.16 KB, text/x-log)
2013-01-09 05:15 UTC, Rony Gong 🔥

Description Rony Gong 🔥 2012-12-25 08:23:31 UTC
Description of problem:
The app's auto-scaling is disabled by default, but after scaling the app up manually, the app no longer auto scales down.

Version-Release number of selected component (if applicable):
fork_ami_refctr1_372
rhc-1.3.2

How reproducible:
Always


Steps to Reproduce:
1. Create a scalable app, e.g. php
2. Scale up this app via the REST API
3. Wait 10 minutes, then check the gear count of this app.
  
Actual results:
The app's total gear count does not decrease.

Expected results:
The app's total gear count should return to 1.

Additional info:

Comment 1 Rony Gong 🔥 2012-12-25 09:08:37 UTC
Sorry, the Description of problem above needs to be updated to:
The app's auto-scaling is enabled by default, but after scaling the app up manually, the app no longer auto scales down.
Also, using "ab -c 200 -t 60 http://qsphp-qgong22.dev.rhcloud.com/"
to trigger an auto scale-up does not trigger it successfully.

Comment 2 Rony Gong 🔥 2012-12-25 09:54:46 UTC
Logs from /var/lib/openshift/50d94a3e0b0c68f30900001c/haproxy-1.4/logs when checking auto scale-up:

I, [2012-12-25T04:03:53.143288 #13159]  INFO -- : GEAR_UP - capacity: 1500.0% gear_count: 1 sessions: 150 up_thresh: 90.0%
D, [2012-12-25T04:03:56.939040 #13159] DEBUG -- : GEAR_UP - add-gear: exit: pid 16350 exit 0  stdout: 
D, [2012-12-25T04:03:56.940992 #13159] DEBUG -- : GEAR_INFO - capacity: 1639.9999999999998% gear_count: 1 sessions: 164 up/remove_thresh: 90.0%/49.9% sec_left_til_remove: 117 gear_remove_thresh: 0/20
I, [2012-12-25T04:03:59.942650 #13159]  INFO -- : GEAR_UP - capacity: 1550.0% gear_count: 1 sessions: 155 up_thresh: 90.0%
D, [2012-12-25T04:04:05.423168 #13159] DEBUG -- : GEAR_UP - add-gear: exit: pid 16370 exit 0  stdout: 
D, [2012-12-25T04:04:05.424976 #13159] DEBUG -- : GEAR_INFO - capacity: 1720.0% gear_count: 1 sessions: 172 up/remove_thresh: 90.0%/49.9% sec_left_til_remove: 115 gear_remove_thresh: 0/20
I, [2012-12-25T04:04:08.426665 #13159]  INFO -- : GEAR_UP - capacity: 1530.0% gear_count: 1 sessions: 153 up_thresh: 90.0%

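The capacity figures above are consistent with each gear being rated for 10 concurrent sessions, i.e. capacity = sessions / (gear_count * 10) * 100. A minimal sketch of the up/down decision these lines imply (the 10-sessions-per-gear rating and all names here are inferred from the log numbers, not taken from the haproxy_ctld source):

```python
# Sketch of the scaling decision implied by the GEAR_UP/GEAR_DOWN log lines.
# MAX_SESSIONS_PER_GEAR = 10 is an assumption inferred from the numbers above
# (150 sessions on 1 gear -> 1500.0% capacity); it is not confirmed upstream.
MAX_SESSIONS_PER_GEAR = 10
UP_THRESHOLD = 90.0      # up_thresh in the log
REMOVE_THRESHOLD = 49.9  # remove_thresh in the log

def capacity(sessions, gear_count):
    """Current load as a percentage of what the existing gears can serve."""
    return sessions / (gear_count * MAX_SESSIONS_PER_GEAR) * 100.0

def decide(sessions, gear_count):
    cap = capacity(sessions, gear_count)
    if cap > UP_THRESHOLD:
        return "GEAR_UP"
    if cap < REMOVE_THRESHOLD and gear_count > 1:
        return "GEAR_DOWN"
    return "STEADY"

print(capacity(150, 1))  # matches the first log line: 1500.0
print(decide(150, 1))    # GEAR_UP
print(decide(0, 2))      # GEAR_DOWN
```

Note that the symptom in this comment is different from the model: GEAR_UP fires and add-gear exits 0, yet gear_count stays at 1.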
Comment 3 Lili Nader 2013-01-08 08:14:22 UTC
Fixed in fork_ami_refctr1_389.

Comment 4 Rony Gong 🔥 2013-01-08 09:05:26 UTC
Reopened on fork_ami_refctr1_389.
Exactly as in the description: the scalable application (if scaled up manually) will not scale down automatically even after waiting 10 minutes.

Comment 5 Rony Gong 🔥 2013-01-08 09:33:29 UTC
The scale_event.log below runs from the manual scale-up through a 10-minute wait:
I, [2013-01-08T04:17:53.446201 #21759]  INFO -- : Starting haproxy_ctld
D, [2013-01-08T04:17:53.480695 #21759] DEBUG -- : GEAR_INFO - capacity: 0.0% gear_count: 2 sessions: 0 up/remove_thresh: 90.0%/49.9% sec_left_til_remove: 120 gear_remove_thresh: 0/20
I, [2013-01-08T04:18:08.830313 #23878]  INFO -- : Starting haproxy_ctld
D, [2013-01-08T04:18:08.839837 #23878] DEBUG -- : GEAR_INFO - capacity: 0.0% gear_count: 2 sessions: 0 up/remove_thresh: 90.0%/49.9% sec_left_til_remove: 120 gear_remove_thresh: 0/20
I, [2013-01-08T04:18:46.867887 #2013]  INFO -- : GEAR_DOWN - capacity: 0.0% gear_count: 2 sessions: 0 remove_thresh: 49.9%
D, [2013-01-08T04:18:49.541422 #2013] DEBUG -- : GEAR_DOWN - remove-gear: exit: pid 1528 exit 0  stdout: 
D, [2013-01-08T04:18:49.547795 #2013] DEBUG -- : GEAR_INFO - capacity: 0.0% gear_count: 2 sessions: 0 up/remove_thresh: 90.0%/49.9% sec_left_til_remove: 0 gear_remove_thresh: 20/20
I, [2013-01-08T04:19:49.666533 #2013]  INFO -- : GEAR_DOWN - capacity: 0.0% gear_count: 2 sessions: 0 remove_thresh: 49.9%
D, [2013-01-08T04:19:53.448139 #2013] DEBUG -- : GEAR_DOWN - remove-gear: exit: pid 14923 exit 0  stdout: 
D, [2013-01-08T04:19:53.453584 #2013] DEBUG -- : GEAR_INFO - capacity: 0.0% gear_count: 2 sessions: 0 up/remove_thresh: 90.0%/49.9% sec_left_til_remove: 0 gear_remove_thresh: 20/20
I, [2013-01-08T04:20:12.056439 #23878]  INFO -- : GEAR_DOWN - capacity: 0.0% gear_count: 2 sessions: 0 remove_thresh: 49.9%
D, [2013-01-08T04:20:14.834428 #23878] DEBUG -- : GEAR_DOWN - remove-gear: exit: pid 19028 exit 0  stdout: 
D, [2013-01-08T04:20:14.843937 #23878] DEBUG -- : GEAR_INFO - capacity: 0.0% gear_count: 2 sessions: 0 up/remove_thresh: 90.0%/49.9% sec_left_til_remove: 0 gear_remove_thresh: 20/20
I, [2013-01-08T04:20:53.591341 #2013]  INFO -- : GEAR_DOWN - capacity: 0.0% gear_count: 2 sessions: 0 remove_thresh: 49.9%
D, [2013-01-08T04:20:56.073904 #2013] DEBUG -- : GEAR_DOWN - remove-gear: exit: pid 24631 exit 0  stdout: 
D, [2013-01-08T04:20:56.131475 #2013] DEBUG -- : GEAR_INFO - capacity: 0.0% gear_count: 2 sessions: 0 up/remove_thresh: 90.0%/49.9% sec_left_til_remove: 0 gear_remove_thresh: 20/20
I, [2013-01-08T04:21:14.945434 #23878]  INFO -- : GEAR_DOWN - capacity: 0.0% gear_count: 2 sessions: 0 remove_thresh: 49.9%
D, [2013-01-08T04:21:17.589430 #23878] DEBUG -- : GEAR_DOWN - remove-gear: exit: pid 25116 exit 0  stdout: 
D, [2013-01-08T04:21:17.631654 #23878] DEBUG -- : GEAR_INFO - capacity: 0.0% gear_count: 2 sessions: 0 up/remove_thresh: 90.0%/49.9% sec_left_til_remove: 0 gear_remove_thresh: 20/20
I, [2013-01-08T04:21:56.209493 #2013]  INFO -- : GEAR_DOWN - capacity: 0.0% gear_count: 2 sessions: 0 remove_thresh: 49.9%
D, [2013-01-08T04:21:58.736253 #2013] DEBUG -- : GEAR_DOWN - remove-gear: exit: pid 25308 exit 0  stdout: 
D, [2013-01-08T04:21:58.738638 #2013] DEBUG -- : GEAR_INFO - capacity: 0.0% gear_count: 2 sessions: 0 up/remove_thresh: 90.0%/49.9% sec_left_til_remove: 0 gear_remove_thresh: 20/20
I, [2013-01-08T04:22:17.753614 #23878]  INFO -- : GEAR_DOWN - capacity: 0.0% gear_count: 2 sessions: 0 remove_thresh: 49.9%
D, [2013-01-08T04:22:20.353841 #23878] DEBUG -- : GEAR_DOWN - remove-gear: exit: pid 25799 exit 0  stdout: 
D, [2013-01-08T04:22:20.356442 #23878] DEBUG -- : GEAR_INFO - capacity: 0.0% gear_count: 2 sessions: 0 up/remove_thresh: 90.0%/49.9% sec_left_til_remove: 0 gear_remove_thresh: 20/20
I, [2013-01-08T04:22:58.877560 #2013]  INFO -- : GEAR_DOWN - capacity: 0.0% gear_count: 2 sessions: 0 remove_thresh: 49.9%
D, [2013-01-08T04:23:01.656427 #2013] DEBUG -- : GEAR_DOWN - remove-gear: exit: pid 26002 exit 0  stdout: 
D, [2013-01-08T04:23:01.681771 #2013] DEBUG -- : GEAR_INFO - capacity: 0.0% gear_count: 2 sessions: 0 up/remove_thresh: 90.0%/49.9% sec_left_til_remove: 0 gear_remove_thresh: 20/20
I, [2013-01-08T04:23:20.454390 #23878]  INFO -- : GEAR_DOWN - capacity: 0.0% gear_count: 2 sessions: 0 remove_thresh: 49.9%
D, [2013-01-08T04:23:23.054403 #23878] DEBUG -- : GEAR_DOWN - remove-gear: exit: pid 26468 exit 0  stdout: 
D, [2013-01-08T04:23:23.057136 #23878] DEBUG -- : GEAR_INFO - capacity: 0.0% gear_count: 2 sessions: 0 up/remove_thresh: 90.0%/49.9% sec_left_til_remove: 0 gear_remove_thresh: 20/20
I, [2013-01-08T04:24:01.761402 #2013]  INFO -- : GEAR_DOWN - capacity: 0.0% gear_count: 2 sessions: 0 remove_thresh: 49.9%
D, [2013-01-08T04:24:04.937057 #2013] DEBUG -- : GEAR_DOWN - remove-gear: exit: pid 26628 exit 0  stdout: 
D, [2013-01-08T04:24:04.947085 #2013] DEBUG -- : GEAR_INFO - capacity: 0.0% gear_count: 2 sessions: 0 up/remove_thresh: 90.0%/49.9% sec_left_til_remove: 0 gear_remove_thresh: 20/20
I, [2013-01-08T04:24:23.160990 #23878]  INFO -- : GEAR_DOWN - capacity: 0.0% gear_count: 2 sessions: 0 remove_thresh: 49.9%
D, [2013-01-08T04:24:26.618522 #23878] DEBUG -- : GEAR_DOWN - remove-gear: exit: pid 27355 exit 0  stdout: 
D, [2013-01-08T04:24:26.629995 #23878] DEBUG -- : GEAR_INFO - capacity: 0.0% gear_count: 2 sessions: 0 up/remove_thresh: 90.0%/49.9% sec_left_til_remove: 0 gear_remove_thresh: 20/20
I, [2013-01-08T04:25:05.304253 #2013]  INFO -- : GEAR_DOWN - capacity: 0.0% gear_count: 2 sessions: 0 remove_thresh: 49.9%
D, [2013-01-08T04:25:07.942794 #2013] DEBUG -- : GEAR_DOWN - remove-gear: exit: pid 30191 exit 0  stdout: 
D, [2013-01-08T04:25:07.945319 #2013] DEBUG -- : GEAR_INFO - capacity: 0.0% gear_count: 2 sessions: 0 up/remove_thresh: 90.0%/49.9% sec_left_til_remove: 0 gear_remove_thresh: 20/20
I, [2013-01-08T04:25:26.774775 #23878]  INFO -- : GEAR_DOWN - capacity: 0.0% gear_count: 2 sessions: 0 remove_thresh: 49.9%
D, [2013-01-08T04:25:29.503380 #23878] DEBUG -- : GEAR_DOWN - remove-gear: exit: pid 30742 exit 0  stdout: 
D, [2013-01-08T04:25:29.516304 #23878] DEBUG -- : GEAR_INFO - capacity: 0.0% gear_count: 2 sessions: 0 up/remove_thresh: 90.0%/49.9% sec_left_til_remove: 0 gear_remove_thresh: 20/20
I, [2013-01-08T04:26:08.211833 #2013]  INFO -- : GEAR_DOWN - capacity: 0.0% gear_count: 2 sessions: 0 remove_thresh: 49.9%
D, [2013-01-08T04:26:10.846439 #2013] DEBUG -- : GEAR_DOWN - remove-gear: exit: pid 32359 exit 0  stdout: 
D, [2013-01-08T04:26:10.848667 #2013] DEBUG -- : GEAR_INFO - capacity: 0.0% gear_count: 2 sessions: 0 up/remove_thresh: 90.0%/49.9% sec_left_til_remove: 0 gear_remove_thresh: 20/20
I, [2013-01-08T04:26:29.592091 #23878]  INFO -- : GEAR_DOWN - capacity: 0.0% gear_count: 2 sessions: 0 remove_thresh: 49.9%
D, [2013-01-08T04:26:32.458722 #23878] DEBUG -- : GEAR_DOWN - remove-gear: exit: pid 604 exit 0  stdout: 
D, [2013-01-08T04:26:32.468465 #23878] DEBUG -- : GEAR_INFO - capacity: 0.0% gear_count: 2 sessions: 0 up/remove_thresh: 90.0%/49.9% sec_left_til_remove: 0 gear_remove_thresh: 20/20
I, [2013-01-08T04:27:10.938903 #2013]  INFO -- : GEAR_DOWN - capacity: 0.0% gear_count: 2 sessions: 0 remove_thresh: 49.9%
D, [2013-01-08T04:27:13.457564 #2013] DEBUG -- : GEAR_DOWN - remove-gear: exit: pid 1566 exit 0  stdout: 
D, [2013-01-08T04:27:13.537819 #2013] DEBUG -- : GEAR_INFO - capacity: 0.0% gear_count: 2 sessions: 0 up/remove_thresh: 90.0%/49.9% sec_left_til_remove: 0 gear_remove_thresh: 20/20
I, [2013-01-08T04:27:32.543869 #23878]  INFO -- : GEAR_DOWN - capacity: 0.0% gear_count: 2 sessions: 0 remove_thresh: 49.9%
D, [2013-01-08T04:27:35.161833 #23878] DEBUG -- : GEAR_DOWN - remove-gear: exit: pid 1634 exit 0  stdout: 
D, [2013-01-08T04:27:35.167861 #23878] DEBUG -- : GEAR_INFO - capacity: 0.0% gear_count: 2 sessions: 0 up/remove_thresh: 90.0%/49.9% sec_left_til_remove: 0 gear_remove_thresh: 20/20
I, [2013-01-08T04:28:13.589500 #2013]  INFO -- : GEAR_DOWN - capacity: 0.0% gear_count: 2 sessions: 0 remove_thresh: 49.9%
D, [2013-01-08T04:28:16.054840 #2013] DEBUG -- : GEAR_DOWN - remove-gear: exit: pid 2371 exit 0  stdout: 
D, [2013-01-08T04:28:16.057033 #2013] DEBUG -- : GEAR_INFO - capacity: 0.0% gear_count: 2 sessions: 0 up/remove_thresh: 90.0%/49.9% sec_left_til_remove: 0 gear_remove_thresh: 20/20
I, [2013-01-08T04:28:35.252603 #23878]  INFO -- : GEAR_DOWN - capacity: 0.0% gear_count: 2 sessions: 0 remove_thresh: 49.9%
D, [2013-01-08T04:28:37.848403 #23878] DEBUG -- : GEAR_DOWN - remove-gear: exit: pid 2420 exit 0  stdout: 
D, [2013-01-08T04:28:37.854380 #23878] DEBUG -- : GEAR_INFO - capacity: 0.0% gear_count: 2 sessions: 0 up/remove_thresh: 90.0%/49.9% sec_left_til_remove: 0 gear_remove_thresh: 20/20
I, [2013-01-08T04:29:16.111165 #2013]  INFO -- : GEAR_DOWN - capacity: 0.0% gear_count: 2 sessions: 0 remove_thresh: 49.9%
D, [2013-01-08T04:29:18.643943 #2013] DEBUG -- : GEAR_DOWN - remove-gear: exit: pid 3239 exit 0  stdout: 
D, [2013-01-08T04:29:18.646362 #2013] DEBUG -- : GEAR_INFO - capacity: 0.0% gear_count: 2 sessions: 0 up/remove_thresh: 90.0%/49.9% sec_left_til_remove: 0 gear_remove_thresh: 20/20
I, [2013-01-08T04:29:37.945734 #23878]  INFO -- : GEAR_DOWN - capacity: 0.0% gear_count: 2 sessions: 0 remove_thresh: 49.9%
D, [2013-01-08T04:29:40.557327 #23878] DEBUG -- : GEAR_DOWN - remove-gear: exit: pid 3293 exit 0  stdout: 
D, [2013-01-08T04:29:40.560731 #23878] DEBUG -- : GEAR_INFO - capacity: 0.0% gear_count: 2 sessions: 0 up/remove_thresh: 90.0%/49.9% sec_left_til_remove: 0 gear_remove_thresh: 20/20
I, [2013-01-08T04:30:18.693864 #2013]  INFO -- : GEAR_DOWN - capacity: 0.0% gear_count: 2 sessions: 0 remove_thresh: 49.9%
D, [2013-01-08T04:30:21.158628 #2013] DEBUG -- : GEAR_DOWN - remove-gear: exit: pid 3867 exit 0  stdout: 
D, [2013-01-08T04:30:21.238686 #2013] DEBUG -- : GEAR_INFO - capacity: 0.0% gear_count: 2 sessions: 0 up/remove_thresh: 90.0%/49.9% sec_left_til_remove: 0 gear_remove_thresh: 20/20
I, [2013-01-08T04:30:40.645883 #23878]  INFO -- : GEAR_DOWN - capacity: 0.0% gear_count: 2 sessions: 0 remove_thresh: 49.9%
D, [2013-01-08T04:30:43.231798 #23878] DEBUG -- : GEAR_DOWN - remove-gear: exit: pid 3911 exit 0  stdout: 
D, [2013-01-08T04:30:43.235141 #23878] DEBUG -- : GEAR_INFO - capacity: 0.0% gear_count: 2 sessions: 0 up/remove_thresh: 90.0%/49.9% sec_left_til_remove: 0 gear_remove_thresh: 20/20
I, [2013-01-08T04:31:21.283771 #2013]  INFO -- : GEAR_DOWN - capacity: 0.0% gear_count: 2 sessions: 0 remove_thresh: 49.9%
D, [2013-01-08T04:31:23.755409 #2013] DEBUG -- : GEAR_DOWN - remove-gear: exit: pid 4493 exit 0  stdout: 
D, [2013-01-08T04:31:23.757648 #2013] DEBUG -- : GEAR_INFO - capacity: 0.0% gear_count: 2 sessions: 0 up/remove_thresh: 90.0%/49.9% sec_left_til_remove: 0 gear_remove_thresh: 20/20
I, [2013-01-08T04:31:43.344045 #23878]  INFO -- : GEAR_DOWN - capacity: 0.0% gear_count: 2 sessions: 0 remove_thresh: 49.9%
D, [2013-01-08T04:31:45.890812 #23878] DEBUG -- : GEAR_DOWN - remove-gear: exit: pid 4542 exit 0  stdout: 
D, [2013-01-08T04:31:45.935081 #23878] DEBUG -- : GEAR_INFO - capacity: 0.0% gear_count: 2 sessions: 0 up/remove_thresh: 90.0%/49.9% sec_left_til_remove: 0 gear_remove_thresh: 20/20
I, [2013-01-08T04:32:23.820689 #2013]  INFO -- : GEAR_DOWN - capacity: 0.0% gear_count: 2 sessions: 0 remove_thresh: 49.9%
D, [2013-01-08T04:32:26.350969 #2013] DEBUG -- : GEAR_DOWN - remove-gear: exit: pid 5102 exit 0  stdout: 
D, [2013-01-08T04:32:26.355146 #2013] DEBUG -- : GEAR_INFO - capacity: 0.0% gear_count: 2 sessions: 0 up/remove_thresh: 90.0%/49.9% sec_left_til_remove: 0 gear_remove_thresh: 20/20
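
The repeating GEAR_DOWN lines above show remove-gear exiting 0 on every check while gear_count never drops below 2, with gear_remove_thresh already saturated at 20/20. As the log fields suggest, removal is meant to be debounced by a counter of consecutive low-capacity checks; a rough model of that debounce (field meanings are inferred from the log, not from the haproxy_ctld source):

```python
# Rough model of the scale-down debounce suggested by the log fields:
# sec_left_til_remove counts down while capacity stays below remove_thresh,
# and gear_remove_thresh is a consecutive-check counter (n/20). Only when
# the counter saturates should remove-gear actually be invoked. The bug in
# this comment is that remove-gear keeps being called yet gear_count never drops.
REMOVE_CHECKS_REQUIRED = 20  # the "/20" in gear_remove_thresh

class ScaleDownDebounce:
    def __init__(self):
        self.low_capacity_checks = 0

    def tick(self, capacity_pct, remove_thresh=49.9):
        """Return True once enough consecutive low-capacity checks have
        accumulated that a gear removal should be triggered."""
        if capacity_pct < remove_thresh:
            self.low_capacity_checks += 1
        else:
            self.low_capacity_checks = 0  # any load spike resets the counter
        return self.low_capacity_checks >= REMOVE_CHECKS_REQUIRED

d = ScaleDownDebounce()
results = [d.tick(0.0) for _ in range(REMOVE_CHECKS_REQUIRED)]
print(results[-2], results[-1])  # False True -- removal fires on the 20th check
```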

Comment 6 Lili Nader 2013-01-08 19:55:00 UTC
Works for me. Please provide more information so I can reproduce it.

This is what I did:

1. Created a scalable app
rhc app create -a app -t php-5.3 -s

2. Scaled up using the REST API
curl -k --user "lnader:ppp" https://localhost/broker/rest/domains/ldev25/applications/app/events --data "event=scale-up"

and Again

curl -k --user "lnader:ppp" https://localhost/broker/rest/domains/ldev25/applications/app/events --data "event=scale-up"

3. Verified that the user had 3 consumed gears
curl -k --user "lnader:ppp" https://localhost/broker/rest/user 
{"data":{"capabilities":{"subaccounts":false,"gear_sizes":["small"]},"consumed_gear_sizes":{"small":3},"consumed_gears":3

4. Waited less than 10 minutes
5. Verified that the user had 1 consumed gear
curl -k --user "lnader:ppp" https://localhost/broker/rest/user 
{"data":{"capabilities":{"subaccounts":false,"gear_sizes":["small"]},"consumed_gear_sizes":{"small":1},"consumed_gears":1
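
The wait-and-verify in steps 4 and 5 can be automated with a small polling loop. The broker URL and credentials below are the placeholders from this comment, and both the helper name and the grep-based JSON extraction are illustrative sketches, not official tooling:

```shell
# Hypothetical helper: poll the broker until consumed_gears drops to 1,
# checking every 30s and giving up after 20 tries (~10 minutes).
# URL and credentials are the placeholders from the comment above.
scale_down_watch() {
  for i in $(seq 1 20); do
    gears=$(curl -sk --user "lnader:ppp" https://localhost/broker/rest/user \
      | grep -o '"consumed_gears":[0-9]*' | grep -o '[0-9]*$')
    echo "check $i: consumed_gears=$gears"
    [ "$gears" = "1" ] && return 0
    sleep 30
  done
  return 1
}
# Usage against a real broker: scale_down_watch
```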

Comment 7 Rony Gong 🔥 2013-01-09 04:56:45 UTC
Retested on fork_refctr1_392
Steps:

1. Created a scalable app
rhc app create -a app -t php-5.3 -s

2. Scaled up using the REST API
curl -k -X POST -H 'Accept: application/xml' -d event=scale-up --user qgong:111111 https://ec2-107-21-171-219.compute-1.amazonaws.com/broker/rest/domains/qgong2/applications/qsperl/events

and Again

curl -k -X POST -H 'Accept: application/xml' -d event=scale-up --user qgong:111111 https://ec2-107-21-171-219.compute-1.amazonaws.com/broker/rest/domains/qgong2/applications/qsperl/events

3. Verified that the user had 3 consumed gears
curl -k -X GET -H 'Accept: application/xml' --user qgong:111111 https://ec2-107-21-171-219.compute-1.amazonaws.com/broker/rest/user|grep consumed-gears
<consumed-gears>3</consumed-gears>

4. Waited almost 10 minutes; the user then had 2 consumed gears
curl -k -X GET -H 'Accept: application/xml' --user qgong:111111 https://ec2-107-21-171-219.compute-1.amazonaws.com/broker/rest/user|grep consumed-gears
<consumed-gears>2</consumed-gears>

5. Scaled down using the REST API
curl -k -X POST -H 'Accept: application/xml' -d event=scale-down --user qgong:111111 https://ec2-107-21-171-219.compute-1.amazonaws.com/broker/rest/domains/qgong2/applications/qsperl/events

6. Verified that the user had 1 consumed gear
curl -k -X GET -H 'Accept: application/xml' --user qgong:111111 https://ec2-107-21-171-219.compute-1.amazonaws.com/broker/rest/user|grep consumed-gears
<consumed-gears>1</consumed-gears>

Comment 8 Rony Gong 🔥 2013-01-09 05:15:02 UTC
Reopening this bug since:
Auto scaling can only scale an app down automatically when its gear count is > 2.
Auto scaling cannot scale an app down automatically when its gear count is 2.

For details, see comment 7.

Comment 9 Rony Gong 🔥 2013-01-09 05:15:45 UTC
Created attachment 675257 [details]
The scale_event.log of this app

Comment 10 Abhishek Gupta 2013-01-11 02:57:44 UTC
Bug 892076 has been fixed and the issue with not being able to scale down below 2 gears (as per comment 8) should now be resolved.

Comment 11 Abhishek Gupta 2013-01-11 19:18:12 UTC
Assigning it to QA for verification, since the two underlying issues (mentioned below) have been resolved as part of separate bugs:

1. Application scale-down is now fixed
2. HAProxy can now scale down an application to just 1 gear

Fixed in fork_ami_refctr1_401+

Comment 12 Rony Gong 🔥 2013-01-14 02:52:27 UTC
Verified on fork_ami_refctr1_404
1.After manual scale-up 2 times, wait some minutes.
2.This scalable application could auto scale-down to 1 gear.
<?xml version="1.0" encoding="UTF-8"?>
<response>
  <status>ok</status>
  <type>gear_groups</type>
  <data>
    <gear-group>
      <uuid>50f36e526af5ac220a000018</uuid>
      <name>50f36e526af5ac220a000018</name>
      <gear-profile>small</gear-profile>
      <gears>
        <gear>
          <id>50f36e526af5ac220a000005</id>
          <state>started</state>
        </gear>
      </gears>
      <cartridges>
        <cartridge>
          <name>php-5.3</name>
          <display-name>PHP 5.3</display-name>
          <tags>
            <tag>service</tag>
            <tag>php</tag>
            <tag>web_framework</tag>
          </tags>
        </cartridge>
        <cartridge>
          <name>haproxy-1.4</name>
          <display-name>HAProxy 1.4</display-name>
          <tags>
            <tag>web_proxy</tag>
            <tag>scales</tag>
            <tag>embedded</tag>

