Description of problem:
Given a scalable app (e.g., myperl510s), set the min scaling value to 2, stop the app, save a snapshot, then restore the snapshot and check the gear states: the haproxy gear ends up started.

[rayzhang@ray Work]$ rhc app stop -a myperl510s
RESULT:
myperl510s stopped

[rayzhang@ray Work]$ rhc app show --gear -a myperl510s
ID                       State   Cartridges            Size  SSH URL
------------------------ ------- --------------------- ----- ------------------------------------
5384a2035be46d2b6e000784 stopped haproxy-1.4 perl-5.10 small 5384a2035be46d2b6e000784.rhcloud.com
5384a2765be46d007500005b stopped haproxy-1.4 perl-5.10 small 5384a2765be46d007500005b.rhcloud.com

[rayzhang@ray Work]$ rhc snapshot restore -a myperl510s -f myperl510s.tar.gz
Restoring from snapshot myperl510s.tar.gz to application 'myperl510s' ... done

[rayzhang@ray Work]$ rhc app show --gear -a myperl510s
ID                       State   Cartridges            Size  SSH URL
------------------------ ------- --------------------- ----- ------------------------------------
5384a2035be46d2b6e000784 stopped haproxy-1.4 perl-5.10 small 5384a2035be46d2b6e000784.rhcloud.com
5384a2765be46d007500005b started haproxy-1.4 perl-5.10 small 5384a2765be46d007500005b.rhcloud.com

Version-Release number of selected component (if applicable):
devenv_4815

How reproducible:
Always

Steps to Reproduce:
1. Create a scalable app and set the min scaling value to 2
   # rhc app create myperl510s perl-5.10 -s
   # rhc cartridge scale -a myperl510s -c perl-5.10 --min 2
2. Stop the app and save a snapshot
   # rhc app stop myperl510s
   # rhc snapshot save myperl510s
   # rhc app show --gear -a myperl510s
3. Restore the snapshot and check the gear states
   # rhc snapshot restore myperl510s
   # rhc app show --gear -a myperl510s

Actual results:
The haproxy gear is started after restoring the snapshot of the stopped app.

Expected results:
The haproxy gear should stay stopped after restoring the snapshot of the stopped app.

Additional info:
Michal, you worked most recently on the snapshot/restore app state logic; can you take a look?
Jakub is working on this one atm.
Lei Zhang, the secondary gear doesn't contain HAProxy, only the cartridge with the application. The problem is that upon restore the secondary gear is not stopped.
Jakub Hadvig, yes, correct! I have updated the title. Thank you!
Lei Zhang: Is this a regression? Did this work before?
Michal Fojtik, no, STG also has this issue.

[lijun@ray Work]$ rhc snapshot save myperl510s
Pulling down a snapshot of application 'myperl510s' to myperl510s.tar.gz ... done

[lijun@ray Work]$ rhc app show --gear -a myperl510s
ID                       State   Cartridges            Size  SSH URL
------------------------ ------- --------------------- ----- ------------------------------------
538710dcdbd93cce240019b7 stopped haproxy-1.4 perl-5.10 small 538710dcdbd93cce240019b7.rhcloud.com
5387112f2587c84434000f59 stopped haproxy-1.4 perl-5.10 small 5387112f2587c84434000f59.rhcloud.com

[lijun@ray Work]$ rhc snapshot restore -a myperl510s -f myperl510s.tar.gz
Restoring from snapshot myperl510s.tar.gz to application 'myperl510s' ... done

[lijun@ray Work]$ rhc app show --gear -a myperl510s
ID                       State   Cartridges            Size  SSH URL
------------------------ ------- --------------------- ----- ------------------------------------
538710dcdbd93cce240019b7 stopped haproxy-1.4 perl-5.10 small 538710dcdbd93cce240019b7.rhcloud.com
5387112f2587c84434000f59 started haproxy-1.4 perl-5.10 small 5387112f2587c84434000f59.rhcloud.com
The problem here is that the application is scaled up when you do the restore, so there are actually *2* gears that need to be restored, but we run the restore only on the primary gear. So we have (at least ;-) two options here:

1) Scale down prior to the restore (e.g. in snapshots.rb#prepare_for_restore) and then, AFTER the restore succeeds, scale back up. I see this as too much work for a 'bug' fix, and if we want to go this way I would prefer to move this into Trello.

2) Tell users to scale their apps down before they run the restore. In other words, update the documentation. In that case the restore will do the right thing.

Dan, Ben -> Thoughts?
Marking this as UpcomingRelease as this is not a regression.
We do restore all child gears in addition to the head gear. The issue is that child web gears aren't being stopped like they should. My guess is that https://github.com/openshift/origin-server/blob/master/node/lib/openshift-origin-node/model/application_container_ext/snapshots.rb#L209, which issues a stop if the pre-restore state was stopped, only applies to the current gear, and it's not a full app stop (all gears).
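To make the suspected failure mode concrete, here is a minimal self-contained sketch. The `Gear` struct and `restore_app` method are hypothetical stand-ins, not the actual origin-server code: restoring brings each gear's cartridges up, and the "was it stopped before?" check re-issues a stop only on the current (head) gear rather than performing a full app stop, so child web gears come out started.

```ruby
# Hypothetical sketch (NOT origin-server code) of the behavior described
# above: restore starts every gear, then re-applies the recorded
# pre-restore state only on the head gear.
Gear = Struct.new(:id, :state)

def restore_app(gears, pre_restore_state)
  # Restoring a gear starts its cartridges.
  gears.each { |g| g.state = 'started' }

  # The post-restore stop is issued only for the current (head) gear,
  # not as a full application stop across all gears -- this is the bug.
  gears.first.state = 'stopped' if pre_restore_state == 'stopped'
end

gears = [Gear.new('head', 'stopped'), Gear.new('child', 'stopped')]
restore_app(gears, 'stopped')
gears.map { |g| [g.id, g.state] }
# => [["head", "stopped"], ["child", "started"]]
```

This mirrors the transcripts above, where the head gear is stopped again after restore while the secondary gear is left started.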
And just for clarity, we don't need to scale down prior to restore, or tell users to scale down and then scale up. The only bug here is that child web gears that are supposed to be stopped post-restore (because the app was stopped) end up started.
Commit pushed to master at https://github.com/openshift/origin-server

https://github.com/openshift/origin-server/commit/da7e8c83fa197cf7765a410e3b163d5b7b9ef16d
Bug 1101499: Stopping secondary gear after restore snapshot for scaleable app
Retested on devenv_4835: after restoring the snapshot, neither the first nor the secondary gear stays stopped.

[rayzhang@ray Work]$ rhc app show --gear -a perl510s
ID                               State   Cartridges            Size  SSH URL
-------------------------------- ------- --------------------- ----- --------------------------------------------
562740b6eb2411e3be089ee8f0736ec1 stopped mysql-5.5             small 562740b6eb2411e3be089ee8f0736ec1.rhcloud.com
538dcfd05ccb4368b0000001         stopped haproxy-1.4 perl-5.10 small 538dcfd05ccb4368b0000001.rhcloud.com
3708042aeb2611e3be089ee8f0736ec1 stopped haproxy-1.4 perl-5.10 small 3708042aeb2611e3be089ee8f0736ec1.rhcloud.com

[rayzhang@ray Work]$ rhc snapshot restore -a perl510s -f perl510s.tar.gz
Restoring from snapshot perl510s.tar.gz to application 'perl510s' ... done

[rayzhang@ray Work]$ rhc app show --gear -a perl510s
ID                               State   Cartridges            Size  SSH URL
-------------------------------- ------- --------------------- ----- --------------------------------------------
562740b6eb2411e3be089ee8f0736ec1 stopped mysql-5.5             small 562740b6eb2411e3be089ee8f0736ec1.rhcloud.com
538dcfd05ccb4368b0000001         started haproxy-1.4 perl-5.10 small 538dcfd05ccb4368b0000001.rhcloud.com
3708042aeb2611e3be089ee8f0736ec1 started haproxy-1.4 perl-5.10 small 3708042aeb2611e3be089ee8f0736ec1.rhcloud.com
Moving back to ON_QA. I tested several scenarios, including those mentioned in the bug description and the one in Comment#12, and all of them were working. My devenv version was also 4835, so I don't understand where the fault is, because all the gears were in the same state as before the restoration process. Please re-test.
This issue can still be reproduced on devenv_4838 if the app embeds a db cartridge.

Reproduce steps:
1. Create a perl scalable app with a mysql-5.5 db cartridge
   # rhc app create myperl510s perl-5.10 -s mysql-5.5
2. Set the min scaling value to 2
   # rhc cartridge scale -a myperl510s -c perl-5.10 --min 2
3. Stop the app and save a snapshot
   # rhc app stop myperl510s
   # rhc snapshot save myperl510s
   # rhc app show --gear -a myperl510s
4. Restore the snapshot and check the gear states
   # rhc snapshot restore -a myperl510s -f myperl510s.tar.gz
   # rhc app show --gear -a myperl510s

[rayzhang@ray Work]$ rhc app create mypls perl-5.10 -s mysql-5.1
Application Options
-------------------
Domain:     rgcbmq
Cartridges: perl-5.10, mysql-5.1
Gear Size:  default
Scaling:    yes

Creating application 'mypls' ... done

MySQL 5.1 database added. Please make note of these credentials:

   Root User: adminJQvcTnS
   Root Password: pt9y62BV8HsT
   Database Name: mypls

Connection URL: mysql://$OPENSHIFT_MYSQL_DB_HOST:$OPENSHIFT_MYSQL_DB_PORT/

You can manage your new MySQL database by also embedding phpmyadmin.
The phpmyadmin username and password will be the same as the MySQL credentials above.

Waiting for your DNS name to be available ... done

Cloning into 'mypls'...
Warning: Permanently added 'mypls-rgcbmq.dev.rhcloud.com' (RSA) to the list of known hosts.

Your application 'mypls' is now available.

  URL:        http://mypls-rgcbmq.dev.rhcloud.com/
  SSH to:     538ef2d73a6cd3d09e00029a.rhcloud.com
  Git remote: ssh://538ef2d73a6cd3d09e00029a.rhcloud.com/~/git/mypls.git/
  Cloned to:  /home/rayzhang/Work/mypls

Run 'rhc show-app mypls' for more details about your app.
[rayzhang@ray Work]$ rhc app show --gear -a mypls
ID                       State   Cartridges            Size  SSH URL
------------------------ ------- --------------------- ----- ------------------------------------
897780718054832236134400 started mysql-5.1             small 897780718054832236134400.rhcloud.com
538ef2d73a6cd3d09e00029a started haproxy-1.4 perl-5.10 small 538ef2d73a6cd3d09e00029a.rhcloud.com

[rayzhang@ray Work]$ rhc cartridge scale -a mypls -c perl-5.10 --min 2
This operation will run until the application is at the minimum scale and may take several minutes.
Setting scale range for perl-5.10 ... done

perl-5.10 (Perl 5.10)
---------------------
  Scaling: x2 (minimum: 2, maximum: available) on small gears

[rayzhang@ray Work]$ rhc app show --gear -a mypls
ID                       State   Cartridges            Size  SSH URL
------------------------ ------- --------------------- ----- ------------------------------------
897780718054832236134400 started mysql-5.1             small 897780718054832236134400.rhcloud.com
538ef2d73a6cd3d09e00029a started haproxy-1.4 perl-5.10 small 538ef2d73a6cd3d09e00029a.rhcloud.com
538ef36e3a6cd3009600002e started haproxy-1.4 perl-5.10 small 538ef36e3a6cd3009600002e.rhcloud.com

[rayzhang@ray Work]$ rhc app stop -a mypls
RESULT:
mypls stopped

[rayzhang@ray Work]$ rhc app show --gear -a mypls
ID                       State   Cartridges            Size  SSH URL
------------------------ ------- --------------------- ----- ------------------------------------
897780718054832236134400 stopped mysql-5.1             small 897780718054832236134400.rhcloud.com
538ef2d73a6cd3d09e00029a stopped haproxy-1.4 perl-5.10 small 538ef2d73a6cd3d09e00029a.rhcloud.com
538ef36e3a6cd3009600002e stopped haproxy-1.4 perl-5.10 small 538ef36e3a6cd3009600002e.rhcloud.com

[rayzhang@ray Work]$ rhc snapshot save -a mypls
Pulling down a snapshot of application 'mypls' to mypls.tar.gz ...
done

[rayzhang@ray Work]$ rhc snapshot restore -a mypls -f mypls.tar.gz
Restoring from snapshot mypls.tar.gz to application 'mypls' ... done

[rayzhang@ray Work]$ rhc app show --gear -a mypls
ID                       State   Cartridges            Size  SSH URL
------------------------ ------- --------------------- ----- ------------------------------------
897780718054832236134400 stopped mysql-5.1             small 897780718054832236134400.rhcloud.com
538ef2d73a6cd3d09e00029a started haproxy-1.4 perl-5.10 small 538ef2d73a6cd3d09e00029a.rhcloud.com
538ef36e3a6cd3009600002e started haproxy-1.4 perl-5.10 small 538ef36e3a6cd3009600002e.rhcloud.com
Commit pushed to master at https://github.com/openshift/origin-server

https://github.com/openshift/origin-server/commit/e0daa900c1d55a5bfe74287661adb178ef1680ec
Bug 1101499: Adjusting logic of gear state restoration
Verified on devenv_4843; this issue has been fixed.

[rayzhang@ray Work]$ rhc app show --gear -a myperl510s
ID                       State   Cartridges            Size  SSH URL
------------------------ ------- --------------------- ----- ------------------------------------
539066fdad2cc30063000025 started mysql-5.5             small 539066fdad2cc30063000025.rhcloud.com
539066fdad2cc380b90001f0 started haproxy-1.4 perl-5.10 small 539066fdad2cc380b90001f0.rhcloud.com

[rayzhang@ray Work]$ rhc cartridge scale -a myperl510s -c perl-5.10 --min 2
This operation will run until the application is at the minimum scale and may take several minutes.
Setting scale range for perl-5.10 ... done

perl-5.10 (Perl 5.10)
---------------------
  Scaling: x2 (minimum: 2, maximum: available) on small gears

[rayzhang@ray Work]$ rhc app show --gear -a myperl510s
ID                       State   Cartridges            Size  SSH URL
------------------------ ------- --------------------- ----- ------------------------------------
539066fdad2cc30063000025 started mysql-5.5             small 539066fdad2cc30063000025.rhcloud.com
539066fdad2cc380b90001f0 started haproxy-1.4 perl-5.10 small 539066fdad2cc380b90001f0.rhcloud.com
53906870ad2cc30083000070 started haproxy-1.4 perl-5.10 small 53906870ad2cc30083000070.rhcloud.com

[rayzhang@ray Work]$ rhc app stop -a myperl510s
RESULT:
myperl510s stopped

[rayzhang@ray Work]$ rhc app show --gear -a myperl510s
ID                       State   Cartridges            Size  SSH URL
------------------------ ------- --------------------- ----- ------------------------------------
539066fdad2cc30063000025 stopped mysql-5.5             small 539066fdad2cc30063000025.rhcloud.com
539066fdad2cc380b90001f0 stopped haproxy-1.4 perl-5.10 small 539066fdad2cc380b90001f0.rhcloud.com
53906870ad2cc30083000070 stopped haproxy-1.4 perl-5.10 small 53906870ad2cc30083000070.rhcloud.com

[rayzhang@ray Work]$ rhc snapshot save myperl510s
Pulling down a snapshot of application 'myperl510s' to myperl510s.tar.gz ...
done

[rayzhang@ray Work]$ rhc app show --gear -a myperl510s
ID                       State   Cartridges            Size  SSH URL
------------------------ ------- --------------------- ----- ------------------------------------
539066fdad2cc30063000025 stopped mysql-5.5             small 539066fdad2cc30063000025.rhcloud.com
539066fdad2cc380b90001f0 stopped haproxy-1.4 perl-5.10 small 539066fdad2cc380b90001f0.rhcloud.com
53906870ad2cc30083000070 stopped haproxy-1.4 perl-5.10 small 53906870ad2cc30083000070.rhcloud.com

[rayzhang@ray Work]$ rhc snapshot restore -a myperl510s -f myperl510s.tar.gz
Restoring from snapshot myperl510s.tar.gz to application 'myperl510s' ... done

[rayzhang@ray Work]$ rhc app show --gear -a myperl510s
ID                       State   Cartridges            Size  SSH URL
------------------------ ------- --------------------- ----- ------------------------------------
539066fdad2cc30063000025 stopped mysql-5.5             small 539066fdad2cc30063000025.rhcloud.com
539066fdad2cc380b90001f0 stopped haproxy-1.4 perl-5.10 small 539066fdad2cc380b90001f0.rhcloud.com
53906870ad2cc30083000070 stopped haproxy-1.4 perl-5.10 small 53906870ad2cc30083000070.rhcloud.com