Bug 1012257 - Fail to move db(mysql/mongo/postgresql) gear for scale app with db cartridge
Status: CLOSED CURRENTRELEASE
Product: OpenShift Online
Classification: Red Hat
Component: Pod
Version: 2.x
Hardware: All
OS: All
Priority: high
Severity: high
Assigned To: Rob Millner
libra bugs
Depends On:
Blocks:
Reported: 2013-09-26 02:16 EDT by zhaozhanqi
Modified: 2015-05-14 20:21 EDT (History)
3 users

See Also:
Fixed In Version:
Doc Type: Bug Fix
Doc Text:
Story Points: ---
Clone Of:
Environment:
Last Closed: 2013-10-17 09:31:27 EDT
Type: Bug
Regression: ---


Attachments: None
Description zhaozhanqi 2013-09-26 02:16:42 EDT
Description of problem:
Given a scalable app with a DB cartridge (mysql/mongo/postgresql), moving any of the DB gears fails.

Version-Release number of selected component (if applicable):
devenv_3829


How reproducible:
always

Steps to Reproduce:
1. create multi-node env 
2. create one scalable app with mysql db
3. move mysql db gear to another node

Actual results:
oo-admin-move --gear_uuid 5243c8274977d10e26000011 -i ip-10-203-26-188
URL: http://zqpy27s-zqd.dev.rhcloud.com
Login: zzhao@redhat.com
App UUID: 5243c56b4977d1cab700005e
Gear UUID: 5243c8274977d1cab700009f
DEBUG: Source district uuid: 835207688557239203790848
DEBUG: Destination district uuid: 835207688557239203790848
DEBUG: Getting existing app 'zqpy27s' status before moving
DEBUG: Gear component 'python-2.7' was running
DEBUG: Stopping existing app cartridge 'mysql-5.1' before moving
DEBUG: Creating new account for gear '5243c8274977d10e26000011' on ip-10-203-26-188
DEBUG: Moving content for app 'zqpy27s', gear '5243c8274977d10e26000011' to ip-10-203-26-188
Identity added: /var/www/openshift/broker/config/keys/rsync_id_rsa (/var/www/openshift/broker/config/keys/rsync_id_rsa)
Agent pid 18103
unset SSH_AUTH_SOCK;
unset SSH_AGENT_PID;
echo Agent pid 18103 killed;
DEBUG: Moving system components for app 'zqpy27s', gear '5243c8274977d10e26000011' to ip-10-203-26-188
Identity added: /var/www/openshift/broker/config/keys/rsync_id_rsa (/var/www/openshift/broker/config/keys/rsync_id_rsa)
Agent pid 18244
unset SSH_AUTH_SOCK;
unset SSH_AGENT_PID;
echo Agent pid 18244 killed;
DEBUG: Moving failed.  Rolling back gear '5243c8274977d10e26000011' in 'zqpy27s' with delete on 'ip-10-203-26-188'
Node execution failure (invalid exit code from node)

Expected results:

The DB gear can be moved successfully.

Additional info:

development.log

2013-09-26 01:43:46.858 [DEBUG] DEBUG: rpc_client.custom_request('cartridge_do', {:cartridge=>"openshift-origin-node", :action=>"frontend-backup", :args=>{"--with-container-uuid"=>"337602544118831148695552", "--with-container-name"=>"337602544118831148695552", "--with-namespace"=>"zqd", "--cart-name"=>"openshift-origin-node"}}, ip-10-114-25-51, {'identity' => ip-10-114-25-51}) (Request ID: ) (pid:20673)
2013-09-26 01:43:46.195 [DEBUG] DEBUG: [#<MCollective::RPC::Result:0x000000074ff0b8 @agent="openshift", @action="cartridge_do", @results={:sender=>"ip-10-114-25-51", :statuscode=>1, :statusmsg=>"cartridge_do_action failed -1. Output undefined method `[]' for nil:NilClass", :data=>{:time=>nil, :output=>"undefined method `[]' for nil:NilClass", :exitcode=>-1}}>] (Request ID: ) (pid:20673)
2013-09-26 01:43:46.196 [DEBUG] DEBUG: MCollective Response Time (execute_direct: frontend-backup): 0.111704858s  (Request ID: ) (pid:20673)
2013-09-26 01:43:46.196 [DEBUG] DEBUG: server results: undefined method `[]' for nil:NilClass (pid:20673)
2013-09-26 01:43:46.266 [DEBUG] DEBUG: Moving failed.  Rolling back gear '337602544118831148695552' in 'zqpy27s' with delete on 'ip-10-203-26-188' (pid:20673)
2013-09-26 01:43:46.268 [DEBUG] DEBUG: rpc_client.custom_request('cartridge_do', {:cartridge=>"openshift-origin-node", :action=>"app-destroy", :args=>{"--with-app-uuid"=>"5243c56b4977d1cab700005e", "--with-app-name"=>"zqpy27s", "--with-container-uuid"=>"337602544118831148695552", "--with-container-name"=>"337602544118831148695552", "--with-namespace"=>"zqd", "--with-uid"=>5854, "--with-request-id"=>nil, "--skip-hooks"=>true, "--cart-name"=>"openshift-origin-node"}}, ip-10-203-26-188, {'identity' => ip-10-203-26-188}) (Request ID: ) (pid:20673)
2013-09-26 01:43:47.389 [DEBUG] DEBUG: [#<MCollective::RPC::Result:0x000000075db0e0 @agent="openshift", @action="cartridge_do", @results={:sender=>"ip-10-203-26-188", :statuscode=>0, :statusmsg=>"OK", :data=>{:time=>nil, :output=>"NOTIFY_ENDPOINT_DELETE: 50.16.106.241 62301\n", :exitcode=>0}}>] (Request ID: ) (pid:20673)
2013-09-26 01:43:47.391 [DEBUG] DEBUG: MCollective Response Time (execute_direct: app-destroy): 1.123196316s  (Request ID: ) (pid:20673)
2013-09-26 01:43:47.411 [DEBUG] DEBUG: rpc_client.custom_request('cartridge_do', {:cartridge=>"postgresql-8.4", :action=>"start", :args=>{"--with-app-uuid"=>"5243c56b4977d1cab700005e", "--with-app-name"=>"zqpy27s", "--with-container-uuid"=>"337602544118831148695552", "--with-container-name"=>"337602544118831148695552", "--with-namespace"=>"zqd", "--with-uid"=>5854, "--with-request-id"=>nil, "--cart-name"=>"postgresql-8.4", "--component-name"=>"postgresql-8.4", "--with-software-version"=>"8.4", "--cartridge-vendor"=>"redhat"}}, ip-10-114-25-51, {'identity' => ip-10-114-25-51}) (Request ID: ) (pid:20673)
2013-09-26 01:44:10.308 [DEBUG] DEBUG: [#<MCollective::RPC::Result:0x0000000740f9f0 @agent="openshift", @action="cartridge_do", @results={:sender=>"ip-10-114-25-51", :statuscode=>0, :statusmsg=>"OK", :data=>{:time=>nil, :output=>"Starting Postgres cartridge\nserver starting\nPostgres started\n", :exitcode=>0}}>] (Request ID: ) (pid:20673)
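[Editor's note] The log above shows the broker receiving an MCollective result whose `:data` carries `:exitcode=>-1`, which it reports as "Node execution failure (invalid exit code from node)" and answers by rolling the move back. A minimal sketch of that decision in Ruby (the method name and hash shapes here are invented for illustration, not broker source):

```ruby
# Hypothetical sketch: decide success/rollback from an MCollective result
# hash shaped like the ones logged above.
def node_action_succeeded?(result)
  data = result[:data] || {}
  data[:exitcode] == 0          # anything other than 0 triggers rollback
end

ok  = { :statuscode => 0, :data => { :exitcode => 0,  :output => "OK" } }
bad = { :statuscode => 1, :data => { :exitcode => -1,
        :output => "undefined method `[]' for nil:NilClass" } }

node_action_succeeded?(ok)    # => true
node_action_succeeded?(bad)   # => false; the broker rolls the move back
```

This matches the observed sequence: the `frontend-backup` call returns exitcode -1, so the broker issues `app-destroy` on the destination node and restarts the cartridge on the source.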
Comment 1 Abhishek Gupta 2013-09-26 17:20:39 EDT
Comments from Rob Millner: It looks like the FrontendHttpServer module's to_json call returned nil.  

He requested running "oo-devel-node -t frontend-to-json -c [uuid]" on the command line to see what happens.
Comment 2 Rob Millner 2013-09-26 19:36:49 EDT
don't worry about running "oo-devel-node  -t frontend-to-json -c [uuid]", found the issue.

irb(main):010:0> f.to_json
NoMethodError: undefined method `[]' for nil:NilClass
	from /opt/rh/ruby193/root/usr/share/gems/gems/openshift-origin-frontend-nodejs-websocket-0.1.6/lib/openshift/runtime/frontend/http/plugins/nodejs-websocket.rb:125:in `block in idle?'
	from /opt/rh/ruby193/root/usr/share/gems/gems/openshift-origin-frontend-apachedb-0.1.5/lib/openshift/runtime/frontend/http/plugins/apachedb.rb:259:in `open'
	from /opt/rh/ruby193/root/usr/share/gems/gems/openshift-origin-frontend-nodejs-websocket-0.1.6/lib/openshift/runtime/frontend/http/plugins/nodejs-websocket.rb:124:in `idle?'
	from /opt/rh/ruby193/root/usr/share/gems/gems/openshift-origin-node-1.15.4/lib/openshift-origin-node/model/frontend_httpd.rb:540:in `block in call_plugins'
	from /opt/rh/ruby193/root/usr/share/gems/gems/openshift-origin-node-1.15.4/lib/openshift-origin-node/model/frontend_httpd.rb:536:in `map'
	from /opt/rh/ruby193/root/usr/share/gems/gems/openshift-origin-node-1.15.4/lib/openshift-origin-node/model/frontend_httpd.rb:536:in `call_plugins'
	from /opt/rh/ruby193/root/usr/share/gems/gems/openshift-origin-node-1.15.4/lib/openshift-origin-node/model/frontend_httpd.rb:390:in `idle?'
	from /opt/rh/ruby193/root/usr/share/gems/gems/openshift-origin-node-1.15.4/lib/openshift-origin-node/model/frontend_httpd.rb:206:in `to_hash'
	from /opt/rh/ruby193/root/usr/share/gems/gems/openshift-origin-node-1.15.4/lib/openshift-origin-node/model/frontend_httpd.rb:215:in `to_json'
	from (irb):10
	from /opt/rh/ruby193/root/usr/bin/irb:12:in `<main>'
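[Editor's note] The traceback shows the `nodejs-websocket` plugin's `idle?` indexing into a value that turned out to be nil, so the `[]` call itself raised. A minimal Ruby sketch of that failure mode and the usual nil-guard, with hypothetical data shapes (this is not the plugin's actual code):

```ruby
# Hypothetical sketch: idle? looks up a per-gear entry and indexes into it.
# When the entry is missing, the lookup returns nil and nil["..."] raises
# NoMethodError, which is what aborts to_json above.

def idle_unguarded?(table, uuid)
  table[uuid]["idle"]           # raises NoMethodError when table[uuid] is nil
end

def idle_guarded?(table, uuid)
  entry = table[uuid]
  entry && entry["idle"]        # tolerate a missing entry instead of raising
end

table = { "gear-1" => { "idle" => true } }

idle_guarded?(table, "gear-1")  # => true
idle_guarded?(table, "gear-2")  # => nil

begin
  idle_unguarded?(table, "gear-2")
rescue NoMethodError => e
  puts e.message                # mentions the missing `[]`, as in the log above
end
```

The guard lets `call_plugins` / `to_hash` / `to_json` complete for gears the plugin has no record of, rather than failing the whole frontend-backup step.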
Comment 3 Rob Millner 2013-09-26 19:49:33 EDT
Pull request:
https://github.com/openshift/origin-server/pull/3718
Comment 4 zhaozhanqi 2013-09-27 05:36:16 EDT
Tested this bug on devenv_3838; it has been fixed.

[root@ip-10-147-189-146 ~]# oo-admin-move --gear_uuid 52454ecb9553e90a8e000010 -i ip-10-147-189-146
URL: http://zqphps-zqd.dev.rhcloud.com
Login: zzhao@redhat.com
App UUID: 524544b99553e9ec7300003c
Gear UUID: 52454ecb9553e9ec73000173
DEBUG: Source district uuid: 215149309514994275057664
DEBUG: Destination district uuid: 215149309514994275057664
Error moving gear.  Old and new servers are the same: ip-10-147-189-146
[root@ip-10-147-189-146 ~]# oo-admin-move --gear_uuid 52454ecb9553e90a8e000010 -i ip-10-185-23-32
URL: http://zqphps-zqd.dev.rhcloud.com
Login: zzhao@redhat.com
App UUID: 524544b99553e9ec7300003c
Gear UUID: 52454ecb9553e9ec73000173
DEBUG: Source district uuid: 215149309514994275057664
DEBUG: Destination district uuid: 215149309514994275057664
DEBUG: Getting existing app 'zqphps' status before moving
DEBUG: Gear component 'php-5.3' was running
DEBUG: Stopping existing app cartridge 'mongodb-2.2' before moving
DEBUG: Creating new account for gear '52454ecb9553e90a8e000010' on ip-10-185-23-32
DEBUG: Moving content for app 'zqphps', gear '52454ecb9553e90a8e000010' to ip-10-185-23-32
Identity added: /var/www/openshift/broker/config/keys/rsync_id_rsa (/var/www/openshift/broker/config/keys/rsync_id_rsa)
Agent pid 21929
unset SSH_AUTH_SOCK;
unset SSH_AGENT_PID;
echo Agent pid 21929 killed;
DEBUG: Moving system components for app 'zqphps', gear '52454ecb9553e90a8e000010' to ip-10-185-23-32
Identity added: /var/www/openshift/broker/config/keys/rsync_id_rsa (/var/www/openshift/broker/config/keys/rsync_id_rsa)
Agent pid 22358
unset SSH_AUTH_SOCK;
unset SSH_AGENT_PID;
echo Agent pid 22358 killed;
DEBUG: Starting cartridge 'mongodb-2.2' in 'zqphps' after move on ip-10-185-23-32
DEBUG: Fixing DNS and mongo for gear '52454ecb9553e90a8e000010' after move
DEBUG: Changing server identity of '52454ecb9553e90a8e000010' from 'ip-10-147-189-146' to 'ip-10-185-23-32'
DEBUG: Deconfiguring old app 'zqphps' on ip-10-147-189-146 after move
Successfully moved gear with uuid '52454ecb9553e90a8e000010' of app 'zqphps' from 'ip-10-147-189-146' to 'ip-10-185-23-32'
