Created attachment 629314 [details]
development.log

Description of problem:
Moving a jbossews app across districts always fails.

Version-Release number of selected component (if applicable):
devenv_2341

How reproducible:
Always

Steps to Reproduce:
1. Set up a multi-node env and create 2 districts
2. Create a jbossews app
3. Move the app across districts:
$ oo-admin-move --gear_uuid 08d0758b9d964ae584e5c7fe4c59a344 -i ip-10-122-65-201 --allow_change_district

Actual results:
[root@ip-10-122-78-176 lib]# oo-admin-move --gear_uuid 5efa7151a57447f8a96ed603d52edf9c -i ip-10-122-78-176 --allow_change_district
URL: http://q2jbossews-qgong8.dev.rhcloud.com
Login: qgong
App UUID: 5efa7151a57447f8a96ed603d52edf9c
Gear UUID: 5efa7151a57447f8a96ed603d52edf9c
DEBUG: Source district uuid: 5cc574cc4334450ea185eb2130491aee
DEBUG: Destination district uuid: 4ec1b9edb7f84c42a8869f07add1d4f0
DEBUG: Getting existing app 'q2jbossews' status before moving
DEBUG: Gear component 'jbossews-1.0' was running
DEBUG: Stopping existing app cartridge 'jbossews-1.0' before moving
DEBUG: Force stopping existing app cartridge 'jbossews-1.0' before moving
DEBUG: Reserved uid '1004' on district: '4ec1b9edb7f84c42a8869f07add1d4f0'
DEBUG: Creating new account for gear 'q2jbossews' on ip-10-122-78-176
DEBUG: Moving content for app 'q2jbossews', gear 'q2jbossews' to ip-10-122-78-176
Identity added: /var/www/openshift/broker/config/keys/rsync_id_rsa (/var/www/openshift/broker/config/keys/rsync_id_rsa)
Agent pid 17295
unset SSH_AUTH_SOCK;
unset SSH_AGENT_PID;
echo Agent pid 17295 killed;
DEBUG: Performing cartridge level move for 'jbossews-1.0' on ip-10-122-78-176
DEBUG: Starting cartridge 'jbossews-1.0' in 'q2jbossews' after move on ip-10-122-78-176
DEBUG: Moving failed. Rolling back gear 'q2jbossews' 'q2jbossews' with remove-httpd-proxy on 'ip-10-122-78-176'
DEBUG: Moving failed. Rolling back gear 'q2jbossews' in 'q2jbossews' with destroy on 'ip-10-122-78-176'
/usr/lib/ruby/gems/1.8/gems/openshift-origin-msg-broker-mcollective-0.4.5/lib/openshift-origin-msg-broker-mcollective/openshift/mcollective_application_container_proxy.rb:1324:in `run_cartridge_command_old': Node execution failure (invalid exit code from node). If the problem persists please contact Red Hat support. (OpenShift::NodeException)
    from /var/www/openshift/broker/lib/express/broker/mcollective_ext.rb:12:in `run_cartridge_command'
    from /usr/lib/ruby/gems/1.8/gems/openshift-origin-msg-broker-mcollective-0.4.5/lib/openshift-origin-msg-broker-mcollective/openshift/mcollective_application_container_proxy.rb:666:in `send'
    from /usr/lib/ruby/gems/1.8/gems/openshift-origin-msg-broker-mcollective-0.4.5/lib/openshift-origin-msg-broker-mcollective/openshift/mcollective_application_container_proxy.rb:666:in `move_gear_post'
    from /usr/lib/ruby/gems/1.8/gems/openshift-origin-msg-broker-mcollective-0.4.5/lib/openshift-origin-msg-broker-mcollective/openshift/mcollective_application_container_proxy.rb:658:in `each'
    from /usr/lib/ruby/gems/1.8/gems/openshift-origin-msg-broker-mcollective-0.4.5/lib/openshift-origin-msg-broker-mcollective/openshift/mcollective_application_container_proxy.rb:658:in `move_gear_post'
    from /usr/lib/ruby/gems/1.8/gems/openshift-origin-msg-broker-mcollective-0.4.5/lib/openshift-origin-msg-broker-mcollective/openshift/mcollective_application_container_proxy.rb:807:in `move_gear'
    from /usr/bin/oo-admin-move:111

Expected results:
The move succeeds.

Additional info:
Also fails for a scalable jbossews app.
When node exceptions occur, it would be very helpful to include the mcollective logs from the source and target nodes.
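Until the tool does that automatically, the logs have to be pulled by hand. A minimal sketch of a helper that prints the commands an admin could run against the source and target nodes; the log path is the common mcollective default (/var/log/mcollective.log), which is an assumption here and should be checked against each node's /etc/mcollective/server.cfg:

```shell
#!/bin/sh
# Hedged sketch, not part of oo-admin-move: emit the commands to pull
# recent mcollective log output from a set of nodes after a failed move.
# The log path is the usual mcollective default and may differ per node.
collect_log_cmds() {
    for node in "$@"; do
        echo "ssh root@${node} 'tail -n 200 /var/log/mcollective.log'"
    done
}

# Example: source and target nodes from the failed move above
collect_log_cmds ip-10-122-65-201 ip-10-122-78-176
```

Printing the commands rather than running them keeps the sketch safe to try; pipe the output to sh once the path is confirmed.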
This bug has been verified as fixed.

Verified build:
devenv_2360

Verified steps:
1. Set up a multi-node env and create 2 districts
2. Create a scalable jbossews app
3. Move the app across districts:
$ oo-admin-move --gear_uuid b6c38efc120f45d2bd57aa6d34fdc7b0 -i ip-10-196-130-224 --allow_change_district

Actual results:
# oo-admin-move --gear_uuid b6c38efc120f45d2bd57aa6d34fdc7b0 -i ip-10-196-130-224 --allow_change_district
URL: http://app1-joycezg1.dev.rhcloud.com
Login: jinzhang+1
App UUID: b6c38efc120f45d2bd57aa6d34fdc7b0
Gear UUID: b6c38efc120f45d2bd57aa6d34fdc7b0
DEBUG: Source district uuid: f802d095f7e94c1b9ed6e06e3e5e3cf5
DEBUG: Destination district uuid: 15ff954c2f544822ac5f1447857219c0
DEBUG: Getting existing app 'app1' status before moving
DEBUG: Gear component 'jbossews-1.0' was running
DEBUG: Stopping existing app cartridge 'haproxy-1.4' before moving
DEBUG: Stopping existing app cartridge 'jbossews-1.0' before moving
DEBUG: Force stopping existing app cartridge 'jbossews-1.0' before moving
DEBUG: Reserved uid '1001' on district: '15ff954c2f544822ac5f1447857219c0'
DEBUG: Creating new account for gear 'app1' on ip-10-196-130-224
DEBUG: Moving content for app 'app1', gear 'app1' to ip-10-196-130-224
Identity added: /var/www/openshift/broker/config/keys/rsync_id_rsa (/var/www/openshift/broker/config/keys/rsync_id_rsa)
Warning: Permanently added '10.196.82.159' (RSA) to the list of known hosts.
Warning: Permanently added '10.196.130.224' (RSA) to the list of known hosts.
Agent pid 27248
unset SSH_AUTH_SOCK;
unset SSH_AGENT_PID;
echo Agent pid 27248 killed;
DEBUG: Performing cartridge level move for 'jbossews-1.0' on ip-10-196-130-224
DEBUG: Performing cartridge level move for 'haproxy-1.4' on ip-10-196-130-224
DEBUG: Starting cartridge 'jbossews-1.0' in 'app1' after move on ip-10-196-130-224
DEBUG: Starting cartridge 'haproxy-1.4' in 'app1' after move on ip-10-196-130-224
DEBUG: Fixing DNS and mongo for gear 'app1' after move
DEBUG: Changing server identity of 'app1' from 'ip-10-196-82-159' to 'ip-10-196-130-224'
DEBUG: Deconfiguring old app 'app1' on ip-10-196-82-159 after move
Successfully moved 'app1' with gear uuid 'b6c38efc120f45d2bd57aa6d34fdc7b0' from 'ip-10-196-82-159' to 'ip-10-196-130-224'

The app also scales up correctly after being moved to the other district.