Bug 1304662 - "RuntimeError: Failed to update root password on appliance. Retries exhausted. Error: execution expired" when deploying CloudForms on OpenStack
Status: VERIFIED
Product: Red Hat Quickstart Cloud Installer
Classification: Red Hat
Component: Installation - CloudForms
Version: 1.0
Hardware/OS: Unspecified / Unspecified
Priority: unspecified  Severity: medium
Target Milestone: TP3
Target Release: 1.0
Assigned To: Jason Montleon
QA Contact: Tasos Papaioannou
Docs Contact: Dan Macpherson
Whiteboard: Triaged
Reported: 2016-02-04 04:46 EST by Antonin Pagac
Modified: 2016-04-29 12:19 EDT
CC: 6 users

Doc Type: Bug Fix
Type: Bug


Attachments:
log from "no route to host" rhci-20160204 iso (899.27 KB, text/plain), 2016-02-10 17:36 EST, Landon LaSmith
Description Antonin Pagac 2016-02-04 04:46:04 EST
Description of problem:
For this deployment I was re-using the undercloud from a previous RHCI install. I undeployed the Overcloud and left the nodes in place. CFME had previously been installed on the RHELOSP.
See the attached log files; the exception is at 07:49:43 in production.log. Note that I deployed RHEV as the first deployment and RHELOSP+CFME as the second, so you may notice bits from the RHEV deployment in deployment.log.

This is the Errors tab in the UI:

Action:

Actions::Fusor::Deployment::CloudForms::UpdateRootPassword

Input:

{"deployment_id"=>2, "locale"=>"en"}

Output:

{}

Exception:

RuntimeError: Failed to update root password on appliance. Retries exhausted. Error: execution expired

Backtrace:

/opt/rh/ruby193/root/usr/share/gems/gems/fusor_server-0.0.1/app/lib/actions/fusor/deployment/cloud_forms/update_root_password.rb:73:in `run'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/action.rb:487:in `block (3 levels) in execute_run'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/middleware/stack.rb:26:in `call'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/middleware/stack.rb:26:in `pass'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/middleware.rb:16:in `pass'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/action/progress.rb:30:in `with_progress_calculation'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/action/progress.rb:16:in `run'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/middleware/stack.rb:22:in `call'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/middleware/stack.rb:26:in `pass'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/middleware.rb:16:in `pass'
/opt/rh/ruby193/root/usr/share/gems/gems/katello-2.2.0.73/app/lib/actions/middleware/keep_locale.rb:23:in `block in run'
/opt/rh/ruby193/root/usr/share/gems/gems/katello-2.2.0.73/app/lib/actions/middleware/keep_locale.rb:34:in `with_locale'
/opt/rh/ruby193/root/usr/share/gems/gems/katello-2.2.0.73/app/lib/actions/middleware/keep_locale.rb:23:in `run'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/middleware/stack.rb:22:in `call'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/middleware/world.rb:30:in `execute'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/action.rb:486:in `block (2 levels) in execute_run'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/action.rb:485:in `catch'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/action.rb:485:in `block in execute_run'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/action.rb:402:in `call'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/action.rb:402:in `block in with_error_handling'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/action.rb:402:in `catch'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/action.rb:402:in `with_error_handling'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/action.rb:480:in `execute_run'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/action.rb:262:in `execute'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/execution_plan/steps/abstract_flow_step.rb:9:in `block (2 levels) in execute'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/execution_plan/steps/abstract.rb:155:in `call'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/execution_plan/steps/abstract.rb:155:in `with_meta_calculation'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/execution_plan/steps/abstract_flow_step.rb:8:in `block in execute'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/execution_plan/steps/abstract_flow_step.rb:22:in `open_action'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/execution_plan/steps/abstract_flow_step.rb:7:in `execute'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/executors/parallel/worker.rb:20:in `block in on_message'
/opt/rh/ruby193/root/usr/share/gems/gems/algebrick-0.4.0/lib/algebrick.rb:859:in `block in assigns'
/opt/rh/ruby193/root/usr/share/gems/gems/algebrick-0.4.0/lib/algebrick.rb:858:in `tap'
/opt/rh/ruby193/root/usr/share/gems/gems/algebrick-0.4.0/lib/algebrick.rb:858:in `assigns'
/opt/rh/ruby193/root/usr/share/gems/gems/algebrick-0.4.0/lib/algebrick.rb:138:in `match_value'
/opt/rh/ruby193/root/usr/share/gems/gems/algebrick-0.4.0/lib/algebrick.rb:116:in `block in match'
/opt/rh/ruby193/root/usr/share/gems/gems/algebrick-0.4.0/lib/algebrick.rb:115:in `each'
/opt/rh/ruby193/root/usr/share/gems/gems/algebrick-0.4.0/lib/algebrick.rb:115:in `match'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/executors/parallel/worker.rb:17:in `on_message'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/micro_actor.rb:82:in `on_envelope'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/micro_actor.rb:72:in `receive'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/micro_actor.rb:99:in `block (2 levels) in run'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/micro_actor.rb:99:in `loop'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/micro_actor.rb:99:in `block in run'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/micro_actor.rb:99:in `catch'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/micro_actor.rb:99:in `run'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/micro_actor.rb:13:in `block in initialize'
/opt/rh/ruby193/root/usr/share/gems/gems/logging-1.8.1/lib/logging/diagnostic_context.rb:323:in `call'
/opt/rh/ruby193/root/usr/share/gems/gems/logging-1.8.1/lib/logging/diagnostic_context.rb:323:in `block in create_with_logging_context'
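
The "execution expired" text is the default message of Ruby's Timeout::Error, which the fusor action evidently catches and re-wraps once its retries run out. A minimal sketch of that pattern (the method name, retry count, and per-attempt timeout below are illustrative assumptions, not the actual update_root_password.rb code):

```ruby
require 'timeout'

# Sketch of a retry-until-exhausted loop around a per-attempt timeout.
# max_retries and per_attempt_timeout are illustrative, not the real values.
def update_with_retries(max_retries: 3, per_attempt_timeout: 10)
  attempts = 0
  begin
    attempts += 1
    Timeout.timeout(per_attempt_timeout) do
      yield # e.g. ssh to the appliance and change the root password
    end
  rescue Timeout::Error => e
    retry if attempts < max_retries
    # After the last attempt, re-wrap with the message seen in this bug;
    # e.message is "execution expired" for a plain Timeout::Error.
    raise RuntimeError,
          "Failed to update root password on appliance. " \
          "Retries exhausted. Error: #{e.message}"
  end
end
```

If the appliance never answers on the network (the "no route to host" case later in this bug), every attempt hits the timeout and the wrapped RuntimeError is what surfaces in the Dynflow task.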


Version-Release number of selected component (if applicable):
RHCI-6.0-RHEL-7-20160201.0-RHCI-x86_64-dvd1.iso
RHCIOOO-7-RHEL-7-20160127.0-RHCIOOO-x86_64-dvd1.iso

How reproducible:
Happened to me once

Steps to Reproduce:
1. Take an Undercloud from previous RHCI installation
2. Undeploy the Overcloud. Leave the discovered nodes there
3. In new RHCI installation, do a CFME+RHELOSP deployment using this Undercloud
4. Error appears during CFME installation

Actual results:
Error appears during CFME installation

Expected results:
No error appears; password is updated successfully; CFME install finishes

Additional info:
Comment 3 Antonin Pagac 2016-02-05 05:07:39 EST
Reproduced following the reproducer steps using the same composes.
Comment 4 Antonin Pagac 2016-02-05 08:05:58 EST
Reproduced using freshly installed newer RHCI and freshly installed RHCIOOO. I'll look for network issues in my setup.

RHCI-6.0-RHEL-7-20160204.t.1-RHCI-x86_64-dvd1.iso
RHCIOOO-7-RHEL-7-20160127.0-RHCIOOO-x86_64-dvd1.iso
Comment 5 Landon LaSmith 2016-02-05 09:06:44 EST
I was able to reproduce the same error and backtrace on a fresh libvirt Sat/OSP iso install while deploying overcloud + CFME.

RHCI-6.0-RHEL-7-20160201.0-RHCI-x86_64-dvd1.iso
RHCIOOO-7-RHEL-7-20160127.0-RHCIOOO-x86_64-dvd1.iso
Comment 6 John Matthews 2016-02-05 09:11:48 EST
Landon/Antonin,

When you see this error, did you confirm that the CFME VM is up?
Are you able to ssh into the CFME VM as 'root' with password 'smartvm'?

Also check that you are able to ssh into the CFME VM from the Satellite.
Networking issues could be blocking the Satellite from reaching the CFME VM.

I suspect that something went wrong with creating the CFME VM, or that a networking issue is preventing ssh to the network the CFME VM is on.

Please help gather some more info so we can better understand the problem.
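
To script the reachability half of that check (is anything listening on the appliance's port 22 at all, before blaming the password step), a small Ruby probe can stand in for the manual ssh/ping. A sketch; the host passed in would be the CFME VM's address, which is deployment-specific:

```ruby
require 'socket'

# Returns true if a TCP connection to host:port succeeds within the timeout,
# false on refusal, unreachability, or timeout (all SystemCallError subclasses,
# e.g. Errno::ECONNREFUSED, Errno::EHOSTUNREACH, Errno::ETIMEDOUT).
# A true result for port 22 means ssh is at least listening; it does not
# validate credentials.
def reachable?(host, port, timeout: 5)
  Socket.tcp(host, port, connect_timeout: timeout) { true }
rescue SystemCallError
  false
end
```

A "No route to host - connect(2)" error like the one in comment 11 would show up here as an immediate false.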
Comment 7 Thom Carlin 2016-02-05 11:35:00 EST
Also see "RuntimeError: Failed to update root password on appliance. Unhandled error: Authentication failed for user root@" on a fresh baremetal when deploying RHEL-OSP + CFME

TP2 RC7

RHCI-6.0-RHEL-7-20160204.0
RHCIOOO-7-RHEL-7-20160127.0
Comment 8 Thom Carlin 2016-02-05 12:58:27 EST
/opt/rh/ruby193/root/usr/share/gems/gems/fusor_server-0.0.1/app/lib/actions/fusor/deployment/cloud_forms/update_root_password.rb:75:in `run'

Investigating the Overcloud:
2016-02-05 11:06:18.712 16075 WARNING nova.virt.libvirt.driver [req-f7c9d11f-9ad1-417e-bca4-40cddbad7c45 - - - - -] Cannot update service status on host "overcloud-compute-0.localdomain" since it is not registered.
2016-02-05 11:06:18.948 16075 WARNING nova.compute.resource_tracker [req-f7c9d11f-9ad1-417e-bca4-40cddbad7c45 - - - - -] No compute node record for overcloud-compute-0.localdomain:overcloud-compute-0.localdomain

[...]
2016-02-05 11:08:46.562 16075 WARNING nova.compute.manager [req-e8316a50-fb7b-4c7b-8963-de97c97727d3 - - - - -] Bandwidth usage not supported by hypervisor.
2016-02-05 11:11:44.600 16075 ERROR nova.compute.manager [req-8dd7fba1-86c3-4fc8-8e75-de1ac458c443 08930a34218f48afa5ba78a0184f900b db45320b486a4d55ac00c71867200a44 - - -] [instance: f1826c22-e951-43ef-a5ef-43076405e11c] Instance failed to spawn
2016-02-05 11:11:44.600 16075 TRACE nova.compute.manager [instance: f1826c22-e951-43ef-a5ef-43076405e11c] Traceback (most recent call last):
2016-02-05 11:11:44.600 16075 TRACE nova.compute.manager [instance: f1826c22-e951-43ef-a5ef-43076405e11c]   File "/usr/lib/python2.7/site-packages/nova/compute/manager.py", line 2474, in _build_resources
2016-02-05 11:11:44.600 16075 TRACE nova.compute.manager [instance: f1826c22-e951-43ef-a5ef-43076405e11c]     yield resources
2016-02-05 11:11:44.600 16075 TRACE nova.compute.manager [instance: f1826c22-e951-43ef-a5ef-43076405e11c]   File "/usr/lib/python2.7/site-packages/nova/compute/manager.py", line 2346, in _build_and_run_instance
2016-02-05 11:11:44.600 16075 TRACE nova.compute.manager [instance: f1826c22-e951-43ef-a5ef-43076405e11c]     block_device_info=block_device_info)
2016-02-05 11:11:44.600 16075 TRACE nova.compute.manager [instance: f1826c22-e951-43ef-a5ef-43076405e11c]   File "/usr/lib/python2.7/site-packages/nova/virt/libvirt/driver.py", line 2460, in spawn
2016-02-05 11:11:44.600 16075 TRACE nova.compute.manager [instance: f1826c22-e951-43ef-a5ef-43076405e11c]     write_to_disk=True)
2016-02-05 11:11:44.600 16075 TRACE nova.compute.manager [instance: f1826c22-e951-43ef-a5ef-43076405e11c]   File "/usr/lib/python2.7/site-packages/nova/virt/libvirt/driver.py", line 4322, in _get_guest_xml
2016-02-05 11:11:44.600 16075 TRACE nova.compute.manager [instance: f1826c22-e951-43ef-a5ef-43076405e11c]     context)
2016-02-05 11:11:44.600 16075 TRACE nova.compute.manager [instance: f1826c22-e951-43ef-a5ef-43076405e11c]   File "/usr/lib/python2.7/site-packages/nova/virt/libvirt/driver.py", line 4193, in _get_guest_config
2016-02-05 11:11:44.600 16075 TRACE nova.compute.manager [instance: f1826c22-e951-43ef-a5ef-43076405e11c]     flavor, virt_type)
2016-02-05 11:11:44.600 16075 TRACE nova.compute.manager [instance: f1826c22-e951-43ef-a5ef-43076405e11c]   File "/usr/lib/python2.7/site-packages/nova/virt/libvirt/vif.py", line 374, in get_config
2016-02-05 11:11:44.600 16075 TRACE nova.compute.manager [instance: f1826c22-e951-43ef-a5ef-43076405e11c]     _("Unexpected vif_type=%s") % vif_type)
2016-02-05 11:11:44.600 16075 TRACE nova.compute.manager [instance: f1826c22-e951-43ef-a5ef-43076405e11c] NovaException: Unexpected vif_type=binding_failed
2016-02-05 11:11:44.600 16075 TRACE nova.compute.manager [instance: f1826c22-e951-43ef-a5ef-43076405e11c] 
2016-02-05 11:17:14.352 16075 WARNING nova.compute.manager [req-e8316a50-fb7b-4c7b-8963-de97c97727d3 - - - - -] While synchronizing instance power states, found 1 instances in the database and 0 instances on the hypervisor.
2016-02-05 11:27:15.331 16075 WARNING nova.compute.manager [req-e8316a50-fb7b-4c7b-8963-de97c97727d3 - - - - -] While synchronizing instance power states, found 1 instances in the database and 0 instances on the hypervisor.
2016-02-05 11:37:42.437 16075 WARNING nova.compute.manager [req-e8316a50-fb7b-4c7b-8963-de97c97727d3 - - - - -] While synchronizing instance power states, found 1 instances in the database and 0 instances on the hypervisor.
2016-02-05 11:48:11.331 16075 WARNING nova.compute.manager [req-e8316a50-fb7b-4c7b-8963-de97c97727d3 - - - - -] While synchronizing instance power states, found 1 instances in the database and 0 instances on the hypervisor.
2016-02-05 11:58:17.350 16075 WARNING nova.compute.manager [req-e8316a50-fb7b-4c7b-8963-de97c97727d3 - - - - -] While synchronizing instance power states, found 1 instances in the database and 0 instances on the hypervisor.
2016-02-05 12:00:29.524 16075 ERROR oslo_messaging.rpc.dispatcher [req-f2bae3eb-8caa-4375-b659-c7b9648a3ad7 08930a34218f48afa5ba78a0184f900b 1c3d243392324e5190fcccb16476b38b - - -] Exception during message handling: Instance f1826c22-e951-43ef-a5ef-43076405e11c in power_state 0. Cannot get_diagnostics while the instance is in this state.
Comment 9 Thom Carlin 2016-02-05 17:40:05 EST
2016-02-05 11:17:38 [I] Completed 201 Created in 248ms (Views: 11.4ms | ActiveRecord: 0.0ms)
2016-02-05 11:20:32 [W] Failed to get IP for <deployment_name>-<host>: The specified wait_for timeout (600 seconds) was exceeded
/opt/rh/ruby193/root/usr/share/gems/gems/fog-core-1.24.0/lib/fog/core/wait_for.rb:11:in `wait_for'
/opt/rh/ruby193/root/usr/share/gems/gems/fog-core-1.24.0/lib/fog/core/model.rb:65:in `wait_for'
/usr/share/foreman/app/models/concerns/orchestration/compute.rb:136:in `setComputeIP'
/usr/share/foreman/app/models/concerns/orchestration.rb:137:in `execute'
/usr/share/foreman/app/models/concerns/orchestration.rb:85:in `block in process'
/usr/share/foreman/app/models/concerns/orchestration.rb:77:in `each'
/usr/share/foreman/app/models/concerns/orchestration.rb:77:in `process'
/usr/share/foreman/app/models/concerns/orchestration.rb:18:in `on_save'
/opt/rh/ruby193/root/usr/share/gems/gems/activesupport-3.2.8/lib/active_support/callbacks.rb:671:in `_run__4035441742913024033__save__3631947971391872791__callbacks'
/opt/rh/ruby193/root/usr/share/gems/gems/activesupport-3.2.8/lib/active_support/callbacks.rb:405:in `__run_callback'
/opt/rh/ruby193/root/usr/share/gems/gems/activesupport-3.2.8/lib/active_support/callbacks.rb:385:in `_run_save_callbacks'
/opt/rh/ruby193/root/usr/share/gems/gems/activesupport-3.2.8/lib/active_support/callbacks.rb:81:in `run_callbacks'
/opt/rh/ruby193/root/usr/share/gems/gems/activerecord-3.2.8/lib/active_record/callbacks.rb:264:in `create_or_update'
/opt/rh/ruby193/root/usr/share/gems/gems/activerecord-3.2.8/lib/active_record/persistence.rb:84:in `save'
/opt/rh/ruby193/root/usr/share/gems/gems/activerecord-3.2.8/lib/active_record/validations.rb:50:in `save'
/opt/rh/ruby193/root/usr/share/gems/gems/activerecord-3.2.8/lib/active_record/attribute_methods/dirty.rb:22:in `save'
/opt/rh/ruby193/root/usr/share/gems/gems/activerecord-3.2.8/lib/active_record/transactions.rb:241:in `block (2 levels) in save'
/opt/rh/ruby193/root/usr/share/gems/gems/activerecord-3.2.8/lib/active_record/transactions.rb:295:in `block in with_transaction_returning_status'
/opt/rh/ruby193/root/usr/share/gems/gems/activerecord-3.2.8/lib/active_record/connection_adapters/abstract/database_statements.rb:192:in `transaction'
/opt/rh/ruby193/root/usr/share/gems/gems/activerecord-3.2.8/lib/active_record/transactions.rb:208:in `transaction'
/opt/rh/ruby193/root/usr/share/gems/gems/activerecord-3.2.8/lib/active_record/transactions.rb:293:in `with_transaction_returning_status'
/opt/rh/ruby193/root/usr/share/gems/gems/activerecord-3.2.8/lib/active_record/transactions.rb:241:in `block in save'
/opt/rh/ruby193/root/usr/share/gems/gems/activerecord-3.2.8/lib/active_record/transactions.rb:252:in `rollback_active_record_state!'
/opt/rh/ruby193/root/usr/share/gems/gems/activerecord-3.2.8/lib/active_record/transactions.rb:240:in `save'
/usr/share/foreman/app/models/concerns/foreman/sti.rb:29:in `save_with_type'
/opt/rh/ruby193/root/usr/share/gems/gems/activerecord-3.2.8/lib/active_record/persistence.rb:45:in `create'
/usr/share/foreman/app/models/host.rb:16:in `method_missing'
/opt/rh/ruby193/root/usr/share/gems/gems/fusor_server-0.0.1/app/lib/actions/fusor/deployment/openstack/cfme_launch.rb:103:in `create_host'
/opt/rh/ruby193/root/usr/share/gems/gems/fusor_server-0.0.1/app/lib/actions/fusor/deployment/openstack/cfme_launch.rb:35:in `run'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/action.rb:487:in `block (3 levels) in execute_run'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/middleware/stack.rb:26:in `call'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/middleware/stack.rb:26:in `pass'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/middleware.rb:16:in `pass'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/action/progress.rb:30:in `with_progress_calculation'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/action/progress.rb:16:in `run'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/middleware/stack.rb:22:in `call'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/middleware/stack.rb:26:in `pass'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/middleware.rb:16:in `pass'
/opt/rh/ruby193/root/usr/share/gems/gems/katello-2.2.0.73/app/lib/actions/middleware/keep_locale.rb:23:in `block in run'
/opt/rh/ruby193/root/usr/share/gems/gems/katello-2.2.0.73/app/lib/actions/middleware/keep_locale.rb:34:in `with_locale'
/opt/rh/ruby193/root/usr/share/gems/gems/katello-2.2.0.73/app/lib/actions/middleware/keep_locale.rb:23:in `run'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/middleware/stack.rb:22:in `call'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/middleware/world.rb:30:in `execute'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/action.rb:486:in `block (2 levels) in execute_run'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/action.rb:485:in `catch'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/action.rb:485:in `block in execute_run'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/action.rb:402:in `call'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/action.rb:402:in `block in with_error_handling'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/action.rb:402:in `catch'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/action.rb:402:in `with_error_handling'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/action.rb:480:in `execute_run'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/action.rb:262:in `execute'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/execution_plan/steps/abstract_flow_step.rb:9:in `block (2 levels) in execute'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/execution_plan/steps/abstract.rb:155:in `call'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/execution_plan/steps/abstract.rb:155:in `with_meta_calculation'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/execution_plan/steps/abstract_flow_step.rb:8:in `block in execute'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/execution_plan/steps/abstract_flow_step.rb:22:in `open_action'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/execution_plan/steps/abstract_flow_step.rb:7:in `execute'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/executors/parallel/worker.rb:20:in `block in on_message'
/opt/rh/ruby193/root/usr/share/gems/gems/algebrick-0.4.0/lib/algebrick.rb:859:in `block in assigns'
/opt/rh/ruby193/root/usr/share/gems/gems/algebrick-0.4.0/lib/algebrick.rb:858:in `tap'
/opt/rh/ruby193/root/usr/share/gems/gems/algebrick-0.4.0/lib/algebrick.rb:858:in `assigns'
/opt/rh/ruby193/root/usr/share/gems/gems/algebrick-0.4.0/lib/algebrick.rb:138:in `match_value'
/opt/rh/ruby193/root/usr/share/gems/gems/algebrick-0.4.0/lib/algebrick.rb:116:in `block in match'
/opt/rh/ruby193/root/usr/share/gems/gems/algebrick-0.4.0/lib/algebrick.rb:115:in `each'
/opt/rh/ruby193/root/usr/share/gems/gems/algebrick-0.4.0/lib/algebrick.rb:115:in `match'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/executors/parallel/worker.rb:17:in `on_message'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/micro_actor.rb:82:in `on_envelope'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/micro_actor.rb:72:in `receive'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/micro_actor.rb:99:in `block (2 levels) in run'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/micro_actor.rb:99:in `loop'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/micro_actor.rb:99:in `block in run'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/micro_actor.rb:99:in `catch'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/micro_actor.rb:99:in `run'
/opt/rh/ruby193/root/usr/share/gems/gems/dynflow-0.7.7.9/lib/dynflow/micro_actor.rb:13:in `block in initialize'
/opt/rh/ruby193/root/usr/share/gems/gems/logging-1.8.1/lib/logging/diagnostic_context.rb:323:in `call'
/opt/rh/ruby193/root/usr/share/gems/gems/logging-1.8.1/lib/logging/diagnostic_context.rb:323:in `block in create_with_logging_context'
2016-02-05 11:20:32 [W] Rolling back due to a problem: [Acquire IP address for <deployment_name>-<host>	 3	 failed	 [#<Host::Managed id: nil, name: "<deployment_name>-<host>", ip: nil, last_compile: nil, last_freshcheck: nil, last_report: nil, updated_at: nil, source_file_id: nil, created_at: nil, mac: nil, root_pass: "$1$hunter2", serial: nil, puppet_status: 0, domain_id: 1, architecture_id: 1, operatingsystem_id: 1, environment_id: 3, subnet_id: nil, ptable_id: nil, medium_id: nil, build: true, comment: nil, disk: nil, installed_at: nil, model_id: nil, hostgroup_id: nil, owner_id: 3, owner_type: "User", enabled: true, puppet_ca_proxy_id: nil, managed: true, use_image: nil, image_file: nil, uuid: nil, compute_resource_id: 1, puppet_proxy_id: nil, certname: nil, image_id: 1, organization_id: 1, location_id: 2, type: "Host::Managed", otp: nil, realm_id: nil, compute_profile_id: 4, provision_method: "image", content_source_id: nil, primary_interface: nil, grub_pass: "$1$hunter2", discovery_rule_id: nil, content_view_id: nil, lifecycle_environment_id: nil>, :setComputeIP]]
2016-02-05 11:20:32 [I] Removing Compute instance for <deployment_name>-<host>
2016-02-05 11:20:32 [I] Updating CFME password...
2016-02-05 11:20:36 [E] Probable error. Will we retry? false. Error message: Authentication failed for user root@
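
The "specified wait_for timeout (600 seconds) was exceeded" line earlier in this log comes from fog-core's generic polling helper, which Foreman uses while waiting for the new instance to report an IP. A simplified version of that polling pattern (not fog's actual implementation; only the 600-second default and the message text are taken from the log):

```ruby
# Poll a block until it returns truthy or the deadline passes.
# Modeled loosely on fog-core's wait_for; interval is an illustrative default.
def wait_for(timeout: 600, interval: 1)
  deadline = Time.now + timeout
  until yield
    if Time.now >= deadline
      raise "The specified wait_for timeout (#{timeout} seconds) was exceeded"
    end
    sleep interval
  end
  true
end
```

When the instance never gets an address (here, because the port's VIF binding failed on the compute node), the block never returns truthy, the 600 seconds elapse, and Foreman rolls the host back as shown above.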
Comment 10 Antonin Pagac 2016-02-10 13:22:03 EST
John,

In response to comment 6: in the Overcloud UI I see the status of the CFME VM is 'Running', but I cannot ping the VM from the Satellite or the Undercloud.
Comment 11 Landon LaSmith 2016-02-10 17:36 EST
Created attachment 1122982 [details]
log from "no route to host" rhci-20160204 iso

John,

Re comment 6: During CFME deployment I received "Failed to update root password on appliance. Unhandled error: No route to host - connect(2)". Before and after the failure, the Satellite was able to ping the overcloud on the OSP provisioning and public networks, but not the CFME machine, even though it is currently running.

TP2 RC7

RHCI-6.0-RHEL-7-20160204.0
RHCIOOO-7-RHEL-7-20160127.0
Comment 12 Antonin Pagac 2016-02-11 11:25:59 EST
I'm now seeing this in my rather unusual 7th deployment from the same satellite using the same undercloud:

2016-02-11 11:05:49 [I] waiting for instance to acquire ip address
2016-02-11 11:05:49 [I] Removing Compute instance for apagac-2-all-cfme.cfme.lab.eng.rdu2.redhat.com
2016-02-11 11:05:50 [W] ip 192.168.253.2 is already used by apagac-all-cfme.cfme.lab.eng.rdu2.redhat.com
2016-02-11 11:05:50 [W] Rolling back due to a problem: [Query instance details for apagac-2-all-cfme.cfme.lab.eng.rdu2.redhat.com 4 failed [#, :setComputeDetails]]
2016-02-11 11:05:50 [I] Removing Compute instance for apagac-2-all-cfme.cfme.lab.eng.rdu2.redhat.com
2016-02-11 11:05:50 [I] Updating CFME password...
2016-02-11 11:05:51 [I] Processing by Fusor::Api::V21::DeploymentsController#log as JSON
2016-02-11 11:05:51 [I] Parameters: {"log_type"=>"foreman_log", "api_version"=>"v21", "id"=>"7", "deployment"=>{}}
2016-02-11 11:05:52 [I] Completed 200 OK in 648ms (Views: 505.3ms | ActiveRecord: 0.4ms)
2016-02-11 11:05:52 [E] Password was not updated. Error: execution expired
2016-02-11 11:05:52 [I] UpdateRootPassword will retry again in 30 seconds.

This is from the '/usr/share/foreman/log/deployments/apagac-2-all-7/var/log/foreman/production.log' viewed using UI.

I think this means the installation encountered a leftover CFME instance that I forgot to delete and aborted, but did not exit; instead it keeps trying to update the password on a non-existent CFME instance, or on the leftover stuck appliance.

Leftover: "apagac-all-cfme.cfme.lab.eng.rdu2.redhat.com"
Current: "apagac-2-all-cfme.cfme.lab.eng.rdu2.redhat.com"
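
The "ip 192.168.253.2 is already used by apagac-all-cfme..." warning is Foreman detecting that the address acquired by the new instance still belongs to the leftover host record from the earlier deployment. The collision check amounts to something like this (a toy illustration, not Foreman's code; the host name and IP are taken from the log above):

```ruby
# Given existing host records (name => ip), return the name of the host that
# already owns a candidate IP, or nil if the address is free.
def ip_conflict(existing_hosts, candidate_ip)
  existing_hosts.find { |_name, ip| ip == candidate_ip }&.first
end
```

Which suggests the stale apagac-all-cfme host record has to be cleaned up before the address can be reused.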
Comment 13 Thom Carlin 2016-02-18 17:10:40 EST
For TP2 RC9, reusing the TripleO environment, I see in the Overcloud nova-compute.log:
WARNING nova.virt.libvirt.driver [req-uuid hex_string hex_string - - -] Timeout waiting for vif plugging callback for instance uuid
2016-02-18 16:50:04.766 16081 WARNING nova.virt.libvirt.imagecache [req-uuid - - - - -] Unknown base file: /var/lib/nova/instances/_base/hex_string

qemu-img info /var/lib/nova/instances/_base/hex_string
image: hex_string
file format: raw
virtual size: 40G (42949672960 bytes)
disk size: 1.8G

The file is labeled virt_content_t, but restorecon changes it to nova_var_lib_t.
Comment 14 Mike McCune 2016-03-28 18:47:43 EDT
This bug was accidentally moved from POST to MODIFIED via an error in automation; please contact mmccune@redhat.com with any questions.
Comment 15 Thom Carlin 2016-03-31 13:51:31 EDT
For TP3 RC2, this occurred on a *new* TripleO environment.
Comment 16 Thom Carlin 2016-03-31 16:26:23 EDT
Looks like in my case the external bridge is missing the interface, so jmatthews' question in https://bugzilla.redhat.com/show_bug.cgi?id=1304662#c6 was spot on.

Investigating, it seems that way back in /usr/libexec/os-refresh-config/configure.d/20-os-net-config, NET_CONFIG was not set.
Comment 17 Thom Carlin 2016-03-31 16:30:33 EDT
The external bridge is OK on the controller node.
It's the compute node that is problematic.
Comment 18 Thom Carlin 2016-03-31 17:02:18 EDT
os-apply-config --key os_net_config --type raw --key-default ''

returns blank on the compute [problem] node, but returns values on the controller node.

For compute node, /var/lib/os-collect-config/os_config_files.json contains:
[
  "/var/lib/os-collect-config/heat_local.json",
  "/var/lib/os-collect-config/ec2.json",
  "/var/lib/os-collect-config/cfn.json",
  "/var/lib/os-collect-config/overcloud-CephClusterConfig-t7skqpno3o66-CephClusterConfigImpl-jyfrkq3nwlbu.json",
  "/var/lib/os-collect-config/overcloud-Compute-vgihnewqc7yt-0-syimclnvoknl-NetworkConfig-yiqxvhunk2cu-OsNetConfigImpl-xfqkezqf776f.json",
  "/var/lib/os-collect-config/overcloud-Compute-vgihnewqc7yt-0-syimclnvoknl-NovaComputeConfig-iynqqr4qaomz.json",
  "/var/lib/os-collect-config/overcloud-allNodesConfig-lkht3ycob2dx-allNodesConfigImpl-tiybjggcazrd.json"
]

"os_net_config" is *not* defined in any of these files but *is* found in the equivalent files on Controller node.

51-hosts says "No hosts in Heat, nothing written." on the problem node.

No errors in Heat on the Director that I've seen.
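
The check in this comment (which of the os-collect-config files defines a top-level os_net_config key) can be scripted rather than eyeballed across nodes. A sketch, assuming the file list from os_config_files.json above; the sample data in the usage below is made up:

```ruby
require 'json'

# Return the subset of collected-config JSON files that define the given
# top-level key (e.g. "os_net_config"). Files whose top level is not an
# object (like os_config_files.json itself, which is an array) are skipped.
def files_defining_key(paths, key)
  paths.select do |path|
    data = JSON.parse(File.read(path))
    data.is_a?(Hash) && data.key?(key)
  end
end
```

Run against the lists on both nodes, an empty result on the compute node and a non-empty one on the controller would confirm the asymmetry described above.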
Comment 19 John Matthews 2016-04-11 17:51:20 EDT
Moving to ON_QA as per Thom/Sudhir

Please test against TP3 RC#3
Comment 20 Tasos Papaioannou 2016-04-20 13:15:41 EDT
Verified on QCI-1.0-RHEL-7-20160411.t.0 + QCIOOO-7-RHEL-7-20160408.t.1.

Successfully deployed CFME on OSP.
