Bug 1608334 - No amphora image and octavia-ssh-key keypair are created during the deployment
Summary: No amphora image and octavia-ssh-key keypair are created during the deployment
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: Red Hat OpenStack
Classification: Red Hat
Component: openstack-tripleo-common
Version: 14.0 (Rocky)
Hardware: Unspecified
OS: Unspecified
Priority: urgent
Severity: urgent
Target Milestone: beta
Target Release: 14.0 (Rocky)
Assignee: Carlos Goncalves
QA Contact: Alexander Stafeyev
URL:
Whiteboard:
Depends On: 1640024
Blocks:
 
Reported: 2018-07-25 10:31 UTC by Alexander Stafeyev
Modified: 2023-09-14 04:32 UTC
CC: 12 users

Fixed In Version: openstack-tripleo-common-9.4.1-0.20181012010875.67bab16.el7ost openstack-tripleo-heat-templates-9.0.1-0.20181013060883.el7ost
Doc Type: If docs needed, set a value
Doc Text:
Clone Of:
Clones: 1615128
Environment:
Last Closed: 2019-01-11 11:50:58 UTC
Target Upstream Version:
Embargoed:




Links
System ID Private Priority Status Summary Last Updated
Launchpad 1786786 0 None None None 2018-08-13 14:20:33 UTC
OpenStack gerrit 591413 0 None MERGED Fix skip of octavia-undercloud Ansible role 2020-05-13 13:33:45 UTC
OpenStack gerrit 608449 0 None MERGED Fix skip of octavia-undercloud Ansible role 2020-05-13 13:33:45 UTC
Red Hat Product Errata RHEA-2019:0045 0 None None None 2019-01-11 11:51:13 UTC

Description Alexander Stafeyev 2018-07-25 10:31:55 UTC
I tried to create a load balancer, but it failed. Looking in the logs, I saw that there is no image for Octavia to use.

2018-07-25 10:20:42.397 22 ERROR oslo_messaging.rpc.server [-] Exception during message handling: ComputeBuildException: Failed to build compute instance due to: No Glance images are tagged with amphora-image tag.
2018-07-25 10:20:42.397 22 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
2018-07-25 10:20:42.397 22 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/server.py", line 163, in _process_incoming
2018-07-25 10:20:42.397 22 ERROR oslo_messaging.rpc.server     res = self.dispatcher.dispatch(message)
2018-07-25 10:20:42.397 22 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/dispatcher.py", line 265, in dispatch
2018-07-25 10:20:42.397 22 ERROR oslo_messaging.rpc.server     return self._do_dispatch(endpoint, method, ctxt, args)
2018-07-25 10:20:42.397 22 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/dispatcher.py", line 194, in _do_dispatch
2018-07-25 10:20:42.397 22 ERROR oslo_messaging.rpc.server     result = func(ctxt, **new_args)
2018-07-25 10:20:42.397 22 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/octavia/controller/queue/endpoint.py", line 44, in create_load_balancer
2018-07-25 10:20:42.397 22 ERROR oslo_messaging.rpc.server     self.worker.create_load_balancer(load_balancer_id)
2018-07-25 10:20:42.397 22 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/tenacity/__init__.py", line 214, in wrapped_f
2018-07-25 10:20:42.397 22 ERROR oslo_messaging.rpc.server     return self.call(f, *args, **kw)
2018-07-25 10:20:42.397 22 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/tenacity/__init__.py", line 295, in call
2018-07-25 10:20:42.397 22 ERROR oslo_messaging.rpc.server     start_time=start_time)
2018-07-25 10:20:42.397 22 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/tenacity/__init__.py", line 252, in iter
2018-07-25 10:20:42.397 22 ERROR oslo_messaging.rpc.server     return fut.result()
2018-07-25 10:20:42.397 22 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/concurrent/futures/_base.py", line 422, in result
2018-07-25 10:20:42.397 22 ERROR oslo_messaging.rpc.server     return self.__get_result()
2018-07-25 10:20:42.397 22 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/tenacity/__init__.py", line 298, in call
2018-07-25 10:20:42.397 22 ERROR oslo_messaging.rpc.server     result = fn(*args, **kwargs)
2018-07-25 10:20:42.397 22 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/octavia/controller/worker/controller_worker.py", line 319, in create_load_balancer
2018-07-25 10:20:42.397 22 ERROR oslo_messaging.rpc.server     create_lb_tf.run()
2018-07-25 10:20:42.397 22 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/taskflow/engines/action_engine/engine.py", line 247, in run
2018-07-25 10:20:42.397 22 ERROR oslo_messaging.rpc.server     for _state in self.run_iter(timeout=timeout):
2018-07-25 10:20:42.397 22 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/taskflow/engines/action_engine/engine.py", line 340, in run_iter
2018-07-25 10:20:42.397 22 ERROR oslo_messaging.rpc.server     failure.Failure.reraise_if_any(er_failures)
2018-07-25 10:20:42.397 22 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/taskflow/types/failure.py", line 336, in reraise_if_any
2018-07-25 10:20:42.397 22 ERROR oslo_messaging.rpc.server     failures[0].reraise()
2018-07-25 10:20:42.397 22 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/taskflow/types/failure.py", line 343, in reraise
2018-07-25 10:20:42.397 22 ERROR oslo_messaging.rpc.server     six.reraise(*self._exc_info)
2018-07-25 10:20:42.397 22 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/taskflow/engines/action_engine/executor.py", line 53, in _execute_task
2018-07-25 10:20:42.397 22 ERROR oslo_messaging.rpc.server     result = task.execute(**arguments)
2018-07-25 10:20:42.397 22 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/octavia/controller/worker/tasks/compute_tasks.py", line 144, in execute
2018-07-25 10:20:42.397 22 ERROR oslo_messaging.rpc.server     server_group_id=server_group_id, ports=ports)
2018-07-25 10:20:42.397 22 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/octavia/controller/worker/tasks/compute_tasks.py", line 98, in execute
2018-07-25 10:20:42.397 22 ERROR oslo_messaging.rpc.server     server_group_id=server_group_id)
2018-07-25 10:20:42.397 22 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/octavia/compute/drivers/nova_driver.py", line 160, in build
2018-07-25 10:20:42.397 22 ERROR oslo_messaging.rpc.server     raise exceptions.ComputeBuildException(fault=e)
2018-07-25 10:20:42.397 22 ERROR oslo_messaging.rpc.server ComputeBuildException: Failed to build compute instance due to: No Glance images are tagged with amphora-image tag.
2018-07-25 10:20:42.397 22 ERROR oslo_messaging.rpc.server
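For reference, the tag lookup that fails above can be reproduced from the CLI (a quick check, assuming admin credentials for the overcloud are sourced):

# List Glance images carrying the tag Octavia searches for;
# at least one amphora image is expected here.
$ openstack image list --tag amphora-image

On this deployment the command returns no rows, matching the error above.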

Comment 2 Carlos Goncalves 2018-07-25 10:43:44 UTC
In the unlikely event that the RPM package containing the amphora image cannot be installed, the installation should fail right away, so if it did not fail, I wonder what happened.

https://github.com/openstack/tripleo-common/blob/master/playbooks/roles/octavia-undercloud/tasks/main.yml#L13-L14
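For context, the role installs the image RPM with an ordinary package task, so a missing package should abort the play immediately. A minimal sketch of that pattern (the package name below is illustrative; the real task is at the link above):

# Sketch only: a failed install here fails the play,
# which is why the deployment should stop right away.
- name: Install amphora image package
  package:
    name: octavia-amphora-image  # illustrative package name
    state: present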

Could you please share the log from the Mistral octavia workflow?

Comment 3 Alexander Stafeyev 2018-07-25 11:36:54 UTC
(In reply to Carlos Goncalves from comment #2)
> In the unlikely event that the RPM package containing the amphora image
> cannot be installed, the installation should fail right away, so if it did
> not fail I wonder what happened.
> 
> https://github.com/openstack/tripleo-common/blob/master/playbooks/roles/
> octavia-undercloud/tasks/main.yml#L13-L14
> 
> Could you please share the log from the Mistral octavia workflow?

Which logs exactly do you need?

P.S. I will attach a sosreport soon.

Comment 5 Alexander Stafeyev 2018-07-25 11:43:26 UTC
We have the same issue with the keypair; no keypair is found:

/var/log/containers/octavia/worker.log:2018-07-25 11:35:31.046 22 ERROR oslo_messaging.rpc.server ComputeBuildException: Failed to build compute instance due to: Invalid key_name provided.
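The matching CLI check (a sketch; Nova keypairs are scoped per user, so to be conclusive this needs to run as the same user Octavia boots amphorae with):

# List keypairs visible to the current user;
# octavia-ssh-key is expected but absent here.
$ openstack keypair list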

Comment 6 Carlos Goncalves 2018-08-13 14:14:23 UTC
From /var/lib/mistral/4842bfd0-8a7c-4ff2-b008-e52c6ea9c405/octavia-ansible/octavia-ansible.log


2018-08-13 08:52:17,312 p=14240 u=mistral |   [WARNING]: Could not match supplied host pattern, ignoring: undercloud

2018-08-13 08:52:17,314 p=14240 u=mistral |  PLAY [undercloud[0]] ***********************************************************
2018-08-13 08:52:17,314 p=14240 u=mistral |  skipping: no hosts matched
2018-08-13 08:52:17,319 p=14240 u=mistral |  PLAY [octavia_nodes[0]] ********************************************************
2018-08-13 08:52:17,338 p=14240 u=mistral |  TASK [Gathering Facts] *********************************************************
2018-08-13 08:52:17,338 p=14240 u=mistral |  Monday 13 August 2018  08:52:17 -0400 (0:00:00.105)       0:00:00.105 *********
2018-08-13 08:52:24,123 p=14240 u=mistral |  ok: [controller-2]
[...]

The host pattern "undercloud[0]" could not be matched when running the playbook tripleo-common/playbooks/octavia-files.yaml [1]. That must be due to the migration of Octavia to external_deploy_tasks [2], which happened in the Rocky cycle.

So, the troublemaker was the renaming of 'undercloud' to 'Undercloud' in tripleo-heat-templates/docker/services/octavia/octavia-deployment-config.yaml.

[root@undercloud-0 mistral]# cat /var/lib/mistral/4842bfd0-8a7c-4ff2-b008-e52c6ea9c405/octavia-ansible/inventory.yaml
octavia_nodes:
  hosts:
    controller-2:
      ansible_user: tripleo-admin
      ansible_host: 192.168.24.19
      ansible_become: true
    controller-1:
      ansible_user: tripleo-admin
      ansible_host: 192.168.24.14
      ansible_become: true
    controller-0:
      ansible_user: tripleo-admin
      ansible_host: 192.168.24.16
      ansible_become: true
    
Undercloud:
  hosts:
    undercloud-0:
      ansible_host: localhost
      ansible_become: false
      ansible_connection: local


We need to update octavia-files.yaml to reflect that change. 
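The change itself is small: the play's host pattern has to match the capitalized group name now written to the inventory. A sketch of the intended fix (the exact play layout upstream may differ; see the Gerrit reviews linked in this bug for the merged version):

# playbooks/octavia-files.yaml
- hosts: Undercloud[0]    # was: undercloud[0], which no longer matches any group
  roles:
    - octavia-undercloud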

[1] https://github.com/openstack/tripleo-common/blob/336cd3c6364c8c9be88c840253db0d0b54336c75/playbooks/octavia-files.yaml#L2
[2] https://review.openstack.org/#/c/559374/

Comment 11 Nir Magnezi 2018-10-07 10:27:04 UTC
We are waiting for a W+1 (Gerrit workflow approval) for Carlos's patch: https://review.openstack.org/#/c/591413/

Comment 25 errata-xmlrpc 2019-01-11 11:50:58 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory, and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHEA-2019:0045

Comment 26 Red Hat Bugzilla 2023-09-14 04:32:04 UTC
The needinfo request[s] on this closed bug have been removed as they have been unresolved for 1000 days

