Bug 1551821 - Octavia requires jinja2 2.10
Summary: Octavia requires jinja2 2.10
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: Red Hat OpenStack
Classification: Red Hat
Component: openstack-octavia
Version: 13.0 (Queens)
Hardware: All
OS: Linux
Severity: high
Priority: high
Target Milestone: beta
Target Release: 13.0 (Queens)
Assignee: Carlos Goncalves
QA Contact: Alexander Stafeyev
URL:
Whiteboard:
Depends On: 1552192 1552448
Blocks: 1433523
 
Reported: 2018-03-06 00:32 UTC by iain MacDonnell
Modified: 2019-09-10 14:11 UTC
CC List: 8 users

Fixed In Version: openstack-octavia-2.0.0-0.20180226092801.9f379ae.0rc2.el7ost
Doc Type: If docs needed, set a value
Doc Text:
Clone Of:
Environment:
Last Closed: 2018-06-27 13:46:51 UTC
Target Upstream Version:


Attachments


Links
System ID Priority Status Summary Last Updated
OpenStack gerrit 519059 None master: MERGED requirements: Updated from generate-constraints (I60b21b3a5bb23ffa196a32f3aeec9499a658d661) 2018-03-12 15:43:15 UTC
OpenStack gerrit 549913 None master: MERGED requirements: Bump jinja2 minimum to 2.10 (required by Octavia) (Ia1c1f7255be298f75e95be6230a099a7c54ad127) 2018-03-12 15:43:08 UTC
RDO 12808 None rpm-master: NEW openstack/octavia-distgit: Bump python2-jinja2 to 2.10 minimum (I98f349b27ec47dcd9bd2af08296f329603a860e9) 2018-03-07 16:12:01 UTC
RDO 12810 None queens-rdo: NEW openstack/octavia-distgit: Bump python2-jinja2 to 2.10 minimum (I98f349b27ec47dcd9bd2af08296f329603a860e9) 2018-03-07 16:11:48 UTC
Red Hat Product Errata RHEA-2018:2086 None None None 2018-06-27 13:48:04 UTC

Description iain MacDonnell 2018-03-06 00:32:53 UTC
# rpm -qR python-octavia-2.0.0-1.el7.noarch | grep jinja2
python2-jinja2 >= 2.8
# 

According to the Octavia developers, version 2.10 is actually required. They claim that Red Hat should be distributing 2.10 because that's what's specified in the "upper-constraints.txt" for the Queens release, even though requirements.txt for Octavia specifies >= 2.8.

In any case, with the 2.8.1 version, Octavia fails to create a loadbalancer listener:

2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/server.py", line 163, in _process_incoming
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server     res = self.dispatcher.dispatch(message)
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/dispatcher.py", line 220, in dispatch
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server     return self._do_dispatch(endpoint, method, ctxt, args)
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/dispatcher.py", line 190, in _do_dispatch
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server     result = func(ctxt, **new_args)
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/octavia/controller/queue/endpoint.py", line 68, in create_listener
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server     self.worker.create_listener(listener_id)
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/octavia/controller/worker/controller_worker.py", line 206, in create_listener
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server     create_listener_tf.run()
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/taskflow/engines/action_engine/engine.py", line 247, in run
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server     for _state in self.run_iter(timeout=timeout):
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/taskflow/engines/action_engine/engine.py", line 340, in run_iter
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server     failure.Failure.reraise_if_any(er_failures)
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/taskflow/types/failure.py", line 336, in reraise_if_any
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server     failures[0].reraise()
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/taskflow/types/failure.py", line 343, in reraise
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server     six.reraise(*self._exc_info)
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/taskflow/engines/action_engine/executor.py", line 53, in _execute_task
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server     result = task.execute(**arguments)
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/octavia/controller/worker/tasks/amphora_driver_tasks.py", line 56, in execute
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server     self.amphora_driver.update(listener, loadbalancer.vip)
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/octavia/amphorae/drivers/haproxy/rest_api_driver.py", line 76, in update
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server     user_group=CONF.haproxy_amphora.user_group)
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/octavia/common/jinja/haproxy/jinja_cfg.py", line 101, in build_config
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server     socket_path=socket_path)
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/octavia/common/jinja/haproxy/jinja_cfg.py", line 146, in render_loadbalancer_obj
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server     constants=constants)
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/jinja2/environment.py", line 989, in render
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server     return self.environment.handle_exception(exc_info, True)
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/jinja2/environment.py", line 754, in handle_exception
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server     reraise(exc_type, exc_value, tb)
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/octavia/common/jinja/haproxy/templates/base.j2", line 32, in template
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server     {% set found_ns.found = true %}
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server TemplateSyntaxError: expected token 'end of statement block', got '.'
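The failing line, `{% set found_ns.found = true %}`, uses attribute assignment on a `namespace()` object, a Jinja2 feature introduced in 2.10 that lets a template mutate a variable across loop scopes. Jinja2 2.8.x cannot parse it, hence the TemplateSyntaxError. A minimal sketch of the pattern (not taken from Octavia's templates) that renders only on Jinja2 >= 2.10:

```python
from jinja2 import Template

# namespace() carries state across for-loop scope boundaries;
# "{% set ns.found = ... %}" is a syntax error on Jinja2 < 2.10.
tmpl = Template(
    "{% set ns = namespace(found=false) %}"
    "{% for item in items %}"
    "{% if item == target %}{% set ns.found = true %}{% endif %}"
    "{% endfor %}"
    "{{ ns.found }}"
)
print(tmpl.render(items=["a", "b"], target="b"))  # prints "True" on Jinja2 >= 2.10
```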

Comment 1 Alan Pevec 2018-03-07 11:45:49 UTC
> According to the Octavia developers, version 2.10 is actually required. They claim that Red Hat should be distributing 2.10 because that's what's specified in the "upper-constraints.txt" for the Queens release, even though requirements.txt for Octavia specifies >= 2.8.

If it is required, the minimum version must be bumped. This was done in https://review.openstack.org/549913 for master, but it won't be backported to stable/queens.

Comment 2 Carlos Goncalves 2018-03-07 11:53:28 UTC
Alex, right. The only way to fix it is by bumping the jinja2 minimum version in the octavia .spec (see bz#1552448). Please correct me if I'm wrong.

Comment 3 Carlos Goncalves 2018-03-07 11:53:57 UTC
s/Alex/Alan/ :)

Comment 4 Jon Schlueter 2018-03-07 13:50:26 UTC
Please make sure the patch is merged in both the RDO rpm-master and queens-rdo branches.

Comment 5 Carlos Goncalves 2018-03-07 15:59:18 UTC
rpm-master: https://review.rdoproject.org/r/12808
queens-rdo: https://review.rdoproject.org/r/12810

Those patches are linked in bz#1552448 (a depends-on of this bz).

Comment 7 Jon Schlueter 2018-03-09 21:32:19 UTC
It appears that both remaining RDO patches have merged. Are we tracking any other patches before this BZ can move to POST?

Comment 8 Carlos Goncalves 2018-03-09 21:51:46 UTC
All patches have been merged.

Note that the python2-jinja2 distributed in http://mirror.centos.org/centos/7/cloud/x86_64/openstack-queens/ is still python2-jinja2-2.8.1-1.el7.

Comment 10 Jon Schlueter 2018-03-12 15:49:22 UTC
openstack-octavia-2.0.0-0.20180226092801.9f379ae.0rc2.el7ost and python-jinja2-2.10-2.el7

Comment 12 Toni Freger 2018-03-14 11:33:27 UTC
It depends on whether we are able to install the required image manually after installation. In general, once we have a workaround, it won't be considered a blocker.

Carlos, can you provide us way to workaround this issue?

Comment 13 Carlos Goncalves 2018-03-14 12:40:52 UTC
Toni, we discussed that yesterday with QA (Alex and Noam). The workaround is to manually install the newer jinja version either via RPM or pip. Alex keeps a doc of issues, workarounds and patches.
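A sketch of the version check behind that workaround (the "new enough" boundary comes from the comments above; the bug itself is on Python 2/el7, while this sketch uses the Python 3 stdlib `importlib.metadata`):

```python
# Octavia's haproxy templates need Jinja2 >= 2.10 for namespace() support,
# so check the installed distribution version before deploying.
from importlib.metadata import version

major, minor = (int(x) for x in version("jinja2").split(".")[:2])
if (major, minor) < (2, 10):
    print("Jinja2 is too old; upgrade via RPM or pip (e.g. pip install 'jinja2>=2.10')")
else:
    print("Jinja2 is new enough")
```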

Comment 14 Alexander Stafeyev 2018-03-15 14:11:48 UTC
        Body: None
    Response - Headers: {'status': '204', 'content-location': 'http://192.168.24.11:35357/v3/projects/10366eaf81fe4acca9c204968354d25c', u'vary': 'X-Auth-Token', u'server': 'Apache', u'connection': 'close', u'date': 'Thu, 15 Mar 2018 14:10:49 GMT', u'content-type': 'text/plain', u'x-openstack-request-id': 'req-703d8c5e-3b9f-4dda-a884-d63a2476efe1'}
        Body:  _log_request_full tempest/lib/common/rest_client.py:434

Ran 1 test in 309.333s
OK


With workaround provided by Bernard and co :)

Comment 18 errata-xmlrpc 2018-06-27 13:46:51 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory, and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHEA-2018:2086

