# rpm -qR python-octavia-2.0.0-1.el7.noarch | grep jinja2
python2-jinja2 >= 2.8

According to the Octavia developers, version 2.10 is actually required. They claim that Red Hat should be distributing 2.10 because that's what's specified in the "upper-constraints.txt" for the Queens release, even though requirements.txt for Octavia specifies >= 2.8.

In any case, with the 2.8.1 version, Octavia fails to create a loadbalancer listener:

2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/server.py", line 163, in _process_incoming
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server     res = self.dispatcher.dispatch(message)
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/dispatcher.py", line 220, in dispatch
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server     return self._do_dispatch(endpoint, method, ctxt, args)
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/dispatcher.py", line 190, in _do_dispatch
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server     result = func(ctxt, **new_args)
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/octavia/controller/queue/endpoint.py", line 68, in create_listener
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server     self.worker.create_listener(listener_id)
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/octavia/controller/worker/controller_worker.py", line 206, in create_listener
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server     create_listener_tf.run()
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/taskflow/engines/action_engine/engine.py", line 247, in run
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server     for _state in self.run_iter(timeout=timeout):
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/taskflow/engines/action_engine/engine.py", line 340, in run_iter
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server     failure.Failure.reraise_if_any(er_failures)
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/taskflow/types/failure.py", line 336, in reraise_if_any
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server     failures[0].reraise()
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/taskflow/types/failure.py", line 343, in reraise
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server     six.reraise(*self._exc_info)
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/taskflow/engines/action_engine/executor.py", line 53, in _execute_task
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server     result = task.execute(**arguments)
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/octavia/controller/worker/tasks/amphora_driver_tasks.py", line 56, in execute
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server     self.amphora_driver.update(listener, loadbalancer.vip)
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/octavia/amphorae/drivers/haproxy/rest_api_driver.py", line 76, in update
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server     user_group=CONF.haproxy_amphora.user_group)
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/octavia/common/jinja/haproxy/jinja_cfg.py", line 101, in build_config
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server     socket_path=socket_path)
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/octavia/common/jinja/haproxy/jinja_cfg.py", line 146, in render_loadbalancer_obj
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server     constants=constants)
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/jinja2/environment.py", line 989, in render
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server     return self.environment.handle_exception(exc_info, True)
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/jinja2/environment.py", line 754, in handle_exception
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server     reraise(exc_type, exc_value, tb)
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/octavia/common/jinja/haproxy/templates/base.j2", line 32, in template
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server     {% set found_ns.found = true %}
2018-03-05 22:30:06.730 22264 ERROR oslo_messaging.rpc.server TemplateSyntaxError: expected token 'end of statement block', got '.'
> According to the Octavia developers, version 2.10 is actually required. They claim that Red Hat should be distributing 2.10 because that's what's specified in the "upper-constraints.txt" for the Queens release, even though requirements.txt for Octavia specifies >= 2.8.

If 2.10 is required, the minimum version must be bumped. That was done in https://review.openstack.org/549913 for master, but it won't be backported to stable/queens.
Alex, right. The only way to fix it is by bumping the jinja2 version in the octavia .spec (see bz#1552448). Please correct me if I'm wrong.
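For the record, a minimal sketch of what that .spec bump would look like (the existing Requires value comes from the rpm -qR output above; its exact form and location in python-octavia.spec are assumptions):

-Requires:   python2-jinja2 >= 2.8
+Requires:   python2-jinja2 >= 2.10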
s/Alex/Alan/ :)
Please make sure the patch lands in both the RDO rpm-master and queens-rdo branches.
rpm-master: https://review.rdoproject.org/r/12808
queens-rdo: https://review.rdoproject.org/r/12810
Those patches are linked in bz#1552448 (a depends-on of this bz).
It appears that both remaining RDO patches have merged. Are we tracking any other patches before this BZ can move to POST?
All patches have been merged. Note, however, that the python2-jinja2 package distributed in http://mirror.centos.org/centos/7/cloud/x86_64/openstack-queens/ is still python2-jinja2-2.8.1-1.el7.
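A quick way to compare what a given node has installed against what the enabled repos actually offer (standard yum/rpm commands, nothing Octavia-specific):

yum --showduplicates list python2-jinja2
rpm -q python2-jinja2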
Verified with openstack-octavia-2.0.0-0.20180226092801.9f379ae.0rc2.el7ost and python-jinja2-2.10-2.el7.
It depends on whether we are able to install the required image manually after installation. In general, once we have a workaround, it won't be considered a blocker. Carlos, can you provide us with a way to work around this issue?
Toni, we discussed that yesterday with QA (Alex and Noam). The workaround is to manually install the newer jinja version either via RPM or pip. Alex keeps a doc of issues, workarounds and patches.
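For anyone landing here, a sketch of that workaround as discussed in this thread (the package version comes from the comments above; the octavia-worker unit name is an assumption and will differ on containerized deployments):

# Option 1: RPM, if a repo carrying >= 2.10 (e.g. python-jinja2-2.10-2.el7) is enabled
yum update python2-jinja2
rpm -q python2-jinja2

# Option 2: pip
pip install 'Jinja2>=2.10'

# Restart the Octavia worker so it imports the new library
systemctl restart octavia-worker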
Body: None
Response - Headers: {'status': '204', 'content-location': 'http://192.168.24.11:35357/v3/projects/10366eaf81fe4acca9c204968354d25c', u'vary': 'X-Auth-Token', u'server': 'Apache', u'connection': 'close', u'date': 'Thu, 15 Mar 2018 14:10:49 GMT', u'content-type': 'text/plain', u'x-openstack-request-id': 'req-703d8c5e-3b9f-4dda-a884-d63a2476efe1'}
Body: _log_request_full tempest/lib/common/rest_client.py:434

Ran 1 test in 309.333s

OK

With the workaround provided by Bernard and co. :)
Since the problem described in this bug report should be resolved in a recent advisory, it has been closed with a resolution of ERRATA. For information on the advisory, and where to find the updated files, follow the link below. If the solution does not work for you, open a new bug report. https://access.redhat.com/errata/RHEA-2018:2086