(From LP directly) It looks like non-standalone (?) jobs are failing with the same Mistral-related error, with a trace like the one below; there are many examples like [1][2]:

2018-12-28 09:08:09 | "INFO: Undercloud post - Mistral workbooks configured successfully.",
2018-12-28 09:08:09 | "ERROR: Undercloud Post - Failed.",
2018-12-28 09:08:09 | "",
2018-12-28 09:08:09 | "[2018-12-28 09:08:06,120] (heat-config) [DEBUG] Traceback (most recent call last):",
2018-12-28 09:08:09 | "  File \"/var/lib/heat-config/heat-config-script/89c78702-8793-4647-af28-a2e03aeb0bfc\", line 181, in <module>",
2018-12-28 09:08:09 | "    _create_logging_cron(mistral)",
2018-12-28 09:08:09 | "  File \"/var/lib/heat-config/heat-config-script/89c78702-8793-4647-af28-a2e03aeb0bfc\", line 111, in _create_logging_cron",
2018-12-28 09:08:09 | "    pattern='0 * * * *')",
2018-12-28 09:08:09 | "  File \"/usr/lib/python2.7/site-packages/mistralclient/api/v2/cron_triggers.py\", line 55, in create",
2018-12-28 09:08:09 | "    return self._create('/cron_triggers', data)",
2018-12-28 09:08:09 | "  File \"/usr/lib/python2.7/site-packages/mistralclient/api/base.py\", line 97, in _create",
2018-12-28 09:08:09 | "    self._raise_api_exception(ex.response)",
2018-12-28 09:08:09 | "  File \"/usr/lib/python2.7/site-packages/mistralclient/api/base.py\", line 160, in _raise_api_exception",
2018-12-28 09:08:09 | "    error_message=error_data)",
2018-12-28 09:08:09 | "mistralclient.api.base.APIException: Authorization failed: Cannot authenticate without an auth_url",
2018-12-28 09:08:09 | "",

[1] http://logs.openstack.org/98/604298/148/check/tripleo-ci-centos-7-undercloud-containers/7bbb677/logs/undercloud/home/zuul/undercloud_install.log.txt.gz#_2018-12-28_09_22_28
[2] http://logs.openstack.org/98/604298/148/check/tripleo-ci-centos-7-containers-multinode/afa83b1/logs/undercloud/home/zuul/undercloud_install.log.txt.gz#_2018-12-28_09_08_09
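For anyone reading the traceback: the heat-config script calls mistralclient's cron-trigger create, and the client raises APIException because it has no Keystone auth_url to authenticate against before issuing the POST to /cron_triggers. Below is a minimal, stdlib-only sketch of that failure mode; the function, trigger, and workflow names are hypothetical stand-ins, not the real mistralclient API.

```python
class APIException(Exception):
    """Stand-in for mistralclient.api.base.APIException."""


def create_cron_trigger(name, workflow, pattern, auth_url=None):
    # Sketch of the precondition: the real client must be able to
    # authenticate (via an auth_url or a pre-built session) before it
    # can issue the POST /cron_triggers request. With no auth_url
    # available in the environment, it fails exactly like the log above.
    if not auth_url:
        raise APIException(
            "Authorization failed: Cannot authenticate without an auth_url")
    return {"name": name, "workflow": workflow, "pattern": pattern}


# Mimics _create_logging_cron() creating an hourly trigger with no
# auth_url configured (names here are illustrative only):
try:
    create_cron_trigger("example-trigger", "example.workflow",
                        pattern="0 * * * *")
except APIException as exc:
    print("ERROR:", exc)
```

The point is that the exception is raised client-side before any request reaches the Mistral API, which is why the undercloud install log shows the failure inside mistralclient rather than an HTTP error from the service.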
Not sure if this is a blocker. I see it happening on https://review.openstack.org/#/c/604298/, which is a gate-check review. It happened on patch set 148, where it failed, was rechecked, failed again, was rechecked, and has worked ever since. As I write this, the gate-check review is on patch set 202. The failures on the other patch sets were all intermittent as well. I tried to use the reproducer script based on the failed patch set to see if I could reproduce it, but I could not. If someone has an env where this happens consistently, that would help a lot. I'm going to spend a little more time on this, but if I don't find anything else, I don't see what more we can do.
Removing the blocker flag, as the issue could not be reproduced (see the previous comment).
Given that this hasn't happened again, we assume it was resolved upstream. Feel free to reopen if someone hits it again.