Bug 1460460 - Setting MysqlNetwork: external caused deployment to fail at overcloud.AllNodesDeploySteps.ControllerDeployment_Step1
Summary: Setting MysqlNetwork: external caused deployment to fail at overcloud.AllNodesDeploySteps.ControllerDeployment_Step1
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: Red Hat OpenStack
Classification: Red Hat
Component: openstack-tripleo-heat-templates
Version: 11.0 (Ocata)
Hardware: Unspecified
OS: Unspecified
Priority: high
Severity: high
Target Milestone: z1
Target Release: 11.0 (Ocata)
Assignee: Alex Schultz
QA Contact: Gurenko Alex
URL:
Whiteboard:
Depends On:
Blocks:
 
Reported: 2017-06-10 23:40 UTC by Andreas Karis
Modified: 2020-07-16 09:48 UTC
CC: 8 users

Fixed In Version: openstack-tripleo-heat-templates-6.0.0-14.el7ost
Doc Type: If docs needed, set a value
Doc Text:
Clone Of:
Environment:
Last Closed: 2017-07-19 17:04:56 UTC
Target Upstream Version:
Embargoed:


Attachments
templates for osp 11 - need to modify network-environment.yaml (at the bottom) to provoke the issue and set the network to "external" (5.56 KB, application/x-gzip)
2017-06-12 22:24 UTC, Andreas Karis


Links
System ID Private Priority Status Summary Last Updated
Launchpad 1697722 0 None None None 2017-06-13 15:37:12 UTC
OpenStack gerrit 474196 0 None MERGED Add fqdn_external 2020-05-07 14:07:21 UTC
Red Hat Product Errata RHBA-2017:1778 0 normal SHIPPED_LIVE Red Hat OpenStack Platform 11.0 director Bug Fix Advisory 2017-07-19 21:01:28 UTC

Description Andreas Karis 2017-06-10 23:40:25 UTC
Description of problem:
Setting MysqlNetwork: external caused deployment to fail at overcloud.AllNodesDeploySteps.ControllerDeployment_Step1

Version-Release number of selected component (if applicable):
OSP 11

How reproducible:
Deploy with the following in network-environment.yaml:
~~~
parameter_defaults:
(...)
  ServiceNetMap:
    MysqlNetwork: external
(...)
~~~
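The failure mode matches the error in the actual results: with MysqlNetwork pointed at a network for which no bind address can be resolved, the bind-address option in galera.cnf is rendered with an empty value, and mysqld aborts with "option '--bind-address' requires an argument". A minimal sketch of the rendering problem, with the hiera key name taken from the linked upstream fix (gerrit 474196, "Add fqdn_external") and the mechanism otherwise assumed:

```shell
# Sketch, not the actual TripleO template: when the per-network hiera key
# (fqdn_external, per gerrit 474196) resolves to nothing, the option is
# still written out, but with an empty value.
fqdn=""                                    # stands in for an unresolvable lookup
printf '[mysqld]\nbind-address = %s\n' "$fqdn" > /tmp/galera-demo.cnf
grep 'bind-address' /tmp/galera-demo.cnf   # shows the empty option as rendered
```

mysql_install_db passes this file via --defaults-extra-file, so an empty bind-address reaches mysqld as an option with no argument.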

Expected results:
successful deployment

Additional info:
It works with:
~~~
ServiceNetMap:
  MysqlNetwork: ctlplane
~~~

Actual results:
~~~
 heat deployment-show 26755014-9289-49bf-afeb-bdb00c70d21e | sed 's/\\n/\n/g'
(...)
at http://mariadb.org/jira\u001b[0m
\u001b[1;31mError: /Stage[main]/Mysql::Server::Installdb/Mysql_datadir[/var/lib/mysql]/ensure: change from absent to present failed: Execution of '/usr/bin/mysql_install_db --defaults-extra-file=/etc/my.cnf.d/galera.cnf --basedir=/usr --datadir=/var/lib/mysql --user=mysql' returned 1: Installing MariaDB/MySQL system tables in '/var/lib/mysql' ...
170610 18:56:19 [ERROR] /usr/libexec/mysqld: option '--bind-address' requires an argument
170610 18:56:19 [ERROR] Aborting


Installation of system tables failed!  Examine the logs in
/var/lib/mysql for more information.

The problem could be conflicting information in an external
my.cnf files. You can ignore these by doing:

    shell> /usr/scripts/scripts/mysql_install_db --defaults-file=~/.my.cnf

You can also try to start the mysqld daemon with:

    shell> /usr/libexec/mysqld --skip-grant --general-log &

and use the command line tool /usr/bin/mysql
to connect to the mysql database and look at the grant tables:

    shell> /usr/bin/mysql -u root mysql
    mysql> show tables;

Try 'mysqld --help' if you have problems with paths.  Using
--general-log gives you a log in /var/lib/mysql that may be helpful.

The latest information about mysql_install_db is available at
https://mariadb.com/kb/en/installing-system-tables-mysql_install_db
MariaDB is hosted on launchpad; You can find the latest source and
email lists at http://launchpad.net/maria

Please check all of the above before submitting a bug report
at http://mariadb.org/jira\u001b[0m
\u001b[1;33mWarning: /Stage[main]/Mysql::Server::Root_password/Exec[remove install pass]: Skipping because of failed dependencies\u001b[0m
\u001b[1;33mWarning: /Stage[main]/Mysql::Server/Anchor[mysql::server::end]: Skipping because of failed dependencies\u001b[0m
", 
    "deploy_status_code": 6
  }, 
  "creation_time": "2017-06-10T22:55:59Z", 
  "updated_time": "2017-06-10T22:58:07Z", 
  "input_values": {
    "step": 1, 
    "update_identifier": "1497134678"
  }, 
  "action": "CREATE", 
  "status_reason": "deploy_status_code : Deployment exited with non-zero status code: 6", 
  "id": "26755014-9289-49bf-afeb-bdb00c70d21e"
}
~~~
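For reference, the linked upstream change (gerrit 474196, "Add fqdn_external", merged) adds an fqdn_external entry alongside the existing per-network FQDN hieradata keys, so a service mapped to the external network can resolve a bind address. A sketch of the resulting hieradata, with hostnames assumed for illustration:

~~~
# Illustrative only; the exact keys and values come from the fixed
# openstack-tripleo-heat-templates (6.0.0-14.el7ost and later).
fqdn_ctlplane: overcloud-controller-0.ctlplane.localdomain
fqdn_internal_api: overcloud-controller-0.internalapi.localdomain
fqdn_external: overcloud-controller-0.external.localdomain   # the added key
~~~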

Comment 1 Andreas Karis 2017-06-10 23:46:30 UTC
This does not seem to be reproducible in OSP 10 either.

Comment 2 Andreas Karis 2017-06-10 23:51:17 UTC
+--------------------------------------+------------+---------------+----------------------+--------------+
| ID                                   | Stack Name | Stack Status  | Creation Time        | Updated Time |
+--------------------------------------+------------+---------------+----------------------+--------------+
| 6f641aeb-e104-43b2-97cb-068297797ca4 | overcloud  | CREATE_FAILED | 2017-06-10T23:06:58Z | None         |
+--------------------------------------+------------+---------------+----------------------+--------------+
+----------------------------------------------+---------------------------------------------------------------------------------------------------+------------------------------------------------------------------------------------------------------------------------+-----------------+----------------------+---------------------------------------------------------------------------------------------------------------------------------------+
| resource_name                                | physical_resource_id                                                                              | resource_type                                                                                                          | resource_status | updated_time         | stack_name                                                                                                                            |
+----------------------------------------------+---------------------------------------------------------------------------------------------------+------------------------------------------------------------------------------------------------------------------------+-----------------+----------------------+---------------------------------------------------------------------------------------------------------------------------------------+
| AllNodesDeploySteps                          | a94f203e-06a4-4890-8e63-2458eb21aed3                                                              | OS::TripleO::PostDeploySteps                                                                                           | CREATE_FAILED   | 2017-06-10T23:06:58Z | overcloud                                                                                                                             |
| ControllerDeployment_Step1                   | 7254c06c-e732-40d2-868a-32fa26192e30                                                              | OS::Heat::StructuredDeploymentGroup                                                                                    | CREATE_FAILED   | 2017-06-10T23:18:18Z | overcloud-AllNodesDeploySteps-icj6at7yr47q                                                                                            |
| 0                                            | 481982e2-7f7c-441f-a07a-2a2f5275f981                                                              | OS::Heat::StructuredDeployment                                                                                         | CREATE_FAILED   | 2017-06-10T23:18:40Z | overcloud-AllNodesDeploySteps-icj6at7yr47q-ControllerDeployment_Step1-oe6ytudi7764                                                    |
+----------------------------------------------+---------------------------------------------------------------------------------------------------+------------------------------------------------------------------------------------------------------------------------+-----------------+----------------------+---------------------------------------------------------------------------------------------------------------------------------------+
/usr/lib/python2.7/site-packages/novaclient/client.py:278: UserWarning: The 'tenant_id' argument is deprecated in Ocata and its use may result in errors in future releases. As 'project_id' is provided, the 'tenant_id' argument will be ignored.
  warnings.warn(msg)
+--------------------------------------+------------------------+--------+------------+-------------+------------------------+
| ID                                   | Name                   | Status | Task State | Power State | Networks               |
+--------------------------------------+------------------------+--------+------------+-------------+------------------------+
| 3a061e81-cdac-41c7-8bfd-b9a5c19dbb4a | overcloud-compute-0    | ACTIVE | -          | Running     | ctlplane=192.168.24.16 |
| 95cc0503-9a16-4a43-bda5-7a9f2b5898bb | overcloud-controller-0 | ACTIVE | -          | Running     | ctlplane=192.168.24.13 |
+--------------------------------------+------------------------+--------+------------+-------------+------------------------+
+--------------------------------------+--------------------+--------------------------------------+-------------+--------------------+-------------+
| UUID                                 | Name               | Instance UUID                        | Power State | Provisioning State | Maintenance |
+--------------------------------------+--------------------+--------------------------------------+-------------+--------------------+-------------+
| a5401adf-5f10-4674-be8c-a895fd102499 | overcloud-node1    | 95cc0503-9a16-4a43-bda5-7a9f2b5898bb | power on    | active             | False       |
| 226d87ae-a8b3-4983-90db-90a571997dc3 | overcloud-node2    | None                                 | power off   | available          | False       |
| 1cc902e2-ef24-4ce7-b1c9-9afaae2c38d0 | overcloud-node3    | None                                 | power off   | available          | False       |
| 49226938-772f-4cbd-a68e-7a8c3e3d8a7f | overcloud-node4    | 3a061e81-cdac-41c7-8bfd-b9a5c19dbb4a | power on    | active             | False       |
| 0064cdbb-4046-4c22-845f-bc88be500b8b | overcloud-node5    | None                                 | power off   | available          | False       |
| 5b3819c9-71dc-4fe1-86df-821f015d01ad | overcloud-ceph1    | None                                 | None        | enroll             | False       |
| 2da41b95-f7d1-4257-87d8-52e5f8100c17 | overcloud-network1 | None                                 | None        | enroll             | False       |
+--------------------------------------+--------------------+--------------------------------------+-------------+--------------------+-------------+
^CAborted by user request ...
[stack@undercloud-8 ~]$ heat deployment-show 481982e2-7f7c-441f-a07a-2a2f5275f981
WARNING (shell) "heat deployment-show" is deprecated, please use "openstack software deployment show" instead
{
  "status": "FAILED", 
  "server_id": "95cc0503-9a16-4a43-bda5-7a9f2b5898bb", 
  "config_id": "642d9a56-9b4e-4f01-bc8c-a093f1e1a313", 
  "output_values": {
    "deploy_stdout": "\u001b[mNotice: hiera(): Cannot load backend module_data: cannot load such file -- hiera/backend/module_data_backend\u001b[0m\n\u001b[mNotice: Scope(Class[Tripleo::Firewall::Post]): At this stage, all network traffic is blocked.\u001b[0m\n\u001b[mNotice: Compiled catalog for overcloud-controller-0.localdomain in environment production in 5.72 seconds\u001b[0m\n\u001b[mNotice: /Stage[main]/Main/Package_manifest[/var/lib/tripleo/installed-packages/overcloud_controller1]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Mysql::Server::Config/File[mysql-config-file]/content: content changed '{md5}f434e1d5766874c7b9ed08d0c66904ca' to '{md5}e649ed820bfe25e2efcff0991e6bf96b'\u001b[0m\n\u001b[mNotice: /Stage[main]/Mysql::Server::Installdb/File[/var/log/mariadb/mariadb.log]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Mysql::Server::Root_password/Exec[remove install pass]: Dependency Mysql_datadir[/var/lib/mysql] has failures: true\u001b[0m\n\u001b[mNotice: /Stage[main]/Mysql::Server/Anchor[mysql::server::end]: Dependency Mysql_datadir[/var/lib/mysql] has failures: true\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Profile::Base::Database::Mysql::Client/Exec[directory-create-etc-my.cnf.d]/returns: executed successfully\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Profile::Base::Database::Mysql::Client/Augeas[mysql-bind-address]/returns: executed successfully\u001b[0m\n\u001b[mNotice: /Stage[main]/Rabbitmq::Config/File[/etc/rabbitmq]/owner: owner changed 'rabbitmq' to 'root'\u001b[0m\n\u001b[mNotice: /Stage[main]/Rabbitmq::Config/File[/etc/rabbitmq]/group: group changed 'rabbitmq' to 'root'\u001b[0m\n\u001b[mNotice: /Stage[main]/Rabbitmq::Config/File[/etc/rabbitmq/ssl]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Rabbitmq::Config/File[rabbitmq-env.config]/ensure: defined content as '{md5}09d8a3b8e77b2fc5b83cc47b9ed4b1dc'\u001b[0m\n\u001b[mNotice: /Stage[main]/Rabbitmq::Config/File[rabbitmqadmin.conf]/ensure: defined content 
as '{md5}44d4ef5cb86ab30e6127e83939ef09c4'\u001b[0m\n\u001b[mNotice: /Stage[main]/Rabbitmq::Config/File[/etc/systemd/system/rabbitmq-server.service.d]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Rabbitmq::Config/File[/etc/systemd/system/rabbitmq-server.service.d/limits.conf]/ensure: defined content as '{md5}91d370d2c5a1af171c9d5b5985fca733'\u001b[0m\n\u001b[mNotice: /Stage[main]/Rabbitmq::Config/Exec[rabbitmq-systemd-reload]: Triggered 'refresh' from 1 events\u001b[0m\n\u001b[mNotice: /Stage[main]/Rabbitmq::Config/File[/etc/security/limits.d/rabbitmq-server.conf]/ensure: defined content as '{md5}1030abc4db405b5f2969643e99bc7435'\u001b[0m\n\u001b[mNotice: /Stage[main]/Rabbitmq::Config/Rabbitmq_erlang_cookie[/var/lib/rabbitmq/.erlang.cookie]/content: The rabbitmq erlang cookie was changed\u001b[0m\n\u001b[mNotice: /Stage[main]/Rabbitmq::Config/File[rabbitmq.config]/content: content changed '{md5}c8e444b74c6294006936abdaea55a079' to '{md5}e59b8128e89bba36ad009148d46c75a9'\u001b[0m\n\u001b[mNotice: /Stage[main]/Rabbitmq::Config/File[rabbitmq.config]/owner: owner changed 'rabbitmq' to 'root'\u001b[0m\n\u001b[mNotice: /Stage[main]/Rabbitmq::Config/File[rabbitmq.config]/group: group changed 'rabbitmq' to 'root'\u001b[0m\n\u001b[mNotice: /Stage[main]/Firewall::Linux::Redhat/Exec[/usr/bin/systemctl daemon-reload]/returns: executed successfully\u001b[0m\n\u001b[mNotice: /Stage[main]/Firewall::Linux::Redhat/Service[iptables]/ensure: ensure changed 'stopped' to 'running'\u001b[0m\n\u001b[mNotice: /Stage[main]/Firewall::Linux::Redhat/Service[ip6tables]/ensure: ensure changed 'stopped' to 'running'\u001b[0m\n\u001b[mNotice: /Stage[main]/Memcached/File[/etc/sysconfig/memcached]/content: content changed '{md5}a50ed62e82d31fb4cb2de2226650c545' to '{md5}0f1a06d0b41e7158471adcca09f12ce7'\u001b[0m\n\u001b[mNotice: /Stage[main]/Memcached/Service[memcached]/ensure: ensure changed 'stopped' to 'running'\u001b[0m\n\u001b[mNotice: 
/Stage[main]/Pacemaker::Service/Service[pcsd]/ensure: ensure changed 'stopped' to 'running'\u001b[0m\n\u001b[mNotice: /Stage[main]/Pacemaker::Corosync/User[hacluster]/password: changed password\u001b[0m\n\u001b[mNotice: /Stage[main]/Pacemaker::Corosync/User[hacluster]/groups: groups changed '' to ['haclient']\u001b[0m\n\u001b[mNotice: /Stage[main]/Pacemaker::Corosync/Exec[reauthenticate-across-all-nodes]: Triggered 'refresh' from 2 events\u001b[0m\n\u001b[mNotice: /Stage[main]/Pacemaker::Corosync/File[etc-pacemaker]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Pacemaker::Corosync/File[etc-pacemaker-authkey]/ensure: defined content as '{md5}c042e532c9be62f5322c88258f045376'\u001b[0m\n\u001b[mNotice: /Stage[main]/Pacemaker::Corosync/Exec[Create Cluster tripleo_cluster]/returns: executed successfully\u001b[0m\n\u001b[mNotice: /Stage[main]/Pacemaker::Corosync/Exec[Start Cluster tripleo_cluster]/returns: executed successfully\u001b[0m\n\u001b[mNotice: /Stage[main]/Pacemaker::Service/Service[corosync]/enable: enable changed 'false' to 'true'\u001b[0m\n\u001b[mNotice: /Stage[main]/Pacemaker::Service/Service[pacemaker]/enable: enable changed 'false' to 'true'\u001b[0m\n\u001b[mNotice: /Stage[main]/Pacemaker::Corosync/Exec[wait-for-settle]/returns: executed successfully\u001b[0m\n\u001b[mNotice: /Stage[main]/Redis::Config/File[/etc/redis]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Redis::Config/File[/etc/redis.conf.puppet]/ensure: defined content as '{md5}106eb199a7fb8baba7a90b725379100a'\u001b[0m\n\u001b[mNotice: /Stage[main]/Redis::Config/File[/var/log/redis]/mode: mode changed '0750' to '0755'\u001b[0m\n\u001b[mNotice: /Stage[main]/Redis::Config/Exec[cp -p /etc/redis.conf.puppet /etc/redis.conf]: Triggered 'refresh' from 1 events\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Profile::Pacemaker::Database::Redis/File[/etc/security/limits.d/redis.conf]/ensure: defined content as '{md5}a2f723773964f5ea42b6c7c5d6b72208'\u001b[0m\n\u001b[mNotice: 
/Stage[main]/Ntp::Config/File[/etc/ntp.conf]/content: content changed '{md5}c07b9a377faea45b96b7d3bf8976004b' to '{md5}1831fc3b6710354d103eb9e9904ab5e6'\u001b[0m\n\u001b[mNotice: /Stage[main]/Ntp::Config/File[/etc/ntp.conf]/seltype: seltype changed 'etc_t' to 'net_conf_t'\u001b[0m\n\u001b[mNotice: /Stage[main]/Ntp::Service/Service[ntp]/ensure: ensure changed 'stopped' to 'running'\u001b[0m\n\u001b[mNotice: /Stage[main]/Timezone/File[/etc/localtime]/target: target changed '../usr/share/zoneinfo/America/New_York' to '/usr/share/zoneinfo/UTC'\u001b[0m\n\u001b[mNotice: /Stage[main]/Opendaylight::Repos/Yumrepo[opendaylight-6-testing]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Opendaylight::Install/File_line[java_options_systemd]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Opendaylight::Install/Exec[reload_systemd_units]: Triggered 'refresh' from 1 events\u001b[0m\n\u001b[mNotice: /Stage[main]/Opendaylight::Config/File[jetty.xml]/content: content changed '{md5}9193aa0b354d9dc21269f4d4c507247b' to '{md5}840208e2e637549e341ac9ccf17d6b20'\u001b[0m\n\u001b[mNotice: /Stage[main]/Opendaylight::Config/File[org.ops4j.pax.logging.cfg]/content: content changed '{md5}f3197e8df5d640bb275ce5e07eddf1b5' to '{md5}711ef6390bfff3e696dec1e89a72fec6'\u001b[0m\n\u001b[mNotice: /Stage[main]/Opendaylight::Config/File[/opt/opendaylight/etc/opendaylight]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Opendaylight::Config/File[/opt/opendaylight/etc/opendaylight/datastore]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Opendaylight::Config/File[/opt/opendaylight/etc/opendaylight/datastore/initial]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Opendaylight::Config/File[/opt/opendaylight/etc/opendaylight/datastore/initial/config]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Opendaylight::Config/File[netvirt-aclservice-config.xml]/ensure: defined content as '{md5}0ead21ff74a7430a3c1aa6e8cf26053b'\u001b[0m\n\u001b[mNotice: 
/Stage[main]/Opendaylight::Service/Service[opendaylight]/ensure: ensure changed 'stopped' to 'running'\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Profile::Base::Kernel/Kmod::Load[ip_conntrack_proto_sctp]/Exec[modprobe ip_conntrack_proto_sctp]/returns: executed successfully\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Profile::Base::Kernel/Kmod::Load[ip_conntrack_proto_sctp]/File[/etc/sysconfig/modules/ip_conntrack_proto_sctp.modules]/ensure: defined content as '{md5}df428765760cc7de821e99f06ad0d403'\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Profile::Base::Kernel/Kmod::Load[nf_conntrack]/File[/etc/sysconfig/modules/nf_conntrack.modules]/ensure: defined content as '{md5}69dc79067bb7ee8d7a8a12176ceddb02'\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[kernel.pid_max]/Sysctl[kernel.pid_max]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[kernel.pid_max]/Sysctl_runtime[kernel.pid_max]/val: val changed '32768' to '1048576'\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.core.netdev_max_backlog]/Sysctl[net.core.netdev_max_backlog]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.core.netdev_max_backlog]/Sysctl_runtime[net.core.netdev_max_backlog]/val: val changed '1000' to '10000'\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.ipv4.tcp_keepalive_intvl]/Sysctl[net.ipv4.tcp_keepalive_intvl]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.ipv4.tcp_keepalive_intvl]/Sysctl_runtime[net.ipv4.tcp_keepalive_intvl]/val: val changed '75' to '1'\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.ipv4.tcp_keepalive_probes]/Sysctl[net.ipv4.tcp_keepalive_probes]/ensure: created\u001b[0m\n\u001b[mNotice: 
/Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.ipv4.tcp_keepalive_probes]/Sysctl_runtime[net.ipv4.tcp_keepalive_probes]/val: val changed '9' to '5'\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.ipv4.tcp_keepalive_time]/Sysctl[net.ipv4.tcp_keepalive_time]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.ipv4.tcp_keepalive_time]/Sysctl_runtime[net.ipv4.tcp_keepalive_time]/val: val changed '7200' to '5'\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.ipv6.conf.all.accept_ra]/Sysctl[net.ipv6.conf.all.accept_ra]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.ipv6.conf.all.accept_ra]/Sysctl_runtime[net.ipv6.conf.all.accept_ra]/val: val changed '1' to '0'\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.ipv6.conf.all.autoconf]/Sysctl[net.ipv6.conf.all.autoconf]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.ipv6.conf.all.autoconf]/Sysctl_runtime[net.ipv6.conf.all.autoconf]/val: val changed '1' to '0'\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.ipv6.conf.default.accept_ra]/Sysctl[net.ipv6.conf.default.accept_ra]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.ipv6.conf.default.accept_ra]/Sysctl_runtime[net.ipv6.conf.default.accept_ra]/val: val changed '1' to '0'\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.ipv6.conf.default.autoconf]/Sysctl[net.ipv6.conf.default.autoconf]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.ipv6.conf.default.autoconf]/Sysctl_runtime[net.ipv6.conf.default.autoconf]/val: val changed '1' to '0'\u001b[0m\n\u001b[mNotice: 
/Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.netfilter.nf_conntrack_max]/Sysctl[net.netfilter.nf_conntrack_max]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.netfilter.nf_conntrack_max]/Sysctl_runtime[net.netfilter.nf_conntrack_max]/val: val changed '262144' to '500000'\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.nf_conntrack_max]/Sysctl[net.nf_conntrack_max]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.nf_conntrack_max]/Sysctl_runtime[net.nf_conntrack_max]/val: val changed '262144' to '500000'\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall::Pre/Tripleo::Firewall::Rule[000 accept related established rules]/Firewall[000 accept related established rules ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall::Pre/Tripleo::Firewall::Rule[000 accept related established rules]/Firewall[000 accept related established rules ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall::Pre/Tripleo::Firewall::Rule[001 accept all icmp]/Firewall[001 accept all icmp ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall::Pre/Tripleo::Firewall::Rule[001 accept all icmp]/Firewall[001 accept all icmp ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall::Pre/Tripleo::Firewall::Rule[002 accept all to lo interface]/Firewall[002 accept all to lo interface ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall::Pre/Tripleo::Firewall::Rule[002 accept all to lo interface]/Firewall[002 accept all to lo interface ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall::Pre/Tripleo::Firewall::Rule[003 accept ssh]/Firewall[003 accept ssh ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall::Pre/Tripleo::Firewall::Rule[003 accept 
ssh]/Firewall[003 accept ssh ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall::Pre/Tripleo::Firewall::Rule[004 accept ipv6 dhcpv6]/Firewall[004 accept ipv6 dhcpv6 ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Firewall::Rule[100 mysql_haproxy]/Firewall[100 mysql_haproxy ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Firewall::Rule[100 mysql_haproxy]/Firewall[100 mysql_haproxy ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Firewall::Rule[100 redis_haproxy]/Firewall[100 redis_haproxy ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Firewall::Rule[100 redis_haproxy]/Firewall[100 redis_haproxy ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Pacemaker::Stonith/Pacemaker::Property[Disable STONITH]/Pcmk_property[property--stonith-enabled]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Ssh::Server::Config/Concat[/etc/ssh/sshd_config]/File[/etc/ssh/sshd_config]/content: content changed '{md5}e7f0ef60bc1689c6ff39a399b150c3d2' to '{md5}94ad982da674b1094a49a0f26a7c6988'\u001b[0m\n\u001b[mNotice: /Stage[main]/Ssh::Server::Service/Service[sshd]: Triggered 'refresh' from 2 events\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall::Post/Firewall[998 log all]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall::Post/Tripleo::Firewall::Rule[999 drop all]/Firewall[999 drop all ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall::Post/Tripleo::Firewall::Rule[999 drop all]/Firewall[999 drop all ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[keystone_admin]/Tripleo::Firewall::Rule[100 keystone_admin_haproxy]/Firewall[100 keystone_admin_haproxy ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: 
/Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[keystone_admin]/Tripleo::Firewall::Rule[100 keystone_admin_haproxy]/Firewall[100 keystone_admin_haproxy ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[keystone_admin]/Tripleo::Firewall::Rule[100 keystone_admin_haproxy_ssl]/Firewall[100 keystone_admin_haproxy_ssl ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[keystone_admin]/Tripleo::Firewall::Rule[100 keystone_admin_haproxy_ssl]/Firewall[100 keystone_admin_haproxy_ssl ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[keystone_public]/Tripleo::Firewall::Rule[100 keystone_public_haproxy]/Firewall[100 keystone_public_haproxy ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[keystone_public]/Tripleo::Firewall::Rule[100 keystone_public_haproxy]/Firewall[100 keystone_public_haproxy ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[keystone_public]/Tripleo::Firewall::Rule[100 keystone_public_haproxy_ssl]/Firewall[100 keystone_public_haproxy_ssl ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[keystone_public]/Tripleo::Firewall::Rule[100 keystone_public_haproxy_ssl]/Firewall[100 keystone_public_haproxy_ssl ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[neutron]/Tripleo::Firewall::Rule[100 neutron_haproxy]/Firewall[100 neutron_haproxy ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[neutron]/Tripleo::Firewall::Rule[100 neutron_haproxy]/Firewall[100 neutron_haproxy ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[neutron]/Tripleo::Firewall::Rule[100 
neutron_haproxy_ssl]/Firewall[100 neutron_haproxy_ssl ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[neutron]/Tripleo::Firewall::Rule[100 neutron_haproxy_ssl]/Firewall[100 neutron_haproxy_ssl ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[cinder]/Tripleo::Firewall::Rule[100 cinder_haproxy]/Firewall[100 cinder_haproxy ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[cinder]/Tripleo::Firewall::Rule[100 cinder_haproxy]/Firewall[100 cinder_haproxy ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[cinder]/Tripleo::Firewall::Rule[100 cinder_haproxy_ssl]/Firewall[100 cinder_haproxy_ssl ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[cinder]/Tripleo::Firewall::Rule[100 cinder_haproxy_ssl]/Firewall[100 cinder_haproxy_ssl ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[glance_api]/Tripleo::Firewall::Rule[100 glance_api_haproxy]/Firewall[100 glance_api_haproxy ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[glance_api]/Tripleo::Firewall::Rule[100 glance_api_haproxy]/Firewall[100 glance_api_haproxy ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[glance_api]/Tripleo::Firewall::Rule[100 glance_api_haproxy_ssl]/Firewall[100 glance_api_haproxy_ssl ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[glance_api]/Tripleo::Firewall::Rule[100 glance_api_haproxy_ssl]/Firewall[100 glance_api_haproxy_ssl ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[nova_osapi]/Tripleo::Firewall::Rule[100 nova_osapi_haproxy]/Firewall[100 
nova_osapi_haproxy ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[nova_osapi]/Tripleo::Firewall::Rule[100 nova_osapi_haproxy]/Firewall[100 nova_osapi_haproxy ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[nova_osapi]/Tripleo::Firewall::Rule[100 nova_osapi_haproxy_ssl]/Firewall[100 nova_osapi_haproxy_ssl ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[nova_osapi]/Tripleo::Firewall::Rule[100 nova_osapi_haproxy_ssl]/Firewall[100 nova_osapi_haproxy_ssl ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[nova_placement]/Tripleo::Firewall::Rule[100 nova_placement_haproxy]/Firewall[100 nova_placement_haproxy ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[nova_placement]/Tripleo::Firewall::Rule[100 nova_placement_haproxy]/Firewall[100 nova_placement_haproxy ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[nova_placement]/Tripleo::Firewall::Rule[100 nova_placement_haproxy_ssl]/Firewall[100 nova_placement_haproxy_ssl ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[nova_placement]/Tripleo::Firewall::Rule[100 nova_placement_haproxy_ssl]/Firewall[100 nova_placement_haproxy_ssl ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[nova_metadata]/Tripleo::Firewall::Rule[100 nova_metadata_haproxy]/Firewall[100 nova_metadata_haproxy ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[nova_metadata]/Tripleo::Firewall::Rule[100 nova_metadata_haproxy]/Firewall[100 nova_metadata_haproxy ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: 
/Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[nova_novncproxy]/Tripleo::Firewall::Rule[100 nova_novncproxy_haproxy]/Firewall[100 nova_novncproxy_haproxy ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[nova_novncproxy]/Tripleo::Firewall::Rule[100 nova_novncproxy_haproxy]/Firewall[100 nova_novncproxy_haproxy ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[nova_novncproxy]/Tripleo::Firewall::Rule[100 nova_novncproxy_haproxy_ssl]/Firewall[100 nova_novncproxy_haproxy_ssl ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[nova_novncproxy]/Tripleo::Firewall::Rule[100 nova_novncproxy_haproxy_ssl]/Firewall[100 nova_novncproxy_haproxy_ssl ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[ceilometer]/Tripleo::Firewall::Rule[100 ceilometer_haproxy]/Firewall[100 ceilometer_haproxy ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[ceilometer]/Tripleo::Firewall::Rule[100 ceilometer_haproxy]/Firewall[100 ceilometer_haproxy ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[ceilometer]/Tripleo::Firewall::Rule[100 ceilometer_haproxy_ssl]/Firewall[100 ceilometer_haproxy_ssl ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[ceilometer]/Tripleo::Firewall::Rule[100 ceilometer_haproxy_ssl]/Firewall[100 ceilometer_haproxy_ssl ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[aodh]/Tripleo::Firewall::Rule[100 aodh_haproxy]/Firewall[100 aodh_haproxy ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[aodh]/Tripleo::Firewall::Rule[100 aodh_haproxy]/Firewall[100 aodh_haproxy ipv6]/ensure: 
created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[aodh]/Tripleo::Firewall::Rule[100 aodh_haproxy_ssl]/Firewall[100 aodh_haproxy_ssl ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[aodh]/Tripleo::Firewall::Rule[100 aodh_haproxy_ssl]/Firewall[100 aodh_haproxy_ssl ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[panko]/Tripleo::Firewall::Rule[100 panko_haproxy]/Firewall[100 panko_haproxy ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[panko]/Tripleo::Firewall::Rule[100 panko_haproxy]/Firewall[100 panko_haproxy ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[panko]/Tripleo::Firewall::Rule[100 panko_haproxy_ssl]/Firewall[100 panko_haproxy_ssl ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[panko]/Tripleo::Firewall::Rule[100 panko_haproxy_ssl]/Firewall[100 panko_haproxy_ssl ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[gnocchi]/Tripleo::Firewall::Rule[100 gnocchi_haproxy]/Firewall[100 gnocchi_haproxy ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[gnocchi]/Tripleo::Firewall::Rule[100 gnocchi_haproxy]/Firewall[100 gnocchi_haproxy ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[gnocchi]/Tripleo::Firewall::Rule[100 gnocchi_haproxy_ssl]/Firewall[100 gnocchi_haproxy_ssl ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[gnocchi]/Tripleo::Firewall::Rule[100 gnocchi_haproxy_ssl]/Firewall[100 gnocchi_haproxy_ssl ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: 
/Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[swift_proxy_server]/Tripleo::Firewall::Rule[100 swift_proxy_server_haproxy]/Firewall[100 swift_proxy_server_haproxy ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[swift_proxy_server]/Tripleo::Firewall::Rule[100 swift_proxy_server_haproxy]/Firewall[100 swift_proxy_server_haproxy ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[swift_proxy_server]/Tripleo::Firewall::Rule[100 swift_proxy_server_haproxy_ssl]/Firewall[100 swift_proxy_server_haproxy_ssl ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[swift_proxy_server]/Tripleo::Firewall::Rule[100 swift_proxy_server_haproxy_ssl]/Firewall[100 swift_proxy_server_haproxy_ssl ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[heat_api]/Tripleo::Firewall::Rule[100 heat_api_haproxy]/Firewall[100 heat_api_haproxy ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[heat_api]/Tripleo::Firewall::Rule[100 heat_api_haproxy]/Firewall[100 heat_api_haproxy ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[heat_api]/Tripleo::Firewall::Rule[100 heat_api_haproxy_ssl]/Firewall[100 heat_api_haproxy_ssl ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[heat_api]/Tripleo::Firewall::Rule[100 heat_api_haproxy_ssl]/Firewall[100 heat_api_haproxy_ssl ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[heat_cloudwatch]/Tripleo::Firewall::Rule[100 heat_cloudwatch_haproxy]/Firewall[100 heat_cloudwatch_haproxy ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[heat_cloudwatch]/Tripleo::Firewall::Rule[100 
heat_cloudwatch_haproxy]/Firewall[100 heat_cloudwatch_haproxy ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[heat_cloudwatch]/Tripleo::Firewall::Rule[100 heat_cloudwatch_haproxy_ssl]/Firewall[100 heat_cloudwatch_haproxy_ssl ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[heat_cloudwatch]/Tripleo::Firewall::Rule[100 heat_cloudwatch_haproxy_ssl]/Firewall[100 heat_cloudwatch_haproxy_ssl ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[heat_cfn]/Tripleo::Firewall::Rule[100 heat_cfn_haproxy]/Firewall[100 heat_cfn_haproxy ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[heat_cfn]/Tripleo::Firewall::Rule[100 heat_cfn_haproxy]/Firewall[100 heat_cfn_haproxy ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[heat_cfn]/Tripleo::Firewall::Rule[100 heat_cfn_haproxy_ssl]/Firewall[100 heat_cfn_haproxy_ssl ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[heat_cfn]/Tripleo::Firewall::Rule[100 heat_cfn_haproxy_ssl]/Firewall[100 heat_cfn_haproxy_ssl ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[cinder_api]/Tripleo::Firewall::Rule[119 cinder]/Firewall[119 cinder ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[cinder_api]/Tripleo::Firewall::Rule[119 cinder]/Firewall[119 cinder ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[cinder_volume]/Tripleo::Firewall::Rule[120 iscsi initiator]/Firewall[120 iscsi initiator ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[cinder_volume]/Tripleo::Firewall::Rule[120 
iscsi initiator]/Firewall[120 iscsi initiator ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[keystone]/Tripleo::Firewall::Rule[111 keystone]/Firewall[111 keystone ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[keystone]/Tripleo::Firewall::Rule[111 keystone]/Firewall[111 keystone ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[glance_api]/Tripleo::Firewall::Rule[112 glance_api]/Firewall[112 glance_api ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[glance_api]/Tripleo::Firewall::Rule[112 glance_api]/Firewall[112 glance_api ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[heat_api]/Tripleo::Firewall::Rule[125 heat_api]/Firewall[125 heat_api ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[heat_api]/Tripleo::Firewall::Rule[125 heat_api]/Firewall[125 heat_api ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[heat_api_cfn]/Tripleo::Firewall::Rule[125 heat_cfn]/Firewall[125 heat_cfn ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[heat_api_cfn]/Tripleo::Firewall::Rule[125 heat_cfn]/Firewall[125 heat_cfn ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[heat_api_cloudwatch]/Tripleo::Firewall::Rule[125 heat_cloudwatch]/Firewall[125 heat_cloudwatch ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[heat_api_cloudwatch]/Tripleo::Firewall::Rule[125 heat_cloudwatch]/Firewall[125 heat_cloudwatch ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: 
/Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[mysql]/Tripleo::Firewall::Rule[104 mysql galera]/Firewall[104 mysql galera ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[mysql]/Tripleo::Firewall::Rule[104 mysql galera]/Firewall[104 mysql galera ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[neutron_dhcp]/Tripleo::Firewall::Rule[115 neutron dhcp input]/Firewall[115 neutron dhcp input ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[neutron_dhcp]/Tripleo::Firewall::Rule[115 neutron dhcp input]/Firewall[115 neutron dhcp input ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[neutron_dhcp]/Tripleo::Firewall::Rule[116 neutron dhcp output]/Firewall[116 neutron dhcp output ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[neutron_dhcp]/Tripleo::Firewall::Rule[116 neutron dhcp output]/Firewall[116 neutron dhcp output ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[neutron_api]/Tripleo::Firewall::Rule[114 neutron api]/Firewall[114 neutron api ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[neutron_api]/Tripleo::Firewall::Rule[114 neutron api]/Firewall[114 neutron api ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[rabbitmq]/Tripleo::Firewall::Rule[109 rabbitmq]/Firewall[109 rabbitmq ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[rabbitmq]/Tripleo::Firewall::Rule[109 rabbitmq]/Firewall[109 rabbitmq ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: 
/Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[haproxy]/Tripleo::Firewall::Rule[107 haproxy stats]/Firewall[107 haproxy stats ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[haproxy]/Tripleo::Firewall::Rule[107 haproxy stats]/Firewall[107 haproxy stats ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[memcached]/Tripleo::Firewall::Rule[121 memcached]/Firewall[121 memcached ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[memcached]/Tripleo::Firewall::Rule[121 memcached]/Firewall[121 memcached ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[pacemaker]/Tripleo::Firewall::Rule[130 pacemaker tcp]/Firewall[130 pacemaker tcp ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[pacemaker]/Tripleo::Firewall::Rule[130 pacemaker tcp]/Firewall[130 pacemaker tcp ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[pacemaker]/Tripleo::Firewall::Rule[131 pacemaker udp]/Firewall[131 pacemaker udp ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[pacemaker]/Tripleo::Firewall::Rule[131 pacemaker udp]/Firewall[131 pacemaker udp ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[redis]/Tripleo::Firewall::Rule[108 redis]/Firewall[108 redis ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[redis]/Tripleo::Firewall::Rule[108 redis]/Firewall[108 redis ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[mongodb]/Tripleo::Firewall::Rule[101 mongodb_config]/Firewall[101 
mongodb_config ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[mongodb]/Tripleo::Firewall::Rule[101 mongodb_config]/Firewall[101 mongodb_config ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[mongodb]/Tripleo::Firewall::Rule[102 mongodb_sharding]/Firewall[102 mongodb_sharding ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[mongodb]/Tripleo::Firewall::Rule[102 mongodb_sharding]/Firewall[102 mongodb_sharding ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[mongodb]/Tripleo::Firewall::Rule[103 mongod]/Firewall[103 mongod ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[mongodb]/Tripleo::Firewall::Rule[103 mongod]/Firewall[103 mongod ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[nova_api]/Tripleo::Firewall::Rule[113 nova_api]/Firewall[113 nova_api ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[nova_api]/Tripleo::Firewall::Rule[113 nova_api]/Firewall[113 nova_api ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[nova_placement]/Tripleo::Firewall::Rule[138 nova_placement]/Firewall[138 nova_placement ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[nova_placement]/Tripleo::Firewall::Rule[138 nova_placement]/Firewall[138 nova_placement ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[nova_vnc_proxy]/Tripleo::Firewall::Rule[137 nova_vnc_proxy]/Firewall[137 nova_vnc_proxy ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: 
/Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[nova_vnc_proxy]/Tripleo::Firewall::Rule[137 nova_vnc_proxy]/Firewall[137 nova_vnc_proxy ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[ntp]/Tripleo::Firewall::Rule[105 ntp]/Firewall[105 ntp ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[ntp]/Tripleo::Firewall::Rule[105 ntp]/Firewall[105 ntp ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[swift_proxy]/Tripleo::Firewall::Rule[122 swift proxy]/Firewall[122 swift proxy ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[swift_proxy]/Tripleo::Firewall::Rule[122 swift proxy]/Firewall[122 swift proxy ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[swift_storage]/Tripleo::Firewall::Rule[123 swift storage]/Firewall[123 swift storage ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[swift_storage]/Tripleo::Firewall::Rule[123 swift storage]/Firewall[123 swift storage ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[snmp]/Tripleo::Firewall::Rule[127 snmp]/Firewall[127 snmp ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[snmp]/Tripleo::Firewall::Rule[127 snmp]/Firewall[127 snmp ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[ceilometer_api]/Tripleo::Firewall::Rule[124 ceilometer]/Firewall[124 ceilometer ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[ceilometer_api]/Tripleo::Firewall::Rule[124 ceilometer]/Firewall[124 ceilometer ipv6]/ensure: 
created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[horizon]/Tripleo::Firewall::Rule[126 horizon]/Firewall[126 horizon ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[horizon]/Tripleo::Firewall::Rule[126 horizon]/Firewall[126 horizon ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[gnocchi_api]/Tripleo::Firewall::Rule[129 gnocchi-api]/Firewall[129 gnocchi-api ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[gnocchi_api]/Tripleo::Firewall::Rule[129 gnocchi-api]/Firewall[129 gnocchi-api ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[gnocchi_statsd]/Tripleo::Firewall::Rule[140 gnocchi-statsd]/Firewall[140 gnocchi-statsd ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[gnocchi_statsd]/Tripleo::Firewall::Rule[140 gnocchi-statsd]/Firewall[140 gnocchi-statsd ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[aodh_api]/Tripleo::Firewall::Rule[128 aodh-api]/Firewall[128 aodh-api ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[aodh_api]/Tripleo::Firewall::Rule[128 aodh-api]/Firewall[128 aodh-api ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[opendaylight_api]/Tripleo::Firewall::Rule[137 opendaylight api]/Firewall[137 opendaylight api ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[opendaylight_api]/Tripleo::Firewall::Rule[137 opendaylight api]/Firewall[137 opendaylight api ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: 
/Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[opendaylight_ovs]/Tripleo::Firewall::Rule[118 neutron vxlan networks]/Firewall[118 neutron vxlan networks ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[opendaylight_ovs]/Tripleo::Firewall::Rule[118 neutron vxlan networks]/Firewall[118 neutron vxlan networks ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[opendaylight_ovs]/Tripleo::Firewall::Rule[136 neutron gre networks]/Firewall[136 neutron gre networks ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[opendaylight_ovs]/Tripleo::Firewall::Rule[136 neutron gre networks]/Firewall[136 neutron gre networks ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[panko_api]/Tripleo::Firewall::Rule[140 panko-api]/Firewall[140 panko-api ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Service_rules[panko_api]/Tripleo::Firewall::Rule[140 panko-api]/Firewall[140 panko-api ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[opendaylight]/Tripleo::Firewall::Rule[100 opendaylight_haproxy]/Firewall[100 opendaylight_haproxy ipv4]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Haproxy/Tripleo::Haproxy::Endpoint[opendaylight]/Tripleo::Firewall::Rule[100 opendaylight_haproxy]/Firewall[100 opendaylight_haproxy ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Firewall::Linux::Redhat/File[/etc/sysconfig/iptables]/seluser: seluser changed 'system_u' to 'unconfined_u'\u001b[0m\n\u001b[mNotice: /Stage[main]/Firewall::Linux::Redhat/File[/etc/sysconfig/ip6tables]/seluser: seluser changed 'system_u' to 'unconfined_u'\u001b[0m\n\u001b[mNotice: 
/Stage[main]/Haproxy/Haproxy::Instance[haproxy]/Haproxy::Config[haproxy]/Concat[/etc/haproxy/haproxy.cfg]/File[/etc/haproxy/haproxy.cfg]/content: content changed '{md5}1f337186b0e1ba5ee82760cb437fb810' to '{md5}eb5a550e071013a718473cf2f078b1e4'\u001b[0m\n\u001b[mNotice: /Stage[main]/Haproxy/Haproxy::Instance[haproxy]/Haproxy::Config[haproxy]/Concat[/etc/haproxy/haproxy.cfg]/File[/etc/haproxy/haproxy.cfg]/mode: mode changed '0644' to '0640'\u001b[0m\n\u001b[mNotice: /Stage[main]/Tripleo::Profile::Base::Haproxy/Exec[haproxy-reload]: Triggered 'refresh' from 1 events\u001b[0m\n\u001b[mNotice: Applied catalog in 102.84 seconds\u001b[0m\n", 
    "deploy_stderr": "exception: connect failed\n\u001b[1;33mWarning: This method is deprecated, please use match expressions with Stdlib::Compat::Array instead. They are described at https://docs.puppet.com/puppet/latest/reference/lang_data_type.html#match-expressions. at [\"/var/lib/heat-config/heat-config-puppet/642d9a56-9b4e-4f01-bc8c-a093f1e1a313.pp\", 61]:\n   (at /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:25:in `deprecation')\u001b[0m\n\u001b[1;33mWarning: This method is deprecated, please use the stdlib validate_legacy function, with Stdlib::Compat::Hash. There is further documentation for validate_legacy function in the README. at [\"/var/lib/heat-config/heat-config-puppet/642d9a56-9b4e-4f01-bc8c-a093f1e1a313.pp\", 61]:\n   (at /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:25:in `deprecation')\u001b[0m\n\u001b[1;33mWarning: ModuleLoader: module 'mysql' has unresolved dependencies - it will only see those that are resolved. Use 'puppet module list --tree' to see information about modules\n   (file & line not available)\u001b[0m\n\u001b[1;33mWarning: ModuleLoader: module 'rabbitmq' has unresolved dependencies - it will only see those that are resolved. Use 'puppet module list --tree' to see information about modules\n   (file & line not available)\u001b[0m\n\u001b[1;33mWarning: This method is deprecated, please use the stdlib validate_legacy function, with Stdlib::Compat::Bool. There is further documentation for validate_legacy function in the README. at [\"/var/lib/heat-config/heat-config-puppet/642d9a56-9b4e-4f01-bc8c-a093f1e1a313.pp\", 73]:[\"/etc/puppet/modules/tripleo/manifests/profile/pacemaker/rabbitmq.pp\", 62]\n   (at /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:25:in `deprecation')\u001b[0m\n\u001b[1;33mWarning: This method is deprecated, please use the stdlib validate_legacy function, with Pattern[]. There is further documentation for validate_legacy function in the README. 
at [\"/var/lib/heat-config/heat-config-puppet/642d9a56-9b4e-4f01-bc8c-a093f1e1a313.pp\", 73]:[\"/etc/puppet/modules/tripleo/manifests/profile/pacemaker/rabbitmq.pp\", 62]\n   (at /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:25:in `deprecation')\u001b[0m\n\u001b[1;33mWarning: This method is deprecated, please use the stdlib validate_legacy function, with Stdlib::Compat::String. There is further documentation for validate_legacy function in the README. at [\"/var/lib/heat-config/heat-config-puppet/642d9a56-9b4e-4f01-bc8c-a093f1e1a313.pp\", 73]:[\"/etc/puppet/modules/tripleo/manifests/profile/pacemaker/rabbitmq.pp\", 62]\n   (at /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:25:in `deprecation')\u001b[0m\n\u001b[1;33mWarning: This method is deprecated, please use the stdlib validate_legacy function, with Stdlib::Compat::Array. There is further documentation for validate_legacy function in the README. at [\"/var/lib/heat-config/heat-config-puppet/642d9a56-9b4e-4f01-bc8c-a093f1e1a313.pp\", 73]:[\"/etc/puppet/modules/tripleo/manifests/profile/pacemaker/rabbitmq.pp\", 62]\n   (at /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:25:in `deprecation')\u001b[0m\n\u001b[1;33mWarning: This method is deprecated, please use the stdlib validate_legacy function, with Stdlib::Compat::Absolute_Path. There is further documentation for validate_legacy function in the README. at [\"/var/lib/heat-config/heat-config-puppet/642d9a56-9b4e-4f01-bc8c-a093f1e1a313.pp\", 73]:[\"/etc/puppet/modules/tripleo/manifests/profile/pacemaker/rabbitmq.pp\", 62]\n   (at /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:25:in `deprecation')\u001b[0m\n\u001b[1;33mWarning: This method is deprecated, please use the stdlib validate_legacy function, with Stdlib::Compat::Integer. There is further documentation for validate_legacy function in the README. 
at [\"/var/lib/heat-config/heat-config-puppet/642d9a56-9b4e-4f01-bc8c-a093f1e1a313.pp\", 73]:[\"/etc/puppet/modules/tripleo/manifests/profile/pacemaker/rabbitmq.pp\", 62]\n   (at /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:25:in `deprecation')\u001b[0m\n\u001b[1;33mWarning: Unknown variable: 'haproxy_stats_bind_certificate'. at /etc/puppet/modules/tripleo/manifests/haproxy.pp:769:6\u001b[0m\n\u001b[1;33mWarning: This method is deprecated, please use match expressions with Stdlib::Compat::String instead. They are described at https://docs.puppet.com/puppet/latest/reference/lang_data_type.html#match-expressions. at [\"/var/lib/heat-config/heat-config-puppet/642d9a56-9b4e-4f01-bc8c-a093f1e1a313.pp\", 77]:[\"/etc/puppet/modules/tripleo/manifests/profile/base/memcached.pp\", 30]\n   (at /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:25:in `deprecation')\u001b[0m\n\u001b[1;33mWarning: ModuleLoader: module 'redis' has unresolved dependencies - it will only see those that are resolved. Use 'puppet module list --tree' to see information about modules\n   (file & line not available)\u001b[0m\n\u001b[1;33mWarning: This method is deprecated, please use the stdlib validate_legacy function, with Stdlib::Compat::Numeric. There is further documentation for validate_legacy function in the README. at [\"/var/lib/heat-config/heat-config-puppet/642d9a56-9b4e-4f01-bc8c-a093f1e1a313.pp\", 98]:[\"/etc/puppet/modules/tripleo/manifests/profile/base/time/ntp.pp\", 29]\n   (at /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:25:in `deprecation')\u001b[0m\n\u001b[1;33mWarning: ModuleLoader: module 'timezone' has unresolved dependencies - it will only see those that are resolved. Use 'puppet module list --tree' to see information about modules\n   (file & line not available)\u001b[0m\n\u001b[1;33mWarning: ModuleLoader: module 'opendaylight' has unresolved dependencies - it will only see those that are resolved. 
Use 'puppet module list --tree' to see information about modules\n   (file & line not available)\u001b[0m\n\u001b[1;33mWarning: This method is deprecated, please use match expressions with Stdlib::Compat::Bool instead. They are described at https://docs.puppet.com/puppet/latest/reference/lang_data_type.html#match-expressions. at :\n   (at /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:25:in `deprecation')\u001b[0m\n\u001b[1;33mWarning: Scope(Haproxy::Config[haproxy]): haproxy: The $merge_options parameter will default to true in the next major release. Please review the documentation regarding the implications.\u001b[0m\n\u001b[1;31mError: Execution of '/usr/bin/mysql_install_db --defaults-extra-file=/etc/my.cnf.d/galera.cnf --basedir=/usr --datadir=/var/lib/mysql --user=mysql' returned 1: Installing MariaDB/MySQL system tables in '/var/lib/mysql' ...\n170610 19:19:01 [ERROR] /usr/libexec/mysqld: option '--bind-address' requires an argument\n170610 19:19:01 [ERROR] Aborting\n\n\nInstallation of system tables failed!  Examine the logs in\n/var/lib/mysql for more information.\n\nThe problem could be conflicting information in an external\nmy.cnf files. You can ignore these by doing:\n\n    shell> /usr/scripts/scripts/mysql_install_db --defaults-file=~/.my.cnf\n\nYou can also try to start the mysqld daemon with:\n\n    shell> /usr/libexec/mysqld --skip-grant --general-log &\n\nand use the command line tool /usr/bin/mysql\nto connect to the mysql database and look at the grant tables:\n\n    shell> /usr/bin/mysql -u root mysql\n    mysql> show tables;\n\nTry 'mysqld --help' if you have problems with paths.  
Using\n--general-log gives you a log in /var/lib/mysql that may be helpful.\n\nThe latest information about mysql_install_db is available at\nhttps://mariadb.com/kb/en/installing-system-tables-mysql_install_db\nMariaDB is hosted on launchpad; You can find the latest source and\nemail lists at http://launchpad.net/maria\n\nPlease check all of the above before submitting a bug report\nat http://mariadb.org/jira\u001b[0m\n\u001b[1;31mError: /Stage[main]/Mysql::Server::Installdb/Mysql_datadir[/var/lib/mysql]/ensure: change from absent to present failed: Execution of '/usr/bin/mysql_install_db --defaults-extra-file=/etc/my.cnf.d/galera.cnf --basedir=/usr --datadir=/var/lib/mysql --user=mysql' returned 1: Installing MariaDB/MySQL system tables in '/var/lib/mysql' ...\n170610 19:19:01 [ERROR] /usr/libexec/mysqld: option '--bind-address' requires an argument\n170610 19:19:01 [ERROR] Aborting\n\n\nInstallation of system tables failed!  Examine the logs in\n/var/lib/mysql for more information.\n\nThe problem could be conflicting information in an external\nmy.cnf files. You can ignore these by doing:\n\n    shell> /usr/scripts/scripts/mysql_install_db --defaults-file=~/.my.cnf\n\nYou can also try to start the mysqld daemon with:\n\n    shell> /usr/libexec/mysqld --skip-grant --general-log &\n\nand use the command line tool /usr/bin/mysql\nto connect to the mysql database and look at the grant tables:\n\n    shell> /usr/bin/mysql -u root mysql\n    mysql> show tables;\n\nTry 'mysqld --help' if you have problems with paths.  
Using\n--general-log gives you a log in /var/lib/mysql that may be helpful.\n\nThe latest information about mysql_install_db is available at\nhttps://mariadb.com/kb/en/installing-system-tables-mysql_install_db\nMariaDB is hosted on launchpad; You can find the latest source and\nemail lists at http://launchpad.net/maria\n\nPlease check all of the above before submitting a bug report\nat http://mariadb.org/jira\u001b[0m\n\u001b[1;33mWarning: /Stage[main]/Mysql::Server::Root_password/Exec[remove install pass]: Skipping because of failed dependencies\u001b[0m\n\u001b[1;33mWarning: /Stage[main]/Mysql::Server/Anchor[mysql::server::end]: Skipping because of failed dependencies\u001b[0m\n", 
    "deploy_status_code": 6
  }, 
  "creation_time": "2017-06-10T23:18:41Z", 
  "updated_time": "2017-06-10T23:20:46Z", 
  "input_values": {
    "step": 1, 
    "update_identifier": "1497136008"
  }, 
  "action": "CREATE", 
  "status_reason": "deploy_status_code : Deployment exited with non-zero status code: 6", 
  "id": "481982e2-7f7c-441f-a07a-2a2f5275f981"
}

Comment 3 Alex Schultz 2017-06-12 17:54:14 UTC
The error suggests that no external address was computed for the controller. What does the network configuration look like for this deployment? Can you provide the complete deployment template set?

Comment 4 Andreas Karis 2017-06-12 18:08:42 UTC
Hi Alex,

This happens in my lab environment and in a customer environment. It should be easily reproducible with the override below, regardless of the environment / network configuration:

~~~
parameter_defaults:
(...)
  ServiceNetMap:
    MysqlNetwork: external
(...)
~~~

- Andreas

Comment 5 Andreas Karis 2017-06-12 18:10:08 UTC
The controllers do end up with an external address after the deployment, so if they don't have one at this point it looks more like a timing problem.

Comment 6 Alex Schultz 2017-06-12 22:10:08 UTC
So this is happening because fqdn_external is not defined in the hieradata.  Currently we only support internal_api, storage, storage_mgmt, tenant, management, ctlplane.

https://github.com/openstack/tripleo-heat-templates/blob/stable/ocata/puppet/controller-role.yaml#L498

I will have to ask some folks whether this is intentional or a bug. My assumption is that it might be intentional, because we don't necessarily manage items on the 'external' network. This also seems like a configuration we wouldn't necessarily want to support, since putting something as important as your database on an 'external' network is not a good idea.
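As an illustration of the failure mode only (this is not TripleO or puppet code, and the key and host names are hypothetical): the Galera bind address is derived from the fqdn_<network> hieradata key for the network chosen in ServiceNetMap. When that key is absent, the rendered mysqld option ends up with no argument, matching the "option '--bind-address' requires an argument" error in the deployment output above.

```python
# Illustrative sketch: a missing fqdn_<network> hieradata key produces a
# bare --bind-address option, which makes mysqld abort during
# mysql_install_db. Key/host names below are hypothetical examples.
hieradata = {
    "fqdn_internal_api": "ctrl-0.internalapi.localdomain",
    "fqdn_ctlplane": "ctrl-0.ctlplane.localdomain",
    # fqdn_external is not defined in Ocata's controller-role.yaml
}

def bind_address_option(service_network: str) -> str:
    """Render the mysqld bind-address option for a given network."""
    fqdn = hieradata.get(f"fqdn_{service_network}", "")
    # An empty lookup renders "--bind-address" with no argument.
    return f"--bind-address={fqdn}" if fqdn else "--bind-address"

print(bind_address_option("internal_api"))  # --bind-address=ctrl-0.internalapi.localdomain
print(bind_address_option("external"))      # --bind-address (no argument -> mysqld aborts)
```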

Comment 7 Andreas Karis 2017-06-12 22:22:54 UTC
Hi Alex, sorry that I didn't provide the templates; I was busy all day. The interesting thing here, though, is that this works in OSP 10 but fails in OSP 11.

Still need my templates?

- Andreas

Comment 8 Andreas Karis 2017-06-12 22:24:56 UTC
Created attachment 1287116 [details]
templates for osp 11 - need to modify network-environment.yaml (at the bottom) to provoke the issue and set the network to "external"

Comment 9 Alex Schultz 2017-06-13 15:33:34 UTC
Found it. 

In 10, we set the fqdn_<network> facts in puppet-tripleo: https://github.com/openstack/puppet-tripleo/blob/stable/newton/lib/facter/alt_fqdns.rb

In 11 that was moved to THT https://github.com/openstack/tripleo-heat-templates/blob/stable/ocata/puppet/controller-role.yaml#L498

But fqdn_external was dropped in the migration, so it's missing, which causes the failure. I will propose a fix upstream and get it backported. In the meantime the customer can work around the issue by adding fqdn_external to their controller-role.yaml:

fqdn_external: {get_attr: [NetHostMap, value, external, fqdn]}
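For reference, a sketch of how the fqdn_<network> hieradata map in puppet/controller-role.yaml would look with the workaround applied. The entries for the currently supported networks (internal_api, storage, storage_mgmt, tenant, management, ctlplane) are assumed to follow the same get_attr pattern as the line above; only the last line is the actual workaround.

```yaml
# Sketch only: fqdn_<network> hieradata entries in puppet/controller-role.yaml
# (Ocata). Existing entries are assumed to follow this pattern.
fqdn_internal_api: {get_attr: [NetHostMap, value, internal_api, fqdn]}
fqdn_storage: {get_attr: [NetHostMap, value, storage, fqdn]}
fqdn_storage_mgmt: {get_attr: [NetHostMap, value, storage_mgmt, fqdn]}
fqdn_tenant: {get_attr: [NetHostMap, value, tenant, fqdn]}
fqdn_management: {get_attr: [NetHostMap, value, management, fqdn]}
fqdn_ctlplane: {get_attr: [NetHostMap, value, ctlplane, fqdn]}
fqdn_external: {get_attr: [NetHostMap, value, external, fqdn]}  # missing in Ocata; add as workaround
```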

Comment 11 Alex Schultz 2017-07-17 16:54:27 UTC
Verified fqdn_external is now defined in openstack-tripleo-heat-templates-6.0.0-14.el7ost

Comment 13 errata-xmlrpc 2017-07-19 17:04:56 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory, and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHBA-2017:1778
