Red Hat Bugzilla – Attachment 1477543 Details for Bug 1619704 – Failing to deploy BM UC timeout waiting for execution
undercloud install lag
install-undercloud.log (text/plain), 935.79 KB, created by Tzach Shefi on 2018-08-21 14:17:53 UTC
Description: undercloud install lag
Filename: install-undercloud.log
MIME Type: text/plain
Creator: Tzach Shefi
Created: 2018-08-21 14:17:53 UTC
Size: 935.79 KB
2018-08-21 15:24:09,442 INFO: Logging to /home/stack/.instack/install-undercloud.log
2018-08-21 15:24:09,541 INFO: Checking for a FQDN hostname...
2018-08-21 15:24:09,702 INFO: Static hostname detected as undercloud-0.redhat.local
2018-08-21 15:24:09,753 INFO: Transient hostname detected as undercloud-0.redhat.local
2018-08-21 15:24:09,780 INFO: Generated new password for undercloud_db_password
2018-08-21 15:24:09,781 INFO: Generated new password for undercloud_admin_token
2018-08-21 15:24:09,782 INFO: Generated new password for undercloud_admin_password
2018-08-21 15:24:09,783 INFO: Generated new password for undercloud_glance_password
2018-08-21 15:24:09,784 INFO: Generated new password for undercloud_heat_encryption_key
2018-08-21 15:24:09,784 INFO: Generated new password for undercloud_heat_password
2018-08-21 15:24:09,785 INFO: Generated new password for undercloud_heat_cfn_password
2018-08-21 15:24:09,786 INFO: Generated new password for undercloud_neutron_password
2018-08-21 15:24:09,787 INFO: Generated new password for undercloud_nova_password
2018-08-21 15:24:09,787 INFO: Generated new password for undercloud_ironic_password
2018-08-21 15:24:09,788 INFO: Generated new password for undercloud_aodh_password
2018-08-21 15:24:09,789 INFO: Generated new password for undercloud_gnocchi_password
2018-08-21 15:24:09,789 INFO: Generated new password for undercloud_ceilometer_password
2018-08-21 15:24:09,790 INFO: Generated new password for undercloud_panko_password
2018-08-21 15:24:09,791 INFO: Generated new password for undercloud_ceilometer_metering_secret
2018-08-21 15:24:09,792 INFO: Generated new password for undercloud_ceilometer_snmpd_password
2018-08-21 15:24:09,793 INFO: Generated new password for undercloud_swift_password
2018-08-21 15:24:09,793 INFO: Generated new password for undercloud_mistral_password
2018-08-21 15:24:09,794 INFO: Generated new password for undercloud_rabbit_cookie
2018-08-21 15:24:09,795 INFO: Generated new password for undercloud_rabbit_password
2018-08-21 15:24:09,795 INFO: Generated new password for undercloud_rabbit_username
2018-08-21 15:24:09,796 INFO: Generated new password for undercloud_heat_stack_domain_admin_password
2018-08-21 15:24:09,797 INFO: Generated new password for undercloud_swift_hash_suffix
2018-08-21 15:24:09,798 INFO: Generated new password for undercloud_haproxy_stats_password
2018-08-21 15:24:09,798 INFO: Generated new password for undercloud_zaqar_password
2018-08-21 15:24:09,799 INFO: Generated new password for undercloud_horizon_secret_key
2018-08-21 15:24:09,800 INFO: Generated new password for undercloud_cinder_password
2018-08-21 15:24:09,800 INFO: Generated new password for undercloud_novajoin_password
2018-08-21 15:24:09,899 INFO: Running yum clean all
2018-08-21 15:24:10,720 INFO: Loaded plugins: product-id, search-disabled-repos, subscription-manager
2018-08-21 15:24:11,023 INFO: This system is not registered with an entitlement server. You can use subscription-manager to register.
2018-08-21 15:24:11,188 INFO: Cleaning repos: rhelosp-13.0-image-build-override rhelosp-13.0-optools-puddle
2018-08-21 15:24:11,189 INFO: : rhelosp-13.0-puddle rhelosp-ceph-3.0-mon rhelosp-ceph-3.0-osd
2018-08-21 15:24:11,189 INFO: : rhelosp-ceph-3.0-tools rhelosp-rhel-7.5-extras
2018-08-21 15:24:11,190 INFO: : rhelosp-rhel-7.5-ha rhelosp-rhel-7.5-image-build-override
2018-08-21 15:24:11,190 INFO: : rhelosp-rhel-7.5-server
2018-08-21 15:24:11,191 INFO: Cleaning up everything
2018-08-21 15:24:11,191 INFO: Maybe you want: rm -rf /var/cache/yum, to also free up space taken by orphaned data from disabled or removed repos
2018-08-21 15:24:11,458 INFO: yum-clean-all completed successfully
2018-08-21 15:24:11,459 INFO: Running yum update
2018-08-21 15:24:12,281 INFO: Loaded plugins: product-id, search-disabled-repos, subscription-manager
2018-08-21 15:24:12,576 INFO: This system is not registered with an entitlement server. You can use subscription-manager to register.
2018-08-21 15:26:29,936 INFO: No packages marked for update
2018-08-21 15:26:30,051 INFO: yum-update completed successfully
2018-08-21 15:26:30,142 INFO: Running instack
2018-08-21 15:26:30,720 INFO: INFO: 2018-08-21 15:26:30,719 -- Starting run of instack
2018-08-21 15:26:30,749 INFO: INFO: 2018-08-21 15:26:30,748 -- Using json file: /usr/share/instack-undercloud/json-files/rhel-7-undercloud-packages.json
2018-08-21 15:26:30,750 INFO: INFO: 2018-08-21 15:26:30,749 -- Running Installation
2018-08-21 15:26:30,752 INFO: INFO: 2018-08-21 15:26:30,751 -- Initialized with elements path: /usr/share/tripleo-puppet-elements /usr/share/instack-undercloud /usr/share/tripleo-image-elements /usr/share/diskimage-builder/elements
2018-08-21 15:26:30,810 INFO: WARNING: 2018-08-21 15:26:30,810 -- expand_dependencies() deprecated, use get_elements
2018-08-21 15:26:30,899 INFO: INFO: 2018-08-21 15:26:30,898 -- List of all elements and dependencies: epel undercloud-install dib-python source-repositories install-types puppet-modules install-bin pip-manifest puppet-stack-config os-refresh-config element-manifest manifests pip-and-virtualenv cache-url pkg-map enable-packages-install puppet os-apply-config hiera package-installs
2018-08-21 15:26:30,900 INFO: INFO: 2018-08-21 15:26:30,898 -- Excluding element pip-and-virtualenv
2018-08-21 15:26:30,900 INFO: INFO: 2018-08-21 15:26:30,899 -- Excluding element epel
2018-08-21 15:26:30,901 INFO: INFO: 2018-08-21 15:26:30,899 -- Excluding element pip-manifest
2018-08-21 15:26:30,901 INFO: INFO: 2018-08-21 15:26:30,900 -- Excluding element package-installs
2018-08-21 15:26:30,902 INFO: INFO: 2018-08-21 15:26:30,900 -- Excluding element pkg-map
2018-08-21 15:26:30,902 INFO: INFO: 2018-08-21 15:26:30,900 -- Excluding element puppet
2018-08-21 15:26:30,903 INFO: INFO: 2018-08-21 15:26:30,901 -- Excluding element cache-url
2018-08-21 15:26:30,903 INFO: INFO: 2018-08-21 15:26:30,901 -- Excluding element dib-python
2018-08-21 15:26:30,904 INFO: INFO: 2018-08-21 15:26:30,902 -- Excluding element install-bin
2018-08-21 15:26:30,904 INFO: INFO: 2018-08-21 15:26:30,902 -- List of all elements and dependencies after excludes: undercloud-install source-repositories install-types puppet-modules puppet-stack-config os-refresh-config element-manifest manifests enable-packages-install os-apply-config hiera
2018-08-21 15:26:31,617 INFO: INFO: 2018-08-21 15:26:31,616 -- Running hook extra-data
2018-08-21 15:26:31,621 INFO: INFO: 2018-08-21 15:26:31,620 -- ############### Begin stdout/stderr logging ###############
2018-08-21 15:26:31,661 INFO: dib-run-parts Sourcing environment file /tmp/tmpTPs6_V/extra-data.d/../environment.d/00-dib-v2-env
2018-08-21 15:26:31,667 INFO: + source /tmp/tmpTPs6_V/extra-data.d/../environment.d/00-dib-v2-env
2018-08-21 15:26:31,668 INFO: ++ export 'IMAGE_ELEMENT=epel undercloud-install dib-python source-repositories install-types install-bin pip-manifest pkg-map puppet-stack-config os-refresh-config element-manifest manifests pip-and-virtualenv cache-url puppet enable-packages-install puppet-modules os-apply-config hiera package-installs'
2018-08-21 15:26:31,669 INFO: ++ IMAGE_ELEMENT='epel undercloud-install dib-python source-repositories install-types install-bin pip-manifest pkg-map puppet-stack-config os-refresh-config element-manifest manifests pip-and-virtualenv cache-url puppet enable-packages-install puppet-modules os-apply-config hiera package-installs'
2018-08-21 15:26:31,670 INFO: ++ export 'IMAGE_ELEMENT_YAML={cache-url: /usr/share/diskimage-builder/elements/cache-url, dib-python: /usr/share/diskimage-builder/elements/dib-python,
2018-08-21 15:26:31,670 INFO: element-manifest: /usr/share/diskimage-builder/elements/element-manifest, enable-packages-install: /usr/share/tripleo-image-elements/enable-packages-install,
2018-08-21 15:26:31,671 INFO: epel: /usr/share/diskimage-builder/elements/epel, hiera: /usr/share/tripleo-puppet-elements/hiera,
2018-08-21 15:26:31,672 INFO: install-bin: /usr/share/diskimage-builder/elements/install-bin, install-types: /usr/share/diskimage-builder/elements/install-types,
2018-08-21 15:26:31,672 INFO: manifests: /usr/share/diskimage-builder/elements/manifests, os-apply-config: /usr/share/tripleo-image-elements/os-apply-config,
2018-08-21 15:26:31,673 INFO: os-refresh-config: /usr/share/tripleo-image-elements/os-refresh-config, package-installs: /usr/share/diskimage-builder/elements/package-installs,
2018-08-21 15:26:31,674 INFO: pip-and-virtualenv: /usr/share/diskimage-builder/elements/pip-and-virtualenv, pip-manifest: /usr/share/tripleo-image-elements/pip-manifest,
2018-08-21 15:26:31,674 INFO: pkg-map: /usr/share/diskimage-builder/elements/pkg-map, puppet: /usr/share/tripleo-puppet-elements/puppet,
2018-08-21 15:26:31,675 INFO: puppet-modules: /usr/share/tripleo-puppet-elements/puppet-modules, puppet-stack-config: /usr/share/instack-undercloud/puppet-stack-config,
2018-08-21 15:26:31,675 INFO: source-repositories: /usr/share/diskimage-builder/elements/source-repositories,
2018-08-21 15:26:31,676 INFO: undercloud-install: /usr/share/instack-undercloud/undercloud-install}
2018-08-21 15:26:31,676 INFO: '
2018-08-21 15:26:31,677 INFO: ++ IMAGE_ELEMENT_YAML='{cache-url: /usr/share/diskimage-builder/elements/cache-url, dib-python: /usr/share/diskimage-builder/elements/dib-python,
2018-08-21 15:26:31,678 INFO: element-manifest: /usr/share/diskimage-builder/elements/element-manifest, enable-packages-install: /usr/share/tripleo-image-elements/enable-packages-install,
2018-08-21 15:26:31,678 INFO: epel: /usr/share/diskimage-builder/elements/epel, hiera: /usr/share/tripleo-puppet-elements/hiera,
2018-08-21 15:26:31,679 INFO: install-bin: /usr/share/diskimage-builder/elements/install-bin, install-types: /usr/share/diskimage-builder/elements/install-types,
2018-08-21 15:26:31,679 INFO: manifests: /usr/share/diskimage-builder/elements/manifests, os-apply-config: /usr/share/tripleo-image-elements/os-apply-config,
2018-08-21 15:26:31,680 INFO: os-refresh-config: /usr/share/tripleo-image-elements/os-refresh-config, package-installs: /usr/share/diskimage-builder/elements/package-installs,
2018-08-21 15:26:31,681 INFO: pip-and-virtualenv: /usr/share/diskimage-builder/elements/pip-and-virtualenv, pip-manifest: /usr/share/tripleo-image-elements/pip-manifest,
2018-08-21 15:26:31,681 INFO: pkg-map: /usr/share/diskimage-builder/elements/pkg-map, puppet: /usr/share/tripleo-puppet-elements/puppet,
2018-08-21 15:26:31,682 INFO: puppet-modules: /usr/share/tripleo-puppet-elements/puppet-modules, puppet-stack-config: /usr/share/instack-undercloud/puppet-stack-config,
2018-08-21 15:26:31,683 INFO: source-repositories: /usr/share/diskimage-builder/elements/source-repositories,
2018-08-21 15:26:31,683 INFO: undercloud-install: /usr/share/instack-undercloud/undercloud-install}
2018-08-21 15:26:31,683 INFO: '
2018-08-21 15:26:31,684 INFO: ++ export -f get_image_element_array
2018-08-21 15:26:31,684 INFO: + set +o xtrace
2018-08-21 15:26:31,685 INFO: dib-run-parts Sourcing environment file /tmp/tmpTPs6_V/extra-data.d/../environment.d/01-export-install-types.bash
2018-08-21 15:26:31,685 INFO: + source /tmp/tmpTPs6_V/extra-data.d/../environment.d/01-export-install-types.bash
2018-08-21 15:26:31,686 INFO: ++ export DIB_DEFAULT_INSTALLTYPE=package
2018-08-21 15:26:31,686 INFO: ++ DIB_DEFAULT_INSTALLTYPE=package
2018-08-21 15:26:31,687 INFO: + set +o xtrace
2018-08-21 15:26:31,687 INFO: dib-run-parts Sourcing environment file /tmp/tmpTPs6_V/extra-data.d/../environment.d/01-puppet-module-pins.sh
2018-08-21 15:26:31,688 INFO: + source /tmp/tmpTPs6_V/extra-data.d/../environment.d/01-puppet-module-pins.sh
2018-08-21 15:26:31,688 INFO: ++ export DIB_REPOREF_puppetlabs_ntp=4.2.x
2018-08-21 15:26:31,689 INFO: ++ DIB_REPOREF_puppetlabs_ntp=4.2.x
2018-08-21 15:26:31,689 INFO: + set +o xtrace
2018-08-21 15:26:31,690 INFO: dib-run-parts Sourcing environment file /tmp/tmpTPs6_V/extra-data.d/../environment.d/02-puppet-modules-install-types.sh
2018-08-21 15:26:31,696 INFO: + source /tmp/tmpTPs6_V/extra-data.d/../environment.d/02-puppet-modules-install-types.sh
2018-08-21 15:26:31,697 INFO: ++ DIB_DEFAULT_INSTALLTYPE=package
2018-08-21 15:26:31,697 INFO: ++ DIB_INSTALLTYPE_puppet_modules=package
2018-08-21 15:26:31,698 INFO: ++ '[' package = source ']'
2018-08-21 15:26:31,698 INFO: + set +o xtrace
2018-08-21 15:26:31,699 INFO: dib-run-parts Sourcing environment file /tmp/tmpTPs6_V/extra-data.d/../environment.d/10-os-apply-config-venv-dir.bash
2018-08-21 15:26:31,705 INFO: + source /tmp/tmpTPs6_V/extra-data.d/../environment.d/10-os-apply-config-venv-dir.bash
2018-08-21 15:26:31,706 INFO: ++ '[' -z '' ']'
2018-08-21 15:26:31,706 INFO: ++ export OS_APPLY_CONFIG_VENV_DIR=/opt/stack/venvs/os-apply-config
2018-08-21 15:26:31,707 INFO: ++ OS_APPLY_CONFIG_VENV_DIR=/opt/stack/venvs/os-apply-config
2018-08-21 15:26:31,707 INFO: + set +o xtrace
2018-08-21 15:26:31,708 INFO: dib-run-parts Sourcing environment file /tmp/tmpTPs6_V/extra-data.d/../environment.d/14-manifests
2018-08-21 15:26:31,713 INFO: + source /tmp/tmpTPs6_V/extra-data.d/../environment.d/14-manifests
2018-08-21 15:26:31,714 INFO: ++ export DIB_MANIFEST_IMAGE_DIR=/etc/dib-manifests
2018-08-21 15:26:31,715 INFO: ++ DIB_MANIFEST_IMAGE_DIR=/etc/dib-manifests
2018-08-21 15:26:31,715 INFO: ++ export DIB_MANIFEST_SAVE_DIR=instack.d/
2018-08-21 15:26:31,716 INFO: ++ DIB_MANIFEST_SAVE_DIR=instack.d/
2018-08-21 15:26:31,716 INFO: + set +o xtrace
2018-08-21 15:26:31,717 INFO: dib-run-parts Running /tmp/tmpTPs6_V/extra-data.d/10-install-git
2018-08-21 15:26:31,727 INFO: + yum -y install git
2018-08-21 15:26:32,518 INFO: Loaded plugins: product-id, search-disabled-repos, subscription-manager
2018-08-21 15:26:32,821 INFO: This system is not registered with an entitlement server. You can use subscription-manager to register.
2018-08-21 15:26:34,937 INFO: Package git-1.8.3.1-14.el7_5.x86_64 already installed and latest version
2018-08-21 15:26:34,937 INFO: Nothing to do
2018-08-21 15:26:35,045 INFO: dib-run-parts 10-install-git completed
2018-08-21 15:26:35,046 INFO: dib-run-parts Running /tmp/tmpTPs6_V/extra-data.d/20-manifest-dir
2018-08-21 15:26:35,057 INFO: + set -eu
2018-08-21 15:26:35,058 INFO: + set -o pipefail
2018-08-21 15:26:35,059 INFO: + sudo mkdir -p /tmp/instack.lB_tu4/mnt//etc/dib-manifests
2018-08-21 15:26:35,097 INFO: dib-run-parts 20-manifest-dir completed
2018-08-21 15:26:35,098 INFO: dib-run-parts Running /tmp/tmpTPs6_V/extra-data.d/75-inject-element-manifest
2018-08-21 15:26:35,108 INFO: + set -eu
2018-08-21 15:26:35,109 INFO: + set -o pipefail
2018-08-21 15:26:35,110 INFO: + DIB_ELEMENT_MANIFEST_PATH=/etc/dib-manifests/dib-element-manifest
2018-08-21 15:26:35,111 INFO: ++ dirname /etc/dib-manifests/dib-element-manifest
2018-08-21 15:26:35,114 INFO: + sudo mkdir -p /tmp/instack.lB_tu4/mnt//etc/dib-manifests
2018-08-21 15:26:35,147 INFO: + sudo /bin/bash -c 'echo epel undercloud-install dib-python source-repositories install-types install-bin pip-manifest pkg-map puppet-stack-config os-refresh-config element-manifest manifests pip-and-virtualenv cache-url puppet enable-packages-install puppet-modules os-apply-config hiera package-installs | tr '\'' '\'' '\''\n'\'' > /tmp/instack.lB_tu4/mnt//etc/dib-manifests/dib-element-manifest'
2018-08-21 15:26:35,189 INFO: dib-run-parts 75-inject-element-manifest completed
2018-08-21 15:26:35,190 INFO: dib-run-parts Running /tmp/tmpTPs6_V/extra-data.d/98-source-repositories
2018-08-21 15:26:35,234 INFO: Getting /root/.cache/image-create/source-repositories/repositories_flock: Tue Aug 21 15:26:35 IDT 2018 for /tmp/tmpTPs6_V/source-repository-puppet-modules
2018-08-21 15:26:35,247 INFO: (0001 / 0081)
2018-08-21 15:26:35,272 INFO: puppetlabs-apache install type not set to source
2018-08-21 15:26:35,275 INFO: (0002 / 0081)
2018-08-21 15:26:35,288 INFO: puppet-aodh install type not set to source
2018-08-21 15:26:35,290 INFO: (0003 / 0081)
2018-08-21 15:26:35,305 INFO: puppet-auditd install type not set to source
2018-08-21 15:26:35,307 INFO: (0004 / 0081)
2018-08-21 15:26:35,321 INFO: puppet-barbican install type not set to source
2018-08-21 15:26:35,323 INFO: (0005 / 0081)
2018-08-21 15:26:35,337 INFO: puppet-cassandra install type not set to source
2018-08-21 15:26:35,339 INFO: (0006 / 0081)
2018-08-21 15:26:35,353 INFO: puppet-ceph install type not set to source
2018-08-21 15:26:35,355 INFO: (0007 / 0081)
2018-08-21 15:26:35,369 INFO: puppet-ceilometer install type not set to source
2018-08-21 15:26:35,371 INFO: (0008 / 0081)
2018-08-21 15:26:35,385 INFO: puppet-congress install type not set to source
2018-08-21 15:26:35,388 INFO: (0009 / 0081)
2018-08-21 15:26:35,402 INFO: puppet-gnocchi install type not set to source
2018-08-21 15:26:35,404 INFO: (0010 / 0081)
2018-08-21 15:26:35,418 INFO: puppet-certmonger install type not set to source
2018-08-21 15:26:35,420 INFO: (0011 / 0081)
2018-08-21 15:26:35,434 INFO: puppet-cinder install type not set to source
2018-08-21 15:26:35,436 INFO: (0012 / 0081)
2018-08-21 15:26:35,450 INFO: puppet-common install type not set to source
2018-08-21 15:26:35,452 INFO: (0013 / 0081)
2018-08-21 15:26:35,466 INFO: puppet-contrail install type not set to source
2018-08-21 15:26:35,468 INFO: (0014 / 0081)
2018-08-21 15:26:35,482 INFO: puppetlabs-concat install type not set to source
2018-08-21 15:26:35,484 INFO: (0015 / 0081)
2018-08-21 15:26:35,498 INFO: puppetlabs-firewall install type not set to source
2018-08-21 15:26:35,500 INFO: (0016 / 0081)
2018-08-21 15:26:35,514 INFO: puppet-glance install type not set to source
2018-08-21 15:26:35,516 INFO: (0017 / 0081)
2018-08-21 15:26:35,530 INFO: puppet-gluster install type not set to source
2018-08-21 15:26:35,533 INFO: (0018 / 0081)
2018-08-21 15:26:35,547 INFO: puppetlabs-haproxy install type not set to source
2018-08-21 15:26:35,549 INFO: (0019 / 0081)
2018-08-21 15:26:35,563 INFO: puppet-heat install type not set to source
2018-08-21 15:26:35,565 INFO: (0020 / 0081)
2018-08-21 15:26:35,579 INFO: puppet-healthcheck install type not set to source
2018-08-21 15:26:35,581 INFO: (0021 / 0081)
2018-08-21 15:26:35,595 INFO: puppet-horizon install type not set to source
2018-08-21 15:26:35,597 INFO: (0022 / 0081)
2018-08-21 15:26:35,611 INFO: puppetlabs-inifile install type not set to source
2018-08-21 15:26:35,613 INFO: (0023 / 0081)
2018-08-21 15:26:35,627 INFO: puppet-kafka install type not set to source
2018-08-21 15:26:35,629 INFO: (0024 / 0081)
2018-08-21 15:26:35,643 INFO: puppet-keystone install type not set to source
2018-08-21 15:26:35,645 INFO: (0025 / 0081)
2018-08-21 15:26:35,659 INFO: puppet-manila install type not set to source
2018-08-21 15:26:35,661 INFO: (0026 / 0081)
2018-08-21 15:26:35,675 INFO: puppet-memcached install type not set to source
2018-08-21 15:26:35,677 INFO: (0027 / 0081)
2018-08-21 15:26:35,691 INFO: puppet-mistral install type not set to source
2018-08-21 15:26:35,693 INFO: (0028 / 0081)
2018-08-21 15:26:35,707 INFO: puppetlabs-mongodb install type not set to source
2018-08-21 15:26:35,709 INFO: (0029 / 0081)
2018-08-21 15:26:35,723 INFO: puppetlabs-mysql install type not set to source
2018-08-21 15:26:35,725 INFO: (0030 / 0081)
2018-08-21 15:26:35,739 INFO: puppet-neutron install type not set to source
2018-08-21 15:26:35,741 INFO: (0031 / 0081)
2018-08-21 15:26:35,755 INFO: puppet-nova install type not set to source
2018-08-21 15:26:35,757 INFO: (0032 / 0081)
2018-08-21 15:26:35,771 INFO: puppet-octavia install type not set to source
2018-08-21 15:26:35,773 INFO: (0033 / 0081)
2018-08-21 15:26:35,787 INFO: puppet-oslo install type not set to source
2018-08-21 15:26:35,789 INFO: (0034 / 0081)
2018-08-21 15:26:35,803 INFO: puppet-nssdb install type not set to source
2018-08-21 15:26:35,806 INFO: (0035 / 0081)
2018-08-21 15:26:35,820 INFO: puppet-opendaylight install type not set to source
2018-08-21 15:26:35,822 INFO: (0036 / 0081)
2018-08-21 15:26:35,836 INFO: puppet-ovn install type not set to source
2018-08-21 15:26:35,838 INFO: (0037 / 0081)
2018-08-21 15:26:35,852 INFO: puppet-panko install type not set to source
2018-08-21 15:26:35,854 INFO: (0038 / 0081)
2018-08-21 15:26:35,867 INFO: puppet-puppet install type not set to source
2018-08-21 15:26:35,870 INFO: (0039 / 0081)
2018-08-21 15:26:35,884 INFO: puppetlabs-rabbitmq install type not set to source
2018-08-21 15:26:35,886 INFO: (0040 / 0081)
2018-08-21 15:26:35,900 INFO: puppet-redis install type not set to source
2018-08-21 15:26:35,902 INFO: (0041 / 0081)
2018-08-21 15:26:35,916 INFO: puppetlabs-rsync install type not set to source
2018-08-21 15:26:35,918 INFO: (0042 / 0081)
2018-08-21 15:26:35,932 INFO: puppet-sahara install type not set to source
2018-08-21 15:26:35,934 INFO: (0043 / 0081)
2018-08-21 15:26:35,948 INFO: sensu-puppet install type not set to source
2018-08-21 15:26:35,950 INFO: (0044 / 0081)
2018-08-21 15:26:35,964 INFO: puppet-tacker install type not set to source
2018-08-21 15:26:35,966 INFO: (0045 / 0081)
2018-08-21 15:26:35,980 INFO: puppet-trove install type not set to source
2018-08-21 15:26:35,982 INFO: (0046 / 0081)
2018-08-21 15:26:35,996 INFO: puppet-ssh install type not set to source
2018-08-21 15:26:35,998 INFO: (0047 / 0081)
2018-08-21 15:26:36,012 INFO: puppet-staging install type not set to source
2018-08-21 15:26:36,014 INFO: (0048 / 0081)
2018-08-21 15:26:36,028 INFO: puppetlabs-stdlib install type not set to source
2018-08-21 15:26:36,030 INFO: (0049 / 0081)
2018-08-21 15:26:36,044 INFO: puppet-swift install type not set to source
2018-08-21 15:26:36,046 INFO: (0050 / 0081)
2018-08-21 15:26:36,060 INFO: puppetlabs-sysctl install type not set to source
2018-08-21 15:26:36,062 INFO: (0051 / 0081)
2018-08-21 15:26:36,076 INFO: puppet-timezone install type not set to source
2018-08-21 15:26:36,078 INFO: (0052 / 0081)
2018-08-21 15:26:36,092 INFO: puppet-uchiwa install type not set to source
2018-08-21 15:26:36,094 INFO: (0053 / 0081)
2018-08-21 15:26:36,109 INFO: puppetlabs-vcsrepo install type not set to source
2018-08-21 15:26:36,111 INFO: (0054 / 0081)
2018-08-21 15:26:36,125 INFO: puppet-vlan install type not set to source
2018-08-21 15:26:36,127 INFO: (0055 / 0081)
2018-08-21 15:26:36,141 INFO: puppet-vswitch install type not set to source
2018-08-21 15:26:36,143 INFO: (0056 / 0081)
2018-08-21 15:26:36,157 INFO: puppetlabs-xinetd install type not set to source
2018-08-21 15:26:36,159 INFO: (0057 / 0081)
2018-08-21 15:26:36,173 INFO: puppet-zookeeper install type not set to source
2018-08-21 15:26:36,175 INFO: (0058 / 0081)
2018-08-21 15:26:36,189 INFO: puppet-openstacklib install type not set to source
2018-08-21 15:26:36,192 INFO: (0059 / 0081)
2018-08-21 15:26:36,206 INFO: puppet-module-keepalived install type not set to source
2018-08-21 15:26:36,208 INFO: (0060 / 0081)
2018-08-21 15:26:36,221 INFO: puppetlabs-ntp install type not set to source
2018-08-21 15:26:36,224 INFO: (0061 / 0081)
2018-08-21 15:26:36,237 INFO: puppet-snmp install type not set to source
2018-08-21 15:26:36,240 INFO: (0062 / 0081)
2018-08-21 15:26:36,253 INFO: puppet-tripleo install type not set to source
2018-08-21 15:26:36,256 INFO: (0063 / 0081)
2018-08-21 15:26:36,269 INFO: puppet-ironic install type not set to source
2018-08-21 15:26:36,272 INFO: (0064 / 0081)
2018-08-21 15:26:36,285 INFO: puppet-ipaclient install type not set to source
2018-08-21 15:26:36,288 INFO: (0065 / 0081)
2018-08-21 15:26:36,302 INFO: puppetlabs-corosync install type not set to source
2018-08-21 15:26:36,304 INFO: (0066 / 0081)
2018-08-21 15:26:36,318 INFO: puppet-pacemaker install type not set to source
2018-08-21 15:26:36,320 INFO: (0067 / 0081)
2018-08-21 15:26:36,334 INFO: puppet_aviator install type not set to source
2018-08-21 15:26:36,336 INFO: (0068 / 0081)
2018-08-21 15:26:36,350 INFO: puppet-openstack_extras install type not set to source
2018-08-21 15:26:36,352 INFO: (0069 / 0081)
2018-08-21 15:26:36,366 INFO: konstantin-fluentd install type not set to source
2018-08-21 15:26:36,368 INFO: (0070 / 0081)
2018-08-21 15:26:36,382 INFO: puppet-elasticsearch install type not set to source
2018-08-21 15:26:36,385 INFO: (0071 / 0081)
2018-08-21 15:26:36,398 INFO: puppet-kibana3 install type not set to source
2018-08-21 15:26:36,401 INFO: (0072 / 0081)
2018-08-21 15:26:36,414 INFO: puppetlabs-git install type not set to source
2018-08-21 15:26:36,417 INFO: (0073 / 0081)
2018-08-21 15:26:36,431 INFO: puppet-datacat install type not set to source
2018-08-21 15:26:36,433 INFO: (0074 / 0081)
2018-08-21 15:26:36,447 INFO: puppet-kmod install type not set to source
2018-08-21 15:26:36,449 INFO: (0075 / 0081)
2018-08-21 15:26:36,463 INFO: puppet-zaqar install type not set to source
2018-08-21 15:26:36,465 INFO: (0076 / 0081)
2018-08-21 15:26:36,479 INFO: puppet-ec2api install type not set to source
2018-08-21 15:26:36,481 INFO: (0077 / 0081)
2018-08-21 15:26:36,495 INFO: puppet-qdr install type not set to source
2018-08-21 15:26:36,497 INFO: (0078 / 0081)
2018-08-21 15:26:36,511 INFO: puppet-systemd install type not set to source
2018-08-21 15:26:36,513 INFO: (0079 / 0081)
2018-08-21 15:26:36,527 INFO: puppet-etcd install type not set to source
2018-08-21 15:26:36,529 INFO: (0080 / 0081)
2018-08-21 15:26:36,543 INFO: puppet-veritas_hyperscale install type not set to source
2018-08-21 15:26:36,545 INFO: (0081 / 0081)
2018-08-21 15:26:36,559 INFO: puppet-ptp install type not set to source
2018-08-21 15:26:36,565 INFO: dib-run-parts 98-source-repositories completed
2018-08-21 15:26:36,566 INFO: dib-run-parts Running /tmp/tmpTPs6_V/extra-data.d/99-enable-install-types
2018-08-21 15:26:36,576 INFO: + set -eu
2018-08-21 15:26:36,577 INFO: + set -o pipefail
2018-08-21 15:26:36,577 INFO: + declare -a SPECIFIED_ELEMS
2018-08-21 15:26:36,578 INFO: + SPECIFIED_ELEMS[0]=
2018-08-21 15:26:36,578 INFO: + PREFIX=DIB_INSTALLTYPE_
2018-08-21 15:26:36,579 INFO: ++ env
2018-08-21 15:26:36,580 INFO: ++ grep '^DIB_INSTALLTYPE_'
2018-08-21 15:26:36,580 INFO: ++ cut -d= -f1
2018-08-21 15:26:36,585 INFO: ++ echo ''
2018-08-21 15:26:36,586 INFO: + INSTALL_TYPE_VARS=
2018-08-21 15:26:36,588 INFO: ++ find /tmp/tmpTPs6_V/install.d -maxdepth 1 -name '*-package-install' -type d
2018-08-21 15:26:36,593 INFO: + default_install_type_dirs=/tmp/tmpTPs6_V/install.d/puppet-modules-package-install
2018-08-21 15:26:36,593 INFO: + for _install_dir in '$default_install_type_dirs'
2018-08-21 15:26:36,594 INFO: + SUFFIX=-package-install
2018-08-21 15:26:36,594 INFO: ++ basename /tmp/tmpTPs6_V/install.d/puppet-modules-package-install
2018-08-21 15:26:36,597 INFO: + _install_dir=puppet-modules-package-install
2018-08-21 15:26:36,598 INFO: + INSTALLDIRPREFIX=puppet-modules
2018-08-21 15:26:36,598 INFO: + found=0
2018-08-21 15:26:36,598 INFO: + '[' 0 = 0 ']'
2018-08-21 15:26:36,599 INFO: + pushd /tmp/tmpTPs6_V/install.d
2018-08-21 15:26:36,599 INFO: /tmp/tmpTPs6_V/install.d /home/stack
2018-08-21 15:26:36,600 INFO: + ln -sf puppet-modules-package-install/75-puppet-modules-package .
2018-08-21 15:26:36,601 INFO: + popd
2018-08-21 15:26:36,602 INFO: /home/stack
2018-08-21 15:26:36,606 INFO: dib-run-parts 99-enable-install-types completed
2018-08-21 15:26:36,607 INFO: dib-run-parts ----------------------- PROFILING -----------------------
2018-08-21 15:26:36,607 INFO: dib-run-parts
2018-08-21 15:26:36,611 INFO: dib-run-parts Target: extra-data.d
2018-08-21 15:26:36,611 INFO: dib-run-parts
2018-08-21 15:26:36,612 INFO: dib-run-parts Script Seconds
2018-08-21 15:26:36,612 INFO: dib-run-parts --------------------------------------- ----------
2018-08-21 15:26:36,613 INFO: dib-run-parts
2018-08-21 15:26:36,634 INFO: dib-run-parts 10-install-git 3.326
2018-08-21 15:26:36,649 INFO: dib-run-parts 20-manifest-dir 0.047
2018-08-21 15:26:36,664 INFO: dib-run-parts 75-inject-element-manifest 0.088
2018-08-21 15:26:36,679 INFO: dib-run-parts 98-source-repositories 1.372
2018-08-21 15:26:36,694 INFO: dib-run-parts 99-enable-install-types 0.037
2018-08-21 15:26:36,699 INFO: dib-run-parts
2018-08-21 15:26:36,700 INFO: dib-run-parts --------------------- END PROFILING ---------------------
2018-08-21 15:26:36,701 INFO: INFO: 2018-08-21 15:26:36,700 -- ############### End stdout/stderr logging ###############
2018-08-21 15:26:36,702 INFO: INFO: 2018-08-21 15:26:36,701 -- Running hook pre-install
2018-08-21 15:26:36,702 INFO: INFO: 2018-08-21 15:26:36,702 -- Skipping hook pre-install, the hook directory doesn't exist at /tmp/tmpTPs6_V/pre-install.d
2018-08-21 15:26:36,703 INFO: INFO: 2018-08-21 15:26:36,702 -- Running hook install
2018-08-21 15:26:36,704 INFO: INFO: 2018-08-21 15:26:36,703 -- ############### Begin stdout/stderr logging ###############
2018-08-21 15:26:36,743 INFO: dib-run-parts Sourcing environment file /tmp/tmpTPs6_V/install.d/../environment.d/00-dib-v2-env
2018-08-21 15:26:36,750 INFO: + source /tmp/tmpTPs6_V/install.d/../environment.d/00-dib-v2-env
2018-08-21 15:26:36,751 INFO: ++ export 'IMAGE_ELEMENT=epel undercloud-install dib-python source-repositories install-types install-bin pip-manifest pkg-map puppet-stack-config os-refresh-config element-manifest manifests pip-and-virtualenv cache-url puppet enable-packages-install puppet-modules os-apply-config hiera package-installs'
2018-08-21 15:26:36,752 INFO: ++ IMAGE_ELEMENT='epel undercloud-install dib-python source-repositories install-types install-bin pip-manifest pkg-map puppet-stack-config os-refresh-config element-manifest manifests pip-and-virtualenv cache-url puppet enable-packages-install puppet-modules os-apply-config hiera package-installs'
2018-08-21 15:26:36,752 INFO: ++ export 'IMAGE_ELEMENT_YAML={cache-url: /usr/share/diskimage-builder/elements/cache-url, dib-python: /usr/share/diskimage-builder/elements/dib-python,
2018-08-21 15:26:36,753 INFO: element-manifest: /usr/share/diskimage-builder/elements/element-manifest, enable-packages-install: /usr/share/tripleo-image-elements/enable-packages-install,
2018-08-21 15:26:36,754 INFO: epel: /usr/share/diskimage-builder/elements/epel, hiera: /usr/share/tripleo-puppet-elements/hiera,
2018-08-21 15:26:36,755 INFO: install-bin: /usr/share/diskimage-builder/elements/install-bin, install-types: /usr/share/diskimage-builder/elements/install-types,
2018-08-21 15:26:36,756 INFO: manifests: /usr/share/diskimage-builder/elements/manifests, os-apply-config: /usr/share/tripleo-image-elements/os-apply-config,
2018-08-21 15:26:36,756 INFO: os-refresh-config: /usr/share/tripleo-image-elements/os-refresh-config, package-installs: /usr/share/diskimage-builder/elements/package-installs,
2018-08-21 15:26:36,757 INFO: pip-and-virtualenv: /usr/share/diskimage-builder/elements/pip-and-virtualenv, pip-manifest: /usr/share/tripleo-image-elements/pip-manifest,
2018-08-21 15:26:36,757 INFO: pkg-map: /usr/share/diskimage-builder/elements/pkg-map, puppet: /usr/share/tripleo-puppet-elements/puppet,
2018-08-21 15:26:36,757 INFO: puppet-modules: /usr/share/tripleo-puppet-elements/puppet-modules, puppet-stack-config: /usr/share/instack-undercloud/puppet-stack-config,
2018-08-21 15:26:36,758 INFO: source-repositories: /usr/share/diskimage-builder/elements/source-repositories,
2018-08-21 15:26:36,758 INFO: undercloud-install: /usr/share/instack-undercloud/undercloud-install}
2018-08-21 15:26:36,759 INFO: '
2018-08-21 15:26:36,759 INFO: ++ IMAGE_ELEMENT_YAML='{cache-url: /usr/share/diskimage-builder/elements/cache-url, dib-python: /usr/share/diskimage-builder/elements/dib-python,
2018-08-21 15:26:36,760 INFO: element-manifest: /usr/share/diskimage-builder/elements/element-manifest, enable-packages-install: /usr/share/tripleo-image-elements/enable-packages-install,
2018-08-21 15:26:36,761 INFO: epel: /usr/share/diskimage-builder/elements/epel, hiera: /usr/share/tripleo-puppet-elements/hiera,
2018-08-21 15:26:36,761 INFO: install-bin: /usr/share/diskimage-builder/elements/install-bin, install-types: /usr/share/diskimage-builder/elements/install-types,
2018-08-21 15:26:36,762 INFO: manifests: /usr/share/diskimage-builder/elements/manifests, os-apply-config: /usr/share/tripleo-image-elements/os-apply-config,
2018-08-21 15:26:36,763 INFO: os-refresh-config: /usr/share/tripleo-image-elements/os-refresh-config, package-installs: /usr/share/diskimage-builder/elements/package-installs,
2018-08-21 15:26:36,763 INFO: pip-and-virtualenv: /usr/share/diskimage-builder/elements/pip-and-virtualenv, pip-manifest: /usr/share/tripleo-image-elements/pip-manifest,
2018-08-21 15:26:36,764 INFO: pkg-map: /usr/share/diskimage-builder/elements/pkg-map, puppet: /usr/share/tripleo-puppet-elements/puppet,
2018-08-21 15:26:36,765 INFO: puppet-modules: /usr/share/tripleo-puppet-elements/puppet-modules, puppet-stack-config: /usr/share/instack-undercloud/puppet-stack-config,
2018-08-21 15:26:36,765 INFO: source-repositories: /usr/share/diskimage-builder/elements/source-repositories,
2018-08-21 15:26:36,766 INFO: undercloud-install: /usr/share/instack-undercloud/undercloud-install}
2018-08-21 15:26:36,766 INFO: '
2018-08-21 15:26:36,767 INFO: ++ export -f get_image_element_array
2018-08-21 15:26:36,767 INFO: + set +o xtrace
2018-08-21 15:26:36,768 INFO: dib-run-parts Sourcing environment file /tmp/tmpTPs6_V/install.d/../environment.d/01-export-install-types.bash
2018-08-21 15:26:36,768 INFO: + source /tmp/tmpTPs6_V/install.d/../environment.d/01-export-install-types.bash
2018-08-21 15:26:36,769 INFO: ++ export DIB_DEFAULT_INSTALLTYPE=package
2018-08-21 15:26:36,769 INFO: ++ DIB_DEFAULT_INSTALLTYPE=package
2018-08-21 15:26:36,769 INFO: + set +o xtrace
2018-08-21 15:26:36,770 INFO: dib-run-parts Sourcing environment file /tmp/tmpTPs6_V/install.d/../environment.d/01-puppet-module-pins.sh
2018-08-21 15:26:36,771 INFO: + source /tmp/tmpTPs6_V/install.d/../environment.d/01-puppet-module-pins.sh
2018-08-21 15:26:36,771 INFO: ++ export DIB_REPOREF_puppetlabs_ntp=4.2.x
2018-08-21 15:26:36,772 INFO: ++ DIB_REPOREF_puppetlabs_ntp=4.2.x
2018-08-21 15:26:36,772 INFO: + set +o xtrace
2018-08-21 15:26:36,773 INFO: dib-run-parts Sourcing environment file /tmp/tmpTPs6_V/install.d/../environment.d/02-puppet-modules-install-types.sh
2018-08-21 15:26:36,775 INFO: + source /tmp/tmpTPs6_V/install.d/../environment.d/02-puppet-modules-install-types.sh
2018-08-21 15:26:36,776 INFO: ++ DIB_DEFAULT_INSTALLTYPE=package
2018-08-21 15:26:36,777 INFO: ++ DIB_INSTALLTYPE_puppet_modules=package
2018-08-21 15:26:36,777 INFO: ++ '[' package = source ']'
2018-08-21 15:26:36,777 INFO: + set +o xtrace
2018-08-21 15:26:36,778 INFO: dib-run-parts Sourcing environment file /tmp/tmpTPs6_V/install.d/../environment.d/10-os-apply-config-venv-dir.bash
2018-08-21 15:26:36,784 INFO: + source /tmp/tmpTPs6_V/install.d/../environment.d/10-os-apply-config-venv-dir.bash
2018-08-21 15:26:36,785 INFO: ++ '[' -z '' ']'
2018-08-21 15:26:36,785 INFO: ++ export
OS_APPLY_CONFIG_VENV_DIR=/opt/stack/venvs/os-apply-config >2018-08-21 15:26:36,786 INFO: ++ OS_APPLY_CONFIG_VENV_DIR=/opt/stack/venvs/os-apply-config >2018-08-21 15:26:36,786 INFO: + set +o xtrace >2018-08-21 15:26:36,787 INFO: dib-run-parts Sourcing environment file /tmp/tmpTPs6_V/install.d/../environment.d/14-manifests >2018-08-21 15:26:36,793 INFO: + source /tmp/tmpTPs6_V/install.d/../environment.d/14-manifests >2018-08-21 15:26:36,793 INFO: ++ export DIB_MANIFEST_IMAGE_DIR=/etc/dib-manifests >2018-08-21 15:26:36,794 INFO: ++ DIB_MANIFEST_IMAGE_DIR=/etc/dib-manifests >2018-08-21 15:26:36,794 INFO: ++ export DIB_MANIFEST_SAVE_DIR=instack.d/ >2018-08-21 15:26:36,795 INFO: ++ DIB_MANIFEST_SAVE_DIR=instack.d/ >2018-08-21 15:26:36,795 INFO: + set +o xtrace >2018-08-21 15:26:36,796 INFO: dib-run-parts Running /tmp/tmpTPs6_V/install.d/02-puppet-stack-config >2018-08-21 15:26:39,724 INFO: dib-run-parts 02-puppet-stack-config completed >2018-08-21 15:26:39,725 INFO: dib-run-parts Running /tmp/tmpTPs6_V/install.d/10-hiera-yaml-symlink >2018-08-21 15:26:39,736 INFO: + set -o pipefail >2018-08-21 15:26:39,736 INFO: + ln -f -s /etc/puppet/hiera.yaml /etc/hiera.yaml >2018-08-21 15:26:39,745 INFO: dib-run-parts 10-hiera-yaml-symlink completed >2018-08-21 15:26:39,745 INFO: dib-run-parts Running /tmp/tmpTPs6_V/install.d/10-puppet-stack-config-puppet-module >2018-08-21 15:26:39,756 INFO: + set -o pipefail >2018-08-21 15:26:39,757 INFO: + mkdir -p /etc/puppet/manifests >2018-08-21 15:26:39,762 INFO: ++ dirname /tmp/tmpTPs6_V/install.d/10-puppet-stack-config-puppet-module >2018-08-21 15:26:39,765 INFO: + cp /tmp/tmpTPs6_V/install.d/../puppet-stack-config.pp /etc/puppet/manifests/puppet-stack-config.pp >2018-08-21 15:26:39,774 INFO: dib-run-parts 10-puppet-stack-config-puppet-module completed >2018-08-21 15:26:39,775 INFO: dib-run-parts Running /tmp/tmpTPs6_V/install.d/11-create-template-root >2018-08-21 15:26:39,788 INFO: ++ os-apply-config --print-templates >2018-08-21 
15:26:40,276 INFO: + TEMPLATE_ROOT=/usr/libexec/os-apply-config/templates >2018-08-21 15:26:40,277 INFO: + mkdir -p /usr/libexec/os-apply-config/templates >2018-08-21 15:26:40,292 INFO: dib-run-parts 11-create-template-root completed >2018-08-21 15:26:40,292 INFO: dib-run-parts Running /tmp/tmpTPs6_V/install.d/11-hiera-orc-install >2018-08-21 15:26:40,303 INFO: + set -o pipefail >2018-08-21 15:26:40,304 INFO: + mkdir -p /usr/libexec/os-refresh-config/configure.d/ >2018-08-21 15:26:40,310 INFO: ++ dirname /tmp/tmpTPs6_V/install.d/11-hiera-orc-install >2018-08-21 15:26:40,313 INFO: + install -m 0755 -o root -g root /tmp/tmpTPs6_V/install.d/../10-hiera-disable /usr/libexec/os-refresh-config/configure.d/10-hiera-disable >2018-08-21 15:26:40,344 INFO: ++ dirname /tmp/tmpTPs6_V/install.d/11-hiera-orc-install >2018-08-21 15:26:40,347 INFO: + install -m 0755 -o root -g root /tmp/tmpTPs6_V/install.d/../40-hiera-datafiles /usr/libexec/os-refresh-config/configure.d/40-hiera-datafiles >2018-08-21 15:26:40,370 INFO: dib-run-parts 11-hiera-orc-install completed >2018-08-21 15:26:40,371 INFO: dib-run-parts Running /tmp/tmpTPs6_V/install.d/75-puppet-modules-package >2018-08-21 15:26:40,382 INFO: + find /opt/stack/puppet-modules/ -mindepth 1 >2018-08-21 15:26:40,384 INFO: + read >2018-08-21 15:26:40,397 INFO: + ln -f -s /usr/share/openstack-puppet/modules/aodh /usr/share/openstack-puppet/modules/apache /usr/share/openstack-puppet/modules/archive /usr/share/openstack-puppet/modules/auditd /usr/share/openstack-puppet/modules/barbican /usr/share/openstack-puppet/modules/cassandra /usr/share/openstack-puppet/modules/ceilometer /usr/share/openstack-puppet/modules/ceph /usr/share/openstack-puppet/modules/certmonger /usr/share/openstack-puppet/modules/cinder /usr/share/openstack-puppet/modules/collectd /usr/share/openstack-puppet/modules/concat /usr/share/openstack-puppet/modules/contrail /usr/share/openstack-puppet/modules/corosync /usr/share/openstack-puppet/modules/datacat 
/usr/share/openstack-puppet/modules/designate /usr/share/openstack-puppet/modules/dns /usr/share/openstack-puppet/modules/ec2api /usr/share/openstack-puppet/modules/elasticsearch /usr/share/openstack-puppet/modules/fdio /usr/share/openstack-puppet/modules/firewall /usr/share/openstack-puppet/modules/fluentd /usr/share/openstack-puppet/modules/git /usr/share/openstack-puppet/modules/glance /usr/share/openstack-puppet/modules/gnocchi /usr/share/openstack-puppet/modules/haproxy /usr/share/openstack-puppet/modules/heat /usr/share/openstack-puppet/modules/horizon /usr/share/openstack-puppet/modules/inifile /usr/share/openstack-puppet/modules/ipaclient /usr/share/openstack-puppet/modules/ironic /usr/share/openstack-puppet/modules/java /usr/share/openstack-puppet/modules/kafka /usr/share/openstack-puppet/modules/keepalived /usr/share/openstack-puppet/modules/keystone /usr/share/openstack-puppet/modules/kibana3 /usr/share/openstack-puppet/modules/kmod /usr/share/openstack-puppet/modules/manila /usr/share/openstack-puppet/modules/memcached /usr/share/openstack-puppet/modules/midonet /usr/share/openstack-puppet/modules/mistral /usr/share/openstack-puppet/modules/module-data /usr/share/openstack-puppet/modules/mysql /usr/share/openstack-puppet/modules/n1k_vsm /usr/share/openstack-puppet/modules/neutron /usr/share/openstack-puppet/modules/nova /usr/share/openstack-puppet/modules/nssdb /usr/share/openstack-puppet/modules/ntp /usr/share/openstack-puppet/modules/octavia /usr/share/openstack-puppet/modules/opendaylight /usr/share/openstack-puppet/modules/openstack_extras /usr/share/openstack-puppet/modules/openstacklib /usr/share/openstack-puppet/modules/oslo /usr/share/openstack-puppet/modules/ovn /usr/share/openstack-puppet/modules/pacemaker /usr/share/openstack-puppet/modules/panko /usr/share/openstack-puppet/modules/rabbitmq /usr/share/openstack-puppet/modules/redis /usr/share/openstack-puppet/modules/remote /usr/share/openstack-puppet/modules/rsync 
/usr/share/openstack-puppet/modules/sahara /usr/share/openstack-puppet/modules/sensu /usr/share/openstack-puppet/modules/snmp /usr/share/openstack-puppet/modules/ssh /usr/share/openstack-puppet/modules/staging /usr/share/openstack-puppet/modules/stdlib /usr/share/openstack-puppet/modules/swift /usr/share/openstack-puppet/modules/sysctl /usr/share/openstack-puppet/modules/systemd /usr/share/openstack-puppet/modules/timezone /usr/share/openstack-puppet/modules/tomcat /usr/share/openstack-puppet/modules/tripleo /usr/share/openstack-puppet/modules/trove /usr/share/openstack-puppet/modules/uchiwa /usr/share/openstack-puppet/modules/vcsrepo /usr/share/openstack-puppet/modules/veritas_hyperscale /usr/share/openstack-puppet/modules/vswitch /usr/share/openstack-puppet/modules/xinetd /usr/share/openstack-puppet/modules/zaqar /usr/share/openstack-puppet/modules/zookeeper /etc/puppet/modules/ >2018-08-21 15:26:40,405 INFO: dib-run-parts 75-puppet-modules-package completed >2018-08-21 15:26:40,405 INFO: dib-run-parts Running /tmp/tmpTPs6_V/install.d/99-install-config-templates >2018-08-21 15:26:40,418 INFO: ++ os-apply-config --print-templates >2018-08-21 15:26:40,908 INFO: + TEMPLATE_ROOT=/usr/libexec/os-apply-config/templates >2018-08-21 15:26:40,909 INFO: ++ dirname /tmp/tmpTPs6_V/install.d/99-install-config-templates >2018-08-21 15:26:40,912 INFO: + TEMPLATE_SOURCE=/tmp/tmpTPs6_V/install.d/../os-apply-config >2018-08-21 15:26:40,913 INFO: + mkdir -p /usr/libexec/os-apply-config/templates >2018-08-21 15:26:40,916 INFO: + '[' -d /tmp/tmpTPs6_V/install.d/../os-apply-config ']' >2018-08-21 15:26:40,917 INFO: + rsync '--exclude=.*.swp' -Cr /tmp/tmpTPs6_V/install.d/../os-apply-config/ /usr/libexec/os-apply-config/templates/ >2018-08-21 15:26:40,935 INFO: dib-run-parts 99-install-config-templates completed >2018-08-21 15:26:40,936 INFO: dib-run-parts Running /tmp/tmpTPs6_V/install.d/99-os-refresh-config-install-scripts >2018-08-21 15:26:40,948 INFO: ++ os-refresh-config 
--print-base >2018-08-21 15:26:41,176 INFO: + SCRIPT_BASE=/usr/libexec/os-refresh-config >2018-08-21 15:26:41,178 INFO: ++ dirname /tmp/tmpTPs6_V/install.d/99-os-refresh-config-install-scripts >2018-08-21 15:26:41,180 INFO: + SCRIPT_SOURCE=/tmp/tmpTPs6_V/install.d/../os-refresh-config >2018-08-21 15:26:41,181 INFO: + rsync -r /tmp/tmpTPs6_V/install.d/../os-refresh-config/ /usr/libexec/os-refresh-config/ >2018-08-21 15:26:41,196 INFO: dib-run-parts 99-os-refresh-config-install-scripts completed >2018-08-21 15:26:41,198 INFO: dib-run-parts ----------------------- PROFILING ----------------------- >2018-08-21 15:26:41,198 INFO: dib-run-parts >2018-08-21 15:26:41,201 INFO: dib-run-parts Target: install.d >2018-08-21 15:26:41,202 INFO: dib-run-parts >2018-08-21 15:26:41,202 INFO: dib-run-parts Script Seconds >2018-08-21 15:26:41,203 INFO: dib-run-parts --------------------------------------- ---------- >2018-08-21 15:26:41,203 INFO: dib-run-parts >2018-08-21 15:26:41,225 INFO: dib-run-parts 02-puppet-stack-config 2.925 >2018-08-21 15:26:41,240 INFO: dib-run-parts 10-hiera-yaml-symlink 0.016 >2018-08-21 15:26:41,255 INFO: dib-run-parts 10-puppet-stack-config-puppet-module 0.026 >2018-08-21 15:26:41,270 INFO: dib-run-parts 11-create-template-root 0.507 >2018-08-21 15:26:41,286 INFO: dib-run-parts 11-hiera-orc-install 0.074 >2018-08-21 15:26:41,301 INFO: dib-run-parts 75-puppet-modules-package 0.030 >2018-08-21 15:26:41,316 INFO: dib-run-parts 99-install-config-templates 0.526 >2018-08-21 15:26:41,331 INFO: dib-run-parts 99-os-refresh-config-install-scripts 0.257 >2018-08-21 15:26:41,337 INFO: dib-run-parts >2018-08-21 15:26:41,338 INFO: dib-run-parts --------------------- END PROFILING --------------------- >2018-08-21 15:26:41,339 INFO: INFO: 2018-08-21 15:26:41,339 -- ############### End stdout/stderr logging ############### >2018-08-21 15:26:41,340 INFO: INFO: 2018-08-21 15:26:41,340 -- Running hook post-install >2018-08-21 15:26:41,341 INFO: INFO: 2018-08-21 
15:26:41,340 -- Skipping hook post-install, the hook directory doesn't exist at /tmp/tmpTPs6_V/post-install.d >2018-08-21 15:26:41,351 INFO: INFO: 2018-08-21 15:26:41,350 -- Ending run of instack. >2018-08-21 15:26:41,386 INFO: Instack completed successfully >2018-08-21 15:26:41,387 INFO: Running os-refresh-config >2018-08-21 15:26:41,639 INFO: [2018-08-21 15:26:41,638] (os-refresh-config) [INFO] Starting phase configure >2018-08-21 15:26:41,669 INFO: dib-run-parts Tue Aug 21 15:26:41 IDT 2018 Running /usr/libexec/os-refresh-config/configure.d/10-hiera-disable >2018-08-21 15:26:41,677 INFO: + '[' -f /etc/puppet/hiera.yaml ']' >2018-08-21 15:26:41,684 INFO: dib-run-parts Tue Aug 21 15:26:41 IDT 2018 10-hiera-disable completed >2018-08-21 15:26:41,687 INFO: dib-run-parts Tue Aug 21 15:26:41 IDT 2018 Running /usr/libexec/os-refresh-config/configure.d/20-os-apply-config >2018-08-21 15:26:42,159 INFO: [2018/08/21 03:26:42 PM] [WARNING] DEPRECATED: falling back to /var/run/os-collect-config/os_config_files.json >2018-08-21 15:26:42,184 INFO: [2018/08/21 03:26:42 PM] [INFO] writing /etc/os-net-config/config.json >2018-08-21 15:26:42,186 INFO: [2018/08/21 03:26:42 PM] [INFO] writing /root/stackrc >2018-08-21 15:26:42,187 INFO: [2018/08/21 03:26:42 PM] [INFO] writing /root/tripleo-undercloud-passwords >2018-08-21 15:26:42,188 INFO: [2018/08/21 03:26:42 PM] [INFO] writing /etc/puppet/hieradata/RedHat.yaml >2018-08-21 15:26:42,189 INFO: [2018/08/21 03:26:42 PM] [INFO] writing /etc/puppet/hiera.yaml >2018-08-21 15:26:42,190 INFO: [2018/08/21 03:26:42 PM] [INFO] writing /var/opt/undercloud-stack/masquerade >2018-08-21 15:26:42,196 INFO: [2018/08/21 03:26:42 PM] [INFO] writing /etc/puppet/hieradata/CentOS.yaml >2018-08-21 15:26:42,197 INFO: [2018/08/21 03:26:42 PM] [INFO] writing /var/run/heat-config/heat-config >2018-08-21 15:26:42,199 INFO: [2018/08/21 03:26:42 PM] [INFO] writing /etc/os-collect-config.conf >2018-08-21 15:26:42,200 INFO: [2018/08/21 03:26:42 PM] [INFO] success 
>2018-08-21 15:26:42,234 INFO: dib-run-parts Tue Aug 21 15:26:42 IDT 2018 20-os-apply-config completed
>2018-08-21 15:26:42,237 INFO: dib-run-parts Tue Aug 21 15:26:42 IDT 2018 Running /usr/libexec/os-refresh-config/configure.d/30-reload-keepalived
>2018-08-21 15:26:42,245 INFO: + systemctl is-enabled keepalived
>2018-08-21 15:26:42,269 INFO: Failed to get unit file state for keepalived.service: No such file or directory
>2018-08-21 15:26:42,277 INFO: dib-run-parts Tue Aug 21 15:26:42 IDT 2018 30-reload-keepalived completed
>2018-08-21 15:26:42,280 INFO: dib-run-parts Tue Aug 21 15:26:42 IDT 2018 Running /usr/libexec/os-refresh-config/configure.d/40-hiera-datafiles
>2018-08-21 15:26:42,759 INFO: [2018/08/21 03:26:42 PM] [WARNING] DEPRECATED: falling back to /var/run/os-collect-config/os_config_files.json
>2018-08-21 15:26:42,817 INFO: dib-run-parts Tue Aug 21 15:26:42 IDT 2018 40-hiera-datafiles completed
>2018-08-21 15:26:42,820 INFO: dib-run-parts Tue Aug 21 15:26:42 IDT 2018 Running /usr/libexec/os-refresh-config/configure.d/50-puppet-stack-config
>2018-08-21 15:26:42,828 INFO: + set -o pipefail
>2018-08-21 15:26:42,828 INFO: + puppet_apply puppet apply --summarize --detailed-exitcodes /etc/puppet/manifests/puppet-stack-config.pp
>2018-08-21 15:26:42,829 INFO: + set +e
>2018-08-21 15:26:42,829 INFO: + puppet apply --summarize --detailed-exitcodes /etc/puppet/manifests/puppet-stack-config.pp
>2018-08-21 15:26:58,739 INFO: Notice: hiera(): Cannot load backend module_data: cannot load such file -- hiera/backend/module_data_backend
>2018-08-21 15:26:59,251 INFO: Warning: ModuleLoader: module 'openstacklib' has unresolved dependencies - it will only see those that are resolved. Use 'puppet module list --tree' to see information about modules
>2018-08-21 15:26:59,251 INFO: (file & line not available)
>2018-08-21 15:27:00,693 INFO: Notice: hiera(): Cannot load backend module_data: cannot load such file -- hiera/backend/module_data_backend
>2018-08-21 15:27:01,054 INFO: Warning: This method is deprecated, please use the stdlib validate_legacy function,
>2018-08-21 15:27:01,055 INFO: with Stdlib::Compat::Bool. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/ntp/manifests/init.pp", 54]:["/etc/puppet/modules/tripleo/manifests/profile/base/time/ntp.pp", 29]
>2018-08-21 15:27:01,055 INFO: (at /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:28:in `deprecation')
>2018-08-21 15:27:01,074 INFO: Warning: This method is deprecated, please use the stdlib validate_legacy function,
>2018-08-21 15:27:01,075 INFO: with Stdlib::Compat::Absolute_Path. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/ntp/manifests/init.pp", 55]:["/etc/puppet/modules/tripleo/manifests/profile/base/time/ntp.pp", 29]
>2018-08-21 15:27:01,075 INFO: (at /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:28:in `deprecation')
>2018-08-21 15:27:01,324 INFO: Warning: This method is deprecated, please use the stdlib validate_legacy function,
>2018-08-21 15:27:01,325 INFO: with Stdlib::Compat::String. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/ntp/manifests/init.pp", 56]:["/etc/puppet/modules/tripleo/manifests/profile/base/time/ntp.pp", 29]
>2018-08-21 15:27:01,326 INFO: (at /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:28:in `deprecation')
>2018-08-21 15:27:01,405 INFO: Warning: This method is deprecated, please use the stdlib validate_legacy function,
>2018-08-21 15:27:01,406 INFO: with Stdlib::Compat::Array. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/ntp/manifests/init.pp", 66]:["/etc/puppet/modules/tripleo/manifests/profile/base/time/ntp.pp", 29]
>2018-08-21 15:27:01,407 INFO: (at /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:28:in `deprecation')
>2018-08-21 15:27:01,422 INFO: Warning: This method is deprecated, please use the stdlib validate_legacy function,
>2018-08-21 15:27:01,423 INFO: with Pattern[]. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/ntp/manifests/init.pp", 68]:["/etc/puppet/modules/tripleo/manifests/profile/base/time/ntp.pp", 29]
>2018-08-21 15:27:01,424 INFO: (at /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:28:in `deprecation')
>2018-08-21 15:27:01,511 INFO: Warning: This method is deprecated, please use the stdlib validate_legacy function,
>2018-08-21 15:27:01,512 INFO: with Stdlib::Compat::Numeric. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/ntp/manifests/init.pp", 89]:["/etc/puppet/modules/tripleo/manifests/profile/base/time/ntp.pp", 29]
>2018-08-21 15:27:01,513 INFO: (at /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:28:in `deprecation')
>2018-08-21 15:27:02,850 INFO: Warning: This method is deprecated, please use match expressions with Stdlib::Compat::Ipv6 instead. They are described at https://docs.puppet.com/puppet/latest/reference/lang_data_type.html#match-expressions. at ["/etc/puppet/modules/rabbitmq/manifests/install/rabbitmqadmin.pp", 37]:["/etc/puppet/modules/rabbitmq/manifests/init.pp", 316]
>2018-08-21 15:27:02,851 INFO: (at /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:28:in `deprecation')
>2018-08-21 15:27:03,591 INFO: Notice: Scope(Class[Tripleo::Firewall::Post]): At this stage, all network traffic is blocked.
>2018-08-21 15:27:04,414 INFO: Warning: This method is deprecated, please use the stdlib validate_legacy function,
>2018-08-21 15:27:04,415 INFO: with Stdlib::Compat::Hash. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/tripleo/manifests/profile/base/database/mysql.pp", 103]:["/etc/puppet/manifests/puppet-stack-config.pp", 97]
>2018-08-21 15:27:04,415 INFO: (at /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:28:in `deprecation')
>2018-08-21 15:27:04,648 INFO: Warning: ModuleLoader: module 'mysql' has unresolved dependencies - it will only see those that are resolved. Use 'puppet module list --tree' to see information about modules
>2018-08-21 15:27:04,649 INFO: (file & line not available)
>2018-08-21 15:27:05,762 INFO: Warning: ModuleLoader: module 'keystone' has unresolved dependencies - it will only see those that are resolved. Use 'puppet module list --tree' to see information about modules
>2018-08-21 15:27:05,763 INFO: (file & line not available)
>2018-08-21 15:27:07,781 INFO: Warning: ModuleLoader: module 'glance' has unresolved dependencies - it will only see those that are resolved. Use 'puppet module list --tree' to see information about modules
>2018-08-21 15:27:07,781 INFO: (file & line not available)
>2018-08-21 15:27:08,463 INFO: Warning: ModuleLoader: module 'nova' has unresolved dependencies - it will only see those that are resolved. Use 'puppet module list --tree' to see information about modules
>2018-08-21 15:27:08,464 INFO: (file & line not available)
>2018-08-21 15:27:09,015 INFO: Warning: Unknown variable: '::nova::db::mysql_api::setup_cell0'. at /etc/puppet/modules/nova/manifests/db/mysql.pp:53:28
>2018-08-21 15:27:09,324 INFO: Warning: ModuleLoader: module 'neutron' has unresolved dependencies - it will only see those that are resolved. Use 'puppet module list --tree' to see information about modules
>2018-08-21 15:27:09,325 INFO: (file & line not available)
>2018-08-21 15:27:11,856 INFO: Warning: ModuleLoader: module 'heat' has unresolved dependencies - it will only see those that are resolved. Use 'puppet module list --tree' to see information about modules
>2018-08-21 15:27:11,856 INFO: (file & line not available)
>2018-08-21 15:27:12,132 INFO: Warning: ModuleLoader: module 'ironic' has unresolved dependencies - it will only see those that are resolved. Use 'puppet module list --tree' to see information about modules
>2018-08-21 15:27:12,133 INFO: (file & line not available)
>2018-08-21 15:27:12,734 INFO: Warning: ModuleLoader: module 'swift' has unresolved dependencies - it will only see those that are resolved. Use 'puppet module list --tree' to see information about modules
>2018-08-21 15:27:12,735 INFO: (file & line not available)
>2018-08-21 15:27:13,890 INFO: Warning: Scope(Class[Keystone]): keystone::rabbit_host, keystone::rabbit_hosts, keystone::rabbit_password, keystone::rabbit_port, keystone::rabbit_userid and keystone::rabbit_virtual_host are deprecated. Please use keystone::default_transport_url instead.
>2018-08-21 15:27:20,527 INFO: Warning: Scope(Class[Glance::Notify::Rabbitmq]): glance::notify::rabbitmq::rabbit_host, glance::notify::rabbitmq::rabbit_hosts, glance::notify::rabbitmq::rabbit_password, glance::notify::rabbitmq::rabbit_port, glance::notify::rabbitmq::rabbit_userid and glance::notify::rabbitmq::rabbit_virtual_host are deprecated. Please use glance::notify::rabbitmq::default_transport_url instead.
>2018-08-21 15:27:20,943 INFO: Warning: Scope(Class[Nova::Db]): placement_database_connection has no effect as of pike, and may be removed in a future release
>2018-08-21 15:27:20,944 INFO: Warning: Scope(Class[Nova::Db]): placement_slave_connection has no effect as of pike, and may be removed in a future release
>2018-08-21 15:27:21,922 INFO: Warning: ModuleLoader: module 'cinder' has unresolved dependencies - it will only see those that are resolved. Use 'puppet module list --tree' to see information about modules
>2018-08-21 15:27:21,923 INFO: (file & line not available)
>2018-08-21 15:27:23,459 INFO: Warning: Unknown variable: 'until_complete_real'. at /etc/puppet/modules/nova/manifests/cron/archive_deleted_rows.pp:77:82
>2018-08-21 15:27:23,666 INFO: Warning: This method is deprecated, please use match expressions with Stdlib::Compat::Array instead. They are described at https://docs.puppet.com/puppet/latest/reference/lang_data_type.html#match-expressions. at ["/etc/puppet/modules/nova/manifests/scheduler/filter.pp", 140]:["/etc/puppet/manifests/puppet-stack-config.pp", 396]
>2018-08-21 15:27:23,667 INFO: (at /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:28:in `deprecation')
>2018-08-21 15:27:24,634 INFO: Warning: Scope(Class[Neutron]): neutron::rabbit_host, neutron::rabbit_hosts, neutron::rabbit_password, neutron::rabbit_port, neutron::rabbit_user, neutron::rabbit_virtual_host and neutron::rpc_backend are deprecated. Please use neutron::default_transport_url instead.
>2018-08-21 15:27:28,958 INFO: Warning: Unknown variable: 'methods_real'. at /etc/puppet/modules/swift/manifests/proxy/tempurl.pp:100:56
>2018-08-21 15:27:28,959 INFO: Warning: Unknown variable: 'incoming_remove_headers_real'. at /etc/puppet/modules/swift/manifests/proxy/tempurl.pp:101:56
>2018-08-21 15:27:28,960 INFO: Warning: Unknown variable: 'incoming_allow_headers_real'. at /etc/puppet/modules/swift/manifests/proxy/tempurl.pp:102:56
>2018-08-21 15:27:28,961 INFO: Warning: Unknown variable: 'outgoing_remove_headers_real'. at /etc/puppet/modules/swift/manifests/proxy/tempurl.pp:103:56
>2018-08-21 15:27:28,963 INFO: Warning: Unknown variable: 'outgoing_allow_headers_real'. at /etc/puppet/modules/swift/manifests/proxy/tempurl.pp:104:56
>2018-08-21 15:27:29,506 INFO: Warning: Scope(Class[Swift::Storage::All]): The default port for the object storage server has changed from 6000 to 6200 and will be changed in a later release
>2018-08-21 15:27:29,507 INFO: Warning: Scope(Class[Swift::Storage::All]): The default port for the container storage server has changed from 6001 to 6201 and will be changed in a later release
>2018-08-21 15:27:29,508 INFO: Warning: Scope(Class[Swift::Storage::All]): The default port for the account storage server has changed from 6002 to 6202 and will be changed in a later release
>2018-08-21 15:27:31,481 INFO: Warning: This method is deprecated, please use the stdlib validate_legacy function,
>2018-08-21 15:27:31,482 INFO: with Stdlib::Compat::Integer. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/heat/manifests/wsgi/apache_api_cfn.pp", 125]:["/etc/puppet/manifests/puppet-stack-config.pp", 517]
>2018-08-21 15:27:31,483 INFO: (at /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:28:in `deprecation')
>2018-08-21 15:27:33,168 INFO: Warning: Unknown variable: '::ironic::conductor::swift_account'. at /etc/puppet/modules/ironic/manifests/glance.pp:117:30
>2018-08-21 15:27:33,170 INFO: Warning: Unknown variable: '::ironic::conductor::swift_temp_url_key'. at /etc/puppet/modules/ironic/manifests/glance.pp:118:35
>2018-08-21 15:27:33,172 INFO: Warning: Unknown variable: '::ironic::conductor::swift_temp_url_duration'. at /etc/puppet/modules/ironic/manifests/glance.pp:119:40
>2018-08-21 15:27:33,277 INFO: Warning: Unknown variable: '::ironic::api::neutron_url'. at /etc/puppet/modules/ironic/manifests/neutron.pp:58:29
>2018-08-21 15:27:37,121 INFO: Warning: ModuleLoader: module 'mistral' has unresolved dependencies - it will only see those that are resolved. Use 'puppet module list --tree' to see information about modules
>2018-08-21 15:27:37,122 INFO: (file & line not available)
>2018-08-21 15:27:37,557 INFO: Warning: Unknown variable: '::mistral::database_idle_timeout'. at /etc/puppet/modules/mistral/manifests/db.pp:57:40
>2018-08-21 15:27:37,559 INFO: Warning: Unknown variable: '::mistral::database_min_pool_size'. at /etc/puppet/modules/mistral/manifests/db.pp:58:40
>2018-08-21 15:27:37,562 INFO: Warning: Unknown variable: '::mistral::database_max_pool_size'. at /etc/puppet/modules/mistral/manifests/db.pp:59:40
>2018-08-21 15:27:37,564 INFO: Warning: Unknown variable: '::mistral::database_max_retries'. at /etc/puppet/modules/mistral/manifests/db.pp:60:40
>2018-08-21 15:27:37,566 INFO: Warning: Unknown variable: '::mistral::database_retry_interval'. at /etc/puppet/modules/mistral/manifests/db.pp:61:40
>2018-08-21 15:27:37,568 INFO: Warning: Unknown variable: '::mistral::database_max_overflow'. at /etc/puppet/modules/mistral/manifests/db.pp:62:40
>2018-08-21 15:27:38,077 INFO: Warning: Scope(Class[Mistral]): mistral::rabbit_host, mistral::rabbit_hosts, mistral::rabbit_password, mistral::rabbit_port, mistral::rabbit_userid, mistral::rabbit_virtual_host and mistral::rpc_backend are deprecated. Please use mistral::default_transport_url instead.
>2018-08-21 15:27:39,018 INFO: Warning: ModuleLoader: module 'zaqar' has unresolved dependencies - it will only see those that are resolved. Use 'puppet module list --tree' to see information about modules
>2018-08-21 15:27:39,019 INFO: (file & line not available)
>2018-08-21 15:27:43,514 INFO: Warning: ModuleLoader: module 'oslo' has unresolved dependencies - it will only see those that are resolved. Use 'puppet module list --tree' to see information about modules
>2018-08-21 15:27:43,514 INFO: (file & line not available)
>2018-08-21 15:27:44,167 INFO: Warning: Scope(Oslo::Messaging::Rabbit[keystone_config]): The oslo_messaging rabbit_host, rabbit_hosts, rabbit_port, rabbit_userid, rabbit_password, rabbit_virtual_host parameters have been deprecated by the [DEFAULT]\transport_url. Please use oslo::messaging::default::transport_url instead.
>2018-08-21 15:27:48,018 INFO: Warning: Scope(Oslo::Messaging::Rabbit[glance_api_config]): The oslo_messaging rabbit_host, rabbit_hosts, rabbit_port, rabbit_userid, rabbit_password, rabbit_virtual_host parameters have been deprecated by the [DEFAULT]\transport_url. Please use oslo::messaging::default::transport_url instead.
>2018-08-21 15:27:48,085 INFO: Warning: Scope(Oslo::Messaging::Rabbit[glance_registry_config]): The oslo_messaging rabbit_host, rabbit_hosts, rabbit_port, rabbit_userid, rabbit_password, rabbit_virtual_host parameters have been deprecated by the [DEFAULT]\transport_url. Please use oslo::messaging::default::transport_url instead.
>2018-08-21 15:27:48,821 INFO: Warning: Scope(Oslo::Messaging::Rabbit[neutron_config]): The oslo_messaging rabbit_host, rabbit_hosts, rabbit_port, rabbit_userid, rabbit_password, rabbit_virtual_host parameters have been deprecated by the [DEFAULT]\transport_url. Please use oslo::messaging::default::transport_url instead.
>2018-08-21 15:27:49,129 INFO: Warning: Scope(Neutron::Plugins::Ml2::Type_driver[local]): local type_driver is useful only for single-box, because it provides no connectivity between hosts
>2018-08-21 15:27:52,161 INFO: Warning: Scope(Oslo::Messaging::Rabbit[mistral_config]): The oslo_messaging rabbit_host, rabbit_hosts, rabbit_port, rabbit_userid, rabbit_password, rabbit_virtual_host parameters have been deprecated by the [DEFAULT]\transport_url. Please use oslo::messaging::default::transport_url instead.
>2018-08-21 15:28:07,056 INFO: Notice: Compiled catalog for undercloud-0.redhat.local in environment production in 69.33 seconds
>2018-08-21 15:30:09,488 INFO: Notice: /Stage[setup]/Vswitch::Ovs/Package[openvswitch]/ensure: created
>2018-08-21 15:30:11,206 INFO: Notice: /Stage[setup]/Vswitch::Ovs/Service[openvswitch]/ensure: ensure changed 'stopped' to 'running'
>2018-08-21 15:30:21,126 INFO: Notice: /Stage[setup]/Tripleo::Network::Os_net_config/Exec[os-net-config]/returns: executed successfully
>2018-08-21 15:30:21,182 INFO: Notice: /Stage[setup]/Tripleo::Network::Os_net_config/Exec[trigger-keepalived-restart]: Triggered 'refresh' from 1 events
>2018-08-21 15:30:22,016 INFO: Notice: /Stage[main]/Tripleo::Profile::Base::Time::Ntp/Service[chronyd]/ensure: ensure changed 'running' to 'stopped'
>2018-08-21 15:30:22,551 INFO: Notice: /Stage[main]/Ntp::Config/File[/etc/ntp.conf]/content: content changed '{md5}724c6fabbf4b32e112eba62bb988a1da' to '{md5}559c25e8bcc4e66a8a99d18bb1059473'
>2018-08-21 15:30:23,585 INFO: Notice: /Stage[main]/Ntp::Service/Service[ntp]/ensure: ensure changed 'stopped' to 'running'
>2018-08-21 15:31:11,224 INFO: Notice: /Stage[main]/Rabbitmq::Install/Package[rabbitmq-server]/ensure: created
>2018-08-21 15:31:11,232 INFO: Notice: /Stage[main]/Rabbitmq::Config/File[/etc/rabbitmq]/owner: owner changed 'rabbitmq' to 'root'
>2018-08-21 15:31:11,234 INFO: Notice: /Stage[main]/Rabbitmq::Config/File[/etc/rabbitmq]/group: group changed 'rabbitmq' to 'root'
>2018-08-21 15:31:11,250 INFO: Notice: /Stage[main]/Rabbitmq::Config/File[/etc/rabbitmq/ssl]/ensure: created
>2018-08-21 15:31:11,310 INFO: Notice: /Stage[main]/Rabbitmq::Config/File[rabbitmq.config]/content: content changed '{md5}b346ec0a8320f85f795bf612f6b02da7' to '{md5}8c4f19c256f91089c3aaeda217af0370'
>2018-08-21 15:31:11,311 INFO: Notice: /Stage[main]/Rabbitmq::Config/File[rabbitmq.config]/owner: owner changed 'rabbitmq' to 'root'
>2018-08-21 15:31:11,312 INFO: Notice: /Stage[main]/Rabbitmq::Config/File[rabbitmq.config]/mode: mode changed '0644' to '0640'
>2018-08-21 15:31:11,368 INFO: Notice: /Stage[main]/Rabbitmq::Config/File[rabbitmq-env.config]/ensure: defined content as '{md5}4373a65af6fd9fde9dc110c749b299fd'
>2018-08-21 15:31:11,418 INFO: Notice: /Stage[main]/Rabbitmq::Config/File[rabbitmq-inetrc]/ensure: defined content as '{md5}12f8d1a1f9f57f23c1be6c7bf2286e73'
>2018-08-21 15:31:11,469 INFO: Notice: /Stage[main]/Rabbitmq::Config/File[rabbitmqadmin.conf]/ensure: defined content as '{md5}44d4ef5cb86ab30e6127e83939ef09c4'
>2018-08-21 15:31:11,473 INFO: Notice: /Stage[main]/Rabbitmq::Config/File[/etc/systemd/system/rabbitmq-server.service.d]/ensure: created
>2018-08-21 15:31:11,521 INFO: Notice: /Stage[main]/Rabbitmq::Config/File[/etc/systemd/system/rabbitmq-server.service.d/limits.conf]/ensure: defined content as '{md5}8eb9ff6c576b9869944215af3a568c2e'
>2018-08-21 15:31:11,750 INFO: Notice: /Stage[main]/Rabbitmq::Config/Exec[rabbitmq-systemd-reload]: Triggered 'refresh' from 1 events
>2018-08-21 15:31:11,814 INFO: Notice: /Stage[main]/Rabbitmq::Config/File[/etc/security/limits.d/rabbitmq-server.conf]/ensure: defined content as '{md5}5ddc6ba5fcaeddd5b1565e5adfda5236'
>2018-08-21 15:31:19,113 INFO: Notice: /Stage[main]/Rabbitmq/Rabbitmq_plugin[rabbitmq_management]/ensure: created
>2018-08-21 15:31:34,851 INFO: Notice: /Stage[main]/Rabbitmq::Service/Service[rabbitmq-server]/ensure: ensure changed 'stopped' to 'running'
>2018-08-21 15:31:34,950 INFO: Notice: /Stage[main]/Rabbitmq::Install::Rabbitmqadmin/Archive[rabbitmqadmin]/ensure: download archive from http://192.168.24.1:15672/cli/rabbitmqadmin to /var/lib/rabbitmq/rabbitmqadmin without cleanup
>2018-08-21 15:31:35,664 INFO: Notice:
/Stage[main]/Rabbitmq::Install::Rabbitmqadmin/File[/usr/local/bin/rabbitmqadmin]/ensure: defined content as '{md5}76394723569012aa8a197a08f1b53926'[0m >2018-08-21 15:31:37,069 INFO: [mNotice: /Stage[main]/Firewall::Linux::Redhat/Service[firewalld]/ensure: ensure changed 'running' to 'stopped'[0m >2018-08-21 15:31:52,826 INFO: [mNotice: /Stage[main]/Firewall::Linux::Redhat/Package[iptables-services]/ensure: created[0m >2018-08-21 15:31:53,088 INFO: [mNotice: /Stage[main]/Firewall::Linux::Redhat/Exec[/usr/bin/systemctl daemon-reload]: Triggered 'refresh' from 1 events[0m >2018-08-21 15:31:54,283 INFO: [mNotice: /Stage[main]/Firewall::Linux::Redhat/Service[iptables]/ensure: ensure changed 'stopped' to 'running'[0m >2018-08-21 15:31:55,396 INFO: [mNotice: /Stage[main]/Firewall::Linux::Redhat/Service[ip6tables]/ensure: ensure changed 'stopped' to 'running'[0m >2018-08-21 15:31:55,478 INFO: [mNotice: /Stage[main]/Tripleo::Selinux/File[/etc/selinux/config]/content: content changed '{md5}a97ac66cf758dceb5eeafa2d83c8e6f3' to '{md5}1b476ce188acf89a99c4da6ae6f0e57f'[0m >2018-08-21 15:31:55,479 INFO: [mNotice: /Stage[main]/Tripleo::Selinux/File[/etc/selinux/config]/mode: mode changed '0644' to '0444'[0m >2018-08-21 15:31:55,557 INFO: [mNotice: /Stage[main]/Mysql::Server::Config/File[mysql-config-file]/ensure: defined content as '{md5}e20ce954fbfa058496bdc813990e4b19'[0m >2018-08-21 15:32:43,116 INFO: [mNotice: /Stage[main]/Mysql::Server::Install/Package[mysql-server]/ensure: created[0m >2018-08-21 15:32:57,545 INFO: [mNotice: /Stage[main]/Mysql::Server::Installdb/Mysql_datadir[/var/lib/mysql]/ensure: created[0m >2018-08-21 15:32:57,556 INFO: [mNotice: /Stage[main]/Mysql::Server::Installdb/File[/var/log/mariadb/mariadb.log]/seluser: seluser changed 'unconfined_u' to 'system_u'[0m >2018-08-21 15:32:57,573 INFO: [mNotice: /Stage[main]/Main/File[/etc/systemd/system/mariadb.service.d]/ensure: created[0m >2018-08-21 15:32:57,630 INFO: [mNotice: 
/Stage[main]/Main/File[/etc/systemd/system/mariadb.service.d/limits.conf]/ensure: defined content as '{md5}8eb9ff6c576b9869944215af3a568c2e'[0m >2018-08-21 15:32:57,863 INFO: [mNotice: /Stage[main]/Main/Exec[systemctl-daemon-reload]: Triggered 'refresh' from 1 events[0m >2018-08-21 15:32:59,974 INFO: [mNotice: /Stage[main]/Mysql::Server::Service/Service[mysqld]/ensure: ensure changed 'stopped' to 'running'[0m >2018-08-21 15:33:00,104 INFO: [mNotice: /Stage[main]/Mysql::Server::Service/Exec[wait_for_mysql_socket_to_open]: Triggered 'refresh' from 1 events[0m >2018-08-21 15:33:00,878 INFO: [mNotice: /Stage[main]/Mysql::Server::Root_password/Mysql_user[root@localhost]/password_hash: defined 'password_hash' as '*DB44FF141BB48F3CD99D9120AD6C637179C797EC'[0m >2018-08-21 15:33:00,944 INFO: [mNotice: /Stage[main]/Mysql::Server::Root_password/File[/root/.my.cnf]/ensure: defined content as '{md5}37c1cabcbd234d549659fef5ae1051a9'[0m >2018-08-21 15:33:01,037 INFO: [mNotice: /Stage[main]/Mysql::Server::Account_security/Mysql_user[root@127.0.0.1]/ensure: removed[0m >2018-08-21 15:33:01,126 INFO: [mNotice: /Stage[main]/Mysql::Server::Account_security/Mysql_user[root@::1]/ensure: removed[0m >2018-08-21 15:33:01,202 INFO: [mNotice: /Stage[main]/Mysql::Server::Account_security/Mysql_user[@localhost]/ensure: removed[0m >2018-08-21 15:33:01,276 INFO: [mNotice: /Stage[main]/Mysql::Server::Account_security/Mysql_user[root@undercloud-0.redhat.local]/ensure: removed[0m >2018-08-21 15:33:01,348 INFO: [mNotice: /Stage[main]/Mysql::Server::Account_security/Mysql_user[@undercloud-0.redhat.local]/ensure: removed[0m >2018-08-21 15:33:01,719 INFO: [mNotice: /Stage[main]/Mysql::Server::Account_security/Mysql_database[test]/ensure: removed[0m >2018-08-21 15:33:01,732 INFO: [mNotice: /Stage[main]/Main/File[/var/log/journal]/ensure: created[0m >2018-08-21 15:33:02,307 INFO: [mNotice: /Stage[main]/Main/Service[systemd-journald]: Triggered 'refresh' from 1 events[0m >2018-08-21 15:33:27,569 INFO: 
[mNotice: /Stage[main]/Swift/Package[swift]/ensure: created[0m >2018-08-21 15:34:25,975 INFO: [mNotice: /Stage[main]/Keystone/Package[keystone]/ensure: created[0m >2018-08-21 15:34:41,744 INFO: [mNotice: /Stage[main]/Apache::Mod::Mime/Package[mailcap]/ensure: created[0m >2018-08-21 15:35:25,877 INFO: [mNotice: /Stage[main]/Glance/Package[openstack-glance]/ensure: created[0m >2018-08-21 15:35:25,883 INFO: [mNotice: /Stage[main]/Glance::Deps/Anchor[glance::install::end]: Triggered 'refresh' from 1 events[0m >2018-08-21 15:35:26,026 INFO: [mNotice: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/bind_host]/ensure: created[0m >2018-08-21 15:35:26,150 INFO: [mNotice: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/bind_port]/ensure: created[0m >2018-08-21 15:35:26,324 INFO: [mNotice: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/workers]/ensure: created[0m >2018-08-21 15:35:26,706 INFO: [mNotice: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/image_cache_dir]/ensure: created[0m >2018-08-21 15:35:26,976 INFO: [mNotice: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/enable_v1_api]/ensure: created[0m >2018-08-21 15:35:27,094 INFO: [mNotice: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/enable_v2_api]/ensure: created[0m >2018-08-21 15:35:27,320 INFO: [mNotice: /Stage[main]/Glance::Api/Glance_api_config[glance_store/os_region_name]/ensure: created[0m >2018-08-21 15:35:28,181 INFO: [mNotice: /Stage[main]/Glance::Api/Glance_api_config[glance_store/stores]/ensure: created[0m >2018-08-21 15:35:28,305 INFO: [mNotice: /Stage[main]/Glance::Api/Glance_cache_config[glance_store/os_region_name]/ensure: created[0m >2018-08-21 15:35:28,428 INFO: [mNotice: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/registry_host]/ensure: created[0m >2018-08-21 15:35:28,598 INFO: [mNotice: /Stage[main]/Glance::Api/Glance_cache_config[DEFAULT/registry_host]/ensure: created[0m >2018-08-21 15:35:28,738 INFO: [mNotice: 
/Stage[main]/Glance::Api/Glance_api_config[paste_deploy/flavor]/ensure: created[0m >2018-08-21 15:35:29,282 INFO: [mNotice: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/swift_store_create_container_on_put]/ensure: created[0m >2018-08-21 15:35:29,510 INFO: [mNotice: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/swift_store_endpoint_type]/ensure: created[0m >2018-08-21 15:35:29,636 INFO: [mNotice: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/swift_store_config_file]/ensure: created[0m >2018-08-21 15:35:30,199 INFO: [mNotice: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/default_swift_reference]/ensure: created[0m >2018-08-21 15:35:30,315 INFO: [mNotice: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/default_store]/ensure: created[0m >2018-08-21 15:35:30,320 INFO: [mNotice: /Stage[main]/Glance::Backend::Swift/Glance_swift_config[ref1/user]/ensure: created[0m >2018-08-21 15:35:30,324 INFO: [mNotice: /Stage[main]/Glance::Backend::Swift/Glance_swift_config[ref1/key]/ensure: created[0m >2018-08-21 15:35:30,328 INFO: [mNotice: /Stage[main]/Glance::Backend::Swift/Glance_swift_config[ref1/auth_address]/ensure: created[0m >2018-08-21 15:35:30,333 INFO: [mNotice: /Stage[main]/Glance::Backend::Swift/Glance_swift_config[ref1/auth_version]/ensure: created[0m >2018-08-21 15:35:30,337 INFO: [mNotice: /Stage[main]/Glance::Backend::Swift/Glance_swift_config[ref1/user_domain_id]/ensure: created[0m >2018-08-21 15:35:30,342 INFO: [mNotice: /Stage[main]/Glance::Backend::Swift/Glance_swift_config[ref1/project_domain_id]/ensure: created[0m >2018-08-21 15:36:11,191 INFO: [mNotice: /Stage[main]/Nova/Package[python-nova]/ensure: created[0m >2018-08-21 15:36:27,135 INFO: [mNotice: /Stage[main]/Nova/Package[nova-common]/ensure: created[0m >2018-08-21 15:36:45,086 INFO: [mNotice: /Stage[main]/Nova::Compute/Package[genisoimage]/ensure: created[0m >2018-08-21 15:37:56,174 INFO: [mNotice: 
/Stage[main]/Neutron/Package[neutron]/ensure: created[0m >2018-08-21 15:38:15,744 INFO: [mNotice: /Stage[main]/Neutron::Plugins::Ml2/Package[neutron-plugin-ml2]/ensure: created[0m >2018-08-21 15:38:31,181 INFO: [mNotice: /Stage[main]/Neutron::Plugins::Ml2::Networking_baremetal/Package[python2-networking-baremetal]/ensure: created[0m >2018-08-21 15:38:46,799 INFO: [mNotice: /Stage[main]/Neutron::Agents::Ml2::Networking_baremetal/Package[python2-ironic-neutron-agent]/ensure: created[0m >2018-08-21 15:39:02,301 INFO: [mNotice: /Stage[main]/Neutron::Agents::Ml2::Ovs/Package[neutron-ovs-agent]/ensure: created[0m >2018-08-21 15:39:02,305 INFO: [mNotice: /Stage[main]/Neutron::Deps/Anchor[neutron::install::end]: Triggered 'refresh' from 5 events[0m >2018-08-21 15:39:02,362 INFO: [mNotice: /Stage[main]/Neutron/Neutron_config[DEFAULT/bind_host]/ensure: created[0m >2018-08-21 15:39:02,414 INFO: [mNotice: /Stage[main]/Neutron/Neutron_config[DEFAULT/auth_strategy]/ensure: created[0m >2018-08-21 15:39:02,444 INFO: [mNotice: /Stage[main]/Neutron/Neutron_config[DEFAULT/core_plugin]/ensure: created[0m >2018-08-21 15:39:02,540 INFO: [mNotice: /Stage[main]/Neutron/Neutron_config[DEFAULT/dns_domain]/ensure: created[0m >2018-08-21 15:39:02,572 INFO: [mNotice: /Stage[main]/Neutron/Neutron_config[DEFAULT/dhcp_agents_per_network]/ensure: created[0m >2018-08-21 15:39:02,710 INFO: [mNotice: /Stage[main]/Neutron/Neutron_config[DEFAULT/global_physnet_mtu]/ensure: created[0m >2018-08-21 15:39:02,759 INFO: [mNotice: /Stage[main]/Neutron/Neutron_config[agent/root_helper]/ensure: created[0m >2018-08-21 15:39:03,274 INFO: [mNotice: /Stage[main]/Neutron/Neutron_config[DEFAULT/service_plugins]/ensure: created[0m >2018-08-21 15:39:03,394 INFO: [mNotice: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/l3_ha]/ensure: created[0m >2018-08-21 15:39:03,426 INFO: [mNotice: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/max_l3_agents_per_router]/ensure: created[0m >2018-08-21 15:39:03,480 INFO: 
[mNotice: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/api_workers]/ensure: created[0m >2018-08-21 15:39:03,511 INFO: [mNotice: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/rpc_workers]/ensure: created[0m >2018-08-21 15:39:03,588 INFO: [mNotice: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/router_scheduler_driver]/ensure: created[0m >2018-08-21 15:39:03,861 INFO: [mNotice: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/auth_url]/ensure: created[0m >2018-08-21 15:39:03,888 INFO: [mNotice: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/username]/ensure: created[0m >2018-08-21 15:39:03,916 INFO: [mNotice: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/password]/ensure: created[0m >2018-08-21 15:39:03,944 INFO: [mNotice: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/project_domain_id]/ensure: created[0m >2018-08-21 15:39:03,972 INFO: [mNotice: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/project_domain_name]/ensure: created[0m >2018-08-21 15:39:04,001 INFO: [mNotice: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/project_name]/ensure: created[0m >2018-08-21 15:39:04,030 INFO: [mNotice: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/user_domain_id]/ensure: created[0m >2018-08-21 15:39:04,058 INFO: [mNotice: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/user_domain_name]/ensure: created[0m >2018-08-21 15:39:04,131 INFO: [mNotice: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/auth_type]/ensure: created[0m >2018-08-21 15:39:04,160 INFO: [mNotice: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/tenant_name]/ensure: created[0m >2018-08-21 15:39:04,191 INFO: [mNotice: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/notify_nova_on_port_status_changes]/ensure: created[0m >2018-08-21 15:39:04,656 INFO: [mNotice: 
/Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/notify_nova_on_port_data_changes]/ensure: created[0m >2018-08-21 15:39:04,774 INFO: [mNotice: /Stage[main]/Neutron::Quota/Neutron_config[quotas/quota_port]/ensure: created[0m >2018-08-21 15:39:04,956 INFO: [mNotice: /Stage[main]/Neutron::Quota/Neutron_config[quotas/quota_firewall_rule]/ensure: created[0m >2018-08-21 15:39:05,027 INFO: [mNotice: /Stage[main]/Neutron::Quota/Neutron_config[quotas/quota_network_gateway]/ensure: created[0m >2018-08-21 15:39:05,055 INFO: [mNotice: /Stage[main]/Neutron::Quota/Neutron_config[quotas/quota_packet_filter]/ensure: created[0m >2018-08-21 15:39:05,131 INFO: [mNotice: /Stage[main]/Neutron::Plugins::Ml2/File[/etc/neutron/plugin.ini]/ensure: created[0m >2018-08-21 15:39:05,140 INFO: [mNotice: /Stage[main]/Neutron::Plugins::Ml2/File[/etc/default/neutron-server]/ensure: created[0m >2018-08-21 15:39:05,151 INFO: [mNotice: /Stage[main]/Neutron::Plugins::Ml2/Neutron_plugin_ml2[ml2/type_drivers]/ensure: created[0m >2018-08-21 15:39:05,162 INFO: [mNotice: /Stage[main]/Neutron::Plugins::Ml2/Neutron_plugin_ml2[ml2/tenant_network_types]/ensure: created[0m >2018-08-21 15:39:05,173 INFO: [mNotice: /Stage[main]/Neutron::Plugins::Ml2/Neutron_plugin_ml2[ml2/mechanism_drivers]/ensure: created[0m >2018-08-21 15:39:05,184 INFO: [mNotice: /Stage[main]/Neutron::Plugins::Ml2/Neutron_plugin_ml2[ml2/path_mtu]/ensure: created[0m >2018-08-21 15:39:05,195 INFO: [mNotice: /Stage[main]/Neutron::Plugins::Ml2/Neutron_plugin_ml2[ml2/extension_drivers]/ensure: created[0m >2018-08-21 15:39:05,229 INFO: [mNotice: /Stage[main]/Neutron::Agents::Ml2::Networking_baremetal/Ironic_neutron_agent_config[ironic/auth_type]/ensure: created[0m >2018-08-21 15:39:05,233 INFO: [mNotice: /Stage[main]/Neutron::Agents::Ml2::Networking_baremetal/Ironic_neutron_agent_config[ironic/auth_url]/ensure: created[0m >2018-08-21 15:39:05,237 INFO: [mNotice: 
/Stage[main]/Neutron::Agents::Ml2::Networking_baremetal/Ironic_neutron_agent_config[ironic/username]/ensure: created[0m >2018-08-21 15:39:05,240 INFO: [mNotice: /Stage[main]/Neutron::Agents::Ml2::Networking_baremetal/Ironic_neutron_agent_config[ironic/password]/ensure: created[0m >2018-08-21 15:39:05,244 INFO: [mNotice: /Stage[main]/Neutron::Agents::Ml2::Networking_baremetal/Ironic_neutron_agent_config[ironic/project_domain_id]/ensure: created[0m >2018-08-21 15:39:05,248 INFO: [mNotice: /Stage[main]/Neutron::Agents::Ml2::Networking_baremetal/Ironic_neutron_agent_config[ironic/project_domain_name]/ensure: created[0m >2018-08-21 15:39:05,252 INFO: [mNotice: /Stage[main]/Neutron::Agents::Ml2::Networking_baremetal/Ironic_neutron_agent_config[ironic/project_name]/ensure: created[0m >2018-08-21 15:39:05,256 INFO: [mNotice: /Stage[main]/Neutron::Agents::Ml2::Networking_baremetal/Ironic_neutron_agent_config[ironic/user_domain_id]/ensure: created[0m >2018-08-21 15:39:05,260 INFO: [mNotice: /Stage[main]/Neutron::Agents::Ml2::Networking_baremetal/Ironic_neutron_agent_config[ironic/user_domain_name]/ensure: created[0m >2018-08-21 15:39:05,265 INFO: [mNotice: /Stage[main]/Neutron::Agents::Ml2::Networking_baremetal/Ironic_neutron_agent_config[ironic/region_name]/ensure: created[0m >2018-08-21 15:39:05,281 INFO: [mNotice: /Stage[main]/Neutron::Agents::Dhcp/Neutron_dhcp_agent_config[DEFAULT/enable_isolated_metadata]/ensure: created[0m >2018-08-21 15:39:05,298 INFO: [mNotice: /Stage[main]/Neutron::Agents::Dhcp/Neutron_dhcp_agent_config[DEFAULT/enable_metadata_network]/ensure: created[0m >2018-08-21 15:39:05,316 INFO: [mNotice: /Stage[main]/Neutron::Agents::Dhcp/Neutron_dhcp_agent_config[DEFAULT/state_path]/ensure: created[0m >2018-08-21 15:39:05,327 INFO: [mNotice: /Stage[main]/Neutron::Agents::Dhcp/Neutron_dhcp_agent_config[DEFAULT/resync_interval]/ensure: created[0m >2018-08-21 15:39:05,339 INFO: [mNotice: 
/Stage[main]/Neutron::Agents::Dhcp/Neutron_dhcp_agent_config[DEFAULT/interface_driver]/ensure: created[0m >2018-08-21 15:39:05,363 INFO: [mNotice: /Stage[main]/Neutron::Agents::Dhcp/Neutron_dhcp_agent_config[DEFAULT/root_helper]/ensure: created[0m >2018-08-21 15:39:05,381 INFO: [mNotice: /Stage[main]/Neutron::Agents::Dhcp/Neutron_dhcp_agent_config[DEFAULT/dnsmasq_config_file]/ensure: created[0m >2018-08-21 15:39:05,446 INFO: [mNotice: /Stage[main]/Neutron::Agents::L3/Neutron_l3_agent_config[DEFAULT/interface_driver]/ensure: created[0m >2018-08-21 15:39:05,503 INFO: [mNotice: /Stage[main]/Neutron::Agents::L3/Neutron_l3_agent_config[DEFAULT/agent_mode]/ensure: created[0m >2018-08-21 15:39:05,529 INFO: [mNotice: /Stage[main]/Neutron::Agents::Ml2::Ovs/Neutron_agent_ovs[ovs/bridge_mappings]/ensure: created[0m >2018-08-21 15:39:05,567 INFO: [mNotice: /Stage[main]/Neutron::Agents::Ml2::Ovs/Neutron_agent_ovs[agent/drop_flows_on_start]/ensure: created[0m >2018-08-21 15:39:06,033 INFO: [mNotice: /Stage[main]/Neutron::Agents::Ml2::Ovs/Neutron_agent_ovs[ovs/integration_bridge]/ensure: created[0m >2018-08-21 15:39:06,076 INFO: [mNotice: /Stage[main]/Neutron::Agents::Ml2::Ovs/Neutron_agent_ovs[securitygroup/firewall_driver]/ensure: created[0m >2018-08-21 15:39:06,572 INFO: [mNotice: /Stage[main]/Neutron::Agents::Ml2::Ovs/Service[ovs-cleanup-service]/enable: enable changed 'false' to 'true'[0m >2018-08-21 15:39:06,610 INFO: [mNotice: /Stage[main]/Main/Neutron_config[DEFAULT/notification_driver]/ensure: created[0m >2018-08-21 15:39:25,103 INFO: [mNotice: /Stage[main]/Memcached/Package[memcached]/ensure: created[0m >2018-08-21 15:39:25,494 INFO: [mNotice: /Stage[main]/Memcached/File[/etc/sysconfig/memcached]/content: content changed '{md5}a50ed62e82d31fb4cb2de2226650c545' to '{md5}428e37392bd4e42102776c144482fa82'[0m >2018-08-21 15:39:26,444 INFO: [mNotice: /Stage[main]/Memcached/Service[memcached]/ensure: ensure changed 'stopped' to 'running'[0m >2018-08-21 15:39:44,155 INFO: 
[mNotice: /Stage[main]/Swift::Proxy/Package[swift-proxy]/ensure: created[0m >2018-08-21 15:39:44,179 INFO: [mNotice: /Stage[main]/Swift::Objectexpirer/Swift_object_expirer_config[pipeline:main/pipeline]/value: value changed 'catch_errors cache proxy-server' to 'catch_errors proxy-server'[0m >2018-08-21 15:39:44,184 INFO: [mNotice: /Stage[main]/Swift::Objectexpirer/Swift_object_expirer_config[object-expirer/auto_create_account_prefix]/ensure: created[0m >2018-08-21 15:39:44,188 INFO: [mNotice: /Stage[main]/Swift::Objectexpirer/Swift_object_expirer_config[object-expirer/concurrency]/ensure: created[0m >2018-08-21 15:39:44,193 INFO: [mNotice: /Stage[main]/Swift::Objectexpirer/Swift_object_expirer_config[object-expirer/expiring_objects_account_name]/ensure: created[0m >2018-08-21 15:39:44,197 INFO: [mNotice: /Stage[main]/Swift::Objectexpirer/Swift_object_expirer_config[object-expirer/interval]/ensure: created[0m >2018-08-21 15:39:44,201 INFO: [mNotice: /Stage[main]/Swift::Objectexpirer/Swift_object_expirer_config[object-expirer/process]/ensure: created[0m >2018-08-21 15:39:44,206 INFO: [mNotice: /Stage[main]/Swift::Objectexpirer/Swift_object_expirer_config[object-expirer/processes]/ensure: created[0m >2018-08-21 15:39:44,210 INFO: [mNotice: /Stage[main]/Swift::Objectexpirer/Swift_object_expirer_config[object-expirer/reclaim_age]/ensure: created[0m >2018-08-21 15:39:44,215 INFO: [mNotice: /Stage[main]/Swift::Objectexpirer/Swift_object_expirer_config[object-expirer/recon_cache_path]/ensure: created[0m >2018-08-21 15:39:44,220 INFO: [mNotice: /Stage[main]/Swift::Objectexpirer/Swift_object_expirer_config[object-expirer/report_interval]/ensure: created[0m >2018-08-21 15:39:44,224 INFO: [mNotice: /Stage[main]/Swift::Objectexpirer/Swift_object_expirer_config[object-expirer/log_facility]/ensure: created[0m >2018-08-21 15:39:44,229 INFO: [mNotice: /Stage[main]/Swift::Objectexpirer/Swift_object_expirer_config[object-expirer/log_level]/ensure: created[0m >2018-08-21 15:40:00,008 
INFO: [mNotice: /Stage[main]/Xinetd/Package[xinetd]/ensure: created[0m >2018-08-21 15:40:00,078 INFO: [mNotice: /Stage[main]/Xinetd/File[/etc/xinetd.conf]/content: content changed '{md5}9ff8cc688dd9f0dfc45e5afd25c427a7' to '{md5}7d37008224e71625019cb48768f267e7'[0m >2018-08-21 15:40:00,079 INFO: [mNotice: /Stage[main]/Xinetd/File[/etc/xinetd.conf]/mode: mode changed '0600' to '0644'[0m >2018-08-21 15:40:37,358 INFO: [mNotice: /Stage[main]/Heat/Package[heat-common]/ensure: created[0m >2018-08-21 15:40:52,878 INFO: [mNotice: /Stage[main]/Heat::Api/Package[heat-api]/ensure: created[0m >2018-08-21 15:41:08,245 INFO: [mNotice: /Stage[main]/Heat::Api_cfn/Package[heat-api-cfn]/ensure: created[0m >2018-08-21 15:41:25,371 INFO: [mNotice: /Stage[main]/Heat::Engine/Package[heat-engine]/ensure: created[0m >2018-08-21 15:41:25,374 INFO: [mNotice: /Stage[main]/Heat::Deps/Anchor[heat::install::end]: Triggered 'refresh' from 4 events[0m >2018-08-21 15:41:25,455 INFO: [mNotice: /Stage[main]/Heat/Heat_config[trustee/auth_type]/ensure: created[0m >2018-08-21 15:41:25,520 INFO: [mNotice: /Stage[main]/Heat/Heat_config[trustee/auth_url]/ensure: created[0m >2018-08-21 15:41:25,587 INFO: [mNotice: /Stage[main]/Heat/Heat_config[trustee/username]/ensure: created[0m >2018-08-21 15:41:25,654 INFO: [mNotice: /Stage[main]/Heat/Heat_config[trustee/password]/ensure: created[0m >2018-08-21 15:41:25,720 INFO: [mNotice: /Stage[main]/Heat/Heat_config[trustee/project_domain_name]/ensure: created[0m >2018-08-21 15:41:25,788 INFO: [mNotice: /Stage[main]/Heat/Heat_config[trustee/user_domain_name]/ensure: created[0m >2018-08-21 15:41:25,853 INFO: [mNotice: /Stage[main]/Heat/Heat_config[clients_keystone/auth_uri]/ensure: created[0m >2018-08-21 15:41:25,951 INFO: [mNotice: /Stage[main]/Heat/Heat_config[clients/endpoint_type]/ensure: created[0m >2018-08-21 15:41:26,082 INFO: [mNotice: /Stage[main]/Heat/Heat_config[DEFAULT/max_json_body_size]/ensure: created[0m >2018-08-21 15:41:26,710 INFO: [mNotice: 
/Stage[main]/Heat/Heat_config[ec2authtoken/auth_uri]/ensure: created[0m >2018-08-21 15:41:26,805 INFO: [mNotice: /Stage[main]/Heat/Heat_config[yaql/limit_iterators]/ensure: created[0m >2018-08-21 15:41:26,869 INFO: [mNotice: /Stage[main]/Heat/Heat_config[yaql/memory_quota]/ensure: created[0m >2018-08-21 15:41:26,934 INFO: [mNotice: /Stage[main]/Heat::Api/Heat_config[heat_api/bind_host]/ensure: created[0m >2018-08-21 15:41:27,029 INFO: [mNotice: /Stage[main]/Heat::Api/Heat_config[heat_api/workers]/ensure: created[0m >2018-08-21 15:41:27,094 INFO: [mNotice: /Stage[main]/Heat::Api_cfn/Heat_config[heat_api_cfn/bind_host]/ensure: created[0m >2018-08-21 15:41:27,189 INFO: [mNotice: /Stage[main]/Heat::Api_cfn/Heat_config[heat_api_cfn/workers]/ensure: created[0m >2018-08-21 15:41:27,257 INFO: [mNotice: /Stage[main]/Heat::Engine/Heat_config[DEFAULT/auth_encryption_key]/ensure: created[0m >2018-08-21 15:41:27,325 INFO: [mNotice: /Stage[main]/Heat::Engine/Heat_config[DEFAULT/heat_stack_user_role]/ensure: created[0m >2018-08-21 15:41:27,391 INFO: [mNotice: /Stage[main]/Heat::Engine/Heat_config[DEFAULT/heat_metadata_server_url]/ensure: created[0m >2018-08-21 15:41:27,459 INFO: [mNotice: /Stage[main]/Heat::Engine/Heat_config[DEFAULT/heat_waitcondition_server_url]/ensure: created[0m >2018-08-21 15:41:27,526 INFO: [mNotice: /Stage[main]/Heat::Engine/Heat_config[DEFAULT/heat_watch_server_url]/ensure: created[0m >2018-08-21 15:41:27,778 INFO: [mNotice: /Stage[main]/Heat::Engine/Heat_config[DEFAULT/max_resources_per_stack]/ensure: created[0m >2018-08-21 15:41:27,910 INFO: [mNotice: /Stage[main]/Heat::Engine/Heat_config[DEFAULT/num_engine_workers]/ensure: created[0m >2018-08-21 15:41:27,977 INFO: [mNotice: /Stage[main]/Heat::Engine/Heat_config[DEFAULT/convergence_engine]/ensure: created[0m >2018-08-21 15:41:28,044 INFO: [mNotice: /Stage[main]/Heat::Engine/Heat_config[DEFAULT/reauthentication_auth_method]/ensure: created[0m >2018-08-21 15:41:28,176 INFO: [mNotice: 
/Stage[main]/Heat::Engine/Heat_config[DEFAULT/max_nested_stack_depth]/ensure: created[0m >2018-08-21 15:41:28,279 INFO: [mNotice: /Stage[main]/Heat::Keystone::Domain/Heat_config[DEFAULT/stack_domain_admin]/ensure: created[0m >2018-08-21 15:41:28,349 INFO: [mNotice: /Stage[main]/Heat::Keystone::Domain/Heat_config[DEFAULT/stack_domain_admin_password]/ensure: created[0m >2018-08-21 15:41:28,910 INFO: [mNotice: /Stage[main]/Heat::Keystone::Domain/Heat_config[DEFAULT/stack_user_domain_name]/ensure: created[0m >2018-08-21 15:41:29,053 INFO: [mNotice: /Stage[main]/Heat::Cron::Purge_deleted/Cron[heat-manage purge_deleted]/ensure: created[0m >2018-08-21 15:42:08,385 INFO: [mNotice: /Stage[main]/Ironic/Package[ironic-common]/ensure: created[0m >2018-08-21 15:42:08,457 INFO: [mNotice: /Stage[main]/Main/File[dnsmasq-ironic.conf]/ensure: defined content as '{md5}1c23f6b2b9a0910c3e32f02970493f00'[0m >2018-08-21 15:42:24,043 INFO: [mNotice: /Stage[main]/Ironic::Api/Package[ironic-api]/ensure: created[0m >2018-08-21 15:42:39,676 INFO: [mNotice: /Stage[main]/Ironic::Conductor/Package[ironic-conductor]/ensure: created[0m >2018-08-21 15:42:55,659 INFO: [mNotice: /Stage[main]/Ironic::Drivers::Ansible/Package[systemd-python]/ensure: created[0m >2018-08-21 15:43:12,115 INFO: [mNotice: /Stage[main]/Ironic::Drivers::Staging/Package[ironic-staging-drivers]/ensure: created[0m >2018-08-21 15:43:35,024 INFO: [mNotice: /Stage[main]/Ironic::Inspector/Package[ironic-inspector]/ensure: created[0m >2018-08-21 15:43:50,822 INFO: [mNotice: /Stage[main]/Ironic::Pxe/Package[tftp-server]/ensure: created[0m >2018-08-21 15:44:11,398 INFO: [mNotice: /Stage[main]/Ironic::Pxe/Package[syslinux]/ensure: created[0m >2018-08-21 15:44:28,968 INFO: [mNotice: /Stage[main]/Ironic::Pxe/Package[ipxe]/ensure: created[0m >2018-08-21 15:44:28,972 INFO: [mNotice: /Stage[main]/Ironic::Deps/Anchor[ironic::install::end]: Triggered 'refresh' from 4 events[0m >2018-08-21 15:44:28,997 INFO: [mNotice: 
/Stage[main]/Ironic::Deps/Anchor[ironic-inspector::install::end]: Triggered 'refresh' from 1 events[0m >2018-08-21 15:44:29,116 INFO: [mNotice: /Stage[main]/Ironic::Glance/Ironic_config[glance/auth_type]/ensure: created[0m >2018-08-21 15:44:29,228 INFO: [mNotice: /Stage[main]/Ironic::Glance/Ironic_config[glance/username]/ensure: created[0m >2018-08-21 15:44:29,337 INFO: [mNotice: /Stage[main]/Ironic::Glance/Ironic_config[glance/password]/ensure: created[0m >2018-08-21 15:44:29,445 INFO: [mNotice: /Stage[main]/Ironic::Glance/Ironic_config[glance/auth_url]/ensure: created[0m >2018-08-21 15:44:29,553 INFO: [mNotice: /Stage[main]/Ironic::Glance/Ironic_config[glance/project_name]/ensure: created[0m >2018-08-21 15:44:29,664 INFO: [mNotice: /Stage[main]/Ironic::Glance/Ironic_config[glance/user_domain_name]/ensure: created[0m >2018-08-21 15:44:29,772 INFO: [mNotice: /Stage[main]/Ironic::Glance/Ironic_config[glance/project_domain_name]/ensure: created[0m >2018-08-21 15:44:30,809 INFO: [mNotice: /Stage[main]/Ironic::Neutron/Ironic_config[neutron/auth_type]/ensure: created[0m >2018-08-21 15:44:30,918 INFO: [mNotice: /Stage[main]/Ironic::Neutron/Ironic_config[neutron/username]/ensure: created[0m >2018-08-21 15:44:31,026 INFO: [mNotice: /Stage[main]/Ironic::Neutron/Ironic_config[neutron/password]/ensure: created[0m >2018-08-21 15:44:31,132 INFO: [mNotice: /Stage[main]/Ironic::Neutron/Ironic_config[neutron/auth_url]/ensure: created[0m >2018-08-21 15:44:31,241 INFO: [mNotice: /Stage[main]/Ironic::Neutron/Ironic_config[neutron/project_name]/ensure: created[0m >2018-08-21 15:44:31,349 INFO: [mNotice: /Stage[main]/Ironic::Neutron/Ironic_config[neutron/user_domain_name]/ensure: created[0m >2018-08-21 15:44:31,457 INFO: [mNotice: /Stage[main]/Ironic::Neutron/Ironic_config[neutron/project_domain_name]/ensure: created[0m >2018-08-21 15:44:31,564 INFO: [mNotice: /Stage[main]/Ironic/Ironic_config[DEFAULT/auth_strategy]/ensure: created[0m >2018-08-21 15:44:31,674 INFO: [mNotice: 
/Stage[main]/Ironic/Ironic_config[DEFAULT/my_ip]/ensure: created
>2018-08-21 15:44:31,781 INFO: Notice: /Stage[main]/Ironic/Ironic_config[DEFAULT/default_resource_class]/ensure: created
>2018-08-21 15:44:31,791 INFO: Notice: /Stage[main]/Ironic::Db::Sync/File[/var/log/ironic/ironic-dbsync.log]/ensure: created
>2018-08-21 15:44:31,898 INFO: Notice: /Stage[main]/Ironic::Api/Ironic_config[api/host_ip]/ensure: created
>2018-08-21 15:44:32,007 INFO: Notice: /Stage[main]/Ironic::Api/Ironic_config[api/port]/ensure: created
>2018-08-21 15:44:32,116 INFO: Notice: /Stage[main]/Ironic::Api/Ironic_config[api/max_limit]/ensure: created
>2018-08-21 15:44:32,226 INFO: Notice: /Stage[main]/Ironic::Api/Ironic_config[api/api_workers]/ensure: created
>2018-08-21 15:44:32,957 INFO: Notice: /Stage[main]/Ironic::Drivers::Agent/Ironic_config[agent/deploy_logs_collect]/ensure: created
>2018-08-21 15:44:33,067 INFO: Notice: /Stage[main]/Ironic::Drivers::Agent/Ironic_config[agent/deploy_logs_storage_backend]/ensure: created
>2018-08-21 15:44:33,678 INFO: Notice: /Stage[main]/Ironic::Drivers::Agent/Ironic_config[agent/deploy_logs_local_path]/ensure: created
>2018-08-21 15:44:33,900 INFO: Notice: /Stage[main]/Ironic::Conductor/Ironic_config[DEFAULT/enabled_drivers]/ensure: created
>2018-08-21 15:44:34,008 INFO: Notice: /Stage[main]/Ironic::Conductor/Ironic_config[DEFAULT/enabled_hardware_types]/ensure: created
>2018-08-21 15:44:34,117 INFO: Notice: /Stage[main]/Ironic::Conductor/Ironic_config[conductor/max_time_interval]/ensure: created
>2018-08-21 15:44:34,224 INFO: Notice: /Stage[main]/Ironic::Conductor/Ironic_config[conductor/force_power_state_during_sync]/ensure: created
>2018-08-21 15:44:34,333 INFO: Notice: /Stage[main]/Ironic::Conductor/Ironic_config[conductor/automated_clean]/ensure: created
>2018-08-21 15:44:34,440 INFO: Notice: /Stage[main]/Ironic::Conductor/Ironic_config[conductor/api_url]/ensure: created
>2018-08-21 15:44:34,548 INFO: Notice: /Stage[main]/Ironic::Conductor/Ironic_config[deploy/http_url]/ensure: created
>2018-08-21 15:44:34,707 INFO: Notice: /Stage[main]/Ironic::Conductor/Ironic_config[deploy/erase_devices_priority]/ensure: created
>2018-08-21 15:44:34,815 INFO: Notice: /Stage[main]/Ironic::Conductor/Ironic_config[deploy/erase_devices_metadata_priority]/ensure: created
>2018-08-21 15:44:35,125 INFO: Notice: /Stage[main]/Ironic::Conductor/Ironic_config[deploy/default_boot_option]/ensure: created
>2018-08-21 15:44:35,862 INFO: Notice: /Stage[main]/Ironic::Conductor/Ironic_config[neutron/cleaning_network]/ensure: created
>2018-08-21 15:44:35,973 INFO: Notice: /Stage[main]/Ironic::Conductor/Ironic_config[neutron/provisioning_network]/ensure: created
>2018-08-21 15:44:37,428 INFO: Notice: /Stage[main]/Ironic::Drivers::Ilo/Ironic_config[ilo/default_boot_mode]/ensure: created
>2018-08-21 15:44:37,537 INFO: Notice: /Stage[main]/Ironic::Drivers::Inspector/Ironic_config[inspector/enabled]/ensure: created
>2018-08-21 15:44:37,695 INFO: Notice: /Stage[main]/Ironic::Drivers::Inspector/Ironic_config[inspector/auth_type]/ensure: created
>2018-08-21 15:44:37,803 INFO: Notice: /Stage[main]/Ironic::Drivers::Inspector/Ironic_config[inspector/username]/ensure: created
>2018-08-21 15:44:37,912 INFO: Notice: /Stage[main]/Ironic::Drivers::Inspector/Ironic_config[inspector/password]/ensure: created
>2018-08-21 15:44:38,020 INFO: Notice: /Stage[main]/Ironic::Drivers::Inspector/Ironic_config[inspector/auth_url]/ensure: created
>2018-08-21 15:44:38,129 INFO: Notice: /Stage[main]/Ironic::Drivers::Inspector/Ironic_config[inspector/project_name]/ensure: created
>2018-08-21 15:44:38,240 INFO: Notice: /Stage[main]/Ironic::Drivers::Inspector/Ironic_config[inspector/user_domain_name]/ensure: created
>2018-08-21 15:44:38,350 INFO: Notice: /Stage[main]/Ironic::Drivers::Inspector/Ironic_config[inspector/project_domain_name]/ensure: created
>2018-08-21 15:44:38,564 INFO: Notice: /Stage[main]/Ironic::Drivers::Pxe/Ironic_config[pxe/ipxe_enabled]/ensure: created
>2018-08-21 15:44:39,241 INFO: Notice: /Stage[main]/Ironic::Drivers::Pxe/Ironic_config[pxe/pxe_bootfile_name]/ensure: created
>2018-08-21 15:44:39,349 INFO: Notice: /Stage[main]/Ironic::Drivers::Pxe/Ironic_config[pxe/pxe_config_template]/ensure: created
>2018-08-21 15:44:39,508 INFO: Notice: /Stage[main]/Ironic::Drivers::Pxe/Ironic_config[pxe/tftp_root]/ensure: created
>2018-08-21 15:44:39,667 INFO: Notice: /Stage[main]/Ironic::Drivers::Pxe/Ironic_config[pxe/tftp_master_path]/ensure: created
>2018-08-21 15:44:39,826 INFO: Notice: /Stage[main]/Ironic::Drivers::Pxe/Ironic_config[pxe/uefi_pxe_bootfile_name]/ensure: created
>2018-08-21 15:44:39,934 INFO: Notice: /Stage[main]/Ironic::Drivers::Pxe/Ironic_config[pxe/uefi_pxe_config_template]/ensure: created
>2018-08-21 15:44:40,044 INFO: Notice: /Stage[main]/Ironic::Drivers::Pxe/Ironic_config[pxe/ipxe_timeout]/ensure: created
>2018-08-21 15:44:40,151 INFO: Notice: /Stage[main]/Ironic::Inspector/File[/etc/ironic-inspector/inspector.conf]/owner: owner changed 'root' to 'ironic-inspector'
>2018-08-21 15:44:40,221 INFO: Notice: /Stage[main]/Ironic::Inspector/File[/etc/ironic-inspector/dnsmasq.conf]/content: content changed '{md5}9eabe6f969928fde6524d0dd00781479' to '{md5}6826e7a886030e536434ac3b3ab341b0'
>2018-08-21 15:44:40,254 INFO: Notice: /Stage[main]/Ironic::Inspector/Ironic_inspector_config[DEFAULT/listen_address]/ensure: created
>2018-08-21 15:44:40,285 INFO: Notice: /Stage[main]/Ironic::Inspector/Ironic_inspector_config[DEFAULT/auth_strategy]/ensure: created
>2018-08-21 15:44:40,331 INFO: Notice: /Stage[main]/Ironic::Inspector/Ironic_inspector_config[capabilities/boot_mode]/ensure: created
>2018-08-21 15:44:40,362 INFO: Notice: /Stage[main]/Ironic::Inspector/Ironic_inspector_config[iptables/dnsmasq_interface]/ensure: created
>2018-08-21 15:44:40,393 INFO: Notice: /Stage[main]/Ironic::Inspector/Ironic_inspector_config[processing/ramdisk_logs_dir]/ensure: created
>2018-08-21 15:44:40,455 INFO: Notice: /Stage[main]/Ironic::Inspector/Ironic_inspector_config[processing/keep_ports]/ensure: created
>2018-08-21 15:44:40,487 INFO: Notice: /Stage[main]/Ironic::Inspector/Ironic_inspector_config[processing/store_data]/ensure: created
>2018-08-21 15:44:40,518 INFO: Notice: /Stage[main]/Ironic::Inspector/Ironic_inspector_config[ironic/auth_type]/ensure: created
>2018-08-21 15:44:40,551 INFO: Notice: /Stage[main]/Ironic::Inspector/Ironic_inspector_config[ironic/username]/ensure: created
>2018-08-21 15:44:40,584 INFO: Notice: /Stage[main]/Ironic::Inspector/Ironic_inspector_config[ironic/password]/ensure: created
>2018-08-21 15:44:40,616 INFO: Notice: /Stage[main]/Ironic::Inspector/Ironic_inspector_config[ironic/project_name]/ensure: created
>2018-08-21 15:44:40,649 INFO: Notice: /Stage[main]/Ironic::Inspector/Ironic_inspector_config[ironic/project_domain_name]/ensure: created
>2018-08-21 15:44:40,682 INFO: Notice: /Stage[main]/Ironic::Inspector/Ironic_inspector_config[ironic/user_domain_name]/ensure: created
>2018-08-21 15:44:40,714 INFO: Notice: /Stage[main]/Ironic::Inspector/Ironic_inspector_config[ironic/auth_url]/ensure: created
>2018-08-21 15:44:40,746 INFO: Notice: /Stage[main]/Ironic::Inspector/Ironic_inspector_config[ironic/max_retries]/ensure: created
>2018-08-21 15:44:40,779 INFO: Notice: /Stage[main]/Ironic::Inspector/Ironic_inspector_config[ironic/retry_interval]/ensure: created
>2018-08-21 15:44:40,811 INFO: Notice: /Stage[main]/Ironic::Inspector/Ironic_inspector_config[swift/auth_type]/ensure: created
>2018-08-21 15:44:40,844 INFO: Notice: /Stage[main]/Ironic::Inspector/Ironic_inspector_config[swift/username]/ensure: created
>2018-08-21 15:44:40,876 INFO: Notice: /Stage[main]/Ironic::Inspector/Ironic_inspector_config[swift/password]/ensure: created
>2018-08-21 15:44:40,909 INFO: Notice: /Stage[main]/Ironic::Inspector/Ironic_inspector_config[swift/project_name]/ensure: created
>2018-08-21 15:44:40,942 INFO: Notice: /Stage[main]/Ironic::Inspector/Ironic_inspector_config[swift/project_domain_name]/ensure: created
>2018-08-21 15:44:41,509 INFO: Notice: /Stage[main]/Ironic::Inspector/Ironic_inspector_config[swift/user_domain_name]/ensure: created
>2018-08-21 15:44:41,541 INFO: Notice: /Stage[main]/Ironic::Inspector/Ironic_inspector_config[swift/auth_url]/ensure: created
>2018-08-21 15:44:41,573 INFO: Notice: /Stage[main]/Ironic::Inspector/Ironic_inspector_config[processing/processing_hooks]/ensure: created
>2018-08-21 15:44:41,619 INFO: Notice: /Stage[main]/Ironic::Inspector/Ironic_inspector_config[discovery/enroll_node_driver]/ensure: created
>2018-08-21 15:44:41,651 INFO: Notice: /Stage[main]/Ironic::Inspector::Pxe_filter/Ironic_inspector_config[pxe_filter/driver]/ensure: created
>2018-08-21 15:44:41,699 INFO: Notice: /Stage[main]/Ironic::Inspector::Pxe_filter::Dnsmasq/Ironic_inspector_config[dnsmasq_pxe_filter/dhcp_hostsdir]/ensure: created
>2018-08-21 15:44:41,747 INFO: Notice: /Stage[main]/Ironic::Pxe/File[/tftpboot]/ensure: created
>2018-08-21 15:44:41,757 INFO: Notice: /Stage[main]/Ironic::Pxe/File[/tftpboot/pxelinux.cfg]/ensure: created
>2018-08-21 15:44:41,779 INFO: Notice: /Stage[main]/Ironic::Pxe/File[/httpboot]/ensure: created
>2018-08-21 15:44:41,847 INFO: Notice: /Stage[main]/Ironic::Inspector/File[/httpboot/inspector.ipxe]/ensure: defined content as '{md5}38460d0a94a08e1d102ea3b880169752'
>2018-08-21 15:44:41,905 INFO: Notice: /Stage[main]/Ironic::Pxe/File[/tftpboot/map-file]/ensure: defined content as '{md5}1c4343c656b7f7b9de48495fdc2b6c5e'
>2018-08-21 15:44:41,981 INFO: Notice: /Stage[main]/Ironic::Pxe/File[/tftpboot/undionly.kpxe]/ensure: defined content as '{md5}60d84c8e9035fac59c73ed4cee8dc82c'
>2018-08-21 15:44:42,565 INFO: Notice: /Stage[main]/Ironic::Pxe/File[/tftpboot/ipxe.efi]/ensure: defined content as '{md5}8f49ea062dadf0290b5f8b7e5f42a9b9'
>2018-08-21 15:44:42,675 INFO: Notice: /Stage[main]/Ironic::Service_catalog/Ironic_config[service_catalog/auth_type]/ensure: created
>2018-08-21 15:44:42,785 INFO: Notice: /Stage[main]/Ironic::Service_catalog/Ironic_config[service_catalog/username]/ensure: created
>2018-08-21 15:44:42,893 INFO: Notice: /Stage[main]/Ironic::Service_catalog/Ironic_config[service_catalog/password]/ensure: created
>2018-08-21 15:44:43,001 INFO: Notice: /Stage[main]/Ironic::Service_catalog/Ironic_config[service_catalog/auth_url]/ensure: created
>2018-08-21 15:44:43,110 INFO: Notice: /Stage[main]/Ironic::Service_catalog/Ironic_config[service_catalog/project_name]/ensure: created
>2018-08-21 15:44:43,219 INFO: Notice: /Stage[main]/Ironic::Service_catalog/Ironic_config[service_catalog/user_domain_name]/ensure: created
>2018-08-21 15:44:43,328 INFO: Notice: /Stage[main]/Ironic::Service_catalog/Ironic_config[service_catalog/project_domain_name]/ensure: created
>2018-08-21 15:44:43,439 INFO: Notice: /Stage[main]/Ironic::Swift/Ironic_config[swift/auth_type]/ensure: created
>2018-08-21 15:44:43,548 INFO: Notice: /Stage[main]/Ironic::Swift/Ironic_config[swift/username]/ensure: created
>2018-08-21 15:44:43,656 INFO: Notice: /Stage[main]/Ironic::Swift/Ironic_config[swift/password]/ensure: created
>2018-08-21 15:44:43,764 INFO: Notice: /Stage[main]/Ironic::Swift/Ironic_config[swift/auth_url]/ensure: created
>2018-08-21 15:44:43,873 INFO: Notice: /Stage[main]/Ironic::Swift/Ironic_config[swift/project_name]/ensure: created
>2018-08-21 15:44:43,984 INFO: Notice: /Stage[main]/Ironic::Swift/Ironic_config[swift/user_domain_name]/ensure: created
>2018-08-21 15:44:44,093 INFO: Notice: /Stage[main]/Ironic::Swift/Ironic_config[swift/project_domain_name]/ensure: created
>2018-08-21 15:45:15,417 INFO: Notice: /Stage[main]/Main/Package[openstack-tempest]/ensure: created
>2018-08-21 15:45:33,045 INFO: Notice: /Stage[main]/Main/Package[subunit-filters]/ensure: created
>2018-08-21 15:45:33,245 INFO: Notice: /Stage[main]/Main/Group[docker]/ensure: created
>2018-08-21 15:45:33,468 INFO: Notice: /Stage[main]/Main/User[docker_user]/groups: groups changed '' to ['docker']
>2018-08-21 15:45:57,147 INFO: Notice: /Stage[main]/Tripleo::Profile::Base::Docker_registry/Package[docker-distribution]/ensure: created
>2018-08-21 15:45:57,206 INFO: Notice: /Stage[main]/Tripleo::Profile::Base::Docker_registry/File[/etc/docker-distribution/registry/config.yml]/content: content changed '{md5}fcc7b86bd3a8b9b41577e3af434de461' to '{md5}944be5b9a5850417c064a8ff5005e522'
>2018-08-21 15:45:58,168 INFO: Notice: /Stage[main]/Tripleo::Profile::Base::Docker_registry/Service[docker-distribution]/ensure: ensure changed 'stopped' to 'running'
>2018-08-21 15:46:22,860 INFO: Notice: /Stage[main]/Mistral/Package[mistral-common]/ensure: created
>2018-08-21 15:46:39,063 INFO: Notice: /Stage[main]/Mistral::Api/Package[mistral-api]/ensure: created
>2018-08-21 15:46:54,728 INFO: Notice: /Stage[main]/Mistral::Engine/Package[mistral-engine]/ensure: created
>2018-08-21 15:47:10,488 INFO: Notice: /Stage[main]/Mistral::Executor/Package[mistral-executor]/ensure: created
>2018-08-21 15:47:10,492 INFO: Notice: /Stage[main]/Mistral::Deps/Anchor[mistral::install::end]: Triggered 'refresh' from 4 events
>2018-08-21 15:47:10,517 INFO: Notice: /Stage[main]/Mistral::Api/Mistral_config[api/api_workers]/ensure: created
>2018-08-21 15:47:10,521 INFO: Notice: /Stage[main]/Mistral::Api/Mistral_config[api/host]/ensure: created
>2018-08-21 15:47:10,535 INFO: Notice: /Stage[main]/Mistral::Engine/Mistral_config[engine/execution_field_size_limit_kb]/ensure: created
>2018-08-21 15:47:10,539 INFO: Notice: /Stage[main]/Mistral::Engine/Mistral_config[execution_expiration_policy/evaluation_interval]/ensure: created
>2018-08-21 15:47:10,543 INFO: Notice: /Stage[main]/Mistral::Engine/Mistral_config[execution_expiration_policy/older_than]/ensure: created
>2018-08-21 15:47:10,559 INFO: Notice: /Stage[main]/Mistral::Cron_trigger/Mistral_config[cron_trigger/execution_interval]/ensure: created
>2018-08-21 15:47:30,750 INFO: Notice: /Stage[main]/Tripleo::Ui/Package[openstack-tripleo-ui]/ensure: created
>2018-08-21 15:47:46,878 INFO: Notice: /Stage[main]/Tripleo::Profile::Base::Validations/Package[openstack-tripleo-validations]/ensure: created
>2018-08-21 15:47:47,340 INFO: Notice: /Stage[main]/Tripleo::Profile::Base::Validations/User[validations]/ensure: created
>2018-08-21 15:48:33,754 INFO: Notice: /Stage[main]/Zaqar/Package[zaqar-common]/ensure: created
>2018-08-21 15:48:33,759 INFO: Notice: /Stage[main]/Zaqar::Deps/Anchor[zaqar::install::end]: Triggered 'refresh' from 1 events
>2018-08-21 15:48:33,824 INFO: Notice: /Stage[main]/Zaqar::Keystone::Trust/Zaqar_config[trustee/username]/ensure: created
>2018-08-21 15:48:33,909 INFO: Notice: /Stage[main]/Zaqar::Keystone::Trust/Zaqar_config[trustee/user_domain_name]/ensure: created
>2018-08-21 15:48:33,965 INFO: Notice: /Stage[main]/Zaqar::Keystone::Trust/Zaqar_config[trustee/auth_url]/ensure: created
>2018-08-21 15:48:34,049 INFO: Notice: /Stage[main]/Zaqar::Keystone::Trust/Zaqar_config[trustee/auth_type]/ensure: created
>2018-08-21 15:48:34,110 INFO: Notice: /Stage[main]/Zaqar/Zaqar_config[DEFAULT/auth_strategy]/ensure: created
>2018-08-21 15:48:34,198 INFO: Notice: /Stage[main]/Zaqar/Zaqar_config[DEFAULT/unreliable]/ensure: created
>2018-08-21 15:48:34,308 INFO: Notice: /Stage[main]/Zaqar/Zaqar_config[storage/message_pipeline]/ensure: created
>2018-08-21 15:48:34,417 INFO: Notice: /Stage[main]/Zaqar/Zaqar_config[transport/max_messages_post_size]/ensure: created
>2018-08-21 15:48:34,474 INFO: Notice: /Stage[main]/Zaqar/Zaqar_config[drivers/message_store]/ensure: created
>2018-08-21 15:48:34,530 INFO: Notice: /Stage[main]/Zaqar/Zaqar_config[drivers/management_store]/ensure: created
>2018-08-21 15:48:34,590 INFO: Notice: /Stage[main]/Zaqar::Management::Sqlalchemy/Zaqar_config[drivers:management_store:sqlalchemy/uri]/ensure: created
>2018-08-21 15:48:34,649 INFO: Notice: /Stage[main]/Zaqar::Messaging::Swift/Zaqar_config[drivers:message_store:swift/uri]/ensure: created
>2018-08-21 15:48:34,706 INFO: Notice: /Stage[main]/Zaqar::Messaging::Swift/Zaqar_config[drivers:message_store:swift/auth_url]/ensure: created
>2018-08-21 15:48:34,768 INFO: Notice: /Stage[main]/Zaqar::Transport::Websocket/Zaqar_config[drivers:transport:websocket/bind]/ensure: created
>2018-08-21 15:48:34,879 INFO: Notice: /Stage[main]/Zaqar::Transport::Websocket/Zaqar_config[drivers:transport:websocket/notification_bind]/ensure: created
>2018-08-21 15:48:47,591 INFO: Notice: /Stage[main]/Main/Package[firewalld]/ensure: purged
>2018-08-21 15:48:47,625 INFO: Notice: /Stage[main]/Main/Sysctl::Value[net.ipv4.ip_forward]/Sysctl[net.ipv4.ip_forward]/ensure: created
>2018-08-21 15:48:48,633 INFO: Notice: /Stage[main]/Main/Sysctl::Value[net.ipv4.ip_forward]/Sysctl_runtime[net.ipv4.ip_forward]/val: val changed '0' to '1'
>2018-08-21 15:49:51,788 INFO: Notice: /Stage[main]/Tripleo::Profile::Base::Docker/Package[docker]/ensure: created
>2018-08-21 15:49:51,802 INFO: Notice: /Stage[main]/Tripleo::Profile::Base::Docker/File[/etc/systemd/system/docker.service.d]/ensure: created
>2018-08-21 15:49:51,861 INFO: Notice: /Stage[main]/Tripleo::Profile::Base::Docker/File[/etc/systemd/system/docker.service.d/99-unset-mountflags.conf]/ensure: defined content as '{md5}b984426de0b5978853686a649b64e4b8'
>2018-08-21 15:49:52,097 INFO: Notice: /Stage[main]/Tripleo::Profile::Base::Docker/Exec[systemd daemon-reload]: Triggered 'refresh' from 1 events
>2018-08-21 15:49:52,535 INFO: Notice: /Stage[main]/Tripleo::Profile::Base::Docker/Augeas[docker-sysconfig-options]/returns: executed successfully
>2018-08-21 15:49:52,943 INFO: Notice: /Stage[main]/Tripleo::Profile::Base::Docker/Augeas[docker-sysconfig-registry]/returns: executed successfully
>2018-08-21 15:49:53,060 INFO: Notice: /Stage[main]/Tripleo::Profile::Base::Docker/Augeas[docker-daemon.json-debug]/returns: executed successfully
>2018-08-21 15:49:53,430 INFO: Notice: /Stage[main]/Tripleo::Profile::Base::Docker/Augeas[docker-sysconfig-storage]/returns: executed successfully
>2018-08-21 15:49:53,779 INFO: Notice: /Stage[main]/Tripleo::Profile::Base::Docker/Augeas[docker-sysconfig-network]/returns: executed successfully
>2018-08-21 15:49:57,729 INFO: Notice: /Stage[main]/Tripleo::Profile::Base::Docker/Service[docker]/ensure: ensure changed 'stopped' to 'running'
>2018-08-21 15:49:58,956 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[105 ntp]/Firewall[105 ntp ipv4]/ensure: created
>2018-08-21 15:49:59,510 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[105 ntp]/Firewall[105 ntp ipv6]/ensure: created
>2018-08-21 15:50:00,675 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[106 vrrp]/Firewall[106 vrrp ipv4]/ensure: created
>2018-08-21 15:50:01,149 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[106 vrrp]/Firewall[106 vrrp ipv6]/ensure: created
>2018-08-21 15:50:01,780 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[107 haproxy stats]/Firewall[107 haproxy stats ipv4]/ensure: created
>2018-08-21 15:50:02,803 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[107 haproxy stats]/Firewall[107 haproxy stats ipv6]/ensure: created
>2018-08-21 15:50:03,456 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[108 redis]/Firewall[108 redis ipv4]/ensure: created
>2018-08-21 15:50:04,516 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[108 redis]/Firewall[108 redis ipv6]/ensure: created
>2018-08-21 15:50:05,182 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[110 ceph]/Firewall[110 ceph ipv4]/ensure: created
>2018-08-21 15:50:06,223 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[110 ceph]/Firewall[110 ceph ipv6]/ensure: created
>2018-08-21 15:50:07,461 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[111 keystone]/Firewall[111 keystone ipv4]/ensure: created
>2018-08-21 15:50:07,977 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[111 keystone]/Firewall[111 keystone ipv6]/ensure: created
>2018-08-21 15:50:09,299 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[112 glance]/Firewall[112 glance ipv4]/ensure: created
>2018-08-21 15:50:09,827 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[112 glance]/Firewall[112 glance ipv6]/ensure: created
>2018-08-21 15:50:11,194 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[113 nova]/Firewall[113 nova ipv4]/ensure: created
>2018-08-21 15:50:11,735 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[113 nova]/Firewall[113 nova ipv6]/ensure: created
>2018-08-21 15:50:13,016 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[114 neutron server]/Firewall[114 neutron server ipv4]/ensure: created
>2018-08-21 15:50:14,193 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[114 neutron server]/Firewall[114 neutron server ipv6]/ensure: created
>2018-08-21 15:50:15,515 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[115 neutron dhcp input]/Firewall[115 neutron dhcp input ipv4]/ensure: created
>2018-08-21 15:50:16,110 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[115 neutron dhcp input]/Firewall[115 neutron dhcp input ipv6]/ensure: created
>2018-08-21 15:50:17,489 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[116 neutron dhcp output]/Firewall[116 neutron dhcp output ipv4]/ensure: created
>2018-08-21 15:50:18,723 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[116 neutron dhcp output]/Firewall[116 neutron dhcp output ipv6]/ensure: created
>2018-08-21 15:50:19,549 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[118 neutron vxlan networks]/Firewall[118 neutron vxlan networks ipv4]/ensure: created
>2018-08-21 15:50:20,777 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[118 neutron vxlan networks]/Firewall[118 neutron vxlan networks ipv6]/ensure: created
>2018-08-21 15:50:22,203 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[119 cinder]/Firewall[119 cinder ipv4]/ensure: created
>2018-08-21 15:50:23,394 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[119 cinder]/Firewall[119 cinder ipv6]/ensure: created
>2018-08-21 15:50:24,797 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[120 iscsi initiator]/Firewall[120 iscsi initiator ipv4]/ensure: created
>2018-08-21 15:50:25,460 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[120 iscsi initiator]/Firewall[120 iscsi initiator ipv6]/ensure: created
>2018-08-21 15:50:26,928 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[121 memcached]/Firewall[121 memcached ipv4]/ensure: created
>2018-08-21 15:50:28,356 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[122 swift proxy]/Firewall[122 swift proxy ipv4]/ensure: created
>2018-08-21 15:50:29,643 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[122 swift proxy]/Firewall[122 swift proxy ipv6]/ensure: created
>2018-08-21 15:50:31,158 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[123 swift storage]/Firewall[123 swift storage ipv4]/ensure: created
>2018-08-21 15:50:32,481 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[123 swift storage]/Firewall[123 swift storage ipv6]/ensure: created
>2018-08-21 15:50:34,001 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[125 heat]/Firewall[125 heat ipv4]/ensure: created
>2018-08-21 15:50:35,268 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[125 heat]/Firewall[125 heat ipv6]/ensure: created
>2018-08-21 15:50:36,819 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[126 horizon]/Firewall[126 horizon ipv4]/ensure: created
>2018-08-21 15:50:38,094 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[126 horizon]/Firewall[126 horizon ipv6]/ensure: created
>2018-08-21 15:50:39,662 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[127 snmp]/Firewall[127 snmp ipv4]/ensure: created
>2018-08-21 15:50:40,964 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[127 snmp]/Firewall[127 snmp ipv6]/ensure: created
>2018-08-21 15:50:42,544 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[128 aodh]/Firewall[128 aodh ipv4]/ensure: created
>2018-08-21 15:50:43,860 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[128 aodh]/Firewall[128 aodh ipv6]/ensure: created
>2018-08-21 15:50:45,425 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[129 gnocchi-api]/Firewall[129 gnocchi-api ipv4]/ensure: created
>2018-08-21 15:50:46,827 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[129 gnocchi-api]/Firewall[129 gnocchi-api ipv6]/ensure: created
>2018-08-21 15:50:48,452 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[130 tftp]/Firewall[130 tftp ipv4]/ensure: created
>2018-08-21 15:50:49,801 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[130 tftp]/Firewall[130 tftp ipv6]/ensure: created
>2018-08-21 15:50:52,045 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[131 novnc]/Firewall[131 novnc ipv4]/ensure: created
>2018-08-21 15:50:52,890 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[131 novnc]/Firewall[131 novnc ipv6]/ensure: created
>2018-08-21 15:50:55,760 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[132 mistral]/Firewall[132 mistral ipv4]/ensure: created
>2018-08-21 15:50:57,204 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[132 mistral]/Firewall[132 mistral ipv6]/ensure: created
>2018-08-21 15:50:58,845 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[133 zaqar]/Firewall[133 zaqar ipv4]/ensure: created
>2018-08-21 15:51:00,311 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[133 zaqar]/Firewall[133 zaqar ipv6]/ensure: created
>2018-08-21 15:51:02,498 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[134 zaqar websockets]/Firewall[134 zaqar websockets ipv4]/ensure: created
>2018-08-21 15:51:03,997 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[134 zaqar websockets]/Firewall[134 zaqar websockets ipv6]/ensure: created
>2018-08-21 15:51:05,648 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[135 ironic]/Firewall[135 ironic ipv4]/ensure: created
>2018-08-21 15:51:07,139 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[135 ironic]/Firewall[135 ironic ipv6]/ensure: created
>2018-08-21 15:51:09,370 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[136 trove]/Firewall[136 trove ipv4]/ensure: created
>2018-08-21 15:51:10,814 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[136 trove]/Firewall[136 trove ipv6]/ensure: created
>2018-08-21 15:51:12,491 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[137 ironic-inspector]/Firewall[137 ironic-inspector ipv4]/ensure: created
>2018-08-21 15:51:14,530 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[137 ironic-inspector]/Firewall[137 ironic-inspector ipv6]/ensure: created
>2018-08-21 15:51:16,247 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[138 docker registry]/Firewall[138 docker registry ipv4]/ensure: created
>2018-08-21 15:51:17,752 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[138 docker registry]/Firewall[138 docker registry ipv6]/ensure: created
>2018-08-21 15:51:20,140 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[139 apache vhost]/Firewall[139 apache vhost ipv4]/ensure: created
>2018-08-21 15:51:21,672 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[139 apache vhost]/Firewall[139 apache vhost ipv6]/ensure: created
>2018-08-21 15:51:23,959 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[140 destination ctlplane-subnet cidr nat]/Firewall[140 destination ctlplane-subnet cidr nat ipv4]/ensure: created
>2018-08-21 15:51:25,740 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[140 source ctlplane-subnet cidr nat]/Firewall[140 source ctlplane-subnet cidr nat ipv4]/ensure: created
>2018-08-21 15:51:28,179 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[142 tripleo-ui]/Firewall[142 tripleo-ui ipv4]/ensure: created
>2018-08-21 15:51:29,721 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[142 tripleo-ui]/Firewall[142 tripleo-ui ipv6]/ensure: created
>2018-08-21 15:51:32,071 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[143 panko-api]/Firewall[143 panko-api ipv4]/ensure: created
>2018-08-21 15:51:33,685 INFO: Notice: /Stage[main]/Tripleo::Firewall/Tripleo::Firewall::Rule[143 panko-api]/Firewall[143 panko-api ipv6]/ensure: created
>2018-08-21 15:51:36,047 INFO: Notice: /Stage[main]/Tripleo::Firewall::Pre/Tripleo::Firewall::Rule[000 accept related established rules]/Firewall[000 accept related established rules ipv4]/ensure: created
>2018-08-21 15:51:37,672 INFO: Notice: /Stage[main]/Tripleo::Firewall::Pre/Tripleo::Firewall::Rule[000 accept related established rules]/Firewall[000 accept related established rules ipv6]/ensure: created
>2018-08-21 15:51:40,049 INFO: Notice: /Stage[main]/Tripleo::Firewall::Pre/Tripleo::Firewall::Rule[001 accept all icmp]/Firewall[001 accept all icmp ipv4]/ensure: created
>2018-08-21 15:51:42,316 INFO: Notice: /Stage[main]/Tripleo::Firewall::Pre/Tripleo::Firewall::Rule[001 accept all icmp]/Firewall[001 accept all icmp ipv6]/ensure: created
>2018-08-21 15:51:44,228 INFO: Notice: /Stage[main]/Tripleo::Firewall::Pre/Tripleo::Firewall::Rule[002 accept all to lo interface]/Firewall[002 accept all to lo interface ipv4]/ensure: created
>2018-08-21 15:51:46,488 INFO: Notice: /Stage[main]/Tripleo::Firewall::Pre/Tripleo::Firewall::Rule[002 accept all to lo interface]/Firewall[002 accept all to lo interface ipv6]/ensure: created
>2018-08-21 15:51:48,913 INFO: Notice: /Stage[main]/Tripleo::Firewall::Pre/Tripleo::Firewall::Rule[003 accept ssh]/Firewall[003 accept ssh ipv4]/ensure: created
>2018-08-21 15:51:50,543 INFO: Notice: /Stage[main]/Tripleo::Firewall::Pre/Tripleo::Firewall::Rule[003 accept ssh]/Firewall[003 accept ssh ipv6]/ensure: created
>2018-08-21 15:51:52,248 INFO: Notice: /Stage[main]/Tripleo::Firewall::Pre/Tripleo::Firewall::Rule[004 accept ipv6 dhcpv6]/Firewall[004 accept ipv6 dhcpv6 ipv6]/ensure: created
>2018-08-21 15:51:52,417 INFO: Notice: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.ipv4.ip_nonlocal_bind]/Sysctl[net.ipv4.ip_nonlocal_bind]/ensure: created
>2018-08-21 15:51:52,472 INFO: Notice: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.ipv4.ip_nonlocal_bind]/Sysctl_runtime[net.ipv4.ip_nonlocal_bind]/val: val changed '0' to '1'
>2018-08-21 15:51:52,478 INFO: Notice: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.ipv6.ip_nonlocal_bind]/Sysctl[net.ipv6.ip_nonlocal_bind]/ensure: created
>2018-08-21 15:51:53,162 INFO: Notice: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.ipv6.ip_nonlocal_bind]/Sysctl_runtime[net.ipv6.ip_nonlocal_bind]/val: val changed '0' to '1'
>2018-08-21 15:51:53,455 INFO: Notice: /Stage[main]/Ironic::Inspector::Db::Mysql/Openstacklib::Db::Mysql[ironic-inspector]/Mysql_database[ironic-inspector]/ensure: created
>2018-08-21 15:51:53,788 INFO: Notice: /Stage[main]/Glance::Policy/Oslo::Policy[glance_api_config]/Glance_api_config[oslo_policy/policy_file]/ensure: created
>2018-08-21 15:51:53,957 INFO: Notice: /Stage[main]/Glance::Policy/Oslo::Policy[glance_registry_config]/Glance_registry_config[oslo_policy/policy_file]/ensure: created
>2018-08-21 15:51:54,237 INFO: Notice: /Stage[main]/Glance::Api::Db/Oslo::Db[glance_api_config]/Glance_api_config[database/connection]/ensure: created
>2018-08-21 15:51:55,794 INFO: Notice: /Stage[main]/Glance::Api::Logging/Oslo::Log[glance_api_config]/Glance_api_config[DEFAULT/debug]/ensure: created
>2018-08-21 15:51:56,029 INFO: Notice: /Stage[main]/Glance::Api::Logging/Oslo::Log[glance_api_config]/Glance_api_config[DEFAULT/log_file]/ensure: created
>2018-08-21 15:51:56,158 INFO: Notice: /Stage[main]/Glance::Api::Logging/Oslo::Log[glance_api_config]/Glance_api_config[DEFAULT/log_dir]/ensure: created
>2018-08-21 15:51:57,063 INFO: Notice: /Stage[main]/Glance::Cache::Logging/Oslo::Log[glance_cache_config]/Glance_cache_config[DEFAULT/debug]/ensure: created
>2018-08-21 15:51:57,190 INFO: Notice: /Stage[main]/Glance::Cache::Logging/Oslo::Log[glance_cache_config]/Glance_cache_config[DEFAULT/log_file]/ensure: created
>2018-08-21 15:51:57,262 INFO: Notice: /Stage[main]/Glance::Cache::Logging/Oslo::Log[glance_cache_config]/Glance_cache_config[DEFAULT/log_dir]/ensure: created
>2018-08-21 15:51:58,414 INFO: Notice: /Stage[main]/Glance::Api::Authtoken/Keystone::Resource::Authtoken[glance_api_config]/Glance_api_config[keystone_authtoken/auth_uri]/ensure: created
>2018-08-21 15:51:58,531 INFO: Notice: /Stage[main]/Glance::Api::Authtoken/Keystone::Resource::Authtoken[glance_api_config]/Glance_api_config[keystone_authtoken/auth_type]/ensure: created
>2018-08-21 15:52:00,457 INFO: Notice: /Stage[main]/Glance::Api::Authtoken/Keystone::Resource::Authtoken[glance_api_config]/Glance_api_config[keystone_authtoken/auth_url]/ensure: created
>2018-08-21 15:52:00,573 INFO: Notice: /Stage[main]/Glance::Api::Authtoken/Keystone::Resource::Authtoken[glance_api_config]/Glance_api_config[keystone_authtoken/username]/ensure: created
>2018-08-21 15:52:00,689 INFO: Notice: /Stage[main]/Glance::Api::Authtoken/Keystone::Resource::Authtoken[glance_api_config]/Glance_api_config[keystone_authtoken/password]/ensure: created
>2018-08-21 15:52:00,807 INFO: Notice: /Stage[main]/Glance::Api::Authtoken/Keystone::Resource::Authtoken[glance_api_config]/Glance_api_config[keystone_authtoken/user_domain_name]/ensure: created
>2018-08-21 15:52:00,924 INFO: Notice: /Stage[main]/Glance::Api::Authtoken/Keystone::Resource::Authtoken[glance_api_config]/Glance_api_config[keystone_authtoken/project_name]/ensure: created
>2018-08-21 15:52:01,040 INFO: Notice: /Stage[main]/Glance::Api::Authtoken/Keystone::Resource::Authtoken[glance_api_config]/Glance_api_config[keystone_authtoken/project_domain_name]/ensure: created
>2018-08-21 15:52:01,266 INFO: Notice: /Stage[main]/Glance::Api/Oslo::Middleware[glance_api_config]/Glance_api_config[oslo_middleware/enable_proxy_headers_parsing]/ensure: created
>2018-08-21 15:52:02,480 INFO: Notice: /Stage[main]/Glance::Notify::Rabbitmq/Oslo::Messaging::Rabbit[glance_api_config]/Glance_api_config[oslo_messaging_rabbit/rabbit_password]/ensure: created
>2018-08-21 15:52:02,803 INFO: Notice: /Stage[main]/Glance::Notify::Rabbitmq/Oslo::Messaging::Rabbit[glance_api_config]/Glance_api_config[oslo_messaging_rabbit/rabbit_userid]/ensure: created
>2018-08-21 15:52:03,125 INFO: Notice: /Stage[main]/Glance::Notify::Rabbitmq/Oslo::Messaging::Rabbit[glance_api_config]/Glance_api_config[oslo_messaging_rabbit/rabbit_host]/ensure: created
>2018-08-21 15:52:03,704 INFO: Notice: /Stage[main]/Glance::Notify::Rabbitmq/Oslo::Messaging::Rabbit[glance_registry_config]/Glance_registry_config[oslo_messaging_rabbit/rabbit_password]/ensure: created
>2018-08-21 15:52:04,458 INFO: Notice: /Stage[main]/Glance::Notify::Rabbitmq/Oslo::Messaging::Rabbit[glance_registry_config]/Glance_registry_config[oslo_messaging_rabbit/rabbit_userid]/ensure: created
>2018-08-21 15:52:04,632 INFO: Notice: /Stage[main]/Glance::Notify::Rabbitmq/Oslo::Messaging::Rabbit[glance_registry_config]/Glance_registry_config[oslo_messaging_rabbit/rabbit_host]/ensure: created
>2018-08-21 15:52:05,262 INFO: Notice: /Stage[main]/Glance::Deps/Anchor[glance::config::end]: Triggered 'refresh' from 47 events
>2018-08-21 15:52:05,326 INFO: Notice: /Stage[main]/Glance::Db::Mysql/Openstacklib::Db::Mysql[glance]/Mysql_database[glance]/ensure: created
>2018-08-21 15:52:20,947 INFO: Notice: /Stage[main]/Nova::Api/Nova::Generic_service[api]/Package[nova-api]/ensure: created
>2018-08-21 15:52:46,760 INFO: Notice: /Stage[main]/Nova::Wsgi::Apache_placement/Nova::Generic_service[placement-api]/Package[nova-placement-api]/ensure: created
>2018-08-21 15:52:46,945 INFO: Notice: /Stage[main]/Keystone::Deps/Anchor[keystone::install::end]: Triggered 'refresh' from 1 events
>2018-08-21 15:52:47,045 INFO: Notice: /Stage[main]/Keystone/Keystone_config[DEFAULT/admin_token]/ensure: created
>2018-08-21 15:52:47,132 INFO: Notice: /Stage[main]/Keystone/Keystone_config[DEFAULT/public_bind_host]/ensure: created
>2018-08-21 15:52:47,219 INFO: Notice: /Stage[main]/Keystone/Keystone_config[DEFAULT/admin_bind_host]/ensure: created
>2018-08-21 15:52:47,306 INFO: Notice: /Stage[main]/Keystone/Keystone_config[DEFAULT/public_port]/ensure: created
>2018-08-21 15:52:47,394 INFO: Notice: /Stage[main]/Keystone/Keystone_config[DEFAULT/admin_port]/ensure: created
>2018-08-21 15:52:48,199 INFO: Notice: /Stage[main]/Keystone/Keystone_config[DEFAULT/public_endpoint]/ensure: created
>2018-08-21 15:52:48,325 INFO: Notice: /Stage[main]/Keystone/Keystone_config[token/driver]/ensure: created
>2018-08-21 15:52:48,408 INFO: Notice: /Stage[main]/Keystone/Keystone_config[token/expiration]/ensure: created
>2018-08-21 15:52:48,566 INFO: Notice: /Stage[main]/Keystone/Keystone_config[ssl/enable]/ensure: created
>2018-08-21 15:52:48,874 INFO: Notice: /Stage[main]/Keystone/Keystone_config[catalog/driver]/ensure: created
>2018-08-21 15:52:48,957 INFO: Notice: /Stage[main]/Keystone/Keystone_config[catalog/template_file]/ensure: created
>2018-08-21 15:52:49,270 INFO: Notice: /Stage[main]/Keystone/Keystone_config[token/provider]/ensure: created
>2018-08-21 15:52:49,431 INFO: Notice: /Stage[main]/Keystone/Keystone_config[eventlet_server/admin_workers]/ensure: created
>2018-08-21 15:52:49,515 INFO: Notice: /Stage[main]/Keystone/Keystone_config[eventlet_server/public_workers]/ensure: created
>2018-08-21 15:52:49,528 INFO: Notice:
/Stage[main]/Keystone/File[/etc/keystone/fernet-keys]/ensure: created[0m >2018-08-21 15:52:49,538 INFO: [mNotice: /Stage[main]/Keystone/File[/etc/keystone/credential-keys]/ensure: created[0m >2018-08-21 15:52:49,621 INFO: [mNotice: /Stage[main]/Keystone/Keystone_config[fernet_tokens/key_repository]/ensure: created[0m >2018-08-21 15:52:49,706 INFO: [mNotice: /Stage[main]/Keystone/Keystone_config[token/revoke_by_id]/ensure: created[0m >2018-08-21 15:52:49,791 INFO: [mNotice: /Stage[main]/Keystone/Keystone_config[fernet_tokens/max_active_keys]/ensure: created[0m >2018-08-21 15:52:49,875 INFO: [mNotice: /Stage[main]/Keystone/Keystone_config[credential/key_repository]/ensure: created[0m >2018-08-21 15:52:49,954 INFO: [mNotice: /Stage[main]/Apache::Mod::Mime/File[mime.conf]/ensure: defined content as '{md5}9da85e58f3bd6c780ce76db603b7f028'[0m >2018-08-21 15:52:50,611 INFO: [mNotice: /Stage[main]/Apache::Mod::Mime_magic/File[mime_magic.conf]/ensure: defined content as '{md5}b258529b332429e2ff8344f726a95457'[0m >2018-08-21 15:52:50,662 INFO: [mNotice: /Stage[main]/Apache::Mod::Alias/File[alias.conf]/ensure: defined content as '{md5}983e865be85f5e0daaed7433db82995e'[0m >2018-08-21 15:52:50,712 INFO: [mNotice: /Stage[main]/Apache::Mod::Autoindex/File[autoindex.conf]/ensure: defined content as '{md5}2421a3c6df32c7e38c2a7a22afdf5728'[0m >2018-08-21 15:52:50,763 INFO: [mNotice: /Stage[main]/Apache::Mod::Deflate/File[deflate.conf]/ensure: defined content as '{md5}a045d750d819b1e9dae3fbfb3f20edd5'[0m >2018-08-21 15:52:50,813 INFO: [mNotice: /Stage[main]/Apache::Mod::Dir/File[dir.conf]/ensure: defined content as '{md5}c741d8ea840e6eb999d739eed47c69d7'[0m >2018-08-21 15:52:50,863 INFO: [mNotice: /Stage[main]/Apache::Mod::Negotiation/File[negotiation.conf]/ensure: defined content as '{md5}47284b5580b986a6ba32580b6ffb9fd7'[0m >2018-08-21 15:52:50,914 INFO: [mNotice: /Stage[main]/Apache::Mod::Setenvif/File[setenvif.conf]/ensure: defined content as 
'{md5}c7ede4173da1915b7ec088201f030c28'[0m >2018-08-21 15:52:50,973 INFO: [mNotice: /Stage[main]/Apache::Mod::Prefork/File[/etc/httpd/conf.modules.d/prefork.conf]/ensure: defined content as '{md5}f58b0483b70b4e73b5f67ff37b8f24a0'[0m >2018-08-21 15:52:50,989 INFO: [mNotice: /Stage[main]/Keystone::Wsgi::Apache/File[/var/www/cgi-bin/keystone]/ensure: created[0m >2018-08-21 15:52:51,075 INFO: [mNotice: /Stage[main]/Keystone::Wsgi::Apache/File[keystone_wsgi_admin]/ensure: defined content as '{md5}d6dda52b0e14d80a652ecf42686d3962'[0m >2018-08-21 15:52:51,133 INFO: [mNotice: /Stage[main]/Keystone::Wsgi::Apache/File[keystone_wsgi_main]/ensure: defined content as '{md5}072422f0d75777ed1783e6910b3ddc58'[0m >2018-08-21 15:52:51,141 INFO: [mNotice: /Stage[main]/Keystone::Cron::Token_flush/Cron[keystone-manage token_flush]/ensure: created[0m >2018-08-21 15:52:51,268 INFO: [mNotice: /Stage[main]/Main/Keystone_config[ec2/driver]/ensure: created[0m >2018-08-21 15:52:51,344 INFO: [mNotice: /Stage[main]/Apache::Mod::Proxy/File[proxy.conf]/ensure: defined content as '{md5}9eab682d8c4c89abd0ff20c1a60b908d'[0m >2018-08-21 15:52:51,432 INFO: [mNotice: /Stage[main]/Keystone::Logging/Oslo::Log[keystone_config]/Keystone_config[DEFAULT/debug]/ensure: created[0m >2018-08-21 15:52:51,635 INFO: [mNotice: /Stage[main]/Keystone::Logging/Oslo::Log[keystone_config]/Keystone_config[DEFAULT/log_dir]/ensure: created[0m >2018-08-21 15:52:52,962 INFO: [mNotice: /Stage[main]/Keystone::Policy/Oslo::Policy[keystone_config]/Keystone_config[oslo_policy/policy_file]/ensure: created[0m >2018-08-21 15:52:53,201 INFO: [mNotice: /Stage[main]/Keystone::Db/Oslo::Db[keystone_config]/Keystone_config[database/connection]/ensure: created[0m >2018-08-21 15:52:55,100 INFO: [mNotice: /Stage[main]/Keystone/Oslo::Middleware[keystone_config]/Keystone_config[oslo_middleware/enable_proxy_headers_parsing]/ensure: created[0m >2018-08-21 15:52:55,303 INFO: [mNotice: 
/Stage[main]/Keystone/Oslo::Messaging::Notifications[keystone_config]/Keystone_config[oslo_messaging_notifications/driver]/ensure: created[0m >2018-08-21 15:52:55,426 INFO: [mNotice: /Stage[main]/Keystone/Oslo::Messaging::Notifications[keystone_config]/Keystone_config[oslo_messaging_notifications/topics]/ensure: created[0m >2018-08-21 15:52:55,867 INFO: [mNotice: /Stage[main]/Keystone/Oslo::Messaging::Rabbit[keystone_config]/Keystone_config[oslo_messaging_rabbit/rabbit_password]/ensure: created[0m >2018-08-21 15:52:56,109 INFO: [mNotice: /Stage[main]/Keystone/Oslo::Messaging::Rabbit[keystone_config]/Keystone_config[oslo_messaging_rabbit/rabbit_userid]/ensure: created[0m >2018-08-21 15:52:56,352 INFO: [mNotice: /Stage[main]/Keystone/Oslo::Messaging::Rabbit[keystone_config]/Keystone_config[oslo_messaging_rabbit/rabbit_host]/ensure: created[0m >2018-08-21 15:52:57,285 INFO: [mNotice: /Stage[main]/Apache/Concat[/etc/httpd/conf/ports.conf]/File[/etc/httpd/conf/ports.conf]/ensure: defined content as '{md5}8777e0ec97e199d942ceeda4ba2b92e1'[0m >2018-08-21 15:52:57,352 INFO: [mNotice: /Stage[main]/Apache/File[/etc/httpd/conf/httpd.conf]/content: content changed '{md5}f5e7449c0f17bc856e86011cb5d152ba' to '{md5}c1931d983754295ee6d6c0be3959a525'[0m >2018-08-21 15:52:57,412 INFO: [mNotice: /Stage[main]/Apache::Default_mods/Apache::Mod[log_config]/File[log_config.load]/ensure: defined content as '{md5}785d35cb285e190d589163b45263ca89'[0m >2018-08-21 15:52:57,470 INFO: [mNotice: /Stage[main]/Apache::Default_mods/Apache::Mod[systemd]/File[systemd.load]/ensure: defined content as '{md5}26e5d44aae258b3e9d821cbbbd3e2826'[0m >2018-08-21 15:52:57,529 INFO: [mNotice: /Stage[main]/Apache::Default_mods/Apache::Mod[unixd]/File[unixd.load]/ensure: defined content as '{md5}0e8468ecc1265f8947b8725f4d1be9c0'[0m >2018-08-21 15:52:57,588 INFO: [mNotice: /Stage[main]/Apache::Default_mods/Apache::Mod[authz_host]/File[authz_host.load]/ensure: defined content as 
'{md5}d1045f54d2798499ca0f030ca0eef920'[0m >2018-08-21 15:52:57,647 INFO: [mNotice: /Stage[main]/Apache::Mod::Actions/Apache::Mod[actions]/File[actions.load]/ensure: defined content as '{md5}599866dfaf734f60f7e2d41ee8235515'[0m >2018-08-21 15:52:57,706 INFO: [mNotice: /Stage[main]/Apache::Mod::Authn_core/Apache::Mod[authn_core]/File[authn_core.load]/ensure: defined content as '{md5}704d6e8b02b0eca0eba4083960d16c52'[0m >2018-08-21 15:52:57,764 INFO: [mNotice: /Stage[main]/Apache::Mod::Cache/Apache::Mod[cache]/File[cache.load]/ensure: defined content as '{md5}01e4d392225b518a65b0f7d6c4e21d29'[0m >2018-08-21 15:52:57,823 INFO: [mNotice: /Stage[main]/Apache::Mod::Ext_filter/Apache::Mod[ext_filter]/File[ext_filter.load]/ensure: defined content as '{md5}76d5e0ac3411a4be57ac33ebe2e52ac8'[0m >2018-08-21 15:52:57,882 INFO: [mNotice: /Stage[main]/Apache::Mod::Mime/Apache::Mod[mime]/File[mime.load]/ensure: defined content as '{md5}e36257b9efab01459141d423cae57c7c'[0m >2018-08-21 15:52:57,941 INFO: [mNotice: /Stage[main]/Apache::Mod::Mime_magic/Apache::Mod[mime_magic]/File[mime_magic.load]/ensure: defined content as '{md5}cb8670bb2fb352aac7ebf3a85d52094c'[0m >2018-08-21 15:52:58,000 INFO: [mNotice: /Stage[main]/Apache::Mod::Rewrite/Apache::Mod[rewrite]/File[rewrite.load]/ensure: defined content as '{md5}26e2683352fc1599f29573ff0d934e79'[0m >2018-08-21 15:52:58,084 INFO: [mNotice: /Stage[main]/Apache::Mod::Speling/Apache::Mod[speling]/File[speling.load]/ensure: defined content as '{md5}f82e9e6b871a276c324c9eeffcec8a61'[0m >2018-08-21 15:52:58,143 INFO: [mNotice: /Stage[main]/Apache::Mod::Suexec/Apache::Mod[suexec]/File[suexec.load]/ensure: defined content as '{md5}c7d5c61c534ba423a79b0ae78ff9be35'[0m >2018-08-21 15:52:58,194 INFO: [mNotice: /Stage[main]/Apache::Mod::Version/Apache::Mod[version]/File[version.load]/ensure: defined content as '{md5}1c9243de22ace4dc8266442c48ae0c92'[0m >2018-08-21 15:52:58,253 INFO: [mNotice: 
/Stage[main]/Apache::Mod::Vhost_alias/Apache::Mod[vhost_alias]/File[vhost_alias.load]/ensure: defined content as '{md5}eca907865997d50d5130497665c3f82e'[0m >2018-08-21 15:52:58,312 INFO: [mNotice: /Stage[main]/Apache::Default_mods/Apache::Mod[auth_digest]/File[auth_digest.load]/ensure: defined content as '{md5}df9e85f8da0b239fe8e698ae7ead4f60'[0m >2018-08-21 15:52:58,370 INFO: [mNotice: /Stage[main]/Apache::Default_mods/Apache::Mod[authn_anon]/File[authn_anon.load]/ensure: defined content as '{md5}bf57b94b5aec35476fc2a2dc3861f132'[0m >2018-08-21 15:52:58,429 INFO: [mNotice: /Stage[main]/Apache::Default_mods/Apache::Mod[authn_dbm]/File[authn_dbm.load]/ensure: defined content as '{md5}90ee8f8ef1a017cacadfda4225e10651'[0m >2018-08-21 15:52:58,488 INFO: [mNotice: /Stage[main]/Apache::Default_mods/Apache::Mod[authz_dbm]/File[authz_dbm.load]/ensure: defined content as '{md5}c1363277984d22f99b70f7dce8753b60'[0m >2018-08-21 15:52:58,547 INFO: [mNotice: /Stage[main]/Apache::Default_mods/Apache::Mod[authz_owner]/File[authz_owner.load]/ensure: defined content as '{md5}f30a9be1016df87f195449d9e02d1857'[0m >2018-08-21 15:52:58,605 INFO: [mNotice: /Stage[main]/Apache::Default_mods/Apache::Mod[expires]/File[expires.load]/ensure: defined content as '{md5}f0825bad1e470de86ffabeb86dcc5d95'[0m >2018-08-21 15:52:58,664 INFO: [mNotice: /Stage[main]/Apache::Default_mods/Apache::Mod[include]/File[include.load]/ensure: defined content as '{md5}88095a914eedc3c2c184dd5d74c3954c'[0m >2018-08-21 15:52:58,723 INFO: [mNotice: /Stage[main]/Apache::Default_mods/Apache::Mod[logio]/File[logio.load]/ensure: defined content as '{md5}084533c7a44e9129d0e6df952e2472b6'[0m >2018-08-21 15:52:58,774 INFO: [mNotice: /Stage[main]/Apache::Default_mods/Apache::Mod[substitute]/File[substitute.load]/ensure: defined content as '{md5}8077c34a71afcf41c8fc644830935915'[0m >2018-08-21 15:52:58,824 INFO: [mNotice: /Stage[main]/Apache::Default_mods/Apache::Mod[usertrack]/File[usertrack.load]/ensure: defined content as 
'{md5}e95fbbf030fabec98b948f8dc217775c'[0m >2018-08-21 15:52:58,874 INFO: [mNotice: /Stage[main]/Apache::Mod::Alias/Apache::Mod[alias]/File[alias.load]/ensure: defined content as '{md5}3cf2fa309ccae4c29a4b875d0894cd79'[0m >2018-08-21 15:52:59,606 INFO: [mNotice: /Stage[main]/Apache::Mod::Authn_file/Apache::Mod[authn_file]/File[authn_file.load]/ensure: defined content as '{md5}d41656680003d7b890267bb73621c60b'[0m >2018-08-21 15:52:59,667 INFO: [mNotice: /Stage[main]/Apache::Mod::Autoindex/Apache::Mod[autoindex]/File[autoindex.load]/ensure: defined content as '{md5}515cdf5b573e961a60d2931d39248648'[0m >2018-08-21 15:52:59,726 INFO: [mNotice: /Stage[main]/Apache::Mod::Dav/Apache::Mod[dav]/File[dav.load]/ensure: defined content as '{md5}588e496251838c4840c14b28b5aa7881'[0m >2018-08-21 15:52:59,792 INFO: [mNotice: /Stage[main]/Apache::Mod::Dav_fs/File[dav_fs.conf]/ensure: defined content as '{md5}899a57534f3d84efa81887ec93c90c9b'[0m >2018-08-21 15:52:59,852 INFO: [mNotice: /Stage[main]/Apache::Mod::Dav_fs/Apache::Mod[dav_fs]/File[dav_fs.load]/ensure: defined content as '{md5}2996277c73b1cd684a9a3111c355e0d3'[0m >2018-08-21 15:52:59,911 INFO: [mNotice: /Stage[main]/Apache::Mod::Deflate/Apache::Mod[deflate]/File[deflate.load]/ensure: defined content as '{md5}2d1a1afcae0c70557251829a8586eeaf'[0m >2018-08-21 15:52:59,970 INFO: [mNotice: /Stage[main]/Apache::Mod::Dir/Apache::Mod[dir]/File[dir.load]/ensure: defined content as '{md5}1bfb1c2a46d7351fc9eb47c659dee068'[0m >2018-08-21 15:53:00,029 INFO: [mNotice: /Stage[main]/Apache::Mod::Negotiation/Apache::Mod[negotiation]/File[negotiation.load]/ensure: defined content as '{md5}d262ee6a5f20d9dd7f87770638dc2ccd'[0m >2018-08-21 15:53:00,088 INFO: [mNotice: /Stage[main]/Apache::Mod::Setenvif/Apache::Mod[setenvif]/File[setenvif.load]/ensure: defined content as '{md5}ec6c99f7cc8e35bdbcf8028f652c9f6d'[0m >2018-08-21 15:53:00,147 INFO: [mNotice: /Stage[main]/Apache::Default_mods/Apache::Mod[auth_basic]/File[auth_basic.load]/ensure: 
defined content as '{md5}494bcf4b843f7908675d663d8dc1bdc8'[0m >2018-08-21 15:53:00,205 INFO: [mNotice: /Stage[main]/Apache::Mod::Filter/Apache::Mod[filter]/File[filter.load]/ensure: defined content as '{md5}66a1e2064a140c3e7dca7ac33877700e'[0m >2018-08-21 15:53:00,264 INFO: [mNotice: /Stage[main]/Apache::Default_mods/Apache::Mod[authz_core]/File[authz_core.load]/ensure: defined content as '{md5}39942569bff2abdb259f9a347c7246bc'[0m >2018-08-21 15:53:00,323 INFO: [mNotice: /Stage[main]/Apache::Default_mods/Apache::Mod[access_compat]/File[access_compat.load]/ensure: defined content as '{md5}d5feb88bec4570e2dbc41cce7e0de003'[0m >2018-08-21 15:53:00,382 INFO: [mNotice: /Stage[main]/Apache::Mod::Authz_user/Apache::Mod[authz_user]/File[authz_user.load]/ensure: defined content as '{md5}63594303ee808423679b1ea13dd5a784'[0m >2018-08-21 15:53:00,441 INFO: [mNotice: /Stage[main]/Apache::Default_mods/Apache::Mod[authz_groupfile]/File[authz_groupfile.load]/ensure: defined content as '{md5}ae005a36b3ac8c20af36c434561c8a75'[0m >2018-08-21 15:53:00,516 INFO: [mNotice: /Stage[main]/Apache::Mod::Env/Apache::Mod[env]/File[env.load]/ensure: defined content as '{md5}d74184d40d0ee24ba02626a188ee7e1a'[0m >2018-08-21 15:53:00,575 INFO: [mNotice: /Stage[main]/Apache::Mod::Prefork/Apache::Mpm[prefork]/File[/etc/httpd/conf.modules.d/prefork.load]/ensure: defined content as '{md5}157529aafcf03fa491bc924103e4608e'[0m >2018-08-21 15:53:00,642 INFO: [mNotice: /Stage[main]/Apache::Mod::Cgi/Apache::Mod[cgi]/File[cgi.load]/ensure: defined content as '{md5}ac20c5c5779b37ab06b480d6485a0881'[0m >2018-08-21 15:53:00,943 INFO: [mNotice: /Stage[main]/Apache/File[/etc/httpd/conf.d/README]/ensure: removed[0m >2018-08-21 15:53:00,949 INFO: [mNotice: /Stage[main]/Apache/File[/etc/httpd/conf.d/autoindex.conf]/ensure: removed[0m >2018-08-21 15:53:00,955 INFO: [mNotice: /Stage[main]/Apache/File[/etc/httpd/conf.d/userdir.conf]/ensure: removed[0m >2018-08-21 15:53:00,961 INFO: [mNotice: 
/Stage[main]/Apache/File[/etc/httpd/conf.d/welcome.conf]/ensure: removed[0m >2018-08-21 15:53:01,054 INFO: [mNotice: /Stage[main]/Apache::Mod::Wsgi/File[wsgi.conf]/ensure: defined content as '{md5}8b3feb3fc2563de439920bb2c52cbd11'[0m >2018-08-21 15:53:01,129 INFO: [mNotice: /Stage[main]/Nova::Wsgi::Apache_placement/File[/etc/httpd/conf.d/00-nova-placement-api.conf]/content: content changed '{md5}611e31d39e1635bfabc0aafc51b43d0b' to '{md5}612d455490cfecc4b51db6656ea39240'[0m >2018-08-21 15:53:01,197 INFO: [mNotice: /Stage[main]/Tripleo::Ui/File[/etc/httpd/conf.d/openstack-tripleo-ui.conf]/content: content changed '{md5}0bb5ccf9a90544699ec07adf8028d99a' to '{md5}ec9dfa67b5507ef6f7a8bba6345bc07d'[0m >2018-08-21 15:53:01,256 INFO: [mNotice: /Stage[main]/Apache::Mod::Wsgi/Apache::Mod[wsgi]/File[wsgi.load]/ensure: defined content as '{md5}e1795e051e7aae1f865fde0d3b86a507'[0m >2018-08-21 15:53:02,022 INFO: [mNotice: /Stage[main]/Keystone::Cors/Oslo::Cors[keystone_config]/Keystone_config[cors/allowed_origin]/ensure: created[0m >2018-08-21 15:53:02,238 INFO: [mNotice: /Stage[main]/Nova::Wsgi::Apache_api/Openstacklib::Wsgi::Apache[nova_api_wsgi]/File[/var/www/cgi-bin/nova]/ensure: created[0m >2018-08-21 15:53:02,307 INFO: [mNotice: /Stage[main]/Nova::Wsgi::Apache_api/Openstacklib::Wsgi::Apache[nova_api_wsgi]/File[nova_api_wsgi]/ensure: defined content as '{md5}8bcfb466d72544dd31a4f339243ed669'[0m >2018-08-21 15:53:02,374 INFO: [mNotice: /Stage[main]/Nova::Wsgi::Apache_placement/Openstacklib::Wsgi::Apache[placement_wsgi]/File[placement_wsgi]/ensure: defined content as '{md5}2c992c50344eb1765282cb9fb70126db'[0m >2018-08-21 15:53:18,004 INFO: [mNotice: /Stage[main]/Nova::Conductor/Nova::Generic_service[conductor]/Package[nova-conductor]/ensure: created[0m >2018-08-21 15:53:33,500 INFO: [mNotice: /Stage[main]/Nova::Scheduler/Nova::Generic_service[scheduler]/Package[nova-scheduler]/ensure: created[0m >2018-08-21 15:56:48,628 INFO: [mNotice: 
/Stage[main]/Nova::Compute/Nova::Generic_service[compute]/Package[nova-compute]/ensure: created[0m >2018-08-21 15:56:48,632 INFO: [mNotice: /Stage[main]/Nova::Deps/Anchor[nova::install::end]: Triggered 'refresh' from 7 events[0m >2018-08-21 15:56:48,764 INFO: [mNotice: /Stage[main]/Nova::Db/Nova_config[api_database/connection]/ensure: created[0m >2018-08-21 15:56:48,803 INFO: [mNotice: /Stage[main]/Nova::Db/Nova_config[placement_database/connection]/ensure: created[0m >2018-08-21 15:56:48,841 INFO: [mNotice: /Stage[main]/Nova/Nova_config[glance/api_servers]/ensure: created[0m >2018-08-21 15:56:48,878 INFO: [mNotice: /Stage[main]/Nova/Nova_config[api/auth_strategy]/ensure: created[0m >2018-08-21 15:56:48,903 INFO: [mNotice: /Stage[main]/Nova/Nova_config[DEFAULT/image_service]/ensure: created[0m >2018-08-21 15:56:48,962 INFO: [mNotice: /Stage[main]/Nova/Nova_config[DEFAULT/ram_allocation_ratio]/ensure: created[0m >2018-08-21 15:56:49,099 INFO: [mNotice: /Stage[main]/Nova/Nova_config[notifications/notify_api_faults]/ensure: created[0m >2018-08-21 15:56:49,121 INFO: [mNotice: /Stage[main]/Nova/Nova_config[notifications/notification_format]/ensure: created[0m >2018-08-21 15:56:49,203 INFO: [mNotice: /Stage[main]/Nova/Nova_config[DEFAULT/state_path]/ensure: created[0m >2018-08-21 15:56:49,228 INFO: [mNotice: /Stage[main]/Nova/Nova_config[DEFAULT/service_down_time]/ensure: created[0m >2018-08-21 15:56:49,253 INFO: [mNotice: /Stage[main]/Nova/Nova_config[DEFAULT/rootwrap_config]/ensure: created[0m >2018-08-21 15:56:49,278 INFO: [mNotice: /Stage[main]/Nova/Nova_config[DEFAULT/report_interval]/ensure: created[0m >2018-08-21 15:56:49,334 INFO: [mNotice: /Stage[main]/Nova/Nova_config[notifications/notify_on_state_change]/ensure: created[0m >2018-08-21 15:56:50,323 INFO: [mNotice: /Stage[main]/Nova::Api/Nova_config[wsgi/api_paste_config]/ensure: created[0m >2018-08-21 15:56:50,351 INFO: [mNotice: /Stage[main]/Nova::Api/Nova_config[DEFAULT/enabled_apis]/ensure: created[0m 
>2018-08-21 15:56:50,379 INFO: [mNotice: /Stage[main]/Nova::Api/Nova_config[DEFAULT/osapi_compute_listen]/ensure: created[0m >2018-08-21 15:56:50,405 INFO: [mNotice: /Stage[main]/Nova::Api/Nova_config[DEFAULT/metadata_listen]/ensure: created[0m >2018-08-21 15:56:50,431 INFO: [mNotice: /Stage[main]/Nova::Api/Nova_config[DEFAULT/metadata_listen_port]/ensure: created[0m >2018-08-21 15:56:50,457 INFO: [mNotice: /Stage[main]/Nova::Api/Nova_config[DEFAULT/osapi_compute_listen_port]/ensure: created[0m >2018-08-21 15:56:50,483 INFO: [mNotice: /Stage[main]/Nova::Api/Nova_config[DEFAULT/osapi_volume_listen]/ensure: created[0m >2018-08-21 15:56:50,509 INFO: [mNotice: /Stage[main]/Nova::Api/Nova_config[DEFAULT/osapi_compute_workers]/ensure: created[0m >2018-08-21 15:56:50,535 INFO: [mNotice: /Stage[main]/Nova::Api/Nova_config[DEFAULT/metadata_workers]/ensure: created[0m >2018-08-21 15:56:50,609 INFO: [mNotice: /Stage[main]/Nova::Api/Nova_config[api/use_forwarded_for]/ensure: created[0m >2018-08-21 15:56:50,631 INFO: [mNotice: /Stage[main]/Nova::Api/Nova_config[api/fping_path]/ensure: created[0m >2018-08-21 15:56:50,936 INFO: [mNotice: /Stage[main]/Nova::Api/Nova_config[vendordata_dynamic_auth/project_domain_name]/ensure: created[0m >2018-08-21 15:56:50,976 INFO: [mNotice: /Stage[main]/Nova::Api/Nova_config[vendordata_dynamic_auth/user_domain_name]/ensure: created[0m >2018-08-21 15:56:51,791 INFO: [mNotice: /Stage[main]/Nova::Api/Nova_config[neutron/service_metadata_proxy]/ensure: created[0m >2018-08-21 15:56:51,882 INFO: [mNotice: /Stage[main]/Nova::Api/Nova_config[DEFAULT/allow_resize_to_same_host]/ensure: created[0m >2018-08-21 15:56:51,903 INFO: [mNotice: /Stage[main]/Nova::Placement/Nova_config[placement/auth_type]/ensure: created[0m >2018-08-21 15:56:51,925 INFO: [mNotice: /Stage[main]/Nova::Placement/Nova_config[placement/auth_url]/ensure: created[0m >2018-08-21 15:56:51,947 INFO: [mNotice: /Stage[main]/Nova::Placement/Nova_config[placement/password]/ensure: created[0m 
>2018-08-21 15:56:51,968 INFO: [mNotice: /Stage[main]/Nova::Placement/Nova_config[placement/project_domain_name]/ensure: created[0m >2018-08-21 15:56:51,990 INFO: [mNotice: /Stage[main]/Nova::Placement/Nova_config[placement/project_name]/ensure: created[0m >2018-08-21 15:56:52,012 INFO: [mNotice: /Stage[main]/Nova::Placement/Nova_config[placement/user_domain_name]/ensure: created[0m >2018-08-21 15:56:52,035 INFO: [mNotice: /Stage[main]/Nova::Placement/Nova_config[placement/username]/ensure: created[0m >2018-08-21 15:56:52,058 INFO: [mNotice: /Stage[main]/Nova::Placement/Nova_config[placement/os_region_name]/ensure: created[0m >2018-08-21 15:56:52,099 INFO: [mNotice: /Stage[main]/Nova::Conductor/Nova_config[conductor/workers]/ensure: created[0m >2018-08-21 15:56:52,138 INFO: [mNotice: /Stage[main]/Nova::Scheduler/Nova_config[scheduler/driver]/ensure: created[0m >2018-08-21 15:56:52,178 INFO: [mNotice: /Stage[main]/Nova::Scheduler::Filter/Nova_config[scheduler/host_manager]/ensure: created[0m >2018-08-21 15:56:52,201 INFO: [mNotice: /Stage[main]/Nova::Scheduler::Filter/Nova_config[scheduler/max_attempts]/ensure: created[0m >2018-08-21 15:56:52,241 INFO: [mNotice: /Stage[main]/Nova::Scheduler::Filter/Nova_config[filter_scheduler/host_subset_size]/ensure: created[0m >2018-08-21 15:56:52,264 INFO: [mNotice: /Stage[main]/Nova::Scheduler::Filter/Nova_config[filter_scheduler/max_io_ops_per_host]/ensure: created[0m >2018-08-21 15:56:52,286 INFO: [mNotice: /Stage[main]/Nova::Scheduler::Filter/Nova_config[filter_scheduler/max_instances_per_host]/ensure: created[0m >2018-08-21 15:56:52,327 INFO: [mNotice: /Stage[main]/Nova::Scheduler::Filter/Nova_config[filter_scheduler/available_filters]/ensure: created[0m >2018-08-21 15:56:52,349 INFO: [mNotice: /Stage[main]/Nova::Scheduler::Filter/Nova_config[filter_scheduler/weight_classes]/ensure: created[0m >2018-08-21 15:56:52,372 INFO: [mNotice: 
/Stage[main]/Nova::Scheduler::Filter/Nova_config[filter_scheduler/use_baremetal_filters]/ensure: created[0m >2018-08-21 15:56:52,396 INFO: [mNotice: /Stage[main]/Nova::Scheduler::Filter/Nova_config[filter_scheduler/enabled_filters]/ensure: created[0m >2018-08-21 15:56:53,358 INFO: [mNotice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/reserved_host_memory_mb]/ensure: created[0m >2018-08-21 15:56:53,403 INFO: [mNotice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/heal_instance_info_cache_interval]/ensure: created[0m >2018-08-21 15:56:53,479 INFO: [mNotice: /Stage[main]/Nova::Compute/Nova_config[key_manager/backend]/ensure: created[0m >2018-08-21 15:56:53,572 INFO: [mNotice: /Stage[main]/Nova::Compute/Nova_config[compute/consecutive_build_service_disable_threshold]/ensure: created[0m >2018-08-21 15:56:53,630 INFO: [mNotice: /Stage[main]/Nova::Compute/Nova_config[vnc/enabled]/ensure: created[0m >2018-08-21 15:56:53,656 INFO: [mNotice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/force_config_drive]/ensure: created[0m >2018-08-21 15:56:53,683 INFO: [mNotice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/instance_usage_audit]/ensure: created[0m >2018-08-21 15:56:53,712 INFO: [mNotice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/instance_usage_audit_period]/ensure: created[0m >2018-08-21 15:56:53,739 INFO: [mNotice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/force_raw_images]/ensure: created[0m >2018-08-21 15:56:53,802 INFO: [mNotice: /Stage[main]/Main/Nova_config[DEFAULT/sync_power_state_interval]/ensure: created[0m >2018-08-21 15:56:53,825 INFO: [mNotice: /Stage[main]/Nova::Ironic::Common/Nova_config[ironic/auth_plugin]/ensure: created[0m >2018-08-21 15:56:53,848 INFO: [mNotice: /Stage[main]/Nova::Ironic::Common/Nova_config[ironic/username]/ensure: created[0m >2018-08-21 15:56:53,872 INFO: [mNotice: /Stage[main]/Nova::Ironic::Common/Nova_config[ironic/password]/ensure: created[0m >2018-08-21 15:56:53,895 INFO: [mNotice: 
/Stage[main]/Nova::Ironic::Common/Nova_config[ironic/auth_url]/ensure: created[0m >2018-08-21 15:56:53,919 INFO: [mNotice: /Stage[main]/Nova::Ironic::Common/Nova_config[ironic/project_name]/ensure: created[0m >2018-08-21 15:56:53,942 INFO: [mNotice: /Stage[main]/Nova::Ironic::Common/Nova_config[ironic/api_endpoint]/ensure: created[0m >2018-08-21 15:56:54,003 INFO: [mNotice: /Stage[main]/Nova::Ironic::Common/Nova_config[ironic/user_domain_name]/ensure: created[0m >2018-08-21 15:56:54,027 INFO: [mNotice: /Stage[main]/Nova::Ironic::Common/Nova_config[ironic/project_domain_name]/ensure: created[0m >2018-08-21 15:56:54,058 INFO: [mNotice: /Stage[main]/Nova::Compute::Ironic/Nova_config[DEFAULT/compute_driver]/ensure: created[0m >2018-08-21 15:56:54,107 INFO: [mNotice: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/dhcp_domain]/ensure: created[0m >2018-08-21 15:56:54,136 INFO: [mNotice: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/firewall_driver]/ensure: created[0m >2018-08-21 15:56:54,164 INFO: [mNotice: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/vif_plugging_is_fatal]/ensure: created[0m >2018-08-21 15:56:54,874 INFO: [mNotice: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/vif_plugging_timeout]/ensure: created[0m >2018-08-21 15:56:54,898 INFO: [mNotice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/default_floating_pool]/ensure: created[0m >2018-08-21 15:56:54,921 INFO: [mNotice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/url]/ensure: created[0m >2018-08-21 15:56:54,944 INFO: [mNotice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/timeout]/ensure: created[0m >2018-08-21 15:56:54,967 INFO: [mNotice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/project_name]/ensure: created[0m >2018-08-21 15:56:54,991 INFO: [mNotice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/project_domain_name]/ensure: created[0m >2018-08-21 15:56:55,014 INFO: [mNotice: 
/Stage[main]/Nova::Network::Neutron/Nova_config[neutron/region_name]/ensure: created[0m >2018-08-21 15:56:55,037 INFO: [mNotice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/username]/ensure: created[0m >2018-08-21 15:56:55,061 INFO: [mNotice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/user_domain_name]/ensure: created[0m >2018-08-21 15:56:55,084 INFO: [mNotice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/password]/ensure: created[0m >2018-08-21 15:56:55,107 INFO: [mNotice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/auth_url]/ensure: created[0m >2018-08-21 15:56:55,131 INFO: [mNotice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/ovs_bridge]/ensure: created[0m >2018-08-21 15:56:55,155 INFO: [mNotice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/extension_sync_interval]/ensure: created[0m >2018-08-21 15:56:55,178 INFO: [mNotice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/auth_type]/ensure: created[0m >2018-08-21 15:56:58,282 INFO: [mNotice: /Stage[main]/Main/Augeas[lvm.conf]/returns: executed successfully[0m >2018-08-21 15:56:58,344 INFO: [mNotice: /Stage[main]/Nova::Db/Oslo::Db[nova_config]/Nova_config[database/connection]/ensure: created[0m >2018-08-21 15:56:58,689 INFO: [mNotice: /Stage[main]/Nova::Logging/Oslo::Log[nova_config]/Nova_config[DEFAULT/debug]/ensure: created[0m >2018-08-21 15:56:58,773 INFO: [mNotice: /Stage[main]/Nova::Logging/Oslo::Log[nova_config]/Nova_config[DEFAULT/log_dir]/ensure: created[0m >2018-08-21 15:57:01,510 INFO: [mNotice: /Stage[main]/Nova/Oslo::Messaging::Default[nova_config]/Nova_config[DEFAULT/rpc_response_timeout]/ensure: created[0m >2018-08-21 15:57:01,538 INFO: [mNotice: /Stage[main]/Nova/Oslo::Messaging::Default[nova_config]/Nova_config[DEFAULT/transport_url]/ensure: created[0m >2018-08-21 15:57:01,583 INFO: [mNotice: /Stage[main]/Nova/Oslo::Messaging::Notifications[nova_config]/Nova_config[oslo_messaging_notifications/driver]/ensure: created[0m 
>2018-08-21 15:57:01,665 INFO: Notice: /Stage[main]/Nova/Oslo::Concurrency[nova_config]/Nova_config[oslo_concurrency/lock_path]/ensure: created
>2018-08-21 15:57:01,691 INFO: Notice: /Stage[main]/Nova::Policy/Oslo::Policy[nova_config]/Nova_config[oslo_policy/policy_file]/ensure: created
>2018-08-21 15:57:01,775 INFO: Notice: /Stage[main]/Nova::Keystone::Authtoken/Keystone::Resource::Authtoken[nova_config]/Nova_config[keystone_authtoken/auth_uri]/ensure: created
>2018-08-21 15:57:02,522 INFO: Notice: /Stage[main]/Nova::Keystone::Authtoken/Keystone::Resource::Authtoken[nova_config]/Nova_config[keystone_authtoken/auth_type]/ensure: created
>2018-08-21 15:57:02,991 INFO: Notice: /Stage[main]/Nova::Keystone::Authtoken/Keystone::Resource::Authtoken[nova_config]/Nova_config[keystone_authtoken/auth_url]/ensure: created
>2018-08-21 15:57:03,017 INFO: Notice: /Stage[main]/Nova::Keystone::Authtoken/Keystone::Resource::Authtoken[nova_config]/Nova_config[keystone_authtoken/username]/ensure: created
>2018-08-21 15:57:03,043 INFO: Notice: /Stage[main]/Nova::Keystone::Authtoken/Keystone::Resource::Authtoken[nova_config]/Nova_config[keystone_authtoken/password]/ensure: created
>2018-08-21 15:57:03,069 INFO: Notice: /Stage[main]/Nova::Keystone::Authtoken/Keystone::Resource::Authtoken[nova_config]/Nova_config[keystone_authtoken/user_domain_name]/ensure: created
>2018-08-21 15:57:03,096 INFO: Notice: /Stage[main]/Nova::Keystone::Authtoken/Keystone::Resource::Authtoken[nova_config]/Nova_config[keystone_authtoken/project_name]/ensure: created
>2018-08-21 15:57:03,123 INFO: Notice: /Stage[main]/Nova::Keystone::Authtoken/Keystone::Resource::Authtoken[nova_config]/Nova_config[keystone_authtoken/project_domain_name]/ensure: created
>2018-08-21 15:57:03,188 INFO: Notice: /Stage[main]/Nova::Api/Oslo::Middleware[nova_config]/Nova_config[oslo_middleware/enable_proxy_headers_parsing]/ensure: created
>2018-08-21 15:57:03,223 INFO: Notice: /Stage[main]/Neutron::Logging/Oslo::Log[neutron_config]/Neutron_config[DEFAULT/debug]/ensure: created
>2018-08-21 15:57:03,326 INFO: Notice: /Stage[main]/Neutron::Logging/Oslo::Log[neutron_config]/Neutron_config[DEFAULT/log_dir]/ensure: created
>2018-08-21 15:57:04,482 INFO: Notice: /Stage[main]/Neutron/Oslo::Messaging::Default[neutron_config]/Neutron_config[DEFAULT/control_exchange]/ensure: created
>2018-08-21 15:57:04,534 INFO: Notice: /Stage[main]/Neutron/Oslo::Concurrency[neutron_config]/Neutron_config[oslo_concurrency/lock_path]/ensure: created
>2018-08-21 15:57:04,834 INFO: Notice: /Stage[main]/Neutron/Oslo::Messaging::Rabbit[neutron_config]/Neutron_config[oslo_messaging_rabbit/rabbit_password]/ensure: created
>2018-08-21 15:57:04,953 INFO: Notice: /Stage[main]/Neutron/Oslo::Messaging::Rabbit[neutron_config]/Neutron_config[oslo_messaging_rabbit/rabbit_userid]/ensure: created
>2018-08-21 15:57:05,724 INFO: Notice: /Stage[main]/Neutron/Oslo::Messaging::Rabbit[neutron_config]/Neutron_config[oslo_messaging_rabbit/rabbit_hosts]/ensure: created
>2018-08-21 15:57:07,362 INFO: Notice: /Stage[main]/Neutron::Db/Oslo::Db[neutron_config]/Neutron_config[database/connection]/ensure: created
>2018-08-21 15:57:07,775 INFO: Notice: /Stage[main]/Neutron::Policy/Oslo::Policy[neutron_config]/Neutron_config[oslo_policy/policy_file]/ensure: created
>2018-08-21 15:57:07,875 INFO: Notice: /Stage[main]/Neutron::Keystone::Authtoken/Keystone::Resource::Authtoken[neutron_config]/Neutron_config[keystone_authtoken/auth_uri]/ensure: created
>2018-08-21 15:57:07,906 INFO: Notice: /Stage[main]/Neutron::Keystone::Authtoken/Keystone::Resource::Authtoken[neutron_config]/Neutron_config[keystone_authtoken/auth_type]/ensure: created
>2018-08-21 15:57:09,200 INFO: Notice: /Stage[main]/Neutron::Keystone::Authtoken/Keystone::Resource::Authtoken[neutron_config]/Neutron_config[keystone_authtoken/auth_url]/ensure: created
>2018-08-21 15:57:09,230 INFO: Notice: /Stage[main]/Neutron::Keystone::Authtoken/Keystone::Resource::Authtoken[neutron_config]/Neutron_config[keystone_authtoken/username]/ensure: created
>2018-08-21 15:57:09,260 INFO: Notice: /Stage[main]/Neutron::Keystone::Authtoken/Keystone::Resource::Authtoken[neutron_config]/Neutron_config[keystone_authtoken/password]/ensure: created
>2018-08-21 15:57:09,290 INFO: Notice: /Stage[main]/Neutron::Keystone::Authtoken/Keystone::Resource::Authtoken[neutron_config]/Neutron_config[keystone_authtoken/user_domain_name]/ensure: created
>2018-08-21 15:57:09,321 INFO: Notice: /Stage[main]/Neutron::Keystone::Authtoken/Keystone::Resource::Authtoken[neutron_config]/Neutron_config[keystone_authtoken/project_name]/ensure: created
>2018-08-21 15:57:09,352 INFO: Notice: /Stage[main]/Neutron::Keystone::Authtoken/Keystone::Resource::Authtoken[neutron_config]/Neutron_config[keystone_authtoken/project_domain_name]/ensure: created
>2018-08-21 15:57:09,435 INFO: Notice: /Stage[main]/Neutron::Plugins::Ml2/Neutron::Plugins::Ml2::Type_driver[flat]/Neutron_plugin_ml2[ml2_type_flat/flat_networks]/ensure: created
>2018-08-21 15:57:09,448 INFO: Notice: /Stage[main]/Neutron::Plugins::Ml2/Neutron::Plugins::Ml2::Type_driver[vlan]/Neutron_plugin_ml2[ml2_type_vlan/network_vlan_ranges]/ensure: created
>2018-08-21 15:57:09,461 INFO: Notice: /Stage[main]/Neutron::Plugins::Ml2/Neutron::Plugins::Ml2::Type_driver[gre]/Neutron_plugin_ml2[ml2_type_gre/tunnel_id_ranges]/ensure: created
>2018-08-21 15:57:09,473 INFO: Notice: /Stage[main]/Neutron::Plugins::Ml2/Neutron::Plugins::Ml2::Type_driver[vxlan]/Neutron_plugin_ml2[ml2_type_vxlan/vxlan_group]/ensure: created
>2018-08-21 15:57:09,485 INFO: Notice: /Stage[main]/Neutron::Plugins::Ml2/Neutron::Plugins::Ml2::Type_driver[vxlan]/Neutron_plugin_ml2[ml2_type_vxlan/vni_ranges]/ensure: created
>2018-08-21 15:57:09,503 INFO: Notice: /Stage[main]/Neutron::Plugins::Ml2/Neutron::Plugins::Ml2::Type_driver[geneve]/Neutron_plugin_ml2[ml2_type_geneve/vni_ranges]/ensure: created
>2018-08-21 15:57:09,508 INFO: Notice: /Stage[main]/Neutron::Deps/Anchor[neutron::config::end]: Triggered 'refresh' from 83 events
>2018-08-21 15:57:09,599 INFO: Notice: /Stage[main]/Neutron::Db::Mysql/Openstacklib::Db::Mysql[neutron]/Mysql_database[neutron]/ensure: created
>2018-08-21 15:57:09,798 INFO: Notice: /Stage[main]/Rsync::Server/Xinetd::Service[rsync]/File[/etc/xinetd.d/rsync]/ensure: defined content as '{md5}8d1b1666df57adf0304ef209c09c8fd2'
>2018-08-21 15:57:09,908 INFO: Notice: /Stage[main]/Rsync::Server/Concat[/etc/rsyncd.conf]/File[/etc/rsyncd.conf]/content: content changed '{md5}c63fccb45c0dcbbbe17d0f4bdba920ec' to '{md5}b32edd535ec3868257275ed4f9dd88a7'
>2018-08-21 15:57:10,001 INFO: Notice: /Stage[main]/Heat::Logging/Oslo::Log[heat_config]/Heat_config[DEFAULT/debug]/ensure: created
>2018-08-21 15:57:10,167 INFO: Notice: /Stage[main]/Heat::Logging/Oslo::Log[heat_config]/Heat_config[DEFAULT/log_dir]/ensure: created
>2018-08-21 15:57:11,576 INFO: Notice: /Stage[main]/Heat::Db/Oslo::Db[heat_config]/Heat_config[database/connection]/ensure: created
>2018-08-21 15:57:12,208 INFO: Notice: /Stage[main]/Heat::Keystone::Authtoken/Keystone::Resource::Authtoken[heat_config]/Heat_config[keystone_authtoken/auth_uri]/ensure: created
>2018-08-21 15:57:12,273 INFO: Notice: /Stage[main]/Heat::Keystone::Authtoken/Keystone::Resource::Authtoken[heat_config]/Heat_config[keystone_authtoken/auth_type]/ensure: created
>2018-08-21 15:57:13,850 INFO: Notice: /Stage[main]/Heat::Keystone::Authtoken/Keystone::Resource::Authtoken[heat_config]/Heat_config[keystone_authtoken/auth_url]/ensure: created
>2018-08-21 15:57:13,916 INFO: Notice: /Stage[main]/Heat::Keystone::Authtoken/Keystone::Resource::Authtoken[heat_config]/Heat_config[keystone_authtoken/username]/ensure: created
>2018-08-21 15:57:13,981 INFO: Notice: /Stage[main]/Heat::Keystone::Authtoken/Keystone::Resource::Authtoken[heat_config]/Heat_config[keystone_authtoken/password]/ensure: created
>2018-08-21 15:57:14,047 INFO: Notice: /Stage[main]/Heat::Keystone::Authtoken/Keystone::Resource::Authtoken[heat_config]/Heat_config[keystone_authtoken/user_domain_name]/ensure: created
>2018-08-21 15:57:14,114 INFO: Notice: /Stage[main]/Heat::Keystone::Authtoken/Keystone::Resource::Authtoken[heat_config]/Heat_config[keystone_authtoken/project_name]/ensure: created
>2018-08-21 15:57:14,180 INFO: Notice: /Stage[main]/Heat::Keystone::Authtoken/Keystone::Resource::Authtoken[heat_config]/Heat_config[keystone_authtoken/project_domain_name]/ensure: created
>2018-08-21 15:57:16,788 INFO: Notice: /Stage[main]/Heat/Oslo::Messaging::Notifications[heat_config]/Heat_config[oslo_messaging_notifications/driver]/ensure: created
>2018-08-21 15:57:17,740 INFO: Notice: /Stage[main]/Heat/Oslo::Messaging::Default[heat_config]/Heat_config[DEFAULT/rpc_response_timeout]/ensure: created
>2018-08-21 15:57:17,814 INFO: Notice: /Stage[main]/Heat/Oslo::Messaging::Default[heat_config]/Heat_config[DEFAULT/transport_url]/ensure: created
>2018-08-21 15:57:17,945 INFO: Notice: /Stage[main]/Heat/Oslo::Middleware[heat_config]/Heat_config[oslo_middleware/enable_proxy_headers_parsing]/ensure: created
>2018-08-21 15:57:18,014 INFO: Notice: /Stage[main]/Heat::Policy/Oslo::Policy[heat_config]/Heat_config[oslo_policy/policy_file]/ensure: created
>2018-08-21 15:57:18,150 INFO: Notice: /Stage[main]/Apache::Mod::Headers/Apache::Mod[headers]/File[headers.load]/ensure: defined content as '{md5}96094c96352002c43ada5bdf8650ff38'
>2018-08-21 15:57:18,220 INFO: Notice: /Stage[main]/Heat::Cors/Oslo::Cors[heat_config]/Heat_config[cors/allowed_origin]/ensure: created
>2018-08-21 15:57:18,319 INFO: Notice: /Stage[main]/Heat::Cors/Oslo::Cors[heat_config]/Heat_config[cors/expose_headers]/ensure: created
>2018-08-21 15:57:18,385 INFO: Notice: /Stage[main]/Heat::Cors/Oslo::Cors[heat_config]/Heat_config[cors/max_age]/ensure: created
>2018-08-21 15:57:18,484 INFO: Notice: /Stage[main]/Heat::Cors/Oslo::Cors[heat_config]/Heat_config[cors/allow_headers]/ensure: created
>2018-08-21 15:57:18,488 INFO: Notice: /Stage[main]/Heat::Deps/Anchor[heat::config::end]: Triggered 'refresh' from 49 events
>2018-08-21 15:57:18,567 INFO: Notice: /Stage[main]/Heat::Db::Mysql/Openstacklib::Db::Mysql[heat]/Mysql_database[heat]/ensure: created
>2018-08-21 15:57:18,595 INFO: Notice: /Stage[main]/Nova::Cors/Oslo::Cors[nova_config]/Nova_config[cors/allowed_origin]/ensure: created
>2018-08-21 15:57:18,641 INFO: Notice: /Stage[main]/Nova::Cors/Oslo::Cors[nova_config]/Nova_config[cors/expose_headers]/ensure: created
>2018-08-21 15:57:18,665 INFO: Notice: /Stage[main]/Nova::Cors/Oslo::Cors[nova_config]/Nova_config[cors/max_age]/ensure: created
>2018-08-21 15:57:18,690 INFO: Notice: /Stage[main]/Nova::Cors/Oslo::Cors[nova_config]/Nova_config[cors/allow_methods]/ensure: created
>2018-08-21 15:57:18,715 INFO: Notice: /Stage[main]/Nova::Cors/Oslo::Cors[nova_config]/Nova_config[cors/allow_headers]/ensure: created
>2018-08-21 15:57:18,720 INFO: Notice: /Stage[main]/Nova::Deps/Anchor[nova::config::end]: Triggered 'refresh' from 105 events
>2018-08-21 15:57:18,804 INFO: Notice: /Stage[main]/Nova::Db::Mysql/Openstacklib::Db::Mysql[nova]/Mysql_database[nova]/ensure: created
>2018-08-21 15:57:18,878 INFO: Notice: /Stage[main]/Nova::Db::Mysql/Openstacklib::Db::Mysql[nova_cell0]/Mysql_database[nova_cell0]/ensure: created
>2018-08-21 15:57:18,953 INFO: Notice: /Stage[main]/Nova::Db::Mysql_api/Openstacklib::Db::Mysql[nova_api]/Mysql_database[nova_api]/ensure: created
>2018-08-21 15:57:19,031 INFO: Notice: /Stage[main]/Nova::Db::Mysql_placement/Openstacklib::Db::Mysql[nova_placement]/Mysql_database[nova_placement]/ensure: created
>2018-08-21 15:57:19,150 INFO: Notice: /Stage[main]/Ironic::Logging/Oslo::Log[ironic_config]/Ironic_config[DEFAULT/debug]/ensure: created
>2018-08-21 15:57:19,422 INFO: Notice: /Stage[main]/Ironic::Logging/Oslo::Log[ironic_config]/Ironic_config[DEFAULT/log_dir]/ensure: created
>2018-08-21 15:57:21,319 INFO: Notice: /Stage[main]/Ironic::Db/Oslo::Db[ironic_config]/Ironic_config[database/connection]/ensure: created
>2018-08-21 15:57:23,159 INFO: Notice: /Stage[main]/Ironic/Oslo::Messaging::Default[ironic_config]/Ironic_config[DEFAULT/rpc_response_timeout]/ensure: created
>2018-08-21 15:57:23,278 INFO: Notice: /Stage[main]/Ironic/Oslo::Messaging::Default[ironic_config]/Ironic_config[DEFAULT/transport_url]/ensure: created
>2018-08-21 15:57:27,973 INFO: Notice: /Stage[main]/Ironic::Policy/Oslo::Policy[ironic_config]/Ironic_config[oslo_policy/policy_file]/ensure: created
>2018-08-21 15:57:28,242 INFO: Notice: /Stage[main]/Ironic::Api::Authtoken/Keystone::Resource::Authtoken[ironic_config]/Ironic_config[keystone_authtoken/auth_uri]/ensure: created
>2018-08-21 15:57:28,357 INFO: Notice: /Stage[main]/Ironic::Api::Authtoken/Keystone::Resource::Authtoken[ironic_config]/Ironic_config[keystone_authtoken/auth_type]/ensure: created
>2018-08-21 15:57:30,586 INFO: Notice: /Stage[main]/Ironic::Api::Authtoken/Keystone::Resource::Authtoken[ironic_config]/Ironic_config[keystone_authtoken/auth_url]/ensure: created
>2018-08-21 15:57:30,697 INFO: Notice: /Stage[main]/Ironic::Api::Authtoken/Keystone::Resource::Authtoken[ironic_config]/Ironic_config[keystone_authtoken/username]/ensure: created
>2018-08-21 15:57:30,808 INFO: Notice: /Stage[main]/Ironic::Api::Authtoken/Keystone::Resource::Authtoken[ironic_config]/Ironic_config[keystone_authtoken/password]/ensure: created
>2018-08-21 15:57:30,919 INFO: Notice: /Stage[main]/Ironic::Api::Authtoken/Keystone::Resource::Authtoken[ironic_config]/Ironic_config[keystone_authtoken/user_domain_name]/ensure: created
>2018-08-21 15:57:31,032 INFO: Notice: /Stage[main]/Ironic::Api::Authtoken/Keystone::Resource::Authtoken[ironic_config]/Ironic_config[keystone_authtoken/project_name]/ensure: created
>2018-08-21 15:57:31,145 INFO: Notice: /Stage[main]/Ironic::Api::Authtoken/Keystone::Resource::Authtoken[ironic_config]/Ironic_config[keystone_authtoken/project_domain_name]/ensure: created
>2018-08-21 15:57:31,319 INFO: Notice: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/File[/var/www/cgi-bin/ironic]/ensure: created
>2018-08-21 15:57:31,384 INFO: Notice: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/File[ironic_wsgi]/ensure: defined content as '{md5}1d56c8d9da9a51b60ed54ef55cb43c99'
>2018-08-21 15:57:32,416 INFO: Notice: /Stage[main]/Ironic::Drivers::Interfaces/Ironic::Drivers::Hardware_interface[boot]/Ironic_config[DEFAULT/enabled_boot_interfaces]/ensure: created
>2018-08-21 15:57:32,580 INFO: Notice: /Stage[main]/Ironic::Drivers::Interfaces/Ironic::Drivers::Hardware_interface[console]/Ironic_config[DEFAULT/enabled_console_interfaces]/ensure: created
>2018-08-21 15:57:32,743 INFO: Notice: /Stage[main]/Ironic::Drivers::Interfaces/Ironic::Drivers::Hardware_interface[deploy]/Ironic_config[DEFAULT/enabled_deploy_interfaces]/ensure: created
>2018-08-21 15:57:32,907 INFO: Notice: /Stage[main]/Ironic::Drivers::Interfaces/Ironic::Drivers::Hardware_interface[inspect]/Ironic_config[DEFAULT/enabled_inspect_interfaces]/ensure: created
>2018-08-21 15:57:33,019 INFO: Notice: /Stage[main]/Ironic::Drivers::Interfaces/Ironic::Drivers::Hardware_interface[inspect]/Ironic_config[DEFAULT/default_inspect_interface]/ensure: created
>2018-08-21 15:57:33,132 INFO: Notice: /Stage[main]/Ironic::Drivers::Interfaces/Ironic::Drivers::Hardware_interface[management]/Ironic_config[DEFAULT/enabled_management_interfaces]/ensure: created
>2018-08-21 15:57:33,405 INFO: Notice: /Stage[main]/Ironic::Drivers::Interfaces/Ironic::Drivers::Hardware_interface[power]/Ironic_config[DEFAULT/enabled_power_interfaces]/ensure: created
>2018-08-21 15:57:33,573 INFO: Notice: /Stage[main]/Ironic::Drivers::Interfaces/Ironic::Drivers::Hardware_interface[raid]/Ironic_config[DEFAULT/enabled_raid_interfaces]/ensure: created
>2018-08-21 15:57:33,956 INFO: Notice: /Stage[main]/Ironic::Drivers::Interfaces/Ironic::Drivers::Hardware_interface[vendor]/Ironic_config[DEFAULT/enabled_vendor_interfaces]/ensure: created
>2018-08-21 15:57:34,046 INFO: Notice: /Stage[main]/Ironic::Inspector::Logging/Oslo::Log[ironic_inspector_config]/Ironic_inspector_config[DEFAULT/debug]/ensure: created
>2018-08-21 15:57:34,133 INFO: Notice: /Stage[main]/Ironic::Inspector::Logging/Oslo::Log[ironic_inspector_config]/Ironic_inspector_config[DEFAULT/log_dir]/ensure: created
>2018-08-21 15:57:35,385 INFO: Notice: /Stage[main]/Ironic::Inspector::Db/Oslo::Db[ironic_inspector_config]/Ironic_inspector_config[database/connection]/ensure: created
>2018-08-21 15:57:35,695 INFO: Notice: /Stage[main]/Ironic::Inspector::Authtoken/Keystone::Resource::Authtoken[ironic_inspector_config]/Ironic_inspector_config[keystone_authtoken/auth_uri]/ensure: created
>2018-08-21 15:57:35,729 INFO: Notice: /Stage[main]/Ironic::Inspector::Authtoken/Keystone::Resource::Authtoken[ironic_inspector_config]/Ironic_inspector_config[keystone_authtoken/auth_type]/ensure: created
>2018-08-21 15:57:36,118 INFO: Notice: /Stage[main]/Ironic::Inspector::Authtoken/Keystone::Resource::Authtoken[ironic_inspector_config]/Ironic_inspector_config[keystone_authtoken/auth_url]/ensure: created
>2018-08-21 15:57:36,152 INFO: Notice: /Stage[main]/Ironic::Inspector::Authtoken/Keystone::Resource::Authtoken[ironic_inspector_config]/Ironic_inspector_config[keystone_authtoken/username]/ensure: created
>2018-08-21 15:57:36,186 INFO: Notice: /Stage[main]/Ironic::Inspector::Authtoken/Keystone::Resource::Authtoken[ironic_inspector_config]/Ironic_inspector_config[keystone_authtoken/password]/ensure: created
>2018-08-21 15:57:36,219 INFO: Notice: /Stage[main]/Ironic::Inspector::Authtoken/Keystone::Resource::Authtoken[ironic_inspector_config]/Ironic_inspector_config[keystone_authtoken/user_domain_name]/ensure: created
>2018-08-21 15:57:36,253 INFO: Notice: /Stage[main]/Ironic::Inspector::Authtoken/Keystone::Resource::Authtoken[ironic_inspector_config]/Ironic_inspector_config[keystone_authtoken/project_name]/ensure: created
>2018-08-21 15:57:36,287 INFO: Notice: /Stage[main]/Ironic::Inspector::Authtoken/Keystone::Resource::Authtoken[ironic_inspector_config]/Ironic_inspector_config[keystone_authtoken/project_domain_name]/ensure: created
>2018-08-21 15:57:36,338 INFO: Notice: /Stage[main]/Ironic::Inspector::Cors/Oslo::Cors[ironic_inspector_config]/Ironic_inspector_config[cors/allowed_origin]/ensure: created
>2018-08-21 15:57:36,387 INFO: Notice: /Stage[main]/Ironic::Inspector::Cors/Oslo::Cors[ironic_inspector_config]/Ironic_inspector_config[cors/expose_headers]/ensure: created
>2018-08-21 15:57:36,420 INFO: Notice: /Stage[main]/Ironic::Inspector::Cors/Oslo::Cors[ironic_inspector_config]/Ironic_inspector_config[cors/max_age]/ensure: created
>2018-08-21 15:57:37,389 INFO: Notice: /Stage[main]/Ironic::Inspector::Cors/Oslo::Cors[ironic_inspector_config]/Ironic_inspector_config[cors/allow_methods]/ensure: created
>2018-08-21 15:57:37,422 INFO: Notice: /Stage[main]/Ironic::Inspector::Cors/Oslo::Cors[ironic_inspector_config]/Ironic_inspector_config[cors/allow_headers]/ensure: created
>2018-08-21 15:57:37,425 INFO: Notice: /Stage[main]/Ironic::Deps/Anchor[ironic-inspector::config::end]: Triggered 'refresh' from 43 events
>2018-08-21 15:57:37,484 INFO: Notice: /Stage[main]/Ironic::Pxe/Xinetd::Service[tftp]/File[/etc/xinetd.d/tftp]/content: content changed '{md5}678efd3887a91cd4e0955aa6c8b12257' to '{md5}44d72db774cbebdd1090f928e0b3e8f0'
>2018-08-21 15:57:38,204 INFO: Notice: /Stage[main]/Xinetd/Service[xinetd]/ensure: ensure changed 'stopped' to 'running'
>2018-08-21 15:57:38,270 INFO: Notice: /Stage[main]/Ironic::Pxe/Ironic::Pxe::Tftpboot_file[pxelinux.0]/File[/tftpboot/pxelinux.0]/ensure: defined content as '{md5}3b078292686534c3b81baf513c8be233'
>2018-08-21 15:57:38,336 INFO: Notice: /Stage[main]/Ironic::Pxe/Ironic::Pxe::Tftpboot_file[chain.c32]/File[/tftpboot/chain.c32]/ensure: defined content as '{md5}af5c5fd5623d1bc2221f59eab51c9b41'
>2018-08-21 15:57:38,459 INFO: Notice: /Stage[main]/Ironic::Cors/Oslo::Cors[ironic_config]/Ironic_config[cors/allowed_origin]/ensure: created
>2018-08-21 15:57:38,625 INFO: Notice: /Stage[main]/Ironic::Cors/Oslo::Cors[ironic_config]/Ironic_config[cors/expose_headers]/ensure: created
>2018-08-21 15:57:38,737 INFO: Notice: /Stage[main]/Ironic::Cors/Oslo::Cors[ironic_config]/Ironic_config[cors/max_age]/ensure: created
>2018-08-21 15:57:38,848 INFO: Notice: /Stage[main]/Ironic::Cors/Oslo::Cors[ironic_config]/Ironic_config[cors/allow_methods]/ensure: created
>2018-08-21 15:57:38,960 INFO: Notice: /Stage[main]/Ironic::Cors/Oslo::Cors[ironic_config]/Ironic_config[cors/allow_headers]/ensure: created
>2018-08-21 15:57:38,965 INFO: Notice: /Stage[main]/Ironic::Deps/Anchor[ironic::config::end]: Triggered 'refresh' from 95 events
>2018-08-21 15:57:39,059 INFO: Notice: /Stage[main]/Ironic::Db::Mysql/Openstacklib::Db::Mysql[ironic]/Mysql_database[ironic]/ensure: created
>2018-08-21 15:57:39,073 INFO: Notice: /Stage[main]/Mistral::Db/Oslo::Db[mistral_config]/Mistral_config[database/connection]/ensure: created
>2018-08-21 15:57:39,126 INFO: Notice: /Stage[main]/Mistral::Logging/Oslo::Log[mistral_config]/Mistral_config[DEFAULT/debug]/ensure: created
>2018-08-21 15:57:39,138 INFO: Notice: /Stage[main]/Mistral::Logging/Oslo::Log[mistral_config]/Mistral_config[DEFAULT/log_dir]/ensure: created
>2018-08-21 15:57:39,187 INFO: Notice: /Stage[main]/Mistral::Keystone::Authtoken/Keystone::Resource::Authtoken[mistral_config]/Mistral_config[keystone_authtoken/auth_uri]/ensure: created
>2018-08-21 15:57:39,192 INFO: Notice: /Stage[main]/Mistral::Keystone::Authtoken/Keystone::Resource::Authtoken[mistral_config]/Mistral_config[keystone_authtoken/auth_type]/ensure: created
>2018-08-21 15:57:39,256 INFO: Notice: /Stage[main]/Mistral::Keystone::Authtoken/Keystone::Resource::Authtoken[mistral_config]/Mistral_config[keystone_authtoken/auth_url]/ensure: created
>2018-08-21 15:57:39,260 INFO: Notice: /Stage[main]/Mistral::Keystone::Authtoken/Keystone::Resource::Authtoken[mistral_config]/Mistral_config[keystone_authtoken/username]/ensure: created
>2018-08-21 15:57:39,265 INFO: Notice: /Stage[main]/Mistral::Keystone::Authtoken/Keystone::Resource::Authtoken[mistral_config]/Mistral_config[keystone_authtoken/password]/ensure: created
>2018-08-21 15:57:39,270 INFO: Notice: /Stage[main]/Mistral::Keystone::Authtoken/Keystone::Resource::Authtoken[mistral_config]/Mistral_config[keystone_authtoken/user_domain_name]/ensure: created
>2018-08-21 15:57:39,275 INFO: Notice: /Stage[main]/Mistral::Keystone::Authtoken/Keystone::Resource::Authtoken[mistral_config]/Mistral_config[keystone_authtoken/project_name]/ensure: created
>2018-08-21 15:57:39,280 INFO: Notice: /Stage[main]/Mistral::Keystone::Authtoken/Keystone::Resource::Authtoken[mistral_config]/Mistral_config[keystone_authtoken/project_domain_name]/ensure: created
>2018-08-21 15:57:39,291 INFO: Notice: /Stage[main]/Mistral/Oslo::Messaging::Default[mistral_config]/Mistral_config[DEFAULT/rpc_response_timeout]/ensure: created
>2018-08-21 15:57:39,336 INFO: Notice: /Stage[main]/Mistral/Oslo::Messaging::Rabbit[mistral_config]/Mistral_config[oslo_messaging_rabbit/rabbit_password]/ensure: created
>2018-08-21 15:57:39,352 INFO: Notice: /Stage[main]/Mistral/Oslo::Messaging::Rabbit[mistral_config]/Mistral_config[oslo_messaging_rabbit/rabbit_userid]/ensure: created
>2018-08-21 15:57:39,368 INFO: Notice: /Stage[main]/Mistral/Oslo::Messaging::Rabbit[mistral_config]/Mistral_config[oslo_messaging_rabbit/rabbit_host]/ensure: created
>2018-08-21 15:57:40,371 INFO: Notice: /Stage[main]/Mistral::Policy/Oslo::Policy[mistral_config]/Mistral_config[oslo_policy/policy_file]/ensure: created
>2018-08-21 15:57:40,391 INFO: Notice: /Stage[main]/Mistral::Cors/Oslo::Cors[mistral_config]/Mistral_config[cors/allowed_origin]/ensure: created
>2018-08-21 15:57:40,399 INFO: Notice: /Stage[main]/Mistral::Cors/Oslo::Cors[mistral_config]/Mistral_config[cors/expose_headers]/ensure: created
>2018-08-21 15:57:40,410 INFO: Notice: /Stage[main]/Mistral::Cors/Oslo::Cors[mistral_config]/Mistral_config[cors/allow_headers]/ensure: created
>2018-08-21 15:57:40,414 INFO: Notice: /Stage[main]/Mistral::Deps/Anchor[mistral::config::end]: Triggered 'refresh' from 25 events
>2018-08-21 15:57:40,502 INFO: Notice: /Stage[main]/Mistral::Db::Mysql/Openstacklib::Db::Mysql[mistral]/Mysql_database[mistral]/ensure: created
>2018-08-21 15:57:40,563 INFO: Notice: /Stage[main]/Apache::Mod::Proxy/Apache::Mod[proxy]/File[proxy.load]/ensure: defined content as '{md5}fe26a0a70f572eb256a3c6c183a62223'
>2018-08-21 15:57:40,630 INFO: Notice: /Stage[main]/Apache::Mod::Proxy_http/Apache::Mod[proxy_http]/File[proxy_http.load]/ensure: defined content as '{md5}0329b852b123a914fca8b072de61f913'
>2018-08-21 15:57:40,697 INFO: Notice: /Stage[main]/Apache::Mod::Proxy_wstunnel/Apache::Mod[proxy_wstunnel]/File[proxy_wstunnel.load]/ensure: defined content as '{md5}8036815f495618f4dde9d68796622e1c'
>2018-08-21 15:57:42,362 INFO: Notice: /Stage[main]/Apache/File[/etc/httpd/conf.modules.d/00-base.conf]/ensure: removed
>2018-08-21 15:57:42,368 INFO: Notice: /Stage[main]/Apache/File[/etc/httpd/conf.modules.d/00-dav.conf]/ensure: removed
>2018-08-21 15:57:42,374 INFO: Notice: /Stage[main]/Apache/File[/etc/httpd/conf.modules.d/00-lua.conf]/ensure: removed
>2018-08-21 15:57:42,380 INFO: Notice: /Stage[main]/Apache/File[/etc/httpd/conf.modules.d/00-mpm.conf]/ensure: removed
>2018-08-21 15:57:42,394 INFO: Notice: /Stage[main]/Apache/File[/etc/httpd/conf.modules.d/00-proxy.conf]/ensure: removed
>2018-08-21 15:57:42,400 INFO: Notice: /Stage[main]/Apache/File[/etc/httpd/conf.modules.d/00-systemd.conf]/ensure: removed
>2018-08-21 15:57:42,406 INFO: Notice: /Stage[main]/Apache/File[/etc/httpd/conf.modules.d/01-cgi.conf]/ensure: removed
>2018-08-21 15:57:42,411 INFO: Notice: /Stage[main]/Apache/File[/etc/httpd/conf.modules.d/10-wsgi.conf]/ensure: removed
>2018-08-21 15:57:42,492 INFO: Notice: /Stage[main]/Tripleo::Ui/File[/var/www/openstack-tripleo-ui/dist/tripleo_ui_config.js]/ensure: defined content as '{md5}67b9a5a80f2c0109f6d5879f7c25947f'
>2018-08-21 15:57:42,664 INFO: Notice: /Stage[main]/Zaqar::Logging/Oslo::Log[zaqar_config]/Zaqar_config[DEFAULT/log_dir]/ensure: created
>2018-08-21 15:57:44,134 INFO: Notice: /Stage[main]/Zaqar::Keystone::Authtoken/Keystone::Resource::Authtoken[zaqar_config]/Zaqar_config[keystone_authtoken/auth_uri]/ensure: created
>2018-08-21 15:57:44,194 INFO: Notice: /Stage[main]/Zaqar::Keystone::Authtoken/Keystone::Resource::Authtoken[zaqar_config]/Zaqar_config[keystone_authtoken/auth_type]/ensure: created
>2018-08-21 15:57:44,863 INFO: Notice: /Stage[main]/Zaqar::Keystone::Authtoken/Keystone::Resource::Authtoken[zaqar_config]/Zaqar_config[keystone_authtoken/auth_url]/ensure: created
>2018-08-21 15:57:44,921 INFO: Notice: /Stage[main]/Zaqar::Keystone::Authtoken/Keystone::Resource::Authtoken[zaqar_config]/Zaqar_config[keystone_authtoken/username]/ensure: created
>2018-08-21 15:57:44,980 INFO: Notice: /Stage[main]/Zaqar::Keystone::Authtoken/Keystone::Resource::Authtoken[zaqar_config]/Zaqar_config[keystone_authtoken/password]/ensure: created
>2018-08-21 15:57:45,039 INFO: Notice: /Stage[main]/Zaqar::Keystone::Authtoken/Keystone::Resource::Authtoken[zaqar_config]/Zaqar_config[keystone_authtoken/user_domain_name]/ensure: created
>2018-08-21 15:57:45,098 INFO: Notice: /Stage[main]/Zaqar::Keystone::Authtoken/Keystone::Resource::Authtoken[zaqar_config]/Zaqar_config[keystone_authtoken/project_name]/ensure: created
>2018-08-21 15:57:45,157 INFO: Notice: /Stage[main]/Zaqar::Keystone::Authtoken/Keystone::Resource::Authtoken[zaqar_config]/Zaqar_config[keystone_authtoken/project_domain_name]/ensure: created
>2018-08-21 15:57:45,246 INFO: Notice: /Stage[main]/Zaqar::Policy/Oslo::Policy[zaqar_config]/Zaqar_config[oslo_policy/policy_file]/ensure: created
>2018-08-21 15:57:45,304 INFO: Notice: /Stage[main]/Zaqar::Deps/Anchor[zaqar::config::end]: Triggered 'refresh' from 25 events
>2018-08-21 15:57:45,392 INFO: Notice: /Stage[main]/Zaqar::Db::Mysql/Openstacklib::Db::Mysql[zaqar]/Mysql_database[zaqar]/ensure: created
>2018-08-21 15:57:45,406 INFO: Notice: /Stage[main]/Zaqar::Wsgi::Apache/Openstacklib::Wsgi::Apache[zaqar_wsgi]/File[/var/www/cgi-bin/zaqar]/ensure: created
>2018-08-21 15:57:45,467 INFO: Notice: /Stage[main]/Zaqar::Wsgi::Apache/Openstacklib::Wsgi::Apache[zaqar_wsgi]/File[zaqar_wsgi]/ensure: defined content as '{md5}3a0f81ec944ad0c68f7db4b58b1f72d6'
>2018-08-21 15:57:45,524 INFO: Notice: /Stage[main]/Main/Zaqar::Server_instance[1]/File[/etc/zaqar/1.conf]/ensure: defined content as '{md5}7750571c15d53265e703cc60ea7938af'
>2018-08-21 15:57:45,627 INFO: Notice: /Stage[main]/Ssh::Server::Config/Concat[/etc/ssh/sshd_config]/File[/etc/ssh/sshd_config]/content: content changed '{md5}40d961cd3154f0439fcac1a50bd77b96' to '{md5}56dac259c7b06586e1726ee6685171ba'
>2018-08-21 15:57:45,987 INFO: Notice: /Stage[main]/Ssh::Server::Service/Service[sshd]: Triggered 'refresh' from 2 events
>2018-08-21 15:57:47,250 INFO: Notice: /Stage[main]/Glance::Db::Mysql/Openstacklib::Db::Mysql[glance]/Openstacklib::Db::Mysql::Host_access[glance_%]/Mysql_user[glance@%]/ensure: created
>2018-08-21 15:57:47,573 INFO: Notice: /Stage[main]/Glance::Db::Mysql/Openstacklib::Db::Mysql[glance]/Openstacklib::Db::Mysql::Host_access[glance_%]/Mysql_grant[glance@%/glance.*]/ensure: created
>2018-08-21 15:57:47,892 INFO: Notice: /Stage[main]/Glance::Db::Mysql/Openstacklib::Db::Mysql[glance]/Openstacklib::Db::Mysql::Host_access[glance_192.168.24.1]/Mysql_user[glance@192.168.24.1]/ensure: created
>2018-08-21 15:57:47,976 INFO: Notice: /Stage[main]/Glance::Db::Mysql/Openstacklib::Db::Mysql[glance]/Openstacklib::Db::Mysql::Host_access[glance_192.168.24.1]/Mysql_grant[glance@192.168.24.1/glance.*]/ensure: created
>2018-08-21 15:57:48,061 INFO: Notice: /Stage[main]/Glance::Deps/Anchor[glance::db::end]: Triggered 'refresh' from 1 events
>2018-08-21 15:57:48,064 INFO: Notice: /Stage[main]/Glance::Deps/Anchor[glance::dbsync::begin]: Triggered 'refresh' from 1 events
>2018-08-21 15:58:14,670 INFO: Notice: /Stage[main]/Glance::Db::Sync/Exec[glance-manage db_sync]/returns: executed successfully
>2018-08-21 15:58:20,040 INFO: Notice: /Stage[main]/Glance::Db::Sync/Exec[glance-manage db_sync]: Triggered 'refresh' from 3 events
>2018-08-21 15:58:20,044 INFO: Notice: /Stage[main]/Glance::Deps/Anchor[glance::dbsync::end]: Triggered 'refresh' from 2 events
>2018-08-21 15:58:37,866 INFO: Notice: /Stage[main]/Glance::Db::Metadefs/Exec[glance-manage db_load_metadefs]: Triggered 'refresh' from 3 events
>2018-08-21 15:58:38,117 INFO: Notice: /Stage[main]/Nova::Db::Mysql/Openstacklib::Db::Mysql[nova]/Openstacklib::Db::Mysql::Host_access[nova_%]/Mysql_user[nova@%]/ensure: created
>2018-08-21 15:58:38,210 INFO: Notice: /Stage[main]/Nova::Db::Mysql/Openstacklib::Db::Mysql[nova]/Openstacklib::Db::Mysql::Host_access[nova_%]/Mysql_grant[nova@%/nova.*]/ensure: created
>2018-08-21 15:58:38,546 INFO: Notice: /Stage[main]/Nova::Db::Mysql/Openstacklib::Db::Mysql[nova]/Openstacklib::Db::Mysql::Host_access[nova_192.168.24.1]/Mysql_user[nova@192.168.24.1]/ensure: created
>2018-08-21 15:58:38,632 INFO: Notice: /Stage[main]/Nova::Db::Mysql/Openstacklib::Db::Mysql[nova]/Openstacklib::Db::Mysql::Host_access[nova_192.168.24.1]/Mysql_grant[nova@192.168.24.1/nova.*]/ensure: created
>2018-08-21 15:58:38,806 INFO: Notice: /Stage[main]/Nova::Db::Mysql/Openstacklib::Db::Mysql[nova_cell0]/Openstacklib::Db::Mysql::Host_access[nova_cell0_%]/Mysql_grant[nova@%/nova_cell0.*]/ensure: created
>2018-08-21 15:58:38,976 INFO: Notice: /Stage[main]/Nova::Db::Mysql/Openstacklib::Db::Mysql[nova_cell0]/Openstacklib::Db::Mysql::Host_access[nova_cell0_192.168.24.1]/Mysql_grant[nova@192.168.24.1/nova_cell0.*]/ensure: created
>2018-08-21 15:58:39,314 INFO: Notice: /Stage[main]/Nova::Db::Mysql_api/Openstacklib::Db::Mysql[nova_api]/Openstacklib::Db::Mysql::Host_access[nova_api_%]/Mysql_user[nova_api@%]/ensure: created
>2018-08-21 15:58:39,400 INFO: Notice: /Stage[main]/Nova::Db::Mysql_api/Openstacklib::Db::Mysql[nova_api]/Openstacklib::Db::Mysql::Host_access[nova_api_%]/Mysql_grant[nova_api@%/nova_api.*]/ensure: created
>2018-08-21 15:58:39,734 INFO: Notice: /Stage[main]/Nova::Db::Mysql_api/Openstacklib::Db::Mysql[nova_api]/Openstacklib::Db::Mysql::Host_access[nova_api_192.168.24.1]/Mysql_user[nova_api@192.168.24.1]/ensure: created
>2018-08-21 15:58:39,820 INFO: Notice: /Stage[main]/Nova::Db::Mysql_api/Openstacklib::Db::Mysql[nova_api]/Openstacklib::Db::Mysql::Host_access[nova_api_192.168.24.1]/Mysql_grant[nova_api@192.168.24.1/nova_api.*]/ensure: created
>2018-08-21 15:58:40,160 INFO: Notice: /Stage[main]/Nova::Db::Mysql_placement/Openstacklib::Db::Mysql[nova_placement]/Openstacklib::Db::Mysql::Host_access[nova_placement_%]/Mysql_user[nova_placement@%]/ensure: created
>2018-08-21 15:58:40,248 INFO: Notice: /Stage[main]/Nova::Db::Mysql_placement/Openstacklib::Db::Mysql[nova_placement]/Openstacklib::Db::Mysql::Host_access[nova_placement_%]/Mysql_grant[nova_placement@%/nova_placement.*]/ensure: created
>2018-08-21 15:58:40,578 INFO: Notice: /Stage[main]/Nova::Db::Mysql_placement/Openstacklib::Db::Mysql[nova_placement]/Openstacklib::Db::Mysql::Host_access[nova_placement_192.168.24.1]/Mysql_user[nova_placement@192.168.24.1]/ensure: created
>2018-08-21 15:58:40,663 INFO: Notice: /Stage[main]/Nova::Db::Mysql_placement/Openstacklib::Db::Mysql[nova_placement]/Openstacklib::Db::Mysql::Host_access[nova_placement_192.168.24.1]/Mysql_grant[nova_placement@192.168.24.1/nova_placement.*]/ensure: created
>2018-08-21 15:58:40,750 INFO: Notice: /Stage[main]/Nova::Deps/Anchor[nova::db::end]: Triggered 'refresh' from 3 events
>2018-08-21 15:58:40,753 INFO: Notice: /Stage[main]/Nova::Deps/Anchor[nova::dbsync_api::begin]: Triggered 'refresh' from 1 events
>2018-08-21 15:59:32,819 INFO: Notice: /Stage[main]/Nova::Db::Sync_api/Exec[nova-db-sync-api]/returns: executed successfully
>2018-08-21 15:59:50,204 INFO: Notice: /Stage[main]/Nova::Db::Sync_api/Exec[nova-db-sync-api]: Triggered 'refresh' from 4 events
>2018-08-21 15:59:50,208 INFO: Notice: /Stage[main]/Nova::Deps/Anchor[nova::dbsync_api::end]: Triggered 'refresh' from 2 events
>2018-08-21 15:59:50,211 INFO: Notice: /Stage[main]/Nova::Deps/Anchor[nova::cell_v2::begin]: Triggered 'refresh' from 1 events
>2018-08-21 15:59:50,213 INFO: Notice: /Stage[main]/Nova::Deps/Anchor[nova::db_online_data_migrations::begin]: Triggered 'refresh' from 1 events
>2018-08-21 16:00:08,192 INFO: Notice: /Stage[main]/Nova::Cell_v2::Map_cell0/Exec[nova-cell_v2-map_cell0]: Triggered 'refresh' from 1 events
>2018-08-21 16:00:08,196 INFO: Notice: /Stage[main]/Nova::Deps/Anchor[nova::cell_v2::end]: Triggered 'refresh' from 1 events
>2018-08-21 16:00:08,198 INFO: Notice: /Stage[main]/Nova::Deps/Anchor[nova::dbsync::begin]: Triggered 'refresh' from 3 events
>2018-08-21 16:05:34,465 INFO: Notice: /Stage[main]/Nova::Db::Sync/Exec[nova-db-sync]/returns: executed successfully
>2018-08-21 16:05:53,189 INFO: Notice: /Stage[main]/Nova::Db::Sync/Exec[nova-db-sync]: Triggered 'refresh' from 4 events
>2018-08-21 16:05:53,192 INFO: Notice: /Stage[main]/Nova::Deps/Anchor[nova::dbsync::end]: Triggered 'refresh' from 2 events
>2018-08-21 16:05:53,195 INFO: Notice: /Stage[main]/Nova::Deps/Anchor[nova::service::begin]: Triggered 'refresh' from 5 events
>2018-08-21 16:06:28,970 INFO: Notice: /Stage[main]/Nova::Cell_v2::Simple_setup/Nova_cell_v2[default]/ensure: created
>2018-08-21 16:06:28,978 INFO: Notice: /Stage[main]/Nova::Cron::Archive_deleted_rows/Cron[nova-manage db archive_deleted_rows]/ensure: created
>2018-08-21 16:06:46,226 INFO: Notice: /Stage[main]/Nova::Conductor/Nova::Generic_service[conductor]/Service[nova-conductor]/ensure: ensure changed 'stopped' to 'running'
>2018-08-21 16:07:05,032 INFO: Notice: /Stage[main]/Nova::Scheduler/Nova::Generic_service[scheduler]/Service[nova-scheduler]/ensure: ensure changed 'stopped' to 'running'
>2018-08-21 16:07:05,291 INFO: Notice: /Stage[main]/Neutron::Db::Mysql/Openstacklib::Db::Mysql[neutron]/Openstacklib::Db::Mysql::Host_access[neutron_%]/Mysql_user[neutron@%]/ensure: created
>2018-08-21 16:07:05,376 INFO: Notice: /Stage[main]/Neutron::Db::Mysql/Openstacklib::Db::Mysql[neutron]/Openstacklib::Db::Mysql::Host_access[neutron_%]/Mysql_grant[neutron@%/neutron.*]/ensure: created
>2018-08-21 16:07:05,710 INFO: Notice: /Stage[main]/Neutron::Db::Mysql/Openstacklib::Db::Mysql[neutron]/Openstacklib::Db::Mysql::Host_access[neutron_192.168.24.1]/Mysql_user[neutron@192.168.24.1]/ensure: created
>2018-08-21 16:07:05,799 INFO: Notice: /Stage[main]/Neutron::Db::Mysql/Openstacklib::Db::Mysql[neutron]/Openstacklib::Db::Mysql::Host_access[neutron_192.168.24.1]/Mysql_grant[neutron@192.168.24.1/neutron.*]/ensure: created
>2018-08-21 16:07:05,889 INFO: Notice: /Stage[main]/Neutron::Deps/Anchor[neutron::db::end]: Triggered 'refresh' from 1 events
>2018-08-21 16:07:05,892 INFO: Notice: /Stage[main]/Neutron::Deps/Anchor[neutron::dbsync::begin]: Triggered 'refresh' from 1 events
>2018-08-21 16:09:24,715 INFO: Notice: /Stage[main]/Neutron::Db::Sync/Exec[neutron-db-sync]/returns: executed successfully
>2018-08-21 16:09:31,313 INFO: Notice: /Stage[main]/Neutron::Db::Sync/Exec[neutron-db-sync]: Triggered 'refresh' from 3 events
>2018-08-21 16:09:31,316 INFO: Notice: /Stage[main]/Neutron::Deps/Anchor[neutron::dbsync::end]: Triggered 'refresh' from 2 events
>2018-08-21 16:09:31,319 INFO: Notice: /Stage[main]/Neutron::Deps/Anchor[neutron::service::begin]: Triggered 'refresh' from 3 events
>2018-08-21 16:09:32,577 INFO: Notice: /Stage[main]/Neutron::Agents::Dhcp/Service[neutron-dhcp-service]/ensure: ensure changed 'stopped' to 'running'
>2018-08-21 16:09:33,758 INFO: Notice: /Stage[main]/Neutron::Agents::L3/Service[neutron-l3]/ensure: ensure changed 'stopped' to 'running'
>2018-08-21 16:09:34,972 INFO: Notice:
/Stage[main]/Neutron::Agents::Ml2::Ovs/Service[neutron-ovs-agent-service]/ensure: ensure changed 'stopped' to 'running'[0m >2018-08-21 16:09:44,693 INFO: [mNotice: /Stage[main]/Neutron::Agents::Ml2::Ovs/Service[neutron-destroy-patch-ports-service]/ensure: ensure changed 'stopped' to 'running'[0m >2018-08-21 16:09:44,966 INFO: [mNotice: /Stage[main]/Heat::Db::Mysql/Openstacklib::Db::Mysql[heat]/Openstacklib::Db::Mysql::Host_access[heat_%]/Mysql_user[heat@%]/ensure: created[0m >2018-08-21 16:09:45,060 INFO: [mNotice: /Stage[main]/Heat::Db::Mysql/Openstacklib::Db::Mysql[heat]/Openstacklib::Db::Mysql::Host_access[heat_%]/Mysql_grant[heat@%/heat.*]/ensure: created[0m >2018-08-21 16:09:45,400 INFO: [mNotice: /Stage[main]/Heat::Db::Mysql/Openstacklib::Db::Mysql[heat]/Openstacklib::Db::Mysql::Host_access[heat_192.168.24.1]/Mysql_user[heat@192.168.24.1]/ensure: created[0m >2018-08-21 16:09:45,489 INFO: [mNotice: /Stage[main]/Heat::Db::Mysql/Openstacklib::Db::Mysql[heat]/Openstacklib::Db::Mysql::Host_access[heat_192.168.24.1]/Mysql_grant[heat@192.168.24.1/heat.*]/ensure: created[0m >2018-08-21 16:09:45,582 INFO: [mNotice: /Stage[main]/Heat::Deps/Anchor[heat::db::end]: Triggered 'refresh' from 1 events[0m >2018-08-21 16:09:45,585 INFO: [mNotice: /Stage[main]/Heat::Deps/Anchor[heat::dbsync::begin]: Triggered 'refresh' from 1 events[0m >2018-08-21 16:10:06,021 INFO: [mNotice: /Stage[main]/Heat::Db::Sync/Exec[heat-dbsync]/returns: executed successfully[0m >2018-08-21 16:10:11,194 INFO: [mNotice: /Stage[main]/Heat::Db::Sync/Exec[heat-dbsync]: Triggered 'refresh' from 3 events[0m >2018-08-21 16:10:11,198 INFO: [mNotice: /Stage[main]/Heat::Deps/Anchor[heat::dbsync::end]: Triggered 'refresh' from 2 events[0m >2018-08-21 16:10:11,201 INFO: [mNotice: /Stage[main]/Heat::Deps/Anchor[heat::service::begin]: Triggered 'refresh' from 3 events[0m >2018-08-21 16:10:11,431 INFO: [mNotice: /Stage[main]/Heat::Api/Service[heat-api]: Triggered 'refresh' from 1 events[0m >2018-08-21 16:10:11,648 
INFO: [mNotice: /Stage[main]/Heat::Api_cfn/Service[heat-api-cfn]: Triggered 'refresh' from 1 events[0m >2018-08-21 16:10:12,946 INFO: [mNotice: /Stage[main]/Heat::Engine/Service[heat-engine]/ensure: ensure changed 'stopped' to 'running'[0m >2018-08-21 16:10:13,209 INFO: [mNotice: /Stage[main]/Ironic::Db::Mysql/Openstacklib::Db::Mysql[ironic]/Openstacklib::Db::Mysql::Host_access[ironic_%]/Mysql_user[ironic@%]/ensure: created[0m >2018-08-21 16:10:13,298 INFO: [mNotice: /Stage[main]/Ironic::Db::Mysql/Openstacklib::Db::Mysql[ironic]/Openstacklib::Db::Mysql::Host_access[ironic_%]/Mysql_grant[ironic@%/ironic.*]/ensure: created[0m >2018-08-21 16:10:13,634 INFO: [mNotice: /Stage[main]/Ironic::Db::Mysql/Openstacklib::Db::Mysql[ironic]/Openstacklib::Db::Mysql::Host_access[ironic_192.168.24.1]/Mysql_user[ironic@192.168.24.1]/ensure: created[0m >2018-08-21 16:10:13,721 INFO: [mNotice: /Stage[main]/Ironic::Db::Mysql/Openstacklib::Db::Mysql[ironic]/Openstacklib::Db::Mysql::Host_access[ironic_192.168.24.1]/Mysql_grant[ironic@192.168.24.1/ironic.*]/ensure: created[0m >2018-08-21 16:10:13,812 INFO: [mNotice: /Stage[main]/Ironic::Deps/Anchor[ironic::db::end]: Triggered 'refresh' from 1 events[0m >2018-08-21 16:10:13,815 INFO: [mNotice: /Stage[main]/Ironic::Deps/Anchor[ironic::dbsync::begin]: Triggered 'refresh' from 1 events[0m >2018-08-21 16:10:54,972 INFO: [mNotice: /Stage[main]/Ironic::Db::Sync/Exec[ironic-dbsync]/returns: executed successfully[0m >2018-08-21 16:11:00,434 INFO: [mNotice: /Stage[main]/Ironic::Db::Sync/Exec[ironic-dbsync]: Triggered 'refresh' from 3 events[0m >2018-08-21 16:11:00,437 INFO: [mNotice: /Stage[main]/Ironic::Deps/Anchor[ironic::dbsync::end]: Triggered 'refresh' from 2 events[0m >2018-08-21 16:11:00,440 INFO: [mNotice: /Stage[main]/Ironic::Deps/Anchor[ironic::db_online_data_migrations::begin]: Triggered 'refresh' from 1 events[0m >2018-08-21 16:11:14,163 INFO: [mNotice: 
/Stage[main]/Ironic::Db::Online_data_migrations/Exec[ironic-db-online-data-migrations]/returns: executed successfully[0m >2018-08-21 16:11:27,385 INFO: [mNotice: /Stage[main]/Ironic::Db::Online_data_migrations/Exec[ironic-db-online-data-migrations]: Triggered 'refresh' from 4 events[0m >2018-08-21 16:11:27,388 INFO: [mNotice: /Stage[main]/Ironic::Deps/Anchor[ironic::db_online_data_migrations::end]: Triggered 'refresh' from 2 events[0m >2018-08-21 16:11:27,391 INFO: [mNotice: /Stage[main]/Ironic::Deps/Anchor[ironic::service::begin]: Triggered 'refresh' from 3 events[0m >2018-08-21 16:11:27,618 INFO: [mNotice: /Stage[main]/Ironic::Api/Service[ironic-api]: Triggered 'refresh' from 1 events[0m >2018-08-21 16:11:28,898 INFO: [mNotice: /Stage[main]/Ironic::Conductor/Service[ironic-conductor]/ensure: ensure changed 'stopped' to 'running'[0m >2018-08-21 16:11:28,901 INFO: [mNotice: /Stage[main]/Ironic::Deps/Anchor[ironic::service::end]: Triggered 'refresh' from 2 events[0m >2018-08-21 16:11:29,161 INFO: [mNotice: /Stage[main]/Ironic::Inspector::Db::Mysql/Openstacklib::Db::Mysql[ironic-inspector]/Openstacklib::Db::Mysql::Host_access[ironic-inspector_%]/Mysql_user[ironic-inspector@%]/ensure: created[0m >2018-08-21 16:11:29,247 INFO: [mNotice: /Stage[main]/Ironic::Inspector::Db::Mysql/Openstacklib::Db::Mysql[ironic-inspector]/Openstacklib::Db::Mysql::Host_access[ironic-inspector_%]/Mysql_grant[ironic-inspector@%/ironic-inspector.*]/ensure: created[0m >2018-08-21 16:11:29,580 INFO: [mNotice: /Stage[main]/Ironic::Inspector::Db::Mysql/Openstacklib::Db::Mysql[ironic-inspector]/Openstacklib::Db::Mysql::Host_access[ironic-inspector_192.168.24.1]/Mysql_user[ironic-inspector@192.168.24.1]/ensure: created[0m >2018-08-21 16:11:29,667 INFO: [mNotice: /Stage[main]/Ironic::Inspector::Db::Mysql/Openstacklib::Db::Mysql[ironic-inspector]/Openstacklib::Db::Mysql::Host_access[ironic-inspector_192.168.24.1]/Mysql_grant[ironic-inspector@192.168.24.1/ironic-inspector.*]/ensure: created[0m 
>2018-08-21 16:11:42,202 INFO: [mNotice: /Stage[main]/Ironic::Inspector::Db::Sync/Exec[ironic-inspector-dbsync]: Triggered 'refresh' from 3 events[0m >2018-08-21 16:11:42,206 INFO: [mNotice: /Stage[main]/Ironic::Deps/Anchor[ironic-inspector::dbsync::end]: Triggered 'refresh' from 1 events[0m >2018-08-21 16:11:42,208 INFO: [mNotice: /Stage[main]/Ironic::Deps/Anchor[ironic-inspector::service::begin]: Triggered 'refresh' from 3 events[0m >2018-08-21 16:11:42,383 INFO: [mNotice: /Stage[main]/Apache/Apache::Vhost[default]/Concat[15-default.conf]/File[/etc/httpd/conf.d/15-default.conf]/ensure: defined content as '{md5}a430bf4e003be964b419e7aea251c6c4'[0m >2018-08-21 16:11:42,641 INFO: [mNotice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat[10-keystone_wsgi_main.conf]/File[/etc/httpd/conf.d/10-keystone_wsgi_main.conf]/ensure: defined content as '{md5}36ba055567eeb2c4e1f0eb28561331ab'[0m >2018-08-21 16:11:42,817 INFO: [mNotice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat[10-keystone_wsgi_admin.conf]/File[/etc/httpd/conf.d/10-keystone_wsgi_admin.conf]/ensure: defined content as '{md5}d6123c98d7ad29a54d23edf55b7c262e'[0m >2018-08-21 16:11:59,526 INFO: [mNotice: /Stage[main]/Swift::Storage::Account/Swift::Storage::Generic[account]/Package[swift-account]/ensure: created[0m >2018-08-21 16:12:16,056 INFO: [mNotice: /Stage[main]/Swift::Storage::Container/Swift::Storage::Generic[container]/Package[swift-container]/ensure: created[0m >2018-08-21 16:12:32,569 INFO: [mNotice: /Stage[main]/Swift::Storage::Object/Swift::Storage::Generic[object]/Package[swift-object]/ensure: created[0m >2018-08-21 16:12:32,573 INFO: [mNotice: /Stage[main]/Swift::Deps/Anchor[swift::install::end]: Triggered 'refresh' from 5 events[0m >2018-08-21 16:12:32,602 INFO: [mNotice: /Stage[main]/Swift/File[/var/lib/swift]/group: group changed 'root' to 'swift'[0m >2018-08-21 16:12:32,613 INFO: [mNotice: 
/Stage[main]/Swift/File[/etc/swift/swift.conf]/owner: owner changed 'root' to 'swift'[0m >2018-08-21 16:12:33,649 INFO: [mNotice: /Stage[main]/Swift/Swift_config[swift-hash/swift_hash_path_suffix]/value: value changed '%SWIFT_HASH_PATH_SUFFIX%' to 'f658cbd6780fe5d7de81ab3554ee22070cf668aa'[0m >2018-08-21 16:12:33,655 INFO: [mNotice: /Stage[main]/Swift/Swift_config[swift-constraints/max_header_size]/ensure: created[0m >2018-08-21 16:12:33,661 INFO: [mNotice: /Stage[main]/Cinder::Deps/Anchor[cinder::service::end]: Triggered 'refresh' from 38 events[0m >2018-08-21 16:12:33,674 INFO: [mNotice: /Stage[main]/Swift::Proxy/Swift_proxy_config[DEFAULT/bind_ip]/ensure: created[0m >2018-08-21 16:12:33,681 INFO: [mNotice: /Stage[main]/Swift::Proxy/Swift_proxy_config[DEFAULT/workers]/value: value changed '8' to '6'[0m >2018-08-21 16:12:33,692 INFO: [mNotice: /Stage[main]/Swift::Proxy/Swift_proxy_config[DEFAULT/log_name]/ensure: created[0m >2018-08-21 16:12:33,699 INFO: [mNotice: /Stage[main]/Swift::Proxy/Swift_proxy_config[DEFAULT/log_facility]/ensure: created[0m >2018-08-21 16:12:33,706 INFO: [mNotice: /Stage[main]/Swift::Proxy/Swift_proxy_config[DEFAULT/log_level]/ensure: created[0m >2018-08-21 16:12:33,713 INFO: [mNotice: /Stage[main]/Swift::Proxy/Swift_proxy_config[DEFAULT/log_headers]/ensure: created[0m >2018-08-21 16:12:33,721 INFO: [mNotice: /Stage[main]/Swift::Proxy/Swift_proxy_config[DEFAULT/log_address]/ensure: created[0m >2018-08-21 16:12:33,736 INFO: [mNotice: /Stage[main]/Swift::Proxy/Swift_proxy_config[pipeline:main/pipeline]/value: value changed 'catch_errors gatekeeper healthcheck proxy-logging cache container_sync bulk tempurl ratelimit copy container-quotas account-quotas slo dlo versioned_writes proxy-logging proxy-server' to 'catch_errors healthcheck proxy-logging cache ratelimit bulk tempurl formpost authtoken keystone staticweb copy slo dlo versioned_writes proxy-logging proxy-server'[0m >2018-08-21 16:12:33,747 INFO: [mNotice: 
/Stage[main]/Swift::Proxy/Swift_proxy_config[app:proxy-server/set log_name]/ensure: created[0m >2018-08-21 16:12:33,754 INFO: [mNotice: /Stage[main]/Swift::Proxy/Swift_proxy_config[app:proxy-server/set log_facility]/ensure: created[0m >2018-08-21 16:12:33,762 INFO: [mNotice: /Stage[main]/Swift::Proxy/Swift_proxy_config[app:proxy-server/set log_level]/ensure: created[0m >2018-08-21 16:12:33,769 INFO: [mNotice: /Stage[main]/Swift::Proxy/Swift_proxy_config[app:proxy-server/set log_address]/ensure: created[0m >2018-08-21 16:12:33,778 INFO: [mNotice: /Stage[main]/Swift::Proxy/Swift_proxy_config[app:proxy-server/log_handoffs]/ensure: created[0m >2018-08-21 16:12:33,785 INFO: [mNotice: /Stage[main]/Swift::Proxy/Swift_proxy_config[app:proxy-server/allow_account_management]/value: value changed 'true' to 'True'[0m >2018-08-21 16:12:33,793 INFO: [mNotice: /Stage[main]/Swift::Proxy/Swift_proxy_config[app:proxy-server/account_autocreate]/value: value changed 'true' to 'True'[0m >2018-08-21 16:12:33,808 INFO: [mNotice: /Stage[main]/Swift::Proxy/Swift_proxy_config[app:proxy-server/node_timeout]/ensure: created[0m >2018-08-21 16:12:33,815 INFO: [mNotice: /Stage[main]/Swift::Proxy/Swift_proxy_config[DEFAULT/cors_allow_origin]/ensure: created[0m >2018-08-21 16:12:33,823 INFO: [mNotice: /Stage[main]/Swift::Proxy/Swift_proxy_config[DEFAULT/strict_cors_mode]/ensure: created[0m >2018-08-21 16:12:33,857 INFO: [mNotice: /Stage[main]/Swift::Proxy::Bulk/Swift_proxy_config[filter:bulk/max_containers_per_extraction]/ensure: created[0m >2018-08-21 16:12:33,864 INFO: [mNotice: /Stage[main]/Swift::Proxy::Bulk/Swift_proxy_config[filter:bulk/max_failed_extractions]/ensure: created[0m >2018-08-21 16:12:33,872 INFO: [mNotice: /Stage[main]/Swift::Proxy::Bulk/Swift_proxy_config[filter:bulk/max_deletes_per_request]/ensure: created[0m >2018-08-21 16:12:33,879 INFO: [mNotice: /Stage[main]/Swift::Proxy::Bulk/Swift_proxy_config[filter:bulk/yield_frequency]/ensure: created[0m >2018-08-21 16:12:33,906 INFO: 
[mNotice: /Stage[main]/Swift::Proxy::Keystone/Swift_proxy_config[filter:keystone/reseller_prefix]/ensure: created[0m >2018-08-21 16:12:33,915 INFO: [mNotice: /Stage[main]/Swift::Proxy::Authtoken/File[/var/cache/swift]/mode: mode changed '0755' to '0700'[0m >2018-08-21 16:12:33,923 INFO: [mNotice: /Stage[main]/Swift::Proxy::Authtoken/Swift_proxy_config[filter:authtoken/log_name]/ensure: created[0m >2018-08-21 16:12:33,931 INFO: [mNotice: /Stage[main]/Swift::Proxy::Authtoken/Swift_proxy_config[filter:authtoken/signing_dir]/value: value changed '/tmp/keystone-signing-swift' to '/var/cache/swift'[0m >2018-08-21 16:12:33,943 INFO: [mNotice: /Stage[main]/Swift::Proxy::Authtoken/Swift_proxy_config[filter:authtoken/www_authenticate_uri]/ensure: created[0m >2018-08-21 16:12:33,950 INFO: [mNotice: /Stage[main]/Swift::Proxy::Authtoken/Swift_proxy_config[filter:authtoken/auth_url]/ensure: created[0m >2018-08-21 16:12:33,958 INFO: [mNotice: /Stage[main]/Swift::Proxy::Authtoken/Swift_proxy_config[filter:authtoken/auth_plugin]/ensure: created[0m >2018-08-21 16:12:33,966 INFO: [mNotice: /Stage[main]/Swift::Proxy::Authtoken/Swift_proxy_config[filter:authtoken/project_domain_id]/ensure: created[0m >2018-08-21 16:12:33,974 INFO: [mNotice: /Stage[main]/Swift::Proxy::Authtoken/Swift_proxy_config[filter:authtoken/user_domain_id]/ensure: created[0m >2018-08-21 16:12:33,982 INFO: [mNotice: /Stage[main]/Swift::Proxy::Authtoken/Swift_proxy_config[filter:authtoken/project_name]/ensure: created[0m >2018-08-21 16:12:33,990 INFO: [mNotice: /Stage[main]/Swift::Proxy::Authtoken/Swift_proxy_config[filter:authtoken/username]/ensure: created[0m >2018-08-21 16:12:33,998 INFO: [mNotice: /Stage[main]/Swift::Proxy::Authtoken/Swift_proxy_config[filter:authtoken/password]/ensure: created[0m >2018-08-21 16:12:34,006 INFO: [mNotice: /Stage[main]/Swift::Proxy::Authtoken/Swift_proxy_config[filter:authtoken/delay_auth_decision]/ensure: created[0m >2018-08-21 16:12:34,014 INFO: [mNotice: 
/Stage[main]/Swift::Proxy::Authtoken/Swift_proxy_config[filter:authtoken/cache]/ensure: created[0m >2018-08-21 16:12:34,022 INFO: [mNotice: /Stage[main]/Swift::Proxy::Authtoken/Swift_proxy_config[filter:authtoken/include_service_catalog]/ensure: created[0m >2018-08-21 16:12:34,032 INFO: [mNotice: /Stage[main]/Swift::Proxy::Staticweb/Swift_proxy_config[filter:staticweb/use]/ensure: created[0m >2018-08-21 16:12:34,051 INFO: [mNotice: /Stage[main]/Swift::Proxy::Copy/Swift_proxy_config[filter:copy/object_post_as_copy]/value: value changed 'false' to 'True'[0m >2018-08-21 16:12:34,065 INFO: [mNotice: /Stage[main]/Swift::Proxy::Slo/Swift_proxy_config[filter:slo/max_manifest_segments]/ensure: created[0m >2018-08-21 16:12:34,073 INFO: [mNotice: /Stage[main]/Swift::Proxy::Slo/Swift_proxy_config[filter:slo/max_manifest_size]/ensure: created[0m >2018-08-21 16:12:34,082 INFO: [mNotice: /Stage[main]/Swift::Proxy::Slo/Swift_proxy_config[filter:slo/min_segment_size]/ensure: created[0m >2018-08-21 16:12:34,090 INFO: [mNotice: /Stage[main]/Swift::Proxy::Slo/Swift_proxy_config[filter:slo/rate_limit_after_segment]/ensure: created[0m >2018-08-21 16:12:34,098 INFO: [mNotice: /Stage[main]/Swift::Proxy::Slo/Swift_proxy_config[filter:slo/rate_limit_segments_per_sec]/ensure: created[0m >2018-08-21 16:12:34,107 INFO: [mNotice: /Stage[main]/Swift::Proxy::Slo/Swift_proxy_config[filter:slo/max_get_time]/ensure: created[0m >2018-08-21 16:12:34,121 INFO: [mNotice: /Stage[main]/Swift::Proxy::Dlo/Swift_proxy_config[filter:dlo/rate_limit_after_segment]/ensure: created[0m >2018-08-21 16:12:34,130 INFO: [mNotice: /Stage[main]/Swift::Proxy::Dlo/Swift_proxy_config[filter:dlo/rate_limit_segments_per_sec]/ensure: created[0m >2018-08-21 16:12:34,139 INFO: [mNotice: /Stage[main]/Swift::Proxy::Dlo/Swift_proxy_config[filter:dlo/max_get_time]/ensure: created[0m >2018-08-21 16:12:34,154 INFO: [mNotice: 
/Stage[main]/Swift::Proxy::Versioned_writes/Swift_proxy_config[filter:versioned_writes/allow_versioned_writes]/ensure: created[0m >2018-08-21 16:12:34,169 INFO: [mNotice: /Stage[main]/Swift::Proxy::Ratelimit/Swift_proxy_config[filter:ratelimit/clock_accuracy]/ensure: created[0m >2018-08-21 16:12:34,178 INFO: [mNotice: /Stage[main]/Swift::Proxy::Ratelimit/Swift_proxy_config[filter:ratelimit/max_sleep_time_seconds]/ensure: created[0m >2018-08-21 16:12:34,186 INFO: [mNotice: /Stage[main]/Swift::Proxy::Ratelimit/Swift_proxy_config[filter:ratelimit/log_sleep_time_seconds]/ensure: created[0m >2018-08-21 16:12:34,195 INFO: [mNotice: /Stage[main]/Swift::Proxy::Ratelimit/Swift_proxy_config[filter:ratelimit/rate_buffer_seconds]/ensure: created[0m >2018-08-21 16:12:34,204 INFO: [mNotice: /Stage[main]/Swift::Proxy::Ratelimit/Swift_proxy_config[filter:ratelimit/account_ratelimit]/ensure: created[0m >2018-08-21 16:12:34,251 INFO: [mNotice: /Stage[main]/Swift::Proxy::Formpost/Swift_proxy_config[filter:formpost/use]/ensure: created[0m >2018-08-21 16:12:34,278 INFO: [mNotice: /Stage[main]/Main/File[/srv/node]/ensure: created[0m >2018-08-21 16:12:34,288 INFO: [mNotice: /Stage[main]/Main/File[/srv/node/1]/ensure: created[0m >2018-08-21 16:12:35,636 INFO: [mNotice: /Stage[main]/Swift::Ringbuilder/Swift::Ringbuilder::Create[object]/Exec[create_object]/returns: executed successfully[0m >2018-08-21 16:12:36,932 INFO: [1;33mWarning: Unexpected line: Ring file /etc/swift/object.ring.gz not found, probably it hasn't been written yet[0m >2018-08-21 16:12:36,933 INFO: [1;33mWarning: Unexpected line: Devices: id region zone ip address:port replication ip:port name weight partitions balance flags meta[0m >2018-08-21 16:12:36,934 INFO: [1;33mWarning: Unexpected line: There are no devices in this ring, or all devices have been deleted[0m >2018-08-21 16:12:38,236 INFO: [mNotice: /Stage[main]/Main/Ring_object_device[192.168.24.1:6000/1]/ensure: created[0m >2018-08-21 16:12:39,546 INFO: [mNotice: 
/Stage[main]/Swift::Ringbuilder/Swift::Ringbuilder::Create[account]/Exec[create_account]/returns: executed successfully[0m >2018-08-21 16:12:40,846 INFO: [1;33mWarning: Unexpected line: Ring file /etc/swift/account.ring.gz not found, probably it hasn't been written yet[0m >2018-08-21 16:12:40,847 INFO: [1;33mWarning: Unexpected line: Devices: id region zone ip address:port replication ip:port name weight partitions balance flags meta[0m >2018-08-21 16:12:40,847 INFO: [1;33mWarning: Unexpected line: There are no devices in this ring, or all devices have been deleted[0m >2018-08-21 16:12:42,148 INFO: [mNotice: /Stage[main]/Main/Ring_account_device[192.168.24.1:6002/1]/ensure: created[0m >2018-08-21 16:12:43,474 INFO: [mNotice: /Stage[main]/Swift::Ringbuilder/Swift::Ringbuilder::Create[container]/Exec[create_container]/returns: executed successfully[0m >2018-08-21 16:12:44,777 INFO: [1;33mWarning: Unexpected line: Ring file /etc/swift/container.ring.gz not found, probably it hasn't been written yet[0m >2018-08-21 16:12:44,778 INFO: [1;33mWarning: Unexpected line: Devices: id region zone ip address:port replication ip:port name weight partitions balance flags meta[0m >2018-08-21 16:12:44,779 INFO: [1;33mWarning: Unexpected line: There are no devices in this ring, or all devices have been deleted[0m >2018-08-21 16:12:46,083 INFO: [mNotice: /Stage[main]/Main/Ring_container_device[192.168.24.1:6001/1]/ensure: created[0m >2018-08-21 16:12:47,666 INFO: [mNotice: /Stage[main]/Swift::Ringbuilder/Swift::Ringbuilder::Rebalance[object]/Exec[rebalance_object]: Triggered 'refresh' from 1 events[0m >2018-08-21 16:12:49,253 INFO: [mNotice: /Stage[main]/Swift::Ringbuilder/Swift::Ringbuilder::Rebalance[account]/Exec[rebalance_account]: Triggered 'refresh' from 1 events[0m >2018-08-21 16:12:50,842 INFO: [mNotice: /Stage[main]/Swift::Ringbuilder/Swift::Ringbuilder::Rebalance[container]/Exec[rebalance_container]: Triggered 'refresh' from 1 events[0m >2018-08-21 16:12:50,855 INFO: 
[mNotice: /Stage[main]/Swift::Storage::Account/Swift::Storage::Generic[account]/File[/etc/swift/account-server/]/owner: owner changed 'root' to 'swift'[0m >2018-08-21 16:12:50,857 INFO: [mNotice: /Stage[main]/Swift::Storage::Account/Swift::Storage::Generic[account]/File[/etc/swift/account-server/]/group: group changed 'root' to 'swift'[0m >2018-08-21 16:12:50,869 INFO: [mNotice: /Stage[main]/Swift::Storage::Container/Swift::Storage::Generic[container]/File[/etc/swift/container-server/]/owner: owner changed 'root' to 'swift'[0m >2018-08-21 16:12:50,871 INFO: [mNotice: /Stage[main]/Swift::Storage::Container/Swift::Storage::Generic[container]/File[/etc/swift/container-server/]/group: group changed 'root' to 'swift'[0m >2018-08-21 16:12:50,882 INFO: [mNotice: /Stage[main]/Swift::Storage::Object/Swift::Storage::Generic[object]/File[/etc/swift/object-server/]/owner: owner changed 'root' to 'swift'[0m >2018-08-21 16:12:50,884 INFO: [mNotice: /Stage[main]/Swift::Storage::Object/Swift::Storage::Generic[object]/File[/etc/swift/object-server/]/group: group changed 'root' to 'swift'[0m >2018-08-21 16:12:51,038 INFO: [mNotice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat[/etc/swift/account-server.conf]/File[/etc/swift/account-server.conf]/content: content changed '{md5}07e5a1a1e5a0ab83d745e20680eb32c1' to '{md5}b8e0acbe9cb3efa065f7bd431bbad4ef'[0m >2018-08-21 16:12:51,039 INFO: [mNotice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat[/etc/swift/account-server.conf]/File[/etc/swift/account-server.conf]/mode: mode changed '0640' to '0644'[0m >2018-08-21 16:12:52,181 INFO: [mNotice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat[/etc/swift/container-server.conf]/File[/etc/swift/container-server.conf]/content: content changed '{md5}4998257eb89ff63e838b37686ebb1ee7' to '{md5}13f9cc9533fc5e261656ac245b1024c9'[0m >2018-08-21 16:12:52,183 INFO: [mNotice: 
/Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat[/etc/swift/container-server.conf]/File[/etc/swift/container-server.conf]/mode: mode changed '0640' to '0644'[0m >2018-08-21 16:12:52,299 INFO: [mNotice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat[/etc/swift/object-server.conf]/File[/etc/swift/object-server.conf]/content: content changed '{md5}8c3bfdea900f37c8b2cbd5d9fe5d664c' to '{md5}8c6d972ba5741359f37e8d246b67292f'[0m >2018-08-21 16:12:52,300 INFO: [mNotice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat[/etc/swift/object-server.conf]/File[/etc/swift/object-server.conf]/mode: mode changed '0640' to '0644'[0m >2018-08-21 16:12:52,310 INFO: [mNotice: /Stage[main]/Swift::Deps/Anchor[swift::config::end]: Triggered 'refresh' from 62 events[0m >2018-08-21 16:12:52,312 INFO: [mNotice: /Stage[main]/Swift::Deps/Anchor[swift::service::begin]: Triggered 'refresh' from 3 events[0m >2018-08-21 16:12:52,346 INFO: [mNotice: /Stage[main]/Heat::Wsgi::Apache_api/Heat::Wsgi::Apache[api]/Openstacklib::Wsgi::Apache[heat_api_wsgi]/File[/var/www/cgi-bin/heat]/ensure: created[0m >2018-08-21 16:12:52,400 INFO: [mNotice: /Stage[main]/Heat::Wsgi::Apache_api/Heat::Wsgi::Apache[api]/Openstacklib::Wsgi::Apache[heat_api_wsgi]/File[heat_api_wsgi]/ensure: defined content as '{md5}640891728ce5d46ae40234228561597c'[0m >2018-08-21 16:12:52,475 INFO: [mNotice: /Stage[main]/Heat::Wsgi::Apache_api_cfn/Heat::Wsgi::Apache[api_cfn]/Openstacklib::Wsgi::Apache[heat_api_cfn_wsgi]/File[heat_api_cfn_wsgi]/ensure: defined content as '{md5}c3ae61ab87649c8cdfab8977da2b194b'[0m >2018-08-21 16:12:52,601 INFO: [mNotice: /Stage[main]/Ironic::Pxe/Apache::Vhost[ipxe_vhost]/Concat[10-ipxe_vhost.conf]/File[/etc/httpd/conf.d/10-ipxe_vhost.conf]/ensure: defined content as '{md5}0ffa81700d1dc962149c4ec89737928f'[0m >2018-08-21 16:12:52,881 INFO: [mNotice: 
/Stage[main]/Mistral::Db::Mysql/Openstacklib::Db::Mysql[mistral]/Openstacklib::Db::Mysql::Host_access[mistral_%]/Mysql_user[mistral@%]/ensure: created[0m >2018-08-21 16:12:52,969 INFO: [mNotice: /Stage[main]/Mistral::Db::Mysql/Openstacklib::Db::Mysql[mistral]/Openstacklib::Db::Mysql::Host_access[mistral_%]/Mysql_grant[mistral@%/mistral.*]/ensure: created[0m >2018-08-21 16:12:53,307 INFO: [mNotice: /Stage[main]/Mistral::Db::Mysql/Openstacklib::Db::Mysql[mistral]/Openstacklib::Db::Mysql::Host_access[mistral_192.168.24.1]/Mysql_user[mistral@192.168.24.1]/ensure: created[0m >2018-08-21 16:12:53,397 INFO: [mNotice: /Stage[main]/Mistral::Db::Mysql/Openstacklib::Db::Mysql[mistral]/Openstacklib::Db::Mysql::Host_access[mistral_192.168.24.1]/Mysql_grant[mistral@192.168.24.1/mistral.*]/ensure: created[0m >2018-08-21 16:12:53,489 INFO: [mNotice: /Stage[main]/Mistral::Deps/Anchor[mistral::db::end]: Triggered 'refresh' from 1 events[0m >2018-08-21 16:12:53,492 INFO: [mNotice: /Stage[main]/Mistral::Deps/Anchor[mistral::dbsync::begin]: Triggered 'refresh' from 1 events[0m >2018-08-21 16:13:48,834 INFO: [mNotice: /Stage[main]/Mistral::Db::Sync/Exec[mistral-db-sync]/returns: executed successfully[0m >2018-08-21 16:13:56,693 INFO: [mNotice: /Stage[main]/Mistral::Db::Sync/Exec[mistral-db-sync]: Triggered 'refresh' from 3 events[0m >2018-08-21 16:13:56,800 INFO: [mNotice: /Stage[main]/Tripleo::Ui/Apache::Vhost[tripleo-ui]/Concat[25-tripleo-ui.conf]/File[/etc/httpd/conf.d/25-tripleo-ui.conf]/ensure: defined content as '{md5}51d2b4043d24554641c7d22b571648a9'[0m >2018-08-21 16:13:57,081 INFO: [mNotice: /Stage[main]/Zaqar::Db::Mysql/Openstacklib::Db::Mysql[zaqar]/Openstacklib::Db::Mysql::Host_access[zaqar_%]/Mysql_user[zaqar@%]/ensure: created[0m >2018-08-21 16:13:57,173 INFO: [mNotice: /Stage[main]/Zaqar::Db::Mysql/Openstacklib::Db::Mysql[zaqar]/Openstacklib::Db::Mysql::Host_access[zaqar_%]/Mysql_grant[zaqar@%/zaqar.*]/ensure: created[0m >2018-08-21 16:13:57,525 INFO: [mNotice: 
/Stage[main]/Zaqar::Db::Mysql/Openstacklib::Db::Mysql[zaqar]/Openstacklib::Db::Mysql::Host_access[zaqar_192.168.24.1]/Mysql_user[zaqar@192.168.24.1]/ensure: created
>2018-08-21 16:13:57,612 INFO: Notice: /Stage[main]/Zaqar::Db::Mysql/Openstacklib::Db::Mysql[zaqar]/Openstacklib::Db::Mysql::Host_access[zaqar_192.168.24.1]/Mysql_grant[zaqar@192.168.24.1/zaqar.*]/ensure: created
>2018-08-21 16:13:57,705 INFO: Notice: /Stage[main]/Zaqar::Deps/Anchor[zaqar::db::end]: Triggered 'refresh' from 1 events
>2018-08-21 16:13:57,708 INFO: Notice: /Stage[main]/Zaqar::Deps/Anchor[zaqar::dbsync::begin]: Triggered 'refresh' from 1 events
>2018-08-21 16:14:04,615 INFO: Notice: /Stage[main]/Zaqar::Db::Sync/Exec[zaqar-db-sync]: Triggered 'refresh' from 3 events
>2018-08-21 16:14:04,618 INFO: Notice: /Stage[main]/Zaqar::Deps/Anchor[zaqar::dbsync::end]: Triggered 'refresh' from 1 events
>2018-08-21 16:14:04,621 INFO: Notice: /Stage[main]/Zaqar::Deps/Anchor[zaqar::service::begin]: Triggered 'refresh' from 3 events
>2018-08-21 16:14:04,863 INFO: Notice: /Stage[main]/Zaqar::Server/Service[openstack-zaqar]: Triggered 'refresh' from 1 events
>2018-08-21 16:14:06,198 INFO: Notice: /Stage[main]/Main/Zaqar::Server_instance[1]/Service[openstack-zaqar@1]/ensure: ensure changed 'stopped' to 'running'
>2018-08-21 16:14:06,201 INFO: Notice: /Stage[main]/Zaqar::Deps/Anchor[zaqar::service::end]: Triggered 'refresh' from 2 events
>2018-08-21 16:14:06,345 INFO: Notice: /Stage[main]/Nova::Wsgi::Apache_api/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat[10-nova_api_wsgi.conf]/File[/etc/httpd/conf.d/10-nova_api_wsgi.conf]/ensure: defined content as '{md5}e287da578698e34f2d7c3644322f6011'
>2018-08-21 16:14:06,470 INFO: Notice: /Stage[main]/Nova::Wsgi::Apache_placement/Openstacklib::Wsgi::Apache[placement_wsgi]/Apache::Vhost[placement_wsgi]/Concat[10-placement_wsgi.conf]/File[/etc/httpd/conf.d/10-placement_wsgi.conf]/ensure: defined content as '{md5}85537068d05fb648807fe04e8c54c007'
>2018-08-21 16:14:07,663 INFO: Notice: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat[10-ironic_wsgi.conf]/File[/etc/httpd/conf.d/10-ironic_wsgi.conf]/ensure: defined content as '{md5}9dd675a7e8193e3ccbd26db3c745ea52'
>2018-08-21 16:14:07,788 INFO: Notice: /Stage[main]/Zaqar::Wsgi::Apache/Openstacklib::Wsgi::Apache[zaqar_wsgi]/Apache::Vhost[zaqar_wsgi]/Concat[10-zaqar_wsgi.conf]/File[/etc/httpd/conf.d/10-zaqar_wsgi.conf]/ensure: defined content as '{md5}91417b33009d0a530264b710cd576066'
>2018-08-21 16:14:07,923 INFO: Notice: /Stage[main]/Heat::Wsgi::Apache_api/Heat::Wsgi::Apache[api]/Openstacklib::Wsgi::Apache[heat_api_wsgi]/Apache::Vhost[heat_api_wsgi]/Concat[10-heat_api_wsgi.conf]/File[/etc/httpd/conf.d/10-heat_api_wsgi.conf]/ensure: defined content as '{md5}d2d9ba421450fd969ac3b02c530c8336'
>2018-08-21 16:14:08,056 INFO: Notice: /Stage[main]/Heat::Wsgi::Apache_api_cfn/Heat::Wsgi::Apache[api_cfn]/Openstacklib::Wsgi::Apache[heat_api_cfn_wsgi]/Apache::Vhost[heat_api_cfn_wsgi]/Concat[10-heat_api_cfn_wsgi.conf]/File[/etc/httpd/conf.d/10-heat_api_cfn_wsgi.conf]/ensure: defined content as '{md5}4c4fbb4f9e2bf5242c619b90607acca8'
>2018-08-21 16:14:08,117 INFO: Notice: /Stage[main]/Keystone::Deps/Anchor[keystone::config::end]: Triggered 'refresh' from 41 events
>2018-08-21 16:14:13,149 INFO: Notice: /Stage[main]/Keystone/Exec[keystone-manage fernet_setup]: Triggered 'refresh' from 2 events
>2018-08-21 16:14:18,203 INFO: Notice: /Stage[main]/Keystone/Exec[keystone-manage credential_setup]: Triggered 'refresh' from 2 events
>2018-08-21 16:14:18,294 INFO: Notice: /Stage[main]/Keystone::Db::Mysql/Openstacklib::Db::Mysql[keystone]/Mysql_database[keystone]/ensure: created
>2018-08-21 16:14:18,559 INFO: Notice: /Stage[main]/Keystone::Db::Mysql/Openstacklib::Db::Mysql[keystone]/Openstacklib::Db::Mysql::Host_access[keystone_%]/Mysql_user[keystone@%]/ensure: created
>2018-08-21 16:14:18,646 INFO: Notice: /Stage[main]/Keystone::Db::Mysql/Openstacklib::Db::Mysql[keystone]/Openstacklib::Db::Mysql::Host_access[keystone_%]/Mysql_grant[keystone@%/keystone.*]/ensure: created
>2018-08-21 16:14:18,987 INFO: Notice: /Stage[main]/Keystone::Db::Mysql/Openstacklib::Db::Mysql[keystone]/Openstacklib::Db::Mysql::Host_access[keystone_192.168.24.1]/Mysql_user[keystone@192.168.24.1]/ensure: created
>2018-08-21 16:14:19,076 INFO: Notice: /Stage[main]/Keystone::Db::Mysql/Openstacklib::Db::Mysql[keystone]/Openstacklib::Db::Mysql::Host_access[keystone_192.168.24.1]/Mysql_grant[keystone@192.168.24.1/keystone.*]/ensure: created
>2018-08-21 16:14:19,165 INFO: Notice: /Stage[main]/Keystone::Deps/Anchor[keystone::db::end]: Triggered 'refresh' from 1 events
>2018-08-21 16:14:19,168 INFO: Notice: /Stage[main]/Keystone::Deps/Anchor[keystone::dbsync::begin]: Triggered 'refresh' from 1 events
>2018-08-21 16:15:32,151 INFO: Notice: /Stage[main]/Keystone::Db::Sync/Exec[keystone-manage db_sync]/returns: executed successfully
>2018-08-21 16:15:39,432 INFO: Notice: /Stage[main]/Keystone::Db::Sync/Exec[keystone-manage db_sync]: Triggered 'refresh' from 3 events
>2018-08-21 16:15:39,436 INFO: Notice: /Stage[main]/Keystone::Deps/Anchor[keystone::dbsync::end]: Triggered 'refresh' from 2 events
>2018-08-21 16:15:47,270 INFO: Notice: /Stage[main]/Keystone/Exec[keystone-manage bootstrap]: Triggered 'refresh' from 1 events
>2018-08-21 16:15:47,273 INFO: Notice: /Stage[main]/Keystone::Deps/Anchor[keystone::service::begin]: Triggered 'refresh' from 6 events
>2018-08-21 16:15:48,789 INFO: Notice: /Stage[main]/Apache::Service/Service[httpd]/ensure: ensure changed 'stopped' to 'running'
>2018-08-21 16:15:48,793 INFO: Notice: /Stage[main]/Keystone::Deps/Anchor[keystone::service::end]: Triggered 'refresh' from 38 events
>2018-08-21 16:16:10,771 INFO: Notice: /Stage[main]/Heat::Keystone::Auth/Keystone_role[heat_stack_user]/ensure: created
>2018-08-21 16:16:21,568 INFO: Notice: /Stage[main]/Swift::Keystone::Auth/Keystone_role[swiftoperator]/ensure: created
>2018-08-21 16:16:43,007 INFO: Notice: /Stage[main]/Heat::Keystone::Domain/Keystone_domain[heat_stack]/ensure: created
>2018-08-21 16:17:04,414 INFO: Notice: /Stage[main]/Heat::Keystone::Domain/Keystone_user[heat_admin::heat_stack]/ensure: created
>2018-08-21 16:17:17,346 INFO: Notice: /Stage[main]/Heat::Keystone::Domain/Keystone_user_role[heat_admin::heat_stack@::heat_stack]/ensure: created
>2018-08-21 16:17:25,383 INFO: Notice: /Stage[main]/Keystone::Endpoint/Keystone::Resource::Service_identity[keystone]/Keystone_service[keystone::identity]/ensure: created
>2018-08-21 16:17:45,909 INFO: Notice: /Stage[main]/Keystone::Endpoint/Keystone::Resource::Service_identity[keystone]/Keystone_endpoint[regionOne/keystone::identity]/ensure: created
>2018-08-21 16:17:45,965 INFO: Warning: Puppet::Type::Keystone_tenant::ProviderOpenstack: Support for a resource without the domain set is deprecated in Liberty cycle. It will be dropped in the M-cycle. Currently using 'Default' as default domain name while the default domain id is '3715c6b1d5924d21b29ad871ab4e6305'.
>2018-08-21 16:17:58,152 INFO: Notice: /Stage[main]/Keystone::Roles::Admin/Keystone_tenant[service]/ensure: created
>2018-08-21 16:17:58,157 INFO: Notice: /Stage[main]/Keystone::Roles::Admin/Keystone_tenant[admin]/description: description changed 'Bootstrap project for initializing the cloud.'
to 'admin tenant'
>2018-08-21 16:18:16,789 INFO: Notice: /Stage[main]/Keystone::Roles::Admin/Keystone_user[admin]/email: defined 'email' as 'root@localhost'
>2018-08-21 16:18:43,607 INFO: Notice: /Stage[main]/Heat::Keystone::Auth/Keystone::Resource::Service_identity[heat]/Keystone_user[heat]/ensure: created
>2018-08-21 16:19:00,318 INFO: Notice: /Stage[main]/Heat::Keystone::Auth/Keystone::Resource::Service_identity[heat]/Keystone_user_role[heat@service]/ensure: created
>2018-08-21 16:19:04,391 INFO: Notice: /Stage[main]/Heat::Keystone::Auth/Keystone::Resource::Service_identity[heat]/Keystone_service[heat::orchestration]/ensure: created
>2018-08-21 16:19:20,961 INFO: Notice: /Stage[main]/Heat::Keystone::Auth/Keystone::Resource::Service_identity[heat]/Keystone_endpoint[regionOne/heat::orchestration]/ensure: created
>2018-08-21 16:19:30,270 INFO: Notice: /Stage[main]/Heat::Keystone::Auth_cfn/Keystone::Resource::Service_identity[heat-cfn]/Keystone_user[heat-cfn]/ensure: created
>2018-08-21 16:19:43,091 INFO: Notice: /Stage[main]/Heat::Keystone::Auth_cfn/Keystone::Resource::Service_identity[heat-cfn]/Keystone_user_role[heat-cfn@service]/ensure: created
>2018-08-21 16:19:43,095 INFO: Notice: /Stage[main]/Heat::Deps/Anchor[heat::service::end]: Triggered 'refresh' from 5 events
>2018-08-21 16:19:47,037 INFO: Notice: /Stage[main]/Heat::Keystone::Auth_cfn/Keystone::Resource::Service_identity[heat-cfn]/Keystone_service[heat-cfn::cloudformation]/ensure: created
>2018-08-21 16:20:03,384 INFO: Notice: /Stage[main]/Heat::Keystone::Auth_cfn/Keystone::Resource::Service_identity[heat-cfn]/Keystone_endpoint[regionOne/heat-cfn::cloudformation]/ensure: created
>2018-08-21 16:20:13,701 INFO: Notice: /Stage[main]/Neutron::Keystone::Auth/Keystone::Resource::Service_identity[neutron]/Keystone_user[neutron]/ensure: created
>2018-08-21 16:20:26,391 INFO: Notice: /Stage[main]/Neutron::Keystone::Auth/Keystone::Resource::Service_identity[neutron]/Keystone_user_role[neutron@service]/ensure: created
>2018-08-21 16:20:30,467 INFO: Notice: /Stage[main]/Neutron::Keystone::Auth/Keystone::Resource::Service_identity[neutron]/Keystone_service[neutron::network]/ensure: created
>2018-08-21 16:20:46,558 INFO: Notice: /Stage[main]/Neutron::Keystone::Auth/Keystone::Resource::Service_identity[neutron]/Keystone_endpoint[regionOne/neutron::network]/ensure: created
>2018-08-21 16:21:24,461 INFO: Notice: /Stage[main]/Neutron::Server/Service[neutron-server]/ensure: ensure changed 'stopped' to 'running'
>2018-08-21 16:21:33,807 INFO: Notice: /Stage[main]/Glance::Keystone::Auth/Keystone::Resource::Service_identity[glance]/Keystone_user[glance]/ensure: created
>2018-08-21 16:21:46,409 INFO: Notice: /Stage[main]/Glance::Keystone::Auth/Keystone::Resource::Service_identity[glance]/Keystone_user_role[glance@service]/ensure: created
>2018-08-21 16:21:50,336 INFO: Notice: /Stage[main]/Glance::Keystone::Auth/Keystone::Resource::Service_identity[glance]/Keystone_service[glance::image]/ensure: created
>2018-08-21 16:22:06,462 INFO: Notice: /Stage[main]/Glance::Keystone::Auth/Keystone::Resource::Service_identity[glance]/Keystone_endpoint[regionOne/glance::image]/ensure: created
>2018-08-21 16:22:06,466 INFO: Notice: /Stage[main]/Glance::Deps/Anchor[glance::service::begin]: Triggered 'refresh' from 5 events
>2018-08-21 16:22:15,759 INFO: Notice: /Stage[main]/Nova::Keystone::Auth/Keystone::Resource::Service_identity[nova]/Keystone_user[nova]/ensure: created
>2018-08-21 16:22:28,365 INFO: Notice: /Stage[main]/Nova::Keystone::Auth/Keystone::Resource::Service_identity[nova]/Keystone_user_role[nova@service]/ensure: created
>2018-08-21 16:22:32,386 INFO: Notice: /Stage[main]/Nova::Keystone::Auth/Keystone::Resource::Service_identity[nova]/Keystone_service[nova::compute]/ensure: created
>2018-08-21 16:22:48,753 INFO: Notice: /Stage[main]/Nova::Keystone::Auth/Keystone::Resource::Service_identity[nova]/Keystone_endpoint[regionOne/nova::compute]/ensure: created
>2018-08-21 16:22:58,126 INFO: Notice: /Stage[main]/Nova::Keystone::Auth_placement/Keystone::Resource::Service_identity[placement]/Keystone_user[placement]/ensure: created
>2018-08-21 16:23:10,882 INFO: Notice: /Stage[main]/Nova::Keystone::Auth_placement/Keystone::Resource::Service_identity[placement]/Keystone_user_role[placement@service]/ensure: created
>2018-08-21 16:23:14,871 INFO: Notice: /Stage[main]/Nova::Keystone::Auth_placement/Keystone::Resource::Service_identity[placement]/Keystone_service[placement::placement]/ensure: created
>2018-08-21 16:23:31,102 INFO: Notice: /Stage[main]/Nova::Keystone::Auth_placement/Keystone::Resource::Service_identity[placement]/Keystone_endpoint[regionOne/placement::placement]/ensure: created
>2018-08-21 16:23:40,408 INFO: Notice: /Stage[main]/Swift::Keystone::Auth/Keystone::Resource::Service_identity[swift]/Keystone_user[swift]/ensure: created
>2018-08-21 16:23:53,130 INFO: Notice: /Stage[main]/Swift::Keystone::Auth/Keystone::Resource::Service_identity[swift]/Keystone_user_role[swift@service]/ensure: created
>2018-08-21 16:23:57,222 INFO: Notice: /Stage[main]/Swift::Keystone::Auth/Keystone::Resource::Service_identity[swift]/Keystone_service[swift::object-store]/ensure: created
>2018-08-21 16:24:13,264 INFO: Notice: /Stage[main]/Swift::Keystone::Auth/Keystone::Resource::Service_identity[swift]/Keystone_endpoint[regionOne/swift::object-store]/ensure: created
>2018-08-21 16:24:22,530 INFO: Notice: /Stage[main]/Ironic::Keystone::Auth/Keystone::Resource::Service_identity[ironic]/Keystone_user[ironic]/ensure: created
>2018-08-21 16:24:35,074 INFO: Notice: /Stage[main]/Ironic::Keystone::Auth/Keystone::Resource::Service_identity[ironic]/Keystone_user_role[ironic@service]/ensure: created
>2018-08-21 16:24:39,045 INFO: Notice: /Stage[main]/Ironic::Keystone::Auth/Keystone::Resource::Service_identity[ironic]/Keystone_service[ironic::baremetal]/ensure: created
>2018-08-21 16:24:55,163 INFO: Notice: /Stage[main]/Ironic::Keystone::Auth/Keystone::Resource::Service_identity[ironic]/Keystone_endpoint[regionOne/ironic::baremetal]/ensure: created
>2018-08-21 16:25:04,607 INFO: Notice: /Stage[main]/Ironic::Keystone::Auth_inspector/Keystone::Resource::Service_identity[ironic-inspector]/Keystone_user[ironic-inspector]/ensure: created
>2018-08-21 16:25:17,337 INFO: Notice: /Stage[main]/Ironic::Keystone::Auth_inspector/Keystone::Resource::Service_identity[ironic-inspector]/Keystone_user_role[ironic-inspector@service]/ensure: created
>2018-08-21 16:25:21,371 INFO: Notice: /Stage[main]/Ironic::Keystone::Auth_inspector/Keystone::Resource::Service_identity[ironic-inspector]/Keystone_service[ironic-inspector::baremetal-introspection]/ensure: created
>2018-08-21 16:25:37,518 INFO: Notice: /Stage[main]/Ironic::Keystone::Auth_inspector/Keystone::Resource::Service_identity[ironic-inspector]/Keystone_endpoint[regionOne/ironic-inspector::baremetal-introspection]/ensure: created
>2018-08-21 16:25:54,871 INFO: Notice: /Stage[main]/Nova::Api/Nova::Generic_service[api]/Service[nova-api]/ensure: ensure changed 'stopped' to 'running'
>2018-08-21 16:25:56,258 INFO: Notice: /Stage[main]/Swift::Proxy/Swift::Service[swift-proxy-server]/Service[swift-proxy-server]/ensure: ensure changed 'stopped' to 'running'
>2018-08-21 16:25:57,619 INFO: Notice: /Stage[main]/Swift::Objectexpirer/Swift::Service[swift-object-expirer]/Service[swift-object-expirer]/ensure: ensure changed 'stopped' to 'running'
>2018-08-21 16:26:06,999 INFO: Notice: /Stage[main]/Mistral::Keystone::Auth/Keystone::Resource::Service_identity[mistral]/Keystone_user[mistral]/ensure: created
>2018-08-21 16:26:19,667 INFO: Notice:
/Stage[main]/Mistral::Keystone::Auth/Keystone::Resource::Service_identity[mistral]/Keystone_user_role[mistral@service]/ensure: created
>2018-08-21 16:26:23,739 INFO: Notice: /Stage[main]/Mistral::Keystone::Auth/Keystone::Resource::Service_identity[mistral]/Keystone_service[mistral::workflowv2]/ensure: created
>2018-08-21 16:26:39,877 INFO: Notice: /Stage[main]/Mistral::Keystone::Auth/Keystone::Resource::Service_identity[mistral]/Keystone_endpoint[regionOne/mistral::workflowv2]/ensure: created
>2018-08-21 16:26:49,139 INFO: Notice: /Stage[main]/Zaqar::Keystone::Auth/Keystone::Resource::Service_identity[zaqar]/Keystone_user[zaqar]/ensure: created
>2018-08-21 16:26:54,219 INFO: Notice: /Stage[main]/Zaqar::Keystone::Auth/Keystone::Resource::Service_identity[zaqar]/Keystone_role[ResellerAdmin]/ensure: created
>2018-08-21 16:27:11,275 INFO: Notice: /Stage[main]/Zaqar::Keystone::Auth/Keystone::Resource::Service_identity[zaqar]/Keystone_user_role[zaqar@service]/ensure: created
>2018-08-21 16:27:15,275 INFO: Notice: /Stage[main]/Zaqar::Keystone::Auth/Keystone::Resource::Service_identity[zaqar]/Keystone_service[zaqar::messaging]/ensure: created
>2018-08-21 16:27:31,421 INFO: Notice: /Stage[main]/Zaqar::Keystone::Auth/Keystone::Resource::Service_identity[zaqar]/Keystone_endpoint[regionOne/zaqar::messaging]/ensure: created
>2018-08-21 16:27:40,786 INFO: Notice: /Stage[main]/Zaqar::Keystone::Auth_websocket/Keystone::Resource::Service_identity[zaqar-websocket]/Keystone_user[zaqar-websocket]/ensure: created
>2018-08-21 16:27:53,470 INFO: Notice: /Stage[main]/Zaqar::Keystone::Auth_websocket/Keystone::Resource::Service_identity[zaqar-websocket]/Keystone_user_role[zaqar-websocket@service]/ensure: created
>2018-08-21 16:27:57,480 INFO: Notice: /Stage[main]/Zaqar::Keystone::Auth_websocket/Keystone::Resource::Service_identity[zaqar-websocket]/Keystone_service[zaqar-websocket::messaging-websocket]/ensure: created
>2018-08-21 16:28:13,734 INFO: Notice: /Stage[main]/Zaqar::Keystone::Auth_websocket/Keystone::Resource::Service_identity[zaqar-websocket]/Keystone_endpoint[regionOne/zaqar-websocket::messaging-websocket]/ensure: created
>2018-08-21 16:28:15,153 INFO: Notice: /Stage[main]/Neutron::Agents::Ml2::Networking_baremetal/Service[ironic-neutron-agent-service]/ensure: ensure changed 'stopped' to 'running'
>2018-08-21 16:28:15,157 INFO: Notice: /Stage[main]/Neutron::Deps/Anchor[neutron::service::end]: Triggered 'refresh' from 6 events
>2018-08-21 16:28:16,539 INFO: Notice: /Stage[main]/Ironic::Inspector/Service[ironic-inspector]/ensure: ensure changed 'stopped' to 'running'
>2018-08-21 16:28:17,951 INFO: Notice: /Stage[main]/Ironic::Inspector/Service[ironic-inspector-dnsmasq]/ensure: ensure changed 'stopped' to 'running'
>2018-08-21 16:28:17,955 INFO: Notice: /Stage[main]/Ironic::Deps/Anchor[ironic-inspector::service::end]: Triggered 'refresh' from 1 events
>2018-08-21 16:29:00,233 INFO: Notice: /Stage[main]/Mistral::Db::Sync/Exec[mistral-db-populate]/returns: executed successfully
>2018-08-21 16:29:40,397 INFO: Notice: /Stage[main]/Mistral::Db::Sync/Exec[mistral-db-populate]: Triggered 'refresh' from 5 events
>2018-08-21 16:29:40,400 INFO: Notice: /Stage[main]/Mistral::Deps/Anchor[mistral::dbsync::end]: Triggered 'refresh' from 4 events
>2018-08-21 16:29:40,403 INFO: Notice: /Stage[main]/Mistral::Deps/Anchor[mistral::service::begin]: Triggered 'refresh' from 3 events
>2018-08-21 16:29:41,871 INFO: Notice: /Stage[main]/Mistral::Api/Service[mistral-api]/ensure: ensure changed 'stopped' to 'running'
>2018-08-21 16:29:43,252 INFO: Notice: /Stage[main]/Mistral::Engine/Service[mistral-engine]/ensure: ensure changed 'stopped' to 'running'
>2018-08-21 16:29:44,644 INFO: Notice: /Stage[main]/Mistral::Executor/Service[mistral-executor]/ensure: ensure changed 'stopped' to 'running'
>2018-08-21 16:29:44,647 INFO: Notice: /Stage[main]/Mistral::Deps/Anchor[mistral::service::end]: Triggered 'refresh' from 3 events
>2018-08-21 16:30:03,070 INFO: Notice: /Stage[main]/Nova::Compute/Nova::Generic_service[compute]/Service[nova-compute]/ensure: ensure changed 'stopped' to 'running'
>2018-08-21 16:30:03,074 INFO: Notice: /Stage[main]/Nova::Deps/Anchor[nova::service::end]: Triggered 'refresh' from 4 events
>2018-08-21 16:30:03,087 INFO: Notice: /Stage[main]/Nova::Logging/File[/var/log/nova/nova-manage.log]/seluser: seluser changed 'unconfined_u' to 'system_u'
>2018-08-21 16:30:21,239 INFO: Notice: /Stage[main]/Nova::Cell_v2::Discover_hosts/Exec[nova-cell_v2-discover_hosts]: Triggered 'refresh' from 2 events
>2018-08-21 16:30:22,755 INFO: Notice: /Stage[main]/Swift::Storage::Account/Swift::Service[swift-account-reaper]/Service[swift-account-reaper]/ensure: ensure changed 'stopped' to 'running'
>2018-08-21 16:30:24,199 INFO: Notice: /Stage[main]/Swift::Storage::Container/Swift::Service[swift-container-updater]/Service[swift-container-updater]/ensure: ensure changed 'stopped' to 'running'
>2018-08-21 16:30:25,634 INFO: Notice: /Stage[main]/Swift::Storage::Container/Swift::Service[swift-container-sync]/Service[swift-container-sync]/ensure: ensure changed 'stopped' to 'running'
>2018-08-21 16:30:27,053 INFO: Notice: /Stage[main]/Swift::Storage::Object/Swift::Service[swift-object-updater]/Service[swift-object-updater]/ensure: ensure changed 'stopped' to 'running'
>2018-08-21 16:30:28,478 INFO: Notice: /Stage[main]/Swift::Storage::Object/Swift::Service[swift-object-reconstructor]/Service[swift-object-reconstructor]/ensure: ensure changed 'stopped' to 'running'
>2018-08-21 16:30:29,937 INFO: Notice: /Stage[main]/Swift::Storage::Account/Swift::Storage::Generic[account]/Swift::Service[swift-account-server]/Service[swift-account-server]/ensure: ensure changed 'stopped' to 'running'
>2018-08-21 16:30:30,164 INFO: Notice: /Stage[main]/Swift::Storage::Account/Swift::Storage::Generic[account]/Swift::Service[swift-account-replicator]/Service[swift-account-replicator]: Triggered 'refresh' from 4 events
>2018-08-21 16:30:30,402 INFO: Notice: /Stage[main]/Swift::Storage::Account/Swift::Storage::Generic[account]/Swift::Service[swift-account-auditor]/Service[swift-account-auditor]: Triggered 'refresh' from 4 events
>2018-08-21 16:30:31,884 INFO: Notice: /Stage[main]/Swift::Storage::Container/Swift::Storage::Generic[container]/Swift::Service[swift-container-server]/Service[swift-container-server]/ensure: ensure changed 'stopped' to 'running'
>2018-08-21 16:30:32,121 INFO: Notice: /Stage[main]/Swift::Storage::Container/Swift::Storage::Generic[container]/Swift::Service[swift-container-replicator]/Service[swift-container-replicator]: Triggered 'refresh' from 4 events
>2018-08-21 16:30:33,419 INFO: Notice: /Stage[main]/Swift::Storage::Container/Swift::Storage::Generic[container]/Swift::Service[swift-container-auditor]/Service[swift-container-auditor]: Triggered 'refresh' from 4 events
>2018-08-21 16:30:34,897 INFO: Notice: /Stage[main]/Swift::Storage::Object/Swift::Storage::Generic[object]/Swift::Service[swift-object-server]/Service[swift-object-server]/ensure: ensure changed 'stopped' to 'running'
>2018-08-21 16:30:35,139 INFO: Notice: /Stage[main]/Swift::Storage::Object/Swift::Storage::Generic[object]/Swift::Service[swift-object-replicator]/Service[swift-object-replicator]: Triggered 'refresh' from 4 events
>2018-08-21 16:30:35,381 INFO: Notice: /Stage[main]/Swift::Storage::Object/Swift::Storage::Generic[object]/Swift::Service[swift-object-auditor]/Service[swift-object-auditor]: Triggered 'refresh' from 4 events
>2018-08-21 16:30:35,385 INFO: Notice: /Stage[main]/Swift::Deps/Anchor[swift::service::end]: Triggered 'refresh' from 16 events
>2018-08-21 16:30:36,848 INFO: Notice: /Stage[main]/Glance::Api/Service[glance-api]/ensure: ensure changed
'stopped' to 'running'
>2018-08-21 16:30:36,857 INFO: Notice: /Stage[main]/Glance::Deps/Anchor[glance::service::end]: Triggered 'refresh' from 1 events
>2018-08-21 16:30:40,656 INFO: Notice: /Stage[main]/Tripleo::Firewall::Post/Tripleo::Firewall::Rule[998 log all]/Firewall[998 log all ipv4]/ensure: created
>2018-08-21 16:30:43,119 INFO: Notice: /Stage[main]/Tripleo::Firewall::Post/Tripleo::Firewall::Rule[998 log all]/Firewall[998 log all ipv6]/ensure: created
>2018-08-21 16:30:46,924 INFO: Notice: /Stage[main]/Tripleo::Firewall::Post/Tripleo::Firewall::Rule[999 drop all]/Firewall[999 drop all ipv4]/ensure: created
>2018-08-21 16:30:49,184 INFO: Notice: /Stage[main]/Tripleo::Firewall::Post/Tripleo::Firewall::Rule[999 drop all]/Firewall[999 drop all ipv6]/ensure: created
>2018-08-21 16:30:49,385 INFO: Notice: /Stage[main]/Firewall::Linux::Redhat/File[/etc/sysconfig/iptables]/seluser: seluser changed 'unconfined_u' to 'system_u'
>2018-08-21 16:30:49,395 INFO: Notice: /Stage[main]/Firewall::Linux::Redhat/File[/etc/sysconfig/ip6tables]/seluser: seluser changed 'unconfined_u' to 'system_u'
>2018-08-21 16:30:57,010 INFO: Notice: Applied catalog in 3755.13 seconds
>2018-08-21 16:30:57,357 INFO: Changes:
>2018-08-21 16:30:57,357 INFO: Total: 1079
>2018-08-21 16:30:57,358 INFO: Events:
>2018-08-21 16:30:57,358 INFO: Success: 1079
>2018-08-21 16:30:57,359 INFO: Total: 1079
>2018-08-21 16:30:57,359 INFO: Resources:
>2018-08-21 16:30:57,360 INFO: Changed: 1068
>2018-08-21 16:30:57,360 INFO: Out of sync: 1068
>2018-08-21 16:30:57,360 INFO: Restarted: 112
>2018-08-21 16:30:57,361 INFO: Total: 2788
>2018-08-21 16:30:57,361 INFO: Time:
>2018-08-21 16:30:57,362 INFO: Filebucket: 0.00
>2018-08-21 16:30:57,362 INFO: Policy rcd: 0.00
>2018-08-21 16:30:57,363 INFO: Schedule: 0.00
>2018-08-21 16:30:57,363 INFO: Neutron api config: 0.00
>2018-08-21 16:30:57,363 INFO: Resources: 0.01
>2018-08-21 16:30:57,364 INFO: Swift config: 0.01
>2018-08-21 16:30:57,364 INFO: Concat file: 0.01
>2018-08-21 16:30:57,365 INFO: Glance swift config: 0.02
>2018-08-21 16:30:57,365 INFO: Nova paste api ini: 0.02
>2018-08-21 16:30:57,366 INFO: Ironic neutron agent config: 0.04
>2018-08-21 16:30:57,366 INFO: Concat fragment: 0.04
>2018-08-21 16:30:57,366 INFO: Anchor: 0.04
>2018-08-21 16:30:57,367 INFO: Swift object expirer config: 0.04
>2018-08-21 16:30:57,367 INFO: Neutron l3 agent config: 0.08
>2018-08-21 16:30:57,368 INFO: Archive: 0.09
>2018-08-21 16:30:57,368 INFO: Neutron plugin ml2: 0.13
>2018-08-21 16:30:57,369 INFO: Neutron dhcp agent config: 0.14
>2018-08-21 16:30:57,369 INFO: Vs bridge: 0.14
>2018-08-21 16:30:57,370 INFO: Cron: 0.15
>2018-08-21 16:30:57,370 INFO: Sysctl runtime: 0.16
>2018-08-21 16:30:57,370 INFO: Group: 0.20
>2018-08-21 16:30:57,371 INFO: Mistral config: 0.28
>2018-08-21 16:30:57,371 INFO: Swift proxy config: 0.49
>2018-08-21 16:30:57,372 INFO: Neutron agent ovs: 0.57
>2018-08-21 16:30:57,372 INFO: User: 0.68
>2018-08-21 16:30:57,373 INFO: Mysql database: 0.99
>2018-08-21 16:30:57,373 INFO: Sysctl: 1.24
>2018-08-21 16:30:57,373 INFO: Glance cache config: 1.44
>2018-08-21 16:30:57,374 INFO: Glance registry config: 1.67
>2018-08-21 16:30:57,374 INFO: Keystone domain: 10.76
>2018-08-21 16:30:57,375 INFO: Nova config: 11.10
>2018-08-21 16:30:57,375 INFO: Heat config: 11.79
>2018-08-21 16:30:57,376 INFO: Glance api config: 12.72
>2018-08-21 16:30:57,377 INFO: File: 12.85
>2018-08-21 16:30:57,377 INFO: Firewall: 126.65
>2018-08-21 16:30:57,377 INFO: Mysql datadir: 14.42
>2018-08-21 16:30:57,378 INFO: Package: 1499.44
>2018-08-21 16:30:57,378 INFO: Keystone user: 153.87
>2018-08-21 16:30:57,379 INFO: Last run: 1534858257
>2018-08-21 16:30:57,379 INFO: Nova cell v2: 17.95
>2018-08-21 16:30:57,380 INFO: Keystone user role: 185.98
>2018-08-21 16:30:57,380 INFO: Service: 186.09
>2018-08-21 16:30:57,380 INFO: Ring object device: 2.60
>2018-08-21 16:30:57,381 INFO: Ring account device: 2.60
>2018-08-21 16:30:57,381 INFO: Ring container device: 2.61
>2018-08-21 16:30:57,382 INFO: Keystone endpoint: 211.04
>2018-08-21 16:30:57,382 INFO: Keystone role: 26.66
>2018-08-21 16:30:57,383 INFO: Ironic config: 27.84
>2018-08-21 16:30:57,383 INFO: Zaqar config: 3.90
>2018-08-21 16:30:57,384 INFO: Total: 3524.44
>2018-08-21 16:30:57,384 INFO: Mysql grant: 4.02
>2018-08-21 16:30:57,384 INFO: Augeas: 4.77
>2018-08-21 16:30:57,385 INFO: Ironic inspector config: 4.79
>2018-08-21 16:30:57,385 INFO: Keystone service: 52.18
>2018-08-21 16:30:57,386 INFO: Mysql user: 6.01
>2018-08-21 16:30:57,386 INFO: Rabbitmq plugin: 7.30
>2018-08-21 16:30:57,387 INFO: Keystone tenant: 8.35
>2018-08-21 16:30:57,387 INFO: Neutron config: 8.40
>2018-08-21 16:30:57,387 INFO: Exec: 805.30
>2018-08-21 16:30:57,388 INFO: Config retrieval: 84.20
>2018-08-21 16:30:57,388 INFO: Keystone config: 9.60
>2018-08-21 16:30:57,389 INFO: Version:
>2018-08-21 16:30:57,389 INFO: Config: 1534854418
>2018-08-21 16:30:57,390 INFO: Puppet: 4.8.2
>2018-08-21 16:31:38,230 INFO: + rc=2
>2018-08-21 16:31:38,231 INFO: + set -e
>2018-08-21 16:31:38,231 INFO: + echo 'puppet apply exited with exit code 2'
>2018-08-21 16:31:38,232 INFO: puppet apply exited with exit code 2
>2018-08-21 16:31:38,232 INFO: + '[' 2 '!=' 2 -a 2 '!=' 0 ']'
>2018-08-21 16:31:38,239 INFO: dib-run-parts Tue Aug 21 16:31:38 IDT 2018 50-puppet-stack-config completed
>2018-08-21 16:31:38,243 INFO: dib-run-parts Tue Aug 21 16:31:38 IDT 2018 ----------------------- PROFILING -----------------------
>2018-08-21 16:31:38,247 INFO: dib-run-parts Tue Aug 21 16:31:38 IDT 2018
>2018-08-21 16:31:38,253 INFO: dib-run-parts Tue Aug 21 16:31:38 IDT 2018 Target: configure.d
>2018-08-21 16:31:38,256 INFO: dib-run-parts Tue Aug 21 16:31:38 IDT 2018
>2018-08-21 16:31:38,260 INFO: dib-run-parts Tue Aug 21 16:31:38 IDT 2018 Script Seconds
>2018-08-21 16:31:38,264 INFO: dib-run-parts Tue Aug 21 16:31:38 IDT 2018 --------------------------------------- ----------
>2018-08-21
16:31:38,268 INFO: dib-run-parts Tue Aug 21 16:31:38 IDT 2018
>2018-08-21 16:31:38,293 INFO: dib-run-parts Tue Aug 21 16:31:38 IDT 2018 10-hiera-disable 0.009
>2018-08-21 16:31:38,309 INFO: dib-run-parts Tue Aug 21 16:31:38 IDT 2018 20-os-apply-config 0.540
>2018-08-21 16:31:38,325 INFO: dib-run-parts Tue Aug 21 16:31:38 IDT 2018 30-reload-keepalived 0.033
>2018-08-21 16:31:38,341 INFO: dib-run-parts Tue Aug 21 16:31:38 IDT 2018 40-hiera-datafiles 0.530
>2018-08-21 16:31:38,357 INFO: dib-run-parts Tue Aug 21 16:31:38 IDT 2018 50-puppet-stack-config 3895.410
>2018-08-21 16:31:38,365 INFO: dib-run-parts Tue Aug 21 16:31:38 IDT 2018
>2018-08-21 16:31:38,368 INFO: dib-run-parts Tue Aug 21 16:31:38 IDT 2018 --------------------- END PROFILING ---------------------
>2018-08-21 16:31:38,370 INFO: [2018-08-21 16:31:38,369] (os-refresh-config) [INFO] Completed phase configure
>2018-08-21 16:31:38,371 INFO: [2018-08-21 16:31:38,371] (os-refresh-config) [INFO] Starting phase post-configure
>2018-08-21 16:31:38,401 INFO: dib-run-parts Tue Aug 21 16:31:38 IDT 2018 Running /usr/libexec/os-refresh-config/post-configure.d/10-iptables
>2018-08-21 16:31:38,409 INFO: + set -o pipefail
>2018-08-21 16:31:38,410 INFO: + EXTERNAL_BRIDGE=br-ctlplane
>2018-08-21 16:31:38,410 INFO: + iptables -w -t nat -C PREROUTING -d 169.254.169.254/32 -i br-ctlplane -p tcp -m tcp --dport 80 -j REDIRECT --to-ports 8775
>2018-08-21 16:31:38,428 INFO: iptables: No chain/target/match by that name.
>2018-08-21 16:31:38,429 INFO: + iptables -w -t nat -I PREROUTING -d 169.254.169.254/32 -i br-ctlplane -p tcp -m tcp --dport 80 -j REDIRECT --to-ports 8775
>2018-08-21 16:31:38,464 INFO: dib-run-parts Tue Aug 21 16:31:38 IDT 2018 10-iptables completed
>2018-08-21 16:31:38,467 INFO: dib-run-parts Tue Aug 21 16:31:38 IDT 2018 Running /usr/libexec/os-refresh-config/post-configure.d/80-seedstack-masquerade
>2018-08-21 16:31:38,475 INFO: + RULES_SCRIPT=/var/opt/undercloud-stack/masquerade
>2018-08-21 16:31:38,476 INFO: + . /var/opt/undercloud-stack/masquerade
>2018-08-21 16:31:38,477 INFO: ++ IPTCOMMAND=iptables
>2018-08-21 16:31:38,477 INFO: ++ [[ 192.168.24.1 =~ : ]]
>2018-08-21 16:31:38,477 INFO: ++ iptables -w -t nat -F BOOTSTACK_MASQ_NEW
>2018-08-21 16:31:38,480 INFO: iptables: No chain/target/match by that name.
>2018-08-21 16:31:38,481 INFO: ++ true
>2018-08-21 16:31:38,482 INFO: ++ iptables -w -t nat -D POSTROUTING -j BOOTSTACK_MASQ_NEW
>2018-08-21 16:31:38,484 INFO: iptables v1.4.21: Couldn't load target `BOOTSTACK_MASQ_NEW':No such file or directory
>2018-08-21 16:31:38,485 INFO:
>2018-08-21 16:31:38,485 INFO: Try `iptables -h' or 'iptables --help' for more information.
>2018-08-21 16:31:38,486 INFO: ++ true
>2018-08-21 16:31:38,486 INFO: ++ iptables -w -t nat -X BOOTSTACK_MASQ_NEW
>2018-08-21 16:31:38,488 INFO: iptables: No chain/target/match by that name.
>2018-08-21 16:31:38,488 INFO: ++ true
>2018-08-21 16:31:38,489 INFO: ++ iptables -w -t nat -N BOOTSTACK_MASQ_NEW
>2018-08-21 16:31:38,492 INFO: ++ NETWORK=192.168.24.0/24
>2018-08-21 16:31:38,493 INFO: ++ NETWORKS=192.168.24.0/24,
>2018-08-21 16:31:38,493 INFO: ++ NETWORKS=192.168.24.0/24
>2018-08-21 16:31:38,494 INFO: ++ iptables -w -t nat -A BOOTSTACK_MASQ_NEW -s 192.168.24.0/24 -d 192.168.24.0/24 -j RETURN
>2018-08-21 16:31:38,497 INFO: ++ iptables -w -t nat -A BOOTSTACK_MASQ_NEW -s 192.168.24.0/24 -j MASQUERADE
>2018-08-21 16:31:38,501 INFO: ++ iptables -w -t nat -I POSTROUTING -j BOOTSTACK_MASQ_NEW
>2018-08-21 16:31:38,504 INFO: ++ iptables -w -t nat -F BOOTSTACK_MASQ
>2018-08-21 16:31:38,507 INFO: iptables: No chain/target/match by that name.
>2018-08-21 16:31:38,508 INFO: ++ true
>2018-08-21 16:31:38,509 INFO: ++ iptables -w -t nat -D POSTROUTING -j BOOTSTACK_MASQ
>2018-08-21 16:31:38,511 INFO: iptables v1.4.21: Couldn't load target `BOOTSTACK_MASQ':No such file or directory
>2018-08-21 16:31:38,512 INFO:
>2018-08-21 16:31:38,512 INFO: Try `iptables -h' or 'iptables --help' for more information.
>2018-08-21 16:31:38,513 INFO: ++ true
>2018-08-21 16:31:38,513 INFO: ++ iptables -w -t nat -X BOOTSTACK_MASQ
>2018-08-21 16:31:38,514 INFO: iptables: No chain/target/match by that name.
>2018-08-21 16:31:38,515 INFO: ++ true
>2018-08-21 16:31:38,515 INFO: ++ iptables -w -t nat -E BOOTSTACK_MASQ_NEW BOOTSTACK_MASQ
>2018-08-21 16:31:38,519 INFO: ++ iptables -w -D FORWARD -j REJECT --reject-with icmp-host-prohibited
>2018-08-21 16:31:38,523 INFO: + iptables-save
>2018-08-21 16:31:38,534 INFO: + /bin/test -f /etc/sysconfig/iptables
>2018-08-21 16:31:38,537 INFO: + /bin/grep -q neutron- /etc/sysconfig/iptables
>2018-08-21 16:31:38,541 INFO: + /bin/test -f /etc/sysconfig/ip6tables
>2018-08-21 16:31:38,543 INFO: + /bin/grep -q neutron- /etc/sysconfig/ip6tables
>2018-08-21 16:31:38,547 INFO: + /bin/test -f /etc/sysconfig/iptables
>2018-08-21 16:31:38,550 INFO: + /bin/grep -v '\-m comment \--comment' /etc/sysconfig/iptables
>2018-08-21 16:31:38,551 INFO: + /bin/grep -q ironic-inspector
>2018-08-21 16:31:38,554 INFO: + /bin/test -f /etc/sysconfig/ip6tables
>2018-08-21 16:31:38,557 INFO: + /bin/grep -v '\-m comment \--comment' /etc/sysconfig/ip6tables
>2018-08-21 16:31:38,558 INFO: + /bin/grep -q ironic-inspector
>2018-08-21 16:31:38,568 INFO: dib-run-parts Tue Aug 21 16:31:38 IDT 2018 80-seedstack-masquerade completed
>2018-08-21 16:31:38,572 INFO: dib-run-parts Tue Aug 21 16:31:38 IDT 2018 Running /usr/libexec/os-refresh-config/post-configure.d/98-undercloud-setup
>2018-08-21 16:31:38,579 INFO: + source /root/tripleo-undercloud-passwords
>2018-08-21 16:31:38,581 INFO: +++ sudo hiera admin_password
>2018-08-21 16:31:38,942 INFO: ++ UNDERCLOUD_ADMIN_PASSWORD=fd2cbdbb02f09f5f84dc55fec6eecffa44f721e9
>2018-08-21 16:31:38,943 INFO: +++ sudo hiera keystone::admin_token
>2018-08-21 16:31:39,290 INFO: ++ UNDERCLOUD_ADMIN_TOKEN=50380f533882a0b2538e6bd6fd09623758492c59
>2018-08-21 16:31:39,291 INFO: +++ sudo hiera ceilometer::metering_secret
>2018-08-21 16:31:39,634 INFO: ++ UNDERCLOUD_CEILOMETER_METERING_SECRET=f7eae168430febab3ae52065e5ba6b057f62cda1
>2018-08-21 16:31:39,635 INFO: +++ sudo hiera ceilometer::keystone::authtoken::password
>2018-08-21 16:31:39,986
INFO: ++ UNDERCLOUD_CEILOMETER_PASSWORD=180342593dcc11eb8b7db84af9da0a196dc06930
>2018-08-21 16:31:39,987 INFO: +++ sudo hiera snmpd_readonly_user_password
>2018-08-21 16:31:40,339 INFO: ++ UNDERCLOUD_CEILOMETER_SNMPD_PASSWORD=nil
>2018-08-21 16:31:40,340 INFO: +++ sudo hiera snmpd_readonly_user_name
>2018-08-21 16:31:40,686 INFO: ++ UNDERCLOUD_CEILOMETER_SNMPD_USER=nil
>2018-08-21 16:31:40,687 INFO: +++ sudo hiera admin_password
>2018-08-21 16:31:41,034 INFO: ++ UNDERCLOUD_DB_PASSWORD=fd2cbdbb02f09f5f84dc55fec6eecffa44f721e9
>2018-08-21 16:31:41,036 INFO: +++ sudo hiera glance::api::keystone_password
>2018-08-21 16:31:41,381 INFO: ++ UNDERCLOUD_GLANCE_PASSWORD=nil
>2018-08-21 16:31:41,383 INFO: +++ sudo hiera tripleo::haproxy::haproxy_stats_password
>2018-08-21 16:31:41,732 INFO: ++ UNDERCLOUD_HAPROXY_STATS_PASSWORD=2f36716b9546aab702585e261b6a38e88d092e93
>2018-08-21 16:31:41,733 INFO: +++ sudo hiera heat::engine::auth_encryption_key
>2018-08-21 16:31:42,085 INFO: ++ UNDERCLOUD_HEAT_ENCRYPTION_KEY=e9ac441049d0287e644a03726787f684
>2018-08-21 16:31:42,086 INFO: +++ sudo hiera heat::keystone_password
>2018-08-21 16:31:42,431 INFO: ++ UNDERCLOUD_HEAT_PASSWORD=nil
>2018-08-21 16:31:42,432 INFO: +++ sudo hiera heat_stack_domain_admin_password
>2018-08-21 16:31:42,782 INFO: ++ UNDERCLOUD_HEAT_STACK_DOMAIN_ADMIN_PASSWORD=5412302b5e9edf98a20e376d4b842348c81a7d1a
>2018-08-21 16:31:42,783 INFO: +++ sudo hiera horizon_secret_key
>2018-08-21 16:31:43,129 INFO: ++ UNDERCLOUD_HORIZON_SECRET_KEY=d269e6cef16c3d2c19296742ba60b29f61c38b97
>2018-08-21 16:31:43,130 INFO: +++ sudo hiera ironic::api::authtoken::password
>2018-08-21 16:31:43,475 INFO: ++ UNDERCLOUD_IRONIC_PASSWORD=0055417577663038fb5c7b60f36fbd4973a07a13
>2018-08-21 16:31:43,476 INFO: +++ sudo hiera neutron::server::auth_password
>2018-08-21 16:31:43,820 INFO: ++ UNDERCLOUD_NEUTRON_PASSWORD=nil
>2018-08-21 16:31:43,821 INFO: +++ sudo hiera nova::keystone::authtoken::password
>2018-08-21 16:31:44,174 INFO: ++ UNDERCLOUD_NOVA_PASSWORD=f8d1cee7a859595f64b303da719c39ab2d11a6b1
>2018-08-21 16:31:44,175 INFO: +++ sudo hiera rabbit_cookie
>2018-08-21 16:31:44,525 INFO: ++ UNDERCLOUD_RABBIT_COOKIE=a3899a162bcfbc21d4ffde3cf87b6dc5896fd185
>2018-08-21 16:31:44,526 INFO: +++ sudo hiera rabbit_password
>2018-08-21 16:31:44,871 INFO: ++ UNDERCLOUD_RABBIT_PASSWORD=nil
>2018-08-21 16:31:44,872 INFO: +++ sudo hiera rabbit_username
>2018-08-21 16:31:45,216 INFO: ++ UNDERCLOUD_RABBIT_USERNAME=nil
>2018-08-21 16:31:45,217 INFO: +++ sudo hiera swift::swift_hash_suffix
>2018-08-21 16:31:45,571 INFO: ++ UNDERCLOUD_SWIFT_HASH_SUFFIX=nil
>2018-08-21 16:31:45,572 INFO: +++ sudo hiera swift::proxy::authtoken::admin_password
>2018-08-21 16:31:45,921 INFO: ++ UNDERCLOUD_SWIFT_PASSWORD=nil
>2018-08-21 16:31:45,922 INFO: +++ sudo hiera mistral::admin_password
>2018-08-21 16:31:46,273 INFO: ++ UNDERCLOUD_MISTRAL_PASSWORD=nil
>2018-08-21 16:31:46,274 INFO: +++ sudo hiera zaqar::keystone::authtoken::password
>2018-08-21 16:31:46,620 INFO: ++ UNDERCLOUD_ZAQAR_PASSWORD=17b45587584ba85a98415b48cb00cb505c80640b
>2018-08-21 16:31:46,621 INFO: +++ sudo hiera cinder::keystone::authtoken::password
>2018-08-21 16:31:46,971 INFO: ++ UNDERCLOUD_CINDER_PASSWORD=ee311fe9adc5de37ad9c2d7179ba5445323f8a80
>2018-08-21 16:31:46,972 INFO: + source /root/stackrc
>2018-08-21 16:31:46,973 INFO: +++ set
>2018-08-21 16:31:46,974 INFO: +++ awk '{FS="="} /^OS_/ {print $1}'
>2018-08-21 16:31:46,979 INFO: ++ NOVA_VERSION=1.1
>2018-08-21 16:31:46,980 INFO: ++ export NOVA_VERSION
>2018-08-21 16:31:46,980 INFO: ++ OS_PASSWORD=fd2cbdbb02f09f5f84dc55fec6eecffa44f721e9
>2018-08-21 16:31:46,981 INFO: ++ export OS_PASSWORD
>2018-08-21 16:31:46,981 INFO: ++ OS_AUTH_TYPE=password
>2018-08-21 16:31:46,982 INFO: ++ export OS_AUTH_TYPE
>2018-08-21 16:31:46,982 INFO: ++ OS_AUTH_URL=http://192.168.24.1:5000/
>2018-08-21 16:31:46,983 INFO: ++ export OS_AUTH_URL
>2018-08-21 16:31:46,983 INFO: ++ OS_USERNAME=admin
>2018-08-21 16:31:46,984 INFO: ++ OS_PROJECT_NAME=admin
>2018-08-21 16:31:46,984 INFO: ++ COMPUTE_API_VERSION=1.1
>2018-08-21 16:31:46,984 INFO: ++ IRONIC_API_VERSION=1.34
>2018-08-21 16:31:46,985 INFO: ++ OS_BAREMETAL_API_VERSION=1.34
>2018-08-21 16:31:46,985 INFO: ++ OS_NO_CACHE=True
>2018-08-21 16:31:46,986 INFO: ++ OS_CLOUDNAME=undercloud
>2018-08-21 16:31:46,986 INFO: ++ export OS_USERNAME
>2018-08-21 16:31:46,987 INFO: ++ export OS_PROJECT_NAME
>2018-08-21 16:31:46,987 INFO: ++ export COMPUTE_API_VERSION
>2018-08-21 16:31:46,988 INFO: ++ export IRONIC_API_VERSION
>2018-08-21 16:31:46,988 INFO: ++ export OS_BAREMETAL_API_VERSION
>2018-08-21 16:31:46,988 INFO: ++ export OS_NO_CACHE
>2018-08-21 16:31:46,989 INFO: ++ export OS_CLOUDNAME
>2018-08-21 16:31:46,989 INFO: ++ OS_IDENTITY_API_VERSION=3
>2018-08-21 16:31:46,990 INFO: ++ export OS_IDENTITY_API_VERSION
>2018-08-21 16:31:46,990 INFO: ++ OS_PROJECT_DOMAIN_NAME=Default
>2018-08-21 16:31:46,991 INFO: ++ export OS_PROJECT_DOMAIN_NAME
>2018-08-21 16:31:46,991 INFO: ++ OS_USER_DOMAIN_NAME=Default
>2018-08-21 16:31:46,992 INFO: ++ export OS_USER_DOMAIN_NAME
>2018-08-21 16:31:46,992 INFO: ++ '[' -z '' ']'
>2018-08-21 16:31:46,992 INFO: ++ export PS1=
>2018-08-21 16:31:46,993 INFO: ++ PS1=
>2018-08-21 16:31:46,993 INFO: ++ export 'PS1=${OS_CLOUDNAME:+($OS_CLOUDNAME)} '
>2018-08-21 16:31:46,994 INFO: ++ PS1='${OS_CLOUDNAME:+($OS_CLOUDNAME)} '
>2018-08-21 16:31:46,994 INFO: ++ export CLOUDPROMPT_ENABLED=1
>2018-08-21 16:31:46,995 INFO: ++ CLOUDPROMPT_ENABLED=1
>2018-08-21 16:31:46,995 INFO: + INSTACK_ROOT=
>2018-08-21 16:31:46,996 INFO: + export INSTACK_ROOT
>2018-08-21 16:31:46,996 INFO: + '[' -n '' ']'
>2018-08-21 16:31:46,996 INFO: + '[' '!' -f /root/.ssh/authorized_keys ']'
>2018-08-21 16:31:46,997 INFO: + '[' '!' -f /root/.ssh/id_rsa ']'
>2018-08-21 16:31:46,997 INFO: + ssh-keygen -b 1024 -N '' -f /root/.ssh/id_rsa
>2018-08-21 16:31:47,519 INFO: Generating public/private rsa key pair.
>2018-08-21 16:31:47,520 INFO: Your identification has been saved in /root/.ssh/id_rsa. >2018-08-21 16:31:47,520 INFO: Your public key has been saved in /root/.ssh/id_rsa.pub. >2018-08-21 16:31:47,521 INFO: The key fingerprint is: >2018-08-21 16:31:47,521 INFO: SHA256:HUfXNRjFaLPf28JtZ4lD5k8APES6f1593Q6V20I0D3E root@undercloud-0.redhat.local >2018-08-21 16:31:47,522 INFO: The key's randomart image is: >2018-08-21 16:31:47,522 INFO: +---[RSA 1024]----+ >2018-08-21 16:31:47,523 INFO: | .o..O+E| >2018-08-21 16:31:47,523 INFO: | +. * +o| >2018-08-21 16:31:47,523 INFO: | ..+o * | >2018-08-21 16:31:47,524 INFO: | ..ooo +.| >2018-08-21 16:31:47,524 INFO: | S.. .o.+| >2018-08-21 16:31:47,525 INFO: | . +.oB| >2018-08-21 16:31:47,525 INFO: | .+o+=X| >2018-08-21 16:31:47,526 INFO: | o+=*B| >2018-08-21 16:31:47,526 INFO: | .o=o| >2018-08-21 16:31:47,526 INFO: +----[SHA256]-----+ >2018-08-21 16:31:47,527 INFO: + cat /root/.ssh/id_rsa.pub >2018-08-21 16:31:47,528 INFO: + '[' -e /usr/sbin/getenforce ']' >2018-08-21 16:31:47,529 INFO: ++ getenforce >2018-08-21 16:31:47,541 INFO: + '[' Enforcing == Enforcing ']' >2018-08-21 16:31:47,542 INFO: + set +e >2018-08-21 16:31:47,543 INFO: ++ find /root/.ssh/ -exec ls -lZ '{}' ';' >2018-08-21 16:31:47,543 INFO: ++ grep -v ssh_home_t >2018-08-21 16:31:47,567 INFO: + selinux_wrong_permission= >2018-08-21 16:31:47,568 INFO: + set -e >2018-08-21 16:31:47,569 INFO: + '[' -n '' ']' >2018-08-21 16:31:47,570 INFO: ++ openstack project show admin >2018-08-21 16:31:47,570 INFO: ++ awk '$2=="id" {print $4}' >2018-08-21 16:32:09,162 INFO: + openstack quota set --cores -1 --instances -1 --ram -1 f22c2ea140d3466abe874cd49d41a625 >2018-08-21 16:32:46,754 INFO: + rm -rf /root/.novaclient >2018-08-21 16:32:46,767 INFO: dib-run-parts Tue Aug 21 16:32:46 IDT 2018 98-undercloud-setup completed >2018-08-21 16:32:46,771 INFO: dib-run-parts Tue Aug 21 16:32:46 IDT 2018 Running /usr/libexec/os-refresh-config/post-configure.d/99-refresh-completed >2018-08-21 
16:32:46,781 INFO: ++ os-apply-config --key completion-handle --type raw --key-default '' >2018-08-21 16:32:47,557 INFO: [2018/08/21 04:32:47 PM] [WARNING] DEPRECATED: falling back to /var/run/os-collect-config/os_config_files.json >2018-08-21 16:32:47,589 INFO: + HANDLE= >2018-08-21 16:32:47,591 INFO: ++ os-apply-config --key completion-signal --type raw --key-default '' >2018-08-21 16:32:48,362 INFO: [2018/08/21 04:32:48 PM] [WARNING] DEPRECATED: falling back to /var/run/os-collect-config/os_config_files.json >2018-08-21 16:32:48,395 INFO: + SIGNAL= >2018-08-21 16:32:48,396 INFO: ++ os-apply-config --key instance-id --type raw --key-default '' >2018-08-21 16:32:49,159 INFO: [2018/08/21 04:32:49 PM] [WARNING] DEPRECATED: falling back to /var/run/os-collect-config/os_config_files.json >2018-08-21 16:32:49,188 INFO: + ID= >2018-08-21 16:32:49,189 INFO: + '[' -n '' ']' >2018-08-21 16:32:49,190 INFO: + exit 0 >2018-08-21 16:32:49,197 INFO: dib-run-parts Tue Aug 21 16:32:49 IDT 2018 99-refresh-completed completed >2018-08-21 16:32:49,201 INFO: dib-run-parts Tue Aug 21 16:32:49 IDT 2018 ----------------------- PROFILING ----------------------- >2018-08-21 16:32:49,205 INFO: dib-run-parts Tue Aug 21 16:32:49 IDT 2018 >2018-08-21 16:32:49,211 INFO: dib-run-parts Tue Aug 21 16:32:49 IDT 2018 Target: post-configure.d >2018-08-21 16:32:49,214 INFO: dib-run-parts Tue Aug 21 16:32:49 IDT 2018 >2018-08-21 16:32:49,218 INFO: dib-run-parts Tue Aug 21 16:32:49 IDT 2018 Script Seconds >2018-08-21 16:32:49,222 INFO: dib-run-parts Tue Aug 21 16:32:49 IDT 2018 --------------------------------------- ---------- >2018-08-21 16:32:49,225 INFO: dib-run-parts Tue Aug 21 16:32:49 IDT 2018 >2018-08-21 16:32:49,249 INFO: dib-run-parts Tue Aug 21 16:32:49 IDT 2018 10-iptables 0.056 >2018-08-21 16:32:49,265 INFO: dib-run-parts Tue Aug 21 16:32:49 IDT 2018 80-seedstack-masquerade 0.094 >2018-08-21 16:32:49,282 INFO: dib-run-parts Tue Aug 21 16:32:49 IDT 2018 98-undercloud-setup 68.189 
>2018-08-21 16:32:49,298 INFO: dib-run-parts Tue Aug 21 16:32:49 IDT 2018 99-refresh-completed 2.419 >2018-08-21 16:32:49,306 INFO: dib-run-parts Tue Aug 21 16:32:49 IDT 2018 >2018-08-21 16:32:49,310 INFO: dib-run-parts Tue Aug 21 16:32:49 IDT 2018 --------------------- END PROFILING --------------------- >2018-08-21 16:32:49,311 INFO: [2018-08-21 16:32:49,310] (os-refresh-config) [INFO] Completed phase post-configure >2018-08-21 16:32:49,337 INFO: os-refresh-config completed successfully >2018-08-21 16:32:50,250 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:5000/ -H "Accept: application/json" -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" >2018-08-21 16:32:50,260 DEBUG: Starting new HTTP connection (1): 192.168.24.1 >2018-08-21 16:32:50,274 DEBUG: http://192.168.24.1:5000 "GET / HTTP/1.1" 300 599 >2018-08-21 16:32:50,317 DEBUG: RESP: [300] Date: Tue, 21 Aug 2018 13:32:50 GMT Server: Apache Vary: X-Auth-Token Content-Length: 599 Keep-Alive: timeout=15, max=100 Connection: Keep-Alive Content-Type: application/json >RESP BODY: {"versions": {"values": [{"status": "stable", "updated": "2018-02-28T00:00:00Z", "media-types": [{"base": "application/json", "type": "application/vnd.openstack.identity-v3+json"}], "id": "v3.10", "links": [{"href": "http://192.168.24.1:5000/v3/", "rel": "self"}]}, {"status": "deprecated", "updated": "2016-08-04T00:00:00Z", "media-types": [{"base": "application/json", "type": "application/vnd.openstack.identity-v2.0+json"}], "id": "v2.0", "links": [{"href": "http://192.168.24.1:5000/v2.0/", "rel": "self"}, {"href": "https://docs.openstack.org/", "type": "text/html", "rel": "describedby"}]}]}} > >2018-08-21 16:32:50,320 DEBUG: Making authentication request to http://192.168.24.1:5000/v3/auth/tokens >2018-08-21 16:32:52,005 DEBUG: http://192.168.24.1:5000 "POST /v3/auth/tokens HTTP/1.1" 201 8110 >2018-08-21 16:32:52,009 DEBUG: {"token": {"is_domain": false, "methods": ["password"], "roles": [{"id": 
"d9bce82a1b3d48afb75c066ac7855390", "name": "admin"}], "expires_at": "2018-08-21T17:32:51.000000Z", "project": {"domain": {"id": "default", "name": "Default"}, "id": "f22c2ea140d3466abe874cd49d41a625", "name": "admin"}, "catalog": [{"endpoints": [{"url": "http://192.168.24.1:8774/v2.1", "interface": "admin", "region": "regionOne", "region_id": "regionOne", "id": "2ff4490fa2a0457098d77449f1f6803e"}, {"url": "http://192.168.24.1:8774/v2.1", "interface": "public", "region": "regionOne", "region_id": "regionOne", "id": "95b1d5fdee5b4c0281704ea50e0d5ce9"}, {"url": "http://192.168.24.1:8774/v2.1", "interface": "internal", "region": "regionOne", "region_id": "regionOne", "id": "ece7fa4b347b43a681e87b934be19375"}], "type": "compute", "id": "004cfe3196f34e4db5a61392090cddac", "name": "nova"}, {"endpoints": [{"url": "http://192.168.24.1:9292", "interface": "internal", "region": "regionOne", "region_id": "regionOne", "id": "1789ddb69039451f92355bf1d8b57c70"}, {"url": "http://192.168.24.1:9292", "interface": "admin", "region": "regionOne", "region_id": "regionOne", "id": "323f09cf4edc4353b75d8a51f2b42b46"}, {"url": "http://192.168.24.1:9292", "interface": "public", "region": "regionOne", "region_id": "regionOne", "id": "c1a312958c6a43d487d9632fb8fe0110"}], "type": "image", "id": "00c71b99f79b47c0b3f96c3c4d5f4322", "name": "glance"}, {"endpoints": [{"url": "http://192.168.24.1:8888", "interface": "internal", "region": "regionOne", "region_id": "regionOne", "id": "323248a37ae446ce8ca8a315e2749ac3"}, {"url": "http://192.168.24.1:8888", "interface": "public", "region": "regionOne", "region_id": "regionOne", "id": "9d207a43a9ff4802a58145617986dcac"}, {"url": "http://192.168.24.1:8888", "interface": "admin", "region": "regionOne", "region_id": "regionOne", "id": "9f0116bcdbe84e6eab759177a6bef1eb"}], "type": "messaging", "id": "0612c7c7028f4e5bb2b2d6849fa7946a", "name": "zaqar"}, {"endpoints": [{"url": "http://192.168.24.1:6385", "interface": "public", "region": "regionOne", 
"region_id": "regionOne", "id": "0dbb2ecfdfda453a906c8c2ace5ca7b3"}, {"url": "http://192.168.24.1:6385", "interface": "admin", "region": "regionOne", "region_id": "regionOne", "id": "3a75224641fc497199a2ecc199938395"}, {"url": "http://192.168.24.1:6385", "interface": "internal", "region": "regionOne", "region_id": "regionOne", "id": "b993bceb6f344a5fa15c3bcfdf955478"}], "type": "baremetal", "id": "19c7aa002275450cb9c01d0798ca9e70", "name": "ironic"}, {"endpoints": [{"url": "http://192.168.24.1:5050", "interface": "internal", "region": "regionOne", "region_id": "regionOne", "id": "b280db447ebc41798461c9dba0e5f300"}, {"url": "http://192.168.24.1:5050", "interface": "public", "region": "regionOne", "region_id": "regionOne", "id": "e245311c27224c11b499aaf5ebbf670b"}, {"url": "http://192.168.24.1:5050", "interface": "admin", "region": "regionOne", "region_id": "regionOne", "id": "f8d8101e8ca84f739abfdeede71c2ba4"}], "type": "baremetal-introspection", "id": "587ff47314b14dadac30f78d7592a0a8", "name": "ironic-inspector"}, {"endpoints": [{"url": "http://192.168.24.1:9696", "interface": "internal", "region": "regionOne", "region_id": "regionOne", "id": "6b177ba421c84bd6a38c7c7608294f31"}, {"url": "http://192.168.24.1:9696", "interface": "public", "region": "regionOne", "region_id": "regionOne", "id": "bfe3ba0cfb6544dd858017d2ba65ff5e"}, {"url": "http://192.168.24.1:9696", "interface": "admin", "region": "regionOne", "region_id": "regionOne", "id": "ebf75af3716d404ab6833d7d2d8f36b5"}], "type": "network", "id": "6b091099e31e47928e4b324bf19bf876", "name": "neutron"}, {"endpoints": [{"url": "http://192.168.24.1:5000", "interface": "internal", "region": "regionOne", "region_id": "regionOne", "id": "6f134ba68fe9426c8b6d97d4c2631380"}, {"url": "http://192.168.24.1:5000", "interface": "public", "region": "regionOne", "region_id": "regionOne", "id": "dc9c95c9cedd49f79622f34938ece34b"}, {"url": "http://192.168.24.1:35357", "interface": "admin", "region": "regionOne", "region_id": 
"regionOne", "id": "eebdd30efded40ba9ca2882ab0df4bac"}], "type": "identity", "id": "77d62535ab7649948b8627fef40261ae", "name": "keystone"}, {"endpoints": [{"url": "http://192.168.24.1:8778/placement", "interface": "public", "region": "regionOne", "region_id": "regionOne", "id": "58a7acaaeb134764a97167b5c7d7e501"}, {"url": "http://192.168.24.1:8778/placement", "interface": "internal", "region": "regionOne", "region_id": "regionOne", "id": "78167e26adbe4b6299b2e62e5b689c4c"}, {"url": "http://192.168.24.1:8778/placement", "interface": "admin", "region": "regionOne", "region_id": "regionOne", "id": "88f1dd19dddb4fca80aa7b67da525231"}], "type": "placement", "id": "9058c6ac3e2e45b08dc4fcdd1496930e", "name": "placement"}, {"endpoints": [{"url": "http://192.168.24.1:8000/v1/f22c2ea140d3466abe874cd49d41a625", "interface": "admin", "region": "regionOne", "region_id": "regionOne", "id": "52309d2782d5400882a835845712ac65"}, {"url": "http://192.168.24.1:8000/v1/f22c2ea140d3466abe874cd49d41a625", "interface": "internal", "region": "regionOne", "region_id": "regionOne", "id": "b0b4f67a949447c585ae743fe0cdcb39"}, {"url": "http://192.168.24.1:8000/v1/f22c2ea140d3466abe874cd49d41a625", "interface": "public", "region": "regionOne", "region_id": "regionOne", "id": "f50c91aeb68845fcb044907f2f6db788"}], "type": "cloudformation", "id": "c38dc1f4ae39437db9a9af6586c846b3", "name": "heat-cfn"}, {"endpoints": [{"url": "http://192.168.24.1:8989/v2", "interface": "public", "region": "regionOne", "region_id": "regionOne", "id": "0eb5b9e8fdaa41948b4f0dab3790103b"}, {"url": "http://192.168.24.1:8989/v2", "interface": "internal", "region": "regionOne", "region_id": "regionOne", "id": "7ed433372bd94fa29146a793b32c9109"}, {"url": "http://192.168.24.1:8989/v2", "interface": "admin", "region": "regionOne", "region_id": "regionOne", "id": "e3a01027f424427a9965b75305b301e1"}], "type": "workflowv2", "id": "d9276e2f5afd40c4b17d45f0109fc5cd", "name": "mistral"}, {"endpoints": [{"url": 
"http://192.168.24.1:8080", "interface": "admin", "region": "regionOne", "region_id": "regionOne", "id": "24868bb765e74704a5bac7adabe0754d"}, {"url": "http://192.168.24.1:8080/v1/AUTH_f22c2ea140d3466abe874cd49d41a625", "interface": "public", "region": "regionOne", "region_id": "regionOne", "id": "90dc6e4b12eb4695a471bcd1d2584909"}, {"url": "http://192.168.24.1:8080/v1/AUTH_f22c2ea140d3466abe874cd49d41a625", "interface": "internal", "region": "regionOne", "region_id": "regionOne", "id": "eca19f95bac3437cbd0cefacba53ab84"}], "type": "object-store", "id": "e31f0b3d353a4f7dbee1c1fe563c8b8b", "name": "swift"}, {"endpoints": [{"url": "ws://192.168.24.1:9000", "interface": "admin", "region": "regionOne", "region_id": "regionOne", "id": "485830434ed64328961aa92033e9dc7a"}, {"url": "ws://192.168.24.1:9000", "interface": "public", "region": "regionOne", "region_id": "regionOne", "id": "7cac3900b2d848909d1d53e7cec5eea7"}, {"url": "ws://192.168.24.1:9000", "interface": "internal", "region": "regionOne", "region_id": "regionOne", "id": "db30422dfe5f43b0b2f1ff615124bf46"}], "type": "messaging-websocket", "id": "ea43c0cc926a4a6c96350ec1c1271454", "name": "zaqar-websocket"}, {"endpoints": [{"url": "http://192.168.24.1:8004/v1/f22c2ea140d3466abe874cd49d41a625", "interface": "admin", "region": "regionOne", "region_id": "regionOne", "id": "225fd64783854aeda000fc1b6a0ecb95"}, {"url": "http://192.168.24.1:8004/v1/f22c2ea140d3466abe874cd49d41a625", "interface": "public", "region": "regionOne", "region_id": "regionOne", "id": "5e23f9a97393489b84e4cc83b8f4d7df"}, {"url": "http://192.168.24.1:8004/v1/f22c2ea140d3466abe874cd49d41a625", "interface": "internal", "region": "regionOne", "region_id": "regionOne", "id": "7bd2c8977c964ab1bfdf67bbe47444f8"}], "type": "orchestration", "id": "f20a1267226a4725bdacf37cda18adb9", "name": "heat"}], "user": {"domain": {"id": "default", "name": "Default"}, "password_expires_at": null, "name": "admin", "id": "7d372ccb647644fbb126da9ed61a2287"}, "audit_ids": 
["BB8r3aiASEy3q6ITLmBGKQ"], "issued_at": "2018-08-21T13:32:51.000000Z"}} >2018-08-21 16:32:52,117 DEBUG: found extension EntryPoint.parse('v1password = swiftclient.authv1:PasswordLoader') >2018-08-21 16:32:52,117 DEBUG: found extension EntryPoint.parse('v2token = keystoneauth1.loading._plugins.identity.v2:Token') >2018-08-21 16:32:52,118 DEBUG: found extension EntryPoint.parse('none = keystoneauth1.loading._plugins.noauth:NoAuth') >2018-08-21 16:32:52,118 DEBUG: found extension EntryPoint.parse('v3oauth1 = keystoneauth1.extras.oauth1._loading:V3OAuth1') >2018-08-21 16:32:52,118 DEBUG: found extension EntryPoint.parse('admin_token = keystoneauth1.loading._plugins.admin_token:AdminToken') >2018-08-21 16:32:52,118 DEBUG: found extension EntryPoint.parse('v3oidcauthcode = keystoneauth1.loading._plugins.identity.v3:OpenIDConnectAuthorizationCode') >2018-08-21 16:32:52,119 DEBUG: found extension EntryPoint.parse('v2password = keystoneauth1.loading._plugins.identity.v2:Password') >2018-08-21 16:32:52,119 DEBUG: found extension EntryPoint.parse('v3samlpassword = keystoneauth1.extras._saml2._loading:Saml2Password') >2018-08-21 16:32:52,119 DEBUG: found extension EntryPoint.parse('v3password = keystoneauth1.loading._plugins.identity.v3:Password') >2018-08-21 16:32:52,120 DEBUG: found extension EntryPoint.parse('v3adfspassword = keystoneauth1.extras._saml2._loading:ADFSPassword') >2018-08-21 16:32:52,120 DEBUG: found extension EntryPoint.parse('v3oidcaccesstoken = keystoneauth1.loading._plugins.identity.v3:OpenIDConnectAccessToken') >2018-08-21 16:32:52,120 DEBUG: found extension EntryPoint.parse('v3oidcpassword = keystoneauth1.loading._plugins.identity.v3:OpenIDConnectPassword') >2018-08-21 16:32:52,120 DEBUG: found extension EntryPoint.parse('v3kerberos = keystoneauth1.extras.kerberos._loading:Kerberos') >2018-08-21 16:32:52,121 DEBUG: found extension EntryPoint.parse('token = keystoneauth1.loading._plugins.identity.generic:Token') >2018-08-21 16:32:52,121 DEBUG: found 
extension EntryPoint.parse('v3oidcclientcredentials = keystoneauth1.loading._plugins.identity.v3:OpenIDConnectClientCredentials') >2018-08-21 16:32:52,121 DEBUG: found extension EntryPoint.parse('v3tokenlessauth = keystoneauth1.loading._plugins.identity.v3:TokenlessAuth') >2018-08-21 16:32:52,121 DEBUG: found extension EntryPoint.parse('v3token = keystoneauth1.loading._plugins.identity.v3:Token') >2018-08-21 16:32:52,122 DEBUG: found extension EntryPoint.parse('v3totp = keystoneauth1.loading._plugins.identity.v3:TOTP') >2018-08-21 16:32:52,122 DEBUG: found extension EntryPoint.parse('v3applicationcredential = keystoneauth1.loading._plugins.identity.v3:ApplicationCredential') >2018-08-21 16:32:52,122 DEBUG: found extension EntryPoint.parse('password = keystoneauth1.loading._plugins.identity.generic:Password') >2018-08-21 16:32:52,124 DEBUG: found extension EntryPoint.parse('v3fedkerb = keystoneauth1.extras.kerberos._loading:MappedKerberos') >2018-08-21 16:32:52,124 DEBUG: found extension EntryPoint.parse('token_endpoint = openstackclient.api.auth_plugin:TokenEndpoint') >2018-08-21 16:32:52,125 DEBUG: found extension EntryPoint.parse('noauth = cinderclient.contrib.noauth:CinderNoAuthLoader') >2018-08-21 16:32:52,154 DEBUG: Manager defaults:unknown running task network.GET.networks >2018-08-21 16:32:52,155 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:5000/ -H "Accept: application/json" -H "User-Agent: os-client-config/1.29.0 keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" >2018-08-21 16:32:52,159 DEBUG: Starting new HTTP connection (1): 192.168.24.1 >2018-08-21 16:32:52,175 DEBUG: http://192.168.24.1:5000 "GET / HTTP/1.1" 300 599 >2018-08-21 16:32:52,178 DEBUG: RESP: [300] Date: Tue, 21 Aug 2018 13:32:52 GMT Server: Apache Vary: X-Auth-Token Content-Length: 599 Keep-Alive: timeout=15, max=100 Connection: Keep-Alive Content-Type: application/json >RESP BODY: {"versions": {"values": [{"status": "stable", "updated": "2018-02-28T00:00:00Z", "media-types": 
[{"base": "application/json", "type": "application/vnd.openstack.identity-v3+json"}], "id": "v3.10", "links": [{"href": "http://192.168.24.1:5000/v3/", "rel": "self"}]}, {"status": "deprecated", "updated": "2016-08-04T00:00:00Z", "media-types": [{"base": "application/json", "type": "application/vnd.openstack.identity-v2.0+json"}], "id": "v2.0", "links": [{"href": "http://192.168.24.1:5000/v2.0/", "rel": "self"}, {"href": "https://docs.openstack.org/", "type": "text/html", "rel": "describedby"}]}]}} > >2018-08-21 16:32:52,179 DEBUG: Making authentication request to http://192.168.24.1:5000/v3/auth/tokens >2018-08-21 16:32:53,876 DEBUG: http://192.168.24.1:5000 "POST /v3/auth/tokens HTTP/1.1" 201 8110 >2018-08-21 16:32:53,880 DEBUG: {"token": {"is_domain": false, "methods": ["password"], "roles": [{"id": "d9bce82a1b3d48afb75c066ac7855390", "name": "admin"}], "expires_at": "2018-08-21T17:32:53.000000Z", "project": {"domain": {"id": "default", "name": "Default"}, "id": "f22c2ea140d3466abe874cd49d41a625", "name": "admin"}, "catalog": [{"endpoints": [{"url": "http://192.168.24.1:8774/v2.1", "interface": "admin", "region": "regionOne", "region_id": "regionOne", "id": "2ff4490fa2a0457098d77449f1f6803e"}, {"url": "http://192.168.24.1:8774/v2.1", "interface": "public", "region": "regionOne", "region_id": "regionOne", "id": "95b1d5fdee5b4c0281704ea50e0d5ce9"}, {"url": "http://192.168.24.1:8774/v2.1", "interface": "internal", "region": "regionOne", "region_id": "regionOne", "id": "ece7fa4b347b43a681e87b934be19375"}], "type": "compute", "id": "004cfe3196f34e4db5a61392090cddac", "name": "nova"}, {"endpoints": [{"url": "http://192.168.24.1:9292", "interface": "internal", "region": "regionOne", "region_id": "regionOne", "id": "1789ddb69039451f92355bf1d8b57c70"}, {"url": "http://192.168.24.1:9292", "interface": "admin", "region": "regionOne", "region_id": "regionOne", "id": "323f09cf4edc4353b75d8a51f2b42b46"}, {"url": "http://192.168.24.1:9292", "interface": "public", "region": 
"regionOne", "region_id": "regionOne", "id": "c1a312958c6a43d487d9632fb8fe0110"}], "type": "image", "id": "00c71b99f79b47c0b3f96c3c4d5f4322", "name": "glance"}, {"endpoints": [{"url": "http://192.168.24.1:8888", "interface": "internal", "region": "regionOne", "region_id": "regionOne", "id": "323248a37ae446ce8ca8a315e2749ac3"}, {"url": "http://192.168.24.1:8888", "interface": "public", "region": "regionOne", "region_id": "regionOne", "id": "9d207a43a9ff4802a58145617986dcac"}, {"url": "http://192.168.24.1:8888", "interface": "admin", "region": "regionOne", "region_id": "regionOne", "id": "9f0116bcdbe84e6eab759177a6bef1eb"}], "type": "messaging", "id": "0612c7c7028f4e5bb2b2d6849fa7946a", "name": "zaqar"}, {"endpoints": [{"url": "http://192.168.24.1:6385", "interface": "public", "region": "regionOne", "region_id": "regionOne", "id": "0dbb2ecfdfda453a906c8c2ace5ca7b3"}, {"url": "http://192.168.24.1:6385", "interface": "admin", "region": "regionOne", "region_id": "regionOne", "id": "3a75224641fc497199a2ecc199938395"}, {"url": "http://192.168.24.1:6385", "interface": "internal", "region": "regionOne", "region_id": "regionOne", "id": "b993bceb6f344a5fa15c3bcfdf955478"}], "type": "baremetal", "id": "19c7aa002275450cb9c01d0798ca9e70", "name": "ironic"}, {"endpoints": [{"url": "http://192.168.24.1:5050", "interface": "internal", "region": "regionOne", "region_id": "regionOne", "id": "b280db447ebc41798461c9dba0e5f300"}, {"url": "http://192.168.24.1:5050", "interface": "public", "region": "regionOne", "region_id": "regionOne", "id": "e245311c27224c11b499aaf5ebbf670b"}, {"url": "http://192.168.24.1:5050", "interface": "admin", "region": "regionOne", "region_id": "regionOne", "id": "f8d8101e8ca84f739abfdeede71c2ba4"}], "type": "baremetal-introspection", "id": "587ff47314b14dadac30f78d7592a0a8", "name": "ironic-inspector"}, {"endpoints": [{"url": "http://192.168.24.1:9696", "interface": "internal", "region": "regionOne", "region_id": "regionOne", "id": 
"6b177ba421c84bd6a38c7c7608294f31"}, {"url": "http://192.168.24.1:9696", "interface": "public", "region": "regionOne", "region_id": "regionOne", "id": "bfe3ba0cfb6544dd858017d2ba65ff5e"}, {"url": "http://192.168.24.1:9696", "interface": "admin", "region": "regionOne", "region_id": "regionOne", "id": "ebf75af3716d404ab6833d7d2d8f36b5"}], "type": "network", "id": "6b091099e31e47928e4b324bf19bf876", "name": "neutron"}, {"endpoints": [{"url": "http://192.168.24.1:5000", "interface": "internal", "region": "regionOne", "region_id": "regionOne", "id": "6f134ba68fe9426c8b6d97d4c2631380"}, {"url": "http://192.168.24.1:5000", "interface": "public", "region": "regionOne", "region_id": "regionOne", "id": "dc9c95c9cedd49f79622f34938ece34b"}, {"url": "http://192.168.24.1:35357", "interface": "admin", "region": "regionOne", "region_id": "regionOne", "id": "eebdd30efded40ba9ca2882ab0df4bac"}], "type": "identity", "id": "77d62535ab7649948b8627fef40261ae", "name": "keystone"}, {"endpoints": [{"url": "http://192.168.24.1:8778/placement", "interface": "public", "region": "regionOne", "region_id": "regionOne", "id": "58a7acaaeb134764a97167b5c7d7e501"}, {"url": "http://192.168.24.1:8778/placement", "interface": "internal", "region": "regionOne", "region_id": "regionOne", "id": "78167e26adbe4b6299b2e62e5b689c4c"}, {"url": "http://192.168.24.1:8778/placement", "interface": "admin", "region": "regionOne", "region_id": "regionOne", "id": "88f1dd19dddb4fca80aa7b67da525231"}], "type": "placement", "id": "9058c6ac3e2e45b08dc4fcdd1496930e", "name": "placement"}, {"endpoints": [{"url": "http://192.168.24.1:8000/v1/f22c2ea140d3466abe874cd49d41a625", "interface": "admin", "region": "regionOne", "region_id": "regionOne", "id": "52309d2782d5400882a835845712ac65"}, {"url": "http://192.168.24.1:8000/v1/f22c2ea140d3466abe874cd49d41a625", "interface": "internal", "region": "regionOne", "region_id": "regionOne", "id": "b0b4f67a949447c585ae743fe0cdcb39"}, {"url": 
"http://192.168.24.1:8000/v1/f22c2ea140d3466abe874cd49d41a625", "interface": "public", "region": "regionOne", "region_id": "regionOne", "id": "f50c91aeb68845fcb044907f2f6db788"}], "type": "cloudformation", "id": "c38dc1f4ae39437db9a9af6586c846b3", "name": "heat-cfn"}, {"endpoints": [{"url": "http://192.168.24.1:8989/v2", "interface": "public", "region": "regionOne", "region_id": "regionOne", "id": "0eb5b9e8fdaa41948b4f0dab3790103b"}, {"url": "http://192.168.24.1:8989/v2", "interface": "internal", "region": "regionOne", "region_id": "regionOne", "id": "7ed433372bd94fa29146a793b32c9109"}, {"url": "http://192.168.24.1:8989/v2", "interface": "admin", "region": "regionOne", "region_id": "regionOne", "id": "e3a01027f424427a9965b75305b301e1"}], "type": "workflowv2", "id": "d9276e2f5afd40c4b17d45f0109fc5cd", "name": "mistral"}, {"endpoints": [{"url": "http://192.168.24.1:8080", "interface": "admin", "region": "regionOne", "region_id": "regionOne", "id": "24868bb765e74704a5bac7adabe0754d"}, {"url": "http://192.168.24.1:8080/v1/AUTH_f22c2ea140d3466abe874cd49d41a625", "interface": "public", "region": "regionOne", "region_id": "regionOne", "id": "90dc6e4b12eb4695a471bcd1d2584909"}, {"url": "http://192.168.24.1:8080/v1/AUTH_f22c2ea140d3466abe874cd49d41a625", "interface": "internal", "region": "regionOne", "region_id": "regionOne", "id": "eca19f95bac3437cbd0cefacba53ab84"}], "type": "object-store", "id": "e31f0b3d353a4f7dbee1c1fe563c8b8b", "name": "swift"}, {"endpoints": [{"url": "ws://192.168.24.1:9000", "interface": "admin", "region": "regionOne", "region_id": "regionOne", "id": "485830434ed64328961aa92033e9dc7a"}, {"url": "ws://192.168.24.1:9000", "interface": "public", "region": "regionOne", "region_id": "regionOne", "id": "7cac3900b2d848909d1d53e7cec5eea7"}, {"url": "ws://192.168.24.1:9000", "interface": "internal", "region": "regionOne", "region_id": "regionOne", "id": "db30422dfe5f43b0b2f1ff615124bf46"}], "type": "messaging-websocket", "id": 
"ea43c0cc926a4a6c96350ec1c1271454", "name": "zaqar-websocket"}, {"endpoints": [{"url": "http://192.168.24.1:8004/v1/f22c2ea140d3466abe874cd49d41a625", "interface": "admin", "region": "regionOne", "region_id": "regionOne", "id": "225fd64783854aeda000fc1b6a0ecb95"}, {"url": "http://192.168.24.1:8004/v1/f22c2ea140d3466abe874cd49d41a625", "interface": "public", "region": "regionOne", "region_id": "regionOne", "id": "5e23f9a97393489b84e4cc83b8f4d7df"}, {"url": "http://192.168.24.1:8004/v1/f22c2ea140d3466abe874cd49d41a625", "interface": "internal", "region": "regionOne", "region_id": "regionOne", "id": "7bd2c8977c964ab1bfdf67bbe47444f8"}], "type": "orchestration", "id": "f20a1267226a4725bdacf37cda18adb9", "name": "heat"}], "user": {"domain": {"id": "default", "name": "Default"}, "password_expires_at": null, "name": "admin", "id": "7d372ccb647644fbb126da9ed61a2287"}, "audit_ids": ["nDP8OpHUQ8aV9R89Rs_bXQ"], "issued_at": "2018-08-21T13:32:53.000000Z"}} >2018-08-21 16:32:53,888 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:9696 -H "Accept: application/json" -H "User-Agent: os-client-config/1.29.0 keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" >2018-08-21 16:32:53,892 DEBUG: Starting new HTTP connection (1): 192.168.24.1 >2018-08-21 16:32:53,907 DEBUG: http://192.168.24.1:9696 "GET / HTTP/1.1" 200 121 >2018-08-21 16:32:53,911 DEBUG: RESP: [200] Content-Length: 121 Content-Type: application/json Date: Tue, 21 Aug 2018 13:32:53 GMT Connection: keep-alive >RESP BODY: {"versions": [{"status": "CURRENT", "id": "v2.0", "links": [{"href": "http://192.168.24.1:9696/v2.0/", "rel": "self"}]}]} > >2018-08-21 16:32:53,913 DEBUG: REQ: curl -g -i -X GET "http://192.168.24.1:9696/v2.0/networks?name=ctlplane" -H "User-Agent: os-client-config/1.29.0 keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "Accept: application/json" -H "X-Auth-Token: {SHA1}70c73a092cdb2fa197ffcf84bd4dc81a0d299e14" >2018-08-21 16:32:58,166 DEBUG: http://192.168.24.1:9696 "GET 
/v2.0/networks?name=ctlplane HTTP/1.1" 200 15 >2018-08-21 16:32:58,168 DEBUG: RESP: [200] Content-Type: application/json Content-Length: 15 X-Openstack-Request-Id: req-3fb05f70-b484-4eb5-ab21-49fa785d8911 Date: Tue, 21 Aug 2018 13:32:58 GMT Connection: keep-alive >RESP BODY: {"networks":[]} > >2018-08-21 16:32:58,169 DEBUG: GET call to network for http://192.168.24.1:9696/v2.0/networks?name=ctlplane used request id req-3fb05f70-b484-4eb5-ab21-49fa785d8911 >2018-08-21 16:32:58,169 DEBUG: Manager defaults:unknown ran task network.GET.networks in 6.01469397545s >2018-08-21 16:32:58,172 DEBUG: Manager defaults:unknown running task network.POST.networks >2018-08-21 16:32:58,180 DEBUG: REQ: curl -g -i -X POST http://192.168.24.1:9696/v2.0/networks -H "User-Agent: os-client-config/1.29.0 keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "Content-Type: application/json" -H "X-Auth-Token: {SHA1}70c73a092cdb2fa197ffcf84bd4dc81a0d299e14" -d '{"network": {"mtu": 1500, "name": "ctlplane", "provider:physical_network": "ctlplane", "provider:network_type": "flat"}}' >2018-08-21 16:33:00,639 DEBUG: http://192.168.24.1:9696 "POST /v2.0/networks HTTP/1.1" 201 648 >2018-08-21 16:33:00,642 DEBUG: RESP: [201] Content-Type: application/json Content-Length: 648 X-Openstack-Request-Id: req-dc833551-fecc-44e9-8338-d04263f818df Date: Tue, 21 Aug 2018 13:33:00 GMT Connection: keep-alive >RESP BODY: 
{"network":{"provider:physical_network":"ctlplane","ipv6_address_scope":null,"revision_number":2,"port_security_enabled":true,"provider:network_type":"flat","id":"07975f05-3a42-4c6d-96ca-c7d74dab95c3","router:external":false,"availability_zone_hints":[],"availability_zones":[],"ipv4_address_scope":null,"shared":false,"project_id":"f22c2ea140d3466abe874cd49d41a625","l2_adjacency":true,"status":"ACTIVE","subnets":[],"description":"","tags":[],"updated_at":"2018-08-21T13:32:59Z","provider:segmentation_id":null,"name":"ctlplane","admin_state_up":true,"tenant_id":"f22c2ea140d3466abe874cd49d41a625","created_at":"2018-08-21T13:32:59Z","mtu":1500}} > >2018-08-21 16:33:00,642 DEBUG: POST call to network for http://192.168.24.1:9696/v2.0/networks used request id req-dc833551-fecc-44e9-8338-d04263f818df >2018-08-21 16:33:00,643 DEBUG: Manager defaults:unknown ran task network.POST.networks in 2.47055792809s >2018-08-21 16:33:00,648 INFO: Network created openstack.network.v2.network.Network(provider:physical_network=ctlplane, ipv6_address_scope=None, revision_number=2, port_security_enabled=True, provider:network_type=flat, id=07975f05-3a42-4c6d-96ca-c7d74dab95c3, router:external=False, availability_zone_hints=[], availability_zones=[], ipv4_address_scope=None, shared=False, project_id=f22c2ea140d3466abe874cd49d41a625, status=ACTIVE, subnets=[], description=, tags=[], updated_at=2018-08-21T13:32:59Z, provider:segmentation_id=None, name=ctlplane, admin_state_up=True, created_at=2018-08-21T13:32:59Z, mtu=1500) >2018-08-21 16:33:00,651 DEBUG: Manager defaults:unknown running task network.GET.segments >2018-08-21 16:33:00,658 DEBUG: REQ: curl -g -i -X GET "http://192.168.24.1:9696/v2.0/segments?network_id=07975f05-3a42-4c6d-96ca-c7d74dab95c3" -H "User-Agent: os-client-config/1.29.0 keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "Accept: application/json" -H "X-Auth-Token: {SHA1}70c73a092cdb2fa197ffcf84bd4dc81a0d299e14" >2018-08-21 16:33:00,740 DEBUG: 
http://192.168.24.1:9696 "GET /v2.0/segments?network_id=07975f05-3a42-4c6d-96ca-c7d74dab95c3 HTTP/1.1" 200 232 >2018-08-21 16:33:00,743 DEBUG: RESP: [200] Content-Type: application/json Content-Length: 232 X-Openstack-Request-Id: req-7c15faf0-8bd0-46e0-a4b0-178f27094f8b Date: Tue, 21 Aug 2018 13:33:00 GMT Connection: keep-alive >RESP BODY: {"segments": [{"name": null, "network_id": "07975f05-3a42-4c6d-96ca-c7d74dab95c3", "segmentation_id": null, "network_type": "flat", "physical_network": "ctlplane", "id": "1eaf9316-bb97-4bc5-9d51-1ae30c6477e1", "description": null}]} > >2018-08-21 16:33:00,743 DEBUG: GET call to network for http://192.168.24.1:9696/v2.0/segments?network_id=07975f05-3a42-4c6d-96ca-c7d74dab95c3 used request id req-7c15faf0-8bd0-46e0-a4b0-178f27094f8b >2018-08-21 16:33:00,743 DEBUG: Manager defaults:unknown ran task network.GET.segments in 0.0925400257111s >2018-08-21 16:33:00,747 DEBUG: Manager defaults:unknown running task network.DELETE.segments >2018-08-21 16:33:00,755 DEBUG: REQ: curl -g -i -X DELETE http://192.168.24.1:9696/v2.0/segments/1eaf9316-bb97-4bc5-9d51-1ae30c6477e1 -H "User-Agent: os-client-config/1.29.0 keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "Accept: " -H "X-Auth-Token: {SHA1}70c73a092cdb2fa197ffcf84bd4dc81a0d299e14" >2018-08-21 16:33:02,725 DEBUG: http://192.168.24.1:9696 "DELETE /v2.0/segments/1eaf9316-bb97-4bc5-9d51-1ae30c6477e1 HTTP/1.1" 204 0 >2018-08-21 16:33:02,728 DEBUG: RESP: [204] X-Openstack-Request-Id: req-d522f052-1f69-4ed7-8992-8edb15303ba2 Content-Length: 0 Date: Tue, 21 Aug 2018 13:33:02 GMT Connection: keep-alive >RESP BODY: Omitted, Content-Type is set to None. Only application/json responses have their bodies logged. 
> >2018-08-21 16:33:02,728 DEBUG: DELETE call to network for http://192.168.24.1:9696/v2.0/segments/1eaf9316-bb97-4bc5-9d51-1ae30c6477e1 used request id req-d522f052-1f69-4ed7-8992-8edb15303ba2 >2018-08-21 16:33:02,729 DEBUG: Manager defaults:unknown ran task network.DELETE.segments in 1.98131084442s >2018-08-21 16:33:02,730 INFO: Default segment on network ctlplane deleted. >2018-08-21 16:33:02,732 DEBUG: Manager defaults:unknown running task network.GET.subnets >2018-08-21 16:33:02,740 DEBUG: REQ: curl -g -i -X GET "http://192.168.24.1:9696/v2.0/subnets?network_id=07975f05-3a42-4c6d-96ca-c7d74dab95c3&cidr=192.168.24.0%2F24" -H "User-Agent: os-client-config/1.29.0 keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "Accept: application/json" -H "X-Auth-Token: {SHA1}70c73a092cdb2fa197ffcf84bd4dc81a0d299e14" >2018-08-21 16:33:02,876 DEBUG: http://192.168.24.1:9696 "GET /v2.0/subnets?network_id=07975f05-3a42-4c6d-96ca-c7d74dab95c3&cidr=192.168.24.0%2F24 HTTP/1.1" 200 14 >2018-08-21 16:33:02,879 DEBUG: RESP: [200] Content-Type: application/json Content-Length: 14 X-Openstack-Request-Id: req-085f93b5-8c47-466f-b78f-c5370937a9cf Date: Tue, 21 Aug 2018 13:33:02 GMT Connection: keep-alive >RESP BODY: {"subnets":[]} > >2018-08-21 16:33:02,879 DEBUG: GET call to network for http://192.168.24.1:9696/v2.0/subnets?network_id=07975f05-3a42-4c6d-96ca-c7d74dab95c3&cidr=192.168.24.0%2F24 used request id req-085f93b5-8c47-466f-b78f-c5370937a9cf >2018-08-21 16:33:02,880 DEBUG: Manager defaults:unknown ran task network.GET.subnets in 0.147151947021s >2018-08-21 16:33:02,883 DEBUG: Manager defaults:unknown running task network.GET.subnets >2018-08-21 16:33:02,891 DEBUG: REQ: curl -g -i -X GET "http://192.168.24.1:9696/v2.0/subnets?network_id=07975f05-3a42-4c6d-96ca-c7d74dab95c3&cidr=192.168.24.0%2F24" -H "User-Agent: os-client-config/1.29.0 keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "Accept: application/json" -H "X-Auth-Token: 
{SHA1}70c73a092cdb2fa197ffcf84bd4dc81a0d299e14" >2018-08-21 16:33:03,019 DEBUG: http://192.168.24.1:9696 "GET /v2.0/subnets?network_id=07975f05-3a42-4c6d-96ca-c7d74dab95c3&cidr=192.168.24.0%2F24 HTTP/1.1" 200 14 >2018-08-21 16:33:03,022 DEBUG: RESP: [200] Content-Type: application/json Content-Length: 14 X-Openstack-Request-Id: req-35f57505-3a1f-4815-8c76-144b0bdfb975 Date: Tue, 21 Aug 2018 13:33:03 GMT Connection: keep-alive >RESP BODY: {"subnets":[]} > >2018-08-21 16:33:03,022 DEBUG: GET call to network for http://192.168.24.1:9696/v2.0/subnets?network_id=07975f05-3a42-4c6d-96ca-c7d74dab95c3&cidr=192.168.24.0%2F24 used request id req-35f57505-3a1f-4815-8c76-144b0bdfb975 >2018-08-21 16:33:03,022 DEBUG: Manager defaults:unknown ran task network.GET.subnets in 0.139023065567s >2018-08-21 16:33:03,025 DEBUG: Manager defaults:unknown running task network.GET.segments >2018-08-21 16:33:03,033 DEBUG: REQ: curl -g -i -X GET "http://192.168.24.1:9696/v2.0/segments?network_id=07975f05-3a42-4c6d-96ca-c7d74dab95c3&physical_network=ctlplane" -H "User-Agent: os-client-config/1.29.0 keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "Accept: application/json" -H "X-Auth-Token: {SHA1}70c73a092cdb2fa197ffcf84bd4dc81a0d299e14" >2018-08-21 16:33:03,095 DEBUG: http://192.168.24.1:9696 "GET /v2.0/segments?network_id=07975f05-3a42-4c6d-96ca-c7d74dab95c3&physical_network=ctlplane HTTP/1.1" 200 16 >2018-08-21 16:33:03,098 DEBUG: RESP: [200] Content-Type: application/json Content-Length: 16 X-Openstack-Request-Id: req-117bad92-1acb-4a2d-bd7f-cb9d8537f7e5 Date: Tue, 21 Aug 2018 13:33:03 GMT Connection: keep-alive >RESP BODY: {"segments": []} > >2018-08-21 16:33:03,098 DEBUG: GET call to network for http://192.168.24.1:9696/v2.0/segments?network_id=07975f05-3a42-4c6d-96ca-c7d74dab95c3&physical_network=ctlplane used request id req-117bad92-1acb-4a2d-bd7f-cb9d8537f7e5 >2018-08-21 16:33:03,099 DEBUG: Manager defaults:unknown ran task network.GET.segments in 0.0735459327698s 
>2018-08-21 16:33:03,101 DEBUG: Manager defaults:unknown running task network.POST.segments >2018-08-21 16:33:03,109 DEBUG: REQ: curl -g -i -X POST http://192.168.24.1:9696/v2.0/segments -H "User-Agent: os-client-config/1.29.0 keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "Content-Type: application/json" -H "X-Auth-Token: {SHA1}70c73a092cdb2fa197ffcf84bd4dc81a0d299e14" -d '{"segment": {"network_id": "07975f05-3a42-4c6d-96ca-c7d74dab95c3", "physical_network": "ctlplane", "name": "ctlplane-subnet", "network_type": "flat"}}' >2018-08-21 16:33:05,292 DEBUG: http://192.168.24.1:9696 "POST /v2.0/segments HTTP/1.1" 201 242 >2018-08-21 16:33:05,294 DEBUG: RESP: [201] Content-Type: application/json Content-Length: 242 X-Openstack-Request-Id: req-897a02df-d695-49cd-a0d9-56b74c62aaca Date: Tue, 21 Aug 2018 13:33:05 GMT Connection: keep-alive >RESP BODY: {"segment": {"name": "ctlplane-subnet", "network_id": "07975f05-3a42-4c6d-96ca-c7d74dab95c3", "segmentation_id": null, "network_type": "flat", "physical_network": "ctlplane", "id": "56f9493e-404c-489b-8823-a4b01928076c", "description": null}} > >2018-08-21 16:33:05,295 DEBUG: POST call to network for http://192.168.24.1:9696/v2.0/segments used request id req-897a02df-d695-49cd-a0d9-56b74c62aaca >2018-08-21 16:33:05,295 DEBUG: Manager defaults:unknown ran task network.POST.segments in 2.19403791428s >2018-08-21 16:33:05,297 INFO: Neutron Segment created openstack.network.v2.segment.Segment(name=ctlplane-subnet, network_id=07975f05-3a42-4c6d-96ca-c7d74dab95c3, segmentation_id=None, id=56f9493e-404c-489b-8823-a4b01928076c, physical_network=ctlplane, network_type=flat, description=None) >2018-08-21 16:33:05,300 DEBUG: Manager defaults:unknown running task network.POST.subnets >2018-08-21 16:33:05,308 DEBUG: REQ: curl -g -i -X POST http://192.168.24.1:9696/v2.0/subnets -H "User-Agent: os-client-config/1.29.0 keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "Content-Type: application/json" -H 
"X-Auth-Token: {SHA1}70c73a092cdb2fa197ffcf84bd4dc81a0d299e14" -d '{"subnet": {"name": "ctlplane-subnet", "enable_dhcp": true, "segment_id": null, "network_id": "07975f05-3a42-4c6d-96ca-c7d74dab95c3", "allocation_pools": [{"start": "192.168.24.5", "end": "192.168.24.24"}], "host_routes": [{"nexthop": "192.168.24.1", "destination": "169.254.169.254/32"}], "ip_version": "4", "gateway_ip": "192.168.24.1", "cidr": "192.168.24.0/24"}}' >2018-08-21 16:33:07,583 DEBUG: http://192.168.24.1:9696 "POST /v2.0/subnets HTTP/1.1" 201 704 >2018-08-21 16:33:07,586 DEBUG: RESP: [201] Content-Type: application/json Content-Length: 704 X-Openstack-Request-Id: req-7e13881d-edfb-4b76-b8de-faa1ad0d6b2d Date: Tue, 21 Aug 2018 13:33:07 GMT Connection: keep-alive >RESP BODY: {"subnet":{"updated_at":"2018-08-21T13:33:05Z","ipv6_ra_mode":null,"allocation_pools":[{"start":"192.168.24.5","end":"192.168.24.24"}],"host_routes":[{"destination":"169.254.169.254/32","nexthop":"192.168.24.1"}],"revision_number":0,"ipv6_address_mode":null,"id":"77d19ea0-8870-4629-8118-1fee3d357e40","dns_nameservers":[],"gateway_ip":"192.168.24.1","project_id":"f22c2ea140d3466abe874cd49d41a625","description":"","tags":[],"cidr":"192.168.24.0/24","subnetpool_id":null,"service_types":[],"name":"ctlplane-subnet","enable_dhcp":true,"segment_id":null,"network_id":"07975f05-3a42-4c6d-96ca-c7d74dab95c3","tenant_id":"f22c2ea140d3466abe874cd49d41a625","created_at":"2018-08-21T13:33:05Z","ip_version":4}} > >2018-08-21 16:33:07,587 DEBUG: POST call to network for http://192.168.24.1:9696/v2.0/subnets used request id req-7e13881d-edfb-4b76-b8de-faa1ad0d6b2d >2018-08-21 16:33:07,587 DEBUG: Manager defaults:unknown ran task network.POST.subnets in 2.28641605377s >2018-08-21 16:33:07,591 INFO: Subnet created openstack.network.v2.subnet.Subnet(service_types=[], description=, enable_dhcp=True, tags=[], network_id=07975f05-3a42-4c6d-96ca-c7d74dab95c3, tenant_id=f22c2ea140d3466abe874cd49d41a625, created_at=2018-08-21T13:33:05Z, 
segment_id=None, dns_nameservers=[], updated_at=2018-08-21T13:33:05Z, gateway_ip=192.168.24.1, ipv6_ra_mode=None, allocation_pools=[{u'start': u'192.168.24.5', u'end': u'192.168.24.24'}], host_routes=[{u'nexthop': u'192.168.24.1', u'destination': u'169.254.169.254/32'}], revision_number=0, ip_version=4, ipv6_address_mode=None, cidr=192.168.24.0/24, id=77d19ea0-8870-4629-8118-1fee3d357e40, subnetpool_id=None, name=ctlplane-subnet) >2018-08-21 16:33:07,795 INFO: Generated new ssh key in ~/.ssh/id_rsa >2018-08-21 16:33:07,804 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8774/v2.1/os-keypairs/default -H "User-Agent: python-novaclient" -H "Accept: application/json" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" >2018-08-21 16:33:07,810 DEBUG: Starting new HTTP connection (1): 192.168.24.1 >2018-08-21 16:33:34,629 DEBUG: http://192.168.24.1:8774 "GET /v2.1/os-keypairs/default HTTP/1.1" 404 113 >2018-08-21 16:33:34,634 DEBUG: RESP: [404] Date: Tue, 21 Aug 2018 13:33:07 GMT Server: Apache OpenStack-API-Version: compute 2.1 X-OpenStack-Nova-API-Version: 2.1 Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version x-openstack-request-id: req-7ab3eeb9-598e-4463-a441-c1b7b04b678e x-compute-request-id: req-7ab3eeb9-598e-4463-a441-c1b7b04b678e Content-Length: 113 Keep-Alive: timeout=15, max=100 Connection: Keep-Alive Content-Type: application/json; charset=UTF-8 >RESP BODY: {"itemNotFound": {"message": "Keypair default not found for user 7d372ccb647644fbb126da9ed61a2287", "code": 404}} > >2018-08-21 16:33:34,634 DEBUG: GET call to compute for http://192.168.24.1:8774/v2.1/os-keypairs/default used request id req-7ab3eeb9-598e-4463-a441-c1b7b04b678e >2018-08-21 16:33:34,642 DEBUG: REQ: curl -g -i -X POST http://192.168.24.1:8774/v2.1/os-keypairs -H "User-Agent: python-novaclient" -H "Content-Type: application/json" -H "Accept: application/json" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" -d '{"keypair": {"public_key": "ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABAQDfBf2Ahctb6b62m4DSLCj0HrZwv3OlSJCFr9ClXTBG7Rz08DbB1uf0Rf0o/GbZYWqpqUE5sutymudmk/PLbOydY6n41xP1xEDAOdmgvAqr+x9UYP4stNtu8UepCrlWwar2Mq99qklwdzy3Zg48VXMwyBJI2Fj0fAtXuK13LwU0e0Bvd6e+wc6/MYvIEqLPRrcLiFxvVCfNScAr0ejeFO3JdmWcbtZZutdpdN4FUwH5xcD/cBYEcLyeskUL+AUfzBhWlupc8RI2nluL37MXMiyQhrV0zHrYwJymZedRyGdXJBK11aOtttuqkQHlv43DLHSzgLctwixW5o3biOePKkPv stack@undercloud-0.redhat.local", "name": "default"}}' >2018-08-21 16:34:01,731 DEBUG: http://192.168.24.1:8774 "POST /v2.1/os-keypairs HTTP/1.1" 200 481 >2018-08-21 16:34:01,734 DEBUG: RESP: [200] Date: Tue, 21 Aug 2018 13:33:34 GMT Server: Apache OpenStack-API-Version: compute 2.1 X-OpenStack-Nova-API-Version: 2.1 Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version,Accept-Encoding x-openstack-request-id: req-eb65d847-38d4-43ac-986b-154e5b72a073 x-compute-request-id: req-eb65d847-38d4-43ac-986b-154e5b72a073 Content-Encoding: gzip Content-Length: 481 Keep-Alive: timeout=15, max=99 Connection: Keep-Alive Content-Type: application/json >RESP BODY: {"keypair": {"public_key": "ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDfBf2Ahctb6b62m4DSLCj0HrZwv3OlSJCFr9ClXTBG7Rz08DbB1uf0Rf0o/GbZYWqpqUE5sutymudmk/PLbOydY6n41xP1xEDAOdmgvAqr+x9UYP4stNtu8UepCrlWwar2Mq99qklwdzy3Zg48VXMwyBJI2Fj0fAtXuK13LwU0e0Bvd6e+wc6/MYvIEqLPRrcLiFxvVCfNScAr0ejeFO3JdmWcbtZZutdpdN4FUwH5xcD/cBYEcLyeskUL+AUfzBhWlupc8RI2nluL37MXMiyQhrV0zHrYwJymZedRyGdXJBK11aOtttuqkQHlv43DLHSzgLctwixW5o3biOePKkPv stack@undercloud-0.redhat.local", "user_id": "7d372ccb647644fbb126da9ed61a2287", "name": "default", "fingerprint": "f9:00:37:96:ff:1f:e3:a3:7f:1d:64:a4:0e:ad:75:60"}} > >2018-08-21 16:34:01,735 DEBUG: POST call to compute for http://192.168.24.1:8774/v2.1/os-keypairs used request id req-eb65d847-38d4-43ac-986b-154e5b72a073 >2018-08-21 16:34:01,804 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8774/v2.1/flavors/detail -H "User-Agent: python-novaclient" -H "Accept: application/json" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" 
>2018-08-21 16:34:28,735 DEBUG: http://192.168.24.1:8774 "GET /v2.1/flavors/detail HTTP/1.1" 200 15 >2018-08-21 16:34:28,739 DEBUG: RESP: [200] Date: Tue, 21 Aug 2018 13:34:01 GMT Server: Apache OpenStack-API-Version: compute 2.1 X-OpenStack-Nova-API-Version: 2.1 Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version x-openstack-request-id: req-69b116f1-70da-4a1e-88fc-e592b4080510 x-compute-request-id: req-69b116f1-70da-4a1e-88fc-e592b4080510 Content-Length: 15 Keep-Alive: timeout=15, max=98 Connection: Keep-Alive Content-Type: application/json >RESP BODY: {"flavors": []} > >2018-08-21 16:34:28,740 DEBUG: GET call to compute for http://192.168.24.1:8774/v2.1/flavors/detail used request id req-69b116f1-70da-4a1e-88fc-e592b4080510 >2018-08-21 16:34:28,743 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:6385/v1/nodes/?fields=uuid,resource_class -H "X-OpenStack-Ironic-API-Version: 1.21" -H "User-Agent: python-ironicclient" -H "Content-Type: application/json" -H "Accept: application/json" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" >2018-08-21 16:34:28,748 DEBUG: Starting new HTTP connection (1): 192.168.24.1 >2018-08-21 16:34:29,731 DEBUG: http://192.168.24.1:6385 "GET /v1/nodes/?fields=uuid,resource_class HTTP/1.1" 200 13 >2018-08-21 16:34:29,734 DEBUG: RESP: [200] Date: Tue, 21 Aug 2018 13:34:28 GMT Server: Apache X-OpenStack-Ironic-API-Minimum-Version: 1.1 X-OpenStack-Ironic-API-Maximum-Version: 1.38 X-OpenStack-Ironic-API-Version: 1.21 Openstack-Request-Id: req-2f7c37e8-7bde-4554-b13b-ce5efb597f16 Content-Length: 13 Keep-Alive: timeout=15, max=100 Connection: Keep-Alive Content-Type: application/json >RESP BODY: {"nodes": []} > >2018-08-21 16:34:29,740 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8774/v2.1/flavors/detail -H "User-Agent: python-novaclient" -H "Accept: application/json" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" >2018-08-21 16:34:30,628 DEBUG: http://192.168.24.1:8774 "GET /v2.1/flavors/detail 
HTTP/1.1" 200 15 >2018-08-21 16:34:30,631 DEBUG: RESP: [200] Date: Tue, 21 Aug 2018 13:34:29 GMT Server: Apache OpenStack-API-Version: compute 2.1 X-OpenStack-Nova-API-Version: 2.1 Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version x-openstack-request-id: req-7a7c1635-d554-47eb-bce2-87d62537c799 x-compute-request-id: req-7a7c1635-d554-47eb-bce2-87d62537c799 Content-Length: 15 Keep-Alive: timeout=15, max=97 Connection: Keep-Alive Content-Type: application/json >RESP BODY: {"flavors": []} > >2018-08-21 16:34:30,631 DEBUG: GET call to compute for http://192.168.24.1:8774/v2.1/flavors/detail used request id req-7a7c1635-d554-47eb-bce2-87d62537c799 >2018-08-21 16:34:30,639 DEBUG: REQ: curl -g -i -X POST http://192.168.24.1:8774/v2.1/flavors -H "User-Agent: python-novaclient" -H "Content-Type: application/json" -H "Accept: application/json" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" -d '{"flavor": {"vcpus": 1, "disk": 40, "name": "baremetal", "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "OS-FLV-EXT-DATA:ephemeral": 0, "ram": 4096, "id": null, "swap": 0}}' >2018-08-21 16:34:57,637 DEBUG: http://192.168.24.1:8774 "POST /v2.1/flavors HTTP/1.1" 200 281 >2018-08-21 16:34:57,640 DEBUG: RESP: [200] Date: Tue, 21 Aug 2018 13:34:30 GMT Server: Apache OpenStack-API-Version: compute 2.1 X-OpenStack-Nova-API-Version: 2.1 Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version,Accept-Encoding x-openstack-request-id: req-e0b01c56-f54e-4274-9e0e-34092f2bdd56 x-compute-request-id: req-e0b01c56-f54e-4274-9e0e-34092f2bdd56 Content-Encoding: gzip Content-Length: 281 Keep-Alive: timeout=15, max=96 Connection: Keep-Alive Content-Type: application/json >RESP BODY: {"flavor": {"name": "baremetal", "links": [{"href": "http://192.168.24.1:8774/v2.1/flavors/63169d21-76a3-485e-be45-71c185aaa801", "rel": "self"}, {"href": "http://192.168.24.1:8774/flavors/63169d21-76a3-485e-be45-71c185aaa801", "rel": "bookmark"}], "ram": 4096, "OS-FLV-DISABLED:disabled": 
false, "vcpus": 1, "swap": "", "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "OS-FLV-EXT-DATA:ephemeral": 0, "disk": 40, "id": "63169d21-76a3-485e-be45-71c185aaa801"}} > >2018-08-21 16:34:57,640 DEBUG: POST call to compute for http://192.168.24.1:8774/v2.1/flavors used request id req-e0b01c56-f54e-4274-9e0e-34092f2bdd56 >2018-08-21 16:34:57,648 DEBUG: REQ: curl -g -i -X POST http://192.168.24.1:8774/v2.1/flavors/63169d21-76a3-485e-be45-71c185aaa801/os-extra_specs -H "User-Agent: python-novaclient" -H "Content-Type: application/json" -H "Accept: application/json" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" -d '{"extra_specs": {"resources:CUSTOM_BAREMETAL": "1", "resources:DISK_GB": "0", "capabilities:boot_option": "local", "resources:MEMORY_MB": "0", "resources:VCPU": "0"}}' >2018-08-21 16:34:57,938 DEBUG: http://192.168.24.1:8774 "POST /v2.1/flavors/63169d21-76a3-485e-be45-71c185aaa801/os-extra_specs HTTP/1.1" 200 136 >2018-08-21 16:34:57,941 DEBUG: RESP: [200] Date: Tue, 21 Aug 2018 13:34:57 GMT Server: Apache OpenStack-API-Version: compute 2.1 X-OpenStack-Nova-API-Version: 2.1 Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version,Accept-Encoding x-openstack-request-id: req-77b0956b-3938-4c17-adc8-7113dd773754 x-compute-request-id: req-77b0956b-3938-4c17-adc8-7113dd773754 Content-Encoding: gzip Content-Length: 136 Keep-Alive: timeout=15, max=95 Connection: Keep-Alive Content-Type: application/json >RESP BODY: {"extra_specs": {"resources:CUSTOM_BAREMETAL": "1", "resources:DISK_GB": "0", "capabilities:boot_option": "local", "resources:MEMORY_MB": "0", "resources:VCPU": "0"}} > >2018-08-21 16:34:57,941 DEBUG: POST call to compute for http://192.168.24.1:8774/v2.1/flavors/63169d21-76a3-485e-be45-71c185aaa801/os-extra_specs used request id req-77b0956b-3938-4c17-adc8-7113dd773754 >2018-08-21 16:34:57,942 INFO: Created flavor "baremetal" with profile "None" >2018-08-21 16:34:57,949 DEBUG: REQ: curl -g -i -X POST 
http://192.168.24.1:8774/v2.1/flavors -H "User-Agent: python-novaclient" -H "Content-Type: application/json" -H "Accept: application/json" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" -d '{"flavor": {"vcpus": 1, "disk": 40, "name": "control", "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "OS-FLV-EXT-DATA:ephemeral": 0, "ram": 4096, "id": null, "swap": 0}}' >2018-08-21 16:35:24,941 DEBUG: http://192.168.24.1:8774 "POST /v2.1/flavors HTTP/1.1" 200 281 >2018-08-21 16:35:24,944 DEBUG: RESP: [200] Date: Tue, 21 Aug 2018 13:34:57 GMT Server: Apache OpenStack-API-Version: compute 2.1 X-OpenStack-Nova-API-Version: 2.1 Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version,Accept-Encoding x-openstack-request-id: req-5757231e-c32d-4526-bc45-01453cb2015f x-compute-request-id: req-5757231e-c32d-4526-bc45-01453cb2015f Content-Encoding: gzip Content-Length: 281 Keep-Alive: timeout=15, max=94 Connection: Keep-Alive Content-Type: application/json >RESP BODY: {"flavor": {"name": "control", "links": [{"href": "http://192.168.24.1:8774/v2.1/flavors/b64d0669-bb50-4964-b817-6a4712bf41e8", "rel": "self"}, {"href": "http://192.168.24.1:8774/flavors/b64d0669-bb50-4964-b817-6a4712bf41e8", "rel": "bookmark"}], "ram": 4096, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "swap": "", "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "OS-FLV-EXT-DATA:ephemeral": 0, "disk": 40, "id": "b64d0669-bb50-4964-b817-6a4712bf41e8"}} > >2018-08-21 16:35:24,944 DEBUG: POST call to compute for http://192.168.24.1:8774/v2.1/flavors used request id req-5757231e-c32d-4526-bc45-01453cb2015f >2018-08-21 16:35:24,952 DEBUG: REQ: curl -g -i -X POST http://192.168.24.1:8774/v2.1/flavors/b64d0669-bb50-4964-b817-6a4712bf41e8/os-extra_specs -H "User-Agent: python-novaclient" -H "Content-Type: application/json" -H "Accept: application/json" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" -d '{"extra_specs": {"capabilities:boot_option": "local", 
"capabilities:profile": "control", "resources:MEMORY_MB": "0", "resources:VCPU": "0", "resources:CUSTOM_BAREMETAL": "1", "resources:DISK_GB": "0"}}' >2018-08-21 16:35:25,226 DEBUG: http://192.168.24.1:8774 "POST /v2.1/flavors/b64d0669-bb50-4964-b817-6a4712bf41e8/os-extra_specs HTTP/1.1" 200 149 >2018-08-21 16:35:25,229 DEBUG: RESP: [200] Date: Tue, 21 Aug 2018 13:35:24 GMT Server: Apache OpenStack-API-Version: compute 2.1 X-OpenStack-Nova-API-Version: 2.1 Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version,Accept-Encoding x-openstack-request-id: req-1b11e041-0c0b-418b-9b67-cc0b6d68efdd x-compute-request-id: req-1b11e041-0c0b-418b-9b67-cc0b6d68efdd Content-Encoding: gzip Content-Length: 149 Keep-Alive: timeout=15, max=93 Connection: Keep-Alive Content-Type: application/json >RESP BODY: {"extra_specs": {"capabilities:boot_option": "local", "capabilities:profile": "control", "resources:MEMORY_MB": "0", "resources:DISK_GB": "0", "resources:CUSTOM_BAREMETAL": "1", "resources:VCPU": "0"}} > >2018-08-21 16:35:25,229 DEBUG: POST call to compute for http://192.168.24.1:8774/v2.1/flavors/b64d0669-bb50-4964-b817-6a4712bf41e8/os-extra_specs used request id req-1b11e041-0c0b-418b-9b67-cc0b6d68efdd >2018-08-21 16:35:25,230 INFO: Created flavor "control" with profile "control" >2018-08-21 16:35:25,237 DEBUG: REQ: curl -g -i -X POST http://192.168.24.1:8774/v2.1/flavors -H "User-Agent: python-novaclient" -H "Content-Type: application/json" -H "Accept: application/json" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" -d '{"flavor": {"vcpus": 1, "disk": 40, "name": "compute", "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "OS-FLV-EXT-DATA:ephemeral": 0, "ram": 4096, "id": null, "swap": 0}}' >2018-08-21 16:35:25,397 DEBUG: http://192.168.24.1:8774 "POST /v2.1/flavors HTTP/1.1" 200 281 >2018-08-21 16:35:25,401 DEBUG: RESP: [200] Date: Tue, 21 Aug 2018 13:35:25 GMT Server: Apache OpenStack-API-Version: compute 2.1 X-OpenStack-Nova-API-Version: 2.1 Vary: 
OpenStack-API-Version,X-OpenStack-Nova-API-Version,Accept-Encoding x-openstack-request-id: req-7a6d1f6e-b1cd-42a0-8632-e8eb46c74778 x-compute-request-id: req-7a6d1f6e-b1cd-42a0-8632-e8eb46c74778 Content-Encoding: gzip Content-Length: 281 Keep-Alive: timeout=15, max=92 Connection: Keep-Alive Content-Type: application/json >RESP BODY: {"flavor": {"name": "compute", "links": [{"href": "http://192.168.24.1:8774/v2.1/flavors/f752fccd-8321-4fdd-bfd2-6c01df812902", "rel": "self"}, {"href": "http://192.168.24.1:8774/flavors/f752fccd-8321-4fdd-bfd2-6c01df812902", "rel": "bookmark"}], "ram": 4096, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "swap": "", "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "OS-FLV-EXT-DATA:ephemeral": 0, "disk": 40, "id": "f752fccd-8321-4fdd-bfd2-6c01df812902"}} > >2018-08-21 16:35:25,402 DEBUG: POST call to compute for http://192.168.24.1:8774/v2.1/flavors used request id req-7a6d1f6e-b1cd-42a0-8632-e8eb46c74778 >2018-08-21 16:35:25,417 DEBUG: REQ: curl -g -i -X POST http://192.168.24.1:8774/v2.1/flavors/f752fccd-8321-4fdd-bfd2-6c01df812902/os-extra_specs -H "User-Agent: python-novaclient" -H "Content-Type: application/json" -H "Accept: application/json" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" -d '{"extra_specs": {"capabilities:boot_option": "local", "capabilities:profile": "compute", "resources:MEMORY_MB": "0", "resources:VCPU": "0", "resources:CUSTOM_BAREMETAL": "1", "resources:DISK_GB": "0"}}' >2018-08-21 16:35:25,653 DEBUG: http://192.168.24.1:8774 "POST /v2.1/flavors/f752fccd-8321-4fdd-bfd2-6c01df812902/os-extra_specs HTTP/1.1" 200 150 >2018-08-21 16:35:25,657 DEBUG: RESP: [200] Date: Tue, 21 Aug 2018 13:35:25 GMT Server: Apache OpenStack-API-Version: compute 2.1 X-OpenStack-Nova-API-Version: 2.1 Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version,Accept-Encoding x-openstack-request-id: req-804d03b8-b8b3-4f69-81df-978504090379 x-compute-request-id: req-804d03b8-b8b3-4f69-81df-978504090379 
Content-Encoding: gzip Content-Length: 150 Keep-Alive: timeout=15, max=91 Connection: Keep-Alive Content-Type: application/json >RESP BODY: {"extra_specs": {"capabilities:boot_option": "local", "capabilities:profile": "compute", "resources:MEMORY_MB": "0", "resources:DISK_GB": "0", "resources:CUSTOM_BAREMETAL": "1", "resources:VCPU": "0"}} > >2018-08-21 16:35:25,658 DEBUG: POST call to compute for http://192.168.24.1:8774/v2.1/flavors/f752fccd-8321-4fdd-bfd2-6c01df812902/os-extra_specs used request id req-804d03b8-b8b3-4f69-81df-978504090379 >2018-08-21 16:35:25,659 INFO: Created flavor "compute" with profile "compute" >2018-08-21 16:35:25,668 DEBUG: REQ: curl -g -i -X POST http://192.168.24.1:8774/v2.1/flavors -H "User-Agent: python-novaclient" -H "Content-Type: application/json" -H "Accept: application/json" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" -d '{"flavor": {"vcpus": 1, "disk": 40, "name": "ceph-storage", "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "OS-FLV-EXT-DATA:ephemeral": 0, "ram": 4096, "id": null, "swap": 0}}' >2018-08-21 16:35:25,800 DEBUG: http://192.168.24.1:8774 "POST /v2.1/flavors HTTP/1.1" 200 285 >2018-08-21 16:35:25,803 DEBUG: RESP: [200] Date: Tue, 21 Aug 2018 13:35:25 GMT Server: Apache OpenStack-API-Version: compute 2.1 X-OpenStack-Nova-API-Version: 2.1 Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version,Accept-Encoding x-openstack-request-id: req-a3c8f8fc-870a-4f09-8caa-c2c06d2774c6 x-compute-request-id: req-a3c8f8fc-870a-4f09-8caa-c2c06d2774c6 Content-Encoding: gzip Content-Length: 285 Keep-Alive: timeout=15, max=90 Connection: Keep-Alive Content-Type: application/json >RESP BODY: {"flavor": {"name": "ceph-storage", "links": [{"href": "http://192.168.24.1:8774/v2.1/flavors/df28e33c-aaff-4b8d-a441-205ed5ac7895", "rel": "self"}, {"href": "http://192.168.24.1:8774/flavors/df28e33c-aaff-4b8d-a441-205ed5ac7895", "rel": "bookmark"}], "ram": 4096, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "swap": 
"", "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "OS-FLV-EXT-DATA:ephemeral": 0, "disk": 40, "id": "df28e33c-aaff-4b8d-a441-205ed5ac7895"}} > >2018-08-21 16:35:25,804 DEBUG: POST call to compute for http://192.168.24.1:8774/v2.1/flavors used request id req-a3c8f8fc-870a-4f09-8caa-c2c06d2774c6 >2018-08-21 16:35:25,812 DEBUG: REQ: curl -g -i -X POST http://192.168.24.1:8774/v2.1/flavors/df28e33c-aaff-4b8d-a441-205ed5ac7895/os-extra_specs -H "User-Agent: python-novaclient" -H "Content-Type: application/json" -H "Accept: application/json" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" -d '{"extra_specs": {"capabilities:boot_option": "local", "capabilities:profile": "ceph-storage", "resources:MEMORY_MB": "0", "resources:VCPU": "0", "resources:CUSTOM_BAREMETAL": "1", "resources:DISK_GB": "0"}}' >2018-08-21 16:35:26,030 DEBUG: http://192.168.24.1:8774 "POST /v2.1/flavors/df28e33c-aaff-4b8d-a441-205ed5ac7895/os-extra_specs HTTP/1.1" 200 153 >2018-08-21 16:35:26,034 DEBUG: RESP: [200] Date: Tue, 21 Aug 2018 13:35:25 GMT Server: Apache OpenStack-API-Version: compute 2.1 X-OpenStack-Nova-API-Version: 2.1 Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version,Accept-Encoding x-openstack-request-id: req-e14558d9-c12d-4888-bf82-da598c669df2 x-compute-request-id: req-e14558d9-c12d-4888-bf82-da598c669df2 Content-Encoding: gzip Content-Length: 153 Keep-Alive: timeout=15, max=89 Connection: Keep-Alive Content-Type: application/json >RESP BODY: {"extra_specs": {"capabilities:boot_option": "local", "capabilities:profile": "ceph-storage", "resources:MEMORY_MB": "0", "resources:DISK_GB": "0", "resources:CUSTOM_BAREMETAL": "1", "resources:VCPU": "0"}} > >2018-08-21 16:35:26,034 DEBUG: POST call to compute for http://192.168.24.1:8774/v2.1/flavors/df28e33c-aaff-4b8d-a441-205ed5ac7895/os-extra_specs used request id req-e14558d9-c12d-4888-bf82-da598c669df2 >2018-08-21 16:35:26,035 INFO: Created flavor "ceph-storage" with profile "ceph-storage" >2018-08-21 
16:35:26,042 DEBUG: REQ: curl -g -i -X POST http://192.168.24.1:8774/v2.1/flavors -H "User-Agent: python-novaclient" -H "Content-Type: application/json" -H "Accept: application/json" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" -d '{"flavor": {"vcpus": 1, "disk": 40, "name": "block-storage", "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "OS-FLV-EXT-DATA:ephemeral": 0, "ram": 4096, "id": null, "swap": 0}}' >2018-08-21 16:35:26,326 DEBUG: http://192.168.24.1:8774 "POST /v2.1/flavors HTTP/1.1" 200 286 >2018-08-21 16:35:26,329 DEBUG: RESP: [200] Date: Tue, 21 Aug 2018 13:35:26 GMT Server: Apache OpenStack-API-Version: compute 2.1 X-OpenStack-Nova-API-Version: 2.1 Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version,Accept-Encoding x-openstack-request-id: req-94498751-3308-4a15-a07c-fbad391ec097 x-compute-request-id: req-94498751-3308-4a15-a07c-fbad391ec097 Content-Encoding: gzip Content-Length: 286 Keep-Alive: timeout=15, max=88 Connection: Keep-Alive Content-Type: application/json >RESP BODY: {"flavor": {"name": "block-storage", "links": [{"href": "http://192.168.24.1:8774/v2.1/flavors/9621c52d-c217-4713-b665-41d2c3b08501", "rel": "self"}, {"href": "http://192.168.24.1:8774/flavors/9621c52d-c217-4713-b665-41d2c3b08501", "rel": "bookmark"}], "ram": 4096, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "swap": "", "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "OS-FLV-EXT-DATA:ephemeral": 0, "disk": 40, "id": "9621c52d-c217-4713-b665-41d2c3b08501"}} > >2018-08-21 16:35:26,329 DEBUG: POST call to compute for http://192.168.24.1:8774/v2.1/flavors used request id req-94498751-3308-4a15-a07c-fbad391ec097 >2018-08-21 16:35:26,337 DEBUG: REQ: curl -g -i -X POST http://192.168.24.1:8774/v2.1/flavors/9621c52d-c217-4713-b665-41d2c3b08501/os-extra_specs -H "User-Agent: python-novaclient" -H "Content-Type: application/json" -H "Accept: application/json" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" -d '{"extra_specs": 
{"capabilities:boot_option": "local", "capabilities:profile": "block-storage", "resources:MEMORY_MB": "0", "resources:VCPU": "0", "resources:CUSTOM_BAREMETAL": "1", "resources:DISK_GB": "0"}}' >2018-08-21 16:35:26,597 DEBUG: http://192.168.24.1:8774 "POST /v2.1/flavors/9621c52d-c217-4713-b665-41d2c3b08501/os-extra_specs HTTP/1.1" 200 154 >2018-08-21 16:35:26,600 DEBUG: RESP: [200] Date: Tue, 21 Aug 2018 13:35:26 GMT Server: Apache OpenStack-API-Version: compute 2.1 X-OpenStack-Nova-API-Version: 2.1 Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version,Accept-Encoding x-openstack-request-id: req-2c0d83c8-a979-48e8-8eb1-7d7e4561b2f5 x-compute-request-id: req-2c0d83c8-a979-48e8-8eb1-7d7e4561b2f5 Content-Encoding: gzip Content-Length: 154 Keep-Alive: timeout=15, max=87 Connection: Keep-Alive Content-Type: application/json >RESP BODY: {"extra_specs": {"capabilities:boot_option": "local", "capabilities:profile": "block-storage", "resources:MEMORY_MB": "0", "resources:DISK_GB": "0", "resources:CUSTOM_BAREMETAL": "1", "resources:VCPU": "0"}} > >2018-08-21 16:35:26,601 DEBUG: POST call to compute for http://192.168.24.1:8774/v2.1/flavors/9621c52d-c217-4713-b665-41d2c3b08501/os-extra_specs used request id req-2c0d83c8-a979-48e8-8eb1-7d7e4561b2f5 >2018-08-21 16:35:26,601 INFO: Created flavor "block-storage" with profile "block-storage" >2018-08-21 16:35:26,608 DEBUG: REQ: curl -g -i -X POST http://192.168.24.1:8774/v2.1/flavors -H "User-Agent: python-novaclient" -H "Content-Type: application/json" -H "Accept: application/json" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" -d '{"flavor": {"vcpus": 1, "disk": 40, "name": "swift-storage", "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "OS-FLV-EXT-DATA:ephemeral": 0, "ram": 4096, "id": null, "swap": 0}}' >2018-08-21 16:35:26,879 DEBUG: http://192.168.24.1:8774 "POST /v2.1/flavors HTTP/1.1" 200 284 >2018-08-21 16:35:26,882 DEBUG: RESP: [200] Date: Tue, 21 Aug 2018 13:35:26 GMT Server: Apache 
OpenStack-API-Version: compute 2.1 X-OpenStack-Nova-API-Version: 2.1 Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version,Accept-Encoding x-openstack-request-id: req-22545331-ab94-4240-9392-fa37b2639ddf x-compute-request-id: req-22545331-ab94-4240-9392-fa37b2639ddf Content-Encoding: gzip Content-Length: 284 Keep-Alive: timeout=15, max=86 Connection: Keep-Alive Content-Type: application/json >RESP BODY: {"flavor": {"name": "swift-storage", "links": [{"href": "http://192.168.24.1:8774/v2.1/flavors/3f3f4be1-44df-4926-8de3-161e068ba3ec", "rel": "self"}, {"href": "http://192.168.24.1:8774/flavors/3f3f4be1-44df-4926-8de3-161e068ba3ec", "rel": "bookmark"}], "ram": 4096, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "swap": "", "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "OS-FLV-EXT-DATA:ephemeral": 0, "disk": 40, "id": "3f3f4be1-44df-4926-8de3-161e068ba3ec"}} > >2018-08-21 16:35:26,883 DEBUG: POST call to compute for http://192.168.24.1:8774/v2.1/flavors used request id req-22545331-ab94-4240-9392-fa37b2639ddf >2018-08-21 16:35:26,891 DEBUG: REQ: curl -g -i -X POST http://192.168.24.1:8774/v2.1/flavors/3f3f4be1-44df-4926-8de3-161e068ba3ec/os-extra_specs -H "User-Agent: python-novaclient" -H "Content-Type: application/json" -H "Accept: application/json" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" -d '{"extra_specs": {"capabilities:boot_option": "local", "capabilities:profile": "swift-storage", "resources:MEMORY_MB": "0", "resources:VCPU": "0", "resources:CUSTOM_BAREMETAL": "1", "resources:DISK_GB": "0"}}' >2018-08-21 16:35:27,105 DEBUG: http://192.168.24.1:8774 "POST /v2.1/flavors/3f3f4be1-44df-4926-8de3-161e068ba3ec/os-extra_specs HTTP/1.1" 200 154 >2018-08-21 16:35:27,108 DEBUG: RESP: [200] Date: Tue, 21 Aug 2018 13:35:26 GMT Server: Apache OpenStack-API-Version: compute 2.1 X-OpenStack-Nova-API-Version: 2.1 Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version,Accept-Encoding x-openstack-request-id: 
req-c6c4cb3a-00b4-4012-8800-be6e7b470bac x-compute-request-id: req-c6c4cb3a-00b4-4012-8800-be6e7b470bac Content-Encoding: gzip Content-Length: 154 Keep-Alive: timeout=15, max=85 Connection: Keep-Alive Content-Type: application/json >RESP BODY: {"extra_specs": {"capabilities:boot_option": "local", "capabilities:profile": "swift-storage", "resources:MEMORY_MB": "0", "resources:DISK_GB": "0", "resources:CUSTOM_BAREMETAL": "1", "resources:VCPU": "0"}} > >2018-08-21 16:35:27,108 DEBUG: POST call to compute for http://192.168.24.1:8774/v2.1/flavors/3f3f4be1-44df-4926-8de3-161e068ba3ec/os-extra_specs used request id req-c6c4cb3a-00b4-4012-8800-be6e7b470bac >2018-08-21 16:35:27,109 INFO: Created flavor "swift-storage" with profile "swift-storage" >2018-08-21 16:35:27,111 INFO: Configuring Mistral workbooks >2018-08-21 16:35:27,112 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/workbooks -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" >2018-08-21 16:35:27,117 DEBUG: Starting new HTTP connection (1): 192.168.24.1 >2018-08-21 16:35:30,160 DEBUG: http://192.168.24.1:8989 "GET /v2/workbooks HTTP/1.1" 200 17 >2018-08-21 16:35:30,163 DEBUG: RESP: [200] Content-Length: 17 Content-Type: application/json Date: Tue, 21 Aug 2018 13:35:30 GMT Connection: keep-alive >RESP BODY: {"workbooks": []} > >2018-08-21 16:35:30,164 DEBUG: HTTP GET http://192.168.24.1:8989/v2/workbooks 200 >2018-08-21 16:35:30,165 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/workflows -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" >2018-08-21 16:35:30,225 DEBUG: http://192.168.24.1:8989 "GET /v2/workflows HTTP/1.1" 200 3608 >2018-08-21 16:35:30,229 DEBUG: RESP: [200] Content-Length: 3608 Content-Type: application/json Date: Tue, 21 Aug 2018 13:35:30 GMT Connection: keep-alive >RESP BODY: {"workflows": 
[{"definition": "---\nversion: \"2.0\"\n\nstd.delete_instance:\n type: direct\n\n input:\n - instance_id\n\n description: Deletes VM.\n\n tasks:\n delete_vm:\n description: Destroy VM.\n action: nova.servers_delete server=<% $.instance_id %>\n wait-after: 10\n on-success:\n - find_given_vm\n\n find_given_vm:\n description: Checks that VM is already deleted.\n action: nova.servers_find id=<% $.instance_id %>\n on-error:\n - succeed\n\n", "name": "std.delete_instance", "tags": [], "created_at": "2018-08-21 13:29:39", "namespace": "", "updated_at": null, "scope": "public", "input": "instance_id", "project_id": "<default-project>", "id": "623a385b-8f98-4f6d-ac0f-b47740f17762"}, {"definition": "---\nversion: '2.0'\n\nstd.create_instance:\n type: direct\n\n description: |\n Creates VM and waits till VM OS is up and running.\n\n input:\n - name\n - image_id\n - flavor_id\n - ssh_username: null\n - ssh_password: null\n\n # Name of previously created keypair to inject into the instance.\n # Either ssh credentials or keypair must be provided.\n - key_name: null\n\n # Security_groups: A list of security group names\n - security_groups: null\n\n # An ordered list of nics to be added to this server, with information about connected networks, fixed IPs, port etc.\n # Example: nics: [{\"net-id\": \"27aa8c1c-d6b8-4474-b7f7-6cdcf63ac856\"}]\n - nics: null\n\n task-defaults:\n on-error:\n - delete_vm\n\n output:\n ip: <% $.vm_ip %>\n id: <% $.vm_id %>\n name: <% $.name %>\n status: <% $.status %>\n\n tasks:\n create_vm:\n description: Initial request to create a VM.\n action: nova.servers_create name=<% $.name %> image=<% $.image_id %> flavor=<% $.flavor_id %>\n input:\n key_name: <% $.key_name %>\n security_groups: <% $.security_groups %>\n nics: <% $.nics %>\n publish:\n vm_id: <% task(create_vm).result.id %>\n on-success:\n - search_for_ip\n\n search_for_ip:\n description: Gets first free ip from Nova floating IPs.\n action: nova.floating_ips_findall instance_id=null\n publish:\n 
vm_ip: <% task(search_for_ip).result[0].ip %>\n on-success:\n - wait_vm_active\n\n wait_vm_active:\n description: Waits till VM is ACTIVE.\n action: nova.servers_find id=<% $.vm_id %> status=\"ACTIVE\"\n retry:\n count: 10\n delay: 10\n publish:\n status: <% task(wait_vm_active).result.status %>\n on-success:\n - associate_ip\n\n associate_ip:\n description: Associate server with one of floating IPs.\n action: nova.servers_add_floating_ip server=<% $.vm_id %> address=<% $.vm_ip %>\n wait-after: 5\n on-success:\n - wait_ssh\n\n wait_ssh:\n description: Wait till operating system on the VM is up (SSH command).\n action: std.wait_ssh username=<% $.ssh_username %> password=<% $.ssh_password %> host=<% $.vm_ip %>\n retry:\n count: 10\n delay: 10\n\n delete_vm:\n description: Destroy VM.\n workflow: std.delete_instance instance_id=<% $.vm_id %>\n on-complete:\n - fail\n", "name": "std.create_instance", "tags": [], "created_at": "2018-08-21 13:29:39", "namespace": "", "updated_at": null, "scope": "public", "input": "name, image_id, flavor_id, ssh_username=None, ssh_password=None, key_name=None, security_groups=None, nics=None", "project_id": "<default-project>", "id": "df647eca-7569-4a79-9d07-9b3c568c05bb"}]} > >2018-08-21 16:35:30,230 DEBUG: HTTP GET http://192.168.24.1:8989/v2/workflows 200 >2018-08-21 16:35:30,232 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/cron_triggers -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" >2018-08-21 16:35:30,300 DEBUG: http://192.168.24.1:8989 "GET /v2/cron_triggers HTTP/1.1" 200 21 >2018-08-21 16:35:30,302 DEBUG: RESP: [200] Content-Length: 21 Content-Type: application/json Date: Tue, 21 Aug 2018 13:35:30 GMT Connection: keep-alive >RESP BODY: {"cron_triggers": []} > >2018-08-21 16:35:30,303 DEBUG: HTTP GET http://192.168.24.1:8989/v2/cron_triggers 200 >2018-08-21 16:35:30,335 DEBUG: REQ: curl -g -i -X POST 
http://192.168.24.1:8989/v2/workbooks -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "content-type: text/plain" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" -d '--- >version: '2.0' >name: tripleo.swift_ring.v1 >description: Rebalance and distribute Swift rings using Ansible > > >workflows: > rebalance: > tags: > - tripleo-common-managed > > tasks: > get_private_key: > action: tripleo.validations.get_privkey > on-success: deploy_rings > > deploy_rings: > action: tripleo.ansible-playbook > publish: > output: <% task().result %> > input: > ssh_private_key: <% task(get_private_key).result %> > ssh_common_args: '-o StrictHostKeyChecking=no' > ssh_extra_args: '-o UserKnownHostsFile=/dev/null' > verbosity: 1 > remote_user: heat-admin > become: true > become_user: root > playbook: /usr/share/tripleo-common/playbooks/swift_ring_rebalance.yaml > inventory: /usr/bin/tripleo-ansible-inventory > use_openstack_credentials: true >' >2018-08-21 16:35:30,848 DEBUG: http://192.168.24.1:8989 "POST /v2/workbooks HTTP/1.1" 201 1140 >2018-08-21 16:35:30,851 DEBUG: RESP: [201] Content-Length: 1140 Content-Type: application/json Date: Tue, 21 Aug 2018 13:35:30 GMT Connection: keep-alive >RESP BODY: {"definition": "---\nversion: '2.0'\nname: tripleo.swift_ring.v1\ndescription: Rebalance and distribute Swift rings using Ansible\n\n\nworkflows:\n rebalance:\n tags:\n - tripleo-common-managed\n\n tasks:\n get_private_key:\n action: tripleo.validations.get_privkey\n on-success: deploy_rings\n\n deploy_rings:\n action: tripleo.ansible-playbook\n publish:\n output: <% task().result %>\n input:\n ssh_private_key: <% task(get_private_key).result %>\n ssh_common_args: '-o StrictHostKeyChecking=no'\n ssh_extra_args: '-o UserKnownHostsFile=/dev/null'\n verbosity: 1\n remote_user: heat-admin\n become: true\n become_user: root\n playbook: /usr/share/tripleo-common/playbooks/swift_ring_rebalance.yaml\n inventory: /usr/bin/tripleo-ansible-inventory\n 
use_openstack_credentials: true\n", "name": "tripleo.swift_ring.v1", "tags": [], "created_at": "2018-08-21 13:35:30", "scope": "private", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "84cf4898-fa4e-49f6-b801-7c0c3694d0b5"} > >2018-08-21 16:35:30,851 DEBUG: HTTP POST http://192.168.24.1:8989/v2/workbooks 201 >2018-08-21 16:35:30,854 DEBUG: REQ: curl -g -i -X POST http://192.168.24.1:8989/v2/workbooks -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "content-type: text/plain" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" -d '--- >version: '2.0' >name: tripleo.stack.v1 >description: TripleO Stack Workflows > >workflows: > > wait_for_stack_complete_or_failed: > input: > - stack > - timeout: 14400 # 4 hours. Default timeout of stack deployment > > tags: > - tripleo-common-managed > > tasks: > > wait_for_stack_status: > action: heat.stacks_get stack_id=<% $.stack %> > timeout: <% $.timeout %> > retry: > delay: 15 > count: <% $.timeout / 15 %> > continue-on: <% task().result.stack_status in ['CREATE_IN_PROGRESS', 'UPDATE_IN_PROGRESS', 'DELETE_IN_PROGRESS'] %> > > wait_for_stack_in_progress: > input: > - stack > - timeout: 600 # 10 minutes. 
Should not take much longer for a stack to transition to IN_PROGRESS > > tags: > - tripleo-common-managed > > tasks: > > wait_for_stack_status: > action: heat.stacks_get stack_id=<% $.stack %> > timeout: <% $.timeout %> > retry: > delay: 15 > count: <% $.timeout / 15 %> > continue-on: <% task().result.stack_status in ['CREATE_COMPLETE', 'CREATE_FAILED', 'UPDATE_COMPLETE', 'UPDATE_FAILED', 'DELETE_FAILED'] %> > > wait_for_stack_does_not_exist: > input: > - stack > - timeout: 3600 > > tags: > - tripleo-common-managed > > tasks: > wait_for_stack_does_not_exist: > action: heat.stacks_list > timeout: <% $.timeout %> > retry: > delay: 15 > count: <% $.timeout / 15 %> > continue-on: <% $.stack in task(wait_for_stack_does_not_exist).result.select([$.stack_name, $.id]).flatten() %> > > delete_stack: > input: > - stack > - queue_name: tripleo > > tags: > - tripleo-common-managed > > tasks: > delete_the_stack: > action: heat.stacks_delete stack_id=<% $.stack %> > on-success: wait_for_stack_does_not_exist > on-error: delete_the_stack_failed > > delete_the_stack_failed: > on-success: send_message > publish: > status: FAILED > message: <% task(delete_the_stack).result %> > > wait_for_stack_does_not_exist: > workflow: tripleo.stack.v1.wait_for_stack_does_not_exist stack=<% $.stack %> > on-success: send_message > on-error: wait_for_stack_does_not_exist_failed > > wait_for_stack_does_not_exist_failed: > on-success: send_message > publish: > status: FAILED > message: <% task(wait_for_stack_does_not_exist).result %> > > send_message: > action: zaqar.queue_post > retry: count=5 delay=1 > input: > queue_name: <% $.queue_name %> > messages: > body: > type: tripleo.scale.v1.delete_stack > payload: > status: <% $.get('status', 'SUCCESS') %> > message: <% $.get('message', '') %> > execution: <% execution() %> > on-success: > - fail: <% $.get('status') = "FAILED" %> >' >2018-08-21 16:35:32,287 DEBUG: http://192.168.24.1:8989 "POST /v2/workbooks HTTP/1.1" 201 3236 >2018-08-21 16:35:32,290 
DEBUG: RESP: [201] Content-Length: 3236 Content-Type: application/json Date: Tue, 21 Aug 2018 13:35:32 GMT Connection: keep-alive >RESP BODY: {"definition": "---\nversion: '2.0'\nname: tripleo.stack.v1\ndescription: TripleO Stack Workflows\n\nworkflows:\n\n wait_for_stack_complete_or_failed:\n input:\n - stack\n - timeout: 14400 # 4 hours. Default timeout of stack deployment\n\n tags:\n - tripleo-common-managed\n\n tasks:\n\n wait_for_stack_status:\n action: heat.stacks_get stack_id=<% $.stack %>\n timeout: <% $.timeout %>\n retry:\n delay: 15\n count: <% $.timeout / 15 %>\n continue-on: <% task().result.stack_status in ['CREATE_IN_PROGRESS', 'UPDATE_IN_PROGRESS', 'DELETE_IN_PROGRESS'] %>\n\n wait_for_stack_in_progress:\n input:\n - stack\n - timeout: 600 # 10 minutes. Should not take much longer for a stack to transition to IN_PROGRESS\n\n tags:\n - tripleo-common-managed\n\n tasks:\n\n wait_for_stack_status:\n action: heat.stacks_get stack_id=<% $.stack %>\n timeout: <% $.timeout %>\n retry:\n delay: 15\n count: <% $.timeout / 15 %>\n continue-on: <% task().result.stack_status in ['CREATE_COMPLETE', 'CREATE_FAILED', 'UPDATE_COMPLETE', 'UPDATE_FAILED', 'DELETE_FAILED'] %>\n\n wait_for_stack_does_not_exist:\n input:\n - stack\n - timeout: 3600\n\n tags:\n - tripleo-common-managed\n\n tasks:\n wait_for_stack_does_not_exist:\n action: heat.stacks_list\n timeout: <% $.timeout %>\n retry:\n delay: 15\n count: <% $.timeout / 15 %>\n continue-on: <% $.stack in task(wait_for_stack_does_not_exist).result.select([$.stack_name, $.id]).flatten() %>\n\n delete_stack:\n input:\n - stack\n - queue_name: tripleo\n\n tags:\n - tripleo-common-managed\n\n tasks:\n delete_the_stack:\n action: heat.stacks_delete stack_id=<% $.stack %>\n on-success: wait_for_stack_does_not_exist\n on-error: delete_the_stack_failed\n\n delete_the_stack_failed:\n on-success: send_message\n publish:\n status: FAILED\n message: <% task(delete_the_stack).result %>\n\n wait_for_stack_does_not_exist:\n 
workflow: tripleo.stack.v1.wait_for_stack_does_not_exist stack=<% $.stack %>\n on-success: send_message\n on-error: wait_for_stack_does_not_exist_failed\n\n wait_for_stack_does_not_exist_failed:\n on-success: send_message\n publish:\n status: FAILED\n message: <% task(wait_for_stack_does_not_exist).result %>\n\n send_message:\n action: zaqar.queue_post\n retry: count=5 delay=1\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.scale.v1.delete_stack\n payload:\n status: <% $.get('status', 'SUCCESS') %>\n message: <% $.get('message', '') %>\n execution: <% execution() %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n", "name": "tripleo.stack.v1", "tags": [], "created_at": "2018-08-21 13:35:32", "scope": "private", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "e48c546c-d664-4eb9-9dc4-a1e3b242ddb7"} > >2018-08-21 16:35:32,291 DEBUG: HTTP POST http://192.168.24.1:8989/v2/workbooks 201 >2018-08-21 16:35:32,296 DEBUG: REQ: curl -g -i -X POST http://192.168.24.1:8989/v2/workbooks -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "content-type: text/plain" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" -d '--- >version: '2.0' >name: tripleo.plan_management.v1 >description: TripleO Overcloud Deployment Workflows v1 > >workflows: > > create_default_deployment_plan: > description: > > This workflow exists to maintain backwards compatibility in pike. This > workflow will likely be removed in queens in favor of create_deployment_plan. 
> input: > - container > - queue_name: tripleo > - generate_passwords: true > tags: > - tripleo-common-managed > tasks: > call_create_deployment_plan: > workflow: tripleo.plan_management.v1.create_deployment_plan > on-success: set_status_success > on-error: call_create_deployment_plan_set_status_failed > input: > container: <% $.container %> > queue_name: <% $.queue_name %> > generate_passwords: <% $.generate_passwords %> > use_default_templates: true > > set_status_success: > on-success: notify_zaqar > publish: > status: SUCCESS > message: <% task(call_create_deployment_plan).result %> > > call_create_deployment_plan_set_status_failed: > on-success: notify_zaqar > publish: > status: FAILED > message: <% task(call_create_deployment_plan).result %> > > notify_zaqar: > action: zaqar.queue_post > input: > queue_name: <% $.queue_name %> > messages: > body: > type: tripleo.plan_management.v1.create_default_deployment_plan > payload: > status: <% $.status %> > message: <% $.get('message', '') %> > execution: <% execution() %> > on-success: > - fail: <% $.get('status') = "FAILED" %> > > create_deployment_plan: > description: > > This workflow provides the capability to create a deployment plan using > the default heat templates provided in a standard TripleO undercloud > deployment, heat templates contained in an external git repository, or a > swift container that already contains templates. > input: > - container > - source_url: null > - queue_name: tripleo > - generate_passwords: true > - use_default_templates: false > > tags: > - tripleo-common-managed > > tasks: > container_required_check: > description: > > If using the default templates or importing templates from a git > repository, a new container needs to be created. If using an existing > container containing templates, skip straight to create_plan. 
> on-success: > - verify_container_doesnt_exist: <% $.use_default_templates or $.source_url %> > - create_plan: <% $.use_default_templates = false and $.source_url = null %> > > verify_container_doesnt_exist: > action: swift.head_container container=<% $.container %> > on-success: notify_zaqar > on-error: create_container > publish: > status: FAILED > message: "Unable to create plan. The Swift container already exists" > > create_container: > action: tripleo.plan.create_container container=<% $.container %> > on-success: templates_source_check > on-error: create_container_set_status_failed > > cleanup_temporary_files: > action: tripleo.git.clean container=<% $.container %> > > templates_source_check: > on-success: > - upload_default_templates: <% $.use_default_templates = true %> > - clone_git_repo: <% $.source_url != null %> > > clone_git_repo: > action: tripleo.git.clone container=<% $.container %> url=<% $.source_url %> > on-success: upload_templates_directory > on-error: clone_git_repo_set_status_failed > > upload_templates_directory: > action: tripleo.templates.upload container=<% $.container %> templates_path=<% task(clone_git_repo).result %> > on-success: create_plan > on-complete: cleanup_temporary_files > on-error: upload_templates_directory_set_status_failed > > upload_default_templates: > action: tripleo.templates.upload container=<% $.container %> > on-success: create_plan > on-error: upload_to_container_set_status_failed > > create_plan: > on-success: > - ensure_passwords_exist: <% $.generate_passwords = true %> > - add_root_stack_name: <% $.generate_passwords != true %> > > ensure_passwords_exist: > action: tripleo.parameters.generate_passwords container=<% $.container %> > on-success: add_root_stack_name > on-error: ensure_passwords_exist_set_status_failed > > add_root_stack_name: > action: tripleo.parameters.update > input: > container: <% $.container %> > parameters: > RootStackName: <% $.container %> > on-success: container_images_prepare > 
publish-on-error: > status: FAILED > message: <% task().result %> > on-error: notify_zaqar > > container_images_prepare: > description: > > Populate all container image parameters with default values. > action: tripleo.container_images.prepare container=<% $.container %> > on-success: process_templates > on-error: container_images_prepare_set_status_failed > > process_templates: > action: tripleo.templates.process container=<% $.container %> > on-success: set_status_success > on-error: process_templates_set_status_failed > > set_status_success: > on-success: notify_zaqar > publish: > status: SUCCESS > message: 'Plan created.' > > create_container_set_status_failed: > on-success: notify_zaqar > publish: > status: FAILED > message: <% task(create_container).result %> > > clone_git_repo_set_status_failed: > on-success: notify_zaqar > publish: > status: FAILED > message: <% task(clone_git_repo).result %> > > upload_templates_directory_set_status_failed: > on-success: notify_zaqar > publish: > status: FAILED > message: <% task(upload_templates_directory).result %> > > upload_to_container_set_status_failed: > on-success: notify_zaqar > publish: > status: FAILED > message: <% task(upload_default_templates).result %> > > ensure_passwords_exist_set_status_failed: > on-success: notify_zaqar > publish: > status: FAILED > message: <% task(ensure_passwords_exist).result %> > > process_templates_set_status_failed: > on-success: notify_zaqar > publish: > status: FAILED > message: <% task(process_templates).result %> > > container_images_prepare_set_status_failed: > on-success: notify_zaqar > publish: > status: FAILED > message: <% task(container_images_prepare).result %> > > notify_zaqar: > action: zaqar.queue_post > retry: count=5 delay=1 > input: > queue_name: <% $.queue_name %> > messages: > body: > type: tripleo.plan_management.v1.create_deployment_plan > payload: > status: <% $.status %> > message: <% $.get('message', '') %> > execution: <% execution() %> > on-success: > - 
fail: <% $.get('status') = "FAILED" %> > > update_deployment_plan: > input: > - container > - source_url: null > - queue_name: tripleo > - generate_passwords: true > - plan_environment: null > tags: > - tripleo-common-managed > tasks: > templates_source_check: > on-success: > - update_plan: <% $.source_url = null %> > - clone_git_repo: <% $.source_url != null %> > > clone_git_repo: > action: tripleo.git.clone container=<% $.container %> url=<% $.source_url %> > on-success: upload_templates_directory > on-error: clone_git_repo_set_status_failed > > upload_templates_directory: > action: tripleo.templates.upload container=<% $.container %> templates_path=<% task(clone_git_repo).result %> > on-success: create_swift_rings_backup_plan > on-complete: cleanup_temporary_files > on-error: upload_templates_directory_set_status_failed > > cleanup_temporary_files: > action: tripleo.git.clean container=<% $.container %> > > create_swift_rings_backup_plan: > workflow: tripleo.swift_rings_backup.v1.create_swift_rings_backup_container_plan > on-success: update_plan > on-error: create_swift_rings_backup_plan_set_status_failed > input: > container: <% $.container %> > queue_name: <% $.queue_name %> > use_default_templates: true > > update_plan: > on-success: > - ensure_passwords_exist: <% $.generate_passwords = true %> > - container_images_prepare: <% $.generate_passwords != true %> > > ensure_passwords_exist: > action: tripleo.parameters.generate_passwords container=<% $.container %> > on-success: container_images_prepare > on-error: ensure_passwords_exist_set_status_failed > > container_images_prepare: > description: > > Populate all container image parameters with default values. 
> action: tripleo.container_images.prepare container=<% $.container %> > on-success: process_templates > on-error: container_images_prepare_set_status_failed > > process_templates: > action: tripleo.templates.process container=<% $.container %> > on-success: > - set_status_success: <% $.plan_environment = null %> > - upload_plan_environment: <% $.plan_environment != null %> > on-error: process_templates_set_status_failed > > upload_plan_environment: > action: tripleo.templates.upload_plan_environment container=<% $.container %> plan_environment=<% $.plan_environment %> > on-success: set_status_success > on-error: process_templates_set_status_failed > > set_status_success: > on-success: notify_zaqar > publish: > status: SUCCESS > message: 'Plan updated.' > > create_swift_rings_backup_plan_set_status_failed: > on-success: notify_zaqar > publish: > status: FAILED > message: <% task(create_swift_rings_backup_plan).result %> > > clone_git_repo_set_status_failed: > on-success: notify_zaqar > publish: > status: FAILED > message: <% task(clone_git_repo).result %> > > upload_templates_directory_set_status_failed: > on-success: notify_zaqar > publish: > status: FAILED > message: <% task(upload_templates_directory).result %> > > process_templates_set_status_failed: > on-success: notify_zaqar > publish: > status: FAILED > message: <% task(process_templates).result %> > > ensure_passwords_exist_set_status_failed: > on-success: notify_zaqar > publish: > status: FAILED > message: <% task(ensure_passwords_exist).result %> > > container_images_prepare_set_status_failed: > on-success: notify_zaqar > publish: > status: FAILED > message: <% task(container_images_prepare).result %> > > notify_zaqar: > action: zaqar.queue_post > retry: count=5 delay=1 > input: > queue_name: <% $.queue_name %> > messages: > body: > type: tripleo.plan_management.v1.update_deployment_plan > payload: > status: <% $.status %> > message: <% $.get('message', '') %> > execution: <% execution() %> > on-success: 
> - fail: <% $.get('status') = "FAILED" %> > > delete_deployment_plan: > description: > > Deletes a plan by deleting the container matching plan_name. It will > not delete the plan if a stack exists with the same name. > > tags: > - tripleo-common-managed > > input: > - container: overcloud > - queue_name: tripleo > > tasks: > delete_plan: > action: tripleo.plan.delete container=<% $.container %> > on-complete: notify_zaqar > publish-on-error: > status: FAILED > message: <% task().result %> > publish: > status: SUCCESS > message: <% task().result %> > > notify_zaqar: > action: zaqar.queue_post > input: > queue_name: <% $.queue_name %> > messages: > body: > type: tripleo.plan_management.v1.delete_deployment_plan > payload: > status: <% $.status %> > message: <% $.get('message', '') %> > execution: <% execution() %> > on-success: > - fail: <% $.get('status') = "FAILED" %> > > > get_passwords: > description: Retrieves passwords for a given plan > input: > - container > - queue_name: tripleo > > tags: > - tripleo-common-managed > > tasks: > > verify_container_exists: > action: swift.head_container container=<% $.container %> > on-success: get_environment_passwords > on-error: verify_container_set_status_failed > > get_environment_passwords: > action: tripleo.parameters.get_passwords container=<% $.container %> > on-success: get_passwords_set_status_success > on-error: get_passwords_set_status_failed > > get_passwords_set_status_success: > on-success: notify_zaqar > publish: > status: SUCCESS > message: <% task(get_environment_passwords).result %> > > get_passwords_set_status_failed: > on-success: notify_zaqar > publish: > status: FAILED > message: <% task(get_environment_passwords).result %> > > verify_container_set_status_failed: > on-success: notify_zaqar > publish: > status: FAILED > message: <% task(verify_container_exists).result %> > > notify_zaqar: > action: zaqar.queue_post > input: > queue_name: <% $.queue_name %> > messages: > body: > type: 
tripleo.plan_management.v1.get_passwords > payload: > status: <% $.status %> > message: <% $.get('message', '') %> > execution: <% execution() %> > on-success: > - fail: <% $.get('status') = "FAILED" %> > > export_deployment_plan: > description: Creates an export tarball for a given plan > input: > - plan > - queue_name: tripleo > > tags: > - tripleo-common-managed > > tasks: > > export_plan: > action: tripleo.plan.export > input: > plan: <% $.plan %> > delete_after: 3600 > exports_container: "plan-exports" > on-success: create_tempurl > on-error: export_plan_set_status_failed > > create_tempurl: > action: tripleo.swift.tempurl > on-success: set_status_success > on-error: create_tempurl_set_status_failed > input: > container: "plan-exports" > obj: "<% $.plan %>.tar.gz" > valid: 3600 > > set_status_success: > on-success: notify_zaqar > publish: > status: SUCCESS > message: <% task(create_tempurl).result %> > tempurl: <% task(create_tempurl).result %> > > export_plan_set_status_failed: > on-success: notify_zaqar > publish: > status: FAILED > message: <% task(export_plan).result %> > > create_tempurl_set_status_failed: > on-success: notify_zaqar > publish: > status: FAILED > message: <% task(create_tempurl).result %> > > notify_zaqar: > action: zaqar.queue_post > input: > queue_name: <% $.queue_name %> > messages: > body: > type: tripleo.plan_management.v1.export_deployment_plan > payload: > status: <% $.status %> > message: <% $.get('message', '') %> > execution: <% execution() %> > tempurl: <% $.get('tempurl', '') %> > on-success: > - fail: <% $.get('status') = "FAILED" %> > > get_deprecated_parameters: > description: Gets the list of deprecated parameters in the whole of the plan including nested stack > input: > - container: overcloud > - queue_name: tripleo > > tags: > - tripleo-common-managed > > tasks: > get_flatten_data: > action: tripleo.parameters.get_flatten container=<% $.container %> > on-success: get_deprecated_params > on-error: 
set_status_failed_get_flatten_data > publish: > user_params: <% task().result.environment_parameters %> > plan_params: <% task().result.heat_resource_tree.parameters.keys() %> > parameter_groups: <% task().result.heat_resource_tree.resources.values().where( $.get('parameter_groups') ).select($.parameter_groups).flatten() %> > > get_deprecated_params: > on-success: check_if_user_param_has_deprecated > publish: > deprecated_params: <% $.parameter_groups.where($.get('label') = 'deprecated').select($.parameters).flatten().distinct() %> > > check_if_user_param_has_deprecated: > on-success: get_unused_params > publish: > deprecated_result: <% let(up => $.user_params) -> $.deprecated_params.select( dict('parameter' => $, 'deprecated' => true, 'user_defined' => $up.keys().contains($)) ) %> > > # Get the list of parameters which are defined by the user via an environment file's parameter_defaults, but are not part of the plan definition. > # It is possible that a parameter will be used by a service, but the service is not part of the plan. > # In such cases, the parameter will be reported as unused; care should be taken to understand whether it is really unused or not. 
> get_unused_params: > on-success: send_message > publish: > unused_params: <% let(plan_params => $.plan_params) -> $.user_params.keys().where( not $plan_params.contains($) ) %> > > set_status_failed_get_flatten_data: > on-success: send_message > publish: > status: FAILED > message: <% task(get_flatten_data).result %> > > send_message: > action: zaqar.queue_post > input: > queue_name: <% $.queue_name %> > messages: > body: > type: tripleo.plan_management.v1.get_deprecated_parameters > payload: > status: <% $.get('status', 'SUCCESS') %> > message: <% $.get('message', '') %> > execution: <% execution() %> > deprecated: <% $.get('deprecated_result', []) %> > unused: <% $.get('unused_params', []) %> > on-success: > - fail: <% $.get('status') = "FAILED" %> > > publish_ui_logs_to_swift: > description: > > This workflow drains a zaqar queue, and publish its messages into a log > file in swift. This workflow is called by cron trigger. > > input: > - logging_queue_name: tripleo-ui-logging > - logging_container: tripleo-ui-logs > > tags: > - tripleo-common-managed > > tasks: > > # We're using a NoOp action to start the workflow. The recursive nature > # of the workflow means that Mistral will refuse to execute it because it > # doesn't know where to begin. 
> start: > on-success: get_messages > > get_messages: > action: zaqar.claim_messages > on-success: > - format_messages: <% task().result.len() > 0 %> > input: > queue_name: <% $.logging_queue_name %> > ttl: 60 > grace: 60 > publish: > status: SUCCESS > messages: <% task().result %> > message_ids: <% task().result.select($._id) %> > > format_messages: > action: tripleo.logging_to_swift.format_messages > on-success: upload_to_swift > input: > messages: <% $.messages %> > publish: > status: SUCCESS > formatted_messages: <% task().result %> > > upload_to_swift: > action: tripleo.logging_to_swift.publish_ui_log_to_swift > on-success: delete_messages > input: > logging_data: <% $.formatted_messages %> > logging_container: <% $.logging_container %> > publish: > status: SUCCESS > > delete_messages: > action: zaqar.delete_messages > on-success: get_messages > input: > queue_name: <% $.logging_queue_name %> > messages: <% $.message_ids %> > publish: > status: SUCCESS > > download_logs: > description: Creates a tarball with logging data > input: > - queue_name: tripleo > - logging_container: "tripleo-ui-logs" > - downloads_container: "tripleo-ui-logs-downloads" > - delete_after: 3600 > > tags: > - tripleo-common-managed > > tasks: > > publish_logs: > workflow: tripleo.plan_management.v1.publish_ui_logs_to_swift > on-success: prepare_log_download > on-error: publish_logs_set_status_failed > > prepare_log_download: > action: tripleo.logging_to_swift.prepare_log_download > input: > logging_container: <% $.logging_container %> > downloads_container: <% $.downloads_container %> > delete_after: <% $.delete_after %> > on-success: create_tempurl > on-error: download_logs_set_status_failed > publish: > filename: <% task().result %> > > create_tempurl: > action: tripleo.swift.tempurl > on-success: set_status_success > on-error: create_tempurl_set_status_failed > input: > container: <% $.downloads_container %> > obj: <% $.filename %> > valid: 3600 > publish: > tempurl: <% task().result 
%> > > set_status_success: > on-success: notify_zaqar > publish: > status: SUCCESS > message: <% task(create_tempurl).result %> > tempurl: <% task(create_tempurl).result %> > > publish_logs_set_status_failed: > on-success: notify_zaqar > publish: > status: FAILED > message: <% task(publish_logs).result %> > > download_logs_set_status_failed: > on-success: notify_zaqar > publish: > status: FAILED > message: <% task(prepare_log_download).result %> > > create_tempurl_set_status_failed: > on-success: notify_zaqar > publish: > status: FAILED > message: <% task(create_tempurl).result %> > > notify_zaqar: > action: zaqar.queue_post > input: > queue_name: <% $.queue_name %> > messages: > body: > type: tripleo.plan_management.v1.download_logs > payload: > status: <% $.status %> > message: <% $.get('message', '') %> > execution: <% execution() %> > tempurl: <% $.get('tempurl', '') %> > on-success: > - fail: <% $.get('status') = "FAILED" %> > > list_roles: > description: Retrieve the roles_data.yaml and return a usable object > > input: > - container: overcloud > - roles_data_file: 'roles_data.yaml' > - queue_name: tripleo > > output: > roles_data: <% $.roles_data %> > > tags: > - tripleo-common-managed > > tasks: > get_roles_data: > action: swift.get_object > input: > container: <% $.container %> > obj: <% $.roles_data_file %> > publish: > roles_data: <% yaml_parse(task().result.last()) %> > status: SUCCESS > on-success: notify_zaqar > publish-on-error: > status: FAILED > message: <% task().result %> > on-error: notify_zaqar > > notify_zaqar: > action: zaqar.queue_post > input: > queue_name: <% $.queue_name %> > messages: > body: > type: tripleo.plan_management.v1.list_roles > payload: > status: <% $.status %> > roles_data: <% $.get('roles_data', {}) %> > message: <% $.get('message', '') %> > execution: <% execution() %> > on-success: > - fail: <% $.get('status') = "FAILED" %> > > list_available_networks: > input: > - container > - queue_name: tripleo > > output: > 
available_networks: <% $.available_networks %> > > tags: > - tripleo-common-managed > > tasks: > get_network_file_names: > action: swift.get_container > input: > container: <% $.container %> > publish: > network_names: <% task().result[1].where($.name.startsWith('networks/')).where($.name.endsWith('.yaml')).name %> > on-success: get_network_files > on-error: notify_zaqar > publish-on-error: > status: FAILED > message: <% task().result %> > > get_network_files: > with-items: network_name in <% $.network_names %> > action: swift.get_object > on-success: transform_output > on-error: notify_zaqar > input: > container: <% $.container %> > obj: <% $.network_name %> > publish: > status: SUCCESS > available_yaml_networks: <% task().result.select($[1]) %> > publish-on-error: > status: FAILED > message: <% task().result %> > > transform_output: > publish: > status: SUCCESS > available_networks: <% yaml_parse($.available_yaml_networks.join("\n")) %> > publish-on-error: > status: FAILED > message: <% task().result %> > on-complete: notify_zaqar > > notify_zaqar: > action: zaqar.queue_post > input: > queue_name: <% $.queue_name %> > messages: > body: > type: tripleo.plan_management.v1.list_available_networks > payload: > status: <% $.status %> > message: <% $.get('message', '') %> > execution: <% execution() %> > available_networks: <% $.get('available_networks', []) %> > on-success: > - fail: <% $.get('status') = "FAILED" %> > > list_networks: > input: > - container: 'overcloud' > - network_data_file: 'network_data.yaml' > - queue_name: tripleo > > output: > network_data: <% $.network_data %> > > tags: > - tripleo-common-managed > > tasks: > get_networks: > action: swift.get_object > input: > container: <% $.container %> > obj: <% $.network_data_file %> > on-success: notify_zaqar > publish: > network_data: <% yaml_parse(task().result.last()) %> > status: SUCCESS > message: <% task().result %> > on-error: notify_zaqar > publish-on-error: > status: FAILED > message: <% 
task().result %> > > notify_zaqar: > action: zaqar.queue_post > retry: count=5 delay=1 > input: > queue_name: <% $.queue_name %> > messages: > body: > type: tripleo.plan_management.v1.list_networks > payload: > status: <% $.status %> > network_data: <% $.get('network_data', {}) %> > message: <% $.get('message', '') %> > execution: <% execution() %> > on-success: > - fail: <% $.get('status') = "FAILED" %> > > validate_network_files: > description: Validate network files exist > input: > - container: overcloud > - network_data > - queue_name: tripleo > > output: > network_data: <% $.network_data %> > > tags: > - tripleo-common-managed > > tasks: > get_network_names: > publish: > network_names_lower: <% $.network_data.where($.containsKey('name_lower')).name_lower %> > network_names: <% $.network_data.where(not $.containsKey('name_lower')).name %> > on-success: validate_networks > > validate_networks: > with-items: network in <% $.network_names_lower.concat($.network_names) %> > action: swift.head_object > input: > container: <% $.container %> > obj: network/<% $.network.toLower() %>.yaml > publish: > status: SUCCESS > message: <% task().result %> > on-success: notify_zaqar > publish-on-error: > status: FAILED > message: <% task().result %> > > notify_zaqar: > action: zaqar.queue_post > retry: count=5 delay=1 > input: > queue_name: <% $.queue_name %> > messages: > body: > type: tripleo.plan_management.v1.validate_network_files > payload: > status: <% $.status %> > message: <% $.message %> > execution: <% execution() %> > on-success: > - fail: <% $.get('status') = "FAILED" %> > > validate_networks: > description: Validate network files were generated properly and exist > input: > - container: 'overcloud' > - network_data_file: 'network_data.yaml' > - queue_name: tripleo > > output: > network_data: <% $.network_data %> > > tags: > - tripleo-common-managed > > tasks: > get_network_data: > workflow: list_networks > input: > container: <% $.container %> > network_data_file: 
<% $.network_data_file %> > queue_name: <% $.queue_name %> > publish: > network_data: <% task().result.network_data %> > on-success: validate_networks > publish-on-error: > status: FAILED > message: <% task().result %> > on-error: > notify_zaqar > > validate_networks: > workflow: validate_network_files > input: > container: <% $.container %> > network_data: <% $.network_data %> > queue_name: <% $.queue_name %> > publish: > status: SUCCESS > message: <% task().result %> > on-success: notify_zaqar > publish-on-error: > status: FAILED > message: <% task().result %> > on-error: notify_zaqar > > notify_zaqar: > action: zaqar.queue_post > input: > queue_name: <% $.queue_name %> > messages: > body: > type: tripleo.plan_management.v1.validate_networks > payload: > status: <% $.status %> > network_data: <% $.get('network_data', {}) %> > message: <% $.get('message', '') %> > execution: <% execution() %> > on-success: > - fail: <% $.get('status') = "FAILED" %> > > validate_roles: > description: Validate roles data exists and is parsable > > input: > - container: overcloud > - roles_data_file: 'roles_data.yaml' > - queue_name: tripleo > > output: > roles_data: <% $.roles_data %> > > tags: > - tripleo-common-managed > > tasks: > get_roles_data: > workflow: list_roles > input: > container: <% $.container %> > roles_data_file: <% $.roles_data_file %> > queue_name: <% $.queue_name %> > publish: > roles_data: <% task().result.roles_data %> > status: SUCCESS > on-success: notify_zaqar > publish-on-error: > status: FAILED > message: <% task().result %> > on-error: > notify_zaqar > > notify_zaqar: > action: zaqar.queue_post > input: > queue_name: <% $.queue_name %> > messages: > body: > type: tripleo.plan_management.v1.validate_networks > payload: > status: <% $.status %> > roles_data: <% $.get('roles_data', '') %> > message: <% $.get('message', '') %> > execution: <% execution() %> > on-success: > - fail: <% $.get('status') = "FAILED" %> > > _validate_networks_from_roles: > 
description: Internal workflow for validating a network exists from a role > > input: > - container: overcloud > - defined_networks > - networks_in_roles > - queue_name: tripleo > > tags: > - tripleo-common-managed > > tasks: > validate_network_in_network_data: > publish: > networks_found: <% $.networks_in_roles.toSet().intersect($.defined_networks.toSet()) %> > networks_not_found: <% $.networks_in_roles.toSet().difference($.defined_networks.toSet()) %> > on-success: > - network_not_found: <% $.networks_not_found %> > - notify_zaqar: <% not $.networks_not_found %> > > network_not_found: > publish: > message: <% "Some networks in roles are not defined, {0}".format($.networks_not_found.join(', ')) %> > status: FAILED > on-success: notify_zaqar > > notify_zaqar: > action: zaqar.queue_post > input: > queue_name: <% $.queue_name %> > messages: > body: > type: tripleo.plan_management.v1._validate_networks_from_role > payload: > status: <% $.get('status', 'SUCCESS') %> > message: <% $.get('message', '') %> > execution: <% execution() %> > on-success: > - fail: <% $.get('status') = "FAILED" %> > > validate_roles_and_networks: > description: Validate that roles and network data are valid > > input: > - container: overcloud > - roles_data_file: 'roles_data.yaml' > - network_data_file: 'network_data.yaml' > - queue_name: tripleo > > output: > roles_data: <% $.roles_data %> > network_data: <% $.network_data %> > > tags: > - tripleo-common-managed > > tasks: > validate_network_data: > workflow: validate_networks > input: > container: <% $.container %> > network_data_file: <% $.network_data_file %> > queue_name: <% $.queue_name %> > publish: > network_data: <% task().result.network_data %> > on-success: validate_roles_data > publish-on-error: > status: FAILED > message: <% task().result %> > on-error: notify_zaqar > > validate_roles_data: > workflow: validate_roles > input: > container: <% $.container %> > roles_data_file: <% $.roles_data_file %> > queue_name: <% $.queue_name %> 
> publish: > roles_data: <% task().result.roles_data %> > role_networks_data: <% task().result.roles_data.networks %> > networks_in_roles: <% task().result.roles_data.networks.flatten().distinct() %> > on-success: validate_roles_and_networks > publish-on-error: > status: FAILED > message: <% task().result %> > on-error: notify_zaqar > > validate_roles_and_networks: > workflow: _validate_networks_from_roles > input: > container: <% $.container %> > defined_networks: <% $.network_data.name %> > networks_in_roles: <% $.networks_in_roles %> > queue_name: <% $.queue_name %> > publish: > status: SUCCESS > on-success: notify_zaqar > publish-on-error: > status: FAILED > message: <% task().result.message %> > on-error: notify_zaqar > > notify_zaqar: > action: zaqar.queue_post > input: > queue_name: <% $.queue_name %> > messages: > body: > type: tripleo.plan_management.v1.validate_roles_and_networks > payload: > status: <% $.status %> > roles_data: <% $.get('roles_data', {}) %> > network_data: <% $.get('network_data', {}) %> > message: <% $.get('message', '') %> > execution: <% execution() %> > on-success: > - fail: <% $.get('status') = "FAILED" %> > > list_available_roles: > input: > - container: overcloud > - queue_name: tripleo > > output: > available_roles: <% $.available_roles %> > > tags: > - tripleo-common-managed > > tasks: > get_role_file_names: > action: swift.get_container > input: > container: <% $.container %> > publish: > role_names: <% task().result[1].where($.name.startsWith('roles/')).where($.name.endsWith('.yaml')).name %> > on-success: get_role_files > on-error: notify_zaqar > publish-on-error: > status: FAILED > message: <% task().result %> > > get_role_files: > with-items: role_name in <% $.role_names %> > action: swift.get_object > on-success: transform_output > on-error: notify_zaqar > input: > container: <% $.container %> > obj: <% $.role_name %> > publish: > status: SUCCESS > available_yaml_roles: <% task().result.select($[1]) %> > publish-on-error: 
> status: FAILED > message: <% task().result %> > > transform_output: > publish: > status: SUCCESS > available_roles: <% yaml_parse($.available_yaml_roles.join("\n")) %> > publish-on-error: > status: FAILED > message: <% task().result %> > on-complete: notify_zaqar > > notify_zaqar: > action: zaqar.queue_post > input: > queue_name: <% $.queue_name %> > messages: > body: > type: tripleo.plan_management.v1.list_available_roles > payload: > status: <% $.status %> > message: <% $.get('message', '') %> > execution: <% execution() %> > available_roles: <% $.get('available_roles', []) %> > on-success: > - fail: <% $.get('status') = "FAILED" %> > > update_roles: > description: > > takes data in json format validates its contents and persists them in > roles_data.yaml, after successful update, templates are regenerated. > input: > - container > - roles > - roles_data_file: 'roles_data.yaml' > - replace_all: false > - queue_name: tripleo > tags: > - tripleo-common-managed > tasks: > get_available_roles: > workflow: list_available_roles > input: > container: <% $.container %> > queue_name: <% $.queue_name%> > publish: > available_roles: <% task().result.available_roles %> > on-success: validate_input > on-error: notify_zaqar > publish-on-error: > status: FAILED > message: <% task().result %> > > validate_input: > description: > > validate the format of input (verify that each role in input has the > required attributes set. 
check README in roles directory in t-h-t), > validate that roles in input exist in roles directory in t-h-t > action: tripleo.plan.validate_roles > input: > container: <% $.container %> > roles: <% $.roles %> > available_roles: <% $.available_roles %> > on-success: get_network_data > on-error: notify_zaqar > publish-on-error: > status: FAILED > message: <% task().result %> > > get_network_data: > workflow: list_networks > input: > container: <% $.container %> > queue_name: <% $.queue_name %> > publish: > network_data: <% task().result.network_data %> > on-success: validate_network_names > publish-on-error: > status: FAILED > message: <% task().result %> > on-error: notify_zaqar > > validate_network_names: > description: > > validate that Network names assigned to Role exist in > network-data.yaml object in Swift container > workflow: _validate_networks_from_roles > input: > container: <% $.container %> > defined_networks: <% $.network_data.name %> > networks_in_roles: <% $.roles.networks.flatten().distinct() %> > queue_name: <% $.queue_name %> > on-success: get_current_roles > on-error: notify_zaqar > publish-on-error: > status: FAILED > message: <% task().result.message %> > > get_current_roles: > workflow: list_roles > input: > container: <% $.container %> > roles_data_file: <% $.roles_data_file %> > queue_name: <% $.queue_name %> > publish: > current_roles: <% task().result.roles_data %> > on-success: update_roles_data > on-error: notify_zaqar > publish-on-error: > status: FAILED > message: <% task().result %> > > update_roles_data: > description: > > update roles_data.yaml object in Swift with roles from workflow input > action: tripleo.plan.update_roles > input: > container: <% $.container %> > roles: <% $.roles %> > current_roles: <% $.current_roles %> > replace_all: <% $.replace_all %> > publish: > updated_roles_data: <% task().result.roles %> > on-success: update_roles_data_in_swift > on-error: notify_zaqar > publish-on-error: > status: FAILED > message: <% 
task().result %> > > update_roles_data_in_swift: > description: > > update roles_data.yaml object in Swift with data from workflow input > action: swift.put_object > input: > container: <% $.container %> > obj: <% $.roles_data_file %> > contents: <% yaml_dump($.updated_roles_data) %> > on-success: regenerate_templates > publish-on-error: > status: FAILED > message: <% task().result %> > on-error: notify_zaqar > > regenerate_templates: > action: tripleo.templates.process container=<% $.container %> > on-success: get_updated_roles > on-error: notify_zaqar > publish-on-error: > status: FAILED > message: <% task().result %> > > get_updated_roles: > workflow: list_roles > input: > container: <% $.container %> > roles_data_file: <% $.roles_data_file %> > publish: > updated_roles: <% task().result.roles_data %> > status: SUCCESS > on-complete: notify_zaqar > publish-on-error: > status: FAILED > message: <% task().result %> > > notify_zaqar: > action: zaqar.queue_post > input: > queue_name: <% $.queue_name %> > messages: > body: > type: tripleo.roles.v1.update_roles > payload: > status: <% $.status %> > message: <% $.get('message', '') %> > execution: <% execution() %> > updated_roles: <% $.get('updated_roles', []) %> > on-success: > - fail: <% $.get('status') = "FAILED" %> > > select_roles: > description: > > takes a list of role names as input and populates roles_data.yaml in > container in Swift with respective roles from 'roles directory' > input: > - container > - role_names > - roles_data_file: 'roles_data.yaml' > - replace_all: true > - queue_name: tripleo > tags: > - tripleo-common-managed > tasks: > > get_available_roles: > workflow: list_available_roles > input: > container: <% $.container %> > queue_name: <% $.queue_name %> > publish: > available_roles: <% task().result.available_roles %> > on-success: get_current_roles > on-error: notify_zaqar > publish-on-error: > status: FAILED > message: <% task().result %> > > get_current_roles: > workflow: list_roles > 
input: > container: <% $.container %> > roles_data_file: <% $.roles_data_file %> > queue_name: <% $.queue_name %> > publish: > current_roles: <% task().result.roles_data %> > on-success: gather_roles > on-error: notify_zaqar > publish-on-error: > status: FAILED > message: <% task().result %> > > gather_roles: > description: > > for each role name from the input, check if it exists in > roles_data.yaml, if yes, use that role definition, if not, get the > role definition from roles directory. Use the gathered roles > definitions as input to updateRolesWorkflow - this ensures > configuration of the roles which are already in roles_data.yaml > will not get overridden by data from roles directory > action: tripleo.plan.gather_roles > input: > role_names: <% $.role_names %> > current_roles: <% $.current_roles %> > available_roles: <% $.available_roles %> > publish: > gathered_roles: <% task().result.gathered_roles %> > on-success: call_update_roles_workflow > on-error: notify_zaqar > publish-on-error: > status: FAILED > message: <% task().result %> > > call_update_roles_workflow: > workflow: update_roles > input: > container: <% $.container %> > roles: <% $.gathered_roles %> > roles_data_file: <% $.roles_data_file %> > replace_all: <% $.replace_all %> > queue_name: <% $.queue_name %> > on-complete: notify_zaqar > publish: > selected_roles: <% task().result.updated_roles %> > status: SUCCESS > publish-on-error: > status: FAILED > message: <% task().result %> > > notify_zaqar: > action: zaqar.queue_post > input: > queue_name: <% $.queue_name %> > messages: > body: > type: tripleo.plan_management.v1.select_roles > payload: > status: <% $.status %> > message: <% $.get('message', '') %> > execution: <% execution() %> > selected_roles: <% $.get('selected_roles', []) %> > on-success: > - fail: <% $.get('status') = "FAILED" %> >' >2018-08-21 16:35:51,311 DEBUG: http://192.168.24.1:8989 "POST /v2/workbooks HTTP/1.1" 201 47190 >2018-08-21 16:35:51,362 DEBUG: RESP: [201] 
Content-Length: 47190 Content-Type: application/json Date: Tue, 21 Aug 2018 13:35:51 GMT Connection: keep-alive >RESP BODY: {"definition": "---\nversion: '2.0'\nname: tripleo.plan_management.v1\ndescription: TripleO Overcloud Deployment Workflows v1\n\nworkflows:\n\n create_default_deployment_plan:\n description: >\n This workflow exists to maintain backwards compatibility in pike. This\n workflow will likely be removed in queens in favor of create_deployment_plan.\n input:\n - container\n - queue_name: tripleo\n - generate_passwords: true\n tags:\n - tripleo-common-managed\n tasks:\n call_create_deployment_plan:\n workflow: tripleo.plan_management.v1.create_deployment_plan\n on-success: set_status_success\n on-error: call_create_deployment_plan_set_status_failed\n input:\n container: <% $.container %>\n queue_name: <% $.queue_name %>\n generate_passwords: <% $.generate_passwords %>\n use_default_templates: true\n\n set_status_success:\n on-success: notify_zaqar\n publish:\n status: SUCCESS\n message: <% task(call_create_deployment_plan).result %>\n\n call_create_deployment_plan_set_status_failed:\n on-success: notify_zaqar\n publish:\n status: FAILED\n message: <% task(call_create_deployment_plan).result %>\n\n notify_zaqar:\n action: zaqar.queue_post\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.plan_management.v1.create_default_deployment_plan\n payload:\n status: <% $.status %>\n message: <% $.get('message', '') %>\n execution: <% execution() %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n\n create_deployment_plan:\n description: >\n This workflow provides the capability to create a deployment plan using\n the default heat templates provided in a standard TripleO undercloud\n deployment, heat templates contained in an external git repository, or a\n swift container that already contains templates.\n input:\n - container\n - source_url: null\n - queue_name: tripleo\n - generate_passwords: true\n - 
use_default_templates: false\n\n tags:\n - tripleo-common-managed\n\n tasks:\n container_required_check:\n description: >\n If using the default templates or importing templates from a git\n repository, a new container needs to be created. If using an existing\n container containing templates, skip straight to create_plan.\n on-success:\n - verify_container_doesnt_exist: <% $.use_default_templates or $.source_url %>\n - create_plan: <% $.use_default_templates = false and $.source_url = null %>\n\n verify_container_doesnt_exist:\n action: swift.head_container container=<% $.container %>\n on-success: notify_zaqar\n on-error: create_container\n publish:\n status: FAILED\n message: \"Unable to create plan. The Swift container already exists\"\n\n create_container:\n action: tripleo.plan.create_container container=<% $.container %>\n on-success: templates_source_check\n on-error: create_container_set_status_failed\n\n cleanup_temporary_files:\n action: tripleo.git.clean container=<% $.container %>\n\n templates_source_check:\n on-success:\n - upload_default_templates: <% $.use_default_templates = true %>\n - clone_git_repo: <% $.source_url != null %>\n\n clone_git_repo:\n action: tripleo.git.clone container=<% $.container %> url=<% $.source_url %>\n on-success: upload_templates_directory\n on-error: clone_git_repo_set_status_failed\n\n upload_templates_directory:\n action: tripleo.templates.upload container=<% $.container %> templates_path=<% task(clone_git_repo).result %>\n on-success: create_plan\n on-complete: cleanup_temporary_files\n on-error: upload_templates_directory_set_status_failed\n\n upload_default_templates:\n action: tripleo.templates.upload container=<% $.container %>\n on-success: create_plan\n on-error: upload_to_container_set_status_failed\n\n create_plan:\n on-success:\n - ensure_passwords_exist: <% $.generate_passwords = true %>\n - add_root_stack_name: <% $.generate_passwords != true %>\n\n ensure_passwords_exist:\n action: 
tripleo.parameters.generate_passwords container=<% $.container %>\n on-success: add_root_stack_name\n on-error: ensure_passwords_exist_set_status_failed\n\n add_root_stack_name:\n action: tripleo.parameters.update\n input:\n container: <% $.container %>\n parameters:\n RootStackName: <% $.container %>\n on-success: container_images_prepare\n publish-on-error:\n status: FAILED\n message: <% task().result %>\n on-error: notify_zaqar\n\n container_images_prepare:\n description: >\n Populate all container image parameters with default values.\n action: tripleo.container_images.prepare container=<% $.container %>\n on-success: process_templates\n on-error: container_images_prepare_set_status_failed\n\n process_templates:\n action: tripleo.templates.process container=<% $.container %>\n on-success: set_status_success\n on-error: process_templates_set_status_failed\n\n set_status_success:\n on-success: notify_zaqar\n publish:\n status: SUCCESS\n message: 'Plan created.'\n\n create_container_set_status_failed:\n on-success: notify_zaqar\n publish:\n status: FAILED\n message: <% task(create_container).result %>\n\n clone_git_repo_set_status_failed:\n on-success: notify_zaqar\n publish:\n status: FAILED\n message: <% task(clone_git_repo).result %>\n\n upload_templates_directory_set_status_failed:\n on-success: notify_zaqar\n publish:\n status: FAILED\n message: <% task(upload_templates_directory).result %>\n\n upload_to_container_set_status_failed:\n on-success: notify_zaqar\n publish:\n status: FAILED\n message: <% task(upload_default_templates).result %>\n\n ensure_passwords_exist_set_status_failed:\n on-success: notify_zaqar\n publish:\n status: FAILED\n message: <% task(ensure_passwords_exist).result %>\n\n process_templates_set_status_failed:\n on-success: notify_zaqar\n publish:\n status: FAILED\n message: <% task(process_templates).result %>\n\n container_images_prepare_set_status_failed:\n on-success: notify_zaqar\n publish:\n status: FAILED\n message: <% 
task(container_images_prepare).result %>\n\n notify_zaqar:\n action: zaqar.queue_post\n retry: count=5 delay=1\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.plan_management.v1.create_deployment_plan\n payload:\n status: <% $.status %>\n message: <% $.get('message', '') %>\n execution: <% execution() %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n\n update_deployment_plan:\n input:\n - container\n - source_url: null\n - queue_name: tripleo\n - generate_passwords: true\n - plan_environment: null\n tags:\n - tripleo-common-managed\n tasks:\n templates_source_check:\n on-success:\n - update_plan: <% $.source_url = null %>\n - clone_git_repo: <% $.source_url != null %>\n\n clone_git_repo:\n action: tripleo.git.clone container=<% $.container %> url=<% $.source_url %>\n on-success: upload_templates_directory\n on-error: clone_git_repo_set_status_failed\n\n upload_templates_directory:\n action: tripleo.templates.upload container=<% $.container %> templates_path=<% task(clone_git_repo).result %>\n on-success: create_swift_rings_backup_plan\n on-complete: cleanup_temporary_files\n on-error: upload_templates_directory_set_status_failed\n\n cleanup_temporary_files:\n action: tripleo.git.clean container=<% $.container %>\n\n create_swift_rings_backup_plan:\n workflow: tripleo.swift_rings_backup.v1.create_swift_rings_backup_container_plan\n on-success: update_plan\n on-error: create_swift_rings_backup_plan_set_status_failed\n input:\n container: <% $.container %>\n queue_name: <% $.queue_name %>\n use_default_templates: true\n\n update_plan:\n on-success:\n - ensure_passwords_exist: <% $.generate_passwords = true %>\n - container_images_prepare: <% $.generate_passwords != true %>\n\n ensure_passwords_exist:\n action: tripleo.parameters.generate_passwords container=<% $.container %>\n on-success: container_images_prepare\n on-error: ensure_passwords_exist_set_status_failed\n\n container_images_prepare:\n description: >\n Populate 
all container image parameters with default values.\n action: tripleo.container_images.prepare container=<% $.container %>\n on-success: process_templates\n on-error: container_images_prepare_set_status_failed\n\n process_templates:\n action: tripleo.templates.process container=<% $.container %>\n on-success:\n - set_status_success: <% $.plan_environment = null %>\n - upload_plan_environment: <% $.plan_environment != null %>\n on-error: process_templates_set_status_failed\n\n upload_plan_environment:\n action: tripleo.templates.upload_plan_environment container=<% $.container %> plan_environment=<% $.plan_environment %>\n on-success: set_status_success\n on-error: process_templates_set_status_failed\n\n set_status_success:\n on-success: notify_zaqar\n publish:\n status: SUCCESS\n message: 'Plan updated.'\n\n create_swift_rings_backup_plan_set_status_failed:\n on-success: notify_zaqar\n publish:\n status: FAILED\n message: <% task(create_swift_rings_backup_plan).result %>\n\n clone_git_repo_set_status_failed:\n on-success: notify_zaqar\n publish:\n status: FAILED\n message: <% task(clone_git_repo).result %>\n\n upload_templates_directory_set_status_failed:\n on-success: notify_zaqar\n publish:\n status: FAILED\n message: <% task(upload_templates_directory).result %>\n\n process_templates_set_status_failed:\n on-success: notify_zaqar\n publish:\n status: FAILED\n message: <% task(process_templates).result %>\n\n ensure_passwords_exist_set_status_failed:\n on-success: notify_zaqar\n publish:\n status: FAILED\n message: <% task(ensure_passwords_exist).result %>\n\n container_images_prepare_set_status_failed:\n on-success: notify_zaqar\n publish:\n status: FAILED\n message: <% task(container_images_prepare).result %>\n\n notify_zaqar:\n action: zaqar.queue_post\n retry: count=5 delay=1\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.plan_management.v1.update_deployment_plan\n payload:\n status: <% $.status %>\n message: <% $.get('message', 
'') %>\n execution: <% execution() %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n\n delete_deployment_plan:\n description: >\n Deletes a plan by deleting the container matching plan_name. It will\n not delete the plan if a stack exists with the same name.\n\n tags:\n - tripleo-common-managed\n\n input:\n - container: overcloud\n - queue_name: tripleo\n\n tasks:\n delete_plan:\n action: tripleo.plan.delete container=<% $.container %>\n on-complete: notify_zaqar\n publish-on-error:\n status: FAILED\n message: <% task().result %>\n publish:\n status: SUCCESS\n message: <% task().result %>\n\n notify_zaqar:\n action: zaqar.queue_post\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.plan_management.v1.delete_deployment_plan\n payload:\n status: <% $.status %>\n message: <% $.get('message', '') %>\n execution: <% execution() %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n\n\n get_passwords:\n description: Retrieves passwords for a given plan\n input:\n - container\n - queue_name: tripleo\n\n tags:\n - tripleo-common-managed\n\n tasks:\n\n verify_container_exists:\n action: swift.head_container container=<% $.container %>\n on-success: get_environment_passwords\n on-error: verify_container_set_status_failed\n\n get_environment_passwords:\n action: tripleo.parameters.get_passwords container=<% $.container %>\n on-success: get_passwords_set_status_success\n on-error: get_passwords_set_status_failed\n\n get_passwords_set_status_success:\n on-success: notify_zaqar\n publish:\n status: SUCCESS\n message: <% task(get_environment_passwords).result %>\n\n get_passwords_set_status_failed:\n on-success: notify_zaqar\n publish:\n status: FAILED\n message: <% task(get_environment_passwords).result %>\n\n verify_container_set_status_failed:\n on-success: notify_zaqar\n publish:\n status: FAILED\n message: <% task(verify_container_exists).result %>\n\n notify_zaqar:\n action: zaqar.queue_post\n input:\n queue_name: <% 
$.queue_name %>\n messages:\n body:\n type: tripleo.plan_management.v1.get_passwords\n payload:\n status: <% $.status %>\n message: <% $.get('message', '') %>\n execution: <% execution() %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n\n export_deployment_plan:\n description: Creates an export tarball for a given plan\n input:\n - plan\n - queue_name: tripleo\n\n tags:\n - tripleo-common-managed\n\n tasks:\n\n export_plan:\n action: tripleo.plan.export\n input:\n plan: <% $.plan %>\n delete_after: 3600\n exports_container: \"plan-exports\"\n on-success: create_tempurl\n on-error: export_plan_set_status_failed\n\n create_tempurl:\n action: tripleo.swift.tempurl\n on-success: set_status_success\n on-error: create_tempurl_set_status_failed\n input:\n container: \"plan-exports\"\n obj: \"<% $.plan %>.tar.gz\"\n valid: 3600\n\n set_status_success:\n on-success: notify_zaqar\n publish:\n status: SUCCESS\n message: <% task(create_tempurl).result %>\n tempurl: <% task(create_tempurl).result %>\n\n export_plan_set_status_failed:\n on-success: notify_zaqar\n publish:\n status: FAILED\n message: <% task(export_plan).result %>\n\n create_tempurl_set_status_failed:\n on-success: notify_zaqar\n publish:\n status: FAILED\n message: <% task(create_tempurl).result %>\n\n notify_zaqar:\n action: zaqar.queue_post\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.plan_management.v1.export_deployment_plan\n payload:\n status: <% $.status %>\n message: <% $.get('message', '') %>\n execution: <% execution() %>\n tempurl: <% $.get('tempurl', '') %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n\n get_deprecated_parameters:\n description: Gets the list of deprecated parameters in the whole of the plan including nested stack\n input:\n - container: overcloud\n - queue_name: tripleo\n\n tags:\n - tripleo-common-managed\n\n tasks:\n get_flatten_data:\n action: tripleo.parameters.get_flatten container=<% $.container %>\n on-success: 
get_deprecated_params\n on-error: set_status_failed_get_flatten_data\n publish:\n user_params: <% task().result.environment_parameters %>\n plan_params: <% task().result.heat_resource_tree.parameters.keys() %>\n parameter_groups: <% task().result.heat_resource_tree.resources.values().where( $.get('parameter_groups') ).select($.parameter_groups).flatten() %>\n\n get_deprecated_params:\n on-success: check_if_user_param_has_deprecated\n publish:\n deprecated_params: <% $.parameter_groups.where($.get('label') = 'deprecated').select($.parameters).flatten().distinct() %>\n\n check_if_user_param_has_deprecated:\n on-success: get_unused_params\n publish:\n deprecated_result: <% let(up => $.user_params) -> $.deprecated_params.select( dict('parameter' => $, 'deprecated' => true, 'user_defined' => $up.keys().contains($)) ) %>\n\n # Get the list of parameters which are defined by the user via an environment file's parameter_defaults, but are not part of the plan definition\n # It may be possible that the parameter will be used by a service, but the service is not part of the plan.\n # In such cases, the parameter will be reported as unused; care should be taken to understand whether it is really unused or not.\n get_unused_params:\n on-success: send_message\n publish:\n unused_params: <% let(plan_params => $.plan_params) -> $.user_params.keys().where( not $plan_params.contains($) ) %>\n\n set_status_failed_get_flatten_data:\n on-success: send_message\n publish:\n status: FAILED\n message: <% task(get_flatten_data).result %>\n\n send_message:\n action: zaqar.queue_post\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.plan_management.v1.get_deprecated_parameters\n payload:\n status: <% $.get('status', 'SUCCESS') %>\n message: <% $.get('message', '') %>\n execution: <% execution() %>\n deprecated: <% $.get('deprecated_result', []) %>\n unused: <% $.get('unused_params', []) %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n\n 
publish_ui_logs_to_swift:\n description: >\n This workflow drains a zaqar queue, and publish its messages into a log\n file in swift. This workflow is called by cron trigger.\n\n input:\n - logging_queue_name: tripleo-ui-logging\n - logging_container: tripleo-ui-logs\n\n tags:\n - tripleo-common-managed\n\n tasks:\n\n # We're using a NoOp action to start the workflow. The recursive nature\n # of the workflow means that Mistral will refuse to execute it because it\n # doesn't know where to begin.\n start:\n on-success: get_messages\n\n get_messages:\n action: zaqar.claim_messages\n on-success:\n - format_messages: <% task().result.len() > 0 %>\n input:\n queue_name: <% $.logging_queue_name %>\n ttl: 60\n grace: 60\n publish:\n status: SUCCESS\n messages: <% task().result %>\n message_ids: <% task().result.select($._id) %>\n\n format_messages:\n action: tripleo.logging_to_swift.format_messages\n on-success: upload_to_swift\n input:\n messages: <% $.messages %>\n publish:\n status: SUCCESS\n formatted_messages: <% task().result %>\n\n upload_to_swift:\n action: tripleo.logging_to_swift.publish_ui_log_to_swift\n on-success: delete_messages\n input:\n logging_data: <% $.formatted_messages %>\n logging_container: <% $.logging_container %>\n publish:\n status: SUCCESS\n\n delete_messages:\n action: zaqar.delete_messages\n on-success: get_messages\n input:\n queue_name: <% $.logging_queue_name %>\n messages: <% $.message_ids %>\n publish:\n status: SUCCESS\n\n download_logs:\n description: Creates a tarball with logging data\n input:\n - queue_name: tripleo\n - logging_container: \"tripleo-ui-logs\"\n - downloads_container: \"tripleo-ui-logs-downloads\"\n - delete_after: 3600\n\n tags:\n - tripleo-common-managed\n\n tasks:\n\n publish_logs:\n workflow: tripleo.plan_management.v1.publish_ui_logs_to_swift\n on-success: prepare_log_download\n on-error: publish_logs_set_status_failed\n\n prepare_log_download:\n action: tripleo.logging_to_swift.prepare_log_download\n input:\n 
logging_container: <% $.logging_container %>\n downloads_container: <% $.downloads_container %>\n delete_after: <% $.delete_after %>\n on-success: create_tempurl\n on-error: download_logs_set_status_failed\n publish:\n filename: <% task().result %>\n\n create_tempurl:\n action: tripleo.swift.tempurl\n on-success: set_status_success\n on-error: create_tempurl_set_status_failed\n input:\n container: <% $.downloads_container %>\n obj: <% $.filename %>\n valid: 3600\n publish:\n tempurl: <% task().result %>\n\n set_status_success:\n on-success: notify_zaqar\n publish:\n status: SUCCESS\n message: <% task(create_tempurl).result %>\n tempurl: <% task(create_tempurl).result %>\n\n publish_logs_set_status_failed:\n on-success: notify_zaqar\n publish:\n status: FAILED\n message: <% task(publish_logs).result %>\n\n download_logs_set_status_failed:\n on-success: notify_zaqar\n publish:\n status: FAILED\n message: <% task(prepare_log_download).result %>\n\n create_tempurl_set_status_failed:\n on-success: notify_zaqar\n publish:\n status: FAILED\n message: <% task(create_tempurl).result %>\n\n notify_zaqar:\n action: zaqar.queue_post\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.plan_management.v1.download_logs\n payload:\n status: <% $.status %>\n message: <% $.get('message', '') %>\n execution: <% execution() %>\n tempurl: <% $.get('tempurl', '') %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n\n list_roles:\n description: Retrieve the roles_data.yaml and return a usable object\n\n input:\n - container: overcloud\n - roles_data_file: 'roles_data.yaml'\n - queue_name: tripleo\n\n output:\n roles_data: <% $.roles_data %>\n\n tags:\n - tripleo-common-managed\n\n tasks:\n get_roles_data:\n action: swift.get_object\n input:\n container: <% $.container %>\n obj: <% $.roles_data_file %>\n publish:\n roles_data: <% yaml_parse(task().result.last()) %>\n status: SUCCESS\n on-success: notify_zaqar\n publish-on-error:\n status: FAILED\n 
message: <% task().result %>\n on-error: notify_zaqar\n\n notify_zaqar:\n action: zaqar.queue_post\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.plan_management.v1.list_roles\n payload:\n status: <% $.status %>\n roles_data: <% $.get('roles_data', {}) %>\n message: <% $.get('message', '') %>\n execution: <% execution() %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n\n list_available_networks:\n input:\n - container\n - queue_name: tripleo\n\n output:\n available_networks: <% $.available_networks %>\n\n tags:\n - tripleo-common-managed\n\n tasks:\n get_network_file_names:\n action: swift.get_container\n input:\n container: <% $.container %>\n publish:\n network_names: <% task().result[1].where($.name.startsWith('networks/')).where($.name.endsWith('.yaml')).name %>\n on-success: get_network_files\n on-error: notify_zaqar\n publish-on-error:\n status: FAILED\n message: <% task().result %>\n\n get_network_files:\n with-items: network_name in <% $.network_names %>\n action: swift.get_object\n on-success: transform_output\n on-error: notify_zaqar\n input:\n container: <% $.container %>\n obj: <% $.network_name %>\n publish:\n status: SUCCESS\n available_yaml_networks: <% task().result.select($[1]) %>\n publish-on-error:\n status: FAILED\n message: <% task().result %>\n\n transform_output:\n publish:\n status: SUCCESS\n available_networks: <% yaml_parse($.available_yaml_networks.join(\"\\n\")) %>\n publish-on-error:\n status: FAILED\n message: <% task().result %>\n on-complete: notify_zaqar\n\n notify_zaqar:\n action: zaqar.queue_post\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.plan_management.v1.list_available_networks\n payload:\n status: <% $.status %>\n message: <% $.get('message', '') %>\n execution: <% execution() %>\n available_networks: <% $.get('available_networks', []) %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n\n list_networks:\n input:\n - container: 
'overcloud'\n - network_data_file: 'network_data.yaml'\n - queue_name: tripleo\n\n output:\n network_data: <% $.network_data %>\n\n tags:\n - tripleo-common-managed\n\n tasks:\n get_networks:\n action: swift.get_object\n input:\n container: <% $.container %>\n obj: <% $.network_data_file %>\n on-success: notify_zaqar\n publish:\n network_data: <% yaml_parse(task().result.last()) %>\n status: SUCCESS\n message: <% task().result %>\n on-error: notify_zaqar\n publish-on-error:\n status: FAILED\n message: <% task().result %>\n\n notify_zaqar:\n action: zaqar.queue_post\n retry: count=5 delay=1\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.plan_management.v1.list_networks\n payload:\n status: <% $.status %>\n network_data: <% $.get('network_data', {}) %>\n message: <% $.get('message', '') %>\n execution: <% execution() %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n\n validate_network_files:\n description: Validate network files exist\n input:\n - container: overcloud\n - network_data\n - queue_name: tripleo\n\n output:\n network_data: <% $.network_data %>\n\n tags:\n - tripleo-common-managed\n\n tasks:\n get_network_names:\n publish:\n network_names_lower: <% $.network_data.where($.containsKey('name_lower')).name_lower %>\n network_names: <% $.network_data.where(not $.containsKey('name_lower')).name %>\n on-success: validate_networks\n\n validate_networks:\n with-items: network in <% $.network_names_lower.concat($.network_names) %>\n action: swift.head_object\n input:\n container: <% $.container %>\n obj: network/<% $.network.toLower() %>.yaml\n publish:\n status: SUCCESS\n message: <% task().result %>\n on-success: notify_zaqar\n publish-on-error:\n status: FAILED\n message: <% task().result %>\n\n notify_zaqar:\n action: zaqar.queue_post\n retry: count=5 delay=1\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.plan_management.v1.validate_network_files\n payload:\n status: <% $.status %>\n 
message: <% $.message %>\n execution: <% execution() %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n\n validate_networks:\n description: Validate network files were generated properly and exist\n input:\n - container: 'overcloud'\n - network_data_file: 'network_data.yaml'\n - queue_name: tripleo\n\n output:\n network_data: <% $.network_data %>\n\n tags:\n - tripleo-common-managed\n\n tasks:\n get_network_data:\n workflow: list_networks\n input:\n container: <% $.container %>\n network_data_file: <% $.network_data_file %>\n queue_name: <% $.queue_name %>\n publish:\n network_data: <% task().result.network_data %>\n on-success: validate_networks\n publish-on-error:\n status: FAILED\n message: <% task().result %>\n on-error:\n notify_zaqar\n\n validate_networks:\n workflow: validate_network_files\n input:\n container: <% $.container %>\n network_data: <% $.network_data %>\n queue_name: <% $.queue_name %>\n publish:\n status: SUCCESS\n message: <% task().result %>\n on-success: notify_zaqar\n publish-on-error:\n status: FAILED\n message: <% task().result %>\n on-error: notify_zaqar\n\n notify_zaqar:\n action: zaqar.queue_post\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.plan_management.v1.validate_networks\n payload:\n status: <% $.status %>\n network_data: <% $.get('network_data', {}) %>\n message: <% $.get('message', '') %>\n execution: <% execution() %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n\n validate_roles:\n description: Validate roles data exists and is parsable\n\n input:\n - container: overcloud\n - roles_data_file: 'roles_data.yaml'\n - queue_name: tripleo\n\n output:\n roles_data: <% $.roles_data %>\n\n tags:\n - tripleo-common-managed\n\n tasks:\n get_roles_data:\n workflow: list_roles\n input:\n container: <% $.container %>\n roles_data_file: <% $.roles_data_file %>\n queue_name: <% $.queue_name %>\n publish:\n roles_data: <% task().result.roles_data %>\n status: SUCCESS\n on-success: 
notify_zaqar\n publish-on-error:\n status: FAILED\n message: <% task().result %>\n on-error:\n notify_zaqar\n\n notify_zaqar:\n action: zaqar.queue_post\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.plan_management.v1.validate_networks\n payload:\n status: <% $.status %>\n roles_data: <% $.get('roles_data', '') %>\n message: <% $.get('message', '') %>\n execution: <% execution() %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n\n _validate_networks_from_roles:\n description: Internal workflow for validating a network exists from a role\n\n input:\n - container: overcloud\n - defined_networks\n - networks_in_roles\n - queue_name: tripleo\n\n tags:\n - tripleo-common-managed\n\n tasks:\n validate_network_in_network_data:\n publish:\n networks_found: <% $.networks_in_roles.toSet().intersect($.defined_networks.toSet()) %>\n networks_not_found: <% $.networks_in_roles.toSet().difference($.defined_networks.toSet()) %>\n on-success:\n - network_not_found: <% $.networks_not_found %>\n - notify_zaqar: <% not $.networks_not_found %>\n\n network_not_found:\n publish:\n message: <% \"Some networks in roles are not defined, {0}\".format($.networks_not_found.join(', ')) %>\n status: FAILED\n on-success: notify_zaqar\n\n notify_zaqar:\n action: zaqar.queue_post\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.plan_management.v1._validate_networks_from_role\n payload:\n status: <% $.get('status', 'SUCCESS') %>\n message: <% $.get('message', '') %>\n execution: <% execution() %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n\n validate_roles_and_networks:\n description: Validate that roles and network data are valid\n\n input:\n - container: overcloud\n - roles_data_file: 'roles_data.yaml'\n - network_data_file: 'network_data.yaml'\n - queue_name: tripleo\n\n output:\n roles_data: <% $.roles_data %>\n network_data: <% $.network_data %>\n\n tags:\n - tripleo-common-managed\n\n tasks:\n 
validate_network_data:\n workflow: validate_networks\n input:\n container: <% $.container %>\n network_data_file: <% $.network_data_file %>\n queue_name: <% $.queue_name %>\n publish:\n network_data: <% task().result.network_data %>\n on-success: validate_roles_data\n publish-on-error:\n status: FAILED\n message: <% task().result %>\n on-error: notify_zaqar\n\n validate_roles_data:\n workflow: validate_roles\n input:\n container: <% $.container %>\n roles_data_file: <% $.roles_data_file %>\n queue_name: <% $.queue_name %>\n publish:\n roles_data: <% task().result.roles_data %>\n role_networks_data: <% task().result.roles_data.networks %>\n networks_in_roles: <% task().result.roles_data.networks.flatten().distinct() %>\n on-success: validate_roles_and_networks\n publish-on-error:\n status: FAILED\n message: <% task().result %>\n on-error: notify_zaqar\n\n validate_roles_and_networks:\n workflow: _validate_networks_from_roles\n input:\n container: <% $.container %>\n defined_networks: <% $.network_data.name %>\n networks_in_roles: <% $.networks_in_roles %>\n queue_name: <% $.queue_name %>\n publish:\n status: SUCCESS\n on-success: notify_zaqar\n publish-on-error:\n status: FAILED\n message: <% task().result.message %>\n on-error: notify_zaqar\n\n notify_zaqar:\n action: zaqar.queue_post\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.plan_management.v1.validate_roles_and_networks\n payload:\n status: <% $.status %>\n roles_data: <% $.get('roles_data', {}) %>\n network_data: <% $.get('network_data', {}) %>\n message: <% $.get('message', '') %>\n execution: <% execution() %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n\n list_available_roles:\n input:\n - container: overcloud\n - queue_name: tripleo\n\n output:\n available_roles: <% $.available_roles %>\n\n tags:\n - tripleo-common-managed\n\n tasks:\n get_role_file_names:\n action: swift.get_container\n input:\n container: <% $.container %>\n publish:\n role_names: <% 
task().result[1].where($.name.startsWith('roles/')).where($.name.endsWith('.yaml')).name %>\n on-success: get_role_files\n on-error: notify_zaqar\n publish-on-error:\n status: FAILED\n message: <% task().result %>\n\n get_role_files:\n with-items: role_name in <% $.role_names %>\n action: swift.get_object\n on-success: transform_output\n on-error: notify_zaqar\n input:\n container: <% $.container %>\n obj: <% $.role_name %>\n publish:\n status: SUCCESS\n available_yaml_roles: <% task().result.select($[1]) %>\n publish-on-error:\n status: FAILED\n message: <% task().result %>\n\n transform_output:\n publish:\n status: SUCCESS\n available_roles: <% yaml_parse($.available_yaml_roles.join(\"\\n\")) %>\n publish-on-error:\n status: FAILED\n message: <% task().result %>\n on-complete: notify_zaqar\n\n notify_zaqar:\n action: zaqar.queue_post\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.plan_management.v1.list_available_roles\n payload:\n status: <% $.status %>\n message: <% $.get('message', '') %>\n execution: <% execution() %>\n available_roles: <% $.get('available_roles', []) %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n\n update_roles:\n description: >\n takes data in json format validates its contents and persists them in\n roles_data.yaml, after successful update, templates are regenerated.\n input:\n - container\n - roles\n - roles_data_file: 'roles_data.yaml'\n - replace_all: false\n - queue_name: tripleo\n tags:\n - tripleo-common-managed\n tasks:\n get_available_roles:\n workflow: list_available_roles\n input:\n container: <% $.container %>\n queue_name: <% $.queue_name%>\n publish:\n available_roles: <% task().result.available_roles %>\n on-success: validate_input\n on-error: notify_zaqar\n publish-on-error:\n status: FAILED\n message: <% task().result %>\n\n validate_input:\n description: >\n validate the format of input (verify that each role in input has the\n required attributes set. 
check README in roles directory in t-h-t),\n validate that roles in input exist in roles directory in t-h-t\n action: tripleo.plan.validate_roles\n input:\n container: <% $.container %>\n roles: <% $.roles %>\n available_roles: <% $.available_roles %>\n on-success: get_network_data\n on-error: notify_zaqar\n publish-on-error:\n status: FAILED\n message: <% task().result %>\n\n get_network_data:\n workflow: list_networks\n input:\n container: <% $.container %>\n queue_name: <% $.queue_name %>\n publish:\n network_data: <% task().result.network_data %>\n on-success: validate_network_names\n publish-on-error:\n status: FAILED\n message: <% task().result %>\n on-error: notify_zaqar\n\n validate_network_names:\n description: >\n validate that Network names assigned to Role exist in\n network-data.yaml object in Swift container\n workflow: _validate_networks_from_roles\n input:\n container: <% $.container %>\n defined_networks: <% $.network_data.name %>\n networks_in_roles: <% $.roles.networks.flatten().distinct() %>\n queue_name: <% $.queue_name %>\n on-success: get_current_roles\n on-error: notify_zaqar\n publish-on-error:\n status: FAILED\n message: <% task().result.message %>\n\n get_current_roles:\n workflow: list_roles\n input:\n container: <% $.container %>\n roles_data_file: <% $.roles_data_file %>\n queue_name: <% $.queue_name %>\n publish:\n current_roles: <% task().result.roles_data %>\n on-success: update_roles_data\n on-error: notify_zaqar\n publish-on-error:\n status: FAILED\n message: <% task().result %>\n\n update_roles_data:\n description: >\n update roles_data.yaml object in Swift with roles from workflow input\n action: tripleo.plan.update_roles\n input:\n container: <% $.container %>\n roles: <% $.roles %>\n current_roles: <% $.current_roles %>\n replace_all: <% $.replace_all %>\n publish:\n updated_roles_data: <% task().result.roles %>\n on-success: update_roles_data_in_swift\n on-error: notify_zaqar\n publish-on-error:\n status: FAILED\n message: <% 
task().result %>\n\n update_roles_data_in_swift:\n description: >\n update roles_data.yaml object in Swift with data from workflow input\n action: swift.put_object\n input:\n container: <% $.container %>\n obj: <% $.roles_data_file %>\n contents: <% yaml_dump($.updated_roles_data) %>\n on-success: regenerate_templates\n publish-on-error:\n status: FAILED\n message: <% task().result %>\n on-error: notify_zaqar\n\n regenerate_templates:\n action: tripleo.templates.process container=<% $.container %>\n on-success: get_updated_roles\n on-error: notify_zaqar\n publish-on-error:\n status: FAILED\n message: <% task().result %>\n\n get_updated_roles:\n workflow: list_roles\n input:\n container: <% $.container %>\n roles_data_file: <% $.roles_data_file %>\n publish:\n updated_roles: <% task().result.roles_data %>\n status: SUCCESS\n on-complete: notify_zaqar\n publish-on-error:\n status: FAILED\n message: <% task().result %>\n\n notify_zaqar:\n action: zaqar.queue_post\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.roles.v1.update_roles\n payload:\n status: <% $.status %>\n message: <% $.get('message', '') %>\n execution: <% execution() %>\n updated_roles: <% $.get('updated_roles', []) %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n\n select_roles:\n description: >\n takes a list of role names as input and populates roles_data.yaml in\n container in Swift with respective roles from 'roles directory'\n input:\n - container\n - role_names\n - roles_data_file: 'roles_data.yaml'\n - replace_all: true\n - queue_name: tripleo\n tags:\n - tripleo-common-managed\n tasks:\n\n get_available_roles:\n workflow: list_available_roles\n input:\n container: <% $.container %>\n queue_name: <% $.queue_name %>\n publish:\n available_roles: <% task().result.available_roles %>\n on-success: get_current_roles\n on-error: notify_zaqar\n publish-on-error:\n status: FAILED\n message: <% task().result %>\n\n get_current_roles:\n workflow: list_roles\n 
input:\n container: <% $.container %>\n roles_data_file: <% $.roles_data_file %>\n queue_name: <% $.queue_name %>\n publish:\n current_roles: <% task().result.roles_data %>\n on-success: gather_roles\n on-error: notify_zaqar\n publish-on-error:\n status: FAILED\n message: <% task().result %>\n\n gather_roles:\n description: >\n for each role name from the input, check if it exists in\n roles_data.yaml, if yes, use that role definition, if not, get the\n role definition from roles directory. Use the gathered roles\n definitions as input to updateRolesWorkflow - this ensures\n configuration of the roles which are already in roles_data.yaml\n will not get overridden by data from roles directory\n action: tripleo.plan.gather_roles\n input:\n role_names: <% $.role_names %>\n current_roles: <% $.current_roles %>\n available_roles: <% $.available_roles %>\n publish:\n gathered_roles: <% task().result.gathered_roles %>\n on-success: call_update_roles_workflow\n on-error: notify_zaqar\n publish-on-error:\n status: FAILED\n message: <% task().result %>\n\n call_update_roles_workflow:\n workflow: update_roles\n input:\n container: <% $.container %>\n roles: <% $.gathered_roles %>\n roles_data_file: <% $.roles_data_file %>\n replace_all: <% $.replace_all %>\n queue_name: <% $.queue_name %>\n on-complete: notify_zaqar\n publish:\n selected_roles: <% task().result.updated_roles %>\n status: SUCCESS\n publish-on-error:\n status: FAILED\n message: <% task().result %>\n\n notify_zaqar:\n action: zaqar.queue_post\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.plan_management.v1.select_roles\n payload:\n status: <% $.status %>\n message: <% $.get('message', '') %>\n execution: <% execution() %>\n selected_roles: <% $.get('selected_roles', []) %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n", "name": "tripleo.plan_management.v1", "tags": [], "created_at": "2018-08-21 13:35:50", "scope": "private", "project_id": 
"f22c2ea140d3466abe874cd49d41a625", "id": "16dd83ba-5b47-4d6f-bdba-70e1542c1743"} > >2018-08-21 16:35:51,364 DEBUG: HTTP POST http://192.168.24.1:8989/v2/workbooks 201 >2018-08-21 16:35:51,369 DEBUG: REQ: curl -g -i -X POST http://192.168.24.1:8989/v2/workbooks -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "content-type: text/plain" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" -d '--- >version: '2.0' >name: tripleo.scale.v1 >description: TripleO Overcloud Deployment Workflows v1 > >workflows: > > delete_node: > description: deletes given overcloud nodes and updates the stack > > input: > - container > - nodes > - timeout: 240 > - queue_name: tripleo > > tags: > - tripleo-common-managed > > tasks: > > delete_node: > action: tripleo.scale.delete_node nodes=<% $.nodes %> timeout=<% $.timeout %> container=<% $.container %> > on-success: wait_for_stack_in_progress > on-error: set_delete_node_failed > > set_delete_node_failed: > on-success: send_message > publish: > status: FAILED > message: <% task(delete_node).result %> > > wait_for_stack_in_progress: > workflow: tripleo.stack.v1.wait_for_stack_in_progress stack=<% $.container %> > on-success: wait_for_stack_complete > on-error: wait_for_stack_in_progress_failed > > wait_for_stack_in_progress_failed: > on-success: send_message > publish: > status: FAILED > message: <% task(wait_for_stack_in_progress).result %> > > wait_for_stack_complete: > workflow: tripleo.stack.v1.wait_for_stack_complete_or_failed stack=<% $.container %> > on-success: send_message > on-error: wait_for_stack_complete_failed > > wait_for_stack_complete_failed: > on-success: send_message > publish: > status: FAILED > message: <% task(wait_for_stack_complete).result %> > > send_message: > action: zaqar.queue_post > retry: count=5 delay=1 > input: > queue_name: <% $.queue_name %> > messages: > body: > type: tripleo.scale.v1.delete_node > payload: > status: <% $.get('status', 'SUCCESS') %> > message: 
<% $.get('message', '') %> > execution: <% execution() %> > on-success: > - fail: <% $.get('status') = "FAILED" %> >' >2018-08-21 16:35:52,550 DEBUG: http://192.168.24.1:8989 "POST /v2/workbooks HTTP/1.1" 201 2258 >2018-08-21 16:35:52,553 DEBUG: RESP: [201] Content-Length: 2258 Content-Type: application/json Date: Tue, 21 Aug 2018 13:35:52 GMT Connection: keep-alive >RESP BODY: {"definition": "---\nversion: '2.0'\nname: tripleo.scale.v1\ndescription: TripleO Overcloud Deployment Workflows v1\n\nworkflows:\n\n delete_node:\n description: deletes given overcloud nodes and updates the stack\n\n input:\n - container\n - nodes\n - timeout: 240\n - queue_name: tripleo\n\n tags:\n - tripleo-common-managed\n\n tasks:\n\n delete_node:\n action: tripleo.scale.delete_node nodes=<% $.nodes %> timeout=<% $.timeout %> container=<% $.container %>\n on-success: wait_for_stack_in_progress\n on-error: set_delete_node_failed\n\n set_delete_node_failed:\n on-success: send_message\n publish:\n status: FAILED\n message: <% task(delete_node).result %>\n\n wait_for_stack_in_progress:\n workflow: tripleo.stack.v1.wait_for_stack_in_progress stack=<% $.container %>\n on-success: wait_for_stack_complete\n on-error: wait_for_stack_in_progress_failed\n\n wait_for_stack_in_progress_failed:\n on-success: send_message\n publish:\n status: FAILED\n message: <% task(wait_for_stack_in_progress).result %>\n\n wait_for_stack_complete:\n workflow: tripleo.stack.v1.wait_for_stack_complete_or_failed stack=<% $.container %>\n on-success: send_message\n on-error: wait_for_stack_complete_failed\n\n wait_for_stack_complete_failed:\n on-success: send_message\n publish:\n status: FAILED\n message: <% task(wait_for_stack_complete).result %>\n\n send_message:\n action: zaqar.queue_post\n retry: count=5 delay=1\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.scale.v1.delete_node\n payload:\n status: <% $.get('status', 'SUCCESS') %>\n message: <% $.get('message', '') %>\n execution: <% 
execution() %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n", "name": "tripleo.scale.v1", "tags": [], "created_at": "2018-08-21 13:35:52", "scope": "private", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "9ba9c5f2-de61-4b8a-8f79-c5253d1122dc"} > >2018-08-21 16:35:52,554 DEBUG: HTTP POST http://192.168.24.1:8989/v2/workbooks 201 >2018-08-21 16:35:52,556 DEBUG: REQ: curl -g -i -X POST http://192.168.24.1:8989/v2/workbooks -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "content-type: text/plain" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" -d '--- >version: '2.0' >name: tripleo.swift_rings_backup.v1 >description: TripleO Swift Rings backup container Deployment Workflow v1 > >workflows: > > create_swift_rings_backup_container_plan: > description: > > This plan ensures existence of container for Swift Rings backup. > input: > - container > - queue_name: tripleo > tags: > - tripleo-common-managed > tasks: > > swift_rings_container: > publish: > swift_rings_container: "<% $.container %>-swift-rings" > swift_rings_tar: "swift-rings.tar.gz" > on-complete: check_container > > check_container: > action: swift.head_container container=<% $.swift_rings_container %> > on-success: get_tempurl > on-error: create_container > > create_container: > action: swift.put_container container=<% $.swift_rings_container %> > on-error: set_create_container_failed > on-success: get_tempurl > > get_tempurl: > action: tripleo.swift.tempurl > on-success: set_get_tempurl > input: > container: <% $.swift_rings_container %> > obj: <% $.swift_rings_tar %> > > set_get_tempurl: > action: tripleo.parameters.update > input: > parameters: > SwiftRingGetTempurl: <% task(get_tempurl).result %> > container: <% $.container %> > on-success: put_tempurl > > put_tempurl: > action: tripleo.swift.tempurl > on-success: set_put_tempurl > input: > container: <% $.swift_rings_container %> > obj: <% $.swift_rings_tar %> > method: "PUT" > > 
set_put_tempurl: > action: tripleo.parameters.update > input: > parameters: > SwiftRingPutTempurl: <% task(put_tempurl).result %> > container: <% $.container %> > on-success: set_status_success > on-error: set_put_tempurl_failed > > set_status_success: > on-success: notify_zaqar > publish: > status: SUCCESS > message: <% task(set_put_tempurl).result %> > > set_put_tempurl_failed: > on-success: notify_zaqar > publish: > status: FAILED > message: <% task(set_put_tempurl).result %> > > set_create_container_failed: > on-success: notify_zaqar > publish: > status: FAILED > message: <% task(create_container).result %> > > notify_zaqar: > action: zaqar.queue_post > input: > queue_name: <% $.queue_name %> > messages: > body: > type: tripleo.swift_rings_backup.v1.create_swift_rings_backup_container_plan > payload: > status: <% $.status %> > message: <% $.get('message', '') %> > execution: <% execution() %> > on-success: > - fail: <% $.get('status') = "FAILED" %> >' >2018-08-21 16:35:54,133 DEBUG: http://192.168.24.1:8989 "POST /v2/workbooks HTTP/1.1" 201 3154 >2018-08-21 16:35:54,137 DEBUG: RESP: [201] Content-Length: 3154 Content-Type: application/json Date: Tue, 21 Aug 2018 13:35:54 GMT Connection: keep-alive >RESP BODY: {"definition": "---\nversion: '2.0'\nname: tripleo.swift_rings_backup.v1\ndescription: TripleO Swift Rings backup container Deployment Workflow v1\n\nworkflows:\n\n create_swift_rings_backup_container_plan:\n description: >\n This plan ensures existence of container for Swift Rings backup.\n input:\n - container\n - queue_name: tripleo\n tags:\n - tripleo-common-managed\n tasks:\n\n swift_rings_container:\n publish:\n swift_rings_container: \"<% $.container %>-swift-rings\"\n swift_rings_tar: \"swift-rings.tar.gz\"\n on-complete: check_container\n\n check_container:\n action: swift.head_container container=<% $.swift_rings_container %>\n on-success: get_tempurl\n on-error: create_container\n\n create_container:\n action: swift.put_container container=<% 
$.swift_rings_container %>\n on-error: set_create_container_failed\n on-success: get_tempurl\n\n get_tempurl:\n action: tripleo.swift.tempurl\n on-success: set_get_tempurl\n input:\n container: <% $.swift_rings_container %>\n obj: <% $.swift_rings_tar %>\n\n set_get_tempurl:\n action: tripleo.parameters.update\n input:\n parameters:\n SwiftRingGetTempurl: <% task(get_tempurl).result %>\n container: <% $.container %>\n on-success: put_tempurl\n\n put_tempurl:\n action: tripleo.swift.tempurl\n on-success: set_put_tempurl\n input:\n container: <% $.swift_rings_container %>\n obj: <% $.swift_rings_tar %>\n method: \"PUT\"\n\n set_put_tempurl:\n action: tripleo.parameters.update\n input:\n parameters:\n SwiftRingPutTempurl: <% task(put_tempurl).result %>\n container: <% $.container %>\n on-success: set_status_success\n on-error: set_put_tempurl_failed\n\n set_status_success:\n on-success: notify_zaqar\n publish:\n status: SUCCESS\n message: <% task(set_put_tempurl).result %>\n\n set_put_tempurl_failed:\n on-success: notify_zaqar\n publish:\n status: FAILED\n message: <% task(set_put_tempurl).result %>\n\n set_create_container_failed:\n on-success: notify_zaqar\n publish:\n status: FAILED\n message: <% task(create_container).result %>\n\n notify_zaqar:\n action: zaqar.queue_post\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.swift_rings_backup.v1.create_swift_rings_backup_container_plan\n payload:\n status: <% $.status %>\n message: <% $.get('message', '') %>\n execution: <% execution() %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n", "name": "tripleo.swift_rings_backup.v1", "tags": [], "created_at": "2018-08-21 13:35:54", "scope": "private", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "1eeff35d-c780-43e2-8872-8b05e00405d1"} > >2018-08-21 16:35:54,137 DEBUG: HTTP POST http://192.168.24.1:8989/v2/workbooks 201 >2018-08-21 16:35:54,140 DEBUG: REQ: curl -g -i -X POST http://192.168.24.1:8989/v2/workbooks -H 
"User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "content-type: text/plain" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" -d '--- >version: '2.0' >name: tripleo.octavia_post.v1 >description: TripleO Octavia post deployment Workflows > >workflows: > > octavia_post_deploy: > description: Octavia post deployment > input: > - amp_image_name > - amp_image_filename > - amp_image_tag > - amp_ssh_key_name > - amp_ssh_key_path > - amp_ssh_key_data > - auth_username > - auth_password > - auth_project_name > - lb_mgmt_net_name > - lb_mgmt_subnet_name > - lb_sec_group_name > - lb_mgmt_subnet_cidr > - lb_mgmt_subnet_gateway > - lb_mgmt_subnet_pool_start > - lb_mgmt_subnet_pool_end > - generate_certs > - octavia_ansible_playbook > - overcloud_admin > - ca_cert_path > - ca_private_key_path > - ca_passphrase > - client_cert_path > - mgmt_port_dev > - overcloud_password > - overcloud_project > - overcloud_pub_auth_uri > - ansible_extra_env_variables: > ANSIBLE_HOST_KEY_CHECKING: 'False' > ANSIBLE_SSH_RETRIES: '3' > tags: > - tripleo-common-managed > tasks: > get_overcloud_stack_details: > publish: > # TODO(beagles), we are making an assumption about the octavia heatlh manager and > # controller worker needing > # > octavia_controller_ips: <% env().get('service_ips', {}).get('octavia_worker_ctlplane_node_ips', []) %> > on-success: enable_ssh_admin > > enable_ssh_admin: > workflow: tripleo.access.v1.enable_ssh_admin > input: > ssh_servers: <% $.octavia_controller_ips %> > on-success: get_private_key > > get_private_key: > action: tripleo.validations.get_privkey > publish: > private_key: <% task().result %> > on-success: make_local_temp_directory > > make_local_temp_directory: > action: tripleo.files.make_temp_dir > publish: > undercloud_local_dir: <% task().result.path %> > on-success: make_remote_temp_directory > > make_remote_temp_directory: > action: tripleo.files.make_temp_dir > publish: > undercloud_remote_dir: <% 
task().result.path %> > on-success: build_local_connection_environment_vars > > build_local_connection_environment_vars: > publish: > ansible_local_connection_variables: <% dict('ANSIBLE_REMOTE_TEMP' => $.undercloud_remote_dir, 'ANSIBLE_LOCAL_TEMP' => $.undercloud_local_dir) + $.ansible_extra_env_variables %> > on-success: upload_amphora > > upload_amphora: > action: tripleo.ansible-playbook > input: > inventory: > undercloud: > hosts: > localhost: > ansible_connection: local > > playbook: <% $.octavia_ansible_playbook %> > remote_user: stack > extra_env_variables: <% $.ansible_local_connection_variables %> > extra_vars: > os_password: <% $.overcloud_password %> > os_username: <% $.overcloud_admin %> > os_project_name: <% $.overcloud_project %> > os_auth_url: <% $.overcloud_pub_auth_uri %> > os_auth_type: "password" > os_identity_api_version: "3" > amp_image_name: <% $.amp_image_name %> > amp_image_filename: <% $.amp_image_filename %> > amp_image_tag: <% $.amp_image_tag %> > amp_ssh_key_name: <% $.amp_ssh_key_name %> > amp_ssh_key_path: <% $.amp_ssh_key_path %> > amp_ssh_key_data: <% $.amp_ssh_key_data %> > auth_username: <% $.auth_username %> > auth_password: <% $.auth_password %> > auth_project_name: <% $.auth_project_name %> > on-success: config_octavia > > config_octavia: > action: tripleo.ansible-playbook > input: > inventory: > octavia_nodes: > hosts: <% $.octavia_controller_ips.toDict($, {}) %> > verbosity: 0 > playbook: <% $.octavia_ansible_playbook %> > remote_user: tripleo-admin > become: true > become_user: root > ssh_private_key: <% $.private_key %> > ssh_common_args: '-o StrictHostKeyChecking=no' > ssh_extra_args: '-o UserKnownHostsFile=/dev/null' > extra_env_variables: <% $.ansible_extra_env_variables %> > extra_vars: > os_password: <% $.overcloud_password %> > os_username: <% $.overcloud_admin %> > os_project_name: <% $.overcloud_project %> > os_auth_url: <% $.overcloud_pub_auth_uri %> > os_auth_type: "password" > os_identity_api_version: "3" > 
amp_image_tag: <% $.amp_image_tag %> > lb_mgmt_net_name: <% $.lb_mgmt_net_name %> > lb_mgmt_subnet_name: <% $.lb_mgmt_subnet_name %> > lb_sec_group_name: <% $.lb_sec_group_name %> > lb_mgmt_subnet_cidr: <% $.lb_mgmt_subnet_cidr %> > lb_mgmt_subnet_gateway: <% $.lb_mgmt_subnet_gateway %> > lb_mgmt_subnet_pool_start: <% $.lb_mgmt_subnet_pool_start %> > lb_mgmt_subnet_pool_end: <% $.lb_mgmt_subnet_pool_end %> > ca_cert_path: <% $.ca_cert_path %> > ca_private_key_path: <% $.ca_private_key_path %> > ca_passphrase: <% $.ca_passphrase %> > client_cert_path: <% $.client_cert_path %> > generate_certs: <% $.generate_certs %> > mgmt_port_dev: <% $.mgmt_port_dev %> > auth_project_name: <% $.auth_project_name %> > on-complete: purge_local_temp_dir > purge_local_temp_dir: > action: tripleo.files.remove_temp_dir path=<% $.undercloud_local_dir %> > on-complete: purge_remote_temp_dir > purge_remote_temp_dir: > action: tripleo.files.remove_temp_dir path=<% $.undercloud_remote_dir %> > >' >2018-08-21 16:35:55,642 DEBUG: http://192.168.24.1:8989 "POST /v2/workbooks HTTP/1.1" 201 6113 >2018-08-21 16:35:55,646 DEBUG: RESP: [201] Content-Length: 6113 Content-Type: application/json Date: Tue, 21 Aug 2018 13:35:55 GMT Connection: keep-alive >RESP BODY: {"definition": "---\nversion: '2.0'\nname: tripleo.octavia_post.v1\ndescription: TripleO Octavia post deployment Workflows\n\nworkflows:\n\n octavia_post_deploy:\n description: Octavia post deployment\n input:\n - amp_image_name\n - amp_image_filename\n - amp_image_tag\n - amp_ssh_key_name\n - amp_ssh_key_path\n - amp_ssh_key_data\n - auth_username\n - auth_password\n - auth_project_name\n - lb_mgmt_net_name\n - lb_mgmt_subnet_name\n - lb_sec_group_name\n - lb_mgmt_subnet_cidr\n - lb_mgmt_subnet_gateway\n - lb_mgmt_subnet_pool_start\n - lb_mgmt_subnet_pool_end\n - generate_certs\n - octavia_ansible_playbook\n - overcloud_admin\n - ca_cert_path\n - ca_private_key_path\n - ca_passphrase\n - client_cert_path\n - mgmt_port_dev\n - 
overcloud_password\n - overcloud_project\n - overcloud_pub_auth_uri\n - ansible_extra_env_variables:\n ANSIBLE_HOST_KEY_CHECKING: 'False'\n ANSIBLE_SSH_RETRIES: '3'\n tags:\n - tripleo-common-managed\n tasks:\n get_overcloud_stack_details:\n publish:\n # TODO(beagles), we are making an assumption about the octavia heatlh manager and\n # controller worker needing\n #\n octavia_controller_ips: <% env().get('service_ips', {}).get('octavia_worker_ctlplane_node_ips', []) %>\n on-success: enable_ssh_admin\n\n enable_ssh_admin:\n workflow: tripleo.access.v1.enable_ssh_admin\n input:\n ssh_servers: <% $.octavia_controller_ips %>\n on-success: get_private_key\n\n get_private_key:\n action: tripleo.validations.get_privkey\n publish:\n private_key: <% task().result %>\n on-success: make_local_temp_directory\n\n make_local_temp_directory:\n action: tripleo.files.make_temp_dir\n publish:\n undercloud_local_dir: <% task().result.path %>\n on-success: make_remote_temp_directory\n\n make_remote_temp_directory:\n action: tripleo.files.make_temp_dir\n publish:\n undercloud_remote_dir: <% task().result.path %>\n on-success: build_local_connection_environment_vars\n\n build_local_connection_environment_vars:\n publish:\n ansible_local_connection_variables: <% dict('ANSIBLE_REMOTE_TEMP' => $.undercloud_remote_dir, 'ANSIBLE_LOCAL_TEMP' => $.undercloud_local_dir) + $.ansible_extra_env_variables %>\n on-success: upload_amphora\n\n upload_amphora:\n action: tripleo.ansible-playbook\n input:\n inventory:\n undercloud:\n hosts:\n localhost:\n ansible_connection: local\n\n playbook: <% $.octavia_ansible_playbook %>\n remote_user: stack\n extra_env_variables: <% $.ansible_local_connection_variables %>\n extra_vars:\n os_password: <% $.overcloud_password %>\n os_username: <% $.overcloud_admin %>\n os_project_name: <% $.overcloud_project %>\n os_auth_url: <% $.overcloud_pub_auth_uri %>\n os_auth_type: \"password\"\n os_identity_api_version: \"3\"\n amp_image_name: <% $.amp_image_name %>\n 
amp_image_filename: <% $.amp_image_filename %>\n amp_image_tag: <% $.amp_image_tag %>\n amp_ssh_key_name: <% $.amp_ssh_key_name %>\n amp_ssh_key_path: <% $.amp_ssh_key_path %>\n amp_ssh_key_data: <% $.amp_ssh_key_data %>\n auth_username: <% $.auth_username %>\n auth_password: <% $.auth_password %>\n auth_project_name: <% $.auth_project_name %>\n on-success: config_octavia\n\n config_octavia:\n action: tripleo.ansible-playbook\n input:\n inventory:\n octavia_nodes:\n hosts: <% $.octavia_controller_ips.toDict($, {}) %>\n verbosity: 0\n playbook: <% $.octavia_ansible_playbook %>\n remote_user: tripleo-admin\n become: true\n become_user: root\n ssh_private_key: <% $.private_key %>\n ssh_common_args: '-o StrictHostKeyChecking=no'\n ssh_extra_args: '-o UserKnownHostsFile=/dev/null'\n extra_env_variables: <% $.ansible_extra_env_variables %>\n extra_vars:\n os_password: <% $.overcloud_password %>\n os_username: <% $.overcloud_admin %>\n os_project_name: <% $.overcloud_project %>\n os_auth_url: <% $.overcloud_pub_auth_uri %>\n os_auth_type: \"password\"\n os_identity_api_version: \"3\"\n amp_image_tag: <% $.amp_image_tag %>\n lb_mgmt_net_name: <% $.lb_mgmt_net_name %>\n lb_mgmt_subnet_name: <% $.lb_mgmt_subnet_name %>\n lb_sec_group_name: <% $.lb_sec_group_name %>\n lb_mgmt_subnet_cidr: <% $.lb_mgmt_subnet_cidr %>\n lb_mgmt_subnet_gateway: <% $.lb_mgmt_subnet_gateway %>\n lb_mgmt_subnet_pool_start: <% $.lb_mgmt_subnet_pool_start %>\n lb_mgmt_subnet_pool_end: <% $.lb_mgmt_subnet_pool_end %>\n ca_cert_path: <% $.ca_cert_path %>\n ca_private_key_path: <% $.ca_private_key_path %>\n ca_passphrase: <% $.ca_passphrase %>\n client_cert_path: <% $.client_cert_path %>\n generate_certs: <% $.generate_certs %>\n mgmt_port_dev: <% $.mgmt_port_dev %>\n auth_project_name: <% $.auth_project_name %>\n on-complete: purge_local_temp_dir\n purge_local_temp_dir:\n action: tripleo.files.remove_temp_dir path=<% $.undercloud_local_dir %>\n on-complete: purge_remote_temp_dir\n 
purge_remote_temp_dir:\n action: tripleo.files.remove_temp_dir path=<% $.undercloud_remote_dir %>\n\n", "name": "tripleo.octavia_post.v1", "tags": [], "created_at": "2018-08-21 13:35:55", "scope": "private", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "e9c58950-cf1c-4158-95d2-6e9ddfade5c5"} > >2018-08-21 16:35:55,647 DEBUG: HTTP POST http://192.168.24.1:8989/v2/workbooks 201 >2018-08-21 16:35:55,650 DEBUG: REQ: curl -g -i -X POST http://192.168.24.1:8989/v2/workbooks -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "content-type: text/plain" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" -d '--- >version: '2.0' >name: tripleo.storage.v1 >description: TripleO manages Ceph with ceph-ansible > >workflows: > ceph-install: > # allows for additional extra_vars via workflow input > input: > - ansible_playbook_verbosity: 0 > - ansible_skip_tags: 'package-install,with_pkg' > - ansible_env_variables: {} > - ansible_extra_env_variables: > ANSIBLE_CONFIG: /usr/share/ceph-ansible/ansible.cfg > ANSIBLE_ACTION_PLUGINS: /usr/share/ceph-ansible/plugins/actions/ > ANSIBLE_ROLES_PATH: /usr/share/ceph-ansible/roles/ > ANSIBLE_RETRY_FILES_ENABLED: 'False' > ANSIBLE_LOG_PATH: /var/log/mistral/ceph-install-workflow.log > ANSIBLE_LIBRARY: /usr/share/ceph-ansible/library/ > ANSIBLE_SSH_RETRIES: '3' > ANSIBLE_HOST_KEY_CHECKING: 'False' > DEFAULT_FORKS: '25' > - ceph_ansible_extra_vars: {} > - ceph_ansible_playbook: /usr/share/ceph-ansible/site-docker.yml.sample > - node_data_lookup: '{}' > - swift_container: 'ceph_ansible_fetch_dir' > tags: > - tripleo-common-managed > tasks: > set_swift_container: > publish: > swift_container: <% concat(env().get('heat_stack_name', ''), '_', $.swift_container) %> > on-complete: collect_puppet_hieradata > collect_puppet_hieradata: > on-success: check_hieradata > publish: > hieradata: <% env().get('role_merged_configs', 
{}).values().select($.keys()).flatten().select(regex('^ceph::profile::params::osds$').search($)).where($ != null).toSet() %> > check_hieradata: > on-success: > - set_blacklisted_ips: <% not bool($.hieradata) %> > - fail(msg=<% 'Ceph deployment stopped, puppet-ceph hieradata found. Convert it into ceph-ansible variables. {0}'.format($.hieradata) %>): <% bool($.hieradata) %> > set_blacklisted_ips: > publish: > blacklisted_ips: <% env().get('blacklisted_ip_addresses', []) %> > on-success: set_ip_lists > set_ip_lists: > publish: > mgr_ips: <% let(root => $) -> env().get('service_ips', {}).get('ceph_mgr_ctlplane_node_ips', []).where(not ($ in $root.blacklisted_ips)) %> > mon_ips: <% let(root => $) -> env().get('service_ips', {}).get('ceph_mon_ctlplane_node_ips', []).where(not ($ in $root.blacklisted_ips)) %> > osd_ips: <% let(root => $) -> env().get('service_ips', {}).get('ceph_osd_ctlplane_node_ips', []).where(not ($ in $root.blacklisted_ips)) %> > mds_ips: <% let(root => $) -> env().get('service_ips', {}).get('ceph_mds_ctlplane_node_ips', []).where(not ($ in $root.blacklisted_ips)) %> > rgw_ips: <% let(root => $) -> env().get('service_ips', {}).get('ceph_rgw_ctlplane_node_ips', []).where(not ($ in $root.blacklisted_ips)) %> > nfs_ips: <% let(root => $) -> env().get('service_ips', {}).get('ceph_nfs_ctlplane_node_ips', []).where(not ($ in $root.blacklisted_ips)) %> > rbdmirror_ips: <% let(root => $) -> env().get('service_ips', {}).get('ceph_rbdmirror_ctlplane_node_ips', []).where(not ($ in $root.blacklisted_ips)) %> > client_ips: <% let(root => $) -> env().get('service_ips', {}).get('ceph_client_ctlplane_node_ips', []).where(not ($ in $root.blacklisted_ips)) %> > on-success: merge_ip_lists > merge_ip_lists: > publish: > ips_list: <% ($.mgr_ips + $.mon_ips + $.osd_ips + $.mds_ips + $.rgw_ips + $.nfs_ips + $.rbdmirror_ips + $.client_ips).toSet() %> > on-success: enable_ssh_admin > enable_ssh_admin: > workflow: tripleo.access.v1.enable_ssh_admin > input: > ssh_servers: <% 
$.ips_list %> > on-success: get_private_key > get_private_key: > action: tripleo.validations.get_privkey > publish: > private_key: <% task().result %> > on-success: make_fetch_directory > make_fetch_directory: > action: tripleo.files.make_temp_dir > publish: > fetch_directory: <% task().result.path %> > on-success: verify_container_exists > verify_container_exists: > action: swift.head_container container=<% $.swift_container %> > on-success: restore_fetch_directory > on-error: create_container > restore_fetch_directory: > action: tripleo.files.restore_temp_dir_from_swift > input: > container: <% $.swift_container %> > path: <% $.fetch_directory %> > on-success: collect_nodes_uuid > create_container: > action: swift.put_container container=<% $.swift_container %> > on-success: collect_nodes_uuid > collect_nodes_uuid: > action: tripleo.ansible-playbook > input: > inventory: > overcloud: > hosts: <% $.ips_list.toDict($, {}) %> > remote_user: tripleo-admin > become: true > become_user: root > verbosity: 0 > ssh_private_key: <% $.private_key %> > #NOTE(gfidente): set ANSIBLE_CALLBACK_WHITELIST to empty string to avoid spurious output > #in the json output. The publish: directive will in fact parse the output. 
> extra_env_variables: > ANSIBLE_CALLBACK_WHITELIST: '' > ANSIBLE_HOST_KEY_CHECKING: 'False' > ANSIBLE_STDOUT_CALLBACK: 'json' > playbook: > - hosts: overcloud > gather_facts: no > tasks: > - name: collect machine id > command: dmidecode -s system-uuid > publish: > ansible_output: <% json_parse(task().result.stderr) %> > on-success: set_ip_uuids > set_ip_uuids: > publish: > ip_uuids: <% let(root => $.ansible_output.get('plays')[0].get('tasks')[0].get('hosts')) -> $.ips_list.toDict($, $root.get($).get('stdout')) %> > on-success: parse_node_data_lookup > parse_node_data_lookup: > publish: > json_node_data_lookup: <% json_parse($.node_data_lookup) %> > on-success: map_node_data_lookup > map_node_data_lookup: > publish: > ips_data: <% let(uuids => $.ip_uuids, root => $) -> $.ips_list.toDict($, $root.json_node_data_lookup.get($uuids.get($, "NO-UUID-FOUND"), {})) %> > on-success: set_role_vars > set_role_vars: > publish: > # NOTE(gfidente): collect role settings from all tht roles > mgr_vars: <% env().get('role_merged_configs', {}).values().select($.get('ceph_mgr_ansible_vars', {})).aggregate($1 + $2) %> > mon_vars: <% env().get('role_merged_configs', {}).values().select($.get('ceph_mon_ansible_vars', {})).aggregate($1 + $2) %> > osd_vars: <% env().get('role_merged_configs', {}).values().select($.get('ceph_osd_ansible_vars', {})).aggregate($1 + $2) %> > mds_vars: <% env().get('role_merged_configs', {}).values().select($.get('ceph_mds_ansible_vars', {})).aggregate($1 + $2) %> > rgw_vars: <% env().get('role_merged_configs', {}).values().select($.get('ceph_rgw_ansible_vars', {})).aggregate($1 + $2) %> > nfs_vars: <% env().get('role_merged_configs', {}).values().select($.get('ceph_nfs_ansible_vars', {})).aggregate($1 + $2) %> > rbdmirror_vars: <% env().get('role_merged_configs', {}).values().select($.get('ceph_rbdmirror_ansible_vars', {})).aggregate($1 + $2) %> > client_vars: <% env().get('role_merged_configs', {}).values().select($.get('ceph_client_ansible_vars', 
{})).aggregate($1 + $2) %> > on-success: build_extra_vars > build_extra_vars: > publish: > # NOTE(gfidente): merge vars from all ansible roles > extra_vars: <% {'fetch_directory'=> $.fetch_directory} + $.mgr_vars + $.mon_vars + $.osd_vars + $.mds_vars + $.rgw_vars + $.nfs_vars + $.client_vars + $.rbdmirror_vars + $.ceph_ansible_extra_vars %> > on-success: ceph_install > ceph_install: > with-items: playbook in <% list($.ceph_ansible_playbook).flatten() %> > concurrency: 1 > action: tripleo.ansible-playbook > input: > inventory: > mgrs: > hosts: <% let(root => $) -> $.mgr_ips.toDict($, $root.ips_data.get($, {})) %> > mons: > hosts: <% let(root => $) -> $.mon_ips.toDict($, $root.ips_data.get($, {})) %> > osds: > hosts: <% let(root => $) -> $.osd_ips.toDict($, $root.ips_data.get($, {})) %> > mdss: > hosts: <% let(root => $) -> $.mds_ips.toDict($, $root.ips_data.get($, {})) %> > rgws: > hosts: <% let(root => $) -> $.rgw_ips.toDict($, $root.ips_data.get($, {})) %> > nfss: > hosts: <% let(root => $) -> $.nfs_ips.toDict($, $root.ips_data.get($, {})) %> > rbdmirrors: > hosts: <% let(root => $) -> $.rbdmirror_ips.toDict($, $root.ips_data.get($, {})) %> > clients: > hosts: <% let(root => $) -> $.client_ips.toDict($, $root.ips_data.get($, {})) %> > all: > vars: <% $.extra_vars %> > playbook: <% $.playbook %> > remote_user: tripleo-admin > become: true > become_user: root > verbosity: <% $.ansible_playbook_verbosity %> > ssh_private_key: <% $.private_key %> > skip_tags: <% $.ansible_skip_tags %> > extra_env_variables: <% $.ansible_extra_env_variables.mergeWith($.ansible_env_variables) %> > extra_vars: > ireallymeanit: 'yes' > publish: > output: <% task().result %> > on-complete: save_fetch_directory > save_fetch_directory: > action: tripleo.files.save_temp_dir_to_swift > input: > container: <% $.swift_container %> > path: <% $.fetch_directory %> > on-success: purge_fetch_directory > purge_fetch_directory: > action: tripleo.files.remove_temp_dir path=<% $.fetch_directory %> > 
on-success: remove_ceph_osd_package_from_baremetal > remove_ceph_osd_package_from_baremetal: > action: tripleo.ansible-playbook > input: > inventory: > overcloud: > hosts: <% $.ips_list.toDict($, {}) %> > remote_user: tripleo-admin > become: true > become_user: root > verbosity: 0 > ssh_private_key: <% $.private_key %> > extra_env_variables: > ANSIBLE_HOST_KEY_CHECKING: 'False' > playbook: > - hosts: overcloud > gather_facts: no > tasks: > - name: Remove ceph-ods from baremetal > yum: name=ceph-osd state=absent' >2018-08-21 16:35:58,894 DEBUG: http://192.168.24.1:8989 "POST /v2/workbooks HTTP/1.1" 201 10873 >2018-08-21 16:35:58,899 DEBUG: RESP: [201] Content-Length: 10873 Content-Type: application/json Date: Tue, 21 Aug 2018 13:35:58 GMT Connection: keep-alive >RESP BODY: {"definition": "---\nversion: '2.0'\nname: tripleo.storage.v1\ndescription: TripleO manages Ceph with ceph-ansible\n\nworkflows:\n ceph-install:\n # allows for additional extra_vars via workflow input\n input:\n - ansible_playbook_verbosity: 0\n - ansible_skip_tags: 'package-install,with_pkg'\n - ansible_env_variables: {}\n - ansible_extra_env_variables:\n ANSIBLE_CONFIG: /usr/share/ceph-ansible/ansible.cfg\n ANSIBLE_ACTION_PLUGINS: /usr/share/ceph-ansible/plugins/actions/\n ANSIBLE_ROLES_PATH: /usr/share/ceph-ansible/roles/\n ANSIBLE_RETRY_FILES_ENABLED: 'False'\n ANSIBLE_LOG_PATH: /var/log/mistral/ceph-install-workflow.log\n ANSIBLE_LIBRARY: /usr/share/ceph-ansible/library/\n ANSIBLE_SSH_RETRIES: '3'\n ANSIBLE_HOST_KEY_CHECKING: 'False'\n DEFAULT_FORKS: '25'\n - ceph_ansible_extra_vars: {}\n - ceph_ansible_playbook: /usr/share/ceph-ansible/site-docker.yml.sample\n - node_data_lookup: '{}'\n - swift_container: 'ceph_ansible_fetch_dir'\n tags:\n - tripleo-common-managed\n tasks:\n set_swift_container:\n publish:\n swift_container: <% concat(env().get('heat_stack_name', ''), '_', $.swift_container) %>\n on-complete: collect_puppet_hieradata\n collect_puppet_hieradata:\n on-success: 
check_hieradata\n publish:\n hieradata: <% env().get('role_merged_configs', {}).values().select($.keys()).flatten().select(regex('^ceph::profile::params::osds$').search($)).where($ != null).toSet() %>\n check_hieradata:\n on-success:\n - set_blacklisted_ips: <% not bool($.hieradata) %>\n - fail(msg=<% 'Ceph deployment stopped, puppet-ceph hieradata found. Convert it into ceph-ansible variables. {0}'.format($.hieradata) %>): <% bool($.hieradata) %>\n set_blacklisted_ips:\n publish:\n blacklisted_ips: <% env().get('blacklisted_ip_addresses', []) %>\n on-success: set_ip_lists\n set_ip_lists:\n publish:\n mgr_ips: <% let(root => $) -> env().get('service_ips', {}).get('ceph_mgr_ctlplane_node_ips', []).where(not ($ in $root.blacklisted_ips)) %>\n mon_ips: <% let(root => $) -> env().get('service_ips', {}).get('ceph_mon_ctlplane_node_ips', []).where(not ($ in $root.blacklisted_ips)) %>\n osd_ips: <% let(root => $) -> env().get('service_ips', {}).get('ceph_osd_ctlplane_node_ips', []).where(not ($ in $root.blacklisted_ips)) %>\n mds_ips: <% let(root => $) -> env().get('service_ips', {}).get('ceph_mds_ctlplane_node_ips', []).where(not ($ in $root.blacklisted_ips)) %>\n rgw_ips: <% let(root => $) -> env().get('service_ips', {}).get('ceph_rgw_ctlplane_node_ips', []).where(not ($ in $root.blacklisted_ips)) %>\n nfs_ips: <% let(root => $) -> env().get('service_ips', {}).get('ceph_nfs_ctlplane_node_ips', []).where(not ($ in $root.blacklisted_ips)) %>\n rbdmirror_ips: <% let(root => $) -> env().get('service_ips', {}).get('ceph_rbdmirror_ctlplane_node_ips', []).where(not ($ in $root.blacklisted_ips)) %>\n client_ips: <% let(root => $) -> env().get('service_ips', {}).get('ceph_client_ctlplane_node_ips', []).where(not ($ in $root.blacklisted_ips)) %>\n on-success: merge_ip_lists\n merge_ip_lists:\n publish:\n ips_list: <% ($.mgr_ips + $.mon_ips + $.osd_ips + $.mds_ips + $.rgw_ips + $.nfs_ips + $.rbdmirror_ips + $.client_ips).toSet() %>\n on-success: enable_ssh_admin\n 
enable_ssh_admin:\n workflow: tripleo.access.v1.enable_ssh_admin\n input:\n ssh_servers: <% $.ips_list %>\n on-success: get_private_key\n get_private_key:\n action: tripleo.validations.get_privkey\n publish:\n private_key: <% task().result %>\n on-success: make_fetch_directory\n make_fetch_directory:\n action: tripleo.files.make_temp_dir\n publish:\n fetch_directory: <% task().result.path %>\n on-success: verify_container_exists\n verify_container_exists:\n action: swift.head_container container=<% $.swift_container %>\n on-success: restore_fetch_directory\n on-error: create_container\n restore_fetch_directory:\n action: tripleo.files.restore_temp_dir_from_swift\n input:\n container: <% $.swift_container %>\n path: <% $.fetch_directory %>\n on-success: collect_nodes_uuid\n create_container:\n action: swift.put_container container=<% $.swift_container %>\n on-success: collect_nodes_uuid\n collect_nodes_uuid:\n action: tripleo.ansible-playbook\n input:\n inventory:\n overcloud:\n hosts: <% $.ips_list.toDict($, {}) %>\n remote_user: tripleo-admin\n become: true\n become_user: root\n verbosity: 0\n ssh_private_key: <% $.private_key %>\n #NOTE(gfidente): set ANSIBLE_CALLBACK_WHITELIST to empty string to avoid spurious output\n #in the json output. 
The publish: directive will in fact parse the output.\n extra_env_variables:\n ANSIBLE_CALLBACK_WHITELIST: ''\n ANSIBLE_HOST_KEY_CHECKING: 'False'\n ANSIBLE_STDOUT_CALLBACK: 'json'\n playbook:\n - hosts: overcloud\n gather_facts: no\n tasks:\n - name: collect machine id\n command: dmidecode -s system-uuid\n publish:\n ansible_output: <% json_parse(task().result.stderr) %>\n on-success: set_ip_uuids\n set_ip_uuids:\n publish:\n ip_uuids: <% let(root => $.ansible_output.get('plays')[0].get('tasks')[0].get('hosts')) -> $.ips_list.toDict($, $root.get($).get('stdout')) %>\n on-success: parse_node_data_lookup\n parse_node_data_lookup:\n publish:\n json_node_data_lookup: <% json_parse($.node_data_lookup) %>\n on-success: map_node_data_lookup\n map_node_data_lookup:\n publish:\n ips_data: <% let(uuids => $.ip_uuids, root => $) -> $.ips_list.toDict($, $root.json_node_data_lookup.get($uuids.get($, \"NO-UUID-FOUND\"), {})) %>\n on-success: set_role_vars\n set_role_vars:\n publish:\n # NOTE(gfidente): collect role settings from all tht roles\n mgr_vars: <% env().get('role_merged_configs', {}).values().select($.get('ceph_mgr_ansible_vars', {})).aggregate($1 + $2) %>\n mon_vars: <% env().get('role_merged_configs', {}).values().select($.get('ceph_mon_ansible_vars', {})).aggregate($1 + $2) %>\n osd_vars: <% env().get('role_merged_configs', {}).values().select($.get('ceph_osd_ansible_vars', {})).aggregate($1 + $2) %>\n mds_vars: <% env().get('role_merged_configs', {}).values().select($.get('ceph_mds_ansible_vars', {})).aggregate($1 + $2) %>\n rgw_vars: <% env().get('role_merged_configs', {}).values().select($.get('ceph_rgw_ansible_vars', {})).aggregate($1 + $2) %>\n nfs_vars: <% env().get('role_merged_configs', {}).values().select($.get('ceph_nfs_ansible_vars', {})).aggregate($1 + $2) %>\n rbdmirror_vars: <% env().get('role_merged_configs', {}).values().select($.get('ceph_rbdmirror_ansible_vars', {})).aggregate($1 + $2) %>\n client_vars: <% env().get('role_merged_configs', 
{}).values().select($.get('ceph_client_ansible_vars', {})).aggregate($1 + $2) %>\n on-success: build_extra_vars\n build_extra_vars:\n publish:\n # NOTE(gfidente): merge vars from all ansible roles\n extra_vars: <% {'fetch_directory'=> $.fetch_directory} + $.mgr_vars + $.mon_vars + $.osd_vars + $.mds_vars + $.rgw_vars + $.nfs_vars + $.client_vars + $.rbdmirror_vars + $.ceph_ansible_extra_vars %>\n on-success: ceph_install\n ceph_install:\n with-items: playbook in <% list($.ceph_ansible_playbook).flatten() %>\n concurrency: 1\n action: tripleo.ansible-playbook\n input:\n inventory:\n mgrs:\n hosts: <% let(root => $) -> $.mgr_ips.toDict($, $root.ips_data.get($, {})) %>\n mons:\n hosts: <% let(root => $) -> $.mon_ips.toDict($, $root.ips_data.get($, {})) %>\n osds:\n hosts: <% let(root => $) -> $.osd_ips.toDict($, $root.ips_data.get($, {})) %>\n mdss:\n hosts: <% let(root => $) -> $.mds_ips.toDict($, $root.ips_data.get($, {})) %>\n rgws:\n hosts: <% let(root => $) -> $.rgw_ips.toDict($, $root.ips_data.get($, {})) %>\n nfss:\n hosts: <% let(root => $) -> $.nfs_ips.toDict($, $root.ips_data.get($, {})) %>\n rbdmirrors:\n hosts: <% let(root => $) -> $.rbdmirror_ips.toDict($, $root.ips_data.get($, {})) %>\n clients:\n hosts: <% let(root => $) -> $.client_ips.toDict($, $root.ips_data.get($, {})) %>\n all:\n vars: <% $.extra_vars %>\n playbook: <% $.playbook %>\n remote_user: tripleo-admin\n become: true\n become_user: root\n verbosity: <% $.ansible_playbook_verbosity %>\n ssh_private_key: <% $.private_key %>\n skip_tags: <% $.ansible_skip_tags %>\n extra_env_variables: <% $.ansible_extra_env_variables.mergeWith($.ansible_env_variables) %>\n extra_vars:\n ireallymeanit: 'yes'\n publish:\n output: <% task().result %>\n on-complete: save_fetch_directory\n save_fetch_directory:\n action: tripleo.files.save_temp_dir_to_swift\n input:\n container: <% $.swift_container %>\n path: <% $.fetch_directory %>\n on-success: purge_fetch_directory\n purge_fetch_directory:\n action: 
tripleo.files.remove_temp_dir path=<% $.fetch_directory %>\n on-success: remove_ceph_osd_package_from_baremetal\n remove_ceph_osd_package_from_baremetal:\n action: tripleo.ansible-playbook\n input:\n inventory:\n overcloud:\n hosts: <% $.ips_list.toDict($, {}) %>\n remote_user: tripleo-admin\n become: true\n become_user: root\n verbosity: 0\n ssh_private_key: <% $.private_key %>\n extra_env_variables:\n ANSIBLE_HOST_KEY_CHECKING: 'False'\n playbook:\n - hosts: overcloud\n gather_facts: no\n tasks:\n - name: Remove ceph-ods from baremetal\n yum: name=ceph-osd state=absent", "name": "tripleo.storage.v1", "tags": [], "created_at": "2018-08-21 13:35:58", "scope": "private", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "cd6e7a9e-b760-426c-adc4-629a3e9fd1ff"} > >2018-08-21 16:35:58,900 DEBUG: HTTP POST http://192.168.24.1:8989/v2/workbooks 201 >2018-08-21 16:35:58,903 DEBUG: REQ: curl -g -i -X POST http://192.168.24.1:8989/v2/workbooks -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "content-type: text/plain" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" -d '--- >version: '2.0' >name: tripleo.fernet_keys.v1 >description: TripleO fernet key rotation workflows > >workflows: > > rotate_fernet_keys: > > input: > - container > - queue_name: tripleo > - ansible_extra_env_variables: > ANSIBLE_HOST_KEY_CHECKING: 'False' > > tags: > - tripleo-common-managed > > tasks: > > rotate_keys: > action: tripleo.parameters.rotate_fernet_keys container=<% $.container %> > on-success: deploy_ssh_key > on-error: notify_zaqar > publish-on-error: > status: FAILED > message: <% task().result %> > > deploy_ssh_key: > workflow: tripleo.validations.v1.copy_ssh_key > on-success: get_privkey > on-error: notify_zaqar > publish-on-error: > status: FAILED > message: <% task().result %> > > get_privkey: > action: tripleo.validations.get_privkey > on-success: deploy_keys > on-error: notify_zaqar > publish-on-error: > status: FAILED > message: 
<% task().result %> > > deploy_keys: > action: tripleo.ansible-playbook > input: > hosts: keystone > inventory: /usr/bin/tripleo-ansible-inventory > ssh_private_key: <% task(get_privkey).result %> > extra_env_variables: <% $.ansible_extra_env_variables + dict(TRIPLEO_PLAN_NAME=>$.container) %> > verbosity: 0 > remote_user: heat-admin > become: true > extra_vars: > fernet_keys: <% task(rotate_keys).result %> > use_openstack_credentials: true > playbook: /usr/share/tripleo-common/playbooks/rotate-keys.yaml > on-success: notify_zaqar > publish: > status: SUCCESS > message: <% task().result %> > on-error: notify_zaqar > publish-on-error: > status: FAILED > message: <% task().result %> > > notify_zaqar: > action: zaqar.queue_post > input: > queue_name: <% $.queue_name %> > messages: > body: > type: tripleo.fernet_keys.v1.rotate_fernet_keys > payload: > status: <% $.status %> > message: <% $.get('message', '') %> > execution: <% execution() %> > on-success: > - fail: <% $.get('status') = "FAILED" %> >' >2018-08-21 16:35:59,895 DEBUG: http://192.168.24.1:8989 "POST /v2/workbooks HTTP/1.1" 201 2609 >2018-08-21 16:35:59,898 DEBUG: RESP: [201] Content-Length: 2609 Content-Type: application/json Date: Tue, 21 Aug 2018 13:35:59 GMT Connection: keep-alive >RESP BODY: {"definition": "---\nversion: '2.0'\nname: tripleo.fernet_keys.v1\ndescription: TripleO fernet key rotation workflows\n\nworkflows:\n\n rotate_fernet_keys:\n\n input:\n - container\n - queue_name: tripleo\n - ansible_extra_env_variables:\n ANSIBLE_HOST_KEY_CHECKING: 'False'\n\n tags:\n - tripleo-common-managed\n\n tasks:\n\n rotate_keys:\n action: tripleo.parameters.rotate_fernet_keys container=<% $.container %>\n on-success: deploy_ssh_key\n on-error: notify_zaqar\n publish-on-error:\n status: FAILED\n message: <% task().result %>\n\n deploy_ssh_key:\n workflow: tripleo.validations.v1.copy_ssh_key\n on-success: get_privkey\n on-error: notify_zaqar\n publish-on-error:\n status: FAILED\n message: <% task().result 
%>\n\n get_privkey:\n action: tripleo.validations.get_privkey\n on-success: deploy_keys\n on-error: notify_zaqar\n publish-on-error:\n status: FAILED\n message: <% task().result %>\n\n deploy_keys:\n action: tripleo.ansible-playbook\n input:\n hosts: keystone\n inventory: /usr/bin/tripleo-ansible-inventory\n ssh_private_key: <% task(get_privkey).result %>\n extra_env_variables: <% $.ansible_extra_env_variables + dict(TRIPLEO_PLAN_NAME=>$.container) %>\n verbosity: 0\n remote_user: heat-admin\n become: true\n extra_vars:\n fernet_keys: <% task(rotate_keys).result %>\n use_openstack_credentials: true\n playbook: /usr/share/tripleo-common/playbooks/rotate-keys.yaml\n on-success: notify_zaqar\n publish:\n status: SUCCESS\n message: <% task().result %>\n on-error: notify_zaqar\n publish-on-error:\n status: FAILED\n message: <% task().result %>\n\n notify_zaqar:\n action: zaqar.queue_post\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.fernet_keys.v1.rotate_fernet_keys\n payload:\n status: <% $.status %>\n message: <% $.get('message', '') %>\n execution: <% execution() %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n", "name": "tripleo.fernet_keys.v1", "tags": [], "created_at": "2018-08-21 13:35:59", "scope": "private", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "7afff29f-4092-4bc5-a99d-25b9af2b183b"} > >2018-08-21 16:35:59,898 DEBUG: HTTP POST http://192.168.24.1:8989/v2/workbooks 201 >2018-08-21 16:35:59,901 DEBUG: REQ: curl -g -i -X POST http://192.168.24.1:8989/v2/workbooks -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "content-type: text/plain" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" -d '--- >version: '2.0' >name: tripleo.undercloud_backup.v1 >description: TripleO Undercloud backup workflows > >workflows: > > backup: > description: This workflow will launch the Undercloud backup > tags: > - tripleo-common-managed > input: > - sources_path: 
'/home/stack/' > - queue_name: tripleo > tasks: > # Action to know if there is enough available space > # to run the Undercloud backup > get_free_space: > action: tripleo.undercloud.get_free_space > publish: > status: SUCCESS > message: <% task().result %> > free_space: <% task().result %> > on-success: create_backup_dir > on-error: send_message > publish-on-error: > status: FAILED > message: <% task().result %> > > # We create a temp directory to store the Undercloud > # backup > create_backup_dir: > action: tripleo.undercloud.create_backup_dir > publish: > status: SUCCESS > message: <% task().result %> > backup_path: <% task().result %> > on-success: get_database_credentials > on-error: send_message > publish-on-error: > status: FAILED > message: <% task().result %> > > # The Undercloud database password for the root > # user is stored in a Mistral environment, we > # need the password in order to run the database dump > get_database_credentials: > action: mistral.environments_get name='tripleo.undercloud-config' > publish: > status: SUCCESS > message: <% task().result %> > undercloud_db_password: <% task(get_database_credentials).result.variables.undercloud_db_password %> > on-success: create_database_backup > on-error: send_message > publish-on-error: > status: FAILED > message: <% task().result %> > > # Run the DB dump of all the databases and store the result > # in the temporary folder > create_database_backup: > input: > path: <% $.backup_path.path %> > dbuser: root > dbpassword: <% $.undercloud_db_password %> > action: tripleo.undercloud.create_database_backup > publish: > status: SUCCESS > message: <% task().result %> > on-success: create_fs_backup > on-error: send_message > publish-on-error: > status: FAILED > message: <% task().result %> > > # This action will run the fs backup > create_fs_backup: > input: > sources_path: <% $.sources_path %> > path: <% $.backup_path.path %> > action: tripleo.undercloud.create_file_system_backup > publish: > status: 
SUCCESS > message: <% task().result %> > on-success: upload_backup > on-error: send_message > publish-on-error: > status: FAILED > message: <% task().result %> > > # This action will push the backup to swift > upload_backup: > input: > backup_path: <% $.backup_path.path %> > action: tripleo.undercloud.upload_backup_to_swift > publish: > status: SUCCESS > message: <% task().result %> > on-success: cleanup_backup > on-error: send_message > publish-on-error: > status: FAILED > message: <% task().result %> > > # This action will remove the backup temp folder > cleanup_backup: > input: > path: <% $.backup_path.path %> > action: tripleo.undercloud.remove_temp_dir > publish: > status: SUCCESS > message: <% task().result %> > on-success: send_message > on-error: send_message > publish-on-error: > status: FAILED > message: <% task().result %> > > # Sending a message to show that the backup finished > send_message: > action: zaqar.queue_post > retry: count=5 delay=1 > input: > queue_name: <% $.queue_name %> > messages: > body: > type: tripleo.undercloud_backup.v1.launch > payload: > status: <% $.get('status', 'SUCCESS') %> > execution: <% execution() %> > message: <% $.get('message', '') %> > on-success: > - fail: <% $.get('status') = "FAILED" %> >' >2018-08-21 16:36:01,445 DEBUG: http://192.168.24.1:8989 "POST /v2/workbooks HTTP/1.1" 201 4669 >2018-08-21 16:36:01,449 DEBUG: RESP: [201] Content-Length: 4669 Content-Type: application/json Date: Tue, 21 Aug 2018 13:36:01 GMT Connection: keep-alive >RESP BODY: {"definition": "---\nversion: '2.0'\nname: tripleo.undercloud_backup.v1\ndescription: TripleO Undercloud backup workflows\n\nworkflows:\n\n backup:\n description: This workflow will launch the Undercloud backup\n tags:\n - tripleo-common-managed\n input:\n - sources_path: '/home/stack/'\n - queue_name: tripleo\n tasks:\n # Action to know if there is enough available space\n # to run the Undercloud backup\n get_free_space:\n action: tripleo.undercloud.get_free_space\n 
publish:\n status: SUCCESS\n message: <% task().result %>\n free_space: <% task().result %>\n on-success: create_backup_dir\n on-error: send_message\n publish-on-error:\n status: FAILED\n message: <% task().result %>\n\n # We create a temp directory to store the Undercloud\n # backup\n create_backup_dir:\n action: tripleo.undercloud.create_backup_dir\n publish:\n status: SUCCESS\n message: <% task().result %>\n backup_path: <% task().result %>\n on-success: get_database_credentials\n on-error: send_message\n publish-on-error:\n status: FAILED\n message: <% task().result %>\n\n # The Undercloud database password for the root\n # user is stored in a Mistral environment, we\n # need the password in order to run the database dump\n get_database_credentials:\n action: mistral.environments_get name='tripleo.undercloud-config'\n publish:\n status: SUCCESS\n message: <% task().result %>\n undercloud_db_password: <% task(get_database_credentials).result.variables.undercloud_db_password %>\n on-success: create_database_backup\n on-error: send_message\n publish-on-error:\n status: FAILED\n message: <% task().result %>\n\n # Run the DB dump of all the databases and store the result\n # in the temporary folder\n create_database_backup:\n input:\n path: <% $.backup_path.path %>\n dbuser: root\n dbpassword: <% $.undercloud_db_password %>\n action: tripleo.undercloud.create_database_backup\n publish:\n status: SUCCESS\n message: <% task().result %>\n on-success: create_fs_backup\n on-error: send_message\n publish-on-error:\n status: FAILED\n message: <% task().result %>\n\n # This action will run the fs backup\n create_fs_backup:\n input:\n sources_path: <% $.sources_path %>\n path: <% $.backup_path.path %>\n action: tripleo.undercloud.create_file_system_backup\n publish:\n status: SUCCESS\n message: <% task().result %>\n on-success: upload_backup\n on-error: send_message\n publish-on-error:\n status: FAILED\n message: <% task().result %>\n\n # This action will push the backup to 
swift\n upload_backup:\n input:\n backup_path: <% $.backup_path.path %>\n action: tripleo.undercloud.upload_backup_to_swift\n publish:\n status: SUCCESS\n message: <% task().result %>\n on-success: cleanup_backup\n on-error: send_message\n publish-on-error:\n status: FAILED\n message: <% task().result %>\n\n # This action will remove the backup temp folder\n cleanup_backup:\n input:\n path: <% $.backup_path.path %>\n action: tripleo.undercloud.remove_temp_dir\n publish:\n status: SUCCESS\n message: <% task().result %>\n on-success: send_message\n on-error: send_message\n publish-on-error:\n status: FAILED\n message: <% task().result %>\n\n # Sending a message to show that the backup finished\n send_message:\n action: zaqar.queue_post\n retry: count=5 delay=1\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.undercloud_backup.v1.launch\n payload:\n status: <% $.get('status', 'SUCCESS') %>\n execution: <% execution() %>\n message: <% $.get('message', '') %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n", "name": "tripleo.undercloud_backup.v1", "tags": [], "created_at": "2018-08-21 13:36:01", "scope": "private", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "68caff64-16c9-42d7-b3cd-b6647dd57913"} > >2018-08-21 16:36:01,450 DEBUG: HTTP POST http://192.168.24.1:8989/v2/workbooks 201 >2018-08-21 16:36:01,452 DEBUG: REQ: curl -g -i -X POST http://192.168.24.1:8989/v2/workbooks -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "content-type: text/plain" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" -d '--- >version: '2.0' >name: tripleo.access.v1 >description: TripleO administration access workflows > >workflows: > > enable_ssh_admin: > description: >- > This workflow creates an admin user on the overcloud nodes, > which can then be used for connecting for automated > administrative or deployment tasks, e.g. via Ansible. 
The > workflow can be used both for Nova-managed and split-stack > deployments, assuming the correct input values are passed > in. The workflow defaults to Nova-managed approach, for which no > additional parameters need to be supplied. In case of > split-stack, temporary ssh connection details (user, key, list > of servers) need to be provided -- these are only used > temporarily to create the actual ssh admin user for use by > Mistral. > tags: > - tripleo-common-managed > input: > - ssh_private_key: null > - ssh_user: null > - ssh_servers: [] > - overcloud_admin: tripleo-admin > - queue_name: tripleo > tasks: > get_pubkey: > action: tripleo.validations.get_pubkey > on-success: generate_playbook > publish: > pubkey: <% task().result %> > > generate_playbook: > on-success: > - create_admin_via_nova: <% $.ssh_private_key = null %> > - create_admin_via_ssh: <% $.ssh_private_key != null %> > publish: > create_admin_tasks: > - name: create user <% $.overcloud_admin %> > user: > name: '<% $.overcloud_admin %>' > - name: grant admin rights to user <% $.overcloud_admin %> > copy: > dest: /etc/sudoers.d/<% $.overcloud_admin %> > content: | > <% $.overcloud_admin %> ALL=(ALL) NOPASSWD:ALL > mode: 0440 > - name: ensure .ssh dir exists for user <% $.overcloud_admin %> > file: > path: /home/<% $.overcloud_admin %>/.ssh > state: directory > owner: <% $.overcloud_admin %> > group: <% $.overcloud_admin %> > mode: 0700 > - name: ensure authorized_keys file exists for user <% $.overcloud_admin %> > file: > path: /home/<% $.overcloud_admin %>/.ssh/authorized_keys > state: touch > owner: <% $.overcloud_admin %> > group: <% $.overcloud_admin %> > mode: 0700 > - name: authorize TripleO Mistral key for user <% $.overcloud_admin %> > lineinfile: > path: /home/<% $.overcloud_admin %>/.ssh/authorized_keys > line: <% $.pubkey %> > regexp: "Generated by TripleO" > > # Nova variant > create_admin_via_nova: > workflow: tripleo.access.v1.create_admin_via_nova > input: > queue_name: <% 
$.queue_name %> > ssh_servers: <% $.ssh_servers %> > tasks: <% $.create_admin_tasks %> > overcloud_admin: <% $.overcloud_admin %> > > # SSH variant > create_admin_via_ssh: > workflow: tripleo.access.v1.create_admin_via_ssh > input: > ssh_private_key: <% $.ssh_private_key %> > ssh_user: <% $.ssh_user %> > ssh_servers: <% $.ssh_servers %> > tasks: <% $.create_admin_tasks %> > > create_admin_via_nova: > input: > - tasks > - queue_name: tripleo > - ssh_servers: [] > - overcloud_admin: tripleo-admin > - ansible_extra_env_variables: > ANSIBLE_HOST_KEY_CHECKING: 'False' > tags: > - tripleo-common-managed > tasks: > get_servers: > action: nova.servers_list > on-success: create_admin > publish: > servers: <% let(root => $) -> task().result._info.where($.addresses.ctlplane.addr.any($ in $root.ssh_servers)) %> > > create_admin: > workflow: tripleo.deployment.v1.deploy_on_server > on-success: get_privkey > with-items: server in <% $.servers %> > input: > server_name: <% $.server.name %> > server_uuid: <% $.server.id %> > queue_name: <% $.queue_name %> > config_name: create_admin > group: ansible > config: | > - hosts: localhost > connection: local > tasks: <% json_pp($.tasks) %> > > get_privkey: > action: tripleo.validations.get_privkey > on-success: wait_for_occ > publish: > privkey: <% task().result %> > > wait_for_occ: > action: tripleo.ansible-playbook > input: > inventory: > overcloud: > hosts: <% $.ssh_servers.toDict($, {}) %> > remote_user: <% $.overcloud_admin %> > ssh_private_key: <% $.privkey %> > extra_env_variables: <% $.ansible_extra_env_variables %> > playbook: > - hosts: overcloud > gather_facts: no > tasks: > - name: wait for connection > wait_for_connection: > sleep: 5 > timeout: 300 > > create_admin_via_ssh: > input: > - tasks > - ssh_private_key > - ssh_user > - ssh_servers > - ansible_extra_env_variables: > ANSIBLE_HOST_KEY_CHECKING: 'False' > > tags: > - tripleo-common-managed > tasks: > write_tmp_playbook: > action: tripleo.ansible-playbook > input: > 
inventory: > overcloud: > hosts: <% $.ssh_servers.toDict($, {}) %> > remote_user: <% $.ssh_user %> > ssh_private_key: <% $.ssh_private_key %> > extra_env_variables: <% $.ansible_extra_env_variables %> > become: true > become_user: root > playbook: > - hosts: overcloud > tasks: <% $.tasks %> >' >2018-08-21 16:36:02,922 DEBUG: http://192.168.24.1:8989 "POST /v2/workbooks HTTP/1.1" 201 6130 >2018-08-21 16:36:02,926 DEBUG: RESP: [201] Content-Length: 6130 Content-Type: application/json Date: Tue, 21 Aug 2018 13:36:02 GMT Connection: keep-alive >RESP BODY: {"definition": "---\nversion: '2.0'\nname: tripleo.access.v1\ndescription: TripleO administration access workflows\n\nworkflows:\n\n enable_ssh_admin:\n description: >-\n This workflow creates an admin user on the overcloud nodes,\n which can then be used for connecting for automated\n administrative or deployment tasks, e.g. via Ansible. The\n workflow can be used both for Nova-managed and split-stack\n deployments, assuming the correct input values are passed\n in. The workflow defaults to Nova-managed approach, for which no\n additional parameters need to be supplied. 
In case of\n split-stack, temporary ssh connection details (user, key, list\n of servers) need to be provided -- these are only used\n temporarily to create the actual ssh admin user for use by\n Mistral.\n tags:\n - tripleo-common-managed\n input:\n - ssh_private_key: null\n - ssh_user: null\n - ssh_servers: []\n - overcloud_admin: tripleo-admin\n - queue_name: tripleo\n tasks:\n get_pubkey:\n action: tripleo.validations.get_pubkey\n on-success: generate_playbook\n publish:\n pubkey: <% task().result %>\n\n generate_playbook:\n on-success:\n - create_admin_via_nova: <% $.ssh_private_key = null %>\n - create_admin_via_ssh: <% $.ssh_private_key != null %>\n publish:\n create_admin_tasks:\n - name: create user <% $.overcloud_admin %>\n user:\n name: '<% $.overcloud_admin %>'\n - name: grant admin rights to user <% $.overcloud_admin %>\n copy:\n dest: /etc/sudoers.d/<% $.overcloud_admin %>\n content: |\n <% $.overcloud_admin %> ALL=(ALL) NOPASSWD:ALL\n mode: 0440\n - name: ensure .ssh dir exists for user <% $.overcloud_admin %>\n file:\n path: /home/<% $.overcloud_admin %>/.ssh\n state: directory\n owner: <% $.overcloud_admin %>\n group: <% $.overcloud_admin %>\n mode: 0700\n - name: ensure authorized_keys file exists for user <% $.overcloud_admin %>\n file:\n path: /home/<% $.overcloud_admin %>/.ssh/authorized_keys\n state: touch\n owner: <% $.overcloud_admin %>\n group: <% $.overcloud_admin %>\n mode: 0700\n - name: authorize TripleO Mistral key for user <% $.overcloud_admin %>\n lineinfile:\n path: /home/<% $.overcloud_admin %>/.ssh/authorized_keys\n line: <% $.pubkey %>\n regexp: \"Generated by TripleO\"\n\n # Nova variant\n create_admin_via_nova:\n workflow: tripleo.access.v1.create_admin_via_nova\n input:\n queue_name: <% $.queue_name %>\n ssh_servers: <% $.ssh_servers %>\n tasks: <% $.create_admin_tasks %>\n overcloud_admin: <% $.overcloud_admin %>\n\n # SSH variant\n create_admin_via_ssh:\n workflow: tripleo.access.v1.create_admin_via_ssh\n input:\n 
ssh_private_key: <% $.ssh_private_key %>\n ssh_user: <% $.ssh_user %>\n ssh_servers: <% $.ssh_servers %>\n tasks: <% $.create_admin_tasks %>\n\n create_admin_via_nova:\n input:\n - tasks\n - queue_name: tripleo\n - ssh_servers: []\n - overcloud_admin: tripleo-admin\n - ansible_extra_env_variables:\n ANSIBLE_HOST_KEY_CHECKING: 'False'\n tags:\n - tripleo-common-managed\n tasks:\n get_servers:\n action: nova.servers_list\n on-success: create_admin\n publish:\n servers: <% let(root => $) -> task().result._info.where($.addresses.ctlplane.addr.any($ in $root.ssh_servers)) %>\n\n create_admin:\n workflow: tripleo.deployment.v1.deploy_on_server\n on-success: get_privkey\n with-items: server in <% $.servers %>\n input:\n server_name: <% $.server.name %>\n server_uuid: <% $.server.id %>\n queue_name: <% $.queue_name %>\n config_name: create_admin\n group: ansible\n config: |\n - hosts: localhost\n connection: local\n tasks: <% json_pp($.tasks) %>\n\n get_privkey:\n action: tripleo.validations.get_privkey\n on-success: wait_for_occ\n publish:\n privkey: <% task().result %>\n\n wait_for_occ:\n action: tripleo.ansible-playbook\n input:\n inventory:\n overcloud:\n hosts: <% $.ssh_servers.toDict($, {}) %>\n remote_user: <% $.overcloud_admin %>\n ssh_private_key: <% $.privkey %>\n extra_env_variables: <% $.ansible_extra_env_variables %>\n playbook:\n - hosts: overcloud\n gather_facts: no\n tasks:\n - name: wait for connection\n wait_for_connection:\n sleep: 5\n timeout: 300\n\n create_admin_via_ssh:\n input:\n - tasks\n - ssh_private_key\n - ssh_user\n - ssh_servers\n - ansible_extra_env_variables:\n ANSIBLE_HOST_KEY_CHECKING: 'False'\n\n tags:\n - tripleo-common-managed\n tasks:\n write_tmp_playbook:\n action: tripleo.ansible-playbook\n input:\n inventory:\n overcloud:\n hosts: <% $.ssh_servers.toDict($, {}) %>\n remote_user: <% $.ssh_user %>\n ssh_private_key: <% $.ssh_private_key %>\n extra_env_variables: <% $.ansible_extra_env_variables %>\n become: true\n become_user: root\n 
playbook:\n - hosts: overcloud\n tasks: <% $.tasks %>\n", "name": "tripleo.access.v1", "tags": [], "created_at": "2018-08-21 13:36:02", "scope": "private", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "fcba381e-129f-42a7-977c-0cbd53f01d3b"} > >2018-08-21 16:36:02,926 DEBUG: HTTP POST http://192.168.24.1:8989/v2/workbooks 201 >2018-08-21 16:36:02,929 DEBUG: REQ: curl -g -i -X POST http://192.168.24.1:8989/v2/workbooks -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "content-type: text/plain" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" -d '--- >version: '2.0' >name: tripleo.skydive_ansible.v1 >description: TripleO manages Skydive with skydive-ansible > >workflows: > skydive_install: > # allows for additional extra_vars via workflow input > input: > - ansible_playbook_verbosity: 0 > - ansible_extra_env_variables: > ANSIBLE_ROLES_PATH: /usr/share/skydive-ansible/roles/ > ANSIBLE_RETRY_FILES_ENABLED: 'False' > ANSIBLE_LOG_PATH: /var/log/mistral/skydive-install-workflow.log > ANSIBLE_HOST_KEY_CHECKING: 'False' > - skydive_ansible_extra_vars: {} > - skydive_ansible_playbook: /usr/share/skydive-ansible/playbook.yml.sample > tags: > - tripleo-common-managed > tasks: > set_blacklisted_ips: > publish: > blacklisted_ips: <% env().get('blacklisted_ip_addresses', []) %> > on-success: set_ip_lists > set_ip_lists: > publish: > agent_ips: <% let(root => $) -> env().get('service_ips', {}).get('skydive_agent_ctlplane_node_ips', []).where(not ($ in $root.blacklisted_ips)) %> > analyzer_ips: <% let(root => $) -> env().get('service_ips', {}).get('skydive_analyzer_ctlplane_node_ips', []).where(not ($ in $root.blacklisted_ips)) %> > on-success: enable_ssh_admin > enable_ssh_admin: > workflow: tripleo.access.v1.enable_ssh_admin > input: > ssh_servers: <% ($.agent_ips + $.analyzer_ips).toSet() %> > on-success: get_private_key > get_private_key: > action: tripleo.validations.get_privkey > publish: > private_key: <% 
task().result %> > on-success: set_fork_count > set_fork_count: > publish: # unique list of all IPs: make each list a set, take unions and count > fork_count: <% min($.agent_ips.toSet().union($.analyzer_ips.toSet()).count(), 100) %> # don't use >100 forks > on-success: set_role_vars > set_role_vars: > publish: > # NOTE(sbaubeau): collect role settings from all tht roles > agent_vars: <% env().get('role_merged_configs', {}).values().select($.get('skydive_agent_ansible_vars', {})).aggregate($1 + $2) %> > analyzer_vars: <% env().get('role_merged_configs', {}).values().select($.get('skydive_analyzer_ansible_vars', {})).aggregate($1 + $2) %> > on-success: build_extra_vars > build_extra_vars: > publish: > # NOTE(sbaubeau): merge vars from all ansible roles > extra_vars: <% $.agent_vars + $.analyzer_vars + $.skydive_ansible_extra_vars %> > on-success: skydive_install > skydive_install: > action: tripleo.ansible-playbook > input: > inventory: > agents: > hosts: <% $.agent_ips.toDict($, {}) %> > analyzers: > hosts: <% $.analyzer_ips.toDict($, {}) %> > playbook: <% $.skydive_ansible_playbook %> > remote_user: tripleo-admin > become: true > become_user: root > verbosity: <% $.ansible_playbook_verbosity %> > forks: <% $.fork_count %> > ssh_private_key: <% $.private_key %> > extra_env_variables: <% $.ansible_extra_env_variables %> > extra_vars: <% $.extra_vars %> > publish: > output: <% task().result %> >' >2018-08-21 16:36:04,115 DEBUG: http://192.168.24.1:8989 "POST /v2/workbooks HTTP/1.1" 201 3507 >2018-08-21 16:36:04,118 DEBUG: RESP: [201] Content-Length: 3507 Content-Type: application/json Date: Tue, 21 Aug 2018 13:36:04 GMT Connection: keep-alive >RESP BODY: {"definition": "---\nversion: '2.0'\nname: tripleo.skydive_ansible.v1\ndescription: TripleO manages Skydive with skydive-ansible\n\nworkflows:\n skydive_install:\n # allows for additional extra_vars via workflow input\n input:\n - ansible_playbook_verbosity: 0\n - ansible_extra_env_variables:\n ANSIBLE_ROLES_PATH: 
/usr/share/skydive-ansible/roles/\n ANSIBLE_RETRY_FILES_ENABLED: 'False'\n ANSIBLE_LOG_PATH: /var/log/mistral/skydive-install-workflow.log\n ANSIBLE_HOST_KEY_CHECKING: 'False'\n - skydive_ansible_extra_vars: {}\n - skydive_ansible_playbook: /usr/share/skydive-ansible/playbook.yml.sample\n tags:\n - tripleo-common-managed\n tasks:\n set_blacklisted_ips:\n publish:\n blacklisted_ips: <% env().get('blacklisted_ip_addresses', []) %>\n on-success: set_ip_lists\n set_ip_lists:\n publish:\n agent_ips: <% let(root => $) -> env().get('service_ips', {}).get('skydive_agent_ctlplane_node_ips', []).where(not ($ in $root.blacklisted_ips)) %>\n analyzer_ips: <% let(root => $) -> env().get('service_ips', {}).get('skydive_analyzer_ctlplane_node_ips', []).where(not ($ in $root.blacklisted_ips)) %>\n on-success: enable_ssh_admin\n enable_ssh_admin:\n workflow: tripleo.access.v1.enable_ssh_admin\n input:\n ssh_servers: <% ($.agent_ips + $.analyzer_ips).toSet() %>\n on-success: get_private_key\n get_private_key:\n action: tripleo.validations.get_privkey\n publish:\n private_key: <% task().result %>\n on-success: set_fork_count\n set_fork_count:\n publish: # unique list of all IPs: make each list a set, take unions and count\n fork_count: <% min($.agent_ips.toSet().union($.analyzer_ips.toSet()).count(), 100) %> # don't use >100 forks\n on-success: set_role_vars\n set_role_vars:\n publish:\n # NOTE(sbaubeau): collect role settings from all tht roles\n agent_vars: <% env().get('role_merged_configs', {}).values().select($.get('skydive_agent_ansible_vars', {})).aggregate($1 + $2) %>\n analyzer_vars: <% env().get('role_merged_configs', {}).values().select($.get('skydive_analyzer_ansible_vars', {})).aggregate($1 + $2) %>\n on-success: build_extra_vars\n build_extra_vars:\n publish:\n # NOTE(sbaubeau): merge vars from all ansible roles\n extra_vars: <% $.agent_vars + $.analyzer_vars + $.skydive_ansible_extra_vars %>\n on-success: skydive_install\n skydive_install:\n action: 
tripleo.ansible-playbook\n input:\n inventory:\n agents:\n hosts: <% $.agent_ips.toDict($, {}) %>\n analyzers:\n hosts: <% $.analyzer_ips.toDict($, {}) %>\n playbook: <% $.skydive_ansible_playbook %>\n remote_user: tripleo-admin\n become: true\n become_user: root\n verbosity: <% $.ansible_playbook_verbosity %>\n forks: <% $.fork_count %>\n ssh_private_key: <% $.private_key %>\n extra_env_variables: <% $.ansible_extra_env_variables %>\n extra_vars: <% $.extra_vars %>\n publish:\n output: <% task().result %>\n", "name": "tripleo.skydive_ansible.v1", "tags": [], "created_at": "2018-08-21 13:36:04", "scope": "private", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "08ba3162-3ec8-4691-962b-0c4c42ec0d1b"} > >2018-08-21 16:36:04,119 DEBUG: HTTP POST http://192.168.24.1:8989/v2/workbooks 201 >2018-08-21 16:36:04,122 DEBUG: REQ: curl -g -i -X POST http://192.168.24.1:8989/v2/workbooks -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "content-type: text/plain" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" -d '--- >version: '2.0' >name: tripleo.validations.v1 >description: TripleO Validations Workflows v1 > >workflows: > > run_validation: > input: > - validation_name > - plan: overcloud > - queue_name: tripleo > > tags: > - tripleo-common-managed > > tasks: > > notify_running: > on-complete: run_validation > action: zaqar.queue_post > retry: count=5 delay=1 > input: > queue_name: <% $.queue_name %> > messages: > body: > type: tripleo.validations.v1.run_validation > payload: > validation_name: <% $.validation_name %> > plan: <% $.plan %> > status: RUNNING > execution: <% execution() %> > > run_validation: > on-success: send_message > on-error: set_status_failed > action: tripleo.validations.run_validation validation=<% $.validation_name %> plan=<% $.plan %> > publish: > status: SUCCESS > stdout: <% task().result.stdout %> > stderr: <% task().result.stderr %> > > set_status_failed: > on-complete: send_message > 
publish: > status: FAILED > stdout: <% task(run_validation).result.stdout %> > stderr: <% task(run_validation).result.stderr %> > > send_message: > action: zaqar.queue_post > retry: count=5 delay=1 > input: > queue_name: <% $.queue_name %> > messages: > body: > type: tripleo.validations.v1.run_validation > payload: > validation_name: <% $.validation_name %> > plan: <% $.plan %> > status: <% $.get('status', 'SUCCESS') %> > stdout: <% $.stdout %> > stderr: <% $.stderr %> > execution: <% execution() %> > on-success: > - fail: <% $.get('status') = "FAILED" %> > > run_validations: > input: > - validation_names: [] > - plan: overcloud > - queue_name: tripleo > > tags: > - tripleo-common-managed > > tasks: > > notify_running: > on-complete: run_validations > action: zaqar.queue_post > retry: count=5 delay=1 > input: > queue_name: <% $.queue_name %> > messages: > body: > type: tripleo.validations.v1.run_validations > payload: > validation_names: <% $.validation_names %> > plan: <% $.plan %> > status: RUNNING > execution: <% execution() %> > > run_validations: > on-success: send_message > on-error: set_status_failed > workflow: tripleo.validations.v1.run_validation validation_name=<% $.validation %> plan=<% $.plan %> queue_name=<% $.queue_name %> > with-items: validation in <% $.validation_names %> > publish: > status: SUCCESS > > set_status_failed: > on-complete: send_message > publish: > status: FAILED > > send_message: > action: zaqar.queue_post > retry: count=5 delay=1 > input: > queue_name: <% $.queue_name %> > messages: > body: > type: tripleo.validations.v1.run_validations > payload: > validation_names: <% $.validation_names %> > plan: <% $.plan %> > status: <% $.get('status', 'SUCCESS') %> > execution: <% execution() %> > on-success: > - fail: <% $.get('status') = "FAILED" %> > > run_groups: > input: > - group_names: [] > - plan: overcloud > - queue_name: tripleo > > tags: > - tripleo-common-managed > > tasks: > > find_validations: > on-success: notify_running > 
action: tripleo.validations.list_validations groups=<% $.group_names %> > publish: > validations: <% task().result %> > > notify_running: > on-complete: run_validation_group > action: zaqar.queue_post > retry: count=5 delay=1 > input: > queue_name: <% $.queue_name %> > messages: > body: > type: tripleo.validations.v1.run_validations > payload: > group_names: <% $.group_names %> > validation_names: <% $.validations.id %> > plan: <% $.plan %> > status: RUNNING > execution: <% execution() %> > > run_validation_group: > on-success: send_message > on-error: set_status_failed > workflow: tripleo.validations.v1.run_validation validation_name=<% $.validation %> plan=<% $.plan %> queue_name=<% $.queue_name %> > with-items: validation in <% $.validations.id %> > publish: > status: SUCCESS > > set_status_failed: > on-complete: send_message > publish: > status: FAILED > > send_message: > action: zaqar.queue_post > retry: count=5 delay=1 > input: > queue_name: <% $.queue_name %> > messages: > body: > type: tripleo.validations.v1.run_groups > payload: > group_names: <% $.group_names %> > validation_names: <% $.validations.id %> > plan: <% $.plan %> > status: <% $.get('status', 'SUCCESS') %> > execution: <% execution() %> > on-success: > - fail: <% $.get('status') = "FAILED" %> > > list: > input: > - group_names: [] > tags: > - tripleo-common-managed > tasks: > find_validations: > action: tripleo.validations.list_validations groups=<% $.group_names %> > > list_groups: > tags: > - tripleo-common-managed > tasks: > find_groups: > action: tripleo.validations.list_groups > > add_validation_ssh_key_parameter: > input: > - container > - queue_name: tripleo > > tags: > - tripleo-common-managed > > tasks: > test_validations_enabled: > action: tripleo.validations.enabled > on-success: get_pubkey > on-error: unset_validation_key_parameter > > get_pubkey: > action: tripleo.validations.get_pubkey > on-success: set_validation_key_parameter > publish: > pubkey: <% task().result %> > > 
set_validation_key_parameter: > action: tripleo.parameters.update > input: > parameters: > node_admin_extra_ssh_keys: <% $.pubkey %> > container: <% $.container %> > > # NOTE(shadower): We need to clear keys from a previous deployment > unset_validation_key_parameter: > action: tripleo.parameters.update > input: > parameters: > node_admin_extra_ssh_keys: "" > container: <% $.container %> > > copy_ssh_key: > input: > # FIXME: we should stop using heat-admin as e.g. split-stack > # environments (where Nova didn't create overcloud nodes) don't > # have it present > - overcloud_admin: heat-admin > - queue_name: tripleo > tags: > - tripleo-common-managed > tasks: > get_servers: > action: nova.servers_list > on-success: get_pubkey > publish: > servers: <% task().result._info %> > > get_pubkey: > action: tripleo.validations.get_pubkey > on-success: deploy_ssh_key > publish: > pubkey: <% task().result %> > > deploy_ssh_key: > workflow: tripleo.deployment.v1.deploy_on_server > with-items: server in <% $.servers %> > input: > server_name: <% $.server.name %> > server_uuid: <% $.server.id %> > config: | > #!/bin/bash > if ! 
grep "<% $.pubkey %>" /home/<% $.overcloud_admin %>/.ssh/authorized_keys; then > echo "<% $.pubkey %>" >> /home/<% $.overcloud_admin %>/.ssh/authorized_keys > fi > config_name: copy_ssh_key > group: script > queue_name: <% $.queue_name %> > > check_boot_images: > input: > - deploy_kernel_name: 'bm-deploy-kernel' > - deploy_ramdisk_name: 'bm-deploy-ramdisk' > - run_validations: true > - queue_name: tripleo > output: > errors: <% $.errors %> > warnings: <% $.warnings %> > kernel_id: <% $.kernel_id %> > ramdisk_id: <% $.ramdisk_id %> > tags: > - tripleo-common-managed > tasks: > check_run_validations: > on-complete: > - get_images: <% $.run_validations %> > - send_message: <% not $.run_validations %> > > get_images: > action: glance.images_list > on-success: check_images > publish: > images: <% task().result %> > > check_images: > action: tripleo.validations.check_boot_images > input: > images: <% $.images %> > deploy_kernel_name: <% $.deploy_kernel_name %> > deploy_ramdisk_name: <% $.deploy_ramdisk_name %> > on-success: send_message > publish: > kernel_id: <% task().result.kernel_id %> > ramdisk_id: <% task().result.ramdisk_id %> > warnings: <% task().result.warnings %> > errors: <% task().result.errors %> > on-error: send_message > publish-on-error: > kernel_id: <% task().result.kernel_id %> > ramdisk_id: <% task().result.ramdisk_id %> > warnings: <% task().result.warnings %> > errors: <% task().result.errors %> > status: FAILED > message: <% task().result %> > > send_message: > action: zaqar.queue_post > retry: count=5 delay=1 > input: > queue_name: <% $.queue_name %> > messages: > body: > type: tripleo.validations.v1.check_boot_images > payload: > status: <% $.get('status', 'SUCCESS') %> > message: <% $.get('message', '') %> > execution: <% execution() %> > kernel_id: <% $.kernel_id %> > ramdisk_id: <% $.ramdisk_id %> > errors: <% $.errors %> > warnings: <% $.warnings %> > on-success: > - fail: <% $.get('status') = "FAILED" %> > > collect_flavors: > input: > - 
roles_info: {} > - run_validations: true > - queue_name: tripleo > output: > errors: <% $.errors %> > warnings: <% $.warnings %> > flavors: <% $.flavors %> > > tags: > - tripleo-common-managed > > tasks: > check_run_validations: > on-complete: > - check_flavors: <% $.run_validations %> > - send_message: <% not $.run_validations %> > > check_flavors: > action: tripleo.validations.check_flavors > input: > roles_info: <% $.roles_info %> > on-success: send_message > publish: > flavors: <% task().result.flavors %> > errors: <% task().result.errors %> > warnings: <% task().result.warnings %> > on-error: send_message > publish-on-error: > flavors: {} > errors: <% task().result.errors %> > warnings: <% task().result.warnings %> > status: FAILED > message: <% task().result %> > > send_message: > action: zaqar.queue_post > retry: count=5 delay=1 > input: > queue_name: <% $.queue_name %> > messages: > body: > type: tripleo.validations.v1.collect_flavors > payload: > status: <% $.get('status', 'SUCCESS') %> > message: <% $.get('message', '') %> > execution: <% execution() %> > flavors: <% $.flavors %> > errors: <% $.errors %> > warnings: <% $.warnings %> > on-success: > - fail: <% $.get('status') = "FAILED" %> > > check_ironic_boot_configuration: > input: > - kernel_id: null > - ramdisk_id: null > - run_validations: true > - queue_name: tripleo > output: > errors: <% $.errors %> > warnings: <% $.warnings %> > > tags: > - tripleo-common-managed > > tasks: > check_run_validations: > on-complete: > - get_ironic_nodes: <% $.run_validations %> > - send_message: <% not $.run_validations %> > > get_ironic_nodes: > action: ironic.node_list > input: > provision_state: available > maintenance: false > detail: true > on-success: check_node_boot_configuration > publish: > nodes: <% task().result %> > on-error: send_message > publish-on-error: > status: FAILED > message: <% task().result %> > > check_node_boot_configuration: > action: tripleo.validations.check_node_boot_configuration > 
input: > node: <% $.node %> > kernel_id: <% $.kernel_id %> > ramdisk_id: <% $.ramdisk_id %> > with-items: node in <% $.nodes %> > on-success: send_message > publish: > errors: <% task().result.errors.flatten() %> > warnings: <% task().result.warnings.flatten() %> > on-error: send_message > publish-on-error: > errors: <% task().result.errors.flatten() %> > warnings: <% task().result.warnings.flatten() %> > status: FAILED > message: <% task().result %> > > send_message: > action: zaqar.queue_post > retry: count=5 delay=1 > input: > queue_name: <% $.queue_name %> > messages: > body: > type: tripleo.validations.v1.check_ironic_boot_configuration > payload: > status: <% $.get('status', 'SUCCESS') %> > message: <% $.get('message', '') %> > execution: <% execution() %> > errors: <% $.errors %> > warnings: <% $.warnings %> > on-success: > - fail: <% $.get('status') = "FAILED" %> > > verify_profiles: > input: > - flavors: [] > - run_validations: true > - queue_name: tripleo > output: > errors: <% $.errors %> > warnings: <% $.warnings %> > > tags: > - tripleo-common-managed > > tasks: > check_run_validations: > on-complete: > - get_ironic_nodes: <% $.run_validations %> > - send_message: <% not $.run_validations %> > > get_ironic_nodes: > action: ironic.node_list > input: > maintenance: false > detail: true > on-success: verify_profiles > publish: > nodes: <% task().result %> > on-error: send_message > publish-on-error: > status: FAILED > message: <% task().result %> > > verify_profiles: > action: tripleo.validations.verify_profiles > input: > nodes: <% $.nodes %> > flavors: <% $.flavors %> > on-success: send_message > publish: > errors: <% task().result.errors %> > warnings: <% task().result.warnings %> > on-error: send_message > publish-on-error: > errors: <% task().result.errors %> > warnings: <% task().result.warnings %> > status: FAILED > message: <% task().result %> > > send_message: > action: zaqar.queue_post > retry: count=5 delay=1 > input: > queue_name: <% 
$.queue_name %> > messages: > body: > type: tripleo.validations.v1.verify_profiles > payload: > status: <% $.get('status', 'SUCCESS') %> > message: <% $.get('message', '') %> > execution: <% execution() %> > errors: <% $.errors %> > warnings: <% $.warnings %> > on-success: > - fail: <% $.get('status') = "FAILED" %> > > check_default_nodes_count: > input: > - stack_id: overcloud > - parameters: {} > - default_role_counts: {} > - run_validations: true > - queue_name: tripleo > output: > statistics: <% $.statistics %> > errors: <% $.errors %> > warnings: <% $.warnings %> > > tags: > - tripleo-common-managed > > tasks: > check_run_validations: > on-complete: > - get_hypervisor_statistics: <% $.run_validations %> > - send_message: <% not $.run_validations %> > > get_hypervisor_statistics: > action: nova.hypervisors_statistics > on-success: get_stack > publish: > statistics: <% task().result %> > on-error: send_message > publish-on-error: > status: FAILED > message: <% task().result %> > errors: [] > warnings: [] > statistics: null > > get_stack: > action: heat.stacks_get > input: > stack_id: <% $.stack_id %> > on-success: get_associated_nodes > publish: > stack: <% task().result %> > on-error: get_associated_nodes > publish-on-error: > stack: null > > get_associated_nodes: > action: ironic.node_list > input: > associated: true > on-success: get_available_nodes > publish: > associated_nodes: <% task().result %> > on-error: send_message > publish-on-error: > status: FAILED > message: <% task().result %> > errors: [] > warnings: [] > > get_available_nodes: > action: ironic.node_list > input: > provision_state: available > associated: false > maintenance: false > on-success: check_nodes_count > publish: > available_nodes: <% task().result %> > on-error: send_message > publish-on-error: > status: FAILED > message: <% task().result %> > errors: [] > warnings: [] > > check_nodes_count: > action: tripleo.validations.check_nodes_count > input: > statistics: <% $.statistics %> > 
stack: <% $.stack %> > associated_nodes: <% $.associated_nodes %> > available_nodes: <% $.available_nodes %> > parameters: <% $.parameters %> > default_role_counts: <% $.default_role_counts %> > on-success: send_message > publish: > errors: <% task().result.errors %> > warnings: <% task().result.warnings %> > on-error: send_message > publish-on-error: > status: FAILED > message: <% task().result %> > statistics: null > errors: <% task().result.errors %> > warnings: <% task().result.warnings %> > > send_message: > action: zaqar.queue_post > retry: count=5 delay=1 > input: > queue_name: <% $.queue_name %> > messages: > body: > type: tripleo.validations.v1.check_hypervisor_stats > payload: > status: <% $.get('status', 'SUCCESS') %> > message: <% $.get('message', '') %> > execution: <% execution() %> > statistics: <% $.statistics %> > errors: <% $.errors %> > warnings: <% $.warnings %> > on-success: > - fail: <% $.get('status') = "FAILED" %> > > check_pre_deployment_validations: > input: > - deploy_kernel_name: 'bm-deploy-kernel' > - deploy_ramdisk_name: 'bm-deploy-ramdisk' > - roles_info: {} > - stack_id: overcloud > - parameters: {} > - default_role_counts: {} > - run_validations: true > - queue_name: tripleo > > output: > errors: <% $.errors %> > warnings: <% $.warnings %> > kernel_id: <% $.kernel_id %> > ramdisk_id: <% $.ramdisk_id %> > flavors: <% $.flavors %> > statistics: <% $.statistics %> > tags: > - tripleo-common-managed > tasks: > init_messages: > on-success: check_boot_images > publish: > errors: [] > warnings: [] > > check_boot_images: > workflow: check_boot_images > input: > deploy_kernel_name: <% $.deploy_kernel_name %> > deploy_ramdisk_name: <% $.deploy_ramdisk_name %> > run_validations: <% $.run_validations %> > queue_name: <% $.queue_name %> > publish: > errors: <% $.errors + task().result.get('errors', []) %> > warnings: <% $.warnings + task().result.get('warnings', []) %> > kernel_id: <% task().result.get('kernel_id') %> > ramdisk_id: <% 
task().result.get('ramdisk_id') %> > publish-on-error: > errors: <% $.errors + task().result.get('errors', []) %> > warnings: <% $.warnings + task().result.get('warnings', []) %> > kernel_id: <% task().result.get('kernel_id') %> > ramdisk_id: <% task().result.get('ramdisk_id') %> > status: FAILED > on-success: collect_flavors > on-error: collect_flavors > > collect_flavors: > workflow: collect_flavors > input: > roles_info: <% $.roles_info %> > run_validations: <% $.run_validations %> > queue_name: <% $.queue_name %> > publish: > errors: <% $.errors + task().result.get('errors', []) %> > warnings: <% $.warnings + task().result.get('warnings', []) %> > flavors: <% task().result.get('flavors') %> > publish-on-error: > errors: <% $.errors + task().result.get('errors', []) %> > warnings: <% $.warnings + task().result.get('warnings', []) %> > flavors: <% task().result.get('flavors') %> > status: FAILED > on-success: check_ironic_boot_configuration > on-error: check_ironic_boot_configuration > > check_ironic_boot_configuration: > workflow: check_ironic_boot_configuration > input: > kernel_id: <% $.kernel_id %> > ramdisk_id: <% $.ramdisk_id %> > run_validations: <% $.run_validations %> > queue_name: <% $.queue_name %> > publish: > errors: <% $.errors + task().result.get('errors', []) %> > warnings: <% $.warnings + task().result.get('warnings', []) %> > publish-on-error: > errors: <% $.errors + task().result.get('errors', []) %> > warnings: <% $.warnings + task().result.get('warnings', []) %> > status: FAILED > on-success: check_default_nodes_count > on-error: check_default_nodes_count > > check_default_nodes_count: > workflow: check_default_nodes_count > # ironic-nova sync happens once in two minutes > retry: count=12 delay=10 > input: > stack_id: <% $.stack_id %> > parameters: <% $.parameters %> > default_role_counts: <% $.default_role_counts %> > run_validations: <% $.run_validations %> > queue_name: <% $.queue_name %> > publish: > errors: <% $.errors + 
task().result.get('errors', []) %> > warnings: <% $.warnings + task().result.get('warnings', []) %> > statistics: <% task().result.get('statistics') %> > publish-on-error: > errors: <% $.errors + task().result.get('errors', []) %> > warnings: <% $.warnings + task().result.get('warnings', []) %> > statistics: <% task().result.get('statistics') %> > status: FAILED > on-success: verify_profiles > # Do not confuse user with info about profiles if the nodes > # count is off in the first place. Skip directly to > # send_message. (bug 1703942) > on-error: send_message > > verify_profiles: > workflow: verify_profiles > input: > flavors: <% $.flavors %> > run_validations: <% $.run_validations %> > queue_name: <% $.queue_name %> > publish: > errors: <% $.errors + task().result.get('errors', []) %> > warnings: <% $.warnings + task().result.get('warnings', []) %> > publish-on-error: > errors: <% $.errors + task().result.get('errors', []) %> > warnings: <% $.warnings + task().result.get('warnings', []) %> > status: FAILED > on-success: send_message > on-error: send_message > > send_message: > action: zaqar.queue_post > retry: count=5 delay=1 > input: > queue_name: <% $.queue_name %> > messages: > body: > type: tripleo.validations.v1.check_hypervisor_stats > payload: > status: <% $.get('status', 'SUCCESS') %> > message: <% $.get('message', '') %> > execution: <% execution() %> > kernel_id: <% $.kernel_id %> > ramdisk_id: <% $.ramdisk_id %> > flavors: <% $.flavors %> > statistics: <% $.statistics %> > errors: <% $.errors %> > warnings: <% $.warnings %> > on-success: > - fail: <% $.get('status') = "FAILED" %> >' >2018-08-21 16:36:13,711 DEBUG: http://192.168.24.1:8989 "POST /v2/workbooks HTTP/1.1" 201 25434 >2018-08-21 16:36:13,757 DEBUG: RESP: [201] Content-Length: 25434 Content-Type: application/json Date: Tue, 21 Aug 2018 13:36:13 GMT Connection: keep-alive >RESP BODY: {"definition": "---\nversion: '2.0'\nname: tripleo.validations.v1\ndescription: TripleO Validations Workflows 
v1\n\nworkflows:\n\n run_validation:\n input:\n - validation_name\n - plan: overcloud\n - queue_name: tripleo\n\n tags:\n - tripleo-common-managed\n\n tasks:\n\n notify_running:\n on-complete: run_validation\n action: zaqar.queue_post\n retry: count=5 delay=1\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.validations.v1.run_validation\n payload:\n validation_name: <% $.validation_name %>\n plan: <% $.plan %>\n status: RUNNING\n execution: <% execution() %>\n\n run_validation:\n on-success: send_message\n on-error: set_status_failed\n action: tripleo.validations.run_validation validation=<% $.validation_name %> plan=<% $.plan %>\n publish:\n status: SUCCESS\n stdout: <% task().result.stdout %>\n stderr: <% task().result.stderr %>\n\n set_status_failed:\n on-complete: send_message\n publish:\n status: FAILED\n stdout: <% task(run_validation).result.stdout %>\n stderr: <% task(run_validation).result.stderr %>\n\n send_message:\n action: zaqar.queue_post\n retry: count=5 delay=1\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.validations.v1.run_validation\n payload:\n validation_name: <% $.validation_name %>\n plan: <% $.plan %>\n status: <% $.get('status', 'SUCCESS') %>\n stdout: <% $.stdout %>\n stderr: <% $.stderr %>\n execution: <% execution() %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n\n run_validations:\n input:\n - validation_names: []\n - plan: overcloud\n - queue_name: tripleo\n\n tags:\n - tripleo-common-managed\n\n tasks:\n\n notify_running:\n on-complete: run_validations\n action: zaqar.queue_post\n retry: count=5 delay=1\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.validations.v1.run_validations\n payload:\n validation_names: <% $.validation_names %>\n plan: <% $.plan %>\n status: RUNNING\n execution: <% execution() %>\n\n run_validations:\n on-success: send_message\n on-error: set_status_failed\n workflow: tripleo.validations.v1.run_validation 
validation_name=<% $.validation %> plan=<% $.plan %> queue_name=<% $.queue_name %>\n with-items: validation in <% $.validation_names %>\n publish:\n status: SUCCESS\n\n set_status_failed:\n on-complete: send_message\n publish:\n status: FAILED\n\n send_message:\n action: zaqar.queue_post\n retry: count=5 delay=1\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.validations.v1.run_validations\n payload:\n validation_names: <% $.validation_names %>\n plan: <% $.plan %>\n status: <% $.get('status', 'SUCCESS') %>\n execution: <% execution() %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n\n run_groups:\n input:\n - group_names: []\n - plan: overcloud\n - queue_name: tripleo\n\n tags:\n - tripleo-common-managed\n\n tasks:\n\n find_validations:\n on-success: notify_running\n action: tripleo.validations.list_validations groups=<% $.group_names %>\n publish:\n validations: <% task().result %>\n\n notify_running:\n on-complete: run_validation_group\n action: zaqar.queue_post\n retry: count=5 delay=1\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.validations.v1.run_validations\n payload:\n group_names: <% $.group_names %>\n validation_names: <% $.validations.id %>\n plan: <% $.plan %>\n status: RUNNING\n execution: <% execution() %>\n\n run_validation_group:\n on-success: send_message\n on-error: set_status_failed\n workflow: tripleo.validations.v1.run_validation validation_name=<% $.validation %> plan=<% $.plan %> queue_name=<% $.queue_name %>\n with-items: validation in <% $.validations.id %>\n publish:\n status: SUCCESS\n\n set_status_failed:\n on-complete: send_message\n publish:\n status: FAILED\n\n send_message:\n action: zaqar.queue_post\n retry: count=5 delay=1\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.validations.v1.run_groups\n payload:\n group_names: <% $.group_names %>\n validation_names: <% $.validations.id %>\n plan: <% $.plan %>\n status: <% 
$.get('status', 'SUCCESS') %>\n execution: <% execution() %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n\n list:\n input:\n - group_names: []\n tags:\n - tripleo-common-managed\n tasks:\n find_validations:\n action: tripleo.validations.list_validations groups=<% $.group_names %>\n\n list_groups:\n tags:\n - tripleo-common-managed\n tasks:\n find_groups:\n action: tripleo.validations.list_groups\n\n add_validation_ssh_key_parameter:\n input:\n - container\n - queue_name: tripleo\n\n tags:\n - tripleo-common-managed\n\n tasks:\n test_validations_enabled:\n action: tripleo.validations.enabled\n on-success: get_pubkey\n on-error: unset_validation_key_parameter\n\n get_pubkey:\n action: tripleo.validations.get_pubkey\n on-success: set_validation_key_parameter\n publish:\n pubkey: <% task().result %>\n\n set_validation_key_parameter:\n action: tripleo.parameters.update\n input:\n parameters:\n node_admin_extra_ssh_keys: <% $.pubkey %>\n container: <% $.container %>\n\n # NOTE(shadower): We need to clear keys from a previous deployment\n unset_validation_key_parameter:\n action: tripleo.parameters.update\n input:\n parameters:\n node_admin_extra_ssh_keys: \"\"\n container: <% $.container %>\n\n copy_ssh_key:\n input:\n # FIXME: we should stop using heat-admin as e.g. split-stack\n # environments (where Nova didn't create overcloud nodes) don't\n # have it present\n - overcloud_admin: heat-admin\n - queue_name: tripleo\n tags:\n - tripleo-common-managed\n tasks:\n get_servers:\n action: nova.servers_list\n on-success: get_pubkey\n publish:\n servers: <% task().result._info %>\n\n get_pubkey:\n action: tripleo.validations.get_pubkey\n on-success: deploy_ssh_key\n publish:\n pubkey: <% task().result %>\n\n deploy_ssh_key:\n workflow: tripleo.deployment.v1.deploy_on_server\n with-items: server in <% $.servers %>\n input:\n server_name: <% $.server.name %>\n server_uuid: <% $.server.id %>\n config: |\n #!/bin/bash\n if ! 
grep \"<% $.pubkey %>\" /home/<% $.overcloud_admin %>/.ssh/authorized_keys; then\n echo \"<% $.pubkey %>\" >> /home/<% $.overcloud_admin %>/.ssh/authorized_keys\n fi\n config_name: copy_ssh_key\n group: script\n queue_name: <% $.queue_name %>\n\n check_boot_images:\n input:\n - deploy_kernel_name: 'bm-deploy-kernel'\n - deploy_ramdisk_name: 'bm-deploy-ramdisk'\n - run_validations: true\n - queue_name: tripleo\n output:\n errors: <% $.errors %>\n warnings: <% $.warnings %>\n kernel_id: <% $.kernel_id %>\n ramdisk_id: <% $.ramdisk_id %>\n tags:\n - tripleo-common-managed\n tasks:\n check_run_validations:\n on-complete:\n - get_images: <% $.run_validations %>\n - send_message: <% not $.run_validations %>\n\n get_images:\n action: glance.images_list\n on-success: check_images\n publish:\n images: <% task().result %>\n\n check_images:\n action: tripleo.validations.check_boot_images\n input:\n images: <% $.images %>\n deploy_kernel_name: <% $.deploy_kernel_name %>\n deploy_ramdisk_name: <% $.deploy_ramdisk_name %>\n on-success: send_message\n publish:\n kernel_id: <% task().result.kernel_id %>\n ramdisk_id: <% task().result.ramdisk_id %>\n warnings: <% task().result.warnings %>\n errors: <% task().result.errors %>\n on-error: send_message\n publish-on-error:\n kernel_id: <% task().result.kernel_id %>\n ramdisk_id: <% task().result.ramdisk_id %>\n warnings: <% task().result.warnings %>\n errors: <% task().result.errors %>\n status: FAILED\n message: <% task().result %>\n\n send_message:\n action: zaqar.queue_post\n retry: count=5 delay=1\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.validations.v1.check_boot_images\n payload:\n status: <% $.get('status', 'SUCCESS') %>\n message: <% $.get('message', '') %>\n execution: <% execution() %>\n kernel_id: <% $.kernel_id %>\n ramdisk_id: <% $.ramdisk_id %>\n errors: <% $.errors %>\n warnings: <% $.warnings %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n\n collect_flavors:\n input:\n 
- roles_info: {}\n - run_validations: true\n - queue_name: tripleo\n output:\n errors: <% $.errors %>\n warnings: <% $.warnings %>\n flavors: <% $.flavors %>\n\n tags:\n - tripleo-common-managed\n\n tasks:\n check_run_validations:\n on-complete:\n - check_flavors: <% $.run_validations %>\n - send_message: <% not $.run_validations %>\n\n check_flavors:\n action: tripleo.validations.check_flavors\n input:\n roles_info: <% $.roles_info %>\n on-success: send_message\n publish:\n flavors: <% task().result.flavors %>\n errors: <% task().result.errors %>\n warnings: <% task().result.warnings %>\n on-error: send_message\n publish-on-error:\n flavors: {}\n errors: <% task().result.errors %>\n warnings: <% task().result.warnings %>\n status: FAILED\n message: <% task().result %>\n\n send_message:\n action: zaqar.queue_post\n retry: count=5 delay=1\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.validations.v1.collect_flavors\n payload:\n status: <% $.get('status', 'SUCCESS') %>\n message: <% $.get('message', '') %>\n execution: <% execution() %>\n flavors: <% $.flavors %>\n errors: <% $.errors %>\n warnings: <% $.warnings %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n\n check_ironic_boot_configuration:\n input:\n - kernel_id: null\n - ramdisk_id: null\n - run_validations: true\n - queue_name: tripleo\n output:\n errors: <% $.errors %>\n warnings: <% $.warnings %>\n\n tags:\n - tripleo-common-managed\n\n tasks:\n check_run_validations:\n on-complete:\n - get_ironic_nodes: <% $.run_validations %>\n - send_message: <% not $.run_validations %>\n\n get_ironic_nodes:\n action: ironic.node_list\n input:\n provision_state: available\n maintenance: false\n detail: true\n on-success: check_node_boot_configuration\n publish:\n nodes: <% task().result %>\n on-error: send_message\n publish-on-error:\n status: FAILED\n message: <% task().result %>\n\n check_node_boot_configuration:\n action: tripleo.validations.check_node_boot_configuration\n 
input:\n node: <% $.node %>\n kernel_id: <% $.kernel_id %>\n ramdisk_id: <% $.ramdisk_id %>\n with-items: node in <% $.nodes %>\n on-success: send_message\n publish:\n errors: <% task().result.errors.flatten() %>\n warnings: <% task().result.warnings.flatten() %>\n on-error: send_message\n publish-on-error:\n errors: <% task().result.errors.flatten() %>\n warnings: <% task().result.warnings.flatten() %>\n status: FAILED\n message: <% task().result %>\n\n send_message:\n action: zaqar.queue_post\n retry: count=5 delay=1\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.validations.v1.check_ironic_boot_configuration\n payload:\n status: <% $.get('status', 'SUCCESS') %>\n message: <% $.get('message', '') %>\n execution: <% execution() %>\n errors: <% $.errors %>\n warnings: <% $.warnings %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n\n verify_profiles:\n input:\n - flavors: []\n - run_validations: true\n - queue_name: tripleo\n output:\n errors: <% $.errors %>\n warnings: <% $.warnings %>\n\n tags:\n - tripleo-common-managed\n\n tasks:\n check_run_validations:\n on-complete:\n - get_ironic_nodes: <% $.run_validations %>\n - send_message: <% not $.run_validations %>\n\n get_ironic_nodes:\n action: ironic.node_list\n input:\n maintenance: false\n detail: true\n on-success: verify_profiles\n publish:\n nodes: <% task().result %>\n on-error: send_message\n publish-on-error:\n status: FAILED\n message: <% task().result %>\n\n verify_profiles:\n action: tripleo.validations.verify_profiles\n input:\n nodes: <% $.nodes %>\n flavors: <% $.flavors %>\n on-success: send_message\n publish:\n errors: <% task().result.errors %>\n warnings: <% task().result.warnings %>\n on-error: send_message\n publish-on-error:\n errors: <% task().result.errors %>\n warnings: <% task().result.warnings %>\n status: FAILED\n message: <% task().result %>\n\n send_message:\n action: zaqar.queue_post\n retry: count=5 delay=1\n input:\n queue_name: <% 
$.queue_name %>\n messages:\n body:\n type: tripleo.validations.v1.verify_profiles\n payload:\n status: <% $.get('status', 'SUCCESS') %>\n message: <% $.get('message', '') %>\n execution: <% execution() %>\n errors: <% $.errors %>\n warnings: <% $.warnings %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n\n check_default_nodes_count:\n input:\n - stack_id: overcloud\n - parameters: {}\n - default_role_counts: {}\n - run_validations: true\n - queue_name: tripleo\n output:\n statistics: <% $.statistics %>\n errors: <% $.errors %>\n warnings: <% $.warnings %>\n\n tags:\n - tripleo-common-managed\n\n tasks:\n check_run_validations:\n on-complete:\n - get_hypervisor_statistics: <% $.run_validations %>\n - send_message: <% not $.run_validations %>\n\n get_hypervisor_statistics:\n action: nova.hypervisors_statistics\n on-success: get_stack\n publish:\n statistics: <% task().result %>\n on-error: send_message\n publish-on-error:\n status: FAILED\n message: <% task().result %>\n errors: []\n warnings: []\n statistics: null\n\n get_stack:\n action: heat.stacks_get\n input:\n stack_id: <% $.stack_id %>\n on-success: get_associated_nodes\n publish:\n stack: <% task().result %>\n on-error: get_associated_nodes\n publish-on-error:\n stack: null\n\n get_associated_nodes:\n action: ironic.node_list\n input:\n associated: true\n on-success: get_available_nodes\n publish:\n associated_nodes: <% task().result %>\n on-error: send_message\n publish-on-error:\n status: FAILED\n message: <% task().result %>\n errors: []\n warnings: []\n\n get_available_nodes:\n action: ironic.node_list\n input:\n provision_state: available\n associated: false\n maintenance: false\n on-success: check_nodes_count\n publish:\n available_nodes: <% task().result %>\n on-error: send_message\n publish-on-error:\n status: FAILED\n message: <% task().result %>\n errors: []\n warnings: []\n\n check_nodes_count:\n action: tripleo.validations.check_nodes_count\n input:\n statistics: <% $.statistics 
%>\n stack: <% $.stack %>\n associated_nodes: <% $.associated_nodes %>\n available_nodes: <% $.available_nodes %>\n parameters: <% $.parameters %>\n default_role_counts: <% $.default_role_counts %>\n on-success: send_message\n publish:\n errors: <% task().result.errors %>\n warnings: <% task().result.warnings %>\n on-error: send_message\n publish-on-error:\n status: FAILED\n message: <% task().result %>\n statistics: null\n errors: <% task().result.errors %>\n warnings: <% task().result.warnings %>\n\n send_message:\n action: zaqar.queue_post\n retry: count=5 delay=1\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.validations.v1.check_hypervisor_stats\n payload:\n status: <% $.get('status', 'SUCCESS') %>\n message: <% $.get('message', '') %>\n execution: <% execution() %>\n statistics: <% $.statistics %>\n errors: <% $.errors %>\n warnings: <% $.warnings %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n\n check_pre_deployment_validations:\n input:\n - deploy_kernel_name: 'bm-deploy-kernel'\n - deploy_ramdisk_name: 'bm-deploy-ramdisk'\n - roles_info: {}\n - stack_id: overcloud\n - parameters: {}\n - default_role_counts: {}\n - run_validations: true\n - queue_name: tripleo\n\n output:\n errors: <% $.errors %>\n warnings: <% $.warnings %>\n kernel_id: <% $.kernel_id %>\n ramdisk_id: <% $.ramdisk_id %>\n flavors: <% $.flavors %>\n statistics: <% $.statistics %>\n tags:\n - tripleo-common-managed\n tasks:\n init_messages:\n on-success: check_boot_images\n publish:\n errors: []\n warnings: []\n\n check_boot_images:\n workflow: check_boot_images\n input:\n deploy_kernel_name: <% $.deploy_kernel_name %>\n deploy_ramdisk_name: <% $.deploy_ramdisk_name %>\n run_validations: <% $.run_validations %>\n queue_name: <% $.queue_name %>\n publish:\n errors: <% $.errors + task().result.get('errors', []) %>\n warnings: <% $.warnings + task().result.get('warnings', []) %>\n kernel_id: <% task().result.get('kernel_id') %>\n ramdisk_id: <% 
task().result.get('ramdisk_id') %>\n publish-on-error:\n errors: <% $.errors + task().result.get('errors', []) %>\n warnings: <% $.warnings + task().result.get('warnings', []) %>\n kernel_id: <% task().result.get('kernel_id') %>\n ramdisk_id: <% task().result.get('ramdisk_id') %>\n status: FAILED\n on-success: collect_flavors\n on-error: collect_flavors\n\n collect_flavors:\n workflow: collect_flavors\n input:\n roles_info: <% $.roles_info %>\n run_validations: <% $.run_validations %>\n queue_name: <% $.queue_name %>\n publish:\n errors: <% $.errors + task().result.get('errors', []) %>\n warnings: <% $.warnings + task().result.get('warnings', []) %>\n flavors: <% task().result.get('flavors') %>\n publish-on-error:\n errors: <% $.errors + task().result.get('errors', []) %>\n warnings: <% $.warnings + task().result.get('warnings', []) %>\n flavors: <% task().result.get('flavors') %>\n status: FAILED\n on-success: check_ironic_boot_configuration\n on-error: check_ironic_boot_configuration\n\n check_ironic_boot_configuration:\n workflow: check_ironic_boot_configuration\n input:\n kernel_id: <% $.kernel_id %>\n ramdisk_id: <% $.ramdisk_id %>\n run_validations: <% $.run_validations %>\n queue_name: <% $.queue_name %>\n publish:\n errors: <% $.errors + task().result.get('errors', []) %>\n warnings: <% $.warnings + task().result.get('warnings', []) %>\n publish-on-error:\n errors: <% $.errors + task().result.get('errors', []) %>\n warnings: <% $.warnings + task().result.get('warnings', []) %>\n status: FAILED\n on-success: check_default_nodes_count\n on-error: check_default_nodes_count\n\n check_default_nodes_count:\n workflow: check_default_nodes_count\n # ironic-nova sync happens once in two minutes\n retry: count=12 delay=10\n input:\n stack_id: <% $.stack_id %>\n parameters: <% $.parameters %>\n default_role_counts: <% $.default_role_counts %>\n run_validations: <% $.run_validations %>\n queue_name: <% $.queue_name %>\n publish:\n errors: <% $.errors + 
task().result.get('errors', []) %>\n warnings: <% $.warnings + task().result.get('warnings', []) %>\n statistics: <% task().result.get('statistics') %>\n publish-on-error:\n errors: <% $.errors + task().result.get('errors', []) %>\n warnings: <% $.warnings + task().result.get('warnings', []) %>\n statistics: <% task().result.get('statistics') %>\n status: FAILED\n on-success: verify_profiles\n # Do not confuse user with info about profiles if the nodes\n # count is off in the first place. Skip directly to\n # send_message. (bug 1703942)\n on-error: send_message\n\n verify_profiles:\n workflow: verify_profiles\n input:\n flavors: <% $.flavors %>\n run_validations: <% $.run_validations %>\n queue_name: <% $.queue_name %>\n publish:\n errors: <% $.errors + task().result.get('errors', []) %>\n warnings: <% $.warnings + task().result.get('warnings', []) %>\n publish-on-error:\n errors: <% $.errors + task().result.get('errors', []) %>\n warnings: <% $.warnings + task().result.get('warnings', []) %>\n status: FAILED\n on-success: send_message\n on-error: send_message\n\n send_message:\n action: zaqar.queue_post\n retry: count=5 delay=1\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.validations.v1.check_hypervisor_stats\n payload:\n status: <% $.get('status', 'SUCCESS') %>\n message: <% $.get('message', '') %>\n execution: <% execution() %>\n kernel_id: <% $.kernel_id %>\n ramdisk_id: <% $.ramdisk_id %>\n flavors: <% $.flavors %>\n statistics: <% $.statistics %>\n errors: <% $.errors %>\n warnings: <% $.warnings %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n", "name": "tripleo.validations.v1", "tags": [], "created_at": "2018-08-21 13:36:12", "scope": "private", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "9c44adbe-6afc-4a61-b892-19e6f663798e"} > >2018-08-21 16:36:13,758 DEBUG: HTTP POST http://192.168.24.1:8989/v2/workbooks 201 >2018-08-21 16:36:13,762 DEBUG: REQ: curl -g -i -X POST 
http://192.168.24.1:8989/v2/workbooks -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "content-type: text/plain" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" -d '--- >version: '2.0' >name: tripleo.networks.v1 >description: TripleO Overcloud Networks Workflows v1 > >workflows: > > validate_networks_input: > description: > > Validate that required fields are present. > > input: > - networks > - queue_name: tripleo > > output: > result: <% task(validate_network_names).result %> > > tags: > - tripleo-common-managed > > tasks: > validate_network_names: > publish: > network_name_present: <% $.networks.all($.containsKey('name')) %> > on-success: > - set_status_success: <% $.network_name_present = true %> > - set_status_error: <% $.network_name_present = false %> > publish-on-error: > status: FAILED > message: <% task().result %> > on-error: notify_zaqar > > set_status_success: > on-success: notify_zaqar > publish: > status: SUCCESS > message: <% task(validate_network_names).result %> > > set_status_error: > on-success: notify_zaqar > publish: > status: FAILED > message: "One or more entries did not contain the required field 'name'" > > notify_zaqar: > action: zaqar.queue_post > input: > queue_name: <% $.queue_name %> > messages: > body: > type: tripleo.networks.v1.validate_networks_input > payload: > status: <% $.status %> > message: <% $.get('message', '') %> > execution: <% execution() %> > on-success: > - fail: <% $.get('status') = "FAILED" %> > > update_networks: > description: > > Takes data in networks parameter in json format, validates its contents, > and persists them in network_data.yaml. After successful update, > templates are regenerated. 
> > input: > - container: overcloud > - networks > - network_data_file: 'network_data.yaml' > - queue_name: tripleo > > output: > network_data: <% $.network_data %> > > tags: > - tripleo-common-managed > > tasks: > validate_input: > description: > > validate the format of input (input includes required fields for > each network) > workflow: validate_networks_input > input: > networks: <% $.networks %> > on-success: validate_network_files > publish-on-error: > status: FAILED > message: <% task().result %> > on-error: notify_zaqar > > validate_network_files: > description: > > validate that Network names exist in Swift container > workflow: tripleo.plan_management.v1.validate_network_files > input: > container: <% $.container %> > network_data: <% $.networks %> > queue_name: <% $.queue_name %> > publish: > network_data: <% task().network_data %> > on-success: get_available_networks > publish-on-error: > status: FAILED > message: <% task().result %> > on-error: notify_zaqar > > get_available_networks: > workflow: tripleo.plan_management.v1.list_available_networks > input: > container: <% $.container %> > queue_name: <% $.queue_name %> > publish: > available_networks: <% task().result.available_networks %> > on-success: get_current_networks > publish-on-error: > status: FAILED > message: <% task().result %> > on-error: notify_zaqar > > get_current_networks: > workflow: tripleo.plan_management.v1.get_network_data > input: > container: <% $.container %> > network_data_file: <% $.network_data_file %> > queue_name: <% $.queue_name %> > publish: > current_networks: <% task().result.network_data %> > on-success: update_network_data > publish-on-error: > status: FAILED > message: <% task().result %> > on-error: notify_zaqar > > update_network_data: > description: > > Combine (or replace) the network data > action: tripleo.plan.update_networks > input: > networks: <% $.available_networks %> > current_networks: <% $.current_networks %> > remove_all: false > publish: > 
new_network_data: <% task().result.network_data %> > on-success: update_network_data_in_swift > publish-on-error: > status: FAILED > message: <% task().result %> > on-error: notify_zaqar > > update_network_data_in_swift: > description: > > update network_data.yaml object in Swift with data from workflow input > action: swift.put_object > input: > container: <% $.container %> > obj: <% $.network_data_file %> > contents: <% yaml_dump($.new_network_data) %> > on-success: regenerate_templates > publish-on-error: > status: FAILED > message: <% task().result %> > on-error: notify_zaqar > > regenerate_templates: > action: tripleo.templates.process container=<% $.container %> > on-success: get_networks > publish-on-error: > status: FAILED > message: <% task().result %> > on-error: notify_zaqar > > get_networks: > description: > > run GetNetworksAction to get updated contents of network_data.yaml and > provide it as output > workflow: tripleo.plan_management.v1.get_network_data > input: > container: <% $.container %> > network_data_file: <% $.network_data_file %> > queue_name: <% $.queue_name %> > publish: > network_data: <% task().network_data %> > on-success: set_status_success > publish-on-error: > status: FAILED > message: <% task().result %> > on-error: notify_zaqar > > set_status_success: > on-success: notify_zaqar > publish: > status: SUCCESS > message: <% task(get_networks).result %> > > notify_zaqar: > action: zaqar.queue_post > input: > queue_name: <% $.queue_name %> > messages: > body: > type: tripleo.networks.v1.update_networks > payload: > status: <% $.status %> > message: <% $.get('message', '') %> > execution: <% execution() %> > on-success: > - fail: <% $.get('status') = "FAILED" %> >' >2018-08-21 16:36:16,244 DEBUG: http://192.168.24.1:8989 "POST /v2/workbooks HTTP/1.1" 201 6800 >2018-08-21 16:36:16,249 DEBUG: RESP: [201] Content-Length: 6800 Content-Type: application/json Date: Tue, 21 Aug 2018 13:36:16 GMT Connection: keep-alive >RESP BODY: {"definition": 
"---\nversion: '2.0'\nname: tripleo.networks.v1\ndescription: TripleO Overcloud Networks Workflows v1\n\nworkflows:\n\n validate_networks_input:\n description: >\n Validate that required fields are present.\n\n input:\n - networks\n - queue_name: tripleo\n\n output:\n result: <% task(validate_network_names).result %>\n\n tags:\n - tripleo-common-managed\n\n tasks:\n validate_network_names:\n publish:\n network_name_present: <% $.networks.all($.containsKey('name')) %>\n on-success:\n - set_status_success: <% $.network_name_present = true %>\n - set_status_error: <% $.network_name_present = false %>\n publish-on-error:\n status: FAILED\n message: <% task().result %>\n on-error: notify_zaqar\n\n set_status_success:\n on-success: notify_zaqar\n publish:\n status: SUCCESS\n message: <% task(validate_network_names).result %>\n\n set_status_error:\n on-success: notify_zaqar\n publish:\n status: FAILED\n message: \"One or more entries did not contain the required field 'name'\"\n\n notify_zaqar:\n action: zaqar.queue_post\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.networks.v1.validate_networks_input\n payload:\n status: <% $.status %>\n message: <% $.get('message', '') %>\n execution: <% execution() %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n\n update_networks:\n description: >\n Takes data in networks parameter in json format, validates its contents,\n and persists them in network_data.yaml. 
After successful update,\n templates are regenerated.\n\n input:\n - container: overcloud\n - networks\n - network_data_file: 'network_data.yaml'\n - queue_name: tripleo\n\n output:\n network_data: <% $.network_data %>\n\n tags:\n - tripleo-common-managed\n\n tasks:\n validate_input:\n description: >\n validate the format of input (input includes required fields for\n each network)\n workflow: validate_networks_input\n input:\n networks: <% $.networks %>\n on-success: validate_network_files\n publish-on-error:\n status: FAILED\n message: <% task().result %>\n on-error: notify_zaqar\n\n validate_network_files:\n description: >\n validate that Network names exist in Swift container\n workflow: tripleo.plan_management.v1.validate_network_files\n input:\n container: <% $.container %>\n network_data: <% $.networks %>\n queue_name: <% $.queue_name %>\n publish:\n network_data: <% task().network_data %>\n on-success: get_available_networks\n publish-on-error:\n status: FAILED\n message: <% task().result %>\n on-error: notify_zaqar\n\n get_available_networks:\n workflow: tripleo.plan_management.v1.list_available_networks\n input:\n container: <% $.container %>\n queue_name: <% $.queue_name %>\n publish:\n available_networks: <% task().result.available_networks %>\n on-success: get_current_networks\n publish-on-error:\n status: FAILED\n message: <% task().result %>\n on-error: notify_zaqar\n\n get_current_networks:\n workflow: tripleo.plan_management.v1.get_network_data\n input:\n container: <% $.container %>\n network_data_file: <% $.network_data_file %>\n queue_name: <% $.queue_name %>\n publish:\n current_networks: <% task().result.network_data %>\n on-success: update_network_data\n publish-on-error:\n status: FAILED\n message: <% task().result %>\n on-error: notify_zaqar\n\n update_network_data:\n description: >\n Combine (or replace) the network data\n action: tripleo.plan.update_networks\n input:\n networks: <% $.available_networks %>\n current_networks: <% 
$.current_networks %>\n remove_all: false\n publish:\n new_network_data: <% task().result.network_data %>\n on-success: update_network_data_in_swift\n publish-on-error:\n status: FAILED\n message: <% task().result %>\n on-error: notify_zaqar\n\n update_network_data_in_swift:\n description: >\n update network_data.yaml object in Swift with data from workflow input\n action: swift.put_object\n input:\n container: <% $.container %>\n obj: <% $.network_data_file %>\n contents: <% yaml_dump($.new_network_data) %>\n on-success: regenerate_templates\n publish-on-error:\n status: FAILED\n message: <% task().result %>\n on-error: notify_zaqar\n\n regenerate_templates:\n action: tripleo.templates.process container=<% $.container %>\n on-success: get_networks\n publish-on-error:\n status: FAILED\n message: <% task().result %>\n on-error: notify_zaqar\n\n get_networks:\n description: >\n run GetNetworksAction to get updated contents of network_data.yaml and\n provide it as output\n workflow: tripleo.plan_management.v1.get_network_data\n input:\n container: <% $.container %>\n network_data_file: <% $.network_data_file %>\n queue_name: <% $.queue_name %>\n publish:\n network_data: <% task().network_data %>\n on-success: set_status_success\n publish-on-error:\n status: FAILED\n message: <% task().result %>\n on-error: notify_zaqar\n\n set_status_success:\n on-success: notify_zaqar\n publish:\n status: SUCCESS\n message: <% task(get_networks).result %>\n\n notify_zaqar:\n action: zaqar.queue_post\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.networks.v1.update_networks\n payload:\n status: <% $.status %>\n message: <% $.get('message', '') %>\n execution: <% execution() %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n", "name": "tripleo.networks.v1", "tags": [], "created_at": "2018-08-21 13:36:15", "scope": "private", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "3eb3347a-1716-4099-8822-abb3241ae6bb"} > >2018-08-21 
16:36:16,249 DEBUG: HTTP POST http://192.168.24.1:8989/v2/workbooks 201 >2018-08-21 16:36:16,252 DEBUG: REQ: curl -g -i -X POST http://192.168.24.1:8989/v2/workbooks -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "content-type: text/plain" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" -d '--- >version: '2.0' >name: tripleo.derive_params.v1 >description: TripleO Workflows to derive deployment parameters from the introspected data > >workflows: > > derive_parameters: > description: The main workflow for deriving parameters from the introspected data > > input: > - plan: overcloud > - queue_name: tripleo > - user_inputs: {} > > tags: > - tripleo-common-managed > > tasks: > get_flattened_parameters: > action: tripleo.parameters.get_flatten container=<% $.plan %> > publish: > environment_parameters: <% task().result.environment_parameters %> > heat_resource_tree: <% task().result.heat_resource_tree %> > on-success: > - get_roles: <% $.environment_parameters and $.heat_resource_tree %> > - set_status_failed_get_flattened_parameters: <% (not $.environment_parameters) or (not $.heat_resource_tree) %> > on-error: set_status_failed_get_flattened_parameters > > get_roles: > action: tripleo.role.list container=<% $.plan %> > publish: > role_name_list: <% task().result %> > on-success: > - get_valid_roles: <% $.role_name_list %> > - set_status_failed_get_roles: <% not $.role_name_list %> > on-error: set_status_failed_on_error_get_roles > > # Obtain only the roles which has count > 0, by checking <RoleName>Count parameter, like ComputeCount > get_valid_roles: > publish: > valid_role_name_list: <% let(hr => $.heat_resource_tree.parameters) -> $.role_name_list.where(int($hr.get(concat($, 'Count'), {}).get('default', 0)) > 0) %> > on-success: > - for_each_role: <% $.valid_role_name_list %> > - set_status_failed_get_valid_roles: <% not $.valid_role_name_list %> > > # Execute the basic preparation workflow for each role to get 
introspection data > for_each_role: > with-items: role_name in <% $.valid_role_name_list %> > concurrency: 1 > workflow: _derive_parameters_per_role > input: > plan: <% $.plan %> > role_name: <% $.role_name %> > environment_parameters: <% $.environment_parameters %> > heat_resource_tree: <% $.heat_resource_tree %> > user_inputs: <% $.user_inputs %> > publish: > # Gets all the roles derived parameters as dictionary > result: <% task().result.select($.get('derived_parameters', {})).sum() %> > on-success: reset_derive_parameters_in_plan > on-error: set_status_failed_for_each_role > > reset_derive_parameters_in_plan: > action: tripleo.parameters.reset > input: > container: <% $.plan %> > key: 'derived_parameters' > on-success: > # Add the derived parameters to the deployment plan only when $.result > # (the derived parameters) is non-empty. Otherwise, we're done. > - update_derive_parameters_in_plan: <% $.result %> > - send_message: <% not $.result %> > on-error: set_status_failed_reset_derive_parameters_in_plan > > update_derive_parameters_in_plan: > action: tripleo.parameters.update > input: > container: <% $.plan %> > key: 'derived_parameters' > parameters: <% $.get('result', {}) %> > on-success: send_message > on-error: set_status_failed_update_derive_parameters_in_plan > > set_status_failed_get_flattened_parameters: > on-success: send_message > publish: > status: FAILED > message: <% task(get_flattened_parameters).result %> > > set_status_failed_get_roles: > on-success: send_message > publish: > status: FAILED > message: "Unable to determine the list of roles in the deployment plan" > > set_status_failed_on_error_get_roles: > on-success: send_message > publish: > status: FAILED > message: <% task(get_roles).result %> > > set_status_failed_get_valid_roles: > on-success: send_message > publish: > status: FAILED > message: 'Unable to determine the list of valid roles in the deployment plan.' 
> > set_status_failed_for_each_role: > on-success: update_message_format > publish: > status: FAILED > # gets the status and message for all roles from task result. > message: <% task(for_each_role).result.select(dict('role_name' => $.role_name, 'status' => $.get('status', 'SUCCESS'), 'message' => $.get('message', ''))) %> > > update_message_format: > on-success: send_message > publish: > # updates the message format(Role 'role name': message) for each roles which are failed and joins the message list as string with ', ' separator. > message: <% $.message.where($.get('status', 'SUCCESS') != 'SUCCESS').select(concat("Role '{}':".format($.role_name), " ", $.get('message', '(error unknown)'))).join(', ') %> > > set_status_failed_reset_derive_parameters_in_plan: > on-success: send_message > publish: > status: FAILED > message: <% task(reset_derive_parameters_in_plan).result %> > > set_status_failed_update_derive_parameters_in_plan: > on-success: send_message > publish: > status: FAILED > message: <% task(update_derive_parameters_in_plan).result %> > > send_message: > action: zaqar.queue_post > retry: count=5 delay=1 > input: > queue_name: <% $.queue_name %> > messages: > body: > type: tripleo.derive_params.v1.derive_parameters > payload: > status: <% $.get('status', 'SUCCESS') %> > message: <% $.get('message', '') %> > result: <% $.get('result', '') %> > execution: <% execution() %> > on-success: > - fail: <% $.get('status') = 'FAILED' %> > > > _derive_parameters_per_role: > description: > > Workflow which runs per role to get the introspection data on the first matching node assigned to role. 
> Once introspection data is fetched, this worklow will trigger the actual derive parameters workflow > input: > - plan > - role_name > - environment_parameters > - heat_resource_tree > - user_inputs > > output: > derived_parameters: <% $.get('derived_parameters', {}) %> > # Need role_name in output parameter to display the status for all roles in main workflow when any role fails here. > role_name: <% $.role_name %> > > tags: > - tripleo-common-managed > > tasks: > get_role_info: > workflow: _get_role_info > input: > role_name: <% $.role_name %> > heat_resource_tree: <% $.heat_resource_tree %> > publish: > role_features: <% task().result.get('role_features', []) %> > role_services: <% task().result.get('role_services', []) %> > on-success: > # Continue only if there are features associated with this role. Otherwise, we're done. > - get_scheduler_hints: <% $.role_features %> > on-error: set_status_failed_get_role_info > > # Find a node associated with this role. Look for nodes matching any scheduler hints > # associated with the role, and if there are no scheduler hints then locate nodes > # with a profile matching the role's flavor. 
> get_scheduler_hints: > publish: > scheduler_hints: <% let(param_name => concat($.role_name, 'SchedulerHints')) -> $.heat_resource_tree.parameters.get($param_name, {}).get('default', {}) %> > on-success: > - get_hint_regex: <% $.scheduler_hints %> > # If there are no scheduler hints then move on to use the flavor > - get_flavor_name: <% not $.scheduler_hints %> > > get_hint_regex: > publish: > hint_regex: <% $.scheduler_hints.get('capabilities:node', '').replace('%index%', '(\d+)') %> > on-success: > - get_node_with_hint: <% $.hint_regex %> > # If there is no 'capabilities:node' hint then move on to use the flavor > - get_flavor_name: <% not $.hint_regex %> > > get_node_with_hint: > workflow: tripleo.baremetal.v1.nodes_with_hint > input: > hint_regex: <% concat('^', $.hint_regex, '$') %> > publish: > role_node_uuid: <% task().result.matching_nodes.first('') %> > on-success: > - get_introspection_data: <% $.role_node_uuid %> > # If no nodes match the scheduler hint then move on to use the flavor > - get_flavor_name: <% not $.role_node_uuid %> > on-error: set_status_failed_on_error_get_node_with_hint > > get_flavor_name: > publish: > flavor_name: <% let(param_name => concat('Overcloud', $.role_name, 'Flavor').replace('OvercloudControllerFlavor', 'OvercloudControlFlavor')) -> $.heat_resource_tree.parameters.get($param_name, {}).get('default', '') %> > on-success: > - get_profile_name: <% $.flavor_name %> > - set_status_failed_get_flavor_name: <% not $.flavor_name %> > > get_profile_name: > action: tripleo.parameters.get_profile_of_flavor flavor_name=<% $.flavor_name %> > publish: > profile_name: <% task().result %> > on-success: get_profile_node > on-error: set_status_failed_get_profile_name > > get_profile_node: > workflow: tripleo.baremetal.v1.nodes_with_profile > input: > profile: <% $.profile_name %> > publish: > role_node_uuid: <% task().result.matching_nodes.first('') %> > on-success: > - get_introspection_data: <% $.role_node_uuid %> > - 
set_status_failed_no_matching_node_get_profile_node: <% not $.role_node_uuid %> > on-error: set_status_failed_on_error_get_profile_node > > get_introspection_data: > action: baremetal_introspection.get_data uuid=<% $.role_node_uuid %> > publish: > hw_data: <% task().result %> > # Establish an empty dictionary of derived_parameters prior to > # invoking the individual "feature" algorithms > derived_parameters: <% dict() %> > on-success: handle_dpdk_feature > on-error: set_status_failed_get_introspection_data > > handle_dpdk_feature: > on-success: > - get_dpdk_derive_params: <% $.role_features.contains('DPDK') %> > - handle_sriov_feature: <% not $.role_features.contains('DPDK') %> > > get_dpdk_derive_params: > workflow: tripleo.derive_params_formulas.v1.dpdk_derive_params > input: > plan: <% $.plan %> > role_name: <% $.role_name %> > hw_data: <% $.hw_data %> > user_inputs: <% $.user_inputs %> > publish: > derived_parameters: <% task().result.get('derived_parameters', {}) %> > on-success: handle_sriov_feature > on-error: set_status_failed_get_dpdk_derive_params > > handle_sriov_feature: > on-success: > - get_sriov_derive_params: <% $.role_features.contains('SRIOV') %> > - handle_host_feature: <% not $.role_features.contains('SRIOV') %> > > get_sriov_derive_params: > workflow: tripleo.derive_params_formulas.v1.sriov_derive_params > input: > role_name: <% $.role_name %> > hw_data: <% $.hw_data %> > derived_parameters: <% $.derived_parameters %> > publish: > derived_parameters: <% task().result.get('derived_parameters', {}) %> > on-success: handle_host_feature > on-error: set_status_failed_get_sriov_derive_params > > handle_host_feature: > on-success: > - get_host_derive_params: <% $.role_features.contains('HOST') %> > - handle_hci_feature: <% not $.role_features.contains('HOST') %> > > get_host_derive_params: > workflow: tripleo.derive_params_formulas.v1.host_derive_params > input: > role_name: <% $.role_name %> > hw_data: <% $.hw_data %> > user_inputs: <% $.user_inputs 
%> > derived_parameters: <% $.derived_parameters %> > publish: > derived_parameters: <% task().result.get('derived_parameters', {}) %> > on-success: handle_hci_feature > on-error: set_status_failed_get_host_derive_params > > handle_hci_feature: > on-success: > - get_hci_derive_params: <% $.role_features.contains('HCI') %> > > get_hci_derive_params: > workflow: tripleo.derive_params_formulas.v1.hci_derive_params > input: > role_name: <% $.role_name %> > environment_parameters: <% $.environment_parameters %> > heat_resource_tree: <% $.heat_resource_tree %> > introspection_data: <% $.hw_data %> > user_inputs: <% $.user_inputs %> > derived_parameters: <% $.derived_parameters %> > publish: > derived_parameters: <% task().result.get('derived_parameters', {}) %> > on-error: set_status_failed_get_hci_derive_params > # Done (no more derived parameter features) > > set_status_failed_get_role_info: > publish: > role_name: <% $.role_name %> > status: FAILED > message: <% task(get_role_info).result.get('message', '') %> > on-success: fail > > set_status_failed_get_flavor_name: > publish: > role_name: <% $.role_name %> > status: FAILED > message: <% "Unable to determine flavor for role '{0}'".format($.role_name) %> > on-success: fail > > set_status_failed_get_profile_name: > publish: > role_name: <% $.role_name %> > status: FAILED > message: <% task(get_profile_name).result %> > on-success: fail > > set_status_failed_no_matching_node_get_profile_node: > publish: > role_name: <% $.role_name %> > status: FAILED > message: <% "Unable to determine matching node for profile '{0}'".format($.profile_name) %> > on-success: fail > > set_status_failed_on_error_get_profile_node: > publish: > role_name: <% $.role_name %> > status: FAILED > message: <% task(get_profile_node).result %> > on-success: fail > > set_status_failed_on_error_get_node_with_hint: > publish: > role_name: <% $.role_name %> > status: FAILED > message: <% task(get_node_with_hint).result %> > on-success: fail > > 
set_status_failed_get_introspection_data: > publish: > role_name: <% $.role_name %> > status: FAILED > message: <% task(get_introspection_data).result %> > on-success: fail > > set_status_failed_get_dpdk_derive_params: > publish: > role_name: <% $.role_name %> > status: FAILED > message: <% task(get_dpdk_derive_params).result.message %> > on-success: fail > > set_status_failed_get_sriov_derive_params: > publish: > role_name: <% $.role_name %> > status: FAILED > message: <% task(get_sriov_derive_params).result.message %> > on-success: fail > > set_status_failed_get_host_derive_params: > publish: > role_name: <% $.role_name %> > status: FAILED > message: <% task(get_host_derive_params).result.message %> > on-success: fail > > set_status_failed_get_hci_derive_params: > publish: > role_name: <% $.role_name %> > status: FAILED > message: <% task(get_hci_derive_params).result.message %> > on-success: fail > > > _get_role_info: > description: > > Workflow that determines the list of derived parameter features (DPDK, > HCI, etc.) for a role based on the services assigned to the role. 
> > input: > - role_name > - heat_resource_tree > > tags: > - tripleo-common-managed > > tasks: > get_resource_chains: > publish: > resource_chains: <% $.heat_resource_tree.resources.values().where($.get('type', '') = 'OS::Heat::ResourceChain') %> > on-success: > - get_role_chain: <% $.resource_chains %> > - set_status_failed_get_resource_chains: <% not $.resource_chains %> > > get_role_chain: > publish: > role_chain: <% let(chain_name => concat($.role_name, 'ServiceChain'))-> $.heat_resource_tree.resources.values().where($.name = $chain_name).first({}) %> > on-success: > - get_service_chain: <% $.role_chain %> > - set_status_failed_get_role_chain: <% not $.role_chain %> > > get_service_chain: > publish: > service_chain: <% let(resources => $.role_chain.resources)-> $.resource_chains.where($resources.contains($.id)).first('') %> > on-success: > - get_role_services: <% $.service_chain %> > - set_status_failed_get_service_chain: <% not $.service_chain %> > > get_role_services: > publish: > role_services: <% let(resources => $.heat_resource_tree.resources)-> $.service_chain.resources.select($resources.get($)) %> > on-success: > - check_features: <% $.role_services %> > - set_status_failed_get_role_services: <% not $.role_services %> > > check_features: > on-success: build_feature_dict > publish: > # The role supports the DPDK feature if the NeutronDatapathType parameter is present > dpdk: <% let(resources => $.heat_resource_tree.resources) -> $.role_services.any($.get('parameters', []).contains('NeutronDatapathType') or $.get('resources', []).select($resources.get($)).any($.get('parameters', []).contains('NeutronDatapathType'))) %> > > # The role supports the DPDK feature in ODL if the OvsEnableDpdk parameter value is true in role parameters. 
> odl_dpdk: <% let(role => $.role_name) -> $.heat_resource_tree.parameters.get(concat($role, 'Parameters'), {}).get('default', {}).get('OvsEnableDpdk', false) %> > > # The role supports the SRIOV feature if it includes NeutronSriovAgent services. > sriov: <% $.role_services.any($.get('type', '').endsWith('::NeutronSriovAgent')) %> > > # The role supports the HCI feature if it includes both NovaCompute and CephOSD services. > hci: <% $.role_services.any($.get('type', '').endsWith('::NovaCompute')) and $.role_services.any($.get('type', '').endsWith('::CephOSD')) %> > > build_feature_dict: > on-success: filter_features > publish: > feature_dict: <% dict(DPDK => ($.dpdk or $.odl_dpdk), SRIOV => $.sriov, HOST => ($.dpdk or $.odl_dpdk or $.sriov), HCI => $.hci) %> > > filter_features: > publish: > # The list of features that are enabled (i.e. are true in the feature_dict). > role_features: <% let(feature_dict => $.feature_dict)-> $feature_dict.keys().where($feature_dict[$]) %> > > set_status_failed_get_resource_chains: > publish: > message: <% 'Unable to locate any resource chains in the heat resource tree' %> > on-success: fail > > set_status_failed_get_role_chain: > publish: > message: <% "Unable to determine the service chain resource for role '{0}'".format($.role_name) %> > on-success: fail > > set_status_failed_get_service_chain: > publish: > message: <% "Unable to determine the service chain for role '{0}'".format($.role_name) %> > on-success: fail > > set_status_failed_get_role_services: > publish: > message: <% "Unable to determine list of services for role '{0}'".format($.role_name) %> > on-success: fail >' >2018-08-21 16:36:23,928 DEBUG: http://192.168.24.1:8989 "POST /v2/workbooks HTTP/1.1" 201 20049 >2018-08-21 16:36:23,973 DEBUG: RESP: [201] Content-Length: 20049 Content-Type: application/json Date: Tue, 21 Aug 2018 13:36:23 GMT Connection: keep-alive >RESP BODY: {"definition": "---\nversion: '2.0'\nname: tripleo.derive_params.v1\ndescription: TripleO 
Workflows to derive deployment parameters from the introspected data\n\nworkflows:\n\n derive_parameters:\n description: The main workflow for deriving parameters from the introspected data\n\n input:\n - plan: overcloud\n - queue_name: tripleo\n - user_inputs: {}\n\n tags:\n - tripleo-common-managed\n\n tasks:\n get_flattened_parameters:\n action: tripleo.parameters.get_flatten container=<% $.plan %>\n publish:\n environment_parameters: <% task().result.environment_parameters %>\n heat_resource_tree: <% task().result.heat_resource_tree %>\n on-success:\n - get_roles: <% $.environment_parameters and $.heat_resource_tree %>\n - set_status_failed_get_flattened_parameters: <% (not $.environment_parameters) or (not $.heat_resource_tree) %>\n on-error: set_status_failed_get_flattened_parameters\n\n get_roles:\n action: tripleo.role.list container=<% $.plan %>\n publish:\n role_name_list: <% task().result %>\n on-success:\n - get_valid_roles: <% $.role_name_list %>\n - set_status_failed_get_roles: <% not $.role_name_list %>\n on-error: set_status_failed_on_error_get_roles\n\n # Obtain only the roles which has count > 0, by checking <RoleName>Count parameter, like ComputeCount\n get_valid_roles:\n publish:\n valid_role_name_list: <% let(hr => $.heat_resource_tree.parameters) -> $.role_name_list.where(int($hr.get(concat($, 'Count'), {}).get('default', 0)) > 0) %>\n on-success:\n - for_each_role: <% $.valid_role_name_list %>\n - set_status_failed_get_valid_roles: <% not $.valid_role_name_list %>\n\n # Execute the basic preparation workflow for each role to get introspection data\n for_each_role:\n with-items: role_name in <% $.valid_role_name_list %>\n concurrency: 1\n workflow: _derive_parameters_per_role\n input:\n plan: <% $.plan %>\n role_name: <% $.role_name %>\n environment_parameters: <% $.environment_parameters %>\n heat_resource_tree: <% $.heat_resource_tree %>\n user_inputs: <% $.user_inputs %>\n publish:\n # Gets all the roles derived parameters as dictionary\n 
result: <% task().result.select($.get('derived_parameters', {})).sum() %>\n on-success: reset_derive_parameters_in_plan\n on-error: set_status_failed_for_each_role\n\n reset_derive_parameters_in_plan:\n action: tripleo.parameters.reset\n input:\n container: <% $.plan %>\n key: 'derived_parameters'\n on-success:\n # Add the derived parameters to the deployment plan only when $.result\n # (the derived parameters) is non-empty. Otherwise, we're done.\n - update_derive_parameters_in_plan: <% $.result %>\n - send_message: <% not $.result %>\n on-error: set_status_failed_reset_derive_parameters_in_plan\n\n update_derive_parameters_in_plan:\n action: tripleo.parameters.update\n input:\n container: <% $.plan %>\n key: 'derived_parameters'\n parameters: <% $.get('result', {}) %>\n on-success: send_message\n on-error: set_status_failed_update_derive_parameters_in_plan\n\n set_status_failed_get_flattened_parameters:\n on-success: send_message\n publish:\n status: FAILED\n message: <% task(get_flattened_parameters).result %>\n\n set_status_failed_get_roles:\n on-success: send_message\n publish:\n status: FAILED\n message: \"Unable to determine the list of roles in the deployment plan\"\n\n set_status_failed_on_error_get_roles:\n on-success: send_message\n publish:\n status: FAILED\n message: <% task(get_roles).result %>\n\n set_status_failed_get_valid_roles:\n on-success: send_message\n publish:\n status: FAILED\n message: 'Unable to determine the list of valid roles in the deployment plan.'\n\n set_status_failed_for_each_role:\n on-success: update_message_format\n publish:\n status: FAILED\n # gets the status and message for all roles from task result.\n message: <% task(for_each_role).result.select(dict('role_name' => $.role_name, 'status' => $.get('status', 'SUCCESS'), 'message' => $.get('message', ''))) %>\n\n update_message_format:\n on-success: send_message\n publish:\n # updates the message format(Role 'role name': message) for each roles which are failed and joins the 
message list as string with ', ' separator.\n message: <% $.message.where($.get('status', 'SUCCESS') != 'SUCCESS').select(concat(\"Role '{}':\".format($.role_name), \" \", $.get('message', '(error unknown)'))).join(', ') %>\n\n set_status_failed_reset_derive_parameters_in_plan:\n on-success: send_message\n publish:\n status: FAILED\n message: <% task(reset_derive_parameters_in_plan).result %>\n\n set_status_failed_update_derive_parameters_in_plan:\n on-success: send_message\n publish:\n status: FAILED\n message: <% task(update_derive_parameters_in_plan).result %>\n\n send_message:\n action: zaqar.queue_post\n retry: count=5 delay=1\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.derive_params.v1.derive_parameters\n payload:\n status: <% $.get('status', 'SUCCESS') %>\n message: <% $.get('message', '') %>\n result: <% $.get('result', '') %>\n execution: <% execution() %>\n on-success:\n - fail: <% $.get('status') = 'FAILED' %>\n\n\n _derive_parameters_per_role:\n description: >\n Workflow which runs per role to get the introspection data on the first matching node assigned to role.\n Once introspection data is fetched, this worklow will trigger the actual derive parameters workflow\n input:\n - plan\n - role_name\n - environment_parameters\n - heat_resource_tree\n - user_inputs\n\n output:\n derived_parameters: <% $.get('derived_parameters', {}) %>\n # Need role_name in output parameter to display the status for all roles in main workflow when any role fails here.\n role_name: <% $.role_name %>\n\n tags:\n - tripleo-common-managed\n\n tasks:\n get_role_info:\n workflow: _get_role_info\n input:\n role_name: <% $.role_name %>\n heat_resource_tree: <% $.heat_resource_tree %>\n publish:\n role_features: <% task().result.get('role_features', []) %>\n role_services: <% task().result.get('role_services', []) %>\n on-success:\n # Continue only if there are features associated with this role. 
Otherwise, we're done.\n - get_scheduler_hints: <% $.role_features %>\n on-error: set_status_failed_get_role_info\n\n # Find a node associated with this role. Look for nodes matching any scheduler hints\n # associated with the role, and if there are no scheduler hints then locate nodes\n # with a profile matching the role's flavor.\n get_scheduler_hints:\n publish:\n scheduler_hints: <% let(param_name => concat($.role_name, 'SchedulerHints')) -> $.heat_resource_tree.parameters.get($param_name, {}).get('default', {}) %>\n on-success:\n - get_hint_regex: <% $.scheduler_hints %>\n # If there are no scheduler hints then move on to use the flavor\n - get_flavor_name: <% not $.scheduler_hints %>\n\n get_hint_regex:\n publish:\n hint_regex: <% $.scheduler_hints.get('capabilities:node', '').replace('%index%', '(\\d+)') %>\n on-success:\n - get_node_with_hint: <% $.hint_regex %>\n # If there is no 'capabilities:node' hint then move on to use the flavor\n - get_flavor_name: <% not $.hint_regex %>\n\n get_node_with_hint:\n workflow: tripleo.baremetal.v1.nodes_with_hint\n input:\n hint_regex: <% concat('^', $.hint_regex, '$') %>\n publish:\n role_node_uuid: <% task().result.matching_nodes.first('') %>\n on-success:\n - get_introspection_data: <% $.role_node_uuid %>\n # If no nodes match the scheduler hint then move on to use the flavor\n - get_flavor_name: <% not $.role_node_uuid %>\n on-error: set_status_failed_on_error_get_node_with_hint\n\n get_flavor_name:\n publish:\n flavor_name: <% let(param_name => concat('Overcloud', $.role_name, 'Flavor').replace('OvercloudControllerFlavor', 'OvercloudControlFlavor')) -> $.heat_resource_tree.parameters.get($param_name, {}).get('default', '') %>\n on-success:\n - get_profile_name: <% $.flavor_name %>\n - set_status_failed_get_flavor_name: <% not $.flavor_name %>\n\n get_profile_name:\n action: tripleo.parameters.get_profile_of_flavor flavor_name=<% $.flavor_name %>\n publish:\n profile_name: <% task().result %>\n on-success: 
get_profile_node\n on-error: set_status_failed_get_profile_name\n\n get_profile_node:\n workflow: tripleo.baremetal.v1.nodes_with_profile\n input:\n profile: <% $.profile_name %>\n publish:\n role_node_uuid: <% task().result.matching_nodes.first('') %>\n on-success:\n - get_introspection_data: <% $.role_node_uuid %>\n - set_status_failed_no_matching_node_get_profile_node: <% not $.role_node_uuid %>\n on-error: set_status_failed_on_error_get_profile_node\n\n get_introspection_data:\n action: baremetal_introspection.get_data uuid=<% $.role_node_uuid %>\n publish:\n hw_data: <% task().result %>\n # Establish an empty dictionary of derived_parameters prior to\n # invoking the individual \"feature\" algorithms\n derived_parameters: <% dict() %>\n on-success: handle_dpdk_feature\n on-error: set_status_failed_get_introspection_data\n\n handle_dpdk_feature:\n on-success:\n - get_dpdk_derive_params: <% $.role_features.contains('DPDK') %>\n - handle_sriov_feature: <% not $.role_features.contains('DPDK') %>\n\n get_dpdk_derive_params:\n workflow: tripleo.derive_params_formulas.v1.dpdk_derive_params\n input:\n plan: <% $.plan %>\n role_name: <% $.role_name %>\n hw_data: <% $.hw_data %>\n user_inputs: <% $.user_inputs %>\n publish:\n derived_parameters: <% task().result.get('derived_parameters', {}) %>\n on-success: handle_sriov_feature\n on-error: set_status_failed_get_dpdk_derive_params\n\n handle_sriov_feature:\n on-success:\n - get_sriov_derive_params: <% $.role_features.contains('SRIOV') %>\n - handle_host_feature: <% not $.role_features.contains('SRIOV') %>\n\n get_sriov_derive_params:\n workflow: tripleo.derive_params_formulas.v1.sriov_derive_params\n input:\n role_name: <% $.role_name %>\n hw_data: <% $.hw_data %>\n derived_parameters: <% $.derived_parameters %>\n publish:\n derived_parameters: <% task().result.get('derived_parameters', {}) %>\n on-success: handle_host_feature\n on-error: set_status_failed_get_sriov_derive_params\n\n handle_host_feature:\n on-success:\n 
- get_host_derive_params: <% $.role_features.contains('HOST') %>\n - handle_hci_feature: <% not $.role_features.contains('HOST') %>\n\n get_host_derive_params:\n workflow: tripleo.derive_params_formulas.v1.host_derive_params\n input:\n role_name: <% $.role_name %>\n hw_data: <% $.hw_data %>\n user_inputs: <% $.user_inputs %>\n derived_parameters: <% $.derived_parameters %>\n publish:\n derived_parameters: <% task().result.get('derived_parameters', {}) %>\n on-success: handle_hci_feature\n on-error: set_status_failed_get_host_derive_params\n\n handle_hci_feature:\n on-success:\n - get_hci_derive_params: <% $.role_features.contains('HCI') %>\n\n get_hci_derive_params:\n workflow: tripleo.derive_params_formulas.v1.hci_derive_params\n input:\n role_name: <% $.role_name %>\n environment_parameters: <% $.environment_parameters %>\n heat_resource_tree: <% $.heat_resource_tree %>\n introspection_data: <% $.hw_data %>\n user_inputs: <% $.user_inputs %>\n derived_parameters: <% $.derived_parameters %>\n publish:\n derived_parameters: <% task().result.get('derived_parameters', {}) %>\n on-error: set_status_failed_get_hci_derive_params\n # Done (no more derived parameter features)\n\n set_status_failed_get_role_info:\n publish:\n role_name: <% $.role_name %>\n status: FAILED\n message: <% task(get_role_info).result.get('message', '') %>\n on-success: fail\n\n set_status_failed_get_flavor_name:\n publish:\n role_name: <% $.role_name %>\n status: FAILED\n message: <% \"Unable to determine flavor for role '{0}'\".format($.role_name) %>\n on-success: fail\n\n set_status_failed_get_profile_name:\n publish:\n role_name: <% $.role_name %>\n status: FAILED\n message: <% task(get_profile_name).result %>\n on-success: fail\n\n set_status_failed_no_matching_node_get_profile_node:\n publish:\n role_name: <% $.role_name %>\n status: FAILED\n message: <% \"Unable to determine matching node for profile '{0}'\".format($.profile_name) %>\n on-success: fail\n\n 
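[editor's note] The `_derive_parameters_per_role` workflow logged above chains its feature handlers in a fixed order (handle_dpdk_feature, handle_sriov_feature, handle_host_feature, handle_hci_feature), running each formula sub-workflow only when the role's feature list contains it and republishing the accumulated `derived_parameters` each time. A minimal Python sketch of that dispatch, with invented formula stand-ins (the parameter names are illustrative, not what the real formulas compute):

```python
# Sketch of the per-role feature dispatch in _derive_parameters_per_role.
# Each "formula" plays the role of a derive_params_formulas sub-workflow:
# it receives the parameters derived so far and returns the merged result.
def derive_for_role(role_features, formulas):
    derived = {}  # starts empty, as published by get_introspection_data
    # Chain order matches the workflow: DPDK, then SRIOV, then HOST, then HCI.
    for feature in ("DPDK", "SRIOV", "HOST", "HCI"):
        if feature in role_features:
            derived.update(formulas[feature](derived))
    return derived

# Hypothetical formula outputs, for illustration only.
formulas = {
    "DPDK": lambda d: {"OvsPmdCoreList": "1,17"},
    "SRIOV": lambda d: {"NeutronPhysicalDevMappings": "datacentre:ens3"},
    "HOST": lambda d: {"NovaReservedHostMemory": 4096},
    "HCI": lambda d: {"NovaCPUAllocationRatio": 8.2},
}
```

A role with no enabled features skips every formula and yields an empty dict, which is why the main workflow only writes `derived_parameters` back to the plan when the merged result is non-empty.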
set_status_failed_on_error_get_profile_node:\n publish:\n role_name: <% $.role_name %>\n status: FAILED\n message: <% task(get_profile_node).result %>\n on-success: fail\n\n set_status_failed_on_error_get_node_with_hint:\n publish:\n role_name: <% $.role_name %>\n status: FAILED\n message: <% task(get_node_with_hint).result %>\n on-success: fail\n\n set_status_failed_get_introspection_data:\n publish:\n role_name: <% $.role_name %>\n status: FAILED\n message: <% task(get_introspection_data).result %>\n on-success: fail\n\n set_status_failed_get_dpdk_derive_params:\n publish:\n role_name: <% $.role_name %>\n status: FAILED\n message: <% task(get_dpdk_derive_params).result.message %>\n on-success: fail\n\n set_status_failed_get_sriov_derive_params:\n publish:\n role_name: <% $.role_name %>\n status: FAILED\n message: <% task(get_sriov_derive_params).result.message %>\n on-success: fail\n\n set_status_failed_get_host_derive_params:\n publish:\n role_name: <% $.role_name %>\n status: FAILED\n message: <% task(get_host_derive_params).result.message %>\n on-success: fail\n\n set_status_failed_get_hci_derive_params:\n publish:\n role_name: <% $.role_name %>\n status: FAILED\n message: <% task(get_hci_derive_params).result.message %>\n on-success: fail\n\n\n _get_role_info:\n description: >\n Workflow that determines the list of derived parameter features (DPDK,\n HCI, etc.) 
for a role based on the services assigned to the role.\n\n input:\n - role_name\n - heat_resource_tree\n\n tags:\n - tripleo-common-managed\n\n tasks:\n get_resource_chains:\n publish:\n resource_chains: <% $.heat_resource_tree.resources.values().where($.get('type', '') = 'OS::Heat::ResourceChain') %>\n on-success:\n - get_role_chain: <% $.resource_chains %>\n - set_status_failed_get_resource_chains: <% not $.resource_chains %>\n\n get_role_chain:\n publish:\n role_chain: <% let(chain_name => concat($.role_name, 'ServiceChain'))-> $.heat_resource_tree.resources.values().where($.name = $chain_name).first({}) %>\n on-success:\n - get_service_chain: <% $.role_chain %>\n - set_status_failed_get_role_chain: <% not $.role_chain %>\n\n get_service_chain:\n publish:\n service_chain: <% let(resources => $.role_chain.resources)-> $.resource_chains.where($resources.contains($.id)).first('') %>\n on-success:\n - get_role_services: <% $.service_chain %>\n - set_status_failed_get_service_chain: <% not $.service_chain %>\n\n get_role_services:\n publish:\n role_services: <% let(resources => $.heat_resource_tree.resources)-> $.service_chain.resources.select($resources.get($)) %>\n on-success:\n - check_features: <% $.role_services %>\n - set_status_failed_get_role_services: <% not $.role_services %>\n\n check_features:\n on-success: build_feature_dict\n publish:\n # The role supports the DPDK feature if the NeutronDatapathType parameter is present\n dpdk: <% let(resources => $.heat_resource_tree.resources) -> $.role_services.any($.get('parameters', []).contains('NeutronDatapathType') or $.get('resources', []).select($resources.get($)).any($.get('parameters', []).contains('NeutronDatapathType'))) %>\n\n # The role supports the DPDK feature in ODL if the OvsEnableDpdk parameter value is true in role parameters.\n odl_dpdk: <% let(role => $.role_name) -> $.heat_resource_tree.parameters.get(concat($role, 'Parameters'), {}).get('default', {}).get('OvsEnableDpdk', false) %>\n\n # The 
role supports the SRIOV feature if it includes NeutronSriovAgent services.\n sriov: <% $.role_services.any($.get('type', '').endsWith('::NeutronSriovAgent')) %>\n\n # The role supports the HCI feature if it includes both NovaCompute and CephOSD services.\n hci: <% $.role_services.any($.get('type', '').endsWith('::NovaCompute')) and $.role_services.any($.get('type', '').endsWith('::CephOSD')) %>\n\n build_feature_dict:\n on-success: filter_features\n publish:\n feature_dict: <% dict(DPDK => ($.dpdk or $.odl_dpdk), SRIOV => $.sriov, HOST => ($.dpdk or $.odl_dpdk or $.sriov), HCI => $.hci) %>\n\n filter_features:\n publish:\n # The list of features that are enabled (i.e. are true in the feature_dict).\n role_features: <% let(feature_dict => $.feature_dict)-> $feature_dict.keys().where($feature_dict[$]) %>\n\n set_status_failed_get_resource_chains:\n publish:\n message: <% 'Unable to locate any resource chains in the heat resource tree' %>\n on-success: fail\n\n set_status_failed_get_role_chain:\n publish:\n message: <% \"Unable to determine the service chain resource for role '{0}'\".format($.role_name) %>\n on-success: fail\n\n set_status_failed_get_service_chain:\n publish:\n message: <% \"Unable to determine the service chain for role '{0}'\".format($.role_name) %>\n on-success: fail\n\n set_status_failed_get_role_services:\n publish:\n message: <% \"Unable to determine list of services for role '{0}'\".format($.role_name) %>\n on-success: fail\n", "name": "tripleo.derive_params.v1", "tags": [], "created_at": "2018-08-21 13:36:23", "scope": "private", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "fc09b688-7226-4f08-9238-be5f4e6f8422"} > >2018-08-21 16:36:23,974 DEBUG: HTTP POST http://192.168.24.1:8989/v2/workbooks 201 >2018-08-21 16:36:23,978 DEBUG: REQ: curl -g -i -X POST http://192.168.24.1:8989/v2/workbooks -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "content-type: text/plain" -H "X-Auth-Token: 
{SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" -d '--- >version: '2.0' >name: tripleo.baremetal.v1 >description: TripleO Baremetal Workflows > >workflows: > > set_node_state: > input: > - node_uuid > - state_action > - target_state > - error_states: > # The default includes all failure states, even unused by TripleO. > - 'error' > - 'adopt failed' > - 'clean failed' > - 'deploy failed' > - 'inspect failed' > - 'rescue failed' > > tags: > - tripleo-common-managed > > tasks: > > set_provision_state: > on-success: wait_for_provision_state > on-error: set_provision_state_failed > action: ironic.node_set_provision_state node_uuid=<% $.node_uuid %> state=<% $.state_action %> > > set_provision_state_failed: > publish: > message: <% task(set_provision_state).result %> > on-complete: fail > > wait_for_provision_state: > action: ironic.node_get > input: > node_id: <% $.node_uuid %> > fields: ['provision_state', 'last_error'] > timeout: 1200 #20 minutes > retry: > delay: 3 > count: 400 > continue-on: <% not task().result.provision_state in [$.target_state] + $.error_states %> > on-complete: > - state_not_reached: <% task().result.provision_state != $.target_state %> > > state_not_reached: > publish: > message: >- > Node <% $.node_uuid %> did not reach state "<% $.target_state %>", > the state is "<% task(wait_for_provision_state).result.provision_state %>", > error: <% task(wait_for_provision_state).result.last_error %> > on-complete: fail > > output-on-error: > result: <% $.message %> > > set_power_state: > input: > - node_uuid > - state_action > - target_state > - error_state: 'error' > > tags: > - tripleo-common-managed > > tasks: > > set_power_state: > on-success: wait_for_power_state > on-error: set_power_state_failed > action: ironic.node_set_power_state node_id=<% $.node_uuid %> state=<% $.state_action %> > > set_power_state_failed: > publish: > message: <% task(set_power_state).result %> > on-complete: fail > > wait_for_power_state: > action: ironic.node_get > input: 
> node_id: <% $.node_uuid %> > fields: ['power_state', 'last_error'] > timeout: 120 #2 minutes > retry: > delay: 6 > count: 20 > continue-on: <% not task().result.power_state in [$.target_state, $.error_state] %> > on-complete: > - state_not_reached: <% task().result.power_state != $.target_state %> > > state_not_reached: > publish: > message: >- > Node <% $.node_uuid %> did not reach power state "<% $.target_state %>", > the state is "<% task(wait_for_power_state).result.power_state %>", > error: <% task(wait_for_power_state).result.last_error %> > on-complete: fail > > output-on-error: > result: <% $.message %> > > manual_cleaning: > input: > - node_uuid > - clean_steps > - timeout: 7200 # 2 hours (cleaning can take really long) > - retry_delay: 10 > - retry_count: 720 > - queue_name: tripleo > > tags: > - tripleo-common-managed > > tasks: > > set_provision_state: > on-success: wait_for_provision_state > on-error: set_provision_state_failed > action: ironic.node_set_provision_state node_uuid=<% $.node_uuid %> state='clean' cleansteps=<% $.clean_steps %> > > set_provision_state_failed: > on-success: send_message > publish: > status: FAILED > message: <% task(set_provision_state).result %> > > wait_for_provision_state: > on-success: send_message > action: ironic.node_get node_id=<% $.node_uuid %> > timeout: <% $.timeout %> > retry: > delay: <% $.retry_delay %> > count: <% $.retry_count %> > continue-on: <% task().result.provision_state != 'manageable' %> > > send_message: > action: zaqar.queue_post > retry: count=5 delay=1 > input: > queue_name: <% $.queue_name %> > messages: > body: > type: tripleo.baremetal.v1.manual_cleaning > payload: > status: <% $.get('status', 'SUCCESS') %> > message: <% $.get('message', '') %> > execution: <% execution() %> > on-success: > - fail: <% $.get('status') = "FAILED" %> > > validate_nodes: > description: Validate nodes JSON > > input: > - nodes_json > - queue_name: tripleo > > tags: > - tripleo-common-managed > > tasks: > > 
validate_nodes: > action: tripleo.baremetal.validate_nodes > on-success: send_message > on-error: validation_failed > input: > nodes_json: <% $.nodes_json %> > > validation_failed: > on-success: send_message > publish: > status: FAILED > message: <% task(validate_nodes).result %> > > send_message: > action: zaqar.queue_post > retry: count=5 delay=1 > input: > queue_name: <% $.queue_name %> > messages: > body: > type: tripleo.baremetal.v1.validate_nodes > payload: > status: <% $.get('status', 'SUCCESS') %> > message: <% $.get('message', '') %> > execution: <% execution() %> > on-success: > - fail: <% $.get('status') = "FAILED" %> > > register_or_update: > description: Take nodes JSON and create nodes in a "manageable" state > > input: > - nodes_json > - remove: False > - queue_name: tripleo > - kernel_name: null > - ramdisk_name: null > - instance_boot_option: local > - initial_state: manageable > > tags: > - tripleo-common-managed > > tasks: > > validate_input: > workflow: tripleo.baremetal.v1.validate_nodes > on-success: register_or_update_nodes > on-error: validation_failed > input: > nodes_json: <% $.nodes_json %> > queue_name: <% $.queue_name %> > > validation_failed: > on-success: send_message > publish: > status: FAILED > message: <% task(validate_input).result %> > registered_nodes: [] > > register_or_update_nodes: > action: tripleo.baremetal.register_or_update_nodes > on-success: > - set_nodes_managed: <% $.initial_state != "enroll" %> > - send_message: <% $.initial_state = "enroll" %> > on-error: set_status_failed_register_or_update_nodes > input: > nodes_json: <% $.nodes_json %> > remove: <% $.remove %> > kernel_name: <% $.kernel_name %> > ramdisk_name: <% $.ramdisk_name %> > instance_boot_option: <% $.instance_boot_option %> > publish: > registered_nodes: <% task().result %> > new_nodes: <% task().result.where($.provision_state = 'enroll') %> > > set_status_failed_register_or_update_nodes: > on-success: send_message > publish: > status: FAILED > message: 
<% task(register_or_update_nodes).result %> > registered_nodes: [] > > set_nodes_managed: > on-success: > - set_nodes_available: <% $.initial_state = "available" %> > - send_message: <% $.initial_state != "available" %> > on-error: set_status_failed_nodes_managed > workflow: tripleo.baremetal.v1.manage > input: > node_uuids: <% $.new_nodes.uuid %> > queue_name: <% $.queue_name %> > publish: > status: SUCCESS > message: <% $.new_nodes.len() %> node(s) successfully moved to the "manageable" state. > > set_status_failed_nodes_managed: > on-success: send_message > publish: > status: FAILED > message: <% task(set_nodes_managed).result %> > > set_nodes_available: > on-success: send_message > on-error: set_status_failed_nodes_available > workflow: tripleo.baremetal.v1.provide node_uuids=<% $.new_nodes.uuid %> queue_name=<% $.queue_name %> > publish: > status: SUCCESS > message: <% $.new_nodes.len() %> node(s) successfully moved to the "available" state. > > set_status_failed_nodes_available: > on-success: send_message > publish: > status: FAILED > message: <% task(set_nodes_available).result %> > > send_message: > action: zaqar.queue_post > retry: count=5 delay=1 > input: > queue_name: <% $.queue_name %> > messages: > body: > type: tripleo.baremetal.v1.register_or_update > payload: > status: <% $.get('status', 'SUCCESS') %> > message: <% $.get('message', '') %> > execution: <% execution() %> > registered_nodes: <% $.registered_nodes or [] %> > on-success: > - fail: <% $.get('status') = "FAILED" %> > > provide: > description: Take a list of nodes and move them to "available" > > input: > - node_uuids > - queue_name: tripleo > > tags: > - tripleo-common-managed > > tasks: > > set_nodes_available: > on-success: cell_v2_discover_hosts > on-error: set_status_failed_nodes_available > with-items: uuid in <% $.node_uuids %> > workflow: tripleo.baremetal.v1.set_node_state > input: > node_uuid: <% $.uuid %> > queue_name: <% $.queue_name %> > state_action: 'provide' > target_state: 
'available' > > set_status_failed_nodes_available: > on-success: send_message > publish: > status: FAILED > message: <% task(set_nodes_available).result %> > > cell_v2_discover_hosts: > on-success: try_power_off > on-error: cell_v2_discover_hosts_failed > workflow: tripleo.baremetal.v1.cellv2_discovery > input: > node_uuids: <% $.node_uuids %> > queue_name: <% $.queue_name %> > timeout: 900 #15 minutes > retry: > delay: 30 > count: 30 > > cell_v2_discover_hosts_failed: > on-success: send_message > publish: > status: FAILED > message: <% task(cell_v2_discover_hosts).result %> > > try_power_off: > on-success: send_message > on-error: power_off_failed > with-items: uuid in <% $.node_uuids %> > workflow: tripleo.baremetal.v1.set_power_state > input: > node_uuid: <% $.uuid %> > queue_name: <% $.queue_name %> > state_action: 'off' > target_state: 'power off' > publish: > status: SUCCESS > message: <% $.node_uuids.len() %> node(s) successfully moved to the "available" state. > > power_off_failed: > on-success: send_message > publish: > status: FAILED > message: <% task(try_power_off).result %> > > send_message: > action: zaqar.queue_post > retry: count=5 delay=1 > input: > queue_name: <% $.queue_name %> > messages: > body: > type: tripleo.baremetal.v1.provide > payload: > status: <% $.get('status', 'SUCCESS') %> > message: <% $.get('message', '') %> > execution: <% execution() %> > on-success: > - fail: <% $.get('status') = "FAILED" %> > > provide_manageable_nodes: > description: Provide all nodes in a 'manageable' state. 
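[editor's note] The `wait_for_provision_state` / `wait_for_power_state` tasks in the workbook above are bounded polls: re-read the node up to `count` times with a fixed delay, stop early once the state is the target or one of the error states, then fail if the target was not reached. A rough Python equivalent of that retry block, where `get_state` is a stand-in callable for the `ironic.node_get` action:

```python
import time

def wait_for_state(get_state, target, error_states, count=400, delay=3):
    """Poll until the node reaches `target` or lands in an error state.

    Mirrors the Mistral retry policy: `continue-on` keeps polling while the
    state is in neither the target nor the error list, and the
    state_not_reached task fails the workflow if polling stops on anything
    other than the target state.
    """
    state = None
    for _ in range(count):
        state = get_state()
        if state in [target] + list(error_states):
            break  # terminal state reached; stop retrying
        time.sleep(delay)
    if state != target:
        raise RuntimeError(
            'Node did not reach state "%s", the state is "%s"' % (target, state))
    return state
```

With the defaults in `set_node_state` (delay 3, count 400) this gives the 20-minute window the workflow comments describe; `set_power_state` uses the tighter delay 6 / count 20 for its 2-minute window.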
> > input: > - queue_name: tripleo > > tags: > - tripleo-common-managed > > tasks: > > get_manageable_nodes: > action: ironic.node_list maintenance=False associated=False > on-success: provide_manageable > on-error: set_status_failed_get_manageable_nodes > publish: > managed_nodes: <% task().result.where($.provision_state = 'manageable').uuid %> > > set_status_failed_get_manageable_nodes: > on-success: send_message > publish: > status: FAILED > message: <% task(get_manageable_nodes).result %> > > provide_manageable: > on-success: send_message > workflow: tripleo.baremetal.v1.provide > input: > node_uuids: <% $.managed_nodes %> > queue_name: <% $.queue_name %> > publish: > status: SUCCESS > > send_message: > action: zaqar.queue_post > retry: count=5 delay=1 > input: > queue_name: <% $.queue_name %> > messages: > body: > type: tripleo.baremetal.v1.provide_manageable_nodes > payload: > status: <% $.get('status', 'SUCCESS') %> > message: <% $.get('message', '') %> > execution: <% execution() %> > on-success: > - fail: <% $.get('status') = "FAILED" %> > > manage: > description: Set a list of nodes to 'manageable' state > > input: > - node_uuids > - queue_name: tripleo > > tags: > - tripleo-common-managed > > tasks: > > set_nodes_manageable: > on-success: send_message > on-error: set_status_failed_nodes_manageable > with-items: uuid in <% $.node_uuids %> > workflow: tripleo.baremetal.v1.set_node_state > input: > node_uuid: <% $.uuid %> > state_action: 'manage' > target_state: 'manageable' > error_states: > # node going back to enroll designates power credentials failure > - 'enroll' > - 'error' > > set_status_failed_nodes_manageable: > on-success: send_message > publish: > status: FAILED > message: <% task(set_nodes_manageable).result %> > > send_message: > action: zaqar.queue_post > retry: count=5 delay=1 > input: > queue_name: <% $.queue_name %> > messages: > body: > type: tripleo.baremetal.v1.manage > payload: > status: <% $.get('status', 'SUCCESS') %> > message: <% 
$.get('message', '') %> > execution: <% execution() %> > on-success: > - fail: <% $.get('status') = "FAILED" %> > > _introspect: > description: > > An internal workflow. The tripleo.baremetal.v1.introspect workflow > should be used for introspection. > > input: > - node_uuid > - timeout > - queue_name > > output: > result: <% task(start_introspection).result %> > > tags: > - tripleo-common-managed > > tasks: > start_introspection: > action: baremetal_introspection.introspect uuid=<% $.node_uuid %> > on-success: wait_for_introspection_to_finish > on-error: set_status_failed_start_introspection > > set_status_failed_start_introspection: > publish: > status: FAILED > message: <% task(start_introspection).result %> > introspected_nodes: [] > on-success: send_message > > wait_for_introspection_to_finish: > action: baremetal_introspection.wait_for_finish > input: > uuids: <% [$.node_uuid] %> > # The interval is 10 seconds, so divide to make the overall timeout > # in seconds correct. > max_retries: <% $.timeout / 10 %> > retry_interval: 10 > publish: > introspected_node: <% task().result.values().first() %> > status: <% bool(task().result.values().first().error) and "FAILED" or "SUCCESS" %> > publish-on-error: > status: FAILED > message: <% task().result %> > on-success: wait_for_introspection_to_finish_success > on-error: wait_for_introspection_to_finish_error > > wait_for_introspection_to_finish_success: > publish: > message: <% "Introspection of node {0} completed. Status:{1}. 
Errors:{2}".format($.introspected_node.uuid, $.status, $.introspected_node.error) %> > on-success: send_message > > wait_for_introspection_to_finish_error: > publish: > message: <% "Introspection of node {0} timed out.".format($.node_uuid) %> > on-success: send_message > > send_message: > action: zaqar.queue_post > retry: count=5 delay=1 > input: > queue_name: <% $.queue_name %> > messages: > body: > type: tripleo.baremetal.v1._introspect > payload: > status: <% $.status %> > message: <% $.message %> > introspected_node: <% $.get('introspected_node') %> > node_uuid: <% $.node_uuid %> > execution: <% execution() %> > on-success: > - fail: <% $.get('status') = "FAILED" %> > > introspect: > description: > > Take a list of nodes and move them through introspection. > > By default each node will attempt introspection up to 3 times (two > retries plus the initial attempt) if it fails. This behaviour can be > modified by changing the max_retry_attempts input. > > The workflow will assume the node has timed out after 20 minutes (1200 > seconds). This can be changed by passing the node_timeout input in > seconds. 
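[editor's note] With the defaults shown above (`max_retry_attempts: 2`), the `introspect` workflow's counter starts at 1 and a retry is taken while `introspection_attempt <= max_retry_attempts`, giving at most three runs per node, and only the nodes that failed are fed back into `introspect_nodes`. A sketch of that attempt accounting in Python, where `introspect_once` is a hypothetical stand-in for one pass of the `_introspect` sub-workflow that returns the UUIDs that failed:

```python
def introspect_with_retries(node_uuids, introspect_once, max_retry_attempts=2):
    """Retry failed nodes, mirroring the introspect workflow's counter.

    introspect_once(uuids) -> list of uuids that failed this pass.
    Attempt 1 is the initial run; up to max_retry_attempts extra
    attempts follow, each restricted to the previously failed nodes.
    """
    attempt = 1
    failed = introspect_once(node_uuids)
    while failed and attempt <= max_retry_attempts:
        attempt += 1
        failed = introspect_once(failed)  # retry_failed_nodes narrows the list
    if failed:
        # Corresponds to the max_retry_attempts_reached task.
        raise RuntimeError(
            "Retry limit reached with %d nodes still failing introspection"
            % len(failed))
    return attempt
```

The 20-minute figure in the description also explains `max_retries: <% $.timeout / 10 %>` in `_introspect`: the wait action polls every 10 seconds, so the retry count is the timeout divided by the interval.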
> > input: > - node_uuids > - run_validations: False > - queue_name: tripleo > - concurrency: 20 > - max_retry_attempts: 2 > - node_timeout: 1200 > > tags: > - tripleo-common-managed > > task-defaults: > on-error: unhandled_error > > tasks: > initialize: > publish: > introspection_attempt: 1 > on-complete: > - run_validations: <% $.run_validations %> > - introspect_nodes: <% not $.run_validations %> > > run_validations: > workflow: tripleo.validations.v1.run_groups > input: > group_names: > - 'pre-introspection' > queue_name: <% $.queue_name %> > on-success: introspect_nodes > on-error: set_validations_failed > > set_validations_failed: > on-success: send_message > publish: > status: FAILED > message: <% task(run_validations).result %> > > introspect_nodes: > with-items: uuid in <% $.node_uuids %> > concurrency: <% $.concurrency %> > workflow: _introspect > input: > node_uuid: <% $.uuid %> > queue_name: <% $.queue_name %> > timeout: <% $.node_timeout %> > # on-error is triggered if one or more nodes failed introspection. We > # still go to get_introspection_status as it will collect the result > # for each node. Unless we hit the retry limit. > on-error: > - get_introspection_status: <% $.introspection_attempt <= $.max_retry_attempts %> > - max_retry_attempts_reached: <% $.introspection_attempt > $.max_retry_attempts %> > on-success: get_introspection_status > > get_introspection_status: > with-items: uuid in <% $.node_uuids %> > action: baremetal_introspection.get_status > input: > uuid: <% $.uuid %> > publish: > introspected_nodes: <% task().result.toDict($.uuid, $) %> > # Currently there is no way for us to ignore user introspection > # aborts. This means we will retry aborted nodes until the Ironic API > # gives us more details (error code or a boolean to show aborts etc.) > # If a node hasn't finished, we consider it to be failed. > # TODO(d0ugal): When possible, don't retry introspection of nodes > # that a user manually aborted. 
>          failed_introspection: <% task().result.where($.finished = true and $.error != null).select($.uuid) + task().result.where($.finished = false).select($.uuid) %>
>        publish-on-error:
>          # If a node fails to start introspection, getting the status can fail.
>          # When that happens, the result is a string and the nodes need to be
>          # filtered out.
>          introspected_nodes: <% task().result.where(isDict($)).toDict($.uuid, $) %>
>          # If there was an error, the exception string we get doesn't give us
>          # the UUID. So we use a set difference to find the UUIDs missing in
>          # the results. These are then added to the failed nodes.
>          failed_introspection: <% ($.node_uuids.toSet() - task().result.where(isDict($)).select($.uuid).toSet()) + task().result.where(isDict($)).where($.finished = true and $.error != null).toSet() + task().result.where(isDict($)).where($.finished = false).toSet() %>
>        on-error: increase_attempt_counter
>        on-success:
>          - successful_introspection: <% $.failed_introspection.len() = 0 %>
>          - increase_attempt_counter: <% $.failed_introspection.len() > 0 %>
>
>      increase_attempt_counter:
>        publish:
>          introspection_attempt: <% $.introspection_attempt + 1 %>
>        on-complete:
>          retry_failed_nodes
>
>      retry_failed_nodes:
>        publish:
>          status: RUNNING
>          message: <% 'Retrying {0} nodes that failed introspection. Attempt {1} of {2} '.format($.failed_introspection.len(), $.introspection_attempt, $.max_retry_attempts + 1) %>
>          # We are about to retry, update the tracking stats.
>          node_uuids: <% $.failed_introspection %>
>        on-success:
>          - send_message
>          - introspect_nodes
>
>      max_retry_attempts_reached:
>        publish:
>          status: FAILED
>          message: <% 'Retry limit reached with {0} nodes still failing introspection'.format($.failed_introspection.len()) %>
>        on-complete: send_message
>
>      successful_introspection:
>        publish:
>          status: SUCCESS
>          message: Successfully introspected <% $.introspected_nodes.len() %> node(s).
>        on-complete: send_message
>
>      unhandled_error:
>        publish:
>          status: FAILED
>          message: "Unhandled workflow error"
>        on-complete: send_message
>
>      send_message:
>        action: zaqar.queue_post
>        retry: count=5 delay=1
>        input:
>          queue_name: <% $.queue_name %>
>          messages:
>            body:
>              type: tripleo.baremetal.v1.introspect
>              payload:
>                status: <% $.get('status', 'SUCCESS') %>
>                message: <% $.get('message', '') %>
>                execution: <% execution() %>
>                introspected_nodes: <% $.get('introspected_nodes', []) %>
>                failed_introspection: <% $.get('failed_introspection', []) %>
>        on-success:
>          - fail: <% $.get('status') = "FAILED" %>
>
>  introspect_manageable_nodes:
>    description: Introspect all nodes in a 'manageable' state.
>
>    input:
>      - run_validations: False
>      - queue_name: tripleo
>
>    tags:
>      - tripleo-common-managed
>
>    tasks:
>
>      get_manageable_nodes:
>        action: ironic.node_list maintenance=False associated=False
>        on-success: validate_nodes
>        on-error: set_status_failed_get_manageable_nodes
>        publish:
>          managed_nodes: <% task().result.where($.provision_state = 'manageable').uuid %>
>
>      set_status_failed_get_manageable_nodes:
>        on-success: send_message
>        publish:
>          status: FAILED
>          message: <% task(get_manageable_nodes).result %>
>
>      validate_nodes:
>        on-success:
>          - introspect_manageable: <% $.managed_nodes.len() > 0 %>
>          - set_status_failed_no_nodes: <% $.managed_nodes.len() = 0 %>
>
>      set_status_failed_no_nodes:
>        on-success: send_message
>        publish:
>          status: FAILED
>          message: No manageable nodes to introspect. Check node states and maintenance.
>
>      introspect_manageable:
>        on-success: send_message
>        on-error: set_status_introspect_manageable
>        workflow: tripleo.baremetal.v1.introspect
>        input:
>          node_uuids: <% $.managed_nodes %>
>          run_validations: <% $.run_validations %>
>          queue_name: <% $.queue_name %>
>        publish:
>          introspected_nodes: <% task().result.introspected_nodes %>
>
>      set_status_introspect_manageable:
>        on-success: send_message
>        publish:
>          status: FAILED
>          message: <% task(introspect_manageable).result %>
>          introspected_nodes: []
>
>      send_message:
>        action: zaqar.queue_post
>        retry: count=5 delay=1
>        input:
>          queue_name: <% $.queue_name %>
>          messages:
>            body:
>              type: tripleo.baremetal.v1.introspect_manageable_nodes
>              payload:
>                status: <% $.get('status', 'SUCCESS') %>
>                message: <% $.get('message', '') %>
>                execution: <% execution() %>
>                introspected_nodes: <% $.get('introspected_nodes', []) %>
>        on-success:
>          - fail: <% $.get('status') = "FAILED" %>
>
>  configure:
>    description: Take a list of manageable nodes and update their boot configuration.
>
>    input:
>      - node_uuids
>      - queue_name: tripleo
>      - kernel_name: bm-deploy-kernel
>      - ramdisk_name: bm-deploy-ramdisk
>      - instance_boot_option: null
>      - root_device: null
>      - root_device_minimum_size: 4
>      - overwrite_root_device_hints: False
>
>    tags:
>      - tripleo-common-managed
>
>    tasks:
>
>      configure_boot:
>        on-success: configure_root_device
>        on-error: set_status_failed_configure_boot
>        with-items: node_uuid in <% $.node_uuids %>
>        action: tripleo.baremetal.configure_boot node_uuid=<% $.node_uuid %> kernel_name=<% $.kernel_name %> ramdisk_name=<% $.ramdisk_name %> instance_boot_option=<% $.instance_boot_option %>
>
>      configure_root_device:
>        on-success: send_message
>        on-error: set_status_failed_configure_root_device
>        with-items: node_uuid in <% $.node_uuids %>
>        action: tripleo.baremetal.configure_root_device node_uuid=<% $.node_uuid %> root_device=<% $.root_device %> minimum_size=<% $.root_device_minimum_size %> overwrite=<% $.overwrite_root_device_hints %>
>        publish:
>          status: SUCCESS
>          message: 'Successfully configured the nodes.'
>
>      set_status_failed_configure_boot:
>        on-success: send_message
>        publish:
>          status: FAILED
>          message: <% task(configure_boot).result %>
>
>      set_status_failed_configure_root_device:
>        on-success: send_message
>        publish:
>          status: FAILED
>          message: <% task(configure_root_device).result %>
>
>      send_message:
>        action: zaqar.queue_post
>        retry: count=5 delay=1
>        input:
>          queue_name: <% $.queue_name %>
>          messages:
>            body:
>              type: tripleo.baremetal.v1.configure
>              payload:
>                status: <% $.get('status', 'SUCCESS') %>
>                message: <% $.get('message', '') %>
>                execution: <% execution() %>
>        on-success:
>          - fail: <% $.get('status') = "FAILED" %>
>
>  configure_manageable_nodes:
>    description: Update the boot configuration of all nodes in 'manageable' state.
>
>    input:
>      - queue_name: tripleo
>      - kernel_name: 'bm-deploy-kernel'
>      - ramdisk_name: 'bm-deploy-ramdisk'
>      - instance_boot_option: null
>      - root_device: null
>      - root_device_minimum_size: 4
>      - overwrite_root_device_hints: False
>
>    tags:
>      - tripleo-common-managed
>
>    tasks:
>
>      get_manageable_nodes:
>        action: ironic.node_list maintenance=False associated=False
>        on-success: configure_manageable
>        on-error: set_status_failed_get_manageable_nodes
>        publish:
>          managed_nodes: <% task().result.where($.provision_state = 'manageable').uuid %>
>
>      configure_manageable:
>        on-success: send_message
>        on-error: set_status_failed_configure_manageable
>        workflow: tripleo.baremetal.v1.configure
>        input:
>          node_uuids: <% $.managed_nodes %>
>          queue_name: <% $.queue_name %>
>          kernel_name: <% $.kernel_name %>
>          ramdisk_name: <% $.ramdisk_name %>
>          instance_boot_option: <% $.instance_boot_option %>
>          root_device: <% $.root_device %>
>          root_device_minimum_size: <% $.root_device_minimum_size %>
>          overwrite_root_device_hints: <% $.overwrite_root_device_hints %>
>        publish:
>          message: 'Manageable nodes configured successfully.'
>
>      set_status_failed_configure_manageable:
>        on-success: send_message
>        publish:
>          status: FAILED
>          message: <% task(configure_manageable).result %>
>
>      set_status_failed_get_manageable_nodes:
>        on-success: send_message
>        publish:
>          status: FAILED
>          message: <% task(get_manageable_nodes).result %>
>
>      send_message:
>        action: zaqar.queue_post
>        retry: count=5 delay=1
>        input:
>          queue_name: <% $.queue_name %>
>          messages:
>            body:
>              type: tripleo.baremetal.v1.configure_manageable_nodes
>              payload:
>                status: <% $.get('status', 'SUCCESS') %>
>                message: <% $.get('message', '') %>
>                execution: <% execution() %>
>        on-success:
>          - fail: <% $.get('status') = "FAILED" %>
>
>  tag_node:
>    description: Tag a node with a role
>    input:
>      - node_uuid
>      - role: null
>      - queue_name: tripleo
>
>    task-defaults:
>      on-error: send_message
>
>    tags:
>      - tripleo-common-managed
>
>    tasks:
>
>      update_node:
>        on-success: send_message
>        action: tripleo.baremetal.update_node_capability node_uuid=<% $.node_uuid %> capability='profile' value=<% $.role %>
>        publish:
>          message: <% task().result %>
>          status: SUCCESS
>
>      send_message:
>        action: zaqar.queue_post
>        retry: count=5 delay=1
>        input:
>          queue_name: <% $.queue_name %>
>          messages:
>            body:
>              type: tripleo.baremetal.v1.tag_node
>              payload:
>                status: <% $.get('status', 'FAILED') %>
>                message: <% $.get('message', '') %>
>                execution: <% execution() %>
>        on-success:
>          - fail: <% $.get('status') = "FAILED" %>
>
>  tag_nodes:
>    description: Runs the tag_node workflow in a loop
>    input:
>      - tag_node_uuids
>      - untag_node_uuids
>      - role
>      - plan: overcloud
>      - queue_name: tripleo
>
>    task-defaults:
>      on-error: send_message
>
>    tags:
>      - tripleo-common-managed
>
>    tasks:
>
>      tag_nodes:
>        with-items: node_uuid in <% $.tag_node_uuids %>
>        workflow: tripleo.baremetal.v1.tag_node
>        input:
>          node_uuid: <% $.node_uuid %>
>          queue_name: <% $.queue_name %>
>          role: <% $.role %>
>        concurrency: 1
>        on-success: untag_nodes
>
>      untag_nodes:
>        with-items: node_uuid in <%
          $.untag_node_uuids %>
>        workflow: tripleo.baremetal.v1.tag_node
>        input:
>          node_uuid: <% $.node_uuid %>
>          queue_name: <% $.queue_name %>
>        concurrency: 1
>        on-success: update_role_parameters
>
>      update_role_parameters:
>        on-success: send_message
>        action: tripleo.parameters.update_role role=<% $.role %> container=<% $.plan %>
>        publish:
>          message: <% task().result %>
>          status: SUCCESS
>
>      send_message:
>        action: zaqar.queue_post
>        retry: count=5 delay=1
>        input:
>          queue_name: <% $.queue_name %>
>          messages:
>            body:
>              type: tripleo.baremetal.v1.tag_nodes
>              payload:
>                status: <% $.get('status', 'FAILED') %>
>                message: <% $.get('message', '') %>
>                execution: <% execution() %>
>        on-success:
>          - fail: <% $.get('status') = "FAILED" %>
>
>  nodes_with_hint:
>    description: Find nodes matching a hint regex
>    input:
>      - hint_regex
>      - queue_name: tripleo
>
>    tags:
>      - tripleo-common-managed
>
>    tasks:
>      get_nodes:
>        with-items: provision_state in <% ['available', 'active'] %>
>        action: ironic.node_list maintenance=false provision_state=<% $.provision_state %> detail=true
>        on-success: get_matching_nodes
>        on-error: set_status_failed_get_nodes
>
>      get_matching_nodes:
>        with-items: node in <% task(get_nodes).result.flatten() %>
>        action: tripleo.baremetal.get_node_hint node=<% $.node %>
>        on-success: send_message
>        on-error: set_status_failed_get_matching_nodes
>        publish:
>          matching_nodes: <% let(hint_regex => $.hint_regex) -> task().result.where($.hint and $.hint.matches($hint_regex)).uuid %>
>
>      set_status_failed_get_nodes:
>        on-success: send_message
>        publish:
>          status: FAILED
>          message: <% task(get_nodes).result %>
>
>      set_status_failed_get_matching_nodes:
>        on-success: send_message
>        publish:
>          status: FAILED
>          message: <% task(get_matching_nodes).result %>
>
>      send_message:
>        action: zaqar.queue_post
>        retry: count=5 delay=1
>        input:
>          queue_name: <% $.queue_name %>
>          messages:
>            body:
>              type: tripleo.baremetal.v1.nodes_with_hint
>              payload:
>                status: <% $.get('status',
                  'SUCCESS') %>
>                message: <% $.get('message', '') %>
>                execution: <% execution() %>
>                matching_nodes: <% $.matching_nodes or [] %>
>        on-success:
>          - fail: <% $.get('status') = "FAILED" %>
>
>  nodes_with_profile:
>    description: Find nodes with a specific profile
>    input:
>      - profile
>      - queue_name: tripleo
>
>    tags:
>      - tripleo-common-managed
>
>    tasks:
>      get_active_nodes:
>        action: ironic.node_list maintenance=false provision_state='active' detail=true
>        on-success: get_available_nodes
>        on-error: set_status_failed_get_active_nodes
>
>      get_available_nodes:
>        action: ironic.node_list maintenance=false provision_state='available' detail=true
>        on-success: get_matching_nodes
>        on-error: set_status_failed_get_available_nodes
>
>      get_matching_nodes:
>        with-items: node in <% task(get_available_nodes).result + task(get_active_nodes).result %>
>        action: tripleo.baremetal.get_profile node=<% $.node %>
>        on-success: send_message
>        on-error: set_status_failed_get_matching_nodes
>        publish:
>          matching_nodes: <% let(input_profile_name => $.profile) -> task().result.where($.profile = $input_profile_name).uuid %>
>
>      set_status_failed_get_active_nodes:
>        on-success: send_message
>        publish:
>          status: FAILED
>          message: <% task(get_active_nodes).result %>
>
>      set_status_failed_get_available_nodes:
>        on-success: send_message
>        publish:
>          status: FAILED
>          message: <% task(get_available_nodes).result %>
>
>      set_status_failed_get_matching_nodes:
>        on-success: send_message
>        publish:
>          status: FAILED
>          message: <% task(get_matching_nodes).result %>
>
>      send_message:
>        action: zaqar.queue_post
>        retry: count=5 delay=1
>        input:
>          queue_name: <% $.queue_name %>
>          messages:
>            body:
>              type: tripleo.baremetal.v1.nodes_with_profile
>              payload:
>                status: <% $.get('status', 'SUCCESS') %>
>                message: <% $.get('message', '') %>
>                execution: <% execution() %>
>                matching_nodes: <% $.matching_nodes or [] %>
>        on-success:
>          - fail: <% $.get('status') = "FAILED" %>
>
>  create_raid_configuration:
>    description: Create and apply RAID configuration for given nodes
>    input:
>      - node_uuids
>      - configuration
>      - queue_name: tripleo
>
>    tags:
>      - tripleo-common-managed
>
>    tasks:
>
>      set_configuration:
>        with-items: node_uuid in <% $.node_uuids %>
>        action: ironic.node_set_target_raid_config node_ident=<% $.node_uuid %> target_raid_config=<% $.configuration %>
>        on-success: apply_configuration
>        on-error: set_configuration_failed
>
>      set_configuration_failed:
>        on-success: send_message
>        publish:
>          status: FAILED
>          message: <% task(set_configuration).result %>
>
>      apply_configuration:
>        with-items: node_uuid in <% $.node_uuids %>
>        workflow: tripleo.baremetal.v1.manual_cleaning
>        input:
>          node_uuid: <% $.node_uuid %>
>          clean_steps:
>            - interface: raid
>              step: delete_configuration
>            - interface: raid
>              step: create_configuration
>          timeout: 1800  # building RAID should be faster than general cleaning
>          retry_count: 180
>          retry_delay: 10
>        on-success: send_message
>        on-error: apply_configuration_failed
>        publish:
>          message: <% task().result %>
>          status: SUCCESS
>
>      apply_configuration_failed:
>        on-success: send_message
>        publish:
>          status: FAILED
>          message: <% task(apply_configuration).result %>
>
>      send_message:
>        action: zaqar.queue_post
>        retry: count=5 delay=1
>        input:
>          queue_name: <% $.queue_name %>
>          messages:
>            body:
>              type: tripleo.baremetal.v1.create_raid_configuration
>              payload:
>                status: <% $.get('status', 'FAILED') %>
>                message: <% $.get('message', '') %>
>                execution: <% execution() %>
>        on-success:
>          - fail: <% $.get('status') = "FAILED" %>
>
>
>  cellv2_discovery:
>    description: Run cell_v2 host discovery
>
>    input:
>      - node_uuids
>      - queue_name: tripleo
>
>    tags:
>      - tripleo-common-managed
>
>    tasks:
>
>      cell_v2_discover_hosts:
>        on-success: wait_for_nova_resources
>        on-error: cell_v2_discover_hosts_failed
>        action: tripleo.baremetal.cell_v2_discover_hosts
>
>      cell_v2_discover_hosts_failed:
>        on-success: send_message
>        publish:
>          status: FAILED
>          message:
            <% task(cell_v2_discover_hosts).result %>
>
>      wait_for_nova_resources:
>        on-success: send_message
>        on-error: wait_for_nova_resources_failed
>        with-items: node_uuid in <% $.node_uuids %>
>        action: nova.hypervisors_find hypervisor_hostname=<% $.node_uuid %>
>
>      wait_for_nova_resources_failed:
>        on-success: send_message
>        publish:
>          status: FAILED
>          message: <% task(wait_for_nova_resources).result %>
>
>      send_message:
>        action: zaqar.queue_post
>        retry: count=5 delay=1
>        input:
>          queue_name: <% $.queue_name %>
>          messages:
>            body:
>              type: tripleo.baremetal.v1.cellv2_discovery
>              payload:
>                status: <% $.get('status', 'SUCCESS') %>
>                message: <% $.get('message', '') %>
>                execution: <% execution() %>
>        on-success:
>          - fail: <% $.get('status') = "FAILED" %>
>
>
>  discover_nodes:
>    description: Run nodes discovery over the given IP range
>
>    input:
>      - ip_addresses
>      - credentials
>      - ports: [623]
>      - queue_name: tripleo
>
>    tags:
>      - tripleo-common-managed
>
>    tasks:
>
>      get_all_nodes:
>        action: ironic.node_list
>        input:
>          fields: ["uuid", "driver", "driver_info"]
>          limit: 0
>        on-success: get_candidate_nodes
>        on-error: get_all_nodes_failed
>        publish:
>          existing_nodes: <% task().result %>
>
>      get_all_nodes_failed:
>        on-success: send_message
>        publish:
>          status: FAILED
>          message: <% task(get_all_nodes).result %>
>
>      get_candidate_nodes:
>        action: tripleo.baremetal.get_candidate_nodes
>        input:
>          ip_addresses: <% $.ip_addresses %>
>          credentials: <% $.credentials %>
>          ports: <% $.ports %>
>          existing_nodes: <% $.existing_nodes %>
>        on-success: probe_nodes
>        on-error: get_candidate_nodes_failed
>        publish:
>          candidates: <% task().result %>
>
>      get_candidate_nodes_failed:
>        on-success: send_message
>        publish:
>          status: FAILED
>          message: <% task(get_candidate_nodes).result %>
>
>      probe_nodes:
>        action: tripleo.baremetal.probe_node
>        on-success: send_message
>        on-error: probe_nodes_failed
>        input:
>          ip: <% $.node.ip %>
>          port: <% $.node.port %>
>          username: <% $.node.username %>
>          password: <% $.node.password %>
>        with-items:
>          - node in <% $.candidates %>
>        publish:
>          nodes_json: <% task().result.where($ != null) %>
>
>      probe_nodes_failed:
>        on-success: send_message
>        publish:
>          status: FAILED
>          message: <% task(probe_nodes).result %>
>
>      send_message:
>        action: zaqar.queue_post
>        retry: count=5 delay=1
>        input:
>          queue_name: <% $.queue_name %>
>          messages:
>            body:
>              type: tripleo.baremetal.v1.discover_nodes
>              payload:
>                status: <% $.get('status', 'SUCCESS') %>
>                message: <% $.get('message', '') %>
>                execution: <% execution() %>
>                nodes_json: <% $.get('nodes_json', []) %>
>        on-success:
>          - fail: <% $.get('status') = "FAILED" %>
>
>  discover_and_enroll_nodes:
>    description: Run nodes discovery over the given IP range and enroll nodes
>
>    input:
>      - ip_addresses
>      - credentials
>      - ports: [623]
>      - kernel_name: null
>      - ramdisk_name: null
>      - instance_boot_option: local
>      - initial_state: manageable
>      - queue_name: tripleo
>
>    tags:
>      - tripleo-common-managed
>
>    tasks:
>
>      discover_nodes:
>        workflow: tripleo.baremetal.v1.discover_nodes
>        input:
>          ip_addresses: <% $.ip_addresses %>
>          ports: <% $.ports %>
>          credentials: <% $.credentials %>
>          queue_name: <% $.queue_name %>
>        on-success: enroll_nodes
>        on-error: discover_nodes_failed
>        publish:
>          nodes_json: <% task().result.nodes_json %>
>
>      discover_nodes_failed:
>        on-success: send_message
>        publish:
>          status: FAILED
>          message: <% task(discover_nodes).result %>
>
>      enroll_nodes:
>        workflow: tripleo.baremetal.v1.register_or_update
>        input:
>          nodes_json: <% $.nodes_json %>
>          kernel_name: <% $.kernel_name %>
>          ramdisk_name: <% $.ramdisk_name %>
>          instance_boot_option: <% $.instance_boot_option %>
>          initial_state: <% $.initial_state %>
>        on-success: send_message
>        on-error: enroll_nodes_failed
>        publish:
>          registered_nodes: <% task().result.registered_nodes %>
>
>      enroll_nodes_failed:
>        on-success: send_message
>        publish:
>          status: FAILED
>          message: <% task(enroll_nodes).result %>
>
>      send_message:
> action: zaqar.queue_post > retry: count=5 delay=1 > input: > queue_name: <% $.queue_name %> > messages: > body: > type: tripleo.baremetal.v1.discover_and_enroll_nodes > payload: > status: <% $.get('status', 'SUCCESS') %> > message: <% $.get('message', '') %> > execution: <% execution() %> > registered_nodes: <% $.get('registered_nodes', []) %> > on-success: > - fail: <% $.get('status') = "FAILED" %> >' >2018-08-21 16:36:42,606 DEBUG: http://192.168.24.1:8989 "POST /v2/workbooks HTTP/1.1" 201 45010 >2018-08-21 16:36:42,657 DEBUG: RESP: [201] Content-Length: 45010 Content-Type: application/json Date: Tue, 21 Aug 2018 13:36:42 GMT Connection: keep-alive >RESP BODY: {"definition": "---\nversion: '2.0'\nname: tripleo.baremetal.v1\ndescription: TripleO Baremetal Workflows\n\nworkflows:\n\n set_node_state:\n input:\n - node_uuid\n - state_action\n - target_state\n - error_states:\n # The default includes all failure states, even unused by TripleO.\n - 'error'\n - 'adopt failed'\n - 'clean failed'\n - 'deploy failed'\n - 'inspect failed'\n - 'rescue failed'\n\n tags:\n - tripleo-common-managed\n\n tasks:\n\n set_provision_state:\n on-success: wait_for_provision_state\n on-error: set_provision_state_failed\n action: ironic.node_set_provision_state node_uuid=<% $.node_uuid %> state=<% $.state_action %>\n\n set_provision_state_failed:\n publish:\n message: <% task(set_provision_state).result %>\n on-complete: fail\n\n wait_for_provision_state:\n action: ironic.node_get\n input:\n node_id: <% $.node_uuid %>\n fields: ['provision_state', 'last_error']\n timeout: 1200 #20 minutes\n retry:\n delay: 3\n count: 400\n continue-on: <% not task().result.provision_state in [$.target_state] + $.error_states %>\n on-complete:\n - state_not_reached: <% task().result.provision_state != $.target_state %>\n\n state_not_reached:\n publish:\n message: >-\n Node <% $.node_uuid %> did not reach state \"<% $.target_state %>\",\n the state is \"<% 
task(wait_for_provision_state).result.provision_state %>\",\n error: <% task(wait_for_provision_state).result.last_error %>\n on-complete: fail\n\n output-on-error:\n result: <% $.message %>\n\n set_power_state:\n input:\n - node_uuid\n - state_action\n - target_state\n - error_state: 'error'\n\n tags:\n - tripleo-common-managed\n\n tasks:\n\n set_power_state:\n on-success: wait_for_power_state\n on-error: set_power_state_failed\n action: ironic.node_set_power_state node_id=<% $.node_uuid %> state=<% $.state_action %>\n\n set_power_state_failed:\n publish:\n message: <% task(set_power_state).result %>\n on-complete: fail\n\n wait_for_power_state:\n action: ironic.node_get\n input:\n node_id: <% $.node_uuid %>\n fields: ['power_state', 'last_error']\n timeout: 120 #2 minutes\n retry:\n delay: 6\n count: 20\n continue-on: <% not task().result.power_state in [$.target_state, $.error_state] %>\n on-complete:\n - state_not_reached: <% task().result.power_state != $.target_state %>\n\n state_not_reached:\n publish:\n message: >-\n Node <% $.node_uuid %> did not reach power state \"<% $.target_state %>\",\n the state is \"<% task(wait_for_power_state).result.power_state %>\",\n error: <% task(wait_for_power_state).result.last_error %>\n on-complete: fail\n\n output-on-error:\n result: <% $.message %>\n\n manual_cleaning:\n input:\n - node_uuid\n - clean_steps\n - timeout: 7200 # 2 hours (cleaning can take really long)\n - retry_delay: 10\n - retry_count: 720\n - queue_name: tripleo\n\n tags:\n - tripleo-common-managed\n\n tasks:\n\n set_provision_state:\n on-success: wait_for_provision_state\n on-error: set_provision_state_failed\n action: ironic.node_set_provision_state node_uuid=<% $.node_uuid %> state='clean' cleansteps=<% $.clean_steps %>\n\n set_provision_state_failed:\n on-success: send_message\n publish:\n status: FAILED\n message: <% task(set_provision_state).result %>\n\n wait_for_provision_state:\n on-success: send_message\n action: ironic.node_get node_id=<% 
$.node_uuid %>\n timeout: <% $.timeout %>\n retry:\n delay: <% $.retry_delay %>\n count: <% $.retry_count %>\n continue-on: <% task().result.provision_state != 'manageable' %>\n\n send_message:\n action: zaqar.queue_post\n retry: count=5 delay=1\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.baremetal.v1.manual_cleaning\n payload:\n status: <% $.get('status', 'SUCCESS') %>\n message: <% $.get('message', '') %>\n execution: <% execution() %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n\n validate_nodes:\n description: Validate nodes JSON\n\n input:\n - nodes_json\n - queue_name: tripleo\n\n tags:\n - tripleo-common-managed\n\n tasks:\n\n validate_nodes:\n action: tripleo.baremetal.validate_nodes\n on-success: send_message\n on-error: validation_failed\n input:\n nodes_json: <% $.nodes_json %>\n\n validation_failed:\n on-success: send_message\n publish:\n status: FAILED\n message: <% task(validate_nodes).result %>\n\n send_message:\n action: zaqar.queue_post\n retry: count=5 delay=1\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.baremetal.v1.validate_nodes\n payload:\n status: <% $.get('status', 'SUCCESS') %>\n message: <% $.get('message', '') %>\n execution: <% execution() %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n\n register_or_update:\n description: Take nodes JSON and create nodes in a \"manageable\" state\n\n input:\n - nodes_json\n - remove: False\n - queue_name: tripleo\n - kernel_name: null\n - ramdisk_name: null\n - instance_boot_option: local\n - initial_state: manageable\n\n tags:\n - tripleo-common-managed\n\n tasks:\n\n validate_input:\n workflow: tripleo.baremetal.v1.validate_nodes\n on-success: register_or_update_nodes\n on-error: validation_failed\n input:\n nodes_json: <% $.nodes_json %>\n queue_name: <% $.queue_name %>\n\n validation_failed:\n on-success: send_message\n publish:\n status: FAILED\n message: <% task(validate_input).result %>\n 
registered_nodes: []\n\n register_or_update_nodes:\n action: tripleo.baremetal.register_or_update_nodes\n on-success:\n - set_nodes_managed: <% $.initial_state != \"enroll\" %>\n - send_message: <% $.initial_state = \"enroll\" %>\n on-error: set_status_failed_register_or_update_nodes\n input:\n nodes_json: <% $.nodes_json %>\n remove: <% $.remove %>\n kernel_name: <% $.kernel_name %>\n ramdisk_name: <% $.ramdisk_name %>\n instance_boot_option: <% $.instance_boot_option %>\n publish:\n registered_nodes: <% task().result %>\n new_nodes: <% task().result.where($.provision_state = 'enroll') %>\n\n set_status_failed_register_or_update_nodes:\n on-success: send_message\n publish:\n status: FAILED\n message: <% task(register_or_update_nodes).result %>\n registered_nodes: []\n\n set_nodes_managed:\n on-success:\n - set_nodes_available: <% $.initial_state = \"available\" %>\n - send_message: <% $.initial_state != \"available\" %>\n on-error: set_status_failed_nodes_managed\n workflow: tripleo.baremetal.v1.manage\n input:\n node_uuids: <% $.new_nodes.uuid %>\n queue_name: <% $.queue_name %>\n publish:\n status: SUCCESS\n message: <% $.new_nodes.len() %> node(s) successfully moved to the \"manageable\" state.\n\n set_status_failed_nodes_managed:\n on-success: send_message\n publish:\n status: FAILED\n message: <% task(set_nodes_managed).result %>\n\n set_nodes_available:\n on-success: send_message\n on-error: set_status_failed_nodes_available\n workflow: tripleo.baremetal.v1.provide node_uuids=<% $.new_nodes.uuid %> queue_name=<% $.queue_name %>\n publish:\n status: SUCCESS\n message: <% $.new_nodes.len() %> node(s) successfully moved to the \"available\" state.\n\n set_status_failed_nodes_available:\n on-success: send_message\n publish:\n status: FAILED\n message: <% task(set_nodes_available).result %>\n\n send_message:\n action: zaqar.queue_post\n retry: count=5 delay=1\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: 
tripleo.baremetal.v1.register_or_update\n payload:\n status: <% $.get('status', 'SUCCESS') %>\n message: <% $.get('message', '') %>\n execution: <% execution() %>\n registered_nodes: <% $.registered_nodes or [] %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n\n provide:\n description: Take a list of nodes and move them to \"available\"\n\n input:\n - node_uuids\n - queue_name: tripleo\n\n tags:\n - tripleo-common-managed\n\n tasks:\n\n set_nodes_available:\n on-success: cell_v2_discover_hosts\n on-error: set_status_failed_nodes_available\n with-items: uuid in <% $.node_uuids %>\n workflow: tripleo.baremetal.v1.set_node_state\n input:\n node_uuid: <% $.uuid %>\n queue_name: <% $.queue_name %>\n state_action: 'provide'\n target_state: 'available'\n\n set_status_failed_nodes_available:\n on-success: send_message\n publish:\n status: FAILED\n message: <% task(set_nodes_available).result %>\n\n cell_v2_discover_hosts:\n on-success: try_power_off\n on-error: cell_v2_discover_hosts_failed\n workflow: tripleo.baremetal.v1.cellv2_discovery\n input:\n node_uuids: <% $.node_uuids %>\n queue_name: <% $.queue_name %>\n timeout: 900 #15 minutes\n retry:\n delay: 30\n count: 30\n\n cell_v2_discover_hosts_failed:\n on-success: send_message\n publish:\n status: FAILED\n message: <% task(cell_v2_discover_hosts).result %>\n\n try_power_off:\n on-success: send_message\n on-error: power_off_failed\n with-items: uuid in <% $.node_uuids %>\n workflow: tripleo.baremetal.v1.set_power_state\n input:\n node_uuid: <% $.uuid %>\n queue_name: <% $.queue_name %>\n state_action: 'off'\n target_state: 'power off'\n publish:\n status: SUCCESS\n message: <% $.node_uuids.len() %> node(s) successfully moved to the \"available\" state.\n\n power_off_failed:\n on-success: send_message\n publish:\n status: FAILED\n message: <% task(try_power_off).result %>\n\n send_message:\n action: zaqar.queue_post\n retry: count=5 delay=1\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n 
type: tripleo.baremetal.v1.provide\n payload:\n status: <% $.get('status', 'SUCCESS') %>\n message: <% $.get('message', '') %>\n execution: <% execution() %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n\n provide_manageable_nodes:\n description: Provide all nodes in a 'manageable' state.\n\n input:\n - queue_name: tripleo\n\n tags:\n - tripleo-common-managed\n\n tasks:\n\n get_manageable_nodes:\n action: ironic.node_list maintenance=False associated=False\n on-success: provide_manageable\n on-error: set_status_failed_get_manageable_nodes\n publish:\n managed_nodes: <% task().result.where($.provision_state = 'manageable').uuid %>\n\n set_status_failed_get_manageable_nodes:\n on-success: send_message\n publish:\n status: FAILED\n message: <% task(get_manageable_nodes).result %>\n\n provide_manageable:\n on-success: send_message\n workflow: tripleo.baremetal.v1.provide\n input:\n node_uuids: <% $.managed_nodes %>\n queue_name: <% $.queue_name %>\n publish:\n status: SUCCESS\n\n send_message:\n action: zaqar.queue_post\n retry: count=5 delay=1\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.baremetal.v1.provide_manageable_nodes\n payload:\n status: <% $.get('status', 'SUCCESS') %>\n message: <% $.get('message', '') %>\n execution: <% execution() %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n\n manage:\n description: Set a list of nodes to 'manageable' state\n\n input:\n - node_uuids\n - queue_name: tripleo\n\n tags:\n - tripleo-common-managed\n\n tasks:\n\n set_nodes_manageable:\n on-success: send_message\n on-error: set_status_failed_nodes_manageable\n with-items: uuid in <% $.node_uuids %>\n workflow: tripleo.baremetal.v1.set_node_state\n input:\n node_uuid: <% $.uuid %>\n state_action: 'manage'\n target_state: 'manageable'\n error_states:\n # node going back to enroll designates power credentials failure\n - 'enroll'\n - 'error'\n\n set_status_failed_nodes_manageable:\n on-success: send_message\n 
publish:\n status: FAILED\n message: <% task(set_nodes_manageable).result %>\n\n send_message:\n action: zaqar.queue_post\n retry: count=5 delay=1\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.baremetal.v1.manage\n payload:\n status: <% $.get('status', 'SUCCESS') %>\n message: <% $.get('message', '') %>\n execution: <% execution() %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n\n _introspect:\n description: >\n An internal workflow. The tripleo.baremetal.v1.introspect workflow\n should be used for introspection.\n\n input:\n - node_uuid\n - timeout\n - queue_name\n\n output:\n result: <% task(start_introspection).result %>\n\n tags:\n - tripleo-common-managed\n\n tasks:\n start_introspection:\n action: baremetal_introspection.introspect uuid=<% $.node_uuid %>\n on-success: wait_for_introspection_to_finish\n on-error: set_status_failed_start_introspection\n\n set_status_failed_start_introspection:\n publish:\n status: FAILED\n message: <% task(start_introspection).result %>\n introspected_nodes: []\n on-success: send_message\n\n wait_for_introspection_to_finish:\n action: baremetal_introspection.wait_for_finish\n input:\n uuids: <% [$.node_uuid] %>\n # The interval is 10 seconds, so divide to make the overall timeout\n # in seconds correct.\n max_retries: <% $.timeout / 10 %>\n retry_interval: 10\n publish:\n introspected_node: <% task().result.values().first() %>\n status: <% bool(task().result.values().first().error) and \"FAILED\" or \"SUCCESS\" %>\n publish-on-error:\n status: FAILED\n message: <% task().result %>\n on-success: wait_for_introspection_to_finish_success\n on-error: wait_for_introspection_to_finish_error\n\n wait_for_introspection_to_finish_success:\n publish:\n message: <% \"Introspection of node {0} completed. Status:{1}. 
Errors:{2}\".format($.introspected_node.uuid, $.status, $.introspected_node.error) %>\n on-success: send_message\n\n wait_for_introspection_to_finish_error:\n publish:\n message: <% \"Introspection of node {0} timed out.\".format($.node_uuid) %>\n on-success: send_message\n\n send_message:\n action: zaqar.queue_post\n retry: count=5 delay=1\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.baremetal.v1._introspect\n payload:\n status: <% $.status %>\n message: <% $.message %>\n introspected_node: <% $.get('introspected_node') %>\n node_uuid: <% $.node_uuid %>\n execution: <% execution() %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n\n introspect:\n description: >\n Take a list of nodes and move them through introspection.\n\n By default each node will attempt introspection up to 3 times (two\n retries plus the initial attempt) if it fails. This behaviour can be\n modified by changing the max_retry_attempts input.\n\n The workflow will assume the node has timed out after 20 minutes (1200\n seconds). 
This can be changed by passing the node_timeout input in\n seconds.\n\n input:\n - node_uuids\n - run_validations: False\n - queue_name: tripleo\n - concurrency: 20\n - max_retry_attempts: 2\n - node_timeout: 1200\n\n tags:\n - tripleo-common-managed\n\n task-defaults:\n on-error: unhandled_error\n\n tasks:\n initialize:\n publish:\n introspection_attempt: 1\n on-complete:\n - run_validations: <% $.run_validations %>\n - introspect_nodes: <% not $.run_validations %>\n\n run_validations:\n workflow: tripleo.validations.v1.run_groups\n input:\n group_names:\n - 'pre-introspection'\n queue_name: <% $.queue_name %>\n on-success: introspect_nodes\n on-error: set_validations_failed\n\n set_validations_failed:\n on-success: send_message\n publish:\n status: FAILED\n message: <% task(run_validations).result %>\n\n introspect_nodes:\n with-items: uuid in <% $.node_uuids %>\n concurrency: <% $.concurrency %>\n workflow: _introspect\n input:\n node_uuid: <% $.uuid %>\n queue_name: <% $.queue_name %>\n timeout: <% $.node_timeout %>\n # on-error is triggered if one or more nodes failed introspection. We\n # still go to get_introspection_status as it will collect the result\n # for each node. Unless we hit the retry limit.\n on-error:\n - get_introspection_status: <% $.introspection_attempt <= $.max_retry_attempts %>\n - max_retry_attempts_reached: <% $.introspection_attempt > $.max_retry_attempts %>\n on-success: get_introspection_status\n\n get_introspection_status:\n with-items: uuid in <% $.node_uuids %>\n action: baremetal_introspection.get_status\n input:\n uuid: <% $.uuid %>\n publish:\n introspected_nodes: <% task().result.toDict($.uuid, $) %>\n # Currently there is no way for us to ignore user introspection\n # aborts. 
This means we will retry aborted nodes until the Ironic API\n # gives us more details (error code or a boolean to show aborts etc.)\n # If a node hasn't finished, we consider it to be failed.\n # TODO(d0ugal): When possible, don't retry introspection of nodes\n # that a user manually aborted.\n failed_introspection: <% task().result.where($.finished = true and $.error != null).select($.uuid) + task().result.where($.finished = false).select($.uuid) %>\n publish-on-error:\n # If a node fails to start introspection, getting the status can fail.\n # When that happens, the result is a string and the nodes need to be\n # filtered out.\n introspected_nodes: <% task().result.where(isDict($)).toDict($.uuid, $) %>\n # If there was an error, the exception string we get doesn't give us\n # the UUID. So we use a set difference to find the UUIDs missing in\n # the results. These are then added to the failed nodes.\n failed_introspection: <% ($.node_uuids.toSet() - task().result.where(isDict($)).select($.uuid).toSet()) + task().result.where(isDict($)).where($.finished = true and $.error != null).toSet() + task().result.where(isDict($)).where($.finished = false).toSet() %>\n on-error: increase_attempt_counter\n on-success:\n - successful_introspection: <% $.failed_introspection.len() = 0 %>\n - increase_attempt_counter: <% $.failed_introspection.len() > 0 %>\n\n increase_attempt_counter:\n publish:\n introspection_attempt: <% $.introspection_attempt + 1 %>\n on-complete:\n retry_failed_nodes\n\n retry_failed_nodes:\n publish:\n status: RUNNING\n message: <% 'Retrying {0} nodes that failed introspection. 
Attempt {1} of {2} '.format($.failed_introspection.len(), $.introspection_attempt, $.max_retry_attempts + 1) %>\n # We are about to retry, update the tracking stats.\n node_uuids: <% $.failed_introspection %>\n on-success:\n - send_message\n - introspect_nodes\n\n max_retry_attempts_reached:\n publish:\n status: FAILED\n message: <% 'Retry limit reached with {0} nodes still failing introspection'.format($.failed_introspection.len()) %>\n on-complete: send_message\n\n successful_introspection:\n publish:\n status: SUCCESS\n message: Successfully introspected <% $.introspected_nodes.len() %> node(s).\n on-complete: send_message\n\n unhandled_error:\n publish:\n status: FAILED\n message: \"Unhandled workflow error\"\n on-complete: send_message\n\n send_message:\n action: zaqar.queue_post\n retry: count=5 delay=1\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.baremetal.v1.introspect\n payload:\n status: <% $.get('status', 'SUCCESS') %>\n message: <% $.get('message', '') %>\n execution: <% execution() %>\n introspected_nodes: <% $.get('introspected_nodes', []) %>\n failed_introspection: <% $.get('failed_introspection', []) %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n\n introspect_manageable_nodes:\n description: Introspect all nodes in a 'manageable' state.\n\n input:\n - run_validations: False\n - queue_name: tripleo\n\n tags:\n - tripleo-common-managed\n\n tasks:\n\n get_manageable_nodes:\n action: ironic.node_list maintenance=False associated=False\n on-success: validate_nodes\n on-error: set_status_failed_get_manageable_nodes\n publish:\n managed_nodes: <% task().result.where($.provision_state = 'manageable').uuid %>\n\n set_status_failed_get_manageable_nodes:\n on-success: send_message\n publish:\n status: FAILED\n message: <% task(get_manageable_nodes).result %>\n\n validate_nodes:\n on-success:\n - introspect_manageable: <% $.managed_nodes.len() > 0 %>\n - set_status_failed_no_nodes: <% $.managed_nodes.len() = 0 
%>\n\n set_status_failed_no_nodes:\n on-success: send_message\n publish:\n status: FAILED\n message: No manageable nodes to introspect. Check node states and maintenance.\n\n introspect_manageable:\n on-success: send_message\n on-error: set_status_introspect_manageable\n workflow: tripleo.baremetal.v1.introspect\n input:\n node_uuids: <% $.managed_nodes %>\n run_validations: <% $.run_validations %>\n queue_name: <% $.queue_name %>\n publish:\n introspected_nodes: <% task().result.introspected_nodes %>\n\n set_status_introspect_manageable:\n on-success: send_message\n publish:\n status: FAILED\n message: <% task(introspect_manageable).result %>\n introspected_nodes: []\n\n send_message:\n action: zaqar.queue_post\n retry: count=5 delay=1\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.baremetal.v1.introspect_manageable_nodes\n payload:\n status: <% $.get('status', 'SUCCESS') %>\n message: <% $.get('message', '') %>\n execution: <% execution() %>\n introspected_nodes: <% $.get('introspected_nodes', []) %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n\n configure:\n description: Take a list of manageable nodes and update their boot configuration.\n\n input:\n - node_uuids\n - queue_name: tripleo\n - kernel_name: bm-deploy-kernel\n - ramdisk_name: bm-deploy-ramdisk\n - instance_boot_option: null\n - root_device: null\n - root_device_minimum_size: 4\n - overwrite_root_device_hints: False\n\n tags:\n - tripleo-common-managed\n\n tasks:\n\n configure_boot:\n on-success: configure_root_device\n on-error: set_status_failed_configure_boot\n with-items: node_uuid in <% $.node_uuids %>\n action: tripleo.baremetal.configure_boot node_uuid=<% $.node_uuid %> kernel_name=<% $.kernel_name %> ramdisk_name=<% $.ramdisk_name %> instance_boot_option=<% $.instance_boot_option %>\n\n configure_root_device:\n on-success: send_message\n on-error: set_status_failed_configure_root_device\n with-items: node_uuid in <% $.node_uuids %>\n action: 
tripleo.baremetal.configure_root_device node_uuid=<% $.node_uuid %> root_device=<% $.root_device %> minimum_size=<% $.root_device_minimum_size %> overwrite=<% $.overwrite_root_device_hints %>\n publish:\n status: SUCCESS\n message: 'Successfully configured the nodes.'\n\n set_status_failed_configure_boot:\n on-success: send_message\n publish:\n status: FAILED\n message: <% task(configure_boot).result %>\n\n set_status_failed_configure_root_device:\n on-success: send_message\n publish:\n status: FAILED\n message: <% task(configure_root_device).result %>\n\n send_message:\n action: zaqar.queue_post\n retry: count=5 delay=1\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.baremetal.v1.configure\n payload:\n status: <% $.get('status', 'SUCCESS') %>\n message: <% $.get('message', '') %>\n execution: <% execution() %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n\n configure_manageable_nodes:\n description: Update the boot configuration of all nodes in 'manageable' state.\n\n input:\n - queue_name: tripleo\n - kernel_name: 'bm-deploy-kernel'\n - ramdisk_name: 'bm-deploy-ramdisk'\n - instance_boot_option: null\n - root_device: null\n - root_device_minimum_size: 4\n - overwrite_root_device_hints: False\n\n tags:\n - tripleo-common-managed\n\n tasks:\n\n get_manageable_nodes:\n action: ironic.node_list maintenance=False associated=False\n on-success: configure_manageable\n on-error: set_status_failed_get_manageable_nodes\n publish:\n managed_nodes: <% task().result.where($.provision_state = 'manageable').uuid %>\n\n configure_manageable:\n on-success: send_message\n on-error: set_status_failed_configure_manageable\n workflow: tripleo.baremetal.v1.configure\n input:\n node_uuids: <% $.managed_nodes %>\n queue_name: <% $.queue_name %>\n kernel_name: <% $.kernel_name %>\n ramdisk_name: <% $.ramdisk_name %>\n instance_boot_option: <% $.instance_boot_option %>\n root_device: <% $.root_device %>\n root_device_minimum_size: <% 
$.root_device_minimum_size %>\n overwrite_root_device_hints: <% $.overwrite_root_device_hints %>\n publish:\n message: 'Manageable nodes configured successfully.'\n\n set_status_failed_configure_manageable:\n on-success: send_message\n publish:\n status: FAILED\n message: <% task(configure_manageable).result %>\n\n set_status_failed_get_manageable_nodes:\n on-success: send_message\n publish:\n status: FAILED\n message: <% task(get_manageable_nodes).result %>\n\n send_message:\n action: zaqar.queue_post\n retry: count=5 delay=1\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.baremetal.v1.configure_manageable_nodes\n payload:\n status: <% $.get('status', 'SUCCESS') %>\n message: <% $.get('message', '') %>\n execution: <% execution() %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n\n tag_node:\n description: Tag a node with a role\n input:\n - node_uuid\n - role: null\n - queue_name: tripleo\n\n task-defaults:\n on-error: send_message\n\n tags:\n - tripleo-common-managed\n\n tasks:\n\n update_node:\n on-success: send_message\n action: tripleo.baremetal.update_node_capability node_uuid=<% $.node_uuid %> capability='profile' value=<% $.role %>\n publish:\n message: <% task().result %>\n status: SUCCESS\n\n send_message:\n action: zaqar.queue_post\n retry: count=5 delay=1\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.baremetal.v1.tag_node\n payload:\n status: <% $.get('status', 'FAILED') %>\n message: <% $.get('message', '') %>\n execution: <% execution() %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n\n tag_nodes:\n description: Runs the tag_node workflow in a loop\n input:\n - tag_node_uuids\n - untag_node_uuids\n - role\n - plan: overcloud\n - queue_name: tripleo\n\n task-defaults:\n on-error: send_message\n\n tags:\n - tripleo-common-managed\n\n tasks:\n\n tag_nodes:\n with-items: node_uuid in <% $.tag_node_uuids %>\n workflow: tripleo.baremetal.v1.tag_node\n input:\n 
node_uuid: <% $.node_uuid %>\n queue_name: <% $.queue_name %>\n role: <% $.role %>\n concurrency: 1\n on-success: untag_nodes\n\n untag_nodes:\n with-items: node_uuid in <% $.untag_node_uuids %>\n workflow: tripleo.baremetal.v1.tag_node\n input:\n node_uuid: <% $.node_uuid %>\n queue_name: <% $.queue_name %>\n concurrency: 1\n on-success: update_role_parameters\n\n update_role_parameters:\n on-success: send_message\n action: tripleo.parameters.update_role role=<% $.role %> container=<% $.plan %>\n publish:\n message: <% task().result %>\n status: SUCCESS\n\n send_message:\n action: zaqar.queue_post\n retry: count=5 delay=1\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.baremetal.v1.tag_nodes\n payload:\n status: <% $.get('status', 'FAILED') %>\n message: <% $.get('message', '') %>\n execution: <% execution() %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n\n nodes_with_hint:\n description: Find nodes matching a hint regex\n input:\n - hint_regex\n - queue_name: tripleo\n\n tags:\n - tripleo-common-managed\n\n tasks:\n get_nodes:\n with-items: provision_state in <% ['available', 'active'] %>\n action: ironic.node_list maintenance=false provision_state=<% $.provision_state %> detail=true\n on-success: get_matching_nodes\n on-error: set_status_failed_get_nodes\n\n get_matching_nodes:\n with-items: node in <% task(get_nodes).result.flatten() %>\n action: tripleo.baremetal.get_node_hint node=<% $.node %>\n on-success: send_message\n on-error: set_status_failed_get_matching_nodes\n publish:\n matching_nodes: <% let(hint_regex => $.hint_regex) -> task().result.where($.hint and $.hint.matches($hint_regex)).uuid %>\n\n set_status_failed_get_nodes:\n on-success: send_message\n publish:\n status: FAILED\n message: <% task(get_nodes).result %>\n\n set_status_failed_get_matching_nodes:\n on-success: send_message\n publish:\n status: FAILED\n message: <% task(get_matching_nodes).result %>\n\n send_message:\n action: 
zaqar.queue_post\n retry: count=5 delay=1\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.baremetal.v1.nodes_with_hint\n payload:\n status: <% $.get('status', 'SUCCESS') %>\n message: <% $.get('message', '') %>\n execution: <% execution() %>\n matching_nodes: <% $.matching_nodes or [] %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n\n nodes_with_profile:\n description: Find nodes with a specific profile\n input:\n - profile\n - queue_name: tripleo\n\n tags:\n - tripleo-common-managed\n\n tasks:\n get_active_nodes:\n action: ironic.node_list maintenance=false provision_state='active' detail=true\n on-success: get_available_nodes\n on-error: set_status_failed_get_active_nodes\n\n get_available_nodes:\n action: ironic.node_list maintenance=false provision_state='available' detail=true\n on-success: get_matching_nodes\n on-error: set_status_failed_get_available_nodes\n\n get_matching_nodes:\n with-items: node in <% task(get_available_nodes).result + task(get_active_nodes).result %>\n action: tripleo.baremetal.get_profile node=<% $.node %>\n on-success: send_message\n on-error: set_status_failed_get_matching_nodes\n publish:\n matching_nodes: <% let(input_profile_name => $.profile) -> task().result.where($.profile = $input_profile_name).uuid %>\n\n set_status_failed_get_active_nodes:\n on-success: send_message\n publish:\n status: FAILED\n message: <% task(get_active_nodes).result %>\n\n set_status_failed_get_available_nodes:\n on-success: send_message\n publish:\n status: FAILED\n message: <% task(get_available_nodes).result %>\n\n set_status_failed_get_matching_nodes:\n on-success: send_message\n publish:\n status: FAILED\n message: <% task(get_matching_nodes).result %>\n\n send_message:\n action: zaqar.queue_post\n retry: count=5 delay=1\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.baremetal.v1.nodes_with_profile\n payload:\n status: <% $.get('status', 'SUCCESS') %>\n message: <% 
$.get('message', '') %>\n execution: <% execution() %>\n matching_nodes: <% $.matching_nodes or [] %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n\n create_raid_configuration:\n description: Create and apply RAID configuration for given nodes\n input:\n - node_uuids\n - configuration\n - queue_name: tripleo\n\n tags:\n - tripleo-common-managed\n\n tasks:\n\n set_configuration:\n with-items: node_uuid in <% $.node_uuids %>\n action: ironic.node_set_target_raid_config node_ident=<% $.node_uuid %> target_raid_config=<% $.configuration %>\n on-success: apply_configuration\n on-error: set_configuration_failed\n\n set_configuration_failed:\n on-success: send_message\n publish:\n status: FAILED\n message: <% task(set_configuration).result %>\n\n apply_configuration:\n with-items: node_uuid in <% $.node_uuids %>\n workflow: tripleo.baremetal.v1.manual_cleaning\n input:\n node_uuid: <% $.node_uuid %>\n clean_steps:\n - interface: raid\n step: delete_configuration\n - interface: raid\n step: create_configuration\n timeout: 1800 # building RAID should be faster than general cleaning\n retry_count: 180\n retry_delay: 10\n on-success: send_message\n on-error: apply_configuration_failed\n publish:\n message: <% task().result %>\n status: SUCCESS\n\n apply_configuration_failed:\n on-success: send_message\n publish:\n status: FAILED\n message: <% task(apply_configuration).result %>\n\n send_message:\n action: zaqar.queue_post\n retry: count=5 delay=1\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.baremetal.v1.create_raid_configuration\n payload:\n status: <% $.get('status', 'FAILED') %>\n message: <% $.get('message', '') %>\n execution: <% execution() %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n\n\n cellv2_discovery:\n description: Run cell_v2 host discovery\n\n input:\n - node_uuids\n - queue_name: tripleo\n\n tags:\n - tripleo-common-managed\n\n tasks:\n\n cell_v2_discover_hosts:\n on-success: 
wait_for_nova_resources\n on-error: cell_v2_discover_hosts_failed\n action: tripleo.baremetal.cell_v2_discover_hosts\n\n cell_v2_discover_hosts_failed:\n on-success: send_message\n publish:\n status: FAILED\n message: <% task(cell_v2_discover_hosts).result %>\n\n wait_for_nova_resources:\n on-success: send_message\n on-error: wait_for_nova_resources_failed\n with-items: node_uuid in <% $.node_uuids %>\n action: nova.hypervisors_find hypervisor_hostname=<% $.node_uuid %>\n\n wait_for_nova_resources_failed:\n on-success: send_message\n publish:\n status: FAILED\n message: <% task(wait_for_nova_resources).result %>\n\n send_message:\n action: zaqar.queue_post\n retry: count=5 delay=1\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.baremetal.v1.cellv2_discovery\n payload:\n status: <% $.get('status', 'SUCCESS') %>\n message: <% $.get('message', '') %>\n execution: <% execution() %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n\n\n discover_nodes:\n description: Run nodes discovery over the given IP range\n\n input:\n - ip_addresses\n - credentials\n - ports: [623]\n - queue_name: tripleo\n\n tags:\n - tripleo-common-managed\n\n tasks:\n\n get_all_nodes:\n action: ironic.node_list\n input:\n fields: [\"uuid\", \"driver\", \"driver_info\"]\n limit: 0\n on-success: get_candidate_nodes\n on-error: get_all_nodes_failed\n publish:\n existing_nodes: <% task().result %>\n\n get_all_nodes_failed:\n on-success: send_message\n publish:\n status: FAILED\n message: <% task(get_all_nodes).result %>\n\n get_candidate_nodes:\n action: tripleo.baremetal.get_candidate_nodes\n input:\n ip_addresses: <% $.ip_addresses %>\n credentials: <% $.credentials %>\n ports: <% $.ports %>\n existing_nodes: <% $.existing_nodes %>\n on-success: probe_nodes\n on-error: get_candidate_nodes_failed\n publish:\n candidates: <% task().result %>\n\n get_candidate_nodes_failed:\n on-success: send_message\n publish:\n status: FAILED\n message: <% 
task(get_candidate_nodes).result %>\n\n probe_nodes:\n action: tripleo.baremetal.probe_node\n on-success: send_message\n on-error: probe_nodes_failed\n input:\n ip: <% $.node.ip %>\n port: <% $.node.port %>\n username: <% $.node.username %>\n password: <% $.node.password %>\n with-items:\n - node in <% $.candidates %>\n publish:\n nodes_json: <% task().result.where($ != null) %>\n\n probe_nodes_failed:\n on-success: send_message\n publish:\n status: FAILED\n message: <% task(probe_nodes).result %>\n\n send_message:\n action: zaqar.queue_post\n retry: count=5 delay=1\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.baremetal.v1.discover_nodes\n payload:\n status: <% $.get('status', 'SUCCESS') %>\n message: <% $.get('message', '') %>\n execution: <% execution() %>\n nodes_json: <% $.get('nodes_json', []) %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n\n discover_and_enroll_nodes:\n description: Run nodes discovery over the given IP range and enroll nodes\n\n input:\n - ip_addresses\n - credentials\n - ports: [623]\n - kernel_name: null\n - ramdisk_name: null\n - instance_boot_option: local\n - initial_state: manageable\n - queue_name: tripleo\n\n tags:\n - tripleo-common-managed\n\n tasks:\n\n discover_nodes:\n workflow: tripleo.baremetal.v1.discover_nodes\n input:\n ip_addresses: <% $.ip_addresses %>\n ports: <% $.ports %>\n credentials: <% $.credentials %>\n queue_name: <% $.queue_name %>\n on-success: enroll_nodes\n on-error: discover_nodes_failed\n publish:\n nodes_json: <% task().result.nodes_json %>\n\n discover_nodes_failed:\n on-success: send_message\n publish:\n status: FAILED\n message: <% task(discover_nodes).result %>\n\n enroll_nodes:\n workflow: tripleo.baremetal.v1.register_or_update\n input:\n nodes_json: <% $.nodes_json %>\n kernel_name: <% $.kernel_name %>\n ramdisk_name: <% $.ramdisk_name %>\n instance_boot_option: <% $.instance_boot_option %>\n initial_state: <% $.initial_state %>\n on-success: 
send_message\n on-error: enroll_nodes_failed\n publish:\n registered_nodes: <% task().result.registered_nodes %>\n\n enroll_nodes_failed:\n on-success: send_message\n publish:\n status: FAILED\n message: <% task(enroll_nodes).result %>\n\n send_message:\n action: zaqar.queue_post\n retry: count=5 delay=1\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.baremetal.v1.discover_and_enroll_nodes\n payload:\n status: <% $.get('status', 'SUCCESS') %>\n message: <% $.get('message', '') %>\n execution: <% execution() %>\n registered_nodes: <% $.get('registered_nodes', []) %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n", "name": "tripleo.baremetal.v1", "tags": [], "created_at": "2018-08-21 13:36:41", "scope": "private", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "9cfa7e92-f4fc-4f7f-8826-6173d87110bb"} > >2018-08-21 16:36:42,659 DEBUG: HTTP POST http://192.168.24.1:8989/v2/workbooks 201 >2018-08-21 16:36:42,665 DEBUG: REQ: curl -g -i -X POST http://192.168.24.1:8989/v2/workbooks -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "content-type: text/plain" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" -d '--- >version: '2.0' >name: tripleo.package_update.v1 >description: TripleO update workflows > >workflows: > > # Updates a workload cloud stack > package_update_plan: > description: Take a container and perform a package update with possible breakpoints > > input: > - container > - container_registry > - ceph_ansible_playbook > - timeout: 240 > - queue_name: tripleo > - skip_deploy_identifier: False > - config_dir: '/tmp/' > > tags: > - tripleo-common-managed > > tasks: > update: > action: tripleo.package_update.update_stack > input: > timeout: <% $.timeout %> > container: <% $.container %> > container_registry: <% $.container_registry %> > ceph_ansible_playbook: <% $.ceph_ansible_playbook %> > on-success: clean_plan > on-error: set_update_failed > > clean_plan: > 
action: tripleo.plan.update_plan_environment > input: > container: <% $.container %> > parameter: CephAnsiblePlaybook > env_key: parameter_defaults > delete: true > on-success: send_message > on-error: set_update_failed > > > set_update_failed: > on-success: send_message > publish: > status: FAILED > message: <% task(update).result %> > > send_message: > action: zaqar.queue_post > input: > queue_name: <% $.queue_name %> > messages: > body: > type: tripleo.package_update.v1.package_update_plan > payload: > status: <% $.get('status', 'SUCCESS') %> > message: <% $.get('message', '') %> > execution: <% execution() %> > on-success: > - fail: <% $.get('status') = "FAILED" %> > > get_config: > input: > - container > - queue_name: tripleo > > tags: > - tripleo-common-managed > > tasks: > get_config: > action: tripleo.config.get_overcloud_config container=<% $.container %> > publish: > status: SUCCESS > message: <% task().result %> > publish-on-error: > status: FAILED > message: Init Minor update failed > on-complete: send_message > > send_message: > action: zaqar.queue_post > input: > queue_name: <% $.queue_name %> > messages: > body: > type: tripleo.package_update.v1.package_update_plan > payload: > status: <% $.get('status', 'SUCCESS') %> > message: <% $.get('message', '') %> > execution: <% execution() %> > on-success: > - fail: <% $.get('status') = "FAILED" %> > > update_nodes: > description: Take a container and perform an update node by node > > input: > - node_user: heat-admin > - nodes > - playbook > - inventory_file > - ansible_queue_name: tripleo > - module_path: /usr/share/ansible-modules > - ansible_extra_env_variables: > ANSIBLE_LOG_PATH: /var/log/mistral/package_update.log > ANSIBLE_HOST_KEY_CHECKING: 'False' > - verbosity: 1 > - work_dir: /var/lib/mistral > - skip_tags: '' > > tags: > - tripleo-common-managed > > tasks: > download_config: > action: tripleo.config.download_config > input: > work_dir: <% $.work_dir %>/<% execution().id %> > on-success: 
get_private_key > on-error: node_update_failed > > get_private_key: > action: tripleo.validations.get_privkey > publish: > private_key: <% task().result %> > on-success: node_update > > node_update: > action: tripleo.ansible-playbook > input: > inventory: <% $.inventory_file %> > playbook: <% $.work_dir %>/<% execution().id %>/<% $.playbook %> > remote_user: <% $.node_user %> > become: true > become_user: root > verbosity: <% $.verbosity %> > ssh_private_key: <% $.private_key %> > extra_env_variables: <% $.ansible_extra_env_variables %> > limit_hosts: <% $.nodes %> > module_path: <% $.module_path %> > queue_name: <% $.ansible_queue_name %> > execution_id: <% execution().id %> > skip_tags: <% $.skip_tags %> > trash_output: true > on-success: > - node_update_passed: <% task().result.returncode = 0 %> > - node_update_failed: <% task().result.returncode != 0 %> > on-error: node_update_failed > publish: > output: <% task().result %> > > node_update_passed: > on-success: notify_zaqar > publish: > status: SUCCESS > message: Updated nodes - <% $.nodes %> > > node_update_failed: > on-success: notify_zaqar > publish: > status: FAILED > message: Failed to update nodes - <% $.nodes %>, please see the logs. 
> > notify_zaqar: > action: zaqar.queue_post > retry: count=5 delay=1 > input: > queue_name: <% $.ansible_queue_name %> > messages: > body: > type: tripleo.package_update.v1.update_nodes > payload: > status: <% $.status %> > execution: <% execution() %> > on-success: > - fail: <% $.get('status') = "FAILED" %> > > update_converge_plan: > description: Take a container and perform the converge for minor update > > input: > - container > - queue_name: tripleo > > tags: > - tripleo-common-managed > > tasks: > remove_noop: > action: tripleo.plan.remove_noop_deploystep > input: > container: <% $.container %> > on-success: send_message > on-error: set_update_failed > > set_update_failed: > on-success: send_message > publish: > status: FAILED > message: <% task(remove_noop).result %> > > send_message: > action: zaqar.queue_post > input: > queue_name: <% $.queue_name %> > messages: > body: > type: tripleo.package_update.v1.update_converge_plan > payload: > status: <% $.get('status', 'SUCCESS') %> > message: <% $.get('message', '') %> > execution: <% execution() %> > on-success: > - fail: <% $.get('status') = "FAILED" %> > > converge_upgrade_plan: > description: Take a container and perform the converge step of a major upgrade > > input: > - container > - timeout: 240 > - queue_name: tripleo > - skip_deploy_identifier: False > > tags: > - tripleo-common-managed > > tasks: > remove_noop: > action: tripleo.plan.remove_noop_deploystep > input: > container: <% $.container %> > on-success: send_message > on-error: set_update_failed > > set_update_failed: > on-success: send_message > publish: > status: FAILED > message: <% task(upgrade_converge).result %> > > send_message: > action: zaqar.queue_post > input: > queue_name: <% $.queue_name %> > messages: > body: > type: tripleo.major_upgrade.v1.converge_upgrade_plan > payload: > status: <% $.get('status', 'SUCCESS') %> > message: <% $.get('message', '') %> > execution: <% execution() %> > on-success: > - fail: <% $.get('status') = 
"FAILED" %> > > ffwd_upgrade_converge_plan: > description: ffwd-upgrade converge removes DeploymentSteps no-op from plan > > input: > - container > - queue_name: tripleo > > tags: > - tripleo-common-managed > > tasks: > remove_noop: > action: tripleo.plan.remove_noop_deploystep > input: > container: <% $.container %> > on-success: send_message > on-error: set_update_failed > > set_update_failed: > on-success: send_message > publish: > status: FAILED > message: <% task(remove_noop).result %> > > send_message: > action: zaqar.queue_post > input: > queue_name: <% $.queue_name %> > messages: > body: > type: tripleo.package_update.v1.ffwd_upgrade_converge_plan > payload: > status: <% $.get('status', 'SUCCESS') %> > message: <% $.get('message', '') %> > execution: <% execution() %> > on-success: > - fail: <% $.get('status') = "FAILED" %> >' >2018-08-21 16:36:46,724 DEBUG: http://192.168.24.1:8989 "POST /v2/workbooks HTTP/1.1" 201 8946 >2018-08-21 16:36:46,728 DEBUG: RESP: [201] Content-Length: 8946 Content-Type: application/json Date: Tue, 21 Aug 2018 13:36:46 GMT Connection: keep-alive >RESP BODY: {"definition": "---\nversion: '2.0'\nname: tripleo.package_update.v1\ndescription: TripleO update workflows\n\nworkflows:\n\n # Updates a workload cloud stack\n package_update_plan:\n description: Take a container and perform a package update with possible breakpoints\n\n input:\n - container\n - container_registry\n - ceph_ansible_playbook\n - timeout: 240\n - queue_name: tripleo\n - skip_deploy_identifier: False\n - config_dir: '/tmp/'\n\n tags:\n - tripleo-common-managed\n\n tasks:\n update:\n action: tripleo.package_update.update_stack\n input:\n timeout: <% $.timeout %>\n container: <% $.container %>\n container_registry: <% $.container_registry %>\n ceph_ansible_playbook: <% $.ceph_ansible_playbook %>\n on-success: clean_plan\n on-error: set_update_failed\n\n clean_plan:\n action: tripleo.plan.update_plan_environment\n input:\n container: <% $.container %>\n parameter: 
CephAnsiblePlaybook\n env_key: parameter_defaults\n delete: true\n on-success: send_message\n on-error: set_update_failed\n\n\n set_update_failed:\n on-success: send_message\n publish:\n status: FAILED\n message: <% task(update).result %>\n\n send_message:\n action: zaqar.queue_post\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.package_update.v1.package_update_plan\n payload:\n status: <% $.get('status', 'SUCCESS') %>\n message: <% $.get('message', '') %>\n execution: <% execution() %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n\n get_config:\n input:\n - container\n - queue_name: tripleo\n\n tags:\n - tripleo-common-managed\n\n tasks:\n get_config:\n action: tripleo.config.get_overcloud_config container=<% $.container %>\n publish:\n status: SUCCESS\n message: <% task().result %>\n publish-on-error:\n status: FAILED\n message: Init Minor update failed\n on-complete: send_message\n\n send_message:\n action: zaqar.queue_post\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.package_update.v1.package_update_plan\n payload:\n status: <% $.get('status', 'SUCCESS') %>\n message: <% $.get('message', '') %>\n execution: <% execution() %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n\n update_nodes:\n description: Take a container and perform an update node by node\n\n input:\n - node_user: heat-admin\n - nodes\n - playbook\n - inventory_file\n - ansible_queue_name: tripleo\n - module_path: /usr/share/ansible-modules\n - ansible_extra_env_variables:\n ANSIBLE_LOG_PATH: /var/log/mistral/package_update.log\n ANSIBLE_HOST_KEY_CHECKING: 'False'\n - verbosity: 1\n - work_dir: /var/lib/mistral\n - skip_tags: ''\n\n tags:\n - tripleo-common-managed\n\n tasks:\n download_config:\n action: tripleo.config.download_config\n input:\n work_dir: <% $.work_dir %>/<% execution().id %>\n on-success: get_private_key\n on-error: node_update_failed\n\n get_private_key:\n action: 
tripleo.validations.get_privkey\n publish:\n private_key: <% task().result %>\n on-success: node_update\n\n node_update:\n action: tripleo.ansible-playbook\n input:\n inventory: <% $.inventory_file %>\n playbook: <% $.work_dir %>/<% execution().id %>/<% $.playbook %>\n remote_user: <% $.node_user %>\n become: true\n become_user: root\n verbosity: <% $.verbosity %>\n ssh_private_key: <% $.private_key %>\n extra_env_variables: <% $.ansible_extra_env_variables %>\n limit_hosts: <% $.nodes %>\n module_path: <% $.module_path %>\n queue_name: <% $.ansible_queue_name %>\n execution_id: <% execution().id %>\n skip_tags: <% $.skip_tags %>\n trash_output: true\n on-success:\n - node_update_passed: <% task().result.returncode = 0 %>\n - node_update_failed: <% task().result.returncode != 0 %>\n on-error: node_update_failed\n publish:\n output: <% task().result %>\n\n node_update_passed:\n on-success: notify_zaqar\n publish:\n status: SUCCESS\n message: Updated nodes - <% $.nodes %>\n\n node_update_failed:\n on-success: notify_zaqar\n publish:\n status: FAILED\n message: Failed to update nodes - <% $.nodes %>, please see the logs.\n\n notify_zaqar:\n action: zaqar.queue_post\n retry: count=5 delay=1\n input:\n queue_name: <% $.ansible_queue_name %>\n messages:\n body:\n type: tripleo.package_update.v1.update_nodes\n payload:\n status: <% $.status %>\n execution: <% execution() %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n\n update_converge_plan:\n description: Take a container and perform the converge for minor update\n\n input:\n - container\n - queue_name: tripleo\n\n tags:\n - tripleo-common-managed\n\n tasks:\n remove_noop:\n action: tripleo.plan.remove_noop_deploystep\n input:\n container: <% $.container %>\n on-success: send_message\n on-error: set_update_failed\n\n set_update_failed:\n on-success: send_message\n publish:\n status: FAILED\n message: <% task(remove_noop).result %>\n\n send_message:\n action: zaqar.queue_post\n input:\n queue_name: <% 
$.queue_name %>\n messages:\n body:\n type: tripleo.package_update.v1.update_converge_plan\n payload:\n status: <% $.get('status', 'SUCCESS') %>\n message: <% $.get('message', '') %>\n execution: <% execution() %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n\n converge_upgrade_plan:\n description: Take a container and perform the converge step of a major upgrade\n\n input:\n - container\n - timeout: 240\n - queue_name: tripleo\n - skip_deploy_identifier: False\n\n tags:\n - tripleo-common-managed\n\n tasks:\n remove_noop:\n action: tripleo.plan.remove_noop_deploystep\n input:\n container: <% $.container %>\n on-success: send_message\n on-error: set_update_failed\n\n set_update_failed:\n on-success: send_message\n publish:\n status: FAILED\n message: <% task(upgrade_converge).result %>\n\n send_message:\n action: zaqar.queue_post\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.major_upgrade.v1.converge_upgrade_plan\n payload:\n status: <% $.get('status', 'SUCCESS') %>\n message: <% $.get('message', '') %>\n execution: <% execution() %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n\n ffwd_upgrade_converge_plan:\n description: ffwd-upgrade converge removes DeploymentSteps no-op from plan\n\n input:\n - container\n - queue_name: tripleo\n\n tags:\n - tripleo-common-managed\n\n tasks:\n remove_noop:\n action: tripleo.plan.remove_noop_deploystep\n input:\n container: <% $.container %>\n on-success: send_message\n on-error: set_update_failed\n\n set_update_failed:\n on-success: send_message\n publish:\n status: FAILED\n message: <% task(remove_noop).result %>\n\n send_message:\n action: zaqar.queue_post\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.package_update.v1.ffwd_upgrade_converge_plan\n payload:\n status: <% $.get('status', 'SUCCESS') %>\n message: <% $.get('message', '') %>\n execution: <% execution() %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n", 
"name": "tripleo.package_update.v1", "tags": [], "created_at": "2018-08-21 13:36:46", "scope": "private", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "4730c1b9-5620-4a9a-8473-46d530effe1d"} > >2018-08-21 16:36:46,729 DEBUG: HTTP POST http://192.168.24.1:8989/v2/workbooks 201 >2018-08-21 16:36:46,732 DEBUG: REQ: curl -g -i -X POST http://192.168.24.1:8989/v2/workbooks -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "content-type: text/plain" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" -d '--- >version: '2.0' >name: tripleo.deployment.v1 >description: TripleO deployment workflows > >workflows: > > deploy_on_server: > > input: > - server_uuid > - server_name > - config > - config_name > - group > - queue_name: tripleo > > tags: > - tripleo-common-managed > > tasks: > > deploy_config: > action: tripleo.deployment.config > on-complete: send_message > input: > server_id: <% $.server_uuid %> > name: <% $.config_name %> > config: <% $.config %> > group: <% $.group %> > publish: > stdout: <% task().result.deploy_stdout %> > stderr: <% task().result.deploy_stderr %> > status_code: <% task().result.deploy_status_code %> > publish-on-error: > status: FAILED > message: <% task().result %> > > send_message: > action: zaqar.queue_post > retry: count=5 delay=1 > input: > queue_name: <% $.queue_name %> > messages: > body: > type: tripleo.deployment.v1.deploy_on_server > payload: > status: <% $.get("status", "SUCCESS") %> > message: <% $.get("message", "") %> > server_uuid: <% $.server_uuid %> > server_name: <% $.server_name %> > config_name: <% $.config_name %> > status_code: <% $.get("status_code", "") %> > stdout: <% $.get("stdout", "") %> > stderr: <% $.get("stderr", "") %> > execution: <% execution() %> > on-success: > - fail: <% $.get('status') = "FAILED" %> > > deploy_on_servers: > > input: > - server_name > - config_name > - config > - group: script > - queue_name: tripleo > > tags: > - tripleo-common-managed 
> > tasks: > > check_if_all_servers: > on-success: > - get_servers_matching: <% $.server_name != "all" %> > - get_all_servers: <% $.server_name = "all" %> > > get_servers_matching: > action: nova.servers_list > on-success: deploy_on_servers > publish: > servers_with_name: <% task().result._info.where($.name.indexOf(execution().input.server_name) > -1) %> > > get_all_servers: > action: nova.servers_list > on-success: deploy_on_servers > publish: > servers_with_name: <% task().result._info %> > > deploy_on_servers: > on-success: send_success_message > on-error: send_failed_message > with-items: server in <% $.servers_with_name %> > workflow: tripleo.deployment.v1.deploy_on_server > input: > server_name: <% $.server.name %> > server_uuid: <% $.server.id %> > config: <% $.config %> > config_name: <% $.config_name %> > group: <% $.group %> > queue_name: <% $.queue_name %> > > send_success_message: > action: zaqar.queue_post > retry: count=5 delay=1 > input: > queue_name: <% $.queue_name %> > messages: > body: > type: tripleo.deployment.v1.deploy_on_servers > payload: > status: SUCCESS > execution: <% execution() %> > > send_failed_message: > action: zaqar.queue_post > retry: count=5 delay=1 > input: > queue_name: <% $.queue_name %> > messages: > body: > type: tripleo.deployment.v1.deploy_on_servers > payload: > status: FAILED > message: <% task(deploy_on_servers).result %> > execution: <% execution() %> > on-success: fail > > deploy_plan: > > description: > > Deploy the overcloud for a plan. 
> > input: > - container > - run_validations: False > - timeout: 240 > - skip_deploy_identifier: False > - queue_name: tripleo > > tags: > - tripleo-common-managed > > tasks: > > add_validation_ssh_key: > workflow: tripleo.validations.v1.add_validation_ssh_key_parameter > input: > container: <% $.container %> > queue_name: <% $.queue_name %> > on-complete: > - run_validations: <% $.run_validations %> > - create_swift_rings_backup_plan: <% not $.run_validations %> > > run_validations: > workflow: tripleo.validations.v1.run_groups > input: > group_names: > - 'pre-deployment' > plan: <% $.container %> > queue_name: <% $.queue_name %> > on-success: create_swift_rings_backup_plan > on-error: set_validations_failed > > set_validations_failed: > on-success: send_message > publish: > status: FAILED > message: <% task(run_validations).result %> > > create_swift_rings_backup_plan: > workflow: tripleo.swift_rings_backup.v1.create_swift_rings_backup_container_plan > on-success: cell_v2_discover_hosts > on-error: create_swift_rings_backup_plan_set_status_failed > input: > container: <% $.container %> > queue_name: <% $.queue_name %> > use_default_templates: true > > cell_v2_discover_hosts: > on-success: deploy > on-error: cell_v2_discover_hosts_failed > action: tripleo.baremetal.cell_v2_discover_hosts > > cell_v2_discover_hosts_failed: > on-success: send_message > publish: > status: FAILED > message: <% task(cell_v2_discover_hosts).result %> > > deploy: > action: tripleo.deployment.deploy > input: > timeout: <% $.timeout %> > container: <% $.container %> > skip_deploy_identifier: <% $.skip_deploy_identifier %> > on-success: send_message > on-error: set_deployment_failed > > create_swift_rings_backup_plan_set_status_failed: > on-success: send_message > publish: > status: FAILED > message: <% task(create_swift_rings_backup_plan).result %> > > set_deployment_failed: > on-success: send_message > publish: > status: FAILED > message: <% task(deploy).result %> > > send_message: > 
action: zaqar.queue_post > retry: count=5 delay=1 > input: > queue_name: <% $.queue_name %> > messages: > body: > type: tripleo.deployment.v1.deploy_plan > payload: > status: <% $.get('status', 'SUCCESS') %> > message: <% $.get('message', '') %> > execution: <% execution() %> > on-success: > - fail: <% $.get('status') = "FAILED" %> > > get_horizon_url: > > description: > > Retrieve the Horizon URL from the Overcloud stack. > > input: > - stack: overcloud > - queue_name: tripleo > > tags: > - tripleo-common-managed > > output: > horizon_url: <% $.horizon_url %> > > tasks: > get_horizon_url: > action: heat.stacks_get > input: > stack_id: <% $.stack %> > publish: > horizon_url: <% task().result.outputs.where($.output_key = "EndpointMap").output_value.HorizonPublic.uri.single() %> > on-success: notify_zaqar > publish-on-error: > status: FAILED > message: <% task().result %> > > notify_zaqar: > action: zaqar.queue_post > retry: count=5 delay=1 > input: > queue_name: <% $.queue_name %> > messages: > body: > type: tripleo.deployment.v1.get_horizon_url > payload: > horizon_url: <% $.get('horizon_url', '') %> > status: <% $.get('status', 'SUCCESS') %> > message: <% $.get('message', '') %> > execution: <% execution() %> > on-success: > - fail: <% $.get('status') = "FAILED" %> > > config_download_deploy: > > description: > > Configure the overcloud with config-download. 
> > input: > - timeout: 240 > - queue_name: tripleo > - plan_name: overcloud > - work_dir: /var/lib/mistral > - verbosity: 1 > > tags: > - tripleo-common-managed > > tasks: > > get_config: > action: tripleo.config.get_overcloud_config > input: > container: <% $.get('plan_name') %> > on-success: download_config > on-error: send_message > publish-on-error: > status: FAILED > message: <% task().result %> > > download_config: > action: tripleo.config.download_config > input: > work_dir: <% $.get('work_dir') %>/<% execution().id %> > on-success: send_msg_config_download > on-error: send_message > publish-on-error: > status: FAILED > message: <% task().result %> > > send_msg_config_download: > action: zaqar.queue_post > input: > queue_name: <% $.queue_name %> > messages: > body: > type: tripleo.deployment.v1.config_download > payload: > status: <% $.get('status', 'RUNNING') %> > message: Config downloaded at <% $.get('work_dir') %>/<% execution().id %> > execution: <% execution() %> > on-success: get_private_key > > get_private_key: > action: tripleo.validations.get_privkey > publish: > private_key: <% task().result %> > on-success: generate_inventory > on-error: send_message > publish-on-error: > status: FAILED > message: <% task().result %> > > generate_inventory: > action: tripleo.ansible-generate-inventory > input: > ansible_ssh_user: tripleo-admin > work_dir: <% $.get('work_dir') %>/<% execution().id %> > plan_name: <% $.get('plan_name') %> > publish: > inventory: <% task().result %> > on-success: send_msg_generate_inventory > on-error: send_message > publish-on-error: > status: FAILED > message: <% task().result %> > > send_msg_generate_inventory: > action: zaqar.queue_post > input: > queue_name: <% $.queue_name %> > messages: > body: > type: tripleo.deployment.v1.config_download > payload: > status: <% $.get('status', 'RUNNING') %> > message: Inventory generated at <% $.get('inventory') %> > execution: <% execution() %> > on-success: send_msg_run_ansible > > 
send_msg_run_ansible: > action: zaqar.queue_post > input: > queue_name: <% $.queue_name %> > messages: > body: > type: tripleo.deployment.v1.config_download > payload: > status: <% $.get('status', 'RUNNING') %> > message: > > Running ansible playbook at <% $.get('work_dir') %>/<% execution().id %>/deploy_steps_playbook.yaml. > See log file at <% $.get('work_dir') %>/<% execution().id %>/ansible.log for progress. > ... > execution: <% execution() %> > on-success: run_ansible > > run_ansible: > action: tripleo.ansible-playbook > input: > inventory: <% $.inventory %> > playbook: <% $.get('work_dir') %>/<% execution().id %>/deploy_steps_playbook.yaml > remote_user: tripleo-admin > ssh_extra_args: '-o StrictHostKeyChecking=no' > ssh_private_key: <% $.private_key %> > use_openstack_credentials: true > verbosity: <% $.get('verbosity') %> > become: true > timeout: <% $.timeout %> > work_dir: <% $.get('work_dir') %>/<% execution().id %> > queue_name: <% $.queue_name %> > reproduce_command: true > trash_output: true > publish: > log_path: <% task(run_ansible).result.get('log_path') %> > on-success: > - ansible_passed: <% task().result.returncode = 0 %> > - ansible_failed: <% task().result.returncode != 0 %> > on-error: send_message > publish-on-error: > status: FAILED > message: Ansible failed, check log at <% $.get('work_dir') %>/<% execution().id %>/ansible.log. > > ansible_passed: > on-success: send_message > publish: > status: SUCCESS > message: Ansible passed. > > ansible_failed: > on-success: send_message > publish: > status: FAILED > message: Ansible failed, check log at <% $.get('work_dir') %>/<% execution().id %>/ansible.log. 
> > send_message: > action: zaqar.queue_post > retry: count=5 delay=1 > input: > queue_name: <% $.queue_name %> > messages: > body: > type: tripleo.deployment.v1.config_download > payload: > status: <% $.get('status', 'SUCCESS') %> > message: <% $.get('message', '') %> > execution: <% execution() %> > on-success: > - fail: <% $.get('status') = "FAILED" %> >' >2018-08-21 16:36:51,477 DEBUG: http://192.168.24.1:8989 "POST /v2/workbooks HTTP/1.1" 201 13556 >2018-08-21 16:36:51,483 DEBUG: RESP: [201] Content-Length: 13556 Content-Type: application/json Date: Tue, 21 Aug 2018 13:36:51 GMT Connection: keep-alive >RESP BODY: {"definition": "---\nversion: '2.0'\nname: tripleo.deployment.v1\ndescription: TripleO deployment workflows\n\nworkflows:\n\n deploy_on_server:\n\n input:\n - server_uuid\n - server_name\n - config\n - config_name\n - group\n - queue_name: tripleo\n\n tags:\n - tripleo-common-managed\n\n tasks:\n\n deploy_config:\n action: tripleo.deployment.config\n on-complete: send_message\n input:\n server_id: <% $.server_uuid %>\n name: <% $.config_name %>\n config: <% $.config %>\n group: <% $.group %>\n publish:\n stdout: <% task().result.deploy_stdout %>\n stderr: <% task().result.deploy_stderr %>\n status_code: <% task().result.deploy_status_code %>\n publish-on-error:\n status: FAILED\n message: <% task().result %>\n\n send_message:\n action: zaqar.queue_post\n retry: count=5 delay=1\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.deployment.v1.deploy_on_server\n payload:\n status: <% $.get(\"status\", \"SUCCESS\") %>\n message: <% $.get(\"message\", \"\") %>\n server_uuid: <% $.server_uuid %>\n server_name: <% $.server_name %>\n config_name: <% $.config_name %>\n status_code: <% $.get(\"status_code\", \"\") %>\n stdout: <% $.get(\"stdout\", \"\") %>\n stderr: <% $.get(\"stderr\", \"\") %>\n execution: <% execution() %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n\n deploy_on_servers:\n\n input:\n - server_name\n 
- config_name\n - config\n - group: script\n - queue_name: tripleo\n\n tags:\n - tripleo-common-managed\n\n tasks:\n\n check_if_all_servers:\n on-success:\n - get_servers_matching: <% $.server_name != \"all\" %>\n - get_all_servers: <% $.server_name = \"all\" %>\n\n get_servers_matching:\n action: nova.servers_list\n on-success: deploy_on_servers\n publish:\n servers_with_name: <% task().result._info.where($.name.indexOf(execution().input.server_name) > -1) %>\n\n get_all_servers:\n action: nova.servers_list\n on-success: deploy_on_servers\n publish:\n servers_with_name: <% task().result._info %>\n\n deploy_on_servers:\n on-success: send_success_message\n on-error: send_failed_message\n with-items: server in <% $.servers_with_name %>\n workflow: tripleo.deployment.v1.deploy_on_server\n input:\n server_name: <% $.server.name %>\n server_uuid: <% $.server.id %>\n config: <% $.config %>\n config_name: <% $.config_name %>\n group: <% $.group %>\n queue_name: <% $.queue_name %>\n\n send_success_message:\n action: zaqar.queue_post\n retry: count=5 delay=1\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.deployment.v1.deploy_on_servers\n payload:\n status: SUCCESS\n execution: <% execution() %>\n\n send_failed_message:\n action: zaqar.queue_post\n retry: count=5 delay=1\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.deployment.v1.deploy_on_servers\n payload:\n status: FAILED\n message: <% task(deploy_on_servers).result %>\n execution: <% execution() %>\n on-success: fail\n\n deploy_plan:\n\n description: >\n Deploy the overcloud for a plan.\n\n input:\n - container\n - run_validations: False\n - timeout: 240\n - skip_deploy_identifier: False\n - queue_name: tripleo\n\n tags:\n - tripleo-common-managed\n\n tasks:\n\n add_validation_ssh_key:\n workflow: tripleo.validations.v1.add_validation_ssh_key_parameter\n input:\n container: <% $.container %>\n queue_name: <% $.queue_name %>\n on-complete:\n - run_validations: 
<% $.run_validations %>\n - create_swift_rings_backup_plan: <% not $.run_validations %>\n\n run_validations:\n workflow: tripleo.validations.v1.run_groups\n input:\n group_names:\n - 'pre-deployment'\n plan: <% $.container %>\n queue_name: <% $.queue_name %>\n on-success: create_swift_rings_backup_plan\n on-error: set_validations_failed\n\n set_validations_failed:\n on-success: send_message\n publish:\n status: FAILED\n message: <% task(run_validations).result %>\n\n create_swift_rings_backup_plan:\n workflow: tripleo.swift_rings_backup.v1.create_swift_rings_backup_container_plan\n on-success: cell_v2_discover_hosts\n on-error: create_swift_rings_backup_plan_set_status_failed\n input:\n container: <% $.container %>\n queue_name: <% $.queue_name %>\n use_default_templates: true\n\n cell_v2_discover_hosts:\n on-success: deploy\n on-error: cell_v2_discover_hosts_failed\n action: tripleo.baremetal.cell_v2_discover_hosts\n\n cell_v2_discover_hosts_failed:\n on-success: send_message\n publish:\n status: FAILED\n message: <% task(cell_v2_discover_hosts).result %>\n\n deploy:\n action: tripleo.deployment.deploy\n input:\n timeout: <% $.timeout %>\n container: <% $.container %>\n skip_deploy_identifier: <% $.skip_deploy_identifier %>\n on-success: send_message\n on-error: set_deployment_failed\n\n create_swift_rings_backup_plan_set_status_failed:\n on-success: send_message\n publish:\n status: FAILED\n message: <% task(create_swift_rings_backup_plan).result %>\n\n set_deployment_failed:\n on-success: send_message\n publish:\n status: FAILED\n message: <% task(deploy).result %>\n\n send_message:\n action: zaqar.queue_post\n retry: count=5 delay=1\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.deployment.v1.deploy_plan\n payload:\n status: <% $.get('status', 'SUCCESS') %>\n message: <% $.get('message', '') %>\n execution: <% execution() %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n\n get_horizon_url:\n\n description: >\n 
Retrieve the Horizon URL from the Overcloud stack.\n\n input:\n - stack: overcloud\n - queue_name: tripleo\n\n tags:\n - tripleo-common-managed\n\n output:\n horizon_url: <% $.horizon_url %>\n\n tasks:\n get_horizon_url:\n action: heat.stacks_get\n input:\n stack_id: <% $.stack %>\n publish:\n horizon_url: <% task().result.outputs.where($.output_key = \"EndpointMap\").output_value.HorizonPublic.uri.single() %>\n on-success: notify_zaqar\n publish-on-error:\n status: FAILED\n message: <% task().result %>\n\n notify_zaqar:\n action: zaqar.queue_post\n retry: count=5 delay=1\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.deployment.v1.get_horizon_url\n payload:\n horizon_url: <% $.get('horizon_url', '') %>\n status: <% $.get('status', 'SUCCESS') %>\n message: <% $.get('message', '') %>\n execution: <% execution() %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n\n config_download_deploy:\n\n description: >\n Configure the overcloud with config-download.\n\n input:\n - timeout: 240\n - queue_name: tripleo\n - plan_name: overcloud\n - work_dir: /var/lib/mistral\n - verbosity: 1\n\n tags:\n - tripleo-common-managed\n\n tasks:\n\n get_config:\n action: tripleo.config.get_overcloud_config\n input:\n container: <% $.get('plan_name') %>\n on-success: download_config\n on-error: send_message\n publish-on-error:\n status: FAILED\n message: <% task().result %>\n\n download_config:\n action: tripleo.config.download_config\n input:\n work_dir: <% $.get('work_dir') %>/<% execution().id %>\n on-success: send_msg_config_download\n on-error: send_message\n publish-on-error:\n status: FAILED\n message: <% task().result %>\n\n send_msg_config_download:\n action: zaqar.queue_post\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.deployment.v1.config_download\n payload:\n status: <% $.get('status', 'RUNNING') %>\n message: Config downloaded at <% $.get('work_dir') %>/<% execution().id %>\n execution: <% execution() 
%>\n on-success: get_private_key\n\n get_private_key:\n action: tripleo.validations.get_privkey\n publish:\n private_key: <% task().result %>\n on-success: generate_inventory\n on-error: send_message\n publish-on-error:\n status: FAILED\n message: <% task().result %>\n\n generate_inventory:\n action: tripleo.ansible-generate-inventory\n input:\n ansible_ssh_user: tripleo-admin\n work_dir: <% $.get('work_dir') %>/<% execution().id %>\n plan_name: <% $.get('plan_name') %>\n publish:\n inventory: <% task().result %>\n on-success: send_msg_generate_inventory\n on-error: send_message\n publish-on-error:\n status: FAILED\n message: <% task().result %>\n\n send_msg_generate_inventory:\n action: zaqar.queue_post\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.deployment.v1.config_download\n payload:\n status: <% $.get('status', 'RUNNING') %>\n message: Inventory generated at <% $.get('inventory') %>\n execution: <% execution() %>\n on-success: send_msg_run_ansible\n\n send_msg_run_ansible:\n action: zaqar.queue_post\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.deployment.v1.config_download\n payload:\n status: <% $.get('status', 'RUNNING') %>\n message: >\n Running ansible playbook at <% $.get('work_dir') %>/<% execution().id %>/deploy_steps_playbook.yaml.\n See log file at <% $.get('work_dir') %>/<% execution().id %>/ansible.log for progress.\n ...\n execution: <% execution() %>\n on-success: run_ansible\n\n run_ansible:\n action: tripleo.ansible-playbook\n input:\n inventory: <% $.inventory %>\n playbook: <% $.get('work_dir') %>/<% execution().id %>/deploy_steps_playbook.yaml\n remote_user: tripleo-admin\n ssh_extra_args: '-o StrictHostKeyChecking=no'\n ssh_private_key: <% $.private_key %>\n use_openstack_credentials: true\n verbosity: <% $.get('verbosity') %>\n become: true\n timeout: <% $.timeout %>\n work_dir: <% $.get('work_dir') %>/<% execution().id %>\n queue_name: <% $.queue_name %>\n 
reproduce_command: true\n trash_output: true\n publish:\n log_path: <% task(run_ansible).result.get('log_path') %>\n on-success:\n - ansible_passed: <% task().result.returncode = 0 %>\n - ansible_failed: <% task().result.returncode != 0 %>\n on-error: send_message\n publish-on-error:\n status: FAILED\n message: Ansible failed, check log at <% $.get('work_dir') %>/<% execution().id %>/ansible.log.\n\n ansible_passed:\n on-success: send_message\n publish:\n status: SUCCESS\n message: Ansible passed.\n\n ansible_failed:\n on-success: send_message\n publish:\n status: FAILED\n message: Ansible failed, check log at <% $.get('work_dir') %>/<% execution().id %>/ansible.log.\n\n send_message:\n action: zaqar.queue_post\n retry: count=5 delay=1\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: tripleo.deployment.v1.config_download\n payload:\n status: <% $.get('status', 'SUCCESS') %>\n message: <% $.get('message', '') %>\n execution: <% execution() %>\n on-success:\n - fail: <% $.get('status') = \"FAILED\" %>\n", "name": "tripleo.deployment.v1", "tags": [], "created_at": "2018-08-21 13:36:51", "scope": "private", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "430ae731-dbc1-418d-b757-7134e5867bed"} > >2018-08-21 16:36:51,484 DEBUG: HTTP POST http://192.168.24.1:8989/v2/workbooks 201 >2018-08-21 16:36:51,488 DEBUG: REQ: curl -g -i -X POST http://192.168.24.1:8989/v2/workbooks -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "content-type: text/plain" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" -d '--- >version: '2.0' >name: tripleo.derive_params_formulas.v1 >description: TripleO Workflows to derive deployment parameters from the introspected data > >workflows: > > > dpdk_derive_params: > description: > > Workflow to derive parameters for DPDK service. 
> input: > - plan > - role_name > - hw_data # introspection data > - user_inputs > - derived_parameters: {} > > output: > derived_parameters: <% $.derived_parameters.mergeWith($.get('dpdk_parameters', {})) %> > > tags: > - tripleo-common-managed > > tasks: > get_network_config: > action: tripleo.parameters.get_network_config > input: > container: <% $.plan %> > role_name: <% $.role_name %> > publish: > network_configs: <% task().result.get('network_config', []) %> > on-success: get_dpdk_nics_numa_info > on-error: set_status_failed_get_network_config > > get_dpdk_nics_numa_info: > action: tripleo.derive_params.get_dpdk_nics_numa_info > input: > network_configs: <% $.network_configs %> > inspect_data: <% $.hw_data %> > publish: > dpdk_nics_numa_info: <% task().result %> > on-success: > # TODO: Need to remove condtions here > # adding condition and throw error in action for empty check > - get_dpdk_nics_numa_nodes: <% $.dpdk_nics_numa_info %> > - set_status_failed_get_dpdk_nics_numa_info: <% not $.dpdk_nics_numa_info %> > on-error: set_status_failed_on_error_get_dpdk_nics_numa_info > > get_dpdk_nics_numa_nodes: > publish: > dpdk_nics_numa_nodes: <% $.dpdk_nics_numa_info.groupBy($.numa_node).select($[0]).orderBy($) %> > on-success: > - get_numa_nodes: <% $.dpdk_nics_numa_nodes %> > - set_status_failed_get_dpdk_nics_numa_nodes: <% not $.dpdk_nics_numa_nodes %> > > get_numa_nodes: > publish: > numa_nodes: <% $.hw_data.numa_topology.ram.select($.numa_node).orderBy($) %> > on-success: > - get_num_phy_cores_per_numa_for_pmd: <% $.numa_nodes %> > - set_status_failed_get_numa_nodes: <% not $.numa_nodes %> > > get_num_phy_cores_per_numa_for_pmd: > publish: > num_phy_cores_per_numa_node_for_pmd: <% $.user_inputs.get('num_phy_cores_per_numa_node_for_pmd', 0) %> > on-success: > - get_num_cores_per_numa_nodes: <% isInteger($.num_phy_cores_per_numa_node_for_pmd) and $.num_phy_cores_per_numa_node_for_pmd > 0 %> > - set_status_failed_get_num_phy_cores_per_numa_for_pmd_invalid: <% not 
isInteger($.num_phy_cores_per_numa_node_for_pmd) %> > - set_status_failed_get_num_phy_cores_per_numa_for_pmd_not_provided: <% $.num_phy_cores_per_numa_node_for_pmd = 0 %> > > # For NUMA node with DPDK nic, number of cores should be used from user input > # For NUMA node without DPDK nic, number of cores should be 1 > get_num_cores_per_numa_nodes: > publish: > num_cores_per_numa_nodes: <% let(dpdk_nics_nodes => $.dpdk_nics_numa_nodes, cores => $.num_phy_cores_per_numa_node_for_pmd) -> $.numa_nodes.select(switch($ in $dpdk_nics_nodes => $cores, not $ in $dpdk_nics_nodes => 1)) %> > on-success: get_pmd_cpus > > get_pmd_cpus: > action: tripleo.derive_params.get_dpdk_core_list > input: > inspect_data: <% $.hw_data %> > numa_nodes_cores_count: <% $.num_cores_per_numa_nodes %> > publish: > pmd_cpus: <% task().result %> > on-success: > - get_pmd_cpus_range_list: <% $.pmd_cpus %> > - set_status_failed_get_pmd_cpus: <% not $.pmd_cpus %> > on-error: set_status_failed_on_error_get_pmd_cpus > > get_pmd_cpus_range_list: > action: tripleo.derive_params.convert_number_to_range_list > input: > num_list: <% $.pmd_cpus %> > publish: > pmd_cpus: <% task().result %> > on-success: get_host_cpus > on-error: set_status_failed_get_pmd_cpus_range_list > > get_host_cpus: > workflow: tripleo.derive_params_formulas.v1.get_host_cpus > input: > role_name: <% $.role_name %> > hw_data: <% $.hw_data %> > publish: > host_cpus: <% task().result.get('host_cpus', '') %> > on-success: get_sock_mem > on-error: set_status_failed_get_host_cpus > > get_sock_mem: > action: tripleo.derive_params.get_dpdk_socket_memory > input: > dpdk_nics_numa_info: <% $.dpdk_nics_numa_info %> > numa_nodes: <% $.numa_nodes %> > overhead: <% $.user_inputs.get('overhead', 800) %> > packet_size_in_buffer: <% 4096*64 %> > publish: > sock_mem: <% task().result %> > on-success: > - get_dpdk_parameters: <% $.sock_mem %> > - set_status_failed_get_sock_mem: <% not $.sock_mem %> > on-error: set_status_failed_on_error_get_sock_mem > > 
>      get_dpdk_parameters:
>        publish:
>          dpdk_parameters: <% dict(concat($.role_name, 'Parameters') => dict('OvsPmdCoreList' => $.get('pmd_cpus', ''), 'OvsDpdkCoreList' => $.get('host_cpus', ''), 'OvsDpdkSocketMemory' => $.get('sock_mem', ''))) %>
>
>      set_status_failed_get_network_config:
>        publish:
>          status: FAILED
>          message: <% task(get_network_config).result %>
>        on-success: fail
>
>      set_status_failed_get_dpdk_nics_numa_info:
>        publish:
>          status: FAILED
>          message: "Unable to determine DPDK NIC's NUMA information"
>        on-success: fail
>
>      set_status_failed_on_error_get_dpdk_nics_numa_info:
>        publish:
>          status: FAILED
>          message: <% task(get_dpdk_nics_numa_info).result %>
>        on-success: fail
>
>      set_status_failed_get_dpdk_nics_numa_nodes:
>        publish:
>          status: FAILED
>          message: "Unable to determine DPDK NIC's NUMA nodes"
>        on-success: fail
>
>      set_status_failed_get_numa_nodes:
>        publish:
>          status: FAILED
>          message: 'Unable to determine available NUMA nodes'
>        on-success: fail
>
>      set_status_failed_get_num_phy_cores_per_numa_for_pmd_invalid:
>        publish:
>          status: FAILED
>          message: <% "num_phy_cores_per_numa_node_for_pmd user input '{0}' is invalid".format($.num_phy_cores_per_numa_node_for_pmd) %>
>        on-success: fail
>
>      set_status_failed_get_num_phy_cores_per_numa_for_pmd_not_provided:
>        publish:
>          status: FAILED
>          message: 'num_phy_cores_per_numa_node_for_pmd user input is not provided'
>        on-success: fail
>
>      set_status_failed_get_pmd_cpus:
>        publish:
>          status: FAILED
>          message: 'Unable to determine OvsPmdCoreList parameter'
>        on-success: fail
>
>      set_status_failed_on_error_get_pmd_cpus:
>        publish:
>          status: FAILED
>          message: <% task(get_pmd_cpus).result %>
>        on-success: fail
>
>      set_status_failed_get_pmd_cpus_range_list:
>        publish:
>          status: FAILED
>          message: <% task(get_pmd_cpus_range_list).result %>
>        on-success: fail
>
>      set_status_failed_get_host_cpus:
>        publish:
>          status: FAILED
>          message: <% task(get_host_cpus).result.get('message', '') %>
>        on-success: fail
>
>      set_status_failed_get_sock_mem:
>        publish:
>          status: FAILED
>          message: 'Unable to determine OvsDpdkSocketMemory parameter'
>        on-success: fail
>
>      set_status_failed_on_error_get_sock_mem:
>        publish:
>          status: FAILED
>          message: <% task(get_sock_mem).result %>
>        on-success: fail
>
>
>  sriov_derive_params:
>    description: >
>      This workflow derives parameters for the SR-IOV feature.
>
>    input:
>      - role_name
>      - hw_data # introspection data
>      - derived_parameters: {}
>
>    output:
>      derived_parameters: <% $.derived_parameters.mergeWith($.get('sriov_parameters', {})) %>
>
>    tags:
>      - tripleo-common-managed
>
>    tasks:
>      get_host_cpus:
>        workflow: tripleo.derive_params_formulas.v1.get_host_cpus
>        input:
>          role_name: <% $.role_name %>
>          hw_data: <% $.hw_data %>
>        publish:
>          host_cpus: <% task().result.get('host_cpus', '') %>
>        on-success: get_sriov_parameters
>        on-error: set_status_failed_get_host_cpus
>
>      get_sriov_parameters:
>        publish:
>          # The SriovHostCpusList parameter is added temporarily and is removed later from the derived parameters result.
>          sriov_parameters: <% dict(concat($.role_name, 'Parameters') => dict('SriovHostCpusList' => $.get('host_cpus', ''))) %>
>
>      set_status_failed_get_host_cpus:
>        publish:
>          status: FAILED
>          message: <% task(get_host_cpus).result.get('message', '') %>
>        on-success: fail
>
>
>  get_host_cpus:
>    description: >
>      Fetches the host CPU list from the introspection data and converts the raw list into a range list.
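Several tasks in these workflows call tripleo.derive_params.convert_number_to_range_list to compress a comma-separated CPU list into range form. A rough Python sketch of that conversion, inferred from the examples in the workflow comments (the real action is implemented in tripleo-common; this stand-in is only illustrative):

```python
def number_list_to_range_list(num_list):
    # Compress a comma-separated number string into ranges, e.g. '0,1,2,3,8' -> '0-3,8'
    nums = sorted(int(n) for n in num_list.split(','))
    ranges = []
    start = prev = nums[0]
    for n in nums[1:]:
        if n != prev + 1:  # a gap ends the current consecutive run
            ranges.append(f"{start}-{prev}" if start != prev else str(start))
            start = n
        prev = n
    ranges.append(f"{start}-{prev}" if start != prev else str(start))
    return ','.join(ranges)

print(number_list_to_range_list('0,1,2,3,8'))  # 0-3,8
```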
>
>    input:
>      - hw_data # introspection data
>
>    output:
>      host_cpus: <% $.get('host_cpus', '') %>
>
>    tags:
>      - tripleo-common-managed
>
>    tasks:
>      get_host_cpus:
>        action: tripleo.derive_params.get_host_cpus_list inspect_data=<% $.hw_data %>
>        publish:
>          host_cpus: <% task().result %>
>        on-success:
>          - get_host_cpus_range_list: <% $.host_cpus %>
>          - set_status_failed_get_host_cpus: <% not $.host_cpus %>
>        on-error: set_status_failed_on_error_get_host_cpus
>
>      get_host_cpus_range_list:
>        action: tripleo.derive_params.convert_number_to_range_list
>        input:
>          num_list: <% $.host_cpus %>
>        publish:
>          host_cpus: <% task().result %>
>        on-error: set_status_failed_get_host_cpus_range_list
>
>      set_status_failed_get_host_cpus:
>        publish:
>          status: FAILED
>          message: 'Unable to determine host cpus'
>        on-success: fail
>
>      set_status_failed_on_error_get_host_cpus:
>        publish:
>          status: FAILED
>          message: <% task(get_host_cpus).result %>
>        on-success: fail
>
>      set_status_failed_get_host_cpus_range_list:
>        publish:
>          status: FAILED
>          message: <% task(get_host_cpus_range_list).result %>
>        on-success: fail
>
>
>  host_derive_params:
>    description: >
>      This workflow derives parameters for the Host process, mainly CPU pinning and huge memory pages.
>      It can be a dependency of any feature workflow, or be invoked individually.
>
>    input:
>      - role_name
>      - hw_data # introspection data
>      - user_inputs
>      - derived_parameters: {}
>
>    output:
>      derived_parameters: <% $.derived_parameters.mergeWith($.get('host_parameters', {})) %>
>
>    tags:
>      - tripleo-common-managed
>
>    tasks:
>      get_cpus:
>        publish:
>          cpus: <% $.hw_data.numa_topology.cpus %>
>        on-success:
>          - get_role_derive_params: <% $.cpus %>
>          - set_status_failed_get_cpus: <% not $.cpus %>
>
>      get_role_derive_params:
>        publish:
>          role_derive_params: <% $.derived_parameters.get(concat($.role_name, 'Parameters'), {}) %>
>          # Remove the role parameters (e.g. ComputeParameters) from the derived_parameters
>          # dictionary, since they are already copied into role_derive_params.
>          derived_parameters: <% $.derived_parameters.delete(concat($.role_name, 'Parameters')) %>
>        on-success: get_host_cpus
>
>      get_host_cpus:
>        publish:
>          host_cpus: <% $.role_derive_params.get('OvsDpdkCoreList', '') or $.role_derive_params.get('SriovHostCpusList', '') %>
>          # The SriovHostCpusList parameter is added temporarily for host_cpus and is not needed
>          # in the derived_parameters result. It is deleted from the role parameters, and the
>          # updated role parameters are added back into derived_parameters.
>          derived_parameters: <% $.derived_parameters + dict(concat($.role_name, 'Parameters') => $.role_derive_params.delete('SriovHostCpusList')) %>
>        on-success: get_host_dpdk_combined_cpus
>
>      get_host_dpdk_combined_cpus:
>        publish:
>          host_dpdk_combined_cpus: <% let(pmd_cpus => $.role_derive_params.get('OvsPmdCoreList', '')) -> switch($pmd_cpus => concat($pmd_cpus, ',', $.host_cpus), not $pmd_cpus => $.host_cpus) %>
>          reserved_cpus: []
>        on-success:
>          - get_host_dpdk_combined_cpus_num_list: <% $.host_dpdk_combined_cpus %>
>          - set_status_failed_get_host_dpdk_combined_cpus: <% not $.host_dpdk_combined_cpus %>
>
>      get_host_dpdk_combined_cpus_num_list:
>        action: tripleo.derive_params.convert_range_to_number_list
>        input:
>          range_list: <% $.host_dpdk_combined_cpus %>
>        publish:
>          host_dpdk_combined_cpus: <% task().result %>
>          reserved_cpus: <% task().result.split(',') %>
>        on-success: get_nova_cpus
>        on-error: set_status_failed_get_host_dpdk_combined_cpus_num_list
>
>      get_nova_cpus:
>        publish:
>          nova_cpus: <% let(reserved_cpus => $.reserved_cpus) -> $.cpus.select($.thread_siblings).flatten().where(not (str($) in $reserved_cpus)).join(',') %>
>        on-success:
>          - get_isol_cpus: <% $.nova_cpus %>
>          - set_status_failed_get_nova_cpus: <% not $.nova_cpus %>
>
>      # Concatenates OvsPmdCoreList and NovaVcpuPinSet, both in range format.
>      # The result may not be in perfect range format.
>      # example: concatenating '12-15,19' and '16-18' gives '12-15,19,16-18'
>      get_isol_cpus:
>        publish:
>          isol_cpus: <% let(pmd_cpus => $.role_derive_params.get('OvsPmdCoreList','')) -> switch($pmd_cpus => concat($pmd_cpus, ',', $.nova_cpus), not $pmd_cpus => $.nova_cpus) %>
>        on-success: get_isol_cpus_num_list
>
>      # Gets the isol_cpus as a number list
>      # example: '12-15,19,16-18' into '12,13,14,15,16,17,18,19'
>      get_isol_cpus_num_list:
>        action: tripleo.derive_params.convert_range_to_number_list
>        input:
>          range_list: <% $.isol_cpus %>
>        publish:
>          isol_cpus: <% task().result %>
>        on-success: get_nova_cpus_range_list
>        on-error: set_status_failed_get_isol_cpus_num_list
>
>      get_nova_cpus_range_list:
>        action: tripleo.derive_params.convert_number_to_range_list
>        input:
>          num_list: <% $.nova_cpus %>
>        publish:
>          nova_cpus: <% task().result %>
>        on-success: get_isol_cpus_range_list
>        on-error: set_status_failed_get_nova_cpus_range_list
>
>      # Converts number format isol_cpus into range format
>      # example: '12,13,14,15,16,17,18,19' into '12-19'
>      get_isol_cpus_range_list:
>        action: tripleo.derive_params.convert_number_to_range_list
>        input:
>          num_list: <% $.isol_cpus %>
>        publish:
>          isol_cpus: <% task().result %>
>        on-success: get_host_mem
>        on-error: set_status_failed_get_isol_cpus_range_list
>
>      get_host_mem:
>        publish:
>          host_mem: <% $.user_inputs.get('host_mem_default', 4096) %>
>        on-success: check_default_hugepage_supported
>
>      check_default_hugepage_supported:
>        publish:
>          default_hugepage_supported: <% $.hw_data.get('inventory', {}).get('cpu', {}).get('flags', []).contains('pdpe1gb') %>
>        on-success:
>          - get_total_memory: <% $.default_hugepage_supported %>
>          - set_status_failed_check_default_hugepage_supported: <% not $.default_hugepage_supported %>
>
>      get_total_memory:
>        publish:
>          total_memory: <% $.hw_data.get('inventory', {}).get('memory', {}).get('physical_mb', 0) %>
>        on-success:
>          - get_hugepage_allocation_percentage: <% $.total_memory %>
>          - set_status_failed_get_total_memory: <% not $.total_memory %>
>
>      get_hugepage_allocation_percentage:
>        publish:
>          huge_page_allocation_percentage: <% $.user_inputs.get('huge_page_allocation_percentage', 0) %>
>        on-success:
>          - get_hugepages: <% isInteger($.huge_page_allocation_percentage) and $.huge_page_allocation_percentage > 0 %>
>          - set_status_failed_get_hugepage_allocation_percentage_invalid: <% not isInteger($.huge_page_allocation_percentage) %>
>          - set_status_failed_get_hugepage_allocation_percentage_not_provided: <% $.huge_page_allocation_percentage = 0 %>
>
>      get_hugepages:
>        publish:
>          hugepages: <% let(huge_page_perc => float($.huge_page_allocation_percentage)/100) -> int((($.total_memory/1024)-4) * $huge_page_perc) %>
>        on-success:
>          - get_cpu_model: <% $.hugepages %>
>          - set_status_failed_get_hugepages: <% not $.hugepages %>
>
>      get_cpu_model:
>        publish:
>          intel_cpu_model: <% $.hw_data.get('inventory', {}).get('cpu', {}).get('model_name', '').startsWith('Intel') %>
>        on-success: get_iommu_info
>
>      get_iommu_info:
>        publish:
>          iommu_info: <% switch($.intel_cpu_model => 'intel_iommu=on iommu=pt', not $.intel_cpu_model => '') %>
>        on-success: get_kernel_args
>
>      get_kernel_args:
>        publish:
>          kernel_args: <% concat('default_hugepagesz=1GB hugepagesz=1G ', 'hugepages=', str($.hugepages), ' ', $.iommu_info, ' isolcpus=', $.isol_cpus) %>
>        on-success: get_host_parameters
>
>      get_host_parameters:
>        publish:
>          host_parameters: <% dict(concat($.role_name, 'Parameters') => dict('NovaVcpuPinSet' => $.get('nova_cpus', ''), 'NovaReservedHostMemory' => $.get('host_mem', ''), 'KernelArgs' => $.get('kernel_args', ''), 'IsolCpusList' => $.get('isol_cpus', ''))) %>
>
>      set_status_failed_get_cpus:
>        publish:
>          status: FAILED
>          message: "Unable to determine CPU's on NUMA nodes"
>        on-success: fail
>
>      set_status_failed_get_host_dpdk_combined_cpus:
>        publish:
>          status: FAILED
>          message: 'Unable to combine host and dpdk cpus list'
>        on-success: fail
>
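The get_hugepages and get_kernel_args steps above reduce to simple arithmetic and string assembly: reserve a percentage of total memory, minus roughly 4 GB for the host, as 1 GB huge pages. A Python sketch of the same YAQL expressions (the 192 GB node and 50% allocation are made-up example values):

```python
def derive_hugepages(total_memory_mb, huge_page_allocation_percentage):
    # int(((total_memory / 1024) - 4) * percentage / 100), as in the YAQL above
    return int(((total_memory_mb / 1024) - 4) * (huge_page_allocation_percentage / 100.0))

def kernel_args(hugepages, iommu_info, isol_cpus):
    # Mirrors the concat() in get_kernel_args
    return ('default_hugepagesz=1GB hugepagesz=1G hugepages=%d %s isolcpus=%s'
            % (hugepages, iommu_info, isol_cpus))

hp = derive_hugepages(196608, 50)  # 192 GB node, 50% allocated to huge pages
print(hp)                          # 94
print(kernel_args(hp, 'intel_iommu=on iommu=pt', '2-19'))
```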
>      set_status_failed_get_host_dpdk_combined_cpus_num_list:
>        publish:
>          status: FAILED
>          message: <% task(get_host_dpdk_combined_cpus_num_list).result %>
>        on-success: fail
>
>      set_status_failed_get_nova_cpus:
>        publish:
>          status: FAILED
>          message: 'Unable to determine nova vcpu pin set'
>        on-success: fail
>
>      set_status_failed_get_nova_cpus_range_list:
>        publish:
>          status: FAILED
>          message: <% task(get_nova_cpus_range_list).result %>
>        on-success: fail
>
>      set_status_failed_get_isol_cpus_num_list:
>        publish:
>          status: FAILED
>          message: <% task(get_isol_cpus_num_list).result %>
>        on-success: fail
>
>      set_status_failed_get_isol_cpus_range_list:
>        publish:
>          status: FAILED
>          message: <% task(get_isol_cpus_range_list).result %>
>        on-success: fail
>
>      set_status_failed_check_default_hugepage_supported:
>        publish:
>          status: FAILED
>          message: 'default huge page size 1GB is not supported'
>        on-success: fail
>
>      set_status_failed_get_total_memory:
>        publish:
>          status: FAILED
>          message: 'Unable to determine total memory'
>        on-success: fail
>
>      set_status_failed_get_hugepage_allocation_percentage_invalid:
>        publish:
>          status: FAILED
>          message: <% "huge_page_allocation_percentage user input '{0}' is invalid".format($.huge_page_allocation_percentage) %>
>        on-success: fail
>
>      set_status_failed_get_hugepage_allocation_percentage_not_provided:
>        publish:
>          status: FAILED
>          message: 'huge_page_allocation_percentage user input is not provided'
>        on-success: fail
>
>      set_status_failed_get_hugepages:
>        publish:
>          status: FAILED
>          message: 'Unable to determine huge pages'
>        on-success: fail
>
>
>  hci_derive_params:
>    description: Derive the deployment parameters for HCI
>    input:
>      - role_name
>      - environment_parameters
>      - heat_resource_tree
>      - introspection_data
>      - user_inputs
>      - derived_parameters: {}
>
>    output:
>      derived_parameters: <% $.derived_parameters.mergeWith($.get('hci_parameters', {})) %>
>
>    tags:
>      - tripleo-common-managed
>
>    tasks:
>      get_hci_inputs:
>        publish:
>          hci_profile: <% $.user_inputs.get('hci_profile', '') %>
>          hci_profile_config: <% $.user_inputs.get('hci_profile_config', {}) %>
>          MB_PER_GB: 1024
>        on-success:
>          - get_average_guest_memory_size_in_mb: <% $.hci_profile and $.hci_profile_config.get($.hci_profile, {}) %>
>          - set_failed_invalid_hci_profile: <% $.hci_profile and not $.hci_profile_config.get($.hci_profile, {}) %>
>          # When no hci_profile is specified, the workflow terminates without deriving any HCI parameters.
>
>      get_average_guest_memory_size_in_mb:
>        publish:
>          average_guest_memory_size_in_mb: <% $.hci_profile_config.get($.hci_profile, {}).get('average_guest_memory_size_in_mb', 0) %>
>        on-success:
>          - get_average_guest_cpu_utilization_percentage: <% isInteger($.average_guest_memory_size_in_mb) %>
>          - set_failed_invalid_average_guest_memory_size_in_mb: <% not isInteger($.average_guest_memory_size_in_mb) %>
>
>      get_average_guest_cpu_utilization_percentage:
>        publish:
>          average_guest_cpu_utilization_percentage: <% $.hci_profile_config.get($.hci_profile, {}).get('average_guest_cpu_utilization_percentage', 0) %>
>        on-success:
>          - get_gb_overhead_per_guest: <% isInteger($.average_guest_cpu_utilization_percentage) %>
>          - set_failed_invalid_average_guest_cpu_utilization_percentage: <% not isInteger($.average_guest_cpu_utilization_percentage) %>
>
>      get_gb_overhead_per_guest:
>        publish:
>          gb_overhead_per_guest: <% $.user_inputs.get('gb_overhead_per_guest', 0.5) %>
>        on-success:
>          - get_gb_per_osd: <% isNumber($.gb_overhead_per_guest) %>
>          - set_failed_invalid_gb_overhead_per_guest: <% not isNumber($.gb_overhead_per_guest) %>
>
>      get_gb_per_osd:
>        publish:
>          gb_per_osd: <% $.user_inputs.get('gb_per_osd', 5) %>
>        on-success:
>          - get_cores_per_osd: <% isNumber($.gb_per_osd) %>
>          - set_failed_invalid_gb_per_osd: <% not isNumber($.gb_per_osd) %>
>
>      get_cores_per_osd:
>        publish:
>          cores_per_osd: <% $.user_inputs.get('cores_per_osd', 1.0) %>
>        on-success:
>          - get_extra_configs: <% isNumber($.cores_per_osd) %>
>          - set_failed_invalid_cores_per_osd: <% not isNumber($.cores_per_osd) %>
>
>      get_extra_configs:
>        publish:
>          extra_config: <% $.environment_parameters.get('ExtraConfig', {}) %>
>          role_extra_config: <% $.environment_parameters.get(concat($.role_name, 'ExtraConfig'), {}) %>
>          role_env_params: <% $.environment_parameters.get(concat($.role_name, 'Parameters'), {}) %>
>          role_derive_params: <% $.derived_parameters.get(concat($.role_name, 'Parameters'), {}) %>
>        on-success: get_num_osds
>
>      get_num_osds:
>        publish:
>          num_osds: <% $.heat_resource_tree.parameters.get('CephAnsibleDisksConfig', {}).get('default', {}).get('devices', []).count() %>
>        on-success:
>          - get_memory_mb: <% $.num_osds %>
>          # If there is no CephAnsibleDisksConfig then look for OSD configuration in hiera data
>          - get_num_osds_from_hiera: <% not $.num_osds %>
>
>      get_num_osds_from_hiera:
>        publish:
>          num_osds: <% $.role_extra_config.get('ceph::profile::params::osds', $.extra_config.get('ceph::profile::params::osds', {})).keys().count() %>
>        on-success:
>          - get_memory_mb: <% $.num_osds %>
>          - set_failed_no_osds: <% not $.num_osds %>
>
>      get_memory_mb:
>        publish:
>          memory_mb: <% $.introspection_data.get('memory_mb', 0) %>
>        on-success:
>          - get_nova_vcpu_pin_set: <% $.memory_mb %>
>          - set_failed_get_memory_mb: <% not $.memory_mb %>
>
>      # Determine the number of CPU cores available to Nova and Ceph. If
>      # NovaVcpuPinSet is defined then use the number of vCPUs in the set,
>      # otherwise use all of the cores identified in the introspection data.
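The comment above describes the core-count logic: if NovaVcpuPinSet is defined, count the vCPUs in the set; otherwise use the introspected core count. A Python sketch (inputs are hypothetical, and the range expansion is an illustrative stand-in for the tripleo.derive_params.convert_range_to_number_list action):

```python
def range_list_to_number_list(range_list):
    # Expand '12-15,19' into [12, 13, 14, 15, 19]
    nums = []
    for part in range_list.split(','):
        lo, _, hi = part.partition('-')
        nums.extend(range(int(lo), int(hi or lo) + 1))
    return nums

def get_num_cores(nova_vcpu_pin_set, introspected_cpus):
    # vCPUs in the pin set when one is defined, otherwise all introspected cores
    if nova_vcpu_pin_set:
        return len(range_list_to_number_list(nova_vcpu_pin_set))
    return introspected_cpus

print(get_num_cores('12-15,19', 56))  # 5
print(get_num_cores('', 56))          # 56
```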
>
>      get_nova_vcpu_pin_set:
>        publish:
>          # NovaVcpuPinSet can be defined in multiple locations, and it's
>          # important to select the value in order of precedence:
>          #
>          # 1) User specified value for this role
>          # 2) User specified default value for all roles
>          # 3) Value derived by another derived parameters workflow
>          nova_vcpu_pin_set: <% $.role_env_params.get('NovaVcpuPinSet', $.environment_parameters.get('NovaVcpuPinSet', $.role_derive_params.get('NovaVcpuPinSet', ''))) %>
>        on-success:
>          - get_nova_vcpu_count: <% $.nova_vcpu_pin_set %>
>          - get_num_cores: <% not $.nova_vcpu_pin_set %>
>
>      get_nova_vcpu_count:
>        action: tripleo.derive_params.convert_range_to_number_list
>        input:
>          range_list: <% $.nova_vcpu_pin_set %>
>        publish:
>          num_cores: <% task().result.split(',').count() %>
>        on-success: calculate_nova_parameters
>        on-error: set_failed_get_nova_vcpu_count
>
>      get_num_cores:
>        publish:
>          num_cores: <% $.introspection_data.get('cpus', 0) %>
>        on-success:
>          - calculate_nova_parameters: <% $.num_cores %>
>          - set_failed_get_num_cores: <% not $.num_cores %>
>
>      # HCI calculations are broken into multiple steps. This is necessary
>      # because variables published by a Mistral task are not available
>      # for use by that same task. Variables computed and published in a task
>      # are only available in subsequent tasks.
>      #
>      # The HCI calculations compute two Nova parameters:
>      #   - reserved_host_memory
>      #   - cpu_allocation_ratio
>      #
>      # The reserved_host_memory calculation computes the amount of memory
>      # that needs to be reserved for Ceph and the total amount of "guest
>      # overhead" memory that is based on the anticipated number of guests.
>      # Pseudo-code for the calculation (disregarding MB and GB units) is
>      # as follows:
>      #
>      #   ceph_memory = mem_per_osd * num_osds
>      #   nova_memory = total_memory - ceph_memory
>      #   num_guests = nova_memory /
>      #                (average_guest_memory_size + overhead_per_guest)
>      #   reserved_memory = ceph_memory + (num_guests * overhead_per_guest)
>      #
>      # The cpu_allocation_ratio calculation is similar in that it takes into
>      # account the number of cores that must be reserved for Ceph.
>      #
>      #   ceph_cores = cores_per_osd * num_osds
>      #   guest_cores = num_cores - ceph_cores
>      #   guest_vcpus = guest_cores / average_guest_utilization
>      #   cpu_allocation_ratio = guest_vcpus / num_cores
>
>      calculate_nova_parameters:
>        publish:
>          avg_guest_util: <% $.average_guest_cpu_utilization_percentage / 100.0 %>
>          avg_guest_size_gb: <% $.average_guest_memory_size_in_mb / float($.MB_PER_GB) %>
>          memory_gb: <% $.memory_mb / float($.MB_PER_GB) %>
>          ceph_mem_gb: <% $.gb_per_osd * $.num_osds %>
>          nonceph_cores: <% $.num_cores - int($.cores_per_osd * $.num_osds) %>
>        on-success: calc_step_2
>
>      calc_step_2:
>        publish:
>          num_guests: <% int(($.memory_gb - $.ceph_mem_gb) / ($.avg_guest_size_gb + $.gb_overhead_per_guest)) %>
>          guest_vcpus: <% $.nonceph_cores / $.avg_guest_util %>
>        on-success: calc_step_3
>
>      calc_step_3:
>        publish:
>          reserved_host_memory: <% $.MB_PER_GB * int($.ceph_mem_gb + ($.num_guests * $.gb_overhead_per_guest)) %>
>          cpu_allocation_ratio: <% $.guest_vcpus / $.num_cores %>
>        on-success: validate_results
>
>      validate_results:
>        publish:
>          # Verify whether HCI is viable:
>          # - No more than 80% of the memory may be reserved for Ceph and guest overhead
>          # - At least half of the CPU cores must be available to Nova
>          mem_ok: <% $.reserved_host_memory <= ($.memory_mb * 0.8) %>
>          cpu_ok: <% $.cpu_allocation_ratio >= 0.5 %>
>        on-success:
>          - set_failed_insufficient_mem: <% not $.mem_ok %>
>          - set_failed_insufficient_cpu: <% not $.cpu_ok %>
>          - publish_hci_parameters: <% $.mem_ok and $.cpu_ok %>
>
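The pseudo-code in the comments maps directly onto the calculate_nova_parameters / calc_step_2 / calc_step_3 arithmetic. A runnable Python condensation of those three steps (the 256 GB, 56-core, 12-OSD node and the guest profile are made-up example inputs):

```python
MB_PER_GB = 1024

def derive_hci_params(memory_mb, num_cores, num_osds, gb_per_osd,
                      cores_per_osd, gb_overhead_per_guest,
                      avg_guest_size_gb, avg_guest_util):
    # reserved_host_memory: memory set aside for Ceph plus per-guest overhead
    memory_gb = memory_mb / float(MB_PER_GB)
    ceph_mem_gb = gb_per_osd * num_osds
    num_guests = int((memory_gb - ceph_mem_gb) /
                     (avg_guest_size_gb + gb_overhead_per_guest))
    reserved_host_memory = MB_PER_GB * int(ceph_mem_gb +
                                           num_guests * gb_overhead_per_guest)
    # cpu_allocation_ratio: guest vCPU capacity left after Ceph's cores
    nonceph_cores = num_cores - int(cores_per_osd * num_osds)
    guest_vcpus = nonceph_cores / avg_guest_util
    cpu_allocation_ratio = guest_vcpus / num_cores
    return reserved_host_memory, cpu_allocation_ratio

# 256 GB, 56-core node with 12 OSDs; 2 GB guests averaging 10% CPU utilization
rhm, ratio = derive_hci_params(262144, 56, 12, 5, 1.0, 0.5, 2.0, 0.1)
print(rhm)  # 101376 (MB reserved for Ceph and guest overhead)
```

With these inputs both viability checks pass: 101376 MB is well under 80% of 262144 MB, and the allocation ratio is far above 0.5.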
>      publish_hci_parameters:
>        publish:
>          # TODO(abishop): Update this when the cpu_allocation_ratio can be set
>          # via a THT parameter (no such parameter currently exists). Until a
>          # THT parameter exists, use hiera data to set the cpu_allocation_ratio.
>          hci_parameters: <% dict(concat($.role_name, 'Parameters') => dict('NovaReservedHostMemory' => $.reserved_host_memory)) + dict(concat($.role_name, 'ExtraConfig') => dict('nova::cpu_allocation_ratio' => $.cpu_allocation_ratio)) %>
>
>      set_failed_invalid_hci_profile:
>        publish:
>          message: "'<% $.hci_profile %>' is not a valid HCI profile."
>        on-success: fail
>
>      set_failed_invalid_average_guest_memory_size_in_mb:
>        publish:
>          message: "'<% $.average_guest_memory_size_in_mb %>' is not a valid average_guest_memory_size_in_mb value."
>        on-success: fail
>
>      set_failed_invalid_gb_overhead_per_guest:
>        publish:
>          message: "'<% $.gb_overhead_per_guest %>' is not a valid gb_overhead_per_guest value."
>        on-success: fail
>
>      set_failed_invalid_gb_per_osd:
>        publish:
>          message: "'<% $.gb_per_osd %>' is not a valid gb_per_osd value."
>        on-success: fail
>
>      set_failed_invalid_cores_per_osd:
>        publish:
>          message: "'<% $.cores_per_osd %>' is not a valid cores_per_osd value."
>        on-success: fail
>
>      set_failed_invalid_average_guest_cpu_utilization_percentage:
>        publish:
>          message: "'<% $.average_guest_cpu_utilization_percentage %>' is not a valid average_guest_cpu_utilization_percentage value."
>        on-success: fail
>
>      set_failed_no_osds:
>        publish:
>          message: "No Ceph OSDs found in the overcloud definition ('ceph::profile::params::osds')."
>        on-success: fail
>
>      set_failed_get_memory_mb:
>        publish:
>          message: "Unable to determine the amount of physical memory (no 'memory_mb' found in introspection_data)."
>        on-success: fail
>
>      set_failed_get_nova_vcpu_count:
>        publish:
>          message: <% task(get_nova_vcpu_count).result %>
>        on-success: fail
>
>      set_failed_get_num_cores:
>        publish:
>          message: "Unable to determine the number of CPU cores (no 'cpus' found in introspection_data)."
>        on-success: fail
>
>      set_failed_insufficient_mem:
>        publish:
>          message: "<% $.memory_mb %> MB is not enough memory to run hyperconverged."
>        on-success: fail
>
>      set_failed_insufficient_cpu:
>        publish:
>          message: "<% $.num_cores %> CPU cores are not enough to run hyperconverged."
>        on-success: fail
>'
>2018-08-21 16:37:03,912 DEBUG: http://192.168.24.1:8989 "POST /v2/workbooks HTTP/1.1" 201 32010
>2018-08-21 16:37:03,960 DEBUG: RESP: [201] Content-Length: 32010 Content-Type: application/json Date: Tue, 21 Aug 2018 13:37:03 GMT Connection: keep-alive
for empty check\n - get_dpdk_nics_numa_nodes: <% $.dpdk_nics_numa_info %>\n - set_status_failed_get_dpdk_nics_numa_info: <% not $.dpdk_nics_numa_info %>\n on-error: set_status_failed_on_error_get_dpdk_nics_numa_info\n\n get_dpdk_nics_numa_nodes:\n publish:\n dpdk_nics_numa_nodes: <% $.dpdk_nics_numa_info.groupBy($.numa_node).select($[0]).orderBy($) %>\n on-success:\n - get_numa_nodes: <% $.dpdk_nics_numa_nodes %>\n - set_status_failed_get_dpdk_nics_numa_nodes: <% not $.dpdk_nics_numa_nodes %>\n\n get_numa_nodes:\n publish:\n numa_nodes: <% $.hw_data.numa_topology.ram.select($.numa_node).orderBy($) %>\n on-success:\n - get_num_phy_cores_per_numa_for_pmd: <% $.numa_nodes %>\n - set_status_failed_get_numa_nodes: <% not $.numa_nodes %>\n\n get_num_phy_cores_per_numa_for_pmd:\n publish:\n num_phy_cores_per_numa_node_for_pmd: <% $.user_inputs.get('num_phy_cores_per_numa_node_for_pmd', 0) %>\n on-success:\n - get_num_cores_per_numa_nodes: <% isInteger($.num_phy_cores_per_numa_node_for_pmd) and $.num_phy_cores_per_numa_node_for_pmd > 0 %>\n - set_status_failed_get_num_phy_cores_per_numa_for_pmd_invalid: <% not isInteger($.num_phy_cores_per_numa_node_for_pmd) %>\n - set_status_failed_get_num_phy_cores_per_numa_for_pmd_not_provided: <% $.num_phy_cores_per_numa_node_for_pmd = 0 %>\n\n # For NUMA node with DPDK nic, number of cores should be used from user input\n # For NUMA node without DPDK nic, number of cores should be 1\n get_num_cores_per_numa_nodes:\n publish:\n num_cores_per_numa_nodes: <% let(dpdk_nics_nodes => $.dpdk_nics_numa_nodes, cores => $.num_phy_cores_per_numa_node_for_pmd) -> $.numa_nodes.select(switch($ in $dpdk_nics_nodes => $cores, not $ in $dpdk_nics_nodes => 1)) %>\n on-success: get_pmd_cpus\n\n get_pmd_cpus:\n action: tripleo.derive_params.get_dpdk_core_list\n input:\n inspect_data: <% $.hw_data %>\n numa_nodes_cores_count: <% $.num_cores_per_numa_nodes %>\n publish:\n pmd_cpus: <% task().result %>\n on-success:\n - get_pmd_cpus_range_list: <% 
$.pmd_cpus %>\n - set_status_failed_get_pmd_cpus: <% not $.pmd_cpus %>\n on-error: set_status_failed_on_error_get_pmd_cpus\n\n get_pmd_cpus_range_list:\n action: tripleo.derive_params.convert_number_to_range_list\n input:\n num_list: <% $.pmd_cpus %>\n publish:\n pmd_cpus: <% task().result %>\n on-success: get_host_cpus\n on-error: set_status_failed_get_pmd_cpus_range_list\n\n get_host_cpus:\n workflow: tripleo.derive_params_formulas.v1.get_host_cpus\n input:\n role_name: <% $.role_name %>\n hw_data: <% $.hw_data %>\n publish:\n host_cpus: <% task().result.get('host_cpus', '') %>\n on-success: get_sock_mem\n on-error: set_status_failed_get_host_cpus\n\n get_sock_mem:\n action: tripleo.derive_params.get_dpdk_socket_memory\n input:\n dpdk_nics_numa_info: <% $.dpdk_nics_numa_info %>\n numa_nodes: <% $.numa_nodes %>\n overhead: <% $.user_inputs.get('overhead', 800) %>\n packet_size_in_buffer: <% 4096*64 %>\n publish:\n sock_mem: <% task().result %>\n on-success:\n - get_dpdk_parameters: <% $.sock_mem %>\n - set_status_failed_get_sock_mem: <% not $.sock_mem %>\n on-error: set_status_failed_on_error_get_sock_mem\n\n get_dpdk_parameters:\n publish:\n dpdk_parameters: <% dict(concat($.role_name, 'Parameters') => dict('OvsPmdCoreList' => $.get('pmd_cpus', ''), 'OvsDpdkCoreList' => $.get('host_cpus', ''), 'OvsDpdkSocketMemory' => $.get('sock_mem', ''))) %>\n\n set_status_failed_get_network_config:\n publish:\n status: FAILED\n message: <% task(get_network_config).result %>\n on-success: fail\n\n set_status_failed_get_dpdk_nics_numa_info:\n publish:\n status: FAILED\n message: \"Unable to determine DPDK NIC's NUMA information\"\n on-success: fail\n\n set_status_failed_on_error_get_dpdk_nics_numa_info:\n publish:\n status: FAILED\n message: <% task(get_dpdk_nics_numa_info).result %>\n on-success: fail\n\n set_status_failed_get_dpdk_nics_numa_nodes:\n publish:\n status: FAILED\n message: \"Unable to determine DPDK NIC's numa nodes\"\n on-success: fail\n\n 
set_status_failed_get_numa_nodes:\n publish:\n status: FAILED\n message: 'Unable to determine available NUMA nodes'\n on-success: fail\n\n set_status_failed_get_num_phy_cores_per_numa_for_pmd_invalid:\n publish:\n status: FAILED\n message: <% \"num_phy_cores_per_numa_node_for_pmd user input '{0}' is invalid\".format($.num_phy_cores_per_numa_node_for_pmd) %>\n on-success: fail\n\n set_status_failed_get_num_phy_cores_per_numa_for_pmd_not_provided:\n publish:\n status: FAILED\n message: 'num_phy_cores_per_numa_node_for_pmd user input is not provided'\n on-success: fail\n\n set_status_failed_get_pmd_cpus:\n publish:\n status: FAILED\n message: 'Unable to determine OvsPmdCoreList parameter'\n on-success: fail\n\n set_status_failed_on_error_get_pmd_cpus:\n publish:\n status: FAILED\n message: <% task(get_pmd_cpus).result %>\n on-success: fail\n\n set_status_failed_get_pmd_cpus_range_list:\n publish:\n status: FAILED\n message: <% task(get_pmd_cpus_range_list).result %>\n on-success: fail\n\n set_status_failed_get_host_cpus:\n publish:\n status: FAILED\n message: <% task(get_host_cpus).result.get('message', '') %>\n on-success: fail\n\n set_status_failed_get_sock_mem:\n publish:\n status: FAILED\n message: 'Unable to determine OvsDpdkSocketMemory parameter'\n on-success: fail\n\n set_status_failed_on_error_get_sock_mem:\n publish:\n status: FAILED\n message: <% task(get_sock_mem).result %>\n on-success: fail\n\n\n sriov_derive_params:\n description: >\n This workflow derives parameters for the SRIOV feature.\n\n input:\n - role_name\n - hw_data # introspection data\n - derived_parameters: {}\n\n output:\n derived_parameters: <% $.derived_parameters.mergeWith($.get('sriov_parameters', {})) %>\n\n tags:\n - tripleo-common-managed\n\n tasks:\n get_host_cpus:\n workflow: tripleo.derive_params_formulas.v1.get_host_cpus\n input:\n role_name: <% $.role_name %>\n hw_data: <% $.hw_data %>\n publish:\n host_cpus: <% task().result.get('host_cpus', '') %>\n on-success: 
get_sriov_parameters\n on-error: set_status_failed_get_host_cpus\n\n get_sriov_parameters:\n publish:\n # SriovHostCpusList parameter is added temporarily and it's removed later from derived parameters result.\n sriov_parameters: <% dict(concat($.role_name, 'Parameters') => dict('SriovHostCpusList' => $.get('host_cpus', ''))) %>\n\n set_status_failed_get_host_cpus:\n publish:\n status: FAILED\n message: <% task(get_host_cpus).result.get('message', '') %>\n on-success: fail\n\n\n get_host_cpus:\n description: >\n Fetching the host CPU list from the introspection data, and then converting the raw list into a range list.\n\n input:\n - hw_data # introspection data\n\n output:\n host_cpus: <% $.get('host_cpus', '') %>\n\n tags:\n - tripleo-common-managed\n\n tasks:\n get_host_cpus:\n action: tripleo.derive_params.get_host_cpus_list inspect_data=<% $.hw_data %>\n publish:\n host_cpus: <% task().result %>\n on-success:\n - get_host_cpus_range_list: <% $.host_cpus %>\n - set_status_failed_get_host_cpus: <% not $.host_cpus %>\n on-error: set_status_failed_on_error_get_host_cpus\n\n get_host_cpus_range_list:\n action: tripleo.derive_params.convert_number_to_range_list\n input:\n num_list: <% $.host_cpus %>\n publish:\n host_cpus: <% task().result %>\n on-error: set_status_failed_get_host_cpus_range_list\n\n set_status_failed_get_host_cpus:\n publish:\n status: FAILED\n message: 'Unable to determine host cpus'\n on-success: fail\n\n set_status_failed_on_error_get_host_cpus:\n publish:\n status: FAILED\n message: <% task(get_host_cpus).result %>\n on-success: fail\n\n set_status_failed_get_host_cpus_range_list:\n publish:\n status: FAILED\n message: <% task(get_host_cpus_range_list).result %>\n on-success: fail\n\n\n host_derive_params:\n description: >\n This workflow derives parameters for the Host process, and is mainly associated with CPU pinning and huge memory pages.\n This workflow can be dependent on any feature or also can be invoked individually as well.\n\n 
input:\n - role_name\n - hw_data # introspection data\n - user_inputs\n - derived_parameters: {}\n\n output:\n derived_parameters: <% $.derived_parameters.mergeWith($.get('host_parameters', {})) %>\n\n tags:\n - tripleo-common-managed\n\n tasks:\n get_cpus:\n publish:\n cpus: <% $.hw_data.numa_topology.cpus %>\n on-success:\n - get_role_derive_params: <% $.cpus %>\n - set_status_failed_get_cpus: <% not $.cpus %>\n\n get_role_derive_params:\n publish:\n role_derive_params: <% $.derived_parameters.get(concat($.role_name, 'Parameters'), {}) %>\n # removing the role parameters (eg. ComputeParameters) in derived_parameters dictionary since already copied in role_derive_params.\n derived_parameters: <% $.derived_parameters.delete(concat($.role_name, 'Parameters')) %>\n on-success: get_host_cpus\n\n get_host_cpus:\n publish:\n host_cpus: <% $.role_derive_params.get('OvsDpdkCoreList', '') or $.role_derive_params.get('SriovHostCpusList', '') %>\n # SriovHostCpusList parameter is added temporarily for host_cpus and not needed in derived_parameters result.\n # SriovHostCpusList parameter is deleted in derived_parameters list and adding the updated role parameters\n # back in the derived_parameters.\n derived_parameters: <% $.derived_parameters + dict(concat($.role_name, 'Parameters') => $.role_derive_params.delete('SriovHostCpusList')) %>\n on-success: get_host_dpdk_combined_cpus\n\n get_host_dpdk_combined_cpus:\n publish:\n host_dpdk_combined_cpus: <% let(pmd_cpus => $.role_derive_params.get('OvsPmdCoreList', '')) -> switch($pmd_cpus => concat($pmd_cpus, ',', $.host_cpus), not $pmd_cpus => $.host_cpus) %>\n reserved_cpus: []\n on-success:\n - get_host_dpdk_combined_cpus_num_list: <% $.host_dpdk_combined_cpus %>\n - set_status_failed_get_host_dpdk_combined_cpus: <% not $.host_dpdk_combined_cpus %>\n\n get_host_dpdk_combined_cpus_num_list:\n action: tripleo.derive_params.convert_range_to_number_list\n input:\n range_list: <% $.host_dpdk_combined_cpus %>\n publish:\n 
host_dpdk_combined_cpus: <% task().result %>\n reserved_cpus: <% task().result.split(',') %>\n on-success: get_nova_cpus\n on-error: set_status_failed_get_host_dpdk_combined_cpus_num_list\n\n get_nova_cpus:\n publish:\n nova_cpus: <% let(reserved_cpus => $.reserved_cpus) -> $.cpus.select($.thread_siblings).flatten().where(not (str($) in $reserved_cpus)).join(',') %>\n on-success:\n - get_isol_cpus: <% $.nova_cpus %>\n - set_status_failed_get_nova_cpus: <% not $.nova_cpus %>\n\n # Concatenates the OvsPmdCoreList range format and NovaVcpuPinSet in range format. The result may not be in perfect range format.\n # example: concatenating '12-15,19' and '16-18' gives '12-15,19,16-18'\n get_isol_cpus:\n publish:\n isol_cpus: <% let(pmd_cpus => $.role_derive_params.get('OvsPmdCoreList','')) -> switch($pmd_cpus => concat($pmd_cpus, ',', $.nova_cpus), not $pmd_cpus => $.nova_cpus) %>\n on-success: get_isol_cpus_num_list\n\n # Gets the isol_cpus in the number list\n # example: '12-15,19,16-18' into '12,13,14,15,16,17,18,19'\n get_isol_cpus_num_list:\n action: tripleo.derive_params.convert_range_to_number_list\n input:\n range_list: <% $.isol_cpus %>\n publish:\n isol_cpus: <% task().result %>\n on-success: get_nova_cpus_range_list\n on-error: set_status_failed_get_isol_cpus_num_list\n\n get_nova_cpus_range_list:\n action: tripleo.derive_params.convert_number_to_range_list\n input:\n num_list: <% $.nova_cpus %>\n publish:\n nova_cpus: <% task().result %>\n on-success: get_isol_cpus_range_list\n on-error: set_status_failed_get_nova_cpus_range_list\n\n # converts number format isol_cpus into range format\n # example: '12,13,14,15,16,17,18,19' into '12-19'\n get_isol_cpus_range_list:\n action: tripleo.derive_params.convert_number_to_range_list\n input:\n num_list: <% $.isol_cpus %>\n publish:\n isol_cpus: <% task().result %>\n on-success: get_host_mem\n on-error: set_status_failed_get_isol_cpus_range_list\n\n get_host_mem:\n publish:\n host_mem: <% $.user_inputs.get('host_mem_default', 4096) 
%>\n on-success: check_default_hugepage_supported\n\n check_default_hugepage_supported:\n publish:\n default_hugepage_supported: <% $.hw_data.get('inventory', {}).get('cpu', {}).get('flags', []).contains('pdpe1gb') %>\n on-success:\n - get_total_memory: <% $.default_hugepage_supported %>\n - set_status_failed_check_default_hugepage_supported: <% not $.default_hugepage_supported %>\n\n get_total_memory:\n publish:\n total_memory: <% $.hw_data.get('inventory', {}).get('memory', {}).get('physical_mb', 0) %>\n on-success:\n - get_hugepage_allocation_percentage: <% $.total_memory %>\n - set_status_failed_get_total_memory: <% not $.total_memory %>\n\n get_hugepage_allocation_percentage:\n publish:\n huge_page_allocation_percentage: <% $.user_inputs.get('huge_page_allocation_percentage', 0) %>\n on-success:\n - get_hugepages: <% isInteger($.huge_page_allocation_percentage) and $.huge_page_allocation_percentage > 0 %>\n - set_status_failed_get_hugepage_allocation_percentage_invalid: <% not isInteger($.huge_page_allocation_percentage) %>\n - set_status_failed_get_hugepage_allocation_percentage_not_provided: <% $.huge_page_allocation_percentage = 0 %>\n\n get_hugepages:\n publish:\n hugepages: <% let(huge_page_perc => float($.huge_page_allocation_percentage)/100)-> int((($.total_memory/1024)-4) * $huge_page_perc) %>\n on-success:\n - get_cpu_model: <% $.hugepages %>\n - set_status_failed_get_hugepages: <% not $.hugepages %>\n\n get_cpu_model:\n publish:\n intel_cpu_model: <% $.hw_data.get('inventory', {}).get('cpu', {}).get('model_name', '').startsWith('Intel') %>\n on-success: get_iommu_info\n\n get_iommu_info:\n publish:\n iommu_info: <% switch($.intel_cpu_model => 'intel_iommu=on iommu=pt', not $.intel_cpu_model => '') %>\n on-success: get_kernel_args\n\n get_kernel_args:\n publish:\n kernel_args: <% concat('default_hugepagesz=1GB hugepagesz=1G ', 'hugepages=', str($.hugepages), ' ', $.iommu_info, ' isolcpus=', $.isol_cpus) %>\n on-success: get_host_parameters\n\n 
get_host_parameters:\n publish:\n host_parameters: <% dict(concat($.role_name, 'Parameters') => dict('NovaVcpuPinSet' => $.get('nova_cpus', ''), 'NovaReservedHostMemory' => $.get('host_mem', ''), 'KernelArgs' => $.get('kernel_args', ''), 'IsolCpusList' => $.get('isol_cpus', ''))) %>\n\n set_status_failed_get_cpus:\n publish:\n status: FAILED\n message: \"Unable to determine CPU's on NUMA nodes\"\n on-success: fail\n\n set_status_failed_get_host_dpdk_combined_cpus:\n publish:\n status: FAILED\n message: 'Unable to combine host and dpdk cpus list'\n on-success: fail\n\n set_status_failed_get_host_dpdk_combined_cpus_num_list:\n publish:\n status: FAILED\n message: <% task(get_host_dpdk_combined_cpus_num_list).result %>\n on-success: fail\n\n set_status_failed_get_nova_cpus:\n publish:\n status: FAILED\n message: 'Unable to determine nova vcpu pin set'\n on-success: fail\n\n set_status_failed_get_nova_cpus_range_list:\n publish:\n status: FAILED\n message: <% task(get_nova_cpus_range_list).result %>\n on-success: fail\n\n set_status_failed_get_isol_cpus_num_list:\n publish:\n status: FAILED\n message: <% task(get_isol_cpus_num_list).result %>\n on-success: fail\n\n set_status_failed_get_isol_cpus_range_list:\n publish:\n status: FAILED\n message: <% task(get_isol_cpus_range_list).result %>\n on-success: fail\n\n set_status_failed_check_default_hugepage_supported:\n publish:\n status: FAILED\n message: 'default huge page size 1GB is not supported'\n on-success: fail\n\n set_status_failed_get_total_memory:\n publish:\n status: FAILED\n message: 'Unable to determine total memory'\n on-success: fail\n\n set_status_failed_get_hugepage_allocation_percentage_invalid:\n publish:\n status: FAILED\n message: <% \"huge_page_allocation_percentage user input '{0}' is invalid\".format($.huge_page_allocation_percentage) %>\n on-success: fail\n\n set_status_failed_get_hugepage_allocation_percentage_not_provided:\n publish:\n status: FAILED\n message: 'huge_page_allocation_percentage 
user input is not provided'\n on-success: fail\n\n set_status_failed_get_hugepages:\n publish:\n status: FAILED\n message: 'Unable to determine huge pages'\n on-success: fail\n\n\n hci_derive_params:\n description: Derive the deployment parameters for HCI\n input:\n - role_name\n - environment_parameters\n - heat_resource_tree\n - introspection_data\n - user_inputs\n - derived_parameters: {}\n\n output:\n derived_parameters: <% $.derived_parameters.mergeWith($.get('hci_parameters', {})) %>\n\n tags:\n - tripleo-common-managed\n\n tasks:\n get_hci_inputs:\n publish:\n hci_profile: <% $.user_inputs.get('hci_profile', '') %>\n hci_profile_config: <% $.user_inputs.get('hci_profile_config', {}) %>\n MB_PER_GB: 1024\n on-success:\n - get_average_guest_memory_size_in_mb: <% $.hci_profile and $.hci_profile_config.get($.hci_profile, {}) %>\n - set_failed_invalid_hci_profile: <% $.hci_profile and not $.hci_profile_config.get($.hci_profile, {}) %>\n # When no hci_profile is specified, the workflow terminates without deriving any HCI parameters.\n\n get_average_guest_memory_size_in_mb:\n publish:\n average_guest_memory_size_in_mb: <% $.hci_profile_config.get($.hci_profile, {}).get('average_guest_memory_size_in_mb', 0) %>\n on-success:\n - get_average_guest_cpu_utilization_percentage: <% isInteger($.average_guest_memory_size_in_mb) %>\n - set_failed_invalid_average_guest_memory_size_in_mb: <% not isInteger($.average_guest_memory_size_in_mb) %>\n\n get_average_guest_cpu_utilization_percentage:\n publish:\n average_guest_cpu_utilization_percentage: <% $.hci_profile_config.get($.hci_profile, {}).get('average_guest_cpu_utilization_percentage', 0) %>\n on-success:\n - get_gb_overhead_per_guest: <% isInteger($.average_guest_cpu_utilization_percentage) %>\n - set_failed_invalid_average_guest_cpu_utilization_percentage: <% not isInteger($.average_guest_cpu_utilization_percentage) %>\n\n get_gb_overhead_per_guest:\n publish:\n gb_overhead_per_guest: <% 
$.user_inputs.get('gb_overhead_per_guest', 0.5) %>\n on-success:\n - get_gb_per_osd: <% isNumber($.gb_overhead_per_guest) %>\n - set_failed_invalid_gb_overhead_per_guest: <% not isNumber($.gb_overhead_per_guest) %>\n\n get_gb_per_osd:\n publish:\n gb_per_osd: <% $.user_inputs.get('gb_per_osd', 5) %>\n on-success:\n - get_cores_per_osd: <% isNumber($.gb_per_osd) %>\n - set_failed_invalid_gb_per_osd: <% not isNumber($.gb_per_osd) %>\n\n get_cores_per_osd:\n publish:\n cores_per_osd: <% $.user_inputs.get('cores_per_osd', 1.0) %>\n on-success:\n - get_extra_configs: <% isNumber($.cores_per_osd) %>\n - set_failed_invalid_cores_per_osd: <% not isNumber($.cores_per_osd) %>\n\n get_extra_configs:\n publish:\n extra_config: <% $.environment_parameters.get('ExtraConfig', {}) %>\n role_extra_config: <% $.environment_parameters.get(concat($.role_name, 'ExtraConfig'), {}) %>\n role_env_params: <% $.environment_parameters.get(concat($.role_name, 'Parameters'), {}) %>\n role_derive_params: <% $.derived_parameters.get(concat($.role_name, 'Parameters'), {}) %>\n on-success: get_num_osds\n\n get_num_osds:\n publish:\n num_osds: <% $.heat_resource_tree.parameters.get('CephAnsibleDisksConfig', {}).get('default', {}).get('devices', []).count() %>\n on-success:\n - get_memory_mb: <% $.num_osds %>\n # If there's no CephAnsibleDisksConfig then look for OSD configuration in hiera data\n - get_num_osds_from_hiera: <% not $.num_osds %>\n\n get_num_osds_from_hiera:\n publish:\n num_osds: <% $.role_extra_config.get('ceph::profile::params::osds', $.extra_config.get('ceph::profile::params::osds', {})).keys().count() %>\n on-success:\n - get_memory_mb: <% $.num_osds %>\n - set_failed_no_osds: <% not $.num_osds %>\n\n get_memory_mb:\n publish:\n memory_mb: <% $.introspection_data.get('memory_mb', 0) %>\n on-success:\n - get_nova_vcpu_pin_set: <% $.memory_mb %>\n - set_failed_get_memory_mb: <% not $.memory_mb %>\n\n # Determine the number of CPU cores available to Nova and Ceph. 
If\n # NovaVcpuPinSet is defined then use the number of vCPUs in the set,\n # otherwise use all of the cores identified in the introspection data.\n\n get_nova_vcpu_pin_set:\n publish:\n # NovaVcpuPinSet can be defined in multiple locations, and it's\n # important to select the value in order of precedence:\n #\n # 1) User specified value for this role\n # 2) User specified default value for all roles\n # 3) Value derived by another derived parameters workflow\n nova_vcpu_pin_set: <% $.role_env_params.get('NovaVcpuPinSet', $.environment_parameters.get('NovaVcpuPinSet', $.role_derive_params.get('NovaVcpuPinSet', ''))) %>\n on-success:\n - get_nova_vcpu_count: <% $.nova_vcpu_pin_set %>\n - get_num_cores: <% not $.nova_vcpu_pin_set %>\n\n get_nova_vcpu_count:\n action: tripleo.derive_params.convert_range_to_number_list\n input:\n range_list: <% $.nova_vcpu_pin_set %>\n publish:\n num_cores: <% task().result.split(',').count() %>\n on-success: calculate_nova_parameters\n on-error: set_failed_get_nova_vcpu_count\n\n get_num_cores:\n publish:\n num_cores: <% $.introspection_data.get('cpus', 0) %>\n on-success:\n - calculate_nova_parameters: <% $.num_cores %>\n - set_failed_get_num_cores: <% not $.num_cores %>\n\n # HCI calculations are broken into multiple steps. This is necessary\n # because variables published by a Mistral task are not available\n # for use by that same task. 
Variables computed and published in a task\n # are only available in subsequent tasks.\n #\n # The HCI calculations compute two Nova parameters:\n # - reserved_host_memory\n # - cpu_allocation_ratio\n #\n # The reserved_host_memory calculation computes the amount of memory\n # that needs to be reserved for Ceph and the total amount of \"guest\n # overhead\" memory that is based on the anticipated number of guests.\n # Pseudo-code for the calculation (disregarding MB and GB units) is\n # as follows:\n #\n # ceph_memory = mem_per_osd * num_osds\n # nova_memory = total_memory - ceph_memory\n # num_guests = nova_memory /\n # (average_guest_memory_size + overhead_per_guest)\n # reserved_memory = ceph_memory + (num_guests * overhead_per_guest)\n #\n # The cpu_allocation_ratio calculation is similar in that it takes into\n # account the number of cores that must be reserved for Ceph.\n #\n # ceph_cores = cores_per_osd * num_osds\n # guest_cores = num_cores - ceph_cores\n # guest_vcpus = guest_cores / average_guest_utilization\n # cpu_allocation_ratio = guest_vcpus / num_cores\n\n calculate_nova_parameters:\n publish:\n avg_guest_util: <% $.average_guest_cpu_utilization_percentage / 100.0 %>\n avg_guest_size_gb: <% $.average_guest_memory_size_in_mb / float($.MB_PER_GB) %>\n memory_gb: <% $.memory_mb / float($.MB_PER_GB) %>\n ceph_mem_gb: <% $.gb_per_osd * $.num_osds %>\n nonceph_cores: <% $.num_cores - int($.cores_per_osd * $.num_osds) %>\n on-success: calc_step_2\n\n calc_step_2:\n publish:\n num_guests: <% int(($.memory_gb - $.ceph_mem_gb) / ($.avg_guest_size_gb + $.gb_overhead_per_guest)) %>\n guest_vcpus: <% $.nonceph_cores / $.avg_guest_util %>\n on-success: calc_step_3\n\n calc_step_3:\n publish:\n reserved_host_memory: <% $.MB_PER_GB * int($.ceph_mem_gb + ($.num_guests * $.gb_overhead_per_guest)) %>\n cpu_allocation_ratio: <% $.guest_vcpus / $.num_cores %>\n on-success: validate_results\n\n validate_results:\n publish:\n # Verify whether HCI is viable:\n # - No more than 80% of the memory may be reserved for Ceph and guest overhead\n # - At least half of the CPU cores must be available to Nova\n mem_ok: <% $.reserved_host_memory <= ($.memory_mb * 0.8) %>\n cpu_ok: <% $.cpu_allocation_ratio >= 0.5 %>\n on-success:\n - set_failed_insufficient_mem: <% not $.mem_ok %>\n - set_failed_insufficient_cpu: <% not $.cpu_ok %>\n - publish_hci_parameters: <% $.mem_ok and $.cpu_ok %>\n\n publish_hci_parameters:\n publish:\n # TODO(abishop): Update this when the cpu_allocation_ratio can be set\n # via a THT parameter (no such parameter currently exists). Until a\n # THT parameter exists, use hiera data to set the cpu_allocation_ratio.\n hci_parameters: <% dict(concat($.role_name, 'Parameters') => dict('NovaReservedHostMemory' => $.reserved_host_memory)) + dict(concat($.role_name, 'ExtraConfig') => dict('nova::cpu_allocation_ratio' => $.cpu_allocation_ratio)) %>\n\n set_failed_invalid_hci_profile:\n publish:\n message: \"'<% $.hci_profile %>' is not a valid HCI profile.\"\n on-success: fail\n\n set_failed_invalid_average_guest_memory_size_in_mb:\n publish:\n message: \"'<% $.average_guest_memory_size_in_mb %>' is not a valid average_guest_memory_size_in_mb value.\"\n on-success: fail\n\n set_failed_invalid_gb_overhead_per_guest:\n publish:\n message: \"'<% $.gb_overhead_per_guest %>' is not a valid gb_overhead_per_guest value.\"\n on-success: fail\n\n set_failed_invalid_gb_per_osd:\n publish:\n message: \"'<% $.gb_per_osd %>' is not a valid gb_per_osd value.\"\n on-success: fail\n\n set_failed_invalid_cores_per_osd:\n publish:\n message: \"'<% $.cores_per_osd %>' is not a valid cores_per_osd value.\"\n on-success: fail\n\n set_failed_invalid_average_guest_cpu_utilization_percentage:\n publish:\n message: \"'<% $.average_guest_cpu_utilization_percentage %>' is not a valid average_guest_cpu_utilization_percentage value.\"\n on-success: fail\n\n set_failed_no_osds:\n publish:\n message: \"No Ceph OSDs found in the overcloud definition 
('ceph::profile::params::osds').\"\n on-success: fail\n\n set_failed_get_memory_mb:\n publish:\n message: \"Unable to determine the amount of physical memory (no 'memory_mb' found in introspection_data).\"\n on-success: fail\n\n set_failed_get_nova_vcpu_count:\n publish:\n message: <% task(get_nova_vcpu_count).result %>\n on-success: fail\n\n set_failed_get_num_cores:\n publish:\n message: \"Unable to determine the number of CPU cores (no 'cpus' found in introspection_data).\"\n on-success: fail\n\n set_failed_insufficient_mem:\n publish:\n message: \"<% $.memory_mb %> MB is not enough memory to run hyperconverged.\"\n on-success: fail\n\n set_failed_insufficient_cpu:\n publish:\n message: \"<% $.num_cores %> CPU cores are not enough to run hyperconverged.\"\n on-success: fail\n", "name": "tripleo.derive_params_formulas.v1", "tags": [], "created_at": "2018-08-21 13:37:03", "scope": "private", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "ac0b49da-0b2b-497d-a948-96babaad262a"} > >2018-08-21 16:37:03,961 DEBUG: HTTP POST http://192.168.24.1:8989/v2/workbooks 201 >2018-08-21 16:37:03,965 DEBUG: REQ: curl -g -i -X POST http://192.168.24.1:8989/v2/workbooks -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "content-type: text/plain" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" -d '--- >version: '2.0' >name: tripleo.support.v1 >description: TripleO support workflows > >workflows: > > collect_logs: > description: > > This workflow runs sosreport on the servers where their names match the > provided server_name input. The logs are stored in the provided sos_dir. 
> input: > - server_name > - sos_dir: /var/tmp/tripleo-sos > - sos_options: boot,cluster,hardware,kernel,memory,nfs,openstack,packagemanager,performance,services,storage,system,webserver,virt > - queue_name: tripleo > > tags: > - tripleo-common-managed > > tasks: > collect_logs_on_servers: > workflow: tripleo.deployment.v1.deploy_on_servers > on-success: send_message > on-error: set_collect_logs_on_servers_failed > input: > server_name: <% $.server_name %> > config_name: 'run_sosreport' > config: | > #!/bin/bash > mkdir -p <% $.sos_dir %> > sosreport --batch \ > -p <% $.sos_options %> \ > --tmp-dir <% $.sos_dir %> > > set_collect_logs_on_servers_failed: > on-complete: > - send_message > publish: > type: tripleo.deployment.v1.fetch_logs > status: FAILED > message: <% task().result %> > > # status messaging > send_message: > action: zaqar.queue_post > retry: count=5 delay=1 > input: > queue_name: <% $.queue_name %> > messages: > body: > type: <% $.get('type', 'tripleo.support.v1.collect_logs') %> > payload: > status: <% $.get('status', 'SUCCESS') %> > message: <% $.get('message', '') %> > execution: <% execution() %> > on-success: > - fail: <% $.get('status') = 'FAILED' %> > > upload_logs: > description: > > This workflow uploads the sosreport files stored in the provided sos_dir > on the provided host (server_uuid) to a swift container on the undercloud > input: > - server_uuid > - server_name > - container > - sos_dir: /var/tmp/tripleo-sos > - queue_name: tripleo > > tags: > - tripleo-common-managed > > tasks: > # actions > get_swift_information: > action: tripleo.swift.swift_information > on-success: do_log_upload > on-error: set_get_swift_information_failed > input: > container: <% $.container %> > publish: > container_url: <% task().result.container_url %> > auth_key: <% task().result.auth_key %> > > set_get_swift_information_failed: > on-complete: > - send_message > publish: > status: FAILED > message: <% task(get_swift_information).result %> > > do_log_upload: 
> action: tripleo.deployment.config > on-success: send_message > on-error: set_do_log_upload_failed > input: > server_id: <% $.server_uuid %> > name: "upload_logs" > config: | > #!/bin/bash > CONTAINER_URL="<% $.container_url %>" > TOKEN="<% $.auth_key %>" > SOS_DIR="<% $.sos_dir %>" > for FILE in $(find $SOS_DIR -type f); do > FILENAME=$(basename $FILE) > curl -X PUT -i -H "X-Auth-Token: $TOKEN" -T $FILE $CONTAINER_URL/$FILENAME > if [ $? -eq 0 ]; then > rm -f $FILE > fi > done > group: "script" > publish: > message: "Uploaded logs from <% $.server_name %>" > > set_do_log_upload_failed: > on-complete: > - send_message > publish: > status: FAILED > message: <% task(do_log_upload).result %> > > # status messaging > send_message: > action: zaqar.queue_post > retry: count=5 delay=1 > input: > queue_name: <% $.queue_name %> > messages: > body: > type: <% $.get('type', 'tripleo.support.v1.upload_logs') %> > payload: > status: <% $.get('status', 'SUCCESS') %> > message: <% $.get('message', '') %> > execution: <% execution() %> > on-success: > - fail: <% $.get('status') = 'FAILED' %> > > create_container: > description: > > This workflow is used to check if the container exists and creates it > if it does not exist. 
> input: > - container > - queue_name: tripleo > > tags: > - tripleo-common-managed > > tasks: > check_container: > action: swift.head_container container=<% $.container %> > on-success: send_message > on-error: create_container > > create_container: > action: swift.put_container > input: > container: <% $.container %> > headers: > x-container-meta-usage-tripleo: support > on-success: send_message > on-error: set_create_container_failed > > set_create_container_failed: > on-complete: > - send_message > publish: > type: tripleo.support.v1.create_container.create_container > status: FAILED > message: <% task(create_container).result %> > > # status messaging > send_message: > action: zaqar.queue_post > retry: count=5 delay=1 > input: > queue_name: <% $.queue_name %> > messages: > body: > type: <% $.get('type', 'tripleo.support.v1.create_container') %> > payload: > status: <% $.get('status', 'SUCCESS') %> > message: <% $.get('message', '') %> > execution: <% execution() %> > on-success: > - fail: <% $.get('status') = 'FAILED' %> > > delete_container: > description: > > This workflow deletes all the objects in a provided swift container and > then removes the container itself from the undercloud. 
> input: > - container > - concurrency: 5 > - timeout: 900 > - queue_name: tripleo > > tags: > - tripleo-common-managed > > tasks: > # actions > check_container: > action: swift.head_container container=<% $.container %> > on-success: list_objects > on-error: set_check_container_failure > > set_check_container_failure: > on-complete: send_message > publish: > status: FAILED > type: tripleo.support.v1.delete_container.check_container > message: <% task(check_container).result %> > > list_objects: > action: swift.get_container container=<% $.container %> > on-success: delete_objects > on-error: set_list_objects_failure > publish: > log_objects: <% task().result[1] %> > > set_list_objects_failure: > on-complete: send_message > publish: > status: FAILED > type: tripleo.support.v1.delete_container.list_objects > message: <% task(list_objects).result %> > > delete_objects: > action: swift.delete_object > concurrency: <% $.concurrency %> > timeout: <% $.timeout %> > with-items: object in <% $.log_objects %> > input: > container: <% $.container %> > obj: <% $.object.name %> > on-success: remove_container > on-error: set_delete_objects_failure > > set_delete_objects_failure: > on-complete: send_message > publish: > status: FAILED > type: tripleo.support.v1.delete_container.delete_objects > message: <% task(delete_objects).result %> > > remove_container: > action: swift.delete_container container=<% $.container %> > on-success: send_message > on-error: set_remove_container_failure > > set_remove_container_failure: > on-complete: send_message > publish: > status: FAILED > type: tripleo.support.v1.delete_container.remove_container > message: <% task(remove_container).result %> > > # status messaging > send_message: > action: zaqar.queue_post > wait-before: 5 > retry: count=5 delay=1 > input: > queue_name: <% $.queue_name %> > messages: > body: > type: <% $.get('type', 'tripleo.support.v1.delete_container') %> > payload: > status: <% $.get('status', 'SUCCESS') %> > message: <% 
$.get('message', '') %> > execution: <% execution() %> > on-success: > - fail: <% $.get('status') = 'FAILED' %> > > fetch_logs: > description: > > This workflow creates a container on the undercloud, executes the log > collection on the servers whose names match the provided server_name, and > executes the log upload process on all the servers to the container on > the undercloud. > input: > - server_name > - container > - concurrency: 5 > - timeout: 1800 > - queue_name: tripleo > > tags: > - tripleo-common-managed > > tasks: > # actions > create_container: > workflow: tripleo.support.v1.create_container > on-success: get_servers_matching > on-error: set_create_container_failed > input: > container: <% $.container %> > queue_name: <% $.queue_name %> > > set_create_container_failed: > on-complete: send_message > publish: > type: tripleo.support.v1.fetch_logs.create_container > status: FAILED > message: <% task(create_container).result %> > > get_servers_matching: > action: nova.servers_list > on-success: collect_logs_on_servers > publish: > servers_with_name: <% task().result._info.where($.name.indexOf(execution().input.server_name) > -1) %> > > collect_logs_on_servers: > workflow: tripleo.support.v1.collect_logs > timeout: <% $.timeout %> > on-success: upload_logs_on_servers > on-error: set_collect_logs_on_servers_failed > input: > server_name: <% $.server_name %> > queue_name: <% $.queue_name %> > > set_collect_logs_on_servers_failed: > on-complete: send_message > publish: > type: tripleo.support.v1.fetch_logs.collect_logs_on_servers > status: FAILED > message: <% task(collect_logs_on_servers).result %> > > upload_logs_on_servers: > on-success: send_message > on-error: set_upload_logs_on_servers_failed > with-items: server in <% $.servers_with_name %> > concurrency: <% $.concurrency %> > workflow: tripleo.support.v1.upload_logs > input: > server_name: <% $.server.name %> > server_uuid: <% $.server.id %> > container: <% $.container %> > queue_name: <% $.queue_name 
%> > > set_upload_logs_on_servers_failed: > on-complete: send_message > publish: > type: tripleo.support.v1.fetch_logs.upload_logs > status: FAILED > message: <% task(upload_logs_on_servers).result %> > > # status messaging > send_message: > action: zaqar.queue_post > retry: count=5 delay=1 > input: > queue_name: <% $.queue_name %> > messages: > body: > type: <% $.get('type', 'tripleo.support.v1.fetch_logs') %> > payload: > status: <% $.get('status', 'SUCCESS') %> > message: <% $.get('message', '') %> > execution: <% execution() %> > on-success: > - fail: <% $.get('status') = 'FAILED' %> >' >2018-08-21 16:37:08,363 DEBUG: http://192.168.24.1:8989 "POST /v2/workbooks HTTP/1.1" 201 12000 >2018-08-21 16:37:08,369 DEBUG: RESP: [201] Content-Length: 12000 Content-Type: application/json Date: Tue, 21 Aug 2018 13:37:08 GMT Connection: keep-alive >RESP BODY: {"definition": "---\nversion: '2.0'\nname: tripleo.support.v1\ndescription: TripleO support workflows\n\nworkflows:\n\n collect_logs:\n description: >\n This workflow runs sosreport on the servers where their names match the\n provided server_name input. 
The logs are stored in the provided sos_dir.\n input:\n - server_name\n - sos_dir: /var/tmp/tripleo-sos\n - sos_options: boot,cluster,hardware,kernel,memory,nfs,openstack,packagemanager,performance,services,storage,system,webserver,virt\n - queue_name: tripleo\n\n tags:\n - tripleo-common-managed\n\n tasks:\n collect_logs_on_servers:\n workflow: tripleo.deployment.v1.deploy_on_servers\n on-success: send_message\n on-error: set_collect_logs_on_servers_failed\n input:\n server_name: <% $.server_name %>\n config_name: 'run_sosreport'\n config: |\n #!/bin/bash\n mkdir -p <% $.sos_dir %>\n sosreport --batch \\\n -p <% $.sos_options %> \\\n --tmp-dir <% $.sos_dir %>\n\n set_collect_logs_on_servers_failed:\n on-complete:\n - send_message\n publish:\n type: tripleo.deployment.v1.fetch_logs\n status: FAILED\n message: <% task().result %>\n\n # status messaging\n send_message:\n action: zaqar.queue_post\n retry: count=5 delay=1\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: <% $.get('type', 'tripleo.support.v1.collect_logs') %>\n payload:\n status: <% $.get('status', 'SUCCESS') %>\n message: <% $.get('message', '') %>\n execution: <% execution() %>\n on-success:\n - fail: <% $.get('status') = 'FAILED' %>\n\n upload_logs:\n description: >\n This workflow uploads the sosreport files stored in the provided sos_dir\n on the provided host (server_uuid) to a swift container on the undercloud\n input:\n - server_uuid\n - server_name\n - container\n - sos_dir: /var/tmp/tripleo-sos\n - queue_name: tripleo\n\n tags:\n - tripleo-common-managed\n\n tasks:\n # actions\n get_swift_information:\n action: tripleo.swift.swift_information\n on-success: do_log_upload\n on-error: set_get_swift_information_failed\n input:\n container: <% $.container %>\n publish:\n container_url: <% task().result.container_url %>\n auth_key: <% task().result.auth_key %>\n\n set_get_swift_information_failed:\n on-complete:\n - send_message\n publish:\n status: FAILED\n message: <% 
task(get_swift_information).result %>\n\n do_log_upload:\n action: tripleo.deployment.config\n on-success: send_message\n on-error: set_do_log_upload_failed\n input:\n server_id: <% $.server_uuid %>\n name: \"upload_logs\"\n config: |\n #!/bin/bash\n CONTAINER_URL=\"<% $.container_url %>\"\n TOKEN=\"<% $.auth_key %>\"\n SOS_DIR=\"<% $.sos_dir %>\"\n for FILE in $(find $SOS_DIR -type f); do\n FILENAME=$(basename $FILE)\n curl -X PUT -i -H \"X-Auth-Token: $TOKEN\" -T $FILE $CONTAINER_URL/$FILENAME\n if [ $? -eq 0 ]; then\n rm -f $FILE\n fi\n done\n group: \"script\"\n publish:\n message: \"Uploaded logs from <% $.server_name %>\"\n\n set_do_log_upload_failed:\n on-complete:\n - send_message\n publish:\n status: FAILED\n message: <% task(do_log_upload).result %>\n\n # status messaging\n send_message:\n action: zaqar.queue_post\n retry: count=5 delay=1\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: <% $.get('type', 'tripleo.support.v1.upload_logs') %>\n payload:\n status: <% $.get('status', 'SUCCESS') %>\n message: <% $.get('message', '') %>\n execution: <% execution() %>\n on-success:\n - fail: <% $.get('status') = 'FAILED' %>\n\n create_container:\n description: >\n This workflow is used to check if the container exists and creates it\n if it does not exist.\n input:\n - container\n - queue_name: tripleo\n\n tags:\n - tripleo-common-managed\n\n tasks:\n check_container:\n action: swift.head_container container=<% $.container %>\n on-success: send_message\n on-error: create_container\n\n create_container:\n action: swift.put_container\n input:\n container: <% $.container %>\n headers:\n x-container-meta-usage-tripleo: support\n on-success: send_message\n on-error: set_create_container_failed\n\n set_create_container_failed:\n on-complete:\n - send_message\n publish:\n type: tripleo.support.v1.create_container.create_container\n status: FAILED\n message: <% task(create_container).result %>\n\n # status messaging\n send_message:\n action: 
zaqar.queue_post\n retry: count=5 delay=1\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: <% $.get('type', 'tripleo.support.v1.create_container') %>\n payload:\n status: <% $.get('status', 'SUCCESS') %>\n message: <% $.get('message', '') %>\n execution: <% execution() %>\n on-success:\n - fail: <% $.get('status') = 'FAILED' %>\n\n delete_container:\n description: >\n This workflow deletes all the objects in a provided swift container and\n then removes the container itself from the undercloud.\n input:\n - container\n - concurrency: 5\n - timeout: 900\n - queue_name: tripleo\n\n tags:\n - tripleo-common-managed\n\n tasks:\n # actions\n check_container:\n action: swift.head_container container=<% $.container %>\n on-success: list_objects\n on-error: set_check_container_failure\n\n set_check_container_failure:\n on-complete: send_message\n publish:\n status: FAILED\n type: tripleo.support.v1.delete_container.check_container\n message: <% task(check_container).result %>\n\n list_objects:\n action: swift.get_container container=<% $.container %>\n on-success: delete_objects\n on-error: set_list_objects_failure\n publish:\n log_objects: <% task().result[1] %>\n\n set_list_objects_failure:\n on-complete: send_message\n publish:\n status: FAILED\n type: tripleo.support.v1.delete_container.list_objects\n message: <% task(list_objects).result %>\n\n delete_objects:\n action: swift.delete_object\n concurrency: <% $.concurrency %>\n timeout: <% $.timeout %>\n with-items: object in <% $.log_objects %>\n input:\n container: <% $.container %>\n obj: <% $.object.name %>\n on-success: remove_container\n on-error: set_delete_objects_failure\n\n set_delete_objects_failure:\n on-complete: send_message\n publish:\n status: FAILED\n type: tripleo.support.v1.delete_container.delete_objects\n message: <% task(delete_objects).result %>\n\n remove_container:\n action: swift.delete_container container=<% $.container %>\n on-success: send_message\n on-error: 
set_remove_container_failure\n\n set_remove_container_failure:\n on-complete: send_message\n publish:\n status: FAILED\n type: tripleo.support.v1.delete_container.remove_container\n message: <% task(remove_container).result %>\n\n # status messaging\n send_message:\n action: zaqar.queue_post\n wait-before: 5\n retry: count=5 delay=1\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: <% $.get('type', 'tripleo.support.v1.delete_container') %>\n payload:\n status: <% $.get('status', 'SUCCESS') %>\n message: <% $.get('message', '') %>\n execution: <% execution() %>\n on-success:\n - fail: <% $.get('status') = 'FAILED' %>\n\n fetch_logs:\n description: >\n This workflow creates a container on the undercloud, executes the log\n collection on the servers whose names match the provided server_name, and\n executes the log upload process on all the servers to the container on\n the undercloud.\n input:\n - server_name\n - container\n - concurrency: 5\n - timeout: 1800\n - queue_name: tripleo\n\n tags:\n - tripleo-common-managed\n\n tasks:\n # actions\n create_container:\n workflow: tripleo.support.v1.create_container\n on-success: get_servers_matching\n on-error: set_create_container_failed\n input:\n container: <% $.container %>\n queue_name: <% $.queue_name %>\n\n set_create_container_failed:\n on-complete: send_message\n publish:\n type: tripleo.support.v1.fetch_logs.create_container\n status: FAILED\n message: <% task(create_container).result %>\n\n get_servers_matching:\n action: nova.servers_list\n on-success: collect_logs_on_servers\n publish:\n servers_with_name: <% task().result._info.where($.name.indexOf(execution().input.server_name) > -1) %>\n\n collect_logs_on_servers:\n workflow: tripleo.support.v1.collect_logs\n timeout: <% $.timeout %>\n on-success: upload_logs_on_servers\n on-error: set_collect_logs_on_servers_failed\n input:\n server_name: <% $.server_name %>\n queue_name: <% $.queue_name %>\n\n set_collect_logs_on_servers_failed:\n 
on-complete: send_message\n publish:\n type: tripleo.support.v1.fetch_logs.collect_logs_on_servers\n status: FAILED\n message: <% task(collect_logs_on_servers).result %>\n\n upload_logs_on_servers:\n on-success: send_message\n on-error: set_upload_logs_on_servers_failed\n with-items: server in <% $.servers_with_name %>\n concurrency: <% $.concurrency %>\n workflow: tripleo.support.v1.upload_logs\n input:\n server_name: <% $.server.name %>\n server_uuid: <% $.server.id %>\n container: <% $.container %>\n queue_name: <% $.queue_name %>\n\n set_upload_logs_on_servers_failed:\n on-complete: send_message\n publish:\n type: tripleo.support.v1.fetch_logs.upload_logs\n status: FAILED\n message: <% task(upload_logs_on_servers).result %>\n\n # status messaging\n send_message:\n action: zaqar.queue_post\n retry: count=5 delay=1\n input:\n queue_name: <% $.queue_name %>\n messages:\n body:\n type: <% $.get('type', 'tripleo.support.v1.fetch_logs') %>\n payload:\n status: <% $.get('status', 'SUCCESS') %>\n message: <% $.get('message', '') %>\n execution: <% execution() %>\n on-success:\n - fail: <% $.get('status') = 'FAILED' %>\n", "name": "tripleo.support.v1", "tags": [], "created_at": "2018-08-21 13:37:08", "scope": "private", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "7eaec575-0d0f-4c7c-81f7-f6027709cd7e"} > >2018-08-21 16:37:08,369 DEBUG: HTTP POST http://192.168.24.1:8989/v2/workbooks 201 >2018-08-21 16:37:08,371 INFO: Mistral workbooks configured successfully >2018-08-21 16:37:08,384 DEBUG: Starting new HTTP connection (1): 192.168.24.1 >2018-08-21 16:37:10,935 DEBUG: http://192.168.24.1:8080 "GET /v1/AUTH_f22c2ea140d3466abe874cd49d41a625?format=json HTTP/1.1" 200 2 >2018-08-21 16:37:10,938 DEBUG: REQ: curl -i http://192.168.24.1:8080/v1/AUTH_f22c2ea140d3466abe874cd49d41a625?format=json -X GET -H "Accept-Encoding: gzip" -H "X-Auth-Token: gAAAAABbfBSD7ogl..." 
>2018-08-21 16:37:10,939 DEBUG: RESP STATUS: 200 OK >2018-08-21 16:37:10,940 DEBUG: RESP HEADERS: {u'Content-Length': u'2', u'X-Put-Timestamp': u'1534858630.93068', u'X-Account-Object-Count': u'0', u'X-Timestamp': u'1534858630.93068', u'X-Trans-Id': u'txbc5ef052148e49b09642a-005b7c1584', u'Date': u'Tue, 21 Aug 2018 13:37:10 GMT', u'X-Account-Bytes-Used': u'0', u'X-Account-Container-Count': u'0', u'Content-Type': u'application/json; charset=utf-8', u'X-Openstack-Request-Id': u'txbc5ef052148e49b09642a-005b7c1584'} >2018-08-21 16:37:10,941 DEBUG: RESP BODY: [] >2018-08-21 16:37:10,943 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/environments/tripleo.undercloud-config -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" >2018-08-21 16:37:10,986 DEBUG: http://192.168.24.1:8989 "GET /v2/environments/tripleo.undercloud-config HTTP/1.1" 404 115 >2018-08-21 16:37:10,989 DEBUG: RESP: [404] Content-Length: 115 Content-Type: application/json Date: Tue, 21 Aug 2018 13:37:10 GMT Connection: keep-alive >RESP BODY: {"debuginfo": null, "faultcode": "Client", "faultstring": "Environment not found [name=tripleo.undercloud-config]"} > >2018-08-21 16:37:10,989 DEBUG: Request returned failure status: 404 >2018-08-21 16:37:10,992 DEBUG: REQ: curl -g -i -X POST http://192.168.24.1:8989/v2/environments -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "content-type: application/json" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" -d '{"variables": "{\"undercloud_ceilometer_snmpd_password\": \"61a87fbc947bd19e29bf232e7309dbf0a2874b37\", \"undercloud_db_password\": \"114e0a647d47ba7e12d0c1b459bc5a3dd55490a6\"}", "name": "tripleo.undercloud-config", "description": "Undercloud configuration parameters"}' >2018-08-21 16:37:11,063 DEBUG: http://192.168.24.1:8989 "POST /v2/environments HTTP/1.1" 201 423 >2018-08-21 16:37:11,065 DEBUG: RESP: [201] 
Content-Length: 423 Content-Type: application/json Date: Tue, 21 Aug 2018 13:37:11 GMT Connection: keep-alive >RESP BODY: {"created_at": "2018-08-21 13:37:11", "description": "Undercloud configuration parameters", "variables": "{\"undercloud_ceilometer_snmpd_password\": \"61a87fbc947bd19e29bf232e7309dbf0a2874b37\", \"undercloud_db_password\": \"114e0a647d47ba7e12d0c1b459bc5a3dd55490a6\"}", "scope": "private", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "57a136ed-1565-4dcb-be07-efb2b3ab19af", "name": "tripleo.undercloud-config"} > >2018-08-21 16:37:11,066 DEBUG: HTTP POST http://192.168.24.1:8989/v2/environments 201 >2018-08-21 16:37:11,069 DEBUG: REQ: curl -g -i -X POST http://192.168.24.1:8989/v2/executions -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "content-type: application/json" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" -d '{"input": "{\"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\"}", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "description": ""}' >2018-08-21 16:37:14,594 DEBUG: http://192.168.24.1:8989 "POST /v2/executions HTTP/1.1" 201 684 >2018-08-21 16:37:14,597 DEBUG: RESP: [201] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:37:14 GMT Connection: keep-alive >RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", 
"project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"} > >2018-08-21 16:37:14,597 DEBUG: HTTP POST http://192.168.24.1:8989/v2/executions 201 >2018-08-21 16:37:14,599 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" >2018-08-21 16:37:14,660 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684 >2018-08-21 16:37:14,663 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:37:14 GMT Connection: keep-alive >RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"} > >2018-08-21 16:37:14,663 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200 >2018-08-21 16:37:19,668 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" >2018-08-21 16:37:19,720 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684 >2018-08-21 16:37:19,722 
DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:37:19 GMT Connection: keep-alive >RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"} > >2018-08-21 16:37:19,723 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200 >2018-08-21 16:37:24,730 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" >2018-08-21 16:37:24,781 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684 >2018-08-21 16:37:24,783 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:37:24 GMT Connection: keep-alive >RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": 
\"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"} > >2018-08-21 16:37:24,784 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200 >2018-08-21 16:37:29,790 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" >2018-08-21 16:37:29,846 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684 >2018-08-21 16:37:29,849 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:37:29 GMT Connection: keep-alive >RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"} > >2018-08-21 16:37:29,850 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200 >2018-08-21 16:37:34,853 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: 
{SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" >2018-08-21 16:37:34,903 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684 >2018-08-21 16:37:34,906 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:37:34 GMT Connection: keep-alive >RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"} > >2018-08-21 16:37:34,906 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200 >2018-08-21 16:37:39,909 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" >2018-08-21 16:37:39,961 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684 >2018-08-21 16:37:39,963 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:37:39 GMT Connection: keep-alive >RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": 
"7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"} > >2018-08-21 16:37:39,964 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200 >2018-08-21 16:37:44,971 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" >2018-08-21 16:37:45,021 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684 >2018-08-21 16:37:45,023 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:37:45 GMT Connection: keep-alive >RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"} > >2018-08-21 16:37:45,024 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200 >2018-08-21 16:37:50,030 DEBUG: REQ: curl -g -i 
-X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" >2018-08-21 16:37:50,081 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684 >2018-08-21 16:37:50,084 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:37:50 GMT Connection: keep-alive >RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"} > >2018-08-21 16:37:50,085 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200 >2018-08-21 16:37:55,092 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" >2018-08-21 16:37:55,142 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684 >2018-08-21 16:37:55,145 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:37:55 GMT Connection: keep-alive >RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", 
"workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"} > >2018-08-21 16:37:55,145 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200 >2018-08-21 16:38:00,151 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" >2018-08-21 16:38:00,204 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684 >2018-08-21 16:38:00,207 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:38:00 GMT Connection: keep-alive >RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"} > >2018-08-21 
16:38:00,207 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200 >2018-08-21 16:38:05,214 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" >2018-08-21 16:38:05,264 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684 >2018-08-21 16:38:05,267 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:38:05 GMT Connection: keep-alive >RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"} > >2018-08-21 16:38:05,267 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200 >2018-08-21 16:38:10,273 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" >2018-08-21 16:38:10,325 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684 >2018-08-21 16:38:10,327 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: 
Tue, 21 Aug 2018 13:38:10 GMT Connection: keep-alive >RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"} > >2018-08-21 16:38:10,328 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200 >2018-08-21 16:38:15,331 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" >2018-08-21 16:38:15,381 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684 >2018-08-21 16:38:15,383 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:38:15 GMT Connection: keep-alive >RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", 
"created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"} > >2018-08-21 16:38:15,384 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200 >2018-08-21 16:38:20,391 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" >2018-08-21 16:38:20,444 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684 >2018-08-21 16:38:20,447 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:38:20 GMT Connection: keep-alive >RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"} > >2018-08-21 16:38:20,448 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200 >2018-08-21 16:38:25,451 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" >2018-08-21 16:38:25,524 DEBUG: http://192.168.24.1:8989 "GET 
/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684 >2018-08-21 16:38:25,527 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:38:25 GMT Connection: keep-alive >RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"} > >2018-08-21 16:38:25,527 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200 >2018-08-21 16:38:30,534 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" >2018-08-21 16:38:30,586 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684 >2018-08-21 16:38:30,588 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:38:30 GMT Connection: keep-alive >RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": 
"{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"}
>
>2018-08-21 16:38:30,589 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200
>2018-08-21 16:38:35,592 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673"
>2018-08-21 16:38:35,642 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684
>2018-08-21 16:38:35,645 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:38:35 GMT Connection: keep-alive
>RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"}
>
>2018-08-21 16:38:35,645 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200
>2018-08-21 16:38:40,652 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673"
>2018-08-21 16:38:40,704 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684
>2018-08-21 16:38:40,707 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:38:40 GMT Connection: keep-alive
>RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"}
>
>2018-08-21 16:38:40,707 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200
>2018-08-21 16:38:45,714 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673"
>2018-08-21 16:38:45,764 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684
>2018-08-21 16:38:45,767 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:38:45 GMT Connection: keep-alive
>RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"}
>
>2018-08-21 16:38:45,767 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200
>2018-08-21 16:38:50,771 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673"
>2018-08-21 16:38:50,823 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684
>2018-08-21 16:38:50,825 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:38:50 GMT Connection: keep-alive
>RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"}
>
>2018-08-21 16:38:50,826 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200
>2018-08-21 16:38:55,829 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673"
>2018-08-21 16:38:55,879 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684
>2018-08-21 16:38:55,882 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:38:55 GMT Connection: keep-alive
>RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"}
>
>2018-08-21 16:38:55,882 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200
>2018-08-21 16:39:00,889 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673"
>2018-08-21 16:39:00,941 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684
>2018-08-21 16:39:00,944 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:39:00 GMT Connection: keep-alive
>RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"}
>
>2018-08-21 16:39:00,944 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200
>2018-08-21 16:39:05,947 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673"
>2018-08-21 16:39:05,997 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684
>2018-08-21 16:39:06,000 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:39:05 GMT Connection: keep-alive
>RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"}
>
>2018-08-21 16:39:06,000 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200
>2018-08-21 16:39:11,007 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673"
>2018-08-21 16:39:11,059 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684
>2018-08-21 16:39:11,061 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:39:11 GMT Connection: keep-alive
>RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"}
>
>2018-08-21 16:39:11,062 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200
>2018-08-21 16:39:16,065 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673"
>2018-08-21 16:39:16,116 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684
>2018-08-21 16:39:16,119 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:39:16 GMT Connection: keep-alive
>RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"}
>
>2018-08-21 16:39:16,119 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200
>2018-08-21 16:39:21,124 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673"
>2018-08-21 16:39:21,181 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684
>2018-08-21 16:39:21,183 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:39:21 GMT Connection: keep-alive
>RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"}
>
>2018-08-21 16:39:21,184 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200
>2018-08-21 16:39:26,191 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673"
>2018-08-21 16:39:26,241 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684
>2018-08-21 16:39:26,243 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:39:26 GMT Connection: keep-alive
>RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"}
>
>2018-08-21 16:39:26,244 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200
>2018-08-21 16:39:31,250 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673"
>2018-08-21 16:39:31,302 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684
>2018-08-21 16:39:31,304 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:39:31 GMT Connection: keep-alive
>RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"}
>
>2018-08-21 16:39:31,305 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200
>2018-08-21 16:39:36,311 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673"
>2018-08-21 16:39:36,361 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684
>2018-08-21 16:39:36,364 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:39:36 GMT Connection: keep-alive
>RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"}
>
>2018-08-21 16:39:36,365 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200
>2018-08-21 16:39:41,368 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673"
>2018-08-21 16:39:41,420 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684
>2018-08-21 16:39:41,422 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:39:41 GMT Connection: keep-alive
>RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"}
>
>2018-08-21 16:39:41,423 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200
>2018-08-21 16:39:46,428 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673"
>2018-08-21 16:39:46,478 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684
>2018-08-21 16:39:46,481 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:39:46 GMT Connection: keep-alive
>RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"}
>
>2018-08-21 16:39:46,481 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200
>2018-08-21 16:39:51,484 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673"
>2018-08-21 16:39:51,565 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684
>2018-08-21 16:39:51,568 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:39:51 GMT Connection: keep-alive
>RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"}
>
>2018-08-21 16:39:51,569 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200
>2018-08-21 16:39:56,574 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673"
>2018-08-21 16:39:56,623 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684
>2018-08-21 16:39:56,626 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:39:56 GMT Connection: keep-alive
>RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"}
>
>2018-08-21 16:39:56,627 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200
>2018-08-21 16:40:01,629 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673"
>2018-08-21 16:40:01,680 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684
>2018-08-21 16:40:01,683 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:40:01 GMT Connection: keep-alive
>RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"}
>
>2018-08-21 16:40:01,684 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200
>2018-08-21 16:40:06,689 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673"
>2018-08-21 16:40:06,738 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684
>2018-08-21 16:40:06,741 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:40:06 GMT Connection: keep-alive
>RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"}
>
>2018-08-21 16:40:06,742 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200
>2018-08-21 16:40:11,744 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673"
>2018-08-21 16:40:11,797 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684
>2018-08-21 16:40:11,799 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:40:11 GMT Connection: keep-alive
>RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"}
>
>2018-08-21 16:40:11,800 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200
>2018-08-21 16:40:16,806 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673"
>2018-08-21 16:40:16,856 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684
>2018-08-21 16:40:16,859 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:40:16 GMT Connection: keep-alive
>RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"}
>
>2018-08-21 16:40:16,860 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200
>2018-08-21 16:40:21,862 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673"
>2018-08-21 16:40:21,914 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684
>2018-08-21 16:40:21,917 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:40:21 GMT Connection: keep-alive
>RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"}
>
>2018-08-21 16:40:21,918 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200
>2018-08-21 16:40:26,924 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673"
>2018-08-21 16:40:26,974 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684
>2018-08-21 16:40:26,977 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:40:26 GMT Connection: keep-alive
>RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"}
>
>2018-08-21 16:40:26,977 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200
>2018-08-21 16:40:31,984 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673"
>2018-08-21 16:40:32,836 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684
>2018-08-21 16:40:32,838 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:40:32 GMT Connection: keep-alive
>RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"}
>
>2018-08-21 16:40:32,839 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200
>2018-08-21 16:40:37,845 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673"
>2018-08-21 16:40:37,896 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684
>2018-08-21 16:40:37,898 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:40:37 GMT Connection: keep-alive
>RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"}
>
>2018-08-21 16:40:37,899 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200
>2018-08-21 16:40:42,901 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673"
>2018-08-21 16:40:42,953 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684
>2018-08-21 16:40:42,956 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:40:42 GMT Connection: keep-alive
>RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"}
>
>2018-08-21 16:40:42,957 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200
>2018-08-21 16:40:47,963 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673"
>2018-08-21 16:40:48,013 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684
>2018-08-21 16:40:48,016 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:40:48 GMT Connection: keep-alive
>RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"}
>
>2018-08-21 16:40:48,016 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200
>2018-08-21 16:40:53,019 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673"
>2018-08-21 16:40:53,071 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684
>2018-08-21 16:40:53,073 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:40:53 GMT Connection: keep-alive
>RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"}
>
>2018-08-21 16:40:53,074 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200
>2018-08-21 16:40:58,081 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673"
>2018-08-21 16:40:58,131 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684
>2018-08-21 16:40:58,134 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:40:58 GMT Connection: keep-alive
>RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": 
"2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"} > >2018-08-21 16:40:58,134 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200 >2018-08-21 16:41:03,138 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" >2018-08-21 16:41:03,190 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684 >2018-08-21 16:41:03,192 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:41:03 GMT Connection: keep-alive >RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"} > >2018-08-21 16:41:03,193 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200 
>2018-08-21 16:41:08,199 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" >2018-08-21 16:41:08,249 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684 >2018-08-21 16:41:08,252 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:41:08 GMT Connection: keep-alive >RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"} > >2018-08-21 16:41:08,253 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200 >2018-08-21 16:41:13,259 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" >2018-08-21 16:41:13,310 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684 >2018-08-21 16:41:13,313 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:41:13 GMT Connection: keep-alive >RESP BODY: {"root_execution_id": null, "state_info": null, 
"description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"} > >2018-08-21 16:41:13,314 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200 >2018-08-21 16:41:18,319 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" >2018-08-21 16:41:18,369 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684 >2018-08-21 16:41:18,371 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:41:18 GMT Connection: keep-alive >RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": 
"42f95343-e6a3-4afb-bf59-40c20cb61527"} > >2018-08-21 16:41:18,372 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200 >2018-08-21 16:41:23,378 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" >2018-08-21 16:41:23,435 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684 >2018-08-21 16:41:23,438 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:41:23 GMT Connection: keep-alive >RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"} > >2018-08-21 16:41:23,439 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200 >2018-08-21 16:41:28,445 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" >2018-08-21 16:41:28,495 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684 >2018-08-21 16:41:28,498 DEBUG: RESP: [200] 
Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:41:28 GMT Connection: keep-alive >RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"} > >2018-08-21 16:41:28,499 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200 >2018-08-21 16:41:33,505 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" >2018-08-21 16:41:33,557 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684 >2018-08-21 16:41:33,560 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:41:33 GMT Connection: keep-alive >RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", 
\"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"} > >2018-08-21 16:41:33,560 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200 >2018-08-21 16:41:38,567 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" >2018-08-21 16:41:38,617 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684 >2018-08-21 16:41:38,620 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:41:38 GMT Connection: keep-alive >RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"} > >2018-08-21 16:41:38,621 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200 >2018-08-21 16:41:43,625 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" >2018-08-21 16:41:43,677 DEBUG: 
http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684 >2018-08-21 16:41:43,679 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:41:43 GMT Connection: keep-alive >RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"} > >2018-08-21 16:41:43,680 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200 >2018-08-21 16:41:48,686 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" >2018-08-21 16:41:48,736 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684 >2018-08-21 16:41:48,739 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:41:48 GMT Connection: keep-alive >RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", 
"output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"} > >2018-08-21 16:41:48,740 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200 >2018-08-21 16:41:53,746 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" >2018-08-21 16:41:53,825 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684 >2018-08-21 16:41:53,828 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:41:53 GMT Connection: keep-alive >RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"} > >2018-08-21 16:41:53,828 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200 >2018-08-21 16:41:58,835 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c 
keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" >2018-08-21 16:41:58,885 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684 >2018-08-21 16:41:58,888 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:41:58 GMT Connection: keep-alive >RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"} > >2018-08-21 16:41:58,888 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200 >2018-08-21 16:42:03,893 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" >2018-08-21 16:42:03,947 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684 >2018-08-21 16:42:03,950 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:42:03 GMT Connection: keep-alive >RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": 
"2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"} > >2018-08-21 16:42:03,950 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200 >2018-08-21 16:42:08,957 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" >2018-08-21 16:42:09,008 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684 >2018-08-21 16:42:09,010 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:42:09 GMT Connection: keep-alive >RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"} > >2018-08-21 16:42:09,011 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200 
>2018-08-21 16:42:14,017 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" >2018-08-21 16:42:14,070 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684 >2018-08-21 16:42:14,072 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:42:14 GMT Connection: keep-alive >RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"} > >2018-08-21 16:42:14,073 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200 >2018-08-21 16:42:19,078 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" >2018-08-21 16:42:19,128 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684 >2018-08-21 16:42:19,131 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:42:19 GMT Connection: keep-alive >RESP BODY: {"root_execution_id": null, "state_info": null, 
"description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"} > >2018-08-21 16:42:19,131 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200 >2018-08-21 16:42:24,137 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" >2018-08-21 16:42:24,189 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684 >2018-08-21 16:42:24,192 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:42:24 GMT Connection: keep-alive >RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": 
"42f95343-e6a3-4afb-bf59-40c20cb61527"} > >2018-08-21 16:42:24,193 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200 >2018-08-21 16:42:29,197 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" >2018-08-21 16:42:29,248 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684 >2018-08-21 16:42:29,251 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:42:29 GMT Connection: keep-alive >RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"} > >2018-08-21 16:42:29,251 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200 >2018-08-21 16:42:34,257 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" >2018-08-21 16:42:34,309 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684 >2018-08-21 16:42:34,312 DEBUG: RESP: [200] 
Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:42:34 GMT Connection: keep-alive >RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"} > >2018-08-21 16:42:34,312 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200 >2018-08-21 16:42:39,314 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673" >2018-08-21 16:42:39,364 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684 >2018-08-21 16:42:39,367 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:42:39 GMT Connection: keep-alive >RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", 
\"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"}
>
>2018-08-21 16:42:39,368 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200
>2018-08-21 16:42:44,372 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673"
>2018-08-21 16:42:44,424 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684
>2018-08-21 16:42:44,427 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:42:44 GMT Connection: keep-alive
>RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"}
>
>2018-08-21 16:42:44,428 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200
>2018-08-21 16:42:49,434 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673"
>2018-08-21 16:42:49,486 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684
>2018-08-21 16:42:49,489 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:42:49 GMT Connection: keep-alive
>RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"}
>
>2018-08-21 16:42:49,489 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200
>2018-08-21 16:42:54,492 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673"
>2018-08-21 16:42:54,546 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684
>2018-08-21 16:42:54,549 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:42:54 GMT Connection: keep-alive
>RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"}
>
>2018-08-21 16:42:54,549 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200
>2018-08-21 16:42:59,556 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673"
>2018-08-21 16:42:59,608 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684
>2018-08-21 16:42:59,610 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:42:59 GMT Connection: keep-alive
>RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"}
>
>2018-08-21 16:42:59,611 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200
>2018-08-21 16:43:04,618 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673"
>2018-08-21 16:43:04,671 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684
>2018-08-21 16:43:04,674 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:43:04 GMT Connection: keep-alive
>RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"}
>
>2018-08-21 16:43:04,675 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200
>2018-08-21 16:43:09,681 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673"
>2018-08-21 16:43:09,734 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684
>2018-08-21 16:43:09,736 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:43:09 GMT Connection: keep-alive
>RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"}
>
>2018-08-21 16:43:09,737 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200
>2018-08-21 16:43:14,742 DEBUG: REQ: curl -g -i -X GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 -H "User-Agent: -c keystoneauth1/3.4.0 python-requests/2.14.2 CPython/2.7.5" -H "X-Auth-Token: {SHA1}b995f3af1e846051bcc0732bd8b9b361b91ac673"
>2018-08-21 16:43:14,795 DEBUG: http://192.168.24.1:8989 "GET /v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 HTTP/1.1" 200 684
>2018-08-21 16:43:14,798 DEBUG: RESP: [200] Content-Length: 684 Content-Type: application/json Date: Tue, 21 Aug 2018 13:43:14 GMT Connection: keep-alive
>RESP BODY: {"root_execution_id": null, "state_info": null, "description": "", "state": "RUNNING", "workflow_name": "tripleo.plan_management.v1.create_deployment_plan", "task_execution_id": null, "updated_at": "2018-08-21 13:37:14", "workflow_id": "7285aaf4-974d-49c1-a9f4-dcaa639f744e", "params": "{\"namespace\": \"\", \"env\": {}}", "workflow_namespace": "", "output": "{}", "input": "{\"generate_passwords\": true, \"use_default_templates\": true, \"queue_name\": \"e6488ad1-1635-4462-87ae-52fcbfd44b5a\", \"container\": \"overcloud\", \"source_url\": null}", "created_at": "2018-08-21 13:37:14", "project_id": "f22c2ea140d3466abe874cd49d41a625", "id": "42f95343-e6a3-4afb-bf59-40c20cb61527"}
>
>2018-08-21 16:43:14,798 DEBUG: HTTP GET http://192.168.24.1:8989/v2/executions/42f95343-e6a3-4afb-bf59-40c20cb61527 200
>2018-08-21 16:43:14,799 ERROR: TIMEOUT waiting for execution 42f95343-e6a3-4afb-bf59-40c20cb61527 to finish. State: RUNNING
>2018-08-21 16:43:14,799 DEBUG: An exception occurred
>Traceback (most recent call last):
>  File "/usr/lib/python2.7/site-packages/instack_undercloud/undercloud.py", line 2337, in install
>    _post_config(instack_env, upgrade)
>  File "/usr/lib/python2.7/site-packages/instack_undercloud/undercloud.py", line 2029, in _post_config
>    _post_config_mistral(instack_env, mistral, swift)
>  File "/usr/lib/python2.7/site-packages/instack_undercloud/undercloud.py", line 1965, in _post_config_mistral
>    _create_default_plan(mistral, plans)
>  File "/usr/lib/python2.7/site-packages/instack_undercloud/undercloud.py", line 1907, in _create_default_plan
>    fail_on_error=True)
>  File "/usr/lib/python2.7/site-packages/instack_undercloud/undercloud.py", line 1852, in _wait_for_mistral_execution
>    raise RuntimeError(error_message)
>RuntimeError: TIMEOUT waiting for execution 42f95343-e6a3-4afb-bf59-40c20cb61527 to finish. State: RUNNING
>2018-08-21 16:43:14,802 ERROR:
>#############################################################################
>Undercloud install failed.
>
>Reason: TIMEOUT waiting for execution 42f95343-e6a3-4afb-bf59-40c20cb61527 to finish. State: RUNNING
>
>See the previous output for details about what went wrong. The full install
>log can be found at /home/stack/.instack/install-undercloud.log.
>
>#############################################################################
>