Started by upstream project "rdo-delorean-promote-mitaka" build number 622
originally caused by:
 Started by user RDO Project
[EnvInject] - Loading node environment variables.
Building remotely on rdo-ci-slave01 (rdo khaleesi) in workspace /home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002
[WS-CLEANUP] Deleting project workspace...
[WS-CLEANUP] Done
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/redhat-openstack/weirdo.git
 > /usr/bin/git init /home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo # timeout=10
Fetching upstream changes from https://github.com/redhat-openstack/weirdo.git
 > /usr/bin/git --version # timeout=10
 > /usr/bin/git -c core.askpass=true fetch --tags --progress https://github.com/redhat-openstack/weirdo.git +refs/heads/*:refs/remotes/origin/*
 > /usr/bin/git config remote.origin.url https://github.com/redhat-openstack/weirdo.git # timeout=10
 > /usr/bin/git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > /usr/bin/git config remote.origin.url https://github.com/redhat-openstack/weirdo.git # timeout=10
Fetching upstream changes from https://github.com/redhat-openstack/weirdo.git
 > /usr/bin/git -c core.askpass=true fetch --tags --progress https://github.com/redhat-openstack/weirdo.git +refs/heads/*:refs/remotes/origin/*
 > /usr/bin/git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 77084420349d74c7e5ce0594c62583c4ac0f6992 (origin/master)
 > /usr/bin/git config core.sparsecheckout # timeout=10
 > /usr/bin/git checkout -f 77084420349d74c7e5ce0594c62583c4ac0f6992
 > /usr/bin/git rev-list 77084420349d74c7e5ce0594c62583c4ac0f6992 # timeout=10
 > /usr/bin/git tag -a -f -m Jenkins Build #245 jenkins-weirdo-mitaka-promote-puppet-openstack-scenario002-245 # timeout=10
[weirdo-mitaka-promote-puppet-openstack-scenario002] $ /bin/sh -xe /tmp/hudson8792083293300023705.sh
+ ARTIFACT_URL=https://ci.centos.org/artifacts/rdo
+ cat
[weirdo-mitaka-promote-puppet-openstack-scenario002] $ /bin/sh -xe /tmp/hudson7522337465452060406.sh
+ export ANSIBLE_HOSTS=/home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo/hosts
+ ANSIBLE_HOSTS=/home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo/hosts
+ export SSID_FILE=/home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo/cico-ssid
+ SSID_FILE=/home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo/cico-ssid
+ NODE_COUNT=1
+ ANSIBLE_HOSTS=/home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo/hosts
+ SSID_FILE=/home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo/cico-ssid
+ ANSIBLE_SSH_KEY=/home/rhos-ci/.ssh/id_rsa
+ cat
++ cico -q node get --count 1 --column hostname --column ip_address --column comment -f value
+ nodes='n2.dusty 172.19.2.66 5ad1554c'
+ touch /home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo/cico-ssid
+ IFS='
'
+ for node in '${nodes}'
++ echo 'n2.dusty 172.19.2.66 5ad1554c'
++ cut -f1 -d ' '
+ host=n2.dusty
++ cut -f2 -d ' '
++ echo 'n2.dusty 172.19.2.66 5ad1554c'
+ address=172.19.2.66
++ echo 'n2.dusty 172.19.2.66 5ad1554c'
++ cut -f3 -d ' '
+ ssid=5ad1554c
+ line='n2.dusty ansible_host=172.19.2.66 ansible_user=root ansible_ssh_private_key_file=/home/rhos-ci/.ssh/id_rsa cico_ssid=5ad1554c'
+ echo 'n2.dusty ansible_host=172.19.2.66 ansible_user=root ansible_ssh_private_key_file=/home/rhos-ci/.ssh/id_rsa cico_ssid=5ad1554c'
+ grep -q 5ad1554c /home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo/cico-ssid
+ echo 5ad1554c
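For readability, here is what the traced provisioning step above amounts to as a standalone script. This is a minimal sketch, not the job's actual script: it assumes the cico CLI from python-cicoclient (version 0.3.9 is installed into the tox virtualenv later in this log) and the same workspace paths as this build; releasing the node afterwards with "cico node done" is an assumption about the cleanup step, which is not shown in this excerpt.

#!/bin/sh
# Sketch: request one node from the ci.centos.org (Duffy) pool and append it
# to the Ansible inventory consumed by the playbook run below.
WORKSPACE=/home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002
ANSIBLE_HOSTS=$WORKSPACE/weirdo/hosts
SSID_FILE=$WORKSPACE/weirdo/cico-ssid
ANSIBLE_SSH_KEY=/home/rhos-ci/.ssh/id_rsa
# Each returned line is "hostname ip_address ssid".
nodes=$(cico -q node get --count 1 --column hostname --column ip_address --column comment -f value)
touch "$SSID_FILE"
IFS='
'
for node in $nodes; do
    host=$(echo "$node" | cut -f1 -d ' ')
    address=$(echo "$node" | cut -f2 -d ' ')
    ssid=$(echo "$node" | cut -f3 -d ' ')
    # One inventory line per node, keyed by the Duffy session id (ssid).
    echo "$host ansible_host=$address ansible_user=root ansible_ssh_private_key_file=$ANSIBLE_SSH_KEY cico_ssid=$ssid" >> "$ANSIBLE_HOSTS"
    # Record the ssid so the node can be released when the job finishes,
    # e.g. (assumed cleanup): cico node done "$ssid"
    grep -q "$ssid" "$SSID_FILE" || echo "$ssid" >> "$SSID_FILE"
done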
[weirdo-mitaka-promote-puppet-openstack-scenario002] $ /bin/sh -xe /tmp/hudson1086521985592735907.sh
+ [[ mitaka != \m\a\s\t\e\r ]]
+ version=stable/mitaka
+ delorean_hash=2d/d7/2dd7f56b0b04af66c8ea2739df3dcc43dc3a9316_cbd0900e
+ delorean_url=http://trunk.rdoproject.org/centos7-mitaka/2d/d7/2dd7f56b0b04af66c8ea2739df3dcc43dc3a9316_cbd0900e/delorean.repo
+ export ARA_DATABASE=sqlite:////home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo-mitaka-promote-puppet-openstack-scenario002.sqlite
+ ARA_DATABASE=sqlite:////home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo-mitaka-promote-puppet-openstack-scenario002.sqlite
+ cd weirdo
+ git log -n 5 --oneline
7708442 Bump version of ARA to beta release
24e3433 Remove knowledge of log collection from weirdo scenarios
35eb10e Add support for ARA playbook recording in WeIRDO
5455ef6 Resolve issue retrieving rsync password for ci-centos
ec73dd4 Decouple WeIRDO from ci-centos provisioner
+ tox -e ansible-playbook -- -vv -i hosts playbooks/puppet-openstack-scenario002.yml -e ci_environment=ci-centos -e delorean_url=http://trunk.rdoproject.org/centos7-mitaka/2d/d7/2dd7f56b0b04af66c8ea2739df3dcc43dc3a9316_cbd0900e/delorean.repo -e openstack_release=mitaka -e version=stable/mitaka
ansible-playbook create: /home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo/.tox/ansible-playbook
ansible-playbook installdeps: -r/home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo/test-requirements.txt
ansible-playbook develop-inst: /home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo
ansible-playbook installed: alabaster==0.7.8,ansible==2.0.1.0,ansible-lint==2.7.0,ara==0.5.2,Babel==2.3.4,cffi==1.6.0,cliff==2.0.0,cmd2==0.6.8,cryptography==1.3.2,decorator==4.0.9,docutils==0.12,enum34==1.1.6,Flask==0.10.1,Flask-SQLAlchemy==2.1,graphviz==0.4.10,idna==2.1,imagesize==0.7.1,ipaddress==1.0.16,itsdangerous==0.24,Jinja2==2.8,MarkupSafe==0.23,paramiko==2.0.0,pbr==1.9.1,prettytable==0.7.2,pyasn1==0.1.9,pycparser==2.14,pycrypto==2.6.1,Pygments==2.1.3,PyMySQL==0.7.3,pyparsing==2.1.4,python-cicoclient==0.3.9,pytz==2016.4,PyYAML==3.11,requests==2.10.0,six==1.10.0,snowballstemmer==1.2.1,Sphinx==1.4.1,sphinx-rtd-theme==0.1.9,SQLAlchemy==1.0.13,stevedore==1.13.0,unicodecsv==0.14.1,-e git+https://github.com/redhat-openstack/weirdo.git@77084420349d74c7e5ce0594c62583c4ac0f6992#egg=weirdo-origin_master,Werkzeug==0.11.9,wheel==0.24.0,You are using pip version 7.1.2, however version 8.1.2 is available.,You should consider upgrading via the 'pip install --upgrade pip' command.
ansible-playbook runtests: PYTHONHASHSEED='528614067'
ansible-playbook runtests: commands[0] | ansible-galaxy install -r ansible-role-requirements.yml --ignore-errors
- extracting packstack to playbooks/roles/packstack
- packstack was installed successfully
[DEPRECATION WARNING]: The comma separated role spec format, use the yaml/explicit format instead.. This feature will be removed in a future release. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
- extracting puppet-openstack to playbooks/roles/puppet-openstack
- puppet-openstack was installed successfully
- extracting kolla to playbooks/roles/kolla
- kolla was installed successfully
- extracting common to playbooks/roles/common
- common was installed successfully
ansible-playbook runtests: commands[1] | ansible-playbook -vv -i hosts playbooks/puppet-openstack-scenario002.yml -e ci_environment=ci-centos -e delorean_url=http://trunk.rdoproject.org/centos7-mitaka/2d/d7/2dd7f56b0b04af66c8ea2739df3dcc43dc3a9316_cbd0900e/delorean.repo -e openstack_release=mitaka -e version=stable/mitaka
Using /home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo/playbooks/ansible.cfg as config file
1 plays in playbooks/puppet-openstack-scenario002.yml

PLAY [Run puppet-openstack-integration scenario002 tests] **********************

TASK [setup] *******************************************************************
Friday 20 May 2016 11:21:11 +0000 (0:00:01.006) 0:00:01.006 ************
ok: [n2.dusty]

TASK [common : include] ********************************************************
task path: /home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo/playbooks/roles/common/tasks/main.yml:17
Friday 20 May 2016 11:21:13 +0000 (0:00:02.153) 0:00:03.160 ************
included: /home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo/playbooks/roles/common/tasks/packages.yml for n2.dusty

TASK [common : Update all packages] ********************************************
task path: /home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo/playbooks/roles/common/tasks/packages.yml:18
Friday 20 May 2016 11:21:14 +0000 (0:00:01.148) 0:00:04.308 ************
ok: [n2.dusty] => {"changed": false, "msg": "", "rc": 0, "results": ["Nothing to do here, all packages are up to date"]}
results: Nothing to do here, all packages are up to date

TASK [common : Install base packages] ******************************************
task path: /home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo/playbooks/roles/common/tasks/packages.yml:26
Friday 20 May 2016 11:21:36 +0000 (0:00:21.648) 0:00:25.956 ************
changed: [n2.dusty] => (item=[u'deltarpm', u'@Development tools', u'git', u'python-setuptools', u'wget', u'redhat-lsb-core', u'libselinux-python', u'yum-plugin-priorities']) => {"changed": true, "item": ["deltarpm", "@Development tools", "git", "python-setuptools", "wget", "redhat-lsb-core", "libselinux-python", "yum-plugin-priorities"], "msg": "warning: /var/cache/yum/x86_64/7/base/packages/apr-1.4.8-3.el7.x86_64.rpm: Header V3 RSA/SHA256 Signature, key ID f4a80eb5: NOKEY\nImporting GPG key 0xF4A80EB5:\n Userid : \"CentOS-7 Key (CentOS 7 Official Signing Key) \"\n Fingerprint: 6341 ab27 53d7 8a78 a7c2 7bb1 24c6 a8a7 f4a8 0eb5\n Package : centos-release-7-2.1511.el7.centos.2.10.x86_64 (@base)\n From : /etc/pki/rpm-gpg/RPM-GPG-KEY-CentOS-7\n", "rc": 0, "results": ["libselinux-python-2.2.2-6.el7.x86_64 providing libselinux-python is already installed", "Loaded plugins: fastestmirror\nLoading mirror speeds from cached hostfile\n * base: mirror.centos.org\n * extras: mirror.centos.org\n * updates: mirror.centos.org\nResolving Dependencies\n--> Running transaction check\n---> Package autoconf.noarch 0:2.69-11.el7 will be installed\n--> Processing Dependency: perl >= 5.006 for package: autoconf-2.69-11.el7.noarch\n--> Processing Dependency: m4 >= 1.4.14 for package: autoconf-2.69-11.el7.noarch\n--> Processing
Dependency: perl(warnings) for package: autoconf-2.69-11.el7.noarch\n--> Processing Dependency: perl(vars) for package: autoconf-2.69-11.el7.noarch\n--> Processing Dependency: perl(strict) for package: autoconf-2.69-11.el7.noarch\n--> Processing Dependency: perl(constant) for package: autoconf-2.69-11.el7.noarch\n--> Processing Dependency: perl(Text::ParseWords) for package: autoconf-2.69-11.el7.noarch\n--> Processing Dependency: perl(POSIX) for package: autoconf-2.69-11.el7.noarch\n--> Processing Dependency: perl(IO::File) for package: autoconf-2.69-11.el7.noarch\n--> Processing Dependency: perl(Getopt::Long) for package: autoconf-2.69-11.el7.noarch\n--> Processing Dependency: perl(File::stat) for package: autoconf-2.69-11.el7.noarch\n--> Processing Dependency: perl(File::Spec) for package: autoconf-2.69-11.el7.noarch\n--> Processing Dependency: perl(File::Path) for package: autoconf-2.69-11.el7.noarch\n--> Processing Dependency: perl(File::Find) for package: autoconf-2.69-11.el7.noarch\n--> Processing Dependency: perl(File::Copy) for package: autoconf-2.69-11.el7.noarch\n--> Processing Dependency: perl(File::Compare) for package: autoconf-2.69-11.el7.noarch\n--> Processing Dependency: perl(File::Basename) for package: autoconf-2.69-11.el7.noarch\n--> Processing Dependency: perl(Exporter) for package: autoconf-2.69-11.el7.noarch\n--> Processing Dependency: perl(Errno) for package: autoconf-2.69-11.el7.noarch\n--> Processing Dependency: perl(DynaLoader) for package: autoconf-2.69-11.el7.noarch\n--> Processing Dependency: perl(Data::Dumper) for package: autoconf-2.69-11.el7.noarch\n--> Processing Dependency: perl(Cwd) for package: autoconf-2.69-11.el7.noarch\n--> Processing Dependency: perl(Class::Struct) for package: autoconf-2.69-11.el7.noarch\n--> Processing Dependency: perl(Carp) for package: autoconf-2.69-11.el7.noarch\n--> Processing Dependency: /usr/bin/perl for package: autoconf-2.69-11.el7.noarch\n---> Package automake.noarch 0:1.13.4-3.el7 will be installed\n--> Processing Dependency: perl(threads) for package: automake-1.13.4-3.el7.noarch\n--> Processing Dependency: perl(Thread::Queue) for package: automake-1.13.4-3.el7.noarch\n--> Processing Dependency: perl(TAP::Parser) for package: automake-1.13.4-3.el7.noarch\n---> Package bison.x86_64 0:2.7-4.el7 will be installed\n---> Package byacc.x86_64 0:1.9.20130304-3.el7 will be installed\n---> Package cscope.x86_64 0:15.8-7.el7 will be installed\n--> Processing Dependency: emacs-filesystem for package: cscope-15.8-7.el7.x86_64\n---> Package ctags.x86_64 0:5.8-13.el7 will be installed\n---> Package deltarpm.x86_64 0:3.6-3.el7 will be installed\n---> Package diffstat.x86_64 0:1.57-4.el7 will be installed\n---> Package doxygen.x86_64 1:1.8.5-3.el7 will be installed\n---> Package elfutils.x86_64 0:0.163-3.el7 will be installed\n---> Package flex.x86_64 0:2.5.37-3.el7 will be installed\n---> Package gcc.x86_64 0:4.8.5-4.el7 will be installed\n--> Processing Dependency: cpp = 4.8.5-4.el7 for package: gcc-4.8.5-4.el7.x86_64\n--> Processing Dependency: glibc-devel >= 2.2.90-12 for package: gcc-4.8.5-4.el7.x86_64\n--> Processing Dependency: libmpfr.so.4()(64bit) for package: gcc-4.8.5-4.el7.x86_64\n--> Processing Dependency: libmpc.so.3()(64bit) for package: gcc-4.8.5-4.el7.x86_64\n---> Package gcc-c++.x86_64 0:4.8.5-4.el7 will be installed\n--> Processing Dependency: libstdc++-devel = 4.8.5-4.el7 for package: gcc-c++-4.8.5-4.el7.x86_64\n---> Package gcc-gfortran.x86_64 0:4.8.5-4.el7 will be installed\n--> Processing Dependency: 
libquadmath-devel = 4.8.5-4.el7 for package: gcc-gfortran-4.8.5-4.el7.x86_64\n--> Processing Dependency: libquadmath = 4.8.5-4.el7 for package: gcc-gfortran-4.8.5-4.el7.x86_64\n--> Processing Dependency: libgfortran = 4.8.5-4.el7 for package: gcc-gfortran-4.8.5-4.el7.x86_64\n--> Processing Dependency: libgfortran.so.3()(64bit) for package: gcc-gfortran-4.8.5-4.el7.x86_64\n---> Package git.x86_64 0:1.8.3.1-6.el7_2.1 will be installed\n--> Processing Dependency: perl-Git = 1.8.3.1-6.el7_2.1 for package: git-1.8.3.1-6.el7_2.1.x86_64\n--> Processing Dependency: rsync for package: git-1.8.3.1-6.el7_2.1.x86_64\n--> Processing Dependency: perl(Term::ReadKey) for package: git-1.8.3.1-6.el7_2.1.x86_64\n--> Processing Dependency: perl(Git) for package: git-1.8.3.1-6.el7_2.1.x86_64\n--> Processing Dependency: perl(File::Temp) for package: git-1.8.3.1-6.el7_2.1.x86_64\n--> Processing Dependency: perl(Error) for package: git-1.8.3.1-6.el7_2.1.x86_64\n--> Processing Dependency: libgnome-keyring.so.0()(64bit) for package: git-1.8.3.1-6.el7_2.1.x86_64\n---> Package indent.x86_64 0:2.2.11-13.el7 will be installed\n---> Package intltool.noarch 0:0.50.2-6.el7 will be installed\n--> Processing Dependency: perl(XML::Parser) for package: intltool-0.50.2-6.el7.noarch\n--> Processing Dependency: perl(Encode) for package: intltool-0.50.2-6.el7.noarch\n--> Processing Dependency: gettext-devel for package: intltool-0.50.2-6.el7.noarch\n---> Package libtool.x86_64 0:2.4.2-21.el7_2 will be installed\n---> Package patch.x86_64 0:2.7.1-8.el7 will be installed\n---> Package patchutils.x86_64 0:0.3.3-4.el7 will be installed\n---> Package python-setuptools.noarch 0:0.9.8-4.el7 will be installed\n--> Processing Dependency: python-backports-ssl_match_hostname for package: python-setuptools-0.9.8-4.el7.noarch\n---> Package rcs.x86_64 0:5.9.0-5.el7 will be installed\n---> Package redhat-lsb-core.x86_64 0:4.1-27.el7.centos.1 will be installed\n--> Processing Dependency: redhat-lsb-submod-security(x86-64) = 4.1-27.el7.centos.1 for package: redhat-lsb-core-4.1-27.el7.centos.1.x86_64\n--> Processing Dependency: spax for package: redhat-lsb-core-4.1-27.el7.centos.1.x86_64\n--> Processing Dependency: /usr/sbin/fuser for package: redhat-lsb-core-4.1-27.el7.centos.1.x86_64\n--> Processing Dependency: /usr/bin/time for package: redhat-lsb-core-4.1-27.el7.centos.1.x86_64\n--> Processing Dependency: /usr/bin/lpr for package: redhat-lsb-core-4.1-27.el7.centos.1.x86_64\n--> Processing Dependency: /usr/bin/lp for package: redhat-lsb-core-4.1-27.el7.centos.1.x86_64\n--> Processing Dependency: /usr/bin/killall for package: redhat-lsb-core-4.1-27.el7.centos.1.x86_64\n--> Processing Dependency: /usr/bin/bc for package: redhat-lsb-core-4.1-27.el7.centos.1.x86_64\n--> Processing Dependency: /usr/bin/batch for package: redhat-lsb-core-4.1-27.el7.centos.1.x86_64\n--> Processing Dependency: /usr/bin/at for package: redhat-lsb-core-4.1-27.el7.centos.1.x86_64\n--> Processing Dependency: /bin/mailx for package: redhat-lsb-core-4.1-27.el7.centos.1.x86_64\n--> Processing Dependency: /bin/ed for package: redhat-lsb-core-4.1-27.el7.centos.1.x86_64\n---> Package redhat-rpm-config.noarch 0:9.1.0-68.el7.centos will be installed\n--> Processing Dependency: dwz >= 0.4 for package: redhat-rpm-config-9.1.0-68.el7.centos.noarch\n--> Processing Dependency: zip for package: redhat-rpm-config-9.1.0-68.el7.centos.noarch\n--> Processing Dependency: perl-srpm-macros for package: redhat-rpm-config-9.1.0-68.el7.centos.noarch\n---> Package rpm-build.x86_64 0:4.11.3-17.el7 
will be installed\n--> Processing Dependency: unzip for package: rpm-build-4.11.3-17.el7.x86_64\n--> Processing Dependency: bzip2 for package: rpm-build-4.11.3-17.el7.x86_64\n--> Processing Dependency: /usr/bin/gdb-add-index for package: rpm-build-4.11.3-17.el7.x86_64\n---> Package rpm-sign.x86_64 0:4.11.3-17.el7 will be installed\n---> Package subversion.x86_64 0:1.7.14-10.el7 will be installed\n--> Processing Dependency: subversion-libs(x86-64) = 1.7.14-10.el7 for package: subversion-1.7.14-10.el7.x86_64\n--> Processing Dependency: libsvn_wc-1.so.0()(64bit) for package: subversion-1.7.14-10.el7.x86_64\n--> Processing Dependency: libsvn_subr-1.so.0()(64bit) for package: subversion-1.7.14-10.el7.x86_64\n--> Processing Dependency: libsvn_repos-1.so.0()(64bit) for package: subversion-1.7.14-10.el7.x86_64\n--> Processing Dependency: libsvn_ra_svn-1.so.0()(64bit) for package: subversion-1.7.14-10.el7.x86_64\n--> Processing Dependency: libsvn_ra_neon-1.so.0()(64bit) for package: subversion-1.7.14-10.el7.x86_64\n--> Processing Dependency: libsvn_ra_local-1.so.0()(64bit) for package: subversion-1.7.14-10.el7.x86_64\n--> Processing Dependency: libsvn_ra-1.so.0()(64bit) for package: subversion-1.7.14-10.el7.x86_64\n--> Processing Dependency: libsvn_fs_util-1.so.0()(64bit) for package: subversion-1.7.14-10.el7.x86_64\n--> Processing Dependency: libsvn_fs_fs-1.so.0()(64bit) for package: subversion-1.7.14-10.el7.x86_64\n--> Processing Dependency: libsvn_fs_base-1.so.0()(64bit) for package: subversion-1.7.14-10.el7.x86_64\n--> Processing Dependency: libsvn_fs-1.so.0()(64bit) for package: subversion-1.7.14-10.el7.x86_64\n--> Processing Dependency: libsvn_diff-1.so.0()(64bit) for package: subversion-1.7.14-10.el7.x86_64\n--> Processing Dependency: libsvn_delta-1.so.0()(64bit) for package: subversion-1.7.14-10.el7.x86_64\n--> Processing Dependency: libsvn_client-1.so.0()(64bit) for package: subversion-1.7.14-10.el7.x86_64\n--> Processing Dependency: libneon.so.27()(64bit) for package: subversion-1.7.14-10.el7.x86_64\n--> Processing Dependency: libaprutil-1.so.0()(64bit) for package: subversion-1.7.14-10.el7.x86_64\n--> Processing Dependency: libapr-1.so.0()(64bit) for package: subversion-1.7.14-10.el7.x86_64\n---> Package swig.x86_64 0:2.0.10-4.el7 will be installed\n---> Package systemtap.x86_64 0:2.8-10.el7 will be installed\n--> Processing Dependency: systemtap-devel = 2.8-10.el7 for package: systemtap-2.8-10.el7.x86_64\n--> Processing Dependency: systemtap-client = 2.8-10.el7 for package: systemtap-2.8-10.el7.x86_64\n---> Package wget.x86_64 0:1.14-10.el7_0.1 will be installed\n---> Package yum-plugin-priorities.noarch 0:1.1.31-34.el7 will be installed\n--> Running transaction check\n---> Package apr.x86_64 0:1.4.8-3.el7 will be installed\n---> Package apr-util.x86_64 0:1.5.2-6.el7 will be installed\n---> Package at.x86_64 0:3.1.13-20.el7 will be installed\n---> Package bc.x86_64 0:1.06.95-13.el7 will be installed\n---> Package bzip2.x86_64 0:1.0.6-13.el7 will be installed\n---> Package cpp.x86_64 0:4.8.5-4.el7 will be installed\n---> Package cups-client.x86_64 1:1.6.3-22.el7 will be installed\n--> Processing Dependency: cups-libs(x86-64) = 1:1.6.3-22.el7 for package: 1:cups-client-1.6.3-22.el7.x86_64\n--> Processing Dependency: libcups.so.2()(64bit) for package: 1:cups-client-1.6.3-22.el7.x86_64\n---> Package dwz.x86_64 0:0.11-3.el7 will be installed\n---> Package ed.x86_64 0:1.9-4.el7 will be installed\n---> Package emacs-filesystem.noarch 1:24.3-18.el7 will be installed\n---> Package gdb.x86_64 
0:7.6.1-80.el7 will be installed\n---> Package gettext-devel.x86_64 0:0.18.2.1-4.el7 will be installed\n--> Processing Dependency: gettext-common-devel = 0.18.2.1-4.el7 for package: gettext-devel-0.18.2.1-4.el7.x86_64\n---> Package glibc-devel.x86_64 0:2.17-106.el7_2.6 will be installed\n--> Processing Dependency: glibc-headers = 2.17-106.el7_2.6 for package: glibc-devel-2.17-106.el7_2.6.x86_64\n--> Processing Dependency: glibc-headers for package: glibc-devel-2.17-106.el7_2.6.x86_64\n---> Package libgfortran.x86_64 0:4.8.5-4.el7 will be installed\n---> Package libgnome-keyring.x86_64 0:3.8.0-3.el7 will be installed\n---> Package libmpc.x86_64 0:1.0.1-3.el7 will be installed\n---> Package libquadmath.x86_64 0:4.8.5-4.el7 will be installed\n---> Package libquadmath-devel.x86_64 0:4.8.5-4.el7 will be installed\n---> Package libstdc++-devel.x86_64 0:4.8.5-4.el7 will be installed\n---> Package m4.x86_64 0:1.4.16-10.el7 will be installed\n---> Package mailx.x86_64 0:12.5-12.el7_0 will be installed\n---> Package mpfr.x86_64 0:3.1.1-4.el7 will be installed\n---> Package neon.x86_64 0:0.30.0-3.el7 will be installed\n--> Processing Dependency: libpakchois.so.0()(64bit) for package: neon-0.30.0-3.el7.x86_64\n---> Package perl.x86_64 4:5.16.3-286.el7 will be installed\n--> Processing Dependency: perl-libs = 4:5.16.3-286.el7 for package: 4:perl-5.16.3-286.el7.x86_64\n--> Processing Dependency: perl(Socket) >= 1.3 for package: 4:perl-5.16.3-286.el7.x86_64\n--> Processing Dependency: perl(Scalar::Util) >= 1.10 for package: 4:perl-5.16.3-286.el7.x86_64\n--> Processing Dependency: perl-macros for package: 4:perl-5.16.3-286.el7.x86_64\n--> Processing Dependency: perl-libs for package: 4:perl-5.16.3-286.el7.x86_64\n--> Processing Dependency: perl(threads::shared) for package: 4:perl-5.16.3-286.el7.x86_64\n--> Processing Dependency: perl(Time::Local) for package: 4:perl-5.16.3-286.el7.x86_64\n--> Processing Dependency: perl(Time::HiRes) for package: 4:perl-5.16.3-286.el7.x86_64\n--> Processing Dependency: perl(Storable) for package: 4:perl-5.16.3-286.el7.x86_64\n--> Processing Dependency: perl(Socket) for package: 4:perl-5.16.3-286.el7.x86_64\n--> Processing Dependency: perl(Scalar::Util) for package: 4:perl-5.16.3-286.el7.x86_64\n--> Processing Dependency: perl(Pod::Simple::XHTML) for package: 4:perl-5.16.3-286.el7.x86_64\n--> Processing Dependency: perl(Pod::Simple::Search) for package: 4:perl-5.16.3-286.el7.x86_64\n--> Processing Dependency: perl(Filter::Util::Call) for package: 4:perl-5.16.3-286.el7.x86_64\n--> Processing Dependency: libperl.so()(64bit) for package: 4:perl-5.16.3-286.el7.x86_64\n---> Package perl-Carp.noarch 0:1.26-244.el7 will be installed\n---> Package perl-Data-Dumper.x86_64 0:2.145-3.el7 will be installed\n---> Package perl-Encode.x86_64 0:2.51-7.el7 will be installed\n---> Package perl-Error.noarch 1:0.17020-2.el7 will be installed\n---> Package perl-Exporter.noarch 0:5.68-3.el7 will be installed\n---> Package perl-File-Path.noarch 0:2.09-2.el7 will be installed\n---> Package perl-File-Temp.noarch 0:0.23.01-3.el7 will be installed\n---> Package perl-Getopt-Long.noarch 0:2.40-2.el7 will be installed\n--> Processing Dependency: perl(Pod::Usage) >= 1.14 for package: perl-Getopt-Long-2.40-2.el7.noarch\n---> Package perl-Git.noarch 0:1.8.3.1-6.el7_2.1 will be installed\n---> Package perl-PathTools.x86_64 0:3.40-5.el7 will be installed\n---> Package perl-TermReadKey.x86_64 0:2.30-20.el7 will be installed\n---> Package perl-Test-Harness.noarch 0:3.28-3.el7 will be installed\n---> Package 
perl-Text-ParseWords.noarch 0:3.29-4.el7 will be installed\n---> Package perl-Thread-Queue.noarch 0:3.02-2.el7 will be installed\n---> Package perl-XML-Parser.x86_64 0:2.41-10.el7 will be installed\n---> Package perl-constant.noarch 0:1.27-2.el7 will be installed\n---> Package perl-srpm-macros.noarch 0:1-8.el7 will be installed\n---> Package perl-threads.x86_64 0:1.87-4.el7 will be installed\n---> Package psmisc.x86_64 0:22.20-9.el7 will be installed\n---> Package python-backports-ssl_match_hostname.noarch 0:3.4.0.2-4.el7 will be installed\n--> Processing Dependency: python-backports for package: python-backports-ssl_match_hostname-3.4.0.2-4.el7.noarch\n---> Package redhat-lsb-submod-security.x86_64 0:4.1-27.el7.centos.1 will be installed\n---> Package rsync.x86_64 0:3.0.9-17.el7 will be installed\n---> Package spax.x86_64 0:1.5.2-13.el7 will be installed\n---> Package subversion-libs.x86_64 0:1.7.14-10.el7 will be installed\n---> Package systemtap-client.x86_64 0:2.8-10.el7 will be installed\n--> Processing Dependency: systemtap-runtime = 2.8-10.el7 for package: systemtap-client-2.8-10.el7.x86_64\n--> Processing Dependency: mokutil for package: systemtap-client-2.8-10.el7.x86_64\n---> Package systemtap-devel.x86_64 0:2.8-10.el7 will be installed\n--> Processing Dependency: kernel-devel for package: systemtap-devel-2.8-10.el7.x86_64\n---> Package time.x86_64 0:1.7-45.el7 will be installed\n---> Package unzip.x86_64 0:6.0-15.el7 will be installed\n---> Package zip.x86_64 0:3.0-10.el7 will be installed\n--> Running transaction check\n---> Package cups-libs.x86_64 1:1.6.3-22.el7 will be installed\n---> Package gettext-common-devel.noarch 0:0.18.2.1-4.el7 will be installed\n---> Package glibc-headers.x86_64 0:2.17-106.el7_2.6 will be installed\n--> Processing Dependency: kernel-headers >= 2.2.1 for package: glibc-headers-2.17-106.el7_2.6.x86_64\n--> Processing Dependency: kernel-headers for package: glibc-headers-2.17-106.el7_2.6.x86_64\n---> Package kernel-devel.x86_64 0:3.10.0-327.18.2.el7 will be installed\n---> Package mokutil.x86_64 0:0.9-2.el7 will be installed\n---> Package pakchois.x86_64 0:0.4-10.el7 will be installed\n---> Package perl-Filter.x86_64 0:1.49-3.el7 will be installed\n---> Package perl-Pod-Simple.noarch 1:3.28-4.el7 will be installed\n--> Processing Dependency: perl(Pod::Escapes) >= 1.04 for package: 1:perl-Pod-Simple-3.28-4.el7.noarch\n---> Package perl-Pod-Usage.noarch 0:1.63-3.el7 will be installed\n--> Processing Dependency: perl(Pod::Text) >= 3.15 for package: perl-Pod-Usage-1.63-3.el7.noarch\n--> Processing Dependency: perl-Pod-Perldoc for package: perl-Pod-Usage-1.63-3.el7.noarch\n---> Package perl-Scalar-List-Utils.x86_64 0:1.27-248.el7 will be installed\n---> Package perl-Socket.x86_64 0:2.010-3.el7 will be installed\n---> Package perl-Storable.x86_64 0:2.45-3.el7 will be installed\n---> Package perl-Time-HiRes.x86_64 4:1.9725-3.el7 will be installed\n---> Package perl-Time-Local.noarch 0:1.2300-2.el7 will be installed\n---> Package perl-libs.x86_64 4:5.16.3-286.el7 will be installed\n---> Package perl-macros.x86_64 4:5.16.3-286.el7 will be installed\n---> Package perl-threads-shared.x86_64 0:1.43-6.el7 will be installed\n---> Package python-backports.x86_64 0:1.0-8.el7 will be installed\n---> Package systemtap-runtime.x86_64 0:2.8-10.el7 will be installed\n--> Processing Dependency: libsymtabAPI.so.8.2()(64bit) for package: systemtap-runtime-2.8-10.el7.x86_64\n--> Processing Dependency: libdyninstAPI.so.8.2()(64bit) for package: 
systemtap-runtime-2.8-10.el7.x86_64\n--> Running transaction check\n---> Package dyninst.x86_64 0:8.2.0-2.el7 will be installed\n--> Processing Dependency: libdwarf.so.0()(64bit) for package: dyninst-8.2.0-2.el7.x86_64\n--> Processing Dependency: libboost_thread-mt.so.1.53.0()(64bit) for package: dyninst-8.2.0-2.el7.x86_64\n--> Processing Dependency: libboost_system-mt.so.1.53.0()(64bit) for package: dyninst-8.2.0-2.el7.x86_64\n---> Package kernel-headers.x86_64 0:3.10.0-327.18.2.el7 will be installed\n---> Package perl-Pod-Escapes.noarch 1:1.04-286.el7 will be installed\n---> Package perl-Pod-Perldoc.noarch 0:3.20-4.el7 will be installed\n--> Processing Dependency: perl(parent) for package: perl-Pod-Perldoc-3.20-4.el7.noarch\n--> Processing Dependency: perl(HTTP::Tiny) for package: perl-Pod-Perldoc-3.20-4.el7.noarch\n---> Package perl-podlators.noarch 0:2.5.1-3.el7 will be installed\n--> Running transaction check\n---> Package boost-system.x86_64 0:1.53.0-25.el7 will be installed\n---> Package boost-thread.x86_64 0:1.53.0-25.el7 will be installed\n---> Package libdwarf.x86_64 0:20130207-4.el7 will be installed\n---> Package perl-HTTP-Tiny.noarch 0:0.033-3.el7 will be installed\n---> Package perl-parent.noarch 1:0.225-244.el7 will be installed\n--> Finished Dependency Resolution\n\nDependencies Resolved\n\n================================================================================\n Package Arch Version Repository\n Size\n================================================================================\nInstalling for group install \"Development Tools\":\n autoconf noarch 2.69-11.el7 base 701 k\n automake noarch 1.13.4-3.el7 base 679 k\n bison x86_64 2.7-4.el7 base 578 k\n byacc x86_64 1.9.20130304-3.el7 base 65 k\n cscope x86_64 15.8-7.el7 base 203 k\n ctags x86_64 5.8-13.el7 base 155 k\n diffstat x86_64 1.57-4.el7 base 35 k\n doxygen x86_64 1:1.8.5-3.el7 base 3.6 M\n elfutils x86_64 0.163-3.el7 base 268 k\n flex x86_64 2.5.37-3.el7 base 292 k\n gcc x86_64 4.8.5-4.el7 base 16 M\n gcc-c++ x86_64 4.8.5-4.el7 base 7.2 M\n gcc-gfortran x86_64 4.8.5-4.el7 base 6.6 M\n git x86_64 1.8.3.1-6.el7_2.1 updates 4.4 M\n indent x86_64 2.2.11-13.el7 base 150 k\n intltool noarch 0.50.2-6.el7 base 59 k\n libtool x86_64 2.4.2-21.el7_2 updates 588 k\n patch x86_64 2.7.1-8.el7 base 110 k\n patchutils x86_64 0.3.3-4.el7 base 104 k\n rcs x86_64 5.9.0-5.el7 base 230 k\n redhat-rpm-config noarch 9.1.0-68.el7.centos base 77 k\n rpm-build x86_64 4.11.3-17.el7 base 143 k\n rpm-sign x86_64 4.11.3-17.el7 base 44 k\n subversion x86_64 1.7.14-10.el7 base 1.0 M\n swig x86_64 2.0.10-4.el7 base 1.3 M\n systemtap x86_64 2.8-10.el7 base 25 k\nInstalling:\n deltarpm x86_64 3.6-3.el7 base 82 k\n python-setuptools noarch 0.9.8-4.el7 base 396 k\n redhat-lsb-core x86_64 4.1-27.el7.centos.1 base 38 k\n wget x86_64 1.14-10.el7_0.1 base 545 k\n yum-plugin-priorities noarch 1.1.31-34.el7 base 25 k\nInstalling for dependencies:\n apr x86_64 1.4.8-3.el7 base 103 k\n apr-util x86_64 1.5.2-6.el7 base 92 k\n at x86_64 3.1.13-20.el7 base 50 k\n bc x86_64 1.06.95-13.el7 base 115 k\n boost-system x86_64 1.53.0-25.el7 base 39 k\n boost-thread x86_64 1.53.0-25.el7 base 57 k\n bzip2 x86_64 1.0.6-13.el7 base 52 k\n cpp x86_64 4.8.5-4.el7 base 5.9 M\n cups-client x86_64 1:1.6.3-22.el7 base 148 k\n cups-libs x86_64 1:1.6.3-22.el7 base 355 k\n dwz x86_64 0.11-3.el7 base 99 k\n dyninst x86_64 8.2.0-2.el7 base 2.5 M\n ed x86_64 1.9-4.el7 base 72 k\n emacs-filesystem noarch 1:24.3-18.el7 base 58 k\n gdb x86_64 7.6.1-80.el7 base 2.4 M\n 
gettext-common-devel noarch 0.18.2.1-4.el7 base 368 k\n gettext-devel x86_64 0.18.2.1-4.el7 base 315 k\n glibc-devel x86_64 2.17-106.el7_2.6 updates 1.0 M\n glibc-headers x86_64 2.17-106.el7_2.6 updates 662 k\n kernel-devel x86_64 3.10.0-327.18.2.el7 updates 11 M\n kernel-headers x86_64 3.10.0-327.18.2.el7 updates 3.2 M\n libdwarf x86_64 20130207-4.el7 base 109 k\n libgfortran x86_64 4.8.5-4.el7 base 293 k\n libgnome-keyring x86_64 3.8.0-3.el7 base 109 k\n libmpc x86_64 1.0.1-3.el7 base 51 k\n libquadmath x86_64 4.8.5-4.el7 base 182 k\n libquadmath-devel x86_64 4.8.5-4.el7 base 46 k\n libstdc++-devel x86_64 4.8.5-4.el7 base 1.5 M\n m4 x86_64 1.4.16-10.el7 base 256 k\n mailx x86_64 12.5-12.el7_0 base 244 k\n mokutil x86_64 0.9-2.el7 base 37 k\n mpfr x86_64 3.1.1-4.el7 base 203 k\n neon x86_64 0.30.0-3.el7 base 165 k\n pakchois x86_64 0.4-10.el7 base 14 k\n perl x86_64 4:5.16.3-286.el7 base 8.0 M\n perl-Carp noarch 1.26-244.el7 base 19 k\n perl-Data-Dumper x86_64 2.145-3.el7 base 47 k\n perl-Encode x86_64 2.51-7.el7 base 1.5 M\n perl-Error noarch 1:0.17020-2.el7 base 32 k\n perl-Exporter noarch 5.68-3.el7 base 28 k\n perl-File-Path noarch 2.09-2.el7 base 26 k\n perl-File-Temp noarch 0.23.01-3.el7 base 56 k\n perl-Filter x86_64 1.49-3.el7 base 76 k\n perl-Getopt-Long noarch 2.40-2.el7 base 56 k\n perl-Git noarch 1.8.3.1-6.el7_2.1 updates 53 k\n perl-HTTP-Tiny noarch 0.033-3.el7 base 38 k\n perl-PathTools x86_64 3.40-5.el7 base 82 k\n perl-Pod-Escapes noarch 1:1.04-286.el7 base 50 k\n perl-Pod-Perldoc noarch 3.20-4.el7 base 87 k\n perl-Pod-Simple noarch 1:3.28-4.el7 base 216 k\n perl-Pod-Usage noarch 1.63-3.el7 base 27 k\n perl-Scalar-List-Utils x86_64 1.27-248.el7 base 36 k\n perl-Socket x86_64 2.010-3.el7 base 49 k\n perl-Storable x86_64 2.45-3.el7 base 77 k\n perl-TermReadKey x86_64 2.30-20.el7 base 31 k\n perl-Test-Harness noarch 3.28-3.el7 base 302 k\n perl-Text-ParseWords noarch 3.29-4.el7 base 14 k\n perl-Thread-Queue noarch 3.02-2.el7 base 17 k\n perl-Time-HiRes x86_64 4:1.9725-3.el7 base 45 k\n perl-Time-Local noarch 1.2300-2.el7 base 24 k\n perl-XML-Parser x86_64 2.41-10.el7 base 223 k\n perl-constant noarch 1.27-2.el7 base 19 k\n perl-libs x86_64 4:5.16.3-286.el7 base 687 k\n perl-macros x86_64 4:5.16.3-286.el7 base 43 k\n perl-parent noarch 1:0.225-244.el7 base 12 k\n perl-podlators noarch 2.5.1-3.el7 base 112 k\n perl-srpm-macros noarch 1-8.el7 base 4.6 k\n perl-threads x86_64 1.87-4.el7 base 49 k\n perl-threads-shared x86_64 1.43-6.el7 base 39 k\n psmisc x86_64 22.20-9.el7 base 140 k\n python-backports x86_64 1.0-8.el7 base 5.8 k\n python-backports-ssl_match_hostname noarch 3.4.0.2-4.el7 base 12 k\n redhat-lsb-submod-security x86_64 4.1-27.el7.centos.1 base 15 k\n rsync x86_64 3.0.9-17.el7 base 360 k\n spax x86_64 1.5.2-13.el7 base 260 k\n subversion-libs x86_64 1.7.14-10.el7 base 921 k\n systemtap-client x86_64 2.8-10.el7 base 2.9 M\n systemtap-devel x86_64 2.8-10.el7 base 1.6 M\n systemtap-runtime x86_64 2.8-10.el7 base 270 k\n time x86_64 1.7-45.el7 base 30 k\n unzip x86_64 6.0-15.el7 base 166 k\n zip x86_64 3.0-10.el7 base 260 k\n\nTransaction Summary\n================================================================================\nInstall 31 Packages (+82 Dependent packages)\n\nTotal download size: 96 M\nInstalled size: 277 M\nDownloading packages:\nDelta RPMs disabled because /usr/bin/applydeltarpm not installed.\nPublic key for apr-1.4.8-3.el7.x86_64.rpm is not installed\nPublic key for git-1.8.3.1-6.el7_2.1.x86_64.rpm is not 
installed\n--------------------------------------------------------------------------------\nTotal 94 MB/s | 96 MB 00:01 \nRetrieving key from file:///etc/pki/rpm-gpg/RPM-GPG-KEY-CentOS-7\nRunning transaction check\nRunning transaction test\nTransaction test succeeded\nRunning transaction\n Installing : mpfr-3.1.1-4.el7.x86_64 1/113 \n Installing : libmpc-1.0.1-3.el7.x86_64 2/113 \n Installing : m4-1.4.16-10.el7.x86_64 3/113 \n Installing : libquadmath-4.8.5-4.el7.x86_64 4/113 \n Installing : apr-1.4.8-3.el7.x86_64 5/113 \n Installing : patch-2.7.1-8.el7.x86_64 6/113 \n Installing : apr-util-1.5.2-6.el7.x86_64 7/113 \n Installing : unzip-6.0-15.el7.x86_64 8/113 \n Installing : zip-3.0-10.el7.x86_64 9/113 \n Installing : boost-system-1.53.0-25.el7.x86_64 10/113 \n Installing : boost-thread-1.53.0-25.el7.x86_64 11/113 \n Installing : libgfortran-4.8.5-4.el7.x86_64 12/113 \n Installing : cpp-4.8.5-4.el7.x86_64 13/113 \n Installing : 1:perl-parent-0.225-244.el7.noarch 14/113 \n Installing : perl-HTTP-Tiny-0.033-3.el7.noarch 15/113 \n Installing : perl-podlators-2.5.1-3.el7.noarch 16/113 \n Installing : perl-Pod-Perldoc-3.20-4.el7.noarch 17/113 \n Installing : 1:perl-Pod-Escapes-1.04-286.el7.noarch 18/113 \n Installing : perl-Text-ParseWords-3.29-4.el7.noarch 19/113 \n Installing : perl-Encode-2.51-7.el7.x86_64 20/113 \n Installing : perl-Pod-Usage-1.63-3.el7.noarch 21/113 \n Installing : 4:perl-libs-5.16.3-286.el7.x86_64 22/113 \n Installing : 4:perl-macros-5.16.3-286.el7.x86_64 23/113 \n Installing : perl-threads-1.87-4.el7.x86_64 24/113 \n Installing : perl-Socket-2.010-3.el7.x86_64 25/113 \n Installing : 4:perl-Time-HiRes-1.9725-3.el7.x86_64 26/113 \n Installing : perl-threads-shared-1.43-6.el7.x86_64 27/113 \n Installing : perl-Scalar-List-Utils-1.27-248.el7.x86_64 28/113 \n Installing : perl-Storable-2.45-3.el7.x86_64 29/113 \n Installing : perl-Filter-1.49-3.el7.x86_64 30/113 \n Installing : perl-Exporter-5.68-3.el7.noarch 31/113 \n Installing : perl-constant-1.27-2.el7.noarch 32/113 \n Installing : perl-File-Temp-0.23.01-3.el7.noarch 33/113 \n Installing : perl-File-Path-2.09-2.el7.noarch 34/113 \n Installing : perl-PathTools-3.40-5.el7.x86_64 35/113 \n Installing : perl-Carp-1.26-244.el7.noarch 36/113 \n Installing : perl-Time-Local-1.2300-2.el7.noarch 37/113 \n Installing : 1:perl-Pod-Simple-3.28-4.el7.noarch 38/113 \n Installing : perl-Getopt-Long-2.40-2.el7.noarch 39/113 \n Installing : 4:perl-5.16.3-286.el7.x86_64 40/113 \n Installing : perl-Thread-Queue-3.02-2.el7.noarch 41/113 \n Installing : 1:perl-Error-0.17020-2.el7.noarch 42/113 \n Installing : perl-TermReadKey-2.30-20.el7.x86_64 43/113 \n Installing : perl-Data-Dumper-2.145-3.el7.x86_64 44/113 \n Installing : autoconf-2.69-11.el7.noarch 45/113 \n Installing : perl-XML-Parser-2.41-10.el7.x86_64 46/113 \n Installing : kernel-devel-3.10.0-327.18.2.el7.x86_64 47/113 \n Installing : perl-Test-Harness-3.28-3.el7.noarch 48/113 \n Installing : automake-1.13.4-3.el7.noarch 49/113 \n Installing : spax-1.5.2-13.el7.x86_64 50/113 \n Installing : gdb-7.6.1-80.el7.x86_64 51/113 \n Installing : python-backports-1.0-8.el7.x86_64 52/113 \n Installing : python-backports-ssl_match_hostname-3.4.0.2-4.el7.noar 53/113 \n Installing : 1:emacs-filesystem-24.3-18.el7.noarch 54/113 \n Installing : kernel-headers-3.10.0-327.18.2.el7.x86_64 55/113 \n Installing : glibc-headers-2.17-106.el7_2.6.x86_64 56/113 \n Installing : glibc-devel-2.17-106.el7_2.6.x86_64 57/113 \n Installing : gcc-4.8.5-4.el7.x86_64 58/113 \n Installing : 
systemtap-devel-2.8-10.el7.x86_64 59/113 \n Installing : libquadmath-devel-4.8.5-4.el7.x86_64 60/113 \n Installing : elfutils-0.163-3.el7.x86_64 61/113 \n Installing : rsync-3.0.9-17.el7.x86_64 62/113 \n Installing : pakchois-0.4-10.el7.x86_64 63/113 \n Installing : neon-0.30.0-3.el7.x86_64 64/113 \n Installing : subversion-libs-1.7.14-10.el7.x86_64 65/113 \n Installing : dwz-0.11-3.el7.x86_64 66/113 \n Installing : time-1.7-45.el7.x86_64 67/113 \n Installing : libstdc++-devel-4.8.5-4.el7.x86_64 68/113 \n Installing : libgnome-keyring-3.8.0-3.el7.x86_64 69/113 \n Installing : git-1.8.3.1-6.el7_2.1.x86_64 70/113 \n Installing : perl-Git-1.8.3.1-6.el7_2.1.noarch 71/113 \n Installing : gettext-common-devel-0.18.2.1-4.el7.noarch 72/113 \n Installing : gettext-devel-0.18.2.1-4.el7.x86_64 73/113 \n Installing : perl-srpm-macros-1-8.el7.noarch 74/113 \n Installing : redhat-rpm-config-9.1.0-68.el7.centos.noarch 75/113 \n Installing : bc-1.06.95-13.el7.x86_64 76/113 \n Installing : mokutil-0.9-2.el7.x86_64 77/113 \n Installing : mailx-12.5-12.el7_0.x86_64 78/113 \n Installing : ed-1.9-4.el7.x86_64 79/113 \n Installing : psmisc-22.20-9.el7.x86_64 80/113 \n Installing : at-3.1.13-20.el7.x86_64 81/113 \n Installing : bzip2-1.0.6-13.el7.x86_64 82/113 \n Installing : redhat-lsb-submod-security-4.1-27.el7.centos.1.x86_64 83/113 \n Installing : 1:cups-libs-1.6.3-22.el7.x86_64 84/113 \n Installing : 1:cups-client-1.6.3-22.el7.x86_64 85/113 \n Installing : libdwarf-20130207-4.el7.x86_64 86/113 \n Installing : dyninst-8.2.0-2.el7.x86_64 87/113 \n Installing : systemtap-runtime-2.8-10.el7.x86_64 88/113 \n Installing : systemtap-client-2.8-10.el7.x86_64 89/113 \n Installing : systemtap-2.8-10.el7.x86_64 90/113 \n Installing : redhat-lsb-core-4.1-27.el7.centos.1.x86_64 91/113 \n Installing : rpm-build-4.11.3-17.el7.x86_64 92/113 \n Installing : intltool-0.50.2-6.el7.noarch 93/113 \n Installing : gcc-c++-4.8.5-4.el7.x86_64 94/113 \n Installing : subversion-1.7.14-10.el7.x86_64 95/113 \n Installing : gcc-gfortran-4.8.5-4.el7.x86_64 96/113 \n Installing : libtool-2.4.2-21.el7_2.x86_64 97/113 \n Installing : cscope-15.8-7.el7.x86_64 98/113 \n Installing : python-setuptools-0.9.8-4.el7.noarch 99/113 \n Installing : patchutils-0.3.3-4.el7.x86_64 100/113 \n Installing : bison-2.7-4.el7.x86_64 101/113 \n Installing : flex-2.5.37-3.el7.x86_64 102/113 \n Installing : rpm-sign-4.11.3-17.el7.x86_64 103/113 \n Installing : indent-2.2.11-13.el7.x86_64 104/113 \n Installing : ctags-5.8-13.el7.x86_64 105/113 \n Installing : byacc-1.9.20130304-3.el7.x86_64 106/113 \n Installing : wget-1.14-10.el7_0.1.x86_64 107/113 \n Installing : rcs-5.9.0-5.el7.x86_64 108/113 \n Installing : swig-2.0.10-4.el7.x86_64 109/113 \n Installing : diffstat-1.57-4.el7.x86_64 110/113 \n Installing : deltarpm-3.6-3.el7.x86_64 111/113 \n Installing : 1:doxygen-1.8.5-3.el7.x86_64 112/113 \n Installing : yum-plugin-priorities-1.1.31-34.el7.noarch 113/113 \n Verifying : yum-plugin-priorities-1.1.31-34.el7.noarch 1/113 \n Verifying : perl-HTTP-Tiny-0.033-3.el7.noarch 2/113 \n Verifying : libdwarf-20130207-4.el7.x86_64 3/113 \n Verifying : 1:doxygen-1.8.5-3.el7.x86_64 4/113 \n Verifying : python-backports-ssl_match_hostname-3.4.0.2-4.el7.noar 5/113 \n Verifying : perl-TermReadKey-2.30-20.el7.x86_64 6/113 \n Verifying : glibc-devel-2.17-106.el7_2.6.x86_64 7/113 \n Verifying : perl-File-Temp-0.23.01-3.el7.noarch 8/113 \n Verifying : cscope-15.8-7.el7.x86_64 9/113 \n Verifying : patch-2.7.1-8.el7.x86_64 10/113 \n Verifying : glibc-headers-2.17-106.el7_2.6.x86_64 
11/113 \n Verifying : perl-Data-Dumper-2.145-3.el7.x86_64 12/113 \n Verifying : 1:cups-libs-1.6.3-22.el7.x86_64 13/113 \n Verifying : apr-util-1.5.2-6.el7.x86_64 14/113 \n Verifying : deltarpm-3.6-3.el7.x86_64 15/113 \n Verifying : systemtap-2.8-10.el7.x86_64 16/113 \n Verifying : 1:perl-Pod-Escapes-1.04-286.el7.noarch 17/113 \n Verifying : patchutils-0.3.3-4.el7.x86_64 18/113 \n Verifying : neon-0.30.0-3.el7.x86_64 19/113 \n Verifying : subversion-libs-1.7.14-10.el7.x86_64 20/113 \n Verifying : intltool-0.50.2-6.el7.noarch 21/113 \n Verifying : perl-File-Path-2.09-2.el7.noarch 22/113 \n Verifying : autoconf-2.69-11.el7.noarch 23/113 \n Verifying : perl-Socket-2.010-3.el7.x86_64 24/113 \n Verifying : perl-Text-ParseWords-3.29-4.el7.noarch 25/113 \n Verifying : diffstat-1.57-4.el7.x86_64 26/113 \n Verifying : git-1.8.3.1-6.el7_2.1.x86_64 27/113 \n Verifying : swig-2.0.10-4.el7.x86_64 28/113 \n Verifying : systemtap-runtime-2.8-10.el7.x86_64 29/113 \n Verifying : redhat-lsb-submod-security-4.1-27.el7.centos.1.x86_64 30/113 \n Verifying : bison-2.7-4.el7.x86_64 31/113 \n Verifying : 4:perl-Time-HiRes-1.9725-3.el7.x86_64 32/113 \n Verifying : gcc-c++-4.8.5-4.el7.x86_64 33/113 \n Verifying : perl-XML-Parser-2.41-10.el7.x86_64 34/113 \n Verifying : python-setuptools-0.9.8-4.el7.noarch 35/113 \n Verifying : bzip2-1.0.6-13.el7.x86_64 36/113 \n Verifying : systemtap-devel-2.8-10.el7.x86_64 37/113 \n Verifying : rcs-5.9.0-5.el7.x86_64 38/113 \n Verifying : at-3.1.13-20.el7.x86_64 39/113 \n Verifying : libgfortran-4.8.5-4.el7.x86_64 40/113 \n Verifying : libmpc-1.0.1-3.el7.x86_64 41/113 \n Verifying : wget-1.14-10.el7_0.1.x86_64 42/113 \n Verifying : perl-Pod-Usage-1.63-3.el7.noarch 43/113 \n Verifying : perl-Encode-2.51-7.el7.x86_64 44/113 \n Verifying : boost-system-1.53.0-25.el7.x86_64 45/113 \n Verifying : perl-threads-1.87-4.el7.x86_64 46/113 \n Verifying : psmisc-22.20-9.el7.x86_64 47/113 \n Verifying : perl-Scalar-List-Utils-1.27-248.el7.x86_64 48/113 \n Verifying : ed-1.9-4.el7.x86_64 49/113 \n Verifying : redhat-lsb-core-4.1-27.el7.centos.1.x86_64 50/113 \n Verifying : mailx-12.5-12.el7_0.x86_64 51/113 \n Verifying : mokutil-0.9-2.el7.x86_64 52/113 \n Verifying : gcc-4.8.5-4.el7.x86_64 53/113 \n Verifying : perl-threads-shared-1.43-6.el7.x86_64 54/113 \n Verifying : perl-Storable-2.45-3.el7.x86_64 55/113 \n Verifying : byacc-1.9.20130304-3.el7.x86_64 56/113 \n Verifying : 4:perl-libs-5.16.3-286.el7.x86_64 57/113 \n Verifying : m4-1.4.16-10.el7.x86_64 58/113 \n Verifying : bc-1.06.95-13.el7.x86_64 59/113 \n Verifying : 1:perl-parent-0.225-244.el7.noarch 60/113 \n Verifying : perl-srpm-macros-1-8.el7.noarch 61/113 \n Verifying : gettext-common-devel-0.18.2.1-4.el7.noarch 62/113 \n Verifying : libquadmath-devel-4.8.5-4.el7.x86_64 63/113 \n Verifying : rpm-build-4.11.3-17.el7.x86_64 64/113 \n Verifying : libgnome-keyring-3.8.0-3.el7.x86_64 65/113 \n Verifying : libstdc++-devel-4.8.5-4.el7.x86_64 66/113 \n Verifying : perl-podlators-2.5.1-3.el7.noarch 67/113 \n Verifying : zip-3.0-10.el7.x86_64 68/113 \n Verifying : time-1.7-45.el7.x86_64 69/113 \n Verifying : mpfr-3.1.1-4.el7.x86_64 70/113 \n Verifying : dyninst-8.2.0-2.el7.x86_64 71/113 \n Verifying : perl-Filter-1.49-3.el7.x86_64 72/113 \n Verifying : dwz-0.11-3.el7.x86_64 73/113 \n Verifying : libtool-2.4.2-21.el7_2.x86_64 74/113 \n Verifying : 1:cups-client-1.6.3-22.el7.x86_64 75/113 \n Verifying : pakchois-0.4-10.el7.x86_64 76/113 \n Verifying : rsync-3.0.9-17.el7.x86_64 77/113 \n Verifying : ctags-5.8-13.el7.x86_64 78/113 \n Verifying : 
kernel-devel-3.10.0-327.18.2.el7.x86_64 79/113 \n Verifying : automake-1.13.4-3.el7.noarch 80/113 \n Verifying : perl-Exporter-5.68-3.el7.noarch 81/113 \n Verifying : perl-constant-1.27-2.el7.noarch 82/113 \n Verifying : perl-PathTools-3.40-5.el7.x86_64 83/113 \n Verifying : elfutils-0.163-3.el7.x86_64 84/113 \n Verifying : 4:perl-macros-5.16.3-286.el7.x86_64 85/113 \n Verifying : perl-Carp-1.26-244.el7.noarch 86/113 \n Verifying : perl-Test-Harness-3.28-3.el7.noarch 87/113 \n Verifying : kernel-headers-3.10.0-327.18.2.el7.x86_64 88/113 \n Verifying : apr-1.4.8-3.el7.x86_64 89/113 \n Verifying : 4:perl-5.16.3-286.el7.x86_64 90/113 \n Verifying : subversion-1.7.14-10.el7.x86_64 91/113 \n Verifying : perl-Thread-Queue-3.02-2.el7.noarch 92/113 \n Verifying : 1:perl-Pod-Simple-3.28-4.el7.noarch 93/113 \n Verifying : perl-Time-Local-1.2300-2.el7.noarch 94/113 \n Verifying : perl-Pod-Perldoc-3.20-4.el7.noarch 95/113 \n Verifying : perl-Git-1.8.3.1-6.el7_2.1.noarch 96/113 \n Verifying : boost-thread-1.53.0-25.el7.x86_64 97/113 \n Verifying : 1:emacs-filesystem-24.3-18.el7.noarch 98/113 \n Verifying : systemtap-client-2.8-10.el7.x86_64 99/113 \n Verifying : 1:perl-Error-0.17020-2.el7.noarch 100/113 \n Verifying : indent-2.2.11-13.el7.x86_64 101/113 \n Verifying : gcc-gfortran-4.8.5-4.el7.x86_64 102/113 \n Verifying : flex-2.5.37-3.el7.x86_64 103/113 \n Verifying : python-backports-1.0-8.el7.x86_64 104/113 \n Verifying : unzip-6.0-15.el7.x86_64 105/113 \n Verifying : gettext-devel-0.18.2.1-4.el7.x86_64 106/113 \n Verifying : gdb-7.6.1-80.el7.x86_64 107/113 \n Verifying : perl-Getopt-Long-2.40-2.el7.noarch 108/113 \n Verifying : cpp-4.8.5-4.el7.x86_64 109/113 \n Verifying : redhat-rpm-config-9.1.0-68.el7.centos.noarch 110/113 \n Verifying : spax-1.5.2-13.el7.x86_64 111/113 \n Verifying : libquadmath-4.8.5-4.el7.x86_64 112/113 \n Verifying : rpm-sign-4.11.3-17.el7.x86_64 113/113 \n\nInstalled:\n autoconf.noarch 0:2.69-11.el7 \n automake.noarch 0:1.13.4-3.el7 \n bison.x86_64 0:2.7-4.el7 \n byacc.x86_64 0:1.9.20130304-3.el7 \n cscope.x86_64 0:15.8-7.el7 \n ctags.x86_64 0:5.8-13.el7 \n deltarpm.x86_64 0:3.6-3.el7 \n diffstat.x86_64 0:1.57-4.el7 \n doxygen.x86_64 1:1.8.5-3.el7 \n elfutils.x86_64 0:0.163-3.el7 \n flex.x86_64 0:2.5.37-3.el7 \n gcc.x86_64 0:4.8.5-4.el7 \n gcc-c++.x86_64 0:4.8.5-4.el7 \n gcc-gfortran.x86_64 0:4.8.5-4.el7 \n git.x86_64 0:1.8.3.1-6.el7_2.1 \n indent.x86_64 0:2.2.11-13.el7 \n intltool.noarch 0:0.50.2-6.el7 \n libtool.x86_64 0:2.4.2-21.el7_2 \n patch.x86_64 0:2.7.1-8.el7 \n patchutils.x86_64 0:0.3.3-4.el7 \n python-setuptools.noarch 0:0.9.8-4.el7 \n rcs.x86_64 0:5.9.0-5.el7 \n redhat-lsb-core.x86_64 0:4.1-27.el7.centos.1 \n redhat-rpm-config.noarch 0:9.1.0-68.el7.centos \n rpm-build.x86_64 0:4.11.3-17.el7 \n rpm-sign.x86_64 0:4.11.3-17.el7 \n subversion.x86_64 0:1.7.14-10.el7 \n swig.x86_64 0:2.0.10-4.el7 \n systemtap.x86_64 0:2.8-10.el7 \n wget.x86_64 0:1.14-10.el7_0.1 \n yum-plugin-priorities.noarch 0:1.1.31-34.el7 \n\nDependency Installed:\n apr.x86_64 0:1.4.8-3.el7 \n apr-util.x86_64 0:1.5.2-6.el7 \n at.x86_64 0:3.1.13-20.el7 \n bc.x86_64 0:1.06.95-13.el7 \n boost-system.x86_64 0:1.53.0-25.el7 \n boost-thread.x86_64 0:1.53.0-25.el7 \n bzip2.x86_64 0:1.0.6-13.el7 \n cpp.x86_64 0:4.8.5-4.el7 \n cups-client.x86_64 1:1.6.3-22.el7 \n cups-libs.x86_64 1:1.6.3-22.el7 \n dwz.x86_64 0:0.11-3.el7 \n dyninst.x86_64 0:8.2.0-2.el7 \n ed.x86_64 0:1.9-4.el7 \n emacs-filesystem.noarch 1:24.3-18.el7 \n gdb.x86_64 0:7.6.1-80.el7 \n gettext-common-devel.noarch 0:0.18.2.1-4.el7 \n 
gettext-devel.x86_64 0:0.18.2.1-4.el7 \n glibc-devel.x86_64 0:2.17-106.el7_2.6 \n glibc-headers.x86_64 0:2.17-106.el7_2.6 \n kernel-devel.x86_64 0:3.10.0-327.18.2.el7 \n kernel-headers.x86_64 0:3.10.0-327.18.2.el7 \n libdwarf.x86_64 0:20130207-4.el7 \n libgfortran.x86_64 0:4.8.5-4.el7 \n libgnome-keyring.x86_64 0:3.8.0-3.el7 \n libmpc.x86_64 0:1.0.1-3.el7 \n libquadmath.x86_64 0:4.8.5-4.el7 \n libquadmath-devel.x86_64 0:4.8.5-4.el7 \n libstdc++-devel.x86_64 0:4.8.5-4.el7 \n m4.x86_64 0:1.4.16-10.el7 \n mailx.x86_64 0:12.5-12.el7_0 \n mokutil.x86_64 0:0.9-2.el7 \n mpfr.x86_64 0:3.1.1-4.el7 \n neon.x86_64 0:0.30.0-3.el7 \n pakchois.x86_64 0:0.4-10.el7 \n perl.x86_64 4:5.16.3-286.el7 \n perl-Carp.noarch 0:1.26-244.el7 \n perl-Data-Dumper.x86_64 0:2.145-3.el7 \n perl-Encode.x86_64 0:2.51-7.el7 \n perl-Error.noarch 1:0.17020-2.el7 \n perl-Exporter.noarch 0:5.68-3.el7 \n perl-File-Path.noarch 0:2.09-2.el7 \n perl-File-Temp.noarch 0:0.23.01-3.el7 \n perl-Filter.x86_64 0:1.49-3.el7 \n perl-Getopt-Long.noarch 0:2.40-2.el7 \n perl-Git.noarch 0:1.8.3.1-6.el7_2.1 \n perl-HTTP-Tiny.noarch 0:0.033-3.el7 \n perl-PathTools.x86_64 0:3.40-5.el7 \n perl-Pod-Escapes.noarch 1:1.04-286.el7 \n perl-Pod-Perldoc.noarch 0:3.20-4.el7 \n perl-Pod-Simple.noarch 1:3.28-4.el7 \n perl-Pod-Usage.noarch 0:1.63-3.el7 \n perl-Scalar-List-Utils.x86_64 0:1.27-248.el7 \n perl-Socket.x86_64 0:2.010-3.el7 \n perl-Storable.x86_64 0:2.45-3.el7 \n perl-TermReadKey.x86_64 0:2.30-20.el7 \n perl-Test-Harness.noarch 0:3.28-3.el7 \n perl-Text-ParseWords.noarch 0:3.29-4.el7 \n perl-Thread-Queue.noarch 0:3.02-2.el7 \n perl-Time-HiRes.x86_64 4:1.9725-3.el7 \n perl-Time-Local.noarch 0:1.2300-2.el7 \n perl-XML-Parser.x86_64 0:2.41-10.el7 \n perl-constant.noarch 0:1.27-2.el7 \n perl-libs.x86_64 4:5.16.3-286.el7 \n perl-macros.x86_64 4:5.16.3-286.el7 \n perl-parent.noarch 1:0.225-244.el7 \n perl-podlators.noarch 0:2.5.1-3.el7 \n perl-srpm-macros.noarch 0:1-8.el7 \n perl-threads.x86_64 0:1.87-4.el7 \n perl-threads-shared.x86_64 0:1.43-6.el7 \n psmisc.x86_64 0:22.20-9.el7 \n python-backports.x86_64 0:1.0-8.el7 \n python-backports-ssl_match_hostname.noarch 0:3.4.0.2-4.el7 \n redhat-lsb-submod-security.x86_64 0:4.1-27.el7.centos.1 \n rsync.x86_64 0:3.0.9-17.el7 \n spax.x86_64 0:1.5.2-13.el7 \n subversion-libs.x86_64 0:1.7.14-10.el7 \n systemtap-client.x86_64 0:2.8-10.el7 \n systemtap-devel.x86_64 0:2.8-10.el7 \n systemtap-runtime.x86_64 0:2.8-10.el7 \n time.x86_64 0:1.7-45.el7 \n unzip.x86_64 0:6.0-15.el7 \n zip.x86_64 0:3.0-10.el7 \n\nComplete!\n"]}
msg: All items completed
0:2.40-2.el7 will be installed --> Processing Dependency: perl(Pod::Usage) >= 1.14 for package: perl-Getopt-Long-2.40-2.el7.noarch ---> Package perl-Git.noarch 0:1.8.3.1-6.el7_2.1 will be installed ---> Package perl-PathTools.x86_64 0:3.40-5.el7 will be installed ---> Package perl-TermReadKey.x86_64 0:2.30-20.el7 will be installed ---> Package perl-Test-Harness.noarch 0:3.28-3.el7 will be installed ---> Package perl-Text-ParseWords.noarch 0:3.29-4.el7 will be installed ---> Package perl-Thread-Queue.noarch 0:3.02-2.el7 will be installed ---> Package perl-XML-Parser.x86_64 0:2.41-10.el7 will be installed ---> Package perl-constant.noarch 0:1.27-2.el7 will be installed ---> Package perl-srpm-macros.noarch 0:1-8.el7 will be installed ---> Package perl-threads.x86_64 0:1.87-4.el7 will be installed ---> Package psmisc.x86_64 0:22.20-9.el7 will be installed ---> Package python-backports-ssl_match_hostname.noarch 0:3.4.0.2-4.el7 will be installed --> Processing Dependency: python-backports for package: python-backports-ssl_match_hostname-3.4.0.2-4.el7.noarch ---> Package redhat-lsb-submod-security.x86_64 0:4.1-27.el7.centos.1 will be installed ---> Package rsync.x86_64 0:3.0.9-17.el7 will be installed ---> Package spax.x86_64 0:1.5.2-13.el7 will be installed ---> Package subversion-libs.x86_64 0:1.7.14-10.el7 will be installed ---> Package systemtap-client.x86_64 0:2.8-10.el7 will be installed --> Processing Dependency: systemtap-runtime = 2.8-10.el7 for package: systemtap-client-2.8-10.el7.x86_64 --> Processing Dependency: mokutil for package: systemtap-client-2.8-10.el7.x86_64 ---> Package systemtap-devel.x86_64 0:2.8-10.el7 will be installed --> Processing Dependency: kernel-devel for package: systemtap-devel-2.8-10.el7.x86_64 ---> Package time.x86_64 0:1.7-45.el7 will be installed ---> Package unzip.x86_64 0:6.0-15.el7 will be installed ---> Package zip.x86_64 0:3.0-10.el7 will be installed --> Running transaction check ---> Package cups-libs.x86_64 1:1.6.3-22.el7 will be installed ---> Package gettext-common-devel.noarch 0:0.18.2.1-4.el7 will be installed ---> Package glibc-headers.x86_64 0:2.17-106.el7_2.6 will be installed --> Processing Dependency: kernel-headers >= 2.2.1 for package: glibc-headers-2.17-106.el7_2.6.x86_64 --> Processing Dependency: kernel-headers for package: glibc-headers-2.17-106.el7_2.6.x86_64 ---> Package kernel-devel.x86_64 0:3.10.0-327.18.2.el7 will be installed ---> Package mokutil.x86_64 0:0.9-2.el7 will be installed ---> Package pakchois.x86_64 0:0.4-10.el7 will be installed ---> Package perl-Filter.x86_64 0:1.49-3.el7 will be installed ---> Package perl-Pod-Simple.noarch 1:3.28-4.el7 will be installed --> Processing Dependency: perl(Pod::Escapes) >= 1.04 for package: 1:perl-Pod-Simple-3.28-4.el7.noarch ---> Package perl-Pod-Usage.noarch 0:1.63-3.el7 will be installed --> Processing Dependency: perl(Pod::Text) >= 3.15 for package: perl-Pod-Usage-1.63-3.el7.noarch --> Processing Dependency: perl-Pod-Perldoc for package: perl-Pod-Usage-1.63-3.el7.noarch ---> Package perl-Scalar-List-Utils.x86_64 0:1.27-248.el7 will be installed ---> Package perl-Socket.x86_64 0:2.010-3.el7 will be installed ---> Package perl-Storable.x86_64 0:2.45-3.el7 will be installed ---> Package perl-Time-HiRes.x86_64 4:1.9725-3.el7 will be installed ---> Package perl-Time-Local.noarch 0:1.2300-2.el7 will be installed ---> Package perl-libs.x86_64 4:5.16.3-286.el7 will be installed ---> Package perl-macros.x86_64 4:5.16.3-286.el7 will be installed ---> Package perl-threads-shared.x86_64 
0:1.43-6.el7 will be installed ---> Package python-backports.x86_64 0:1.0-8.el7 will be installed ---> Package systemtap-runtime.x86_64 0:2.8-10.el7 will be installed --> Processing Dependency: libsymtabAPI.so.8.2()(64bit) for package: systemtap-runtime-2.8-10.el7.x86_64 --> Processing Dependency: libdyninstAPI.so.8.2()(64bit) for package: systemtap-runtime-2.8-10.el7.x86_64 --> Running transaction check ---> Package dyninst.x86_64 0:8.2.0-2.el7 will be installed --> Processing Dependency: libdwarf.so.0()(64bit) for package: dyninst-8.2.0-2.el7.x86_64 --> Processing Dependency: libboost_thread-mt.so.1.53.0()(64bit) for package: dyninst-8.2.0-2.el7.x86_64 --> Processing Dependency: libboost_system-mt.so.1.53.0()(64bit) for package: dyninst-8.2.0-2.el7.x86_64 ---> Package kernel-headers.x86_64 0:3.10.0-327.18.2.el7 will be installed ---> Package perl-Pod-Escapes.noarch 1:1.04-286.el7 will be installed ---> Package perl-Pod-Perldoc.noarch 0:3.20-4.el7 will be installed --> Processing Dependency: perl(parent) for package: perl-Pod-Perldoc-3.20-4.el7.noarch --> Processing Dependency: perl(HTTP::Tiny) for package: perl-Pod-Perldoc-3.20-4.el7.noarch ---> Package perl-podlators.noarch 0:2.5.1-3.el7 will be installed --> Running transaction check ---> Package boost-system.x86_64 0:1.53.0-25.el7 will be installed ---> Package boost-thread.x86_64 0:1.53.0-25.el7 will be installed ---> Package libdwarf.x86_64 0:20130207-4.el7 will be installed ---> Package perl-HTTP-Tiny.noarch 0:0.033-3.el7 will be installed ---> Package perl-parent.noarch 1:0.225-244.el7 will be installed --> Finished Dependency Resolution Dependencies Resolved ================================================================================ Package Arch Version Repository Size ================================================================================ Installing for group install \"Development Tools\": autoconf noarch 2.69-11.el7 base 701 k automake noarch 1.13.4-3.el7 base 679 k bison x86_64 2.7-4.el7 base 578 k byacc x86_64 1.9.20130304-3.el7 base 65 k cscope x86_64 15.8-7.el7 base 203 k ctags x86_64 5.8-13.el7 base 155 k diffstat x86_64 1.57-4.el7 base 35 k doxygen x86_64 1:1.8.5-3.el7 base 3.6 M elfutils x86_64 0.163-3.el7 base 268 k flex x86_64 2.5.37-3.el7 base 292 k gcc x86_64 4.8.5-4.el7 base 16 M gcc-c++ x86_64 4.8.5-4.el7 base 7.2 M gcc-gfortran x86_64 4.8.5-4.el7 base 6.6 M git x86_64 1.8.3.1-6.el7_2.1 updates 4.4 M indent x86_64 2.2.11-13.el7 base 150 k intltool noarch 0.50.2-6.el7 base 59 k libtool x86_64 2.4.2-21.el7_2 updates 588 k patch x86_64 2.7.1-8.el7 base 110 k patchutils x86_64 0.3.3-4.el7 base 104 k rcs x86_64 5.9.0-5.el7 base 230 k redhat-rpm-config noarch 9.1.0-68.el7.centos base 77 k rpm-build x86_64 4.11.3-17.el7 base 143 k rpm-sign x86_64 4.11.3-17.el7 base 44 k subversion x86_64 1.7.14-10.el7 base 1.0 M swig x86_64 2.0.10-4.el7 base 1.3 M systemtap x86_64 2.8-10.el7 base 25 k Installing: deltarpm x86_64 3.6-3.el7 base 82 k python-setuptools noarch 0.9.8-4.el7 base 396 k redhat-lsb-core x86_64 4.1-27.el7.centos.1 base 38 k wget x86_64 1.14-10.el7_0.1 base 545 k yum-plugin-priorities noarch 1.1.31-34.el7 base 25 k Installing for dependencies: apr x86_64 1.4.8-3.el7 base 103 k apr-util x86_64 1.5.2-6.el7 base 92 k at x86_64 3.1.13-20.el7 base 50 k bc x86_64 1.06.95-13.el7 base 115 k boost-system x86_64 1.53.0-25.el7 base 39 k boost-thread x86_64 1.53.0-25.el7 base 57 k bzip2 x86_64 1.0.6-13.el7 base 52 k cpp x86_64 4.8.5-4.el7 base 5.9 M cups-client x86_64 1:1.6.3-22.el7 base 148 k cups-libs x86_64 
1:1.6.3-22.el7 base 355 k dwz x86_64 0.11-3.el7 base 99 k dyninst x86_64 8.2.0-2.el7 base 2.5 M ed x86_64 1.9-4.el7 base 72 k emacs-filesystem noarch 1:24.3-18.el7 base 58 k gdb x86_64 7.6.1-80.el7 base 2.4 M gettext-common-devel noarch 0.18.2.1-4.el7 base 368 k gettext-devel x86_64 0.18.2.1-4.el7 base 315 k glibc-devel x86_64 2.17-106.el7_2.6 updates 1.0 M glibc-headers x86_64 2.17-106.el7_2.6 updates 662 k kernel-devel x86_64 3.10.0-327.18.2.el7 updates 11 M kernel-headers x86_64 3.10.0-327.18.2.el7 updates 3.2 M libdwarf x86_64 20130207-4.el7 base 109 k libgfortran x86_64 4.8.5-4.el7 base 293 k libgnome-keyring x86_64 3.8.0-3.el7 base 109 k libmpc x86_64 1.0.1-3.el7 base 51 k libquadmath x86_64 4.8.5-4.el7 base 182 k libquadmath-devel x86_64 4.8.5-4.el7 base 46 k libstdc++-devel x86_64 4.8.5-4.el7 base 1.5 M m4 x86_64 1.4.16-10.el7 base 256 k mailx x86_64 12.5-12.el7_0 base 244 k mokutil x86_64 0.9-2.el7 base 37 k mpfr x86_64 3.1.1-4.el7 base 203 k neon x86_64 0.30.0-3.el7 base 165 k pakchois x86_64 0.4-10.el7 base 14 k perl x86_64 4:5.16.3-286.el7 base 8.0 M perl-Carp noarch 1.26-244.el7 base 19 k perl-Data-Dumper x86_64 2.145-3.el7 base 47 k perl-Encode x86_64 2.51-7.el7 base 1.5 M perl-Error noarch 1:0.17020-2.el7 base 32 k perl-Exporter noarch 5.68-3.el7 base 28 k perl-File-Path noarch 2.09-2.el7 base 26 k perl-File-Temp noarch 0.23.01-3.el7 base 56 k perl-Filter x86_64 1.49-3.el7 base 76 k perl-Getopt-Long noarch 2.40-2.el7 base 56 k perl-Git noarch 1.8.3.1-6.el7_2.1 updates 53 k perl-HTTP-Tiny noarch 0.033-3.el7 base 38 k perl-PathTools x86_64 3.40-5.el7 base 82 k perl-Pod-Escapes noarch 1:1.04-286.el7 base 50 k perl-Pod-Perldoc noarch 3.20-4.el7 base 87 k perl-Pod-Simple noarch 1:3.28-4.el7 base 216 k perl-Pod-Usage noarch 1.63-3.el7 base 27 k perl-Scalar-List-Utils x86_64 1.27-248.el7 base 36 k perl-Socket x86_64 2.010-3.el7 base 49 k perl-Storable x86_64 2.45-3.el7 base 77 k perl-TermReadKey x86_64 2.30-20.el7 base 31 k perl-Test-Harness noarch 3.28-3.el7 base 302 k perl-Text-ParseWords noarch 3.29-4.el7 base 14 k perl-Thread-Queue noarch 3.02-2.el7 base 17 k perl-Time-HiRes x86_64 4:1.9725-3.el7 base 45 k perl-Time-Local noarch 1.2300-2.el7 base 24 k perl-XML-Parser x86_64 2.41-10.el7 base 223 k perl-constant noarch 1.27-2.el7 base 19 k perl-libs x86_64 4:5.16.3-286.el7 base 687 k perl-macros x86_64 4:5.16.3-286.el7 base 43 k perl-parent noarch 1:0.225-244.el7 base 12 k perl-podlators noarch 2.5.1-3.el7 base 112 k perl-srpm-macros noarch 1-8.el7 base 4.6 k perl-threads x86_64 1.87-4.el7 base 49 k perl-threads-shared x86_64 1.43-6.el7 base 39 k psmisc x86_64 22.20-9.el7 base 140 k python-backports x86_64 1.0-8.el7 base 5.8 k python-backports-ssl_match_hostname noarch 3.4.0.2-4.el7 base 12 k redhat-lsb-submod-security x86_64 4.1-27.el7.centos.1 base 15 k rsync x86_64 3.0.9-17.el7 base 360 k spax x86_64 1.5.2-13.el7 base 260 k subversion-libs x86_64 1.7.14-10.el7 base 921 k systemtap-client x86_64 2.8-10.el7 base 2.9 M systemtap-devel x86_64 2.8-10.el7 base 1.6 M systemtap-runtime x86_64 2.8-10.el7 base 270 k time x86_64 1.7-45.el7 base 30 k unzip x86_64 6.0-15.el7 base 166 k zip x86_64 3.0-10.el7 base 260 k Transaction Summary ================================================================================ Install 31 Packages (+82 Dependent packages) Total download size: 96 M Installed size: 277 M Downloading packages: Delta RPMs disabled because /usr/bin/applydeltarpm not installed. 
Public key for apr-1.4.8-3.el7.x86_64.rpm is not installed Public key for git-1.8.3.1-6.el7_2.1.x86_64.rpm is not installed -------------------------------------------------------------------------------- Total 94 MB/s | 96 MB 00:01 Retrieving key from file:///etc/pki/rpm-gpg/RPM-GPG-KEY-CentOS-7 Running transaction check Running transaction test Transaction test succeeded Running transaction Installing : mpfr-3.1.1-4.el7.x86_64 1/113 Installing : libmpc-1.0.1-3.el7.x86_64 2/113 Installing : m4-1.4.16-10.el7.x86_64 3/113 Installing : libquadmath-4.8.5-4.el7.x86_64 4/113 Installing : apr-1.4.8-3.el7.x86_64 5/113 Installing : patch-2.7.1-8.el7.x86_64 6/113 Installing : apr-util-1.5.2-6.el7.x86_64 7/113 Installing : unzip-6.0-15.el7.x86_64 8/113 Installing : zip-3.0-10.el7.x86_64 9/113 Installing : boost-system-1.53.0-25.el7.x86_64 10/113 Installing : boost-thread-1.53.0-25.el7.x86_64 11/113 Installing : libgfortran-4.8.5-4.el7.x86_64 12/113 Installing : cpp-4.8.5-4.el7.x86_64 13/113 Installing : 1:perl-parent-0.225-244.el7.noarch 14/113 Installing : perl-HTTP-Tiny-0.033-3.el7.noarch 15/113 Installing : perl-podlators-2.5.1-3.el7.noarch 16/113 Installing : perl-Pod-Perldoc-3.20-4.el7.noarch 17/113 Installing : 1:perl-Pod-Escapes-1.04-286.el7.noarch 18/113 Installing : perl-Text-ParseWords-3.29-4.el7.noarch 19/113 Installing : perl-Encode-2.51-7.el7.x86_64 20/113 Installing : perl-Pod-Usage-1.63-3.el7.noarch 21/113 Installing : 4:perl-libs-5.16.3-286.el7.x86_64 22/113 Installing : 4:perl-macros-5.16.3-286.el7.x86_64 23/113 Installing : perl-threads-1.87-4.el7.x86_64 24/113 Installing : perl-Socket-2.010-3.el7.x86_64 25/113 Installing : 4:perl-Time-HiRes-1.9725-3.el7.x86_64 26/113 Installing : perl-threads-shared-1.43-6.el7.x86_64 27/113 Installing : perl-Scalar-List-Utils-1.27-248.el7.x86_64 28/113 Installing : perl-Storable-2.45-3.el7.x86_64 29/113 Installing : perl-Filter-1.49-3.el7.x86_64 30/113 Installing : perl-Exporter-5.68-3.el7.noarch 31/113 Installing : perl-constant-1.27-2.el7.noarch 32/113 Installing : perl-File-Temp-0.23.01-3.el7.noarch 33/113 Installing : perl-File-Path-2.09-2.el7.noarch 34/113 Installing : perl-PathTools-3.40-5.el7.x86_64 35/113 Installing : perl-Carp-1.26-244.el7.noarch 36/113 Installing : perl-Time-Local-1.2300-2.el7.noarch 37/113 Installing : 1:perl-Pod-Simple-3.28-4.el7.noarch 38/113 Installing : perl-Getopt-Long-2.40-2.el7.noarch 39/113 Installing : 4:perl-5.16.3-286.el7.x86_64 40/113 Installing : perl-Thread-Queue-3.02-2.el7.noarch 41/113 Installing : 1:perl-Error-0.17020-2.el7.noarch 42/113 Installing : perl-TermReadKey-2.30-20.el7.x86_64 43/113 Installing : perl-Data-Dumper-2.145-3.el7.x86_64 44/113 Installing : autoconf-2.69-11.el7.noarch 45/113 Installing : perl-XML-Parser-2.41-10.el7.x86_64 46/113 Installing : kernel-devel-3.10.0-327.18.2.el7.x86_64 47/113 Installing : perl-Test-Harness-3.28-3.el7.noarch 48/113 Installing : automake-1.13.4-3.el7.noarch 49/113 Installing : spax-1.5.2-13.el7.x86_64 50/113 Installing : gdb-7.6.1-80.el7.x86_64 51/113 Installing : python-backports-1.0-8.el7.x86_64 52/113 Installing : python-backports-ssl_match_hostname-3.4.0.2-4.el7.noar 53/113 Installing : 1:emacs-filesystem-24.3-18.el7.noarch 54/113 Installing : kernel-headers-3.10.0-327.18.2.el7.x86_64 55/113 Installing : glibc-headers-2.17-106.el7_2.6.x86_64 56/113 Installing : glibc-devel-2.17-106.el7_2.6.x86_64 57/113 Installing : gcc-4.8.5-4.el7.x86_64 58/113 Installing : systemtap-devel-2.8-10.el7.x86_64 59/113 Installing : libquadmath-devel-4.8.5-4.el7.x86_64 
60/113 Installing : elfutils-0.163-3.el7.x86_64 61/113 Installing : rsync-3.0.9-17.el7.x86_64 62/113 Installing : pakchois-0.4-10.el7.x86_64 63/113 Installing : neon-0.30.0-3.el7.x86_64 64/113 Installing : subversion-libs-1.7.14-10.el7.x86_64 65/113 Installing : dwz-0.11-3.el7.x86_64 66/113 Installing : time-1.7-45.el7.x86_64 67/113 Installing : libstdc++-devel-4.8.5-4.el7.x86_64 68/113 Installing : libgnome-keyring-3.8.0-3.el7.x86_64 69/113 Installing : git-1.8.3.1-6.el7_2.1.x86_64 70/113 Installing : perl-Git-1.8.3.1-6.el7_2.1.noarch 71/113 Installing : gettext-common-devel-0.18.2.1-4.el7.noarch 72/113 Installing : gettext-devel-0.18.2.1-4.el7.x86_64 73/113 Installing : perl-srpm-macros-1-8.el7.noarch 74/113 Installing : redhat-rpm-config-9.1.0-68.el7.centos.noarch 75/113 Installing : bc-1.06.95-13.el7.x86_64 76/113 Installing : mokutil-0.9-2.el7.x86_64 77/113 Installing : mailx-12.5-12.el7_0.x86_64 78/113 Installing : ed-1.9-4.el7.x86_64 79/113 Installing : psmisc-22.20-9.el7.x86_64 80/113 Installing : at-3.1.13-20.el7.x86_64 81/113 Installing : bzip2-1.0.6-13.el7.x86_64 82/113 Installing : redhat-lsb-submod-security-4.1-27.el7.centos.1.x86_64 83/113 Installing : 1:cups-libs-1.6.3-22.el7.x86_64 84/113 Installing : 1:cups-client-1.6.3-22.el7.x86_64 85/113 Installing : libdwarf-20130207-4.el7.x86_64 86/113 Installing : dyninst-8.2.0-2.el7.x86_64 87/113 Installing : systemtap-runtime-2.8-10.el7.x86_64 88/113 Installing : systemtap-client-2.8-10.el7.x86_64 89/113 Installing : systemtap-2.8-10.el7.x86_64 90/113 Installing : redhat-lsb-core-4.1-27.el7.centos.1.x86_64 91/113 Installing : rpm-build-4.11.3-17.el7.x86_64 92/113 Installing : intltool-0.50.2-6.el7.noarch 93/113 Installing : gcc-c++-4.8.5-4.el7.x86_64 94/113 Installing : subversion-1.7.14-10.el7.x86_64 95/113 Installing : gcc-gfortran-4.8.5-4.el7.x86_64 96/113 Installing : libtool-2.4.2-21.el7_2.x86_64 97/113 Installing : cscope-15.8-7.el7.x86_64 98/113 Installing : python-setuptools-0.9.8-4.el7.noarch 99/113 Installing : patchutils-0.3.3-4.el7.x86_64 100/113 Installing : bison-2.7-4.el7.x86_64 101/113 Installing : flex-2.5.37-3.el7.x86_64 102/113 Installing : rpm-sign-4.11.3-17.el7.x86_64 103/113 Installing : indent-2.2.11-13.el7.x86_64 104/113 Installing : ctags-5.8-13.el7.x86_64 105/113 Installing : byacc-1.9.20130304-3.el7.x86_64 106/113 Installing : wget-1.14-10.el7_0.1.x86_64 107/113 Installing : rcs-5.9.0-5.el7.x86_64 108/113 Installing : swig-2.0.10-4.el7.x86_64 109/113 Installing : diffstat-1.57-4.el7.x86_64 110/113 Installing : deltarpm-3.6-3.el7.x86_64 111/113 Installing : 1:doxygen-1.8.5-3.el7.x86_64 112/113 Installing : yum-plugin-priorities-1.1.31-34.el7.noarch 113/113 Verifying : yum-plugin-priorities-1.1.31-34.el7.noarch 1/113 Verifying : perl-HTTP-Tiny-0.033-3.el7.noarch 2/113 Verifying : libdwarf-20130207-4.el7.x86_64 3/113 Verifying : 1:doxygen-1.8.5-3.el7.x86_64 4/113 Verifying : python-backports-ssl_match_hostname-3.4.0.2-4.el7.noar 5/113 Verifying : perl-TermReadKey-2.30-20.el7.x86_64 6/113 Verifying : glibc-devel-2.17-106.el7_2.6.x86_64 7/113 Verifying : perl-File-Temp-0.23.01-3.el7.noarch 8/113 Verifying : cscope-15.8-7.el7.x86_64 9/113 Verifying : patch-2.7.1-8.el7.x86_64 10/113 Verifying : glibc-headers-2.17-106.el7_2.6.x86_64 11/113 Verifying : perl-Data-Dumper-2.145-3.el7.x86_64 12/113 Verifying : 1:cups-libs-1.6.3-22.el7.x86_64 13/113 Verifying : apr-util-1.5.2-6.el7.x86_64 14/113 Verifying : deltarpm-3.6-3.el7.x86_64 15/113 Verifying : systemtap-2.8-10.el7.x86_64 16/113 Verifying : 
1:perl-Pod-Escapes-1.04-286.el7.noarch 17/113 Verifying : patchutils-0.3.3-4.el7.x86_64 18/113 Verifying : neon-0.30.0-3.el7.x86_64 19/113 Verifying : subversion-libs-1.7.14-10.el7.x86_64 20/113 Verifying : intltool-0.50.2-6.el7.noarch 21/113 Verifying : perl-File-Path-2.09-2.el7.noarch 22/113 Verifying : autoconf-2.69-11.el7.noarch 23/113 Verifying : perl-Socket-2.010-3.el7.x86_64 24/113 Verifying : perl-Text-ParseWords-3.29-4.el7.noarch 25/113 Verifying : diffstat-1.57-4.el7.x86_64 26/113 Verifying : git-1.8.3.1-6.el7_2.1.x86_64 27/113 Verifying : swig-2.0.10-4.el7.x86_64 28/113 Verifying : systemtap-runtime-2.8-10.el7.x86_64 29/113 Verifying : redhat-lsb-submod-security-4.1-27.el7.centos.1.x86_64 30/113 Verifying : bison-2.7-4.el7.x86_64 31/113 Verifying : 4:perl-Time-HiRes-1.9725-3.el7.x86_64 32/113 Verifying : gcc-c++-4.8.5-4.el7.x86_64 33/113 Verifying : perl-XML-Parser-2.41-10.el7.x86_64 34/113 Verifying : python-setuptools-0.9.8-4.el7.noarch 35/113 Verifying : bzip2-1.0.6-13.el7.x86_64 36/113 Verifying : systemtap-devel-2.8-10.el7.x86_64 37/113 Verifying : rcs-5.9.0-5.el7.x86_64 38/113 Verifying : at-3.1.13-20.el7.x86_64 39/113 Verifying : libgfortran-4.8.5-4.el7.x86_64 40/113 Verifying : libmpc-1.0.1-3.el7.x86_64 41/113 Verifying : wget-1.14-10.el7_0.1.x86_64 42/113 Verifying : perl-Pod-Usage-1.63-3.el7.noarch 43/113 Verifying : perl-Encode-2.51-7.el7.x86_64 44/113 Verifying : boost-system-1.53.0-25.el7.x86_64 45/113 Verifying : perl-threads-1.87-4.el7.x86_64 46/113 Verifying : psmisc-22.20-9.el7.x86_64 47/113 Verifying : perl-Scalar-List-Utils-1.27-248.el7.x86_64 48/113 Verifying : ed-1.9-4.el7.x86_64 49/113 Verifying : redhat-lsb-core-4.1-27.el7.centos.1.x86_64 50/113 Verifying : mailx-12.5-12.el7_0.x86_64 51/113 Verifying : mokutil-0.9-2.el7.x86_64 52/113 Verifying : gcc-4.8.5-4.el7.x86_64 53/113 Verifying : perl-threads-shared-1.43-6.el7.x86_64 54/113 Verifying : perl-Storable-2.45-3.el7.x86_64 55/113 Verifying : byacc-1.9.20130304-3.el7.x86_64 56/113 Verifying : 4:perl-libs-5.16.3-286.el7.x86_64 57/113 Verifying : m4-1.4.16-10.el7.x86_64 58/113 Verifying : bc-1.06.95-13.el7.x86_64 59/113 Verifying : 1:perl-parent-0.225-244.el7.noarch 60/113 Verifying : perl-srpm-macros-1-8.el7.noarch 61/113 Verifying : gettext-common-devel-0.18.2.1-4.el7.noarch 62/113 Verifying : libquadmath-devel-4.8.5-4.el7.x86_64 63/113 Verifying : rpm-build-4.11.3-17.el7.x86_64 64/113 Verifying : libgnome-keyring-3.8.0-3.el7.x86_64 65/113 Verifying : libstdc++-devel-4.8.5-4.el7.x86_64 66/113 Verifying : perl-podlators-2.5.1-3.el7.noarch 67/113 Verifying : zip-3.0-10.el7.x86_64 68/113 Verifying : time-1.7-45.el7.x86_64 69/113 Verifying : mpfr-3.1.1-4.el7.x86_64 70/113 Verifying : dyninst-8.2.0-2.el7.x86_64 71/113 Verifying : perl-Filter-1.49-3.el7.x86_64 72/113 Verifying : dwz-0.11-3.el7.x86_64 73/113 Verifying : libtool-2.4.2-21.el7_2.x86_64 74/113 Verifying : 1:cups-client-1.6.3-22.el7.x86_64 75/113 Verifying : pakchois-0.4-10.el7.x86_64 76/113 Verifying : rsync-3.0.9-17.el7.x86_64 77/113 Verifying : ctags-5.8-13.el7.x86_64 78/113 Verifying : kernel-devel-3.10.0-327.18.2.el7.x86_64 79/113 Verifying : automake-1.13.4-3.el7.noarch 80/113 Verifying : perl-Exporter-5.68-3.el7.noarch 81/113 Verifying : perl-constant-1.27-2.el7.noarch 82/113 Verifying : perl-PathTools-3.40-5.el7.x86_64 83/113 Verifying : elfutils-0.163-3.el7.x86_64 84/113 Verifying : 4:perl-macros-5.16.3-286.el7.x86_64 85/113 Verifying : perl-Carp-1.26-244.el7.noarch 86/113 Verifying : perl-Test-Harness-3.28-3.el7.noarch 87/113 Verifying : 
kernel-headers-3.10.0-327.18.2.el7.x86_64 88/113 Verifying : apr-1.4.8-3.el7.x86_64 89/113 Verifying : 4:perl-5.16.3-286.el7.x86_64 90/113 Verifying : subversion-1.7.14-10.el7.x86_64 91/113 Verifying : perl-Thread-Queue-3.02-2.el7.noarch 92/113 Verifying : 1:perl-Pod-Simple-3.28-4.el7.noarch 93/113 Verifying : perl-Time-Local-1.2300-2.el7.noarch 94/113 Verifying : perl-Pod-Perldoc-3.20-4.el7.noarch 95/113 Verifying : perl-Git-1.8.3.1-6.el7_2.1.noarch 96/113 Verifying : boost-thread-1.53.0-25.el7.x86_64 97/113 Verifying : 1:emacs-filesystem-24.3-18.el7.noarch 98/113 Verifying : systemtap-client-2.8-10.el7.x86_64 99/113 Verifying : 1:perl-Error-0.17020-2.el7.noarch 100/113 Verifying : indent-2.2.11-13.el7.x86_64 101/113 Verifying : gcc-gfortran-4.8.5-4.el7.x86_64 102/113 Verifying : flex-2.5.37-3.el7.x86_64 103/113 Verifying : python-backports-1.0-8.el7.x86_64 104/113 Verifying : unzip-6.0-15.el7.x86_64 105/113 Verifying : gettext-devel-0.18.2.1-4.el7.x86_64 106/113 Verifying : gdb-7.6.1-80.el7.x86_64 107/113 Verifying : perl-Getopt-Long-2.40-2.el7.noarch 108/113 Verifying : cpp-4.8.5-4.el7.x86_64 109/113 Verifying : redhat-rpm-config-9.1.0-68.el7.centos.noarch 110/113 Verifying : spax-1.5.2-13.el7.x86_64 111/113 Verifying : libquadmath-4.8.5-4.el7.x86_64 112/113 Verifying : rpm-sign-4.11.3-17.el7.x86_64 113/113 Installed: autoconf.noarch 0:2.69-11.el7 automake.noarch 0:1.13.4-3.el7 bison.x86_64 0:2.7-4.el7 byacc.x86_64 0:1.9.20130304-3.el7 cscope.x86_64 0:15.8-7.el7 ctags.x86_64 0:5.8-13.el7 deltarpm.x86_64 0:3.6-3.el7 diffstat.x86_64 0:1.57-4.el7 doxygen.x86_64 1:1.8.5-3.el7 elfutils.x86_64 0:0.163-3.el7 flex.x86_64 0:2.5.37-3.el7 gcc.x86_64 0:4.8.5-4.el7 gcc-c++.x86_64 0:4.8.5-4.el7 gcc-gfortran.x86_64 0:4.8.5-4.el7 git.x86_64 0:1.8.3.1-6.el7_2.1 indent.x86_64 0:2.2.11-13.el7 intltool.noarch 0:0.50.2-6.el7 libtool.x86_64 0:2.4.2-21.el7_2 patch.x86_64 0:2.7.1-8.el7 patchutils.x86_64 0:0.3.3-4.el7 python-setuptools.noarch 0:0.9.8-4.el7 rcs.x86_64 0:5.9.0-5.el7 redhat-lsb-core.x86_64 0:4.1-27.el7.centos.1 redhat-rpm-config.noarch 0:9.1.0-68.el7.centos rpm-build.x86_64 0:4.11.3-17.el7 rpm-sign.x86_64 0:4.11.3-17.el7 subversion.x86_64 0:1.7.14-10.el7 swig.x86_64 0:2.0.10-4.el7 systemtap.x86_64 0:2.8-10.el7 wget.x86_64 0:1.14-10.el7_0.1 yum-plugin-priorities.noarch 0:1.1.31-34.el7 Dependency Installed: apr.x86_64 0:1.4.8-3.el7 apr-util.x86_64 0:1.5.2-6.el7 at.x86_64 0:3.1.13-20.el7 bc.x86_64 0:1.06.95-13.el7 boost-system.x86_64 0:1.53.0-25.el7 boost-thread.x86_64 0:1.53.0-25.el7 bzip2.x86_64 0:1.0.6-13.el7 cpp.x86_64 0:4.8.5-4.el7 cups-client.x86_64 1:1.6.3-22.el7 cups-libs.x86_64 1:1.6.3-22.el7 dwz.x86_64 0:0.11-3.el7 dyninst.x86_64 0:8.2.0-2.el7 ed.x86_64 0:1.9-4.el7 emacs-filesystem.noarch 1:24.3-18.el7 gdb.x86_64 0:7.6.1-80.el7 gettext-common-devel.noarch 0:0.18.2.1-4.el7 gettext-devel.x86_64 0:0.18.2.1-4.el7 glibc-devel.x86_64 0:2.17-106.el7_2.6 glibc-headers.x86_64 0:2.17-106.el7_2.6 kernel-devel.x86_64 0:3.10.0-327.18.2.el7 kernel-headers.x86_64 0:3.10.0-327.18.2.el7 libdwarf.x86_64 0:20130207-4.el7 libgfortran.x86_64 0:4.8.5-4.el7 libgnome-keyring.x86_64 0:3.8.0-3.el7 libmpc.x86_64 0:1.0.1-3.el7 libquadmath.x86_64 0:4.8.5-4.el7 libquadmath-devel.x86_64 0:4.8.5-4.el7 libstdc++-devel.x86_64 0:4.8.5-4.el7 m4.x86_64 0:1.4.16-10.el7 mailx.x86_64 0:12.5-12.el7_0 mokutil.x86_64 0:0.9-2.el7 mpfr.x86_64 0:3.1.1-4.el7 neon.x86_64 0:0.30.0-3.el7 pakchois.x86_64 0:0.4-10.el7 perl.x86_64 4:5.16.3-286.el7 perl-Carp.noarch 0:1.26-244.el7 perl-Data-Dumper.x86_64 0:2.145-3.el7 perl-Encode.x86_64 
TASK [common : Install debug packages] *****************************************
task path: /home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo/playbooks/roles/common/tasks/packages.yml:34
Friday 20 May 2016 11:22:10 +0000 (0:00:33.539) 0:00:59.495 ************
changed: [n2.dusty] => (item=[u'net-tools', u'lsof', u'tcpdump', u'sysstat', u'psmisc']) => {"changed": true, "item": ["net-tools", "lsof", "tcpdump", "sysstat", "psmisc"], "msg": "", "rc": 0, "results": ["psmisc-22.20-9.el7.x86_64 providing psmisc is already installed", "Loaded plugins: fastestmirror, priorities\nLoading mirror speeds from cached hostfile\n * base: mirror.centos.org\n * extras: mirror.centos.org\n * updates: mirror.centos.org\nResolving Dependencies\n--> Running transaction check\n---> Package lsof.x86_64 0:4.87-4.el7 will be installed\n---> Package net-tools.x86_64 0:2.0-0.17.20131004git.el7 will be installed\n---> Package sysstat.x86_64 0:10.1.5-7.el7 will be installed\n--> Processing Dependency: libsensors.so.4()(64bit) for package: sysstat-10.1.5-7.el7.x86_64\n---> Package tcpdump.x86_64 14:4.5.1-3.el7 will be installed\n--> Running transaction check\n---> Package lm_sensors-libs.x86_64 0:3.3.4-11.el7 will be installed\n--> Finished Dependency Resolution\n\nDependencies Resolved\n\n================================================================================\n Package Arch Version Repository\n Size\n================================================================================\nInstalling:\n lsof x86_64 4.87-4.el7 base 331 k\n net-tools x86_64 2.0-0.17.20131004git.el7 base 304 k\n sysstat x86_64 10.1.5-7.el7 base 296 k\n tcpdump x86_64 14:4.5.1-3.el7 base 387 k\nInstalling for dependencies:\n lm_sensors-libs x86_64 3.3.4-11.el7 base 40 k\n\nTransaction Summary\n================================================================================\nInstall 4 Packages (+1 Dependent package)\n\nTotal download size: 1.3 M\nInstalled size: 3.8 M\nDownloading packages:\n--------------------------------------------------------------------------------\nTotal 14 MB/s | 1.3 MB 00:00 \nRunning transaction check\nRunning transaction test\nTransaction test succeeded\nRunning transaction\n Installing : lm_sensors-libs-3.3.4-11.el7.x86_64 1/5 \n Installing : sysstat-10.1.5-7.el7.x86_64 2/5 \n Installing : net-tools-2.0-0.17.20131004git.el7.x86_64 3/5 \n Installing : lsof-4.87-4.el7.x86_64 4/5 \n Installing : 14:tcpdump-4.5.1-3.el7.x86_64 5/5 \n Verifying : sysstat-10.1.5-7.el7.x86_64 1/5 \n Verifying : lm_sensors-libs-3.3.4-11.el7.x86_64 2/5 \n Verifying : 14:tcpdump-4.5.1-3.el7.x86_64 3/5 \n Verifying : lsof-4.87-4.el7.x86_64 4/5 \n Verifying : net-tools-2.0-0.17.20131004git.el7.x86_64 5/5 \n\nInstalled:\n lsof.x86_64 0:4.87-4.el7 net-tools.x86_64 0:2.0-0.17.20131004git.el7 \n sysstat.x86_64 0:10.1.5-7.el7 tcpdump.x86_64 14:4.5.1-3.el7 \n\nDependency Installed:\n lm_sensors-libs.x86_64 0:3.3.4-11.el7 \n\nComplete!\n"]}
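The (item=[u'net-tools', u'lsof', u'tcpdump', u'sysstat', u'psmisc']) echo above shows the whole list being handed to the yum module in one squashed loop pass with state=present. A minimal sketch of what the task at roles/common/tasks/packages.yml:34 plausibly looks like, reconstructed only from that echo and not quoted from the weirdo source:

    # Sketch, assuming a plain with_items loop over the package list;
    # Ansible squashes yum loop items into one transaction, as seen above.
    - name: Install debug packages
      yum:
        name: "{{ item }}"
        state: present
      with_items:
        - net-tools
        - lsof
        - tcpdump
        - sysstat
        - psmisc
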
TASK [common : include] ********************************************************
task path: /home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo/playbooks/roles/common/tasks/main.yml:20
Friday 20 May 2016 11:22:12 +0000 (0:00:02.603) 0:01:02.099 ************
included: /home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo/playbooks/roles/common/tasks/setup.yml for n2.dusty
TASK [common : Create log folder to centralize logs in] ************************
task path: /home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo/playbooks/roles/common/tasks/setup.yml:17
Friday 20 May 2016 11:22:13 +0000 (0:00:00.408) 0:01:02.508 ************
changed: [n2.dusty] => {"changed": true, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245", "secontext": "unconfined_u:object_r:var_log_t:s0", "size": 6, "state": "directory", "uid": 0}
TASK [common : Enable sysstat] *************************************************
task path: /home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo/playbooks/roles/common/tasks/setup.yml:23
Friday 20 May 2016 11:22:13 +0000 (0:00:00.612) 0:01:03.120 ************
changed: [n2.dusty] => {"changed": true, "enabled": true, "name": "sysstat", "state": "started"}
TASK [common : Run sysstat every minute instead of every 10 minutes] ***********
task path: /home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo/playbooks/roles/common/tasks/setup.yml:29
Friday 20 May 2016 11:22:14 +0000 (0:00:00.606) 0:01:03.726 ************
changed: [n2.dusty] => {"changed": true, "msg": "1 replacements made"}
msg: 1 replacements made
TASK [common : include] ********************************************************
task path: /home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo/playbooks/roles/common/tasks/main.yml:23
Friday 20 May 2016 11:22:15 +0000 (0:00:00.755) 0:01:04.482 ************
skipping: [n2.dusty] => {"changed": false, "skip_reason": "Conditional check failed", "skipped": true}
TASK [puppet-openstack : include] **********************************************
task path: /home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo/playbooks/roles/puppet-openstack/tasks/main.yml:17
Friday 20 May 2016 11:22:15 +0000 (0:00:00.372) 0:01:04.854 ************
included: /home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo/playbooks/roles/puppet-openstack/tasks/repositories.yml for n2.dusty
TASK [puppet-openstack : Setup delorean repository] ****************************
task path: /home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo/playbooks/roles/puppet-openstack/tasks/repositories.yml:18
Friday 20 May 2016 11:22:15 +0000 (0:00:00.439) 0:01:05.294 ************
changed: [n2.dusty] => {"changed": true, "checksum_dest": null, "checksum_src": "ba824e8a46c1482ea9bd82063da7c905b7be8f89", "dest": "/etc/yum.repos.d/delorean.repo", "gid": 0, "group": "root", "md5sum": "d192b5df3931b2fefd9d2ac66acdf2c5", "mode": "0644", "msg": "OK (237 bytes)",
"owner": "root", "secontext": "unconfined_u:object_r:system_conf_t:s0", "size": 237, "src": "/tmp/tmphryYPf", "state": "file", "uid": 0, "url": "http://trunk.rdoproject.org/centos7-mitaka/2d/d7/2dd7f56b0b04af66c8ea2739df3dcc43dc3a9316_cbd0900e/delorean.repo"} msg: OK (237 bytes) TASK [puppet-openstack : Setup delorean-deps repository] *********************** task path: /home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo/playbooks/roles/puppet-openstack/tasks/repositories.yml:24 Friday 20 May 2016 11:22:16 +0000 (0:00:00.878) 0:01:06.172 ************ changed: [n2.dusty] => {"changed": true, "checksum_dest": null, "checksum_src": "1d5ef7d2398904b2880879f25c25cb7f24ee8c78", "dest": "/etc/yum.repos.d/delorean-deps.repo", "gid": 0, "group": "root", "md5sum": "4a24ad09f6cd90bb31301f958810d2b2", "mode": "0644", "msg": "OK (166 bytes)", "owner": "root", "secontext": "unconfined_u:object_r:system_conf_t:s0", "size": 166, "src": "/tmp/tmpJph2ZX", "state": "file", "uid": 0, "url": "http://trunk.rdoproject.org/centos7-mitaka/delorean-deps.repo"} msg: OK (166 bytes) TASK [puppet-openstack : Enable CentOS storage SIG repository] ***************** task path: /home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo/playbooks/roles/puppet-openstack/tasks/repositories.yml:31 Friday 20 May 2016 11:22:17 +0000 (0:00:00.844) 0:01:07.017 ************ skipping: [n2.dusty] => {"changed": false, "skip_reason": "Conditional check failed", "skipped": true} TASK [puppet-openstack : include] ********************************************** task path: /home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo/playbooks/roles/puppet-openstack/tasks/main.yml:20 Friday 20 May 2016 11:22:18 +0000 (0:00:00.420) 0:01:07.437 ************ included: /home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo/playbooks/roles/puppet-openstack/tasks/packages.yml for n2.dusty TASK [puppet-openstack : Install required packages] **************************** task path: /home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo/playbooks/roles/puppet-openstack/tasks/packages.yml:18 Friday 20 May 2016 11:22:18 +0000 (0:00:00.326) 0:01:07.764 ************ changed: [n2.dusty] => (item=[u'libxml2-devel', u'libxslt-devel', u'ruby-devel', u'rubygems']) => {"changed": true, "item": ["libxml2-devel", "libxslt-devel", "ruby-devel", "rubygems"], "msg": "", "rc": 0, "results": ["Loaded plugins: fastestmirror, priorities\nLoading mirror speeds from cached hostfile\n * base: mirror.centos.org\n * extras: mirror.centos.org\n * updates: mirror.centos.org\n492 packages excluded due to repository priority protections\nResolving Dependencies\n--> Running transaction check\n---> Package libxml2-devel.x86_64 0:2.9.1-6.el7_2.2 will be installed\n--> Processing Dependency: zlib-devel for package: libxml2-devel-2.9.1-6.el7_2.2.x86_64\n--> Processing Dependency: xz-devel for package: libxml2-devel-2.9.1-6.el7_2.2.x86_64\n---> Package libxslt-devel.x86_64 0:1.1.28-5.el7 will be installed\n--> Processing Dependency: libxslt = 1.1.28-5.el7 for package: libxslt-devel-1.1.28-5.el7.x86_64\n--> Processing Dependency: libgcrypt-devel for package: libxslt-devel-1.1.28-5.el7.x86_64\n--> Processing Dependency: libxslt.so.1()(64bit) for package: libxslt-devel-1.1.28-5.el7.x86_64\n--> Processing Dependency: libexslt.so.0()(64bit) for package: libxslt-devel-1.1.28-5.el7.x86_64\n---> Package ruby-devel.x86_64 0:2.0.0.598-25.el7_1 will be installed\n--> 
Processing Dependency: ruby(x86-64) = 2.0.0.598-25.el7_1 for package: ruby-devel-2.0.0.598-25.el7_1.x86_64\n--> Processing Dependency: libruby.so.2.0()(64bit) for package: ruby-devel-2.0.0.598-25.el7_1.x86_64\n---> Package rubygems.noarch 0:2.0.14-25.el7_1 will be installed\n--> Processing Dependency: rubygem(rdoc) >= 4.0.0 for package: rubygems-2.0.14-25.el7_1.noarch\n--> Processing Dependency: rubygem(psych) >= 2.0.0 for package: rubygems-2.0.14-25.el7_1.noarch\n--> Processing Dependency: rubygem(io-console) >= 0.4.2 for package: rubygems-2.0.14-25.el7_1.noarch\n--> Running transaction check\n---> Package libgcrypt-devel.x86_64 0:1.5.3-12.el7_1.1 will be installed\n--> Processing Dependency: libgpg-error-devel for package: libgcrypt-devel-1.5.3-12.el7_1.1.x86_64\n---> Package libxslt.x86_64 0:1.1.28-5.el7 will be installed\n---> Package ruby.x86_64 0:2.0.0.598-25.el7_1 will be installed\n--> Processing Dependency: rubygem(bigdecimal) >= 1.2.0 for package: ruby-2.0.0.598-25.el7_1.x86_64\n---> Package ruby-libs.x86_64 0:2.0.0.598-25.el7_1 will be installed\n---> Package rubygem-io-console.x86_64 0:0.4.2-25.el7_1 will be installed\n---> Package rubygem-psych.x86_64 0:2.0.0-25.el7_1 will be installed\n--> Processing Dependency: libyaml-0.so.2()(64bit) for package: rubygem-psych-2.0.0-25.el7_1.x86_64\n---> Package rubygem-rdoc.noarch 0:4.0.0-25.el7_1 will be installed\n--> Processing Dependency: ruby(irb) = 2.0.0.598 for package: rubygem-rdoc-4.0.0-25.el7_1.noarch\n--> Processing Dependency: rubygem(json) >= 1.7.7 for package: rubygem-rdoc-4.0.0-25.el7_1.noarch\n---> Package xz-devel.x86_64 0:5.1.2-12alpha.el7 will be installed\n---> Package zlib-devel.x86_64 0:1.2.7-15.el7 will be installed\n--> Running transaction check\n---> Package libgpg-error-devel.x86_64 0:1.12-3.el7 will be installed\n---> Package libyaml.x86_64 0:0.1.4-11.el7_0 will be installed\n---> Package ruby-irb.noarch 0:2.0.0.598-25.el7_1 will be installed\n---> Package rubygem-bigdecimal.x86_64 0:1.2.0-25.el7_1 will be installed\n---> Package rubygem-json.x86_64 0:1.7.7-25.el7_1 will be installed\n--> Finished Dependency Resolution\n\nDependencies Resolved\n\n================================================================================\n Package Arch Version Repository Size\n================================================================================\nInstalling:\n libxml2-devel x86_64 2.9.1-6.el7_2.2 updates 1.0 M\n libxslt-devel x86_64 1.1.28-5.el7 base 309 k\n ruby-devel x86_64 2.0.0.598-25.el7_1 base 127 k\n rubygems noarch 2.0.14-25.el7_1 base 212 k\nInstalling for dependencies:\n libgcrypt-devel x86_64 1.5.3-12.el7_1.1 base 129 k\n libgpg-error-devel x86_64 1.12-3.el7 base 16 k\n libxslt x86_64 1.1.28-5.el7 base 242 k\n libyaml x86_64 0.1.4-11.el7_0 base 55 k\n ruby x86_64 2.0.0.598-25.el7_1 base 67 k\n ruby-irb noarch 2.0.0.598-25.el7_1 base 88 k\n ruby-libs x86_64 2.0.0.598-25.el7_1 base 2.8 M\n rubygem-bigdecimal x86_64 1.2.0-25.el7_1 base 79 k\n rubygem-io-console x86_64 0.4.2-25.el7_1 base 50 k\n rubygem-json x86_64 1.7.7-25.el7_1 base 75 k\n rubygem-psych x86_64 2.0.0-25.el7_1 base 77 k\n rubygem-rdoc noarch 4.0.0-25.el7_1 base 318 k\n xz-devel x86_64 5.1.2-12alpha.el7 base 44 k\n zlib-devel x86_64 1.2.7-15.el7 base 50 k\n\nTransaction Summary\n================================================================================\nInstall 4 Packages (+14 Dependent packages)\n\nTotal download size: 5.7 M\nInstalled size: 25 M\nDownloading 
packages:\n--------------------------------------------------------------------------------\nTotal 39 MB/s | 5.7 MB 00:00 \nRunning transaction check\nRunning transaction test\nTransaction test succeeded\nRunning transaction\n Installing : ruby-libs-2.0.0.598-25.el7_1.x86_64 1/18 \n Installing : libgpg-error-devel-1.12-3.el7.x86_64 2/18 \n Installing : libgcrypt-devel-1.5.3-12.el7_1.1.x86_64 3/18 \n Installing : xz-devel-5.1.2-12alpha.el7.x86_64 4/18 \n Installing : libxslt-1.1.28-5.el7.x86_64 5/18 \n Installing : libyaml-0.1.4-11.el7_0.x86_64 6/18 \n Installing : rubygem-json-1.7.7-25.el7_1.x86_64 7/18 \n Installing : rubygem-bigdecimal-1.2.0-25.el7_1.x86_64 8/18 \n Installing : rubygem-rdoc-4.0.0-25.el7_1.noarch 9/18 \n Installing : ruby-irb-2.0.0.598-25.el7_1.noarch 10/18 \n Installing : ruby-2.0.0.598-25.el7_1.x86_64 11/18 \n Installing : rubygem-io-console-0.4.2-25.el7_1.x86_64 12/18 \n Installing : rubygems-2.0.14-25.el7_1.noarch 13/18 \n Installing : rubygem-psych-2.0.0-25.el7_1.x86_64 14/18 \n Installing : zlib-devel-1.2.7-15.el7.x86_64 15/18 \n Installing : libxml2-devel-2.9.1-6.el7_2.2.x86_64 16/18 \n Installing : libxslt-devel-1.1.28-5.el7.x86_64 17/18 \n Installing : ruby-devel-2.0.0.598-25.el7_1.x86_64 18/18 \n Verifying : zlib-devel-1.2.7-15.el7.x86_64 1/18 \n Verifying : libyaml-0.1.4-11.el7_0.x86_64 2/18 \n Verifying : rubygem-json-1.7.7-25.el7_1.x86_64 3/18 \n Verifying : ruby-devel-2.0.0.598-25.el7_1.x86_64 4/18 \n Verifying : rubygems-2.0.14-25.el7_1.noarch 5/18 \n Verifying : libxslt-devel-1.1.28-5.el7.x86_64 6/18 \n Verifying : libxslt-1.1.28-5.el7.x86_64 7/18 \n Verifying : rubygem-bigdecimal-1.2.0-25.el7_1.x86_64 8/18 \n Verifying : xz-devel-5.1.2-12alpha.el7.x86_64 9/18 \n Verifying : libgpg-error-devel-1.12-3.el7.x86_64 10/18 \n Verifying : libgcrypt-devel-1.5.3-12.el7_1.1.x86_64 11/18 \n Verifying : rubygem-psych-2.0.0-25.el7_1.x86_64 12/18 \n Verifying : ruby-libs-2.0.0.598-25.el7_1.x86_64 13/18 \n Verifying : rubygem-rdoc-4.0.0-25.el7_1.noarch 14/18 \n Verifying : ruby-2.0.0.598-25.el7_1.x86_64 15/18 \n Verifying : ruby-irb-2.0.0.598-25.el7_1.noarch 16/18 \n Verifying : libxml2-devel-2.9.1-6.el7_2.2.x86_64 17/18 \n Verifying : rubygem-io-console-0.4.2-25.el7_1.x86_64 18/18 \n\nInstalled:\n libxml2-devel.x86_64 0:2.9.1-6.el7_2.2 libxslt-devel.x86_64 0:1.1.28-5.el7 \n ruby-devel.x86_64 0:2.0.0.598-25.el7_1 rubygems.noarch 0:2.0.14-25.el7_1 \n\nDependency Installed:\n libgcrypt-devel.x86_64 0:1.5.3-12.el7_1.1 \n libgpg-error-devel.x86_64 0:1.12-3.el7 \n libxslt.x86_64 0:1.1.28-5.el7 \n libyaml.x86_64 0:0.1.4-11.el7_0 \n ruby.x86_64 0:2.0.0.598-25.el7_1 \n ruby-irb.noarch 0:2.0.0.598-25.el7_1 \n ruby-libs.x86_64 0:2.0.0.598-25.el7_1 \n rubygem-bigdecimal.x86_64 0:1.2.0-25.el7_1 \n rubygem-io-console.x86_64 0:0.4.2-25.el7_1 \n rubygem-json.x86_64 0:1.7.7-25.el7_1 \n rubygem-psych.x86_64 0:2.0.0-25.el7_1 \n rubygem-rdoc.noarch 0:4.0.0-25.el7_1 \n xz-devel.x86_64 0:5.1.2-12alpha.el7 \n zlib-devel.x86_64 0:1.2.7-15.el7 \n\nComplete!\n"]}
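The two delorean repository tasks and this package task reduce to get_url and yum calls whose URLs, destinations and package names are all echoed in the results above. A sketch under those assumptions (the state of the yum task and the layout are inferred, and the real repositories.yml/packages.yml in weirdo presumably template the release-specific delorean URL rather than hard-coding it):

    # Sketch reconstructed from the url/dest values and package list in the log.
    - name: Setup delorean repository
      get_url:
        url: http://trunk.rdoproject.org/centos7-mitaka/2d/d7/2dd7f56b0b04af66c8ea2739df3dcc43dc3a9316_cbd0900e/delorean.repo
        dest: /etc/yum.repos.d/delorean.repo

    - name: Setup delorean-deps repository
      get_url:
        url: http://trunk.rdoproject.org/centos7-mitaka/delorean-deps.repo
        dest: /etc/yum.repos.d/delorean-deps.repo

    - name: Install required packages
      yum:
        name: "{{ item }}"
        state: latest
      with_items:
        - libxml2-devel
        - libxslt-devel
        - ruby-devel
        - rubygems
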
rubygem-psych.x86_64 0:2.0.0-25.el7_1 rubygem-rdoc.noarch 0:4.0.0-25.el7_1 xz-devel.x86_64 0:5.1.2-12alpha.el7 zlib-devel.x86_64 0:1.2.7-15.el7 Complete! " } ] TASK [puppet-openstack : Install required ruby gems] *************************** task path: /home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo/playbooks/roles/puppet-openstack/tasks/packages.yml:26 Friday 20 May 2016 11:22:28 +0000 (0:00:09.769) 0:01:17.534 ************ changed: [n2.dusty] => (item=bundler) => {"changed": true, "item": "bundler", "name": "bundler", "state": "present"} msg: All items completed results: [ { "_ansible_no_log": false, "invocation": { "module_name": "gem", "module_args": { "pre_release": false, "repository": null, "include_dependencies": true, "executable": null, "gem_source": "bundler", "build_flags": null, "include_doc": false, "state": "present", "version": null, "name": "bundler", "user_install": true } }, "changed": true, "state": "present", "item": "bundler", "name": "bundler" } ] TASK [puppet-openstack : include] ********************************************** task path: /home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo/playbooks/roles/puppet-openstack/tasks/main.yml:21 Friday 20 May 2016 11:22:35 +0000 (0:00:07.465) 0:01:24.999 ************ included: /home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo/playbooks/roles/puppet-openstack/tasks/setup.yml for n2.dusty TASK [puppet-openstack : Clone upstream puppet-openstack-integration repository] *** task path: /home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo/playbooks/roles/puppet-openstack/tasks/setup.yml:18 Friday 20 May 2016 11:22:36 +0000 (0:00:00.397) 0:01:25.397 ************ changed: [n2.dusty] => {"after": "d013024c826667d95b9db807c6078d2f2e1959e8", "before": null, "changed": true} TASK [puppet-openstack : Download script for retrieving logs] ****************** task path: /home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo/playbooks/roles/puppet-openstack/tasks/setup.yml:29 Friday 20 May 2016 11:22:38 +0000 (0:00:02.371) 0:01:27.768 ************ changed: [n2.dusty] => {"changed": true, "checksum_dest": null, "checksum_src": "c8817155cc4b0c91d25fd28f24a324127de6759d", "dest": "/tmp/puppet-openstack/copy_puppet_logs.sh", "gid": 0, "group": "root", "md5sum": "84da0ed819bc930a1aaa35caa2792ec3", "mode": "0755", "msg": "OK (6329 bytes)", "owner": "root", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 6329, "src": "/tmp/tmp6CYt4C", "state": "file", "uid": 0, "url": "https://raw.githubusercontent.com/openstack-infra/project-config/master/jenkins/scripts/copy_puppet_logs.sh"} msg: OK (6329 bytes) TASK [puppet-openstack : Create directory where puppet-openstack logs will be stored] *** task path: /home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo/playbooks/roles/puppet-openstack/tasks/setup.yml:35 Friday 20 May 2016 11:22:41 +0000 (0:00:03.318) 0:01:31.087 ************ changed: [n2.dusty] => {"changed": true, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack", "secontext": "unconfined_u:object_r:var_log_t:s0", "size": 6, "state": "directory", "uid": 0} TASK [puppet-openstack : include] ********************************************** task path: 
/home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo/playbooks/roles/puppet-openstack/tasks/main.yml:22 Friday 20 May 2016 11:22:42 +0000 (0:00:00.652) 0:01:31.739 ************ included: /home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo/playbooks/roles/puppet-openstack/tasks/run.yml for n2.dusty TASK [puppet-openstack : Run puppet integration test - scenario002] ************ task path: /home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo/playbooks/roles/puppet-openstack/tasks/run.yml:18 Friday 20 May 2016 11:22:42 +0000 (0:00:00.396) 0:01:32.136 ************ fatal: [n2.dusty]: FAILED! => {"changed": true, "cmd": ["./run_tests.sh"], "delta": "0:26:47.750726", "end": "2016-05-20 12:49:31.052546", "failed": true, "rc": 1, "start": "2016-05-20 12:22:43.301820", "stderr": "+ export PUPPET_VERSION=3\n+ PUPPET_VERSION=3\n+ export SCENARIO=scenario002\n+ SCENARIO=scenario002\n+ export MANAGE_PUPPET_MODULES=true\n+ MANAGE_PUPPET_MODULES=true\n+ export MANAGE_REPOS=false\n+ MANAGE_REPOS=false\n+ export PUPPET_ARGS=\n+ PUPPET_ARGS=\n+++ dirname ./run_tests.sh\n++ cd .\n++ pwd -P\n+ export SCRIPT_DIR=/tmp/puppet-openstack\n+ SCRIPT_DIR=/tmp/puppet-openstack\n+ '[' -f /etc/nodepool/provider ']'\n+ NODEPOOL_MIRROR_HOST=mirror.centos.org\n+ export FACTER_nodepool_mirror_host=http://mirror.centos.org\n+ FACTER_nodepool_mirror_host=http://mirror.centos.org\n+ '[' 3 == 4 ']'\n+ export PUPPET_RELEASE_FILE=puppetlabs-release\n+ PUPPET_RELEASE_FILE=puppetlabs-release\n+ export PUPPET_BASE_PATH=/etc/puppet\n+ PUPPET_BASE_PATH=/etc/puppet\n+ export PUPPET_PKG=puppet\n+ PUPPET_PKG=puppet\n+ source /tmp/puppet-openstack/functions\n+ '[' '!' -f fixtures/scenario002.pp ']'\n++ id -u\n+ '[' 0 '!=' 0 ']'\n+ git clone -b 12.0.0 git://git.openstack.org/openstack/tempest /tmp/openstack/tempest\nNote: checking out 'aff9cc072bbbb222b09a3411b203c180b493eae8'.\n\nYou are in 'detached HEAD' state. You can look around, make experimental\nchanges and commit them, and you can discard any commits you make in this\nstate without impacting any branches by performing another checkout.\n\nIf you want to create a new branch to retain commits you create, you may\ndo so (now or later) by using -b with the checkout command again. Example:\n\n git checkout -b new_branch_name\n\n+ PUPPET_ARGS=' --detailed-exitcodes --color=false --test --trace'\n+ uses_debs\n+ type apt-get\n+ is_fedora\n+ '[' -f /etc/os-release ']'\n+ source /etc/os-release\n++ NAME='CentOS Linux'\n++ VERSION='7 (Core)'\n++ ID=centos\n++ ID_LIKE='rhel fedora'\n++ VERSION_ID=7\n++ PRETTY_NAME='CentOS Linux 7 (Core)'\n++ ANSI_COLOR='0;31'\n++ CPE_NAME=cpe:/o:centos:centos:7\n++ HOME_URL=https://www.centos.org/\n++ BUG_REPORT_URL=https://bugs.centos.org/\n++ CENTOS_MANTISBT_PROJECT=CentOS-7\n++ CENTOS_MANTISBT_PROJECT_VERSION=7\n++ REDHAT_SUPPORT_PRODUCT=centos\n++ REDHAT_SUPPORT_PRODUCT_VERSION=7\n+ test centos = fedora -o centos = centos\n+ rpm --quiet -q puppetlabs-release\n+ rpm --quiet -q epel-release\n+ rm -f /tmp/puppet.rpm\n+ wget http://yum.puppetlabs.com/puppetlabs-release-el-7.noarch.rpm -O /tmp/puppet.rpm\n--2016-05-20 12:22:49-- http://yum.puppetlabs.com/puppetlabs-release-el-7.noarch.rpm\nResolving yum.puppetlabs.com (yum.puppetlabs.com)... 192.155.89.90, 2600:3c03::f03c:91ff:fedb:6b1d\nConnecting to yum.puppetlabs.com (yum.puppetlabs.com)|192.155.89.90|:80... connected.\nHTTP request sent, awaiting response... 
200 OK\nLength: 12504 (12K) [application/x-redhat-package-manager]\nSaving to: ‘/tmp/puppet.rpm’\n\n 0K .......... .. 100% 291M=0s\n\n2016-05-20 12:22:49 (291 MB/s) - ‘/tmp/puppet.rpm’ saved [12504/12504]\n\n+ rpm -ivh /tmp/puppet.rpm\nwarning: /tmp/puppet.rpm: Header V4 RSA/SHA1 Signature, key ID 4bd6ec30: NOKEY\n+ yum install -y dstat puppet\nWarning: RPMDB altered outside of yum.\n+ type dstat\n+ dstat -tcmndrylpg --top-cpu-adv --top-io-adv --nocolor\n+ '[' true = true ']'\n+ ./install_modules.sh\n+ tee --append /var/log/dstat.log\n+ '[' -n '' ']'\n+ '[' 3 = 4 ']'\n+ export PUPPET_BASE_PATH=/etc/puppet\n+ PUPPET_BASE_PATH=/etc/puppet\n+++ dirname ./install_modules.sh\n++ cd .\n++ pwd -P\n+ export SCRIPT_DIR=/tmp/puppet-openstack\n+ SCRIPT_DIR=/tmp/puppet-openstack\n+ export PUPPETFILE_DIR=/etc/puppet/modules\n+ PUPPETFILE_DIR=/etc/puppet/modules\n+ source /tmp/puppet-openstack/functions\n+ gem install r10k --no-ri --no-rdoc\n+ rm -rf '/etc/puppet/modules/*'\n+ install_modules\n+ '[' -e /usr/zuul-env/bin/zuul-cloner ']'\n+ install_all\n+ PUPPETFILE=/tmp/puppet-openstack/Puppetfile\n+ r10k puppetfile install -v\nINFO\t -> Updating module /etc/puppet/modules/aodh\nINFO\t -> Updating module /etc/puppet/modules/barbican\nINFO\t -> Updating module /etc/puppet/modules/ceilometer\nINFO\t -> Updating module /etc/puppet/modules/ceph\nINFO\t -> Updating module /etc/puppet/modules/cinder\nINFO\t -> Updating module /etc/puppet/modules/designate\nINFO\t -> Updating module /etc/puppet/modules/glance\nINFO\t -> Updating module /etc/puppet/modules/gnocchi\nINFO\t -> Updating module /etc/puppet/modules/heat\nINFO\t -> Updating module /etc/puppet/modules/horizon\nINFO\t -> Updating module /etc/puppet/modules/ironic\nINFO\t -> Updating module /etc/puppet/modules/keystone\nINFO\t -> Updating module /etc/puppet/modules/manila\nINFO\t -> Updating module /etc/puppet/modules/mistral\nINFO\t -> Updating module /etc/puppet/modules/monasca\nINFO\t -> Updating module /etc/puppet/modules/murano\nINFO\t -> Updating module /etc/puppet/modules/neutron\nINFO\t -> Updating module /etc/puppet/modules/nova\nINFO\t -> Updating module /etc/puppet/modules/octavia\nINFO\t -> Updating module /etc/puppet/modules/openstack_integration\nINFO\t -> Updating module /etc/puppet/modules/openstack_extras\nINFO\t -> Updating module /etc/puppet/modules/openstacklib\nINFO\t -> Updating module /etc/puppet/modules/oslo\nINFO\t -> Updating module /etc/puppet/modules/sahara\nINFO\t -> Updating module /etc/puppet/modules/swift\nINFO\t -> Updating module /etc/puppet/modules/tempest\nINFO\t -> Updating module /etc/puppet/modules/trove\nINFO\t -> Updating module /etc/puppet/modules/vswitch\nINFO\t -> Updating module /etc/puppet/modules/zaqar\nINFO\t -> Updating module /etc/puppet/modules/apache\nINFO\t -> Updating module /etc/puppet/modules/apt\nINFO\t -> Updating module /etc/puppet/modules/concat\nINFO\t -> Updating module /etc/puppet/modules/corosync\nINFO\t -> Updating module /etc/puppet/modules/dns\nINFO\t -> Updating module /etc/puppet/modules/firewall\nINFO\t -> Updating module /etc/puppet/modules/inifile\nINFO\t -> Updating module /etc/puppet/modules/memcached\nINFO\t -> Updating module /etc/puppet/modules/mongodb\nINFO\t -> Updating module /etc/puppet/modules/mysql\nINFO\t -> Updating module /etc/puppet/modules/postgresql\nINFO\t -> Updating module /etc/puppet/modules/powerdns\nINFO\t -> Updating module /etc/puppet/modules/python\nINFO\t -> Updating module /etc/puppet/modules/qpid\nINFO\t -> Updating module 
/etc/puppet/modules/rabbitmq\nINFO\t -> Updating module /etc/puppet/modules/rsync\nINFO\t -> Updating module /etc/puppet/modules/staging\nINFO\t -> Updating module /etc/puppet/modules/stdlib\nINFO\t -> Updating module /etc/puppet/modules/sysctl\nINFO\t -> Updating module /etc/puppet/modules/vcsrepo\nINFO\t -> Updating module /etc/puppet/modules/xinetd\n+ puppet module list\n\u001b[1;31mWarning: Module 'openstack-openstacklib' (v8.0.1) fails to meet some dependencies:\n 'puppet-octavia' (v0.0.1) requires 'openstack-openstacklib' (>=7.0.0 <8.0.0)\n 'puppet-oslo' (v0.0.1) requires 'openstack-openstacklib' (>=7.0.0 <8.0.0)\u001b[0m\n\u001b[1;31mWarning: Module 'puppetlabs-inifile' (v1.4.3) fails to meet some dependencies:\n 'openstack-gnocchi' (v8.0.1) requires 'puppetlabs-inifile' (>=1.5.0 <2.0.0)\u001b[0m\n\u001b[1;31mWarning: Missing dependency 'deric-storm':\n 'openstack-monasca' (v1.0.0) requires 'deric-storm' (>=0.0.1 <1.0.0)\u001b[0m\n\u001b[1;31mWarning: Missing dependency 'deric-zookeeper':\n 'openstack-monasca' (v1.0.0) requires 'deric-zookeeper' (>=0.0.1 <1.0.0)\u001b[0m\n\u001b[1;31mWarning: Missing dependency 'jdowning-influxdb':\n 'openstack-monasca' (v1.0.0) requires 'jdowning-influxdb' (>=0.3.0 <1.0.0)\u001b[0m\n\u001b[1;31mWarning: Missing dependency 'openstack-oslo':\n 'openstack-barbican' (v0.0.1) requires 'openstack-oslo' (<9.0.0)\u001b[0m\n\u001b[1;31mWarning: Missing dependency 'opentable-kafka':\n 'openstack-monasca' (v1.0.0) requires 'opentable-kafka' (>=1.0.0 <2.0.0)\u001b[0m\n\u001b[1;31mWarning: Missing dependency 'puppetlabs-stdlib':\n 'antonlindstrom-powerdns' (v0.0.5) requires 'puppetlabs-stdlib' (>= 0.0.0)\u001b[0m\n\u001b[1;31mWarning: Missing dependency 'puppetlabs-corosync':\n 'openstack-openstack_extras' (v8.0.1) requires 'puppetlabs-corosync' (>=0.8.0 <1.0.0)\u001b[0m\n\u001b[1;31mWarning: Missing dependency 'stahnma-epel':\n 'stankevich-python' (v1.10.0) requires 'stahnma-epel' (>= 1.0.1 < 2.0.0)\u001b[0m\n+ set +e\n+ '[' false = true ']'\n+ run_puppet scenario002\n+ local manifest=scenario002\n+ puppet apply --detailed-exitcodes --color=false --test --trace fixtures/scenario002.pp\nWarning: Config file /etc/puppet/hiera.yaml not found, using Hiera defaults\nWarning: Scope(Class[Nova::Keystone::Auth]): Note that service_name parameter default value will be changed to \"Compute Service\" (according to Keystone default catalog) in a future release. In case you use different value, please update your manifests accordingly.\nWarning: Scope(Class[Nova::Keystone::Auth]): Note that service_name_v3 parameter default value will be changed to \"Compute Service v3\" (according to Keystone default catalog) in a future release. 
In case you use different value, please update your manifests accordingly.\nWarning: Scope(Class[Nova::Vncproxy::Common]): Could not look up qualified variable '::nova::vncproxy::host'; class ::nova::vncproxy has not been evaluated\nWarning: Scope(Class[Nova::Vncproxy::Common]): Could not look up qualified variable '::nova::vncproxy::vncproxy_protocol'; class ::nova::vncproxy has not been evaluated\nWarning: Scope(Class[Nova::Vncproxy::Common]): Could not look up qualified variable '::nova::vncproxy::port'; class ::nova::vncproxy has not been evaluated\nWarning: Scope(Class[Nova::Vncproxy::Common]): Could not look up qualified variable '::nova::vncproxy::vncproxy_path'; class ::nova::vncproxy has not been evaluated\nWarning: Scope(Class[Swift]): swift_hash_suffix has been deprecated and should be replaced with swift_hash_path_suffix, this will be removed as part of the N-cycle\nWarning: The package type's allow_virtual parameter will be changing its default value from false to true in a future release. If you do not want to allow virtual packages, please explicitly set allow_virtual to false.\n (at /usr/share/ruby/vendor_ruby/puppet/type.rb:816:in `set_default'; /usr/share/ruby/vendor_ruby/puppet/type.rb:2263:in `block in set_parameters'; /usr/share/ruby/vendor_ruby/puppet/type.rb:2262:in `each'; /usr/share/ruby/vendor_ruby/puppet/type.rb:2262:in `set_parameters'; /usr/share/ruby/vendor_ruby/puppet/type.rb:2200:in `initialize'; /usr/share/ruby/vendor_ruby/puppet/resource.rb:314:in `new'; /usr/share/ruby/vendor_ruby/puppet/resource.rb:314:in `to_ral'; /usr/share/ruby/vendor_ruby/puppet/resource/catalog.rb:510:in `block in to_catalog'; /usr/share/ruby/vendor_ruby/puppet/resource/catalog.rb:502:in `each'; /usr/share/ruby/vendor_ruby/puppet/resource/catalog.rb:502:in `to_catalog'; /usr/share/ruby/vendor_ruby/puppet/resource/catalog.rb:405:in `to_ral'; /usr/share/ruby/vendor_ruby/puppet/application/apply.rb:217:in `block in main'; /usr/share/ruby/vendor_ruby/puppet/context.rb:64:in `override'; /usr/share/ruby/vendor_ruby/puppet.rb:234:in `override'; /usr/share/ruby/vendor_ruby/puppet/application/apply.rb:190:in `main'; /usr/share/ruby/vendor_ruby/puppet/application/apply.rb:151:in `run_command'; /usr/share/ruby/vendor_ruby/puppet/application.rb:371:in `block (2 levels) in run'; /usr/share/ruby/vendor_ruby/puppet/application.rb:477:in `plugin_hook'; /usr/share/ruby/vendor_ruby/puppet/application.rb:371:in `block in run'; /usr/share/ruby/vendor_ruby/puppet/util.rb:479:in `exit_on_fail'; /usr/share/ruby/vendor_ruby/puppet/application.rb:371:in `run'; /usr/share/ruby/vendor_ruby/puppet/util/command_line.rb:137:in `run'; /usr/share/ruby/vendor_ruby/puppet/util/command_line.rb:91:in `execute'; /usr/bin/puppet:8:in `
')\nWarning: Unexpected line: Ring file /etc/swift/object.ring.gz not found, probably it hasn't been written yet\nWarning: Unexpected line: Ring file /etc/swift/container.ring.gz not found, probably it hasn't been written yet\nWarning: Unexpected line: Ring file /etc/swift/account.ring.gz not found, probably it hasn't been written yet\n+ local res=2\n+ return 2\n+ RESULT=2\n+ set -e\n+ '[' 2 -ne 2 ']'\n+ set +e\n+ run_puppet scenario002\n+ local manifest=scenario002\n+ puppet apply --detailed-exitcodes --color=false --test --trace fixtures/scenario002.pp\nWarning: Config file /etc/puppet/hiera.yaml not found, using Hiera defaults\nWarning: Scope(Class[Nova::Keystone::Auth]): Note that service_name parameter default value will be changed to \"Compute Service\" (according to Keystone default catalog) in a future release. In case you use different value, please update your manifests accordingly.\nWarning: Scope(Class[Nova::Keystone::Auth]): Note that service_name_v3 parameter default value will be changed to \"Compute Service v3\" (according to Keystone default catalog) in a future release. In case you use different value, please update your manifests accordingly.\nWarning: Scope(Class[Nova::Vncproxy::Common]): Could not look up qualified variable '::nova::vncproxy::host'; class ::nova::vncproxy has not been evaluated\nWarning: Scope(Class[Nova::Vncproxy::Common]): Could not look up qualified variable '::nova::vncproxy::vncproxy_protocol'; class ::nova::vncproxy has not been evaluated\nWarning: Scope(Class[Nova::Vncproxy::Common]): Could not look up qualified variable '::nova::vncproxy::port'; class ::nova::vncproxy has not been evaluated\nWarning: Scope(Class[Nova::Vncproxy::Common]): Could not look up qualified variable '::nova::vncproxy::vncproxy_path'; class ::nova::vncproxy has not been evaluated\nWarning: Scope(Class[Swift]): swift_hash_suffix has been deprecated and should be replaced with swift_hash_path_suffix, this will be removed as part of the N-cycle\nWarning: The package type's allow_virtual parameter will be changing its default value from false to true in a future release. 
If you do not want to allow virtual packages, please explicitly set allow_virtual to false.\n (at /usr/share/ruby/vendor_ruby/puppet/type.rb:816:in `set_default'; /usr/share/ruby/vendor_ruby/puppet/type.rb:2263:in `block in set_parameters'; /usr/share/ruby/vendor_ruby/puppet/type.rb:2262:in `each'; /usr/share/ruby/vendor_ruby/puppet/type.rb:2262:in `set_parameters'; /usr/share/ruby/vendor_ruby/puppet/type.rb:2200:in `initialize'; /usr/share/ruby/vendor_ruby/puppet/resource.rb:314:in `new'; /usr/share/ruby/vendor_ruby/puppet/resource.rb:314:in `to_ral'; /usr/share/ruby/vendor_ruby/puppet/resource/catalog.rb:510:in `block in to_catalog'; /usr/share/ruby/vendor_ruby/puppet/resource/catalog.rb:502:in `each'; /usr/share/ruby/vendor_ruby/puppet/resource/catalog.rb:502:in `to_catalog'; /usr/share/ruby/vendor_ruby/puppet/resource/catalog.rb:405:in `to_ral'; /usr/share/ruby/vendor_ruby/puppet/application/apply.rb:217:in `block in main'; /usr/share/ruby/vendor_ruby/puppet/context.rb:64:in `override'; /usr/share/ruby/vendor_ruby/puppet.rb:234:in `override'; /usr/share/ruby/vendor_ruby/puppet/application/apply.rb:190:in `main'; /usr/share/ruby/vendor_ruby/puppet/application/apply.rb:151:in `run_command'; /usr/share/ruby/vendor_ruby/puppet/application.rb:371:in `block (2 levels) in run'; /usr/share/ruby/vendor_ruby/puppet/application.rb:477:in `plugin_hook'; /usr/share/ruby/vendor_ruby/puppet/application.rb:371:in `block in run'; /usr/share/ruby/vendor_ruby/puppet/util.rb:479:in `exit_on_fail'; /usr/share/ruby/vendor_ruby/puppet/application.rb:371:in `run'; /usr/share/ruby/vendor_ruby/puppet/util/command_line.rb:137:in `run'; /usr/share/ruby/vendor_ruby/puppet/util/command_line.rb:91:in `execute'; /usr/bin/puppet:8:in `
')\nWarning: Unexpected line: Ring file /etc/swift/object.ring.gz is up-to-date\nWarning: Unexpected line: Devices: id region zone ip address port replication ip replication port name weight partitions balance flags meta\nWarning: Unexpected line: Ring file /etc/swift/container.ring.gz is up-to-date\nWarning: Unexpected line: Devices: id region zone ip address port replication ip replication port name weight partitions balance flags meta\nWarning: Unexpected line: Ring file /etc/swift/account.ring.gz is up-to-date\nWarning: Unexpected line: Devices: id region zone ip address port replication ip replication port name weight partitions balance flags meta\n+ local res=0\n+ return 0\n+ RESULT=0\n+ set -e\n+ '[' 0 -ne 0 ']'\n+ mkdir -p /tmp/openstack/tempest\n+ rm -f /tmp/openstack/tempest/cirros-0.3.4-x86_64-disk.img\n+ wget http://download.cirros-cloud.net/0.3.4/cirros-0.3.4-x86_64-disk.img -P /tmp/openstack/tempest\n--2016-05-20 12:34:42-- http://download.cirros-cloud.net/0.3.4/cirros-0.3.4-x86_64-disk.img\nResolving download.cirros-cloud.net (download.cirros-cloud.net)... 64.90.42.85\nConnecting to download.cirros-cloud.net (download.cirros-cloud.net)|64.90.42.85|:80... connected.\nHTTP request sent, awaiting response... 200 OK\nLength: 13287936 (13M) [text/plain]\nSaving to: ‘/tmp/openstack/tempest/cirros-0.3.4-x86_64-disk.img’\n\n 0K .......... .......... .......... .......... .......... 0% 197K 65s\n 50K .......... .......... .......... .......... .......... 0% 586K 44s\n 100K .......... .......... .......... .......... .......... 1% 297K 43s\n 150K .......... .......... .......... .......... .......... 1% 592K 38s\n 200K .......... .......... .......... .......... .......... 1% 27.3M 30s\n 250K .......... .......... .......... .......... .......... 2% 595K 29s\n 300K .......... .......... .......... .......... .......... 2% 597K 27s\n 350K .......... .......... .......... .......... .......... 3% 77.4M 24s\n 400K .......... .......... .......... .......... .......... 3% 596K 24s\n 450K .......... .......... .......... .......... .......... 3% 52.6M 21s\n 500K .......... .......... .......... .......... .......... 4% 45.3M 19s\n 550K .......... .......... .......... .......... .......... 4% 596K 19s\n 600K .......... .......... .......... .......... .......... 5% 134M 18s\n 650K .......... .......... .......... .......... .......... 5% 48.6M 16s\n 700K .......... .......... .......... .......... .......... 5% 605K 17s\n 750K .......... .......... .......... .......... .......... 6% 39.9M 15s\n 800K .......... .......... .......... .......... .......... 6% 23.3M 15s\n 850K .......... .......... .......... .......... .......... 6% 242M 14s\n 900K .......... .......... .......... .......... .......... 7% 616K 14s\n 950K .......... .......... .......... .......... .......... 7% 47.1M 13s\n 1000K .......... .......... .......... .......... .......... 8% 26.5M 13s\n 1050K .......... .......... .......... .......... .......... 8% 52.2M 12s\n 1100K .......... .......... .......... .......... .......... 8% 583K 12s\n 1150K .......... .......... .......... .......... .......... 9% 289M 12s\n 1200K .......... .......... .......... .......... .......... 9% 291M 11s\n 1250K .......... .......... .......... .......... .......... 10% 275M 11s\n 1300K .......... .......... .......... .......... .......... 10% 307M 10s\n 1350K .......... .......... .......... .......... .......... 10% 319M 10s\n 1400K .......... .......... .......... .......... .......... 11% 635K 10s\n 1450K .......... .......... 
[wget progress output elided: transfer of the 13287936-byte cirros image continues from ~11% to ~99%, at rates fluctuating between roughly 600 KB/s and 340 MB/s; the trace resumes at completion below]
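For reference, the two "+ puppet apply --detailed-exitcodes" passes traced above are the idempotency check that run_tests.sh performs before moving on to tempest. A minimal shell sketch of that logic, reconstructed from the trace (the run_puppet function, the puppet command line, and the test conditions appear verbatim in the log; the explicit exit-on-failure bodies are an assumption, since the traced run never takes those branches):

  # With --detailed-exitcodes, puppet apply exits 0 when no resources changed,
  # 2 when changes were applied successfully, and 4 or 6 on failures.
  run_puppet() {
      local manifest=$1
      puppet apply --detailed-exitcodes --color=false --test --trace "fixtures/${manifest}.pp"
  }

  set +e                        # exit code 2 is expected, so suspend errexit
  run_puppet scenario002
  RESULT=$?
  set -e
  if [ "$RESULT" -ne 2 ]; then  # first pass must apply changes successfully
      exit 1
  fi

  set +e
  run_puppet scenario002
  RESULT=$?
  set -e
  if [ "$RESULT" -ne 0 ]; then  # second pass must be a no-op, i.e. idempotent
      exit 1
  fi

The set +e / set -e toggling matters here: under --detailed-exitcodes the expected "changes applied" result is exit code 2, which errexit would otherwise treat as a fatal error.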
100% 1.39M=5.3s\n\n2016-05-20 12:34:47 (2.41 MB/s) - ‘/tmp/openstack/tempest/cirros-0.3.4-x86_64-disk.img’ saved [13287936/13287936]\n\n+ set +e\n+ TESTS=smoke\n+ TESTS='smoke dashbboard'\n+ TESTS='smoke dashbboard TelemetryAlarming'\n+ TESTS='smoke dashbboard TelemetryAlarming api.baremetal.admin.test_drivers'\n+ cd /tmp/openstack/tempest\n+ tox -eall -- --concurrency=2 smoke dashbboard TelemetryAlarming api.baremetal.admin.test_drivers\nOption \"verbose\" from group \"DEFAULT\" is deprecated for removal. Its value may be silently ignored in the future.\nOption \"verbose\" from group \"DEFAULT\" is deprecated for removal. Its value may be silently ignored in the future.\nOption \"verbose\" from group \"DEFAULT\" is deprecated for removal. Its value may be silently ignored in the future.\n+ RESULT=1\n+ set -e\n+ /tmp/openstack/tempest/.tox/tempest/bin/testr last --subunit\n+ exit 1", "stdout": "Cloning into '/tmp/openstack/tempest'...\nPreparing... ########################################\nUpdating / installing...\npuppetlabs-release-7-12 ########################################\nLoaded plugins: fastestmirror, priorities\nLoading mirror speeds from cached hostfile\n * base: mirror.centos.org\n * extras: mirror.centos.org\n * updates: mirror.centos.org\n545 packages excluded due to repository priority protections\nResolving Dependencies\n--> Running transaction check\n---> Package dstat.noarch 0:0.7.2-12.el7 will be installed\n---> Package puppet.noarch 0:3.6.2-3.el7 will be installed\n--> Processing Dependency: hiera >= 1.0.0 for package: puppet-3.6.2-3.el7.noarch\n--> Processing Dependency: facter >= 1.6.6 for package: puppet-3.6.2-3.el7.noarch\n--> Processing Dependency: rubygem(rgen) for package: puppet-3.6.2-3.el7.noarch\n--> Processing Dependency: ruby(shadow) for package: puppet-3.6.2-3.el7.noarch\n--> Processing Dependency: ruby(selinux) for package: puppet-3.6.2-3.el7.noarch\n--> Processing Dependency: ruby(augeas) for package: puppet-3.6.2-3.el7.noarch\n--> Running transaction check\n---> Package facter.x86_64 0:2.4.4-3.el7 will be installed\n--> Processing Dependency: pciutils for package: facter-2.4.4-3.el7.x86_64\n---> Package hiera.noarch 0:1.3.4-1.el7 will be installed\n---> Package libselinux-ruby.x86_64 0:2.2.2-6.el7 will be installed\n---> Package ruby-augeas.x86_64 0:0.5.0-1.el7 will be installed\n--> Processing Dependency: augeas-libs >= 1.0.0 for package: ruby-augeas-0.5.0-1.el7.x86_64\n--> Processing Dependency: libaugeas.so.0(AUGEAS_0.8.0)(64bit) for package: ruby-augeas-0.5.0-1.el7.x86_64\n--> Processing Dependency: libaugeas.so.0(AUGEAS_0.16.0)(64bit) for package: ruby-augeas-0.5.0-1.el7.x86_64\n--> Processing Dependency: libaugeas.so.0(AUGEAS_0.14.0)(64bit) for package: ruby-augeas-0.5.0-1.el7.x86_64\n--> Processing Dependency: libaugeas.so.0(AUGEAS_0.12.0)(64bit) for package: ruby-augeas-0.5.0-1.el7.x86_64\n--> Processing Dependency: libaugeas.so.0(AUGEAS_0.11.0)(64bit) for package: ruby-augeas-0.5.0-1.el7.x86_64\n--> Processing Dependency: libaugeas.so.0(AUGEAS_0.10.0)(64bit) for package: ruby-augeas-0.5.0-1.el7.x86_64\n--> Processing Dependency: libaugeas.so.0(AUGEAS_0.1.0)(64bit) for package: ruby-augeas-0.5.0-1.el7.x86_64\n--> Processing Dependency: libaugeas.so.0()(64bit) for package: ruby-augeas-0.5.0-1.el7.x86_64\n---> Package ruby-shadow.x86_64 0:1.4.1-23.el7 will be installed\n---> Package rubygem-rgen.noarch 0:0.6.6-2.el7 will be installed\n--> Running transaction check\n---> Package augeas-libs.x86_64 0:1.4.0-2.el7 will be installed\n---> Package 
pciutils.x86_64 0:3.2.1-4.el7 will be installed\n--> Finished Dependency Resolution\n\nDependencies Resolved\n\n================================================================================\n Package Arch Version Repository Size\n================================================================================\nInstalling:\n dstat noarch 0.7.2-12.el7 base 163 k\n puppet noarch 3.6.2-3.el7 delorean-mitaka-testing 1.2 M\nInstalling for dependencies:\n augeas-libs x86_64 1.4.0-2.el7 base 355 k\n facter x86_64 2.4.4-3.el7 delorean-mitaka-testing 101 k\n hiera noarch 1.3.4-1.el7 delorean-mitaka-testing 24 k\n libselinux-ruby x86_64 2.2.2-6.el7 base 127 k\n pciutils x86_64 3.2.1-4.el7 base 90 k\n ruby-augeas x86_64 0.5.0-1.el7 delorean-mitaka-testing 23 k\n ruby-shadow x86_64 1.4.1-23.el7 delorean-mitaka-testing 13 k\n rubygem-rgen noarch 0.6.6-2.el7 delorean-mitaka-testing 83 k\n\nTransaction Summary\n================================================================================\nInstall 2 Packages (+8 Dependent packages)\n\nTotal download size: 2.2 M\nInstalled size: 7.1 M\nDownloading packages:\n--------------------------------------------------------------------------------\nTotal 401 kB/s | 2.2 MB 00:05 \nRunning transaction check\nRunning transaction test\nTransaction test succeeded\nRunning transaction\n Installing : rubygem-rgen-0.6.6-2.el7.noarch 1/10 \n Installing : augeas-libs-1.4.0-2.el7.x86_64 2/10 \n Installing : ruby-augeas-0.5.0-1.el7.x86_64 3/10 \n Installing : ruby-shadow-1.4.1-23.el7.x86_64 4/10 \n Installing : hiera-1.3.4-1.el7.noarch 5/10 \n Installing : pciutils-3.2.1-4.el7.x86_64 6/10 \n Installing : facter-2.4.4-3.el7.x86_64 7/10 \n Installing : libselinux-ruby-2.2.2-6.el7.x86_64 8/10 \n Installing : puppet-3.6.2-3.el7.noarch 9/10 \n Installing : dstat-0.7.2-12.el7.noarch 10/10 \n Verifying : ruby-augeas-0.5.0-1.el7.x86_64 1/10 \n Verifying : libselinux-ruby-2.2.2-6.el7.x86_64 2/10 \n Verifying : pciutils-3.2.1-4.el7.x86_64 3/10 \n Verifying : hiera-1.3.4-1.el7.noarch 4/10 \n Verifying : puppet-3.6.2-3.el7.noarch 5/10 \n Verifying : facter-2.4.4-3.el7.x86_64 6/10 \n Verifying : dstat-0.7.2-12.el7.noarch 7/10 \n Verifying : ruby-shadow-1.4.1-23.el7.x86_64 8/10 \n Verifying : augeas-libs-1.4.0-2.el7.x86_64 9/10 \n Verifying : rubygem-rgen-0.6.6-2.el7.noarch 10/10 \n\nInstalled:\n dstat.noarch 0:0.7.2-12.el7 puppet.noarch 0:3.6.2-3.el7 \n\nDependency Installed:\n augeas-libs.x86_64 0:1.4.0-2.el7 facter.x86_64 0:2.4.4-3.el7 \n hiera.noarch 0:1.3.4-1.el7 libselinux-ruby.x86_64 0:2.2.2-6.el7 \n pciutils.x86_64 0:3.2.1-4.el7 ruby-augeas.x86_64 0:0.5.0-1.el7 \n ruby-shadow.x86_64 0:1.4.1-23.el7 rubygem-rgen.noarch 0:0.6.6-2.el7 \n\nComplete!\ndstat is /usr/bin/dstat\nSuccessfully installed colored-1.2\nSuccessfully installed cri-2.6.1\nSuccessfully installed log4r-1.1.10\nSuccessfully installed multi_json-1.12.1\nSuccessfully installed multipart-post-2.0.0\nSuccessfully installed faraday-0.9.2\nSuccessfully installed faraday_middleware-0.10.0\nSuccessfully installed semantic_puppet-0.1.2\nSuccessfully installed minitar-0.5.4\nSuccessfully installed puppet_forge-2.2.0\nSuccessfully installed r10k-2.3.0\n11 gems installed\n/etc/puppet/modules\n├── antonlindstrom-powerdns (\u001b[0;36mv0.0.5\u001b[0m)\n├── duritong-sysctl (\u001b[0;36mv0.0.11\u001b[0m)\n├── nanliu-staging (\u001b[0;36mv1.0.4\u001b[0m)\n├── openstack-aodh (\u001b[0;36mv8.0.2\u001b[0m)\n├── openstack-barbican (\u001b[0;36mv0.0.1\u001b[0m)\n├── openstack-ceilometer (\u001b[0;36mv8.0.1\u001b[0m)\n├── openstack-ceph 
(\u001b[0;36mv1.0.0\u001b[0m)\n├── openstack-cinder (\u001b[0;36mv8.0.1\u001b[0m)\n├── openstack-designate (\u001b[0;36mv8.0.1\u001b[0m)\n├── openstack-glance (\u001b[0;36mv8.0.1\u001b[0m)\n├── openstack-gnocchi (\u001b[0;36mv8.0.1\u001b[0m)\n├── openstack-heat (\u001b[0;36mv8.0.1\u001b[0m)\n├── openstack-horizon (\u001b[0;36mv8.0.1\u001b[0m)\n├── openstack-ironic (\u001b[0;36mv8.0.1\u001b[0m)\n├── openstack-keystone (\u001b[0;36mv8.0.1\u001b[0m)\n├── openstack-manila (\u001b[0;36mv8.0.1\u001b[0m)\n├── openstack-mistral (\u001b[0;36mv8.0.1\u001b[0m)\n├── openstack-monasca (\u001b[0;36mv1.0.0\u001b[0m)\n├── openstack-murano (\u001b[0;36mv8.0.1\u001b[0m)\n├── openstack-neutron (\u001b[0;36mv8.0.1\u001b[0m)\n├── openstack-nova (\u001b[0;36mv8.0.1\u001b[0m)\n├── openstack-openstack_extras (\u001b[0;36mv8.0.1\u001b[0m)\n├── openstack-openstacklib (\u001b[0;36mv8.0.1\u001b[0m) \u001b[0;31minvalid\u001b[0m\n├── openstack-sahara (\u001b[0;36mv8.0.1\u001b[0m)\n├── openstack-swift (\u001b[0;36mv8.0.1\u001b[0m)\n├── openstack-tempest (\u001b[0;36mv8.0.1\u001b[0m)\n├── openstack-trove (\u001b[0;36mv8.0.1\u001b[0m)\n├── openstack-vswitch (\u001b[0;36mv4.0.0\u001b[0m)\n├── openstack-zaqar (\u001b[0;36mv8.0.1\u001b[0m)\n├── openstack_integration (\u001b[0;36m???\u001b[0m)\n├── puppet-corosync (\u001b[0;36mv0.8.0\u001b[0m)\n├── puppet-octavia (\u001b[0;36mv0.0.1\u001b[0m)\n├── puppet-oslo (\u001b[0;36mv0.0.1\u001b[0m)\n├── puppetlabs-apache (\u001b[0;36mv1.8.1\u001b[0m)\n├── puppetlabs-apt (\u001b[0;36mv2.2.2\u001b[0m)\n├── puppetlabs-concat (\u001b[0;36mv1.2.5\u001b[0m)\n├── puppetlabs-firewall (\u001b[0;36mv1.7.2\u001b[0m)\n├── puppetlabs-inifile (\u001b[0;36mv1.4.3\u001b[0m) \u001b[0;31minvalid\u001b[0m\n├── puppetlabs-mongodb (\u001b[0;36mv0.12.0\u001b[0m)\n├── puppetlabs-mysql (\u001b[0;36mv3.6.2\u001b[0m)\n├── puppetlabs-postgresql (\u001b[0;36mv4.7.1\u001b[0m)\n├── puppetlabs-rabbitmq (\u001b[0;36mv5.3.1\u001b[0m)\n├── puppetlabs-rsync (\u001b[0;36mv0.4.0\u001b[0m)\n├── puppetlabs-stdlib (\u001b[0;36mv4.9.1\u001b[0m)\n├── puppetlabs-vcsrepo (\u001b[0;36mv1.3.2\u001b[0m)\n├── puppetlabs-xinetd (\u001b[0;36mv1.5.0\u001b[0m)\n├── qpid (\u001b[0;36m???\u001b[0m)\n├── saz-memcached (\u001b[0;36mv2.8.1\u001b[0m)\n├── stankevich-python (\u001b[0;36mv1.10.0\u001b[0m)\n└── theforeman-dns (\u001b[0;36mv3.1.0\u001b[0m)\n/usr/share/puppet/modules (no modules installed)\nInfo: Loading external facts from /etc/puppet/modules/openstacklib/facts.d\nInfo: Loading facts in /etc/puppet/modules/nova/lib/facter/libvirt_uuid.rb\nInfo: Loading facts in /etc/puppet/modules/openstacklib/lib/facter/os_package_type.rb\nInfo: Loading facts in /etc/puppet/modules/openstacklib/lib/facter/os_service_default.rb\nInfo: Loading facts in /etc/puppet/modules/vswitch/lib/facter/ovs.rb\nInfo: Loading facts in /etc/puppet/modules/apt/lib/facter/apt_reboot_required.rb\nInfo: Loading facts in /etc/puppet/modules/apt/lib/facter/apt_update_last_success.rb\nInfo: Loading facts in /etc/puppet/modules/apt/lib/facter/apt_updates.rb\nInfo: Loading facts in /etc/puppet/modules/concat/lib/facter/concat_basedir.rb\nInfo: Loading facts in /etc/puppet/modules/firewall/lib/facter/ip6tables_version.rb\nInfo: Loading facts in /etc/puppet/modules/firewall/lib/facter/iptables_persistent_version.rb\nInfo: Loading facts in /etc/puppet/modules/firewall/lib/facter/iptables_version.rb\nInfo: Loading facts in /etc/puppet/modules/mysql/lib/facter/mysql_version.rb\nInfo: Loading facts in /etc/puppet/modules/mysql/lib/facter/mysql_server_id.rb\nInfo: Loading facts 
in /etc/puppet/modules/python/lib/facter/pip_version.rb\nInfo: Loading facts in /etc/puppet/modules/python/lib/facter/python_version.rb\nInfo: Loading facts in /etc/puppet/modules/python/lib/facter/virtualenv_version.rb\nInfo: Loading facts in /etc/puppet/modules/staging/lib/facter/staging_http_get.rb\nInfo: Loading facts in /etc/puppet/modules/staging/lib/facter/staging_windir.rb\nInfo: Loading facts in /etc/puppet/modules/stdlib/lib/facter/puppet_vardir.rb\nInfo: Loading facts in /etc/puppet/modules/stdlib/lib/facter/facter_dot_d.rb\nInfo: Loading facts in /etc/puppet/modules/stdlib/lib/facter/pe_version.rb\nInfo: Loading facts in /etc/puppet/modules/stdlib/lib/facter/root_home.rb\nNotice: Compiled catalog for n2.dusty.ci.centos.org in environment production in 8.94 seconds\nInfo: Applying configuration version '1463743459'\nNotice: /Stage[main]/Concat::Setup/File[/var/lib/puppet/concat]/ensure: created\nNotice: /Stage[main]/Concat::Setup/File[/var/lib/puppet/concat/bin]/ensure: created\nNotice: /Stage[main]/Concat::Setup/File[/var/lib/puppet/concat/bin/concatfragments.rb]/ensure: defined content as '{md5}b684db0eac243553a6a79365119a363d'\nNotice: /Stage[main]/Xinetd/Package[xinetd]/ensure: created\nNotice: /Stage[main]/Openstack_integration::Cacert/File[/etc/pki/ca-trust/source/anchors/puppet_openstack.pem]/ensure: defined content as '{md5}78f42ae07a4fc8ebdd5b89c4c74bba5e'\nInfo: /Stage[main]/Openstack_integration::Cacert/File[/etc/pki/ca-trust/source/anchors/puppet_openstack.pem]: Scheduling refresh of Exec[update-ca-certificates]\nNotice: /Stage[main]/Openstack_integration::Cacert/Exec[update-ca-certificates]: Triggered 'refresh' from 1 events\nInfo: /Stage[main]/Openstack_integration::Cacert/Exec[update-ca-certificates]: Scheduling refresh of Service[httpd]\nInfo: /Stage[main]/Openstack_integration::Cacert/Exec[update-ca-certificates]: Scheduling refresh of Service[httpd]\nInfo: /Stage[main]/Openstack_integration::Cacert/Exec[update-ca-certificates]: Scheduling refresh of Service[httpd]\nInfo: /Stage[main]/Openstack_integration::Cacert/Exec[update-ca-certificates]: Scheduling refresh of Service[glance-api]\nInfo: /Stage[main]/Openstack_integration::Cacert/Exec[update-ca-certificates]: Scheduling refresh of Service[glance-registry]\nNotice: /Stage[main]/Memcached/Package[memcached]/ensure: created\nNotice: /Stage[main]/Memcached/File[/etc/sysconfig/memcached]/content: \n--- /etc/sysconfig/memcached\t2015-04-10 10:40:42.000000000 +0100\n+++ /tmp/puppet-file20160520-26469-d60985\t2016-05-20 12:24:40.532841270 +0100\n@@ -1,5 +1,5 @@\n PORT=\"11211\"\n USER=\"memcached\"\n-MAXCONN=\"1024\"\n-CACHESIZE=\"64\"\n-OPTIONS=\"\"\n+MAXCONN=\"8192\"\n+CACHESIZE=\"30400\"\n+OPTIONS=\"-l 0.0.0.0 -U 11211 -t 8 >> /var/log/memcached.log 2>&1\"\nInfo: /Stage[main]/Memcached/File[/etc/sysconfig/memcached]: Filebucketed /etc/sysconfig/memcached to puppet with sum 05503957e3796fbe6fddd756a7a102a0\nNotice: /Stage[main]/Memcached/File[/etc/sysconfig/memcached]/content: content changed '{md5}05503957e3796fbe6fddd756a7a102a0' to '{md5}607d5b4345a63a5155f9fbe6c19b6c9b'\nInfo: /Stage[main]/Memcached/File[/etc/sysconfig/memcached]: Scheduling refresh of Service[memcached]\nNotice: /Stage[main]/Memcached/Service[memcached]/ensure: ensure changed 'stopped' to 'running'\nInfo: /Stage[main]/Memcached/Service[memcached]: Unscheduling refresh on Service[memcached]\nNotice: /Stage[main]/Rabbitmq::Repo::Rhel/Exec[rpm --import http://www.rabbitmq.com/rabbitmq-signing-key-public.asc]/returns: executed 
successfully\nNotice: /Stage[main]/Neutron::Agents::Lbaas/Package[haproxy]/ensure: created\nNotice: /Stage[main]/Glance/Package[openstack-glance]/ensure: created\nInfo: /Stage[main]/Glance/Package[openstack-glance]: Scheduling refresh of Anchor[keystone::service::end]\nInfo: /Stage[main]/Glance/Package[openstack-glance]: Scheduling refresh of Exec[glance-manage db_sync]\nNotice: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/swift_store_config_file]/ensure: created\nInfo: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/swift_store_config_file]: Scheduling refresh of Service[glance-api]\nInfo: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/swift_store_config_file]: Scheduling refresh of Exec[glance-manage db_sync]\nNotice: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/registry_port]/ensure: created\nInfo: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/registry_port]: Scheduling refresh of Service[glance-api]\nInfo: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/registry_port]: Scheduling refresh of Exec[glance-manage db_sync]\nNotice: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_userid]/ensure: created\nInfo: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_userid]: Scheduling refresh of Service[glance-api]\nInfo: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_userid]: Scheduling refresh of Exec[glance-manage db_sync]\nNotice: /Stage[main]/Glance::Api/Glance_cache_config[glance_store/os_region_name]/ensure: created\nInfo: /Stage[main]/Glance::Api/Glance_cache_config[glance_store/os_region_name]: Scheduling refresh of Service[glance-api]\nInfo: /Stage[main]/Glance::Api/Glance_cache_config[glance_store/os_region_name]: Scheduling refresh of Exec[glance-manage db_sync]\nNotice: /Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/log_dir]/ensure: created\nInfo: /Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/log_dir]: Scheduling refresh of Service[glance-api]\nInfo: /Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/log_dir]: Scheduling refresh of Exec[glance-manage db_sync]\nNotice: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/show_image_direct_url]/ensure: created\nInfo: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/show_image_direct_url]: Scheduling refresh of Service[glance-api]\nInfo: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/show_image_direct_url]: Scheduling refresh of Exec[glance-manage db_sync]\nNotice: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]/ensure: created\nInfo: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]: Scheduling refresh of Service[glance-api]\nInfo: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]: Scheduling refresh of Exec[glance-manage db_sync]\nNotice: /Stage[main]/Glance::Api/Glance_cache_config[DEFAULT/registry_port]/ensure: created\nInfo: /Stage[main]/Glance::Api/Glance_cache_config[DEFAULT/registry_port]: Scheduling refresh of Service[glance-api]\nInfo: /Stage[main]/Glance::Api/Glance_cache_config[DEFAULT/registry_port]: Scheduling refresh of Exec[glance-manage db_sync]\nNotice: /Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/debug]/ensure: created\nInfo: /Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/debug]: Scheduling refresh of 
Service[glance-api]\nInfo: /Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/debug]: Scheduling refresh of Exec[glance-manage db_sync]\nNotice: /Stage[main]/Glance::Api/Glance_cache_config[DEFAULT/admin_user]/ensure: created\nInfo: /Stage[main]/Glance::Api/Glance_cache_config[DEFAULT/admin_user]: Scheduling refresh of Service[glance-api]\nInfo: /Stage[main]/Glance::Api/Glance_cache_config[DEFAULT/admin_user]: Scheduling refresh of Exec[glance-manage db_sync]\nNotice: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/image_cache_dir]/ensure: created\nInfo: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/image_cache_dir]: Scheduling refresh of Service[glance-api]\nInfo: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/image_cache_dir]: Scheduling refresh of Exec[glance-manage db_sync]\nNotice: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/bind_host]/ensure: created\nInfo: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/bind_host]: Scheduling refresh of Service[glance-api]\nInfo: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/bind_host]: Scheduling refresh of Exec[glance-manage db_sync]\nNotice: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/registry_client_key_file]/ensure: created\nInfo: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/registry_client_key_file]: Scheduling refresh of Service[glance-api]\nInfo: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/registry_client_key_file]: Scheduling refresh of Exec[glance-manage db_sync]\nNotice: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_use_ssl]/ensure: created\nInfo: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_use_ssl]: Scheduling refresh of Service[glance-api]\nInfo: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_use_ssl]: Scheduling refresh of Exec[glance-manage db_sync]\nNotice: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_port]/ensure: created\nInfo: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_port]: Scheduling refresh of Service[glance-api]\nInfo: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_port]: Scheduling refresh of Exec[glance-manage db_sync]\nNotice: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/default_store]/ensure: created\nInfo: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/default_store]: Scheduling refresh of Service[glance-api]\nInfo: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/default_store]: Scheduling refresh of Exec[glance-manage db_sync]\nNotice: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/amqp_durable_queues]/ensure: created\nInfo: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/amqp_durable_queues]: Scheduling refresh of Service[glance-api]\nInfo: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/amqp_durable_queues]: Scheduling refresh of Exec[glance-manage db_sync]\nNotice: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/default_swift_reference]/ensure: created\nInfo: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/default_swift_reference]: Scheduling refresh of Service[glance-api]\nInfo: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/default_swift_reference]: Scheduling refresh of Exec[glance-manage db_sync]\nNotice: 
/Stage[main]/Glance::Api/Glance_api_config[DEFAULT/key_file]/ensure: created\nInfo: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/key_file]: Scheduling refresh of Service[glance-api]\nInfo: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/key_file]: Scheduling refresh of Exec[glance-manage db_sync]\nNotice: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_ha_queues]/ensure: created\nInfo: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_ha_queues]: Scheduling refresh of Service[glance-api]\nInfo: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_ha_queues]: Scheduling refresh of Exec[glance-manage db_sync]\nNotice: /Stage[main]/Glance::Backend::Swift/Glance_swift_config[ref1/user]/ensure: created\nNotice: /Stage[main]/Glance::Api/Glance_api_config[keystone_authtoken/identity_uri]/ensure: created\nInfo: /Stage[main]/Glance::Api/Glance_api_config[keystone_authtoken/identity_uri]: Scheduling refresh of Service[glance-api]\nInfo: /Stage[main]/Glance::Api/Glance_api_config[keystone_authtoken/identity_uri]: Scheduling refresh of Exec[glance-manage db_sync]\nNotice: /Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/use_stderr]/ensure: created\nInfo: /Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/use_stderr]: Scheduling refresh of Service[glance-api]\nInfo: /Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/use_stderr]: Scheduling refresh of Exec[glance-manage db_sync]\nNotice: /Stage[main]/Glance::Registry/Glance_registry_config[DEFAULT/key_file]/ensure: created\nInfo: /Stage[main]/Glance::Registry/Glance_registry_config[DEFAULT/key_file]: Scheduling refresh of Service[glance-registry]\nInfo: /Stage[main]/Glance::Registry/Glance_registry_config[DEFAULT/key_file]: Scheduling refresh of Exec[glance-manage db_sync]\nNotice: /Stage[main]/Glance::Registry/Glance_registry_config[DEFAULT/bind_host]/ensure: created\nInfo: /Stage[main]/Glance::Registry/Glance_registry_config[DEFAULT/bind_host]: Scheduling refresh of Service[glance-registry]\nInfo: /Stage[main]/Glance::Registry/Glance_registry_config[DEFAULT/bind_host]: Scheduling refresh of Exec[glance-manage db_sync]\nNotice: /Stage[main]/Glance::Registry/Glance_registry_config[DEFAULT/workers]/ensure: created\nInfo: /Stage[main]/Glance::Registry/Glance_registry_config[DEFAULT/workers]: Scheduling refresh of Service[glance-registry]\nInfo: /Stage[main]/Glance::Registry/Glance_registry_config[DEFAULT/workers]: Scheduling refresh of Exec[glance-manage db_sync]\nNotice: /Stage[main]/Glance::Registry/Glance_registry_config[keystone_authtoken/admin_password]/ensure: created\nInfo: /Stage[main]/Glance::Registry/Glance_registry_config[keystone_authtoken/admin_password]: Scheduling refresh of Service[glance-registry]\nInfo: /Stage[main]/Glance::Registry/Glance_registry_config[keystone_authtoken/admin_password]: Scheduling refresh of Exec[glance-manage db_sync]\nNotice: /Stage[main]/Glance::Api/Glance_api_config[paste_deploy/flavor]/ensure: created\nInfo: /Stage[main]/Glance::Api/Glance_api_config[paste_deploy/flavor]: Scheduling refresh of Service[glance-api]\nInfo: /Stage[main]/Glance::Api/Glance_api_config[paste_deploy/flavor]: Scheduling refresh of Exec[glance-manage db_sync]\nNotice: /Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/log_dir]/ensure: created\nInfo: /Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/log_dir]: Scheduling refresh of Service[glance-registry]\nInfo: 
/Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/log_dir]: Scheduling refresh of Exec[glance-manage db_sync]\nNotice: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/heartbeat_rate]/ensure: created\nInfo: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/heartbeat_rate]: Scheduling refresh of Service[glance-api]\nInfo: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/heartbeat_rate]: Scheduling refresh of Exec[glance-manage db_sync]\nNotice: /Stage[main]/Glance::Registry/Glance_registry_config[DEFAULT/bind_port]/ensure: created\nInfo: /Stage[main]/Glance::Registry/Glance_registry_config[DEFAULT/bind_port]: Scheduling refresh of Service[glance-registry]\nInfo: /Stage[main]/Glance::Registry/Glance_registry_config[DEFAULT/bind_port]: Scheduling refresh of Exec[glance-manage db_sync]\nNotice: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_virtual_host]/ensure: created\nInfo: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_virtual_host]: Scheduling refresh of Service[glance-api]\nInfo: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_virtual_host]: Scheduling refresh of Exec[glance-manage db_sync]\nNotice: /Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/use_syslog]/ensure: created\nInfo: /Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/use_syslog]: Scheduling refresh of Service[glance-registry]\nInfo: /Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/use_syslog]: Scheduling refresh of Exec[glance-manage db_sync]\nNotice: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/kombu_ssl_version]/ensure: created\nInfo: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/kombu_ssl_version]: Scheduling refresh of Service[glance-api]\nInfo: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/kombu_ssl_version]: Scheduling refresh of Exec[glance-manage db_sync]\nNotice: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_notification_exchange]/ensure: created\nInfo: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_notification_exchange]: Scheduling refresh of Service[glance-api]\nInfo: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_notification_exchange]: Scheduling refresh of Exec[glance-manage db_sync]\nNotice: /Stage[main]/Glance::Api/Glance_api_config[keystone_authtoken/admin_password]/ensure: created\nInfo: /Stage[main]/Glance::Api/Glance_api_config[keystone_authtoken/admin_password]: Scheduling refresh of Service[glance-api]\nInfo: /Stage[main]/Glance::Api/Glance_api_config[keystone_authtoken/admin_password]: Scheduling refresh of Exec[glance-manage db_sync]\nNotice: /Stage[main]/Glance::Api/Glance_api_config[glance_store/os_region_name]/ensure: created\nInfo: /Stage[main]/Glance::Api/Glance_api_config[glance_store/os_region_name]: Scheduling refresh of Service[glance-api]\nInfo: /Stage[main]/Glance::Api/Glance_api_config[glance_store/os_region_name]: Scheduling refresh of Exec[glance-manage db_sync]\nNotice: /Stage[main]/Neutron/Package[neutron]/ensure: created\nInfo: /Stage[main]/Neutron/Package[neutron]: Scheduling refresh of Anchor[keystone::service::end]\nInfo: /Stage[main]/Neutron/Package[neutron]: Scheduling refresh of 
Service[neutron-server]\nInfo: /Stage[main]/Neutron/Package[neutron]: Scheduling refresh of Service[neutron-ovs-agent-service]\nInfo: /Stage[main]/Neutron/Package[neutron]: Scheduling refresh of Service[neutron-ovs-agent-service]\nInfo: /Stage[main]/Neutron/Package[neutron]: Scheduling refresh of Service[neutron-metadata]\nInfo: /Stage[main]/Neutron/Package[neutron]: Scheduling refresh of Service[neutron-metadata]\nInfo: /Stage[main]/Neutron/Package[neutron]: Scheduling refresh of Service[neutron-lbaas-service]\nInfo: /Stage[main]/Neutron/Package[neutron]: Scheduling refresh of Service[neutron-lbaas-service]\nInfo: /Stage[main]/Neutron/Package[neutron]: Scheduling refresh of Service[neutron-lbaasv2-service]\nInfo: /Stage[main]/Neutron/Package[neutron]: Scheduling refresh of Service[neutron-l3]\nInfo: /Stage[main]/Neutron/Package[neutron]: Scheduling refresh of Service[neutron-l3]\nInfo: /Stage[main]/Neutron/Package[neutron]: Scheduling refresh of Service[neutron-dhcp-service]\nInfo: /Stage[main]/Neutron/Package[neutron]: Scheduling refresh of Service[neutron-dhcp-service]\nInfo: /Stage[main]/Neutron/Package[neutron]: Scheduling refresh of Service[neutron-metering-service]\nInfo: /Stage[main]/Neutron/Package[neutron]: Scheduling refresh of Service[neutron-metering-service]\nInfo: /Stage[main]/Neutron/Package[neutron]: Scheduling refresh of Exec[neutron-db-sync]\nNotice: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_hosts]/ensure: created\nInfo: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_hosts]: Scheduling refresh of Service[neutron-server]\nInfo: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_hosts]: Scheduling refresh of Exec[neutron-db-sync]\nInfo: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_hosts]: Scheduling refresh of Service[neutron-metadata]\nInfo: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_hosts]: Scheduling refresh of Service[neutron-lbaas-service]\nInfo: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_hosts]: Scheduling refresh of Service[neutron-l3]\nInfo: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_hosts]: Scheduling refresh of Service[neutron-dhcp-service]\nInfo: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_hosts]: Scheduling refresh of Service[neutron-metering-service]\nNotice: /Stage[main]/Neutron::Agents::Dhcp/Neutron_dhcp_agent_config[DEFAULT/resync_interval]/ensure: created\nInfo: /Stage[main]/Neutron::Agents::Dhcp/Neutron_dhcp_agent_config[DEFAULT/resync_interval]: Scheduling refresh of Service[neutron-dhcp-service]\nNotice: /Stage[main]/Neutron/Neutron_config[DEFAULT/log_dir]/ensure: created\nInfo: /Stage[main]/Neutron/Neutron_config[DEFAULT/log_dir]: Scheduling refresh of Service[neutron-server]\nInfo: /Stage[main]/Neutron/Neutron_config[DEFAULT/log_dir]: Scheduling refresh of Exec[neutron-db-sync]\nInfo: /Stage[main]/Neutron/Neutron_config[DEFAULT/log_dir]: Scheduling refresh of Service[neutron-metadata]\nInfo: /Stage[main]/Neutron/Neutron_config[DEFAULT/log_dir]: Scheduling refresh of Service[neutron-lbaas-service]\nInfo: /Stage[main]/Neutron/Neutron_config[DEFAULT/log_dir]: Scheduling refresh of Service[neutron-l3]\nInfo: /Stage[main]/Neutron/Neutron_config[DEFAULT/log_dir]: Scheduling refresh of Service[neutron-dhcp-service]\nInfo: /Stage[main]/Neutron/Neutron_config[DEFAULT/log_dir]: Scheduling refresh of Service[neutron-metering-service]\nNotice: 
/Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/heartbeat_rate]/ensure: created\nInfo: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/heartbeat_rate]: Scheduling refresh of Service[neutron-server]\nInfo: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/heartbeat_rate]: Scheduling refresh of Exec[neutron-db-sync]\nInfo: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/heartbeat_rate]: Scheduling refresh of Service[neutron-metadata]\nInfo: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/heartbeat_rate]: Scheduling refresh of Service[neutron-lbaas-service]\nInfo: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/heartbeat_rate]: Scheduling refresh of Service[neutron-l3]\nInfo: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/heartbeat_rate]: Scheduling refresh of Service[neutron-dhcp-service]\nInfo: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/heartbeat_rate]: Scheduling refresh of Service[neutron-metering-service]\nNotice: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/kombu_ssl_version]/ensure: created\nInfo: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/kombu_ssl_version]: Scheduling refresh of Service[neutron-server]\nInfo: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/kombu_ssl_version]: Scheduling refresh of Exec[neutron-db-sync]\nInfo: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/kombu_ssl_version]: Scheduling refresh of Service[neutron-metadata]\nInfo: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/kombu_ssl_version]: Scheduling refresh of Service[neutron-lbaas-service]\nInfo: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/kombu_ssl_version]: Scheduling refresh of Service[neutron-l3]\nInfo: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/kombu_ssl_version]: Scheduling refresh of Service[neutron-dhcp-service]\nInfo: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/kombu_ssl_version]: Scheduling refresh of Service[neutron-metering-service]\nNotice: /Stage[main]/Neutron/Neutron_config[agent/root_helper]/ensure: created\nInfo: /Stage[main]/Neutron/Neutron_config[agent/root_helper]: Scheduling refresh of Service[neutron-server]\nInfo: /Stage[main]/Neutron/Neutron_config[agent/root_helper]: Scheduling refresh of Exec[neutron-db-sync]\nInfo: /Stage[main]/Neutron/Neutron_config[agent/root_helper]: Scheduling refresh of Service[neutron-metadata]\nInfo: /Stage[main]/Neutron/Neutron_config[agent/root_helper]: Scheduling refresh of Service[neutron-lbaas-service]\nInfo: /Stage[main]/Neutron/Neutron_config[agent/root_helper]: Scheduling refresh of Service[neutron-l3]\nInfo: /Stage[main]/Neutron/Neutron_config[agent/root_helper]: Scheduling refresh of Service[neutron-dhcp-service]\nInfo: /Stage[main]/Neutron/Neutron_config[agent/root_helper]: Scheduling refresh of Service[neutron-metering-service]\nNotice: /Stage[main]/Neutron/Neutron_config[DEFAULT/rpc_backend]/ensure: created\nInfo: /Stage[main]/Neutron/Neutron_config[DEFAULT/rpc_backend]: Scheduling refresh of Service[neutron-server]\nInfo: /Stage[main]/Neutron/Neutron_config[DEFAULT/rpc_backend]: Scheduling refresh of Exec[neutron-db-sync]\nInfo: /Stage[main]/Neutron/Neutron_config[DEFAULT/rpc_backend]: Scheduling refresh of Service[neutron-metadata]\nInfo: /Stage[main]/Neutron/Neutron_config[DEFAULT/rpc_backend]: Scheduling refresh of Service[neutron-lbaas-service]\nInfo: /Stage[main]/Neutron/Neutron_config[DEFAULT/rpc_backend]: Scheduling refresh of Service[neutron-l3]\nInfo: 
/Stage[main]/Neutron/Neutron_config[DEFAULT/rpc_backend]: Scheduling refresh of Service[neutron-dhcp-service]\nInfo: /Stage[main]/Neutron/Neutron_config[DEFAULT/rpc_backend]: Scheduling refresh of Service[neutron-metering-service]\nNotice: /Stage[main]/Neutron::Agents::Metering/Package[neutron-metering-agent]/ensure: created\nInfo: /Stage[main]/Neutron::Agents::Metering/Package[neutron-metering-agent]: Scheduling refresh of Anchor[keystone::service::end]\nInfo: /Stage[main]/Neutron::Agents::Metering/Package[neutron-metering-agent]: Scheduling refresh of Exec[neutron-db-sync]\nInfo: /Stage[main]/Neutron::Agents::Metering/Package[neutron-metering-agent]: Scheduling refresh of Service[neutron-metering-service]\nNotice: /Stage[main]/Neutron/Neutron_config[DEFAULT/service_plugins]/ensure: created\nInfo: /Stage[main]/Neutron/Neutron_config[DEFAULT/service_plugins]: Scheduling refresh of Service[neutron-server]\nInfo: /Stage[main]/Neutron/Neutron_config[DEFAULT/service_plugins]: Scheduling refresh of Exec[neutron-db-sync]\nInfo: /Stage[main]/Neutron/Neutron_config[DEFAULT/service_plugins]: Scheduling refresh of Service[neutron-metadata]\nInfo: /Stage[main]/Neutron/Neutron_config[DEFAULT/service_plugins]: Scheduling refresh of Service[neutron-lbaas-service]\nInfo: /Stage[main]/Neutron/Neutron_config[DEFAULT/service_plugins]: Scheduling refresh of Service[neutron-l3]\nInfo: /Stage[main]/Neutron/Neutron_config[DEFAULT/service_plugins]: Scheduling refresh of Service[neutron-dhcp-service]\nInfo: /Stage[main]/Neutron/Neutron_config[DEFAULT/service_plugins]: Scheduling refresh of Service[neutron-metering-service]\nNotice: /Stage[main]/Neutron::Agents::Metadata/Neutron_metadata_agent_config[DEFAULT/metadata_workers]/ensure: created\nInfo: /Stage[main]/Neutron::Agents::Metadata/Neutron_metadata_agent_config[DEFAULT/metadata_workers]: Scheduling refresh of Service[neutron-metadata]\nNotice: /Stage[main]/Neutron::Agents::Dhcp/Neutron_dhcp_agent_config[DEFAULT/enable_metadata_network]/ensure: created\nInfo: /Stage[main]/Neutron::Agents::Dhcp/Neutron_dhcp_agent_config[DEFAULT/enable_metadata_network]: Scheduling refresh of Service[neutron-dhcp-service]\nNotice: /Stage[main]/Neutron/Neutron_config[DEFAULT/control_exchange]/ensure: created\nInfo: /Stage[main]/Neutron/Neutron_config[DEFAULT/control_exchange]: Scheduling refresh of Service[neutron-server]\nInfo: /Stage[main]/Neutron/Neutron_config[DEFAULT/control_exchange]: Scheduling refresh of Exec[neutron-db-sync]\nInfo: /Stage[main]/Neutron/Neutron_config[DEFAULT/control_exchange]: Scheduling refresh of Service[neutron-metadata]\nInfo: /Stage[main]/Neutron/Neutron_config[DEFAULT/control_exchange]: Scheduling refresh of Service[neutron-lbaas-service]\nInfo: /Stage[main]/Neutron/Neutron_config[DEFAULT/control_exchange]: Scheduling refresh of Service[neutron-l3]\nInfo: /Stage[main]/Neutron/Neutron_config[DEFAULT/control_exchange]: Scheduling refresh of Service[neutron-dhcp-service]\nInfo: /Stage[main]/Neutron/Neutron_config[DEFAULT/control_exchange]: Scheduling refresh of Service[neutron-metering-service]\nNotice: /Stage[main]/Neutron::Agents::Metering/Neutron_metering_agent_config[DEFAULT/driver]/ensure: created\nInfo: /Stage[main]/Neutron::Agents::Metering/Neutron_metering_agent_config[DEFAULT/driver]: Scheduling refresh of Service[neutron-metering-service]\nNotice: /Stage[main]/Neutron/Neutron_config[DEFAULT/core_plugin]/ensure: created\nInfo: /Stage[main]/Neutron/Neutron_config[DEFAULT/core_plugin]: Scheduling refresh of Service[neutron-server]\nInfo: 
/Stage[main]/Neutron/Neutron_config[DEFAULT/core_plugin]: Scheduling refresh of Exec[neutron-db-sync]\nInfo: /Stage[main]/Neutron/Neutron_config[DEFAULT/core_plugin]: Scheduling refresh of Service[neutron-metadata]\nInfo: /Stage[main]/Neutron/Neutron_config[DEFAULT/core_plugin]: Scheduling refresh of Service[neutron-lbaas-service]\nInfo: /Stage[main]/Neutron/Neutron_config[DEFAULT/core_plugin]: Scheduling refresh of Service[neutron-l3]\nInfo: /Stage[main]/Neutron/Neutron_config[DEFAULT/core_plugin]: Scheduling refresh of Service[neutron-dhcp-service]\nInfo: /Stage[main]/Neutron/Neutron_config[DEFAULT/core_plugin]: Scheduling refresh of Service[neutron-metering-service]\nNotice: /Stage[main]/Glance::Api/Glance_cache_config[DEFAULT/admin_tenant_name]/ensure: created\nInfo: /Stage[main]/Glance::Api/Glance_cache_config[DEFAULT/admin_tenant_name]: Scheduling refresh of Service[glance-api]\nInfo: /Stage[main]/Glance::Api/Glance_cache_config[DEFAULT/admin_tenant_name]: Scheduling refresh of Exec[glance-manage db_sync]\nNotice: /Stage[main]/Glance::Api/Glance_api_config[keystone_authtoken/admin_tenant_name]/ensure: created\nInfo: /Stage[main]/Glance::Api/Glance_api_config[keystone_authtoken/admin_tenant_name]: Scheduling refresh of Service[glance-api]\nInfo: /Stage[main]/Glance::Api/Glance_api_config[keystone_authtoken/admin_tenant_name]: Scheduling refresh of Exec[glance-manage db_sync]\nNotice: /Stage[main]/Neutron::Agents::Dhcp/Neutron_dhcp_agent_config[DEFAULT/enable_isolated_metadata]/ensure: created\nInfo: /Stage[main]/Neutron::Agents::Dhcp/Neutron_dhcp_agent_config[DEFAULT/enable_isolated_metadata]: Scheduling refresh of Service[neutron-dhcp-service]\nNotice: /Stage[main]/Glance::Registry/Glance_registry_config[DEFAULT/cert_file]/ensure: created\nInfo: /Stage[main]/Glance::Registry/Glance_registry_config[DEFAULT/cert_file]: Scheduling refresh of Service[glance-registry]\nInfo: /Stage[main]/Glance::Registry/Glance_registry_config[DEFAULT/cert_file]: Scheduling refresh of Exec[glance-manage db_sync]\nNotice: /Stage[main]/Neutron::Agents::Metadata/Neutron_metadata_agent_config[DEFAULT/debug]/ensure: created\nInfo: /Stage[main]/Neutron::Agents::Metadata/Neutron_metadata_agent_config[DEFAULT/debug]: Scheduling refresh of Service[neutron-metadata]\nNotice: /Stage[main]/Neutron::Agents::Dhcp/Neutron_dhcp_agent_config[DEFAULT/debug]/ensure: created\nInfo: /Stage[main]/Neutron::Agents::Dhcp/Neutron_dhcp_agent_config[DEFAULT/debug]: Scheduling refresh of Service[neutron-dhcp-service]\nNotice: /Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/syslog_log_facility]/ensure: created\nInfo: /Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/syslog_log_facility]: Scheduling refresh of Service[glance-registry]\nInfo: /Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/syslog_log_facility]: Scheduling refresh of Exec[glance-manage db_sync]\nNotice: /Stage[main]/Openstack_integration/Package[openstack-selinux]/ensure: created\nNotice: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/backlog]/ensure: created\nInfo: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/backlog]: Scheduling refresh of Service[glance-api]\nInfo: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/backlog]: Scheduling refresh of Exec[glance-manage db_sync]\nNotice: /Stage[main]/Mysql::Client::Install/Package[mysql_client]/ensure: created\nNotice: /Stage[main]/Neutron/Neutron_config[DEFAULT/auth_strategy]/ensure: created\nInfo: 
/Stage[main]/Neutron/Neutron_config[DEFAULT/auth_strategy]: Scheduling refresh of Service[neutron-server]\nInfo: /Stage[main]/Neutron/Neutron_config[DEFAULT/auth_strategy]: Scheduling refresh of Exec[neutron-db-sync]\nInfo: /Stage[main]/Neutron/Neutron_config[DEFAULT/auth_strategy]: Scheduling refresh of Service[neutron-metadata]\nInfo: /Stage[main]/Neutron/Neutron_config[DEFAULT/auth_strategy]: Scheduling refresh of Service[neutron-lbaas-service]\nInfo: /Stage[main]/Neutron/Neutron_config[DEFAULT/auth_strategy]: Scheduling refresh of Service[neutron-l3]\nInfo: /Stage[main]/Neutron/Neutron_config[DEFAULT/auth_strategy]: Scheduling refresh of Service[neutron-dhcp-service]\nInfo: /Stage[main]/Neutron/Neutron_config[DEFAULT/auth_strategy]: Scheduling refresh of Service[neutron-metering-service]\nNotice: /Stage[main]/Mongodb::Server::Install/Package[mongodb_server]/ensure: created\nNotice: /Stage[main]/Cinder/Package[cinder]/ensure: created\nInfo: /Stage[main]/Cinder/Package[cinder]: Scheduling refresh of Anchor[keystone::service::end]\nInfo: /Stage[main]/Cinder/Package[cinder]: Scheduling refresh of Exec[cinder-manage db_sync]\nNotice: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_ha_queues]/ensure: created\nInfo: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_ha_queues]: Scheduling refresh of Service[cinder-api]\nInfo: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_ha_queues]: Scheduling refresh of Exec[cinder-manage db_sync]\nInfo: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_ha_queues]: Scheduling refresh of Service[cinder-scheduler]\nInfo: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_ha_queues]: Scheduling refresh of Service[cinder-volume]\nNotice: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/amqp_durable_queues]/ensure: created\nInfo: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/amqp_durable_queues]: Scheduling refresh of Service[cinder-api]\nInfo: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/amqp_durable_queues]: Scheduling refresh of Exec[cinder-manage db_sync]\nInfo: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/amqp_durable_queues]: Scheduling refresh of Service[cinder-scheduler]\nInfo: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/amqp_durable_queues]: Scheduling refresh of Service[cinder-volume]\nNotice: /Stage[main]/Cinder/Cinder_config[DEFAULT/enable_v2_api]/ensure: created\nInfo: /Stage[main]/Cinder/Cinder_config[DEFAULT/enable_v2_api]: Scheduling refresh of Service[cinder-api]\nInfo: /Stage[main]/Cinder/Cinder_config[DEFAULT/enable_v2_api]: Scheduling refresh of Exec[cinder-manage db_sync]\nInfo: /Stage[main]/Cinder/Cinder_config[DEFAULT/enable_v2_api]: Scheduling refresh of Service[cinder-scheduler]\nInfo: /Stage[main]/Cinder/Cinder_config[DEFAULT/enable_v2_api]: Scheduling refresh of Service[cinder-volume]\nNotice: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/heartbeat_rate]/ensure: created\nInfo: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/heartbeat_rate]: Scheduling refresh of Service[cinder-api]\nInfo: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/heartbeat_rate]: Scheduling refresh of Exec[cinder-manage db_sync]\nInfo: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/heartbeat_rate]: Scheduling refresh of Service[cinder-scheduler]\nInfo: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/heartbeat_rate]: Scheduling refresh of Service[cinder-volume]\nNotice: 
/Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/admin_password]/ensure: created\nInfo: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/admin_password]: Scheduling refresh of Service[cinder-api]\nInfo: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/admin_password]: Scheduling refresh of Exec[cinder-manage db_sync]\nInfo: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/admin_password]: Scheduling refresh of Service[cinder-scheduler]\nInfo: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/admin_password]: Scheduling refresh of Service[cinder-volume]\nNotice: /Stage[main]/Cinder/Cinder_config[DEFAULT/api_paste_config]/ensure: created\nInfo: /Stage[main]/Cinder/Cinder_config[DEFAULT/api_paste_config]: Scheduling refresh of Service[cinder-api]\nInfo: /Stage[main]/Cinder/Cinder_config[DEFAULT/api_paste_config]: Scheduling refresh of Exec[cinder-manage db_sync]\nInfo: /Stage[main]/Cinder/Cinder_config[DEFAULT/api_paste_config]: Scheduling refresh of Service[cinder-scheduler]\nInfo: /Stage[main]/Cinder/Cinder_config[DEFAULT/api_paste_config]: Scheduling refresh of Service[cinder-volume]\nNotice: /Stage[main]/Cinder::Cron::Db_purge/Cron[cinder-manage db purge]/ensure: created\nNotice: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/default_volume_type]/ensure: created\nInfo: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/default_volume_type]: Scheduling refresh of Service[cinder-api]\nInfo: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/default_volume_type]: Scheduling refresh of Exec[cinder-manage db_sync]\nInfo: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/default_volume_type]: Scheduling refresh of Service[cinder-scheduler]\nInfo: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/default_volume_type]: Scheduling refresh of Service[cinder-volume]\nNotice: /Stage[main]/Cinder/Cinder_config[DEFAULT/enable_v3_api]/ensure: created\nInfo: /Stage[main]/Cinder/Cinder_config[DEFAULT/enable_v3_api]: Scheduling refresh of Service[cinder-api]\nInfo: /Stage[main]/Cinder/Cinder_config[DEFAULT/enable_v3_api]: Scheduling refresh of Exec[cinder-manage db_sync]\nInfo: /Stage[main]/Cinder/Cinder_config[DEFAULT/enable_v3_api]: Scheduling refresh of Service[cinder-scheduler]\nInfo: /Stage[main]/Cinder/Cinder_config[DEFAULT/enable_v3_api]: Scheduling refresh of Service[cinder-volume]\nNotice: /Stage[main]/Cinder::Backends/Cinder_config[DEFAULT/enabled_backends]/ensure: created\nInfo: /Stage[main]/Cinder::Backends/Cinder_config[DEFAULT/enabled_backends]: Scheduling refresh of Service[cinder-api]\nInfo: /Stage[main]/Cinder::Backends/Cinder_config[DEFAULT/enabled_backends]: Scheduling refresh of Exec[cinder-manage db_sync]\nInfo: /Stage[main]/Cinder::Backends/Cinder_config[DEFAULT/enabled_backends]: Scheduling refresh of Service[cinder-scheduler]\nInfo: /Stage[main]/Cinder::Backends/Cinder_config[DEFAULT/enabled_backends]: Scheduling refresh of Service[cinder-volume]\nNotice: /Stage[main]/Cinder::Logging/Cinder_config[DEFAULT/debug]/ensure: created\nInfo: /Stage[main]/Cinder::Logging/Cinder_config[DEFAULT/debug]: Scheduling refresh of Service[cinder-api]\nInfo: /Stage[main]/Cinder::Logging/Cinder_config[DEFAULT/debug]: Scheduling refresh of Exec[cinder-manage db_sync]\nInfo: /Stage[main]/Cinder::Logging/Cinder_config[DEFAULT/debug]: Scheduling refresh of Service[cinder-scheduler]\nInfo: /Stage[main]/Cinder::Logging/Cinder_config[DEFAULT/debug]: Scheduling refresh of Service[cinder-volume]\nNotice: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_host]/ensure: 
created\nInfo: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_host]: Scheduling refresh of Service[cinder-api]\nInfo: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_host]: Scheduling refresh of Exec[cinder-manage db_sync]\nInfo: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_host]: Scheduling refresh of Service[cinder-scheduler]\nInfo: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_host]: Scheduling refresh of Service[cinder-volume]\nNotice: /Stage[main]/Cinder/Cinder_config[DEFAULT/storage_availability_zone]/ensure: created\nInfo: /Stage[main]/Cinder/Cinder_config[DEFAULT/storage_availability_zone]: Scheduling refresh of Service[cinder-api]\nInfo: /Stage[main]/Cinder/Cinder_config[DEFAULT/storage_availability_zone]: Scheduling refresh of Exec[cinder-manage db_sync]\nInfo: /Stage[main]/Cinder/Cinder_config[DEFAULT/storage_availability_zone]: Scheduling refresh of Service[cinder-scheduler]\nInfo: /Stage[main]/Cinder/Cinder_config[DEFAULT/storage_availability_zone]: Scheduling refresh of Service[cinder-volume]\nNotice: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/nova_catalog_info]/ensure: created\nInfo: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/nova_catalog_info]: Scheduling refresh of Service[cinder-api]\nInfo: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/nova_catalog_info]: Scheduling refresh of Exec[cinder-manage db_sync]\nInfo: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/nova_catalog_info]: Scheduling refresh of Service[cinder-scheduler]\nInfo: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/nova_catalog_info]: Scheduling refresh of Service[cinder-volume]\nNotice: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_use_ssl]/ensure: created\nInfo: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_use_ssl]: Scheduling refresh of Service[cinder-api]\nInfo: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_use_ssl]: Scheduling refresh of Exec[cinder-manage db_sync]\nInfo: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_use_ssl]: Scheduling refresh of Service[cinder-scheduler]\nInfo: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_use_ssl]: Scheduling refresh of Service[cinder-volume]\nNotice: /Stage[main]/Cinder::Logging/Cinder_config[DEFAULT/log_dir]/ensure: created\nInfo: /Stage[main]/Cinder::Logging/Cinder_config[DEFAULT/log_dir]: Scheduling refresh of Service[cinder-api]\nInfo: /Stage[main]/Cinder::Logging/Cinder_config[DEFAULT/log_dir]: Scheduling refresh of Exec[cinder-manage db_sync]\nInfo: /Stage[main]/Cinder::Logging/Cinder_config[DEFAULT/log_dir]: Scheduling refresh of Service[cinder-scheduler]\nInfo: /Stage[main]/Cinder::Logging/Cinder_config[DEFAULT/log_dir]: Scheduling refresh of Service[cinder-volume]\nNotice: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/auth_strategy]/ensure: created\nInfo: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/auth_strategy]: Scheduling refresh of Service[cinder-api]\nInfo: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/auth_strategy]: Scheduling refresh of Exec[cinder-manage db_sync]\nInfo: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/auth_strategy]: Scheduling refresh of Service[cinder-scheduler]\nInfo: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/auth_strategy]: Scheduling refresh of Service[cinder-volume]\nNotice: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/osapi_volume_listen]/ensure: created\nInfo: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/osapi_volume_listen]: Scheduling refresh of 
Service[cinder-api]\nInfo: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/osapi_volume_listen]: Scheduling refresh of Exec[cinder-manage db_sync]\nInfo: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/osapi_volume_listen]: Scheduling refresh of Service[cinder-scheduler]\nInfo: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/osapi_volume_listen]: Scheduling refresh of Service[cinder-volume]\nNotice: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/osapi_volume_workers]/ensure: created\nInfo: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/osapi_volume_workers]: Scheduling refresh of Service[cinder-api]\nInfo: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/osapi_volume_workers]: Scheduling refresh of Exec[cinder-manage db_sync]\nInfo: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/osapi_volume_workers]: Scheduling refresh of Service[cinder-scheduler]\nInfo: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/osapi_volume_workers]: Scheduling refresh of Service[cinder-volume]\nNotice: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_servers]/ensure: created\nInfo: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_servers]: Scheduling refresh of Service[cinder-api]\nInfo: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_servers]: Scheduling refresh of Exec[cinder-manage db_sync]\nInfo: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_servers]: Scheduling refresh of Service[cinder-scheduler]\nInfo: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_servers]: Scheduling refresh of Service[cinder-volume]\nNotice: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_hosts]/ensure: created\nInfo: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_hosts]: Scheduling refresh of Service[cinder-api]\nInfo: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_hosts]: Scheduling refresh of Exec[cinder-manage db_sync]\nInfo: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_hosts]: Scheduling refresh of Service[cinder-scheduler]\nInfo: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_hosts]: Scheduling refresh of Service[cinder-volume]\nNotice: /Stage[main]/Neutron/Neutron_config[DEFAULT/allow_overlapping_ips]/ensure: created\nInfo: /Stage[main]/Neutron/Neutron_config[DEFAULT/allow_overlapping_ips]: Scheduling refresh of Service[neutron-server]\nInfo: /Stage[main]/Neutron/Neutron_config[DEFAULT/allow_overlapping_ips]: Scheduling refresh of Exec[neutron-db-sync]\nInfo: /Stage[main]/Neutron/Neutron_config[DEFAULT/allow_overlapping_ips]: Scheduling refresh of Service[neutron-metadata]\nInfo: /Stage[main]/Neutron/Neutron_config[DEFAULT/allow_overlapping_ips]: Scheduling refresh of Service[neutron-lbaas-service]\nInfo: /Stage[main]/Neutron/Neutron_config[DEFAULT/allow_overlapping_ips]: Scheduling refresh of Service[neutron-l3]\nInfo: /Stage[main]/Neutron/Neutron_config[DEFAULT/allow_overlapping_ips]: Scheduling refresh of Service[neutron-dhcp-service]\nInfo: /Stage[main]/Neutron/Neutron_config[DEFAULT/allow_overlapping_ips]: Scheduling refresh of Service[neutron-metering-service]\nNotice: /Stage[main]/Staging/File[/opt/staging]/ensure: created\nNotice: /Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/log_file]/ensure: created\nInfo: /Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/log_file]: Scheduling refresh of Service[glance-api]\nInfo: /Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/log_file]: Scheduling refresh of Exec[glance-manage db_sync]\nNotice: 
/Stage[main]/Neutron/Neutron_config[oslo_concurrency/lock_path]/ensure: created\nInfo: /Stage[main]/Neutron/Neutron_config[oslo_concurrency/lock_path]: Scheduling refresh of Service[neutron-server]\nInfo: /Stage[main]/Neutron/Neutron_config[oslo_concurrency/lock_path]: Scheduling refresh of Exec[neutron-db-sync]\nInfo: /Stage[main]/Neutron/Neutron_config[oslo_concurrency/lock_path]: Scheduling refresh of Service[neutron-metadata]\nInfo: /Stage[main]/Neutron/Neutron_config[oslo_concurrency/lock_path]: Scheduling refresh of Service[neutron-lbaas-service]\nInfo: /Stage[main]/Neutron/Neutron_config[oslo_concurrency/lock_path]: Scheduling refresh of Service[neutron-l3]\nInfo: /Stage[main]/Neutron/Neutron_config[oslo_concurrency/lock_path]: Scheduling refresh of Service[neutron-dhcp-service]\nInfo: /Stage[main]/Neutron/Neutron_config[oslo_concurrency/lock_path]: Scheduling refresh of Service[neutron-metering-service]\nNotice: /Stage[main]/Glance::Cache::Logging/Glance_cache_config[DEFAULT/log_dir]/ensure: created\nInfo: /Stage[main]/Glance::Cache::Logging/Glance_cache_config[DEFAULT/log_dir]: Scheduling refresh of Service[glance-api]\nInfo: /Stage[main]/Glance::Cache::Logging/Glance_cache_config[DEFAULT/log_dir]: Scheduling refresh of Exec[glance-manage db_sync]\nNotice: /Stage[main]/Glance::Cache::Logging/Glance_cache_config[DEFAULT/verbose]/ensure: created\nInfo: /Stage[main]/Glance::Cache::Logging/Glance_cache_config[DEFAULT/verbose]: Scheduling refresh of Service[glance-api]\nInfo: /Stage[main]/Glance::Cache::Logging/Glance_cache_config[DEFAULT/verbose]: Scheduling refresh of Exec[glance-manage db_sync]\nNotice: /Stage[main]/Glance::Cache::Logging/Glance_cache_config[DEFAULT/log_file]/ensure: created\nInfo: /Stage[main]/Glance::Cache::Logging/Glance_cache_config[DEFAULT/log_file]: Scheduling refresh of Service[glance-api]\nInfo: /Stage[main]/Glance::Cache::Logging/Glance_cache_config[DEFAULT/log_file]: Scheduling refresh of Exec[glance-manage db_sync]\nNotice: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/registry_client_cert_file]/ensure: created\nInfo: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/registry_client_cert_file]: Scheduling refresh of Service[glance-api]\nInfo: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/registry_client_cert_file]: Scheduling refresh of Exec[glance-manage db_sync]\nNotice: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/router_scheduler_driver]/ensure: created\nInfo: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/router_scheduler_driver]: Scheduling refresh of Service[neutron-server]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/router_scheduler_driver]: Scheduling refresh of Exec[neutron-db-sync]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/router_scheduler_driver]: Scheduling refresh of Service[neutron-metadata]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/router_scheduler_driver]: Scheduling refresh of Service[neutron-lbaas-service]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/router_scheduler_driver]: Scheduling refresh of Service[neutron-l3]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/router_scheduler_driver]: Scheduling refresh of Service[neutron-dhcp-service]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/router_scheduler_driver]: Scheduling refresh of Service[neutron-metering-service]\nNotice: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_type]/ensure: created\nInfo: 
/Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_type]: Scheduling refresh of Service[neutron-server]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_type]: Scheduling refresh of Exec[neutron-db-sync]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_type]: Scheduling refresh of Service[neutron-metadata]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_type]: Scheduling refresh of Service[neutron-lbaas-service]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_type]: Scheduling refresh of Service[neutron-l3]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_type]: Scheduling refresh of Service[neutron-dhcp-service]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_type]: Scheduling refresh of Service[neutron-metering-service]\nNotice: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/auth_type]/ensure: created\nInfo: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/auth_type]: Scheduling refresh of Service[neutron-server]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/auth_type]: Scheduling refresh of Exec[neutron-db-sync]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/auth_type]: Scheduling refresh of Service[neutron-metadata]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/auth_type]: Scheduling refresh of Service[neutron-lbaas-service]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/auth_type]: Scheduling refresh of Service[neutron-l3]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/auth_type]: Scheduling refresh of Service[neutron-dhcp-service]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/auth_type]: Scheduling refresh of Service[neutron-metering-service]\nNotice: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/tenant_name]/ensure: created\nInfo: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/tenant_name]: Scheduling refresh of Service[neutron-server]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/tenant_name]: Scheduling refresh of Exec[neutron-db-sync]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/tenant_name]: Scheduling refresh of Service[neutron-metadata]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/tenant_name]: Scheduling refresh of Service[neutron-lbaas-service]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/tenant_name]: Scheduling refresh of Service[neutron-l3]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/tenant_name]: Scheduling refresh of Service[neutron-dhcp-service]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/tenant_name]: Scheduling refresh of Service[neutron-metering-service]\nNotice: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/project_name]/ensure: created\nInfo: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/project_name]: Scheduling refresh of Service[neutron-server]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/project_name]: Scheduling refresh of Exec[neutron-db-sync]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/project_name]: Scheduling refresh of Service[neutron-metadata]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/project_name]: Scheduling refresh of Service[neutron-lbaas-service]\nInfo: 
/Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/project_name]: Scheduling refresh of Service[neutron-l3]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/project_name]: Scheduling refresh of Service[neutron-dhcp-service]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/project_name]: Scheduling refresh of Service[neutron-metering-service]\nNotice: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/password]/ensure: created\nInfo: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/password]: Scheduling refresh of Service[neutron-server]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/password]: Scheduling refresh of Exec[neutron-db-sync]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/password]: Scheduling refresh of Service[neutron-metadata]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/password]: Scheduling refresh of Service[neutron-lbaas-service]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/password]: Scheduling refresh of Service[neutron-l3]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/password]: Scheduling refresh of Service[neutron-dhcp-service]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/password]: Scheduling refresh of Service[neutron-metering-service]\nNotice: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/username]/ensure: created\nInfo: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/username]: Scheduling refresh of Service[neutron-server]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/username]: Scheduling refresh of Exec[neutron-db-sync]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/username]: Scheduling refresh of Service[neutron-metadata]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/username]: Scheduling refresh of Service[neutron-lbaas-service]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/username]: Scheduling refresh of Service[neutron-l3]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/username]: Scheduling refresh of Service[neutron-dhcp-service]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/username]: Scheduling refresh of Service[neutron-metering-service]\nNotice: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/rpc_workers]/ensure: created\nInfo: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/rpc_workers]: Scheduling refresh of Service[neutron-server]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/rpc_workers]: Scheduling refresh of Exec[neutron-db-sync]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/rpc_workers]: Scheduling refresh of Service[neutron-metadata]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/rpc_workers]: Scheduling refresh of Service[neutron-lbaas-service]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/rpc_workers]: Scheduling refresh of Service[neutron-l3]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/rpc_workers]: Scheduling refresh of Service[neutron-dhcp-service]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/rpc_workers]: Scheduling refresh of Service[neutron-metering-service]\nNotice: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_uri]/ensure: created\nInfo: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_uri]: Scheduling refresh of 
Service[neutron-server]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_uri]: Scheduling refresh of Exec[neutron-db-sync]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_uri]: Scheduling refresh of Service[neutron-metadata]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_uri]: Scheduling refresh of Service[neutron-lbaas-service]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_uri]: Scheduling refresh of Service[neutron-l3]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_uri]: Scheduling refresh of Service[neutron-dhcp-service]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_uri]: Scheduling refresh of Service[neutron-metering-service]\nNotice: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/api_workers]/ensure: created\nInfo: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/api_workers]: Scheduling refresh of Service[neutron-server]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/api_workers]: Scheduling refresh of Exec[neutron-db-sync]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/api_workers]: Scheduling refresh of Service[neutron-metadata]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/api_workers]: Scheduling refresh of Service[neutron-lbaas-service]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/api_workers]: Scheduling refresh of Service[neutron-l3]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/api_workers]: Scheduling refresh of Service[neutron-dhcp-service]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/api_workers]: Scheduling refresh of Service[neutron-metering-service]\nNotice: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/admin_tenant_name]/ensure: created\nInfo: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/admin_tenant_name]: Scheduling refresh of Service[cinder-api]\nInfo: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/admin_tenant_name]: Scheduling refresh of Exec[cinder-manage db_sync]\nInfo: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/admin_tenant_name]: Scheduling refresh of Service[cinder-scheduler]\nInfo: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/admin_tenant_name]: Scheduling refresh of Service[cinder-volume]\nNotice: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/registry_client_protocol]/ensure: created\nInfo: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/registry_client_protocol]: Scheduling refresh of Service[glance-api]\nInfo: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/registry_client_protocol]: Scheduling refresh of Exec[glance-manage db_sync]\nNotice: /Stage[main]/Cinder/Cinder_config[oslo_concurrency/lock_path]/ensure: created\nInfo: /Stage[main]/Cinder/Cinder_config[oslo_concurrency/lock_path]: Scheduling refresh of Service[cinder-api]\nInfo: /Stage[main]/Cinder/Cinder_config[oslo_concurrency/lock_path]: Scheduling refresh of Exec[cinder-manage db_sync]\nInfo: /Stage[main]/Cinder/Cinder_config[oslo_concurrency/lock_path]: Scheduling refresh of Service[cinder-scheduler]\nInfo: /Stage[main]/Cinder/Cinder_config[oslo_concurrency/lock_path]: Scheduling refresh of Service[cinder-volume]\nNotice: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/swift_store_create_container_on_put]/ensure: created\nInfo: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/swift_store_create_container_on_put]: Scheduling refresh of 
Service[glance-api]\nInfo: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/swift_store_create_container_on_put]: Scheduling refresh of Exec[glance-manage db_sync]\nNotice: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/swift_store_endpoint_type]/ensure: created\nInfo: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/swift_store_endpoint_type]: Scheduling refresh of Service[glance-api]\nInfo: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/swift_store_endpoint_type]: Scheduling refresh of Exec[glance-manage db_sync]\nNotice: /Stage[main]/Neutron/Neutron_config[DEFAULT/debug]/ensure: created\nInfo: /Stage[main]/Neutron/Neutron_config[DEFAULT/debug]: Scheduling refresh of Service[neutron-server]\nInfo: /Stage[main]/Neutron/Neutron_config[DEFAULT/debug]: Scheduling refresh of Exec[neutron-db-sync]\nInfo: /Stage[main]/Neutron/Neutron_config[DEFAULT/debug]: Scheduling refresh of Service[neutron-metadata]\nInfo: /Stage[main]/Neutron/Neutron_config[DEFAULT/debug]: Scheduling refresh of Service[neutron-lbaas-service]\nInfo: /Stage[main]/Neutron/Neutron_config[DEFAULT/debug]: Scheduling refresh of Service[neutron-l3]\nInfo: /Stage[main]/Neutron/Neutron_config[DEFAULT/debug]: Scheduling refresh of Service[neutron-dhcp-service]\nInfo: /Stage[main]/Neutron/Neutron_config[DEFAULT/debug]: Scheduling refresh of Service[neutron-metering-service]\nNotice: /Stage[main]/Ironic/Package[ironic-common]/ensure: created\nInfo: /Stage[main]/Ironic/Package[ironic-common]: Scheduling refresh of Anchor[keystone::service::end]\nInfo: /Stage[main]/Ironic/Package[ironic-common]: Scheduling refresh of Exec[ironic-dbsync]\nNotice: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_virtual_host]/ensure: created\nInfo: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_virtual_host]: Scheduling refresh of Service[httpd]\nInfo: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_virtual_host]: Scheduling refresh of Service[ironic-conductor]\nNotice: /Stage[main]/Ironic/Ironic_config[glance/glance_api_insecure]/ensure: created\nInfo: /Stage[main]/Ironic/Ironic_config[glance/glance_api_insecure]: Scheduling refresh of Service[httpd]\nInfo: /Stage[main]/Ironic/Ironic_config[glance/glance_api_insecure]: Scheduling refresh of Service[ironic-conductor]\nNotice: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/kombu_ssl_version]/ensure: created\nInfo: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/kombu_ssl_version]: Scheduling refresh of Service[httpd]\nInfo: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/kombu_ssl_version]: Scheduling refresh of Service[ironic-conductor]\nNotice: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_ha_queues]/ensure: created\nInfo: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_ha_queues]: Scheduling refresh of Service[httpd]\nInfo: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_ha_queues]: Scheduling refresh of Service[ironic-conductor]\nNotice: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_port]/ensure: created\nInfo: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_port]: Scheduling refresh of Service[httpd]\nInfo: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_port]: Scheduling refresh of Service[ironic-conductor]\nNotice: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_userid]/ensure: created\nInfo: 
/Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_userid]: Scheduling refresh of Service[httpd]
Info: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_userid]: Scheduling refresh of Service[ironic-conductor]
Notice: /Stage[main]/Ironic/Ironic_config[DEFAULT/enabled_drivers]/ensure: created
Info: /Stage[main]/Ironic/Ironic_config[DEFAULT/enabled_drivers]: Scheduling refresh of Service[httpd]
Info: /Stage[main]/Ironic/Ironic_config[DEFAULT/enabled_drivers]: Scheduling refresh of Service[ironic-conductor]
Notice: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_host]/ensure: created
Info: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_host]: Scheduling refresh of Service[httpd]
Info: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_host]: Scheduling refresh of Service[ironic-conductor]
Notice: /Stage[main]/Ironic/Ironic_config[DEFAULT/auth_strategy]/ensure: created
Info: /Stage[main]/Ironic/Ironic_config[DEFAULT/auth_strategy]: Scheduling refresh of Service[httpd]
Info: /Stage[main]/Ironic/Ironic_config[DEFAULT/auth_strategy]: Scheduling refresh of Service[ironic-conductor]
Notice: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/amqp_durable_queues]/ensure: created
Info: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/amqp_durable_queues]: Scheduling refresh of Service[httpd]
Info: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/amqp_durable_queues]: Scheduling refresh of Service[ironic-conductor]
Notice: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_use_ssl]/ensure: created
Info: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_use_ssl]: Scheduling refresh of Service[httpd]
Info: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_use_ssl]: Scheduling refresh of Service[ironic-conductor]
Notice: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_use_ssl]/ensure: created
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_use_ssl]: Scheduling refresh of Service[neutron-server]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_use_ssl]: Scheduling refresh of Exec[neutron-db-sync]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_use_ssl]: Scheduling refresh of Service[neutron-metadata]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_use_ssl]: Scheduling refresh of Service[neutron-lbaas-service]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_use_ssl]: Scheduling refresh of Service[neutron-l3]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_use_ssl]: Scheduling refresh of Service[neutron-dhcp-service]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_use_ssl]: Scheduling refresh of Service[neutron-metering-service]
Notice: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_password]/ensure: created
Info: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_password]: Scheduling refresh of Service[httpd]
Info: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_password]: Scheduling refresh of Service[ironic-conductor]
Notice: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/cert_file]/ensure: created
Info: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/cert_file]: Scheduling refresh of Service[glance-api]
Info: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/cert_file]: Scheduling refresh of Exec[glance-manage db_sync]
Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/verbose]/ensure: created
Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/verbose]: Scheduling refresh of Service[neutron-server]
Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/verbose]: Scheduling refresh of Exec[neutron-db-sync]
Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/verbose]: Scheduling refresh of Service[neutron-metadata]
Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/verbose]: Scheduling refresh of Service[neutron-lbaas-service]
Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/verbose]: Scheduling refresh of Service[neutron-l3]
Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/verbose]: Scheduling refresh of Service[neutron-dhcp-service]
Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/verbose]: Scheduling refresh of Service[neutron-metering-service]
Notice: /Stage[main]/Glance::Registry/Glance_registry_config[paste_deploy/flavor]/ensure: created
Info: /Stage[main]/Glance::Registry/Glance_registry_config[paste_deploy/flavor]: Scheduling refresh of Service[glance-registry]
Info: /Stage[main]/Glance::Registry/Glance_registry_config[paste_deploy/flavor]: Scheduling refresh of Exec[glance-manage db_sync]
Notice: /Stage[main]/Apache::Mod::Mime/Package[mailcap]/ensure: created
Notice: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/workers]/ensure: created
Info: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/workers]: Scheduling refresh of Service[glance-api]
Info: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/workers]: Scheduling refresh of Exec[glance-manage db_sync]
Notice: /Stage[main]/Cinder/Cinder_config[DEFAULT/default_availability_zone]/ensure: created
Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/default_availability_zone]: Scheduling refresh of Service[cinder-api]
Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/default_availability_zone]: Scheduling refresh of Exec[cinder-manage db_sync]
Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/default_availability_zone]: Scheduling refresh of Service[cinder-scheduler]
Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/default_availability_zone]: Scheduling refresh of Service[cinder-volume]
Notice: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_password]/ensure: created
Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_password]: Scheduling refresh of Service[glance-api]
Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_password]: Scheduling refresh of Exec[glance-manage db_sync]
Notice: /Stage[main]/Glance::Api/Glance_cache_config[DEFAULT/admin_password]/ensure: created
Info: /Stage[main]/Glance::Api/Glance_cache_config[DEFAULT/admin_password]: Scheduling refresh of Service[glance-api]
Info: /Stage[main]/Glance::Api/Glance_cache_config[DEFAULT/admin_password]: Scheduling refresh of Exec[glance-manage db_sync]
Notice: /Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/use_syslog]/ensure: created
Info: /Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/use_syslog]: Scheduling refresh of Service[glance-api]
Info: /Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/use_syslog]: Scheduling refresh of Exec[glance-manage db_sync]
Notice: /Stage[main]/Ironic::Client/Package[python-ironicclient]/ensure: created
Info: /Stage[main]/Ironic::Client/Package[python-ironicclient]: Scheduling refresh of Anchor[keystone::service::end]
Notice: /Stage[main]/Tempest/Tempest_config[identity/admin_username]/ensure: created
Notice: /Stage[main]/Tempest/Tempest_config[service_available/zaqar]/ensure: created
Notice: /Stage[main]/Tempest/Tempest_config[network/public_router_id]/ensure: created
Notice: /Stage[main]/Tempest/Package[openssl-devel]/ensure: created
Notice: /Stage[main]/Tempest/Tempest_config[identity/auth_version]/ensure: created
Notice: /Stage[main]/Tempest/Tempest_config[service_available/sahara]/ensure: created
Notice: /Stage[main]/Tempest/Tempest_config[identity/uri_v3]/ensure: created
Notice: /Stage[main]/Tempest/Tempest_config[DEFAULT/use_syslog]/ensure: created
Notice: /Stage[main]/Tempest/Tempest_config[service_available/swift]/ensure: created
Notice: /Stage[main]/Tempest/Tempest_config[identity/admin_tenant_name]/ensure: created
Notice: /Stage[main]/Tempest/Tempest_config[scenario/img_dir]/ensure: created
Notice: /Stage[main]/Tempest/Tempest_config[compute/flavor_ref_alt]/ensure: created
Notice: /Stage[main]/Tempest/Package[libffi-devel]/ensure: created
Notice: /Stage[main]/Tempest/Tempest_config[scenario/img_file]/ensure: created
Notice: /Stage[main]/Tempest/Tempest_config[DEFAULT/log_file]/ensure: created
Notice: /Stage[main]/Tempest/Tempest_config[dashboard/dashboard_url]/ensure: created
Notice: /Stage[main]/Tempest/Tempest_config[service_available/cinder]/ensure: created
Notice: /Stage[main]/Tempest/Tempest_config[service_available/ironic]/ensure: created
Notice: /Stage[main]/Tempest/Exec[install-pip]/returns: executed successfully
Notice: /Stage[main]/Tempest/Tempest_config[service_available/heat]/ensure: created
Notice: /Stage[main]/Tempest/Tempest_config[identity-feature-enabled/api_v2]/ensure: created
Notice: /Stage[main]/Tempest/Tempest_config[identity/ca_certificates_file]/ensure: created
Notice: /Stage[main]/Tempest/Tempest_config[service_available/trove]/ensure: created
Notice: /Stage[main]/Tempest/Tempest_config[service_available/murano]/ensure: created
Notice: /Stage[main]/Tempest/Tempest_config[compute/flavor_ref]/ensure: created
Notice: /Stage[main]/Tempest/Tempest_config[identity/admin_password]/ensure: created
Notice: /Stage[main]/Tempest/Tempest_config[service_available/ceilometer]/ensure: created
Notice: /Stage[main]/Tempest/Tempest_config[oslo_concurrency/lock_path]/ensure: created
Notice: /Stage[main]/Tempest/Tempest_config[service_available/glance]/ensure: created
Notice: /Stage[main]/Tempest/Exec[install-tox]/returns: executed successfully
Notice: /Stage[main]/Tempest/Tempest_config[service_available/nova]/ensure: created
Notice: /Stage[main]/Neutron::Agents::Dhcp/Neutron_dhcp_agent_config[DEFAULT/root_helper]/ensure: created
Info: /Stage[main]/Neutron::Agents::Dhcp/Neutron_dhcp_agent_config[DEFAULT/root_helper]: Scheduling refresh of Service[neutron-dhcp-service]
Notice: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_hosts]/ensure: created
Info: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_hosts]: Scheduling refresh of Service[httpd]
Info: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_hosts]: Scheduling refresh of Service[ironic-conductor]
Notice: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/auth_uri]/ensure: created
Info: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/auth_uri]: Scheduling refresh of Service[cinder-api]
Info: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/auth_uri]: Scheduling refresh of Exec[cinder-manage db_sync]
Info: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/auth_uri]: Scheduling refresh of Service[cinder-scheduler]
Info: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/auth_uri]: Scheduling refresh of Service[cinder-volume]
Notice: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/registry_host]/ensure: created
Info: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/registry_host]: Scheduling refresh of Service[glance-api]
Info: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/registry_host]: Scheduling refresh of Exec[glance-manage db_sync]
Notice: /Stage[main]/Openstack_integration::Glance/Openstack_integration::Ssl_key[glance]/File[/etc/glance/ssl]/ensure: created
Notice: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/bind_port]/ensure: created
Info: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/bind_port]: Scheduling refresh of Service[glance-api]
Info: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/bind_port]: Scheduling refresh of Exec[glance-manage db_sync]
Notice: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_hosts]/ensure: created
Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_hosts]: Scheduling refresh of Service[glance-api]
Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_hosts]: Scheduling refresh of Exec[glance-manage db_sync]
Notice: /Stage[main]/Glance::Backend::Swift/Glance_swift_config[ref1/auth_version]/ensure: created
Notice: /Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/syslog_log_facility]/ensure: created
Info: /Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/syslog_log_facility]: Scheduling refresh of Service[glance-api]
Info: /Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/syslog_log_facility]: Scheduling refresh of Exec[glance-manage db_sync]
Notice: /Stage[main]/Neutron::Agents::Dhcp/Neutron_dhcp_agent_config[DEFAULT/interface_driver]/ensure: created
Info: /Stage[main]/Neutron::Agents::Dhcp/Neutron_dhcp_agent_config[DEFAULT/interface_driver]: Scheduling refresh of Service[neutron-dhcp-service]
Notice: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_password]/ensure: created
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_password]: Scheduling refresh of Service[neutron-server]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_password]: Scheduling refresh of Exec[neutron-db-sync]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_password]: Scheduling refresh of Service[neutron-metadata]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_password]: Scheduling refresh of Service[neutron-lbaas-service]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_password]: Scheduling refresh of Service[neutron-l3]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_password]: Scheduling refresh of Service[neutron-dhcp-service]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_password]: Scheduling refresh of Service[neutron-metering-service]
Notice: /Stage[main]/Tempest/Tempest_config[identity-feature-enabled/api_v3]/ensure: created
Notice: /Stage[main]/Tempest/Tempest_config[compute/image_ssh_user]/ensure: created
Notice: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_ha_queues]/ensure: created
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_ha_queues]: Scheduling refresh of Service[neutron-server]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_ha_queues]: Scheduling refresh of Exec[neutron-db-sync]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_ha_queues]: Scheduling refresh of Service[neutron-metadata]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_ha_queues]: Scheduling refresh of Service[neutron-lbaas-service]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_ha_queues]: Scheduling refresh of Service[neutron-l3]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_ha_queues]: Scheduling refresh of Service[neutron-dhcp-service]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_ha_queues]: Scheduling refresh of Service[neutron-metering-service]
Notice: /Stage[main]/Glance::Api/Glance_api_config[keystone_authtoken/auth_uri]/ensure: created
Info: /Stage[main]/Glance::Api/Glance_api_config[keystone_authtoken/auth_uri]: Scheduling refresh of Service[glance-api]
Info: /Stage[main]/Glance::Api/Glance_api_config[keystone_authtoken/auth_uri]: Scheduling refresh of Exec[glance-manage db_sync]
Notice: /Stage[main]/Tempest/Tempest_config[compute/image_alt_ssh_user]/ensure: created
Notice: /Stage[main]/Neutron::Agents::Dhcp/Neutron_dhcp_agent_config[DEFAULT/dhcp_driver]/ensure: created
Info: /Stage[main]/Neutron::Agents::Dhcp/Neutron_dhcp_agent_config[DEFAULT/dhcp_driver]: Scheduling refresh of Service[neutron-dhcp-service]
Notice: /Stage[main]/Neutron::Server/Neutron_api_config[filter:authtoken/auth_uri]/ensure: created
Info: /Stage[main]/Neutron::Server/Neutron_api_config[filter:authtoken/auth_uri]: Scheduling refresh of Service[neutron-server]
Notice: /Stage[main]/Neutron::Services::Fwaas/Package[neutron-fwaas]/ensure: created
Info: /Stage[main]/Neutron::Services::Fwaas/Package[neutron-fwaas]: Scheduling refresh of Anchor[keystone::service::end]
Notice: /Stage[main]/Neutron::Services::Fwaas/Neutron_fwaas_service_config[fwaas/driver]/ensure: created
Info: /Stage[main]/Neutron::Services::Fwaas/Neutron_fwaas_service_config[fwaas/driver]: Scheduling refresh of Service[neutron-l3]
Notice: /Stage[main]/Openstacklib::Openstackclient/Package[python-openstackclient]/ensure: created
Info: /Stage[main]/Openstacklib::Openstackclient/Package[python-openstackclient]: Scheduling refresh of Anchor[keystone::service::end]
Notice: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/volume_driver]/ensure: created
Info: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/volume_driver]: Scheduling refresh of Service[cinder-api]
Info: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/volume_driver]: Scheduling refresh of Exec[cinder-manage db_sync]
Info: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/volume_driver]: Scheduling refresh of Service[cinder-scheduler]
Info: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/volume_driver]: Scheduling refresh of Service[cinder-volume]
Notice: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Package[targetcli]/ensure: created
Notice: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/volumes_dir]/ensure: created
Info: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/volumes_dir]: Scheduling refresh of Service[cinder-api]
Info: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/volumes_dir]: Scheduling refresh of Exec[cinder-manage db_sync]
Info: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/volumes_dir]: Scheduling refresh of Service[cinder-scheduler]
Info: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/volumes_dir]: Scheduling refresh of Service[cinder-volume]
Notice: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/iscsi_ip_address]/ensure: created
Info: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/iscsi_ip_address]: Scheduling refresh of Service[cinder-api]
Info: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/iscsi_ip_address]: Scheduling refresh of Exec[cinder-manage db_sync]
Info: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/iscsi_ip_address]: Scheduling refresh of Service[cinder-scheduler]
Info: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/iscsi_ip_address]: Scheduling refresh of Service[cinder-volume]
Notice: /Stage[main]/Neutron::Agents::Metering/Neutron_metering_agent_config[DEFAULT/debug]/ensure: created
Info: /Stage[main]/Neutron::Agents::Metering/Neutron_metering_agent_config[DEFAULT/debug]: Scheduling refresh of Service[neutron-metering-service]
Notice: /Stage[main]/Tempest/Tempest_config[DEFAULT/use_stderr]/ensure: created
Notice: /Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/log_file]/ensure: created
Info: /Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/log_file]: Scheduling refresh of Service[glance-registry]
Info: /Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/log_file]: Scheduling refresh of Exec[glance-manage db_sync]
Notice: /Stage[main]/Tempest/Tempest_config[service_available/horizon]/ensure: created
Notice: /Stage[main]/Glance::Api/Glance_api_config[keystone_authtoken/admin_user]/ensure: created
Info: /Stage[main]/Glance::Api/Glance_api_config[keystone_authtoken/admin_user]: Scheduling refresh of Service[glance-api]
Info: /Stage[main]/Glance::Api/Glance_api_config[keystone_authtoken/admin_user]: Scheduling refresh of Exec[glance-manage db_sync]
Notice: /Stage[main]/Vswitch::Ovs/Package[openvswitch]/ensure: created
Notice: /Stage[main]/Vswitch::Ovs/Service[openvswitch]/ensure: ensure changed 'stopped' to 'running'
Info: /Stage[main]/Vswitch::Ovs/Service[openvswitch]: Unscheduling refresh on Service[openvswitch]
Notice: /Stage[main]/Xinetd/File[/etc/xinetd.conf]/content: 
--- /etc/xinetd.conf	2014-06-09 19:55:06.000000000 +0100
+++ /tmp/puppet-file20160520-26469-qo45km	2016-05-20 12:26:52.098856758 +0100
@@ -1,3 +1,5 @@
+# This file is being maintained by Puppet.
+# DO NOT EDIT
 #
 # This is the master xinetd configuration file. Settings in the
 # default section will be inherited by all service configurations
@@ -10,41 +12,40 @@
 # The next two items are intended to be a quick access place to
 # temporarily enable or disable services.
 #
-#	enabled		=
-#	disabled	=
+# enabled =
+# disabled =
 
 # Define general logging characteristics.
-	log_type	= SYSLOG daemon info 
-	log_on_failure	= HOST
-	log_on_success	= PID HOST DURATION EXIT
+ log_type = SYSLOG daemon info
+ log_on_failure = HOST
+ log_on_success = PID HOST DURATION EXIT
 
 # Define access restriction defaults
 #
-#	no_access	=
-#	only_from	=
-#	max_load	= 0
-	cps		= 50 10
-	instances	= 50
-	per_source	= 10
+# no_access =
+# only_from =
+# max_load = 0
+ cps = 50 10
+ instances = 50
+ per_source = 10
 
 # Address and networking defaults
 #
-#	bind		=
-#	mdns		= yes
-	v6only		= no
+# bind =
+# mdns = yes
+ v6only = no
 
 # setup environmental attributes
 #
-#	passenv		=
-	groups		= yes
-	umask		= 002
+# passenv =
+ groups = yes
+ umask = 002
 
 # Generally, banners are not used. This sets up their global defaults
 #
-#	banner		=
-#	banner_fail	=
-#	banner_success	=
+# banner =
+# banner_fail =
+# banner_success =
 }
 
 includedir /etc/xinetd.d
- 
Info: /Stage[main]/Xinetd/File[/etc/xinetd.conf]: Filebucketed /etc/xinetd.conf to puppet with sum 9ff8cc688dd9f0dfc45e5afd25c427a7
Notice: /Stage[main]/Xinetd/File[/etc/xinetd.conf]/content: content changed '{md5}9ff8cc688dd9f0dfc45e5afd25c427a7' to '{md5}011e3163044bef3aa02a664f3785d30c'
Notice: /Stage[main]/Xinetd/File[/etc/xinetd.conf]/mode: mode changed '0600' to '0644'
Info: /Stage[main]/Xinetd/File[/etc/xinetd.conf]: Scheduling refresh of Service[xinetd]
Info: /Stage[main]/Xinetd/File[/etc/xinetd.conf]: Scheduling refresh of Service[xinetd]
Notice: /Stage[main]/Mysql::Server::Install/Package[mysql-server]/ensure: created
Notice: /Stage[main]/Mysql::Server::Config/File[mysql-config-file]/ensure: defined content as '{md5}ff09a4033f718f08f69da17f0aa86652'
Notice: /Stage[main]/Mysql::Server::Installdb/Exec[mysql_install_db]/returns: executed successfully
Notice: /File[/var/log/mariadb/mariadb.log]/seluser: seluser changed 'unconfined_u' to 'system_u'
Notice: /Stage[main]/Mysql::Server::Service/Service[mysqld]/ensure: ensure changed 'stopped' to 'running'
Info: /Stage[main]/Mysql::Server::Service/Service[mysqld]: Unscheduling refresh on Service[mysqld]
Notice: /Stage[main]/Neutron::Db::Mysql/Openstacklib::Db::Mysql[neutron]/Mysql_database[neutron]/ensure: created
Notice: /Stage[main]/Neutron::Db::Mysql/Openstacklib::Db::Mysql[neutron]/Openstacklib::Db::Mysql::Host_access[neutron_127.0.0.1]/Mysql_user[neutron@127.0.0.1]/ensure: created
Notice: /Stage[main]/Glance::Db::Mysql/Openstacklib::Db::Mysql[glance]/Mysql_database[glance]/ensure: created
Notice: /Stage[main]/Cinder::Db::Mysql/Openstacklib::Db::Mysql[cinder]/Mysql_database[cinder]/ensure: created
Notice: /Stage[main]/Tempest/Tempest_config[service_available/aodh]/ensure: created
Notice: /Stage[main]/Cinder::Logging/Cinder_config[DEFAULT/verbose]/ensure: created
Info: /Stage[main]/Cinder::Logging/Cinder_config[DEFAULT/verbose]: Scheduling refresh of Service[cinder-api]
Info: /Stage[main]/Cinder::Logging/Cinder_config[DEFAULT/verbose]: Scheduling refresh of Exec[cinder-manage db_sync]
Info: /Stage[main]/Cinder::Logging/Cinder_config[DEFAULT/verbose]: Scheduling refresh of Service[cinder-scheduler]
Info: /Stage[main]/Cinder::Logging/Cinder_config[DEFAULT/verbose]: Scheduling refresh of Service[cinder-volume]
Notice: /Stage[main]/Tempest/Tempest_config[identity/uri]/ensure: created
Notice: /Stage[main]/Tempest/Tempest_config[identity/admin_domain_name]/ensure: created
Notice: /Stage[main]/Ironic/Ironic_config[DEFAULT/rpc_backend]/ensure: created
Info: /Stage[main]/Ironic/Ironic_config[DEFAULT/rpc_backend]: Scheduling refresh of Service[httpd]
Info: /Stage[main]/Ironic/Ironic_config[DEFAULT/rpc_backend]: Scheduling refresh of Service[ironic-conductor]
Notice: /Stage[main]/Cinder/Cinder_config[DEFAULT/control_exchange]/ensure: created
Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/control_exchange]: Scheduling refresh of Service[cinder-api]
Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/control_exchange]: Scheduling refresh of Exec[cinder-manage db_sync]
Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/control_exchange]: Scheduling refresh of Service[cinder-scheduler]
Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/control_exchange]: Scheduling refresh of Service[cinder-volume]
Notice: /Stage[main]/Neutron::Services::Fwaas/Neutron_fwaas_service_config[fwaas/enabled]/ensure: created
Info: /Stage[main]/Neutron::Services::Fwaas/Neutron_fwaas_service_config[fwaas/enabled]: Scheduling refresh of Service[neutron-l3]
Notice: /Stage[main]/Ironic::Logging/Ironic_config[DEFAULT/log_dir]/ensure: created
Info: /Stage[main]/Ironic::Logging/Ironic_config[DEFAULT/log_dir]: Scheduling refresh of Service[httpd]
Info: /Stage[main]/Ironic::Logging/Ironic_config[DEFAULT/log_dir]: Scheduling refresh of Service[ironic-conductor]
Notice: /Stage[main]/Cinder/Cinder_config[DEFAULT/enable_v1_api]/ensure: created
Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/enable_v1_api]: Scheduling refresh of Service[cinder-api]
Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/enable_v1_api]: Scheduling refresh of Exec[cinder-manage db_sync]
Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/enable_v1_api]: Scheduling refresh of Service[cinder-scheduler]
Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/enable_v1_api]: Scheduling refresh of Service[cinder-volume]
Notice: /Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/use_stderr]/ensure: created
Info: /Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/use_stderr]: Scheduling refresh of Service[glance-registry]
Info: /Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/use_stderr]: Scheduling refresh of Exec[glance-manage db_sync]
Notice: /Stage[main]/Nova/Package[python-nova]/ensure: created
Info: /Stage[main]/Nova/Package[python-nova]: Scheduling refresh of Anchor[keystone::service::end]
Info: /Stage[main]/Nova/Package[python-nova]: Scheduling refresh of Anchor[nova::install::end]
Notice: /Stage[main]/Nova::Consoleauth/Nova::Generic_service[consoleauth]/Package[nova-consoleauth]/ensure: created
Info: /Stage[main]/Nova::Consoleauth/Nova::Generic_service[consoleauth]/Package[nova-consoleauth]: Scheduling refresh of Anchor[keystone::service::end]
Info: /Stage[main]/Nova::Consoleauth/Nova::Generic_service[consoleauth]/Package[nova-consoleauth]: Scheduling refresh of Anchor[nova::install::end]
Notice: /Stage[main]/Nova::Compute/Package[genisoimage]/ensure: created
Info: /Stage[main]/Nova::Compute/Package[genisoimage]: Scheduling refresh of Anchor[keystone::service::end]
Notice: /Stage[main]/Openstack_integration::Nova/Openstack_integration::Ssl_key[nova]/File[/etc/nova/ssl]/ensure: created
Notice: /Stage[main]/Nova::Compute::Libvirt/Package[libvirt-nwfilter]/ensure: created
Info: /Stage[main]/Nova::Compute::Libvirt/Package[libvirt-nwfilter]: Scheduling refresh of Anchor[keystone::service::end]
Notice: /Stage[main]/Nova::Compute::Libvirt/Package[libvirt]/ensure: created
Info: /Stage[main]/Nova::Compute::Libvirt/Package[libvirt]: Scheduling refresh of Anchor[keystone::service::end]
Notice: /Stage[main]/Openstack_integration::Nova/Openstack_integration::Ssl_key[nova]/File[/etc/nova/ssl/private]/ensure: created
Notice: /Stage[main]/Openstack_integration::Nova/Openstack_integration::Ssl_key[nova]/File[/etc/nova/ssl/private/n2.dusty.ci.centos.org.pem]/ensure: defined content as '{md5}a4b1e9413d7b63f33794716d77cfaa00'
Info: Openstack_integration::Ssl_key[nova]: Scheduling refresh of Service[httpd]
Notice: /Stage[main]/Glance::Db::Mysql/Openstacklib::Db::Mysql[glance]/Openstacklib::Db::Mysql::Host_access[glance_127.0.0.1]/Mysql_user[glance@127.0.0.1]/ensure: created
Notice: /Stage[main]/Glance::Db::Mysql/Openstacklib::Db::Mysql[glance]/Openstacklib::Db::Mysql::Host_access[glance_127.0.0.1]/Mysql_grant[glance@127.0.0.1/glance.*]/ensure: created
Notice: /Stage[main]/Openstack_integration::Glance/Openstack_integration::Ssl_key[glance]/File[/etc/glance/ssl/private]/ensure: created
Notice: /Stage[main]/Openstack_integration::Glance/Openstack_integration::Ssl_key[glance]/File[/etc/glance/ssl/private/n2.dusty.ci.centos.org.pem]/ensure: defined content as '{md5}a4b1e9413d7b63f33794716d77cfaa00'
Info: Openstack_integration::Ssl_key[glance]: Scheduling refresh of Service[glance-api]
Info: Openstack_integration::Ssl_key[glance]: Scheduling refresh of Service[glance-registry]
Notice: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_port]/ensure: created
Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_port]: Scheduling refresh of Service[cinder-api]
Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_port]: Scheduling refresh of Exec[cinder-manage db_sync]
Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_port]: Scheduling refresh of Service[cinder-scheduler]
Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_port]: Scheduling refresh of Service[cinder-volume]
Notice: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_host]/ensure: created
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_host]: Scheduling refresh of Service[neutron-server]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_host]: Scheduling refresh of Exec[neutron-db-sync]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_host]: Scheduling refresh of Service[neutron-metadata]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_host]: Scheduling refresh of Service[neutron-lbaas-service]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_host]: Scheduling refresh of Service[neutron-l3]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_host]: Scheduling refresh of Service[neutron-dhcp-service]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_host]: Scheduling refresh of Service[neutron-metering-service]
Notice: /Stage[main]/Glance::Registry/Glance_registry_config[keystone_authtoken/auth_uri]/ensure: created
Info: /Stage[main]/Glance::Registry/Glance_registry_config[keystone_authtoken/auth_uri]: Scheduling refresh of Service[glance-registry]
Info: /Stage[main]/Glance::Registry/Glance_registry_config[keystone_authtoken/auth_uri]: Scheduling refresh of Exec[glance-manage db_sync]
Notice: /Stage[main]/Tempest/Tempest_config[compute/build_interval]/ensure: created
Notice: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/volume_backend_name]/ensure: created
Info: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/volume_backend_name]: Scheduling refresh of Service[cinder-api]
Info: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/volume_backend_name]: Scheduling refresh of Exec[cinder-manage db_sync]
Info: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/volume_backend_name]: Scheduling refresh of Service[cinder-scheduler]
Info: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/volume_backend_name]: Scheduling refresh of Service[cinder-volume]
Notice: /Stage[main]/Neutron::Agents::Lbaas/Package[neutron-lbaas-agent]/ensure: created
Info: /Stage[main]/Neutron::Agents::Lbaas/Package[neutron-lbaas-agent]: Scheduling refresh of Anchor[keystone::service::end]
Info: /Stage[main]/Neutron::Agents::Lbaas/Package[neutron-lbaas-agent]: Scheduling refresh of Exec[neutron-db-sync]
Info: /Stage[main]/Neutron::Agents::Lbaas/Package[neutron-lbaas-agent]: Scheduling refresh of Service[neutron-lbaas-service]
Notice: /Stage[main]/Neutron::Agents::Lbaas/Neutron_lbaas_agent_config[haproxy/user_group]/ensure: created
Info: /Stage[main]/Neutron::Agents::Lbaas/Neutron_lbaas_agent_config[haproxy/user_group]: Scheduling refresh of Service[neutron-lbaas-service]
Notice: /Stage[main]/Neutron::Agents::Lbaas/Neutron_lbaas_agent_config[DEFAULT/debug]/ensure: created
Info: /Stage[main]/Neutron::Agents::Lbaas/Neutron_lbaas_agent_config[DEFAULT/debug]: Scheduling refresh of Service[neutron-lbaas-service]
Notice: /Stage[main]/Neutron::Agents::Lbaas/Neutron_lbaas_agent_config[DEFAULT/interface_driver]/ensure: created
Info: /Stage[main]/Neutron::Agents::Lbaas/Neutron_lbaas_agent_config[DEFAULT/interface_driver]: Scheduling refresh of Service[neutron-lbaas-service]
Notice: /Stage[main]/Neutron::Agents::Lbaas/Neutron_lbaas_agent_config[DEFAULT/device_driver]/ensure: created
Info: /Stage[main]/Neutron::Agents::Lbaas/Neutron_lbaas_agent_config[DEFAULT/device_driver]: Scheduling refresh of Service[neutron-lbaas-service]
Notice: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_notification_topic]/ensure: created
Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_notification_topic]: Scheduling refresh of Service[glance-api]
Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_notification_topic]: Scheduling refresh of Exec[glance-manage db_sync]
Notice: /Stage[main]/Glance::Backend::Swift/Glance_swift_config[ref1/auth_address]/ensure: created
Notice: /Stage[main]/Openstack_integration::Provision/Vs_bridge[br-ex]/ensure: created
Info: /Stage[main]/Openstack_integration::Provision/Vs_bridge[br-ex]: Scheduling refresh of Exec[create_loop1_port]
Notice: /Stage[main]/Openstack_integration::Provision/Exec[create_loop1_port]: Triggered 'refresh' from 1 events
Notice: /Stage[main]/Openstack_integration::Provision/Vs_port[loop1]/ensure: created
Info: /Stage[main]/Openstack_integration::Provision/Vs_port[loop1]: Scheduling refresh of Exec[create_br-ex_vif]
Notice: /Stage[main]/Openstack_integration::Provision/Exec[create_br-ex_vif]: Triggered 'refresh' from 1 events
Notice: /Stage[main]/Nova::Compute/Nova::Generic_service[compute]/Package[nova-compute]/ensure: created
Info: /Stage[main]/Nova::Compute/Nova::Generic_service[compute]/Package[nova-compute]: Scheduling refresh of Anchor[keystone::service::end]
Info: /Stage[main]/Nova::Compute/Nova::Generic_service[compute]/Package[nova-compute]: Scheduling refresh of Anchor[nova::install::end]
Notice: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]/ensure: created
Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]: Scheduling refresh of Service[cinder-api]
Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]: Scheduling refresh of Exec[cinder-manage db_sync]
Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]: Scheduling refresh of Service[cinder-scheduler]
Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]: Scheduling refresh of Service[cinder-volume]
Notice: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_virtual_host]/ensure: created
Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_virtual_host]: Scheduling refresh of Service[cinder-api]
Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_virtual_host]: Scheduling refresh of Exec[cinder-manage db_sync]
Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_virtual_host]: Scheduling refresh of Service[cinder-scheduler]
Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_virtual_host]: Scheduling refresh of Service[cinder-volume]
Notice: /Stage[main]/Cinder::Setup_test_volume/Exec[create_/var/lib/cinder/cinder-volumes]/returns: executed successfully
Info: /Stage[main]/Cinder::Setup_test_volume/Exec[create_/var/lib/cinder/cinder-volumes]: Scheduling refresh of Exec[losetup /dev/loop2 /var/lib/cinder/cinder-volumes]
Notice: /Stage[main]/Cinder::Setup_test_volume/Exec[losetup /dev/loop2 /var/lib/cinder/cinder-volumes]: Triggered 'refresh' from 1 events
Info: /Stage[main]/Cinder::Setup_test_volume/Exec[losetup /dev/loop2 /var/lib/cinder/cinder-volumes]: Scheduling refresh of Exec[pvcreate /dev/loop2]
Notice: /Stage[main]/Cinder::Setup_test_volume/Exec[pvcreate /dev/loop2]: Triggered 'refresh' from 1 events
Info: /Stage[main]/Cinder::Setup_test_volume/Exec[pvcreate /dev/loop2]: Scheduling refresh of Exec[vgcreate cinder-volumes /dev/loop2]
Notice: /Stage[main]/Cinder::Setup_test_volume/Exec[vgcreate cinder-volumes /dev/loop2]: Triggered 'refresh' from 1 events
Notice: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_version]/ensure: created
Info: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_version]: Scheduling refresh of Service[cinder-api]
Info: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_version]: Scheduling refresh of Exec[cinder-manage db_sync]
Info: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_version]: Scheduling refresh of Service[cinder-scheduler]
Info: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_version]: Scheduling refresh of Service[cinder-volume]
Notice: /Stage[main]/Neutron::Agents::Dhcp/Neutron_dhcp_agent_config[DEFAULT/state_path]/ensure: created
Info: /Stage[main]/Neutron::Agents::Dhcp/Neutron_dhcp_agent_config[DEFAULT/state_path]: Scheduling refresh of Service[neutron-dhcp-service]
Notice: /Stage[main]/Glance::Registry/Glance_registry_config[keystone_authtoken/identity_uri]/ensure: created
Info: /Stage[main]/Glance::Registry/Glance_registry_config[keystone_authtoken/identity_uri]: Scheduling refresh of Service[glance-registry]
Info: /Stage[main]/Glance::Registry/Glance_registry_config[keystone_authtoken/identity_uri]: Scheduling refresh of Exec[glance-manage db_sync]
Notice: /Stage[main]/Glance::Api/Glance_cache_config[DEFAULT/registry_host]/ensure: created
Info: /Stage[main]/Glance::Api/Glance_cache_config[DEFAULT/registry_host]: Scheduling refresh of Service[glance-api]
Info: /Stage[main]/Glance::Api/Glance_cache_config[DEFAULT/registry_host]: Scheduling refresh of Exec[glance-manage db_sync]
Notice: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Service[target]/ensure: ensure changed 'stopped' to 'running'
Info: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Service[target]: Unscheduling refresh on Service[target]
Notice: /Stage[main]/Swift/Package[swift]/ensure: created
Info: /Stage[main]/Swift/Package[swift]: Scheduling refresh of Anchor[keystone::service::end]
Notice: /Stage[main]/Swift/File[/var/lib/swift]/group: group changed 'root' to 'swift'
Notice: /Stage[main]/Swift/File[/etc/swift]/owner: owner changed 'root' to 'swift'
Notice: /Stage[main]/Swift/File[/etc/swift]/group: group changed 'root' to 'swift'
Notice: /Stage[main]/Swift/File[/etc/swift/swift.conf]/owner: owner changed 'root' to 'swift'
Notice: /Stage[main]/Swift/Swift_config[swift-constraints/max_header_size]/ensure: created
Info: /Stage[main]/Swift/Swift_config[swift-constraints/max_header_size]: Scheduling refresh of Service[swift-proxy-server]
Info: /Stage[main]/Swift/Swift_config[swift-constraints/max_header_size]: Scheduling refresh of Service[swift-account-reaper]
Info: /Stage[main]/Swift/Swift_config[swift-constraints/max_header_size]: Scheduling refresh of Service[swift-container-updater]
Info: /Stage[main]/Swift/Swift_config[swift-constraints/max_header_size]: Scheduling refresh of Service[swift-object-updater]
Info: /Stage[main]/Swift/Swift_config[swift-constraints/max_header_size]: Scheduling refresh of Service[swift-account-server]
Info: /Stage[main]/Swift/Swift_config[swift-constraints/max_header_size]: Scheduling refresh of Service[swift-account-auditor]
Info: /Stage[main]/Swift/Swift_config[swift-constraints/max_header_size]: Scheduling refresh of Service[swift-account-replicator]
Info: /Stage[main]/Swift/Swift_config[swift-constraints/max_header_size]: Scheduling refresh of Service[swift-container-server]
Info: /Stage[main]/Swift/Swift_config[swift-constraints/max_header_size]: Scheduling refresh of Service[swift-container-auditor]
Info: /Stage[main]/Swift/Swift_config[swift-constraints/max_header_size]: Scheduling refresh of Service[swift-container-replicator]
Info: /Stage[main]/Swift/Swift_config[swift-constraints/max_header_size]: Scheduling refresh of Service[swift-object-server]
Info: /Stage[main]/Swift/Swift_config[swift-constraints/max_header_size]: Scheduling refresh of Service[swift-object-auditor]
Info: /Stage[main]/Swift/Swift_config[swift-constraints/max_header_size]: Scheduling refresh of Service[swift-object-replicator]
Notice: /Stage[main]/Openstack_integration::Swift/File[/srv/node]/ensure: created
Notice: /Stage[main]/Swift/Swift_config[swift-hash/swift_hash_path_suffix]/value: value changed '%SWIFT_HASH_PATH_SUFFIX%' to 'secrete'
Info: /Stage[main]/Swift/Swift_config[swift-hash/swift_hash_path_suffix]: Scheduling refresh of Service[swift-proxy-server]
Info: /Stage[main]/Swift/Swift_config[swift-hash/swift_hash_path_suffix]: Scheduling refresh of Service[swift-account-reaper]
Info: /Stage[main]/Swift/Swift_config[swift-hash/swift_hash_path_suffix]: Scheduling refresh of Service[swift-container-updater]
Info: /Stage[main]/Swift/Swift_config[swift-hash/swift_hash_path_suffix]: Scheduling refresh of Service[swift-object-updater]
Info: /Stage[main]/Swift/Swift_config[swift-hash/swift_hash_path_suffix]: Scheduling refresh of Service[swift-account-server]
Info: /Stage[main]/Swift/Swift_config[swift-hash/swift_hash_path_suffix]: Scheduling refresh of Service[swift-account-auditor]
Info: /Stage[main]/Swift/Swift_config[swift-hash/swift_hash_path_suffix]: Scheduling refresh of Service[swift-account-replicator]
Info: /Stage[main]/Swift/Swift_config[swift-hash/swift_hash_path_suffix]: Scheduling refresh of Service[swift-container-server]
Info: /Stage[main]/Swift/Swift_config[swift-hash/swift_hash_path_suffix]: Scheduling refresh of Service[swift-container-auditor]
Info: /Stage[main]/Swift/Swift_config[swift-hash/swift_hash_path_suffix]: Scheduling refresh of Service[swift-container-replicator]
Info: /Stage[main]/Swift/Swift_config[swift-hash/swift_hash_path_suffix]: Scheduling refresh of Service[swift-object-server]
Info: /Stage[main]/Swift/Swift_config[swift-hash/swift_hash_path_suffix]: Scheduling refresh of Service[swift-object-auditor]
Info: /Stage[main]/Swift/Swift_config[swift-hash/swift_hash_path_suffix]: Scheduling refresh of Service[swift-object-replicator]
Notice: /Stage[main]/Swift/File[/var/run/swift]/group: group changed 'root' to 'swift'
Notice: /Stage[main]/Swift::Ringbuilder/Swift::Ringbuilder::Create[container]/Exec[create_container]/returns: executed successfully
Notice: /Stage[main]/Swift::Ringbuilder/Swift::Ringbuilder::Create[object]/Exec[create_object]/returns: executed successfully
Notice: /Stage[main]/Openstack_integration::Swift/Ring_object_device[127.0.0.1:6000/3]/ensure: created
Info: /Stage[main]/Openstack_integration::Swift/Ring_object_device[127.0.0.1:6000/3]: Scheduling refresh of Swift::Ringbuilder::Rebalance[object]
Notice: /Stage[main]/Openstack_integration::Swift/Ring_container_device[127.0.0.1:6001/3]/ensure: created
Info: /Stage[main]/Openstack_integration::Swift/Ring_container_device[127.0.0.1:6001/3]: Scheduling refresh of Swift::Ringbuilder::Rebalance[container]
Notice: /Stage[main]/Openstack_integration::Swift/Ring_container_device[127.0.0.1:6001/2]/ensure: created
Info: /Stage[main]/Openstack_integration::Swift/Ring_container_device[127.0.0.1:6001/2]: Scheduling refresh of Swift::Ringbuilder::Rebalance[container]
Notice: /Stage[main]/Openstack_integration::Swift/Ring_object_device[127.0.0.1:6000/2]/ensure: created
Info: /Stage[main]/Openstack_integration::Swift/Ring_object_device[127.0.0.1:6000/2]: Scheduling refresh of Swift::Ringbuilder::Rebalance[object]
Notice: /Stage[main]/Neutron::Db::Mysql/Openstacklib::Db::Mysql[neutron]/Openstacklib::Db::Mysql::Host_access[neutron_127.0.0.1]/Mysql_grant[neutron@127.0.0.1/neutron.*]/ensure: created
Info: Openstacklib::Db::Mysql[neutron]: Scheduling refresh of Service[neutron-server]
Info: Openstacklib::Db::Mysql[neutron]: Scheduling refresh of Exec[neutron-db-sync]
Notice: /Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/debug]/ensure: created
Info: /Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/debug]: Scheduling refresh of Service[glance-registry]
Info: /Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/debug]: Scheduling refresh of Exec[glance-manage db_sync]
Notice: /Stage[main]/Glance::Api/Glance_api_config[glance_store/stores]/ensure: created
Info: /Stage[main]/Glance::Api/Glance_api_config[glance_store/stores]: Scheduling refresh of Service[glance-api]
Info: /Stage[main]/Glance::Api/Glance_api_config[glance_store/stores]: Scheduling refresh of Exec[glance-manage db_sync]
Notice: /Stage[main]/Neutron::Plugins::Ml2/Package[neutron-plugin-ml2]/ensure: created
Info: /Stage[main]/Neutron::Plugins::Ml2/Package[neutron-plugin-ml2]: Scheduling refresh of Anchor[keystone::service::end]
Notice: /Stage[main]/Neutron::Plugins::Ml2/Neutron::Plugins::Ml2::Type_driver[vxlan]/Neutron_plugin_ml2[ml2_type_vxlan/vni_ranges]/ensure: created
Info: /Stage[main]/Neutron::Plugins::Ml2/Neutron::Plugins::Ml2::Type_driver[vxlan]/Neutron_plugin_ml2[ml2_type_vxlan/vni_ranges]: Scheduling refresh of Service[neutron-server]
Info: /Stage[main]/Neutron::Plugins::Ml2/Neutron::Plugins::Ml2::Type_driver[vxlan]/Neutron_plugin_ml2[ml2_type_vxlan/vni_ranges]: Scheduling refresh of Exec[neutron-db-sync]
Notice: /Stage[main]/Neutron::Plugins::Ml2/Neutron_plugin_ml2[ml2/type_drivers]/ensure: created
Info: /Stage[main]/Neutron::Plugins::Ml2/Neutron_plugin_ml2[ml2/type_drivers]: Scheduling refresh of Service[neutron-server]
Info: /Stage[main]/Neutron::Plugins::Ml2/Neutron_plugin_ml2[ml2/type_drivers]: Scheduling refresh of Exec[neutron-db-sync]
Notice: /Stage[main]/Neutron::Plugins::Ml2/Neutron_plugin_ml2[ml2/path_mtu]/ensure: created
Info: /Stage[main]/Neutron::Plugins::Ml2/Neutron_plugin_ml2[ml2/path_mtu]: Scheduling refresh of Service[neutron-server]
Info: /Stage[main]/Neutron::Plugins::Ml2/Neutron_plugin_ml2[ml2/path_mtu]: Scheduling refresh of Exec[neutron-db-sync]
Notice: /Stage[main]/Neutron::Plugins::Ml2/Neutron_plugin_ml2[ml2/tenant_network_types]/ensure: created
Info: /Stage[main]/Neutron::Plugins::Ml2/Neutron_plugin_ml2[ml2/tenant_network_types]: Scheduling refresh of Service[neutron-server]
Info: /Stage[main]/Neutron::Plugins::Ml2/Neutron_plugin_ml2[ml2/tenant_network_types]: Scheduling refresh of Exec[neutron-db-sync]
Notice: /Stage[main]/Neutron::Plugins::Ml2/Neutron_plugin_ml2[ml2/mechanism_drivers]/ensure: created
Info: /Stage[main]/Neutron::Plugins::Ml2/Neutron_plugin_ml2[ml2/mechanism_drivers]: Scheduling refresh of Service[neutron-server]
Info: /Stage[main]/Neutron::Plugins::Ml2/Neutron_plugin_ml2[ml2/mechanism_drivers]: Scheduling refresh of Exec[neutron-db-sync]
Notice: /Stage[main]/Neutron::Plugins::Ml2/File[/etc/default/neutron-server]/ensure: created
Notice: /Stage[main]/Neutron::Plugins::Ml2/File[/etc/neutron/plugin.ini]/ensure: created
Notice: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/nova_catalog_admin_info]/ensure: created
Info: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/nova_catalog_admin_info]: Scheduling refresh of Service[cinder-api]
Info: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/nova_catalog_admin_info]: Scheduling refresh of Exec[cinder-manage db_sync]
Info: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/nova_catalog_admin_info]: Scheduling refresh of Service[cinder-scheduler]
Info: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/nova_catalog_admin_info]: Scheduling refresh of Service[cinder-volume]
Notice: /Stage[main]/Ironic/Ironic_config[DEFAULT/control_exchange]/ensure: created
Info: /Stage[main]/Ironic/Ironic_config[DEFAULT/control_exchange]: Scheduling refresh of Service[httpd]
Info: /Stage[main]/Ironic/Ironic_config[DEFAULT/control_exchange]: Scheduling refresh of Service[ironic-conductor]
Notice: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_password]/ensure: created
Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_password]: Scheduling refresh of Service[cinder-api]
Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_password]: Scheduling refresh of Exec[cinder-manage db_sync]
Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_password]: Scheduling refresh of Service[cinder-scheduler]
Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_password]: Scheduling refresh of Service[cinder-volume]
Notice: /Stage[main]/Glance::Cache::Logging/Glance_cache_config[DEFAULT/debug]/ensure: created
Info: /Stage[main]/Glance::Cache::Logging/Glance_cache_config[DEFAULT/debug]: Scheduling refresh of Service[glance-api]
Info: /Stage[main]/Glance::Cache::Logging/Glance_cache_config[DEFAULT/debug]: Scheduling refresh of Exec[glance-manage db_sync]
Notice: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]/ensure: created
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]: Scheduling refresh of Service[neutron-server]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]: Scheduling refresh of Exec[neutron-db-sync]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]: Scheduling refresh of Service[neutron-metadata]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]: Scheduling refresh of Service[neutron-lbaas-service]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]: Scheduling refresh of Service[neutron-l3]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]: Scheduling refresh of Service[neutron-dhcp-service]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]: Scheduling refresh of Service[neutron-metering-service]
Notice: /Stage[main]/Openstack_integration::Swift/Ring_container_device[127.0.0.1:6001/1]/ensure: created
Info: /Stage[main]/Openstack_integration::Swift/Ring_container_device[127.0.0.1:6001/1]: Scheduling refresh of Swift::Ringbuilder::Rebalance[container]
Info: Swift::Ringbuilder::Rebalance[container]: Scheduling refresh of Exec[rebalance_container]
Notice: /Stage[main]/Swift::Ringbuilder/Swift::Ringbuilder::Rebalance[container]/Exec[rebalance_container]: Triggered 'refresh' from 1 events
Notice: /Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/verbose]/ensure: created
Info: /Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/verbose]: Scheduling refresh of Service[glance-api]
Info: /Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/verbose]: Scheduling refresh of Exec[glance-manage db_sync]
Notice: /Stage[main]/Ironic::Logging/Ironic_config[DEFAULT/verbose]/ensure: created
Info: /Stage[main]/Ironic::Logging/Ironic_config[DEFAULT/verbose]: Scheduling refresh of Service[httpd]
Info: /Stage[main]/Ironic::Logging/Ironic_config[DEFAULT/verbose]: Scheduling refresh of Service[ironic-conductor]
Notice: /Stage[main]/Cinder::Db::Mysql/Openstacklib::Db::Mysql[cinder]/Openstacklib::Db::Mysql::Host_access[cinder_127.0.0.1]/Mysql_user[cinder@127.0.0.1]/ensure: created
Notice: /Stage[main]/Cinder::Db::Mysql/Openstacklib::Db::Mysql[cinder]/Openstacklib::Db::Mysql::Host_access[cinder_127.0.0.1]/Mysql_grant[cinder@127.0.0.1/cinder.*]/ensure: created
Info: Openstacklib::Db::Mysql[cinder]: Scheduling refresh of Exec[cinder-manage db_sync]
Notice: /Stage[main]/Nova::Conductor/Nova::Generic_service[conductor]/Package[nova-conductor]/ensure: created
Info: /Stage[main]/Nova::Conductor/Nova::Generic_service[conductor]/Package[nova-conductor]: Scheduling refresh of Anchor[keystone::service::end]
Info: /Stage[main]/Nova::Conductor/Nova::Generic_service[conductor]/Package[nova-conductor]: Scheduling refresh of Anchor[nova::install::end]
Notice: /Stage[main]/Openstack_integration::Swift/Ring_object_device[127.0.0.1:6000/1]/ensure: created
Info: /Stage[main]/Openstack_integration::Swift/Ring_object_device[127.0.0.1:6000/1]: Scheduling refresh of Swift::Ringbuilder::Rebalance[object]
Info: Swift::Ringbuilder::Rebalance[object]: Scheduling refresh of Exec[rebalance_object]
Notice: /Stage[main]/Swift::Ringbuilder/Swift::Ringbuilder::Rebalance[object]/Exec[rebalance_object]: Triggered 'refresh' from 1 events
Notice: /Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/verbose]/ensure: created
Info: /Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/verbose]: Scheduling refresh of Service[glance-registry]
Info: /Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/verbose]: Scheduling refresh of Exec[glance-manage db_sync]
Notice: /Stage[main]/Ironic::Conductor/Package[ironic-conductor]/ensure: created
Info: /Stage[main]/Ironic::Conductor/Package[ironic-conductor]: Scheduling refresh of Anchor[keystone::service::end]
Info: /Stage[main]/Ironic::Conductor/Package[ironic-conductor]: Scheduling refresh of Exec[ironic-dbsync]
Notice: /Stage[main]/Ironic::Conductor/Ironic_config[conductor/max_time_interval]/ensure: created
Info: /Stage[main]/Ironic::Conductor/Ironic_config[conductor/max_time_interval]: Scheduling refresh of Service[httpd]
Info: /Stage[main]/Ironic::Conductor/Ironic_config[conductor/max_time_interval]: Scheduling refresh of Service[ironic-conductor]
Notice: /Stage[main]/Ironic::Conductor/Ironic_config[conductor/force_power_state_during_sync]/ensure: created
Info: /Stage[main]/Ironic::Conductor/Ironic_config[conductor/force_power_state_during_sync]: Scheduling refresh of Service[httpd]
Info: /Stage[main]/Ironic::Conductor/Ironic_config[conductor/force_power_state_during_sync]: Scheduling refresh of Service[ironic-conductor]
Notice: /Stage[main]/Nova::Vncproxy/Nova::Generic_service[vncproxy]/Package[nova-vncproxy]/ensure: created
Info: /Stage[main]/Nova::Vncproxy/Nova::Generic_service[vncproxy]/Package[nova-vncproxy]: Scheduling refresh of Anchor[keystone::service::end]
Info: /Stage[main]/Nova::Vncproxy/Nova::Generic_service[vncproxy]/Package[nova-vncproxy]: Scheduling refresh of Anchor[nova::install::end]
Notice: /Stage[main]/Cinder/Cinder_config[DEFAULT/rpc_backend]/ensure: created
Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/rpc_backend]: Scheduling refresh of Service[cinder-api]
Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/rpc_backend]: Scheduling refresh of Exec[cinder-manage db_sync]
Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/rpc_backend]: Scheduling refresh of Service[cinder-scheduler]
Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/rpc_backend]: Scheduling refresh of Service[cinder-volume]
Notice: /Stage[main]/Ironic/Ironic_config[glance/glance_num_retries]/ensure: created
Info: /Stage[main]/Ironic/Ironic_config[glance/glance_num_retries]: Scheduling refresh of Service[httpd]
Info: /Stage[main]/Ironic/Ironic_config[glance/glance_num_retries]: Scheduling refresh of Service[ironic-conductor]
Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat[/etc/swift/container-server.conf]/File[/var/lib/puppet/concat/_etc_swift_container-server.conf]/ensure: created
Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat[/etc/swift/container-server.conf]/File[/var/lib/puppet/concat/_etc_swift_container-server.conf]: Scheduling refresh of Exec[concat_/etc/swift/container-server.conf]
Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat[/etc/swift/container-server.conf]/File[/var/lib/puppet/concat/_etc_swift_container-server.conf/fragments]/ensure: created
Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat[/etc/swift/container-server.conf]/File[/var/lib/puppet/concat/_etc_swift_container-server.conf/fragments]: Scheduling refresh of Exec[concat_/etc/swift/container-server.conf]
Notice: /Stage[main]/Openstack_integration::Swift/Swift::Storage::Filter::Recon[container]/Concat::Fragment[swift_recon_container]/File[/var/lib/puppet/concat/_etc_swift_container-server.conf/fragments/35_swift_recon_container]/ensure: defined content as '{md5}d847d2d529a3596ed6a74d841d790dc7'
Info: /Stage[main]/Openstack_integration::Swift/Swift::Storage::Filter::Recon[container]/Concat::Fragment[swift_recon_container]/File[/var/lib/puppet/concat/_etc_swift_container-server.conf/fragments/35_swift_recon_container]: Scheduling refresh of Exec[concat_/etc/swift/container-server.conf]
Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat[/etc/swift/container-server.conf]/File[/var/lib/puppet/concat/_etc_swift_container-server.conf/fragments.concat]/ensure: created
Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat[/etc/swift/account-server.conf]/File[/var/lib/puppet/concat/_etc_swift_account-server.conf]/ensure: created
Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat[/etc/swift/account-server.conf]/File[/var/lib/puppet/concat/_etc_swift_account-server.conf]: Scheduling refresh of Exec[concat_/etc/swift/account-server.conf]
Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat[/etc/swift/account-server.conf]/File[/var/lib/puppet/concat/_etc_swift_account-server.conf/fragments.concat]/ensure: created
Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat[/etc/swift/account-server.conf]/File[/var/lib/puppet/concat/_etc_swift_account-server.conf/fragments.concat.out]/ensure: created
Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat[/etc/swift/account-server.conf]/File[/var/lib/puppet/concat/_etc_swift_account-server.conf/fragments]/ensure: created
Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat[/etc/swift/account-server.conf]/File[/var/lib/puppet/concat/_etc_swift_account-server.conf/fragments]: Scheduling refresh of Exec[concat_/etc/swift/account-server.conf]
Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat::Fragment[swift-account-6002]/File[/var/lib/puppet/concat/_etc_swift_account-server.conf/fragments/00_swift-account-6002]/ensure: defined content as '{md5}666661f3805b49b4682cc11f80dad508'
Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat::Fragment[swift-account-6002]/File[/var/lib/puppet/concat/_etc_swift_account-server.conf/fragments/00_swift-account-6002]: Scheduling refresh of Exec[concat_/etc/swift/account-server.conf]
Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat[/etc/swift/container-server.conf]/File[/var/lib/puppet/concat/_etc_swift_container-server.conf/fragments.concat.out]/ensure: created
Notice: /Stage[main]/Tempest/Tempest_config[service_broker/run_service_broker_tests]/ensure: created
Notice: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/iscsi_helper]/ensure: created
Info: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/iscsi_helper]: Scheduling refresh of Service[cinder-api]
Info: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/iscsi_helper]: Scheduling refresh of Exec[cinder-manage db_sync]
Info: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/iscsi_helper]: Scheduling refresh of Service[cinder-scheduler]
Info: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/iscsi_helper]: Scheduling refresh of Service[cinder-volume]
Notice: /Stage[main]/Tempest/Tempest_config[DEFAULT/debug]/ensure: created
Notice: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[DEFAULT/notification_driver]/ensure: created
Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[DEFAULT/notification_driver]: Scheduling refresh of Service[glance-api]
Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[DEFAULT/notification_driver]: Scheduling refresh of Exec[glance-manage db_sync]
Notice: /Stage[main]/Neutron::Agents::Metering/Neutron_metering_agent_config[DEFAULT/interface_driver]/ensure: created
Info: /Stage[main]/Neutron::Agents::Metering/Neutron_metering_agent_config[DEFAULT/interface_driver]: Scheduling refresh of Service[neutron-metering-service]
Notice: /Stage[main]/Rsync::Server/Concat[/etc/rsync.conf]/File[/var/lib/puppet/concat/_etc_rsync.conf]/ensure: created
Info: /Stage[main]/Rsync::Server/Concat[/etc/rsync.conf]/File[/var/lib/puppet/concat/_etc_rsync.conf]: Scheduling refresh of Exec[concat_/etc/rsync.conf]
Notice: /Stage[main]/Rsync::Server/Concat[/etc/rsync.conf]/File[/var/lib/puppet/concat/_etc_rsync.conf/fragments.concat.out]/ensure: created
Notice: /Stage[main]/Rsync::Server/Concat[/etc/rsync.conf]/File[/var/lib/puppet/concat/_etc_rsync.conf/fragments]/ensure: created
Info: /Stage[main]/Rsync::Server/Concat[/etc/rsync.conf]/File[/var/lib/puppet/concat/_etc_rsync.conf/fragments]: Scheduling refresh of Exec[concat_/etc/rsync.conf]
Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Rsync::Server::Module[container]/Concat::Fragment[frag-container]/File[/var/lib/puppet/concat/_etc_rsync.conf/fragments/10_container_frag-container]/ensure: defined content as '{md5}7dd5f706fbeccaf9a45d40737af512ac'
Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Rsync::Server::Module[container]/Concat::Fragment[frag-container]/File[/var/lib/puppet/concat/_etc_rsync.conf/fragments/10_container_frag-container]: Scheduling refresh of Exec[concat_/etc/rsync.conf]
Notice: /Stage[main]/Tempest/Tempest_config[service_available/neutron]/ensure: created
Notice: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/identity_uri]/ensure: created
Info: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/identity_uri]: Scheduling refresh of Service[cinder-api]
Info: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/identity_uri]: Scheduling refresh of Exec[cinder-manage db_sync]
Info: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/identity_uri]: Scheduling refresh of Service[cinder-scheduler]
Info: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/identity_uri]: Scheduling refresh of Service[cinder-volume]
Notice: /Stage[main]/Glance::Backend::Swift/Glance_swift_config[ref1/key]/ensure: created
Notice: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_userid]/ensure: created
Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_userid]: Scheduling refresh of Service[cinder-api]
Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_userid]: Scheduling refresh of Exec[cinder-manage db_sync]
Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_userid]: Scheduling refresh of Service[cinder-scheduler]
Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_userid]: Scheduling refresh of Service[cinder-volume]
Notice: /Stage[main]/Ironic::Api/Ironic_config[keystone_authtoken/auth_uri]/ensure: created
Info: /Stage[main]/Ironic::Api/Ironic_config[keystone_authtoken/auth_uri]: Scheduling refresh of Service[httpd]
Info: /Stage[main]/Ironic::Api/Ironic_config[keystone_authtoken/auth_uri]: Scheduling refresh of Service[ironic-conductor]
Notice: /Stage[main]/Ironic::Api/Package[ironic-api]/ensure: created
Info: /Stage[main]/Ironic::Api/Package[ironic-api]: Scheduling refresh of Anchor[keystone::service::end]
Info: /Stage[main]/Ironic::Api/Package[ironic-api]: Scheduling refresh of Exec[ironic-dbsync]
Notice: /Stage[main]/Ironic::Api/Ironic_config[keystone_authtoken/admin_tenant_name]/ensure: created
Info: /Stage[main]/Ironic::Api/Ironic_config[keystone_authtoken/admin_tenant_name]: Scheduling refresh of Service[httpd]
Info: /Stage[main]/Ironic::Api/Ironic_config[keystone_authtoken/admin_tenant_name]: Scheduling refresh of Service[ironic-conductor]
Notice: /Stage[main]/Ironic::Api/Ironic_config[api/host_ip]/ensure: created
Info: /Stage[main]/Ironic::Api/Ironic_config[api/host_ip]: Scheduling refresh of Service[httpd]
Info: /Stage[main]/Ironic::Api/Ironic_config[api/host_ip]: Scheduling refresh of Service[ironic-conductor]
Notice: /Stage[main]/Ironic::Api/Ironic_config[keystone_authtoken/admin_password]/ensure: created
Info: /Stage[main]/Ironic::Api/Ironic_config[keystone_authtoken/admin_password]: Scheduling refresh of Service[httpd]
Info: /Stage[main]/Ironic::Api/Ironic_config[keystone_authtoken/admin_password]: Scheduling refresh of Service[ironic-conductor]
Notice: /Stage[main]/Ironic::Api/Ironic_config[keystone_authtoken/identity_uri]/ensure: created
Info: /Stage[main]/Ironic::Api/Ironic_config[keystone_authtoken/identity_uri]: Scheduling refresh of Service[httpd]
Info: /Stage[main]/Ironic::Api/Ironic_config[keystone_authtoken/identity_uri]: Scheduling refresh of Service[ironic-conductor]
Notice: /Stage[main]/Ironic::Api/Ironic_config[neutron/url]/ensure: created
Info: /Stage[main]/Ironic::Api/Ironic_config[neutron/url]: Scheduling refresh of Service[httpd]
Info: /Stage[main]/Ironic::Api/Ironic_config[neutron/url]: Scheduling refresh of Service[ironic-conductor]
Notice: /Stage[main]/Ironic::Api/Ironic_config[keystone_authtoken/admin_user]/ensure: created
Info: /Stage[main]/Ironic::Api/Ironic_config[keystone_authtoken/admin_user]: Scheduling refresh of Service[httpd]
Info: /Stage[main]/Ironic::Api/Ironic_config[keystone_authtoken/admin_user]: Scheduling refresh of Service[ironic-conductor]
Notice: /Stage[main]/Ironic::Api/Ironic_config[api/max_limit]/ensure: created
Info: /Stage[main]/Ironic::Api/Ironic_config[api/max_limit]: Scheduling refresh of Service[httpd]
Info: /Stage[main]/Ironic::Api/Ironic_config[api/max_limit]: Scheduling refresh of Service[ironic-conductor]
Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat[/etc/swift/object-server.conf]/File[/var/lib/puppet/concat/_etc_swift_object-server.conf]/ensure: created
Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat[/etc/swift/object-server.conf]/File[/var/lib/puppet/concat/_etc_swift_object-server.conf]: Scheduling refresh of Exec[concat_/etc/swift/object-server.conf]
Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat[/etc/swift/object-server.conf]/File[/var/lib/puppet/concat/_etc_swift_object-server.conf/fragments]/ensure: created
Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat[/etc/swift/object-server.conf]/File[/var/lib/puppet/concat/_etc_swift_object-server.conf/fragments]: Scheduling refresh of Exec[concat_/etc/swift/object-server.conf]
Notice: /Stage[main]/Openstack_integration::Swift/Swift::Storage::Filter::Healthcheck[object]/Concat::Fragment[swift_healthcheck_object]/File[/var/lib/puppet/concat/_etc_swift_object-server.conf/fragments/25_swift_healthcheck_object]/ensure: defined content as '{md5}8c92056c41082619d179f88ea15c5fc6'
Info: /Stage[main]/Openstack_integration::Swift/Swift::Storage::Filter::Healthcheck[object]/Concat::Fragment[swift_healthcheck_object]/File[/var/lib/puppet/concat/_etc_swift_object-server.conf/fragments/25_swift_healthcheck_object]: Scheduling refresh of Exec[concat_/etc/swift/object-server.conf]
Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat[/etc/swift/object-server.conf]/File[/var/lib/puppet/concat/_etc_swift_object-server.conf/fragments.concat.out]/ensure: created
Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat::Fragment[swift-object-6000]/File[/var/lib/puppet/concat/_etc_swift_object-server.conf/fragments/00_swift-object-6000]/ensure: defined content as '{md5}f5bb62f4798612b143fc441befa50ecc'
Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat::Fragment[swift-object-6000]/File[/var/lib/puppet/concat/_etc_swift_object-server.conf/fragments/00_swift-object-6000]: Scheduling refresh of Exec[concat_/etc/swift/object-server.conf]
Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Rsync::Server::Module[object]/Concat::Fragment[frag-object]/File[/var/lib/puppet/concat/_etc_rsync.conf/fragments/10_object_frag-object]/ensure: defined content as '{md5}d0ecd24502eb0f9cd5c387b2e1e32943'
Info: 
/Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Rsync::Server::Module[object]/Concat::Fragment[frag-object]/File[/var/lib/puppet/concat/_etc_rsync.conf/fragments/10_object_frag-object]: Scheduling refresh of Exec[concat_/etc/rsync.conf]\nNotice: /Stage[main]/Swift::Ringbuilder/Swift::Ringbuilder::Create[account]/Exec[create_account]/returns: executed successfully\nNotice: /Stage[main]/Openstack_integration::Swift/Ring_account_device[127.0.0.1:6002/1]/ensure: created\nInfo: /Stage[main]/Openstack_integration::Swift/Ring_account_device[127.0.0.1:6002/1]: Scheduling refresh of Swift::Ringbuilder::Rebalance[account]\nNotice: /Stage[main]/Openstack_integration::Swift/Ring_account_device[127.0.0.1:6002/3]/ensure: created\nInfo: /Stage[main]/Openstack_integration::Swift/Ring_account_device[127.0.0.1:6002/3]: Scheduling refresh of Swift::Ringbuilder::Rebalance[account]\nNotice: /Stage[main]/Openstack_integration::Swift/Ring_account_device[127.0.0.1:6002/2]/ensure: created\nInfo: /Stage[main]/Openstack_integration::Swift/Ring_account_device[127.0.0.1:6002/2]: Scheduling refresh of Swift::Ringbuilder::Rebalance[account]\nInfo: Swift::Ringbuilder::Rebalance[account]: Scheduling refresh of Exec[rebalance_account]\nNotice: /Stage[main]/Swift::Ringbuilder/Swift::Ringbuilder::Rebalance[account]/Exec[rebalance_account]: Triggered 'refresh' from 1 events\nNotice: /Stage[main]/Ironic::Logging/Ironic_config[DEFAULT/debug]/ensure: created\nInfo: /Stage[main]/Ironic::Logging/Ironic_config[DEFAULT/debug]: Scheduling refresh of Service[httpd]\nInfo: /Stage[main]/Ironic::Logging/Ironic_config[DEFAULT/debug]: Scheduling refresh of Service[ironic-conductor]\nNotice: /Stage[main]/Keystone::Client/Package[python-keystoneclient]/ensure: created\nInfo: /Stage[main]/Keystone::Client/Package[python-keystoneclient]: Scheduling refresh of Anchor[keystone::service::end]\nNotice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat::Fragment[swift-container-6001]/File[/var/lib/puppet/concat/_etc_swift_container-server.conf/fragments/00_swift-container-6001]/ensure: defined content as '{md5}26d25a9fa3702760a9fc42a4a2bd22c2'\nInfo: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat::Fragment[swift-container-6001]/File[/var/lib/puppet/concat/_etc_swift_container-server.conf/fragments/00_swift-container-6001]: Scheduling refresh of Exec[concat_/etc/swift/container-server.conf]\nNotice: /Stage[main]/Openstack_integration::Swift/Swift::Storage::Filter::Healthcheck[account]/Concat::Fragment[swift_healthcheck_account]/File[/var/lib/puppet/concat/_etc_swift_account-server.conf/fragments/25_swift_healthcheck_account]/ensure: defined content as '{md5}8c92056c41082619d179f88ea15c5fc6'\nInfo: /Stage[main]/Openstack_integration::Swift/Swift::Storage::Filter::Healthcheck[account]/Concat::Fragment[swift_healthcheck_account]/File[/var/lib/puppet/concat/_etc_swift_account-server.conf/fragments/25_swift_healthcheck_account]: Scheduling refresh of Exec[concat_/etc/swift/account-server.conf]\nNotice: /Stage[main]/Neutron::Agents::Ml2::Ovs/Package[neutron-ovs-agent]/ensure: created\nInfo: /Stage[main]/Neutron::Agents::Ml2::Ovs/Package[neutron-ovs-agent]: Scheduling refresh of Anchor[keystone::service::end]\nInfo: /Stage[main]/Neutron::Agents::Ml2::Ovs/Package[neutron-ovs-agent]: Scheduling refresh of Exec[neutron-db-sync]\nInfo: /Stage[main]/Neutron::Agents::Ml2::Ovs/Package[neutron-ovs-agent]: Scheduling refresh of Service[neutron-ovs-agent-service]\nNotice: 
/Stage[main]/Neutron::Agents::Ml2::Ovs/Neutron_agent_ovs[ovs/local_ip]/ensure: created\nInfo: /Stage[main]/Neutron::Agents::Ml2::Ovs/Neutron_agent_ovs[ovs/local_ip]: Scheduling refresh of Service[neutron-ovs-agent-service]\nNotice: /Stage[main]/Neutron::Agents::Ml2::Ovs/Neutron_agent_ovs[ovs/tunnel_bridge]/ensure: created\nInfo: /Stage[main]/Neutron::Agents::Ml2::Ovs/Neutron_agent_ovs[ovs/tunnel_bridge]: Scheduling refresh of Service[neutron-ovs-agent-service]\nNotice: /Stage[main]/Neutron::Agents::Ml2::Ovs/Neutron_agent_ovs[ovs/enable_tunneling]/ensure: created\nInfo: /Stage[main]/Neutron::Agents::Ml2::Ovs/Neutron_agent_ovs[ovs/enable_tunneling]: Scheduling refresh of Service[neutron-ovs-agent-service]\nNotice: /Stage[main]/Neutron::Agents::Ml2::Ovs/Neutron_agent_ovs[ovs/integration_bridge]/ensure: created\nInfo: /Stage[main]/Neutron::Agents::Ml2::Ovs/Neutron_agent_ovs[ovs/integration_bridge]: Scheduling refresh of Service[neutron-ovs-agent-service]\nNotice: /Stage[main]/Neutron::Agents::Ml2::Ovs/Neutron_agent_ovs[securitygroup/firewall_driver]/ensure: created\nInfo: /Stage[main]/Neutron::Agents::Ml2::Ovs/Neutron_agent_ovs[securitygroup/firewall_driver]: Scheduling refresh of Service[neutron-ovs-agent-service]\nNotice: /Stage[main]/Neutron::Agents::Ml2::Ovs/Service[ovs-cleanup-service]/enable: enable changed 'false' to 'true'\nNotice: /Stage[main]/Neutron::Agents::Ml2::Ovs/Neutron_agent_ovs[agent/drop_flows_on_start]/ensure: created\nInfo: /Stage[main]/Neutron::Agents::Ml2::Ovs/Neutron_agent_ovs[agent/drop_flows_on_start]: Scheduling refresh of Service[neutron-ovs-agent-service]\nNotice: /Stage[main]/Neutron::Agents::Ml2::Ovs/Neutron_agent_ovs[agent/vxlan_udp_port]/ensure: created\nInfo: /Stage[main]/Neutron::Agents::Ml2::Ovs/Neutron_agent_ovs[agent/vxlan_udp_port]: Scheduling refresh of Service[neutron-ovs-agent-service]\nNotice: /Stage[main]/Neutron::Agents::L3/Neutron_l3_agent_config[DEFAULT/agent_mode]/ensure: created\nInfo: /Stage[main]/Neutron::Agents::L3/Neutron_l3_agent_config[DEFAULT/agent_mode]: Scheduling refresh of Service[neutron-l3]\nNotice: /Stage[main]/Neutron::Agents::L3/Neutron_l3_agent_config[DEFAULT/interface_driver]/ensure: created\nInfo: /Stage[main]/Neutron::Agents::L3/Neutron_l3_agent_config[DEFAULT/interface_driver]: Scheduling refresh of Service[neutron-l3]\nNotice: /Stage[main]/Neutron::Agents::L3/Neutron_l3_agent_config[DEFAULT/debug]/ensure: created\nInfo: /Stage[main]/Neutron::Agents::L3/Neutron_l3_agent_config[DEFAULT/debug]: Scheduling refresh of Service[neutron-l3]\nNotice: /Stage[main]/Neutron::Agents::Metadata/Neutron_metadata_agent_config[DEFAULT/metadata_proxy_shared_secret]/ensure: created\nInfo: /Stage[main]/Neutron::Agents::Metadata/Neutron_metadata_agent_config[DEFAULT/metadata_proxy_shared_secret]: Scheduling refresh of Service[neutron-metadata]\nNotice: /Stage[main]/Ironic::Db::Mysql/Openstacklib::Db::Mysql[ironic]/Mysql_database[ironic]/ensure: created\nNotice: /Stage[main]/Neutron::Plugins::Ml2/Neutron::Plugins::Ml2::Type_driver[vxlan]/Neutron_plugin_ml2[ml2_type_vxlan/vxlan_group]/ensure: created\nInfo: /Stage[main]/Neutron::Plugins::Ml2/Neutron::Plugins::Ml2::Type_driver[vxlan]/Neutron_plugin_ml2[ml2_type_vxlan/vxlan_group]: Scheduling refresh of Service[neutron-server]\nInfo: /Stage[main]/Neutron::Plugins::Ml2/Neutron::Plugins::Ml2::Type_driver[vxlan]/Neutron_plugin_ml2[ml2_type_vxlan/vxlan_group]: Scheduling refresh of Exec[neutron-db-sync]\nNotice: /Stage[main]/Tempest/Tempest_config[DEFAULT/verbose]/ensure: created\nNotice: 
/Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/user_domain_id]/ensure: created\nInfo: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/user_domain_id]: Scheduling refresh of Service[neutron-server]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/user_domain_id]: Scheduling refresh of Exec[neutron-db-sync]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/user_domain_id]: Scheduling refresh of Service[neutron-metadata]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/user_domain_id]: Scheduling refresh of Service[neutron-lbaas-service]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/user_domain_id]: Scheduling refresh of Service[neutron-l3]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/user_domain_id]: Scheduling refresh of Service[neutron-dhcp-service]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/user_domain_id]: Scheduling refresh of Service[neutron-metering-service]\nNotice: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_url]/ensure: created\nInfo: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_url]: Scheduling refresh of Service[neutron-server]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_url]: Scheduling refresh of Exec[neutron-db-sync]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_url]: Scheduling refresh of Service[neutron-metadata]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_url]: Scheduling refresh of Service[neutron-lbaas-service]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_url]: Scheduling refresh of Service[neutron-l3]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_url]: Scheduling refresh of Service[neutron-dhcp-service]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_url]: Scheduling refresh of Service[neutron-metering-service]\nNotice: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/admin_user]/ensure: created\nInfo: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/admin_user]: Scheduling refresh of Service[cinder-api]\nInfo: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/admin_user]: Scheduling refresh of Exec[cinder-manage db_sync]\nInfo: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/admin_user]: Scheduling refresh of Service[cinder-scheduler]\nInfo: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/admin_user]: Scheduling refresh of Service[cinder-volume]\nNotice: /Stage[main]/Ironic::Api/Ironic_config[api/port]/ensure: created\nInfo: /Stage[main]/Ironic::Api/Ironic_config[api/port]: Scheduling refresh of Service[httpd]\nInfo: /Stage[main]/Ironic::Api/Ironic_config[api/port]: Scheduling refresh of Service[ironic-conductor]\nNotice: /Stage[main]/Rsync::Server/Xinetd::Service[rsync]/File[/etc/xinetd.d/rsync]/ensure: created\nInfo: /Stage[main]/Rsync::Server/Xinetd::Service[rsync]/File[/etc/xinetd.d/rsync]: Scheduling refresh of Service[xinetd]\nNotice: /Stage[main]/Xinetd/Service[xinetd]/ensure: ensure changed 'stopped' to 'running'\nInfo: /Stage[main]/Xinetd/Service[xinetd]: Unscheduling refresh on Service[xinetd]\nInfo: Openstacklib::Db::Mysql[glance]: Scheduling refresh of Exec[glance-manage db_sync]\nNotice: 
/Stage[main]/Openstack_integration::Swift/Swift::Storage::Filter::Recon[account]/Concat::Fragment[swift_recon_account]/File[/var/lib/puppet/concat/_etc_swift_account-server.conf/fragments/35_swift_recon_account]/ensure: defined content as '{md5}d847d2d529a3596ed6a74d841d790dc7'\nInfo: /Stage[main]/Openstack_integration::Swift/Swift::Storage::Filter::Recon[account]/Concat::Fragment[swift_recon_account]/File[/var/lib/puppet/concat/_etc_swift_account-server.conf/fragments/35_swift_recon_account]: Scheduling refresh of Exec[concat_/etc/swift/account-server.conf]\nNotice: /Stage[main]/Glance::Registry/Glance_registry_config[keystone_authtoken/admin_tenant_name]/ensure: created\nInfo: /Stage[main]/Glance::Registry/Glance_registry_config[keystone_authtoken/admin_tenant_name]: Scheduling refresh of Service[glance-registry]\nInfo: /Stage[main]/Glance::Registry/Glance_registry_config[keystone_authtoken/admin_tenant_name]: Scheduling refresh of Exec[glance-manage db_sync]\nNotice: /Stage[main]/Ironic::Db::Mysql/Openstacklib::Db::Mysql[ironic]/Openstacklib::Db::Mysql::Host_access[ironic_127.0.0.1]/Mysql_user[ironic@127.0.0.1]/ensure: created\nNotice: /Stage[main]/Ironic::Db::Mysql/Openstacklib::Db::Mysql[ironic]/Openstacklib::Db::Mysql::Host_access[ironic_127.0.0.1]/Mysql_grant[ironic@127.0.0.1/ironic.*]/ensure: created\nInfo: Openstacklib::Db::Mysql[ironic]: Scheduling refresh of Exec[ironic-dbsync]\nNotice: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/swift_store_container]/ensure: created\nInfo: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/swift_store_container]: Scheduling refresh of Service[glance-api]\nInfo: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/swift_store_container]: Scheduling refresh of Exec[glance-manage db_sync]\nNotice: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/project_domain_id]/ensure: created\nInfo: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/project_domain_id]: Scheduling refresh of Service[neutron-server]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/project_domain_id]: Scheduling refresh of Exec[neutron-db-sync]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/project_domain_id]: Scheduling refresh of Service[neutron-metadata]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/project_domain_id]: Scheduling refresh of Service[neutron-lbaas-service]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/project_domain_id]: Scheduling refresh of Service[neutron-l3]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/project_domain_id]: Scheduling refresh of Service[neutron-dhcp-service]\nInfo: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/project_domain_id]: Scheduling refresh of Service[neutron-metering-service]\nNotice: /Stage[main]/Apache/Package[httpd]/ensure: created\nInfo: /Stage[main]/Apache/Package[httpd]: Scheduling refresh of Class[Apache::Service]\nNotice: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/File[/var/www/cgi-bin/nova]/ensure: created\nNotice: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/File[/var/www/cgi-bin/ironic]/ensure: created\nNotice: /Stage[main]/Apache::Mod::Wsgi/Apache::Mod[wsgi]/Package[mod_wsgi]/ensure: created\nNotice: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/File[nova_api_wsgi]/ensure: defined content as 
'{md5}87dec420e9b6e707b94b149f1432bad2'\nNotice: /Stage[main]/Apache::Default_mods/Apache::Mod[auth_digest]/File[auth_digest.load]/ensure: defined content as '{md5}df9e85f8da0b239fe8e698ae7ead4f60'\nInfo: /Stage[main]/Apache::Default_mods/Apache::Mod[auth_digest]/File[auth_digest.load]: Scheduling refresh of Class[Apache::Service]\nNotice: /Stage[main]/Apache::Mod::Mime/Apache::Mod[mime]/File[mime.load]/ensure: defined content as '{md5}e36257b9efab01459141d423cae57c7c'\nInfo: /Stage[main]/Apache::Mod::Mime/Apache::Mod[mime]/File[mime.load]: Scheduling refresh of Class[Apache::Service]\nNotice: /Stage[main]/Apache::Default_mods/Apache::Mod[expires]/File[expires.load]/ensure: defined content as '{md5}f0825bad1e470de86ffabeb86dcc5d95'\nInfo: /Stage[main]/Apache::Default_mods/Apache::Mod[expires]/File[expires.load]: Scheduling refresh of Class[Apache::Service]\nNotice: /Stage[main]/Apache::Mod::Dav/Apache::Mod[dav]/File[dav.load]/ensure: defined content as '{md5}588e496251838c4840c14b28b5aa7881'\nInfo: /Stage[main]/Apache::Mod::Dav/Apache::Mod[dav]/File[dav.load]: Scheduling refresh of Class[Apache::Service]\nNotice: /Stage[main]/Apache::Default_mods/Apache::Mod[authz_owner]/File[authz_owner.load]/ensure: defined content as '{md5}f30a9be1016df87f195449d9e02d1857'\nInfo: /Stage[main]/Apache::Default_mods/Apache::Mod[authz_owner]/File[authz_owner.load]: Scheduling refresh of Class[Apache::Service]\nNotice: /Stage[main]/Apache::Default_mods/Apache::Mod[authz_groupfile]/File[authz_groupfile.load]/ensure: defined content as '{md5}ae005a36b3ac8c20af36c434561c8a75'\nInfo: /Stage[main]/Apache::Default_mods/Apache::Mod[authz_groupfile]/File[authz_groupfile.load]: Scheduling refresh of Class[Apache::Service]\nNotice: /Stage[main]/Apache::Default_mods/Apache::Mod[authn_dbm]/File[authn_dbm.load]/ensure: defined content as '{md5}90ee8f8ef1a017cacadfda4225e10651'\nInfo: /Stage[main]/Apache::Default_mods/Apache::Mod[authn_dbm]/File[authn_dbm.load]: Scheduling refresh of Class[Apache::Service]\nNotice: /Stage[main]/Apache::Mod::Authn_core/Apache::Mod[authn_core]/File[authn_core.load]/ensure: defined content as '{md5}704d6e8b02b0eca0eba4083960d16c52'\nInfo: /Stage[main]/Apache::Mod::Authn_core/Apache::Mod[authn_core]/File[authn_core.load]: Scheduling refresh of Class[Apache::Service]\nNotice: /Stage[main]/Apache::Mod::Authz_user/Apache::Mod[authz_user]/File[authz_user.load]/ensure: defined content as '{md5}63594303ee808423679b1ea13dd5a784'\nInfo: /Stage[main]/Apache::Mod::Authz_user/Apache::Mod[authz_user]/File[authz_user.load]: Scheduling refresh of Class[Apache::Service]\nNotice: /Stage[main]/Apache::Default_mods/Apache::Mod[log_config]/File[log_config.load]/ensure: defined content as '{md5}785d35cb285e190d589163b45263ca89'\nInfo: /Stage[main]/Apache::Default_mods/Apache::Mod[log_config]/File[log_config.load]: Scheduling refresh of Class[Apache::Service]\nNotice: /Stage[main]/Apache::Default_mods/Apache::Mod[logio]/File[logio.load]/ensure: defined content as '{md5}084533c7a44e9129d0e6df952e2472b6'\nInfo: /Stage[main]/Apache::Default_mods/Apache::Mod[logio]/File[logio.load]: Scheduling refresh of Class[Apache::Service]\nNotice: /Stage[main]/Apache::Mod::Ssl/Apache::Mod[socache_shmcb]/File[socache_shmcb.load]/ensure: defined content as '{md5}ab31a6ea611785f74851b578572e4157'\nInfo: /Stage[main]/Apache::Mod::Ssl/Apache::Mod[socache_shmcb]/File[socache_shmcb.load]: Scheduling refresh of Class[Apache::Service]\nNotice: /Stage[main]/Apache::Mod::Mime/File[mime.conf]/ensure: defined content as 
'{md5}9da85e58f3bd6c780ce76db603b7f028'\nInfo: /Stage[main]/Apache::Mod::Mime/File[mime.conf]: Scheduling refresh of Class[Apache::Service]\nNotice: /Stage[main]/Apache::Default_mods/Apache::Mod[access_compat]/File[access_compat.load]/ensure: defined content as '{md5}d5feb88bec4570e2dbc41cce7e0de003'\nInfo: /Stage[main]/Apache::Default_mods/Apache::Mod[access_compat]/File[access_compat.load]: Scheduling refresh of Class[Apache::Service]\nNotice: /Stage[main]/Apache::Mod::Version/Apache::Mod[version]/File[version.load]/ensure: defined content as '{md5}1c9243de22ace4dc8266442c48ae0c92'\nInfo: /Stage[main]/Apache::Mod::Version/Apache::Mod[version]/File[version.load]: Scheduling refresh of Class[Apache::Service]\nNotice: /Stage[main]/Apache::Mod::Setenvif/File[setenvif.conf]/ensure: defined content as '{md5}c7ede4173da1915b7ec088201f030c28'\nInfo: /Stage[main]/Apache::Mod::Setenvif/File[setenvif.conf]: Scheduling refresh of Class[Apache::Service]\nNotice: /Stage[main]/Apache::Mod::Actions/Apache::Mod[actions]/File[actions.load]/ensure: defined content as '{md5}599866dfaf734f60f7e2d41ee8235515'\nInfo: /Stage[main]/Apache::Mod::Actions/Apache::Mod[actions]/File[actions.load]: Scheduling refresh of Class[Apache::Service]\nNotice: /Stage[main]/Apache::Mod::Deflate/File[deflate.conf]/ensure: defined content as '{md5}a045d750d819b1e9dae3fbfb3f20edd5'\nInfo: /Stage[main]/Apache::Mod::Deflate/File[deflate.conf]: Scheduling refresh of Class[Apache::Service]\nNotice: /Stage[main]/Apache::Default_mods/Apache::Mod[authz_core]/File[authz_core.load]/ensure: defined content as '{md5}39942569bff2abdb259f9a347c7246bc'\nInfo: /Stage[main]/Apache::Default_mods/Apache::Mod[authz_core]/File[authz_core.load]: Scheduling refresh of Class[Apache::Service]\nNotice: /Stage[main]/Apache::Mod::Negotiation/File[negotiation.conf]/ensure: defined content as '{md5}47284b5580b986a6ba32580b6ffb9fd7'\nInfo: /Stage[main]/Apache::Mod::Negotiation/File[negotiation.conf]: Scheduling refresh of Class[Apache::Service]\nNotice: /Stage[main]/Apache::Mod::Alias/Apache::Mod[alias]/File[alias.load]/ensure: defined content as '{md5}3cf2fa309ccae4c29a4b875d0894cd79'\nInfo: /Stage[main]/Apache::Mod::Alias/Apache::Mod[alias]/File[alias.load]: Scheduling refresh of Class[Apache::Service]\nNotice: /Stage[main]/Apache::Default_mods/Apache::Mod[env]/File[env.load]/ensure: defined content as '{md5}d74184d40d0ee24ba02626a188ee7e1a'\nInfo: /Stage[main]/Apache::Default_mods/Apache::Mod[env]/File[env.load]: Scheduling refresh of Class[Apache::Service]\nNotice: /Stage[main]/Apache::Mod::Negotiation/Apache::Mod[negotiation]/File[negotiation.load]/ensure: defined content as '{md5}d262ee6a5f20d9dd7f87770638dc2ccd'\nInfo: /Stage[main]/Apache::Mod::Negotiation/Apache::Mod[negotiation]/File[negotiation.load]: Scheduling refresh of Class[Apache::Service]\nNotice: /Stage[main]/Apache::Default_mods/Apache::Mod[authz_dbm]/File[authz_dbm.load]/ensure: defined content as '{md5}c1363277984d22f99b70f7dce8753b60'\nInfo: /Stage[main]/Apache::Default_mods/Apache::Mod[authz_dbm]/File[authz_dbm.load]: Scheduling refresh of Class[Apache::Service]\nNotice: /Stage[main]/Apache::Mod::Dir/File[dir.conf]/ensure: defined content as '{md5}c741d8ea840e6eb999d739eed47c69d7'\nInfo: /Stage[main]/Apache::Mod::Dir/File[dir.conf]: Scheduling refresh of Class[Apache::Service]\nNotice: /Stage[main]/Apache::Default_mods/Apache::Mod[usertrack]/File[usertrack.load]/ensure: defined content as '{md5}e95fbbf030fabec98b948f8dc217775c'\nInfo: 
/Stage[main]/Apache::Default_mods/Apache::Mod[usertrack]/File[usertrack.load]: Scheduling refresh of Class[Apache::Service]\nNotice: /Stage[main]/Apache::Mod::Prefork/File[/etc/httpd/conf.modules.d/prefork.conf]/ensure: defined content as '{md5}109c4f51dac10fc1b39373855e566d01'\nInfo: /Stage[main]/Apache::Mod::Prefork/File[/etc/httpd/conf.modules.d/prefork.conf]: Scheduling refresh of Class[Apache::Service]\nNotice: /Stage[main]/Apache::Mod::Vhost_alias/Apache::Mod[vhost_alias]/File[vhost_alias.load]/ensure: defined content as '{md5}eca907865997d50d5130497665c3f82e'\nInfo: /Stage[main]/Apache::Mod::Vhost_alias/Apache::Mod[vhost_alias]/File[vhost_alias.load]: Scheduling refresh of Class[Apache::Service]\nNotice: /Stage[main]/Apache::Default_mods/Apache::Mod[substitute]/File[substitute.load]/ensure: defined content as '{md5}8077c34a71afcf41c8fc644830935915'\nInfo: /Stage[main]/Apache::Default_mods/Apache::Mod[substitute]/File[substitute.load]: Scheduling refresh of Class[Apache::Service]\nNotice: /Stage[main]/Apache::Mod::Setenvif/Apache::Mod[setenvif]/File[setenvif.load]/ensure: defined content as '{md5}ec6c99f7cc8e35bdbcf8028f652c9f6d'\nInfo: /Stage[main]/Apache::Mod::Setenvif/Apache::Mod[setenvif]/File[setenvif.load]: Scheduling refresh of Class[Apache::Service]\nNotice: /Stage[main]/Apache::Default_mods/Apache::Mod[unixd]/File[unixd.load]/ensure: defined content as '{md5}0e8468ecc1265f8947b8725f4d1be9c0'\nInfo: /Stage[main]/Apache::Default_mods/Apache::Mod[unixd]/File[unixd.load]: Scheduling refresh of Class[Apache::Service]\nNotice: /Stage[main]/Apache::Mod::Deflate/Apache::Mod[deflate]/File[deflate.load]/ensure: defined content as '{md5}2d1a1afcae0c70557251829a8586eeaf'\nInfo: /Stage[main]/Apache::Mod::Deflate/Apache::Mod[deflate]/File[deflate.load]: Scheduling refresh of Class[Apache::Service]\nNotice: /Stage[main]/Apache::Mod::Wsgi/Apache::Mod[wsgi]/File[wsgi.load]/ensure: defined content as '{md5}e1795e051e7aae1f865fde0d3b86a507'\nInfo: /Stage[main]/Apache::Mod::Wsgi/Apache::Mod[wsgi]/File[wsgi.load]: Scheduling refresh of Class[Apache::Service]\nNotice: /Stage[main]/Apache::Default_mods/Apache::Mod[auth_basic]/File[auth_basic.load]/ensure: defined content as '{md5}494bcf4b843f7908675d663d8dc1bdc8'\nInfo: /Stage[main]/Apache::Default_mods/Apache::Mod[auth_basic]/File[auth_basic.load]: Scheduling refresh of Class[Apache::Service]\nNotice: /Stage[main]/Apache::Mod::Prefork/Apache::Mpm[prefork]/File[/etc/httpd/conf.modules.d/prefork.load]/ensure: defined content as '{md5}157529aafcf03fa491bc924103e4608e'\nInfo: /Stage[main]/Apache::Mod::Prefork/Apache::Mpm[prefork]/File[/etc/httpd/conf.modules.d/prefork.load]: Scheduling refresh of Class[Apache::Service]\nNotice: /Stage[main]/Apache::Mod::Authn_file/Apache::Mod[authn_file]/File[authn_file.load]/ensure: defined content as '{md5}d41656680003d7b890267bb73621c60b'\nInfo: /Stage[main]/Apache::Mod::Authn_file/Apache::Mod[authn_file]/File[authn_file.load]: Scheduling refresh of Class[Apache::Service]\nNotice: /Stage[main]/Apache::Mod::Ext_filter/Apache::Mod[ext_filter]/File[ext_filter.load]/ensure: defined content as '{md5}76d5e0ac3411a4be57ac33ebe2e52ac8'\nInfo: /Stage[main]/Apache::Mod::Ext_filter/Apache::Mod[ext_filter]/File[ext_filter.load]: Scheduling refresh of Class[Apache::Service]\nNotice: /Stage[main]/Apache::Mod::Wsgi/File[wsgi.conf]/ensure: defined content as '{md5}8b3feb3fc2563de439920bb2c52cbd11'\nInfo: /Stage[main]/Apache::Mod::Wsgi/File[wsgi.conf]: Scheduling refresh of Class[Apache::Service]\nNotice: 
/Stage[main]/Apache::Mod::Speling/Apache::Mod[speling]/File[speling.load]/ensure: defined content as '{md5}f82e9e6b871a276c324c9eeffcec8a61'\nInfo: /Stage[main]/Apache::Mod::Speling/Apache::Mod[speling]/File[speling.load]: Scheduling refresh of Class[Apache::Service]\nNotice: /Stage[main]/Apache::Mod::Dir/Apache::Mod[dir]/File[dir.load]/ensure: defined content as '{md5}1bfb1c2a46d7351fc9eb47c659dee068'\nInfo: /Stage[main]/Apache::Mod::Dir/Apache::Mod[dir]/File[dir.load]: Scheduling refresh of Class[Apache::Service]\nNotice: /Stage[main]/Apache::Mod::Dav_fs/Apache::Mod[dav_fs]/File[dav_fs.load]/ensure: defined content as '{md5}2996277c73b1cd684a9a3111c355e0d3'\nInfo: /Stage[main]/Apache::Mod::Dav_fs/Apache::Mod[dav_fs]/File[dav_fs.load]: Scheduling refresh of Class[Apache::Service]\nNotice: /Stage[main]/Apache::Default_mods/Apache::Mod[include]/File[include.load]/ensure: defined content as '{md5}88095a914eedc3c2c184dd5d74c3954c'\nInfo: /Stage[main]/Apache::Default_mods/Apache::Mod[include]/File[include.load]: Scheduling refresh of Class[Apache::Service]\nNotice: /Stage[main]/Apache::Default_mods/Apache::Mod[systemd]/File[systemd.load]/ensure: defined content as '{md5}26e5d44aae258b3e9d821cbbbd3e2826'\nInfo: /Stage[main]/Apache::Default_mods/Apache::Mod[systemd]/File[systemd.load]: Scheduling refresh of Class[Apache::Service]\nNotice: /Stage[main]/Apache::Mod::Alias/File[alias.conf]/ensure: defined content as '{md5}983e865be85f5e0daaed7433db82995e'\nInfo: /Stage[main]/Apache::Mod::Alias/File[alias.conf]: Scheduling refresh of Class[Apache::Service]\nNotice: /Stage[main]/Apache::Mod::Ssl/Apache::Mod[ssl]/Package[mod_ssl]/ensure: created\nNotice: /Stage[main]/Apache::Mod::Ssl/File[ssl.conf]/ensure: defined content as '{md5}8884ea33793365e0784cfd43be72464e'\nInfo: /Stage[main]/Apache::Mod::Ssl/File[ssl.conf]: Scheduling refresh of Class[Apache::Service]\nNotice: /Stage[main]/Apache::Mod::Ssl/Apache::Mod[ssl]/File[ssl.load]/ensure: defined content as '{md5}e282ac9f82fe5538692a4de3616fb695'\nInfo: /Stage[main]/Apache::Mod::Ssl/Apache::Mod[ssl]/File[ssl.load]: Scheduling refresh of Class[Apache::Service]\nNotice: /Stage[main]/Apache::Default_mods/Apache::Mod[authz_host]/File[authz_host.load]/ensure: defined content as '{md5}d1045f54d2798499ca0f030ca0eef920'\nInfo: /Stage[main]/Apache::Default_mods/Apache::Mod[authz_host]/File[authz_host.load]: Scheduling refresh of Class[Apache::Service]\nNotice: /Stage[main]/Apache::Mod::Suexec/Apache::Mod[suexec]/File[suexec.load]/ensure: defined content as '{md5}c7d5c61c534ba423a79b0ae78ff9be35'\nInfo: /Stage[main]/Apache::Mod::Suexec/Apache::Mod[suexec]/File[suexec.load]: Scheduling refresh of Class[Apache::Service]\nNotice: /Stage[main]/Apache::Mod::Cache/Apache::Mod[cache]/File[cache.load]/ensure: defined content as '{md5}01e4d392225b518a65b0f7d6c4e21d29'\nInfo: /Stage[main]/Apache::Mod::Cache/Apache::Mod[cache]/File[cache.load]: Scheduling refresh of Class[Apache::Service]\nNotice: /Stage[main]/Apache::Mod::Rewrite/Apache::Mod[rewrite]/File[rewrite.load]/ensure: defined content as '{md5}26e2683352fc1599f29573ff0d934e79'\nInfo: /Stage[main]/Apache::Mod::Rewrite/Apache::Mod[rewrite]/File[rewrite.load]: Scheduling refresh of Class[Apache::Service]\nInfo: /Stage[main]/Apache/File[/etc/httpd/conf.d/autoindex.conf]: Filebucketed /etc/httpd/conf.d/autoindex.conf to puppet with sum 09726332c2fd6fc73a57fbe69fc10427\nNotice: /Stage[main]/Apache/File[/etc/httpd/conf.d/autoindex.conf]/ensure: removed\nInfo: /Stage[main]/Apache/File[/etc/httpd/conf.d/userdir.conf]: 
Filebucketed /etc/httpd/conf.d/userdir.conf to puppet with sum d4a2620683cc3ff2315c685f9f354265\nNotice: /Stage[main]/Apache/File[/etc/httpd/conf.d/userdir.conf]/ensure: removed\nInfo: /Stage[main]/Apache/File[/etc/httpd/conf.d/ssl.conf]: Filebucketed /etc/httpd/conf.d/ssl.conf to puppet with sum 1888b608773b45f4acea3604eccf3562\nNotice: /Stage[main]/Apache/File[/etc/httpd/conf.d/ssl.conf]/ensure: removed\nInfo: /Stage[main]/Apache/File[/etc/httpd/conf.d/welcome.conf]: Filebucketed /etc/httpd/conf.d/welcome.conf to puppet with sum 9d1328b985d0851eb5bc610da6122f44\nNotice: /Stage[main]/Apache/File[/etc/httpd/conf.d/welcome.conf]/ensure: removed\nInfo: /Stage[main]/Apache/File[/etc/httpd/conf.d/README]: Filebucketed /etc/httpd/conf.d/README to puppet with sum 20b886e8496027dcbc31ed28d404ebb1\nNotice: /Stage[main]/Apache/File[/etc/httpd/conf.d/README]/ensure: removed\nNotice: /Stage[main]/Apache::Mod::Autoindex/Apache::Mod[autoindex]/File[autoindex.load]/ensure: defined content as '{md5}515cdf5b573e961a60d2931d39248648'\nInfo: /Stage[main]/Apache::Mod::Autoindex/Apache::Mod[autoindex]/File[autoindex.load]: Scheduling refresh of Class[Apache::Service]\nNotice: /Stage[main]/Apache::Default_mods/Apache::Mod[authn_anon]/File[authn_anon.load]/ensure: defined content as '{md5}bf57b94b5aec35476fc2a2dc3861f132'\nInfo: /Stage[main]/Apache::Default_mods/Apache::Mod[authn_anon]/File[authn_anon.load]: Scheduling refresh of Class[Apache::Service]\nNotice: /Stage[main]/Apache::Mod::Autoindex/File[autoindex.conf]/ensure: defined content as '{md5}2421a3c6df32c7e38c2a7a22afdf5728'\nInfo: /Stage[main]/Apache::Mod::Autoindex/File[autoindex.conf]: Scheduling refresh of Class[Apache::Service]\nNotice: /Stage[main]/Apache/Concat[/etc/httpd/conf/ports.conf]/File[/var/lib/puppet/concat/_etc_httpd_conf_ports.conf]/ensure: created\nInfo: /Stage[main]/Apache/Concat[/etc/httpd/conf/ports.conf]/File[/var/lib/puppet/concat/_etc_httpd_conf_ports.conf]: Scheduling refresh of Exec[concat_/etc/httpd/conf/ports.conf]\nNotice: /Stage[main]/Apache/Concat[/etc/httpd/conf/ports.conf]/File[/var/lib/puppet/concat/_etc_httpd_conf_ports.conf/fragments.concat.out]/ensure: created\nNotice: /Stage[main]/Apache/Concat[/etc/httpd/conf/ports.conf]/File[/var/lib/puppet/concat/_etc_httpd_conf_ports.conf/fragments]/ensure: created\nInfo: /Stage[main]/Apache/Concat[/etc/httpd/conf/ports.conf]/File[/var/lib/puppet/concat/_etc_httpd_conf_ports.conf/fragments]: Scheduling refresh of Exec[concat_/etc/httpd/conf/ports.conf]\nNotice: /Stage[main]/Apache/Concat::Fragment[Apache ports header]/File[/var/lib/puppet/concat/_etc_httpd_conf_ports.conf/fragments/10_Apache ports header]/ensure: defined content as '{md5}afe35cb5747574b700ebaa0f0b3a626e'\nInfo: /Stage[main]/Apache/Concat::Fragment[Apache ports header]/File[/var/lib/puppet/concat/_etc_httpd_conf_ports.conf/fragments/10_Apache ports header]: Scheduling refresh of Exec[concat_/etc/httpd/conf/ports.conf]\nNotice: /Stage[main]/Apache/Concat[/etc/httpd/conf/ports.conf]/File[/var/lib/puppet/concat/_etc_httpd_conf_ports.conf/fragments.concat]/ensure: created\nNotice: /Stage[main]/Apache::Mod::Filter/Apache::Mod[filter]/File[filter.load]/ensure: defined content as '{md5}66a1e2064a140c3e7dca7ac33877700e'\nInfo: /Stage[main]/Apache::Mod::Filter/Apache::Mod[filter]/File[filter.load]: Scheduling refresh of Class[Apache::Service]\nNotice: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/File[ironic_wsgi]/ensure: defined content as '{md5}77ef07cc957e05e2024c75ef82d6fbbd'\nNotice: 
/Stage[main]/Cinder::Volume/Cinder_config[DEFAULT/volume_clear]/ensure: created\nInfo: /Stage[main]/Cinder::Volume/Cinder_config[DEFAULT/volume_clear]: Scheduling refresh of Service[cinder-api]\nInfo: /Stage[main]/Cinder::Volume/Cinder_config[DEFAULT/volume_clear]: Scheduling refresh of Exec[cinder-manage db_sync]\nInfo: /Stage[main]/Cinder::Volume/Cinder_config[DEFAULT/volume_clear]: Scheduling refresh of Service[cinder-scheduler]\nInfo: /Stage[main]/Cinder::Volume/Cinder_config[DEFAULT/volume_clear]: Scheduling refresh of Service[cinder-volume]\nNotice: /Stage[main]/Openstack_integration::Swift/Swift::Storage::Filter::Recon[object]/Concat::Fragment[swift_recon_object]/File[/var/lib/puppet/concat/_etc_swift_object-server.conf/fragments/35_swift_recon_object]/ensure: defined content as '{md5}d847d2d529a3596ed6a74d841d790dc7'\nInfo: /Stage[main]/Openstack_integration::Swift/Swift::Storage::Filter::Recon[object]/Concat::Fragment[swift_recon_object]/File[/var/lib/puppet/concat/_etc_swift_object-server.conf/fragments/35_swift_recon_object]: Scheduling refresh of Exec[concat_/etc/swift/object-server.conf]\nNotice: /Stage[main]/Rsync::Server/Concat::Fragment[rsyncd_conf_header]/File[/var/lib/puppet/concat/_etc_rsync.conf/fragments/00_header_rsyncd_conf_header]/ensure: defined content as '{md5}3a2ab53ad81bbfc64ceb17fb3a7efee0'\nInfo: /Stage[main]/Rsync::Server/Concat::Fragment[rsyncd_conf_header]/File[/var/lib/puppet/concat/_etc_rsync.conf/fragments/00_header_rsyncd_conf_header]: Scheduling refresh of Exec[concat_/etc/rsync.conf]\nNotice: /Stage[main]/Glance::Registry/Glance_registry_config[keystone_authtoken/admin_user]/ensure: created\nInfo: /Stage[main]/Glance::Registry/Glance_registry_config[keystone_authtoken/admin_user]: Scheduling refresh of Service[glance-registry]\nInfo: /Stage[main]/Glance::Registry/Glance_registry_config[keystone_authtoken/admin_user]: Scheduling refresh of Exec[glance-manage db_sync]\nNotice: /Stage[main]/Mongodb::Client::Install/Package[mongodb_client]/ensure: created\nNotice: /Stage[main]/Nova::Scheduler/Nova::Generic_service[scheduler]/Package[nova-scheduler]/ensure: created\nInfo: /Stage[main]/Nova::Scheduler/Nova::Generic_service[scheduler]/Package[nova-scheduler]: Scheduling refresh of Anchor[keystone::service::end]\nInfo: /Stage[main]/Nova::Scheduler/Nova::Generic_service[scheduler]/Package[nova-scheduler]: Scheduling refresh of Anchor[nova::install::end]\nNotice: /Stage[main]/Neutron::Db/Neutron_config[database/connection]/ensure: created\nInfo: /Stage[main]/Neutron::Db/Neutron_config[database/connection]: Scheduling refresh of Service[neutron-server]\nInfo: /Stage[main]/Neutron::Db/Neutron_config[database/connection]: Scheduling refresh of Exec[neutron-db-sync]\nInfo: /Stage[main]/Neutron::Db/Neutron_config[database/connection]: Scheduling refresh of Exec[neutron-db-sync]\nInfo: /Stage[main]/Neutron::Db/Neutron_config[database/connection]: Scheduling refresh of Service[neutron-metadata]\nInfo: /Stage[main]/Neutron::Db/Neutron_config[database/connection]: Scheduling refresh of Service[neutron-lbaas-service]\nInfo: /Stage[main]/Neutron::Db/Neutron_config[database/connection]: Scheduling refresh of Service[neutron-l3]\nInfo: /Stage[main]/Neutron::Db/Neutron_config[database/connection]: Scheduling refresh of Service[neutron-dhcp-service]\nInfo: /Stage[main]/Neutron::Db/Neutron_config[database/connection]: Scheduling refresh of Service[neutron-metering-service]\nNotice: /Stage[main]/Glance::Registry::Db/Glance_registry_config[database/connection]/ensure: 
created\nInfo: /Stage[main]/Glance::Registry::Db/Glance_registry_config[database/connection]: Scheduling refresh of Service[glance-registry]\nInfo: /Stage[main]/Glance::Registry::Db/Glance_registry_config[database/connection]: Scheduling refresh of Exec[glance-manage db_sync]\nNotice: /Stage[main]/Ironic::Db/Ironic_config[database/connection]/ensure: created\nInfo: /Stage[main]/Ironic::Db/Ironic_config[database/connection]: Scheduling refresh of Exec[ironic-dbsync]\nInfo: /Stage[main]/Ironic::Db/Ironic_config[database/connection]: Scheduling refresh of Service[httpd]\nInfo: /Stage[main]/Ironic::Db/Ironic_config[database/connection]: Scheduling refresh of Service[ironic-conductor]\nNotice: /Stage[main]/Glance::Api::Db/Glance_api_config[database/connection]/ensure: created\nInfo: /Stage[main]/Glance::Api::Db/Glance_api_config[database/connection]: Scheduling refresh of Service[glance-api]\nInfo: /Stage[main]/Glance::Api::Db/Glance_api_config[database/connection]: Scheduling refresh of Exec[glance-manage db_sync]\nNotice: /Stage[main]/Cinder::Db/Cinder_config[database/connection]/ensure: created\nInfo: /Stage[main]/Cinder::Db/Cinder_config[database/connection]: Scheduling refresh of Service[cinder-api]\nInfo: /Stage[main]/Cinder::Db/Cinder_config[database/connection]: Scheduling refresh of Exec[cinder-manage db_sync]\nInfo: /Stage[main]/Cinder::Db/Cinder_config[database/connection]: Scheduling refresh of Exec[cinder-manage db_sync]\nInfo: /Stage[main]/Cinder::Db/Cinder_config[database/connection]: Scheduling refresh of Service[cinder-scheduler]\nInfo: /Stage[main]/Cinder::Db/Cinder_config[database/connection]: Scheduling refresh of Service[cinder-volume]\nNotice: /Stage[main]/Ironic::Db::Sync/Exec[ironic-dbsync]: Triggered 'refresh' from 5 events\nInfo: /Stage[main]/Ironic::Db::Sync/Exec[ironic-dbsync]: Scheduling refresh of Service[ironic-api]\nInfo: /Stage[main]/Ironic::Db::Sync/Exec[ironic-dbsync]: Scheduling refresh of Service[ironic-conductor]\nNotice: /Stage[main]/Apache::Mod::Mime_magic/File[mime_magic.conf]/ensure: defined content as '{md5}b258529b332429e2ff8344f726a95457'\nInfo: /Stage[main]/Apache::Mod::Mime_magic/File[mime_magic.conf]: Scheduling refresh of Class[Apache::Service]\nNotice: /Stage[main]/Apache::Mod::Mime_magic/Apache::Mod[mime_magic]/File[mime_magic.load]/ensure: defined content as '{md5}cb8670bb2fb352aac7ebf3a85d52094c'\nInfo: /Stage[main]/Apache::Mod::Mime_magic/Apache::Mod[mime_magic]/File[mime_magic.load]: Scheduling refresh of Class[Apache::Service]\nNotice: /Stage[main]/Rsync::Server/Concat[/etc/rsync.conf]/File[/var/lib/puppet/concat/_etc_rsync.conf/fragments.concat]/ensure: created\nNotice: /Stage[main]/Mongodb::Server::Config/File[/etc/mongod.conf]/content: \n--- /etc/mongod.conf\t2015-12-07 22:55:21.000000000 +0000\n+++ /tmp/puppet-file20160520-26469-1krjs66\t2016-05-20 12:29:37.587578098 +0100\n@@ -1,237 +1,19 @@\n-##\n-### Basic Defaults\n-##\n+# mongo.conf - generated from Puppet\n \n-# Comma separated list of ip addresses to listen on (all local ips by default)\n-bind_ip = 127.0.0.1\n-\n-# Specify port number (27017 by default)\n-#port = 27017\n-\n-# Fork server process (false by default)\n-fork = true\n-\n-# Full path to pidfile (if not set, no pidfile is created)\n-pidfilepath = /var/run/mongodb/mongod.pid\n-\n-# Log file to send write to instead of stdout - has to be a file, not directory\n-logpath = /var/log/mongodb/mongod.log\n-\n-# Alternative directory for UNIX domain sockets (defaults to /tmp)\n-unixSocketPrefix = /var/run/mongodb\n-\n-# 
Directory for datafiles (defaults to /data/db/)\n-dbpath = /var/lib/mongodb\n-\n-# Enable/Disable journaling (journaling is on by default for 64 bit)\n-#journal = true\n-#nojournal = true\n-\n-\n-\n-##\n-### General options\n-##\n-\n-# Be more verbose (include multiple times for more verbosity e.g. -vvvvv) (v by default)\n-#verbose = v\n-\n-# Max number of simultaneous connections (1000000 by default)\n-#maxConns = 1000000 \n-\n-# Log to system's syslog facility instead of file or stdout (false by default)\n-#syslog = true\n-\n-# Syslog facility used for mongodb syslog message (user by default)\n-#syslogFacility = user\n-\n-# Append to logpath instead of over-writing (false by default)\n-#logappend = true\n-\n-# Desired format for timestamps in log messages (One of ctime, iso8601-utc or iso8601-local) (iso8601-local by default)\n-#timeStampFormat = arg \n-\n-# Private key for cluster authentication\n-#keyFile = arg\n-\n-# Set a configurable parameter\n-#setParameter = arg\n-\n-# Enable http interface (false by default)\n-#httpinterface = true\n-\n-# Authentication mode used for cluster authentication. Alternatives are (keyFile|sendKeyFile|sendX509|x509) (keyFile by default)\n-#clusterAuthMode = arg\n-\n-# Disable listening on unix sockets (false by default)\n-#nounixsocket = true\n-\n-# Run with/without security (without by default)\n-#auth = true\n-#noauth = true\n-\n-# Enable IPv6 support (disabled by default)\n-#ipv6 = true\n-\n-# Allow JSONP access via http (has security implications) (false by default)\n-#jsonp = true\n-\n-# Turn on simple rest api (false by default)\n-#rest = true\n-\n-# Value of slow for profile and console log (100 by default)\n-#slowms = 100\n-\n-# 0=off 1=slow, 2=all (0 by default)\n-#profile = 0\n-\n-# Periodically show cpu and iowait utilization (false by default)\n-#cpu = true\n-\n-# Print some diagnostic system information (false by default)\n-#sysinfo = true\n-\n-# Each database will be stored in a separate directory (false by default)\n-#directoryperdb = true\n-\n-# Don't retry any index builds that were interrupted by shutdown (false by default)\n-#noIndexBuildRetry = true\n-\n-# Disable data file preallocation - will often hurt performance (false by default)\n-#noprealloc = true\n-\n-# .ns file size (in MB) for new databases (16 MB by default)\n-#nssize = 16\n-\n-# Limits each database to a certain number of files (8 default)\n-#quota\n-\n-# Number of files allowed per db, implies --quota (8 by default)\n-#quotaFiles = 8\n-\n-# Use a smaller default file size (false by default)\n-#smallfiles = true\n-\n-# Seconds between disk syncs (0=never, but not recommended) (60 by default)\n-#syncdelay = 60\n-\n-# Upgrade db if needed (false by default)\n-#upgrade = true\n-\n-# Run repair on all dbs (false by default)\n-#repair = true\n-\n-# Root directory for repair files (defaults to dbpath)\n-#repairpath = arg\n-\n-# Disable scripting engine (false by default)\n-#noscripting = true\n-\n-# Do not allow table scans (false by default)\n-#notablescan = true\n-\n-# Journal diagnostic options (0 by default)\n-#journalOptions = 0\n-\n-# How often to group/batch commit (ms) (100 or 30 by default)\n-#journalCommitInterval = 100 \n-\n-\n-\n-##\n-### Replication options\n-##\n-\n-# Size to use (in MB) for replication op log (default 5% of disk space - i.e. 
large is good)\n-#oplogSize = arg\n-\n-\n-\n-##\n-### Master/slave options (old; use replica sets instead)\n-##\n-\n-# Master mode\n-#master = true\n-\n-# Slave mode\n-#slave = true\n-\n-# When slave: specify master as <server:port>\n-#source = arg\n-\n-# When slave: specify a single database to replicate\n-#only = arg\n-\n-# Specify delay (in seconds) to be used when applying master ops to slave\n-#slavedelay = arg\n-\n-# Automatically resync if slave data is stale\n-#autoresync = true\n-\n-\n-\n-##\n-### Replica set options\n-##\n-\n-# Arg is <setname>[/<optionalseedhostlist>]\n-#replSet = arg\n-\n-# Specify index prefetching behavior (if secondary) [none|_id_only|all] (all by default)\n-#replIndexPrefetch = all\n-\n-\n-\n-##\n-### Sharding options\n-##\n-\n-# Declare this is a config db of a cluster (default port 27019; default dir /data/configdb) (false by default)\n-#configsvr = true\n-\n-# Declare this is a shard db of a cluster (default port 27018) (false by default)\n-#shardsvr = true\n-\n-\n-\n-##\n-### SSL options\n-##\n-\n-# Use ssl on configured ports\n-#sslOnNormalPorts = true\n-\n-# Set the SSL operation mode (disabled|allowSSL|preferSSL|requireSSL)\n-# sslMode = arg\n-\n-# PEM file for ssl\n-#sslPEMKeyFile = arg\n-\n-# PEM file password\n-#sslPEMKeyPassword = arg\n-\n-# Key file for internal SSL authentication\n-#sslClusterFile = arg\n-\n-# Internal authentication key file password\n-#sslClusterPassword = arg\n-\n-# Certificate Authority file for SSL\n-#sslCAFile = arg\n-\n-# Certificate Revocation List file for SSL\n-#sslCRLFile = arg\n-\n-# Allow client to connect without presenting a certificate\n-#sslWeakCertificateValidation = true\n-\n-# Allow server certificates to provide non-matching hostnames\n-#sslAllowInvalidHostnames = true\n-\n-# Allow connections to servers with invalid certificates\n-#sslAllowInvalidCertificates = true\n-\n-# Activate FIPS 140-2 mode at startup\n-#sslFIPSMode = true\n \n+#where to log\n+logpath=/var/log/mongodb/mongodb.log\n+logappend=true\n+# Set this option to configure the mongod or mongos process to bind to and\n+# listen for connections from applications on this address.\n+# You may concatenate a list of comma separated values to bind mongod to multiple IP addresses.\n+bind_ip = 127.0.0.1\n+# fork and run in background\n+fork=true\n+dbpath=/var/lib/mongodb\n+# location of pidfile\n+pidfilepath=/var/run/mongodb/mongod.pid\n+# Enables journaling\n+journal = true\n+# Turn on/off security. 
Off is currently the default\n+noauth=true\nInfo: /Stage[main]/Mongodb::Server::Config/File[/etc/mongod.conf]: Filebucketed /etc/mongod.conf to puppet with sum c9466bad2ec40e2613630b7d49d58b2b\nNotice: /Stage[main]/Mongodb::Server::Config/File[/etc/mongod.conf]/content: content changed '{md5}c9466bad2ec40e2613630b7d49d58b2b' to '{md5}b770678a1c1e5991d9990e8fdb0fabea'\nNotice: /Stage[main]/Mongodb::Server::Config/File[/var/lib/mongodb]/group: group changed 'root' to 'mongodb'\nNotice: /Stage[main]/Mongodb::Server::Config/File[/var/lib/mongodb]/mode: mode changed '0750' to '0755'\nInfo: Class[Mongodb::Server::Config]: Scheduling refresh of Class[Mongodb::Server::Service]\nInfo: Class[Mongodb::Server::Service]: Scheduling refresh of Service[mongodb]\nNotice: /Stage[main]/Mongodb::Server::Service/Service[mongodb]/ensure: ensure changed 'stopped' to 'running'\nInfo: /Stage[main]/Mongodb::Server::Service/Service[mongodb]: Unscheduling refresh on Service[mongodb]\nNotice: /Stage[main]/Neutron::Agents::Ml2::Ovs/Neutron_agent_ovs[agent/tunnel_types]/ensure: created\nInfo: /Stage[main]/Neutron::Agents::Ml2::Ovs/Neutron_agent_ovs[agent/tunnel_types]: Scheduling refresh of Service[neutron-ovs-agent-service]\nNotice: /Stage[main]/Keystone/Package[keystone]/ensure: created\nInfo: /Stage[main]/Keystone/Package[keystone]: Scheduling refresh of Anchor[keystone::install::end]\nInfo: /Stage[main]/Keystone/Package[keystone]: Scheduling refresh of Anchor[keystone::service::end]\nNotice: /Stage[main]/Rabbitmq::Install/Package[rabbitmq-server]/ensure: created\nInfo: /Stage[main]/Rabbitmq::Install/Package[rabbitmq-server]: Scheduling refresh of Class[Rabbitmq::Service]\nNotice: /Stage[main]/Rabbitmq/Rabbitmq_plugin[rabbitmq_management]/ensure: created\nInfo: /Stage[main]/Rabbitmq/Rabbitmq_plugin[rabbitmq_management]: Scheduling refresh of Class[Rabbitmq::Service]\nNotice: /Stage[main]/Rabbitmq::Config/File[rabbitmq.config]/content: \n--- /etc/rabbitmq/rabbitmq.config\t2014-08-11 12:36:33.000000000 +0100\n+++ /tmp/puppet-file20160520-26469-1onsrit\t2016-05-20 12:29:47.927185812 +0100\n@@ -1,567 +1,42 @@\n-%% -*- mode: erlang -*-\n-%% ----------------------------------------------------------------------------\n-%% RabbitMQ Sample Configuration File.\n-%%\n-%% See http://www.rabbitmq.com/configure.html for details.\n-%% ----------------------------------------------------------------------------\n+% This file managed by Puppet\n+% Template Path: rabbitmq/templates/rabbitmq.config\n [\n- {rabbit,\n- [%%\n- %% Network Connectivity\n- %% ====================\n- %%\n-\n- %% By default, RabbitMQ will listen on all interfaces, using\n- %% the standard (reserved) AMQP port.\n- %%\n- %% {tcp_listeners, [5672]},\n-\n- %% To listen on a specific interface, provide a tuple of {IpAddress, Port}.\n- %% For example, to listen only on localhost for both IPv4 and IPv6:\n- %%\n- %% {tcp_listeners, [{\"127.0.0.1\", 5672},\n- %% {\"::1\", 5672}]},\n-\n- %% SSL listeners are configured in the same fashion as TCP listeners,\n- %% including the option to control the choice of interface.\n- %%\n- %% {ssl_listeners, [5671]},\n-\n- %% Log levels (currently just used for connection logging).\n- %% One of 'info', 'warning', 'error' or 'none', in decreasing order\n- %% of verbosity. Defaults to 'info'.\n- %%\n- %% {log_levels, [{connection, info}]},\n-\n- %% Set to 'true' to perform reverse DNS lookups when accepting a\n- %% connection. 
Hostnames will then be shown instead of IP addresses\n- %% in rabbitmqctl and the management plugin.\n- %%\n- %% {reverse_dns_lookups, true},\n-\n- %%\n- %% Security / AAA\n- %% ==============\n- %%\n-\n- %% The default \"guest\" user is only permitted to access the server\n- %% via a loopback interface (e.g. localhost).\n- %% {loopback_users, [<<\"guest\">>]},\n- %%\n- %% Uncomment the following line if you want to allow access to the\n- %% guest user from anywhere on the network.\n- %% {loopback_users, []},\n-\n- %% Configuring SSL.\n- %% See http://www.rabbitmq.com/ssl.html for full documentation.\n- %%\n- %% {ssl_options, [{cacertfile, \"/path/to/testca/cacert.pem\"},\n- %% {certfile, \"/path/to/server/cert.pem\"},\n- %% {keyfile, \"/path/to/server/key.pem\"},\n- %% {verify, verify_peer},\n- %% {fail_if_no_peer_cert, false}]},\n-\n- %% Choose the available SASL mechanism(s) to expose.\n- %% The two default (built in) mechanisms are 'PLAIN' and\n- %% 'AMQPLAIN'. Additional mechanisms can be added via\n- %% plugins.\n- %%\n- %% See http://www.rabbitmq.com/authentication.html for more details.\n- %%\n- %% {auth_mechanisms, ['PLAIN', 'AMQPLAIN']},\n-\n- %% Select an authentication database to use. RabbitMQ comes bundled\n- %% with a built-in auth-database, based on mnesia.\n- %%\n- %% {auth_backends, [rabbit_auth_backend_internal]},\n-\n- %% Configurations supporting the rabbitmq_auth_mechanism_ssl and\n- %% rabbitmq_auth_backend_ldap plugins.\n- %%\n- %% NB: These options require that the relevant plugin is enabled.\n- %% See http://www.rabbitmq.com/plugins.html for further details.\n-\n- %% The RabbitMQ-auth-mechanism-ssl plugin makes it possible to\n- %% authenticate a user based on the client's SSL certificate.\n- %%\n- %% To use auth-mechanism-ssl, add to or replace the auth_mechanisms\n- %% list with the entry 'EXTERNAL'.\n- %%\n- %% {auth_mechanisms, ['EXTERNAL']},\n-\n- %% The rabbitmq_auth_backend_ldap plugin allows the broker to\n- %% perform authentication and authorisation by deferring to an\n- %% external LDAP server.\n- %%\n- %% For more information about configuring the LDAP backend, see\n- %% http://www.rabbitmq.com/ldap.html.\n- %%\n- %% Enable the LDAP auth backend by adding to or replacing the\n- %% auth_backends entry:\n- %%\n- %% {auth_backends, [rabbit_auth_backend_ldap]},\n-\n- %% This pertains to both the rabbitmq_auth_mechanism_ssl plugin and\n- %% STOMP ssl_cert_login configurations. See the rabbitmq_stomp\n- %% configuration section later in this file and the README in\n- %% https://github.com/rabbitmq/rabbitmq-auth-mechanism-ssl for further\n- %% details.\n- %%\n- %% To use the SSL cert's CN instead of its DN as the username\n- %%\n- %% {ssl_cert_login_from, common_name},\n-\n- %%\n- %% Default User / VHost\n- %% ====================\n- %%\n-\n- %% On first start RabbitMQ will create a vhost and a user. These\n- %% config items control what gets created. 
See\n- %% http://www.rabbitmq.com/access-control.html for further\n- %% information about vhosts and access control.\n- %%\n- %% {default_vhost, <<\"/\">>},\n- %% {default_user, <<\"guest\">>},\n- %% {default_pass, <<\"guest\">>},\n- %% {default_permissions, [<<\".*\">>, <<\".*\">>, <<\".*\">>]},\n-\n- %% Tags for default user\n- %%\n- %% For more details about tags, see the documentation for the\n- %% Management Plugin at http://www.rabbitmq.com/management.html.\n- %%\n- %% {default_user_tags, [administrator]},\n-\n- %%\n- %% Additional network and protocol related configuration\n- %% =====================================================\n- %%\n-\n- %% Set the default AMQP heartbeat delay (in seconds).\n- %%\n- %% {heartbeat, 600},\n-\n- %% Set the max permissible size of an AMQP frame (in bytes).\n- %%\n- %% {frame_max, 131072},\n-\n- %% Set the max permissible number of channels per connection.\n- %% 0 means \"no limit\".\n- %%\n- %% {channel_max, 128},\n-\n- %% Customising Socket Options.\n- %%\n- %% See (http://www.erlang.org/doc/man/inet.html#setopts-2) for\n- %% further documentation.\n- %%\n- %% {tcp_listen_options, [binary,\n- %% {packet, raw},\n- %% {reuseaddr, true},\n- %% {backlog, 128},\n- %% {nodelay, true},\n- %% {exit_on_close, false}]},\n-\n- %%\n- %% Resource Limits & Flow Control\n- %% ==============================\n- %%\n- %% See http://www.rabbitmq.com/memory.html for full details.\n-\n- %% Memory-based Flow Control threshold.\n- %%\n- %% {vm_memory_high_watermark, 0.4},\n-\n- %% Fraction of the high watermark limit at which queues start to\n- %% page message out to disc in order to free up memory.\n- %%\n- %% {vm_memory_high_watermark_paging_ratio, 0.5},\n-\n- %% Set disk free limit (in bytes). Once free disk space reaches this\n- %% lower bound, a disk alarm will be set - see the documentation\n- %% listed above for more details.\n- %%\n- %% {disk_free_limit, 50000000},\n-\n- %% Alternatively, we can set a limit relative to total available RAM.\n- %%\n- %% {disk_free_limit, {mem_relative, 1.0}},\n-\n- %%\n- %% Misc/Advanced Options\n- %% =====================\n- %%\n- %% NB: Change these only if you understand what you are doing!\n- %%\n-\n- %% To announce custom properties to clients on connection:\n- %%\n- %% {server_properties, []},\n-\n- %% How to respond to cluster partitions.\n- %% See http://www.rabbitmq.com/partitions.html for further details.\n- %%\n- %% {cluster_partition_handling, ignore},\n-\n- %% Make clustering happen *automatically* at startup - only applied\n- %% to nodes that have just been reset or started for the first time.\n- %% See http://www.rabbitmq.com/clustering.html#auto-config for\n- %% further details.\n- %%\n- %% {cluster_nodes, {['rabbit@my.host.com'], disc}},\n-\n- %% Set (internal) statistics collection granularity.\n- %%\n- %% {collect_statistics, none},\n-\n- %% Statistics collection interval (in milliseconds).\n- %%\n- %% {collect_statistics_interval, 5000},\n-\n- %% Explicitly enable/disable hipe compilation.\n- %%\n- %% {hipe_compile, true}\n-\n- ]},\n-\n- %% ----------------------------------------------------------------------------\n- %% Advanced Erlang Networking/Clustering Options.\n- %%\n- %% See http://www.rabbitmq.com/clustering.html for details\n- %% ----------------------------------------------------------------------------\n- {kernel,\n- [%% Sets the net_kernel tick time.\n- %% Please see http://erlang.org/doc/man/kernel_app.html and\n- %% http://www.rabbitmq.com/nettick.html for further details.\n- %%\n- %% 
{net_ticktime, 60}\n+ {rabbit, [\n+ {tcp_listen_options,\n+ [binary,\n+ {packet, raw},\n+ {reuseaddr, true},\n+ {backlog, 128},\n+ {nodelay, true},\n+ {exit_on_close, false}]\n+ },\n+ {tcp_listeners, []},\n+ {ssl_listeners, [5671]},\n+ {ssl_options, [\n+ {cacertfile,\"/etc/ssl/certs/ca-bundle.crt\"},\n+ {certfile,\"/etc/pki/ca-trust/source/anchors/puppet_openstack.pem\"},\n+ {keyfile,\"/etc/rabbitmq/ssl/private/n2.dusty.ci.centos.org.pem\"},\n+ {verify,verify_none},\n+ {fail_if_no_peer_cert,false}\n+ ]},\n+ {default_user, <<\"guest\">>},\n+ {default_pass, <<\"guest\">>}\n ]},\n-\n- %% ----------------------------------------------------------------------------\n- %% RabbitMQ Management Plugin\n- %%\n- %% See http://www.rabbitmq.com/management.html for details\n- %% ----------------------------------------------------------------------------\n-\n- {rabbitmq_management,\n- [%% Pre-Load schema definitions from the following JSON file. See\n- %% http://www.rabbitmq.com/management.html#load-definitions\n- %%\n- %% {load_definitions, \"/path/to/schema.json\"},\n-\n- %% Log all requests to the management HTTP API to a file.\n- %%\n- %% {http_log_dir, \"/path/to/access.log\"},\n-\n- %% Change the port on which the HTTP listener listens,\n- %% specifying an interface for the web server to bind to.\n- %% Also set the listener to use SSL and provide SSL options.\n- %%\n- %% {listener, [{port, 12345},\n- %% {ip, \"127.0.0.1\"},\n- %% {ssl, true},\n- %% {ssl_opts, [{cacertfile, \"/path/to/cacert.pem\"},\n- %% {certfile, \"/path/to/cert.pem\"},\n- %% {keyfile, \"/path/to/key.pem\"}]}]},\n-\n- %% Configure how long aggregated data (such as message rates and queue\n- %% lengths) is retained. Please read the plugin's documentation in\n- %% https://www.rabbitmq.com/management.html#configuration for more\n- %% details.\n- %%\n- %% {sample_retention_policies,\n- %% [{global, [{60, 5}, {3600, 60}, {86400, 1200}]},\n- %% {basic, [{60, 5}, {3600, 60}]},\n- %% {detailed, [{10, 5}]}]}\n- ]},\n-\n- {rabbitmq_management_agent,\n- [%% Misc/Advanced Options\n- %%\n- %% NB: Change these only if you understand what you are doing!\n- %%\n- %% {force_fine_statistics, true}\n- ]},\n-\n- %% ----------------------------------------------------------------------------\n- %% RabbitMQ Shovel Plugin\n- %%\n- %% See http://www.rabbitmq.com/shovel.html for details\n- %% ----------------------------------------------------------------------------\n-\n- {rabbitmq_shovel,\n- [{shovels,\n- [%% A named shovel worker.\n- %% {my_first_shovel,\n- %% [\n-\n- %% List the source broker(s) from which to consume.\n- %%\n- %% {sources,\n- %% [%% URI(s) and pre-declarations for all source broker(s).\n- %% {brokers, [\"amqp://user:password@host.domain/my_vhost\"]},\n- %% {declarations, []}\n- %% ]},\n-\n- %% List the destination broker(s) to publish to.\n- %% {destinations,\n- %% [%% A singular version of the 'brokers' element.\n- %% {broker, \"amqp://\"},\n- %% {declarations, []}\n- %% ]},\n-\n- %% Name of the queue to shovel messages from.\n- %%\n- %% {queue, <<\"your-queue-name-goes-here\">>},\n-\n- %% Optional prefetch count.\n- %%\n- %% {prefetch_count, 10},\n-\n- %% when to acknowledge messages:\n- %% - no_ack: never (auto)\n- %% - on_publish: after each message is republished\n- %% - on_confirm: when the destination broker confirms receipt\n- %%\n- %% {ack_mode, on_confirm},\n-\n- %% Overwrite fields of the outbound basic.publish.\n- %%\n- %% {publish_fields, [{exchange, <<\"my_exchange\">>},\n- %% {routing_key, 
<<\"from_shovel\">>}]},\n-\n- %% Static list of basic.properties to set on re-publication.\n- %%\n- %% {publish_properties, [{delivery_mode, 2}]},\n-\n- %% The number of seconds to wait before attempting to\n- %% reconnect in the event of a connection failure.\n- %%\n- %% {reconnect_delay, 2.5}\n-\n- %% ]} %% End of my_first_shovel\n+ {kernel, [\n+ \n+ ]}\n+,\n+ {rabbitmq_management, [\n+ {listener, [\n+ {port, 15671},\n+ {ssl, true},\n+ {ssl_opts, [\n+ {cacertfile, \"/etc/ssl/certs/ca-bundle.crt\"},\n+\n+ {certfile, \"/etc/pki/ca-trust/source/anchors/puppet_openstack.pem\"},\n+ {keyfile, \"/etc/rabbitmq/ssl/private/n2.dusty.ci.centos.org.pem\"}\n+ ]}\n ]}\n- %% Rather than specifying some values per-shovel, you can specify\n- %% them for all shovels here.\n- %%\n- %% {defaults, [{prefetch_count, 0},\n- %% {ack_mode, on_confirm},\n- %% {publish_fields, []},\n- %% {publish_properties, [{delivery_mode, 2}]},\n- %% {reconnect_delay, 2.5}]}\n- ]},\n-\n- %% ----------------------------------------------------------------------------\n- %% RabbitMQ Stomp Adapter\n- %%\n- %% See http://www.rabbitmq.com/stomp.html for details\n- %% ----------------------------------------------------------------------------\n-\n- {rabbitmq_stomp,\n- [%% Network Configuration - the format is generally the same as for the broker\n-\n- %% Listen only on localhost (ipv4 & ipv6) on a specific port.\n- %% {tcp_listeners, [{\"127.0.0.1\", 61613},\n- %% {\"::1\", 61613}]},\n-\n- %% Listen for SSL connections on a specific port.\n- %% {ssl_listeners, [61614]},\n-\n- %% Additional SSL options\n-\n- %% Extract a name from the client's certificate when using SSL.\n- %%\n- %% {ssl_cert_login, true},\n-\n- %% Set a default user name and password. This is used as the default login\n- %% whenever a CONNECT frame omits the login and passcode headers.\n- %%\n- %% Please note that setting this will allow clients to connect without\n- %% authenticating!\n- %%\n- %% {default_user, [{login, \"guest\"},\n- %% {passcode, \"guest\"}]},\n-\n- %% If a default user is configured, or you have configured use SSL client\n- %% certificate based authentication, you can choose to allow clients to\n- %% omit the CONNECT frame entirely. If set to true, the client is\n- %% automatically connected as the default user or user supplied in the\n- %% SSL certificate whenever the first frame sent on a session is not a\n- %% CONNECT frame.\n- %%\n- %% {implicit_connect, true}\n- ]},\n-\n- %% ----------------------------------------------------------------------------\n- %% RabbitMQ MQTT Adapter\n- %%\n- %% See http://hg.rabbitmq.com/rabbitmq-mqtt/file/stable/README.md for details\n- %% ----------------------------------------------------------------------------\n-\n- {rabbitmq_mqtt,\n- [%% Set the default user name and password. Will be used as the default login\n- %% if a connecting client provides no other login details.\n- %%\n- %% Please note that setting this will allow clients to connect without\n- %% authenticating!\n- %%\n- %% {default_user, <<\"guest\">>},\n- %% {default_pass, <<\"guest\">>},\n-\n- %% Enable anonymous access. If this is set to false, clients MUST provide\n- %% login information in order to connect. 
See the default_user/default_pass\n- %% configuration elements for managing logins without authentication.\n- %%\n- %% {allow_anonymous, true},\n-\n- %% If you have multiple chosts, specify the one to which the\n- %% adapter connects.\n- %%\n- %% {vhost, <<\"/\">>},\n-\n- %% Specify the exchange to which messages from MQTT clients are published.\n- %%\n- %% {exchange, <<\"amq.topic\">>},\n-\n- %% Specify TTL (time to live) to control the lifetime of non-clean sessions.\n- %%\n- %% {subscription_ttl, 1800000},\n-\n- %% Set the prefetch count (governing the maximum number of unacknowledged\n- %% messages that will be delivered).\n- %%\n- %% {prefetch, 10},\n-\n- %% TCP/SSL Configuration (as per the broker configuration).\n- %%\n- %% {tcp_listeners, [1883]},\n- %% {ssl_listeners, []},\n-\n- %% TCP/Socket options (as per the broker configuration).\n- %%\n- %% {tcp_listen_options, [binary,\n- %% {packet, raw},\n- %% {reuseaddr, true},\n- %% {backlog, 128},\n- %% {nodelay, true}]}\n- ]},\n-\n- %% ----------------------------------------------------------------------------\n- %% RabbitMQ AMQP 1.0 Support\n- %%\n- %% See http://hg.rabbitmq.com/rabbitmq-amqp1.0/file/default/README.md\n- %% for details\n- %% ----------------------------------------------------------------------------\n-\n- {rabbitmq_amqp1_0,\n- [%% Connections that are not authenticated with SASL will connect as this\n- %% account. See the README for more information.\n- %%\n- %% Please note that setting this will allow clients to connect without\n- %% authenticating!\n- %%\n- %% {default_user, \"guest\"},\n-\n- %% Enable protocol strict mode. See the README for more information.\n- %%\n- %% {protocol_strict_mode, false}\n- ]},\n-\n- %% ----------------------------------------------------------------------------\n- %% RabbitMQ LDAP Plugin\n- %%\n- %% See http://www.rabbitmq.com/ldap.html for details.\n- %%\n- %% ----------------------------------------------------------------------------\n-\n- {rabbitmq_auth_backend_ldap,\n- [%%\n- %% Connecting to the LDAP server(s)\n- %% ================================\n- %%\n-\n- %% Specify servers to bind to. You *must* set this in order for the plugin\n- %% to work properly.\n- %%\n- %% {servers, [\"your-server-name-goes-here\"]},\n-\n- %% Connect to the LDAP server using SSL\n- %%\n- %% {use_ssl, false},\n-\n- %% Specify the LDAP port to connect to\n- %%\n- %% {port, 389},\n-\n- %% LDAP connection timeout, in milliseconds or 'infinity'\n- %%\n- %% {timeout, infinity},\n-\n- %% Enable logging of LDAP queries.\n- %% One of\n- %% - false (no logging is performed)\n- %% - true (verbose logging of the logic used by the plugin)\n- %% - network (as true, but additionally logs LDAP network traffic)\n- %%\n- %% Defaults to false.\n- %%\n- %% {log, false},\n-\n- %%\n- %% Authentication\n- %% ==============\n- %%\n-\n- %% Pattern to convert the username given through AMQP to a DN before\n- %% binding\n- %%\n- %% {user_dn_pattern, \"cn=${username},ou=People,dc=example,dc=com\"},\n-\n- %% Alternatively, you can convert a username to a Distinguished\n- %% Name via an LDAP lookup after binding. 
See the documentation for\n- %% full details.\n-\n- %% When converting a username to a dn via a lookup, set these to\n- %% the name of the attribute that represents the user name, and the\n- %% base DN for the lookup query.\n- %%\n- %% {dn_lookup_attribute, \"userPrincipalName\"},\n- %% {dn_lookup_base, \"DC=gopivotal,DC=com\"},\n-\n- %% Controls how to bind for authorisation queries and also to\n- %% retrieve the details of users logging in without presenting a\n- %% password (e.g., SASL EXTERNAL).\n- %% One of\n- %% - as_user (to bind as the authenticated user - requires a password)\n- %% - anon (to bind anonymously)\n- %% - {UserDN, Password} (to bind with a specified user name and password)\n- %%\n- %% Defaults to 'as_user'.\n- %%\n- %% {other_bind, as_user},\n-\n- %%\n- %% Authorisation\n- %% =============\n- %%\n-\n- %% The LDAP plugin can perform a variety of queries against your\n- %% LDAP server to determine questions of authorisation. See\n- %% http://www.rabbitmq.com/ldap.html#authorisation for more\n- %% information.\n-\n- %% Set the query to use when determining vhost access\n- %%\n- %% {vhost_access_query, {in_group,\n- %% \"ou=${vhost}-users,ou=vhosts,dc=example,dc=com\"}},\n-\n- %% Set the query to use when determining resource (e.g., queue) access\n- %%\n- %% {resource_access_query, {constant, true}},\n-\n- %% Set queries to determine which tags a user has\n- %%\n- %% {tag_queries, []}\n ]}\n ].\n+% EOF\nInfo: /Stage[main]/Rabbitmq::Config/File[rabbitmq.config]: Filebucketed /etc/rabbitmq/rabbitmq.config to puppet with sum 3e342d4a660626a9b588a723ad6cba74\nNotice: /Stage[main]/Rabbitmq::Config/File[rabbitmq.config]/content: content changed '{md5}3e342d4a660626a9b588a723ad6cba74' to '{md5}808c7824d2fe3217e34c0f11b45084ed'\nInfo: /Stage[main]/Rabbitmq::Config/File[rabbitmq.config]: Scheduling refresh of Class[Rabbitmq::Service]\nNotice: /Stage[main]/Rabbitmq::Config/File[rabbitmqadmin.conf]/ensure: defined content as '{md5}56b4bb3dfb32765e14d2a04faea60e62'\nNotice: /Stage[main]/Rabbitmq::Config/File[/etc/systemd/system/rabbitmq-server.service.d]/ensure: created\nNotice: /Stage[main]/Keystone::Deps/Anchor[keystone::install::end]: Triggered 'refresh' from 1 events\nInfo: /Stage[main]/Keystone::Deps/Anchor[keystone::install::end]: Scheduling refresh of Anchor[keystone::service::begin]\nInfo: /Stage[main]/Keystone::Deps/Anchor[keystone::install::end]: Scheduling refresh of Exec[keystone-manage db_sync]\nNotice: /Stage[main]/Keystone::Wsgi::Apache/File[/var/www/cgi-bin/keystone]/ensure: created\nNotice: /Stage[main]/Keystone::Wsgi::Apache/File[keystone_wsgi_admin]/ensure: defined content as '{md5}b60f70d60e09d39ab5900f4b4eebf921'\nNotice: /Stage[main]/Keystone::Wsgi::Apache/File[keystone_wsgi_main]/ensure: defined content as '{md5}b60f70d60e09d39ab5900f4b4eebf921'\nNotice: /Stage[main]/Openstack_integration::Keystone/Openstack_integration::Ssl_key[keystone]/File[/etc/keystone/ssl]/ensure: created\nNotice: /Stage[main]/Openstack_integration::Keystone/Openstack_integration::Ssl_key[keystone]/File[/etc/keystone/ssl/private]/ensure: created\nNotice: /Stage[main]/Openstack_integration::Keystone/Openstack_integration::Ssl_key[keystone]/File[/etc/keystone/ssl/private/n2.dusty.ci.centos.org.pem]/ensure: defined content as '{md5}a4b1e9413d7b63f33794716d77cfaa00'\nInfo: Openstack_integration::Ssl_key[keystone]: Scheduling refresh of Service[httpd]\nNotice: /Stage[main]/Rabbitmq::Config/File[/etc/systemd/system/rabbitmq-server.service.d/limits.conf]/ensure: defined content as 
'{md5}8eb9ff6c576b9869944215af3a568c2e'\nInfo: /Stage[main]/Rabbitmq::Config/File[/etc/systemd/system/rabbitmq-server.service.d/limits.conf]: Scheduling refresh of Exec[rabbitmq-systemd-reload]\nNotice: /Stage[main]/Rabbitmq::Config/Exec[rabbitmq-systemd-reload]: Triggered 'refresh' from 1 events\nInfo: /Stage[main]/Rabbitmq::Config/Exec[rabbitmq-systemd-reload]: Scheduling refresh of Class[Rabbitmq::Service]\nNotice: /Stage[main]/Keystone::Cron::Token_flush/Cron[keystone-manage token_flush]/ensure: created\nNotice: /Stage[main]/Rabbitmq::Config/File[/etc/rabbitmq/ssl]/ensure: created\nNotice: /Stage[main]/Openstack_integration::Rabbitmq/File[/etc/rabbitmq/ssl/private]/ensure: created\nNotice: /Stage[main]/Openstack_integration::Rabbitmq/Openstack_integration::Ssl_key[rabbitmq]/File[/etc/rabbitmq/ssl/private/n2.dusty.ci.centos.org.pem]/ensure: defined content as '{md5}a4b1e9413d7b63f33794716d77cfaa00'\nInfo: Openstack_integration::Ssl_key[rabbitmq]: Scheduling refresh of Service[rabbitmq-server]\nNotice: /Stage[main]/Keystone/Keystone_config[ssl/ca_certs]/ensure: created\nInfo: /Stage[main]/Keystone/Keystone_config[ssl/ca_certs]: Scheduling refresh of Anchor[keystone::config::end]\nNotice: /Stage[main]/Keystone/Keystone_config[fernet_tokens/key_repository]/ensure: created\nInfo: /Stage[main]/Keystone/Keystone_config[fernet_tokens/key_repository]: Scheduling refresh of Anchor[keystone::config::end]\nNotice: /Stage[main]/Keystone/Keystone_config[ssl/cert_subject]/ensure: created\nInfo: /Stage[main]/Keystone/Keystone_config[ssl/cert_subject]: Scheduling refresh of Anchor[keystone::config::end]\nNotice: /Stage[main]/Keystone/Keystone_config[signing/keyfile]/ensure: created\nInfo: /Stage[main]/Keystone/Keystone_config[signing/keyfile]: Scheduling refresh of Anchor[keystone::config::end]\nNotice: /Stage[main]/Keystone/Keystone_config[catalog/driver]/ensure: created\nInfo: /Stage[main]/Keystone/Keystone_config[catalog/driver]: Scheduling refresh of Anchor[keystone::config::end]\nNotice: /Stage[main]/Keystone/Keystone_config[oslo_messaging_rabbit/rabbit_ha_queues]/ensure: created\nInfo: /Stage[main]/Keystone/Keystone_config[oslo_messaging_rabbit/rabbit_ha_queues]: Scheduling refresh of Anchor[keystone::config::end]\nNotice: /Stage[main]/Keystone/Keystone_config[ssl/ca_key]/ensure: created\nInfo: /Stage[main]/Keystone/Keystone_config[ssl/ca_key]: Scheduling refresh of Anchor[keystone::config::end]\nNotice: /Stage[main]/Keystone/Keystone_config[ssl/enable]/ensure: created\nInfo: /Stage[main]/Keystone/Keystone_config[ssl/enable]: Scheduling refresh of Anchor[keystone::config::end]\nNotice: /Stage[main]/Keystone/Keystone_config[token/provider]/ensure: created\nInfo: /Stage[main]/Keystone/Keystone_config[token/provider]: Scheduling refresh of Anchor[keystone::config::end]\nNotice: /Stage[main]/Keystone/Keystone_config[signing/key_size]/ensure: created\nInfo: /Stage[main]/Keystone/Keystone_config[signing/key_size]: Scheduling refresh of Anchor[keystone::config::end]\nNotice: /Stage[main]/Keystone/Keystone_config[DEFAULT/public_port]/ensure: created\nInfo: /Stage[main]/Keystone/Keystone_config[DEFAULT/public_port]: Scheduling refresh of Anchor[keystone::config::end]\nNotice: /Stage[main]/Keystone/Keystone_config[signing/ca_certs]/ensure: created\nInfo: /Stage[main]/Keystone/Keystone_config[signing/ca_certs]: Scheduling refresh of Anchor[keystone::config::end]\nNotice: /Stage[main]/Keystone/Keystone_config[signing/ca_key]/ensure: created\nInfo: /Stage[main]/Keystone/Keystone_config[signing/ca_key]: 
Scheduling refresh of Anchor[keystone::config::end]\nNotice: /Stage[main]/Keystone/Keystone_config[ssl/certfile]/ensure: created\nInfo: /Stage[main]/Keystone/Keystone_config[ssl/certfile]: Scheduling refresh of Anchor[keystone::config::end]\nNotice: /Stage[main]/Apache/Apache::Vhost[default]/Concat[15-default.conf]/File[/var/lib/puppet/concat/15-default.conf]/ensure: created\nInfo: /Stage[main]/Apache/Apache::Vhost[default]/Concat[15-default.conf]/File[/var/lib/puppet/concat/15-default.conf]: Scheduling refresh of Exec[concat_15-default.conf]\nNotice: /Stage[main]/Apache/Apache::Vhost[default]/Concat[15-default.conf]/File[/var/lib/puppet/concat/15-default.conf/fragments]/ensure: created\nInfo: /Stage[main]/Apache/Apache::Vhost[default]/Concat[15-default.conf]/File[/var/lib/puppet/concat/15-default.conf/fragments]: Scheduling refresh of Exec[concat_15-default.conf]\nNotice: /Stage[main]/Apache/Apache::Vhost[default]/Concat::Fragment[default-serversignature]/File[/var/lib/puppet/concat/15-default.conf/fragments/90_default-serversignature]/ensure: defined content as '{md5}9bf5a458783ab459e5043e1cdf671fa7'\nInfo: /Stage[main]/Apache/Apache::Vhost[default]/Concat::Fragment[default-serversignature]/File[/var/lib/puppet/concat/15-default.conf/fragments/90_default-serversignature]: Scheduling refresh of Exec[concat_15-default.conf]\nNotice: /Stage[main]/Apache/Apache::Vhost[default]/Concat::Fragment[default-directories]/File[/var/lib/puppet/concat/15-default.conf/fragments/60_default-directories]/ensure: defined content as '{md5}5e2a84875965faa5e3df0e222301ba37'\nInfo: /Stage[main]/Apache/Apache::Vhost[default]/Concat::Fragment[default-directories]/File[/var/lib/puppet/concat/15-default.conf/fragments/60_default-directories]: Scheduling refresh of Exec[concat_15-default.conf]\nNotice: /Stage[main]/Apache/Apache::Vhost[default]/Concat::Fragment[default-docroot]/File[/var/lib/puppet/concat/15-default.conf/fragments/10_default-docroot]/ensure: defined content as '{md5}6faaccbc7ca8bc885ebf139223885d52'\nInfo: /Stage[main]/Apache/Apache::Vhost[default]/Concat::Fragment[default-docroot]/File[/var/lib/puppet/concat/15-default.conf/fragments/10_default-docroot]: Scheduling refresh of Exec[concat_15-default.conf]\nNotice: /Stage[main]/Apache/Apache::Vhost[default]/Concat::Fragment[default-apache-header]/File[/var/lib/puppet/concat/15-default.conf/fragments/0_default-apache-header]/ensure: defined content as '{md5}c46eea5ff4d7874403fa7a9228888f0e'\nInfo: /Stage[main]/Apache/Apache::Vhost[default]/Concat::Fragment[default-apache-header]/File[/var/lib/puppet/concat/15-default.conf/fragments/0_default-apache-header]: Scheduling refresh of Exec[concat_15-default.conf]\nNotice: /Stage[main]/Apache/Apache::Vhost[default]/Concat[15-default.conf]/File[/var/lib/puppet/concat/15-default.conf/fragments.concat]/ensure: created\nNotice: /Stage[main]/Keystone::Logging/Keystone_config[DEFAULT/debug]/ensure: created\nInfo: /Stage[main]/Keystone::Logging/Keystone_config[DEFAULT/debug]: Scheduling refresh of Anchor[keystone::config::end]\nNotice: /Stage[main]/Keystone/Keystone_config[ssl/keyfile]/ensure: created\nInfo: /Stage[main]/Keystone/Keystone_config[ssl/keyfile]: Scheduling refresh of Anchor[keystone::config::end]\nNotice: /Stage[main]/Keystone/Keystone_config[signing/certfile]/ensure: created\nInfo: /Stage[main]/Keystone/Keystone_config[signing/certfile]: Scheduling refresh of Anchor[keystone::config::end]\nNotice: /Stage[main]/Keystone/Keystone_config[DEFAULT/admin_bind_host]/ensure: created\nInfo: 
/Stage[main]/Keystone/Keystone_config[DEFAULT/admin_bind_host]: Scheduling refresh of Anchor[keystone::config::end]\nNotice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Apache::Listen[35357]/Concat::Fragment[Listen 35357]/File[/var/lib/puppet/concat/_etc_httpd_conf_ports.conf/fragments/10_Listen 35357]/ensure: defined content as '{md5}37dc13694e40f667def8eaa0cc261d03'\nInfo: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Apache::Listen[35357]/Concat::Fragment[Listen 35357]/File[/var/lib/puppet/concat/_etc_httpd_conf_ports.conf/fragments/10_Listen 35357]: Scheduling refresh of Exec[concat_/etc/httpd/conf/ports.conf]\nNotice: /Stage[main]/Keystone/Keystone_config[eventlet_server/public_workers]/ensure: created\nInfo: /Stage[main]/Keystone/Keystone_config[eventlet_server/public_workers]: Scheduling refresh of Anchor[keystone::config::end]\nNotice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Apache::Listen[5000]/Concat::Fragment[Listen 5000]/File[/var/lib/puppet/concat/_etc_httpd_conf_ports.conf/fragments/10_Listen 5000]/ensure: defined content as '{md5}9ce4fddc0fe1c0dd6016a171946def55'\nInfo: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Apache::Listen[5000]/Concat::Fragment[Listen 5000]/File[/var/lib/puppet/concat/_etc_httpd_conf_ports.conf/fragments/10_Listen 5000]: Scheduling refresh of Exec[concat_/etc/httpd/conf/ports.conf]\nNotice: /Stage[main]/Apache/Apache::Vhost[default]/Apache::Listen[80]/Concat::Fragment[Listen 80]/File[/var/lib/puppet/concat/_etc_httpd_conf_ports.conf/fragments/10_Listen 80]/ensure: defined content as '{md5}d5fcefc335117f400d451de47efeca87'\nInfo: /Stage[main]/Apache/Apache::Vhost[default]/Apache::Listen[80]/Concat::Fragment[Listen 80]/File[/var/lib/puppet/concat/_etc_httpd_conf_ports.conf/fragments/10_Listen 80]: Scheduling refresh of Exec[concat_/etc/httpd/conf/ports.conf]\nNotice: /Stage[main]/Keystone/Keystone_config[catalog/template_file]/ensure: created\nInfo: /Stage[main]/Keystone/Keystone_config[catalog/template_file]: Scheduling refresh of Anchor[keystone::config::end]\nNotice: /Stage[main]/Keystone/Keystone_config[token/driver]/ensure: created\nInfo: /Stage[main]/Keystone/Keystone_config[token/driver]: Scheduling refresh of Anchor[keystone::config::end]\nNotice: /Stage[main]/Keystone/Keystone_config[DEFAULT/public_bind_host]/ensure: created\nInfo: /Stage[main]/Keystone/Keystone_config[DEFAULT/public_bind_host]: Scheduling refresh of Anchor[keystone::config::end]\nNotice: /Stage[main]/Keystone/Keystone_config[DEFAULT/admin_token]/ensure: created\nInfo: /Stage[main]/Keystone/Keystone_config[DEFAULT/admin_token]: Scheduling refresh of Anchor[keystone::config::end]\nNotice: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Apache::Listen[8774]/Concat::Fragment[Listen 8774]/File[/var/lib/puppet/concat/_etc_httpd_conf_ports.conf/fragments/10_Listen 8774]/ensure: defined content as '{md5}edb2a81e84f59aaa4978ff2d53c01a3e'\nInfo: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Apache::Listen[8774]/Concat::Fragment[Listen 8774]/File[/var/lib/puppet/concat/_etc_httpd_conf_ports.conf/fragments/10_Listen 8774]: Scheduling refresh of Exec[concat_/etc/httpd/conf/ports.conf]\nNotice: /Stage[main]/Keystone::Logging/Keystone_config[DEFAULT/log_dir]/ensure: created\nInfo: /Stage[main]/Keystone::Logging/Keystone_config[DEFAULT/log_dir]: Scheduling refresh of 
Anchor[keystone::config::end]\nNotice: /Stage[main]/Keystone/Keystone_config[DEFAULT/admin_port]/ensure: created\nInfo: /Stage[main]/Keystone/Keystone_config[DEFAULT/admin_port]: Scheduling refresh of Anchor[keystone::config::end]\nNotice: /Stage[main]/Apache/Apache::Vhost[default]/Concat[15-default.conf]/File[/var/lib/puppet/concat/15-default.conf/fragments.concat.out]/ensure: created\nNotice: /Stage[main]/Apache/Apache::Vhost[default]/Concat::Fragment[default-file_footer]/File[/var/lib/puppet/concat/15-default.conf/fragments/999_default-file_footer]/ensure: defined content as '{md5}e27b2525783e590ca1820f1e2118285d'\nInfo: /Stage[main]/Apache/Apache::Vhost[default]/Concat::Fragment[default-file_footer]/File[/var/lib/puppet/concat/15-default.conf/fragments/999_default-file_footer]: Scheduling refresh of Exec[concat_15-default.conf]\nNotice: /Stage[main]/Keystone/Keystone_config[signing/cert_subject]/ensure: created\nInfo: /Stage[main]/Keystone/Keystone_config[signing/cert_subject]: Scheduling refresh of Anchor[keystone::config::end]\nNotice: /Stage[main]/Apache/Apache::Vhost[default]/Concat::Fragment[default-scriptalias]/File[/var/lib/puppet/concat/15-default.conf/fragments/200_default-scriptalias]/ensure: defined content as '{md5}7fc65400381c3a010f38870f94f236f0'\nInfo: /Stage[main]/Apache/Apache::Vhost[default]/Concat::Fragment[default-scriptalias]/File[/var/lib/puppet/concat/15-default.conf/fragments/200_default-scriptalias]: Scheduling refresh of Exec[concat_15-default.conf]\nNotice: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Apache::Listen[6385]/Concat::Fragment[Listen 6385]/File[/var/lib/puppet/concat/_etc_httpd_conf_ports.conf/fragments/10_Listen 6385]/ensure: defined content as '{md5}dab46123b45901c26ef6386ec1195db9'\nInfo: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Apache::Listen[6385]/Concat::Fragment[Listen 6385]/File[/var/lib/puppet/concat/_etc_httpd_conf_ports.conf/fragments/10_Listen 6385]: Scheduling refresh of Exec[concat_/etc/httpd/conf/ports.conf]\nNotice: /Stage[main]/Apache/Concat[/etc/httpd/conf/ports.conf]/Exec[concat_/etc/httpd/conf/ports.conf]/returns: executed successfully\nNotice: /Stage[main]/Apache/Concat[/etc/httpd/conf/ports.conf]/Exec[concat_/etc/httpd/conf/ports.conf]: Triggered 'refresh' from 8 events\nNotice: /Stage[main]/Apache/Concat[/etc/httpd/conf/ports.conf]/File[/etc/httpd/conf/ports.conf]/ensure: defined content as '{md5}ae39e379894fcb4065bbee3724f7036d'\nInfo: Concat[/etc/httpd/conf/ports.conf]: Scheduling refresh of Class[Apache::Service]\nNotice: /Stage[main]/Apache/File[/etc/httpd/conf/httpd.conf]/content: \n--- /etc/httpd/conf/httpd.conf\t2016-05-12 11:16:14.000000000 +0100\n+++ /tmp/puppet-file20160520-26469-t9ohgg\t2016-05-20 12:29:49.253135506 +0100\n@@ -1,353 +1,49 @@\n-#\n-# This is the main Apache HTTP server configuration file. It contains the\n-# configuration directives that give the server its instructions.\n-# See for detailed information.\n-# In particular, see \n-# \n-# for a discussion of each configuration directive.\n-#\n-# Do NOT simply read the instructions in here without understanding\n-# what they do. They're here only as hints or reminders. If you are unsure\n-# consult the online docs. You have been warned. \n-#\n-# Configuration and logfile names: If the filenames you specify for many\n-# of the server's control files begin with \"/\" (or \"drive:/\" for Win32), the\n-# server will use that explicit path. 
If the filenames do *not* begin\n-# with \"/\", the value of ServerRoot is prepended -- so 'log/access_log'\n-# with ServerRoot set to '/www' will be interpreted by the\n-# server as '/www/log/access_log', where as '/log/access_log' will be\n-# interpreted as '/log/access_log'.\n-\n-#\n-# ServerRoot: The top of the directory tree under which the server's\n-# configuration, error, and log files are kept.\n-#\n-# Do not add a slash at the end of the directory path. If you point\n-# ServerRoot at a non-local disk, be sure to specify a local disk on the\n-# Mutex directive, if file-based mutexes are used. If you wish to share the\n-# same ServerRoot for multiple httpd daemons, you will need to change at\n-# least PidFile.\n-#\n+# Security\n+ServerTokens OS\n+ServerSignature On\n+TraceEnable On\n+\n+ServerName \"n2.dusty.ci.centos.org\"\n ServerRoot \"/etc/httpd\"\n+PidFile run/httpd.pid\n+Timeout 120\n+KeepAlive Off\n+MaxKeepAliveRequests 100\n+KeepAliveTimeout 15\n+LimitRequestFieldSize 8190\n+\n \n-#\n-# Listen: Allows you to bind Apache to specific IP addresses and/or\n-# ports, instead of the default. See also the \n-# directive.\n-#\n-# Change this to Listen on specific IP addresses as shown below to \n-# prevent Apache from glomming onto all bound IP addresses.\n-#\n-#Listen 12.34.56.78:80\n-Listen 80\n-\n-#\n-# Dynamic Shared Object (DSO) Support\n-#\n-# To be able to use the functionality of a module which was built as a DSO you\n-# have to place corresponding `LoadModule' lines at this location so the\n-# directives contained in it are actually available _before_ they are used.\n-# Statically compiled modules (those listed by `httpd -l') do not need\n-# to be loaded here.\n-#\n-# Example:\n-# LoadModule foo_module modules/mod_foo.so\n-#\n-Include conf.modules.d/*.conf\n-\n-#\n-# If you wish httpd to run as a different user or group, you must run\n-# httpd as root initially and it will switch. \n-#\n-# User/Group: The name (or #number) of the user/group to run httpd as.\n-# It is usually good practice to create a dedicated user and group for\n-# running httpd, as with most system services.\n-#\n User apache\n Group apache\n \n-# 'Main' server configuration\n-#\n-# The directives in this section set up the values used by the 'main'\n-# server, which responds to any requests that aren't handled by a\n-# definition. These values also provide defaults for\n-# any containers you may define later in the file.\n-#\n-# All of these directives may appear inside containers,\n-# in which case these default settings will be overridden for the\n-# virtual host being defined.\n-#\n-\n-#\n-# ServerAdmin: Your address, where problems with the server should be\n-# e-mailed. This address appears on some server-generated pages, such\n-# as error documents. e.g. admin@your-domain.com\n-#\n-ServerAdmin root@localhost\n-\n-#\n-# ServerName gives the name and port that the server uses to identify itself.\n-# This can often be determined automatically, but we recommend you specify\n-# it explicitly to prevent problems during startup.\n-#\n-# If your host doesn't have a registered DNS name, enter its IP address here.\n-#\n-#ServerName www.example.com:80\n-\n-#\n-# Deny access to the entirety of your server's filesystem. 
You must\n-# explicitly permit access to web content directories in other \n-# blocks below.\n-#\n-\n- AllowOverride none\n+AccessFileName .htaccess\n+\n Require all denied\n-\n+\n \n-#\n-# Note that from this point forward you must specifically allow\n-# particular features to be enabled - so if something's not working as\n-# you might expect, make sure that you have specifically enabled it\n-# below.\n-#\n-\n-#\n-# DocumentRoot: The directory out of which you will serve your\n-# documents. By default, all requests are taken from this directory, but\n-# symbolic links and aliases may be used to point to other locations.\n-#\n-DocumentRoot \"/var/www/html\"\n-\n-#\n-# Relax access to content within /var/www.\n-#\n-\n- AllowOverride None\n- # Allow open access:\n- Require all granted\n+\n+ Options FollowSymLinks\n+ AllowOverride None\n \n \n-# Further relax access to the default document root:\n-\n- #\n- # Possible values for the Options directive are \"None\", \"All\",\n- # or any combination of:\n- # Indexes Includes FollowSymLinks SymLinksifOwnerMatch ExecCGI MultiViews\n- #\n- # Note that \"MultiViews\" must be named *explicitly* --- \"Options All\"\n- # doesn't give it to you.\n- #\n- # The Options directive is both complicated and important. Please see\n- # http://httpd.apache.org/docs/2.4/mod/core.html#options\n- # for more information.\n- #\n- Options Indexes FollowSymLinks\n-\n- #\n- # AllowOverride controls what directives may be placed in .htaccess files.\n- # It can be \"All\", \"None\", or any combination of the keywords:\n- # Options FileInfo AuthConfig Limit\n- #\n- AllowOverride None\n-\n- #\n- # Controls who can get stuff from this server.\n- #\n- Require all granted\n-\n \n-#\n-# DirectoryIndex: sets the file that Apache will serve if a directory\n-# is requested.\n-#\n-\n- DirectoryIndex index.html\n-\n-\n-#\n-# The following lines prevent .htaccess and .htpasswd files from being \n-# viewed by Web clients. \n-#\n-\n- Require all denied\n-\n-\n-#\n-# ErrorLog: The location of the error log file.\n-# If you do not specify an ErrorLog directive within a \n-# container, error messages relating to that virtual host will be\n-# logged here. If you *do* define an error logfile for a \n-# container, that host's errors will be logged there and not here.\n-#\n-ErrorLog \"logs/error_log\"\n-\n-#\n-# LogLevel: Control the number of messages logged to the error_log.\n-# Possible values include: debug, info, notice, warn, error, crit,\n-# alert, emerg.\n-#\n+HostnameLookups Off\n+ErrorLog \"/var/log/httpd/error_log\"\n LogLevel warn\n+EnableSendfile On\n \n-\n- #\n- # The following directives define some format nicknames for use with\n- # a CustomLog directive (see below).\n- #\n- LogFormat \"%h %l %u %t \\\"%r\\\" %>s %b \\\"%{Referer}i\\\" \\\"%{User-Agent}i\\\"\" combined\n- LogFormat \"%h %l %u %t \\\"%r\\\" %>s %b\" common\n-\n- \n- # You need to enable mod_logio.c to use %I and %O\n- LogFormat \"%h %l %u %t \\\"%r\\\" %>s %b \\\"%{Referer}i\\\" \\\"%{User-Agent}i\\\" %I %O\" combinedio\n- \n-\n- #\n- # The location and format of the access logfile (Common Logfile Format).\n- # If you do not define any access logfiles within a \n- # container, they will be logged here. 
Contrariwise, if you *do*\n- # define per- access logfiles, transactions will be\n- # logged therein and *not* in this file.\n- #\n- #CustomLog \"logs/access_log\" common\n-\n- #\n- # If you prefer a logfile with access, agent, and referer information\n- # (Combined Logfile Format) you can use the following directive.\n- #\n- CustomLog \"logs/access_log\" combined\n-\n-\n-\n- #\n- # Redirect: Allows you to tell clients about documents that used to \n- # exist in your server's namespace, but do not anymore. The client \n- # will make a new request for the document at its new location.\n- # Example:\n- # Redirect permanent /foo http://www.example.com/bar\n-\n- #\n- # Alias: Maps web paths into filesystem paths and is used to\n- # access content that does not live under the DocumentRoot.\n- # Example:\n- # Alias /webpath /full/filesystem/path\n- #\n- # If you include a trailing / on /webpath then the server will\n- # require it to be present in the URL. You will also likely\n- # need to provide a section to allow access to\n- # the filesystem path.\n-\n- #\n- # ScriptAlias: This controls which directories contain server scripts. \n- # ScriptAliases are essentially the same as Aliases, except that\n- # documents in the target directory are treated as applications and\n- # run by the server when requested rather than as documents sent to the\n- # client. The same rules about trailing \"/\" apply to ScriptAlias\n- # directives as to Alias.\n- #\n- ScriptAlias /cgi-bin/ \"/var/www/cgi-bin/\"\n-\n-\n-\n-#\n-# \"/var/www/cgi-bin\" should be changed to whatever your ScriptAliased\n-# CGI directory exists, if you have that configured.\n-#\n-\n- AllowOverride None\n- Options None\n- Require all granted\n-\n+#Listen 80\n+\n+\n+Include \"/etc/httpd/conf.modules.d/*.load\"\n+Include \"/etc/httpd/conf.modules.d/*.conf\"\n+Include \"/etc/httpd/conf/ports.conf\"\n+\n+LogFormat \"%h %l %u %t \\\"%r\\\" %>s %b \\\"%{Referer}i\\\" \\\"%{User-Agent}i\\\"\" combined\n+LogFormat \"%h %l %u %t \\\"%r\\\" %>s %b\" common\n+LogFormat \"%{Referer}i -> %U\" referer\n+LogFormat \"%{User-agent}i\" agent\n+LogFormat \"%{X-Forwarded-For}i %l %u %t \\\"%r\\\" %s %b \\\"%{Referer}i\\\" \\\"%{User-agent}i\\\"\" forwarded\n+\n+IncludeOptional \"/etc/httpd/conf.d/*.conf\"\n \n-\n- #\n- # TypesConfig points to the file containing the list of mappings from\n- # filename extension to MIME-type.\n- #\n- TypesConfig /etc/mime.types\n-\n- #\n- # AddType allows you to add to or override the MIME configuration\n- # file specified in TypesConfig for specific file types.\n- #\n- #AddType application/x-gzip .tgz\n- #\n- # AddEncoding allows you to have certain browsers uncompress\n- # information on the fly. Note: Not all browsers support this.\n- #\n- #AddEncoding x-compress .Z\n- #AddEncoding x-gzip .gz .tgz\n- #\n- # If the AddEncoding directives above are commented-out, then you\n- # probably should define those extensions to indicate media types:\n- #\n- AddType application/x-compress .Z\n- AddType application/x-gzip .gz .tgz\n-\n- #\n- # AddHandler allows you to map certain file extensions to \"handlers\":\n- # actions unrelated to filetype. 
These can be either built into the server\n- # or added with the Action directive (see below)\n- #\n- # To use CGI scripts outside of ScriptAliased directories:\n- # (You will also need to add \"ExecCGI\" to the \"Options\" directive.)\n- #\n- #AddHandler cgi-script .cgi\n-\n- # For type maps (negotiated resources):\n- #AddHandler type-map var\n-\n- #\n- # Filters allow you to process content before it is sent to the client.\n- #\n- # To parse .shtml files for server-side includes (SSI):\n- # (You will also need to add \"Includes\" to the \"Options\" directive.)\n- #\n- AddType text/html .shtml\n- AddOutputFilter INCLUDES .shtml\n-\n-\n-#\n-# Specify a default charset for all content served; this enables\n-# interpretation of all content as UTF-8 by default. To use the \n-# default browser choice (ISO-8859-1), or to allow the META tags\n-# in HTML content to override this choice, comment out this\n-# directive:\n-#\n-AddDefaultCharset UTF-8\n-\n-\n- #\n- # The mod_mime_magic module allows the server to use various hints from the\n- # contents of the file itself to determine its type. The MIMEMagicFile\n- # directive tells the module where the hint definitions are located.\n- #\n- MIMEMagicFile conf/magic\n-\n-\n-#\n-# Customizable error responses come in three flavors:\n-# 1) plain text 2) local redirects 3) external redirects\n-#\n-# Some examples:\n-#ErrorDocument 500 \"The server made a boo boo.\"\n-#ErrorDocument 404 /missing.html\n-#ErrorDocument 404 \"/cgi-bin/missing_handler.pl\"\n-#ErrorDocument 402 http://www.example.com/subscription_info.html\n-#\n-\n-#\n-# EnableMMAP and EnableSendfile: On systems that support it, \n-# memory-mapping or the sendfile syscall may be used to deliver\n-# files. This usually improves server performance, but must\n-# be turned off when serving from networked-mounted \n-# filesystems or if support for these functions is otherwise\n-# broken on your system.\n-# Defaults if commented: EnableMMAP On, EnableSendfile Off\n-#\n-#EnableMMAP off\n-EnableSendfile on\n-\n-# Supplemental configuration\n-#\n-# Load config files in the \"/etc/httpd/conf.d\" directory, if any.\n-IncludeOptional conf.d/*.conf\nInfo: /Stage[main]/Apache/File[/etc/httpd/conf/httpd.conf]: Filebucketed /etc/httpd/conf/httpd.conf to puppet with sum f5e7449c0f17bc856e86011cb5d152ba\nNotice: /Stage[main]/Apache/File[/etc/httpd/conf/httpd.conf]/content: content changed '{md5}f5e7449c0f17bc856e86011cb5d152ba' to '{md5}b3ed70a3a40f48d061c63f23fbbea111'\nInfo: /Stage[main]/Apache/File[/etc/httpd/conf/httpd.conf]: Scheduling refresh of Class[Apache::Service]\nNotice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat[10-keystone_wsgi_admin.conf]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf]/ensure: created\nInfo: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat[10-keystone_wsgi_admin.conf]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf]: Scheduling refresh of Exec[concat_10-keystone_wsgi_admin.conf]\nNotice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat[10-keystone_wsgi_admin.conf]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments]/ensure: created\nInfo: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat[10-keystone_wsgi_admin.conf]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments]: Scheduling refresh of Exec[concat_10-keystone_wsgi_admin.conf]\nNotice: 
/Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat::Fragment[keystone_wsgi_admin-wsgi]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments/260_keystone_wsgi_admin-wsgi]/ensure: defined content as '{md5}eab4d58b350697a7677844fd645581bf'\nInfo: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat::Fragment[keystone_wsgi_admin-wsgi]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments/260_keystone_wsgi_admin-wsgi]: Scheduling refresh of Exec[concat_10-keystone_wsgi_admin.conf]\nNotice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat::Fragment[keystone_wsgi_admin-directories]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments/60_keystone_wsgi_admin-directories]/ensure: defined content as '{md5}cc81234a3bbf77f857ed3f11bb369e8c'\nInfo: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat::Fragment[keystone_wsgi_admin-directories]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments/60_keystone_wsgi_admin-directories]: Scheduling refresh of Exec[concat_10-keystone_wsgi_admin.conf]\nNotice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat::Fragment[keystone_wsgi_admin-file_footer]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments/999_keystone_wsgi_admin-file_footer]/ensure: defined content as '{md5}e27b2525783e590ca1820f1e2118285d'\nInfo: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat::Fragment[keystone_wsgi_admin-file_footer]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments/999_keystone_wsgi_admin-file_footer]: Scheduling refresh of Exec[concat_10-keystone_wsgi_admin.conf]\nNotice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat::Fragment[keystone_wsgi_admin-docroot]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments/10_keystone_wsgi_admin-docroot]/ensure: defined content as '{md5}e250ff3401328e2e106702576d684293'\nInfo: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat::Fragment[keystone_wsgi_admin-docroot]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments/10_keystone_wsgi_admin-docroot]: Scheduling refresh of Exec[concat_10-keystone_wsgi_admin.conf]\nNotice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat::Fragment[keystone_wsgi_admin-serversignature]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments/90_keystone_wsgi_admin-serversignature]/ensure: defined content as '{md5}9bf5a458783ab459e5043e1cdf671fa7'\nInfo: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat::Fragment[keystone_wsgi_admin-serversignature]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments/90_keystone_wsgi_admin-serversignature]: Scheduling refresh of Exec[concat_10-keystone_wsgi_admin.conf]\nNotice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat::Fragment[keystone_wsgi_admin-logging]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments/80_keystone_wsgi_admin-logging]/ensure: defined content as '{md5}6e95210e81b53fbd537c884ba77577a6'\nInfo: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat::Fragment[keystone_wsgi_admin-logging]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments/80_keystone_wsgi_admin-logging]: Scheduling refresh of Exec[concat_10-keystone_wsgi_admin.conf]\nNotice: 
/Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat::Fragment[keystone_wsgi_admin-ssl]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments/230_keystone_wsgi_admin-ssl]/ensure: defined content as '{md5}30fbced56cdd99b65558d366e970e5fd'\nInfo: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat::Fragment[keystone_wsgi_admin-ssl]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments/230_keystone_wsgi_admin-ssl]: Scheduling refresh of Exec[concat_10-keystone_wsgi_admin.conf]\nNotice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat[10-keystone_wsgi_admin.conf]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments.concat]/ensure: created\nNotice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat::Fragment[keystone_wsgi_admin-access_log]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments/100_keystone_wsgi_admin-access_log]/ensure: defined content as '{md5}f3a5a390b72c0e5ada35efbd1ab9c568'\nInfo: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat::Fragment[keystone_wsgi_admin-access_log]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments/100_keystone_wsgi_admin-access_log]: Scheduling refresh of Exec[concat_10-keystone_wsgi_admin.conf]\nNotice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat[10-keystone_wsgi_main.conf]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf]/ensure: created\nInfo: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat[10-keystone_wsgi_main.conf]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf]: Scheduling refresh of Exec[concat_10-keystone_wsgi_main.conf]\nNotice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat[10-keystone_wsgi_main.conf]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments]/ensure: created\nInfo: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat[10-keystone_wsgi_main.conf]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments]: Scheduling refresh of Exec[concat_10-keystone_wsgi_main.conf]\nNotice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat::Fragment[keystone_wsgi_main-file_footer]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments/999_keystone_wsgi_main-file_footer]/ensure: defined content as '{md5}e27b2525783e590ca1820f1e2118285d'\nInfo: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat::Fragment[keystone_wsgi_main-file_footer]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments/999_keystone_wsgi_main-file_footer]: Scheduling refresh of Exec[concat_10-keystone_wsgi_main.conf]\nNotice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat[10-keystone_wsgi_main.conf]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments.concat]/ensure: created\nNotice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat::Fragment[keystone_wsgi_main-serversignature]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments/90_keystone_wsgi_main-serversignature]/ensure: defined content as '{md5}9bf5a458783ab459e5043e1cdf671fa7'\nInfo: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat::Fragment[keystone_wsgi_main-serversignature]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments/90_keystone_wsgi_main-serversignature]: Scheduling refresh of 
Exec[concat_10-keystone_wsgi_main.conf]\nNotice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat[10-keystone_wsgi_main.conf]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments.concat.out]/ensure: created\nNotice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat::Fragment[keystone_wsgi_main-logging]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments/80_keystone_wsgi_main-logging]/ensure: defined content as '{md5}2e5c08362091258b73059cd0e5435e9a'\nInfo: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat::Fragment[keystone_wsgi_main-logging]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments/80_keystone_wsgi_main-logging]: Scheduling refresh of Exec[concat_10-keystone_wsgi_main.conf]\nNotice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat::Fragment[keystone_wsgi_main-access_log]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments/100_keystone_wsgi_main-access_log]/ensure: defined content as '{md5}f8509b8e1ef317dd58bbcca1480a9c61'\nInfo: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat::Fragment[keystone_wsgi_main-access_log]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments/100_keystone_wsgi_main-access_log]: Scheduling refresh of Exec[concat_10-keystone_wsgi_main.conf]\nNotice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat::Fragment[keystone_wsgi_main-apache-header]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments/0_keystone_wsgi_main-apache-header]/ensure: defined content as '{md5}bcbedce152a9ba8190ab5a78ad4256f9'\nInfo: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat::Fragment[keystone_wsgi_main-apache-header]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments/0_keystone_wsgi_main-apache-header]: Scheduling refresh of Exec[concat_10-keystone_wsgi_main.conf]\nNotice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat::Fragment[keystone_wsgi_main-wsgi]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments/260_keystone_wsgi_main-wsgi]/ensure: defined content as '{md5}0ed0f415940e9362ef9e1871efb2c050'\nInfo: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat::Fragment[keystone_wsgi_main-wsgi]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments/260_keystone_wsgi_main-wsgi]: Scheduling refresh of Exec[concat_10-keystone_wsgi_main.conf]\nNotice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat::Fragment[keystone_wsgi_main-ssl]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments/230_keystone_wsgi_main-ssl]/ensure: defined content as '{md5}30fbced56cdd99b65558d366e970e5fd'\nInfo: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat::Fragment[keystone_wsgi_main-ssl]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments/230_keystone_wsgi_main-ssl]: Scheduling refresh of Exec[concat_10-keystone_wsgi_main.conf]\nNotice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat::Fragment[keystone_wsgi_main-docroot]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments/10_keystone_wsgi_main-docroot]/ensure: defined content as '{md5}e250ff3401328e2e106702576d684293'\nInfo: 
/Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat::Fragment[keystone_wsgi_main-docroot]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments/10_keystone_wsgi_main-docroot]: Scheduling refresh of Exec[concat_10-keystone_wsgi_main.conf]\nNotice: /Stage[main]/Keystone/Keystone_config[token/expiration]/ensure: created\nInfo: /Stage[main]/Keystone/Keystone_config[token/expiration]: Scheduling refresh of Anchor[keystone::config::end]\nNotice: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat[10-nova_api_wsgi.conf]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf]/ensure: created\nInfo: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat[10-nova_api_wsgi.conf]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf]: Scheduling refresh of Exec[concat_10-nova_api_wsgi.conf]\nNotice: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat[10-nova_api_wsgi.conf]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments]/ensure: created\nInfo: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat[10-nova_api_wsgi.conf]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments]: Scheduling refresh of Exec[concat_10-nova_api_wsgi.conf]\nNotice: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat::Fragment[nova_api_wsgi-serversignature]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments/90_nova_api_wsgi-serversignature]/ensure: defined content as '{md5}9bf5a458783ab459e5043e1cdf671fa7'\nInfo: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat::Fragment[nova_api_wsgi-serversignature]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments/90_nova_api_wsgi-serversignature]: Scheduling refresh of Exec[concat_10-nova_api_wsgi.conf]\nNotice: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat::Fragment[nova_api_wsgi-ssl]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments/230_nova_api_wsgi-ssl]/ensure: defined content as '{md5}6e6f07e9782e4535b25afa0e9dbd5964'\nInfo: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat::Fragment[nova_api_wsgi-ssl]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments/230_nova_api_wsgi-ssl]: Scheduling refresh of Exec[concat_10-nova_api_wsgi.conf]\nNotice: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat::Fragment[nova_api_wsgi-docroot]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments/10_nova_api_wsgi-docroot]/ensure: defined content as '{md5}a24d3496cbab869d04b9f6400e91f05b'\nInfo: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat::Fragment[nova_api_wsgi-docroot]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments/10_nova_api_wsgi-docroot]: Scheduling refresh of Exec[concat_10-nova_api_wsgi.conf]\nNotice: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat::Fragment[nova_api_wsgi-file_footer]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments/999_nova_api_wsgi-file_footer]/ensure: defined content as '{md5}e27b2525783e590ca1820f1e2118285d'\nInfo: 
/Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat::Fragment[nova_api_wsgi-file_footer]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments/999_nova_api_wsgi-file_footer]: Scheduling refresh of Exec[concat_10-nova_api_wsgi.conf]\nNotice: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat[10-nova_api_wsgi.conf]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments.concat]/ensure: created\nNotice: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat::Fragment[nova_api_wsgi-apache-header]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments/0_nova_api_wsgi-apache-header]/ensure: defined content as '{md5}532286892f0965124c5d0f7a2d7ad2d2'\nInfo: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat::Fragment[nova_api_wsgi-apache-header]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments/0_nova_api_wsgi-apache-header]: Scheduling refresh of Exec[concat_10-nova_api_wsgi.conf]\nNotice: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat[10-nova_api_wsgi.conf]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments.concat.out]/ensure: created\nNotice: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat::Fragment[nova_api_wsgi-logging]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments/80_nova_api_wsgi-logging]/ensure: defined content as '{md5}fffc2d2c643ad504aca6c347d7aec2d6'\nInfo: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat::Fragment[nova_api_wsgi-logging]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments/80_nova_api_wsgi-logging]: Scheduling refresh of Exec[concat_10-nova_api_wsgi.conf]\nNotice: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat::Fragment[nova_api_wsgi-directories]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments/60_nova_api_wsgi-directories]/ensure: defined content as '{md5}969793e0f283be30a0641501324cd29c'\nInfo: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat::Fragment[nova_api_wsgi-directories]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments/60_nova_api_wsgi-directories]: Scheduling refresh of Exec[concat_10-nova_api_wsgi.conf]\nNotice: /Stage[main]/Keystone/Keystone_config[eventlet_server/admin_workers]/ensure: created\nInfo: /Stage[main]/Keystone/Keystone_config[eventlet_server/admin_workers]: Scheduling refresh of Anchor[keystone::config::end]\nNotice: /Stage[main]/Keystone::Db/Keystone_config[database/connection]/ensure: created\nInfo: /Stage[main]/Keystone::Db/Keystone_config[database/connection]: Scheduling refresh of Anchor[keystone::config::end]\nNotice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat[10-keystone_wsgi_admin.conf]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments.concat.out]/ensure: created\nNotice: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat::Fragment[nova_api_wsgi-access_log]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments/100_nova_api_wsgi-access_log]/ensure: defined content as '{md5}3202d2662ed78e6f729646225603e1f5'\nInfo: 
/Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat::Fragment[nova_api_wsgi-access_log]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments/100_nova_api_wsgi-access_log]: Scheduling refresh of Exec[concat_10-nova_api_wsgi.conf]
Notice: /Stage[main]/Rabbitmq::Config/File[/etc/security/limits.d/rabbitmq-server.conf]/ensure: defined content as '{md5}5ddc6ba5fcaeddd5b1565e5adfda5236'
Info: /Stage[main]/Rabbitmq::Config/File[/etc/security/limits.d/rabbitmq-server.conf]: Scheduling refresh of Class[Rabbitmq::Service]
Notice: /Stage[main]/Apache/Apache::Vhost[default]/Concat::Fragment[default-access_log]/File[/var/lib/puppet/concat/15-default.conf/fragments/100_default-access_log]/ensure: defined content as '{md5}65fb033baac888b4ab85c295e870cb8f'
Info: /Stage[main]/Apache/Apache::Vhost[default]/Concat::Fragment[default-access_log]/File[/var/lib/puppet/concat/15-default.conf/fragments/100_default-access_log]: Scheduling refresh of Exec[concat_15-default.conf]
Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat::Fragment[keystone_wsgi_admin-apache-header]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments/0_keystone_wsgi_admin-apache-header]/ensure: defined content as '{md5}36e2769e5e22c8ff440262db545892f0'
Info: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat::Fragment[keystone_wsgi_admin-apache-header]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments/0_keystone_wsgi_admin-apache-header]: Scheduling refresh of Exec[concat_10-keystone_wsgi_admin.conf]
Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat[10-keystone_wsgi_admin.conf]/Exec[concat_10-keystone_wsgi_admin.conf]/returns: executed successfully
Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat[10-keystone_wsgi_admin.conf]/Exec[concat_10-keystone_wsgi_admin.conf]: Triggered 'refresh' from 11 events
Notice: /Stage[main]/Rabbitmq::Config/File[rabbitmq-env.config]/ensure: defined content as '{md5}d1faed99ee5f85f2e3ef458c2d19f3a8'
Info: /Stage[main]/Rabbitmq::Config/File[rabbitmq-env.config]: Scheduling refresh of Class[Rabbitmq::Service]
Info: Class[Rabbitmq::Config]: Scheduling refresh of Class[Rabbitmq::Service]
Info: Class[Rabbitmq::Service]: Scheduling refresh of Service[rabbitmq-server]
Notice: /Stage[main]/Rabbitmq::Service/Service[rabbitmq-server]/ensure: ensure changed 'stopped' to 'running'
Info: /Stage[main]/Rabbitmq::Service/Service[rabbitmq-server]: Unscheduling refresh on Service[rabbitmq-server]
Notice: /Stage[main]/Rabbitmq::Management/Rabbitmq_user[guest]/ensure: removed
Notice: /Stage[main]/Rabbitmq::Install::Rabbitmqadmin/Staging::File[rabbitmqadmin]/Exec[/var/lib/rabbitmq/rabbitmqadmin]/returns: executed successfully
Notice: /Stage[main]/Rabbitmq::Install::Rabbitmqadmin/File[/usr/local/bin/rabbitmqadmin]/ensure: defined content as '{md5}63d7331e825c865a97b7a8d1299841ff'
Notice: /Stage[main]/Openstack_integration::Ironic/Rabbitmq_user[ironic]/ensure: created
Notice: /Stage[main]/Openstack_integration::Neutron/Rabbitmq_user[neutron]/ensure: created
Notice: /Stage[main]/Openstack_integration::Cinder/Rabbitmq_user[cinder]/ensure: created
Notice: /Stage[main]/Openstack_integration::Nova/Rabbitmq_user[nova]/ensure: created
Notice: /Stage[main]/Openstack_integration::Ironic/Rabbitmq_user_permissions[ironic@/]/ensure: created
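The Rabbitmq_user and Rabbitmq_user_permissions events above come from the puppetlabs-rabbitmq custom types: the default guest account is removed, then one messaging account per OpenStack service is created and granted full rights on the default '/' vhost. A minimal sketch of what the openstack_integration profile presumably declares per service (the password value is illustrative, not the job's real secret):

rabbitmq_user { 'nova':
  admin    => false,
  password => 'a_big_secret',   # illustrative; the real value comes from the scenario's profile
}

rabbitmq_user_permissions { 'nova@/':
  configure_permission => '.*',
  read_permission      => '.*',
  write_permission     => '.*',
}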
Notice: /Stage[main]/Ironic::Conductor/Service[ironic-conductor]/ensure: ensure changed 'stopped' to 'running'
Info: /Stage[main]/Ironic::Conductor/Service[ironic-conductor]: Unscheduling refresh on Service[ironic-conductor]
Notice: /Stage[main]/Ironic::Api/Service[ironic-api]: Triggered 'refresh' from 1 events
Notice: /Stage[main]/Openstack_integration::Nova/Rabbitmq_user_permissions[nova@/]/ensure: created
Notice: /Stage[main]/Openstack_integration::Cinder/Rabbitmq_user_permissions[cinder@/]/ensure: created
Notice: /Stage[main]/Openstack_integration::Glance/Rabbitmq_user[glance]/ensure: created
Notice: /Stage[main]/Openstack_integration::Glance/Rabbitmq_user_permissions[glance@/]/ensure: created
Notice: /Stage[main]/Openstack_integration::Neutron/Rabbitmq_user_permissions[neutron@/]/ensure: created
Notice: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/swift_store_large_object_size]/ensure: created
Info: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/swift_store_large_object_size]: Scheduling refresh of Service[glance-api]
Info: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/swift_store_large_object_size]: Scheduling refresh of Exec[glance-manage db_sync]
Notice: /Stage[main]/Swift::Storage::Account/Swift::Storage::Generic[account]/Package[swift-account]/ensure: created
Info: /Stage[main]/Swift::Storage::Account/Swift::Storage::Generic[account]/Package[swift-account]: Scheduling refresh of Service[swift-account-server]
Info: /Stage[main]/Swift::Storage::Account/Swift::Storage::Generic[account]/Package[swift-account]: Scheduling refresh of Service[swift-account-replicator]
Info: /Stage[main]/Swift::Storage::Account/Swift::Storage::Generic[account]/Package[swift-account]: Scheduling refresh of Anchor[keystone::service::end]
Info: /Stage[main]/Swift::Storage::Account/Swift::Storage::Generic[account]/Package[swift-account]: Scheduling refresh of Service[swift-account-auditor]
Info: /Stage[main]/Swift::Storage::Account/Swift::Storage::Generic[account]/Package[swift-account]: Scheduling refresh of Swift::Service[swift-account-server]
Info: /Stage[main]/Swift::Storage::Account/Swift::Storage::Generic[account]/Package[swift-account]: Scheduling refresh of Swift::Service[swift-account-replicator]
Info: /Stage[main]/Swift::Storage::Account/Swift::Storage::Generic[account]/Package[swift-account]: Scheduling refresh of Swift::Service[swift-account-auditor]
Info: Swift::Service[swift-account-server]: Scheduling refresh of Service[swift-account-server]
Notice: /Stage[main]/Swift::Storage::Account/Swift::Storage::Generic[account]/File[/etc/swift/account-server/]/owner: owner changed 'root' to 'swift'
Notice: /Stage[main]/Swift::Storage::Account/Swift::Storage::Generic[account]/File[/etc/swift/account-server/]/group: group changed 'root' to 'swift'
Info: Swift::Service[swift-account-replicator]: Scheduling refresh of Service[swift-account-replicator]
Notice: /Stage[main]/Swift::Storage::Container/Swift::Storage::Generic[container]/Package[swift-container]/ensure: created
Info: /Stage[main]/Swift::Storage::Container/Swift::Storage::Generic[container]/Package[swift-container]: Scheduling refresh of Service[swift-container-server]
Info: /Stage[main]/Swift::Storage::Container/Swift::Storage::Generic[container]/Package[swift-container]: Scheduling refresh of Service[swift-container-replicator]
Info: /Stage[main]/Swift::Storage::Container/Swift::Storage::Generic[container]/Package[swift-container]: Scheduling refresh of 
Anchor[keystone::service::end]\nInfo: /Stage[main]/Swift::Storage::Container/Swift::Storage::Generic[container]/Package[swift-container]: Scheduling refresh of Service[swift-container-auditor]\nInfo: /Stage[main]/Swift::Storage::Container/Swift::Storage::Generic[container]/Package[swift-container]: Scheduling refresh of Swift::Service[swift-container-server]\nInfo: /Stage[main]/Swift::Storage::Container/Swift::Storage::Generic[container]/Package[swift-container]: Scheduling refresh of Swift::Service[swift-container-replicator]\nInfo: /Stage[main]/Swift::Storage::Container/Swift::Storage::Generic[container]/Package[swift-container]: Scheduling refresh of Swift::Service[swift-container-auditor]\nInfo: Swift::Service[swift-container-replicator]: Scheduling refresh of Service[swift-container-replicator]\nNotice: /Stage[main]/Swift::Storage::Container/Swift::Storage::Generic[container]/File[/etc/swift/container-server/]/owner: owner changed 'root' to 'swift'\nNotice: /Stage[main]/Swift::Storage::Container/Swift::Storage::Generic[container]/File[/etc/swift/container-server/]/group: group changed 'root' to 'swift'\nInfo: Swift::Service[swift-container-auditor]: Scheduling refresh of Service[swift-container-auditor]\nInfo: Swift::Service[swift-container-server]: Scheduling refresh of Service[swift-container-server]\nNotice: /Stage[main]/Swift::Storage::Object/Swift::Storage::Generic[object]/Package[swift-object]/ensure: created\nInfo: /Stage[main]/Swift::Storage::Object/Swift::Storage::Generic[object]/Package[swift-object]: Scheduling refresh of Service[swift-object-server]\nInfo: /Stage[main]/Swift::Storage::Object/Swift::Storage::Generic[object]/Package[swift-object]: Scheduling refresh of Service[swift-object-replicator]\nInfo: /Stage[main]/Swift::Storage::Object/Swift::Storage::Generic[object]/Package[swift-object]: Scheduling refresh of Anchor[keystone::service::end]\nInfo: /Stage[main]/Swift::Storage::Object/Swift::Storage::Generic[object]/Package[swift-object]: Scheduling refresh of Service[swift-object-auditor]\nInfo: /Stage[main]/Swift::Storage::Object/Swift::Storage::Generic[object]/Package[swift-object]: Scheduling refresh of Swift::Service[swift-object-server]\nInfo: /Stage[main]/Swift::Storage::Object/Swift::Storage::Generic[object]/Package[swift-object]: Scheduling refresh of Swift::Service[swift-object-replicator]\nInfo: /Stage[main]/Swift::Storage::Object/Swift::Storage::Generic[object]/Package[swift-object]: Scheduling refresh of Swift::Service[swift-object-auditor]\nNotice: /Stage[main]/Swift::Storage::Object/Swift::Storage::Generic[object]/File[/etc/swift/object-server/]/owner: owner changed 'root' to 'swift'\nNotice: /Stage[main]/Swift::Storage::Object/Swift::Storage::Generic[object]/File[/etc/swift/object-server/]/group: group changed 'root' to 'swift'\nInfo: Swift::Service[swift-object-server]: Scheduling refresh of Service[swift-object-server]\nInfo: Swift::Service[swift-object-replicator]: Scheduling refresh of Service[swift-object-replicator]\nInfo: Swift::Service[swift-object-auditor]: Scheduling refresh of Service[swift-object-auditor]\nInfo: Swift::Service[swift-account-auditor]: Scheduling refresh of Service[swift-account-auditor]\nNotice: /Stage[main]/Swift::Proxy/Package[swift-proxy]/ensure: created\nInfo: /Stage[main]/Swift::Proxy/Package[swift-proxy]: Scheduling refresh of Anchor[keystone::service::end]\nNotice: /Stage[main]/Swift::Proxy/Concat[/etc/swift/proxy-server.conf]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf]/ensure: created\nInfo: 
/Stage[main]/Swift::Proxy/Concat[/etc/swift/proxy-server.conf]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf]: Scheduling refresh of Exec[concat_/etc/swift/proxy-server.conf]\nNotice: /Stage[main]/Swift::Proxy/Concat[/etc/swift/proxy-server.conf]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments.concat]/ensure: created\nNotice: /Stage[main]/Swift::Proxy/Concat[/etc/swift/proxy-server.conf]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments.concat.out]/ensure: created\nNotice: /Stage[main]/Swift::Proxy/Concat[/etc/swift/proxy-server.conf]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments]/ensure: created\nInfo: /Stage[main]/Swift::Proxy/Concat[/etc/swift/proxy-server.conf]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments]: Scheduling refresh of Exec[concat_/etc/swift/proxy-server.conf]\nNotice: /Stage[main]/Swift::Proxy/Concat::Fragment[swift_proxy]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/00_swift_proxy]/ensure: defined content as '{md5}3e7368112b701526ac018208596b6f2d'\nInfo: /Stage[main]/Swift::Proxy/Concat::Fragment[swift_proxy]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/00_swift_proxy]: Scheduling refresh of Exec[concat_/etc/swift/proxy-server.conf]\nNotice: /Stage[main]/Swift::Proxy::Tempauth/Concat::Fragment[swift-proxy-swauth]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/01_swift-proxy-swauth]/ensure: defined content as '{md5}77ae9d1ddf6d75e07b795e520797adb4'\nInfo: /Stage[main]/Swift::Proxy::Tempauth/Concat::Fragment[swift-proxy-swauth]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/01_swift-proxy-swauth]: Scheduling refresh of Exec[concat_/etc/swift/proxy-server.conf]\nNotice: /Stage[main]/Swift::Proxy::Container_quotas/Concat::Fragment[swift_container_quotas]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/81_swift_container_quotas]/ensure: defined content as '{md5}9cb7c3e198ec9152a4e1f80eb6448f6a'\nInfo: /Stage[main]/Swift::Proxy::Container_quotas/Concat::Fragment[swift_container_quotas]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/81_swift_container_quotas]: Scheduling refresh of Exec[concat_/etc/swift/proxy-server.conf]\nNotice: /Stage[main]/Swift::Proxy::Catch_errors/Concat::Fragment[swift_catch_errors]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/24_swift_catch_errors]/ensure: defined content as '{md5}e07f0e5b125db7d6c8b4724c1648bcd5'\nInfo: /Stage[main]/Swift::Proxy::Catch_errors/Concat::Fragment[swift_catch_errors]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/24_swift_catch_errors]: Scheduling refresh of Exec[concat_/etc/swift/proxy-server.conf]\nNotice: /Stage[main]/Swift::Proxy::Healthcheck/Concat::Fragment[swift_healthcheck]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/25_swift_healthcheck]/ensure: defined content as '{md5}8c92056c41082619d179f88ea15c5fc6'\nInfo: /Stage[main]/Swift::Proxy::Healthcheck/Concat::Fragment[swift_healthcheck]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/25_swift_healthcheck]: Scheduling refresh of Exec[concat_/etc/swift/proxy-server.conf]\nNotice: /Stage[main]/Swift::Proxy::Account_quotas/Concat::Fragment[swift_account_quotas]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/80_swift_account_quotas]/ensure: defined content as '{md5}c1ff253d7976e5b952647085fb3cefe3'\nInfo: 
/Stage[main]/Swift::Proxy::Account_quotas/Concat::Fragment[swift_account_quotas]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/80_swift_account_quotas]: Scheduling refresh of Exec[concat_/etc/swift/proxy-server.conf]\nNotice: /Stage[main]/Swift::Proxy::Cache/Concat::Fragment[swift_cache]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/23_swift_cache]/ensure: defined content as '{md5}cf82123513431b136e71a4503aeb82d9'\nInfo: /Stage[main]/Swift::Proxy::Cache/Concat::Fragment[swift_cache]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/23_swift_cache]: Scheduling refresh of Exec[concat_/etc/swift/proxy-server.conf]\nNotice: /Stage[main]/Swift::Proxy::Tempurl/Concat::Fragment[swift-proxy-tempurl]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/29_swift-proxy-tempurl]/ensure: defined content as '{md5}2fe004eae9f03fc684f9ed90044bd9c5'\nInfo: /Stage[main]/Swift::Proxy::Tempurl/Concat::Fragment[swift-proxy-tempurl]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/29_swift-proxy-tempurl]: Scheduling refresh of Exec[concat_/etc/swift/proxy-server.conf]\nNotice: /Stage[main]/Swift::Proxy::Formpost/Concat::Fragment[swift-proxy-formpost]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/31_swift-proxy-formpost]/ensure: defined content as '{md5}6907293ed6375b05de487bb7e0556ddd'\nInfo: /Stage[main]/Swift::Proxy::Formpost/Concat::Fragment[swift-proxy-formpost]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/31_swift-proxy-formpost]: Scheduling refresh of Exec[concat_/etc/swift/proxy-server.conf]\nNotice: /Stage[main]/Swift::Proxy::Keystone/Concat::Fragment[swift_keystone]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/79_swift_keystone]/ensure: defined content as '{md5}1cf1118a35e6b76ab6ee194eb0722f53'\nInfo: /Stage[main]/Swift::Proxy::Keystone/Concat::Fragment[swift_keystone]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/79_swift_keystone]: Scheduling refresh of Exec[concat_/etc/swift/proxy-server.conf]\nNotice: /Stage[main]/Swift::Proxy::Ratelimit/Concat::Fragment[swift_ratelimit]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/26_swift_ratelimit]/ensure: defined content as '{md5}2421e61cdf9eb2689fd5f1cc3740eb08'\nInfo: /Stage[main]/Swift::Proxy::Ratelimit/Concat::Fragment[swift_ratelimit]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/26_swift_ratelimit]: Scheduling refresh of Exec[concat_/etc/swift/proxy-server.conf]\nNotice: /Stage[main]/Swift::Proxy::Staticweb/Concat::Fragment[swift-proxy-staticweb]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/32_swift-proxy-staticweb]/ensure: defined content as '{md5}3e8e5d2820dc79360e8f1e07541ef8dc'\nInfo: /Stage[main]/Swift::Proxy::Staticweb/Concat::Fragment[swift-proxy-staticweb]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/32_swift-proxy-staticweb]: Scheduling refresh of Exec[concat_/etc/swift/proxy-server.conf]\nNotice: /Stage[main]/Swift::Proxy::Authtoken/File[/var/cache/swift]/group: group changed 'root' to 'swift'\nNotice: /Stage[main]/Swift::Proxy::Authtoken/File[/var/cache/swift]/mode: mode changed '0755' to '0700'\nNotice: /Stage[main]/Swift::Proxy::Authtoken/Concat::Fragment[swift_authtoken]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/22_swift_authtoken]/ensure: defined content as '{md5}f056388ce12c47fdd707acf18f5a14db'\nInfo: 
/Stage[main]/Swift::Proxy::Authtoken/Concat::Fragment[swift_authtoken]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/22_swift_authtoken]: Scheduling refresh of Exec[concat_/etc/swift/proxy-server.conf]
Notice: /Stage[main]/Swift::Proxy::Proxy_logging/Concat::Fragment[swift_proxy-logging]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/27_swift_proxy-logging]/ensure: defined content as '{md5}a7f5751de4957dadfee13dc6e6c83c1a'
Info: /Stage[main]/Swift::Proxy::Proxy_logging/Concat::Fragment[swift_proxy-logging]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/27_swift_proxy-logging]: Scheduling refresh of Exec[concat_/etc/swift/proxy-server.conf]
Notice: /Stage[main]/Swift::Proxy/Concat[/etc/swift/proxy-server.conf]/Exec[concat_/etc/swift/proxy-server.conf]/returns: executed successfully
Notice: /Stage[main]/Swift::Proxy/Concat[/etc/swift/proxy-server.conf]/Exec[concat_/etc/swift/proxy-server.conf]: Triggered 'refresh' from 16 events
Notice: /Stage[main]/Swift::Proxy/Concat[/etc/swift/proxy-server.conf]/File[/etc/swift/proxy-server.conf]/content: 
--- /etc/swift/proxy-server.conf	2016-05-07 16:57:43.000000000 +0100
+++ /tmp/puppet-file20160520-26469-1jhbxzd	2016-05-20 12:30:10.292337276 +0100
@@ -1,16 +1,57 @@
+# This file is managed by puppet. Do not edit
+#
 [DEFAULT]
 bind_port = 8080
-workers = 8
+
+bind_ip = 127.0.0.1
+
+workers = 2
 user = swift
+log_name = proxy-server
+log_facility = LOG_LOCAL1
+log_level = INFO
+log_headers = False
+log_address = /dev/log
+
+
 
 [pipeline:main]
-pipeline = healthcheck cache authtoken keystone proxy-logging proxy-server
+pipeline = catch_errors healthcheck cache tempurl ratelimit authtoken keystone formpost staticweb container_quotas account_quotas proxy-logging proxy-server
 
 [app:proxy-server]
 use = egg:swift#proxy
+set log_name = proxy-server
+set log_facility = LOG_LOCAL1
+set log_level = INFO
+set log_address = /dev/log
+log_handoffs = true
 allow_account_management = true
 account_autocreate = true
 
+
+
+
+
+[filter:tempauth]
+use = egg:swift#tempauth
+
+user_admin_admin = admin .admin .reseller_admin
+
+
+[filter:authtoken]
+log_name = swift
+signing_dir = /var/cache/swift
+paste.filter_factory = keystonemiddleware.auth_token:filter_factory
+
+auth_uri = https://127.0.0.1:5000/v2.0
+identity_uri = https://127.0.0.1:35357/
+admin_tenant_name = services
+admin_user = swift
+admin_password = a_big_secret
+delay_auth_decision = 1
+cache = swift.cache
+include_service_catalog = False
+
 [filter:cache]
 use = egg:swift#memcache
 memcache_servers = 127.0.0.1:11211
@@ -21,21 +62,34 @@
 [filter:healthcheck]
 use = egg:swift#healthcheck
 
+[filter:ratelimit]
+use = egg:swift#ratelimit
+clock_accuracy = 1000
+max_sleep_time_seconds = 60
+log_sleep_time_seconds = 0
+rate_buffer_seconds = 5
+account_ratelimit = 0
+
 [filter:proxy-logging]
 use = egg:swift#proxy_logging
 
+[filter:tempurl]
+use = egg:swift#tempurl
+
+[filter:formpost]
+use = egg:swift#formpost
+
+[filter:staticweb]
+use = egg:swift#staticweb
+
 [filter:keystone]
 use = egg:swift#keystoneauth
-operator_roles = admin, SwiftOperator
+operator_roles = Member, admin, SwiftOperator
 is_admin = true
-cache = swift.cache
+reseller_prefix = AUTH_
 
-[filter:authtoken]
-paste.filter_factory = keystonemiddleware.auth_token:filter_factory
-admin_tenant_name = %SERVICE_TENANT_NAME%
-admin_user = %SERVICE_USER%
-admin_password = %SERVICE_PASSWORD%
-auth_host = 127.0.0.1
-auth_port = 35357
-auth_protocol = http
-signing_dir = /tmp/keystone-signing-swift
+[filter:account_quotas]
+use = egg:swift#account_quotas
+
+[filter:container_quotas]
+use = egg:swift#container_quotas
Info: /Stage[main]/Swift::Proxy/Concat[/etc/swift/proxy-server.conf]/File[/etc/swift/proxy-server.conf]: Filebucketed /etc/swift/proxy-server.conf to puppet with sum cd347a2631d48647d000f5d34985704c
Notice: /Stage[main]/Swift::Proxy/Concat[/etc/swift/proxy-server.conf]/File[/etc/swift/proxy-server.conf]/content: content changed '{md5}cd347a2631d48647d000f5d34985704c' to '{md5}d6844dcb64e004f7b06f1e9ac75a5a56'
Notice: /Stage[main]/Swift::Proxy/Concat[/etc/swift/proxy-server.conf]/File[/etc/swift/proxy-server.conf]/owner: owner changed 'root' to 'swift'
Notice: /Stage[main]/Swift::Proxy/Concat[/etc/swift/proxy-server.conf]/File[/etc/swift/proxy-server.conf]/mode: mode changed '0640' to '0644'
Info: Concat[/etc/swift/proxy-server.conf]: Scheduling refresh of Swift::Service[swift-proxy-server]
Info: Concat[/etc/swift/proxy-server.conf]: Scheduling refresh of Service[swift-proxy-server]
Info: Swift::Service[swift-proxy-server]: Scheduling refresh of Service[swift-proxy-server]
Notice: /Stage[main]/Swift::Proxy/Swift::Service[swift-proxy-server]/Service[swift-proxy-server]/ensure: ensure changed 'stopped' to 'running'
Info: /Stage[main]/Swift::Proxy/Swift::Service[swift-proxy-server]/Service[swift-proxy-server]: Unscheduling refresh on Service[swift-proxy-server]
Notice: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_userid]/ensure: created
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_userid]: Scheduling refresh of Service[neutron-server]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_userid]: Scheduling refresh of Exec[neutron-db-sync]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_userid]: Scheduling refresh of Service[neutron-metadata]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_userid]: Scheduling refresh of Service[neutron-lbaas-service]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_userid]: Scheduling refresh of Service[neutron-l3]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_userid]: Scheduling refresh of Service[neutron-dhcp-service]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_userid]: Scheduling refresh of Service[neutron-metering-service]
Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat[/etc/swift/account-server.conf]/Exec[concat_/etc/swift/account-server.conf]/returns: executed successfully
Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat[/etc/swift/account-server.conf]/Exec[concat_/etc/swift/account-server.conf]: Triggered 'refresh' from 5 events
Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat[/etc/swift/account-server.conf]/File[/etc/swift/account-server.conf]/content: 
--- /etc/swift/account-server.conf	2016-05-07 16:57:43.000000000 +0100
+++ /tmp/puppet-file20160520-26469-ic4cnw	2016-05-20 12:30:10.638324149 +0100
@@ -1,21 +1,39 @@
 [DEFAULT]
-
-# Make sure your swift-ring-builder arguments match the bind_ip and bind_port.
-# You almost certainly do not want to listen just on loopback unless testing.
-# However, you want to keep port 6202 if SElinux is enabled.
+devices = /srv/node
 bind_ip = 127.0.0.1
-bind_port = 6202
+bind_port = 6002
+mount_check = false
+user = swift
+workers = 1
+log_name = account-server
+log_facility = LOG_LOCAL2
+log_level = INFO
+log_address = /dev/log
+
 
-workers = 2
 
 [pipeline:main]
 pipeline = account-server
 
 [app:account-server]
 use = egg:swift#account
+set log_name = account-server
+set log_facility = LOG_LOCAL2
+set log_level = INFO
+set log_requests = true
+set log_address = /dev/log
 
 [account-replicator]
+concurrency = 8
 
 [account-auditor]
 
 [account-reaper]
+concurrency = 8
+
+[filter:healthcheck]
+use = egg:swift#healthcheck
+
+[filter:recon]
+use = egg:swift#recon
+recon_cache_path = /var/cache/swift
Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat[/etc/swift/account-server.conf]/File[/etc/swift/account-server.conf]: Filebucketed /etc/swift/account-server.conf to puppet with sum 07e5a1a1e5a0ab83d745e20680eb32c1
Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat[/etc/swift/account-server.conf]/File[/etc/swift/account-server.conf]/content: content changed '{md5}07e5a1a1e5a0ab83d745e20680eb32c1' to '{md5}b09bb7b7833b29c19014f8963d0e6884'
Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat[/etc/swift/account-server.conf]/File[/etc/swift/account-server.conf]/owner: owner changed 'root' to 'swift'
Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat[/etc/swift/account-server.conf]/File[/etc/swift/account-server.conf]/mode: mode changed '0640' to '0644'
Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat[/etc/swift/account-server.conf]/File[/etc/swift/account-server.conf]: Scheduling refresh of Service[swift-account-reaper]
Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat[/etc/swift/account-server.conf]/File[/etc/swift/account-server.conf]: Scheduling refresh of Swift::Service[swift-account-reaper]
Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat[/etc/swift/account-server.conf]/File[/etc/swift/account-server.conf]: Scheduling refresh of Service[swift-account-reaper]
Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat[/etc/swift/account-server.conf]/File[/etc/swift/account-server.conf]: Scheduling refresh of Swift::Service[swift-account-reaper]
Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat[/etc/swift/account-server.conf]/File[/etc/swift/account-server.conf]: Scheduling refresh of Service[swift-account-reaper]
Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat[/etc/swift/account-server.conf]/File[/etc/swift/account-server.conf]: Scheduling refresh of Swift::Service[swift-account-reaper]
Info: Swift::Service[swift-account-reaper]: Scheduling refresh of Service[swift-account-reaper]
Notice: /Stage[main]/Swift::Storage::Account/Swift::Service[swift-account-reaper]/Service[swift-account-reaper]/ensure: ensure changed 'stopped' to 'running'
Info: /Stage[main]/Swift::Storage::Account/Swift::Service[swift-account-reaper]/Service[swift-account-reaper]: Unscheduling refresh on Service[swift-account-reaper]
Info: Concat[/etc/swift/account-server.conf]: Scheduling refresh of Service[swift-account-server]
Info: Concat[/etc/swift/account-server.conf]: Scheduling refresh of Service[swift-account-replicator]
Info: Concat[/etc/swift/account-server.conf]: Scheduling refresh of Service[swift-account-auditor]
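All of the Concat[...] and Exec[concat_...] churn in this run is the exec-based puppetlabs-concat 1.x pattern: each Concat::Fragment writes a numbered file under /var/lib/puppet/concat/<name>/fragments, and the Exec reassembles the fragments in lexical order into the target file, which is why /etc/swift/proxy-server.conf above is rebuilt from pieces like 00_swift_proxy and 22_swift_authtoken. A minimal sketch of the pattern (the fragment content here is shortened, not the module's full template):

concat { '/etc/swift/proxy-server.conf':
  owner => 'swift',
  group => 'swift',
  mode  => '0644',
}

concat::fragment { 'swift_proxy':
  target  => '/etc/swift/proxy-server.conf',
  order   => '00',   # sorts ahead of 22_swift_authtoken, 27_swift_proxy-logging, ...
  content => "[DEFAULT]\nbind_port = 8080\n",
}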
Notice: /Stage[main]/Swift::Storage::Account/Swift::Storage::Generic[account]/Swift::Service[swift-account-replicator]/Service[swift-account-replicator]/ensure: ensure changed 'stopped' to 'running'
Info: /Stage[main]/Swift::Storage::Account/Swift::Storage::Generic[account]/Swift::Service[swift-account-replicator]/Service[swift-account-replicator]: Unscheduling refresh on Service[swift-account-replicator]
Notice: /Stage[main]/Swift::Storage::Account/Swift::Storage::Generic[account]/Swift::Service[swift-account-auditor]/Service[swift-account-auditor]/ensure: ensure changed 'stopped' to 'running'
Info: /Stage[main]/Swift::Storage::Account/Swift::Storage::Generic[account]/Swift::Service[swift-account-auditor]/Service[swift-account-auditor]: Unscheduling refresh on Service[swift-account-auditor]
Notice: /Stage[main]/Swift::Storage::Account/Swift::Storage::Generic[account]/Swift::Service[swift-account-server]/Service[swift-account-server]/ensure: ensure changed 'stopped' to 'running'
Info: /Stage[main]/Swift::Storage::Account/Swift::Storage::Generic[account]/Swift::Service[swift-account-server]/Service[swift-account-server]: Unscheduling refresh on Service[swift-account-server]
Notice: /Stage[main]/Apache/Apache::Vhost[default]/Concat::Fragment[default-logging]/File[/var/lib/puppet/concat/15-default.conf/fragments/80_default-logging]/ensure: defined content as '{md5}f202203ce2fe5d885160be988ff36151'
Info: /Stage[main]/Apache/Apache::Vhost[default]/Concat::Fragment[default-logging]/File[/var/lib/puppet/concat/15-default.conf/fragments/80_default-logging]: Scheduling refresh of Exec[concat_15-default.conf]
Notice: /Stage[main]/Apache/Apache::Vhost[default]/Concat[15-default.conf]/Exec[concat_15-default.conf]/returns: executed successfully
Notice: /Stage[main]/Apache/Apache::Vhost[default]/Concat[15-default.conf]/Exec[concat_15-default.conf]: Triggered 'refresh' from 10 events
Notice: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat::Fragment[nova_api_wsgi-wsgi]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments/260_nova_api_wsgi-wsgi]/ensure: defined content as '{md5}d8fcfbd8a3ec337955722d8a7c10844a'
Info: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat::Fragment[nova_api_wsgi-wsgi]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments/260_nova_api_wsgi-wsgi]: Scheduling refresh of Exec[concat_10-nova_api_wsgi.conf]
Notice: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat[10-nova_api_wsgi.conf]/Exec[concat_10-nova_api_wsgi.conf]/returns: executed successfully
Notice: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat[10-nova_api_wsgi.conf]/Exec[concat_10-nova_api_wsgi.conf]: Triggered 'refresh' from 11 events
Notice: /Stage[main]/Openstack_integration::Swift/Swift::Storage::Filter::Healthcheck[container]/Concat::Fragment[swift_healthcheck_container]/File[/var/lib/puppet/concat/_etc_swift_container-server.conf/fragments/25_swift_healthcheck_container]/ensure: defined content as '{md5}8c92056c41082619d179f88ea15c5fc6'
Info: /Stage[main]/Openstack_integration::Swift/Swift::Storage::Filter::Healthcheck[container]/Concat::Fragment[swift_healthcheck_container]/File[/var/lib/puppet/concat/_etc_swift_container-server.conf/fragments/25_swift_healthcheck_container]: Scheduling refresh of Exec[concat_/etc/swift/container-server.conf]
Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat[/etc/swift/container-server.conf]/Exec[concat_/etc/swift/container-server.conf]/returns: executed successfully
Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat[/etc/swift/container-server.conf]/Exec[concat_/etc/swift/container-server.conf]: Triggered 'refresh' from 5 events
Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat[/etc/swift/container-server.conf]/File[/etc/swift/container-server.conf]/content: 
--- /etc/swift/container-server.conf	2016-05-07 16:57:43.000000000 +0100
+++ /tmp/puppet-file20160520-26469-1khud84	2016-05-20 12:30:11.635286324 +0100
@@ -1,23 +1,43 @@
 [DEFAULT]
-
-# Make sure your swift-ring-builder arguments match the bind_ip and bind_port.
-# You almost certainly do not want to listen just on loopback unless testing.
-# However, you want to keep port 6201 if SElinux is enabled.
+devices = /srv/node
 bind_ip = 127.0.0.1
-bind_port = 6201
+bind_port = 6001
+mount_check = false
+user = swift
+log_name = container-server
+log_facility = LOG_LOCAL2
+log_level = INFO
+log_address = /dev/log
+
 
-workers = 2
+workers = 1
+allowed_sync_hosts = 127.0.0.1
 
 [pipeline:main]
 pipeline = container-server
 
 [app:container-server]
+allow_versions = false
 use = egg:swift#container
+set log_name = container-server
+set log_facility = LOG_LOCAL2
+set log_level = INFO
+set log_requests = true
+set log_address = /dev/log
 
 [container-replicator]
+concurrency = 8
 
 [container-updater]
+concurrency = 8
 
 [container-auditor]
 
 [container-sync]
+
+[filter:healthcheck]
+use = egg:swift#healthcheck
+
+[filter:recon]
+use = egg:swift#recon
+recon_cache_path = /var/cache/swift
Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat[/etc/swift/container-server.conf]/File[/etc/swift/container-server.conf]: Filebucketed /etc/swift/container-server.conf to puppet with sum 4998257eb89ff63e838b37686ebb1ee7
Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat[/etc/swift/container-server.conf]/File[/etc/swift/container-server.conf]/content: content changed '{md5}4998257eb89ff63e838b37686ebb1ee7' to '{md5}21c2517e90b3e9698ae546bfbf8e210f'
Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat[/etc/swift/container-server.conf]/File[/etc/swift/container-server.conf]/owner: owner changed 'root' to 'swift'
Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat[/etc/swift/container-server.conf]/File[/etc/swift/container-server.conf]/mode: mode changed '0640' to '0644'
Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat[/etc/swift/container-server.conf]/File[/etc/swift/container-server.conf]: Scheduling refresh of Service[swift-container-updater]
Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat[/etc/swift/container-server.conf]/File[/etc/swift/container-server.conf]: Scheduling refresh of Swift::Service[swift-container-updater]
Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat[/etc/swift/container-server.conf]/File[/etc/swift/container-server.conf]: Scheduling refresh of Service[swift-container-updater]
Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat[/etc/swift/container-server.conf]/File[/etc/swift/container-server.conf]: Scheduling refresh of Swift::Service[swift-container-updater]
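Swift::Storage::Server[6002] and Swift::Storage::Server[6001] are instances of puppet-swift's swift::storage::server define, titled by bind port: 6002 renders account-server.conf and 6001 container-server.conf (a 6000 instance would cover the object server), each paired with the matching rsync module that shows up in the next lines. A rough sketch of how the scenario presumably declares them; parameter names are from memory of the Mitaka-era module, so treat this as illustrative rather than the scenario's exact manifest:

swift::storage::server { '6002':
  type                 => 'account',      # selects the account-server template
  devices              => '/srv/node',
  storage_local_net_ip => '127.0.0.1',
}

swift::storage::server { '6001':
  type                 => 'container',
  devices              => '/srv/node',
  storage_local_net_ip => '127.0.0.1',
}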
Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat[/etc/swift/container-server.conf]/File[/etc/swift/container-server.conf]: Scheduling refresh of Service[swift-container-updater]
Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat[/etc/swift/container-server.conf]/File[/etc/swift/container-server.conf]: Scheduling refresh of Swift::Service[swift-container-updater]
Info: Concat[/etc/swift/container-server.conf]: Scheduling refresh of Service[swift-container-server]
Info: Concat[/etc/swift/container-server.conf]: Scheduling refresh of Service[swift-container-replicator]
Info: Concat[/etc/swift/container-server.conf]: Scheduling refresh of Service[swift-container-auditor]
Notice: /Stage[main]/Swift::Storage::Container/Swift::Storage::Generic[container]/Swift::Service[swift-container-replicator]/Service[swift-container-replicator]/ensure: ensure changed 'stopped' to 'running'
Info: /Stage[main]/Swift::Storage::Container/Swift::Storage::Generic[container]/Swift::Service[swift-container-replicator]/Service[swift-container-replicator]: Unscheduling refresh on Service[swift-container-replicator]
Notice: /Stage[main]/Swift::Storage::Container/Swift::Storage::Generic[container]/Swift::Service[swift-container-auditor]/Service[swift-container-auditor]/ensure: ensure changed 'stopped' to 'running'
Info: /Stage[main]/Swift::Storage::Container/Swift::Storage::Generic[container]/Swift::Service[swift-container-auditor]/Service[swift-container-auditor]: Unscheduling refresh on Service[swift-container-auditor]
Notice: /Stage[main]/Swift::Storage::Container/Swift::Storage::Generic[container]/Swift::Service[swift-container-server]/Service[swift-container-server]/ensure: ensure changed 'stopped' to 'running'
Info: /Stage[main]/Swift::Storage::Container/Swift::Storage::Generic[container]/Swift::Service[swift-container-server]/Service[swift-container-server]: Unscheduling refresh on Service[swift-container-server]
Info: Swift::Service[swift-container-updater]: Scheduling refresh of Service[swift-container-updater]
Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Rsync::Server::Module[account]/Concat::Fragment[frag-account]/File[/var/lib/puppet/concat/_etc_rsync.conf/fragments/10_account_frag-account]/ensure: defined content as '{md5}c1253249b9f960b4c5ab27bffc4c0382'
Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Rsync::Server::Module[account]/Concat::Fragment[frag-account]/File[/var/lib/puppet/concat/_etc_rsync.conf/fragments/10_account_frag-account]: Scheduling refresh of Exec[concat_/etc/rsync.conf]
Notice: /Stage[main]/Rsync::Server/Concat[/etc/rsync.conf]/Exec[concat_/etc/rsync.conf]/returns: executed successfully
Notice: /Stage[main]/Rsync::Server/Concat[/etc/rsync.conf]/Exec[concat_/etc/rsync.conf]: Triggered 'refresh' from 6 events
Notice: /Stage[main]/Rsync::Server/Concat[/etc/rsync.conf]/File[/etc/rsync.conf]/ensure: defined content as '{md5}4b60030f2dab5c450c9d32e3fa3705c2'
Notice: /Stage[main]/Nova::Api/Nova::Generic_service[api]/Package[nova-api]/ensure: created
Info: /Stage[main]/Nova::Api/Nova::Generic_service[api]/Package[nova-api]: Scheduling refresh of Anchor[keystone::service::end]
Info: /Stage[main]/Nova::Api/Nova::Generic_service[api]/Package[nova-api]: Scheduling refresh of Anchor[nova::install::end]
Notice: /Stage[main]/Nova::Deps/Anchor[nova::install::end]: Triggered 'refresh' from 7 events
Info: /Stage[main]/Nova::Deps/Anchor[nova::install::end]: Scheduling refresh of 
Anchor[nova::service::begin]\nInfo: /Stage[main]/Nova::Deps/Anchor[nova::install::end]: Scheduling refresh of Exec[nova-db-sync]\nInfo: /Stage[main]/Nova::Deps/Anchor[nova::install::end]: Scheduling refresh of Exec[nova-db-sync-api]\nNotice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/project_name]/ensure: created\nInfo: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/project_name]: Scheduling refresh of Anchor[nova::config::end]\nNotice: /Stage[main]/Nova::Migration::Libvirt/File_line[/etc/libvirt/libvirtd.conf auth_tcp]/ensure: created\nInfo: /Stage[main]/Nova::Migration::Libvirt/File_line[/etc/libvirt/libvirtd.conf auth_tcp]: Scheduling refresh of Service[libvirt]\nNotice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/force_raw_images]/ensure: created\nInfo: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/force_raw_images]: Scheduling refresh of Anchor[nova::config::end]\nNotice: /Stage[main]/Nova::Logging/Nova_config[DEFAULT/debug]/ensure: created\nInfo: /Stage[main]/Nova::Logging/Nova_config[DEFAULT/debug]: Scheduling refresh of Anchor[nova::config::end]\nNotice: /Stage[main]/Nova::Api/Nova_config[neutron/service_metadata_proxy]/ensure: created\nInfo: /Stage[main]/Nova::Api/Nova_config[neutron/service_metadata_proxy]: Scheduling refresh of Anchor[nova::config::end]\nNotice: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/use_neutron]/ensure: created\nInfo: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/use_neutron]: Scheduling refresh of Anchor[nova::config::end]\nNotice: /Stage[main]/Nova::Logging/Nova_config[DEFAULT/log_dir]/ensure: created\nInfo: /Stage[main]/Nova::Logging/Nova_config[DEFAULT/log_dir]: Scheduling refresh of Anchor[nova::config::end]\nNotice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/extension_sync_interval]/ensure: created\nInfo: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/extension_sync_interval]: Scheduling refresh of Anchor[nova::config::end]\nNotice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/password]/ensure: created\nInfo: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/password]: Scheduling refresh of Anchor[nova::config::end]\nNotice: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/scheduler_use_baremetal_filters]/ensure: created\nInfo: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/scheduler_use_baremetal_filters]: Scheduling refresh of Anchor[nova::config::end]\nNotice: /Stage[main]/Nova/Nova_config[DEFAULT/notify_api_faults]/ensure: created\nInfo: /Stage[main]/Nova/Nova_config[DEFAULT/notify_api_faults]: Scheduling refresh of Anchor[nova::config::end]\nNotice: /Stage[main]/Nova::Db/Nova_config[api_database/connection]/ensure: created\nInfo: /Stage[main]/Nova::Db/Nova_config[api_database/connection]: Scheduling refresh of Anchor[nova::config::end]\nNotice: /Stage[main]/Nova::Api/Nova_config[DEFAULT/osapi_compute_listen]/ensure: created\nInfo: /Stage[main]/Nova::Api/Nova_config[DEFAULT/osapi_compute_listen]: Scheduling refresh of Anchor[nova::config::end]\nNotice: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/rabbit_userid]/ensure: created\nInfo: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/rabbit_userid]: Scheduling refresh of Anchor[nova::config::end]\nNotice: /Stage[main]/Nova::Conductor/Nova_config[conductor/use_local]/ensure: created\nInfo: /Stage[main]/Nova::Conductor/Nova_config[conductor/use_local]: Scheduling refresh of Anchor[nova::config::end]\nNotice: /Stage[main]/Nova::Api/Nova_config[osapi_v3/enabled]/ensure: created\nInfo: 
/Stage[main]/Nova::Api/Nova_config[osapi_v3/enabled]: Scheduling refresh of Anchor[nova::config::end]\nNotice: /Stage[main]/Nova/Nova_config[DEFAULT/notification_driver]/ensure: created\nInfo: /Stage[main]/Nova/Nova_config[DEFAULT/notification_driver]: Scheduling refresh of Anchor[nova::config::end]\nNotice: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/max_io_ops_per_host]/ensure: created\nInfo: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/max_io_ops_per_host]: Scheduling refresh of Anchor[nova::config::end]\nNotice: /Stage[main]/Nova::Api/Nova_config[keystone_authtoken/admin_password]/ensure: created\nInfo: /Stage[main]/Nova::Api/Nova_config[keystone_authtoken/admin_password]: Scheduling refresh of Anchor[nova::config::end]\nNotice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/compute_manager]/ensure: created\nInfo: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/compute_manager]: Scheduling refresh of Anchor[nova::config::end]\nNotice: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/rabbit_password]/ensure: created\nInfo: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/rabbit_password]: Scheduling refresh of Anchor[nova::config::end]\nNotice: /Stage[main]/Nova::Api/Nova_config[keystone_authtoken/admin_user]/ensure: created\nInfo: /Stage[main]/Nova::Api/Nova_config[keystone_authtoken/admin_user]: Scheduling refresh of Anchor[nova::config::end]\nNotice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_partition]/ensure: created\nInfo: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_partition]: Scheduling refresh of Anchor[nova::config::end]\nNotice: /Stage[main]/Nova::Vncproxy/Nova_config[DEFAULT/novncproxy_host]/ensure: created\nInfo: /Stage[main]/Nova::Vncproxy/Nova_config[DEFAULT/novncproxy_host]: Scheduling refresh of Anchor[nova::config::end]\nNotice: /Stage[main]/Nova/Nova_config[DEFAULT/state_path]/ensure: created\nInfo: /Stage[main]/Nova/Nova_config[DEFAULT/state_path]: Scheduling refresh of Anchor[nova::config::end]\nNotice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_password]/ensure: created\nInfo: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_password]: Scheduling refresh of Anchor[nova::config::end]\nNotice: /Stage[main]/Nova/Nova_config[DEFAULT/report_interval]/ensure: created\nInfo: /Stage[main]/Nova/Nova_config[DEFAULT/report_interval]: Scheduling refresh of Anchor[nova::config::end]\nNotice: /Stage[main]/Nova::Logging/Nova_config[DEFAULT/verbose]/ensure: created\nInfo: /Stage[main]/Nova::Logging/Nova_config[DEFAULT/verbose]: Scheduling refresh of Anchor[nova::config::end]\nNotice: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/scheduler_weight_classes]/ensure: created\nInfo: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/scheduler_weight_classes]: Scheduling refresh of Anchor[nova::config::end]\nNotice: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/rabbit_ha_queues]/ensure: created\nInfo: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/rabbit_ha_queues]: Scheduling refresh of Anchor[nova::config::end]\nNotice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/auth_plugin]/ensure: created\nInfo: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/auth_plugin]: Scheduling refresh of Anchor[nova::config::end]\nNotice: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/security_group_api]/ensure: created\nInfo: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/security_group_api]: Scheduling refresh of 
Anchor[nova::config::end]\nNotice: /Stage[main]/Nova::Api/Nova_config[DEFAULT/enabled_apis]/ensure: created\nInfo: /Stage[main]/Nova::Api/Nova_config[DEFAULT/enabled_apis]: Scheduling refresh of Anchor[nova::config::end]\nNotice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/auth_url]/ensure: created\nInfo: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/auth_url]: Scheduling refresh of Anchor[nova::config::end]\nNotice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/user_domain_name]/ensure: created\nInfo: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/user_domain_name]: Scheduling refresh of Anchor[nova::config::end]\nNotice: /Stage[main]/Nova::Api/Nova_config[DEFAULT/osapi_compute_listen_port]/ensure: created\nInfo: /Stage[main]/Nova::Api/Nova_config[DEFAULT/osapi_compute_listen_port]: Scheduling refresh of Anchor[nova::config::end]\nNotice: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/rabbit_virtual_host]/ensure: created\nInfo: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/rabbit_virtual_host]: Scheduling refresh of Anchor[nova::config::end]\nNotice: /Stage[main]/Nova/Nova_config[glance/api_servers]/ensure: created\nInfo: /Stage[main]/Nova/Nova_config[glance/api_servers]: Scheduling refresh of Anchor[nova::config::end]\nNotice: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/rabbit_port]/ensure: created\nInfo: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/rabbit_port]: Scheduling refresh of Anchor[nova::config::end]\nNotice: /Stage[main]/Nova/Nova_config[DEFAULT/image_service]/ensure: created\nInfo: /Stage[main]/Nova/Nova_config[DEFAULT/image_service]: Scheduling refresh of Anchor[nova::config::end]\nNotice: /Stage[main]/Nova::Api/Nova_config[DEFAULT/volume_api_class]/ensure: created\nInfo: /Stage[main]/Nova::Api/Nova_config[DEFAULT/volume_api_class]: Scheduling refresh of Anchor[nova::config::end]\nNotice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/heal_instance_info_cache_interval]/ensure: created\nInfo: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/heal_instance_info_cache_interval]: Scheduling refresh of Anchor[nova::config::end]\nNotice: /Stage[main]/Nova/Nova_config[DEFAULT/notify_on_state_change]/ensure: created\nInfo: /Stage[main]/Nova/Nova_config[DEFAULT/notify_on_state_change]: Scheduling refresh of Anchor[nova::config::end]\nNotice: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/firewall_driver]/ensure: created\nInfo: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/firewall_driver]: Scheduling refresh of Anchor[nova::config::end]\nNotice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/instance_usage_audit]/ensure: created\nInfo: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/instance_usage_audit]: Scheduling refresh of Anchor[nova::config::end]\nNotice: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/scheduler_host_subset_size]/ensure: created\nInfo: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/scheduler_host_subset_size]: Scheduling refresh of Anchor[nova::config::end]\nNotice: /Stage[main]/Nova::Vncproxy/Nova_config[DEFAULT/novncproxy_port]/ensure: created\nInfo: /Stage[main]/Nova::Vncproxy/Nova_config[DEFAULT/novncproxy_port]: Scheduling refresh of Anchor[nova::config::end]\nNotice: /Stage[main]/Nova/Nova_config[cinder/catalog_info]/ensure: created\nInfo: /Stage[main]/Nova/Nova_config[cinder/catalog_info]: Scheduling refresh of Anchor[nova::config::end]\nNotice: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/kombu_reconnect_delay]/ensure: created\nInfo: 
/Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/kombu_reconnect_delay]: Scheduling refresh of Anchor[nova::config::end]
Notice: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/ram_allocation_ratio]/ensure: created
Info: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/ram_allocation_ratio]: Scheduling refresh of Anchor[nova::config::end]
Notice: /Stage[main]/Nova::Db/Nova_config[database/connection]/ensure: created
Info: /Stage[main]/Nova::Db/Nova_config[database/connection]: Scheduling refresh of Anchor[nova::config::end]
Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/cpu_mode]/ensure: created
Info: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/cpu_mode]: Scheduling refresh of Anchor[nova::config::end]
Notice: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/rabbit_host]/ensure: created
Info: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/rabbit_host]: Scheduling refresh of Anchor[nova::config::end]
Notice: /Stage[main]/Nova::Api/Nova_config[DEFAULT/default_floating_pool]/ensure: created
Info: /Stage[main]/Nova::Api/Nova_config[DEFAULT/default_floating_pool]: Scheduling refresh of Anchor[nova::config::end]
Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/timeout]/ensure: created
Info: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/timeout]: Scheduling refresh of Anchor[nova::config::end]
Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/reserved_host_memory_mb]/ensure: created
Info: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/reserved_host_memory_mb]: Scheduling refresh of Anchor[nova::config::end]
Notice: /Stage[main]/Nova::Scheduler/Nova_config[DEFAULT/scheduler_driver]/ensure: created
Info: /Stage[main]/Nova::Scheduler/Nova_config[DEFAULT/scheduler_driver]: Scheduling refresh of Anchor[nova::config::end]
Notice: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/amqp_durable_queues]/ensure: created
Info: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/amqp_durable_queues]: Scheduling refresh of Anchor[nova::config::end]
Notice: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/disk_allocation_ratio]/ensure: created
Info: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/disk_allocation_ratio]: Scheduling refresh of Anchor[nova::config::end]
Notice: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/heartbeat_rate]/ensure: created
Info: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/heartbeat_rate]: Scheduling refresh of Anchor[nova::config::end]
Notice: /Stage[main]/Nova::Compute/Nova_config[vnc/vncserver_proxyclient_address]/ensure: created
Info: /Stage[main]/Nova::Compute/Nova_config[vnc/vncserver_proxyclient_address]: Scheduling refresh of Anchor[nova::config::end]
Notice: /Stage[main]/Nova::Api/Nova_config[DEFAULT/api_paste_config]/ensure: created
Info: /Stage[main]/Nova::Api/Nova_config[DEFAULT/api_paste_config]: Scheduling refresh of Anchor[nova::config::end]
Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[vnc/vncserver_listen]/ensure: created
Info: /Stage[main]/Nova::Compute::Libvirt/Nova_config[vnc/vncserver_listen]: Scheduling refresh of Anchor[nova::config::end]
Notice: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]/ensure: created
Info: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]: Scheduling refresh of Anchor[nova::config::end]
Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/project_domain_name]/ensure: created
Info: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/project_domain_name]: Scheduling refresh of Anchor[nova::config::end]
Notice: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/scheduler_max_attempts]/ensure: created
Info: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/scheduler_max_attempts]: Scheduling refresh of Anchor[nova::config::end]
Notice: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/cpu_allocation_ratio]/ensure: created
Info: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/cpu_allocation_ratio]: Scheduling refresh of Anchor[nova::config::end]
Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/vif_plugging_is_fatal]/ensure: created
Info: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/vif_plugging_is_fatal]: Scheduling refresh of Anchor[nova::config::end]
Notice: /Stage[main]/Nova::Api/Nova_config[DEFAULT/metadata_listen_port]/ensure: created
Info: /Stage[main]/Nova::Api/Nova_config[DEFAULT/metadata_listen_port]: Scheduling refresh of Anchor[nova::config::end]
Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/virt_type]/ensure: created
Info: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/virt_type]: Scheduling refresh of Anchor[nova::config::end]
Notice: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/rabbit_use_ssl]/ensure: created
Info: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/rabbit_use_ssl]: Scheduling refresh of Anchor[nova::config::end]
Notice: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/max_instances_per_host]/ensure: created
Info: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/max_instances_per_host]: Scheduling refresh of Anchor[nova::config::end]
Notice: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/scheduler_available_filters]/ensure: created
Info: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/scheduler_available_filters]: Scheduling refresh of Anchor[nova::config::end]
Notice: /Stage[main]/Nova::Compute/Nova_config[vnc/keymap]/ensure: created
Info: /Stage[main]/Nova::Compute/Nova_config[vnc/keymap]: Scheduling refresh of Anchor[nova::config::end]
Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/url]/ensure: created
Info: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/url]: Scheduling refresh of Anchor[nova::config::end]
Notice: /Stage[main]/Nova/Nova_config[DEFAULT/service_down_time]/ensure: created
Info: /Stage[main]/Nova/Nova_config[DEFAULT/service_down_time]: Scheduling refresh of Anchor[nova::config::end]
Notice: /Stage[main]/Nova::Api/Nova_config[keystone_authtoken/admin_tenant_name]/ensure: created
Info: /Stage[main]/Nova::Api/Nova_config[keystone_authtoken/admin_tenant_name]: Scheduling refresh of Anchor[nova::config::end]
Notice: /Stage[main]/Nova::Migration::Libvirt/File_line[/etc/libvirt/libvirtd.conf listen_tls]/ensure: created
Info: /Stage[main]/Nova::Migration::Libvirt/File_line[/etc/libvirt/libvirtd.conf listen_tls]: Scheduling refresh of Service[libvirt]
Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/ovs_bridge]/ensure: created
Info: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/ovs_bridge]: Scheduling refresh of Anchor[nova::config::end]
Notice: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/scheduler_host_manager]/ensure: created
Info: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/scheduler_host_manager]: Scheduling refresh of Anchor[nova::config::end]
Notice: /Stage[main]/Nova/Nova_config[DEFAULT/notification_topics]/ensure: created
Info: /Stage[main]/Nova/Nova_config[DEFAULT/notification_topics]: Scheduling refresh of Anchor[nova::config::end]
Notice: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/kombu_ssl_version]/ensure: created
Info: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/kombu_ssl_version]: Scheduling refresh of Anchor[nova::config::end]
Notice: /Stage[main]/Nova::Api/Nova_config[DEFAULT/use_forwarded_for]/ensure: created
Info: /Stage[main]/Nova::Api/Nova_config[DEFAULT/use_forwarded_for]: Scheduling refresh of Anchor[nova::config::end]
Notice: /Stage[main]/Nova::Migration::Libvirt/File_line[/etc/sysconfig/libvirtd libvirtd args]/ensure: created
Info: /Stage[main]/Nova::Migration::Libvirt/File_line[/etc/sysconfig/libvirtd libvirtd args]: Scheduling refresh of Service[libvirt]
Notice: /Stage[main]/Nova::Api/Nova_config[DEFAULT/osapi_volume_listen]/ensure: created
Info: /Stage[main]/Nova::Api/Nova_config[DEFAULT/osapi_volume_listen]: Scheduling refresh of Anchor[nova::config::end]
Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_key]/ensure: created
Info: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_key]: Scheduling refresh of Anchor[nova::config::end]
Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/instance_usage_audit_period]/ensure: created
Info: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/instance_usage_audit_period]: Scheduling refresh of Anchor[nova::config::end]
Notice: /Stage[main]/Nova::Api/Nova_config[DEFAULT/metadata_listen]/ensure: created
Info: /Stage[main]/Nova::Api/Nova_config[DEFAULT/metadata_listen]: Scheduling refresh of Anchor[nova::config::end]
Notice: /Stage[main]/Nova/Nova_config[DEFAULT/auth_strategy]/ensure: created
Info: /Stage[main]/Nova/Nova_config[DEFAULT/auth_strategy]: Scheduling refresh of Anchor[nova::config::end]
Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/region_name]/ensure: created
Info: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/region_name]: Scheduling refresh of Anchor[nova::config::end]
Notice: /Stage[main]/Nova::Compute/Nova_config[vnc/enabled]/ensure: created
Info: /Stage[main]/Nova::Compute/Nova_config[vnc/enabled]: Scheduling refresh of Anchor[nova::config::end]
Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[DEFAULT/compute_driver]/ensure: created
Info: /Stage[main]/Nova::Compute::Libvirt/Nova_config[DEFAULT/compute_driver]: Scheduling refresh of Anchor[nova::config::end]
Notice: /Stage[main]/Nova::Api/Nova_config[DEFAULT/osapi_compute_workers]/ensure: created
Info: /Stage[main]/Nova::Api/Nova_config[DEFAULT/osapi_compute_workers]: Scheduling refresh of Anchor[nova::config::end]
Notice: /Stage[main]/Nova::Api/Nova_config[keystone_authtoken/identity_uri]/ensure: created
Info: /Stage[main]/Nova::Api/Nova_config[keystone_authtoken/identity_uri]: Scheduling refresh of Anchor[nova::config::end]
Notice: /Stage[main]/Nova/Nova_config[DEFAULT/rootwrap_config]/ensure: created
Info: /Stage[main]/Nova/Nova_config[DEFAULT/rootwrap_config]: Scheduling refresh of Anchor[nova::config::end]
Notice: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/rabbit_hosts]/ensure: created
Info: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/rabbit_hosts]: Scheduling refresh of Anchor[nova::config::end]
Notice: /Stage[main]/Nova/Nova_config[DEFAULT/rpc_backend]/ensure: created
Info: /Stage[main]/Nova/Nova_config[DEFAULT/rpc_backend]: Scheduling refresh of Anchor[nova::config::end]
Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/vif_plugging_timeout]/ensure: created
Info: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/vif_plugging_timeout]: Scheduling refresh of Anchor[nova::config::end]
Notice: /Stage[main]/Nova::Migration::Libvirt/File_line[/etc/libvirt/libvirtd.conf listen_tcp]/ensure: created
Info: /Stage[main]/Nova::Migration::Libvirt/File_line[/etc/libvirt/libvirtd.conf listen_tcp]: Scheduling refresh of Service[libvirt]
Notice: /Stage[main]/Nova::Compute::Libvirt/Service[libvirt]/ensure: ensure changed 'stopped' to 'running'
Info: /Stage[main]/Nova::Compute::Libvirt/Service[libvirt]: Unscheduling refresh on Service[libvirt]
Notice: /Stage[main]/Nova::Api/Nova_config[DEFAULT/metadata_workers]/ensure: created
Info: /Stage[main]/Nova::Api/Nova_config[DEFAULT/metadata_workers]: Scheduling refresh of Anchor[nova::config::end]
Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/dhcp_domain]/ensure: created
Info: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/dhcp_domain]: Scheduling refresh of Anchor[nova::config::end]
Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/allow_resize_to_same_host]/ensure: created
Info: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/allow_resize_to_same_host]: Scheduling refresh of Anchor[nova::config::end]
Notice: /Stage[main]/Nova/Nova_config[oslo_concurrency/lock_path]/ensure: created
Info: /Stage[main]/Nova/Nova_config[oslo_concurrency/lock_path]: Scheduling refresh of Anchor[nova::config::end]
Notice: /Stage[main]/Nova::Api/Nova_config[keystone_authtoken/auth_uri]/ensure: created
Info: /Stage[main]/Nova::Api/Nova_config[keystone_authtoken/auth_uri]: Scheduling refresh of Anchor[nova::config::end]
Notice: /Stage[main]/Nova::Api/Nova_config[neutron/metadata_proxy_shared_secret]/ensure: created
Info: /Stage[main]/Nova::Api/Nova_config[neutron/metadata_proxy_shared_secret]: Scheduling refresh of Anchor[nova::config::end]
Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/username]/ensure: created
Info: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/username]: Scheduling refresh of Anchor[nova::config::end]
Info: /etc/httpd/conf.d: Scheduling refresh of Class[Apache::Service]
Notice: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat[10-nova_api_wsgi.conf]/File[10-nova_api_wsgi.conf]/ensure: defined content as '{md5}a201c1c5ac33c244ff2071cfe9b38046'
Notice: /Stage[main]/Apache/Apache::Vhost[default]/Concat[15-default.conf]/File[15-default.conf]/ensure: defined content as '{md5}a430bf4e003be964b419e7aea251c6c4'
Info: Concat[10-nova_api_wsgi.conf]: Scheduling refresh of Class[Apache::Service]
Info: Apache::Vhost[nova_api_wsgi]: Scheduling refresh of Anchor[keystone::config::end]
Info: Concat[15-default.conf]: Scheduling refresh of Class[Apache::Service]
Info: Apache::Vhost[default]: Scheduling refresh of Anchor[keystone::config::end]
Notice: /Stage[main]/Nova::Api/Nova_config[DEFAULT/fping_path]/ensure: created
Info: /Stage[main]/Nova::Api/Nova_config[DEFAULT/fping_path]: Scheduling refresh of Anchor[nova::config::end]
Notice: /Stage[main]/Apache::Mod::Dav_fs/File[dav_fs.conf]/ensure: defined content as '{md5}899a57534f3d84efa81887ec93c90c9b'
Info: /Stage[main]/Apache::Mod::Dav_fs/File[dav_fs.conf]: Scheduling refresh of Class[Apache::Service]
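For context, the Notice/Info pairs above are emitted by puppet-nova's nova_config ini-setting resources. A minimal sketch of the kind of manifest that produces them, illustrative only (not the scenario002 manifest; the keys and values shown here are assumptions):

  # Each nova_config resource manages one key in /etc/nova/nova.conf and
  # notifies Anchor['nova::config::end'], which is why every "created"
  # Notice above is paired with a "Scheduling refresh" Info line.
  nova_config { 'DEFAULT/auth_strategy':
    value => 'keystone',
  }
  nova_config { 'oslo_messaging_rabbit/rabbit_host':
    value => '127.0.0.1',
  }

The anchor batches all config changes so that dependent execs and services are refreshed once, after the last key is written, rather than once per key.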
Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat[/etc/swift/object-server.conf]/File[/var/lib/puppet/concat/_etc_swift_object-server.conf/fragments.concat]/ensure: created
Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat[/etc/swift/object-server.conf]/Exec[concat_/etc/swift/object-server.conf]/returns: executed successfully
Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat[/etc/swift/object-server.conf]/Exec[concat_/etc/swift/object-server.conf]: Triggered 'refresh' from 5 events
Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat[/etc/swift/object-server.conf]/File[/etc/swift/object-server.conf]/content:
--- /etc/swift/object-server.conf	2016-05-07 16:57:43.000000000 +0100
+++ /tmp/puppet-file20160520-26469-14vpviw	2016-05-20 12:30:21.421915019 +0100
@@ -1,21 +1,39 @@
 [DEFAULT]
-
-# Make sure your swift-ring-builder arguments match the bind_ip and bind_port.
-# You almost certainly do not want to listen just on loopback unless testing.
-# However, you want to keep port 6200 if SElinux is enabled.
+devices = /srv/node
 bind_ip = 127.0.0.1
-bind_port = 6200
+bind_port = 6000
+mount_check = false
+user = swift
+log_name = object-server
+log_facility = LOG_LOCAL2
+log_level = INFO
+log_address = /dev/log
+
 
-workers = 3
+workers = 1
 
 [pipeline:main]
 pipeline = object-server
 
 [app:object-server]
 use = egg:swift#object
+set log_name = object-server
+set log_facility = LOG_LOCAL2
+set log_level = INFO
+set log_requests = true
+set log_address = /dev/log
 
 [object-replicator]
+concurrency = 8
 
 [object-updater]
+concurrency = 8
 
 [object-auditor]
+
+[filter:healthcheck]
+use = egg:swift#healthcheck
+
+[filter:recon]
+use = egg:swift#recon
+recon_cache_path = /var/cache/swift
Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat[/etc/swift/object-server.conf]/File[/etc/swift/object-server.conf]: Filebucketed /etc/swift/object-server.conf to puppet with sum 43f14d676b28bc8111d6100e06e9a8bf
Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat[/etc/swift/object-server.conf]/File[/etc/swift/object-server.conf]/content: content changed '{md5}43f14d676b28bc8111d6100e06e9a8bf' to '{md5}396c3ccb85387cbac0df92cdbad14646'
Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat[/etc/swift/object-server.conf]/File[/etc/swift/object-server.conf]/owner: owner changed 'root' to 'swift'
Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat[/etc/swift/object-server.conf]/File[/etc/swift/object-server.conf]/mode: mode changed '0640' to '0644'
Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat[/etc/swift/object-server.conf]/File[/etc/swift/object-server.conf]: Scheduling refresh of Service[swift-object-updater]
Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat[/etc/swift/object-server.conf]/File[/etc/swift/object-server.conf]: Scheduling refresh of Swift::Service[swift-object-updater]
Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat[/etc/swift/object-server.conf]/File[/etc/swift/object-server.conf]: Scheduling refresh of Service[swift-object-updater]
Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat[/etc/swift/object-server.conf]/File[/etc/swift/object-server.conf]: Scheduling refresh of Swift::Service[swift-object-updater]
Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat[/etc/swift/object-server.conf]/File[/etc/swift/object-server.conf]: Scheduling refresh of Service[swift-object-updater]
Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat[/etc/swift/object-server.conf]/File[/etc/swift/object-server.conf]: Scheduling refresh of Swift::Service[swift-object-updater]
Info: Concat[/etc/swift/object-server.conf]: Scheduling refresh of Service[swift-object-server]
Info: Concat[/etc/swift/object-server.conf]: Scheduling refresh of Service[swift-object-replicator]
Info: Concat[/etc/swift/object-server.conf]: Scheduling refresh of Service[swift-object-auditor]
Notice: /Stage[main]/Swift::Storage::Object/Swift::Storage::Generic[object]/Swift::Service[swift-object-server]/Service[swift-object-server]/ensure: ensure changed 'stopped' to 'running'
Info: /Stage[main]/Swift::Storage::Object/Swift::Storage::Generic[object]/Swift::Service[swift-object-server]/Service[swift-object-server]: Unscheduling refresh on Service[swift-object-server]
Notice: /Stage[main]/Swift::Storage::Object/Swift::Storage::Generic[object]/Swift::Service[swift-object-replicator]/Service[swift-object-replicator]/ensure: ensure changed 'stopped' to 'running'
Info: /Stage[main]/Swift::Storage::Object/Swift::Storage::Generic[object]/Swift::Service[swift-object-replicator]/Service[swift-object-replicator]: Unscheduling refresh on Service[swift-object-replicator]
Notice: /Stage[main]/Swift::Storage::Object/Swift::Storage::Generic[object]/Swift::Service[swift-object-auditor]/Service[swift-object-auditor]/ensure: ensure changed 'stopped' to 'running'
Info: /Stage[main]/Swift::Storage::Object/Swift::Storage::Generic[object]/Swift::Service[swift-object-auditor]/Service[swift-object-auditor]: Unscheduling refresh on Service[swift-object-auditor]
Info: Swift::Service[swift-object-updater]: Scheduling refresh of Service[swift-object-updater]
Notice: /Stage[main]/Swift::Storage::Object/Swift::Service[swift-object-updater]/Service[swift-object-updater]/ensure: ensure changed 'stopped' to 'running'
Info: /Stage[main]/Swift::Storage::Object/Swift::Service[swift-object-updater]/Service[swift-object-updater]: Unscheduling refresh on Service[swift-object-updater]
Notice: /Stage[main]/Cinder::Db::Sync/Exec[cinder-manage db_sync]: Triggered 'refresh' from 47 events
Info: /Stage[main]/Cinder::Db::Sync/Exec[cinder-manage db_sync]: Scheduling refresh of Service[cinder-api]
Info: /Stage[main]/Cinder::Db::Sync/Exec[cinder-manage db_sync]: Scheduling refresh of Service[cinder-scheduler]
Info: /Stage[main]/Cinder::Db::Sync/Exec[cinder-manage db_sync]: Scheduling refresh of Service[cinder-scheduler]
Info: /Stage[main]/Cinder::Db::Sync/Exec[cinder-manage db_sync]: Scheduling refresh of Service[cinder-volume]
Info: /Stage[main]/Cinder::Db::Sync/Exec[cinder-manage db_sync]: Scheduling refresh of Service[cinder-volume]
Notice: /Stage[main]/Cinder::Api/Service[cinder-api]/ensure: ensure changed 'stopped' to 'running'
Info: /Stage[main]/Cinder::Api/Service[cinder-api]: Unscheduling refresh on Service[cinder-api]
Notice: /Stage[main]/Cinder::Volume/Service[cinder-volume]/ensure: ensure changed 'stopped' to 'running'
Info: /Stage[main]/Cinder::Volume/Service[cinder-volume]: Unscheduling refresh on Service[cinder-volume]
Notice: /Stage[main]/Cinder::Scheduler/Service[cinder-scheduler]/ensure: ensure changed 'stopped' to 'running'
Info: /Stage[main]/Cinder::Scheduler/Service[cinder-scheduler]: Unscheduling refresh on Service[cinder-scheduler]
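The object-server.conf diff above is assembled by puppet-swift's concat-based storage server type; the resource title is the bind port, which is why the diff flips bind_port from the packaged default 6200 to 6000. A rough sketch of the driving resource, with parameter names assumed from puppet-swift rather than taken from this job:

  swift::storage::server { '6000':
    type    => 'object',     # selects object-server.conf as the managed file
    devices => '/srv/node',  # matches "devices = /srv/node" in the diff above
  }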
Notice: /Stage[main]/Keystone/Keystone_config[token/revoke_by_id]/ensure: created
Info: /Stage[main]/Keystone/Keystone_config[token/revoke_by_id]: Scheduling refresh of Anchor[keystone::config::end]
Notice: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat[10-ironic_wsgi.conf]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf]/ensure: created
Info: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat[10-ironic_wsgi.conf]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf]: Scheduling refresh of Exec[concat_10-ironic_wsgi.conf]
Notice: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat[10-ironic_wsgi.conf]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments]/ensure: created
Info: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat[10-ironic_wsgi.conf]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments]: Scheduling refresh of Exec[concat_10-ironic_wsgi.conf]
Notice: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat::Fragment[ironic_wsgi-access_log]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments/100_ironic_wsgi-access_log]/ensure: defined content as '{md5}f2a2c3f663fb69cb0f359c1ae7ad320c'
Info: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat::Fragment[ironic_wsgi-access_log]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments/100_ironic_wsgi-access_log]: Scheduling refresh of Exec[concat_10-ironic_wsgi.conf]
Notice: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat::Fragment[ironic_wsgi-directories]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments/60_ironic_wsgi-directories]/ensure: defined content as '{md5}29d0408a3b55a4415d880929f9a3ad46'
Info: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat::Fragment[ironic_wsgi-directories]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments/60_ironic_wsgi-directories]: Scheduling refresh of Exec[concat_10-ironic_wsgi.conf]
Notice: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat::Fragment[ironic_wsgi-ssl]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments/230_ironic_wsgi-ssl]/ensure: defined content as '{md5}d6cec447dc3b9d177de1da941662dde7'
Info: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat::Fragment[ironic_wsgi-ssl]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments/230_ironic_wsgi-ssl]: Scheduling refresh of Exec[concat_10-ironic_wsgi.conf]
Notice: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat::Fragment[ironic_wsgi-docroot]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments/10_ironic_wsgi-docroot]/ensure: defined content as '{md5}5cce1f4b838a61eb9353dc516b6f1912'
Info: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat::Fragment[ironic_wsgi-docroot]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments/10_ironic_wsgi-docroot]: Scheduling refresh of Exec[concat_10-ironic_wsgi.conf]
Notice: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat::Fragment[ironic_wsgi-wsgi]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments/260_ironic_wsgi-wsgi]/ensure: defined content as '{md5}ce69252b664facd16f8d6d002943bde9'
Info: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat::Fragment[ironic_wsgi-wsgi]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments/260_ironic_wsgi-wsgi]: Scheduling refresh of Exec[concat_10-ironic_wsgi.conf]
Notice: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat::Fragment[ironic_wsgi-apache-header]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments/0_ironic_wsgi-apache-header]/ensure: defined content as '{md5}eed662cc75f34394db84b64d61142357'
Info: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat::Fragment[ironic_wsgi-apache-header]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments/0_ironic_wsgi-apache-header]: Scheduling refresh of Exec[concat_10-ironic_wsgi.conf]
Notice: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat::Fragment[ironic_wsgi-logging]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments/80_ironic_wsgi-logging]/ensure: defined content as '{md5}228ae1c4025ea06df280b6c090746264'
Info: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat::Fragment[ironic_wsgi-logging]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments/80_ironic_wsgi-logging]: Scheduling refresh of Exec[concat_10-ironic_wsgi.conf]
Notice: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat::Fragment[ironic_wsgi-file_footer]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments/999_ironic_wsgi-file_footer]/ensure: defined content as '{md5}e27b2525783e590ca1820f1e2118285d'
Info: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat::Fragment[ironic_wsgi-file_footer]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments/999_ironic_wsgi-file_footer]: Scheduling refresh of Exec[concat_10-ironic_wsgi.conf]
Notice: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat[10-ironic_wsgi.conf]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments.concat]/ensure: created
Notice: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat[10-ironic_wsgi.conf]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments.concat.out]/ensure: created
Notice: /Stage[main]/Swift::Storage::Container/Swift::Service[swift-container-updater]/Service[swift-container-updater]/ensure: ensure changed 'stopped' to 'running'
Info: /Stage[main]/Swift::Storage::Container/Swift::Service[swift-container-updater]/Service[swift-container-updater]: Unscheduling refresh on Service[swift-container-updater]
Notice: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat::Fragment[ironic_wsgi-serversignature]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments/90_ironic_wsgi-serversignature]/ensure: defined content as '{md5}9bf5a458783ab459e5043e1cdf671fa7'
Info: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat::Fragment[ironic_wsgi-serversignature]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments/90_ironic_wsgi-serversignature]: Scheduling refresh of Exec[concat_10-ironic_wsgi.conf]
Notice: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat[10-ironic_wsgi.conf]/Exec[concat_10-ironic_wsgi.conf]/returns: executed successfully
Notice: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat[10-ironic_wsgi.conf]/Exec[concat_10-ironic_wsgi.conf]: Triggered 'refresh' from 11 events
Notice: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat[10-ironic_wsgi.conf]/File[10-ironic_wsgi.conf]/ensure: defined content as '{md5}fd0438eae872c05b10e229854a6dd56d'
Info: Concat[10-ironic_wsgi.conf]: Scheduling refresh of Class[Apache::Service]
Info: Apache::Vhost[ironic_wsgi]: Scheduling refresh of Anchor[keystone::config::end]
Notice: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/l3_ha]/ensure: created
Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/l3_ha]: Scheduling refresh of Service[neutron-server]
Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/l3_ha]: Scheduling refresh of Exec[neutron-db-sync]
Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/l3_ha]: Scheduling refresh of Service[neutron-metadata]
Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/l3_ha]: Scheduling refresh of Service[neutron-lbaas-service]
Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/l3_ha]: Scheduling refresh of Service[neutron-l3]
Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/l3_ha]: Scheduling refresh of Service[neutron-dhcp-service]
Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/l3_ha]: Scheduling refresh of Service[neutron-metering-service]
Notice: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_host]/ensure: created
Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_host]: Scheduling refresh of Service[glance-api]
Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_host]: Scheduling refresh of Exec[glance-manage db_sync]
Notice: /Stage[main]/Openstack_integration::Ironic/Openstack_integration::Ssl_key[ironic]/File[/etc/ironic/ssl]/ensure: created
Notice: /Stage[main]/Openstack_integration::Ironic/Openstack_integration::Ssl_key[ironic]/File[/etc/ironic/ssl/private]/ensure: created
Notice: /Stage[main]/Openstack_integration::Ironic/Openstack_integration::Ssl_key[ironic]/File[/etc/ironic/ssl/private/n2.dusty.ci.centos.org.pem]/ensure: defined content as '{md5}a4b1e9413d7b63f33794716d77cfaa00'
Info: Openstack_integration::Ssl_key[ironic]: Scheduling refresh of Service[httpd]
Notice: /Stage[main]/Keystone::Logging/Keystone_config[DEFAULT/verbose]/ensure: created
Info: /Stage[main]/Keystone::Logging/Keystone_config[DEFAULT/verbose]: Scheduling refresh of Anchor[keystone::config::end]
Notice: /Stage[main]/Glance::Api/Glance_cache_config[DEFAULT/auth_url]/ensure: created
Info: /Stage[main]/Glance::Api/Glance_cache_config[DEFAULT/auth_url]: Scheduling refresh of Service[glance-api]
Info: /Stage[main]/Glance::Api/Glance_cache_config[DEFAULT/auth_url]: Scheduling refresh of Exec[glance-manage db_sync]
Notice: /Stage[main]/Glance::Db::Sync/Exec[glance-manage db_sync]: Triggered 'refresh' from 83 events
Info: /Stage[main]/Glance::Db::Sync/Exec[glance-manage db_sync]: Scheduling refresh of Service[glance-api]
Info: /Stage[main]/Glance::Db::Sync/Exec[glance-manage db_sync]: Scheduling refresh of Service[glance-registry]
Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat::Fragment[keystone_wsgi_main-directories]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments/60_keystone_wsgi_main-directories]/ensure: defined content as '{md5}cc81234a3bbf77f857ed3f11bb369e8c'
Info: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat::Fragment[keystone_wsgi_main-directories]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments/60_keystone_wsgi_main-directories]: Scheduling refresh of Exec[concat_10-keystone_wsgi_main.conf]
Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat[10-keystone_wsgi_main.conf]/Exec[concat_10-keystone_wsgi_main.conf]/returns: executed successfully
Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat[10-keystone_wsgi_main.conf]/Exec[concat_10-keystone_wsgi_main.conf]: Triggered 'refresh' from 11 events
Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat[10-keystone_wsgi_main.conf]/File[10-keystone_wsgi_main.conf]/ensure: defined content as '{md5}fa0ea0cfef0ad72ddbeb9b6110bd2c86'
Info: Concat[10-keystone_wsgi_main.conf]: Scheduling refresh of Class[Apache::Service]
Info: Apache::Vhost[keystone_wsgi_main]: Scheduling refresh of Anchor[keystone::config::end]
Notice: /Stage[main]/Nova::Deps/Anchor[nova::config::end]: Triggered 'refresh' from 103 events
Info: /Stage[main]/Nova::Deps/Anchor[nova::config::end]: Scheduling refresh of Anchor[nova::service::begin]
Info: /Stage[main]/Nova::Deps/Anchor[nova::config::end]: Scheduling refresh of Exec[nova-db-sync]
Info: /Stage[main]/Nova::Deps/Anchor[nova::config::end]: Scheduling refresh of Exec[nova-db-sync-api]
Notice: /Stage[main]/Nova::Db::Mysql/Openstacklib::Db::Mysql[nova]/Mysql_database[nova]/ensure: created
Notice: /Stage[main]/Nova::Db::Mysql/Openstacklib::Db::Mysql[nova]/Openstacklib::Db::Mysql::Host_access[nova_127.0.0.1]/Mysql_user[nova@127.0.0.1]/ensure: created
Notice: /Stage[main]/Nova::Db::Mysql/Openstacklib::Db::Mysql[nova]/Openstacklib::Db::Mysql::Host_access[nova_127.0.0.1]/Mysql_grant[nova@127.0.0.1/nova.*]/ensure: created
Info: Class[Nova::Db::Mysql]: Scheduling refresh of Anchor[nova::db::end]
Notice: /Stage[main]/Nova::Db::Mysql_api/Openstacklib::Db::Mysql[nova_api]/Mysql_database[nova_api]/ensure: created
Notice: /Stage[main]/Nova::Db::Mysql_api/Openstacklib::Db::Mysql[nova_api]/Openstacklib::Db::Mysql::Host_access[nova_api_127.0.0.1]/Mysql_user[nova_api@127.0.0.1]/ensure: created
Notice: /Stage[main]/Nova::Db::Mysql_api/Openstacklib::Db::Mysql[nova_api]/Openstacklib::Db::Mysql::Host_access[nova_api_127.0.0.1]/Mysql_grant[nova_api@127.0.0.1/nova_api.*]/ensure: created
Info: Class[Nova::Db::Mysql_api]: Scheduling refresh of Anchor[nova::db::end]
Notice: /Stage[main]/Nova::Deps/Anchor[nova::db::end]: Triggered 'refresh' from 2 events
Info: /Stage[main]/Nova::Deps/Anchor[nova::db::end]: Scheduling refresh of Anchor[nova::dbsync::begin]
Notice: /Stage[main]/Nova::Deps/Anchor[nova::dbsync::begin]: Triggered 'refresh' from 1 events
Info: /Stage[main]/Nova::Deps/Anchor[nova::dbsync::begin]: Scheduling refresh of Exec[nova-db-sync]
Notice: /Stage[main]/Nova::Db::Sync/Exec[nova-db-sync]: Triggered 'refresh' from 3 events
Info: /Stage[main]/Nova::Db::Sync/Exec[nova-db-sync]: Scheduling refresh of Anchor[nova::dbsync::end]
Notice: /Stage[main]/Nova::Deps/Anchor[nova::dbsync::end]: Triggered 'refresh' from 1 events
Info: /Stage[main]/Nova::Deps/Anchor[nova::dbsync::end]: Scheduling refresh of Anchor[nova::dbsync_api::begin]
Notice: /Stage[main]/Nova::Cron::Archive_deleted_rows/Cron[nova-manage db archive_deleted_rows]/ensure: created
Notice: /Stage[main]/Nova::Deps/Anchor[nova::dbsync_api::begin]: Triggered 'refresh' from 1 events
Info: /Stage[main]/Nova::Deps/Anchor[nova::dbsync_api::begin]: Scheduling refresh of Exec[nova-db-sync-api]
Notice: /Stage[main]/Nova::Db::Sync_api/Exec[nova-db-sync-api]: Triggered 'refresh' from 3 events
Info: /Stage[main]/Nova::Db::Sync_api/Exec[nova-db-sync-api]: Scheduling refresh of Anchor[nova::dbsync_api::end]
Notice: /Stage[main]/Nova::Deps/Anchor[nova::dbsync_api::end]: Triggered 'refresh' from 1 events
Info: /Stage[main]/Nova::Deps/Anchor[nova::dbsync_api::end]: Scheduling refresh of Anchor[nova::service::begin]
Notice: /Stage[main]/Nova::Deps/Anchor[nova::service::begin]: Triggered 'refresh' from 3 events
Info: /Stage[main]/Nova::Deps/Anchor[nova::service::begin]: Scheduling refresh of Service[nova-api]
Info: /Stage[main]/Nova::Deps/Anchor[nova::service::begin]: Scheduling refresh of Service[nova-conductor]
Info: /Stage[main]/Nova::Deps/Anchor[nova::service::begin]: Scheduling refresh of Service[nova-consoleauth]
Info: /Stage[main]/Nova::Deps/Anchor[nova::service::begin]: Scheduling refresh of Service[nova-compute]
Info: /Stage[main]/Nova::Deps/Anchor[nova::service::begin]: Scheduling refresh of Service[nova-scheduler]
Info: /Stage[main]/Nova::Deps/Anchor[nova::service::begin]: Scheduling refresh of Service[nova-vncproxy]
Notice: /Stage[main]/Nova::Vncproxy/Nova::Generic_service[vncproxy]/Service[nova-vncproxy]/ensure: ensure changed 'stopped' to 'running'
Info: /Stage[main]/Nova::Vncproxy/Nova::Generic_service[vncproxy]/Service[nova-vncproxy]: Scheduling refresh of Anchor[nova::service::end]
Info: /Stage[main]/Nova::Vncproxy/Nova::Generic_service[vncproxy]/Service[nova-vncproxy]: Unscheduling refresh on Service[nova-vncproxy]
Notice: /Stage[main]/Nova::Consoleauth/Nova::Generic_service[consoleauth]/Service[nova-consoleauth]/ensure: ensure changed 'stopped' to 'running'
Info: /Stage[main]/Nova::Consoleauth/Nova::Generic_service[consoleauth]/Service[nova-consoleauth]: Scheduling refresh of Anchor[nova::service::end]
Info: /Stage[main]/Nova::Consoleauth/Nova::Generic_service[consoleauth]/Service[nova-consoleauth]: Unscheduling refresh on Service[nova-consoleauth]
Notice: /Stage[main]/Nova::Scheduler/Nova::Generic_service[scheduler]/Service[nova-scheduler]/ensure: ensure changed 'stopped' to 'running'
Info: /Stage[main]/Nova::Scheduler/Nova::Generic_service[scheduler]/Service[nova-scheduler]: Scheduling refresh of Anchor[nova::service::end]
Info: /Stage[main]/Nova::Scheduler/Nova::Generic_service[scheduler]/Service[nova-scheduler]: Unscheduling refresh on Service[nova-scheduler]
Notice: /Stage[main]/Nova::Conductor/Nova::Generic_service[conductor]/Service[nova-conductor]/ensure: ensure changed 'stopped' to 'running'
Info: /Stage[main]/Nova::Conductor/Nova::Generic_service[conductor]/Service[nova-conductor]: Scheduling refresh of Anchor[nova::service::end]
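The chain above is puppet-nova's anchor pattern in action: config changes funnel into nova::config::end, the refresh propagates through the db-sync anchors, and only then are the nova services restarted. Schematically, a sketch of the pattern rather than the module's literal code:

  anchor { 'nova::config::end': }
  # '~>' chains ordering plus refresh notification
  ~> exec { 'nova-db-sync':
    command     => 'nova-manage db sync',
    path        => '/usr/bin',
    refreshonly => true,  # runs only when a config change notifies it
  }
  ~> anchor { 'nova::service::begin': }

The refreshonly exec is why the log shows "Triggered 'refresh'" rather than an unconditional run: the sync happens once, after all pending config events have been collected.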
Info: /Stage[main]/Nova::Conductor/Nova::Generic_service[conductor]/Service[nova-conductor]: Unscheduling refresh on Service[nova-conductor]
Notice: /Stage[main]/Nova::Compute/Nova::Generic_service[compute]/Service[nova-compute]/ensure: ensure changed 'stopped' to 'running'
Info: /Stage[main]/Nova::Compute/Nova::Generic_service[compute]/Service[nova-compute]: Scheduling refresh of Anchor[nova::service::end]
Info: /Stage[main]/Nova::Compute/Nova::Generic_service[compute]/Service[nova-compute]: Unscheduling refresh on Service[nova-compute]
Notice: /Stage[main]/Apache::Mod::Cgi/Apache::Mod[cgi]/File[cgi.load]/ensure: defined content as '{md5}ac20c5c5779b37ab06b480d6485a0881'
Info: /Stage[main]/Apache::Mod::Cgi/Apache::Mod[cgi]/File[cgi.load]: Scheduling refresh of Class[Apache::Service]
Info: /Stage[main]/Apache/File[/etc/httpd/conf.modules.d/00-proxy.conf]: Filebucketed /etc/httpd/conf.modules.d/00-proxy.conf to puppet with sum 85487c6777a89a8494dc8976dfff3268
Notice: /Stage[main]/Apache/File[/etc/httpd/conf.modules.d/00-proxy.conf]/ensure: removed
Info: /Stage[main]/Apache/File[/etc/httpd/conf.modules.d/01-cgi.conf]: Filebucketed /etc/httpd/conf.modules.d/01-cgi.conf to puppet with sum 36e54d4b2bd190f5cbad876bfbeda461
Notice: /Stage[main]/Apache/File[/etc/httpd/conf.modules.d/01-cgi.conf]/ensure: removed
Info: /Stage[main]/Apache/File[/etc/httpd/conf.modules.d/00-ssl.conf]: Filebucketed /etc/httpd/conf.modules.d/00-ssl.conf to puppet with sum e282ac9f82fe5538692a4de3616fb695
Notice: /Stage[main]/Apache/File[/etc/httpd/conf.modules.d/00-ssl.conf]/ensure: removed
Info: /Stage[main]/Apache/File[/etc/httpd/conf.modules.d/00-mpm.conf]: Filebucketed /etc/httpd/conf.modules.d/00-mpm.conf to puppet with sum 820f672ca85595fd80620db585d51970
Notice: /Stage[main]/Apache/File[/etc/httpd/conf.modules.d/00-mpm.conf]/ensure: removed
Info: /Stage[main]/Apache/File[/etc/httpd/conf.modules.d/00-systemd.conf]: Filebucketed /etc/httpd/conf.modules.d/00-systemd.conf to puppet with sum fd94264cd695af2ad86e7715c10e285d
Notice: /Stage[main]/Apache/File[/etc/httpd/conf.modules.d/00-systemd.conf]/ensure: removed
Info: /Stage[main]/Apache/File[/etc/httpd/conf.modules.d/10-wsgi.conf]: Filebucketed /etc/httpd/conf.modules.d/10-wsgi.conf to puppet with sum e1795e051e7aae1f865fde0d3b86a507
Notice: /Stage[main]/Apache/File[/etc/httpd/conf.modules.d/10-wsgi.conf]/ensure: removed
Info: /Stage[main]/Apache/File[/etc/httpd/conf.modules.d/00-base.conf]: Filebucketed /etc/httpd/conf.modules.d/00-base.conf to puppet with sum 6098845a84033f0fabe536488e52b1a0
Notice: /Stage[main]/Apache/File[/etc/httpd/conf.modules.d/00-base.conf]/ensure: removed
Info: /Stage[main]/Apache/File[/etc/httpd/conf.modules.d/00-lua.conf]: Filebucketed /etc/httpd/conf.modules.d/00-lua.conf to puppet with sum 449a4aea60473ac4a16f025fca4463e3
Notice: /Stage[main]/Apache/File[/etc/httpd/conf.modules.d/00-lua.conf]/ensure: removed
Info: /Stage[main]/Apache/File[/etc/httpd/conf.modules.d/00-dav.conf]: Filebucketed /etc/httpd/conf.modules.d/00-dav.conf to puppet with sum 56406b62d1fc7b7f1912e5b9e223f7a0
Notice: /Stage[main]/Apache/File[/etc/httpd/conf.modules.d/00-dav.conf]/ensure: removed
Info: /etc/httpd/conf.modules.d: Scheduling refresh of Class[Apache::Service]
Notice: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_port]/ensure: created
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_port]: Scheduling refresh of Service[neutron-server]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_port]: Scheduling refresh of Exec[neutron-db-sync]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_port]: Scheduling refresh of Service[neutron-metadata]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_port]: Scheduling refresh of Service[neutron-lbaas-service]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_port]: Scheduling refresh of Service[neutron-l3]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_port]: Scheduling refresh of Service[neutron-dhcp-service]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_port]: Scheduling refresh of Service[neutron-metering-service]
Notice: /Stage[main]/Neutron::Agents::Ml2::Ovs/Service[neutron-ovs-agent-service]/ensure: ensure changed 'stopped' to 'running'
Info: /Stage[main]/Neutron::Agents::Ml2::Ovs/Service[neutron-ovs-agent-service]: Unscheduling refresh on Service[neutron-ovs-agent-service]
Notice: /Stage[main]/Neutron::Agents::Lbaas/Service[neutron-lbaasv2-service]: Triggered 'refresh' from 1 events
Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat[10-keystone_wsgi_admin.conf]/File[10-keystone_wsgi_admin.conf]/ensure: defined content as '{md5}5147e80911d47f807820c80ccf1b3f9e'
Info: Concat[10-keystone_wsgi_admin.conf]: Scheduling refresh of Class[Apache::Service]
Info: Apache::Vhost[keystone_wsgi_admin]: Scheduling refresh of Anchor[keystone::config::end]
Notice: /Stage[main]/Keystone::Deps/Anchor[keystone::config::end]: Triggered 'refresh' from 36 events
Info: /Stage[main]/Keystone::Deps/Anchor[keystone::config::end]: Scheduling refresh of Anchor[keystone::service::begin]
Info: /Stage[main]/Keystone::Deps/Anchor[keystone::config::end]: Scheduling refresh of Service[httpd]
Info: /Stage[main]/Keystone::Deps/Anchor[keystone::config::end]: Scheduling refresh of Exec[keystone-manage db_sync]
Notice: /Stage[main]/Keystone::Db::Mysql/Openstacklib::Db::Mysql[keystone]/Mysql_database[keystone]/ensure: created
Info: Class[Apache::Service]: Scheduling refresh of Service[httpd]
Notice: /Stage[main]/Keystone::Db::Mysql/Openstacklib::Db::Mysql[keystone]/Openstacklib::Db::Mysql::Host_access[keystone_127.0.0.1]/Mysql_user[keystone@127.0.0.1]/ensure: created
Notice: /Stage[main]/Keystone::Db::Mysql/Openstacklib::Db::Mysql[keystone]/Openstacklib::Db::Mysql::Host_access[keystone_127.0.0.1]/Mysql_grant[keystone@127.0.0.1/keystone.*]/ensure: created
Info: Class[Keystone::Db::Mysql]: Scheduling refresh of Anchor[keystone::db::end]
Notice: /Stage[main]/Keystone::Deps/Anchor[keystone::db::end]: Triggered 'refresh' from 1 events
Info: /Stage[main]/Keystone::Deps/Anchor[keystone::db::end]: Scheduling refresh of Anchor[keystone::dbsync::begin]
Notice: /Stage[main]/Keystone::Deps/Anchor[keystone::dbsync::begin]: Triggered 'refresh' from 1 events
Info: /Stage[main]/Keystone::Deps/Anchor[keystone::dbsync::begin]: Scheduling refresh of Exec[keystone-manage db_sync]
Notice: /Stage[main]/Keystone::Db::Sync/Exec[keystone-manage db_sync]: Triggered 'refresh' from 3 events
Info: /Stage[main]/Keystone::Db::Sync/Exec[keystone-manage db_sync]: Scheduling refresh of Anchor[keystone::dbsync::end]
Notice: /Stage[main]/Keystone::Deps/Anchor[keystone::dbsync::end]: Triggered 'refresh' from 1 events
Info: /Stage[main]/Keystone::Deps/Anchor[keystone::dbsync::end]: Scheduling refresh of Anchor[keystone::service::begin]
Info: /Stage[main]/Keystone::Deps/Anchor[keystone::dbsync::end]: Scheduling refresh of Exec[keystone-manage bootstrap]
Notice: /Stage[main]/Keystone/Exec[keystone-manage bootstrap]: Triggered 'refresh' from 1 events
Info: /Stage[main]/Keystone/Exec[keystone-manage bootstrap]: Scheduling refresh of Anchor[keystone::service::begin]
Notice: /Stage[main]/Keystone::Deps/Anchor[keystone::service::begin]: Triggered 'refresh' from 4 events
Info: /Stage[main]/Keystone::Deps/Anchor[keystone::service::begin]: Scheduling refresh of Service[keystone]
Notice: /Stage[main]/Keystone::Service/Service[keystone]: Triggered 'refresh' from 1 events
Info: /Stage[main]/Keystone::Service/Service[keystone]: Scheduling refresh of Anchor[keystone::service::end]
Notice: /Stage[main]/Apache::Service/Service[httpd]/ensure: ensure changed 'stopped' to 'running'
Info: /Stage[main]/Apache::Service/Service[httpd]: Unscheduling refresh on Service[httpd]
Notice: /Stage[main]/Keystone::Deps/Anchor[keystone::service::end]: Triggered 'refresh' from 31 events
Notice: /Stage[main]/Swift::Keystone::Auth/Keystone::Resource::Service_identity[swift_s3]/Keystone_service[swift_s3::s3]/ensure: created
Notice: /Stage[main]/Swift::Keystone::Auth/Keystone::Resource::Service_identity[swift_s3]/Keystone_endpoint[RegionOne/swift_s3::s3]/ensure: created
Notice: /Stage[main]/Glance::Keystone::Auth/Keystone::Resource::Service_identity[glance]/Keystone_service[Image Service::image]/ensure: created
Notice: /Stage[main]/Neutron::Keystone::Auth/Keystone::Resource::Service_identity[neutron]/Keystone_user[neutron]/ensure: created
Notice: /Stage[main]/Swift::Keystone::Auth/Keystone_role[ResellerAdmin]/ensure: created
Notice: /Stage[main]/Ironic::Keystone::Auth/Keystone::Resource::Service_identity[ironic]/Keystone_service[ironic::baremetal]/ensure: created
Notice: /Stage[main]/Glance::Keystone::Auth/Keystone::Resource::Service_identity[glance]/Keystone_endpoint[RegionOne/Image Service::image]/ensure: created
Info: /Stage[main]/Glance::Keystone::Auth/Keystone::Resource::Service_identity[glance]/Keystone_endpoint[RegionOne/Image Service::image]: Scheduling refresh of Service[glance-api]
Notice: /Stage[main]/Nova::Keystone::Auth/Keystone::Resource::Service_identity[nova service, user nova]/Keystone_user[nova]/ensure: created
Notice: /Stage[main]/Swift::Keystone::Auth/Keystone_role[SwiftOperator]/ensure: created
Notice: /Stage[main]/Glance::Keystone::Auth/Keystone::Resource::Service_identity[glance]/Keystone_user[glance]/ensure: created
Notice: /Stage[main]/Nova::Keystone::Auth/Keystone::Resource::Service_identity[nova v3 service, user novav3]/Keystone_service[novav3::computev3]/ensure: created
Notice: /Stage[main]/Keystone::Roles::Admin/Keystone_tenant[services]/ensure: created
Notice: /Stage[main]/Glance::Keystone::Auth/Keystone::Resource::Service_identity[glance]/Keystone_user_role[glance@services]/ensure: created
Info: /Stage[main]/Glance::Keystone::Auth/Keystone::Resource::Service_identity[glance]/Keystone_user_role[glance@services]: Scheduling refresh of Service[glance-registry]
Info: /Stage[main]/Glance::Keystone::Auth/Keystone::Resource::Service_identity[glance]/Keystone_user_role[glance@services]: Scheduling refresh of Service[glance-api]
Notice: /Stage[main]/Glance::Registry/Service[glance-registry]/ensure: ensure changed 'stopped' to 'running'
Info: /Stage[main]/Glance::Registry/Service[glance-registry]: Unscheduling refresh on Service[glance-registry]
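Each Keystone_user / Keystone_service / Keystone_endpoint trio above is expanded from puppet-keystone's keystone::resource::service_identity wrapper. An illustrative use, where the URLs and password are placeholders rather than values from this run:

  keystone::resource::service_identity { 'glance':
    service_type => 'image',
    password     => 'SECRET',
    # One endpoint resource is created per interface, hence the
    # Keystone_endpoint[RegionOne/...] notices in the log above.
    public_url   => 'http://127.0.0.1:9292',
    internal_url => 'http://127.0.0.1:9292',
    admin_url    => 'http://127.0.0.1:9292',
  }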
Notice: /Stage[main]/Keystone::Roles::Admin/Keystone_tenant[openstack]/ensure: created
Notice: /Stage[main]/Ironic::Keystone::Auth/Keystone::Resource::Service_identity[ironic]/Keystone_user[ironic]/ensure: created
Notice: /Stage[main]/Ironic::Keystone::Auth/Keystone::Resource::Service_identity[ironic]/Keystone_user_role[ironic@services]/ensure: created
Notice: /Stage[main]/Nova::Keystone::Auth/Keystone::Resource::Service_identity[nova service, user nova]/Keystone_service[nova::compute]/ensure: created
Notice: /Stage[main]/Cinder::Keystone::Auth/Keystone::Resource::Service_identity[cinderv3]/Keystone_service[cinderv3::volumev3]/ensure: created
Notice: /Stage[main]/Cinder::Keystone::Auth/Keystone::Resource::Service_identity[cinderv3]/Keystone_endpoint[RegionOne/cinderv3::volumev3]/ensure: created
Notice: /Stage[main]/Nova::Api/Nova::Generic_service[api]/Service[nova-api]/ensure: ensure changed 'stopped' to 'running'
Info: /Stage[main]/Nova::Api/Nova::Generic_service[api]/Service[nova-api]: Scheduling refresh of Anchor[nova::service::end]
Info: /Stage[main]/Nova::Api/Nova::Generic_service[api]/Service[nova-api]: Unscheduling refresh on Service[nova-api]
Notice: /Stage[main]/Neutron::Keystone::Auth/Keystone::Resource::Service_identity[neutron]/Keystone_service[neutron::network]/ensure: created
Notice: /Stage[main]/Nova::Keystone::Auth/Keystone::Resource::Service_identity[nova service, user nova]/Keystone_user_role[nova@services]/ensure: created
Notice: /Stage[main]/Keystone::Roles::Admin/Keystone_user[admin]/password: changed password
Notice: /Stage[main]/Keystone::Roles::Admin/Keystone_user[admin]/email: defined 'email' as 'test@example.tld'
Notice: /Stage[main]/Cinder::Keystone::Auth/Keystone::Resource::Service_identity[cinder]/Keystone_user[cinder]/ensure: created
Notice: /Stage[main]/Cinder::Keystone::Auth/Keystone::Resource::Service_identity[cinderv2]/Keystone_service[cinderv2::volumev2]/ensure: created
Notice: /Stage[main]/Nova::Deps/Anchor[nova::service::end]: Triggered 'refresh' from 6 events
Notice: /Stage[main]/Nova::Keystone::Auth/Keystone::Resource::Service_identity[nova v3 service, user novav3]/Keystone_endpoint[RegionOne/novav3::computev3]/ensure: created
Notice: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/notify_nova_on_port_status_changes]/ensure: created
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/notify_nova_on_port_status_changes]: Scheduling refresh of Service[neutron-server]
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/notify_nova_on_port_status_changes]: Scheduling refresh of Exec[neutron-db-sync]
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/notify_nova_on_port_status_changes]: Scheduling refresh of Service[neutron-metadata]
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/notify_nova_on_port_status_changes]: Scheduling refresh of Service[neutron-lbaas-service]
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/notify_nova_on_port_status_changes]: Scheduling refresh of Service[neutron-l3]
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/notify_nova_on_port_status_changes]: Scheduling refresh of Service[neutron-dhcp-service]
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/notify_nova_on_port_status_changes]: Scheduling refresh of Service[neutron-metering-service]
Notice: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/auth_type]/ensure: created
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/auth_type]: Scheduling refresh of Service[neutron-server]
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/auth_type]: Scheduling refresh of Exec[neutron-db-sync]
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/auth_type]: Scheduling refresh of Service[neutron-metadata]
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/auth_type]: Scheduling refresh of Service[neutron-lbaas-service]
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/auth_type]: Scheduling refresh of Service[neutron-l3]
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/auth_type]: Scheduling refresh of Service[neutron-dhcp-service]
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/auth_type]: Scheduling refresh of Service[neutron-metering-service]
Notice: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/project_name]/ensure: created
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/project_name]: Scheduling refresh of Service[neutron-server]
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/project_name]: Scheduling refresh of Exec[neutron-db-sync]
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/project_name]: Scheduling refresh of Service[neutron-metadata]
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/project_name]: Scheduling refresh of Service[neutron-lbaas-service]
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/project_name]: Scheduling refresh of Service[neutron-l3]
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/project_name]: Scheduling refresh of Service[neutron-dhcp-service]
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/project_name]: Scheduling refresh of Service[neutron-metering-service]
Notice: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/username]/ensure: created
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/username]: Scheduling refresh of Service[neutron-server]
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/username]: Scheduling refresh of Exec[neutron-db-sync]
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/username]: Scheduling refresh of Service[neutron-metadata]
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/username]: Scheduling refresh of Service[neutron-lbaas-service]
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/username]: Scheduling refresh of Service[neutron-l3]
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/username]: Scheduling refresh of Service[neutron-dhcp-service]
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/username]: Scheduling refresh of Service[neutron-metering-service]
Notice: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/password]/ensure: created
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/password]: Scheduling refresh of Service[neutron-server]
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/password]: Scheduling refresh of Exec[neutron-db-sync]
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/password]: Scheduling refresh of Service[neutron-metadata]
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/password]: Scheduling refresh of Service[neutron-lbaas-service]
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/password]: Scheduling refresh of Service[neutron-l3]
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/password]: Scheduling refresh of Service[neutron-dhcp-service]
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/password]: Scheduling refresh of Service[neutron-metering-service]
Notice: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/project_domain_id]/ensure: created
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/project_domain_id]: Scheduling refresh of Service[neutron-server]
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/project_domain_id]: Scheduling refresh of Exec[neutron-db-sync]
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/project_domain_id]: Scheduling refresh of Service[neutron-metadata]
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/project_domain_id]: Scheduling refresh of Service[neutron-lbaas-service]
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/project_domain_id]: Scheduling refresh of Service[neutron-l3]
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/project_domain_id]: Scheduling refresh of Service[neutron-dhcp-service]
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/project_domain_id]: Scheduling refresh of Service[neutron-metering-service]
Notice: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/notify_nova_on_port_data_changes]/ensure: created
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/notify_nova_on_port_data_changes]: Scheduling refresh of Service[neutron-server]
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/notify_nova_on_port_data_changes]: Scheduling refresh of Exec[neutron-db-sync]
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/notify_nova_on_port_data_changes]: Scheduling refresh of Service[neutron-metadata]
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/notify_nova_on_port_data_changes]: Scheduling refresh of Service[neutron-lbaas-service]
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/notify_nova_on_port_data_changes]: Scheduling refresh of Service[neutron-l3]
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/notify_nova_on_port_data_changes]: Scheduling refresh of Service[neutron-dhcp-service]
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/notify_nova_on_port_data_changes]: Scheduling refresh of Service[neutron-metering-service]
Notice: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/tenant_name]/ensure: created
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/tenant_name]: Scheduling refresh of Service[neutron-server]
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/tenant_name]: Scheduling refresh of Exec[neutron-db-sync]
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/tenant_name]: Scheduling refresh of Service[neutron-metadata]
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/tenant_name]: Scheduling refresh of Service[neutron-lbaas-service]
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/tenant_name]: Scheduling refresh of Service[neutron-l3]
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/tenant_name]: Scheduling refresh of Service[neutron-dhcp-service]
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/tenant_name]: Scheduling refresh of Service[neutron-metering-service]
Notice: /Stage[main]/Nova::Keystone::Auth/Keystone::Resource::Service_identity[nova service, user nova]/Keystone_endpoint[RegionOne/nova::compute]/ensure: created
Notice: /Stage[main]/Neutron::Keystone::Auth/Keystone::Resource::Service_identity[neutron]/Keystone_endpoint[RegionOne/neutron::network]/ensure: created
Info: /Stage[main]/Neutron::Keystone::Auth/Keystone::Resource::Service_identity[neutron]/Keystone_endpoint[RegionOne/neutron::network]: Scheduling refresh of Service[neutron-server]
Notice: /Stage[main]/Swift::Keystone::Auth/Keystone::Resource::Service_identity[swift]/Keystone_service[swift::object-store]/ensure: created
Notice: /Stage[main]/Keystone::Endpoint/Keystone::Resource::Service_identity[keystone]/Keystone_service[keystone::identity]/ensure: created
Notice: /Stage[main]/Keystone::Endpoint/Keystone::Resource::Service_identity[keystone]/Keystone_endpoint[RegionOne/keystone::identity]/ensure: created
Notice: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/user_domain_id]/ensure: created
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/user_domain_id]: Scheduling refresh of Service[neutron-server]
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/user_domain_id]: Scheduling refresh of Exec[neutron-db-sync]
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/user_domain_id]: Scheduling refresh of Service[neutron-metadata]
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/user_domain_id]: Scheduling refresh of Service[neutron-lbaas-service]
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/user_domain_id]: Scheduling refresh of Service[neutron-l3]
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/user_domain_id]: Scheduling refresh of Service[neutron-dhcp-service]
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/user_domain_id]: Scheduling refresh of Service[neutron-metering-service]
Notice: /Stage[main]/Neutron::Keystone::Auth/Keystone::Resource::Service_identity[neutron]/Keystone_user_role[neutron@services]/ensure: created
Info: /Stage[main]/Neutron::Keystone::Auth/Keystone::Resource::Service_identity[neutron]/Keystone_user_role[neutron@services]: Scheduling refresh of Service[neutron-server]
Notice: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/auth_url]/ensure: created
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/auth_url]: Scheduling refresh of Service[neutron-server]
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/auth_url]: Scheduling refresh of Exec[neutron-db-sync]
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/auth_url]: Scheduling refresh of Service[neutron-metadata]
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/auth_url]: Scheduling refresh of Service[neutron-lbaas-service]
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/auth_url]: Scheduling refresh of Service[neutron-l3]
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/auth_url]: Scheduling refresh of Service[neutron-dhcp-service]
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/auth_url]: Scheduling refresh of Service[neutron-metering-service]
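The long run of nova/* Neutron_config keys above is written by puppet-neutron's neutron::server::notifications class, which configures neutron-server to call back into nova when port status or data changes. A hedged sketch of the class interface, with parameter names following the mitaka-era module and all values invented:

  class { '::neutron::server::notifications':
    notify_nova_on_port_status_changes => true,
    notify_nova_on_port_data_changes   => true,
    # Credentials neutron uses to authenticate against nova's API
    auth_url                           => 'http://127.0.0.1:35357',
    username                           => 'nova',
    password                           => 'SECRET',
    tenant_name                        => 'services',
  }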
Scheduling refresh of Service[neutron-metering-service]\nNotice: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/nova_url]/ensure: created\nInfo: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/nova_url]: Scheduling refresh of Service[neutron-server]\nInfo: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/nova_url]: Scheduling refresh of Exec[neutron-db-sync]\nInfo: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/nova_url]: Scheduling refresh of Service[neutron-metadata]\nInfo: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/nova_url]: Scheduling refresh of Service[neutron-lbaas-service]\nInfo: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/nova_url]: Scheduling refresh of Service[neutron-l3]\nInfo: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/nova_url]: Scheduling refresh of Service[neutron-dhcp-service]\nInfo: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/nova_url]: Scheduling refresh of Service[neutron-metering-service]\nNotice: /Stage[main]/Neutron::Agents::Metering/Service[neutron-metering-service]/ensure: ensure changed 'stopped' to 'running'\nInfo: /Stage[main]/Neutron::Agents::Metering/Service[neutron-metering-service]: Unscheduling refresh on Service[neutron-metering-service]\nNotice: /Stage[main]/Neutron::Agents::Dhcp/Service[neutron-dhcp-service]/ensure: ensure changed 'stopped' to 'running'\nInfo: /Stage[main]/Neutron::Agents::Dhcp/Service[neutron-dhcp-service]: Unscheduling refresh on Service[neutron-dhcp-service]\nNotice: /Stage[main]/Neutron::Agents::L3/Service[neutron-l3]/ensure: ensure changed 'stopped' to 'running'\nInfo: /Stage[main]/Neutron::Agents::L3/Service[neutron-l3]: Unscheduling refresh on Service[neutron-l3]\nNotice: /Stage[main]/Neutron::Agents::Lbaas/Service[neutron-lbaas-service]/ensure: ensure changed 'stopped' to 'running'\nInfo: /Stage[main]/Neutron::Agents::Lbaas/Service[neutron-lbaas-service]: Unscheduling refresh on Service[neutron-lbaas-service]\nNotice: /Stage[main]/Neutron::Agents::Metadata/Service[neutron-metadata]/ensure: ensure changed 'stopped' to 'running'\nInfo: /Stage[main]/Neutron::Agents::Metadata/Service[neutron-metadata]: Unscheduling refresh on Service[neutron-metadata]\nNotice: /Stage[main]/Glance::Api/Service[glance-api]/ensure: ensure changed 'stopped' to 'running'\nInfo: /Stage[main]/Glance::Api/Service[glance-api]: Unscheduling refresh on Service[glance-api]\nNotice: /Stage[main]/Ironic::Keystone::Auth/Keystone::Resource::Service_identity[ironic]/Keystone_endpoint[RegionOne/ironic::baremetal]/ensure: created\nNotice: /Stage[main]/Cinder::Keystone::Auth/Keystone::Resource::Service_identity[cinder]/Keystone_service[cinder::volume]/ensure: created\nNotice: /Stage[main]/Cinder::Keystone::Auth/Keystone::Resource::Service_identity[cinder]/Keystone_endpoint[RegionOne/cinder::volume]/ensure: created\nNotice: /Stage[main]/Swift::Keystone::Auth/Keystone::Resource::Service_identity[swift]/Keystone_endpoint[RegionOne/swift::object-store]/ensure: created\nNotice: /Stage[main]/Neutron::Db::Sync/Exec[neutron-db-sync]: Triggered 'refresh' from 59 events\nInfo: /Stage[main]/Neutron::Db::Sync/Exec[neutron-db-sync]: Scheduling refresh of Service[neutron-server]\nNotice: /Stage[main]/Neutron::Server/Service[neutron-server]/ensure: ensure changed 'stopped' to 'running'\nInfo: /Stage[main]/Neutron::Server/Service[neutron-server]: Unscheduling refresh on Service[neutron-server]\nNotice: 
/Stage[main]/Swift::Keystone::Auth/Keystone::Resource::Service_identity[swift]/Keystone_user[swift]/ensure: created\nNotice: /Stage[main]/Swift::Keystone::Auth/Keystone::Resource::Service_identity[swift]/Keystone_user_role[swift@services]/ensure: created\nNotice: /Stage[main]/Cinder::Keystone::Auth/Keystone::Resource::Service_identity[cinderv2]/Keystone_endpoint[RegionOne/cinderv2::volumev2]/ensure: created\nNotice: /Stage[main]/Cinder::Keystone::Auth/Keystone::Resource::Service_identity[cinder]/Keystone_user_role[cinder@services]/ensure: created\nNotice: /Stage[main]/Openstack_integration::Cinder/Cinder_type[BACKEND_1]/ensure: created\nNotice: /Stage[main]/Keystone::Roles::Admin/Keystone_user_role[admin@openstack]/ensure: created\nNotice: /Stage[main]/Openstack_integration::Provision/Glance_image[cirros]/ensure: created\nNotice: /Stage[main]/Keystone::Disable_admin_token_auth/Ini_subsetting[public_api/admin_token_auth]/ensure: removed\nInfo: /Stage[main]/Keystone::Disable_admin_token_auth/Ini_subsetting[public_api/admin_token_auth]: Scheduling refresh of Exec[restart_keystone]\nNotice: /Stage[main]/Openstack_integration::Provision/Neutron_network[public]/ensure: created\nNotice: /Stage[main]/Tempest/Tempest_neutron_net_id_setter[public_network_id]/ensure: created\nNotice: /Stage[main]/Openstack_integration::Provision/Exec[manage_m1.micro_nova_flavor]/returns: executed successfully\nNotice: /Stage[main]/Keystone::Disable_admin_token_auth/Ini_subsetting[api_v3/admin_token_auth]/ensure: removed\nInfo: /Stage[main]/Keystone::Disable_admin_token_auth/Ini_subsetting[api_v3/admin_token_auth]: Scheduling refresh of Exec[restart_keystone]\nNotice: /Stage[main]/Openstack_extras::Auth_file/File[/root/openrc]/ensure: defined content as '{md5}3f4b596583820c76e15d3092a5c6dcc0'\nNotice: /Stage[main]/Keystone::Disable_admin_token_auth/Ini_subsetting[admin_api/admin_token_auth]/ensure: removed\nInfo: /Stage[main]/Keystone::Disable_admin_token_auth/Ini_subsetting[admin_api/admin_token_auth]: Scheduling refresh of Exec[restart_keystone]\nNotice: /Stage[main]/Keystone/Exec[restart_keystone]: Triggered 'refresh' from 3 events\nNotice: /Stage[main]/Openstack_integration::Provision/Exec[manage_m1.nano_nova_flavor]/returns: executed successfully\nNotice: /Stage[main]/Openstack_integration::Provision/Neutron_subnet[public-subnet]/ensure: created\nNotice: /Stage[main]/Openstack_integration::Provision/Glance_image[cirros_alt]/ensure: created\nNotice: /Stage[main]/Tempest/Tempest_glance_id_setter[image_ref]/ensure: created\nNotice: /Stage[main]/Tempest/Tempest_glance_id_setter[image_ref_alt]/ensure: created\nInfo: Creating state file /var/lib/puppet/state/state.yaml\nNotice: Finished catalog run in 519.98 seconds\nInfo: Loading external facts from /etc/puppet/modules/openstacklib/facts.d\nInfo: Loading facts in /etc/puppet/modules/nova/lib/facter/libvirt_uuid.rb\nInfo: Loading facts in /etc/puppet/modules/openstacklib/lib/facter/os_package_type.rb\nInfo: Loading facts in /etc/puppet/modules/openstacklib/lib/facter/os_service_default.rb\nInfo: Loading facts in /etc/puppet/modules/vswitch/lib/facter/ovs.rb\nInfo: Loading facts in /etc/puppet/modules/apt/lib/facter/apt_reboot_required.rb\nInfo: Loading facts in /etc/puppet/modules/apt/lib/facter/apt_update_last_success.rb\nInfo: Loading facts in /etc/puppet/modules/apt/lib/facter/apt_updates.rb\nInfo: Loading facts in /etc/puppet/modules/concat/lib/facter/concat_basedir.rb\nInfo: Loading facts in /etc/puppet/modules/firewall/lib/facter/ip6tables_version.rb\nInfo: 
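Two Puppet passes are visible in this stretch: the first reports "Finished catalog run in 519.98 seconds", and the second, just below, compiles in 9.47 seconds and finishes in 69.42 seconds. A quick sketch (same hypothetical console.log assumption) to pull those timing lines out for comparison across runs:

    import re

    # "Compiled catalog for <host> ... in 9.47 seconds" and
    # "Finished catalog run in 519.98 seconds"
    timing = re.compile(
        r"(Compiled catalog for \S+|Finished catalog run).*?in ([\d.]+) seconds")

    with open("console.log") as fh:  # hypothetical saved copy of this log
        for line in fh:
            for what, secs in timing.findall(line):
                print(f"{float(secs):8.2f}s  {what}")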
Loading facts in /etc/puppet/modules/firewall/lib/facter/iptables_persistent_version.rb\nInfo: Loading facts in /etc/puppet/modules/firewall/lib/facter/iptables_version.rb\nInfo: Loading facts in /etc/puppet/modules/mysql/lib/facter/mysql_version.rb\nInfo: Loading facts in /etc/puppet/modules/mysql/lib/facter/mysql_server_id.rb\nInfo: Loading facts in /etc/puppet/modules/python/lib/facter/pip_version.rb\nInfo: Loading facts in /etc/puppet/modules/python/lib/facter/python_version.rb\nInfo: Loading facts in /etc/puppet/modules/python/lib/facter/virtualenv_version.rb\nInfo: Loading facts in /etc/puppet/modules/staging/lib/facter/staging_http_get.rb\nInfo: Loading facts in /etc/puppet/modules/staging/lib/facter/staging_windir.rb\nInfo: Loading facts in /etc/puppet/modules/stdlib/lib/facter/puppet_vardir.rb\nInfo: Loading facts in /etc/puppet/modules/stdlib/lib/facter/facter_dot_d.rb\nInfo: Loading facts in /etc/puppet/modules/stdlib/lib/facter/pe_version.rb\nInfo: Loading facts in /etc/puppet/modules/stdlib/lib/facter/root_home.rb\nNotice: Compiled catalog for n2.dusty.ci.centos.org in environment production in 9.47 seconds\nInfo: Applying configuration version '1463743997'\nNotice: Finished catalog run in 69.42 seconds\nall create: /tmp/openstack/tempest/.tox/tempest\nall installdeps: setuptools, -r/tmp/openstack/tempest/requirements.txt\nall develop-inst: /tmp/openstack/tempest\nall installed: Babel==2.3.4,cffi==1.6.0,cliff==2.0.0,cmd2==0.6.8,cryptography==1.3.2,debtcollector==1.4.0,enum34==1.1.6,extras==1.0.0,fasteners==0.14.1,fixtures==1.4.0,funcsigs==1.0.2,functools32==3.2.3.post2,idna==2.1,ipaddress==1.0.16,iso8601==0.1.11,jsonschema==2.5.1,linecache2==1.0.0,monotonic==1.1,msgpack-python==0.4.7,netaddr==0.7.18,netifaces==0.10.4,os-testr==0.6.0,oslo.concurrency==3.8.0,oslo.config==3.9.0,oslo.context==2.3.0,oslo.i18n==3.6.0,oslo.log==3.7.0,oslo.serialization==2.6.0,oslo.utils==3.10.0,paramiko==2.0.0,pbr==1.9.1,prettytable==0.7.2,pyasn1==0.1.9,pycparser==2.14,pyinotify==0.9.6,pyOpenSSL==16.0.0,pyparsing==2.1.4,python-dateutil==2.5.3,python-mimeparse==1.5.2,python-subunit==1.2.0,pytz==2016.4,PyYAML==3.11,retrying==1.3.3,six==1.10.0,stevedore==1.13.0,-e git://git.openstack.org/openstack/tempest@aff9cc072bbbb222b09a3411b203c180b493eae8#egg=tempest,testrepository==0.0.20,testscenarios==0.5.0,testtools==2.2.0,traceback2==1.4.0,unicodecsv==0.14.1,unittest2==1.1.0,urllib3==1.15.1,wrapt==1.10.8\nall runtests: PYTHONHASHSEED='3977220619'\nall runtests: commands[0] | find . 
-type f -name *.pyc -delete\nall runtests: commands[1] | bash tools/pretty_tox.sh --concurrency=2 smoke dashbboard TelemetryAlarming api.baremetal.admin.test_drivers\nrunning testr\nrunning=OS_STDOUT_CAPTURE=${OS_STDOUT_CAPTURE:-1} \\\nOS_STDERR_CAPTURE=${OS_STDERR_CAPTURE:-1} \\\nOS_TEST_TIMEOUT=${OS_TEST_TIMEOUT:-500} \\\nOS_TEST_LOCK_PATH=${OS_TEST_LOCK_PATH:-${TMPDIR:-'/tmp'}} \\\n${PYTHON:-python} -m subunit.run discover -t ${OS_TOP_LEVEL:-./} ${OS_TEST_PATH:-./tempest/test_discover} --list \nrunning=OS_STDOUT_CAPTURE=${OS_STDOUT_CAPTURE:-1} \\\nOS_STDERR_CAPTURE=${OS_STDERR_CAPTURE:-1} \\\nOS_TEST_TIMEOUT=${OS_TEST_TIMEOUT:-500} \\\nOS_TEST_LOCK_PATH=${OS_TEST_LOCK_PATH:-${TMPDIR:-'/tmp'}} \\\n${PYTHON:-python} -m subunit.run discover -t ${OS_TOP_LEVEL:-./} ${OS_TEST_PATH:-./tempest/test_discover} --load-list /tmp/tmp0s50qd\nrunning=OS_STDOUT_CAPTURE=${OS_STDOUT_CAPTURE:-1} \\\nOS_STDERR_CAPTURE=${OS_STDERR_CAPTURE:-1} \\\nOS_TEST_TIMEOUT=${OS_TEST_TIMEOUT:-500} \\\nOS_TEST_LOCK_PATH=${OS_TEST_LOCK_PATH:-${TMPDIR:-'/tmp'}} \\\n${PYTHON:-python} -m subunit.run discover -t ${OS_TOP_LEVEL:-./} ${OS_TEST_PATH:-./tempest/test_discover} --load-list /tmp/tmpdnKp0B\n{1} tempest.api.compute.security_groups.test_security_group_rules.SecurityGroupRulesTestJSON.test_security_group_rules_create [0.510592s] ... ok\n{1} tempest.api.compute.security_groups.test_security_group_rules.SecurityGroupRulesTestJSON.test_security_group_rules_list [0.637885s] ... ok\n{0} tempest.api.baremetal.admin.test_drivers.TestDrivers.test_list_drivers [1.271812s] ... ok\n{0} tempest.api.baremetal.admin.test_drivers.TestDrivers.test_show_driver [1.338437s] ... ok\n{0} tempest.api.compute.flavors.test_flavors.FlavorsV2TestJSON.test_get_flavor [0.106825s] ... ok\n{0} tempest.api.compute.flavors.test_flavors.FlavorsV2TestJSON.test_list_flavors [0.130774s] ... ok\n{0} tempest.api.compute.security_groups.test_security_groups.SecurityGroupsTestJSON.test_security_groups_create_list_delete [1.448903s] ... ok\n{1} tempest.api.compute.servers.test_create_server.ServersTestManualDisk.test_list_servers [0.064264s] ... ok\n{1} tempest.api.compute.servers.test_create_server.ServersTestManualDisk.test_verify_server_details [0.000582s] ... ok\n{0} tempest.api.compute.servers.test_attach_interfaces.AttachInterfacesTestJSON.test_add_remove_fixed_ip [11.839100s] ... ok\n{1} tempest.api.compute.servers.test_server_addresses.ServerAddressesTestJSON.test_list_server_addresses [0.071666s] ... ok\n{1} tempest.api.compute.servers.test_server_addresses.ServerAddressesTestJSON.test_list_server_addresses_by_network [0.159031s] ... ok\n{0} tempest.api.compute.servers.test_create_server.ServersTestJSON.test_list_servers [0.063791s] ... ok\n{0} tempest.api.compute.servers.test_create_server.ServersTestJSON.test_verify_server_details [0.000565s] ... ok\n{1} setUpClass (tempest.api.data_processing.test_cluster_templates.ClusterTemplateTest) ... SKIPPED: Sahara support is required\n{1} setUpClass (tempest.api.data_processing.test_data_sources.DataSourceTest) ... SKIPPED: Sahara support is required\n{1} setUpClass (tempest.api.data_processing.test_job_binaries.JobBinaryTest) ... SKIPPED: Sahara support is required\n{1} setUpClass (tempest.api.data_processing.test_jobs.JobTest) ... SKIPPED: Sahara support is required\n{1} setUpClass (tempest.api.data_processing.test_node_group_templates.NodeGroupTemplateTest) ... SKIPPED: Sahara support is required\n{1} setUpClass (tempest.api.database.flavors.test_flavors.DatabaseFlavorsTest) ... 
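The testr invocations above resolve their settings with shell parameter expansion: ${OS_TEST_TIMEOUT:-500} means "use the environment value if set, else 500". The same defaulting expressed in Python, handy for rebuilding the subunit.run discover command outside of testr (only the variable names and the command itself come from the log; the script is illustrative):

    import os

    # Mirror of the ${VAR:-default} expansions in the testr template above.
    defaults = {
        "OS_STDOUT_CAPTURE": "1",
        "OS_STDERR_CAPTURE": "1",
        "OS_TEST_TIMEOUT": "500",
        "OS_TEST_LOCK_PATH": os.environ.get("TMPDIR", "/tmp"),
    }
    env = {k: os.environ.get(k, v) for k, v in defaults.items()}

    # The discovery step whose output testr partitions across the
    # {0}/{1} workers with --load-list.
    cmd = [
        os.environ.get("PYTHON", "python"), "-m", "subunit.run", "discover",
        "-t", os.environ.get("OS_TOP_LEVEL", "./"),
        os.environ.get("OS_TEST_PATH", "./tempest/test_discover"),
        "--list",
    ]
    print(env, " ".join(cmd))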
SKIPPED: DatabaseFlavorsTest skipped as trove is not available\n{1} setUpClass (tempest.api.database.limits.test_limits.DatabaseLimitsTest) ... SKIPPED: DatabaseLimitsTest skipped as trove is not available\n{1} tempest.api.identity.admin.v3.test_credentials.CredentialsTestJSON.test_credentials_create_get_update_delete [0.152104s] ... ok\n{1} tempest.api.identity.admin.v3.test_domains.DefaultDomainTestJSON.test_default_domain_exists [0.037410s] ... ok\n{1} tempest.api.identity.admin.v3.test_domains.DomainsTestJSON.test_create_update_delete_domain [0.397761s] ... ok\n{1} tempest.api.identity.admin.v3.test_endpoints.EndPointsTestJSON.test_update_endpoint [0.215164s] ... ok\n{0} tempest.api.compute.servers.test_server_actions.ServerActionsTestJSON.test_reboot_server_hard [11.347985s] ... ok\n{1} tempest.api.identity.admin.v3.test_groups.GroupsV3TestJSON.test_group_users_add_list_delete [1.166480s] ... ok\n{0} setUpClass (tempest.api.data_processing.test_job_binary_internals.JobBinaryInternalTest) ... SKIPPED: Sahara support is required\n{0} setUpClass (tempest.api.data_processing.test_plugins.PluginsTest) ... SKIPPED: Sahara support is required\n{0} setUpClass (tempest.api.database.versions.test_versions.DatabaseVersionsTest) ... SKIPPED: DatabaseVersionsTest skipped as trove is not available\n{1} tempest.api.identity.admin.v3.test_regions.RegionsTestJSON.test_create_region_with_specific_id [0.166700s] ... ok\n{0} tempest.api.identity.admin.v2.test_services.ServicesTestJSON.test_list_services [0.373792s] ... ok\n{1} tempest.api.identity.admin.v3.test_roles.RolesV3TestJSON.test_role_create_update_show_list [0.286381s] ... ok\n{0} tempest.api.identity.admin.v2.test_users.UsersTestJSON.test_create_user [0.143726s] ... ok\n{1} tempest.api.identity.admin.v3.test_trusts.TrustsV3TestJSON.test_get_trusts_all [1.541047s] ... ok\n{0} tempest.api.identity.admin.v3.test_policies.PoliciesTestJSON.test_create_update_delete_policy [0.206302s] ... ok\n{1} tempest.api.image.v2.test_images.BasicOperationsImagesTest.test_delete_image [0.517553s] ... ok\n{1} tempest.api.image.v2.test_images.BasicOperationsImagesTest.test_register_upload_get_image_file [1.137646s] ... ok\n{1} tempest.api.image.v2.test_images.BasicOperationsImagesTest.test_update_image [1.469844s] ... ok\n{1} tempest.api.network.test_extensions.ExtensionsTestJSON.test_list_show_extensions [0.430944s] ... ok\n{0} tempest.api.identity.admin.v3.test_services.ServicesTestJSON.test_create_update_get_service [0.295808s] ... ok\n{1} tempest.api.network.test_networks.BulkNetworkOpsIpV6Test.test_bulk_create_delete_network [0.846129s] ... ok\n{1} tempest.api.network.test_networks.BulkNetworkOpsIpV6Test.test_bulk_create_delete_port [1.372009s] ... ok\n{0} tempest.api.identity.v2.test_api_discovery.TestApiDiscovery.test_api_media_types [0.048919s] ... ok\n{0} tempest.api.identity.v2.test_api_discovery.TestApiDiscovery.test_api_version_resources [0.054454s] ... ok\n{0} tempest.api.identity.v2.test_api_discovery.TestApiDiscovery.test_api_version_statuses [0.045440s] ... ok\n{1} tempest.api.network.test_networks.BulkNetworkOpsIpV6Test.test_bulk_create_delete_subnet [4.599057s] ... ok\n{1} setUpClass (tempest.api.network.test_networks.NetworksIpV6TestAttrs) ... SKIPPED: IPv6 extended attributes for subnets not available\n{0} tempest.api.identity.v3.test_api_discovery.TestApiDiscovery.test_api_media_types [0.054893s] ... ok\n{0} tempest.api.identity.v3.test_api_discovery.TestApiDiscovery.test_api_version_resources [0.061760s] ... 
ok\n{0} tempest.api.identity.v3.test_api_discovery.TestApiDiscovery.test_api_version_statuses [0.059559s] ... ok\n{1} tempest.api.network.test_networks.NetworksTest.test_create_update_delete_network_subnet [1.563828s] ... ok\n{1} tempest.api.network.test_networks.NetworksTest.test_external_network_visibility [0.184518s] ... ok\n{1} tempest.api.network.test_networks.NetworksTest.test_list_networks [0.079143s] ... ok\n{1} tempest.api.network.test_networks.NetworksTest.test_list_subnets [0.046161s] ... ok\n{1} tempest.api.network.test_networks.NetworksTest.test_show_network [0.052944s] ... ok\n{1} tempest.api.network.test_networks.NetworksTest.test_show_subnet [0.051730s] ... ok\n{1} tempest.api.network.test_ports.PortsTestJSON.test_create_port_in_allowed_allocation_pools [1.498361s] ... ok\n{0} tempest.api.network.test_floating_ips.FloatingIPTestJSON.test_create_floating_ip_specifying_a_fixed_ip_address [0.891415s] ... ok\n{1} tempest.api.network.test_ports.PortsTestJSON.test_create_port_with_no_securitygroups [1.660860s] ... ok\n{0} tempest.api.network.test_floating_ips.FloatingIPTestJSON.test_create_list_show_update_delete_floating_ip [1.472597s] ... ok\n{1} tempest.api.network.test_ports.PortsTestJSON.test_create_update_delete_port [1.024533s] ... ok\n{1} tempest.api.network.test_ports.PortsTestJSON.test_list_ports [0.028519s] ... ok\n{1} tempest.api.network.test_ports.PortsTestJSON.test_show_port [0.031189s] ... ok\n{0} tempest.api.network.test_networks.BulkNetworkOpsTest.test_bulk_create_delete_network [0.826167s] ... ok\n{0} tempest.api.network.test_networks.BulkNetworkOpsTest.test_bulk_create_delete_port [1.382763s] ... ok\n{1} tempest.api.network.test_routers.RoutersTest.test_add_multiple_router_interfaces [3.649875s] ... ok\n{0} tempest.api.network.test_networks.BulkNetworkOpsTest.test_bulk_create_delete_subnet [1.938906s] ... ok\n{1} tempest.api.network.test_routers.RoutersTest.test_add_remove_router_interface_with_port_id [2.267556s] ... ok\n{1} tempest.api.network.test_routers.RoutersTest.test_add_remove_router_interface_with_subnet_id [1.954573s] ... ok\n{1} tempest.api.network.test_routers.RoutersTest.test_create_show_list_update_delete_router [1.438991s] ... ok\n{0} tempest.api.network.test_networks.NetworksIpV6Test.test_create_update_delete_network_subnet [1.268792s] ... ok\n{0} tempest.api.network.test_networks.NetworksIpV6Test.test_external_network_visibility [0.112706s] ... ok\n{0} tempest.api.network.test_networks.NetworksIpV6Test.test_list_networks [0.051579s] ... ok\n{0} tempest.api.network.test_networks.NetworksIpV6Test.test_list_subnets [0.045387s] ... ok\n{0} tempest.api.network.test_networks.NetworksIpV6Test.test_show_network [0.132461s] ... ok\n{0} tempest.api.network.test_networks.NetworksIpV6Test.test_show_subnet [0.044410s] ... ok\n{1} tempest.api.network.test_security_groups.SecGroupIPv6Test.test_create_list_update_show_delete_security_group [0.375148s] ... ok\n{1} tempest.api.network.test_security_groups.SecGroupIPv6Test.test_create_show_delete_security_group_rule [0.470574s] ... ok\n{1} tempest.api.network.test_security_groups.SecGroupIPv6Test.test_list_security_groups [0.035922s] ... ok\n{0} tempest.api.network.test_ports.PortsIpV6TestJSON.test_create_port_in_allowed_allocation_pools [1.515059s] ... ok\n{1} tempest.api.object_storage.test_account_quotas.AccountQuotasTest.test_admin_modify_quota [0.210497s] ... ok\n{1} tempest.api.object_storage.test_account_quotas.AccountQuotasTest.test_upload_valid_object [0.071776s] ... 
ok\n{0} tempest.api.network.test_ports.PortsIpV6TestJSON.test_create_port_with_no_securitygroups [1.804811s] ... ok\n{0} tempest.api.network.test_ports.PortsIpV6TestJSON.test_create_update_delete_port [0.783414s] ... ok\n{0} tempest.api.network.test_ports.PortsIpV6TestJSON.test_list_ports [0.030561s] ... ok\n{0} tempest.api.network.test_ports.PortsIpV6TestJSON.test_show_port [0.029457s] ... ok\n{1} tempest.api.object_storage.test_account_services.AccountTest.test_list_account_metadata [0.054494s] ... ok\n{1} tempest.api.object_storage.test_account_services.AccountTest.test_list_containers [0.013434s] ... ok\n{1} setUpClass (tempest.api.orchestration.stacks.test_stacks.StacksTestJSON) ... SKIPPED: Heat support is required\n{1} setUpClass (tempest.api.telemetry.test_alarming_api.TelemetryAlarmingAPITestJSON) ... SKIPPED: Aodh support is required\n{1} setUpClass (tempest.api.telemetry.test_alarming_api_negative.TelemetryAlarmingNegativeTest) ... SKIPPED: Aodh support is required\n{0} tempest.api.network.test_routers.RoutersIpV6Test.test_add_multiple_router_interfaces [3.744758s] ... ok\n{0} tempest.api.network.test_routers.RoutersIpV6Test.test_add_remove_router_interface_with_port_id [2.046541s] ... ok\n{0} tempest.api.network.test_routers.RoutersIpV6Test.test_add_remove_router_interface_with_subnet_id [2.020083s] ... ok\n{1} tempest.api.volume.test_volumes_get.VolumesV2GetTest.test_volume_create_get_update_delete [7.842538s] ... ok\n{0} tempest.api.network.test_routers.RoutersIpV6Test.test_create_show_list_update_delete_router [1.502659s] ... ok\n{0} tempest.api.network.test_security_groups.SecGroupTest.test_create_list_update_show_delete_security_group [0.368896s] ... ok\n{0} tempest.api.network.test_security_groups.SecGroupTest.test_create_show_delete_security_group_rule [0.471705s] ... ok\n{0} tempest.api.network.test_security_groups.SecGroupTest.test_list_security_groups [0.044018s] ... ok\n{0} tempest.api.network.test_subnetpools_extensions.SubnetPoolsTestJSON.test_create_list_show_update_delete_subnetpools [0.268973s] ... ok\n{0} tempest.api.object_storage.test_container_quotas.ContainerQuotasTest.test_upload_large_object [0.391122s] ... ok\n{0} tempest.api.object_storage.test_container_quotas.ContainerQuotasTest.test_upload_too_many_objects [0.285795s] ... ok\n{0} tempest.api.object_storage.test_container_quotas.ContainerQuotasTest.test_upload_valid_object [0.195461s] ... ok\n{1} tempest.api.volume.test_volumes_get.VolumesV2GetTest.test_volume_create_get_update_delete_from_image [32.992742s] ... ok\n{0} tempest.api.object_storage.test_container_services.ContainerTest.test_create_container [0.335287s] ... ok\n{0} tempest.api.object_storage.test_container_services.ContainerTest.test_list_container_contents [0.149195s] ... ok\n{0} tempest.api.object_storage.test_container_services.ContainerTest.test_list_container_metadata [0.121836s] ... ok\n{0} tempest.api.object_storage.test_object_services.ObjectTest.test_create_object [0.050552s] ... ok\n{0} tempest.api.object_storage.test_object_services.ObjectTest.test_get_object [0.026938s] ... ok\n{0} tempest.api.object_storage.test_object_services.ObjectTest.test_list_object_metadata [0.024721s] ... ok\n{0} tempest.api.object_storage.test_object_services.ObjectTest.test_update_object_metadata [0.051842s] ... ok\n{0} setUpClass (tempest.api.orchestration.stacks.test_resource_types.ResourceTypesTest) ... SKIPPED: Heat support is required\n{0} setUpClass (tempest.api.orchestration.stacks.test_soft_conf.TestSoftwareConfig) ... 
SKIPPED: Heat support is required\n{0} setUpClass (tempest.api.telemetry.test_telemetry_notification_api.TelemetryNotificationAPITestJSON) ... SKIPPED: Ceilometer support is required\n{1} tempest.api.volume.test_volumes_list.VolumesV1ListTestJSON.test_volume_list [0.049297s] ... ok\n{0} tempest.api.volume.test_volumes_actions.VolumesV1ActionsTest.test_attach_detach_volume_to_instance [1.502631s] ... ok\n{1} tempest.scenario.test_server_basic_ops.TestServerBasicOps.test_server_basic_ops [36.835052s] ... FAILED\n{1} setUpClass (tempest.scenario.test_server_multinode.TestServerMultinode) ... SKIPPED: Less than 2 compute nodes, skipping multinode tests.\n{0} tempest.api.volume.test_volumes_actions.VolumesV2ActionsTest.test_attach_detach_volume_to_instance [1.199769s] ... ok\n{0} tempest.api.volume.test_volumes_get.VolumesV1GetTest.test_volume_create_get_update_delete [17.959206s] ... ok\n{0} tempest.api.volume.test_volumes_get.VolumesV1GetTest.test_volume_create_get_update_delete_from_image [42.517879s] ... ok\n{0} tempest.api.volume.test_volumes_list.VolumesV2ListTestJSON.test_volume_list [0.047964s] ... ok\n{0} tempest.scenario.test_network_basic_ops.TestNetworkBasicOps.test_network_basic_ops [132.155957s] ... ok\n{0} tempest.scenario.test_volume_boot_pattern.TestVolumeBootPattern.test_volume_boot_pattern [151.413748s] ... ok\n{0} tempest.scenario.test_volume_boot_pattern.TestVolumeBootPatternV2.test_volume_boot_pattern [155.015026s] ... ok\n\n==============================\nFailed 1 tests - output below:\n==============================\n\ntempest.scenario.test_server_basic_ops.TestServerBasicOps.test_server_basic_ops[compute,id-7fff3fb3-91d8-4fd0-bd7d-0204f1f180ba,network,smoke]\n----------------------------------------------------------------------------------------------------------------------------------------------\n\nCaptured pythonlogging:\n~~~~~~~~~~~~~~~~~~~~~~~\n 2016-05-20 12:39:33,977 7734 INFO [tempest.lib.common.rest_client] Request (TestServerBasicOps:setUp): 200 GET https://127.0.0.1:8774/v2/2b19e26c86fb4b48abe8551003fc00c7/images/ffff3a3a-5101-497a-b186-38682e723d59 0.461s\n 2016-05-20 12:39:33,978 7734 DEBUG [tempest.lib.common.rest_client] Request - Headers: {'Content-Type': 'application/json', 'Accept': 'application/json', 'X-Auth-Token': ''}\n Body: None\n Response - Headers: {'vary': 'Accept-Encoding', 'status': '200', 'connection': 'close', 'server': 'Apache/2.4.6 (CentOS)', 'content-length': '677', 'content-type': 'application/json', 'content-location': 'https://127.0.0.1:8774/v2/2b19e26c86fb4b48abe8551003fc00c7/images/ffff3a3a-5101-497a-b186-38682e723d59', 'date': 'Fri, 20 May 2016 11:39:33 GMT', 'x-compute-request-id': 'req-2d694247-967f-4d4c-b110-8dd52b397df7'}\n Body: {\"image\": {\"status\": \"ACTIVE\", \"updated\": \"2016-05-20T11:32:39Z\", \"links\": [{\"href\": \"https://127.0.0.1:8774/v2/2b19e26c86fb4b48abe8551003fc00c7/images/ffff3a3a-5101-497a-b186-38682e723d59\", \"rel\": \"self\"}, {\"href\": \"https://127.0.0.1:8774/2b19e26c86fb4b48abe8551003fc00c7/images/ffff3a3a-5101-497a-b186-38682e723d59\", \"rel\": \"bookmark\"}, {\"href\": \"http://172.19.2.66:9292/images/ffff3a3a-5101-497a-b186-38682e723d59\", \"type\": \"application/vnd.openstack.image\", \"rel\": \"alternate\"}], \"id\": \"ffff3a3a-5101-497a-b186-38682e723d59\", \"OS-EXT-IMG-SIZE:size\": 13287936, \"name\": \"cirros\", \"created\": \"2016-05-20T11:32:36Z\", \"minDisk\": 0, \"progress\": 100, \"minRam\": 0, \"metadata\": {}}}\n 2016-05-20 12:39:34,143 7734 INFO [tempest.lib.common.rest_client] 
Request (TestServerBasicOps:setUp): 200 GET https://127.0.0.1:8774/v2/2b19e26c86fb4b48abe8551003fc00c7/flavors/42 0.162s\n 2016-05-20 12:39:34,144 7734 DEBUG [tempest.lib.common.rest_client] Request - Headers: {'Content-Type': 'application/json', 'Accept': 'application/json', 'X-Auth-Token': ''}\n Body: None\n Response - Headers: {'vary': 'Accept-Encoding', 'status': '200', 'connection': 'close', 'server': 'Apache/2.4.6 (CentOS)', 'content-length': '421', 'content-type': 'application/json', 'content-location': 'https://127.0.0.1:8774/v2/2b19e26c86fb4b48abe8551003fc00c7/flavors/42', 'date': 'Fri, 20 May 2016 11:39:33 GMT', 'x-compute-request-id': 'req-8aa2720b-8e5f-4b38-a53a-2f0a4f4a3442'}\n Body: {\"flavor\": {\"name\": \"m1.nano\", \"links\": [{\"href\": \"https://127.0.0.1:8774/v2/2b19e26c86fb4b48abe8551003fc00c7/flavors/42\", \"rel\": \"self\"}, {\"href\": \"https://127.0.0.1:8774/2b19e26c86fb4b48abe8551003fc00c7/flavors/42\", \"rel\": \"bookmark\"}], \"ram\": 128, \"OS-FLV-DISABLED:disabled\": false, \"vcpus\": 1, \"swap\": \"\", \"os-flavor-access:is_public\": true, \"rxtx_factor\": 1.0, \"OS-FLV-EXT-DATA:ephemeral\": 0, \"disk\": 0, \"id\": \"42\"}}\n 2016-05-20 12:39:34,561 7734 INFO [tempest.lib.common.rest_client] Request (TestServerBasicOps:setUp): 200 GET https://127.0.0.1:8774/v2/2b19e26c86fb4b48abe8551003fc00c7/images/ffff3a3a-5101-497a-b186-38682e723d59 0.415s\n 2016-05-20 12:39:34,562 7734 DEBUG [tempest.lib.common.rest_client] Request - Headers: {'Content-Type': 'application/json', 'Accept': 'application/json', 'X-Auth-Token': ''}\n Body: None\n Response - Headers: {'vary': 'Accept-Encoding', 'status': '200', 'connection': 'close', 'server': 'Apache/2.4.6 (CentOS)', 'content-length': '677', 'content-type': 'application/json', 'content-location': 'https://127.0.0.1:8774/v2/2b19e26c86fb4b48abe8551003fc00c7/images/ffff3a3a-5101-497a-b186-38682e723d59', 'date': 'Fri, 20 May 2016 11:39:34 GMT', 'x-compute-request-id': 'req-47d134ff-c986-4e47-87bc-8b1865abeb34'}\n Body: {\"image\": {\"status\": \"ACTIVE\", \"updated\": \"2016-05-20T11:32:39Z\", \"links\": [{\"href\": \"https://127.0.0.1:8774/v2/2b19e26c86fb4b48abe8551003fc00c7/images/ffff3a3a-5101-497a-b186-38682e723d59\", \"rel\": \"self\"}, {\"href\": \"https://127.0.0.1:8774/2b19e26c86fb4b48abe8551003fc00c7/images/ffff3a3a-5101-497a-b186-38682e723d59\", \"rel\": \"bookmark\"}, {\"href\": \"http://172.19.2.66:9292/images/ffff3a3a-5101-497a-b186-38682e723d59\", \"type\": \"application/vnd.openstack.image\", \"rel\": \"alternate\"}], \"id\": \"ffff3a3a-5101-497a-b186-38682e723d59\", \"OS-EXT-IMG-SIZE:size\": 13287936, \"name\": \"cirros\", \"created\": \"2016-05-20T11:32:36Z\", \"minDisk\": 0, \"progress\": 100, \"minRam\": 0, \"metadata\": {}}}\n 2016-05-20 12:39:34,568 7734 DEBUG [tempest.scenario.test_server_basic_ops] Starting test for i:ffff3a3a-5101-497a-b186-38682e723d59, f:42. 
Run ssh: False, user: cirros\n 2016-05-20 12:39:34,754 7734 INFO [tempest.lib.common.rest_client] Request (TestServerBasicOps:test_server_basic_ops): 200 POST https://127.0.0.1:8774/v2/2b19e26c86fb4b48abe8551003fc00c7/os-keypairs 0.184s\n 2016-05-20 12:39:34,754 7734 DEBUG [tempest.lib.common.rest_client] Request - Headers: {'Content-Type': 'application/json', 'Accept': 'application/json', 'X-Auth-Token': ''}\n Body: {\"keypair\": {\"name\": \"tempest-TestServerBasicOps-1692537820\"}}\n Response - Headers: {'vary': 'Accept-Encoding', 'status': '200', 'connection': 'close', 'server': 'Apache/2.4.6 (CentOS)', 'content-length': '2320', 'content-type': 'application/json', 'content-location': 'https://127.0.0.1:8774/v2/2b19e26c86fb4b48abe8551003fc00c7/os-keypairs', 'date': 'Fri, 20 May 2016 11:39:34 GMT', 'x-compute-request-id': 'req-174eb88e-a1a0-4382-abaa-3db83a7276b4'}\n Body: {\"keypair\": {\"public_key\": \"ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQCf0YVs8Qd2HOxGejejNA86wa9jKGRUqadnX16ux7D0QgTxcru4ll4JtSPY3azJqwwUAajeHOge/vPM6ySLlJscB9iPo0k4A0AbNed1hfmEvYXYEYmss58gkgFjwrv5wqIz08V4Fu+I9FMjD0PmFFQNqSv35i3C6i54LUZRGkFzT7HxXM4aAZUjpCfjNXsJSDoRSz0GBC0QbZ+GQah7mYiVMDJO1MFWKrReDjYMNr3xdooTb2m3G2rvksHgl0ezVRDbvkgCodJz4YQrC82gitJdLyGEJZpYPTMbOp/dsOAkKPGtkyF4Qqv/FDMyCHM8bsiOog/xXmBIT87xlzBtAzZ9 Generated-by-Nova\", \"private_key\": \"-----BEGIN RSA PRIVATE KEY-----\\nMIIEqQIBAAKCAQEAn9GFbPEHdhzsRno3ozQPOsGvYyhkVKmnZ19ersew9EIE8XK7\\nuJZeCbUj2N2syasMFAGo3hzoHv7zzOski5SbHAfYj6NJOANAGzXndYX5hL2F2BGJ\\nrLOfIJIBY8K7+cKiM9PFeBbviPRTIw9D5hRUDakr9+YtwuoueC1GURpBc0+x8VzO\\nGgGVI6Qn4zV7CUg6EUs9BgQtEG2fhkGoe5mIlTAyTtTBViq0Xg42DDa98XaKE29p\\ntxtq75LB4JdHs1UQ275IAqHSc+GEKwvNoIrSXS8hhCWaWD0zGzqf3bDgJCjxrZMh\\neEKr/xQzMghzPG7IjqIP8V5gSE/O8ZcwbQM2fQIDAQABAoIBABO2sJKjmJwFLU/0\\nO3CyNz60LYI5tUaMNs4VgYRltXoruphd4rH+OlNQOL/DeFDX/IFrQv1C648HO+OH\\nDdb52bg3b4soRRvXqsywdYCVqhWpmxzv7N+UuIg3+lvn5XAFhiSGdtE9YwatvKOS\\nenmjAEs/FuFZT0O/x0OjsgzHBFPIyt15vGAOIIhbWRBoWJSBD5MglPHpqFRMbWnh\\nIma71YSEn62dddHzlnk5+7gVf7FF9eZl4hcLrfqWuZhi8lNTiu/FtBQT9cEnoAXb\\nu6Y/59eoZSBv334s3D/nlbtqY922xJrwVjucfbw7tDrzDaDlurkHKST/jr29weOM\\nPl7T8gECggCBAL/E6Y/62Eja/DoQwMGytxb612xbA3lasZnpyBjBFpByKuK8tPy9\\nwp9K+dT8nk8+E1GToPOGGyvk1UqnYl2mShiDpZWRrtDf2JZqL0r4FhDs1DoMvbcO\\nscAt9KYT9yjMwFtZXflA2N7sU5pWovJccnEsAN47elxT9ROC9l0Sqt2dAoIAgQDV\\nWQTawXkU2bJlyUqC+EXEFEtHR1uUfLWB7ZbwoqB5tYKUydKk0d7CNOLSPQgJJnpt\\nb5l/iRtypsZ0FbjRiBtdkzsn7zzsY5pvaptasbeSNG7EOdADRRmfXSDPQi9J5TIL\\nsqxxbu9lLlIgT1J8ECQARpNx7VmSzA697JjpS9TWYQKCAIEAr9hhj3wmXdAoHxqD\\nllpJV1IWva5LZjkKyCa+LCzKgxOdTaJal7NtxmGa63nltKYoUtJ7cTLUsZA5ISaR\\npWw5X3dAHAGlerT4Rx0BVs5cdZKlHMHYKQbZaW76eluudQQjkuBEsq2K8Admtgyh\\niHnLGwmNljqV/hmijgy12iym72UCggCBAJ/MzZYM1GSJqtYSr3zp+Vek273H9RCD\\nWHC5RRV4ujpveh94DA7oI7HTaxGOJTa1W34c2Rxt1eFKidrpakWHbPfqD6UZzMhC\\n0qohb7u+4YDhRRY1N1k7qLV1S93x9PmkcpfQfNl5/lYLG/iXcXD7pfuO4WG0JiOO\\nNHyNevtDkWgBAoIAgBXL82F/ICjK7i4B232bJB0RQEzAevqBCoRFMUUGl9rePcgB\\nUOSiiDVHfl2C1yu3WabzNehoDO5/RqyxpPji/SrnMvi4aPPywLvJ9gqEfUwld1Wo\\np6riJoPx6aS+VLPLP0rDhKGuEJkIu4Qv9tCdG7nReWWEImiM6ldN9kzOZfIN\\n-----END RSA PRIVATE KEY-----\\n\", \"user_id\": \"4f2057b1b7744ce9b90440c0f47efbef\", \"name\": \"tempest-TestServerBasicOps-1692537820\", \"fingerprint\": \"01:a5:e4:68:53:67:e9:cc:22:5b:d6:b0:21:ff:5a:f4\"}}\n 2016-05-20 12:40:04,819 7734 INFO [tempest.lib.common.rest_client] Request (TestServerBasicOps:test_server_basic_ops): 500 POST https://127.0.0.1:8774/v2/2b19e26c86fb4b48abe8551003fc00c7/os-security-groups 30.063s\n 2016-05-20 12:40:04,819 7734 DEBUG [tempest.lib.common.rest_client] Request - 
Headers: {'Content-Type': 'application/json', 'Accept': 'application/json', 'X-Auth-Token': ''}\n Body: {\"security_group\": {\"description\": \"tempest-TestServerBasicOps-1404384290 description\", \"name\": \"tempest-TestServerBasicOps-1404384290\"}}\n Response - Headers: {'status': '500', 'connection': 'close', 'server': 'Apache/2.4.6 (CentOS)', 'content-length': '224', 'content-type': 'application/json; charset=UTF-8', 'content-location': 'https://127.0.0.1:8774/v2/2b19e26c86fb4b48abe8551003fc00c7/os-security-groups', 'date': 'Fri, 20 May 2016 11:39:34 GMT', 'x-compute-request-id': 'req-940cd5d7-8a3c-478b-9285-2964bfe29105'}\n Body: {\"computeFault\": {\"message\": \"Unexpected API Error. Please report this at http://bugs.launchpad.net/nova/ and attach the Nova API log if possible.\\n\", \"code\": 500}}\n 2016-05-20 12:40:10,344 7734 INFO [tempest.lib.common.rest_client] Request (TestServerBasicOps:_run_cleanups): 202 DELETE https://127.0.0.1:8774/v2/2b19e26c86fb4b48abe8551003fc00c7/os-keypairs/tempest-TestServerBasicOps-1692537820 5.521s\n 2016-05-20 12:40:10,349 7734 DEBUG [tempest.lib.common.rest_client] Request - Headers: {'Content-Type': 'application/json', 'Accept': 'application/json', 'X-Auth-Token': ''}\n Body: None\n Response - Headers: {'status': '202', 'connection': 'close', 'server': 'Apache/2.4.6 (CentOS)', 'content-length': '0', 'content-type': 'application/json', 'content-location': 'https://127.0.0.1:8774/v2/2b19e26c86fb4b48abe8551003fc00c7/os-keypairs/tempest-TestServerBasicOps-1692537820', 'date': 'Fri, 20 May 2016 11:40:04 GMT', 'x-compute-request-id': 'req-a0ec0f02-aeeb-4a83-81f3-01be3558a2df'}\n Body: \n \n\nCaptured traceback:\n~~~~~~~~~~~~~~~~~~~\n Traceback (most recent call last):\n File \"tempest/test.py\", line 113, in wrapper\n return f(self, *func_args, **func_kwargs)\n File \"tempest/scenario/test_server_basic_ops.py\", line 124, in test_server_basic_ops\n self.security_group = self._create_security_group()\n File \"tempest/scenario/manager.py\", line 333, in _create_security_group\n name=sg_name, description=sg_desc)['security_group']\n File \"tempest/lib/services/compute/security_groups_client.py\", line 55, in create_security_group\n resp, body = self.post('os-security-groups', post_body)\n File \"tempest/lib/common/rest_client.py\", line 259, in post\n return self.request('POST', url, extra_headers, headers, body)\n File \"tempest/lib/services/compute/base_compute_client.py\", line 53, in request\n method, url, extra_headers, headers, body)\n File \"tempest/lib/common/rest_client.py\", line 641, in request\n resp, resp_body)\n File \"tempest/lib/common/rest_client.py\", line 760, in _error_checker\n message=message)\n tempest.lib.exceptions.ServerFault: Got server fault\n Details: Unexpected API Error. 
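Worth noting in the capture above: the failing POST to os-security-groups took 30.063s before returning 500, while every other request completes in a few seconds at most; a round 30-second latency like that usually points at a timeout somewhere behind nova-api rather than at the API layer itself. A throwaway sketch to surface slow or 5xx requests from the rest_client lines (the regex is an assumption based on the format shown above, not part of tempest):

    import re

    # "Request (TestServerBasicOps:test_server_basic_ops): 500 POST https://... 30.063s"
    req = re.compile(
        r"Request \([^)]*\): (?P<status>\d{3}) (?P<verb>[A-Z]+) "
        r"(?P<url>\S+) (?P<secs>[\d.]+)s")

    with open("console.log") as fh:  # hypothetical saved copy of this log
        for line in fh:
            for m in req.finditer(line):
                status, secs = int(m["status"]), float(m["secs"])
                if status >= 500 or secs > 10.0:
                    print(f"{status} {m['verb']:6s} {secs:8.3f}s {m['url']}")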
Please report this at http://bugs.launchpad.net/nova/ and attach the Nova API log if possible.\n \n \n\n\n======\nTotals\n======\nRan: 126 tests in 837.0000 sec.\n - Passed: 107\n - Skipped: 18\n - Expected Fail: 0\n - Unexpected Success: 0\n - Failed: 1\nSum of execute time for each test: 665.4004 sec.\n\n==============\nWorker Balance\n==============\n - Worker 0 (67 tests) => 0:13:47.264576\n - Worker 1 (59 tests) => 0:04:33.373255\n\nSlowest Tests:\n\nTest id Runtime (s)\n-------------------------------------------------------------------------------------------------------------------------------------------------------------- -----------\ntempest.scenario.test_volume_boot_pattern.TestVolumeBootPatternV2.test_volume_boot_pattern[compute,id-557cd2c2-4eb8-4dce-98be-f86765ff311b,image,smoke,volume] 155.015\ntempest.scenario.test_volume_boot_pattern.TestVolumeBootPattern.test_volume_boot_pattern[compute,id-557cd2c2-4eb8-4dce-98be-f86765ff311b,image,smoke,volume] 151.414\ntempest.scenario.test_network_basic_ops.TestNetworkBasicOps.test_network_basic_ops[compute,id-f323b3ba-82f8-4db7-8ea6-6a895869ec49,network,smoke] 132.156\ntempest.api.volume.test_volumes_get.VolumesV1GetTest.test_volume_create_get_update_delete_from_image[id-54a01030-c7fc-447c-86ee-c1182beae638,image,smoke] 42.518\ntempest.scenario.test_server_basic_ops.TestServerBasicOps.test_server_basic_ops[compute,id-7fff3fb3-91d8-4fd0-bd7d-0204f1f180ba,network,smoke] 36.835\ntempest.api.volume.test_volumes_get.VolumesV2GetTest.test_volume_create_get_update_delete_from_image[id-54a01030-c7fc-447c-86ee-c1182beae638,image,smoke] 32.993\ntempest.api.volume.test_volumes_get.VolumesV1GetTest.test_volume_create_get_update_delete[id-27fb0e9f-fb64-41dd-8bdb-1ffa762f0d51,smoke] 17.959\ntempest.api.compute.servers.test_attach_interfaces.AttachInterfacesTestJSON.test_add_remove_fixed_ip[id-c7e0e60b-ee45-43d0-abeb-8596fd42a2f9,network,smoke] 11.839\ntempest.api.compute.servers.test_server_actions.ServerActionsTestJSON.test_reboot_server_hard[id-2cb1baf6-ac8d-4429-bf0d-ba8a0ba53e32,smoke] 11.348\ntempest.api.volume.test_volumes_get.VolumesV2GetTest.test_volume_create_get_update_delete[id-27fb0e9f-fb64-41dd-8bdb-1ffa762f0d51,smoke] 7.843\nERROR: InvocationError: '/usr/bin/bash tools/pretty_tox.sh --concurrency=2 smoke dashbboard TelemetryAlarming api.baremetal.admin.test_drivers'\n___________________________________ summary ____________________________________\nERROR: all: commands failed", "stdout_lines": ["Cloning into '/tmp/openstack/tempest'...", "Preparing... 
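The Totals and Worker Balance blocks above (126 run: 107 passed, 18 skipped, 1 failed, with worker 0 carrying most of the test time) can be recomputed from the per-test result lines earlier in the run. A rough sketch, again against a hypothetical console.log capture; SKIPPED entries carry no runtime and are ignored here, so the figures are approximate:

    import re
    from collections import defaultdict

    # "{1} tempest.scenario...test_server_basic_ops [36.835052s] ... FAILED"
    line_re = re.compile(
        r"\{(?P<worker>\d+)\} (?P<test>\S+) \[(?P<secs>[\d.]+)s\] \.\.\. "
        r"(?P<status>ok|FAILED)")

    per_worker = defaultdict(lambda: [0, 0.0])  # worker -> [tests, seconds]
    failed = []
    with open("console.log") as fh:
        for raw in fh:
            for m in line_re.finditer(raw):
                stats = per_worker[m["worker"]]
                stats[0] += 1
                stats[1] += float(m["secs"])
                if m["status"] == "FAILED":
                    failed.append(m["test"])

    for w, (count, secs) in sorted(per_worker.items()):
        print(f"Worker {w}: {count} tests, {secs:.1f}s of test time")
    print("Failed:", failed or "none")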
########################################", "Updating / installing...", "puppetlabs-release-7-12 ########################################", "Loaded plugins: fastestmirror, priorities", "Loading mirror speeds from cached hostfile", " * base: mirror.centos.org", " * extras: mirror.centos.org", " * updates: mirror.centos.org", "545 packages excluded due to repository priority protections", "Resolving Dependencies", "--> Running transaction check", "---> Package dstat.noarch 0:0.7.2-12.el7 will be installed", "---> Package puppet.noarch 0:3.6.2-3.el7 will be installed", "--> Processing Dependency: hiera >= 1.0.0 for package: puppet-3.6.2-3.el7.noarch", "--> Processing Dependency: facter >= 1.6.6 for package: puppet-3.6.2-3.el7.noarch", "--> Processing Dependency: rubygem(rgen) for package: puppet-3.6.2-3.el7.noarch", "--> Processing Dependency: ruby(shadow) for package: puppet-3.6.2-3.el7.noarch", "--> Processing Dependency: ruby(selinux) for package: puppet-3.6.2-3.el7.noarch", "--> Processing Dependency: ruby(augeas) for package: puppet-3.6.2-3.el7.noarch", "--> Running transaction check", "---> Package facter.x86_64 0:2.4.4-3.el7 will be installed", "--> Processing Dependency: pciutils for package: facter-2.4.4-3.el7.x86_64", "---> Package hiera.noarch 0:1.3.4-1.el7 will be installed", "---> Package libselinux-ruby.x86_64 0:2.2.2-6.el7 will be installed", "---> Package ruby-augeas.x86_64 0:0.5.0-1.el7 will be installed", "--> Processing Dependency: augeas-libs >= 1.0.0 for package: ruby-augeas-0.5.0-1.el7.x86_64", "--> Processing Dependency: libaugeas.so.0(AUGEAS_0.8.0)(64bit) for package: ruby-augeas-0.5.0-1.el7.x86_64", "--> Processing Dependency: libaugeas.so.0(AUGEAS_0.16.0)(64bit) for package: ruby-augeas-0.5.0-1.el7.x86_64", "--> Processing Dependency: libaugeas.so.0(AUGEAS_0.14.0)(64bit) for package: ruby-augeas-0.5.0-1.el7.x86_64", "--> Processing Dependency: libaugeas.so.0(AUGEAS_0.12.0)(64bit) for package: ruby-augeas-0.5.0-1.el7.x86_64", "--> Processing Dependency: libaugeas.so.0(AUGEAS_0.11.0)(64bit) for package: ruby-augeas-0.5.0-1.el7.x86_64", "--> Processing Dependency: libaugeas.so.0(AUGEAS_0.10.0)(64bit) for package: ruby-augeas-0.5.0-1.el7.x86_64", "--> Processing Dependency: libaugeas.so.0(AUGEAS_0.1.0)(64bit) for package: ruby-augeas-0.5.0-1.el7.x86_64", "--> Processing Dependency: libaugeas.so.0()(64bit) for package: ruby-augeas-0.5.0-1.el7.x86_64", "---> Package ruby-shadow.x86_64 0:1.4.1-23.el7 will be installed", "---> Package rubygem-rgen.noarch 0:0.6.6-2.el7 will be installed", "--> Running transaction check", "---> Package augeas-libs.x86_64 0:1.4.0-2.el7 will be installed", "---> Package pciutils.x86_64 0:3.2.1-4.el7 will be installed", "--> Finished Dependency Resolution", "", "Dependencies Resolved", "", "================================================================================", " Package Arch Version Repository Size", "================================================================================", "Installing:", " dstat noarch 0.7.2-12.el7 base 163 k", " puppet noarch 3.6.2-3.el7 delorean-mitaka-testing 1.2 M", "Installing for dependencies:", " augeas-libs x86_64 1.4.0-2.el7 base 355 k", " facter x86_64 2.4.4-3.el7 delorean-mitaka-testing 101 k", " hiera noarch 1.3.4-1.el7 delorean-mitaka-testing 24 k", " libselinux-ruby x86_64 2.2.2-6.el7 base 127 k", " pciutils x86_64 3.2.1-4.el7 base 90 k", " ruby-augeas x86_64 0.5.0-1.el7 delorean-mitaka-testing 23 k", " ruby-shadow x86_64 1.4.1-23.el7 delorean-mitaka-testing 13 k", " rubygem-rgen noarch 
0.6.6-2.el7 delorean-mitaka-testing 83 k", "", "Transaction Summary", "================================================================================", "Install 2 Packages (+8 Dependent packages)", "", "Total download size: 2.2 M", "Installed size: 7.1 M", "Downloading packages:", "--------------------------------------------------------------------------------", "Total 401 kB/s | 2.2 MB 00:05 ", "Running transaction check", "Running transaction test", "Transaction test succeeded", "Running transaction", " Installing : rubygem-rgen-0.6.6-2.el7.noarch 1/10 ", " Installing : augeas-libs-1.4.0-2.el7.x86_64 2/10 ", " Installing : ruby-augeas-0.5.0-1.el7.x86_64 3/10 ", " Installing : ruby-shadow-1.4.1-23.el7.x86_64 4/10 ", " Installing : hiera-1.3.4-1.el7.noarch 5/10 ", " Installing : pciutils-3.2.1-4.el7.x86_64 6/10 ", " Installing : facter-2.4.4-3.el7.x86_64 7/10 ", " Installing : libselinux-ruby-2.2.2-6.el7.x86_64 8/10 ", " Installing : puppet-3.6.2-3.el7.noarch 9/10 ", " Installing : dstat-0.7.2-12.el7.noarch 10/10 ", " Verifying : ruby-augeas-0.5.0-1.el7.x86_64 1/10 ", " Verifying : libselinux-ruby-2.2.2-6.el7.x86_64 2/10 ", " Verifying : pciutils-3.2.1-4.el7.x86_64 3/10 ", " Verifying : hiera-1.3.4-1.el7.noarch 4/10 ", " Verifying : puppet-3.6.2-3.el7.noarch 5/10 ", " Verifying : facter-2.4.4-3.el7.x86_64 6/10 ", " Verifying : dstat-0.7.2-12.el7.noarch 7/10 ", " Verifying : ruby-shadow-1.4.1-23.el7.x86_64 8/10 ", " Verifying : augeas-libs-1.4.0-2.el7.x86_64 9/10 ", " Verifying : rubygem-rgen-0.6.6-2.el7.noarch 10/10 ", "", "Installed:", " dstat.noarch 0:0.7.2-12.el7 puppet.noarch 0:3.6.2-3.el7 ", "", "Dependency Installed:", " augeas-libs.x86_64 0:1.4.0-2.el7 facter.x86_64 0:2.4.4-3.el7 ", " hiera.noarch 0:1.3.4-1.el7 libselinux-ruby.x86_64 0:2.2.2-6.el7 ", " pciutils.x86_64 0:3.2.1-4.el7 ruby-augeas.x86_64 0:0.5.0-1.el7 ", " ruby-shadow.x86_64 0:1.4.1-23.el7 rubygem-rgen.noarch 0:0.6.6-2.el7 ", "", "Complete!", "dstat is /usr/bin/dstat", "Successfully installed colored-1.2", "Successfully installed cri-2.6.1", "Successfully installed log4r-1.1.10", "Successfully installed multi_json-1.12.1", "Successfully installed multipart-post-2.0.0", "Successfully installed faraday-0.9.2", "Successfully installed faraday_middleware-0.10.0", "Successfully installed semantic_puppet-0.1.2", "Successfully installed minitar-0.5.4", "Successfully installed puppet_forge-2.2.0", "Successfully installed r10k-2.3.0", "11 gems installed", "/etc/puppet/modules", "├── antonlindstrom-powerdns (v0.0.5)", "├── duritong-sysctl (v0.0.11)", "├── nanliu-staging (v1.0.4)", "├── openstack-aodh (v8.0.2)", "├── openstack-barbican (v0.0.1)", "├── openstack-ceilometer (v8.0.1)", "├── openstack-ceph (v1.0.0)", "├── openstack-cinder (v8.0.1)", "├── openstack-designate (v8.0.1)", "├── openstack-glance (v8.0.1)", "├── openstack-gnocchi (v8.0.1)", "├── openstack-heat (v8.0.1)", "├── openstack-horizon (v8.0.1)", "├── openstack-ironic (v8.0.1)", "├── openstack-keystone (v8.0.1)", "├── openstack-manila (v8.0.1)", "├── openstack-mistral (v8.0.1)", "├── openstack-monasca (v1.0.0)", "├── openstack-murano (v8.0.1)", "├── openstack-neutron (v8.0.1)", "├── openstack-nova (v8.0.1)", "├── openstack-openstack_extras (v8.0.1)", "├── openstack-openstacklib (v8.0.1) invalid", "├── openstack-sahara (v8.0.1)", "├── openstack-swift (v8.0.1)", "├── openstack-tempest (v8.0.1)", "├── openstack-trove (v8.0.1)", "├── openstack-vswitch (v4.0.0)", "├── openstack-zaqar (v8.0.1)", "├── openstack_integration (???)", "├── puppet-corosync (v0.8.0)", "├── puppet-octavia (v0.0.1)", "├── puppet-oslo (v0.0.1)", "├── puppetlabs-apache (v1.8.1)", "├── puppetlabs-apt (v2.2.2)", "├── puppetlabs-concat (v1.2.5)", "├── puppetlabs-firewall (v1.7.2)", "├── puppetlabs-inifile (v1.4.3) invalid", "├── puppetlabs-mongodb (v0.12.0)", "├── puppetlabs-mysql (v3.6.2)", "├── puppetlabs-postgresql (v4.7.1)", "├── puppetlabs-rabbitmq (v5.3.1)", "├── puppetlabs-rsync (v0.4.0)", "├── puppetlabs-stdlib (v4.9.1)", "├── puppetlabs-vcsrepo (v1.3.2)", "├── puppetlabs-xinetd (v1.5.0)", "├── qpid (???)", "├── saz-memcached (v2.8.1)", "├── stankevich-python (v1.10.0)", "└── theforeman-dns (v3.1.0)", "/usr/share/puppet/modules (no modules installed)", "Info: Loading external facts from /etc/puppet/modules/openstacklib/facts.d", "Info: Loading facts in /etc/puppet/modules/nova/lib/facter/libvirt_uuid.rb", "Info: Loading facts in /etc/puppet/modules/openstacklib/lib/facter/os_package_type.rb", "Info: Loading facts in /etc/puppet/modules/openstacklib/lib/facter/os_service_default.rb", "Info: Loading facts in /etc/puppet/modules/vswitch/lib/facter/ovs.rb", "Info: Loading facts in /etc/puppet/modules/apt/lib/facter/apt_reboot_required.rb", "Info: Loading facts in /etc/puppet/modules/apt/lib/facter/apt_update_last_success.rb", "Info: Loading facts in /etc/puppet/modules/apt/lib/facter/apt_updates.rb", "Info: Loading facts in /etc/puppet/modules/concat/lib/facter/concat_basedir.rb", "Info: Loading facts in /etc/puppet/modules/firewall/lib/facter/ip6tables_version.rb", "Info: Loading facts in /etc/puppet/modules/firewall/lib/facter/iptables_persistent_version.rb", "Info: Loading facts in /etc/puppet/modules/firewall/lib/facter/iptables_version.rb", "Info: Loading facts in /etc/puppet/modules/mysql/lib/facter/mysql_version.rb", "Info: Loading facts in /etc/puppet/modules/mysql/lib/facter/mysql_server_id.rb", "Info: Loading facts in /etc/puppet/modules/python/lib/facter/pip_version.rb", "Info: Loading facts in /etc/puppet/modules/python/lib/facter/python_version.rb", "Info: Loading facts in /etc/puppet/modules/python/lib/facter/virtualenv_version.rb", "Info: Loading facts in /etc/puppet/modules/staging/lib/facter/staging_http_get.rb", "Info: Loading facts in /etc/puppet/modules/staging/lib/facter/staging_windir.rb", "Info: Loading facts in /etc/puppet/modules/stdlib/lib/facter/puppet_vardir.rb", "Info: Loading facts in /etc/puppet/modules/stdlib/lib/facter/facter_dot_d.rb", "Info: Loading facts in 
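The tree printed after "11 gems installed" looks like puppet module list output for /etc/puppet/modules, with ??? appearing where no Forge version is known (the git-sourced openstack_integration and qpid entries) and invalid where the tool cannot validate the module metadata. A small sketch to turn that listing into a name-to-version map (hypothetical console.log again; the branch glyphs come straight from the output above):

    import re

    # "├── openstack-nova (v8.0.1)" / "└── theforeman-dns (v3.1.0)"
    entry = re.compile(r"[├└]── (?P<name>\S+) \((?P<version>[^)]*)\)")

    modules = {}
    with open("console.log") as fh:  # hypothetical saved copy of this log
        for line in fh:
            for m in entry.finditer(line):
                modules[m["name"]] = m["version"]

    print(len(modules), "modules;",
          "openstack-neutron =", modules.get("openstack-neutron"))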
/etc/puppet/modules/stdlib/lib/facter/pe_version.rb", "Info: Loading facts in /etc/puppet/modules/stdlib/lib/facter/root_home.rb", "Notice: Compiled catalog for n2.dusty.ci.centos.org in environment production in 8.94 seconds", "Info: Applying configuration version '1463743459'", "Notice: /Stage[main]/Concat::Setup/File[/var/lib/puppet/concat]/ensure: created", "Notice: /Stage[main]/Concat::Setup/File[/var/lib/puppet/concat/bin]/ensure: created", "Notice: /Stage[main]/Concat::Setup/File[/var/lib/puppet/concat/bin/concatfragments.rb]/ensure: defined content as '{md5}b684db0eac243553a6a79365119a363d'", "Notice: /Stage[main]/Xinetd/Package[xinetd]/ensure: created", "Notice: /Stage[main]/Openstack_integration::Cacert/File[/etc/pki/ca-trust/source/anchors/puppet_openstack.pem]/ensure: defined content as '{md5}78f42ae07a4fc8ebdd5b89c4c74bba5e'", "Info: /Stage[main]/Openstack_integration::Cacert/File[/etc/pki/ca-trust/source/anchors/puppet_openstack.pem]: Scheduling refresh of Exec[update-ca-certificates]", "Notice: /Stage[main]/Openstack_integration::Cacert/Exec[update-ca-certificates]: Triggered 'refresh' from 1 events", "Info: /Stage[main]/Openstack_integration::Cacert/Exec[update-ca-certificates]: Scheduling refresh of Service[httpd]", "Info: /Stage[main]/Openstack_integration::Cacert/Exec[update-ca-certificates]: Scheduling refresh of Service[httpd]", "Info: /Stage[main]/Openstack_integration::Cacert/Exec[update-ca-certificates]: Scheduling refresh of Service[httpd]", "Info: /Stage[main]/Openstack_integration::Cacert/Exec[update-ca-certificates]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Openstack_integration::Cacert/Exec[update-ca-certificates]: Scheduling refresh of Service[glance-registry]", "Notice: /Stage[main]/Memcached/Package[memcached]/ensure: created", "Notice: /Stage[main]/Memcached/File[/etc/sysconfig/memcached]/content: ", "--- /etc/sysconfig/memcached\t2015-04-10 10:40:42.000000000 +0100", "+++ /tmp/puppet-file20160520-26469-d60985\t2016-05-20 12:24:40.532841270 +0100", "@@ -1,5 +1,5 @@", " PORT=\"11211\"", " USER=\"memcached\"", "-MAXCONN=\"1024\"", "-CACHESIZE=\"64\"", "-OPTIONS=\"\"", "+MAXCONN=\"8192\"", "+CACHESIZE=\"30400\"", "+OPTIONS=\"-l 0.0.0.0 -U 11211 -t 8 >> /var/log/memcached.log 2>&1\"", "Info: /Stage[main]/Memcached/File[/etc/sysconfig/memcached]: Filebucketed /etc/sysconfig/memcached to puppet with sum 05503957e3796fbe6fddd756a7a102a0", "Notice: /Stage[main]/Memcached/File[/etc/sysconfig/memcached]/content: content changed '{md5}05503957e3796fbe6fddd756a7a102a0' to '{md5}607d5b4345a63a5155f9fbe6c19b6c9b'", "Info: /Stage[main]/Memcached/File[/etc/sysconfig/memcached]: Scheduling refresh of Service[memcached]", "Notice: /Stage[main]/Memcached/Service[memcached]/ensure: ensure changed 'stopped' to 'running'", "Info: /Stage[main]/Memcached/Service[memcached]: Unscheduling refresh on Service[memcached]", "Notice: /Stage[main]/Rabbitmq::Repo::Rhel/Exec[rpm --import http://www.rabbitmq.com/rabbitmq-signing-key-public.asc]/returns: executed successfully", "Notice: /Stage[main]/Neutron::Agents::Lbaas/Package[haproxy]/ensure: created", "Notice: /Stage[main]/Glance/Package[openstack-glance]/ensure: created", "Info: /Stage[main]/Glance/Package[openstack-glance]: Scheduling refresh of Anchor[keystone::service::end]", "Info: /Stage[main]/Glance/Package[openstack-glance]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/swift_store_config_file]/ensure: created", "Info: 
/Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/swift_store_config_file]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/swift_store_config_file]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/registry_port]/ensure: created", "Info: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/registry_port]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/registry_port]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_userid]/ensure: created", "Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_userid]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_userid]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Glance::Api/Glance_cache_config[glance_store/os_region_name]/ensure: created", "Info: /Stage[main]/Glance::Api/Glance_cache_config[glance_store/os_region_name]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Glance::Api/Glance_cache_config[glance_store/os_region_name]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/log_dir]/ensure: created", "Info: /Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/log_dir]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/log_dir]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/show_image_direct_url]/ensure: created", "Info: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/show_image_direct_url]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/show_image_direct_url]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]/ensure: created", "Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Glance::Api/Glance_cache_config[DEFAULT/registry_port]/ensure: created", "Info: /Stage[main]/Glance::Api/Glance_cache_config[DEFAULT/registry_port]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Glance::Api/Glance_cache_config[DEFAULT/registry_port]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/debug]/ensure: created", "Info: /Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/debug]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/debug]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Glance::Api/Glance_cache_config[DEFAULT/admin_user]/ensure: created", "Info: /Stage[main]/Glance::Api/Glance_cache_config[DEFAULT/admin_user]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Glance::Api/Glance_cache_config[DEFAULT/admin_user]: Scheduling 
refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/image_cache_dir]/ensure: created", "Info: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/image_cache_dir]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/image_cache_dir]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/bind_host]/ensure: created", "Info: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/bind_host]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/bind_host]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/registry_client_key_file]/ensure: created", "Info: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/registry_client_key_file]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/registry_client_key_file]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_use_ssl]/ensure: created", "Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_use_ssl]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_use_ssl]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_port]/ensure: created", "Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_port]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_port]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/default_store]/ensure: created", "Info: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/default_store]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/default_store]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/amqp_durable_queues]/ensure: created", "Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/amqp_durable_queues]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/amqp_durable_queues]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/default_swift_reference]/ensure: created", "Info: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/default_swift_reference]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/default_swift_reference]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/key_file]/ensure: created", "Info: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/key_file]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/key_file]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: 
/Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_ha_queues]/ensure: created", "Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_ha_queues]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_ha_queues]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Glance::Backend::Swift/Glance_swift_config[ref1/user]/ensure: created", "Notice: /Stage[main]/Glance::Api/Glance_api_config[keystone_authtoken/identity_uri]/ensure: created", "Info: /Stage[main]/Glance::Api/Glance_api_config[keystone_authtoken/identity_uri]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Glance::Api/Glance_api_config[keystone_authtoken/identity_uri]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/use_stderr]/ensure: created", "Info: /Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/use_stderr]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/use_stderr]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Glance::Registry/Glance_registry_config[DEFAULT/key_file]/ensure: created", "Info: /Stage[main]/Glance::Registry/Glance_registry_config[DEFAULT/key_file]: Scheduling refresh of Service[glance-registry]", "Info: /Stage[main]/Glance::Registry/Glance_registry_config[DEFAULT/key_file]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Glance::Registry/Glance_registry_config[DEFAULT/bind_host]/ensure: created", "Info: /Stage[main]/Glance::Registry/Glance_registry_config[DEFAULT/bind_host]: Scheduling refresh of Service[glance-registry]", "Info: /Stage[main]/Glance::Registry/Glance_registry_config[DEFAULT/bind_host]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Glance::Registry/Glance_registry_config[DEFAULT/workers]/ensure: created", "Info: /Stage[main]/Glance::Registry/Glance_registry_config[DEFAULT/workers]: Scheduling refresh of Service[glance-registry]", "Info: /Stage[main]/Glance::Registry/Glance_registry_config[DEFAULT/workers]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Glance::Registry/Glance_registry_config[keystone_authtoken/admin_password]/ensure: created", "Info: /Stage[main]/Glance::Registry/Glance_registry_config[keystone_authtoken/admin_password]: Scheduling refresh of Service[glance-registry]", "Info: /Stage[main]/Glance::Registry/Glance_registry_config[keystone_authtoken/admin_password]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Glance::Api/Glance_api_config[paste_deploy/flavor]/ensure: created", "Info: /Stage[main]/Glance::Api/Glance_api_config[paste_deploy/flavor]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Glance::Api/Glance_api_config[paste_deploy/flavor]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/log_dir]/ensure: created", "Info: /Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/log_dir]: Scheduling refresh of Service[glance-registry]", "Info: /Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/log_dir]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/heartbeat_rate]/ensure: created", "Info: 
/Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/heartbeat_rate]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/heartbeat_rate]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Glance::Registry/Glance_registry_config[DEFAULT/bind_port]/ensure: created", "Info: /Stage[main]/Glance::Registry/Glance_registry_config[DEFAULT/bind_port]: Scheduling refresh of Service[glance-registry]", "Info: /Stage[main]/Glance::Registry/Glance_registry_config[DEFAULT/bind_port]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_virtual_host]/ensure: created", "Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_virtual_host]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_virtual_host]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/use_syslog]/ensure: created", "Info: /Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/use_syslog]: Scheduling refresh of Service[glance-registry]", "Info: /Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/use_syslog]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/kombu_ssl_version]/ensure: created", "Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/kombu_ssl_version]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/kombu_ssl_version]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_notification_exchange]/ensure: created", "Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_notification_exchange]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_notification_exchange]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Glance::Api/Glance_api_config[keystone_authtoken/admin_password]/ensure: created", "Info: /Stage[main]/Glance::Api/Glance_api_config[keystone_authtoken/admin_password]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Glance::Api/Glance_api_config[keystone_authtoken/admin_password]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Glance::Api/Glance_api_config[glance_store/os_region_name]/ensure: created", "Info: /Stage[main]/Glance::Api/Glance_api_config[glance_store/os_region_name]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Glance::Api/Glance_api_config[glance_store/os_region_name]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Neutron/Package[neutron]/ensure: created", "Info: /Stage[main]/Neutron/Package[neutron]: Scheduling refresh of Anchor[keystone::service::end]", "Info: /Stage[main]/Neutron/Package[neutron]: Scheduling refresh of Service[neutron-server]", "Info: /Stage[main]/Neutron/Package[neutron]: Scheduling refresh of Service[neutron-ovs-agent-service]", "Info: /Stage[main]/Neutron/Package[neutron]: Scheduling refresh of 
Service[neutron-ovs-agent-service]", "Info: /Stage[main]/Neutron/Package[neutron]: Scheduling refresh of Service[neutron-metadata]", "Info: /Stage[main]/Neutron/Package[neutron]: Scheduling refresh of Service[neutron-metadata]", "Info: /Stage[main]/Neutron/Package[neutron]: Scheduling refresh of Service[neutron-lbaas-service]", "Info: /Stage[main]/Neutron/Package[neutron]: Scheduling refresh of Service[neutron-lbaas-service]", "Info: /Stage[main]/Neutron/Package[neutron]: Scheduling refresh of Service[neutron-lbaasv2-service]", "Info: /Stage[main]/Neutron/Package[neutron]: Scheduling refresh of Service[neutron-l3]", "Info: /Stage[main]/Neutron/Package[neutron]: Scheduling refresh of Service[neutron-l3]", "Info: /Stage[main]/Neutron/Package[neutron]: Scheduling refresh of Service[neutron-dhcp-service]", "Info: /Stage[main]/Neutron/Package[neutron]: Scheduling refresh of Service[neutron-dhcp-service]", "Info: /Stage[main]/Neutron/Package[neutron]: Scheduling refresh of Service[neutron-metering-service]", "Info: /Stage[main]/Neutron/Package[neutron]: Scheduling refresh of Service[neutron-metering-service]", "Info: /Stage[main]/Neutron/Package[neutron]: Scheduling refresh of Exec[neutron-db-sync]", "Notice: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_hosts]/ensure: created", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_hosts]: Scheduling refresh of Service[neutron-server]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_hosts]: Scheduling refresh of Exec[neutron-db-sync]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_hosts]: Scheduling refresh of Service[neutron-metadata]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_hosts]: Scheduling refresh of Service[neutron-lbaas-service]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_hosts]: Scheduling refresh of Service[neutron-l3]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_hosts]: Scheduling refresh of Service[neutron-dhcp-service]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_hosts]: Scheduling refresh of Service[neutron-metering-service]", "Notice: /Stage[main]/Neutron::Agents::Dhcp/Neutron_dhcp_agent_config[DEFAULT/resync_interval]/ensure: created", "Info: /Stage[main]/Neutron::Agents::Dhcp/Neutron_dhcp_agent_config[DEFAULT/resync_interval]: Scheduling refresh of Service[neutron-dhcp-service]", "Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/log_dir]/ensure: created", "Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/log_dir]: Scheduling refresh of Service[neutron-server]", "Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/log_dir]: Scheduling refresh of Exec[neutron-db-sync]", "Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/log_dir]: Scheduling refresh of Service[neutron-metadata]", "Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/log_dir]: Scheduling refresh of Service[neutron-lbaas-service]", "Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/log_dir]: Scheduling refresh of Service[neutron-l3]", "Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/log_dir]: Scheduling refresh of Service[neutron-dhcp-service]", "Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/log_dir]: Scheduling refresh of Service[neutron-metering-service]", "Notice: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/heartbeat_rate]/ensure: created", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/heartbeat_rate]: Scheduling refresh 
of Service[neutron-server]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/heartbeat_rate]: Scheduling refresh of Exec[neutron-db-sync]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/heartbeat_rate]: Scheduling refresh of Service[neutron-metadata]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/heartbeat_rate]: Scheduling refresh of Service[neutron-lbaas-service]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/heartbeat_rate]: Scheduling refresh of Service[neutron-l3]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/heartbeat_rate]: Scheduling refresh of Service[neutron-dhcp-service]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/heartbeat_rate]: Scheduling refresh of Service[neutron-metering-service]", "Notice: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/kombu_ssl_version]/ensure: created", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/kombu_ssl_version]: Scheduling refresh of Service[neutron-server]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/kombu_ssl_version]: Scheduling refresh of Exec[neutron-db-sync]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/kombu_ssl_version]: Scheduling refresh of Service[neutron-metadata]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/kombu_ssl_version]: Scheduling refresh of Service[neutron-lbaas-service]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/kombu_ssl_version]: Scheduling refresh of Service[neutron-l3]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/kombu_ssl_version]: Scheduling refresh of Service[neutron-dhcp-service]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/kombu_ssl_version]: Scheduling refresh of Service[neutron-metering-service]", "Notice: /Stage[main]/Neutron/Neutron_config[agent/root_helper]/ensure: created", "Info: /Stage[main]/Neutron/Neutron_config[agent/root_helper]: Scheduling refresh of Service[neutron-server]", "Info: /Stage[main]/Neutron/Neutron_config[agent/root_helper]: Scheduling refresh of Exec[neutron-db-sync]", "Info: /Stage[main]/Neutron/Neutron_config[agent/root_helper]: Scheduling refresh of Service[neutron-metadata]", "Info: /Stage[main]/Neutron/Neutron_config[agent/root_helper]: Scheduling refresh of Service[neutron-lbaas-service]", "Info: /Stage[main]/Neutron/Neutron_config[agent/root_helper]: Scheduling refresh of Service[neutron-l3]", "Info: /Stage[main]/Neutron/Neutron_config[agent/root_helper]: Scheduling refresh of Service[neutron-dhcp-service]", "Info: /Stage[main]/Neutron/Neutron_config[agent/root_helper]: Scheduling refresh of Service[neutron-metering-service]", "Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/rpc_backend]/ensure: created", "Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/rpc_backend]: Scheduling refresh of Service[neutron-server]", "Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/rpc_backend]: Scheduling refresh of Exec[neutron-db-sync]", "Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/rpc_backend]: Scheduling refresh of Service[neutron-metadata]", "Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/rpc_backend]: Scheduling refresh of Service[neutron-lbaas-service]", "Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/rpc_backend]: Scheduling refresh of Service[neutron-l3]", "Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/rpc_backend]: Scheduling refresh of Service[neutron-dhcp-service]", "Info: 
/Stage[main]/Neutron/Neutron_config[DEFAULT/rpc_backend]: Scheduling refresh of Service[neutron-metering-service]", "Notice: /Stage[main]/Neutron::Agents::Metering/Package[neutron-metering-agent]/ensure: created", "Info: /Stage[main]/Neutron::Agents::Metering/Package[neutron-metering-agent]: Scheduling refresh of Anchor[keystone::service::end]", "Info: /Stage[main]/Neutron::Agents::Metering/Package[neutron-metering-agent]: Scheduling refresh of Exec[neutron-db-sync]", "Info: /Stage[main]/Neutron::Agents::Metering/Package[neutron-metering-agent]: Scheduling refresh of Service[neutron-metering-service]", "Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/service_plugins]/ensure: created", "Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/service_plugins]: Scheduling refresh of Service[neutron-server]", "Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/service_plugins]: Scheduling refresh of Exec[neutron-db-sync]", "Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/service_plugins]: Scheduling refresh of Service[neutron-metadata]", "Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/service_plugins]: Scheduling refresh of Service[neutron-lbaas-service]", "Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/service_plugins]: Scheduling refresh of Service[neutron-l3]", "Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/service_plugins]: Scheduling refresh of Service[neutron-dhcp-service]", "Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/service_plugins]: Scheduling refresh of Service[neutron-metering-service]", "Notice: /Stage[main]/Neutron::Agents::Metadata/Neutron_metadata_agent_config[DEFAULT/metadata_workers]/ensure: created", "Info: /Stage[main]/Neutron::Agents::Metadata/Neutron_metadata_agent_config[DEFAULT/metadata_workers]: Scheduling refresh of Service[neutron-metadata]", "Notice: /Stage[main]/Neutron::Agents::Dhcp/Neutron_dhcp_agent_config[DEFAULT/enable_metadata_network]/ensure: created", "Info: /Stage[main]/Neutron::Agents::Dhcp/Neutron_dhcp_agent_config[DEFAULT/enable_metadata_network]: Scheduling refresh of Service[neutron-dhcp-service]", "Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/control_exchange]/ensure: created", "Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/control_exchange]: Scheduling refresh of Service[neutron-server]", "Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/control_exchange]: Scheduling refresh of Exec[neutron-db-sync]", "Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/control_exchange]: Scheduling refresh of Service[neutron-metadata]", "Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/control_exchange]: Scheduling refresh of Service[neutron-lbaas-service]", "Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/control_exchange]: Scheduling refresh of Service[neutron-l3]", "Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/control_exchange]: Scheduling refresh of Service[neutron-dhcp-service]", "Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/control_exchange]: Scheduling refresh of Service[neutron-metering-service]", "Notice: /Stage[main]/Neutron::Agents::Metering/Neutron_metering_agent_config[DEFAULT/driver]/ensure: created", "Info: /Stage[main]/Neutron::Agents::Metering/Neutron_metering_agent_config[DEFAULT/driver]: Scheduling refresh of Service[neutron-metering-service]", "Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/core_plugin]/ensure: created", "Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/core_plugin]: Scheduling refresh of Service[neutron-server]", "Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/core_plugin]: 
Scheduling refresh of Exec[neutron-db-sync]", "Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/core_plugin]: Scheduling refresh of Service[neutron-metadata]", "Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/core_plugin]: Scheduling refresh of Service[neutron-lbaas-service]", "Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/core_plugin]: Scheduling refresh of Service[neutron-l3]", "Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/core_plugin]: Scheduling refresh of Service[neutron-dhcp-service]", "Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/core_plugin]: Scheduling refresh of Service[neutron-metering-service]", "Notice: /Stage[main]/Glance::Api/Glance_cache_config[DEFAULT/admin_tenant_name]/ensure: created", "Info: /Stage[main]/Glance::Api/Glance_cache_config[DEFAULT/admin_tenant_name]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Glance::Api/Glance_cache_config[DEFAULT/admin_tenant_name]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Glance::Api/Glance_api_config[keystone_authtoken/admin_tenant_name]/ensure: created", "Info: /Stage[main]/Glance::Api/Glance_api_config[keystone_authtoken/admin_tenant_name]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Glance::Api/Glance_api_config[keystone_authtoken/admin_tenant_name]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Neutron::Agents::Dhcp/Neutron_dhcp_agent_config[DEFAULT/enable_isolated_metadata]/ensure: created", "Info: /Stage[main]/Neutron::Agents::Dhcp/Neutron_dhcp_agent_config[DEFAULT/enable_isolated_metadata]: Scheduling refresh of Service[neutron-dhcp-service]", "Notice: /Stage[main]/Glance::Registry/Glance_registry_config[DEFAULT/cert_file]/ensure: created", "Info: /Stage[main]/Glance::Registry/Glance_registry_config[DEFAULT/cert_file]: Scheduling refresh of Service[glance-registry]", "Info: /Stage[main]/Glance::Registry/Glance_registry_config[DEFAULT/cert_file]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Neutron::Agents::Metadata/Neutron_metadata_agent_config[DEFAULT/debug]/ensure: created", "Info: /Stage[main]/Neutron::Agents::Metadata/Neutron_metadata_agent_config[DEFAULT/debug]: Scheduling refresh of Service[neutron-metadata]", "Notice: /Stage[main]/Neutron::Agents::Dhcp/Neutron_dhcp_agent_config[DEFAULT/debug]/ensure: created", "Info: /Stage[main]/Neutron::Agents::Dhcp/Neutron_dhcp_agent_config[DEFAULT/debug]: Scheduling refresh of Service[neutron-dhcp-service]", "Notice: /Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/syslog_log_facility]/ensure: created", "Info: /Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/syslog_log_facility]: Scheduling refresh of Service[glance-registry]", "Info: /Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/syslog_log_facility]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Openstack_integration/Package[openstack-selinux]/ensure: created", "Notice: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/backlog]/ensure: created", "Info: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/backlog]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/backlog]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Mysql::Client::Install/Package[mysql_client]/ensure: created", "Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/auth_strategy]/ensure: created", "Info: 
/Stage[main]/Neutron/Neutron_config[DEFAULT/auth_strategy]: Scheduling refresh of Service[neutron-server]", "Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/auth_strategy]: Scheduling refresh of Exec[neutron-db-sync]", "Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/auth_strategy]: Scheduling refresh of Service[neutron-metadata]", "Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/auth_strategy]: Scheduling refresh of Service[neutron-lbaas-service]", "Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/auth_strategy]: Scheduling refresh of Service[neutron-l3]", "Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/auth_strategy]: Scheduling refresh of Service[neutron-dhcp-service]", "Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/auth_strategy]: Scheduling refresh of Service[neutron-metering-service]", "Notice: /Stage[main]/Mongodb::Server::Install/Package[mongodb_server]/ensure: created", "Notice: /Stage[main]/Cinder/Package[cinder]/ensure: created", "Info: /Stage[main]/Cinder/Package[cinder]: Scheduling refresh of Anchor[keystone::service::end]", "Info: /Stage[main]/Cinder/Package[cinder]: Scheduling refresh of Exec[cinder-manage db_sync]", "Notice: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_ha_queues]/ensure: created", "Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_ha_queues]: Scheduling refresh of Service[cinder-api]", "Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_ha_queues]: Scheduling refresh of Exec[cinder-manage db_sync]", "Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_ha_queues]: Scheduling refresh of Service[cinder-scheduler]", "Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_ha_queues]: Scheduling refresh of Service[cinder-volume]", "Notice: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/amqp_durable_queues]/ensure: created", "Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/amqp_durable_queues]: Scheduling refresh of Service[cinder-api]", "Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/amqp_durable_queues]: Scheduling refresh of Exec[cinder-manage db_sync]", "Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/amqp_durable_queues]: Scheduling refresh of Service[cinder-scheduler]", "Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/amqp_durable_queues]: Scheduling refresh of Service[cinder-volume]", "Notice: /Stage[main]/Cinder/Cinder_config[DEFAULT/enable_v2_api]/ensure: created", "Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/enable_v2_api]: Scheduling refresh of Service[cinder-api]", "Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/enable_v2_api]: Scheduling refresh of Exec[cinder-manage db_sync]", "Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/enable_v2_api]: Scheduling refresh of Service[cinder-scheduler]", "Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/enable_v2_api]: Scheduling refresh of Service[cinder-volume]", "Notice: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/heartbeat_rate]/ensure: created", "Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/heartbeat_rate]: Scheduling refresh of Service[cinder-api]", "Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/heartbeat_rate]: Scheduling refresh of Exec[cinder-manage db_sync]", "Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/heartbeat_rate]: Scheduling refresh of Service[cinder-scheduler]", "Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/heartbeat_rate]: Scheduling refresh of 
Service[cinder-volume]", "Notice: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/admin_password]/ensure: created", "Info: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/admin_password]: Scheduling refresh of Service[cinder-api]", "Info: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/admin_password]: Scheduling refresh of Exec[cinder-manage db_sync]", "Info: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/admin_password]: Scheduling refresh of Service[cinder-scheduler]", "Info: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/admin_password]: Scheduling refresh of Service[cinder-volume]", "Notice: /Stage[main]/Cinder/Cinder_config[DEFAULT/api_paste_config]/ensure: created", "Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/api_paste_config]: Scheduling refresh of Service[cinder-api]", "Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/api_paste_config]: Scheduling refresh of Exec[cinder-manage db_sync]", "Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/api_paste_config]: Scheduling refresh of Service[cinder-scheduler]", "Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/api_paste_config]: Scheduling refresh of Service[cinder-volume]", "Notice: /Stage[main]/Cinder::Cron::Db_purge/Cron[cinder-manage db purge]/ensure: created", "Notice: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/default_volume_type]/ensure: created", "Info: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/default_volume_type]: Scheduling refresh of Service[cinder-api]", "Info: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/default_volume_type]: Scheduling refresh of Exec[cinder-manage db_sync]", "Info: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/default_volume_type]: Scheduling refresh of Service[cinder-scheduler]", "Info: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/default_volume_type]: Scheduling refresh of Service[cinder-volume]", "Notice: /Stage[main]/Cinder/Cinder_config[DEFAULT/enable_v3_api]/ensure: created", "Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/enable_v3_api]: Scheduling refresh of Service[cinder-api]", "Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/enable_v3_api]: Scheduling refresh of Exec[cinder-manage db_sync]", "Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/enable_v3_api]: Scheduling refresh of Service[cinder-scheduler]", "Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/enable_v3_api]: Scheduling refresh of Service[cinder-volume]", "Notice: /Stage[main]/Cinder::Backends/Cinder_config[DEFAULT/enabled_backends]/ensure: created", "Info: /Stage[main]/Cinder::Backends/Cinder_config[DEFAULT/enabled_backends]: Scheduling refresh of Service[cinder-api]", "Info: /Stage[main]/Cinder::Backends/Cinder_config[DEFAULT/enabled_backends]: Scheduling refresh of Exec[cinder-manage db_sync]", "Info: /Stage[main]/Cinder::Backends/Cinder_config[DEFAULT/enabled_backends]: Scheduling refresh of Service[cinder-scheduler]", "Info: /Stage[main]/Cinder::Backends/Cinder_config[DEFAULT/enabled_backends]: Scheduling refresh of Service[cinder-volume]", "Notice: /Stage[main]/Cinder::Logging/Cinder_config[DEFAULT/debug]/ensure: created", "Info: /Stage[main]/Cinder::Logging/Cinder_config[DEFAULT/debug]: Scheduling refresh of Service[cinder-api]", "Info: /Stage[main]/Cinder::Logging/Cinder_config[DEFAULT/debug]: Scheduling refresh of Exec[cinder-manage db_sync]", "Info: /Stage[main]/Cinder::Logging/Cinder_config[DEFAULT/debug]: Scheduling refresh of Service[cinder-scheduler]", "Info: /Stage[main]/Cinder::Logging/Cinder_config[DEFAULT/debug]: Scheduling refresh of 
Service[cinder-volume]", "Notice: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_host]/ensure: created", "Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_host]: Scheduling refresh of Service[cinder-api]", "Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_host]: Scheduling refresh of Exec[cinder-manage db_sync]", "Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_host]: Scheduling refresh of Service[cinder-scheduler]", "Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_host]: Scheduling refresh of Service[cinder-volume]", "Notice: /Stage[main]/Cinder/Cinder_config[DEFAULT/storage_availability_zone]/ensure: created", "Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/storage_availability_zone]: Scheduling refresh of Service[cinder-api]", "Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/storage_availability_zone]: Scheduling refresh of Exec[cinder-manage db_sync]", "Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/storage_availability_zone]: Scheduling refresh of Service[cinder-scheduler]", "Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/storage_availability_zone]: Scheduling refresh of Service[cinder-volume]", "Notice: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/nova_catalog_info]/ensure: created", "Info: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/nova_catalog_info]: Scheduling refresh of Service[cinder-api]", "Info: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/nova_catalog_info]: Scheduling refresh of Exec[cinder-manage db_sync]", "Info: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/nova_catalog_info]: Scheduling refresh of Service[cinder-scheduler]", "Info: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/nova_catalog_info]: Scheduling refresh of Service[cinder-volume]", "Notice: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_use_ssl]/ensure: created", "Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_use_ssl]: Scheduling refresh of Service[cinder-api]", "Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_use_ssl]: Scheduling refresh of Exec[cinder-manage db_sync]", "Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_use_ssl]: Scheduling refresh of Service[cinder-scheduler]", "Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_use_ssl]: Scheduling refresh of Service[cinder-volume]", "Notice: /Stage[main]/Cinder::Logging/Cinder_config[DEFAULT/log_dir]/ensure: created", "Info: /Stage[main]/Cinder::Logging/Cinder_config[DEFAULT/log_dir]: Scheduling refresh of Service[cinder-api]", "Info: /Stage[main]/Cinder::Logging/Cinder_config[DEFAULT/log_dir]: Scheduling refresh of Exec[cinder-manage db_sync]", "Info: /Stage[main]/Cinder::Logging/Cinder_config[DEFAULT/log_dir]: Scheduling refresh of Service[cinder-scheduler]", "Info: /Stage[main]/Cinder::Logging/Cinder_config[DEFAULT/log_dir]: Scheduling refresh of Service[cinder-volume]", "Notice: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/auth_strategy]/ensure: created", "Info: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/auth_strategy]: Scheduling refresh of Service[cinder-api]", "Info: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/auth_strategy]: Scheduling refresh of Exec[cinder-manage db_sync]", "Info: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/auth_strategy]: Scheduling refresh of Service[cinder-scheduler]", "Info: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/auth_strategy]: Scheduling refresh of Service[cinder-volume]", "Notice: 
/Stage[main]/Cinder::Api/Cinder_config[DEFAULT/osapi_volume_listen]/ensure: created", "Info: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/osapi_volume_listen]: Scheduling refresh of Service[cinder-api]", "Info: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/osapi_volume_listen]: Scheduling refresh of Exec[cinder-manage db_sync]", "Info: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/osapi_volume_listen]: Scheduling refresh of Service[cinder-scheduler]", "Info: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/osapi_volume_listen]: Scheduling refresh of Service[cinder-volume]", "Notice: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/osapi_volume_workers]/ensure: created", "Info: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/osapi_volume_workers]: Scheduling refresh of Service[cinder-api]", "Info: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/osapi_volume_workers]: Scheduling refresh of Exec[cinder-manage db_sync]", "Info: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/osapi_volume_workers]: Scheduling refresh of Service[cinder-scheduler]", "Info: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/osapi_volume_workers]: Scheduling refresh of Service[cinder-volume]", "Notice: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_servers]/ensure: created", "Info: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_servers]: Scheduling refresh of Service[cinder-api]", "Info: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_servers]: Scheduling refresh of Exec[cinder-manage db_sync]", "Info: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_servers]: Scheduling refresh of Service[cinder-scheduler]", "Info: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_servers]: Scheduling refresh of Service[cinder-volume]", "Notice: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_hosts]/ensure: created", "Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_hosts]: Scheduling refresh of Service[cinder-api]", "Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_hosts]: Scheduling refresh of Exec[cinder-manage db_sync]", "Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_hosts]: Scheduling refresh of Service[cinder-scheduler]", "Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_hosts]: Scheduling refresh of Service[cinder-volume]", "Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/allow_overlapping_ips]/ensure: created", "Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/allow_overlapping_ips]: Scheduling refresh of Service[neutron-server]", "Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/allow_overlapping_ips]: Scheduling refresh of Exec[neutron-db-sync]", "Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/allow_overlapping_ips]: Scheduling refresh of Service[neutron-metadata]", "Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/allow_overlapping_ips]: Scheduling refresh of Service[neutron-lbaas-service]", "Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/allow_overlapping_ips]: Scheduling refresh of Service[neutron-l3]", "Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/allow_overlapping_ips]: Scheduling refresh of Service[neutron-dhcp-service]", "Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/allow_overlapping_ips]: Scheduling refresh of Service[neutron-metering-service]", "Notice: /Stage[main]/Staging/File[/opt/staging]/ensure: created", "Notice: /Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/log_file]/ensure: created", "Info: 
/Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/log_file]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/log_file]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Neutron/Neutron_config[oslo_concurrency/lock_path]/ensure: created", "Info: /Stage[main]/Neutron/Neutron_config[oslo_concurrency/lock_path]: Scheduling refresh of Service[neutron-server]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_concurrency/lock_path]: Scheduling refresh of Exec[neutron-db-sync]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_concurrency/lock_path]: Scheduling refresh of Service[neutron-metadata]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_concurrency/lock_path]: Scheduling refresh of Service[neutron-lbaas-service]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_concurrency/lock_path]: Scheduling refresh of Service[neutron-l3]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_concurrency/lock_path]: Scheduling refresh of Service[neutron-dhcp-service]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_concurrency/lock_path]: Scheduling refresh of Service[neutron-metering-service]", "Notice: /Stage[main]/Glance::Cache::Logging/Glance_cache_config[DEFAULT/log_dir]/ensure: created", "Info: /Stage[main]/Glance::Cache::Logging/Glance_cache_config[DEFAULT/log_dir]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Glance::Cache::Logging/Glance_cache_config[DEFAULT/log_dir]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Glance::Cache::Logging/Glance_cache_config[DEFAULT/verbose]/ensure: created", "Info: /Stage[main]/Glance::Cache::Logging/Glance_cache_config[DEFAULT/verbose]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Glance::Cache::Logging/Glance_cache_config[DEFAULT/verbose]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Glance::Cache::Logging/Glance_cache_config[DEFAULT/log_file]/ensure: created", "Info: /Stage[main]/Glance::Cache::Logging/Glance_cache_config[DEFAULT/log_file]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Glance::Cache::Logging/Glance_cache_config[DEFAULT/log_file]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/registry_client_cert_file]/ensure: created", "Info: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/registry_client_cert_file]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/registry_client_cert_file]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/router_scheduler_driver]/ensure: created", "Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/router_scheduler_driver]: Scheduling refresh of Service[neutron-server]", "Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/router_scheduler_driver]: Scheduling refresh of Exec[neutron-db-sync]", "Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/router_scheduler_driver]: Scheduling refresh of Service[neutron-metadata]", "Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/router_scheduler_driver]: Scheduling refresh of Service[neutron-lbaas-service]", "Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/router_scheduler_driver]: Scheduling refresh of Service[neutron-l3]", "Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/router_scheduler_driver]: Scheduling refresh of 
Service[neutron-dhcp-service]", "Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/router_scheduler_driver]: Scheduling refresh of Service[neutron-metering-service]", "Notice: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_type]/ensure: created", "Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_type]: Scheduling refresh of Service[neutron-server]", "Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_type]: Scheduling refresh of Exec[neutron-db-sync]", "Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_type]: Scheduling refresh of Service[neutron-metadata]", "Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_type]: Scheduling refresh of Service[neutron-lbaas-service]", "Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_type]: Scheduling refresh of Service[neutron-l3]", "Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_type]: Scheduling refresh of Service[neutron-dhcp-service]", "Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_type]: Scheduling refresh of Service[neutron-metering-service]", "Notice: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/auth_type]/ensure: created", "Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/auth_type]: Scheduling refresh of Service[neutron-server]", "Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/auth_type]: Scheduling refresh of Exec[neutron-db-sync]", "Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/auth_type]: Scheduling refresh of Service[neutron-metadata]", "Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/auth_type]: Scheduling refresh of Service[neutron-lbaas-service]", "Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/auth_type]: Scheduling refresh of Service[neutron-l3]", "Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/auth_type]: Scheduling refresh of Service[neutron-dhcp-service]", "Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/auth_type]: Scheduling refresh of Service[neutron-metering-service]", "Notice: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/tenant_name]/ensure: created", "Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/tenant_name]: Scheduling refresh of Service[neutron-server]", "Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/tenant_name]: Scheduling refresh of Exec[neutron-db-sync]", "Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/tenant_name]: Scheduling refresh of Service[neutron-metadata]", "Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/tenant_name]: Scheduling refresh of Service[neutron-lbaas-service]", "Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/tenant_name]: Scheduling refresh of Service[neutron-l3]", "Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/tenant_name]: Scheduling refresh of Service[neutron-dhcp-service]", "Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/tenant_name]: Scheduling refresh of Service[neutron-metering-service]", "Notice: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/project_name]/ensure: created", "Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/project_name]: Scheduling refresh of Service[neutron-server]", "Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/project_name]: Scheduling refresh of 
Exec[neutron-db-sync]", "Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/project_name]: Scheduling refresh of Service[neutron-metadata]", "Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/project_name]: Scheduling refresh of Service[neutron-lbaas-service]", "Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/project_name]: Scheduling refresh of Service[neutron-l3]", "Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/project_name]: Scheduling refresh of Service[neutron-dhcp-service]", "Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/project_name]: Scheduling refresh of Service[neutron-metering-service]", "Notice: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/password]/ensure: created", "Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/password]: Scheduling refresh of Service[neutron-server]", "Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/password]: Scheduling refresh of Exec[neutron-db-sync]", "Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/password]: Scheduling refresh of Service[neutron-metadata]", "Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/password]: Scheduling refresh of Service[neutron-lbaas-service]", "Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/password]: Scheduling refresh of Service[neutron-l3]", "Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/password]: Scheduling refresh of Service[neutron-dhcp-service]", "Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/password]: Scheduling refresh of Service[neutron-metering-service]", "Notice: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/username]/ensure: created", "Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/username]: Scheduling refresh of Service[neutron-server]", "Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/username]: Scheduling refresh of Exec[neutron-db-sync]", "Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/username]: Scheduling refresh of Service[neutron-metadata]", "Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/username]: Scheduling refresh of Service[neutron-lbaas-service]", "Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/username]: Scheduling refresh of Service[neutron-l3]", "Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/username]: Scheduling refresh of Service[neutron-dhcp-service]", "Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/username]: Scheduling refresh of Service[neutron-metering-service]", "Notice: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/rpc_workers]/ensure: created", "Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/rpc_workers]: Scheduling refresh of Service[neutron-server]", "Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/rpc_workers]: Scheduling refresh of Exec[neutron-db-sync]", "Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/rpc_workers]: Scheduling refresh of Service[neutron-metadata]", "Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/rpc_workers]: Scheduling refresh of Service[neutron-lbaas-service]", "Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/rpc_workers]: Scheduling refresh of Service[neutron-l3]", "Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/rpc_workers]: Scheduling refresh of 
Service[neutron-dhcp-service]", "Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/rpc_workers]: Scheduling refresh of Service[neutron-metering-service]", "Notice: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_uri]/ensure: created", "Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_uri]: Scheduling refresh of Service[neutron-server]", "Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_uri]: Scheduling refresh of Exec[neutron-db-sync]", "Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_uri]: Scheduling refresh of Service[neutron-metadata]", "Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_uri]: Scheduling refresh of Service[neutron-lbaas-service]", "Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_uri]: Scheduling refresh of Service[neutron-l3]", "Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_uri]: Scheduling refresh of Service[neutron-dhcp-service]", "Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_uri]: Scheduling refresh of Service[neutron-metering-service]", "Notice: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/api_workers]/ensure: created", "Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/api_workers]: Scheduling refresh of Service[neutron-server]", "Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/api_workers]: Scheduling refresh of Exec[neutron-db-sync]", "Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/api_workers]: Scheduling refresh of Service[neutron-metadata]", "Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/api_workers]: Scheduling refresh of Service[neutron-lbaas-service]", "Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/api_workers]: Scheduling refresh of Service[neutron-l3]", "Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/api_workers]: Scheduling refresh of Service[neutron-dhcp-service]", "Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/api_workers]: Scheduling refresh of Service[neutron-metering-service]", "Notice: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/admin_tenant_name]/ensure: created", "Info: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/admin_tenant_name]: Scheduling refresh of Service[cinder-api]", "Info: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/admin_tenant_name]: Scheduling refresh of Exec[cinder-manage db_sync]", "Info: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/admin_tenant_name]: Scheduling refresh of Service[cinder-scheduler]", "Info: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/admin_tenant_name]: Scheduling refresh of Service[cinder-volume]", "Notice: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/registry_client_protocol]/ensure: created", "Info: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/registry_client_protocol]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/registry_client_protocol]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Cinder/Cinder_config[oslo_concurrency/lock_path]/ensure: created", "Info: /Stage[main]/Cinder/Cinder_config[oslo_concurrency/lock_path]: Scheduling refresh of Service[cinder-api]", "Info: /Stage[main]/Cinder/Cinder_config[oslo_concurrency/lock_path]: Scheduling refresh of Exec[cinder-manage db_sync]", "Info: /Stage[main]/Cinder/Cinder_config[oslo_concurrency/lock_path]: 
Scheduling refresh of Service[cinder-scheduler]", "Info: /Stage[main]/Cinder/Cinder_config[oslo_concurrency/lock_path]: Scheduling refresh of Service[cinder-volume]", "Notice: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/swift_store_create_container_on_put]/ensure: created", "Info: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/swift_store_create_container_on_put]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/swift_store_create_container_on_put]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/swift_store_endpoint_type]/ensure: created", "Info: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/swift_store_endpoint_type]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/swift_store_endpoint_type]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/debug]/ensure: created", "Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/debug]: Scheduling refresh of Service[neutron-server]", "Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/debug]: Scheduling refresh of Exec[neutron-db-sync]", "Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/debug]: Scheduling refresh of Service[neutron-metadata]", "Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/debug]: Scheduling refresh of Service[neutron-lbaas-service]", "Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/debug]: Scheduling refresh of Service[neutron-l3]", "Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/debug]: Scheduling refresh of Service[neutron-dhcp-service]", "Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/debug]: Scheduling refresh of Service[neutron-metering-service]", "Notice: /Stage[main]/Ironic/Package[ironic-common]/ensure: created", "Info: /Stage[main]/Ironic/Package[ironic-common]: Scheduling refresh of Anchor[keystone::service::end]", "Info: /Stage[main]/Ironic/Package[ironic-common]: Scheduling refresh of Exec[ironic-dbsync]", "Notice: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_virtual_host]/ensure: created", "Info: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_virtual_host]: Scheduling refresh of Service[httpd]", "Info: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_virtual_host]: Scheduling refresh of Service[ironic-conductor]", "Notice: /Stage[main]/Ironic/Ironic_config[glance/glance_api_insecure]/ensure: created", "Info: /Stage[main]/Ironic/Ironic_config[glance/glance_api_insecure]: Scheduling refresh of Service[httpd]", "Info: /Stage[main]/Ironic/Ironic_config[glance/glance_api_insecure]: Scheduling refresh of Service[ironic-conductor]", "Notice: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/kombu_ssl_version]/ensure: created", "Info: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/kombu_ssl_version]: Scheduling refresh of Service[httpd]", "Info: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/kombu_ssl_version]: Scheduling refresh of Service[ironic-conductor]", "Notice: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_ha_queues]/ensure: created", "Info: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_ha_queues]: Scheduling refresh of Service[httpd]", "Info: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_ha_queues]: Scheduling refresh of Service[ironic-conductor]", 
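
The Notice/Info pairs above are ordinary Puppet resource events: each *_config resource (here ironic_config, provided by the puppet-ironic module, alongside glance_api_config, neutron_config, cinder_config and tempest_config from their respective modules) manages a single INI setting, and a change to a setting notifies the dependent services and db-sync execs, which is why every "ensure: created" entry is followed by one "Scheduling refresh" line per subscribed service or exec. A minimal sketch of the pattern, with illustrative values rather than the actual scenario002 manifest:

    # Illustrative only -- not the scenario002 manifest. One ironic_config
    # resource manages one INI setting in ironic.conf; the notify
    # metaparameter produces the "Scheduling refresh of Service[...]"
    # events seen in the log, and each notified service restarts at most
    # once at the end of the catalog run.
    ironic_config { 'oslo_messaging_rabbit/rabbit_port':
      value  => '5672',
      notify => [Service['httpd'], Service['ironic-conductor']],
    }

    service { 'ironic-conductor':
      ensure => running,
      enable => true,
    }

    # The puppet-openstack modules typically declare the same relationship
    # once, with a resource collector, instead of per-resource notifies:
    #   Ironic_config<||> ~> Service['ironic-conductor']
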
"Notice: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_port]/ensure: created", "Info: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_port]: Scheduling refresh of Service[httpd]", "Info: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_port]: Scheduling refresh of Service[ironic-conductor]", "Notice: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_userid]/ensure: created", "Info: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_userid]: Scheduling refresh of Service[httpd]", "Info: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_userid]: Scheduling refresh of Service[ironic-conductor]", "Notice: /Stage[main]/Ironic/Ironic_config[DEFAULT/enabled_drivers]/ensure: created", "Info: /Stage[main]/Ironic/Ironic_config[DEFAULT/enabled_drivers]: Scheduling refresh of Service[httpd]", "Info: /Stage[main]/Ironic/Ironic_config[DEFAULT/enabled_drivers]: Scheduling refresh of Service[ironic-conductor]", "Notice: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_host]/ensure: created", "Info: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_host]: Scheduling refresh of Service[httpd]", "Info: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_host]: Scheduling refresh of Service[ironic-conductor]", "Notice: /Stage[main]/Ironic/Ironic_config[DEFAULT/auth_strategy]/ensure: created", "Info: /Stage[main]/Ironic/Ironic_config[DEFAULT/auth_strategy]: Scheduling refresh of Service[httpd]", "Info: /Stage[main]/Ironic/Ironic_config[DEFAULT/auth_strategy]: Scheduling refresh of Service[ironic-conductor]", "Notice: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/amqp_durable_queues]/ensure: created", "Info: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/amqp_durable_queues]: Scheduling refresh of Service[httpd]", "Info: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/amqp_durable_queues]: Scheduling refresh of Service[ironic-conductor]", "Notice: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_use_ssl]/ensure: created", "Info: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_use_ssl]: Scheduling refresh of Service[httpd]", "Info: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_use_ssl]: Scheduling refresh of Service[ironic-conductor]", "Notice: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_use_ssl]/ensure: created", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_use_ssl]: Scheduling refresh of Service[neutron-server]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_use_ssl]: Scheduling refresh of Exec[neutron-db-sync]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_use_ssl]: Scheduling refresh of Service[neutron-metadata]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_use_ssl]: Scheduling refresh of Service[neutron-lbaas-service]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_use_ssl]: Scheduling refresh of Service[neutron-l3]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_use_ssl]: Scheduling refresh of Service[neutron-dhcp-service]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_use_ssl]: Scheduling refresh of Service[neutron-metering-service]", "Notice: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_password]/ensure: created", "Info: 
/Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_password]: Scheduling refresh of Service[httpd]", "Info: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_password]: Scheduling refresh of Service[ironic-conductor]", "Notice: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/cert_file]/ensure: created", "Info: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/cert_file]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/cert_file]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/verbose]/ensure: created", "Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/verbose]: Scheduling refresh of Service[neutron-server]", "Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/verbose]: Scheduling refresh of Exec[neutron-db-sync]", "Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/verbose]: Scheduling refresh of Service[neutron-metadata]", "Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/verbose]: Scheduling refresh of Service[neutron-lbaas-service]", "Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/verbose]: Scheduling refresh of Service[neutron-l3]", "Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/verbose]: Scheduling refresh of Service[neutron-dhcp-service]", "Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/verbose]: Scheduling refresh of Service[neutron-metering-service]", "Notice: /Stage[main]/Glance::Registry/Glance_registry_config[paste_deploy/flavor]/ensure: created", "Info: /Stage[main]/Glance::Registry/Glance_registry_config[paste_deploy/flavor]: Scheduling refresh of Service[glance-registry]", "Info: /Stage[main]/Glance::Registry/Glance_registry_config[paste_deploy/flavor]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Apache::Mod::Mime/Package[mailcap]/ensure: created", "Notice: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/workers]/ensure: created", "Info: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/workers]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/workers]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Cinder/Cinder_config[DEFAULT/default_availability_zone]/ensure: created", "Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/default_availability_zone]: Scheduling refresh of Service[cinder-api]", "Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/default_availability_zone]: Scheduling refresh of Exec[cinder-manage db_sync]", "Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/default_availability_zone]: Scheduling refresh of Service[cinder-scheduler]", "Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/default_availability_zone]: Scheduling refresh of Service[cinder-volume]", "Notice: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_password]/ensure: created", "Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_password]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_password]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Glance::Api/Glance_cache_config[DEFAULT/admin_password]/ensure: created", "Info: /Stage[main]/Glance::Api/Glance_cache_config[DEFAULT/admin_password]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Glance::Api/Glance_cache_config[DEFAULT/admin_password]: Scheduling refresh of Exec[glance-manage 
db_sync]", "Notice: /Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/use_syslog]/ensure: created", "Info: /Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/use_syslog]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/use_syslog]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Ironic::Client/Package[python-ironicclient]/ensure: created", "Info: /Stage[main]/Ironic::Client/Package[python-ironicclient]: Scheduling refresh of Anchor[keystone::service::end]", "Notice: /Stage[main]/Tempest/Tempest_config[identity/admin_username]/ensure: created", "Notice: /Stage[main]/Tempest/Tempest_config[service_available/zaqar]/ensure: created", "Notice: /Stage[main]/Tempest/Tempest_config[network/public_router_id]/ensure: created", "Notice: /Stage[main]/Tempest/Package[openssl-devel]/ensure: created", "Notice: /Stage[main]/Tempest/Tempest_config[identity/auth_version]/ensure: created", "Notice: /Stage[main]/Tempest/Tempest_config[service_available/sahara]/ensure: created", "Notice: /Stage[main]/Tempest/Tempest_config[identity/uri_v3]/ensure: created", "Notice: /Stage[main]/Tempest/Tempest_config[DEFAULT/use_syslog]/ensure: created", "Notice: /Stage[main]/Tempest/Tempest_config[service_available/swift]/ensure: created", "Notice: /Stage[main]/Tempest/Tempest_config[identity/admin_tenant_name]/ensure: created", "Notice: /Stage[main]/Tempest/Tempest_config[scenario/img_dir]/ensure: created", "Notice: /Stage[main]/Tempest/Tempest_config[compute/flavor_ref_alt]/ensure: created", "Notice: /Stage[main]/Tempest/Package[libffi-devel]/ensure: created", "Notice: /Stage[main]/Tempest/Tempest_config[scenario/img_file]/ensure: created", "Notice: /Stage[main]/Tempest/Tempest_config[DEFAULT/log_file]/ensure: created", "Notice: /Stage[main]/Tempest/Tempest_config[dashboard/dashboard_url]/ensure: created", "Notice: /Stage[main]/Tempest/Tempest_config[service_available/cinder]/ensure: created", "Notice: /Stage[main]/Tempest/Tempest_config[service_available/ironic]/ensure: created", "Notice: /Stage[main]/Tempest/Exec[install-pip]/returns: executed successfully", "Notice: /Stage[main]/Tempest/Tempest_config[service_available/heat]/ensure: created", "Notice: /Stage[main]/Tempest/Tempest_config[identity-feature-enabled/api_v2]/ensure: created", "Notice: /Stage[main]/Tempest/Tempest_config[identity/ca_certificates_file]/ensure: created", "Notice: /Stage[main]/Tempest/Tempest_config[service_available/trove]/ensure: created", "Notice: /Stage[main]/Tempest/Tempest_config[service_available/murano]/ensure: created", "Notice: /Stage[main]/Tempest/Tempest_config[compute/flavor_ref]/ensure: created", "Notice: /Stage[main]/Tempest/Tempest_config[identity/admin_password]/ensure: created", "Notice: /Stage[main]/Tempest/Tempest_config[service_available/ceilometer]/ensure: created", "Notice: /Stage[main]/Tempest/Tempest_config[oslo_concurrency/lock_path]/ensure: created", "Notice: /Stage[main]/Tempest/Tempest_config[service_available/glance]/ensure: created", "Notice: /Stage[main]/Tempest/Exec[install-tox]/returns: executed successfully", "Notice: /Stage[main]/Tempest/Tempest_config[service_available/nova]/ensure: created", "Notice: /Stage[main]/Neutron::Agents::Dhcp/Neutron_dhcp_agent_config[DEFAULT/root_helper]/ensure: created", "Info: /Stage[main]/Neutron::Agents::Dhcp/Neutron_dhcp_agent_config[DEFAULT/root_helper]: Scheduling refresh of Service[neutron-dhcp-service]", "Notice: 
/Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_hosts]/ensure: created", "Info: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_hosts]: Scheduling refresh of Service[httpd]", "Info: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_hosts]: Scheduling refresh of Service[ironic-conductor]", "Notice: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/auth_uri]/ensure: created", "Info: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/auth_uri]: Scheduling refresh of Service[cinder-api]", "Info: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/auth_uri]: Scheduling refresh of Exec[cinder-manage db_sync]", "Info: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/auth_uri]: Scheduling refresh of Service[cinder-scheduler]", "Info: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/auth_uri]: Scheduling refresh of Service[cinder-volume]", "Notice: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/registry_host]/ensure: created", "Info: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/registry_host]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/registry_host]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Openstack_integration::Glance/Openstack_integration::Ssl_key[glance]/File[/etc/glance/ssl]/ensure: created", "Notice: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/bind_port]/ensure: created", "Info: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/bind_port]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/bind_port]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_hosts]/ensure: created", "Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_hosts]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_hosts]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Glance::Backend::Swift/Glance_swift_config[ref1/auth_version]/ensure: created", "Notice: /Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/syslog_log_facility]/ensure: created", "Info: /Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/syslog_log_facility]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/syslog_log_facility]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Neutron::Agents::Dhcp/Neutron_dhcp_agent_config[DEFAULT/interface_driver]/ensure: created", "Info: /Stage[main]/Neutron::Agents::Dhcp/Neutron_dhcp_agent_config[DEFAULT/interface_driver]: Scheduling refresh of Service[neutron-dhcp-service]", "Notice: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_password]/ensure: created", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_password]: Scheduling refresh of Service[neutron-server]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_password]: Scheduling refresh of Exec[neutron-db-sync]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_password]: Scheduling refresh of Service[neutron-metadata]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_password]: Scheduling refresh of Service[neutron-lbaas-service]", "Info: 
/Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_password]: Scheduling refresh of Service[neutron-l3]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_password]: Scheduling refresh of Service[neutron-dhcp-service]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_password]: Scheduling refresh of Service[neutron-metering-service]", "Notice: /Stage[main]/Tempest/Tempest_config[identity-feature-enabled/api_v3]/ensure: created", "Notice: /Stage[main]/Tempest/Tempest_config[compute/image_ssh_user]/ensure: created", "Notice: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_ha_queues]/ensure: created", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_ha_queues]: Scheduling refresh of Service[neutron-server]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_ha_queues]: Scheduling refresh of Exec[neutron-db-sync]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_ha_queues]: Scheduling refresh of Service[neutron-metadata]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_ha_queues]: Scheduling refresh of Service[neutron-lbaas-service]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_ha_queues]: Scheduling refresh of Service[neutron-l3]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_ha_queues]: Scheduling refresh of Service[neutron-dhcp-service]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_ha_queues]: Scheduling refresh of Service[neutron-metering-service]", "Notice: /Stage[main]/Glance::Api/Glance_api_config[keystone_authtoken/auth_uri]/ensure: created", "Info: /Stage[main]/Glance::Api/Glance_api_config[keystone_authtoken/auth_uri]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Glance::Api/Glance_api_config[keystone_authtoken/auth_uri]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Tempest/Tempest_config[compute/image_alt_ssh_user]/ensure: created", "Notice: /Stage[main]/Neutron::Agents::Dhcp/Neutron_dhcp_agent_config[DEFAULT/dhcp_driver]/ensure: created", "Info: /Stage[main]/Neutron::Agents::Dhcp/Neutron_dhcp_agent_config[DEFAULT/dhcp_driver]: Scheduling refresh of Service[neutron-dhcp-service]", "Notice: /Stage[main]/Neutron::Server/Neutron_api_config[filter:authtoken/auth_uri]/ensure: created", "Info: /Stage[main]/Neutron::Server/Neutron_api_config[filter:authtoken/auth_uri]: Scheduling refresh of Service[neutron-server]", "Notice: /Stage[main]/Neutron::Services::Fwaas/Package[neutron-fwaas]/ensure: created", "Info: /Stage[main]/Neutron::Services::Fwaas/Package[neutron-fwaas]: Scheduling refresh of Anchor[keystone::service::end]", "Notice: /Stage[main]/Neutron::Services::Fwaas/Neutron_fwaas_service_config[fwaas/driver]/ensure: created", "Info: /Stage[main]/Neutron::Services::Fwaas/Neutron_fwaas_service_config[fwaas/driver]: Scheduling refresh of Service[neutron-l3]", "Notice: /Stage[main]/Openstacklib::Openstackclient/Package[python-openstackclient]/ensure: created", "Info: /Stage[main]/Openstacklib::Openstackclient/Package[python-openstackclient]: Scheduling refresh of Anchor[keystone::service::end]", "Notice: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/volume_driver]/ensure: created", "Info: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/volume_driver]: Scheduling refresh of 
Service[cinder-api]", "Info: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/volume_driver]: Scheduling refresh of Exec[cinder-manage db_sync]", "Info: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/volume_driver]: Scheduling refresh of Service[cinder-scheduler]", "Info: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/volume_driver]: Scheduling refresh of Service[cinder-volume]", "Notice: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Package[targetcli]/ensure: created", "Notice: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/volumes_dir]/ensure: created", "Info: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/volumes_dir]: Scheduling refresh of Service[cinder-api]", "Info: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/volumes_dir]: Scheduling refresh of Exec[cinder-manage db_sync]", "Info: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/volumes_dir]: Scheduling refresh of Service[cinder-scheduler]", "Info: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/volumes_dir]: Scheduling refresh of Service[cinder-volume]", "Notice: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/iscsi_ip_address]/ensure: created", "Info: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/iscsi_ip_address]: Scheduling refresh of Service[cinder-api]", "Info: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/iscsi_ip_address]: Scheduling refresh of Exec[cinder-manage db_sync]", "Info: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/iscsi_ip_address]: Scheduling refresh of Service[cinder-scheduler]", "Info: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/iscsi_ip_address]: Scheduling refresh of Service[cinder-volume]", "Notice: /Stage[main]/Neutron::Agents::Metering/Neutron_metering_agent_config[DEFAULT/debug]/ensure: created", "Info: /Stage[main]/Neutron::Agents::Metering/Neutron_metering_agent_config[DEFAULT/debug]: Scheduling refresh of Service[neutron-metering-service]", "Notice: /Stage[main]/Tempest/Tempest_config[DEFAULT/use_stderr]/ensure: created", "Notice: /Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/log_file]/ensure: created", "Info: /Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/log_file]: Scheduling refresh of Service[glance-registry]", "Info: /Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/log_file]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Tempest/Tempest_config[service_available/horizon]/ensure: created", "Notice: /Stage[main]/Glance::Api/Glance_api_config[keystone_authtoken/admin_user]/ensure: created", "Info: /Stage[main]/Glance::Api/Glance_api_config[keystone_authtoken/admin_user]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Glance::Api/Glance_api_config[keystone_authtoken/admin_user]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: 
/Stage[main]/Vswitch::Ovs/Package[openvswitch]/ensure: created", "Notice: /Stage[main]/Vswitch::Ovs/Service[openvswitch]/ensure: ensure changed 'stopped' to 'running'", "Info: /Stage[main]/Vswitch::Ovs/Service[openvswitch]: Unscheduling refresh on Service[openvswitch]", "Notice: /Stage[main]/Xinetd/File[/etc/xinetd.conf]/content: ", "--- /etc/xinetd.conf\t2014-06-09 19:55:06.000000000 +0100", "+++ /tmp/puppet-file20160520-26469-qo45km\t2016-05-20 12:26:52.098856758 +0100", "@@ -1,3 +1,5 @@", "+# This file is being maintained by Puppet.", "+# DO NOT EDIT", " #", " # This is the master xinetd configuration file. Settings in the", " # default section will be inherited by all service configurations", "@@ -10,41 +12,40 @@", " # The next two items are intended to be a quick access place to", " # temporarily enable or disable services.", " #", "-#\tenabled\t\t=", "-#\tdisabled\t=", "+# enabled =", "+# disabled =", " ", " # Define general logging characteristics.", "-\tlog_type\t= SYSLOG daemon info ", "-\tlog_on_failure\t= HOST", "-\tlog_on_success\t= PID HOST DURATION EXIT", "+ log_type = SYSLOG daemon info", "+ log_on_failure = HOST", "+ log_on_success = PID HOST DURATION EXIT", " ", " # Define access restriction defaults", " #", "-#\tno_access\t=", "-#\tonly_from\t=", "-#\tmax_load\t= 0", "-\tcps\t\t= 50 10", "-\tinstances\t= 50", "-\tper_source\t= 10", "+# no_access =", "+# only_from =", "+# max_load = 0", "+ cps = 50 10", "+ instances = 50", "+ per_source = 10", " ", " # Address and networking defaults", " #", "-#\tbind\t\t=", "-#\tmdns\t\t= yes", "-\tv6only\t\t= no", "+# bind =", "+# mdns = yes", "+ v6only = no", " ", " # setup environmental attributes", " #", "-#\tpassenv\t\t=", "-\tgroups\t\t= yes", "-\tumask\t\t= 002", "+# passenv =", "+ groups = yes", "+ umask = 002", " ", " # Generally, banners are not used. 
This sets up their global defaults", " #", "-#\tbanner\t\t=", "-#\tbanner_fail\t=", "-#\tbanner_success\t=", "+# banner =", "+# banner_fail =", "+# banner_success =", " }", " ", " includedir /etc/xinetd.d", "- ", "Info: /Stage[main]/Xinetd/File[/etc/xinetd.conf]: Filebucketed /etc/xinetd.conf to puppet with sum 9ff8cc688dd9f0dfc45e5afd25c427a7", "Notice: /Stage[main]/Xinetd/File[/etc/xinetd.conf]/content: content changed '{md5}9ff8cc688dd9f0dfc45e5afd25c427a7' to '{md5}011e3163044bef3aa02a664f3785d30c'", "Notice: /Stage[main]/Xinetd/File[/etc/xinetd.conf]/mode: mode changed '0600' to '0644'", "Info: /Stage[main]/Xinetd/File[/etc/xinetd.conf]: Scheduling refresh of Service[xinetd]", "Info: /Stage[main]/Xinetd/File[/etc/xinetd.conf]: Scheduling refresh of Service[xinetd]", "Notice: /Stage[main]/Mysql::Server::Install/Package[mysql-server]/ensure: created", "Notice: /Stage[main]/Mysql::Server::Config/File[mysql-config-file]/ensure: defined content as '{md5}ff09a4033f718f08f69da17f0aa86652'", "Notice: /Stage[main]/Mysql::Server::Installdb/Exec[mysql_install_db]/returns: executed successfully", "Notice: /File[/var/log/mariadb/mariadb.log]/seluser: seluser changed 'unconfined_u' to 'system_u'", "Notice: /Stage[main]/Mysql::Server::Service/Service[mysqld]/ensure: ensure changed 'stopped' to 'running'", "Info: /Stage[main]/Mysql::Server::Service/Service[mysqld]: Unscheduling refresh on Service[mysqld]", "Notice: /Stage[main]/Neutron::Db::Mysql/Openstacklib::Db::Mysql[neutron]/Mysql_database[neutron]/ensure: created", "Notice: /Stage[main]/Neutron::Db::Mysql/Openstacklib::Db::Mysql[neutron]/Openstacklib::Db::Mysql::Host_access[neutron_127.0.0.1]/Mysql_user[neutron@127.0.0.1]/ensure: created", "Notice: /Stage[main]/Glance::Db::Mysql/Openstacklib::Db::Mysql[glance]/Mysql_database[glance]/ensure: created", "Notice: /Stage[main]/Cinder::Db::Mysql/Openstacklib::Db::Mysql[cinder]/Mysql_database[cinder]/ensure: created", "Notice: /Stage[main]/Tempest/Tempest_config[service_available/aodh]/ensure: created", "Notice: /Stage[main]/Cinder::Logging/Cinder_config[DEFAULT/verbose]/ensure: created", "Info: /Stage[main]/Cinder::Logging/Cinder_config[DEFAULT/verbose]: Scheduling refresh of Service[cinder-api]", "Info: /Stage[main]/Cinder::Logging/Cinder_config[DEFAULT/verbose]: Scheduling refresh of Exec[cinder-manage db_sync]", "Info: /Stage[main]/Cinder::Logging/Cinder_config[DEFAULT/verbose]: Scheduling refresh of Service[cinder-scheduler]", "Info: /Stage[main]/Cinder::Logging/Cinder_config[DEFAULT/verbose]: Scheduling refresh of Service[cinder-volume]", "Notice: /Stage[main]/Tempest/Tempest_config[identity/uri]/ensure: created", "Notice: /Stage[main]/Tempest/Tempest_config[identity/admin_domain_name]/ensure: created", "Notice: /Stage[main]/Ironic/Ironic_config[DEFAULT/rpc_backend]/ensure: created", "Info: /Stage[main]/Ironic/Ironic_config[DEFAULT/rpc_backend]: Scheduling refresh of Service[httpd]", "Info: /Stage[main]/Ironic/Ironic_config[DEFAULT/rpc_backend]: Scheduling refresh of Service[ironic-conductor]", "Notice: /Stage[main]/Cinder/Cinder_config[DEFAULT/control_exchange]/ensure: created", "Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/control_exchange]: Scheduling refresh of Service[cinder-api]", "Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/control_exchange]: Scheduling refresh of Exec[cinder-manage db_sync]", "Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/control_exchange]: Scheduling refresh of Service[cinder-scheduler]", "Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/control_exchange]: 
Scheduling refresh of Service[cinder-volume]", "Notice: /Stage[main]/Neutron::Services::Fwaas/Neutron_fwaas_service_config[fwaas/enabled]/ensure: created", "Info: /Stage[main]/Neutron::Services::Fwaas/Neutron_fwaas_service_config[fwaas/enabled]: Scheduling refresh of Service[neutron-l3]", "Notice: /Stage[main]/Ironic::Logging/Ironic_config[DEFAULT/log_dir]/ensure: created", "Info: /Stage[main]/Ironic::Logging/Ironic_config[DEFAULT/log_dir]: Scheduling refresh of Service[httpd]", "Info: /Stage[main]/Ironic::Logging/Ironic_config[DEFAULT/log_dir]: Scheduling refresh of Service[ironic-conductor]", "Notice: /Stage[main]/Cinder/Cinder_config[DEFAULT/enable_v1_api]/ensure: created", "Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/enable_v1_api]: Scheduling refresh of Service[cinder-api]", "Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/enable_v1_api]: Scheduling refresh of Exec[cinder-manage db_sync]", "Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/enable_v1_api]: Scheduling refresh of Service[cinder-scheduler]", "Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/enable_v1_api]: Scheduling refresh of Service[cinder-volume]", "Notice: /Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/use_stderr]/ensure: created", "Info: /Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/use_stderr]: Scheduling refresh of Service[glance-registry]", "Info: /Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/use_stderr]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Nova/Package[python-nova]/ensure: created", "Info: /Stage[main]/Nova/Package[python-nova]: Scheduling refresh of Anchor[keystone::service::end]", "Info: /Stage[main]/Nova/Package[python-nova]: Scheduling refresh of Anchor[nova::install::end]", "Notice: /Stage[main]/Nova::Consoleauth/Nova::Generic_service[consoleauth]/Package[nova-consoleauth]/ensure: created", "Info: /Stage[main]/Nova::Consoleauth/Nova::Generic_service[consoleauth]/Package[nova-consoleauth]: Scheduling refresh of Anchor[keystone::service::end]", "Info: /Stage[main]/Nova::Consoleauth/Nova::Generic_service[consoleauth]/Package[nova-consoleauth]: Scheduling refresh of Anchor[nova::install::end]", "Notice: /Stage[main]/Nova::Compute/Package[genisoimage]/ensure: created", "Info: /Stage[main]/Nova::Compute/Package[genisoimage]: Scheduling refresh of Anchor[keystone::service::end]", "Notice: /Stage[main]/Openstack_integration::Nova/Openstack_integration::Ssl_key[nova]/File[/etc/nova/ssl]/ensure: created", "Notice: /Stage[main]/Nova::Compute::Libvirt/Package[libvirt-nwfilter]/ensure: created", "Info: /Stage[main]/Nova::Compute::Libvirt/Package[libvirt-nwfilter]: Scheduling refresh of Anchor[keystone::service::end]", "Notice: /Stage[main]/Nova::Compute::Libvirt/Package[libvirt]/ensure: created", "Info: /Stage[main]/Nova::Compute::Libvirt/Package[libvirt]: Scheduling refresh of Anchor[keystone::service::end]", "Notice: /Stage[main]/Openstack_integration::Nova/Openstack_integration::Ssl_key[nova]/File[/etc/nova/ssl/private]/ensure: created", "Notice: /Stage[main]/Openstack_integration::Nova/Openstack_integration::Ssl_key[nova]/File[/etc/nova/ssl/private/n2.dusty.ci.centos.org.pem]/ensure: defined content as '{md5}a4b1e9413d7b63f33794716d77cfaa00'", "Info: Openstack_integration::Ssl_key[nova]: Scheduling refresh of Service[httpd]", "Notice: /Stage[main]/Glance::Db::Mysql/Openstacklib::Db::Mysql[glance]/Openstacklib::Db::Mysql::Host_access[glance_127.0.0.1]/Mysql_user[glance@127.0.0.1]/ensure: created", "Notice: 
/Stage[main]/Glance::Db::Mysql/Openstacklib::Db::Mysql[glance]/Openstacklib::Db::Mysql::Host_access[glance_127.0.0.1]/Mysql_grant[glance@127.0.0.1/glance.*]/ensure: created", "Notice: /Stage[main]/Openstack_integration::Glance/Openstack_integration::Ssl_key[glance]/File[/etc/glance/ssl/private]/ensure: created", "Notice: /Stage[main]/Openstack_integration::Glance/Openstack_integration::Ssl_key[glance]/File[/etc/glance/ssl/private/n2.dusty.ci.centos.org.pem]/ensure: defined content as '{md5}a4b1e9413d7b63f33794716d77cfaa00'", "Info: Openstack_integration::Ssl_key[glance]: Scheduling refresh of Service[glance-api]", "Info: Openstack_integration::Ssl_key[glance]: Scheduling refresh of Service[glance-registry]", "Notice: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_port]/ensure: created", "Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_port]: Scheduling refresh of Service[cinder-api]", "Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_port]: Scheduling refresh of Exec[cinder-manage db_sync]", "Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_port]: Scheduling refresh of Service[cinder-scheduler]", "Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_port]: Scheduling refresh of Service[cinder-volume]", "Notice: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_host]/ensure: created", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_host]: Scheduling refresh of Service[neutron-server]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_host]: Scheduling refresh of Exec[neutron-db-sync]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_host]: Scheduling refresh of Service[neutron-metadata]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_host]: Scheduling refresh of Service[neutron-lbaas-service]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_host]: Scheduling refresh of Service[neutron-l3]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_host]: Scheduling refresh of Service[neutron-dhcp-service]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_host]: Scheduling refresh of Service[neutron-metering-service]", "Notice: /Stage[main]/Glance::Registry/Glance_registry_config[keystone_authtoken/auth_uri]/ensure: created", "Info: /Stage[main]/Glance::Registry/Glance_registry_config[keystone_authtoken/auth_uri]: Scheduling refresh of Service[glance-registry]", "Info: /Stage[main]/Glance::Registry/Glance_registry_config[keystone_authtoken/auth_uri]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Tempest/Tempest_config[compute/build_interval]/ensure: created", "Notice: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/volume_backend_name]/ensure: created", "Info: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/volume_backend_name]: Scheduling refresh of Service[cinder-api]", "Info: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/volume_backend_name]: Scheduling refresh of Exec[cinder-manage db_sync]", "Info: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/volume_backend_name]: Scheduling refresh of Service[cinder-scheduler]", "Info: 
/Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/volume_backend_name]: Scheduling refresh of Service[cinder-volume]", "Notice: /Stage[main]/Neutron::Agents::Lbaas/Package[neutron-lbaas-agent]/ensure: created", "Info: /Stage[main]/Neutron::Agents::Lbaas/Package[neutron-lbaas-agent]: Scheduling refresh of Anchor[keystone::service::end]", "Info: /Stage[main]/Neutron::Agents::Lbaas/Package[neutron-lbaas-agent]: Scheduling refresh of Exec[neutron-db-sync]", "Info: /Stage[main]/Neutron::Agents::Lbaas/Package[neutron-lbaas-agent]: Scheduling refresh of Service[neutron-lbaas-service]", "Notice: /Stage[main]/Neutron::Agents::Lbaas/Neutron_lbaas_agent_config[haproxy/user_group]/ensure: created", "Info: /Stage[main]/Neutron::Agents::Lbaas/Neutron_lbaas_agent_config[haproxy/user_group]: Scheduling refresh of Service[neutron-lbaas-service]", "Notice: /Stage[main]/Neutron::Agents::Lbaas/Neutron_lbaas_agent_config[DEFAULT/debug]/ensure: created", "Info: /Stage[main]/Neutron::Agents::Lbaas/Neutron_lbaas_agent_config[DEFAULT/debug]: Scheduling refresh of Service[neutron-lbaas-service]", "Notice: /Stage[main]/Neutron::Agents::Lbaas/Neutron_lbaas_agent_config[DEFAULT/interface_driver]/ensure: created", "Info: /Stage[main]/Neutron::Agents::Lbaas/Neutron_lbaas_agent_config[DEFAULT/interface_driver]: Scheduling refresh of Service[neutron-lbaas-service]", "Notice: /Stage[main]/Neutron::Agents::Lbaas/Neutron_lbaas_agent_config[DEFAULT/device_driver]/ensure: created", "Info: /Stage[main]/Neutron::Agents::Lbaas/Neutron_lbaas_agent_config[DEFAULT/device_driver]: Scheduling refresh of Service[neutron-lbaas-service]", "Notice: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_notification_topic]/ensure: created", "Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_notification_topic]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_notification_topic]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Glance::Backend::Swift/Glance_swift_config[ref1/auth_address]/ensure: created", "Notice: /Stage[main]/Openstack_integration::Provision/Vs_bridge[br-ex]/ensure: created", "Info: /Stage[main]/Openstack_integration::Provision/Vs_bridge[br-ex]: Scheduling refresh of Exec[create_loop1_port]", "Notice: /Stage[main]/Openstack_integration::Provision/Exec[create_loop1_port]: Triggered 'refresh' from 1 events", "Notice: /Stage[main]/Openstack_integration::Provision/Vs_port[loop1]/ensure: created", "Info: /Stage[main]/Openstack_integration::Provision/Vs_port[loop1]: Scheduling refresh of Exec[create_br-ex_vif]", "Notice: /Stage[main]/Openstack_integration::Provision/Exec[create_br-ex_vif]: Triggered 'refresh' from 1 events", "Notice: /Stage[main]/Nova::Compute/Nova::Generic_service[compute]/Package[nova-compute]/ensure: created", "Info: /Stage[main]/Nova::Compute/Nova::Generic_service[compute]/Package[nova-compute]: Scheduling refresh of Anchor[keystone::service::end]", "Info: /Stage[main]/Nova::Compute/Nova::Generic_service[compute]/Package[nova-compute]: Scheduling refresh of Anchor[nova::install::end]", "Notice: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]/ensure: created", "Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]: Scheduling refresh of Service[cinder-api]", "Info: 
/Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]: Scheduling refresh of Exec[cinder-manage db_sync]", "Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]: Scheduling refresh of Service[cinder-scheduler]", "Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]: Scheduling refresh of Service[cinder-volume]", "Notice: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_virtual_host]/ensure: created", "Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_virtual_host]: Scheduling refresh of Service[cinder-api]", "Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_virtual_host]: Scheduling refresh of Exec[cinder-manage db_sync]", "Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_virtual_host]: Scheduling refresh of Service[cinder-scheduler]", "Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_virtual_host]: Scheduling refresh of Service[cinder-volume]", "Notice: /Stage[main]/Cinder::Setup_test_volume/Exec[create_/var/lib/cinder/cinder-volumes]/returns: executed successfully", "Info: /Stage[main]/Cinder::Setup_test_volume/Exec[create_/var/lib/cinder/cinder-volumes]: Scheduling refresh of Exec[losetup /dev/loop2 /var/lib/cinder/cinder-volumes]", "Notice: /Stage[main]/Cinder::Setup_test_volume/Exec[losetup /dev/loop2 /var/lib/cinder/cinder-volumes]: Triggered 'refresh' from 1 events", "Info: /Stage[main]/Cinder::Setup_test_volume/Exec[losetup /dev/loop2 /var/lib/cinder/cinder-volumes]: Scheduling refresh of Exec[pvcreate /dev/loop2]", "Notice: /Stage[main]/Cinder::Setup_test_volume/Exec[pvcreate /dev/loop2]: Triggered 'refresh' from 1 events", "Info: /Stage[main]/Cinder::Setup_test_volume/Exec[pvcreate /dev/loop2]: Scheduling refresh of Exec[vgcreate cinder-volumes /dev/loop2]", "Notice: /Stage[main]/Cinder::Setup_test_volume/Exec[vgcreate cinder-volumes /dev/loop2]: Triggered 'refresh' from 1 events", "Notice: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_version]/ensure: created", "Info: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_version]: Scheduling refresh of Service[cinder-api]", "Info: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_version]: Scheduling refresh of Exec[cinder-manage db_sync]", "Info: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_version]: Scheduling refresh of Service[cinder-scheduler]", "Info: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_version]: Scheduling refresh of Service[cinder-volume]", "Notice: /Stage[main]/Neutron::Agents::Dhcp/Neutron_dhcp_agent_config[DEFAULT/state_path]/ensure: created", "Info: /Stage[main]/Neutron::Agents::Dhcp/Neutron_dhcp_agent_config[DEFAULT/state_path]: Scheduling refresh of Service[neutron-dhcp-service]", "Notice: /Stage[main]/Glance::Registry/Glance_registry_config[keystone_authtoken/identity_uri]/ensure: created", "Info: /Stage[main]/Glance::Registry/Glance_registry_config[keystone_authtoken/identity_uri]: Scheduling refresh of Service[glance-registry]", "Info: /Stage[main]/Glance::Registry/Glance_registry_config[keystone_authtoken/identity_uri]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Glance::Api/Glance_cache_config[DEFAULT/registry_host]/ensure: created", "Info: /Stage[main]/Glance::Api/Glance_cache_config[DEFAULT/registry_host]: Scheduling refresh of Service[glance-api]", "Info: 
/Stage[main]/Glance::Api/Glance_cache_config[DEFAULT/registry_host]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Service[target]/ensure: ensure changed 'stopped' to 'running'", "Info: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Service[target]: Unscheduling refresh on Service[target]", "Notice: /Stage[main]/Swift/Package[swift]/ensure: created", "Info: /Stage[main]/Swift/Package[swift]: Scheduling refresh of Anchor[keystone::service::end]", "Notice: /Stage[main]/Swift/File[/var/lib/swift]/group: group changed 'root' to 'swift'", "Notice: /Stage[main]/Swift/File[/etc/swift]/owner: owner changed 'root' to 'swift'", "Notice: /Stage[main]/Swift/File[/etc/swift]/group: group changed 'root' to 'swift'", "Notice: /Stage[main]/Swift/File[/etc/swift/swift.conf]/owner: owner changed 'root' to 'swift'", "Notice: /Stage[main]/Swift/Swift_config[swift-constraints/max_header_size]/ensure: created", "Info: /Stage[main]/Swift/Swift_config[swift-constraints/max_header_size]: Scheduling refresh of Service[swift-proxy-server]", "Info: /Stage[main]/Swift/Swift_config[swift-constraints/max_header_size]: Scheduling refresh of Service[swift-account-reaper]", "Info: /Stage[main]/Swift/Swift_config[swift-constraints/max_header_size]: Scheduling refresh of Service[swift-container-updater]", "Info: /Stage[main]/Swift/Swift_config[swift-constraints/max_header_size]: Scheduling refresh of Service[swift-object-updater]", "Info: /Stage[main]/Swift/Swift_config[swift-constraints/max_header_size]: Scheduling refresh of Service[swift-account-server]", "Info: /Stage[main]/Swift/Swift_config[swift-constraints/max_header_size]: Scheduling refresh of Service[swift-account-auditor]", "Info: /Stage[main]/Swift/Swift_config[swift-constraints/max_header_size]: Scheduling refresh of Service[swift-account-replicator]", "Info: /Stage[main]/Swift/Swift_config[swift-constraints/max_header_size]: Scheduling refresh of Service[swift-container-server]", "Info: /Stage[main]/Swift/Swift_config[swift-constraints/max_header_size]: Scheduling refresh of Service[swift-container-auditor]", "Info: /Stage[main]/Swift/Swift_config[swift-constraints/max_header_size]: Scheduling refresh of Service[swift-container-replicator]", "Info: /Stage[main]/Swift/Swift_config[swift-constraints/max_header_size]: Scheduling refresh of Service[swift-object-server]", "Info: /Stage[main]/Swift/Swift_config[swift-constraints/max_header_size]: Scheduling refresh of Service[swift-object-auditor]", "Info: /Stage[main]/Swift/Swift_config[swift-constraints/max_header_size]: Scheduling refresh of Service[swift-object-replicator]", "Notice: /Stage[main]/Openstack_integration::Swift/File[/srv/node]/ensure: created", "Notice: /Stage[main]/Swift/Swift_config[swift-hash/swift_hash_path_suffix]/value: value changed '%SWIFT_HASH_PATH_SUFFIX%' to 'secrete'", "Info: /Stage[main]/Swift/Swift_config[swift-hash/swift_hash_path_suffix]: Scheduling refresh of Service[swift-proxy-server]", "Info: /Stage[main]/Swift/Swift_config[swift-hash/swift_hash_path_suffix]: Scheduling refresh of Service[swift-account-reaper]", "Info: /Stage[main]/Swift/Swift_config[swift-hash/swift_hash_path_suffix]: Scheduling refresh of Service[swift-container-updater]", "Info: /Stage[main]/Swift/Swift_config[swift-hash/swift_hash_path_suffix]: Scheduling refresh of Service[swift-object-updater]", "Info: /Stage[main]/Swift/Swift_config[swift-hash/swift_hash_path_suffix]: Scheduling 
refresh of Service[swift-account-server]", "Info: /Stage[main]/Swift/Swift_config[swift-hash/swift_hash_path_suffix]: Scheduling refresh of Service[swift-account-auditor]", "Info: /Stage[main]/Swift/Swift_config[swift-hash/swift_hash_path_suffix]: Scheduling refresh of Service[swift-account-replicator]", "Info: /Stage[main]/Swift/Swift_config[swift-hash/swift_hash_path_suffix]: Scheduling refresh of Service[swift-container-server]", "Info: /Stage[main]/Swift/Swift_config[swift-hash/swift_hash_path_suffix]: Scheduling refresh of Service[swift-container-auditor]", "Info: /Stage[main]/Swift/Swift_config[swift-hash/swift_hash_path_suffix]: Scheduling refresh of Service[swift-container-replicator]", "Info: /Stage[main]/Swift/Swift_config[swift-hash/swift_hash_path_suffix]: Scheduling refresh of Service[swift-object-server]", "Info: /Stage[main]/Swift/Swift_config[swift-hash/swift_hash_path_suffix]: Scheduling refresh of Service[swift-object-auditor]", "Info: /Stage[main]/Swift/Swift_config[swift-hash/swift_hash_path_suffix]: Scheduling refresh of Service[swift-object-replicator]", "Notice: /Stage[main]/Swift/File[/var/run/swift]/group: group changed 'root' to 'swift'", "Notice: /Stage[main]/Swift::Ringbuilder/Swift::Ringbuilder::Create[container]/Exec[create_container]/returns: executed successfully", "Notice: /Stage[main]/Swift::Ringbuilder/Swift::Ringbuilder::Create[object]/Exec[create_object]/returns: executed successfully", "Notice: /Stage[main]/Openstack_integration::Swift/Ring_object_device[127.0.0.1:6000/3]/ensure: created", "Info: /Stage[main]/Openstack_integration::Swift/Ring_object_device[127.0.0.1:6000/3]: Scheduling refresh of Swift::Ringbuilder::Rebalance[object]", "Notice: /Stage[main]/Openstack_integration::Swift/Ring_container_device[127.0.0.1:6001/3]/ensure: created", "Info: /Stage[main]/Openstack_integration::Swift/Ring_container_device[127.0.0.1:6001/3]: Scheduling refresh of Swift::Ringbuilder::Rebalance[container]", "Notice: /Stage[main]/Openstack_integration::Swift/Ring_container_device[127.0.0.1:6001/2]/ensure: created", "Info: /Stage[main]/Openstack_integration::Swift/Ring_container_device[127.0.0.1:6001/2]: Scheduling refresh of Swift::Ringbuilder::Rebalance[container]", "Notice: /Stage[main]/Openstack_integration::Swift/Ring_object_device[127.0.0.1:6000/2]/ensure: created", "Info: /Stage[main]/Openstack_integration::Swift/Ring_object_device[127.0.0.1:6000/2]: Scheduling refresh of Swift::Ringbuilder::Rebalance[object]", "Notice: /Stage[main]/Neutron::Db::Mysql/Openstacklib::Db::Mysql[neutron]/Openstacklib::Db::Mysql::Host_access[neutron_127.0.0.1]/Mysql_grant[neutron@127.0.0.1/neutron.*]/ensure: created", "Info: Openstacklib::Db::Mysql[neutron]: Scheduling refresh of Service[neutron-server]", "Info: Openstacklib::Db::Mysql[neutron]: Scheduling refresh of Exec[neutron-db-sync]", "Notice: /Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/debug]/ensure: created", "Info: /Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/debug]: Scheduling refresh of Service[glance-registry]", "Info: /Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/debug]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Glance::Api/Glance_api_config[glance_store/stores]/ensure: created", "Info: /Stage[main]/Glance::Api/Glance_api_config[glance_store/stores]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Glance::Api/Glance_api_config[glance_store/stores]: Scheduling refresh of Exec[glance-manage 
db_sync]", "Notice: /Stage[main]/Neutron::Plugins::Ml2/Package[neutron-plugin-ml2]/ensure: created", "Info: /Stage[main]/Neutron::Plugins::Ml2/Package[neutron-plugin-ml2]: Scheduling refresh of Anchor[keystone::service::end]", "Notice: /Stage[main]/Neutron::Plugins::Ml2/Neutron::Plugins::Ml2::Type_driver[vxlan]/Neutron_plugin_ml2[ml2_type_vxlan/vni_ranges]/ensure: created", "Info: /Stage[main]/Neutron::Plugins::Ml2/Neutron::Plugins::Ml2::Type_driver[vxlan]/Neutron_plugin_ml2[ml2_type_vxlan/vni_ranges]: Scheduling refresh of Service[neutron-server]", "Info: /Stage[main]/Neutron::Plugins::Ml2/Neutron::Plugins::Ml2::Type_driver[vxlan]/Neutron_plugin_ml2[ml2_type_vxlan/vni_ranges]: Scheduling refresh of Exec[neutron-db-sync]", "Notice: /Stage[main]/Neutron::Plugins::Ml2/Neutron_plugin_ml2[ml2/type_drivers]/ensure: created", "Info: /Stage[main]/Neutron::Plugins::Ml2/Neutron_plugin_ml2[ml2/type_drivers]: Scheduling refresh of Service[neutron-server]", "Info: /Stage[main]/Neutron::Plugins::Ml2/Neutron_plugin_ml2[ml2/type_drivers]: Scheduling refresh of Exec[neutron-db-sync]", "Notice: /Stage[main]/Neutron::Plugins::Ml2/Neutron_plugin_ml2[ml2/path_mtu]/ensure: created", "Info: /Stage[main]/Neutron::Plugins::Ml2/Neutron_plugin_ml2[ml2/path_mtu]: Scheduling refresh of Service[neutron-server]", "Info: /Stage[main]/Neutron::Plugins::Ml2/Neutron_plugin_ml2[ml2/path_mtu]: Scheduling refresh of Exec[neutron-db-sync]", "Notice: /Stage[main]/Neutron::Plugins::Ml2/Neutron_plugin_ml2[ml2/tenant_network_types]/ensure: created", "Info: /Stage[main]/Neutron::Plugins::Ml2/Neutron_plugin_ml2[ml2/tenant_network_types]: Scheduling refresh of Service[neutron-server]", "Info: /Stage[main]/Neutron::Plugins::Ml2/Neutron_plugin_ml2[ml2/tenant_network_types]: Scheduling refresh of Exec[neutron-db-sync]", "Notice: /Stage[main]/Neutron::Plugins::Ml2/Neutron_plugin_ml2[ml2/mechanism_drivers]/ensure: created", "Info: /Stage[main]/Neutron::Plugins::Ml2/Neutron_plugin_ml2[ml2/mechanism_drivers]: Scheduling refresh of Service[neutron-server]", "Info: /Stage[main]/Neutron::Plugins::Ml2/Neutron_plugin_ml2[ml2/mechanism_drivers]: Scheduling refresh of Exec[neutron-db-sync]", "Notice: /Stage[main]/Neutron::Plugins::Ml2/File[/etc/default/neutron-server]/ensure: created", "Notice: /Stage[main]/Neutron::Plugins::Ml2/File[/etc/neutron/plugin.ini]/ensure: created", "Notice: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/nova_catalog_admin_info]/ensure: created", "Info: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/nova_catalog_admin_info]: Scheduling refresh of Service[cinder-api]", "Info: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/nova_catalog_admin_info]: Scheduling refresh of Exec[cinder-manage db_sync]", "Info: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/nova_catalog_admin_info]: Scheduling refresh of Service[cinder-scheduler]", "Info: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/nova_catalog_admin_info]: Scheduling refresh of Service[cinder-volume]", "Notice: /Stage[main]/Ironic/Ironic_config[DEFAULT/control_exchange]/ensure: created", "Info: /Stage[main]/Ironic/Ironic_config[DEFAULT/control_exchange]: Scheduling refresh of Service[httpd]", "Info: /Stage[main]/Ironic/Ironic_config[DEFAULT/control_exchange]: Scheduling refresh of Service[ironic-conductor]", "Notice: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_password]/ensure: created", "Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_password]: Scheduling refresh of Service[cinder-api]", "Info: 
/Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_password]: Scheduling refresh of Exec[cinder-manage db_sync]", "Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_password]: Scheduling refresh of Service[cinder-scheduler]", "Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_password]: Scheduling refresh of Service[cinder-volume]", "Notice: /Stage[main]/Glance::Cache::Logging/Glance_cache_config[DEFAULT/debug]/ensure: created", "Info: /Stage[main]/Glance::Cache::Logging/Glance_cache_config[DEFAULT/debug]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Glance::Cache::Logging/Glance_cache_config[DEFAULT/debug]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]/ensure: created", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]: Scheduling refresh of Service[neutron-server]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]: Scheduling refresh of Exec[neutron-db-sync]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]: Scheduling refresh of Service[neutron-metadata]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]: Scheduling refresh of Service[neutron-lbaas-service]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]: Scheduling refresh of Service[neutron-l3]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]: Scheduling refresh of Service[neutron-dhcp-service]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]: Scheduling refresh of Service[neutron-metering-service]", "Notice: /Stage[main]/Openstack_integration::Swift/Ring_container_device[127.0.0.1:6001/1]/ensure: created", "Info: /Stage[main]/Openstack_integration::Swift/Ring_container_device[127.0.0.1:6001/1]: Scheduling refresh of Swift::Ringbuilder::Rebalance[container]", "Info: Swift::Ringbuilder::Rebalance[container]: Scheduling refresh of Exec[rebalance_container]", "Notice: /Stage[main]/Swift::Ringbuilder/Swift::Ringbuilder::Rebalance[container]/Exec[rebalance_container]: Triggered 'refresh' from 1 events", "Notice: /Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/verbose]/ensure: created", "Info: /Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/verbose]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/verbose]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Ironic::Logging/Ironic_config[DEFAULT/verbose]/ensure: created", "Info: /Stage[main]/Ironic::Logging/Ironic_config[DEFAULT/verbose]: Scheduling refresh of Service[httpd]", "Info: /Stage[main]/Ironic::Logging/Ironic_config[DEFAULT/verbose]: Scheduling refresh of Service[ironic-conductor]", "Notice: /Stage[main]/Cinder::Db::Mysql/Openstacklib::Db::Mysql[cinder]/Openstacklib::Db::Mysql::Host_access[cinder_127.0.0.1]/Mysql_user[cinder@127.0.0.1]/ensure: created", "Notice: /Stage[main]/Cinder::Db::Mysql/Openstacklib::Db::Mysql[cinder]/Openstacklib::Db::Mysql::Host_access[cinder_127.0.0.1]/Mysql_grant[cinder@127.0.0.1/cinder.*]/ensure: created", "Info: Openstacklib::Db::Mysql[cinder]: Scheduling refresh of Exec[cinder-manage db_sync]", "Notice: 
/Stage[main]/Nova::Conductor/Nova::Generic_service[conductor]/Package[nova-conductor]/ensure: created", "Info: /Stage[main]/Nova::Conductor/Nova::Generic_service[conductor]/Package[nova-conductor]: Scheduling refresh of Anchor[keystone::service::end]", "Info: /Stage[main]/Nova::Conductor/Nova::Generic_service[conductor]/Package[nova-conductor]: Scheduling refresh of Anchor[nova::install::end]", "Notice: /Stage[main]/Openstack_integration::Swift/Ring_object_device[127.0.0.1:6000/1]/ensure: created", "Info: /Stage[main]/Openstack_integration::Swift/Ring_object_device[127.0.0.1:6000/1]: Scheduling refresh of Swift::Ringbuilder::Rebalance[object]", "Info: Swift::Ringbuilder::Rebalance[object]: Scheduling refresh of Exec[rebalance_object]", "Notice: /Stage[main]/Swift::Ringbuilder/Swift::Ringbuilder::Rebalance[object]/Exec[rebalance_object]: Triggered 'refresh' from 1 events", "Notice: /Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/verbose]/ensure: created", "Info: /Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/verbose]: Scheduling refresh of Service[glance-registry]", "Info: /Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/verbose]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Ironic::Conductor/Package[ironic-conductor]/ensure: created", "Info: /Stage[main]/Ironic::Conductor/Package[ironic-conductor]: Scheduling refresh of Anchor[keystone::service::end]", "Info: /Stage[main]/Ironic::Conductor/Package[ironic-conductor]: Scheduling refresh of Exec[ironic-dbsync]", "Notice: /Stage[main]/Ironic::Conductor/Ironic_config[conductor/max_time_interval]/ensure: created", "Info: /Stage[main]/Ironic::Conductor/Ironic_config[conductor/max_time_interval]: Scheduling refresh of Service[httpd]", "Info: /Stage[main]/Ironic::Conductor/Ironic_config[conductor/max_time_interval]: Scheduling refresh of Service[ironic-conductor]", "Notice: /Stage[main]/Ironic::Conductor/Ironic_config[conductor/force_power_state_during_sync]/ensure: created", "Info: /Stage[main]/Ironic::Conductor/Ironic_config[conductor/force_power_state_during_sync]: Scheduling refresh of Service[httpd]", "Info: /Stage[main]/Ironic::Conductor/Ironic_config[conductor/force_power_state_during_sync]: Scheduling refresh of Service[ironic-conductor]", "Notice: /Stage[main]/Nova::Vncproxy/Nova::Generic_service[vncproxy]/Package[nova-vncproxy]/ensure: created", "Info: /Stage[main]/Nova::Vncproxy/Nova::Generic_service[vncproxy]/Package[nova-vncproxy]: Scheduling refresh of Anchor[keystone::service::end]", "Info: /Stage[main]/Nova::Vncproxy/Nova::Generic_service[vncproxy]/Package[nova-vncproxy]: Scheduling refresh of Anchor[nova::install::end]", "Notice: /Stage[main]/Cinder/Cinder_config[DEFAULT/rpc_backend]/ensure: created", "Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/rpc_backend]: Scheduling refresh of Service[cinder-api]", "Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/rpc_backend]: Scheduling refresh of Exec[cinder-manage db_sync]", "Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/rpc_backend]: Scheduling refresh of Service[cinder-scheduler]", "Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/rpc_backend]: Scheduling refresh of Service[cinder-volume]", "Notice: /Stage[main]/Ironic/Ironic_config[glance/glance_num_retries]/ensure: created", "Info: /Stage[main]/Ironic/Ironic_config[glance/glance_num_retries]: Scheduling refresh of Service[httpd]", "Info: /Stage[main]/Ironic/Ironic_config[glance/glance_num_retries]: Scheduling refresh of 
Service[ironic-conductor]", "Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat[/etc/swift/container-server.conf]/File[/var/lib/puppet/concat/_etc_swift_container-server.conf]/ensure: created", "Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat[/etc/swift/container-server.conf]/File[/var/lib/puppet/concat/_etc_swift_container-server.conf]: Scheduling refresh of Exec[concat_/etc/swift/container-server.conf]", "Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat[/etc/swift/container-server.conf]/File[/var/lib/puppet/concat/_etc_swift_container-server.conf/fragments]/ensure: created", "Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat[/etc/swift/container-server.conf]/File[/var/lib/puppet/concat/_etc_swift_container-server.conf/fragments]: Scheduling refresh of Exec[concat_/etc/swift/container-server.conf]", "Notice: /Stage[main]/Openstack_integration::Swift/Swift::Storage::Filter::Recon[container]/Concat::Fragment[swift_recon_container]/File[/var/lib/puppet/concat/_etc_swift_container-server.conf/fragments/35_swift_recon_container]/ensure: defined content as '{md5}d847d2d529a3596ed6a74d841d790dc7'", "Info: /Stage[main]/Openstack_integration::Swift/Swift::Storage::Filter::Recon[container]/Concat::Fragment[swift_recon_container]/File[/var/lib/puppet/concat/_etc_swift_container-server.conf/fragments/35_swift_recon_container]: Scheduling refresh of Exec[concat_/etc/swift/container-server.conf]", "Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat[/etc/swift/container-server.conf]/File[/var/lib/puppet/concat/_etc_swift_container-server.conf/fragments.concat]/ensure: created", "Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat[/etc/swift/account-server.conf]/File[/var/lib/puppet/concat/_etc_swift_account-server.conf]/ensure: created", "Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat[/etc/swift/account-server.conf]/File[/var/lib/puppet/concat/_etc_swift_account-server.conf]: Scheduling refresh of Exec[concat_/etc/swift/account-server.conf]", "Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat[/etc/swift/account-server.conf]/File[/var/lib/puppet/concat/_etc_swift_account-server.conf/fragments.concat]/ensure: created", "Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat[/etc/swift/account-server.conf]/File[/var/lib/puppet/concat/_etc_swift_account-server.conf/fragments.concat.out]/ensure: created", "Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat[/etc/swift/account-server.conf]/File[/var/lib/puppet/concat/_etc_swift_account-server.conf/fragments]/ensure: created", "Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat[/etc/swift/account-server.conf]/File[/var/lib/puppet/concat/_etc_swift_account-server.conf/fragments]: Scheduling refresh of Exec[concat_/etc/swift/account-server.conf]", "Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat::Fragment[swift-account-6002]/File[/var/lib/puppet/concat/_etc_swift_account-server.conf/fragments/00_swift-account-6002]/ensure: defined content as '{md5}666661f3805b49b4682cc11f80dad508'", "Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat::Fragment[swift-account-6002]/File[/var/lib/puppet/concat/_etc_swift_account-server.conf/fragments/00_swift-account-6002]: Scheduling refresh of Exec[concat_/etc/swift/account-server.conf]", "Notice: 
/Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat[/etc/swift/container-server.conf]/File[/var/lib/puppet/concat/_etc_swift_container-server.conf/fragments.concat.out]/ensure: created", "Notice: /Stage[main]/Tempest/Tempest_config[service_broker/run_service_broker_tests]/ensure: created", "Notice: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/iscsi_helper]/ensure: created", "Info: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/iscsi_helper]: Scheduling refresh of Service[cinder-api]", "Info: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/iscsi_helper]: Scheduling refresh of Exec[cinder-manage db_sync]", "Info: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/iscsi_helper]: Scheduling refresh of Service[cinder-scheduler]", "Info: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/iscsi_helper]: Scheduling refresh of Service[cinder-volume]", "Notice: /Stage[main]/Tempest/Tempest_config[DEFAULT/debug]/ensure: created", "Notice: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[DEFAULT/notification_driver]/ensure: created", "Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[DEFAULT/notification_driver]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[DEFAULT/notification_driver]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Neutron::Agents::Metering/Neutron_metering_agent_config[DEFAULT/interface_driver]/ensure: created", "Info: /Stage[main]/Neutron::Agents::Metering/Neutron_metering_agent_config[DEFAULT/interface_driver]: Scheduling refresh of Service[neutron-metering-service]", "Notice: /Stage[main]/Rsync::Server/Concat[/etc/rsync.conf]/File[/var/lib/puppet/concat/_etc_rsync.conf]/ensure: created", "Info: /Stage[main]/Rsync::Server/Concat[/etc/rsync.conf]/File[/var/lib/puppet/concat/_etc_rsync.conf]: Scheduling refresh of Exec[concat_/etc/rsync.conf]", "Notice: /Stage[main]/Rsync::Server/Concat[/etc/rsync.conf]/File[/var/lib/puppet/concat/_etc_rsync.conf/fragments.concat.out]/ensure: created", "Notice: /Stage[main]/Rsync::Server/Concat[/etc/rsync.conf]/File[/var/lib/puppet/concat/_etc_rsync.conf/fragments]/ensure: created", "Info: /Stage[main]/Rsync::Server/Concat[/etc/rsync.conf]/File[/var/lib/puppet/concat/_etc_rsync.conf/fragments]: Scheduling refresh of Exec[concat_/etc/rsync.conf]", "Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Rsync::Server::Module[container]/Concat::Fragment[frag-container]/File[/var/lib/puppet/concat/_etc_rsync.conf/fragments/10_container_frag-container]/ensure: defined content as '{md5}7dd5f706fbeccaf9a45d40737af512ac'", "Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Rsync::Server::Module[container]/Concat::Fragment[frag-container]/File[/var/lib/puppet/concat/_etc_rsync.conf/fragments/10_container_frag-container]: Scheduling refresh of Exec[concat_/etc/rsync.conf]", "Notice: /Stage[main]/Tempest/Tempest_config[service_available/neutron]/ensure: created", "Notice: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/identity_uri]/ensure: created", "Info: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/identity_uri]: Scheduling refresh of Service[cinder-api]", "Info: 
/Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/identity_uri]: Scheduling refresh of Exec[cinder-manage db_sync]", "Info: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/identity_uri]: Scheduling refresh of Service[cinder-scheduler]", "Info: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/identity_uri]: Scheduling refresh of Service[cinder-volume]", "Notice: /Stage[main]/Glance::Backend::Swift/Glance_swift_config[ref1/key]/ensure: created", "Notice: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_userid]/ensure: created", "Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_userid]: Scheduling refresh of Service[cinder-api]", "Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_userid]: Scheduling refresh of Exec[cinder-manage db_sync]", "Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_userid]: Scheduling refresh of Service[cinder-scheduler]", "Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_userid]: Scheduling refresh of Service[cinder-volume]", "Notice: /Stage[main]/Ironic::Api/Ironic_config[keystone_authtoken/auth_uri]/ensure: created", "Info: /Stage[main]/Ironic::Api/Ironic_config[keystone_authtoken/auth_uri]: Scheduling refresh of Service[httpd]", "Info: /Stage[main]/Ironic::Api/Ironic_config[keystone_authtoken/auth_uri]: Scheduling refresh of Service[ironic-conductor]", "Notice: /Stage[main]/Ironic::Api/Package[ironic-api]/ensure: created", "Info: /Stage[main]/Ironic::Api/Package[ironic-api]: Scheduling refresh of Anchor[keystone::service::end]", "Info: /Stage[main]/Ironic::Api/Package[ironic-api]: Scheduling refresh of Exec[ironic-dbsync]", "Notice: /Stage[main]/Ironic::Api/Ironic_config[keystone_authtoken/admin_tenant_name]/ensure: created", "Info: /Stage[main]/Ironic::Api/Ironic_config[keystone_authtoken/admin_tenant_name]: Scheduling refresh of Service[httpd]", "Info: /Stage[main]/Ironic::Api/Ironic_config[keystone_authtoken/admin_tenant_name]: Scheduling refresh of Service[ironic-conductor]", "Notice: /Stage[main]/Ironic::Api/Ironic_config[api/host_ip]/ensure: created", "Info: /Stage[main]/Ironic::Api/Ironic_config[api/host_ip]: Scheduling refresh of Service[httpd]", "Info: /Stage[main]/Ironic::Api/Ironic_config[api/host_ip]: Scheduling refresh of Service[ironic-conductor]", "Notice: /Stage[main]/Ironic::Api/Ironic_config[keystone_authtoken/admin_password]/ensure: created", "Info: /Stage[main]/Ironic::Api/Ironic_config[keystone_authtoken/admin_password]: Scheduling refresh of Service[httpd]", "Info: /Stage[main]/Ironic::Api/Ironic_config[keystone_authtoken/admin_password]: Scheduling refresh of Service[ironic-conductor]", "Notice: /Stage[main]/Ironic::Api/Ironic_config[keystone_authtoken/identity_uri]/ensure: created", "Info: /Stage[main]/Ironic::Api/Ironic_config[keystone_authtoken/identity_uri]: Scheduling refresh of Service[httpd]", "Info: /Stage[main]/Ironic::Api/Ironic_config[keystone_authtoken/identity_uri]: Scheduling refresh of Service[ironic-conductor]", "Notice: /Stage[main]/Ironic::Api/Ironic_config[neutron/url]/ensure: created", "Info: /Stage[main]/Ironic::Api/Ironic_config[neutron/url]: Scheduling refresh of Service[httpd]", "Info: /Stage[main]/Ironic::Api/Ironic_config[neutron/url]: Scheduling refresh of Service[ironic-conductor]", "Notice: /Stage[main]/Ironic::Api/Ironic_config[keystone_authtoken/admin_user]/ensure: created", "Info: /Stage[main]/Ironic::Api/Ironic_config[keystone_authtoken/admin_user]: Scheduling refresh of Service[httpd]", 
"Info: /Stage[main]/Ironic::Api/Ironic_config[keystone_authtoken/admin_user]: Scheduling refresh of Service[ironic-conductor]", "Notice: /Stage[main]/Ironic::Api/Ironic_config[api/max_limit]/ensure: created", "Info: /Stage[main]/Ironic::Api/Ironic_config[api/max_limit]: Scheduling refresh of Service[httpd]", "Info: /Stage[main]/Ironic::Api/Ironic_config[api/max_limit]: Scheduling refresh of Service[ironic-conductor]", "Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat[/etc/swift/object-server.conf]/File[/var/lib/puppet/concat/_etc_swift_object-server.conf]/ensure: created", "Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat[/etc/swift/object-server.conf]/File[/var/lib/puppet/concat/_etc_swift_object-server.conf]: Scheduling refresh of Exec[concat_/etc/swift/object-server.conf]", "Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat[/etc/swift/object-server.conf]/File[/var/lib/puppet/concat/_etc_swift_object-server.conf/fragments]/ensure: created", "Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat[/etc/swift/object-server.conf]/File[/var/lib/puppet/concat/_etc_swift_object-server.conf/fragments]: Scheduling refresh of Exec[concat_/etc/swift/object-server.conf]", "Notice: /Stage[main]/Openstack_integration::Swift/Swift::Storage::Filter::Healthcheck[object]/Concat::Fragment[swift_healthcheck_object]/File[/var/lib/puppet/concat/_etc_swift_object-server.conf/fragments/25_swift_healthcheck_object]/ensure: defined content as '{md5}8c92056c41082619d179f88ea15c5fc6'", "Info: /Stage[main]/Openstack_integration::Swift/Swift::Storage::Filter::Healthcheck[object]/Concat::Fragment[swift_healthcheck_object]/File[/var/lib/puppet/concat/_etc_swift_object-server.conf/fragments/25_swift_healthcheck_object]: Scheduling refresh of Exec[concat_/etc/swift/object-server.conf]", "Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat[/etc/swift/object-server.conf]/File[/var/lib/puppet/concat/_etc_swift_object-server.conf/fragments.concat.out]/ensure: created", "Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat::Fragment[swift-object-6000]/File[/var/lib/puppet/concat/_etc_swift_object-server.conf/fragments/00_swift-object-6000]/ensure: defined content as '{md5}f5bb62f4798612b143fc441befa50ecc'", "Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat::Fragment[swift-object-6000]/File[/var/lib/puppet/concat/_etc_swift_object-server.conf/fragments/00_swift-object-6000]: Scheduling refresh of Exec[concat_/etc/swift/object-server.conf]", "Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Rsync::Server::Module[object]/Concat::Fragment[frag-object]/File[/var/lib/puppet/concat/_etc_rsync.conf/fragments/10_object_frag-object]/ensure: defined content as '{md5}d0ecd24502eb0f9cd5c387b2e1e32943'", "Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Rsync::Server::Module[object]/Concat::Fragment[frag-object]/File[/var/lib/puppet/concat/_etc_rsync.conf/fragments/10_object_frag-object]: Scheduling refresh of Exec[concat_/etc/rsync.conf]", "Notice: /Stage[main]/Swift::Ringbuilder/Swift::Ringbuilder::Create[account]/Exec[create_account]/returns: executed successfully", "Notice: /Stage[main]/Openstack_integration::Swift/Ring_account_device[127.0.0.1:6002/1]/ensure: created", "Info: /Stage[main]/Openstack_integration::Swift/Ring_account_device[127.0.0.1:6002/1]: Scheduling refresh of Swift::Ringbuilder::Rebalance[account]", 
"Notice: /Stage[main]/Openstack_integration::Swift/Ring_account_device[127.0.0.1:6002/3]/ensure: created", "Info: /Stage[main]/Openstack_integration::Swift/Ring_account_device[127.0.0.1:6002/3]: Scheduling refresh of Swift::Ringbuilder::Rebalance[account]", "Notice: /Stage[main]/Openstack_integration::Swift/Ring_account_device[127.0.0.1:6002/2]/ensure: created", "Info: /Stage[main]/Openstack_integration::Swift/Ring_account_device[127.0.0.1:6002/2]: Scheduling refresh of Swift::Ringbuilder::Rebalance[account]", "Info: Swift::Ringbuilder::Rebalance[account]: Scheduling refresh of Exec[rebalance_account]", "Notice: /Stage[main]/Swift::Ringbuilder/Swift::Ringbuilder::Rebalance[account]/Exec[rebalance_account]: Triggered 'refresh' from 1 events", "Notice: /Stage[main]/Ironic::Logging/Ironic_config[DEFAULT/debug]/ensure: created", "Info: /Stage[main]/Ironic::Logging/Ironic_config[DEFAULT/debug]: Scheduling refresh of Service[httpd]", "Info: /Stage[main]/Ironic::Logging/Ironic_config[DEFAULT/debug]: Scheduling refresh of Service[ironic-conductor]", "Notice: /Stage[main]/Keystone::Client/Package[python-keystoneclient]/ensure: created", "Info: /Stage[main]/Keystone::Client/Package[python-keystoneclient]: Scheduling refresh of Anchor[keystone::service::end]", "Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat::Fragment[swift-container-6001]/File[/var/lib/puppet/concat/_etc_swift_container-server.conf/fragments/00_swift-container-6001]/ensure: defined content as '{md5}26d25a9fa3702760a9fc42a4a2bd22c2'", "Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat::Fragment[swift-container-6001]/File[/var/lib/puppet/concat/_etc_swift_container-server.conf/fragments/00_swift-container-6001]: Scheduling refresh of Exec[concat_/etc/swift/container-server.conf]", "Notice: /Stage[main]/Openstack_integration::Swift/Swift::Storage::Filter::Healthcheck[account]/Concat::Fragment[swift_healthcheck_account]/File[/var/lib/puppet/concat/_etc_swift_account-server.conf/fragments/25_swift_healthcheck_account]/ensure: defined content as '{md5}8c92056c41082619d179f88ea15c5fc6'", "Info: /Stage[main]/Openstack_integration::Swift/Swift::Storage::Filter::Healthcheck[account]/Concat::Fragment[swift_healthcheck_account]/File[/var/lib/puppet/concat/_etc_swift_account-server.conf/fragments/25_swift_healthcheck_account]: Scheduling refresh of Exec[concat_/etc/swift/account-server.conf]", "Notice: /Stage[main]/Neutron::Agents::Ml2::Ovs/Package[neutron-ovs-agent]/ensure: created", "Info: /Stage[main]/Neutron::Agents::Ml2::Ovs/Package[neutron-ovs-agent]: Scheduling refresh of Anchor[keystone::service::end]", "Info: /Stage[main]/Neutron::Agents::Ml2::Ovs/Package[neutron-ovs-agent]: Scheduling refresh of Exec[neutron-db-sync]", "Info: /Stage[main]/Neutron::Agents::Ml2::Ovs/Package[neutron-ovs-agent]: Scheduling refresh of Service[neutron-ovs-agent-service]", "Notice: /Stage[main]/Neutron::Agents::Ml2::Ovs/Neutron_agent_ovs[ovs/local_ip]/ensure: created", "Info: /Stage[main]/Neutron::Agents::Ml2::Ovs/Neutron_agent_ovs[ovs/local_ip]: Scheduling refresh of Service[neutron-ovs-agent-service]", "Notice: /Stage[main]/Neutron::Agents::Ml2::Ovs/Neutron_agent_ovs[ovs/tunnel_bridge]/ensure: created", "Info: /Stage[main]/Neutron::Agents::Ml2::Ovs/Neutron_agent_ovs[ovs/tunnel_bridge]: Scheduling refresh of Service[neutron-ovs-agent-service]", "Notice: /Stage[main]/Neutron::Agents::Ml2::Ovs/Neutron_agent_ovs[ovs/enable_tunneling]/ensure: created", "Info: 
/Stage[main]/Neutron::Agents::Ml2::Ovs/Neutron_agent_ovs[ovs/enable_tunneling]: Scheduling refresh of Service[neutron-ovs-agent-service]", "Notice: /Stage[main]/Neutron::Agents::Ml2::Ovs/Neutron_agent_ovs[ovs/integration_bridge]/ensure: created", "Info: /Stage[main]/Neutron::Agents::Ml2::Ovs/Neutron_agent_ovs[ovs/integration_bridge]: Scheduling refresh of Service[neutron-ovs-agent-service]", "Notice: /Stage[main]/Neutron::Agents::Ml2::Ovs/Neutron_agent_ovs[securitygroup/firewall_driver]/ensure: created", "Info: /Stage[main]/Neutron::Agents::Ml2::Ovs/Neutron_agent_ovs[securitygroup/firewall_driver]: Scheduling refresh of Service[neutron-ovs-agent-service]", "Notice: /Stage[main]/Neutron::Agents::Ml2::Ovs/Service[ovs-cleanup-service]/enable: enable changed 'false' to 'true'", "Notice: /Stage[main]/Neutron::Agents::Ml2::Ovs/Neutron_agent_ovs[agent/drop_flows_on_start]/ensure: created", "Info: /Stage[main]/Neutron::Agents::Ml2::Ovs/Neutron_agent_ovs[agent/drop_flows_on_start]: Scheduling refresh of Service[neutron-ovs-agent-service]", "Notice: /Stage[main]/Neutron::Agents::Ml2::Ovs/Neutron_agent_ovs[agent/vxlan_udp_port]/ensure: created", "Info: /Stage[main]/Neutron::Agents::Ml2::Ovs/Neutron_agent_ovs[agent/vxlan_udp_port]: Scheduling refresh of Service[neutron-ovs-agent-service]", "Notice: /Stage[main]/Neutron::Agents::L3/Neutron_l3_agent_config[DEFAULT/agent_mode]/ensure: created", "Info: /Stage[main]/Neutron::Agents::L3/Neutron_l3_agent_config[DEFAULT/agent_mode]: Scheduling refresh of Service[neutron-l3]", "Notice: /Stage[main]/Neutron::Agents::L3/Neutron_l3_agent_config[DEFAULT/interface_driver]/ensure: created", "Info: /Stage[main]/Neutron::Agents::L3/Neutron_l3_agent_config[DEFAULT/interface_driver]: Scheduling refresh of Service[neutron-l3]", "Notice: /Stage[main]/Neutron::Agents::L3/Neutron_l3_agent_config[DEFAULT/debug]/ensure: created", "Info: /Stage[main]/Neutron::Agents::L3/Neutron_l3_agent_config[DEFAULT/debug]: Scheduling refresh of Service[neutron-l3]", "Notice: /Stage[main]/Neutron::Agents::Metadata/Neutron_metadata_agent_config[DEFAULT/metadata_proxy_shared_secret]/ensure: created", "Info: /Stage[main]/Neutron::Agents::Metadata/Neutron_metadata_agent_config[DEFAULT/metadata_proxy_shared_secret]: Scheduling refresh of Service[neutron-metadata]", "Notice: /Stage[main]/Ironic::Db::Mysql/Openstacklib::Db::Mysql[ironic]/Mysql_database[ironic]/ensure: created", "Notice: /Stage[main]/Neutron::Plugins::Ml2/Neutron::Plugins::Ml2::Type_driver[vxlan]/Neutron_plugin_ml2[ml2_type_vxlan/vxlan_group]/ensure: created", "Info: /Stage[main]/Neutron::Plugins::Ml2/Neutron::Plugins::Ml2::Type_driver[vxlan]/Neutron_plugin_ml2[ml2_type_vxlan/vxlan_group]: Scheduling refresh of Service[neutron-server]", "Info: /Stage[main]/Neutron::Plugins::Ml2/Neutron::Plugins::Ml2::Type_driver[vxlan]/Neutron_plugin_ml2[ml2_type_vxlan/vxlan_group]: Scheduling refresh of Exec[neutron-db-sync]", "Notice: /Stage[main]/Tempest/Tempest_config[DEFAULT/verbose]/ensure: created", "Notice: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/user_domain_id]/ensure: created", "Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/user_domain_id]: Scheduling refresh of Service[neutron-server]", "Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/user_domain_id]: Scheduling refresh of Exec[neutron-db-sync]", "Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/user_domain_id]: Scheduling refresh of Service[neutron-metadata]", "Info: 
/Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/user_domain_id]: Scheduling refresh of Service[neutron-lbaas-service]", "Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/user_domain_id]: Scheduling refresh of Service[neutron-l3]", "Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/user_domain_id]: Scheduling refresh of Service[neutron-dhcp-service]", "Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/user_domain_id]: Scheduling refresh of Service[neutron-metering-service]", "Notice: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_url]/ensure: created", "Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_url]: Scheduling refresh of Service[neutron-server]", "Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_url]: Scheduling refresh of Exec[neutron-db-sync]", "Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_url]: Scheduling refresh of Service[neutron-metadata]", "Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_url]: Scheduling refresh of Service[neutron-lbaas-service]", "Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_url]: Scheduling refresh of Service[neutron-l3]", "Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_url]: Scheduling refresh of Service[neutron-dhcp-service]", "Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_url]: Scheduling refresh of Service[neutron-metering-service]", "Notice: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/admin_user]/ensure: created", "Info: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/admin_user]: Scheduling refresh of Service[cinder-api]", "Info: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/admin_user]: Scheduling refresh of Exec[cinder-manage db_sync]", "Info: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/admin_user]: Scheduling refresh of Service[cinder-scheduler]", "Info: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/admin_user]: Scheduling refresh of Service[cinder-volume]", "Notice: /Stage[main]/Ironic::Api/Ironic_config[api/port]/ensure: created", "Info: /Stage[main]/Ironic::Api/Ironic_config[api/port]: Scheduling refresh of Service[httpd]", "Info: /Stage[main]/Ironic::Api/Ironic_config[api/port]: Scheduling refresh of Service[ironic-conductor]", "Notice: /Stage[main]/Rsync::Server/Xinetd::Service[rsync]/File[/etc/xinetd.d/rsync]/ensure: created", "Info: /Stage[main]/Rsync::Server/Xinetd::Service[rsync]/File[/etc/xinetd.d/rsync]: Scheduling refresh of Service[xinetd]", "Notice: /Stage[main]/Xinetd/Service[xinetd]/ensure: ensure changed 'stopped' to 'running'", "Info: /Stage[main]/Xinetd/Service[xinetd]: Unscheduling refresh on Service[xinetd]", "Info: Openstacklib::Db::Mysql[glance]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Openstack_integration::Swift/Swift::Storage::Filter::Recon[account]/Concat::Fragment[swift_recon_account]/File[/var/lib/puppet/concat/_etc_swift_account-server.conf/fragments/35_swift_recon_account]/ensure: defined content as '{md5}d847d2d529a3596ed6a74d841d790dc7'", "Info: /Stage[main]/Openstack_integration::Swift/Swift::Storage::Filter::Recon[account]/Concat::Fragment[swift_recon_account]/File[/var/lib/puppet/concat/_etc_swift_account-server.conf/fragments/35_swift_recon_account]: Scheduling refresh of Exec[concat_/etc/swift/account-server.conf]", "Notice: 
/Stage[main]/Glance::Registry/Glance_registry_config[keystone_authtoken/admin_tenant_name]/ensure: created", "Info: /Stage[main]/Glance::Registry/Glance_registry_config[keystone_authtoken/admin_tenant_name]: Scheduling refresh of Service[glance-registry]", "Info: /Stage[main]/Glance::Registry/Glance_registry_config[keystone_authtoken/admin_tenant_name]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Ironic::Db::Mysql/Openstacklib::Db::Mysql[ironic]/Openstacklib::Db::Mysql::Host_access[ironic_127.0.0.1]/Mysql_user[ironic@127.0.0.1]/ensure: created", "Notice: /Stage[main]/Ironic::Db::Mysql/Openstacklib::Db::Mysql[ironic]/Openstacklib::Db::Mysql::Host_access[ironic_127.0.0.1]/Mysql_grant[ironic@127.0.0.1/ironic.*]/ensure: created", "Info: Openstacklib::Db::Mysql[ironic]: Scheduling refresh of Exec[ironic-dbsync]", "Notice: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/swift_store_container]/ensure: created", "Info: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/swift_store_container]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/swift_store_container]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/project_domain_id]/ensure: created", "Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/project_domain_id]: Scheduling refresh of Service[neutron-server]", "Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/project_domain_id]: Scheduling refresh of Exec[neutron-db-sync]", "Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/project_domain_id]: Scheduling refresh of Service[neutron-metadata]", "Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/project_domain_id]: Scheduling refresh of Service[neutron-lbaas-service]", "Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/project_domain_id]: Scheduling refresh of Service[neutron-l3]", "Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/project_domain_id]: Scheduling refresh of Service[neutron-dhcp-service]", "Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/project_domain_id]: Scheduling refresh of Service[neutron-metering-service]", "Notice: /Stage[main]/Apache/Package[httpd]/ensure: created", "Info: /Stage[main]/Apache/Package[httpd]: Scheduling refresh of Class[Apache::Service]", "Notice: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/File[/var/www/cgi-bin/nova]/ensure: created", "Notice: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/File[/var/www/cgi-bin/ironic]/ensure: created", "Notice: /Stage[main]/Apache::Mod::Wsgi/Apache::Mod[wsgi]/Package[mod_wsgi]/ensure: created", "Notice: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/File[nova_api_wsgi]/ensure: defined content as '{md5}87dec420e9b6e707b94b149f1432bad2'", "Notice: /Stage[main]/Apache::Default_mods/Apache::Mod[auth_digest]/File[auth_digest.load]/ensure: defined content as '{md5}df9e85f8da0b239fe8e698ae7ead4f60'", "Info: /Stage[main]/Apache::Default_mods/Apache::Mod[auth_digest]/File[auth_digest.load]: Scheduling refresh of Class[Apache::Service]", "Notice: /Stage[main]/Apache::Mod::Mime/Apache::Mod[mime]/File[mime.load]/ensure: defined content as '{md5}e36257b9efab01459141d423cae57c7c'", "Info: /Stage[main]/Apache::Mod::Mime/Apache::Mod[mime]/File[mime.load]: Scheduling 
refresh of Class[Apache::Service]", "Notice: /Stage[main]/Apache::Default_mods/Apache::Mod[expires]/File[expires.load]/ensure: defined content as '{md5}f0825bad1e470de86ffabeb86dcc5d95'", "Info: /Stage[main]/Apache::Default_mods/Apache::Mod[expires]/File[expires.load]: Scheduling refresh of Class[Apache::Service]", "Notice: /Stage[main]/Apache::Mod::Dav/Apache::Mod[dav]/File[dav.load]/ensure: defined content as '{md5}588e496251838c4840c14b28b5aa7881'", "Info: /Stage[main]/Apache::Mod::Dav/Apache::Mod[dav]/File[dav.load]: Scheduling refresh of Class[Apache::Service]", "Notice: /Stage[main]/Apache::Default_mods/Apache::Mod[authz_owner]/File[authz_owner.load]/ensure: defined content as '{md5}f30a9be1016df87f195449d9e02d1857'", "Info: /Stage[main]/Apache::Default_mods/Apache::Mod[authz_owner]/File[authz_owner.load]: Scheduling refresh of Class[Apache::Service]", "Notice: /Stage[main]/Apache::Default_mods/Apache::Mod[authz_groupfile]/File[authz_groupfile.load]/ensure: defined content as '{md5}ae005a36b3ac8c20af36c434561c8a75'", "Info: /Stage[main]/Apache::Default_mods/Apache::Mod[authz_groupfile]/File[authz_groupfile.load]: Scheduling refresh of Class[Apache::Service]", "Notice: /Stage[main]/Apache::Default_mods/Apache::Mod[authn_dbm]/File[authn_dbm.load]/ensure: defined content as '{md5}90ee8f8ef1a017cacadfda4225e10651'", "Info: /Stage[main]/Apache::Default_mods/Apache::Mod[authn_dbm]/File[authn_dbm.load]: Scheduling refresh of Class[Apache::Service]", "Notice: /Stage[main]/Apache::Mod::Authn_core/Apache::Mod[authn_core]/File[authn_core.load]/ensure: defined content as '{md5}704d6e8b02b0eca0eba4083960d16c52'", "Info: /Stage[main]/Apache::Mod::Authn_core/Apache::Mod[authn_core]/File[authn_core.load]: Scheduling refresh of Class[Apache::Service]", "Notice: /Stage[main]/Apache::Mod::Authz_user/Apache::Mod[authz_user]/File[authz_user.load]/ensure: defined content as '{md5}63594303ee808423679b1ea13dd5a784'", "Info: /Stage[main]/Apache::Mod::Authz_user/Apache::Mod[authz_user]/File[authz_user.load]: Scheduling refresh of Class[Apache::Service]", "Notice: /Stage[main]/Apache::Default_mods/Apache::Mod[log_config]/File[log_config.load]/ensure: defined content as '{md5}785d35cb285e190d589163b45263ca89'", "Info: /Stage[main]/Apache::Default_mods/Apache::Mod[log_config]/File[log_config.load]: Scheduling refresh of Class[Apache::Service]", "Notice: /Stage[main]/Apache::Default_mods/Apache::Mod[logio]/File[logio.load]/ensure: defined content as '{md5}084533c7a44e9129d0e6df952e2472b6'", "Info: /Stage[main]/Apache::Default_mods/Apache::Mod[logio]/File[logio.load]: Scheduling refresh of Class[Apache::Service]", "Notice: /Stage[main]/Apache::Mod::Ssl/Apache::Mod[socache_shmcb]/File[socache_shmcb.load]/ensure: defined content as '{md5}ab31a6ea611785f74851b578572e4157'", "Info: /Stage[main]/Apache::Mod::Ssl/Apache::Mod[socache_shmcb]/File[socache_shmcb.load]: Scheduling refresh of Class[Apache::Service]", "Notice: /Stage[main]/Apache::Mod::Mime/File[mime.conf]/ensure: defined content as '{md5}9da85e58f3bd6c780ce76db603b7f028'", "Info: /Stage[main]/Apache::Mod::Mime/File[mime.conf]: Scheduling refresh of Class[Apache::Service]", "Notice: /Stage[main]/Apache::Default_mods/Apache::Mod[access_compat]/File[access_compat.load]/ensure: defined content as '{md5}d5feb88bec4570e2dbc41cce7e0de003'", "Info: /Stage[main]/Apache::Default_mods/Apache::Mod[access_compat]/File[access_compat.load]: Scheduling refresh of Class[Apache::Service]", "Notice: 
/Stage[main]/Apache::Mod::Version/Apache::Mod[version]/File[version.load]/ensure: defined content as '{md5}1c9243de22ace4dc8266442c48ae0c92'", "Info: /Stage[main]/Apache::Mod::Version/Apache::Mod[version]/File[version.load]: Scheduling refresh of Class[Apache::Service]", "Notice: /Stage[main]/Apache::Mod::Setenvif/File[setenvif.conf]/ensure: defined content as '{md5}c7ede4173da1915b7ec088201f030c28'", "Info: /Stage[main]/Apache::Mod::Setenvif/File[setenvif.conf]: Scheduling refresh of Class[Apache::Service]", "Notice: /Stage[main]/Apache::Mod::Actions/Apache::Mod[actions]/File[actions.load]/ensure: defined content as '{md5}599866dfaf734f60f7e2d41ee8235515'", "Info: /Stage[main]/Apache::Mod::Actions/Apache::Mod[actions]/File[actions.load]: Scheduling refresh of Class[Apache::Service]", "Notice: /Stage[main]/Apache::Mod::Deflate/File[deflate.conf]/ensure: defined content as '{md5}a045d750d819b1e9dae3fbfb3f20edd5'", "Info: /Stage[main]/Apache::Mod::Deflate/File[deflate.conf]: Scheduling refresh of Class[Apache::Service]", "Notice: /Stage[main]/Apache::Default_mods/Apache::Mod[authz_core]/File[authz_core.load]/ensure: defined content as '{md5}39942569bff2abdb259f9a347c7246bc'", "Info: /Stage[main]/Apache::Default_mods/Apache::Mod[authz_core]/File[authz_core.load]: Scheduling refresh of Class[Apache::Service]", "Notice: /Stage[main]/Apache::Mod::Negotiation/File[negotiation.conf]/ensure: defined content as '{md5}47284b5580b986a6ba32580b6ffb9fd7'", "Info: /Stage[main]/Apache::Mod::Negotiation/File[negotiation.conf]: Scheduling refresh of Class[Apache::Service]", "Notice: /Stage[main]/Apache::Mod::Alias/Apache::Mod[alias]/File[alias.load]/ensure: defined content as '{md5}3cf2fa309ccae4c29a4b875d0894cd79'", "Info: /Stage[main]/Apache::Mod::Alias/Apache::Mod[alias]/File[alias.load]: Scheduling refresh of Class[Apache::Service]", "Notice: /Stage[main]/Apache::Default_mods/Apache::Mod[env]/File[env.load]/ensure: defined content as '{md5}d74184d40d0ee24ba02626a188ee7e1a'", "Info: /Stage[main]/Apache::Default_mods/Apache::Mod[env]/File[env.load]: Scheduling refresh of Class[Apache::Service]", "Notice: /Stage[main]/Apache::Mod::Negotiation/Apache::Mod[negotiation]/File[negotiation.load]/ensure: defined content as '{md5}d262ee6a5f20d9dd7f87770638dc2ccd'", "Info: /Stage[main]/Apache::Mod::Negotiation/Apache::Mod[negotiation]/File[negotiation.load]: Scheduling refresh of Class[Apache::Service]", "Notice: /Stage[main]/Apache::Default_mods/Apache::Mod[authz_dbm]/File[authz_dbm.load]/ensure: defined content as '{md5}c1363277984d22f99b70f7dce8753b60'", "Info: /Stage[main]/Apache::Default_mods/Apache::Mod[authz_dbm]/File[authz_dbm.load]: Scheduling refresh of Class[Apache::Service]", "Notice: /Stage[main]/Apache::Mod::Dir/File[dir.conf]/ensure: defined content as '{md5}c741d8ea840e6eb999d739eed47c69d7'", "Info: /Stage[main]/Apache::Mod::Dir/File[dir.conf]: Scheduling refresh of Class[Apache::Service]", "Notice: /Stage[main]/Apache::Default_mods/Apache::Mod[usertrack]/File[usertrack.load]/ensure: defined content as '{md5}e95fbbf030fabec98b948f8dc217775c'", "Info: /Stage[main]/Apache::Default_mods/Apache::Mod[usertrack]/File[usertrack.load]: Scheduling refresh of Class[Apache::Service]", "Notice: /Stage[main]/Apache::Mod::Prefork/File[/etc/httpd/conf.modules.d/prefork.conf]/ensure: defined content as '{md5}109c4f51dac10fc1b39373855e566d01'", "Info: /Stage[main]/Apache::Mod::Prefork/File[/etc/httpd/conf.modules.d/prefork.conf]: Scheduling refresh of Class[Apache::Service]", "Notice: 
/Stage[main]/Apache::Mod::Vhost_alias/Apache::Mod[vhost_alias]/File[vhost_alias.load]/ensure: defined content as '{md5}eca907865997d50d5130497665c3f82e'", "Info: /Stage[main]/Apache::Mod::Vhost_alias/Apache::Mod[vhost_alias]/File[vhost_alias.load]: Scheduling refresh of Class[Apache::Service]", "Notice: /Stage[main]/Apache::Default_mods/Apache::Mod[substitute]/File[substitute.load]/ensure: defined content as '{md5}8077c34a71afcf41c8fc644830935915'", "Info: /Stage[main]/Apache::Default_mods/Apache::Mod[substitute]/File[substitute.load]: Scheduling refresh of Class[Apache::Service]", "Notice: /Stage[main]/Apache::Mod::Setenvif/Apache::Mod[setenvif]/File[setenvif.load]/ensure: defined content as '{md5}ec6c99f7cc8e35bdbcf8028f652c9f6d'", "Info: /Stage[main]/Apache::Mod::Setenvif/Apache::Mod[setenvif]/File[setenvif.load]: Scheduling refresh of Class[Apache::Service]", "Notice: /Stage[main]/Apache::Default_mods/Apache::Mod[unixd]/File[unixd.load]/ensure: defined content as '{md5}0e8468ecc1265f8947b8725f4d1be9c0'", "Info: /Stage[main]/Apache::Default_mods/Apache::Mod[unixd]/File[unixd.load]: Scheduling refresh of Class[Apache::Service]", "Notice: /Stage[main]/Apache::Mod::Deflate/Apache::Mod[deflate]/File[deflate.load]/ensure: defined content as '{md5}2d1a1afcae0c70557251829a8586eeaf'", "Info: /Stage[main]/Apache::Mod::Deflate/Apache::Mod[deflate]/File[deflate.load]: Scheduling refresh of Class[Apache::Service]", "Notice: /Stage[main]/Apache::Mod::Wsgi/Apache::Mod[wsgi]/File[wsgi.load]/ensure: defined content as '{md5}e1795e051e7aae1f865fde0d3b86a507'", "Info: /Stage[main]/Apache::Mod::Wsgi/Apache::Mod[wsgi]/File[wsgi.load]: Scheduling refresh of Class[Apache::Service]", "Notice: /Stage[main]/Apache::Default_mods/Apache::Mod[auth_basic]/File[auth_basic.load]/ensure: defined content as '{md5}494bcf4b843f7908675d663d8dc1bdc8'", "Info: /Stage[main]/Apache::Default_mods/Apache::Mod[auth_basic]/File[auth_basic.load]: Scheduling refresh of Class[Apache::Service]", "Notice: /Stage[main]/Apache::Mod::Prefork/Apache::Mpm[prefork]/File[/etc/httpd/conf.modules.d/prefork.load]/ensure: defined content as '{md5}157529aafcf03fa491bc924103e4608e'", "Info: /Stage[main]/Apache::Mod::Prefork/Apache::Mpm[prefork]/File[/etc/httpd/conf.modules.d/prefork.load]: Scheduling refresh of Class[Apache::Service]", "Notice: /Stage[main]/Apache::Mod::Authn_file/Apache::Mod[authn_file]/File[authn_file.load]/ensure: defined content as '{md5}d41656680003d7b890267bb73621c60b'", "Info: /Stage[main]/Apache::Mod::Authn_file/Apache::Mod[authn_file]/File[authn_file.load]: Scheduling refresh of Class[Apache::Service]", "Notice: /Stage[main]/Apache::Mod::Ext_filter/Apache::Mod[ext_filter]/File[ext_filter.load]/ensure: defined content as '{md5}76d5e0ac3411a4be57ac33ebe2e52ac8'", "Info: /Stage[main]/Apache::Mod::Ext_filter/Apache::Mod[ext_filter]/File[ext_filter.load]: Scheduling refresh of Class[Apache::Service]", "Notice: /Stage[main]/Apache::Mod::Wsgi/File[wsgi.conf]/ensure: defined content as '{md5}8b3feb3fc2563de439920bb2c52cbd11'", "Info: /Stage[main]/Apache::Mod::Wsgi/File[wsgi.conf]: Scheduling refresh of Class[Apache::Service]", "Notice: /Stage[main]/Apache::Mod::Speling/Apache::Mod[speling]/File[speling.load]/ensure: defined content as '{md5}f82e9e6b871a276c324c9eeffcec8a61'", "Info: /Stage[main]/Apache::Mod::Speling/Apache::Mod[speling]/File[speling.load]: Scheduling refresh of Class[Apache::Service]", "Notice: /Stage[main]/Apache::Mod::Dir/Apache::Mod[dir]/File[dir.load]/ensure: defined content as 
'{md5}1bfb1c2a46d7351fc9eb47c659dee068'", "Info: /Stage[main]/Apache::Mod::Dir/Apache::Mod[dir]/File[dir.load]: Scheduling refresh of Class[Apache::Service]", "Notice: /Stage[main]/Apache::Mod::Dav_fs/Apache::Mod[dav_fs]/File[dav_fs.load]/ensure: defined content as '{md5}2996277c73b1cd684a9a3111c355e0d3'", "Info: /Stage[main]/Apache::Mod::Dav_fs/Apache::Mod[dav_fs]/File[dav_fs.load]: Scheduling refresh of Class[Apache::Service]", "Notice: /Stage[main]/Apache::Default_mods/Apache::Mod[include]/File[include.load]/ensure: defined content as '{md5}88095a914eedc3c2c184dd5d74c3954c'", "Info: /Stage[main]/Apache::Default_mods/Apache::Mod[include]/File[include.load]: Scheduling refresh of Class[Apache::Service]", "Notice: /Stage[main]/Apache::Default_mods/Apache::Mod[systemd]/File[systemd.load]/ensure: defined content as '{md5}26e5d44aae258b3e9d821cbbbd3e2826'", "Info: /Stage[main]/Apache::Default_mods/Apache::Mod[systemd]/File[systemd.load]: Scheduling refresh of Class[Apache::Service]", "Notice: /Stage[main]/Apache::Mod::Alias/File[alias.conf]/ensure: defined content as '{md5}983e865be85f5e0daaed7433db82995e'", "Info: /Stage[main]/Apache::Mod::Alias/File[alias.conf]: Scheduling refresh of Class[Apache::Service]", "Notice: /Stage[main]/Apache::Mod::Ssl/Apache::Mod[ssl]/Package[mod_ssl]/ensure: created", "Notice: /Stage[main]/Apache::Mod::Ssl/File[ssl.conf]/ensure: defined content as '{md5}8884ea33793365e0784cfd43be72464e'", "Info: /Stage[main]/Apache::Mod::Ssl/File[ssl.conf]: Scheduling refresh of Class[Apache::Service]", "Notice: /Stage[main]/Apache::Mod::Ssl/Apache::Mod[ssl]/File[ssl.load]/ensure: defined content as '{md5}e282ac9f82fe5538692a4de3616fb695'", "Info: /Stage[main]/Apache::Mod::Ssl/Apache::Mod[ssl]/File[ssl.load]: Scheduling refresh of Class[Apache::Service]", "Notice: /Stage[main]/Apache::Default_mods/Apache::Mod[authz_host]/File[authz_host.load]/ensure: defined content as '{md5}d1045f54d2798499ca0f030ca0eef920'", "Info: /Stage[main]/Apache::Default_mods/Apache::Mod[authz_host]/File[authz_host.load]: Scheduling refresh of Class[Apache::Service]", "Notice: /Stage[main]/Apache::Mod::Suexec/Apache::Mod[suexec]/File[suexec.load]/ensure: defined content as '{md5}c7d5c61c534ba423a79b0ae78ff9be35'", "Info: /Stage[main]/Apache::Mod::Suexec/Apache::Mod[suexec]/File[suexec.load]: Scheduling refresh of Class[Apache::Service]", "Notice: /Stage[main]/Apache::Mod::Cache/Apache::Mod[cache]/File[cache.load]/ensure: defined content as '{md5}01e4d392225b518a65b0f7d6c4e21d29'", "Info: /Stage[main]/Apache::Mod::Cache/Apache::Mod[cache]/File[cache.load]: Scheduling refresh of Class[Apache::Service]", "Notice: /Stage[main]/Apache::Mod::Rewrite/Apache::Mod[rewrite]/File[rewrite.load]/ensure: defined content as '{md5}26e2683352fc1599f29573ff0d934e79'", "Info: /Stage[main]/Apache::Mod::Rewrite/Apache::Mod[rewrite]/File[rewrite.load]: Scheduling refresh of Class[Apache::Service]", "Info: /Stage[main]/Apache/File[/etc/httpd/conf.d/autoindex.conf]: Filebucketed /etc/httpd/conf.d/autoindex.conf to puppet with sum 09726332c2fd6fc73a57fbe69fc10427", "Notice: /Stage[main]/Apache/File[/etc/httpd/conf.d/autoindex.conf]/ensure: removed", "Info: /Stage[main]/Apache/File[/etc/httpd/conf.d/userdir.conf]: Filebucketed /etc/httpd/conf.d/userdir.conf to puppet with sum d4a2620683cc3ff2315c685f9f354265", "Notice: /Stage[main]/Apache/File[/etc/httpd/conf.d/userdir.conf]/ensure: removed", "Info: /Stage[main]/Apache/File[/etc/httpd/conf.d/ssl.conf]: Filebucketed /etc/httpd/conf.d/ssl.conf to puppet with sum 
1888b608773b45f4acea3604eccf3562", "Notice: /Stage[main]/Apache/File[/etc/httpd/conf.d/ssl.conf]/ensure: removed", "Info: /Stage[main]/Apache/File[/etc/httpd/conf.d/welcome.conf]: Filebucketed /etc/httpd/conf.d/welcome.conf to puppet with sum 9d1328b985d0851eb5bc610da6122f44", "Notice: /Stage[main]/Apache/File[/etc/httpd/conf.d/welcome.conf]/ensure: removed", "Info: /Stage[main]/Apache/File[/etc/httpd/conf.d/README]: Filebucketed /etc/httpd/conf.d/README to puppet with sum 20b886e8496027dcbc31ed28d404ebb1", "Notice: /Stage[main]/Apache/File[/etc/httpd/conf.d/README]/ensure: removed", "Notice: /Stage[main]/Apache::Mod::Autoindex/Apache::Mod[autoindex]/File[autoindex.load]/ensure: defined content as '{md5}515cdf5b573e961a60d2931d39248648'", "Info: /Stage[main]/Apache::Mod::Autoindex/Apache::Mod[autoindex]/File[autoindex.load]: Scheduling refresh of Class[Apache::Service]", "Notice: /Stage[main]/Apache::Default_mods/Apache::Mod[authn_anon]/File[authn_anon.load]/ensure: defined content as '{md5}bf57b94b5aec35476fc2a2dc3861f132'", "Info: /Stage[main]/Apache::Default_mods/Apache::Mod[authn_anon]/File[authn_anon.load]: Scheduling refresh of Class[Apache::Service]", "Notice: /Stage[main]/Apache::Mod::Autoindex/File[autoindex.conf]/ensure: defined content as '{md5}2421a3c6df32c7e38c2a7a22afdf5728'", "Info: /Stage[main]/Apache::Mod::Autoindex/File[autoindex.conf]: Scheduling refresh of Class[Apache::Service]", "Notice: /Stage[main]/Apache/Concat[/etc/httpd/conf/ports.conf]/File[/var/lib/puppet/concat/_etc_httpd_conf_ports.conf]/ensure: created", "Info: /Stage[main]/Apache/Concat[/etc/httpd/conf/ports.conf]/File[/var/lib/puppet/concat/_etc_httpd_conf_ports.conf]: Scheduling refresh of Exec[concat_/etc/httpd/conf/ports.conf]", "Notice: /Stage[main]/Apache/Concat[/etc/httpd/conf/ports.conf]/File[/var/lib/puppet/concat/_etc_httpd_conf_ports.conf/fragments.concat.out]/ensure: created", "Notice: /Stage[main]/Apache/Concat[/etc/httpd/conf/ports.conf]/File[/var/lib/puppet/concat/_etc_httpd_conf_ports.conf/fragments]/ensure: created", "Info: /Stage[main]/Apache/Concat[/etc/httpd/conf/ports.conf]/File[/var/lib/puppet/concat/_etc_httpd_conf_ports.conf/fragments]: Scheduling refresh of Exec[concat_/etc/httpd/conf/ports.conf]", "Notice: /Stage[main]/Apache/Concat::Fragment[Apache ports header]/File[/var/lib/puppet/concat/_etc_httpd_conf_ports.conf/fragments/10_Apache ports header]/ensure: defined content as '{md5}afe35cb5747574b700ebaa0f0b3a626e'", "Info: /Stage[main]/Apache/Concat::Fragment[Apache ports header]/File[/var/lib/puppet/concat/_etc_httpd_conf_ports.conf/fragments/10_Apache ports header]: Scheduling refresh of Exec[concat_/etc/httpd/conf/ports.conf]", "Notice: /Stage[main]/Apache/Concat[/etc/httpd/conf/ports.conf]/File[/var/lib/puppet/concat/_etc_httpd_conf_ports.conf/fragments.concat]/ensure: created", "Notice: /Stage[main]/Apache::Mod::Filter/Apache::Mod[filter]/File[filter.load]/ensure: defined content as '{md5}66a1e2064a140c3e7dca7ac33877700e'", "Info: /Stage[main]/Apache::Mod::Filter/Apache::Mod[filter]/File[filter.load]: Scheduling refresh of Class[Apache::Service]", "Notice: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/File[ironic_wsgi]/ensure: defined content as '{md5}77ef07cc957e05e2024c75ef82d6fbbd'", "Notice: /Stage[main]/Cinder::Volume/Cinder_config[DEFAULT/volume_clear]/ensure: created", "Info: /Stage[main]/Cinder::Volume/Cinder_config[DEFAULT/volume_clear]: Scheduling refresh of Service[cinder-api]", "Info: 
/Stage[main]/Cinder::Volume/Cinder_config[DEFAULT/volume_clear]: Scheduling refresh of Exec[cinder-manage db_sync]", "Info: /Stage[main]/Cinder::Volume/Cinder_config[DEFAULT/volume_clear]: Scheduling refresh of Service[cinder-scheduler]", "Info: /Stage[main]/Cinder::Volume/Cinder_config[DEFAULT/volume_clear]: Scheduling refresh of Service[cinder-volume]", "Notice: /Stage[main]/Openstack_integration::Swift/Swift::Storage::Filter::Recon[object]/Concat::Fragment[swift_recon_object]/File[/var/lib/puppet/concat/_etc_swift_object-server.conf/fragments/35_swift_recon_object]/ensure: defined content as '{md5}d847d2d529a3596ed6a74d841d790dc7'", "Info: /Stage[main]/Openstack_integration::Swift/Swift::Storage::Filter::Recon[object]/Concat::Fragment[swift_recon_object]/File[/var/lib/puppet/concat/_etc_swift_object-server.conf/fragments/35_swift_recon_object]: Scheduling refresh of Exec[concat_/etc/swift/object-server.conf]", "Notice: /Stage[main]/Rsync::Server/Concat::Fragment[rsyncd_conf_header]/File[/var/lib/puppet/concat/_etc_rsync.conf/fragments/00_header_rsyncd_conf_header]/ensure: defined content as '{md5}3a2ab53ad81bbfc64ceb17fb3a7efee0'", "Info: /Stage[main]/Rsync::Server/Concat::Fragment[rsyncd_conf_header]/File[/var/lib/puppet/concat/_etc_rsync.conf/fragments/00_header_rsyncd_conf_header]: Scheduling refresh of Exec[concat_/etc/rsync.conf]", "Notice: /Stage[main]/Glance::Registry/Glance_registry_config[keystone_authtoken/admin_user]/ensure: created", "Info: /Stage[main]/Glance::Registry/Glance_registry_config[keystone_authtoken/admin_user]: Scheduling refresh of Service[glance-registry]", "Info: /Stage[main]/Glance::Registry/Glance_registry_config[keystone_authtoken/admin_user]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Mongodb::Client::Install/Package[mongodb_client]/ensure: created", "Notice: /Stage[main]/Nova::Scheduler/Nova::Generic_service[scheduler]/Package[nova-scheduler]/ensure: created", "Info: /Stage[main]/Nova::Scheduler/Nova::Generic_service[scheduler]/Package[nova-scheduler]: Scheduling refresh of Anchor[keystone::service::end]", "Info: /Stage[main]/Nova::Scheduler/Nova::Generic_service[scheduler]/Package[nova-scheduler]: Scheduling refresh of Anchor[nova::install::end]", "Notice: /Stage[main]/Neutron::Db/Neutron_config[database/connection]/ensure: created", "Info: /Stage[main]/Neutron::Db/Neutron_config[database/connection]: Scheduling refresh of Service[neutron-server]", "Info: /Stage[main]/Neutron::Db/Neutron_config[database/connection]: Scheduling refresh of Exec[neutron-db-sync]", "Info: /Stage[main]/Neutron::Db/Neutron_config[database/connection]: Scheduling refresh of Exec[neutron-db-sync]", "Info: /Stage[main]/Neutron::Db/Neutron_config[database/connection]: Scheduling refresh of Service[neutron-metadata]", "Info: /Stage[main]/Neutron::Db/Neutron_config[database/connection]: Scheduling refresh of Service[neutron-lbaas-service]", "Info: /Stage[main]/Neutron::Db/Neutron_config[database/connection]: Scheduling refresh of Service[neutron-l3]", "Info: /Stage[main]/Neutron::Db/Neutron_config[database/connection]: Scheduling refresh of Service[neutron-dhcp-service]", "Info: /Stage[main]/Neutron::Db/Neutron_config[database/connection]: Scheduling refresh of Service[neutron-metering-service]", "Notice: /Stage[main]/Glance::Registry::Db/Glance_registry_config[database/connection]/ensure: created", "Info: /Stage[main]/Glance::Registry::Db/Glance_registry_config[database/connection]: Scheduling refresh of Service[glance-registry]", "Info: 
/Stage[main]/Glance::Registry::Db/Glance_registry_config[database/connection]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Ironic::Db/Ironic_config[database/connection]/ensure: created", "Info: /Stage[main]/Ironic::Db/Ironic_config[database/connection]: Scheduling refresh of Exec[ironic-dbsync]", "Info: /Stage[main]/Ironic::Db/Ironic_config[database/connection]: Scheduling refresh of Service[httpd]", "Info: /Stage[main]/Ironic::Db/Ironic_config[database/connection]: Scheduling refresh of Service[ironic-conductor]", "Notice: /Stage[main]/Glance::Api::Db/Glance_api_config[database/connection]/ensure: created", "Info: /Stage[main]/Glance::Api::Db/Glance_api_config[database/connection]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Glance::Api::Db/Glance_api_config[database/connection]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Cinder::Db/Cinder_config[database/connection]/ensure: created", "Info: /Stage[main]/Cinder::Db/Cinder_config[database/connection]: Scheduling refresh of Service[cinder-api]", "Info: /Stage[main]/Cinder::Db/Cinder_config[database/connection]: Scheduling refresh of Exec[cinder-manage db_sync]", "Info: /Stage[main]/Cinder::Db/Cinder_config[database/connection]: Scheduling refresh of Exec[cinder-manage db_sync]", "Info: /Stage[main]/Cinder::Db/Cinder_config[database/connection]: Scheduling refresh of Service[cinder-scheduler]", "Info: /Stage[main]/Cinder::Db/Cinder_config[database/connection]: Scheduling refresh of Service[cinder-volume]", "Notice: /Stage[main]/Ironic::Db::Sync/Exec[ironic-dbsync]: Triggered 'refresh' from 5 events", "Info: /Stage[main]/Ironic::Db::Sync/Exec[ironic-dbsync]: Scheduling refresh of Service[ironic-api]", "Info: /Stage[main]/Ironic::Db::Sync/Exec[ironic-dbsync]: Scheduling refresh of Service[ironic-conductor]", "Notice: /Stage[main]/Apache::Mod::Mime_magic/File[mime_magic.conf]/ensure: defined content as '{md5}b258529b332429e2ff8344f726a95457'", "Info: /Stage[main]/Apache::Mod::Mime_magic/File[mime_magic.conf]: Scheduling refresh of Class[Apache::Service]", "Notice: /Stage[main]/Apache::Mod::Mime_magic/Apache::Mod[mime_magic]/File[mime_magic.load]/ensure: defined content as '{md5}cb8670bb2fb352aac7ebf3a85d52094c'", "Info: /Stage[main]/Apache::Mod::Mime_magic/Apache::Mod[mime_magic]/File[mime_magic.load]: Scheduling refresh of Class[Apache::Service]", "Notice: /Stage[main]/Rsync::Server/Concat[/etc/rsync.conf]/File[/var/lib/puppet/concat/_etc_rsync.conf/fragments.concat]/ensure: created", "Notice: /Stage[main]/Mongodb::Server::Config/File[/etc/mongod.conf]/content: ", "--- /etc/mongod.conf\t2015-12-07 22:55:21.000000000 +0000", "+++ /tmp/puppet-file20160520-26469-1krjs66\t2016-05-20 12:29:37.587578098 +0100", "@@ -1,237 +1,19 @@", "-##", "-### Basic Defaults", "-##", "+# mongo.conf - generated from Puppet", " ", "-# Comma separated list of ip addresses to listen on (all local ips by default)", "-bind_ip = 127.0.0.1", "-", "-# Specify port number (27017 by default)", "-#port = 27017", "-", "-# Fork server process (false by default)", "-fork = true", "-", "-# Full path to pidfile (if not set, no pidfile is created)", "-pidfilepath = /var/run/mongodb/mongod.pid", "-", "-# Log file to send write to instead of stdout - has to be a file, not directory", "-logpath = /var/log/mongodb/mongod.log", "-", "-# Alternative directory for UNIX domain sockets (defaults to /tmp)", "-unixSocketPrefix = /var/run/mongodb", "-", "-# Directory for datafiles (defaults to /data/db/)", 
"-dbpath = /var/lib/mongodb", "-", "-# Enable/Disable journaling (journaling is on by default for 64 bit)", "-#journal = true", "-#nojournal = true", "-", "-", "-", "-##", "-### General options", "-##", "-", "-# Be more verbose (include multiple times for more verbosity e.g. -vvvvv) (v by default)", "-#verbose = v", "-", "-# Max number of simultaneous connections (1000000 by default)", "-#maxConns = 1000000 ", "-", "-# Log to system's syslog facility instead of file or stdout (false by default)", "-#syslog = true", "-", "-# Syslog facility used for monogdb syslog message (user by defautl)", "-#syslogFacility = user", "-", "-# Append to logpath instead of over-writing (false by default)", "-#logappend = true", "-", "-# Desired format for timestamps in log messages (One of ctime, iso8601-utc or iso8601-local) (iso8601-local by default)", "-#timeStampFormat = arg ", "-", "-# Private key for cluster authentication", "-#keyFile = arg", "-", "-# Set a configurable parameter", "-#setParameter = arg", "-", "-# Enable http interface (false by default)", "-#httpinterface = true", "-", "-# Authentication mode used for cluster authentication. Alternatives are (keyFile|sendKeyFile|sendX509|x509) (keyFile by default)", "-#clusterAuthMode = arg", "-", "-# Disable listening on unix sockets (false by default)", "-#nounixsocket = true", "-", "-# Run with/without security (without by default)", "-#auth = true", "-#noauth = true", "-", "-# Enable IPv6 support (disabled by default)", "-#ipv6 = true", "-", "-# Allow JSONP access via http (has security implications) (false by default)", "-#jsonp = true", "-", "-# Turn on simple rest api (false by default)", "-#rest = true", "-", "-# Value of slow for profile and console log (100 by default)", "-#slowms = 100", "-", "-# 0=off 1=slow, 2=all (0 by default)", "-#profile = 0", "-", "-# Periodically show cpu and iowait utilization (false by default)", "-#cpu = true", "-", "-# Print some diagnostic system information (false by default)", "-#sysinfo = true", "-", "-# Each database will be stored in a separate directory (false by default)", "-#directoryperdb = true", "-", "-# Don't retry any index builds that were interrupted by shutdown (false by default)", "-#noIndexBuildRetry = true", "-", "-# Disable data file preallocation - will often hurt performance (false by default)", "-#noprealloc = true", "-", "-# .ns file size (in MB) for new databases (16 MB by default)", "-#nssize = 16", "-", "-# Limits each database to a certain number of files (8 default)", "-#quota", "-", "-# Number of files allowed per db, implies --quota (8 by default)", "-#quotaFiles = 8", "-", "-# Use a smaller default file size (false by default)", "-#smallfiles = true", "-", "-# Seconds between disk syncs (0=never, but not recommended) (60 by default)", "-#syncdelay = 60", "-", "-# Upgrade db if needed (false by default)", "-#upgrade = true", "-", "-# Run repair on all dbs (false by default)", "-#repair = true", "-", "-# Root directory for repair files (defaults to dbpath)", "-#repairpath = arg", "-", "-# Disable scripting engine (false by default)", "-#noscripting = true", "-", "-# Do not allow table scans (false by default)", "-#notablescan = true", "-", "-# Journal diagnostic options (0 by default)", "-#journalOptions = 0", "-", "-# How often to group/batch commit (ms) (100 or 30 by default)", "-#journalCommitInterval = 100 ", "-", "-", "-", "-##", "-### Replication options", "-##", "-", "-# Size to use (in MB) for replication op log (default 5% of disk space - i.e. 
large is good)", "-#oplogSize = arg", "-", "-", "-", "-##", "-### Master/slave options (old; use replica sets instead)", "-##", "-", "-# Master mode", "-#master = true", "-", "-# Slave mode", "-#slave = true", "-", "-# When slave: specify master as ", "-#source = arg", "-", "-# When slave: specify a single database to replicate", "-#only = arg", "-", "-# Specify delay (in seconds) to be used when applying master ops to slave", "-#slavedelay = arg", "-", "-# Automatically resync if slave data is stale", "-#autoresync = true", "-", "-", "-", "-##", "-### Replica set options", "-##", "-", "-# Arg is [/]", "-#replSet = arg", "-", "-# Specify index prefetching behavior (if secondary) [none|_id_only|all] (all by default)", "-#replIndexPrefetch = all", "-", "-", "-", "-##", "-### Sharding options", "-##", "-", "-# Declare this is a config db of a cluster (default port 27019; default dir /data/configdb) (false by default)", "-#configsvr = true", "-", "-# Declare this is a shard db of a cluster (default port 27018) (false by default)", "-#shardsvr = true", "-", "-", "-", "-##", "-### SSL options", "-##", "-", "-# Use ssl on configured ports", "-#sslOnNormalPorts = true", "-", "-# Set the SSL operation mode (disabled|allowSSL|preferSSL|requireSSL)", "-# sslMode = arg", "-", "-# PEM file for ssl", "-#sslPEMKeyFile = arg", "-", "-# PEM file password", "-#sslPEMKeyPassword = arg", "-", "-# Key file for internal SSL authentication", "-#sslClusterFile = arg", "-", "-# Internal authentication key file password", "-#sslClusterPassword = arg", "-", "-# Certificate Authority file for SSL", "-#sslCAFile = arg", "-", "-# Certificate Revocation List file for SSL", "-#sslCRLFile = arg", "-", "-# Allow client to connect without presenting a certificate", "-#sslWeakCertificateValidation = true", "-", "-# Allow server certificates to provide non-matching hostnames", "-#sslAllowInvalidHostnames = true", "-", "-# Allow connections to servers with invalid certificates", "-#sslAllowInvalidCertificates = true", "-", "-# Activate FIPS 140-2 mode at startup", "-#sslFIPSMode = true", " ", "+#where to log", "+logpath=/var/log/mongodb/mongodb.log", "+logappend=true", "+# Set this option to configure the mongod or mongos process to bind to and", "+# listen for connections from applications on this address.", "+# You may concatenate a list of comma separated values to bind mongod to multiple IP addresses.", "+bind_ip = 127.0.0.1", "+# fork and run in background", "+fork=true", "+dbpath=/var/lib/mongodb", "+# location of pidfile", "+pidfilepath=/var/run/mongodb/mongod.pid", "+# Enables journaling", "+journal = true", "+# Turn on/off security. 
Off is currently the default", "+noauth=true", "Info: /Stage[main]/Mongodb::Server::Config/File[/etc/mongod.conf]: Filebucketed /etc/mongod.conf to puppet with sum c9466bad2ec40e2613630b7d49d58b2b", "Notice: /Stage[main]/Mongodb::Server::Config/File[/etc/mongod.conf]/content: content changed '{md5}c9466bad2ec40e2613630b7d49d58b2b' to '{md5}b770678a1c1e5991d9990e8fdb0fabea'", "Notice: /Stage[main]/Mongodb::Server::Config/File[/var/lib/mongodb]/group: group changed 'root' to 'mongodb'", "Notice: /Stage[main]/Mongodb::Server::Config/File[/var/lib/mongodb]/mode: mode changed '0750' to '0755'", "Info: Class[Mongodb::Server::Config]: Scheduling refresh of Class[Mongodb::Server::Service]", "Info: Class[Mongodb::Server::Service]: Scheduling refresh of Service[mongodb]", "Notice: /Stage[main]/Mongodb::Server::Service/Service[mongodb]/ensure: ensure changed 'stopped' to 'running'", "Info: /Stage[main]/Mongodb::Server::Service/Service[mongodb]: Unscheduling refresh on Service[mongodb]", "Notice: /Stage[main]/Neutron::Agents::Ml2::Ovs/Neutron_agent_ovs[agent/tunnel_types]/ensure: created", "Info: /Stage[main]/Neutron::Agents::Ml2::Ovs/Neutron_agent_ovs[agent/tunnel_types]: Scheduling refresh of Service[neutron-ovs-agent-service]", "Notice: /Stage[main]/Keystone/Package[keystone]/ensure: created", "Info: /Stage[main]/Keystone/Package[keystone]: Scheduling refresh of Anchor[keystone::install::end]", "Info: /Stage[main]/Keystone/Package[keystone]: Scheduling refresh of Anchor[keystone::service::end]", "Notice: /Stage[main]/Rabbitmq::Install/Package[rabbitmq-server]/ensure: created", "Info: /Stage[main]/Rabbitmq::Install/Package[rabbitmq-server]: Scheduling refresh of Class[Rabbitmq::Service]", "Notice: /Stage[main]/Rabbitmq/Rabbitmq_plugin[rabbitmq_management]/ensure: created", "Info: /Stage[main]/Rabbitmq/Rabbitmq_plugin[rabbitmq_management]: Scheduling refresh of Class[Rabbitmq::Service]", "Notice: /Stage[main]/Rabbitmq::Config/File[rabbitmq.config]/content: ", "--- /etc/rabbitmq/rabbitmq.config\t2014-08-11 12:36:33.000000000 +0100", "+++ /tmp/puppet-file20160520-26469-1onsrit\t2016-05-20 12:29:47.927185812 +0100", "@@ -1,567 +1,42 @@", "-%% -*- mode: erlang -*-", "-%% ----------------------------------------------------------------------------", "-%% RabbitMQ Sample Configuration File.", "-%%", "-%% See http://www.rabbitmq.com/configure.html for details.", "-%% ----------------------------------------------------------------------------", "+% This file managed by Puppet", "+% Template Path: rabbitmq/templates/rabbitmq.config", " [", "- {rabbit,", "- [%%", "- %% Network Connectivity", "- %% ====================", "- %%", "-", "- %% By default, RabbitMQ will listen on all interfaces, using", "- %% the standard (reserved) AMQP port.", "- %%", "- %% {tcp_listeners, [5672]},", "-", "- %% To listen on a specific interface, provide a tuple of {IpAddress, Port}.", "- %% For example, to listen only on localhost for both IPv4 and IPv6:", "- %%", "- %% {tcp_listeners, [{\"127.0.0.1\", 5672},", "- %% {\"::1\", 5672}]},", "-", "- %% SSL listeners are configured in the same fashion as TCP listeners,", "- %% including the option to control the choice of interface.", "- %%", "- %% {ssl_listeners, [5671]},", "-", "- %% Log levels (currently just used for connection logging).", "- %% One of 'info', 'warning', 'error' or 'none', in decreasing order", "- %% of verbosity. 
Defaults to 'info'.", "- %%", "- %% {log_levels, [{connection, info}]},", "-", "- %% Set to 'true' to perform reverse DNS lookups when accepting a", "- %% connection. Hostnames will then be shown instead of IP addresses", "- %% in rabbitmqctl and the management plugin.", "- %%", "- %% {reverse_dns_lookups, true},", "-", "- %%", "- %% Security / AAA", "- %% ==============", "- %%", "-", "- %% The default \"guest\" user is only permitted to access the server", "- %% via a loopback interface (e.g. localhost).", "- %% {loopback_users, [<<\"guest\">>]},", "- %%", "- %% Uncomment the following line if you want to allow access to the", "- %% guest user from anywhere on the network.", "- %% {loopback_users, []},", "-", "- %% Configuring SSL.", "- %% See http://www.rabbitmq.com/ssl.html for full documentation.", "- %%", "- %% {ssl_options, [{cacertfile, \"/path/to/testca/cacert.pem\"},", "- %% {certfile, \"/path/to/server/cert.pem\"},", "- %% {keyfile, \"/path/to/server/key.pem\"},", "- %% {verify, verify_peer},", "- %% {fail_if_no_peer_cert, false}]},", "-", "- %% Choose the available SASL mechanism(s) to expose.", "- %% The two default (built in) mechanisms are 'PLAIN' and", "- %% 'AMQPLAIN'. Additional mechanisms can be added via", "- %% plugins.", "- %%", "- %% See http://www.rabbitmq.com/authentication.html for more details.", "- %%", "- %% {auth_mechanisms, ['PLAIN', 'AMQPLAIN']},", "-", "- %% Select an authentication database to use. RabbitMQ comes bundled", "- %% with a built-in auth-database, based on mnesia.", "- %%", "- %% {auth_backends, [rabbit_auth_backend_internal]},", "-", "- %% Configurations supporting the rabbitmq_auth_mechanism_ssl and", "- %% rabbitmq_auth_backend_ldap plugins.", "- %%", "- %% NB: These options require that the relevant plugin is enabled.", "- %% See http://www.rabbitmq.com/plugins.html for further details.", "-", "- %% The RabbitMQ-auth-mechanism-ssl plugin makes it possible to", "- %% authenticate a user based on the client's SSL certificate.", "- %%", "- %% To use auth-mechanism-ssl, add to or replace the auth_mechanisms", "- %% list with the entry 'EXTERNAL'.", "- %%", "- %% {auth_mechanisms, ['EXTERNAL']},", "-", "- %% The rabbitmq_auth_backend_ldap plugin allows the broker to", "- %% perform authentication and authorisation by deferring to an", "- %% external LDAP server.", "- %%", "- %% For more information about configuring the LDAP backend, see", "- %% http://www.rabbitmq.com/ldap.html.", "- %%", "- %% Enable the LDAP auth backend by adding to or replacing the", "- %% auth_backends entry:", "- %%", "- %% {auth_backends, [rabbit_auth_backend_ldap]},", "-", "- %% This pertains to both the rabbitmq_auth_mechanism_ssl plugin and", "- %% STOMP ssl_cert_login configurations. See the rabbitmq_stomp", "- %% configuration section later in this fail and the README in", "- %% https://github.com/rabbitmq/rabbitmq-auth-mechanism-ssl for further", "- %% details.", "- %%", "- %% To use the SSL cert's CN instead of its DN as the username", "- %%", "- %% {ssl_cert_login_from, common_name},", "-", "- %%", "- %% Default User / VHost", "- %% ====================", "- %%", "-", "- %% On first start RabbitMQ will create a vhost and a user. These", "- %% config items control what gets created. 
See", "- %% http://www.rabbitmq.com/access-control.html for further", "- %% information about vhosts and access control.", "- %%", "- %% {default_vhost, <<\"/\">>},", "- %% {default_user, <<\"guest\">>},", "- %% {default_pass, <<\"guest\">>},", "- %% {default_permissions, [<<\".*\">>, <<\".*\">>, <<\".*\">>]},", "-", "- %% Tags for default user", "- %%", "- %% For more details about tags, see the documentation for the", "- %% Management Plugin at http://www.rabbitmq.com/management.html.", "- %%", "- %% {default_user_tags, [administrator]},", "-", "- %%", "- %% Additional network and protocol related configuration", "- %% =====================================================", "- %%", "-", "- %% Set the default AMQP heartbeat delay (in seconds).", "- %%", "- %% {heartbeat, 600},", "-", "- %% Set the max permissible size of an AMQP frame (in bytes).", "- %%", "- %% {frame_max, 131072},", "-", "- %% Set the max permissible number of channels per connection.", "- %% 0 means \"no limit\".", "- %%", "- %% {channel_max, 128},", "-", "- %% Customising Socket Options.", "- %%", "- %% See (http://www.erlang.org/doc/man/inet.html#setopts-2) for", "- %% further documentation.", "- %%", "- %% {tcp_listen_options, [binary,", "- %% {packet, raw},", "- %% {reuseaddr, true},", "- %% {backlog, 128},", "- %% {nodelay, true},", "- %% {exit_on_close, false}]},", "-", "- %%", "- %% Resource Limits & Flow Control", "- %% ==============================", "- %%", "- %% See http://www.rabbitmq.com/memory.html for full details.", "-", "- %% Memory-based Flow Control threshold.", "- %%", "- %% {vm_memory_high_watermark, 0.4},", "-", "- %% Fraction of the high watermark limit at which queues start to", "- %% page message out to disc in order to free up memory.", "- %%", "- %% {vm_memory_high_watermark_paging_ratio, 0.5},", "-", "- %% Set disk free limit (in bytes). 
Once free disk space reaches this", "- %% lower bound, a disk alarm will be set - see the documentation", "- %% listed above for more details.", "- %%", "- %% {disk_free_limit, 50000000},", "-", "- %% Alternatively, we can set a limit relative to total available RAM.", "- %%", "- %% {disk_free_limit, {mem_relative, 1.0}},", "-", "- %%", "- %% Misc/Advanced Options", "- %% =====================", "- %%", "- %% NB: Change these only if you understand what you are doing!", "- %%", "-", "- %% To announce custom properties to clients on connection:", "- %%", "- %% {server_properties, []},", "-", "- %% How to respond to cluster partitions.", "- %% See http://www.rabbitmq.com/partitions.html for further details.", "- %%", "- %% {cluster_partition_handling, ignore},", "-", "- %% Make clustering happen *automatically* at startup - only applied", "- %% to nodes that have just been reset or started for the first time.", "- %% See http://www.rabbitmq.com/clustering.html#auto-config for", "- %% further details.", "- %%", "- %% {cluster_nodes, {['rabbit@my.host.com'], disc}},", "-", "- %% Set (internal) statistics collection granularity.", "- %%", "- %% {collect_statistics, none},", "-", "- %% Statistics collection interval (in milliseconds).", "- %%", "- %% {collect_statistics_interval, 5000},", "-", "- %% Explicitly enable/disable hipe compilation.", "- %%", "- %% {hipe_compile, true}", "-", "- ]},", "-", "- %% ----------------------------------------------------------------------------", "- %% Advanced Erlang Networking/Clustering Options.", "- %%", "- %% See http://www.rabbitmq.com/clustering.html for details", "- %% ----------------------------------------------------------------------------", "- {kernel,", "- [%% Sets the net_kernel tick time.", "- %% Please see http://erlang.org/doc/man/kernel_app.html and", "- %% http://www.rabbitmq.com/nettick.html for further details.", "- %%", "- %% {net_ticktime, 60}", "+ {rabbit, [", "+ {tcp_listen_options,", "+ [binary,", "+ {packet, raw},", "+ {reuseaddr, true},", "+ {backlog, 128},", "+ {nodelay, true},", "+ {exit_on_close, false}]", "+ },", "+ {tcp_listeners, []},", "+ {ssl_listeners, [5671]},", "+ {ssl_options, [", "+ {cacertfile,\"/etc/ssl/certs/ca-bundle.crt\"},", "+ {certfile,\"/etc/pki/ca-trust/source/anchors/puppet_openstack.pem\"},", "+ {keyfile,\"/etc/rabbitmq/ssl/private/n2.dusty.ci.centos.org.pem\"},", "+ {verify,verify_none},", "+ {fail_if_no_peer_cert,false}", "+ ]},", "+ {default_user, <<\"guest\">>},", "+ {default_pass, <<\"guest\">>}", " ]},", "-", "- %% ----------------------------------------------------------------------------", "- %% RabbitMQ Management Plugin", "- %%", "- %% See http://www.rabbitmq.com/management.html for details", "- %% ----------------------------------------------------------------------------", "-", "- {rabbitmq_management,", "- [%% Pre-Load schema definitions from the following JSON file. 
See", "- %% http://www.rabbitmq.com/management.html#load-definitions", "- %%", "- %% {load_definitions, \"/path/to/schema.json\"},", "-", "- %% Log all requests to the management HTTP API to a file.", "- %%", "- %% {http_log_dir, \"/path/to/access.log\"},", "-", "- %% Change the port on which the HTTP listener listens,", "- %% specifying an interface for the web server to bind to.", "- %% Also set the listener to use SSL and provide SSL options.", "- %%", "- %% {listener, [{port, 12345},", "- %% {ip, \"127.0.0.1\"},", "- %% {ssl, true},", "- %% {ssl_opts, [{cacertfile, \"/path/to/cacert.pem\"},", "- %% {certfile, \"/path/to/cert.pem\"},", "- %% {keyfile, \"/path/to/key.pem\"}]}]},", "-", "- %% Configure how long aggregated data (such as message rates and queue", "- %% lengths) is retained. Please read the plugin's documentation in", "- %% https://www.rabbitmq.com/management.html#configuration for more", "- %% details.", "- %%", "- %% {sample_retention_policies,", "- %% [{global, [{60, 5}, {3600, 60}, {86400, 1200}]},", "- %% {basic, [{60, 5}, {3600, 60}]},", "- %% {detailed, [{10, 5}]}]}", "- ]},", "-", "- {rabbitmq_management_agent,", "- [%% Misc/Advanced Options", "- %%", "- %% NB: Change these only if you understand what you are doing!", "- %%", "- %% {force_fine_statistics, true}", "- ]},", "-", "- %% ----------------------------------------------------------------------------", "- %% RabbitMQ Shovel Plugin", "- %%", "- %% See http://www.rabbitmq.com/shovel.html for details", "- %% ----------------------------------------------------------------------------", "-", "- {rabbitmq_shovel,", "- [{shovels,", "- [%% A named shovel worker.", "- %% {my_first_shovel,", "- %% [", "-", "- %% List the source broker(s) from which to consume.", "- %%", "- %% {sources,", "- %% [%% URI(s) and pre-declarations for all source broker(s).", "- %% {brokers, [\"amqp://user:password@host.domain/my_vhost\"]},", "- %% {declarations, []}", "- %% ]},", "-", "- %% List the destination broker(s) to publish to.", "- %% {destinations,", "- %% [%% A singular version of the 'brokers' element.", "- %% {broker, \"amqp://\"},", "- %% {declarations, []}", "- %% ]},", "-", "- %% Name of the queue to shovel messages from.", "- %%", "- %% {queue, <<\"your-queue-name-goes-here\">>},", "-", "- %% Optional prefetch count.", "- %%", "- %% {prefetch_count, 10},", "-", "- %% when to acknowledge messages:", "- %% - no_ack: never (auto)", "- %% - on_publish: after each message is republished", "- %% - on_confirm: when the destination broker confirms receipt", "- %%", "- %% {ack_mode, on_confirm},", "-", "- %% Overwrite fields of the outbound basic.publish.", "- %%", "- %% {publish_fields, [{exchange, <<\"my_exchange\">>},", "- %% {routing_key, <<\"from_shovel\">>}]},", "-", "- %% Static list of basic.properties to set on re-publication.", "- %%", "- %% {publish_properties, [{delivery_mode, 2}]},", "-", "- %% The number of seconds to wait before attempting to", "- %% reconnect in the event of a connection failure.", "- %%", "- %% {reconnect_delay, 2.5}", "-", "- %% ]} %% End of my_first_shovel", "+ {kernel, [", "+ ", "+ ]}", "+,", "+ {rabbitmq_management, [", "+ {listener, [", "+ {port, 15671},", "+ {ssl, true},", "+ {ssl_opts, [", "+ {cacertfile, \"/etc/ssl/certs/ca-bundle.crt\"},", "+", "+ {certfile, \"/etc/pki/ca-trust/source/anchors/puppet_openstack.pem\"},", "+ {keyfile, \"/etc/rabbitmq/ssl/private/n2.dusty.ci.centos.org.pem\"}", "+ ]}", " ]}", "- %% Rather than specifying some values per-shovel, you can specify", "- %% them for 
all shovels here.", "- %%", "- %% {defaults, [{prefetch_count, 0},", "- %% {ack_mode, on_confirm},", "- %% {publish_fields, []},", "- %% {publish_properties, [{delivery_mode, 2}]},", "- %% {reconnect_delay, 2.5}]}", "- ]},", "-", "- %% ----------------------------------------------------------------------------", "- %% RabbitMQ Stomp Adapter", "- %%", "- %% See http://www.rabbitmq.com/stomp.html for details", "- %% ----------------------------------------------------------------------------", "-", "- {rabbitmq_stomp,", "- [%% Network Configuration - the format is generally the same as for the broker", "-", "- %% Listen only on localhost (ipv4 & ipv6) on a specific port.", "- %% {tcp_listeners, [{\"127.0.0.1\", 61613},", "- %% {\"::1\", 61613}]},", "-", "- %% Listen for SSL connections on a specific port.", "- %% {ssl_listeners, [61614]},", "-", "- %% Additional SSL options", "-", "- %% Extract a name from the client's certificate when using SSL.", "- %%", "- %% {ssl_cert_login, true},", "-", "- %% Set a default user name and password. This is used as the default login", "- %% whenever a CONNECT frame omits the login and passcode headers.", "- %%", "- %% Please note that setting this will allow clients to connect without", "- %% authenticating!", "- %%", "- %% {default_user, [{login, \"guest\"},", "- %% {passcode, \"guest\"}]},", "-", "- %% If a default user is configured, or you have configured use SSL client", "- %% certificate based authentication, you can choose to allow clients to", "- %% omit the CONNECT frame entirely. If set to true, the client is", "- %% automatically connected as the default user or user supplied in the", "- %% SSL certificate whenever the first frame sent on a session is not a", "- %% CONNECT frame.", "- %%", "- %% {implicit_connect, true}", "- ]},", "-", "- %% ----------------------------------------------------------------------------", "- %% RabbitMQ MQTT Adapter", "- %%", "- %% See http://hg.rabbitmq.com/rabbitmq-mqtt/file/stable/README.md for details", "- %% ----------------------------------------------------------------------------", "-", "- {rabbitmq_mqtt,", "- [%% Set the default user name and password. Will be used as the default login", "- %% if a connecting client provides no other login details.", "- %%", "- %% Please note that setting this will allow clients to connect without", "- %% authenticating!", "- %%", "- %% {default_user, <<\"guest\">>},", "- %% {default_pass, <<\"guest\">>},", "-", "- %% Enable anonymous access. If this is set to false, clients MUST provide", "- %% login information in order to connect. 
See the default_user/default_pass", "- %% configuration elements for managing logins without authentication.", "- %%", "- %% {allow_anonymous, true},", "-", "- %% If you have multiple vhosts, specify the one to which the", "- %% adapter connects.", "- %%", "- %% {vhost, <<\"/\">>},", "-", "- %% Specify the exchange to which messages from MQTT clients are published.", "- %%", "- %% {exchange, <<\"amq.topic\">>},", "-", "- %% Specify TTL (time to live) to control the lifetime of non-clean sessions.", "- %%", "- %% {subscription_ttl, 1800000},", "-", "- %% Set the prefetch count (governing the maximum number of unacknowledged", "- %% messages that will be delivered).", "- %%", "- %% {prefetch, 10},", "-", "- %% TCP/SSL Configuration (as per the broker configuration).", "- %%", "- %% {tcp_listeners, [1883]},", "- %% {ssl_listeners, []},", "-", "- %% TCP/Socket options (as per the broker configuration).", "- %%", "- %% {tcp_listen_options, [binary,", "- %% {packet, raw},", "- %% {reuseaddr, true},", "- %% {backlog, 128},", "- %% {nodelay, true}]}", "- ]},", "-", "- %% ----------------------------------------------------------------------------", "- %% RabbitMQ AMQP 1.0 Support", "- %%", "- %% See http://hg.rabbitmq.com/rabbitmq-amqp1.0/file/default/README.md", "- %% for details", "- %% ----------------------------------------------------------------------------", "-", "- {rabbitmq_amqp1_0,", "- [%% Connections that are not authenticated with SASL will connect as this", "- %% account. See the README for more information.", "- %%", "- %% Please note that setting this will allow clients to connect without", "- %% authenticating!", "- %%", "- %% {default_user, \"guest\"},", "-", "- %% Enable protocol strict mode. See the README for more information.", "-", "- %% {protocol_strict_mode, false}", "- ]},", "-", "- %% ----------------------------------------------------------------------------", "- %% RabbitMQ LDAP Plugin", "- %%", "- %% See http://www.rabbitmq.com/ldap.html for details.", "- %%", "- %% ----------------------------------------------------------------------------", "-", "- {rabbitmq_auth_backend_ldap,", "- [%%", "- %% Connecting to the LDAP server(s)", "- %% ================================", "- %%", "-", "- %% Specify servers to bind to. You *must* set this in order for the plugin", "- %% to work properly.", "- %%", "- %% {servers, [\"your-server-name-goes-here\"]},", "-", "- %% Connect to the LDAP server using SSL", "- %%", "- %% {use_ssl, false},", "-", "- %% Specify the LDAP port to connect to", "- %%", "- %% {port, 389},", "-", "- %% LDAP connection timeout, in milliseconds or 'infinity'", "- %%", "- %% {timeout, infinity},", "-", "- %% Enable logging of LDAP queries.", "- %% One of", "- %% - false (no logging is performed)", "- %% - true (verbose logging of the logic used by the plugin)", "- %% - network (as true, but additionally logs LDAP network traffic)", "- %%", "- %% Defaults to false.", "- %%", "- %% {log, false},", "-", "- %%", "- %% Authentication", "- %% ==============", "- %%", "-", "- %% Pattern to convert the username given through AMQP to a DN before", "- %% binding", "- %%", "- %% {user_dn_pattern, \"cn=${username},ou=People,dc=example,dc=com\"},", "-", "- %% Alternatively, you can convert a username to a Distinguished", "- %% Name via an LDAP lookup after binding. 
See the documentation for", "- %% full details.", "-", "- %% When converting a username to a dn via a lookup, set these to", "- %% the name of the attribute that represents the user name, and the", "- %% base DN for the lookup query.", "- %%", "- %% {dn_lookup_attribute, \"userPrincipalName\"},", "- %% {dn_lookup_base, \"DC=gopivotal,DC=com\"},", "-", "- %% Controls how to bind for authorisation queries and also to", "- %% retrieve the details of users logging in without presenting a", "- %% password (e.g., SASL EXTERNAL).", "- %% One of", "- %% - as_user (to bind as the authenticated user - requires a password)", "- %% - anon (to bind anonymously)", "- %% - {UserDN, Password} (to bind with a specified user name and password)", "- %%", "- %% Defaults to 'as_user'.", "- %%", "- %% {other_bind, as_user},", "-", "- %%", "- %% Authorisation", "- %% =============", "- %%", "-", "- %% The LDAP plugin can perform a variety of queries against your", "- %% LDAP server to determine questions of authorisation. See", "- %% http://www.rabbitmq.com/ldap.html#authorisation for more", "- %% information.", "-", "- %% Set the query to use when determining vhost access", "- %%", "- %% {vhost_access_query, {in_group,", "- %% \"ou=${vhost}-users,ou=vhosts,dc=example,dc=com\"}},", "-", "- %% Set the query to use when determining resource (e.g., queue) access", "- %%", "- %% {resource_access_query, {constant, true}},", "-", "- %% Set queries to determine which tags a user has", "- %%", "- %% {tag_queries, []}", " ]}", " ].", "+% EOF", "Info: /Stage[main]/Rabbitmq::Config/File[rabbitmq.config]: Filebucketed /etc/rabbitmq/rabbitmq.config to puppet with sum 3e342d4a660626a9b588a723ad6cba74", "Notice: /Stage[main]/Rabbitmq::Config/File[rabbitmq.config]/content: content changed '{md5}3e342d4a660626a9b588a723ad6cba74' to '{md5}808c7824d2fe3217e34c0f11b45084ed'", "Info: /Stage[main]/Rabbitmq::Config/File[rabbitmq.config]: Scheduling refresh of Class[Rabbitmq::Service]", "Notice: /Stage[main]/Rabbitmq::Config/File[rabbitmqadmin.conf]/ensure: defined content as '{md5}56b4bb3dfb32765e14d2a04faea60e62'", "Notice: /Stage[main]/Rabbitmq::Config/File[/etc/systemd/system/rabbitmq-server.service.d]/ensure: created", "Notice: /Stage[main]/Keystone::Deps/Anchor[keystone::install::end]: Triggered 'refresh' from 1 events", "Info: /Stage[main]/Keystone::Deps/Anchor[keystone::install::end]: Scheduling refresh of Anchor[keystone::service::begin]", "Info: /Stage[main]/Keystone::Deps/Anchor[keystone::install::end]: Scheduling refresh of Exec[keystone-manage db_sync]", "Notice: /Stage[main]/Keystone::Wsgi::Apache/File[/var/www/cgi-bin/keystone]/ensure: created", "Notice: /Stage[main]/Keystone::Wsgi::Apache/File[keystone_wsgi_admin]/ensure: defined content as '{md5}b60f70d60e09d39ab5900f4b4eebf921'", "Notice: /Stage[main]/Keystone::Wsgi::Apache/File[keystone_wsgi_main]/ensure: defined content as '{md5}b60f70d60e09d39ab5900f4b4eebf921'", "Notice: /Stage[main]/Openstack_integration::Keystone/Openstack_integration::Ssl_key[keystone]/File[/etc/keystone/ssl]/ensure: created", "Notice: /Stage[main]/Openstack_integration::Keystone/Openstack_integration::Ssl_key[keystone]/File[/etc/keystone/ssl/private]/ensure: created", "Notice: /Stage[main]/Openstack_integration::Keystone/Openstack_integration::Ssl_key[keystone]/File[/etc/keystone/ssl/private/n2.dusty.ci.centos.org.pem]/ensure: defined content as '{md5}a4b1e9413d7b63f33794716d77cfaa00'", "Info: Openstack_integration::Ssl_key[keystone]: Scheduling refresh of Service[httpd]",
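
For readability: the '+' lines in the rabbitmq.config diff above amount to an SSL-only broker setup (no plain-TCP AMQP listener, AMQP over SSL on 5671, management over SSL on 15671, guest/guest as the default account, and verify_none, i.e. no client-certificate checking, on the server side). In puppetlabs-rabbitmq terms this is roughly what a manifest like the following renders; a minimal sketch read off the rendered file, not the exact manifest this job applies:

  class { '::rabbitmq':
    ssl                      => true,
    ssl_only                 => true,   # tcp_listeners is empty, ssl_listeners is [5671]
    ssl_port                 => 5671,
    ssl_cacert               => '/etc/ssl/certs/ca-bundle.crt',
    ssl_cert                 => '/etc/pki/ca-trust/source/anchors/puppet_openstack.pem',
    ssl_key                  => '/etc/rabbitmq/ssl/private/n2.dusty.ci.centos.org.pem',
    ssl_verify               => 'verify_none',
    ssl_fail_if_no_peer_cert => false,
    ssl_management_port      => 15671,  # assumed parameter; matches the management listener above
    default_user             => 'guest',
    default_pass             => 'guest',
  }
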
"Notice: /Stage[main]/Rabbitmq::Config/File[/etc/systemd/system/rabbitmq-server.service.d/limits.conf]/ensure: defined content as '{md5}8eb9ff6c576b9869944215af3a568c2e'", "Info: /Stage[main]/Rabbitmq::Config/File[/etc/systemd/system/rabbitmq-server.service.d/limits.conf]: Scheduling refresh of Exec[rabbitmq-systemd-reload]", "Notice: /Stage[main]/Rabbitmq::Config/Exec[rabbitmq-systemd-reload]: Triggered 'refresh' from 1 events", "Info: /Stage[main]/Rabbitmq::Config/Exec[rabbitmq-systemd-reload]: Scheduling refresh of Class[Rabbitmq::Service]", "Notice: /Stage[main]/Keystone::Cron::Token_flush/Cron[keystone-manage token_flush]/ensure: created", "Notice: /Stage[main]/Rabbitmq::Config/File[/etc/rabbitmq/ssl]/ensure: created", "Notice: /Stage[main]/Openstack_integration::Rabbitmq/File[/etc/rabbitmq/ssl/private]/ensure: created", "Notice: /Stage[main]/Openstack_integration::Rabbitmq/Openstack_integration::Ssl_key[rabbitmq]/File[/etc/rabbitmq/ssl/private/n2.dusty.ci.centos.org.pem]/ensure: defined content as '{md5}a4b1e9413d7b63f33794716d77cfaa00'", "Info: Openstack_integration::Ssl_key[rabbitmq]: Scheduling refresh of Service[rabbitmq-server]", "Notice: /Stage[main]/Keystone/Keystone_config[ssl/ca_certs]/ensure: created", "Info: /Stage[main]/Keystone/Keystone_config[ssl/ca_certs]: Scheduling refresh of Anchor[keystone::config::end]", "Notice: /Stage[main]/Keystone/Keystone_config[fernet_tokens/key_repository]/ensure: created", "Info: /Stage[main]/Keystone/Keystone_config[fernet_tokens/key_repository]: Scheduling refresh of Anchor[keystone::config::end]", "Notice: /Stage[main]/Keystone/Keystone_config[ssl/cert_subject]/ensure: created", "Info: /Stage[main]/Keystone/Keystone_config[ssl/cert_subject]: Scheduling refresh of Anchor[keystone::config::end]", "Notice: /Stage[main]/Keystone/Keystone_config[signing/keyfile]/ensure: created", "Info: /Stage[main]/Keystone/Keystone_config[signing/keyfile]: Scheduling refresh of Anchor[keystone::config::end]", "Notice: /Stage[main]/Keystone/Keystone_config[catalog/driver]/ensure: created", "Info: /Stage[main]/Keystone/Keystone_config[catalog/driver]: Scheduling refresh of Anchor[keystone::config::end]", "Notice: /Stage[main]/Keystone/Keystone_config[oslo_messaging_rabbit/rabbit_ha_queues]/ensure: created", "Info: /Stage[main]/Keystone/Keystone_config[oslo_messaging_rabbit/rabbit_ha_queues]: Scheduling refresh of Anchor[keystone::config::end]", "Notice: /Stage[main]/Keystone/Keystone_config[ssl/ca_key]/ensure: created", "Info: /Stage[main]/Keystone/Keystone_config[ssl/ca_key]: Scheduling refresh of Anchor[keystone::config::end]", "Notice: /Stage[main]/Keystone/Keystone_config[ssl/enable]/ensure: created", "Info: /Stage[main]/Keystone/Keystone_config[ssl/enable]: Scheduling refresh of Anchor[keystone::config::end]", "Notice: /Stage[main]/Keystone/Keystone_config[token/provider]/ensure: created", "Info: /Stage[main]/Keystone/Keystone_config[token/provider]: Scheduling refresh of Anchor[keystone::config::end]", "Notice: /Stage[main]/Keystone/Keystone_config[signing/key_size]/ensure: created", "Info: /Stage[main]/Keystone/Keystone_config[signing/key_size]: Scheduling refresh of Anchor[keystone::config::end]", "Notice: /Stage[main]/Keystone/Keystone_config[DEFAULT/public_port]/ensure: created", "Info: /Stage[main]/Keystone/Keystone_config[DEFAULT/public_port]: Scheduling refresh of Anchor[keystone::config::end]", "Notice: /Stage[main]/Keystone/Keystone_config[signing/ca_certs]/ensure: created", "Info: /Stage[main]/Keystone/Keystone_config[signing/ca_certs]: Scheduling 
refresh of Anchor[keystone::config::end]", "Notice: /Stage[main]/Keystone/Keystone_config[signing/ca_key]/ensure: created", "Info: /Stage[main]/Keystone/Keystone_config[signing/ca_key]: Scheduling refresh of Anchor[keystone::config::end]", "Notice: /Stage[main]/Keystone/Keystone_config[ssl/certfile]/ensure: created", "Info: /Stage[main]/Keystone/Keystone_config[ssl/certfile]: Scheduling refresh of Anchor[keystone::config::end]", "Notice: /Stage[main]/Apache/Apache::Vhost[default]/Concat[15-default.conf]/File[/var/lib/puppet/concat/15-default.conf]/ensure: created", "Info: /Stage[main]/Apache/Apache::Vhost[default]/Concat[15-default.conf]/File[/var/lib/puppet/concat/15-default.conf]: Scheduling refresh of Exec[concat_15-default.conf]", "Notice: /Stage[main]/Apache/Apache::Vhost[default]/Concat[15-default.conf]/File[/var/lib/puppet/concat/15-default.conf/fragments]/ensure: created", "Info: /Stage[main]/Apache/Apache::Vhost[default]/Concat[15-default.conf]/File[/var/lib/puppet/concat/15-default.conf/fragments]: Scheduling refresh of Exec[concat_15-default.conf]", "Notice: /Stage[main]/Apache/Apache::Vhost[default]/Concat::Fragment[default-serversignature]/File[/var/lib/puppet/concat/15-default.conf/fragments/90_default-serversignature]/ensure: defined content as '{md5}9bf5a458783ab459e5043e1cdf671fa7'", "Info: /Stage[main]/Apache/Apache::Vhost[default]/Concat::Fragment[default-serversignature]/File[/var/lib/puppet/concat/15-default.conf/fragments/90_default-serversignature]: Scheduling refresh of Exec[concat_15-default.conf]", "Notice: /Stage[main]/Apache/Apache::Vhost[default]/Concat::Fragment[default-directories]/File[/var/lib/puppet/concat/15-default.conf/fragments/60_default-directories]/ensure: defined content as '{md5}5e2a84875965faa5e3df0e222301ba37'", "Info: /Stage[main]/Apache/Apache::Vhost[default]/Concat::Fragment[default-directories]/File[/var/lib/puppet/concat/15-default.conf/fragments/60_default-directories]: Scheduling refresh of Exec[concat_15-default.conf]", "Notice: /Stage[main]/Apache/Apache::Vhost[default]/Concat::Fragment[default-docroot]/File[/var/lib/puppet/concat/15-default.conf/fragments/10_default-docroot]/ensure: defined content as '{md5}6faaccbc7ca8bc885ebf139223885d52'", "Info: /Stage[main]/Apache/Apache::Vhost[default]/Concat::Fragment[default-docroot]/File[/var/lib/puppet/concat/15-default.conf/fragments/10_default-docroot]: Scheduling refresh of Exec[concat_15-default.conf]", "Notice: /Stage[main]/Apache/Apache::Vhost[default]/Concat::Fragment[default-apache-header]/File[/var/lib/puppet/concat/15-default.conf/fragments/0_default-apache-header]/ensure: defined content as '{md5}c46eea5ff4d7874403fa7a9228888f0e'", "Info: /Stage[main]/Apache/Apache::Vhost[default]/Concat::Fragment[default-apache-header]/File[/var/lib/puppet/concat/15-default.conf/fragments/0_default-apache-header]: Scheduling refresh of Exec[concat_15-default.conf]", "Notice: /Stage[main]/Apache/Apache::Vhost[default]/Concat[15-default.conf]/File[/var/lib/puppet/concat/15-default.conf/fragments.concat]/ensure: created", "Notice: /Stage[main]/Keystone::Logging/Keystone_config[DEFAULT/debug]/ensure: created", "Info: /Stage[main]/Keystone::Logging/Keystone_config[DEFAULT/debug]: Scheduling refresh of Anchor[keystone::config::end]", "Notice: /Stage[main]/Keystone/Keystone_config[ssl/keyfile]/ensure: created", "Info: /Stage[main]/Keystone/Keystone_config[ssl/keyfile]: Scheduling refresh of Anchor[keystone::config::end]", "Notice: /Stage[main]/Keystone/Keystone_config[signing/certfile]/ensure: created", 
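
Each Keystone_config[section/option] resource in this run manages a single INI setting in /etc/keystone/keystone.conf through puppet-keystone's keystone_config type; the log records only that a setting was created, never its value. A minimal sketch with an illustrative value:

  keystone_config { 'token/provider':
    value => 'uuid',  # illustrative; the actual value is not printed in this log
  }
  # renders in /etc/keystone/keystone.conf as:
  #   [token]
  #   provider = uuid
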
"Info: /Stage[main]/Keystone/Keystone_config[signing/certfile]: Scheduling refresh of Anchor[keystone::config::end]", "Notice: /Stage[main]/Keystone/Keystone_config[DEFAULT/admin_bind_host]/ensure: created", "Info: /Stage[main]/Keystone/Keystone_config[DEFAULT/admin_bind_host]: Scheduling refresh of Anchor[keystone::config::end]", "Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Apache::Listen[35357]/Concat::Fragment[Listen 35357]/File[/var/lib/puppet/concat/_etc_httpd_conf_ports.conf/fragments/10_Listen 35357]/ensure: defined content as '{md5}37dc13694e40f667def8eaa0cc261d03'", "Info: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Apache::Listen[35357]/Concat::Fragment[Listen 35357]/File[/var/lib/puppet/concat/_etc_httpd_conf_ports.conf/fragments/10_Listen 35357]: Scheduling refresh of Exec[concat_/etc/httpd/conf/ports.conf]", "Notice: /Stage[main]/Keystone/Keystone_config[eventlet_server/public_workers]/ensure: created", "Info: /Stage[main]/Keystone/Keystone_config[eventlet_server/public_workers]: Scheduling refresh of Anchor[keystone::config::end]", "Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Apache::Listen[5000]/Concat::Fragment[Listen 5000]/File[/var/lib/puppet/concat/_etc_httpd_conf_ports.conf/fragments/10_Listen 5000]/ensure: defined content as '{md5}9ce4fddc0fe1c0dd6016a171946def55'", "Info: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Apache::Listen[5000]/Concat::Fragment[Listen 5000]/File[/var/lib/puppet/concat/_etc_httpd_conf_ports.conf/fragments/10_Listen 5000]: Scheduling refresh of Exec[concat_/etc/httpd/conf/ports.conf]", "Notice: /Stage[main]/Apache/Apache::Vhost[default]/Apache::Listen[80]/Concat::Fragment[Listen 80]/File[/var/lib/puppet/concat/_etc_httpd_conf_ports.conf/fragments/10_Listen 80]/ensure: defined content as '{md5}d5fcefc335117f400d451de47efeca87'", "Info: /Stage[main]/Apache/Apache::Vhost[default]/Apache::Listen[80]/Concat::Fragment[Listen 80]/File[/var/lib/puppet/concat/_etc_httpd_conf_ports.conf/fragments/10_Listen 80]: Scheduling refresh of Exec[concat_/etc/httpd/conf/ports.conf]", "Notice: /Stage[main]/Keystone/Keystone_config[catalog/template_file]/ensure: created", "Info: /Stage[main]/Keystone/Keystone_config[catalog/template_file]: Scheduling refresh of Anchor[keystone::config::end]", "Notice: /Stage[main]/Keystone/Keystone_config[token/driver]/ensure: created", "Info: /Stage[main]/Keystone/Keystone_config[token/driver]: Scheduling refresh of Anchor[keystone::config::end]", "Notice: /Stage[main]/Keystone/Keystone_config[DEFAULT/public_bind_host]/ensure: created", "Info: /Stage[main]/Keystone/Keystone_config[DEFAULT/public_bind_host]: Scheduling refresh of Anchor[keystone::config::end]", "Notice: /Stage[main]/Keystone/Keystone_config[DEFAULT/admin_token]/ensure: created", "Info: /Stage[main]/Keystone/Keystone_config[DEFAULT/admin_token]: Scheduling refresh of Anchor[keystone::config::end]", "Notice: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Apache::Listen[8774]/Concat::Fragment[Listen 8774]/File[/var/lib/puppet/concat/_etc_httpd_conf_ports.conf/fragments/10_Listen 8774]/ensure: defined content as '{md5}edb2a81e84f59aaa4978ff2d53c01a3e'", "Info: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Apache::Listen[8774]/Concat::Fragment[Listen 8774]/File[/var/lib/puppet/concat/_etc_httpd_conf_ports.conf/fragments/10_Listen 8774]: 
Scheduling refresh of Exec[concat_/etc/httpd/conf/ports.conf]", "Notice: /Stage[main]/Keystone::Logging/Keystone_config[DEFAULT/log_dir]/ensure: created", "Info: /Stage[main]/Keystone::Logging/Keystone_config[DEFAULT/log_dir]: Scheduling refresh of Anchor[keystone::config::end]", "Notice: /Stage[main]/Keystone/Keystone_config[DEFAULT/admin_port]/ensure: created", "Info: /Stage[main]/Keystone/Keystone_config[DEFAULT/admin_port]: Scheduling refresh of Anchor[keystone::config::end]", "Notice: /Stage[main]/Apache/Apache::Vhost[default]/Concat[15-default.conf]/File[/var/lib/puppet/concat/15-default.conf/fragments.concat.out]/ensure: created", "Notice: /Stage[main]/Apache/Apache::Vhost[default]/Concat::Fragment[default-file_footer]/File[/var/lib/puppet/concat/15-default.conf/fragments/999_default-file_footer]/ensure: defined content as '{md5}e27b2525783e590ca1820f1e2118285d'", "Info: /Stage[main]/Apache/Apache::Vhost[default]/Concat::Fragment[default-file_footer]/File[/var/lib/puppet/concat/15-default.conf/fragments/999_default-file_footer]: Scheduling refresh of Exec[concat_15-default.conf]", "Notice: /Stage[main]/Keystone/Keystone_config[signing/cert_subject]/ensure: created", "Info: /Stage[main]/Keystone/Keystone_config[signing/cert_subject]: Scheduling refresh of Anchor[keystone::config::end]", "Notice: /Stage[main]/Apache/Apache::Vhost[default]/Concat::Fragment[default-scriptalias]/File[/var/lib/puppet/concat/15-default.conf/fragments/200_default-scriptalias]/ensure: defined content as '{md5}7fc65400381c3a010f38870f94f236f0'", "Info: /Stage[main]/Apache/Apache::Vhost[default]/Concat::Fragment[default-scriptalias]/File[/var/lib/puppet/concat/15-default.conf/fragments/200_default-scriptalias]: Scheduling refresh of Exec[concat_15-default.conf]", "Notice: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Apache::Listen[6385]/Concat::Fragment[Listen 6385]/File[/var/lib/puppet/concat/_etc_httpd_conf_ports.conf/fragments/10_Listen 6385]/ensure: defined content as '{md5}dab46123b45901c26ef6386ec1195db9'", "Info: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Apache::Listen[6385]/Concat::Fragment[Listen 6385]/File[/var/lib/puppet/concat/_etc_httpd_conf_ports.conf/fragments/10_Listen 6385]: Scheduling refresh of Exec[concat_/etc/httpd/conf/ports.conf]", "Notice: /Stage[main]/Apache/Concat[/etc/httpd/conf/ports.conf]/Exec[concat_/etc/httpd/conf/ports.conf]/returns: executed successfully", "Notice: /Stage[main]/Apache/Concat[/etc/httpd/conf/ports.conf]/Exec[concat_/etc/httpd/conf/ports.conf]: Triggered 'refresh' from 8 events", "Notice: /Stage[main]/Apache/Concat[/etc/httpd/conf/ports.conf]/File[/etc/httpd/conf/ports.conf]/ensure: defined content as '{md5}ae39e379894fcb4065bbee3724f7036d'", "Info: Concat[/etc/httpd/conf/ports.conf]: Scheduling refresh of Class[Apache::Service]", "Notice: /Stage[main]/Apache/File[/etc/httpd/conf/httpd.conf]/content: ", "--- /etc/httpd/conf/httpd.conf\t2016-05-12 11:16:14.000000000 +0100", "+++ /tmp/puppet-file20160520-26469-t9ohgg\t2016-05-20 12:29:49.253135506 +0100", "@@ -1,353 +1,49 @@", "-#", "-# This is the main Apache HTTP server configuration file. It contains the", "-# configuration directives that give the server its instructions.", "-# See for detailed information.", "-# In particular, see ", "-# ", "-# for a discussion of each configuration directive.", "-#", "-# Do NOT simply read the instructions in here without understanding", "-# what they do. 
They're here only as hints or reminders. If you are unsure", "-# consult the online docs. You have been warned. ", "-#", "-# Configuration and logfile names: If the filenames you specify for many", "-# of the server's control files begin with \"/\" (or \"drive:/\" for Win32), the", "-# server will use that explicit path. If the filenames do *not* begin", "-# with \"/\", the value of ServerRoot is prepended -- so 'log/access_log'", "-# with ServerRoot set to '/www' will be interpreted by the", "-# server as '/www/log/access_log', where as '/log/access_log' will be", "-# interpreted as '/log/access_log'.", "-", "-#", "-# ServerRoot: The top of the directory tree under which the server's", "-# configuration, error, and log files are kept.", "-#", "-# Do not add a slash at the end of the directory path. If you point", "-# ServerRoot at a non-local disk, be sure to specify a local disk on the", "-# Mutex directive, if file-based mutexes are used. If you wish to share the", "-# same ServerRoot for multiple httpd daemons, you will need to change at", "-# least PidFile.", "-#", "+# Security", "+ServerTokens OS", "+ServerSignature On", "+TraceEnable On", "+", "+ServerName \"n2.dusty.ci.centos.org\"", " ServerRoot \"/etc/httpd\"", "+PidFile run/httpd.pid", "+Timeout 120", "+KeepAlive Off", "+MaxKeepAliveRequests 100", "+KeepAliveTimeout 15", "+LimitRequestFieldSize 8190", "+", " ", "-#", "-# Listen: Allows you to bind Apache to specific IP addresses and/or", "-# ports, instead of the default. See also the ", "-# directive.", "-#", "-# Change this to Listen on specific IP addresses as shown below to ", "-# prevent Apache from glomming onto all bound IP addresses.", "-#", "-#Listen 12.34.56.78:80", "-Listen 80", "-", "-#", "-# Dynamic Shared Object (DSO) Support", "-#", "-# To be able to use the functionality of a module which was built as a DSO you", "-# have to place corresponding `LoadModule' lines at this location so the", "-# directives contained in it are actually available _before_ they are used.", "-# Statically compiled modules (those listed by `httpd -l') do not need", "-# to be loaded here.", "-#", "-# Example:", "-# LoadModule foo_module modules/mod_foo.so", "-#", "-Include conf.modules.d/*.conf", "-", "-#", "-# If you wish httpd to run as a different user or group, you must run", "-# httpd as root initially and it will switch. ", "-#", "-# User/Group: The name (or #number) of the user/group to run httpd as.", "-# It is usually good practice to create a dedicated user and group for", "-# running httpd, as with most system services.", "-#", " User apache", " Group apache", " ", "-# 'Main' server configuration", "-#", "-# The directives in this section set up the values used by the 'main'", "-# server, which responds to any requests that aren't handled by a", "-# definition. These values also provide defaults for", "-# any containers you may define later in the file.", "-#", "-# All of these directives may appear inside containers,", "-# in which case these default settings will be overridden for the", "-# virtual host being defined.", "-#", "-", "-#", "-# ServerAdmin: Your address, where problems with the server should be", "-# e-mailed. This address appears on some server-generated pages, such", "-# as error documents. e.g. 
admin@your-domain.com", "-#", "-ServerAdmin root@localhost", "-", "-#", "-# ServerName gives the name and port that the server uses to identify itself.", "-# This can often be determined automatically, but we recommend you specify", "-# it explicitly to prevent problems during startup.", "-#", "-# If your host doesn't have a registered DNS name, enter its IP address here.", "-#", "-#ServerName www.example.com:80", "-", "-#", "-# Deny access to the entirety of your server's filesystem. You must", "-# explicitly permit access to web content directories in other ", "-# blocks below.", "-#", "-", "- AllowOverride none", "+AccessFileName .htaccess", "+", " Require all denied", "-", "+", " ", "-#", "-# Note that from this point forward you must specifically allow", "-# particular features to be enabled - so if something's not working as", "-# you might expect, make sure that you have specifically enabled it", "-# below.", "-#", "-", "-#", "-# DocumentRoot: The directory out of which you will serve your", "-# documents. By default, all requests are taken from this directory, but", "-# symbolic links and aliases may be used to point to other locations.", "-#", "-DocumentRoot \"/var/www/html\"", "-", "-#", "-# Relax access to content within /var/www.", "-#", "-", "- AllowOverride None", "- # Allow open access:", "- Require all granted", "+", "+ Options FollowSymLinks", "+ AllowOverride None", " ", " ", "-# Further relax access to the default document root:", "-", "- #", "- # Possible values for the Options directive are \"None\", \"All\",", "- # or any combination of:", "- # Indexes Includes FollowSymLinks SymLinksifOwnerMatch ExecCGI MultiViews", "- #", "- # Note that \"MultiViews\" must be named *explicitly* --- \"Options All\"", "- # doesn't give it to you.", "- #", "- # The Options directive is both complicated and important. Please see", "- # http://httpd.apache.org/docs/2.4/mod/core.html#options", "- # for more information.", "- #", "- Options Indexes FollowSymLinks", "-", "- #", "- # AllowOverride controls what directives may be placed in .htaccess files.", "- # It can be \"All\", \"None\", or any combination of the keywords:", "- # Options FileInfo AuthConfig Limit", "- #", "- AllowOverride None", "-", "- #", "- # Controls who can get stuff from this server.", "- #", "- Require all granted", "-", " ", "-#", "-# DirectoryIndex: sets the file that Apache will serve if a directory", "-# is requested.", "-#", "-", "- DirectoryIndex index.html", "-", "-", "-#", "-# The following lines prevent .htaccess and .htpasswd files from being ", "-# viewed by Web clients. ", "-#", "-", "- Require all denied", "-", "-", "-#", "-# ErrorLog: The location of the error log file.", "-# If you do not specify an ErrorLog directive within a ", "-# container, error messages relating to that virtual host will be", "-# logged here. 
If you *do* define an error logfile for a ", "-# container, that host's errors will be logged there and not here.", "-#", "-ErrorLog \"logs/error_log\"", "-", "-#", "-# LogLevel: Control the number of messages logged to the error_log.", "-# Possible values include: debug, info, notice, warn, error, crit,", "-# alert, emerg.", "-#", "+HostnameLookups Off", "+ErrorLog \"/var/log/httpd/error_log\"", " LogLevel warn", "+EnableSendfile On", " ", "-", "- #", "- # The following directives define some format nicknames for use with", "- # a CustomLog directive (see below).", "- #", "- LogFormat \"%h %l %u %t \\\"%r\\\" %>s %b \\\"%{Referer}i\\\" \\\"%{User-Agent}i\\\"\" combined", "- LogFormat \"%h %l %u %t \\\"%r\\\" %>s %b\" common", "-", "- ", "- # You need to enable mod_logio.c to use %I and %O", "- LogFormat \"%h %l %u %t \\\"%r\\\" %>s %b \\\"%{Referer}i\\\" \\\"%{User-Agent}i\\\" %I %O\" combinedio", "- ", "-", "- #", "- # The location and format of the access logfile (Common Logfile Format).", "- # If you do not define any access logfiles within a ", "- # container, they will be logged here. Contrariwise, if you *do*", "- # define per- access logfiles, transactions will be", "- # logged therein and *not* in this file.", "- #", "- #CustomLog \"logs/access_log\" common", "-", "- #", "- # If you prefer a logfile with access, agent, and referer information", "- # (Combined Logfile Format) you can use the following directive.", "- #", "- CustomLog \"logs/access_log\" combined", "-", "-", "-", "- #", "- # Redirect: Allows you to tell clients about documents that used to ", "- # exist in your server's namespace, but do not anymore. The client ", "- # will make a new request for the document at its new location.", "- # Example:", "- # Redirect permanent /foo http://www.example.com/bar", "-", "- #", "- # Alias: Maps web paths into filesystem paths and is used to", "- # access content that does not live under the DocumentRoot.", "- # Example:", "- # Alias /webpath /full/filesystem/path", "- #", "- # If you include a trailing / on /webpath then the server will", "- # require it to be present in the URL. You will also likely", "- # need to provide a section to allow access to", "- # the filesystem path.", "-", "- #", "- # ScriptAlias: This controls which directories contain server scripts. ", "- # ScriptAliases are essentially the same as Aliases, except that", "- # documents in the target directory are treated as applications and", "- # run by the server when requested rather than as documents sent to the", "- # client. 
The same rules about trailing \"/\" apply to ScriptAlias", "- # directives as to Alias.", "- #", "- ScriptAlias /cgi-bin/ \"/var/www/cgi-bin/\"", "-", "-", "-", "-#", "-# \"/var/www/cgi-bin\" should be changed to whatever your ScriptAliased", "-# CGI directory exists, if you have that configured.", "-#", "-", "- AllowOverride None", "- Options None", "- Require all granted", "-", "+#Listen 80", "+", "+", "+Include \"/etc/httpd/conf.modules.d/*.load\"", "+Include \"/etc/httpd/conf.modules.d/*.conf\"", "+Include \"/etc/httpd/conf/ports.conf\"", "+", "+LogFormat \"%h %l %u %t \\\"%r\\\" %>s %b \\\"%{Referer}i\\\" \\\"%{User-Agent}i\\\"\" combined", "+LogFormat \"%h %l %u %t \\\"%r\\\" %>s %b\" common", "+LogFormat \"%{Referer}i -> %U\" referer", "+LogFormat \"%{User-agent}i\" agent", "+LogFormat \"%{X-Forwarded-For}i %l %u %t \\\"%r\\\" %s %b \\\"%{Referer}i\\\" \\\"%{User-agent}i\\\"\" forwarded", "+", "+IncludeOptional \"/etc/httpd/conf.d/*.conf\"", " ", "-", "- #", "- # TypesConfig points to the file containing the list of mappings from", "- # filename extension to MIME-type.", "- #", "- TypesConfig /etc/mime.types", "-", "- #", "- # AddType allows you to add to or override the MIME configuration", "- # file specified in TypesConfig for specific file types.", "- #", "- #AddType application/x-gzip .tgz", "- #", "- # AddEncoding allows you to have certain browsers uncompress", "- # information on the fly. Note: Not all browsers support this.", "- #", "- #AddEncoding x-compress .Z", "- #AddEncoding x-gzip .gz .tgz", "- #", "- # If the AddEncoding directives above are commented-out, then you", "- # probably should define those extensions to indicate media types:", "- #", "- AddType application/x-compress .Z", "- AddType application/x-gzip .gz .tgz", "-", "- #", "- # AddHandler allows you to map certain file extensions to \"handlers\":", "- # actions unrelated to filetype. These can be either built into the server", "- # or added with the Action directive (see below)", "- #", "- # To use CGI scripts outside of ScriptAliased directories:", "- # (You will also need to add \"ExecCGI\" to the \"Options\" directive.)", "- #", "- #AddHandler cgi-script .cgi", "-", "- # For type maps (negotiated resources):", "- #AddHandler type-map var", "-", "- #", "- # Filters allow you to process content before it is sent to the client.", "- #", "- # To parse .shtml files for server-side includes (SSI):", "- # (You will also need to add \"Includes\" to the \"Options\" directive.)", "- #", "- AddType text/html .shtml", "- AddOutputFilter INCLUDES .shtml", "-", "-", "-#", "-# Specify a default charset for all content served; this enables", "-# interpretation of all content as UTF-8 by default. To use the ", "-# default browser choice (ISO-8859-1), or to allow the META tags", "-# in HTML content to override this choice, comment out this", "-# directive:", "-#", "-AddDefaultCharset UTF-8", "-", "-", "- #", "- # The mod_mime_magic module allows the server to use various hints from the", "- # contents of the file itself to determine its type. 
The MIMEMagicFile", "- # directive tells the module where the hint definitions are located.", "- #", "- MIMEMagicFile conf/magic", "-", "-", "-#", "-# Customizable error responses come in three flavors:", "-# 1) plain text 2) local redirects 3) external redirects", "-#", "-# Some examples:", "-#ErrorDocument 500 \"The server made a boo boo.\"", "-#ErrorDocument 404 /missing.html", "-#ErrorDocument 404 \"/cgi-bin/missing_handler.pl\"", "-#ErrorDocument 402 http://www.example.com/subscription_info.html", "-#", "-", "-#", "-# EnableMMAP and EnableSendfile: On systems that support it, ", "-# memory-mapping or the sendfile syscall may be used to deliver", "-# files. This usually improves server performance, but must", "-# be turned off when serving from networked-mounted ", "-# filesystems or if support for these functions is otherwise", "-# broken on your system.", "-# Defaults if commented: EnableMMAP On, EnableSendfile Off", "-#", "-#EnableMMAP off", "-EnableSendfile on", "-", "-# Supplemental configuration", "-#", "-# Load config files in the \"/etc/httpd/conf.d\" directory, if any.", "-IncludeOptional conf.d/*.conf", "Info: /Stage[main]/Apache/File[/etc/httpd/conf/httpd.conf]: Filebucketed /etc/httpd/conf/httpd.conf to puppet with sum f5e7449c0f17bc856e86011cb5d152ba", "Notice: /Stage[main]/Apache/File[/etc/httpd/conf/httpd.conf]/content: content changed '{md5}f5e7449c0f17bc856e86011cb5d152ba' to '{md5}b3ed70a3a40f48d061c63f23fbbea111'", "Info: /Stage[main]/Apache/File[/etc/httpd/conf/httpd.conf]: Scheduling refresh of Class[Apache::Service]", "Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat[10-keystone_wsgi_admin.conf]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf]/ensure: created", "Info: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat[10-keystone_wsgi_admin.conf]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf]: Scheduling refresh of Exec[concat_10-keystone_wsgi_admin.conf]", "Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat[10-keystone_wsgi_admin.conf]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments]/ensure: created", "Info: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat[10-keystone_wsgi_admin.conf]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments]: Scheduling refresh of Exec[concat_10-keystone_wsgi_admin.conf]", "Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat::Fragment[keystone_wsgi_admin-wsgi]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments/260_keystone_wsgi_admin-wsgi]/ensure: defined content as '{md5}eab4d58b350697a7677844fd645581bf'", "Info: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat::Fragment[keystone_wsgi_admin-wsgi]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments/260_keystone_wsgi_admin-wsgi]: Scheduling refresh of Exec[concat_10-keystone_wsgi_admin.conf]", "Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat::Fragment[keystone_wsgi_admin-directories]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments/60_keystone_wsgi_admin-directories]/ensure: defined content as '{md5}cc81234a3bbf77f857ed3f11bb369e8c'", "Info: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat::Fragment[keystone_wsgi_admin-directories]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments/60_keystone_wsgi_admin-directories]: 
Scheduling refresh of Exec[concat_10-keystone_wsgi_admin.conf]", "Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat::Fragment[keystone_wsgi_admin-file_footer]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments/999_keystone_wsgi_admin-file_footer]/ensure: defined content as '{md5}e27b2525783e590ca1820f1e2118285d'", "Info: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat::Fragment[keystone_wsgi_admin-file_footer]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments/999_keystone_wsgi_admin-file_footer]: Scheduling refresh of Exec[concat_10-keystone_wsgi_admin.conf]", "Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat::Fragment[keystone_wsgi_admin-docroot]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments/10_keystone_wsgi_admin-docroot]/ensure: defined content as '{md5}e250ff3401328e2e106702576d684293'", "Info: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat::Fragment[keystone_wsgi_admin-docroot]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments/10_keystone_wsgi_admin-docroot]: Scheduling refresh of Exec[concat_10-keystone_wsgi_admin.conf]", "Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat::Fragment[keystone_wsgi_admin-serversignature]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments/90_keystone_wsgi_admin-serversignature]/ensure: defined content as '{md5}9bf5a458783ab459e5043e1cdf671fa7'", "Info: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat::Fragment[keystone_wsgi_admin-serversignature]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments/90_keystone_wsgi_admin-serversignature]: Scheduling refresh of Exec[concat_10-keystone_wsgi_admin.conf]", "Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat::Fragment[keystone_wsgi_admin-logging]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments/80_keystone_wsgi_admin-logging]/ensure: defined content as '{md5}6e95210e81b53fbd537c884ba77577a6'", "Info: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat::Fragment[keystone_wsgi_admin-logging]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments/80_keystone_wsgi_admin-logging]: Scheduling refresh of Exec[concat_10-keystone_wsgi_admin.conf]", "Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat::Fragment[keystone_wsgi_admin-ssl]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments/230_keystone_wsgi_admin-ssl]/ensure: defined content as '{md5}30fbced56cdd99b65558d366e970e5fd'", "Info: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat::Fragment[keystone_wsgi_admin-ssl]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments/230_keystone_wsgi_admin-ssl]: Scheduling refresh of Exec[concat_10-keystone_wsgi_admin.conf]", "Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat[10-keystone_wsgi_admin.conf]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments.concat]/ensure: created", "Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat::Fragment[keystone_wsgi_admin-access_log]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments/100_keystone_wsgi_admin-access_log]/ensure: defined content as '{md5}f3a5a390b72c0e5ada35efbd1ab9c568'", "Info: 
/Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat::Fragment[keystone_wsgi_admin-access_log]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments/100_keystone_wsgi_admin-access_log]: Scheduling refresh of Exec[concat_10-keystone_wsgi_admin.conf]", "Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat[10-keystone_wsgi_main.conf]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf]/ensure: created", "Info: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat[10-keystone_wsgi_main.conf]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf]: Scheduling refresh of Exec[concat_10-keystone_wsgi_main.conf]", "Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat[10-keystone_wsgi_main.conf]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments]/ensure: created", "Info: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat[10-keystone_wsgi_main.conf]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments]: Scheduling refresh of Exec[concat_10-keystone_wsgi_main.conf]", "Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat::Fragment[keystone_wsgi_main-file_footer]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments/999_keystone_wsgi_main-file_footer]/ensure: defined content as '{md5}e27b2525783e590ca1820f1e2118285d'", "Info: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat::Fragment[keystone_wsgi_main-file_footer]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments/999_keystone_wsgi_main-file_footer]: Scheduling refresh of Exec[concat_10-keystone_wsgi_main.conf]", "Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat[10-keystone_wsgi_main.conf]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments.concat]/ensure: created", "Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat::Fragment[keystone_wsgi_main-serversignature]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments/90_keystone_wsgi_main-serversignature]/ensure: defined content as '{md5}9bf5a458783ab459e5043e1cdf671fa7'", "Info: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat::Fragment[keystone_wsgi_main-serversignature]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments/90_keystone_wsgi_main-serversignature]: Scheduling refresh of Exec[concat_10-keystone_wsgi_main.conf]", "Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat[10-keystone_wsgi_main.conf]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments.concat.out]/ensure: created", "Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat::Fragment[keystone_wsgi_main-logging]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments/80_keystone_wsgi_main-logging]/ensure: defined content as '{md5}2e5c08362091258b73059cd0e5435e9a'", "Info: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat::Fragment[keystone_wsgi_main-logging]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments/80_keystone_wsgi_main-logging]: Scheduling refresh of Exec[concat_10-keystone_wsgi_main.conf]", "Notice: 
/Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat::Fragment[keystone_wsgi_main-access_log]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments/100_keystone_wsgi_main-access_log]/ensure: defined content as '{md5}f8509b8e1ef317dd58bbcca1480a9c61'", "Info: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat::Fragment[keystone_wsgi_main-access_log]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments/100_keystone_wsgi_main-access_log]: Scheduling refresh of Exec[concat_10-keystone_wsgi_main.conf]", "Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat::Fragment[keystone_wsgi_main-apache-header]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments/0_keystone_wsgi_main-apache-header]/ensure: defined content as '{md5}bcbedce152a9ba8190ab5a78ad4256f9'", "Info: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat::Fragment[keystone_wsgi_main-apache-header]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments/0_keystone_wsgi_main-apache-header]: Scheduling refresh of Exec[concat_10-keystone_wsgi_main.conf]", "Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat::Fragment[keystone_wsgi_main-wsgi]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments/260_keystone_wsgi_main-wsgi]/ensure: defined content as '{md5}0ed0f415940e9362ef9e1871efb2c050'", "Info: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat::Fragment[keystone_wsgi_main-wsgi]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments/260_keystone_wsgi_main-wsgi]: Scheduling refresh of Exec[concat_10-keystone_wsgi_main.conf]", "Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat::Fragment[keystone_wsgi_main-ssl]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments/230_keystone_wsgi_main-ssl]/ensure: defined content as '{md5}30fbced56cdd99b65558d366e970e5fd'", "Info: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat::Fragment[keystone_wsgi_main-ssl]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments/230_keystone_wsgi_main-ssl]: Scheduling refresh of Exec[concat_10-keystone_wsgi_main.conf]", "Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat::Fragment[keystone_wsgi_main-docroot]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments/10_keystone_wsgi_main-docroot]/ensure: defined content as '{md5}e250ff3401328e2e106702576d684293'", "Info: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat::Fragment[keystone_wsgi_main-docroot]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments/10_keystone_wsgi_main-docroot]: Scheduling refresh of Exec[concat_10-keystone_wsgi_main.conf]", "Notice: /Stage[main]/Keystone/Keystone_config[token/expiration]/ensure: created", "Info: /Stage[main]/Keystone/Keystone_config[token/expiration]: Scheduling refresh of Anchor[keystone::config::end]", "Notice: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat[10-nova_api_wsgi.conf]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf]/ensure: created", "Info: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat[10-nova_api_wsgi.conf]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf]: Scheduling refresh of Exec[concat_10-nova_api_wsgi.conf]", "Notice: 
/Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat[10-nova_api_wsgi.conf]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments]/ensure: created", "Info: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat[10-nova_api_wsgi.conf]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments]: Scheduling refresh of Exec[concat_10-nova_api_wsgi.conf]", "Notice: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat::Fragment[nova_api_wsgi-serversignature]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments/90_nova_api_wsgi-serversignature]/ensure: defined content as '{md5}9bf5a458783ab459e5043e1cdf671fa7'", "Info: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat::Fragment[nova_api_wsgi-serversignature]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments/90_nova_api_wsgi-serversignature]: Scheduling refresh of Exec[concat_10-nova_api_wsgi.conf]", "Notice: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat::Fragment[nova_api_wsgi-ssl]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments/230_nova_api_wsgi-ssl]/ensure: defined content as '{md5}6e6f07e9782e4535b25afa0e9dbd5964'", "Info: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat::Fragment[nova_api_wsgi-ssl]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments/230_nova_api_wsgi-ssl]: Scheduling refresh of Exec[concat_10-nova_api_wsgi.conf]", "Notice: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat::Fragment[nova_api_wsgi-docroot]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments/10_nova_api_wsgi-docroot]/ensure: defined content as '{md5}a24d3496cbab869d04b9f6400e91f05b'", "Info: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat::Fragment[nova_api_wsgi-docroot]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments/10_nova_api_wsgi-docroot]: Scheduling refresh of Exec[concat_10-nova_api_wsgi.conf]", "Notice: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat::Fragment[nova_api_wsgi-file_footer]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments/999_nova_api_wsgi-file_footer]/ensure: defined content as '{md5}e27b2525783e590ca1820f1e2118285d'", "Info: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat::Fragment[nova_api_wsgi-file_footer]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments/999_nova_api_wsgi-file_footer]: Scheduling refresh of Exec[concat_10-nova_api_wsgi.conf]", "Notice: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat[10-nova_api_wsgi.conf]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments.concat]/ensure: created", "Notice: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat::Fragment[nova_api_wsgi-apache-header]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments/0_nova_api_wsgi-apache-header]/ensure: defined content as '{md5}532286892f0965124c5d0f7a2d7ad2d2'", "Info: 
/Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat::Fragment[nova_api_wsgi-apache-header]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments/0_nova_api_wsgi-apache-header]: Scheduling refresh of Exec[concat_10-nova_api_wsgi.conf]", "Notice: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat[10-nova_api_wsgi.conf]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments.concat.out]/ensure: created", "Notice: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat::Fragment[nova_api_wsgi-logging]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments/80_nova_api_wsgi-logging]/ensure: defined content as '{md5}fffc2d2c643ad504aca6c347d7aec2d6'", "Info: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat::Fragment[nova_api_wsgi-logging]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments/80_nova_api_wsgi-logging]: Scheduling refresh of Exec[concat_10-nova_api_wsgi.conf]", "Notice: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat::Fragment[nova_api_wsgi-directories]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments/60_nova_api_wsgi-directories]/ensure: defined content as '{md5}969793e0f283be30a0641501324cd29c'", "Info: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat::Fragment[nova_api_wsgi-directories]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments/60_nova_api_wsgi-directories]: Scheduling refresh of Exec[concat_10-nova_api_wsgi.conf]", "Notice: /Stage[main]/Keystone/Keystone_config[eventlet_server/admin_workers]/ensure: created", "Info: /Stage[main]/Keystone/Keystone_config[eventlet_server/admin_workers]: Scheduling refresh of Anchor[keystone::config::end]", "Notice: /Stage[main]/Keystone::Db/Keystone_config[database/connection]/ensure: created", "Info: /Stage[main]/Keystone::Db/Keystone_config[database/connection]: Scheduling refresh of Anchor[keystone::config::end]", "Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat[10-keystone_wsgi_admin.conf]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments.concat.out]/ensure: created", "Notice: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat::Fragment[nova_api_wsgi-access_log]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments/100_nova_api_wsgi-access_log]/ensure: defined content as '{md5}3202d2662ed78e6f729646225603e1f5'", "Info: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat::Fragment[nova_api_wsgi-access_log]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments/100_nova_api_wsgi-access_log]: Scheduling refresh of Exec[concat_10-nova_api_wsgi.conf]", "Notice: /Stage[main]/Rabbitmq::Config/File[/etc/security/limits.d/rabbitmq-server.conf]/ensure: defined content as '{md5}5ddc6ba5fcaeddd5b1565e5adfda5236'", "Info: /Stage[main]/Rabbitmq::Config/File[/etc/security/limits.d/rabbitmq-server.conf]: Scheduling refresh of Class[Rabbitmq::Service]", "Notice: /Stage[main]/Apache/Apache::Vhost[default]/Concat::Fragment[default-access_log]/File[/var/lib/puppet/concat/15-default.conf/fragments/100_default-access_log]/ensure: defined content as 
'{md5}65fb033baac888b4ab85c295e870cb8f'", "Info: /Stage[main]/Apache/Apache::Vhost[default]/Concat::Fragment[default-access_log]/File[/var/lib/puppet/concat/15-default.conf/fragments/100_default-access_log]: Scheduling refresh of Exec[concat_15-default.conf]", "Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat::Fragment[keystone_wsgi_admin-apache-header]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments/0_keystone_wsgi_admin-apache-header]/ensure: defined content as '{md5}36e2769e5e22c8ff440262db545892f0'", "Info: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat::Fragment[keystone_wsgi_admin-apache-header]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments/0_keystone_wsgi_admin-apache-header]: Scheduling refresh of Exec[concat_10-keystone_wsgi_admin.conf]", "Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat[10-keystone_wsgi_admin.conf]/Exec[concat_10-keystone_wsgi_admin.conf]/returns: executed successfully", "Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat[10-keystone_wsgi_admin.conf]/Exec[concat_10-keystone_wsgi_admin.conf]: Triggered 'refresh' from 11 events", "Notice: /Stage[main]/Rabbitmq::Config/File[rabbitmq-env.config]/ensure: defined content as '{md5}d1faed99ee5f85f2e3ef458c2d19f3a8'", "Info: /Stage[main]/Rabbitmq::Config/File[rabbitmq-env.config]: Scheduling refresh of Class[Rabbitmq::Service]", "Info: Class[Rabbitmq::Config]: Scheduling refresh of Class[Rabbitmq::Service]", "Info: Class[Rabbitmq::Service]: Scheduling refresh of Service[rabbitmq-server]", "Notice: /Stage[main]/Rabbitmq::Service/Service[rabbitmq-server]/ensure: ensure changed 'stopped' to 'running'", "Info: /Stage[main]/Rabbitmq::Service/Service[rabbitmq-server]: Unscheduling refresh on Service[rabbitmq-server]", "Notice: /Stage[main]/Rabbitmq::Management/Rabbitmq_user[guest]/ensure: removed", "Notice: /Stage[main]/Rabbitmq::Install::Rabbitmqadmin/Staging::File[rabbitmqadmin]/Exec[/var/lib/rabbitmq/rabbitmqadmin]/returns: executed successfully", "Notice: /Stage[main]/Rabbitmq::Install::Rabbitmqadmin/File[/usr/local/bin/rabbitmqadmin]/ensure: defined content as '{md5}63d7331e825c865a97b7a8d1299841ff'", "Notice: /Stage[main]/Openstack_integration::Ironic/Rabbitmq_user[ironic]/ensure: created", "Notice: /Stage[main]/Openstack_integration::Neutron/Rabbitmq_user[neutron]/ensure: created", "Notice: /Stage[main]/Openstack_integration::Cinder/Rabbitmq_user[cinder]/ensure: created", "Notice: /Stage[main]/Openstack_integration::Nova/Rabbitmq_user[nova]/ensure: created", "Notice: /Stage[main]/Openstack_integration::Ironic/Rabbitmq_user_permissions[ironic@/]/ensure: created", "Notice: /Stage[main]/Ironic::Conductor/Service[ironic-conductor]/ensure: ensure changed 'stopped' to 'running'", "Info: /Stage[main]/Ironic::Conductor/Service[ironic-conductor]: Unscheduling refresh on Service[ironic-conductor]", "Notice: /Stage[main]/Ironic::Api/Service[ironic-api]: Triggered 'refresh' from 1 events", "Notice: /Stage[main]/Openstack_integration::Nova/Rabbitmq_user_permissions[nova@/]/ensure: created", "Notice: /Stage[main]/Openstack_integration::Cinder/Rabbitmq_user_permissions[cinder@/]/ensure: created", "Notice: /Stage[main]/Openstack_integration::Glance/Rabbitmq_user[glance]/ensure: created", "Notice: /Stage[main]/Openstack_integration::Glance/Rabbitmq_user_permissions[glance@/]/ensure: created", "Notice: 
/Stage[main]/Openstack_integration::Neutron/Rabbitmq_user_permissions[neutron@/]/ensure: created", "Notice: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/swift_store_large_object_size]/ensure: created", "Info: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/swift_store_large_object_size]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/swift_store_large_object_size]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Swift::Storage::Account/Swift::Storage::Generic[account]/Package[swift-account]/ensure: created", "Info: /Stage[main]/Swift::Storage::Account/Swift::Storage::Generic[account]/Package[swift-account]: Scheduling refresh of Service[swift-account-server]", "Info: /Stage[main]/Swift::Storage::Account/Swift::Storage::Generic[account]/Package[swift-account]: Scheduling refresh of Service[swift-account-replicator]", "Info: /Stage[main]/Swift::Storage::Account/Swift::Storage::Generic[account]/Package[swift-account]: Scheduling refresh of Anchor[keystone::service::end]", "Info: /Stage[main]/Swift::Storage::Account/Swift::Storage::Generic[account]/Package[swift-account]: Scheduling refresh of Service[swift-account-auditor]", "Info: /Stage[main]/Swift::Storage::Account/Swift::Storage::Generic[account]/Package[swift-account]: Scheduling refresh of Swift::Service[swift-account-server]", "Info: /Stage[main]/Swift::Storage::Account/Swift::Storage::Generic[account]/Package[swift-account]: Scheduling refresh of Swift::Service[swift-account-replicator]", "Info: /Stage[main]/Swift::Storage::Account/Swift::Storage::Generic[account]/Package[swift-account]: Scheduling refresh of Swift::Service[swift-account-auditor]", "Info: Swift::Service[swift-account-server]: Scheduling refresh of Service[swift-account-server]", "Notice: /Stage[main]/Swift::Storage::Account/Swift::Storage::Generic[account]/File[/etc/swift/account-server/]/owner: owner changed 'root' to 'swift'", "Notice: /Stage[main]/Swift::Storage::Account/Swift::Storage::Generic[account]/File[/etc/swift/account-server/]/group: group changed 'root' to 'swift'", "Info: Swift::Service[swift-account-replicator]: Scheduling refresh of Service[swift-account-replicator]", "Notice: /Stage[main]/Swift::Storage::Container/Swift::Storage::Generic[container]/Package[swift-container]/ensure: created", "Info: /Stage[main]/Swift::Storage::Container/Swift::Storage::Generic[container]/Package[swift-container]: Scheduling refresh of Service[swift-container-server]", "Info: /Stage[main]/Swift::Storage::Container/Swift::Storage::Generic[container]/Package[swift-container]: Scheduling refresh of Service[swift-container-replicator]", "Info: /Stage[main]/Swift::Storage::Container/Swift::Storage::Generic[container]/Package[swift-container]: Scheduling refresh of Anchor[keystone::service::end]", "Info: /Stage[main]/Swift::Storage::Container/Swift::Storage::Generic[container]/Package[swift-container]: Scheduling refresh of Service[swift-container-auditor]", "Info: /Stage[main]/Swift::Storage::Container/Swift::Storage::Generic[container]/Package[swift-container]: Scheduling refresh of Swift::Service[swift-container-server]", "Info: /Stage[main]/Swift::Storage::Container/Swift::Storage::Generic[container]/Package[swift-container]: Scheduling refresh of Swift::Service[swift-container-replicator]", "Info: /Stage[main]/Swift::Storage::Container/Swift::Storage::Generic[container]/Package[swift-container]: Scheduling refresh of 
Swift::Service[swift-container-auditor]", "Info: Swift::Service[swift-container-replicator]: Scheduling refresh of Service[swift-container-replicator]", "Notice: /Stage[main]/Swift::Storage::Container/Swift::Storage::Generic[container]/File[/etc/swift/container-server/]/owner: owner changed 'root' to 'swift'", "Notice: /Stage[main]/Swift::Storage::Container/Swift::Storage::Generic[container]/File[/etc/swift/container-server/]/group: group changed 'root' to 'swift'", "Info: Swift::Service[swift-container-auditor]: Scheduling refresh of Service[swift-container-auditor]", "Info: Swift::Service[swift-container-server]: Scheduling refresh of Service[swift-container-server]", "Notice: /Stage[main]/Swift::Storage::Object/Swift::Storage::Generic[object]/Package[swift-object]/ensure: created", "Info: /Stage[main]/Swift::Storage::Object/Swift::Storage::Generic[object]/Package[swift-object]: Scheduling refresh of Service[swift-object-server]", "Info: /Stage[main]/Swift::Storage::Object/Swift::Storage::Generic[object]/Package[swift-object]: Scheduling refresh of Service[swift-object-replicator]", "Info: /Stage[main]/Swift::Storage::Object/Swift::Storage::Generic[object]/Package[swift-object]: Scheduling refresh of Anchor[keystone::service::end]", "Info: /Stage[main]/Swift::Storage::Object/Swift::Storage::Generic[object]/Package[swift-object]: Scheduling refresh of Service[swift-object-auditor]", "Info: /Stage[main]/Swift::Storage::Object/Swift::Storage::Generic[object]/Package[swift-object]: Scheduling refresh of Swift::Service[swift-object-server]", "Info: /Stage[main]/Swift::Storage::Object/Swift::Storage::Generic[object]/Package[swift-object]: Scheduling refresh of Swift::Service[swift-object-replicator]", "Info: /Stage[main]/Swift::Storage::Object/Swift::Storage::Generic[object]/Package[swift-object]: Scheduling refresh of Swift::Service[swift-object-auditor]", "Notice: /Stage[main]/Swift::Storage::Object/Swift::Storage::Generic[object]/File[/etc/swift/object-server/]/owner: owner changed 'root' to 'swift'", "Notice: /Stage[main]/Swift::Storage::Object/Swift::Storage::Generic[object]/File[/etc/swift/object-server/]/group: group changed 'root' to 'swift'", "Info: Swift::Service[swift-object-server]: Scheduling refresh of Service[swift-object-server]", "Info: Swift::Service[swift-object-replicator]: Scheduling refresh of Service[swift-object-replicator]", "Info: Swift::Service[swift-object-auditor]: Scheduling refresh of Service[swift-object-auditor]", "Info: Swift::Service[swift-account-auditor]: Scheduling refresh of Service[swift-account-auditor]", "Notice: /Stage[main]/Swift::Proxy/Package[swift-proxy]/ensure: created", "Info: /Stage[main]/Swift::Proxy/Package[swift-proxy]: Scheduling refresh of Anchor[keystone::service::end]", "Notice: /Stage[main]/Swift::Proxy/Concat[/etc/swift/proxy-server.conf]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf]/ensure: created", "Info: /Stage[main]/Swift::Proxy/Concat[/etc/swift/proxy-server.conf]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf]: Scheduling refresh of Exec[concat_/etc/swift/proxy-server.conf]", "Notice: /Stage[main]/Swift::Proxy/Concat[/etc/swift/proxy-server.conf]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments.concat]/ensure: created", "Notice: /Stage[main]/Swift::Proxy/Concat[/etc/swift/proxy-server.conf]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments.concat.out]/ensure: created", "Notice: 
/Stage[main]/Swift::Proxy/Concat[/etc/swift/proxy-server.conf]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments]/ensure: created", "Info: /Stage[main]/Swift::Proxy/Concat[/etc/swift/proxy-server.conf]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments]: Scheduling refresh of Exec[concat_/etc/swift/proxy-server.conf]", "Notice: /Stage[main]/Swift::Proxy/Concat::Fragment[swift_proxy]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/00_swift_proxy]/ensure: defined content as '{md5}3e7368112b701526ac018208596b6f2d'", "Info: /Stage[main]/Swift::Proxy/Concat::Fragment[swift_proxy]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/00_swift_proxy]: Scheduling refresh of Exec[concat_/etc/swift/proxy-server.conf]", "Notice: /Stage[main]/Swift::Proxy::Tempauth/Concat::Fragment[swift-proxy-swauth]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/01_swift-proxy-swauth]/ensure: defined content as '{md5}77ae9d1ddf6d75e07b795e520797adb4'", "Info: /Stage[main]/Swift::Proxy::Tempauth/Concat::Fragment[swift-proxy-swauth]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/01_swift-proxy-swauth]: Scheduling refresh of Exec[concat_/etc/swift/proxy-server.conf]", "Notice: /Stage[main]/Swift::Proxy::Container_quotas/Concat::Fragment[swift_container_quotas]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/81_swift_container_quotas]/ensure: defined content as '{md5}9cb7c3e198ec9152a4e1f80eb6448f6a'", "Info: /Stage[main]/Swift::Proxy::Container_quotas/Concat::Fragment[swift_container_quotas]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/81_swift_container_quotas]: Scheduling refresh of Exec[concat_/etc/swift/proxy-server.conf]", "Notice: /Stage[main]/Swift::Proxy::Catch_errors/Concat::Fragment[swift_catch_errors]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/24_swift_catch_errors]/ensure: defined content as '{md5}e07f0e5b125db7d6c8b4724c1648bcd5'", "Info: /Stage[main]/Swift::Proxy::Catch_errors/Concat::Fragment[swift_catch_errors]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/24_swift_catch_errors]: Scheduling refresh of Exec[concat_/etc/swift/proxy-server.conf]", "Notice: /Stage[main]/Swift::Proxy::Healthcheck/Concat::Fragment[swift_healthcheck]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/25_swift_healthcheck]/ensure: defined content as '{md5}8c92056c41082619d179f88ea15c5fc6'", "Info: /Stage[main]/Swift::Proxy::Healthcheck/Concat::Fragment[swift_healthcheck]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/25_swift_healthcheck]: Scheduling refresh of Exec[concat_/etc/swift/proxy-server.conf]", "Notice: /Stage[main]/Swift::Proxy::Account_quotas/Concat::Fragment[swift_account_quotas]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/80_swift_account_quotas]/ensure: defined content as '{md5}c1ff253d7976e5b952647085fb3cefe3'", "Info: /Stage[main]/Swift::Proxy::Account_quotas/Concat::Fragment[swift_account_quotas]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/80_swift_account_quotas]: Scheduling refresh of Exec[concat_/etc/swift/proxy-server.conf]", "Notice: /Stage[main]/Swift::Proxy::Cache/Concat::Fragment[swift_cache]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/23_swift_cache]/ensure: defined content as '{md5}cf82123513431b136e71a4503aeb82d9'", "Info: 
/Stage[main]/Swift::Proxy::Cache/Concat::Fragment[swift_cache]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/23_swift_cache]: Scheduling refresh of Exec[concat_/etc/swift/proxy-server.conf]", "Notice: /Stage[main]/Swift::Proxy::Tempurl/Concat::Fragment[swift-proxy-tempurl]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/29_swift-proxy-tempurl]/ensure: defined content as '{md5}2fe004eae9f03fc684f9ed90044bd9c5'", "Info: /Stage[main]/Swift::Proxy::Tempurl/Concat::Fragment[swift-proxy-tempurl]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/29_swift-proxy-tempurl]: Scheduling refresh of Exec[concat_/etc/swift/proxy-server.conf]", "Notice: /Stage[main]/Swift::Proxy::Formpost/Concat::Fragment[swift-proxy-formpost]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/31_swift-proxy-formpost]/ensure: defined content as '{md5}6907293ed6375b05de487bb7e0556ddd'", "Info: /Stage[main]/Swift::Proxy::Formpost/Concat::Fragment[swift-proxy-formpost]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/31_swift-proxy-formpost]: Scheduling refresh of Exec[concat_/etc/swift/proxy-server.conf]", "Notice: /Stage[main]/Swift::Proxy::Keystone/Concat::Fragment[swift_keystone]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/79_swift_keystone]/ensure: defined content as '{md5}1cf1118a35e6b76ab6ee194eb0722f53'", "Info: /Stage[main]/Swift::Proxy::Keystone/Concat::Fragment[swift_keystone]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/79_swift_keystone]: Scheduling refresh of Exec[concat_/etc/swift/proxy-server.conf]", "Notice: /Stage[main]/Swift::Proxy::Ratelimit/Concat::Fragment[swift_ratelimit]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/26_swift_ratelimit]/ensure: defined content as '{md5}2421e61cdf9eb2689fd5f1cc3740eb08'", "Info: /Stage[main]/Swift::Proxy::Ratelimit/Concat::Fragment[swift_ratelimit]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/26_swift_ratelimit]: Scheduling refresh of Exec[concat_/etc/swift/proxy-server.conf]", "Notice: /Stage[main]/Swift::Proxy::Staticweb/Concat::Fragment[swift-proxy-staticweb]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/32_swift-proxy-staticweb]/ensure: defined content as '{md5}3e8e5d2820dc79360e8f1e07541ef8dc'", "Info: /Stage[main]/Swift::Proxy::Staticweb/Concat::Fragment[swift-proxy-staticweb]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/32_swift-proxy-staticweb]: Scheduling refresh of Exec[concat_/etc/swift/proxy-server.conf]", "Notice: /Stage[main]/Swift::Proxy::Authtoken/File[/var/cache/swift]/group: group changed 'root' to 'swift'", "Notice: /Stage[main]/Swift::Proxy::Authtoken/File[/var/cache/swift]/mode: mode changed '0755' to '0700'", "Notice: /Stage[main]/Swift::Proxy::Authtoken/Concat::Fragment[swift_authtoken]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/22_swift_authtoken]/ensure: defined content as '{md5}f056388ce12c47fdd707acf18f5a14db'", "Info: /Stage[main]/Swift::Proxy::Authtoken/Concat::Fragment[swift_authtoken]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/22_swift_authtoken]: Scheduling refresh of Exec[concat_/etc/swift/proxy-server.conf]", "Notice: /Stage[main]/Swift::Proxy::Proxy_logging/Concat::Fragment[swift_proxy-logging]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/27_swift_proxy-logging]/ensure: defined content as '{md5}a7f5751de4957dadfee13dc6e6c83c1a'", "Info: 
/Stage[main]/Swift::Proxy::Proxy_logging/Concat::Fragment[swift_proxy-logging]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/27_swift_proxy-logging]: Scheduling refresh of Exec[concat_/etc/swift/proxy-server.conf]", "Notice: /Stage[main]/Swift::Proxy/Concat[/etc/swift/proxy-server.conf]/Exec[concat_/etc/swift/proxy-server.conf]/returns: executed successfully", "Notice: /Stage[main]/Swift::Proxy/Concat[/etc/swift/proxy-server.conf]/Exec[concat_/etc/swift/proxy-server.conf]: Triggered 'refresh' from 16 events", "Notice: /Stage[main]/Swift::Proxy/Concat[/etc/swift/proxy-server.conf]/File[/etc/swift/proxy-server.conf]/content: ", "--- /etc/swift/proxy-server.conf\t2016-05-07 16:57:43.000000000 +0100", "+++ /tmp/puppet-file20160520-26469-1jhbxzd\t2016-05-20 12:30:10.292337276 +0100", "@@ -1,16 +1,57 @@", "+# This file is managed by puppet. Do not edit", "+#", " [DEFAULT]", " bind_port = 8080", "-workers = 8", "+", "+bind_ip = 127.0.0.1", "+", "+workers = 2", " user = swift", "+log_name = proxy-server", "+log_facility = LOG_LOCAL1", "+log_level = INFO", "+log_headers = False", "+log_address = /dev/log", "+", "+", " ", " [pipeline:main]", "-pipeline = healthcheck cache authtoken keystone proxy-logging proxy-server", "+pipeline = catch_errors healthcheck cache tempurl ratelimit authtoken keystone formpost staticweb container_quotas account_quotas proxy-logging proxy-server", " ", " [app:proxy-server]", " use = egg:swift#proxy", "+set log_name = proxy-server", "+set log_facility = LOG_LOCAL1", "+set log_level = INFO", "+set log_address = /dev/log", "+log_handoffs = true", " allow_account_management = true", " account_autocreate = true", " ", "+", "+", "+", "+", "+[filter:tempauth]", "+use = egg:swift#tempauth", "+", "+user_admin_admin = admin .admin .reseller_admin", "+", "+", "+[filter:authtoken]", "+log_name = swift", "+signing_dir = /var/cache/swift", "+paste.filter_factory = keystonemiddleware.auth_token:filter_factory", "+", "+auth_uri = https://127.0.0.1:5000/v2.0", "+identity_uri = https://127.0.0.1:35357/", "+admin_tenant_name = services", "+admin_user = swift", "+admin_password = a_big_secret", "+delay_auth_decision = 1", "+cache = swift.cache", "+include_service_catalog = False", "+", " [filter:cache]", " use = egg:swift#memcache", " memcache_servers = 127.0.0.1:11211", "@@ -21,21 +62,34 @@", " [filter:healthcheck]", " use = egg:swift#healthcheck", " ", "+[filter:ratelimit]", "+use = egg:swift#ratelimit", "+clock_accuracy = 1000", "+max_sleep_time_seconds = 60", "+log_sleep_time_seconds = 0", "+rate_buffer_seconds = 5", "+account_ratelimit = 0", "+", " [filter:proxy-logging]", " use = egg:swift#proxy_logging", " ", "+[filter:tempurl]", "+use = egg:swift#tempurl", "+", "+[filter:formpost]", "+use = egg:swift#formpost", "+", "+[filter:staticweb]", "+use = egg:swift#staticweb", "+", " [filter:keystone]", " use = egg:swift#keystoneauth", "-operator_roles = admin, SwiftOperator", "+operator_roles = Member, admin, SwiftOperator", " is_admin = true", "-cache = swift.cache", "+reseller_prefix = AUTH_", " ", "-[filter:authtoken]", "-paste.filter_factory = keystonemiddleware.auth_token:filter_factory", "-admin_tenant_name = %SERVICE_TENANT_NAME%", "-admin_user = %SERVICE_USER%", "-admin_password = %SERVICE_PASSWORD%", "-auth_host = 127.0.0.1", "-auth_port = 35357", "-auth_protocol = http", "-signing_dir = /tmp/keystone-signing-swift", "+[filter:account_quotas]", "+use = egg:swift#account_quotas", "+", "+[filter:container_quotas]", "+use = egg:swift#container_quotas", "Info: 
/Stage[main]/Swift::Proxy/Concat[/etc/swift/proxy-server.conf]/File[/etc/swift/proxy-server.conf]: Filebucketed /etc/swift/proxy-server.conf to puppet with sum cd347a2631d48647d000f5d34985704c", "Notice: /Stage[main]/Swift::Proxy/Concat[/etc/swift/proxy-server.conf]/File[/etc/swift/proxy-server.conf]/content: content changed '{md5}cd347a2631d48647d000f5d34985704c' to '{md5}d6844dcb64e004f7b06f1e9ac75a5a56'", "Notice: /Stage[main]/Swift::Proxy/Concat[/etc/swift/proxy-server.conf]/File[/etc/swift/proxy-server.conf]/owner: owner changed 'root' to 'swift'", "Notice: /Stage[main]/Swift::Proxy/Concat[/etc/swift/proxy-server.conf]/File[/etc/swift/proxy-server.conf]/mode: mode changed '0640' to '0644'", "Info: Concat[/etc/swift/proxy-server.conf]: Scheduling refresh of Swift::Service[swift-proxy-server]", "Info: Concat[/etc/swift/proxy-server.conf]: Scheduling refresh of Service[swift-proxy-server]", "Info: Swift::Service[swift-proxy-server]: Scheduling refresh of Service[swift-proxy-server]", "Notice: /Stage[main]/Swift::Proxy/Swift::Service[swift-proxy-server]/Service[swift-proxy-server]/ensure: ensure changed 'stopped' to 'running'", "Info: /Stage[main]/Swift::Proxy/Swift::Service[swift-proxy-server]/Service[swift-proxy-server]: Unscheduling refresh on Service[swift-proxy-server]", "Notice: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_userid]/ensure: created", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_userid]: Scheduling refresh of Service[neutron-server]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_userid]: Scheduling refresh of Exec[neutron-db-sync]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_userid]: Scheduling refresh of Service[neutron-metadata]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_userid]: Scheduling refresh of Service[neutron-lbaas-service]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_userid]: Scheduling refresh of Service[neutron-l3]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_userid]: Scheduling refresh of Service[neutron-dhcp-service]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_userid]: Scheduling refresh of Service[neutron-metering-service]", "Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat[/etc/swift/account-server.conf]/Exec[concat_/etc/swift/account-server.conf]/returns: executed successfully", "Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat[/etc/swift/account-server.conf]/Exec[concat_/etc/swift/account-server.conf]: Triggered 'refresh' from 5 events", "Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat[/etc/swift/account-server.conf]/File[/etc/swift/account-server.conf]/content: ", "--- /etc/swift/account-server.conf\t2016-05-07 16:57:43.000000000 +0100", "+++ /tmp/puppet-file20160520-26469-ic4cnw\t2016-05-20 12:30:10.638324149 +0100", "@@ -1,21 +1,39 @@", " [DEFAULT]", "-", "-# Make sure your swift-ring-builder arguments match the bind_ip and bind_port.", "-# You almost certainly do not want to listen just on loopback unless testing.", "-# However, you want to keep port 6202 if SElinux is enabled.", "+devices = /srv/node", " bind_ip = 127.0.0.1", "-bind_port = 6202", "+bind_port = 6002", "+mount_check = false", "+user = swift", "+workers = 1", "+log_name = account-server", "+log_facility = LOG_LOCAL2", "+log_level = INFO", "+log_address = /dev/log", "+", " ", 
"-workers = 2", " ", " [pipeline:main]", " pipeline = account-server", " ", " [app:account-server]", " use = egg:swift#account", "+set log_name = account-server", "+set log_facility = LOG_LOCAL2", "+set log_level = INFO", "+set log_requests = true", "+set log_address = /dev/log", " ", " [account-replicator]", "+concurrency = 8", " ", " [account-auditor]", " ", " [account-reaper]", "+concurrency = 8", "+", "+[filter:healthcheck]", "+use = egg:swift#healthcheck", "+", "+[filter:recon]", "+use = egg:swift#recon", "+recon_cache_path = /var/cache/swift", "Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat[/etc/swift/account-server.conf]/File[/etc/swift/account-server.conf]: Filebucketed /etc/swift/account-server.conf to puppet with sum 07e5a1a1e5a0ab83d745e20680eb32c1", "Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat[/etc/swift/account-server.conf]/File[/etc/swift/account-server.conf]/content: content changed '{md5}07e5a1a1e5a0ab83d745e20680eb32c1' to '{md5}b09bb7b7833b29c19014f8963d0e6884'", "Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat[/etc/swift/account-server.conf]/File[/etc/swift/account-server.conf]/owner: owner changed 'root' to 'swift'", "Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat[/etc/swift/account-server.conf]/File[/etc/swift/account-server.conf]/mode: mode changed '0640' to '0644'", "Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat[/etc/swift/account-server.conf]/File[/etc/swift/account-server.conf]: Scheduling refresh of Service[swift-account-reaper]", "Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat[/etc/swift/account-server.conf]/File[/etc/swift/account-server.conf]: Scheduling refresh of Swift::Service[swift-account-reaper]", "Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat[/etc/swift/account-server.conf]/File[/etc/swift/account-server.conf]: Scheduling refresh of Service[swift-account-reaper]", "Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat[/etc/swift/account-server.conf]/File[/etc/swift/account-server.conf]: Scheduling refresh of Swift::Service[swift-account-reaper]", "Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat[/etc/swift/account-server.conf]/File[/etc/swift/account-server.conf]: Scheduling refresh of Service[swift-account-reaper]", "Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat[/etc/swift/account-server.conf]/File[/etc/swift/account-server.conf]: Scheduling refresh of Swift::Service[swift-account-reaper]", "Info: Swift::Service[swift-account-reaper]: Scheduling refresh of Service[swift-account-reaper]", "Notice: /Stage[main]/Swift::Storage::Account/Swift::Service[swift-account-reaper]/Service[swift-account-reaper]/ensure: ensure changed 'stopped' to 'running'", "Info: /Stage[main]/Swift::Storage::Account/Swift::Service[swift-account-reaper]/Service[swift-account-reaper]: Unscheduling refresh on Service[swift-account-reaper]", "Info: Concat[/etc/swift/account-server.conf]: Scheduling refresh of Service[swift-account-server]", "Info: Concat[/etc/swift/account-server.conf]: Scheduling refresh of Service[swift-account-replicator]", "Info: Concat[/etc/swift/account-server.conf]: Scheduling refresh of Service[swift-account-auditor]", "Notice: /Stage[main]/Swift::Storage::Account/Swift::Storage::Generic[account]/Swift::Service[swift-account-replicator]/Service[swift-account-replicator]/ensure: 
ensure changed 'stopped' to 'running'", "Info: /Stage[main]/Swift::Storage::Account/Swift::Storage::Generic[account]/Swift::Service[swift-account-replicator]/Service[swift-account-replicator]: Unscheduling refresh on Service[swift-account-replicator]", "Notice: /Stage[main]/Swift::Storage::Account/Swift::Storage::Generic[account]/Swift::Service[swift-account-auditor]/Service[swift-account-auditor]/ensure: ensure changed 'stopped' to 'running'", "Info: /Stage[main]/Swift::Storage::Account/Swift::Storage::Generic[account]/Swift::Service[swift-account-auditor]/Service[swift-account-auditor]: Unscheduling refresh on Service[swift-account-auditor]", "Notice: /Stage[main]/Swift::Storage::Account/Swift::Storage::Generic[account]/Swift::Service[swift-account-server]/Service[swift-account-server]/ensure: ensure changed 'stopped' to 'running'", "Info: /Stage[main]/Swift::Storage::Account/Swift::Storage::Generic[account]/Swift::Service[swift-account-server]/Service[swift-account-server]: Unscheduling refresh on Service[swift-account-server]", "Notice: /Stage[main]/Apache/Apache::Vhost[default]/Concat::Fragment[default-logging]/File[/var/lib/puppet/concat/15-default.conf/fragments/80_default-logging]/ensure: defined content as '{md5}f202203ce2fe5d885160be988ff36151'", "Info: /Stage[main]/Apache/Apache::Vhost[default]/Concat::Fragment[default-logging]/File[/var/lib/puppet/concat/15-default.conf/fragments/80_default-logging]: Scheduling refresh of Exec[concat_15-default.conf]", "Notice: /Stage[main]/Apache/Apache::Vhost[default]/Concat[15-default.conf]/Exec[concat_15-default.conf]/returns: executed successfully", "Notice: /Stage[main]/Apache/Apache::Vhost[default]/Concat[15-default.conf]/Exec[concat_15-default.conf]: Triggered 'refresh' from 10 events", "Notice: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat::Fragment[nova_api_wsgi-wsgi]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments/260_nova_api_wsgi-wsgi]/ensure: defined content as '{md5}d8fcfbd8a3ec337955722d8a7c10844a'", "Info: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat::Fragment[nova_api_wsgi-wsgi]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments/260_nova_api_wsgi-wsgi]: Scheduling refresh of Exec[concat_10-nova_api_wsgi.conf]", "Notice: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat[10-nova_api_wsgi.conf]/Exec[concat_10-nova_api_wsgi.conf]/returns: executed successfully", "Notice: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat[10-nova_api_wsgi.conf]/Exec[concat_10-nova_api_wsgi.conf]: Triggered 'refresh' from 11 events", "Notice: /Stage[main]/Openstack_integration::Swift/Swift::Storage::Filter::Healthcheck[container]/Concat::Fragment[swift_healthcheck_container]/File[/var/lib/puppet/concat/_etc_swift_container-server.conf/fragments/25_swift_healthcheck_container]/ensure: defined content as '{md5}8c92056c41082619d179f88ea15c5fc6'", "Info: /Stage[main]/Openstack_integration::Swift/Swift::Storage::Filter::Healthcheck[container]/Concat::Fragment[swift_healthcheck_container]/File[/var/lib/puppet/concat/_etc_swift_container-server.conf/fragments/25_swift_healthcheck_container]: Scheduling refresh of Exec[concat_/etc/swift/container-server.conf]", "Notice: 
/Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat[/etc/swift/container-server.conf]/Exec[concat_/etc/swift/container-server.conf]/returns: executed successfully", "Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat[/etc/swift/container-server.conf]/Exec[concat_/etc/swift/container-server.conf]: Triggered 'refresh' from 5 events", "Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat[/etc/swift/container-server.conf]/File[/etc/swift/container-server.conf]/content: ", "--- /etc/swift/container-server.conf\t2016-05-07 16:57:43.000000000 +0100", "+++ /tmp/puppet-file20160520-26469-1khud84\t2016-05-20 12:30:11.635286324 +0100", "@@ -1,23 +1,43 @@", " [DEFAULT]", "-", "-# Make sure your swift-ring-builder arguments match the bind_ip and bind_port.", "-# You almost certainly do not want to listen just on loopback unless testing.", "-# However, you want to keep port 6201 if SElinux is enabled.", "+devices = /srv/node", " bind_ip = 127.0.0.1", "-bind_port = 6201", "+bind_port = 6001", "+mount_check = false", "+user = swift", "+log_name = container-server", "+log_facility = LOG_LOCAL2", "+log_level = INFO", "+log_address = /dev/log", "+", " ", "-workers = 2", "+workers = 1", "+allowed_sync_hosts = 127.0.0.1", " ", " [pipeline:main]", " pipeline = container-server", " ", " [app:container-server]", "+allow_versions = false", " use = egg:swift#container", "+set log_name = container-server", "+set log_facility = LOG_LOCAL2", "+set log_level = INFO", "+set log_requests = true", "+set log_address = /dev/log", " ", " [container-replicator]", "+concurrency = 8", " ", " [container-updater]", "+concurrency = 8", " ", " [container-auditor]", " ", " [container-sync]", "+", "+[filter:healthcheck]", "+use = egg:swift#healthcheck", "+", "+[filter:recon]", "+use = egg:swift#recon", "+recon_cache_path = /var/cache/swift", "Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat[/etc/swift/container-server.conf]/File[/etc/swift/container-server.conf]: Filebucketed /etc/swift/container-server.conf to puppet with sum 4998257eb89ff63e838b37686ebb1ee7", "Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat[/etc/swift/container-server.conf]/File[/etc/swift/container-server.conf]/content: content changed '{md5}4998257eb89ff63e838b37686ebb1ee7' to '{md5}21c2517e90b3e9698ae546bfbf8e210f'", "Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat[/etc/swift/container-server.conf]/File[/etc/swift/container-server.conf]/owner: owner changed 'root' to 'swift'", "Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat[/etc/swift/container-server.conf]/File[/etc/swift/container-server.conf]/mode: mode changed '0640' to '0644'", "Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat[/etc/swift/container-server.conf]/File[/etc/swift/container-server.conf]: Scheduling refresh of Service[swift-container-updater]", "Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat[/etc/swift/container-server.conf]/File[/etc/swift/container-server.conf]: Scheduling refresh of Swift::Service[swift-container-updater]", "Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat[/etc/swift/container-server.conf]/File[/etc/swift/container-server.conf]: Scheduling refresh of Service[swift-container-updater]", "Info: 
/Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat[/etc/swift/container-server.conf]/File[/etc/swift/container-server.conf]: Scheduling refresh of Swift::Service[swift-container-updater]", "Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat[/etc/swift/container-server.conf]/File[/etc/swift/container-server.conf]: Scheduling refresh of Service[swift-container-updater]", "Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat[/etc/swift/container-server.conf]/File[/etc/swift/container-server.conf]: Scheduling refresh of Swift::Service[swift-container-updater]", "Info: Concat[/etc/swift/container-server.conf]: Scheduling refresh of Service[swift-container-server]", "Info: Concat[/etc/swift/container-server.conf]: Scheduling refresh of Service[swift-container-replicator]", "Info: Concat[/etc/swift/container-server.conf]: Scheduling refresh of Service[swift-container-auditor]", "Notice: /Stage[main]/Swift::Storage::Container/Swift::Storage::Generic[container]/Swift::Service[swift-container-replicator]/Service[swift-container-replicator]/ensure: ensure changed 'stopped' to 'running'", "Info: /Stage[main]/Swift::Storage::Container/Swift::Storage::Generic[container]/Swift::Service[swift-container-replicator]/Service[swift-container-replicator]: Unscheduling refresh on Service[swift-container-replicator]", "Notice: /Stage[main]/Swift::Storage::Container/Swift::Storage::Generic[container]/Swift::Service[swift-container-auditor]/Service[swift-container-auditor]/ensure: ensure changed 'stopped' to 'running'", "Info: /Stage[main]/Swift::Storage::Container/Swift::Storage::Generic[container]/Swift::Service[swift-container-auditor]/Service[swift-container-auditor]: Unscheduling refresh on Service[swift-container-auditor]", "Notice: /Stage[main]/Swift::Storage::Container/Swift::Storage::Generic[container]/Swift::Service[swift-container-server]/Service[swift-container-server]/ensure: ensure changed 'stopped' to 'running'", "Info: /Stage[main]/Swift::Storage::Container/Swift::Storage::Generic[container]/Swift::Service[swift-container-server]/Service[swift-container-server]: Unscheduling refresh on Service[swift-container-server]", "Info: Swift::Service[swift-container-updater]: Scheduling refresh of Service[swift-container-updater]", "Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Rsync::Server::Module[account]/Concat::Fragment[frag-account]/File[/var/lib/puppet/concat/_etc_rsync.conf/fragments/10_account_frag-account]/ensure: defined content as '{md5}c1253249b9f960b4c5ab27bffc4c0382'", "Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Rsync::Server::Module[account]/Concat::Fragment[frag-account]/File[/var/lib/puppet/concat/_etc_rsync.conf/fragments/10_account_frag-account]: Scheduling refresh of Exec[concat_/etc/rsync.conf]", "Notice: /Stage[main]/Rsync::Server/Concat[/etc/rsync.conf]/Exec[concat_/etc/rsync.conf]/returns: executed successfully", "Notice: /Stage[main]/Rsync::Server/Concat[/etc/rsync.conf]/Exec[concat_/etc/rsync.conf]: Triggered 'refresh' from 6 events", "Notice: /Stage[main]/Rsync::Server/Concat[/etc/rsync.conf]/File[/etc/rsync.conf]/ensure: defined content as '{md5}4b60030f2dab5c450c9d32e3fa3705c2'", "Notice: /Stage[main]/Nova::Api/Nova::Generic_service[api]/Package[nova-api]/ensure: created", "Info: /Stage[main]/Nova::Api/Nova::Generic_service[api]/Package[nova-api]: Scheduling refresh of Anchor[keystone::service::end]", "Info: 
/Stage[main]/Nova::Api/Nova::Generic_service[api]/Package[nova-api]: Scheduling refresh of Anchor[nova::install::end]", "Notice: /Stage[main]/Nova::Deps/Anchor[nova::install::end]: Triggered 'refresh' from 7 events", "Info: /Stage[main]/Nova::Deps/Anchor[nova::install::end]: Scheduling refresh of Anchor[nova::service::begin]", "Info: /Stage[main]/Nova::Deps/Anchor[nova::install::end]: Scheduling refresh of Exec[nova-db-sync]", "Info: /Stage[main]/Nova::Deps/Anchor[nova::install::end]: Scheduling refresh of Exec[nova-db-sync-api]", "Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/project_name]/ensure: created", "Info: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/project_name]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Migration::Libvirt/File_line[/etc/libvirt/libvirtd.conf auth_tcp]/ensure: created", "Info: /Stage[main]/Nova::Migration::Libvirt/File_line[/etc/libvirt/libvirtd.conf auth_tcp]: Scheduling refresh of Service[libvirt]", "Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/force_raw_images]/ensure: created", "Info: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/force_raw_images]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Logging/Nova_config[DEFAULT/debug]/ensure: created", "Info: /Stage[main]/Nova::Logging/Nova_config[DEFAULT/debug]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Api/Nova_config[neutron/service_metadata_proxy]/ensure: created", "Info: /Stage[main]/Nova::Api/Nova_config[neutron/service_metadata_proxy]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/use_neutron]/ensure: created", "Info: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/use_neutron]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Logging/Nova_config[DEFAULT/log_dir]/ensure: created", "Info: /Stage[main]/Nova::Logging/Nova_config[DEFAULT/log_dir]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/extension_sync_interval]/ensure: created", "Info: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/extension_sync_interval]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/password]/ensure: created", "Info: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/password]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/scheduler_use_baremetal_filters]/ensure: created", "Info: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/scheduler_use_baremetal_filters]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova/Nova_config[DEFAULT/notify_api_faults]/ensure: created", "Info: /Stage[main]/Nova/Nova_config[DEFAULT/notify_api_faults]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Db/Nova_config[api_database/connection]/ensure: created", "Info: /Stage[main]/Nova::Db/Nova_config[api_database/connection]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Api/Nova_config[DEFAULT/osapi_compute_listen]/ensure: created", "Info: /Stage[main]/Nova::Api/Nova_config[DEFAULT/osapi_compute_listen]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/rabbit_userid]/ensure: created", "Info: 
/Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/rabbit_userid]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Conductor/Nova_config[conductor/use_local]/ensure: created", "Info: /Stage[main]/Nova::Conductor/Nova_config[conductor/use_local]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Api/Nova_config[osapi_v3/enabled]/ensure: created", "Info: /Stage[main]/Nova::Api/Nova_config[osapi_v3/enabled]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova/Nova_config[DEFAULT/notification_driver]/ensure: created", "Info: /Stage[main]/Nova/Nova_config[DEFAULT/notification_driver]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/max_io_ops_per_host]/ensure: created", "Info: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/max_io_ops_per_host]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Api/Nova_config[keystone_authtoken/admin_password]/ensure: created", "Info: /Stage[main]/Nova::Api/Nova_config[keystone_authtoken/admin_password]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/compute_manager]/ensure: created", "Info: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/compute_manager]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/rabbit_password]/ensure: created", "Info: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/rabbit_password]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Api/Nova_config[keystone_authtoken/admin_user]/ensure: created", "Info: /Stage[main]/Nova::Api/Nova_config[keystone_authtoken/admin_user]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_partition]/ensure: created", "Info: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_partition]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Vncproxy/Nova_config[DEFAULT/novncproxy_host]/ensure: created", "Info: /Stage[main]/Nova::Vncproxy/Nova_config[DEFAULT/novncproxy_host]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova/Nova_config[DEFAULT/state_path]/ensure: created", "Info: /Stage[main]/Nova/Nova_config[DEFAULT/state_path]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_password]/ensure: created", "Info: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_password]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova/Nova_config[DEFAULT/report_interval]/ensure: created", "Info: /Stage[main]/Nova/Nova_config[DEFAULT/report_interval]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Logging/Nova_config[DEFAULT/verbose]/ensure: created", "Info: /Stage[main]/Nova::Logging/Nova_config[DEFAULT/verbose]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/scheduler_weight_classes]/ensure: created", "Info: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/scheduler_weight_classes]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/rabbit_ha_queues]/ensure: created", "Info: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/rabbit_ha_queues]: Scheduling refresh of 
Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/auth_plugin]/ensure: created", "Info: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/auth_plugin]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/security_group_api]/ensure: created", "Info: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/security_group_api]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Api/Nova_config[DEFAULT/enabled_apis]/ensure: created", "Info: /Stage[main]/Nova::Api/Nova_config[DEFAULT/enabled_apis]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/auth_url]/ensure: created", "Info: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/auth_url]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/user_domain_name]/ensure: created", "Info: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/user_domain_name]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Api/Nova_config[DEFAULT/osapi_compute_listen_port]/ensure: created", "Info: /Stage[main]/Nova::Api/Nova_config[DEFAULT/osapi_compute_listen_port]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/rabbit_virtual_host]/ensure: created", "Info: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/rabbit_virtual_host]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova/Nova_config[glance/api_servers]/ensure: created", "Info: /Stage[main]/Nova/Nova_config[glance/api_servers]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/rabbit_port]/ensure: created", "Info: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/rabbit_port]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova/Nova_config[DEFAULT/image_service]/ensure: created", "Info: /Stage[main]/Nova/Nova_config[DEFAULT/image_service]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Api/Nova_config[DEFAULT/volume_api_class]/ensure: created", "Info: /Stage[main]/Nova::Api/Nova_config[DEFAULT/volume_api_class]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/heal_instance_info_cache_interval]/ensure: created", "Info: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/heal_instance_info_cache_interval]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova/Nova_config[DEFAULT/notify_on_state_change]/ensure: created", "Info: /Stage[main]/Nova/Nova_config[DEFAULT/notify_on_state_change]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/firewall_driver]/ensure: created", "Info: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/firewall_driver]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/instance_usage_audit]/ensure: created", "Info: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/instance_usage_audit]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/scheduler_host_subset_size]/ensure: created", "Info: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/scheduler_host_subset_size]: Scheduling refresh of 
Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Vncproxy/Nova_config[DEFAULT/novncproxy_port]/ensure: created", "Info: /Stage[main]/Nova::Vncproxy/Nova_config[DEFAULT/novncproxy_port]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova/Nova_config[cinder/catalog_info]/ensure: created", "Info: /Stage[main]/Nova/Nova_config[cinder/catalog_info]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/kombu_reconnect_delay]/ensure: created", "Info: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/kombu_reconnect_delay]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/ram_allocation_ratio]/ensure: created", "Info: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/ram_allocation_ratio]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Db/Nova_config[database/connection]/ensure: created", "Info: /Stage[main]/Nova::Db/Nova_config[database/connection]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/cpu_mode]/ensure: created", "Info: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/cpu_mode]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/rabbit_host]/ensure: created", "Info: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/rabbit_host]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Api/Nova_config[DEFAULT/default_floating_pool]/ensure: created", "Info: /Stage[main]/Nova::Api/Nova_config[DEFAULT/default_floating_pool]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/timeout]/ensure: created", "Info: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/timeout]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/reserved_host_memory_mb]/ensure: created", "Info: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/reserved_host_memory_mb]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Scheduler/Nova_config[DEFAULT/scheduler_driver]/ensure: created", "Info: /Stage[main]/Nova::Scheduler/Nova_config[DEFAULT/scheduler_driver]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/amqp_durable_queues]/ensure: created", "Info: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/amqp_durable_queues]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/disk_allocation_ratio]/ensure: created", "Info: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/disk_allocation_ratio]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/heartbeat_rate]/ensure: created", "Info: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/heartbeat_rate]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Compute/Nova_config[vnc/vncserver_proxyclient_address]/ensure: created", "Info: /Stage[main]/Nova::Compute/Nova_config[vnc/vncserver_proxyclient_address]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Api/Nova_config[DEFAULT/api_paste_config]/ensure: created", "Info: /Stage[main]/Nova::Api/Nova_config[DEFAULT/api_paste_config]: Scheduling refresh of Anchor[nova::config::end]", 
"Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[vnc/vncserver_listen]/ensure: created", "Info: /Stage[main]/Nova::Compute::Libvirt/Nova_config[vnc/vncserver_listen]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]/ensure: created", "Info: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/project_domain_name]/ensure: created", "Info: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/project_domain_name]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/scheduler_max_attempts]/ensure: created", "Info: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/scheduler_max_attempts]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/cpu_allocation_ratio]/ensure: created", "Info: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/cpu_allocation_ratio]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/vif_plugging_is_fatal]/ensure: created", "Info: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/vif_plugging_is_fatal]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Api/Nova_config[DEFAULT/metadata_listen_port]/ensure: created", "Info: /Stage[main]/Nova::Api/Nova_config[DEFAULT/metadata_listen_port]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/virt_type]/ensure: created", "Info: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/virt_type]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/rabbit_use_ssl]/ensure: created", "Info: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/rabbit_use_ssl]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/max_instances_per_host]/ensure: created", "Info: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/max_instances_per_host]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/scheduler_available_filters]/ensure: created", "Info: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/scheduler_available_filters]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Compute/Nova_config[vnc/keymap]/ensure: created", "Info: /Stage[main]/Nova::Compute/Nova_config[vnc/keymap]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/url]/ensure: created", "Info: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/url]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova/Nova_config[DEFAULT/service_down_time]/ensure: created", "Info: /Stage[main]/Nova/Nova_config[DEFAULT/service_down_time]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Api/Nova_config[keystone_authtoken/admin_tenant_name]/ensure: created", "Info: /Stage[main]/Nova::Api/Nova_config[keystone_authtoken/admin_tenant_name]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Migration::Libvirt/File_line[/etc/libvirt/libvirtd.conf listen_tls]/ensure: created", "Info: 
/Stage[main]/Nova::Migration::Libvirt/File_line[/etc/libvirt/libvirtd.conf listen_tls]: Scheduling refresh of Service[libvirt]", "Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/ovs_bridge]/ensure: created", "Info: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/ovs_bridge]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/scheduler_host_manager]/ensure: created", "Info: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/scheduler_host_manager]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova/Nova_config[DEFAULT/notification_topics]/ensure: created", "Info: /Stage[main]/Nova/Nova_config[DEFAULT/notification_topics]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/kombu_ssl_version]/ensure: created", "Info: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/kombu_ssl_version]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Api/Nova_config[DEFAULT/use_forwarded_for]/ensure: created", "Info: /Stage[main]/Nova::Api/Nova_config[DEFAULT/use_forwarded_for]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Migration::Libvirt/File_line[/etc/sysconfig/libvirtd libvirtd args]/ensure: created", "Info: /Stage[main]/Nova::Migration::Libvirt/File_line[/etc/sysconfig/libvirtd libvirtd args]: Scheduling refresh of Service[libvirt]", "Notice: /Stage[main]/Nova::Api/Nova_config[DEFAULT/osapi_volume_listen]/ensure: created", "Info: /Stage[main]/Nova::Api/Nova_config[DEFAULT/osapi_volume_listen]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_key]/ensure: created", "Info: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_key]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/instance_usage_audit_period]/ensure: created", "Info: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/instance_usage_audit_period]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Api/Nova_config[DEFAULT/metadata_listen]/ensure: created", "Info: /Stage[main]/Nova::Api/Nova_config[DEFAULT/metadata_listen]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova/Nova_config[DEFAULT/auth_strategy]/ensure: created", "Info: /Stage[main]/Nova/Nova_config[DEFAULT/auth_strategy]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/region_name]/ensure: created", "Info: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/region_name]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Compute/Nova_config[vnc/enabled]/ensure: created", "Info: /Stage[main]/Nova::Compute/Nova_config[vnc/enabled]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[DEFAULT/compute_driver]/ensure: created", "Info: /Stage[main]/Nova::Compute::Libvirt/Nova_config[DEFAULT/compute_driver]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Api/Nova_config[DEFAULT/osapi_compute_workers]/ensure: created", "Info: /Stage[main]/Nova::Api/Nova_config[DEFAULT/osapi_compute_workers]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Api/Nova_config[keystone_authtoken/identity_uri]/ensure: created", "Info: 
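The File_line resources above are stdlib line-in-file edits; unlike the ini providers they notify Service[libvirt] directly, which is why libvirt restarts as soon as the migration settings land. A sketch under the assumption that the usual TCP live-migration values are being written:

    file_line { '/etc/libvirt/libvirtd.conf listen_tls':
      path   => '/etc/libvirt/libvirtd.conf',
      line   => 'listen_tls = 0',            # assumed value
      match  => '^listen_tls',
      notify => Service['libvirt'],
    }
    file_line { '/etc/sysconfig/libvirtd libvirtd args':
      path   => '/etc/sysconfig/libvirtd',
      line   => 'LIBVIRTD_ARGS="--listen"',  # assumed value
      match  => '^LIBVIRTD_ARGS',
      notify => Service['libvirt'],
    }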
/Stage[main]/Nova::Api/Nova_config[keystone_authtoken/identity_uri]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova/Nova_config[DEFAULT/rootwrap_config]/ensure: created", "Info: /Stage[main]/Nova/Nova_config[DEFAULT/rootwrap_config]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/rabbit_hosts]/ensure: created", "Info: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/rabbit_hosts]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova/Nova_config[DEFAULT/rpc_backend]/ensure: created", "Info: /Stage[main]/Nova/Nova_config[DEFAULT/rpc_backend]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/vif_plugging_timeout]/ensure: created", "Info: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/vif_plugging_timeout]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Migration::Libvirt/File_line[/etc/libvirt/libvirtd.conf listen_tcp]/ensure: created", "Info: /Stage[main]/Nova::Migration::Libvirt/File_line[/etc/libvirt/libvirtd.conf listen_tcp]: Scheduling refresh of Service[libvirt]", "Notice: /Stage[main]/Nova::Compute::Libvirt/Service[libvirt]/ensure: ensure changed 'stopped' to 'running'", "Info: /Stage[main]/Nova::Compute::Libvirt/Service[libvirt]: Unscheduling refresh on Service[libvirt]", "Notice: /Stage[main]/Nova::Api/Nova_config[DEFAULT/metadata_workers]/ensure: created", "Info: /Stage[main]/Nova::Api/Nova_config[DEFAULT/metadata_workers]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/dhcp_domain]/ensure: created", "Info: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/dhcp_domain]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/allow_resize_to_same_host]/ensure: created", "Info: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/allow_resize_to_same_host]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova/Nova_config[oslo_concurrency/lock_path]/ensure: created", "Info: /Stage[main]/Nova/Nova_config[oslo_concurrency/lock_path]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Api/Nova_config[keystone_authtoken/auth_uri]/ensure: created", "Info: /Stage[main]/Nova::Api/Nova_config[keystone_authtoken/auth_uri]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Api/Nova_config[neutron/metadata_proxy_shared_secret]/ensure: created", "Info: /Stage[main]/Nova::Api/Nova_config[neutron/metadata_proxy_shared_secret]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/username]/ensure: created", "Info: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/username]: Scheduling refresh of Anchor[nova::config::end]", "Info: /etc/httpd/conf.d: Scheduling refresh of Class[Apache::Service]", "Notice: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat[10-nova_api_wsgi.conf]/File[10-nova_api_wsgi.conf]/ensure: defined content as '{md5}a201c1c5ac33c244ff2071cfe9b38046'", "Notice: /Stage[main]/Apache/Apache::Vhost[default]/Concat[15-default.conf]/File[15-default.conf]/ensure: defined content as '{md5}a430bf4e003be964b419e7aea251c6c4'", "Info: Concat[10-nova_api_wsgi.conf]: Scheduling refresh of Class[Apache::Service]", "Info: Apache::Vhost[nova_api_wsgi]: 
Scheduling refresh of Anchor[keystone::config::end]", "Info: Concat[15-default.conf]: Scheduling refresh of Class[Apache::Service]", "Info: Apache::Vhost[default]: Scheduling refresh of Anchor[keystone::config::end]", "Notice: /Stage[main]/Nova::Api/Nova_config[DEFAULT/fping_path]/ensure: created", "Info: /Stage[main]/Nova::Api/Nova_config[DEFAULT/fping_path]: Scheduling refresh of Anchor[nova::config::end]", "Notice: /Stage[main]/Apache::Mod::Dav_fs/File[dav_fs.conf]/ensure: defined content as '{md5}899a57534f3d84efa81887ec93c90c9b'", "Info: /Stage[main]/Apache::Mod::Dav_fs/File[dav_fs.conf]: Scheduling refresh of Class[Apache::Service]", "Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat[/etc/swift/object-server.conf]/File[/var/lib/puppet/concat/_etc_swift_object-server.conf/fragments.concat]/ensure: created", "Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat[/etc/swift/object-server.conf]/Exec[concat_/etc/swift/object-server.conf]/returns: executed successfully", "Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat[/etc/swift/object-server.conf]/Exec[concat_/etc/swift/object-server.conf]: Triggered 'refresh' from 5 events", "Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat[/etc/swift/object-server.conf]/File[/etc/swift/object-server.conf]/content: ", "--- /etc/swift/object-server.conf\t2016-05-07 16:57:43.000000000 +0100", "+++ /tmp/puppet-file20160520-26469-14vpviw\t2016-05-20 12:30:21.421915019 +0100", "@@ -1,21 +1,39 @@", " [DEFAULT]", "-", "-# Make sure your swift-ring-builder arguments match the bind_ip and bind_port.", "-# You almost certainly do not want to listen just on loopback unless testing.", "-# However, you want to keep port 6200 if SElinux is enabled.", "+devices = /srv/node", " bind_ip = 127.0.0.1", "-bind_port = 6200", "+bind_port = 6000", "+mount_check = false", "+user = swift", "+log_name = object-server", "+log_facility = LOG_LOCAL2", "+log_level = INFO", "+log_address = /dev/log", "+", " ", "-workers = 3", "+workers = 1", " ", " [pipeline:main]", " pipeline = object-server", " ", " [app:object-server]", " use = egg:swift#object", "+set log_name = object-server", "+set log_facility = LOG_LOCAL2", "+set log_level = INFO", "+set log_requests = true", "+set log_address = /dev/log", " ", " [object-replicator]", "+concurrency = 8", " ", " [object-updater]", "+concurrency = 8", " ", " [object-auditor]", "+", "+[filter:healthcheck]", "+use = egg:swift#healthcheck", "+", "+[filter:recon]", "+use = egg:swift#recon", "+recon_cache_path = /var/cache/swift", "Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat[/etc/swift/object-server.conf]/File[/etc/swift/object-server.conf]: Filebucketed /etc/swift/object-server.conf to puppet with sum 43f14d676b28bc8111d6100e06e9a8bf", "Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat[/etc/swift/object-server.conf]/File[/etc/swift/object-server.conf]/content: content changed '{md5}43f14d676b28bc8111d6100e06e9a8bf' to '{md5}396c3ccb85387cbac0df92cdbad14646'", "Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat[/etc/swift/object-server.conf]/File[/etc/swift/object-server.conf]/owner: owner changed 'root' to 'swift'", "Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat[/etc/swift/object-server.conf]/File[/etc/swift/object-server.conf]/mode: mode changed '0640' to '0644'", "Info: 
/Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat[/etc/swift/object-server.conf]/File[/etc/swift/object-server.conf]: Scheduling refresh of Service[swift-object-updater]", "Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat[/etc/swift/object-server.conf]/File[/etc/swift/object-server.conf]: Scheduling refresh of Swift::Service[swift-object-updater]", "Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat[/etc/swift/object-server.conf]/File[/etc/swift/object-server.conf]: Scheduling refresh of Service[swift-object-updater]", "Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat[/etc/swift/object-server.conf]/File[/etc/swift/object-server.conf]: Scheduling refresh of Swift::Service[swift-object-updater]", "Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat[/etc/swift/object-server.conf]/File[/etc/swift/object-server.conf]: Scheduling refresh of Service[swift-object-updater]", "Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat[/etc/swift/object-server.conf]/File[/etc/swift/object-server.conf]: Scheduling refresh of Swift::Service[swift-object-updater]", "Info: Concat[/etc/swift/object-server.conf]: Scheduling refresh of Service[swift-object-server]", "Info: Concat[/etc/swift/object-server.conf]: Scheduling refresh of Service[swift-object-replicator]", "Info: Concat[/etc/swift/object-server.conf]: Scheduling refresh of Service[swift-object-auditor]", "Notice: /Stage[main]/Swift::Storage::Object/Swift::Storage::Generic[object]/Swift::Service[swift-object-server]/Service[swift-object-server]/ensure: ensure changed 'stopped' to 'running'", "Info: /Stage[main]/Swift::Storage::Object/Swift::Storage::Generic[object]/Swift::Service[swift-object-server]/Service[swift-object-server]: Unscheduling refresh on Service[swift-object-server]", "Notice: /Stage[main]/Swift::Storage::Object/Swift::Storage::Generic[object]/Swift::Service[swift-object-replicator]/Service[swift-object-replicator]/ensure: ensure changed 'stopped' to 'running'", "Info: /Stage[main]/Swift::Storage::Object/Swift::Storage::Generic[object]/Swift::Service[swift-object-replicator]/Service[swift-object-replicator]: Unscheduling refresh on Service[swift-object-replicator]", "Notice: /Stage[main]/Swift::Storage::Object/Swift::Storage::Generic[object]/Swift::Service[swift-object-auditor]/Service[swift-object-auditor]/ensure: ensure changed 'stopped' to 'running'", "Info: /Stage[main]/Swift::Storage::Object/Swift::Storage::Generic[object]/Swift::Service[swift-object-auditor]/Service[swift-object-auditor]: Unscheduling refresh on Service[swift-object-auditor]", "Info: Swift::Service[swift-object-updater]: Scheduling refresh of Service[swift-object-updater]", "Notice: /Stage[main]/Swift::Storage::Object/Swift::Service[swift-object-updater]/Service[swift-object-updater]/ensure: ensure changed 'stopped' to 'running'", "Info: /Stage[main]/Swift::Storage::Object/Swift::Service[swift-object-updater]/Service[swift-object-updater]: Unscheduling refresh on Service[swift-object-updater]", "Notice: /Stage[main]/Cinder::Db::Sync/Exec[cinder-manage db_sync]: Triggered 'refresh' from 47 events", "Info: /Stage[main]/Cinder::Db::Sync/Exec[cinder-manage db_sync]: Scheduling refresh of Service[cinder-api]", "Info: /Stage[main]/Cinder::Db::Sync/Exec[cinder-manage db_sync]: Scheduling refresh of Service[cinder-scheduler]", "Info: /Stage[main]/Cinder::Db::Sync/Exec[cinder-manage db_sync]: Scheduling refresh of 
Service[cinder-scheduler]", "Info: /Stage[main]/Cinder::Db::Sync/Exec[cinder-manage db_sync]: Scheduling refresh of Service[cinder-volume]", "Info: /Stage[main]/Cinder::Db::Sync/Exec[cinder-manage db_sync]: Scheduling refresh of Service[cinder-volume]", "Notice: /Stage[main]/Cinder::Api/Service[cinder-api]/ensure: ensure changed 'stopped' to 'running'", "Info: /Stage[main]/Cinder::Api/Service[cinder-api]: Unscheduling refresh on Service[cinder-api]", "Notice: /Stage[main]/Cinder::Volume/Service[cinder-volume]/ensure: ensure changed 'stopped' to 'running'", "Info: /Stage[main]/Cinder::Volume/Service[cinder-volume]: Unscheduling refresh on Service[cinder-volume]", "Notice: /Stage[main]/Cinder::Scheduler/Service[cinder-scheduler]/ensure: ensure changed 'stopped' to 'running'", "Info: /Stage[main]/Cinder::Scheduler/Service[cinder-scheduler]: Unscheduling refresh on Service[cinder-scheduler]", "Notice: /Stage[main]/Keystone/Keystone_config[token/revoke_by_id]/ensure: created", "Info: /Stage[main]/Keystone/Keystone_config[token/revoke_by_id]: Scheduling refresh of Anchor[keystone::config::end]", "Notice: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat[10-ironic_wsgi.conf]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf]/ensure: created", "Info: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat[10-ironic_wsgi.conf]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf]: Scheduling refresh of Exec[concat_10-ironic_wsgi.conf]", "Notice: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat[10-ironic_wsgi.conf]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments]/ensure: created", "Info: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat[10-ironic_wsgi.conf]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments]: Scheduling refresh of Exec[concat_10-ironic_wsgi.conf]", "Notice: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat::Fragment[ironic_wsgi-access_log]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments/100_ironic_wsgi-access_log]/ensure: defined content as '{md5}f2a2c3f663fb69cb0f359c1ae7ad320c'", "Info: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat::Fragment[ironic_wsgi-access_log]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments/100_ironic_wsgi-access_log]: Scheduling refresh of Exec[concat_10-ironic_wsgi.conf]", "Notice: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat::Fragment[ironic_wsgi-directories]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments/60_ironic_wsgi-directories]/ensure: defined content as '{md5}29d0408a3b55a4415d880929f9a3ad46'", "Info: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat::Fragment[ironic_wsgi-directories]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments/60_ironic_wsgi-directories]: Scheduling refresh of Exec[concat_10-ironic_wsgi.conf]", "Notice: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat::Fragment[ironic_wsgi-ssl]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments/230_ironic_wsgi-ssl]/ensure: defined content as '{md5}d6cec447dc3b9d177de1da941662dde7'", "Info: 
/Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat::Fragment[ironic_wsgi-ssl]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments/230_ironic_wsgi-ssl]: Scheduling refresh of Exec[concat_10-ironic_wsgi.conf]", "Notice: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat::Fragment[ironic_wsgi-docroot]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments/10_ironic_wsgi-docroot]/ensure: defined content as '{md5}5cce1f4b838a61eb9353dc516b6f1912'", "Info: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat::Fragment[ironic_wsgi-docroot]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments/10_ironic_wsgi-docroot]: Scheduling refresh of Exec[concat_10-ironic_wsgi.conf]", "Notice: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat::Fragment[ironic_wsgi-wsgi]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments/260_ironic_wsgi-wsgi]/ensure: defined content as '{md5}ce69252b664facd16f8d6d002943bde9'", "Info: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat::Fragment[ironic_wsgi-wsgi]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments/260_ironic_wsgi-wsgi]: Scheduling refresh of Exec[concat_10-ironic_wsgi.conf]", "Notice: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat::Fragment[ironic_wsgi-apache-header]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments/0_ironic_wsgi-apache-header]/ensure: defined content as '{md5}eed662cc75f34394db84b64d61142357'", "Info: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat::Fragment[ironic_wsgi-apache-header]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments/0_ironic_wsgi-apache-header]: Scheduling refresh of Exec[concat_10-ironic_wsgi.conf]", "Notice: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat::Fragment[ironic_wsgi-logging]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments/80_ironic_wsgi-logging]/ensure: defined content as '{md5}228ae1c4025ea06df280b6c090746264'", "Info: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat::Fragment[ironic_wsgi-logging]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments/80_ironic_wsgi-logging]: Scheduling refresh of Exec[concat_10-ironic_wsgi.conf]", "Notice: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat::Fragment[ironic_wsgi-file_footer]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments/999_ironic_wsgi-file_footer]/ensure: defined content as '{md5}e27b2525783e590ca1820f1e2118285d'", "Info: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat::Fragment[ironic_wsgi-file_footer]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments/999_ironic_wsgi-file_footer]: Scheduling refresh of Exec[concat_10-ironic_wsgi.conf]", "Notice: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat[10-ironic_wsgi.conf]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments.concat]/ensure: created", "Notice: 
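Each Concat::Fragment[ironic_wsgi-*] above is one section (docroot, directories, logging, ssl, wsgi, footer) of the final 10-ironic_wsgi.conf vhost; openstacklib::wsgi::apache generates them through apache::vhost. Roughly as follows, with parameter values that are assumptions rather than values read from this job:

    openstacklib::wsgi::apache { 'ironic_wsgi':
      bind_port          => 6385,   # assumed Ironic API port
      ssl                => true,
      user               => 'ironic',
      group              => 'ironic',
      workers            => 1,
      wsgi_script_dir    => '/var/www/cgi-bin/ironic',
      wsgi_script_file   => 'app',
      wsgi_script_source => '/usr/lib/python2.7/site-packages/ironic/api/app.wsgi',  # assumed path
    }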
/Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat[10-ironic_wsgi.conf]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments.concat.out]/ensure: created", "Notice: /Stage[main]/Swift::Storage::Container/Swift::Service[swift-container-updater]/Service[swift-container-updater]/ensure: ensure changed 'stopped' to 'running'", "Info: /Stage[main]/Swift::Storage::Container/Swift::Service[swift-container-updater]/Service[swift-container-updater]: Unscheduling refresh on Service[swift-container-updater]", "Notice: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat::Fragment[ironic_wsgi-serversignature]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments/90_ironic_wsgi-serversignature]/ensure: defined content as '{md5}9bf5a458783ab459e5043e1cdf671fa7'", "Info: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat::Fragment[ironic_wsgi-serversignature]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments/90_ironic_wsgi-serversignature]: Scheduling refresh of Exec[concat_10-ironic_wsgi.conf]", "Notice: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat[10-ironic_wsgi.conf]/Exec[concat_10-ironic_wsgi.conf]/returns: executed successfully", "Notice: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat[10-ironic_wsgi.conf]/Exec[concat_10-ironic_wsgi.conf]: Triggered 'refresh' from 11 events", "Notice: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat[10-ironic_wsgi.conf]/File[10-ironic_wsgi.conf]/ensure: defined content as '{md5}fd0438eae872c05b10e229854a6dd56d'", "Info: Concat[10-ironic_wsgi.conf]: Scheduling refresh of Class[Apache::Service]", "Info: Apache::Vhost[ironic_wsgi]: Scheduling refresh of Anchor[keystone::config::end]", "Notice: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/l3_ha]/ensure: created", "Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/l3_ha]: Scheduling refresh of Service[neutron-server]", "Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/l3_ha]: Scheduling refresh of Exec[neutron-db-sync]", "Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/l3_ha]: Scheduling refresh of Service[neutron-metadata]", "Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/l3_ha]: Scheduling refresh of Service[neutron-lbaas-service]", "Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/l3_ha]: Scheduling refresh of Service[neutron-l3]", "Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/l3_ha]: Scheduling refresh of Service[neutron-dhcp-service]", "Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/l3_ha]: Scheduling refresh of Service[neutron-metering-service]", "Notice: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_host]/ensure: created", "Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_host]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_host]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Openstack_integration::Ironic/Openstack_integration::Ssl_key[ironic]/File[/etc/ironic/ssl]/ensure: created", "Notice: 
/Stage[main]/Openstack_integration::Ironic/Openstack_integration::Ssl_key[ironic]/File[/etc/ironic/ssl/private]/ensure: created", "Notice: /Stage[main]/Openstack_integration::Ironic/Openstack_integration::Ssl_key[ironic]/File[/etc/ironic/ssl/private/n2.dusty.ci.centos.org.pem]/ensure: defined content as '{md5}a4b1e9413d7b63f33794716d77cfaa00'", "Info: Openstack_integration::Ssl_key[ironic]: Scheduling refresh of Service[httpd]", "Notice: /Stage[main]/Keystone::Logging/Keystone_config[DEFAULT/verbose]/ensure: created", "Info: /Stage[main]/Keystone::Logging/Keystone_config[DEFAULT/verbose]: Scheduling refresh of Anchor[keystone::config::end]", "Notice: /Stage[main]/Glance::Api/Glance_cache_config[DEFAULT/auth_url]/ensure: created", "Info: /Stage[main]/Glance::Api/Glance_cache_config[DEFAULT/auth_url]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Glance::Api/Glance_cache_config[DEFAULT/auth_url]: Scheduling refresh of Exec[glance-manage db_sync]", "Notice: /Stage[main]/Glance::Db::Sync/Exec[glance-manage db_sync]: Triggered 'refresh' from 83 events", "Info: /Stage[main]/Glance::Db::Sync/Exec[glance-manage db_sync]: Scheduling refresh of Service[glance-api]", "Info: /Stage[main]/Glance::Db::Sync/Exec[glance-manage db_sync]: Scheduling refresh of Service[glance-registry]", "Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat::Fragment[keystone_wsgi_main-directories]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments/60_keystone_wsgi_main-directories]/ensure: defined content as '{md5}cc81234a3bbf77f857ed3f11bb369e8c'", "Info: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat::Fragment[keystone_wsgi_main-directories]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments/60_keystone_wsgi_main-directories]: Scheduling refresh of Exec[concat_10-keystone_wsgi_main.conf]", "Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat[10-keystone_wsgi_main.conf]/Exec[concat_10-keystone_wsgi_main.conf]/returns: executed successfully", "Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat[10-keystone_wsgi_main.conf]/Exec[concat_10-keystone_wsgi_main.conf]: Triggered 'refresh' from 11 events", "Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat[10-keystone_wsgi_main.conf]/File[10-keystone_wsgi_main.conf]/ensure: defined content as '{md5}fa0ea0cfef0ad72ddbeb9b6110bd2c86'", "Info: Concat[10-keystone_wsgi_main.conf]: Scheduling refresh of Class[Apache::Service]", "Info: Apache::Vhost[keystone_wsgi_main]: Scheduling refresh of Anchor[keystone::config::end]", "Notice: /Stage[main]/Nova::Deps/Anchor[nova::config::end]: Triggered 'refresh' from 103 events", "Info: /Stage[main]/Nova::Deps/Anchor[nova::config::end]: Scheduling refresh of Anchor[nova::service::begin]", "Info: /Stage[main]/Nova::Deps/Anchor[nova::config::end]: Scheduling refresh of Exec[nova-db-sync]", "Info: /Stage[main]/Nova::Deps/Anchor[nova::config::end]: Scheduling refresh of Exec[nova-db-sync-api]", "Notice: /Stage[main]/Nova::Db::Mysql/Openstacklib::Db::Mysql[nova]/Mysql_database[nova]/ensure: created", "Notice: /Stage[main]/Nova::Db::Mysql/Openstacklib::Db::Mysql[nova]/Openstacklib::Db::Mysql::Host_access[nova_127.0.0.1]/Mysql_user[nova@127.0.0.1]/ensure: created", "Notice: /Stage[main]/Nova::Db::Mysql/Openstacklib::Db::Mysql[nova]/Openstacklib::Db::Mysql::Host_access[nova_127.0.0.1]/Mysql_grant[nova@127.0.0.1/nova.*]/ensure: created", "Info: 
Class[Nova::Db::Mysql]: Scheduling refresh of Anchor[nova::db::end]", "Notice: /Stage[main]/Nova::Db::Mysql_api/Openstacklib::Db::Mysql[nova_api]/Mysql_database[nova_api]/ensure: created", "Notice: /Stage[main]/Nova::Db::Mysql_api/Openstacklib::Db::Mysql[nova_api]/Openstacklib::Db::Mysql::Host_access[nova_api_127.0.0.1]/Mysql_user[nova_api@127.0.0.1]/ensure: created", "Notice: /Stage[main]/Nova::Db::Mysql_api/Openstacklib::Db::Mysql[nova_api]/Openstacklib::Db::Mysql::Host_access[nova_api_127.0.0.1]/Mysql_grant[nova_api@127.0.0.1/nova_api.*]/ensure: created", "Info: Class[Nova::Db::Mysql_api]: Scheduling refresh of Anchor[nova::db::end]", "Notice: /Stage[main]/Nova::Deps/Anchor[nova::db::end]: Triggered 'refresh' from 2 events", "Info: /Stage[main]/Nova::Deps/Anchor[nova::db::end]: Scheduling refresh of Anchor[nova::dbsync::begin]", "Notice: /Stage[main]/Nova::Deps/Anchor[nova::dbsync::begin]: Triggered 'refresh' from 1 events", "Info: /Stage[main]/Nova::Deps/Anchor[nova::dbsync::begin]: Scheduling refresh of Exec[nova-db-sync]", "Notice: /Stage[main]/Nova::Db::Sync/Exec[nova-db-sync]: Triggered 'refresh' from 3 events", "Info: /Stage[main]/Nova::Db::Sync/Exec[nova-db-sync]: Scheduling refresh of Anchor[nova::dbsync::end]", "Notice: /Stage[main]/Nova::Deps/Anchor[nova::dbsync::end]: Triggered 'refresh' from 1 events", "Info: /Stage[main]/Nova::Deps/Anchor[nova::dbsync::end]: Scheduling refresh of Anchor[nova::dbsync_api::begin]", "Notice: /Stage[main]/Nova::Cron::Archive_deleted_rows/Cron[nova-manage db archive_deleted_rows]/ensure: created", "Notice: /Stage[main]/Nova::Deps/Anchor[nova::dbsync_api::begin]: Triggered 'refresh' from 1 events", "Info: /Stage[main]/Nova::Deps/Anchor[nova::dbsync_api::begin]: Scheduling refresh of Exec[nova-db-sync-api]", "Notice: /Stage[main]/Nova::Db::Sync_api/Exec[nova-db-sync-api]: Triggered 'refresh' from 3 events", "Info: /Stage[main]/Nova::Db::Sync_api/Exec[nova-db-sync-api]: Scheduling refresh of Anchor[nova::dbsync_api::end]", "Notice: /Stage[main]/Nova::Deps/Anchor[nova::dbsync_api::end]: Triggered 'refresh' from 1 events", "Info: /Stage[main]/Nova::Deps/Anchor[nova::dbsync_api::end]: Scheduling refresh of Anchor[nova::service::begin]", "Notice: /Stage[main]/Nova::Deps/Anchor[nova::service::begin]: Triggered 'refresh' from 3 events", "Info: /Stage[main]/Nova::Deps/Anchor[nova::service::begin]: Scheduling refresh of Service[nova-api]", "Info: /Stage[main]/Nova::Deps/Anchor[nova::service::begin]: Scheduling refresh of Service[nova-conductor]", "Info: /Stage[main]/Nova::Deps/Anchor[nova::service::begin]: Scheduling refresh of Service[nova-consoleauth]", "Info: /Stage[main]/Nova::Deps/Anchor[nova::service::begin]: Scheduling refresh of Service[nova-compute]", "Info: /Stage[main]/Nova::Deps/Anchor[nova::service::begin]: Scheduling refresh of Service[nova-scheduler]", "Info: /Stage[main]/Nova::Deps/Anchor[nova::service::begin]: Scheduling refresh of Service[nova-vncproxy]", "Notice: /Stage[main]/Nova::Vncproxy/Nova::Generic_service[vncproxy]/Service[nova-vncproxy]/ensure: ensure changed 'stopped' to 'running'", "Info: /Stage[main]/Nova::Vncproxy/Nova::Generic_service[vncproxy]/Service[nova-vncproxy]: Scheduling refresh of Anchor[nova::service::end]", "Info: /Stage[main]/Nova::Vncproxy/Nova::Generic_service[vncproxy]/Service[nova-vncproxy]: Unscheduling refresh on Service[nova-vncproxy]", "Notice: /Stage[main]/Nova::Consoleauth/Nova::Generic_service[consoleauth]/Service[nova-consoleauth]/ensure: ensure changed 'stopped' to 'running'", "Info: 
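The anchor chain above (db::end -> dbsync::begin -> Exec[nova-db-sync] -> dbsync::end -> dbsync_api::begin -> service::begin) is how puppet-nova guarantees the schema sync runs after all config and database changes but before any service starts. The sync execs are refresh-only, so they fire only when notified through the chain. A minimal sketch (command simplified; the anchor names are from the log):

    exec { 'nova-db-sync':
      command     => 'nova-manage db sync',
      path        => ['/bin', '/usr/bin'],
      refreshonly => true,
      subscribe   => Anchor['nova::dbsync::begin'],
      notify      => Anchor['nova::dbsync::end'],
    }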
/Stage[main]/Nova::Consoleauth/Nova::Generic_service[consoleauth]/Service[nova-consoleauth]: Scheduling refresh of Anchor[nova::service::end]", "Info: /Stage[main]/Nova::Consoleauth/Nova::Generic_service[consoleauth]/Service[nova-consoleauth]: Unscheduling refresh on Service[nova-consoleauth]", "Notice: /Stage[main]/Nova::Scheduler/Nova::Generic_service[scheduler]/Service[nova-scheduler]/ensure: ensure changed 'stopped' to 'running'", "Info: /Stage[main]/Nova::Scheduler/Nova::Generic_service[scheduler]/Service[nova-scheduler]: Scheduling refresh of Anchor[nova::service::end]", "Info: /Stage[main]/Nova::Scheduler/Nova::Generic_service[scheduler]/Service[nova-scheduler]: Unscheduling refresh on Service[nova-scheduler]", "Notice: /Stage[main]/Nova::Conductor/Nova::Generic_service[conductor]/Service[nova-conductor]/ensure: ensure changed 'stopped' to 'running'", "Info: /Stage[main]/Nova::Conductor/Nova::Generic_service[conductor]/Service[nova-conductor]: Scheduling refresh of Anchor[nova::service::end]", "Info: /Stage[main]/Nova::Conductor/Nova::Generic_service[conductor]/Service[nova-conductor]: Unscheduling refresh on Service[nova-conductor]", "Notice: /Stage[main]/Nova::Compute/Nova::Generic_service[compute]/Service[nova-compute]/ensure: ensure changed 'stopped' to 'running'", "Info: /Stage[main]/Nova::Compute/Nova::Generic_service[compute]/Service[nova-compute]: Scheduling refresh of Anchor[nova::service::end]", "Info: /Stage[main]/Nova::Compute/Nova::Generic_service[compute]/Service[nova-compute]: Unscheduling refresh on Service[nova-compute]", "Notice: /Stage[main]/Apache::Mod::Cgi/Apache::Mod[cgi]/File[cgi.load]/ensure: defined content as '{md5}ac20c5c5779b37ab06b480d6485a0881'", "Info: /Stage[main]/Apache::Mod::Cgi/Apache::Mod[cgi]/File[cgi.load]: Scheduling refresh of Class[Apache::Service]", "Info: /Stage[main]/Apache/File[/etc/httpd/conf.modules.d/00-proxy.conf]: Filebucketed /etc/httpd/conf.modules.d/00-proxy.conf to puppet with sum 85487c6777a89a8494dc8976dfff3268", "Notice: /Stage[main]/Apache/File[/etc/httpd/conf.modules.d/00-proxy.conf]/ensure: removed", "Info: /Stage[main]/Apache/File[/etc/httpd/conf.modules.d/01-cgi.conf]: Filebucketed /etc/httpd/conf.modules.d/01-cgi.conf to puppet with sum 36e54d4b2bd190f5cbad876bfbeda461", "Notice: /Stage[main]/Apache/File[/etc/httpd/conf.modules.d/01-cgi.conf]/ensure: removed", "Info: /Stage[main]/Apache/File[/etc/httpd/conf.modules.d/00-ssl.conf]: Filebucketed /etc/httpd/conf.modules.d/00-ssl.conf to puppet with sum e282ac9f82fe5538692a4de3616fb695", "Notice: /Stage[main]/Apache/File[/etc/httpd/conf.modules.d/00-ssl.conf]/ensure: removed", "Info: /Stage[main]/Apache/File[/etc/httpd/conf.modules.d/00-mpm.conf]: Filebucketed /etc/httpd/conf.modules.d/00-mpm.conf to puppet with sum 820f672ca85595fd80620db585d51970", "Notice: /Stage[main]/Apache/File[/etc/httpd/conf.modules.d/00-mpm.conf]/ensure: removed", "Info: /Stage[main]/Apache/File[/etc/httpd/conf.modules.d/00-systemd.conf]: Filebucketed /etc/httpd/conf.modules.d/00-systemd.conf to puppet with sum fd94264cd695af2ad86e7715c10e285d", "Notice: /Stage[main]/Apache/File[/etc/httpd/conf.modules.d/00-systemd.conf]/ensure: removed", "Info: /Stage[main]/Apache/File[/etc/httpd/conf.modules.d/10-wsgi.conf]: Filebucketed /etc/httpd/conf.modules.d/10-wsgi.conf to puppet with sum e1795e051e7aae1f865fde0d3b86a507", "Notice: /Stage[main]/Apache/File[/etc/httpd/conf.modules.d/10-wsgi.conf]/ensure: removed", "Info: /Stage[main]/Apache/File[/etc/httpd/conf.modules.d/00-base.conf]: Filebucketed 
/etc/httpd/conf.modules.d/00-base.conf to puppet with sum 6098845a84033f0fabe536488e52b1a0", "Notice: /Stage[main]/Apache/File[/etc/httpd/conf.modules.d/00-base.conf]/ensure: removed", "Info: /Stage[main]/Apache/File[/etc/httpd/conf.modules.d/00-lua.conf]: Filebucketed /etc/httpd/conf.modules.d/00-lua.conf to puppet with sum 449a4aea60473ac4a16f025fca4463e3", "Notice: /Stage[main]/Apache/File[/etc/httpd/conf.modules.d/00-lua.conf]/ensure: removed", "Info: /Stage[main]/Apache/File[/etc/httpd/conf.modules.d/00-dav.conf]: Filebucketed /etc/httpd/conf.modules.d/00-dav.conf to puppet with sum 56406b62d1fc7b7f1912e5b9e223f7a0", "Notice: /Stage[main]/Apache/File[/etc/httpd/conf.modules.d/00-dav.conf]/ensure: removed", "Info: /etc/httpd/conf.modules.d: Scheduling refresh of Class[Apache::Service]", "Notice: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_port]/ensure: created", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_port]: Scheduling refresh of Service[neutron-server]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_port]: Scheduling refresh of Exec[neutron-db-sync]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_port]: Scheduling refresh of Service[neutron-metadata]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_port]: Scheduling refresh of Service[neutron-lbaas-service]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_port]: Scheduling refresh of Service[neutron-l3]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_port]: Scheduling refresh of Service[neutron-dhcp-service]", "Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_port]: Scheduling refresh of Service[neutron-metering-service]", "Notice: /Stage[main]/Neutron::Agents::Ml2::Ovs/Service[neutron-ovs-agent-service]/ensure: ensure changed 'stopped' to 'running'", "Info: /Stage[main]/Neutron::Agents::Ml2::Ovs/Service[neutron-ovs-agent-service]: Unscheduling refresh on Service[neutron-ovs-agent-service]", "Notice: /Stage[main]/Neutron::Agents::Lbaas/Service[neutron-lbaasv2-service]: Triggered 'refresh' from 1 events", "Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat[10-keystone_wsgi_admin.conf]/File[10-keystone_wsgi_admin.conf]/ensure: defined content as '{md5}5147e80911d47f807820c80ccf1b3f9e'", "Info: Concat[10-keystone_wsgi_admin.conf]: Scheduling refresh of Class[Apache::Service]", "Info: Apache::Vhost[keystone_wsgi_admin]: Scheduling refresh of Anchor[keystone::config::end]", "Notice: /Stage[main]/Keystone::Deps/Anchor[keystone::config::end]: Triggered 'refresh' from 36 events", "Info: /Stage[main]/Keystone::Deps/Anchor[keystone::config::end]: Scheduling refresh of Anchor[keystone::service::begin]", "Info: /Stage[main]/Keystone::Deps/Anchor[keystone::config::end]: Scheduling refresh of Service[httpd]", "Info: /Stage[main]/Keystone::Deps/Anchor[keystone::config::end]: Scheduling refresh of Exec[keystone-manage db_sync]", "Notice: /Stage[main]/Keystone::Db::Mysql/Openstacklib::Db::Mysql[keystone]/Mysql_database[keystone]/ensure: created", "Info: Class[Apache::Service]: Scheduling refresh of Service[httpd]", "Notice: /Stage[main]/Keystone::Db::Mysql/Openstacklib::Db::Mysql[keystone]/Openstacklib::Db::Mysql::Host_access[keystone_127.0.0.1]/Mysql_user[keystone@127.0.0.1]/ensure: created", "Notice: 
/Stage[main]/Keystone::Db::Mysql/Openstacklib::Db::Mysql[keystone]/Openstacklib::Db::Mysql::Host_access[keystone_127.0.0.1]/Mysql_grant[keystone@127.0.0.1/keystone.*]/ensure: created", "Info: Class[Keystone::Db::Mysql]: Scheduling refresh of Anchor[keystone::db::end]", "Notice: /Stage[main]/Keystone::Deps/Anchor[keystone::db::end]: Triggered 'refresh' from 1 events", "Info: /Stage[main]/Keystone::Deps/Anchor[keystone::db::end]: Scheduling refresh of Anchor[keystone::dbsync::begin]", "Notice: /Stage[main]/Keystone::Deps/Anchor[keystone::dbsync::begin]: Triggered 'refresh' from 1 events", "Info: /Stage[main]/Keystone::Deps/Anchor[keystone::dbsync::begin]: Scheduling refresh of Exec[keystone-manage db_sync]", "Notice: /Stage[main]/Keystone::Db::Sync/Exec[keystone-manage db_sync]: Triggered 'refresh' from 3 events", "Info: /Stage[main]/Keystone::Db::Sync/Exec[keystone-manage db_sync]: Scheduling refresh of Anchor[keystone::dbsync::end]", "Notice: /Stage[main]/Keystone::Deps/Anchor[keystone::dbsync::end]: Triggered 'refresh' from 1 events", "Info: /Stage[main]/Keystone::Deps/Anchor[keystone::dbsync::end]: Scheduling refresh of Anchor[keystone::service::begin]", "Info: /Stage[main]/Keystone::Deps/Anchor[keystone::dbsync::end]: Scheduling refresh of Exec[keystone-manage bootstrap]", "Notice: /Stage[main]/Keystone/Exec[keystone-manage bootstrap]: Triggered 'refresh' from 1 events", "Info: /Stage[main]/Keystone/Exec[keystone-manage bootstrap]: Scheduling refresh of Anchor[keystone::service::begin]", "Notice: /Stage[main]/Keystone::Deps/Anchor[keystone::service::begin]: Triggered 'refresh' from 4 events", "Info: /Stage[main]/Keystone::Deps/Anchor[keystone::service::begin]: Scheduling refresh of Service[keystone]", "Notice: /Stage[main]/Keystone::Service/Service[keystone]: Triggered 'refresh' from 1 events", "Info: /Stage[main]/Keystone::Service/Service[keystone]: Scheduling refresh of Anchor[keystone::service::end]", "Notice: /Stage[main]/Apache::Service/Service[httpd]/ensure: ensure changed 'stopped' to 'running'", "Info: /Stage[main]/Apache::Service/Service[httpd]: Unscheduling refresh on Service[httpd]", "Notice: /Stage[main]/Keystone::Deps/Anchor[keystone::service::end]: Triggered 'refresh' from 31 events", "Notice: /Stage[main]/Swift::Keystone::Auth/Keystone::Resource::Service_identity[swift_s3]/Keystone_service[swift_s3::s3]/ensure: created", "Notice: /Stage[main]/Swift::Keystone::Auth/Keystone::Resource::Service_identity[swift_s3]/Keystone_endpoint[RegionOne/swift_s3::s3]/ensure: created", "Notice: /Stage[main]/Glance::Keystone::Auth/Keystone::Resource::Service_identity[glance]/Keystone_service[Image Service::image]/ensure: created", "Notice: /Stage[main]/Neutron::Keystone::Auth/Keystone::Resource::Service_identity[neutron]/Keystone_user[neutron]/ensure: created", "Notice: /Stage[main]/Swift::Keystone::Auth/Keystone_role[ResellerAdmin]/ensure: created", "Notice: /Stage[main]/Ironic::Keystone::Auth/Keystone::Resource::Service_identity[ironic]/Keystone_service[ironic::baremetal]/ensure: created", "Notice: /Stage[main]/Glance::Keystone::Auth/Keystone::Resource::Service_identity[glance]/Keystone_endpoint[RegionOne/Image Service::image]/ensure: created", "Info: /Stage[main]/Glance::Keystone::Auth/Keystone::Resource::Service_identity[glance]/Keystone_endpoint[RegionOne/Image Service::image]: Scheduling refresh of Service[glance-api]", "Notice: /Stage[main]/Nova::Keystone::Auth/Keystone::Resource::Service_identity[nova service, user nova]/Keystone_user[nova]/ensure: created", "Notice: 
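Each Keystone::Resource::Service_identity above expands into a service user, a role assignment, a catalog service and an endpoint (the resource titles encode name::type, e.g. "Image Service::image"). Sketched here for glance; the password and URLs are placeholders, not values from this run:

    keystone_user { 'glance':
      ensure   => present,
      password => 'a_secret',  # placeholder
    }
    keystone_user_role { 'glance@services':
      ensure => present,
      roles  => ['admin'],
    }
    keystone_service { 'glance::image':
      ensure      => present,
      description => 'OpenStack Image Service',
    }
    keystone_endpoint { 'RegionOne/glance::image':
      ensure       => present,
      public_url   => 'http://127.0.0.1:9292',  # placeholder URLs
      internal_url => 'http://127.0.0.1:9292',
      admin_url    => 'http://127.0.0.1:9292',
    }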
/Stage[main]/Swift::Keystone::Auth/Keystone_role[SwiftOperator]/ensure: created", "Notice: /Stage[main]/Glance::Keystone::Auth/Keystone::Resource::Service_identity[glance]/Keystone_user[glance]/ensure: created", "Notice: /Stage[main]/Nova::Keystone::Auth/Keystone::Resource::Service_identity[nova v3 service, user novav3]/Keystone_service[novav3::computev3]/ensure: created", "Notice: /Stage[main]/Keystone::Roles::Admin/Keystone_tenant[services]/ensure: created", "Notice: /Stage[main]/Glance::Keystone::Auth/Keystone::Resource::Service_identity[glance]/Keystone_user_role[glance@services]/ensure: created", "Info: /Stage[main]/Glance::Keystone::Auth/Keystone::Resource::Service_identity[glance]/Keystone_user_role[glance@services]: Scheduling refresh of Service[glance-registry]", "Info: /Stage[main]/Glance::Keystone::Auth/Keystone::Resource::Service_identity[glance]/Keystone_user_role[glance@services]: Scheduling refresh of Service[glance-api]", "Notice: /Stage[main]/Glance::Registry/Service[glance-registry]/ensure: ensure changed 'stopped' to 'running'", "Info: /Stage[main]/Glance::Registry/Service[glance-registry]: Unscheduling refresh on Service[glance-registry]", "Notice: /Stage[main]/Keystone::Roles::Admin/Keystone_tenant[openstack]/ensure: created", "Notice: /Stage[main]/Ironic::Keystone::Auth/Keystone::Resource::Service_identity[ironic]/Keystone_user[ironic]/ensure: created", "Notice: /Stage[main]/Ironic::Keystone::Auth/Keystone::Resource::Service_identity[ironic]/Keystone_user_role[ironic@services]/ensure: created", "Notice: /Stage[main]/Nova::Keystone::Auth/Keystone::Resource::Service_identity[nova service, user nova]/Keystone_service[nova::compute]/ensure: created", "Notice: /Stage[main]/Cinder::Keystone::Auth/Keystone::Resource::Service_identity[cinderv3]/Keystone_service[cinderv3::volumev3]/ensure: created", "Notice: /Stage[main]/Cinder::Keystone::Auth/Keystone::Resource::Service_identity[cinderv3]/Keystone_endpoint[RegionOne/cinderv3::volumev3]/ensure: created", "Notice: /Stage[main]/Nova::Api/Nova::Generic_service[api]/Service[nova-api]/ensure: ensure changed 'stopped' to 'running'", "Info: /Stage[main]/Nova::Api/Nova::Generic_service[api]/Service[nova-api]: Scheduling refresh of Anchor[nova::service::end]", "Info: /Stage[main]/Nova::Api/Nova::Generic_service[api]/Service[nova-api]: Unscheduling refresh on Service[nova-api]", "Notice: /Stage[main]/Neutron::Keystone::Auth/Keystone::Resource::Service_identity[neutron]/Keystone_service[neutron::network]/ensure: created", "Notice: /Stage[main]/Nova::Keystone::Auth/Keystone::Resource::Service_identity[nova service, user nova]/Keystone_user_role[nova@services]/ensure: created", "Notice: /Stage[main]/Keystone::Roles::Admin/Keystone_user[admin]/password: changed password", "Notice: /Stage[main]/Keystone::Roles::Admin/Keystone_user[admin]/email: defined 'email' as 'test@example.tld'", "Notice: /Stage[main]/Cinder::Keystone::Auth/Keystone::Resource::Service_identity[cinder]/Keystone_user[cinder]/ensure: created", "Notice: /Stage[main]/Cinder::Keystone::Auth/Keystone::Resource::Service_identity[cinderv2]/Keystone_service[cinderv2::volumev2]/ensure: created", "Notice: /Stage[main]/Nova::Deps/Anchor[nova::service::end]: Triggered 'refresh' from 6 events", "Notice: /Stage[main]/Nova::Keystone::Auth/Keystone::Resource::Service_identity[nova v3 service, user novav3]/Keystone_endpoint[RegionOne/novav3::computev3]/ensure: created", "Notice: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/notify_nova_on_port_status_changes]/ensure: 
created", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/notify_nova_on_port_status_changes]: Scheduling refresh of Service[neutron-server]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/notify_nova_on_port_status_changes]: Scheduling refresh of Exec[neutron-db-sync]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/notify_nova_on_port_status_changes]: Scheduling refresh of Service[neutron-metadata]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/notify_nova_on_port_status_changes]: Scheduling refresh of Service[neutron-lbaas-service]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/notify_nova_on_port_status_changes]: Scheduling refresh of Service[neutron-l3]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/notify_nova_on_port_status_changes]: Scheduling refresh of Service[neutron-dhcp-service]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/notify_nova_on_port_status_changes]: Scheduling refresh of Service[neutron-metering-service]", "Notice: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/auth_type]/ensure: created", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/auth_type]: Scheduling refresh of Service[neutron-server]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/auth_type]: Scheduling refresh of Exec[neutron-db-sync]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/auth_type]: Scheduling refresh of Service[neutron-metadata]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/auth_type]: Scheduling refresh of Service[neutron-lbaas-service]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/auth_type]: Scheduling refresh of Service[neutron-l3]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/auth_type]: Scheduling refresh of Service[neutron-dhcp-service]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/auth_type]: Scheduling refresh of Service[neutron-metering-service]", "Notice: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/project_name]/ensure: created", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/project_name]: Scheduling refresh of Service[neutron-server]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/project_name]: Scheduling refresh of Exec[neutron-db-sync]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/project_name]: Scheduling refresh of Service[neutron-metadata]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/project_name]: Scheduling refresh of Service[neutron-lbaas-service]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/project_name]: Scheduling refresh of Service[neutron-l3]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/project_name]: Scheduling refresh of Service[neutron-dhcp-service]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/project_name]: Scheduling refresh of Service[neutron-metering-service]", "Notice: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/username]/ensure: created", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/username]: Scheduling refresh of Service[neutron-server]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/username]: 
Scheduling refresh of Exec[neutron-db-sync]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/username]: Scheduling refresh of Service[neutron-metadata]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/username]: Scheduling refresh of Service[neutron-lbaas-service]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/username]: Scheduling refresh of Service[neutron-l3]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/username]: Scheduling refresh of Service[neutron-dhcp-service]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/username]: Scheduling refresh of Service[neutron-metering-service]", "Notice: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/password]/ensure: created", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/password]: Scheduling refresh of Service[neutron-server]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/password]: Scheduling refresh of Exec[neutron-db-sync]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/password]: Scheduling refresh of Service[neutron-metadata]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/password]: Scheduling refresh of Service[neutron-lbaas-service]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/password]: Scheduling refresh of Service[neutron-l3]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/password]: Scheduling refresh of Service[neutron-dhcp-service]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/password]: Scheduling refresh of Service[neutron-metering-service]", "Notice: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/project_domain_id]/ensure: created", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/project_domain_id]: Scheduling refresh of Service[neutron-server]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/project_domain_id]: Scheduling refresh of Exec[neutron-db-sync]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/project_domain_id]: Scheduling refresh of Service[neutron-metadata]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/project_domain_id]: Scheduling refresh of Service[neutron-lbaas-service]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/project_domain_id]: Scheduling refresh of Service[neutron-l3]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/project_domain_id]: Scheduling refresh of Service[neutron-dhcp-service]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/project_domain_id]: Scheduling refresh of Service[neutron-metering-service]", "Notice: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/notify_nova_on_port_data_changes]/ensure: created", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/notify_nova_on_port_data_changes]: Scheduling refresh of Service[neutron-server]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/notify_nova_on_port_data_changes]: Scheduling refresh of Exec[neutron-db-sync]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/notify_nova_on_port_data_changes]: Scheduling refresh of Service[neutron-metadata]", "Info: 
/Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/notify_nova_on_port_data_changes]: Scheduling refresh of Service[neutron-lbaas-service]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/notify_nova_on_port_data_changes]: Scheduling refresh of Service[neutron-l3]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/notify_nova_on_port_data_changes]: Scheduling refresh of Service[neutron-dhcp-service]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/notify_nova_on_port_data_changes]: Scheduling refresh of Service[neutron-metering-service]", "Notice: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/tenant_name]/ensure: created", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/tenant_name]: Scheduling refresh of Service[neutron-server]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/tenant_name]: Scheduling refresh of Exec[neutron-db-sync]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/tenant_name]: Scheduling refresh of Service[neutron-metadata]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/tenant_name]: Scheduling refresh of Service[neutron-lbaas-service]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/tenant_name]: Scheduling refresh of Service[neutron-l3]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/tenant_name]: Scheduling refresh of Service[neutron-dhcp-service]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/tenant_name]: Scheduling refresh of Service[neutron-metering-service]", "Notice: /Stage[main]/Nova::Keystone::Auth/Keystone::Resource::Service_identity[nova service, user nova]/Keystone_endpoint[RegionOne/nova::compute]/ensure: created", "Notice: /Stage[main]/Neutron::Keystone::Auth/Keystone::Resource::Service_identity[neutron]/Keystone_endpoint[RegionOne/neutron::network]/ensure: created", "Info: /Stage[main]/Neutron::Keystone::Auth/Keystone::Resource::Service_identity[neutron]/Keystone_endpoint[RegionOne/neutron::network]: Scheduling refresh of Service[neutron-server]", "Notice: /Stage[main]/Swift::Keystone::Auth/Keystone::Resource::Service_identity[swift]/Keystone_service[swift::object-store]/ensure: created", "Notice: /Stage[main]/Keystone::Endpoint/Keystone::Resource::Service_identity[keystone]/Keystone_service[keystone::identity]/ensure: created", "Notice: /Stage[main]/Keystone::Endpoint/Keystone::Resource::Service_identity[keystone]/Keystone_endpoint[RegionOne/keystone::identity]/ensure: created", "Notice: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/user_domain_id]/ensure: created", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/user_domain_id]: Scheduling refresh of Service[neutron-server]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/user_domain_id]: Scheduling refresh of Exec[neutron-db-sync]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/user_domain_id]: Scheduling refresh of Service[neutron-metadata]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/user_domain_id]: Scheduling refresh of Service[neutron-lbaas-service]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/user_domain_id]: Scheduling refresh of Service[neutron-l3]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/user_domain_id]: Scheduling refresh of 
Service[neutron-dhcp-service]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/user_domain_id]: Scheduling refresh of Service[neutron-metering-service]", "Notice: /Stage[main]/Neutron::Keystone::Auth/Keystone::Resource::Service_identity[neutron]/Keystone_user_role[neutron@services]/ensure: created", "Info: /Stage[main]/Neutron::Keystone::Auth/Keystone::Resource::Service_identity[neutron]/Keystone_user_role[neutron@services]: Scheduling refresh of Service[neutron-server]", "Notice: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/auth_url]/ensure: created", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/auth_url]: Scheduling refresh of Service[neutron-server]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/auth_url]: Scheduling refresh of Exec[neutron-db-sync]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/auth_url]: Scheduling refresh of Service[neutron-metadata]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/auth_url]: Scheduling refresh of Service[neutron-lbaas-service]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/auth_url]: Scheduling refresh of Service[neutron-l3]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/auth_url]: Scheduling refresh of Service[neutron-dhcp-service]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/auth_url]: Scheduling refresh of Service[neutron-metering-service]", "Notice: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/nova_url]/ensure: created", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/nova_url]: Scheduling refresh of Service[neutron-server]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/nova_url]: Scheduling refresh of Exec[neutron-db-sync]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/nova_url]: Scheduling refresh of Service[neutron-metadata]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/nova_url]: Scheduling refresh of Service[neutron-lbaas-service]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/nova_url]: Scheduling refresh of Service[neutron-l3]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/nova_url]: Scheduling refresh of Service[neutron-dhcp-service]", "Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/nova_url]: Scheduling refresh of Service[neutron-metering-service]", "Notice: /Stage[main]/Neutron::Agents::Metering/Service[neutron-metering-service]/ensure: ensure changed 'stopped' to 'running'", "Info: /Stage[main]/Neutron::Agents::Metering/Service[neutron-metering-service]: Unscheduling refresh on Service[neutron-metering-service]", "Notice: /Stage[main]/Neutron::Agents::Dhcp/Service[neutron-dhcp-service]/ensure: ensure changed 'stopped' to 'running'", "Info: /Stage[main]/Neutron::Agents::Dhcp/Service[neutron-dhcp-service]: Unscheduling refresh on Service[neutron-dhcp-service]", "Notice: /Stage[main]/Neutron::Agents::L3/Service[neutron-l3]/ensure: ensure changed 'stopped' to 'running'", "Info: /Stage[main]/Neutron::Agents::L3/Service[neutron-l3]: Unscheduling refresh on Service[neutron-l3]", "Notice: /Stage[main]/Neutron::Agents::Lbaas/Service[neutron-lbaas-service]/ensure: ensure changed 'stopped' to 'running'", "Info: /Stage[main]/Neutron::Agents::Lbaas/Service[neutron-lbaas-service]: Unscheduling refresh on 
Service[neutron-lbaas-service]", "Notice: /Stage[main]/Neutron::Agents::Metadata/Service[neutron-metadata]/ensure: ensure changed 'stopped' to 'running'", "Info: /Stage[main]/Neutron::Agents::Metadata/Service[neutron-metadata]: Unscheduling refresh on Service[neutron-metadata]", "Notice: /Stage[main]/Glance::Api/Service[glance-api]/ensure: ensure changed 'stopped' to 'running'", "Info: /Stage[main]/Glance::Api/Service[glance-api]: Unscheduling refresh on Service[glance-api]", "Notice: /Stage[main]/Ironic::Keystone::Auth/Keystone::Resource::Service_identity[ironic]/Keystone_endpoint[RegionOne/ironic::baremetal]/ensure: created", "Notice: /Stage[main]/Cinder::Keystone::Auth/Keystone::Resource::Service_identity[cinder]/Keystone_service[cinder::volume]/ensure: created", "Notice: /Stage[main]/Cinder::Keystone::Auth/Keystone::Resource::Service_identity[cinder]/Keystone_endpoint[RegionOne/cinder::volume]/ensure: created", "Notice: /Stage[main]/Swift::Keystone::Auth/Keystone::Resource::Service_identity[swift]/Keystone_endpoint[RegionOne/swift::object-store]/ensure: created", "Notice: /Stage[main]/Neutron::Db::Sync/Exec[neutron-db-sync]: Triggered 'refresh' from 59 events", "Info: /Stage[main]/Neutron::Db::Sync/Exec[neutron-db-sync]: Scheduling refresh of Service[neutron-server]", "Notice: /Stage[main]/Neutron::Server/Service[neutron-server]/ensure: ensure changed 'stopped' to 'running'", "Info: /Stage[main]/Neutron::Server/Service[neutron-server]: Unscheduling refresh on Service[neutron-server]", "Notice: /Stage[main]/Swift::Keystone::Auth/Keystone::Resource::Service_identity[swift]/Keystone_user[swift]/ensure: created", "Notice: /Stage[main]/Swift::Keystone::Auth/Keystone::Resource::Service_identity[swift]/Keystone_user_role[swift@services]/ensure: created", "Notice: /Stage[main]/Cinder::Keystone::Auth/Keystone::Resource::Service_identity[cinderv2]/Keystone_endpoint[RegionOne/cinderv2::volumev2]/ensure: created", "Notice: /Stage[main]/Cinder::Keystone::Auth/Keystone::Resource::Service_identity[cinder]/Keystone_user_role[cinder@services]/ensure: created", "Notice: /Stage[main]/Openstack_integration::Cinder/Cinder_type[BACKEND_1]/ensure: created", "Notice: /Stage[main]/Keystone::Roles::Admin/Keystone_user_role[admin@openstack]/ensure: created", "Notice: /Stage[main]/Openstack_integration::Provision/Glance_image[cirros]/ensure: created", "Notice: /Stage[main]/Keystone::Disable_admin_token_auth/Ini_subsetting[public_api/admin_token_auth]/ensure: removed", "Info: /Stage[main]/Keystone::Disable_admin_token_auth/Ini_subsetting[public_api/admin_token_auth]: Scheduling refresh of Exec[restart_keystone]", "Notice: /Stage[main]/Openstack_integration::Provision/Neutron_network[public]/ensure: created", "Notice: /Stage[main]/Tempest/Tempest_neutron_net_id_setter[public_network_id]/ensure: created", "Notice: /Stage[main]/Openstack_integration::Provision/Exec[manage_m1.micro_nova_flavor]/returns: executed successfully", "Notice: /Stage[main]/Keystone::Disable_admin_token_auth/Ini_subsetting[api_v3/admin_token_auth]/ensure: removed", "Info: /Stage[main]/Keystone::Disable_admin_token_auth/Ini_subsetting[api_v3/admin_token_auth]: Scheduling refresh of Exec[restart_keystone]", "Notice: /Stage[main]/Openstack_extras::Auth_file/File[/root/openrc]/ensure: defined content as '{md5}3f4b596583820c76e15d3092a5c6dcc0'", "Notice: /Stage[main]/Keystone::Disable_admin_token_auth/Ini_subsetting[admin_api/admin_token_auth]/ensure: removed", "Info: 
/Stage[main]/Keystone::Disable_admin_token_auth/Ini_subsetting[admin_api/admin_token_auth]: Scheduling refresh of Exec[restart_keystone]", "Notice: /Stage[main]/Keystone/Exec[restart_keystone]: Triggered 'refresh' from 3 events", "Notice: /Stage[main]/Openstack_integration::Provision/Exec[manage_m1.nano_nova_flavor]/returns: executed successfully", "Notice: /Stage[main]/Openstack_integration::Provision/Neutron_subnet[public-subnet]/ensure: created", "Notice: /Stage[main]/Openstack_integration::Provision/Glance_image[cirros_alt]/ensure: created", "Notice: /Stage[main]/Tempest/Tempest_glance_id_setter[image_ref]/ensure: created", "Notice: /Stage[main]/Tempest/Tempest_glance_id_setter[image_ref_alt]/ensure: created", "Info: Creating state file /var/lib/puppet/state/state.yaml", "Notice: Finished catalog run in 519.98 seconds", "Info: Loading external facts from /etc/puppet/modules/openstacklib/facts.d", "Info: Loading facts in /etc/puppet/modules/nova/lib/facter/libvirt_uuid.rb", "Info: Loading facts in /etc/puppet/modules/openstacklib/lib/facter/os_package_type.rb", "Info: Loading facts in /etc/puppet/modules/openstacklib/lib/facter/os_service_default.rb", "Info: Loading facts in /etc/puppet/modules/vswitch/lib/facter/ovs.rb", "Info: Loading facts in /etc/puppet/modules/apt/lib/facter/apt_reboot_required.rb", "Info: Loading facts in /etc/puppet/modules/apt/lib/facter/apt_update_last_success.rb", "Info: Loading facts in /etc/puppet/modules/apt/lib/facter/apt_updates.rb", "Info: Loading facts in /etc/puppet/modules/concat/lib/facter/concat_basedir.rb", "Info: Loading facts in /etc/puppet/modules/firewall/lib/facter/ip6tables_version.rb", "Info: Loading facts in /etc/puppet/modules/firewall/lib/facter/iptables_persistent_version.rb", "Info: Loading facts in /etc/puppet/modules/firewall/lib/facter/iptables_version.rb", "Info: Loading facts in /etc/puppet/modules/mysql/lib/facter/mysql_version.rb", "Info: Loading facts in /etc/puppet/modules/mysql/lib/facter/mysql_server_id.rb", "Info: Loading facts in /etc/puppet/modules/python/lib/facter/pip_version.rb", "Info: Loading facts in /etc/puppet/modules/python/lib/facter/python_version.rb", "Info: Loading facts in /etc/puppet/modules/python/lib/facter/virtualenv_version.rb", "Info: Loading facts in /etc/puppet/modules/staging/lib/facter/staging_http_get.rb", "Info: Loading facts in /etc/puppet/modules/staging/lib/facter/staging_windir.rb", "Info: Loading facts in /etc/puppet/modules/stdlib/lib/facter/puppet_vardir.rb", "Info: Loading facts in /etc/puppet/modules/stdlib/lib/facter/facter_dot_d.rb", "Info: Loading facts in /etc/puppet/modules/stdlib/lib/facter/pe_version.rb", "Info: Loading facts in /etc/puppet/modules/stdlib/lib/facter/root_home.rb", "Notice: Compiled catalog for n2.dusty.ci.centos.org in environment production in 9.47 seconds", "Info: Applying configuration version '1463743997'", "Notice: Finished catalog run in 69.42 seconds", "all create: /tmp/openstack/tempest/.tox/tempest", "all installdeps: setuptools, -r/tmp/openstack/tempest/requirements.txt", "all develop-inst: /tmp/openstack/tempest", "all installed: 
Babel==2.3.4,cffi==1.6.0,cliff==2.0.0,cmd2==0.6.8,cryptography==1.3.2,debtcollector==1.4.0,enum34==1.1.6,extras==1.0.0,fasteners==0.14.1,fixtures==1.4.0,funcsigs==1.0.2,functools32==3.2.3.post2,idna==2.1,ipaddress==1.0.16,iso8601==0.1.11,jsonschema==2.5.1,linecache2==1.0.0,monotonic==1.1,msgpack-python==0.4.7,netaddr==0.7.18,netifaces==0.10.4,os-testr==0.6.0,oslo.concurrency==3.8.0,oslo.config==3.9.0,oslo.context==2.3.0,oslo.i18n==3.6.0,oslo.log==3.7.0,oslo.serialization==2.6.0,oslo.utils==3.10.0,paramiko==2.0.0,pbr==1.9.1,prettytable==0.7.2,pyasn1==0.1.9,pycparser==2.14,pyinotify==0.9.6,pyOpenSSL==16.0.0,pyparsing==2.1.4,python-dateutil==2.5.3,python-mimeparse==1.5.2,python-subunit==1.2.0,pytz==2016.4,PyYAML==3.11,retrying==1.3.3,six==1.10.0,stevedore==1.13.0,-e git://git.openstack.org/openstack/tempest@aff9cc072bbbb222b09a3411b203c180b493eae8#egg=tempest,testrepository==0.0.20,testscenarios==0.5.0,testtools==2.2.0,traceback2==1.4.0,unicodecsv==0.14.1,unittest2==1.1.0,urllib3==1.15.1,wrapt==1.10.8", "all runtests: PYTHONHASHSEED='3977220619'", "all runtests: commands[0] | find . -type f -name *.pyc -delete", "all runtests: commands[1] | bash tools/pretty_tox.sh --concurrency=2 smoke dashbboard TelemetryAlarming api.baremetal.admin.test_drivers", "running testr", "running=OS_STDOUT_CAPTURE=${OS_STDOUT_CAPTURE:-1} \\", "OS_STDERR_CAPTURE=${OS_STDERR_CAPTURE:-1} \\", "OS_TEST_TIMEOUT=${OS_TEST_TIMEOUT:-500} \\", "OS_TEST_LOCK_PATH=${OS_TEST_LOCK_PATH:-${TMPDIR:-'/tmp'}} \\", "${PYTHON:-python} -m subunit.run discover -t ${OS_TOP_LEVEL:-./} ${OS_TEST_PATH:-./tempest/test_discover} --list ", "running=OS_STDOUT_CAPTURE=${OS_STDOUT_CAPTURE:-1} \\", "OS_STDERR_CAPTURE=${OS_STDERR_CAPTURE:-1} \\", "OS_TEST_TIMEOUT=${OS_TEST_TIMEOUT:-500} \\", "OS_TEST_LOCK_PATH=${OS_TEST_LOCK_PATH:-${TMPDIR:-'/tmp'}} \\", "${PYTHON:-python} -m subunit.run discover -t ${OS_TOP_LEVEL:-./} ${OS_TEST_PATH:-./tempest/test_discover} --load-list /tmp/tmp0s50qd", "running=OS_STDOUT_CAPTURE=${OS_STDOUT_CAPTURE:-1} \\", "OS_STDERR_CAPTURE=${OS_STDERR_CAPTURE:-1} \\", "OS_TEST_TIMEOUT=${OS_TEST_TIMEOUT:-500} \\", "OS_TEST_LOCK_PATH=${OS_TEST_LOCK_PATH:-${TMPDIR:-'/tmp'}} \\", "${PYTHON:-python} -m subunit.run discover -t ${OS_TOP_LEVEL:-./} ${OS_TEST_PATH:-./tempest/test_discover} --load-list /tmp/tmpdnKp0B", "{1} tempest.api.compute.security_groups.test_security_group_rules.SecurityGroupRulesTestJSON.test_security_group_rules_create [0.510592s] ... ok", "{1} tempest.api.compute.security_groups.test_security_group_rules.SecurityGroupRulesTestJSON.test_security_group_rules_list [0.637885s] ... ok", "{0} tempest.api.baremetal.admin.test_drivers.TestDrivers.test_list_drivers [1.271812s] ... ok", "{0} tempest.api.baremetal.admin.test_drivers.TestDrivers.test_show_driver [1.338437s] ... ok", "{0} tempest.api.compute.flavors.test_flavors.FlavorsV2TestJSON.test_get_flavor [0.106825s] ... ok", "{0} tempest.api.compute.flavors.test_flavors.FlavorsV2TestJSON.test_list_flavors [0.130774s] ... ok", "{0} tempest.api.compute.security_groups.test_security_groups.SecurityGroupsTestJSON.test_security_groups_create_list_delete [1.448903s] ... ok", "{1} tempest.api.compute.servers.test_create_server.ServersTestManualDisk.test_list_servers [0.064264s] ... ok", "{1} tempest.api.compute.servers.test_create_server.ServersTestManualDisk.test_verify_server_details [0.000582s] ... ok", "{0} tempest.api.compute.servers.test_attach_interfaces.AttachInterfacesTestJSON.test_add_remove_fixed_ip [11.839100s] ... 
ok", "{1} tempest.api.compute.servers.test_server_addresses.ServerAddressesTestJSON.test_list_server_addresses [0.071666s] ... ok", "{1} tempest.api.compute.servers.test_server_addresses.ServerAddressesTestJSON.test_list_server_addresses_by_network [0.159031s] ... ok", "{0} tempest.api.compute.servers.test_create_server.ServersTestJSON.test_list_servers [0.063791s] ... ok", "{0} tempest.api.compute.servers.test_create_server.ServersTestJSON.test_verify_server_details [0.000565s] ... ok", "{1} setUpClass (tempest.api.data_processing.test_cluster_templates.ClusterTemplateTest) ... SKIPPED: Sahara support is required", "{1} setUpClass (tempest.api.data_processing.test_data_sources.DataSourceTest) ... SKIPPED: Sahara support is required", "{1} setUpClass (tempest.api.data_processing.test_job_binaries.JobBinaryTest) ... SKIPPED: Sahara support is required", "{1} setUpClass (tempest.api.data_processing.test_jobs.JobTest) ... SKIPPED: Sahara support is required", "{1} setUpClass (tempest.api.data_processing.test_node_group_templates.NodeGroupTemplateTest) ... SKIPPED: Sahara support is required", "{1} setUpClass (tempest.api.database.flavors.test_flavors.DatabaseFlavorsTest) ... SKIPPED: DatabaseFlavorsTest skipped as trove is not available", "{1} setUpClass (tempest.api.database.limits.test_limits.DatabaseLimitsTest) ... SKIPPED: DatabaseLimitsTest skipped as trove is not available", "{1} tempest.api.identity.admin.v3.test_credentials.CredentialsTestJSON.test_credentials_create_get_update_delete [0.152104s] ... ok", "{1} tempest.api.identity.admin.v3.test_domains.DefaultDomainTestJSON.test_default_domain_exists [0.037410s] ... ok", "{1} tempest.api.identity.admin.v3.test_domains.DomainsTestJSON.test_create_update_delete_domain [0.397761s] ... ok", "{1} tempest.api.identity.admin.v3.test_endpoints.EndPointsTestJSON.test_update_endpoint [0.215164s] ... ok", "{0} tempest.api.compute.servers.test_server_actions.ServerActionsTestJSON.test_reboot_server_hard [11.347985s] ... ok", "{1} tempest.api.identity.admin.v3.test_groups.GroupsV3TestJSON.test_group_users_add_list_delete [1.166480s] ... ok", "{0} setUpClass (tempest.api.data_processing.test_job_binary_internals.JobBinaryInternalTest) ... SKIPPED: Sahara support is required", "{0} setUpClass (tempest.api.data_processing.test_plugins.PluginsTest) ... SKIPPED: Sahara support is required", "{0} setUpClass (tempest.api.database.versions.test_versions.DatabaseVersionsTest) ... SKIPPED: DatabaseVersionsTest skipped as trove is not available", "{1} tempest.api.identity.admin.v3.test_regions.RegionsTestJSON.test_create_region_with_specific_id [0.166700s] ... ok", "{0} tempest.api.identity.admin.v2.test_services.ServicesTestJSON.test_list_services [0.373792s] ... ok", "{1} tempest.api.identity.admin.v3.test_roles.RolesV3TestJSON.test_role_create_update_show_list [0.286381s] ... ok", "{0} tempest.api.identity.admin.v2.test_users.UsersTestJSON.test_create_user [0.143726s] ... ok", "{1} tempest.api.identity.admin.v3.test_trusts.TrustsV3TestJSON.test_get_trusts_all [1.541047s] ... ok", "{0} tempest.api.identity.admin.v3.test_policies.PoliciesTestJSON.test_create_update_delete_policy [0.206302s] ... ok", "{1} tempest.api.image.v2.test_images.BasicOperationsImagesTest.test_delete_image [0.517553s] ... ok", "{1} tempest.api.image.v2.test_images.BasicOperationsImagesTest.test_register_upload_get_image_file [1.137646s] ... ok", "{1} tempest.api.image.v2.test_images.BasicOperationsImagesTest.test_update_image [1.469844s] ... 
ok", "{1} tempest.api.network.test_extensions.ExtensionsTestJSON.test_list_show_extensions [0.430944s] ... ok", "{0} tempest.api.identity.admin.v3.test_services.ServicesTestJSON.test_create_update_get_service [0.295808s] ... ok", "{1} tempest.api.network.test_networks.BulkNetworkOpsIpV6Test.test_bulk_create_delete_network [0.846129s] ... ok", "{1} tempest.api.network.test_networks.BulkNetworkOpsIpV6Test.test_bulk_create_delete_port [1.372009s] ... ok", "{0} tempest.api.identity.v2.test_api_discovery.TestApiDiscovery.test_api_media_types [0.048919s] ... ok", "{0} tempest.api.identity.v2.test_api_discovery.TestApiDiscovery.test_api_version_resources [0.054454s] ... ok", "{0} tempest.api.identity.v2.test_api_discovery.TestApiDiscovery.test_api_version_statuses [0.045440s] ... ok", "{1} tempest.api.network.test_networks.BulkNetworkOpsIpV6Test.test_bulk_create_delete_subnet [4.599057s] ... ok", "{1} setUpClass (tempest.api.network.test_networks.NetworksIpV6TestAttrs) ... SKIPPED: IPv6 extended attributes for subnets not available", "{0} tempest.api.identity.v3.test_api_discovery.TestApiDiscovery.test_api_media_types [0.054893s] ... ok", "{0} tempest.api.identity.v3.test_api_discovery.TestApiDiscovery.test_api_version_resources [0.061760s] ... ok", "{0} tempest.api.identity.v3.test_api_discovery.TestApiDiscovery.test_api_version_statuses [0.059559s] ... ok", "{1} tempest.api.network.test_networks.NetworksTest.test_create_update_delete_network_subnet [1.563828s] ... ok", "{1} tempest.api.network.test_networks.NetworksTest.test_external_network_visibility [0.184518s] ... ok", "{1} tempest.api.network.test_networks.NetworksTest.test_list_networks [0.079143s] ... ok", "{1} tempest.api.network.test_networks.NetworksTest.test_list_subnets [0.046161s] ... ok", "{1} tempest.api.network.test_networks.NetworksTest.test_show_network [0.052944s] ... ok", "{1} tempest.api.network.test_networks.NetworksTest.test_show_subnet [0.051730s] ... ok", "{1} tempest.api.network.test_ports.PortsTestJSON.test_create_port_in_allowed_allocation_pools [1.498361s] ... ok", "{0} tempest.api.network.test_floating_ips.FloatingIPTestJSON.test_create_floating_ip_specifying_a_fixed_ip_address [0.891415s] ... ok", "{1} tempest.api.network.test_ports.PortsTestJSON.test_create_port_with_no_securitygroups [1.660860s] ... ok", "{0} tempest.api.network.test_floating_ips.FloatingIPTestJSON.test_create_list_show_update_delete_floating_ip [1.472597s] ... ok", "{1} tempest.api.network.test_ports.PortsTestJSON.test_create_update_delete_port [1.024533s] ... ok", "{1} tempest.api.network.test_ports.PortsTestJSON.test_list_ports [0.028519s] ... ok", "{1} tempest.api.network.test_ports.PortsTestJSON.test_show_port [0.031189s] ... ok", "{0} tempest.api.network.test_networks.BulkNetworkOpsTest.test_bulk_create_delete_network [0.826167s] ... ok", "{0} tempest.api.network.test_networks.BulkNetworkOpsTest.test_bulk_create_delete_port [1.382763s] ... ok", "{1} tempest.api.network.test_routers.RoutersTest.test_add_multiple_router_interfaces [3.649875s] ... ok", "{0} tempest.api.network.test_networks.BulkNetworkOpsTest.test_bulk_create_delete_subnet [1.938906s] ... ok", "{1} tempest.api.network.test_routers.RoutersTest.test_add_remove_router_interface_with_port_id [2.267556s] ... ok", "{1} tempest.api.network.test_routers.RoutersTest.test_add_remove_router_interface_with_subnet_id [1.954573s] ... ok", "{1} tempest.api.network.test_routers.RoutersTest.test_create_show_list_update_delete_router [1.438991s] ... 
ok", "{0} tempest.api.network.test_networks.NetworksIpV6Test.test_create_update_delete_network_subnet [1.268792s] ... ok", "{0} tempest.api.network.test_networks.NetworksIpV6Test.test_external_network_visibility [0.112706s] ... ok", "{0} tempest.api.network.test_networks.NetworksIpV6Test.test_list_networks [0.051579s] ... ok", "{0} tempest.api.network.test_networks.NetworksIpV6Test.test_list_subnets [0.045387s] ... ok", "{0} tempest.api.network.test_networks.NetworksIpV6Test.test_show_network [0.132461s] ... ok", "{0} tempest.api.network.test_networks.NetworksIpV6Test.test_show_subnet [0.044410s] ... ok", "{1} tempest.api.network.test_security_groups.SecGroupIPv6Test.test_create_list_update_show_delete_security_group [0.375148s] ... ok", "{1} tempest.api.network.test_security_groups.SecGroupIPv6Test.test_create_show_delete_security_group_rule [0.470574s] ... ok", "{1} tempest.api.network.test_security_groups.SecGroupIPv6Test.test_list_security_groups [0.035922s] ... ok", "{0} tempest.api.network.test_ports.PortsIpV6TestJSON.test_create_port_in_allowed_allocation_pools [1.515059s] ... ok", "{1} tempest.api.object_storage.test_account_quotas.AccountQuotasTest.test_admin_modify_quota [0.210497s] ... ok", "{1} tempest.api.object_storage.test_account_quotas.AccountQuotasTest.test_upload_valid_object [0.071776s] ... ok", "{0} tempest.api.network.test_ports.PortsIpV6TestJSON.test_create_port_with_no_securitygroups [1.804811s] ... ok", "{0} tempest.api.network.test_ports.PortsIpV6TestJSON.test_create_update_delete_port [0.783414s] ... ok", "{0} tempest.api.network.test_ports.PortsIpV6TestJSON.test_list_ports [0.030561s] ... ok", "{0} tempest.api.network.test_ports.PortsIpV6TestJSON.test_show_port [0.029457s] ... ok", "{1} tempest.api.object_storage.test_account_services.AccountTest.test_list_account_metadata [0.054494s] ... ok", "{1} tempest.api.object_storage.test_account_services.AccountTest.test_list_containers [0.013434s] ... ok", "{1} setUpClass (tempest.api.orchestration.stacks.test_stacks.StacksTestJSON) ... SKIPPED: Heat support is required", "{1} setUpClass (tempest.api.telemetry.test_alarming_api.TelemetryAlarmingAPITestJSON) ... SKIPPED: Aodh support is required", "{1} setUpClass (tempest.api.telemetry.test_alarming_api_negative.TelemetryAlarmingNegativeTest) ... SKIPPED: Aodh support is required", "{0} tempest.api.network.test_routers.RoutersIpV6Test.test_add_multiple_router_interfaces [3.744758s] ... ok", "{0} tempest.api.network.test_routers.RoutersIpV6Test.test_add_remove_router_interface_with_port_id [2.046541s] ... ok", "{0} tempest.api.network.test_routers.RoutersIpV6Test.test_add_remove_router_interface_with_subnet_id [2.020083s] ... ok", "{1} tempest.api.volume.test_volumes_get.VolumesV2GetTest.test_volume_create_get_update_delete [7.842538s] ... ok", "{0} tempest.api.network.test_routers.RoutersIpV6Test.test_create_show_list_update_delete_router [1.502659s] ... ok", "{0} tempest.api.network.test_security_groups.SecGroupTest.test_create_list_update_show_delete_security_group [0.368896s] ... ok", "{0} tempest.api.network.test_security_groups.SecGroupTest.test_create_show_delete_security_group_rule [0.471705s] ... ok", "{0} tempest.api.network.test_security_groups.SecGroupTest.test_list_security_groups [0.044018s] ... ok", "{0} tempest.api.network.test_subnetpools_extensions.SubnetPoolsTestJSON.test_create_list_show_update_delete_subnetpools [0.268973s] ... ok", "{0} tempest.api.object_storage.test_container_quotas.ContainerQuotasTest.test_upload_large_object [0.391122s] ... 
ok", "{0} tempest.api.object_storage.test_container_quotas.ContainerQuotasTest.test_upload_too_many_objects [0.285795s] ... ok", "{0} tempest.api.object_storage.test_container_quotas.ContainerQuotasTest.test_upload_valid_object [0.195461s] ... ok", "{1} tempest.api.volume.test_volumes_get.VolumesV2GetTest.test_volume_create_get_update_delete_from_image [32.992742s] ... ok", "{0} tempest.api.object_storage.test_container_services.ContainerTest.test_create_container [0.335287s] ... ok", "{0} tempest.api.object_storage.test_container_services.ContainerTest.test_list_container_contents [0.149195s] ... ok", "{0} tempest.api.object_storage.test_container_services.ContainerTest.test_list_container_metadata [0.121836s] ... ok", "{0} tempest.api.object_storage.test_object_services.ObjectTest.test_create_object [0.050552s] ... ok", "{0} tempest.api.object_storage.test_object_services.ObjectTest.test_get_object [0.026938s] ... ok", "{0} tempest.api.object_storage.test_object_services.ObjectTest.test_list_object_metadata [0.024721s] ... ok", "{0} tempest.api.object_storage.test_object_services.ObjectTest.test_update_object_metadata [0.051842s] ... ok", "{0} setUpClass (tempest.api.orchestration.stacks.test_resource_types.ResourceTypesTest) ... SKIPPED: Heat support is required", "{0} setUpClass (tempest.api.orchestration.stacks.test_soft_conf.TestSoftwareConfig) ... SKIPPED: Heat support is required", "{0} setUpClass (tempest.api.telemetry.test_telemetry_notification_api.TelemetryNotificationAPITestJSON) ... SKIPPED: Ceilometer support is required", "{1} tempest.api.volume.test_volumes_list.VolumesV1ListTestJSON.test_volume_list [0.049297s] ... ok", "{0} tempest.api.volume.test_volumes_actions.VolumesV1ActionsTest.test_attach_detach_volume_to_instance [1.502631s] ... ok", "{1} tempest.scenario.test_server_basic_ops.TestServerBasicOps.test_server_basic_ops [36.835052s] ... FAILED", "{1} setUpClass (tempest.scenario.test_server_multinode.TestServerMultinode) ... SKIPPED: Less than 2 compute nodes, skipping multinode tests.", "{0} tempest.api.volume.test_volumes_actions.VolumesV2ActionsTest.test_attach_detach_volume_to_instance [1.199769s] ... ok", "{0} tempest.api.volume.test_volumes_get.VolumesV1GetTest.test_volume_create_get_update_delete [17.959206s] ... ok", "{0} tempest.api.volume.test_volumes_get.VolumesV1GetTest.test_volume_create_get_update_delete_from_image [42.517879s] ... ok", "{0} tempest.api.volume.test_volumes_list.VolumesV2ListTestJSON.test_volume_list [0.047964s] ... ok", "{0} tempest.scenario.test_network_basic_ops.TestNetworkBasicOps.test_network_basic_ops [132.155957s] ... ok", "{0} tempest.scenario.test_volume_boot_pattern.TestVolumeBootPattern.test_volume_boot_pattern [151.413748s] ... ok", "{0} tempest.scenario.test_volume_boot_pattern.TestVolumeBootPatternV2.test_volume_boot_pattern [155.015026s] ... 
ok", "", "==============================", "Failed 1 tests - output below:", "==============================", "", "tempest.scenario.test_server_basic_ops.TestServerBasicOps.test_server_basic_ops[compute,id-7fff3fb3-91d8-4fd0-bd7d-0204f1f180ba,network,smoke]", "----------------------------------------------------------------------------------------------------------------------------------------------", "", "Captured pythonlogging:", "~~~~~~~~~~~~~~~~~~~~~~~", " 2016-05-20 12:39:33,977 7734 INFO [tempest.lib.common.rest_client] Request (TestServerBasicOps:setUp): 200 GET https://127.0.0.1:8774/v2/2b19e26c86fb4b48abe8551003fc00c7/images/ffff3a3a-5101-497a-b186-38682e723d59 0.461s", " 2016-05-20 12:39:33,978 7734 DEBUG [tempest.lib.common.rest_client] Request - Headers: {'Content-Type': 'application/json', 'Accept': 'application/json', 'X-Auth-Token': ''}", " Body: None", " Response - Headers: {'vary': 'Accept-Encoding', 'status': '200', 'connection': 'close', 'server': 'Apache/2.4.6 (CentOS)', 'content-length': '677', 'content-type': 'application/json', 'content-location': 'https://127.0.0.1:8774/v2/2b19e26c86fb4b48abe8551003fc00c7/images/ffff3a3a-5101-497a-b186-38682e723d59', 'date': 'Fri, 20 May 2016 11:39:33 GMT', 'x-compute-request-id': 'req-2d694247-967f-4d4c-b110-8dd52b397df7'}", " Body: {\"image\": {\"status\": \"ACTIVE\", \"updated\": \"2016-05-20T11:32:39Z\", \"links\": [{\"href\": \"https://127.0.0.1:8774/v2/2b19e26c86fb4b48abe8551003fc00c7/images/ffff3a3a-5101-497a-b186-38682e723d59\", \"rel\": \"self\"}, {\"href\": \"https://127.0.0.1:8774/2b19e26c86fb4b48abe8551003fc00c7/images/ffff3a3a-5101-497a-b186-38682e723d59\", \"rel\": \"bookmark\"}, {\"href\": \"http://172.19.2.66:9292/images/ffff3a3a-5101-497a-b186-38682e723d59\", \"type\": \"application/vnd.openstack.image\", \"rel\": \"alternate\"}], \"id\": \"ffff3a3a-5101-497a-b186-38682e723d59\", \"OS-EXT-IMG-SIZE:size\": 13287936, \"name\": \"cirros\", \"created\": \"2016-05-20T11:32:36Z\", \"minDisk\": 0, \"progress\": 100, \"minRam\": 0, \"metadata\": {}}}", " 2016-05-20 12:39:34,143 7734 INFO [tempest.lib.common.rest_client] Request (TestServerBasicOps:setUp): 200 GET https://127.0.0.1:8774/v2/2b19e26c86fb4b48abe8551003fc00c7/flavors/42 0.162s", " 2016-05-20 12:39:34,144 7734 DEBUG [tempest.lib.common.rest_client] Request - Headers: {'Content-Type': 'application/json', 'Accept': 'application/json', 'X-Auth-Token': ''}", " Body: None", " Response - Headers: {'vary': 'Accept-Encoding', 'status': '200', 'connection': 'close', 'server': 'Apache/2.4.6 (CentOS)', 'content-length': '421', 'content-type': 'application/json', 'content-location': 'https://127.0.0.1:8774/v2/2b19e26c86fb4b48abe8551003fc00c7/flavors/42', 'date': 'Fri, 20 May 2016 11:39:33 GMT', 'x-compute-request-id': 'req-8aa2720b-8e5f-4b38-a53a-2f0a4f4a3442'}", " Body: {\"flavor\": {\"name\": \"m1.nano\", \"links\": [{\"href\": \"https://127.0.0.1:8774/v2/2b19e26c86fb4b48abe8551003fc00c7/flavors/42\", \"rel\": \"self\"}, {\"href\": \"https://127.0.0.1:8774/2b19e26c86fb4b48abe8551003fc00c7/flavors/42\", \"rel\": \"bookmark\"}], \"ram\": 128, \"OS-FLV-DISABLED:disabled\": false, \"vcpus\": 1, \"swap\": \"\", \"os-flavor-access:is_public\": true, \"rxtx_factor\": 1.0, \"OS-FLV-EXT-DATA:ephemeral\": 0, \"disk\": 0, \"id\": \"42\"}}", " 2016-05-20 12:39:34,561 7734 INFO [tempest.lib.common.rest_client] Request (TestServerBasicOps:setUp): 200 GET https://127.0.0.1:8774/v2/2b19e26c86fb4b48abe8551003fc00c7/images/ffff3a3a-5101-497a-b186-38682e723d59 0.415s", " 2016-05-20 
12:39:34,562 7734 DEBUG [tempest.lib.common.rest_client] Request - Headers: {'Content-Type': 'application/json', 'Accept': 'application/json', 'X-Auth-Token': ''}", " Body: None", " Response - Headers: {'vary': 'Accept-Encoding', 'status': '200', 'connection': 'close', 'server': 'Apache/2.4.6 (CentOS)', 'content-length': '677', 'content-type': 'application/json', 'content-location': 'https://127.0.0.1:8774/v2/2b19e26c86fb4b48abe8551003fc00c7/images/ffff3a3a-5101-497a-b186-38682e723d59', 'date': 'Fri, 20 May 2016 11:39:34 GMT', 'x-compute-request-id': 'req-47d134ff-c986-4e47-87bc-8b1865abeb34'}", " Body: {\"image\": {\"status\": \"ACTIVE\", \"updated\": \"2016-05-20T11:32:39Z\", \"links\": [{\"href\": \"https://127.0.0.1:8774/v2/2b19e26c86fb4b48abe8551003fc00c7/images/ffff3a3a-5101-497a-b186-38682e723d59\", \"rel\": \"self\"}, {\"href\": \"https://127.0.0.1:8774/2b19e26c86fb4b48abe8551003fc00c7/images/ffff3a3a-5101-497a-b186-38682e723d59\", \"rel\": \"bookmark\"}, {\"href\": \"http://172.19.2.66:9292/images/ffff3a3a-5101-497a-b186-38682e723d59\", \"type\": \"application/vnd.openstack.image\", \"rel\": \"alternate\"}], \"id\": \"ffff3a3a-5101-497a-b186-38682e723d59\", \"OS-EXT-IMG-SIZE:size\": 13287936, \"name\": \"cirros\", \"created\": \"2016-05-20T11:32:36Z\", \"minDisk\": 0, \"progress\": 100, \"minRam\": 0, \"metadata\": {}}}", " 2016-05-20 12:39:34,568 7734 DEBUG [tempest.scenario.test_server_basic_ops] Starting test for i:ffff3a3a-5101-497a-b186-38682e723d59, f:42. Run ssh: False, user: cirros", " 2016-05-20 12:39:34,754 7734 INFO [tempest.lib.common.rest_client] Request (TestServerBasicOps:test_server_basic_ops): 200 POST https://127.0.0.1:8774/v2/2b19e26c86fb4b48abe8551003fc00c7/os-keypairs 0.184s", " 2016-05-20 12:39:34,754 7734 DEBUG [tempest.lib.common.rest_client] Request - Headers: {'Content-Type': 'application/json', 'Accept': 'application/json', 'X-Auth-Token': ''}", " Body: {\"keypair\": {\"name\": \"tempest-TestServerBasicOps-1692537820\"}}", " Response - Headers: {'vary': 'Accept-Encoding', 'status': '200', 'connection': 'close', 'server': 'Apache/2.4.6 (CentOS)', 'content-length': '2320', 'content-type': 'application/json', 'content-location': 'https://127.0.0.1:8774/v2/2b19e26c86fb4b48abe8551003fc00c7/os-keypairs', 'date': 'Fri, 20 May 2016 11:39:34 GMT', 'x-compute-request-id': 'req-174eb88e-a1a0-4382-abaa-3db83a7276b4'}", " Body: {\"keypair\": {\"public_key\": \"ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQCf0YVs8Qd2HOxGejejNA86wa9jKGRUqadnX16ux7D0QgTxcru4ll4JtSPY3azJqwwUAajeHOge/vPM6ySLlJscB9iPo0k4A0AbNed1hfmEvYXYEYmss58gkgFjwrv5wqIz08V4Fu+I9FMjD0PmFFQNqSv35i3C6i54LUZRGkFzT7HxXM4aAZUjpCfjNXsJSDoRSz0GBC0QbZ+GQah7mYiVMDJO1MFWKrReDjYMNr3xdooTb2m3G2rvksHgl0ezVRDbvkgCodJz4YQrC82gitJdLyGEJZpYPTMbOp/dsOAkKPGtkyF4Qqv/FDMyCHM8bsiOog/xXmBIT87xlzBtAzZ9 Generated-by-Nova\", \"private_key\": \"-----BEGIN RSA PRIVATE 
KEY-----\\nMIIEqQIBAAKCAQEAn9GFbPEHdhzsRno3ozQPOsGvYyhkVKmnZ19ersew9EIE8XK7\\nuJZeCbUj2N2syasMFAGo3hzoHv7zzOski5SbHAfYj6NJOANAGzXndYX5hL2F2BGJ\\nrLOfIJIBY8K7+cKiM9PFeBbviPRTIw9D5hRUDakr9+YtwuoueC1GURpBc0+x8VzO\\nGgGVI6Qn4zV7CUg6EUs9BgQtEG2fhkGoe5mIlTAyTtTBViq0Xg42DDa98XaKE29p\\ntxtq75LB4JdHs1UQ275IAqHSc+GEKwvNoIrSXS8hhCWaWD0zGzqf3bDgJCjxrZMh\\neEKr/xQzMghzPG7IjqIP8V5gSE/O8ZcwbQM2fQIDAQABAoIBABO2sJKjmJwFLU/0\\nO3CyNz60LYI5tUaMNs4VgYRltXoruphd4rH+OlNQOL/DeFDX/IFrQv1C648HO+OH\\nDdb52bg3b4soRRvXqsywdYCVqhWpmxzv7N+UuIg3+lvn5XAFhiSGdtE9YwatvKOS\\nenmjAEs/FuFZT0O/x0OjsgzHBFPIyt15vGAOIIhbWRBoWJSBD5MglPHpqFRMbWnh\\nIma71YSEn62dddHzlnk5+7gVf7FF9eZl4hcLrfqWuZhi8lNTiu/FtBQT9cEnoAXb\\nu6Y/59eoZSBv334s3D/nlbtqY922xJrwVjucfbw7tDrzDaDlurkHKST/jr29weOM\\nPl7T8gECggCBAL/E6Y/62Eja/DoQwMGytxb612xbA3lasZnpyBjBFpByKuK8tPy9\\nwp9K+dT8nk8+E1GToPOGGyvk1UqnYl2mShiDpZWRrtDf2JZqL0r4FhDs1DoMvbcO\\nscAt9KYT9yjMwFtZXflA2N7sU5pWovJccnEsAN47elxT9ROC9l0Sqt2dAoIAgQDV\\nWQTawXkU2bJlyUqC+EXEFEtHR1uUfLWB7ZbwoqB5tYKUydKk0d7CNOLSPQgJJnpt\\nb5l/iRtypsZ0FbjRiBtdkzsn7zzsY5pvaptasbeSNG7EOdADRRmfXSDPQi9J5TIL\\nsqxxbu9lLlIgT1J8ECQARpNx7VmSzA697JjpS9TWYQKCAIEAr9hhj3wmXdAoHxqD\\nllpJV1IWva5LZjkKyCa+LCzKgxOdTaJal7NtxmGa63nltKYoUtJ7cTLUsZA5ISaR\\npWw5X3dAHAGlerT4Rx0BVs5cdZKlHMHYKQbZaW76eluudQQjkuBEsq2K8Admtgyh\\niHnLGwmNljqV/hmijgy12iym72UCggCBAJ/MzZYM1GSJqtYSr3zp+Vek273H9RCD\\nWHC5RRV4ujpveh94DA7oI7HTaxGOJTa1W34c2Rxt1eFKidrpakWHbPfqD6UZzMhC\\n0qohb7u+4YDhRRY1N1k7qLV1S93x9PmkcpfQfNl5/lYLG/iXcXD7pfuO4WG0JiOO\\nNHyNevtDkWgBAoIAgBXL82F/ICjK7i4B232bJB0RQEzAevqBCoRFMUUGl9rePcgB\\nUOSiiDVHfl2C1yu3WabzNehoDO5/RqyxpPji/SrnMvi4aPPywLvJ9gqEfUwld1Wo\\np6riJoPx6aS+VLPLP0rDhKGuEJkIu4Qv9tCdG7nReWWEImiM6ldN9kzOZfIN\\n-----END RSA PRIVATE KEY-----\\n\", \"user_id\": \"4f2057b1b7744ce9b90440c0f47efbef\", \"name\": \"tempest-TestServerBasicOps-1692537820\", \"fingerprint\": \"01:a5:e4:68:53:67:e9:cc:22:5b:d6:b0:21:ff:5a:f4\"}}", " 2016-05-20 12:40:04,819 7734 INFO [tempest.lib.common.rest_client] Request (TestServerBasicOps:test_server_basic_ops): 500 POST https://127.0.0.1:8774/v2/2b19e26c86fb4b48abe8551003fc00c7/os-security-groups 30.063s", " 2016-05-20 12:40:04,819 7734 DEBUG [tempest.lib.common.rest_client] Request - Headers: {'Content-Type': 'application/json', 'Accept': 'application/json', 'X-Auth-Token': ''}", " Body: {\"security_group\": {\"description\": \"tempest-TestServerBasicOps-1404384290 description\", \"name\": \"tempest-TestServerBasicOps-1404384290\"}}", " Response - Headers: {'status': '500', 'connection': 'close', 'server': 'Apache/2.4.6 (CentOS)', 'content-length': '224', 'content-type': 'application/json; charset=UTF-8', 'content-location': 'https://127.0.0.1:8774/v2/2b19e26c86fb4b48abe8551003fc00c7/os-security-groups', 'date': 'Fri, 20 May 2016 11:39:34 GMT', 'x-compute-request-id': 'req-940cd5d7-8a3c-478b-9285-2964bfe29105'}", " Body: {\"computeFault\": {\"message\": \"Unexpected API Error. 
Please report this at http://bugs.launchpad.net/nova/ and attach the Nova API log if possible.\\n\", \"code\": 500}}", " 2016-05-20 12:40:10,344 7734 INFO [tempest.lib.common.rest_client] Request (TestServerBasicOps:_run_cleanups): 202 DELETE https://127.0.0.1:8774/v2/2b19e26c86fb4b48abe8551003fc00c7/os-keypairs/tempest-TestServerBasicOps-1692537820 5.521s", " 2016-05-20 12:40:10,349 7734 DEBUG [tempest.lib.common.rest_client] Request - Headers: {'Content-Type': 'application/json', 'Accept': 'application/json', 'X-Auth-Token': ''}", " Body: None", " Response - Headers: {'status': '202', 'connection': 'close', 'server': 'Apache/2.4.6 (CentOS)', 'content-length': '0', 'content-type': 'application/json', 'content-location': 'https://127.0.0.1:8774/v2/2b19e26c86fb4b48abe8551003fc00c7/os-keypairs/tempest-TestServerBasicOps-1692537820', 'date': 'Fri, 20 May 2016 11:40:04 GMT', 'x-compute-request-id': 'req-a0ec0f02-aeeb-4a83-81f3-01be3558a2df'}", " Body: ", " ", "", "Captured traceback:", "~~~~~~~~~~~~~~~~~~~", " Traceback (most recent call last):", " File \"tempest/test.py\", line 113, in wrapper", " return f(self, *func_args, **func_kwargs)", " File \"tempest/scenario/test_server_basic_ops.py\", line 124, in test_server_basic_ops", " self.security_group = self._create_security_group()", " File \"tempest/scenario/manager.py\", line 333, in _create_security_group", " name=sg_name, description=sg_desc)['security_group']", " File \"tempest/lib/services/compute/security_groups_client.py\", line 55, in create_security_group", " resp, body = self.post('os-security-groups', post_body)", " File \"tempest/lib/common/rest_client.py\", line 259, in post", " return self.request('POST', url, extra_headers, headers, body)", " File \"tempest/lib/services/compute/base_compute_client.py\", line 53, in request", " method, url, extra_headers, headers, body)", " File \"tempest/lib/common/rest_client.py\", line 641, in request", " resp, resp_body)", " File \"tempest/lib/common/rest_client.py\", line 760, in _error_checker", " message=message)", " tempest.lib.exceptions.ServerFault: Got server fault", " Details: Unexpected API Error. 
Please report this at http://bugs.launchpad.net/nova/ and attach the Nova API log if possible.", " ", " ", "", "", "======", "Totals", "======", "Ran: 126 tests in 837.0000 sec.", " - Passed: 107", " - Skipped: 18", " - Expected Fail: 0", " - Unexpected Success: 0", " - Failed: 1", "Sum of execute time for each test: 665.4004 sec.", "", "==============", "Worker Balance", "==============", " - Worker 0 (67 tests) => 0:13:47.264576", " - Worker 1 (59 tests) => 0:04:33.373255", "", "Slowest Tests:", "", "Test id Runtime (s)", "-------------------------------------------------------------------------------------------------------------------------------------------------------------- -----------", "tempest.scenario.test_volume_boot_pattern.TestVolumeBootPatternV2.test_volume_boot_pattern[compute,id-557cd2c2-4eb8-4dce-98be-f86765ff311b,image,smoke,volume] 155.015", "tempest.scenario.test_volume_boot_pattern.TestVolumeBootPattern.test_volume_boot_pattern[compute,id-557cd2c2-4eb8-4dce-98be-f86765ff311b,image,smoke,volume] 151.414", "tempest.scenario.test_network_basic_ops.TestNetworkBasicOps.test_network_basic_ops[compute,id-f323b3ba-82f8-4db7-8ea6-6a895869ec49,network,smoke] 132.156", "tempest.api.volume.test_volumes_get.VolumesV1GetTest.test_volume_create_get_update_delete_from_image[id-54a01030-c7fc-447c-86ee-c1182beae638,image,smoke] 42.518", "tempest.scenario.test_server_basic_ops.TestServerBasicOps.test_server_basic_ops[compute,id-7fff3fb3-91d8-4fd0-bd7d-0204f1f180ba,network,smoke] 36.835", "tempest.api.volume.test_volumes_get.VolumesV2GetTest.test_volume_create_get_update_delete_from_image[id-54a01030-c7fc-447c-86ee-c1182beae638,image,smoke] 32.993", "tempest.api.volume.test_volumes_get.VolumesV1GetTest.test_volume_create_get_update_delete[id-27fb0e9f-fb64-41dd-8bdb-1ffa762f0d51,smoke] 17.959", "tempest.api.compute.servers.test_attach_interfaces.AttachInterfacesTestJSON.test_add_remove_fixed_ip[id-c7e0e60b-ee45-43d0-abeb-8596fd42a2f9,network,smoke] 11.839", "tempest.api.compute.servers.test_server_actions.ServerActionsTestJSON.test_reboot_server_hard[id-2cb1baf6-ac8d-4429-bf0d-ba8a0ba53e32,smoke] 11.348", "tempest.api.volume.test_volumes_get.VolumesV2GetTest.test_volume_create_get_update_delete[id-27fb0e9f-fb64-41dd-8bdb-1ffa762f0d51,smoke] 7.843", "ERROR: InvocationError: '/usr/bin/bash tools/pretty_tox.sh --concurrency=2 smoke dashbboard TelemetryAlarming api.baremetal.admin.test_drivers'", "___________________________________ summary ____________________________________", "ERROR: all: commands failed"], "warnings": []} cmd: ./run_tests.sh start: 2016-05-20 12:22:43.301820 end: 2016-05-20 12:49:31.052546 delta: 0:26:47.750726 stdout: Cloning into '/tmp/openstack/tempest'... Preparing... ######################################## Updating / installing... 
puppetlabs-release-7-12 ######################################## Loaded plugins: fastestmirror, priorities Loading mirror speeds from cached hostfile * base: mirror.centos.org * extras: mirror.centos.org * updates: mirror.centos.org 545 packages excluded due to repository priority protections Resolving Dependencies --> Running transaction check ---> Package dstat.noarch 0:0.7.2-12.el7 will be installed ---> Package puppet.noarch 0:3.6.2-3.el7 will be installed --> Processing Dependency: hiera >= 1.0.0 for package: puppet-3.6.2-3.el7.noarch --> Processing Dependency: facter >= 1.6.6 for package: puppet-3.6.2-3.el7.noarch --> Processing Dependency: rubygem(rgen) for package: puppet-3.6.2-3.el7.noarch --> Processing Dependency: ruby(shadow) for package: puppet-3.6.2-3.el7.noarch --> Processing Dependency: ruby(selinux) for package: puppet-3.6.2-3.el7.noarch --> Processing Dependency: ruby(augeas) for package: puppet-3.6.2-3.el7.noarch --> Running transaction check ---> Package facter.x86_64 0:2.4.4-3.el7 will be installed --> Processing Dependency: pciutils for package: facter-2.4.4-3.el7.x86_64 ---> Package hiera.noarch 0:1.3.4-1.el7 will be installed ---> Package libselinux-ruby.x86_64 0:2.2.2-6.el7 will be installed ---> Package ruby-augeas.x86_64 0:0.5.0-1.el7 will be installed --> Processing Dependency: augeas-libs >= 1.0.0 for package: ruby-augeas-0.5.0-1.el7.x86_64 --> Processing Dependency: libaugeas.so.0(AUGEAS_0.8.0)(64bit) for package: ruby-augeas-0.5.0-1.el7.x86_64 --> Processing Dependency: libaugeas.so.0(AUGEAS_0.16.0)(64bit) for package: ruby-augeas-0.5.0-1.el7.x86_64 --> Processing Dependency: libaugeas.so.0(AUGEAS_0.14.0)(64bit) for package: ruby-augeas-0.5.0-1.el7.x86_64 --> Processing Dependency: libaugeas.so.0(AUGEAS_0.12.0)(64bit) for package: ruby-augeas-0.5.0-1.el7.x86_64 --> Processing Dependency: libaugeas.so.0(AUGEAS_0.11.0)(64bit) for package: ruby-augeas-0.5.0-1.el7.x86_64 --> Processing Dependency: libaugeas.so.0(AUGEAS_0.10.0)(64bit) for package: ruby-augeas-0.5.0-1.el7.x86_64 --> Processing Dependency: libaugeas.so.0(AUGEAS_0.1.0)(64bit) for package: ruby-augeas-0.5.0-1.el7.x86_64 --> Processing Dependency: libaugeas.so.0()(64bit) for package: ruby-augeas-0.5.0-1.el7.x86_64 ---> Package ruby-shadow.x86_64 0:1.4.1-23.el7 will be installed ---> Package rubygem-rgen.noarch 0:0.6.6-2.el7 will be installed --> Running transaction check ---> Package augeas-libs.x86_64 0:1.4.0-2.el7 will be installed ---> Package pciutils.x86_64 0:3.2.1-4.el7 will be installed --> Finished Dependency Resolution Dependencies Resolved ================================================================================ Package Arch Version Repository Size ================================================================================ Installing: dstat noarch 0.7.2-12.el7 base 163 k puppet noarch 3.6.2-3.el7 delorean-mitaka-testing 1.2 M Installing for dependencies: augeas-libs x86_64 1.4.0-2.el7 base 355 k facter x86_64 2.4.4-3.el7 delorean-mitaka-testing 101 k hiera noarch 1.3.4-1.el7 delorean-mitaka-testing 24 k libselinux-ruby x86_64 2.2.2-6.el7 base 127 k pciutils x86_64 3.2.1-4.el7 base 90 k ruby-augeas x86_64 0.5.0-1.el7 delorean-mitaka-testing 23 k ruby-shadow x86_64 1.4.1-23.el7 delorean-mitaka-testing 13 k rubygem-rgen noarch 0.6.6-2.el7 delorean-mitaka-testing 83 k Transaction Summary ================================================================================ Install 2 Packages (+8 Dependent packages) Total download size: 2.2 M Installed size: 7.1 M Downloading packages: 
-------------------------------------------------------------------------------- Total 401 kB/s | 2.2 MB 00:05 Running transaction check Running transaction test Transaction test succeeded Running transaction Installing : rubygem-rgen-0.6.6-2.el7.noarch 1/10 Installing : augeas-libs-1.4.0-2.el7.x86_64 2/10 Installing : ruby-augeas-0.5.0-1.el7.x86_64 3/10 Installing : ruby-shadow-1.4.1-23.el7.x86_64 4/10 Installing : hiera-1.3.4-1.el7.noarch 5/10 Installing : pciutils-3.2.1-4.el7.x86_64 6/10 Installing : facter-2.4.4-3.el7.x86_64 7/10 Installing : libselinux-ruby-2.2.2-6.el7.x86_64 8/10 Installing : puppet-3.6.2-3.el7.noarch 9/10 Installing : dstat-0.7.2-12.el7.noarch 10/10 Verifying : ruby-augeas-0.5.0-1.el7.x86_64 1/10 Verifying : libselinux-ruby-2.2.2-6.el7.x86_64 2/10 Verifying : pciutils-3.2.1-4.el7.x86_64 3/10 Verifying : hiera-1.3.4-1.el7.noarch 4/10 Verifying : puppet-3.6.2-3.el7.noarch 5/10 Verifying : facter-2.4.4-3.el7.x86_64 6/10 Verifying : dstat-0.7.2-12.el7.noarch 7/10 Verifying : ruby-shadow-1.4.1-23.el7.x86_64 8/10 Verifying : augeas-libs-1.4.0-2.el7.x86_64 9/10 Verifying : rubygem-rgen-0.6.6-2.el7.noarch 10/10 Installed: dstat.noarch 0:0.7.2-12.el7 puppet.noarch 0:3.6.2-3.el7 Dependency Installed: augeas-libs.x86_64 0:1.4.0-2.el7 facter.x86_64 0:2.4.4-3.el7 hiera.noarch 0:1.3.4-1.el7 libselinux-ruby.x86_64 0:2.2.2-6.el7 pciutils.x86_64 0:3.2.1-4.el7 ruby-augeas.x86_64 0:0.5.0-1.el7 ruby-shadow.x86_64 0:1.4.1-23.el7 rubygem-rgen.noarch 0:0.6.6-2.el7 Complete! dstat is /usr/bin/dstat Successfully installed colored-1.2 Successfully installed cri-2.6.1 Successfully installed log4r-1.1.10 Successfully installed multi_json-1.12.1 Successfully installed multipart-post-2.0.0 Successfully installed faraday-0.9.2 Successfully installed faraday_middleware-0.10.0 Successfully installed semantic_puppet-0.1.2 Successfully installed minitar-0.5.4 Successfully installed puppet_forge-2.2.0 Successfully installed r10k-2.3.0 11 gems installed /etc/puppet/modules ├── antonlindstrom-powerdns (v0.0.5) ├── duritong-sysctl (v0.0.11) ├── nanliu-staging (v1.0.4) ├── openstack-aodh (v8.0.2) ├── openstack-barbican (v0.0.1) ├── openstack-ceilometer (v8.0.1) ├── openstack-ceph (v1.0.0) ├── openstack-cinder (v8.0.1) ├── openstack-designate (v8.0.1) ├── openstack-glance (v8.0.1) ├── openstack-gnocchi (v8.0.1) ├── openstack-heat (v8.0.1) ├── openstack-horizon (v8.0.1) ├── openstack-ironic (v8.0.1) ├── openstack-keystone (v8.0.1) ├── openstack-manila (v8.0.1) ├── openstack-mistral (v8.0.1) ├── openstack-monasca (v1.0.0) ├── openstack-murano (v8.0.1) ├── openstack-neutron (v8.0.1) ├── openstack-nova (v8.0.1) ├── openstack-openstack_extras (v8.0.1) ├── openstack-openstacklib (v8.0.1) invalid ├── openstack-sahara (v8.0.1) ├── openstack-swift (v8.0.1) ├── openstack-tempest (v8.0.1) ├── openstack-trove (v8.0.1) ├── openstack-vswitch (v4.0.0) ├── openstack-zaqar (v8.0.1) ├── openstack_integration (???) ├── puppet-corosync (v0.8.0) ├── puppet-octavia (v0.0.1) ├── puppet-oslo (v0.0.1) ├── puppetlabs-apache (v1.8.1) ├── puppetlabs-apt (v2.2.2) ├── puppetlabs-concat (v1.2.5) ├── puppetlabs-firewall (v1.7.2) ├── puppetlabs-inifile (v1.4.3) invalid ├── puppetlabs-mongodb (v0.12.0) ├── puppetlabs-mysql (v3.6.2) ├── puppetlabs-postgresql (v4.7.1) ├── puppetlabs-rabbitmq (v5.3.1) ├── puppetlabs-rsync (v0.4.0) ├── puppetlabs-stdlib (v4.9.1) ├── puppetlabs-vcsrepo (v1.3.2) ├── puppetlabs-xinetd (v1.5.0) ├── qpid (???) ├── saz-memcached (v2.8.1) ├── stankevich-python (v1.10.0) └── 
theforeman-dns (v3.1.0) /usr/share/puppet/modules (no modules installed) Info: Loading external facts from /etc/puppet/modules/openstacklib/facts.d Info: Loading facts in /etc/puppet/modules/nova/lib/facter/libvirt_uuid.rb Info: Loading facts in /etc/puppet/modules/openstacklib/lib/facter/os_package_type.rb Info: Loading facts in /etc/puppet/modules/openstacklib/lib/facter/os_service_default.rb Info: Loading facts in /etc/puppet/modules/vswitch/lib/facter/ovs.rb Info: Loading facts in /etc/puppet/modules/apt/lib/facter/apt_reboot_required.rb Info: Loading facts in /etc/puppet/modules/apt/lib/facter/apt_update_last_success.rb Info: Loading facts in /etc/puppet/modules/apt/lib/facter/apt_updates.rb Info: Loading facts in /etc/puppet/modules/concat/lib/facter/concat_basedir.rb Info: Loading facts in /etc/puppet/modules/firewall/lib/facter/ip6tables_version.rb Info: Loading facts in /etc/puppet/modules/firewall/lib/facter/iptables_persistent_version.rb Info: Loading facts in /etc/puppet/modules/firewall/lib/facter/iptables_version.rb Info: Loading facts in /etc/puppet/modules/mysql/lib/facter/mysql_version.rb Info: Loading facts in /etc/puppet/modules/mysql/lib/facter/mysql_server_id.rb Info: Loading facts in /etc/puppet/modules/python/lib/facter/pip_version.rb Info: Loading facts in /etc/puppet/modules/python/lib/facter/python_version.rb Info: Loading facts in /etc/puppet/modules/python/lib/facter/virtualenv_version.rb Info: Loading facts in /etc/puppet/modules/staging/lib/facter/staging_http_get.rb Info: Loading facts in /etc/puppet/modules/staging/lib/facter/staging_windir.rb Info: Loading facts in /etc/puppet/modules/stdlib/lib/facter/puppet_vardir.rb Info: Loading facts in /etc/puppet/modules/stdlib/lib/facter/facter_dot_d.rb Info: Loading facts in /etc/puppet/modules/stdlib/lib/facter/pe_version.rb Info: Loading facts in /etc/puppet/modules/stdlib/lib/facter/root_home.rb Notice: Compiled catalog for n2.dusty.ci.centos.org in environment production in 8.94 seconds Info: Applying configuration version '1463743459' Notice: /Stage[main]/Concat::Setup/File[/var/lib/puppet/concat]/ensure: created Notice: /Stage[main]/Concat::Setup/File[/var/lib/puppet/concat/bin]/ensure: created Notice: /Stage[main]/Concat::Setup/File[/var/lib/puppet/concat/bin/concatfragments.rb]/ensure: defined content as '{md5}b684db0eac243553a6a79365119a363d' Notice: /Stage[main]/Xinetd/Package[xinetd]/ensure: created Notice: /Stage[main]/Openstack_integration::Cacert/File[/etc/pki/ca-trust/source/anchors/puppet_openstack.pem]/ensure: defined content as '{md5}78f42ae07a4fc8ebdd5b89c4c74bba5e' Info: /Stage[main]/Openstack_integration::Cacert/File[/etc/pki/ca-trust/source/anchors/puppet_openstack.pem]: Scheduling refresh of Exec[update-ca-certificates] Notice: /Stage[main]/Openstack_integration::Cacert/Exec[update-ca-certificates]: Triggered 'refresh' from 1 events Info: /Stage[main]/Openstack_integration::Cacert/Exec[update-ca-certificates]: Scheduling refresh of Service[httpd] Info: /Stage[main]/Openstack_integration::Cacert/Exec[update-ca-certificates]: Scheduling refresh of Service[httpd] Info: /Stage[main]/Openstack_integration::Cacert/Exec[update-ca-certificates]: Scheduling refresh of Service[httpd] Info: /Stage[main]/Openstack_integration::Cacert/Exec[update-ca-certificates]: Scheduling refresh of Service[glance-api] Info: /Stage[main]/Openstack_integration::Cacert/Exec[update-ca-certificates]: Scheduling refresh of Service[glance-registry] Notice: /Stage[main]/Memcached/Package[memcached]/ensure: created Notice: 
/Stage[main]/Memcached/File[/etc/sysconfig/memcached]/content: --- /etc/sysconfig/memcached 2015-04-10 10:40:42.000000000 +0100 +++ /tmp/puppet-file20160520-26469-d60985 2016-05-20 12:24:40.532841270 +0100 @@ -1,5 +1,5 @@ PORT="11211" USER="memcached" -MAXCONN="1024" -CACHESIZE="64" -OPTIONS="" +MAXCONN="8192" +CACHESIZE="30400" +OPTIONS="-l 0.0.0.0 -U 11211 -t 8 >> /var/log/memcached.log 2>&1" Info: /Stage[main]/Memcached/File[/etc/sysconfig/memcached]: Filebucketed /etc/sysconfig/memcached to puppet with sum 05503957e3796fbe6fddd756a7a102a0 Notice: /Stage[main]/Memcached/File[/etc/sysconfig/memcached]/content: content changed '{md5}05503957e3796fbe6fddd756a7a102a0' to '{md5}607d5b4345a63a5155f9fbe6c19b6c9b' Info: /Stage[main]/Memcached/File[/etc/sysconfig/memcached]: Scheduling refresh of Service[memcached] Notice: /Stage[main]/Memcached/Service[memcached]/ensure: ensure changed 'stopped' to 'running' Info: /Stage[main]/Memcached/Service[memcached]: Unscheduling refresh on Service[memcached] Notice: /Stage[main]/Rabbitmq::Repo::Rhel/Exec[rpm --import http://www.rabbitmq.com/rabbitmq-signing-key-public.asc]/returns: executed successfully Notice: /Stage[main]/Neutron::Agents::Lbaas/Package[haproxy]/ensure: created Notice: /Stage[main]/Glance/Package[openstack-glance]/ensure: created Info: /Stage[main]/Glance/Package[openstack-glance]: Scheduling refresh of Anchor[keystone::service::end] Info: /Stage[main]/Glance/Package[openstack-glance]: Scheduling refresh of Exec[glance-manage db_sync] Notice: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/swift_store_config_file]/ensure: created Info: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/swift_store_config_file]: Scheduling refresh of Service[glance-api] Info: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/swift_store_config_file]: Scheduling refresh of Exec[glance-manage db_sync] Notice: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/registry_port]/ensure: created Info: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/registry_port]: Scheduling refresh of Service[glance-api] Info: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/registry_port]: Scheduling refresh of Exec[glance-manage db_sync] Notice: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_userid]/ensure: created Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_userid]: Scheduling refresh of Service[glance-api] Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_userid]: Scheduling refresh of Exec[glance-manage db_sync] Notice: /Stage[main]/Glance::Api/Glance_cache_config[glance_store/os_region_name]/ensure: created Info: /Stage[main]/Glance::Api/Glance_cache_config[glance_store/os_region_name]: Scheduling refresh of Service[glance-api] Info: /Stage[main]/Glance::Api/Glance_cache_config[glance_store/os_region_name]: Scheduling refresh of Exec[glance-manage db_sync] Notice: /Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/log_dir]/ensure: created Info: /Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/log_dir]: Scheduling refresh of Service[glance-api] Info: /Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/log_dir]: Scheduling refresh of Exec[glance-manage db_sync] Notice: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/show_image_direct_url]/ensure: created Info: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/show_image_direct_url]: Scheduling refresh of 
Service[glance-api] Info: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/show_image_direct_url]: Scheduling refresh of Exec[glance-manage db_sync] Notice: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]/ensure: created Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]: Scheduling refresh of Service[glance-api] Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]: Scheduling refresh of Exec[glance-manage db_sync] Notice: /Stage[main]/Glance::Api/Glance_cache_config[DEFAULT/registry_port]/ensure: created Info: /Stage[main]/Glance::Api/Glance_cache_config[DEFAULT/registry_port]: Scheduling refresh of Service[glance-api] Info: /Stage[main]/Glance::Api/Glance_cache_config[DEFAULT/registry_port]: Scheduling refresh of Exec[glance-manage db_sync] Notice: /Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/debug]/ensure: created Info: /Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/debug]: Scheduling refresh of Service[glance-api] Info: /Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/debug]: Scheduling refresh of Exec[glance-manage db_sync] Notice: /Stage[main]/Glance::Api/Glance_cache_config[DEFAULT/admin_user]/ensure: created Info: /Stage[main]/Glance::Api/Glance_cache_config[DEFAULT/admin_user]: Scheduling refresh of Service[glance-api] Info: /Stage[main]/Glance::Api/Glance_cache_config[DEFAULT/admin_user]: Scheduling refresh of Exec[glance-manage db_sync] Notice: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/image_cache_dir]/ensure: created Info: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/image_cache_dir]: Scheduling refresh of Service[glance-api] Info: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/image_cache_dir]: Scheduling refresh of Exec[glance-manage db_sync] Notice: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/bind_host]/ensure: created Info: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/bind_host]: Scheduling refresh of Service[glance-api] Info: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/bind_host]: Scheduling refresh of Exec[glance-manage db_sync] Notice: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/registry_client_key_file]/ensure: created Info: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/registry_client_key_file]: Scheduling refresh of Service[glance-api] Info: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/registry_client_key_file]: Scheduling refresh of Exec[glance-manage db_sync] Notice: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_use_ssl]/ensure: created Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_use_ssl]: Scheduling refresh of Service[glance-api] Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_use_ssl]: Scheduling refresh of Exec[glance-manage db_sync] Notice: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_port]/ensure: created Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_port]: Scheduling refresh of Service[glance-api] Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_port]: Scheduling refresh of Exec[glance-manage db_sync] Notice: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/default_store]/ensure: created Info: 
/Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/default_store]: Scheduling refresh of Service[glance-api] Info: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/default_store]: Scheduling refresh of Exec[glance-manage db_sync] Notice: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/amqp_durable_queues]/ensure: created Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/amqp_durable_queues]: Scheduling refresh of Service[glance-api] Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/amqp_durable_queues]: Scheduling refresh of Exec[glance-manage db_sync] Notice: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/default_swift_reference]/ensure: created Info: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/default_swift_reference]: Scheduling refresh of Service[glance-api] Info: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/default_swift_reference]: Scheduling refresh of Exec[glance-manage db_sync] Notice: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/key_file]/ensure: created Info: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/key_file]: Scheduling refresh of Service[glance-api] Info: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/key_file]: Scheduling refresh of Exec[glance-manage db_sync] Notice: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_ha_queues]/ensure: created Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_ha_queues]: Scheduling refresh of Service[glance-api] Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_ha_queues]: Scheduling refresh of Exec[glance-manage db_sync] Notice: /Stage[main]/Glance::Backend::Swift/Glance_swift_config[ref1/user]/ensure: created Notice: /Stage[main]/Glance::Api/Glance_api_config[keystone_authtoken/identity_uri]/ensure: created Info: /Stage[main]/Glance::Api/Glance_api_config[keystone_authtoken/identity_uri]: Scheduling refresh of Service[glance-api] Info: /Stage[main]/Glance::Api/Glance_api_config[keystone_authtoken/identity_uri]: Scheduling refresh of Exec[glance-manage db_sync] Notice: /Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/use_stderr]/ensure: created Info: /Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/use_stderr]: Scheduling refresh of Service[glance-api] Info: /Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/use_stderr]: Scheduling refresh of Exec[glance-manage db_sync] Notice: /Stage[main]/Glance::Registry/Glance_registry_config[DEFAULT/key_file]/ensure: created Info: /Stage[main]/Glance::Registry/Glance_registry_config[DEFAULT/key_file]: Scheduling refresh of Service[glance-registry] Info: /Stage[main]/Glance::Registry/Glance_registry_config[DEFAULT/key_file]: Scheduling refresh of Exec[glance-manage db_sync] Notice: /Stage[main]/Glance::Registry/Glance_registry_config[DEFAULT/bind_host]/ensure: created Info: /Stage[main]/Glance::Registry/Glance_registry_config[DEFAULT/bind_host]: Scheduling refresh of Service[glance-registry] Info: /Stage[main]/Glance::Registry/Glance_registry_config[DEFAULT/bind_host]: Scheduling refresh of Exec[glance-manage db_sync] Notice: /Stage[main]/Glance::Registry/Glance_registry_config[DEFAULT/workers]/ensure: created Info: /Stage[main]/Glance::Registry/Glance_registry_config[DEFAULT/workers]: Scheduling refresh of Service[glance-registry] Info: 
/Stage[main]/Glance::Registry/Glance_registry_config[DEFAULT/workers]: Scheduling refresh of Exec[glance-manage db_sync] Notice: /Stage[main]/Glance::Registry/Glance_registry_config[keystone_authtoken/admin_password]/ensure: created Info: /Stage[main]/Glance::Registry/Glance_registry_config[keystone_authtoken/admin_password]: Scheduling refresh of Service[glance-registry] Info: /Stage[main]/Glance::Registry/Glance_registry_config[keystone_authtoken/admin_password]: Scheduling refresh of Exec[glance-manage db_sync] Notice: /Stage[main]/Glance::Api/Glance_api_config[paste_deploy/flavor]/ensure: created Info: /Stage[main]/Glance::Api/Glance_api_config[paste_deploy/flavor]: Scheduling refresh of Service[glance-api] Info: /Stage[main]/Glance::Api/Glance_api_config[paste_deploy/flavor]: Scheduling refresh of Exec[glance-manage db_sync] Notice: /Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/log_dir]/ensure: created Info: /Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/log_dir]: Scheduling refresh of Service[glance-registry] Info: /Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/log_dir]: Scheduling refresh of Exec[glance-manage db_sync] Notice: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/heartbeat_rate]/ensure: created Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/heartbeat_rate]: Scheduling refresh of Service[glance-api] Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/heartbeat_rate]: Scheduling refresh of Exec[glance-manage db_sync] Notice: /Stage[main]/Glance::Registry/Glance_registry_config[DEFAULT/bind_port]/ensure: created Info: /Stage[main]/Glance::Registry/Glance_registry_config[DEFAULT/bind_port]: Scheduling refresh of Service[glance-registry] Info: /Stage[main]/Glance::Registry/Glance_registry_config[DEFAULT/bind_port]: Scheduling refresh of Exec[glance-manage db_sync] Notice: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_virtual_host]/ensure: created Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_virtual_host]: Scheduling refresh of Service[glance-api] Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_virtual_host]: Scheduling refresh of Exec[glance-manage db_sync] Notice: /Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/use_syslog]/ensure: created Info: /Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/use_syslog]: Scheduling refresh of Service[glance-registry] Info: /Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/use_syslog]: Scheduling refresh of Exec[glance-manage db_sync] Notice: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/kombu_ssl_version]/ensure: created Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/kombu_ssl_version]: Scheduling refresh of Service[glance-api] Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/kombu_ssl_version]: Scheduling refresh of Exec[glance-manage db_sync] Notice: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_notification_exchange]/ensure: created Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_notification_exchange]: Scheduling refresh of Service[glance-api] Info: 
/Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_notification_exchange]: Scheduling refresh of Exec[glance-manage db_sync] Notice: /Stage[main]/Glance::Api/Glance_api_config[keystone_authtoken/admin_password]/ensure: created Info: /Stage[main]/Glance::Api/Glance_api_config[keystone_authtoken/admin_password]: Scheduling refresh of Service[glance-api] Info: /Stage[main]/Glance::Api/Glance_api_config[keystone_authtoken/admin_password]: Scheduling refresh of Exec[glance-manage db_sync] Notice: /Stage[main]/Glance::Api/Glance_api_config[glance_store/os_region_name]/ensure: created Info: /Stage[main]/Glance::Api/Glance_api_config[glance_store/os_region_name]: Scheduling refresh of Service[glance-api] Info: /Stage[main]/Glance::Api/Glance_api_config[glance_store/os_region_name]: Scheduling refresh of Exec[glance-manage db_sync] Notice: /Stage[main]/Neutron/Package[neutron]/ensure: created Info: /Stage[main]/Neutron/Package[neutron]: Scheduling refresh of Anchor[keystone::service::end] Info: /Stage[main]/Neutron/Package[neutron]: Scheduling refresh of Service[neutron-server] Info: /Stage[main]/Neutron/Package[neutron]: Scheduling refresh of Service[neutron-ovs-agent-service] Info: /Stage[main]/Neutron/Package[neutron]: Scheduling refresh of Service[neutron-ovs-agent-service] Info: /Stage[main]/Neutron/Package[neutron]: Scheduling refresh of Service[neutron-metadata] Info: /Stage[main]/Neutron/Package[neutron]: Scheduling refresh of Service[neutron-metadata] Info: /Stage[main]/Neutron/Package[neutron]: Scheduling refresh of Service[neutron-lbaas-service] Info: /Stage[main]/Neutron/Package[neutron]: Scheduling refresh of Service[neutron-lbaas-service] Info: /Stage[main]/Neutron/Package[neutron]: Scheduling refresh of Service[neutron-lbaasv2-service] Info: /Stage[main]/Neutron/Package[neutron]: Scheduling refresh of Service[neutron-l3] Info: /Stage[main]/Neutron/Package[neutron]: Scheduling refresh of Service[neutron-l3] Info: /Stage[main]/Neutron/Package[neutron]: Scheduling refresh of Service[neutron-dhcp-service] Info: /Stage[main]/Neutron/Package[neutron]: Scheduling refresh of Service[neutron-dhcp-service] Info: /Stage[main]/Neutron/Package[neutron]: Scheduling refresh of Service[neutron-metering-service] Info: /Stage[main]/Neutron/Package[neutron]: Scheduling refresh of Service[neutron-metering-service] Info: /Stage[main]/Neutron/Package[neutron]: Scheduling refresh of Exec[neutron-db-sync] Notice: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_hosts]/ensure: created Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_hosts]: Scheduling refresh of Service[neutron-server] Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_hosts]: Scheduling refresh of Exec[neutron-db-sync] Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_hosts]: Scheduling refresh of Service[neutron-metadata] Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_hosts]: Scheduling refresh of Service[neutron-lbaas-service] Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_hosts]: Scheduling refresh of Service[neutron-l3] Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_hosts]: Scheduling refresh of Service[neutron-dhcp-service] Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_hosts]: Scheduling refresh of Service[neutron-metering-service] Notice: 
/Stage[main]/Neutron::Agents::Dhcp/Neutron_dhcp_agent_config[DEFAULT/resync_interval]/ensure: created Info: /Stage[main]/Neutron::Agents::Dhcp/Neutron_dhcp_agent_config[DEFAULT/resync_interval]: Scheduling refresh of Service[neutron-dhcp-service] Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/log_dir]/ensure: created Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/log_dir]: Scheduling refresh of Service[neutron-server] Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/log_dir]: Scheduling refresh of Exec[neutron-db-sync] Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/log_dir]: Scheduling refresh of Service[neutron-metadata] Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/log_dir]: Scheduling refresh of Service[neutron-lbaas-service] Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/log_dir]: Scheduling refresh of Service[neutron-l3] Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/log_dir]: Scheduling refresh of Service[neutron-dhcp-service] Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/log_dir]: Scheduling refresh of Service[neutron-metering-service] Notice: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/heartbeat_rate]/ensure: created Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/heartbeat_rate]: Scheduling refresh of Service[neutron-server] Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/heartbeat_rate]: Scheduling refresh of Exec[neutron-db-sync] Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/heartbeat_rate]: Scheduling refresh of Service[neutron-metadata] Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/heartbeat_rate]: Scheduling refresh of Service[neutron-lbaas-service] Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/heartbeat_rate]: Scheduling refresh of Service[neutron-l3] Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/heartbeat_rate]: Scheduling refresh of Service[neutron-dhcp-service] Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/heartbeat_rate]: Scheduling refresh of Service[neutron-metering-service] Notice: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/kombu_ssl_version]/ensure: created Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/kombu_ssl_version]: Scheduling refresh of Service[neutron-server] Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/kombu_ssl_version]: Scheduling refresh of Exec[neutron-db-sync] Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/kombu_ssl_version]: Scheduling refresh of Service[neutron-metadata] Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/kombu_ssl_version]: Scheduling refresh of Service[neutron-lbaas-service] Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/kombu_ssl_version]: Scheduling refresh of Service[neutron-l3] Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/kombu_ssl_version]: Scheduling refresh of Service[neutron-dhcp-service] Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/kombu_ssl_version]: Scheduling refresh of Service[neutron-metering-service] Notice: /Stage[main]/Neutron/Neutron_config[agent/root_helper]/ensure: created Info: /Stage[main]/Neutron/Neutron_config[agent/root_helper]: Scheduling refresh of Service[neutron-server] Info: /Stage[main]/Neutron/Neutron_config[agent/root_helper]: Scheduling refresh of Exec[neutron-db-sync] Info: /Stage[main]/Neutron/Neutron_config[agent/root_helper]: Scheduling refresh of Service[neutron-metadata] Info: 
/Stage[main]/Neutron/Neutron_config[agent/root_helper]: Scheduling refresh of Service[neutron-lbaas-service] Info: /Stage[main]/Neutron/Neutron_config[agent/root_helper]: Scheduling refresh of Service[neutron-l3] Info: /Stage[main]/Neutron/Neutron_config[agent/root_helper]: Scheduling refresh of Service[neutron-dhcp-service] Info: /Stage[main]/Neutron/Neutron_config[agent/root_helper]: Scheduling refresh of Service[neutron-metering-service] Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/rpc_backend]/ensure: created Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/rpc_backend]: Scheduling refresh of Service[neutron-server] Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/rpc_backend]: Scheduling refresh of Exec[neutron-db-sync] Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/rpc_backend]: Scheduling refresh of Service[neutron-metadata] Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/rpc_backend]: Scheduling refresh of Service[neutron-lbaas-service] Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/rpc_backend]: Scheduling refresh of Service[neutron-l3] Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/rpc_backend]: Scheduling refresh of Service[neutron-dhcp-service] Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/rpc_backend]: Scheduling refresh of Service[neutron-metering-service] Notice: /Stage[main]/Neutron::Agents::Metering/Package[neutron-metering-agent]/ensure: created Info: /Stage[main]/Neutron::Agents::Metering/Package[neutron-metering-agent]: Scheduling refresh of Anchor[keystone::service::end] Info: /Stage[main]/Neutron::Agents::Metering/Package[neutron-metering-agent]: Scheduling refresh of Exec[neutron-db-sync] Info: /Stage[main]/Neutron::Agents::Metering/Package[neutron-metering-agent]: Scheduling refresh of Service[neutron-metering-service] Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/service_plugins]/ensure: created Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/service_plugins]: Scheduling refresh of Service[neutron-server] Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/service_plugins]: Scheduling refresh of Exec[neutron-db-sync] Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/service_plugins]: Scheduling refresh of Service[neutron-metadata] Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/service_plugins]: Scheduling refresh of Service[neutron-lbaas-service] Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/service_plugins]: Scheduling refresh of Service[neutron-l3] Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/service_plugins]: Scheduling refresh of Service[neutron-dhcp-service] Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/service_plugins]: Scheduling refresh of Service[neutron-metering-service] Notice: /Stage[main]/Neutron::Agents::Metadata/Neutron_metadata_agent_config[DEFAULT/metadata_workers]/ensure: created Info: /Stage[main]/Neutron::Agents::Metadata/Neutron_metadata_agent_config[DEFAULT/metadata_workers]: Scheduling refresh of Service[neutron-metadata] Notice: /Stage[main]/Neutron::Agents::Dhcp/Neutron_dhcp_agent_config[DEFAULT/enable_metadata_network]/ensure: created Info: /Stage[main]/Neutron::Agents::Dhcp/Neutron_dhcp_agent_config[DEFAULT/enable_metadata_network]: Scheduling refresh of Service[neutron-dhcp-service] Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/control_exchange]/ensure: created Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/control_exchange]: Scheduling refresh of Service[neutron-server] Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/control_exchange]: Scheduling refresh of Exec[neutron-db-sync] 
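The stream of Neutron_config events above is easier to read with the underlying Puppet pattern in mind: each entry is an ini-setting resource in neutron.conf, and the module wires every such resource to the services that consume that file, which is why a single key such as DEFAULT/rpc_backend schedules refreshes of the server, the agents, and the db-sync exec all at once. A minimal sketch of that wiring follows, assuming the neutron_config type from puppet-neutron; the value, service names, and exact chaining are illustrative, not quoted from this job's manifests.

neutron_config { 'DEFAULT/core_plugin':
  value => 'ml2',                                    # illustrative value
}

# Resource collectors fan one config change out to every dependent
# resource, matching the repeated "Scheduling refresh of ..." lines above:
Neutron_config<||> ~> Service['neutron-server']
Neutron_config<||> ~> Service['neutron-dhcp-service']
Neutron_config<||> ~> Exec['neutron-db-sync']

service { 'neutron-server':
  ensure => running,
}

service { 'neutron-dhcp-service':
  ensure => running,
}

exec { 'neutron-db-sync':
  command     => 'neutron-db-manage upgrade heads',  # hypothetical command line
  path        => '/usr/bin',
  refreshonly => true,                               # runs only when notified
}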
Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/control_exchange]: Scheduling refresh of Service[neutron-metadata] Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/control_exchange]: Scheduling refresh of Service[neutron-lbaas-service] Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/control_exchange]: Scheduling refresh of Service[neutron-l3] Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/control_exchange]: Scheduling refresh of Service[neutron-dhcp-service] Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/control_exchange]: Scheduling refresh of Service[neutron-metering-service] Notice: /Stage[main]/Neutron::Agents::Metering/Neutron_metering_agent_config[DEFAULT/driver]/ensure: created Info: /Stage[main]/Neutron::Agents::Metering/Neutron_metering_agent_config[DEFAULT/driver]: Scheduling refresh of Service[neutron-metering-service] Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/core_plugin]/ensure: created Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/core_plugin]: Scheduling refresh of Service[neutron-server] Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/core_plugin]: Scheduling refresh of Exec[neutron-db-sync] Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/core_plugin]: Scheduling refresh of Service[neutron-metadata] Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/core_plugin]: Scheduling refresh of Service[neutron-lbaas-service] Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/core_plugin]: Scheduling refresh of Service[neutron-l3] Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/core_plugin]: Scheduling refresh of Service[neutron-dhcp-service] Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/core_plugin]: Scheduling refresh of Service[neutron-metering-service] Notice: /Stage[main]/Glance::Api/Glance_cache_config[DEFAULT/admin_tenant_name]/ensure: created Info: /Stage[main]/Glance::Api/Glance_cache_config[DEFAULT/admin_tenant_name]: Scheduling refresh of Service[glance-api] Info: /Stage[main]/Glance::Api/Glance_cache_config[DEFAULT/admin_tenant_name]: Scheduling refresh of Exec[glance-manage db_sync] Notice: /Stage[main]/Glance::Api/Glance_api_config[keystone_authtoken/admin_tenant_name]/ensure: created Info: /Stage[main]/Glance::Api/Glance_api_config[keystone_authtoken/admin_tenant_name]: Scheduling refresh of Service[glance-api] Info: /Stage[main]/Glance::Api/Glance_api_config[keystone_authtoken/admin_tenant_name]: Scheduling refresh of Exec[glance-manage db_sync] Notice: /Stage[main]/Neutron::Agents::Dhcp/Neutron_dhcp_agent_config[DEFAULT/enable_isolated_metadata]/ensure: created Info: /Stage[main]/Neutron::Agents::Dhcp/Neutron_dhcp_agent_config[DEFAULT/enable_isolated_metadata]: Scheduling refresh of Service[neutron-dhcp-service] Notice: /Stage[main]/Glance::Registry/Glance_registry_config[DEFAULT/cert_file]/ensure: created Info: /Stage[main]/Glance::Registry/Glance_registry_config[DEFAULT/cert_file]: Scheduling refresh of Service[glance-registry] Info: /Stage[main]/Glance::Registry/Glance_registry_config[DEFAULT/cert_file]: Scheduling refresh of Exec[glance-manage db_sync] Notice: /Stage[main]/Neutron::Agents::Metadata/Neutron_metadata_agent_config[DEFAULT/debug]/ensure: created Info: /Stage[main]/Neutron::Agents::Metadata/Neutron_metadata_agent_config[DEFAULT/debug]: Scheduling refresh of Service[neutron-metadata] Notice: /Stage[main]/Neutron::Agents::Dhcp/Neutron_dhcp_agent_config[DEFAULT/debug]/ensure: created Info: /Stage[main]/Neutron::Agents::Dhcp/Neutron_dhcp_agent_config[DEFAULT/debug]: Scheduling refresh of Service[neutron-dhcp-service] 
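The other refresh target named throughout this log, Exec[glance-manage db_sync] (and likewise Exec[neutron-db-sync] and Exec[cinder-manage db_sync]), is refresh-only, so Puppet coalesces the many notifications into a single schema sync per run rather than one per setting. A minimal sketch of that pattern for Glance, assuming the exec/collector style of the puppet-glance module; the command line and service account are illustrative.

exec { 'glance-manage db_sync':
  command     => 'glance-manage db_sync',   # illustrative; real manifests may pass --config-file
  path        => '/usr/bin',
  user        => 'glance',                  # assumed service account
  refreshonly => true,                      # no-op unless a config change notifies it
  logoutput   => on_failure,
}

# Both the api and registry ini files funnel into the same sync exec:
Glance_api_config<||>      ~> Exec['glance-manage db_sync']
Glance_registry_config<||> ~> Exec['glance-manage db_sync']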
Notice: /Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/syslog_log_facility]/ensure: created Info: /Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/syslog_log_facility]: Scheduling refresh of Service[glance-registry] Info: /Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/syslog_log_facility]: Scheduling refresh of Exec[glance-manage db_sync] Notice: /Stage[main]/Openstack_integration/Package[openstack-selinux]/ensure: created Notice: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/backlog]/ensure: created Info: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/backlog]: Scheduling refresh of Service[glance-api] Info: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/backlog]: Scheduling refresh of Exec[glance-manage db_sync] Notice: /Stage[main]/Mysql::Client::Install/Package[mysql_client]/ensure: created Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/auth_strategy]/ensure: created Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/auth_strategy]: Scheduling refresh of Service[neutron-server] Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/auth_strategy]: Scheduling refresh of Exec[neutron-db-sync] Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/auth_strategy]: Scheduling refresh of Service[neutron-metadata] Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/auth_strategy]: Scheduling refresh of Service[neutron-lbaas-service] Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/auth_strategy]: Scheduling refresh of Service[neutron-l3] Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/auth_strategy]: Scheduling refresh of Service[neutron-dhcp-service] Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/auth_strategy]: Scheduling refresh of Service[neutron-metering-service] Notice: /Stage[main]/Mongodb::Server::Install/Package[mongodb_server]/ensure: created Notice: /Stage[main]/Cinder/Package[cinder]/ensure: created Info: /Stage[main]/Cinder/Package[cinder]: Scheduling refresh of Anchor[keystone::service::end] Info: /Stage[main]/Cinder/Package[cinder]: Scheduling refresh of Exec[cinder-manage db_sync] Notice: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_ha_queues]/ensure: created Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_ha_queues]: Scheduling refresh of Service[cinder-api] Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_ha_queues]: Scheduling refresh of Exec[cinder-manage db_sync] Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_ha_queues]: Scheduling refresh of Service[cinder-scheduler] Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_ha_queues]: Scheduling refresh of Service[cinder-volume] Notice: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/amqp_durable_queues]/ensure: created Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/amqp_durable_queues]: Scheduling refresh of Service[cinder-api] Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/amqp_durable_queues]: Scheduling refresh of Exec[cinder-manage db_sync] Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/amqp_durable_queues]: Scheduling refresh of Service[cinder-scheduler] Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/amqp_durable_queues]: Scheduling refresh of Service[cinder-volume] Notice: /Stage[main]/Cinder/Cinder_config[DEFAULT/enable_v2_api]/ensure: created Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/enable_v2_api]: Scheduling refresh of Service[cinder-api] Info: 
/Stage[main]/Cinder/Cinder_config[DEFAULT/enable_v2_api]: Scheduling refresh of Exec[cinder-manage db_sync] Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/enable_v2_api]: Scheduling refresh of Service[cinder-scheduler] Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/enable_v2_api]: Scheduling refresh of Service[cinder-volume] Notice: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/heartbeat_rate]/ensure: created Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/heartbeat_rate]: Scheduling refresh of Service[cinder-api] Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/heartbeat_rate]: Scheduling refresh of Exec[cinder-manage db_sync] Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/heartbeat_rate]: Scheduling refresh of Service[cinder-scheduler] Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/heartbeat_rate]: Scheduling refresh of Service[cinder-volume] Notice: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/admin_password]/ensure: created Info: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/admin_password]: Scheduling refresh of Service[cinder-api] Info: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/admin_password]: Scheduling refresh of Exec[cinder-manage db_sync] Info: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/admin_password]: Scheduling refresh of Service[cinder-scheduler] Info: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/admin_password]: Scheduling refresh of Service[cinder-volume] Notice: /Stage[main]/Cinder/Cinder_config[DEFAULT/api_paste_config]/ensure: created Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/api_paste_config]: Scheduling refresh of Service[cinder-api] Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/api_paste_config]: Scheduling refresh of Exec[cinder-manage db_sync] Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/api_paste_config]: Scheduling refresh of Service[cinder-scheduler] Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/api_paste_config]: Scheduling refresh of Service[cinder-volume] Notice: /Stage[main]/Cinder::Cron::Db_purge/Cron[cinder-manage db purge]/ensure: created Notice: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/default_volume_type]/ensure: created Info: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/default_volume_type]: Scheduling refresh of Service[cinder-api] Info: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/default_volume_type]: Scheduling refresh of Exec[cinder-manage db_sync] Info: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/default_volume_type]: Scheduling refresh of Service[cinder-scheduler] Info: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/default_volume_type]: Scheduling refresh of Service[cinder-volume] Notice: /Stage[main]/Cinder/Cinder_config[DEFAULT/enable_v3_api]/ensure: created Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/enable_v3_api]: Scheduling refresh of Service[cinder-api] Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/enable_v3_api]: Scheduling refresh of Exec[cinder-manage db_sync] Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/enable_v3_api]: Scheduling refresh of Service[cinder-scheduler] Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/enable_v3_api]: Scheduling refresh of Service[cinder-volume] Notice: /Stage[main]/Cinder::Backends/Cinder_config[DEFAULT/enabled_backends]/ensure: created Info: /Stage[main]/Cinder::Backends/Cinder_config[DEFAULT/enabled_backends]: Scheduling refresh of Service[cinder-api] Info: /Stage[main]/Cinder::Backends/Cinder_config[DEFAULT/enabled_backends]: 
Scheduling refresh of Exec[cinder-manage db_sync] Info: /Stage[main]/Cinder::Backends/Cinder_config[DEFAULT/enabled_backends]: Scheduling refresh of Service[cinder-scheduler] Info: /Stage[main]/Cinder::Backends/Cinder_config[DEFAULT/enabled_backends]: Scheduling refresh of Service[cinder-volume] Notice: /Stage[main]/Cinder::Logging/Cinder_config[DEFAULT/debug]/ensure: created Info: /Stage[main]/Cinder::Logging/Cinder_config[DEFAULT/debug]: Scheduling refresh of Service[cinder-api] Info: /Stage[main]/Cinder::Logging/Cinder_config[DEFAULT/debug]: Scheduling refresh of Exec[cinder-manage db_sync] Info: /Stage[main]/Cinder::Logging/Cinder_config[DEFAULT/debug]: Scheduling refresh of Service[cinder-scheduler] Info: /Stage[main]/Cinder::Logging/Cinder_config[DEFAULT/debug]: Scheduling refresh of Service[cinder-volume] Notice: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_host]/ensure: created Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_host]: Scheduling refresh of Service[cinder-api] Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_host]: Scheduling refresh of Exec[cinder-manage db_sync] Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_host]: Scheduling refresh of Service[cinder-scheduler] Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_host]: Scheduling refresh of Service[cinder-volume] Notice: /Stage[main]/Cinder/Cinder_config[DEFAULT/storage_availability_zone]/ensure: created Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/storage_availability_zone]: Scheduling refresh of Service[cinder-api] Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/storage_availability_zone]: Scheduling refresh of Exec[cinder-manage db_sync] Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/storage_availability_zone]: Scheduling refresh of Service[cinder-scheduler] Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/storage_availability_zone]: Scheduling refresh of Service[cinder-volume] Notice: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/nova_catalog_info]/ensure: created Info: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/nova_catalog_info]: Scheduling refresh of Service[cinder-api] Info: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/nova_catalog_info]: Scheduling refresh of Exec[cinder-manage db_sync] Info: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/nova_catalog_info]: Scheduling refresh of Service[cinder-scheduler] Info: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/nova_catalog_info]: Scheduling refresh of Service[cinder-volume] Notice: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_use_ssl]/ensure: created Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_use_ssl]: Scheduling refresh of Service[cinder-api] Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_use_ssl]: Scheduling refresh of Exec[cinder-manage db_sync] Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_use_ssl]: Scheduling refresh of Service[cinder-scheduler] Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_use_ssl]: Scheduling refresh of Service[cinder-volume] Notice: /Stage[main]/Cinder::Logging/Cinder_config[DEFAULT/log_dir]/ensure: created Info: /Stage[main]/Cinder::Logging/Cinder_config[DEFAULT/log_dir]: Scheduling refresh of Service[cinder-api] Info: /Stage[main]/Cinder::Logging/Cinder_config[DEFAULT/log_dir]: Scheduling refresh of Exec[cinder-manage db_sync] Info: /Stage[main]/Cinder::Logging/Cinder_config[DEFAULT/log_dir]: Scheduling refresh of 
Service[cinder-scheduler] Info: /Stage[main]/Cinder::Logging/Cinder_config[DEFAULT/log_dir]: Scheduling refresh of Service[cinder-volume] Notice: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/auth_strategy]/ensure: created Info: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/auth_strategy]: Scheduling refresh of Service[cinder-api] Info: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/auth_strategy]: Scheduling refresh of Exec[cinder-manage db_sync] Info: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/auth_strategy]: Scheduling refresh of Service[cinder-scheduler] Info: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/auth_strategy]: Scheduling refresh of Service[cinder-volume] Notice: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/osapi_volume_listen]/ensure: created Info: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/osapi_volume_listen]: Scheduling refresh of Service[cinder-api] Info: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/osapi_volume_listen]: Scheduling refresh of Exec[cinder-manage db_sync] Info: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/osapi_volume_listen]: Scheduling refresh of Service[cinder-scheduler] Info: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/osapi_volume_listen]: Scheduling refresh of Service[cinder-volume] Notice: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/osapi_volume_workers]/ensure: created Info: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/osapi_volume_workers]: Scheduling refresh of Service[cinder-api] Info: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/osapi_volume_workers]: Scheduling refresh of Exec[cinder-manage db_sync] Info: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/osapi_volume_workers]: Scheduling refresh of Service[cinder-scheduler] Info: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/osapi_volume_workers]: Scheduling refresh of Service[cinder-volume] Notice: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_servers]/ensure: created Info: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_servers]: Scheduling refresh of Service[cinder-api] Info: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_servers]: Scheduling refresh of Exec[cinder-manage db_sync] Info: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_servers]: Scheduling refresh of Service[cinder-scheduler] Info: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_servers]: Scheduling refresh of Service[cinder-volume] Notice: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_hosts]/ensure: created Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_hosts]: Scheduling refresh of Service[cinder-api] Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_hosts]: Scheduling refresh of Exec[cinder-manage db_sync] Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_hosts]: Scheduling refresh of Service[cinder-scheduler] Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_hosts]: Scheduling refresh of Service[cinder-volume] Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/allow_overlapping_ips]/ensure: created Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/allow_overlapping_ips]: Scheduling refresh of Service[neutron-server] Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/allow_overlapping_ips]: Scheduling refresh of Exec[neutron-db-sync] Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/allow_overlapping_ips]: Scheduling refresh of Service[neutron-metadata] Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/allow_overlapping_ips]: Scheduling 
refresh of Service[neutron-lbaas-service] Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/allow_overlapping_ips]: Scheduling refresh of Service[neutron-l3] Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/allow_overlapping_ips]: Scheduling refresh of Service[neutron-dhcp-service] Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/allow_overlapping_ips]: Scheduling refresh of Service[neutron-metering-service] Notice: /Stage[main]/Staging/File[/opt/staging]/ensure: created Notice: /Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/log_file]/ensure: created Info: /Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/log_file]: Scheduling refresh of Service[glance-api] Info: /Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/log_file]: Scheduling refresh of Exec[glance-manage db_sync] Notice: /Stage[main]/Neutron/Neutron_config[oslo_concurrency/lock_path]/ensure: created Info: /Stage[main]/Neutron/Neutron_config[oslo_concurrency/lock_path]: Scheduling refresh of Service[neutron-server] Info: /Stage[main]/Neutron/Neutron_config[oslo_concurrency/lock_path]: Scheduling refresh of Exec[neutron-db-sync] Info: /Stage[main]/Neutron/Neutron_config[oslo_concurrency/lock_path]: Scheduling refresh of Service[neutron-metadata] Info: /Stage[main]/Neutron/Neutron_config[oslo_concurrency/lock_path]: Scheduling refresh of Service[neutron-lbaas-service] Info: /Stage[main]/Neutron/Neutron_config[oslo_concurrency/lock_path]: Scheduling refresh of Service[neutron-l3] Info: /Stage[main]/Neutron/Neutron_config[oslo_concurrency/lock_path]: Scheduling refresh of Service[neutron-dhcp-service] Info: /Stage[main]/Neutron/Neutron_config[oslo_concurrency/lock_path]: Scheduling refresh of Service[neutron-metering-service] Notice: /Stage[main]/Glance::Cache::Logging/Glance_cache_config[DEFAULT/log_dir]/ensure: created Info: /Stage[main]/Glance::Cache::Logging/Glance_cache_config[DEFAULT/log_dir]: Scheduling refresh of Service[glance-api] Info: /Stage[main]/Glance::Cache::Logging/Glance_cache_config[DEFAULT/log_dir]: Scheduling refresh of Exec[glance-manage db_sync] Notice: /Stage[main]/Glance::Cache::Logging/Glance_cache_config[DEFAULT/verbose]/ensure: created Info: /Stage[main]/Glance::Cache::Logging/Glance_cache_config[DEFAULT/verbose]: Scheduling refresh of Service[glance-api] Info: /Stage[main]/Glance::Cache::Logging/Glance_cache_config[DEFAULT/verbose]: Scheduling refresh of Exec[glance-manage db_sync] Notice: /Stage[main]/Glance::Cache::Logging/Glance_cache_config[DEFAULT/log_file]/ensure: created Info: /Stage[main]/Glance::Cache::Logging/Glance_cache_config[DEFAULT/log_file]: Scheduling refresh of Service[glance-api] Info: /Stage[main]/Glance::Cache::Logging/Glance_cache_config[DEFAULT/log_file]: Scheduling refresh of Exec[glance-manage db_sync] Notice: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/registry_client_cert_file]/ensure: created Info: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/registry_client_cert_file]: Scheduling refresh of Service[glance-api] Info: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/registry_client_cert_file]: Scheduling refresh of Exec[glance-manage db_sync] Notice: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/router_scheduler_driver]/ensure: created Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/router_scheduler_driver]: Scheduling refresh of Service[neutron-server] Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/router_scheduler_driver]: Scheduling refresh of Exec[neutron-db-sync] Info: 
/Stage[main]/Neutron::Server/Neutron_config[DEFAULT/router_scheduler_driver]: Scheduling refresh of Service[neutron-metadata] Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/router_scheduler_driver]: Scheduling refresh of Service[neutron-lbaas-service] Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/router_scheduler_driver]: Scheduling refresh of Service[neutron-l3] Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/router_scheduler_driver]: Scheduling refresh of Service[neutron-dhcp-service] Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/router_scheduler_driver]: Scheduling refresh of Service[neutron-metering-service] Notice: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_type]/ensure: created Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_type]: Scheduling refresh of Service[neutron-server] Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_type]: Scheduling refresh of Exec[neutron-db-sync] Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_type]: Scheduling refresh of Service[neutron-metadata] Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_type]: Scheduling refresh of Service[neutron-lbaas-service] Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_type]: Scheduling refresh of Service[neutron-l3] Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_type]: Scheduling refresh of Service[neutron-dhcp-service] Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_type]: Scheduling refresh of Service[neutron-metering-service] Notice: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/auth_type]/ensure: created Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/auth_type]: Scheduling refresh of Service[neutron-server] Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/auth_type]: Scheduling refresh of Exec[neutron-db-sync] Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/auth_type]: Scheduling refresh of Service[neutron-metadata] Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/auth_type]: Scheduling refresh of Service[neutron-lbaas-service] Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/auth_type]: Scheduling refresh of Service[neutron-l3] Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/auth_type]: Scheduling refresh of Service[neutron-dhcp-service] Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/auth_type]: Scheduling refresh of Service[neutron-metering-service] Notice: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/tenant_name]/ensure: created Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/tenant_name]: Scheduling refresh of Service[neutron-server] Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/tenant_name]: Scheduling refresh of Exec[neutron-db-sync] Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/tenant_name]: Scheduling refresh of Service[neutron-metadata] Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/tenant_name]: Scheduling refresh of Service[neutron-lbaas-service] Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/tenant_name]: Scheduling refresh of Service[neutron-l3] Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/tenant_name]: Scheduling refresh of Service[neutron-dhcp-service] Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/tenant_name]: 
Scheduling refresh of Service[neutron-metering-service] Notice: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/project_name]/ensure: created Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/project_name]: Scheduling refresh of Service[neutron-server] Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/project_name]: Scheduling refresh of Exec[neutron-db-sync] Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/project_name]: Scheduling refresh of Service[neutron-metadata] Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/project_name]: Scheduling refresh of Service[neutron-lbaas-service] Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/project_name]: Scheduling refresh of Service[neutron-l3] Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/project_name]: Scheduling refresh of Service[neutron-dhcp-service] Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/project_name]: Scheduling refresh of Service[neutron-metering-service] Notice: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/password]/ensure: created Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/password]: Scheduling refresh of Service[neutron-server] Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/password]: Scheduling refresh of Exec[neutron-db-sync] Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/password]: Scheduling refresh of Service[neutron-metadata] Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/password]: Scheduling refresh of Service[neutron-lbaas-service] Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/password]: Scheduling refresh of Service[neutron-l3] Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/password]: Scheduling refresh of Service[neutron-dhcp-service] Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/password]: Scheduling refresh of Service[neutron-metering-service] Notice: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/username]/ensure: created Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/username]: Scheduling refresh of Service[neutron-server] Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/username]: Scheduling refresh of Exec[neutron-db-sync] Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/username]: Scheduling refresh of Service[neutron-metadata] Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/username]: Scheduling refresh of Service[neutron-lbaas-service] Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/username]: Scheduling refresh of Service[neutron-l3] Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/username]: Scheduling refresh of Service[neutron-dhcp-service] Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/username]: Scheduling refresh of Service[neutron-metering-service] Notice: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/rpc_workers]/ensure: created Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/rpc_workers]: Scheduling refresh of Service[neutron-server] Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/rpc_workers]: Scheduling refresh of Exec[neutron-db-sync] Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/rpc_workers]: Scheduling refresh of Service[neutron-metadata] Info: 
/Stage[main]/Neutron::Server/Neutron_config[DEFAULT/rpc_workers]: Scheduling refresh of Service[neutron-lbaas-service] Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/rpc_workers]: Scheduling refresh of Service[neutron-l3] Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/rpc_workers]: Scheduling refresh of Service[neutron-dhcp-service] Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/rpc_workers]: Scheduling refresh of Service[neutron-metering-service] Notice: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_uri]/ensure: created Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_uri]: Scheduling refresh of Service[neutron-server] Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_uri]: Scheduling refresh of Exec[neutron-db-sync] Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_uri]: Scheduling refresh of Service[neutron-metadata] Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_uri]: Scheduling refresh of Service[neutron-lbaas-service] Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_uri]: Scheduling refresh of Service[neutron-l3] Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_uri]: Scheduling refresh of Service[neutron-dhcp-service] Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_uri]: Scheduling refresh of Service[neutron-metering-service] Notice: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/api_workers]/ensure: created Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/api_workers]: Scheduling refresh of Service[neutron-server] Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/api_workers]: Scheduling refresh of Exec[neutron-db-sync] Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/api_workers]: Scheduling refresh of Service[neutron-metadata] Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/api_workers]: Scheduling refresh of Service[neutron-lbaas-service] Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/api_workers]: Scheduling refresh of Service[neutron-l3] Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/api_workers]: Scheduling refresh of Service[neutron-dhcp-service] Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/api_workers]: Scheduling refresh of Service[neutron-metering-service] Notice: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/admin_tenant_name]/ensure: created Info: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/admin_tenant_name]: Scheduling refresh of Service[cinder-api] Info: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/admin_tenant_name]: Scheduling refresh of Exec[cinder-manage db_sync] Info: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/admin_tenant_name]: Scheduling refresh of Service[cinder-scheduler] Info: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/admin_tenant_name]: Scheduling refresh of Service[cinder-volume] Notice: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/registry_client_protocol]/ensure: created Info: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/registry_client_protocol]: Scheduling refresh of Service[glance-api] Info: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/registry_client_protocol]: Scheduling refresh of Exec[glance-manage db_sync] Notice: /Stage[main]/Cinder/Cinder_config[oslo_concurrency/lock_path]/ensure: created Info: /Stage[main]/Cinder/Cinder_config[oslo_concurrency/lock_path]: 
Scheduling refresh of Service[cinder-api] Info: /Stage[main]/Cinder/Cinder_config[oslo_concurrency/lock_path]: Scheduling refresh of Exec[cinder-manage db_sync] Info: /Stage[main]/Cinder/Cinder_config[oslo_concurrency/lock_path]: Scheduling refresh of Service[cinder-scheduler] Info: /Stage[main]/Cinder/Cinder_config[oslo_concurrency/lock_path]: Scheduling refresh of Service[cinder-volume] Notice: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/swift_store_create_container_on_put]/ensure: created Info: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/swift_store_create_container_on_put]: Scheduling refresh of Service[glance-api] Info: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/swift_store_create_container_on_put]: Scheduling refresh of Exec[glance-manage db_sync] Notice: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/swift_store_endpoint_type]/ensure: created Info: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/swift_store_endpoint_type]: Scheduling refresh of Service[glance-api] Info: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/swift_store_endpoint_type]: Scheduling refresh of Exec[glance-manage db_sync] Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/debug]/ensure: created Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/debug]: Scheduling refresh of Service[neutron-server] Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/debug]: Scheduling refresh of Exec[neutron-db-sync] Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/debug]: Scheduling refresh of Service[neutron-metadata] Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/debug]: Scheduling refresh of Service[neutron-lbaas-service] Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/debug]: Scheduling refresh of Service[neutron-l3] Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/debug]: Scheduling refresh of Service[neutron-dhcp-service] Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/debug]: Scheduling refresh of Service[neutron-metering-service] Notice: /Stage[main]/Ironic/Package[ironic-common]/ensure: created Info: /Stage[main]/Ironic/Package[ironic-common]: Scheduling refresh of Anchor[keystone::service::end] Info: /Stage[main]/Ironic/Package[ironic-common]: Scheduling refresh of Exec[ironic-dbsync] Notice: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_virtual_host]/ensure: created Info: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_virtual_host]: Scheduling refresh of Service[httpd] Info: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_virtual_host]: Scheduling refresh of Service[ironic-conductor] Notice: /Stage[main]/Ironic/Ironic_config[glance/glance_api_insecure]/ensure: created Info: /Stage[main]/Ironic/Ironic_config[glance/glance_api_insecure]: Scheduling refresh of Service[httpd] Info: /Stage[main]/Ironic/Ironic_config[glance/glance_api_insecure]: Scheduling refresh of Service[ironic-conductor] Notice: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/kombu_ssl_version]/ensure: created Info: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/kombu_ssl_version]: Scheduling refresh of Service[httpd] Info: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/kombu_ssl_version]: Scheduling refresh of Service[ironic-conductor] Notice: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_ha_queues]/ensure: created Info: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_ha_queues]: Scheduling refresh of 
Service[httpd] Info: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_ha_queues]: Scheduling refresh of Service[ironic-conductor] Notice: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_port]/ensure: created Info: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_port]: Scheduling refresh of Service[httpd] Info: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_port]: Scheduling refresh of Service[ironic-conductor] Notice: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_userid]/ensure: created Info: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_userid]: Scheduling refresh of Service[httpd] Info: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_userid]: Scheduling refresh of Service[ironic-conductor] Notice: /Stage[main]/Ironic/Ironic_config[DEFAULT/enabled_drivers]/ensure: created Info: /Stage[main]/Ironic/Ironic_config[DEFAULT/enabled_drivers]: Scheduling refresh of Service[httpd] Info: /Stage[main]/Ironic/Ironic_config[DEFAULT/enabled_drivers]: Scheduling refresh of Service[ironic-conductor] Notice: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_host]/ensure: created Info: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_host]: Scheduling refresh of Service[httpd] Info: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_host]: Scheduling refresh of Service[ironic-conductor] Notice: /Stage[main]/Ironic/Ironic_config[DEFAULT/auth_strategy]/ensure: created Info: /Stage[main]/Ironic/Ironic_config[DEFAULT/auth_strategy]: Scheduling refresh of Service[httpd] Info: /Stage[main]/Ironic/Ironic_config[DEFAULT/auth_strategy]: Scheduling refresh of Service[ironic-conductor] Notice: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/amqp_durable_queues]/ensure: created Info: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/amqp_durable_queues]: Scheduling refresh of Service[httpd] Info: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/amqp_durable_queues]: Scheduling refresh of Service[ironic-conductor] Notice: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_use_ssl]/ensure: created Info: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_use_ssl]: Scheduling refresh of Service[httpd] Info: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_use_ssl]: Scheduling refresh of Service[ironic-conductor] Notice: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_use_ssl]/ensure: created Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_use_ssl]: Scheduling refresh of Service[neutron-server] Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_use_ssl]: Scheduling refresh of Exec[neutron-db-sync] Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_use_ssl]: Scheduling refresh of Service[neutron-metadata] Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_use_ssl]: Scheduling refresh of Service[neutron-lbaas-service] Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_use_ssl]: Scheduling refresh of Service[neutron-l3] Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_use_ssl]: Scheduling refresh of Service[neutron-dhcp-service] Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_use_ssl]: Scheduling refresh of Service[neutron-metering-service] Notice: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_password]/ensure: created Info: 
/Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_password]: Scheduling refresh of Service[httpd] Info: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_password]: Scheduling refresh of Service[ironic-conductor] Notice: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/cert_file]/ensure: created Info: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/cert_file]: Scheduling refresh of Service[glance-api] Info: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/cert_file]: Scheduling refresh of Exec[glance-manage db_sync] Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/verbose]/ensure: created Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/verbose]: Scheduling refresh of Service[neutron-server] Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/verbose]: Scheduling refresh of Exec[neutron-db-sync] Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/verbose]: Scheduling refresh of Service[neutron-metadata] Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/verbose]: Scheduling refresh of Service[neutron-lbaas-service] Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/verbose]: Scheduling refresh of Service[neutron-l3] Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/verbose]: Scheduling refresh of Service[neutron-dhcp-service] Info: /Stage[main]/Neutron/Neutron_config[DEFAULT/verbose]: Scheduling refresh of Service[neutron-metering-service] Notice: /Stage[main]/Glance::Registry/Glance_registry_config[paste_deploy/flavor]/ensure: created Info: /Stage[main]/Glance::Registry/Glance_registry_config[paste_deploy/flavor]: Scheduling refresh of Service[glance-registry] Info: /Stage[main]/Glance::Registry/Glance_registry_config[paste_deploy/flavor]: Scheduling refresh of Exec[glance-manage db_sync] Notice: /Stage[main]/Apache::Mod::Mime/Package[mailcap]/ensure: created Notice: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/workers]/ensure: created Info: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/workers]: Scheduling refresh of Service[glance-api] Info: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/workers]: Scheduling refresh of Exec[glance-manage db_sync] Notice: /Stage[main]/Cinder/Cinder_config[DEFAULT/default_availability_zone]/ensure: created Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/default_availability_zone]: Scheduling refresh of Service[cinder-api] Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/default_availability_zone]: Scheduling refresh of Exec[cinder-manage db_sync] Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/default_availability_zone]: Scheduling refresh of Service[cinder-scheduler] Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/default_availability_zone]: Scheduling refresh of Service[cinder-volume] Notice: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_password]/ensure: created Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_password]: Scheduling refresh of Service[glance-api] Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_password]: Scheduling refresh of Exec[glance-manage db_sync] Notice: /Stage[main]/Glance::Api/Glance_cache_config[DEFAULT/admin_password]/ensure: created Info: /Stage[main]/Glance::Api/Glance_cache_config[DEFAULT/admin_password]: Scheduling refresh of Service[glance-api] Info: /Stage[main]/Glance::Api/Glance_cache_config[DEFAULT/admin_password]: Scheduling refresh of Exec[glance-manage db_sync] Notice: /Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/use_syslog]/ensure: 
created Info: /Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/use_syslog]: Scheduling refresh of Service[glance-api] Info: /Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/use_syslog]: Scheduling refresh of Exec[glance-manage db_sync] Notice: /Stage[main]/Ironic::Client/Package[python-ironicclient]/ensure: created Info: /Stage[main]/Ironic::Client/Package[python-ironicclient]: Scheduling refresh of Anchor[keystone::service::end] Notice: /Stage[main]/Tempest/Tempest_config[identity/admin_username]/ensure: created Notice: /Stage[main]/Tempest/Tempest_config[service_available/zaqar]/ensure: created Notice: /Stage[main]/Tempest/Tempest_config[network/public_router_id]/ensure: created Notice: /Stage[main]/Tempest/Package[openssl-devel]/ensure: created Notice: /Stage[main]/Tempest/Tempest_config[identity/auth_version]/ensure: created Notice: /Stage[main]/Tempest/Tempest_config[service_available/sahara]/ensure: created Notice: /Stage[main]/Tempest/Tempest_config[identity/uri_v3]/ensure: created Notice: /Stage[main]/Tempest/Tempest_config[DEFAULT/use_syslog]/ensure: created Notice: /Stage[main]/Tempest/Tempest_config[service_available/swift]/ensure: created Notice: /Stage[main]/Tempest/Tempest_config[identity/admin_tenant_name]/ensure: created Notice: /Stage[main]/Tempest/Tempest_config[scenario/img_dir]/ensure: created Notice: /Stage[main]/Tempest/Tempest_config[compute/flavor_ref_alt]/ensure: created Notice: /Stage[main]/Tempest/Package[libffi-devel]/ensure: created Notice: /Stage[main]/Tempest/Tempest_config[scenario/img_file]/ensure: created Notice: /Stage[main]/Tempest/Tempest_config[DEFAULT/log_file]/ensure: created Notice: /Stage[main]/Tempest/Tempest_config[dashboard/dashboard_url]/ensure: created Notice: /Stage[main]/Tempest/Tempest_config[service_available/cinder]/ensure: created Notice: /Stage[main]/Tempest/Tempest_config[service_available/ironic]/ensure: created Notice: /Stage[main]/Tempest/Exec[install-pip]/returns: executed successfully Notice: /Stage[main]/Tempest/Tempest_config[service_available/heat]/ensure: created Notice: /Stage[main]/Tempest/Tempest_config[identity-feature-enabled/api_v2]/ensure: created Notice: /Stage[main]/Tempest/Tempest_config[identity/ca_certificates_file]/ensure: created Notice: /Stage[main]/Tempest/Tempest_config[service_available/trove]/ensure: created Notice: /Stage[main]/Tempest/Tempest_config[service_available/murano]/ensure: created Notice: /Stage[main]/Tempest/Tempest_config[compute/flavor_ref]/ensure: created Notice: /Stage[main]/Tempest/Tempest_config[identity/admin_password]/ensure: created Notice: /Stage[main]/Tempest/Tempest_config[service_available/ceilometer]/ensure: created Notice: /Stage[main]/Tempest/Tempest_config[oslo_concurrency/lock_path]/ensure: created Notice: /Stage[main]/Tempest/Tempest_config[service_available/glance]/ensure: created Notice: /Stage[main]/Tempest/Exec[install-tox]/returns: executed successfully Notice: /Stage[main]/Tempest/Tempest_config[service_available/nova]/ensure: created Notice: /Stage[main]/Neutron::Agents::Dhcp/Neutron_dhcp_agent_config[DEFAULT/root_helper]/ensure: created Info: /Stage[main]/Neutron::Agents::Dhcp/Neutron_dhcp_agent_config[DEFAULT/root_helper]: Scheduling refresh of Service[neutron-dhcp-service] Notice: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_hosts]/ensure: created Info: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_hosts]: Scheduling refresh of Service[httpd] Info: 
Info: /Stage[main]/Ironic/Ironic_config[oslo_messaging_rabbit/rabbit_hosts]: Scheduling refresh of Service[ironic-conductor]
Notice: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/auth_uri]/ensure: created
Info: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/auth_uri]: Scheduling refresh of Service[cinder-api]
Info: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/auth_uri]: Scheduling refresh of Exec[cinder-manage db_sync]
Info: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/auth_uri]: Scheduling refresh of Service[cinder-scheduler]
Info: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/auth_uri]: Scheduling refresh of Service[cinder-volume]
Notice: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/registry_host]/ensure: created
Info: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/registry_host]: Scheduling refresh of Service[glance-api]
Info: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/registry_host]: Scheduling refresh of Exec[glance-manage db_sync]
Notice: /Stage[main]/Openstack_integration::Glance/Openstack_integration::Ssl_key[glance]/File[/etc/glance/ssl]/ensure: created
Notice: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/bind_port]/ensure: created
Info: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/bind_port]: Scheduling refresh of Service[glance-api]
Info: /Stage[main]/Glance::Api/Glance_api_config[DEFAULT/bind_port]: Scheduling refresh of Exec[glance-manage db_sync]
Notice: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_hosts]/ensure: created
Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_hosts]: Scheduling refresh of Service[glance-api]
Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_hosts]: Scheduling refresh of Exec[glance-manage db_sync]
Notice: /Stage[main]/Glance::Backend::Swift/Glance_swift_config[ref1/auth_version]/ensure: created
Notice: /Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/syslog_log_facility]/ensure: created
Info: /Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/syslog_log_facility]: Scheduling refresh of Service[glance-api]
Info: /Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/syslog_log_facility]: Scheduling refresh of Exec[glance-manage db_sync]
Notice: /Stage[main]/Neutron::Agents::Dhcp/Neutron_dhcp_agent_config[DEFAULT/interface_driver]/ensure: created
Info: /Stage[main]/Neutron::Agents::Dhcp/Neutron_dhcp_agent_config[DEFAULT/interface_driver]: Scheduling refresh of Service[neutron-dhcp-service]
Notice: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_password]/ensure: created
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_password]: Scheduling refresh of Service[neutron-server]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_password]: Scheduling refresh of Exec[neutron-db-sync]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_password]: Scheduling refresh of Service[neutron-metadata]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_password]: Scheduling refresh of Service[neutron-lbaas-service]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_password]: Scheduling refresh of Service[neutron-l3]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_password]: Scheduling refresh of Service[neutron-dhcp-service]
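
One rabbit_password key refreshing seven neutron services is the fan-out you get when the relationship is declared once over a collector instead of per key; roughly how the modules wire it (a sketch of the pattern, not the modules' literal code):

    # every Neutron_config resource notifies every neutron service,
    # so a single changed key schedules all of these refreshes
    Neutron_config<||> ~> Service['neutron-server', 'neutron-metadata',
      'neutron-l3', 'neutron-dhcp-service',
      'neutron-metering-service', 'neutron-lbaas-service']
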
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_password]: Scheduling refresh of Service[neutron-metering-service]
Notice: /Stage[main]/Tempest/Tempest_config[identity-feature-enabled/api_v3]/ensure: created
Notice: /Stage[main]/Tempest/Tempest_config[compute/image_ssh_user]/ensure: created
Notice: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_ha_queues]/ensure: created
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_ha_queues]: Scheduling refresh of Service[neutron-server]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_ha_queues]: Scheduling refresh of Exec[neutron-db-sync]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_ha_queues]: Scheduling refresh of Service[neutron-metadata]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_ha_queues]: Scheduling refresh of Service[neutron-lbaas-service]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_ha_queues]: Scheduling refresh of Service[neutron-l3]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_ha_queues]: Scheduling refresh of Service[neutron-dhcp-service]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_ha_queues]: Scheduling refresh of Service[neutron-metering-service]
Notice: /Stage[main]/Glance::Api/Glance_api_config[keystone_authtoken/auth_uri]/ensure: created
Info: /Stage[main]/Glance::Api/Glance_api_config[keystone_authtoken/auth_uri]: Scheduling refresh of Service[glance-api]
Info: /Stage[main]/Glance::Api/Glance_api_config[keystone_authtoken/auth_uri]: Scheduling refresh of Exec[glance-manage db_sync]
Notice: /Stage[main]/Tempest/Tempest_config[compute/image_alt_ssh_user]/ensure: created
Notice: /Stage[main]/Neutron::Agents::Dhcp/Neutron_dhcp_agent_config[DEFAULT/dhcp_driver]/ensure: created
Info: /Stage[main]/Neutron::Agents::Dhcp/Neutron_dhcp_agent_config[DEFAULT/dhcp_driver]: Scheduling refresh of Service[neutron-dhcp-service]
Notice: /Stage[main]/Neutron::Server/Neutron_api_config[filter:authtoken/auth_uri]/ensure: created
Info: /Stage[main]/Neutron::Server/Neutron_api_config[filter:authtoken/auth_uri]: Scheduling refresh of Service[neutron-server]
Notice: /Stage[main]/Neutron::Services::Fwaas/Package[neutron-fwaas]/ensure: created
Info: /Stage[main]/Neutron::Services::Fwaas/Package[neutron-fwaas]: Scheduling refresh of Anchor[keystone::service::end]
Notice: /Stage[main]/Neutron::Services::Fwaas/Neutron_fwaas_service_config[fwaas/driver]/ensure: created
Info: /Stage[main]/Neutron::Services::Fwaas/Neutron_fwaas_service_config[fwaas/driver]: Scheduling refresh of Service[neutron-l3]
Notice: /Stage[main]/Openstacklib::Openstackclient/Package[python-openstackclient]/ensure: created
Info: /Stage[main]/Openstacklib::Openstackclient/Package[python-openstackclient]: Scheduling refresh of Anchor[keystone::service::end]
Notice: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/volume_driver]/ensure: created
Info: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/volume_driver]: Scheduling refresh of Service[cinder-api]
Info: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/volume_driver]: Scheduling refresh of Exec[cinder-manage db_sync]
Info: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/volume_driver]: Scheduling refresh of Service[cinder-scheduler]
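
The Cinder::Backend::Iscsi[BACKEND_1] resources come from the puppet-cinder backend define, which expands one declaration into the per-backend keys (volume_driver, volumes_dir, iscsi_ip_address, ...) seen in this run. A sketch with assumed parameter values:

    cinder::backend::iscsi { 'BACKEND_1':
      iscsi_ip_address => '127.0.0.1',                 # assumption: single-node layout
      iscsi_helper     => 'lioadm',                    # consistent with Package[targetcli] below
      volumes_dir      => '/var/lib/cinder/volumes',   # placeholder path
    }
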
Info: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/volume_driver]: Scheduling refresh of Service[cinder-volume]
Notice: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Package[targetcli]/ensure: created
Notice: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/volumes_dir]/ensure: created
Info: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/volumes_dir]: Scheduling refresh of Service[cinder-api]
Info: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/volumes_dir]: Scheduling refresh of Exec[cinder-manage db_sync]
Info: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/volumes_dir]: Scheduling refresh of Service[cinder-scheduler]
Info: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/volumes_dir]: Scheduling refresh of Service[cinder-volume]
Notice: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/iscsi_ip_address]/ensure: created
Info: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/iscsi_ip_address]: Scheduling refresh of Service[cinder-api]
Info: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/iscsi_ip_address]: Scheduling refresh of Exec[cinder-manage db_sync]
Info: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/iscsi_ip_address]: Scheduling refresh of Service[cinder-scheduler]
Info: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/iscsi_ip_address]: Scheduling refresh of Service[cinder-volume]
Notice: /Stage[main]/Neutron::Agents::Metering/Neutron_metering_agent_config[DEFAULT/debug]/ensure: created
Info: /Stage[main]/Neutron::Agents::Metering/Neutron_metering_agent_config[DEFAULT/debug]: Scheduling refresh of Service[neutron-metering-service]
Notice: /Stage[main]/Tempest/Tempest_config[DEFAULT/use_stderr]/ensure: created
Notice: /Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/log_file]/ensure: created
Info: /Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/log_file]: Scheduling refresh of Service[glance-registry]
Info: /Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/log_file]: Scheduling refresh of Exec[glance-manage db_sync]
Notice: /Stage[main]/Tempest/Tempest_config[service_available/horizon]/ensure: created
Notice: /Stage[main]/Glance::Api/Glance_api_config[keystone_authtoken/admin_user]/ensure: created
Info: /Stage[main]/Glance::Api/Glance_api_config[keystone_authtoken/admin_user]: Scheduling refresh of Service[glance-api]
Info: /Stage[main]/Glance::Api/Glance_api_config[keystone_authtoken/admin_user]: Scheduling refresh of Exec[glance-manage db_sync]
Notice: /Stage[main]/Vswitch::Ovs/Package[openvswitch]/ensure: created
Notice: /Stage[main]/Vswitch::Ovs/Service[openvswitch]/ensure: ensure changed 'stopped' to 'running'
Info: /Stage[main]/Vswitch::Ovs/Service[openvswitch]: Unscheduling refresh on Service[openvswitch]
Notice: /Stage[main]/Xinetd/File[/etc/xinetd.conf]/content:
--- /etc/xinetd.conf 2014-06-09 19:55:06.000000000 +0100
+++ /tmp/puppet-file20160520-26469-qo45km 2016-05-20 12:26:52.098856758 +0100
@@ -1,3 +1,5 @@
+# This file is being maintained by Puppet.
+# DO NOT EDIT
 #
 # This is the master xinetd configuration file. Settings in the
 # default section will be inherited by all service configurations
@@ -10,41 +12,40 @@
 # The next two items are intended to be a quick access place to
 # temporarily enable or disable services.
 #
-# enabled =
-# disabled =
+# enabled =
+# disabled =
 # Define general logging characteristics.
- log_type = SYSLOG daemon info
- log_on_failure = HOST
- log_on_success = PID HOST DURATION EXIT
+ log_type = SYSLOG daemon info
+ log_on_failure = HOST
+ log_on_success = PID HOST DURATION EXIT
 # Define access restriction defaults
 #
-# no_access =
-# only_from =
-# max_load = 0
- cps = 50 10
- instances = 50
- per_source = 10
+# no_access =
+# only_from =
+# max_load = 0
+ cps = 50 10
+ instances = 50
+ per_source = 10
 # Address and networking defaults
 #
-# bind =
-# mdns = yes
- v6only = no
+# bind =
+# mdns = yes
+ v6only = no
 # setup environmental attributes
 #
-# passenv =
- groups = yes
- umask = 002
+# passenv =
+ groups = yes
+ umask = 002
 # Generally, banners are not used. This sets up their global defaults
 #
-# banner =
-# banner_fail =
-# banner_success =
+# banner =
+# banner_fail =
+# banner_success =
 }
 includedir /etc/xinetd.d
-
Info: /Stage[main]/Xinetd/File[/etc/xinetd.conf]: Filebucketed /etc/xinetd.conf to puppet with sum 9ff8cc688dd9f0dfc45e5afd25c427a7
Notice: /Stage[main]/Xinetd/File[/etc/xinetd.conf]/content: content changed '{md5}9ff8cc688dd9f0dfc45e5afd25c427a7' to '{md5}011e3163044bef3aa02a664f3785d30c'
Notice: /Stage[main]/Xinetd/File[/etc/xinetd.conf]/mode: mode changed '0600' to '0644'
Info: /Stage[main]/Xinetd/File[/etc/xinetd.conf]: Scheduling refresh of Service[xinetd]
Info: /Stage[main]/Xinetd/File[/etc/xinetd.conf]: Scheduling refresh of Service[xinetd]
Notice: /Stage[main]/Mysql::Server::Install/Package[mysql-server]/ensure: created
Notice: /Stage[main]/Mysql::Server::Config/File[mysql-config-file]/ensure: defined content as '{md5}ff09a4033f718f08f69da17f0aa86652'
Notice: /Stage[main]/Mysql::Server::Installdb/Exec[mysql_install_db]/returns: executed successfully
Notice: /File[/var/log/mariadb/mariadb.log]/seluser: seluser changed 'unconfined_u' to 'system_u'
Notice: /Stage[main]/Mysql::Server::Service/Service[mysqld]/ensure: ensure changed 'stopped' to 'running'
Info: /Stage[main]/Mysql::Server::Service/Service[mysqld]: Unscheduling refresh on Service[mysqld]
Notice: /Stage[main]/Neutron::Db::Mysql/Openstacklib::Db::Mysql[neutron]/Mysql_database[neutron]/ensure: created
Notice: /Stage[main]/Neutron::Db::Mysql/Openstacklib::Db::Mysql[neutron]/Openstacklib::Db::Mysql::Host_access[neutron_127.0.0.1]/Mysql_user[neutron@127.0.0.1]/ensure: created
Notice: /Stage[main]/Glance::Db::Mysql/Openstacklib::Db::Mysql[glance]/Mysql_database[glance]/ensure: created
Notice: /Stage[main]/Cinder::Db::Mysql/Openstacklib::Db::Mysql[cinder]/Mysql_database[cinder]/ensure: created
Notice: /Stage[main]/Tempest/Tempest_config[service_available/aodh]/ensure: created
Notice: /Stage[main]/Cinder::Logging/Cinder_config[DEFAULT/verbose]/ensure: created
Info: /Stage[main]/Cinder::Logging/Cinder_config[DEFAULT/verbose]: Scheduling refresh of Service[cinder-api]
Info: /Stage[main]/Cinder::Logging/Cinder_config[DEFAULT/verbose]: Scheduling refresh of Exec[cinder-manage db_sync]
Info: /Stage[main]/Cinder::Logging/Cinder_config[DEFAULT/verbose]: Scheduling refresh of Service[cinder-scheduler]
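
The Mysql_database/Mysql_user/Mysql_grant trio for each service is produced by the openstacklib::db::mysql define, one Host_access per allowed host. A sketch of its shape, with a placeholder password (the real credentials come from the scenario):

    openstacklib::db::mysql { 'cinder':
      user          => 'cinder',
      password_hash => mysql_password('a_placeholder_password'),  # from puppetlabs-mysql
      allowed_hosts => ['127.0.0.1'],  # each host yields one Mysql_user + Mysql_grant
    }
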
Info: /Stage[main]/Cinder::Logging/Cinder_config[DEFAULT/verbose]: Scheduling refresh of Service[cinder-volume]
Notice: /Stage[main]/Tempest/Tempest_config[identity/uri]/ensure: created
Notice: /Stage[main]/Tempest/Tempest_config[identity/admin_domain_name]/ensure: created
Notice: /Stage[main]/Ironic/Ironic_config[DEFAULT/rpc_backend]/ensure: created
Info: /Stage[main]/Ironic/Ironic_config[DEFAULT/rpc_backend]: Scheduling refresh of Service[httpd]
Info: /Stage[main]/Ironic/Ironic_config[DEFAULT/rpc_backend]: Scheduling refresh of Service[ironic-conductor]
Notice: /Stage[main]/Cinder/Cinder_config[DEFAULT/control_exchange]/ensure: created
Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/control_exchange]: Scheduling refresh of Service[cinder-api]
Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/control_exchange]: Scheduling refresh of Exec[cinder-manage db_sync]
Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/control_exchange]: Scheduling refresh of Service[cinder-scheduler]
Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/control_exchange]: Scheduling refresh of Service[cinder-volume]
Notice: /Stage[main]/Neutron::Services::Fwaas/Neutron_fwaas_service_config[fwaas/enabled]/ensure: created
Info: /Stage[main]/Neutron::Services::Fwaas/Neutron_fwaas_service_config[fwaas/enabled]: Scheduling refresh of Service[neutron-l3]
Notice: /Stage[main]/Ironic::Logging/Ironic_config[DEFAULT/log_dir]/ensure: created
Info: /Stage[main]/Ironic::Logging/Ironic_config[DEFAULT/log_dir]: Scheduling refresh of Service[httpd]
Info: /Stage[main]/Ironic::Logging/Ironic_config[DEFAULT/log_dir]: Scheduling refresh of Service[ironic-conductor]
Notice: /Stage[main]/Cinder/Cinder_config[DEFAULT/enable_v1_api]/ensure: created
Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/enable_v1_api]: Scheduling refresh of Service[cinder-api]
Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/enable_v1_api]: Scheduling refresh of Exec[cinder-manage db_sync]
Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/enable_v1_api]: Scheduling refresh of Service[cinder-scheduler]
Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/enable_v1_api]: Scheduling refresh of Service[cinder-volume]
Notice: /Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/use_stderr]/ensure: created
Info: /Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/use_stderr]: Scheduling refresh of Service[glance-registry]
Info: /Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/use_stderr]: Scheduling refresh of Exec[glance-manage db_sync]
Notice: /Stage[main]/Nova/Package[python-nova]/ensure: created
Info: /Stage[main]/Nova/Package[python-nova]: Scheduling refresh of Anchor[keystone::service::end]
Info: /Stage[main]/Nova/Package[python-nova]: Scheduling refresh of Anchor[nova::install::end]
Notice: /Stage[main]/Nova::Consoleauth/Nova::Generic_service[consoleauth]/Package[nova-consoleauth]/ensure: created
Info: /Stage[main]/Nova::Consoleauth/Nova::Generic_service[consoleauth]/Package[nova-consoleauth]: Scheduling refresh of Anchor[keystone::service::end]
Info: /Stage[main]/Nova::Consoleauth/Nova::Generic_service[consoleauth]/Package[nova-consoleauth]: Scheduling refresh of Anchor[nova::install::end]
Notice: /Stage[main]/Nova::Compute/Package[genisoimage]/ensure: created
Info: /Stage[main]/Nova::Compute/Package[genisoimage]: Scheduling refresh of Anchor[keystone::service::end]
Notice: /Stage[main]/Openstack_integration::Nova/Openstack_integration::Ssl_key[nova]/File[/etc/nova/ssl]/ensure: created
Notice: /Stage[main]/Nova::Compute::Libvirt/Package[libvirt-nwfilter]/ensure: created
Info: /Stage[main]/Nova::Compute::Libvirt/Package[libvirt-nwfilter]: Scheduling refresh of Anchor[keystone::service::end]
Notice: /Stage[main]/Nova::Compute::Libvirt/Package[libvirt]/ensure: created
Info: /Stage[main]/Nova::Compute::Libvirt/Package[libvirt]: Scheduling refresh of Anchor[keystone::service::end]
Notice: /Stage[main]/Openstack_integration::Nova/Openstack_integration::Ssl_key[nova]/File[/etc/nova/ssl/private]/ensure: created
Notice: /Stage[main]/Openstack_integration::Nova/Openstack_integration::Ssl_key[nova]/File[/etc/nova/ssl/private/n2.dusty.ci.centos.org.pem]/ensure: defined content as '{md5}a4b1e9413d7b63f33794716d77cfaa00'
Info: Openstack_integration::Ssl_key[nova]: Scheduling refresh of Service[httpd]
Notice: /Stage[main]/Glance::Db::Mysql/Openstacklib::Db::Mysql[glance]/Openstacklib::Db::Mysql::Host_access[glance_127.0.0.1]/Mysql_user[glance@127.0.0.1]/ensure: created
Notice: /Stage[main]/Glance::Db::Mysql/Openstacklib::Db::Mysql[glance]/Openstacklib::Db::Mysql::Host_access[glance_127.0.0.1]/Mysql_grant[glance@127.0.0.1/glance.*]/ensure: created
Notice: /Stage[main]/Openstack_integration::Glance/Openstack_integration::Ssl_key[glance]/File[/etc/glance/ssl/private]/ensure: created
Notice: /Stage[main]/Openstack_integration::Glance/Openstack_integration::Ssl_key[glance]/File[/etc/glance/ssl/private/n2.dusty.ci.centos.org.pem]/ensure: defined content as '{md5}a4b1e9413d7b63f33794716d77cfaa00'
Info: Openstack_integration::Ssl_key[glance]: Scheduling refresh of Service[glance-api]
Info: Openstack_integration::Ssl_key[glance]: Scheduling refresh of Service[glance-registry]
Notice: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_port]/ensure: created
Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_port]: Scheduling refresh of Service[cinder-api]
Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_port]: Scheduling refresh of Exec[cinder-manage db_sync]
Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_port]: Scheduling refresh of Service[cinder-scheduler]
Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_port]: Scheduling refresh of Service[cinder-volume]
Notice: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_host]/ensure: created
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_host]: Scheduling refresh of Service[neutron-server]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_host]: Scheduling refresh of Exec[neutron-db-sync]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_host]: Scheduling refresh of Service[neutron-metadata]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_host]: Scheduling refresh of Service[neutron-lbaas-service]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_host]: Scheduling refresh of Service[neutron-l3]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_host]: Scheduling refresh of Service[neutron-dhcp-service]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_host]: Scheduling refresh of Service[neutron-metering-service]
Notice: /Stage[main]/Glance::Registry/Glance_registry_config[keystone_authtoken/auth_uri]/ensure: created
Info: /Stage[main]/Glance::Registry/Glance_registry_config[keystone_authtoken/auth_uri]: Scheduling refresh of Service[glance-registry]
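
Openstack_integration::Ssl_key is a define from the integration manifests; the log shows it creating an ssl/private directory tree, dropping a per-host PEM, and notifying the consuming services. A hypothetical reconstruction of its shape (the variable names and PEM source are assumptions, not the define's actual code):

    file { '/etc/glance/ssl/private':
      ensure => directory,
    }
    -> file { "/etc/glance/ssl/private/${::fqdn}.pem":
      ensure  => file,
      content => $cert_and_key_pem,  # assumption: PEM material supplied by the CI scenario
      notify  => Service['glance-api', 'glance-registry'],
    }
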
Info: /Stage[main]/Glance::Registry/Glance_registry_config[keystone_authtoken/auth_uri]: Scheduling refresh of Exec[glance-manage db_sync]
Notice: /Stage[main]/Tempest/Tempest_config[compute/build_interval]/ensure: created
Notice: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/volume_backend_name]/ensure: created
Info: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/volume_backend_name]: Scheduling refresh of Service[cinder-api]
Info: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/volume_backend_name]: Scheduling refresh of Exec[cinder-manage db_sync]
Info: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/volume_backend_name]: Scheduling refresh of Service[cinder-scheduler]
Info: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/volume_backend_name]: Scheduling refresh of Service[cinder-volume]
Notice: /Stage[main]/Neutron::Agents::Lbaas/Package[neutron-lbaas-agent]/ensure: created
Info: /Stage[main]/Neutron::Agents::Lbaas/Package[neutron-lbaas-agent]: Scheduling refresh of Anchor[keystone::service::end]
Info: /Stage[main]/Neutron::Agents::Lbaas/Package[neutron-lbaas-agent]: Scheduling refresh of Exec[neutron-db-sync]
Info: /Stage[main]/Neutron::Agents::Lbaas/Package[neutron-lbaas-agent]: Scheduling refresh of Service[neutron-lbaas-service]
Notice: /Stage[main]/Neutron::Agents::Lbaas/Neutron_lbaas_agent_config[haproxy/user_group]/ensure: created
Info: /Stage[main]/Neutron::Agents::Lbaas/Neutron_lbaas_agent_config[haproxy/user_group]: Scheduling refresh of Service[neutron-lbaas-service]
Notice: /Stage[main]/Neutron::Agents::Lbaas/Neutron_lbaas_agent_config[DEFAULT/debug]/ensure: created
Info: /Stage[main]/Neutron::Agents::Lbaas/Neutron_lbaas_agent_config[DEFAULT/debug]: Scheduling refresh of Service[neutron-lbaas-service]
Notice: /Stage[main]/Neutron::Agents::Lbaas/Neutron_lbaas_agent_config[DEFAULT/interface_driver]/ensure: created
Info: /Stage[main]/Neutron::Agents::Lbaas/Neutron_lbaas_agent_config[DEFAULT/interface_driver]: Scheduling refresh of Service[neutron-lbaas-service]
Notice: /Stage[main]/Neutron::Agents::Lbaas/Neutron_lbaas_agent_config[DEFAULT/device_driver]/ensure: created
Info: /Stage[main]/Neutron::Agents::Lbaas/Neutron_lbaas_agent_config[DEFAULT/device_driver]: Scheduling refresh of Service[neutron-lbaas-service]
Notice: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_notification_topic]/ensure: created
Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_notification_topic]: Scheduling refresh of Service[glance-api]
Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_notification_topic]: Scheduling refresh of Exec[glance-manage db_sync]
Notice: /Stage[main]/Glance::Backend::Swift/Glance_swift_config[ref1/auth_address]/ensure: created
Notice: /Stage[main]/Openstack_integration::Provision/Vs_bridge[br-ex]/ensure: created
Info: /Stage[main]/Openstack_integration::Provision/Vs_bridge[br-ex]: Scheduling refresh of Exec[create_loop1_port]
Notice: /Stage[main]/Openstack_integration::Provision/Exec[create_loop1_port]: Triggered 'refresh' from 1 events
Notice: /Stage[main]/Openstack_integration::Provision/Vs_port[loop1]/ensure: created
Info: /Stage[main]/Openstack_integration::Provision/Vs_port[loop1]: Scheduling refresh of Exec[create_br-ex_vif]
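
Vs_bridge and Vs_port are puppet-vswitch types managing Open vSwitch objects; here they build the br-ex external bridge and attach a loopback-backed port for the CI node. A minimal sketch of that pair:

    vs_bridge { 'br-ex':
      ensure => present,
    }
    vs_port { 'loop1':
      ensure => present,
      bridge => 'br-ex',   # attach the port to the external bridge
    }
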
Notice: /Stage[main]/Openstack_integration::Provision/Exec[create_br-ex_vif]: Triggered 'refresh' from 1 events
Notice: /Stage[main]/Nova::Compute/Nova::Generic_service[compute]/Package[nova-compute]/ensure: created
Info: /Stage[main]/Nova::Compute/Nova::Generic_service[compute]/Package[nova-compute]: Scheduling refresh of Anchor[keystone::service::end]
Info: /Stage[main]/Nova::Compute/Nova::Generic_service[compute]/Package[nova-compute]: Scheduling refresh of Anchor[nova::install::end]
Notice: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]/ensure: created
Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]: Scheduling refresh of Service[cinder-api]
Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]: Scheduling refresh of Exec[cinder-manage db_sync]
Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]: Scheduling refresh of Service[cinder-scheduler]
Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]: Scheduling refresh of Service[cinder-volume]
Notice: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_virtual_host]/ensure: created
Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_virtual_host]: Scheduling refresh of Service[cinder-api]
Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_virtual_host]: Scheduling refresh of Exec[cinder-manage db_sync]
Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_virtual_host]: Scheduling refresh of Service[cinder-scheduler]
Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_virtual_host]: Scheduling refresh of Service[cinder-volume]
Notice: /Stage[main]/Cinder::Setup_test_volume/Exec[create_/var/lib/cinder/cinder-volumes]/returns: executed successfully
Info: /Stage[main]/Cinder::Setup_test_volume/Exec[create_/var/lib/cinder/cinder-volumes]: Scheduling refresh of Exec[losetup /dev/loop2 /var/lib/cinder/cinder-volumes]
Notice: /Stage[main]/Cinder::Setup_test_volume/Exec[losetup /dev/loop2 /var/lib/cinder/cinder-volumes]: Triggered 'refresh' from 1 events
Info: /Stage[main]/Cinder::Setup_test_volume/Exec[losetup /dev/loop2 /var/lib/cinder/cinder-volumes]: Scheduling refresh of Exec[pvcreate /dev/loop2]
Notice: /Stage[main]/Cinder::Setup_test_volume/Exec[pvcreate /dev/loop2]: Triggered 'refresh' from 1 events
Info: /Stage[main]/Cinder::Setup_test_volume/Exec[pvcreate /dev/loop2]: Scheduling refresh of Exec[vgcreate cinder-volumes /dev/loop2]
Notice: /Stage[main]/Cinder::Setup_test_volume/Exec[vgcreate cinder-volumes /dev/loop2]: Triggered 'refresh' from 1 events
Notice: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_version]/ensure: created
Info: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_version]: Scheduling refresh of Service[cinder-api]
Info: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_version]: Scheduling refresh of Exec[cinder-manage db_sync]
Info: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_version]: Scheduling refresh of Service[cinder-scheduler]
Info: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_version]: Scheduling refresh of Service[cinder-volume]
Notice: /Stage[main]/Neutron::Agents::Dhcp/Neutron_dhcp_agent_config[DEFAULT/state_path]/ensure: created
Info: /Stage[main]/Neutron::Agents::Dhcp/Neutron_dhcp_agent_config[DEFAULT/state_path]: Scheduling refresh of Service[neutron-dhcp-service]
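
The Cinder::Setup_test_volume chain above builds a throwaway LVM volume group on a loopback device: create the backing file, attach it to /dev/loop2, initialize a PV, then create the cinder-volumes VG, each step firing only when notified by the previous one. A sketch of that refreshonly chain (file size and commands are assumptions mirroring the log, not the class's literal code):

    exec { 'create_/var/lib/cinder/cinder-volumes':
      command => 'dd if=/dev/zero of=/var/lib/cinder/cinder-volumes bs=1M count=0 seek=10240',
      creates => '/var/lib/cinder/cinder-volumes',   # idempotence: skip when the file exists
      path    => ['/bin', '/usr/bin', '/sbin', '/usr/sbin'],
    }
    # with exec, the title doubles as the command; each step runs only on refresh
    ~> exec { 'losetup /dev/loop2 /var/lib/cinder/cinder-volumes':
      refreshonly => true,
      path        => ['/bin', '/usr/bin', '/sbin', '/usr/sbin'],
    }
    ~> exec { 'pvcreate /dev/loop2':
      refreshonly => true,
      path        => ['/bin', '/usr/bin', '/sbin', '/usr/sbin'],
    }
    ~> exec { 'vgcreate cinder-volumes /dev/loop2':
      refreshonly => true,
      path        => ['/bin', '/usr/bin', '/sbin', '/usr/sbin'],
    }
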
Notice: /Stage[main]/Glance::Registry/Glance_registry_config[keystone_authtoken/identity_uri]/ensure: created
Info: /Stage[main]/Glance::Registry/Glance_registry_config[keystone_authtoken/identity_uri]: Scheduling refresh of Service[glance-registry]
Info: /Stage[main]/Glance::Registry/Glance_registry_config[keystone_authtoken/identity_uri]: Scheduling refresh of Exec[glance-manage db_sync]
Notice: /Stage[main]/Glance::Api/Glance_cache_config[DEFAULT/registry_host]/ensure: created
Info: /Stage[main]/Glance::Api/Glance_cache_config[DEFAULT/registry_host]: Scheduling refresh of Service[glance-api]
Info: /Stage[main]/Glance::Api/Glance_cache_config[DEFAULT/registry_host]: Scheduling refresh of Exec[glance-manage db_sync]
Notice: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Service[target]/ensure: ensure changed 'stopped' to 'running'
Info: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Service[target]: Unscheduling refresh on Service[target]
Notice: /Stage[main]/Swift/Package[swift]/ensure: created
Info: /Stage[main]/Swift/Package[swift]: Scheduling refresh of Anchor[keystone::service::end]
Notice: /Stage[main]/Swift/File[/var/lib/swift]/group: group changed 'root' to 'swift'
Notice: /Stage[main]/Swift/File[/etc/swift]/owner: owner changed 'root' to 'swift'
Notice: /Stage[main]/Swift/File[/etc/swift]/group: group changed 'root' to 'swift'
Notice: /Stage[main]/Swift/File[/etc/swift/swift.conf]/owner: owner changed 'root' to 'swift'
Notice: /Stage[main]/Swift/Swift_config[swift-constraints/max_header_size]/ensure: created
Info: /Stage[main]/Swift/Swift_config[swift-constraints/max_header_size]: Scheduling refresh of Service[swift-proxy-server]
Info: /Stage[main]/Swift/Swift_config[swift-constraints/max_header_size]: Scheduling refresh of Service[swift-account-reaper]
Info: /Stage[main]/Swift/Swift_config[swift-constraints/max_header_size]: Scheduling refresh of Service[swift-container-updater]
Info: /Stage[main]/Swift/Swift_config[swift-constraints/max_header_size]: Scheduling refresh of Service[swift-object-updater]
Info: /Stage[main]/Swift/Swift_config[swift-constraints/max_header_size]: Scheduling refresh of Service[swift-account-server]
Info: /Stage[main]/Swift/Swift_config[swift-constraints/max_header_size]: Scheduling refresh of Service[swift-account-auditor]
Info: /Stage[main]/Swift/Swift_config[swift-constraints/max_header_size]: Scheduling refresh of Service[swift-account-replicator]
Info: /Stage[main]/Swift/Swift_config[swift-constraints/max_header_size]: Scheduling refresh of Service[swift-container-server]
Info: /Stage[main]/Swift/Swift_config[swift-constraints/max_header_size]: Scheduling refresh of Service[swift-container-auditor]
Info: /Stage[main]/Swift/Swift_config[swift-constraints/max_header_size]: Scheduling refresh of Service[swift-container-replicator]
Info: /Stage[main]/Swift/Swift_config[swift-constraints/max_header_size]: Scheduling refresh of Service[swift-object-server]
Info: /Stage[main]/Swift/Swift_config[swift-constraints/max_header_size]: Scheduling refresh of Service[swift-object-auditor]
Info: /Stage[main]/Swift/Swift_config[swift-constraints/max_header_size]: Scheduling refresh of Service[swift-object-replicator]
Notice: /Stage[main]/Openstack_integration::Swift/File[/srv/node]/ensure: created
Notice: /Stage[main]/Swift/Swift_config[swift-hash/swift_hash_path_suffix]/value: value changed '%SWIFT_HASH_PATH_SUFFIX%' to 'secrete'
Info: /Stage[main]/Swift/Swift_config[swift-hash/swift_hash_path_suffix]: Scheduling refresh of Service[swift-proxy-server]
Info: /Stage[main]/Swift/Swift_config[swift-hash/swift_hash_path_suffix]: Scheduling refresh of Service[swift-account-reaper]
Info: /Stage[main]/Swift/Swift_config[swift-hash/swift_hash_path_suffix]: Scheduling refresh of Service[swift-container-updater]
Info: /Stage[main]/Swift/Swift_config[swift-hash/swift_hash_path_suffix]: Scheduling refresh of Service[swift-object-updater]
Info: /Stage[main]/Swift/Swift_config[swift-hash/swift_hash_path_suffix]: Scheduling refresh of Service[swift-account-server]
Info: /Stage[main]/Swift/Swift_config[swift-hash/swift_hash_path_suffix]: Scheduling refresh of Service[swift-account-auditor]
Info: /Stage[main]/Swift/Swift_config[swift-hash/swift_hash_path_suffix]: Scheduling refresh of Service[swift-account-replicator]
Info: /Stage[main]/Swift/Swift_config[swift-hash/swift_hash_path_suffix]: Scheduling refresh of Service[swift-container-server]
Info: /Stage[main]/Swift/Swift_config[swift-hash/swift_hash_path_suffix]: Scheduling refresh of Service[swift-container-auditor]
Info: /Stage[main]/Swift/Swift_config[swift-hash/swift_hash_path_suffix]: Scheduling refresh of Service[swift-container-replicator]
Info: /Stage[main]/Swift/Swift_config[swift-hash/swift_hash_path_suffix]: Scheduling refresh of Service[swift-object-server]
Info: /Stage[main]/Swift/Swift_config[swift-hash/swift_hash_path_suffix]: Scheduling refresh of Service[swift-object-auditor]
Info: /Stage[main]/Swift/Swift_config[swift-hash/swift_hash_path_suffix]: Scheduling refresh of Service[swift-object-replicator]
Notice: /Stage[main]/Swift/File[/var/run/swift]/group: group changed 'root' to 'swift'
Notice: /Stage[main]/Swift::Ringbuilder/Swift::Ringbuilder::Create[container]/Exec[create_container]/returns: executed successfully
Notice: /Stage[main]/Swift::Ringbuilder/Swift::Ringbuilder::Create[object]/Exec[create_object]/returns: executed successfully
Notice: /Stage[main]/Openstack_integration::Swift/Ring_object_device[127.0.0.1:6000/3]/ensure: created
Info: /Stage[main]/Openstack_integration::Swift/Ring_object_device[127.0.0.1:6000/3]: Scheduling refresh of Swift::Ringbuilder::Rebalance[object]
Notice: /Stage[main]/Openstack_integration::Swift/Ring_container_device[127.0.0.1:6001/3]/ensure: created
Info: /Stage[main]/Openstack_integration::Swift/Ring_container_device[127.0.0.1:6001/3]: Scheduling refresh of Swift::Ringbuilder::Rebalance[container]
Notice: /Stage[main]/Openstack_integration::Swift/Ring_container_device[127.0.0.1:6001/2]/ensure: created
Info: /Stage[main]/Openstack_integration::Swift/Ring_container_device[127.0.0.1:6001/2]: Scheduling refresh of Swift::Ringbuilder::Rebalance[container]
Notice: /Stage[main]/Openstack_integration::Swift/Ring_object_device[127.0.0.1:6000/2]/ensure: created
Info: /Stage[main]/Openstack_integration::Swift/Ring_object_device[127.0.0.1:6000/2]: Scheduling refresh of Swift::Ringbuilder::Rebalance[object]
Notice: /Stage[main]/Neutron::Db::Mysql/Openstacklib::Db::Mysql[neutron]/Openstacklib::Db::Mysql::Host_access[neutron_127.0.0.1]/Mysql_grant[neutron@127.0.0.1/neutron.*]/ensure: created
Info: Openstacklib::Db::Mysql[neutron]: Scheduling refresh of Service[neutron-server]
Info: Openstacklib::Db::Mysql[neutron]: Scheduling refresh of Exec[neutron-db-sync]
Notice: /Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/debug]/ensure: created
Info: /Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/debug]: Scheduling refresh of Service[glance-registry]
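
The Ring_*_device resources register each storage device with the corresponding swift ring builder, and every registration notifies a single rebalance exec so the ring is rebuilt once at the end. A sketch of the pattern with placeholder zone/weight:

    ring_object_device { '127.0.0.1:6000/2':
      zone   => 1,   # placeholder zone
      weight => 1,
    }
    exec { 'rebalance_object':
      command     => 'swift-ring-builder /etc/swift/object.builder rebalance',
      path        => ['/usr/bin'],
      refreshonly => true,   # runs once, after all devices are added
    }
    Ring_object_device<||> ~> Exec['rebalance_object']
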
Info: /Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/debug]: Scheduling refresh of Exec[glance-manage db_sync]
Notice: /Stage[main]/Glance::Api/Glance_api_config[glance_store/stores]/ensure: created
Info: /Stage[main]/Glance::Api/Glance_api_config[glance_store/stores]: Scheduling refresh of Service[glance-api]
Info: /Stage[main]/Glance::Api/Glance_api_config[glance_store/stores]: Scheduling refresh of Exec[glance-manage db_sync]
Notice: /Stage[main]/Neutron::Plugins::Ml2/Package[neutron-plugin-ml2]/ensure: created
Info: /Stage[main]/Neutron::Plugins::Ml2/Package[neutron-plugin-ml2]: Scheduling refresh of Anchor[keystone::service::end]
Notice: /Stage[main]/Neutron::Plugins::Ml2/Neutron::Plugins::Ml2::Type_driver[vxlan]/Neutron_plugin_ml2[ml2_type_vxlan/vni_ranges]/ensure: created
Info: /Stage[main]/Neutron::Plugins::Ml2/Neutron::Plugins::Ml2::Type_driver[vxlan]/Neutron_plugin_ml2[ml2_type_vxlan/vni_ranges]: Scheduling refresh of Service[neutron-server]
Info: /Stage[main]/Neutron::Plugins::Ml2/Neutron::Plugins::Ml2::Type_driver[vxlan]/Neutron_plugin_ml2[ml2_type_vxlan/vni_ranges]: Scheduling refresh of Exec[neutron-db-sync]
Notice: /Stage[main]/Neutron::Plugins::Ml2/Neutron_plugin_ml2[ml2/type_drivers]/ensure: created
Info: /Stage[main]/Neutron::Plugins::Ml2/Neutron_plugin_ml2[ml2/type_drivers]: Scheduling refresh of Service[neutron-server]
Info: /Stage[main]/Neutron::Plugins::Ml2/Neutron_plugin_ml2[ml2/type_drivers]: Scheduling refresh of Exec[neutron-db-sync]
Notice: /Stage[main]/Neutron::Plugins::Ml2/Neutron_plugin_ml2[ml2/path_mtu]/ensure: created
Info: /Stage[main]/Neutron::Plugins::Ml2/Neutron_plugin_ml2[ml2/path_mtu]: Scheduling refresh of Service[neutron-server]
Info: /Stage[main]/Neutron::Plugins::Ml2/Neutron_plugin_ml2[ml2/path_mtu]: Scheduling refresh of Exec[neutron-db-sync]
Notice: /Stage[main]/Neutron::Plugins::Ml2/Neutron_plugin_ml2[ml2/tenant_network_types]/ensure: created
Info: /Stage[main]/Neutron::Plugins::Ml2/Neutron_plugin_ml2[ml2/tenant_network_types]: Scheduling refresh of Service[neutron-server]
Info: /Stage[main]/Neutron::Plugins::Ml2/Neutron_plugin_ml2[ml2/tenant_network_types]: Scheduling refresh of Exec[neutron-db-sync]
Notice: /Stage[main]/Neutron::Plugins::Ml2/Neutron_plugin_ml2[ml2/mechanism_drivers]/ensure: created
Info: /Stage[main]/Neutron::Plugins::Ml2/Neutron_plugin_ml2[ml2/mechanism_drivers]: Scheduling refresh of Service[neutron-server]
Info: /Stage[main]/Neutron::Plugins::Ml2/Neutron_plugin_ml2[ml2/mechanism_drivers]: Scheduling refresh of Exec[neutron-db-sync]
Notice: /Stage[main]/Neutron::Plugins::Ml2/File[/etc/default/neutron-server]/ensure: created
Notice: /Stage[main]/Neutron::Plugins::Ml2/File[/etc/neutron/plugin.ini]/ensure: created
Notice: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/nova_catalog_admin_info]/ensure: created
Info: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/nova_catalog_admin_info]: Scheduling refresh of Service[cinder-api]
Info: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/nova_catalog_admin_info]: Scheduling refresh of Exec[cinder-manage db_sync]
Info: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/nova_catalog_admin_info]: Scheduling refresh of Service[cinder-scheduler]
Info: /Stage[main]/Cinder::Api/Cinder_config[DEFAULT/nova_catalog_admin_info]: Scheduling refresh of Service[cinder-volume]
Notice: /Stage[main]/Ironic/Ironic_config[DEFAULT/control_exchange]/ensure: created
Info: /Stage[main]/Ironic/Ironic_config[DEFAULT/control_exchange]: Scheduling refresh of Service[httpd]
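
Neutron_plugin_ml2 writes ml2_conf.ini, with each type driver's keys expanded by the Type_driver define; every key also notifies neutron-server and the db-sync. A sketch with placeholder driver lists:

    neutron_plugin_ml2 {
      'ml2/type_drivers':          value => 'vxlan';        # placeholder list
      'ml2/tenant_network_types':  value => 'vxlan';
      'ml2/mechanism_drivers':     value => 'openvswitch';
      'ml2_type_vxlan/vni_ranges': value => '1:1000';       # placeholder range
      # each of these keys notifies Service[neutron-server] and Exec[neutron-db-sync]
    }
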
Info: /Stage[main]/Ironic/Ironic_config[DEFAULT/control_exchange]: Scheduling refresh of Service[ironic-conductor]
Notice: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_password]/ensure: created
Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_password]: Scheduling refresh of Service[cinder-api]
Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_password]: Scheduling refresh of Exec[cinder-manage db_sync]
Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_password]: Scheduling refresh of Service[cinder-scheduler]
Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_password]: Scheduling refresh of Service[cinder-volume]
Notice: /Stage[main]/Glance::Cache::Logging/Glance_cache_config[DEFAULT/debug]/ensure: created
Info: /Stage[main]/Glance::Cache::Logging/Glance_cache_config[DEFAULT/debug]: Scheduling refresh of Service[glance-api]
Info: /Stage[main]/Glance::Cache::Logging/Glance_cache_config[DEFAULT/debug]: Scheduling refresh of Exec[glance-manage db_sync]
Notice: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]/ensure: created
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]: Scheduling refresh of Service[neutron-server]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]: Scheduling refresh of Exec[neutron-db-sync]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]: Scheduling refresh of Service[neutron-metadata]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]: Scheduling refresh of Service[neutron-lbaas-service]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]: Scheduling refresh of Service[neutron-l3]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]: Scheduling refresh of Service[neutron-dhcp-service]
Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]: Scheduling refresh of Service[neutron-metering-service]
Notice: /Stage[main]/Openstack_integration::Swift/Ring_container_device[127.0.0.1:6001/1]/ensure: created
Info: /Stage[main]/Openstack_integration::Swift/Ring_container_device[127.0.0.1:6001/1]: Scheduling refresh of Swift::Ringbuilder::Rebalance[container]
Info: Swift::Ringbuilder::Rebalance[container]: Scheduling refresh of Exec[rebalance_container]
Notice: /Stage[main]/Swift::Ringbuilder/Swift::Ringbuilder::Rebalance[container]/Exec[rebalance_container]: Triggered 'refresh' from 1 events
Notice: /Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/verbose]/ensure: created
Info: /Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/verbose]: Scheduling refresh of Service[glance-api]
Info: /Stage[main]/Glance::Api::Logging/Glance_api_config[DEFAULT/verbose]: Scheduling refresh of Exec[glance-manage db_sync]
Notice: /Stage[main]/Ironic::Logging/Ironic_config[DEFAULT/verbose]/ensure: created
Info: /Stage[main]/Ironic::Logging/Ironic_config[DEFAULT/verbose]: Scheduling refresh of Service[httpd]
Info: /Stage[main]/Ironic::Logging/Ironic_config[DEFAULT/verbose]: Scheduling refresh of Service[ironic-conductor]
Notice: /Stage[main]/Cinder::Db::Mysql/Openstacklib::Db::Mysql[cinder]/Openstacklib::Db::Mysql::Host_access[cinder_127.0.0.1]/Mysql_user[cinder@127.0.0.1]/ensure: created
Notice: /Stage[main]/Cinder::Db::Mysql/Openstacklib::Db::Mysql[cinder]/Openstacklib::Db::Mysql::Host_access[cinder_127.0.0.1]/Mysql_grant[cinder@127.0.0.1/cinder.*]/ensure: created
Info: Openstacklib::Db::Mysql[cinder]: Scheduling refresh of Exec[cinder-manage db_sync]
Notice: /Stage[main]/Nova::Conductor/Nova::Generic_service[conductor]/Package[nova-conductor]/ensure: created
Info: /Stage[main]/Nova::Conductor/Nova::Generic_service[conductor]/Package[nova-conductor]: Scheduling refresh of Anchor[keystone::service::end]
Info: /Stage[main]/Nova::Conductor/Nova::Generic_service[conductor]/Package[nova-conductor]: Scheduling refresh of Anchor[nova::install::end]
Notice: /Stage[main]/Openstack_integration::Swift/Ring_object_device[127.0.0.1:6000/1]/ensure: created
Info: /Stage[main]/Openstack_integration::Swift/Ring_object_device[127.0.0.1:6000/1]: Scheduling refresh of Swift::Ringbuilder::Rebalance[object]
Info: Swift::Ringbuilder::Rebalance[object]: Scheduling refresh of Exec[rebalance_object]
Notice: /Stage[main]/Swift::Ringbuilder/Swift::Ringbuilder::Rebalance[object]/Exec[rebalance_object]: Triggered 'refresh' from 1 events
Notice: /Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/verbose]/ensure: created
Info: /Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/verbose]: Scheduling refresh of Service[glance-registry]
Info: /Stage[main]/Glance::Registry::Logging/Glance_registry_config[DEFAULT/verbose]: Scheduling refresh of Exec[glance-manage db_sync]
Notice: /Stage[main]/Ironic::Conductor/Package[ironic-conductor]/ensure: created
Info: /Stage[main]/Ironic::Conductor/Package[ironic-conductor]: Scheduling refresh of Anchor[keystone::service::end]
Info: /Stage[main]/Ironic::Conductor/Package[ironic-conductor]: Scheduling refresh of Exec[ironic-dbsync]
Notice: /Stage[main]/Ironic::Conductor/Ironic_config[conductor/max_time_interval]/ensure: created
Info: /Stage[main]/Ironic::Conductor/Ironic_config[conductor/max_time_interval]: Scheduling refresh of Service[httpd]
Info: /Stage[main]/Ironic::Conductor/Ironic_config[conductor/max_time_interval]: Scheduling refresh of Service[ironic-conductor]
Notice: /Stage[main]/Ironic::Conductor/Ironic_config[conductor/force_power_state_during_sync]/ensure: created
Info: /Stage[main]/Ironic::Conductor/Ironic_config[conductor/force_power_state_during_sync]: Scheduling refresh of Service[httpd]
Info: /Stage[main]/Ironic::Conductor/Ironic_config[conductor/force_power_state_during_sync]: Scheduling refresh of Service[ironic-conductor]
Notice: /Stage[main]/Nova::Vncproxy/Nova::Generic_service[vncproxy]/Package[nova-vncproxy]/ensure: created
Info: /Stage[main]/Nova::Vncproxy/Nova::Generic_service[vncproxy]/Package[nova-vncproxy]: Scheduling refresh of Anchor[keystone::service::end]
Info: /Stage[main]/Nova::Vncproxy/Nova::Generic_service[vncproxy]/Package[nova-vncproxy]: Scheduling refresh of Anchor[nova::install::end]
Notice: /Stage[main]/Cinder/Cinder_config[DEFAULT/rpc_backend]/ensure: created
Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/rpc_backend]: Scheduling refresh of Service[cinder-api]
Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/rpc_backend]: Scheduling refresh of Exec[cinder-manage db_sync]
Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/rpc_backend]: Scheduling refresh of Service[cinder-scheduler]
Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/rpc_backend]: Scheduling refresh of Service[cinder-volume]
Notice: /Stage[main]/Ironic/Ironic_config[glance/glance_num_retries]/ensure: created
Info: /Stage[main]/Ironic/Ironic_config[glance/glance_num_retries]: Scheduling refresh of Service[httpd]
Info: /Stage[main]/Ironic/Ironic_config[glance/glance_num_retries]: Scheduling refresh of Service[ironic-conductor]
Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat[/etc/swift/container-server.conf]/File[/var/lib/puppet/concat/_etc_swift_container-server.conf]/ensure: created
Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat[/etc/swift/container-server.conf]/File[/var/lib/puppet/concat/_etc_swift_container-server.conf]: Scheduling refresh of Exec[concat_/etc/swift/container-server.conf]
Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat[/etc/swift/container-server.conf]/File[/var/lib/puppet/concat/_etc_swift_container-server.conf/fragments]/ensure: created
Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat[/etc/swift/container-server.conf]/File[/var/lib/puppet/concat/_etc_swift_container-server.conf/fragments]: Scheduling refresh of Exec[concat_/etc/swift/container-server.conf]
Notice: /Stage[main]/Openstack_integration::Swift/Swift::Storage::Filter::Recon[container]/Concat::Fragment[swift_recon_container]/File[/var/lib/puppet/concat/_etc_swift_container-server.conf/fragments/35_swift_recon_container]/ensure: defined content as '{md5}d847d2d529a3596ed6a74d841d790dc7'
Info: /Stage[main]/Openstack_integration::Swift/Swift::Storage::Filter::Recon[container]/Concat::Fragment[swift_recon_container]/File[/var/lib/puppet/concat/_etc_swift_container-server.conf/fragments/35_swift_recon_container]: Scheduling refresh of Exec[concat_/etc/swift/container-server.conf]
Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat[/etc/swift/container-server.conf]/File[/var/lib/puppet/concat/_etc_swift_container-server.conf/fragments.concat]/ensure: created
Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat[/etc/swift/account-server.conf]/File[/var/lib/puppet/concat/_etc_swift_account-server.conf]/ensure: created
Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat[/etc/swift/account-server.conf]/File[/var/lib/puppet/concat/_etc_swift_account-server.conf]: Scheduling refresh of Exec[concat_/etc/swift/account-server.conf]
Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat[/etc/swift/account-server.conf]/File[/var/lib/puppet/concat/_etc_swift_account-server.conf/fragments.concat]/ensure: created
Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat[/etc/swift/account-server.conf]/File[/var/lib/puppet/concat/_etc_swift_account-server.conf/fragments.concat.out]/ensure: created
Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat[/etc/swift/account-server.conf]/File[/var/lib/puppet/concat/_etc_swift_account-server.conf/fragments]/ensure: created
Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat[/etc/swift/account-server.conf]/File[/var/lib/puppet/concat/_etc_swift_account-server.conf/fragments]: Scheduling refresh of Exec[concat_/etc/swift/account-server.conf]
Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat::Fragment[swift-account-6002]/File[/var/lib/puppet/concat/_etc_swift_account-server.conf/fragments/00_swift-account-6002]/ensure: defined content as '{md5}666661f3805b49b4682cc11f80dad508'
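
The swift server configs are assembled with puppetlabs-concat: fragments are staged under /var/lib/puppet/concat (this run uses the older file-based concat) and a concat exec stitches them into the target file, ordered by the numeric prefix seen in the fragment names. A sketch with placeholder fragment content:

    concat { '/etc/swift/account-server.conf':
      owner => 'swift',
      group => 'swift',
    }
    concat::fragment { 'swift-account-6002':
      target  => '/etc/swift/account-server.conf',
      content => "[DEFAULT]\nbind_port = 6002\n",  # placeholder; the real content comes from a template
      order   => '00',                             # yields the 00_swift-account-6002 fragment above
    }
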
Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat::Fragment[swift-account-6002]/File[/var/lib/puppet/concat/_etc_swift_account-server.conf/fragments/00_swift-account-6002]: Scheduling refresh of Exec[concat_/etc/swift/account-server.conf]
Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat[/etc/swift/container-server.conf]/File[/var/lib/puppet/concat/_etc_swift_container-server.conf/fragments.concat.out]/ensure: created
Notice: /Stage[main]/Tempest/Tempest_config[service_broker/run_service_broker_tests]/ensure: created
Notice: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/iscsi_helper]/ensure: created
Info: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/iscsi_helper]: Scheduling refresh of Service[cinder-api]
Info: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/iscsi_helper]: Scheduling refresh of Exec[cinder-manage db_sync]
Info: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/iscsi_helper]: Scheduling refresh of Service[cinder-scheduler]
Info: /Stage[main]/Openstack_integration::Cinder/Cinder::Backend::Iscsi[BACKEND_1]/Cinder_config[BACKEND_1/iscsi_helper]: Scheduling refresh of Service[cinder-volume]
Notice: /Stage[main]/Tempest/Tempest_config[DEFAULT/debug]/ensure: created
Notice: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[DEFAULT/notification_driver]/ensure: created
Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[DEFAULT/notification_driver]: Scheduling refresh of Service[glance-api]
Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[DEFAULT/notification_driver]: Scheduling refresh of Exec[glance-manage db_sync]
Notice: /Stage[main]/Neutron::Agents::Metering/Neutron_metering_agent_config[DEFAULT/interface_driver]/ensure: created
Info: /Stage[main]/Neutron::Agents::Metering/Neutron_metering_agent_config[DEFAULT/interface_driver]: Scheduling refresh of Service[neutron-metering-service]
Notice: /Stage[main]/Rsync::Server/Concat[/etc/rsync.conf]/File[/var/lib/puppet/concat/_etc_rsync.conf]/ensure: created
Info: /Stage[main]/Rsync::Server/Concat[/etc/rsync.conf]/File[/var/lib/puppet/concat/_etc_rsync.conf]: Scheduling refresh of Exec[concat_/etc/rsync.conf]
Notice: /Stage[main]/Rsync::Server/Concat[/etc/rsync.conf]/File[/var/lib/puppet/concat/_etc_rsync.conf/fragments.concat.out]/ensure: created
Notice: /Stage[main]/Rsync::Server/Concat[/etc/rsync.conf]/File[/var/lib/puppet/concat/_etc_rsync.conf/fragments]/ensure: created
Info: /Stage[main]/Rsync::Server/Concat[/etc/rsync.conf]/File[/var/lib/puppet/concat/_etc_rsync.conf/fragments]: Scheduling refresh of Exec[concat_/etc/rsync.conf]
Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Rsync::Server::Module[container]/Concat::Fragment[frag-container]/File[/var/lib/puppet/concat/_etc_rsync.conf/fragments/10_container_frag-container]/ensure: defined content as '{md5}7dd5f706fbeccaf9a45d40737af512ac'
Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Rsync::Server::Module[container]/Concat::Fragment[frag-container]/File[/var/lib/puppet/concat/_etc_rsync.conf/fragments/10_container_frag-container]: Scheduling refresh of Exec[concat_/etc/rsync.conf]
Notice: /Stage[main]/Tempest/Tempest_config[service_available/neutron]/ensure: created
Notice: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/identity_uri]/ensure: created
Info: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/identity_uri]: Scheduling refresh of Service[cinder-api]
Info: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/identity_uri]: Scheduling refresh of Exec[cinder-manage db_sync]
Info: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/identity_uri]: Scheduling refresh of Service[cinder-scheduler]
Info: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/identity_uri]: Scheduling refresh of Service[cinder-volume]
Notice: /Stage[main]/Glance::Backend::Swift/Glance_swift_config[ref1/key]/ensure: created
Notice: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_userid]/ensure: created
Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_userid]: Scheduling refresh of Service[cinder-api]
Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_userid]: Scheduling refresh of Exec[cinder-manage db_sync]
Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_userid]: Scheduling refresh of Service[cinder-scheduler]
Info: /Stage[main]/Cinder/Cinder_config[oslo_messaging_rabbit/rabbit_userid]: Scheduling refresh of Service[cinder-volume]
Notice: /Stage[main]/Ironic::Api/Ironic_config[keystone_authtoken/auth_uri]/ensure: created
Info: /Stage[main]/Ironic::Api/Ironic_config[keystone_authtoken/auth_uri]: Scheduling refresh of Service[httpd]
Info: /Stage[main]/Ironic::Api/Ironic_config[keystone_authtoken/auth_uri]: Scheduling refresh of Service[ironic-conductor]
Notice: /Stage[main]/Ironic::Api/Package[ironic-api]/ensure: created
Info: /Stage[main]/Ironic::Api/Package[ironic-api]: Scheduling refresh of Anchor[keystone::service::end]
Info: /Stage[main]/Ironic::Api/Package[ironic-api]: Scheduling refresh of Exec[ironic-dbsync]
Notice: /Stage[main]/Ironic::Api/Ironic_config[keystone_authtoken/admin_tenant_name]/ensure: created
Info: /Stage[main]/Ironic::Api/Ironic_config[keystone_authtoken/admin_tenant_name]: Scheduling refresh of Service[httpd]
Info: /Stage[main]/Ironic::Api/Ironic_config[keystone_authtoken/admin_tenant_name]: Scheduling refresh of Service[ironic-conductor]
Notice: /Stage[main]/Ironic::Api/Ironic_config[api/host_ip]/ensure: created
Info: /Stage[main]/Ironic::Api/Ironic_config[api/host_ip]: Scheduling refresh of Service[httpd]
Info: /Stage[main]/Ironic::Api/Ironic_config[api/host_ip]: Scheduling refresh of Service[ironic-conductor]
Notice: /Stage[main]/Ironic::Api/Ironic_config[keystone_authtoken/admin_password]/ensure: created
Info: /Stage[main]/Ironic::Api/Ironic_config[keystone_authtoken/admin_password]: Scheduling refresh of Service[httpd]
Info: /Stage[main]/Ironic::Api/Ironic_config[keystone_authtoken/admin_password]: Scheduling refresh of Service[ironic-conductor]
Notice: /Stage[main]/Ironic::Api/Ironic_config[keystone_authtoken/identity_uri]/ensure: created
Info: /Stage[main]/Ironic::Api/Ironic_config[keystone_authtoken/identity_uri]: Scheduling refresh of Service[httpd]
Info: /Stage[main]/Ironic::Api/Ironic_config[keystone_authtoken/identity_uri]: Scheduling refresh of Service[ironic-conductor]
Notice: /Stage[main]/Ironic::Api/Ironic_config[neutron/url]/ensure: created
Info: /Stage[main]/Ironic::Api/Ironic_config[neutron/url]: Scheduling refresh of Service[httpd]
Info: /Stage[main]/Ironic::Api/Ironic_config[neutron/url]: Scheduling refresh of Service[ironic-conductor]
Notice: /Stage[main]/Ironic::Api/Ironic_config[keystone_authtoken/admin_user]/ensure: created
Info: /Stage[main]/Ironic::Api/Ironic_config[keystone_authtoken/admin_user]: Scheduling refresh of Service[httpd]
Info: /Stage[main]/Ironic::Api/Ironic_config[keystone_authtoken/admin_user]: Scheduling refresh of Service[ironic-conductor]
Notice: /Stage[main]/Ironic::Api/Ironic_config[api/max_limit]/ensure: created
Info: /Stage[main]/Ironic::Api/Ironic_config[api/max_limit]: Scheduling refresh of Service[httpd]
Info: /Stage[main]/Ironic::Api/Ironic_config[api/max_limit]: Scheduling refresh of Service[ironic-conductor]
Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat[/etc/swift/object-server.conf]/File[/var/lib/puppet/concat/_etc_swift_object-server.conf]/ensure: created
Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat[/etc/swift/object-server.conf]/File[/var/lib/puppet/concat/_etc_swift_object-server.conf]: Scheduling refresh of Exec[concat_/etc/swift/object-server.conf]
Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat[/etc/swift/object-server.conf]/File[/var/lib/puppet/concat/_etc_swift_object-server.conf/fragments]/ensure: created
Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat[/etc/swift/object-server.conf]/File[/var/lib/puppet/concat/_etc_swift_object-server.conf/fragments]: Scheduling refresh of Exec[concat_/etc/swift/object-server.conf]
Notice: /Stage[main]/Openstack_integration::Swift/Swift::Storage::Filter::Healthcheck[object]/Concat::Fragment[swift_healthcheck_object]/File[/var/lib/puppet/concat/_etc_swift_object-server.conf/fragments/25_swift_healthcheck_object]/ensure: defined content as '{md5}8c92056c41082619d179f88ea15c5fc6'
Info: /Stage[main]/Openstack_integration::Swift/Swift::Storage::Filter::Healthcheck[object]/Concat::Fragment[swift_healthcheck_object]/File[/var/lib/puppet/concat/_etc_swift_object-server.conf/fragments/25_swift_healthcheck_object]: Scheduling refresh of Exec[concat_/etc/swift/object-server.conf]
Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat[/etc/swift/object-server.conf]/File[/var/lib/puppet/concat/_etc_swift_object-server.conf/fragments.concat.out]/ensure: created
Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat::Fragment[swift-object-6000]/File[/var/lib/puppet/concat/_etc_swift_object-server.conf/fragments/00_swift-object-6000]/ensure: defined content as '{md5}f5bb62f4798612b143fc441befa50ecc'
Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat::Fragment[swift-object-6000]/File[/var/lib/puppet/concat/_etc_swift_object-server.conf/fragments/00_swift-object-6000]: Scheduling refresh of Exec[concat_/etc/swift/object-server.conf]
Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Rsync::Server::Module[object]/Concat::Fragment[frag-object]/File[/var/lib/puppet/concat/_etc_rsync.conf/fragments/10_object_frag-object]/ensure: defined content as '{md5}d0ecd24502eb0f9cd5c387b2e1e32943'
Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Rsync::Server::Module[object]/Concat::Fragment[frag-object]/File[/var/lib/puppet/concat/_etc_rsync.conf/fragments/10_object_frag-object]: Scheduling refresh of Exec[concat_/etc/rsync.conf]
Notice: /Stage[main]/Swift::Ringbuilder/Swift::Ringbuilder::Create[account]/Exec[create_account]/returns: executed successfully
Notice: /Stage[main]/Openstack_integration::Swift/Ring_account_device[127.0.0.1:6002/1]/ensure: created
Notice: /Stage[main]/Swift::Ringbuilder/Swift::Ringbuilder::Create[account]/Exec[create_account]/returns: executed successfully
Notice: /Stage[main]/Openstack_integration::Swift/Ring_account_device[127.0.0.1:6002/1]/ensure: created
Info: /Stage[main]/Openstack_integration::Swift/Ring_account_device[127.0.0.1:6002/1]: Scheduling refresh of Swift::Ringbuilder::Rebalance[account]
Notice: /Stage[main]/Openstack_integration::Swift/Ring_account_device[127.0.0.1:6002/3]/ensure: created
Info: /Stage[main]/Openstack_integration::Swift/Ring_account_device[127.0.0.1:6002/3]: Scheduling refresh of Swift::Ringbuilder::Rebalance[account]
Notice: /Stage[main]/Openstack_integration::Swift/Ring_account_device[127.0.0.1:6002/2]/ensure: created
Info: /Stage[main]/Openstack_integration::Swift/Ring_account_device[127.0.0.1:6002/2]: Scheduling refresh of Swift::Ringbuilder::Rebalance[account]
Info: Swift::Ringbuilder::Rebalance[account]: Scheduling refresh of Exec[rebalance_account]
Notice: /Stage[main]/Swift::Ringbuilder/Swift::Ringbuilder::Rebalance[account]/Exec[rebalance_account]: Triggered 'refresh' from 1 events
Notice: /Stage[main]/Ironic::Logging/Ironic_config[DEFAULT/debug]/ensure: created
Info: /Stage[main]/Ironic::Logging/Ironic_config[DEFAULT/debug]: Scheduling refresh of Service[httpd]
Info: /Stage[main]/Ironic::Logging/Ironic_config[DEFAULT/debug]: Scheduling refresh of Service[ironic-conductor]
Notice: /Stage[main]/Keystone::Client/Package[python-keystoneclient]/ensure: created
Info: /Stage[main]/Keystone::Client/Package[python-keystoneclient]: Scheduling refresh of Anchor[keystone::service::end]
Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat::Fragment[swift-container-6001]/File[/var/lib/puppet/concat/_etc_swift_container-server.conf/fragments/00_swift-container-6001]/ensure: defined content as '{md5}26d25a9fa3702760a9fc42a4a2bd22c2'
Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat::Fragment[swift-container-6001]/File[/var/lib/puppet/concat/_etc_swift_container-server.conf/fragments/00_swift-container-6001]: Scheduling refresh of Exec[concat_/etc/swift/container-server.conf]
Notice: /Stage[main]/Openstack_integration::Swift/Swift::Storage::Filter::Healthcheck[account]/Concat::Fragment[swift_healthcheck_account]/File[/var/lib/puppet/concat/_etc_swift_account-server.conf/fragments/25_swift_healthcheck_account]/ensure: defined content as '{md5}8c92056c41082619d179f88ea15c5fc6'
Info: /Stage[main]/Openstack_integration::Swift/Swift::Storage::Filter::Healthcheck[account]/Concat::Fragment[swift_healthcheck_account]/File[/var/lib/puppet/concat/_etc_swift_account-server.conf/fragments/25_swift_healthcheck_account]: Scheduling refresh of Exec[concat_/etc/swift/account-server.conf]
Notice: /Stage[main]/Neutron::Agents::Ml2::Ovs/Package[neutron-ovs-agent]/ensure: created
Info: /Stage[main]/Neutron::Agents::Ml2::Ovs/Package[neutron-ovs-agent]: Scheduling refresh of Anchor[keystone::service::end]
Info: /Stage[main]/Neutron::Agents::Ml2::Ovs/Package[neutron-ovs-agent]: Scheduling refresh of Exec[neutron-db-sync]
Info: /Stage[main]/Neutron::Agents::Ml2::Ovs/Package[neutron-ovs-agent]: Scheduling refresh of Service[neutron-ovs-agent-service]
Notice: /Stage[main]/Neutron::Agents::Ml2::Ovs/Neutron_agent_ovs[ovs/local_ip]/ensure: created
Info: /Stage[main]/Neutron::Agents::Ml2::Ovs/Neutron_agent_ovs[ovs/local_ip]: Scheduling refresh of Service[neutron-ovs-agent-service]
Notice: /Stage[main]/Neutron::Agents::Ml2::Ovs/Neutron_agent_ovs[ovs/tunnel_bridge]/ensure: created
Info: /Stage[main]/Neutron::Agents::Ml2::Ovs/Neutron_agent_ovs[ovs/tunnel_bridge]: Scheduling refresh of Service[neutron-ovs-agent-service]
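Note on the ring entries above: all three Ring_account_device resources notify the rebalance, but the rebalance exec is refresh-only, so it ran exactly once after the batch of device additions ("Triggered 'refresh' from 1 events") rather than once per device. A sketch of that collapse, with assumed device parameters and builder path:

    # Sketch: refreshonly execs run only when notified, so a batch of
    # ring-device changes collapses into a single rebalance.
    ring_account_device { '127.0.0.1:6002/2':
      zone   => 2,  # hypothetical zone/weight
      weight => 1,
      notify => Exec['rebalance_account'],
    }
    exec { 'rebalance_account':
      command     => 'swift-ring-builder /etc/swift/account.builder rebalance',
      path        => ['/usr/bin', '/bin'],
      refreshonly => true,  # hence "Triggered 'refresh' from 1 events"
    }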
Notice: /Stage[main]/Neutron::Agents::Ml2::Ovs/Neutron_agent_ovs[ovs/enable_tunneling]/ensure: created
Info: /Stage[main]/Neutron::Agents::Ml2::Ovs/Neutron_agent_ovs[ovs/enable_tunneling]: Scheduling refresh of Service[neutron-ovs-agent-service]
Notice: /Stage[main]/Neutron::Agents::Ml2::Ovs/Neutron_agent_ovs[ovs/integration_bridge]/ensure: created
Info: /Stage[main]/Neutron::Agents::Ml2::Ovs/Neutron_agent_ovs[ovs/integration_bridge]: Scheduling refresh of Service[neutron-ovs-agent-service]
Notice: /Stage[main]/Neutron::Agents::Ml2::Ovs/Neutron_agent_ovs[securitygroup/firewall_driver]/ensure: created
Info: /Stage[main]/Neutron::Agents::Ml2::Ovs/Neutron_agent_ovs[securitygroup/firewall_driver]: Scheduling refresh of Service[neutron-ovs-agent-service]
Notice: /Stage[main]/Neutron::Agents::Ml2::Ovs/Service[ovs-cleanup-service]/enable: enable changed 'false' to 'true'
Notice: /Stage[main]/Neutron::Agents::Ml2::Ovs/Neutron_agent_ovs[agent/drop_flows_on_start]/ensure: created
Info: /Stage[main]/Neutron::Agents::Ml2::Ovs/Neutron_agent_ovs[agent/drop_flows_on_start]: Scheduling refresh of Service[neutron-ovs-agent-service]
Notice: /Stage[main]/Neutron::Agents::Ml2::Ovs/Neutron_agent_ovs[agent/vxlan_udp_port]/ensure: created
Info: /Stage[main]/Neutron::Agents::Ml2::Ovs/Neutron_agent_ovs[agent/vxlan_udp_port]: Scheduling refresh of Service[neutron-ovs-agent-service]
Notice: /Stage[main]/Neutron::Agents::L3/Neutron_l3_agent_config[DEFAULT/agent_mode]/ensure: created
Info: /Stage[main]/Neutron::Agents::L3/Neutron_l3_agent_config[DEFAULT/agent_mode]: Scheduling refresh of Service[neutron-l3]
Notice: /Stage[main]/Neutron::Agents::L3/Neutron_l3_agent_config[DEFAULT/interface_driver]/ensure: created
Info: /Stage[main]/Neutron::Agents::L3/Neutron_l3_agent_config[DEFAULT/interface_driver]: Scheduling refresh of Service[neutron-l3]
Notice: /Stage[main]/Neutron::Agents::L3/Neutron_l3_agent_config[DEFAULT/debug]/ensure: created
Info: /Stage[main]/Neutron::Agents::L3/Neutron_l3_agent_config[DEFAULT/debug]: Scheduling refresh of Service[neutron-l3]
Notice: /Stage[main]/Neutron::Agents::Metadata/Neutron_metadata_agent_config[DEFAULT/metadata_proxy_shared_secret]/ensure: created
Info: /Stage[main]/Neutron::Agents::Metadata/Neutron_metadata_agent_config[DEFAULT/metadata_proxy_shared_secret]: Scheduling refresh of Service[neutron-metadata]
Notice: /Stage[main]/Ironic::Db::Mysql/Openstacklib::Db::Mysql[ironic]/Mysql_database[ironic]/ensure: created
Notice: /Stage[main]/Neutron::Plugins::Ml2/Neutron::Plugins::Ml2::Type_driver[vxlan]/Neutron_plugin_ml2[ml2_type_vxlan/vxlan_group]/ensure: created
Info: /Stage[main]/Neutron::Plugins::Ml2/Neutron::Plugins::Ml2::Type_driver[vxlan]/Neutron_plugin_ml2[ml2_type_vxlan/vxlan_group]: Scheduling refresh of Service[neutron-server]
Info: /Stage[main]/Neutron::Plugins::Ml2/Neutron::Plugins::Ml2::Type_driver[vxlan]/Neutron_plugin_ml2[ml2_type_vxlan/vxlan_group]: Scheduling refresh of Exec[neutron-db-sync]
Notice: /Stage[main]/Tempest/Tempest_config[DEFAULT/verbose]/ensure: created
Notice: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/user_domain_id]/ensure: created
Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/user_domain_id]: Scheduling refresh of Service[neutron-server]
Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/user_domain_id]: Scheduling refresh of Exec[neutron-db-sync]
Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/user_domain_id]: Scheduling refresh of Service[neutron-metadata]
Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/user_domain_id]: Scheduling refresh of Service[neutron-lbaas-service]
Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/user_domain_id]: Scheduling refresh of Service[neutron-l3]
Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/user_domain_id]: Scheduling refresh of Service[neutron-dhcp-service]
Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/user_domain_id]: Scheduling refresh of Service[neutron-metering-service]
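Note: the Neutron_agent_ovs settings above (local_ip, tunnel_bridge, enable_tunneling, vxlan_udp_port, ...) correspond to parameters of puppet-neutron's OVS agent class. A sketch under assumed values; the log does not show the actual ones, and the local_ip here reuses the node address from the inventory earlier in the job:

    class { 'neutron::agents::ml2::ovs':
      local_ip         => '172.19.2.66',       # assumption: node IP
      enable_tunneling => true,
      tunnel_types     => ['vxlan'],           # matches agent/tunnel_types
      firewall_driver  => 'iptables_hybrid',   # assumption; written to securitygroup/firewall_driver
    }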
Notice: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_url]/ensure: created
Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_url]: Scheduling refresh of Service[neutron-server]
Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_url]: Scheduling refresh of Exec[neutron-db-sync]
Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_url]: Scheduling refresh of Service[neutron-metadata]
Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_url]: Scheduling refresh of Service[neutron-lbaas-service]
Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_url]: Scheduling refresh of Service[neutron-l3]
Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_url]: Scheduling refresh of Service[neutron-dhcp-service]
Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/auth_url]: Scheduling refresh of Service[neutron-metering-service]
Notice: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/admin_user]/ensure: created
Info: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/admin_user]: Scheduling refresh of Service[cinder-api]
Info: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/admin_user]: Scheduling refresh of Exec[cinder-manage db_sync]
Info: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/admin_user]: Scheduling refresh of Service[cinder-scheduler]
Info: /Stage[main]/Cinder::Api/Cinder_config[keystone_authtoken/admin_user]: Scheduling refresh of Service[cinder-volume]
Notice: /Stage[main]/Ironic::Api/Ironic_config[api/port]/ensure: created
Info: /Stage[main]/Ironic::Api/Ironic_config[api/port]: Scheduling refresh of Service[httpd]
Info: /Stage[main]/Ironic::Api/Ironic_config[api/port]: Scheduling refresh of Service[ironic-conductor]
Notice: /Stage[main]/Rsync::Server/Xinetd::Service[rsync]/File[/etc/xinetd.d/rsync]/ensure: created
Info: /Stage[main]/Rsync::Server/Xinetd::Service[rsync]/File[/etc/xinetd.d/rsync]: Scheduling refresh of Service[xinetd]
Notice: /Stage[main]/Xinetd/Service[xinetd]/ensure: ensure changed 'stopped' to 'running'
Info: /Stage[main]/Xinetd/Service[xinetd]: Unscheduling refresh on Service[xinetd]
Info: Openstacklib::Db::Mysql[glance]: Scheduling refresh of Exec[glance-manage db_sync]
Notice: /Stage[main]/Openstack_integration::Swift/Swift::Storage::Filter::Recon[account]/Concat::Fragment[swift_recon_account]/File[/var/lib/puppet/concat/_etc_swift_account-server.conf/fragments/35_swift_recon_account]/ensure: defined content as '{md5}d847d2d529a3596ed6a74d841d790dc7'
Info: /Stage[main]/Openstack_integration::Swift/Swift::Storage::Filter::Recon[account]/Concat::Fragment[swift_recon_account]/File[/var/lib/puppet/concat/_etc_swift_account-server.conf/fragments/35_swift_recon_account]: Scheduling refresh of Exec[concat_/etc/swift/account-server.conf]
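Note: the rsync daemon above runs under xinetd: the rsync module writes /etc/xinetd.d/rsync and notifies xinetd, but because xinetd was started in this same run the pending refresh is dropped ("Unscheduling refresh"). A sketch of the wrapper resource, with parameter names recalled from the puppet xinetd module and to be treated as illustrative:

    xinetd::service { 'rsync':
      bind        => '0.0.0.0',                          # assumption
      port        => '873',
      server      => '/usr/bin/rsync',
      server_args => '--daemon --config /etc/rsync.conf',  # assumption
    }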
Notice: /Stage[main]/Glance::Registry/Glance_registry_config[keystone_authtoken/admin_tenant_name]/ensure: created
Info: /Stage[main]/Glance::Registry/Glance_registry_config[keystone_authtoken/admin_tenant_name]: Scheduling refresh of Service[glance-registry]
Info: /Stage[main]/Glance::Registry/Glance_registry_config[keystone_authtoken/admin_tenant_name]: Scheduling refresh of Exec[glance-manage db_sync]
Notice: /Stage[main]/Ironic::Db::Mysql/Openstacklib::Db::Mysql[ironic]/Openstacklib::Db::Mysql::Host_access[ironic_127.0.0.1]/Mysql_user[ironic@127.0.0.1]/ensure: created
Notice: /Stage[main]/Ironic::Db::Mysql/Openstacklib::Db::Mysql[ironic]/Openstacklib::Db::Mysql::Host_access[ironic_127.0.0.1]/Mysql_grant[ironic@127.0.0.1/ironic.*]/ensure: created
Info: Openstacklib::Db::Mysql[ironic]: Scheduling refresh of Exec[ironic-dbsync]
Notice: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/swift_store_container]/ensure: created
Info: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/swift_store_container]: Scheduling refresh of Service[glance-api]
Info: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/swift_store_container]: Scheduling refresh of Exec[glance-manage db_sync]
Notice: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/project_domain_id]/ensure: created
Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/project_domain_id]: Scheduling refresh of Service[neutron-server]
Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/project_domain_id]: Scheduling refresh of Exec[neutron-db-sync]
Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/project_domain_id]: Scheduling refresh of Service[neutron-metadata]
Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/project_domain_id]: Scheduling refresh of Service[neutron-lbaas-service]
Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/project_domain_id]: Scheduling refresh of Service[neutron-l3]
Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/project_domain_id]: Scheduling refresh of Service[neutron-dhcp-service]
Info: /Stage[main]/Neutron::Server/Neutron_config[keystone_authtoken/project_domain_id]: Scheduling refresh of Service[neutron-metering-service]
Notice: /Stage[main]/Apache/Package[httpd]/ensure: created
Info: /Stage[main]/Apache/Package[httpd]: Scheduling refresh of Class[Apache::Service]
Notice: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/File[/var/www/cgi-bin/nova]/ensure: created
Notice: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/File[/var/www/cgi-bin/ironic]/ensure: created
Notice: /Stage[main]/Apache::Mod::Wsgi/Apache::Mod[wsgi]/Package[mod_wsgi]/ensure: created
Notice: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/File[nova_api_wsgi]/ensure: defined content as '{md5}87dec420e9b6e707b94b149f1432bad2'
Notice: /Stage[main]/Apache::Default_mods/Apache::Mod[auth_digest]/File[auth_digest.load]/ensure: defined content as '{md5}df9e85f8da0b239fe8e698ae7ead4f60'
Info: /Stage[main]/Apache::Default_mods/Apache::Mod[auth_digest]/File[auth_digest.load]: Scheduling refresh of Class[Apache::Service]
Notice: /Stage[main]/Apache::Mod::Mime/Apache::Mod[mime]/File[mime.load]/ensure: defined content as '{md5}e36257b9efab01459141d423cae57c7c'
Info: /Stage[main]/Apache::Mod::Mime/Apache::Mod[mime]/File[mime.load]: Scheduling refresh of Class[Apache::Service]
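Note: the Host_access[ironic_127.0.0.1] entries above are the inner resources of openstacklib::db::mysql, which wraps the database, the user, and a grant per allowed host, then refreshes the service's db-sync exec. A minimal sketch with a hypothetical password:

    openstacklib::db::mysql { 'ironic':
      user          => 'ironic',
      password_hash => mysql_password('a_secret'),  # hypothetical secret
      dbname        => 'ironic',
      allowed_hosts => ['127.0.0.1'],  # yields one Mysql_user/Mysql_grant pair per host
      notify        => Exec['ironic-dbsync'],
    }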
Notice: /Stage[main]/Apache::Default_mods/Apache::Mod[expires]/File[expires.load]/ensure: defined content as '{md5}f0825bad1e470de86ffabeb86dcc5d95'
Info: /Stage[main]/Apache::Default_mods/Apache::Mod[expires]/File[expires.load]: Scheduling refresh of Class[Apache::Service]
Notice: /Stage[main]/Apache::Mod::Dav/Apache::Mod[dav]/File[dav.load]/ensure: defined content as '{md5}588e496251838c4840c14b28b5aa7881'
Info: /Stage[main]/Apache::Mod::Dav/Apache::Mod[dav]/File[dav.load]: Scheduling refresh of Class[Apache::Service]
Notice: /Stage[main]/Apache::Default_mods/Apache::Mod[authz_owner]/File[authz_owner.load]/ensure: defined content as '{md5}f30a9be1016df87f195449d9e02d1857'
Info: /Stage[main]/Apache::Default_mods/Apache::Mod[authz_owner]/File[authz_owner.load]: Scheduling refresh of Class[Apache::Service]
Notice: /Stage[main]/Apache::Default_mods/Apache::Mod[authz_groupfile]/File[authz_groupfile.load]/ensure: defined content as '{md5}ae005a36b3ac8c20af36c434561c8a75'
Info: /Stage[main]/Apache::Default_mods/Apache::Mod[authz_groupfile]/File[authz_groupfile.load]: Scheduling refresh of Class[Apache::Service]
Notice: /Stage[main]/Apache::Default_mods/Apache::Mod[authn_dbm]/File[authn_dbm.load]/ensure: defined content as '{md5}90ee8f8ef1a017cacadfda4225e10651'
Info: /Stage[main]/Apache::Default_mods/Apache::Mod[authn_dbm]/File[authn_dbm.load]: Scheduling refresh of Class[Apache::Service]
Notice: /Stage[main]/Apache::Mod::Authn_core/Apache::Mod[authn_core]/File[authn_core.load]/ensure: defined content as '{md5}704d6e8b02b0eca0eba4083960d16c52'
Info: /Stage[main]/Apache::Mod::Authn_core/Apache::Mod[authn_core]/File[authn_core.load]: Scheduling refresh of Class[Apache::Service]
Notice: /Stage[main]/Apache::Mod::Authz_user/Apache::Mod[authz_user]/File[authz_user.load]/ensure: defined content as '{md5}63594303ee808423679b1ea13dd5a784'
Info: /Stage[main]/Apache::Mod::Authz_user/Apache::Mod[authz_user]/File[authz_user.load]: Scheduling refresh of Class[Apache::Service]
Notice: /Stage[main]/Apache::Default_mods/Apache::Mod[log_config]/File[log_config.load]/ensure: defined content as '{md5}785d35cb285e190d589163b45263ca89'
Info: /Stage[main]/Apache::Default_mods/Apache::Mod[log_config]/File[log_config.load]: Scheduling refresh of Class[Apache::Service]
Notice: /Stage[main]/Apache::Default_mods/Apache::Mod[logio]/File[logio.load]/ensure: defined content as '{md5}084533c7a44e9129d0e6df952e2472b6'
Info: /Stage[main]/Apache::Default_mods/Apache::Mod[logio]/File[logio.load]: Scheduling refresh of Class[Apache::Service]
Notice: /Stage[main]/Apache::Mod::Ssl/Apache::Mod[socache_shmcb]/File[socache_shmcb.load]/ensure: defined content as '{md5}ab31a6ea611785f74851b578572e4157'
Info: /Stage[main]/Apache::Mod::Ssl/Apache::Mod[socache_shmcb]/File[socache_shmcb.load]: Scheduling refresh of Class[Apache::Service]
Notice: /Stage[main]/Apache::Mod::Mime/File[mime.conf]/ensure: defined content as '{md5}9da85e58f3bd6c780ce76db603b7f028'
Info: /Stage[main]/Apache::Mod::Mime/File[mime.conf]: Scheduling refresh of Class[Apache::Service]
Notice: /Stage[main]/Apache::Default_mods/Apache::Mod[access_compat]/File[access_compat.load]/ensure: defined content as '{md5}d5feb88bec4570e2dbc41cce7e0de003'
Info: /Stage[main]/Apache::Default_mods/Apache::Mod[access_compat]/File[access_compat.load]: Scheduling refresh of Class[Apache::Service]
Notice: /Stage[main]/Apache::Mod::Version/Apache::Mod[version]/File[version.load]/ensure: defined content as '{md5}1c9243de22ace4dc8266442c48ae0c92'
Info: /Stage[main]/Apache::Mod::Version/Apache::Mod[version]/File[version.load]: Scheduling refresh of Class[Apache::Service]
Notice: /Stage[main]/Apache::Mod::Setenvif/File[setenvif.conf]/ensure: defined content as '{md5}c7ede4173da1915b7ec088201f030c28'
Info: /Stage[main]/Apache::Mod::Setenvif/File[setenvif.conf]: Scheduling refresh of Class[Apache::Service]
Notice: /Stage[main]/Apache::Mod::Actions/Apache::Mod[actions]/File[actions.load]/ensure: defined content as '{md5}599866dfaf734f60f7e2d41ee8235515'
Info: /Stage[main]/Apache::Mod::Actions/Apache::Mod[actions]/File[actions.load]: Scheduling refresh of Class[Apache::Service]
Notice: /Stage[main]/Apache::Mod::Deflate/File[deflate.conf]/ensure: defined content as '{md5}a045d750d819b1e9dae3fbfb3f20edd5'
Info: /Stage[main]/Apache::Mod::Deflate/File[deflate.conf]: Scheduling refresh of Class[Apache::Service]
Notice: /Stage[main]/Apache::Default_mods/Apache::Mod[authz_core]/File[authz_core.load]/ensure: defined content as '{md5}39942569bff2abdb259f9a347c7246bc'
Info: /Stage[main]/Apache::Default_mods/Apache::Mod[authz_core]/File[authz_core.load]: Scheduling refresh of Class[Apache::Service]
Notice: /Stage[main]/Apache::Mod::Negotiation/File[negotiation.conf]/ensure: defined content as '{md5}47284b5580b986a6ba32580b6ffb9fd7'
Info: /Stage[main]/Apache::Mod::Negotiation/File[negotiation.conf]: Scheduling refresh of Class[Apache::Service]
Notice: /Stage[main]/Apache::Mod::Alias/Apache::Mod[alias]/File[alias.load]/ensure: defined content as '{md5}3cf2fa309ccae4c29a4b875d0894cd79'
Info: /Stage[main]/Apache::Mod::Alias/Apache::Mod[alias]/File[alias.load]: Scheduling refresh of Class[Apache::Service]
Notice: /Stage[main]/Apache::Default_mods/Apache::Mod[env]/File[env.load]/ensure: defined content as '{md5}d74184d40d0ee24ba02626a188ee7e1a'
Info: /Stage[main]/Apache::Default_mods/Apache::Mod[env]/File[env.load]: Scheduling refresh of Class[Apache::Service]
Notice: /Stage[main]/Apache::Mod::Negotiation/Apache::Mod[negotiation]/File[negotiation.load]/ensure: defined content as '{md5}d262ee6a5f20d9dd7f87770638dc2ccd'
Info: /Stage[main]/Apache::Mod::Negotiation/Apache::Mod[negotiation]/File[negotiation.load]: Scheduling refresh of Class[Apache::Service]
Notice: /Stage[main]/Apache::Default_mods/Apache::Mod[authz_dbm]/File[authz_dbm.load]/ensure: defined content as '{md5}c1363277984d22f99b70f7dce8753b60'
Info: /Stage[main]/Apache::Default_mods/Apache::Mod[authz_dbm]/File[authz_dbm.load]: Scheduling refresh of Class[Apache::Service]
Notice: /Stage[main]/Apache::Mod::Dir/File[dir.conf]/ensure: defined content as '{md5}c741d8ea840e6eb999d739eed47c69d7'
Info: /Stage[main]/Apache::Mod::Dir/File[dir.conf]: Scheduling refresh of Class[Apache::Service]
Notice: /Stage[main]/Apache::Default_mods/Apache::Mod[usertrack]/File[usertrack.load]/ensure: defined content as '{md5}e95fbbf030fabec98b948f8dc217775c'
Info: /Stage[main]/Apache::Default_mods/Apache::Mod[usertrack]/File[usertrack.load]: Scheduling refresh of Class[Apache::Service]
Notice: /Stage[main]/Apache::Mod::Prefork/File[/etc/httpd/conf.modules.d/prefork.conf]/ensure: defined content as '{md5}109c4f51dac10fc1b39373855e566d01'
Info: /Stage[main]/Apache::Mod::Prefork/File[/etc/httpd/conf.modules.d/prefork.conf]: Scheduling refresh of Class[Apache::Service]
Notice: /Stage[main]/Apache::Mod::Vhost_alias/Apache::Mod[vhost_alias]/File[vhost_alias.load]/ensure: defined content as '{md5}eca907865997d50d5130497665c3f82e'
Info: /Stage[main]/Apache::Mod::Vhost_alias/Apache::Mod[vhost_alias]/File[vhost_alias.load]: Scheduling refresh of Class[Apache::Service]
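Note on the long run of Apache::Mod entries: each one drops a one-line LoadModule file into /etc/httpd/conf.modules.d (plus an optional .conf) and notifies Class[Apache::Service], so httpd restarts once at the end of the run rather than once per module. A sketch of the defined type in use:

    apache::mod { 'wsgi':
      package => 'mod_wsgi',  # assumption: the RPM name on CentOS 7
    }
    # Roughly what the module writes for this resource (illustrative):
    #   /etc/httpd/conf.modules.d/wsgi.load
    #   LoadModule wsgi_module modules/mod_wsgi.so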
Notice: /Stage[main]/Apache::Default_mods/Apache::Mod[substitute]/File[substitute.load]/ensure: defined content as '{md5}8077c34a71afcf41c8fc644830935915'
Info: /Stage[main]/Apache::Default_mods/Apache::Mod[substitute]/File[substitute.load]: Scheduling refresh of Class[Apache::Service]
Notice: /Stage[main]/Apache::Mod::Setenvif/Apache::Mod[setenvif]/File[setenvif.load]/ensure: defined content as '{md5}ec6c99f7cc8e35bdbcf8028f652c9f6d'
Info: /Stage[main]/Apache::Mod::Setenvif/Apache::Mod[setenvif]/File[setenvif.load]: Scheduling refresh of Class[Apache::Service]
Notice: /Stage[main]/Apache::Default_mods/Apache::Mod[unixd]/File[unixd.load]/ensure: defined content as '{md5}0e8468ecc1265f8947b8725f4d1be9c0'
Info: /Stage[main]/Apache::Default_mods/Apache::Mod[unixd]/File[unixd.load]: Scheduling refresh of Class[Apache::Service]
Notice: /Stage[main]/Apache::Mod::Deflate/Apache::Mod[deflate]/File[deflate.load]/ensure: defined content as '{md5}2d1a1afcae0c70557251829a8586eeaf'
Info: /Stage[main]/Apache::Mod::Deflate/Apache::Mod[deflate]/File[deflate.load]: Scheduling refresh of Class[Apache::Service]
Notice: /Stage[main]/Apache::Mod::Wsgi/Apache::Mod[wsgi]/File[wsgi.load]/ensure: defined content as '{md5}e1795e051e7aae1f865fde0d3b86a507'
Info: /Stage[main]/Apache::Mod::Wsgi/Apache::Mod[wsgi]/File[wsgi.load]: Scheduling refresh of Class[Apache::Service]
Notice: /Stage[main]/Apache::Default_mods/Apache::Mod[auth_basic]/File[auth_basic.load]/ensure: defined content as '{md5}494bcf4b843f7908675d663d8dc1bdc8'
Info: /Stage[main]/Apache::Default_mods/Apache::Mod[auth_basic]/File[auth_basic.load]: Scheduling refresh of Class[Apache::Service]
Notice: /Stage[main]/Apache::Mod::Prefork/Apache::Mpm[prefork]/File[/etc/httpd/conf.modules.d/prefork.load]/ensure: defined content as '{md5}157529aafcf03fa491bc924103e4608e'
Info: /Stage[main]/Apache::Mod::Prefork/Apache::Mpm[prefork]/File[/etc/httpd/conf.modules.d/prefork.load]: Scheduling refresh of Class[Apache::Service]
Notice: /Stage[main]/Apache::Mod::Authn_file/Apache::Mod[authn_file]/File[authn_file.load]/ensure: defined content as '{md5}d41656680003d7b890267bb73621c60b'
Info: /Stage[main]/Apache::Mod::Authn_file/Apache::Mod[authn_file]/File[authn_file.load]: Scheduling refresh of Class[Apache::Service]
Notice: /Stage[main]/Apache::Mod::Ext_filter/Apache::Mod[ext_filter]/File[ext_filter.load]/ensure: defined content as '{md5}76d5e0ac3411a4be57ac33ebe2e52ac8'
Info: /Stage[main]/Apache::Mod::Ext_filter/Apache::Mod[ext_filter]/File[ext_filter.load]: Scheduling refresh of Class[Apache::Service]
Notice: /Stage[main]/Apache::Mod::Wsgi/File[wsgi.conf]/ensure: defined content as '{md5}8b3feb3fc2563de439920bb2c52cbd11'
Info: /Stage[main]/Apache::Mod::Wsgi/File[wsgi.conf]: Scheduling refresh of Class[Apache::Service]
Notice: /Stage[main]/Apache::Mod::Speling/Apache::Mod[speling]/File[speling.load]/ensure: defined content as '{md5}f82e9e6b871a276c324c9eeffcec8a61'
Info: /Stage[main]/Apache::Mod::Speling/Apache::Mod[speling]/File[speling.load]: Scheduling refresh of Class[Apache::Service]
Notice: /Stage[main]/Apache::Mod::Dir/Apache::Mod[dir]/File[dir.load]/ensure: defined content as '{md5}1bfb1c2a46d7351fc9eb47c659dee068'
Info: /Stage[main]/Apache::Mod::Dir/Apache::Mod[dir]/File[dir.load]: Scheduling refresh of Class[Apache::Service]
Notice: /Stage[main]/Apache::Mod::Dav_fs/Apache::Mod[dav_fs]/File[dav_fs.load]/ensure: defined content as '{md5}2996277c73b1cd684a9a3111c355e0d3'
Info: /Stage[main]/Apache::Mod::Dav_fs/Apache::Mod[dav_fs]/File[dav_fs.load]: Scheduling refresh of Class[Apache::Service]
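Note: the Apache::Mpm[prefork] entries above come from selecting the prefork MPM on the apache class, which writes both conf.modules.d/prefork.load and prefork.conf. A sketch, assuming puppetlabs-apache's usual parameters:

    class { 'apache':
      mpm_module   => 'prefork',  # yields prefork.load + prefork.conf above
      default_mods => true,       # source of the Apache::Default_mods entries
    }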
Notice: /Stage[main]/Apache::Default_mods/Apache::Mod[include]/File[include.load]/ensure: defined content as '{md5}88095a914eedc3c2c184dd5d74c3954c'
Info: /Stage[main]/Apache::Default_mods/Apache::Mod[include]/File[include.load]: Scheduling refresh of Class[Apache::Service]
Notice: /Stage[main]/Apache::Default_mods/Apache::Mod[systemd]/File[systemd.load]/ensure: defined content as '{md5}26e5d44aae258b3e9d821cbbbd3e2826'
Info: /Stage[main]/Apache::Default_mods/Apache::Mod[systemd]/File[systemd.load]: Scheduling refresh of Class[Apache::Service]
Notice: /Stage[main]/Apache::Mod::Alias/File[alias.conf]/ensure: defined content as '{md5}983e865be85f5e0daaed7433db82995e'
Info: /Stage[main]/Apache::Mod::Alias/File[alias.conf]: Scheduling refresh of Class[Apache::Service]
Notice: /Stage[main]/Apache::Mod::Ssl/Apache::Mod[ssl]/Package[mod_ssl]/ensure: created
Notice: /Stage[main]/Apache::Mod::Ssl/File[ssl.conf]/ensure: defined content as '{md5}8884ea33793365e0784cfd43be72464e'
Info: /Stage[main]/Apache::Mod::Ssl/File[ssl.conf]: Scheduling refresh of Class[Apache::Service]
Notice: /Stage[main]/Apache::Mod::Ssl/Apache::Mod[ssl]/File[ssl.load]/ensure: defined content as '{md5}e282ac9f82fe5538692a4de3616fb695'
Info: /Stage[main]/Apache::Mod::Ssl/Apache::Mod[ssl]/File[ssl.load]: Scheduling refresh of Class[Apache::Service]
Notice: /Stage[main]/Apache::Default_mods/Apache::Mod[authz_host]/File[authz_host.load]/ensure: defined content as '{md5}d1045f54d2798499ca0f030ca0eef920'
Info: /Stage[main]/Apache::Default_mods/Apache::Mod[authz_host]/File[authz_host.load]: Scheduling refresh of Class[Apache::Service]
Notice: /Stage[main]/Apache::Mod::Suexec/Apache::Mod[suexec]/File[suexec.load]/ensure: defined content as '{md5}c7d5c61c534ba423a79b0ae78ff9be35'
Info: /Stage[main]/Apache::Mod::Suexec/Apache::Mod[suexec]/File[suexec.load]: Scheduling refresh of Class[Apache::Service]
Notice: /Stage[main]/Apache::Mod::Cache/Apache::Mod[cache]/File[cache.load]/ensure: defined content as '{md5}01e4d392225b518a65b0f7d6c4e21d29'
Info: /Stage[main]/Apache::Mod::Cache/Apache::Mod[cache]/File[cache.load]: Scheduling refresh of Class[Apache::Service]
Notice: /Stage[main]/Apache::Mod::Rewrite/Apache::Mod[rewrite]/File[rewrite.load]/ensure: defined content as '{md5}26e2683352fc1599f29573ff0d934e79'
Info: /Stage[main]/Apache::Mod::Rewrite/Apache::Mod[rewrite]/File[rewrite.load]: Scheduling refresh of Class[Apache::Service]
Info: /Stage[main]/Apache/File[/etc/httpd/conf.d/autoindex.conf]: Filebucketed /etc/httpd/conf.d/autoindex.conf to puppet with sum 09726332c2fd6fc73a57fbe69fc10427
Notice: /Stage[main]/Apache/File[/etc/httpd/conf.d/autoindex.conf]/ensure: removed
Info: /Stage[main]/Apache/File[/etc/httpd/conf.d/userdir.conf]: Filebucketed /etc/httpd/conf.d/userdir.conf to puppet with sum d4a2620683cc3ff2315c685f9f354265
Notice: /Stage[main]/Apache/File[/etc/httpd/conf.d/userdir.conf]/ensure: removed
Info: /Stage[main]/Apache/File[/etc/httpd/conf.d/ssl.conf]: Filebucketed /etc/httpd/conf.d/ssl.conf to puppet with sum 1888b608773b45f4acea3604eccf3562
Notice: /Stage[main]/Apache/File[/etc/httpd/conf.d/ssl.conf]/ensure: removed
Info: /Stage[main]/Apache/File[/etc/httpd/conf.d/welcome.conf]: Filebucketed /etc/httpd/conf.d/welcome.conf to puppet with sum 9d1328b985d0851eb5bc610da6122f44
Notice: /Stage[main]/Apache/File[/etc/httpd/conf.d/welcome.conf]/ensure: removed
Info: /Stage[main]/Apache/File[/etc/httpd/conf.d/README]: Filebucketed /etc/httpd/conf.d/README to puppet with sum 20b886e8496027dcbc31ed28d404ebb1
Notice: /Stage[main]/Apache/File[/etc/httpd/conf.d/README]/ensure: removed
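Note: each "Filebucketed ... / ensure: removed" pair above shows Puppet saving a file's content to the filebucket (keyed by checksum) before deleting it, so the stock conf.d files remain recoverable with puppet filebucket restore. A sketch of the pattern:

    # Sketch: with a backup target configured, ensure => absent first stores
    # the old content under its md5 sum in the bucket.
    filebucket { 'puppet':
      path => '/var/lib/puppet/clientbucket',  # assumption: local bucket
    }
    file { '/etc/httpd/conf.d/welcome.conf':
      ensure => absent,
      backup => 'puppet',
    }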
Notice: /Stage[main]/Apache::Mod::Autoindex/Apache::Mod[autoindex]/File[autoindex.load]/ensure: defined content as '{md5}515cdf5b573e961a60d2931d39248648'
Info: /Stage[main]/Apache::Mod::Autoindex/Apache::Mod[autoindex]/File[autoindex.load]: Scheduling refresh of Class[Apache::Service]
Notice: /Stage[main]/Apache::Default_mods/Apache::Mod[authn_anon]/File[authn_anon.load]/ensure: defined content as '{md5}bf57b94b5aec35476fc2a2dc3861f132'
Info: /Stage[main]/Apache::Default_mods/Apache::Mod[authn_anon]/File[authn_anon.load]: Scheduling refresh of Class[Apache::Service]
Notice: /Stage[main]/Apache::Mod::Autoindex/File[autoindex.conf]/ensure: defined content as '{md5}2421a3c6df32c7e38c2a7a22afdf5728'
Info: /Stage[main]/Apache::Mod::Autoindex/File[autoindex.conf]: Scheduling refresh of Class[Apache::Service]
Notice: /Stage[main]/Apache/Concat[/etc/httpd/conf/ports.conf]/File[/var/lib/puppet/concat/_etc_httpd_conf_ports.conf]/ensure: created
Info: /Stage[main]/Apache/Concat[/etc/httpd/conf/ports.conf]/File[/var/lib/puppet/concat/_etc_httpd_conf_ports.conf]: Scheduling refresh of Exec[concat_/etc/httpd/conf/ports.conf]
Notice: /Stage[main]/Apache/Concat[/etc/httpd/conf/ports.conf]/File[/var/lib/puppet/concat/_etc_httpd_conf_ports.conf/fragments.concat.out]/ensure: created
Notice: /Stage[main]/Apache/Concat[/etc/httpd/conf/ports.conf]/File[/var/lib/puppet/concat/_etc_httpd_conf_ports.conf/fragments]/ensure: created
Info: /Stage[main]/Apache/Concat[/etc/httpd/conf/ports.conf]/File[/var/lib/puppet/concat/_etc_httpd_conf_ports.conf/fragments]: Scheduling refresh of Exec[concat_/etc/httpd/conf/ports.conf]
Notice: /Stage[main]/Apache/Concat::Fragment[Apache ports header]/File[/var/lib/puppet/concat/_etc_httpd_conf_ports.conf/fragments/10_Apache ports header]/ensure: defined content as '{md5}afe35cb5747574b700ebaa0f0b3a626e'
Info: /Stage[main]/Apache/Concat::Fragment[Apache ports header]/File[/var/lib/puppet/concat/_etc_httpd_conf_ports.conf/fragments/10_Apache ports header]: Scheduling refresh of Exec[concat_/etc/httpd/conf/ports.conf]
Notice: /Stage[main]/Apache/Concat[/etc/httpd/conf/ports.conf]/File[/var/lib/puppet/concat/_etc_httpd_conf_ports.conf/fragments.concat]/ensure: created
Notice: /Stage[main]/Apache::Mod::Filter/Apache::Mod[filter]/File[filter.load]/ensure: defined content as '{md5}66a1e2064a140c3e7dca7ac33877700e'
Info: /Stage[main]/Apache::Mod::Filter/Apache::Mod[filter]/File[filter.load]: Scheduling refresh of Class[Apache::Service]
Notice: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/File[ironic_wsgi]/ensure: defined content as '{md5}77ef07cc957e05e2024c75ef82d6fbbd'
Notice: /Stage[main]/Cinder::Volume/Cinder_config[DEFAULT/volume_clear]/ensure: created
Info: /Stage[main]/Cinder::Volume/Cinder_config[DEFAULT/volume_clear]: Scheduling refresh of Service[cinder-api]
Info: /Stage[main]/Cinder::Volume/Cinder_config[DEFAULT/volume_clear]: Scheduling refresh of Exec[cinder-manage db_sync]
Info: /Stage[main]/Cinder::Volume/Cinder_config[DEFAULT/volume_clear]: Scheduling refresh of Service[cinder-scheduler]
Info: /Stage[main]/Cinder::Volume/Cinder_config[DEFAULT/volume_clear]: Scheduling refresh of Service[cinder-volume]
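Note: the nova_api_wsgi / ironic_wsgi files above are deployed by openstacklib::wsgi::apache, which stages the service's WSGI entry point under /var/www/cgi-bin and sets up an Apache vhost for it. A sketch; parameter names are recalled from puppet-openstacklib and the values are assumptions:

    openstacklib::wsgi::apache { 'ironic_wsgi':
      bind_port          => 6385,    # assumption: ironic-api's usual port
      ssl                => true,    # this scenario runs the APIs over TLS
      user               => 'ironic',
      group              => 'ironic',
      wsgi_script_dir    => '/var/www/cgi-bin/ironic',  # matches the File entries above
      wsgi_script_file   => 'app',
      wsgi_script_source => '/usr/lib/python2.7/site-packages/ironic/api/app.wsgi',  # assumption
    }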
Notice: /Stage[main]/Openstack_integration::Swift/Swift::Storage::Filter::Recon[object]/Concat::Fragment[swift_recon_object]/File[/var/lib/puppet/concat/_etc_swift_object-server.conf/fragments/35_swift_recon_object]/ensure: defined content as '{md5}d847d2d529a3596ed6a74d841d790dc7'
Info: /Stage[main]/Openstack_integration::Swift/Swift::Storage::Filter::Recon[object]/Concat::Fragment[swift_recon_object]/File[/var/lib/puppet/concat/_etc_swift_object-server.conf/fragments/35_swift_recon_object]: Scheduling refresh of Exec[concat_/etc/swift/object-server.conf]
Notice: /Stage[main]/Rsync::Server/Concat::Fragment[rsyncd_conf_header]/File[/var/lib/puppet/concat/_etc_rsync.conf/fragments/00_header_rsyncd_conf_header]/ensure: defined content as '{md5}3a2ab53ad81bbfc64ceb17fb3a7efee0'
Info: /Stage[main]/Rsync::Server/Concat::Fragment[rsyncd_conf_header]/File[/var/lib/puppet/concat/_etc_rsync.conf/fragments/00_header_rsyncd_conf_header]: Scheduling refresh of Exec[concat_/etc/rsync.conf]
Notice: /Stage[main]/Glance::Registry/Glance_registry_config[keystone_authtoken/admin_user]/ensure: created
Info: /Stage[main]/Glance::Registry/Glance_registry_config[keystone_authtoken/admin_user]: Scheduling refresh of Service[glance-registry]
Info: /Stage[main]/Glance::Registry/Glance_registry_config[keystone_authtoken/admin_user]: Scheduling refresh of Exec[glance-manage db_sync]
Notice: /Stage[main]/Mongodb::Client::Install/Package[mongodb_client]/ensure: created
Notice: /Stage[main]/Nova::Scheduler/Nova::Generic_service[scheduler]/Package[nova-scheduler]/ensure: created
Info: /Stage[main]/Nova::Scheduler/Nova::Generic_service[scheduler]/Package[nova-scheduler]: Scheduling refresh of Anchor[keystone::service::end]
Info: /Stage[main]/Nova::Scheduler/Nova::Generic_service[scheduler]/Package[nova-scheduler]: Scheduling refresh of Anchor[nova::install::end]
Notice: /Stage[main]/Neutron::Db/Neutron_config[database/connection]/ensure: created
Info: /Stage[main]/Neutron::Db/Neutron_config[database/connection]: Scheduling refresh of Service[neutron-server]
Info: /Stage[main]/Neutron::Db/Neutron_config[database/connection]: Scheduling refresh of Exec[neutron-db-sync]
Info: /Stage[main]/Neutron::Db/Neutron_config[database/connection]: Scheduling refresh of Exec[neutron-db-sync]
Info: /Stage[main]/Neutron::Db/Neutron_config[database/connection]: Scheduling refresh of Service[neutron-metadata]
Info: /Stage[main]/Neutron::Db/Neutron_config[database/connection]: Scheduling refresh of Service[neutron-lbaas-service]
Info: /Stage[main]/Neutron::Db/Neutron_config[database/connection]: Scheduling refresh of Service[neutron-l3]
Info: /Stage[main]/Neutron::Db/Neutron_config[database/connection]: Scheduling refresh of Service[neutron-dhcp-service]
Info: /Stage[main]/Neutron::Db/Neutron_config[database/connection]: Scheduling refresh of Service[neutron-metering-service]
Notice: /Stage[main]/Glance::Registry::Db/Glance_registry_config[database/connection]/ensure: created
Info: /Stage[main]/Glance::Registry::Db/Glance_registry_config[database/connection]: Scheduling refresh of Service[glance-registry]
Info: /Stage[main]/Glance::Registry::Db/Glance_registry_config[database/connection]: Scheduling refresh of Exec[glance-manage db_sync]
Notice: /Stage[main]/Ironic::Db/Ironic_config[database/connection]/ensure: created
Info: /Stage[main]/Ironic::Db/Ironic_config[database/connection]: Scheduling refresh of Exec[ironic-dbsync]
Info: /Stage[main]/Ironic::Db/Ironic_config[database/connection]: Scheduling refresh of Service[httpd] Info:
/Stage[main]/Ironic::Db/Ironic_config[database/connection]: Scheduling refresh of Service[ironic-conductor] Notice: /Stage[main]/Glance::Api::Db/Glance_api_config[database/connection]/ensure: created Info: /Stage[main]/Glance::Api::Db/Glance_api_config[database/connection]: Scheduling refresh of Service[glance-api] Info: /Stage[main]/Glance::Api::Db/Glance_api_config[database/connection]: Scheduling refresh of Exec[glance-manage db_sync] Notice: /Stage[main]/Cinder::Db/Cinder_config[database/connection]/ensure: created Info: /Stage[main]/Cinder::Db/Cinder_config[database/connection]: Scheduling refresh of Service[cinder-api] Info: /Stage[main]/Cinder::Db/Cinder_config[database/connection]: Scheduling refresh of Exec[cinder-manage db_sync] Info: /Stage[main]/Cinder::Db/Cinder_config[database/connection]: Scheduling refresh of Exec[cinder-manage db_sync] Info: /Stage[main]/Cinder::Db/Cinder_config[database/connection]: Scheduling refresh of Service[cinder-scheduler] Info: /Stage[main]/Cinder::Db/Cinder_config[database/connection]: Scheduling refresh of Service[cinder-volume] Notice: /Stage[main]/Ironic::Db::Sync/Exec[ironic-dbsync]: Triggered 'refresh' from 5 events Info: /Stage[main]/Ironic::Db::Sync/Exec[ironic-dbsync]: Scheduling refresh of Service[ironic-api] Info: /Stage[main]/Ironic::Db::Sync/Exec[ironic-dbsync]: Scheduling refresh of Service[ironic-conductor] Notice: /Stage[main]/Apache::Mod::Mime_magic/File[mime_magic.conf]/ensure: defined content as '{md5}b258529b332429e2ff8344f726a95457' Info: /Stage[main]/Apache::Mod::Mime_magic/File[mime_magic.conf]: Scheduling refresh of Class[Apache::Service] Notice: /Stage[main]/Apache::Mod::Mime_magic/Apache::Mod[mime_magic]/File[mime_magic.load]/ensure: defined content as '{md5}cb8670bb2fb352aac7ebf3a85d52094c' Info: /Stage[main]/Apache::Mod::Mime_magic/Apache::Mod[mime_magic]/File[mime_magic.load]: Scheduling refresh of Class[Apache::Service] Notice: /Stage[main]/Rsync::Server/Concat[/etc/rsync.conf]/File[/var/lib/puppet/concat/_etc_rsync.conf/fragments.concat]/ensure: created Notice: /Stage[main]/Mongodb::Server::Config/File[/etc/mongod.conf]/content: --- /etc/mongod.conf 2015-12-07 22:55:21.000000000 +0000 +++ /tmp/puppet-file20160520-26469-1krjs66 2016-05-20 12:29:37.587578098 +0100 @@ -1,237 +1,19 @@ -## -### Basic Defaults -## +# mongo.conf - generated from Puppet -# Comma separated list of ip addresses to listen on (all local ips by default) -bind_ip = 127.0.0.1 - -# Specify port number (27017 by default) -#port = 27017 - -# Fork server process (false by default) -fork = true - -# Full path to pidfile (if not set, no pidfile is created) -pidfilepath = /var/run/mongodb/mongod.pid - -# Log file to send write to instead of stdout - has to be a file, not directory -logpath = /var/log/mongodb/mongod.log - -# Alternative directory for UNIX domain sockets (defaults to /tmp) -unixSocketPrefix = /var/run/mongodb - -# Directory for datafiles (defaults to /data/db/) -dbpath = /var/lib/mongodb - -# Enable/Disable journaling (journaling is on by default for 64 bit) -#journal = true -#nojournal = true - - - -## -### General options -## - -# Be more verbose (include multiple times for more verbosity e.g. 
-vvvvv) (v by default) -#verbose = v - -# Max number of simultaneous connections (1000000 by default) -#maxConns = 1000000 - -# Log to system's syslog facility instead of file or stdout (false by default) -#syslog = true - -# Syslog facility used for monogdb syslog message (user by defautl) -#syslogFacility = user - -# Append to logpath instead of over-writing (false by default) -#logappend = true - -# Desired format for timestamps in log messages (One of ctime, iso8601-utc or iso8601-local) (iso8601-local by default) -#timeStampFormat = arg - -# Private key for cluster authentication -#keyFile = arg - -# Set a configurable parameter -#setParameter = arg - -# Enable http interface (false by default) -#httpinterface = true - -# Authentication mode used for cluster authentication. Alternatives are (keyFile|sendKeyFile|sendX509|x509) (keyFile by default) -#clusterAuthMode = arg - -# Disable listening on unix sockets (false by default) -#nounixsocket = true - -# Run with/without security (without by default) -#auth = true -#noauth = true - -# Enable IPv6 support (disabled by default) -#ipv6 = true - -# Allow JSONP access via http (has security implications) (false by default) -#jsonp = true - -# Turn on simple rest api (false by default) -#rest = true - -# Value of slow for profile and console log (100 by default) -#slowms = 100 - -# 0=off 1=slow, 2=all (0 by default) -#profile = 0 - -# Periodically show cpu and iowait utilization (false by default) -#cpu = true - -# Print some diagnostic system information (false by default) -#sysinfo = true - -# Each database will be stored in a separate directory (false by default) -#directoryperdb = true - -# Don't retry any index builds that were interrupted by shutdown (false by default) -#noIndexBuildRetry = true - -# Disable data file preallocation - will often hurt performance (false by default) -#noprealloc = true - -# .ns file size (in MB) for new databases (16 MB by default) -#nssize = 16 - -# Limits each database to a certain number of files (8 default) -#quota - -# Number of files allowed per db, implies --quota (8 by default) -#quotaFiles = 8 - -# Use a smaller default file size (false by default) -#smallfiles = true - -# Seconds between disk syncs (0=never, but not recommended) (60 by default) -#syncdelay = 60 - -# Upgrade db if needed (false by default) -#upgrade = true - -# Run repair on all dbs (false by default) -#repair = true - -# Root directory for repair files (defaults to dbpath) -#repairpath = arg - -# Disable scripting engine (false by default) -#noscripting = true - -# Do not allow table scans (false by default) -#notablescan = true - -# Journal diagnostic options (0 by default) -#journalOptions = 0 - -# How often to group/batch commit (ms) (100 or 30 by default) -#journalCommitInterval = 100 - - - -## -### Replication options -## - -# Size to use (in MB) for replication op log (default 5% of disk space - i.e. 
large is good) -#oplogSize = arg - - - -## -### Master/slave options (old; use replica sets instead) -## - -# Master mode -#master = true - -# Slave mode -#slave = true - -# When slave: specify master as -#source = arg - -# When slave: specify a single database to replicate -#only = arg - -# Specify delay (in seconds) to be used when applying master ops to slave -#slavedelay = arg - -# Automatically resync if slave data is stale -#autoresync = true - - - -## -### Replica set options -## - -# Arg is [/] -#replSet = arg - -# Specify index prefetching behavior (if secondary) [none|_id_only|all] (all by default) -#replIndexPrefetch = all - - - -## -### Sharding options -## - -# Declare this is a config db of a cluster (default port 27019; default dir /data/configdb) (false by default) -#configsvr = true - -# Declare this is a shard db of a cluster (default port 27018) (false by default) -#shardsvr = true - - - -## -### SSL options -## - -# Use ssl on configured ports -#sslOnNormalPorts = true - -# Set the SSL operation mode (disabled|allowSSL|preferSSL|requireSSL) -# sslMode = arg - -# PEM file for ssl -#sslPEMKeyFile = arg - -# PEM file password -#sslPEMKeyPassword = arg - -# Key file for internal SSL authentication -#sslClusterFile = arg - -# Internal authentication key file password -#sslClusterPassword = arg - -# Certificate Authority file for SSL -#sslCAFile = arg - -# Certificate Revocation List file for SSL -#sslCRLFile = arg - -# Allow client to connect without presenting a certificate -#sslWeakCertificateValidation = true - -# Allow server certificates to provide non-matching hostnames -#sslAllowInvalidHostnames = true - -# Allow connections to servers with invalid certificates -#sslAllowInvalidCertificates = true - -# Activate FIPS 140-2 mode at startup -#sslFIPSMode = true +#where to log +logpath=/var/log/mongodb/mongodb.log +logappend=true +# Set this option to configure the mongod or mongos process to bind to and +# listen for connections from applications on this address. +# You may concatenate a list of comma separated values to bind mongod to multiple IP addresses. +bind_ip = 127.0.0.1 +# fork and run in background +fork=true +dbpath=/var/lib/mongodb +# location of pidfile +pidfilepath=/var/run/mongodb/mongod.pid +# Enables journaling +journal = true +# Turn on/off security. 
Off is currently the default +noauth=true Info: /Stage[main]/Mongodb::Server::Config/File[/etc/mongod.conf]: Filebucketed /etc/mongod.conf to puppet with sum c9466bad2ec40e2613630b7d49d58b2b Notice: /Stage[main]/Mongodb::Server::Config/File[/etc/mongod.conf]/content: content changed '{md5}c9466bad2ec40e2613630b7d49d58b2b' to '{md5}b770678a1c1e5991d9990e8fdb0fabea' Notice: /Stage[main]/Mongodb::Server::Config/File[/var/lib/mongodb]/group: group changed 'root' to 'mongodb' Notice: /Stage[main]/Mongodb::Server::Config/File[/var/lib/mongodb]/mode: mode changed '0750' to '0755' Info: Class[Mongodb::Server::Config]: Scheduling refresh of Class[Mongodb::Server::Service] Info: Class[Mongodb::Server::Service]: Scheduling refresh of Service[mongodb] Notice: /Stage[main]/Mongodb::Server::Service/Service[mongodb]/ensure: ensure changed 'stopped' to 'running' Info: /Stage[main]/Mongodb::Server::Service/Service[mongodb]: Unscheduling refresh on Service[mongodb] Notice: /Stage[main]/Neutron::Agents::Ml2::Ovs/Neutron_agent_ovs[agent/tunnel_types]/ensure: created Info: /Stage[main]/Neutron::Agents::Ml2::Ovs/Neutron_agent_ovs[agent/tunnel_types]: Scheduling refresh of Service[neutron-ovs-agent-service] Notice: /Stage[main]/Keystone/Package[keystone]/ensure: created Info: /Stage[main]/Keystone/Package[keystone]: Scheduling refresh of Anchor[keystone::install::end] Info: /Stage[main]/Keystone/Package[keystone]: Scheduling refresh of Anchor[keystone::service::end] Notice: /Stage[main]/Rabbitmq::Install/Package[rabbitmq-server]/ensure: created Info: /Stage[main]/Rabbitmq::Install/Package[rabbitmq-server]: Scheduling refresh of Class[Rabbitmq::Service] Notice: /Stage[main]/Rabbitmq/Rabbitmq_plugin[rabbitmq_management]/ensure: created Info: /Stage[main]/Rabbitmq/Rabbitmq_plugin[rabbitmq_management]: Scheduling refresh of Class[Rabbitmq::Service] Notice: /Stage[main]/Rabbitmq::Config/File[rabbitmq.config]/content: --- /etc/rabbitmq/rabbitmq.config 2014-08-11 12:36:33.000000000 +0100 +++ /tmp/puppet-file20160520-26469-1onsrit 2016-05-20 12:29:47.927185812 +0100 @@ -1,567 +1,42 @@ -%% -*- mode: erlang -*- -%% ---------------------------------------------------------------------------- -%% RabbitMQ Sample Configuration File. -%% -%% See http://www.rabbitmq.com/configure.html for details. -%% ---------------------------------------------------------------------------- +% This file managed by Puppet +% Template Path: rabbitmq/templates/rabbitmq.config [ - {rabbit, - [%% - %% Network Connectivity - %% ==================== - %% - - %% By default, RabbitMQ will listen on all interfaces, using - %% the standard (reserved) AMQP port. - %% - %% {tcp_listeners, [5672]}, - - %% To listen on a specific interface, provide a tuple of {IpAddress, Port}. - %% For example, to listen only on localhost for both IPv4 and IPv6: - %% - %% {tcp_listeners, [{"127.0.0.1", 5672}, - %% {"::1", 5672}]}, - - %% SSL listeners are configured in the same fashion as TCP listeners, - %% including the option to control the choice of interface. - %% - %% {ssl_listeners, [5671]}, - - %% Log levels (currently just used for connection logging). - %% One of 'info', 'warning', 'error' or 'none', in decreasing order - %% of verbosity. Defaults to 'info'. - %% - %% {log_levels, [{connection, info}]}, - - %% Set to 'true' to perform reverse DNS lookups when accepting a - %% connection. Hostnames will then be shown instead of IP addresses - %% in rabbitmqctl and the management plugin. 
- %% - %% {reverse_dns_lookups, true}, - - %% - %% Security / AAA - %% ============== - %% - - %% The default "guest" user is only permitted to access the server - %% via a loopback interface (e.g. localhost). - %% {loopback_users, [<<"guest">>]}, - %% - %% Uncomment the following line if you want to allow access to the - %% guest user from anywhere on the network. - %% {loopback_users, []}, - - %% Configuring SSL. - %% See http://www.rabbitmq.com/ssl.html for full documentation. - %% - %% {ssl_options, [{cacertfile, "/path/to/testca/cacert.pem"}, - %% {certfile, "/path/to/server/cert.pem"}, - %% {keyfile, "/path/to/server/key.pem"}, - %% {verify, verify_peer}, - %% {fail_if_no_peer_cert, false}]}, - - %% Choose the available SASL mechanism(s) to expose. - %% The two default (built in) mechanisms are 'PLAIN' and - %% 'AMQPLAIN'. Additional mechanisms can be added via - %% plugins. - %% - %% See http://www.rabbitmq.com/authentication.html for more details. - %% - %% {auth_mechanisms, ['PLAIN', 'AMQPLAIN']}, - - %% Select an authentication database to use. RabbitMQ comes bundled - %% with a built-in auth-database, based on mnesia. - %% - %% {auth_backends, [rabbit_auth_backend_internal]}, - - %% Configurations supporting the rabbitmq_auth_mechanism_ssl and - %% rabbitmq_auth_backend_ldap plugins. - %% - %% NB: These options require that the relevant plugin is enabled. - %% See http://www.rabbitmq.com/plugins.html for further details. - - %% The RabbitMQ-auth-mechanism-ssl plugin makes it possible to - %% authenticate a user based on the client's SSL certificate. - %% - %% To use auth-mechanism-ssl, add to or replace the auth_mechanisms - %% list with the entry 'EXTERNAL'. - %% - %% {auth_mechanisms, ['EXTERNAL']}, - - %% The rabbitmq_auth_backend_ldap plugin allows the broker to - %% perform authentication and authorisation by deferring to an - %% external LDAP server. - %% - %% For more information about configuring the LDAP backend, see - %% http://www.rabbitmq.com/ldap.html. - %% - %% Enable the LDAP auth backend by adding to or replacing the - %% auth_backends entry: - %% - %% {auth_backends, [rabbit_auth_backend_ldap]}, - - %% This pertains to both the rabbitmq_auth_mechanism_ssl plugin and - %% STOMP ssl_cert_login configurations. See the rabbitmq_stomp - %% configuration section later in this fail and the README in - %% https://github.com/rabbitmq/rabbitmq-auth-mechanism-ssl for further - %% details. - %% - %% To use the SSL cert's CN instead of its DN as the username - %% - %% {ssl_cert_login_from, common_name}, - - %% - %% Default User / VHost - %% ==================== - %% - - %% On first start RabbitMQ will create a vhost and a user. These - %% config items control what gets created. See - %% http://www.rabbitmq.com/access-control.html for further - %% information about vhosts and access control. - %% - %% {default_vhost, <<"/">>}, - %% {default_user, <<"guest">>}, - %% {default_pass, <<"guest">>}, - %% {default_permissions, [<<".*">>, <<".*">>, <<".*">>]}, - - %% Tags for default user - %% - %% For more details about tags, see the documentation for the - %% Management Plugin at http://www.rabbitmq.com/management.html. - %% - %% {default_user_tags, [administrator]}, - - %% - %% Additional network and protocol related configuration - %% ===================================================== - %% - - %% Set the default AMQP heartbeat delay (in seconds). - %% - %% {heartbeat, 600}, - - %% Set the max permissible size of an AMQP frame (in bytes). 
- %%
- %% {frame_max, 131072},
-
- %% Set the max permissible number of channels per connection.
- %% 0 means "no limit".
- %%
- %% {channel_max, 128},
-
- %% Customising Socket Options.
- %%
- %% See (http://www.erlang.org/doc/man/inet.html#setopts-2) for
- %% further documentation.
- %%
- %% {tcp_listen_options, [binary,
- %% {packet, raw},
- %% {reuseaddr, true},
- %% {backlog, 128},
- %% {nodelay, true},
- %% {exit_on_close, false}]},
-
- %%
- %% Resource Limits & Flow Control
- %% ==============================
- %%
- %% See http://www.rabbitmq.com/memory.html for full details.
-
- %% Memory-based Flow Control threshold.
- %%
- %% {vm_memory_high_watermark, 0.4},
-
- %% Fraction of the high watermark limit at which queues start to
- %% page messages out to disc in order to free up memory.
- %%
- %% {vm_memory_high_watermark_paging_ratio, 0.5},
-
- %% Set disk free limit (in bytes). Once free disk space reaches this
- %% lower bound, a disk alarm will be set - see the documentation
- %% listed above for more details.
- %%
- %% {disk_free_limit, 50000000},
-
- %% Alternatively, we can set a limit relative to total available RAM.
- %%
- %% {disk_free_limit, {mem_relative, 1.0}},
-
- %%
- %% Misc/Advanced Options
- %% =====================
- %%
- %% NB: Change these only if you understand what you are doing!
- %%
-
- %% To announce custom properties to clients on connection:
- %%
- %% {server_properties, []},
-
- %% How to respond to cluster partitions.
- %% See http://www.rabbitmq.com/partitions.html for further details.
- %%
- %% {cluster_partition_handling, ignore},
-
- %% Make clustering happen *automatically* at startup - only applied
- %% to nodes that have just been reset or started for the first time.
- %% See http://www.rabbitmq.com/clustering.html#auto-config for
- %% further details.
- %%
- %% {cluster_nodes, {['rabbit@my.host.com'], disc}},
-
- %% Set (internal) statistics collection granularity.
- %%
- %% {collect_statistics, none},
-
- %% Statistics collection interval (in milliseconds).
- %%
- %% {collect_statistics_interval, 5000},
-
- %% Explicitly enable/disable hipe compilation.
- %%
- %% {hipe_compile, true}
-
- ]},
-
- %% ----------------------------------------------------------------------------
- %% Advanced Erlang Networking/Clustering Options.
- %%
- %% See http://www.rabbitmq.com/clustering.html for details
- %% ----------------------------------------------------------------------------
- {kernel,
- [%% Sets the net_kernel tick time.
- %% Please see http://erlang.org/doc/man/kernel_app.html and
- %% http://www.rabbitmq.com/nettick.html for further details.
- %%
- %% {net_ticktime, 60}
+ {rabbit, [
+   {tcp_listen_options,
+     [binary,
+     {packet, raw},
+     {reuseaddr, true},
+     {backlog, 128},
+     {nodelay, true},
+     {exit_on_close, false}]
+   },
+   {tcp_listeners, []},
+   {ssl_listeners, [5671]},
+   {ssl_options, [
+     {cacertfile,"/etc/ssl/certs/ca-bundle.crt"},
+     {certfile,"/etc/pki/ca-trust/source/anchors/puppet_openstack.pem"},
+     {keyfile,"/etc/rabbitmq/ssl/private/n2.dusty.ci.centos.org.pem"},
+     {verify,verify_none},
+     {fail_if_no_peer_cert,false}
+   ]},
+   {default_user, <<"guest">>},
+   {default_pass, <<"guest">>}
 ]},
-
- %% ----------------------------------------------------------------------------
- %% RabbitMQ Management Plugin
- %%
- %% See http://www.rabbitmq.com/management.html for details
- %% ----------------------------------------------------------------------------
-
- {rabbitmq_management,
- [%% Pre-Load schema definitions from the following JSON file. See
- %% http://www.rabbitmq.com/management.html#load-definitions
- %%
- %% {load_definitions, "/path/to/schema.json"},
-
- %% Log all requests to the management HTTP API to a file.
- %%
- %% {http_log_dir, "/path/to/access.log"},
-
- %% Change the port on which the HTTP listener listens,
- %% specifying an interface for the web server to bind to.
- %% Also set the listener to use SSL and provide SSL options.
- %%
- %% {listener, [{port, 12345},
- %% {ip, "127.0.0.1"},
- %% {ssl, true},
- %% {ssl_opts, [{cacertfile, "/path/to/cacert.pem"},
- %% {certfile, "/path/to/cert.pem"},
- %% {keyfile, "/path/to/key.pem"}]}]},
-
- %% Configure how long aggregated data (such as message rates and queue
- %% lengths) is retained. Please read the plugin's documentation in
- %% https://www.rabbitmq.com/management.html#configuration for more
- %% details.
- %%
- %% {sample_retention_policies,
- %% [{global, [{60, 5}, {3600, 60}, {86400, 1200}]},
- %% {basic, [{60, 5}, {3600, 60}]},
- %% {detailed, [{10, 5}]}]}
- ]},
-
- {rabbitmq_management_agent,
- [%% Misc/Advanced Options
- %%
- %% NB: Change these only if you understand what you are doing!
- %%
- %% {force_fine_statistics, true}
- ]},
-
- %% ----------------------------------------------------------------------------
- %% RabbitMQ Shovel Plugin
- %%
- %% See http://www.rabbitmq.com/shovel.html for details
- %% ----------------------------------------------------------------------------
-
- {rabbitmq_shovel,
- [{shovels,
- [%% A named shovel worker.
- %% {my_first_shovel,
- %% [
-
- %% List the source broker(s) from which to consume.
- %%
- %% {sources,
- %% [%% URI(s) and pre-declarations for all source broker(s).
- %% {brokers, ["amqp://user:password@host.domain/my_vhost"]},
- %% {declarations, []}
- %% ]},
-
- %% List the destination broker(s) to publish to.
- %% {destinations,
- %% [%% A singular version of the 'brokers' element.
- %% {broker, "amqp://"},
- %% {declarations, []}
- %% ]},
-
- %% Name of the queue to shovel messages from.
- %%
- %% {queue, <<"your-queue-name-goes-here">>},
-
- %% Optional prefetch count.
- %%
- %% {prefetch_count, 10},
-
- %% when to acknowledge messages:
- %% - no_ack: never (auto)
- %% - on_publish: after each message is republished
- %% - on_confirm: when the destination broker confirms receipt
- %%
- %% {ack_mode, on_confirm},
-
- %% Overwrite fields of the outbound basic.publish.
- %%
- %% {publish_fields, [{exchange, <<"my_exchange">>},
- %% {routing_key, <<"from_shovel">>}]},
-
- %% Static list of basic.properties to set on re-publication.
- %%
- %% {publish_properties, [{delivery_mode, 2}]},
-
- %% The number of seconds to wait before attempting to
- %% reconnect in the event of a connection failure.
- %%
- %% {reconnect_delay, 2.5}
-
- %% ]} %% End of my_first_shovel
+ {kernel, [
+
+ ]}
+,
+ {rabbitmq_management, [
+   {listener, [
+     {port, 15671},
+     {ssl, true},
+     {ssl_opts, [
+       {cacertfile, "/etc/ssl/certs/ca-bundle.crt"},
+
+       {certfile, "/etc/pki/ca-trust/source/anchors/puppet_openstack.pem"},
+       {keyfile, "/etc/rabbitmq/ssl/private/n2.dusty.ci.centos.org.pem"}
+     ]}
+   ]}
- %% Rather than specifying some values per-shovel, you can specify
- %% them for all shovels here.
- %%
- %% {defaults, [{prefetch_count, 0},
- %% {ack_mode, on_confirm},
- %% {publish_fields, []},
- %% {publish_properties, [{delivery_mode, 2}]},
- %% {reconnect_delay, 2.5}]}
- ]},
-
- %% ----------------------------------------------------------------------------
- %% RabbitMQ Stomp Adapter
- %%
- %% See http://www.rabbitmq.com/stomp.html for details
- %% ----------------------------------------------------------------------------
-
- {rabbitmq_stomp,
- [%% Network Configuration - the format is generally the same as for the broker
-
- %% Listen only on localhost (ipv4 & ipv6) on a specific port.
- %% {tcp_listeners, [{"127.0.0.1", 61613},
- %% {"::1", 61613}]},
-
- %% Listen for SSL connections on a specific port.
- %% {ssl_listeners, [61614]},
-
- %% Additional SSL options
-
- %% Extract a name from the client's certificate when using SSL.
- %%
- %% {ssl_cert_login, true},
-
- %% Set a default user name and password. This is used as the default login
- %% whenever a CONNECT frame omits the login and passcode headers.
- %%
- %% Please note that setting this will allow clients to connect without
- %% authenticating!
- %%
- %% {default_user, [{login, "guest"},
- %% {passcode, "guest"}]},
-
- %% If a default user is configured, or you have configured SSL client
- %% certificate based authentication, you can choose to allow clients to
- %% omit the CONNECT frame entirely. If set to true, the client is
- %% automatically connected as the default user or user supplied in the
- %% SSL certificate whenever the first frame sent on a session is not a
- %% CONNECT frame.
- %%
- %% {implicit_connect, true}
- ]},
-
- %% ----------------------------------------------------------------------------
- %% RabbitMQ MQTT Adapter
- %%
- %% See http://hg.rabbitmq.com/rabbitmq-mqtt/file/stable/README.md for details
- %% ----------------------------------------------------------------------------
-
- {rabbitmq_mqtt,
- [%% Set the default user name and password. Will be used as the default login
- %% if a connecting client provides no other login details.
- %%
- %% Please note that setting this will allow clients to connect without
- %% authenticating!
- %%
- %% {default_user, <<"guest">>},
- %% {default_pass, <<"guest">>},
-
- %% Enable anonymous access. If this is set to false, clients MUST provide
- %% login information in order to connect. See the default_user/default_pass
- %% configuration elements for managing logins without authentication.
- %%
- %% {allow_anonymous, true},
-
- %% If you have multiple vhosts, specify the one to which the
- %% adapter connects.
- %%
- %% {vhost, <<"/">>},
-
- %% Specify the exchange to which messages from MQTT clients are published.
- %%
- %% {exchange, <<"amq.topic">>},
-
- %% Specify TTL (time to live) to control the lifetime of non-clean sessions.
- %%
- %% {subscription_ttl, 1800000},
-
- %% Set the prefetch count (governing the maximum number of unacknowledged
- %% messages that will be delivered).
- %%
- %% {prefetch, 10},
-
- %% TCP/SSL Configuration (as per the broker configuration).
- %%
- %% {tcp_listeners, [1883]},
- %% {ssl_listeners, []},
-
- %% TCP/Socket options (as per the broker configuration).
- %%
- %% {tcp_listen_options, [binary,
- %% {packet, raw},
- %% {reuseaddr, true},
- %% {backlog, 128},
- %% {nodelay, true}]}
- ]},
-
- %% ----------------------------------------------------------------------------
- %% RabbitMQ AMQP 1.0 Support
- %%
- %% See http://hg.rabbitmq.com/rabbitmq-amqp1.0/file/default/README.md
- %% for details
- %% ----------------------------------------------------------------------------
-
- {rabbitmq_amqp1_0,
- [%% Connections that are not authenticated with SASL will connect as this
- %% account. See the README for more information.
- %%
- %% Please note that setting this will allow clients to connect without
- %% authenticating!
- %%
- %% {default_user, "guest"},
-
- %% Enable protocol strict mode. See the README for more information.
- %%
- %% {protocol_strict_mode, false}
- ]},
-
- %% ----------------------------------------------------------------------------
- %% RabbitMQ LDAP Plugin
- %%
- %% See http://www.rabbitmq.com/ldap.html for details.
- %%
- %% ----------------------------------------------------------------------------
-
- {rabbitmq_auth_backend_ldap,
- [%%
- %% Connecting to the LDAP server(s)
- %% ================================
- %%
-
- %% Specify servers to bind to. You *must* set this in order for the plugin
- %% to work properly.
- %%
- %% {servers, ["your-server-name-goes-here"]},
-
- %% Connect to the LDAP server using SSL
- %%
- %% {use_ssl, false},
-
- %% Specify the LDAP port to connect to
- %%
- %% {port, 389},
-
- %% LDAP connection timeout, in milliseconds or 'infinity'
- %%
- %% {timeout, infinity},
-
- %% Enable logging of LDAP queries.
- %% One of
- %% - false (no logging is performed)
- %% - true (verbose logging of the logic used by the plugin)
- %% - network (as true, but additionally logs LDAP network traffic)
- %%
- %% Defaults to false.
- %%
- %% {log, false},
-
- %%
- %% Authentication
- %% ==============
- %%
-
- %% Pattern to convert the username given through AMQP to a DN before
- %% binding
- %%
- %% {user_dn_pattern, "cn=${username},ou=People,dc=example,dc=com"},
-
- %% Alternatively, you can convert a username to a Distinguished
- %% Name via an LDAP lookup after binding. See the documentation for
- %% full details.
-
- %% When converting a username to a dn via a lookup, set these to
- %% the name of the attribute that represents the user name, and the
- %% base DN for the lookup query.
- %%
- %% {dn_lookup_attribute, "userPrincipalName"},
- %% {dn_lookup_base, "DC=gopivotal,DC=com"},
-
- %% Controls how to bind for authorisation queries and also to
- %% retrieve the details of users logging in without presenting a
- %% password (e.g., SASL EXTERNAL).
- %% One of
- %% - as_user (to bind as the authenticated user - requires a password)
- %% - anon (to bind anonymously)
- %% - {UserDN, Password} (to bind with a specified user name and password)
- %%
- %% Defaults to 'as_user'.
- %%
- %% {other_bind, as_user},
-
- %%
- %% Authorisation
- %% =============
- %%
-
- %% The LDAP plugin can perform a variety of queries against your
- %% LDAP server to determine questions of authorisation. See
- %% http://www.rabbitmq.com/ldap.html#authorisation for more
- %% information.
-
- %% Set the query to use when determining vhost access
- %%
- %% {vhost_access_query, {in_group,
- %% "ou=${vhost}-users,ou=vhosts,dc=example,dc=com"}},
-
- %% Set the query to use when determining resource (e.g., queue) access
- %%
- %% {resource_access_query, {constant, true}},
-
- %% Set queries to determine which tags a user has
- %%
- %% {tag_queries, []}
 ]}
].
+% EOF
Info: /Stage[main]/Rabbitmq::Config/File[rabbitmq.config]: Filebucketed /etc/rabbitmq/rabbitmq.config to puppet with sum 3e342d4a660626a9b588a723ad6cba74
Notice: /Stage[main]/Rabbitmq::Config/File[rabbitmq.config]/content: content changed '{md5}3e342d4a660626a9b588a723ad6cba74' to '{md5}808c7824d2fe3217e34c0f11b45084ed'
Info: /Stage[main]/Rabbitmq::Config/File[rabbitmq.config]: Scheduling refresh of Class[Rabbitmq::Service]
Notice: /Stage[main]/Rabbitmq::Config/File[rabbitmqadmin.conf]/ensure: defined content as '{md5}56b4bb3dfb32765e14d2a04faea60e62'
Notice: /Stage[main]/Rabbitmq::Config/File[/etc/systemd/system/rabbitmq-server.service.d]/ensure: created
Notice: /Stage[main]/Keystone::Deps/Anchor[keystone::install::end]: Triggered 'refresh' from 1 events
Info: /Stage[main]/Keystone::Deps/Anchor[keystone::install::end]: Scheduling refresh of Anchor[keystone::service::begin]
Info: /Stage[main]/Keystone::Deps/Anchor[keystone::install::end]: Scheduling refresh of Exec[keystone-manage db_sync]
Notice: /Stage[main]/Keystone::Wsgi::Apache/File[/var/www/cgi-bin/keystone]/ensure: created
Notice: /Stage[main]/Keystone::Wsgi::Apache/File[keystone_wsgi_admin]/ensure: defined content as '{md5}b60f70d60e09d39ab5900f4b4eebf921'
Notice: /Stage[main]/Keystone::Wsgi::Apache/File[keystone_wsgi_main]/ensure: defined content as '{md5}b60f70d60e09d39ab5900f4b4eebf921'
Notice: /Stage[main]/Openstack_integration::Keystone/Openstack_integration::Ssl_key[keystone]/File[/etc/keystone/ssl]/ensure: created
Notice: /Stage[main]/Openstack_integration::Keystone/Openstack_integration::Ssl_key[keystone]/File[/etc/keystone/ssl/private]/ensure: created
Notice: /Stage[main]/Openstack_integration::Keystone/Openstack_integration::Ssl_key[keystone]/File[/etc/keystone/ssl/private/n2.dusty.ci.centos.org.pem]/ensure: defined content as '{md5}a4b1e9413d7b63f33794716d77cfaa00'
Info: Openstack_integration::Ssl_key[keystone]: Scheduling refresh of Service[httpd]
Notice: /Stage[main]/Rabbitmq::Config/File[/etc/systemd/system/rabbitmq-server.service.d/limits.conf]/ensure: defined content as '{md5}8eb9ff6c576b9869944215af3a568c2e'
Info: /Stage[main]/Rabbitmq::Config/File[/etc/systemd/system/rabbitmq-server.service.d/limits.conf]: Scheduling refresh of Exec[rabbitmq-systemd-reload]
Notice: /Stage[main]/Rabbitmq::Config/Exec[rabbitmq-systemd-reload]: Triggered 'refresh' from 1 events
Info: /Stage[main]/Rabbitmq::Config/Exec[rabbitmq-systemd-reload]: Scheduling refresh of Class[Rabbitmq::Service]
Notice: /Stage[main]/Keystone::Cron::Token_flush/Cron[keystone-manage token_flush]/ensure: created
Notice: /Stage[main]/Rabbitmq::Config/File[/etc/rabbitmq/ssl]/ensure: created
Notice: /Stage[main]/Openstack_integration::Rabbitmq/File[/etc/rabbitmq/ssl/private]/ensure: created
Notice: /Stage[main]/Openstack_integration::Rabbitmq/Openstack_integration::Ssl_key[rabbitmq]/File[/etc/rabbitmq/ssl/private/n2.dusty.ci.centos.org.pem]/ensure: defined content as '{md5}a4b1e9413d7b63f33794716d77cfaa00'
Info: Openstack_integration::Ssl_key[rabbitmq]: Scheduling refresh of Service[rabbitmq-server]
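With the rewritten rabbitmq.config above, the broker accepts AMQP only over SSL on port 5671 ({tcp_listeners, []} disables the plain listener) and the management plugin serves HTTPS on 15671 with the same certificate pair. A minimal sketch of how the two TLS listeners could be spot-checked from the node (hypothetical commands, not part of this job's scripts; a 401 from the management API without credentials would still prove the listener is up):

    openssl s_client -connect n2.dusty.ci.centos.org:5671 </dev/null 2>/dev/null | grep 'Verify return code'   # TLS handshake against the AMQP listener
    curl -ks -o /dev/null -w '%{http_code}\n' https://n2.dusty.ci.centos.org:15671/api/overview                # management API over SSL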
Notice: /Stage[main]/Keystone/Keystone_config[ssl/ca_certs]/ensure: created
Info: /Stage[main]/Keystone/Keystone_config[ssl/ca_certs]: Scheduling refresh of Anchor[keystone::config::end]
Notice: /Stage[main]/Keystone/Keystone_config[fernet_tokens/key_repository]/ensure: created
Info: /Stage[main]/Keystone/Keystone_config[fernet_tokens/key_repository]: Scheduling refresh of Anchor[keystone::config::end]
Notice: /Stage[main]/Keystone/Keystone_config[ssl/cert_subject]/ensure: created
Info: /Stage[main]/Keystone/Keystone_config[ssl/cert_subject]: Scheduling refresh of Anchor[keystone::config::end]
Notice: /Stage[main]/Keystone/Keystone_config[signing/keyfile]/ensure: created
Info: /Stage[main]/Keystone/Keystone_config[signing/keyfile]: Scheduling refresh of Anchor[keystone::config::end]
Notice: /Stage[main]/Keystone/Keystone_config[catalog/driver]/ensure: created
Info: /Stage[main]/Keystone/Keystone_config[catalog/driver]: Scheduling refresh of Anchor[keystone::config::end]
Notice: /Stage[main]/Keystone/Keystone_config[oslo_messaging_rabbit/rabbit_ha_queues]/ensure: created
Info: /Stage[main]/Keystone/Keystone_config[oslo_messaging_rabbit/rabbit_ha_queues]: Scheduling refresh of Anchor[keystone::config::end]
Notice: /Stage[main]/Keystone/Keystone_config[ssl/ca_key]/ensure: created
Info: /Stage[main]/Keystone/Keystone_config[ssl/ca_key]: Scheduling refresh of Anchor[keystone::config::end]
Notice: /Stage[main]/Keystone/Keystone_config[ssl/enable]/ensure: created
Info: /Stage[main]/Keystone/Keystone_config[ssl/enable]: Scheduling refresh of Anchor[keystone::config::end]
Notice: /Stage[main]/Keystone/Keystone_config[token/provider]/ensure: created
Info: /Stage[main]/Keystone/Keystone_config[token/provider]: Scheduling refresh of Anchor[keystone::config::end]
Notice: /Stage[main]/Keystone/Keystone_config[signing/key_size]/ensure: created
Info: /Stage[main]/Keystone/Keystone_config[signing/key_size]: Scheduling refresh of Anchor[keystone::config::end]
Notice: /Stage[main]/Keystone/Keystone_config[DEFAULT/public_port]/ensure: created
Info: /Stage[main]/Keystone/Keystone_config[DEFAULT/public_port]: Scheduling refresh of Anchor[keystone::config::end]
Notice: /Stage[main]/Keystone/Keystone_config[signing/ca_certs]/ensure: created
Info: /Stage[main]/Keystone/Keystone_config[signing/ca_certs]: Scheduling refresh of Anchor[keystone::config::end]
Notice: /Stage[main]/Keystone/Keystone_config[signing/ca_key]/ensure: created
Info: /Stage[main]/Keystone/Keystone_config[signing/ca_key]: Scheduling refresh of Anchor[keystone::config::end]
Notice: /Stage[main]/Keystone/Keystone_config[ssl/certfile]/ensure: created
Info: /Stage[main]/Keystone/Keystone_config[ssl/certfile]: Scheduling refresh of Anchor[keystone::config::end]
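Each Keystone_config[section/option] resource above manages a single key in the matching INI section of /etc/keystone/keystone.conf; Keystone_config[ssl/ca_certs], for example, sets ca_certs under [ssl]. A hypothetical way to eyeball the rendered section once the run completes (key order in the file may differ):

    grep -A 6 '^\[ssl\]' /etc/keystone/keystone.conf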
Notice: /Stage[main]/Apache/Apache::Vhost[default]/Concat[15-default.conf]/File[/var/lib/puppet/concat/15-default.conf]/ensure: created
Info: /Stage[main]/Apache/Apache::Vhost[default]/Concat[15-default.conf]/File[/var/lib/puppet/concat/15-default.conf]: Scheduling refresh of Exec[concat_15-default.conf]
Notice: /Stage[main]/Apache/Apache::Vhost[default]/Concat[15-default.conf]/File[/var/lib/puppet/concat/15-default.conf/fragments]/ensure: created
Info: /Stage[main]/Apache/Apache::Vhost[default]/Concat[15-default.conf]/File[/var/lib/puppet/concat/15-default.conf/fragments]: Scheduling refresh of Exec[concat_15-default.conf]
Notice: /Stage[main]/Apache/Apache::Vhost[default]/Concat::Fragment[default-serversignature]/File[/var/lib/puppet/concat/15-default.conf/fragments/90_default-serversignature]/ensure: defined content as '{md5}9bf5a458783ab459e5043e1cdf671fa7'
Info: /Stage[main]/Apache/Apache::Vhost[default]/Concat::Fragment[default-serversignature]/File[/var/lib/puppet/concat/15-default.conf/fragments/90_default-serversignature]: Scheduling refresh of Exec[concat_15-default.conf]
Notice: /Stage[main]/Apache/Apache::Vhost[default]/Concat::Fragment[default-directories]/File[/var/lib/puppet/concat/15-default.conf/fragments/60_default-directories]/ensure: defined content as '{md5}5e2a84875965faa5e3df0e222301ba37'
Info: /Stage[main]/Apache/Apache::Vhost[default]/Concat::Fragment[default-directories]/File[/var/lib/puppet/concat/15-default.conf/fragments/60_default-directories]: Scheduling refresh of Exec[concat_15-default.conf]
Notice: /Stage[main]/Apache/Apache::Vhost[default]/Concat::Fragment[default-docroot]/File[/var/lib/puppet/concat/15-default.conf/fragments/10_default-docroot]/ensure: defined content as '{md5}6faaccbc7ca8bc885ebf139223885d52'
Info: /Stage[main]/Apache/Apache::Vhost[default]/Concat::Fragment[default-docroot]/File[/var/lib/puppet/concat/15-default.conf/fragments/10_default-docroot]: Scheduling refresh of Exec[concat_15-default.conf]
Notice: /Stage[main]/Apache/Apache::Vhost[default]/Concat::Fragment[default-apache-header]/File[/var/lib/puppet/concat/15-default.conf/fragments/0_default-apache-header]/ensure: defined content as '{md5}c46eea5ff4d7874403fa7a9228888f0e'
Info: /Stage[main]/Apache/Apache::Vhost[default]/Concat::Fragment[default-apache-header]/File[/var/lib/puppet/concat/15-default.conf/fragments/0_default-apache-header]: Scheduling refresh of Exec[concat_15-default.conf]
Notice: /Stage[main]/Apache/Apache::Vhost[default]/Concat[15-default.conf]/File[/var/lib/puppet/concat/15-default.conf/fragments.concat]/ensure: created
Notice: /Stage[main]/Keystone::Logging/Keystone_config[DEFAULT/debug]/ensure: created
Info: /Stage[main]/Keystone::Logging/Keystone_config[DEFAULT/debug]: Scheduling refresh of Anchor[keystone::config::end]
Notice: /Stage[main]/Keystone/Keystone_config[ssl/keyfile]/ensure: created
Info: /Stage[main]/Keystone/Keystone_config[ssl/keyfile]: Scheduling refresh of Anchor[keystone::config::end]
Notice: /Stage[main]/Keystone/Keystone_config[signing/certfile]/ensure: created
Info: /Stage[main]/Keystone/Keystone_config[signing/certfile]: Scheduling refresh of Anchor[keystone::config::end]
Notice: /Stage[main]/Keystone/Keystone_config[DEFAULT/admin_bind_host]/ensure: created
Info: /Stage[main]/Keystone/Keystone_config[DEFAULT/admin_bind_host]: Scheduling refresh of Anchor[keystone::config::end]
Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Apache::Listen[35357]/Concat::Fragment[Listen 35357]/File[/var/lib/puppet/concat/_etc_httpd_conf_ports.conf/fragments/10_Listen 35357]/ensure: defined content as '{md5}37dc13694e40f667def8eaa0cc261d03'
Info: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Apache::Listen[35357]/Concat::Fragment[Listen 35357]/File[/var/lib/puppet/concat/_etc_httpd_conf_ports.conf/fragments/10_Listen 35357]: Scheduling refresh of Exec[concat_/etc/httpd/conf/ports.conf]
Notice: /Stage[main]/Keystone/Keystone_config[eventlet_server/public_workers]/ensure: created
Info: /Stage[main]/Keystone/Keystone_config[eventlet_server/public_workers]: Scheduling refresh of Anchor[keystone::config::end]
Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Apache::Listen[5000]/Concat::Fragment[Listen 5000]/File[/var/lib/puppet/concat/_etc_httpd_conf_ports.conf/fragments/10_Listen 5000]/ensure: defined content as '{md5}9ce4fddc0fe1c0dd6016a171946def55'
Info: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Apache::Listen[5000]/Concat::Fragment[Listen 5000]/File[/var/lib/puppet/concat/_etc_httpd_conf_ports.conf/fragments/10_Listen 5000]: Scheduling refresh of Exec[concat_/etc/httpd/conf/ports.conf]
Notice: /Stage[main]/Apache/Apache::Vhost[default]/Apache::Listen[80]/Concat::Fragment[Listen 80]/File[/var/lib/puppet/concat/_etc_httpd_conf_ports.conf/fragments/10_Listen 80]/ensure: defined content as '{md5}d5fcefc335117f400d451de47efeca87'
Info: /Stage[main]/Apache/Apache::Vhost[default]/Apache::Listen[80]/Concat::Fragment[Listen 80]/File[/var/lib/puppet/concat/_etc_httpd_conf_ports.conf/fragments/10_Listen 80]: Scheduling refresh of Exec[concat_/etc/httpd/conf/ports.conf]
Notice: /Stage[main]/Keystone/Keystone_config[catalog/template_file]/ensure: created
Info: /Stage[main]/Keystone/Keystone_config[catalog/template_file]: Scheduling refresh of Anchor[keystone::config::end]
Notice: /Stage[main]/Keystone/Keystone_config[token/driver]/ensure: created
Info: /Stage[main]/Keystone/Keystone_config[token/driver]: Scheduling refresh of Anchor[keystone::config::end]
Notice: /Stage[main]/Keystone/Keystone_config[DEFAULT/public_bind_host]/ensure: created
Info: /Stage[main]/Keystone/Keystone_config[DEFAULT/public_bind_host]: Scheduling refresh of Anchor[keystone::config::end]
Notice: /Stage[main]/Keystone/Keystone_config[DEFAULT/admin_token]/ensure: created
Info: /Stage[main]/Keystone/Keystone_config[DEFAULT/admin_token]: Scheduling refresh of Anchor[keystone::config::end]
Notice: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Apache::Listen[8774]/Concat::Fragment[Listen 8774]/File[/var/lib/puppet/concat/_etc_httpd_conf_ports.conf/fragments/10_Listen 8774]/ensure: defined content as '{md5}edb2a81e84f59aaa4978ff2d53c01a3e'
Info: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Apache::Listen[8774]/Concat::Fragment[Listen 8774]/File[/var/lib/puppet/concat/_etc_httpd_conf_ports.conf/fragments/10_Listen 8774]: Scheduling refresh of Exec[concat_/etc/httpd/conf/ports.conf]
Notice: /Stage[main]/Keystone::Logging/Keystone_config[DEFAULT/log_dir]/ensure: created
Info: /Stage[main]/Keystone::Logging/Keystone_config[DEFAULT/log_dir]: Scheduling refresh of Anchor[keystone::config::end]
Notice: /Stage[main]/Keystone/Keystone_config[DEFAULT/admin_port]/ensure: created
Info: /Stage[main]/Keystone/Keystone_config[DEFAULT/admin_port]: Scheduling refresh of Anchor[keystone::config::end]
Notice: /Stage[main]/Apache/Apache::Vhost[default]/Concat[15-default.conf]/File[/var/lib/puppet/concat/15-default.conf/fragments.concat.out]/ensure: created
Notice: /Stage[main]/Apache/Apache::Vhost[default]/Concat::Fragment[default-file_footer]/File[/var/lib/puppet/concat/15-default.conf/fragments/999_default-file_footer]/ensure: defined content as '{md5}e27b2525783e590ca1820f1e2118285d'
Info: /Stage[main]/Apache/Apache::Vhost[default]/Concat::Fragment[default-file_footer]/File[/var/lib/puppet/concat/15-default.conf/fragments/999_default-file_footer]: Scheduling refresh of Exec[concat_15-default.conf]
Notice: /Stage[main]/Keystone/Keystone_config[signing/cert_subject]/ensure: created
Info: /Stage[main]/Keystone/Keystone_config[signing/cert_subject]: Scheduling refresh of Anchor[keystone::config::end]
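The Concat::Fragment resources above all follow the same puppetlabs-concat pattern: each fragment is written under /var/lib/puppet/concat/<target>/fragments/ with a numeric ordering prefix, and the matching Exec[concat_...] later assembles them into the real file. A rough manual equivalent for the default vhost (illustrative only; the module's own concat script also handles locking and checksums):

    ls /var/lib/puppet/concat/15-default.conf/fragments | sort -n | while read f; do
      cat "/var/lib/puppet/concat/15-default.conf/fragments/$f"
    done > /tmp/15-default.conf.preview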
Notice: /Stage[main]/Apache/Apache::Vhost[default]/Concat::Fragment[default-scriptalias]/File[/var/lib/puppet/concat/15-default.conf/fragments/200_default-scriptalias]/ensure: defined content as '{md5}7fc65400381c3a010f38870f94f236f0'
Info: /Stage[main]/Apache/Apache::Vhost[default]/Concat::Fragment[default-scriptalias]/File[/var/lib/puppet/concat/15-default.conf/fragments/200_default-scriptalias]: Scheduling refresh of Exec[concat_15-default.conf]
Notice: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Apache::Listen[6385]/Concat::Fragment[Listen 6385]/File[/var/lib/puppet/concat/_etc_httpd_conf_ports.conf/fragments/10_Listen 6385]/ensure: defined content as '{md5}dab46123b45901c26ef6386ec1195db9'
Info: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Apache::Listen[6385]/Concat::Fragment[Listen 6385]/File[/var/lib/puppet/concat/_etc_httpd_conf_ports.conf/fragments/10_Listen 6385]: Scheduling refresh of Exec[concat_/etc/httpd/conf/ports.conf]
Notice: /Stage[main]/Apache/Concat[/etc/httpd/conf/ports.conf]/Exec[concat_/etc/httpd/conf/ports.conf]/returns: executed successfully
Notice: /Stage[main]/Apache/Concat[/etc/httpd/conf/ports.conf]/Exec[concat_/etc/httpd/conf/ports.conf]: Triggered 'refresh' from 8 events
Notice: /Stage[main]/Apache/Concat[/etc/httpd/conf/ports.conf]/File[/etc/httpd/conf/ports.conf]/ensure: defined content as '{md5}ae39e379894fcb4065bbee3724f7036d'
Info: Concat[/etc/httpd/conf/ports.conf]: Scheduling refresh of Class[Apache::Service]
Notice: /Stage[main]/Apache/File[/etc/httpd/conf/httpd.conf]/content:
--- /etc/httpd/conf/httpd.conf	2016-05-12 11:16:14.000000000 +0100
+++ /tmp/puppet-file20160520-26469-t9ohgg	2016-05-20 12:29:49.253135506 +0100
@@ -1,353 +1,49 @@
-#
-# This is the main Apache HTTP server configuration file. It contains the
-# configuration directives that give the server its instructions.
-# See for detailed information.
-# In particular, see
-#
-# for a discussion of each configuration directive.
-#
-# Do NOT simply read the instructions in here without understanding
-# what they do. They're here only as hints or reminders. If you are unsure
-# consult the online docs. You have been warned.
-#
-# Configuration and logfile names: If the filenames you specify for many
-# of the server's control files begin with "/" (or "drive:/" for Win32), the
-# server will use that explicit path. If the filenames do *not* begin
-# with "/", the value of ServerRoot is prepended -- so 'log/access_log'
-# with ServerRoot set to '/www' will be interpreted by the
-# server as '/www/log/access_log', whereas '/log/access_log' will be
-# interpreted as '/log/access_log'.
-
-#
-# ServerRoot: The top of the directory tree under which the server's
-# configuration, error, and log files are kept.
-#
-# Do not add a slash at the end of the directory path. If you point
-# ServerRoot at a non-local disk, be sure to specify a local disk on the
-# Mutex directive, if file-based mutexes are used. If you wish to share the
-# same ServerRoot for multiple httpd daemons, you will need to change at
-# least PidFile.
-#
+# Security
+ServerTokens OS
+ServerSignature On
+TraceEnable On
+
+ServerName "n2.dusty.ci.centos.org"
 ServerRoot "/etc/httpd"
+PidFile run/httpd.pid
+Timeout 120
+KeepAlive Off
+MaxKeepAliveRequests 100
+KeepAliveTimeout 15
+LimitRequestFieldSize 8190
+
-#
-# Listen: Allows you to bind Apache to specific IP addresses and/or
-# ports, instead of the default. See also the
-# directive.
-#
-# Change this to Listen on specific IP addresses as shown below to
-# prevent Apache from glomming onto all bound IP addresses.
-#
-#Listen 12.34.56.78:80
-Listen 80
-
-#
-# Dynamic Shared Object (DSO) Support
-#
-# To be able to use the functionality of a module which was built as a DSO you
-# have to place corresponding `LoadModule' lines at this location so the
-# directives contained in it are actually available _before_ they are used.
-# Statically compiled modules (those listed by `httpd -l') do not need
-# to be loaded here.
-#
-# Example:
-# LoadModule foo_module modules/mod_foo.so
-#
-Include conf.modules.d/*.conf
-
-#
-# If you wish httpd to run as a different user or group, you must run
-# httpd as root initially and it will switch.
-#
-# User/Group: The name (or #number) of the user/group to run httpd as.
-# It is usually good practice to create a dedicated user and group for
-# running httpd, as with most system services.
-#
 User apache
 Group apache
-# 'Main' server configuration
-#
-# The directives in this section set up the values used by the 'main'
-# server, which responds to any requests that aren't handled by a
-# definition. These values also provide defaults for
-# any containers you may define later in the file.
-#
-# All of these directives may appear inside containers,
-# in which case these default settings will be overridden for the
-# virtual host being defined.
-#
-
-#
-# ServerAdmin: Your address, where problems with the server should be
-# e-mailed. This address appears on some server-generated pages, such
-# as error documents. e.g. admin@your-domain.com
-#
-ServerAdmin root@localhost
-
-#
-# ServerName gives the name and port that the server uses to identify itself.
-# This can often be determined automatically, but we recommend you specify
-# it explicitly to prevent problems during startup.
-#
-# If your host doesn't have a registered DNS name, enter its IP address here.
-#
-#ServerName www.example.com:80
-
-#
-# Deny access to the entirety of your server's filesystem. You must
-# explicitly permit access to web content directories in other
-# blocks below.
-#
-
- AllowOverride none
+AccessFileName .htaccess
+
 Require all denied
-
+
-#
-# Note that from this point forward you must specifically allow
-# particular features to be enabled - so if something's not working as
-# you might expect, make sure that you have specifically enabled it
-# below.
-#
-
-#
-# DocumentRoot: The directory out of which you will serve your
-# documents. By default, all requests are taken from this directory, but
-# symbolic links and aliases may be used to point to other locations.
-#
-DocumentRoot "/var/www/html"
-
-#
-# Relax access to content within /var/www.
-#
-
- AllowOverride None
- # Allow open access:
- Require all granted
+
+ Options FollowSymLinks
+ AllowOverride None
-# Further relax access to the default document root:
-
- #
- # Possible values for the Options directive are "None", "All",
- # or any combination of:
- # Indexes Includes FollowSymLinks SymLinksifOwnerMatch ExecCGI MultiViews
- #
- # Note that "MultiViews" must be named *explicitly* --- "Options All"
- # doesn't give it to you.
- #
- # The Options directive is both complicated and important. Please see
- # http://httpd.apache.org/docs/2.4/mod/core.html#options
- # for more information.
- #
- Options Indexes FollowSymLinks
-
- #
- # AllowOverride controls what directives may be placed in .htaccess files.
- # It can be "All", "None", or any combination of the keywords:
- # Options FileInfo AuthConfig Limit
- #
- AllowOverride None
-
- #
- # Controls who can get stuff from this server.
- #
- Require all granted
-
-#
-# DirectoryIndex: sets the file that Apache will serve if a directory
-# is requested.
-#
-
- DirectoryIndex index.html
-
-
-#
-# The following lines prevent .htaccess and .htpasswd files from being
-# viewed by Web clients.
-#
-
- Require all denied
-
-
-#
-# ErrorLog: The location of the error log file.
-# If you do not specify an ErrorLog directive within a
-# container, error messages relating to that virtual host will be
-# logged here. If you *do* define an error logfile for a
-# container, that host's errors will be logged there and not here.
-#
-ErrorLog "logs/error_log"
-
-#
-# LogLevel: Control the number of messages logged to the error_log.
-# Possible values include: debug, info, notice, warn, error, crit,
-# alert, emerg.
-#
+HostnameLookups Off
+ErrorLog "/var/log/httpd/error_log"
 LogLevel warn
+EnableSendfile On
-
- #
- # The following directives define some format nicknames for use with
- # a CustomLog directive (see below).
- #
- LogFormat "%h %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-Agent}i\"" combined
- LogFormat "%h %l %u %t \"%r\" %>s %b" common
-
-
- # You need to enable mod_logio.c to use %I and %O
- LogFormat "%h %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-Agent}i\" %I %O" combinedio
-
-
- #
- # The location and format of the access logfile (Common Logfile Format).
- # If you do not define any access logfiles within a
- # container, they will be logged here. Contrariwise, if you *do*
- # define per- access logfiles, transactions will be
- # logged therein and *not* in this file.
- #
- #CustomLog "logs/access_log" common
-
- #
- # If you prefer a logfile with access, agent, and referer information
- # (Combined Logfile Format) you can use the following directive.
- #
- CustomLog "logs/access_log" combined
-
-
-
- #
- # Redirect: Allows you to tell clients about documents that used to
- # exist in your server's namespace, but do not anymore. The client
- # will make a new request for the document at its new location.
- # Example:
- # Redirect permanent /foo http://www.example.com/bar
-
- #
- # Alias: Maps web paths into filesystem paths and is used to
- # access content that does not live under the DocumentRoot.
- # Example:
- # Alias /webpath /full/filesystem/path
- #
- # If you include a trailing / on /webpath then the server will
- # require it to be present in the URL. You will also likely
- # need to provide a section to allow access to
- # the filesystem path.
-
- #
- # ScriptAlias: This controls which directories contain server scripts.
- # ScriptAliases are essentially the same as Aliases, except that
- # documents in the target directory are treated as applications and
- # run by the server when requested rather than as documents sent to the
- # client. The same rules about trailing "/" apply to ScriptAlias
- # directives as to Alias.
- #
- ScriptAlias /cgi-bin/ "/var/www/cgi-bin/"
-
-
-
-#
-# "/var/www/cgi-bin" should be changed to wherever your ScriptAliased
-# CGI directory is, if you have one configured.
-#
-
- AllowOverride None
- Options None
- Require all granted
-
+#Listen 80
+
+
+Include "/etc/httpd/conf.modules.d/*.load"
+Include "/etc/httpd/conf.modules.d/*.conf"
+Include "/etc/httpd/conf/ports.conf"
+
+LogFormat "%h %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-Agent}i\"" combined
+LogFormat "%h %l %u %t \"%r\" %>s %b" common
+LogFormat "%{Referer}i -> %U" referer
+LogFormat "%{User-agent}i" agent
+LogFormat "%{X-Forwarded-For}i %l %u %t \"%r\" %s %b \"%{Referer}i\" \"%{User-agent}i\"" forwarded
+
+IncludeOptional "/etc/httpd/conf.d/*.conf"
-
- #
- # TypesConfig points to the file containing the list of mappings from
- # filename extension to MIME-type.
- #
- TypesConfig /etc/mime.types
-
- #
- # AddType allows you to add to or override the MIME configuration
- # file specified in TypesConfig for specific file types.
- #
- #AddType application/x-gzip .tgz
-
- #
- # AddEncoding allows you to have certain browsers uncompress
- # information on the fly. Note: Not all browsers support this.
- #
- #AddEncoding x-compress .Z
- #AddEncoding x-gzip .gz .tgz
-
- #
- # If the AddEncoding directives above are commented-out, then you
- # probably should define those extensions to indicate media types:
- #
- AddType application/x-compress .Z
- AddType application/x-gzip .gz .tgz
-
- #
- # AddHandler allows you to map certain file extensions to "handlers":
- # actions unrelated to filetype. These can be either built into the server
- # or added with the Action directive (see below)
- #
- # To use CGI scripts outside of ScriptAliased directories:
- # (You will also need to add "ExecCGI" to the "Options" directive.)
- #
- #AddHandler cgi-script .cgi
-
- # For type maps (negotiated resources):
- #AddHandler type-map var
-
- #
- # Filters allow you to process content before it is sent to the client.
- #
- # To parse .shtml files for server-side includes (SSI):
- # (You will also need to add "Includes" to the "Options" directive.)
- #
- AddType text/html .shtml
- AddOutputFilter INCLUDES .shtml
-
-#
-# Specify a default charset for all content served; this enables
-# interpretation of all content as UTF-8 by default. To use the
-# default browser choice (ISO-8859-1), or to allow the META tags
-# in HTML content to override this choice, comment out this
-# directive:
-#
-AddDefaultCharset UTF-8
-
- #
- # The mod_mime_magic module allows the server to use various hints from the
- # contents of the file itself to determine its type. The MIMEMagicFile
- # directive tells the module where the hint definitions are located.
- #
- MIMEMagicFile conf/magic
-
-#
-# Customizable error responses come in three flavors:
-# 1) plain text 2) local redirects 3) external redirects
-#
-# Some examples:
-#ErrorDocument 500 "The server made a boo boo."
-#ErrorDocument 404 /missing.html
-#ErrorDocument 404 "/cgi-bin/missing_handler.pl"
-#ErrorDocument 402 http://www.example.com/subscription_info.html
-#
-
-#
-# EnableMMAP and EnableSendfile: On systems that support it,
-# memory-mapping or the sendfile syscall may be used to deliver
-# files. This usually improves server performance, but must
-# be turned off when serving from networked-mounted
-# filesystems or if support for these functions is otherwise
-# broken on your system.
-# Defaults if commented: EnableMMAP On, EnableSendfile Off
-#
-#EnableMMAP off
-EnableSendfile on
-
-# Supplemental configuration
-#
-# Load config files in the "/etc/httpd/conf.d" directory, if any.
-IncludeOptional conf.d/*.conf
Info: /Stage[main]/Apache/File[/etc/httpd/conf/httpd.conf]: Filebucketed /etc/httpd/conf/httpd.conf to puppet with sum f5e7449c0f17bc856e86011cb5d152ba
Notice: /Stage[main]/Apache/File[/etc/httpd/conf/httpd.conf]/content: content changed '{md5}f5e7449c0f17bc856e86011cb5d152ba' to '{md5}b3ed70a3a40f48d061c63f23fbbea111'
Info: /Stage[main]/Apache/File[/etc/httpd/conf/httpd.conf]: Scheduling refresh of Class[Apache::Service]
Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat[10-keystone_wsgi_admin.conf]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf]/ensure: created
Info: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat[10-keystone_wsgi_admin.conf]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf]: Scheduling refresh of Exec[concat_10-keystone_wsgi_admin.conf]
Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat[10-keystone_wsgi_admin.conf]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments]/ensure: created
Info: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat[10-keystone_wsgi_admin.conf]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments]: Scheduling refresh of Exec[concat_10-keystone_wsgi_admin.conf]
Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat::Fragment[keystone_wsgi_admin-wsgi]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments/260_keystone_wsgi_admin-wsgi]/ensure: defined content as '{md5}eab4d58b350697a7677844fd645581bf'
Info: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat::Fragment[keystone_wsgi_admin-wsgi]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments/260_keystone_wsgi_admin-wsgi]: Scheduling refresh of Exec[concat_10-keystone_wsgi_admin.conf]
Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat::Fragment[keystone_wsgi_admin-directories]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments/60_keystone_wsgi_admin-directories]/ensure: defined content as '{md5}cc81234a3bbf77f857ed3f11bb369e8c'
Info: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat::Fragment[keystone_wsgi_admin-directories]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments/60_keystone_wsgi_admin-directories]: Scheduling refresh of Exec[concat_10-keystone_wsgi_admin.conf]
Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat::Fragment[keystone_wsgi_admin-file_footer]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments/999_keystone_wsgi_admin-file_footer]/ensure: defined content as '{md5}e27b2525783e590ca1820f1e2118285d'
Info: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat::Fragment[keystone_wsgi_admin-file_footer]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments/999_keystone_wsgi_admin-file_footer]: Scheduling refresh of Exec[concat_10-keystone_wsgi_admin.conf]
Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat::Fragment[keystone_wsgi_admin-docroot]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments/10_keystone_wsgi_admin-docroot]/ensure: defined content as '{md5}e250ff3401328e2e106702576d684293'
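At this point ports.conf has been assembled from one Listen fragment per vhost (80, 5000, 35357, 8774 and 6385), and the stock httpd.conf has been replaced by the much shorter puppetlabs-apache template that includes it. Once Apache is restarted later in the run, a hypothetical check that every configured port is actually bound:

    ss -tln | grep -E ':(80|5000|35357|8774|6385) '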
Info: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat::Fragment[keystone_wsgi_admin-docroot]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments/10_keystone_wsgi_admin-docroot]: Scheduling refresh of Exec[concat_10-keystone_wsgi_admin.conf]
Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat::Fragment[keystone_wsgi_admin-serversignature]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments/90_keystone_wsgi_admin-serversignature]/ensure: defined content as '{md5}9bf5a458783ab459e5043e1cdf671fa7'
Info: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat::Fragment[keystone_wsgi_admin-serversignature]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments/90_keystone_wsgi_admin-serversignature]: Scheduling refresh of Exec[concat_10-keystone_wsgi_admin.conf]
Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat::Fragment[keystone_wsgi_admin-logging]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments/80_keystone_wsgi_admin-logging]/ensure: defined content as '{md5}6e95210e81b53fbd537c884ba77577a6'
Info: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat::Fragment[keystone_wsgi_admin-logging]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments/80_keystone_wsgi_admin-logging]: Scheduling refresh of Exec[concat_10-keystone_wsgi_admin.conf]
Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat::Fragment[keystone_wsgi_admin-ssl]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments/230_keystone_wsgi_admin-ssl]/ensure: defined content as '{md5}30fbced56cdd99b65558d366e970e5fd'
Info: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat::Fragment[keystone_wsgi_admin-ssl]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments/230_keystone_wsgi_admin-ssl]: Scheduling refresh of Exec[concat_10-keystone_wsgi_admin.conf]
Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat[10-keystone_wsgi_admin.conf]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments.concat]/ensure: created
Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat::Fragment[keystone_wsgi_admin-access_log]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments/100_keystone_wsgi_admin-access_log]/ensure: defined content as '{md5}f3a5a390b72c0e5ada35efbd1ab9c568'
Info: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat::Fragment[keystone_wsgi_admin-access_log]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments/100_keystone_wsgi_admin-access_log]: Scheduling refresh of Exec[concat_10-keystone_wsgi_admin.conf]
Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat[10-keystone_wsgi_main.conf]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf]/ensure: created
Info: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat[10-keystone_wsgi_main.conf]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf]: Scheduling refresh of Exec[concat_10-keystone_wsgi_main.conf]
Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat[10-keystone_wsgi_main.conf]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments]/ensure: created
Info: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat[10-keystone_wsgi_main.conf]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments]: Scheduling refresh of Exec[concat_10-keystone_wsgi_main.conf]
Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat::Fragment[keystone_wsgi_main-file_footer]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments/999_keystone_wsgi_main-file_footer]/ensure: defined content as '{md5}e27b2525783e590ca1820f1e2118285d'
Info: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat::Fragment[keystone_wsgi_main-file_footer]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments/999_keystone_wsgi_main-file_footer]: Scheduling refresh of Exec[concat_10-keystone_wsgi_main.conf]
Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat[10-keystone_wsgi_main.conf]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments.concat]/ensure: created
Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat::Fragment[keystone_wsgi_main-serversignature]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments/90_keystone_wsgi_main-serversignature]/ensure: defined content as '{md5}9bf5a458783ab459e5043e1cdf671fa7'
Info: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat::Fragment[keystone_wsgi_main-serversignature]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments/90_keystone_wsgi_main-serversignature]: Scheduling refresh of Exec[concat_10-keystone_wsgi_main.conf]
Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat[10-keystone_wsgi_main.conf]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments.concat.out]/ensure: created
Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat::Fragment[keystone_wsgi_main-logging]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments/80_keystone_wsgi_main-logging]/ensure: defined content as '{md5}2e5c08362091258b73059cd0e5435e9a'
Info: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat::Fragment[keystone_wsgi_main-logging]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments/80_keystone_wsgi_main-logging]: Scheduling refresh of Exec[concat_10-keystone_wsgi_main.conf]
Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat::Fragment[keystone_wsgi_main-access_log]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments/100_keystone_wsgi_main-access_log]/ensure: defined content as '{md5}f8509b8e1ef317dd58bbcca1480a9c61'
Info: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat::Fragment[keystone_wsgi_main-access_log]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments/100_keystone_wsgi_main-access_log]: Scheduling refresh of Exec[concat_10-keystone_wsgi_main.conf]
Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat::Fragment[keystone_wsgi_main-apache-header]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments/0_keystone_wsgi_main-apache-header]/ensure: defined content as '{md5}bcbedce152a9ba8190ab5a78ad4256f9'
Info: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat::Fragment[keystone_wsgi_main-apache-header]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments/0_keystone_wsgi_main-apache-header]: Scheduling refresh of Exec[concat_10-keystone_wsgi_main.conf]
Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat::Fragment[keystone_wsgi_main-wsgi]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments/260_keystone_wsgi_main-wsgi]/ensure: defined content as '{md5}0ed0f415940e9362ef9e1871efb2c050'
Info: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat::Fragment[keystone_wsgi_main-wsgi]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments/260_keystone_wsgi_main-wsgi]: Scheduling refresh of Exec[concat_10-keystone_wsgi_main.conf]
Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat::Fragment[keystone_wsgi_main-ssl]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments/230_keystone_wsgi_main-ssl]/ensure: defined content as '{md5}30fbced56cdd99b65558d366e970e5fd'
Info: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat::Fragment[keystone_wsgi_main-ssl]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments/230_keystone_wsgi_main-ssl]: Scheduling refresh of Exec[concat_10-keystone_wsgi_main.conf]
Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat::Fragment[keystone_wsgi_main-docroot]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments/10_keystone_wsgi_main-docroot]/ensure: defined content as '{md5}e250ff3401328e2e106702576d684293'
Info: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat::Fragment[keystone_wsgi_main-docroot]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments/10_keystone_wsgi_main-docroot]: Scheduling refresh of Exec[concat_10-keystone_wsgi_main.conf]
Notice: /Stage[main]/Keystone/Keystone_config[token/expiration]/ensure: created
Info: /Stage[main]/Keystone/Keystone_config[token/expiration]: Scheduling refresh of Anchor[keystone::config::end]
Notice: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat[10-nova_api_wsgi.conf]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf]/ensure: created
Info: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat[10-nova_api_wsgi.conf]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf]: Scheduling refresh of Exec[concat_10-nova_api_wsgi.conf]
Notice: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat[10-nova_api_wsgi.conf]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments]/ensure: created
Info: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat[10-nova_api_wsgi.conf]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments]: Scheduling refresh of Exec[concat_10-nova_api_wsgi.conf]
Notice: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat::Fragment[nova_api_wsgi-serversignature]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments/90_nova_api_wsgi-serversignature]/ensure: defined content as '{md5}9bf5a458783ab459e5043e1cdf671fa7'
Info: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat::Fragment[nova_api_wsgi-serversignature]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments/90_nova_api_wsgi-serversignature]: Scheduling refresh of Exec[concat_10-nova_api_wsgi.conf]
Notice: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat::Fragment[nova_api_wsgi-ssl]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments/230_nova_api_wsgi-ssl]/ensure: defined content as '{md5}6e6f07e9782e4535b25afa0e9dbd5964'
Info: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat::Fragment[nova_api_wsgi-ssl]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments/230_nova_api_wsgi-ssl]: Scheduling refresh of Exec[concat_10-nova_api_wsgi.conf]
Notice: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat::Fragment[nova_api_wsgi-docroot]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments/10_nova_api_wsgi-docroot]/ensure: defined content as '{md5}a24d3496cbab869d04b9f6400e91f05b'
Info: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat::Fragment[nova_api_wsgi-docroot]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments/10_nova_api_wsgi-docroot]: Scheduling refresh of Exec[concat_10-nova_api_wsgi.conf]
Notice: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat::Fragment[nova_api_wsgi-file_footer]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments/999_nova_api_wsgi-file_footer]/ensure: defined content as '{md5}e27b2525783e590ca1820f1e2118285d'
Info: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat::Fragment[nova_api_wsgi-file_footer]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments/999_nova_api_wsgi-file_footer]: Scheduling refresh of Exec[concat_10-nova_api_wsgi.conf]
Notice: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat[10-nova_api_wsgi.conf]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments.concat]/ensure: created
Notice: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat::Fragment[nova_api_wsgi-apache-header]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments/0_nova_api_wsgi-apache-header]/ensure: defined content as '{md5}532286892f0965124c5d0f7a2d7ad2d2'
Info: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat::Fragment[nova_api_wsgi-apache-header]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments/0_nova_api_wsgi-apache-header]: Scheduling refresh of Exec[concat_10-nova_api_wsgi.conf]
Notice: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat[10-nova_api_wsgi.conf]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments.concat.out]/ensure: created
Notice: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat::Fragment[nova_api_wsgi-logging]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments/80_nova_api_wsgi-logging]/ensure: defined content as '{md5}fffc2d2c643ad504aca6c347d7aec2d6'
Info: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat::Fragment[nova_api_wsgi-logging]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments/80_nova_api_wsgi-logging]: Scheduling refresh of Exec[concat_10-nova_api_wsgi.conf]
Notice: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat::Fragment[nova_api_wsgi-directories]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments/60_nova_api_wsgi-directories]/ensure: defined content as '{md5}969793e0f283be30a0641501324cd29c'
Info: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat::Fragment[nova_api_wsgi-directories]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments/60_nova_api_wsgi-directories]: Scheduling refresh of Exec[concat_10-nova_api_wsgi.conf]
Notice: /Stage[main]/Keystone/Keystone_config[eventlet_server/admin_workers]/ensure: created
Info: /Stage[main]/Keystone/Keystone_config[eventlet_server/admin_workers]: Scheduling refresh of Anchor[keystone::config::end]
Notice: /Stage[main]/Keystone::Db/Keystone_config[database/connection]/ensure: created
Info: /Stage[main]/Keystone::Db/Keystone_config[database/connection]: Scheduling refresh of Anchor[keystone::config::end]
Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat[10-keystone_wsgi_admin.conf]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments.concat.out]/ensure: created
Notice: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat::Fragment[nova_api_wsgi-access_log]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments/100_nova_api_wsgi-access_log]/ensure: defined content as '{md5}3202d2662ed78e6f729646225603e1f5'
Info: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat::Fragment[nova_api_wsgi-access_log]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments/100_nova_api_wsgi-access_log]: Scheduling refresh of Exec[concat_10-nova_api_wsgi.conf]
Notice: /Stage[main]/Rabbitmq::Config/File[/etc/security/limits.d/rabbitmq-server.conf]/ensure: defined content as '{md5}5ddc6ba5fcaeddd5b1565e5adfda5236'
Info: /Stage[main]/Rabbitmq::Config/File[/etc/security/limits.d/rabbitmq-server.conf]: Scheduling refresh of Class[Rabbitmq::Service]
Notice: /Stage[main]/Apache/Apache::Vhost[default]/Concat::Fragment[default-access_log]/File[/var/lib/puppet/concat/15-default.conf/fragments/100_default-access_log]/ensure: defined content as '{md5}65fb033baac888b4ab85c295e870cb8f'
Info: /Stage[main]/Apache/Apache::Vhost[default]/Concat::Fragment[default-access_log]/File[/var/lib/puppet/concat/15-default.conf/fragments/100_default-access_log]: Scheduling refresh of Exec[concat_15-default.conf]
Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat::Fragment[keystone_wsgi_admin-apache-header]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments/0_keystone_wsgi_admin-apache-header]/ensure: defined content as '{md5}36e2769e5e22c8ff440262db545892f0'
Info: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat::Fragment[keystone_wsgi_admin-apache-header]/File[/var/lib/puppet/concat/10-keystone_wsgi_admin.conf/fragments/0_keystone_wsgi_admin-apache-header]: Scheduling refresh of Exec[concat_10-keystone_wsgi_admin.conf]
Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat[10-keystone_wsgi_admin.conf]/Exec[concat_10-keystone_wsgi_admin.conf]/returns: executed successfully
Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat[10-keystone_wsgi_admin.conf]/Exec[concat_10-keystone_wsgi_admin.conf]: Triggered 'refresh' from 11 events
Notice: /Stage[main]/Rabbitmq::Config/File[rabbitmq-env.config]/ensure: defined content as '{md5}d1faed99ee5f85f2e3ef458c2d19f3a8'
Info: /Stage[main]/Rabbitmq::Config/File[rabbitmq-env.config]: Scheduling refresh of Class[Rabbitmq::Service]
Info: Class[Rabbitmq::Config]: Scheduling refresh of Class[Rabbitmq::Service]
Info: Class[Rabbitmq::Service]: Scheduling refresh of Service[rabbitmq-server]
Notice: /Stage[main]/Rabbitmq::Service/Service[rabbitmq-server]/ensure: ensure changed 'stopped' to 'running'
Info: /Stage[main]/Rabbitmq::Service/Service[rabbitmq-server]: Unscheduling refresh on Service[rabbitmq-server]
Notice: /Stage[main]/Rabbitmq::Management/Rabbitmq_user[guest]/ensure: removed
Notice: /Stage[main]/Rabbitmq::Install::Rabbitmqadmin/Staging::File[rabbitmqadmin]/Exec[/var/lib/rabbitmq/rabbitmqadmin]/returns: executed successfully
Notice: /Stage[main]/Rabbitmq::Install::Rabbitmqadmin/File[/usr/local/bin/rabbitmqadmin]/ensure: defined content as '{md5}63d7331e825c865a97b7a8d1299841ff'
Notice: /Stage[main]/Openstack_integration::Ironic/Rabbitmq_user[ironic]/ensure: created
Notice: /Stage[main]/Openstack_integration::Neutron/Rabbitmq_user[neutron]/ensure: created
Notice: /Stage[main]/Openstack_integration::Cinder/Rabbitmq_user[cinder]/ensure: created
Notice: /Stage[main]/Openstack_integration::Nova/Rabbitmq_user[nova]/ensure: created
Notice: /Stage[main]/Openstack_integration::Ironic/Rabbitmq_user_permissions[ironic@/]/ensure: created
Notice: /Stage[main]/Ironic::Conductor/Service[ironic-conductor]/ensure: ensure changed 'stopped' to 'running'
Info: /Stage[main]/Ironic::Conductor/Service[ironic-conductor]: Unscheduling refresh on Service[ironic-conductor]
Notice: /Stage[main]/Ironic::Api/Service[ironic-api]: Triggered 'refresh' from 1 events
Notice: /Stage[main]/Openstack_integration::Nova/Rabbitmq_user_permissions[nova@/]/ensure: created
Notice: /Stage[main]/Openstack_integration::Cinder/Rabbitmq_user_permissions[cinder@/]/ensure: created
Notice: /Stage[main]/Openstack_integration::Glance/Rabbitmq_user[glance]/ensure: created
Notice: /Stage[main]/Openstack_integration::Glance/Rabbitmq_user_permissions[glance@/]/ensure: created
Notice: /Stage[main]/Openstack_integration::Neutron/Rabbitmq_user_permissions[neutron@/]/ensure: created
Notice: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/swift_store_large_object_size]/ensure: created
Info: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/swift_store_large_object_size]: Scheduling refresh of Service[glance-api]
Info: /Stage[main]/Glance::Backend::Swift/Glance_api_config[glance_store/swift_store_large_object_size]: Scheduling refresh of Exec[glance-manage db_sync]
Notice: /Stage[main]/Swift::Storage::Account/Swift::Storage::Generic[account]/Package[swift-account]/ensure: created
Info: /Stage[main]/Swift::Storage::Account/Swift::Storage::Generic[account]/Package[swift-account]: Scheduling refresh of Service[swift-account-server]
Info: /Stage[main]/Swift::Storage::Account/Swift::Storage::Generic[account]/Package[swift-account]: Scheduling refresh of Service[swift-account-replicator]
Info: /Stage[main]/Swift::Storage::Account/Swift::Storage::Generic[account]/Package[swift-account]: Scheduling refresh of Anchor[keystone::service::end]
Info: /Stage[main]/Swift::Storage::Account/Swift::Storage::Generic[account]/Package[swift-account]: Scheduling refresh of Service[swift-account-auditor]
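The Rabbitmq_user and Rabbitmq_user_permissions resources above drive rabbitmqctl under the hood, so the resulting state can be inspected the same way (hypothetical spot-check on the node; note that the default guest account was removed earlier in this run, so only the service users should remain):

    rabbitmqctl list_users
    rabbitmqctl list_permissions -p /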
Scheduling refresh of Swift::Service[swift-account-server] Info: /Stage[main]/Swift::Storage::Account/Swift::Storage::Generic[account]/Package[swift-account]: Scheduling refresh of Swift::Service[swift-account-replicator] Info: /Stage[main]/Swift::Storage::Account/Swift::Storage::Generic[account]/Package[swift-account]: Scheduling refresh of Swift::Service[swift-account-auditor] Info: Swift::Service[swift-account-server]: Scheduling refresh of Service[swift-account-server] Notice: /Stage[main]/Swift::Storage::Account/Swift::Storage::Generic[account]/File[/etc/swift/account-server/]/owner: owner changed 'root' to 'swift' Notice: /Stage[main]/Swift::Storage::Account/Swift::Storage::Generic[account]/File[/etc/swift/account-server/]/group: group changed 'root' to 'swift' Info: Swift::Service[swift-account-replicator]: Scheduling refresh of Service[swift-account-replicator] Notice: /Stage[main]/Swift::Storage::Container/Swift::Storage::Generic[container]/Package[swift-container]/ensure: created Info: /Stage[main]/Swift::Storage::Container/Swift::Storage::Generic[container]/Package[swift-container]: Scheduling refresh of Service[swift-container-server] Info: /Stage[main]/Swift::Storage::Container/Swift::Storage::Generic[container]/Package[swift-container]: Scheduling refresh of Service[swift-container-replicator] Info: /Stage[main]/Swift::Storage::Container/Swift::Storage::Generic[container]/Package[swift-container]: Scheduling refresh of Anchor[keystone::service::end] Info: /Stage[main]/Swift::Storage::Container/Swift::Storage::Generic[container]/Package[swift-container]: Scheduling refresh of Service[swift-container-auditor] Info: /Stage[main]/Swift::Storage::Container/Swift::Storage::Generic[container]/Package[swift-container]: Scheduling refresh of Swift::Service[swift-container-server] Info: /Stage[main]/Swift::Storage::Container/Swift::Storage::Generic[container]/Package[swift-container]: Scheduling refresh of Swift::Service[swift-container-replicator] Info: /Stage[main]/Swift::Storage::Container/Swift::Storage::Generic[container]/Package[swift-container]: Scheduling refresh of Swift::Service[swift-container-auditor] Info: Swift::Service[swift-container-replicator]: Scheduling refresh of Service[swift-container-replicator] Notice: /Stage[main]/Swift::Storage::Container/Swift::Storage::Generic[container]/File[/etc/swift/container-server/]/owner: owner changed 'root' to 'swift' Notice: /Stage[main]/Swift::Storage::Container/Swift::Storage::Generic[container]/File[/etc/swift/container-server/]/group: group changed 'root' to 'swift' Info: Swift::Service[swift-container-auditor]: Scheduling refresh of Service[swift-container-auditor] Info: Swift::Service[swift-container-server]: Scheduling refresh of Service[swift-container-server] Notice: /Stage[main]/Swift::Storage::Object/Swift::Storage::Generic[object]/Package[swift-object]/ensure: created Info: /Stage[main]/Swift::Storage::Object/Swift::Storage::Generic[object]/Package[swift-object]: Scheduling refresh of Service[swift-object-server] Info: /Stage[main]/Swift::Storage::Object/Swift::Storage::Generic[object]/Package[swift-object]: Scheduling refresh of Service[swift-object-replicator] Info: /Stage[main]/Swift::Storage::Object/Swift::Storage::Generic[object]/Package[swift-object]: Scheduling refresh of Anchor[keystone::service::end] Info: /Stage[main]/Swift::Storage::Object/Swift::Storage::Generic[object]/Package[swift-object]: Scheduling refresh of Service[swift-object-auditor] Info: 
/Stage[main]/Swift::Storage::Object/Swift::Storage::Generic[object]/Package[swift-object]: Scheduling refresh of Swift::Service[swift-object-server] Info: /Stage[main]/Swift::Storage::Object/Swift::Storage::Generic[object]/Package[swift-object]: Scheduling refresh of Swift::Service[swift-object-replicator] Info: /Stage[main]/Swift::Storage::Object/Swift::Storage::Generic[object]/Package[swift-object]: Scheduling refresh of Swift::Service[swift-object-auditor] Notice: /Stage[main]/Swift::Storage::Object/Swift::Storage::Generic[object]/File[/etc/swift/object-server/]/owner: owner changed 'root' to 'swift' Notice: /Stage[main]/Swift::Storage::Object/Swift::Storage::Generic[object]/File[/etc/swift/object-server/]/group: group changed 'root' to 'swift' Info: Swift::Service[swift-object-server]: Scheduling refresh of Service[swift-object-server] Info: Swift::Service[swift-object-replicator]: Scheduling refresh of Service[swift-object-replicator] Info: Swift::Service[swift-object-auditor]: Scheduling refresh of Service[swift-object-auditor] Info: Swift::Service[swift-account-auditor]: Scheduling refresh of Service[swift-account-auditor] Notice: /Stage[main]/Swift::Proxy/Package[swift-proxy]/ensure: created Info: /Stage[main]/Swift::Proxy/Package[swift-proxy]: Scheduling refresh of Anchor[keystone::service::end] Notice: /Stage[main]/Swift::Proxy/Concat[/etc/swift/proxy-server.conf]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf]/ensure: created Info: /Stage[main]/Swift::Proxy/Concat[/etc/swift/proxy-server.conf]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf]: Scheduling refresh of Exec[concat_/etc/swift/proxy-server.conf] Notice: /Stage[main]/Swift::Proxy/Concat[/etc/swift/proxy-server.conf]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments.concat]/ensure: created Notice: /Stage[main]/Swift::Proxy/Concat[/etc/swift/proxy-server.conf]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments.concat.out]/ensure: created Notice: /Stage[main]/Swift::Proxy/Concat[/etc/swift/proxy-server.conf]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments]/ensure: created Info: /Stage[main]/Swift::Proxy/Concat[/etc/swift/proxy-server.conf]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments]: Scheduling refresh of Exec[concat_/etc/swift/proxy-server.conf] Notice: /Stage[main]/Swift::Proxy/Concat::Fragment[swift_proxy]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/00_swift_proxy]/ensure: defined content as '{md5}3e7368112b701526ac018208596b6f2d' Info: /Stage[main]/Swift::Proxy/Concat::Fragment[swift_proxy]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/00_swift_proxy]: Scheduling refresh of Exec[concat_/etc/swift/proxy-server.conf] Notice: /Stage[main]/Swift::Proxy::Tempauth/Concat::Fragment[swift-proxy-swauth]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/01_swift-proxy-swauth]/ensure: defined content as '{md5}77ae9d1ddf6d75e07b795e520797adb4' Info: /Stage[main]/Swift::Proxy::Tempauth/Concat::Fragment[swift-proxy-swauth]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/01_swift-proxy-swauth]: Scheduling refresh of Exec[concat_/etc/swift/proxy-server.conf] Notice: /Stage[main]/Swift::Proxy::Container_quotas/Concat::Fragment[swift_container_quotas]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/81_swift_container_quotas]/ensure: defined content as '{md5}9cb7c3e198ec9152a4e1f80eb6448f6a' Info: 
/Stage[main]/Swift::Proxy::Container_quotas/Concat::Fragment[swift_container_quotas]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/81_swift_container_quotas]: Scheduling refresh of Exec[concat_/etc/swift/proxy-server.conf] Notice: /Stage[main]/Swift::Proxy::Catch_errors/Concat::Fragment[swift_catch_errors]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/24_swift_catch_errors]/ensure: defined content as '{md5}e07f0e5b125db7d6c8b4724c1648bcd5' Info: /Stage[main]/Swift::Proxy::Catch_errors/Concat::Fragment[swift_catch_errors]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/24_swift_catch_errors]: Scheduling refresh of Exec[concat_/etc/swift/proxy-server.conf] Notice: /Stage[main]/Swift::Proxy::Healthcheck/Concat::Fragment[swift_healthcheck]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/25_swift_healthcheck]/ensure: defined content as '{md5}8c92056c41082619d179f88ea15c5fc6' Info: /Stage[main]/Swift::Proxy::Healthcheck/Concat::Fragment[swift_healthcheck]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/25_swift_healthcheck]: Scheduling refresh of Exec[concat_/etc/swift/proxy-server.conf] Notice: /Stage[main]/Swift::Proxy::Account_quotas/Concat::Fragment[swift_account_quotas]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/80_swift_account_quotas]/ensure: defined content as '{md5}c1ff253d7976e5b952647085fb3cefe3' Info: /Stage[main]/Swift::Proxy::Account_quotas/Concat::Fragment[swift_account_quotas]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/80_swift_account_quotas]: Scheduling refresh of Exec[concat_/etc/swift/proxy-server.conf] Notice: /Stage[main]/Swift::Proxy::Cache/Concat::Fragment[swift_cache]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/23_swift_cache]/ensure: defined content as '{md5}cf82123513431b136e71a4503aeb82d9' Info: /Stage[main]/Swift::Proxy::Cache/Concat::Fragment[swift_cache]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/23_swift_cache]: Scheduling refresh of Exec[concat_/etc/swift/proxy-server.conf] Notice: /Stage[main]/Swift::Proxy::Tempurl/Concat::Fragment[swift-proxy-tempurl]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/29_swift-proxy-tempurl]/ensure: defined content as '{md5}2fe004eae9f03fc684f9ed90044bd9c5' Info: /Stage[main]/Swift::Proxy::Tempurl/Concat::Fragment[swift-proxy-tempurl]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/29_swift-proxy-tempurl]: Scheduling refresh of Exec[concat_/etc/swift/proxy-server.conf] Notice: /Stage[main]/Swift::Proxy::Formpost/Concat::Fragment[swift-proxy-formpost]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/31_swift-proxy-formpost]/ensure: defined content as '{md5}6907293ed6375b05de487bb7e0556ddd' Info: /Stage[main]/Swift::Proxy::Formpost/Concat::Fragment[swift-proxy-formpost]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/31_swift-proxy-formpost]: Scheduling refresh of Exec[concat_/etc/swift/proxy-server.conf] Notice: /Stage[main]/Swift::Proxy::Keystone/Concat::Fragment[swift_keystone]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/79_swift_keystone]/ensure: defined content as '{md5}1cf1118a35e6b76ab6ee194eb0722f53' Info: /Stage[main]/Swift::Proxy::Keystone/Concat::Fragment[swift_keystone]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/79_swift_keystone]: Scheduling refresh of Exec[concat_/etc/swift/proxy-server.conf] Notice: 
/Stage[main]/Swift::Proxy::Ratelimit/Concat::Fragment[swift_ratelimit]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/26_swift_ratelimit]/ensure: defined content as '{md5}2421e61cdf9eb2689fd5f1cc3740eb08' Info: /Stage[main]/Swift::Proxy::Ratelimit/Concat::Fragment[swift_ratelimit]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/26_swift_ratelimit]: Scheduling refresh of Exec[concat_/etc/swift/proxy-server.conf] Notice: /Stage[main]/Swift::Proxy::Staticweb/Concat::Fragment[swift-proxy-staticweb]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/32_swift-proxy-staticweb]/ensure: defined content as '{md5}3e8e5d2820dc79360e8f1e07541ef8dc' Info: /Stage[main]/Swift::Proxy::Staticweb/Concat::Fragment[swift-proxy-staticweb]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/32_swift-proxy-staticweb]: Scheduling refresh of Exec[concat_/etc/swift/proxy-server.conf] Notice: /Stage[main]/Swift::Proxy::Authtoken/File[/var/cache/swift]/group: group changed 'root' to 'swift' Notice: /Stage[main]/Swift::Proxy::Authtoken/File[/var/cache/swift]/mode: mode changed '0755' to '0700' Notice: /Stage[main]/Swift::Proxy::Authtoken/Concat::Fragment[swift_authtoken]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/22_swift_authtoken]/ensure: defined content as '{md5}f056388ce12c47fdd707acf18f5a14db' Info: /Stage[main]/Swift::Proxy::Authtoken/Concat::Fragment[swift_authtoken]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/22_swift_authtoken]: Scheduling refresh of Exec[concat_/etc/swift/proxy-server.conf] Notice: /Stage[main]/Swift::Proxy::Proxy_logging/Concat::Fragment[swift_proxy-logging]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/27_swift_proxy-logging]/ensure: defined content as '{md5}a7f5751de4957dadfee13dc6e6c83c1a' Info: /Stage[main]/Swift::Proxy::Proxy_logging/Concat::Fragment[swift_proxy-logging]/File[/var/lib/puppet/concat/_etc_swift_proxy-server.conf/fragments/27_swift_proxy-logging]: Scheduling refresh of Exec[concat_/etc/swift/proxy-server.conf] Notice: /Stage[main]/Swift::Proxy/Concat[/etc/swift/proxy-server.conf]/Exec[concat_/etc/swift/proxy-server.conf]/returns: executed successfully Notice: /Stage[main]/Swift::Proxy/Concat[/etc/swift/proxy-server.conf]/Exec[concat_/etc/swift/proxy-server.conf]: Triggered 'refresh' from 16 events Notice: /Stage[main]/Swift::Proxy/Concat[/etc/swift/proxy-server.conf]/File[/etc/swift/proxy-server.conf]/content:
--- /etc/swift/proxy-server.conf	2016-05-07 16:57:43.000000000 +0100
+++ /tmp/puppet-file20160520-26469-1jhbxzd	2016-05-20 12:30:10.292337276 +0100
@@ -1,16 +1,57 @@
+# This file is managed by puppet. Do not edit
+#
 [DEFAULT]
 bind_port = 8080
-workers = 8
+
+bind_ip = 127.0.0.1
+
+workers = 2
 user = swift
+log_name = proxy-server
+log_facility = LOG_LOCAL1
+log_level = INFO
+log_headers = False
+log_address = /dev/log
+
+
 [pipeline:main]
-pipeline = healthcheck cache authtoken keystone proxy-logging proxy-server
+pipeline = catch_errors healthcheck cache tempurl ratelimit authtoken keystone formpost staticweb container_quotas account_quotas proxy-logging proxy-server
 [app:proxy-server]
 use = egg:swift#proxy
+set log_name = proxy-server
+set log_facility = LOG_LOCAL1
+set log_level = INFO
+set log_address = /dev/log
+log_handoffs = true
 allow_account_management = true
 account_autocreate = true
+
+
+
+
+[filter:tempauth]
+use = egg:swift#tempauth
+
+user_admin_admin = admin .admin .reseller_admin
+
+
+[filter:authtoken]
+log_name = swift
+signing_dir = /var/cache/swift
+paste.filter_factory = keystonemiddleware.auth_token:filter_factory
+
+auth_uri = https://127.0.0.1:5000/v2.0
+identity_uri = https://127.0.0.1:35357/
+admin_tenant_name = services
+admin_user = swift
+admin_password = a_big_secret
+delay_auth_decision = 1
+cache = swift.cache
+include_service_catalog = False
+
 [filter:cache]
 use = egg:swift#memcache
 memcache_servers = 127.0.0.1:11211
@@ -21,21 +62,34 @@
 [filter:healthcheck]
 use = egg:swift#healthcheck
+[filter:ratelimit]
+use = egg:swift#ratelimit
+clock_accuracy = 1000
+max_sleep_time_seconds = 60
+log_sleep_time_seconds = 0
+rate_buffer_seconds = 5
+account_ratelimit = 0
+
 [filter:proxy-logging]
 use = egg:swift#proxy_logging
+[filter:tempurl]
+use = egg:swift#tempurl
+
+[filter:formpost]
+use = egg:swift#formpost
+
+[filter:staticweb]
+use = egg:swift#staticweb
+
 [filter:keystone]
 use = egg:swift#keystoneauth
-operator_roles = admin, SwiftOperator
+operator_roles = Member, admin, SwiftOperator
 is_admin = true
-cache = swift.cache
+reseller_prefix = AUTH_
-[filter:authtoken]
-paste.filter_factory = keystonemiddleware.auth_token:filter_factory
-admin_tenant_name = %SERVICE_TENANT_NAME%
-admin_user = %SERVICE_USER%
-admin_password = %SERVICE_PASSWORD%
-auth_host = 127.0.0.1
-auth_port = 35357
-auth_protocol = http
-signing_dir = /tmp/keystone-signing-swift
+[filter:account_quotas]
+use = egg:swift#account_quotas
+
+[filter:container_quotas]
+use = egg:swift#container_quotas
Info: /Stage[main]/Swift::Proxy/Concat[/etc/swift/proxy-server.conf]/File[/etc/swift/proxy-server.conf]: Filebucketed /etc/swift/proxy-server.conf to puppet with sum cd347a2631d48647d000f5d34985704c Notice: /Stage[main]/Swift::Proxy/Concat[/etc/swift/proxy-server.conf]/File[/etc/swift/proxy-server.conf]/content: content changed '{md5}cd347a2631d48647d000f5d34985704c' to '{md5}d6844dcb64e004f7b06f1e9ac75a5a56' Notice: /Stage[main]/Swift::Proxy/Concat[/etc/swift/proxy-server.conf]/File[/etc/swift/proxy-server.conf]/owner: owner changed 'root' to 'swift' Notice: /Stage[main]/Swift::Proxy/Concat[/etc/swift/proxy-server.conf]/File[/etc/swift/proxy-server.conf]/mode: mode changed '0640' to '0644' Info: Concat[/etc/swift/proxy-server.conf]: Scheduling refresh of Swift::Service[swift-proxy-server] Info: Concat[/etc/swift/proxy-server.conf]: Scheduling refresh of Service[swift-proxy-server] Info: Swift::Service[swift-proxy-server]: Scheduling refresh of Service[swift-proxy-server] Notice: /Stage[main]/Swift::Proxy/Swift::Service[swift-proxy-server]/Service[swift-proxy-server]/ensure: ensure changed 'stopped' to 'running'
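The numbered fragment paths above (0_keystone_wsgi_admin-apache-header, 22_swift_authtoken, 26_swift_ratelimit, 32_swift-proxy-staticweb, and so on) are the puppetlabs-concat pattern that dominates this apply: each fragment is written under /var/lib/puppet/concat/<target>/fragments/ with its order key as a filename prefix, and Exec[concat_<target>] stitches the fragments together in lexical order before the File resource installs the result, which is how the proxy middleware sections land in the sequence the pipeline expects. A minimal sketch of the same pattern, using a hypothetical target and fragment names rather than puppet-swift's actual resources:

# Assemble a config file from ordered fragments (puppetlabs-concat).
concat { '/etc/demo/proxy-server.conf':
  owner => 'swift',
  group => 'swift',
  mode  => '0644',
}

# order => '00' sorts this fragment to the top of the file.
concat::fragment { 'demo_header':
  target  => '/etc/demo/proxy-server.conf',
  content => "[DEFAULT]\nbind_port = 8080\n",
  order   => '00',
}

# A middleware section placed later in the file by its order key.
concat::fragment { 'demo_healthcheck':
  target  => '/etc/demo/proxy-server.conf',
  content => "[filter:healthcheck]\nuse = egg:swift#healthcheck\n",
  order   => '25',
}

Info: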
/Stage[main]/Swift::Proxy/Swift::Service[swift-proxy-server]/Service[swift-proxy-server]: Unscheduling refresh on Service[swift-proxy-server] Notice: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_userid]/ensure: created Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_userid]: Scheduling refresh of Service[neutron-server] Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_userid]: Scheduling refresh of Exec[neutron-db-sync] Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_userid]: Scheduling refresh of Service[neutron-metadata] Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_userid]: Scheduling refresh of Service[neutron-lbaas-service] Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_userid]: Scheduling refresh of Service[neutron-l3] Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_userid]: Scheduling refresh of Service[neutron-dhcp-service] Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_userid]: Scheduling refresh of Service[neutron-metering-service] Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat[/etc/swift/account-server.conf]/Exec[concat_/etc/swift/account-server.conf]/returns: executed successfully Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat[/etc/swift/account-server.conf]/Exec[concat_/etc/swift/account-server.conf]: Triggered 'refresh' from 5 events Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat[/etc/swift/account-server.conf]/File[/etc/swift/account-server.conf]/content:
--- /etc/swift/account-server.conf	2016-05-07 16:57:43.000000000 +0100
+++ /tmp/puppet-file20160520-26469-ic4cnw	2016-05-20 12:30:10.638324149 +0100
@@ -1,21 +1,39 @@
 [DEFAULT]
-
-# Make sure your swift-ring-builder arguments match the bind_ip and bind_port.
-# You almost certainly do not want to listen just on loopback unless testing.
-# However, you want to keep port 6202 if SElinux is enabled.
+devices = /srv/node
 bind_ip = 127.0.0.1
-bind_port = 6202
+bind_port = 6002
+mount_check = false
+user = swift
+workers = 1
+log_name = account-server
+log_facility = LOG_LOCAL2
+log_level = INFO
+log_address = /dev/log
+
-workers = 2
 [pipeline:main]
 pipeline = account-server
 [app:account-server]
 use = egg:swift#account
+set log_name = account-server
+set log_facility = LOG_LOCAL2
+set log_level = INFO
+set log_requests = true
+set log_address = /dev/log
 [account-replicator]
+concurrency = 8
 [account-auditor]
 [account-reaper]
+concurrency = 8
+
+[filter:healthcheck]
+use = egg:swift#healthcheck
+
+[filter:recon]
+use = egg:swift#recon
+recon_cache_path = /var/cache/swift
Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat[/etc/swift/account-server.conf]/File[/etc/swift/account-server.conf]: Filebucketed /etc/swift/account-server.conf to puppet with sum 07e5a1a1e5a0ab83d745e20680eb32c1 Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat[/etc/swift/account-server.conf]/File[/etc/swift/account-server.conf]/content: content changed '{md5}07e5a1a1e5a0ab83d745e20680eb32c1' to '{md5}b09bb7b7833b29c19014f8963d0e6884' Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat[/etc/swift/account-server.conf]/File[/etc/swift/account-server.conf]/owner: owner changed 'root' to 'swift' Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat[/etc/swift/account-server.conf]/File[/etc/swift/account-server.conf]/mode: mode changed '0640' to '0644' Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat[/etc/swift/account-server.conf]/File[/etc/swift/account-server.conf]: Scheduling refresh of Service[swift-account-reaper] Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat[/etc/swift/account-server.conf]/File[/etc/swift/account-server.conf]: Scheduling refresh of Swift::Service[swift-account-reaper] Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat[/etc/swift/account-server.conf]/File[/etc/swift/account-server.conf]: Scheduling refresh of Service[swift-account-reaper] Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat[/etc/swift/account-server.conf]/File[/etc/swift/account-server.conf]: Scheduling refresh of Swift::Service[swift-account-reaper] Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat[/etc/swift/account-server.conf]/File[/etc/swift/account-server.conf]: Scheduling refresh of Service[swift-account-reaper] Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Concat[/etc/swift/account-server.conf]/File[/etc/swift/account-server.conf]: Scheduling refresh of Swift::Service[swift-account-reaper] Info: Swift::Service[swift-account-reaper]: Scheduling refresh of Service[swift-account-reaper] Notice: /Stage[main]/Swift::Storage::Account/Swift::Service[swift-account-reaper]/Service[swift-account-reaper]/ensure: ensure changed 'stopped' to 'running' Info: /Stage[main]/Swift::Storage::Account/Swift::Service[swift-account-reaper]/Service[swift-account-reaper]: Unscheduling refresh on Service[swift-account-reaper] Info: Concat[/etc/swift/account-server.conf]: Scheduling refresh of Service[swift-account-server] Info: Concat[/etc/swift/account-server.conf]: Scheduling refresh of Service[swift-account-replicator] Info: Concat[/etc/swift/account-server.conf]: Scheduling refresh of Service[swift-account-auditor]
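Swift::Storage::Server[6002] in the resource paths above is puppet-swift's defined type for a single storage daemon; its title is the bind port, and the rendered template is exactly what the diff shows. Roughly, the scenario manifest declares one such server per service; in this sketch the parameter names are assumptions inferred from the resource titles and file paths in this log, not checked against the module:

# Hypothetical declaration of the account server on port 6002;
# 'type', 'devices' and 'config_file_path' are assumed parameter names.
swift::storage::server { '6002':
  type             => 'account',
  devices          => '/srv/node',
  config_file_path => 'account-server.conf',
}

Notice: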
/Stage[main]/Swift::Storage::Account/Swift::Storage::Generic[account]/Swift::Service[swift-account-replicator]/Service[swift-account-replicator]/ensure: ensure changed 'stopped' to 'running' Info: /Stage[main]/Swift::Storage::Account/Swift::Storage::Generic[account]/Swift::Service[swift-account-replicator]/Service[swift-account-replicator]: Unscheduling refresh on Service[swift-account-replicator] Notice: /Stage[main]/Swift::Storage::Account/Swift::Storage::Generic[account]/Swift::Service[swift-account-auditor]/Service[swift-account-auditor]/ensure: ensure changed 'stopped' to 'running' Info: /Stage[main]/Swift::Storage::Account/Swift::Storage::Generic[account]/Swift::Service[swift-account-auditor]/Service[swift-account-auditor]: Unscheduling refresh on Service[swift-account-auditor] Notice: /Stage[main]/Swift::Storage::Account/Swift::Storage::Generic[account]/Swift::Service[swift-account-server]/Service[swift-account-server]/ensure: ensure changed 'stopped' to 'running' Info: /Stage[main]/Swift::Storage::Account/Swift::Storage::Generic[account]/Swift::Service[swift-account-server]/Service[swift-account-server]: Unscheduling refresh on Service[swift-account-server] Notice: /Stage[main]/Apache/Apache::Vhost[default]/Concat::Fragment[default-logging]/File[/var/lib/puppet/concat/15-default.conf/fragments/80_default-logging]/ensure: defined content as '{md5}f202203ce2fe5d885160be988ff36151' Info: /Stage[main]/Apache/Apache::Vhost[default]/Concat::Fragment[default-logging]/File[/var/lib/puppet/concat/15-default.conf/fragments/80_default-logging]: Scheduling refresh of Exec[concat_15-default.conf] Notice: /Stage[main]/Apache/Apache::Vhost[default]/Concat[15-default.conf]/Exec[concat_15-default.conf]/returns: executed successfully Notice: /Stage[main]/Apache/Apache::Vhost[default]/Concat[15-default.conf]/Exec[concat_15-default.conf]: Triggered 'refresh' from 10 events Notice: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat::Fragment[nova_api_wsgi-wsgi]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments/260_nova_api_wsgi-wsgi]/ensure: defined content as '{md5}d8fcfbd8a3ec337955722d8a7c10844a' Info: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat::Fragment[nova_api_wsgi-wsgi]/File[/var/lib/puppet/concat/10-nova_api_wsgi.conf/fragments/260_nova_api_wsgi-wsgi]: Scheduling refresh of Exec[concat_10-nova_api_wsgi.conf] Notice: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat[10-nova_api_wsgi.conf]/Exec[concat_10-nova_api_wsgi.conf]/returns: executed successfully Notice: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat[10-nova_api_wsgi.conf]/Exec[concat_10-nova_api_wsgi.conf]: Triggered 'refresh' from 11 events Notice: /Stage[main]/Openstack_integration::Swift/Swift::Storage::Filter::Healthcheck[container]/Concat::Fragment[swift_healthcheck_container]/File[/var/lib/puppet/concat/_etc_swift_container-server.conf/fragments/25_swift_healthcheck_container]/ensure: defined content as '{md5}8c92056c41082619d179f88ea15c5fc6' Info: /Stage[main]/Openstack_integration::Swift/Swift::Storage::Filter::Healthcheck[container]/Concat::Fragment[swift_healthcheck_container]/File[/var/lib/puppet/concat/_etc_swift_container-server.conf/fragments/25_swift_healthcheck_container]: Scheduling refresh of Exec[concat_/etc/swift/container-server.conf] Notice: 
/Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat[/etc/swift/container-server.conf]/Exec[concat_/etc/swift/container-server.conf]/returns: executed successfully Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat[/etc/swift/container-server.conf]/Exec[concat_/etc/swift/container-server.conf]: Triggered 'refresh' from 5 events Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat[/etc/swift/container-server.conf]/File[/etc/swift/container-server.conf]/content:
--- /etc/swift/container-server.conf	2016-05-07 16:57:43.000000000 +0100
+++ /tmp/puppet-file20160520-26469-1khud84	2016-05-20 12:30:11.635286324 +0100
@@ -1,23 +1,43 @@
 [DEFAULT]
-
-# Make sure your swift-ring-builder arguments match the bind_ip and bind_port.
-# You almost certainly do not want to listen just on loopback unless testing.
-# However, you want to keep port 6201 if SElinux is enabled.
+devices = /srv/node
 bind_ip = 127.0.0.1
-bind_port = 6201
+bind_port = 6001
+mount_check = false
+user = swift
+log_name = container-server
+log_facility = LOG_LOCAL2
+log_level = INFO
+log_address = /dev/log
+
-workers = 2
+workers = 1
+allowed_sync_hosts = 127.0.0.1
 [pipeline:main]
 pipeline = container-server
 [app:container-server]
+allow_versions = false
 use = egg:swift#container
+set log_name = container-server
+set log_facility = LOG_LOCAL2
+set log_level = INFO
+set log_requests = true
+set log_address = /dev/log
 [container-replicator]
+concurrency = 8
 [container-updater]
+concurrency = 8
 [container-auditor]
 [container-sync]
+
+[filter:healthcheck]
+use = egg:swift#healthcheck
+
+[filter:recon]
+use = egg:swift#recon
+recon_cache_path = /var/cache/swift
Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat[/etc/swift/container-server.conf]/File[/etc/swift/container-server.conf]: Filebucketed /etc/swift/container-server.conf to puppet with sum 4998257eb89ff63e838b37686ebb1ee7 Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat[/etc/swift/container-server.conf]/File[/etc/swift/container-server.conf]/content: content changed '{md5}4998257eb89ff63e838b37686ebb1ee7' to '{md5}21c2517e90b3e9698ae546bfbf8e210f' Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat[/etc/swift/container-server.conf]/File[/etc/swift/container-server.conf]/owner: owner changed 'root' to 'swift' Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat[/etc/swift/container-server.conf]/File[/etc/swift/container-server.conf]/mode: mode changed '0640' to '0644' Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat[/etc/swift/container-server.conf]/File[/etc/swift/container-server.conf]: Scheduling refresh of Service[swift-container-updater] Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat[/etc/swift/container-server.conf]/File[/etc/swift/container-server.conf]: Scheduling refresh of Swift::Service[swift-container-updater] Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat[/etc/swift/container-server.conf]/File[/etc/swift/container-server.conf]: Scheduling refresh of Service[swift-container-updater] Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat[/etc/swift/container-server.conf]/File[/etc/swift/container-server.conf]: Scheduling refresh of Swift::Service[swift-container-updater]
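The comment deleted from each of these templates is still the operative constraint: the rings must be built against the same bind_ip and bind_port each server actually uses, which is why this job moves the ports from the SELinux-friendly 62xx defaults to 6002/6001/6000 on 127.0.0.1. With puppet-swift that mapping is usually declared through the ring device types; a sketch under the assumption of a single device named 1 per server (the device strings are illustrative, not taken from this job's manifests):

# Each ring entry must match the ip:port the corresponding server binds to.
ring_account_device { '127.0.0.1:6002/1':
  zone   => 1,
  weight => 1,
}

ring_container_device { '127.0.0.1:6001/1':
  zone   => 1,
  weight => 1,
}

ring_object_device { '127.0.0.1:6000/1':
  zone   => 1,
  weight => 1,
}

Info: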
/Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat[/etc/swift/container-server.conf]/File[/etc/swift/container-server.conf]: Scheduling refresh of Service[swift-container-updater] Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6001]/Concat[/etc/swift/container-server.conf]/File[/etc/swift/container-server.conf]: Scheduling refresh of Swift::Service[swift-container-updater] Info: Concat[/etc/swift/container-server.conf]: Scheduling refresh of Service[swift-container-server] Info: Concat[/etc/swift/container-server.conf]: Scheduling refresh of Service[swift-container-replicator] Info: Concat[/etc/swift/container-server.conf]: Scheduling refresh of Service[swift-container-auditor] Notice: /Stage[main]/Swift::Storage::Container/Swift::Storage::Generic[container]/Swift::Service[swift-container-replicator]/Service[swift-container-replicator]/ensure: ensure changed 'stopped' to 'running' Info: /Stage[main]/Swift::Storage::Container/Swift::Storage::Generic[container]/Swift::Service[swift-container-replicator]/Service[swift-container-replicator]: Unscheduling refresh on Service[swift-container-replicator] Notice: /Stage[main]/Swift::Storage::Container/Swift::Storage::Generic[container]/Swift::Service[swift-container-auditor]/Service[swift-container-auditor]/ensure: ensure changed 'stopped' to 'running' Info: /Stage[main]/Swift::Storage::Container/Swift::Storage::Generic[container]/Swift::Service[swift-container-auditor]/Service[swift-container-auditor]: Unscheduling refresh on Service[swift-container-auditor] Notice: /Stage[main]/Swift::Storage::Container/Swift::Storage::Generic[container]/Swift::Service[swift-container-server]/Service[swift-container-server]/ensure: ensure changed 'stopped' to 'running' Info: /Stage[main]/Swift::Storage::Container/Swift::Storage::Generic[container]/Swift::Service[swift-container-server]/Service[swift-container-server]: Unscheduling refresh on Service[swift-container-server] Info: Swift::Service[swift-container-updater]: Scheduling refresh of Service[swift-container-updater] Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Rsync::Server::Module[account]/Concat::Fragment[frag-account]/File[/var/lib/puppet/concat/_etc_rsync.conf/fragments/10_account_frag-account]/ensure: defined content as '{md5}c1253249b9f960b4c5ab27bffc4c0382' Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6002]/Rsync::Server::Module[account]/Concat::Fragment[frag-account]/File[/var/lib/puppet/concat/_etc_rsync.conf/fragments/10_account_frag-account]: Scheduling refresh of Exec[concat_/etc/rsync.conf] Notice: /Stage[main]/Rsync::Server/Concat[/etc/rsync.conf]/Exec[concat_/etc/rsync.conf]/returns: executed successfully Notice: /Stage[main]/Rsync::Server/Concat[/etc/rsync.conf]/Exec[concat_/etc/rsync.conf]: Triggered 'refresh' from 6 events Notice: /Stage[main]/Rsync::Server/Concat[/etc/rsync.conf]/File[/etc/rsync.conf]/ensure: defined content as '{md5}4b60030f2dab5c450c9d32e3fa3705c2' Notice: /Stage[main]/Nova::Api/Nova::Generic_service[api]/Package[nova-api]/ensure: created Info: /Stage[main]/Nova::Api/Nova::Generic_service[api]/Package[nova-api]: Scheduling refresh of Anchor[keystone::service::end] Info: /Stage[main]/Nova::Api/Nova::Generic_service[api]/Package[nova-api]: Scheduling refresh of Anchor[nova::install::end] Notice: /Stage[main]/Nova::Deps/Anchor[nova::install::end]: Triggered 'refresh' from 7 events Info: /Stage[main]/Nova::Deps/Anchor[nova::install::end]: Scheduling refresh of Anchor[nova::service::begin] Info: 
/Stage[main]/Nova::Deps/Anchor[nova::install::end]: Scheduling refresh of Exec[nova-db-sync] Info: /Stage[main]/Nova::Deps/Anchor[nova::install::end]: Scheduling refresh of Exec[nova-db-sync-api] Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/project_name]/ensure: created Info: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/project_name]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Migration::Libvirt/File_line[/etc/libvirt/libvirtd.conf auth_tcp]/ensure: created Info: /Stage[main]/Nova::Migration::Libvirt/File_line[/etc/libvirt/libvirtd.conf auth_tcp]: Scheduling refresh of Service[libvirt] Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/force_raw_images]/ensure: created Info: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/force_raw_images]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Logging/Nova_config[DEFAULT/debug]/ensure: created Info: /Stage[main]/Nova::Logging/Nova_config[DEFAULT/debug]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Api/Nova_config[neutron/service_metadata_proxy]/ensure: created Info: /Stage[main]/Nova::Api/Nova_config[neutron/service_metadata_proxy]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/use_neutron]/ensure: created Info: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/use_neutron]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Logging/Nova_config[DEFAULT/log_dir]/ensure: created Info: /Stage[main]/Nova::Logging/Nova_config[DEFAULT/log_dir]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/extension_sync_interval]/ensure: created Info: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/extension_sync_interval]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/password]/ensure: created Info: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/password]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/scheduler_use_baremetal_filters]/ensure: created Info: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/scheduler_use_baremetal_filters]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova/Nova_config[DEFAULT/notify_api_faults]/ensure: created Info: /Stage[main]/Nova/Nova_config[DEFAULT/notify_api_faults]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Db/Nova_config[api_database/connection]/ensure: created Info: /Stage[main]/Nova::Db/Nova_config[api_database/connection]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Api/Nova_config[DEFAULT/osapi_compute_listen]/ensure: created Info: /Stage[main]/Nova::Api/Nova_config[DEFAULT/osapi_compute_listen]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/rabbit_userid]/ensure: created Info: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/rabbit_userid]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Conductor/Nova_config[conductor/use_local]/ensure: created Info: /Stage[main]/Nova::Conductor/Nova_config[conductor/use_local]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Api/Nova_config[osapi_v3/enabled]/ensure: created Info: /Stage[main]/Nova::Api/Nova_config[osapi_v3/enabled]: Scheduling refresh 
of Anchor[nova::config::end] Notice: /Stage[main]/Nova/Nova_config[DEFAULT/notification_driver]/ensure: created Info: /Stage[main]/Nova/Nova_config[DEFAULT/notification_driver]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/max_io_ops_per_host]/ensure: created Info: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/max_io_ops_per_host]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Api/Nova_config[keystone_authtoken/admin_password]/ensure: created Info: /Stage[main]/Nova::Api/Nova_config[keystone_authtoken/admin_password]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/compute_manager]/ensure: created Info: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/compute_manager]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/rabbit_password]/ensure: created Info: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/rabbit_password]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Api/Nova_config[keystone_authtoken/admin_user]/ensure: created Info: /Stage[main]/Nova::Api/Nova_config[keystone_authtoken/admin_user]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_partition]/ensure: created Info: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_partition]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Vncproxy/Nova_config[DEFAULT/novncproxy_host]/ensure: created Info: /Stage[main]/Nova::Vncproxy/Nova_config[DEFAULT/novncproxy_host]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova/Nova_config[DEFAULT/state_path]/ensure: created Info: /Stage[main]/Nova/Nova_config[DEFAULT/state_path]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_password]/ensure: created Info: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_password]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova/Nova_config[DEFAULT/report_interval]/ensure: created Info: /Stage[main]/Nova/Nova_config[DEFAULT/report_interval]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Logging/Nova_config[DEFAULT/verbose]/ensure: created Info: /Stage[main]/Nova::Logging/Nova_config[DEFAULT/verbose]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/scheduler_weight_classes]/ensure: created Info: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/scheduler_weight_classes]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/rabbit_ha_queues]/ensure: created Info: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/rabbit_ha_queues]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/auth_plugin]/ensure: created Info: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/auth_plugin]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/security_group_api]/ensure: created Info: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/security_group_api]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Api/Nova_config[DEFAULT/enabled_apis]/ensure: created Info: 
/Stage[main]/Nova::Api/Nova_config[DEFAULT/enabled_apis]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/auth_url]/ensure: created Info: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/auth_url]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/user_domain_name]/ensure: created Info: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/user_domain_name]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Api/Nova_config[DEFAULT/osapi_compute_listen_port]/ensure: created Info: /Stage[main]/Nova::Api/Nova_config[DEFAULT/osapi_compute_listen_port]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/rabbit_virtual_host]/ensure: created Info: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/rabbit_virtual_host]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova/Nova_config[glance/api_servers]/ensure: created Info: /Stage[main]/Nova/Nova_config[glance/api_servers]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/rabbit_port]/ensure: created Info: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/rabbit_port]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova/Nova_config[DEFAULT/image_service]/ensure: created Info: /Stage[main]/Nova/Nova_config[DEFAULT/image_service]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Api/Nova_config[DEFAULT/volume_api_class]/ensure: created Info: /Stage[main]/Nova::Api/Nova_config[DEFAULT/volume_api_class]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/heal_instance_info_cache_interval]/ensure: created Info: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/heal_instance_info_cache_interval]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova/Nova_config[DEFAULT/notify_on_state_change]/ensure: created Info: /Stage[main]/Nova/Nova_config[DEFAULT/notify_on_state_change]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/firewall_driver]/ensure: created Info: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/firewall_driver]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/instance_usage_audit]/ensure: created Info: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/instance_usage_audit]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/scheduler_host_subset_size]/ensure: created Info: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/scheduler_host_subset_size]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Vncproxy/Nova_config[DEFAULT/novncproxy_port]/ensure: created Info: /Stage[main]/Nova::Vncproxy/Nova_config[DEFAULT/novncproxy_port]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova/Nova_config[cinder/catalog_info]/ensure: created Info: /Stage[main]/Nova/Nova_config[cinder/catalog_info]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/kombu_reconnect_delay]/ensure: created Info: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/kombu_reconnect_delay]: Scheduling refresh of Anchor[nova::config::end] Notice: 
/Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/ram_allocation_ratio]/ensure: created Info: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/ram_allocation_ratio]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Db/Nova_config[database/connection]/ensure: created Info: /Stage[main]/Nova::Db/Nova_config[database/connection]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/cpu_mode]/ensure: created Info: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/cpu_mode]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/rabbit_host]/ensure: created Info: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/rabbit_host]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Api/Nova_config[DEFAULT/default_floating_pool]/ensure: created Info: /Stage[main]/Nova::Api/Nova_config[DEFAULT/default_floating_pool]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/timeout]/ensure: created Info: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/timeout]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/reserved_host_memory_mb]/ensure: created Info: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/reserved_host_memory_mb]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Scheduler/Nova_config[DEFAULT/scheduler_driver]/ensure: created Info: /Stage[main]/Nova::Scheduler/Nova_config[DEFAULT/scheduler_driver]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/amqp_durable_queues]/ensure: created Info: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/amqp_durable_queues]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/disk_allocation_ratio]/ensure: created Info: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/disk_allocation_ratio]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/heartbeat_rate]/ensure: created Info: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/heartbeat_rate]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Compute/Nova_config[vnc/vncserver_proxyclient_address]/ensure: created Info: /Stage[main]/Nova::Compute/Nova_config[vnc/vncserver_proxyclient_address]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Api/Nova_config[DEFAULT/api_paste_config]/ensure: created Info: /Stage[main]/Nova::Api/Nova_config[DEFAULT/api_paste_config]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[vnc/vncserver_listen]/ensure: created Info: /Stage[main]/Nova::Compute::Libvirt/Nova_config[vnc/vncserver_listen]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]/ensure: created Info: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/project_domain_name]/ensure: created Info: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/project_domain_name]: Scheduling refresh of Anchor[nova::config::end] Notice: 
/Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/scheduler_max_attempts]/ensure: created Info: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/scheduler_max_attempts]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/cpu_allocation_ratio]/ensure: created Info: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/cpu_allocation_ratio]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/vif_plugging_is_fatal]/ensure: created Info: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/vif_plugging_is_fatal]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Api/Nova_config[DEFAULT/metadata_listen_port]/ensure: created Info: /Stage[main]/Nova::Api/Nova_config[DEFAULT/metadata_listen_port]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/virt_type]/ensure: created Info: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/virt_type]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/rabbit_use_ssl]/ensure: created Info: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/rabbit_use_ssl]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/max_instances_per_host]/ensure: created Info: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/max_instances_per_host]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/scheduler_available_filters]/ensure: created Info: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/scheduler_available_filters]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Compute/Nova_config[vnc/keymap]/ensure: created Info: /Stage[main]/Nova::Compute/Nova_config[vnc/keymap]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/url]/ensure: created Info: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/url]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova/Nova_config[DEFAULT/service_down_time]/ensure: created Info: /Stage[main]/Nova/Nova_config[DEFAULT/service_down_time]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Api/Nova_config[keystone_authtoken/admin_tenant_name]/ensure: created Info: /Stage[main]/Nova::Api/Nova_config[keystone_authtoken/admin_tenant_name]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Migration::Libvirt/File_line[/etc/libvirt/libvirtd.conf listen_tls]/ensure: created Info: /Stage[main]/Nova::Migration::Libvirt/File_line[/etc/libvirt/libvirtd.conf listen_tls]: Scheduling refresh of Service[libvirt] Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/ovs_bridge]/ensure: created Info: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/ovs_bridge]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/scheduler_host_manager]/ensure: created Info: /Stage[main]/Nova::Scheduler::Filter/Nova_config[DEFAULT/scheduler_host_manager]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova/Nova_config[DEFAULT/notification_topics]/ensure: created Info: /Stage[main]/Nova/Nova_config[DEFAULT/notification_topics]: Scheduling refresh of Anchor[nova::config::end] Notice: 
/Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/kombu_ssl_version]/ensure: created Info: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/kombu_ssl_version]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Api/Nova_config[DEFAULT/use_forwarded_for]/ensure: created Info: /Stage[main]/Nova::Api/Nova_config[DEFAULT/use_forwarded_for]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Migration::Libvirt/File_line[/etc/sysconfig/libvirtd libvirtd args]/ensure: created Info: /Stage[main]/Nova::Migration::Libvirt/File_line[/etc/sysconfig/libvirtd libvirtd args]: Scheduling refresh of Service[libvirt] Notice: /Stage[main]/Nova::Api/Nova_config[DEFAULT/osapi_volume_listen]/ensure: created Info: /Stage[main]/Nova::Api/Nova_config[DEFAULT/osapi_volume_listen]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_key]/ensure: created Info: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_key]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/instance_usage_audit_period]/ensure: created Info: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/instance_usage_audit_period]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Api/Nova_config[DEFAULT/metadata_listen]/ensure: created Info: /Stage[main]/Nova::Api/Nova_config[DEFAULT/metadata_listen]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova/Nova_config[DEFAULT/auth_strategy]/ensure: created Info: /Stage[main]/Nova/Nova_config[DEFAULT/auth_strategy]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/region_name]/ensure: created Info: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/region_name]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Compute/Nova_config[vnc/enabled]/ensure: created Info: /Stage[main]/Nova::Compute/Nova_config[vnc/enabled]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[DEFAULT/compute_driver]/ensure: created Info: /Stage[main]/Nova::Compute::Libvirt/Nova_config[DEFAULT/compute_driver]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Api/Nova_config[DEFAULT/osapi_compute_workers]/ensure: created Info: /Stage[main]/Nova::Api/Nova_config[DEFAULT/osapi_compute_workers]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Api/Nova_config[keystone_authtoken/identity_uri]/ensure: created Info: /Stage[main]/Nova::Api/Nova_config[keystone_authtoken/identity_uri]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova/Nova_config[DEFAULT/rootwrap_config]/ensure: created Info: /Stage[main]/Nova/Nova_config[DEFAULT/rootwrap_config]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/rabbit_hosts]/ensure: created Info: /Stage[main]/Nova/Nova_config[oslo_messaging_rabbit/rabbit_hosts]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova/Nova_config[DEFAULT/rpc_backend]/ensure: created Info: /Stage[main]/Nova/Nova_config[DEFAULT/rpc_backend]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/vif_plugging_timeout]/ensure: created Info: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/vif_plugging_timeout]: Scheduling refresh of 
Anchor[nova::config::end] Notice: /Stage[main]/Nova::Migration::Libvirt/File_line[/etc/libvirt/libvirtd.conf listen_tcp]/ensure: created Info: /Stage[main]/Nova::Migration::Libvirt/File_line[/etc/libvirt/libvirtd.conf listen_tcp]: Scheduling refresh of Service[libvirt] Notice: /Stage[main]/Nova::Compute::Libvirt/Service[libvirt]/ensure: ensure changed 'stopped' to 'running' Info: /Stage[main]/Nova::Compute::Libvirt/Service[libvirt]: Unscheduling refresh on Service[libvirt] Notice: /Stage[main]/Nova::Api/Nova_config[DEFAULT/metadata_workers]/ensure: created Info: /Stage[main]/Nova::Api/Nova_config[DEFAULT/metadata_workers]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/dhcp_domain]/ensure: created Info: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/dhcp_domain]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/allow_resize_to_same_host]/ensure: created Info: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/allow_resize_to_same_host]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova/Nova_config[oslo_concurrency/lock_path]/ensure: created Info: /Stage[main]/Nova/Nova_config[oslo_concurrency/lock_path]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Api/Nova_config[keystone_authtoken/auth_uri]/ensure: created Info: /Stage[main]/Nova::Api/Nova_config[keystone_authtoken/auth_uri]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Api/Nova_config[neutron/metadata_proxy_shared_secret]/ensure: created Info: /Stage[main]/Nova::Api/Nova_config[neutron/metadata_proxy_shared_secret]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/username]/ensure: created Info: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/username]: Scheduling refresh of Anchor[nova::config::end] Info: /etc/httpd/conf.d: Scheduling refresh of Class[Apache::Service] Notice: /Stage[main]/Nova::Wsgi::Apache/Openstacklib::Wsgi::Apache[nova_api_wsgi]/Apache::Vhost[nova_api_wsgi]/Concat[10-nova_api_wsgi.conf]/File[10-nova_api_wsgi.conf]/ensure: defined content as '{md5}a201c1c5ac33c244ff2071cfe9b38046' Notice: /Stage[main]/Apache/Apache::Vhost[default]/Concat[15-default.conf]/File[15-default.conf]/ensure: defined content as '{md5}a430bf4e003be964b419e7aea251c6c4' Info: Concat[10-nova_api_wsgi.conf]: Scheduling refresh of Class[Apache::Service] Info: Apache::Vhost[nova_api_wsgi]: Scheduling refresh of Anchor[keystone::config::end] Info: Concat[15-default.conf]: Scheduling refresh of Class[Apache::Service] Info: Apache::Vhost[default]: Scheduling refresh of Anchor[keystone::config::end] Notice: /Stage[main]/Nova::Api/Nova_config[DEFAULT/fping_path]/ensure: created Info: /Stage[main]/Nova::Api/Nova_config[DEFAULT/fping_path]: Scheduling refresh of Anchor[nova::config::end] Notice: /Stage[main]/Apache::Mod::Dav_fs/File[dav_fs.conf]/ensure: defined content as '{md5}899a57534f3d84efa81887ec93c90c9b' Info: /Stage[main]/Apache::Mod::Dav_fs/File[dav_fs.conf]: Scheduling refresh of Class[Apache::Service] Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat[/etc/swift/object-server.conf]/File[/var/lib/puppet/concat/_etc_swift_object-server.conf/fragments.concat]/ensure: created Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat[/etc/swift/object-server.conf]/Exec[concat_/etc/swift/object-server.conf]/returns: executed 
successfully
Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat[/etc/swift/object-server.conf]/Exec[concat_/etc/swift/object-server.conf]: Triggered 'refresh' from 5 events
Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat[/etc/swift/object-server.conf]/File[/etc/swift/object-server.conf]/content:
--- /etc/swift/object-server.conf 2016-05-07 16:57:43.000000000 +0100
+++ /tmp/puppet-file20160520-26469-14vpviw 2016-05-20 12:30:21.421915019 +0100
@@ -1,21 +1,39 @@
 [DEFAULT]
-
-# Make sure your swift-ring-builder arguments match the bind_ip and bind_port.
-# You almost certainly do not want to listen just on loopback unless testing.
-# However, you want to keep port 6200 if SElinux is enabled.
+devices = /srv/node
 bind_ip = 127.0.0.1
-bind_port = 6200
+bind_port = 6000
+mount_check = false
+user = swift
+log_name = object-server
+log_facility = LOG_LOCAL2
+log_level = INFO
+log_address = /dev/log
+
-workers = 3
+workers = 1

 [pipeline:main]
 pipeline = object-server

 [app:object-server]
 use = egg:swift#object
+set log_name = object-server
+set log_facility = LOG_LOCAL2
+set log_level = INFO
+set log_requests = true
+set log_address = /dev/log

 [object-replicator]
+concurrency = 8

 [object-updater]
+concurrency = 8

 [object-auditor]
+
+[filter:healthcheck]
+use = egg:swift#healthcheck
+
+[filter:recon]
+use = egg:swift#recon
+recon_cache_path = /var/cache/swift
Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat[/etc/swift/object-server.conf]/File[/etc/swift/object-server.conf]: Filebucketed /etc/swift/object-server.conf to puppet with sum 43f14d676b28bc8111d6100e06e9a8bf
Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat[/etc/swift/object-server.conf]/File[/etc/swift/object-server.conf]/content: content changed '{md5}43f14d676b28bc8111d6100e06e9a8bf' to '{md5}396c3ccb85387cbac0df92cdbad14646'
Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat[/etc/swift/object-server.conf]/File[/etc/swift/object-server.conf]/owner: owner changed 'root' to 'swift'
Notice: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat[/etc/swift/object-server.conf]/File[/etc/swift/object-server.conf]/mode: mode changed '0640' to '0644'
Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat[/etc/swift/object-server.conf]/File[/etc/swift/object-server.conf]: Scheduling refresh of Service[swift-object-updater]
Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat[/etc/swift/object-server.conf]/File[/etc/swift/object-server.conf]: Scheduling refresh of Swift::Service[swift-object-updater]
Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat[/etc/swift/object-server.conf]/File[/etc/swift/object-server.conf]: Scheduling refresh of Service[swift-object-updater]
Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat[/etc/swift/object-server.conf]/File[/etc/swift/object-server.conf]: Scheduling refresh of Swift::Service[swift-object-updater]
Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat[/etc/swift/object-server.conf]/File[/etc/swift/object-server.conf]: Scheduling refresh of Service[swift-object-updater]
Info: /Stage[main]/Swift::Storage::All/Swift::Storage::Server[6000]/Concat[/etc/swift/object-server.conf]/File[/etc/swift/object-server.conf]: Scheduling refresh of Swift::Service[swift-object-updater]
Info: Concat[/etc/swift/object-server.conf]: Scheduling refresh of
Service[swift-object-server] Info: Concat[/etc/swift/object-server.conf]: Scheduling refresh of Service[swift-object-replicator] Info: Concat[/etc/swift/object-server.conf]: Scheduling refresh of Service[swift-object-auditor] Notice: /Stage[main]/Swift::Storage::Object/Swift::Storage::Generic[object]/Swift::Service[swift-object-server]/Service[swift-object-server]/ensure: ensure changed 'stopped' to 'running' Info: /Stage[main]/Swift::Storage::Object/Swift::Storage::Generic[object]/Swift::Service[swift-object-server]/Service[swift-object-server]: Unscheduling refresh on Service[swift-object-server] Notice: /Stage[main]/Swift::Storage::Object/Swift::Storage::Generic[object]/Swift::Service[swift-object-replicator]/Service[swift-object-replicator]/ensure: ensure changed 'stopped' to 'running' Info: /Stage[main]/Swift::Storage::Object/Swift::Storage::Generic[object]/Swift::Service[swift-object-replicator]/Service[swift-object-replicator]: Unscheduling refresh on Service[swift-object-replicator] Notice: /Stage[main]/Swift::Storage::Object/Swift::Storage::Generic[object]/Swift::Service[swift-object-auditor]/Service[swift-object-auditor]/ensure: ensure changed 'stopped' to 'running' Info: /Stage[main]/Swift::Storage::Object/Swift::Storage::Generic[object]/Swift::Service[swift-object-auditor]/Service[swift-object-auditor]: Unscheduling refresh on Service[swift-object-auditor] Info: Swift::Service[swift-object-updater]: Scheduling refresh of Service[swift-object-updater] Notice: /Stage[main]/Swift::Storage::Object/Swift::Service[swift-object-updater]/Service[swift-object-updater]/ensure: ensure changed 'stopped' to 'running' Info: /Stage[main]/Swift::Storage::Object/Swift::Service[swift-object-updater]/Service[swift-object-updater]: Unscheduling refresh on Service[swift-object-updater] Notice: /Stage[main]/Cinder::Db::Sync/Exec[cinder-manage db_sync]: Triggered 'refresh' from 47 events Info: /Stage[main]/Cinder::Db::Sync/Exec[cinder-manage db_sync]: Scheduling refresh of Service[cinder-api] Info: /Stage[main]/Cinder::Db::Sync/Exec[cinder-manage db_sync]: Scheduling refresh of Service[cinder-scheduler] Info: /Stage[main]/Cinder::Db::Sync/Exec[cinder-manage db_sync]: Scheduling refresh of Service[cinder-scheduler] Info: /Stage[main]/Cinder::Db::Sync/Exec[cinder-manage db_sync]: Scheduling refresh of Service[cinder-volume] Info: /Stage[main]/Cinder::Db::Sync/Exec[cinder-manage db_sync]: Scheduling refresh of Service[cinder-volume] Notice: /Stage[main]/Cinder::Api/Service[cinder-api]/ensure: ensure changed 'stopped' to 'running' Info: /Stage[main]/Cinder::Api/Service[cinder-api]: Unscheduling refresh on Service[cinder-api] Notice: /Stage[main]/Cinder::Volume/Service[cinder-volume]/ensure: ensure changed 'stopped' to 'running' Info: /Stage[main]/Cinder::Volume/Service[cinder-volume]: Unscheduling refresh on Service[cinder-volume] Notice: /Stage[main]/Cinder::Scheduler/Service[cinder-scheduler]/ensure: ensure changed 'stopped' to 'running' Info: /Stage[main]/Cinder::Scheduler/Service[cinder-scheduler]: Unscheduling refresh on Service[cinder-scheduler] Notice: /Stage[main]/Keystone/Keystone_config[token/revoke_by_id]/ensure: created Info: /Stage[main]/Keystone/Keystone_config[token/revoke_by_id]: Scheduling refresh of Anchor[keystone::config::end] Notice: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat[10-ironic_wsgi.conf]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf]/ensure: created Info: 
/Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat[10-ironic_wsgi.conf]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf]: Scheduling refresh of Exec[concat_10-ironic_wsgi.conf] Notice: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat[10-ironic_wsgi.conf]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments]/ensure: created Info: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat[10-ironic_wsgi.conf]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments]: Scheduling refresh of Exec[concat_10-ironic_wsgi.conf] Notice: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat::Fragment[ironic_wsgi-access_log]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments/100_ironic_wsgi-access_log]/ensure: defined content as '{md5}f2a2c3f663fb69cb0f359c1ae7ad320c' Info: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat::Fragment[ironic_wsgi-access_log]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments/100_ironic_wsgi-access_log]: Scheduling refresh of Exec[concat_10-ironic_wsgi.conf] Notice: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat::Fragment[ironic_wsgi-directories]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments/60_ironic_wsgi-directories]/ensure: defined content as '{md5}29d0408a3b55a4415d880929f9a3ad46' Info: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat::Fragment[ironic_wsgi-directories]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments/60_ironic_wsgi-directories]: Scheduling refresh of Exec[concat_10-ironic_wsgi.conf] Notice: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat::Fragment[ironic_wsgi-ssl]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments/230_ironic_wsgi-ssl]/ensure: defined content as '{md5}d6cec447dc3b9d177de1da941662dde7' Info: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat::Fragment[ironic_wsgi-ssl]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments/230_ironic_wsgi-ssl]: Scheduling refresh of Exec[concat_10-ironic_wsgi.conf] Notice: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat::Fragment[ironic_wsgi-docroot]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments/10_ironic_wsgi-docroot]/ensure: defined content as '{md5}5cce1f4b838a61eb9353dc516b6f1912' Info: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat::Fragment[ironic_wsgi-docroot]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments/10_ironic_wsgi-docroot]: Scheduling refresh of Exec[concat_10-ironic_wsgi.conf] Notice: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat::Fragment[ironic_wsgi-wsgi]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments/260_ironic_wsgi-wsgi]/ensure: defined content as '{md5}ce69252b664facd16f8d6d002943bde9' Info: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat::Fragment[ironic_wsgi-wsgi]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments/260_ironic_wsgi-wsgi]: Scheduling 
refresh of Exec[concat_10-ironic_wsgi.conf] Notice: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat::Fragment[ironic_wsgi-apache-header]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments/0_ironic_wsgi-apache-header]/ensure: defined content as '{md5}eed662cc75f34394db84b64d61142357' Info: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat::Fragment[ironic_wsgi-apache-header]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments/0_ironic_wsgi-apache-header]: Scheduling refresh of Exec[concat_10-ironic_wsgi.conf] Notice: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat::Fragment[ironic_wsgi-logging]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments/80_ironic_wsgi-logging]/ensure: defined content as '{md5}228ae1c4025ea06df280b6c090746264' Info: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat::Fragment[ironic_wsgi-logging]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments/80_ironic_wsgi-logging]: Scheduling refresh of Exec[concat_10-ironic_wsgi.conf] Notice: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat::Fragment[ironic_wsgi-file_footer]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments/999_ironic_wsgi-file_footer]/ensure: defined content as '{md5}e27b2525783e590ca1820f1e2118285d' Info: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat::Fragment[ironic_wsgi-file_footer]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments/999_ironic_wsgi-file_footer]: Scheduling refresh of Exec[concat_10-ironic_wsgi.conf] Notice: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat[10-ironic_wsgi.conf]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments.concat]/ensure: created Notice: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat[10-ironic_wsgi.conf]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments.concat.out]/ensure: created Notice: /Stage[main]/Swift::Storage::Container/Swift::Service[swift-container-updater]/Service[swift-container-updater]/ensure: ensure changed 'stopped' to 'running' Info: /Stage[main]/Swift::Storage::Container/Swift::Service[swift-container-updater]/Service[swift-container-updater]: Unscheduling refresh on Service[swift-container-updater] Notice: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat::Fragment[ironic_wsgi-serversignature]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments/90_ironic_wsgi-serversignature]/ensure: defined content as '{md5}9bf5a458783ab459e5043e1cdf671fa7' Info: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat::Fragment[ironic_wsgi-serversignature]/File[/var/lib/puppet/concat/10-ironic_wsgi.conf/fragments/90_ironic_wsgi-serversignature]: Scheduling refresh of Exec[concat_10-ironic_wsgi.conf] Notice: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat[10-ironic_wsgi.conf]/Exec[concat_10-ironic_wsgi.conf]/returns: executed successfully Notice: 
/Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat[10-ironic_wsgi.conf]/Exec[concat_10-ironic_wsgi.conf]: Triggered 'refresh' from 11 events Notice: /Stage[main]/Ironic::Wsgi::Apache/Openstacklib::Wsgi::Apache[ironic_wsgi]/Apache::Vhost[ironic_wsgi]/Concat[10-ironic_wsgi.conf]/File[10-ironic_wsgi.conf]/ensure: defined content as '{md5}fd0438eae872c05b10e229854a6dd56d' Info: Concat[10-ironic_wsgi.conf]: Scheduling refresh of Class[Apache::Service] Info: Apache::Vhost[ironic_wsgi]: Scheduling refresh of Anchor[keystone::config::end] Notice: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/l3_ha]/ensure: created Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/l3_ha]: Scheduling refresh of Service[neutron-server] Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/l3_ha]: Scheduling refresh of Exec[neutron-db-sync] Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/l3_ha]: Scheduling refresh of Service[neutron-metadata] Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/l3_ha]: Scheduling refresh of Service[neutron-lbaas-service] Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/l3_ha]: Scheduling refresh of Service[neutron-l3] Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/l3_ha]: Scheduling refresh of Service[neutron-dhcp-service] Info: /Stage[main]/Neutron::Server/Neutron_config[DEFAULT/l3_ha]: Scheduling refresh of Service[neutron-metering-service] Notice: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_host]/ensure: created Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_host]: Scheduling refresh of Service[glance-api] Info: /Stage[main]/Glance::Notify::Rabbitmq/Glance_api_config[oslo_messaging_rabbit/rabbit_host]: Scheduling refresh of Exec[glance-manage db_sync] Notice: /Stage[main]/Openstack_integration::Ironic/Openstack_integration::Ssl_key[ironic]/File[/etc/ironic/ssl]/ensure: created Notice: /Stage[main]/Openstack_integration::Ironic/Openstack_integration::Ssl_key[ironic]/File[/etc/ironic/ssl/private]/ensure: created Notice: /Stage[main]/Openstack_integration::Ironic/Openstack_integration::Ssl_key[ironic]/File[/etc/ironic/ssl/private/n2.dusty.ci.centos.org.pem]/ensure: defined content as '{md5}a4b1e9413d7b63f33794716d77cfaa00' Info: Openstack_integration::Ssl_key[ironic]: Scheduling refresh of Service[httpd] Notice: /Stage[main]/Keystone::Logging/Keystone_config[DEFAULT/verbose]/ensure: created Info: /Stage[main]/Keystone::Logging/Keystone_config[DEFAULT/verbose]: Scheduling refresh of Anchor[keystone::config::end] Notice: /Stage[main]/Glance::Api/Glance_cache_config[DEFAULT/auth_url]/ensure: created Info: /Stage[main]/Glance::Api/Glance_cache_config[DEFAULT/auth_url]: Scheduling refresh of Service[glance-api] Info: /Stage[main]/Glance::Api/Glance_cache_config[DEFAULT/auth_url]: Scheduling refresh of Exec[glance-manage db_sync] Notice: /Stage[main]/Glance::Db::Sync/Exec[glance-manage db_sync]: Triggered 'refresh' from 83 events Info: /Stage[main]/Glance::Db::Sync/Exec[glance-manage db_sync]: Scheduling refresh of Service[glance-api] Info: /Stage[main]/Glance::Db::Sync/Exec[glance-manage db_sync]: Scheduling refresh of Service[glance-registry] Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat::Fragment[keystone_wsgi_main-directories]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments/60_keystone_wsgi_main-directories]/ensure: defined content as 
'{md5}cc81234a3bbf77f857ed3f11bb369e8c' Info: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat::Fragment[keystone_wsgi_main-directories]/File[/var/lib/puppet/concat/10-keystone_wsgi_main.conf/fragments/60_keystone_wsgi_main-directories]: Scheduling refresh of Exec[concat_10-keystone_wsgi_main.conf] Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat[10-keystone_wsgi_main.conf]/Exec[concat_10-keystone_wsgi_main.conf]/returns: executed successfully Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat[10-keystone_wsgi_main.conf]/Exec[concat_10-keystone_wsgi_main.conf]: Triggered 'refresh' from 11 events Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_main]/Concat[10-keystone_wsgi_main.conf]/File[10-keystone_wsgi_main.conf]/ensure: defined content as '{md5}fa0ea0cfef0ad72ddbeb9b6110bd2c86' Info: Concat[10-keystone_wsgi_main.conf]: Scheduling refresh of Class[Apache::Service] Info: Apache::Vhost[keystone_wsgi_main]: Scheduling refresh of Anchor[keystone::config::end] Notice: /Stage[main]/Nova::Deps/Anchor[nova::config::end]: Triggered 'refresh' from 103 events Info: /Stage[main]/Nova::Deps/Anchor[nova::config::end]: Scheduling refresh of Anchor[nova::service::begin] Info: /Stage[main]/Nova::Deps/Anchor[nova::config::end]: Scheduling refresh of Exec[nova-db-sync] Info: /Stage[main]/Nova::Deps/Anchor[nova::config::end]: Scheduling refresh of Exec[nova-db-sync-api] Notice: /Stage[main]/Nova::Db::Mysql/Openstacklib::Db::Mysql[nova]/Mysql_database[nova]/ensure: created Notice: /Stage[main]/Nova::Db::Mysql/Openstacklib::Db::Mysql[nova]/Openstacklib::Db::Mysql::Host_access[nova_127.0.0.1]/Mysql_user[nova@127.0.0.1]/ensure: created Notice: /Stage[main]/Nova::Db::Mysql/Openstacklib::Db::Mysql[nova]/Openstacklib::Db::Mysql::Host_access[nova_127.0.0.1]/Mysql_grant[nova@127.0.0.1/nova.*]/ensure: created Info: Class[Nova::Db::Mysql]: Scheduling refresh of Anchor[nova::db::end] Notice: /Stage[main]/Nova::Db::Mysql_api/Openstacklib::Db::Mysql[nova_api]/Mysql_database[nova_api]/ensure: created Notice: /Stage[main]/Nova::Db::Mysql_api/Openstacklib::Db::Mysql[nova_api]/Openstacklib::Db::Mysql::Host_access[nova_api_127.0.0.1]/Mysql_user[nova_api@127.0.0.1]/ensure: created Notice: /Stage[main]/Nova::Db::Mysql_api/Openstacklib::Db::Mysql[nova_api]/Openstacklib::Db::Mysql::Host_access[nova_api_127.0.0.1]/Mysql_grant[nova_api@127.0.0.1/nova_api.*]/ensure: created Info: Class[Nova::Db::Mysql_api]: Scheduling refresh of Anchor[nova::db::end] Notice: /Stage[main]/Nova::Deps/Anchor[nova::db::end]: Triggered 'refresh' from 2 events Info: /Stage[main]/Nova::Deps/Anchor[nova::db::end]: Scheduling refresh of Anchor[nova::dbsync::begin] Notice: /Stage[main]/Nova::Deps/Anchor[nova::dbsync::begin]: Triggered 'refresh' from 1 events Info: /Stage[main]/Nova::Deps/Anchor[nova::dbsync::begin]: Scheduling refresh of Exec[nova-db-sync] Notice: /Stage[main]/Nova::Db::Sync/Exec[nova-db-sync]: Triggered 'refresh' from 3 events Info: /Stage[main]/Nova::Db::Sync/Exec[nova-db-sync]: Scheduling refresh of Anchor[nova::dbsync::end] Notice: /Stage[main]/Nova::Deps/Anchor[nova::dbsync::end]: Triggered 'refresh' from 1 events Info: /Stage[main]/Nova::Deps/Anchor[nova::dbsync::end]: Scheduling refresh of Anchor[nova::dbsync_api::begin] Notice: /Stage[main]/Nova::Cron::Archive_deleted_rows/Cron[nova-manage db archive_deleted_rows]/ensure: created Notice: /Stage[main]/Nova::Deps/Anchor[nova::dbsync_api::begin]: Triggered 'refresh' 
from 1 events Info: /Stage[main]/Nova::Deps/Anchor[nova::dbsync_api::begin]: Scheduling refresh of Exec[nova-db-sync-api] Notice: /Stage[main]/Nova::Db::Sync_api/Exec[nova-db-sync-api]: Triggered 'refresh' from 3 events Info: /Stage[main]/Nova::Db::Sync_api/Exec[nova-db-sync-api]: Scheduling refresh of Anchor[nova::dbsync_api::end] Notice: /Stage[main]/Nova::Deps/Anchor[nova::dbsync_api::end]: Triggered 'refresh' from 1 events Info: /Stage[main]/Nova::Deps/Anchor[nova::dbsync_api::end]: Scheduling refresh of Anchor[nova::service::begin] Notice: /Stage[main]/Nova::Deps/Anchor[nova::service::begin]: Triggered 'refresh' from 3 events Info: /Stage[main]/Nova::Deps/Anchor[nova::service::begin]: Scheduling refresh of Service[nova-api] Info: /Stage[main]/Nova::Deps/Anchor[nova::service::begin]: Scheduling refresh of Service[nova-conductor] Info: /Stage[main]/Nova::Deps/Anchor[nova::service::begin]: Scheduling refresh of Service[nova-consoleauth] Info: /Stage[main]/Nova::Deps/Anchor[nova::service::begin]: Scheduling refresh of Service[nova-compute] Info: /Stage[main]/Nova::Deps/Anchor[nova::service::begin]: Scheduling refresh of Service[nova-scheduler] Info: /Stage[main]/Nova::Deps/Anchor[nova::service::begin]: Scheduling refresh of Service[nova-vncproxy] Notice: /Stage[main]/Nova::Vncproxy/Nova::Generic_service[vncproxy]/Service[nova-vncproxy]/ensure: ensure changed 'stopped' to 'running' Info: /Stage[main]/Nova::Vncproxy/Nova::Generic_service[vncproxy]/Service[nova-vncproxy]: Scheduling refresh of Anchor[nova::service::end] Info: /Stage[main]/Nova::Vncproxy/Nova::Generic_service[vncproxy]/Service[nova-vncproxy]: Unscheduling refresh on Service[nova-vncproxy] Notice: /Stage[main]/Nova::Consoleauth/Nova::Generic_service[consoleauth]/Service[nova-consoleauth]/ensure: ensure changed 'stopped' to 'running' Info: /Stage[main]/Nova::Consoleauth/Nova::Generic_service[consoleauth]/Service[nova-consoleauth]: Scheduling refresh of Anchor[nova::service::end] Info: /Stage[main]/Nova::Consoleauth/Nova::Generic_service[consoleauth]/Service[nova-consoleauth]: Unscheduling refresh on Service[nova-consoleauth] Notice: /Stage[main]/Nova::Scheduler/Nova::Generic_service[scheduler]/Service[nova-scheduler]/ensure: ensure changed 'stopped' to 'running' Info: /Stage[main]/Nova::Scheduler/Nova::Generic_service[scheduler]/Service[nova-scheduler]: Scheduling refresh of Anchor[nova::service::end] Info: /Stage[main]/Nova::Scheduler/Nova::Generic_service[scheduler]/Service[nova-scheduler]: Unscheduling refresh on Service[nova-scheduler] Notice: /Stage[main]/Nova::Conductor/Nova::Generic_service[conductor]/Service[nova-conductor]/ensure: ensure changed 'stopped' to 'running' Info: /Stage[main]/Nova::Conductor/Nova::Generic_service[conductor]/Service[nova-conductor]: Scheduling refresh of Anchor[nova::service::end] Info: /Stage[main]/Nova::Conductor/Nova::Generic_service[conductor]/Service[nova-conductor]: Unscheduling refresh on Service[nova-conductor] Notice: /Stage[main]/Nova::Compute/Nova::Generic_service[compute]/Service[nova-compute]/ensure: ensure changed 'stopped' to 'running' Info: /Stage[main]/Nova::Compute/Nova::Generic_service[compute]/Service[nova-compute]: Scheduling refresh of Anchor[nova::service::end] Info: /Stage[main]/Nova::Compute/Nova::Generic_service[compute]/Service[nova-compute]: Unscheduling refresh on Service[nova-compute] Notice: /Stage[main]/Apache::Mod::Cgi/Apache::Mod[cgi]/File[cgi.load]/ensure: defined content as '{md5}ac20c5c5779b37ab06b480d6485a0881' Info: 
/Stage[main]/Apache::Mod::Cgi/Apache::Mod[cgi]/File[cgi.load]: Scheduling refresh of Class[Apache::Service] Info: /Stage[main]/Apache/File[/etc/httpd/conf.modules.d/00-proxy.conf]: Filebucketed /etc/httpd/conf.modules.d/00-proxy.conf to puppet with sum 85487c6777a89a8494dc8976dfff3268 Notice: /Stage[main]/Apache/File[/etc/httpd/conf.modules.d/00-proxy.conf]/ensure: removed Info: /Stage[main]/Apache/File[/etc/httpd/conf.modules.d/01-cgi.conf]: Filebucketed /etc/httpd/conf.modules.d/01-cgi.conf to puppet with sum 36e54d4b2bd190f5cbad876bfbeda461 Notice: /Stage[main]/Apache/File[/etc/httpd/conf.modules.d/01-cgi.conf]/ensure: removed Info: /Stage[main]/Apache/File[/etc/httpd/conf.modules.d/00-ssl.conf]: Filebucketed /etc/httpd/conf.modules.d/00-ssl.conf to puppet with sum e282ac9f82fe5538692a4de3616fb695 Notice: /Stage[main]/Apache/File[/etc/httpd/conf.modules.d/00-ssl.conf]/ensure: removed Info: /Stage[main]/Apache/File[/etc/httpd/conf.modules.d/00-mpm.conf]: Filebucketed /etc/httpd/conf.modules.d/00-mpm.conf to puppet with sum 820f672ca85595fd80620db585d51970 Notice: /Stage[main]/Apache/File[/etc/httpd/conf.modules.d/00-mpm.conf]/ensure: removed Info: /Stage[main]/Apache/File[/etc/httpd/conf.modules.d/00-systemd.conf]: Filebucketed /etc/httpd/conf.modules.d/00-systemd.conf to puppet with sum fd94264cd695af2ad86e7715c10e285d Notice: /Stage[main]/Apache/File[/etc/httpd/conf.modules.d/00-systemd.conf]/ensure: removed Info: /Stage[main]/Apache/File[/etc/httpd/conf.modules.d/10-wsgi.conf]: Filebucketed /etc/httpd/conf.modules.d/10-wsgi.conf to puppet with sum e1795e051e7aae1f865fde0d3b86a507 Notice: /Stage[main]/Apache/File[/etc/httpd/conf.modules.d/10-wsgi.conf]/ensure: removed Info: /Stage[main]/Apache/File[/etc/httpd/conf.modules.d/00-base.conf]: Filebucketed /etc/httpd/conf.modules.d/00-base.conf to puppet with sum 6098845a84033f0fabe536488e52b1a0 Notice: /Stage[main]/Apache/File[/etc/httpd/conf.modules.d/00-base.conf]/ensure: removed Info: /Stage[main]/Apache/File[/etc/httpd/conf.modules.d/00-lua.conf]: Filebucketed /etc/httpd/conf.modules.d/00-lua.conf to puppet with sum 449a4aea60473ac4a16f025fca4463e3 Notice: /Stage[main]/Apache/File[/etc/httpd/conf.modules.d/00-lua.conf]/ensure: removed Info: /Stage[main]/Apache/File[/etc/httpd/conf.modules.d/00-dav.conf]: Filebucketed /etc/httpd/conf.modules.d/00-dav.conf to puppet with sum 56406b62d1fc7b7f1912e5b9e223f7a0 Notice: /Stage[main]/Apache/File[/etc/httpd/conf.modules.d/00-dav.conf]/ensure: removed Info: /etc/httpd/conf.modules.d: Scheduling refresh of Class[Apache::Service] Notice: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_port]/ensure: created Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_port]: Scheduling refresh of Service[neutron-server] Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_port]: Scheduling refresh of Exec[neutron-db-sync] Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_port]: Scheduling refresh of Service[neutron-metadata] Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_port]: Scheduling refresh of Service[neutron-lbaas-service] Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_port]: Scheduling refresh of Service[neutron-l3] Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_port]: Scheduling refresh of Service[neutron-dhcp-service] Info: /Stage[main]/Neutron/Neutron_config[oslo_messaging_rabbit/rabbit_port]: Scheduling refresh of Service[neutron-metering-service] 
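Each file removed above is first saved to Puppet's filebucket under the md5 sum shown, so a deleted module config can be pulled back by hand if needed. A minimal sketch, assuming puppet 3's filebucket subcommand and the default local client bucket on the node (path and sum taken from the log lines above):

    # recover one of the removed Apache module configs from the local filebucket
    puppet filebucket -l restore /etc/httpd/conf.modules.d/00-mpm.conf 820f672ca85595fd80620db585d51970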
Notice: /Stage[main]/Neutron::Agents::Ml2::Ovs/Service[neutron-ovs-agent-service]/ensure: ensure changed 'stopped' to 'running'
Info: /Stage[main]/Neutron::Agents::Ml2::Ovs/Service[neutron-ovs-agent-service]: Unscheduling refresh on Service[neutron-ovs-agent-service]
Notice: /Stage[main]/Neutron::Agents::Lbaas/Service[neutron-lbaasv2-service]: Triggered 'refresh' from 1 events
Notice: /Stage[main]/Keystone::Wsgi::Apache/Apache::Vhost[keystone_wsgi_admin]/Concat[10-keystone_wsgi_admin.conf]/File[10-keystone_wsgi_admin.conf]/ensure: defined content as '{md5}5147e80911d47f807820c80ccf1b3f9e'
Info: Concat[10-keystone_wsgi_admin.conf]: Scheduling refresh of Class[Apache::Service]
Info: Apache::Vhost[keystone_wsgi_admin]: Scheduling refresh of Anchor[keystone::config::end]
Notice: /Stage[main]/Keystone::Deps/Anchor[keystone::config::end]: Triggered 'refresh' from 36 events
Info: /Stage[main]/Keystone::Deps/Anchor[keystone::config::end]: Scheduling refresh of Anchor[keystone::service::begin]
Info: /Stage[main]/Keystone::Deps/Anchor[keystone::config::end]: Scheduling refresh of Service[httpd]
Info: /Stage[main]/Keystone::Deps/Anchor[keystone::config::end]: Scheduling refresh of Exec[keystone-manage db_sync]
Notice: /Stage[main]/Keystone::Db::Mysql/Openstacklib::Db::Mysql[keystone]/Mysql_database[keystone]/ensure: created
Info: Class[Apache::Service]: Scheduling refresh of Service[httpd]
Notice: /Stage[main]/Keystone::Db::Mysql/Openstacklib::Db::Mysql[keystone]/Openstacklib::Db::Mysql::Host_access[keystone_127.0.0.1]/Mysql_user[keystone@127.0.0.1]/ensure: created
Notice: /Stage[main]/Keystone::Db::Mysql/Openstacklib::Db::Mysql[keystone]/Openstacklib::Db::Mysql::Host_access[keystone_127.0.0.1]/Mysql_grant[keystone@127.0.0.1/keystone.*]/ensure: created
Info: Class[Keystone::Db::Mysql]: Scheduling refresh of Anchor[keystone::db::end]
Notice: /Stage[main]/Keystone::Deps/Anchor[keystone::db::end]: Triggered 'refresh' from 1 events
Info: /Stage[main]/Keystone::Deps/Anchor[keystone::db::end]: Scheduling refresh of Anchor[keystone::dbsync::begin]
Notice: /Stage[main]/Keystone::Deps/Anchor[keystone::dbsync::begin]: Triggered 'refresh' from 1 events
Info: /Stage[main]/Keystone::Deps/Anchor[keystone::dbsync::begin]: Scheduling refresh of Exec[keystone-manage db_sync]
Notice: /Stage[main]/Keystone::Db::Sync/Exec[keystone-manage db_sync]: Triggered 'refresh' from 3 events
Info: /Stage[main]/Keystone::Db::Sync/Exec[keystone-manage db_sync]: Scheduling refresh of Anchor[keystone::dbsync::end]
Notice: /Stage[main]/Keystone::Deps/Anchor[keystone::dbsync::end]: Triggered 'refresh' from 1 events
Info: /Stage[main]/Keystone::Deps/Anchor[keystone::dbsync::end]: Scheduling refresh of Anchor[keystone::service::begin]
Info: /Stage[main]/Keystone::Deps/Anchor[keystone::dbsync::end]: Scheduling refresh of Exec[keystone-manage bootstrap]
Notice: /Stage[main]/Keystone/Exec[keystone-manage bootstrap]: Triggered 'refresh' from 1 events
Info: /Stage[main]/Keystone/Exec[keystone-manage bootstrap]: Scheduling refresh of Anchor[keystone::service::begin]
Notice: /Stage[main]/Keystone::Deps/Anchor[keystone::service::begin]: Triggered 'refresh' from 4 events
Info: /Stage[main]/Keystone::Deps/Anchor[keystone::service::begin]: Scheduling refresh of Service[keystone]
Notice: /Stage[main]/Keystone::Service/Service[keystone]: Triggered 'refresh' from 1 events
Info: /Stage[main]/Keystone::Service/Service[keystone]: Scheduling refresh of Anchor[keystone::service::end]
Notice: /Stage[main]/Apache::Service/Service[httpd]/ensure: ensure changed 'stopped' to 'running'
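The Exec[keystone-manage db_sync] and Exec[keystone-manage bootstrap] resources refreshed above wrap the standard keystone CLI; run by hand, the equivalent is roughly the following (a sketch assuming mitaka-era keystone-manage; SECRET stands in for the admin password carried in the job's configuration):

    # populate the keystone schema, then seed the initial admin identity
    keystone-manage db_sync
    keystone-manage bootstrap --bootstrap-password SECRET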
Info: /Stage[main]/Apache::Service/Service[httpd]: Unscheduling refresh on Service[httpd]
Notice: /Stage[main]/Keystone::Deps/Anchor[keystone::service::end]: Triggered 'refresh' from 31 events
Notice: /Stage[main]/Swift::Keystone::Auth/Keystone::Resource::Service_identity[swift_s3]/Keystone_service[swift_s3::s3]/ensure: created
Notice: /Stage[main]/Swift::Keystone::Auth/Keystone::Resource::Service_identity[swift_s3]/Keystone_endpoint[RegionOne/swift_s3::s3]/ensure: created
Notice: /Stage[main]/Glance::Keystone::Auth/Keystone::Resource::Service_identity[glance]/Keystone_service[Image Service::image]/ensure: created
Notice: /Stage[main]/Neutron::Keystone::Auth/Keystone::Resource::Service_identity[neutron]/Keystone_user[neutron]/ensure: created
Notice: /Stage[main]/Swift::Keystone::Auth/Keystone_role[ResellerAdmin]/ensure: created
Notice: /Stage[main]/Ironic::Keystone::Auth/Keystone::Resource::Service_identity[ironic]/Keystone_service[ironic::baremetal]/ensure: created
Notice: /Stage[main]/Glance::Keystone::Auth/Keystone::Resource::Service_identity[glance]/Keystone_endpoint[RegionOne/Image Service::image]/ensure: created
Info: /Stage[main]/Glance::Keystone::Auth/Keystone::Resource::Service_identity[glance]/Keystone_endpoint[RegionOne/Image Service::image]: Scheduling refresh of Service[glance-api]
Notice: /Stage[main]/Nova::Keystone::Auth/Keystone::Resource::Service_identity[nova service, user nova]/Keystone_user[nova]/ensure: created
Notice: /Stage[main]/Swift::Keystone::Auth/Keystone_role[SwiftOperator]/ensure: created
Notice: /Stage[main]/Glance::Keystone::Auth/Keystone::Resource::Service_identity[glance]/Keystone_user[glance]/ensure: created
Notice: /Stage[main]/Nova::Keystone::Auth/Keystone::Resource::Service_identity[nova v3 service, user novav3]/Keystone_service[novav3::computev3]/ensure: created
Notice: /Stage[main]/Keystone::Roles::Admin/Keystone_tenant[services]/ensure: created
Notice: /Stage[main]/Glance::Keystone::Auth/Keystone::Resource::Service_identity[glance]/Keystone_user_role[glance@services]/ensure: created
Info: /Stage[main]/Glance::Keystone::Auth/Keystone::Resource::Service_identity[glance]/Keystone_user_role[glance@services]: Scheduling refresh of Service[glance-registry]
Info: /Stage[main]/Glance::Keystone::Auth/Keystone::Resource::Service_identity[glance]/Keystone_user_role[glance@services]: Scheduling refresh of Service[glance-api]
Notice: /Stage[main]/Glance::Registry/Service[glance-registry]/ensure: ensure changed 'stopped' to 'running'
Info: /Stage[main]/Glance::Registry/Service[glance-registry]: Unscheduling refresh on Service[glance-registry]
Notice: /Stage[main]/Keystone::Roles::Admin/Keystone_tenant[openstack]/ensure: created
Notice: /Stage[main]/Ironic::Keystone::Auth/Keystone::Resource::Service_identity[ironic]/Keystone_user[ironic]/ensure: created
Notice: /Stage[main]/Ironic::Keystone::Auth/Keystone::Resource::Service_identity[ironic]/Keystone_user_role[ironic@services]/ensure: created
Notice: /Stage[main]/Nova::Keystone::Auth/Keystone::Resource::Service_identity[nova service, user nova]/Keystone_service[nova::compute]/ensure: created
Notice: /Stage[main]/Cinder::Keystone::Auth/Keystone::Resource::Service_identity[cinderv3]/Keystone_service[cinderv3::volumev3]/ensure: created
Notice: /Stage[main]/Cinder::Keystone::Auth/Keystone::Resource::Service_identity[cinderv3]/Keystone_endpoint[RegionOne/cinderv3::volumev3]/ensure: created
Notice: /Stage[main]/Nova::Api/Nova::Generic_service[api]/Service[nova-api]/ensure: ensure changed 'stopped'
to 'running' Info: /Stage[main]/Nova::Api/Nova::Generic_service[api]/Service[nova-api]: Scheduling refresh of Anchor[nova::service::end] Info: /Stage[main]/Nova::Api/Nova::Generic_service[api]/Service[nova-api]: Unscheduling refresh on Service[nova-api] Notice: /Stage[main]/Neutron::Keystone::Auth/Keystone::Resource::Service_identity[neutron]/Keystone_service[neutron::network]/ensure: created Notice: /Stage[main]/Nova::Keystone::Auth/Keystone::Resource::Service_identity[nova service, user nova]/Keystone_user_role[nova@services]/ensure: created Notice: /Stage[main]/Keystone::Roles::Admin/Keystone_user[admin]/password: changed password Notice: /Stage[main]/Keystone::Roles::Admin/Keystone_user[admin]/email: defined 'email' as 'test@example.tld' Notice: /Stage[main]/Cinder::Keystone::Auth/Keystone::Resource::Service_identity[cinder]/Keystone_user[cinder]/ensure: created Notice: /Stage[main]/Cinder::Keystone::Auth/Keystone::Resource::Service_identity[cinderv2]/Keystone_service[cinderv2::volumev2]/ensure: created Notice: /Stage[main]/Nova::Deps/Anchor[nova::service::end]: Triggered 'refresh' from 6 events Notice: /Stage[main]/Nova::Keystone::Auth/Keystone::Resource::Service_identity[nova v3 service, user novav3]/Keystone_endpoint[RegionOne/novav3::computev3]/ensure: created Notice: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/notify_nova_on_port_status_changes]/ensure: created Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/notify_nova_on_port_status_changes]: Scheduling refresh of Service[neutron-server] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/notify_nova_on_port_status_changes]: Scheduling refresh of Exec[neutron-db-sync] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/notify_nova_on_port_status_changes]: Scheduling refresh of Service[neutron-metadata] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/notify_nova_on_port_status_changes]: Scheduling refresh of Service[neutron-lbaas-service] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/notify_nova_on_port_status_changes]: Scheduling refresh of Service[neutron-l3] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/notify_nova_on_port_status_changes]: Scheduling refresh of Service[neutron-dhcp-service] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/notify_nova_on_port_status_changes]: Scheduling refresh of Service[neutron-metering-service] Notice: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/auth_type]/ensure: created Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/auth_type]: Scheduling refresh of Service[neutron-server] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/auth_type]: Scheduling refresh of Exec[neutron-db-sync] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/auth_type]: Scheduling refresh of Service[neutron-metadata] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/auth_type]: Scheduling refresh of Service[neutron-lbaas-service] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/auth_type]: Scheduling refresh of Service[neutron-l3] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/auth_type]: Scheduling refresh of Service[neutron-dhcp-service] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/auth_type]: Scheduling refresh of Service[neutron-metering-service] Notice: 
/Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/project_name]/ensure: created Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/project_name]: Scheduling refresh of Service[neutron-server] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/project_name]: Scheduling refresh of Exec[neutron-db-sync] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/project_name]: Scheduling refresh of Service[neutron-metadata] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/project_name]: Scheduling refresh of Service[neutron-lbaas-service] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/project_name]: Scheduling refresh of Service[neutron-l3] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/project_name]: Scheduling refresh of Service[neutron-dhcp-service] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/project_name]: Scheduling refresh of Service[neutron-metering-service] Notice: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/username]/ensure: created Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/username]: Scheduling refresh of Service[neutron-server] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/username]: Scheduling refresh of Exec[neutron-db-sync] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/username]: Scheduling refresh of Service[neutron-metadata] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/username]: Scheduling refresh of Service[neutron-lbaas-service] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/username]: Scheduling refresh of Service[neutron-l3] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/username]: Scheduling refresh of Service[neutron-dhcp-service] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/username]: Scheduling refresh of Service[neutron-metering-service] Notice: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/password]/ensure: created Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/password]: Scheduling refresh of Service[neutron-server] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/password]: Scheduling refresh of Exec[neutron-db-sync] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/password]: Scheduling refresh of Service[neutron-metadata] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/password]: Scheduling refresh of Service[neutron-lbaas-service] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/password]: Scheduling refresh of Service[neutron-l3] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/password]: Scheduling refresh of Service[neutron-dhcp-service] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/password]: Scheduling refresh of Service[neutron-metering-service] Notice: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/project_domain_id]/ensure: created Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/project_domain_id]: Scheduling refresh of Service[neutron-server] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/project_domain_id]: Scheduling refresh of Exec[neutron-db-sync] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/project_domain_id]: Scheduling refresh of 
Service[neutron-metadata] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/project_domain_id]: Scheduling refresh of Service[neutron-lbaas-service] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/project_domain_id]: Scheduling refresh of Service[neutron-l3] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/project_domain_id]: Scheduling refresh of Service[neutron-dhcp-service] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/project_domain_id]: Scheduling refresh of Service[neutron-metering-service] Notice: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/notify_nova_on_port_data_changes]/ensure: created Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/notify_nova_on_port_data_changes]: Scheduling refresh of Service[neutron-server] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/notify_nova_on_port_data_changes]: Scheduling refresh of Exec[neutron-db-sync] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/notify_nova_on_port_data_changes]: Scheduling refresh of Service[neutron-metadata] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/notify_nova_on_port_data_changes]: Scheduling refresh of Service[neutron-lbaas-service] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/notify_nova_on_port_data_changes]: Scheduling refresh of Service[neutron-l3] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/notify_nova_on_port_data_changes]: Scheduling refresh of Service[neutron-dhcp-service] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/notify_nova_on_port_data_changes]: Scheduling refresh of Service[neutron-metering-service] Notice: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/tenant_name]/ensure: created Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/tenant_name]: Scheduling refresh of Service[neutron-server] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/tenant_name]: Scheduling refresh of Exec[neutron-db-sync] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/tenant_name]: Scheduling refresh of Service[neutron-metadata] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/tenant_name]: Scheduling refresh of Service[neutron-lbaas-service] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/tenant_name]: Scheduling refresh of Service[neutron-l3] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/tenant_name]: Scheduling refresh of Service[neutron-dhcp-service] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/tenant_name]: Scheduling refresh of Service[neutron-metering-service] Notice: /Stage[main]/Nova::Keystone::Auth/Keystone::Resource::Service_identity[nova service, user nova]/Keystone_endpoint[RegionOne/nova::compute]/ensure: created Notice: /Stage[main]/Neutron::Keystone::Auth/Keystone::Resource::Service_identity[neutron]/Keystone_endpoint[RegionOne/neutron::network]/ensure: created Info: /Stage[main]/Neutron::Keystone::Auth/Keystone::Resource::Service_identity[neutron]/Keystone_endpoint[RegionOne/neutron::network]: Scheduling refresh of Service[neutron-server] Notice: /Stage[main]/Swift::Keystone::Auth/Keystone::Resource::Service_identity[swift]/Keystone_service[swift::object-store]/ensure: created Notice: 
/Stage[main]/Keystone::Endpoint/Keystone::Resource::Service_identity[keystone]/Keystone_service[keystone::identity]/ensure: created Notice: /Stage[main]/Keystone::Endpoint/Keystone::Resource::Service_identity[keystone]/Keystone_endpoint[RegionOne/keystone::identity]/ensure: created Notice: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/user_domain_id]/ensure: created Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/user_domain_id]: Scheduling refresh of Service[neutron-server] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/user_domain_id]: Scheduling refresh of Exec[neutron-db-sync] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/user_domain_id]: Scheduling refresh of Service[neutron-metadata] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/user_domain_id]: Scheduling refresh of Service[neutron-lbaas-service] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/user_domain_id]: Scheduling refresh of Service[neutron-l3] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/user_domain_id]: Scheduling refresh of Service[neutron-dhcp-service] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/user_domain_id]: Scheduling refresh of Service[neutron-metering-service] Notice: /Stage[main]/Neutron::Keystone::Auth/Keystone::Resource::Service_identity[neutron]/Keystone_user_role[neutron@services]/ensure: created Info: /Stage[main]/Neutron::Keystone::Auth/Keystone::Resource::Service_identity[neutron]/Keystone_user_role[neutron@services]: Scheduling refresh of Service[neutron-server] Notice: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/auth_url]/ensure: created Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/auth_url]: Scheduling refresh of Service[neutron-server] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/auth_url]: Scheduling refresh of Exec[neutron-db-sync] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/auth_url]: Scheduling refresh of Service[neutron-metadata] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/auth_url]: Scheduling refresh of Service[neutron-lbaas-service] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/auth_url]: Scheduling refresh of Service[neutron-l3] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/auth_url]: Scheduling refresh of Service[neutron-dhcp-service] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[nova/auth_url]: Scheduling refresh of Service[neutron-metering-service] Notice: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/nova_url]/ensure: created Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/nova_url]: Scheduling refresh of Service[neutron-server] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/nova_url]: Scheduling refresh of Exec[neutron-db-sync] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/nova_url]: Scheduling refresh of Service[neutron-metadata] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/nova_url]: Scheduling refresh of Service[neutron-lbaas-service] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/nova_url]: Scheduling refresh of Service[neutron-l3] Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/nova_url]: Scheduling refresh of Service[neutron-dhcp-service] 
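All of the Neutron_config[nova/...] resources above land in the [nova] section of /etc/neutron/neutron.conf, which is how neutron-server authenticates its port-status notifications back to nova. The same edits by hand would look roughly like this, a sketch using openstack-config from openstack-utils with illustrative values (the real credentials come from the job's configuration):

    openstack-config --set /etc/neutron/neutron.conf nova auth_type password
    openstack-config --set /etc/neutron/neutron.conf nova project_name services
    openstack-config --set /etc/neutron/neutron.conf nova username nova
    openstack-config --set /etc/neutron/neutron.conf nova password NOVA_PASSWORD  # placeholder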
Info: /Stage[main]/Neutron::Server::Notifications/Neutron_config[DEFAULT/nova_url]: Scheduling refresh of Service[neutron-metering-service]
Notice: /Stage[main]/Neutron::Agents::Metering/Service[neutron-metering-service]/ensure: ensure changed 'stopped' to 'running'
Info: /Stage[main]/Neutron::Agents::Metering/Service[neutron-metering-service]: Unscheduling refresh on Service[neutron-metering-service]
Notice: /Stage[main]/Neutron::Agents::Dhcp/Service[neutron-dhcp-service]/ensure: ensure changed 'stopped' to 'running'
Info: /Stage[main]/Neutron::Agents::Dhcp/Service[neutron-dhcp-service]: Unscheduling refresh on Service[neutron-dhcp-service]
Notice: /Stage[main]/Neutron::Agents::L3/Service[neutron-l3]/ensure: ensure changed 'stopped' to 'running'
Info: /Stage[main]/Neutron::Agents::L3/Service[neutron-l3]: Unscheduling refresh on Service[neutron-l3]
Notice: /Stage[main]/Neutron::Agents::Lbaas/Service[neutron-lbaas-service]/ensure: ensure changed 'stopped' to 'running'
Info: /Stage[main]/Neutron::Agents::Lbaas/Service[neutron-lbaas-service]: Unscheduling refresh on Service[neutron-lbaas-service]
Notice: /Stage[main]/Neutron::Agents::Metadata/Service[neutron-metadata]/ensure: ensure changed 'stopped' to 'running'
Info: /Stage[main]/Neutron::Agents::Metadata/Service[neutron-metadata]: Unscheduling refresh on Service[neutron-metadata]
Notice: /Stage[main]/Glance::Api/Service[glance-api]/ensure: ensure changed 'stopped' to 'running'
Info: /Stage[main]/Glance::Api/Service[glance-api]: Unscheduling refresh on Service[glance-api]
Notice: /Stage[main]/Ironic::Keystone::Auth/Keystone::Resource::Service_identity[ironic]/Keystone_endpoint[RegionOne/ironic::baremetal]/ensure: created
Notice: /Stage[main]/Cinder::Keystone::Auth/Keystone::Resource::Service_identity[cinder]/Keystone_service[cinder::volume]/ensure: created
Notice: /Stage[main]/Cinder::Keystone::Auth/Keystone::Resource::Service_identity[cinder]/Keystone_endpoint[RegionOne/cinder::volume]/ensure: created
Notice: /Stage[main]/Swift::Keystone::Auth/Keystone::Resource::Service_identity[swift]/Keystone_endpoint[RegionOne/swift::object-store]/ensure: created
Notice: /Stage[main]/Neutron::Db::Sync/Exec[neutron-db-sync]: Triggered 'refresh' from 59 events
Info: /Stage[main]/Neutron::Db::Sync/Exec[neutron-db-sync]: Scheduling refresh of Service[neutron-server]
Notice: /Stage[main]/Neutron::Server/Service[neutron-server]/ensure: ensure changed 'stopped' to 'running'
Info: /Stage[main]/Neutron::Server/Service[neutron-server]: Unscheduling refresh on Service[neutron-server]
Notice: /Stage[main]/Swift::Keystone::Auth/Keystone::Resource::Service_identity[swift]/Keystone_user[swift]/ensure: created
Notice: /Stage[main]/Swift::Keystone::Auth/Keystone::Resource::Service_identity[swift]/Keystone_user_role[swift@services]/ensure: created
Notice: /Stage[main]/Cinder::Keystone::Auth/Keystone::Resource::Service_identity[cinderv2]/Keystone_endpoint[RegionOne/cinderv2::volumev2]/ensure: created
Notice: /Stage[main]/Cinder::Keystone::Auth/Keystone::Resource::Service_identity[cinder]/Keystone_user_role[cinder@services]/ensure: created
Notice: /Stage[main]/Openstack_integration::Cinder/Cinder_type[BACKEND_1]/ensure: created
Notice: /Stage[main]/Keystone::Roles::Admin/Keystone_user_role[admin@openstack]/ensure: created
Notice: /Stage[main]/Openstack_integration::Provision/Glance_image[cirros]/ensure: created
Notice: /Stage[main]/Keystone::Disable_admin_token_auth/Ini_subsetting[public_api/admin_token_auth]/ensure: removed
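The Exec[neutron-db-sync] that fired above is the usual alembic migration entry point for neutron; by hand it would be roughly the following (a sketch, assuming RDO's /etc/neutron/plugin.ini symlink for the ML2 plugin config):

    neutron-db-manage --config-file /etc/neutron/neutron.conf \
                      --config-file /etc/neutron/plugin.ini upgrade heads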
Info: /Stage[main]/Keystone::Disable_admin_token_auth/Ini_subsetting[public_api/admin_token_auth]: Scheduling refresh of Exec[restart_keystone]
Notice: /Stage[main]/Openstack_integration::Provision/Neutron_network[public]/ensure: created
Notice: /Stage[main]/Tempest/Tempest_neutron_net_id_setter[public_network_id]/ensure: created
Notice: /Stage[main]/Openstack_integration::Provision/Exec[manage_m1.micro_nova_flavor]/returns: executed successfully
Notice: /Stage[main]/Keystone::Disable_admin_token_auth/Ini_subsetting[api_v3/admin_token_auth]/ensure: removed
Info: /Stage[main]/Keystone::Disable_admin_token_auth/Ini_subsetting[api_v3/admin_token_auth]: Scheduling refresh of Exec[restart_keystone]
Notice: /Stage[main]/Openstack_extras::Auth_file/File[/root/openrc]/ensure: defined content as '{md5}3f4b596583820c76e15d3092a5c6dcc0'
Notice: /Stage[main]/Keystone::Disable_admin_token_auth/Ini_subsetting[admin_api/admin_token_auth]/ensure: removed
Info: /Stage[main]/Keystone::Disable_admin_token_auth/Ini_subsetting[admin_api/admin_token_auth]: Scheduling refresh of Exec[restart_keystone]
Notice: /Stage[main]/Keystone/Exec[restart_keystone]: Triggered 'refresh' from 3 events
Notice: /Stage[main]/Openstack_integration::Provision/Exec[manage_m1.nano_nova_flavor]/returns: executed successfully
Notice: /Stage[main]/Openstack_integration::Provision/Neutron_subnet[public-subnet]/ensure: created
Notice: /Stage[main]/Openstack_integration::Provision/Glance_image[cirros_alt]/ensure: created
Notice: /Stage[main]/Tempest/Tempest_glance_id_setter[image_ref]/ensure: created
Notice: /Stage[main]/Tempest/Tempest_glance_id_setter[image_ref_alt]/ensure: created
Info: Creating state file /var/lib/puppet/state/state.yaml
Notice: Finished catalog run in 519.98 seconds
Info: Loading external facts from /etc/puppet/modules/openstacklib/facts.d
Info: Loading facts in /etc/puppet/modules/nova/lib/facter/libvirt_uuid.rb
Info: Loading facts in /etc/puppet/modules/openstacklib/lib/facter/os_package_type.rb
Info: Loading facts in /etc/puppet/modules/openstacklib/lib/facter/os_service_default.rb
Info: Loading facts in /etc/puppet/modules/vswitch/lib/facter/ovs.rb
Info: Loading facts in /etc/puppet/modules/apt/lib/facter/apt_reboot_required.rb
Info: Loading facts in /etc/puppet/modules/apt/lib/facter/apt_update_last_success.rb
Info: Loading facts in /etc/puppet/modules/apt/lib/facter/apt_updates.rb
Info: Loading facts in /etc/puppet/modules/concat/lib/facter/concat_basedir.rb
Info: Loading facts in /etc/puppet/modules/firewall/lib/facter/ip6tables_version.rb
Info: Loading facts in /etc/puppet/modules/firewall/lib/facter/iptables_persistent_version.rb
Info: Loading facts in /etc/puppet/modules/firewall/lib/facter/iptables_version.rb
Info: Loading facts in /etc/puppet/modules/mysql/lib/facter/mysql_version.rb
Info: Loading facts in /etc/puppet/modules/mysql/lib/facter/mysql_server_id.rb
Info: Loading facts in /etc/puppet/modules/python/lib/facter/pip_version.rb
Info: Loading facts in /etc/puppet/modules/python/lib/facter/python_version.rb
Info: Loading facts in /etc/puppet/modules/python/lib/facter/virtualenv_version.rb
Info: Loading facts in /etc/puppet/modules/staging/lib/facter/staging_http_get.rb
Info: Loading facts in /etc/puppet/modules/staging/lib/facter/staging_windir.rb
Info: Loading facts in /etc/puppet/modules/stdlib/lib/facter/puppet_vardir.rb
Info: Loading facts in /etc/puppet/modules/stdlib/lib/facter/facter_dot_d.rb
Info: Loading facts in /etc/puppet/modules/stdlib/lib/facter/pe_version.rb
Info: Loading facts in /etc/puppet/modules/stdlib/lib/facter/root_home.rb
Notice: Compiled catalog for n2.dusty.ci.centos.org in environment production in 9.47 seconds
Info: Applying configuration version '1463743997'
Notice: Finished catalog run in 69.42 seconds
all create: /tmp/openstack/tempest/.tox/tempest
all installdeps: setuptools, -r/tmp/openstack/tempest/requirements.txt
all develop-inst: /tmp/openstack/tempest
all installed: Babel==2.3.4,cffi==1.6.0,cliff==2.0.0,cmd2==0.6.8,cryptography==1.3.2,debtcollector==1.4.0,enum34==1.1.6,extras==1.0.0,fasteners==0.14.1,fixtures==1.4.0,funcsigs==1.0.2,functools32==3.2.3.post2,idna==2.1,ipaddress==1.0.16,iso8601==0.1.11,jsonschema==2.5.1,linecache2==1.0.0,monotonic==1.1,msgpack-python==0.4.7,netaddr==0.7.18,netifaces==0.10.4,os-testr==0.6.0,oslo.concurrency==3.8.0,oslo.config==3.9.0,oslo.context==2.3.0,oslo.i18n==3.6.0,oslo.log==3.7.0,oslo.serialization==2.6.0,oslo.utils==3.10.0,paramiko==2.0.0,pbr==1.9.1,prettytable==0.7.2,pyasn1==0.1.9,pycparser==2.14,pyinotify==0.9.6,pyOpenSSL==16.0.0,pyparsing==2.1.4,python-dateutil==2.5.3,python-mimeparse==1.5.2,python-subunit==1.2.0,pytz==2016.4,PyYAML==3.11,retrying==1.3.3,six==1.10.0,stevedore==1.13.0,-e git://git.openstack.org/openstack/tempest@aff9cc072bbbb222b09a3411b203c180b493eae8#egg=tempest,testrepository==0.0.20,testscenarios==0.5.0,testtools==2.2.0,traceback2==1.4.0,unicodecsv==0.14.1,unittest2==1.1.0,urllib3==1.15.1,wrapt==1.10.8
all runtests: PYTHONHASHSEED='3977220619'
all runtests: commands[0] | find . -type f -name *.pyc -delete
all runtests: commands[1] | bash tools/pretty_tox.sh --concurrency=2 smoke dashboard TelemetryAlarming api.baremetal.admin.test_drivers
running testr
running=OS_STDOUT_CAPTURE=${OS_STDOUT_CAPTURE:-1} \
OS_STDERR_CAPTURE=${OS_STDERR_CAPTURE:-1} \
OS_TEST_TIMEOUT=${OS_TEST_TIMEOUT:-500} \
OS_TEST_LOCK_PATH=${OS_TEST_LOCK_PATH:-${TMPDIR:-'/tmp'}} \
${PYTHON:-python} -m subunit.run discover -t ${OS_TOP_LEVEL:-./} ${OS_TEST_PATH:-./tempest/test_discover} --list
running=OS_STDOUT_CAPTURE=${OS_STDOUT_CAPTURE:-1} \
OS_STDERR_CAPTURE=${OS_STDERR_CAPTURE:-1} \
OS_TEST_TIMEOUT=${OS_TEST_TIMEOUT:-500} \
OS_TEST_LOCK_PATH=${OS_TEST_LOCK_PATH:-${TMPDIR:-'/tmp'}} \
${PYTHON:-python} -m subunit.run discover -t ${OS_TOP_LEVEL:-./} ${OS_TEST_PATH:-./tempest/test_discover} --load-list /tmp/tmp0s50qd
running=OS_STDOUT_CAPTURE=${OS_STDOUT_CAPTURE:-1} \
OS_STDERR_CAPTURE=${OS_STDERR_CAPTURE:-1} \
OS_TEST_TIMEOUT=${OS_TEST_TIMEOUT:-500} \
OS_TEST_LOCK_PATH=${OS_TEST_LOCK_PATH:-${TMPDIR:-'/tmp'}} \
${PYTHON:-python} -m subunit.run discover -t ${OS_TOP_LEVEL:-./} ${OS_TEST_PATH:-./tempest/test_discover} --load-list /tmp/tmpdnKp0B
{1} tempest.api.compute.security_groups.test_security_group_rules.SecurityGroupRulesTestJSON.test_security_group_rules_create [0.510592s] ... ok
{1} tempest.api.compute.security_groups.test_security_group_rules.SecurityGroupRulesTestJSON.test_security_group_rules_list [0.637885s] ... ok
{0} tempest.api.baremetal.admin.test_drivers.TestDrivers.test_list_drivers [1.271812s] ... ok
{0} tempest.api.baremetal.admin.test_drivers.TestDrivers.test_show_driver [1.338437s] ... ok
{0} tempest.api.compute.flavors.test_flavors.FlavorsV2TestJSON.test_get_flavor [0.106825s] ... ok
{0} tempest.api.compute.flavors.test_flavors.FlavorsV2TestJSON.test_list_flavors [0.130774s] ... ok
{0} tempest.api.compute.security_groups.test_security_groups.SecurityGroupsTestJSON.test_security_groups_create_list_delete [1.448903s] ... ok
ok {1} tempest.api.compute.servers.test_create_server.ServersTestManualDisk.test_list_servers [0.064264s] ... ok {1} tempest.api.compute.servers.test_create_server.ServersTestManualDisk.test_verify_server_details [0.000582s] ... ok {0} tempest.api.compute.servers.test_attach_interfaces.AttachInterfacesTestJSON.test_add_remove_fixed_ip [11.839100s] ... ok {1} tempest.api.compute.servers.test_server_addresses.ServerAddressesTestJSON.test_list_server_addresses [0.071666s] ... ok {1} tempest.api.compute.servers.test_server_addresses.ServerAddressesTestJSON.test_list_server_addresses_by_network [0.159031s] ... ok {0} tempest.api.compute.servers.test_create_server.ServersTestJSON.test_list_servers [0.063791s] ... ok {0} tempest.api.compute.servers.test_create_server.ServersTestJSON.test_verify_server_details [0.000565s] ... ok {1} setUpClass (tempest.api.data_processing.test_cluster_templates.ClusterTemplateTest) ... SKIPPED: Sahara support is required {1} setUpClass (tempest.api.data_processing.test_data_sources.DataSourceTest) ... SKIPPED: Sahara support is required {1} setUpClass (tempest.api.data_processing.test_job_binaries.JobBinaryTest) ... SKIPPED: Sahara support is required {1} setUpClass (tempest.api.data_processing.test_jobs.JobTest) ... SKIPPED: Sahara support is required {1} setUpClass (tempest.api.data_processing.test_node_group_templates.NodeGroupTemplateTest) ... SKIPPED: Sahara support is required {1} setUpClass (tempest.api.database.flavors.test_flavors.DatabaseFlavorsTest) ... SKIPPED: DatabaseFlavorsTest skipped as trove is not available {1} setUpClass (tempest.api.database.limits.test_limits.DatabaseLimitsTest) ... SKIPPED: DatabaseLimitsTest skipped as trove is not available {1} tempest.api.identity.admin.v3.test_credentials.CredentialsTestJSON.test_credentials_create_get_update_delete [0.152104s] ... ok {1} tempest.api.identity.admin.v3.test_domains.DefaultDomainTestJSON.test_default_domain_exists [0.037410s] ... ok {1} tempest.api.identity.admin.v3.test_domains.DomainsTestJSON.test_create_update_delete_domain [0.397761s] ... ok {1} tempest.api.identity.admin.v3.test_endpoints.EndPointsTestJSON.test_update_endpoint [0.215164s] ... ok {0} tempest.api.compute.servers.test_server_actions.ServerActionsTestJSON.test_reboot_server_hard [11.347985s] ... ok {1} tempest.api.identity.admin.v3.test_groups.GroupsV3TestJSON.test_group_users_add_list_delete [1.166480s] ... ok {0} setUpClass (tempest.api.data_processing.test_job_binary_internals.JobBinaryInternalTest) ... SKIPPED: Sahara support is required {0} setUpClass (tempest.api.data_processing.test_plugins.PluginsTest) ... SKIPPED: Sahara support is required {0} setUpClass (tempest.api.database.versions.test_versions.DatabaseVersionsTest) ... SKIPPED: DatabaseVersionsTest skipped as trove is not available {1} tempest.api.identity.admin.v3.test_regions.RegionsTestJSON.test_create_region_with_specific_id [0.166700s] ... ok {0} tempest.api.identity.admin.v2.test_services.ServicesTestJSON.test_list_services [0.373792s] ... ok {1} tempest.api.identity.admin.v3.test_roles.RolesV3TestJSON.test_role_create_update_show_list [0.286381s] ... ok {0} tempest.api.identity.admin.v2.test_users.UsersTestJSON.test_create_user [0.143726s] ... ok {1} tempest.api.identity.admin.v3.test_trusts.TrustsV3TestJSON.test_get_trusts_all [1.541047s] ... ok {0} tempest.api.identity.admin.v3.test_policies.PoliciesTestJSON.test_create_update_delete_policy [0.206302s] ... 
ok {1} tempest.api.image.v2.test_images.BasicOperationsImagesTest.test_delete_image [0.517553s] ... ok {1} tempest.api.image.v2.test_images.BasicOperationsImagesTest.test_register_upload_get_image_file [1.137646s] ... ok {1} tempest.api.image.v2.test_images.BasicOperationsImagesTest.test_update_image [1.469844s] ... ok {1} tempest.api.network.test_extensions.ExtensionsTestJSON.test_list_show_extensions [0.430944s] ... ok {0} tempest.api.identity.admin.v3.test_services.ServicesTestJSON.test_create_update_get_service [0.295808s] ... ok {1} tempest.api.network.test_networks.BulkNetworkOpsIpV6Test.test_bulk_create_delete_network [0.846129s] ... ok {1} tempest.api.network.test_networks.BulkNetworkOpsIpV6Test.test_bulk_create_delete_port [1.372009s] ... ok {0} tempest.api.identity.v2.test_api_discovery.TestApiDiscovery.test_api_media_types [0.048919s] ... ok {0} tempest.api.identity.v2.test_api_discovery.TestApiDiscovery.test_api_version_resources [0.054454s] ... ok {0} tempest.api.identity.v2.test_api_discovery.TestApiDiscovery.test_api_version_statuses [0.045440s] ... ok {1} tempest.api.network.test_networks.BulkNetworkOpsIpV6Test.test_bulk_create_delete_subnet [4.599057s] ... ok {1} setUpClass (tempest.api.network.test_networks.NetworksIpV6TestAttrs) ... SKIPPED: IPv6 extended attributes for subnets not available {0} tempest.api.identity.v3.test_api_discovery.TestApiDiscovery.test_api_media_types [0.054893s] ... ok {0} tempest.api.identity.v3.test_api_discovery.TestApiDiscovery.test_api_version_resources [0.061760s] ... ok {0} tempest.api.identity.v3.test_api_discovery.TestApiDiscovery.test_api_version_statuses [0.059559s] ... ok {1} tempest.api.network.test_networks.NetworksTest.test_create_update_delete_network_subnet [1.563828s] ... ok {1} tempest.api.network.test_networks.NetworksTest.test_external_network_visibility [0.184518s] ... ok {1} tempest.api.network.test_networks.NetworksTest.test_list_networks [0.079143s] ... ok {1} tempest.api.network.test_networks.NetworksTest.test_list_subnets [0.046161s] ... ok {1} tempest.api.network.test_networks.NetworksTest.test_show_network [0.052944s] ... ok {1} tempest.api.network.test_networks.NetworksTest.test_show_subnet [0.051730s] ... ok {1} tempest.api.network.test_ports.PortsTestJSON.test_create_port_in_allowed_allocation_pools [1.498361s] ... ok {0} tempest.api.network.test_floating_ips.FloatingIPTestJSON.test_create_floating_ip_specifying_a_fixed_ip_address [0.891415s] ... ok {1} tempest.api.network.test_ports.PortsTestJSON.test_create_port_with_no_securitygroups [1.660860s] ... ok {0} tempest.api.network.test_floating_ips.FloatingIPTestJSON.test_create_list_show_update_delete_floating_ip [1.472597s] ... ok {1} tempest.api.network.test_ports.PortsTestJSON.test_create_update_delete_port [1.024533s] ... ok {1} tempest.api.network.test_ports.PortsTestJSON.test_list_ports [0.028519s] ... ok {1} tempest.api.network.test_ports.PortsTestJSON.test_show_port [0.031189s] ... ok {0} tempest.api.network.test_networks.BulkNetworkOpsTest.test_bulk_create_delete_network [0.826167s] ... ok {0} tempest.api.network.test_networks.BulkNetworkOpsTest.test_bulk_create_delete_port [1.382763s] ... ok {1} tempest.api.network.test_routers.RoutersTest.test_add_multiple_router_interfaces [3.649875s] ... ok {0} tempest.api.network.test_networks.BulkNetworkOpsTest.test_bulk_create_delete_subnet [1.938906s] ... ok {1} tempest.api.network.test_routers.RoutersTest.test_add_remove_router_interface_with_port_id [2.267556s] ... 
ok {1} tempest.api.network.test_routers.RoutersTest.test_add_remove_router_interface_with_subnet_id [1.954573s] ... ok {1} tempest.api.network.test_routers.RoutersTest.test_create_show_list_update_delete_router [1.438991s] ... ok {0} tempest.api.network.test_networks.NetworksIpV6Test.test_create_update_delete_network_subnet [1.268792s] ... ok {0} tempest.api.network.test_networks.NetworksIpV6Test.test_external_network_visibility [0.112706s] ... ok {0} tempest.api.network.test_networks.NetworksIpV6Test.test_list_networks [0.051579s] ... ok {0} tempest.api.network.test_networks.NetworksIpV6Test.test_list_subnets [0.045387s] ... ok {0} tempest.api.network.test_networks.NetworksIpV6Test.test_show_network [0.132461s] ... ok {0} tempest.api.network.test_networks.NetworksIpV6Test.test_show_subnet [0.044410s] ... ok {1} tempest.api.network.test_security_groups.SecGroupIPv6Test.test_create_list_update_show_delete_security_group [0.375148s] ... ok {1} tempest.api.network.test_security_groups.SecGroupIPv6Test.test_create_show_delete_security_group_rule [0.470574s] ... ok {1} tempest.api.network.test_security_groups.SecGroupIPv6Test.test_list_security_groups [0.035922s] ... ok {0} tempest.api.network.test_ports.PortsIpV6TestJSON.test_create_port_in_allowed_allocation_pools [1.515059s] ... ok {1} tempest.api.object_storage.test_account_quotas.AccountQuotasTest.test_admin_modify_quota [0.210497s] ... ok {1} tempest.api.object_storage.test_account_quotas.AccountQuotasTest.test_upload_valid_object [0.071776s] ... ok {0} tempest.api.network.test_ports.PortsIpV6TestJSON.test_create_port_with_no_securitygroups [1.804811s] ... ok {0} tempest.api.network.test_ports.PortsIpV6TestJSON.test_create_update_delete_port [0.783414s] ... ok {0} tempest.api.network.test_ports.PortsIpV6TestJSON.test_list_ports [0.030561s] ... ok {0} tempest.api.network.test_ports.PortsIpV6TestJSON.test_show_port [0.029457s] ... ok {1} tempest.api.object_storage.test_account_services.AccountTest.test_list_account_metadata [0.054494s] ... ok {1} tempest.api.object_storage.test_account_services.AccountTest.test_list_containers [0.013434s] ... ok {1} setUpClass (tempest.api.orchestration.stacks.test_stacks.StacksTestJSON) ... SKIPPED: Heat support is required {1} setUpClass (tempest.api.telemetry.test_alarming_api.TelemetryAlarmingAPITestJSON) ... SKIPPED: Aodh support is required {1} setUpClass (tempest.api.telemetry.test_alarming_api_negative.TelemetryAlarmingNegativeTest) ... SKIPPED: Aodh support is required {0} tempest.api.network.test_routers.RoutersIpV6Test.test_add_multiple_router_interfaces [3.744758s] ... ok {0} tempest.api.network.test_routers.RoutersIpV6Test.test_add_remove_router_interface_with_port_id [2.046541s] ... ok {0} tempest.api.network.test_routers.RoutersIpV6Test.test_add_remove_router_interface_with_subnet_id [2.020083s] ... ok {1} tempest.api.volume.test_volumes_get.VolumesV2GetTest.test_volume_create_get_update_delete [7.842538s] ... ok {0} tempest.api.network.test_routers.RoutersIpV6Test.test_create_show_list_update_delete_router [1.502659s] ... ok {0} tempest.api.network.test_security_groups.SecGroupTest.test_create_list_update_show_delete_security_group [0.368896s] ... ok {0} tempest.api.network.test_security_groups.SecGroupTest.test_create_show_delete_security_group_rule [0.471705s] ... ok {0} tempest.api.network.test_security_groups.SecGroupTest.test_list_security_groups [0.044018s] ... 
ok {0} tempest.api.network.test_subnetpools_extensions.SubnetPoolsTestJSON.test_create_list_show_update_delete_subnetpools [0.268973s] ... ok {0} tempest.api.object_storage.test_container_quotas.ContainerQuotasTest.test_upload_large_object [0.391122s] ... ok {0} tempest.api.object_storage.test_container_quotas.ContainerQuotasTest.test_upload_too_many_objects [0.285795s] ... ok {0} tempest.api.object_storage.test_container_quotas.ContainerQuotasTest.test_upload_valid_object [0.195461s] ... ok {1} tempest.api.volume.test_volumes_get.VolumesV2GetTest.test_volume_create_get_update_delete_from_image [32.992742s] ... ok {0} tempest.api.object_storage.test_container_services.ContainerTest.test_create_container [0.335287s] ... ok {0} tempest.api.object_storage.test_container_services.ContainerTest.test_list_container_contents [0.149195s] ... ok {0} tempest.api.object_storage.test_container_services.ContainerTest.test_list_container_metadata [0.121836s] ... ok {0} tempest.api.object_storage.test_object_services.ObjectTest.test_create_object [0.050552s] ... ok {0} tempest.api.object_storage.test_object_services.ObjectTest.test_get_object [0.026938s] ... ok {0} tempest.api.object_storage.test_object_services.ObjectTest.test_list_object_metadata [0.024721s] ... ok {0} tempest.api.object_storage.test_object_services.ObjectTest.test_update_object_metadata [0.051842s] ... ok {0} setUpClass (tempest.api.orchestration.stacks.test_resource_types.ResourceTypesTest) ... SKIPPED: Heat support is required {0} setUpClass (tempest.api.orchestration.stacks.test_soft_conf.TestSoftwareConfig) ... SKIPPED: Heat support is required {0} setUpClass (tempest.api.telemetry.test_telemetry_notification_api.TelemetryNotificationAPITestJSON) ... SKIPPED: Ceilometer support is required {1} tempest.api.volume.test_volumes_list.VolumesV1ListTestJSON.test_volume_list [0.049297s] ... ok {0} tempest.api.volume.test_volumes_actions.VolumesV1ActionsTest.test_attach_detach_volume_to_instance [1.502631s] ... ok {1} tempest.scenario.test_server_basic_ops.TestServerBasicOps.test_server_basic_ops [36.835052s] ... FAILED {1} setUpClass (tempest.scenario.test_server_multinode.TestServerMultinode) ... SKIPPED: Less than 2 compute nodes, skipping multinode tests. {0} tempest.api.volume.test_volumes_actions.VolumesV2ActionsTest.test_attach_detach_volume_to_instance [1.199769s] ... ok {0} tempest.api.volume.test_volumes_get.VolumesV1GetTest.test_volume_create_get_update_delete [17.959206s] ... ok {0} tempest.api.volume.test_volumes_get.VolumesV1GetTest.test_volume_create_get_update_delete_from_image [42.517879s] ... ok {0} tempest.api.volume.test_volumes_list.VolumesV2ListTestJSON.test_volume_list [0.047964s] ... ok {0} tempest.scenario.test_network_basic_ops.TestNetworkBasicOps.test_network_basic_ops [132.155957s] ... ok {0} tempest.scenario.test_volume_boot_pattern.TestVolumeBootPattern.test_volume_boot_pattern [151.413748s] ... ok {0} tempest.scenario.test_volume_boot_pattern.TestVolumeBootPatternV2.test_volume_boot_pattern [155.015026s] ... 
ok ============================== Failed 1 tests - output below: ============================== tempest.scenario.test_server_basic_ops.TestServerBasicOps.test_server_basic_ops[compute,id-7fff3fb3-91d8-4fd0-bd7d-0204f1f180ba,network,smoke] ---------------------------------------------------------------------------------------------------------------------------------------------- Captured pythonlogging: ~~~~~~~~~~~~~~~~~~~~~~~ 2016-05-20 12:39:33,977 7734 INFO [tempest.lib.common.rest_client] Request (TestServerBasicOps:setUp): 200 GET https://127.0.0.1:8774/v2/2b19e26c86fb4b48abe8551003fc00c7/images/ffff3a3a-5101-497a-b186-38682e723d59 0.461s 2016-05-20 12:39:33,978 7734 DEBUG [tempest.lib.common.rest_client] Request - Headers: {'Content-Type': 'application/json', 'Accept': 'application/json', 'X-Auth-Token': ''} Body: None Response - Headers: {'vary': 'Accept-Encoding', 'status': '200', 'connection': 'close', 'server': 'Apache/2.4.6 (CentOS)', 'content-length': '677', 'content-type': 'application/json', 'content-location': 'https://127.0.0.1:8774/v2/2b19e26c86fb4b48abe8551003fc00c7/images/ffff3a3a-5101-497a-b186-38682e723d59', 'date': 'Fri, 20 May 2016 11:39:33 GMT', 'x-compute-request-id': 'req-2d694247-967f-4d4c-b110-8dd52b397df7'} Body: {"image": {"status": "ACTIVE", "updated": "2016-05-20T11:32:39Z", "links": [{"href": "https://127.0.0.1:8774/v2/2b19e26c86fb4b48abe8551003fc00c7/images/ffff3a3a-5101-497a-b186-38682e723d59", "rel": "self"}, {"href": "https://127.0.0.1:8774/2b19e26c86fb4b48abe8551003fc00c7/images/ffff3a3a-5101-497a-b186-38682e723d59", "rel": "bookmark"}, {"href": "http://172.19.2.66:9292/images/ffff3a3a-5101-497a-b186-38682e723d59", "type": "application/vnd.openstack.image", "rel": "alternate"}], "id": "ffff3a3a-5101-497a-b186-38682e723d59", "OS-EXT-IMG-SIZE:size": 13287936, "name": "cirros", "created": "2016-05-20T11:32:36Z", "minDisk": 0, "progress": 100, "minRam": 0, "metadata": {}}} 2016-05-20 12:39:34,143 7734 INFO [tempest.lib.common.rest_client] Request (TestServerBasicOps:setUp): 200 GET https://127.0.0.1:8774/v2/2b19e26c86fb4b48abe8551003fc00c7/flavors/42 0.162s 2016-05-20 12:39:34,144 7734 DEBUG [tempest.lib.common.rest_client] Request - Headers: {'Content-Type': 'application/json', 'Accept': 'application/json', 'X-Auth-Token': ''} Body: None Response - Headers: {'vary': 'Accept-Encoding', 'status': '200', 'connection': 'close', 'server': 'Apache/2.4.6 (CentOS)', 'content-length': '421', 'content-type': 'application/json', 'content-location': 'https://127.0.0.1:8774/v2/2b19e26c86fb4b48abe8551003fc00c7/flavors/42', 'date': 'Fri, 20 May 2016 11:39:33 GMT', 'x-compute-request-id': 'req-8aa2720b-8e5f-4b38-a53a-2f0a4f4a3442'} Body: {"flavor": {"name": "m1.nano", "links": [{"href": "https://127.0.0.1:8774/v2/2b19e26c86fb4b48abe8551003fc00c7/flavors/42", "rel": "self"}, {"href": "https://127.0.0.1:8774/2b19e26c86fb4b48abe8551003fc00c7/flavors/42", "rel": "bookmark"}], "ram": 128, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "swap": "", "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "OS-FLV-EXT-DATA:ephemeral": 0, "disk": 0, "id": "42"}} 2016-05-20 12:39:34,561 7734 INFO [tempest.lib.common.rest_client] Request (TestServerBasicOps:setUp): 200 GET https://127.0.0.1:8774/v2/2b19e26c86fb4b48abe8551003fc00c7/images/ffff3a3a-5101-497a-b186-38682e723d59 0.415s 2016-05-20 12:39:34,562 7734 DEBUG [tempest.lib.common.rest_client] Request - Headers: {'Content-Type': 'application/json', 'Accept': 'application/json', 'X-Auth-Token': ''} Body: None Response - Headers: 
{'vary': 'Accept-Encoding', 'status': '200', 'connection': 'close', 'server': 'Apache/2.4.6 (CentOS)', 'content-length': '677', 'content-type': 'application/json', 'content-location': 'https://127.0.0.1:8774/v2/2b19e26c86fb4b48abe8551003fc00c7/images/ffff3a3a-5101-497a-b186-38682e723d59', 'date': 'Fri, 20 May 2016 11:39:34 GMT', 'x-compute-request-id': 'req-47d134ff-c986-4e47-87bc-8b1865abeb34'} Body: {"image": {"status": "ACTIVE", "updated": "2016-05-20T11:32:39Z", "links": [{"href": "https://127.0.0.1:8774/v2/2b19e26c86fb4b48abe8551003fc00c7/images/ffff3a3a-5101-497a-b186-38682e723d59", "rel": "self"}, {"href": "https://127.0.0.1:8774/2b19e26c86fb4b48abe8551003fc00c7/images/ffff3a3a-5101-497a-b186-38682e723d59", "rel": "bookmark"}, {"href": "http://172.19.2.66:9292/images/ffff3a3a-5101-497a-b186-38682e723d59", "type": "application/vnd.openstack.image", "rel": "alternate"}], "id": "ffff3a3a-5101-497a-b186-38682e723d59", "OS-EXT-IMG-SIZE:size": 13287936, "name": "cirros", "created": "2016-05-20T11:32:36Z", "minDisk": 0, "progress": 100, "minRam": 0, "metadata": {}}} 2016-05-20 12:39:34,568 7734 DEBUG [tempest.scenario.test_server_basic_ops] Starting test for i:ffff3a3a-5101-497a-b186-38682e723d59, f:42. Run ssh: False, user: cirros 2016-05-20 12:39:34,754 7734 INFO [tempest.lib.common.rest_client] Request (TestServerBasicOps:test_server_basic_ops): 200 POST https://127.0.0.1:8774/v2/2b19e26c86fb4b48abe8551003fc00c7/os-keypairs 0.184s 2016-05-20 12:39:34,754 7734 DEBUG [tempest.lib.common.rest_client] Request - Headers: {'Content-Type': 'application/json', 'Accept': 'application/json', 'X-Auth-Token': ''} Body: {"keypair": {"name": "tempest-TestServerBasicOps-1692537820"}} Response - Headers: {'vary': 'Accept-Encoding', 'status': '200', 'connection': 'close', 'server': 'Apache/2.4.6 (CentOS)', 'content-length': '2320', 'content-type': 'application/json', 'content-location': 'https://127.0.0.1:8774/v2/2b19e26c86fb4b48abe8551003fc00c7/os-keypairs', 'date': 'Fri, 20 May 2016 11:39:34 GMT', 'x-compute-request-id': 'req-174eb88e-a1a0-4382-abaa-3db83a7276b4'} Body: {"keypair": {"public_key": "ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQCf0YVs8Qd2HOxGejejNA86wa9jKGRUqadnX16ux7D0QgTxcru4ll4JtSPY3azJqwwUAajeHOge/vPM6ySLlJscB9iPo0k4A0AbNed1hfmEvYXYEYmss58gkgFjwrv5wqIz08V4Fu+I9FMjD0PmFFQNqSv35i3C6i54LUZRGkFzT7HxXM4aAZUjpCfjNXsJSDoRSz0GBC0QbZ+GQah7mYiVMDJO1MFWKrReDjYMNr3xdooTb2m3G2rvksHgl0ezVRDbvkgCodJz4YQrC82gitJdLyGEJZpYPTMbOp/dsOAkKPGtkyF4Qqv/FDMyCHM8bsiOog/xXmBIT87xlzBtAzZ9 Generated-by-Nova", "private_key": "-----BEGIN RSA PRIVATE KEY----- MIIEqQIBAAKCAQEAn9GFbPEHdhzsRno3ozQPOsGvYyhkVKmnZ19ersew9EIE8XK7 uJZeCbUj2N2syasMFAGo3hzoHv7zzOski5SbHAfYj6NJOANAGzXndYX5hL2F2BGJ rLOfIJIBY8K7+cKiM9PFeBbviPRTIw9D5hRUDakr9+YtwuoueC1GURpBc0+x8VzO GgGVI6Qn4zV7CUg6EUs9BgQtEG2fhkGoe5mIlTAyTtTBViq0Xg42DDa98XaKE29p txtq75LB4JdHs1UQ275IAqHSc+GEKwvNoIrSXS8hhCWaWD0zGzqf3bDgJCjxrZMh eEKr/xQzMghzPG7IjqIP8V5gSE/O8ZcwbQM2fQIDAQABAoIBABO2sJKjmJwFLU/0 O3CyNz60LYI5tUaMNs4VgYRltXoruphd4rH+OlNQOL/DeFDX/IFrQv1C648HO+OH Ddb52bg3b4soRRvXqsywdYCVqhWpmxzv7N+UuIg3+lvn5XAFhiSGdtE9YwatvKOS enmjAEs/FuFZT0O/x0OjsgzHBFPIyt15vGAOIIhbWRBoWJSBD5MglPHpqFRMbWnh Ima71YSEn62dddHzlnk5+7gVf7FF9eZl4hcLrfqWuZhi8lNTiu/FtBQT9cEnoAXb u6Y/59eoZSBv334s3D/nlbtqY922xJrwVjucfbw7tDrzDaDlurkHKST/jr29weOM Pl7T8gECggCBAL/E6Y/62Eja/DoQwMGytxb612xbA3lasZnpyBjBFpByKuK8tPy9 wp9K+dT8nk8+E1GToPOGGyvk1UqnYl2mShiDpZWRrtDf2JZqL0r4FhDs1DoMvbcO scAt9KYT9yjMwFtZXflA2N7sU5pWovJccnEsAN47elxT9ROC9l0Sqt2dAoIAgQDV WQTawXkU2bJlyUqC+EXEFEtHR1uUfLWB7ZbwoqB5tYKUydKk0d7CNOLSPQgJJnpt 
b5l/iRtypsZ0FbjRiBtdkzsn7zzsY5pvaptasbeSNG7EOdADRRmfXSDPQi9J5TIL sqxxbu9lLlIgT1J8ECQARpNx7VmSzA697JjpS9TWYQKCAIEAr9hhj3wmXdAoHxqD llpJV1IWva5LZjkKyCa+LCzKgxOdTaJal7NtxmGa63nltKYoUtJ7cTLUsZA5ISaR pWw5X3dAHAGlerT4Rx0BVs5cdZKlHMHYKQbZaW76eluudQQjkuBEsq2K8Admtgyh iHnLGwmNljqV/hmijgy12iym72UCggCBAJ/MzZYM1GSJqtYSr3zp+Vek273H9RCD WHC5RRV4ujpveh94DA7oI7HTaxGOJTa1W34c2Rxt1eFKidrpakWHbPfqD6UZzMhC 0qohb7u+4YDhRRY1N1k7qLV1S93x9PmkcpfQfNl5/lYLG/iXcXD7pfuO4WG0JiOO NHyNevtDkWgBAoIAgBXL82F/ICjK7i4B232bJB0RQEzAevqBCoRFMUUGl9rePcgB UOSiiDVHfl2C1yu3WabzNehoDO5/RqyxpPji/SrnMvi4aPPywLvJ9gqEfUwld1Wo p6riJoPx6aS+VLPLP0rDhKGuEJkIu4Qv9tCdG7nReWWEImiM6ldN9kzOZfIN -----END RSA PRIVATE KEY----- ", "user_id": "4f2057b1b7744ce9b90440c0f47efbef", "name": "tempest-TestServerBasicOps-1692537820", "fingerprint": "01:a5:e4:68:53:67:e9:cc:22:5b:d6:b0:21:ff:5a:f4"}} 2016-05-20 12:40:04,819 7734 INFO [tempest.lib.common.rest_client] Request (TestServerBasicOps:test_server_basic_ops): 500 POST https://127.0.0.1:8774/v2/2b19e26c86fb4b48abe8551003fc00c7/os-security-groups 30.063s 2016-05-20 12:40:04,819 7734 DEBUG [tempest.lib.common.rest_client] Request - Headers: {'Content-Type': 'application/json', 'Accept': 'application/json', 'X-Auth-Token': ''} Body: {"security_group": {"description": "tempest-TestServerBasicOps-1404384290 description", "name": "tempest-TestServerBasicOps-1404384290"}} Response - Headers: {'status': '500', 'connection': 'close', 'server': 'Apache/2.4.6 (CentOS)', 'content-length': '224', 'content-type': 'application/json; charset=UTF-8', 'content-location': 'https://127.0.0.1:8774/v2/2b19e26c86fb4b48abe8551003fc00c7/os-security-groups', 'date': 'Fri, 20 May 2016 11:39:34 GMT', 'x-compute-request-id': 'req-940cd5d7-8a3c-478b-9285-2964bfe29105'} Body: {"computeFault": {"message": "Unexpected API Error. Please report this at http://bugs.launchpad.net/nova/ and attach the Nova API log if possible. 
", "code": 500}} 2016-05-20 12:40:10,344 7734 INFO [tempest.lib.common.rest_client] Request (TestServerBasicOps:_run_cleanups): 202 DELETE https://127.0.0.1:8774/v2/2b19e26c86fb4b48abe8551003fc00c7/os-keypairs/tempest-TestServerBasicOps-1692537820 5.521s 2016-05-20 12:40:10,349 7734 DEBUG [tempest.lib.common.rest_client] Request - Headers: {'Content-Type': 'application/json', 'Accept': 'application/json', 'X-Auth-Token': ''} Body: None Response - Headers: {'status': '202', 'connection': 'close', 'server': 'Apache/2.4.6 (CentOS)', 'content-length': '0', 'content-type': 'application/json', 'content-location': 'https://127.0.0.1:8774/v2/2b19e26c86fb4b48abe8551003fc00c7/os-keypairs/tempest-TestServerBasicOps-1692537820', 'date': 'Fri, 20 May 2016 11:40:04 GMT', 'x-compute-request-id': 'req-a0ec0f02-aeeb-4a83-81f3-01be3558a2df'} Body: Captured traceback: ~~~~~~~~~~~~~~~~~~~ Traceback (most recent call last): File "tempest/test.py", line 113, in wrapper return f(self, *func_args, **func_kwargs) File "tempest/scenario/test_server_basic_ops.py", line 124, in test_server_basic_ops self.security_group = self._create_security_group() File "tempest/scenario/manager.py", line 333, in _create_security_group name=sg_name, description=sg_desc)['security_group'] File "tempest/lib/services/compute/security_groups_client.py", line 55, in create_security_group resp, body = self.post('os-security-groups', post_body) File "tempest/lib/common/rest_client.py", line 259, in post return self.request('POST', url, extra_headers, headers, body) File "tempest/lib/services/compute/base_compute_client.py", line 53, in request method, url, extra_headers, headers, body) File "tempest/lib/common/rest_client.py", line 641, in request resp, resp_body) File "tempest/lib/common/rest_client.py", line 760, in _error_checker message=message) tempest.lib.exceptions.ServerFault: Got server fault Details: Unexpected API Error. Please report this at http://bugs.launchpad.net/nova/ and attach the Nova API log if possible. ====== Totals ====== Ran: 126 tests in 837.0000 sec. - Passed: 107 - Skipped: 18 - Expected Fail: 0 - Unexpected Success: 0 - Failed: 1 Sum of execute time for each test: 665.4004 sec. 
============== Worker Balance ==============
 - Worker 0 (67 tests) => 0:13:47.264576
 - Worker 1 (59 tests) => 0:04:33.373255

Slowest Tests:
Test id                                                                                                                                           Runtime (s)
------------------------------------------------------------------------------------------------------------------------------------------------  -----------
tempest.scenario.test_volume_boot_pattern.TestVolumeBootPatternV2.test_volume_boot_pattern[compute,id-557cd2c2-4eb8-4dce-98be-f86765ff311b,image,smoke,volume]  155.015
tempest.scenario.test_volume_boot_pattern.TestVolumeBootPattern.test_volume_boot_pattern[compute,id-557cd2c2-4eb8-4dce-98be-f86765ff311b,image,smoke,volume]  151.414
tempest.scenario.test_network_basic_ops.TestNetworkBasicOps.test_network_basic_ops[compute,id-f323b3ba-82f8-4db7-8ea6-6a895869ec49,network,smoke]  132.156
tempest.api.volume.test_volumes_get.VolumesV1GetTest.test_volume_create_get_update_delete_from_image[id-54a01030-c7fc-447c-86ee-c1182beae638,image,smoke]  42.518
tempest.scenario.test_server_basic_ops.TestServerBasicOps.test_server_basic_ops[compute,id-7fff3fb3-91d8-4fd0-bd7d-0204f1f180ba,network,smoke]  36.835
tempest.api.volume.test_volumes_get.VolumesV2GetTest.test_volume_create_get_update_delete_from_image[id-54a01030-c7fc-447c-86ee-c1182beae638,image,smoke]  32.993
tempest.api.volume.test_volumes_get.VolumesV1GetTest.test_volume_create_get_update_delete[id-27fb0e9f-fb64-41dd-8bdb-1ffa762f0d51,smoke]  17.959
tempest.api.compute.servers.test_attach_interfaces.AttachInterfacesTestJSON.test_add_remove_fixed_ip[id-c7e0e60b-ee45-43d0-abeb-8596fd42a2f9,network,smoke]  11.839
tempest.api.compute.servers.test_server_actions.ServerActionsTestJSON.test_reboot_server_hard[id-2cb1baf6-ac8d-4429-bf0d-ba8a0ba53e32,smoke]  11.348
tempest.api.volume.test_volumes_get.VolumesV2GetTest.test_volume_create_get_update_delete[id-27fb0e9f-fb64-41dd-8bdb-1ffa762f0d51,smoke]  7.843

ERROR: InvocationError: '/usr/bin/bash tools/pretty_tox.sh --concurrency=2 smoke dashbboard TelemetryAlarming api.baremetal.admin.test_drivers'
___________________________________ summary ____________________________________
ERROR: all: commands failed
stderr: + export PUPPET_VERSION=3 + PUPPET_VERSION=3 + export SCENARIO=scenario002 + SCENARIO=scenario002 + export MANAGE_PUPPET_MODULES=true + MANAGE_PUPPET_MODULES=true + export MANAGE_REPOS=false + MANAGE_REPOS=false + export PUPPET_ARGS= + PUPPET_ARGS= +++ dirname ./run_tests.sh ++ cd . ++ pwd -P + export SCRIPT_DIR=/tmp/puppet-openstack + SCRIPT_DIR=/tmp/puppet-openstack + '[' -f /etc/nodepool/provider ']' + NODEPOOL_MIRROR_HOST=mirror.centos.org + export FACTER_nodepool_mirror_host=http://mirror.centos.org + FACTER_nodepool_mirror_host=http://mirror.centos.org + '[' 3 == 4 ']' + export PUPPET_RELEASE_FILE=puppetlabs-release + PUPPET_RELEASE_FILE=puppetlabs-release + export PUPPET_BASE_PATH=/etc/puppet + PUPPET_BASE_PATH=/etc/puppet + export PUPPET_PKG=puppet + PUPPET_PKG=puppet + source /tmp/puppet-openstack/functions + '[' '!' -f fixtures/scenario002.pp ']' ++ id -u + '[' 0 '!=' 0 ']' + git clone -b 12.0.0 git://git.openstack.org/openstack/tempest /tmp/openstack/tempest Note: checking out 'aff9cc072bbbb222b09a3411b203c180b493eae8'. You are in 'detached HEAD' state. You can look around, make experimental changes and commit them, and you can discard any commits you make in this state without impacting any branches by performing another checkout. If you want to create a new branch to retain commits you create, you may do so (now or later) by using -b with the checkout command again.
Example: git checkout -b new_branch_name + PUPPET_ARGS=' --detailed-exitcodes --color=false --test --trace' + uses_debs + type apt-get + is_fedora + '[' -f /etc/os-release ']' + source /etc/os-release ++ NAME='CentOS Linux' ++ VERSION='7 (Core)' ++ ID=centos ++ ID_LIKE='rhel fedora' ++ VERSION_ID=7 ++ PRETTY_NAME='CentOS Linux 7 (Core)' ++ ANSI_COLOR='0;31' ++ CPE_NAME=cpe:/o:centos:centos:7 ++ HOME_URL=https://www.centos.org/ ++ BUG_REPORT_URL=https://bugs.centos.org/ ++ CENTOS_MANTISBT_PROJECT=CentOS-7 ++ CENTOS_MANTISBT_PROJECT_VERSION=7 ++ REDHAT_SUPPORT_PRODUCT=centos ++ REDHAT_SUPPORT_PRODUCT_VERSION=7 + test centos = fedora -o centos = centos + rpm --quiet -q puppetlabs-release + rpm --quiet -q epel-release + rm -f /tmp/puppet.rpm + wget http://yum.puppetlabs.com/puppetlabs-release-el-7.noarch.rpm -O /tmp/puppet.rpm --2016-05-20 12:22:49-- http://yum.puppetlabs.com/puppetlabs-release-el-7.noarch.rpm Resolving yum.puppetlabs.com (yum.puppetlabs.com)... 192.155.89.90, 2600:3c03::f03c:91ff:fedb:6b1d Connecting to yum.puppetlabs.com (yum.puppetlabs.com)|192.155.89.90|:80... connected. HTTP request sent, awaiting response... 200 OK Length: 12504 (12K) [application/x-redhat-package-manager] Saving to: ‘/tmp/puppet.rpm’ 0K .......... .. 100% 291M=0s 2016-05-20 12:22:49 (291 MB/s) - ‘/tmp/puppet.rpm’ saved [12504/12504] + rpm -ivh /tmp/puppet.rpm warning: /tmp/puppet.rpm: Header V4 RSA/SHA1 Signature, key ID 4bd6ec30: NOKEY + yum install -y dstat puppet Warning: RPMDB altered outside of yum. + type dstat + dstat -tcmndrylpg --top-cpu-adv --top-io-adv --nocolor + '[' true = true ']' + ./install_modules.sh + tee --append /var/log/dstat.log + '[' -n '' ']' + '[' 3 = 4 ']' + export PUPPET_BASE_PATH=/etc/puppet + PUPPET_BASE_PATH=/etc/puppet +++ dirname ./install_modules.sh ++ cd .
++ pwd -P + export SCRIPT_DIR=/tmp/puppet-openstack + SCRIPT_DIR=/tmp/puppet-openstack + export PUPPETFILE_DIR=/etc/puppet/modules + PUPPETFILE_DIR=/etc/puppet/modules + source /tmp/puppet-openstack/functions + gem install r10k --no-ri --no-rdoc + rm -rf '/etc/puppet/modules/*' + install_modules + '[' -e /usr/zuul-env/bin/zuul-cloner ']' + install_all + PUPPETFILE=/tmp/puppet-openstack/Puppetfile + r10k puppetfile install -v INFO -> Updating module /etc/puppet/modules/aodh INFO -> Updating module /etc/puppet/modules/barbican INFO -> Updating module /etc/puppet/modules/ceilometer INFO -> Updating module /etc/puppet/modules/ceph INFO -> Updating module /etc/puppet/modules/cinder INFO -> Updating module /etc/puppet/modules/designate INFO -> Updating module /etc/puppet/modules/glance INFO -> Updating module /etc/puppet/modules/gnocchi INFO -> Updating module /etc/puppet/modules/heat INFO -> Updating module /etc/puppet/modules/horizon INFO -> Updating module /etc/puppet/modules/ironic INFO -> Updating module /etc/puppet/modules/keystone INFO -> Updating module /etc/puppet/modules/manila INFO -> Updating module /etc/puppet/modules/mistral INFO -> Updating module /etc/puppet/modules/monasca INFO -> Updating module /etc/puppet/modules/murano INFO -> Updating module /etc/puppet/modules/neutron INFO -> Updating module /etc/puppet/modules/nova INFO -> Updating module /etc/puppet/modules/octavia INFO -> Updating module /etc/puppet/modules/openstack_integration INFO -> Updating module /etc/puppet/modules/openstack_extras INFO -> Updating module /etc/puppet/modules/openstacklib INFO -> Updating module /etc/puppet/modules/oslo INFO -> Updating module /etc/puppet/modules/sahara INFO -> Updating module /etc/puppet/modules/swift INFO -> Updating module /etc/puppet/modules/tempest INFO -> Updating module /etc/puppet/modules/trove INFO -> Updating module /etc/puppet/modules/vswitch INFO -> Updating module /etc/puppet/modules/zaqar INFO -> Updating module /etc/puppet/modules/apache INFO -> Updating module /etc/puppet/modules/apt INFO -> Updating module /etc/puppet/modules/concat INFO -> Updating module /etc/puppet/modules/corosync INFO -> Updating module /etc/puppet/modules/dns INFO -> Updating module /etc/puppet/modules/firewall INFO -> Updating module /etc/puppet/modules/inifile INFO -> Updating module /etc/puppet/modules/memcached INFO -> Updating module /etc/puppet/modules/mongodb INFO -> Updating module /etc/puppet/modules/mysql INFO -> Updating module /etc/puppet/modules/postgresql INFO -> Updating module /etc/puppet/modules/powerdns INFO -> Updating module /etc/puppet/modules/python INFO -> Updating module /etc/puppet/modules/qpid INFO -> Updating module /etc/puppet/modules/rabbitmq INFO -> Updating module /etc/puppet/modules/rsync INFO -> Updating module /etc/puppet/modules/staging INFO -> Updating module /etc/puppet/modules/stdlib INFO -> Updating module /etc/puppet/modules/sysctl INFO -> Updating module /etc/puppet/modules/vcsrepo INFO -> Updating module /etc/puppet/modules/xinetd + puppet module list Warning: Module 'openstack-openstacklib' (v8.0.1) fails to meet some dependencies: 'puppet-octavia' (v0.0.1) requires 'openstack-openstacklib' (>=7.0.0 <8.0.0) 'puppet-oslo' (v0.0.1) requires 'openstack-openstacklib' (>=7.0.0 <8.0.0) Warning: Module 'puppetlabs-inifile' (v1.4.3) fails to meet some dependencies: 'openstack-gnocchi' (v8.0.1) requires 'puppetlabs-inifile' (>=1.5.0 <2.0.0) Warning: Missing dependency 'deric-storm': 'openstack-monasca' (v1.0.0) requires 'deric-storm' (>=0.0.1 <1.0.0) 
Warning: Missing dependency 'deric-zookeeper': 'openstack-monasca' (v1.0.0) requires 'deric-zookeeper' (>=0.0.1 <1.0.0) Warning: Missing dependency 'jdowning-influxdb': 'openstack-monasca' (v1.0.0) requires 'jdowning-influxdb' (>=0.3.0 <1.0.0) Warning: Missing dependency 'openstack-oslo': 'openstack-barbican' (v0.0.1) requires 'openstack-oslo' (<9.0.0) Warning: Missing dependency 'opentable-kafka': 'openstack-monasca' (v1.0.0) requires 'opentable-kafka' (>=1.0.0 <2.0.0) Warning: Missing dependency 'puppetlabs-stdlib': 'antonlindstrom-powerdns' (v0.0.5) requires 'puppetlabs-stdlib' (>= 0.0.0) Warning: Missing dependency 'puppetlabs-corosync': 'openstack-openstack_extras' (v8.0.1) requires 'puppetlabs-corosync' (>=0.8.0 <1.0.0) Warning: Missing dependency 'stahnma-epel': 'stankevich-python' (v1.10.0) requires 'stahnma-epel' (>= 1.0.1 < 2.0.0) + set +e + '[' false = true ']' + run_puppet scenario002 + local manifest=scenario002 + puppet apply --detailed-exitcodes --color=false --test --trace fixtures/scenario002.pp Warning: Config file /etc/puppet/hiera.yaml not found, using Hiera defaults Warning: Scope(Class[Nova::Keystone::Auth]): Note that service_name parameter default value will be changed to "Compute Service" (according to Keystone default catalog) in a future release. In case you use different value, please update your manifests accordingly. Warning: Scope(Class[Nova::Keystone::Auth]): Note that service_name_v3 parameter default value will be changed to "Compute Service v3" (according to Keystone default catalog) in a future release. In case you use different value, please update your manifests accordingly. Warning: Scope(Class[Nova::Vncproxy::Common]): Could not look up qualified variable '::nova::vncproxy::host'; class ::nova::vncproxy has not been evaluated Warning: Scope(Class[Nova::Vncproxy::Common]): Could not look up qualified variable '::nova::vncproxy::vncproxy_protocol'; class ::nova::vncproxy has not been evaluated Warning: Scope(Class[Nova::Vncproxy::Common]): Could not look up qualified variable '::nova::vncproxy::port'; class ::nova::vncproxy has not been evaluated Warning: Scope(Class[Nova::Vncproxy::Common]): Could not look up qualified variable '::nova::vncproxy::vncproxy_path'; class ::nova::vncproxy has not been evaluated Warning: Scope(Class[Swift]): swift_hash_suffix has been deprecated and should be replaced with swift_hash_path_suffix, this will be removed as part of the N-cycle Warning: The package type's allow_virtual parameter will be changing its default value from false to true in a future release. If you do not want to allow virtual packages, please explicitly set allow_virtual to false. 
(at /usr/share/ruby/vendor_ruby/puppet/type.rb:816:in `set_default'; /usr/share/ruby/vendor_ruby/puppet/type.rb:2263:in `block in set_parameters'; /usr/share/ruby/vendor_ruby/puppet/type.rb:2262:in `each'; /usr/share/ruby/vendor_ruby/puppet/type.rb:2262:in `set_parameters'; /usr/share/ruby/vendor_ruby/puppet/type.rb:2200:in `initialize'; /usr/share/ruby/vendor_ruby/puppet/resource.rb:314:in `new'; /usr/share/ruby/vendor_ruby/puppet/resource.rb:314:in `to_ral'; /usr/share/ruby/vendor_ruby/puppet/resource/catalog.rb:510:in `block in to_catalog'; /usr/share/ruby/vendor_ruby/puppet/resource/catalog.rb:502:in `each'; /usr/share/ruby/vendor_ruby/puppet/resource/catalog.rb:502:in `to_catalog'; /usr/share/ruby/vendor_ruby/puppet/resource/catalog.rb:405:in `to_ral'; /usr/share/ruby/vendor_ruby/puppet/application/apply.rb:217:in `block in main'; /usr/share/ruby/vendor_ruby/puppet/context.rb:64:in `override'; /usr/share/ruby/vendor_ruby/puppet.rb:234:in `override'; /usr/share/ruby/vendor_ruby/puppet/application/apply.rb:190:in `main'; /usr/share/ruby/vendor_ruby/puppet/application/apply.rb:151:in `run_command'; /usr/share/ruby/vendor_ruby/puppet/application.rb:371:in `block (2 levels) in run'; /usr/share/ruby/vendor_ruby/puppet/application.rb:477:in `plugin_hook'; /usr/share/ruby/vendor_ruby/puppet/application.rb:371:in `block in run'; /usr/share/ruby/vendor_ruby/puppet/util.rb:479:in `exit_on_fail'; /usr/share/ruby/vendor_ruby/puppet/application.rb:371:in `run'; /usr/share/ruby/vendor_ruby/puppet/util/command_line.rb:137:in `run'; /usr/share/ruby/vendor_ruby/puppet/util/command_line.rb:91:in `execute'; /usr/bin/puppet:8:in `
<main>') Warning: Unexpected line: Ring file /etc/swift/object.ring.gz not found, probably it hasn't been written yet Warning: Unexpected line: Ring file /etc/swift/container.ring.gz not found, probably it hasn't been written yet Warning: Unexpected line: Ring file /etc/swift/account.ring.gz not found, probably it hasn't been written yet + local res=2 + return 2 + RESULT=2 + set -e + '[' 2 -ne 2 ']' + set +e + run_puppet scenario002 + local manifest=scenario002 + puppet apply --detailed-exitcodes --color=false --test --trace fixtures/scenario002.pp Warning: Config file /etc/puppet/hiera.yaml not found, using Hiera defaults Warning: Scope(Class[Nova::Keystone::Auth]): Note that service_name parameter default value will be changed to "Compute Service" (according to Keystone default catalog) in a future release. In case you use different value, please update your manifests accordingly. Warning: Scope(Class[Nova::Keystone::Auth]): Note that service_name_v3 parameter default value will be changed to "Compute Service v3" (according to Keystone default catalog) in a future release. In case you use different value, please update your manifests accordingly. Warning: Scope(Class[Nova::Vncproxy::Common]): Could not look up qualified variable '::nova::vncproxy::host'; class ::nova::vncproxy has not been evaluated Warning: Scope(Class[Nova::Vncproxy::Common]): Could not look up qualified variable '::nova::vncproxy::vncproxy_protocol'; class ::nova::vncproxy has not been evaluated Warning: Scope(Class[Nova::Vncproxy::Common]): Could not look up qualified variable '::nova::vncproxy::port'; class ::nova::vncproxy has not been evaluated Warning: Scope(Class[Nova::Vncproxy::Common]): Could not look up qualified variable '::nova::vncproxy::vncproxy_path'; class ::nova::vncproxy has not been evaluated Warning: Scope(Class[Swift]): swift_hash_suffix has been deprecated and should be replaced with swift_hash_path_suffix, this will be removed as part of the N-cycle Warning: The package type's allow_virtual parameter will be changing its default value from false to true in a future release. If you do not want to allow virtual packages, please explicitly set allow_virtual to false.
(at /usr/share/ruby/vendor_ruby/puppet/type.rb:816:in `set_default'; /usr/share/ruby/vendor_ruby/puppet/type.rb:2263:in `block in set_parameters'; /usr/share/ruby/vendor_ruby/puppet/type.rb:2262:in `each'; /usr/share/ruby/vendor_ruby/puppet/type.rb:2262:in `set_parameters'; /usr/share/ruby/vendor_ruby/puppet/type.rb:2200:in `initialize'; /usr/share/ruby/vendor_ruby/puppet/resource.rb:314:in `new'; /usr/share/ruby/vendor_ruby/puppet/resource.rb:314:in `to_ral'; /usr/share/ruby/vendor_ruby/puppet/resource/catalog.rb:510:in `block in to_catalog'; /usr/share/ruby/vendor_ruby/puppet/resource/catalog.rb:502:in `each'; /usr/share/ruby/vendor_ruby/puppet/resource/catalog.rb:502:in `to_catalog'; /usr/share/ruby/vendor_ruby/puppet/resource/catalog.rb:405:in `to_ral'; /usr/share/ruby/vendor_ruby/puppet/application/apply.rb:217:in `block in main'; /usr/share/ruby/vendor_ruby/puppet/context.rb:64:in `override'; /usr/share/ruby/vendor_ruby/puppet.rb:234:in `override'; /usr/share/ruby/vendor_ruby/puppet/application/apply.rb:190:in `main'; /usr/share/ruby/vendor_ruby/puppet/application/apply.rb:151:in `run_command'; /usr/share/ruby/vendor_ruby/puppet/application.rb:371:in `block (2 levels) in run'; /usr/share/ruby/vendor_ruby/puppet/application.rb:477:in `plugin_hook'; /usr/share/ruby/vendor_ruby/puppet/application.rb:371:in `block in run'; /usr/share/ruby/vendor_ruby/puppet/util.rb:479:in `exit_on_fail'; /usr/share/ruby/vendor_ruby/puppet/application.rb:371:in `run'; /usr/share/ruby/vendor_ruby/puppet/util/command_line.rb:137:in `run'; /usr/share/ruby/vendor_ruby/puppet/util/command_line.rb:91:in `execute'; /usr/bin/puppet:8:in `
') Warning: Unexpected line: Ring file /etc/swift/object.ring.gz is up-to-date Warning: Unexpected line: Devices: id region zone ip address port replication ip replication port name weight partitions balance flags meta Warning: Unexpected line: Ring file /etc/swift/container.ring.gz is up-to-date Warning: Unexpected line: Devices: id region zone ip address port replication ip replication port name weight partitions balance flags meta Warning: Unexpected line: Ring file /etc/swift/account.ring.gz is up-to-date Warning: Unexpected line: Devices: id region zone ip address port replication ip replication port name weight partitions balance flags meta + local res=0 + return 0 + RESULT=0 + set -e + '[' 0 -ne 0 ']' + mkdir -p /tmp/openstack/tempest + rm -f /tmp/openstack/tempest/cirros-0.3.4-x86_64-disk.img + wget http://download.cirros-cloud.net/0.3.4/cirros-0.3.4-x86_64-disk.img -P /tmp/openstack/tempest --2016-05-20 12:34:42-- http://download.cirros-cloud.net/0.3.4/cirros-0.3.4-x86_64-disk.img Resolving download.cirros-cloud.net (download.cirros-cloud.net)... 64.90.42.85 Connecting to download.cirros-cloud.net (download.cirros-cloud.net)|64.90.42.85|:80... connected. HTTP request sent, awaiting response... 200 OK Length: 13287936 (13M) [text/plain] Saving to: ?/tmp/openstack/tempest/cirros-0.3.4-x86_64-disk.img? 0K .......... .......... .......... .......... .......... 0% 197K 65s 50K .......... .......... .......... .......... .......... 0% 586K 44s 100K .......... .......... .......... .......... .......... 1% 297K 43s 150K .......... .......... .......... .......... .......... 1% 592K 38s 200K .......... .......... .......... .......... .......... 1% 27.3M 30s 250K .......... .......... .......... .......... .......... 2% 595K 29s 300K .......... .......... .......... .......... .......... 2% 597K 27s 350K .......... .......... .......... .......... .......... 3% 77.4M 24s 400K .......... .......... .......... .......... .......... 3% 596K 24s 450K .......... .......... .......... .......... .......... 3% 52.6M 21s 500K .......... .......... .......... .......... .......... 4% 45.3M 19s 550K .......... .......... .......... .......... .......... 4% 596K 19s 600K .......... .......... .......... .......... .......... 5% 134M 18s 650K .......... .......... .......... .......... .......... 5% 48.6M 16s 700K .......... .......... .......... .......... .......... 5% 605K 17s 750K .......... .......... .......... .......... .......... 6% 39.9M 15s 800K .......... .......... .......... .......... .......... 6% 23.3M 15s 850K .......... .......... .......... .......... .......... 6% 242M 14s 900K .......... .......... .......... .......... .......... 7% 616K 14s 950K .......... .......... .......... .......... .......... 7% 47.1M 13s 1000K .......... .......... .......... .......... .......... 8% 26.5M 13s 1050K .......... .......... .......... .......... .......... 8% 52.2M 12s 1100K .......... .......... .......... .......... .......... 8% 583K 12s 1150K .......... .......... .......... .......... .......... 9% 289M 12s 1200K .......... .......... .......... .......... .......... 9% 291M 11s 1250K .......... .......... .......... .......... .......... 10% 275M 11s 1300K .......... .......... .......... .......... .......... 10% 307M 10s 1350K .......... .......... .......... .......... .......... 10% 319M 10s 1400K .......... .......... .......... .......... .......... 11% 635K 10s 1450K .......... .......... .......... .......... .......... 11% 30.5M 10s 1500K .......... .......... .......... 
.......... .......... 11% 607K 10s 1550K .......... .......... .......... .......... .......... 12% 22.7M 10s 1600K .......... .......... .......... .......... .......... 12% 1.31M 10s 1650K .......... .......... .......... .......... .......... 13% 1.05M 10s 1700K .......... .......... .......... .......... .......... 13% 33.8M 9s 1750K .......... .......... .......... .......... .......... 13% 21.8M 9s 1800K .......... .......... .......... .......... .......... 14% 625K 9s 1850K .......... .......... .......... .......... .......... 14% 18.3M 9s 1900K .......... .......... .......... .......... .......... 15% 75.5M 9s 1950K .......... .......... .......... .......... .......... 15% 1.38M 9s 2000K .......... .......... .......... .......... .......... 15% 1.07M 9s 2050K .......... .......... .......... .......... .......... 16% 14.1M 8s 2100K .......... .......... .......... .......... .......... 16% 79.4M 8s 2150K .......... .......... .......... .......... .......... 16% 1.41M 8s 2200K .......... .......... .......... .......... .......... 17% 1.07M 8s 2250K .......... .......... .......... .......... .......... 17% 13.8M 8s 2300K .......... .......... .......... .......... .......... 18% 40.7M 8s 2350K .......... .......... .......... .......... .......... 18% 1.45M 8s 2400K .......... .......... .......... .......... .......... 18% 1.06M 8s 2450K .......... .......... .......... .......... .......... 19% 10.3M 7s 2500K .......... .......... .......... .......... .......... 19% 55.3M 7s 2550K .......... .......... .......... .......... .......... 20% 85.7M 7s 2600K .......... .......... .......... .......... .......... 20% 638K 7s 2650K .......... .......... .......... .......... .......... 20% 10.3M 7s 2700K .......... .......... .......... .......... .......... 21% 48.8M 7s 2750K .......... .......... .......... .......... .......... 21% 62.6M 7s 2800K .......... .......... .......... .......... .......... 21% 639K 7s 2850K .......... .......... .......... .......... .......... 22% 84.0M 7s 2900K .......... .......... .......... .......... .......... 22% 9.73M 7s 2950K .......... .......... .......... .......... .......... 23% 64.7M 6s 3000K .......... .......... .......... .......... .......... 23% 1.54M 6s 3050K .......... .......... .......... .......... .......... 23% 1.04M 6s 3100K .......... .......... .......... .......... .......... 24% 10.2M 6s 3150K .......... .......... .......... .......... .......... 24% 44.8M 6s 3200K .......... .......... .......... .......... .......... 25% 63.1M 6s 3250K .......... .......... .......... .......... .......... 25% 643K 6s 3300K .......... .......... .......... .......... .......... 25% 13.5M 6s 3350K .......... .......... .......... .......... .......... 26% 28.0M 6s 3400K .......... .......... .......... .......... .......... 26% 34.0M 6s 3450K .......... .......... .......... .......... .......... 26% 1.59M 6s 3500K .......... .......... .......... .......... .......... 27% 1.02M 6s 3550K .......... .......... .......... .......... .......... 27% 11.3M 6s 3600K .......... .......... .......... .......... .......... 28% 20.9M 6s 3650K .......... .......... .......... .......... .......... 28% 138M 6s 3700K .......... .......... .......... .......... .......... 28% 643K 6s 3750K .......... .......... .......... .......... .......... 29% 20.7M 6s 3800K .......... .......... .......... .......... .......... 29% 22.4M 5s 3850K .......... .......... .......... .......... .......... 30% 17.7M 5s 3900K .......... .......... .......... 
.......... .......... 30% 1.69M 5s 3950K .......... .......... .......... .......... .......... 30% 1014K 5s 4000K .......... .......... .......... .......... .......... 31% 15.7M 5s 4050K .......... .......... .......... .......... .......... 31% 15.3M 5s 4100K .......... .......... .......... .......... .......... 31% 80.7M 5s 4150K .......... .......... .......... .......... .......... 32% 652K 5s 4200K .......... .......... .......... .......... .......... 32% 28.3M 5s 4250K .......... .......... .......... .......... .......... 33% 15.1M 5s 4300K .......... .......... .......... .......... .......... 33% 15.3M 5s 4350K .......... .......... .......... .......... .......... 33% 120M 5s 4400K .......... .......... .......... .......... .......... 34% 649K 5s 4450K .......... .......... .......... .......... .......... 34% 68.6M 5s 4500K .......... .......... .......... .......... .......... 35% 15.9M 5s 4550K .......... .......... .......... .......... .......... 35% 14.3M 5s 4600K .......... .......... .......... .......... .......... 35% 1.77M 5s 4650K .......... .......... .......... .......... .......... 36% 990K 5s 4700K .......... .......... .......... .......... .......... 36% 63.9M 5s 4750K .......... .......... .......... .......... .......... 36% 18.4M 4s 4800K .......... .......... .......... .......... .......... 37% 15.5M 4s 4850K .......... .......... .......... .......... .......... 37% 1.78M 4s 4900K .......... .......... .......... .......... .......... 38% 972K 4s 4950K .......... .......... .......... .......... .......... 38% 145M 4s 5000K .......... .......... .......... .......... .......... 38% 22.7M 4s 5050K .......... .......... .......... .......... .......... 39% 14.2M 4s 5100K .......... .......... .......... .......... .......... 39% 762K 4s 5150K .......... .......... .......... .......... .......... 40% 3.74M 4s 5200K .......... .......... .......... .......... .......... 40% 183M 4s 5250K .......... .......... .......... .......... .......... 40% 10.4M 4s 5300K .......... .......... .......... .......... .......... 41% 51.2M 4s 5350K .......... .......... .......... .......... .......... 41% 765K 4s 5400K .......... .......... .......... .......... .......... 41% 3.49M 4s 5450K .......... .......... .......... .......... .......... 42% 63.1M 4s 5500K .......... .......... .......... .......... .......... 42% 14.5M 4s 5550K .......... .......... .......... .......... .......... 43% 49.5M 4s 5600K .......... .......... .......... .......... .......... 43% 767K 4s 5650K .......... .......... .......... .......... .......... 43% 3.23M 4s 5700K .......... .......... .......... .......... .......... 44% 123M 4s 5750K .......... .......... .......... .......... .......... 44% 18.4M 4s 5800K .......... .......... .......... .......... .......... 45% 37.9M 4s 5850K .......... .......... .......... .......... .......... 45% 769K 4s 5900K .......... .......... .......... .......... .......... 45% 2.94M 4s 5950K .......... .......... .......... .......... .......... 46% 147M 4s 6000K .......... .......... .......... .......... .......... 46% 44.9M 3s 6050K .......... .......... .......... .......... .......... 47% 27.3M 3s 6100K .......... .......... .......... .......... .......... 47% 773K 3s 6150K .......... .......... .......... .......... .......... 47% 2.88M 3s 6200K .......... .......... .......... .......... .......... 48% 76.7M 3s 6250K .......... .......... .......... .......... .......... 48% 149M 3s 6300K .......... .......... .......... .......... 
[... wget progress dots trimmed (48% → 100%, 1.39M=5.3s) ...]
2016-05-20 12:34:47 (2.41 MB/s) - ‘/tmp/openstack/tempest/cirros-0.3.4-x86_64-disk.img’ saved [13287936/13287936]
+ set +e
+ TESTS=smoke
+ TESTS='smoke dashbboard'
+ TESTS='smoke dashbboard TelemetryAlarming'
+ TESTS='smoke dashbboard TelemetryAlarming api.baremetal.admin.test_drivers'
+ cd /tmp/openstack/tempest
+ tox -eall -- --concurrency=2 smoke dashbboard TelemetryAlarming api.baremetal.admin.test_drivers
Option "verbose" from group "DEFAULT" is deprecated for removal. Its value may be silently ignored in the future.
Option "verbose" from group "DEFAULT" is deprecated for removal. Its value may be silently ignored in the future.
Option "verbose" from group "DEFAULT" is deprecated for removal. Its value may be silently ignored in the future.
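Condensed, the trace above and the exit sequence that resumes below amount to a small wrapper: it builds a space-separated tempest test selection, temporarily tolerates a failing run so the subunit results can still be dumped, and only then propagates the failure. A minimal sketch reconstructed from the trace (not the job's actual script; paths and test names, including the literal 'dashbboard' spelling, are copied verbatim from the output):

    #!/bin/sh -xe
    # Sketch reconstructed from the console trace; assumes a tempest checkout
    # with a tox "all" environment at /tmp/openstack/tempest, as in this job.
    set +e                            # tolerate test failures for now
    TESTS="smoke"
    TESTS="${TESTS} dashbboard"       # spelling exactly as executed by the job
    TESTS="${TESTS} TelemetryAlarming"
    TESTS="${TESTS} api.baremetal.admin.test_drivers"
    cd /tmp/openstack/tempest
    tox -eall -- --concurrency=2 ${TESTS}   # unquoted on purpose: one argument per test
    RESULT=$?
    set -e
    # dump the subunit stream of the last run before propagating the exit code
    /tmp/openstack/tempest/.tox/tempest/bin/testr last --subunit
    exit ${RESULT}

Against a failing run this produces exactly the tail seen next: RESULT=1 is captured, the subunit stream is emitted, and the script exits 1, which triggers the Ansible rescue path that follows.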
+ RESULT=1
+ set -e
+ /tmp/openstack/tempest/.tox/tempest/bin/testr last --subunit
+ exit 1

TASK [puppet-openstack : Failure occurred, executing rescue] *******************
task path: /home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo/playbooks/roles/puppet-openstack/tasks/run.yml:28
Friday 20 May 2016 11:49:31 +0000 (0:26:49.090) 0:28:21.226 ************
included: /home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo/playbooks/roles/common/tasks/rescue.yml for n2.dusty

TASK [puppet-openstack : debug] ************************************************
task path: /home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo/playbooks/roles/common/tasks/rescue.yml:17
Friday 20 May 2016 11:49:32 +0000 (0:00:00.377) 0:28:21.604 ************
ok: [n2.dusty] => {
    "msg": "Failure occured in /home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo/playbooks/roles/puppet-openstack/tasks/run.yml"
}
msg: Failure occured in /home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo/playbooks/roles/puppet-openstack/tasks/run.yml

TASK [puppet-openstack : command] **********************************************
task path: /home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo/playbooks/roles/common/tasks/rescue.yml:20
Friday 20 May 2016 11:49:32 +0000 (0:00:00.298) 0:28:21.903 ************
NOTIFIED HANDLER Ensure global logs are recovered
NOTIFIED HANDLER Ensure puppet-openstack logs are recovered
NOTIFIED HANDLER Ensure ci nodes are released
NOTIFIED HANDLER Fail playbook execution
changed: [n2.dusty] => {"changed": true, "cmd": ["/bin/true"], "delta": "0:00:00.002107", "end": "2016-05-20 12:49:32.807809", "rc": 0, "start": "2016-05-20 12:49:32.805702", "stderr": "", "stdout": "", "stdout_lines": [], "warnings": []}
cmd: /bin/true
start: 2016-05-20 12:49:32.805702
end: 2016-05-20 12:49:32.807809
delta: 0:00:00.002107

RUNNING HANDLER [common : Ensure puppet-openstack logs are recovered] **********
Friday 20 May 2016 11:49:33 +0000 (0:00:00.564) 0:28:22.467 ************
included: /home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo/playbooks/roles/common/tasks/../../puppet-openstack/tasks/logs.yml for n2.dusty

RUNNING HANDLER [common : command] *********************************************
[WARNING]: Failure when attempting to use callback plugin (): cannot concatenate 'str' and 'NoneType' objects
changed: [n2.dusty] => {"changed": true, "cmd": ["./copy_puppet_logs.sh"], "delta": "0:00:04.867382", "end": "2016-05-20 12:49:38.515110", "rc": 0, "start": "2016-05-20 12:49:33.647728", "stderr": "+ set -o errexit\n+ LOG_DIR=/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs\n+ mkdir /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs\n+ for project in '/etc/puppet/modules/*'\n+ '[' -f /etc/puppet/modules/aodh/metadata.json ']'\n+ egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/aodh/metadata.json\n++ basename /etc/puppet/modules/aodh\n+ PROJECTS+='aodh '\n+ for project in '/etc/puppet/modules/*'\n+ '[' -f /etc/puppet/modules/apache/metadata.json ']'\n+ egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/apache/metadata.json\n+ for project in '/etc/puppet/modules/*'\n+ '[' -f /etc/puppet/modules/apt/metadata.json ']'\n+ egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/apt/metadata.json\n+ 
for project in '/etc/puppet/modules/*'\n+ '[' -f /etc/puppet/modules/barbican/metadata.json ']'\n+ egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/barbican/metadata.json\n++ basename /etc/puppet/modules/barbican\n+ PROJECTS+='barbican '\n+ for project in '/etc/puppet/modules/*'\n+ '[' -f /etc/puppet/modules/ceilometer/metadata.json ']'\n+ egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/ceilometer/metadata.json\n++ basename /etc/puppet/modules/ceilometer\n+ PROJECTS+='ceilometer '\n+ for project in '/etc/puppet/modules/*'\n+ '[' -f /etc/puppet/modules/ceph/metadata.json ']'\n+ egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/ceph/metadata.json\n++ basename /etc/puppet/modules/ceph\n+ PROJECTS+='ceph '\n+ for project in '/etc/puppet/modules/*'\n+ '[' -f /etc/puppet/modules/cinder/metadata.json ']'\n+ egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/cinder/metadata.json\n++ basename /etc/puppet/modules/cinder\n+ PROJECTS+='cinder '\n+ for project in '/etc/puppet/modules/*'\n+ '[' -f /etc/puppet/modules/concat/metadata.json ']'\n+ egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/concat/metadata.json\n+ for project in '/etc/puppet/modules/*'\n+ '[' -f /etc/puppet/modules/corosync/metadata.json ']'\n+ egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/corosync/metadata.json\n+ for project in '/etc/puppet/modules/*'\n+ '[' -f /etc/puppet/modules/designate/metadata.json ']'\n+ egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/designate/metadata.json\n++ basename /etc/puppet/modules/designate\n+ PROJECTS+='designate '\n+ for project in '/etc/puppet/modules/*'\n+ '[' -f /etc/puppet/modules/dns/metadata.json ']'\n+ egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/dns/metadata.json\n+ for project in '/etc/puppet/modules/*'\n+ '[' -f /etc/puppet/modules/firewall/metadata.json ']'\n+ egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/firewall/metadata.json\n+ for project in '/etc/puppet/modules/*'\n+ '[' -f /etc/puppet/modules/glance/metadata.json ']'\n+ egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/glance/metadata.json\n++ basename /etc/puppet/modules/glance\n+ PROJECTS+='glance '\n+ for project in '/etc/puppet/modules/*'\n+ '[' -f /etc/puppet/modules/gnocchi/metadata.json ']'\n+ egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/gnocchi/metadata.json\n++ basename /etc/puppet/modules/gnocchi\n+ PROJECTS+='gnocchi '\n+ for project in '/etc/puppet/modules/*'\n+ '[' -f /etc/puppet/modules/heat/metadata.json ']'\n+ egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/heat/metadata.json\n++ basename /etc/puppet/modules/heat\n+ PROJECTS+='heat '\n+ for project in '/etc/puppet/modules/*'\n+ '[' -f /etc/puppet/modules/horizon/metadata.json ']'\n+ egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/horizon/metadata.json\n++ basename /etc/puppet/modules/horizon\n+ PROJECTS+='horizon '\n+ for project in '/etc/puppet/modules/*'\n+ '[' -f /etc/puppet/modules/inifile/metadata.json ']'\n+ egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/inifile/metadata.json\n+ for project in '/etc/puppet/modules/*'\n+ '[' -f /etc/puppet/modules/ironic/metadata.json ']'\n+ egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/ironic/metadata.json\n++ basename /etc/puppet/modules/ironic\n+ PROJECTS+='ironic 
'\n+ for project in '/etc/puppet/modules/*'\n+ '[' -f /etc/puppet/modules/keystone/metadata.json ']'\n+ egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/keystone/metadata.json\n++ basename /etc/puppet/modules/keystone\n+ PROJECTS+='keystone '\n+ for project in '/etc/puppet/modules/*'\n+ '[' -f /etc/puppet/modules/manila/metadata.json ']'\n+ egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/manila/metadata.json\n++ basename /etc/puppet/modules/manila\n+ PROJECTS+='manila '\n+ for project in '/etc/puppet/modules/*'\n+ '[' -f /etc/puppet/modules/memcached/metadata.json ']'\n+ egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/memcached/metadata.json\n+ for project in '/etc/puppet/modules/*'\n+ '[' -f /etc/puppet/modules/mistral/metadata.json ']'\n+ egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/mistral/metadata.json\n++ basename /etc/puppet/modules/mistral\n+ PROJECTS+='mistral '\n+ for project in '/etc/puppet/modules/*'\n+ '[' -f /etc/puppet/modules/monasca/metadata.json ']'\n+ egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/monasca/metadata.json\n++ basename /etc/puppet/modules/monasca\n+ PROJECTS+='monasca '\n+ for project in '/etc/puppet/modules/*'\n+ '[' -f /etc/puppet/modules/mongodb/metadata.json ']'\n+ egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/mongodb/metadata.json\n+ for project in '/etc/puppet/modules/*'\n+ '[' -f /etc/puppet/modules/murano/metadata.json ']'\n+ egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/murano/metadata.json\n++ basename /etc/puppet/modules/murano\n+ PROJECTS+='murano '\n+ for project in '/etc/puppet/modules/*'\n+ '[' -f /etc/puppet/modules/mysql/metadata.json ']'\n+ egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/mysql/metadata.json\n+ for project in '/etc/puppet/modules/*'\n+ '[' -f /etc/puppet/modules/neutron/metadata.json ']'\n+ egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/neutron/metadata.json\n++ basename /etc/puppet/modules/neutron\n+ PROJECTS+='neutron '\n+ for project in '/etc/puppet/modules/*'\n+ '[' -f /etc/puppet/modules/nova/metadata.json ']'\n+ egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/nova/metadata.json\n++ basename /etc/puppet/modules/nova\n+ PROJECTS+='nova '\n+ for project in '/etc/puppet/modules/*'\n+ '[' -f /etc/puppet/modules/octavia/metadata.json ']'\n+ egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/octavia/metadata.json\n++ basename /etc/puppet/modules/octavia\n+ PROJECTS+='octavia '\n+ for project in '/etc/puppet/modules/*'\n+ '[' -f /etc/puppet/modules/openstack_extras/metadata.json ']'\n+ egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/openstack_extras/metadata.json\n++ basename /etc/puppet/modules/openstack_extras\n+ PROJECTS+='openstack_extras '\n+ for project in '/etc/puppet/modules/*'\n+ '[' -f /etc/puppet/modules/openstack_integration/metadata.json ']'\n+ for project in '/etc/puppet/modules/*'\n+ '[' -f /etc/puppet/modules/openstacklib/metadata.json ']'\n+ egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/openstacklib/metadata.json\n++ basename /etc/puppet/modules/openstacklib\n+ PROJECTS+='openstacklib '\n+ for project in '/etc/puppet/modules/*'\n+ '[' -f /etc/puppet/modules/oslo/metadata.json ']'\n+ egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/oslo/metadata.json\n++ basename 
/etc/puppet/modules/oslo\n+ PROJECTS+='oslo '\n+ for project in '/etc/puppet/modules/*'\n+ '[' -f /etc/puppet/modules/postgresql/metadata.json ']'\n+ egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/postgresql/metadata.json\n+ for project in '/etc/puppet/modules/*'\n+ '[' -f /etc/puppet/modules/powerdns/metadata.json ']'\n+ egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/powerdns/metadata.json\n+ for project in '/etc/puppet/modules/*'\n+ '[' -f /etc/puppet/modules/python/metadata.json ']'\n+ egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/python/metadata.json\n+ for project in '/etc/puppet/modules/*'\n+ '[' -f /etc/puppet/modules/qpid/metadata.json ']'\n+ for project in '/etc/puppet/modules/*'\n+ '[' -f /etc/puppet/modules/rabbitmq/metadata.json ']'\n+ egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/rabbitmq/metadata.json\n+ for project in '/etc/puppet/modules/*'\n+ '[' -f /etc/puppet/modules/rsync/metadata.json ']'\n+ egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/rsync/metadata.json\n+ for project in '/etc/puppet/modules/*'\n+ '[' -f /etc/puppet/modules/sahara/metadata.json ']'\n+ egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/sahara/metadata.json\n++ basename /etc/puppet/modules/sahara\n+ PROJECTS+='sahara '\n+ for project in '/etc/puppet/modules/*'\n+ '[' -f /etc/puppet/modules/staging/metadata.json ']'\n+ egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/staging/metadata.json\n+ for project in '/etc/puppet/modules/*'\n+ '[' -f /etc/puppet/modules/stdlib/metadata.json ']'\n+ egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/stdlib/metadata.json\n+ for project in '/etc/puppet/modules/*'\n+ '[' -f /etc/puppet/modules/swift/metadata.json ']'\n+ egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/swift/metadata.json\n++ basename /etc/puppet/modules/swift\n+ PROJECTS+='swift '\n+ for project in '/etc/puppet/modules/*'\n+ '[' -f /etc/puppet/modules/sysctl/metadata.json ']'\n+ egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/sysctl/metadata.json\n+ for project in '/etc/puppet/modules/*'\n+ '[' -f /etc/puppet/modules/tempest/metadata.json ']'\n+ egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/tempest/metadata.json\n++ basename /etc/puppet/modules/tempest\n+ PROJECTS+='tempest '\n+ for project in '/etc/puppet/modules/*'\n+ '[' -f /etc/puppet/modules/trove/metadata.json ']'\n+ egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/trove/metadata.json\n++ basename /etc/puppet/modules/trove\n+ PROJECTS+='trove '\n+ for project in '/etc/puppet/modules/*'\n+ '[' -f /etc/puppet/modules/vcsrepo/metadata.json ']'\n+ egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/vcsrepo/metadata.json\n+ for project in '/etc/puppet/modules/*'\n+ '[' -f /etc/puppet/modules/vswitch/metadata.json ']'\n+ egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/vswitch/metadata.json\n++ basename /etc/puppet/modules/vswitch\n+ PROJECTS+='vswitch '\n+ for project in '/etc/puppet/modules/*'\n+ '[' -f /etc/puppet/modules/xinetd/metadata.json ']'\n+ egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/xinetd/metadata.json\n+ for project in '/etc/puppet/modules/*'\n+ '[' -f /etc/puppet/modules/zaqar/metadata.json ']'\n+ egrep -q 'github.com/(stackforge|openstack)/puppet' 
/etc/puppet/modules/zaqar/metadata.json\n++ basename /etc/puppet/modules/zaqar\n+ PROJECTS+='zaqar '\n+ mkdir /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/\n+ for p in '$PROJECTS'\n+ '[' -d /etc/aodh ']'\n+ '[' -d /var/log/aodh ']'\n+ for p in '$PROJECTS'\n+ '[' -d /etc/barbican ']'\n+ '[' -d /var/log/barbican ']'\n+ for p in '$PROJECTS'\n+ '[' -d /etc/ceilometer ']'\n+ '[' -d /var/log/ceilometer ']'\n+ for p in '$PROJECTS'\n+ '[' -d /etc/ceph ']'\n+ '[' -d /var/log/ceph ']'\n+ for p in '$PROJECTS'\n+ '[' -d /etc/cinder ']'\n+ sudo cp -r /etc/cinder /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/\n+ '[' -d /var/log/cinder ']'\n+ sudo cp -r /var/log/cinder /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs\n+ for p in '$PROJECTS'\n+ '[' -d /etc/designate ']'\n+ '[' -d /var/log/designate ']'\n+ for p in '$PROJECTS'\n+ '[' -d /etc/glance ']'\n+ sudo cp -r /etc/glance /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/\n+ '[' -d /var/log/glance ']'\n+ sudo cp -r /var/log/glance /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs\n+ for p in '$PROJECTS'\n+ '[' -d /etc/gnocchi ']'\n+ '[' -d /var/log/gnocchi ']'\n+ for p in '$PROJECTS'\n+ '[' -d /etc/heat ']'\n+ '[' -d /var/log/heat ']'\n+ for p in '$PROJECTS'\n+ '[' -d /etc/horizon ']'\n+ '[' -d /var/log/horizon ']'\n+ for p in '$PROJECTS'\n+ '[' -d /etc/ironic ']'\n+ sudo cp -r /etc/ironic /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/\n+ '[' -d /var/log/ironic ']'\n+ sudo cp -r /var/log/ironic /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs\n+ for p in '$PROJECTS'\n+ '[' -d /etc/keystone ']'\n+ sudo cp -r /etc/keystone /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/\n+ '[' -d /var/log/keystone ']'\n+ sudo cp -r /var/log/keystone /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs\n+ for p in '$PROJECTS'\n+ '[' -d /etc/manila ']'\n+ '[' -d /var/log/manila ']'\n+ for p in '$PROJECTS'\n+ '[' -d /etc/mistral ']'\n+ '[' -d /var/log/mistral ']'\n+ for p in '$PROJECTS'\n+ '[' -d /etc/monasca ']'\n+ '[' -d /var/log/monasca ']'\n+ for p in '$PROJECTS'\n+ '[' -d /etc/murano ']'\n+ '[' -d /var/log/murano ']'\n+ for p in '$PROJECTS'\n+ '[' -d /etc/neutron ']'\n+ sudo cp -r /etc/neutron /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/\n+ '[' -d /var/log/neutron ']'\n+ sudo cp -r /var/log/neutron /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs\n+ for p in '$PROJECTS'\n+ '[' -d /etc/nova ']'\n+ sudo cp -r /etc/nova /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/\n+ '[' -d /var/log/nova ']'\n+ sudo cp -r /var/log/nova /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs\n+ for p in '$PROJECTS'\n+ '[' -d /etc/octavia ']'\n+ '[' -d /var/log/octavia ']'\n+ for p in '$PROJECTS'\n+ '[' -d /etc/openstack_extras ']'\n+ '[' -d /var/log/openstack_extras ']'\n+ for p in '$PROJECTS'\n+ '[' -d /etc/openstacklib ']'\n+ '[' -d /var/log/openstacklib ']'\n+ for p in '$PROJECTS'\n+ '[' -d /etc/oslo ']'\n+ '[' -d /var/log/oslo ']'\n+ for p in '$PROJECTS'\n+ '[' -d /etc/sahara ']'\n+ '[' 
-d /var/log/sahara ']'\n+ for p in '$PROJECTS'\n+ '[' -d /etc/swift ']'\n+ sudo cp -r /etc/swift /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/\n+ '[' -d /var/log/swift ']'\n+ sudo cp -r /var/log/swift /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs\n+ for p in '$PROJECTS'\n+ '[' -d /etc/tempest ']'\n+ '[' -d /var/log/tempest ']'\n+ for p in '$PROJECTS'\n+ '[' -d /etc/trove ']'\n+ '[' -d /var/log/trove ']'\n+ for p in '$PROJECTS'\n+ '[' -d /etc/vswitch ']'\n+ '[' -d /var/log/vswitch ']'\n+ for p in '$PROJECTS'\n+ '[' -d /etc/zaqar ']'\n+ '[' -d /var/log/zaqar ']'\n+ uses_debs\n+ type apt-get\n+ is_fedora\n+ lsb_release -i\n+ grep -iq fedora\n+ lsb_release -i\n+ grep -iq CentOS\n+ sudo journalctl --no-pager\n+ '[' -d /var/log/rabbitmq ']'\n+ sudo cp -r /var/log/rabbitmq /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs\n+ '[' -d /var/log/postgresql ']'\n+ '[' -f /var/log/mysql.err ']'\n+ '[' -f /var/log/mysql.log ']'\n+ '[' -f /tmp/openstack/tempest/tempest.log ']'\n+ sudo cp /tmp/openstack/tempest/tempest.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/\n+ '[' -f /tmp/openstack/tempest/testrepository.subunit ']'\n+ sudo cp /tmp/openstack/tempest/testrepository.subunit /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/testrepository.subunit\n+ '[' -f /var/log/dstat.log ']'\n+ sudo cp /var/log/dstat.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/\n+ '[' -d /var/log/libvirt ']'\n+ sudo cp -r /var/log/libvirt /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/\n+ '[' -d /var/log/openvswitch ']'\n+ sudo cp -r /var/log/openvswitch /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/\n+ sudo cp -r /etc/sudoers.d /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/\n+ sudo cp /etc/sudoers /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/sudoers.txt\n+ uses_debs\n+ type apt-get\n+ is_fedora\n+ lsb_release -i\n+ grep -iq fedora\n+ lsb_release -i\n+ grep -iq CentOS\n+ apache_logs=/var/log/httpd\n+ '[' -d /etc/httpd/conf.d ']'\n+ mkdir /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache_config\n+ sudo cp /etc/httpd/conf.d/10-ironic_wsgi.conf /etc/httpd/conf.d/10-keystone_wsgi_admin.conf /etc/httpd/conf.d/10-keystone_wsgi_main.conf /etc/httpd/conf.d/10-nova_api_wsgi.conf /etc/httpd/conf.d/15-default.conf /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache_config\n++ ls /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache_config\n+ for f in '`ls $LOG_DIR/apache_config`'\n+ mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache_config/10-ironic_wsgi.conf /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache_config/10-ironic_wsgi.conf.txt\n+ for f in '`ls $LOG_DIR/apache_config`'\n+ mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache_config/10-keystone_wsgi_admin.conf 
/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache_config/10-keystone_wsgi_admin.conf.txt\n+ for f in '`ls $LOG_DIR/apache_config`'\n+ mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache_config/10-keystone_wsgi_main.conf /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache_config/10-keystone_wsgi_main.conf.txt\n+ for f in '`ls $LOG_DIR/apache_config`'\n+ mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache_config/10-nova_api_wsgi.conf /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache_config/10-nova_api_wsgi.conf.txt\n+ for f in '`ls $LOG_DIR/apache_config`'\n+ mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache_config/15-default.conf /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache_config/15-default.conf.txt\n+ '[' -d /var/log/httpd ']'\n+ sudo cp -r /var/log/httpd /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache\n+ '[' -f /var/log/audit/audit.log ']'\n+ sudo cp /var/log/audit/audit.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/audit.log.txt\n+ '[' -d /tmp/openstack/tempest ']'\n+ sudo cp /tmp/openstack/tempest/etc/tempest.conf /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/tempest.conf.txt\n+ '[' -d /etc/openstack-dashboard ']'\n++ command -v dpkg\n+ '[' ']'\n++ command -v rpm\n+ '[' /usr/bin/rpm ']'\n+ rpm -qa\n+ df -h\n+ free -m\n+ cat /proc/cpuinfo\n+ ps -eo user,pid,ppid,lwp,%cpu,%mem,size,rss,cmd\n+ sudo find /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs -type d -execdir sudo chmod 755 '{}' ';'\n+ sudo find /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs -type f -execdir sudo chmod 644 '{}' ';'\n+ sudo find /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs -type l -execdir sudo rm -f '{}' ';'\n++ find /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs -name '*.log'\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/cinder/cinder-manage.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/cinder/cinder-manage.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/cinder/api.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/cinder/api.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/cinder/volume.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/cinder/volume.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/cinder/scheduler.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/cinder/scheduler.txt\n+ for f in '$(find $LOG_DIR -name 
\"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/glance/registry.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/glance/registry.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/glance/api.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/glance/api.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/ironic/ironic-dbsync.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/ironic/ironic-dbsync.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/ironic/ironic-conductor.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/ironic/ironic-conductor.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/ironic/app.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/ironic/app.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/keystone/keystone.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/keystone/keystone.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/openvswitch-agent.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/openvswitch-agent.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/metering-agent.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/metering-agent.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/dhcp-agent.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/dhcp-agent.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/lbaas-agent.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/lbaas-agent.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/l3-agent.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/l3-agent.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/metadata-agent.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/metadata-agent.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv 
/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/server.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/server.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-4e0cd0d8-1601-47d0-9820-d7e73f9bf3df.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-4e0cd0d8-1601-47d0-9820-d7e73f9bf3df.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-626609dd-d1cb-4f30-abe3-bde4810fa59b.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-626609dd-d1cb-4f30-abe3-bde4810fa59b.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-dc80ad39-b9a7-41bf-a792-cf8721e7ba5e.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-dc80ad39-b9a7-41bf-a792-cf8721e7ba5e.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-04c5d21a-4ce9-48bc-9ebd-760d57d6f181.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-04c5d21a-4ce9-48bc-9ebd-760d57d6f181.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-d5cd80b6-00c9-4002-b7c1-3f0e89fe6b22.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-d5cd80b6-00c9-4002-b7c1-3f0e89fe6b22.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-0482c0a4-c3ff-4e4f-b9d2-1840f91820f3.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-0482c0a4-c3ff-4e4f-b9d2-1840f91820f3.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-2ceb5c9c-90bd-4773-a918-66aa6dbcfa4d.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-2ceb5c9c-90bd-4773-a918-66aa6dbcfa4d.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-3444e121-20ca-4d43-bc85-250291b85adf.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-3444e121-20ca-4d43-bc85-250291b85adf.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv 
/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-507c5b71-cfff-4102-9946-28e8e2327626.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-507c5b71-cfff-4102-9946-28e8e2327626.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-61fa4559-4a33-4c9c-9aae-787631907b72.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-61fa4559-4a33-4c9c-9aae-787631907b72.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-c9135f9e-e3b5-45bc-a377-09e80f3e46f9.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-c9135f9e-e3b5-45bc-a377-09e80f3e46f9.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-241bdf5e-518e-4b0e-a2de-3c66ae07ff1a.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-241bdf5e-518e-4b0e-a2de-3c66ae07ff1a.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-83e52736-3b87-4753-a58a-223b725f50a3.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-83e52736-3b87-4753-a58a-223b725f50a3.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-52f14bea-16c9-450f-b2eb-c38128712134.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-52f14bea-16c9-450f-b2eb-c38128712134.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-63a1d182-dee3-4268-bd18-bf7fbce7c030.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-63a1d182-dee3-4268-bd18-bf7fbce7c030.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-c1ad332d-7462-4167-a25a-32bef965a872.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-c1ad332d-7462-4167-a25a-32bef965a872.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-c5966564-d7eb-4943-8b2c-2971587bb84e.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-c5966564-d7eb-4943-8b2c-2971587bb84e.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv 
/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-ef0e2587-c553-4913-8c7c-e6cfd6fea623.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-ef0e2587-c553-4913-8c7c-e6cfd6fea623.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-43f26813-0feb-402a-bc02-f6a443073330.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-43f26813-0feb-402a-bc02-f6a443073330.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-65c24824-1749-4f9b-a3ce-825f3d91070e.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-65c24824-1749-4f9b-a3ce-825f3d91070e.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-7483ed8c-9a86-48c8-8f2f-c86314a8fd89.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-7483ed8c-9a86-48c8-8f2f-c86314a8fd89.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-bb5ad119-e2c2-487c-a81b-486691e2d2c3.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-bb5ad119-e2c2-487c-a81b-486691e2d2c3.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-abb4fa75-9747-4a73-9e2d-92f30aea7bd8.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-abb4fa75-9747-4a73-9e2d-92f30aea7bd8.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-8f917175-7007-4c3b-9c9b-d18448bd1139.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-8f917175-7007-4c3b-9c9b-d18448bd1139.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-e8957ee5-60b7-4a76-b725-66b6fd770c16.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-e8957ee5-60b7-4a76-b725-66b6fd770c16.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-57f1109f-c393-4fb6-8c44-51d0cfbab6a0.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-57f1109f-c393-4fb6-8c44-51d0cfbab6a0.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv 
/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-cae8fa4f-49b1-45ae-b151-2721639d9ee6.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-cae8fa4f-49b1-45ae-b151-2721639d9ee6.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-ade8d50d-fea6-41a4-b385-35625210cb2a.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-ade8d50d-fea6-41a4-b385-35625210cb2a.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-f906b905-159d-4308-bf7e-1eafb8888246.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-f906b905-159d-4308-bf7e-1eafb8888246.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-ba272c2c-1ea6-4c14-9b5c-78cae90d82b7.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-ba272c2c-1ea6-4c14-9b5c-78cae90d82b7.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-ebe9ce7b-1d0c-4e2a-9545-f15e4c4a7e5e.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-ebe9ce7b-1d0c-4e2a-9545-f15e4c4a7e5e.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-7e360df8-ef95-43f0-b078-00a48543d918.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-7e360df8-ef95-43f0-b078-00a48543d918.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-c764cc0f-af8c-4138-a98b-3bc8c2ebd200.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-c764cc0f-af8c-4138-a98b-3bc8c2ebd200.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-f986b546-cdf7-4c8f-8fab-65ed694fddf4.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-f986b546-cdf7-4c8f-8fab-65ed694fddf4.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/nova/nova-manage.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/nova/nova-manage.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv 
/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/nova/nova-novncproxy.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/nova/nova-novncproxy.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/nova/nova-consoleauth.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/nova/nova-consoleauth.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/nova/nova-scheduler.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/nova/nova-scheduler.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/nova/nova-conductor.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/nova/nova-conductor.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/nova/nova-compute.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/nova/nova-compute.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/nova/nova-api.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/nova/nova-api.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/rabbitmq/rabbit@n2.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/rabbitmq/rabbit@n2.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/rabbitmq/rabbit@n2-sasl.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/rabbitmq/rabbit@n2-sasl.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/tempest.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/tempest.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/dstat.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/dstat.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-00000001.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-00000001.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-00000002.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-00000002.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv 
/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-00000003.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-00000003.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-00000004.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-00000004.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-00000005.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-00000005.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-00000006.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-00000006.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-00000007.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-00000007.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-00000008.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-00000008.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-00000009.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-00000009.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-0000000a.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-0000000a.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-0000000b.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-0000000b.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-0000000c.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-0000000c.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-0000000d.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-0000000d.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv 
/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-0000000e.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-0000000e.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-0000000f.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-0000000f.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/openvswitch/ovsdb-server.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/openvswitch/ovsdb-server.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/openvswitch/ovs-vswitchd.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/openvswitch/ovs-vswitchd.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache/default_error.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache/default_error.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache/nova_api_wsgi_error_ssl.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache/nova_api_wsgi_error_ssl.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache/keystone_wsgi_main_error_ssl.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache/keystone_wsgi_main_error_ssl.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache/keystone_wsgi_admin_error_ssl.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache/keystone_wsgi_admin_error_ssl.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache/ironic_wsgi_error_ssl.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache/ironic_wsgi_error_ssl.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache/nova_api_wsgi_access_ssl.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache/nova_api_wsgi_access_ssl.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache/keystone_wsgi_main_access_ssl.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache/keystone_wsgi_main_access_ssl.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv 
/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache/keystone_wsgi_admin_access_ssl.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache/keystone_wsgi_admin_access_ssl.txt\n+ for f in '$(find $LOG_DIR -name \"*.log\")'\n+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache/ironic_wsgi_access_ssl.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache/ironic_wsgi_access_ssl.txt\n+ find /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/sudoers.d /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc -type f -exec sudo mv '{}' '{}.txt' ';'\n+ '[' -f /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/rabbitmq ']'\n+ sudo find /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs -iname '*.txt' -type f -execdir gzip -9 '{}' +\n+ sudo find /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs -iname '*.dat' -type f -execdir gzip -9 '{}' +\n+ sudo find /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs -iname '*.conf' -type f -execdir gzip -9 '{}' +", "stdout": "", "stdout_lines": [], "warnings": []} cmd: ./copy_puppet_logs.sh start: 2016-05-20 12:49:33.647728 end: 2016-05-20 12:49:38.515110 delta: 0:00:04.867382 stderr: + set -o errexit + LOG_DIR=/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs + mkdir /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs + for project in '/etc/puppet/modules/*' + '[' -f /etc/puppet/modules/aodh/metadata.json ']' + egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/aodh/metadata.json ++ basename /etc/puppet/modules/aodh + PROJECTS+='aodh ' + for project in '/etc/puppet/modules/*' + '[' -f /etc/puppet/modules/apache/metadata.json ']' + egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/apache/metadata.json + for project in '/etc/puppet/modules/*' + '[' -f /etc/puppet/modules/apt/metadata.json ']' + egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/apt/metadata.json + for project in '/etc/puppet/modules/*' + '[' -f /etc/puppet/modules/barbican/metadata.json ']' + egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/barbican/metadata.json ++ basename /etc/puppet/modules/barbican + PROJECTS+='barbican ' + for project in '/etc/puppet/modules/*' + '[' -f /etc/puppet/modules/ceilometer/metadata.json ']' + egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/ceilometer/metadata.json ++ basename /etc/puppet/modules/ceilometer + PROJECTS+='ceilometer ' + for project in '/etc/puppet/modules/*' + '[' -f /etc/puppet/modules/ceph/metadata.json ']' + egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/ceph/metadata.json ++ basename /etc/puppet/modules/ceph + PROJECTS+='ceph ' + for project in '/etc/puppet/modules/*' + '[' -f /etc/puppet/modules/cinder/metadata.json ']' + egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/cinder/metadata.json ++ basename /etc/puppet/modules/cinder + PROJECTS+='cinder ' + for project in '/etc/puppet/modules/*' + '[' -f /etc/puppet/modules/concat/metadata.json ']' + egrep -q 
'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/concat/metadata.json + for project in '/etc/puppet/modules/*' + '[' -f /etc/puppet/modules/corosync/metadata.json ']' + egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/corosync/metadata.json + for project in '/etc/puppet/modules/*' + '[' -f /etc/puppet/modules/designate/metadata.json ']' + egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/designate/metadata.json ++ basename /etc/puppet/modules/designate + PROJECTS+='designate ' + for project in '/etc/puppet/modules/*' + '[' -f /etc/puppet/modules/dns/metadata.json ']' + egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/dns/metadata.json + for project in '/etc/puppet/modules/*' + '[' -f /etc/puppet/modules/firewall/metadata.json ']' + egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/firewall/metadata.json + for project in '/etc/puppet/modules/*' + '[' -f /etc/puppet/modules/glance/metadata.json ']' + egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/glance/metadata.json ++ basename /etc/puppet/modules/glance + PROJECTS+='glance ' + for project in '/etc/puppet/modules/*' + '[' -f /etc/puppet/modules/gnocchi/metadata.json ']' + egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/gnocchi/metadata.json ++ basename /etc/puppet/modules/gnocchi + PROJECTS+='gnocchi ' + for project in '/etc/puppet/modules/*' + '[' -f /etc/puppet/modules/heat/metadata.json ']' + egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/heat/metadata.json ++ basename /etc/puppet/modules/heat + PROJECTS+='heat ' + for project in '/etc/puppet/modules/*' + '[' -f /etc/puppet/modules/horizon/metadata.json ']' + egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/horizon/metadata.json ++ basename /etc/puppet/modules/horizon + PROJECTS+='horizon ' + for project in '/etc/puppet/modules/*' + '[' -f /etc/puppet/modules/inifile/metadata.json ']' + egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/inifile/metadata.json + for project in '/etc/puppet/modules/*' + '[' -f /etc/puppet/modules/ironic/metadata.json ']' + egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/ironic/metadata.json ++ basename /etc/puppet/modules/ironic + PROJECTS+='ironic ' + for project in '/etc/puppet/modules/*' + '[' -f /etc/puppet/modules/keystone/metadata.json ']' + egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/keystone/metadata.json ++ basename /etc/puppet/modules/keystone + PROJECTS+='keystone ' + for project in '/etc/puppet/modules/*' + '[' -f /etc/puppet/modules/manila/metadata.json ']' + egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/manila/metadata.json ++ basename /etc/puppet/modules/manila + PROJECTS+='manila ' + for project in '/etc/puppet/modules/*' + '[' -f /etc/puppet/modules/memcached/metadata.json ']' + egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/memcached/metadata.json + for project in '/etc/puppet/modules/*' + '[' -f /etc/puppet/modules/mistral/metadata.json ']' + egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/mistral/metadata.json ++ basename /etc/puppet/modules/mistral + PROJECTS+='mistral ' + for project in '/etc/puppet/modules/*' + '[' -f /etc/puppet/modules/monasca/metadata.json ']' + egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/monasca/metadata.json ++ basename 
/etc/puppet/modules/monasca + PROJECTS+='monasca ' + for project in '/etc/puppet/modules/*' + '[' -f /etc/puppet/modules/mongodb/metadata.json ']' + egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/mongodb/metadata.json + for project in '/etc/puppet/modules/*' + '[' -f /etc/puppet/modules/murano/metadata.json ']' + egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/murano/metadata.json ++ basename /etc/puppet/modules/murano + PROJECTS+='murano ' + for project in '/etc/puppet/modules/*' + '[' -f /etc/puppet/modules/mysql/metadata.json ']' + egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/mysql/metadata.json + for project in '/etc/puppet/modules/*' + '[' -f /etc/puppet/modules/neutron/metadata.json ']' + egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/neutron/metadata.json ++ basename /etc/puppet/modules/neutron + PROJECTS+='neutron ' + for project in '/etc/puppet/modules/*' + '[' -f /etc/puppet/modules/nova/metadata.json ']' + egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/nova/metadata.json ++ basename /etc/puppet/modules/nova + PROJECTS+='nova ' + for project in '/etc/puppet/modules/*' + '[' -f /etc/puppet/modules/octavia/metadata.json ']' + egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/octavia/metadata.json ++ basename /etc/puppet/modules/octavia + PROJECTS+='octavia ' + for project in '/etc/puppet/modules/*' + '[' -f /etc/puppet/modules/openstack_extras/metadata.json ']' + egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/openstack_extras/metadata.json ++ basename /etc/puppet/modules/openstack_extras + PROJECTS+='openstack_extras ' + for project in '/etc/puppet/modules/*' + '[' -f /etc/puppet/modules/openstack_integration/metadata.json ']' + for project in '/etc/puppet/modules/*' + '[' -f /etc/puppet/modules/openstacklib/metadata.json ']' + egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/openstacklib/metadata.json ++ basename /etc/puppet/modules/openstacklib + PROJECTS+='openstacklib ' + for project in '/etc/puppet/modules/*' + '[' -f /etc/puppet/modules/oslo/metadata.json ']' + egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/oslo/metadata.json ++ basename /etc/puppet/modules/oslo + PROJECTS+='oslo ' + for project in '/etc/puppet/modules/*' + '[' -f /etc/puppet/modules/postgresql/metadata.json ']' + egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/postgresql/metadata.json + for project in '/etc/puppet/modules/*' + '[' -f /etc/puppet/modules/powerdns/metadata.json ']' + egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/powerdns/metadata.json + for project in '/etc/puppet/modules/*' + '[' -f /etc/puppet/modules/python/metadata.json ']' + egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/python/metadata.json + for project in '/etc/puppet/modules/*' + '[' -f /etc/puppet/modules/qpid/metadata.json ']' + for project in '/etc/puppet/modules/*' + '[' -f /etc/puppet/modules/rabbitmq/metadata.json ']' + egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/rabbitmq/metadata.json + for project in '/etc/puppet/modules/*' + '[' -f /etc/puppet/modules/rsync/metadata.json ']' + egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/rsync/metadata.json + for project in '/etc/puppet/modules/*' + '[' -f /etc/puppet/modules/sahara/metadata.json ']' + egrep -q 
'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/sahara/metadata.json ++ basename /etc/puppet/modules/sahara + PROJECTS+='sahara ' + for project in '/etc/puppet/modules/*' + '[' -f /etc/puppet/modules/staging/metadata.json ']' + egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/staging/metadata.json + for project in '/etc/puppet/modules/*' + '[' -f /etc/puppet/modules/stdlib/metadata.json ']' + egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/stdlib/metadata.json + for project in '/etc/puppet/modules/*' + '[' -f /etc/puppet/modules/swift/metadata.json ']' + egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/swift/metadata.json ++ basename /etc/puppet/modules/swift + PROJECTS+='swift ' + for project in '/etc/puppet/modules/*' + '[' -f /etc/puppet/modules/sysctl/metadata.json ']' + egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/sysctl/metadata.json + for project in '/etc/puppet/modules/*' + '[' -f /etc/puppet/modules/tempest/metadata.json ']' + egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/tempest/metadata.json ++ basename /etc/puppet/modules/tempest + PROJECTS+='tempest ' + for project in '/etc/puppet/modules/*' + '[' -f /etc/puppet/modules/trove/metadata.json ']' + egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/trove/metadata.json ++ basename /etc/puppet/modules/trove + PROJECTS+='trove ' + for project in '/etc/puppet/modules/*' + '[' -f /etc/puppet/modules/vcsrepo/metadata.json ']' + egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/vcsrepo/metadata.json + for project in '/etc/puppet/modules/*' + '[' -f /etc/puppet/modules/vswitch/metadata.json ']' + egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/vswitch/metadata.json ++ basename /etc/puppet/modules/vswitch + PROJECTS+='vswitch ' + for project in '/etc/puppet/modules/*' + '[' -f /etc/puppet/modules/xinetd/metadata.json ']' + egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/xinetd/metadata.json + for project in '/etc/puppet/modules/*' + '[' -f /etc/puppet/modules/zaqar/metadata.json ']' + egrep -q 'github.com/(stackforge|openstack)/puppet' /etc/puppet/modules/zaqar/metadata.json ++ basename /etc/puppet/modules/zaqar + PROJECTS+='zaqar ' + mkdir /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/ + for p in '$PROJECTS' + '[' -d /etc/aodh ']' + '[' -d /var/log/aodh ']' + for p in '$PROJECTS' + '[' -d /etc/barbican ']' + '[' -d /var/log/barbican ']' + for p in '$PROJECTS' + '[' -d /etc/ceilometer ']' + '[' -d /var/log/ceilometer ']' + for p in '$PROJECTS' + '[' -d /etc/ceph ']' + '[' -d /var/log/ceph ']' + for p in '$PROJECTS' + '[' -d /etc/cinder ']' + sudo cp -r /etc/cinder /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/ + '[' -d /var/log/cinder ']' + sudo cp -r /var/log/cinder /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs + for p in '$PROJECTS' + '[' -d /etc/designate ']' + '[' -d /var/log/designate ']' + for p in '$PROJECTS' + '[' -d /etc/glance ']' + sudo cp -r /etc/glance /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/ + '[' -d /var/log/glance ']' + sudo cp -r /var/log/glance /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs + for p in '$PROJECTS' + '[' -d /etc/gnocchi ']' + '[' -d 
/var/log/gnocchi ']' + for p in '$PROJECTS' + '[' -d /etc/heat ']' + '[' -d /var/log/heat ']' + for p in '$PROJECTS' + '[' -d /etc/horizon ']' + '[' -d /var/log/horizon ']' + for p in '$PROJECTS' + '[' -d /etc/ironic ']' + sudo cp -r /etc/ironic /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/ + '[' -d /var/log/ironic ']' + sudo cp -r /var/log/ironic /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs + for p in '$PROJECTS' + '[' -d /etc/keystone ']' + sudo cp -r /etc/keystone /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/ + '[' -d /var/log/keystone ']' + sudo cp -r /var/log/keystone /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs + for p in '$PROJECTS' + '[' -d /etc/manila ']' + '[' -d /var/log/manila ']' + for p in '$PROJECTS' + '[' -d /etc/mistral ']' + '[' -d /var/log/mistral ']' + for p in '$PROJECTS' + '[' -d /etc/monasca ']' + '[' -d /var/log/monasca ']' + for p in '$PROJECTS' + '[' -d /etc/murano ']' + '[' -d /var/log/murano ']' + for p in '$PROJECTS' + '[' -d /etc/neutron ']' + sudo cp -r /etc/neutron /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/ + '[' -d /var/log/neutron ']' + sudo cp -r /var/log/neutron /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs + for p in '$PROJECTS' + '[' -d /etc/nova ']' + sudo cp -r /etc/nova /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/ + '[' -d /var/log/nova ']' + sudo cp -r /var/log/nova /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs + for p in '$PROJECTS' + '[' -d /etc/octavia ']' + '[' -d /var/log/octavia ']' + for p in '$PROJECTS' + '[' -d /etc/openstack_extras ']' + '[' -d /var/log/openstack_extras ']' + for p in '$PROJECTS' + '[' -d /etc/openstacklib ']' + '[' -d /var/log/openstacklib ']' + for p in '$PROJECTS' + '[' -d /etc/oslo ']' + '[' -d /var/log/oslo ']' + for p in '$PROJECTS' + '[' -d /etc/sahara ']' + '[' -d /var/log/sahara ']' + for p in '$PROJECTS' + '[' -d /etc/swift ']' + sudo cp -r /etc/swift /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/ + '[' -d /var/log/swift ']' + sudo cp -r /var/log/swift /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs + for p in '$PROJECTS' + '[' -d /etc/tempest ']' + '[' -d /var/log/tempest ']' + for p in '$PROJECTS' + '[' -d /etc/trove ']' + '[' -d /var/log/trove ']' + for p in '$PROJECTS' + '[' -d /etc/vswitch ']' + '[' -d /var/log/vswitch ']' + for p in '$PROJECTS' + '[' -d /etc/zaqar ']' + '[' -d /var/log/zaqar ']' + uses_debs + type apt-get + is_fedora + lsb_release -i + grep -iq fedora + lsb_release -i + grep -iq CentOS + sudo journalctl --no-pager + '[' -d /var/log/rabbitmq ']' + sudo cp -r /var/log/rabbitmq /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs + '[' -d /var/log/postgresql ']' + '[' -f /var/log/mysql.err ']' + '[' -f /var/log/mysql.log ']' + '[' -f /tmp/openstack/tempest/tempest.log ']' + sudo cp /tmp/openstack/tempest/tempest.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/ + '[' -f /tmp/openstack/tempest/testrepository.subunit ']' + sudo cp /tmp/openstack/tempest/testrepository.subunit 
/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/testrepository.subunit + '[' -f /var/log/dstat.log ']' + sudo cp /var/log/dstat.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/ + '[' -d /var/log/libvirt ']' + sudo cp -r /var/log/libvirt /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/ + '[' -d /var/log/openvswitch ']' + sudo cp -r /var/log/openvswitch /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/ + sudo cp -r /etc/sudoers.d /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/ + sudo cp /etc/sudoers /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/sudoers.txt + uses_debs + type apt-get + is_fedora + lsb_release -i + grep -iq fedora + lsb_release -i + grep -iq CentOS + apache_logs=/var/log/httpd + '[' -d /etc/httpd/conf.d ']' + mkdir /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache_config + sudo cp /etc/httpd/conf.d/10-ironic_wsgi.conf /etc/httpd/conf.d/10-keystone_wsgi_admin.conf /etc/httpd/conf.d/10-keystone_wsgi_main.conf /etc/httpd/conf.d/10-nova_api_wsgi.conf /etc/httpd/conf.d/15-default.conf /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache_config ++ ls /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache_config + for f in '`ls $LOG_DIR/apache_config`' + mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache_config/10-ironic_wsgi.conf /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache_config/10-ironic_wsgi.conf.txt + for f in '`ls $LOG_DIR/apache_config`' + mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache_config/10-keystone_wsgi_admin.conf /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache_config/10-keystone_wsgi_admin.conf.txt + for f in '`ls $LOG_DIR/apache_config`' + mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache_config/10-keystone_wsgi_main.conf /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache_config/10-keystone_wsgi_main.conf.txt + for f in '`ls $LOG_DIR/apache_config`' + mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache_config/10-nova_api_wsgi.conf /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache_config/10-nova_api_wsgi.conf.txt + for f in '`ls $LOG_DIR/apache_config`' + mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache_config/15-default.conf /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache_config/15-default.conf.txt + '[' -d /var/log/httpd ']' + sudo cp -r /var/log/httpd /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache + '[' -f /var/log/audit/audit.log ']' + sudo cp /var/log/audit/audit.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/audit.log.txt + '[' -d /tmp/openstack/tempest ']' + sudo cp /tmp/openstack/tempest/etc/tempest.conf 
/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/tempest.conf.txt + '[' -d /etc/openstack-dashboard ']' ++ command -v dpkg + '[' ']' ++ command -v rpm + '[' /usr/bin/rpm ']' + rpm -qa + df -h + free -m + cat /proc/cpuinfo + ps -eo user,pid,ppid,lwp,%cpu,%mem,size,rss,cmd + sudo find /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs -type d -execdir sudo chmod 755 '{}' ';' + sudo find /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs -type f -execdir sudo chmod 644 '{}' ';' + sudo find /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs -type l -execdir sudo rm -f '{}' ';' ++ find /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs -name '*.log' + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/cinder/cinder-manage.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/cinder/cinder-manage.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/cinder/api.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/cinder/api.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/cinder/volume.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/cinder/volume.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/cinder/scheduler.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/cinder/scheduler.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/glance/registry.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/glance/registry.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/glance/api.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/glance/api.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/ironic/ironic-dbsync.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/ironic/ironic-dbsync.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/ironic/ironic-conductor.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/ironic/ironic-conductor.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/ironic/app.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/ironic/app.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv 
/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/keystone/keystone.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/keystone/keystone.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/openvswitch-agent.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/openvswitch-agent.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/metering-agent.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/metering-agent.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/dhcp-agent.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/dhcp-agent.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/lbaas-agent.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/lbaas-agent.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/l3-agent.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/l3-agent.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/metadata-agent.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/metadata-agent.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/server.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/server.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-4e0cd0d8-1601-47d0-9820-d7e73f9bf3df.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-4e0cd0d8-1601-47d0-9820-d7e73f9bf3df.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-626609dd-d1cb-4f30-abe3-bde4810fa59b.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-626609dd-d1cb-4f30-abe3-bde4810fa59b.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-dc80ad39-b9a7-41bf-a792-cf8721e7ba5e.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-dc80ad39-b9a7-41bf-a792-cf8721e7ba5e.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv 
/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-04c5d21a-4ce9-48bc-9ebd-760d57d6f181.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-04c5d21a-4ce9-48bc-9ebd-760d57d6f181.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-d5cd80b6-00c9-4002-b7c1-3f0e89fe6b22.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-d5cd80b6-00c9-4002-b7c1-3f0e89fe6b22.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-0482c0a4-c3ff-4e4f-b9d2-1840f91820f3.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-0482c0a4-c3ff-4e4f-b9d2-1840f91820f3.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-2ceb5c9c-90bd-4773-a918-66aa6dbcfa4d.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-2ceb5c9c-90bd-4773-a918-66aa6dbcfa4d.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-3444e121-20ca-4d43-bc85-250291b85adf.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-3444e121-20ca-4d43-bc85-250291b85adf.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-507c5b71-cfff-4102-9946-28e8e2327626.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-507c5b71-cfff-4102-9946-28e8e2327626.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-61fa4559-4a33-4c9c-9aae-787631907b72.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-61fa4559-4a33-4c9c-9aae-787631907b72.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-c9135f9e-e3b5-45bc-a377-09e80f3e46f9.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-c9135f9e-e3b5-45bc-a377-09e80f3e46f9.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-241bdf5e-518e-4b0e-a2de-3c66ae07ff1a.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-241bdf5e-518e-4b0e-a2de-3c66ae07ff1a.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv 
/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-83e52736-3b87-4753-a58a-223b725f50a3.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-83e52736-3b87-4753-a58a-223b725f50a3.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-52f14bea-16c9-450f-b2eb-c38128712134.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-52f14bea-16c9-450f-b2eb-c38128712134.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-63a1d182-dee3-4268-bd18-bf7fbce7c030.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-63a1d182-dee3-4268-bd18-bf7fbce7c030.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-c1ad332d-7462-4167-a25a-32bef965a872.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-c1ad332d-7462-4167-a25a-32bef965a872.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-c5966564-d7eb-4943-8b2c-2971587bb84e.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-c5966564-d7eb-4943-8b2c-2971587bb84e.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-ef0e2587-c553-4913-8c7c-e6cfd6fea623.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-ef0e2587-c553-4913-8c7c-e6cfd6fea623.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-43f26813-0feb-402a-bc02-f6a443073330.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-43f26813-0feb-402a-bc02-f6a443073330.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-65c24824-1749-4f9b-a3ce-825f3d91070e.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-65c24824-1749-4f9b-a3ce-825f3d91070e.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-7483ed8c-9a86-48c8-8f2f-c86314a8fd89.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-7483ed8c-9a86-48c8-8f2f-c86314a8fd89.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv 
/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-bb5ad119-e2c2-487c-a81b-486691e2d2c3.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-bb5ad119-e2c2-487c-a81b-486691e2d2c3.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-abb4fa75-9747-4a73-9e2d-92f30aea7bd8.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-abb4fa75-9747-4a73-9e2d-92f30aea7bd8.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-8f917175-7007-4c3b-9c9b-d18448bd1139.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-8f917175-7007-4c3b-9c9b-d18448bd1139.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-e8957ee5-60b7-4a76-b725-66b6fd770c16.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-e8957ee5-60b7-4a76-b725-66b6fd770c16.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-57f1109f-c393-4fb6-8c44-51d0cfbab6a0.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-57f1109f-c393-4fb6-8c44-51d0cfbab6a0.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-cae8fa4f-49b1-45ae-b151-2721639d9ee6.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-cae8fa4f-49b1-45ae-b151-2721639d9ee6.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-ade8d50d-fea6-41a4-b385-35625210cb2a.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-ade8d50d-fea6-41a4-b385-35625210cb2a.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-f906b905-159d-4308-bf7e-1eafb8888246.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-f906b905-159d-4308-bf7e-1eafb8888246.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-ba272c2c-1ea6-4c14-9b5c-78cae90d82b7.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-ba272c2c-1ea6-4c14-9b5c-78cae90d82b7.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv 
/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-ebe9ce7b-1d0c-4e2a-9545-f15e4c4a7e5e.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-ebe9ce7b-1d0c-4e2a-9545-f15e4c4a7e5e.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-7e360df8-ef95-43f0-b078-00a48543d918.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-7e360df8-ef95-43f0-b078-00a48543d918.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-c764cc0f-af8c-4138-a98b-3bc8c2ebd200.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-c764cc0f-af8c-4138-a98b-3bc8c2ebd200.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-f986b546-cdf7-4c8f-8fab-65ed694fddf4.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-f986b546-cdf7-4c8f-8fab-65ed694fddf4.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/nova/nova-manage.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/nova/nova-manage.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/nova/nova-novncproxy.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/nova/nova-novncproxy.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/nova/nova-consoleauth.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/nova/nova-consoleauth.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/nova/nova-scheduler.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/nova/nova-scheduler.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/nova/nova-conductor.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/nova/nova-conductor.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/nova/nova-compute.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/nova/nova-compute.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/nova/nova-api.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/nova/nova-api.txt + for f in '$(find $LOG_DIR -name "*.log")' + 
sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/rabbitmq/rabbit@n2.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/rabbitmq/rabbit@n2.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/rabbitmq/rabbit@n2-sasl.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/rabbitmq/rabbit@n2-sasl.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/tempest.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/tempest.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/dstat.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/dstat.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-00000001.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-00000001.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-00000002.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-00000002.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-00000003.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-00000003.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-00000004.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-00000004.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-00000005.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-00000005.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-00000006.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-00000006.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-00000007.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-00000007.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-00000008.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-00000008.txt + for f in 
'$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-00000009.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-00000009.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-0000000a.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-0000000a.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-0000000b.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-0000000b.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-0000000c.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-0000000c.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-0000000d.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-0000000d.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-0000000e.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-0000000e.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-0000000f.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-0000000f.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/openvswitch/ovsdb-server.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/openvswitch/ovsdb-server.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/openvswitch/ovs-vswitchd.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/openvswitch/ovs-vswitchd.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache/default_error.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache/default_error.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache/nova_api_wsgi_error_ssl.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache/nova_api_wsgi_error_ssl.txt + for f in '$(find $LOG_DIR -name "*.log")' + sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache/keystone_wsgi_main_error_ssl.log 
/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache/keystone_wsgi_main_error_ssl.txt
+ for f in '$(find $LOG_DIR -name "*.log")'
+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache/keystone_wsgi_admin_error_ssl.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache/keystone_wsgi_admin_error_ssl.txt
+ for f in '$(find $LOG_DIR -name "*.log")'
+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache/ironic_wsgi_error_ssl.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache/ironic_wsgi_error_ssl.txt
+ for f in '$(find $LOG_DIR -name "*.log")'
+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache/nova_api_wsgi_access_ssl.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache/nova_api_wsgi_access_ssl.txt
+ for f in '$(find $LOG_DIR -name "*.log")'
+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache/keystone_wsgi_main_access_ssl.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache/keystone_wsgi_main_access_ssl.txt
+ for f in '$(find $LOG_DIR -name "*.log")'
+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache/keystone_wsgi_admin_access_ssl.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache/keystone_wsgi_admin_access_ssl.txt
+ for f in '$(find $LOG_DIR -name "*.log")'
+ sudo mv /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache/ironic_wsgi_access_ssl.log /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache/ironic_wsgi_access_ssl.txt
+ find /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/sudoers.d /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc -type f -exec sudo mv '{}' '{}.txt' ';'
+ '[' -f /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/rabbitmq ']'
+ sudo find /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs -iname '*.txt' -type f -execdir gzip -9 '{}' +
+ sudo find /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs -iname '*.dat' -type f -execdir gzip -9 '{}' +
+ sudo find /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs -iname '*.conf' -type f -execdir gzip -9 '{}' +
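
The stderr trace above records copy_puppet_logs.sh at work. Its first pass decides which services to collect: every module under /etc/puppet/modules whose metadata.json points at a github.com/stackforge or github.com/openstack puppet repository is treated as a deployed OpenStack service, and its /etc/<project> and /var/log/<project> trees are copied under the build's log directory. A minimal sketch of that selection pass, reconstructed from the trace rather than taken from the weirdo repository (grep -E stands in for the deprecated egrep; LOG_DIR is this build's directory):

    #!/bin/bash
    # Sketch reconstructed from the trace above; not the actual copy_puppet_logs.sh.
    set -o errexit
    LOG_DIR=/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs

    PROJECTS=""
    for project in /etc/puppet/modules/*; do
        # Keep only modules whose metadata points at an OpenStack puppet repo
        if [ -f "$project/metadata.json" ] &&
           grep -Eq 'github.com/(stackforge|openstack)/puppet' "$project/metadata.json"; then
            PROJECTS+="$(basename "$project") "
        fi
    done

    mkdir -p "$LOG_DIR/etc"
    for p in $PROJECTS; do
        # Copy configuration and logs for each detected service, when present
        if [ -d "/etc/$p" ]; then sudo cp -r "/etc/$p" "$LOG_DIR/etc/"; fi
        if [ -d "/var/log/$p" ]; then sudo cp -r "/var/log/$p" "$LOG_DIR"; fi
    done

This explains why only cinder, glance, ironic, keystone, neutron, nova and swift appear in the cp commands above: the other detected modules were not actually deployed by scenario002, so their /etc and /var/log directories do not exist.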
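The long run of mv and gzip commands that closes the trace is the artifact-normalization pass: collected *.log files and copied configuration files are renamed to *.txt so the artifact web server displays them inline instead of offering a download, and text-like files are then compressed at maximum ratio to keep the artifacts small. Roughly, under the same LOG_DIR assumption:

    #!/bin/bash
    # Sketch of the normalization pass shown in the trace; a reconstruction,
    # not the script itself.
    set -o errexit
    LOG_DIR=/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs

    # Rename *.log to *.txt so the artifact server serves them as plain text
    for f in $(find "$LOG_DIR" -name "*.log"); do
        sudo mv "$f" "${f%.log}.txt"
    done

    # Collected config files under sudoers.d/ and etc/ get the suffix too
    find "$LOG_DIR/sudoers.d" "$LOG_DIR/etc" -type f -exec sudo mv '{}' '{}.txt' ';'

    # Compress everything text-like to keep the stored artifacts small
    for ext in txt dat conf; do
        sudo find "$LOG_DIR" -iname "*.$ext" -type f -execdir gzip -9 '{}' +
    done
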
{"command": "pstree -p"}}, "rc": 0, "start": "2016-05-20 12:49:39.250658", "stderr": "", "stdout": "", "stdout_lines": [], "warnings": []} changed: [n2.dusty] => (item={'value': {u'command': u'yum repolist -v'}, 'key': u'repolist'}) => {"changed": true, "cmd": "yum repolist -v >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/repolist.txt", "delta": "0:00:00.923338", "end": "2016-05-20 12:49:40.461725", "item": {"key": "repolist", "value": {"command": "yum repolist -v"}}, "rc": 0, "start": "2016-05-20 12:49:39.538387", "stderr": "", "stdout": "", "stdout_lines": [], "warnings": ["Consider using yum module rather than running yum"]} changed: [n2.dusty] => (item={'value': {u'command': u'rpm -qa'}, 'key': u'rpm_packages'}) => {"changed": true, "cmd": "rpm -qa >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/rpm_packages.txt", "delta": "0:00:00.461820", "end": "2016-05-20 12:49:41.151568", "item": {"key": "rpm_packages", "value": {"command": "rpm -qa"}}, "rc": 0, "start": "2016-05-20 12:49:40.689748", "stderr": "", "stdout": "", "stdout_lines": [], "warnings": ["Consider using yum module rather than running rpm"]} changed: [n2.dusty] => (item={'value': {u'command': u'lsof -Pni'}, 'key': u'lsof_network'}) => {"changed": true, "cmd": "lsof -Pni >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/lsof_network.txt", "delta": "0:00:00.074819", "end": "2016-05-20 12:49:41.465068", "item": {"key": "lsof_network", "value": {"command": "lsof -Pni"}}, "rc": 0, "start": "2016-05-20 12:49:41.390249", "stderr": "", "stdout": "", "stdout_lines": [], "warnings": []} changed: [n2.dusty] => (item={'value': {u'command': u'dmesg -T'}, 'key': u'dmesg'}) => {"changed": true, "cmd": "dmesg -T >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/dmesg.txt", "delta": "0:00:00.025345", "end": "2016-05-20 12:49:41.746885", "item": {"key": "dmesg", "value": {"command": "dmesg -T"}}, "rc": 0, "start": "2016-05-20 12:49:41.721540", "stderr": "", "stdout": "", "stdout_lines": [], "warnings": []} changed: [n2.dusty] => (item={'value': {u'command': u'journalctl --no-pager'}, 'key': u'journalctl'}) => {"changed": true, "cmd": "journalctl --no-pager >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/journalctl.txt", "delta": "0:00:00.350638", "end": "2016-05-20 12:49:42.345254", "item": {"key": "journalctl", "value": {"command": "journalctl --no-pager"}}, "rc": 0, "start": "2016-05-20 12:49:41.994616", "stderr": "", "stdout": "", "stdout_lines": [], "warnings": []} changed: [n2.dusty] => (item={'value': {u'command': u'sysctl -a'}, 'key': u'sysctl'}) => {"changed": true, "cmd": "sysctl -a >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/sysctl.txt", "delta": "0:00:00.086389", "end": "2016-05-20 12:49:42.662178", "item": {"key": "sysctl", "value": {"command": "sysctl -a"}}, "rc": 0, "start": "2016-05-20 12:49:42.575789", "stderr": "", "stdout": "", "stdout_lines": [], "warnings": []} changed: [n2.dusty] => (item={'value': {u'command': u'fdisk -l'}, 'key': u'fdisk'}) => {"changed": true, "cmd": "fdisk -l >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/fdisk.txt", "delta": "0:00:00.010571", "end": "2016-05-20 12:49:42.920460", "item": {"key": "fdisk", "value": {"command": "fdisk -l"}}, "rc": 0, "start": "2016-05-20 12:49:42.909889", "stderr": "", "stdout": "", "stdout_lines": [], "warnings": []} changed: [n2.dusty] => (item={'value': {u'command': u'df -h'}, 'key': u'df'}) => 
{"changed": true, "cmd": "df -h >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/df.txt", "delta": "0:00:00.005140", "end": "2016-05-20 12:49:43.140294", "item": {"key": "df", "value": {"command": "df -h"}}, "rc": 0, "start": "2016-05-20 12:49:43.135154", "stderr": "", "stdout": "", "stdout_lines": [], "warnings": []} changed: [n2.dusty] => (item={'value': {u'command': u'cat /proc/meminfo'}, 'key': u'meminfo'}) => {"changed": true, "cmd": "cat /proc/meminfo >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/meminfo.txt", "delta": "0:00:00.004054", "end": "2016-05-20 12:49:43.364841", "item": {"key": "meminfo", "value": {"command": "cat /proc/meminfo"}}, "rc": 0, "start": "2016-05-20 12:49:43.360787", "stderr": "", "stdout": "", "stdout_lines": [], "warnings": []} changed: [n2.dusty] => (item={'value': {u'command': u'lsmod'}, 'key': u'lsmod'}) => {"changed": true, "cmd": "lsmod >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/lsmod.txt", "delta": "0:00:00.009908", "end": "2016-05-20 12:49:43.599977", "item": {"key": "lsmod", "value": {"command": "lsmod"}}, "rc": 0, "start": "2016-05-20 12:49:43.590069", "stderr": "", "stdout": "", "stdout_lines": [], "warnings": []} changed: [n2.dusty] => (item={'value': {u'command': u'cat /proc/cpuinfo'}, 'key': u'cpuinfo'}) => {"changed": true, "cmd": "cat /proc/cpuinfo >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/cpuinfo.txt", "delta": "0:00:00.004319", "end": "2016-05-20 12:49:43.823557", "item": {"key": "cpuinfo", "value": {"command": "cat /proc/cpuinfo"}}, "rc": 0, "start": "2016-05-20 12:49:43.819238", "stderr": "", "stdout": "", "stdout_lines": [], "warnings": []} changed: [n2.dusty] => (item={'value': {u'command': u'uname -a'}, 'key': u'uname'}) => {"changed": true, "cmd": "uname -a >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/uname.txt", "delta": "0:00:00.004536", "end": "2016-05-20 12:49:44.048444", "item": {"key": "uname", "value": {"command": "uname -a"}}, "rc": 0, "start": "2016-05-20 12:49:44.043908", "stderr": "", "stdout": "", "stdout_lines": [], "warnings": []} changed: [n2.dusty] => (item={'value': {u'command': u'netstat -ntlp'}, 'key': u'netstat'}) => {"changed": true, "cmd": "netstat -ntlp >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/netstat.txt", "delta": "0:00:00.045907", "end": "2016-05-20 12:49:44.312167", "item": {"key": "netstat", "value": {"command": "netstat -ntlp"}}, "rc": 0, "start": "2016-05-20 12:49:44.266260", "stderr": "", "stdout": "", "stdout_lines": [], "warnings": []} changed: [n2.dusty] => (item={'value': {u'command': u'iptables -vnL -t nat'}, 'key': u'iptables_nat'}) => {"changed": true, "cmd": "iptables -vnL -t nat >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/iptables_nat.txt", "delta": "0:00:00.006461", "end": "2016-05-20 12:49:44.529191", "item": {"key": "iptables_nat", "value": {"command": "iptables -vnL -t nat"}}, "rc": 0, "start": "2016-05-20 12:49:44.522730", "stderr": "", "stdout": "", "stdout_lines": [], "warnings": []} changed: [n2.dusty] => (item={'value': {u'command': u'lsof -P'}, 'key': u'lsof'}) => {"changed": true, "cmd": "lsof -P >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/lsof.txt", "delta": "0:00:01.774469", "end": "2016-05-20 12:49:46.531442", "item": {"key": "lsof", "value": {"command": "lsof -P"}}, "rc": 0, "start": "2016-05-20 12:49:44.756973", "stderr": "", "stdout": "", "stdout_lines": [], 
"warnings": []} changed: [n2.dusty] => (item={'value': {u'command': u'sar -A -f /var/log/sa/*'}, 'key': u'sysstat'}) => {"changed": true, "cmd": "sar -A -f /var/log/sa/* >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/sysstat.txt", "delta": "0:00:00.019631", "end": "2016-05-20 12:49:46.780516", "item": {"key": "sysstat", "value": {"command": "sar -A -f /var/log/sa/*"}}, "rc": 0, "start": "2016-05-20 12:49:46.760885", "stderr": "", "stdout": "", "stdout_lines": [], "warnings": []} changed: [n2.dusty] => (item={'value': {u'command': u'cat /etc/hosts'}, 'key': u'hosts'}) => {"changed": true, "cmd": "cat /etc/hosts >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/hosts.txt", "delta": "0:00:00.004778", "end": "2016-05-20 12:49:47.019581", "item": {"key": "hosts", "value": {"command": "cat /etc/hosts"}}, "rc": 0, "start": "2016-05-20 12:49:47.014803", "stderr": "", "stdout": "", "stdout_lines": [], "warnings": []} changed: [n2.dusty] => (item={'value': {u'command': u'cat /proc/mounts |column -t'}, 'key': u'mounts'}) => {"changed": true, "cmd": "cat /proc/mounts |column -t >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/mounts.txt", "delta": "0:00:00.005619", "end": "2016-05-20 12:49:47.228073", "item": {"key": "mounts", "value": {"command": "cat /proc/mounts |column -t"}}, "rc": 0, "start": "2016-05-20 12:49:47.222454", "stderr": "", "stdout": "", "stdout_lines": [], "warnings": []} changed: [n2.dusty] => (item={'value': {u'command': u'iptables -vnL'}, 'key': u'iptables'}) => {"changed": true, "cmd": "iptables -vnL >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/iptables.txt", "delta": "0:00:00.005745", "end": "2016-05-20 12:49:47.439397", "item": {"key": "iptables", "value": {"command": "iptables -vnL"}}, "rc": 0, "start": "2016-05-20 12:49:47.433652", "stderr": "", "stdout": "", "stdout_lines": [], "warnings": []} changed: [n2.dusty] => (item={'value': {u'command': u'getenforce'}, 'key': u'getenforce'}) => {"changed": true, "cmd": "getenforce >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/getenforce.txt", "delta": "0:00:00.004982", "end": "2016-05-20 12:49:47.634178", "item": {"key": "getenforce", "value": {"command": "getenforce"}}, "rc": 0, "start": "2016-05-20 12:49:47.629196", "stderr": "", "stdout": "", "stdout_lines": [], "warnings": []} changed: [n2.dusty] => (item={'value': {u'command': u'iptables -vnL -t mangle'}, 'key': u'iptables_mangle'}) => {"changed": true, "cmd": "iptables -vnL -t mangle >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/iptables_mangle.txt", "delta": "0:00:00.005471", "end": "2016-05-20 12:49:47.858173", "item": {"key": "iptables_mangle", "value": {"command": "iptables -vnL -t mangle"}}, "rc": 0, "start": "2016-05-20 12:49:47.852702", "stderr": "", "stdout": "", "stdout_lines": [], "warnings": []} msg: All items completed results: [ { "cmd": "pstree -p >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/pstree.txt", "_ansible_no_log": false, "delta": "0:00:00.060559", "invocation": { "module_name": "command", "module_args": { "chdir": null, "warn": true, "_uses_shell": true, "creates": "/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/pstree.txt", "executable": null, "_raw_params": "pstree -p >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/pstree.txt", "removes": null } }, "start": "2016-05-20 12:49:39.250658", "end": "2016-05-20 12:49:39.311217", "stdout": 
"", "changed": true, "item": { "value": { "command": "pstree -p" }, "key": "pstree" }, "stderr": "", "rc": 0, "stdout_lines": [], "warnings": [] }, { "cmd": "yum repolist -v >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/repolist.txt", "_ansible_no_log": false, "delta": "0:00:00.923338", "invocation": { "module_name": "command", "module_args": { "chdir": null, "warn": true, "_uses_shell": true, "creates": "/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/repolist.txt", "executable": null, "_raw_params": "yum repolist -v >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/repolist.txt", "removes": null } }, "start": "2016-05-20 12:49:39.538387", "end": "2016-05-20 12:49:40.461725", "stdout": "", "changed": true, "item": { "value": { "command": "yum repolist -v" }, "key": "repolist" }, "stderr": "", "rc": 0, "stdout_lines": [], "warnings": [ "Consider using yum module rather than running yum" ] }, { "cmd": "rpm -qa >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/rpm_packages.txt", "_ansible_no_log": false, "delta": "0:00:00.461820", "invocation": { "module_name": "command", "module_args": { "chdir": null, "warn": true, "_uses_shell": true, "creates": "/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/rpm_packages.txt", "executable": null, "_raw_params": "rpm -qa >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/rpm_packages.txt", "removes": null } }, "start": "2016-05-20 12:49:40.689748", "end": "2016-05-20 12:49:41.151568", "stdout": "", "changed": true, "item": { "value": { "command": "rpm -qa" }, "key": "rpm_packages" }, "stderr": "", "rc": 0, "stdout_lines": [], "warnings": [ "Consider using yum module rather than running rpm" ] }, { "cmd": "lsof -Pni >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/lsof_network.txt", "_ansible_no_log": false, "delta": "0:00:00.074819", "invocation": { "module_name": "command", "module_args": { "chdir": null, "warn": true, "_uses_shell": true, "creates": "/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/lsof_network.txt", "executable": null, "_raw_params": "lsof -Pni >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/lsof_network.txt", "removes": null } }, "start": "2016-05-20 12:49:41.390249", "end": "2016-05-20 12:49:41.465068", "stdout": "", "changed": true, "item": { "value": { "command": "lsof -Pni" }, "key": "lsof_network" }, "stderr": "", "rc": 0, "stdout_lines": [], "warnings": [] }, { "cmd": "dmesg -T >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/dmesg.txt", "_ansible_no_log": false, "delta": "0:00:00.025345", "invocation": { "module_name": "command", "module_args": { "chdir": null, "warn": true, "_uses_shell": true, "creates": "/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/dmesg.txt", "executable": null, "_raw_params": "dmesg -T >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/dmesg.txt", "removes": null } }, "start": "2016-05-20 12:49:41.721540", "end": "2016-05-20 12:49:41.746885", "stdout": "", "changed": true, "item": { "value": { "command": "dmesg -T" }, "key": "dmesg" }, "stderr": "", "rc": 0, "stdout_lines": [], "warnings": [] }, { "cmd": "journalctl --no-pager >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/journalctl.txt", "_ansible_no_log": false, "delta": "0:00:00.350638", "invocation": { "module_name": "command", "module_args": { "chdir": 
null, "warn": true, "_uses_shell": true, "creates": "/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/journalctl.txt", "executable": null, "_raw_params": "journalctl --no-pager >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/journalctl.txt", "removes": null } }, "start": "2016-05-20 12:49:41.994616", "end": "2016-05-20 12:49:42.345254", "stdout": "", "changed": true, "item": { "value": { "command": "journalctl --no-pager" }, "key": "journalctl" }, "stderr": "", "rc": 0, "stdout_lines": [], "warnings": [] }, { "cmd": "sysctl -a >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/sysctl.txt", "_ansible_no_log": false, "delta": "0:00:00.086389", "invocation": { "module_name": "command", "module_args": { "chdir": null, "warn": true, "_uses_shell": true, "creates": "/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/sysctl.txt", "executable": null, "_raw_params": "sysctl -a >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/sysctl.txt", "removes": null } }, "start": "2016-05-20 12:49:42.575789", "end": "2016-05-20 12:49:42.662178", "stdout": "", "changed": true, "item": { "value": { "command": "sysctl -a" }, "key": "sysctl" }, "stderr": "", "rc": 0, "stdout_lines": [], "warnings": [] }, { "cmd": "fdisk -l >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/fdisk.txt", "_ansible_no_log": false, "delta": "0:00:00.010571", "invocation": { "module_name": "command", "module_args": { "chdir": null, "warn": true, "_uses_shell": true, "creates": "/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/fdisk.txt", "executable": null, "_raw_params": "fdisk -l >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/fdisk.txt", "removes": null } }, "start": "2016-05-20 12:49:42.909889", "end": "2016-05-20 12:49:42.920460", "stdout": "", "changed": true, "item": { "value": { "command": "fdisk -l" }, "key": "fdisk" }, "stderr": "", "rc": 0, "stdout_lines": [], "warnings": [] }, { "cmd": "df -h >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/df.txt", "_ansible_no_log": false, "delta": "0:00:00.005140", "invocation": { "module_name": "command", "module_args": { "chdir": null, "warn": true, "_uses_shell": true, "creates": "/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/df.txt", "executable": null, "_raw_params": "df -h >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/df.txt", "removes": null } }, "start": "2016-05-20 12:49:43.135154", "end": "2016-05-20 12:49:43.140294", "stdout": "", "changed": true, "item": { "value": { "command": "df -h" }, "key": "df" }, "stderr": "", "rc": 0, "stdout_lines": [], "warnings": [] }, { "cmd": "cat /proc/meminfo >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/meminfo.txt", "_ansible_no_log": false, "delta": "0:00:00.004054", "invocation": { "module_name": "command", "module_args": { "chdir": null, "warn": true, "_uses_shell": true, "creates": "/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/meminfo.txt", "executable": null, "_raw_params": "cat /proc/meminfo >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/meminfo.txt", "removes": null } }, "start": "2016-05-20 12:49:43.360787", "end": "2016-05-20 12:49:43.364841", "stdout": "", "changed": true, "item": { "value": { "command": "cat /proc/meminfo" }, "key": "meminfo" }, "stderr": "", "rc": 0, "stdout_lines": [], "warnings": [] }, { 
"cmd": "lsmod >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/lsmod.txt", "_ansible_no_log": false, "delta": "0:00:00.009908", "invocation": { "module_name": "command", "module_args": { "chdir": null, "warn": true, "_uses_shell": true, "creates": "/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/lsmod.txt", "executable": null, "_raw_params": "lsmod >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/lsmod.txt", "removes": null } }, "start": "2016-05-20 12:49:43.590069", "end": "2016-05-20 12:49:43.599977", "stdout": "", "changed": true, "item": { "value": { "command": "lsmod" }, "key": "lsmod" }, "stderr": "", "rc": 0, "stdout_lines": [], "warnings": [] }, { "cmd": "cat /proc/cpuinfo >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/cpuinfo.txt", "_ansible_no_log": false, "delta": "0:00:00.004319", "invocation": { "module_name": "command", "module_args": { "chdir": null, "warn": true, "_uses_shell": true, "creates": "/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/cpuinfo.txt", "executable": null, "_raw_params": "cat /proc/cpuinfo >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/cpuinfo.txt", "removes": null } }, "start": "2016-05-20 12:49:43.819238", "end": "2016-05-20 12:49:43.823557", "stdout": "", "changed": true, "item": { "value": { "command": "cat /proc/cpuinfo" }, "key": "cpuinfo" }, "stderr": "", "rc": 0, "stdout_lines": [], "warnings": [] }, { "cmd": "uname -a >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/uname.txt", "_ansible_no_log": false, "delta": "0:00:00.004536", "invocation": { "module_name": "command", "module_args": { "chdir": null, "warn": true, "_uses_shell": true, "creates": "/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/uname.txt", "executable": null, "_raw_params": "uname -a >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/uname.txt", "removes": null } }, "start": "2016-05-20 12:49:44.043908", "end": "2016-05-20 12:49:44.048444", "stdout": "", "changed": true, "item": { "value": { "command": "uname -a" }, "key": "uname" }, "stderr": "", "rc": 0, "stdout_lines": [], "warnings": [] }, { "cmd": "netstat -ntlp >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/netstat.txt", "_ansible_no_log": false, "delta": "0:00:00.045907", "invocation": { "module_name": "command", "module_args": { "chdir": null, "warn": true, "_uses_shell": true, "creates": "/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/netstat.txt", "executable": null, "_raw_params": "netstat -ntlp >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/netstat.txt", "removes": null } }, "start": "2016-05-20 12:49:44.266260", "end": "2016-05-20 12:49:44.312167", "stdout": "", "changed": true, "item": { "value": { "command": "netstat -ntlp" }, "key": "netstat" }, "stderr": "", "rc": 0, "stdout_lines": [], "warnings": [] }, { "cmd": "iptables -vnL -t nat >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/iptables_nat.txt", "_ansible_no_log": false, "delta": "0:00:00.006461", "invocation": { "module_name": "command", "module_args": { "chdir": null, "warn": true, "_uses_shell": true, "creates": "/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/iptables_nat.txt", "executable": null, "_raw_params": "iptables -vnL -t nat >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/iptables_nat.txt", 
"removes": null } }, "start": "2016-05-20 12:49:44.522730", "end": "2016-05-20 12:49:44.529191", "stdout": "", "changed": true, "item": { "value": { "command": "iptables -vnL -t nat" }, "key": "iptables_nat" }, "stderr": "", "rc": 0, "stdout_lines": [], "warnings": [] }, { "cmd": "lsof -P >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/lsof.txt", "_ansible_no_log": false, "delta": "0:00:01.774469", "invocation": { "module_name": "command", "module_args": { "chdir": null, "warn": true, "_uses_shell": true, "creates": "/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/lsof.txt", "executable": null, "_raw_params": "lsof -P >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/lsof.txt", "removes": null } }, "start": "2016-05-20 12:49:44.756973", "end": "2016-05-20 12:49:46.531442", "stdout": "", "changed": true, "item": { "value": { "command": "lsof -P" }, "key": "lsof" }, "stderr": "", "rc": 0, "stdout_lines": [], "warnings": [] }, { "cmd": "sar -A -f /var/log/sa/* >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/sysstat.txt", "_ansible_no_log": false, "delta": "0:00:00.019631", "invocation": { "module_name": "command", "module_args": { "chdir": null, "warn": true, "_uses_shell": true, "creates": "/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/sysstat.txt", "executable": null, "_raw_params": "sar -A -f /var/log/sa/* >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/sysstat.txt", "removes": null } }, "start": "2016-05-20 12:49:46.760885", "end": "2016-05-20 12:49:46.780516", "stdout": "", "changed": true, "item": { "value": { "command": "sar -A -f /var/log/sa/*" }, "key": "sysstat" }, "stderr": "", "rc": 0, "stdout_lines": [], "warnings": [] }, { "cmd": "cat /etc/hosts >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/hosts.txt", "_ansible_no_log": false, "delta": "0:00:00.004778", "invocation": { "module_name": "command", "module_args": { "chdir": null, "warn": true, "_uses_shell": true, "creates": "/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/hosts.txt", "executable": null, "_raw_params": "cat /etc/hosts >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/hosts.txt", "removes": null } }, "start": "2016-05-20 12:49:47.014803", "end": "2016-05-20 12:49:47.019581", "stdout": "", "changed": true, "item": { "value": { "command": "cat /etc/hosts" }, "key": "hosts" }, "stderr": "", "rc": 0, "stdout_lines": [], "warnings": [] }, { "cmd": "cat /proc/mounts |column -t >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/mounts.txt", "_ansible_no_log": false, "delta": "0:00:00.005619", "invocation": { "module_name": "command", "module_args": { "chdir": null, "warn": true, "_uses_shell": true, "creates": "/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/mounts.txt", "executable": null, "_raw_params": "cat /proc/mounts |column -t >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/mounts.txt", "removes": null } }, "start": "2016-05-20 12:49:47.222454", "end": "2016-05-20 12:49:47.228073", "stdout": "", "changed": true, "item": { "value": { "command": "cat /proc/mounts |column -t" }, "key": "mounts" }, "stderr": "", "rc": 0, "stdout_lines": [], "warnings": [] }, { "cmd": "iptables -vnL >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/iptables.txt", "_ansible_no_log": false, "delta": "0:00:00.005745", "invocation": { 
"module_name": "command", "module_args": { "chdir": null, "warn": true, "_uses_shell": true, "creates": "/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/iptables.txt", "executable": null, "_raw_params": "iptables -vnL >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/iptables.txt", "removes": null } }, "start": "2016-05-20 12:49:47.433652", "end": "2016-05-20 12:49:47.439397", "stdout": "", "changed": true, "item": { "value": { "command": "iptables -vnL" }, "key": "iptables" }, "stderr": "", "rc": 0, "stdout_lines": [], "warnings": [] }, { "cmd": "getenforce >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/getenforce.txt", "_ansible_no_log": false, "delta": "0:00:00.004982", "invocation": { "module_name": "command", "module_args": { "chdir": null, "warn": true, "_uses_shell": true, "creates": "/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/getenforce.txt", "executable": null, "_raw_params": "getenforce >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/getenforce.txt", "removes": null } }, "start": "2016-05-20 12:49:47.629196", "end": "2016-05-20 12:49:47.634178", "stdout": "", "changed": true, "item": { "value": { "command": "getenforce" }, "key": "getenforce" }, "stderr": "", "rc": 0, "stdout_lines": [], "warnings": [] }, { "cmd": "iptables -vnL -t mangle >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/iptables_mangle.txt", "_ansible_no_log": false, "delta": "0:00:00.005471", "invocation": { "module_name": "command", "module_args": { "chdir": null, "warn": true, "_uses_shell": true, "creates": "/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/iptables_mangle.txt", "executable": null, "_raw_params": "iptables -vnL -t mangle >/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/iptables_mangle.txt", "removes": null } }, "start": "2016-05-20 12:49:47.852702", "end": "2016-05-20 12:49:47.858173", "stdout": "", "changed": true, "item": { "value": { "command": "iptables -vnL -t mangle" }, "key": "iptables_mangle" }, "stderr": "", "rc": 0, "stdout_lines": [], "warnings": [] } ] RUNNING HANDLER [common : command] ********************************************* changed: [n2.dusty] => (item=/var/log/audit) => {"changed": true, "cmd": ["rsync", "-azr", "/var/log/audit", "/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/"], "delta": "0:00:00.100070", "end": "2016-05-20 12:49:48.755136", "item": "/var/log/audit", "rc": 0, "start": "2016-05-20 12:49:48.655066", "stderr": "", "stdout": "", "stdout_lines": [], "warnings": ["Consider using synchronize module rather than running rsync"]} failed: [n2.dusty] => (item=/etc/ansible) => {"changed": true, "cmd": ["rsync", "-azr", "/etc/ansible", "/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/"], "delta": "0:00:00.144752", "end": "2016-05-20 12:49:49.122615", "failed": true, "item": "/etc/ansible", "rc": 23, "start": "2016-05-20 12:49:48.977863", "stderr": "rsync: link_stat \"/etc/ansible\" failed: No such file or directory (2)\nrsync error: some files/attrs were not transferred (see previous errors) (code 23) at main.c(1052) [sender=3.0.9]", "stdout": "", "stdout_lines": [], "warnings": ["Consider using synchronize module rather than running rsync"]} [WARNING]: Consider using synchronize module rather than running rsync ...ignoring msg: One or more items failed results: [ { "cmd": " rsync -azr /var/log/audit 
/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/", "_ansible_no_log": false, "delta": "0:00:00.100070", "invocation": { "module_name": "command", "module_args": { "chdir": null, "warn": true, "_uses_shell": false, "creates": "/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245//var/log/audit", "executable": null, "_raw_params": "rsync -azr /var/log/audit /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/", "removes": null } }, "start": "2016-05-20 12:49:48.655066", "end": "2016-05-20 12:49:48.755136", "stdout": "", "changed": true, "item": "/var/log/audit", "stderr": "", "rc": 0, "stdout_lines": [], "warnings": [ "Consider using synchronize module rather than running rsync" ] }, { "cmd": " rsync -azr /etc/ansible /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/", "stdout": "", "delta": "0:00:00.144752", "invocation": { "module_name": "command", "module_args": { "chdir": null, "warn": true, "_uses_shell": false, "creates": "/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245//etc/ansible", "executable": null, "_raw_params": "rsync -azr /etc/ansible /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/", "removes": null } }, "start": "2016-05-20 12:49:48.977863", "end": "2016-05-20 12:49:49.122615", "_ansible_no_log": false, "changed": true, "item": "/etc/ansible", "stderr": "rsync: link_stat \"/etc/ansible\" failed: No such file or directory (2) rsync error: some files/attrs were not transferred (see previous errors) (code 23) at main.c(1052) [sender=3.0.9]", "rc": 23, "failed": true, "stdout_lines": [], "warnings": [ "Consider using synchronize module rather than running rsync" ] } ] RUNNING HANDLER [common : copy] ************************************************ changed: [n2.dusty] => {"changed": true, "checksum": "d6d9b9********db3d********6****************6********ddb9****************f********f************************d************************9fd********3e", "dest": "********var****************g********weirdo********weirdo-********-promote-puppet-openstack-scenario****************************************************************ansible_********vars.txt", "gid": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "group": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "md5sum": "********bc********acc********db6****************99a9b****************fd****************b************************3da", "mode": "********6****************", "owner": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "secontext": "system_u:object_r:var_********g_t:s********", "size": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "src": "playbooks********tmp********ansible-tmp-****************63************************9********9.****************-****************************************663************************3********************************source", "state": "file", "uid": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER"} RUNNING HANDLER [common : command] ********************************************* changed: [n2.dusty] => (item=log) => {"changed": true, "cmd": "for file in $(find /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245 -name \"*.log\"); do\n mv $file ${file/.log/.txt}\n done", "delta": "0:00:00.009169", "end": "2016-05-20 12:49:50.194993", "item": "log", "rc": 0, "start": "2016-05-20 12:49:50.185824", "stderr": "", "stdout": "", "stdout_lines": [], "warnings": []} changed: [n2.dusty] => (item=conf) => {"changed": true, "cmd": "for file in $(find 
/var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245 -name \"*.conf\"); do\n mv $file ${file/.conf/.txt}\n done", "delta": "0:00:00.007049", "end": "2016-05-20 12:49:50.425309", "item": "conf", "rc": 0, "start": "2016-05-20 12:49:50.418260", "stderr": "", "stdout": "", "stdout_lines": [], "warnings": []}
changed: [n2.dusty] => (item=fact) => {"changed": true, "cmd": "for file in $(find /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245 -name \"*.fact\"); do\n mv $file ${file/.fact/.txt}\n done", "delta": "0:00:00.007719", "end": "2016-05-20 12:49:50.654244", "item": "fact", "rc": 0, "start": "2016-05-20 12:49:50.646525", "stderr": "", "stdout": "", "stdout_lines": [], "warnings": []}
msg: All items completed
results: [ { "cmd": "for file in $(find /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245 -name \"*.log\"); do mv $file ${file/.log/.txt} done", "_ansible_no_log": false, "delta": "0:00:00.009169", "invocation": { "module_name": "command", "module_args": { "chdir": null, "warn": true, "_uses_shell": true, "creates": null, "executable": null, "_raw_params": "for file in $(find /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245 -name \"*.log\"); do mv $file ${file/.log/.txt} done", "removes": null } }, "start": "2016-05-20 12:49:50.185824", "end": "2016-05-20 12:49:50.194993", "stdout": "", "changed": true, "item": "log", "stderr": "", "rc": 0, "stdout_lines": [], "warnings": [] }, { "cmd": "for file in $(find /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245 -name \"*.conf\"); do mv $file ${file/.conf/.txt} done", "_ansible_no_log": false, "delta": "0:00:00.007049", "invocation": { "module_name": "command", "module_args": { "chdir": null, "warn": true, "_uses_shell": true, "creates": null, "executable": null, "_raw_params": "for file in $(find /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245 -name \"*.conf\"); do mv $file ${file/.conf/.txt} done", "removes": null } }, "start": "2016-05-20 12:49:50.418260", "end": "2016-05-20 12:49:50.425309", "stdout": "", "changed": true, "item": "conf", "stderr": "", "rc": 0, "stdout_lines": [], "warnings": [] }, { "cmd": "for file in $(find /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245 -name \"*.fact\"); do mv $file ${file/.fact/.txt} done", "_ansible_no_log": false, "delta": "0:00:00.007719", "invocation": { "module_name": "command", "module_args": { "chdir": null, "warn": true, "_uses_shell": true, "creates": null, "executable": null, "_raw_params": "for file in $(find /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245 -name \"*.fact\"); do mv $file ${file/.fact/.txt} done", "removes": null } }, "start": "2016-05-20 12:49:50.646525", "end": "2016-05-20 12:49:50.654244", "stdout": "", "changed": true, "item": "fact", "stderr": "", "rc": 0, "stdout_lines": [], "warnings": [] } ]
RUNNING HANDLER [common : command] *********************************************
changed: [n2.dusty] => {"changed": true, "cmd": "find /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245 -type f -name \"*.txt\" -execdir gzip -9 {} \\+", "delta": "0:00:00.733630", "end": "2016-05-20 12:49:51.877415", "rc": 0, "start": "2016-05-20 12:49:51.143785", "stderr": "", "stdout": "", "stdout_lines": [], "warnings": []}
cmd: find /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245 -type f -name "*.txt" -execdir gzip -9 {} \+
start: 2016-05-20 12:49:51.143785
end: 2016-05-20 12:49:51.877415
delta: 0:00:00.733630
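The handlers above are weirdo's log post-processing in three passes: rsync selected directories into the build's log folder (the missing /etc/ansible fails with rc=23 but is tolerated via ...ignoring so collection continues), rename *.log/*.conf/*.fact files to *.txt (presumably so the artifact web server serves them as plain text), then gzip everything before upload. A minimal standalone sketch of the same sequence in sh, assuming hypothetical LOG_DIR and SRC_DIRS values rather than the role's actual variables:

    #!/bin/sh
    # Sketch only: LOG_DIR and SRC_DIRS are assumptions, not the role's variables.
    LOG_DIR=/var/log/weirdo/example-job/1
    SRC_DIRS="/var/log/audit /etc/ansible"

    for dir in $SRC_DIRS; do
        # Skip missing sources instead of letting rsync exit with rc=23.
        if [ -e "$dir" ]; then
            rsync -azr "$dir" "$LOG_DIR/"
        fi
    done

    # Rename suffixes the artifact server won't render inline.
    for ext in log conf fact; do
        find "$LOG_DIR" -name "*.$ext" | while read -r file; do
            mv "$file" "${file%.$ext}.txt"   # strips only the trailing suffix
        done
    done

    # Compress everything for upload.
    find "$LOG_DIR" -type f -name "*.txt" -execdir gzip -9 {} \+

Note that the suffix-strip expansion avoids a quirk of the role's ${file/.log/.txt} substitution, which replaces the first occurrence of the pattern anywhere in the path rather than only a trailing extension.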
RUNNING HANDLER [common : Ensure ci nodes are released] ************************
Friday 20 May 2016 11:49:52 +0000 (0:00:13.333) 0:28:41.425 ************
skipping: [n2.dusty] => {"changed": false, "skip_reason": "Conditional check failed", "skipped": true}
RUNNING HANDLER [common : Fail playbook execution] *****************************
Friday 20 May 2016 11:49:52 +0000 (0:00:00.341) 0:28:41.767 ************
fatal: [n2.dusty]: FAILED! => {"changed": false, "failed": true, "msg": "A task notified that the playbook execution should be failed"}
msg: A task notified that the playbook execution should be failed
        to retry, use: --limit @playbooks/puppet-openstack-scenario002.retry

PLAY RECAP *********************************************************************
n2.dusty                   : ok=31   changed=18   unreachable=0    failed=1

Friday 20 May 2016 11:49:52 +0000 (0:00:00.528) 0:28:42.295 ************
===============================================================================
TASK: puppet-openstack : Run puppet integration test - {{ test }} ---- 1609.09s
TASK: common : Install base packages ----------------------------------- 33.54s
TASK: common : Update all packages ------------------------------------- 21.65s
HANDLER: Ensure global logs are recovered ------------------------------ 13.33s
TASK: puppet-openstack : Install required packages ---------------------- 9.77s
TASK: puppet-openstack : Install required ruby gems --------------------- 7.47s
HANDLER: Ensure puppet-openstack logs are recovered --------------------- 5.62s
TASK: puppet-openstack : Download script for retrieving logs ------------ 3.32s
TASK: common : Install debug packages ----------------------------------- 2.60s
TASK: puppet-openstack : Clone upstream puppet-openstack-integration repository --- 2.37s
TASK: setup ------------------------------------------------------------- 2.15s
TASK: common : include -------------------------------------------------- 1.15s
TASK: puppet-openstack : Setup delorean repository ---------------------- 0.88s
TASK: puppet-openstack : Setup delorean-deps repository ----------------- 0.84s
TASK: common : Run sysstat every minute instead of every 10 minutes ----- 0.76s
TASK: puppet-openstack : Create directory where puppet-openstack logs will be stored --- 0.65s
TASK: common : Create log folder to centralize logs in ------------------ 0.61s
TASK: common : Enable sysstat ------------------------------------------- 0.61s
TASK: puppet-openstack : command ---------------------------------------- 0.56s
HANDLER: Fail playbook execution ---------------------------------------- 0.53s
ERROR: InvocationError: '/home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo/.tox/ansible-playbook/bin/ansible-playbook -vv -i hosts playbooks/puppet-openstack-scenario002.yml -e ci_environment=ci-centos -e delorean_url=http://trunk.rdoproject.org/centos7-mitaka/2d/d7/2dd7f56b0b04af66c8ea2739df3dcc43dc3a9316_cbd0900e/delorean.repo -e openstack_release=mitaka -e version=stable/mitaka'
___________________________________ summary ____________________________________
ERROR: ansible-playbook: commands failed
Build step 'Execute shell' marked build as failure
Sending e-mails to: dms@redhat.com
Archiving artifacts
Performing Post build task...
Match found for :Building remotely : True
Logical operation result is TRUE
Running script :
# It doesn't look otherwise possible to do an !include-raw mixed with content
# in JJB.
export ANSIBLE_HOSTS=$WORKSPACE/weirdo/hosts
export SSID_FILE=$WORKSPACE/weirdo/cico-ssid

# weirdo-collect-logs.sh
# A script to collect logs generated by a weirdo job
pushd $WORKSPACE/weirdo
# Don't fail script execution even if log collection fails -- the node needs to be destroyed afterwards
tox -e ansible-playbook -- -vv -i hosts playbooks/logs-ci-centos.yml -e ci_environment=ci-centos || true
popd

# cico-node-done-from-ansible.sh
# A script that releases nodes from a SSID file written by
SSID_FILE=${SSID_FILE:-$WORKSPACE/cico-ssid}
for ssid in $(cat ${SSID_FILE})
do
    cico -q node done $ssid
done
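The post-build script is deliberately tolerant: log collection runs under || true so that the node-release loop always executes, since (as its own comment notes) the node needs to be destroyed afterwards regardless of how the job went. A slightly hardened sketch of the release loop, assuming the same cico CLI and one session id per line in the SSID file:

    # Sketch only: same contract as cico-node-done-from-ansible.sh above.
    SSID_FILE="${SSID_FILE:-$WORKSPACE/cico-ssid}"
    if [ -f "$SSID_FILE" ]; then
        while read -r ssid; do
            # Ignore blank lines; release everything else.
            [ -n "$ssid" ] && cico -q node done "$ssid"
        done < "$SSID_FILE"
    fi

Quoting the expansions and checking for the file keeps a half-written or absent SSID file from aborting the cleanup step.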
[weirdo-mitaka-promote-puppet-openstack-scenario002] $ /bin/sh -xe /tmp/hudson9023821423222679205.sh
+ export ANSIBLE_HOSTS=/home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo/hosts
+ ANSIBLE_HOSTS=/home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo/hosts
+ export SSID_FILE=/home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo/cico-ssid
+ SSID_FILE=/home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo/cico-ssid
+ pushd /home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo
~/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo ~/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002
+ tox -e ansible-playbook -- -vv -i hosts playbooks/logs-ci-centos.yml -e ci_environment=ci-centos
ansible-playbook develop-inst-noop: /home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo
ansible-playbook installed: alabaster==0.7.8,ansible==2.0.1.0,ansible-lint==2.7.0,ara==0.5.2,Babel==2.3.4,cffi==1.6.0,cliff==2.0.0,cmd2==0.6.8,cryptography==1.3.2,decorator==4.0.9,docutils==0.12,enum34==1.1.6,Flask==0.10.1,Flask-SQLAlchemy==2.1,graphviz==0.4.10,idna==2.1,imagesize==0.7.1,ipaddress==1.0.16,itsdangerous==0.24,Jinja2==2.8,MarkupSafe==0.23,paramiko==2.0.0,pbr==1.9.1,prettytable==0.7.2,pyasn1==0.1.9,pycparser==2.14,pycrypto==2.6.1,Pygments==2.1.3,PyMySQL==0.7.3,pyparsing==2.1.4,python-cicoclient==0.3.9,pytz==2016.4,PyYAML==3.11,requests==2.10.0,six==1.10.0,snowballstemmer==1.2.1,Sphinx==1.4.1,sphinx-rtd-theme==0.1.9,SQLAlchemy==1.0.13,stevedore==1.13.0,unicodecsv==0.14.1,-e git+https://github.com/redhat-openstack/weirdo.git@77084420349d74c7e5ce0594c62583c4ac0f6992#egg=weirdo-origin_master,Werkzeug==0.11.9,wheel==0.24.0,You are using pip version 7.1.2, however version 8.1.2 is available.,You should consider upgrading via the 'pip install --upgrade pip' command.
ansible-playbook runtests: PYTHONHASHSEED='3381671243'
ansible-playbook runtests: commands[0] | ansible-galaxy install -r ansible-role-requirements.yml --ignore-errors
- packstack is already installed, skipping.
- puppet-openstack is already installed, skipping.
- kolla is already installed, skipping.
- common is already installed, skipping.
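The [WARNING] lines at the start of the run below come from the ARA callback: the job points ARA at a sqlite database created by an earlier ARA release, so the INSERT into the playbooks table fails against the newer schema ("table playbooks has no column named path") and this run goes unrecorded. Since that database only holds ARA's playbook reporting data, one plausible recovery, assuming the hypothetical path below stands in for whatever database location ARA is actually configured with, is to remove the stale file so the callback recreates it with the current schema:

    # Sketch only: ARA_DB is a hypothetical placeholder for the configured
    # sqlite database path; deleting it loses only ARA's recorded reports.
    ARA_DB="$WORKSPACE/ara-reports.sqlite"
    rm -f "$ARA_DB"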
ansible-playbook runtests: commands[1] | ansible-playbook -vv -i hosts playbooks/logs-ci-centos.yml -e ci_environment=ci-centos
Using /home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo/playbooks/ansible.cfg as config file
[WARNING]: Failure when attempting to use callback plugin (): (sqlite3.OperationalError) table playbooks has no column named path [SQL: u'INSERT INTO playbooks (id, path, time_start, time_end) VALUES (?, ?, ?, ?)'] [parameters: ('594776e0-005b-46f2-b534-8e9f29137af7', 'playbooks/logs-ci-centos.yml', '2016-05-20 11:49:56.533163', None)]
1 plays in playbooks/logs-ci-centos.yml

PLAY [Recover logs from a ci.centos.org job] ***********************************
[WARNING]: Failure when attempting to use callback plugin (): This Session's transaction has been rolled back due to a previous exception during flush. To begin a new transaction with this Session, first issue Session.rollback(). Original exception was: (sqlite3.OperationalError) table playbooks has no column named path [SQL: u'INSERT INTO playbooks (id, path, time_start, time_end) VALUES (?, ?, ?, ?)'] [parameters: ('594776e0-005b-46f2-b534-8e9f29137af7', 'playbooks/logs-ci-centos.yml', '2016-05-20 11:49:56.533163', None)]

TASK [Fetch and gzip the console log] ******************************************
task path: /home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo/playbooks/logs-ci-centos.yml:30
Friday 20 May 2016 11:49:56 +0000 (0:00:00.107) 0:00:00.107 ************
changed: [n2.dusty] => {"changed": true, "cmd": "URL=\"https://ci.centos.org/job/weirdo-mitaka-promote-puppet-openstack-scenario002/245//consoleText\";\n curl $URL | gzip > /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/console.txt.gz", "delta": "0:00:00.289938", "end": "2016-05-20 12:49:57.141285", "rc": 0, "start": "2016-05-20 12:49:56.851347", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 1597k 0 1597k 0 0 5705k 0 --:--:-- --:--:-- --:--:-- 5725k", "stdout": "", "stdout_lines": [], "warnings": []}
cmd: URL="https://ci.centos.org/job/weirdo-mitaka-promote-puppet-openstack-scenario002/245//consoleText"; curl $URL | gzip > /var/log/weirdo/weirdo-mitaka-promote-puppet-openstack-scenario002/245/console.txt.gz
start: 2016-05-20 12:49:56.851347
end: 2016-05-20 12:49:57.141285
delta: 0:00:00.289938
stderr: % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 1597k 0 1597k 0 0 5705k 0 --:--:-- --:--:-- --:--:-- 5725k

TASK [Upload logs to the artifact server] **************************************
task path: /home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo/playbooks/logs-ci-centos.yml:38
Friday 20 May 2016 11:49:57 +0000 (0:00:00.560) 0:00:00.667 ************
changed: [n2.dusty] => {"changed": true, "cmd": "rsync -avzR /var/log/weirdo/./weirdo-mitaka-promote-puppet-openstack-scenario002/245 rdo@artifacts.ci.centos.org::rdo --stats", "delta": "0:00:01.119210", "end": "2016-05-20 12:49:58.520967", "rc": 0, "start": "2016-05-20 12:49:57.401757", "stderr": "", "stdout": "sending incremental file
list\nweirdo-mitaka-promote-puppet-openstack-scenario002/\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/ansible_hostvars.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/console.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/cpuinfo.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/df.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/dmesg.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/fdisk.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/getenforce.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/hosts.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/iptables.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/iptables_mangle.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/iptables_nat.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/journalctl.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/lsmod.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/lsof.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/lsof_network.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/meminfo.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/mounts.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/netstat.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/pstree.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/repolist.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/rpm_packages.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/sysctl.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/sysstat.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/uname.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/audit/\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/audit/audit.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/audit.log.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/cpuinfo.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/df.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/dstat.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/free.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/ps.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/rpm-qa.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/sudoers.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/syslog.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/tempest.conf.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/tempest.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/testrepository.subunit\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache/\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache/access_log\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache/default_error.txt.gz\nweirdo-mitaka-promote-p
uppet-openstack-scenario002/245/puppet-openstack/logs/apache/error_log\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache/ironic_wsgi_access_ssl.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache/ironic_wsgi_error_ssl.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache/keystone_wsgi_admin_access_ssl.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache/keystone_wsgi_admin_error_ssl.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache/keystone_wsgi_main_access_ssl.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache/keystone_wsgi_main_error_ssl.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache/nova_api_wsgi_access_ssl.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache/nova_api_wsgi_error_ssl.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache_config/\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache_config/10-ironic_wsgi.conf.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache_config/10-keystone_wsgi_admin.conf.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache_config/10-keystone_wsgi_main.conf.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache_config/10-nova_api_wsgi.conf.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache_config/15-default.conf.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/cinder/\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/cinder/api.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/cinder/cinder-manage.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/cinder/scheduler.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/cinder/volume.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/cinder/\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/cinder/api-paste.ini.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/cinder/cinder.conf.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/cinder/policy.json.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/cinder/rootwrap.conf.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/cinder/rootwrap.d/\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/cinder/volumes/\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/glance-api.conf.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/glance-cache.conf.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/glance-glare.conf.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/glance-registry.c
onf.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/glance-scrubber.conf.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/glance-swift.conf.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/policy.json.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/schema-image.json.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/metadefs/\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/metadefs/cim-processor-allocation-setting-data.json.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/metadefs/cim-resource-allocation-setting-data.json.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/metadefs/cim-storage-allocation-setting-data.json.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/metadefs/cim-virtual-system-setting-data.json.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/metadefs/compute-aggr-disk-filter.json.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/metadefs/compute-aggr-iops-filter.json.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/metadefs/compute-aggr-num-instances.json.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/metadefs/compute-cpu-pinning.json.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/metadefs/compute-guest-memory-backing.json.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/metadefs/compute-guest-shutdown.json.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/metadefs/compute-host-capabilities.json.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/metadefs/compute-hypervisor.json.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/metadefs/compute-instance-data.json.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/metadefs/compute-libvirt-image.json.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/metadefs/compute-libvirt.json.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/metadefs/compute-quota.json.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/metadefs/compute-randomgen.json.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/metadefs/compute-trust.json.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/metadefs/compute-vcputopology.json.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/metadefs/compute-vmware-flavor.json.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/metadefs/compute-vmware-quota-flavor.json.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/metadefs/compute-vmware.json.txt.gz\nweirdo-mitaka-promote-puppet-openstack-sce
nario002/245/puppet-openstack/logs/etc/glance/metadefs/compute-watchdog.json.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/metadefs/compute-xenapi.json.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/metadefs/glance-common-image-props.json.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/metadefs/operating-system.json.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/metadefs/software-databases.json.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/metadefs/software-runtimes.json.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/metadefs/software-webservers.json.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/metadefs/storage-volume-type.json.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/ssl/\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/ssl/private/\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/ssl/private/n2.dusty.ci.centos.org.pem.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/ironic/\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/ironic/ironic.conf.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/ironic/policy.json.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/ironic/rootwrap.conf.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/ironic/rootwrap.d/\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/ironic/rootwrap.d/ironic-images.filters.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/ironic/rootwrap.d/ironic-lib.filters.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/ironic/rootwrap.d/ironic-utils.filters.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/ironic/ssl/\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/ironic/ssl/private/\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/ironic/ssl/private/n2.dusty.ci.centos.org.pem.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/keystone/\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/keystone/default_catalog.templates.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/keystone/keystone-paste.ini.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/keystone/keystone.conf.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/keystone/logging.conf.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/keystone/policy.json.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/keystone/sso_callback_template.html.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/keystone/ssl/\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/keystone/ssl/private/\nweirdo-mitak
a-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/keystone/ssl/private/n2.dusty.ci.centos.org.pem.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/api-paste.ini.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/dhcp_agent.ini.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/fwaas_driver.ini.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/l3_agent.ini.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/lbaas_agent.ini.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/metadata_agent.ini.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/metering_agent.ini.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/neutron.conf.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/neutron_lbaas.conf.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/policy.json.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/rootwrap.conf.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/services_lbaas.conf.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/conf.d/\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/conf.d/README.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/conf.d/common/\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/conf.d/neutron-dhcp-agent/\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/conf.d/neutron-l3-agent/\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/conf.d/neutron-lbaas-agent/\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/conf.d/neutron-lbaasv2-agent/\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/conf.d/neutron-linuxbridge-cleanup/\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/conf.d/neutron-metadata-agent/\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/conf.d/neutron-metering-agent/\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/conf.d/neutron-netns-cleanup/\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/conf.d/neutron-openvswitch-agent/\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/conf.d/neutron-ovs-cleanup/\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/conf.d/neutron-server/\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/plugins/\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/plugins/ml2/\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/plugins/ml2/ml2_conf.ini.txt.gz\nweirdo-mitaka-promote-puppet-open
stack-scenario002/245/puppet-openstack/logs/etc/neutron/plugins/ml2/ml2_conf_sriov.ini.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/plugins/ml2/openvswitch_agent.ini.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/plugins/ml2/sriov_agent.ini.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/nova/\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/nova/api-paste.ini.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/nova/nova.conf.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/nova/policy.json.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/nova/release.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/nova/rootwrap.conf.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/nova/ssl/\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/nova/ssl/private/\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/nova/ssl/private/n2.dusty.ci.centos.org.pem.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/swift/\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/swift/account-server.conf.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/swift/account.builder.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/swift/account.ring.gz.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/swift/container-reconciler.conf.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/swift/container-server.conf.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/swift/container.builder.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/swift/container.ring.gz.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/swift/object-expirer.conf.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/swift/object-server.conf.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/swift/object.builder.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/swift/object.ring.gz.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/swift/proxy-server.conf.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/swift/swift.conf.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/swift/account-server/\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/swift/backups/\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/swift/backups/1463743679.container.builder.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/swift/backups/1463743679.object.builder.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/swift/backups/1463743700.container.builder.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/swift/backups/1463743700.conta
iner.ring.gz.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/swift/backups/1463743719.object.builder.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/swift/backups/1463743719.object.ring.gz.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/swift/backups/1463743729.account.builder.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/swift/backups/1463743745.account.builder.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/swift/backups/1463743745.account.ring.gz.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/swift/container-server/\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/swift/object-server/\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/swift/proxy-server/\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/glance/\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/glance/api.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/glance/registry.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/ironic/\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/ironic/app.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/ironic/ironic-conductor.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/ironic/ironic-dbsync.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/keystone/\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/keystone/keystone.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-00000001.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-00000002.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-00000003.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-00000004.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-00000005.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-00000006.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-00000007.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-00000008.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-00000009.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-0000000a.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-0000000b.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-0000000c.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-0000000d.txt.gz\nweirdo-mitaka-pro
mote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-0000000e.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-0000000f.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/dhcp-agent.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/l3-agent.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/lbaas-agent.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/metadata-agent.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/metering-agent.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-0482c0a4-c3ff-4e4f-b9d2-1840f91820f3.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-04c5d21a-4ce9-48bc-9ebd-760d57d6f181.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-241bdf5e-518e-4b0e-a2de-3c66ae07ff1a.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-2ceb5c9c-90bd-4773-a918-66aa6dbcfa4d.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-3444e121-20ca-4d43-bc85-250291b85adf.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-43f26813-0feb-402a-bc02-f6a443073330.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-4e0cd0d8-1601-47d0-9820-d7e73f9bf3df.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-507c5b71-cfff-4102-9946-28e8e2327626.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-52f14bea-16c9-450f-b2eb-c38128712134.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-57f1109f-c393-4fb6-8c44-51d0cfbab6a0.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-61fa4559-4a33-4c9c-9aae-787631907b72.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-626609dd-d1cb-4f30-abe3-bde4810fa59b.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-63a1d182-dee3-4268-bd18-bf7fbce7c030.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-65c24824-1749-4f9b-a3ce-825f3d91070e.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-7483ed8c-9a86-48c8-8f2f-c86314a8fd89.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-7e360df8-ef95-43f0-b078-00a48543d918.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-83e52736-3b87-4753-a58a-223b725f50a3.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/lo
gs/neutron/neutron-ns-metadata-proxy-8f917175-7007-4c3b-9c9b-d18448bd1139.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-abb4fa75-9747-4a73-9e2d-92f30aea7bd8.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-ade8d50d-fea6-41a4-b385-35625210cb2a.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-ba272c2c-1ea6-4c14-9b5c-78cae90d82b7.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-bb5ad119-e2c2-487c-a81b-486691e2d2c3.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-c1ad332d-7462-4167-a25a-32bef965a872.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-c5966564-d7eb-4943-8b2c-2971587bb84e.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-c764cc0f-af8c-4138-a98b-3bc8c2ebd200.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-c9135f9e-e3b5-45bc-a377-09e80f3e46f9.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-cae8fa4f-49b1-45ae-b151-2721639d9ee6.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-d5cd80b6-00c9-4002-b7c1-3f0e89fe6b22.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-dc80ad39-b9a7-41bf-a792-cf8721e7ba5e.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-e8957ee5-60b7-4a76-b725-66b6fd770c16.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-ebe9ce7b-1d0c-4e2a-9545-f15e4c4a7e5e.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-ef0e2587-c553-4913-8c7c-e6cfd6fea623.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-f906b905-159d-4308-bf7e-1eafb8888246.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-f986b546-cdf7-4c8f-8fab-65ed694fddf4.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/openvswitch-agent.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/server.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/nova/\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/nova/nova-api.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/nova/nova-compute.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/nova/nova-conductor.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/nova/nova-consoleauth.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/nova/nova-manage.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/nova/nova-novncproxy.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppe
t-openstack/logs/nova/nova-scheduler.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/openvswitch/\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/openvswitch/ovs-vswitchd.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/openvswitch/ovsdb-server.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/rabbitmq/\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/rabbitmq/rabbit@n2-sasl.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/rabbitmq/rabbit@n2.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/sudoers.d/\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/sudoers.d/cinder.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/sudoers.d/ironic.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/sudoers.d/neutron.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/sudoers.d/nova.txt.gz\nweirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/swift/\n\nNumber of files: 295\nNumber of files transferred: 236\nTotal file size: 7055391 bytes\nTotal transferred file size: 7055391 bytes\nLiteral data: 7055391 bytes\nMatched data: 0 bytes\nFile list size: 8542\nFile list generation time: 0.001 seconds\nFile list transfer time: 0.000 seconds\nTotal bytes sent: 7076713\nTotal bytes received: 4728\n\nsent 7076713 bytes received 4728 bytes 4720960.67 bytes/sec\ntotal size is 7055391 speedup is 1.00", "warnings": ["Consider using synchronize module rather than running rsync"]}
[WARNING]: Consider using synchronize module rather than running rsync
cmd: rsync -avzR /var/log/weirdo/./weirdo-mitaka-promote-puppet-openstack-scenario002/245 rdo@artifacts.ci.centos.org::rdo --stats
start: 2016-05-20 12:49:57.401757
end: 2016-05-20 12:49:58.520967
delta: 0:00:01.119210
stdout: sending incremental file list
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/audit.log.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/cpuinfo.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/df.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/dstat.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/free.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/ps.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/rpm-qa.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/sudoers.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/syslog.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/tempest.conf.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/tempest.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/testrepository.subunit weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache/ weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache/access_log weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache/default_error.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache/error_log weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache/ironic_wsgi_access_ssl.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache/ironic_wsgi_error_ssl.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache/keystone_wsgi_admin_access_ssl.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache/keystone_wsgi_admin_error_ssl.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache/keystone_wsgi_main_access_ssl.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache/keystone_wsgi_main_error_ssl.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache/nova_api_wsgi_access_ssl.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache/nova_api_wsgi_error_ssl.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache_config/ weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache_config/10-ironic_wsgi.conf.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache_config/10-keystone_wsgi_admin.conf.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache_config/10-keystone_wsgi_main.conf.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache_config/10-nova_api_wsgi.conf.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/apache_config/15-default.conf.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/cinder/ weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/cinder/api.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/cinder/cinder-manage.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/cinder/scheduler.txt.gz 
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/cinder/volume.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/ weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/cinder/ weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/cinder/api-paste.ini.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/cinder/cinder.conf.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/cinder/policy.json.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/cinder/rootwrap.conf.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/cinder/rootwrap.d/ weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/cinder/volumes/ weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/ weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/glance-api.conf.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/glance-cache.conf.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/glance-glare.conf.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/glance-registry.conf.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/glance-scrubber.conf.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/glance-swift.conf.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/policy.json.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/schema-image.json.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/metadefs/ weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/metadefs/cim-processor-allocation-setting-data.json.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/metadefs/cim-resource-allocation-setting-data.json.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/metadefs/cim-storage-allocation-setting-data.json.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/metadefs/cim-virtual-system-setting-data.json.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/metadefs/compute-aggr-disk-filter.json.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/metadefs/compute-aggr-iops-filter.json.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/metadefs/compute-aggr-num-instances.json.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/metadefs/compute-cpu-pinning.json.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/metadefs/compute-guest-memory-backing.json.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/metadefs/compute-guest-shutdown.json.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/metadefs/compute-host-capabilities.json.txt.gz 
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/metadefs/compute-hypervisor.json.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/metadefs/compute-instance-data.json.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/metadefs/compute-libvirt-image.json.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/metadefs/compute-libvirt.json.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/metadefs/compute-quota.json.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/metadefs/compute-randomgen.json.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/metadefs/compute-trust.json.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/metadefs/compute-vcputopology.json.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/metadefs/compute-vmware-flavor.json.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/metadefs/compute-vmware-quota-flavor.json.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/metadefs/compute-vmware.json.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/metadefs/compute-watchdog.json.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/metadefs/compute-xenapi.json.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/metadefs/glance-common-image-props.json.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/metadefs/operating-system.json.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/metadefs/software-databases.json.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/metadefs/software-runtimes.json.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/metadefs/software-webservers.json.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/metadefs/storage-volume-type.json.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/ssl/ weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/ssl/private/ weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/glance/ssl/private/n2.dusty.ci.centos.org.pem.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/ironic/ weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/ironic/ironic.conf.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/ironic/policy.json.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/ironic/rootwrap.conf.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/ironic/rootwrap.d/ weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/ironic/rootwrap.d/ironic-images.filters.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/ironic/rootwrap.d/ironic-lib.filters.txt.gz 
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/ironic/rootwrap.d/ironic-utils.filters.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/ironic/ssl/ weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/ironic/ssl/private/ weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/ironic/ssl/private/n2.dusty.ci.centos.org.pem.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/keystone/ weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/keystone/default_catalog.templates.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/keystone/keystone-paste.ini.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/keystone/keystone.conf.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/keystone/logging.conf.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/keystone/policy.json.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/keystone/sso_callback_template.html.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/keystone/ssl/ weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/keystone/ssl/private/ weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/keystone/ssl/private/n2.dusty.ci.centos.org.pem.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/ weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/api-paste.ini.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/dhcp_agent.ini.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/fwaas_driver.ini.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/l3_agent.ini.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/lbaas_agent.ini.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/metadata_agent.ini.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/metering_agent.ini.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/neutron.conf.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/neutron_lbaas.conf.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/policy.json.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/rootwrap.conf.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/services_lbaas.conf.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/conf.d/ weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/conf.d/README.txt.gz weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/conf.d/common/ weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/conf.d/neutron-dhcp-agent/ weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/conf.d/neutron-l3-agent/ 
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/conf.d/neutron-lbaas-agent/
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/conf.d/neutron-lbaasv2-agent/
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/conf.d/neutron-linuxbridge-cleanup/
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/conf.d/neutron-metadata-agent/
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/conf.d/neutron-metering-agent/
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/conf.d/neutron-netns-cleanup/
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/conf.d/neutron-openvswitch-agent/
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/conf.d/neutron-ovs-cleanup/
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/conf.d/neutron-server/
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/plugins/
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/plugins/ml2/
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/plugins/ml2/ml2_conf.ini.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/plugins/ml2/ml2_conf_sriov.ini.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/plugins/ml2/openvswitch_agent.ini.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/neutron/plugins/ml2/sriov_agent.ini.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/nova/
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/nova/api-paste.ini.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/nova/nova.conf.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/nova/policy.json.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/nova/release.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/nova/rootwrap.conf.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/nova/ssl/
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/nova/ssl/private/
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/nova/ssl/private/n2.dusty.ci.centos.org.pem.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/swift/
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/swift/account-server.conf.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/swift/account.builder.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/swift/account.ring.gz.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/swift/container-reconciler.conf.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/swift/container-server.conf.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/swift/container.builder.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/swift/container.ring.gz.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/swift/object-expirer.conf.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/swift/object-server.conf.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/swift/object.builder.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/swift/object.ring.gz.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/swift/proxy-server.conf.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/swift/swift.conf.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/swift/account-server/
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/swift/backups/
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/swift/backups/1463743679.container.builder.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/swift/backups/1463743679.object.builder.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/swift/backups/1463743700.container.builder.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/swift/backups/1463743700.container.ring.gz.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/swift/backups/1463743719.object.builder.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/swift/backups/1463743719.object.ring.gz.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/swift/backups/1463743729.account.builder.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/swift/backups/1463743745.account.builder.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/swift/backups/1463743745.account.ring.gz.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/swift/container-server/
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/swift/object-server/
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/etc/swift/proxy-server/
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/glance/
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/glance/api.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/glance/registry.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/ironic/
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/ironic/app.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/ironic/ironic-conductor.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/ironic/ironic-dbsync.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/keystone/
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/keystone/keystone.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-00000001.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-00000002.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-00000003.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-00000004.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-00000005.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-00000006.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-00000007.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-00000008.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-00000009.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-0000000a.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-0000000b.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-0000000c.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-0000000d.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-0000000e.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/libvirt/qemu/instance-0000000f.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/dhcp-agent.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/l3-agent.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/lbaas-agent.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/metadata-agent.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/metering-agent.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-0482c0a4-c3ff-4e4f-b9d2-1840f91820f3.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-04c5d21a-4ce9-48bc-9ebd-760d57d6f181.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-241bdf5e-518e-4b0e-a2de-3c66ae07ff1a.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-2ceb5c9c-90bd-4773-a918-66aa6dbcfa4d.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-3444e121-20ca-4d43-bc85-250291b85adf.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-43f26813-0feb-402a-bc02-f6a443073330.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-4e0cd0d8-1601-47d0-9820-d7e73f9bf3df.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-507c5b71-cfff-4102-9946-28e8e2327626.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-52f14bea-16c9-450f-b2eb-c38128712134.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-57f1109f-c393-4fb6-8c44-51d0cfbab6a0.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-61fa4559-4a33-4c9c-9aae-787631907b72.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-626609dd-d1cb-4f30-abe3-bde4810fa59b.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-63a1d182-dee3-4268-bd18-bf7fbce7c030.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-65c24824-1749-4f9b-a3ce-825f3d91070e.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-7483ed8c-9a86-48c8-8f2f-c86314a8fd89.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-7e360df8-ef95-43f0-b078-00a48543d918.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-83e52736-3b87-4753-a58a-223b725f50a3.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-8f917175-7007-4c3b-9c9b-d18448bd1139.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-abb4fa75-9747-4a73-9e2d-92f30aea7bd8.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-ade8d50d-fea6-41a4-b385-35625210cb2a.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-ba272c2c-1ea6-4c14-9b5c-78cae90d82b7.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-bb5ad119-e2c2-487c-a81b-486691e2d2c3.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-c1ad332d-7462-4167-a25a-32bef965a872.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-c5966564-d7eb-4943-8b2c-2971587bb84e.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-c764cc0f-af8c-4138-a98b-3bc8c2ebd200.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-c9135f9e-e3b5-45bc-a377-09e80f3e46f9.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-cae8fa4f-49b1-45ae-b151-2721639d9ee6.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-d5cd80b6-00c9-4002-b7c1-3f0e89fe6b22.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-dc80ad39-b9a7-41bf-a792-cf8721e7ba5e.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-e8957ee5-60b7-4a76-b725-66b6fd770c16.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-ebe9ce7b-1d0c-4e2a-9545-f15e4c4a7e5e.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-ef0e2587-c553-4913-8c7c-e6cfd6fea623.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-f906b905-159d-4308-bf7e-1eafb8888246.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/neutron-ns-metadata-proxy-f986b546-cdf7-4c8f-8fab-65ed694fddf4.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/openvswitch-agent.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/neutron/server.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/nova/
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/nova/nova-api.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/nova/nova-compute.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/nova/nova-conductor.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/nova/nova-consoleauth.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/nova/nova-manage.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/nova/nova-novncproxy.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/nova/nova-scheduler.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/openvswitch/
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/openvswitch/ovs-vswitchd.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/openvswitch/ovsdb-server.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/rabbitmq/
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/rabbitmq/rabbit@n2-sasl.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/rabbitmq/rabbit@n2.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/sudoers.d/
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/sudoers.d/cinder.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/sudoers.d/ironic.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/sudoers.d/neutron.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/sudoers.d/nova.txt.gz
weirdo-mitaka-promote-puppet-openstack-scenario002/245/puppet-openstack/logs/swift/

Number of files: 295
Number of files transferred: 236
Total file size: 7055391 bytes
Total transferred file size: 7055391 bytes
Literal data: 7055391 bytes
Matched data: 0 bytes
File list size: 8542
File list generation time: 0.001 seconds
File list transfer time: 0.000 seconds
Total bytes sent: 7076713
Total bytes received: 4728

sent 7076713 bytes  received 4728 bytes  4720960.67 bytes/sec
total size is 7055391  speedup is 1.00

PLAY RECAP *********************************************************************
n2.dusty                   : ok=2    changed=2    unreachable=0    failed=0

Friday 20 May 2016  11:49:58 +0000 (0:00:01.381)       0:00:02.049 ************
===============================================================================
TASK: Upload logs to the artifact server -------------------------------- 1.38s
TASK: Fetch and gzip the console log ------------------------------------ 0.56s
___________________________________ summary ____________________________________
  ansible-playbook: commands succeeded
  congratulations :)
+ popd
~/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002
+ SSID_FILE=/home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo/cico-ssid
++ cat /home/rhos-ci/workspace/weirdo-mitaka-promote-puppet-openstack-scenario002/weirdo/cico-ssid
+ for ssid in '$(cat ${SSID_FILE})'
+ cico -q node done 5ad1554c
+---------+----------+-------------+---------+------------+---------------+----------+--------+------+----------------+--------------+-----------+
| host_id | hostname | ip_address  | chassis | used_count | current_state | comment  | distro | rel  | centos_version | architecture | node_pool |
+---------+----------+-------------+---------+------------+---------------+----------+--------+------+----------------+--------------+-----------+
| 193     | n2.dusty | 172.19.2.66 | dusty   | 151        | Deployed      | 5ad1554c | None   | None | 7              | x86_64       | 0         |
+---------+----------+-------------+---------+------------+---------------+----------+--------+------+----------------+--------------+-----------+
POST BUILD TASK : SUCCESS
END OF POST BUILD TASK : 0
Finished: FAILURE