Description of problem:
During the initial setup of hosted-engine, you are prompted for the admin API password. Then, when actually configuring RHEVM, you are prompted again. If the password entered in the first step (on the host) does not match the password entered on the manager, the host will attempt once and then fail. Because there is no "resume" or "try again" option, the only recovery path is to rebuild the manager from scratch, which is an extremely tedious and time-consuming task.

Version-Release number of selected component (if applicable):

How reproducible:
100%

Steps to Reproduce:
1. Enter a password for the admin API when prompted by hosted-engine --deploy
2. Enter a different password on the manager during engine-setup

Actual results:
The host fails to add itself to the engine, and the process aborts with no opportunity for recovery.

Expected results:
The host prompts again for admin API credentials, giving the user an opportunity to continue without having to start over.
Oh, one more bit on "Expected Results". I would like the hosted-engine script to let the user know that the password failed and give them the option to try again or abort. Thanks! ~james
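The retry-or-abort behavior requested above could be sketched roughly as follows. This is a hypothetical illustration, not the actual ovirt-hosted-engine-setup code; the function names (authenticate, prompt_password, ask_retry) are made up for the example:

```python
# Hypothetical sketch of the requested behavior: keep prompting for the
# admin API password until the engine accepts it, or the user aborts.
# None of these callables exist in the real setup tool; they stand in for
# the engine-authentication attempt and the interactive prompts.

def add_host_with_retry(authenticate, prompt_password, ask_retry):
    """Try to register the host with the engine, re-prompting on failure.

    authenticate(password) -> bool  # True if the engine accepts the password
    prompt_password() -> str        # ask the user for the admin API password
    ask_retry() -> bool             # True to try again, False to abort
    """
    password = prompt_password()
    while not authenticate(password):
        print("Authentication with the engine failed.")
        if not ask_retry():
            return False  # user chose to abort the deployment
        password = prompt_password()
    return True  # host successfully added to the engine
```

With this structure, a password mismatch costs the user one extra prompt instead of a full redeployment of the manager.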
Works for me on these components:

# rpm -qa vdsm libvirt* sanlock* qemu-kvm* ovirt* mom
libvirt-daemon-driver-interface-1.2.8-16.el7_1.3.x86_64
vdsm-4.17.0-822.git9b11a18.el7.noarch
sanlock-python-3.2.2-2.el7.x86_64
qemu-kvm-common-ev-2.1.2-23.el7_1.3.1.x86_64
libvirt-python-1.2.8-7.el7_1.1.x86_64
libvirt-daemon-kvm-1.2.8-16.el7_1.3.x86_64
ovirt-release-master-001-0.8.master.noarch
ovirt-engine-sdk-python-3.6.0.0-0.13.20150515.gitdd15fbf.el7.centos.noarch
libvirt-client-1.2.8-16.el7_1.3.x86_64
libvirt-daemon-config-nwfilter-1.2.8-16.el7_1.3.x86_64
libvirt-daemon-driver-storage-1.2.8-16.el7_1.3.x86_64
libvirt-daemon-driver-network-1.2.8-16.el7_1.3.x86_64
sanlock-lib-3.2.2-2.el7.x86_64
qemu-kvm-ev-2.1.2-23.el7_1.3.1.x86_64
libvirt-daemon-1.2.8-16.el7_1.3.x86_64
libvirt-lock-sanlock-1.2.8-16.el7_1.3.x86_64
libvirt-daemon-driver-secret-1.2.8-16.el7_1.3.x86_64
libvirt-daemon-driver-qemu-1.2.8-16.el7_1.3.x86_64
ovirt-hosted-engine-setup-1.3.0-0.0.master.20150518075146.gitdd9741f.el7.noarch
libvirt-daemon-driver-nwfilter-1.2.8-16.el7_1.3.x86_64
mom-0.4.4-0.0.master.20150515133332.git2d32797.el7.noarch
qemu-kvm-tools-ev-2.1.2-23.el7_1.3.1.x86_64
ovirt-host-deploy-1.4.0-0.0.master.20150505205623.giteabc23b.el7.noarch
libvirt-daemon-driver-nodedev-1.2.8-16.el7_1.3.x86_64
ovirt-hosted-engine-ha-1.3.0-0.0.master.20150424113553.20150424113551.git7c14f4c.el7.noarch
sanlock-3.2.2-2.el7.x86_64

I followed the reproduction steps exactly. During deployment, the process detected that the engine's password differed from the one it was given, and I was asked to enter another password. I provided the password previously entered on the engine, and the deployment finished successfully.
Components installed on engine:

ovirt-engine-cli-3.6.0.0-0.2.20150518.gite3609e3.el6.noarch
ovirt-engine-setup-base-3.6.0-0.0.master.20150519172219.git9a2e2b3.el6.noarch
ovirt-host-deploy-java-1.4.0-0.0.master.20150515080139.giteabc23b.el6.noarch
ovirt-engine-userportal-3.6.0-0.0.master.20150519172219.git9a2e2b3.el6.noarch
ovirt-engine-setup-plugin-ovirt-engine-3.6.0-0.0.master.20150519172219.git9a2e2b3.el6.noarch
ovirt-engine-sdk-python-3.6.0.0-0.13.20150515.gitdd15fbf.el6.noarch
ovirt-iso-uploader-3.6.0-0.0.master.20150410142241.git1a680f9.el6.noarch
ovirt-engine-extensions-api-impl-3.6.0-0.0.master.20150519172219.git9a2e2b3.el6.noarch
ovirt-engine-setup-plugin-ovirt-engine-common-3.6.0-0.0.master.20150519172219.git9a2e2b3.el6.noarch
ovirt-engine-websocket-proxy-3.6.0-0.0.master.20150519172219.git9a2e2b3.el6.noarch
ovirt-engine-tools-3.6.0-0.0.master.20150519172219.git9a2e2b3.el6.noarch
ovirt-engine-backend-3.6.0-0.0.master.20150519172219.git9a2e2b3.el6.noarch
ovirt-engine-setup-3.6.0-0.0.master.20150519172219.git9a2e2b3.el6.noarch
ovirt-engine-webadmin-portal-3.6.0-0.0.master.20150519172219.git9a2e2b3.el6.noarch
ovirt-release-master-001-0.8.master.noarch
ovirt-host-deploy-1.4.0-0.0.master.20150515080139.giteabc23b.el6.noarch
ovirt-image-uploader-3.6.0-0.0.master.20150128151752.git3f60704.el6.noarch
ovirt-engine-lib-3.6.0-0.0.master.20150519172219.git9a2e2b3.el6.noarch
ovirt-engine-jboss-as-7.1.1-1.el6.x86_64
ovirt-engine-setup-plugin-websocket-proxy-3.6.0-0.0.master.20150519172219.git9a2e2b3.el6.noarch
ovirt-engine-restapi-3.6.0-0.0.master.20150519172219.git9a2e2b3.el6.noarch
ovirt-engine-dbscripts-3.6.0-0.0.master.20150519172219.git9a2e2b3.el6.noarch
ovirt-engine-3.6.0-0.0.master.20150519172219.git9a2e2b3.el6.noarch
ovirt-guest-agent-1.0.10.2-1.el6.noarch
Since the problem described in this bug report should be resolved in a recent advisory, it has been closed with a resolution of ERRATA. For information on the advisory, and where to find the updated files, follow the link below. If the solution does not work for you, open a new bug report.

https://rhn.redhat.com/errata/RHEA-2016-0375.html