Bug 1397170 - Mitaka undercloud install fails with version lock
Summary: Mitaka undercloud install fails with version lock
Keywords:
Status: CLOSED WONTFIX
Alias: None
Product: Red Hat OpenStack
Classification: Red Hat
Component: instack
Version: 9.0 (Mitaka)
Hardware: x86_64
OS: Linux
Priority: high
Severity: urgent
Target Milestone: ---
Target Release: 9.0 (Mitaka)
Assignee: James Slagle
QA Contact: Shai Revivo
URL:
Whiteboard:
Depends On:
Blocks: 1321607 1373538
 
Reported: 2016-11-21 18:52 UTC by Satish
Modified: 2018-03-19 20:27 UTC
CC List: 4 users

Fixed In Version:
Doc Type: If docs needed, set a value
Doc Text:
Clone Of:
Environment:
Last Closed: 2018-03-19 20:27:43 UTC
Target Upstream Version:



Description Satish 2016-11-21 18:52:28 UTC
Description of problem: 
I am trying to install a Mitaka undercloud on a RHEL 7.2 host OS with yum versionlock enabled.

Steps followed:
1. Performed a minimal install on a new director node.
2. Generated a versionlock file on a working Mitaka + RHEL 7.2 testbed and
   copied it to the new director (/etc/yum/pluginconf.d/versionlock.list).
3. Ran "yum update -all", which returned the following error:

--> Processing Dependency: libipset.so.3(LIBIPSET_1.0)(64bit) for package: ipset-6.19-4.el7.x86_64
--> Processing Dependency: libipset.so.3(LIBIPSET_2.0)(64bit) for package: ipset-6.19-4.el7.x86_64
--> Processing Dependency: libipset.so.3(LIBIPSET_3.0)(64bit) for package: ipset-6.19-4.el7.x86_64
--> Processing Dependency: libipset.so.3()(64bit) for package: ipset-6.19-4.el7.x86_64
---> Package python-firewall.noarch 0:0.4.3.2-8.el7 will be installed
--> Running transaction check
---> Package ipset-libs.x86_64 0:6.19-4.el7 will be installed
--> Processing Conflict: firewalld-0.4.3.2-8.el7.noarch conflicts NetworkManager < 1:1.4.0-3.el7
--> Processing Conflict: firewalld-0.4.3.2-8.el7.noarch conflicts selinux-policy < 3.13.1-89
--> Finished Dependency Resolution
Error: firewalld conflicts with selinux-policy-3.13.1-60.el7_2.9.noarch
Error: firewalld conflicts with 1:NetworkManager-1.0.6-31.el7_2.x86_64
 You could try using --skip-broken to work around the problem
 You could try running: rpm -Va --nofiles --nodigest

4. Used the --skip-broken option with "yum update -all" to get past the above error.
5. The tripleoclient installation now fails:

Notice: /Stage[main]/Heat::Deps/Anchor[heat::service::end]: Triggered 'refresh' from 3 events
Notice: Finished catalog run in 613.65 seconds
+ rc=6
+ set -e
+ echo 'puppet apply exited with exit code 6'
puppet apply exited with exit code 6
+ '[' 6 '!=' 2 -a 6 '!=' 0 ']'
+ exit 6
[2016-11-19 13:10:00,226] (os-refresh-config) [ERROR] during configure phase. [Command '['dib-run-parts', '/usr/libexec/os-refresh-config/configure.d']' returned non-zero exit status 6]

[2016-11-19 13:10:00,226] (os-refresh-config) [ERROR] Aborting...
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/usr/lib/python2.7/site-packages/instack_undercloud/undercloud.py", line 845, in install
    _run_orc(instack_env)
  File "/usr/lib/python2.7/site-packages/instack_undercloud/undercloud.py", line 735, in _run_orc
    _run_live_command(args, instack_env, 'os-refresh-config')
  File "/usr/lib/python2.7/site-packages/instack_undercloud/undercloud.py", line 406, in _run_live_command
    raise RuntimeError('%s failed. See log for details.' % name)
RuntimeError: os-refresh-config failed. See log for details.
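For reference, the Conflicts lines in the yum output can be checked mechanically. The sketch below uses a simplified version comparison (not the full rpmvercmp algorithm; version strings are copied by hand from the log above) to show that both locked packages fall below the thresholds firewalld-0.4.3.2-8.el7 requires:

```python
# Simplified EVR (epoch:version-release) comparison; a sketch, not full rpmvercmp.
def evr(epoch, version, release):
    """Represent epoch:version-release as a tuple of integer segments."""
    seg = lambda s: tuple(int(p) for p in s.split(".") if p.isdigit())
    return (epoch, seg(version), seg(release))

# firewalld-0.4.3.2-8.el7 conflicts with NetworkManager < 1:1.4.0-3.el7;
# the versionlock pins NetworkManager at 1:1.0.6-31.el7_2.
print(evr(1, "1.0.6", "31.el7_2") < evr(1, "1.4.0", "3.el7"))   # → True (below threshold)

# It also conflicts with selinux-policy < 3.13.1-89;
# the versionlock pins selinux-policy at 3.13.1-60.el7_2.9.
print(evr(0, "3.13.1", "60.el7_2.9") < evr(0, "3.13.1", "89"))  # → True (below threshold)
```

Because both locked packages are older than firewalld's Conflicts thresholds, yum cannot resolve the transaction without either breaking the lock or skipping firewalld.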

Version-Release number of selected component (if applicable):


How reproducible:



Steps to Reproduce:
(Same as steps 1-5 and the output in the Description above.)


Actual results:


Expected results:


Additional info:

Comment 1 Steve Baker 2018-03-19 20:27:43 UTC
We're not able to test or support this, so it is not clear how to fix it.

