Description of problem:

DEBUG util.py:421:  Error: Package: hypre-mpich-devel-2.11.1-4.el7.x86_64 (build)
DEBUG util.py:421:             Requires: mpich-devel(x86-64)

Version-Release number of selected component (if applicable):
3.0.4-10.el7
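The DEBUG lines above come from a mock build log. A minimal way to reproduce, assuming mock is set up for EPEL 7 builds (the SRPM filename is inferred from the package name above):

$ mock -r epel-7-x86_64 --rebuild hypre-2.11.1-4.el7.src.rpm

The build should fail at the dependency-resolution stage with the error shown.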
This is a regression from 7.2:

# repoquery --whatprovides mpich-devel'(x86-64)'
mpich-devel-0:3.0.4-8.el7.x86_64
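On a 7.3 system the same query should come back empty, since mpich-3.0-devel no longer carries the arch-specific provide (expected behaviour, not captured output):

# repoquery --whatprovides mpich-devel'(x86-64)'

while the unversioned name is still provided:

# repoquery --whatprovides mpich-devel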
Note that mpich-3.0-devel does provide mpich-devel, just not the mpich-devel%{?_isa} explicitly required by hypre-mpich-devel in the spec.
True, but the arch-specific form is what should be used according to the Fedora packaging guidelines: https://fedoraproject.org/wiki/Packaging:Guidelines#Package_dependencies
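For reference, the guidelines have -devel subpackages pin their dependencies on other -devel packages to the architecture; in a spec that looks like this (a generic sketch, not hypre's actual spec file):

# expands to e.g. mpich-devel(x86-64) on x86_64, and to nothing if _isa is undefined
Requires: mpich-devel%{?_isa}

rather than the bare form:

Requires: mpich-devel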
(In reply to Yaakov Selkowitz from comment #2)
> Note that mpich-3.0-devel does provide mpich-devel, just not the
> mpich-devel%{?_isa} explicitly required by hypre-mpich-devel in the spec.

The 3.0.4-10.el7 package I'm looking at does, but that gets obsoleted. The instability of RHEL MPI packages is a huge pain for anyone trying to run a typical HPC service using them.
No solution for this issue?
Well, in the immediate term hypre needs to change the Requires, assuming that it will take a while to get this fixed in RHEL 7. Something like:

%if 0%{?el7}
# https://bugzilla.redhat.com/show_bug.cgi?id=1397192
Requires: mpich-devel
%else
Requires: mpich-devel%{?_isa}
%endif

should do the trick.

Also a possibility - one might want to build both mpich-3.0 and mpich-3.2 versions.

I'll leave these up to Dave.
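Once rebuilt with the conditional above, the change can be sanity-checked against the binary package (filename glob hypothetical):

# rpm -qp --requires hypre-mpich-devel-2.11.1-*.el7.x86_64.rpm | grep mpich-devel

which should now show the bare mpich-devel rather than mpich-devel(x86-64).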
(In reply to Orion Poplawski from comment #6)
> Well, in the immediate term hypre needs to change the Requires, assuming
> that it will take a while to get this fixed in RHEL 7. Something like:
>
> %if 0%{?el7}
> # https://bugzilla.redhat.com/show_bug.cgi?id=1397192
> Requires: mpich-devel
> %else
> Requires: mpich-devel%{?_isa}
> %endif
>
> should do the trick.
>
> Also a possibility - one might want to build both mpich-3.0 and mpich-3.2
> versions.
>
> I'll leave these up to Dave.

I don't object to trying to work round problems, but why would this apply only to hypre? It won't help unless petsc and any other relevant packages change. Also, any mpich packages rebuilt now won't install on CentOS/SL 7 systems as far as I can see.
Because hypre is the only package that (correctly) requires mpich-devel%{?_isa} rather than just mpich-devel:

# repoquery --whatrequires mpich-devel
blacs-mpich-devel-0:2.0.2-15.el7.x86_64
ga-mpich-devel-0:5.3b-14.el7.x86_64
ga-mpich-static-0:5.3b-14.el7.x86_64
gasnet-devel-0:1.28.0-1.el7.x86_64
hypre-mpich-devel-0:2.11.1-4.el7.x86_64
mpich-doc-0:3.0.4-8.el7.noarch
netgen-mesher-mpich-devel-0:5.3.1-5.el7.x86_64
scalapack-mpich-devel-0:2.0.2-15.el7.x86_64

# repoquery --whatrequires mpich-devel'(x86-64)'
hypre-mpich-devel-0:2.11.1-4.el7.x86_64
hypre-2.11.1-6.el7 has been submitted as an update to Fedora EPEL 7. https://bodhi.fedoraproject.org/updates/FEDORA-EPEL-2016-30fb5f975e
(In reply to Orion Poplawski from comment #8)
> Because hypre is the only package that (correctly) requires
> mpich-devel%{?_isa} rather than just mpich-devel:

Actually petsc does too, but that list suggests plenty of missing dependencies. Anyhow, I've pushed the hypre update to testing (and will keep it there until it installs on CentOS/SL).
(In reply to Dave Love from comment #10)
> (In reply to Orion Poplawski from comment #8)
> > Because hypre is the only package that (correctly) requires
> > mpich-devel%{?_isa} rather than just mpich-devel:
>
> Actually petsc does too, but that list suggests plenty of missing
> dependencies.

Does 'petsc' need this fix?
(In reply to Antonio Trande from comment #11)
> Does 'petsc' need this fix?

Yes.
hypre-2.11.1-6.el7 has been pushed to the Fedora EPEL 7 testing repository. If problems still persist, please make note of it in this bug report. See https://fedoraproject.org/wiki/QA:Updates_Testing for instructions on how to install test updates. You can provide feedback for this update here: https://bodhi.fedoraproject.org/updates/FEDORA-EPEL-2016-30fb5f975e
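To pull the update from the testing repository before it goes stable (standard yum usage, assuming EPEL is already enabled):

# yum --enablerepo=epel-testing update hypre-mpich-devel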
After evaluating this issue, there are no plans to address it further or fix it in an upcoming release. Therefore, it is being closed. If plans change such that this issue will be fixed in an upcoming release, then the bug can be reopened.