Bug 1005724 - libhdfs links to shared library not in default library path
Summary: libhdfs links to shared library not in default library path
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: Fedora
Classification: Fedora
Component: hadoop
Version: 20
Hardware: Unspecified
OS: Unspecified
Priority: unspecified
Severity: high
Target Milestone: ---
Assignee: Robert Rati
QA Contact: Fedora Extras Quality Assurance
URL:
Whiteboard:
Depends On:
Blocks: 998521 1008201 1008202
 
Reported: 2013-09-09 09:25 UTC by Mattias Ellert
Modified: 2013-11-10 07:33 UTC
CC List: 5 users

Fixed In Version: hadoop-2.2.0-1.fc20
Doc Type: Bug Fix
Doc Text:
Clone Of:
Environment:
Last Closed: 2013-11-10 07:33:46 UTC
Type: Bug
Embargoed:



Description Mattias Ellert 2013-09-09 09:25:00 UTC
Description of problem:

The libhdfs library links to libjvm.so which is not in the default library search path:

$ ldd /usr/lib64/hadoop/libhdfs.so.0.0.0 
	linux-vdso.so.1 =>  (0x00007fffa74c0000)
	libjvm.so => not found
	libc.so.6 => /lib64/libc.so.6 (0x00007fae3bb79000)
	/lib64/ld-linux-x86-64.so.2 (0x00007fae3c16e000)

It should not be necessary to set any environment variables (LD_LIBRARY_PATH, JAVA_HOME, CLASSPATH, etc.) in order to use software that has been properly integrated into the distribution. For a java application it is sufficient to write a shell wrapper that sets the proper environment before starting the java interpreter - this will not cause problems for other software. For a C library more work is needed, since you cannot write a shell wrapper for a library. As part of their Fedora integration, C libraries that use libraries outside the default search path should be patched to use dlopen (with the full path) instead of direct linking.
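
For illustration only, a minimal sketch of what such a dlopen-based approach could look like in C (the libjvm.so path below is just one possible location, and the helper name is made up for this example - it is not from the libhdfs sources):

#include <dlfcn.h>
#include <stdio.h>

/* Hypothetical sketch: load libjvm.so by absolute path at run time
 * instead of linking against it at build time.  The path here is an
 * assumption; real code must first determine where the installed JVM
 * actually lives. */
typedef int (*create_vm_fn)(void **vm, void **env, void *args);

static create_vm_fn load_jvm_entry(void)
{
    void *handle = dlopen("/usr/lib/jvm/jre/lib/amd64/server/libjvm.so",
                          RTLD_NOW | RTLD_GLOBAL);
    if (handle == NULL) {
        fprintf(stderr, "dlopen: %s\n", dlerror());
        return NULL;
    }
    /* Resolve the JNI entry point by name rather than at link time. */
    return (create_vm_fn) dlsym(handle, "JNI_CreateJavaVM");
}

With something like this, the NEEDED entry for libjvm.so disappears from the library, and no rpath or LD_LIBRARY_PATH is required.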

Adding configuration that adds or modifies environment variables in every shell started by every user on the system is not allowed, since this will cause interference with other software. It is necessary to patch the library so that it finds the default libjvm and the jars it needs without the help of any environment variables. You may write your patches so that they honour environment variables like LD_LIBRARY_PATH, JAVA_HOME and CLASSPATH if they are set, but it must work without them.


Version-Release number of selected component (if applicable):

$ rpm -q libhdfs
libhdfs-2.0.5-8.fc21.x86_64


How reproducible:

Always

Comment 1 Robert Rati 2013-09-09 19:58:47 UTC
I moved libhdfs to /usr/lib[64] instead of a subdirectory underneath to resolve BZ1003036 and I am unable to reproduce.  From a koji scratch build for F20:

$ ldd /usr/lib64/libhdfs.so
	linux-vdso.so.1 =>  (0x00007fffe8ee0000)
	libjvm.so => /usr/lib/jvm/java/jre/lib/amd64/server/libjvm.so (0x00007f892738a000)
	libc.so.6 => /lib64/libc.so.6 (0x00007f8926fb3000)
	/lib64/ld-linux-x86-64.so.2 (0x00000032a6e00000)
	libstdc++.so.6 => /lib64/libstdc++.so.6 (0x00007f8926caf000)
	libm.so.6 => /lib64/libm.so.6 (0x00007f89269ad000)
	libdl.so.2 => /lib64/libdl.so.2 (0x00007f89267a8000)
	libpthread.so.0 => /lib64/libpthread.so.0 (0x00007f892658c000)
	libgcc_s.so.1 => /lib64/libgcc_s.so.1 (0x00007f8926376000)

What is the system configuration you are using to produce this issue?

Comment 2 Michael Schwendt 2013-09-09 20:19:25 UTC
> libjvm.so => /usr/lib/jvm/java/jre/lib/amd64/server/libjvm.so 

What on your machine puts /usr/lib/jvm/java/jre/lib/amd64/server into the run-time linker's search path?

Anything in /etc/ld.so.conf.d/* that does it?

$ repoquery --whatprovides /usr/lib/jvm/java/jre/lib/amd64/server/libjvm.so
$

[...]

$ rpm -qpR libhdfs-2.0.5-8.fc20.x86_64.rpm | grep jvm
libjvm.so()(64bit)
libjvm.so(SUNWprivate_1.1)(64bit)

$ rpm -q --whatprovides 'libjvm.so(SUNWprivate_1.1)(64bit)'
java-1.7.0-openjdk-1.7.0.31-2.4.1.4.fc20.x86_64

That one is insufficient: it is not available in the run-time linker's search path.

$ rpm -ql java-1.7.0-openjdk | grep libjvm
/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.31-2.4.1.4.fc20.x86_64/jre/lib/amd64/server/libjvm.so

$ repoquery --whatprovides 'libjvm.so(SUNWprivate_1.1)(64bit)'
java-1.8.0-openjdk-1:1.8.0.0-0.19.b106.fc20.x86_64
java-1.7.0-openjdk-1:1.7.0.31-2.4.1.4.fc20.x86_64
java-1.8.0-openjdk-1:1.8.0.0-0.16.b89x.fc20.x86_64

$ repoquery --whatprovides 'libjvm.so()(64bit)'
java-1.8.0-openjdk-1:1.8.0.0-0.19.b106.fc20.x86_64
java-1.7.0-openjdk-1:1.7.0.31-2.4.1.4.fc20.x86_64
libgcj-0:4.8.1-6.fc20.x86_64
java-1.8.0-openjdk-1:1.8.0.0-0.16.b89x.fc20.x86_64

Comment 3 Michael Schwendt 2013-09-09 20:22:45 UTC
There's not even /usr/lib/jvm/java/jre but just /usr/lib/jvm/jre:

$ file /usr/lib/jvm/*
/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.31-2.4.1.4.fc20.x86_64: directory
/usr/lib/jvm/jre:                                             symbolic link to `/etc/alternatives/jre'
/usr/lib/jvm/jre-1.7.0:                                       symbolic link to `/etc/alternatives/jre_1.7.0'
/usr/lib/jvm/jre-1.7.0_openjdk:                               symbolic link to `/etc/alternatives/jre_1.7.0_openjdk'
/usr/lib/jvm/jre-1.7.0-openjdk-1.7.0.31-2.4.1.4.fc20.x86_64:  symbolic link to `java-1.7.0-openjdk-1.7.0.31-2.4.1.4.fc20.x86_64/jre'
/usr/lib/jvm/jre-openjdk:                                     symbolic link to `/etc/alternatives/jre_openjdk'

Comment 4 Mattias Ellert 2013-09-10 11:29:18 UTC
/usr/lib/jvm/java and /usr/lib/jvm/jre have no owners in the RPM database. These are created and managed by the rpm scriptlets using alternatives.

/usr/lib/jvm/java (and therefore also /usr/lib/jvm/java/jre) only exist if you install java-devel. You should not have to install devel packages to fulfil runtime dependencies. For runtime, the default jre is in /usr/lib/jvm/jre - this only requires java, not java-devel.

/usr/lib/jvm/jre/lib/{i386,amd64,arm,...}/server is not part of the default library search path, so libraries here will not be found at runtime without using environment variables or rpath or something else. 

I can see in the koji build log that the compilation uses rpath:

     [exec] /usr/bin/cc  -fPIC  -g -Wall -O2 -D_REENTRANT -D_GNU_SOURCE -D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64   -shared -Wl,-soname,libhdfs.so.0.0.0 -o target/usr/local/lib/libhdfs.so.0.0.0 CMakeFiles/hdfs.dir/main/native/libhdfs/exception.c.o CMakeFiles/hdfs.dir/main/native/libhdfs/jni_helper.c.o CMakeFiles/hdfs.dir/main/native/libhdfs/hdfs.c.o /usr/lib/jvm/java/jre/lib/amd64/server/libjvm.so -Wl,-rpath,/usr/lib/jvm/java/jre/lib/amd64/server

This would allow libjvm to be found in /usr/lib/jvm/java/jre/lib/amd64/server - however, as written above, this is not a good runtime location and would introduce a runtime dependency on java-devel.

However, you later - in order to comply with the no-rpath packaging guideline - remove the rpath.

chrpath --delete [...] /builddir/build/BUILDROOT/hadoop-2.0.5-8.fc21.x86_64//usr/lib64/hadoop/libhdfs.so.0.0.0

This removal of the rpath is the right thing to do in order to comply with the guidelines. However, after removing the rpath, the libjvm library - which lives outside the default library path - will not be found. In order to properly find libjvm you need to use dlopen with the full path and no direct linking.

Comment 5 Michael Schwendt 2013-09-10 11:59:07 UTC
> /usr/lib/jvm/java (and therefore also /usr/lib/jvm/java/jre) only exist
> if you install java-devel.

Okay, but it doesn't add anything to the run-time linker's search path. What am I missing?


> You should not have to install devel packages to fulfil runtime dependencies.

More precisely, the runtime dependency is not mapped into a working RPM dependency in the package:

  $ rpm -qpR libhdfs-2.0.5-8.fc20.x86_64.rpm |grep jvm
  libjvm.so()(64bit)
  libjvm.so(SUNWprivate_1.1)(64bit)

Whatever "provides" that shared lib, must store it in runtime linker's search path. That's not the case. The dependency doesn't lead to a working runtime. Whatever dir java-1.7.0-openjdk-devel may create via the alternatives system, it doesn't put the lib into runtime linker's search path.

  $ rpm -ql java-1.7.0-openjdk|grep libjvm.so
  /usr/lib/jvm/java-1.7.0-openjdk-1.7.0.31-2.4.1.4.fc20.x86_64/jre/lib/amd64/server/libjvm.so

And yes, I agree the packaging is broken.

Comment 6 Mattias Ellert 2013-09-10 14:01:51 UTC
(In reply to Michael Schwendt from comment #5)
> > /usr/lib/jvm/java (and therefore also /usr/lib/jvm/java/jre) only exist
> > if you install java-devel.
> 
> Okay, but it doesn't add anything to the run-time linker's search path.
> What am I missing?

My comment was only addressing the "this directory does not exist" part of your comment. It was never intended to offer a solution for the "this directory is not in the default library search path" part of it. The fact that this directory is not in the default search path is the reason this bug report was filed in the first place.

> > You should not have to install devel packages to fulfil runtime dependencies.
> 
> More precisely, the runtime dependency is not mapped into a working RPM
> dependency in the package:
> 
>   $ rpm -qpR libhdfs-2.0.5-8.fc20.x86_64.rpm |grep jvm
>   libjvm.so()(64bit)
>   libjvm.so(SUNWprivate_1.1)(64bit)
> 
> Whatever "provides" that shared lib, must store it in runtime linker's
> search path. That's not the case. The dependency doesn't lead to a working
> runtime. Whatever dir java-1.7.0-openjdk-devel may create via the
> alternatives system, it doesn't put the lib into runtime linker's search
> path.

That the Fedora java packages never installed libjvm in the default library search path is most probably on purpose. I am not familiar with the reasoning behind this choice, but I know people have suggested making the library available in the default search path for many years and it never happened. So I don't think yet another request for this will be successful unless there are some new arguments that have not been heard before. "I want to link to it" has not been a successful argument so far.

Until now it has always been the responsibility of the packages using libjvm to open it properly with dlopen from its installed location outside the default library search path. At the moment I see no other solution than this.

>   $ rpm -ql java-1.7.0-openjdk|grep libjvm.so
>  
> /usr/lib/jvm/java-1.7.0-openjdk-1.7.0.31-2.4.1.4.fc20.x86_64/jre/lib/amd64/
> server/libjvm.so
> 
> And yes, I agree the packaging is broken.

Comment 7 Robert Rati 2013-09-10 16:23:51 UTC
(In reply to Mattias Ellert from comment #4)
> /usr/lib/jvm/java and /usr/lib/jvm/jre have no owners in the RPM data base.
> These are created and managed by the rpm scriptlets using alternatives.
> 
> /usr/lib/jvm/java (and therefore also /usr/lib/jvm/java/jre) only exist if
> you install java-devel. You should not have to install devel packages to
> fulfil runtime dependencies. For runtime, the default jre is in
> /usr/lib/jvm/jre - this only requires java, not java-devel.

java-devel is required by pretty much the entire java stack in Fedora (as of F19).  Usually you should not need -devel to fulfil runtime dependencies, but the java stack seems to be an exception.  In this case libhdfs.so will call out to java files that are part of the hadoop-hdfs package, which in turn will end up installing java-devel.  It might be possible to get around this with a subpackage, but that is not desirable for a number of reasons.

You likely ended up with this situation because there is no dependency from libhdfs on hadoop-hdfs (BZ1003039), but that will be fixed in the next spin.
 
> /usr/lib/jvm/jre/lib/{i386,amd64,arm,...}/server is not part of the default
> library search path, so libraries here will not be found at runtime without
> using environment variables or rpath or something else. 
> 
> I can see in the koji build log that the compilation uses rpath:
> 
>      [exec] /usr/bin/cc  -fPIC  -g -Wall -O2 -D_REENTRANT -D_GNU_SOURCE
> -D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64   -shared
> -Wl,-soname,libhdfs.so.0.0.0 -o target/usr/local/lib/libhdfs.so.0.0.0
> CMakeFiles/hdfs.dir/main/native/libhdfs/exception.c.o
> CMakeFiles/hdfs.dir/main/native/libhdfs/jni_helper.c.o
> CMakeFiles/hdfs.dir/main/native/libhdfs/hdfs.c.o
> /usr/lib/jvm/java/jre/lib/amd64/server/libjvm.so
> -Wl,-rpath,/usr/lib/jvm/java/jre/lib/amd64/server
> 
> This would allow libjvm to be found in
> /usr/lib/jvm/java/jre/lib/amd64/server - this is however as written above
> not a good runtime location and would introduce a runtime dependency on
> java-devel.
> 
> However, you later - in order to comply with the no-rpath packaging
> guideline - remove the rpath.
> 
> chrpath --delete [...]
> /builddir/build/BUILDROOT/hadoop-2.0.5-8.fc21.x86_64//usr/lib64/hadoop/
> libhdfs.so.0.0.0
> 
> This removal of the rpath is the right thing to do in order to comply with
> the guidelines. However after removing the rpath the libjvm library will not
> be found outside the default library path. In order to properly find libjvm
> you need to use dlopen with the full path and no direct linking.

I am looking at using dlopen to open libjvm.so directly based upon JAVA_HOME (which would be the responsibility of the calling program to set), but it is important not to break functionality with what upstream provides and people are currently using, meaning I don't want to put a hard requirement on setting JAVA_HOME.  If anything it would likely end up an optional ability to set JAVA_HOME and not a hard requirement.  If the lib doesn't work for you, then your program would need to set JAVA_HOME.
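
As a rough sketch of that idea (the path layout and helper name below are assumptions for illustration, not the actual patch): honour JAVA_HOME if the caller exported it, otherwise fall back to a default location:

#include <stdio.h>
#include <stdlib.h>
#include <dlfcn.h>

/* Sketch only: prefer ${JAVA_HOME} if the calling program exported it,
 * otherwise fall back to a default JVM path.  Both path layouts are
 * assumptions here. */
static void *open_libjvm(void)
{
    char path[4096];
    const char *java_home = getenv("JAVA_HOME");

    if (java_home != NULL)
        snprintf(path, sizeof path,
                 "%s/jre/lib/amd64/server/libjvm.so", java_home);
    else
        snprintf(path, sizeof path,
                 "/usr/lib/jvm/jre/lib/amd64/server/libjvm.so");

    return dlopen(path, RTLD_NOW | RTLD_GLOBAL);
}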

Do you have a test program that fails?  Have you tried installing hadoop-hdfs?

Comment 8 Mattias Ellert 2013-09-11 07:24:35 UTC
There should be a clear separation between runtime and devel. If this is broken in the java stack, bugs should be filed so that the separation can be restored. Any existing brokenness should not be treated as an excuse to introduce even more brokenness.

It is currently a bit tricky to install hadoop-hdfs, since it depends on hadoop-common which tries to install /usr/bin/rcc, and a file with this name is already installed by the qt-devel package.

I maintain the root package in Fedora, which is a tricky package when it comes to build requirements. It has a lot of modules that support a large variety of functionality. Enabling the building of the root HDFS module was the main reason for me to install libhdfs and hadoop-devel. In order to build the full set of root modules you need to have lots of packages installed. For runtime you can of course simply choose the modules you need, since each module is in its own rpm - and only the dependencies for the modules you install are dragged in.

Both qt-devel and hadoop-devel are build requirements for root, so removing qt-devel in order to be able to install hadoop-hdfs does not make sense if I am trying to build root with all its modules. If libhdfs is fixed so that it depends properly on hadoop-hdfs without fixing the file conflict between hadoop-common and qt-devel, the root package would become unbuildable since its build requirements would then drag in two packages which are not installable together.

The building of the root package contains some level of testing since the documentation generation step involves running the root application in batch mode loading all the built modules. It doesn't test every single piece of functionality, and the package build can of course not test connectivity since network access is not allowed. But in order for the root HDFS module to load properly for the documentation generation I had to add both /usr/lib(64)?/hadoop and /usr/lib/jvm/jre/lib/{i386,amd64}/server to the library search path before executing root in the specfile.

I do not have any test program other than the root HDFS module. But you can try that if you want. The root-io-hdfs subpackage is available in Fedora 21 and in updates-testing on Fedora 20. With the module installed it should be possible to create TFile objects in root from files in hdfs using a URL like:

hdfs:///path/to/file/in/HDFS.root

I don't know if you are familiar with root and whether the above description makes sense to you. If you are interested and would like more information I can provide it.

If JAVA_HOME is not set you should fall back to /usr/lib/jvm.

I.e., ${JAVA_HOME}/jre/lib/amd64/server if JAVA_HOME is set, otherwise /usr/lib/jvm/jre/lib/amd64/server. The amd64 part in the path differs for different architectures, so you either need to pick the right one for the architecture you are compiling for or try all of them in turn until you hit something that works. Note that the options are not only amd64 and i386; there are also arm, s390(x)?, ppc(64)?, aarch64 and so on.
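
For illustration, a sketch of that fallback search in C (the list of architecture directories is illustrative, not exhaustive):

#include <dlfcn.h>
#include <stdio.h>

/* Sketch: the per-architecture directory under jre/lib varies, so try
 * each candidate in turn until one of them opens. */
static const char *arch_dirs[] = {
    "amd64", "i386", "arm", "aarch64", "ppc", "ppc64", "s390", "s390x",
};

static void *open_default_libjvm(void)
{
    char path[4096];
    size_t i;

    for (i = 0; i < sizeof(arch_dirs) / sizeof(arch_dirs[0]); i++) {
        snprintf(path, sizeof path,
                 "/usr/lib/jvm/jre/lib/%s/server/libjvm.so", arch_dirs[i]);
        void *handle = dlopen(path, RTLD_NOW | RTLD_GLOBAL);
        if (handle != NULL)
            return handle;
    }
    return NULL;
}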

Comment 9 Robert Rati 2013-09-11 12:29:29 UTC
(In reply to Mattias Ellert from comment #8)
> There should be clear separation between runtime and devel. If this is
> broken in the java stack, bugs should be filed so that the separation can be
> restored. Any existing brokenness should not be treated as an alibi to
> introduce even more brokenness.

As I said, I normally agree that there should be a separation between runtime and devel.  I was just pointing out that worrying about the dependency on java-devel for libhdfs is like trying to stop a single raindrop in a downpour.  The entire java stack requires java-devel.  I don't know why this is the case, but if it's really a big issue a discussion should be had about it.  I'm going to look at removing the dep for libhdfs, but hadoop-common and its deps will still need it, so you won't end up with a system w/o java-devel.  It's going to be there. :)

> It is currently a bit tricky to install hadoop-hdfs, since it depends on
> hadoop-common which tries to install /usr/bin/rcc, and a file with this name
> is already installed by the qt-devel package.

That is fixed as well.  The new spin, which fixes the outstanding bzs and a few other issues, is basically waiting upon my resolution of this issue.  You should be able to install hadoop-common w/o conflicting with qt-devel in the next spin.

> I maintain the root package in Fedora, which is a tricky package when it
> comes to build requirements. It has a lot of modules that support a large
> variety of functionality. Enabling the building of the root HDFS module was
> the main reason for me to install libhdfs and hadoop-devel. In order to
> build the full set of root modules you need to have lots of packages
> installed. For runtime you can of course simply chose the modules you need,
> since each module is in its own rpm - and only the dependencies for the
> modules you install are dragged in.

I'm glad you're using the libhdfs package.  I wasn't sure if there would be much immediate interest in the package.

Comment 10 Robert Rati 2013-10-21 18:57:33 UTC
This seems to have stemmed from a missing dependency on the hadoop-hdfs package.  The hadoop-hdfs package will pull in java-devel, thus providing the missing shared library.

Also, trying to fix hadoop to prevent it from pulling in java-devel doesn't seem worthwhile when a dependency of hadoop will just pull in java-devel.  The separation between runtime and devel is a much larger issue in the java stack than just hadoop.

Comment 11 Timothy St. Clair 2013-10-21 20:17:45 UTC
Of note, expecting upstream projects to change their code to use dlopen on libjvm is rather onerous, especially for a project of that magnitude. 

I've run across several other projects which all link directly as well.

Comment 12 Mattias Ellert 2013-10-21 23:16:11 UTC
(In reply to Robert Rati from comment #10)
> This seems to have stemmed from a missing dependency on the hadoop-hdfs
> package.  The hadoop-hdfs package will pull in java-devel, thus providing
> the missing shared library.

No, this is a misunderstanding.

The libjvm.so is in java, not in java-devel. If hadoop-hdfs (a runtime package) drags in java-devel (a development package) as a dependency, there is a packaging mistake somewhere. The guidelines https://fedoraproject.org/wiki/Packaging:Guidelines?rd=Packaging/Guidelines#Devel_Packages say:

"Fedora packages must be designed with a logical separation of files. Specifically, -devel packages must be used to contain files which are intended solely for development or needed only at build-time."

There is no need to install java-devel to have libjvm.so available. So this was never the issue.

The problem is not that libjvm.so is not installed, but where it is installed. It is available in /usr/lib/jvm/jre/lib/amd64/server, a location that is not in the default library search path. This can be solved in a number of ways.

 a) Install libjvm.so in the default library path. This would be a change to
    the java main package and not a fix in hadoop. So even though it would make
    sense, that is out of scope for the hadoop package.

 b) Patch the code to dlopen libjvm.so instead of linking to it. This is not
    impossible to do, but requires some work to make it portable. Such a change
    might also not be acceptable by upstream.

 c) Use rpath. Slightly questionable. The guidelines allow rpath only for
    "internal libraries", which normally only applies to libraries built from
    the same source RPM, but there might be some room for interpretation here.

The hadoop package uses the rpath option:

$ readelf -a /lib64/libhdfs.so.0.0.0 | grep RPATH
 0x000000000000000f (RPATH)              Library rpath: [/usr/lib/jvm/java/jre/lib/amd64/server]

However, the encoded rpath is not appropriate. The path /usr/lib/jvm/java/jre/lib/amd64/server is the jdk path to the libjvm.so. In order for the jdk paths to be present you need java-devel to be installed, which is not acceptable for a runtime package.

The proper runtime rpath is the jre path /usr/lib/jvm/jre/lib/amd64/server. The jre path is present when java is installed, there is no need for java-devel.

In summary: if you choose to use rpath to find libjvm.so, encode the jre path and not the jdk path.
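
For example (illustrative only - the amd64 part is architecture-specific), the build could replace the rpath with the jre path instead of deleting it:

$ chrpath --replace /usr/lib/jvm/jre/lib/amd64/server /usr/lib64/libhdfs.so.0.0.0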

> Also, trying to fix hadoop to prevent it from pulling in java-devel doesn't
> seem worthwhile when a dependency of hadoop will just pull in java-devel. 
> The separation between runtime and devel is much larger issue than just
> hadoop with the java stack.

It is very likely that the failure to properly separate runtime from development is not in hadoop itself, but in one of its dependencies. If this is the case please file a bug for that package. Arguing "hey, my package is using java, so the guidelines don't apply to me" is not a valid argument.

Comment 13 Mikolaj Izdebski 2013-10-22 14:21:55 UTC
(In reply to Mattias Ellert from comment #12)
> (In reply to Robert Rati from comment #10)
> > This seems to have stemmed from a missing dependency on the hadoop-hdfs
> > package.  The hadoop-hdfs package will pull in java-devel, thus providing
> > the missing shared library.
> 
> No, this is a misunderstanding.
> 
> The libjvm.so is in java, not in java-devel. If hadoop-hdfs (a runtime
> package) drags in java-devel (a development package) as a dependency, there
> is a packaging mistake somewhere. The guidelines
> https://fedoraproject.org/wiki/Packaging:Guidelines?rd=Packaging/
> Guidelines#Devel_Packages say:

I don't know, maybe there is a packaging bug, but in general runtime packages can require java-devel in some cases.  Here's an example from the maven spec file:

# Theoretically Maven might be usable with just JRE, but typical Maven
# workflow requires full JDK, so we require it here.
Requires:       java-devel

Also packages requiring tools.jar often require java-devel.

> "Fedora packages must be designed with a logical separation of files.
> Specifically, -devel packages must be used to contain files which are
> intended solely for development or needed only at build-time."

IMO java-1.7.0-openjdk-devel is not a devel package in the context of this part of the packaging guidelines.  It provides the Java Development Kit, which is supposed to be used even by end users.  It provides tools not available in java-1.7.0-openjdk.

> The problem is not that libjvm.so is not installed, but where it is
> installed. It is available in /usr/lib/jvm/jre/lib/amd64/server, a location
> that is not in the default library search path. This can be solved in a
> number of ways.

That is not exactly correct.  Fedora supports installations of multiple JVMs in parallel.  Each of them can install libjvm.so in a different location.  There is no single place in the system where one could look for libjvm.so -- the exact location depends on which JVM is chosen for which use case (different users or different applications can use different JVMs).

>  a) Install libjvm.so in the default library path. This would be a change to
>     the java main package and not a fix in hadoop. So even though it would
> make
>     sense, that is out of scope for the hadoop package.

That is not a good solution, as there are many JVMs and each can provide libjvm.so. libhdfs should work with whatever JVM the user wants to use it with, if possible. See above.

>  b) Patch the code to dlopen libjvm.so instead of linking to it. This is not
>     impossible to do, but requires some work to make it portable. Such a
> change
>     might also not be acceptable by upstream.

This is the right way to go IMO.  I already discussed this in detail with Robert on IRC.

>  c) Use rpath. Slightly questionable. The guidelines allow rpath only for
>     "internal libraries", which normally only applies to libraries built from
>     the same source RPM, but there might be some room for interpretation
> here.

While this solution might work, it is still much better to use dlopen().


(In reply to Mattias Ellert from comment #8)
> If JAVA_HOME is not set you should fall back to /usr/lib/jvm.
> 
> I.e. ${JAVA_HOME}/jre/lib/amd64/server if JAVA_HOME is set otherwise
> /usr/lib/jvm/jre/lib/amd64/server. The amd64 part in the path differs for
> different architectures so you either need to try the right one for the
> architecture you are compiling for or try all in turn until you hit
> something that works. Note that the options ar not only amd64 and i386, but
> you have other options for arm, s390(x)?, ppc(64)?, aarch64 and so on.

I disagree.  Please don't do that.

Fedora allows multiple installations of JVMs.  /usr/lib/jvm/jre is the highest-priority distribution JVM configured by the alternatives system.  Fedora package maintainers decide the priorities of the JVMs.  This setting only affects which JVM is on the default PATH.  It does not mean it is the default JVM on a particular machine!

Admins of a particular machine can set a different preferred JVM in /etc/java/java.conf.  Users can define their preferred JVMs in $HOME/.java/java.conf.  Moreover, there are per-application settings -- each application can use a different JVM.

One reliable way of determining which JVM to use is the Java Packages Tools.  You can implement your own parser, but please respect user preferences and don't force one particular JVM.

Comment 14 Mattias Ellert 2013-10-22 16:11:56 UTC
I agree that dlopen is a better idea than rpath. My previous comment was that if rpath is used, the jre path is a better idea than the jdk path. But dlopen is even better.

Configuration should override the default - I do not disagree with this. However, if there is no configuration (application, system, user, ...) it should still work, and the default should be used. It must never be compulsory to change the configuration if the default is what you want.

It is easier to honour the possible ways configuration can be done if dlopen is used than if rpath is used. Using rpath means that you hardcode the default location and cannot easily override it; in that case you would need to change the default java implementation using alternatives, since trying to change it in configuration files or environment variables would not work - but it is better than nothing. Using dlopen is better, since configuration files and environment variables can then be honoured too. But it must still work if no configuration file or environment variable chooses the preferred java implementation, in which case the default must be used.

That a development package (maven) requires another development package (java-devel) is perfectly fine. Not all development packages are named -devel. If you install a package that was built using maven, you should not need maven at runtime. Similarly, if you install a program that was built using gcc, you don't need to install gcc. You need the compiler's runtime libraries (libgcc), but those are separated out into a separate rpm that does not drag in the compiler itself as a dependency. If a runtime package has a dependency on maven, that is an issue.

The toolchains for other programming languages have managed to properly separate development and runtime packages. Doing the same for java should not be impossible.

Comment 15 Mikolaj Izdebski 2013-10-22 16:24:31 UTC
(In reply to Mattias Ellert from comment #14)
> Configuration should override the default - I do not disagree with this.
> However if there is no configuration (application, system, user, ...) it
> should still work and the default should be used. It must never be
> compulsory to change the configuration if the default is what you want. It
> is easier to honour the possible ways configuration can be done if dlopen is
> used than if rpath is used. Using rpath means that you hardcode the default
> location and can not easily override it. In this case you would need to
> change the default java implementation using alternatives. Trying to change
> it in configuration files or environment variables will not work - but it is
> better than nothing. Using dlopen is better since then configuration files
> and environment variables can be honoured too. But it still must work if no
> configuration file or environment variable chooses the preferred java
> implementation, in which case the default must be used.

Global Java configuration (/etc/java/java.conf) is always present.  If it were removed, many applications would stop working.

The solution is IMO quite simple; there is no need to reinvent the wheel by implementing code that searches for JVMs or hardcodes JVM paths.  libhdfs is a library, and as such it is used by applications.  Each application should be responsible for exporting JAVA_HOME; libhdfs could then read it and dlopen() libjvm.so from there.  Exporting a proper JAVA_HOME is as simple as adding a single line to the startup script of each application:

. /usr/share/java-utils/java-functions; set_jvm

This will take care of reading the config files, selecting the appropriate JVM and exporting several environment variables.

> That a development package (maven) requires another development package
> (java-devel) is perfectly fine. Not all development packages are named
> -devel. If you install a package that was using maven to build it, you
> should not need maven at runtime.

Ok, now I fully agree.

Comment 16 Robert Rati 2013-10-23 15:56:21 UTC
Created a patch that will use dlopen to open libjvm, but this is only supported on x86 architectures.  Any other architecture will not build libhdfs, which means no -devel and related subpackages.

Comment 17 Fedora Update System 2013-10-24 12:27:19 UTC
hadoop-2.2.0-1.fc20,objenesis-1.2-16.fc20 has been submitted as an update for Fedora 20.
https://admin.fedoraproject.org/updates/hadoop-2.2.0-1.fc20,objenesis-1.2-16.fc20

Comment 18 Fedora Update System 2013-10-24 17:47:41 UTC
Package hadoop-2.2.0-1.fc20, objenesis-1.2-16.fc20:
* should fix your issue,
* was pushed to the Fedora 20 testing repository,
* should be available at your local mirror within two days.
Update it with:
# su -c 'yum update --enablerepo=updates-testing hadoop-2.2.0-1.fc20 objenesis-1.2-16.fc20'
as soon as you are able to.
Please go to the following url:
https://admin.fedoraproject.org/updates/FEDORA-2013-19859/hadoop-2.2.0-1.fc20,objenesis-1.2-16.fc20
then log in and leave karma (feedback).

Comment 19 Fedora Update System 2013-11-10 07:33:46 UTC
hadoop-2.2.0-1.fc20, objenesis-1.2-16.fc20 has been pushed to the Fedora 20 stable repository.  If problems still persist, please make note of it in this bug report.

