The JVM runs out of heap space (OOM) during the hadoop build on s390x:

[INFO] --- build-helper-maven-plugin:1.9:add-source (add-jsp-generated-sources-directory) @ hadoop-hdfs ---
[INFO] Source directory: /builddir/build/BUILD/hadoop-common-9e2ef43a240fb0f603d8c384e501daec11524510/hadoop-hdfs-project/hadoop-hdfs/target/generated-sources/java added.
[INFO]
[INFO] --- hadoop-maven-plugins:2.4.1:protoc (compile-protoc) @ hadoop-hdfs ---
[INFO]
[INFO] --- hadoop-maven-plugins:2.4.1:protoc (compile-protoc-datanode) @ hadoop-hdfs ---
[INFO]
[INFO] --- hadoop-maven-plugins:2.4.1:protoc (compile-protoc-namenode) @ hadoop-hdfs ---
[INFO]
[INFO] --- hadoop-maven-plugins:2.4.1:protoc (compile-protoc-qjournal) @ hadoop-hdfs ---
[INFO]
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ hadoop-hdfs ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 4 resources
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hadoop-hdfs ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 588 source files to /builddir/build/BUILD/hadoop-common-9e2ef43a240fb0f603d8c384e501daec11524510/hadoop-hdfs-project/hadoop-hdfs/target/classes
The system is out of resources.
Consult the following stack trace for details.
java.lang.OutOfMemoryError: Java heap space
        at java.util.HashMap.newNode(HashMap.java:1734)
        at java.util.HashMap.putVal(HashMap.java:630)
        at java.util.HashMap.put(HashMap.java:611)
        ...
For the full log, please see http://s390.koji.fedoraproject.org/koji/taskinfo?taskID=1575786

Version-Release number of selected component (if applicable):
hadoop-2.4.1-2.fc21

With the following patch applied:

diff --git a/hadoop.spec b/hadoop.spec
index f8c4301..c5cada6 100644
--- a/hadoop.spec
+++ b/hadoop.spec
@@ -602,6 +602,9 @@ opts="-j"
 opts="-j"
 %endif
 %endif
+%ifarch s390x
+export MAVEN_OPTS="-Xms2048M -Xmx4096M"
+%endif
 %mvn_build $opts -- -Drequire.snappy=true -Dcontainer-executor.conf.dir=%{_sysconfdir}/%{name} -Pdist,native -DskipTests -DskipTest -DskipIT
 # This takes a long time to run, so comment out for now

hadoop builds fine on both s390 and s390x: http://s390.koji.fedoraproject.org/koji/taskinfo?taskID=1576118
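The %ifarch block above only enlarges the heap of the JVM that Maven spawns. For a manual (non-RPM) build, the same workaround can be applied by exporting MAVEN_OPTS before invoking Maven; a sketch, using the -Xms/-Xmx values from the spec patch (the mvn invocation line is illustrative):

```shell
# Mirror of the %ifarch s390x block for a manual build (heap values taken
# from the spec patch; adjust to the memory actually available on the box).
if [ "$(uname -m)" = "s390x" ]; then
    export MAVEN_OPTS="-Xms2048M -Xmx4096M"
fi
# Maven reads MAVEN_OPTS from the environment when it starts its JVM, e.g.:
# mvn package -Pdist,native -DskipTests
```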
The same problem occurs on ppc64le.
ppc64le additionally needs:

diff -up hadoop-common-project/hadoop-common/src/JNIFlags.cmake.ppc hadoop-common-project/hadoop-common/src/JNIFlags.cmake
--- hadoop-common-project/hadoop-common/src/JNIFlags.cmake.ppc	2014-10-10 07:17:31.770038082 +0000
+++ hadoop-common-project/hadoop-common/src/JNIFlags.cmake	2014-10-10 07:18:10.050038084 +0000
@@ -78,6 +78,8 @@ IF("${CMAKE_SYSTEM}" MATCHES "Linux")
         SET(_java_libarch "amd64")
     ELSEIF (CMAKE_SYSTEM_PROCESSOR MATCHES "^arm")
         SET(_java_libarch "arm")
+    elseif(CMAKE_SYSTEM_PROCESSOR MATCHES "^(powerpc|ppc)64le")
+        set(_java_libarch "ppc64")
     ELSE()
         SET(_java_libarch ${CMAKE_SYSTEM_PROCESSOR})
     ENDIF()

This is needed because OpenJDK upstream decided to install the JRE into the ppc64 directory even on ppc64le. Ideally, Hadoop should use the Java detection shipped with CMake (FindJNI.cmake) directly instead of maintaining its own arch table.

Successful scratch builds on ppc64/ppc64le: http://ppc.koji.fedoraproject.org/koji/taskinfo?taskID=2141939
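To illustrate the FindJNI.cmake suggestion: instead of hand-maintaining the _java_libarch table in JNIFlags.cmake, the build could delegate JVM library discovery to CMake's bundled module. A sketch (not the actual Hadoop build code; variable usage follows the standard FindJNI module):

```cmake
# Let CMake's FindJNI module locate jni.h and libjvm/libjawt, whatever
# directory layout the installed JDK uses (amd64, ppc64, s390x, ...).
find_package(JNI REQUIRED)

# JNI_INCLUDE_DIRS and JNI_LIBRARIES are set by the module; no per-arch
# special-casing like the _java_libarch table is needed.
include_directories(${JNI_INCLUDE_DIRS})
target_link_libraries(hadoop ${JNI_LIBRARIES})
```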
hadoop-2.4.1-5.fc21 has been submitted as an update for Fedora 21. https://admin.fedoraproject.org/updates/hadoop-2.4.1-5.fc21
Package hadoop-2.4.1-5.fc21:
* should fix your issue,
* was pushed to the Fedora 21 testing repository,
* should be available at your local mirror within two days.
Update it with:
# su -c 'yum update --enablerepo=updates-testing hadoop-2.4.1-5.fc21'
as soon as you are able to. Please go to the following url:
https://admin.fedoraproject.org/updates/FEDORA-2014-12632/hadoop-2.4.1-5.fc21
then log in and leave karma (feedback).
hadoop-2.4.1-5.fc21 has been pushed to the Fedora 21 stable repository. If problems still persist, please make note of it in this bug report.