Description of problem:
For parallel HDF5 (with openmpi-1.5-3.fc15.x86_64), file creation always fails. Even the simplest example from the HDF documentation fails. Since file creation is a necessary first step, nothing at all can be done with parallel HDF5. The serial hdf5 package works well.

Version-Release number of selected component (if applicable):
hdf5-openmpi-1.8.5.patch1-9.fc15.x86_64 (and devel)

How reproducible:
100% reproducible (tested on a freshly installed Fedora 15).

Steps to Reproduce:
1. Install the packages openmpi-x86_64, hdf5, and hdf5-openmpi (with devel too).
2. Download the simplest test file for collective file creation from the HDF Group site:
   http://www.hdfgroup.org/HDF5/Tutor/examples/parallel/File_create.c
   (a minimal sketch of its contents is appended after this report)
3. Compile it with "mpicc File_create.c -lhdf5_hl -lhdf5" and run it: "a.out"

Actual results:
$ a.out
HDF5-DIAG: Error detected in HDF5 (1.8.5-patch1) MPI-process 0:
  #000: ../../src/H5F.c line 1427 in H5Fcreate(): unable to create file
    major: File accessability
    minor: Unable to open file
  #001: ../../src/H5F.c line 1198 in H5F_open(): unable to open file: time = Fri Jun 10 10:14:16 2011
, name = 'SDS_row.h5', tent_flags = 13
    major: File accessability
    minor: Unable to open file
  #002: ../../src/H5FD.c line 1088 in H5FD_open(): open failed
    major: Virtual File Layer
    minor: Unable to initialize object
  #003: ../../src/H5FDmpio.c line 999 in H5FD_mpio_open(): MPI_File_open failed
    major: Internal error (too specific to document in detail)
    minor: Some MPI function failed
  #004: ../../src/H5FDmpio.c line 999 in H5FD_mpio_open(): MPI_ERR_OTHER: known error not in list
    major: Internal error (too specific to document in detail)
    minor: MPI Error String
HDF5-DIAG: Error detected in HDF5 (1.8.5-patch1) MPI-process 0:
  #000: ../../src/H5F.c line 1950 in H5Fclose(): not a file ID
    major: Invalid arguments to routine
    minor: Inappropriate type

Expected results:
Smooth creation of the file and normal termination of the program.

Additional info:
1. The same bug occurs in Fortran.
2. Running "mpiexec -n xx a.out" produces the same error, once for each MPI process.
3. The proper libraries appear to be loaded, as the output of "ldd a.out" shows:
	linux-vdso.so.1 =>  (0x00007fff372c7000)
	libhdf5_hl.so.6 => /usr/lib64/openmpi/lib/libhdf5_hl.so.6 (0x00007fa9e3b28000)
	libhdf5.so.6 => /usr/lib64/openmpi/lib/libhdf5.so.6 (0x00007fa9e3535000)
	libmpi.so.1 => /usr/lib64/openmpi/lib/libmpi.so.1 (0x0000003d54c00000)
	libdl.so.2 => /lib64/libdl.so.2 (0x0000003d53c00000)
	libnsl.so.1 => /lib64/libnsl.so.1 (0x0000003d62e00000)
	libutil.so.1 => /lib64/libutil.so.1 (0x0000003d66800000)
	libm.so.6 => /lib64/libm.so.6 (0x0000003d53400000)
	libpthread.so.0 => /lib64/libpthread.so.0 (0x0000003d53800000)
	libc.so.6 => /lib64/libc.so.6 (0x0000003d53000000)
	libz.so.1 => /lib64/libz.so.1 (0x0000003d54800000)
	/lib64/ld-linux-x86-64.so.2 (0x0000003d52c00000)
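For reference, the tutorial reproducer linked in step 2 amounts to the following minimal sketch of a collective file creation. This is a reconstruction, not a verbatim copy of File_create.c, but it uses only the standard parallel HDF5 calls (H5Pcreate, H5Pset_fapl_mpio, H5Fcreate), and the file name matches the 'SDS_row.h5' seen in the error trace. The H5Fcreate() call is the one that fails above.

	#include <mpi.h>
	#include <hdf5.h>

	int main(int argc, char **argv)
	{
	    MPI_Init(&argc, &argv);

	    /* Create a file-access property list and select the MPI-IO driver. */
	    hid_t plist_id = H5Pcreate(H5P_FILE_ACCESS);
	    H5Pset_fapl_mpio(plist_id, MPI_COMM_WORLD, MPI_INFO_NULL);

	    /* Collective file creation; this maps to the failing
	     * MPI_File_open in H5FD_mpio_open() in the trace above. */
	    hid_t file_id = H5Fcreate("SDS_row.h5", H5F_ACC_TRUNC,
	                              H5P_DEFAULT, plist_id);

	    H5Pclose(plist_id);
	    H5Fclose(file_id);

	    MPI_Finalize();
	    return 0;
	}

Any program of this shape, compiled as in step 3, reproduces the failure.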
This bug has nothing to do with hdf5; it is caused by openmpi itself. ROMIO is not built correctly in these packages: a patch is applied and then autogen is run, which breaks ROMIO's configure files.
*** This bug has been marked as a duplicate of bug 722534 ***