[Vagnekman-users] New MPI versions on Ekman
Michael Schliephake
michs at kth.se
Wed Nov 9 10:00:35 CET 2011
Dear Users,
I made several new MPI versions available on Ekman:
Open MPI V. 1.4.4: The latest official stable release from the so-called
"super stable" series.
Open MPI V. 1.5.4: The latest version from the "feature series". It is
officially a beta release and contains additional bug fixes and new features.
These versions are provided for GNU, Intel, and PGI compilers.
MVAPICH2 V. 1.7: This MPI library was made available for the first time
last month. The installation has since been fixed, and the newer Hydra
process manager can now be used. Hydra allows easier and more sophisticated
mapping and pinning of MPI processes to cores, which enables
better-performing job setups; these will be improved further together with
the installed optimization for MPI communication between processes on the
same node. For an impression of the performance, I recently published the
data in this picture: http://www.pdc.kth.se/~michs/operation/notes.html
This version is currently supported for the GNU and Intel compilers.
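As a sketch of how such a job setup might look (the module name and the
binding option below are illustrative assumptions, not the exact names on
Ekman; please check "module avail" and "mpiexec -help" for the options your
installation actually supports):

```shell
# Load an MVAPICH2 module (hypothetical module name; check "module avail").
module load mvapich2/1.7

# Launch through the Hydra process manager and pin each MPI process to a
# core. The exact binding flag differs between Hydra versions; consult
# "mpiexec -help" on the system before relying on it.
mpiexec -n 16 -bind-to core ./my_mpi_app
```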
For the latest list of supported compilers see:
http://www.pdc.kth.se/resources/software/installed-software/mpi-libraries
Please feel free to contact me if you have questions or need support for
other compilers.
A technical hint regarding the use of MPI libraries on Ekman:
MPI applications that use MPI_THREAD_MULTIPLE only get IPoIB performance
on the network, as currently happens with the Open MPI 1.4.3 modules. The
new MPI libraries are compiled for single-threaded use so that this cannot
happen unnoticed, as is possible now.
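For applications that do need multiple communicating threads, it is good
practice to request the thread level explicitly with MPI_Init_thread and to
check the level the library actually grants, so a silent fallback cannot go
unnoticed. A minimal sketch using only standard MPI calls (not specific to
any particular installation on Ekman; compile with the mpicc of the chosen
module):

```c
#include <mpi.h>
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char *argv[])
{
    int provided, rank;

    /* Request full multi-threaded support explicitly... */
    MPI_Init_thread(&argc, &argv, MPI_THREAD_MULTIPLE, &provided);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    /* ...and verify what the library actually granted, so that a
     * fallback to a lower thread level does not go unnoticed. */
    if (provided < MPI_THREAD_MULTIPLE) {
        if (rank == 0)
            fprintf(stderr, "MPI_THREAD_MULTIPLE not available "
                            "(provided level: %d)\n", provided);
        MPI_Finalize();
        return EXIT_FAILURE;
    }

    /* ...multi-threaded MPI work would go here... */

    MPI_Finalize();
    return EXIT_SUCCESS;
}
```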
The investigation of this issue is ongoing and focuses, of course, on the
latest versions. For this reason, and because of the new features and bug
fixes that are available, it is recommended that you upgrade to the latest
MPI versions.
Please contact me if you must use MPI with multiple communicating threads
per process and would like to upgrade the MPI library you use.
Best regards,
Michael Schliephake