All of the FFTW MPI code is located in the mpi subdirectory of the
FFTW package.  On Unix systems, the FFTW MPI libraries and header
files are automatically configured, compiled, and installed along with
the uniprocessor FFTW libraries simply by including --enable-mpi in
the flags to the configure script (see Installation on Unix).

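As a concrete sketch, a complete build might look like the following; the install prefix is an assumption, so adjust it for your system:

```shell
# Configure, build, and install both the serial and the MPI FFTW
# libraries in one pass.  The prefix here is hypothetical.
./configure --enable-mpi --prefix=/usr/local
make
make install   # may require root privileges for this prefix
```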
Any implementation of the MPI standard, version 1 or later, should
work with FFTW.  The configure script will attempt to automatically
detect how to compile and link code using your MPI implementation.  In
some cases, especially if you have multiple different MPI
implementations installed or have an unusual MPI software package, you
may need to provide this information explicitly.

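Before running configure, it can help to check which MPI wrapper is on your PATH and what it actually invokes. The flags below are those of the common MPICH-style and Open MPI wrappers; which one your wrapper accepts depends on your installation:

```shell
# Show which mpicc the configure script is likely to find.
which mpicc
# Ask the wrapper what compiler and flags it runs under the hood:
mpicc -show      # MPICH-style wrappers
mpicc --showme   # Open MPI wrappers
```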
Most commonly, one compiles MPI code by invoking a special compiler
command, typically mpicc for C code.  The configure script knows the
most common names for this command, but you can specify the MPI
compilation command explicitly by setting the MPICC variable, as in
‘./configure MPICC=mpicc ...’.

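For example, with several MPI installations present, one might pin configure to a particular wrapper by absolute path; the directory below is hypothetical:

```shell
# Select a specific MPI compiler wrapper rather than relying on
# auto-detection; /opt/openmpi/bin is a made-up location.
./configure --enable-mpi MPICC=/opt/openmpi/bin/mpicc
```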
If, instead of a special compiler command, you need to link a certain
library, you can specify the link command via the MPILIBS variable, as
in ‘./configure MPILIBS=-lmpi ...’.  Note that if your MPI library is
installed in a non-standard location (one the compiler does not know
about by default), you may also have to specify the location of the
library and header files via the LDFLAGS and CPPFLAGS variables,
respectively, as in ‘./configure LDFLAGS=-L/path/to/mpi/libs
CPPFLAGS=-I/path/to/mpi/include ...’.
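Split across lines for readability, that invocation might be written as follows; the /path/to/mpi directories are placeholders to be replaced with your MPI installation's actual locations:

```shell
# Link against an MPI library installed under a non-standard prefix.
./configure --enable-mpi \
    MPILIBS=-lmpi \
    CPPFLAGS=-I/path/to/mpi/include \
    LDFLAGS=-L/path/to/mpi/libs
```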