mpi4torch is an automatically differentiable wrapper of MPI functions for the pytorch tensor library.
MPI stands for Message Passing Interface and is the de facto standard communication interface on high-performance computing resources. To facilitate the use of pytorch on these resources, an MPI wrapper that is transparent to pytorch's automatic differentiation (AD) engine is needed. This library aims to bridge that gap.
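As a minimal sketch of how the pieces listed in the API Reference below fit together, the snippet performs an Allreduce across ranks and backpropagates through the communication. The identifiers (`COMM_WORLD`, `MPI_SUM`, `Allreduce`, `rank`) are taken from the entries below; the exact `Allreduce(tensor, op)` argument order is an assumption and should be checked against the Basic Usage section.

```python
import torch
import mpi4torch

# World communicator, as listed in the API Reference below.
comm = mpi4torch.COMM_WORLD

# Each rank contributes its own value; requires_grad makes the
# tensor part of pytorch's AD graph.
x = torch.tensor([float(comm.rank) + 1.0], requires_grad=True)

# Sum the contributions of all ranks. The (tensor, op) argument
# order is assumed here.
y = comm.Allreduce(x, mpi4torch.MPI_SUM)

# Backpropagate through the collective; the adjoint communication
# is handled transparently by mpi4torch.
y.sum().backward()
print(comm.rank, x.grad)
```

Such a script would be launched like any other MPI program, e.g. `mpirun -np 4 python allreduce_example.py` (the file name is illustrative).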
- Basic Usage
- Examples
- API Reference
  - JoinDummies()
  - JoinDummiesHandle()
  - MPI_MAX
  - MPI_MIN
  - MPI_SUM
  - MPI_PROD
  - MPI_LAND
  - MPI_BAND
  - MPI_LOR
  - MPI_BOR
  - MPI_LXOR
  - MPI_BXOR
  - MPI_MINLOC
  - MPI_MAXLOC
  - COMM_WORLD
  - MPI_Communicator
  - MPI_Communicator.Allgather()
  - MPI_Communicator.Allreduce()
  - MPI_Communicator.Alltoall()
  - MPI_Communicator.Bcast_()
  - MPI_Communicator.Gather()
  - MPI_Communicator.Irecv()
  - MPI_Communicator.Isend()
  - MPI_Communicator.Recv()
  - MPI_Communicator.Reduce_()
  - MPI_Communicator.Scatter()
  - MPI_Communicator.Send()
  - MPI_Communicator.Wait()
  - MPI_Communicator.rank
  - MPI_Communicator.size
  - WaitHandle
  - comm_from_mpi4py()
  - deactivate_cuda_aware_mpi_support()
- Glossary