MPI for Python (mpi4py) on the FASRC cluster

Introduction

This page is intended to help you run MPI-parallel Python applications on the FASRC cluster using mpi4py.
To use **mpi4py** you need to load an appropriate Python software module. We provide the Anaconda Python distribution from Continuum Analytics, which, in addition to mpi4py, includes hundreds of the most popular packages for large-scale data processing and scientific computing.
You can load Python in your user environment by running the following in your terminal:

module load python/2.7.14-fasrc01
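
As a quick sanity check, you can confirm that mpi4py is importable from the loaded module (the Anaconda distribution described above provides it); the version number printed may differ:

python -c "import mpi4py; print(mpi4py.__version__)"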

Example Code

Below is a simple example program that uses mpi4py.

#!/usr/bin/env python
#++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
# Program: mpi4py_test.py
#++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
from mpi4py import MPI

nproc = MPI.COMM_WORLD.Get_size()   # Size of communicator
iproc = MPI.COMM_WORLD.Get_rank()   # Rank of this MPI process in the communicator
inode = MPI.Get_processor_name()    # Node where this MPI process runs

if iproc == 0:
    print("This code is a test for mpi4py.")

# Print one rank at a time, in rank order, using a barrier between iterations
for i in range(nproc):
    MPI.COMM_WORLD.Barrier()
    if iproc == i:
        print('Rank %d out of %d' % (iproc, nproc))

MPI.Finalize()
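
mpi4py also provides collective operations on the communicator. The following sketch (an additional illustrative program, here called mpi4py_reduce.py; the name and the example are not part of the FASRC test program above) sums the ranks of all processes with comm.reduce and prints the total on rank 0:

#!/usr/bin/env python
#++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
# Program: mpi4py_reduce.py (illustrative sketch)
#++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
from mpi4py import MPI

comm  = MPI.COMM_WORLD
nproc = comm.Get_size()     # Size of communicator
iproc = comm.Get_rank()     # Rank of this MPI process

# Sum the ranks of all processes; the result arrives only on rank 0
total = comm.reduce(iproc, op=MPI.SUM, root=0)

if iproc == 0:
    print('Sum of ranks 0..%d is %d' % (nproc - 1, total))

You can run this sketch with the same batch script shown below by changing the program name on the srun line.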

Running the program

You could use the following SLURM batch-job submission script to submit the job to the queue:

#!/bin/bash
#SBATCH -J mpi4py_test
#SBATCH -o mpi4py_test.out
#SBATCH -e mpi4py_test.err
#SBATCH -p shared
#SBATCH -n 16
#SBATCH -t 30
#SBATCH --mem-per-cpu=4000
module load python/2.7.14-fasrc01
srun -n $SLURM_NTASKS --mpi=pmi2 python mpi4py_test.py

If you name the above script run.sbatch, for instance, you can submit the job to the queue with:

sbatch run.sbatch
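
While the job is pending or running, you can check its status with standard SLURM commands; for example (replace JOBID with the job ID reported by sbatch):

squeue -u $USER
sacct -j JOBID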

Upon job completion, the job output will be located in the file mpi4py_test.out with the following contents:

This code is a test for mpi4py.
Rank 0 out of 16
Rank 1 out of 16
Rank 2 out of 16
Rank 3 out of 16
Rank 4 out of 16
Rank 5 out of 16
Rank 6 out of 16
Rank 7 out of 16
Rank 8 out of 16
Rank 9 out of 16
Rank 10 out of 16
Rank 11 out of 16
Rank 12 out of 16
Rank 13 out of 16
Rank 14 out of 16
Rank 15 out of 16

References

* MPI for Python
* mpi4py documentation
