Change references from mpirun to mpiexec in the docs.
Brian Granger
@@ -32,34 +32,34 @@ Starting the engines with MPI enabled
 To use code that calls MPI, there are typically two things that MPI requires.
 
 1. The process that wants to call MPI must be started using
-   :command:`mpirun` or a batch system (like PBS) that has MPI support.
+   :command:`mpiexec` or a batch system (like PBS) that has MPI support.
 2. Once the process starts, it must call :func:`MPI_Init`.
 
 There are a couple of ways that you can start the IPython engines and get these things to happen.
 
-Automatic starting using :command:`mpirun` and :command:`ipcluster`
+Automatic starting using :command:`mpiexec` and :command:`ipcluster`
 -------------------------------------------------------------------
 
-The easiest approach is to use the `mpirun` mode of :command:`ipcluster`, which will first start a controller and then a set of engines using :command:`mpirun`::
+The easiest approach is to use the `mpiexec` mode of :command:`ipcluster`, which will first start a controller and then a set of engines using :command:`mpiexec`::
 
-    $ ipcluster mpirun -n 4
+    $ ipcluster mpiexec -n 4
 
 This approach is best as interrupting :command:`ipcluster` will automatically
 stop and clean up the controller and engines.
 
-Manual starting using :command:`mpirun`
+Manual starting using :command:`mpiexec`
 ---------------------------------------
 
-If you want to start the IPython engines using the :command:`mpirun`, just do::
+If you want to start the IPython engines using :command:`mpiexec`, just do::
 
-    $ mpirun -n 4 ipengine --mpi=mpi4py
+    $ mpiexec -n 4 ipengine --mpi=mpi4py
 
 This requires that you already have a controller running and that the FURL
 files for the engines are in place. We also have built-in support for
 PyTrilinos [PyTrilinos]_, which can be used (assuming it is installed) by
 starting the engines with::
 
-    mpirun -n 4 ipengine --mpi=pytrilinos
+    $ mpiexec -n 4 ipengine --mpi=pytrilinos
 
 Automatic starting using PBS and :command:`ipcluster`
 -----------------------------------------------------
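
Point 2 above, the :func:`MPI_Init` call, is what the ``--mpi=mpi4py`` flag arranges for each engine: importing mpi4py initializes MPI automatically at import time. The following is a minimal standalone sketch of that behavior, assuming mpi4py is installed; the filename is illustrative and not part of this change::

    # hello_mpi.py -- illustrative check that MPI gets initialized.
    # Importing mpi4py calls MPI_Init automatically, which is the same
    # requirement that --mpi=mpi4py satisfies for the IPython engines.
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    print("process %d of %d has MPI initialized"
          % (comm.Get_rank(), comm.Get_size()))

Running ``mpiexec -n 4 python hello_mpi.py`` should print one line per process; if it does, the same environment should also work for ``mpiexec -n 4 ipengine --mpi=mpi4py``.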
@@ -84,7 +84,7 @@ First, let's define a simple function that uses MPI to calculate the sum of a dis
 
 Now, start an IPython cluster in the same directory as :file:`psum.py`::
 
-    $ ipcluster mpirun -n 4
+    $ ipcluster mpiexec -n 4
 
 Finally, connect to the cluster and use this function interactively. In this case, we create a random array on each engine and sum up all the random arrays using our :func:`psum` function:
 
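
The diff does not show :file:`psum.py` itself. As a rough sketch of what such a distributed-sum function could look like, assuming mpi4py and numpy are installed (the exact contents are not part of this change)::

    # psum.py -- sketch of a distributed sum. Each process sums its
    # local array, then an allreduce combines the partial sums so that
    # every process receives the same global total.
    import numpy as np
    from mpi4py import MPI

    def psum(a):
        local_sum = np.sum(a)
        # MPI.SUM adds the per-process partial sums across the whole
        # communicator and returns the result on every rank.
        return MPI.COMM_WORLD.allreduce(local_sum, op=MPI.SUM)

With the cluster from ``ipcluster mpiexec -n 4`` running, calling ``psum`` on an array that lives on each engine returns the same global sum on all four engines.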