Documentation work
Brian Granger
@@ -27,6 +27,9 @@ Release dev
 New features
 ------------
 
+* The wonderful TextMate editor can now be used with %edit on OS X. Thanks
+  to Matt Foster for this patch.
+
 * Fully refactored :command:`ipcluster` command line program for starting
   IPython clusters. This new version is a complete rewrite and 1) is fully
   cross platform (we now use Twisted's process management), 2) has much
@@ -20,11 +20,11 @@ If you want to use MPI with IPython, you will need to install:
 
 The `mpi4py`_ package is not a strict requirement. However, you need to
 have *some* way of calling MPI from Python. You also need some way of
-making sure that `MPI_Init` is called when the IPython engines start up.
-There are a number of ways of doing this and a good number of associated
-subtleties. We highly recommend just using `mpi4py`_ as it takes care of
-most of these problems. If you want to do something different, let us know
-and we can help you get started.
+making sure that :func:`MPI_Init` is called when the IPython engines start
+up. There are a number of ways of doing this and a good number of
+associated subtleties. We highly recommend just using `mpi4py`_ as it
+takes care of most of these problems. If you want to do something
+different, let us know and we can help you get started.
 
 Starting the engines with MPI enabled
 =====================================
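The paragraph above recommends `mpi4py`_ largely because importing it initializes MPI for you, so you never have to call `MPI_Init` by hand. A minimal illustrative sketch (not part of the original docs), guarded so it also runs on machines without mpi4py installed:

```python
# Sketch: importing mpi4py triggers MPI_Init automatically, once per process.
# Guarded with find_spec so the snippet is harmless where mpi4py is absent.
import importlib.util

if importlib.util.find_spec("mpi4py") is not None:
    from mpi4py import MPI   # MPI_Init is invoked here, during import
    mpi_ready = MPI.Is_initialized()
else:
    mpi_ready = None         # mpi4py not installed; nothing to initialize

print(mpi_ready)
```

This is exactly the "some way of making sure `MPI_Init` is called" that the text asks for, handled as a side effect of the import.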
@@ -33,7 +33,7 @@ To use code that calls `MPI`_, there are typically two things that `MPI`_ requir
 
 1. The process that wants to call `MPI`_ must be started using
    :command:`mpirun` or a batch system (like PBS) that has `MPI`_ support.
-2. Once the process starts, it must call `MPI_Init`.
+2. Once the process starts, it must call :func:`MPI_Init`.
 
 There are a couple of ways that you can start the IPython engines and get these things to happen.
 
@@ -65,13 +65,90 @@ The :command:`ipcluster` command also has built-in integration with PBS. For mor
 Actually using MPI
 ==================
 
-Once the engines are running with `MPI`_ enabled, you are ready to go. You can now call any code that uses MPI in the IPython engines. And all of this can be done interactively.
+Once the engines are running with `MPI`_ enabled, you are ready to go. You can now call any code that uses MPI in the IPython engines. And, all of this can be done interactively. Here we show a simple example that uses `mpi4py`_.
+
+First, let's define a simple function that uses MPI to calculate the sum of a distributed array. Save the following text in a file called :file:`psum.py`::
+
+    from mpi4py import MPI
+    import numpy as np
+
+    def psum(a):
+        s = np.sum(a)
+        return MPI.COMM_WORLD.allreduce(s, op=MPI.SUM)
+
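To see why :func:`psum` gives the right answer, it helps to simulate the reduction in plain Python, with a list of chunks standing in for the four engines. This is a hedged sketch (no MPI required): each engine sums its local chunk, and an Allreduce with `MPI.SUM` adds those partial sums so every engine ends up with the global total.

```python
# Plain-Python simulation of psum: 4 "engines", each holding a chunk of
# 100 random values, reduce their local sums into one global sum.
import random

random.seed(42)
chunks = [[random.random() for _ in range(100)] for _ in range(4)]

partial_sums = [sum(chunk) for chunk in chunks]  # np.sum(a) on each engine
global_sum = sum(partial_sums)                   # what Allreduce with MPI.SUM yields

# The distributed result matches summing the concatenated array directly.
assert abs(global_sum - sum(x for chunk in chunks for x in chunk)) < 1e-9
print(global_sum)
```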
+Now, start an IPython cluster in the same directory as :file:`psum.py`::
+
+    $ ipcluster mpirun -n 4
+
+Finally, connect to the cluster and use this function interactively. In this case, we create a random array on each engine and sum up all the random arrays using our :func:`psum` function::
+
+    In [1]: from IPython.kernel import client
+
+    In [2]: mec = client.MultiEngineClient()
+
+    In [3]: mec.activate()
+
+    In [4]: px import numpy as np
+    Parallel execution on engines: all
+    Out[4]:
+    <Results List>
+    [0] In [13]: import numpy as np
+    [1] In [13]: import numpy as np
+    [2] In [13]: import numpy as np
+    [3] In [13]: import numpy as np
+
+    In [6]: px a = np.random.rand(100)
+    Parallel execution on engines: all
+    Out[6]:
+    <Results List>
+    [0] In [15]: a = np.random.rand(100)
+    [1] In [15]: a = np.random.rand(100)
+    [2] In [15]: a = np.random.rand(100)
+    [3] In [15]: a = np.random.rand(100)
+
+    In [7]: px from psum import psum
+    Parallel execution on engines: all
+    Out[7]:
+    <Results List>
+    [0] In [16]: from psum import psum
+    [1] In [16]: from psum import psum
+    [2] In [16]: from psum import psum
+    [3] In [16]: from psum import psum
+
+    In [8]: px s = psum(a)
+    Parallel execution on engines: all
+    Out[8]:
+    <Results List>
+    [0] In [17]: s = psum(a)
+    [1] In [17]: s = psum(a)
+    [2] In [17]: s = psum(a)
+    [3] In [17]: s = psum(a)
+
+    In [9]: px print s
+    Parallel execution on engines: all
+    Out[9]:
+    <Results List>
+    [0] In [18]: print s
+    [0] Out[18]: 187.451545803
+
+    [1] In [18]: print s
+    [1] Out[18]: 187.451545803
+
+    [2] In [18]: print s
+    [2] Out[18]: 187.451545803
+
+    [3] In [18]: print s
+    [3] Out[18]: 187.451545803
+
+Any Python code that makes calls to MPI [MPIref]_ can be used in this manner.
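The Allreduce pattern in :func:`psum` is not limited to sums: the same structure works with other MPI reduction operations such as `MPI.MAX` or `MPI.PROD`. A hypothetical variation (not in the original docs), simulated in plain Python over one partial result per engine:

```python
# Simulated per-engine partial results; with real MPI each value would
# live on a different engine and Allreduce would combine them.
partial = [3.0, 7.0, 1.0, 5.0]

total = sum(partial)     # Allreduce with MPI.SUM: every engine sees 16.0
largest = max(partial)   # Allreduce with MPI.MAX: every engine sees 7.0

print(total, largest)
```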
 
 Complications
 =============
 
 Talk about how some older MPI implementations are broken and need to have a custom Python main loop.
 
+.. [MPIref] http://www-unix.mcs.anl.gov/mpi/
+
 .. _MPI: http://www-unix.mcs.anl.gov/mpi/
 .. _mpi4py: http://mpi4py.scipy.org/
 .. _Open MPI: http://www.open-mpi.org/