.. _parallelmpi:

=======================
Using MPI with IPython
=======================

Often, a parallel algorithm will require moving data between the engines. One
way of accomplishing this is by doing a pull and then a push using the
multiengine client. However, this will be slow as all the data has to go
through the controller to the client and then back through the controller, to
its final destination.

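For concreteness, the pull-then-push pattern described above looks roughly
like the sketch below. It uses the :class:`MultiEngineClient` interface shown
later in this document, but treat it only as an illustration of the data path
(engine to controller to client, then client to controller to engine) rather
than a recipe; the exact method names and signatures may differ between
IPython versions.

.. sourcecode:: python

    from IPython.kernel import client

    mec = client.MultiEngineClient()

    # Pull the variable 'a' from engine 0. The data travels from the engine,
    # through the controller, to this client process.
    a = mec.pull('a', targets=[0])

    # Push it to engine 1. The data goes back through the controller again.
    mec.push(dict(a=a), targets=[1])
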
A much better way of moving data between engines is to use a message passing
library, such as the Message Passing Interface (MPI) [MPI]_. IPython's
parallel computing architecture has been designed from the ground up to
integrate with MPI. This document describes how to use MPI with IPython.

Additional installation requirements
====================================

If you want to use MPI with IPython, you will need to install:

* A standard MPI implementation such as OpenMPI [OpenMPI]_ or MPICH.
* The mpi4py [mpi4py]_ package.

.. note::

    The mpi4py package is not a strict requirement. However, you need to
    have *some* way of calling MPI from Python. You also need some way of
    making sure that :func:`MPI_Init` is called when the IPython engines start
    up. There are a number of ways of doing this and a good number of
    associated subtleties. We highly recommend just using mpi4py as it
    takes care of most of these problems. If you want to do something
    different, let us know and we can help you get started.

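For reference, mpi4py's default behaviour is to call :func:`MPI_Init`
automatically when the :mod:`MPI` module is imported, which is exactly what
the engines need. A minimal way to check this (the file name
:file:`hello_mpi.py` is just an example, not part of IPython) is:

.. sourcecode:: python

    # hello_mpi.py -- importing mpi4py.MPI initializes MPI by default,
    # so no explicit call to MPI_Init is needed.
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    print "Hello from rank %d of %d" % (comm.Get_rank(), comm.Get_size())

Running this under :command:`mpiexec` should print one line per process::

    $ mpiexec -n 4 python hello_mpi.py
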
Starting the engines with MPI enabled
=====================================

To use code that calls MPI, there are typically two things that MPI requires.

1. The process that wants to call MPI must be started using
   :command:`mpiexec` or a batch system (like PBS) that has MPI support.
2. Once the process starts, it must call :func:`MPI_Init`.

There are a couple of ways that you can start the IPython engines and get
these things to happen.

Automatic starting using :command:`mpiexec` and :command:`ipcluster`
--------------------------------------------------------------------

The easiest approach is to use the `mpiexec` mode of :command:`ipcluster`,
which will first start a controller and then a set of engines using
:command:`mpiexec`::

    $ ipcluster mpiexec -n 4

This approach is best, as interrupting :command:`ipcluster` will automatically
stop and clean up the controller and engines.

Manual starting using :command:`mpiexec`
----------------------------------------

If you want to start the IPython engines using :command:`mpiexec`, just
do::

    $ mpiexec -n 4 ipengine --mpi=mpi4py

This requires that you already have a controller running and that the FURL
files for the engines are in place. We also have built-in support for
PyTrilinos [PyTrilinos]_, which can be used (assuming it is installed) by
starting the engines with::

    $ mpiexec -n 4 ipengine --mpi=pytrilinos

Automatic starting using PBS and :command:`ipcluster`
-----------------------------------------------------

The :command:`ipcluster` command also has built-in integration with PBS. For
more information on this approach, see our documentation on :ref:`ipcluster
<parallel_process>`.

Actually using MPI
==================

Once the engines are running with MPI enabled, you are ready to go. You can
now call any code that uses MPI in the IPython engines, and all of this can
be done interactively. Here we show a simple example that uses mpi4py
[mpi4py]_ version 1.1.0 or later.

First, let's define a simple function that uses MPI to calculate the sum of a
distributed array. Save the following text in a file called :file:`psum.py`:

.. sourcecode:: python

    from mpi4py import MPI
    import numpy as np

    def psum(a):
        # Sum the local chunk of the array on this engine.
        s = np.sum(a)
        # Receive buffer for the global result (a single double).
        rcvBuf = np.array(0.0, 'd')
        # Combine the local sums across all ranks; every rank gets the total.
        MPI.COMM_WORLD.Allreduce([s, MPI.DOUBLE],
                                 [rcvBuf, MPI.DOUBLE],
                                 op=MPI.SUM)
        return rcvBuf

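Though not part of the example itself, you can sanity-check :file:`psum.py`
outside of IPython with a plain :command:`mpiexec` run. The small driver below
(a hypothetical :file:`test_psum.py`) is only a sketch; it assumes mpi4py and
numpy are installed and that :file:`psum.py` is importable from the current
directory:

.. sourcecode:: python

    # test_psum.py -- hypothetical driver, not part of the IPython example.
    from mpi4py import MPI
    import numpy as np

    from psum import psum

    # Every rank contributes an array of ones, so with 4 ranks of 100
    # elements each, the global sum should be 400.0.
    a = np.ones(100)
    total = psum(a)

    if MPI.COMM_WORLD.Get_rank() == 0:
        print "global sum =", total

Run it directly with::

    $ mpiexec -n 4 python test_psum.py
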
Now, start an IPython cluster in the same directory as :file:`psum.py`::

    $ ipcluster mpiexec -n 4

Finally, connect to the cluster and use this function interactively. In this
case, we create a random array on each engine and sum up all the random arrays
using our :func:`psum` function:

.. sourcecode:: ipython

    In [1]: from IPython.kernel import client

    In [2]: mec = client.MultiEngineClient()

    In [3]: mec.activate()

    In [4]: px import numpy as np
    Parallel execution on engines: all
    Out[4]:
    <Results List>
    [0] In [13]: import numpy as np
    [1] In [13]: import numpy as np
    [2] In [13]: import numpy as np
    [3] In [13]: import numpy as np

    In [6]: px a = np.random.rand(100)
    Parallel execution on engines: all
    Out[6]:
    <Results List>
    [0] In [15]: a = np.random.rand(100)
    [1] In [15]: a = np.random.rand(100)
    [2] In [15]: a = np.random.rand(100)
    [3] In [15]: a = np.random.rand(100)

    In [7]: px from psum import psum
    Parallel execution on engines: all
    Out[7]:
    <Results List>
    [0] In [16]: from psum import psum
    [1] In [16]: from psum import psum
    [2] In [16]: from psum import psum
    [3] In [16]: from psum import psum

    In [8]: px s = psum(a)
    Parallel execution on engines: all
    Out[8]:
    <Results List>
    [0] In [17]: s = psum(a)
    [1] In [17]: s = psum(a)
    [2] In [17]: s = psum(a)
    [3] In [17]: s = psum(a)

    In [9]: px print s
    Parallel execution on engines: all
    Out[9]:
    <Results List>
    [0] In [18]: print s
    [0] Out[18]: 187.451545803

    [1] In [18]: print s
    [1] Out[18]: 187.451545803

    [2] In [18]: print s
    [2] Out[18]: 187.451545803

    [3] In [18]: print s
    [3] Out[18]: 187.451545803

Notice that every engine prints the same value: :func:`MPI_Allreduce` delivers
the global sum back to all of the ranks. Any Python code that makes calls to
MPI can be used in this manner, including compiled C, C++ and Fortran
libraries that have been exposed to Python.

.. [MPI] Message Passing Interface. http://www-unix.mcs.anl.gov/mpi/
.. [mpi4py] MPI for Python. http://mpi4py.scipy.org/
.. [OpenMPI] Open MPI. http://www.open-mpi.org/
.. [PyTrilinos] PyTrilinos. http://trilinos.sandia.gov/packages/pytrilinos/