.. _paralleltask:

==========================
The IPython task interface
==========================

.. contents::

The ``Task`` interface to the controller presents the engines as a fault tolerant, dynamic load-balanced system of workers. Unlike the ``MultiEngine`` interface, in the ``Task`` interface the user has no direct access to individual engines. In some ways, this interface is simpler, but in other ways it is more powerful. Best of all, the user can use both of these interfaces at the same time to take advantage of both of their strengths. When the user can break their work up into segments that do not depend on previous execution, the ``Task`` interface is ideal. It also has more power and flexibility, allowing the user to guide the distribution of jobs, without having to assign Tasks to engines explicitly.

Starting the IPython controller and engines
===========================================

To follow along with this tutorial, the user will need to start the IPython
controller and four IPython engines. The simplest way of doing this is to
use the ``ipcluster`` command::

    $ ipcluster -n 4

For more detailed information about starting the controller and engines, see our :ref:`introduction <parallel_overview>` to using IPython for parallel computing.

The magic here is that this single controller and set of engines is running both the MultiEngine and ``Task`` interfaces simultaneously.
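
Both client interfaces can therefore be used side by side against that one controller. A minimal sketch, assuming the default connection settings used throughout this document::

    In [1]: from IPython.kernel import client

    In [2]: tc = client.TaskClient()            # load-balanced Task interface

    In [3]: mec = client.MultiEngineClient()    # direct MultiEngine interface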

QuickStart Task Farming
=======================

First, a quick example of how to start running the most basic Tasks.
The first step is to import the IPython ``client`` module and then create a ``TaskClient`` instance::

    In [1]: from IPython.kernel import client

    In [2]: tc = client.TaskClient()

Then the user wraps the commands they want to run in Tasks::

    In [3]: tasklist = []

    In [4]: for n in range(1000):
       ...:     tasklist.append(client.Task("a = %i" % n, pull="a"))

The first argument of the ``Task`` constructor is a string, the command to be executed. The most important optional keyword argument is ``pull``, which can be a string or list of strings, and it specifies the variable names to be saved as results of the ``Task``.

Next, the user needs to submit the Tasks to the ``TaskController`` with the ``TaskClient``::

    In [5]: taskids = [ tc.run(t) for t in tasklist ]

This will give the user a list of the TaskIDs used by the controller to keep track of the Tasks and their results. Now, at some point the user is going to want to get those results back. The ``barrier`` method allows the user to wait for the Tasks to finish running::

    In [6]: tc.barrier(taskids)

This command will block until all the Tasks in ``taskids`` have finished. Now, the user probably wants to look at the results::

    In [7]: task_results = [ tc.get_task_result(taskid) for taskid in taskids ]

Now the user has a list of ``TaskResult`` objects, which have the actual result as a dictionary, but also keep track of some useful metadata about the ``Task``::

    In [8]: tr = task_results[73]

    In [9]: tr
    Out[9]: TaskResult[ID:73]:{'a':73}

    In [10]: tr.engineid
    Out[10]: 1

    In [11]: tr.submitted, tr.completed, tr.duration
    Out[11]: ("2008/03/08 03:41:42", "2008/03/08 03:41:44", 2.12345)

The actual results are stored in a dictionary, ``tr.results``, and in a namespace object, ``tr.ns``, which accesses the result keys by attribute::

    In [12]: tr.results['a']
    Out[12]: 73

    In [13]: tr.ns.a
    Out[13]: 73

That should cover the basics of running simple Tasks. There are several more powerful things the user can do with Tasks, covered later. Probably the most useful is using a ``MultiEngineClient`` interface to initialize all the engines with the import dependencies necessary to run the user's Tasks, as sketched below.
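
A minimal sketch of that initialization pattern, assuming the default client connections (``numpy`` here stands in for whatever modules the Tasks actually need)::

    In [14]: mec = client.MultiEngineClient()

    In [15]: mec.execute("import numpy")    # run the import on every engine

    In [16]: t = client.Task("a = numpy.random.rand(100).sum()", pull="a")

    In [17]: tid = tc.run(t)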

There are many options for running and managing Tasks. The best way to learn more about the ``Task`` interface is to study the examples in ``examples``. If the user does so and learns a lot about this interface, we encourage the user to expand this documentation about the ``Task`` system.

Overview of the Task System
===========================

The user's view of the ``Task`` system has three basic objects: the ``TaskClient``, the ``Task``, and the ``TaskResult``. The names of these three objects indicate their roles well.

The ``TaskClient`` is the user's ``Task`` farming connection to the IPython cluster. Unlike the ``MultiEngineClient``, the ``TaskController`` handles all the scheduling and distribution of work, so the ``TaskClient`` has no notion of engines; it just submits Tasks and requests their results. The Tasks are described as ``Task`` objects, and their results are wrapped in ``TaskResult`` objects. Thus, there are very few necessary methods for the user to manage.

Inside the task system is a Scheduler object, which assigns tasks to workers. The default scheduler is a simple FIFO queue. Subclassing the Scheduler should be easy; you only need to implement your own priority system.
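
To illustrate the idea only, a priority scheduler differs from the default FIFO one simply in the order it hands out queued tasks. The class and method names below are illustrative and are not the actual IPython scheduler API::

    import heapq
    from collections import deque

    class FIFOQueue(object):
        # Hands out tasks in submission order, like the default scheduler.
        def __init__(self):
            self.queue = deque()
        def add(self, task):
            self.queue.append(task)
        def pop(self):
            return self.queue.popleft()

    class PriorityQueue(object):
        # Hands out the highest-priority task first; ties break by submission order.
        def __init__(self):
            self.heap = []
            self.counter = 0
        def add(self, task, priority=0):
            heapq.heappush(self.heap, (-priority, self.counter, task))
            self.counter += 1
        def pop(self):
            return heapq.heappop(self.heap)[-1]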

The TaskClient
==============

The ``TaskClient`` is the object the user uses to connect to the ``Controller`` that is managing their Tasks. It is the analog of the ``MultiEngineClient`` for the standard IPython multiplexing interface. As with all client interfaces, the first step is to import the IPython ``client`` module::

    In [1]: from IPython.kernel import client

Just as with the ``MultiEngineClient``, the user creates the ``TaskClient`` with a tuple containing the IP address and port of the ``Controller``. The ``client`` module conveniently provides the default address of the ``Task`` interface of the controller, so creating a default ``TaskClient`` object would be done like this::

    In [2]: tc = client.TaskClient(client.default_task_address)

or, if the user wants to specify a non-default location of the ``Controller``, it can be given explicitly::

    In [3]: tc = client.TaskClient(("192.168.1.1", 10113))

As discussed earlier, the ``TaskClient`` only has a few basic methods.

* ``tc.run(task)``
    ``run`` is the method by which the user submits Tasks. It takes exactly one argument, a ``Task`` object. All the advanced control of ``Task`` behavior is handled by properties of the ``Task`` object, rather than the submission command, so they will be discussed later in the `Task`_ section. ``run`` returns an integer, the TaskID by which the ``Task`` and its results can be tracked and retrieved::

        In [4]: taskid = tc.run(task)

* ``tc.get_task_result(taskid, block=False)``
    ``get_task_result`` is the method by which results are retrieved. It takes a single integer argument, the TaskID of the result the user wishes to retrieve. ``get_task_result`` also takes a keyword argument ``block``, which specifies whether the user actually wants to wait for the result. If ``block`` is false, as it is by default, ``get_task_result`` will return immediately. If the ``Task`` has completed, it will return the ``TaskResult`` object for that ``Task``; if the ``Task`` has not completed, it will return ``None``. If the user specifies ``block=True``, then ``get_task_result`` will wait for the ``Task`` to complete, and always return the ``TaskResult`` for the requested ``Task``.
* ``tc.barrier(taskid(s))``
    ``barrier`` is a synchronization method. It takes exactly one argument, a TaskID or list of TaskIDs. ``barrier`` will block until all the specified Tasks have completed. In practice, a barrier is often called between the ``Task`` submission section of the code and the result gathering section::

        In [5]: taskIDs = [ tc.run(t) for t in myTasks ]

        In [6]: tc.get_task_result(taskIDs[-1]) is None
        Out[6]: True

        In [7]: tc.barrier(taskIDs)

        In [8]: results = [ tc.get_task_result(tid) for tid in taskIDs ]

* ``tc.queue_status(verbose=False)``
    ``queue_status`` is a method for querying the state of the ``TaskController``. It returns a dict of the form::

        {'scheduled': Tasks that have been submitted but not yet run
         'pending'  : Tasks that are currently running
         'succeeded': Tasks that have completed successfully
         'failed'   : Tasks that have finished with a failure
        }

    If ``verbose`` is not specified (or is ``False``), then the values of the dict are integers - the number of Tasks in each state. If ``verbose`` is ``True``, then each element in the dict is a list of the TaskIDs in that state::

        In [8]: tc.queue_status()
        Out[8]: {'scheduled': 4,
                 'pending'  : 2,
                 'succeeded': 5,
                 'failed'   : 1
                }

        In [9]: tc.queue_status(verbose=True)
        Out[9]: {'scheduled': [8,9,10,11],
                 'pending'  : [6,7],
                 'succeeded': [0,1,2,4,5],
                 'failed'   : [3]
                }

* ``tc.abort(taskid)``
    ``abort`` allows the user to abort Tasks that have already been submitted. ``abort`` will always return immediately. If the ``Task`` has already completed, ``abort`` will raise an ``IndexError: Task Already Completed``. An obvious case for ``abort`` would be where the user submits a long-running ``Task`` with a number of retries (see the `Task`_ section for how to specify retries) in an interactive session, but realizes there has been a typo. The user can then abort the ``Task``, preventing certain failures from cluttering up the queue. It can also be used for parallel search-type problems, where only one ``Task`` will give the solution, so once the user finds the solution, they would want to abort all remaining Tasks to prevent wasted work (see the sketch after this list).
* ``tc.spin()``
    ``spin`` simply triggers the scheduler in the ``TaskController``. Under most normal circumstances, this will do nothing. The primary known use case involves the ``Task`` dependency (see `Dependencies`_). The dependency is a function of an engine's ``properties``, but changing the ``properties`` via the ``MultiEngineClient`` does not trigger a reschedule event. The main example case for this requires the following event sequence:

    * ``engine`` is available, ``Task`` is submitted, but ``engine`` does not have ``Task``'s dependencies.
    * ``engine`` gets the necessary dependencies while no new Tasks are submitted or completed.
    * Now ``engine`` can run ``Task``, but a ``Task`` event is required for the ``TaskController`` to try scheduling ``Task`` again.

    ``spin`` is just an empty ping method to ensure that the Controller has scheduled all available Tasks, and should not be needed under most normal circumstances.
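
As referenced in the ``abort`` entry above, here is a hedged sketch of the search-and-abort pattern. The command string and the ``found_solution`` test are hypothetical placeholders; only the ``TaskClient`` calls shown are part of the interface described here::

    In [9]: searches = [ client.Task("result = search(%i)" % n, pull="result") for n in range(100) ]

    In [10]: taskids = [ tc.run(t) for t in searches ]

    In [11]: for tid in taskids:
       ....:     tr = tc.get_task_result(tid, block=True)
       ....:     if found_solution(tr.ns.result):        # found_solution() is a hypothetical test
       ....:         for other in taskids:
       ....:             try:
       ....:                 tc.abort(other)             # already-completed Tasks raise IndexError
       ....:             except IndexError:
       ....:                 pass
       ....:         break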

That covers the ``TaskClient``, a simple interface to the cluster. With it, the user can submit jobs (and abort them if necessary), request their results, and synchronize on arbitrary subsets of jobs.

.. _task:

The Task Object
===============

The ``Task`` is the basic object for describing a job. It can be used in a very simple manner, where the user just specifies a command string to be executed as the ``Task``. The usage of this first argument is exactly the same as the ``execute`` method of the ``MultiEngine`` (in fact, ``execute`` is called to run the code)::

    In [1]: t = client.Task("a = str(id)")

This ``Task`` would run, and store the string representation of the ``id`` element in ``a`` in each worker's namespace, but it is fairly useless because the user does not know anything about the state of the ``worker`` on which it ran at the time of retrieving results. It is important that each ``Task`` not expect the state of the ``worker`` to persist after the ``Task`` is completed.

There are many different situations for using ``Task`` farming, and the ``Task`` object has many attributes for customizing the ``Task`` behavior. All of a ``Task``'s attributes may be specified in the constructor, through keyword arguments, or after ``Task`` construction through attribute assignment.
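
For example, the ``Task`` above could just as well be built up by attribute assignment after construction (a minimal sketch using attributes documented in the subsections below)::

    In [2]: t = client.Task("a = str(id)")

    In [3]: t.pull = "a"

    In [4]: t.retries = 1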

Data Attributes
***************
It is likely that the user may want to move data around before or after executing the ``Task``. We provide methods of sending data to initialize the worker's namespace, and of specifying what data to bring back as the ``Task``'s results.

* pull = []
    The obvious case is as above, where ``t`` would execute and store a result in ``a``, and it is likely that the user would want to bring ``a`` back to their namespace. This is done through the ``pull`` attribute. ``pull`` can be a string or list of strings, and it specifies the names of variables to be retrieved. The ``TaskResult`` object retrieved by ``get_task_result`` will have a dictionary of keys and values, and the ``Task``'s ``pull`` attribute determines what goes into it::

        In [2]: t = client.Task("a = str(id)", pull = "a")

        In [3]: t = client.Task("a = str(id)", pull = ["a", "id"])

* push = {}
    A user might also want to initialize some data into the namespace before the code part of the ``Task`` is run. Enter ``push``. ``push`` is a dictionary of key/value pairs to be loaded from the user's namespace into the worker's immediately before execution::

        In [4]: t = client.Task("a = f(submitted)", push=dict(submitted=time.time()), pull="a")

``push`` and ``pull`` result directly in calls to an ``engine``'s ``push`` and ``pull`` methods before and after ``Task`` execution respectively, and thus their API is the same.

Namespace Cleaning
******************
When a user is running a large number of Tasks, it is likely that the workers' namespaces could become cluttered. Some Tasks might be sensitive to clutter, while others might be known to cause namespace pollution. For these reasons, Tasks have two boolean attributes for cleaning up the namespace.

* ``clear_after``
    If ``clear_after`` is set to ``True``, the worker on which the ``Task`` was run will be reset (via ``engine.reset``) upon completion of the ``Task``. This can be useful both for Tasks that produce clutter and for Tasks whose intermediate data one might wish to keep private::

        In [5]: t = client.Task("a = range(1e10)", pull = "a", clear_after=True)

* ``clear_before``
    As one might guess, ``clear_before`` is identical to ``clear_after``, but it takes place before the ``Task`` is run. This ensures that the ``Task`` runs on a fresh worker::

        In [6]: t = client.Task("a = globals()", pull = "a", clear_before=True)

Of course, a user can use both at the same time, ensuring that all workers are clear except when they are currently running a job. Both of these default to ``False``.
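
A hedged one-liner combining the two, so each run both starts and finishes with a clean worker (``compute`` is a hypothetical function assumed to exist on the engines)::

    In [7]: t = client.Task("result = compute()", pull="result", clear_before=True, clear_after=True)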

Fault Tolerance
***************
It is possible that Tasks might fail, and there are a variety of reasons this could happen. One might be that the worker it was running on disconnected, and there was nothing wrong with the ``Task`` itself. With the fault tolerance attributes of the ``Task``, the user can specify how many times to resubmit the ``Task``, and what to do if it never succeeds.

* ``retries``
    ``retries`` is an integer, specifying the number of times a ``Task`` is to be retried. It defaults to zero. It is often a good idea for this number to be 1 or 2, to protect the ``Task`` from disconnecting engines, but not a large number. If a ``Task`` is failing 100 times, there is probably something wrong with the ``Task``. The canonical bad example::

        In [7]: t = client.Task("os.kill(os.getpid(), 9)", retries=99)

    This would actually take down 100 workers.

* ``recovery_task``
    ``recovery_task`` is another ``Task`` object, to be run in the event of the original ``Task`` still failing after running out of retries. Since ``recovery_task`` is another ``Task`` object, it can have its own ``recovery_task``. The chain of Tasks is limitless, except that loops are not allowed (that would be bad!).
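
A hedged sketch of a retry-then-fall-back chain (``expensive_calc`` and ``cheap_estimate`` are hypothetical placeholders)::

    In [8]: backup = client.Task("result = cheap_estimate()", pull="result")

    In [9]: t = client.Task("result = expensive_calc()", pull="result", retries=2, recovery_task=backup)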

Dependencies
************
Dependencies are the most powerful part of the ``Task`` farming system, because they allow the user to do some classification of the workers, and guide the ``Task`` distribution without meddling with the controller directly. They make use of two objects - the ``Task``'s ``depend`` attribute, and the engine's ``properties``. See the `MultiEngine`_ reference for how to use engine properties. The engine properties API exists for extending IPython, allowing conditional execution and new controllers that make decisions based on the properties of their engines. Currently the ``Task`` dependency is the only internal use of the properties API.

.. _MultiEngine: ./parallel_multiengine

The ``depend`` attribute of a ``Task`` must be a function of exactly one argument, the worker's properties dictionary, and it should return ``True`` if the ``Task`` should be allowed to run on the worker and ``False`` if not. The usage in the controller is fault tolerant, so exceptions raised by ``Task.depend`` will be ignored and are functionally equivalent to always returning ``False``. Tasks with invalid ``depend`` functions will never be assigned to a worker::

    In [8]: def dep(properties):
       ...:     return properties["RAM"] > 2**32 # have at least 4GB

    In [9]: t = client.Task("a = bigfunc()", depend=dep)

It is important to note that assignment of values to the properties dict is done entirely by the user, either locally (in the engine) using the EngineAPI, or remotely, through the ``MultiEngineClient``'s get/set_properties methods.
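
A hedged sketch of setting such a property remotely; the exact ``set_properties`` signature used here (a dictionary plus a ``targets`` argument selecting engines) is an assumption, so check the `MultiEngine`_ reference before relying on it::

    In [10]: mec = client.MultiEngineClient()

    In [11]: mec.set_properties({'RAM': 2**34}, targets=[0, 1])   # assumed signature

    In [12]: tid = tc.run(client.Task("a = bigfunc()", depend=dep))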

=================
IPython reference
=================

.. _command_line_options:

Command-line usage
==================

You start IPython with the command::

    $ ipython [options] files

.. note::

    For IPython on Python 3, use ``ipython3`` in place of ``ipython``.

If invoked with no options, it executes all the files listed in sequence
and drops you into the interpreter while still acknowledging any options
you may have set in your ipython_config.py. This behavior is different from
standard Python, which when called as python -i will only execute one
file and ignore your configuration setup.
23
23
24 Please note that some of the configuration options are not available at
24 Please note that some of the configuration options are not available at
25 the command line, simply because they are not practical here. Look into
25 the command line, simply because they are not practical here. Look into
26 your configuration files for details on those. There are separate configuration
26 your configuration files for details on those. There are separate configuration
27 files for each profile, and the files look like "ipython_config.py" or
27 files for each profile, and the files look like "ipython_config.py" or
28 "ipython_config_<frontendname>.py". Profile directories look like
28 "ipython_config_<frontendname>.py". Profile directories look like
29 "profile_profilename" and are typically installed in the IPYTHONDIR directory.
29 "profile_profilename" and are typically installed in the IPYTHONDIR directory.
30 For Linux users, this will be $HOME/.config/ipython, and for other users it
30 For Linux users, this will be $HOME/.config/ipython, and for other users it
31 will be $HOME/.ipython. For Windows users, $HOME resolves to C:\\Documents and
31 will be $HOME/.ipython. For Windows users, $HOME resolves to C:\\Documents and
32 Settings\\YourUserName in most instances.
32 Settings\\YourUserName in most instances.
33
33
34
34
35 Eventloop integration
35 Eventloop integration
36 ---------------------
36 ---------------------
37
37
38 Previously IPython had command line options for controlling GUI event loop
38 Previously IPython had command line options for controlling GUI event loop
39 integration (-gthread, -qthread, -q4thread, -wthread, -pylab). As of IPython
39 integration (-gthread, -qthread, -q4thread, -wthread, -pylab). As of IPython
40 version 0.11, these have been removed. Please see the new ``%gui``
40 version 0.11, these have been removed. Please see the new ``%gui``
41 magic command or :ref:`this section <gui_support>` for details on the new
41 magic command or :ref:`this section <gui_support>` for details on the new
42 interface, or specify the gui at the commandline::
42 interface, or specify the gui at the commandline::
43
43
44 $ ipython --gui=qt
44 $ ipython --gui=qt
45
45
46
46
47 Command-line Options
47 Command-line Options
48 --------------------
48 --------------------
49
49
50 To see the options IPython accepts, use ``ipython --help`` (and you probably
50 To see the options IPython accepts, use ``ipython --help`` (and you probably
51 should run the output through a pager such as ``ipython --help | less`` for
51 should run the output through a pager such as ``ipython --help | less`` for
52 more convenient reading). This shows all the options that have a single-word
52 more convenient reading). This shows all the options that have a single-word
53 alias to control them, but IPython lets you configure all of its objects from
53 alias to control them, but IPython lets you configure all of its objects from
54 the command-line by passing the full class name and a corresponding value; type
54 the command-line by passing the full class name and a corresponding value; type
55 ``ipython --help-all`` to see this full list. For example::
55 ``ipython --help-all`` to see this full list. For example::
56
56
57 ipython --matplotlib qt
57 ipython --matplotlib qt
58
58
59 is equivalent to::
59 is equivalent to::
60
60
61 ipython --TerminalIPythonApp.matplotlib='qt'
61 ipython --TerminalIPythonApp.matplotlib='qt'
62
62
63 Note that in the second form, you *must* use the equal sign, as the expression
63 Note that in the second form, you *must* use the equal sign, as the expression
64 is evaluated as an actual Python assignment. While in the above example the
64 is evaluated as an actual Python assignment. While in the above example the
65 short form is more convenient, only the most common options have a short form,
65 short form is more convenient, only the most common options have a short form,
66 while any configurable variable in IPython can be set at the command-line by
66 while any configurable variable in IPython can be set at the command-line by
67 using the long form. This long form is the same syntax used in the
67 using the long form. This long form is the same syntax used in the
68 configuration files, if you want to set these options permanently.
68 configuration files, if you want to set these options permanently.
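
For instance, the permanent equivalent of the option above would be a line like the following in your profile's ``ipython_config.py`` (a minimal sketch; adapt it to your own profile):

.. sourcecode:: python

    # ipython_config.py -- loaded when the profile starts
    c = get_config()

    # Same setting as `ipython --TerminalIPythonApp.matplotlib='qt'`
    c.TerminalIPythonApp.matplotlib = 'qt'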


Interactive use
===============

IPython is meant to work as a drop-in replacement for the standard interactive
interpreter. As such, any code which is valid python should execute normally
under IPython (cases where this is not true should be reported as bugs). It
does, however, offer many features which are not available at a standard python
prompt. What follows is a list of these.


Caution for Windows users
-------------------------

Windows, unfortunately, uses the '\\' character as a path separator. This is a
terrible choice, because '\\' also represents the escape character in most
modern programming languages, including Python. For this reason, using the '/'
character is recommended if you have problems with ``\``. However, in Windows
commands '/' flags options, so you cannot use it for the root directory. This
means that paths beginning at the root must be typed in a contrived manner
like: ``%copy \opt/foo/bar.txt \tmp``

.. _magic:

Magic command system
--------------------

IPython will treat any line whose first character is a % as a special
call to a 'magic' function. These allow you to control the behavior of
IPython itself, plus a lot of system-type features. They are all
prefixed with a % character, but parameters are given without
parentheses or quotes.

Lines that begin with ``%%`` signal a *cell magic*: they take as arguments not
only the rest of the current line, but all lines below them as well, in the
current execution block. Cell magics can in fact make arbitrary modifications
to the input they receive, which need not even be valid Python code at all.
They receive the whole block as a single string.

As a line magic example, the ``%cd`` magic works just like the OS command of
the same name::

    In [8]: %cd
    /home/fperez

The following uses the builtin ``timeit`` in cell mode::

    In [10]: %%timeit x = range(10000)
       ...: min(x)
       ...: max(x)
       ...:
    1000 loops, best of 3: 438 us per loop

In this case, ``x = range(10000)`` is called as the line argument, and the
block with ``min(x)`` and ``max(x)`` is called as the cell body. The
``timeit`` magic receives both.

If you have 'automagic' enabled (as it is by default), you don't need to type in
the single ``%`` explicitly for line magics; IPython will scan its internal
list of magic functions and call one if it exists. With automagic on you can
then just type ``cd mydir`` to go to directory 'mydir'::

    In [9]: cd mydir
    /home/fperez/mydir

Note that cell magics *always* require an explicit ``%%`` prefix; automagic
calling only works for line magics.

The automagic system has the lowest possible precedence in name searches, so
defining an identifier with the same name as an existing magic function will
shadow it for automagic use. You can still access the shadowed magic function
by explicitly using the ``%`` character at the beginning of the line.

An example (with automagic on) should clarify all this:

.. sourcecode:: ipython

    In [1]: cd ipython  # %cd is called by automagic
    /home/fperez/ipython

    In [2]: cd=1  # now cd is just a variable

    In [3]: cd ..  # and doesn't work as a function anymore
    File "<ipython-input-3-9fedb3aff56c>", line 1
        cd ..
           ^
    SyntaxError: invalid syntax


    In [4]: %cd ..  # but %cd always works
    /home/fperez

    In [5]: del cd  # if you remove the cd variable, automagic works again

    In [6]: cd ipython

    /home/fperez/ipython

Defining your own magics
++++++++++++++++++++++++

There are two main ways to define your own magic functions: from standalone
functions and by inheriting from a base class provided by IPython:
:class:`IPython.core.magic.Magics`. Below we show code you can place in a file
that you load from your configuration, such as any file in the ``startup``
subdirectory of your default IPython profile.

First, let us see the simplest case. The following shows how to create a line
magic, a cell one and one that works in both modes, using just plain functions:

.. sourcecode:: python

    from IPython.core.magic import (register_line_magic, register_cell_magic,
                                    register_line_cell_magic)

    @register_line_magic
    def lmagic(line):
        "my line magic"
        return line

    @register_cell_magic
    def cmagic(line, cell):
        "my cell magic"
        return line, cell

    @register_line_cell_magic
    def lcmagic(line, cell=None):
        "Magic that works both as %lcmagic and as %%lcmagic"
        if cell is None:
            print "Called as line magic"
            return line
        else:
            print "Called as cell magic"
            return line, cell

    # We delete these to avoid name conflicts for automagic to work
    del lmagic, lcmagic


You can also create magics of all three kinds by inheriting from the
:class:`IPython.core.magic.Magics` class. This lets you create magics that can
potentially hold state in between calls, and that have full access to the main
IPython object:

.. sourcecode:: python

    # This code can be put in any Python module, it does not require IPython
    # itself to be running already. It only creates the magics subclass but
    # doesn't instantiate it yet.
    from IPython.core.magic import (Magics, magics_class, line_magic,
                                    cell_magic, line_cell_magic)

    # The class MUST call this class decorator at creation time
    @magics_class
    class MyMagics(Magics):

        @line_magic
        def lmagic(self, line):
            "my line magic"
            print "Full access to the main IPython object:", self.shell
            print "Variables in the user namespace:", self.shell.user_ns.keys()
            return line

        @cell_magic
        def cmagic(self, line, cell):
            "my cell magic"
            return line, cell

        @line_cell_magic
        def lcmagic(self, line, cell=None):
            "Magic that works both as %lcmagic and as %%lcmagic"
            if cell is None:
                print "Called as line magic"
                return line
            else:
                print "Called as cell magic"
                return line, cell


    # In order to actually use these magics, you must register them with a
    # running IPython. This code must be placed in a file that is loaded once
    # IPython is up and running:
    ip = get_ipython()
    # You can register the class itself without instantiating it. IPython will
    # call the default constructor on it.
    ip.register_magics(MyMagics)
256
256
257 If you want to create a class with a different constructor that holds
257 If you want to create a class with a different constructor that holds
258 additional state, then you should always call the parent constructor and
258 additional state, then you should always call the parent constructor and
259 instantiate the class yourself before registration:
259 instantiate the class yourself before registration:
260
260
261 .. sourcecode:: python
261 .. sourcecode:: python
262
262
263 @magics_class
263 @magics_class
264 class StatefulMagics(Magics):
264 class StatefulMagics(Magics):
265 "Magics that hold additional state"
265 "Magics that hold additional state"
266
266
267 def __init__(self, shell, data):
267 def __init__(self, shell, data):
268 # You must call the parent constructor
268 # You must call the parent constructor
269 super(StatefulMagics, self).__init__(shell)
269 super(StatefulMagics, self).__init__(shell)
270 self.data = data
270 self.data = data
271
271
272 # etc...
272 # etc...
273
273
274 # This class must then be registered with a manually created instance,
274 # This class must then be registered with a manually created instance,
275 # since its constructor has different arguments from the default:
275 # since its constructor has different arguments from the default:
276 ip = get_ipython()
276 ip = get_ipython()
277 magics = StatefulMagics(ip, some_data)
277 magics = StatefulMagics(ip, some_data)
278 ip.register_magics(magics)
278 ip.register_magics(magics)
279
279
280
280
In earlier versions, IPython had an API for the creation of line magics (cell
magics did not exist at the time) that required you to create functions with a
method-looking signature and to manually pass both the function and the name.
While this API is no longer recommended, it remains indefinitely supported for
backwards compatibility purposes. With the old API, you'd create a magic as
follows:

.. sourcecode:: python

    def func(self, line):
        print "Line magic called with line:", line
        print "IPython object:", self.shell

    ip = get_ipython()
    # Declare this function as the magic %mycommand
    ip.define_magic('mycommand', func)

Type ``%magic`` at any time for more information, including a list of all
available magic functions and their docstrings. You can also type
``%magic_function_name?`` (see :ref:`below <dynamic_object_info>` for
information on the '?' system) to get information about any particular magic
function you are interested in.

The API documentation for the :mod:`IPython.core.magic` module contains the full
docstrings of all currently available magic commands.


Access to the standard Python help
----------------------------------

Simply type ``help()`` to access Python's standard help system. You can
also type ``help(object)`` for information about a given object, or
``help('keyword')`` for information on a keyword. You may need to configure your
PYTHONDOCS environment variable for this feature to work correctly.

.. _dynamic_object_info:

Dynamic object information
--------------------------

Typing ``?word`` or ``word?`` prints detailed information about an object. If
certain strings in the object are too long (e.g. function signatures) they get
snipped in the center for brevity. This system gives access to variable types
and values, docstrings, function prototypes and other useful information.

If the information will not fit in the terminal, it is displayed in a pager
(``less`` if available, otherwise a basic internal pager).

Typing ``??word`` or ``word??`` gives access to the full information, including
the source code where possible. Long strings are not snipped.

The following magic functions are particularly useful for gathering
information about your working environment. You can get more details by
typing ``%magic`` or querying them individually (``%function_name?``);
this is just a summary:

* **%pdoc <object>**: Print (or run through a pager if too long) the
  docstring for an object. If the given object is a class, it will
  print both the class and the constructor docstrings.
* **%pdef <object>**: Print the call signature for any callable
  object. If the object is a class, print the constructor information.
* **%psource <object>**: Print (or run through a pager if too long)
  the source code for an object.
* **%pfile <object>**: Show the entire source file where an object was
  defined via a pager, opening it at the line where the object
  definition begins.
* **%who/%whos**: These functions give information about identifiers
  you have defined interactively (not things you loaded or defined
  in your configuration files). %who just prints a list of
  identifiers and %whos prints a table with some basic details about
  each identifier.

Note that the dynamic object information functions (?/??, ``%pdoc``,
``%pfile``, ``%pdef``, ``%psource``) work on object attributes, as well as
directly on variables. For example, after doing ``import os``, you can use
``os.path.abspath??``.

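As a minimal illustration (using the standard library :mod:`os` module, with
the pager output omitted here), a short introspection session might look like
this::

    In [1]: import os

    In [2]: os.path.abspath?

    In [3]: %pdef os.path.join

    In [4]: %psource os.path.abspath
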
.. _readline:

Readline-based features
-----------------------

These features require the GNU readline library, so they won't work if your
Python installation lacks readline support. We will first describe the default
behavior IPython uses, and then how to change it to suit your preferences.


Command line completion
+++++++++++++++++++++++

At any time, hitting TAB will complete any available Python commands or
variable names, and show you a list of the possible completions if
there's no unambiguous one. It will also complete filenames in the
current directory if no Python names match what you've typed so far.


Search command history
++++++++++++++++++++++

IPython provides two ways of searching through previous input, thus reducing
the need for repetitive typing:

1. Start typing, and then use Ctrl-p (previous, up) and Ctrl-n
   (next, down) to search through only the history items that match
   what you've typed so far. If you use Ctrl-p/Ctrl-n at a blank
   prompt, they just behave like normal arrow keys.
2. Hit Ctrl-r: opens a search prompt. Begin typing and the system
   searches your history for lines that contain what you've typed so
   far, completing as much as it can.


Persistent command history across sessions
++++++++++++++++++++++++++++++++++++++++++

IPython will save your input history when it leaves and reload it next
time you restart it. By default, the history file is named
$IPYTHONDIR/profile_<name>/history.sqlite. This allows you to keep
separate histories related to various tasks: commands related to
numerical work will not be clobbered by a system shell history, for
example.


Autoindent
++++++++++

IPython can recognize lines ending in ':' and indent the next line,
while also un-indenting automatically after 'raise' or 'return'.

This feature uses the readline library, so it will honor your
:file:`~/.inputrc` configuration (or whatever file your INPUTRC environment
variable points to). Adding the following lines to your :file:`.inputrc` file
can make indenting/unindenting more convenient (M-i indents, M-u unindents)::

    # if you don't already have a ~/.inputrc file, you need this include:
    $include /etc/inputrc

    $if Python
    "\M-i": "    "
    "\M-u": "\d\d\d\d"
    $endif

Note that there are 4 spaces between the quote marks after "M-i" above.

.. warning::

   Setting the above indents will cause problems with unicode text entry in
   the terminal.

.. warning::

   Autoindent is ON by default, but it can cause problems with the pasting of
   multi-line indented code (the pasted code gets re-indented on each line). A
   magic function %autoindent allows you to toggle it on/off at runtime. You
   can also disable it permanently in your :file:`ipython_config.py` file
   (set ``TerminalInteractiveShell.autoindent=False``).

   If you want to paste multiple lines in the terminal, it is recommended that
   you use ``%paste``.


Customizing readline behavior
+++++++++++++++++++++++++++++

All these features are based on the GNU readline library, which has an
extremely customizable interface. Normally, readline is configured via a
file which defines the behavior of the library; the details of the
syntax for this can be found in the readline documentation available
with your system or on the Internet. IPython doesn't read this file (if
it exists) directly, but it does support passing to readline valid
options via a simple interface. In brief, you can customize readline by
setting the following options in your configuration file (note
that these options cannot be specified at the command line):

* **readline_parse_and_bind**: this holds a list of strings to be executed
  via a readline.parse_and_bind() command. The syntax for valid commands
  of this kind can be found by reading the documentation for the GNU
  readline library, as these commands are of the kind which readline
  accepts in its configuration file.
* **readline_remove_delims**: a string of characters to be removed
  from the default word-delimiters list used by readline, so that
  completions may be performed on strings which contain them. Do not
  change the default value unless you know what you're doing.

You will find the default values in your configuration file.


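A sketch of what this can look like in :file:`ipython_config.py` is shown
below (the options are set here on ``InteractiveShell``; depending on your
IPython version they may need to go on ``TerminalInteractiveShell`` instead,
so treat the exact class name as an assumption)::

    c = get_config()

    # Extra key bindings passed straight to readline.parse_and_bind().
    c.InteractiveShell.readline_parse_and_bind = [
        'tab: complete',
        r'"\C-o": operate-and-get-next',
    ]

    # Remove '-' and '/' from the delimiter list so that completion
    # also works on strings containing those characters.
    c.InteractiveShell.readline_remove_delims = '-/~'
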
Session logging and restoring
-----------------------------

You can log all input from a session either by starting IPython with the
command line switch ``--logfile=foo.py`` (see :ref:`here <command_line_options>`)
or by activating the logging at any moment with the magic function %logstart.

Log files can later be reloaded by running them as scripts and IPython
will attempt to 'replay' the log by executing all the lines in it, thus
restoring the state of a previous session. This feature is not quite
perfect, but can still be useful in many cases.

The log files can also be used as a way to have a permanent record of
any code you wrote while experimenting. Log files are regular text files
which you can later open in your favorite text editor to extract code or
to 'clean them up' before using them to replay a session.

The ``%logstart`` function for activating logging in mid-session is used as
follows::

    %logstart [log_name [log_mode]]

If no name is given, it defaults to a file named 'ipython_log.py' in your
current working directory, in 'rotate' mode (see below).

'%logstart name' saves to file 'name' in 'backup' mode. It saves your
history up to that point and then continues logging.

%logstart takes a second optional parameter: logging mode. This can be
one of (note that the modes are given unquoted):

* [over:] overwrite existing log_name.
* [backup:] rename (if exists) to log_name~ and start log_name.
* [append:] append to an existing log_name.
* [rotate:] create rotating logs log_name.1~, log_name.2~, etc.

The %logoff and %logon functions allow you to temporarily stop and
resume logging to a file which had previously been started with
%logstart. They will fail (with an explanation) if you try to use them
before logging has been started.

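For instance, a short session using these magics might look like this (the
file name ``mysession.py`` is arbitrary, and output is omitted)::

    In [1]: %logstart mysession.py append

    In [2]: x = 2**10

    In [3]: %logoff

    In [4]: %logon
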
.. _system_shell_access:

System shell access
-------------------

Any input line beginning with a ! character is passed verbatim (minus
the !, of course) to the underlying operating system. For example,
typing ``!ls`` will run 'ls' in the current directory.

Manual capture of command output
--------------------------------

You can assign the result of a system command to a Python variable with the
syntax ``myfiles = !ls``. This gets machine readable output from stdout
(e.g. without colours), and splits on newlines. To explicitly get this sort of
output without assigning to a variable, use two exclamation marks (``!!ls``) or
the ``%sx`` magic command.

The captured list has some convenience features. ``myfiles.n`` or ``myfiles.s``
returns a string delimited by newlines or spaces, respectively. ``myfiles.p``
produces `path objects <http://pypi.python.org/pypi/path.py>`_ from the list items.
See :ref:`string_lists` for details.

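For example (output omitted), you might capture a directory listing and then
reuse it in different forms::

    In [1]: myfiles = !ls

    In [2]: myfiles.s     # the same list as one space-separated string

    In [3]: myfiles.n     # ... or as one newline-separated string
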
IPython also allows you to expand the value of Python variables when
making system calls. Wrap variables or expressions in {braces}::

    In [1]: pyvar = 'Hello world'
    In [2]: !echo "A python variable: {pyvar}"
    A python variable: Hello world
    In [3]: import math
    In [4]: x = 8
    In [5]: !echo {math.factorial(x)}
    40320

For simple cases, you can alternatively prepend $ to a variable name::

    In [6]: !echo $sys.argv
    [/home/fperez/usr/bin/ipython]
    In [7]: !echo "A system variable: $$HOME"  # Use $$ for literal $
    A system variable: /home/fperez

System command aliases
----------------------

The %alias magic function allows you to define magic functions which are in fact
system shell commands. These aliases can have parameters.

``%alias alias_name cmd`` defines 'alias_name' as an alias for 'cmd'

Then, typing ``alias_name params`` will execute the system command 'cmd
params' (from your underlying operating system).

You can also define aliases with parameters using %s specifiers (one per
parameter). The following example defines the parts function as an
alias to the command 'echo first %s second %s' where each %s will be
replaced by a positional parameter to the call to %parts::

    In [1]: %alias parts echo first %s second %s
    In [2]: parts A B
    first A second B
    In [3]: parts A
    ERROR: Alias <parts> requires 2 arguments, 1 given.

If called with no parameters, %alias prints the table of currently
defined aliases.

The %rehashx magic allows you to load your entire $PATH as
IPython aliases. See its docstring for further details.


.. _dreload:

Recursive reload
----------------

The :mod:`IPython.lib.deepreload` module allows you to recursively reload a
module: changes made to any of its dependencies will be reloaded without
having to exit. To start using it, do::

    from IPython.lib.deepreload import reload as dreload


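You can then use ``dreload`` as a drop-in replacement for the builtin
``reload``; in this sketch, ``mymodule`` is just a placeholder for one of
your own modules::

    import mymodule

    # Reloads mymodule and, recursively, the modules it imports.
    dreload(mymodule)
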
Verbose and colored exception traceback printouts
-------------------------------------------------

IPython provides the option to see very detailed exception tracebacks,
which can be especially useful when debugging large programs. You can
run any Python file with the %run function to benefit from these
detailed tracebacks. Furthermore, both normal and verbose tracebacks can
be colored (if your terminal supports it) which makes them much easier
to parse visually.

See the ``%xmode`` and ``%colors`` magics for details (just type ``%magic``).

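For example, both settings can be changed at any time from within a session
(the argument values shown here are just one possible choice)::

    In [1]: %xmode Verbose

    In [2]: %colors LightBG
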
These features are basically a terminal version of Ka-Ping Yee's cgitb
module, now part of the standard Python library.


.. _input_caching:

Input caching system
--------------------

IPython offers numbered prompts (In/Out) with input and output caching
(also referred to as 'input history'). All input is saved and can be
retrieved as variables (besides the usual arrow key recall), in
addition to the %rep magic command that brings a history entry
up for editing on the next command line.

The following GLOBAL variables always exist (so don't overwrite them!):

* _i, _ii, _iii: store previous, next previous and next-next previous inputs.
* In, _ih: a list of all inputs; _ih[n] is the input from line n. If you
  overwrite In with a variable of your own, you can remake the assignment to
  the internal list with a simple ``In=_ih``.

Additionally, global variables named _i<n> are dynamically created (<n>
being the prompt counter), so ``_i<n> == _ih[<n>] == In[<n>]``.

For example, what you typed at prompt 14 is available as _i14, _ih[14]
and In[14].

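A minimal illustration of the input cache (comments describe what each line
does; output is not shown)::

    In [1]: x = 2 + 2

    In [2]: _i1          # the string 'x = 2 + 2', i.e. the input from prompt 1

    In [3]: exec _i1     # re-execute the input from prompt 1

    In [4]: %hist 1-3    # show inputs 1 through 3
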
This allows you to easily cut and paste multi-line interactive prompts
by printing them out: they print like a clean string, without prompt
characters. You can also manipulate them like regular variables (they
are strings), modify or exec them (typing ``exec _i9`` will re-execute the
contents of input prompt 9).

You can also re-execute multiple lines of input easily by using the
magic %rerun or %macro functions. The macro system also allows you to re-execute
previous lines which include magic function calls (which require special
processing). Type %macro? for more details on the macro system.

A history function %hist allows you to see any part of your input
history by printing a range of the _i variables.

You can also search ('grep') through your history by typing
``%hist -g somestring``. This is handy for searching for URLs, IP addresses,
etc. You can bring history entries listed by '%hist -g' up for editing
with the %recall command, or run them immediately with %rerun.

.. _output_caching:

Output caching system
---------------------

For output that is returned from actions, a system similar to the input
cache exists but using _ instead of _i. Only actions that produce a
result (NOT assignments, for example) are cached. If you are familiar
with Mathematica, IPython's _ variables behave exactly like
Mathematica's % variables.

The following GLOBAL variables always exist (so don't overwrite them!):

* [_] (a single underscore): stores previous output, like Python's
  default interpreter.
* [__] (two underscores): next previous.
* [___] (three underscores): next-next previous.

Additionally, global variables named _<n> are dynamically created (<n>
being the prompt counter), such that the result of output <n> is always
available as _<n> (don't use the angle brackets, just the number, e.g.
_21).

These variables are also stored in a global dictionary (not a
list, since it only has entries for lines which returned a result)
available under the names _oh and Out (similar to _ih and In). So the
output from line 12 can be obtained as _12, Out[12] or _oh[12]. If you
accidentally overwrite the Out variable you can recover it by typing
'Out=_oh' at the prompt.

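A short session showing the output cache in action::

    In [1]: 2 ** 10
    Out[1]: 1024

    In [2]: _1 + 1
    Out[2]: 1025

    In [3]: Out[1] == _oh[1] == _1
    Out[3]: True
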
This system can put heavy memory demands on your system, since it
prevents Python's garbage collector from removing any previously
computed results. You can control how many results are kept in memory
with the ``cache_size`` option (at the command line or in your
configuration file). If you set it to 0, the whole system is completely
disabled and the prompts revert to the classic '>>>' of normal Python.


Directory history
-----------------

Your history of visited directories is kept in the global list _dh, and
the magic %cd command can be used to go to any entry in that list. The
%dhist command allows you to view this history. Do ``cd -<TAB>`` to
conveniently view the directory history.


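For example (output omitted; the directories are placeholders)::

    In [1]: %cd /tmp

    In [2]: %dhist       # list the directories visited so far

    In [3]: %cd -2       # jump back to entry 2 in that list
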
Automatic parentheses and quotes
--------------------------------

These features were adapted from Nathan Gray's LazyPython. They are
meant to allow less typing for common situations.


Automatic parentheses
+++++++++++++++++++++

Callable objects (i.e. functions, methods, etc.) can be invoked like this
(notice the commas between the arguments)::

    In [1]: callable_ob arg1, arg2, arg3
    ------> callable_ob(arg1, arg2, arg3)

You can force automatic parentheses by using '/' as the first character
of a line. For example::

    In [2]: /globals    # becomes 'globals()'

Note that the '/' MUST be the first character on the line! This won't work::

    In [3]: print /globals    # syntax error

In most cases the automatic algorithm should work, so you should rarely
need to explicitly invoke /. One notable exception is if you are trying
to call a function with a list of tuples as arguments (the parentheses
will confuse IPython)::

    In [4]: zip (1,2,3),(4,5,6)    # won't work

but this will work::

    In [5]: /zip (1,2,3),(4,5,6)
    ------> zip ((1,2,3),(4,5,6))
    Out[5]: [(1, 4), (2, 5), (3, 6)]

IPython tells you that it has altered your command line by displaying
the new command line preceded by ``------>``, e.g.::

    In [6]: callable list
    ------> callable(list)


Automatic quoting
+++++++++++++++++

You can force automatic quoting of a function's arguments by using ','
or ';' as the first character of a line. For example::

    In [1]: ,my_function /home/me    # becomes my_function("/home/me")

If you use ';' the whole argument is quoted as a single string, while ',' splits
on whitespace::

    In [2]: ,my_function a b c    # becomes my_function("a","b","c")

    In [3]: ;my_function a b c    # becomes my_function("a b c")

Note that the ',' or ';' MUST be the first character on the line! This
won't work::

    In [4]: x = ,my_function /home/me    # syntax error

IPython as your default Python environment
==========================================

Python honors the environment variable PYTHONSTARTUP and will execute at
startup the file referenced by this variable. If you put the following code at
the end of that file, then IPython will be your working environment anytime you
start Python::

    from IPython.frontend.terminal.ipapp import launch_new_instance
    launch_new_instance()
    raise SystemExit

The ``raise SystemExit`` is needed to exit Python when
it finishes, otherwise you'll be back at the normal Python '>>>'
prompt.

This is probably useful to developers who manage multiple Python
versions and don't want to have correspondingly multiple IPython
versions. Note that in this mode, there is no way to pass IPython any
command-line options, as those are trapped first by Python itself.

.. _Embedding:

Embedding IPython
=================

You can start a regular IPython session with

.. sourcecode:: python

    import IPython
    IPython.start_ipython()

at any point in your program. This will load IPython configuration,
startup files, and everything, just as if it were a normal IPython session.
In addition to this, it is possible to embed an IPython instance inside
your own Python programs. This allows you to dynamically evaluate the state
of your code, operate with your variables, analyze them, etc. Note, however,
that any changes you make to values while in the shell do not propagate back
to the running code, so it is safe to modify your values because you
won't break your code in bizarre ways by doing so.

.. note::

   At present, embedding IPython cannot be done from inside IPython.
   Run the code samples below outside IPython.

This feature allows you to easily have a fully functional Python
environment for doing object introspection anywhere in your code with a
simple function call. In some cases a simple print statement is enough,
but if you need to do more detailed analysis of a code fragment this
feature can be very valuable.

It can also be useful in scientific computing situations where it is
common to need to do some automatic, computationally intensive part and
then stop to look at data, plots, etc.
Opening an IPython instance will give you full access to your data and
functions, and you can resume program execution once you are done with
the interactive part (perhaps to stop again later, as many times as
needed).

The following code snippet is the bare minimum you need to include in
your Python programs for this to work (detailed examples follow later)::

    from IPython import embed

    embed() # this call anywhere in your program will start IPython

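As a small, self-contained sketch (the function and data are made up for
illustration), dropping into an embedded shell from inside a function looks
like this::

    from IPython import embed

    def process(data):
        result = [x * 2 for x in data]
        # An interactive shell opens here; `data` and `result` are
        # available for inspection, and exiting the shell resumes
        # normal execution of the program.
        embed()
        return result

    process([1, 2, 3])
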
.. note::

   As of 0.13, you can embed an IPython *kernel*, for use with qtconsole,
   etc. via ``IPython.embed_kernel()`` instead of ``IPython.embed()``.
   It should function just the same as regular embed, but you connect
   an external frontend rather than IPython starting up in the local
   terminal.

You can run embedded instances even in code which is itself being run at
the IPython interactive prompt with '%run <filename>'. Since it's easy
to get lost as to where you are (in your top-level IPython or in your
embedded one), it's a good idea in such cases to set the in/out prompts
to something different for the embedded instances. The code examples
below illustrate this.

You can also have multiple IPython instances in your program and open
them separately, for example with different options for data
presentation. If you close and open the same instance multiple times,
its prompt counters simply continue from each execution to the next.

Please look at the docstrings in the :mod:`~IPython.frontend.terminal.embed`
module for more details on the use of this system.

The following sample file, illustrating how to use the embedding
functionality, is provided in the examples directory as
:file:`example-embed.py`. It should be fairly self-explanatory:

.. literalinclude:: ../../../examples/core/example-embed.py
   :language: python

Once you understand how the system functions, you can use the following
code fragments in your programs, ready for cut and paste:

.. literalinclude:: ../../../examples/core/example-embed-short.py
   :language: python

Using the Python debugger (pdb)
===============================

Running entire programs via pdb
-------------------------------

pdb, the Python debugger, is a powerful interactive debugger which
allows you to step through code, set breakpoints, watch variables,
etc. IPython makes it very easy to start any script under the control
of pdb, regardless of whether you have wrapped it into a 'main()'
function or not. For this, simply type '%run -d myscript' at an
IPython prompt. See the %run command's documentation (via '%run?' or
in Sec. magic_) for more details, including how to control where pdb
will stop execution first.

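For example (``myscript.py`` is a placeholder for your own script)::

    In [1]: %run -d myscript.py

    In [2]: %run -d -b40 myscript.py   # stop at a breakpoint on line 40
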
For more information on the use of the pdb debugger, read the included
pdb.doc file (part of the standard Python distribution). On a stock
Linux system it is located at /usr/lib/python2.3/pdb.doc, but the
easiest way to read it is by using the help() function of the pdb module
as follows (in an IPython prompt)::

    In [1]: import pdb
    In [2]: pdb.help()

This will load the pdb.doc document in a file viewer for you automatically.


Automatic invocation of pdb on exceptions
-----------------------------------------

IPython, if started with the ``--pdb`` option (or if the option is set in
your config file), can call the Python pdb debugger every time your code
triggers an uncaught exception. This feature
can also be toggled at any time with the %pdb magic command. This can be
extremely useful in order to find the origin of subtle bugs, because pdb
opens up at the point in your code which triggered the exception, and
while your program is at this point 'dead', all the data is still
available and you can walk up and down the stack frame and understand
the origin of the problem.

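For instance, to toggle this behavior interactively::

    In [1]: %pdb on     # always drop into pdb on uncaught exceptions

    In [2]: %pdb off    # turn the automatic debugger off again
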
Furthermore, you can use these debugging facilities both with the
embedded IPython mode and without IPython at all. For an embedded shell
(see sec. Embedding_), simply call the constructor with
``--pdb`` in the argument string and pdb will automatically be called if an
uncaught exception is triggered by your code.

For stand-alone use of the feature in your programs which do not use
IPython at all, put the following lines toward the top of your 'main'
routine::

    import sys
    from IPython.core import ultratb
    sys.excepthook = ultratb.FormattedTB(mode='Verbose',
                                         color_scheme='Linux', call_pdb=1)

The mode keyword can be either 'Verbose' or 'Plain', giving either very
detailed or normal tracebacks respectively. The color_scheme keyword can
be one of 'NoColor', 'Linux' (default) or 'LightBG'. These are the same
options which can be set in IPython with ``--colors`` and ``--xmode``.

This will give any of your programs detailed, colored tracebacks with
automatic invocation of pdb.


Extensions for syntax processing
================================

This isn't for the faint of heart, because the potential for breaking
things is quite high. But it can be a very powerful and useful feature.
In a nutshell, you can redefine the way IPython processes the user input
line to accept new, special extensions to the syntax without needing to
change any of IPython's own code.

In the IPython/extensions directory you will find some examples
supplied, which we will briefly describe now. These can be used 'as is'
(and both provide very useful functionality), or you can use them as a
starting point for writing your own extensions.

.. _pasting_with_prompts:

Pasting of code starting with Python or IPython prompts
--------------------------------------------------------

IPython is smart enough to filter out input prompts, be they plain Python ones
(``>>>`` and ``...``) or IPython ones (``In [N]:`` and ``...:``). You can
therefore copy and paste from existing interactive sessions without worry.

The following is a 'screenshot' of how things work, copying an example from the
standard Python tutorial::

    In [1]: >>> # Fibonacci series:

    In [2]: ... # the sum of two elements defines the next

    In [3]: ... a, b = 0, 1

    In [4]: >>> while b < 10:
       ...:     ... print b
       ...:     ... a, b = b, a+b
       ...:
    1
    1
    2
    3
    5
    8

And pasting from IPython sessions works equally well::

    In [1]: In [5]: def f(x):
       ...:    ...:     "A simple function"
       ...:    ...:     return x**2
       ...:    ...:

    In [2]: f(3)
    Out[2]: 9

983 .. _gui_support:
983 .. _gui_support:
984
984
985 GUI event loop support
985 GUI event loop support
986 ======================
986 ======================
987
987
988 .. versionadded:: 0.11
988 .. versionadded:: 0.11
989 The ``%gui`` magic and :mod:`IPython.lib.inputhook`.
989 The ``%gui`` magic and :mod:`IPython.lib.inputhook`.
990
990
991 IPython has excellent support for working interactively with Graphical User
991 IPython has excellent support for working interactively with Graphical User
992 Interface (GUI) toolkits, such as wxPython, PyQt4/PySide, PyGTK and Tk. This is
992 Interface (GUI) toolkits, such as wxPython, PyQt4/PySide, PyGTK and Tk. This is
993 implemented using Python's builtin ``PyOSInputHook``. This implementation
993 implemented using Python's builtin ``PyOSInputHook``. This implementation
994 is extremely robust compared to our previous thread-based version. The
994 is extremely robust compared to our previous thread-based version. The
995 advantages of this are:
995 advantages of this are:
996
996
997 * GUIs can be enabled and disabled dynamically at runtime.
997 * GUIs can be enabled and disabled dynamically at runtime.
998 * The active GUI can be switched dynamically at runtime.
998 * The active GUI can be switched dynamically at runtime.
999 * In some cases, multiple GUIs can run simultaneously with no problems.
999 * In some cases, multiple GUIs can run simultaneously with no problems.
1000 * There is a developer API in :mod:`IPython.lib.inputhook` for customizing
1000 * There is a developer API in :mod:`IPython.lib.inputhook` for customizing
1001 all of these things.
1001 all of these things.
1002
1002
1003 For users, enabling GUI event loop integration is simple. You simply use the
1003 For users, enabling GUI event loop integration is simple. You simply use the
1004 ``%gui`` magic as follows::
1004 ``%gui`` magic as follows::
1005
1005
1006 %gui [GUINAME]
1006 %gui [GUINAME]
1007
1007
1008 With no arguments, ``%gui`` removes all GUI support. Valid ``GUINAME``
1008 With no arguments, ``%gui`` removes all GUI support. Valid ``GUINAME``
1009 arguments are ``wx``, ``qt``, ``gtk`` and ``tk``.
1009 arguments are ``wx``, ``qt``, ``gtk`` and ``tk``.
1010
1010
1011 Thus, to use wxPython interactively and create a running :class:`wx.App`
1011 Thus, to use wxPython interactively and create a running :class:`wx.App`
1012 object, do::
1012 object, do::
1013
1013
1014 %gui wx
1014 %gui wx
1015
1015
1016 For information on IPython's Matplotlib integration (and the ``matplotlib``
1016 For information on IPython's Matplotlib integration (and the ``matplotlib``
1017 mode) see :ref:`this section <matplotlib_support>`.
1017 mode) see :ref:`this section <matplotlib_support>`.
1018
1018
1019 For developers that want to use IPython's GUI event loop integration in the
1019 For developers that want to use IPython's GUI event loop integration in the
1020 form of a library, these capabilities are exposed in library form in the
1020 form of a library, these capabilities are exposed in library form in the
1021 :mod:`IPython.lib.inputhook` and :mod:`IPython.lib.guisupport` modules.
1021 :mod:`IPython.lib.inputhook` and :mod:`IPython.lib.guisupport` modules.
1022 Interested developers should see the module docstrings for more information,
1022 Interested developers should see the module docstrings for more information,
1023 but there are a few points that should be mentioned here.
1023 but there are a few points that should be mentioned here.
1024
1024
1025 First, the ``PyOSInputHook`` approach only works in command line settings
1025 First, the ``PyOSInputHook`` approach only works in command line settings
1026 where readline is activated. The integration with various event loops
1026 where readline is activated. The integration with various event loops
1027 is handled somewhat differently (and more simply) when using the standalone
1027 is handled somewhat differently (and more simply) when using the standalone
1028 kernel, as in the qtconsole and notebook.
1028 kernel, as in the qtconsole and notebook.
1029
1029
1030 Second, when using the ``PyOSInputHook`` approach, a GUI application should
1030 Second, when using the ``PyOSInputHook`` approach, a GUI application should
1031 *not* start its event loop. Instead all of this is handled by the
1031 *not* start its event loop. Instead all of this is handled by the
1032 ``PyOSInputHook``. This means that applications that are meant to be used both
1032 ``PyOSInputHook``. This means that applications that are meant to be used both
1033 in IPython and as standalone apps need to have special code to detect how the
1033 in IPython and as standalone apps need to have special code to detect how the
1034 application is being run. We highly recommend using IPython's support for this.
1034 application is being run. We highly recommend using IPython's support for this.
1035 Since the details vary slightly between toolkits, we point you to the various
1035 Since the details vary slightly between toolkits, we point you to the various
1036 examples in our source directory :file:`docs/examples/lib` that demonstrate
1036 examples in our source directory :file:`examples/lib` that demonstrate
1037 these capabilities.
1037 these capabilities.
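
As a rough sketch of the pattern those examples follow (assuming wxPython is
installed; the bundled examples remain the reference):

.. sourcecode:: python

    import wx
    from IPython.lib.guisupport import get_app_wx, start_event_loop_wx

    def main():
        # reuse the wx.App that IPython created via ``%gui wx``, if any
        app = get_app_wx()
        frame = wx.Frame(None, title="hello")
        frame.Show()
        # starts the event loop only if one is not already running
        start_event_loop_wx(app)

    if __name__ == '__main__':
        main()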
1038
1038
1039 Third, unlike previous versions of IPython, we no longer "hijack" (replace
1039 Third, unlike previous versions of IPython, we no longer "hijack" (replace
1040 them with no-ops) the event loops. This is done to allow applications that
1040 them with no-ops) the event loops. This is done to allow applications that
1041 actually need to run the real event loops to do so. This is often needed to
1041 actually need to run the real event loops to do so. This is often needed to
1042 process pending events at critical points.
1042 process pending events at critical points.
1043
1043
1044 Finally, we also have a number of examples in our source directory
1044 Finally, we also have a number of examples in our source directory
1045 :file:`docs/examples/lib` that demonstrate these capabilities.
1045 :file:`examples/lib` that demonstrate these capabilities.
1046
1046
1047 PyQt and PySide
1047 PyQt and PySide
1048 ---------------
1048 ---------------
1049
1049
1050 .. attempt at explanation of the complete mess that is Qt support
1050 .. attempt at explanation of the complete mess that is Qt support
1051
1051
1052 When you use ``--gui=qt`` or ``--matplotlib=qt``, IPython can work with either
1052 When you use ``--gui=qt`` or ``--matplotlib=qt``, IPython can work with either
1053 PyQt4 or PySide. There are three options for configuration here, because
1053 PyQt4 or PySide. There are three options for configuration here, because
1054 PyQt4 has two APIs for QString and QVariant - v1, which is the default on
1054 PyQt4 has two APIs for QString and QVariant - v1, which is the default on
1055 Python 2, and the more natural v2, which is the only API supported by PySide.
1055 Python 2, and the more natural v2, which is the only API supported by PySide.
1056 v2 is also the default for PyQt4 on Python 3. IPython's code for the QtConsole
1056 v2 is also the default for PyQt4 on Python 3. IPython's code for the QtConsole
1057 uses v2, but you can still use any interface in your code, since the
1057 uses v2, but you can still use any interface in your code, since the
1058 Qt frontend is in a different process.
1058 Qt frontend is in a different process.
1059
1059
1060 The default will be to import PyQt4 without configuration of the APIs, thus
1060 The default will be to import PyQt4 without configuration of the APIs, thus
1061 matching what most applications would expect. It will fall back to PySide if
1061 matching what most applications would expect. It will fall back to PySide if
1062 PyQt4 is unavailable.
1062 PyQt4 is unavailable.
1063
1063
1064 If specified, IPython will respect the environment variable ``QT_API`` used
1064 If specified, IPython will respect the environment variable ``QT_API`` used
1065 by ETS. ETS 4.0 also works with both PyQt4 and PySide, but it requires
1065 by ETS. ETS 4.0 also works with both PyQt4 and PySide, but it requires
1066 PyQt4 to use its v2 API. So if ``QT_API=pyside`` PySide will be used,
1066 PyQt4 to use its v2 API. So if ``QT_API=pyside`` PySide will be used,
1067 and if ``QT_API=pyqt`` then PyQt4 will be used *with the v2 API* for
1067 and if ``QT_API=pyqt`` then PyQt4 will be used *with the v2 API* for
1068 QString and QVariant, so ETS code such as MayaVi will also work with IPython.
1068 QString and QVariant, so ETS code such as MayaVi will also work with IPython.
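
For example, you might select the binding in your shell before launching
IPython (illustrative)::

    $ export QT_API=pyside    # or: export QT_API=pyqt  (PyQt4 with the v2 API)
    $ ipython --gui=qt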
1069
1069
1070 If you launch IPython in matplotlib mode with ``ipython --matplotlib=qt``,
1070 If you launch IPython in matplotlib mode with ``ipython --matplotlib=qt``,
1071 then IPython will ask matplotlib which Qt library to use (only if QT_API is
1071 then IPython will ask matplotlib which Qt library to use (only if QT_API is
1072 *not set*), via the 'backend.qt4' rcParam. If matplotlib is version 1.0.1 or
1072 *not set*), via the 'backend.qt4' rcParam. If matplotlib is version 1.0.1 or
1073 older, then IPython will always use PyQt4 without setting the v2 APIs, since
1073 older, then IPython will always use PyQt4 without setting the v2 APIs, since
1074 neither v2 PyQt nor PySide work.
1074 neither v2 PyQt nor PySide work.
1075
1075
1076 .. warning::
1076 .. warning::
1077
1077
1078 Note that this means for ETS 4 to work with PyQt4, ``QT_API`` *must* be set
1078 Note that this means for ETS 4 to work with PyQt4, ``QT_API`` *must* be set
1079 to work with IPython's qt integration, because otherwise PyQt4 will be
1079 to work with IPython's qt integration, because otherwise PyQt4 will be
1080 loaded in an incompatible mode.
1080 loaded in an incompatible mode.
1081
1081
1082 It also means that you must *not* have ``QT_API`` set if you want to
1082 It also means that you must *not* have ``QT_API`` set if you want to
1083 use ``--gui=qt`` with code that requires PyQt4 API v1.
1083 use ``--gui=qt`` with code that requires PyQt4 API v1.
1084
1084
1085
1085
1086 .. _matplotlib_support:
1086 .. _matplotlib_support:
1087
1087
1088 Plotting with matplotlib
1088 Plotting with matplotlib
1089 ========================
1089 ========================
1090
1090
1091 `Matplotlib`_ provides high quality 2D and 3D plotting for Python. Matplotlib
1091 `Matplotlib`_ provides high quality 2D and 3D plotting for Python. Matplotlib
1092 can produce plots on screen using a variety of GUI toolkits, including Tk,
1092 can produce plots on screen using a variety of GUI toolkits, including Tk,
1093 PyGTK, PyQt4 and wxPython. It also provides a number of commands useful for
1093 PyGTK, PyQt4 and wxPython. It also provides a number of commands useful for
1094 scientific computing, all with a syntax compatible with that of the popular
1094 scientific computing, all with a syntax compatible with that of the popular
1095 Matlab program.
1095 Matlab program.
1096
1096
1097 To start IPython with matplotlib support, use the ``--matplotlib`` switch. If
1097 To start IPython with matplotlib support, use the ``--matplotlib`` switch. If
1098 IPython is already running, you can run the ``%matplotlib`` magic. If no
1098 IPython is already running, you can run the ``%matplotlib`` magic. If no
1099 arguments are given, IPython will automatically detect your choice of
1099 arguments are given, IPython will automatically detect your choice of
1100 matplotlib backend. You can also request a specific backend with
1100 matplotlib backend. You can also request a specific backend with
1101 ``%matplotlib backend``, where ``backend`` must be one of: 'tk', 'qt', 'wx',
1101 ``%matplotlib backend``, where ``backend`` must be one of: 'tk', 'qt', 'wx',
1102 'gtk', 'osx'. In the web notebook and Qt console, 'inline' is also a valid
1102 'gtk', 'osx'. In the web notebook and Qt console, 'inline' is also a valid
1103 backend value, which produces static figures inlined inside the application
1103 backend value, which produces static figures inlined inside the application
1104 window instead of matplotlib's interactive figures that live in separate
1104 window instead of matplotlib's interactive figures that live in separate
1105 windows.
1105 windows.
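
For example, to request a specific backend explicitly (illustrative; the
corresponding toolkit must be installed):

.. sourcecode:: ipython

    In [1]: %matplotlib qt

    In [2]: import matplotlib.pyplot as plt

    In [3]: plt.plot(range(10))   # opens an interactive Qt figure window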
1106
1106
1107 .. _Matplotlib: http://matplotlib.sourceforge.net
1107 .. _Matplotlib: http://matplotlib.sourceforge.net
1108
1108
1109 .. _interactive_demos:
1109 .. _interactive_demos:
1110
1110
1111 Interactive demos with IPython
1111 Interactive demos with IPython
1112 ==============================
1112 ==============================
1113
1113
1114 IPython ships with a basic system for running scripts interactively in
1114 IPython ships with a basic system for running scripts interactively in
1115 sections, useful when presenting code to audiences. A few tags embedded
1115 sections, useful when presenting code to audiences. A few tags embedded
1116 in comments (so that the script remains valid Python code) divide a file
1116 in comments (so that the script remains valid Python code) divide a file
1117 into separate blocks, and the demo can be run one block at a time, with
1117 into separate blocks, and the demo can be run one block at a time, with
1118 IPython printing (with syntax highlighting) the block before executing
1118 IPython printing (with syntax highlighting) the block before executing
1119 it, and returning to the interactive prompt after each block. The
1119 it, and returning to the interactive prompt after each block. The
1120 interactive namespace is updated after each block is run with the
1120 interactive namespace is updated after each block is run with the
1121 contents of the demo's namespace.
1121 contents of the demo's namespace.
1122
1122
1123 This allows you to show a piece of code, run it, and then interactively
1123 This allows you to show a piece of code, run it, and then interactively
1124 execute commands based on the variables just created. Once you
1124 execute commands based on the variables just created. Once you
1125 want to continue, you simply execute the next block of the demo. The
1125 want to continue, you simply execute the next block of the demo. The
1126 following listing shows the markup necessary for dividing a script into
1126 following listing shows the markup necessary for dividing a script into
1127 sections for execution as a demo:
1127 sections for execution as a demo:
1128
1128
1129 .. literalinclude:: ../../../examples/lib/example-demo.py
1129 .. literalinclude:: ../../../examples/lib/example-demo.py
1130 :language: python
1130 :language: python
1131
1131
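For orientation, the dividers are ordinary comments of the form ``# <demo> stop``;
a minimal sketch of such a file (the included example above shows the full set of
tags) might look like::

    print("first block: setup code")

    # <demo> stop

    x = 10
    print("x is now %i" % x)

    # <demo> stop

    print("the third block can keep using x: %i" % (x * 2))
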
1132 In order to run a file as a demo, you must first make a Demo object out
1132 In order to run a file as a demo, you must first make a Demo object out
1133 of it. If the file is named myscript.py, the following code will make a
1133 of it. If the file is named myscript.py, the following code will make a
1134 demo::
1134 demo::
1135
1135
1136 from IPython.lib.demo import Demo
1136 from IPython.lib.demo import Demo
1137
1137
1138 mydemo = Demo('myscript.py')
1138 mydemo = Demo('myscript.py')
1139
1139
1140 This creates the mydemo object, whose blocks you run one at a time by
1140 This creates the mydemo object, whose blocks you run one at a time by
1141 simply calling the object with no arguments. If you have autocall active
1141 simply calling the object with no arguments. If you have autocall active
1142 in IPython (the default), all you need to do is type::
1142 in IPython (the default), all you need to do is type::
1143
1143
1144 mydemo
1144 mydemo
1145
1145
1146 and IPython will call it, executing each block. Demo objects can be
1146 and IPython will call it, executing each block. Demo objects can be
1147 restarted; you can move forward or back, skip blocks, re-execute the
1147 restarted; you can move forward or back, skip blocks, re-execute the
1148 last block, etc. Simply use the Tab key on a demo object to see its
1148 last block, etc. Simply use the Tab key on a demo object to see its
1149 methods, and call '?' on them to see their docstrings for more usage
1149 methods, and call '?' on them to see their docstrings for more usage
1150 details. In addition, the demo module itself contains a comprehensive
1150 details. In addition, the demo module itself contains a comprehensive
1151 docstring, which you can access via::
1151 docstring, which you can access via::
1152
1152
1153 from IPython.lib import demo
1153 from IPython.lib import demo
1154
1154
1155 demo?
1155 demo?
1156
1156
1157 Limitations: It is important to note that these demos are limited to
1157 Limitations: It is important to note that these demos are limited to
1158 fairly simple uses. In particular, you cannot break up sections within
1158 fairly simple uses. In particular, you cannot break up sections within
1159 indented code (loops, if statements, function definitions, etc.).
1159 indented code (loops, if statements, function definitions, etc.).
1160 Supporting something like this would basically require tracking the
1160 Supporting something like this would basically require tracking the
1161 internal execution state of the Python interpreter, so only top-level
1161 internal execution state of the Python interpreter, so only top-level
1162 divisions are allowed. If you want to be able to open an IPython
1162 divisions are allowed. If you want to be able to open an IPython
1163 instance at an arbitrary point in a program, you can use IPython's
1163 instance at an arbitrary point in a program, you can use IPython's
1164 embedding facilities, see :func:`IPython.embed` for details.
1164 embedding facilities, see :func:`IPython.embed` for details.
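
For example, dropping into an IPython shell at an arbitrary point is as simple
as (a minimal sketch)::

    from IPython import embed

    def compute(x):
        y = x ** 2
        embed()   # opens an interactive IPython shell here, with x and y in scope
        return y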
1165
1165
@@ -1,150 +1,150 b''
1 .. _parallel_asyncresult:
1 .. _parallel_asyncresult:
2
2
3 ======================
3 ======================
4 The AsyncResult object
4 The AsyncResult object
5 ======================
5 ======================
6
6
7 In non-blocking mode, :meth:`apply` submits the command to be executed and
7 In non-blocking mode, :meth:`apply` submits the command to be executed and
8 then returns a :class:`~.AsyncResult` object immediately. The
8 then returns a :class:`~.AsyncResult` object immediately. The
9 AsyncResult object gives you a way of getting a result at a later
9 AsyncResult object gives you a way of getting a result at a later
10 time through its :meth:`get` method, but it also collects metadata
10 time through its :meth:`get` method, but it also collects metadata
11 on execution.
11 on execution.
12
12
13
13
14 Beyond multiprocessing's AsyncResult
14 Beyond multiprocessing's AsyncResult
15 ====================================
15 ====================================
16
16
17 .. Note::
17 .. Note::
18
18
19 The :class:`~.AsyncResult` object provides a superset of the interface in
19 The :class:`~.AsyncResult` object provides a superset of the interface in
20 :py:class:`multiprocessing.pool.AsyncResult`. See the
20 :py:class:`multiprocessing.pool.AsyncResult`. See the
21 `official Python documentation <http://docs.python.org/library/multiprocessing#multiprocessing.pool.AsyncResult>`_
21 `official Python documentation <http://docs.python.org/library/multiprocessing#multiprocessing.pool.AsyncResult>`_
22 for more on the basics of this interface.
22 for more on the basics of this interface.
23
23
24 Our AsyncResult objects add a number of convenient features for working with
24 Our AsyncResult objects add a number of convenient features for working with
25 parallel results, beyond what is provided by the original AsyncResult.
25 parallel results, beyond what is provided by the original AsyncResult.
26
26
27
27
28 get_dict
28 get_dict
29 --------
29 --------
30
30
31 First is :meth:`.AsyncResult.get_dict`, which pulls results as a dictionary
31 First is :meth:`.AsyncResult.get_dict`, which pulls results as a dictionary
32 keyed by engine_id, rather than a flat list. This is useful for quickly
32 keyed by engine_id, rather than a flat list. This is useful for quickly
33 coordinating or distributing information about all of the engines.
33 coordinating or distributing information about all of the engines.
34
34
35 As an example, here is a quick call that gives every engine a dict showing
35 As an example, here is a quick call that gives every engine a dict showing
36 the PID of every other engine:
36 the PID of every other engine:
37
37
38 .. sourcecode:: ipython
38 .. sourcecode:: ipython
39
39
40 In [10]: ar = rc[:].apply_async(os.getpid)
40 In [10]: ar = rc[:].apply_async(os.getpid)
41 In [11]: pids = ar.get_dict()
41 In [11]: pids = ar.get_dict()
42 In [12]: rc[:]['pid_map'] = pids
42 In [12]: rc[:]['pid_map'] = pids
43
43
44 This trick is particularly useful when setting up inter-engine communication,
44 This trick is particularly useful when setting up inter-engine communication,
45 as in IPython's :file:`examples/parallel/interengine` examples.
45 as in IPython's :file:`examples/parallel/interengine` examples.
46
46
47
47
48 Metadata
48 Metadata
49 ========
49 ========
50
50
51 IPython.parallel tracks some metadata about the tasks, which is stored
51 IPython.parallel tracks some metadata about the tasks, which is stored
52 in the :attr:`.Client.metadata` dict. The AsyncResult object gives you an
52 in the :attr:`.Client.metadata` dict. The AsyncResult object gives you an
53 interface for this information as well, including timestamps, stdout/err,
53 interface for this information as well, including timestamps, stdout/err,
54 and engine IDs.
54 and engine IDs.
55
55
56
56
57 Timing
57 Timing
58 ------
58 ------
59
59
60 IPython tracks various timestamps as :py:class:`.datetime` objects,
60 IPython tracks various timestamps as :py:class:`.datetime` objects,
61 and the AsyncResult object has a few properties that turn these into useful
61 and the AsyncResult object has a few properties that turn these into useful
62 times (in seconds as floats).
62 times (in seconds as floats).
63
63
64 For use while the tasks are still pending:
64 For use while the tasks are still pending:
65
65
66 * :attr:`ar.elapsed` is just the elapsed seconds since submission, for use
66 * :attr:`ar.elapsed` is just the elapsed seconds since submission, for use
67 before the AsyncResult is complete.
67 before the AsyncResult is complete.
68 * :attr:`ar.progress` is the number of tasks that have completed. Fractional progress
68 * :attr:`ar.progress` is the number of tasks that have completed. Fractional progress
69 would be::
69 would be::
70
70
71 1.0 * ar.progress / len(ar)
71 1.0 * ar.progress / len(ar)
72
72
73 * :meth:`AsyncResult.wait_interactive` will wait for the result to finish, but
73 * :meth:`AsyncResult.wait_interactive` will wait for the result to finish, but
74 print out status updates on progress and elapsed time while it waits.
74 print out status updates on progress and elapsed time while it waits.
75
75
76 For use after the tasks are done:
76 For use after the tasks are done:
77
77
78 * :attr:`ar.serial_time` is the sum of the computation time of all of the tasks
78 * :attr:`ar.serial_time` is the sum of the computation time of all of the tasks
79 done in parallel.
79 done in parallel.
80 * :attr:`ar.wall_time` is the time between the first task submitted and last result
80 * :attr:`ar.wall_time` is the time between the first task submitted and last result
81 received. This is the actual cost of computation, including IPython overhead.
81 received. This is the actual cost of computation, including IPython overhead.
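
Put together, a typical monitoring pattern might look like this (a sketch;
``view`` is assumed to be an existing view and ``slow_task`` an already-defined
function):

.. sourcecode:: ipython

    In [20]: ar = view.map_async(slow_task, range(100))

    In [21]: ar.progress, ar.elapsed       # tasks completed, seconds since submission

    In [22]: ar.wait_interactive()         # block, printing progress until all are done

    In [23]: ar.serial_time, ar.wall_time  # totals available once everything is finished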
82
82
83
83
84 .. note::
84 .. note::
85
85
86 wall_time is only precise if the Client is waiting for results when
86 wall_time is only precise if the Client is waiting for results when
87 the task finished, because the `received` timestamp is made when the result is
87 the task finished, because the `received` timestamp is made when the result is
88 unpacked by the Client, triggered by the :meth:`~Client.spin` call. If you
88 unpacked by the Client, triggered by the :meth:`~Client.spin` call. If you
89 are doing work in the Client, and not waiting/spinning, then `received` might
89 are doing work in the Client, and not waiting/spinning, then `received` might
90 be artificially high.
90 be artificially high.
91
91
92 An often useful metric is the actual cost of doing the work in parallel
92 An often useful metric is the actual cost of doing the work in parallel
93 relative to the serial computation, and this can be given simply with
93 relative to the serial computation, and this can be given simply with
94
94
95 .. sourcecode:: python
95 .. sourcecode:: python
96
96
97 speedup = ar.serial_time / ar.wall_time
97 speedup = ar.serial_time / ar.wall_time
98
98
99
99
100 Map results are iterable!
100 Map results are iterable!
101 =========================
101 =========================
102
102
103 When an AsyncResult object has multiple results (e.g. the :class:`~AsyncMapResult`
103 When an AsyncResult object has multiple results (e.g. the :class:`~AsyncMapResult`
104 object), you can actually iterate through results themselves, and act on them as they arrive:
104 object), you can actually iterate through results themselves, and act on them as they arrive:
105
105
106 .. literalinclude:: ../../../examples/parallel/itermapresult.py
106 .. literalinclude:: ../../../examples/parallel/itermapresult.py
107 :language: python
107 :language: python
108 :lines: 20-67
108 :lines: 20-67
109
109
110 That is to say, if you treat an AsyncMapResult as if it were a list of your actual
110 That is to say, if you treat an AsyncMapResult as if it were a list of your actual
111 results, it should behave as you would expect, with the only difference being
111 results, it should behave as you would expect, with the only difference being
112 that you can start iterating through the results before they have even been computed.
112 that you can start iterating through the results before they have even been computed.
113
113
114 This lets you do a dumb version of map/reduce with the builtin Python functions,
114 This lets you do a dumb version of map/reduce with the builtin Python functions,
115 and the only difference between doing this locally and doing it remotely in parallel
115 and the only difference between doing this locally and doing it remotely in parallel
116 is using the asynchronous view.map instead of the builtin map.
116 is using the asynchronous view.map instead of the builtin map.
117
117
118
118
119 Here is a simple one-line RMS (root-mean-square) implemented with Python's builtin map/reduce.
119 Here is a simple one-line RMS (root-mean-square) implemented with Python's builtin map/reduce.
120
120
121 .. sourcecode:: ipython
121 .. sourcecode:: ipython
122
122
123 In [38]: X = np.linspace(0,100)
123 In [38]: X = np.linspace(0,100)
124
124
125 In [39]: from math import sqrt
125 In [39]: from math import sqrt
126
126
127 In [40]: add = lambda a,b: a+b
127 In [40]: add = lambda a,b: a+b
128
128
129 In [41]: sq = lambda x: x*x
129 In [41]: sq = lambda x: x*x
130
130
131 In [42]: sqrt(reduce(add, map(sq, X)) / len(X))
131 In [42]: sqrt(reduce(add, map(sq, X)) / len(X))
132 Out[42]: 58.028845747399714
132 Out[42]: 58.028845747399714
133
133
134 In [43]: sqrt(reduce(add, view.map(sq, X)) / len(X))
134 In [43]: sqrt(reduce(add, view.map(sq, X)) / len(X))
135 Out[43]: 58.028845747399714
135 Out[43]: 58.028845747399714
136
136
137 To break that down:
137 To break that down:
138
138
139 1. ``map(sq, X)`` Compute the square of each element in the list (locally, or in parallel)
139 1. ``map(sq, X)`` Compute the square of each element in the list (locally, or in parallel)
140 2. ``reduce(add, sqX) / len(X)`` compute the mean by summing over the list (or AsyncMapResult)
140 2. ``reduce(add, sqX) / len(X)`` compute the mean by summing over the list (or AsyncMapResult)
141 and dividing by the size
141 and dividing by the size
142 3. take the square root of the resulting number
142 3. take the square root of the resulting number
143
143
144 .. seealso::
144 .. seealso::
145
145
146 When AsyncResult or the AsyncMapResult don't provide what you need (for instance,
146 When AsyncResult or the AsyncMapResult don't provide what you need (for instance,
147 handling individual results as they arrive, but with metadata), you can always
147 handling individual results as they arrive, but with metadata), you can always
148 just split the original result's ``msg_ids`` attribute, and handle them as you like.
148 just split the original result's ``msg_ids`` attribute, and handle them as you like.
149
149
150 For an example of this, see :file:`docs/examples/parallel/customresult.py`
150 For an example of this, see :file:`examples/parallel/customresult.py`
@@ -1,177 +1,177 b''
1 .. _dag_dependencies:
1 .. _dag_dependencies:
2
2
3 ================
3 ================
4 DAG Dependencies
4 DAG Dependencies
5 ================
5 ================
6
6
7 Often, a parallel workflow is described in terms of a `Directed Acyclic Graph
7 Often, a parallel workflow is described in terms of a `Directed Acyclic Graph
8 <http://en.wikipedia.org/wiki/Directed_acyclic_graph>`_ or DAG. A popular library
8 <http://en.wikipedia.org/wiki/Directed_acyclic_graph>`_ or DAG. A popular library
9 for working with Graphs is NetworkX_. Here, we will walk through a demo mapping
9 for working with Graphs is NetworkX_. Here, we will walk through a demo mapping
10 a nx DAG to task dependencies.
10 a nx DAG to task dependencies.
11
11
12 The full script that runs this demo can be found in
12 The full script that runs this demo can be found in
13 :file:`docs/examples/parallel/dagdeps.py`.
13 :file:`examples/parallel/dagdeps.py`.
14
14
15 Why are DAGs good for task dependencies?
15 Why are DAGs good for task dependencies?
16 ----------------------------------------
16 ----------------------------------------
17
17
18 The 'G' in DAG is 'Graph'. A Graph is a collection of **nodes** and **edges** that connect
18 The 'G' in DAG is 'Graph'. A Graph is a collection of **nodes** and **edges** that connect
19 the nodes. For our purposes, each node would be a task, and each edge would be a
19 the nodes. For our purposes, each node would be a task, and each edge would be a
20 dependency. The 'D' in DAG stands for 'Directed'. This means that each edge has a
20 dependency. The 'D' in DAG stands for 'Directed'. This means that each edge has a
21 direction associated with it. So we can interpret the edge (a,b) as meaning that b depends
21 direction associated with it. So we can interpret the edge (a,b) as meaning that b depends
22 on a, whereas the edge (b,a) would mean a depends on b. The 'A' is 'Acyclic', meaning that
22 on a, whereas the edge (b,a) would mean a depends on b. The 'A' is 'Acyclic', meaning that
23 there must not be any closed loops in the graph. This is important for dependencies,
23 there must not be any closed loops in the graph. This is important for dependencies,
24 because if a loop were closed, then a task could ultimately depend on itself, and never be
24 because if a loop were closed, then a task could ultimately depend on itself, and never be
25 able to run. If your workflow can be described as a DAG, then it is impossible for your
25 able to run. If your workflow can be described as a DAG, then it is impossible for your
26 dependencies to cause a deadlock.
26 dependencies to cause a deadlock.
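
If you are unsure whether a given graph qualifies, NetworkX can check the
acyclicity for you; for example (a quick illustration):

.. sourcecode:: python

    import networkx as nx

    G = nx.DiGraph([(0, 1), (1, 2), (2, 0)])   # 0 -> 1 -> 2 -> 0 closes a loop
    print(nx.is_directed_acyclic_graph(G))     # False: not usable as task dependencies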
27
27
28 A Sample DAG
28 A Sample DAG
29 ------------
29 ------------
30
30
31 Here, we have a very simple 5-node DAG:
31 Here, we have a very simple 5-node DAG:
32
32
33 .. figure:: figs/simpledag.*
33 .. figure:: figs/simpledag.*
34 :width: 600px
34 :width: 600px
35
35
36 With NetworkX, an arrow is just a fattened bit on the edge. Here, we can see that task 0
36 With NetworkX, an arrow is just a fattened bit on the edge. Here, we can see that task 0
37 depends on nothing, and can run immediately. 1 and 2 depend on 0; 3 depends on
37 depends on nothing, and can run immediately. 1 and 2 depend on 0; 3 depends on
38 1 and 2; and 4 depends only on 1.
38 1 and 2; and 4 depends only on 1.
39
39
40 A possible sequence of events for this workflow:
40 A possible sequence of events for this workflow:
41
41
42 0. Task 0 can run right away
42 0. Task 0 can run right away
43 1. 0 finishes, so 1,2 can start
43 1. 0 finishes, so 1,2 can start
44 2. 1 finishes, 3 is still waiting on 2, but 4 can start right away
44 2. 1 finishes, 3 is still waiting on 2, but 4 can start right away
45 3. 2 finishes, and 3 can finally start
45 3. 2 finishes, and 3 can finally start
46
46
47
47
48 Further, taking failures into account, assuming all dependencies are run with the default
48 Further, taking failures into account, assuming all dependencies are run with the default
49 `success=True,failure=False`, the following cases would occur for each node's failure:
49 `success=True,failure=False`, the following cases would occur for each node's failure:
50
50
51 0. fails: all other tasks fail as Impossible
51 0. fails: all other tasks fail as Impossible
52 1. 2 can still succeed, but 3,4 are unreachable
52 1. 2 can still succeed, but 3,4 are unreachable
53 2. 3 becomes unreachable, but 4 is unaffected
53 2. 3 becomes unreachable, but 4 is unaffected
54 3. and 4. are terminal, and can have no effect on other nodes
54 3. and 4. are terminal, and can have no effect on other nodes
55
55
56 The code to generate the simple DAG:
56 The code to generate the simple DAG:
57
57
58 .. sourcecode:: python
58 .. sourcecode:: python
59
59
60 import networkx as nx
60 import networkx as nx
61
61
62 G = nx.DiGraph()
62 G = nx.DiGraph()
63
63
64 # add 5 nodes, labeled 0-4:
64 # add 5 nodes, labeled 0-4:
65 map(G.add_node, range(5))
65 map(G.add_node, range(5))
66 # 1,2 depend on 0:
66 # 1,2 depend on 0:
67 G.add_edge(0,1)
67 G.add_edge(0,1)
68 G.add_edge(0,2)
68 G.add_edge(0,2)
69 # 3 depends on 1,2
69 # 3 depends on 1,2
70 G.add_edge(1,3)
70 G.add_edge(1,3)
71 G.add_edge(2,3)
71 G.add_edge(2,3)
72 # 4 depends on 1
72 # 4 depends on 1
73 G.add_edge(1,4)
73 G.add_edge(1,4)
74
74
75 # now draw the graph:
75 # now draw the graph:
76 pos = { 0 : (0,0), 1 : (1,1), 2 : (-1,1),
76 pos = { 0 : (0,0), 1 : (1,1), 2 : (-1,1),
77 3 : (0,2), 4 : (2,2)}
77 3 : (0,2), 4 : (2,2)}
78 nx.draw(G, pos, edge_color='r')
78 nx.draw(G, pos, edge_color='r')
79
79
80
80
81 For demonstration purposes, we have a function that generates a random DAG with a given
81 For demonstration purposes, we have a function that generates a random DAG with a given
82 number of nodes and edges.
82 number of nodes and edges.
83
83
84 .. literalinclude:: ../../../examples/parallel/dagdeps.py
84 .. literalinclude:: ../../../examples/parallel/dagdeps.py
85 :language: python
85 :language: python
86 :lines: 20-36
86 :lines: 20-36
87
87
88 So first, we start with a graph of 32 nodes, with 128 edges:
88 So first, we start with a graph of 32 nodes, with 128 edges:
89
89
90 .. sourcecode:: ipython
90 .. sourcecode:: ipython
91
91
92 In [2]: G = random_dag(32,128)
92 In [2]: G = random_dag(32,128)
93
93
94 Now, we need to build our dict of jobs corresponding to the nodes on the graph:
94 Now, we need to build our dict of jobs corresponding to the nodes on the graph:
95
95
96 .. sourcecode:: ipython
96 .. sourcecode:: ipython
97
97
98 In [3]: jobs = {}
98 In [3]: jobs = {}
99
99
100 # in reality, each job would presumably be different
100 # in reality, each job would presumably be different
101 # randomwait is just a function that sleeps for a random interval
101 # randomwait is just a function that sleeps for a random interval
102 In [4]: for node in G:
102 In [4]: for node in G:
103 ...: jobs[node] = randomwait
103 ...: jobs[node] = randomwait
104
104
105 Once we have a dict of jobs matching the nodes on the graph, we can start submitting jobs,
105 Once we have a dict of jobs matching the nodes on the graph, we can start submitting jobs,
106 and linking up the dependencies. Since we don't know a job's msg_id until it is submitted,
106 and linking up the dependencies. Since we don't know a job's msg_id until it is submitted,
107 which is necessary for building dependencies, it is critical that we don't submit any jobs
107 which is necessary for building dependencies, it is critical that we don't submit any jobs
108 before other jobs it may depend on. Fortunately, NetworkX provides a
108 before other jobs it may depend on. Fortunately, NetworkX provides a
109 :meth:`topological_sort` method which ensures exactly this. It presents an iterable that
109 :meth:`topological_sort` method which ensures exactly this. It presents an iterable that
110 guarantees that when you arrive at a node, you have already visited all the nodes
110 guarantees that when you arrive at a node, you have already visited all the nodes
111 on which it depends:
111 on which it depends:
112
112
113 .. sourcecode:: ipython
113 .. sourcecode:: ipython
114
114
115 In [5]: rc = Client()
115 In [5]: rc = Client()
116 In [5]: view = rc.load_balanced_view()
116 In [5]: view = rc.load_balanced_view()
117
117
118 In [6]: results = {}
118 In [6]: results = {}
119
119
120 In [7]: for node in G.topological_sort():
120 In [7]: for node in G.topological_sort():
121 ...: # get list of AsyncResult objects from nodes
121 ...: # get list of AsyncResult objects from nodes
122 ...: # leading into this one as dependencies
122 ...: # leading into this one as dependencies
123 ...: deps = [ results[n] for n in G.predecessors(node) ]
123 ...: deps = [ results[n] for n in G.predecessors(node) ]
124 ...: # submit and store AsyncResult object
124 ...: # submit and store AsyncResult object
125 ...: with view.temp_flags(after=deps, block=False):
125 ...: with view.temp_flags(after=deps, block=False):
126 ...: results[node] = view.apply_with_flags(jobs[node])
126 ...: results[node] = view.apply_with_flags(jobs[node])
127
127
128
128
129 Now that we have submitted all the jobs, we can wait for the results:
129 Now that we have submitted all the jobs, we can wait for the results:
130
130
131 .. sourcecode:: ipython
131 .. sourcecode:: ipython
132
132
133 In [8]: view.wait(results.values())
133 In [8]: view.wait(results.values())
134
134
135 Now, at least we know that all the jobs ran and did not fail (``r.get()`` would have
135 Now, at least we know that all the jobs ran and did not fail (``r.get()`` would have
136 raised an error if a task failed). But we don't know that the ordering was properly
136 raised an error if a task failed). But we don't know that the ordering was properly
137 respected. For this, we can use the :attr:`metadata` attribute of each AsyncResult.
137 respected. For this, we can use the :attr:`metadata` attribute of each AsyncResult.
138
138
139 These objects store a variety of metadata about each task, including various timestamps.
139 These objects store a variety of metadata about each task, including various timestamps.
140 We can validate that the dependencies were respected by checking that each task was
140 We can validate that the dependencies were respected by checking that each task was
141 started after all of its predecessors were completed:
141 started after all of its predecessors were completed:
142
142
143 .. literalinclude:: ../../../examples/parallel/dagdeps.py
143 .. literalinclude:: ../../../examples/parallel/dagdeps.py
144 :language: python
144 :language: python
145 :lines: 64-70
145 :lines: 64-70
146
146
147 We can also validate the graph visually. By drawing the graph with each node's x-position
147 We can also validate the graph visually. By drawing the graph with each node's x-position
148 as its start time, all arrows must be pointing to the right if dependencies were respected.
148 as its start time, all arrows must be pointing to the right if dependencies were respected.
149 For spreading, the y-position will be the runtime of the task, so long tasks
149 For spreading, the y-position will be the runtime of the task, so long tasks
150 will be at the top, and quick, small tasks will be at the bottom.
150 will be at the top, and quick, small tasks will be at the bottom.
151
151
152 .. sourcecode:: ipython
152 .. sourcecode:: ipython
153
153
154 In [10]: from matplotlib.dates import date2num
154 In [10]: from matplotlib.dates import date2num
155
155
156 In [11]: from matplotlib.cm import gist_rainbow
156 In [11]: from matplotlib.cm import gist_rainbow
157
157
158 In [12]: pos = {}; colors = {}
158 In [12]: pos = {}; colors = {}
159
159
160 In [12]: for node in G:
160 In [12]: for node in G:
161 ....: md = results[node].metadata
161 ....: md = results[node].metadata
162 ....: start = date2num(md.started)
162 ....: start = date2num(md.started)
163 ....: runtime = date2num(md.completed) - start
163 ....: runtime = date2num(md.completed) - start
164 ....: pos[node] = (start, runtime)
164 ....: pos[node] = (start, runtime)
165 ....: colors[node] = md.engine_id
165 ....: colors[node] = md.engine_id
166
166
167 In [13]: nx.draw(G, pos, node_list=colors.keys(), node_color=colors.values(),
167 In [13]: nx.draw(G, pos, node_list=colors.keys(), node_color=colors.values(),
168 ....: cmap=gist_rainbow)
168 ....: cmap=gist_rainbow)
169
169
170 .. figure:: figs/dagdeps.*
170 .. figure:: figs/dagdeps.*
171 :width: 600px
171 :width: 600px
172
172
173 Time started on x, runtime on y, and color-coded by engine-id (in this case there
173 Time started on x, runtime on y, and color-coded by engine-id (in this case there
174 were four engines). Edges denote dependencies.
174 were four engines). Edges denote dependencies.
175
175
176
176
177 .. _NetworkX: http://networkx.lanl.gov/
177 .. _NetworkX: http://networkx.lanl.gov/
@@ -1,208 +1,208 b''
1 .. _parallel_examples:
1 .. _parallel_examples:
2
2
3 =================
3 =================
4 Parallel examples
4 Parallel examples
5 =================
5 =================
6
6
7 In this section we describe two more involved examples of using an IPython
7 In this section we describe two more involved examples of using an IPython
8 cluster to perform a parallel computation. We will be doing some plotting,
8 cluster to perform a parallel computation. We will be doing some plotting,
9 so we start IPython with matplotlib integration by typing::
9 so we start IPython with matplotlib integration by typing::
10
10
11 ipython --matplotlib
11 ipython --matplotlib
12
12
13 at the system command line.
13 at the system command line.
14 Or you can enable matplotlib integration at any point with:
14 Or you can enable matplotlib integration at any point with:
15
15
16 .. sourcecode:: ipython
16 .. sourcecode:: ipython
17
17
18 In [1]: %matplotlib
18 In [1]: %matplotlib
19
19
20
20
21 150 million digits of pi
21 150 million digits of pi
22 ========================
22 ========================
23
23
24 In this example we would like to study the distribution of digits in the
24 In this example we would like to study the distribution of digits in the
25 number pi (in base 10). While it is not known if pi is a normal number (a
25 number pi (in base 10). While it is not known if pi is a normal number (a
26 number is normal in base 10 if 0-9 occur with equal likelihood) numerical
26 number is normal in base 10 if 0-9 occur with equal likelihood) numerical
27 investigations suggest that it is. We will begin with a serial calculation on
27 investigations suggest that it is. We will begin with a serial calculation on
28 10,000 digits of pi and then perform a parallel calculation involving 150
28 10,000 digits of pi and then perform a parallel calculation involving 150
29 million digits.
29 million digits.
30
30
31 In both the serial and parallel calculation we will be using functions defined
31 In both the serial and parallel calculation we will be using functions defined
32 in the :file:`pidigits.py` file, which is available in the
32 in the :file:`pidigits.py` file, which is available in the
33 :file:`docs/examples/parallel` directory of the IPython source distribution.
33 :file:`examples/parallel` directory of the IPython source distribution.
34 These functions provide basic facilities for working with the digits of pi and
34 These functions provide basic facilities for working with the digits of pi and
35 can be loaded into IPython by putting :file:`pidigits.py` in your current
35 can be loaded into IPython by putting :file:`pidigits.py` in your current
36 working directory and then doing:
36 working directory and then doing:
37
37
38 .. sourcecode:: ipython
38 .. sourcecode:: ipython
39
39
40 In [1]: run pidigits.py
40 In [1]: run pidigits.py
41
41
42 Serial calculation
42 Serial calculation
43 ------------------
43 ------------------
44
44
45 For the serial calculation, we will use `SymPy <http://www.sympy.org>`_ to
45 For the serial calculation, we will use `SymPy <http://www.sympy.org>`_ to
46 calculate 10,000 digits of pi and then look at the frequencies of the digits
46 calculate 10,000 digits of pi and then look at the frequencies of the digits
47 0-9. Out of 10,000 digits, we expect each digit to occur 1,000 times. While
47 0-9. Out of 10,000 digits, we expect each digit to occur 1,000 times. While
48 SymPy is capable of calculating many more digits of pi, our purpose here is to
48 SymPy is capable of calculating many more digits of pi, our purpose here is to
49 set the stage for the much larger parallel calculation.
49 set the stage for the much larger parallel calculation.
50
50
51 In this example, we use two functions from :file:`pidigits.py`:
51 In this example, we use two functions from :file:`pidigits.py`:
52 :func:`one_digit_freqs` (which calculates how many times each digit occurs)
52 :func:`one_digit_freqs` (which calculates how many times each digit occurs)
53 and :func:`plot_one_digit_freqs` (which uses Matplotlib to plot the result).
53 and :func:`plot_one_digit_freqs` (which uses Matplotlib to plot the result).
54 Here is an interactive IPython session that uses these functions with
54 Here is an interactive IPython session that uses these functions with
55 SymPy:
55 SymPy:
56
56
57 .. sourcecode:: ipython
57 .. sourcecode:: ipython
58
58
59 In [7]: import sympy
59 In [7]: import sympy
60
60
61 In [8]: pi = sympy.pi.evalf(40)
61 In [8]: pi = sympy.pi.evalf(40)
62
62
63 In [9]: pi
63 In [9]: pi
64 Out[9]: 3.141592653589793238462643383279502884197
64 Out[9]: 3.141592653589793238462643383279502884197
65
65
66 In [10]: pi = sympy.pi.evalf(10000)
66 In [10]: pi = sympy.pi.evalf(10000)
67
67
68 In [11]: digits = (d for d in str(pi)[2:]) # create a sequence of digits
68 In [11]: digits = (d for d in str(pi)[2:]) # create a sequence of digits
69
69
70 In [13]: freqs = one_digit_freqs(digits)
70 In [13]: freqs = one_digit_freqs(digits)
71
71
72 In [14]: plot_one_digit_freqs(freqs)
72 In [14]: plot_one_digit_freqs(freqs)
73 Out[14]: [<matplotlib.lines.Line2D object at 0x18a55290>]
73 Out[14]: [<matplotlib.lines.Line2D object at 0x18a55290>]
74
74
75 The resulting plot of the single digit counts shows that each digit occurs
75 The resulting plot of the single digit counts shows that each digit occurs
76 approximately 1,000 times, but that with only 10,000 digits the
76 approximately 1,000 times, but that with only 10,000 digits the
77 statistical fluctuations are still rather large:
77 statistical fluctuations are still rather large:
78
78
79 .. image:: figs/single_digits.*
79 .. image:: figs/single_digits.*
80
80
81 It is clear that to reduce the relative fluctuations in the counts, we need
81 It is clear that to reduce the relative fluctuations in the counts, we need
82 to look at many more digits of pi. That brings us to the parallel calculation.
82 to look at many more digits of pi. That brings us to the parallel calculation.
83
83
84 Parallel calculation
84 Parallel calculation
85 --------------------
85 --------------------
86
86
87 Calculating many digits of pi is a challenging computational problem in itself.
87 Calculating many digits of pi is a challenging computational problem in itself.
88 Because we want to focus on the distribution of digits in this example, we
88 Because we want to focus on the distribution of digits in this example, we
89 will use pre-computed digits of pi from the website of Professor Yasumasa
89 will use pre-computed digits of pi from the website of Professor Yasumasa
90 Kanada at the University of Tokyo (http://www.super-computing.org). These
90 Kanada at the University of Tokyo (http://www.super-computing.org). These
91 digits come in a set of text files (ftp://pi.super-computing.org/.2/pi200m/)
91 digits come in a set of text files (ftp://pi.super-computing.org/.2/pi200m/)
92 that each have 10 million digits of pi.
92 that each have 10 million digits of pi.
93
93
94 For the parallel calculation, we have copied these files to the local hard
94 For the parallel calculation, we have copied these files to the local hard
95 drives of the compute nodes. A total of 15 of these files will be used, for a
95 drives of the compute nodes. A total of 15 of these files will be used, for a
96 total of 150 million digits of pi. To make things a little more interesting we
96 total of 150 million digits of pi. To make things a little more interesting we
97 will calculate the frequencies of all two-digit sequences (00-99) and then plot
97 will calculate the frequencies of all two-digit sequences (00-99) and then plot
98 the result using a 2D matrix in Matplotlib.
98 the result using a 2D matrix in Matplotlib.
99
99
100 The overall idea of the calculation is simple: each IPython engine will
100 The overall idea of the calculation is simple: each IPython engine will
101 compute the two digit counts for the digits in a single file. Then in a final
101 compute the two digit counts for the digits in a single file. Then in a final
102 step the counts from each engine will be added up. To perform this
102 step the counts from each engine will be added up. To perform this
103 calculation, we will need two top-level functions from :file:`pidigits.py`:
103 calculation, we will need two top-level functions from :file:`pidigits.py`:
104
104
105 .. literalinclude:: ../../../examples/parallel/pi/pidigits.py
105 .. literalinclude:: ../../../examples/parallel/pi/pidigits.py
106 :language: python
106 :language: python
107 :lines: 47-62
107 :lines: 47-62
108
108
109 We will also use the :func:`plot_two_digit_freqs` function to plot the
109 We will also use the :func:`plot_two_digit_freqs` function to plot the
110 results. The code to run this calculation in parallel is contained in
110 results. The code to run this calculation in parallel is contained in
111 :file:`docs/examples/parallel/parallelpi.py`. This code can be run in parallel
111 :file:`examples/parallel/parallelpi.py`. This code can be run in parallel
112 using IPython by following these steps:
112 using IPython by following these steps:
113
113
114 1. Use :command:`ipcluster` to start 15 engines. We used 16 cores of an SGE linux
114 1. Use :command:`ipcluster` to start 15 engines. We used 16 cores of an SGE linux
115 cluster (1 controller + 15 engines).
115 cluster (1 controller + 15 engines).
116 2. With the file :file:`parallelpi.py` in your current working directory, open
116 2. With the file :file:`parallelpi.py` in your current working directory, open
117 up IPython, enable matplotlib, and type ``run parallelpi.py``. This will download
117 up IPython, enable matplotlib, and type ``run parallelpi.py``. This will download
118 the pi files via ftp the first time you run it, if they are not
118 the pi files via ftp the first time you run it, if they are not
119 present in the Engines' working directory.
119 present in the Engines' working directory.
120
120
121 When run on our 16 cores, we observe a speedup of 14.2x. This is slightly
121 When run on our 16 cores, we observe a speedup of 14.2x. This is slightly
122 less than linear scaling (16x) because the controller is also running on one of
122 less than linear scaling (16x) because the controller is also running on one of
123 the cores.
123 the cores.
124
124
125 To emphasize the interactive nature of IPython, we now show how the
125 To emphasize the interactive nature of IPython, we now show how the
126 calculation can also be run by simply typing the commands from
126 calculation can also be run by simply typing the commands from
127 :file:`parallelpi.py` interactively into IPython:
127 :file:`parallelpi.py` interactively into IPython:
128
128
129 .. sourcecode:: ipython
129 .. sourcecode:: ipython
130
130
131 In [1]: from IPython.parallel import Client
131 In [1]: from IPython.parallel import Client
132
132
133 # The Client allows us to use the engines interactively.
133 # The Client allows us to use the engines interactively.
134 # We simply pass Client the name of the cluster profile we
134 # We simply pass Client the name of the cluster profile we
135 # are using.
135 # are using.
136 In [2]: c = Client(profile='mycluster')
136 In [2]: c = Client(profile='mycluster')
137 In [3]: v = c[:]
137 In [3]: v = c[:]
138
138
139 In [3]: c.ids
139 In [3]: c.ids
140 Out[3]: [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14]
140 Out[3]: [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14]
141
141
142 In [4]: run pidigits.py
142 In [4]: run pidigits.py
143
143
144 In [5]: filestring = 'pi200m.ascii.%(i)02dof20'
144 In [5]: filestring = 'pi200m.ascii.%(i)02dof20'
145
145
146 # Create the list of files to process.
146 # Create the list of files to process.
147 In [6]: files = [filestring % {'i':i} for i in range(1,16)]
147 In [6]: files = [filestring % {'i':i} for i in range(1,16)]
148
148
149 In [7]: files
149 In [7]: files
150 Out[7]:
150 Out[7]:
151 ['pi200m.ascii.01of20',
151 ['pi200m.ascii.01of20',
152 'pi200m.ascii.02of20',
152 'pi200m.ascii.02of20',
153 'pi200m.ascii.03of20',
153 'pi200m.ascii.03of20',
154 'pi200m.ascii.04of20',
154 'pi200m.ascii.04of20',
155 'pi200m.ascii.05of20',
155 'pi200m.ascii.05of20',
156 'pi200m.ascii.06of20',
156 'pi200m.ascii.06of20',
157 'pi200m.ascii.07of20',
157 'pi200m.ascii.07of20',
158 'pi200m.ascii.08of20',
158 'pi200m.ascii.08of20',
159 'pi200m.ascii.09of20',
159 'pi200m.ascii.09of20',
160 'pi200m.ascii.10of20',
160 'pi200m.ascii.10of20',
161 'pi200m.ascii.11of20',
161 'pi200m.ascii.11of20',
162 'pi200m.ascii.12of20',
162 'pi200m.ascii.12of20',
163 'pi200m.ascii.13of20',
163 'pi200m.ascii.13of20',
164 'pi200m.ascii.14of20',
164 'pi200m.ascii.14of20',
165 'pi200m.ascii.15of20']
165 'pi200m.ascii.15of20']
166
166
167 # download the data files if they don't already exist:
167 # download the data files if they don't already exist:
168 In [8]: v.map(fetch_pi_file, files)
168 In [8]: v.map(fetch_pi_file, files)
169
169
170 # This is the parallel calculation using the Client.map method
170 # This is the parallel calculation using the Client.map method
171 # which applies compute_two_digit_freqs to each file in files in parallel.
171 # which applies compute_two_digit_freqs to each file in files in parallel.
172 In [9]: freqs_all = v.map(compute_two_digit_freqs, files)
172 In [9]: freqs_all = v.map(compute_two_digit_freqs, files)
173
173
174 # Add up the frequencies from each engine.
174 # Add up the frequencies from each engine.
175 In [10]: freqs = reduce_freqs(freqs_all)
175 In [10]: freqs = reduce_freqs(freqs_all)
176
176
177 In [11]: plot_two_digit_freqs(freqs)
177 In [11]: plot_two_digit_freqs(freqs)
178 Out[11]: <matplotlib.image.AxesImage object at 0x18beb110>
178 Out[11]: <matplotlib.image.AxesImage object at 0x18beb110>
179
179
180 In [12]: plt.title('2 digit counts of 150m digits of pi')
180 In [12]: plt.title('2 digit counts of 150m digits of pi')
181 Out[12]: <matplotlib.text.Text object at 0x18d1f9b0>
181 Out[12]: <matplotlib.text.Text object at 0x18d1f9b0>
182
182
183 The resulting plot generated by Matplotlib is shown below. The colors indicate
183 The resulting plot generated by Matplotlib is shown below. The colors indicate
184 which two digit sequences are more (red) or less (blue) likely to occur in the
184 which two digit sequences are more (red) or less (blue) likely to occur in the
185 first 150 million digits of pi. We clearly see that the sequence "41" is
185 first 150 million digits of pi. We clearly see that the sequence "41" is
186 most likely and that "06" and "07" are least likely. Further analysis would
186 most likely and that "06" and "07" are least likely. Further analysis would
187 show that the relative size of the statistical fluctuations has decreased
187 show that the relative size of the statistical fluctuations has decreased
188 compared to the 10,000 digit calculation.
188 compared to the 10,000 digit calculation.
189
189
190 .. image:: figs/two_digit_counts.*
190 .. image:: figs/two_digit_counts.*
191
191
192 Conclusion
192 Conclusion
193 ==========
193 ==========
194
194
195 To conclude these examples, we summarize the key features of IPython's
195 To conclude these examples, we summarize the key features of IPython's
196 parallel architecture that have been demonstrated:
196 parallel architecture that have been demonstrated:
197
197
198 * Serial code can often be parallelized with only a few extra lines of code.
198 * Serial code can often be parallelized with only a few extra lines of code.
199 We have used the :class:`DirectView` and :class:`LoadBalancedView` classes
199 We have used the :class:`DirectView` and :class:`LoadBalancedView` classes
200 for this purpose.
200 for this purpose.
201 * The resulting parallel code can be run without ever leaving IPython's
201 * The resulting parallel code can be run without ever leaving IPython's
202 interactive shell.
202 interactive shell.
203 * Any data computed in parallel can be explored interactively through
203 * Any data computed in parallel can be explored interactively through
204 visualization or further numerical calculations.
204 visualization or further numerical calculations.
205 * We have run these examples on a cluster running RHEL 5 and Sun GridEngine.
205 * We have run these examples on a cluster running RHEL 5 and Sun GridEngine.
206 IPython's built-in support for SGE (and other batch systems) makes it easy
206 IPython's built-in support for SGE (and other batch systems) makes it easy
207 to get started with IPython's parallel capabilities.
207 to get started with IPython's parallel capabilities.
208
208
@@ -1,306 +1,306 b''
1 .. _parallel_overview:
1 .. _parallel_overview:
2
2
3 ============================
3 ============================
4 Overview and getting started
4 Overview and getting started
5 ============================
5 ============================
6
6
7
7
8 Examples
8 Examples
9 ========
9 ========
10
10
11 We have various example scripts and notebooks for using IPython.parallel in our
11 We have various example scripts and notebooks for using IPython.parallel in our
12 :file:`docs/examples/parallel` directory, or they can be found `on GitHub`__.
12 :file:`examples/parallel` directory, or they can be found `on GitHub`__.
13 Some of these are covered in more detail in the :ref:`examples
13 Some of these are covered in more detail in the :ref:`examples
14 <parallel_examples>` section.
14 <parallel_examples>` section.
15
15
16 .. __: https://github.com/ipython/ipython/tree/master/docs/examples/parallel
16 .. __: https://github.com/ipython/ipython/tree/master/examples/parallel
17
17
18 Introduction
18 Introduction
19 ============
19 ============
20
20
21 This section gives an overview of IPython's sophisticated and powerful
21 This section gives an overview of IPython's sophisticated and powerful
22 architecture for parallel and distributed computing. This architecture
22 architecture for parallel and distributed computing. This architecture
23 abstracts out parallelism in a very general way, which enables IPython to
23 abstracts out parallelism in a very general way, which enables IPython to
24 support many different styles of parallelism including:
24 support many different styles of parallelism including:
25
25
26 * Single program, multiple data (SPMD) parallelism.
26 * Single program, multiple data (SPMD) parallelism.
27 * Multiple program, multiple data (MPMD) parallelism.
27 * Multiple program, multiple data (MPMD) parallelism.
28 * Message passing using MPI.
28 * Message passing using MPI.
29 * Task farming.
29 * Task farming.
30 * Data parallel.
30 * Data parallel.
31 * Combinations of these approaches.
31 * Combinations of these approaches.
32 * Custom user defined approaches.
32 * Custom user defined approaches.
33
33
34 Most importantly, IPython enables all types of parallel applications to
34 Most importantly, IPython enables all types of parallel applications to
35 be developed, executed, debugged and monitored *interactively*. Hence,
35 be developed, executed, debugged and monitored *interactively*. Hence,
36 the ``I`` in IPython. The following are some example use cases for IPython:
36 the ``I`` in IPython. The following are some example use cases for IPython:
37
37
38 * Quickly parallelize algorithms that are embarrassingly parallel
38 * Quickly parallelize algorithms that are embarrassingly parallel
39 using a number of simple approaches. Many simple things can be
39 using a number of simple approaches. Many simple things can be
40 parallelized interactively in one or two lines of code.
40 parallelized interactively in one or two lines of code.
41
41
42 * Steer traditional MPI applications on a supercomputer from an
42 * Steer traditional MPI applications on a supercomputer from an
43 IPython session on your laptop.
43 IPython session on your laptop.
44
44
45 * Analyze and visualize large datasets (that could be remote and/or
45 * Analyze and visualize large datasets (that could be remote and/or
46 distributed) interactively using IPython and tools like
46 distributed) interactively using IPython and tools like
47 matplotlib/TVTK.
47 matplotlib/TVTK.
48
48
49 * Develop, test and debug new parallel algorithms
49 * Develop, test and debug new parallel algorithms
50 (that may use MPI) interactively.
50 (that may use MPI) interactively.
51
51
52 * Tie together multiple MPI jobs running on different systems into
52 * Tie together multiple MPI jobs running on different systems into
53 one giant distributed and parallel system.
53 one giant distributed and parallel system.
54
54
55 * Start a parallel job on your cluster and then have a remote
55 * Start a parallel job on your cluster and then have a remote
56 collaborator connect to it and pull back data into their
56 collaborator connect to it and pull back data into their
57 local IPython session for plotting and analysis.
57 local IPython session for plotting and analysis.
58
58
59 * Run a set of tasks on a set of CPUs using dynamic load balancing.
59 * Run a set of tasks on a set of CPUs using dynamic load balancing.
60
60
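As a minimal sketch of the first use case above (assuming a cluster has already been
started with ``ipcluster start -n 4``), an embarrassingly parallel loop can often be
replaced by a single ``map`` call:

.. sourcecode:: ipython

    In [1]: from IPython.parallel import Client

    In [2]: rc = Client()

    In [3]: dview = rc[:]            # a DirectView on all engines

    In [4]: dview.map_sync(lambda x: x**2, range(8))
    Out[4]: [0, 1, 4, 9, 16, 25, 36, 49]
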
61 .. tip::
61 .. tip::
62
62
63 At the SciPy 2011 conference in Austin, Min Ragan-Kelley presented a
63 At the SciPy 2011 conference in Austin, Min Ragan-Kelley presented a
64 complete 4-hour tutorial on the use of these features, and all the materials
64 complete 4-hour tutorial on the use of these features, and all the materials
65 for the tutorial are now `available online`__. That tutorial provides an
65 for the tutorial are now `available online`__. That tutorial provides an
66 excellent, hands-on complement to the reference documentation
66 excellent, hands-on complement to the reference documentation
67 presented here.
67 presented here.
68
68
69 .. __: http://minrk.github.com/scipy-tutorial-2011
69 .. __: http://minrk.github.com/scipy-tutorial-2011
70
70
71 Architecture overview
71 Architecture overview
72 =====================
72 =====================
73
73
74 .. figure:: figs/wideView.png
74 .. figure:: figs/wideView.png
75 :width: 300px
75 :width: 300px
76
76
77
77
78 The IPython architecture consists of four components:
78 The IPython architecture consists of four components:
79
79
80 * The IPython engine.
80 * The IPython engine.
81 * The IPython hub.
81 * The IPython hub.
82 * The IPython schedulers.
82 * The IPython schedulers.
83 * The controller client.
83 * The controller client.
84
84
85 These components live in the :mod:`IPython.parallel` package and are
85 These components live in the :mod:`IPython.parallel` package and are
86 installed with IPython. They do, however, have additional dependencies
86 installed with IPython. They do, however, have additional dependencies
87 that must be installed. For more information, see our
87 that must be installed. For more information, see our
88 :ref:`installation documentation <install_index>`.
88 :ref:`installation documentation <install_index>`.
89
89
90 .. TODO: include zmq in install_index
90 .. TODO: include zmq in install_index
91
91
92 IPython engine
92 IPython engine
93 ---------------
93 ---------------
94
94
95 The IPython engine is a Python instance that takes Python commands over a
95 The IPython engine is a Python instance that takes Python commands over a
96 network connection. Eventually, the IPython engine will be a full IPython
96 network connection. Eventually, the IPython engine will be a full IPython
97 interpreter, but for now, it is a regular Python interpreter. The engine
97 interpreter, but for now, it is a regular Python interpreter. The engine
98 can also handle incoming and outgoing Python objects sent over a network
98 can also handle incoming and outgoing Python objects sent over a network
99 connection. When multiple engines are started, parallel and distributed
99 connection. When multiple engines are started, parallel and distributed
100 computing becomes possible. An important feature of an IPython engine is
100 computing becomes possible. An important feature of an IPython engine is
101 that it blocks while user code is being executed. Read on for how the
101 that it blocks while user code is being executed. Read on for how the
102 IPython controller solves this problem to expose a clean asynchronous API
102 IPython controller solves this problem to expose a clean asynchronous API
103 to the user.
103 to the user.
104
104
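The usual way to launch engines is together with a controller via :command:`ipcluster`
(see the Getting Started section below), but as a rough sketch they can also be started
by hand, one process per engine::

    $ ipcontroller &    # start the hub and schedulers
    $ ipengine &        # repeat once per engine, possibly on other hosts
    $ ipengine &
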
105 IPython controller
105 IPython controller
106 ------------------
106 ------------------
107
107
108 The IPython controller processes provide an interface for working with a set of engines.
108 The IPython controller processes provide an interface for working with a set of engines.
109 At a general level, the controller is a collection of processes to which IPython engines
109 At a general level, the controller is a collection of processes to which IPython engines
110 and clients can connect. The controller is composed of a :class:`Hub` and a collection of
110 and clients can connect. The controller is composed of a :class:`Hub` and a collection of
111 :class:`Schedulers`. These Schedulers typically run in separate processes on the
111 :class:`Schedulers`. These Schedulers typically run in separate processes on the
112 same machine as the Hub, but they can run anywhere, from local threads to remote machines.
112 same machine as the Hub, but they can run anywhere, from local threads to remote machines.
113
113
114 The controller also provides a single point of contact for users who wish to
114 The controller also provides a single point of contact for users who wish to
115 utilize the engines connected to the controller. There are different ways of
115 utilize the engines connected to the controller. There are different ways of
116 working with a controller. In IPython, all of these models are implemented via
116 working with a controller. In IPython, all of these models are implemented via
117 the :meth:`.View.apply` method, after
117 the :meth:`.View.apply` method, after
118 constructing :class:`.View` objects to represent subsets of engines. The two
118 constructing :class:`.View` objects to represent subsets of engines. The two
119 primary models for interacting with engines are:
119 primary models for interacting with engines are:
120
120
121 * A **Direct** interface, where engines are addressed explicitly.
121 * A **Direct** interface, where engines are addressed explicitly.
122 * A **LoadBalanced** interface, where the Scheduler is trusted with assigning work to
122 * A **LoadBalanced** interface, where the Scheduler is trusted with assigning work to
123 appropriate engines.
123 appropriate engines.
124
124
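A small sketch of both models in action, assuming a running cluster and a connected
:class:`Client` (the process IDs in the output are, of course, illustrative):

.. sourcecode:: ipython

    In [1]: import os

    In [2]: from IPython.parallel import Client

    In [3]: rc = Client()

    In [4]: rc[:].apply_sync(os.getpid)              # Direct: every engine in the view
    Out[4]: [1210, 1211, 1212, 1213]

    In [5]: rc.load_balanced_view().apply_sync(os.getpid)  # LoadBalanced: one engine picked by the scheduler
    Out[5]: 1212
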
125 Advanced users can readily extend the View models to enable other
125 Advanced users can readily extend the View models to enable other
126 styles of parallelism.
126 styles of parallelism.
127
127
128 .. note::
128 .. note::
129
129
130 A single controller and set of engines can be used with multiple models
130 A single controller and set of engines can be used with multiple models
131 simultaneously. This opens the door for lots of interesting things.
131 simultaneously. This opens the door for lots of interesting things.
132
132
133
133
134 The Hub
134 The Hub
135 *******
135 *******
136
136
137 The center of an IPython cluster is the Hub. This is the process that keeps
137 The center of an IPython cluster is the Hub. This is the process that keeps
138 track of engine connections, schedulers, and clients, as well as all task requests and
138 track of engine connections, schedulers, and clients, as well as all task requests and
139 results. The primary role of the Hub is to facilitate queries of the cluster state, and
139 results. The primary role of the Hub is to facilitate queries of the cluster state, and
140 minimize the information required to establish the many connections involved in
140 minimize the information required to establish the many connections involved in
141 connecting new clients and engines.
141 connecting new clients and engines.
142
142
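For example, a connected :class:`Client` (here ``rc``) can ask the Hub about the state of
the cluster's queues; the exact contents of the reply depend on your cluster:

.. sourcecode:: ipython

    In [6]: rc.queue_status()
    Out[6]:
    {0: {'completed': 4, 'queue': 0, 'tasks': 0},
     1: {'completed': 4, 'queue': 0, 'tasks': 0},
     'unassigned': 0}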
143
143
144 Schedulers
144 Schedulers
145 **********
145 **********
146
146
147 All actions that can be performed on the engines go through a Scheduler. While the engines
147 All actions that can be performed on the engines go through a Scheduler. While the engines
148 themselves block when user code is run, the schedulers hide that from the user to provide
148 themselves block when user code is run, the schedulers hide that from the user to provide
149 a fully asynchronous interface to a set of engines.
149 a fully asynchronous interface to a set of engines.
150
150
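In practice, submitting work returns immediately with an :class:`AsyncResult` that can be
checked or waited on later. A minimal sketch, assuming a :class:`DirectView` ``dview``
obtained as ``dview = rc[:]`` from a connected client:

.. sourcecode:: ipython

    In [7]: ar = dview.apply_async(lambda x: x + 1, 10)   # returns immediately

    In [8]: ar.ready()
    Out[8]: True

    In [9]: ar.get()          # block until all engines have replied
    Out[9]: [11, 11, 11, 11]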
151
151
152 IPython client and views
152 IPython client and views
153 ------------------------
153 ------------------------
154
154
155 There is one primary object, the :class:`~.parallel.Client`, for connecting to a cluster.
155 There is one primary object, the :class:`~.parallel.Client`, for connecting to a cluster.
156 For each execution model, there is a corresponding :class:`~.parallel.View`. These views
156 For each execution model, there is a corresponding :class:`~.parallel.View`. These views
157 allow users to interact with a set of engines through a common interface. Here are the two default
157 allow users to interact with a set of engines through a common interface. Here are the two default
158 views:
158 views:
159
159
160 * The :class:`DirectView` class for explicit addressing.
160 * The :class:`DirectView` class for explicit addressing.
161 * The :class:`LoadBalancedView` class for destination-agnostic scheduling.
161 * The :class:`LoadBalancedView` class for destination-agnostic scheduling.
162
162
163 Security
163 Security
164 --------
164 --------
165
165
166 IPython uses ZeroMQ for networking, which has provided many advantages, but
166 IPython uses ZeroMQ for networking, which has provided many advantages, but
167 one of the setbacks is its utter lack of security [ZeroMQ]_. By default, no IPython
167 one of the setbacks is its utter lack of security [ZeroMQ]_. By default, no IPython
168 connections are encrypted, but open ports only listen on localhost. The only
168 connections are encrypted, but open ports only listen on localhost. The only
169 source of security for IPython is via ssh-tunnel. IPython supports both shell
169 source of security for IPython is via ssh-tunnel. IPython supports both shell
170 (`openssh`) and `paramiko` based tunnels for connections. There is a key necessary
170 (`openssh`) and `paramiko` based tunnels for connections. There is a key necessary
171 to submit requests, but due to the lack of encryption, it does not provide
171 to submit requests, but due to the lack of encryption, it does not provide
172 significant security if loopback traffic is compromised.
172 significant security if loopback traffic is compromised.
173
173
174 In our architecture, the controller is the only process that listens on
174 In our architecture, the controller is the only process that listens on
175 network ports, and is thus the main point of vulnerability. The standard model
175 network ports, and is thus the main point of vulnerability. The standard model
176 for secure connections is to designate that the controller listen on
176 for secure connections is to designate that the controller listen on
177 localhost, and use ssh-tunnels to connect clients and/or
177 localhost, and use ssh-tunnels to connect clients and/or
178 engines.
178 engines.
179
179
180 To connect and authenticate to the controller, an engine or client needs
180 To connect and authenticate to the controller, an engine or client needs
181 some information that the controller has stored in a JSON file.
181 some information that the controller has stored in a JSON file.
182 Thus, the JSON files need to be copied to a location where
182 Thus, the JSON files need to be copied to a location where
183 the clients and engines can find them. Typically, this is the
183 the clients and engines can find them. Typically, this is the
184 :file:`~/.ipython/profile_default/security` directory on the host where the
184 :file:`~/.ipython/profile_default/security` directory on the host where the
185 client/engine is running (which could be a different host than the controller).
185 client/engine is running (which could be a different host than the controller).
186 Once the JSON files are copied over, everything should work fine.
186 Once the JSON files are copied over, everything should work fine.
187
187
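For example, to run a client on your laptop against a controller on a remote cluster, you
might copy the client connection file over first (the hostname and paths here are purely
illustrative)::

    $ scp cluster.example.com:.ipython/profile_default/security/ipcontroller-client.json .

and then pass that file's path to the :class:`Client`, as shown in the Getting Started
section below.
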
188 Currently, there are two JSON files that the controller creates:
188 Currently, there are two JSON files that the controller creates:
189
189
190 ipcontroller-engine.json
190 ipcontroller-engine.json
191 This JSON file has the information necessary for an engine to connect
191 This JSON file has the information necessary for an engine to connect
192 to a controller.
192 to a controller.
193
193
194 ipcontroller-client.json
194 ipcontroller-client.json
195 The client's connection information. This may not differ from the engine's,
195 The client's connection information. This may not differ from the engine's,
196 but since the controller may listen on different ports for clients and
196 but since the controller may listen on different ports for clients and
197 engines, it is stored separately.
197 engines, it is stored separately.
198
198
199 ipcontroller-client.json will look something like this, under default localhost
199 ipcontroller-client.json will look something like this, under default localhost
200 circumstances:
200 circumstances:
201
201
202 .. sourcecode:: python
202 .. sourcecode:: python
203
203
204 {
204 {
205 "url":"tcp:\/\/127.0.0.1:54424",
205 "url":"tcp:\/\/127.0.0.1:54424",
206 "exec_key":"a361fe89-92fc-4762-9767-e2f0a05e3130",
206 "exec_key":"a361fe89-92fc-4762-9767-e2f0a05e3130",
207 "ssh":"",
207 "ssh":"",
208 "location":"10.19.1.135"
208 "location":"10.19.1.135"
209 }
209 }
210
210
211 If, however, you are running the controller on a worker node of a cluster, you will likely
211 If, however, you are running the controller on a worker node of a cluster, you will likely
212 need to use ssh tunnels to connect clients from your laptop to it. You will also
212 need to use ssh tunnels to connect clients from your laptop to it. You will also
213 probably need to instruct the controller to listen for engines coming from other worker nodes
213 probably need to instruct the controller to listen for engines coming from other worker nodes
214 on the cluster. An example of ipcontroller-client.json, as created by::
214 on the cluster. An example of ipcontroller-client.json, as created by::
215
215
216 $> ipcontroller --ip=* --ssh=login.mycluster.com
216 $> ipcontroller --ip=* --ssh=login.mycluster.com
217
217
218
218
219 .. sourcecode:: python
219 .. sourcecode:: python
220
220
221 {
221 {
222 "url":"tcp:\/\/*:54424",
222 "url":"tcp:\/\/*:54424",
223 "exec_key":"a361fe89-92fc-4762-9767-e2f0a05e3130",
223 "exec_key":"a361fe89-92fc-4762-9767-e2f0a05e3130",
224 "ssh":"login.mycluster.com",
224 "ssh":"login.mycluster.com",
225 "location":"10.0.0.2"
225 "location":"10.0.0.2"
226 }
226 }
227 More details of how these JSON files are used are given below.
227 More details of how these JSON files are used are given below.
228
228
229 A detailed description of the security model and its implementation in IPython
229 A detailed description of the security model and its implementation in IPython
230 can be found :ref:`here <parallelsecurity>`.
230 can be found :ref:`here <parallelsecurity>`.
231
231
232 .. warning::
232 .. warning::
233
233
234 Even at its most secure, the Controller listens on ports on localhost, and
234 Even at its most secure, the Controller listens on ports on localhost, and
235 every time you make a tunnel, you open a localhost port on the connecting
235 every time you make a tunnel, you open a localhost port on the connecting
236 machine that points to the Controller. If localhost on the Controller's
236 machine that points to the Controller. If localhost on the Controller's
237 machine, or the machine of any client or engine, is untrusted, then your
237 machine, or the machine of any client or engine, is untrusted, then your
238 Controller is insecure. There is no way around this with ZeroMQ.
238 Controller is insecure. There is no way around this with ZeroMQ.
239
239
240
240
241
241
242 Getting Started
242 Getting Started
243 ===============
243 ===============
244
244
245 To use IPython for parallel computing, you need to start one instance of the
245 To use IPython for parallel computing, you need to start one instance of the
246 controller and one or more instances of the engine. Initially, it is best to
246 controller and one or more instances of the engine. Initially, it is best to
247 simply start a controller and engines on a single host using the
247 simply start a controller and engines on a single host using the
248 :command:`ipcluster` command. To start a controller and 4 engines on your
248 :command:`ipcluster` command. To start a controller and 4 engines on your
249 localhost, just do::
249 localhost, just do::
250
250
251 $ ipcluster start -n 4
251 $ ipcluster start -n 4
252
252
253 More details about starting the IPython controller and engines can be found
253 More details about starting the IPython controller and engines can be found
254 :ref:`here <parallel_process>`.
254 :ref:`here <parallel_process>`.
255
255
256 Once you have started the IPython controller and one or more engines, you
256 Once you have started the IPython controller and one or more engines, you
257 are ready to use the engines to do something useful. To make sure
257 are ready to use the engines to do something useful. To make sure
258 everything is working correctly, try the following commands:
258 everything is working correctly, try the following commands:
259
259
260 .. sourcecode:: ipython
260 .. sourcecode:: ipython
261
261
262 In [1]: from IPython.parallel import Client
262 In [1]: from IPython.parallel import Client
263
263
264 In [2]: c = Client()
264 In [2]: c = Client()
265
265
266 In [4]: c.ids
266 In [4]: c.ids
267 Out[4]: set([0, 1, 2, 3])
267 Out[4]: set([0, 1, 2, 3])
268
268
269 In [5]: c[:].apply_sync(lambda : "Hello, World")
269 In [5]: c[:].apply_sync(lambda : "Hello, World")
270 Out[5]: [ 'Hello, World', 'Hello, World', 'Hello, World', 'Hello, World' ]
270 Out[5]: [ 'Hello, World', 'Hello, World', 'Hello, World', 'Hello, World' ]
271
271
272
272
273 When a client is created with no arguments, the client tries to find the corresponding JSON file
273 When a client is created with no arguments, the client tries to find the corresponding JSON file
274 in the local `~/.ipython/profile_default/security` directory. If you started the cluster with a
274 in the local `~/.ipython/profile_default/security` directory. If you started the cluster with a
275 specific profile, pass that profile name to the Client instead. This should cover most cases:
275 specific profile, pass that profile name to the Client instead. This should cover most cases:
276
276
277 .. sourcecode:: ipython
277 .. sourcecode:: ipython
278
278
279 In [2]: c = Client(profile='myprofile')
279 In [2]: c = Client(profile='myprofile')
280
280
281 If you have put the JSON file in a different location or it has a different name, create the
281 If you have put the JSON file in a different location or it has a different name, create the
282 client like this:
282 client like this:
283
283
284 .. sourcecode:: ipython
284 .. sourcecode:: ipython
285
285
286 In [2]: c = Client('/path/to/my/ipcontroller-client.json')
286 In [2]: c = Client('/path/to/my/ipcontroller-client.json')
287
287
288 Remember, a client needs to be able to see the Hub's ports to connect. So if the client and the
288 Remember, a client needs to be able to see the Hub's ports to connect. So if the client and the
289 Hub are on different machines, you may need to use an ssh server to tunnel access to that
289 Hub are on different machines, you may need to use an ssh server to tunnel access to that
290 machine, in which case you would connect with:
290 machine, in which case you would connect with:
291
291
292 .. sourcecode:: ipython
292 .. sourcecode:: ipython
293
293
294 In [2]: c = Client('/path/to/my/ipcontroller-client.json', sshserver='me@myhub.example.com')
294 In [2]: c = Client('/path/to/my/ipcontroller-client.json', sshserver='me@myhub.example.com')
295
295
296 Where 'myhub.example.com' is the URL or IP address of the machine on
296 Where 'myhub.example.com' is the URL or IP address of the machine on
297 which the Hub process is running (or another machine that has direct access to the Hub's ports).
297 which the Hub process is running (or another machine that has direct access to the Hub's ports).
298
298
299 The SSH server may already be specified in ipcontroller-client.json, if the controller was
299 The SSH server may already be specified in ipcontroller-client.json, if the controller was
300 told about it at launch time.
300 told about it at launch time.
301
301
302 You are now ready to learn more about the :ref:`Direct
302 You are now ready to learn more about the :ref:`Direct
303 <parallel_multiengine>` and :ref:`LoadBalanced <parallel_task>` interfaces to the
303 <parallel_multiengine>` and :ref:`LoadBalanced <parallel_task>` interfaces to the
304 controller.
304 controller.
305
305
306 .. [ZeroMQ] ZeroMQ. http://www.zeromq.org
306 .. [ZeroMQ] ZeroMQ. http://www.zeromq.org
@@ -1,765 +1,765 b''
1 =============
1 =============
2 0.11 Series
2 0.11 Series
3 =============
3 =============
4
4
5 Release 0.11
5 Release 0.11
6 ============
6 ============
7
7
8 IPython 0.11 is a *major* overhaul of IPython, two years in the making. Most
8 IPython 0.11 is a *major* overhaul of IPython, two years in the making. Most
9 of the code base has been rewritten or at least reorganized, breaking backward
9 of the code base has been rewritten or at least reorganized, breaking backward
10 compatibility with several APIs in previous versions. It is the first major
10 compatibility with several APIs in previous versions. It is the first major
11 release in two years, and probably the most significant change to IPython since
11 release in two years, and probably the most significant change to IPython since
12 its inception. We plan to have a relatively quick succession of releases, as
12 its inception. We plan to have a relatively quick succession of releases, as
13 people discover new bugs and regressions. Once we iron out any significant
13 people discover new bugs and regressions. Once we iron out any significant
14 bugs in this process and settle down the new APIs, this series will become
14 bugs in this process and settle down the new APIs, this series will become
15 IPython 1.0. We encourage feedback now on the core APIs, which we hope to
15 IPython 1.0. We encourage feedback now on the core APIs, which we hope to
16 keep stable during the 1.0 series.
16 keep stable during the 1.0 series.
17
17
18 Since the internal APIs have changed so much, projects using IPython as a
18 Since the internal APIs have changed so much, projects using IPython as a
19 library (as opposed to end-users of the application) are the most likely to
19 library (as opposed to end-users of the application) are the most likely to
20 encounter regressions or changes that break their existing use patterns. We
20 encounter regressions or changes that break their existing use patterns. We
21 will make every effort to provide updated versions of the APIs to facilitate
21 will make every effort to provide updated versions of the APIs to facilitate
22 the transition, and we encourage you to contact us on the `development mailing
22 the transition, and we encourage you to contact us on the `development mailing
23 list`__ with questions and feedback.
23 list`__ with questions and feedback.
24
24
25 .. __: http://mail.scipy.org/mailman/listinfo/ipython-dev
25 .. __: http://mail.scipy.org/mailman/listinfo/ipython-dev
26
26
27 Chris Fonnesbeck recently wrote an `excellent post`__ that highlights some of
27 Chris Fonnesbeck recently wrote an `excellent post`__ that highlights some of
28 our major new features, with examples and screenshots. We encourage you to
28 our major new features, with examples and screenshots. We encourage you to
29 read it as it provides an illustrated, high-level overview complementing the
29 read it as it provides an illustrated, high-level overview complementing the
30 detailed feature breakdown in this document.
30 detailed feature breakdown in this document.
31
31
32 .. __: http://fonnesbeck.calepin.co/innovations-in-ipython.html
32 .. __: http://fonnesbeck.calepin.co/innovations-in-ipython.html
33
33
34 A quick summary of the major changes (see below for details):
34 A quick summary of the major changes (see below for details):
35
35
36 * **Standalone Qt console**: a new rich console has been added to IPython,
36 * **Standalone Qt console**: a new rich console has been added to IPython,
37 started with `ipython qtconsole`. In this application we have tried to
37 started with `ipython qtconsole`. In this application we have tried to
38 retain the feel of a terminal for fast and efficient workflows, while adding
38 retain the feel of a terminal for fast and efficient workflows, while adding
39 many features that a line-oriented terminal simply cannot support, such as
39 many features that a line-oriented terminal simply cannot support, such as
40 inline figures, full multiline editing with syntax highlighting, graphical
40 inline figures, full multiline editing with syntax highlighting, graphical
41 tooltips for function calls and much more. This development was sponsored by
41 tooltips for function calls and much more. This development was sponsored by
42 `Enthought Inc.`__. See :ref:`below <qtconsole_011>` for details.
42 `Enthought Inc.`__. See :ref:`below <qtconsole_011>` for details.
43
43
44 .. __: http://enthought.com
44 .. __: http://enthought.com
45
45
46 * **High-level parallel computing with ZeroMQ**. Using the same architecture
46 * **High-level parallel computing with ZeroMQ**. Using the same architecture
47 that our Qt console is based on, we have completely rewritten our high-level
47 that our Qt console is based on, we have completely rewritten our high-level
48 parallel computing machinery that in prior versions used the Twisted
48 parallel computing machinery that in prior versions used the Twisted
49 networking framework. While this change will require users to update their
49 networking framework. While this change will require users to update their
50 codes, the improvements in performance, memory control and internal
50 codes, the improvements in performance, memory control and internal
51 consistency across our codebase convinced us it was a price worth paying. We
51 consistency across our codebase convinced us it was a price worth paying. We
52 have tried to explain how to best proceed with this update, and will be happy
52 have tried to explain how to best proceed with this update, and will be happy
53 to answer questions that may arise. A full tutorial describing these
53 to answer questions that may arise. A full tutorial describing these
54 features `was presented at SciPy'11`__, more details :ref:`below
54 features `was presented at SciPy'11`__, more details :ref:`below
55 <parallel_011>`.
55 <parallel_011>`.
56
56
57 .. __: http://minrk.github.com/scipy-tutorial-2011
57 .. __: http://minrk.github.com/scipy-tutorial-2011
58
58
59 * **New model for GUI/plotting support in the terminal**. Now instead of the
59 * **New model for GUI/plotting support in the terminal**. Now instead of the
60 various `-Xthread` flags we had before, GUI support is provided without the
60 various `-Xthread` flags we had before, GUI support is provided without the
61 use of any threads, by directly integrating GUI event loops with Python's
61 use of any threads, by directly integrating GUI event loops with Python's
62 `PyOS_InputHook` API. A new command-line flag `--gui` controls GUI support,
62 `PyOS_InputHook` API. A new command-line flag `--gui` controls GUI support,
63 and it can also be enabled after IPython startup via the new `%gui` magic.
63 and it can also be enabled after IPython startup via the new `%gui` magic.
64 This requires some changes if you want to execute GUI-using scripts inside
64 This requires some changes if you want to execute GUI-using scripts inside
65 IPython; see :ref:`the GUI support section <gui_support>` for more details.
65 IPython; see :ref:`the GUI support section <gui_support>` for more details.
66
66
67 * **A two-process architecture.** The Qt console is the first use of a new
67 * **A two-process architecture.** The Qt console is the first use of a new
68 model that splits IPython between a kernel process where code is executed and
68 model that splits IPython between a kernel process where code is executed and
69 a client that handles user interaction. We plan on also providing terminal
69 a client that handles user interaction. We plan on also providing terminal
70 and web-browser based clients using this infrastructure in future releases.
70 and web-browser based clients using this infrastructure in future releases.
71 This model allows multiple clients to interact with an IPython process
71 This model allows multiple clients to interact with an IPython process
72 through a :ref:`well-documented messaging protocol <messaging>` using the
72 through a :ref:`well-documented messaging protocol <messaging>` using the
73 ZeroMQ networking library.
73 ZeroMQ networking library.
74
74
75 * **Refactoring.** the entire codebase has been refactored, in order to make it
75 * **Refactoring.** the entire codebase has been refactored, in order to make it
76 more modular and easier to contribute to. IPython has traditionally been a
76 more modular and easier to contribute to. IPython has traditionally been a
77 hard project to participate in because the old codebase was very monolithic. We
77 hard project to participate in because the old codebase was very monolithic. We
78 hope this (ongoing) restructuring will make it easier for new developers to
78 hope this (ongoing) restructuring will make it easier for new developers to
79 join us.
79 join us.
80
80
81 * **Vim integration**. Vim can be configured to seamlessly control an IPython
81 * **Vim integration**. Vim can be configured to seamlessly control an IPython
82 kernel; see the files in :file:`docs/examples/vim` for the full details.
82 kernel; see the files in :file:`docs/examples/vim` for the full details.
83 This work was done by Paul Ivanov, who prepared a nice `video
83 This work was done by Paul Ivanov, who prepared a nice `video
84 demonstration`__ of the features it provides.
84 demonstration`__ of the features it provides.
85
85
86 .. __: http://pirsquared.org/blog/2011/07/28/vim-ipython/
86 .. __: http://pirsquared.org/blog/2011/07/28/vim-ipython/
87
87
88 * **Integration into Microsoft Visual Studio**. Thanks to the work of the
88 * **Integration into Microsoft Visual Studio**. Thanks to the work of the
89 Microsoft `Python Tools for Visual Studio`__ team, this version of IPython
89 Microsoft `Python Tools for Visual Studio`__ team, this version of IPython
90 has been integrated into Microsoft Visual Studio's Python tools open source
90 has been integrated into Microsoft Visual Studio's Python tools open source
91 plug-in. `Details below`_
91 plug-in. `Details below`_
92
92
93 .. __: http://pytools.codeplex.com
93 .. __: http://pytools.codeplex.com
94 .. _details below: ms_visual_studio_011_
94 .. _details below: ms_visual_studio_011_
95
95
96 * **Improved unicode support**. We closed many bugs related to unicode input.
96 * **Improved unicode support**. We closed many bugs related to unicode input.
97
97
98 * **Python 3**. IPython now runs on Python 3.x. See :ref:`python3_011` for
98 * **Python 3**. IPython now runs on Python 3.x. See :ref:`python3_011` for
99 details.
99 details.
100
100
101 * **New profile model**. Profiles are now directories that contain all relevant
101 * **New profile model**. Profiles are now directories that contain all relevant
102 information for that session, and thus better isolate IPython use-cases.
102 information for that session, and thus better isolate IPython use-cases.
103
103
104 * **SQLite storage for history**. All history is now stored in a SQLite
104 * **SQLite storage for history**. All history is now stored in a SQLite
105 database, providing support for multiple simultaneous sessions that won't
105 database, providing support for multiple simultaneous sessions that won't
106 clobber each other as well as the ability to perform queries on all stored
106 clobber each other as well as the ability to perform queries on all stored
107 data.
107 data.
108
108
109 * **New configuration system**. All parts of IPython are now configured via a
109 * **New configuration system**. All parts of IPython are now configured via a
110 mechanism inspired by the Enthought Traits library. Any configurable element
110 mechanism inspired by the Enthought Traits library. Any configurable element
111 can have its attributes set either via files that now use real Python syntax
111 can have its attributes set either via files that now use real Python syntax
112 or from the command-line.
112 or from the command-line.
113
113
114 * **Pasting of code with prompts**. IPython now intelligently strips out input
114 * **Pasting of code with prompts**. IPython now intelligently strips out input
115 prompts, be they plain Python ones (``>>>`` and ``...``) or IPython ones
115 prompts, be they plain Python ones (``>>>`` and ``...``) or IPython ones
116 (``In [N]:`` and ``...:``). More details :ref:`here <pasting_with_prompts>`.
116 (``In [N]:`` and ``...:``). More details :ref:`here <pasting_with_prompts>`.
117
117
118
118
119 Authors and support
119 Authors and support
120 -------------------
120 -------------------
121
121
122 Over 60 separate authors have contributed to this release; see :ref:`below
122 Over 60 separate authors have contributed to this release; see :ref:`below
123 <credits_011>` for a full list. In particular, we want to highlight the
123 <credits_011>` for a full list. In particular, we want to highlight the
124 extremely active participation of two new core team members: Evan Patterson
124 extremely active participation of two new core team members: Evan Patterson
125 implemented the Qt console, and Thomas Kluyver started with our Python 3 port
125 implemented the Qt console, and Thomas Kluyver started with our Python 3 port
126 and by now has made major contributions to just about every area of IPython.
126 and by now has made major contributions to just about every area of IPython.
127
127
128 We are also grateful for the support we have received during this development
128 We are also grateful for the support we have received during this development
129 cycle from several institutions:
129 cycle from several institutions:
130
130
131 - `Enthought Inc`__ funded the development of our new Qt console, an effort that
131 - `Enthought Inc`__ funded the development of our new Qt console, an effort that
132 required developing major pieces of underlying infrastructure, which now
132 required developing major pieces of underlying infrastructure, which now
133 power not only the Qt console but also our new parallel machinery. We'd like
133 power not only the Qt console but also our new parallel machinery. We'd like
134 to thank Eric Jones and Travis Oliphant for their support, as well as Ilan
134 to thank Eric Jones and Travis Oliphant for their support, as well as Ilan
135 Schnell for his tireless work integrating and testing IPython in the
135 Schnell for his tireless work integrating and testing IPython in the
136 `Enthought Python Distribution`_.
136 `Enthought Python Distribution`_.
137
137
138 .. __: http://enthought.com
138 .. __: http://enthought.com
139 .. _Enthought Python Distribution: http://www.enthought.com/products/epd.php
139 .. _Enthought Python Distribution: http://www.enthought.com/products/epd.php
140
140
141 - Nipy/NIH: funding via the `NiPy project`__ (NIH grant 5R01MH081909-02) helped
141 - Nipy/NIH: funding via the `NiPy project`__ (NIH grant 5R01MH081909-02) helped
142 us jumpstart the development of this series by restructuring the entire
142 us jumpstart the development of this series by restructuring the entire
143 codebase two years ago in a way that would make modular development and
143 codebase two years ago in a way that would make modular development and
144 testing more approachable. Without this initial groundwork, all the new
144 testing more approachable. Without this initial groundwork, all the new
145 features we have added would have been impossible to develop.
145 features we have added would have been impossible to develop.
146
146
147 .. __: http://nipy.org
147 .. __: http://nipy.org
148
148
149 - Sage/NSF: funding via the grant `Sage: Unifying Mathematical Software for
149 - Sage/NSF: funding via the grant `Sage: Unifying Mathematical Software for
150 Scientists, Engineers, and Mathematicians`__ (NSF grant DMS-1015114)
150 Scientists, Engineers, and Mathematicians`__ (NSF grant DMS-1015114)
151 supported a meeting in spring 2011 of several of the core IPython developers
151 supported a meeting in spring 2011 of several of the core IPython developers
152 where major progress was made integrating the last key pieces leading to this
152 where major progress was made integrating the last key pieces leading to this
153 release.
153 release.
154
154
155 .. __: http://modular.math.washington.edu/grants/compmath09
155 .. __: http://modular.math.washington.edu/grants/compmath09
156
156
157 - Microsoft's team working on `Python Tools for Visual Studio`__ developed the
157 - Microsoft's team working on `Python Tools for Visual Studio`__ developed the
158 integration of IPython into the Python plugin for Visual Studio 2010.
158 integration of IPython into the Python plugin for Visual Studio 2010.
159
159
160 .. __: http://pytools.codeplex.com
160 .. __: http://pytools.codeplex.com
161
161
162 - Google Summer of Code: in 2010, we had two students developing prototypes of
162 - Google Summer of Code: in 2010, we had two students developing prototypes of
163 the new machinery that is now maturing in this release: `Omar Zapata`_ and
163 the new machinery that is now maturing in this release: `Omar Zapata`_ and
164 `Gerardo Gutiérrez`_.
164 `Gerardo Gutiérrez`_.
165
165
166 .. _Omar Zapata: http://ipythonzmq.blogspot.com/2010/08/ipython-zmq-status.html
166 .. _Omar Zapata: http://ipythonzmq.blogspot.com/2010/08/ipython-zmq-status.html
167 .. _Gerardo Gutiérrez: http://ipythonqt.blogspot.com/2010/04/ipython-qt-interface-gsoc-2010-proposal.html
167 .. _Gerardo Gutiérrez: http://ipythonqt.blogspot.com/2010/04/ipython-qt-interface-gsoc-2010-proposal.html
168
168
169
169
170 Development summary: moving to Git and Github
170 Development summary: moving to Git and Github
171 ---------------------------------------------
171 ---------------------------------------------
172
172
173 In April 2010, after `one breakage too many with bzr`__, we decided to move our
173 In April 2010, after `one breakage too many with bzr`__, we decided to move our
174 entire development process to Git and Github.com. This has proven to be one of
174 entire development process to Git and Github.com. This has proven to be one of
175 the best decisions in the project's history, as the combination of git and
175 the best decisions in the project's history, as the combination of git and
176 github has made us far, far more productive than we could be with our previous
176 github has made us far, far more productive than we could be with our previous
177 tools. We first converted our bzr repo to a git one without losing history,
177 tools. We first converted our bzr repo to a git one without losing history,
178 and a few weeks later ported all open Launchpad bugs to github issues with
178 and a few weeks later ported all open Launchpad bugs to github issues with
179 their comments mostly intact (modulo some formatting changes). This ensured a
179 their comments mostly intact (modulo some formatting changes). This ensured a
180 smooth transition where no development history or submitted bugs were lost.
180 smooth transition where no development history or submitted bugs were lost.
181 Feel free to use our little Launchpad to Github issues `porting script`_ if you
181 Feel free to use our little Launchpad to Github issues `porting script`_ if you
182 need to make a similar transition.
182 need to make a similar transition.
183
183
184 .. __: http://mail.scipy.org/pipermail/ipython-dev/2010-April/005944.html
184 .. __: http://mail.scipy.org/pipermail/ipython-dev/2010-April/005944.html
185 .. _porting script: https://gist.github.com/835577
185 .. _porting script: https://gist.github.com/835577
186
186
187 These simple statistics show how much work has been done on the new release, by
187 These simple statistics show how much work has been done on the new release, by
188 comparing the current code to the last point it had in common with the 0.10
188 comparing the current code to the last point it had in common with the 0.10
189 series. A huge diff and ~2200 commits make up this cycle::
189 series. A huge diff and ~2200 commits make up this cycle::
190
190
191 git diff $(git merge-base 0.10.2 HEAD) | wc -l
191 git diff $(git merge-base 0.10.2 HEAD) | wc -l
192 288019
192 288019
193
193
194 git log $(git merge-base 0.10.2 HEAD)..HEAD --oneline | wc -l
194 git log $(git merge-base 0.10.2 HEAD)..HEAD --oneline | wc -l
195 2200
195 2200
196
196
197 Since our move to github, 511 issues were closed, 226 of which were pull
197 Since our move to github, 511 issues were closed, 226 of which were pull
198 requests and 285 regular issues (:ref:`a full list with links
198 requests and 285 regular issues (:ref:`a full list with links
199 <issues_list_011>` is available for those interested in the details). Github's
199 <issues_list_011>` is available for those interested in the details). Github's
200 pull requests are a fantastic mechanism for reviewing code and building a
200 pull requests are a fantastic mechanism for reviewing code and building a
201 shared ownership of the project, and we are making enthusiastic use of it.
201 shared ownership of the project, and we are making enthusiastic use of it.
202
202
203 .. Note::
203 .. Note::
204
204
205 This undercounts the number of issues closed in this development cycle,
205 This undercounts the number of issues closed in this development cycle,
206 since we only moved to github for issue tracking in May 2010, but we have no
206 since we only moved to github for issue tracking in May 2010, but we have no
207 way of collecting statistics on the number of issues closed in the old
207 way of collecting statistics on the number of issues closed in the old
208 Launchpad bug tracker prior to that.
208 Launchpad bug tracker prior to that.
209
209
210
210
211 .. _qtconsole_011:
211 .. _qtconsole_011:
212
212
213 Qt Console
213 Qt Console
214 ----------
214 ----------
215
215
216 IPython now ships with a Qt application that feels very much like a terminal,
216 IPython now ships with a Qt application that feels very much like a terminal,
217 but is in fact a rich GUI that runs an IPython client and supports inline
217 but is in fact a rich GUI that runs an IPython client and supports inline
218 figures, saving sessions to PDF and HTML, multiline editing with syntax
218 figures, saving sessions to PDF and HTML, multiline editing with syntax
219 highlighting, graphical calltips and much more:
219 highlighting, graphical calltips and much more:
220
220
221 .. figure:: ../_images/qtconsole.png
221 .. figure:: ../_images/qtconsole.png
222 :width: 400px
222 :width: 400px
223 :alt: IPython Qt console with embedded plots
223 :alt: IPython Qt console with embedded plots
224 :align: center
224 :align: center
225 :target: ../_images/qtconsole.png
225 :target: ../_images/qtconsole.png
226
226
227 The Qt console for IPython, using inline matplotlib plots.
227 The Qt console for IPython, using inline matplotlib plots.
228
228
229 We hope that many projects will embed this widget, which we've kept
229 We hope that many projects will embed this widget, which we've kept
230 deliberately very lightweight, into their own environments. In the future we
230 deliberately very lightweight, into their own environments. In the future we
231 may also offer a slightly more featureful application (with menus and other GUI
231 may also offer a slightly more featureful application (with menus and other GUI
232 elements), but we remain committed to always shipping this easy to embed
232 elements), but we remain committed to always shipping this easy to embed
233 widget.
233 widget.
234
234
235 See the :ref:`Qt console section <qtconsole>` of the docs for a detailed
235 See the :ref:`Qt console section <qtconsole>` of the docs for a detailed
236 description of the console's features and use.
236 description of the console's features and use.
237
237
238
238
239 .. _parallel_011:
239 .. _parallel_011:
240
240
241 High-level parallel computing with ZeroMQ
241 High-level parallel computing with ZeroMQ
242 -----------------------------------------
242 -----------------------------------------
243
243
244 We have completely rewritten the Twisted-based code for high-level parallel
244 We have completely rewritten the Twisted-based code for high-level parallel
245 computing to work atop our new ZeroMQ architecture. While we realize this will
245 computing to work atop our new ZeroMQ architecture. While we realize this will
246 break compatibility for a number of users, we hope to make the transition as
246 break compatibility for a number of users, we hope to make the transition as
247 easy as possible with our docs, and we are convinced the change is worth it.
247 easy as possible with our docs, and we are convinced the change is worth it.
248 ZeroMQ provides us with much tighter control over memory, higher performance,
248 ZeroMQ provides us with much tighter control over memory, higher performance,
249 and its communications are impervious to the Python Global Interpreter Lock
249 and its communications are impervious to the Python Global Interpreter Lock
250 because they take place in a system-level C++ thread. The impact of the GIL in
250 because they take place in a system-level C++ thread. The impact of the GIL in
251 our previous code was something we could simply not work around, given that
251 our previous code was something we could simply not work around, given that
252 Twisted is itself a Python library. So while Twisted is a very capable
252 Twisted is itself a Python library. So while Twisted is a very capable
253 framework, we think ZeroMQ fits our needs much better and we hope you will find
253 framework, we think ZeroMQ fits our needs much better and we hope you will find
254 the change to be a significant improvement in the long run.
254 the change to be a significant improvement in the long run.
255
255
256 Our manual contains :ref:`a full description of how to use IPython for parallel
256 Our manual contains :ref:`a full description of how to use IPython for parallel
257 computing <parallel_overview>`, and the `tutorial`__ presented by Min
257 computing <parallel_overview>`, and the `tutorial`__ presented by Min
258 Ragan-Kelley at the SciPy 2011 conference provides a hands-on complement to the
258 Ragan-Kelley at the SciPy 2011 conference provides a hands-on complement to the
259 reference docs.
259 reference docs.
260
260
261 .. __: http://minrk.github.com/scipy-tutorial-2011
261 .. __: http://minrk.github.com/scipy-tutorial-2011
262
262
263
263
264 Refactoring
264 Refactoring
265 -----------
265 -----------
266
266
267 As of this release, a significant portion of IPython has been refactored. This
267 As of this release, a significant portion of IPython has been refactored. This
268 refactoring is founded on a number of new abstractions. The main new classes
268 refactoring is founded on a number of new abstractions. The main new classes
269 that implement these abstractions are:
269 that implement these abstractions are:
270
270
271 * :class:`IPython.utils.traitlets.HasTraits`.
271 * :class:`IPython.utils.traitlets.HasTraits`.
272 * :class:`IPython.config.configurable.Configurable`.
272 * :class:`IPython.config.configurable.Configurable`.
273 * :class:`IPython.config.application.Application`.
273 * :class:`IPython.config.application.Application`.
274 * :class:`IPython.config.loader.ConfigLoader`.
274 * :class:`IPython.config.loader.ConfigLoader`.
275 * :class:`IPython.config.loader.Config`
275 * :class:`IPython.config.loader.Config`
276
276
277 We are still in the process of writing developer-focused documentation about
277 We are still in the process of writing developer-focused documentation about
278 these classes, but for now our :ref:`configuration documentation
278 these classes, but for now our :ref:`configuration documentation
279 <config_overview>` contains a high level overview of the concepts that these
279 <config_overview>` contains a high level overview of the concepts that these
280 classes express.
280 classes express.
281
281
282 The biggest user-visible change is likely the move to using the config system
282 The biggest user-visible change is likely the move to using the config system
283 to determine the command-line arguments for IPython applications. The benefit
283 to determine the command-line arguments for IPython applications. The benefit
284 of this is that *all* configurable values in IPython are exposed on the
284 of this is that *all* configurable values in IPython are exposed on the
285 command-line, but the syntax for specifying values has changed. The gist is
285 command-line, but the syntax for specifying values has changed. The gist is
286 that assigning values is pure Python assignment. Simple flags exist for
286 that assigning values is pure Python assignment. Simple flags exist for
287 commonly used options; these are always prefixed with '--'.
287 commonly used options; these are always prefixed with '--'.
288
288
289 The IPython command-line help has the details of all the options (via
289 The IPython command-line help has the details of all the options (via
290 ``ipython --help``), but a simple example should clarify things; the ``pylab``
290 ``ipython --help``), but a simple example should clarify things; the ``pylab``
291 flag can be used to start in pylab mode with the qt4 backend::
291 flag can be used to start in pylab mode with the qt4 backend::
292
292
293 ipython --pylab=qt
293 ipython --pylab=qt
294
294
295 which is equivalent to using the fully qualified form::
295 which is equivalent to using the fully qualified form::
296
296
297 ipython --TerminalIPythonApp.pylab=qt
297 ipython --TerminalIPythonApp.pylab=qt
298
298
299 The long-form options can be listed via ``ipython --help-all``.
299 The long-form options can be listed via ``ipython --help-all``.
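
The same pure-Python assignment syntax works in a profile's configuration file. A
minimal sketch (assuming the default ``ipython_config.py`` created by
``ipython profile create``), equivalent to the fully qualified command line above:

.. sourcecode:: python

    # ipython_config.py -- file-based equivalent of
    # ``ipython --TerminalIPythonApp.pylab=qt`` (sketch)
    c = get_config()
    c.TerminalIPythonApp.pylab = 'qt'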
300
300
301
301
302 ZeroMQ architecture
302 ZeroMQ architecture
303 -------------------
303 -------------------
304
304
305 There is a new GUI framework for IPython, based on a client-server model in
305 There is a new GUI framework for IPython, based on a client-server model in
306 which multiple clients can communicate with one IPython kernel, using the
306 which multiple clients can communicate with one IPython kernel, using the
307 ZeroMQ messaging framework. There is already a Qt console client, which can
307 ZeroMQ messaging framework. There is already a Qt console client, which can
308 be started by calling ``ipython qtconsole``. The protocol is :ref:`documented
308 be started by calling ``ipython qtconsole``. The protocol is :ref:`documented
309 <messaging>`.
309 <messaging>`.
310
310
311 The parallel computing framework has also been rewritten using ZMQ. The
311 The parallel computing framework has also been rewritten using ZMQ. The
312 protocol is described :ref:`here <parallel_messages>`, and the code is in the
312 protocol is described :ref:`here <parallel_messages>`, and the code is in the
313 new :mod:`IPython.parallel` module.
313 new :mod:`IPython.parallel` module.
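
For orientation, here is a tiny sketch of the client side of the new module,
assuming a cluster is already running under the default profile (the squaring
function is just a placeholder):

.. sourcecode:: python

    from IPython.parallel import Client

    rc = Client()          # connect to the running controller
    dview = rc[:]          # a DirectView spanning all engines
    # run the same call on every engine and collect the results
    results = dview.apply_sync(lambda x: x ** 2, 7)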
314
314
315 .. _python3_011:
315 .. _python3_011:
316
316
317 Python 3 support
317 Python 3 support
318 ----------------
318 ----------------
319
319
320 A Python 3 version of IPython has been prepared. For the time being, this is
320 A Python 3 version of IPython has been prepared. For the time being, this is
321 maintained separately and updated from the main codebase. Its code can be found
321 maintained separately and updated from the main codebase. Its code can be found
322 `here <https://github.com/ipython/ipython-py3k>`_. The parallel computing
322 `here <https://github.com/ipython/ipython-py3k>`_. The parallel computing
323 components are not perfect on Python 3, but most functionality appears to be
323 components are not perfect on Python 3, but most functionality appears to be
324 working. As this work is evolving quickly, the best place to find updated
324 working. As this work is evolving quickly, the best place to find updated
325 information about it is our `Python 3 wiki page`__.
325 information about it is our `Python 3 wiki page`__.
326
326
327 .. __: http://wiki.ipython.org/index.php?title=Python_3
327 .. __: http://wiki.ipython.org/index.php?title=Python_3
328
328
329
329
330 Unicode
330 Unicode
331 -------
331 -------
332
332
333 Entering non-ASCII characters in unicode literals (``u"€ø"``) now works
333 Entering non-ASCII characters in unicode literals (``u"€ø"``) now works
334 properly on all platforms. However, entering these in byte/string literals
334 properly on all platforms. However, entering these in byte/string literals
335 (``"€ø"``) will not work as expected on Windows (or any platform where the
335 (``"€ø"``) will not work as expected on Windows (or any platform where the
336 terminal encoding is not UTF-8, as it typically is for Linux & Mac OS X). You
336 terminal encoding is not UTF-8, as it typically is for Linux & Mac OS X). You
337 can use escape sequences (``"\xe9\x82"``) to get bytes above 128, or use
337 can use escape sequences (``"\xe9\x82"``) to get bytes above 128, or use
338 unicode literals and encode them. This is a limitation of Python 2 which we
338 unicode literals and encode them. This is a limitation of Python 2 which we
339 cannot easily work around.
339 cannot easily work around.
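
For instance, either of these produces the UTF-8 bytes for the euro sign without
relying on the terminal encoding:

.. sourcecode:: python

    s = "\xe2\x82\xac"           # escape sequences for the UTF-8 bytes
    s = u"€".encode('utf-8')     # or encode a unicode literal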
340
340
341 .. _ms_visual_studio_011:
341 .. _ms_visual_studio_011:
342
342
343 Integration with Microsoft Visual Studio
343 Integration with Microsoft Visual Studio
344 ----------------------------------------
344 ----------------------------------------
345
345
346 IPython can be used as the interactive shell in the `Python plugin for
346 IPython can be used as the interactive shell in the `Python plugin for
347 Microsoft Visual Studio`__, as seen here:
347 Microsoft Visual Studio`__, as seen here:
348
348
349 .. figure:: ../_images/ms_visual_studio.png
349 .. figure:: ../_images/ms_visual_studio.png
350 :width: 500px
350 :width: 500px
351 :alt: IPython console embedded in Microsoft Visual Studio.
351 :alt: IPython console embedded in Microsoft Visual Studio.
352 :align: center
352 :align: center
353 :target: ../_images/ms_visual_studio.png
353 :target: ../_images/ms_visual_studio.png
354
354
355 IPython console embedded in Microsoft Visual Studio.
355 IPython console embedded in Microsoft Visual Studio.
356
356
357 The Microsoft team developing this currently has a release candidate out using
357 The Microsoft team developing this currently has a release candidate out using
358 IPython 0.11. We will continue to collaborate with them to ensure that as they
358 IPython 0.11. We will continue to collaborate with them to ensure that as they
359 approach their final release date, the integration with IPython remains smooth.
359 approach their final release date, the integration with IPython remains smooth.
360 We'd like to thank Dino Viehland and Shahrokh Mortazavi for the work they have
360 We'd like to thank Dino Viehland and Shahrokh Mortazavi for the work they have
361 done towards this feature, as well as Wenming Ye for his support of our WinHPC
361 done towards this feature, as well as Wenming Ye for his support of our WinHPC
362 capabilities.
362 capabilities.
363
363
364 .. __: http://pytools.codeplex.com
364 .. __: http://pytools.codeplex.com
365
365
366
366
367 Additional new features
367 Additional new features
368 -----------------------
368 -----------------------
369
369
370 * Added ``Bytes`` traitlet, removing ``Str``. All 'string' traitlets should
370 * Added ``Bytes`` traitlet, removing ``Str``. All 'string' traitlets should
371 either be ``Unicode`` if a real string, or ``Bytes`` if a C-string. This
371 either be ``Unicode`` if a real string, or ``Bytes`` if a C-string. This
372 removes ambiguity and helps the Python 3 transition.
372 removes ambiguity and helps the Python 3 transition.
373
373
374 * New magic ``%loadpy`` loads a python file from disk or web URL into
374 * New magic ``%loadpy`` loads a python file from disk or web URL into
375 the current input buffer.
375 the current input buffer.
376
376
377 * New magic ``%pastebin`` for sharing code via the 'Lodge it' pastebin.
377 * New magic ``%pastebin`` for sharing code via the 'Lodge it' pastebin.
378
378
379 * New magic ``%precision`` for controlling float and numpy pretty printing.
379 * New magic ``%precision`` for controlling float and numpy pretty printing.
380
380
381 * IPython applications initiate logging, so any object can gain access to
381 * IPython applications initiate logging, so any object can gain access to
382 the logger of the currently running Application with:
382 the logger of the currently running Application with:
383
383
384 .. sourcecode:: python
384 .. sourcecode:: python
385
385
386 from IPython.config.application import Application
386 from IPython.config.application import Application
387 logger = Application.instance().log
387 logger = Application.instance().log
388
388
389 * You can now get help on an object halfway through typing a command. For
389 * You can now get help on an object halfway through typing a command. For
390 instance, typing ``a = zip?`` shows the details of :func:`zip`. It also
390 instance, typing ``a = zip?`` shows the details of :func:`zip`. It also
391 leaves the command at the next prompt so you can carry on with it.
391 leaves the command at the next prompt so you can carry on with it.
392
392
393 * The input history is now written to an SQLite database. The API for
393 * The input history is now written to an SQLite database. The API for
394 retrieving items from the history has also been redesigned.
394 retrieving items from the history has also been redesigned.
395
395
396 * The :mod:`IPython.extensions.pretty` extension has been moved out of
396 * The :mod:`IPython.extensions.pretty` extension has been moved out of
397 quarantine and fully updated to the new extension API.
397 quarantine and fully updated to the new extension API.
398
398
399 * New magics for loading/unloading/reloading extensions have been added:
399 * New magics for loading/unloading/reloading extensions have been added:
400 ``%load_ext``, ``%unload_ext`` and ``%reload_ext``.
400 ``%load_ext``, ``%unload_ext`` and ``%reload_ext``.
401
401
402 * The configuration system and configuration files are brand new. See the
402 * The configuration system and configuration files are brand new. See the
403 configuration system :ref:`documentation <config_index>` for more details.
403 configuration system :ref:`documentation <config_index>` for more details.
404
404
405 * The :class:`~IPython.core.interactiveshell.InteractiveShell` class is now a
405 * The :class:`~IPython.core.interactiveshell.InteractiveShell` class is now a
406 :class:`~IPython.config.configurable.Configurable` subclass and has traitlets
406 :class:`~IPython.config.configurable.Configurable` subclass and has traitlets
407 that determine the defaults and runtime environment. The ``__init__`` method
407 that determine the defaults and runtime environment. The ``__init__`` method
408 has also been refactored so this class can be instantiated and run without
408 has also been refactored so this class can be instantiated and run without
409 the old :mod:`ipmaker` module.
409 the old :mod:`ipmaker` module.
410
410
411 * The methods of :class:`~IPython.core.interactiveshell.InteractiveShell` have
411 * The methods of :class:`~IPython.core.interactiveshell.InteractiveShell` have
412 been organized into sections to make it easier to turn more sections
412 been organized into sections to make it easier to turn more sections
413 of functionality into components.
413 of functionality into components.
414
414
415 * The embedded shell has been refactored into a truly standalone subclass of
415 * The embedded shell has been refactored into a truly standalone subclass of
416 :class:`InteractiveShell` called :class:`InteractiveShellEmbed`. All
416 :class:`InteractiveShell` called :class:`InteractiveShellEmbed`. All
417 embedding logic has been taken out of the base class and put into the
417 embedding logic has been taken out of the base class and put into the
418 embedded subclass.
418 embedded subclass.
419
419
420 * Added methods of :class:`~IPython.core.interactiveshell.InteractiveShell` to
420 * Added methods of :class:`~IPython.core.interactiveshell.InteractiveShell` to
421 help it clean up after itself. The :meth:`cleanup` method controls this. We
421 help it clean up after itself. The :meth:`cleanup` method controls this. We
422 couldn't do this in :meth:`__del__` because we have cycles in our object
422 couldn't do this in :meth:`__del__` because we have cycles in our object
423 graph that prevent it from being called.
423 graph that prevent it from being called.
424
424
425 * Created a new module :mod:`IPython.utils.importstring` for resolving
425 * Created a new module :mod:`IPython.utils.importstring` for resolving
426 strings like ``foo.bar.Bar`` to the actual class.
426 strings like ``foo.bar.Bar`` to the actual class.
427
427
428 * Completely refactored the :mod:`IPython.core.prefilter` module into
428 * Completely refactored the :mod:`IPython.core.prefilter` module into
429 :class:`~IPython.config.configurable.Configurable` subclasses. Added a new
429 :class:`~IPython.config.configurable.Configurable` subclasses. Added a new
430 layer into the prefilter system, called "transformations" that all new
430 layer into the prefilter system, called "transformations" that all new
431 prefilter logic should use (rather than the older "checker/handler"
431 prefilter logic should use (rather than the older "checker/handler"
432 approach).
432 approach).
433
433
434 * Aliases are now components (:mod:`IPython.core.alias`).
434 * Aliases are now components (:mod:`IPython.core.alias`).
435
435
436 * New top level :func:`~IPython.frontend.terminal.embed.embed` function that can
436 * New top level :func:`~IPython.frontend.terminal.embed.embed` function that can
437 be called to embed IPython at any place in user's code. On the first call it
437 be called to embed IPython at any place in user's code. On the first call it
438 will create an :class:`~IPython.frontend.terminal.embed.InteractiveShellEmbed`
438 will create an :class:`~IPython.frontend.terminal.embed.InteractiveShellEmbed`
439 instance and call it. In later calls, it just calls the previously created
439 instance and call it. In later calls, it just calls the previously created
440 :class:`~IPython.frontend.terminal.embed.InteractiveShellEmbed`.
440 :class:`~IPython.frontend.terminal.embed.InteractiveShellEmbed`.
441
441
442 * Created a configuration system (:mod:`IPython.config.configurable`) that is
442 * Created a configuration system (:mod:`IPython.config.configurable`) that is
443 based on :mod:`IPython.utils.traitlets`. Configurables are arranged into a
443 based on :mod:`IPython.utils.traitlets`. Configurables are arranged into a
444 runtime containment tree (not inheritance) that i) automatically propagates
444 runtime containment tree (not inheritance) that i) automatically propagates
445 configuration information and ii) allows singletons to discover each other in
445 configuration information and ii) allows singletons to discover each other in
446 a loosely coupled manner. In the future all parts of IPython will be
446 a loosely coupled manner. In the future all parts of IPython will be
447 subclasses of :class:`~IPython.config.configurable.Configurable`. All IPython
447 subclasses of :class:`~IPython.config.configurable.Configurable`. All IPython
448 developers should become familiar with the config system.
448 developers should become familiar with the config system.
449
449
450 * Created a new :class:`~IPython.config.loader.Config` for holding
450 * Created a new :class:`~IPython.config.loader.Config` for holding
451 configuration information. This is a dict-like class with a few extras: i)
451 configuration information. This is a dict-like class with a few extras: i)
452 it supports attribute style access, ii) it has a merge function that merges
452 it supports attribute style access, ii) it has a merge function that merges
453 two :class:`~IPython.config.loader.Config` instances recursively and iii) it
453 two :class:`~IPython.config.loader.Config` instances recursively and iii) it
454 will automatically create sub-:class:`~IPython.config.loader.Config`
454 will automatically create sub-:class:`~IPython.config.loader.Config`
455 instances for attributes that start with an uppercase character (see the sketch after this list).
455 instances for attributes that start with an uppercase character (see the sketch after this list).
456
456
457 * Created new configuration loaders in :mod:`IPython.config.loader`. These
457 * Created new configuration loaders in :mod:`IPython.config.loader`. These
458 loaders provide a unified loading interface for all configuration
458 loaders provide a unified loading interface for all configuration
459 information including command line arguments and configuration files. We
459 information including command line arguments and configuration files. We
460 have two default implementations based on :mod:`argparse` and plain python
460 have two default implementations based on :mod:`argparse` and plain python
461 files. These are used to implement the new configuration system.
461 files. These are used to implement the new configuration system.
462
462
463 * Created a top-level :class:`Application` class in
463 * Created a top-level :class:`Application` class in
464 :mod:`IPython.core.application` that is designed to encapsulate the starting
464 :mod:`IPython.core.application` that is designed to encapsulate the starting
465 of any basic Python program. An application loads and merges all the
465 of any basic Python program. An application loads and merges all the
466 configuration objects, constructs the main application, configures and
466 configuration objects, constructs the main application, configures and
467 initiates logging, and creates and configures any :class:`Configurable`
467 initiates logging, and creates and configures any :class:`Configurable`
468 instances and then starts the application running. An extended
468 instances and then starts the application running. An extended
469 :class:`BaseIPythonApplication` class adds logic for handling the
469 :class:`BaseIPythonApplication` class adds logic for handling the
470 IPython directory as well as profiles, and all IPython entry points
470 IPython directory as well as profiles, and all IPython entry points
471 extend it.
471 extend it.
472
472
473 * The :class:`Type` and :class:`Instance` traitlets now handle classes given
473 * The :class:`Type` and :class:`Instance` traitlets now handle classes given
474 as strings, like ``foo.bar.Bar``. This is needed for forward declarations.
474 as strings, like ``foo.bar.Bar``. This is needed for forward declarations.
475 But, this was implemented in a careful way so that string to class
475 But, this was implemented in a careful way so that string to class
476 resolution is done at a single point, when the parent
476 resolution is done at a single point, when the parent
477 :class:`~IPython.utils.traitlets.HasTraits` is instantiated.
477 :class:`~IPython.utils.traitlets.HasTraits` is instantiated.
478
478
479 * :mod:`IPython.utils.ipstruct` has been refactored to be a subclass of
479 * :mod:`IPython.utils.ipstruct` has been refactored to be a subclass of
480 dict. It also now has full docstrings and doctests.
480 dict. It also now has full docstrings and doctests.
481
481
482 * Created a Traits-like implementation in :mod:`IPython.utils.traitlets`. This
482 * Created a Traits-like implementation in :mod:`IPython.utils.traitlets`. This
483 is a pure Python, lightweight version of a library that is similar to
483 is a pure Python, lightweight version of a library that is similar to
484 Enthought's Traits project, but has no dependencies on Enthought's code. We
484 Enthought's Traits project, but has no dependencies on Enthought's code. We
485 are using this for validation, defaults and notification in our new component
485 are using this for validation, defaults and notification in our new component
486 system. Although it is not 100% API compatible with Enthought's Traits, we
486 system. Although it is not 100% API compatible with Enthought's Traits, we
487 plan on moving in this direction so that eventually our implementation could
487 plan on moving in this direction so that eventually our implementation could
488 be replaced by a (yet to exist) pure Python version of Enthought Traits. A tiny usage example follows this list.
488 be replaced by a (yet to exist) pure Python version of Enthought Traits. A tiny usage example follows this list.
489
489
490 * Added a new module :mod:`IPython.lib.inputhook` to manage the integration
490 * Added a new module :mod:`IPython.lib.inputhook` to manage the integration
491 with GUI event loops using `PyOS_InputHook`. See the docstrings in this
491 with GUI event loops using `PyOS_InputHook`. See the docstrings in this
492 module or the main IPython docs for details.
492 module or the main IPython docs for details.
493
493
494 * For users, GUI event loop integration is now handled through the new
494 * For users, GUI event loop integration is now handled through the new
495 :command:`%gui` magic command. Type ``%gui?`` at an IPython prompt for
495 :command:`%gui` magic command. Type ``%gui?`` at an IPython prompt for
496 documentation.
496 documentation.
497
497
498 * For developers :mod:`IPython.lib.inputhook` provides a simple interface
498 * For developers :mod:`IPython.lib.inputhook` provides a simple interface
499 for managing the event loops in their interactive GUI applications.
499 for managing the event loops in their interactive GUI applications.
500 Examples can be found in our :file:`docs/examples/lib` directory.
500 Examples can be found in our :file:`examples/lib` directory.
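
As mentioned in the list above, a short sketch of how
:class:`~IPython.config.loader.Config` attribute access behaves (the option names
are just illustrative):

.. sourcecode:: python

    from IPython.config.loader import Config

    c = Config()
    # Attribute-style access; the capitalized name ``InteractiveShell``
    # automatically becomes a nested sub-Config, so no explicit creation
    # is needed before assigning to it.
    c.InteractiveShell.autoindent = True
    c.InteractiveShell.colors = 'Linux'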
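
And the tiny traitlets example promised above, assuming the ``Int`` and
``Unicode`` trait types (a sketch rather than a full tour of the API):

.. sourcecode:: python

    from IPython.utils.traitlets import HasTraits, Int, Unicode

    class Widget(HasTraits):
        count = Int(0)
        name = Unicode(u'ipython')

    w = Widget()
    w.count = 3    # validated assignment; a non-integer would raise TraitError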
501
501
502 Backwards incompatible changes
502 Backwards incompatible changes
503 ------------------------------
503 ------------------------------
504
504
505 * The Twisted-based :mod:`IPython.kernel` has been removed, and completely
505 * The Twisted-based :mod:`IPython.kernel` has been removed, and completely
506 rewritten as :mod:`IPython.parallel`, using ZeroMQ.
506 rewritten as :mod:`IPython.parallel`, using ZeroMQ.
507
507
508 * Profiles are now directories. Instead of a profile being a single config file,
508 * Profiles are now directories. Instead of a profile being a single config file,
509 profiles are now self-contained directories. By default, profiles get their
509 profiles are now self-contained directories. By default, profiles get their
510 own IPython history, log files, and everything. To create a new profile, do
510 own IPython history, log files, and everything. To create a new profile, do
511 ``ipython profile create <name>``.
511 ``ipython profile create <name>``.
512
512
513 * All IPython applications have been rewritten to use
513 * All IPython applications have been rewritten to use
514 :class:`~IPython.config.loader.KeyValueConfigLoader`. This means that
514 :class:`~IPython.config.loader.KeyValueConfigLoader`. This means that
515 command-line options have changed. Now, all configurable values are accessible
515 command-line options have changed. Now, all configurable values are accessible
516 from the command-line with the same syntax as in a configuration file.
516 from the command-line with the same syntax as in a configuration file.
517
517
518 * The command line options ``-wthread``, ``-qthread`` and
518 * The command line options ``-wthread``, ``-qthread`` and
519 ``-gthread`` have been removed. Use ``--gui=wx``, ``--gui=qt``, ``--gui=gtk``
519 ``-gthread`` have been removed. Use ``--gui=wx``, ``--gui=qt``, ``--gui=gtk``
520 instead.
520 instead.
521
521
522 * The extension loading functions have been renamed to
522 * The extension loading functions have been renamed to
523 :func:`load_ipython_extension` and :func:`unload_ipython_extension`. A minimal sketch follows this list.
523 :func:`load_ipython_extension` and :func:`unload_ipython_extension`. A minimal sketch follows this list.
524
524
525 * :class:`~IPython.core.interactiveshell.InteractiveShell` no longer takes an
525 * :class:`~IPython.core.interactiveshell.InteractiveShell` no longer takes an
526 ``embedded`` argument. Instead just use the
526 ``embedded`` argument. Instead just use the
527 :class:`~IPython.core.interactiveshell.InteractiveShellEmbed` class.
527 :class:`~IPython.core.interactiveshell.InteractiveShellEmbed` class.
528
528
529 * ``__IPYTHON__`` is no longer injected into ``__builtin__``.
529 * ``__IPYTHON__`` is no longer injected into ``__builtin__``.
530
530
531 * :meth:`Struct.__init__` no longer takes `None` as its first argument. It
531 * :meth:`Struct.__init__` no longer takes `None` as its first argument. It
532 must be a :class:`dict` or :class:`Struct`.
532 must be a :class:`dict` or :class:`Struct`.
533
533
534 * :meth:`~IPython.core.interactiveshell.InteractiveShell.ipmagic` has been
534 * :meth:`~IPython.core.interactiveshell.InteractiveShell.ipmagic` has been
535 renamed :meth:`~IPython.core.interactiveshell.InteractiveShell.magic`.
535 renamed :meth:`~IPython.core.interactiveshell.InteractiveShell.magic`.
536
536
537 * The functions :func:`ipmagic` and :func:`ipalias` have been removed from
537 * The functions :func:`ipmagic` and :func:`ipalias` have been removed from
538 :mod:`__builtins__`.
538 :mod:`__builtins__`.
539
539
540 * The references to the global
540 * The references to the global
541 :class:`~IPython.core.interactiveshell.InteractiveShell` instance (``_ip``, and
541 :class:`~IPython.core.interactiveshell.InteractiveShell` instance (``_ip``, and
542 ``__IP``) have been removed from the user's namespace. They are replaced by a
542 ``__IP``) have been removed from the user's namespace. They are replaced by a
543 new function called :func:`get_ipython` that returns the current
543 new function called :func:`get_ipython` that returns the current
544 :class:`~IPython.core.interactiveshell.InteractiveShell` instance. This
544 :class:`~IPython.core.interactiveshell.InteractiveShell` instance. This
545 function is injected into the user's namespace and is now the main way of
545 function is injected into the user's namespace and is now the main way of
546 accessing the running IPython. A short sketch follows this list.
546 accessing the running IPython. A short sketch follows this list.
547
547
548 * Old style configuration files :file:`ipythonrc` and :file:`ipy_user_conf.py`
548 * Old style configuration files :file:`ipythonrc` and :file:`ipy_user_conf.py`
549 are no longer supported. Users should migrate their configuration files to
549 are no longer supported. Users should migrate their configuration files to
550 the new format described :ref:`here <config_overview>` and :ref:`here
550 the new format described :ref:`here <config_overview>` and :ref:`here
551 <configuring_ipython>`.
551 <configuring_ipython>`.
552
552
553 * The old IPython extension API that relied on :func:`ipapi` has been
553 * The old IPython extension API that relied on :func:`ipapi` has been
554 completely removed. The new extension API is described :ref:`here
554 completely removed. The new extension API is described :ref:`here
555 <configuring_ipython>`.
555 <configuring_ipython>`.
556
556
557 * Support for ``qt3`` has been dropped. Users who need this should use
557 * Support for ``qt3`` has been dropped. Users who need this should use
558 previous versions of IPython.
558 previous versions of IPython.
559
559
560 * Removed :mod:`shellglobals` as it was obsolete.
560 * Removed :mod:`shellglobals` as it was obsolete.
561
561
562 * Removed all the threaded shells in :mod:`IPython.core.shell`. These are no
562 * Removed all the threaded shells in :mod:`IPython.core.shell`. These are no
563 longer needed because of the new capabilities in
563 longer needed because of the new capabilities in
564 :mod:`IPython.lib.inputhook`.
564 :mod:`IPython.lib.inputhook`.
565
565
566 * New top-level sub-packages have been created: :mod:`IPython.core`,
566 * New top-level sub-packages have been created: :mod:`IPython.core`,
567 :mod:`IPython.lib`, :mod:`IPython.utils`, :mod:`IPython.deathrow`,
567 :mod:`IPython.lib`, :mod:`IPython.utils`, :mod:`IPython.deathrow`,
568 :mod:`IPython.quarantine`. All existing top-level modules have been
568 :mod:`IPython.quarantine`. All existing top-level modules have been
569 moved to appropriate sub-packages. All internal import statements
569 moved to appropriate sub-packages. All internal import statements
570 have been updated and tests have been added. The build system (setup.py
570 have been updated and tests have been added. The build system (setup.py
571 and friends) has been updated. See :ref:`this section <module_reorg>` of the
571 and friends) has been updated. See :ref:`this section <module_reorg>` of the
572 documentation for descriptions of these new sub-packages.
572 documentation for descriptions of these new sub-packages.
573
573
574 * :mod:`IPython.ipapi` has been moved to :mod:`IPython.core.ipapi`.
574 * :mod:`IPython.ipapi` has been moved to :mod:`IPython.core.ipapi`.
575 :mod:`IPython.Shell` and :mod:`IPython.iplib` have been split and removed as
575 :mod:`IPython.Shell` and :mod:`IPython.iplib` have been split and removed as
576 part of the refactor.
576 part of the refactor.
577
577
578 * :mod:`Extensions` has been moved to :mod:`extensions` and all existing
578 * :mod:`Extensions` has been moved to :mod:`extensions` and all existing
579 extensions have been moved to either :mod:`IPython.quarantine` or
579 extensions have been moved to either :mod:`IPython.quarantine` or
580 :mod:`IPython.deathrow`. :mod:`IPython.quarantine` contains modules that we
580 :mod:`IPython.deathrow`. :mod:`IPython.quarantine` contains modules that we
581 plan on keeping but that need to be updated. :mod:`IPython.deathrow` contains
581 plan on keeping but that need to be updated. :mod:`IPython.deathrow` contains
582 modules that are either dead or that should be maintained as third party
582 modules that are either dead or that should be maintained as third party
583 libraries. More details about this can be found :ref:`here <module_reorg>`.
583 libraries. More details about this can be found :ref:`here <module_reorg>`.
584
584
585 * Previous IPython GUIs in :mod:`IPython.frontend` and :mod:`IPython.gui` are
585 * Previous IPython GUIs in :mod:`IPython.frontend` and :mod:`IPython.gui` are
586 likely broken, and have been moved to :mod:`IPython.deathrow` because of the
586 likely broken, and have been moved to :mod:`IPython.deathrow` because of the
587 refactoring in the core. With proper updates, these should still work.
587 refactoring in the core. With proper updates, these should still work.
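
As noted in the list above, extensions now hook in through the renamed functions;
a minimal sketch (the module name ``myext`` and the injected variable are purely
hypothetical):

.. sourcecode:: python

    # myext.py -- load interactively with ``%load_ext myext``
    def load_ipython_extension(ip):
        # ``ip`` is the running InteractiveShell; push a demo name into
        # the user namespace so loading has a visible effect.
        ip.push({'loaded_by_myext': True})

    def unload_ipython_extension(ip):
        # Nothing to undo in this sketch.
        pass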
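
Similarly, the sketch referred to above for reaching the shell now that ``_ip``
and ``__IP`` are gone (the ``pwd`` magic is an arbitrary example):

.. sourcecode:: python

    ip = get_ipython()     # the running InteractiveShell instance
    ip.magic('pwd')        # programmatic magic call (formerly ipmagic)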
588
588
589
589
590 Known Regressions
590 Known Regressions
591 -----------------
591 -----------------
592
592
593 We do our best to improve IPython, but there are some known regressions in 0.11
593 We do our best to improve IPython, but there are some known regressions in 0.11
594 relative to 0.10.2. First of all, there are features that have yet to be
594 relative to 0.10.2. First of all, there are features that have yet to be
595 ported to the new APIs, and in order to ensure that all of the installed code
595 ported to the new APIs, and in order to ensure that all of the installed code
596 runs for our users, we have moved them to two separate directories in the
596 runs for our users, we have moved them to two separate directories in the
597 source distribution, `quarantine` and `deathrow`. Finally, we have some other
597 source distribution, `quarantine` and `deathrow`. Finally, we have some other
598 miscellaneous regressions that we hope to fix as soon as possible. We now
598 miscellaneous regressions that we hope to fix as soon as possible. We now
599 describe all of these in more detail.
599 describe all of these in more detail.
600
600
601 Quarantine
601 Quarantine
602 ~~~~~~~~~~
602 ~~~~~~~~~~
603
603
604 These are tools and extensions that we consider relatively easy to update to
604 These are tools and extensions that we consider relatively easy to update to
605 the new classes and APIs, but that we simply haven't had time for. Any user
605 the new classes and APIs, but that we simply haven't had time for. Any user
606 who is interested in one of these is encouraged to help us by porting it and
606 who is interested in one of these is encouraged to help us by porting it and
607 submitting a pull request on our `development site`_.
607 submitting a pull request on our `development site`_.
608
608
609 .. _development site: http://github.com/ipython/ipython
609 .. _development site: http://github.com/ipython/ipython
610
610
611 Currently, the quarantine directory contains::
611 Currently, the quarantine directory contains::
612
612
613 clearcmd.py ipy_fsops.py ipy_signals.py
613 clearcmd.py ipy_fsops.py ipy_signals.py
614 envpersist.py ipy_gnuglobal.py ipy_synchronize_with.py
614 envpersist.py ipy_gnuglobal.py ipy_synchronize_with.py
615 ext_rescapture.py ipy_greedycompleter.py ipy_system_conf.py
615 ext_rescapture.py ipy_greedycompleter.py ipy_system_conf.py
616 InterpreterExec.py ipy_jot.py ipy_which.py
616 InterpreterExec.py ipy_jot.py ipy_which.py
617 ipy_app_completers.py ipy_lookfor.py ipy_winpdb.py
617 ipy_app_completers.py ipy_lookfor.py ipy_winpdb.py
618 ipy_autoreload.py ipy_profile_doctest.py ipy_workdir.py
618 ipy_autoreload.py ipy_profile_doctest.py ipy_workdir.py
619 ipy_completers.py ipy_pydb.py jobctrl.py
619 ipy_completers.py ipy_pydb.py jobctrl.py
620 ipy_editors.py ipy_rehashdir.py ledit.py
620 ipy_editors.py ipy_rehashdir.py ledit.py
621 ipy_exportdb.py ipy_render.py pspersistence.py
621 ipy_exportdb.py ipy_render.py pspersistence.py
622 ipy_extutil.py ipy_server.py win32clip.py
622 ipy_extutil.py ipy_server.py win32clip.py
623
623
624 Deathrow
624 Deathrow
625 ~~~~~~~~
625 ~~~~~~~~
626
626
627 These packages may be harder to update, or may make more sense as third-party
627 These packages may be harder to update, or may make more sense as third-party
628 libraries. Some of them are completely obsolete and have already been replaced
628 libraries. Some of them are completely obsolete and have already been replaced
629 by better functionality (we simply haven't had the time to carefully weed them
629 by better functionality (we simply haven't had the time to carefully weed them
630 out so they are kept here for now). Others simply require fixes to code that
630 out so they are kept here for now). Others simply require fixes to code that
631 the current core team may not be familiar with. If a tool you were used to is
631 the current core team may not be familiar with. If a tool you were used to is
632 included here, we encourage you to contact the dev list and we can discuss
632 included here, we encourage you to contact the dev list and we can discuss
633 whether it makes sense to keep it in IPython (if it can be maintained).
633 whether it makes sense to keep it in IPython (if it can be maintained).
634
634
635 Currently, the deathrow directory contains::
635 Currently, the deathrow directory contains::
636
636
637 astyle.py ipy_defaults.py ipy_vimserver.py
637 astyle.py ipy_defaults.py ipy_vimserver.py
638 dtutils.py ipy_kitcfg.py numeric_formats.py
638 dtutils.py ipy_kitcfg.py numeric_formats.py
639 Gnuplot2.py ipy_legacy.py numutils.py
639 Gnuplot2.py ipy_legacy.py numutils.py
640 GnuplotInteractive.py ipy_p4.py outputtrap.py
640 GnuplotInteractive.py ipy_p4.py outputtrap.py
641 GnuplotRuntime.py ipy_profile_none.py PhysicalQInput.py
641 GnuplotRuntime.py ipy_profile_none.py PhysicalQInput.py
642 ibrowse.py ipy_profile_numpy.py PhysicalQInteractive.py
642 ibrowse.py ipy_profile_numpy.py PhysicalQInteractive.py
643 igrid.py ipy_profile_scipy.py quitter.py*
643 igrid.py ipy_profile_scipy.py quitter.py*
644 ipipe.py ipy_profile_sh.py scitedirector.py
644 ipipe.py ipy_profile_sh.py scitedirector.py
645 iplib.py ipy_profile_zope.py Shell.py
645 iplib.py ipy_profile_zope.py Shell.py
646 ipy_constants.py ipy_traits_completer.py twshell.py
646 ipy_constants.py ipy_traits_completer.py twshell.py
647
647
648
648
649 Other regressions
649 Other regressions
650 ~~~~~~~~~~~~~~~~~
650 ~~~~~~~~~~~~~~~~~
651
651
652 * The machinery that adds functionality to the 'sh' profile for using IPython
652 * The machinery that adds functionality to the 'sh' profile for using IPython
653 as your system shell has not been updated to use the new APIs. As a result,
653 as your system shell has not been updated to use the new APIs. As a result,
654 only the aesthetic (prompt) changes are still implemented. We intend to fix
654 only the aesthetic (prompt) changes are still implemented. We intend to fix
655 this by 0.12. Tracked as issue 547_.
655 this by 0.12. Tracked as issue 547_.
656
656
657 .. _547: https://github.com/ipython/ipython/issues/547
657 .. _547: https://github.com/ipython/ipython/issues/547
658
658
659 * The installation of scripts on Windows was broken without setuptools, so we
659 * The installation of scripts on Windows was broken without setuptools, so we
660 now depend on setuptools on Windows. We hope to fix setuptools-less
660 now depend on setuptools on Windows. We hope to fix setuptools-less
661 installation, and then remove the setuptools dependency. Issue 539_.
661 installation, and then remove the setuptools dependency. Issue 539_.
662
662
663 .. _539: https://github.com/ipython/ipython/issues/539
663 .. _539: https://github.com/ipython/ipython/issues/539
664
664
665 * The directory history `_dh` is not saved between sessions. Issue 634_.
665 * The directory history `_dh` is not saved between sessions. Issue 634_.
666
666
667 .. _634: https://github.com/ipython/ipython/issues/634
667 .. _634: https://github.com/ipython/ipython/issues/634
668
668
669
669
670 Removed Features
670 Removed Features
671 ----------------
671 ----------------
672
672
673 As part of the updating of IPython, we have removed a few features for the
673 As part of the updating of IPython, we have removed a few features for the
674 purposes of cleaning up the codebase and interfaces. These removals are
674 purposes of cleaning up the codebase and interfaces. These removals are
675 permanent, but for any item listed below, equivalent functionality is
675 permanent, but for any item listed below, equivalent functionality is
676 available.
676 available.
677
677
678 * The magics Exit and Quit have been dropped as ways to exit IPython. Instead,
678 * The magics Exit and Quit have been dropped as ways to exit IPython. Instead,
679 the lowercase forms of both work either as a bare name (``exit``) or a
679 the lowercase forms of both work either as a bare name (``exit``) or a
680 function call (``exit()``). You can assign these to other names using
680 function call (``exit()``). You can assign these to other names using
681 exec_lines in the config file, as sketched below.
681 exec_lines in the config file, as sketched below.
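
A possible sketch of such an assignment in a profile's configuration file (the
alias ``bye`` is arbitrary, and the exact configurable section may differ):

.. sourcecode:: python

    # ipython_config.py (sketch) -- give ``exit`` an additional name
    c = get_config()
    c.InteractiveShellApp.exec_lines = ['bye = exit']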
682
682
683
683
684 .. _credits_011:
684 .. _credits_011:
685
685
686 Credits
686 Credits
687 -------
687 -------
688
688
689 Many users and developers contributed code, features, bug reports and ideas to
689 Many users and developers contributed code, features, bug reports and ideas to
690 this release. Please do not hesitate to contact us if we've failed to
690 this release. Please do not hesitate to contact us if we've failed to
691 acknowledge your contribution here. In particular, for this release we have
691 acknowledge your contribution here. In particular, for this release we have
692 contributions from the following people, a mix of new and regular names (in
692 contributions from the following people, a mix of new and regular names (in
693 alphabetical order by first name):
693 alphabetical order by first name):
694
694
695 * Aenugu Sai Kiran Reddy <saikrn08-at-gmail.com>
695 * Aenugu Sai Kiran Reddy <saikrn08-at-gmail.com>
696 * andy wilson <wilson.andrew.j+github-at-gmail.com>
696 * andy wilson <wilson.andrew.j+github-at-gmail.com>
697 * Antonio Cuni <antocuni>
697 * Antonio Cuni <antocuni>
698 * Barry Wark <barrywark-at-gmail.com>
698 * Barry Wark <barrywark-at-gmail.com>
699 * Beetoju Anuradha <anu.beethoju-at-gmail.com>
699 * Beetoju Anuradha <anu.beethoju-at-gmail.com>
700 * Benjamin Ragan-Kelley <minrk-at-Mercury.local>
700 * Benjamin Ragan-Kelley <minrk-at-Mercury.local>
701 * Brad Reisfeld
701 * Brad Reisfeld
702 * Brian E. Granger <ellisonbg-at-gmail.com>
702 * Brian E. Granger <ellisonbg-at-gmail.com>
703 * Christoph Gohlke <cgohlke-at-uci.edu>
703 * Christoph Gohlke <cgohlke-at-uci.edu>
704 * Cody Precord
704 * Cody Precord
705 * dan.milstein
705 * dan.milstein
706 * Darren Dale <dsdale24-at-gmail.com>
706 * Darren Dale <dsdale24-at-gmail.com>
707 * Dav Clark <davclark-at-berkeley.edu>
707 * Dav Clark <davclark-at-berkeley.edu>
708 * David Warde-Farley <wardefar-at-iro.umontreal.ca>
708 * David Warde-Farley <wardefar-at-iro.umontreal.ca>
709 * epatters <ejpatters-at-gmail.com>
709 * epatters <ejpatters-at-gmail.com>
710 * epatters <epatters-at-caltech.edu>
710 * epatters <epatters-at-caltech.edu>
711 * epatters <epatters-at-enthought.com>
711 * epatters <epatters-at-enthought.com>
712 * Eric Firing <efiring-at-hawaii.edu>
712 * Eric Firing <efiring-at-hawaii.edu>
713 * Erik Tollerud <erik.tollerud-at-gmail.com>
713 * Erik Tollerud <erik.tollerud-at-gmail.com>
714 * Evan Patterson <epatters-at-enthought.com>
714 * Evan Patterson <epatters-at-enthought.com>
715 * Fernando Perez <Fernando.Perez-at-berkeley.edu>
715 * Fernando Perez <Fernando.Perez-at-berkeley.edu>
716 * Gael Varoquaux <gael.varoquaux-at-normalesup.org>
716 * Gael Varoquaux <gael.varoquaux-at-normalesup.org>
717 * Gerardo <muzgash-at-Muzpelheim>
717 * Gerardo <muzgash-at-Muzpelheim>
718 * Jason Grout <jason.grout-at-drake.edu>
718 * Jason Grout <jason.grout-at-drake.edu>
719 * John Hunter <jdh2358-at-gmail.com>
719 * John Hunter <jdh2358-at-gmail.com>
720 * Jens Hedegaard Nielsen <jenshnielsen-at-gmail.com>
720 * Jens Hedegaard Nielsen <jenshnielsen-at-gmail.com>
721 * Johann Cohen-Tanugi <johann.cohentanugi-at-gmail.com>
721 * Johann Cohen-Tanugi <johann.cohentanugi-at-gmail.com>
722 * JΓΆrgen Stenarson <jorgen.stenarson-at-bostream.nu>
722 * JΓΆrgen Stenarson <jorgen.stenarson-at-bostream.nu>
723 * Justin Riley <justin.t.riley-at-gmail.com>
723 * Justin Riley <justin.t.riley-at-gmail.com>
724 * Kiorky
724 * Kiorky
725 * Laurent Dufrechou <laurent.dufrechou-at-gmail.com>
725 * Laurent Dufrechou <laurent.dufrechou-at-gmail.com>
726 * Luis Pedro Coelho <lpc-at-cmu.edu>
726 * Luis Pedro Coelho <lpc-at-cmu.edu>
727 * Mani chandra <mchandra-at-iitk.ac.in>
727 * Mani chandra <mchandra-at-iitk.ac.in>
728 * Mark E. Smith
728 * Mark E. Smith
729 * Mark Voorhies <mark.voorhies-at-ucsf.edu>
729 * Mark Voorhies <mark.voorhies-at-ucsf.edu>
730 * Martin Spacek <git-at-mspacek.mm.st>
730 * Martin Spacek <git-at-mspacek.mm.st>
731 * Michael Droettboom <mdroe-at-stsci.edu>
731 * Michael Droettboom <mdroe-at-stsci.edu>
732 * MinRK <benjaminrk-at-gmail.com>
732 * MinRK <benjaminrk-at-gmail.com>
733 * muzuiget <muzuiget-at-gmail.com>
733 * muzuiget <muzuiget-at-gmail.com>
734 * Nick Tarleton <nick-at-quixey.com>
734 * Nick Tarleton <nick-at-quixey.com>
735 * Nicolas Rougier <Nicolas.rougier-at-inria.fr>
735 * Nicolas Rougier <Nicolas.rougier-at-inria.fr>
736 * Omar Andres Zapata Mesa <andresete.chaos-at-gmail.com>
736 * Omar Andres Zapata Mesa <andresete.chaos-at-gmail.com>
737 * Paul Ivanov <pivanov314-at-gmail.com>
737 * Paul Ivanov <pivanov314-at-gmail.com>
738 * Pauli Virtanen <pauli.virtanen-at-iki.fi>
738 * Pauli Virtanen <pauli.virtanen-at-iki.fi>
739 * Prabhu Ramachandran
739 * Prabhu Ramachandran
740 * Ramana <sramana9-at-gmail.com>
740 * Ramana <sramana9-at-gmail.com>
741 * Robert Kern <robert.kern-at-gmail.com>
741 * Robert Kern <robert.kern-at-gmail.com>
742 * Sathesh Chandra <satheshchandra88-at-gmail.com>
742 * Sathesh Chandra <satheshchandra88-at-gmail.com>
743 * Satrajit Ghosh <satra-at-mit.edu>
743 * Satrajit Ghosh <satra-at-mit.edu>
744 * Sebastian Busch
744 * Sebastian Busch
745 * Skipper Seabold <jsseabold-at-gmail.com>
745 * Skipper Seabold <jsseabold-at-gmail.com>
746 * Stefan van der Walt <bzr-at-mentat.za.net>
746 * Stefan van der Walt <bzr-at-mentat.za.net>
747 * Stephan Peijnik <debian-at-sp.or.at>
747 * Stephan Peijnik <debian-at-sp.or.at>
748 * Steven Bethard
748 * Steven Bethard
749 * Thomas Kluyver <takowl-at-gmail.com>
749 * Thomas Kluyver <takowl-at-gmail.com>
750 * Thomas Spura <tomspur-at-fedoraproject.org>
750 * Thomas Spura <tomspur-at-fedoraproject.org>
751 * Tom Fetherston <tfetherston-at-aol.com>
751 * Tom Fetherston <tfetherston-at-aol.com>
752 * Tom MacWright
752 * Tom MacWright
753 * tzanko
753 * tzanko
754 * vankayala sowjanya <hai.sowjanya-at-gmail.com>
754 * vankayala sowjanya <hai.sowjanya-at-gmail.com>
755 * Vivian De Smedt <vds2212-at-VIVIAN>
755 * Vivian De Smedt <vds2212-at-VIVIAN>
756 * Ville M. Vainio <vivainio-at-gmail.com>
756 * Ville M. Vainio <vivainio-at-gmail.com>
757 * Vishal Vatsa <vishal.vatsa-at-gmail.com>
757 * Vishal Vatsa <vishal.vatsa-at-gmail.com>
758 * Vishnu S G <sgvishnu777-at-gmail.com>
758 * Vishnu S G <sgvishnu777-at-gmail.com>
759 * Walter Doerwald <walter-at-livinglogic.de>
759 * Walter Doerwald <walter-at-livinglogic.de>
760
760
761 .. note::
761 .. note::
762
762
763 This list was generated with the output of
763 This list was generated with the output of
764 ``git log dev-0.11 HEAD --format='* %aN <%aE>' | sed 's/@/\-at\-/' | sed 's/<>//' | sort -u``
764 ``git log dev-0.11 HEAD --format='* %aN <%aE>' | sed 's/@/\-at\-/' | sed 's/<>//' | sort -u``
765 after some cleanup. If you should be on this list, please add yourself.
765 after some cleanup. If you should be on this list, please add yourself.
@@ -1,673 +1,673 b''
1 =============
1 =============
2 0.13 Series
2 0.13 Series
3 =============
3 =============
4
4
5 Release 0.13
5 Release 0.13
6 ============
6 ============
7
7
8 IPython 0.13 contains several major new features, as well as a large amount of
8 IPython 0.13 contains several major new features, as well as a large amount of
9 bug and regression fixes. The previous version (0.12) was released on December
9 bug and regression fixes. The previous version (0.12) was released on December
10 19 2011, and in this development cycle we had:
10 19 2011, and in this development cycle we had:
11
11
12 - ~6 months of work.
12 - ~6 months of work.
13 - 373 pull requests merged.
13 - 373 pull requests merged.
14 - 742 issues closed (non-pull requests).
14 - 742 issues closed (non-pull requests).
15 - contributions from 62 authors.
15 - contributions from 62 authors.
16 - 1760 commits.
16 - 1760 commits.
17 - a diff of 114226 lines.
17 - a diff of 114226 lines.
18
18
19 The amount of work included in this release is so large that we can only cover
19 The amount of work included in this release is so large that we can only cover
20 the main highlights here; please see our :ref:`detailed release statistics
20 the main highlights here; please see our :ref:`detailed release statistics
21 <issues_list_013>` for links to every issue and pull request closed on GitHub
21 <issues_list_013>` for links to every issue and pull request closed on GitHub
22 as well as a full list of individual contributors.
22 as well as a full list of individual contributors.
23
23
24
24
25 Major Notebook improvements: new user interface and more
25 Major Notebook improvements: new user interface and more
26 --------------------------------------------------------
26 --------------------------------------------------------
27
27
28 The IPython Notebook, which has proven since its release to be wildly popular,
28 The IPython Notebook, which has proven since its release to be wildly popular,
29 has seen a massive amount of work in this release cycle, leading to a
29 has seen a massive amount of work in this release cycle, leading to a
30 significantly improved user experience as well as many new features.
30 significantly improved user experience as well as many new features.
31
31
32 The first user-visible change is a reorganization of the user interface; the
32 The first user-visible change is a reorganization of the user interface; the
33 left panel has been removed and was replaced by a real menu system and a
33 left panel has been removed and was replaced by a real menu system and a
34 toolbar with icons. Both the toolbar and the header above the menu can be
34 toolbar with icons. Both the toolbar and the header above the menu can be
35 collapsed to leave an unobstructed working area:
35 collapsed to leave an unobstructed working area:
36
36
37 .. image:: ../_images/ipy_013_notebook_spectrogram.png
37 .. image:: ../_images/ipy_013_notebook_spectrogram.png
38 :width: 460px
38 :width: 460px
39 :alt: New user interface for Notebook
39 :alt: New user interface for Notebook
40 :align: center
40 :align: center
41 :target: ../_images/ipy_013_notebook_spectrogram.png
41 :target: ../_images/ipy_013_notebook_spectrogram.png
42
42
43 The notebook handles very long outputs much better than before (this was a
43 The notebook handles very long outputs much better than before (this was a
44 serious usability issue when running processes that generated massive amounts
44 serious usability issue when running processes that generated massive amounts
45 of output). Now, in the presence of outputs longer than ~100 lines, the
45 of output). Now, in the presence of outputs longer than ~100 lines, the
46 notebook will automatically collapse them to a scrollable area, and the entire left
46 notebook will automatically collapse them to a scrollable area, and the entire left
47 part of this area controls the display: one click in this area will expand the
47 part of this area controls the display: one click in this area will expand the
48 output region completely, and a double-click will hide it completely. This
48 output region completely, and a double-click will hide it completely. This
49 figure shows both the scrolled and hidden modes:
49 figure shows both the scrolled and hidden modes:
50
50
51 .. image:: ../_images/ipy_013_notebook_long_out.png
51 .. image:: ../_images/ipy_013_notebook_long_out.png
52 :width: 460px
52 :width: 460px
53 :alt: Scrolling and hiding of long output in the notebook.
53 :alt: Scrolling and hiding of long output in the notebook.
54 :align: center
54 :align: center
55 :target: ../_images/ipy_013_notebook_long_out.png
55 :target: ../_images/ipy_013_notebook_long_out.png
56
56
57 .. note::
57 .. note::
58
58
59 The auto-folding of long outputs is disabled in Firefox due to bugs in its
59 The auto-folding of long outputs is disabled in Firefox due to bugs in its
60 scrolling behavior. See :ghpull:`2047` for details.
60 scrolling behavior. See :ghpull:`2047` for details.
61
61
62 Uploading notebooks to the dashboard is now easier: in addition to drag and
62 Uploading notebooks to the dashboard is now easier: in addition to drag and
63 drop (which can be finicky sometimes), you can now click on the upload text and
63 drop (which can be finicky sometimes), you can now click on the upload text and
64 use a regular file dialog box to select notebooks to upload. Furthermore, the
64 use a regular file dialog box to select notebooks to upload. Furthermore, the
65 notebook dashboard now auto-refreshes its contents and offers buttons to shut
65 notebook dashboard now auto-refreshes its contents and offers buttons to shut
66 down any running kernels (:ghpull:`1739`):
66 down any running kernels (:ghpull:`1739`):
67
67
68 .. image:: ../_images/ipy_013_dashboard.png
68 .. image:: ../_images/ipy_013_dashboard.png
69 :width: 460px
69 :width: 460px
70 :alt: Improved dashboard
70 :alt: Improved dashboard
71 :align: center
71 :align: center
72 :target: ../_images/ipy_013_dashboard.png
72 :target: ../_images/ipy_013_dashboard.png
73
73
74
74
75 Cluster management
75 Cluster management
76 ~~~~~~~~~~~~~~~~~~
76 ~~~~~~~~~~~~~~~~~~
77
77
78 The notebook dashboard can now also start and stop clusters, thanks to a new
78 The notebook dashboard can now also start and stop clusters, thanks to a new
79 tab in the dashboard user interface:
79 tab in the dashboard user interface:
80
80
81 .. image:: ../_images/ipy_013_dashboard_cluster.png
81 .. image:: ../_images/ipy_013_dashboard_cluster.png
82 :width: 460px
82 :width: 460px
83 :alt: Cluster management from the notebook dashboard
83 :alt: Cluster management from the notebook dashboard
84 :align: center
84 :align: center
85 :target: ../_images/ipy_013_dashboard_cluster.png
85 :target: ../_images/ipy_013_dashboard_cluster.png
86
86
87 This interface allows you, for each profile you have configured, to start and stop
87 This interface allows you, for each profile you have configured, to start and stop
88 a cluster (and optionally override the default number of engines corresponding
88 a cluster (and optionally override the default number of engines corresponding
89 to that configuration). While this hides all error reporting, once you have a
89 to that configuration). While this hides all error reporting, once you have a
90 configuration that you know works smoothly, it is a very convenient interface
90 configuration that you know works smoothly, it is a very convenient interface
91 for controlling your parallel resources.
91 for controlling your parallel resources.
92
92
93
93
94 New notebook format
94 New notebook format
95 ~~~~~~~~~~~~~~~~~~~
95 ~~~~~~~~~~~~~~~~~~~
96
96
97 The notebooks saved now use version 3 of our format, which supports heading
97 The notebooks saved now use version 3 of our format, which supports heading
98 levels as well as the concept of 'raw' text cells that are not rendered as
98 levels as well as the concept of 'raw' text cells that are not rendered as
99 Markdown. These will be useful with the converters_ we are developing, to pass raw
99 Markdown. These will be useful with the converters_ we are developing, to pass raw
100 markup (say LaTeX). That conversion code is still under heavy development and
100 markup (say LaTeX). That conversion code is still under heavy development and
101 not quite ready for prime time, but we welcome help on this front so that we
101 not quite ready for prime time, but we welcome help on this front so that we
102 can merge it for full production use as soon as possible.
102 can merge it for full production use as soon as possible.
103
103
104 .. _converters: https://github.com/ipython/nbconvert
104 .. _converters: https://github.com/ipython/nbconvert
105
105
106 .. note::
106 .. note::
107
107
108 v3 notebooks can *not* be read by older versions of IPython, but we provide
108 v3 notebooks can *not* be read by older versions of IPython, but we provide
109 a `simple script`_ that you can use in case you need to export a v3
109 a `simple script`_ that you can use in case you need to export a v3
110 notebook to share with a v2 user.
110 notebook to share with a v2 user.
111
111
112 .. _simple script: https://gist.github.com/1935808
112 .. _simple script: https://gist.github.com/1935808
113
113
114
114
115 JavaScript refactoring
115 JavaScript refactoring
116 ~~~~~~~~~~~~~~~~~~~~~~
116 ~~~~~~~~~~~~~~~~~~~~~~
117
117
118 All the client-side JavaScript has been decoupled to ease reuse of parts of the
118 All the client-side JavaScript has been decoupled to ease reuse of parts of the
119 machinery without having to build a full-blown notebook. This will make it much
119 machinery without having to build a full-blown notebook. This will make it much
120 easier to communicate with an IPython kernel from existing web pages and to
120 easier to communicate with an IPython kernel from existing web pages and to
121 integrate single cells into other sites, without loading the full notebook
121 integrate single cells into other sites, without loading the full notebook
122 document-like UI. :ghpull:`1711`.
122 document-like UI. :ghpull:`1711`.
123
123
124 This refactoring also makes it possible to write dynamic JavaScript
124 This refactoring also makes it possible to write dynamic JavaScript
125 widgets that are returned from Python code and that present an interactive view
125 widgets that are returned from Python code and that present an interactive view
126 to the user, with callbacks in JavaScript executing calls to the Kernel. This
126 to the user, with callbacks in JavaScript executing calls to the Kernel. This
127 will enable many interactive elements to be added by users in notebooks.
127 will enable many interactive elements to be added by users in notebooks.
128
128
129 An example of this capability has been provided as a proof of concept in
129 An example of this capability has been provided as a proof of concept in
130 :file:`docs/examples/widgets` that lets you directly communicate with one or more
130 :file:`examples/widgets` that lets you directly communicate with one or more
131 parallel engines, acting as a mini-console for parallel debugging and
131 parallel engines, acting as a mini-console for parallel debugging and
132 introspection.
132 introspection.
133
133
134
134
135 Improved tooltips
135 Improved tooltips
136 ~~~~~~~~~~~~~~~~~
136 ~~~~~~~~~~~~~~~~~
137
137
138 The object tooltips have gained some new functionality. By pressing tab several
138 The object tooltips have gained some new functionality. By pressing tab several
139 times, you can expand them to see more of a docstring, keep them visible as you
139 times, you can expand them to see more of a docstring, keep them visible as you
140 fill in a function's parameters, or transfer the information to the pager at the
140 fill in a function's parameters, or transfer the information to the pager at the
141 bottom of the screen. For the details, look at the example notebook
141 bottom of the screen. For the details, look at the example notebook
142 :file:`01_notebook_introduction.ipynb`.
142 :file:`01_notebook_introduction.ipynb`.
143
143
144 .. figure:: ../_images/ipy_013_notebook_tooltip.png
144 .. figure:: ../_images/ipy_013_notebook_tooltip.png
145 :width: 460px
145 :width: 460px
146 :alt: Improved tooltips in the notebook.
146 :alt: Improved tooltips in the notebook.
147 :align: center
147 :align: center
148 :target: ../_images/ipy_013_notebook_tooltip.png
148 :target: ../_images/ipy_013_notebook_tooltip.png
149
149
150 The new notebook tooltips.
150 The new notebook tooltips.
151
151
Other improvements to the Notebook
----------------------------------

These are some other notable small improvements to the notebook, in addition to
many bug fixes and minor changes to add polish and robustness throughout:

* The notebook pager (the area at the bottom) is now resizeable by dragging its
  divider handle, a feature that had been requested many times by just about
  anyone who had used the notebook system. :ghpull:`1705`.

* It is now possible to open notebooks directly from the command line; for
  example: ``ipython notebook path/`` will automatically set ``path/`` as the
  notebook directory, and ``ipython notebook path/foo.ipynb`` will further
  start with the ``foo.ipynb`` notebook opened. :ghpull:`1686`.

* If a notebook directory is specified with ``--notebook-dir`` (or with the
  corresponding configuration flag ``NotebookManager.notebook_dir``), all
  kernels start in this directory.

* Fix codemirror clearing of cells with ``Ctrl-Z``; :ghpull:`1965`.

* Text (markdown) cells now line wrap correctly in the notebook, making them
  much easier to edit. :ghpull:`1330`.

* PNG and JPEG figures returned from plots can be interactively resized in the
  notebook, by dragging them from their lower left corner. :ghpull:`1832`.

* Clear ``In []`` prompt numbers on "Clear All Output". For more
  version-control-friendly ``.ipynb`` files, we now strip all prompt numbers
  when doing a "Clear all output". This reduces the amount of noise in
  commit-to-commit diffs that would otherwise show the (highly variable) prompt
  number changes. :ghpull:`1621`.

* The notebook server now requires *two* consecutive ``Ctrl-C`` within 5
  seconds (or an interactive confirmation) to terminate operation. This makes
  it less likely that you will accidentally kill a long-running server by
  typing ``Ctrl-C`` in the wrong terminal. :ghpull:`1609`.

* Using ``Ctrl-S`` (or ``Cmd-S`` on a Mac) actually saves the notebook rather
  than providing the fairly useless browser HTML save dialog. :ghpull:`1334`.

* Allow accessing local files from the notebook (in URLs), by serving any local
  file at the URL ``files/<relativepath>``. This makes it possible to, for
  example, embed local images in a notebook, as in the snippet below.
  :ghpull:`1211`.
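
  For instance, an image saved next to the notebook can be displayed through
  its ``files/`` URL; ``my_figure.png`` below is only a placeholder filename
  used for illustration::

      from IPython.display import Image

      # the notebook server serves this path relative to the notebook directory
      Image(url="files/my_figure.png")

  A Markdown cell can reference the same URL with the standard image syntax,
  ``![figure](files/my_figure.png)``.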


Cell magics
-----------

We have completely refactored the magic system, finally moving the magic
objects to standalone, independent objects instead of being the mixin class
we'd had since the beginning of IPython (:ghpull:`1732`). Now, a separate base
class is provided in :class:`IPython.core.magic.Magics` that users can subclass
to create their own magics. Decorators are also provided to create magics from
simple functions without the need for object orientation. Please see the
:ref:`magic` docs for further details.

All builtin magics now exist in a few subclasses that group together related
functionality, and the new :mod:`IPython.core.magics` package has been created
to organize this into smaller files.

This cleanup was the last major piece of deep refactoring needed from the
original 2001 codebase.

We have also introduced a new type of magic function, prefixed with ``%%``
instead of ``%``, which operates at the whole-cell level. A cell magic receives
two arguments: the line it is called on (like a line magic) and the body of the
cell below it.

Cell magics are most natural in the notebook, but they also work in the
terminal and Qt console, with the usual approach of using a blank line to
signal cell termination.

For example, to time the execution of several statements::

    %%timeit x = 0 # setup
    for i in range(100000):
        x += i**2

This is particularly useful for integrating code written in another language,
and cell magics already exist for shell scripts, Cython, R and Octave. Using
``%%script /usr/bin/foo``, you can run a cell in any interpreter that accepts
code via stdin.
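
As a quick sketch, the first line simply names the program that should run the
cell body; ``python`` here stands for any program on your path that reads code
on stdin::

    %%script python
    import sys
    print(sys.version)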

Another handy cell magic makes it easy to write short text files: ``%%file
~/save/to/here.txt``.
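
For example, a sketch that writes a two-line file (the path is, of course,
just an illustration)::

    %%file ~/save/to/here.txt
    A first line of text.
    A second line; the cell body ends up verbatim in the file.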

The following cell magics are now included by default; all those that use
special interpreters (Perl, Ruby, bash, etc.) assume you have the requisite
interpreter installed:

* ``%%!``: run cell body with the underlying OS shell; this is similar to
  prefixing every line in the cell with ``!``.

* ``%%bash``: run cell body under bash.

* ``%%capture``: capture the output of the code in the cell (and stderr as
  well). Useful for running code that produces more output than you want to
  scroll through; see the example after this list.

* ``%%file``: save cell body as a file.

* ``%%perl``: run cell body using Perl.

* ``%%prun``: run cell body with profiler (cell extension of ``%prun``).

* ``%%python3``: run cell body using Python 3.

* ``%%ruby``: run cell body using Ruby.

* ``%%script``: run cell body with the script specified in the first line.

* ``%%sh``: run cell body using sh.

* ``%%sx``: run cell with system shell and capture process output (cell
  extension of ``%sx``).

* ``%%system``: run cell with system shell (``%%!`` is an alias to this).

* ``%%timeit``: time the execution of the cell (extension of ``%timeit``).

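For instance, a minimal sketch of ``%%capture``; the name ``captured`` is just
the variable the magic stores the captured output in::

    %%capture captured
    print("a very chatty computation...")

and, in a later cell, the stored output can be inspected or replayed only when
you actually want it::

    captured.show()        # or: print(captured.stdout)
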
This is what some of the script-related magics look like in action:

.. image:: ../_images/ipy_013_notebook_script_cells.png
   :width: 460px
   :alt: Script cell magics in the notebook.
   :align: center
   :target: ../_images/ipy_013_notebook_script_cells.png

In addition, we also have a number of :ref:`extensions <extensions_overview>`
that provide specialized magics. These typically require additional software
to run and must be manually loaded via ``%load_ext <extension name>``, but are
extremely useful. The following extensions are provided:

**Cython magics** (extension :ref:`cythonmagic <extensions_cythonmagic>`)
    This extension provides magics to automatically build and compile Python
    extension modules using the Cython_ language. You must install Cython
    separately, as well as a C compiler, for this to work. The examples
    directory in the source distribution ships with a full notebook
    demonstrating these capabilities:

    .. image:: ../_images/ipy_013_notebook_cythonmagic.png
       :width: 460px
       :alt: Cython magic
       :align: center
       :target: ../_images/ipy_013_notebook_cythonmagic.png

.. _cython: http://cython.org

**Octave magics** (extension :ref:`octavemagic <extensions_octavemagic>`)
    This extension provides several magics that support calling code written in
    the Octave_ language for numerical computing. You can execute single lines
    or whole blocks of Octave code, capture both output and figures inline
    (just like matplotlib plots), and have variables automatically converted
    between the two languages. To use this extension, you must have Octave
    installed as well as the oct2py_ package. The examples directory in the
    source distribution ships with a full notebook demonstrating these
    capabilities:

    .. image:: ../_images/ipy_013_notebook_octavemagic.png
       :width: 460px
       :alt: Octave magic
       :align: center
       :target: ../_images/ipy_013_notebook_octavemagic.png

.. _octave: http://www.gnu.org/software/octave
.. _oct2py: http://pypi.python.org/pypi/oct2py

**R magics** (extension :ref:`rmagic <extensions_rmagic>`)
    This extension provides several magics that support calling code written in
    the R_ language for statistical data analysis. You can execute single lines
    or whole blocks of R code, capture both output and figures inline (just
    like matplotlib plots), and have variables automatically converted between
    the two languages. To use this extension, you must have R installed as well
    as the rpy2_ package that bridges Python and R. The examples directory in
    the source distribution ships with a full notebook demonstrating these
    capabilities:

    .. image:: ../_images/ipy_013_notebook_rmagic.png
       :width: 460px
       :alt: R magic
       :align: center
       :target: ../_images/ipy_013_notebook_rmagic.png

.. _R: http://www.r-project.org
.. _rpy2: http://rpy.sourceforge.net/rpy2.html


Tab completer improvements
--------------------------

Useful tab-completion based on live inspection of objects is one of the most
popular features of IPython. To make this process even more user-friendly, the
completers of both the Qt console and the Notebook have been reworked.

The Qt console comes with a new ncurses-like tab completer, activated by
default, which lets you cycle through the available completions by pressing tab,
or select a completion with the arrow keys (:ghpull:`1851`).

.. figure:: ../_images/ipy_013_qtconsole_completer.png
   :width: 460px
   :alt: ncurses-like completer, with highlighted selection.
   :align: center
   :target: ../_images/ipy_013_qtconsole_completer.png

   The new Qt console ncurses-like completer makes it easy to navigate through
   long lists of completions.

In the notebook, completions are now sourced both from object introspection and
analysis of surrounding code, so limited completions can be offered for
variables defined in the current cell, or while the kernel is busy
(:ghpull:`1711`).


We have implemented a new configurable flag to control tab completion on
modules that provide the ``__all__`` attribute::

    IPCompleter.limit_to__all__ = Boolean

This instructs the completer to honor ``__all__`` when completing: with
``object.<tab>``, if the flag is True, only the names listed in
``obj.__all__`` are offered; if it is False (the default), the ``__all__``
attribute is ignored. :ghpull:`1529`.
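
One way to set it persistently is in a profile's configuration file; the path
below is a sketch, so adjust it to whatever ``ipython locate profile`` reports
for your setup::

    # in <profile_dir>/ipython_config.py
    c = get_config()
    c.IPCompleter.limit_to__all__ = True

    # or, one-off, on the command line:
    #   ipython --IPCompleter.limit_to__all__=True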


Improvements to the Qt console
------------------------------

The Qt console continues to receive improvements and refinements, despite the
fact that it is by now a fairly mature and robust component. A lot of small
polish has gone into it; here are a few highlights:

* A number of changes were made to the underlying code for easier integration
  into other projects such as Spyder_ (:ghpull:`2007`, :ghpull:`2024`).

* Improved menus with a new Magic menu that is organized by magic groups (this
  was made possible by the reorganization of the magic system
  internals). :ghpull:`1782`.

* Allow for restarting kernels without clearing the qtconsole, while leaving a
  visible indication that the kernel has restarted. :ghpull:`1681`.

* Allow the native display of JPEG images in the qtconsole. :ghpull:`1643`.

.. _spyder: https://code.google.com/p/spyderlib



Parallel
--------

The parallel tools have been improved and fine-tuned on multiple fronts. Now,
the creation of an :class:`IPython.parallel.Client` object automatically
activates a line and cell magic function ``px`` that sends its code to all the
engines. Further magics can be easily created with the :meth:`.Client.activate`
method, to conveniently execute code on any subset of engines. :ghpull:`1893`.

The ``%%px`` cell magic can also be given an optional targets argument, as well
as a ``--out`` argument for storing its output.
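
As a rough sketch of how these pieces fit together; it assumes an ``ipcluster``
is already running, and the keyword names passed to ``activate`` below should
be checked against the :meth:`.Client.activate` docstring::

    from IPython.parallel import Client

    rc = Client()          # connects to the already-running ipcluster

    %px import os, socket  # line-magic form: executes on every engine

    # create a separate family of magics bound to a subset of engines
    rc.activate(targets=[0, 1], suffix='_pair')
    %px_pair print(socket.gethostname())

and, in its own cell, the ``%%px`` form with its options::

    %%px --out pids
    import os
    os.getpid()

After it runs, ``pids`` refers to the stored output of that parallel execution.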

A new magic has also been added, ``%pxconfig``, that lets you configure various
defaults of the parallel magics. As usual, type ``%pxconfig?`` for details.

The exception reporting in parallel contexts has been improved to be easier to
read. Now, IPython directly reports the remote exceptions without showing any
of the internal execution parts:

.. image:: ../_images/ipy_013_par_tb.png
   :width: 460px
   :alt: Improved parallel exceptions.
   :align: center
   :target: ../_images/ipy_013_par_tb.png

The parallel tools now default to using ``NoDB`` as the storage backend for
intermediate results. This means that the default usage case will have a
significantly reduced memory footprint, though certain advanced features are
not available with this backend. For more details, see :ref:`parallel_db`.

The parallel magics now display all output, so you can do parallel plotting or
other actions with complex display. The ``px`` magic now has both line and cell
modes, and in cell mode you have finer control over how output from multiple
engines is collated. :ghpull:`1768`.

There have also been incremental improvements to the SSH launchers:

* Add to_send/fetch steps for moving connection files around.

* Add SSHProxyEngineSetLauncher, for invoking ``ipcluster engines`` on a
  remote host. This can be used to start a set of engines via PBS/SGE/MPI
  *remotely*.

This makes the SSHLauncher usable on machines without shared filesystems.

A number of 'sugar' methods/properties were added to AsyncResult that are
quite useful (:ghpull:`1548`) for everyday work:

* ``ar.wall_time`` = received - submitted
* ``ar.serial_time`` = sum of serial computation time
* ``ar.elapsed`` = time since submission (wall_time if done)
* ``ar.progress`` = (int) number of sub-tasks that have completed
* ``len(ar)`` = # of tasks
* ``ar.wait_interactive()``: prints progress
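
A small sketch of these in use; it assumes a cluster is already running and
uses a throwaway ``slow_square`` function purely for illustration::

    from IPython.parallel import Client

    rc = Client()
    view = rc.load_balanced_view()

    def slow_square(x):
        import time
        time.sleep(0.1)
        return x * x

    ar = view.map_async(slow_square, range(32))
    ar.wait_interactive()                  # prints progress until completion
    print(len(ar), "tasks,", ar.progress, "completed")
    print("wall time:", ar.wall_time, "serial time:", ar.serial_time)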

Added :meth:`.Client.spin_thread` / :meth:`~.Client.stop_spin_thread` for
running spin in a background thread, to keep the zmq queue clear. This can be
used to ensure that timing information is as accurate as possible (at the cost
of having a background thread active).

The default of ``TaskScheduler.hwm`` has been changed from 0 to 1. A value of 1
gives more predictable and intuitive behavior, even if it is often slower, and
is thus a more logical default. Users whose workloads require maximum
throughput and are largely homogeneous in time per task can make the
optimization themselves, but now the behavior will be less surprising to new
users. :ghpull:`1294`.


Kernel/Engine unification
-------------------------

This is mostly work 'under the hood', but it is actually a *major* achievement
for the project that has deep implications in the long term: at last, we have
unified the main object that executes as the user's interactive shell (which we
refer to as the *IPython kernel*) with the objects that run in all the worker
nodes of the parallel computing facilities (the *IPython engines*). Ever since
the first implementation of IPython's parallel code back in 2006, we had wanted
to have these two roles be played by the same machinery, but a number of
technical reasons had prevented that from being true.

In this release we have now merged them, and this has a number of important
consequences:

* It is now possible to connect any of our clients (qtconsole or terminal
  console) to any individual parallel engine, with the *exact* behavior of
  working at a 'regular' IPython console/qtconsole. This makes debugging,
  plotting, etc. in parallel scenarios vastly easier.

* Parallel engines can always execute arbitrary 'IPython code', that is, code
  that has magics, shell extensions, etc. In combination with the ``%%px``
  magics, it is thus extremely natural, for example, to send all engines a
  block of Cython or R code to be executed via the new Cython and R magics.
  This snippet would send the R block to all active engines in a cluster::

      %%px
      %%R
      ... R code goes here

* It is possible to embed not only an interactive shell with the
  :func:`IPython.embed` call as always, but now you can also embed a *kernel*
  with :func:`IPython.embed_kernel`. Embedding an IPython kernel in an
  application is useful when you want to use :func:`IPython.embed` but don't
  have a terminal attached on stdin and stdout; a sketch follows this list.

* The new :func:`IPython.parallel.bind_kernel` allows you to promote Engines to
  listening Kernels, and connect QtConsoles to an Engine and debug it
  directly.
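
As a minimal sketch of :func:`IPython.embed_kernel`, with an invented
surrounding script and variable names used purely for illustration::

    from IPython import embed_kernel

    def main():
        data = {'answer': 42}
        # Prints connection info and blocks, serving a kernel; attach to it
        # from another terminal with e.g.:  ipython qtconsole --existing
        embed_kernel(local_ns=data)

    if __name__ == '__main__':
        main()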

In addition, having a single core object through our entire architecture also
makes the project conceptually cleaner, easier to maintain and more robust.
This took a lot of work to get in place, but we are thrilled to have this major
piece of architecture finally where we'd always wanted it to be.


Official Public API
-------------------

We have begun organizing our API for easier public use, with an eye towards an
official IPython 1.0 release which will keep this API stable for its entire
lifecycle. There is now an :mod:`IPython.display` module that aggregates all
display routines, and the :mod:`IPython.config` namespace has all public
configuration tools. We will continue improving our public API layout so that
users only need to import names one level deeper than the main ``IPython``
package to access all public namespaces.


IPython notebook file icons
---------------------------

The directory ``docs/resources`` in the source distribution contains SVG and
PNG versions of our file icons, as well as an ``Info.plist.example`` file with
instructions to install them on Mac OS X. This is a first draft of our icons,
and we encourage contributions from users with graphic talent to improve them
in the future:

.. image:: ../../resources/ipynb_icon_128x128.png
   :alt: IPython notebook file icon.


New top-level ``locate`` command
--------------------------------

Add ``locate`` entry points; these are useful for quickly locating IPython
directories and profiles from other (non-Python) applications. :ghpull:`1762`.

Examples::

    $> ipython locate
    /Users/me/.ipython

    $> ipython locate profile foo
    /Users/me/.ipython/profile_foo

    $> ipython locate profile
    /Users/me/.ipython/profile_default

    $> ipython locate profile dne
    [ProfileLocate] Profile u'dne' not found.


Other new features and improvements
-----------------------------------

* **%install_ext**: A new magic function to install an IPython extension from
  a URL. E.g. ``%install_ext
  https://bitbucket.org/birkenfeld/ipython-physics/raw/default/physics.py``.

* The ``%loadpy`` magic is no longer restricted to Python files, and has been
  renamed ``%load``. The old name remains as an alias.

* New command line arguments will help external programs find IPython folders:
  ``ipython locate`` finds the user's IPython directory, and ``ipython locate
  profile foo`` finds the folder for the 'foo' profile (if it exists).

* The :envvar:`IPYTHON_DIR` environment variable, introduced in the Great
  Reorganization of 0.11 and existing only in versions 0.11-0.13, has been
  deprecated. As described in :ghpull:`1167`, the complexity and confusion of
  migrating to this variable is not worth the aesthetic improvement. Please use
  the historical :envvar:`IPYTHONDIR` environment variable instead.

* The default value of *interactivity* passed from
  :meth:`~IPython.core.interactiveshell.InteractiveShell.run_cell` to
  :meth:`~IPython.core.interactiveshell.InteractiveShell.run_ast_nodes`
  is now configurable.

* New ``%alias_magic`` function to conveniently create aliases of existing
  magics, if you prefer to have shorter names for personal use.

* We ship unminified versions of the JavaScript libraries we use, to better
  comply with Debian's packaging policies.

* Simplify the information presented by ``obj?/obj??`` to eliminate a few
  redundant fields when possible. :ghpull:`2038`.

* Improved continuous integration for IPython. We now have automated test runs
  on `Shining Panda <https://jenkins.shiningpanda.com/ipython>`_ and `Travis-CI
  <http://travis-ci.org/#!/ipython/ipython>`_, as well as `Tox support
  <http://tox.testrun.org>`_.

* The `vim-ipython`_ functionality (externally developed) has been updated to
  the latest version.

.. _vim-ipython: https://github.com/ivanov/vim-ipython

* The ``%save`` magic now has a ``-f`` flag to force overwriting, which makes
  it much more usable in the notebook where it is not possible to reply to
  interactive questions from the kernel. :ghpull:`1937`.

* Use dvipng to format ``sympy.Matrix``, enabling display of matrices in the Qt
  console with the sympy printing extension. :ghpull:`1861`.

* Our messaging protocol now has a reasonable test suite, helping ensure that
  we don't accidentally deviate from the spec and possibly break third-party
  applications that may have been using it. We encourage users to contribute
  more stringent tests to this part of the test suite. :ghpull:`1627`.

* Use LaTeX to display, on output, various built-in types with the SymPy
  printing extension. :ghpull:`1399`.

* Add Gtk3 event loop integration and example. :ghpull:`1588`.

* ``clear_output`` improvements, which allow things like progress bars and
  other simple animations to work well in the notebook (:ghpull:`1563`); a
  small sketch of the progress-bar pattern follows this list:

  * ``clear_output()`` clears the current line in terminal IPython, the
    QtConsole and plain Python as well, by printing ``\r`` to the streams.

  * ``clear_output()`` avoids flicker in the notebook by adding a delay, and
    firing immediately upon the next actual display message.

  * ``display_javascript`` hides its ``output_area`` element, so using display
    to run a bunch of JavaScript doesn't result in ever-growing vertical space.

* Add simple support for running inside a virtualenv. While this doesn't
  supplant proper installation (as users should do), it helps ad-hoc calling of
  IPython from inside a virtualenv. :ghpull:`1388`.
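
The progress-bar pattern mentioned above, as a rough sketch; it assumes
``clear_output`` is importable from :mod:`IPython.display`, where the display
routines are aggregated::

    import sys
    import time

    from IPython.display import clear_output

    for i in range(10):
        time.sleep(0.2)                    # stand-in for real work
        clear_output()                     # wipe the previous progress line
        print("progress: {0}%".format(10 * (i + 1)))
        sys.stdout.flush()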


Major Bugs fixed
----------------

In this cycle, we have :ref:`closed over 740 issues <issues_list_013>`, but a
few major ones merit special mention:

* The ``%pastebin`` magic has been updated to point to gist.github.com, since
  unfortunately http://paste.pocoo.org has closed down. We also added a ``-d``
  flag for the user to provide a gist description string. :ghpull:`1670`.

* Fix ``%paste``, which would reject certain valid inputs. :ghpull:`1258`.

* Fix sending and receiving of Numpy structured arrays (those with composite
  dtypes, often used as recarrays). :ghpull:`2034`.

* Reconnect when the websocket connection closes unexpectedly. :ghpull:`1577`.

* Fix truncated representation of objects in the debugger by showing at least
  80 characters' worth of information. :ghpull:`1793`.

* Fix logger to be Unicode-aware: logging could crash ipython if there was
  unicode in the input. :ghpull:`1792`.

* Fix images missing from XML/SVG export in the Qt console. :ghpull:`1449`.

* Fix deepreload on Python 3. :ghpull:`1625`, as well as having a much cleaner
  and more robust implementation of deepreload in general. :ghpull:`1457`.


Backwards incompatible changes
------------------------------

* The exception :exc:`IPython.core.error.TryNext` previously accepted
  arguments and keyword arguments to be passed to the next implementation
  of the hook. This feature was removed as it made error message propagation
  difficult and violated the principle of loose coupling; a hook that wants to
  defer to the next implementation now simply raises ``TryNext()`` with no
  arguments, as sketched below.
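
A minimal sketch of a conforming hook implementation; the hook body is
illustrative only::

    from IPython.core.error import TryNext

    def my_clipboard_get(self):
        # ... try a custom clipboard mechanism here ...
        # Defer to the next registered implementation: note the bare TryNext(),
        # as arguments are no longer forwarded to the next hook.
        raise TryNext()

    get_ipython().set_hook('clipboard_get', my_clipboard_get)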