
Rmagic Functions Extension

Line magics

IPython has an rmagic extension that contains some magic functions for working with R via rpy2. This extension can be loaded using the %load_ext magic as follows:

In [1]:
%load_ext rmagic
    

A typical use case is having some numpy arrays, computing some statistics of interest on them in R, and returning the results to python. Let's suppose we just want to fit a simple linear model to a scatterplot.

In [2]:
import numpy as np
import pylab
X = np.array([0,1,2,3,4])
Y = np.array([3,5,4,6,7])
pylab.scatter(X, Y)
Out[2]:
<matplotlib.collections.PathCollection at 0x10b9bc690>
[scatter plot of X vs. Y]

We can accomplish this by first pushing variables to R, fitting a model and returning the results. The line magic %Rpush copies its arguments to variables of the same name in rpy2. The %R line magic evaluates the string in rpy2 and returns the result; in this case, the coefficients of a linear model.

In [3]:
%Rpush X Y
%R lm(Y~X)$coef
Out[3]:
array([ 3.2,  0.9])

We can check that this is correct fairly easily:

In [4]:
Xr = X - X.mean(); Yr = Y - Y.mean()
slope = (Xr*Yr).sum() / (Xr**2).sum()
intercept = Y.mean() - X.mean() * slope
(intercept, slope)
Out[4]:
(3.2000000000000002, 0.90000000000000002)

It is also possible to return more than one value with %R.

In [5]:
%R resid(lm(Y~X)); coef(lm(X~Y))
Out[5]:
[array([-0.2,  0.9, -1. ,  0.1,  0.2]), array([-2.5,  0.9])]

One can also easily capture the results of %R into python objects.

In [6]:
xr, yr = %R resid(lm(Y~X)); resid(lm(X~Y))
print xr, yr
[-0.2  0.9 -1.   0.1  0.2] [-0.2 -1.   0.9  0.1  0.2]

There is one more line magic, %Rpull, which assumes that some R code has been executed and that there are variables in the rpy2 namespace that one would like to export to the python namespace. Imagine we've stored the results of some calculation in the variable "a" in rpy2's namespace. By using the %R magic, we can obtain these results and store them in b. We can also pull them directly into user_ns with %Rpull. Both are views on the same data.

In [7]:
b = %R a=resid(lm(Y~X))
%Rpull a
print a
assert id(b.data) == id(a.data)
%R -o a
[-0.2  0.9 -1.   0.1  0.2]

%Rpull is equivalent to calling %R with just -o.

In [8]:
%R d=resid(lm(Y~X)); e=coef(lm(Y~X))
%R -o d -o e
%Rpull e
print d
print e
import numpy as np
np.testing.assert_almost_equal(d, a)
[-0.2  0.9 -1.   0.1  0.2]
[ 3.2  0.9]

On the other hand, %Rpush is equivalent to calling %R with just -i and no trailing code.

In [9]:
A = np.arange(20)
%R -i A
%R mean(A)
Out[9]:
array([ 9.5])

Plotting and capturing output

R's console (i.e. its stdout() connection) is captured by IPython, as are any plots, which are published to the notebook as PNG files (just as when the notebook is run with --pylab inline). Since a call to %R may produce a return value (see above), we must ask what happens with a magic like the one below, where the R code publishes something to the notebook. The answer is that if anything is published to the notebook, that call to %R returns None.

In [10]:
v1 = %R plot(X,Y); print(summary(lm(Y~X))); vv=mean(X)*mean(Y)
print 'v1 is:', v1
v2 = %R mean(X)*mean(Y)
print 'v2 is:', v2
Call:
lm(formula = Y ~ X)

Residuals:
   1    2    3    4    5 
-0.2  0.9 -1.0  0.1  0.2 

Coefficients:
            Estimate Std. Error t value Pr(>|t|)  
(Intercept)   3.2000     0.6164   5.191   0.0139 *
X             0.9000     0.2517   3.576   0.0374 *
---
Signif. codes:  0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1 

Residual standard error: 0.7958 on 3 degrees of freedom
Multiple R-squared:  0.81,	Adjusted R-squared: 0.7467 
F-statistic: 12.79 on 1 and 3 DF,  p-value: 0.03739 

[plot of X vs. Y]
v1 is: None
v2 is: [ 10.]

Cell level magic

For the cell-level magic, inputs can be passed via the -i or --inputs argument in the line. These variables are copied from the shell namespace to R's namespace using rpy2.robjects.r.assign. It would be nice not to have to copy these into R: rnumpy (http://bitbucket.org/njs/rnumpy/wiki/API) has done some work to limit, or at least make transparent, the number of copies of an array. This seems like a natural thing to try to build on. Arrays can be output from R via the -o or --outputs argument in the line. All other arguments are sent to R's png function, which is the graphics device used to create the plots.
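
As a hedged sketch of those plot-device arguments (the -w and -h flag names are an assumption about this version of the extension; the text above only states that extra arguments are forwarded to png), a cell of this form would be expected to produce a larger figure:

%%R -w 800 -h 600
# -w/-h are assumed to be passed through to R's png() device as width/height
plot(X, Y)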

We can redo the above calculations in one IPython cell. We might also want to add some output, such as a summary from R or perhaps the standard plotting diagnostics of the lm.

In [11]:
%%R -i X,Y -o XYcoef
XYlm = lm(Y~X)
XYcoef = coef(XYlm)
print(summary(XYlm))
par(mfrow=c(2,2))
plot(XYlm)
Call:
lm(formula = Y ~ X)

Residuals:
   1    2    3    4    5 
-0.2  0.9 -1.0  0.1  0.2 

Coefficients:
            Estimate Std. Error t value Pr(>|t|)  
(Intercept)   3.2000     0.6164   5.191   0.0139 *
X             0.9000     0.2517   3.576   0.0374 *
---
Signif. codes:  0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1 

Residual standard error: 0.7958 on 3 degrees of freedom
Multiple R-squared:  0.81,	Adjusted R-squared: 0.7467 
F-statistic: 12.79 on 1 and 3 DF,  p-value: 0.03739 

[2x2 grid of diagnostic plots for XYlm]

Often, we will want to do more than fit a simple linear regression model. There may be several lines of R code that we want to run before returning to python. This is where the cell-level magic comes in.

In this example, we will generate some random data, fit a LASSO model using the lars package and plot the results, returning the array of coefficients. This example assumes that the lars package is installed in R.
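
If lars is not already available, it can usually be installed from CRAN first with a one-off line magic (the mirror URL below is just an example):

%R install.packages("lars", repos="http://cran.r-project.org")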

In [12]:
Xrandom = np.random.standard_normal((100,10))
Yrandom = np.random.standard_normal((100,1))
In [13]:
%%R -i Xrandom,Yrandom -o larscoef
library(lars)
larsobj = lars(Xrandom, Yrandom)
plot(larsobj)
larscoef = coef(larsobj)
larspred = predict(larsobj, s=seq(0,1,length=101), mode='fraction')
lcoef = larspred$coefficients
lfrac = larspred$fraction
Loaded lars 1.1

Warning message:
In predict.lars(larsobj, s = seq(0, 1, length = 101), mode = "fraction") :
  Type=fit with no newx argument; type switched to coefficients
[LASSO coefficient path plot from plot(larsobj)]
In [14]:
larscoef.shape
Out[14]:
(11, 10)

Using %Rpull

In [15]:
%Rpull lcoef lfrac
f = [pylab.plot(lfrac, c) for c in lcoef.T]
[plot of the LASSO coefficient paths against lfrac]

Passing data back and forth

Currently, data is passed through RMagics.pyconverter when going from python to R, and through RMagics.Rconverter when going from R to python. Both converters currently default to numpy.ndarray. Future work will involve writing better converters, most likely involving integration with http://pandas.sourceforge.net.
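
To make that conversion step concrete, here is a minimal sketch of the same round trip done by hand with rpy2 (an illustration of the idea rather than the extension's internal code; the numpy2ri helper and its activate() call may differ across rpy2 versions):

import numpy as np
import rpy2.robjects as ro
from rpy2.robjects import numpy2ri

numpy2ri.activate()                 # let rpy2 map numpy arrays to R vectors
ro.r.assign('x', np.arange(5))      # python -> R, roughly what %Rpush / -i does
y = np.asarray(ro.r('x * 2'))       # R -> python, roughly what %Rpull / -o does
print y                             # the doubled values, back as a numpy array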

Passing ndarrays into R seems to require a copy, though once an object is returned to python, this object is NOT copied, and it is possible to change its values.

In [16]:
seq1 = np.arange(10)
In [17]:
%%R -i seq1 -o seq2
seq2 = rep(seq1, 2)
print(seq2)
 [1] 0 1 2 3 4 5 6 7 8 9 0 1 2 3 4 5 6 7 8 9
In [18]:
seq2[::2] = 0
seq2
Out[18]:
array([0, 1, 0, 3, 0, 5, 0, 7, 0, 9, 0, 1, 0, 3, 0, 5, 0, 7, 0, 9], dtype=int32)
In [19]:
%%R
print(seq2)
 [1] 0 1 0 3 0 5 0 7 0 9 0 1 0 3 0 5 0 7 0 9

Once the array data has been passed to R, modifying its contents does not modify R's copy of the data.

In [20]:
seq1[0] = 200
%R print(seq1)
 [1] 0 1 2 3 4 5 6 7 8 9

But if we pass data as both input and output, then the value of the variable in user_ns will be overwritten, and the new array will be a view of the data in R's copy.

In [21]:
print seq1
%R -i seq1 -o seq1
print seq1
seq1[0] = 200
%R print(seq1)
seq1_view = %R seq1
assert(id(seq1_view.data) == id(seq1.data))
[200   1   2   3   4   5   6   7   8   9]
[200   1   2   3   4   5   6   7   8   9]
 [1] 200   1   2   3   4   5   6   7   8   9