Theano is an open-source Python library for optimizing and evaluating symbolic expressions involving multi-dimensional tensors. While inputs and outputs are ordinary numpy arrays, the evaluation step is transparently taken out of the Python interpreter by dynamically generating and compiling a C or CUDA program that computes the value of the expression on either the CPU or the GPU, typically yielding a 10- to 100-fold speed-up.
Theano can leverage its knowledge of the symbolic expression to apply a pluggable list of optimizations that remove unnecessary memory allocations and computational steps prior to evaluation. The symbolic expression handler can also perform automatic differentiation, deriving the expression graph of the gradient of an expression with respect to a multi-dimensional variable. This is typically useful for implementing optimization algorithms such as stochastic gradient descent, where the parameters of the model are iteratively updated using the gradient of the objective function with respect to a set of multi-dimensional parameters.
Theano is mainly developed and used for research and teaching on deep machine learning architectures at the University of Montreal.
The purpose of this talk is to give a global overview of how it feels to work with Theano compared to a traditional numpy/scipy environment, and to showcase an application: training deep artificial neural networks to embed high-dimensional data into a 2D vector space while preserving local structure.