Lasagne: Lightweight library to build and train neural networks in Theano
Project Website:
GitHub Link:
Description
Lasagne is a lightweight library to build and train neural networks in Theano. Its main features are:
Supports feed-forward networks such as Convolutional Neural Networks (CNNs), recurrent networks including Long Short-Term Memory (LSTM), and any combination thereof
Allows architectures with multiple inputs and multiple outputs, including auxiliary classifiers (see the second sketch after this list)
Many optimization methods, including Nesterov momentum, RMSprop, and Adam
Freely definable cost functions, with no need to derive gradients by hand thanks to Theano’s symbolic differentiation (see the first sketch after this list)
Transparent support of CPUs and GPUs due to Theano’s expression compiler
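As an illustration of the points above, here is a minimal sketch that builds a small CNN, defines a cross-entropy cost, and lets Theano derive the gradients and compile the training function. The network shape, layer sizes, and variable names are illustrative choices, not part of the library.

    import theano
    import theano.tensor as T
    import lasagne

    # Symbolic inputs: a batch of 28x28 grayscale images and integer class labels.
    input_var = T.tensor4('inputs')
    target_var = T.ivector('targets')

    # A small CNN: conv -> pool -> dense -> 10-way softmax (all sizes are arbitrary).
    network = lasagne.layers.InputLayer((None, 1, 28, 28), input_var=input_var)
    network = lasagne.layers.Conv2DLayer(network, num_filters=32, filter_size=(5, 5),
                                         nonlinearity=lasagne.nonlinearities.rectify)
    network = lasagne.layers.MaxPool2DLayer(network, pool_size=(2, 2))
    network = lasagne.layers.DenseLayer(network, num_units=256,
                                        nonlinearity=lasagne.nonlinearities.rectify)
    network = lasagne.layers.DenseLayer(network, num_units=10,
                                        nonlinearity=lasagne.nonlinearities.softmax)

    # Freely defined cost: mean cross-entropy; Theano differentiates it symbolically.
    prediction = lasagne.layers.get_output(network)
    loss = lasagne.objectives.categorical_crossentropy(prediction, target_var).mean()

    # Any update rule from lasagne.updates, e.g. Nesterov momentum (or rmsprop / adam).
    params = lasagne.layers.get_all_params(network, trainable=True)
    updates = lasagne.updates.nesterov_momentum(loss, params,
                                                learning_rate=0.01, momentum=0.9)

    # Compile a training function; Theano targets the CPU or GPU transparently.
    train_fn = theano.function([input_var, target_var], loss, updates=updates)

Calling train_fn(X_batch, y_batch) on NumPy arrays then performs one update step; swapping nesterov_momentum for lasagne.updates.rmsprop or lasagne.updates.adam changes only that one line.

A second sketch (continuing from the imports above, again with illustrative names and sizes) shows how two inputs can be merged and how an auxiliary classifier is simply another layer whose output is requested alongside the main one:

    # Two inputs: e.g. an image and a vector of side information.
    image_var, extra_var = T.tensor4('images'), T.matrix('extras')
    l_image = lasagne.layers.InputLayer((None, 1, 28, 28), input_var=image_var)
    l_extra = lasagne.layers.InputLayer((None, 20), input_var=extra_var)

    # A hidden layer on the image branch, with an auxiliary classifier tapped off it.
    l_hidden = lasagne.layers.DenseLayer(lasagne.layers.FlattenLayer(l_image),
                                         num_units=128)
    l_aux = lasagne.layers.DenseLayer(l_hidden, num_units=10,
                                      nonlinearity=lasagne.nonlinearities.softmax)

    # Merge both branches and attach the main output.
    l_merged = lasagne.layers.ConcatLayer([l_hidden, l_extra])
    l_main = lasagne.layers.DenseLayer(l_merged, num_units=10,
                                       nonlinearity=lasagne.nonlinearities.softmax)

    # get_output accepts a list of layers and returns one expression per output,
    # so the main and auxiliary losses can be combined into a single cost.
    main_out, aux_out = lasagne.layers.get_output([l_main, l_aux])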
Its design is governed by six principles:
Simplicity: Be easy to use, easy to understand and easy to extend, to facilitate use in research
Transparency: Do not hide Theano behind abstractions, directly process and return Theano expressions or Python / numpy data types
Modularity: Allow all parts (layers, regularizers, optimizers, …) to be used independently of Lasagne (see the sketch after this list)
Pragmatism: Make common use cases easy, do not overrate uncommon cases
Restraint: Do not obstruct users with features they decide not to use
Focus: “Do one thing and do it well”
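As a small sketch of the transparency and modularity principles, the update rules in lasagne.updates can be applied to a hand-written Theano cost without using any Lasagne layers; the toy least-squares objective below is chosen purely for illustration.

    import numpy as np
    import theano
    import theano.tensor as T
    import lasagne

    # A plain Theano model: parameters and cost written by hand, no Lasagne layers.
    x = T.matrix('x')
    y = T.vector('y')
    w = theano.shared(np.zeros(5, dtype=theano.config.floatX), name='w')
    cost = T.mean((T.dot(x, w) - y) ** 2)

    # lasagne.updates works with any scalar Theano cost and list of shared variables.
    updates = lasagne.updates.adam(cost, [w], learning_rate=0.01)
    train_fn = theano.function([x, y], cost, updates=updates)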