Lasagne

Lasagne is a lightweight library to build and train neural networks in Theano. Its main features are:

  • Supports feed-forward networks such as Convolutional Neural Networks (CNNs), recurrent networks including Long Short-Term Memory (LSTM), and any combination thereof

  • Allows architectures with multiple inputs and multiple outputs, including auxiliary classifiers (see the sketch after this list)

  • Many optimization methods, including Nesterov momentum, RMSprop and ADAM

  • Freely definable cost function and no need to derive gradients due to Theano’s symbolic differentiation

  • Transparent support of CPUs and GPUs due to Theano’s expression compiler

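To illustrate the multiple-inputs / multiple-outputs point, here is a minimal sketch (not taken from the official documentation; the layer names and shapes are made up for illustration):

import lasagne

# two independent inputs (hypothetical shapes)
l_in_a = lasagne.layers.InputLayer((None, 50))
l_in_b = lasagne.layers.InputLayer((None, 20))

# merge both branches into a shared hidden representation
l_hidden = lasagne.layers.DenseLayer(
        lasagne.layers.ConcatLayer([l_in_a, l_in_b]), num_units=100)

# two outputs, e.g. a main classifier plus an auxiliary classifier
l_out_main = lasagne.layers.DenseLayer(
        l_hidden, num_units=10, nonlinearity=lasagne.nonlinearities.softmax)
l_out_aux = lasagne.layers.DenseLayer(
        l_hidden, num_units=10, nonlinearity=lasagne.nonlinearities.softmax)

# get_output accepts a list of layers and returns a matching list of
# Theano expressions, one per output
out_main, out_aux = lasagne.layers.get_output([l_out_main, l_out_aux])
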
Its design is governed by six principles:

  • Simplicity: Be easy to use, easy to understand and easy to extend, to facilitate use in research

  • Transparency: Do not hide Theano behind abstractions, directly process and return Theano expressions or Python / numpy data types

  • Modularity: Allow all parts (layers, regularizers, optimizers, …) to be used independently of Lasagne (a sketch follows this list)

  • Pragmatism: Make common use cases easy, do not overrate uncommon cases

  • Restraint: Do not obstruct users with features they decide not to use

  • Focus: “Do one thing and do it well”

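As a small illustration of the Modularity principle, the optimizers in lasagne.updates operate on any Theano loss expression and list of shared variables, with no Lasagne layers involved. The following is a sketch under that assumption (a plain Theano linear regression; all names are made up):

import numpy as np
import theano
import theano.tensor as T
import lasagne

# a plain Theano model: no Lasagne layers anywhere
x = T.matrix('x')
y = T.vector('y')
w = theano.shared(np.zeros(5, dtype=theano.config.floatX), name='w')
loss = T.mean((T.dot(x, w) - y) ** 2)

# Theano derives the gradients symbolically (lasagne.updates calls
# T.grad internally), and the update rule applies to any parameter list
updates = lasagne.updates.adam(loss, [w], learning_rate=0.01)
train_fn = theano.function([x, y], loss, updates=updates)
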
Installation

In short, you can install a known compatible version of Theano and the latest Lasagne development version via:

pip install -r https://raw.githubusercontent.com/Lasagne/Lasagne/master/requirements.txt
pip install https://github.com/Lasagne/Lasagne/archive/master.zip

For more details and alternatives, please see the Installation instructions.
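
To check that both packages import correctly afterwards (assuming a standard Python environment), you can run:

python -c "import theano; print(theano.__version__)"
python -c "import lasagne; print(lasagne.__version__)"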

Documentation

Documentation is available online: http://lasagne.readthedocs.org/

For support, please refer to the lasagne-users mailing list.

Example

import lasagne
import theano
import theano.tensor as T

# create Theano variables for input and target minibatch
input_var = T.tensor4('X')
target_var = T.ivector('y')

# create a small convolutional neural network
from lasagne.nonlinearities import leaky_rectify, softmax
network = lasagne.layers.InputLayer((None, 3, 32, 32), input_var)
network = lasagne.layers.Conv2DLayer(network, 64, (3, 3),
                                     nonlinearity=leaky_rectify)
network = lasagne.layers.Conv2DLayer(network, 32, (3, 3),
                                     nonlinearity=leaky_rectify)
network = lasagne.layers.Pool2DLayer(network, (3, 3), stride=2, mode='max')
network = lasagne.layers.DenseLayer(lasagne.layers.dropout(network, 0.5),
                                    128, nonlinearity=leaky_rectify,
                                    W=lasagne.init.Orthogonal())
network = lasagne.layers.DenseLayer(lasagne.layers.dropout(network, 0.5),
                                    10, nonlinearity=softmax)

# create loss function
prediction = lasagne.layers.get_output(network)
loss = lasagne.objectives.categorical_crossentropy(prediction, target_var)
loss = loss.mean() + 1e-4 * lasagne.regularization.regularize_network_params(
        network, lasagne.regularization.l2)

# create parameter update expressions
params = lasagne.layers.get_all_params(network, trainable=True)
updates = lasagne.updates.nesterov_momentum(loss, params, learning_rate=0.01,
                                            momentum=0.9)

# compile training function that updates parameters and returns training loss
train_fn = theano.function([input_var, target_var], loss, updates=updates)

# train network (assuming you've got some training data in numpy arrays)
for epoch in range(100):
    loss = 0
    for input_batch, target_batch in training_data:
        loss += train_fn(input_batch, target_batch)
    print("Epoch %d: Loss %g" % (epoch + 1, loss / len(training_data)))

# use trained network for predictions
test_prediction = lasagne.layers.get_output(network, deterministic=True)
predict_fn = theano.function([input_var], T.argmax(test_prediction, axis=1))
print("Predicted class for first test input: %r" % predict_fn(test_data[0]))

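The loop above assumes training_data is an iterable of (input_batch, target_batch) pairs of numpy arrays. A hypothetical helper to build such minibatches from two arrays could look like the sketch below (the bundled MNIST example defines a similar iterate_minibatches function):

import numpy as np

def iterate_minibatches(inputs, targets, batchsize, shuffle=False):
    # yield successive (input_batch, target_batch) pairs from two arrays
    assert len(inputs) == len(targets)
    indices = np.arange(len(inputs))
    if shuffle:
        np.random.shuffle(indices)
    for start in range(0, len(inputs) - batchsize + 1, batchsize):
        excerpt = indices[start:start + batchsize]
        yield inputs[excerpt], targets[excerpt]

Since a generator is exhausted after one pass, either materialize it once with list(...) or call iterate_minibatches(X_train, y_train, 100, shuffle=True) anew inside each epoch.
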
For a fully-functional example, see examples/mnist.py, and check the Tutorial for an in-depth walk-through of it. More examples, code snippets and reproductions of recent research papers are maintained in the separate Lasagne Recipes repository.

Development

Lasagne is a work in progress; input is welcome.

Please see the Contribution instructions for details on how you can contribute!

Changelog

0.1 (2015-08-13)

First release.

  • core contributors, in alphabetical order:

    • Eric Battenberg (@ebattenberg)

    • Sander Dieleman (@benanne)

    • Daniel Nouri (@dnouri)

    • Eben Olson (@ebenolson)

    • Aäron van den Oord (@avdnoord)

    • Colin Raffel (@craffel)

    • Jan Schlüter (@f0k)

    • Søren Kaae Sønderby (@skaae)

  • extra contributors, in chronological order:

    • Daniel Maturana (@dimatura): documentation, cuDNN layers, LRN

    • Jonas Degrave (@317070): get_all_param_values() fix

    • Jack Kelly (@JackKelly): help with recurrent layers

    • Gábor Takács (@takacsg84): support broadcastable parameters in lasagne.updates

    • Diogo Moitinho de Almeida (@diogo149): MNIST example fixes

    • Brian McFee (@bmcfee): MaxPool2DLayer fix

    • Martin Thoma (@MartinThoma): documentation

    • Jeffrey De Fauw (@JeffreyDF): documentation, ADAM fix

    • Michael Heilman (@mheilman): NonlinearityLayer, lasagne.random

    • Gregory Sanders (@instagibbs): documentation fix

    • Jon Crall (@erotemic): check for non-positive input shapes

    • Hendrik Weideman (@hjweide): set_all_param_values() test, MaxPool2DCCLayer fix

    • Kashif Rasul (@kashif): ADAM simplification

    • Peter de Rivaz (@peterderivaz): documentation fix
