deep neural network library in Python

Project description

pydnn is a deep neural network library written in Python using Theano (symbolic math and optimizing compiler package). I wrote it as a learning project while competing in Kaggle’s National Data Science Bowl in March 2015 (where it produced an entry finishing in the top 6%) and plan to continue developing it by adding support for the most important deep learning techniques (including RNNs).

Design Goals

  • Simplicity

    Wherever possible, simplify code so that it is a clear expression of the underlying deep learning algorithms. Minimize cognitive overhead so that someone who has completed the deeplearning.net tutorials can pick up this library as a next step and easily start learning about, using, and coding more advanced techniques.

  • Completeness

    Include all of the important and popular techniques for effective deep learning, while omitting techniques with more marginal or ambiguous benefit.

  • Ease of use

    Make preparing a dataset, building a model, and training a deep network a matter of a few lines of code; enable users to work in NumPy rather than Theano.

  • Performance

    Performance should be roughly on par with other Theano-based neural network libraries, so that pydnn is a viable choice for computationally intensive deep learning.

Features

  • High-performance GPU training (courtesy of Theano)

  • Quick-start tools for getting up and running on inexpensive Amazon EC2 GPU instances.

  • Implementations of important new techniques recently reported in the literature:

  • Implementations of standard deep learning techniques:
    • Stochastic Gradient Descent with Momentum (see the NumPy sketch after this list)

    • Dropout

    • Convolutions with max-pooling using overlapping windows

    • ReLU/tanh/sigmoid activation functions

    • etc.
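
To make two of the standard techniques above concrete, here is a minimal NumPy sketch of an SGD-with-momentum parameter update and of (inverted) dropout applied to a layer's activations. It illustrates the underlying math only and does not use pydnn's internal API; the function names are hypothetical.

import numpy as np

def sgd_momentum_step(param, grad, velocity, learning_rate=0.01, momentum=0.9):
    # Accumulate a velocity from past gradients, then move the parameter along it.
    velocity = momentum * velocity - learning_rate * grad
    return param + velocity, velocity

def dropout(activations, rng, drop_rate=0.5):
    # Randomly zero a fraction of activations and rescale the rest so the
    # expected activation stays the same (inverted dropout).
    mask = rng.binomial(n=1, p=1.0 - drop_rate, size=activations.shape)
    return activations * mask / (1.0 - drop_rate)

rng = np.random.RandomState(0)
w = rng.randn(3, 3)                 # a weight matrix
v = np.zeros_like(w)                # its velocity buffer
g = rng.randn(3, 3)                 # a stand-in gradient from backprop
w, v = sgd_momentum_step(w, g, v)   # one momentum update
h = dropout(rng.randn(5, 3), rng)   # dropout on a hidden-layer activation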

Usage

First download and unzip raw image data from somewhere (e.g. Kaggle). Then:

import pydnn
import numpy as np
rng = np.random.RandomState(12345)  # any fixed seed, for reproducible runs

# build data, split into training/validation sets, preprocess
train_dir = '/home/ubuntu/train'
data = pydnn.data.DirectoryLabeledImageSet(train_dir).build()
data = pydnn.preprocess.split_training_data(data, 64, 80, 15, 5)  # batch size 64; 80/15/5 train/validation/test split
resizer = pydnn.preprocess.StretchResizer()
pre = pydnn.preprocess.Rotator360(data, (64, 64), resizer, rng)

# build the neural network
net = pydnn.nn.NN(pre, 'images', 121, 64, rng, pydnn.nn.relu)  # 121 output classes, batch size 64
net.add_convolution(72, (7, 7), (2, 2))
net.add_dropout()
net.add_convolution(128, (5, 5), (2, 2))
net.add_dropout()
net.add_convolution(128, (3, 3), (2, 2))
net.add_dropout()
net.add_hidden(3072)
net.add_dropout()
net.add_hidden(3072)
net.add_dropout()
net.add_logistic()

# train the network
lr = pydnn.nn.Adam(learning_rate=pydnn.nn.LearningRateDecay(
            learning_rate=0.006,
            decay=.1))
net.train(lr)

From raw data to trained network (including specifying network architecture) in 25 lines of code.
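
For reference, the optimizer configured above combines the Adam update rule with a decaying learning rate. The standalone NumPy sketch below shows roughly what one such step looks like using the standard formulation of Adam; the 1/(1 + decay*t) schedule and every name here are illustrative assumptions, not pydnn's internals.

import numpy as np

def adam_step(param, grad, m, v, t, base_lr=0.006, decay=0.1,
              beta1=0.9, beta2=0.999, eps=1e-8):
    # One Adam update with a simple 1/(1 + decay*t) learning-rate schedule.
    lr = base_lr / (1.0 + decay * t)           # decayed learning rate at step t
    m = beta1 * m + (1 - beta1) * grad         # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * grad ** 2    # second-moment estimate
    m_hat = m / (1 - beta1 ** (t + 1))         # bias correction for early steps
    v_hat = v / (1 - beta2 ** (t + 1))
    return param - lr * m_hat / (np.sqrt(v_hat) + eps), m, v

w = np.zeros(4)
m = np.zeros_like(w)
v = np.zeros_like(w)
for t in range(3):                             # a few toy steps with a fake gradient
    w, m, v = adam_step(w, np.ones_like(w), m, v, t)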

Short Term Goals

  • Implement popular RNN techniques.

  • Integrate with Amazon EC2 clustering software (such as StarCluster).

  • Integrate with hyper-parameter optimization frameworks (such as Spearmint and hyperopt).

Authors

Isaac Kriegman

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distributions

No source distribution files available for this release. See the tutorial on generating distribution archives.

Built Distribution

pydnn-0.0.dev-py2.py3-none-any.whl (75.1 kB)

Uploaded: Python 2, Python 3

File details

Details for the file pydnn-0.0.dev-py2.py3-none-any.whl.

File metadata

File hashes

Hashes for pydnn-0.0.dev-py2.py3-none-any.whl
Algorithm     Hash digest
SHA256        bf5b7a799fe8d6124b15bb18ea5cf5182a05d411db28b1aa21bd16369f5de113
MD5           6fbed30f08d8c0a780a3e3fb60de7a95
BLAKE2b-256   ab8bfde2a005c42035569835ae962a49e958f75e41000bed86ee89f10536fa67

See more details on using hashes here.
