Project Description

pydnn is a deep neural network library written in Python using Theano (a symbolic math and optimizing compiler package). I wrote it as a learning project while competing in Kaggle’s National Data Science Bowl in March 2015 (where it produced an entry that finished in the top 6%), and I plan to continue developing it by adding support for the most important deep learning techniques (including RNNs).

Design Goals

  • Simplicity

    Wherever possible, simplify code so that it is a clear expression of the underlying deep learning algorithms. Minimize cognitive overhead, so that someone who has completed the deeplearning.net tutorials can pick up this library as a next step and easily start learning about, using, and coding more advanced techniques.

  • Completeness

    Include the important and popular techniques needed for effective deep learning, while leaving out techniques whose benefit is more marginal or ambiguous.

  • Ease of use

    Make preparing a dataset, building a model, and training a deep network take only a few lines of code; let users work with NumPy rather than Theano.

  • Performance

    Performance should be roughly on par with other Theano-based neural network libraries, so that pydnn is a viable choice for computationally intensive deep learning.

Features

  • High performance GPU training (courtesy of Theano)

  • Quick-start tools for training on inexpensive Amazon EC2 GPU instances.

  • Implementations of important new techniques recently reported in the literature:
  • Implementations of standard deep learning techniques:
    • Stochastic Gradient Descent with Momentum (illustrated, along with dropout, in the sketch after this list)
    • Dropout
    • Convolutions with max-pooling using overlapping windows
    • ReLU/tanh/sigmoid activation functions
    • etc.
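
For orientation, the following is a minimal NumPy sketch of two of the techniques listed above, SGD with momentum and (inverted) dropout. It is a conceptual illustration only, not code from pydnn's internals, and every name in it is made up for the example:

import numpy as np

rng = np.random.RandomState(0)

def sgd_momentum_step(param, grad, velocity, learning_rate=0.01, momentum=0.9):
    # velocity accumulates a decaying average of past gradients;
    # the parameter then moves along that smoothed direction
    velocity = momentum * velocity - learning_rate * grad
    return param + velocity, velocity

def dropout(activations, drop_prob=0.5):
    # inverted dropout: randomly zero units during training and rescale
    # the survivors so no extra rescaling is needed at test time
    mask = rng.binomial(n=1, p=1.0 - drop_prob, size=activations.shape)
    return activations * mask / (1.0 - drop_prob)

# toy usage
w, v = np.zeros(3), np.zeros(3)
g = np.array([0.5, -0.2, 0.1])          # stand-in for a computed gradient
w, v = sgd_momentum_step(w, g, v)
h = dropout(np.array([1.0, 2.0, 3.0]))  # roughly half the units zeroed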

Usage

First download and unzip raw image data from somewhere (e.g. Kaggle). Then:

import pydnn
import numpy as np
rng = np.random.RandomState(13579)  # any fixed seed works here

# build data, split into training/validation sets, preprocess
train_dir = '/home/ubuntu/train'
data = pydnn.data.DirectoryLabeledImageSet(train_dir).build()
data = pydnn.preprocess.split_training_data(data, 64, 80, 15, 5)
resizer = pydnn.preprocess.StretchResizer()
pre = pydnn.preprocess.Rotator360(data, (64, 64), resizer, rng)

# build the neural network
net = pydnn.nn.NN(pre, 'images', 121, 64, rng, pydnn.nn.relu)
net.add_convolution(72, (7, 7), (2, 2))
net.add_dropout()
net.add_convolution(128, (5, 5), (2, 2))
net.add_dropout()
net.add_convolution(128, (3, 3), (2, 2))
net.add_dropout()
net.add_hidden(3072)
net.add_dropout()
net.add_hidden(3072)
net.add_dropout()
net.add_logistic()

# train the network
lr = pydnn.nn.Adam(learning_rate=pydnn.nn.LearningRateDecay(
            learning_rate=0.006,
            decay=.1))
net.train(lr)

From raw data to trained network (including specifying network architecture) in 25 lines of code.
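
The updater above combines pydnn.nn.Adam with a LearningRateDecay schedule. For readers new to Adam, here is a conceptual NumPy sketch of the published update rule (Kingma & Ba, 2014) that the optimizer is based on. It illustrates the algorithm only and is not pydnn's implementation; beta1, beta2, and eps take the paper's default values, and the 0.006 learning rate simply mirrors the example above:

import numpy as np

def adam_step(param, grad, m, v, t, learning_rate=0.006,
              beta1=0.9, beta2=0.999, eps=1e-8):
    # keep exponential moving averages of the gradient (m) and of its
    # square (v), correct their startup bias, and take a per-parameter
    # step scaled by the estimated gradient magnitude
    t += 1
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad ** 2
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    param = param - learning_rate * m_hat / (np.sqrt(v_hat) + eps)
    return param, m, v, t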

Short Term Goals

  • Implement popular RNN techniques.
  • Integrate with Amazon EC2 clustering software (such as StarCluster).
  • Integrate with hyper-parameter optimization frameworks (such as Spearmint and hyperopt).
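
Until that hyper-parameter integration exists, wrapping a pydnn run in hyperopt by hand might look roughly like the sketch below. Only the hyperopt calls (fmin, tpe.suggest, hp.loguniform) are real hyperopt API; the objective body is schematic, and the validation_error helper is hypothetical, standing in for however you measure validation error on a trained network:

import pydnn
from hyperopt import fmin, tpe, hp

def objective(learning_rate):
    # rebuild `pre` and `net` here exactly as in the Usage section above
    # (omitted for brevity), then train with the sampled learning rate
    updater = pydnn.nn.Adam(learning_rate=pydnn.nn.LearningRateDecay(
        learning_rate=learning_rate, decay=.1))
    net.train(updater)
    return validation_error(net)  # hypothetical: hyperopt minimizes this value

best = fmin(fn=objective,
            space=hp.loguniform('learning_rate', -7, -3),  # roughly 0.001 to 0.05
            algo=tpe.suggest,
            max_evals=25)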

Authors

Isaac Kriegman
