# tensorpack
Neural Network Toolbox on TensorFlow.
[![Build Status](https://travis-ci.org/ppwwyyxx/tensorpack.svg?branch=master)](https://travis-ci.org/ppwwyyxx/tensorpack)
[![badge](https://readthedocs.org/projects/pip/badge/?version=latest)](http://tensorpack.readthedocs.io/en/latest/index.html)
See some [examples](examples) to learn about the framework:
### Vision:
+ [DoReFa-Net: train binary / low-bitwidth CNN on ImageNet](examples/DoReFa-Net)
+ [Train ResNet on ImageNet / Cifar10 / SVHN](examples/ResNet)
+ [Generative Adversarial Network (GAN) variants](examples/GAN), including DCGAN, InfoGAN, Conditional GAN, WGAN, BEGAN, DiscoGAN, Image to Image.
+ [Fully-convolutional Network for Holistically-Nested Edge Detection (HED)](examples/HED)
+ [Spatial Transformer Networks on MNIST addition](examples/SpatialTransformer)
+ [Visualize Saliency Maps by Guided ReLU](examples/Saliency)
+ [Similarity Learning on MNIST](examples/SimilarityLearning)
### Reinforcement Learning:
+ [Deep Q-Network (DQN) variants on Atari games](examples/DeepQNetwork), including DQN, DoubleDQN, DuelingDQN.
+ [Asynchronous Advantage Actor-Critic (A3C) with demos on OpenAI Gym](examples/A3C-Gym)
### Speech / NLP:
+ [LSTM-CTC for speech recognition](examples/CTC-TIMIT)
+ [char-rnn for fun](examples/Char-RNN)
+ [LSTM language model on PennTreebank](examples/PennTreebank)
The examples are not just demonstrations of the framework -- you can train them and reproduce the results in the papers.
## Features:
It's yet another TF wrapper, but it differs in the following ways:
1. It doesn't focus on models.
+ There are already too many symbolic function wrappers.
Tensorpack includes only a few common models, plus helpful tools such as `LinearWrap` to simplify large models.
But you can use any other wrapper within tensorpack, such as sonnet/Keras/slim/tflearn/tensorlayer/....
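The chaining idea behind `LinearWrap` can be sketched in a few lines of plain Python. This is an illustrative toy, not tensorpack's actual implementation: each call applies a function to the wrapped value and returns a new wrapper, so a deep stack of layers reads as one linear chain instead of nested calls.

```python
# Minimal sketch of a fluent-chaining wrapper (illustrative only,
# not tensorpack's LinearWrap): each apply() transforms the wrapped
# value and returns a new wrapper, so calls chain left to right.
class Chain:
    def __init__(self, value):
        self._value = value

    def apply(self, fn, *args, **kwargs):
        """Apply fn(value, *args, **kwargs) and keep chaining."""
        return Chain(fn(self._value, *args, **kwargs))

    def __call__(self):
        """Unwrap the final value."""
        return self._value

# Example: a toy "network" of arithmetic layers.
out = (Chain(2)
       .apply(lambda x, k: x * k, 10)   # a "scale" layer
       .apply(lambda x: x + 1)          # a "bias" layer
       ())
# out == 21
```

The payoff is the same as with real layers: adding or removing a stage is a one-line change in the chain.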
2. It focuses on large datasets.
+ __DataFlow__ allows you to process large datasets such as ImageNet in Python without blocking the training.
+ DataFlow has a unified interface, so you can compose and reuse dataflows to perform complex preprocessing.
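The composition idea can be sketched in plain Python. This is an illustrative toy, not tensorpack's real DataFlow classes: each stage wraps another iterable, so preprocessing stages stack freely in any order.

```python
# Illustrative sketch of composable dataflows (not the library's
# actual API): every stage takes a source iterable and yields
# transformed datapoints, so stages nest like tensorpack dataflows.
class MapData:
    """Apply a function to every datapoint from the source."""
    def __init__(self, source, fn):
        self.source, self.fn = source, fn
    def __iter__(self):
        for dp in self.source:
            yield self.fn(dp)

class BatchData:
    """Group datapoints from the source into fixed-size batches."""
    def __init__(self, source, batch_size):
        self.source, self.batch_size = source, batch_size
    def __iter__(self):
        batch = []
        for dp in self.source:
            batch.append(dp)
            if len(batch) == self.batch_size:
                yield batch
                batch = []

ds = range(6)                       # a toy "dataset"
ds = MapData(ds, lambda x: x * x)   # a preprocessing stage
ds = BatchData(ds, 3)               # a batching stage
batches = list(ds)                  # [[0, 1, 4], [9, 16, 25]]
```

Because every stage speaks the same iterator protocol, you can reuse one preprocessing pipeline across datasets just by swapping the innermost source.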
3. It focuses on training speed.
+ The tensorpack trainer is almost always faster than `feed_dict`-based wrappers.
Even on a small CNN example, training runs [2x faster](https://gist.github.com/ppwwyyxx/8d95da79f8d97036a7d67c2416c851b6) than the equivalent Keras code.
More improvements are on the way.
+ Data-parallel multi-GPU training works out of the box.
You can also define your own trainer for a different style of training (e.g. GAN) without losing efficiency.
4. It has an interface of extensible __Callbacks__.
Write a callback to implement anything you want to do apart from the training iterations, and
enable it with one line of code. Common examples include:
+ Change hyperparameters during training
+ Print some tensors of interest
+ Run inference on a test dataset
+ Run some operations once in a while
+ Send loss to your phone
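The callback pattern can be sketched with a toy training loop. This is an illustrative sketch only; tensorpack's actual `Callback` interface has more hooks and different names. The point is that the loop calls hooks at fixed points, so extra behavior plugs in without touching the loop itself:

```python
# Toy sketch of the callback pattern (not tensorpack's real
# interface): the training loop invokes hooks at fixed points,
# and each callback decides what to do there.
class Callback:
    def before_train(self):
        pass
    def trigger_epoch(self, epoch):
        pass

class RecordEpochs(Callback):
    """A hypothetical callback that records which epochs ran
    (a real one might print tensors, run inference, etc.)."""
    def __init__(self):
        self.seen = []
    def trigger_epoch(self, epoch):
        self.seen.append(epoch)

def train(num_epochs, callbacks):
    for cb in callbacks:
        cb.before_train()
    for epoch in range(1, num_epochs + 1):
        # ... one epoch of training iterations would run here ...
        for cb in callbacks:
            cb.trigger_epoch(epoch)

cb = RecordEpochs()
train(3, [cb])
# cb.seen == [1, 2, 3]
```

Enabling or disabling a behavior is then just adding or removing one entry from the callback list, which is exactly the "one line of code" promise above.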
## Install:
Dependencies:
+ Python 2 or 3
+ TensorFlow >= 1.0.0
+ Python bindings for OpenCV
```
pip install -U git+https://github.com/ppwwyyxx/tensorpack.git
# or add `--user` to avoid system-wide installation.
```