A high-level library on top of PyTorch.
Project description
Introduction
A high-level framework for general-purpose neural networks in PyTorch.
Personally, going from Theano to PyTorch felt like time traveling from the 90s to the modern day. However, despite its many bells and whistles, PyTorch still lacks some elements that, by all indications, will never be added to the library. This library therefore adds more features on top of the already magical PyTorch. All the modules here directly subclass the corresponding modules from PyTorch, so everything should still feel familiar. For example, the following snippet in PyTorch
```python
from torch import nn

model = nn.Sequential(
    nn.Conv2d(1, 20, 5, padding=2),
    nn.ReLU(),
    nn.Conv2d(20, 64, 5, padding=2),
    nn.ReLU()
)
```
can be rewritten in Neuralnet-pytorch as
```python
import neuralnet_pytorch as nnt

model = nnt.Sequential(
    nnt.Conv2d(1, 20, 5, padding='half', activation='relu'),
    nnt.Conv2d(20, 64, 5, padding='half', activation='relu')
)
```
which behaves the same as the native PyTorch version, or
```python
import neuralnet_pytorch as nnt

model = nnt.Sequential(input_shape=1)
model.add_module('conv1', nnt.Conv2d(model.output_shape, 20, 5, padding='half', activation='relu'))
model.add_module('conv2', nnt.Conv2d(model.output_shape, 64, 5, padding='half', activation='relu'))
```
which frees you from a lot of memorization and manual shape calculation when stacking one layer on top of another. Theano users will also find much of this reminiscent, as many functions are heavily inspired by Theano.
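The "manual calculation" being avoided is the standard convolution output-size formula from the PyTorch `Conv2d` documentation. A quick dependency-free sketch of that arithmetic (the helper name is ours, for illustration only):

```python
def conv2d_out_size(size, kernel, stride=1, padding=0, dilation=1):
    """Output size of one spatial dimension of a Conv2d layer,
    per the standard formula used by PyTorch."""
    return (size + 2 * padding - dilation * (kernel - 1) - 1) // stride + 1

# 'half' (a.k.a. "same") padding for a 5x5 kernel means padding=2,
# so the spatial size is preserved:
assert conv2d_out_size(28, kernel=5, padding=2) == 28

# Without padding the feature map shrinks, which you would otherwise
# have to track by hand before wiring up the next layer:
assert conv2d_out_size(28, kernel=5, padding=0) == 24
```

This is the bookkeeping that `model.output_shape` does for you automatically as layers are added.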
Requirements
PyTorch >= 1.0.0
Gin-config (optional)
Installation
Stable version
```
pip install --upgrade neuralnet-pytorch
```
Bleeding-edge version
```
pip install git+git://github.com/justanhduc/neuralnet-pytorch.git@master
```
To install the version with some collected CUDA/C++ ops, use
```
pip install git+git://github.com/justanhduc/neuralnet-pytorch.git@fancy
```
Usages
The reference manual is still under development and is available at https://neuralnet-pytorch.readthedocs.io.
TODO
- Adding introduction and installation guides
- Writing documentation
- Adding examples
Disclaimer
This package is a product of my little free time during my PhD, so most, but not all, of the modules have been properly tested. No replacements or refunds for buggy performance. All PRs are welcome.
Acknowledgements
The CUDA Chamfer distance is taken from the AtlasNet repo.
The AdaBound optimizer is taken from its official repo.
The adapted Gin for Pytorch code is taken from Gin-config.
The monitor scheme is inspired by WGAN.
File details
Details for the file neuralnet-pytorch-1.0.0a.tar.gz.
File metadata
- Download URL: neuralnet-pytorch-1.0.0a.tar.gz
- Upload date:
- Size: 61.4 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/1.13.0 pkginfo/1.5.0.1 requests/2.21.0 setuptools/41.0.1 requests-toolbelt/0.9.1 tqdm/4.31.1 CPython/3.7.3
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | fce6e6e7a50871b387a6d90b1050b98c75e22ed52b6dc17bdff5220e24316f97 |
| MD5 | fab1c91076a4d22cfdc71c015c9f1374 |
| BLAKE2b-256 | 77c8cbcdd670b55c77f22c0eec0e85bb623eff0cc2dbf73ef4f45f65b434dc1f |
File details
Details for the file neuralnet_pytorch-1.0.0a-py3-none-any.whl.
File metadata
- Download URL: neuralnet_pytorch-1.0.0a-py3-none-any.whl
- Upload date:
- Size: 49.2 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/1.13.0 pkginfo/1.5.0.1 requests/2.21.0 setuptools/41.0.1 requests-toolbelt/0.9.1 tqdm/4.31.1 CPython/3.7.3
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 40d51dfad0be8ce3c985c0718b866d64d97f6ebb3f62b63418ab4e910231a7a5 |
| MD5 | aacd6a549668fc3afb307c48f5b7cdb3 |
| BLAKE2b-256 | 91ded52ea977393ea3ef2f42ed40a3c92d16d477e5075cb3b36f1a6912b855fa |