
Rational Activations

Project description

Rational Activations - Learnable Rational Activation Functions

First introduced as PAU in "Padé Activation Units: End-to-end Learning of Activation Functions in Deep Neural Networks".

arXiv link: https://arxiv.org/abs/1907.06732

1. About Padé Activation Units

Rational Activations are a novel type of learnable activation function. Rationals encode activation functions as rational functions, trainable end-to-end via backpropagation, and can be seamlessly integrated into any neural network in the same way as common activation functions (e.g. ReLU).

Rational matches or outperforms common activations in terms of predictive performance and training time, and therefore relieves the network designer of having to commit to a potentially underperforming choice.
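Concretely, the PAU paper parameterizes each activation as a "safe" Padé approximant R(x) = P(x) / (1 + |Q(x)|) with learnable polynomial coefficients. A minimal PyTorch sketch of the idea follows; it is illustrative only (the degrees, initialization, and module name are assumptions, not the library's actual CUDA-optimized implementation):

```python
import torch

class SimpleRational(torch.nn.Module):
    """Illustrative learnable rational activation:
    R(x) = P(x) / (1 + |Q(x)|), with P of degree m and Q of degree n
    (m=5, n=4 as in the PAU paper). Coefficients are learned by backprop."""

    def __init__(self, num_degree=5, den_degree=4):
        super().__init__()
        # Numerator coefficients a0..am and denominator coefficients b1..bn.
        self.numerator = torch.nn.Parameter(torch.randn(num_degree + 1) * 0.1)
        self.denominator = torch.nn.Parameter(torch.randn(den_degree) * 0.1)

    def forward(self, x):
        # P(x) = a0 + a1*x + ... + am*x^m
        p = sum(a * x**i for i, a in enumerate(self.numerator))
        # Q(x) = b1*x + ... + bn*x^n; the "safe" form 1 + |Q(x)| keeps the
        # denominator strictly positive, so training never divides by zero.
        q = sum(b * x**(i + 1) for i, b in enumerate(self.denominator))
        return p / (1.0 + torch.abs(q))
```

Because the output is differentiable in both the input and the coefficients, the activation's shape is optimized jointly with the rest of the network's weights.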

2. Dependencies

PyTorch>=1.4.0
CUDA>=10.1

3. Installation

To install the rational_activations module, you can use pip, but you should be careful about the CUDA version running on your machine. To get your CUDA version:

import torch
torch.version.cuda

pip3 install wheel

For CUDA 10.1 (and thus torch 1.4.0 to 1.5.0), download the wheel corresponding to your Python 3 version from the wheelhouse repo and install it with:

pip3 install rational-0.0.16-101-cp{your_version}-manylinux2014_x86_64.whl

If you encounter any trouble installing rational, please contact the maintainers.

4. Using Rational in Neural Networks

Rational can be integrated in the same way as any other common activation function.

import torch
from rational_torch import Rational

D_in, H, D_out = 64, 100, 10  # example layer sizes

model = torch.nn.Sequential(
    torch.nn.Linear(D_in, H),
    Rational(),  # e.g. instead of torch.nn.ReLU()
    torch.nn.Linear(H, D_out),
)

5. Reproducing Results

To reproduce the reported results of the paper, execute:

$ export PYTHONPATH="./"
$ python experiments/main.py --dataset mnist --arch conv --optimizer adam --lr 2e-3

# --dataset: name of the dataset; use mnist for MNIST and fmnist for Fashion-MNIST
# --arch: selected neural network architecture: vgg, lenet, or conv
# --optimizer: either adam or sgd
# --lr: learning rate

6. To be implemented

  • Write documentation
  • Create a tutorial in the docs
  • Working TensorFlow version
  • Automatically find initial approximation weights for the function list
  • Repair and enhance the automatic manylinux production script
  • Add Python 3.9 support
  • Make a CUDA 11.0 compatible version
  • Repair the tox tests and have them run before commit
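The initial-approximation item in the roadmap above can be prototyped offline: fit the coefficients of the safe rational form R(x) = P(x) / (1 + |Q(x)|) to a target activation by nonlinear least squares. A NumPy/SciPy sketch follows; the function name, degrees, and fitting interval are illustrative assumptions, not the library's actual method or API:

```python
import numpy as np
from scipy.optimize import least_squares

def fit_rational_to(target, num_degree=5, den_degree=4, lo=-3.0, hi=3.0, n=1000):
    """Fit R(x) = P(x) / (1 + |Q(x)|) to `target` on [lo, hi].
    Returns (coefficients, RMSE). Coefficients are a0..am then b1..bn."""
    x = np.linspace(lo, hi, n)
    y = target(x)

    def residuals(coeffs):
        a = coeffs[: num_degree + 1]           # numerator a0..am (low to high)
        b = coeffs[num_degree + 1 :]           # denominator b1..bn
        p = np.polyval(a[::-1], x)             # polyval wants highest degree first
        q = np.polyval(np.concatenate(([0.0], b))[::-1], x)
        return p / (1.0 + np.abs(q)) - y

    x0 = np.zeros(num_degree + 1 + den_degree)
    x0[1] = 1.0                                # start near the identity: P(x) = x
    res = least_squares(residuals, x0)
    return res.x, float(np.sqrt(np.mean(res.fun ** 2)))

# Example: fit a degree-5/4 rational to ReLU.
coeffs, rmse = fit_rational_to(lambda x: np.maximum(x, 0.0))
```

The fitted coefficients could then serve as the initialization of a rational activation that starts out behaving like the chosen target function.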
