
Rational Activations

Project description



Rational Activations - Learnable Rational Activation Functions

First introduced as PAU in Padé Activation Units: End-to-end Learning of Flexible Activation Functions in Deep Networks.

1. About Rational Activation Functions

Rational Activations are a novel class of learnable activation functions. Rationals encode activation functions as rational functions, trainable in an end-to-end fashion using backpropagation, and can be seamlessly integrated into any neural network in the same way as common activation functions (e.g. ReLU).

Rationals: Beyond known Activation Functions

Rationals can approximate any known activation function arbitrarily well (cf. Padé Activation Units: End-to-end Learning of Flexible Activation Functions in Deep Networks). (In the figure, the dashed lines represent the rational approximation of each function.)

Rationals are designed to be optimized by gradient descent and can discover useful properties of activation functions during training (cf. Recurrent Rational Networks).
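Concretely, a rational activation of order (m, n) computes R(x) = P(x)/Q(x), where P and Q are polynomials with learnable coefficients; a "safe" variant keeps the denominator at least 1 so the activation has no poles. A minimal pure-Python sketch of this functional form (the coefficient values below are illustrative placeholders, not the library's initialization):

```python
def rational(x, num, den):
    """Safe rational activation: P(x) / (1 + |b1*x + ... + bn*x^n|)."""
    p = sum(a * x**i for i, a in enumerate(num))                    # numerator P(x)
    q = 1.0 + abs(sum(b * x**(i + 1) for i, b in enumerate(den)))   # denominator Q(x) >= 1
    return p / q

# With num=[0, 1] and den=[] this reduces to the identity, R(x) = x.
print(rational(2.0, [0.0, 1.0], []))   # → 2.0
print(rational(-3.0, [0.0, 1.0], [1.0]))  # → -0.75, i.e. -3 / (1 + |-3|)
```

During training, the entries of `num` and `den` are the parameters that backpropagation adjusts.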

Rationals evaluation on different tasks

Rationals match or outperform common activations in terms of predictive performance and training time, and therefore relieve the network designer of having to commit to a potentially underperforming choice.

  • Recurrent Rational Functions were subsequently introduced in Recurrent Rational Networks; both Rational and Recurrent Rational Networks are evaluated on RL tasks. :octocat: See the rational_rl GitHub repo.

2. Dependencies

We support MxNet, Keras, and PyTorch. Instructions for MxNet can be found here, and for Keras here. The following README instructions assume that you want to use rational activations in PyTorch.

PyTorch>=1.4.0
CUDA>=10.1

3. Installation

To install the rational_activations module, you can use pip, but:

:bangbang: You should be careful about the CUDA version running on your machine.

To get your CUDA version:

import torch
print(torch.version.cuda)  # e.g. '10.2'

For your corresponding version of CUDA, please use one of the following command blocks:

CUDA 10.2

 pip3 install -U pip wheel
 pip3 install torch rational-activations

CUDA 10.1

Python3.6
   pip3 install -U pip wheel
   pip3 install torch==1.7.1+cu101 -f https://download.pytorch.org/whl/torch_stable.html
   pip3 install https://github.com/ml-research/rational_activations/blob/master/wheelhouse/cuda-10.1/rational_activations-0.1.0-cp36-cp36m-manylinux2014_x86_64.whl\?raw\=true 
Python3.7
   pip3 install -U pip wheel
   pip3 install torch==1.7.1+cu101 -f https://download.pytorch.org/whl/torch_stable.html
   pip3 install https://github.com/ml-research/rational_activations/blob/master/wheelhouse/cuda-10.1/rational_activations-0.1.0-cp37-cp37m-manylinux2014_x86_64.whl\?raw\=true 
Python3.8
     pip3 install -U pip wheel
     pip3 install torch==1.7.1+cu101 -f https://download.pytorch.org/whl/torch_stable.html
     pip3 install https://github.com/ml-research/rational_activations/blob/master/wheelhouse/cuda-10.1/rational_activations-0.1.0-cp38-cp38-manylinux2014_x86_64.whl\?raw\=true

CUDA 11.0

Python3.6
   pip3 install -U pip wheel
   pip3 install torch==1.7.1+cu110 -f https://download.pytorch.org/whl/torch_stable.html
   pip3 install https://github.com/ml-research/rational_activations/blob/master/wheelhouse/cuda-11.0/rational_activations-0.1.0-cp36-cp36m-manylinux2014_x86_64.whl\?raw\=true 
Python3.7
   pip3 install -U pip wheel
   pip3 install torch==1.7.1+cu110 -f https://download.pytorch.org/whl/torch_stable.html
   pip3 install https://github.com/ml-research/rational_activations/blob/master/wheelhouse/cuda-11.0/rational_activations-0.1.0-cp37-cp37m-manylinux2014_x86_64.whl\?raw\=true
Python3.8
     pip3 install -U pip wheel
     pip3 install torch==1.7.1+cu110 -f https://download.pytorch.org/whl/torch_stable.html
     pip3 install https://github.com/ml-research/rational_activations/blob/master/wheelhouse/cuda-11.0/rational_activations-0.1.0-cp38-cp38-manylinux2014_x86_64.whl\?raw\=true

Other CUDA/Pytorch

For any other combination of Python and CUDA versions, please install from source:

 pip3 install airspeed
 git clone https://github.com/ml-research/rational_activations.git
 cd rational_activations
 python3 setup.py install --user

If you encounter any trouble installing rational, please open an issue on the GitHub repository.

4. Using Rational in Neural Networks

Rationals can be integrated in the same way as any other common activation function:

import torch
from rational.torch import Rational

D_in, H, D_out = 784, 100, 10  # example layer sizes

model = torch.nn.Sequential(
    torch.nn.Linear(D_in, H),
    Rational(),  # e.g. instead of torch.nn.ReLU()
    torch.nn.Linear(H, D_out),
)
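Because the numerator and denominator coefficients are ordinary trainable parameters, backpropagation updates them together with the network's weights. The effect can be illustrated outside the library with a small pure-Python sketch that fits a "safe" degree-(3, 2) rational to leaky ReLU by gradient descent (finite-difference gradients here for brevity; the grid, degrees, and learning rate are illustrative choices, not the library's defaults):

```python
# Fit R(x) = P(x) / (1 + |b1*x + b2*x^2|) to leaky ReLU on [-1, 1].

def rational(x, num, den):
    p = sum(a * x**i for i, a in enumerate(num))
    q = 1.0 + abs(sum(b * x**(i + 1) for i, b in enumerate(den)))
    return p / q

def mse(params, xs):
    num, den = params[:4], params[4:]
    target = lambda x: x if x > 0 else 0.1 * x   # leaky ReLU
    return sum((rational(x, num, den) - target(x)) ** 2 for x in xs) / len(xs)

xs = [i / 20 for i in range(-20, 21)]            # grid on [-1, 1]
params = [0.0, 1.0, 0.0, 0.0, 0.0, 0.0]          # start at the identity, R(x) = x
start = mse(params, xs)
lr, eps = 0.2, 1e-5
for _ in range(1500):
    base = mse(params, xs)
    grads = []
    for j in range(len(params)):
        bumped = list(params)
        bumped[j] += eps
        grads.append((mse(bumped, xs) - base) / eps)   # finite-difference gradient
    params = [p - lr * g for p, g in zip(params, grads)]
print(round(start, 4), "->", round(mse(params, xs), 4))  # loss drops as the shape is learned
```

In the library itself this fitting happens implicitly: the coefficients live inside the `Rational` module and are updated by whatever optimizer trains the rest of the network.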

5. Cite Us in your paper

@inproceedings{molina2019pade,
  title={Pad{\'e} Activation Units: End-to-end Learning of Flexible Activation Functions in Deep Networks},
  author={Molina, Alejandro and Schramowski, Patrick and Kersting, Kristian},
  booktitle={International Conference on Learning Representations},
  year={2019}
}


@article{delfosse2020rationals,
  title={Rational Activation functions},
  author={Delfosse, Quentin and Schramowski, Patrick and Molina, Alejandro and Beck, Nils and Hsu, Ting-Yu and Kashef, Yasien and Rüling-Cachay, Salva and Zimmermann, Julius},
  journal={arXiv preprint arXiv:2102.09407},
  year={2020},
  howpublished={\url{https://github.com/ml-research/rational_activations}}
}



Download files


Source Distributions

No source distribution files are available for this release.

Built Distribution

rational_activations-0.2.0-py3-none-any.whl (54.3 kB)


File details

Details for the file rational_activations-0.2.0-py3-none-any.whl.

File metadata

  • Download URL: rational_activations-0.2.0-py3-none-any.whl
  • Upload date:
  • Size: 54.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.1 importlib_metadata/4.5.0 pkginfo/1.6.1 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.56.0 CPython/3.8.5

File hashes

Hashes for rational_activations-0.2.0-py3-none-any.whl:

  • SHA256: c8903e068265c45b39a62f0ff02165f30f8edb23d8e52169ce8828e3755f6b27
  • MD5: e0795e032cb832a8ff3b280cb176564d
  • BLAKE2b-256: fddc01b19eb2628a8750543263c2ac3f4a72d1805df1b46a5561bbca888a501d

