alrelu-torch

ALReLU activation package for PyTorch with fixed and trainable alpha variants.

A PyPI-installable PyTorch package for the ALReLU activation, available in two variants:

  1. ALReLU (default): fixed alpha=0.01
  2. TrainableALReLU: trainable alpha parameter

Formula:

ALReLU(x, alpha) = max(abs(alpha * x), x)
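The formula can be written directly in PyTorch. The sketch below is a minimal reference implementation for illustration only; the packaged alrelu function may differ in details:

```python
import torch

def alrelu_ref(x: torch.Tensor, alpha: float = 0.01) -> torch.Tensor:
    # For x >= 0 (with 0 < alpha < 1), |alpha * x| <= x, so the output is x.
    # For x < 0, x < 0 <= |alpha * x|, so the output is the small positive
    # value |alpha * x| (unlike Leaky ReLU, whose negative branch stays negative).
    return torch.maximum(torch.abs(alpha * x), x)

x = torch.tensor([-2.0, -1.0, 0.0, 1.0, 2.0])
print(alrelu_ref(x))  # tensor([0.0200, 0.0100, 0.0000, 1.0000, 2.0000])
```

Note that negative inputs are mapped to small positive values, which is the key difference from Leaky ReLU.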

Reference

ALReLU paper:

ALReLU: A different approach on Leaky ReLU activation function to improve Neural Networks Performance

https://arxiv.org/abs/2012.07564

Installation

pip install alrelu-torch

From source:

pip install .

Training Examples (MNIST)

Scripts:

  • Keras + TensorFlow: scripts/train_mnist_keras_tf.py
  • PyTorch: scripts/train_mnist_torch.py

Run commands (from repo root):

# Keras + TensorFlow
python scripts/train_mnist_keras_tf.py --epochs 5 --variant fixed --alpha 0.01
python scripts/train_mnist_keras_tf.py --epochs 5 --variant learnable --alpha 0.01

# PyTorch
python scripts/train_mnist_torch.py --epochs 5 --variant fixed --alpha 0.01
python scripts/train_mnist_torch.py --epochs 5 --variant learnable --alpha 0.01

Usage

1) Functional API

import torch
from alrelu_torch import alrelu

x = torch.tensor([-2.0, -1.0, 0.0, 1.0, 2.0])
y = alrelu(x)  # alpha=0.01 by default

2) Fixed module

import torch.nn as nn
from alrelu_torch import ALReLU

model = nn.Sequential(
    nn.Linear(32, 64),
    ALReLU(alpha=0.01),
    nn.Linear(64, 10),
)

3) Trainable alpha module

import torch.nn as nn
from alrelu_torch import TrainableALReLU

model = nn.Sequential(
    nn.Linear(32, 64),
    TrainableALReLU(alpha_init=0.01, non_negative=True),
    nn.Linear(64, 10),
)
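Because the trainable variant holds alpha as a learnable parameter, the optimizer updates it along with the layer weights. The sketch below is a hypothetical re-implementation used only to illustrate that mechanic; the packaged TrainableALReLU may differ in attribute names and in how non_negative is enforced:

```python
import torch
import torch.nn as nn

class TrainableALReLUSketch(nn.Module):
    # Hypothetical stand-in for TrainableALReLU, for illustration only.
    def __init__(self, alpha_init: float = 0.01, non_negative: bool = True):
        super().__init__()
        self.alpha = nn.Parameter(torch.tensor(float(alpha_init)))
        self.non_negative = non_negative

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Optionally clamp alpha at zero so it cannot become negative.
        alpha = self.alpha.clamp(min=0.0) if self.non_negative else self.alpha
        return torch.maximum(torch.abs(alpha * x), x)

act = TrainableALReLUSketch(alpha_init=0.01)
out = act(torch.tensor([-3.0, -1.0, 2.0])).sum()
out.backward()

# alpha received a gradient from the negative inputs (where the |alpha * x|
# branch is selected), so a normal optimizer step, e.g. with
# torch.optim.Adam(act.parameters()), will update it.
print(act.alpha.grad)  # tensor(4.)
```

Only the negative inputs contribute to alpha's gradient here: for x = -3 and x = -1 the |alpha * x| branch wins the maximum, giving d/d(alpha) of 3 and 1 respectively, while x = 2 routes its gradient to the identity branch.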

Development

Install dev tools:

pip install -e .[dev]

Run tests:

pytest

Build package:

python -m build

