
ALReLU activation package for PyTorch with fixed and trainable alpha variants.

Project description

alrelu-torch

A PyPI-installable PyTorch package for the ALReLU activation, provided in two variants:

  1. ALReLU (default): fixed alpha=0.01
  2. TrainableALReLU: trainable alpha parameter

Formula:

ALReLU(x, alpha) = max(abs(alpha * x), x)
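As a quick sanity check, the formula can be evaluated on plain Python scalars (a reference sketch of the formula above, not the package's implementation):

```python
def alrelu_scalar(x: float, alpha: float = 0.01) -> float:
    """Reference ALReLU on one value: max(|alpha * x|, x)."""
    return max(abs(alpha * x), x)

# For alpha < 1, positive inputs pass through unchanged;
# negative inputs map to the small positive value |alpha * x|.
print(alrelu_scalar(2.0))   # 2.0
print(alrelu_scalar(-2.0))  # 0.02
print(alrelu_scalar(0.0))   # 0.0
```

Unlike Leaky ReLU's max(alpha * x, x), the absolute value makes the negative branch non-negative, which is the paper's modification.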

Reference

ALReLU paper:

ALReLU: A different approach on Leaky ReLU activation function to improve Neural Networks Performance

https://arxiv.org/abs/2012.07564

Installation

pip install alrelu-torch

From source:

pip install .

Training Examples (MNIST)

Scripts:

  • Keras + TensorFlow: scripts/train_mnist_keras_tf.py
  • PyTorch: scripts/train_mnist_torch.py

Run commands (from repo root):

# Keras + TensorFlow
python scripts/train_mnist_keras_tf.py --epochs 5 --variant fixed --alpha 0.01
python scripts/train_mnist_keras_tf.py --epochs 5 --variant learnable --alpha 0.01

# PyTorch
python scripts/train_mnist_torch.py --epochs 5 --variant fixed --alpha 0.01
python scripts/train_mnist_torch.py --epochs 5 --variant learnable --alpha 0.01

Usage

1) Functional API

import torch
from alrelu_torch import alrelu

x = torch.tensor([-2.0, -1.0, 0.0, 1.0, 2.0])
y = alrelu(x)  # alpha=0.01 by default

2) Fixed module

import torch.nn as nn
from alrelu_torch import ALReLU

model = nn.Sequential(
    nn.Linear(32, 64),
    ALReLU(alpha=0.01),
    nn.Linear(64, 10),
)
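Assuming the module applies the formula above elementwise, a self-contained stand-in (useful for experimenting without installing the package; `ALReLUSketch` is an illustrative name, not the package class) could look like:

```python
import torch
import torch.nn as nn

class ALReLUSketch(nn.Module):
    """Elementwise max(|alpha * x|, x) with a fixed alpha (sketch only)."""
    def __init__(self, alpha: float = 0.01):
        super().__init__()
        self.alpha = alpha

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.maximum(torch.abs(self.alpha * x), x)

# Same shape of model as above, with the sketch in place of ALReLU.
model = nn.Sequential(nn.Linear(32, 64), ALReLUSketch(0.01), nn.Linear(64, 10))
out = model(torch.randn(4, 32))
print(out.shape)  # torch.Size([4, 10])
```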

3) Trainable alpha module

import torch.nn as nn
from alrelu_torch import TrainableALReLU

model = nn.Sequential(
    nn.Linear(32, 64),
    TrainableALReLU(alpha_init=0.01, non_negative=True),
    nn.Linear(64, 10),
)
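The package does not document how `non_negative` is enforced; purely as an illustration, a trainable-alpha variant might register alpha as an `nn.Parameter` and clamp it in the forward pass (the name `TrainableALReLUSketch` and the clamping strategy are assumptions, not the package's actual code):

```python
import torch
import torch.nn as nn

class TrainableALReLUSketch(nn.Module):
    """ALReLU with a learnable alpha; clamping keeps alpha >= 0 (illustrative only)."""
    def __init__(self, alpha_init: float = 0.01, non_negative: bool = True):
        super().__init__()
        self.alpha = nn.Parameter(torch.tensor(float(alpha_init)))
        self.non_negative = non_negative

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        alpha = self.alpha.clamp(min=0.0) if self.non_negative else self.alpha
        return torch.maximum(torch.abs(alpha * x), x)

# alpha shows up in parameters() and receives a gradient on backward.
act = TrainableALReLUSketch(alpha_init=0.01)
act(torch.randn(8)).sum().backward()
print(act.alpha.grad is not None)  # True
```

Because alpha is an `nn.Parameter`, any optimizer built from `model.parameters()` will update it alongside the weights.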

Development

Install dev tools:

pip install -e .[dev]

Run tests:

pytest

Build package:

python -m build

Project details


Download files

Download the file for your platform.

Source Distribution

alrelu_torch-0.1.2.tar.gz (4.5 kB)

Uploaded Source

Built Distribution


alrelu_torch-0.1.2-py3-none-any.whl (4.5 kB)

Uploaded Python 3

File details

Details for the file alrelu_torch-0.1.2.tar.gz.

File metadata

  • Download URL: alrelu_torch-0.1.2.tar.gz
  • Upload date:
  • Size: 4.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for alrelu_torch-0.1.2.tar.gz:

  • SHA256: 401c3f04356144f81d3e309eb2bfd47c27bd1ca16e22c9aff30d75b348cd8987
  • MD5: 9a6e5960cddf97b1d9666da221ca54f6
  • BLAKE2b-256: 8aa04ce2d43490e9ae09b9cfb07fc756d5ae8b59a29262b1a093c0429e7a84f2


Provenance

The following attestation bundles were made for alrelu_torch-0.1.2.tar.gz:

Publisher: publish-torch.yml on MStamatis/ALReLU

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file alrelu_torch-0.1.2-py3-none-any.whl.

File metadata

  • Download URL: alrelu_torch-0.1.2-py3-none-any.whl
  • Upload date:
  • Size: 4.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for alrelu_torch-0.1.2-py3-none-any.whl:

  • SHA256: 701213769077113047566b90fc5c49f965b547e538035c03045aa1b7c043244a
  • MD5: 62d25a45f80e0ef94b72e6788a4e6230
  • BLAKE2b-256: 7a384a9bde5b3d44d60192d5a8d0f8dcaafc2cc4b19c962ac39dc145c41cc0f6


Provenance

The following attestation bundles were made for alrelu_torch-0.1.2-py3-none-any.whl:

Publisher: publish-torch.yml on MStamatis/ALReLU

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
