A PyTorch implementation of the APTx activation function.
APTx Activation Function - Implementation in PyTorch
This repository offers a Python package for the PyTorch implementation of the APTx activation function, as introduced in the paper "APTx: Better Activation Function than MISH, SWISH, and ReLU's Variants used in Deep Learning."
APTx (Alpha Plus Tanh Times) is a novel activation function designed for computational efficiency in deep learning. It enhances training performance and inference speed, making it particularly suitable for low-end hardware such as IoT devices. Notably, APTx provides flexibility by allowing users to either use its default parameter values or optimize them as trainable parameters during training.
Paper Title: APTx: Better Activation Function than MISH, SWISH, and ReLU's Variants used in Deep Learning
Author: Ravin Kumar
Publication: 5th July, 2022
DOI: 10.51483/IJAIML.2.2.2022.56-61
Other Sources:
- arXiv.org
- ResearchGate, ResearchGate - Preprint
- OSF.io - version 3, OSF.io - version 2, OSF.io - version 1
- SSRN
- Internet Archive, Internet Archive - Preprint
- Medium.com
GitHub Repositories:
- GitHub Repository (Python Package - PyTorch Implementation): Python Package
Cite Paper as:
Ravin Kumar (2022). APTx: Better Activation Function than MISH, SWISH, and ReLU’s Variants used in Deep Learning. International Journal of Artificial Intelligence and Machine Learning, 2(2), 56-61. doi: 10.51483/IJAIML.2.2.2022.56-61
Mathematical Definition
The APTx activation function is defined as:
APTx(x) = (α + tanh(β * x)) * γ * x
where:
- α controls the baseline shift (default: 1.0)
- β scales the input inside the tanh function (default: 1.0)
- γ controls the output amplitude (default: 0.5)
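As a quick illustration of the formula itself, it can be evaluated directly in plain Python with the default parameters (this is a standalone sketch, not the package's implementation):

```python
import math

def aptx(x, alpha=1.0, beta=1.0, gamma=0.5):
    # APTx(x) = (alpha + tanh(beta * x)) * gamma * x
    return (alpha + math.tanh(beta * x)) * gamma * x

print(aptx(0.0))  # 0.0 -- the gamma * x factor zeroes the output at x = 0
print(aptx(1.0))  # (1 + tanh(1)) * 0.5 ≈ 0.8808
```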
With α = 1, β = ½, and γ = ½, the negative-domain part of APTx closely matches the negative part of MISH; with α = 1, β = 1, and γ = ½, the positive-domain part of APTx closely matches the positive part of MISH.
So, to closely approximate the MISH activation function, we can use α = 1, β = ½, and γ = ½ on the negative domain, and α = 1, β = 1, and γ = ½ on the positive domain.
Interestingly, APTx with parameters α = 1, β = ½, and γ = ½ exactly replicates the SWISH(x, 1) activation function, and APTx with α = 1, β = 1, and γ = ½ exactly replicates SWISH(x, 2). In general, APTx generates the SWISH(x, ρ) activation function with parameters α = 1, β = ρ/2, and γ = ½.
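The SWISH relationship follows from the identity tanh(z/2) = 2·sigmoid(z) − 1, which makes (1 + tanh(βx))·x/2 equal to x·sigmoid(2βx). A quick numerical check in plain Python (these helper names are illustrative, not the package's API):

```python
import math

def aptx(x, alpha=1.0, beta=1.0, gamma=0.5):
    # APTx(x) = (alpha + tanh(beta * x)) * gamma * x
    return (alpha + math.tanh(beta * x)) * gamma * x

def swish(x, rho):
    # SWISH(x, rho) = x * sigmoid(rho * x)
    return x / (1.0 + math.exp(-rho * x))

# APTx with alpha = 1, beta = rho/2, gamma = 0.5 matches SWISH(x, rho)
for x in (-2.0, -0.5, 0.0, 0.5, 2.0):
    assert abs(aptx(x, beta=0.5) - swish(x, 1.0)) < 1e-12
    assert abs(aptx(x, beta=1.0) - swish(x, 2.0)) < 1e-12
```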
Furthermore, choosing α = 1, β ≈ 10⁶, and γ = ½ yields a close approximation of the ReLU activation function. The approximation improves as β increases, converging to ReLU in the limit β → ∞.
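This limiting behaviour is easy to verify numerically: for large β, tanh(βx) saturates to ±1, so (1 ± 1)·x/2 gives x or 0. A plain-Python sketch (the helper names are illustrative):

```python
import math

def aptx(x, alpha=1.0, beta=1.0, gamma=0.5):
    # APTx(x) = (alpha + tanh(beta * x)) * gamma * x
    return (alpha + math.tanh(beta * x)) * gamma * x

def relu(x):
    return max(0.0, x)

# With beta = 1e6, tanh saturates and APTx is numerically
# indistinguishable from ReLU at these points
for x in (-3.0, -0.01, 0.0, 0.01, 3.0):
    assert abs(aptx(x, beta=1e6) - relu(x)) < 1e-9
```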
📥 Installation
pip install aptx_activation
or,
pip install git+https://github.com/mr-ravin/aptx_activation.git
📌 Dependencies:
- Python >= 3.7
- PyTorch >= 1.8.0
Usage
1. APTx with parameter values
On Default Device:
import torch
from aptx_activation import APTx
# Example Usage
aptx = APTx(alpha=1.0, beta=1.0, gamma=0.5) # default values in APTx
tensor = torch.randn(5)
output = aptx(tensor)
print(output)
On GPU Device:
import torch
from aptx_activation import APTx
# Example Usage
aptx = APTx(alpha=1.0, beta=1.0, gamma=0.5).to("cuda") # default values in APTx
tensor = torch.randn(5).to("cuda")
output = aptx(tensor)
print(output)
2. APTx with parameters - Similar to SWISH
import torch
from aptx_activation import APTx
# Example Usage
aptx = APTx(alpha=1.0, beta=0.5, gamma=0.5) # Behaves like SWISH(x, 1)
tensor = torch.randn(5)
output = aptx(tensor)
print(output)
3. APTx with trainable parameters
APTx allows for trainable parameters to adapt dynamically during training:
from aptx_activation import APTx
aptx = APTx(trainable=True) # Learnable α, β, and γ
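Conceptually, trainable mode amounts to registering α, β, and γ as `nn.Parameter` objects so an optimizer updates them alongside the model's weights. A minimal, self-contained sketch of that idea (an assumption about the mechanism, not the package's actual source):

```python
import torch
import torch.nn as nn

class TrainableAPTx(nn.Module):
    """Sketch: APTx with learnable alpha, beta, and gamma."""
    def __init__(self, alpha=1.0, beta=1.0, gamma=0.5):
        super().__init__()
        # Registering the constants as Parameters makes them trainable
        self.alpha = nn.Parameter(torch.tensor(alpha))
        self.beta = nn.Parameter(torch.tensor(beta))
        self.gamma = nn.Parameter(torch.tensor(gamma))

    def forward(self, x):
        return (self.alpha + torch.tanh(self.beta * x)) * self.gamma * x

# One gradient step nudges alpha, beta, gamma like any other weight
act = TrainableAPTx()
opt = torch.optim.SGD(act.parameters(), lr=0.1)
loss = act(torch.randn(8)).pow(2).mean()
loss.backward()
opt.step()
print([p.item() for p in act.parameters()])
```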
Key Benefits of APTx
- Efficient Computation: Requires fewer mathematical operations compared to MISH and SWISH.
- Faster Training: The reduced complexity speeds up both forward and backward propagation.
- Lower Hardware Requirements: Optimized for edge devices and low-end computing hardware.
- Parameter Flexibility - SWISH:
- By setting α = 1, β = 0.5, and γ = 0.5, APTx exactly replicates the SWISH(x, 1) activation function.
- By setting α = 1, β = 1, and γ = 0.5, APTx exactly replicates the SWISH(x, 2) activation function.
- Parameter Flexibility - MISH:
- By setting α = 1, β = 0.5, and γ = 0.5, APTx closely replicates the negative-domain part of the MISH activation function.
- By setting α = 1, β = 1, and γ = 0.5, APTx closely replicates the positive-domain part of the MISH activation function.
- Parameter Flexibility - ReLU:
- Setting α = 1 and γ = 0.5 gives an approximation of ReLU that improves as β increases, converging to ReLU in the limit β → ∞. In practice, α = 1, β ≈ 10⁶, and γ = 0.5 already produces a close approximation of ReLU.
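The computational saving is visible in the formulas themselves: MISH(x) = x·tanh(softplus(x)) needs exp, log, and tanh per element, while APTx needs only a single tanh. A plain-Python sketch comparing the two piecewise parameterizations against MISH (the 0.05 tolerance is an illustrative assumption, not a bound from the paper):

```python
import math

def mish(x):
    # MISH(x) = x * tanh(ln(1 + e^x)): exp, log, and tanh per element
    return x * math.tanh(math.log1p(math.exp(x)))

def aptx(x, alpha=1.0, beta=1.0, gamma=0.5):
    # APTx needs only one tanh per element
    return (alpha + math.tanh(beta * x)) * gamma * x

# beta = 1 on the positive domain, beta = 0.5 on the negative domain
for x in (0.5, 1.0, 2.0, 5.0):
    assert abs(aptx(x, beta=1.0) - mish(x)) < 0.05
for x in (-0.5, -1.0, -2.0):
    assert abs(aptx(x, beta=0.5) - mish(x)) < 0.05
```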
Comparison of APTx with MISH, SWISH, and ReLU's variants
- SWISH generally outperforms ReLU (and its variants) in deeper networks because it is smooth and non-monotonic, allowing better gradient flow.
- MISH vs SWISH:
- MISH is smoother than SWISH, helping gradient flow.
- MISH retains more information for negative input values.
- MISH requires more computation.
- APTx offers similar performance to MISH but with significantly lower computation costs, making it ideal for resource-constrained environments.
Conclusion
MISH has similar or even better performance than SWISH, which in turn outperforms most other activation functions. The proposed APTx activation function behaves similarly to MISH but requires fewer mathematical operations to compute values in forward propagation and derivatives in backward propagation. This allows APTx to train neural networks faster and to run inference on low-end computing hardware, such as edge devices in Internet of Things deployments. Interestingly, APTx can also generate the SWISH(x, ρ) activation function with parameters α = 1, β = ρ/2, and γ = ½. Furthermore, choosing α = 1, β ≈ 10⁶, and γ = ½ yields a close approximation of the ReLU activation function.
📜 Copyright License
Copyright (c) 2025 Ravin Kumar
Website: https://mr-ravin.github.io
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation
files (the “Software”), to deal in the Software without restriction, including without limitation the rights to use, copy,
modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the
Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the
Software.
THE SOFTWARE IS PROVIDED “AS IS”, WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE
WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR
COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE,
ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
File details
Details for the file aptx_activation-0.0.6.tar.gz.
File metadata
- Download URL: aptx_activation-0.0.6.tar.gz
- Upload date:
- Size: 5.6 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.12.11
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | bc8715e5de5f7bb40d8333e76a29288f6b4610107aca18887115cbc969a44866 |
| MD5 | 1a900ad7058163091295847ea449faea |
| BLAKE2b-256 | 61ee4fac1eff51ed707f96f7957f49b27ca37db2708a15df88519ce016d25d7a |
File details
Details for the file aptx_activation-0.0.6-py3-none-any.whl.
File metadata
- Download URL: aptx_activation-0.0.6-py3-none-any.whl
- Upload date:
- Size: 5.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.12.11
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 8ec6845a1be9c09079a596071ee48d499724e960de89ebf2663d5f5d7d14bbf0 |
| MD5 | b79c0716590237ce02a1a4119191fa4d |
| BLAKE2b-256 | d82c59cb519477c233983ea9b15e75519b130e2dec2c4db41342bfcc9dc9d6f3 |