
Project description

Activations Plus

Activations Plus is a Python package that provides a collection of advanced activation functions for machine learning and deep learning models. These functions aim to improve neural network performance by addressing challenges such as sparsity, non-linearity, and gradient flow.

Features

  • Entmax: Sparse activation function for probabilistic models.
  • Sparsemax: Sparse alternative to softmax.
  • Bent Identity: Smooth approximation of the identity function. (Experimental; requires review.)
  • ELiSH (Exponential Linear Sigmoid SquasHing): Combines exponential and linear properties. (Experimental; requires review.)
  • Maxout: Learns piecewise linear functions. (Experimental; requires review.)
  • Soft Clipping: Smoothly clips values to a range. (Experimental; requires review.)
  • SReLU (S-shaped Rectified Linear Unit): Combines linear and non-linear properties. (Experimental; requires review.)
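
Of the features above, Maxout stands out because it is not a fixed elementwise function but a small learned layer: it computes k affine maps of the input and takes an elementwise maximum. A minimal NumPy sketch of the idea — names, shapes, and the standalone-function form are illustrative only, not the activations-plus API:

```python
import numpy as np

rng = np.random.default_rng(0)
k, d_in, d_out = 3, 4, 2
W = rng.normal(size=(k, d_out, d_in))   # k weight matrices, one per affine piece
b = rng.normal(size=(k, d_out))         # k bias vectors

def maxout(x):
    pieces = W @ x + b          # shape (k, d_out): one row per affine piece
    return pieces.max(axis=0)   # elementwise max across the k pieces

y = maxout(rng.normal(size=d_in))   # vector of length d_out
```

Because the max is taken over learned linear pieces, a maxout unit can approximate arbitrary convex activation shapes.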

Installation

To install the package, use pip:

pip install activations-plus

Usage

Import and use any activation function in your PyTorch models:

import torch
from activations_plus.sparsemax import Sparsemax
from activations_plus.entmax import Entmax

# Example with Sparsemax
sparsemax = Sparsemax()
x = torch.tensor([[1.0, 2.0, 3.0], [1.0, 2.0, -1.0]])
output_sparsemax = sparsemax(x)
print("Sparsemax Output:", output_sparsemax)

# Example with Entmax
entmax = Entmax(alpha=1.5)
output_entmax = entmax(x)
print("Entmax Output:", output_entmax)

These examples demonstrate how to use Sparsemax and Entmax activation functions in PyTorch models.
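
To see what a sparse output looks like without installing anything, the standard sparsemax computation (Martins & Astudillo, 2016) can be written in a few lines of NumPy — this is a framework-agnostic reference for sanity-checking, not necessarily how the package implements it internally:

```python
import numpy as np

def sparsemax_ref(z):
    """Reference sparsemax over the last axis of a batch of logits."""
    z = np.asarray(z, dtype=float)
    z_sorted = np.sort(z, axis=-1)[..., ::-1]      # sort each row descending
    k = np.arange(1, z.shape[-1] + 1)
    cumsum = np.cumsum(z_sorted, axis=-1)
    # support size: the largest k with 1 + k * z_(k) > cumulative sum
    support = (1 + k * z_sorted) > cumsum
    k_z = support.sum(axis=-1, keepdims=True)
    # threshold tau so that the positive parts sum to 1
    tau = (np.take_along_axis(cumsum, k_z - 1, axis=-1) - 1) / k_z
    return np.maximum(z - tau, 0.0)
```

Unlike softmax, the rows still sum to 1 but can contain exact zeros: for the input `[1.0, 2.0, 3.0]` the output is `[0.0, 0.0, 1.0]`.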

Documentation

Comprehensive documentation is available on the project's documentation site.

Supported Activation Functions

  1. Entmax: Sparse activation function for probabilistic models. Reference Paper
  2. Sparsemax: Sparse alternative to softmax for probabilistic outputs. Reference Paper
  3. Bent Identity: A smooth approximation of the identity function. (Experimental; requires review. Reference missing.)
  4. ELiSH: Combines exponential and linear properties for better gradient flow. (Experimental; requires review.) Reference Paper
  5. Maxout: Learns piecewise linear functions for better expressiveness. (Experimental; requires review.) Reference Paper
  6. Soft Clipping: Smoothly clips values to a range to avoid extreme outputs. (Experimental; requires review.) Reference Paper
  7. SReLU: Combines linear and non-linear properties for better flexibility. (Experimental; requires review.) Reference Paper

Contributing

Contributions are welcome! Please read the CONTRIBUTING.md file for guidelines.

Testing

To run the tests, use the following command:

pytest tests/

License

This project is licensed under the MIT License. See the LICENSE file for details.

Acknowledgments

Special thanks to the contributors and the open-source community for their support.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

activations_plus-0.1.1.tar.gz (10.4 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

activations_plus-0.1.1-py3-none-any.whl (15.9 kB)

Uploaded Python 3

File details

Details for the file activations_plus-0.1.1.tar.gz.

File metadata

  • Download URL: activations_plus-0.1.1.tar.gz
  • Upload date:
  • Size: 10.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.6.14

File hashes

Hashes for activations_plus-0.1.1.tar.gz
  • SHA256: 35bf8cf8da08da571b5c065e980bcf3bb91e5529499045c1cee7c0dbd84bfaff
  • MD5: d397f14fc5e6f82f17d2f6b0c7e593ca
  • BLAKE2b-256: e3e24be2a801e73a0ee6364fd827ff4e727aee1a76c970cb07d502093fbd531f

See more details on using hashes here.

File details

Details for the file activations_plus-0.1.1-py3-none-any.whl.

File metadata

File hashes

Hashes for activations_plus-0.1.1-py3-none-any.whl
  • SHA256: 96e43974052dc0dd809732883cd2b0a7954cf49132a8517795146a56a59dcb62
  • MD5: 0059ad3d79d1af208627ca415b4033b5
  • BLAKE2b-256: f41fff0932d2cb8d6a799d3afced643a2aea1a9106e1fe5f4df9748e1dfefcf5

See more details on using hashes here.
