
Activation Functions for TensorFlow

Project description



ActTensor: Activation Functions for TensorFlow


What is it?

ActTensor is a Python package that provides state-of-the-art activation functions, making them easy and fast to use in deep-learning projects.

Why not use tf.keras.activations?

As you may know, TensorFlow provides only a handful of built-in activation functions and, most importantly, does not include newly introduced ones. Writing your own takes time and energy; this package offers most of the widely used, and even state-of-the-art, activation functions, ready to use in your models.

Requirements

numpy
tensorflow
setuptools
keras
wheel

Where to get it?

The source code is currently hosted on GitHub at: https://github.com/pouyaardehkhani/ActTensor

Binary installers for the latest released version are available at the Python Package Index (PyPI).

# PyPI
pip install ActTensor-tf
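
As an optional quick check that the installation succeeded (a minimal sketch; it only imports one of the classes shown later on this page):

# run in a Python shell after installing
from ActTensor_tf import ReLU
print(ReLU)  # should print the ReLU activation-layer class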

License

MIT

How to use?

import tensorflow as tf
import numpy as np
from ActTensor_tf import ReLU # name of the layer

Functional API

inputs = tf.keras.layers.Input(shape=(28,28))
x = tf.keras.layers.Flatten()(inputs)
x = tf.keras.layers.Dense(128)(x)
# apply the imported activation class as a layer
x = ReLU()(x)
output = tf.keras.layers.Dense(10, activation='softmax')(x)

model = tf.keras.models.Model(inputs=inputs, outputs=output)

Sequential API

model = tf.keras.models.Sequential([tf.keras.layers.Flatten(input_shape=(28,28)),
                                    tf.keras.layers.Dense(128),
                                    # apply the imported activation class as a layer
                                    ReLU(),
                                    tf.keras.layers.Dense(10, activation=tf.nn.softmax)])
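
Either model can then be compiled and trained as usual; a minimal, illustrative sketch (assuming the MNIST digits from tf.keras.datasets, with arbitrary training settings):

(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0  # scale pixels to [0, 1]

model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
model.fit(x_train, y_train, epochs=5, validation_data=(x_test, y_test))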

NOTE:

The functional form of each activation layer is also available, but it may be exposed under a different name. Check the table below for more information.

from ActTensor_tf import relu
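
Assuming the functional forms are plain tensor-in, tensor-out functions (which the layer/function split in the table below suggests), a minimal sketch of applying one directly:

import tensorflow as tf
from ActTensor_tf import relu

x = tf.constant([-2.0, -1.0, 0.0, 1.0, 2.0])
print(relu(x))  # expected (by the standard ReLU definition): [0. 0. 0. 1. 2.]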

Activations

The following classes and functions are available in ActTensor_tf:

Activation Name | Class Name | Function Name
SoftShrink | SoftShrink | softSHRINK
HardShrink | HardShrink | hard_shrink
GLU | GLU | -
Bilinear | Bilinear | -
ReGLU | ReGLU | -
GeGLU | GeGLU | -
SwiGLU | SwiGLU | -
SeGLU | SeGLU | -
ReLU | ReLU | relu
Identity | Identity | identity
Step | Step | step
Sigmoid | Sigmoid | sigmoid
HardSigmoid | HardSigmoid | hard_sigmoid
LogSigmoid | LogSigmoid | log_sigmoid
SiLU | SiLU | silu
PLinear | ParametricLinear | parametric_linear
Piecewise-Linear | PiecewiseLinear | piecewise_linear
Complementary Log-Log | CLL | cll
Bipolar | Bipolar | bipolar
Bipolar-Sigmoid | BipolarSigmoid | bipolar_sigmoid
Tanh | Tanh | tanh
TanhShrink | TanhShrink | tanhshrink
LeCun's Tanh | LeCunTanh | leCun_tanh
HardTanh | HardTanh | hard_tanh
TanhExp | TanhExp | tanh_exp
Absolute | ABS | Abs
Squared-ReLU | SquaredReLU | squared_relu
P-ReLU | ParametricReLU | Parametric_ReLU
R-ReLU | RandomizedReLU | Randomized_ReLU
LeakyReLU | LeakyReLU | leaky_ReLU
ReLU6 | ReLU6 | relu6
Mod-ReLU | ModReLU | Mod_ReLU
Cosine-ReLU | CosReLU | Cos_ReLU
Sin-ReLU | SinReLU | Sin_ReLU
Probit | Probit | probit
Cos | Cos | Cosine
Gaussian | Gaussian | gaussian
Multiquadratic | Multiquadratic | Multi_quadratic
Inverse-Multiquadratic | InvMultiquadratic | Inv_Multi_quadratic
SoftPlus | SoftPlus | softPlus
Mish | Mish | mish
SMish | Smish | smish
P-SMish | ParametricSmish | Parametric_Smish
Swish | Swish | swish
ESwish | ESwish | eswish
HardSwish | HardSwish | hardSwish
GCU | GCU | gcu
CoLU | CoLU | colu
PELU | PELU | pelu
SELU | SELU | selu
CELU | CELU | celu
ArcTan | ArcTan | arcTan
Shifted-SoftPlus | ShiftedSoftPlus | Shifted_SoftPlus
Softmax | Softmax | softmax
Logit | Logit | logit
GELU | GELU | gelu
Softsign | Softsign | softsign
ELiSH | ELiSH | elish
HardELiSH | HardELiSH | hardELiSH
Serf | Serf | serf
ELU | ELU | elu
Phish | Phish | phish
QReLU | QReLU | qrelu
MQReLU | MQReLU | mqrelu
FReLU | FReLU | frelu

Which activation functions does it support?

  1. Soft Shrink:

  2. Hard Shrink:

  3. GLU:

  4. Bilinear:

  5. ReGLU:

    ReGLU is a variant of GLU.

  6. GeGLU:

    GeGLU is a variant of GLU.

  7. SwiGLU:

    SwiGLU is a variant of GLU.

  8. SeGLU:

    SeGLU is a variant of GLU.

  9. ReLU:

  10. Identity:

    $f(x) = x$

  11. Step:

  12. Sigmoid:

  13. Hard Sigmoid:

  14. Log Sigmoid:

  15. SiLU:

  16. ParametricLinear:

    $f(x) = a*x$

  17. PiecewiseLinear:

    Choose some xmin and xmax to define a range: everything below xmin maps to 0, everything above xmax maps to 1, and values in between are linearly interpolated (see the sketch after this list).

  18. Complementary Log-Log (CLL):

  19. Bipolar:

  20. Bipolar Sigmoid:

  21. Tanh:

  22. Tanh Shrink:

  23. LeCunTanh:

  24. Hard Tanh:

  25. TanhExp:

  26. ABS:

  27. SquaredReLU:

  28. ParametricReLU (PReLU):

  29. RandomizedReLU (RReLU):

  30. LeakyReLU:

  31. ReLU6:

  32. ModReLU:

  33. CosReLU:

  34. SinReLU:

  35. Probit:

  36. Cosine:

  37. Gaussian:

  38. Multiquadratic:

    Choose some point (x, y).

  39. InvMultiquadratic:

  40. SoftPlus:

  41. Mish:

  42. Smish:

  43. ParametricSmish (PSmish):

  44. Swish:

  45. ESwish:

  46. Hard Swish:

  47. GCU:

  48. CoLU:

  49. PELU:

  50. SELU:

    $f(x) = \lambda x$ if $x > 0$, and $f(x) = \lambda \alpha (e^{x} - 1)$ if $x \leq 0$, where $\alpha \approx 1.6733$ & $\lambda \approx 1.0507$

  51. CELU:

  52. ArcTan:

  53. ShiftedSoftPlus:

  54. Softmax:

  55. Logit:

  56. GELU:

  57. Softsign:

  58. ELiSH:

  59. Hard ELiSH:

  60. Serf:

  61. ELU:

  62. Phish:

  63. QReLU:

  64. m-QReLU:

  65. FReLU:
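
As referenced in the Piecewise-Linear item above, here is a minimal sketch of the behavior it describes (an illustration of the definition with arbitrary example values for xmin and xmax, not necessarily the package's piecewise_linear implementation):

import tensorflow as tf

def piecewise_linear_sketch(x, xmin=-1.0, xmax=1.0):
    # 0 below xmin, 1 above xmax, linear interpolation in between
    return tf.clip_by_value((x - xmin) / (xmax - xmin), 0.0, 1.0)

x = tf.constant([-2.0, -0.5, 0.0, 0.5, 2.0])
print(piecewise_linear_sketch(x))  # [0.   0.25 0.5  0.75 1.  ]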

Cite this repository

@software{Pouya_ActTensor_2022,
  author = {Pouya, Ardehkhani and Pegah, Ardehkhani},
  license = {MIT},
  month = {7},
  title = {{ActTensor}},
  url = {https://github.com/pouyaardehkhani/ActTensor},
  version = {1.0.0},
  year = {2022}
}

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

ActTensor_tf-1.0.tar.gz (33.5 kB)

Uploaded Source

File details

Details for the file ActTensor_tf-1.0.tar.gz.

File metadata

  • Download URL: ActTensor_tf-1.0.tar.gz
  • Upload date:
  • Size: 33.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.8.2

File hashes

Hashes for ActTensor_tf-1.0.tar.gz
Algorithm Hash digest
SHA256 516273f0c044c5665efa1eda84e9f7a766acba35a56ac99724bf5544861f62c6
MD5 acefeaac10473d769f000aa591c08a9f
BLAKE2b-256 4d405dc7d4d31a39178166cbb10eb0989c3ee7141b1b265909e00cc03cfc5623

See more details on using hashes here.
