SPOCU activation function in PyTorch and TensorFlow.
“SPOCU”: scaled polynomial constant unit activation function.
Unofficial PyTorch/TensorFlow implementation of the SPOCU activation function [1], for the case c = ∞.
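For reference, [1] defines SPOCU as s(x) = α·h(x/γ + β) − α·h(β), where h(x) = x³(x⁵ − 2x⁴ + 2) for x ∈ [0, c), h(x) = h(c) for x ≥ c, and h(x) = 0 for x < 0; taking c = ∞ drops the saturation branch. A minimal NumPy sketch of that c = ∞ case (for illustration only, not the packaged implementation):

import numpy as np

def h(x):
    # Polynomial part of SPOCU from [1]; with c = infinity the
    # saturation branch h(x) = h(c) for x >= c never applies.
    return np.where(x >= 0, x**3 * (x**5 - 2 * x**4 + 2), 0.0)

def spocu(x, alpha, beta, gamma):
    # s(x) = alpha * h(x / gamma + beta) - alpha * h(beta)
    return alpha * h(x / gamma + beta) - alpha * h(beta)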
Installation
You can install this package using pip:
python3 -m pip install spocu
PyTorch
It can be included in your network given alpha, beta, and gamma values:
import torch
from spocu.spocu_pytorch import SPOCU
alpha = 3.0937
beta = 0.6653
gamma = 4.437
spocu = SPOCU(alpha, beta, gamma)
x = torch.rand((10,10))
print(spocu(x))
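Assuming the PyTorch SPOCU behaves as a standard torch.nn.Module (consistent with the call above, though not stated explicitly here), it can be dropped into a model like any other activation layer; a hypothetical sketch:

import torch
from torch import nn
from spocu.spocu_pytorch import SPOCU

# Hypothetical two-layer network with SPOCU as the hidden activation.
model = nn.Sequential(
    nn.Linear(10, 32),
    SPOCU(3.0937, 0.6653, 4.437),
    nn.Linear(32, 1),
)
print(model(torch.rand((4, 10))).shape)  # torch.Size([4, 1])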
TensorFlow
import tensorflow as tf
from spocu.spocu_tensorflow import SPOCU
alpha = 3.0937
beta = 0.6653
gamma = 4.437
spocu = SPOCU(alpha, beta, gamma)
X = tf.Variable(tf.random.normal([10, 10], stddev=5, mean=4))
print(spocu(X))
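Assuming the TensorFlow SPOCU is a plain callable on tensors (as the snippet above suggests), one way to use it inside a Keras model is through a Lambda layer; a hypothetical sketch:

import tensorflow as tf
from spocu.spocu_tensorflow import SPOCU

spocu = SPOCU(3.0937, 0.6653, 4.437)

# Hypothetical model applying SPOCU after a dense layer via Lambda.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(32, input_shape=(10,)),
    tf.keras.layers.Lambda(spocu),
    tf.keras.layers.Dense(1),
])
print(model(tf.random.normal([4, 10])).shape)  # (4, 1)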
Tests
See spocu_test for the equivalence of the PyTorch and TensorFlow implementations.
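The core of such a check (a sketch, not the packaged test) is to evaluate both implementations on the same input and compare the outputs numerically:

import numpy as np
import torch
import tensorflow as tf
from spocu.spocu_pytorch import SPOCU as SPOCUTorch
from spocu.spocu_tensorflow import SPOCU as SPOCUTF

alpha, beta, gamma = 3.0937, 0.6653, 4.437
x = np.random.rand(10, 10).astype(np.float32)

# Both outputs should agree up to floating-point tolerance.
y_torch = SPOCUTorch(alpha, beta, gamma)(torch.from_numpy(x)).detach().numpy()
y_tf = SPOCUTF(alpha, beta, gamma)(tf.constant(x)).numpy()
assert np.allclose(y_torch, y_tf, atol=1e-5)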
Citation
If you find this work useful, please cite:
@article{carrillo2021deep,
  title={Deep learning to classify ultra-high-energy cosmic rays by means of PMT signals},
  author={Carrillo-Perez, F and Herrera, LJ and Carceller, JM and Guill{\'e}n, A},
  journal={Neural Computing and Applications},
  pages={1--17},
  year={2021},
  publisher={Springer}
}
Acknowledgements
Thanks to Atilla Ozgur, the author of the TensorFlow version.
Bibliography
[1] Kiseľák, J., Lu, Y., Švihra, J. et al. “SPOCU”: scaled polynomial constant unit activation function. Neural Comput & Applic (2020). https://doi.org/10.1007/s00521-020-05182-1