
A custom non-monotonic activation function

Project description

lili_activation

Description

lili_activation is a custom activation function package for the TensorFlow and Keras frameworks. It introduces fnm3, a sine-based activation function that offers an alternative to traditional activations such as ReLU or sigmoid. fnm3 is aimed at scenarios where traditional activation functions may not capture complex patterns effectively.
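The package ships fnm3 as a ready-made callable, but the general pattern is the same as any custom Keras activation: define an elementwise TensorFlow function and pass it to a layer. The sketch below is only an illustration of that pattern; the formula x + sin(x) is a hypothetical placeholder, not the actual definition of fnm3.

import tensorflow as tf

def sine_based_activation(x):
    # Hypothetical sine-based transform for illustration only;
    # this is NOT the actual definition of fnm3.
    return x + tf.sin(x)

# Any elementwise function like this can be used as a Keras activation:
layer = tf.keras.layers.Dense(8, activation=sine_based_activation)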

Installation

Install lili_activation directly from PyPI using pip:

pip install lili_activation

Ensure that pip is up to date and that TensorFlow is installed in your environment, as lili_activation depends on TensorFlow.

Usage

To use fnm3 in your Keras model, follow these steps:

import tensorflow as tf
from lili_activation import fnm3

# Simple model example using fnm3 as the activation function
model = tf.keras.Sequential([
    tf.keras.layers.Dense(10, input_shape=(10,), activation=fnm3),
    tf.keras.layers.Dense(1, activation='sigmoid')
])

model.compile(optimizer='adam', loss='binary_crossentropy')
model.summary()
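As a quick check that the model above trains end to end, you can fit it on random placeholder data; the shapes below are illustrative and simply match the input_shape=(10,) used in the example.

import numpy as np

# Random placeholder data matching input_shape=(10,); illustration only
x_train = np.random.rand(64, 10).astype('float32')
y_train = np.random.randint(0, 2, size=(64, 1)).astype('float32')

model.fit(x_train, y_train, epochs=3, batch_size=16)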

Features

Non-monotonic: Introduces controlled non-monotonic behavior that may be better suited to scenarios where the relationships in the input data are complex (a quick numerical check is sketched after this list).

Innovative: Explores new avenues in activation functions that could prove beneficial in certain types of neural networks.
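One way to observe the non-monotonic behavior numerically is to sample fnm3 on a grid and look for sign changes in its finite differences. The sketch below assumes fnm3 applies elementwise to a 1-D tensor, as its use as a layer activation above suggests.

import numpy as np
import tensorflow as tf
from lili_activation import fnm3

# Sample the activation on a grid (assumes elementwise application)
x = tf.linspace(-5.0, 5.0, 1001)
y = np.asarray(fnm3(x))

# A monotonic function has finite differences of a single sign;
# both positive and negative differences indicate non-monotonicity.
diffs = np.diff(y)
print("non-monotonic:", bool(np.any(diffs > 0) and np.any(diffs < 0)))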

Contributions

Contributions are always welcome. If you have ideas for improvements or extensions, please feel free to create an issue or pull request.

License

This project is licensed under the MIT License - see the LICENSE file for details.

Contact

Lili Chen - lilichen577@gmail.com

Acknowledgements

Special thanks to the TensorFlow and Keras community for providing an excellent platform for the experimentation and development of new ideas in machine learning.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

lili_activation-0.1.2.tar.gz (3.2 kB)

Uploaded Source

Built Distribution

lili_activation-0.1.2-py3-none-any.whl (3.5 kB)

Uploaded Python 3

File details

Details for the file lili_activation-0.1.2.tar.gz.

File metadata

  • Download URL: lili_activation-0.1.2.tar.gz
  • Upload date:
  • Size: 3.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.10.12

File hashes

Hashes for lili_activation-0.1.2.tar.gz
Algorithm Hash digest
SHA256 f675afde8a0ae3574627c959f814484d1aebdd38346c6dbaea3edfa5b762983a
MD5 fabd93ae9f834cd911f4ccc59f264948
BLAKE2b-256 3384ede99f0fb4d1735345859d4ab77c8a8feb4dbc25dbf50509848af2cd25f5

See more details on using hashes here.

File details

Details for the file lili_activation-0.1.2-py3-none-any.whl.

File metadata

File hashes

Hashes for lili_activation-0.1.2-py3-none-any.whl
Algorithm Hash digest
SHA256 13673ec85169211eb3da5c709e85f944072db7303a8830992a0d35ce11ddecd1
MD5 48dbda2d4e051f834b457aeedfb9bf19
BLAKE2b-256 666a3ee06a46bc89d765737a7b94698a109a0a03f1b91781027be344ea2e7620

See more details on using hashes here.
