
A Python package providing a direct implementation of ReLU neural networks.


ReLU neural network

rectified linear activation function

What is ReLU?

ReLU is defined as g(x) = max(0, x): it outputs 0 when x is negative and x itself when x is positive. Because it saturates only for negative inputs (unlike sigmoid, which saturates at both ends), ReLU networks are easy to train and drive the cost function down far more quickly than sigmoid-based ones.
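A minimal NumPy sketch of the activation and its derivative (the function names here are illustrative, not part of this package's API):

import numpy as np

def relu(x):
    # g(x) = max(0, x), applied elementwise
    return np.maximum(0, x)

def relu_derivative(x):
    # gradient is 1 for x > 0 and 0 elsewhere, so positive activations never saturate
    return (x > 0).astype(float)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))             # [0.  0.  0.  1.5 3. ]
print(relu_derivative(x))  # [0. 0. 0. 1. 1.]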

[Figure: activation functions]

[Figure: ReLU activation function]

direct implementation of ReLU neural networks

install

pip install ReLUs

or

pip3 install ReLUs

parameters for the model to train

layer_sizes    (e.g. layer_sizes=[13,5,5,1])
num_iters      (e.g. num_iters=1000)
learning_rate  (e.g. learning_rate=0.03)

training the model

model_name = model(X_train, Y_train, layer_sizes, num_iters, learning_rate)
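A hypothetical end-to-end run, assuming the data is laid out with one column per example (13 features to match layer_sizes=[13,5,5,1]) and binary labels; the toy data and the import path are assumptions for illustration, not part of the documented API:

import numpy as np
# Import path is an assumption; adjust to how the installed package exposes model/compute_accuracy/predict.
from ReLUs import model, compute_accuracy, predict

# Toy data: 13 features per example, examples as columns (shape: features x examples).
rng = np.random.default_rng(0)
X = rng.standard_normal((13, 200))
Y = (X.sum(axis=0, keepdims=True) > 0).astype(int)   # binary labels, shape (1, 200)

X_train, X_test = X[:, :150], X[:, 150:]
Y_train, Y_test = Y[:, :150], Y[:, 150:]

layer_sizes = [13, 5, 5, 1]   # 13 inputs, two hidden layers of 5 units, 1 output
num_iters = 1000
learning_rate = 0.03

model_name = model(X_train, Y_train, layer_sizes, num_iters, learning_rate)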

train and test accuracy

train_acc, test_acc = compute_accuracy(X_train, X_test, Y_train, Y_test, model_name)
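Accuracy on a binary task can also be checked by hand from the model's outputs; this sketch assumes predict returns 0/1 predictions with the same shape as the labels:

# Manual check (illustrative): fraction of examples whose prediction matches the label.
train_preds = predict(X_train, model_name)
test_preds = predict(X_test, model_name)
print("train accuracy:", np.mean(train_preds == Y_train))
print("test accuracy:", np.mean(test_preds == Y_test))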

making predictions

predict(X_train, model_name)

REFERENCES

https://machinelearningmastery.com/rectified-linear-activation-function-for-deep-learning-neural-networks/

https://en.wikipedia.org/wiki/Rectifier_(neural_networks)

https://www.kaggle.com/dansbecker/rectified-linear-units-relu-in-deep-learning
