A Python package for a direct implementation of ReLU neural networks.
ReLU neural network
[Image: the rectified linear activation function]
What is ReLU?
ReLU is defined as g(x) = max(0, x): it is 0 when x is negative and equal to x when x is positive. Because it saturates only on the negative side (unlike sigmoid, which saturates at both extremes), its gradient does not vanish for positive inputs, so ReLU networks are typically easier to train and decrease the cost function far more quickly than sigmoid-based ones.
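As a quick illustration (a minimal NumPy sketch, not the package's internal implementation), ReLU and its derivative can be written as:

import numpy as np

def relu(x):
    # g(x) = max(0, x), applied element-wise
    return np.maximum(0, x)

def relu_derivative(x):
    # gradient is 1 for positive inputs and 0 otherwise
    # (the value at x == 0 is conventionally taken to be 0)
    return (x > 0).astype(float)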
[Image: activation functions]
[Image: the ReLU activation function]
[Image: direct implementation of ReLU neural networks]
install
pip install ReLUs
or
pip3 install ReLUs
parameters for training the model
layer_sizes (e.g. layer_sizes=[13,5,5,1])
num_iters (e.g. num_iters=1000)
learning_rate (e.g. learning_rate=0.03)
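The example layer_sizes=[13,5,5,1] appears to describe a network with 13 input features, two hidden layers of 5 units each, and a single output unit. A typical configuration (the values here are purely illustrative) could look like:

layer_sizes   = [13, 5, 5, 1]   # input size, two hidden layers, output size
num_iters     = 1000            # number of training iterations
learning_rate = 0.03            # gradient descent step size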
training the model
model_name = model(X_train, Y_train, layer_sizes, num_iters, learning_rate)
train and test accuracy
train_acc, test_acc = compute_accuracy(X_train, X_test, Y_train, Y_test, model_name)
making predictions
predict(X_train, model_name)
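Putting the steps together, a minimal end-to-end sketch might look like the following. It assumes model, compute_accuracy and predict have been imported from the package (the exact import path is not shown above) and that the X/Y arrays are laid out the way the package expects:

# load or build your own dataset here; shapes and layout depend on the package
X_train, X_test, Y_train, Y_test = load_my_data()   # hypothetical helper

layer_sizes   = [13, 5, 5, 1]
num_iters     = 1000
learning_rate = 0.03

# train the network
model_name = model(X_train, Y_train, layer_sizes, num_iters, learning_rate)

# evaluate on both splits
train_acc, test_acc = compute_accuracy(X_train, X_test, Y_train, Y_test, model_name)
print("train accuracy:", train_acc, "test accuracy:", test_acc)

# predict on unseen data
predictions = predict(X_test, model_name)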
REFERENCES
https://en.wikipedia.org/wiki/Rectifier_(neural_networks)
https://www.kaggle.com/dansbecker/rectified-linear-units-relu-in-deep-learning