A Python package for direct implementation of ReLU neural networks.
ReLU neural network
[Figure: the rectified linear activation function]
What is ReLU?
ReLU is defined as g(x) = max(0, x): it outputs 0 when x is negative and x itself when x is positive. Because it saturates only on the negative side (unlike sigmoid, which saturates at both extremes), gradients stay large for positive inputs, so networks with ReLU units are highly trainable and typically decrease the cost function far more quickly than sigmoid networks.
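For reference, here is a minimal NumPy sketch of this definition; relu and relu_derivative are illustrative names, not functions exported by this package:

import numpy as np

def relu(x):
    # g(x) = max(0, x), applied element-wise
    return np.maximum(0, x)

def relu_derivative(x):
    # gradient of ReLU: 1 where x > 0, 0 elsewhere
    return (x > 0).astype(float)

print(relu(np.array([-2.0, 0.0, 3.0])))  # -> [0. 0. 3.]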
[Figure: activation functions]
[Figure: ReLU activation function]
install
pip install ReLUs
or
pip3 install ReLUs
parameters for training the model
layer_sizes (e.g. layer_sizes=[13,5,5,1])
num_iters (e.g. num_iters=1000)
learning_rate (e.g. learning_rate=0.03)
training the model
model_name = model(X_train, Y_train, layer_sizes, num_iters, learning_rate)
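To make the call above concrete, here is a minimal sketch of a full training run. The random X_train/Y_train arrays are placeholders for your own data (13 input features to match the first entry of layer_sizes; the features-by-examples shape convention is an assumption), and the top-level import from ReLUs is assumed to match the package layout:

import numpy as np
from ReLUs import model  # assumed top-level import

# placeholder data: 13 features x 100 examples (shape convention is an assumption)
X_train = np.random.randn(13, 100)
Y_train = np.random.randint(0, 2, size=(1, 100))

layer_sizes = [13, 5, 5, 1]   # 13 inputs, two hidden layers of 5 units, 1 output
num_iters = 1000              # gradient-descent iterations
learning_rate = 0.03          # step size

model_name = model(X_train, Y_train, layer_sizes, num_iters, learning_rate)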
train and test accuracy
train_acc, test_acc = compute_accuracy(X_train, X_test, Y_train, Y_test, model_name)
making predictions
predict(X_train, model_name)
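Continuing the sketch above with placeholder hold-out data (compute_accuracy and predict are likewise assumed to be top-level imports):

from ReLUs import compute_accuracy, predict  # assumed top-level imports

X_test = np.random.randn(13, 20)             # placeholder hold-out set
Y_test = np.random.randint(0, 2, size=(1, 20))

train_acc, test_acc = compute_accuracy(X_train, X_test, Y_train, Y_test, model_name)
print("train accuracy:", train_acc, "test accuracy:", test_acc)

predictions = predict(X_test, model_name)    # labels for unseen inputs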
REFERENCES
https://en.wikipedia.org/wiki/Rectifier_(neural_networks)
https://www.kaggle.com/dansbecker/rectified-linear-units-relu-in-deep-learning