A Python package providing a direct implementation of ReLU neural networks.
ReLU neural network
rectified linear activation function
What is ReLU?
ReLU is defined as g(x) = max(0, x): it outputs 0 when x is negative and x itself when x is positive. Because ReLU does not saturate for positive inputs, its gradient does not vanish there, so it trains readily and typically decreases the cost function far more quickly than sigmoid.
ReLU activation function
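For reference, here is a minimal NumPy sketch of ReLU and its derivative. This is illustrative only and is not necessarily how the package implements it; the derivative at x = 0 is taken as 0 by convention.

import numpy as np

def relu(x):
    # element-wise max(0, x)
    return np.maximum(0, x)

def relu_derivative(x):
    # gradient is 1 where x > 0, else 0 (0 at x = 0 by convention)
    return (x > 0).astype(float)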
installing the package
pip install ReLUs
pip3 install ReLUs
parameters for training the model
layer_sizes (e.g. layer_sizes=[13,5,5,1])
num_iters (e.g. num_iters=1000)
learning_rate (e.g. learning_rate=0.03)
training the model
model_name = model(X_train, Y_train, layer_sizes, num_iters, learning_rate)
train and test accuracy
train_acc, test_acc = compute_accuracy(X_train, X_test, Y_train, Y_test, model_name)
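A minimal end-to-end sketch is shown below. The import path (from ReLUs import model, compute_accuracy), the synthetic data, and the features-by-examples shape convention are assumptions for illustration; the first entry of layer_sizes is assumed to match the number of input features.

import numpy as np
from ReLUs import model, compute_accuracy  # assumed import path

# synthetic binary-classification data: 13 features per example
# (assumed layout: features x examples)
rng = np.random.default_rng(0)
X_train = rng.standard_normal((13, 200))
Y_train = (rng.random((1, 200)) > 0.5).astype(int)
X_test = rng.standard_normal((13, 50))
Y_test = (rng.random((1, 50)) > 0.5).astype(int)

layer_sizes = [13, 5, 5, 1]  # input size matches the 13 features; single output unit
num_iters = 1000
learning_rate = 0.03

model_name = model(X_train, Y_train, layer_sizes, num_iters, learning_rate)
train_acc, test_acc = compute_accuracy(X_train, X_test, Y_train, Y_test, model_name)
print(train_acc, test_acc)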