A Neural Network framework for building Multi-Layer Perceptron models.
Project description
neural-network
neural-network is a Python package on TestPyPI that provides a Multi-Layer Perceptron (MLP) framework built using only NumPy. The framework supports the Gradient Descent, Momentum, RMSProp, and Adam optimizers.
Table of Contents
- Installation
- Simple Usage
- Beyond the Framework
- License
Installation
Dependencies
- python >= 3.8
- numpy >= 1.22.1
- matplotlib >= 3.5.1
User installation
You can install neural-network using pip:
pip install -i https://test.pypi.org/simple/ neural-network
Simple Usage
Designing the Model Architecture
To define your MLP model, you need to specify the number of layers and the number of neurons in each one.
Unless you want to set up the parameters manually, you do not need to specify the size of the input layer; it is determined automatically when training begins.
from neural_network import NeuralNetwork
model = NeuralNetwork(neurons=[64, 120, 1])
In this example, we have a four-layer neural network: an input layer whose size is defined automatically, a first hidden layer with 64 neurons, a second hidden layer with 120 neurons, and an output layer with a single neuron.
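As a further illustration, a smaller network with one 32-neuron hidden layer and a single output neuron would be defined the same way; this is a sketch using the same constructor with an arbitrary layer layout:
from neural_network import NeuralNetwork
# Only hidden and output layer sizes are listed; the input size is inferred when training starts.
small_model = NeuralNetwork(neurons=[32, 1])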
Training the Model
To train the model, you need to provide the input data and the corresponding target (or label) data.
import numpy as np

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])  # XOR inputs
y = np.array([[0], [1], [1], [0]])              # XOR labels
model.fit(X, y, epochs=1000, learning_rate=0.1, optimizer='adam')
If you train the model without setting the activation functions and/or the loss function, the framework will configure them automatically: it initializes the parameters and the functions according to the type of model (regression or classification) and its architecture.
Making predictions
Once the model has been trained, you can use it to make predictions by simply calling the predict method.
predictions = model.predict(X)
Beyond the Framework
Apart from the neural network framework, the package also provides:
Activation functions
Activation function | Syntax
---|---
Sigmoid function | sigmoid()
Hyperbolic tangent function | tanh()
Rectified linear unit | relu()
Leaky rectified linear unit | leaky_relu()
Softmax function | softmax()
Gaussian error linear unit | gelu()
All of the above functions have two parameters:
- x: The input values. Although some functions accept primitive numeric types, it is advised to pass a NumPy array.
- derivative: A boolean indicating whether the function computes the derivative with respect to the input x. Default is False.
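For illustration, here is a minimal sketch of how these functions might be called, assuming they can be imported directly from the neural_network package (the exact module path is an assumption):
import numpy as np
from neural_network import sigmoid, relu  # import path assumed, not confirmed above

x = np.array([-2.0, 0.0, 2.0])
s = sigmoid(x)                    # element-wise sigmoid of x
ds = sigmoid(x, derivative=True)  # derivative of the sigmoid evaluated at x
r = relu(x)                       # element-wise ReLU of x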
Loss functions
Loss function | Syntax
---|---
Logistic loss function | log_loss()
Cross-entropy loss function | cross_entropy_loss()
Quadratic loss function | quadratic_loss()
All of the above functions have three parameters:
- y_pred: Predicted labels. It must be a 2D NumPy array with the same shape as y_true.
- y_true: True labels. It must be a 2D NumPy array with the same shape as y_pred.
- derivative: A boolean indicating whether the function computes the derivative. Default is False.
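As a sketch, again assuming the functions can be imported directly from neural_network and that the arguments follow the order listed above:
import numpy as np
from neural_network import log_loss  # import path assumed

y_pred = np.array([[0.9], [0.2], [0.8], [0.1]])  # predicted values, 2D array
y_true = np.array([[1.0], [0.0], [1.0], [0.0]])  # true labels, same shape
loss = log_loss(y_pred, y_true)                   # loss value
grad = log_loss(y_pred, y_true, derivative=True)  # derivative form, per the derivative flag above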
2D Decision Boundary
This utility function is provided for illustrative purposes. It takes a trained binary classification model, a 2D NumPy array of input data with two attributes, and the corresponding binary labels. It then plots a 2D decision boundary based on the model's predictions.
The input model does not have to be an instance of NeuralNetwork, but it must provide a predict method that accepts a 2D NumPy array as input.
plot_decision_boundary(model, train_x, train_y)
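Putting the pieces together, here is a minimal end-to-end sketch on the XOR data from earlier (the import location of plot_decision_boundary is an assumption):
import numpy as np
from neural_network import NeuralNetwork, plot_decision_boundary  # plot_decision_boundary import path assumed

# XOR data: two input attributes, binary labels
train_x = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
train_y = np.array([[0], [1], [1], [0]])

model = NeuralNetwork(neurons=[64, 120, 1])
model.fit(train_x, train_y, epochs=1000, learning_rate=0.1, optimizer='adam')

plot_decision_boundary(model, train_x, train_y)  # renders the 2D decision boundary with matplotlib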
License
This project is distributed under the MIT License, as found in the LICENSE file.
Project details
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
Built Distribution
File details
Details for the file neural-network-0.1.1.tar.gz.
File metadata
- Download URL: neural-network-0.1.1.tar.gz
- Upload date:
- Size: 12.6 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.1 CPython/3.9.9
File hashes
Algorithm | Hash digest
---|---
SHA256 | 57e8c054485f7e4f1b65a215a10b57d1adf3d629e0ac24edfbee474398e83db3
MD5 | c1565c9b81d59ba7ca91deb76e8b4c47
BLAKE2b-256 | 6f614e019acebeebe640a9cbf1023eed641942ece5f89916519be715257b5b31
File details
Details for the file neural_network-0.1.1-py3-none-any.whl.
File metadata
- Download URL: neural_network-0.1.1-py3-none-any.whl
- Upload date:
- Size: 11.0 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.1 CPython/3.9.9
File hashes
Algorithm | Hash digest
---|---
SHA256 | 7df4ca8ea9a513009a3f0c68f2eecb9ea49e489acc323eb2242bc4ba061ce957
MD5 | dfbda3e966e6c5e24e7530013b22485d
BLAKE2b-256 | 243847a58cb7a98a05147c3e41c8271840b83c76d6c4ee235ac4691a46b8a7bc