
Implementation of a very small-scale neural network from scratch.

Project description

miniMLP

This repository contains an implementation of a small Multilayer Perceptron (MLP) neural network in Python. You can specify the number of hidden layers, the number of neurons in each layer, the activation function of each layer, and the optimizer, with support for mini-batch training, validation, and a choice of loss functions.

Installation

To install miniMLP, simply run:

pip install miniMLP

Dependencies

The code requires the following dependencies:

  • NumPy: For matrix operations and mathematical computations.
  • matplotlib (optional): For visualization of training metrics, like loss curves.

Install them via pip:

pip install numpy matplotlib

Features

  • Customizable Layers: Specify the number of layers, neurons per layer, and activation functions.
  • Multiple Optimizers: Choose from various optimizers (SGD, Adam, Momentum, etc.).
  • Flexible Loss Functions: Support for several loss functions, including MSE, Cross Entropy, MAE, and more.
  • Mini-Batch Training: Efficient training using mini-batches for large datasets.
  • Validation Support: Monitor validation performance during training.
  • Training History: Track loss over epochs, useful for plotting and debugging.
  • Learning Rate Scheduling: Support for dynamic learning rates.

Example Usage

Creating an MLP Model

Create an instance of the MLP class by passing a list of Layer objects (each specifying its input size, output size, and activation function) together with a loss function and an optimizer:

import numpy as np
from miniMLP.engine import MLP
from miniMLP.activations import ActivationFunction
from miniMLP.optimizers import Adam
from miniMLP.losses import MSE
from miniMLP.layers import Layer

# Example MLP Architecture
layers = [
    Layer(input_size=2, output_size=4, activation=ActivationFunction.relu),
    Layer(input_size=4, output_size=6, activation=ActivationFunction.relu),
    Layer(input_size=6, output_size=1, activation=ActivationFunction.sigmoid)
]

# Define loss function and optimizer
loss_fn = MSE()
optimizer = Adam(learning_rate=0.001)

# Initialize MLP
mlp = MLP(layers=layers, loss_function=loss_fn, optimizer=optimizer)
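
Note that the layers chain together: each layer's input_size must match the previous layer's output_size (2 → 4 → 6 → 1 above), so this network maps two input features to a single sigmoid output.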

Training the MLP

Use the train method to train the MLP, specifying the training data, the number of epochs, the learning rate, and the batch size. Validation data and other options are shown further below.

X_train = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y_train = np.array([[0], [1], [1], [0]])

# Train the model
mlp.train(X_train, y_train, epochs=2000, learning_rate=0.0001, batch_size=4)

Making Predictions

After training, use the predict method to generate predictions for new data:

X_new = np.array([[1, 1], [0, 0]])
y_pred = mlp.predict(X_new)
print(y_pred)
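
Since the final layer uses a sigmoid activation, predictions are values in (0, 1). For a binary task such as XOR, you can threshold them into hard labels with plain NumPy:

# Convert sigmoid outputs to 0/1 class labels
labels = (y_pred > 0.5).astype(int)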

Activation Functions

The following activation functions are supported (a NumPy sketch of a few of them follows the list):

  • Sigmoid
  • ReLU
  • Tanh
  • Softmax
  • Leaky ReLU
  • ELU
  • GELU
  • Softplus
  • SELU
  • PReLU
  • Swish
  • Gaussian
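
For reference, here is a minimal NumPy sketch of a few of these functions (the textbook definitions, not miniMLP's internal code):

import numpy as np

def sigmoid(x):
    # Squashes inputs into (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    # Zeroes out negative inputs
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Like ReLU, but keeps a small slope for negative inputs
    return np.where(x > 0, x, alpha * x)

def softmax(x):
    # Subtract the row max for numerical stability, then normalize
    e = np.exp(x - np.max(x, axis=-1, keepdims=True))
    return e / np.sum(e, axis=-1, keepdims=True)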

Optimizers

The following optimizers are supported (a sketch of the Adam update rule follows the list):

  • Adam
  • Stochastic Gradient Descent (SGD), with momentum and Nesterov acceleration
  • RMSProp
  • Momentum
  • Nesterov Accelerated Gradient (NAG)
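
As an illustration of what a single optimizer step does, here is the standard Adam update rule in NumPy (the textbook algorithm, not necessarily miniMLP's exact implementation):

import numpy as np

def adam_step(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    # Decaying averages of the gradient and of its square
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad ** 2
    # Bias correction compensates for the zero-initialized moments
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    # Per-parameter step size, scaled by recent gradient magnitude
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v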

Loss Functions

Supported loss functions include the following (reference definitions for two of them follow the list):

  • Mean Squared Error (MSE)
  • Mean Absolute Error (MAE)
  • Cross Entropy (for classification)
  • Binary Cross Entropy
  • Hinge Loss (used in SVM)
  • Huber Loss (robust regression)
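
For reference, minimal NumPy versions of two of these (textbook definitions, not miniMLP's internals):

import numpy as np

def mse(y_true, y_pred):
    # Mean Squared Error: average of squared differences
    return np.mean((y_true - y_pred) ** 2)

def binary_cross_entropy(y_true, y_pred, eps=1e-12):
    # Clip predictions away from 0 and 1 to avoid log(0)
    y_pred = np.clip(y_pred, eps, 1.0 - eps)
    return -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))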

Example with Validation

You can also pass validation data to track model performance:

X_val = np.array([[1, 1], [0, 1]])
y_val = np.array([[0], [1]])

history = mlp.train(
    X_train, y_train,
    X_val=X_val, Y_val=y_val,
    epochs=2000,
    learning_rate=0.0001,
    batch_size=4,
    validation=True,
)

This reports the training loss and validation loss for each epoch and returns them in a history dictionary.

Plotting Training and Validation Loss

Using the history returned by train, you can plot the loss curves with matplotlib:

import matplotlib.pyplot as plt

plt.plot(history['train_loss'], label='Training Loss')
plt.plot(history['val_loss'], label='Validation Loss')
plt.xlabel('Epochs')
plt.ylabel('Loss')
plt.title('Training and Validation Loss')
plt.legend()
plt.show()

License

This project is licensed under the MIT License. Feel free to use and modify this code for your own projects.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

minimlp-0.0.1.tar.gz (9.9 kB)

Built Distribution

miniMLP-0.0.1-py3-none-any.whl (10.2 kB)

File details

Details for the file minimlp-0.0.1.tar.gz.

File metadata

  • Download URL: minimlp-0.0.1.tar.gz
  • Upload date:
  • Size: 9.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.9.20

File hashes

Hashes for minimlp-0.0.1.tar.gz

  Algorithm    Hash digest
  SHA256       1a9b3bebd9aac12e58bdf0a553b12b76108bc0467f611910d915195a44055fe8
  MD5          396e71e28f62b9f88cdcf940595a6ada
  BLAKE2b-256  64565f7bbccaf67607d932167c28a886c2986db3c7cdc4d1c86b8d7904686c1e


File details

Details for the file miniMLP-0.0.1-py3-none-any.whl.

File metadata

  • Download URL: miniMLP-0.0.1-py3-none-any.whl
  • Upload date:
  • Size: 10.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.9.20

File hashes

Hashes for miniMLP-0.0.1-py3-none-any.whl

  Algorithm    Hash digest
  SHA256       1ad91d590d725358d4cda50a7f25a10bfffa758edcac541ea8ebe517ccfee4d2
  MD5          162ef96b637e8ff9134d7cf4c9284895
  BLAKE2b-256  2ac3e2115faa50bc980bcc0620f5ac93fed651af09309f713ca31900b9b9585e

