A simple convolutional neural network library with NumPy as its only dependency


Neuralnetlib

📝 Description

This is a handmade convolutional neural network library, written in Python, with NumPy as its only dependency.

I made it to challenge myself and to learn in depth how neural networks work.

The bulk of this project was built in four and a half hours; the save/load features and binary classification support were added later.

Keep in mind that this library is built for learning purposes rather than raw performance (although I tried to make it as fast as possible).

I intend to improve the neural networks and add more features in the future.

📦 Features

  • Many layers (wrappers, dense, dropout, conv1d/2d, pooling1d/2d, flatten, embedding, batchnormalization, lstm, attention and more) 🧠
  • Many activation functions (sigmoid, tanh, relu, leaky relu, softmax, linear, elu, selu) 📈
  • Many loss functions (mean squared error, mean absolute error, categorical crossentropy, binary crossentropy, huber loss) 📉
  • Many optimizers (sgd, momentum, rmsprop, adam) 📊
  • Supports binary classification, multiclass classification and regression 📖
  • Save and load models 📁
  • Simple to use 📚

⚙️ Installation

You can install the library using pip:

pip install neuralnetlib
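
To quickly check that the install worked, you can import the package (whether it exposes __version__ is an assumption, hence the fallback):

import neuralnetlib  # succeeds only if the package is installed correctly
print(getattr(neuralnetlib, "__version__", "installed"))  # __version__ may not be exposed; fall back gracefully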

💡 How to use

See this file for a simple example of how to use the library.
For a more advanced example using a CNN, see this file.
You can also check this file for text classification using an RNN.

More examples in this folder.

You are free to tweak the hyperparameters and the network architecture to see how it affects the results.

I used the MNIST dataset to test the library, but you can use any dataset you want.
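
For instance, here is a minimal NumPy-only preprocessing sketch for MNIST-style data (the variable names, the 0-255 pixel range, and the reshaping are assumptions; neuralnetlib.preprocess may offer equivalent helpers):

import numpy as np

# Assumed starting point: x_train of shape (n, 28, 28) with uint8 pixels, y_train with integer labels 0-9
x_train = x_train.reshape(-1, 28, 28, 1).astype("float32") / 255.0  # scale pixels to [0, 1]
y_train_ohe = np.eye(10)[y_train]  # one-hot encode the 10 digit classes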

🚀 Quick examples (more here)

Binary Classification

from neuralnetlib.model import Model
from neuralnetlib.layers import Input, Dense, Activation
from neuralnetlib.activations import Sigmoid
from neuralnetlib.losses import BinaryCrossentropy
from neuralnetlib.optimizers import SGD
from neuralnetlib.metrics import accuracy_score

# ... Preprocess x_train, y_train, x_test, y_test if necessary (you can use neuralnetlib.preprocess and neuralnetlib.utils)

# Create a model
model = Model()
model.add(Input(10))  # 10 features
model.add(Dense(8))
model.add(Dense(1))
model.add(Activation(Sigmoid()))  # many ways to tell the model which Activation Function you'd like, see the next example

# Compile the model
model.compile(loss_function='bce', optimizer='sgd')

# Train the model
model.fit(x_train, y_train, epochs=10, batch_size=32, metrics=['accuracy'])
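
To see how the model does on held-out data, an evaluation sketch like this should work (that predict() returns probabilities in [0, 1] and the accuracy_score argument order are assumptions on my part):

y_pred = model.predict(x_test)                # assumed to return probabilities in [0, 1]
y_pred_labels = (y_pred > 0.5).astype(int)    # threshold at 0.5 for binary labels
print(accuracy_score(y_pred_labels, y_test))  # accuracy_score is imported above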

Multiclass Classification

from neuralnetlib.model import Model
from neuralnetlib.layers import Input, Conv2D, BatchNormalization, MaxPooling2D, Flatten, Dense
from neuralnetlib.activations import Softmax
from neuralnetlib.losses import CategoricalCrossentropy
from neuralnetlib.optimizers import Adam
from neuralnetlib.metrics import accuracy_score

# ... Preprocess x_train, y_train, x_test, y_test if necessary (you can use neuralnetlib.preprocess and neuralnetlib.utils)

# Create and compile a model
model = Model()
model.add(Input(28, 28, 1))  # for example, MNIST images
model.add(Conv2D(32, kernel_size=3, padding='same', activation='relu'))  # activation supports both str...
model.add(BatchNormalization())
model.add(MaxPooling2D(pool_size=2))
model.add(Flatten())  # flatten the feature maps before the dense layers (the feature list includes a flatten layer)
model.add(Dense(64, activation='relu'))
model.add(Dense(10, activation=Softmax()))  # ... and ActivationFunction objects

model.compile(loss_function='categorical_crossentropy', optimizer=Adam())  # loss_function and optimizer likewise accept either strings or objects

# Train the model
model.fit(x_train, y_train_ohe, epochs=5, metrics=['accuracy'])
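
Predicted classes can then be recovered with argmax (a sketch; predict() returning per-class probabilities of shape (n_samples, 10) is an assumption):

import numpy as np

y_pred = model.predict(x_test)             # assumed shape: (n_samples, 10)
y_pred_labels = np.argmax(y_pred, axis=1)  # most probable digit for each sample
print(y_pred_labels[:10])                  # compare against the first 10 true labels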

Regression

from neuralnetlib.model import Model
from neuralnetlib.layers import Input, Dense
from neuralnetlib.losses import MeanSquaredError
from neuralnetlib.metrics import accuracy_score

# ... Preprocess x_train, y_train, x_test, y_test if necessary (you can use neuralnetlib.preprocess and neuralnetlib.utils)

# Create and compile a model
model = Model()
model.add(Input(13))
model.add(Dense(64, activation='leakyrelu'))
model.add(Dense(1, activation='linear'))

model.compile(loss_function="mse", optimizer='adam')  # you can either put acronyms or full name

# Train the model
model.fit(x_train, y_train, epochs=100, batch_size=128, metrics=['accuracy'])
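
To gauge the fit on held-out data, you can compute the test MSE directly with NumPy (predict() returning a (n_samples, 1) array is an assumption):

import numpy as np

y_pred = model.predict(x_test).ravel()         # flatten the assumed (n_samples, 1) output
mse = np.mean((y_pred - y_test.ravel()) ** 2)  # same quantity the 'mse' loss minimizes
print(f"Test MSE: {mse:.4f}")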

You can also save and load models:

# Save a model
model.save('my_model.json')

# Load a model
model = Model.load('my_model.json')
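
A quick way to verify the save/load round trip is to compare the predictions of the original and the reloaded model (a sketch; predict() is assumed here):

import numpy as np

restored = Model.load('my_model.json')
# If (de)serialization is lossless, both models agree on the same inputs
assert np.allclose(model.predict(x_test), restored.predict(x_test))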

📜 Output of the example file

Here is the decision boundary for a binary classification task (breast cancer dataset):

[Image: decision boundary plot]

[!NOTE] PCA (Principal Component Analysis) was used to reduce the number of features to 2 so we could plot the decision boundary. Representing n-dimensional data in 2D is not easy, so the decision boundary may not always be accurate. I also tried t-SNE, but the results were not good.
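
For reference, a minimal NumPy-only PCA projection onto two components looks like this (illustrative only; the library may ship its own helper in neuralnetlib.preprocess):

import numpy as np

def pca_2d(X):
    """Project X onto its first two principal components (NumPy-only sketch)."""
    X_centered = X - X.mean(axis=0)
    # The right singular vectors of the centered data are the principal axes
    _, _, Vt = np.linalg.svd(X_centered, full_matrices=False)
    return X_centered @ Vt[:2].T

X_2d = pca_2d(x_train)  # shape (n_samples, 2), ready for a 2D scatter of the decision boundary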

Here is an example of a model training on the MNIST dataset using the library:

[Image: CLI training output]

Here is an example of a loaded model used with Tkinter:

[Image: Tkinter GUI]

Here, I decided to print the first 10 predictions and their respective labels to see how the network is performing.

[Image: predictions plot]

You can of course use the library for any dataset you want.

✍️ Authors

