
An implementation of an MLP classifier with an interface modeled on that of scikit-learn's MLPClassifier class.

Project description

Neural-Network-From-Scratch-COSC-221-CSB


A neural network to classify handwritten digits (a rite-of-passage project at this point, lol).

We re-implement a stripped-down version of scikit-learn's MLPClassifier class from first principles, and use it to train a general classifier based on the multi-layer perceptron (MLP) model.

To run

Since we've re-implemented the MLP using scikit-learn's MLPClassifier as a template, the API should feel familiar.

Downloading the package

pip install mlpclassifier

Or install it inside a virtual environment.

Using uv:

uv add mlpclassifier

To use the library:

from classifier.classifier import *
# load the MNIST dataset (IDX files)
images = get_images_fast("dataset/train-images.idx3-ubyte")
labels = get_labels_fast("dataset/train-labels.idx1-ubyte")

test_images = get_images_fast("dataset/t10k-images.idx3-ubyte")
test_labels = get_labels_fast("dataset/t10k-labels.idx1-ubyte")

# pre-processing
X = images.reshape(images.shape[0], -1)/255
X_test = test_images.reshape(test_images.shape[0], -1)/255

model = MLPClassifier(
        hidden_layer_sizes=(128, 64),
        activation='relu',
        max_iter=1000,
        alpha=1e-3,
        batch_size=200,
        verbose=True,
)

SAVE_PATH = "weights/self_trained.pkl"

# on Ctrl-C (KeyboardInterrupt), training stops and the model is saved to save_path
model.fit(X, labels, save_path=SAVE_PATH)

N = 10_000
score, incorrect_indices = model.score(X_test[:N], test_labels[:N])
print("score: ", score)
model.save(SAVE_PATH)
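The Ctrl-C behavior mentioned in the comment can be sketched as a try/except around the training loop. This is an illustrative reconstruction, not the library's actual code; `InterruptibleTrainer` and `_save_to` are hypothetical names:

```python
import pickle


class InterruptibleTrainer:
    """Minimal sketch of a fit() that saves on Ctrl-C (names are hypothetical)."""

    def __init__(self):
        self.weights, self.biases, self.classes = [], [], []

    def _save_to(self, path):
        # persist parameters in the same dict layout the README loads later
        with open(path, "wb") as f:
            pickle.dump(
                {"weights": self.weights, "biases": self.biases, "classes": self.classes},
                f,
            )

    def fit(self, n_epochs, save_path=None):
        try:
            for epoch in range(n_epochs):
                pass  # one pass of mini-batch gradient descent would go here
        except KeyboardInterrupt:
            pass  # fall through and save whatever progress we have
        if save_path is not None:
            self._save_to(save_path)
```

Whether the interrupt is caught or training finishes normally, the partially trained parameters end up at `save_path`.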

Loading the saved model:

import pickle

with open("weights/sklearn_weights_and_biases.pkl", 'rb') as file:
    weights_and_biases = pickle.load(file)
    model.load_weights(weights_and_biases["weights"],
                       weights_and_biases["biases"],
                       weights_and_biases["classes"])
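The checkpoint file is evidently just a pickled dict; here is a round-trip sketch of the key layout assumed by the `load_weights` call above (toy values, not real weights):

```python
import io
import pickle

# assumed checkpoint layout, inferred from the load_weights call
checkpoint = {
    "weights": [[[0.1, -0.2]], [[0.3], [0.4]]],  # one matrix per layer
    "biases": [[0.0], [0.0]],                    # one vector per layer
    "classes": list(range(10)),                  # digit labels 0-9
}

# serialize and deserialize in memory, exactly as pickle.dump/load would on disk
buf = io.BytesIO()
pickle.dump(checkpoint, buf)
buf.seek(0)
restored = pickle.load(buf)
```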

Results

Digits MNIST

Accuracy: 97.47%

Code:

# assumes the library was imported as, e.g., import classifier.classifier as cl
model = cl.MLPClassifier(
    hidden_layer_sizes=(512, 256, 128),
    activation="relu",
    learning_rate="adaptive",
    max_iter=200,
    alpha=1e-3,
    verbose=True
)

model.fit(X, Y, save_path="./saved_models/digit_classifier.pkl")
model.score(X_test, Y_test)

Fashion MNIST

Accuracy: 86.61%

Code:

model = cl.MLPClassifier(
    hidden_layer_sizes=(512, 256, 128),
    max_iter=1_000,
    activation="relu",
    learning_rate="adaptive",
    alpha=10e-3,
    verbose=True
)

X = images.reshape(len(images), -1) / 255
X_test = test_images.reshape(len(test_images), -1)/255

Y = labels
Y_test = test_labels

model.fit(X, Y, save_path="./saved_models/mnist_fashion.pkl")
model.score(X_test, Y_test)

Dataset

Download the dataset from Kaggle

curl -L https://www.kaggle.com/api/v1/datasets/download/hojjatk/mnist-dataset -o ./dataset.zip

Then just unzip it into a directory called ./dataset

unzip -d dataset ./dataset.zip

Optional, but it cleans up the redundant duplicates:

rm -r *-idx*-ubyte

I've removed some duplicates, so currently I have:

$ ls ./dataset/
 t10k-images.idx3-ubyte   train-images.idx3-ubyte
 t10k-labels.idx1-ubyte   train-labels.idx1-ubyte
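For reference, the `.idx*-ubyte` files use the simple IDX binary format: a big-endian header (two zero bytes, a type code, a dimension count, then one uint32 per dimension) followed by the raw values. A minimal stdlib-only reader sketch — the library's `get_images_fast` may differ:

```python
import struct

def read_idx(path):
    """Parse an IDX file into (shape, flat bytes). Assumes uint8 data (type 0x08)."""
    with open(path, "rb") as f:
        zeros, dtype, ndim = struct.unpack(">HBB", f.read(4))
        assert zeros == 0 and dtype == 0x08, "only uint8 IDX files handled here"
        shape = struct.unpack(">" + "I" * ndim, f.read(4 * ndim))
        data = f.read()
    return shape, data
```

For `train-images.idx3-ubyte` this would yield a shape of `(60000, 28, 28)` followed by the pixel bytes.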

So, by convention:

  • the dataset is divided into training data and testing data
  • here that means 60k training examples and 10k testing examples
  • the held-out test set measures how well the model has generalized
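That split-and-score convention can be sketched in plain Python (illustrative helpers, not part of the library):

```python
def train_test_split(X, y, test_size=10_000):
    """Hold out the last `test_size` examples for testing (60k/10k for MNIST)."""
    return X[:-test_size], y[:-test_size], X[-test_size:], y[-test_size:]

def accuracy(y_pred, y_true):
    """Fraction of correct predictions — the generalization measure used above."""
    return sum(p == t for p, t in zip(y_pred, y_true)) / len(y_true)
```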

TODO

  • debug all the row vector stuff
  • publish the package on PyPI
  • document the API

Forward propagation

  • a variable L for the number of layers
  • a list $n^{[l]}$ for the size of each layer
  • initialize the weights using He initialization
  • a forward-propagation step using the forward-propagation formula
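The steps above can be sketched with NumPy; the shapes, column-vector convention, and names are illustrative assumptions, not the library's actual internals:

```python
import numpy as np

rng = np.random.default_rng(0)
n = [784, 128, 64, 10]   # n[l]: layer sizes, input through output
L = len(n) - 1           # number of weight layers

# He initialization: zero-mean Gaussian scaled by sqrt(2 / fan_in), suited to ReLU
W = [rng.normal(0, np.sqrt(2 / n[l]), size=(n[l + 1], n[l])) for l in range(L)]
b = [np.zeros((n[l + 1], 1)) for l in range(L)]

def forward(x):
    """Forward propagation: a <- g(W[l] a + b[l]); ReLU hidden layers, softmax output."""
    a = x
    for l in range(L):
        z = W[l] @ a + b[l]
        if l < L - 1:
            a = np.maximum(z, 0)           # ReLU
        else:
            e = np.exp(z - z.max(axis=0))  # numerically stable softmax
            a = e / e.sum(axis=0)
    return a

probs = forward(rng.random((784, 5)))      # batch of 5 column vectors
```

Each column of `probs` is a probability distribution over the 10 digit classes.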

Backward propagation

  • He initialization
  • backpropagation
  • scoring
  • saving
  • make the learning rate $\alpha$ more adjustable
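A single-layer backpropagation step can be sketched as follows (softmax output with cross-entropy loss, so the output-layer error is simply $a - y$; illustrative only, not the library's code):

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(0, 0.1, size=(3, 4))       # single linear layer: 4 inputs -> 3 classes
b = np.zeros((3, 1))
x = rng.random((4, 8))                    # batch of 8 column vectors
y = np.eye(3)[:, rng.integers(0, 3, 8)]   # one-hot targets, shape (3, 8)

def softmax(z):
    e = np.exp(z - z.max(axis=0))
    return e / e.sum(axis=0)

# forward pass
a = softmax(W @ x + b)

# backward pass: for softmax + cross-entropy, dL/dz = a - y (averaged over the batch)
m = x.shape[1]
dz = (a - y) / m
dW = dz @ x.T
db = dz.sum(axis=1, keepdims=True)

# gradient-descent update with learning rate alpha
alpha = 0.1
W -= alpha * dW
b -= alpha * db
```

In the full network the same error signal is propagated layer by layer through the transposed weights and the ReLU derivative.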

Project details


Download files

Download the file for your platform.

Source Distribution

mlpclassifier-0.1.4.tar.gz (8.8 kB)

Uploaded Source

Built Distribution


mlpclassifier-0.1.4-py3-none-any.whl (8.3 kB)

Uploaded Python 3

File details

Details for the file mlpclassifier-0.1.4.tar.gz.

File metadata

  • Download URL: mlpclassifier-0.1.4.tar.gz
  • Upload date:
  • Size: 8.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.11.15

File hashes

Hashes for mlpclassifier-0.1.4.tar.gz
Algorithm Hash digest
SHA256 a305266eda148c424494b9906b9476b05f5e51709fbcb88ecd8846a62ca0f9de
MD5 6595ab0b7a08709459264971dc722e50
BLAKE2b-256 9a68d2cd4b6eb10cd4cba64d4dc55863b6fa01d203bb58b4d1af7a97a496741a


File details

Details for the file mlpclassifier-0.1.4-py3-none-any.whl.

File metadata

  • Download URL: mlpclassifier-0.1.4-py3-none-any.whl
  • Upload date:
  • Size: 8.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.11.15

File hashes

Hashes for mlpclassifier-0.1.4-py3-none-any.whl
Algorithm Hash digest
SHA256 babae1b0c60fdb58ce10b8740eab621fe758a9906607de273d3b1a2ca8b518d3
MD5 4eb6d119f9c1d5654cefb6c5576c0d73
BLAKE2b-256 43e42a98b48e3d4558d2d9df7f354609422b688f9e8bed5b772be6b9273f9de8

