
An implementation of an MLP classifier with an interface modeled after scikit-learn's MLPClassifier class.

Project description

Neural-Network-From-Scratch-COSC-221-CSB

Links:

Documentation and derivation:

Neural Network to classify handwritten digits (a rite of passage project at this point lol).

We re-implement a stripped-down version of the MLPClassifier class from scikit-learn from first principles. With this, we can then train a general classifier using the Multi-Layer Perceptron model.

To run

Since we've re-implemented the MLP using scikit-learn's MLPClassifier as a template, the API should look familiar.

Installing the package

pip install mlpclassifier

Or install it inside a virtual environment.

Using uv:

uv add mlpclassifier

To use the library:

from classifier.classifier import *

# load the MNIST dataset
images = get_images_fast("dataset/train-images.idx3-ubyte")
labels = get_labels_fast("dataset/train-labels.idx1-ubyte")

test_images = get_images_fast("dataset/t10k-images.idx3-ubyte")
test_labels = get_labels_fast("dataset/t10k-labels.idx1-ubyte")

# pre-processing: flatten each 28x28 image and scale pixels to [0, 1]
X = images.reshape(images.shape[0], -1) / 255
X_test = test_images.reshape(test_images.shape[0], -1) / 255

model = MLPClassifier(
    hidden_layer_sizes=(128, 64),
    activation='relu',
    max_iter=1000,
    alpha=1e-3,
    batch_size=200,
    verbose=True,
)

SAVE_PATH = "weights/self_trained.pkl"

# on Ctrl-C (KeyboardInterrupt), training saves the weights to this path
model.fit(X, labels, save_path=SAVE_PATH)

N = 10_000
score, incorrect_indices = model.score(X_test[:N], test_labels[:N])
print("score:", score)
model.save(SAVE_PATH)

Dataset

Download the dataset from Kaggle:

curl -L https://www.kaggle.com/api/v1/datasets/download/hojjatk/mnist-dataset -o ./dataset.zip

Then unzip it into a directory called ./dataset:

unzip -d dataset ./dataset.zip

Optional: the archive contains duplicate copies of each file; remove them from inside ./dataset:

rm -r *-idx*-ubyte

I've removed some duplicates, so currently I have:

$ ls ./dataset/
 t10k-images.idx3-ubyte   train-images.idx3-ubyte
 t10k-labels.idx1-ubyte   train-labels.idx1-ubyte

By convention:

  • the dataset is divided into training data and testing data
  • here we have 60k training examples and 10k testing examples
  • the held-out test set measures how well the model generalizes
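The .idx3-ubyte/.idx1-ubyte files are in the simple big-endian IDX binary format: a magic number, the dimension sizes, then raw uint8 data. The library's get_images_fast/get_labels_fast loaders parse this; a minimal illustrative stand-in (names here are my own, not the library's) could look like:

```python
import struct

import numpy as np


def read_idx_images(path):
    """Minimal IDX3 reader: magic 2051, count, rows, cols, then raw uint8 pixels."""
    with open(path, "rb") as f:
        magic, n, rows, cols = struct.unpack(">IIII", f.read(16))
        assert magic == 2051, "not an idx3-ubyte image file"
        return np.frombuffer(f.read(), dtype=np.uint8).reshape(n, rows, cols)


def read_idx_labels(path):
    """Minimal IDX1 reader: magic 2049, count, then raw uint8 labels."""
    with open(path, "rb") as f:
        magic, n = struct.unpack(">II", f.read(8))
        assert magic == 2049, "not an idx1-ubyte label file"
        return np.frombuffer(f.read(), dtype=np.uint8)
```

For MNIST, read_idx_images("dataset/train-images.idx3-ubyte") would yield a (60000, 28, 28) uint8 array, which the pre-processing step above flattens and scales.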

Reference model

For now, we'll use a reference model through scikit-learn.

TODO

  • debug all the row vector stuff
  • package it in pip
  • document the API

Forward propagation

  • a variable L for the number of layers
  • a list $n^{[l]}$ for the size of each layer
  • initialize the weights using He initialization
  • a forward propagation step that applies each layer's affine transform followed by its activation
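The steps above can be sketched in plain NumPy (a minimal illustration of the technique, not the library's actual code; function names and the row-vector convention are assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)


def he_init(layer_sizes):
    """He initialization: weights ~ N(0, 2 / fan_in), biases start at zero."""
    params = []
    for n_in, n_out in zip(layer_sizes[:-1], layer_sizes[1:]):
        W = rng.normal(0.0, np.sqrt(2.0 / n_in), size=(n_in, n_out))
        params.append((W, np.zeros(n_out)))
    return params


def forward(X, params):
    """Row-vector convention: a[l] = g(a[l-1] @ W[l] + b[l]).
    ReLU on hidden layers; the last layer returns raw logits."""
    a = X
    for i, (W, b) in enumerate(params):
        z = a @ W + b
        a = np.maximum(z, 0.0) if i < len(params) - 1 else z
    return a


# 5 fake flattened 28x28 images through the (128, 64) architecture used above
X = rng.normal(size=(5, 784))
params = he_init([784, 128, 64, 10])
print(forward(X, params).shape)  # one logit per digit class: (5, 10)
```

He initialization scales the variance by 2 / fan_in so ReLU activations neither vanish nor blow up as depth grows.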

Backward propagation

  • He initialization
  • backpropagation
  • scoring
  • saving
  • [ ] make the learning rate $\alpha$ more adjustable
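For the output layer, backpropagation simplifies nicely when softmax is paired with cross-entropy loss: the gradient with respect to the logits is just $(p - y)/N$. A single-layer sketch of that gradient plus the learning-rate-$\alpha$ update (all names here are illustrative, not the library's API):

```python
import numpy as np


def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))  # shift for numerical stability
    return e / e.sum(axis=1, keepdims=True)


def cross_entropy(p, y):
    return -np.mean(np.sum(y * np.log(p + 1e-12), axis=1))


rng = np.random.default_rng(1)
N, D, C = 8, 4, 3                          # samples, features, classes
X = rng.normal(size=(N, D))
y = np.eye(C)[rng.integers(0, C, size=N)]  # one-hot labels
W, b = rng.normal(size=(D, C)) * 0.1, np.zeros(C)

alpha = 0.1  # the learning rate the TODO list wants to make adjustable
loss_before = cross_entropy(softmax(X @ W + b), y)
for _ in range(300):
    p = softmax(X @ W + b)
    dZ = (p - y) / N               # dL/dlogits for softmax + cross-entropy
    W -= alpha * (X.T @ dZ)        # gradient descent on the weights
    b -= alpha * dZ.sum(axis=0)    # ... and the biases
loss_after = cross_entropy(softmax(X @ W + b), y)
print(round(loss_before, 3), round(loss_after, 3))
```

In the full network the same dZ is propagated backward through each hidden layer, multiplied by the layer's weights and masked by the ReLU derivative.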

Project details


Download files

Download the file for your platform.

Source Distribution

mlpclassifier-0.1.2.tar.gz (8.5 kB)

Uploaded Source

Built Distribution


mlpclassifier-0.1.2-py3-none-any.whl (8.0 kB)

Uploaded Python 3

File details

Details for the file mlpclassifier-0.1.2.tar.gz.

File metadata

  • Download URL: mlpclassifier-0.1.2.tar.gz
  • Upload date:
  • Size: 8.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.11.15

File hashes

Hashes for mlpclassifier-0.1.2.tar.gz:

  • SHA256: 4064237b1b78dd77cb8aca5ebd9bf1a047493fdc1c8dcf888afe848fe6171056
  • MD5: 9d7f6347e225b0fc89b49c5d30146210
  • BLAKE2b-256: 102d088a01b86292a175545bcc692aa2d2b5e83c57692089386cb6a0f66874c5


File details

Details for the file mlpclassifier-0.1.2-py3-none-any.whl.

File metadata

  • Download URL: mlpclassifier-0.1.2-py3-none-any.whl
  • Upload date:
  • Size: 8.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.11.15

File hashes

Hashes for mlpclassifier-0.1.2-py3-none-any.whl:

  • SHA256: 249cf3f32a88be395f3ac4ad4b3ec16627d23403e99a5eb55e48225b8fd1cf00
  • MD5: b4511cc4d7856c8e986500b8800f5454
  • BLAKE2b-256: 1f7f31920514e55660e90d715c52f2679dd7cd37bcaf4c7e96db478f0412a411

