Simple, lightweight neural network framework built in numpy

Project description

Numpy-Neuron

A small, simple neural network framework built using only numpy and Python (duh).

Install

pip install numpyneuron

Example

from sklearn import datasets
from sklearn.preprocessing import OneHotEncoder
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, precision_score, recall_score
import numpy as np
from numpyneuron import (
    NN,
    Relu,
    Sigmoid,
    CrossEntropyWithLogits,
)


RANDOM_SEED = 2


def _preprocess_digits(
    seed: int,
) -> tuple[np.ndarray, ...]:
    digits = datasets.load_digits(as_frame=False)
    n_samples = len(digits.images)
    data = digits.images.reshape((n_samples, -1))
    y = OneHotEncoder().fit_transform(digits.target.reshape(-1, 1)).toarray()
    X_train, X_test, y_train, y_test = train_test_split(
        data,
        y,
        test_size=0.2,
        random_state=seed,
    )
    return X_train, X_test, y_train, y_test


def train_nn_classifier() -> None:
    X_train, X_test, y_train, y_test = _preprocess_digits(seed=RANDOM_SEED)

    nn_classifier = NN(
        epochs=2_000,
        hidden_size=16,
        batch_size=1,
        learning_rate=0.01,
        loss_fn=CrossEntropyWithLogits(),
        hidden_activation_fn=Relu(),
        output_activation_fn=Sigmoid(),
        input_size=64,  # 8x8 pixel grid images
        output_size=10,  # digits 0-9
        seed=RANDOM_SEED,
    )

    nn_classifier.train(
        X_train=X_train,
        y_train=y_train,
    )

    pred = nn_classifier.predict(X_test=X_test)

    pred = np.argmax(pred, axis=1)
    y_test = np.argmax(y_test, axis=1)

    accuracy = accuracy_score(y_true=y_test, y_pred=pred)
    precision = precision_score(y_true=y_test, y_pred=pred, average="macro")
    recall = recall_score(y_true=y_test, y_pred=pred, average="macro")

    print(f"accuracy on validation set: {accuracy:.4f}")
    print(f"macro precision on validation set: {precision:.4f}")
    print(f"macro recall on validation set: {recall:.4f}")


if __name__ == "__main__":
    train_nn_classifier()

Roadmap

Optimizers

Currently, the learning rate in an NN object is static throughout training. I would like to add support for at least the Adam optimizer at some point; its adaptive, per-parameter learning rates would help training avoid getting stuck in local minima of the loss function.
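As a rough illustration of what such an optimizer could look like, here is a minimal numpy sketch of the Adam update rule. The `AdamOptimizer` class and its `step` interface are hypothetical, not part of numpyneuron's current API:

```python
import numpy as np


class AdamOptimizer:
    """Minimal Adam update rule for a single parameter array (hypothetical sketch)."""

    def __init__(self, lr=0.01, beta1=0.9, beta2=0.999, eps=1e-8):
        self.lr, self.beta1, self.beta2, self.eps = lr, beta1, beta2, eps
        self.m = None  # running first moment (mean of gradients)
        self.v = None  # running second moment (mean of squared gradients)
        self.t = 0     # timestep, used for bias correction

    def step(self, param: np.ndarray, grad: np.ndarray) -> np.ndarray:
        if self.m is None:
            self.m = np.zeros_like(param)
            self.v = np.zeros_like(param)
        self.t += 1
        self.m = self.beta1 * self.m + (1 - self.beta1) * grad
        self.v = self.beta2 * self.v + (1 - self.beta2) * grad**2
        m_hat = self.m / (1 - self.beta1**self.t)  # bias-corrected first moment
        v_hat = self.v / (1 - self.beta2**self.t)  # bias-corrected second moment
        return param - self.lr * m_hat / (np.sqrt(v_hat) + self.eps)


# Toy usage: minimize f(w) = (w - 3)^2, whose gradient is 2 * (w - 3)
opt = AdamOptimizer(lr=0.1)
w = np.array([0.0])
for _ in range(500):
    w = opt.step(w, 2 * (w - 3))
print(w)  # converges toward 3.0
```

Because each parameter gets its own effective step size, an update like this adapts automatically where a fixed learning rate would stall or overshoot.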

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

numpyneuron-0.5.tar.gz (5.1 kB view details)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

numpyneuron-0.5-py3-none-any.whl (5.6 kB view details)

Uploaded Python 3

File details

Details for the file numpyneuron-0.5.tar.gz.

File metadata

  • Download URL: numpyneuron-0.5.tar.gz
  • Upload date:
  • Size: 5.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.12.3

File hashes

Hashes for numpyneuron-0.5.tar.gz
Algorithm Hash digest
SHA256 5dcd88270055d51adb311da830b64db3666f99ac53a2de3557b22a3d2d92cdd5
MD5 d67078c7935122cf98783039a0067943
BLAKE2b-256 dd234e96519095923ea714c900e8d78072bcbe8e3ee7a0e3b39dcd19bd18909a

See more details on using hashes here.
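For example, a downloaded archive can be checked against the SHA256 digest above using Python's standard `hashlib`; the `sha256_of` helper below is illustrative, and the file path is whatever you saved the archive as:

```python
import hashlib


def sha256_of(path: str) -> str:
    """Return the hex SHA256 digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()


# Compare against the SHA256 listed above: sha256_of("numpyneuron-0.5.tar.gz")
# should equal
# "5dcd88270055d51adb311da830b64db3666f99ac53a2de3557b22a3d2d92cdd5"
```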

File details

Details for the file numpyneuron-0.5-py3-none-any.whl.

File metadata

  • Download URL: numpyneuron-0.5-py3-none-any.whl
  • Upload date:
  • Size: 5.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.12.3

File hashes

Hashes for numpyneuron-0.5-py3-none-any.whl
Algorithm Hash digest
SHA256 a5828df93c146ffd81e992141761cfcb15f654c6bb8465f535fdc39736edeac7
MD5 46ff7244b2b2d388027298055a51fa95
BLAKE2b-256 434d84c4967ea5ad4847e5d8827d3a4c7cff21add9973e7e40bb55656bc33ef0

See more details on using hashes here.
