
A from-scratch neural network library built with NumPy for learning how neural networks work under the hood.

Project description

nn — Neural Network Library from Scratch

A lightweight neural network library built entirely with NumPy. No TensorFlow, no PyTorch — just pure Python and linear algebra.

Purpose

This project was built to help beginners understand the core components of neural networks and how they fit together.

While frameworks like TensorFlow and PyTorch provide powerful abstractions, they often hide important implementation details. This library exposes those details while still offering a simple, Keras-style API for building and training models.

The goal is to prepare users to confidently transition into using industry-standard ML libraries by giving them a strong foundation in how neural networks are constructed and trained.

This project also served as a personal learning exercise for the author — specifically in designing clean, extensible interfaces, improving object-oriented programming skills, and structuring a Python package with a modular file architecture. Claude was used throughout the process as a design partner for planning the interface, organizing the file structure, and working through OOP patterns to support future additions to the library.

Quick Start

from nn import Sequential, Dense

# Define the model
model = Sequential()
model.add(Dense(64, activation='relu'))
model.add(Dense(32, activation='relu'))
model.add(Dense(1, activation='sigmoid'))

# Configure training
model.compile(loss='bce', lr=0.01, momentum=0.9)

# Train (X_train and y_train are NumPy arrays)
history = model.fit(X_train, y_train, epochs=100, batch_size=32)

# Predict
predictions = model.predict(X_test)

# Save and load
model.save('my_model.npz')
loaded_model = Sequential.load('my_model.npz')

Project Structure

nn/
├── __init__.py              # Public API — exports Sequential, Dense
├── sequential.py            # Model orchestrator — add, compile, fit, predict, save, load
├── layers/
│   ├── __init__.py
│   ├── base.py              # Layer base class — contract for all layers
│   └── dense.py             # Dense (fully connected) layer
├── activations/
│   ├── __init__.py
│   └── functions.py         # ReLU, Sigmoid, Linear + string registry
├── losses/
│   ├── __init__.py
│   └── functions.py         # MSE, BCE + string registry
├── optimizers/
│   ├── __init__.py
│   ├── base.py              # Optimizer base class
│   └── sgd.py               # SGD with optional momentum
└── utils/
    ├── __init__.py
    └── initializers.py      # Xavier, He, and zero initialization

Current Capabilities

Layers

  • Dense (fully connected)
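A Dense layer is a matrix multiply plus a bias. A minimal NumPy sketch of the forward pass (names here are illustrative, not the library's internals):

```python
import numpy as np

def dense_forward(x, W, b):
    """Fully connected forward pass: y = x @ W + b."""
    return x @ W + b

# 4 samples, 3 input features, 2 output units
x = np.ones((4, 3))
W = np.full((3, 2), 0.5)
b = np.zeros(2)
y = dense_forward(x, W, b)  # shape (4, 2)
```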

Activations

  • ReLU
  • Sigmoid
  • Linear
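These three activations are simple element-wise functions. A hedged NumPy sketch (the library's own implementations may differ in detail):

```python
import numpy as np

def relu(z):
    """max(0, z), element-wise."""
    return np.maximum(0.0, z)

def sigmoid(z):
    """1 / (1 + exp(-z)); squashes values into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def linear(z):
    """Identity; typically used on regression output layers."""
    return z

z = np.array([-2.0, 0.0, 2.0])
```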

Loss Functions

  • Mean Squared Error (MSE)
  • Binary Cross-Entropy (BCE)
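Both losses can be written in a few lines of NumPy. A sketch of the standard formulas (the clipping constant `eps` below is an assumption, not necessarily what the library uses):

```python
import numpy as np

def mse(y_true, y_pred):
    """Mean squared error over all elements."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    return np.mean((y_true - y_pred) ** 2)

def bce(y_true, y_pred, eps=1e-12):
    """Binary cross-entropy; predictions are clipped to avoid log(0)."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    p = np.clip(y_pred, eps, 1.0 - eps)
    return -np.mean(y_true * np.log(p) + (1 - y_true) * np.log(1 - p))
```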

Optimization

  • Stochastic Gradient Descent (SGD) with optional momentum
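The classic momentum update keeps a running "velocity" that accumulates past gradients. A minimal sketch of one parameter update (function name and signature are illustrative only):

```python
import numpy as np

def sgd_momentum_step(w, grad, velocity, lr=0.01, momentum=0.9):
    """One SGD update: v <- momentum * v - lr * grad; w <- w + v."""
    velocity = momentum * velocity - lr * grad
    return w + velocity, velocity

# With momentum=0 this reduces to plain SGD: w <- w - lr * grad
w, v = 1.0, 0.0
w, v = sgd_momentum_step(w, grad=0.5, velocity=v, lr=0.1, momentum=0.0)
```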

Utilities

  • Save and load trained models
  • Model summary
  • Training history tracking
  • Mini-batch and full-batch training
  • Xavier and He weight initialization
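The two weight initializers follow well-known formulas: Xavier (Glorot) scales by fan-in plus fan-out, He by fan-in alone. A NumPy sketch under those standard definitions (function names are assumptions, not the library's API):

```python
import numpy as np

def xavier_uniform(fan_in, fan_out, rng=None):
    """Glorot/Xavier uniform: limit = sqrt(6 / (fan_in + fan_out))."""
    rng = np.random.default_rng() if rng is None else rng
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))

def he_normal(fan_in, fan_out, rng=None):
    """He normal: std = sqrt(2 / fan_in); a common choice for ReLU layers."""
    rng = np.random.default_rng() if rng is None else rng
    return rng.normal(0.0, np.sqrt(2.0 / fan_in), size=(fan_in, fan_out))

W = xavier_uniform(64, 32, rng=np.random.default_rng(0))
```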

Planned Features

  • Categorical Cross-Entropy (CCE) loss
  • Softmax activation
  • Tanh activation
  • Adam optimizer
  • Dropout layer

Requirements

  • Python 3.8+
  • NumPy

Download files

Download the file for your platform.

Source Distribution

learnneuralnet-0.1.0.tar.gz (8.3 kB)

Uploaded Source

Built Distribution


learnneuralnet-0.1.0-py3-none-any.whl (10.8 kB)

Uploaded Python 3

File details

Details for the file learnneuralnet-0.1.0.tar.gz.

File metadata

  • Download URL: learnneuralnet-0.1.0.tar.gz
  • Upload date:
  • Size: 8.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for learnneuralnet-0.1.0.tar.gz
Algorithm Hash digest
SHA256 63261a8cb70bf6de2acb0606d110f6e4a6bcc449fae529ed90c5213caf065260
MD5 4e7bf2baaddbc6e8324b9765fd9e4cef
BLAKE2b-256 a5d2d8f0e723e1fcf74fab92edcebebf522bb97a00a613f47b6f2f6549d9bcac


Provenance

The following attestation bundles were made for learnneuralnet-0.1.0.tar.gz:

Publisher: workflow.yml on Jort12/LearnNeuralNet

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file learnneuralnet-0.1.0-py3-none-any.whl.

File metadata

  • Download URL: learnneuralnet-0.1.0-py3-none-any.whl
  • Upload date:
  • Size: 10.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for learnneuralnet-0.1.0-py3-none-any.whl
Algorithm Hash digest
SHA256 1f2c395eac230bd574ce73240f5ba170f0d1413c948acc9fb0ca33ec69e9460f
MD5 49921a1b6143d2d87cf231f7d4e17165
BLAKE2b-256 e635e5a0bae51cd05f7ae41d6403991377fe919d5991ee6b5dbd817d75504dba


Provenance

The following attestation bundles were made for learnneuralnet-0.1.0-py3-none-any.whl:

Publisher: workflow.yml on Jort12/LearnNeuralNet

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
