A from-scratch neural network library built with NumPy for learning how neural networks work under the hood.
Project description
nn — Neural Network Library from Scratch
A lightweight neural network library built entirely with NumPy. No TensorFlow, no PyTorch — just pure Python and linear algebra.
Purpose
This project was built to help beginners understand the core components of neural networks and how they function under the hood.
While frameworks like TensorFlow and PyTorch provide powerful abstractions, they often hide important implementation details. This library exposes those details while still offering a simple, Keras-style API for building and training models.
The goal is to prepare users to confidently transition into using industry-standard ML libraries by giving them a strong foundation in how neural networks are constructed and trained.
This project also served as a personal learning exercise for the author — specifically in designing clean, extensible interfaces, improving object-oriented programming skills, and structuring a Python package with a modular file architecture. Claude was used throughout the process as a design partner for planning the interface, organizing the file structure, and working through OOP patterns to support future additions to the library.
Quick Start
```python
from nn import Sequential, Dense

# Define the model
model = Sequential()
model.add(Dense(64, activation='relu'))
model.add(Dense(32, activation='relu'))
model.add(Dense(1, activation='sigmoid'))

# Configure training
model.compile(loss='bce', lr=0.01, momentum=0.9)

# Train
history = model.fit(X_train, y_train, epochs=100, batch_size=32)

# Predict
predictions = model.predict(X_test)

# Save and load
model.save('my_model.npz')
loaded_model = Sequential.load('my_model.npz')
```
Project Structure
```
nn/
├── __init__.py          # Public API — exports Sequential, Dense
├── sequential.py        # Model orchestrator — add, compile, fit, predict, save, load
├── layers/
│   ├── __init__.py
│   ├── base.py          # Layer base class — contract for all layers
│   └── dense.py         # Dense (fully connected) layer
├── activations/
│   ├── __init__.py
│   └── functions.py     # ReLU, Sigmoid, Linear + string registry
├── losses/
│   ├── __init__.py
│   └── functions.py     # MSE, BCE + string registry
├── optimizers/
│   ├── __init__.py
│   ├── base.py          # Optimizer base class
│   └── sgd.py           # SGD with optional momentum
└── utils/
    ├── __init__.py
    └── initializers.py  # Xavier, He, and zero initialization
```
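The core idea behind `layers/base.py` is a contract: every layer implements a forward pass and a backward pass that returns the gradient with respect to its input, so layers can be chained. As a minimal sketch of what that contract and a fully connected layer can look like (class internals, method names, and the constructor signature here are illustrative, not necessarily this library's actual API):

```python
import numpy as np

class Layer:
    # Contract every layer follows: forward caches what backward needs,
    # backward returns the gradient w.r.t. the layer's input.
    def forward(self, x):
        raise NotImplementedError

    def backward(self, grad_out):
        raise NotImplementedError

class Dense(Layer):
    # Fully connected layer: y = x @ W + b
    def __init__(self, in_dim, out_dim):
        rng = np.random.default_rng(0)
        # He-style scaling, suited to ReLU activations
        self.W = rng.normal(0.0, np.sqrt(2.0 / in_dim), size=(in_dim, out_dim))
        self.b = np.zeros(out_dim)

    def forward(self, x):
        self.x = x  # cache the input for the backward pass
        return x @ self.W + self.b

    def backward(self, grad_out):
        self.dW = self.x.T @ grad_out   # gradient w.r.t. weights
        self.db = grad_out.sum(axis=0)  # gradient w.r.t. bias
        return grad_out @ self.W.T      # gradient w.r.t. input

layer = Dense(3, 2)
y = layer.forward(np.ones((4, 3)))   # batch of 4 inputs
dx = layer.backward(np.ones((4, 2)))
```

Caching the input in `forward` is what lets `backward` compute the weight gradient without the caller re-supplying it.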
Current Capabilities
Layers
- Dense (fully connected)
Activations
- ReLU
- Sigmoid
- Linear
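Each of these activations reduces to a line or two of NumPy. A sketch of the forward functions and the derivatives backpropagation needs (function names are illustrative; how the library actually organizes them in `activations/functions.py` may differ):

```python
import numpy as np

def relu(z):
    # Zeroes out negative inputs
    return np.maximum(0.0, z)

def sigmoid(z):
    # Squashes inputs into (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def linear(z):
    # Identity: useful for regression output layers
    return z

def relu_grad(z):
    # 1 where the input was positive, 0 elsewhere
    return (z > 0).astype(float)

def sigmoid_grad(z):
    # sigma'(z) = sigma(z) * (1 - sigma(z))
    s = sigmoid(z)
    return s * (1.0 - s)
```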
Loss Functions
- Mean Squared Error (MSE)
- Binary Cross-Entropy (BCE)
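Both losses are a few lines of NumPy. A sketch, including the standard clipping trick that keeps BCE numerically stable near 0 and 1 (whether the library clips exactly this way is an assumption):

```python
import numpy as np

def mse(y_true, y_pred):
    # Mean squared error over all elements
    return np.mean((y_true - y_pred) ** 2)

def bce(y_true, y_pred, eps=1e-12):
    # Binary cross-entropy; clip predictions so log(0) never occurs
    p = np.clip(y_pred, eps, 1.0 - eps)
    return -np.mean(y_true * np.log(p) + (1.0 - y_true) * np.log(1.0 - p))
```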
Optimization
- Stochastic Gradient Descent (SGD) with optional momentum
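Momentum keeps an exponentially decaying velocity per parameter, which damps oscillation and accelerates descent along directions where gradients point consistently. A sketch of a single update step (the function name and signature are illustrative, not the library's actual API):

```python
import numpy as np

def sgd_momentum_step(param, grad, velocity, lr=0.01, momentum=0.9):
    # Classical momentum: v <- mu * v - lr * grad;  param <- param + v
    velocity = momentum * velocity - lr * grad
    return param + velocity, velocity

# Two steps against a constant gradient of 1.0
w = np.array([1.0])
v = np.zeros(1)
for _ in range(2):
    w, v = sgd_momentum_step(w, np.array([1.0]), v, lr=0.1, momentum=0.9)
```

With `momentum=0.0` this reduces exactly to plain SGD, which is why momentum can be optional.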
Utilities
- Save and load trained models
- Model summary
- Training history tracking
- Mini-batch and full-batch training
- Xavier and He weight initialization
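Xavier initialization scales weights by fan-in plus fan-out (suited to sigmoid/tanh), while He initialization scales by fan-in alone, compensating for ReLU zeroing half its inputs on average. A sketch of both (function names and signatures are illustrative):

```python
import numpy as np

def xavier_init(fan_in, fan_out, rng=None):
    # Xavier/Glorot uniform: limit = sqrt(6 / (fan_in + fan_out))
    if rng is None:
        rng = np.random.default_rng()
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))

def he_init(fan_in, fan_out, rng=None):
    # He normal: std = sqrt(2 / fan_in), tuned for ReLU layers
    if rng is None:
        rng = np.random.default_rng()
    return rng.normal(0.0, np.sqrt(2.0 / fan_in), size=(fan_in, fan_out))
```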
Planned Features
- Categorical Cross-Entropy (CCE) loss
- Softmax activation
- Tanh activation
- Adam optimizer
- Dropout layer
Requirements
- Python 3.8+
- NumPy
Download files
Download the file for your platform.
Source Distribution
Built Distribution
File details
Details for the file learnneuralnet-0.1.0.tar.gz.
File metadata
- Download URL: learnneuralnet-0.1.0.tar.gz
- Upload date:
- Size: 8.3 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `63261a8cb70bf6de2acb0606d110f6e4a6bcc449fae529ed90c5213caf065260` |
| MD5 | `4e7bf2baaddbc6e8324b9765fd9e4cef` |
| BLAKE2b-256 | `a5d2d8f0e723e1fcf74fab92edcebebf522bb97a00a613f47b6f2f6549d9bcac` |
Provenance
The following attestation bundles were made for learnneuralnet-0.1.0.tar.gz:
Publisher: workflow.yml on Jort12/LearnNeuralNet

Statement:
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: learnneuralnet-0.1.0.tar.gz
- Subject digest: 63261a8cb70bf6de2acb0606d110f6e4a6bcc449fae529ed90c5213caf065260
- Sigstore transparency entry: 1315259342
- Permalink: Jort12/LearnNeuralNet@8fc2d5dbf4883e556afcc36b064ab90028ed5b73
- Branch / Tag: refs/tags/v0.1.0
- Owner: https://github.com/Jort12
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: workflow.yml@8fc2d5dbf4883e556afcc36b064ab90028ed5b73
- Trigger Event: release
File details
Details for the file learnneuralnet-0.1.0-py3-none-any.whl.
File metadata
- Download URL: learnneuralnet-0.1.0-py3-none-any.whl
- Upload date:
- Size: 10.8 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `1f2c395eac230bd574ce73240f5ba170f0d1413c948acc9fb0ca33ec69e9460f` |
| MD5 | `49921a1b6143d2d87cf231f7d4e17165` |
| BLAKE2b-256 | `e635e5a0bae51cd05f7ae41d6403991377fe919d5991ee6b5dbd817d75504dba` |
Provenance
The following attestation bundles were made for learnneuralnet-0.1.0-py3-none-any.whl:
Publisher: workflow.yml on Jort12/LearnNeuralNet

Statement:
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: learnneuralnet-0.1.0-py3-none-any.whl
- Subject digest: 1f2c395eac230bd574ce73240f5ba170f0d1413c948acc9fb0ca33ec69e9460f
- Sigstore transparency entry: 1315259453
- Permalink: Jort12/LearnNeuralNet@8fc2d5dbf4883e556afcc36b064ab90028ed5b73
- Branch / Tag: refs/tags/v0.1.0
- Owner: https://github.com/Jort12
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: workflow.yml@8fc2d5dbf4883e556afcc36b064ab90028ed5b73
- Trigger Event: release