
NumpyGrad

A small autograd and neural network library with a PyTorch-like API, built on NumPy only. No GPU, no C++ extensions—just Python and NumPy. Useful for learning how backprop and frameworks like PyTorch work, or for lightweight experiments where a full framework is overkill.

Documentation

Features

Autograd

  • NumPy-only — Single dependency: NumPy.
  • Define-by-run autograd — Builds a computation graph as ops are invoked, like PyTorch's eager mode.
  • Familiar array API — array with shape, ndim, dtype, etc.
  • Familiar array creation - ones, zeros, arange, randn, etc.
  • Familiar torch autograd API - .backward(), requires_grad, .grad, no_grad(), etc.
  • Broadcasting & batched ops — Linear algebra, reductions, transforms, and elementwise ops support batched and broadcasted shapes.
  • Familiar special methods - x @ y, mask = x > 0, etc. (see the sketch after this list).
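The sketch below exercises a few of these APIs together, following the quick start's conventions. The top-level placement of no_grad (as np.no_grad()) is an assumption, so check the docs if it lives elsewhere:

import numpygrad as np

x = np.randn((2, 3), requires_grad=True)   # leaf arrays track gradients
w = np.randn((3, 4), requires_grad=True)

y = x @ w                  # matmul via the @ special method
loss = (y ** 2).sum()      # elementwise op, then a reduction

loss.backward()            # traverse the recorded graph in reverse
print(x.grad.shape)        # (2, 3): gradients match each leaf's shape

mask = y > 0               # comparisons build boolean masks

with np.no_grad():         # assumed location; no graph is recorded here
    y_eval = x @ w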

Neural Nets

  • Modules - Linear, Conv2d, Embedding, LayerNorm, MultiHeadAttention, Dropout, Sequential, etc.
  • Optimizers - AdamW, SGD.
  • Activations - ReLU, GELU, Tanh, Sigmoid, SoftPlus.
  • Losses - CrossEntropy (supports N-D logits), MSE.
  • Parameter init - nn.init.kaiming_uniform_, xavier_normal_, etc. (a sketch combining these pieces follows).
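A hedged sketch tying these pieces together. The constructor and call signatures below (Sequential, Linear, ReLU, CrossEntropy, AdamW's step_size, indexing a Sequential to reach a layer's weight) are assumed to mirror PyTorch and the quick start, not verified against the source:

import numpygrad as np
import numpygrad.nn as nn

net = nn.Sequential(            # assumed PyTorch-style constructors
    nn.Linear(16, 32),
    nn.ReLU(),
    nn.Linear(32, 10),
)
nn.init.kaiming_uniform_(net[0].weight)   # assumed: Sequential is indexable

optimizer = np.optim.AdamW(net.parameters(), step_size=3e-4)
loss_fn = nn.CrossEntropy()               # assumed class name

x = np.randn(8, 16)
targets = np.arange(8)                    # integer class ids 0..7
loss = loss_fn(net(x), targets)
loss.backward()
optimizer.step()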

Installation

pip install numpygrad

Requires Python ≥3.12 and NumPy ≥2.4.2.

Or install from source in editable mode:

git clone https://github.com/njkrichardson/numpygrad.git
cd numpygrad
pip install -e .

Optional dependencies (e.g. for tests and examples):

pip install -e ".[tests]"      # pytest, hypothesis, torch (for gradient checks)
pip install -e ".[examples]"  # matplotlib for plotting

Quick start

import numpygrad as np # live on the edge! 
import numpygrad.nn as nn

# Arrays and gradients
x = np.randn((3, 4), requires_grad=True)
y = (x ** 2).sum()
y.backward()
print(x.grad)  # gradients of sum(x²) w.r.t. x

# Small MLP
net = nn.MLP(input_dim=1, hidden_sizes=[8, 8], output_dim=1)
optimizer = np.optim.SGD(net.parameters(), step_size=1e-1)

x = np.randn(32, 1)
targets = np.randn(32, 1)
out = net(x)
loss = ((out - targets) ** 2).mean()
loss.backward()
optimizer.step()
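The quick start takes a single optimizer step. A full loop also needs stale gradients cleared between steps; the sketch below assumes a PyTorch-style optimizer.zero_grad(), which may be named differently in numpygrad:

for step in range(200):
    optimizer.zero_grad()              # assumed API: reset .grad buffers
    out = net(x)
    loss = ((out - targets) ** 2).mean()
    loss.backward()
    optimizer.step()
    if step % 50 == 0:
        print(step, loss)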

Examples

The examples/ directory includes self-contained demos. Run any of them with:

python -m examples.<name>.main   # use --help for CLI options

If matplotlib is not installed, training and evaluation run as usual but no figures are generated. Install the examples extra to enable plotting:

pip install -e ".[examples]"   # adds matplotlib

Scalar Regression

MLP fit to a noisy 1D function. Demonstrates MSE loss, AdamW, and train/test splits.

python -m examples.regression_1d.main

2D Classification

Classifier on a pinwheel (interleaved spirals) dataset. Demonstrates cross-entropy loss and decision-boundary visualization.

python -m examples.classification_2d.main

MNIST

Conv-net classifier on handwritten digits. Data is downloaded automatically.

python -m examples.mnist.main

GPT-2 Character Language Model

A GPT-2-style transformer trained character-by-character on Shakespeare's complete works. Uses LayerNorm, Embedding, Dropout, GELU, causal masking via triu + masked_fill (sketched below), and N-D cross-entropy — all in pure NumPy.

python -m examples.gpt2.main   # ~45 min on CPU to generate proto-English
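The causal mask keeps each position from attending to later ones. A minimal sketch, assuming triu and masked_fill mirror their NumPy and PyTorch namesakes:

import numpygrad as np

T = 8                                    # sequence length
scores = np.randn(T, T)                  # raw attention scores for one head

mask = np.triu(np.ones((T, T)), 1) > 0   # True strictly above the diagonal = "future"
scores = scores.masked_fill(mask, float("-inf"))  # -inf becomes zero probability after softmax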

Project layout

src/numpygrad/
├── core/           # Array, autograd (Function, backward), dispatch, device
├── ops/            # Operators: elementwise, linalg, transforms, reductions, etc.
├── nn/             # Linear, MLP (and other modules)
├── optim/          # SGD and optimizer base
├── utils/          # Logging, I/O, visualizations
└── configuration.py

Tests live in tests/ and use Hypothesis plus PyTorch to check gradients against a reference.
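In the same spirit, a minimal gradient check comparing numpygrad against a PyTorch reference. The np.array constructor and numpy.asarray conversion of .grad are assumptions about the Array API, not verified against the repo's test helpers:

import numpy
import torch
import numpygrad as np

data = numpy.random.randn(3, 4)

x = np.array(data, requires_grad=True)    # assumed constructor from an ndarray
(x ** 2).sum().backward()

xt = torch.tensor(data, requires_grad=True)
(xt ** 2).sum().backward()

numpy.testing.assert_allclose(
    numpy.asarray(x.grad),                # assumed: Array converts via numpy.asarray
    xt.grad.numpy(),
    rtol=1e-6,
)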

Development

Run tests:

pytest
# or with optional deps
pip install -e ".[tests]" && pytest

License

MIT License. See LICENSE for details.
