Noni — a tiny tensor library with autograd, for humans.

Project description

Noni (WIP)

A minimal tensor library with autograd, flexible enough for building good-enough deep learning models.

Familiar API

a = Tensor([[1., 2.], [3., 4.]], requires_grad=True)
b = Tensor([[0.5, -1.], [2., 0.]], requires_grad=True)

# Each op records its backward function
c = a * b        # op="*",  backward: dc/da = b, dc/db = a
d = c.sum()      # op="sum", backward: ones

d.backward()     # topological sort → apply each _backward in reverse

print(a.grad)    # dL/da = b.data = [[0.5, -1.], [2., 0.]]
print(b.grad)    # dL/db = a.data = [[1., 2.], [3., 4.]]
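The mechanics behind backward() can be sketched independently of Noni: every op stores a closure that knows its local gradient, and backward() topologically sorts the graph, seeds the output gradient with 1, and runs each closure in reverse. A minimal scalar-valued sketch of that idea (illustrative only, not Noni's actual Tensor implementation):

```python
class Scalar:
    """Tiny scalar autograd node (hypothetical, for illustration)."""
    def __init__(self, data, parents=(), op=""):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None  # set by the op that created this node
        self._parents = parents
        self.op = op

    def __mul__(self, other):
        out = Scalar(self.data * other.data, (self, other), "*")
        def _backward():
            self.grad += other.data * out.grad   # d(a*b)/da = b
            other.grad += self.data * out.grad   # d(a*b)/db = a
        out._backward = _backward
        return out

    def backward(self):
        # Topological sort, then apply each node's _backward in reverse.
        topo, seen = [], set()
        def visit(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    visit(p)
                topo.append(v)
        visit(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

a = Scalar(3.0)
b = Scalar(2.0)
c = a * b
c.backward()
print(a.grad, b.grad)  # 2.0 3.0
```

The same structure generalizes to tensors: the closures accumulate array-valued gradients instead of floats, which is why `a.grad` above comes out as `b.data` in the 2x2 example.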

Common Modules for everything

from noni.nn import Linear, LayerNorm, MultiHeadAttention, CrossEntropyLoss

# A simple 2-layer MLP
W1 = Linear(784, 256)
W2 = Linear(256, 10)

x = Tensor(some_batch)
h = W1(x).relu()
logits = W2(h)

loss = CrossEntropyLoss()(logits, targets)
loss.backward()   # gradients in W1.weight.grad, W2.weight.grad etc.
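For intuition about what those modules compute, the same forward pass can be written directly in NumPy. Shapes and the Kaiming-style scaling match the 784→256→10 MLP above; the variable names here are illustrative, not Noni's internals:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal((32, 784))           # batch of 32 flattened 28x28 images

# Kaiming-style init for ReLU layers: std = sqrt(2 / fan_in)
W1 = rng.standard_normal((784, 256)) * np.sqrt(2.0 / 784)
b1 = np.zeros(256)
W2 = rng.standard_normal((256, 10)) * np.sqrt(2.0 / 256)
b2 = np.zeros(10)

h = np.maximum(x @ W1 + b1, 0.0)             # Linear + ReLU
logits = h @ W2 + b2                         # second Linear

print(logits.shape)  # (32, 10)
```

Noni's version does the same math but records each op so that `loss.backward()` can fill in `W1.weight.grad` and friends automatically.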

Build your own

Noni has OpenCL (GPU) and NumPy (CPU) backends, and work is underway to support CUDA natively as well as Vulkan compute and Triton. You can always implement and register your own backend if you prefer.

from noni.backends import Backend, register_backend


class MyDevice(Backend):
	...

register_backend("mygpu", MyDevice())
Modules

  • Linear: Fully connected layer with weight + bias parameters, initialized with Kaiming initialization
  • Embedding: Lookup table for token embeddings with a scatter-add backward pass
  • LayerNorm: Normalizes across the last N dimensions with learned affine parameters
  • Dropout: Inverted dropout applied during training for regularization
  • MultiHeadAttention: Multi-head self-attention with an optional causal mask for autoregressive models
  • FeedForward: Position-wise feedforward network using GELU activation
  • TransformerBlock: Pre-norm residual block combining MultiHeadAttention and FeedForward layers
  • CrossEntropyLoss: Numerically stable implementation using log-softmax + negative log likelihood
  • Optimizers: SGD, Adam, AdamW, and a CosineAnnealingLR scheduler
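The "numerically stable" trick behind CrossEntropyLoss is worth seeing once: subtracting the row max before exponentiating keeps exp() from overflowing on large logits. A standalone NumPy sketch of log-softmax + NLL (illustrative, not Noni's source):

```python
import numpy as np

def cross_entropy(logits, targets):
    # log-softmax: shift by the row max so exp() never sees a large
    # positive argument; the shift cancels out in the softmax ratio.
    shifted = logits - logits.max(axis=1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    # Negative log likelihood of the correct class, averaged over the batch.
    return -log_probs[np.arange(len(targets)), targets].mean()

logits = np.array([[1000.0, 0.0], [0.0, 1000.0]])  # naive softmax would overflow
loss = cross_entropy(logits, np.array([0, 1]))
print(loss)  # ~0.0: the correct class gets probability ~1, no overflow
```

A naive `exp(logits) / exp(logits).sum()` on the same input produces inf/inf = nan; the shifted form gives the exact same mathematical result without ever overflowing.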

Building wheels

python -m build
twine upload --repository testpypi dist/*
twine upload dist/*

Project details


Download files

Download the file for your platform.

Source Distribution

noniml-0.1.0.tar.gz (32.2 kB)

Uploaded Source

Built Distribution


noniml-0.1.0-py3-none-any.whl (27.5 kB)

Uploaded Python 3

File details

Details for the file noniml-0.1.0.tar.gz.

File metadata

  • Download URL: noniml-0.1.0.tar.gz
  • Upload date:
  • Size: 32.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.13.9

File hashes

Hashes for noniml-0.1.0.tar.gz
Algorithm Hash digest
SHA256 0d1b12c5bb235e4a5fd7807f2435d47c4d0d254383504e12506cf090900cd6f4
MD5 84245456650cefd3d3f671e6d7c31d55
BLAKE2b-256 3e39339799c3255a2d3bf2944c042b91597a8e42c5a676e085113ae878706203


File details

Details for the file noniml-0.1.0-py3-none-any.whl.

File metadata

  • Download URL: noniml-0.1.0-py3-none-any.whl
  • Upload date:
  • Size: 27.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.13.9

File hashes

Hashes for noniml-0.1.0-py3-none-any.whl
Algorithm Hash digest
SHA256 a2b4713bce0700f8609d1bc9799d35178ee3c8ef0f7f1090e6a48edb55cdb4f1
MD5 0740155fe80daec6357e46c00cda4bc5
BLAKE2b-256 8f3424b5c6896a13d70bcc01aa5298b6bb92b05ed3a9e3e13f727b7f2bf4141f

