
Noni — a tiny tensor library with autograd, for humans.

Project description

Noni (WIP)

A minimal tensor library with autograd, flexible enough to build good-enough deep learning models.

Familiar API

from noni import Tensor  # import path assumed to mirror the noni.nn imports below

a = Tensor([[1., 2.], [3., 4.]], requires_grad=True)
b = Tensor([[0.5, -1.], [2., 0.]], requires_grad=True)

# Each op records its backward function
c = a * b        # op="*",  backward: dc/da = b, dc/db = a
d = c.sum()      # op="sum", backward: ones

d.backward()     # topological sort → apply each _backward in reverse

print(a.grad)    # dL/da = b.data = [[0.5, -1.], [2., 0.]]
print(b.grad)    # dL/db = a.data = [[1., 2.], [3., 4.]]
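The printed gradients follow from the product rule: with d = sum(a * b), each element of dL/da is the corresponding element of b, and vice versa. This can be cross-checked with plain NumPy and a finite difference (a verification sketch, independent of Noni):

```python
import numpy as np

a = np.array([[1., 2.], [3., 4.]])
b = np.array([[0.5, -1.], [2., 0.]])

# d = sum(a * b), so dd/da = b and dd/db = a (elementwise product rule)
grad_a = b
grad_b = a

# Finite-difference check on one element of a
eps = 1e-6
a_plus = a.copy()
a_plus[0, 0] += eps
numeric = ((a_plus * b).sum() - (a * b).sum()) / eps
print(abs(numeric - grad_a[0, 0]) < 1e-4)  # True
```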

Common modules for everything

from noni import Tensor  # import path assumed
from noni.nn import Linear, LayerNorm, MultiHeadAttention, CrossEntropyLoss

# A simple 2-layer MLP
W1 = Linear(784, 256)
W2 = Linear(256, 10)

x = Tensor(some_batch)    # some_batch: e.g. a (batch_size, 784) array
h = W1(x).relu()
logits = W2(h)

loss = CrossEntropyLoss()(logits, targets)   # targets: integer class labels
loss.backward()   # gradients in W1.weight.grad, W2.weight.grad etc.
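For intuition about the shapes flowing through that MLP, the same forward pass can be traced with a NumPy stand-in (illustrative sketch only; `linear` below is a hypothetical minimal re-implementation with Kaiming-style init, not Noni's `Linear`):

```python
import numpy as np

rng = np.random.default_rng(0)

def linear(in_features, out_features):
    # Kaiming-style init: scale weights by sqrt(2 / fan_in)
    w = rng.normal(0.0, np.sqrt(2.0 / in_features), (in_features, out_features))
    b = np.zeros(out_features)
    return lambda x: x @ w + b

W1 = linear(784, 256)
W2 = linear(256, 10)

x = rng.normal(size=(32, 784))   # a batch of 32 flattened 28x28 images
h = np.maximum(W1(x), 0.0)       # ReLU
logits = W2(h)
print(logits.shape)              # (32, 10)
```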

Build your own

Noni ships with two backends: OpenCL (GPU) and NumPy (CPU). Native CUDA support is in progress, along with Vulkan compute and Triton, but you can always implement and register your own backend.

from noni.backends import Backend, register_backend


class MyDevice(Backend):
	...

register_backend("mygpu", MyDevice())

Modules

Linear: Fully connected layer with weight + bias parameters, initialized with Kaiming initialization
Embedding: Lookup table for token embeddings with a scatter-add backward pass
LayerNorm: Normalizes across the last N dimensions with learned affine parameters
Dropout: Inverted dropout, applied during training for regularization
MultiHeadAttention: Multi-head self-attention with an optional causal mask for autoregressive models
FeedForward: Position-wise feedforward network using GELU activation
TransformerBlock: Pre-norm residual block combining Multi-Head Attention and FeedForward layers
CrossEntropyLoss: Numerically stable implementation using log-softmax + negative log likelihood
Optimizers: SGD, Adam, AdamW, and a CosineAnnealingLR scheduler
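The CrossEntropyLoss entry refers to the standard stability trick: compute log-softmax by subtracting the row maximum before exponentiating, then take the negative log likelihood of the target class. In NumPy (an illustrative sketch, not Noni's code):

```python
import numpy as np

def cross_entropy(logits, targets):
    # Stable log-softmax: subtract the row max before exponentiating
    shifted = logits - logits.max(axis=1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    # Negative log likelihood of the target class, averaged over the batch
    return -log_probs[np.arange(len(targets)), targets].mean()

logits = np.array([[2.0, 0.5, -1.0], [0.1, 0.2, 0.3]])
targets = np.array([0, 2])
print(round(cross_entropy(logits, targets), 4))  # 0.6216
```

Subtracting the max leaves the softmax unchanged but keeps `np.exp` from overflowing on large logits.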

Building wheels

python -m build
twine upload dist/*

Download files

Download the file for your platform.

Source Distribution

noniml-0.1.1.tar.gz (32.3 kB view details)

Uploaded Source

Built Distribution

noniml-0.1.1-py3-none-any.whl (27.5 kB view details)

Uploaded Python 3

File details

Details for the file noniml-0.1.1.tar.gz.

File metadata

  • Download URL: noniml-0.1.1.tar.gz
  • Upload date:
  • Size: 32.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.13.9

File hashes

Hashes for noniml-0.1.1.tar.gz
  • SHA256: 6f1a47bd2d261b5f13283b5bebda43ed3f2e3ce4bb67bf36b5081d1c5c9dcd9e
  • MD5: 49b32fc5811c02ad71898c25e9e9d347
  • BLAKE2b-256: 6178b4d76c9f0e04e86b1efd414a360de2a7ee4bf19b07a2d2a59eb49b2af668


File details

Details for the file noniml-0.1.1-py3-none-any.whl.

File metadata

  • Download URL: noniml-0.1.1-py3-none-any.whl
  • Upload date:
  • Size: 27.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.13.9

File hashes

Hashes for noniml-0.1.1-py3-none-any.whl
  • SHA256: 740bec97a0b4bec3c92ded163bf423fca59c1ddd95ff4335b492c81556a53b5e
  • MD5: a6bee469e0df86c2c47a804c482da51f
  • BLAKE2b-256: b1fc180b5f2aa61cc7e38a5d842465a0ee7ee480838a57bff0ff2bff707aa8fe

