
lightweight tensor library with autograd engine for training deep neural nets

Project description

axgrad

My attempt to make something like TinyGrad or PyTorch: a framework like PyTorch & micrograd, written fully in Python (I will add C & C++ components later for faster execution). It's supposed to be a good, lightweight C- and Python-based deep learning framework, which it isn't as of now (still building).

Overview

It contains a NumPy-like framework for basic matrix operations: element-wise add/mul, matrix multiplication, and broadcasting. I'm also building a PyTorch-like auto-differentiation engine: axgrad (work in progress!)
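To make the semantics concrete, here is a pure-Python sketch of the two operations the overview mentions, element-wise add and matmul, on nested lists. This only illustrates what the operations compute; it is not axgrad's actual API, and the function names here are just for illustration.

```python
# Illustration of element-wise add and matmul semantics on nested lists.
# Not axgrad's API -- just the math the overview describes.

def elementwise_add(a, b):
    # element-wise add of two 2-D nested lists with equal shapes
    return [[x + y for x, y in zip(ra, rb)] for ra, rb in zip(a, b)]

def matmul(a, b):
    # naive matrix multiplication: (m x n) @ (n x p) -> (m x p)
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))]
            for i in range(len(a))]

a = [[1, 2], [3, 4]]
b = [[5, 6], [7, 8]]
print(elementwise_add(a, b))  # [[6, 8], [10, 12]]
print(matmul(a, b))           # [[19, 22], [43, 50]]
```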

Features

It has basic building blocks required to build a neural network:

  1. A basic tensor ops framework that can easily do element-wise matrix add/mul, transpose, broadcasting, matmul, etc.
  2. A gradient engine that can compute and update gradients automatically, much like micrograd, but at the tensor level ~ autograd-like (work in progress!).
  3. Optimizer & loss computation blocks to compute losses and optimize parameters (work in progress!). I'll be adding more things in the future...
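Since the gradient engine is described as micrograd-like, here is a minimal scalar-valued sketch of the idea: each operation records a backward rule, and backward() walks the graph in reverse topological order. axgrad applies this same principle at the tensor level; this is an illustration of the technique, not axgrad's implementation.

```python
# Minimal micrograd-style reverse-mode autodiff on scalars.
# Illustrative only -- axgrad does this at the tensor level.

class Value:
    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._children = _children
        self._backward = lambda: None  # backward rule set by each op

    def __add__(self, other):
        out = Value(self.data + other.data, (self, other))
        def _backward():
            # d(out)/d(self) = 1, d(out)/d(other) = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        out = Value(self.data * other.data, (self, other))
        def _backward():
            # d(out)/d(self) = other, d(out)/d(other) = self
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # build topological order, then propagate gradients in reverse
        topo, seen = [], set()
        def build(v):
            if v not in seen:
                seen.add(v)
                for c in v._children:
                    build(c)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

x = Value(2.0)
y = Value(3.0)
z = x * y + x          # z = x*y + x
z.backward()
print(x.grad, y.grad)  # dz/dx = y + 1 = 4.0, dz/dy = x = 2.0
```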

Usage

This shows basic usage of axgrad.engine and a few of axon's modules to perform tensor operations and build a sample neural network.

Anyway, prefer the documentation for a detailed usage guide:

  1. axon.doc: for NumPy-like usage
  2. axgrad.doc: for building neural networks with the axon library (incomplete for now)

Creating an MLP

To create a multi-layer perceptron in axgrad, you just follow the same steps as in PyTorch. Very basic: initialize two linear layers & a basic activation layer.

import axgrad
import axgrad.nn as nn

class MLP(nn.Module):
  def __init__(self, _in, _hid, _out, bias=False) -> None:
    super().__init__()
    self.layer1 = nn.Linear(_in, _hid, bias)
    self.gelu = nn.GELU()
    self.layer2 = nn.Linear(_hid, _out, bias)
  
  def forward(self, x):
    out = self.layer1(x)
    out = self.gelu(out)
    out = self.layer2(out)
    return out
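For intuition, here is what that module computes, sketched in plain Python on a single input vector: linear → GELU → linear. This mirrors the structure only (using the tanh approximation of GELU), not axgrad's internals, and all names here are illustrative.

```python
# Plain-Python sketch of the forward pass of the MLP above:
# linear -> GELU -> linear, for one input vector. Illustrative only.
import math
import random

def linear(x, w, b=None):
    # y_i = sum_j w[i][j] * x[j] (+ b[i] if a bias is given)
    out = [sum(wi * xi for wi, xi in zip(row, x)) for row in w]
    if b is not None:
        out = [o + bi for o, bi in zip(out, b)]
    return out

def gelu(x):
    # tanh approximation of GELU applied element-wise
    return [0.5 * v * (1.0 + math.tanh(math.sqrt(2.0 / math.pi)
            * (v + 0.044715 * v ** 3))) for v in x]

random.seed(0)
_in, _hid, _out = 3, 4, 2
w1 = [[random.uniform(-1, 1) for _ in range(_in)] for _ in range(_hid)]
w2 = [[random.uniform(-1, 1) for _ in range(_hid)] for _ in range(_out)]

x = [1.0, -0.5, 0.25]
h = gelu(linear(x, w1))  # layer1 + activation
y = linear(h, w2)        # layer2
print(y)                 # 2 output values
```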

Refer to this example for detailed info on building an MLP.

By the way, here are the outputs I got from my implementation after running for 6k iterations: implemented results

Contribution

Pull requests are welcome. For major changes, please open an issue first to discuss what you would like to change. Please make sure to update tests as appropriate. But it's still a work in progress.

License

None!

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

axgrad-0.0.1.tar.gz (902.7 kB)

Uploaded Source

Built Distribution


axgrad-0.0.1-cp313-cp313-win_amd64.whl (119.7 kB)

Uploaded: CPython 3.13, Windows x86-64

File details

Details for the file axgrad-0.0.1.tar.gz.

File metadata

  • Download URL: axgrad-0.0.1.tar.gz
  • Upload date:
  • Size: 902.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.3

File hashes

Hashes for axgrad-0.0.1.tar.gz
  • SHA256: 70dd24569ae746c096bd8dc8e895a9febcf3ab70f137e1f0629412791d210df6
  • MD5: c2b06106fc529f7425f23dfb79a619e2
  • BLAKE2b-256: 2ff8fc1a5c75866312b51655acbfa2ebfb80d4b933ce451b20b1f7dd2ad17683

See more details on using hashes here.

File details

Details for the file axgrad-0.0.1-cp313-cp313-win_amd64.whl.

File metadata

  • Download URL: axgrad-0.0.1-cp313-cp313-win_amd64.whl
  • Upload date:
  • Size: 119.7 kB
  • Tags: CPython 3.13, Windows x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.3

File hashes

Hashes for axgrad-0.0.1-cp313-cp313-win_amd64.whl
  • SHA256: 0395384446bf8926e01e94e0602c118166243262491e18a3a0d237dec6a469c0
  • MD5: f9908d6be0f6040f039f3ab4416123b4
  • BLAKE2b-256: 1f27509a1b5d4676dde298884b1c5c675b4176e8115c886ec45dde7e2c40733d

See more details on using hashes here.
