lightweight tensor library with autograd engine for training deep neural nets

Project description

axgrad

(project logo)

Overview

axgrad contains a NumPy-like framework for basic matrix operations (element-wise add/mul, matrix multiplication, and broadcasting), along with a PyTorch-style auto-differentiation engine.
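As a rough illustration of the element-wise and matmul semantics described above, here is a pure-Python sketch over nested lists. This is illustrative only; it is not axgrad's actual API, and the function names are made up:

```python
# Pure-Python sketch of element-wise add and matrix multiplication
# on 2-D nested lists (illustrative, not axgrad's real tensor API).

def elementwise_add(a, b):
    # element-wise addition of two equally-shaped 2-D lists
    return [[x + y for x, y in zip(ra, rb)] for ra, rb in zip(a, b)]

def matmul(a, b):
    # (m x k) @ (k x n) -> (m x n)
    return [[sum(a[i][p] * b[p][j] for p in range(len(b)))
             for j in range(len(b[0]))]
            for i in range(len(a))]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(elementwise_add(A, B))  # [[6, 8], [10, 12]]
print(matmul(A, B))           # [[19, 22], [43, 50]]
```

A real tensor library would add shape checks and broadcasting on top of these kernels, but the core loops look like this.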

Features

It has basic building blocks required to build a neural network:

  1. A basic tensor-ops framework that can do element-wise add/mul, transpose, broadcasting, matmul, etc.
  2. A gradient engine that can compute and update gradients automatically, much like micrograd but at the tensor level, i.e. autograd-like (work in progress!).
  3. Optimizer & loss-computation blocks to compute losses and optimize parameters (work in progress!). More will be added in the future...
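To make the gradient-engine idea concrete, here is a minimal micrograd-style scalar autograd sketch: each operation records a local backward rule, and `backward()` walks the graph in reverse topological order. This is a generic illustration of the technique, not axgrad's actual implementation:

```python
# Minimal reverse-mode autograd on scalars (micrograd-style sketch,
# not axgrad's real code).

class Value:
    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None
        self._prev = set(_children)

    def __add__(self, other):
        out = Value(self.data + other.data, (self, other))
        def _backward():
            # d(out)/d(self) = 1, d(out)/d(other) = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        out = Value(self.data * other.data, (self, other))
        def _backward():
            # d(out)/d(self) = other, d(out)/d(other) = self
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # topological sort, then propagate gradients in reverse
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for c in v._prev:
                    build(c)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

x, y = Value(2.0), Value(3.0)
z = x * y + x          # dz/dx = y + 1 = 4, dz/dy = x = 2
z.backward()
print(x.grad, y.grad)  # 4.0 2.0
```

A tensor-level engine applies the same bookkeeping, except each node holds an array and each backward rule is itself a tensor operation.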

Usage

This shows the basic usage of axgrad.engine and a few of axon's modules to perform tensor operations and build a sample neural network.

Anyway, prefer the documentation for a detailed usage guide:

  1. Usage.md: User documentation for AxGrad

Creating a MLP

To create a multi-layer perceptron in axgrad, just follow the same steps you would in PyTorch. Very basic: initialize two linear layers and a basic activation layer.

import axgrad
import axgrad.nn as nn

class MLP(nn.Module):
  def __init__(self, _in, _hid, _out, bias=False) -> None:
    super().__init__()
    self.layer1 = nn.Linear(_in, _hid, bias)
    self.gelu = nn.GELU()
    self.layer2 = nn.Linear(_hid, _out, bias)
  
  def forward(self, x):
    out = self.layer1(x)
    out = self.gelu(out)
    out = self.layer2(out)
    return out
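For intuition, the forward pass of the MLP above on a single input vector is just two matrix-vector products with a GELU in between. Here is a plain-Python sketch of that math, using the common tanh approximation of GELU; the weight values are made up purely for illustration:

```python
import math

def gelu(x):
    # tanh approximation of GELU
    return 0.5 * x * (1.0 + math.tanh(math.sqrt(2.0 / math.pi)
                                      * (x + 0.044715 * x ** 3)))

def linear(w, x):
    # y = W x (no bias, matching bias=False in the MLP above)
    return [sum(wi * xi for wi, xi in zip(row, x)) for row in w]

w1 = [[0.5, -0.2], [0.1, 0.3]]  # hypothetical 2x2 hidden weights
w2 = [[1.0, -1.0]]              # hypothetical 1x2 output weights
x = [1.0, 2.0]

h = [gelu(v) for v in linear(w1, x)]  # layer1 + GELU
out = linear(w2, h)                   # layer2
print(out)
```

In axgrad (as in PyTorch), the module system simply packages these steps and tracks the weights as trainable parameters.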

Refer to this Example for detailed info on building an MLP.

By the way, here are the outputs I got from my simple implementation, trained for 5k iterations (see the results image in the repo).

Contribution

Pull requests are welcome. For major changes, please open an issue first to discuss what you would like to change, and make sure to update tests as appropriate. Note that the project is still a work in progress.

License

This project is licensed under the Apache 2.0 License - see the LICENSE file for details.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

axgrad-0.1.0.tar.gz (378.1 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

axgrad-0.1.0-cp313-cp313-win_amd64.whl (274.5 kB)

Uploaded: CPython 3.13, Windows x86-64

File details

Details for the file axgrad-0.1.0.tar.gz.

File metadata

  • Download URL: axgrad-0.1.0.tar.gz
  • Upload date:
  • Size: 378.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.13.3

File hashes

Hashes for axgrad-0.1.0.tar.gz
  • SHA256: 2d115903f58c443bcea658e9722f82a83156238d26eafd0ec5602838a36f6c02
  • MD5: 7f049149c5ee056aebde41aa12017dcf
  • BLAKE2b-256: 87bba4d45ba9a8c9a8a63e3db8f806efdfb78c7bcb99663a42b6e9c4ccc344c3

See more details on using hashes here.

File details

Details for the file axgrad-0.1.0-cp313-cp313-win_amd64.whl.

File metadata

  • Download URL: axgrad-0.1.0-cp313-cp313-win_amd64.whl
  • Upload date:
  • Size: 274.5 kB
  • Tags: CPython 3.13, Windows x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.13.3

File hashes

Hashes for axgrad-0.1.0-cp313-cp313-win_amd64.whl
  • SHA256: 7cc3f82d584e2e7c082f38795390aac0ad72a2a5fb9cd67b9889d84fca48836d
  • MD5: 6e958f37c37ee68b0b7ca1a1d89045cc
  • BLAKE2b-256: 073e4d6107100a84fd576445e2e998940d279501d3216199b72d9050422794c3

See more details on using hashes here.
