A multi-dimensional array creation and manipulation library like NumPy, written from scratch in Python, along with a scalar-level autograd engine written in C/C++ with a Python wrapper.

Axon Library

Axon is a lightweight Python library for creating and manipulating multi-dimensional arrays, inspired by libraries such as NumPy. For now, it is written in Python only.

Axon.micro is a lightweight scalar-level autograd engine written in C/C++ with a Python wrapper. If you have seen Karpathy's micrograd, this is an upgraded version of it, with more functions and broader operation support.

Features

  • Element-wise operations (addition, multiplication, etc.)
  • Matrix multiplication
  • Broadcasting
  • Activation functions (ReLU, tanh, sigmoid, GELU)
  • Reshape, transpose, flatten
  • Data type conversion
  • Micrograd support (scalar-level autograd engine)

Installation

Clone the repository:

git clone https://github.com/shivendrra/axon.git
cd axon
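
Alternatively, since a built wheel is published on PyPI under the name axon_pypi, installing with pip should also work (assuming the published release is up to date with the repository):

pip install axon-pypi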

Usage

You can use it much like micrograd to build a simple neural network or perform scalar-level backpropagation.

Axon.array

import axon
from axon import array

# Create two 2D arrays
a = array([[1, 2], [3, 4]], dtype=axon.int32)
b = array([[5, 6], [7, 8]], dtype=axon.int32)

# Addition
c = a + b
print("Addition:\n", c)

# Multiplication
d = a * b
print("Multiplication:\n", d)

# Matrix Multiplication
e = a @ b
print("Matrix Multiplication:\n", e)

Output:

Addition:
 array([6, 8], [10, 12], dtype=int32)
Multiplication:
 array([5, 12], [21, 32], dtype=int32)
Matrix Multiplication:
 array([19, 22], [43, 50], dtype=int32)
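
The feature list above also mentions broadcasting, activation functions, and shape utilities, which are not demonstrated here, and their exact method names are not documented in this README. The snippet below is only a rough sketch assuming a NumPy-like interface (scalar broadcasting plus transpose() and relu() methods); see usage.md for the actual API.

# rough sketch only: the names below are assumptions, refer to usage.md
f = a + 2             # broadcasting a Python scalar over the array (assumed)
g = a.transpose()     # shape manipulation from the Features list (assumed name)
h = a.relu()          # element-wise activation (assumed name)
print(f, g, h, sep="\n")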

Anyway, refer to the documentation for a detailed usage guide:

  1. axon.md: for development purposes
  2. usage.md: for using it like NumPy
  3. axon_micro.md: for axon.micro, i.e. the scalar autograd engine

Axon.micro

from axon.micro import scalar

# create two scalar values
a = scalar(2)
b = scalar(3)

# build a small expression graph from them
c = a + b
d = a * b
e = c.relu()
f = d ** 2.0

# backpropagate gradients from f through the graph
f.backward()

print(a)
print(b)
print(c)
print(d)
print(e)
print(f)

You can also check out the example neural networks and run them on your system, or build your own :-D
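
As a rough illustration of what such a network looks like at the scalar level, here is a minimal sketch of a single neuron built only from the operations shown above (+, *, relu(), backward()). It assumes, micrograd-style, that printing a scalar after backward() shows its gradient; this README does not spell that out.

from axon.micro import scalar

# one neuron: y = relu(w1*x1 + w2*x2 + b), built only from scalar ops shown above
x1, x2 = scalar(1), scalar(-2)
w1, w2 = scalar(2), scalar(1)
b = scalar(4)

y = (w1 * x1 + w2 * x2 + b).relu()
y.backward()

# print the parameters; their reprs should include the propagated gradients,
# as in the example above (micrograd convention, assumed)
print(w1, w2, b)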

Forking the Repository

If you would like to contribute to this project, you can start by forking the repository:

  1. Click the "Fork" button at the top right of this page.
  2. Clone your forked repository to your local machine:
     git clone https://github.com/shivendrra/axon.git
  3. Create a new branch:
     git checkout -b my-feature-branch
  4. Make your changes.
  5. Commit and push your changes:
     git add .
     git commit -m "Add my feature"
     git push origin my-feature-branch
  6. Create a pull request on the original repository.

Testing

To run the unit tests, you will need to install PyTorch and NumPy, which the tests use as references for verifying the correctness of the computed values and gradients. Then simply run whichever file you prefer:

python -m tests.test_array # for testing the axon functions with numpy
python -m tests.test_micro # for testing the axon.micro functions with pytorch
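
If you add a test of your own, the general pattern is to compute the same operation in Axon and in the reference library and compare the results. The sketch below is only illustrative: the .tolist() accessor on the Axon result is hypothetical, so swap in whatever the array class actually exposes (the existing files in tests/ are the authoritative reference).

import numpy as np
import axon
from axon import array

def test_addition_matches_numpy():
    a = array([[1, 2], [3, 4]], dtype=axon.int32)
    b = array([[5, 6], [7, 8]], dtype=axon.int32)
    expected = (np.array([[1, 2], [3, 4]]) + np.array([[5, 6], [7, 8]])).tolist()
    # .tolist() on the axon result is hypothetical; use the library's real accessor
    assert (a + b).tolist() == expected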

Contributing

We welcome contributions! Please follow these steps to contribute:

  1. Fork the repository.
  2. Create a new branch for your feature or bugfix.
  3. Make your changes.
  4. Ensure all tests pass.
  5. Submit a pull request with a clear description of your changes.

License

This project is licensed under the MIT License. See the LICENSE file for more details.
