
A tensor manipulation library wrapped over a scalar-level autograd engine to compute backpropagation like PyTorch, but more in the spirit of micrograd.


Drop

Drop is part of the Axon project. It is an autograd library that uses scalar-level autograd instead of tensor-level autograd: essentially a Python tensor class wrapped around scalar value objects. The core scalar operations are implemented in C/C++, making them faster and more efficient while supporting additional functions.
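To make the scalar-level idea concrete, here is a minimal micrograd-style sketch of such an engine in pure Python: every number is a node that records how it was produced, and backpropagation walks the graph in reverse applying the chain rule. This is an illustration of the technique only, not drop's actual implementation; the `Value` class and its internals are hypothetical.

```python
import math

class Value:
    """Minimal scalar-autograd node (micrograd-style sketch, not drop's code)."""
    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None   # filled in by the op that created this node
        self._prev = set(_children)     # parent nodes in the computation graph

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            self.grad += out.grad       # d(a+b)/da = 1
            other.grad += out.grad      # d(a+b)/db = 1
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            self.grad += other.data * out.grad  # d(a*b)/da = b
            other.grad += self.data * out.grad  # d(a*b)/db = a
        out._backward = _backward
        return out

    def tanh(self):
        t = math.tanh(self.data)
        out = Value(t, (self,))
        def _backward():
            self.grad += (1 - t * t) * out.grad  # d(tanh a)/da = 1 - tanh^2(a)
        out._backward = _backward
        return out

    def backward(self):
        # topological sort, then run each node's local backward rule in reverse
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

# quick demonstration: y = tanh(x * x), so dy/dx = 2x * (1 - tanh(x^2)^2)
x = Value(2.0)
y = (x * x).tanh()
y.backward()
```

A tensor class then only needs to hold a grid of such nodes and apply operations elementwise; the scalar graph does all the gradient bookkeeping.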

The tensor class is a wrapper over the scalar class, written in pure Python. A C/C++ implementation also exists in tensor.cpp; it works correctly, serves the same purpose, and runs faster than the Python version.

Fun fact: roughly 90% of this project was developed by GPT-4o and o3-mini, as an experiment to test the understanding, reasoning, and ability of language models on large, logically complex projects like this one.

Bounty:

Fix the bug to claim the reward. More info in this GitHub issue.

Features

  • Basic Arithmetic Operations: Addition, subtraction, multiplication, division, exponentiation.
  • Common Mathematical Functions: ReLU, sigmoid, tanh, SiLU, and more.
  • Automatic Gradient Computation: Supports backpropagation for both scalar and tensor operations.
  • Efficient and Fast: Core operations implemented in C/C++.

Installation

Install the library from PyPI:

pip install axon-drop 

Or clone this repository and build it from source:

git clone https://github.com/shivendrra/axon-drop.git
cd axon-drop

Scalar

The Scalar library is a simple implementation of scalar operations with automatic gradient computation. It supports basic operations like addition, multiplication, exponentiation, and common functions such as ReLU, sigmoid, and tanh. The library also includes backpropagation functionality for gradient updates.

Usage

Here's a simple example demonstrating how to use the Scalar library:

from drop import scalar

# Initialize scalars
x1 = scalar(2)
x2 = scalar(3)

# Perform operations
a1 = x1 + x2
a2 = x1 - x2
y = (a1 * a2).tanh()

# Perform backpropagation
y.backward()

# Print gradients
print(x1.grad)  # Gradient of x1
print(x2.grad)  # Gradient of x2
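
As a sanity check, these gradients can be worked out by hand: since a1 * a2 = (x1 + x2)(x1 - x2) = x1^2 - x2^2, we have y = tanh(u) with u = x1^2 - x2^2, so dy/dx1 = 2*x1*(1 - tanh(u)^2) and dy/dx2 = -2*x2*(1 - tanh(u)^2). A quick pure-Python check, independent of drop:

```python
import math

# y = tanh((x1 + x2) * (x1 - x2)) = tanh(x1**2 - x2**2)
x1, x2 = 2.0, 3.0
u = x1 ** 2 - x2 ** 2            # -5.0
sech2 = 1 - math.tanh(u) ** 2    # derivative of tanh at u

dy_dx1 = 2 * x1 * sech2          # chain rule: du/dx1 = 2*x1
dy_dx2 = -2 * x2 * sech2         # chain rule: du/dx2 = -2*x2

print(dy_dx1, dy_dx2)            # both tiny, since tanh saturates at u = -5
```

The values printed by the scalar example above should match these analytic gradients to within floating-point tolerance.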

Tensor

The Tensor class extends the capabilities of the Scalar class to support multi-dimensional arrays, similar to PyTorch's Tensor class. It allows for more complex operations and is essential for implementing neural networks or any machine learning models that require multi-dimensional data.

Usage

Here's a simple example demonstrating how to use the Tensor class:

from drop import tensor

# Initialize tensors
a = tensor([[2, 4, 5, -4], [-3, 0, 9, -1]])
b = tensor([[1, 0, -2, 0], [-1, 10, -2, 4]])

# Perform operations
c = a + b
d = c.tanh()
e = d.silu()
f = e ** 2
g = f.sigmoid()
h = g.sum()

# Perform backpropagation
h.backward()

# Print gradients
print("Gradients of a:\n", a.grad)
print("Gradients of b:\n", b.grad)

Explanation:

  • Tensor Initialization: Tensors are initialized with multi-dimensional arrays, and gradients are automatically set up for each operation.
  • Operations: The example demonstrates basic operations (+, **, etc.), as well as more advanced functions (tanh, silu, sigmoid).
  • Backpropagation: The .backward() function computes gradients for all tensors involved in the computation graph.
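
The "tensor as a wrapper over scalars" pattern described above can be sketched in a few lines of pure Python: the tensor holds a grid of autograd scalar nodes and applies operations elementwise, so a single scalar `backward()` drives gradients for the whole tensor. This is an illustrative toy, not drop's actual tensor class; the names `Sc` and `Tiny2D` are hypothetical.

```python
class Sc:
    """Tiny scalar-autograd node (addition only) used to illustrate the wrapper idea."""
    def __init__(self, data, children=()):
        self.data, self.grad = data, 0.0
        self._backward, self._prev = (lambda: None), children

    def __add__(self, other):
        out = Sc(self.data + other.data, (self, other))
        def _backward():
            self.grad += out.grad   # d(a+b)/da = 1
            other.grad += out.grad  # d(a+b)/db = 1
        out._backward = _backward
        return out

    def backward(self):
        topo, seen = [], set()
        def build(v):
            if id(v) not in seen:
                seen.add(id(v))
                for c in v._prev:
                    build(c)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

class Tiny2D:
    """A 2-D 'tensor' that is just a nested list of scalar nodes."""
    def __init__(self, rows):
        self.data = [[x if isinstance(x, Sc) else Sc(float(x)) for x in r]
                     for r in rows]

    def __add__(self, other):  # elementwise add delegates to the scalar op
        return Tiny2D([[a + b for a, b in zip(ra, rb)]
                       for ra, rb in zip(self.data, other.data)])

    def sum(self):             # reduce to one scalar node for backward()
        flat = [x for r in self.data for x in r]
        out = flat[0]
        for x in flat[1:]:
            out = out + x
        return out

    @property
    def grad(self):            # collect per-element gradients back into a grid
        return [[x.grad for x in r] for r in self.data]

a = Tiny2D([[1, 2], [3, 4]])
b = Tiny2D([[5, 6], [7, 8]])
s = (a + b).sum()
s.backward()
print(a.grad)  # every element of a contributes with weight 1 to the sum
```

Because sum() is a chain of scalar additions, every leaf receives gradient 1.0, which mirrors what `h.backward()` does in the drop example above.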

Contributing

Feel free to open issues or submit pull requests if you have any improvements or bug fixes!

License

This project is licensed under the MIT License - see the LICENSE file for details.

Download files

Download the file for your platform.

Source Distribution

axon_drop-0.0.9.tar.gz (264.8 kB)

Uploaded Source

Built Distribution


axon_drop-0.0.9-py3-none-any.whl (269.4 kB)

Uploaded Python 3

File details

Details for the file axon_drop-0.0.9.tar.gz.

File metadata

  • Download URL: axon_drop-0.0.9.tar.gz
  • Upload date:
  • Size: 264.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.11.2

File hashes

Hashes for axon_drop-0.0.9.tar.gz:

  • SHA256: 720b879ea842673e29141c014a08363883bcc4b4ae26a82bdf09ab8fc9d1b7ed
  • MD5: f765480f3b196173eba72b5de75e4a27
  • BLAKE2b-256: 0dddebf50530229393465dd6705f1f639f40e89b6a8fad548d6fbdc6fa7d338d


File details

Details for the file axon_drop-0.0.9-py3-none-any.whl.

File metadata

  • Download URL: axon_drop-0.0.9-py3-none-any.whl
  • Upload date:
  • Size: 269.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.11.2

File hashes

Hashes for axon_drop-0.0.9-py3-none-any.whl:

  • SHA256: a03999b498cf98d4dbee7770c1a446c847a2ce1fa68f690c54fd59360cd6d255
  • MD5: 5e5da44fb4c0c8f79fe5d47bf9a5dd02
  • BLAKE2b-256: 2b6f5d51a4d6fd54813f031a0fcac3e7ef3e2f048fbe99e6733a73169974809f

