
Automatic differentiation library for basic arithmetic operations

Project description

SimpleGrad

SimpleGrad is a lightweight automatic differentiation library written in C++ with Python bindings.

Compatible Operating Systems:

  • Linux (x86_64 architecture only)

Installation

pip install simplegrad

Features

  • Multi-layer perceptron (MLP) that can be used for regression and classification tasks
  • Supports basic arithmetic operations
  • Lightweight and easy to use
  • Gradient computation
  • Backpropagation
  • Numpy compatibility
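To illustrate what "gradient computation" and "backpropagation" over basic arithmetic mean, here is a minimal pure-Python sketch of reverse-mode automatic differentiation. This `Scalar` class is illustrative only; it is not simplegrad's API or internals (those live in C++), just the general technique the library implements:

```python
# A minimal sketch of reverse-mode autodiff over basic arithmetic.
# Illustrative only -- this Scalar class is NOT simplegrad's API.

class Scalar:
    def __init__(self, data, _parents=(), _backward=lambda: None):
        self.data = data
        self.grad = 0.0
        self._parents = _parents
        self._backward = _backward

    def __add__(self, other):
        other = other if isinstance(other, Scalar) else Scalar(other)
        out = Scalar(self.data + other.data, (self, other))
        def _backward():
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Scalar) else Scalar(other)
        out = Scalar(self.data * other.data, (self, other))
        def _backward():
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topologically order the graph, then apply the chain rule in reverse.
        order, seen = [], set()
        def visit(node):
            if node not in seen:
                seen.add(node)
                for p in node._parents:
                    visit(p)
                order.append(node)
        visit(self)
        self.grad = 1.0
        for node in reversed(order):
            node._backward()

a = Scalar(2.0)
b = Scalar(3.0)
c = a * b + a
c.backward()
print(a.grad, b.grad)  # dc/da = b + 1 = 4.0, dc/db = a = 2.0
```

Each arithmetic operation records its inputs and a local backward rule; `backward()` then walks the graph in reverse topological order, accumulating gradients by the chain rule.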

Usage

Here's a quick example of how to use MLP in SimpleGrad:

from simplegrad import MLP, Node
from sklearn import datasets

# Generate a toy binary classification dataset
X, y = datasets.make_classification(
    n_samples=1000,
    n_features=10,
    n_classes=2,
    random_state=42,  # for reproducibility
)
lr = 0.01
batch_size = 16
epochs = 10

# Define the model
model = MLP(
    10, [16, 1]
)  # 10 input nodes, one hidden layer of 16 nodes, 1 output node

# Training data
n_batches = (len(X) + batch_size - 1) // batch_size  # Ceiling division

for epoch in range(epochs):
    epoch_loss = 0.0
    for i in range(0, len(X), batch_size):
        batch_X = X[i : i + batch_size]
        batch_y = y[i : i + batch_size]
        current_batch_size = len(batch_X)  # Handle last batch

        batch_loss = 0.0
        # model.zero_grad()  # not needed: gradients are automatically reset by the step function

        # Accumulate gradients over batch
        for x, y_true in zip(batch_X, batch_y):
            y_hat = model(x)[0]
            y_true = Node(y_true)
            loss = (y_hat - y_true) ** 2
            loss = loss * (1.0 / current_batch_size)  # Normalize loss
            batch_loss += loss.data()
            loss.backward()

        model.step(lr)  # Update weights using accumulated gradients
        del loss, y_hat, y_true  # Clean up
        epoch_loss += batch_loss

    # Average loss over all batches
    print(f"Epoch {epoch+1}, Average Loss: {epoch_loss/n_batches:.3f}")
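The batching arithmetic in the loop above can be checked standalone, without simplegrad: ceiling division guarantees the ragged last batch is counted, and normalizing each squared error by the current batch size before summing is the same as taking the batch mean. The numbers below just mirror the example's hyperparameters:

```python
# Standalone check of the batching arithmetic used in the training loop.

n_samples, batch_size = 1000, 16
n_batches = (n_samples + batch_size - 1) // batch_size  # ceiling division
print(n_batches)  # 63: 62 full batches of 16, plus a last batch of 8

# Every sample lands in exactly one batch, including the ragged last one.
covered = sum(
    len(range(i, min(i + batch_size, n_samples)))
    for i in range(0, n_samples, batch_size)
)
print(covered)  # 1000

# Normalizing each per-sample loss by the batch size and summing
# equals the mean loss over the batch.
errors = [0.5, 1.5, 2.0]  # toy per-sample squared errors
batch_loss = sum(e * (1.0 / len(errors)) for e in errors)
print(abs(batch_loss - sum(errors) / len(errors)) < 1e-12)  # True
```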

Download files


Source Distribution

simplegrad-0.0.2.tar.gz (119.3 kB)

Uploaded: Source

Built Distribution


simplegrad-0.0.2-py2.py3-none-manylinux2014_x86_64.whl (117.8 kB)

Uploaded: Python 2, Python 3

File details

Details for the file simplegrad-0.0.2.tar.gz.

File metadata

  • Download URL: simplegrad-0.0.2.tar.gz
  • Upload date:
  • Size: 119.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.0.1 CPython/3.12.3

File hashes

Hashes for simplegrad-0.0.2.tar.gz
  • SHA256: 527ea1bd980c8b61c90937d94dfb9bf5346064b1e0b1f83e8ad87b91eceb96ba
  • MD5: 103563ea1cf5b44cbbd9384df2f67400
  • BLAKE2b-256: 2bcc46e2a35e68fd31d25a4d554a98c98df49d471c0a4ce1fca47b63ca8767c0


File details

Details for the file simplegrad-0.0.2-py2.py3-none-manylinux2014_x86_64.whl.

File hashes

Hashes for simplegrad-0.0.2-py2.py3-none-manylinux2014_x86_64.whl
  • SHA256: 073082a6d538568a47e75ccde0e86890327b68bd46acacc095d09103c9706c67
  • MD5: 5a564066d06a588d75178d1d5e38e6ee
  • BLAKE2b-256: a71cbf4f997250fbf8d2421cc3519ebb51aa6f2712453b50e3819cf5d506f1b0

