
A lightweight deep learning framework from scratch

Project description

gradzero

A lightweight deep learning framework built from scratch with automatic differentiation.

Features

  • Automatic Differentiation: built-in autograd engine for computing gradients
  • Tensor Operations: NumPy-backed tensors with gradient tracking
  • Neural Network Layers: Linear, ReLU, Sigmoid, and more
  • Optimizers: SGD and Adam, with momentum and weight-decay support
  • Easy to Use: a simple, PyTorch-like API
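
The autograd engine follows the standard reverse-mode pattern: each value records the operation that produced it and a closure that propagates gradients back to its inputs. A minimal scalar-valued sketch of the idea (an illustration of the technique, not gradzero's actual internals):

```python
class Scalar:
    """Minimal reverse-mode autodiff node (illustration only)."""

    def __init__(self, value, parents=()):
        self.value = value
        self.grad = 0.0
        self._parents = parents
        self._backward_fn = None  # propagates self.grad to parents

    def __mul__(self, other):
        out = Scalar(self.value * other.value, (self, other))
        def _backward():
            self.grad += other.value * out.grad   # d(xy)/dx = y
            other.grad += self.value * out.grad   # d(xy)/dy = x
        out._backward_fn = _backward
        return out

    def __add__(self, other):
        out = Scalar(self.value + other.value, (self, other))
        def _backward():
            self.grad += out.grad                 # d(x+y)/dx = 1
            other.grad += out.grad                # d(x+y)/dy = 1
        out._backward_fn = _backward
        return out

    def backward(self):
        # Topologically order the graph, then apply the chain rule in reverse.
        order, seen = [], set()
        def visit(node):
            if id(node) not in seen:
                seen.add(id(node))
                for p in node._parents:
                    visit(p)
                order.append(node)
        visit(self)
        self.grad = 1.0
        for node in reversed(order):
            if node._backward_fn:
                node._backward_fn()

x = Scalar(3.0)
y = Scalar(4.0)
z = x * y + x        # z = xy + x  =>  dz/dx = y + 1 = 5, dz/dy = x = 3
z.backward()
```

gradzero applies the same mechanism to whole NumPy arrays rather than scalars.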

Installation

pip install gradzero

Quick Start

Creating Tensors with Factory Methods

import numpy as np
import gradzero as gz

# Create tensors using class-method factories
zeros = gz.Tensor.zeros((3, 3))          # Tensor filled with zeros
ones = gz.Tensor.ones((2, 4))            # Tensor filled with ones
randn = gz.Tensor.randn((3, 3))          # Values from a standard normal distribution
rand = gz.Tensor.rand((2, 2))            # Values from a uniform distribution

my_array = np.array([[1.0, 2.0], [3.0, 4.0]])
from_np = gz.Tensor.from_numpy(my_array) # From an existing NumPy array

# With gradient tracking
trainable = gz.Tensor.randn((3, 3), requires_grad=True)

Building a Neural Network

import gradzero as gz

# Define a model
model = gz.Sequential(
    gz.Linear(784, 128),
    gz.ReLU(),
    gz.Linear(128, 10),
)

# Loss and optimizer
criterion = gz.CrossEntropyLoss()
optimizer = gz.Adam(model.parameters(), lr=0.001)

# Training loop (input_tensor and target_tensor hold a batch of inputs and labels)
for epoch in range(100):
    # Forward pass
    output = model(input_tensor)
    loss = criterion(output, target_tensor)

    # Backward pass
    model.zero_grad()
    loss.backward()

    # Update parameters
    optimizer.step()

API Reference

Tensor Factory Methods

  • Tensor.zeros(shape, dtype=None, requires_grad=False) - Create a tensor filled with zeros
  • Tensor.ones(shape, dtype=None, requires_grad=False) - Create a tensor filled with ones
  • Tensor.randn(shape, dtype=None, requires_grad=False) - Create a tensor with values drawn from a standard normal distribution
  • Tensor.rand(shape, dtype=None, requires_grad=False) - Create a tensor with values drawn from the uniform distribution [0, 1)
  • Tensor.from_numpy(array, requires_grad=False) - Create a tensor from a NumPy array

Neural Network Layers

  • Linear(in_features, out_features, bias=True) - Fully connected layer
  • ReLU() - ReLU activation
  • Sigmoid() - Sigmoid activation
  • Sequential(*layers) - Container for sequential layers
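
The math behind these layers is compact. A Linear layer computes y = x @ W.T + b, and the activations are elementwise functions; a plain NumPy sketch (not gradzero's own code, and without gradient tracking):

```python
import numpy as np

def linear_forward(x, W, b=None):
    """Fully connected layer: y = x @ W.T + b.
    x: (batch, in_features), W: (out_features, in_features), b: (out_features,)"""
    y = x @ W.T
    if b is not None:
        y = y + b
    return y

def relu(x):
    return np.maximum(x, 0.0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Sequential is just function composition over the layer list:
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 784))
W1 = rng.standard_normal((128, 784)) * 0.01
W2 = rng.standard_normal((10, 128)) * 0.01
out = linear_forward(relu(linear_forward(x, W1, np.zeros(128))), W2, np.zeros(10))
```

`out` has shape (4, 10): one row of class scores per input sample.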

Loss Functions

  • MSELoss() - Mean squared error
  • CrossEntropyLoss() - Cross entropy loss for classification
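
Both losses reduce to short formulas. Assuming CrossEntropyLoss takes raw logits and integer class labels (the PyTorch convention; gradzero's exact signature may differ), the math is:

```python
import numpy as np

def mse_loss(pred, target):
    """Mean squared error: mean((pred - target)^2)."""
    return np.mean((pred - target) ** 2)

def cross_entropy_loss(logits, labels):
    """Softmax cross entropy from raw logits and integer labels.
    logits: (batch, num_classes), labels: (batch,)"""
    # Subtract the row max before exponentiating for numerical stability.
    shifted = logits - logits.max(axis=1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    # Negative log-probability of the correct class, averaged over the batch.
    return -np.mean(log_probs[np.arange(len(labels)), labels])
```

With all-zero logits over 3 classes the softmax is uniform, so the loss is log(3) regardless of the labels.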

Optimizers

  • SGD(parameters, lr=0.01, momentum=0.0, weight_decay=0.0) - Stochastic gradient descent
  • Adam(parameters, lr=0.001, betas=(0.9, 0.999), eps=1e-8, weight_decay=0.0) - Adam optimizer
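
The update rules behind these optimizers can be sketched in plain NumPy. This mirrors the standard SGD-with-momentum and Adam formulations (with bias correction and decoupled-style L2 weight decay folded into the gradient); it is not gradzero's internal code:

```python
import numpy as np

def sgd_step(param, grad, velocity, lr=0.01, momentum=0.0, weight_decay=0.0):
    """One SGD step: v <- momentum*v + grad; param <- param - lr*v."""
    grad = grad + weight_decay * param
    velocity = momentum * velocity + grad
    return param - lr * velocity, velocity

def adam_step(param, grad, m, v, t, lr=0.001, betas=(0.9, 0.999),
              eps=1e-8, weight_decay=0.0):
    """One Adam step at timestep t (t starts at 1)."""
    b1, b2 = betas
    grad = grad + weight_decay * param
    m = b1 * m + (1 - b1) * grad            # first-moment estimate
    v = b2 * v + (1 - b2) * grad ** 2       # second-moment estimate
    m_hat = m / (1 - b1 ** t)               # bias correction
    v_hat = v / (1 - b2 ** t)
    return param - lr * m_hat / (np.sqrt(v_hat) + eps), m, v
```

Note the difference in scale: a plain SGD step moves the parameter by lr * grad, while Adam's first step moves it by roughly lr * sign(grad) thanks to the normalization by sqrt(v_hat).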

License

MIT License

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

gradzero-0.1.0.tar.gz (7.7 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

gradzero-0.1.0-py3-none-any.whl (8.5 kB)

Uploaded Python 3

File details

Details for the file gradzero-0.1.0.tar.gz.

File metadata

  • Download URL: gradzero-0.1.0.tar.gz
  • Upload date:
  • Size: 7.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/2.3.0 CPython/3.12.12 Darwin/25.2.0

File hashes

Hashes for gradzero-0.1.0.tar.gz:

  • SHA256: ff7450cbe243495b24e5500529868b5217ae3a445ec9fa9f9720e430081ad575
  • MD5: 75c7fe626cb463d7860aa97304d67713
  • BLAKE2b-256: 6102e46f0f932b71a1a69271273435e4d21bb77877ee4f2de9c58339a51286ff

See more details on using hashes here.

File details

Details for the file gradzero-0.1.0-py3-none-any.whl.

File metadata

  • Download URL: gradzero-0.1.0-py3-none-any.whl
  • Upload date:
  • Size: 8.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/2.3.0 CPython/3.12.12 Darwin/25.2.0

File hashes

Hashes for gradzero-0.1.0-py3-none-any.whl:

  • SHA256: 5c72106660b4833eb76f0bb6a13ef353380a9934b17384e5593e8b2fb2b62eb5
  • MD5: 26a90b693470b3ac4daaee786bb08f79
  • BLAKE2b-256: fd138fde1298b41557c0fea8d7763487e6204c87030b2056f44c1bc94ff70d37

