PyOpenCL-based deep learning playground with autograd, kernels, and high-level APIs.

Project description

netcl – PyOpenCL Deep Learning Playground

netcl is an experimental PyOpenCL-based deep learning framework. It combines low-level OpenCL kernels (Conv/Matmul/Elementwise) with a lightweight autograd engine and a high-level API (Modules, Trainer, Serializer), without depending on other deep-learning frameworks.

Installation

pip install netcl        # from PyPI
# or, from a source checkout:
pip install .

Requirements: Python ≥ 3.10, NumPy, PyOpenCL, and an available OpenCL device.
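To verify the "available OpenCL device" requirement before installing, the platforms and devices PyOpenCL can see may be listed directly. This is a minimal sketch using PyOpenCL's standard API (not part of netcl); it returns an empty list if PyOpenCL or an OpenCL driver is missing:

```python
def list_opencl_devices():
    """Return (platform_name, device_name) pairs, or [] if OpenCL is unavailable."""
    try:
        import pyopencl as cl
    except ImportError:
        return []  # PyOpenCL is not installed
    try:
        platforms = cl.get_platforms()
    except cl.LogicError:
        return []  # no OpenCL ICD / driver found
    devices = []
    for p in platforms:
        for d in p.get_devices():
            devices.append((p.name, d.name))
    return devices

if __name__ == "__main__":
    for platform_name, device_name in list_opencl_devices():
        print(f"{platform_name}: {device_name}")
```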

Quick Start (MNIST mini-MLP)

import numpy as np
from netcl.core.device import manager
from netcl.nn.layers import Sequential, Flatten, Linear, ReLU
from netcl import autograd as ag
from netcl.optim import Adam
from netcl.core.tensor import Tensor

dev = manager.default()
q = dev.queue

model = Sequential(
    Flatten(),
    Linear(q, in_features=28*28, out_features=128),
    ReLU(),
    Linear(q, in_features=128, out_features=10),
)
opt = Adam(model.parameters(), lr=5e-3)

def one_hot(y, n=10):
    oh = np.zeros((y.shape[0], n), dtype=np.float32)
    oh[np.arange(y.shape[0]), y] = 1
    return oh

xb = np.random.randn(32, 1, 28, 28).astype(np.float32)
yb = one_hot(np.random.randint(0, 10, size=(32,)))

tape = ag.Tape()
ag.set_current_tape(tape)                 # record ops on this tape
x = ag.tensor(Tensor.from_host(q, xb))    # upload the batch to the device
y = ag.tensor(Tensor.from_host(q, yb))
logits = model(x)
loss = ag.cross_entropy(logits, y)
tape.backward(loss)                       # reverse pass over the recorded tape
opt.step(); opt.zero_grad()
ag.set_current_tape(None)                 # stop recording
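Assuming ag.cross_entropy computes standard softmax cross-entropy over one-hot targets (an assumption about netcl's semantics, not confirmed by this page), a plain NumPy reference is useful for sanity-checking loss values:

```python
import numpy as np

def cross_entropy_ref(logits, targets_onehot):
    """Numerically stable softmax cross-entropy, averaged over the batch."""
    # Subtract the row max before exponentiating to avoid overflow.
    z = logits - logits.max(axis=1, keepdims=True)
    log_probs = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    return -(targets_onehot * log_probs).sum(axis=1).mean()
```

With uniform logits over 10 classes the loss is ln(10) ≈ 2.3026, a handy baseline for an untrained MNIST classifier.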

Key Features

  • Autograd: tape-based, with backward passes for the core ops (Matmul, Conv2d, Pooling, Elementwise).
  • Modules/High-Level API: Linear, Conv2d, BatchNorm2d, Sequential, @model decorator, Trainer.
  • Optimizations: Conv algo heuristics/autotuning, optional mixed precision, buffer pool.
  • Serialization: Save Sequential models as JSON (architecture) + NPZ (weights) via netcl.io.serialization.
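To make "tape-based" concrete, here is a minimal NumPy illustration of the idea — not netcl's actual implementation: each forward op records a backward closure on a shared tape, and backward() replays the tape in reverse:

```python
import numpy as np

class Var:
    """A value plus its gradient, attached to a tape of backward closures."""
    def __init__(self, value, tape=None):
        self.value = np.asarray(value, dtype=np.float64)
        self.grad = np.zeros_like(self.value)
        self.tape = tape

def mul(a, b):
    out = Var(a.value * b.value, a.tape)
    def backward():                      # product rule
        a.grad += out.grad * b.value
        b.grad += out.grad * a.value
    a.tape.append(backward)
    return out

def add(a, b):
    out = Var(a.value + b.value, a.tape)
    def backward():                      # sum rule
        a.grad += out.grad
        b.grad += out.grad
    a.tape.append(backward)
    return out

def backward(loss):
    loss.grad = np.ones_like(loss.value)
    for node in reversed(loss.tape):     # replay the tape in reverse
        node()

tape = []
x = Var(3.0, tape); y = Var(4.0, tape)
z = add(mul(x, x), y)                    # z = x*x + y
backward(z)                              # dz/dx = 2x = 6, dz/dy = 1
```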

Save & Load a Model

from netcl.io.serialization import save_model, load_model
save_model(model, "checkpoints/mnist_mlp")
model2 = load_model("checkpoints/mnist_mlp")  # queue defaults to the default device's queue
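The JSON-plus-NPZ split stores the architecture and the weights separately. A generic sketch of the pattern (save_checkpoint/load_checkpoint are illustrative helpers; netcl.io.serialization's actual on-disk layout may differ):

```python
import json
import numpy as np

def save_checkpoint(prefix, arch, weights):
    # Architecture (layer types, shapes) as human-readable JSON ...
    with open(prefix + ".json", "w") as f:
        json.dump(arch, f)
    # ... and parameter arrays as a compact NPZ archive.
    np.savez(prefix + ".npz", **weights)

def load_checkpoint(prefix):
    with open(prefix + ".json") as f:
        arch = json.load(f)
    with np.load(prefix + ".npz") as data:
        weights = {k: data[k] for k in data.files}
    return arch, weights
```

Keeping the architecture in JSON makes checkpoints diffable and inspectable without loading any arrays.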

Notes

  • Tests are not installed with the package; run them from a source checkout with python -m pytest.
  • For performance: increase the batch size, minimize data augmentation, and optionally enable mixed precision (Trainer(..., mixed_precision=True)).
  • Conv algorithms pick optimized paths automatically via heuristics; the environment flag NETCL_CONV_AUTOTUNE=1 enables autotuning.
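How netcl parses its environment flags internally is not documented here; a minimal sketch of a typical boolean env-flag reader (flag_enabled is a hypothetical helper, not netcl API):

```python
import os

def flag_enabled(name, default=False):
    """Treat "1", "true", "yes" (case-insensitive) as enabled."""
    val = os.environ.get(name)
    if val is None:
        return default
    return val.strip().lower() in ("1", "true", "yes")

# e.g. launched as: NETCL_CONV_AUTOTUNE=1 python your_train_script.py
autotune = flag_enabled("NETCL_CONV_AUTOTUNE")
```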

Download files

Download the file for your platform.

Source Distribution

netcl-0.1.1.tar.gz (66.1 kB)

Built Distribution

netcl-0.1.1-py3-none-any.whl (87.1 kB)

File details

Details for the file netcl-0.1.1.tar.gz.

File metadata

  • Download URL: netcl-0.1.1.tar.gz
  • Upload date:
  • Size: 66.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.11.1

File hashes

Hashes for netcl-0.1.1.tar.gz

  • SHA256: 4318c3ccd310702efecdfec0d51530f5fa6e438d7334ed264c687ac75860dcd6
  • MD5: 36e77e764091025feb990fe21d8b3882
  • BLAKE2b-256: 56393a4d6c3ad092d84ce37467f292e1fa33efef72b821ab4ae648ec2c93fa90

File details

Details for the file netcl-0.1.1-py3-none-any.whl.

File metadata

  • Download URL: netcl-0.1.1-py3-none-any.whl
  • Upload date:
  • Size: 87.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.11.1

File hashes

Hashes for netcl-0.1.1-py3-none-any.whl

  • SHA256: 423039b9b7a05c1d56b91ea93b5a2f455dc80dc774454126d3dd1cce4859f736
  • MD5: b2f1788091fdddc2ac04d0611a01fec8
  • BLAKE2b-256: 5808398a465c52d5db17fff7393eabae5493202d6969d44c1457c454350a84f5
