High-performance machine learning library with lazy evaluation and automatic kernel fusion

Project description

zyx - Python Bindings


zyx is a high-performance machine learning library for Python, powered by Rust. It features lazy evaluation, automatic kernel fusion, and multi-backend support (CPU, GPU via OpenCL/CUDA/WebGPU).

Note: the PyPI package is named zyx-py (pip install zyx-py), but it is imported as zyx (import zyx).

Installation

pip install zyx-py

Requirements:

  • Python 3.8+ (prebuilt wheels for CPython 3.8–3.14)
  • Linux x86_64 (a macOS ARM64 wheel is published for Python 3.14; more platforms coming soon)

Quick Start

import zyx

# Create tensors
x = zyx.Tensor([1.0, -2.0, 3.0])
y = zyx.Tensor([4.0, 5.0, 6.0])

# Lazy evaluation - operations build a graph
z = x.relu() + y

# Realize to compute the result
z.realize()
print(z.numpy())  # [5.0, 5.0, 9.0]

# Random tensors
a = zyx.Tensor.randn(3, 3)
b = zyx.Tensor.randn(3, 3)
c = a @ b  # Matrix multiplication
c.realize()

Key Features

  • Lazy Evaluation - Operations accumulate until realize(), reducing temporary allocations
  • Kernel Fusion - Multiple operations compile into single optimized GPU kernels
  • Immutable Tensors - No in-place modification errors common in other frameworks
  • Explicit Gradient Tape - Control what's recorded, no no_grad() semantics needed
  • Arbitrary-Order Gradients - Native support for 2nd, 3rd, and higher-order derivatives
  • Cross-Platform - OpenCL (CPU/GPU), WebGPU, CUDA/ROCm backends
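
The first two bullets can be illustrated with a toy pure-Python sketch of lazy evaluation (a conceptual analogue only, not how zyx is implemented):

```python
# Conceptual sketch: operations build a graph instead of computing
# immediately; realize() walks the graph once, so a scheduler could
# fuse the elementwise steps into a single kernel/loop.
class LazyTensor:
    def __init__(self, data=None, op=None, srcs=()):
        self.data, self.op, self.srcs = data, op, srcs

    def relu(self):
        return LazyTensor(op=lambda a: [max(v, 0.0) for v in a], srcs=(self,))

    def __add__(self, other):
        return LazyTensor(op=lambda a, b: [x + y for x, y in zip(a, b)],
                          srcs=(self, other))

    def realize(self):
        # Compute (and cache) this node by recursively realizing sources.
        if self.data is None:
            self.data = self.op(*(s.realize() for s in self.srcs))
        return self.data

x = LazyTensor([1.0, -2.0, 3.0])
y = LazyTensor([4.0, 5.0, 6.0])
z = x.relu() + y      # no computation yet, only graph construction
print(z.realize())    # [5.0, 5.0, 9.0]
```

This mirrors the Quick Start example below: nothing runs until realize(), which is what gives the scheduler a whole graph to optimize at once.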

Tensor Operations

Creation

import zyx

# From Python lists
t1 = zyx.Tensor([1, 2, 3])

# Random initialization
t2 = zyx.Tensor.randn(100, 100)
t3 = zyx.Tensor.rand((50, 50), dtype=zyx.DType.F32)

# Constant tensors
zeros = zyx.Tensor.zeros(10, 10)
ones = zyx.Tensor.ones(10, 10)
eye = zyx.Tensor.eye(5)

# Full tensor with specific value
filled = zyx.Tensor.full((3, 3), 2.5)

# With numpy arrays
import numpy as np
arr = np.array([[1, 2], [3, 4]])
t = zyx.Tensor(arr)

Math Operations

x = zyx.Tensor.randn(100, 100)

# Unary operations
y = x.relu()
y = x.sigmoid()
y = x.tanh()
y = x.gelu()
y = x.softmax(dim=-1)
y = x.log_softmax(dim=-1)

# Binary operations
a = zyx.Tensor.randn(100, 100)
b = zyx.Tensor.randn(100, 100)
c = a + b
c = a - b
c = a * b
c = a / b
c = a.matmul(b)  # Matrix multiplication

# Reduction operations
mean = x.mean(dim=0)
sum_val = x.sum(dim=1)
max_val = x.max(dim=-1)
min_val = x.min(dim=-1)
argmax = x.argmax(dim=-1)
argmin = x.argmin(dim=-1)

# Shape operations
y = x.reshape(10, 10)
y = x.transpose(0, 1)
y = x.permute(1, 0, 2)
y = x.squeeze()
y = x.unsqueeze(0)

# Realize to get numpy array
result = y.numpy()
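
As a reference point for the softmax call above: softmax(dim=-1) normalizes each row into a probability distribution. A numerically stable pure-Python version for a single row (independent of zyx) looks like:

```python
import math

def softmax(row):
    # Subtract the row max before exponentiating so exp() cannot overflow.
    m = max(row)
    exps = [math.exp(v - m) for v in row]
    total = sum(exps)
    return [e / total for e in exps]

probs = softmax([1.0, 2.0, 3.0])
# probs sums to 1 (up to float rounding), larger logits get larger mass
```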

Automatic Differentiation

import zyx

# Create a gradient tape
tape = zyx.GradientTape()

# Forward pass
x = zyx.Tensor([1.0, 2.0, 3.0])
w = zyx.Tensor([0.5, 0.5, 0.5])
y = (x * w).sum()

# Compute gradients
grads = tape.gradient(y, [w])
print(grads[0].numpy())  # [1.0, 2.0, 3.0]

# Higher-order gradients
tape2 = zyx.GradientTape()
y = x.relu().sum()
grads = tape2.gradient(y, [x])
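
The [1.0, 2.0, 3.0] gradient shown above can be verified independently of zyx with central finite differences in plain Python, since d/dw of sum(x * w) is just x:

```python
def f(w, x=(1.0, 2.0, 3.0)):
    # The same forward pass as above: y = sum(x * w)
    return sum(xi * wi for xi, wi in zip(x, w))

def grad_fd(f, w, eps=1e-6):
    # Central finite differences: (f(w + eps) - f(w - eps)) / (2 * eps)
    g = []
    for i in range(len(w)):
        wp = list(w); wp[i] += eps
        wm = list(w); wm[i] -= eps
        g.append((f(wp) - f(wm)) / (2 * eps))
    return g

print(grad_fd(f, [0.5, 0.5, 0.5]))  # ≈ [1.0, 2.0, 3.0]
```

This kind of finite-difference check is a useful sanity test for any autodiff result, not just this example.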

Neural Network Modules

import zyx
import zyx.nn as nn

# Linear layer
linear = nn.Linear(in_features=128, out_features=64)

# Convolutional layer
conv = nn.Conv2d(in_channels=3, out_channels=16, kernel_size=(3, 3))

# Normalization layers
ln = nn.LayerNorm(normalized_shape=(128,))
bn = nn.BatchNorm(num_features=64)
gn = nn.GroupNorm(num_groups=8, num_channels=64)
rn = nn.RMSNorm(dim=128)

# Activation functions
x = zyx.Tensor.randn(10, 128)
y = x.relu()
y = x.gelu()
y = x.silu()

# Multihead attention
attn = nn.MultiheadAttention(embed_dim=512, num_heads=8)
query = zyx.Tensor.randn(32, 128, 512)  # batch, seq, embed
key = zyx.Tensor.randn(32, 128, 512)
value = zyx.Tensor.randn(32, 128, 512)
output, _ = attn(query, key, value)

# Transformer layers
encoder_layer = nn.TransformerEncoderLayer(d_model=512, nhead=8)
transformer = nn.TransformerEncoder(encoder_layer, num_layers=6)
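
Under the hood, each head of MultiheadAttention computes scaled dot-product attention. A minimal pure-Python version for a single query vector (a conceptual reference, not zyx's implementation) is:

```python
import math

def attention(q, keys, values):
    # Scaled dot-product attention for one query: softmax(q·K / sqrt(d)) · V
    d = len(q)
    scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in keys]
    m = max(scores)
    w = [math.exp(s - m) for s in scores]   # stable softmax over scores
    total = sum(w)
    w = [x / total for x in w]
    # Output is the attention-weighted average of the value vectors.
    return [sum(wi * v[j] for wi, v in zip(w, values))
            for j in range(len(values[0]))]

out = attention([1.0, 0.0], [[1.0, 0.0], [0.0, 1.0]], [[1.0, 2.0], [3.0, 4.0]])
```

The multihead module additionally projects q/k/v per head and concatenates head outputs, but the core computation is the weighted average above.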

Optimizers

import zyx
import zyx.nn as nn
import zyx.optim as optim

# Create model
model = nn.Linear(10, 5)

# Create optimizer
optimizer = optim.Adam(learning_rate=0.001)
# or: optim.SGD(learning_rate=0.01, momentum=0.9)
# or: optim.AdamW(learning_rate=0.001, weight_decay=0.01)
# or: optim.RMSprop(learning_rate=0.001)

# Training loop
tape = zyx.GradientTape()
x = zyx.Tensor.randn(32, 10)
target = zyx.Tensor.randn(32, 5)

# Forward
pred = model(x)
loss = ((pred - target) ** 2).mean()

# Backward
grads = tape.gradient(loss, model.get_params())
optimizer.update(model, grads)

# Realize all pending computations
zyx.Tensor.realize_all()
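
optimizer.update(model, grads) applies the optimizer rule to each parameter. For reference, the textbook Adam rule (Kingma & Ba) for a single scalar parameter looks like the sketch below; zyx presumably tracks the moments m, v and step count t per parameter internally, and its exact defaults may differ:

```python
import math

def adam_step(param, grad, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    # Standard Adam update for one scalar parameter at step t (t >= 1).
    m = b1 * m + (1 - b1) * grad            # first-moment (mean) estimate
    v = b2 * v + (1 - b2) * grad * grad     # second-moment (variance) estimate
    m_hat = m / (1 - b1 ** t)               # bias correction
    v_hat = v / (1 - b2 ** t)
    param -= lr * m_hat / (math.sqrt(v_hat) + eps)
    return param, m, v

p, m, v = 1.0, 0.0, 0.0
p, m, v = adam_step(p, grad=0.5, m=m, v=v, t=1)  # first step moves p by ~lr
```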

Training Example

import zyx
import zyx.nn as nn
import zyx.optim as optim

# Set random seed
zyx.manual_seed(42)

# Create model
class SimpleNet:
    def __init__(self):
        self.fc1 = nn.Linear(784, 128)
        self.fc2 = nn.Linear(128, 10)

    def forward(self, x):
        x = self.fc1(x).relu()
        x = self.fc2(x)
        return x

    def get_params(self):
        return self.fc1.get_params() + self.fc2.get_params()

    def set_params(self, params):
        self.fc1.set_params(params[:2])
        self.fc2.set_params(params[2:])

model = SimpleNet()
optimizer = optim.Adam(learning_rate=0.001)

# Training loop
for epoch in range(10):
    tape = zyx.GradientTape()

    # Dummy data
    x = zyx.Tensor.randn(64, 784)
    y = zyx.Tensor.randint(0, 10, (64,))

    # Forward pass
    logits = model.forward(x)
    loss = logits.cross_entropy(y)

    # Backward pass
    grads = tape.gradient(loss, model.get_params())
    optimizer.update(model, grads)

    # Compute accuracy
    pred = logits.argmax(dim=-1)
    acc = (pred == y).float().mean()

    zyx.Tensor.realize_all()
    print(f"Epoch {epoch}, Loss: {loss.numpy():.4f}, Acc: {acc.numpy():.4f}")
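
The cross_entropy call above corresponds to log-softmax followed by negative log-likelihood. A pure-Python reference for a single sample (independent of zyx, using the standard log-sum-exp trick):

```python
import math

def cross_entropy(logits, target):
    # loss = logsumexp(logits) - logits[target], computed stably.
    m = max(logits)
    log_sum = m + math.log(sum(math.exp(v - m) for v in logits))
    return log_sum - logits[target]

loss = cross_entropy([2.0, 1.0, 0.1], target=0)  # ≈ 0.417
```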

Data Types

import zyx

# Supported dtypes
zyx.DType.F32    # 32-bit float
zyx.DType.F64    # 64-bit float
zyx.DType.F16    # 16-bit float
zyx.DType.BF16   # BFloat16
zyx.DType.I8     # 8-bit int
zyx.DType.I16    # 16-bit int
zyx.DType.I32    # 32-bit int
zyx.DType.I64    # 64-bit int
zyx.DType.U8     # 8-bit uint
zyx.DType.U16    # 16-bit uint
zyx.DType.U32    # 32-bit uint
zyx.DType.U64    # 64-bit uint
zyx.DType.Bool   # Boolean

# Create tensor with specific dtype
t = zyx.Tensor([1, 2, 3], dtype=zyx.DType.F32)

Device Support

zyx automatically selects the best available backend:

  • OpenCL - CPU and GPU support via POCL or vendor drivers
  • WebGPU - Modern cross-platform GPU API
  • CUDA/ROCm - NVIDIA and AMD GPU support

To select the runtime backend, set the ZYX_BACKEND environment variable:

export ZYX_BACKEND=opencl    # Use OpenCL backend
export ZYX_BACKEND=wgpu      # Use WebGPU backend
export ZYX_BACKEND=cuda      # Use CUDA backend

Debug Options

Enable debug output with the ZYX_DEBUG environment variable:

ZYX_DEBUG=1    # Print hardware devices
ZYX_DEBUG=2    # Print performance info
ZYX_DEBUG=4    # Print kernel scheduling
ZYX_DEBUG=8    # Print kernel IR
ZYX_DEBUG=16   # Print native assembly

Combine flags: ZYX_DEBUG=18 (2 + 16) for perf + assembly output.
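
The debug levels are bit flags, which is why ZYX_DEBUG=18 selects both perf and assembly output. In Python terms:

```python
# Each debug level is a distinct bit; combined values are bitwise ORs.
DEVICES, PERF, SCHED, IR, ASM = 1, 2, 4, 8, 16

flags = PERF | ASM        # 18, as in the example above
print(flags)              # 18
print(bool(flags & PERF)) # True  - perf output enabled
print(bool(flags & IR))   # False - kernel IR not enabled
```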

Why zyx?

Feature                | zyx                                   | PyTorch
---------------------- | ------------------------------------- | ----------------------------------------
Gradient recording     | Explicit GradientTape                 | Implicit; disabled via no_grad()
Tensor mutability      | Immutable (no in-place errors)        | Mutable (in-place ops can break autograd)
Higher-order gradients | Arbitrary order, natively             | Supported, but more involved
Kernel fusion          | Automatic via the lazy graph          | Manual or via torch.compile
Disk I/O               | Lazy loading in parallel with compute | Typically blocking

License

LGPL-3.0-only - see LICENSE file for details.

Status

Experimental - the API is stabilizing and performance is under active optimization. Not yet production-ready.

Project details


Download files

Download the file for your platform.

Source Distributions

No source distribution files are available for this release.

Built Distributions


zyx_py-0.15.1-cp314-cp314-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (4.0 MB)

Uploaded: CPython 3.14, manylinux: glibc 2.17+, x86-64

zyx_py-0.15.1-cp314-cp314-macosx_11_0_arm64.whl (3.2 MB)

Uploaded: CPython 3.14, macOS 11.0+, ARM64

zyx_py-0.15.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (4.0 MB)

Uploaded: CPython 3.13, manylinux: glibc 2.17+, x86-64

zyx_py-0.15.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (4.0 MB)

Uploaded: CPython 3.12, manylinux: glibc 2.17+, x86-64

zyx_py-0.15.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (4.0 MB)

Uploaded: CPython 3.11, manylinux: glibc 2.17+, x86-64

zyx_py-0.15.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (4.0 MB)

Uploaded: CPython 3.10, manylinux: glibc 2.17+, x86-64

zyx_py-0.15.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (4.0 MB)

Uploaded: CPython 3.9, manylinux: glibc 2.17+, x86-64

zyx_py-0.15.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (4.0 MB)

Uploaded: CPython 3.8, manylinux: glibc 2.17+, x86-64

File details

Details for the file zyx_py-0.15.1-cp314-cp314-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.

File metadata

File hashes

Hashes for zyx_py-0.15.1-cp314-cp314-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
Algorithm Hash digest
SHA256 d5858a7c8831e11fc5d5ed6a7457f31aebffda4f0234b77f1ebfe4f15d565561
MD5 45e5f986b70a10310e59ea0b0095dfdc
BLAKE2b-256 707991dfa1665ad271d3dcae69249369063c26131003c8317fca383838176529

See more details on using hashes here.

Provenance

The following attestation bundles were made for zyx_py-0.15.1-cp314-cp314-manylinux_2_17_x86_64.manylinux2014_x86_64.whl:

Publisher: build-wheels.yml on zk4x/zyx

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file zyx_py-0.15.1-cp314-cp314-macosx_11_0_arm64.whl.

File metadata

File hashes

Hashes for zyx_py-0.15.1-cp314-cp314-macosx_11_0_arm64.whl
Algorithm Hash digest
SHA256 1f3c89578463689f366fcf9fc769ab2f981d6860da0c51ffce43a6d213e466cf
MD5 b479839ee448ae19d96fa6e73c1daa52
BLAKE2b-256 e2bc03255fb22332127dd5a350ec578f0899e0eb14a232f34713b040d211d602

See more details on using hashes here.

Provenance

The following attestation bundles were made for zyx_py-0.15.1-cp314-cp314-macosx_11_0_arm64.whl:

Publisher: build-wheels.yml on zk4x/zyx

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file zyx_py-0.15.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.

File metadata

File hashes

Hashes for zyx_py-0.15.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
Algorithm Hash digest
SHA256 c02ab8e73612b6024641ab408083ed667ea417255885c1a11541353476bfefa3
MD5 d63bbddfa9ccd7b3b1003325531cc6f9
BLAKE2b-256 24d4a2488f3b746b91647a652f050679884ae346efd1f57ea0ad273312d3cd17

See more details on using hashes here.

Provenance

The following attestation bundles were made for zyx_py-0.15.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl:

Publisher: build-wheels.yml on zk4x/zyx

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file zyx_py-0.15.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.

File metadata

File hashes

Hashes for zyx_py-0.15.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
Algorithm Hash digest
SHA256 b7ed343ed2555ec76a4ccc5d86c2c51999f476def8f5421df19d70518c5d264e
MD5 bd2a2dd044f17b53c0b38ebe050abc54
BLAKE2b-256 aa78840aeaaf7ac7e446679e2444c8e1ff45b6cb6b20e4a0cdaf7691e6ff2ed9

See more details on using hashes here.

Provenance

The following attestation bundles were made for zyx_py-0.15.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl:

Publisher: build-wheels.yml on zk4x/zyx

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file zyx_py-0.15.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.

File metadata

File hashes

Hashes for zyx_py-0.15.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
Algorithm Hash digest
SHA256 884716b9e4902e11ec3bea1d7bef48f9fff9e8d68716cdc500c2b1edfab4628a
MD5 6ea1994701d7d9010ffcada20769e454
BLAKE2b-256 fd5349b9049e357af82aabf0a47155a50ac18e68592d4c522331a390a7a964e5

See more details on using hashes here.

Provenance

The following attestation bundles were made for zyx_py-0.15.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl:

Publisher: build-wheels.yml on zk4x/zyx

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file zyx_py-0.15.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.

File metadata

File hashes

Hashes for zyx_py-0.15.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
Algorithm Hash digest
SHA256 5bad1195bf46afb330c35f2f0ea7f2d57a78bee2ad30232a940d7e2e567648ec
MD5 bd740aeb977f3e7d06660a9aafaae5fc
BLAKE2b-256 11dc42dc14d39f9204b9ff6dc9ee52d52ba8a85893b89b8566b78c74b75c0eca

See more details on using hashes here.

Provenance

The following attestation bundles were made for zyx_py-0.15.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl:

Publisher: build-wheels.yml on zk4x/zyx

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file zyx_py-0.15.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.

File metadata

File hashes

Hashes for zyx_py-0.15.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
Algorithm Hash digest
SHA256 eb6964f7f1b70df36d8bea59f1001da1040d0642dc2edea1be0e6d33d2434c1d
MD5 7d293a7c9fbd1991918afd6d569540e1
BLAKE2b-256 cc686fb8d296e5edfd39a3fabbb709b34ea8df7d878e8da1d72e1cd32efc9e13

See more details on using hashes here.

Provenance

The following attestation bundles were made for zyx_py-0.15.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl:

Publisher: build-wheels.yml on zk4x/zyx

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file zyx_py-0.15.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.

File metadata

File hashes

Hashes for zyx_py-0.15.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
Algorithm Hash digest
SHA256 35f6431835efdf960aa559fadd54163f1584f2c2a7c427f60427c77521a66301
MD5 07e86898659cdaee06e578bdfae963ac
BLAKE2b-256 0c55589f1ceb6dcdda4327c4616729e04484d4def3faf46bdf01abf0c803cd12

See more details on using hashes here.

Provenance

The following attestation bundles were made for zyx_py-0.15.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl:

Publisher: build-wheels.yml on zk4x/zyx

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
