



HaloBlocks

Modern, Modular, and Composability-First Neural Network Components.



HaloBlocks is a high-performance Python library for assembling complex neural network architectures from simple, composable building blocks. Whether you are prototyping a new Transformer variant, experimenting with Mixture-of-Experts (MoE) scaling, or building a Vision-Language-Action (VLA) model from scratch, HaloBlocks provides the foundational "bricks" you need — without getting in your way.

[!TIP] New to HaloBlocks? Jump straight into the interactive Tutorial Notebook for a hands-on tour of the library.


Features

| Feature | Description |
| --- | --- |
| First-Class Composability | Every component is a `Block` that can be freely nested and combined via `CompositeBlock` |
| Rich Attention Zoo | MHA, MQA, GQA, Cross, Gated, Sliding-Window, Linear, Trinity Attention & more |
| MoE Ready | DeepSeek-style Mixture-of-Experts with routed + shared experts and noisy Top-K routing |
| Positional Embeddings | Sinusoidal, Learned, RoPE (Rotary), and ALiBi — all plug-and-play |
| VLA Integration | Flow Matching decoder blocks for Vision-Language-Action pipelines |
| Config-Driven | Build entire model graphs from plain Python dicts or YAML/JSON configs via `BlockFactory` |
| Keras-like API | Access any block directly from `haloblocks.blocks` (registry keys) without touching a config |
| PyTorch Native | Zero magic — pure `nn.Module` subclasses built for speed and debuggability |

Installation

```bash
# Using pip
pip install haloblocks

# Using uv (recommended — faster)
uv add haloblocks
```

Requirements: Python ≥ 3.10, PyTorch ≥ 2.0, NumPy ≥ 2.0

Running tests

From a clone of this repository (at the repo root):

```bash
./scripts/run_all_unit_tests.sh
./scripts/run_all_unit_tests.sh -v
./scripts/run_all_unit_tests.sh tests/test_core.py::TestCore::test_composite_block
```

The script runs uv sync --dev and then pytest. With no arguments it runs the full suite under tests/; any extra arguments are forwarded to pytest.

Releasing a new version

Requires GitHub CLI (gh) authenticated (gh auth login).

```bash
./scripts/open_release_pr.sh 0.2.0
```

This updates version in pyproject.toml (and uv.lock when uv is available), commits on branch release/v0.2.0, pushes it, and opens a PR to main. After the PR is merged, GitHub Actions creates tag v0.2.0 and pushes it; the existing Publish to PyPI workflow runs on that tag.


Quick Start

HaloBlocks supports three usage styles — choose whichever fits your workflow.

Style 1 — Keras-like direct access

No config dictionaries needed: import `blocks` from `haloblocks` and call any block by its class name, or import attention classes from the `attention` subpackage:

```python
from haloblocks import blocks
from haloblocks.blocks import attention

# Multi-Head Attention (registry key = class name)
attn = blocks.MultiHeadAttention(emb_dim=512, num_heads=8)

# Same family, explicit class (namespace style)
mqa = attention.MultiQueryAttention(emb_dim=512, num_heads=8)

# Grouped-Query Attention
gqa = blocks.GroupedQueryAttention(emb_dim=512, num_heads=8, num_kv_heads=2)

# Rotary Positional Embedding (RoPE is per-head; pass head_dim, not full emb_dim)
rope = blocks.RotaryPositionalEmbedding(head_dim=64, max_len=8192)

# DeepSeek Mixture-of-Experts
moe = blocks.DeepseekMoE(
    emb_dim=512,
    hid_dim=2048,
    num_router_exprts=8,
    best_k=2,
    num_shared_exprts=2,
)
```
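The dimension arithmetic behind these arguments is worth spelling out. A stdlib-only sketch using the values from the examples above (the variable names are illustrative, not library API):

```python
# Head-dimension arithmetic for the examples above (illustrative, stdlib only).
emb_dim, num_heads = 512, 8
head_dim = emb_dim // num_heads             # 64, the head_dim passed to RoPE above

# GQA with num_kv_heads=2: each KV head serves a group of query heads.
num_kv_heads = 2
queries_per_kv = num_heads // num_kv_heads  # 4 query heads share each KV head
```

This is why the RoPE example takes `head_dim=64` rather than the full `emb_dim=512`: rotary embeddings rotate each head's subspace independently.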

Style 2 — hb.create convenience function

```python
import haloblocks as hb

attn = hb.create('MultiHeadAttention', emb_dim=512, num_heads=8)
moe = hb.create(
    'DeepseekMoE',
    emb_dim=512,
    hid_dim=2048,
    num_router_exprts=8,
    best_k=2,
    num_shared_exprts=2,
)
```

Style 3 — Config-driven (YAML / JSON friendly)

Perfect for experiment configs or hyperparameter sweeps:

```python
import haloblocks as hb

config = {
    'type': 'TransformerBlock',
    'emb_dim': 768,
    'num_heads': 12,
    'mlp_dim': 3072,
    'use_moe': False,
}
block = hb.create(config)
```
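Because configs are plain dicts, they round-trip through JSON (or YAML) untouched. A minimal stdlib sketch (values are illustrative; the commented-out `hb.create` call marks where the parsed dict would go):

```python
import json

# A config as it might appear in an experiment file (illustrative values):
config_json = '{"type": "TransformerBlock", "emb_dim": 768, "num_heads": 12}'

config = json.loads(config_json)    # plain dict, ready for the factory
# block = hb.create(config)         # identical to passing the dict literal above
```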

Building a Composite Model

```python
import haloblocks as hb

model = hb.CompositeBlock([
    hb.create('SinusoidalPositionalEmbedding', emb_dim=512, max_len=1024),
    hb.create('TransformerBlock', emb_dim=512, num_heads=8),
    hb.create('TransformerBlock', emb_dim=512, num_heads=8),
])
```
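`CompositeBlock` applies its children in order, feeding each block's output to the next, much like `nn.Sequential`. A stdlib sketch of that sequential semantics (the `Composite` class and the lambdas are illustrative stand-ins, not library code):

```python
class Composite:
    """Minimal stand-in for CompositeBlock's sequential forward pass."""
    def __init__(self, blocks):
        self.blocks = blocks

    def __call__(self, x):
        for block in self.blocks:
            x = block(x)        # each block's output feeds the next block
        return x

pipe = Composite([lambda x: x + 1, lambda x: x * 2])
result = pipe(3)                # (3 + 1) * 2 = 8
```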

Block Catalogue

Registry keys now match class names exactly. Use them with blocks.<ClassName>, hb.create('<ClassName>', ...), and hb.create({'type': '<ClassName>', ...}).

Every block can be instantiated in three equivalent ways. For example, Multi-Head Attention:

```python
# 1. blocks.<ClassName>
from haloblocks import blocks
attn = blocks.MultiHeadAttention(emb_dim=256, num_heads=8)

# 2. hb.create with keyword args
import haloblocks as hb
attn = hb.create('MultiHeadAttention', emb_dim=256, num_heads=8)

# 3. hb.create with a config dict
attn = hb.create({'type': 'MultiHeadAttention', 'emb_dim': 256, 'num_heads': 8})
```

Attention

| Block | Registry Key | Description |
| --- | --- | --- |
| Scaled Dot-Product | `ScaledDotProductAttention` | Bare-metal scaled dot-product attention |
| Self-Attention | `SelfAttention` | Single-head self-attention (QKV from same input) |
| Attention Head (sub-block) | `HeadAttention` | Single head in a reduced subspace (building block inside MHA) |
| Multi-Head Attention | `MultiHeadAttention` | Classic MHA (Vaswani et al.) |
| Multi-Query Attention | `MultiQueryAttention` | MQA — single shared KV head |
| Grouped-Query Attention | `GroupedQueryAttention` | GQA — configurable KV head groups |
| Cross-Attention (single-head) | `CrossAttention` | Single-head encoder–decoder cross-attention |
| Cross-Attention (multi-head) | `MultiHeadCrossAttention` | Multi-head cross-attention |
| Gated Attention | `GatedAttention` | Attention with gating mechanism |
| Sliding Window Attention | `SlidingWindowAttention` | Local context window (Longformer-style) |
| Linear Attention | `LinearAttention` | Sub-quadratic linear attention |
| Multi-Head Latent Attention | `MultiHeadLatentAttention` | Latent-space compressed attention |
| Trinity Attention | `TrinityAttention` | Combined local + global + linear attention |

Positional Embeddings

| Block | Registry Key | Description |
| --- | --- | --- |
| Sinusoidal | `SinusoidalPositionalEmbedding` | Fixed sinusoidal PE (original Transformer) |
| Learned | `LearnedPositionalEmbedding` | Trainable position embeddings |
| Rotary (RoPE) | `RotaryPositionalEmbedding` | Rotary position encoding |
| ALiBi | `AlibiPositionalBias` | Attention with Linear Biases |
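For reference, the fixed sinusoidal PE from the original Transformer fits in a few lines. This sketch is the textbook formula, not necessarily the library's exact implementation:

```python
import math

def sinusoidal_pe(pos, i, emb_dim):
    """PE(pos, i): sin on even dimensions, cos on odd (Vaswani et al.)."""
    angle = pos / (10000 ** (2 * (i // 2) / emb_dim))
    return math.sin(angle) if i % 2 == 0 else math.cos(angle)

# Position 0 encodes as (0, 1, 0, 1, ...) across the embedding dimensions:
first_two = (sinusoidal_pe(0, 0, 512), sinusoidal_pe(0, 1, 512))
```

Because the values are fixed functions of position, this embedding needs no training and extrapolates to any sequence length up to numerical limits.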

MLP

| Block | Registry Key | Description |
| --- | --- | --- |
| MLP | `MLP` | Configurable feed-forward block (activations, bias, last-layer options) |

Mixture-of-Experts

| Block | Registry Key | Description |
| --- | --- | --- |
| DeepSeek MoE | `DeepseekMoE` | Routed + shared experts with noisy Top-K routing |

Transformer

| Block | Registry Key | Description |
| --- | --- | --- |
| Transformer Block Builder | `TransformerBlockBuilder` | Highly composable layer supporting arbitrary Attn & FFN combinations |
| Stacked Transformer (config) | `StackedTransformerBlock` | Same as `TransformerBlockBuilder` with `num_layers`; use inside `CompositeBlock` configs |
| Transformer Block | `TransformerBlock` | Pre-norm Transformer layer (attn + MLP) |
| Decoder | `DecoderTransformer` | Stacked Transformer decoder |

Vision-Language-Action

| Block | Registry Key | Description |
| --- | --- | --- |
| Flow Decoder | `FlowActionDecoder` | Flow-matching decoder for VLA action prediction |
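Flow-matching decoders generate actions by integrating a learned velocity field from noise toward the target distribution. A stdlib sketch of the Euler sampling loop (the constant velocity field here is illustrative; the real decoder predicts the field with a network):

```python
def sample(velocity, x0, steps=10):
    """Euler-integrate dx/dt = velocity(x, t) from t=0 to t=1."""
    x, dt = x0, 1.0 / steps
    for i in range(steps):
        t = i * dt
        x = x + dt * velocity(x, t)   # one Euler step along the field
    return x

# A constant unit velocity moves x from 0.0 to (approximately) 1.0:
x1 = sample(lambda x, t: 1.0, 0.0)
```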

Project Structure

```text
HaloBlocks/
├── src/haloblocks/
│   ├── __init__.py               # Top-level API: Block, BlockFactory, create
│   ├── layers.py                 # Alias of ``blocks`` (same module; backward compatible)
│   ├── core/
│   │   ├── block.py              # Base Block (nn.Module subclass)
│   │   ├── registry.py           # BlockRegistry — central name → class map
│   │   ├── factory.py            # BlockFactory.create() dispatcher
│   │   ├── composite.py          # CompositeBlock for sequential pipelines
│   │   └── builder.py            # TransformerBlockBuilder, stacked blocks
│   └── blocks/
│       ├── attention/            # MHA, MQA, GQA, Cross, Gated, Sliding-Window, Linear, Trinity
│       ├── norm/                 # RMSNorm (shared by attention and builder)
│       ├── positional_embedding/ # Sinusoidal, Learned, RoPE, ALiBi
│       ├── mlp/                  # Configurable MLP block
│       ├── moe/                  # DeepSeek Mixture-of-Experts
│       ├── transformer/          # Transformer Block & Decoder
│       └── vla/                  # Flow Matching Decoder
├── scripts/
│   ├── run_all_unit_tests.sh     # uv sync --dev && pytest (optional args)
│   ├── open_release_pr.sh        # bump version + PR; merge triggers tag + PyPI
│   └── format.sh                 # black, isort, pyflakes
├── notebooks/
│   └── tutorial.ipynb            # Interactive getting-started guide
├── tests/                        # Pytest test suite
└── pyproject.toml
```

Contributing

Contributions are welcome! Please read our Contributing Guide before opening a pull request. When adding a new block:

  1. Create your module under src/haloblocks/blocks/<category>/.
  2. Subclass Block and decorate with @BlockRegistry.register() (auto-uses the class name).
  3. Re-export from the category's __init__.py.
  4. Add a test in tests/ and a usage example in the tutorial notebook.

License

HaloBlocks is released under the MIT License.


Built by Naveen
