torchblocks-vp

Typed pluggable Transformer blocks with registries for attention, norm, feedforward, adapters, and position layers.
torchblocks-vp is a typed PyTorch building-block library with a registry-driven API for assembling neural architectures out of interchangeable modules.
The PyPI package name is torchblocks-vp:

```bash
pip install torchblocks-vp
```

The import name is torchblocks:

```python
import torchblocks
```
The package is meant for projects that want pluggable model components without copying the same attention, norm, feedforward, or adapter code into every application.
Design goals
torchblocks-vp is built around a few principles:
- reusable blocks instead of one fixed architecture
- explicit registration and lookup
- publishable, typed interfaces
- easy extension from downstream packages
It works especially well in codebases where model components should be configurable from TOML or Python configs.
Registry API
The central API is:
- register(category, name)
- get(category, name)
- list_modules(category=None)
Example:
```python
import torch
import torch.nn as nn

from torchblocks import get, register


@register("feedforward", "my_ff")
class MyFeedForward(nn.Module):
    def __init__(self, d_model: int, dim_ff: int, dropout: float = 0.1) -> None:
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(d_model, dim_ff),
            nn.GELU(),
            nn.Dropout(dropout),
            nn.Linear(dim_ff, d_model),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)


ff_cls = get("feedforward", "my_ff")
```
This allows applications to select blocks dynamically without importing implementation modules directly everywhere.
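The implementation behind these three functions is not shown on this page, but the mechanics are easy to picture. Here is a minimal sketch of a two-level registry, written from scratch for illustration rather than taken from torchblocks-vp's source:

```python
from typing import Callable

import torch.nn as nn

# Sketch of a two-level (category -> name -> class) registry.
# This illustrates the mechanics only; it is NOT torchblocks-vp's
# actual implementation.
_REGISTRY: dict[str, dict[str, type[nn.Module]]] = {}


def register(category: str, name: str) -> Callable[[type[nn.Module]], type[nn.Module]]:
    def wrap(cls: type[nn.Module]) -> type[nn.Module]:
        _REGISTRY.setdefault(category, {})[name] = cls
        return cls

    return wrap


def get(category: str, name: str) -> type[nn.Module]:
    return _REGISTRY[category][name]


def list_modules(category: str | None = None) -> list[str]:
    # With no category, return all "category/name" pairs.
    if category is None:
        return [f"{c}/{n}" for c, names in _REGISTRY.items() for n in names]
    return sorted(_REGISTRY.get(category, {}))
```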
Included module families
Attention
Registered attention modules:
- gqa
- mha
- cross
These cover grouped-query attention, standard multi-head self-attention, and cross-attention.
Feedforward
Registered feedforward modules:
- swiglu
- gelu
Normalization
Registered norm modules:
- rmsnorm
- layernorm
Adapters
Registered adapter modules:
- language_conditioned
- bottleneck
- none
Convolution
Registered convolution modules:
- local
- none
Position
Registered positional module:
- rope
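All of these names can be enumerated at runtime. A sketch, assuming list_modules(category) returns an iterable of registered names; the category strings for the adapter, convolution, and position families are guesses from the section names above, while "attention", "feedforward", and "norm" appear in the usage examples:

```python
from torchblocks import list_modules

# "adapter", "conv", and "position" are assumed category strings,
# inferred from the section names; verify against the package.
for category in ("attention", "feedforward", "norm", "adapter", "conv", "position"):
    print(category, list_modules(category))
```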
Installation
Requirements:
- Python >= 3.14
- PyTorch >= 2.0
Install from PyPI:
```bash
pip install torchblocks-vp
```
Common usage pattern
Most downstream code uses the registry as a factory layer:
```python
from torchblocks import get

norm_cls = get("norm", "rmsnorm")
ff_cls = get("feedforward", "swiglu")

norm = norm_cls(768)
ff = ff_cls(768, 3072, dropout=0.1)
```
Applications can keep architectural choices in config files while the runtime maps names to modules.
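For instance, here is a minimal sketch of that pattern using a TOML config parsed with the standard-library tomllib; the [model] schema is hypothetical, since torchblocks-vp does not prescribe one:

```python
import tomllib

from torchblocks import get

# Hypothetical config; the keys and schema are this example's choice.
raw = """
[model]
norm = "rmsnorm"
feedforward = "swiglu"
d_model = 768
dim_ff = 3072
"""

cfg = tomllib.loads(raw)["model"]

# Map config names to registered classes, then instantiate.
norm = get("norm", cfg["norm"])(cfg["d_model"])
ff = get("feedforward", cfg["feedforward"])(cfg["d_model"], cfg["dim_ff"])
```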
Included implementations
The current package exports the registry and auto-registers the bundled modules on import.
Implemented classes include:
- GroupedQueryAttention, MultiHeadAttention, CrossAttention
- SwiGLUFeedForward, GeLUFeedForward
- RMSNorm, LayerNorm
- LanguageConditionedAdapter, BottleneckAdapter, NoAdapter
- LocalConvModule, NoConv
- RotaryEmbedding
Helper functions:
- rotate_half
- apply_rotary_emb
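These are the conventional RoPE helpers. A sketch of the standard definitions, which mirrors common rotary-embedding code and may differ from this package's exact signatures:

```python
import torch


def rotate_half(x: torch.Tensor) -> torch.Tensor:
    # Standard RoPE helper: split the last dim in half,
    # negate the second half, and swap the halves.
    x1, x2 = x.chunk(2, dim=-1)
    return torch.cat((-x2, x1), dim=-1)


def apply_rotary_emb(x: torch.Tensor, cos: torch.Tensor, sin: torch.Tensor) -> torch.Tensor:
    # Rotate query/key features by precomputed cos/sin tables.
    return (x * cos) + (rotate_half(x) * sin)
```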
Extending the package
You do not need to modify torchblocks-vp itself to add new blocks. A downstream package can register its own modules at startup:
```python
from torchblocks import register

@register("attention", "my_attention")
class MyAttention(...):
    ...
```
As long as the constructor and forward signature match what the consuming model expects, the registry is enough.
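To make the snippet above concrete, here is a hypothetical registration that wraps torch.nn.MultiheadAttention; the name my_attention, the constructor arguments, and the forward contract are all this example's assumptions, not a contract the package defines:

```python
import torch
import torch.nn as nn

from torchblocks import register


@register("attention", "my_attention")
class MyAttention(nn.Module):
    """Illustrative self-attention block; name and signature are hypothetical."""

    def __init__(self, d_model: int, n_heads: int, dropout: float = 0.0) -> None:
        super().__init__()
        self.attn = nn.MultiheadAttention(
            d_model, n_heads, dropout=dropout, batch_first=True
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Self-attention: queries, keys, and values all come from x.
        out, _ = self.attn(x, x, x, need_weights=False)
        return out
```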
Why the package is split out
Keeping these blocks in their own package helps with:
- reuse across multiple models
- cleaner application packages
- separate publishing and versioning
- stronger type boundaries between architecture code and task code
That matters in this repository because the libraries are published independently rather than bundled into one monolithic distribution.
File details
Details for the file torchblocks_vp-2.2.0.tar.gz.
File metadata
- Download URL: torchblocks_vp-2.2.0.tar.gz
- Upload date:
- Size: 8.7 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.14.2
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 6e1b9d0221b6be9991eb5fe34f583bc8fa0cbbb2be925ad6dbe2920f8a817fb2 |
| MD5 | e58f84ea12f1619c8b3cb881ccb4f67b |
| BLAKE2b-256 | 233d911ac5f78d30f2e08b489b4016b42fd28e924b750bdaca5a5551e704bc1e |
File details
Details for the file torchblocks_vp-2.2.0-py3-none-any.whl.
File metadata
- Download URL: torchblocks_vp-2.2.0-py3-none-any.whl
- Upload date:
- Size: 8.9 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.14.2
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 5f4bdea05af04a2b78c31d14b71d5f1a52f0b2ed901ba4157f0a40183cc9d91d |
| MD5 | 28990f233992ce7e107192fa775483d9 |
| BLAKE2b-256 | be7a9a01e8cb0672589deb86dcfe78b2d1fac50e7da88d1180b77a3a5dcc95cf |