torchblocks-vp

Typed pluggable Transformer blocks with registries for attention, norm, feedforward, adapters, and position layers.
torchblocks-vp provides reusable neural building blocks on top of PyTorch.
It includes:
- registry-based module factories
- attention, normalization, feedforward, and adapter layers
- rotary position embeddings and convolutional blocks
- typed interfaces for downstream model packages
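The registry-based factory idea above can be sketched in plain Python. This is an illustrative sketch only, not the actual torchblocks-vp API: the names `Registry`, `ATTENTION_REGISTRY`, `register`, and `build` are assumptions used to show the pattern of selecting a layer class by string key from a config.

```python
from typing import Callable, Dict


class Registry:
    """Maps string keys to classes so configs can select layers by name.

    Hypothetical sketch of a registry-based module factory; names are
    assumptions, not the torchblocks-vp API.
    """

    def __init__(self, name: str) -> None:
        self.name = name
        self._classes: Dict[str, type] = {}

    def register(self, key: str) -> Callable[[type], type]:
        """Class decorator that records `cls` under `key`."""
        def deco(cls: type) -> type:
            if key in self._classes:
                raise KeyError(f"{key!r} already registered in {self.name}")
            self._classes[key] = cls
            return cls
        return deco

    def build(self, key: str, **kwargs):
        """Instantiate the class registered under `key` with `kwargs`."""
        try:
            cls = self._classes[key]
        except KeyError:
            raise KeyError(f"unknown {self.name} type: {key!r}") from None
        return cls(**kwargs)


# One registry per layer family (attention, norm, feedforward, ...).
ATTENTION_REGISTRY = Registry("attention")


@ATTENTION_REGISTRY.register("identity")
class IdentityAttention:
    """Placeholder layer standing in for a real attention module."""

    def __init__(self, dim: int) -> None:
        self.dim = dim


# A config can now name the layer as a string and supply its kwargs.
layer = ATTENTION_REGISTRY.build("identity", dim=64)
print(type(layer).__name__, layer.dim)  # → IdentityAttention 64
```

The benefit of this shape is that downstream model code depends only on the registry key, so swapping attention or norm implementations becomes a config change rather than a code change.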
Install:
pip install torchblocks-vp
Download files
Source Distribution
torchblocks_vp-2.0.1.tar.gz (5.7 kB)
Built Distribution

torchblocks_vp-2.0.1-py3-none-any.whl (7.5 kB)
File details
Details for the file torchblocks_vp-2.0.1.tar.gz.
File metadata
- Download URL: torchblocks_vp-2.0.1.tar.gz
- Upload date:
- Size: 5.7 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.14.2
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 72f9669c28a75882506daacc3509ea7508486c3969d887bd5048a1d5b59810f0 |
| MD5 | 9c29924174c2247bb67ea6046083a195 |
| BLAKE2b-256 | ad48f74bf8a2c64165c0b42b35e2584fa26fb83e78fc36aef0ddad4824b5bfc0 |
File details
Details for the file torchblocks_vp-2.0.1-py3-none-any.whl.
File metadata
- Download URL: torchblocks_vp-2.0.1-py3-none-any.whl
- Upload date:
- Size: 7.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.14.2
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 60d27f3569bffd32ba4d3c9a94a5ffcf1d83042006e70cd91483eeabafd15510 |
| MD5 | b74f8d98dee2f8e1bd2a4afc809a811a |
| BLAKE2b-256 | bbac737b2ad749b3388c82a78a6e88f79ba9c57002163e1b94f891c284e8b937 |