torchblocks-vp
Typed pluggable Transformer blocks with registries for attention, norm, feedforward, adapters, and position layers.
torchblocks-vp provides reusable neural building blocks on top of PyTorch.
It includes:
- registry-based module factories
- attention, normalization, feedforward, and adapter layers
- rotary position embeddings and convolutional blocks
- typed interfaces for downstream model packages
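The registry-based factory pattern listed above can be sketched in a few lines. This is a hypothetical illustration of the pattern, not the package's actual API: the names `ATTENTION_REGISTRY`, `register_attention`, and `build_attention` are assumptions for the example.

```python
# Sketch of a registry-based module factory (hypothetical names; the
# real torchblocks-vp API may differ). A registry maps string keys to
# classes so a config file can select a layer implementation by name.
from typing import Callable, Dict, Type

ATTENTION_REGISTRY: Dict[str, Type] = {}

def register_attention(name: str) -> Callable[[Type], Type]:
    """Class decorator that records an attention block under `name`."""
    def wrap(cls: Type) -> Type:
        ATTENTION_REGISTRY[name] = cls
        return cls
    return wrap

def build_attention(name: str, **kwargs):
    """Look up a registered class by name and instantiate it."""
    try:
        cls = ATTENTION_REGISTRY[name]
    except KeyError:
        raise ValueError(f"unknown attention block: {name!r}") from None
    return cls(**kwargs)

@register_attention("vanilla")
class VanillaAttention:
    def __init__(self, dim: int, heads: int = 8):
        self.dim, self.heads = dim, heads

blk = build_attention("vanilla", dim=64, heads=4)
```

The same pattern extends naturally to separate registries for normalization, feedforward, adapter, and position layers, which is what makes the blocks pluggable from configuration.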
Install:
pip install torchblocks-vp
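Among the bundled position layers, rotary position embeddings (RoPE) rotate consecutive feature pairs by a position-dependent angle. A minimal pure-Python sketch of the standard RoFormer formulation follows; the package's own layer may differ in interface and detail:

```python
import math
from typing import List

def rope(x: List[float], pos: int, base: float = 10000.0) -> List[float]:
    """Apply a rotary position embedding to one vector.

    Each consecutive pair (x[2i], x[2i+1]) is rotated by the angle
    theta_i = pos * base**(-2i/d), the standard RoPE formulation.
    """
    d = len(x)
    out: List[float] = []
    for i in range(0, d, 2):
        theta = pos * base ** (-i / d)  # -i/d == -2*(i/2)/d for pair index i/2
        c, s = math.cos(theta), math.sin(theta)
        out.extend([x[i] * c - x[i + 1] * s,
                    x[i] * s + x[i + 1] * c])
    return out
```

Position 0 is the identity rotation, and the dot product of two rotated vectors depends only on their relative position, which is the property that makes RoPE useful inside attention.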
Download files
Source Distribution: torchblocks_vp-2.0.0.tar.gz (5.7 kB)
Built Distribution: torchblocks_vp-2.0.0-py3-none-any.whl (7.5 kB)
File details
Details for the file torchblocks_vp-2.0.0.tar.gz.
File metadata
- Download URL: torchblocks_vp-2.0.0.tar.gz
- Upload date:
- Size: 5.7 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.14.2
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | c5a487317707e8b2ccb1b68c5bd13a774c5975506fb44cc2b5f825ccb0e8dee1 |
| MD5 | c516724db95ef52a934d3ca3832a68fb |
| BLAKE2b-256 | 76635719cbef0ae2de88da3810934811212f662669b4047e54fa95f7cc32d794 |
File details
Details for the file torchblocks_vp-2.0.0-py3-none-any.whl.
File metadata
- Download URL: torchblocks_vp-2.0.0-py3-none-any.whl
- Upload date:
- Size: 7.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.14.2
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 9bc630549def87d860ebfa23c9c634c18d7ed8627c1918241e664cfcf2fdb74d |
| MD5 | 5ccb9a2680c4fd86d5cae564da43eaf8 |
| BLAKE2b-256 | 7743c8e11e3976074cf8f664e81e27330a590b5e91bbb9e0d723e06cffb30aa7 |