pydanttention
Transformer model attention in Pydantic.
Adapted from the source by Theia Vogel (MIT licensed, included here as vogel_manual_transformer.py), in turn using model ops from picoGPT (MIT licensed).
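To illustrate the idea of expressing attention with Pydantic models, here is a minimal, hypothetical sketch of single-head scaled dot-product attention as a `BaseModel`. The class and field names are illustrative assumptions, not the actual pydanttention API:

```python
# Hypothetical sketch: scaled dot-product attention wrapped in a
# Pydantic model. Not the pydanttention API; names are invented.
import math

from pydantic import BaseModel


class Attention(BaseModel):
    q: list[list[float]]  # queries, shape (seq, d)
    k: list[list[float]]  # keys,    shape (seq, d)
    v: list[list[float]]  # values,  shape (seq, d_v)

    @staticmethod
    def _softmax(xs: list[float]) -> list[float]:
        # Numerically stable softmax over one row of scores.
        m = max(xs)
        es = [math.exp(x - m) for x in xs]
        s = sum(es)
        return [e / s for e in es]

    def forward(self) -> list[list[float]]:
        d = len(self.q[0])
        out = []
        for qi in self.q:
            # Dot each query against every key, scaled by sqrt(d).
            scores = [
                sum(a * b for a, b in zip(qi, kj)) / math.sqrt(d)
                for kj in self.k
            ]
            weights = self._softmax(scores)
            # Weighted sum of value rows.
            out.append([
                sum(w * vj[c] for w, vj in zip(weights, self.v))
                for c in range(len(self.v[0]))
            ])
        return out


attn = Attention(
    q=[[1.0, 0.0]],
    k=[[1.0, 0.0], [0.0, 1.0]],
    v=[[1.0, 2.0], [3.0, 4.0]],
)
result = attn.forward()  # one output row, a blend of the two value rows
```

Because the fields are plain typed lists, Pydantic validates the input shapes' element types for free; the actual package composes several such models to mirror the full GPT-2 forward pass.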
Source Distribution: pydanttention-0.1.0.tar.gz (6.0 kB)
Built Distribution
Hashes for pydanttention-0.1.0-py3-none-any.whl
Algorithm | Hash digest
---|---
SHA256 | 37cef01dfc20b3aed37d83c40c90ad9a7d77a5ba3b3d13e083a52d868a6990ef
MD5 | caf8418d7f99bb26f0968d7975ef09ab
BLAKE2b-256 | bd6262de408ad16f7cbb79da4345b0d6640d1d4f04a9ed6661e95f723f6acebb