RATransformer - make a transformer model learn implicit relations passed in the input
Project description
RATransformers 🐭
⚠️👷‍♀️👷‍♂️ This package is WIP. Currently, only the T5 model is supported. Feel free to contribute with PRs! 👷‍♂️👷‍♀️⚠️
RATransformers, short for Relation-Aware Transformers, is a package built on top of transformers that enables training/fine-tuning multiple models with extra relation-aware weights. These extra weights let the model encode explicit relations between different parts of the input data.
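The page does not show the package's own API, so the sketch below is not the ratransformers interface. It is a minimal pure-Python illustration of what "relation-aware weights" typically mean in attention layers (in the style of Shaw et al., 2018): each query/key pair `(i, j)` gets its own relation embedding added to the key and value before the usual scaled dot-product attention. All names (`relation_aware_attention`, `rel_k`, `rel_v`) are illustrative assumptions.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def relation_aware_attention(q, k, v, rel_k, rel_v):
    """Single-head scaled dot-product attention where each (query i, key j)
    pair carries its own relation embedding: rel_k[i][j] is added to key j
    when scoring, and rel_v[i][j] is added to value j when aggregating.
    This is an illustrative sketch, not the ratransformers API."""
    d = len(q[0])
    out = []
    for i, qi in enumerate(q):
        # Score each key after biasing it with the pairwise relation embedding.
        scores = [
            dot(qi, [kd + rd for kd, rd in zip(k[j], rel_k[i][j])]) / math.sqrt(d)
            for j in range(len(k))
        ]
        w = softmax(scores)
        # Weighted sum of relation-biased values.
        out.append([
            sum(w[j] * (v[j][t] + rel_v[i][j][t]) for j in range(len(v)))
            for t in range(len(v[0]))
        ])
    return out
```

With all relation embeddings set to zero this reduces to standard attention; a large relation embedding between a query and one key steers the attention toward that key, which is how explicit input relations can be injected.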
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution

ratransformers-0.0.0.tar.gz (9.3 kB)

Built Distribution

ratransformers-0.0.0-py3-none-any.whl
Hashes for ratransformers-0.0.0-py3-none-any.whl

Algorithm | Hash digest
---|---
SHA256 | 228ad36f6bf6e983b97725325dcf0d692be8efbecfb91c0c56f7146f6546ac85
MD5 | 2d40a1b4fe2dd59b922054053ade0c07
BLAKE2b-256 | 15cc15316c48965f8d72739e14e56d57fd89c4ac6bb9574c4b84bd00f1efeda6