RATransformers - make a transformer model learn implicit relations present in the input
Project description
RATransformers 🐭
⚠️ This package is a work in progress. Currently only the T5 model is supported. Feel free to contribute with PRs! ⚠️
RATransformers, short for Relation-Aware Transformers, is a package built on top of the transformers library that enables training/fine-tuning of multiple models with extra relation-aware weights. These extra weights let the model encode explicit relations between different parts of the input data.
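The package itself defines the exact API, but the idea behind "extra relation-aware weights" can be sketched as relation-conditioned self-attention in the style of Shaw et al. (2018): each pair of input positions is assigned a relation kind, and a learned embedding per kind is added into the attention score. Everything below (names, shapes, random weights) is an illustrative sketch, not the library's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
seq_len, d = 4, 8
num_relation_kinds = 3  # e.g. 0 = "no relation", 1/2 = task-specific relations

# token representations and query/key projections (illustrative random weights)
x = rng.normal(size=(seq_len, d))
W_q = rng.normal(size=(d, d))
W_k = rng.normal(size=(d, d))

# one relation kind per (i, j) pair of input positions
relations = rng.integers(0, num_relation_kinds, size=(seq_len, seq_len))

# the "extra relation-aware weights": one learned vector per relation kind
relation_emb = rng.normal(size=(num_relation_kinds, d))

q = x @ W_q
k = x @ W_k

# standard dot-product scores plus a relation-dependent term
rel_term = np.einsum("id,ijd->ij", q, relation_emb[relations])
scores = (q @ k.T + rel_term) / np.sqrt(d)

# softmax over the key dimension
weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
weights /= weights.sum(axis=-1, keepdims=True)
```

Fine-tuning then updates `relation_emb` alongside the pretrained weights, so attention can depend on the annotated relations rather than only on token content.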
Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

ratransformers-0.0.1.tar.gz (9.3 kB)
Built Distribution

ratransformers-0.0.1-py3-none-any.whl

Hashes for ratransformers-0.0.1-py3-none-any.whl

Algorithm | Hash digest
---|---
SHA256 | 2d5be34dd2574698d778688137960da25ba591f4bbf59e73be93b4c9d08676d3
MD5 | 752c1de60e643cf0bf3330b72c430220
BLAKE2b-256 | a616a3b8449d4902f8270e241c587da7b7c3c6e98c3e96747208522d7804c346