RATransformers - make a transformer model learn implicit relations passed in the input
RATransformers 🐭
RATransformers, short for Relation-Aware Transformers, is a package built on top of transformers 🤗 that enables the training/fine-tuning of models with extra relation-aware input features.
Example - Encoding a table in TableQA (Question Answering on Tabular Data)
In this example we can see that passing the table to the model as plain text, with no additional information, is a poor representation.
With RATransformers 🐭 you are able to encode the table in a more structured way by passing specific relations within the input. RATransformers 🐭 also allows you to pass additional features associated with each input word/token.
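To make the idea concrete, here is a minimal, framework-free sketch of the kind of relation-aware input feature the package describes: a token-to-token relation matrix for a flattened table, where each entry labels how two positions relate (same row, same column, cell to header). This is an illustration of the concept only, not the ratransformers API; the relation kinds and the `relation_matrix` helper are hypothetical.

```python
# Hypothetical relation kinds for a flattened TableQA input.
RELATIONS = {"none": 0, "same_row": 1, "same_column": 2, "cell_to_header": 3}

def relation_matrix(cells):
    """cells: one (row, col) position per input token, row-major order.
    Returns an NxN matrix of relation ids; row 0 is treated as the header row."""
    n = len(cells)
    m = [[RELATIONS["none"]] * n for _ in range(n)]
    for i, (ri, ci) in enumerate(cells):
        for j, (rj, cj) in enumerate(cells):
            if i == j:
                continue
            if ri == rj:
                m[i][j] = RELATIONS["same_row"]
            elif ci == cj:
                # a data cell pointing at row 0 relates to its column header
                m[i][j] = RELATIONS["cell_to_header"] if rj == 0 else RELATIONS["same_column"]
    return m

# A 2x2 table flattened row by row: header row (0) then one data row (1).
cells = [(0, 0), (0, 1), (1, 0), (1, 1)]
m = relation_matrix(cells)
```

A relation-aware model consumes a matrix like `m` alongside the token ids, so attention can distinguish "these two tokens share a row" from "this token sits under that header" instead of seeing one undifferentiated text sequence.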
Check out more examples [here].
Installation
Install directly from PyPI:
pip install ratransformers
Supported Models
Currently, we support a limited set of transformer models:
Want another model? Feel free to open an Issue or create a Pull Request and let's get started 🚀