RATransformers - make a transformer model learn implicit relations passed in the input
RATransformers 🐭
RATransformers, short for Relation-Aware Transformers, is a package built on top of transformers 🤗 that enables the training/fine-tuning of models with extra relation-aware input features.
Example - Encoding a table in TableQA (Question Answering on Tabular Data)
In this example we can see that passing the table to the model as plain text, with no additional structural information, is a poor representation.
With RATransformers 🐭 you are able to encode the table in a more structured way by passing specific relations within the input. RATransformers 🐭 also allows you to pass further features related to each input word/token.
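To make this concrete, here is a purely illustrative sketch in plain Python (not the ratransformers API) of what such relations can look like: a word-level matrix whose entry `[i][j]` holds the id of the relation that word `i` has with word `j`. The tiny table, the word list, and the numeric ids are hypothetical; only the relation names come from the usage example.

```python
# Illustrative sketch of a word-level relation matrix for a tiny table
# flattened as: header row ("Name", "Age") followed by one data row.
words = ["Name", "Age", "Alice", "30"]

# Hypothetical relations between table words (directed pairs).
relations = {
    ("Alice", "Name"): "is_value_of_column",
    ("30", "Age"): "is_value_of_column",
    ("Alice", "30"): "is_from_same_row",
    ("30", "Alice"): "is_from_same_row",
}

# 0 encodes "no relation"; the other ids index the relation kinds.
relation_ids = {"is_value_of_column": 1, "is_from_same_row": 2}

n = len(words)
matrix = [[0] * n for _ in range(n)]
for (w1, w2), rel in relations.items():
    matrix[words.index(w1)][words.index(w2)] = relation_ids[rel]
```

A matrix like this pairs every input word with every other word, so the model can attend differently to a column header than to an unrelated cell.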
Check out more examples [here].
Installation
Install directly from PyPI:

```bash
pip install ratransformers
```
Usage
```python
import ratransformers
from transformers import BartTokenizerFast, BartForSequenceClassification

ratransformer = ratransformers.RATransformer(
    "nielsr/tapex-large-finetuned-tabfact",  # define the 🤗 model you want to load
    relation_kinds=['is_value_of_column', 'is_from_same_row'],  # define the relations that you want to model in the input
    tokenizer_cls=BartTokenizerFast,  # define the tokenizer class
    model_cls=BartForSequenceClassification,  # define the model class
    pretrained_tokenizer_name_or_path='facebook/bart-large'  # define the tokenizer you want to load (in case it is not the same as the model)
)
model = ratransformer.model
tokenizer = ratransformer.tokenizer
```
With just these steps, your RATransformer 🐭 is ready to be trained.
More implementation details in [the examples here].
How does it work?
We modify the self-attention layers of the transformer model as explained in Section 3 of the RAT-SQL paper.
Supported Models
Currently, we support a limited number of transformer models.
Want another model? Feel free to open an Issue or create a Pull Request and let's get started 🚀