Project description
Quickly train T5 models in just 3 lines of code with ONNX inference
simpleT5 is built on top of PyTorch Lightning ⚡️ and Transformers 🤗, and lets you quickly train your T5 models.
T5 models can be used for several NLP tasks such as summarization, QA, QG, translation, text generation, and more.
Here's a link to the Medium article, along with an example Colab notebook.
Install
pip install --upgrade simplet5
Usage
simpleT5 for a summarization task
# import
from simplet5 import SimpleT5
# instantiate
model = SimpleT5()
# load (supports t5, mt5, byt5)
model.from_pretrained("t5","t5-base")
# train
model.train(train_df=train_df,  # pandas dataframe with 2 columns: source_text & target_text
            eval_df=eval_df,    # pandas dataframe with 2 columns: source_text & target_text
            source_max_token_len=512,
            target_max_token_len=128,
            batch_size=8,
            max_epochs=5,
            use_gpu=True,
            outputdir="outputs",
            early_stopping_patience_epochs=0,
            precision=32
            )
# load trained T5 model
model.load_model("t5","path/to/trained/model/directory", use_gpu=False)
# predict
model.predict("input text for prediction")
# need faster inference on CPU? convert the model to ONNX
model.convert_and_load_onnx_model("path/to/T5 model/directory")
model.onnx_predict("input text for prediction")
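As a quick reference, here's a minimal sketch of preparing the train_df / eval_df DataFrames passed to model.train() above. The source_text and target_text column names come from the snippet; the sample rows and the "summarize: " task prefix are illustrative assumptions (plain T5 checkpoints are usually prompted with a task prefix).
# prepare training data (hypothetical example rows)
import pandas as pd
rows = [
    {
        "source_text": "summarize: simpleT5 is built on top of PyTorch Lightning "
                       "and Transformers and lets you quickly train T5 models.",
        "target_text": "simpleT5 makes it easy to train T5 models.",
    },
]
train_df = pd.DataFrame(rows)  # columns: source_text, target_text
eval_df = pd.DataFrame(rows)   # in practice, use a separate held-out split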
Articles
- Geek Culture: simpleT5 — Train T5 Models in Just 3 Lines of Code
- Abstractive Summarization with SimpleT5⚡️
- Training T5 model in just 3 lines of Code with ONNX Inference
- Kaggle: simpleT5⚡️ - Generating one line summary of papers
- YouTube: Abstractive Summarization Demo with SimpleT5
Acknowledgements