An easy-to-use wrapper library for the Transformers library.
Simple Transformers
This library is based on the Transformers library by HuggingFace. Simple Transformers lets you quickly train and evaluate Transformer models. Only 3 lines of code are needed to initialize, train, and evaluate a model.
Supported Tasks:
- Sequence Classification
- Token Classification (NER)
- Question Answering
- Language Model Fine-Tuning
- Language Model Training
- Language Generation
- T5 Model
- Seq2Seq Tasks
- Multi-Modal Classification
- Conversational AI
- Text Representation Generation
Setup
With Conda
- Install Anaconda or Miniconda Package Manager from here.
- Create a new virtual environment and install packages.
$ conda create -n st python pandas tqdm
$ conda activate st
Using CUDA:
$ conda install "pytorch>=1.6" cudatoolkit=11.0 -c pytorch
Without CUDA:
$ conda install pytorch cpuonly -c pytorch
- Install simpletransformers.
$ pip install simpletransformers
Optional
- Install Weights and Biases (wandb) for tracking and visualizing training in a web browser.
$ pip install wandb
Usage
All documentation is now live at simpletransformers.ai.
Simple Transformer models are built with a particular Natural Language Processing (NLP) task in mind. Each model comes equipped with features and functionality designed to best fit the task it is intended to perform. The high-level process of using Simple Transformers models follows the same pattern:
- Initialize a task-specific model
- Train the model with train_model()
- Evaluate the model with eval_model()
- Make predictions on (unlabelled) data with predict()
However, there are necessary differences between the models to ensure that each is well suited to its intended task. The key differences are typically in the input/output data formats and in any task-specific features or configuration options. These can all be found in the documentation section for each task.
The currently implemented task-specific Simple Transformer models, along with their tasks, are given below.
| Task | Model |
|---|---|
| Binary and multi-class text classification | ClassificationModel |
| Conversational AI (chatbot training) | ConvAIModel |
| Language generation | LanguageGenerationModel |
| Language model training/fine-tuning | LanguageModelingModel |
| Multi-label text classification | MultiLabelClassificationModel |
| Multi-modal classification (text and image data combined) | MultiModalClassificationModel |
| Named entity recognition | NERModel |
| Question answering | QuestionAnsweringModel |
| Regression | ClassificationModel |
| Sentence-pair classification | ClassificationModel |
| Text Representation Generation | RepresentationModel |
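Each of these models follows the same initialize, train, evaluate, predict pattern described above. As a minimal sketch of initializing one of the other task-specific models (use_cuda=False is an assumption added here so the snippet runs on CPU-only machines):

from simpletransformers.ner import NERModel

# Initialize a task-specific model: named entity recognition with a
# RoBERTa backbone. use_cuda=False keeps this runnable without a GPU.
model = NERModel("roberta", "roberta-base", use_cuda=False)

# The same pattern then applies: model.train_model(train_data),
# model.eval_model(eval_data), and model.predict(["Some arbitrary sentence"])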
- Please refer to the relevant section in the docs for more information on how to use these models.
- Example scripts can be found in the examples directory.
- See the Changelog for up-to-date changes to the project.
A quick example
from simpletransformers.classification import ClassificationModel, ClassificationArgs
import pandas as pd
import logging
logging.basicConfig(level=logging.INFO)
transformers_logger = logging.getLogger("transformers")
transformers_logger.setLevel(logging.WARNING)
# Preparing train data
train_data = [
["Aragorn was the heir of Isildur", 1],
["Frodo was the heir of Isildur", 0],
]
train_df = pd.DataFrame(train_data)
train_df.columns = ["text", "labels"]
# Preparing eval data
eval_data = [
["Theoden was the king of Rohan", 1],
["Merry was the king of Rohan", 0],
]
eval_df = pd.DataFrame(eval_data)
eval_df.columns = ["text", "labels"]
# Optional model configuration
model_args = ClassificationArgs(num_train_epochs=1)
# Create a ClassificationModel
model = ClassificationModel(
    "roberta", "roberta-base", args=model_args
)
# Train the model
model.train_model(train_df)
# Evaluate the model
result, model_outputs, wrong_predictions = model.eval_model(eval_df)
# Make predictions with the model
predictions, raw_outputs = model.predict(["Sam was a Wizard"])
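# eval_model returns a dict of evaluation metrics (result), the raw model
# outputs for each eval example, and the examples the model predicted
# incorrectly; predict returns predicted labels plus the raw model outputs.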
Experiment Tracking with Weights and Biases
- Weights and Biases makes it incredibly easy to keep track of all your experiments. Check it out on Colab here. A minimal sketch of enabling tracking is shown below.
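As a minimal sketch (assuming you have a wandb account and have run pip install wandb; the project name below is hypothetical), tracking can be enabled through the wandb_project model argument:

from simpletransformers.classification import ClassificationModel, ClassificationArgs

# Enable Weights and Biases logging via the model args.
# "my-st-experiments" is a hypothetical project name; use your own.
model_args = ClassificationArgs(
    num_train_epochs=1,
    wandb_project="my-st-experiments",
)

model = ClassificationModel("roberta", "roberta-base", args=model_args)
# Metrics logged during train_model() then appear in the wandb dashboard.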
Current Pretrained Models
For a list of pretrained models, see Hugging Face docs.
The model_types available for each task can be found under their respective sections. Any pretrained model of that type found in the Hugging Face docs should work. To use any of them, pass the correct model_type and model_name when creating the model.
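For example, a minimal sketch of swapping in a different pretrained model; "bert" with "bert-base-cased" is a standard model_type/model_name pairing from the Hugging Face docs, and use_cuda=False is an assumption added so the snippet runs on CPU:

from simpletransformers.classification import ClassificationModel

# Change the backbone by setting model_type and model_name together:
# here, a cased BERT checkpoint instead of roberta-base.
model = ClassificationModel("bert", "bert-base-cased", use_cuda=False)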
Contributors ✨
Thanks goes to these wonderful people (emoji key):
This project follows the all-contributors specification. Contributions of any kind welcome!
If you should be on this list but you aren't, or you are on the list but don't want to be, please don't hesitate to contact me!
How to Contribute
How to Update Docs
The latest version of the docs is hosted on GitHub Pages. If you want to help document Simple Transformers, the steps to edit the docs are below. The docs are built with the Jekyll library; refer to the Jekyll webpage for a detailed explanation of how it works.
- Install Jekyll: run the command gem install bundler jekyll
- Visualize the docs on your local computer: in your terminal, cd into the docs directory of this repo, e.g. cd simpletransformers/docs. From the docs directory, run bundle exec jekyll serve to serve the Jekyll docs locally, then browse to http://localhost:4000 (or whatever URL you see in the console) to view the docs.
- Edit and visualize changes: all the section pages of our docs can be found under the docs/_docs directory. You can edit any file by following the markdown format and visualize the changes after refreshing the browser tab.
Acknowledgements
None of this would have been possible without the hard work by the HuggingFace team in developing the Transformers library.
Icon for the Social Media Preview made by Freepik from www.flaticon.com.