
An easy-to-use wrapper library for the Transformers library.

Project description


Simple Transformers

This library is based on the Transformers library by HuggingFace. Simple Transformers lets you quickly train and evaluate Transformer models. Only 3 lines of code are needed to initialize, train, and evaluate a model.

Supported Tasks:

  • Information Retrieval (Dense Retrieval)
  • (Large) Language Models (Training, Fine-tuning, and Generation)
  • Encoder Model Training and Fine-tuning
  • Sequence Classification
  • Token Classification (NER)
  • Question Answering
  • Language Generation
  • T5 Model
  • Seq2Seq Tasks
  • Multi-Modal Classification
  • Conversational AI

Setup

With Conda

  1. Install the Anaconda or Miniconda package manager.
  2. Create a new virtual environment and install packages.
$ conda create -n st python pandas tqdm
$ conda activate st

With CUDA:

$ conda install "pytorch>=1.6" cudatoolkit=11.0 -c pytorch

Without CUDA:

$ conda install pytorch cpuonly -c pytorch

  3. Install simpletransformers.
$ pip install simpletransformers

Optional

  1. Install Weights and Biases (wandb) for tracking and visualizing training in a web browser.
$ pip install wandb

Usage

All documentation is now live at simpletransformers.ai.

Simple Transformer models are built with a particular Natural Language Processing (NLP) task in mind. Each model comes equipped with features and functionality designed to best fit its intended task. The high-level process of using Simple Transformers models follows the same pattern across tasks.

  1. Initialize a task-specific model
  2. Train the model with train_model()
  3. Evaluate the model with eval_model()
  4. Make predictions on (unlabelled) data with predict()

However, there are necessary differences between the models to ensure that each is well suited to its intended task. The key differences are typically the input/output data formats and any task-specific features/configuration options. These can all be found in the documentation section for each task.

The currently implemented task-specific Simple Transformer models, along with their tasks, are given below; a short initialization sketch follows the list.

Task | Model
---- | -----
Binary and multi-class text classification | ClassificationModel
Conversational AI (chatbot training) | ConvAIModel
Language generation | LanguageGenerationModel
Language model training/fine-tuning | LanguageModelingModel
Multi-label text classification | MultiLabelClassificationModel
Multi-modal classification (text and image data combined) | MultiModalClassificationModel
Named entity recognition | NERModel
Question answering | QuestionAnsweringModel
Regression | ClassificationModel
Sentence-pair classification | ClassificationModel
Text representation generation | RepresentationModel
Document retrieval | RetrievalModel
  • Please refer to the relevant section in the docs for more information on how to use these models.
  • Example scripts can be found in the examples directory.
  • See the Changelog for up-to-date changes to the project.
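Whichever task you pick, initialization follows the same pattern: the first argument selects the model type and the second a pretrained checkpoint. Below is a minimal sketch, assuming a CPU-only machine (hence use_cuda=False) and bert-base-cased as an illustrative checkpoint; any compatible checkpoint works the same way.

from simpletransformers.ner import NERModel
from simpletransformers.question_answering import QuestionAnsweringModel

# Named entity recognition: (model_type, model_name) plus optional settings
ner_model = NERModel("bert", "bert-base-cased", use_cuda=False)

# Question answering: only the task-specific class changes
qa_model = QuestionAnsweringModel("bert", "bert-base-cased", use_cuda=False)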

A quick example

from simpletransformers.classification import ClassificationModel, ClassificationArgs
import pandas as pd
import logging


logging.basicConfig(level=logging.INFO)
transformers_logger = logging.getLogger("transformers")
transformers_logger.setLevel(logging.WARNING)

# Preparing train data
train_data = [
    ["Aragorn was the heir of Isildur", 1],
    ["Frodo was the heir of Isildur", 0],
]
train_df = pd.DataFrame(train_data)
train_df.columns = ["text", "labels"]

# Preparing eval data
eval_data = [
    ["Theoden was the king of Rohan", 1],
    ["Merry was the king of Rohan", 0],
]
eval_df = pd.DataFrame(eval_data)
eval_df.columns = ["text", "labels"]

# Optional model configuration
model_args = ClassificationArgs(num_train_epochs=1)

# Create a ClassificationModel
model = ClassificationModel(
    "roberta", "roberta-base", args=model_args
)

# Train the model
model.train_model(train_df)

# Evaluate the model
result, model_outputs, wrong_predictions = model.eval_model(eval_df)

# Make predictions with the model
predictions, raw_outputs = model.predict(["Sam was a Wizard"])
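As the variable names suggest, eval_model() returns a dictionary of evaluation metrics along with the raw model outputs and the incorrectly predicted examples, while predict() returns one predicted label per input text together with the corresponding raw outputs.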

Experiment Tracking with Weights and Biases

  • Weights and Biases makes it incredibly easy to keep track of all your experiments. Check it out on Colab.
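As a minimal sketch of how tracking can be wired in (assuming wandb is installed and you are logged in; the project name "simpletransformers-demo" is just a placeholder), a W&B project can be set through the model args:

from simpletransformers.classification import ClassificationModel, ClassificationArgs

# Log training and evaluation metrics to a W&B project (placeholder name)
model_args = ClassificationArgs(wandb_project="simpletransformers-demo")

model = ClassificationModel("roberta", "roberta-base", args=model_args)
# Subsequent train_model()/eval_model() calls will log to the W&B project.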

Current Pretrained Models

For a list of pretrained models, see Hugging Face docs.

The model_types available for each task can be found under their respective section. Any pretrained model of that type found in the Hugging Face docs should work. To use any of them, set the correct model_type and model_name when initializing the model.
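For example, to swap the quick-start classifier onto DistilBERT, only those two values change. A minimal sketch, with distilbert-base-uncased as one illustrative checkpoint of that type:

from simpletransformers.classification import ClassificationModel

# model_type selects the architecture family; model_name selects the weights
model = ClassificationModel("distilbert", "distilbert-base-uncased")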


Contributors ✨

Thanks goes to these wonderful people (emoji key: 💻 code, 📖 documentation, 🐛 bug reports, 💬 answering questions, 🤔 ideas, 🚧 maintenance):


  • hawktang: 💻
  • Mabu Manaileng: 💻
  • Ali Hamdi Ali Fadel: 💻
  • Tovly Deutsch: 💻
  • hlo-world: 💻
  • huntertl: 💻
  • Yann Defretin: 💻 📖 💬 🤔
  • Manuel: 📖 💻
  • Gilles Jacobs: 📖
  • shasha79: 💻
  • Mercedes Garcia: 💻
  • Hammad Hassan Tarar: 💻 📖
  • Todd Cook: 💻
  • Knut O. Hellan: 💻 📖
  • nagenshukla: 💻
  • flaviussn: 💻 📖
  • Marc Torrellas: 🚧
  • Adrien Renaud: 💻
  • jacky18008: 💻
  • Matteo Senese: 💻
  • sarthakTUM: 📖 💻
  • djstrong: 💻
  • Hyeongchan Kim: 📖
  • Pradhy729: 💻 🚧
  • Iknoor Singh: 📖
  • Gabriel Altay: 💻
  • flozi00: 📖 💻 🚧
  • alexysdussier: 💻
  • Jean-Louis Queguiner: 📖
  • aced125: 💻
  • Laksh1997: 💻
  • Changlin_NLP: 💻
  • jpotoniec: 💻
  • fcggamou: 💻 📖
  • guy-mor: 🐛 💻
  • Cahya Wirawan: 💻
  • BjarkePedersen: 💻
  • tekkkon: 💻
  • Amit Garg: 💻
  • caprone: 🐛
  • Ather Fawaz: 💻
  • Santiago Castro: 📖
  • taranais: 💻
  • Pablo N. Marino: 💻 📖
  • Anton Kiselev: 💻 📖
  • Alex: 💻
  • Karthik Ganesan: 💻
  • Zhylko Dima: 💻
  • Jonatan Kłosko: 💻
  • sarapapi: 💻 💬
  • Abdul: 💻
  • James Milliman: 📖
  • Suraj Parmar: 📖
  • KwanHong Lee: 💬
  • Erik Fäßler: 💻
  • Thomas Søvik: 💬
  • Gagandeep Singh: 💻 📖
  • Andrea Esuli: 💻
  • DM2493: 💻
  • Nick Doiron: 💻
  • Abhinav Gupta: 💻
  • Martin H. Normark: 📖
  • Mossad Helali: 💻
  • calebchiam: 💻
  • Daniele Sartiano: 💻
  • tuner007: 📖
  • xia jiang: 💻
  • Hendrik Buschmeier: 📖
  • Mana Borwornpadungkitti: 📖
  • rayline: 💻
  • Mehdi Heidari: 💻
  • William Roe: 💻
  • Álvaro Abella Bascarán: 💻
  • Brett Fazio: 📖
  • Viet-Tien: 💻
  • Bisola Olasehinde: 💻 📖
  • William Chen: 📖
  • Reza Ebrahimi: 📖
  • gabriben: 📖
  • Prashanth Kurella: 💻
  • dopc: 💻
  • Tanish Tyagi: 📖 💻
  • kongyurui: 💻
  • Andrew Lensen: 💻
  • jinschoi: 💻
  • Le Nguyen Khang: 💻
  • Jordi Mas: 📖
  • mxa: 💻
  • MichelBartels: 💻
  • Luke Tudge: 📖
  • Saint: 💻
  • deltaxrg: 💻 📖
  • Fortune Adekogbe: 💻

This project follows the all-contributors specification. Contributions of any kind are welcome!

If you should be on this list but you aren't, or you are on the list but don't want to be, please don't hesitate to contact me!


How to Contribute

How to Update Docs

The latest version of the docs is hosted on GitHub Pages. If you want to help document Simple Transformers, the steps to edit the docs are below. Docs are built using the Jekyll library; refer to the Jekyll webpage for a detailed explanation of how it works.

  1. Install Jekyll: Run the command gem install bundler jekyll
  2. Visualize the docs on your local computer: In your terminal, cd into the docs directory of this repo, e.g.:
$ cd simpletransformers/docs
From the docs directory, run this command to serve the Jekyll docs locally:
$ bundle exec jekyll serve
Browse to http://localhost:4000, or whatever URL you see in the console, to visualize the docs.
  3. Edit and visualize changes: All the section pages of our docs can be found under the docs/_docs directory. You can edit any file you want by following the markdown format, and visualize the changes after refreshing the browser tab.

Acknowledgements

None of this would have been possible without the hard work by the HuggingFace team in developing the Transformers library.

Icon for the Social Media Preview made by Freepik from www.flaticon.com


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

simpletransformers-0.70.1.tar.gz (280.4 kB)

Uploaded Source

Built Distribution

simpletransformers-0.70.1-py3-none-any.whl (316.3 kB)

Uploaded Python 3

File details

Details for the file simpletransformers-0.70.1.tar.gz.

File metadata

  • Download URL: simpletransformers-0.70.1.tar.gz
  • Size: 280.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.12.3

File hashes

Hashes for simpletransformers-0.70.1.tar.gz
Algorithm | Hash digest
--------- | -----------
SHA256 | 17f1683a46638524ce1c8e30c75cf7d22d31c7ac7662a3e5f3506a4f0f32b0e4
MD5 | 244fc57f0fafc5db37341732971f4557
BLAKE2b-256 | d426646a6ea44cd83d54e611bc7f697f953687dda4b0ccc2f1c70e20c33f6072

See more details on using hashes here.

File details

Details for the file simpletransformers-0.70.1-py3-none-any.whl.

File metadata

File hashes

Hashes for simpletransformers-0.70.1-py3-none-any.whl
Algorithm | Hash digest
--------- | -----------
SHA256 | c5a36301bb3a92c9cf50037c0c1b5a0a97f0e08ae5e605b2c9abb3fe613ec5d9
MD5 | b50d293190aed875b269e5ac93dac46b
BLAKE2b-256 | e5851c49e063939c70b70e615ee003ef09a8ac82030303a4f1397d0be6590b3d

See more details on using hashes here.
