Happy Transformer
Documentation and news: happytransformer.com
Happy Transformer makes it easy to fine-tune NLP Transformer models and use them for inference.
New in 3.0.0
- DeepSpeed for training
- Apple's MPS for training and inference
- WandB to track training runs
- Data supplied for training is automatically split into training and evaluation portions
- Push models directly to Hugging Face's Model Hub
Read about the full 3.0.0 update, including breaking changes, on happytransformer.com. A sketch of the new training workflow follows.
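For orientation, here is a minimal sketch of what a 3.0.0-style training run could look like, using a text-generation model. The argument names `eval_ratio`, `deepspeed`, and `report_to`, and the `push` call, are assumptions inferred from the feature list above, not confirmed signatures; consult the documentation on happytransformer.com for the authoritative API.

```python
from happytransformer import HappyGeneration, GENTrainArgs

happy_gen = HappyGeneration("GPT2", "gpt2")

# Argument names below are assumptions based on the 3.0.0 feature list;
# check the official docs for the exact fields.
args = GENTrainArgs(
    num_train_epochs=1,
    eval_ratio=0.1,        # assumed: portion of the data held out for evaluation
    deepspeed="ZERO-2",    # assumed: enable DeepSpeed for training
    report_to=("wandb",),  # assumed: track the run with WandB
)

happy_gen.train("train.txt", args=args)

# Assumed: push the fine-tuned model directly to Hugging Face's Model Hub
happy_gen.push("your-username/your-model")
```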
Tasks
Tasks | Inference | Training
---|---|---
Text Generation | ✔ | ✔
Text Classification | ✔ | ✔
Word Prediction | ✔ | ✔
Question Answering | ✔ | ✔
Text-to-Text | ✔ | ✔
Next Sentence Prediction | ✔ |
Token Classification | ✔ |
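Each task maps to its own Happy class with the same construct-then-call pattern shown in the Quick Start below. As one more example, a minimal text-generation sketch (the model type and name are illustrative):

```python
from happytransformer import HappyGeneration

happy_gen = HappyGeneration("GPT2", "gpt2")  # model type and name are illustrative
result = happy_gen.generate_text("Artificial intelligence is ")
print(result.text)  # prints the generated continuation
```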
Quick Start
pip install happytransformer
from happytransformer import HappyWordPrediction
#--------------------------------------#
happy_wp = HappyWordPrediction() # default uses distilbert-base-uncased
result = happy_wp.predict_mask("I think therefore I [MASK]")
print(result) # [WordPredictionResult(token='am', score=0.10172799974679947)]
print(result[0].token) # am
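Beyond the default call, `predict_mask` accepts optional `top_k` and `targets` parameters (per the library's word-prediction docs) to return several candidates or to score only specific words:

```python
# Return the top 2 candidates instead of only the best one
results = happy_wp.predict_mask("I think therefore I [MASK]", top_k=2)
for result in results:
    print(result.token, result.score)

# Score only the supplied candidate words
targeted = happy_wp.predict_mask("I think therefore I [MASK]", targets=["am", "are"])
print(targeted[0].token)
```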
Maintainers
- Eric Fillion, Lead Maintainer
- Ted Brownlow, Maintainer
Tutorials
- Text generation with training (GPT-Neo)
- Text classification (training)
- Text classification (hate speech detection)
- Text classification (sentiment analysis)