Happy Transformer is an API built on top of Hugging Face's Transformer library that makes it easy to utilize state-of-the-art NLP models.
Project description
Happy Transformer
Documentation and news: happytransformer.com
New course: Create a text generation web app and learn how to fine-tune GPT-Neo.
Features
Public Methods | Basic Usage | Training |
---|---|---|
Text Generation | ✔ | ✔ |
Text Classification | ✔ | ✔ |
Word Prediction | ✔ | ✔ |
Question Answering | ✔ | ✔ |
Text-to-Text | ✔ | ✔ |
Next Sentence Prediction | ✔ | |
Token Classification | ✔ | |
Quick Start
pip install happytransformer
from happytransformer import HappyWordPrediction
#--------------------------------------#
happy_wp = HappyWordPrediction() # default uses distilbert-base-uncased
result = happy_wp.predict_mask("I think therefore I [MASK]")
print(result) # [WordPredictionResult(token='am', score=0.10172799974679947)]
print(result[0].token) # am
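The `predict_mask` call returns a list of result objects with `token` and `score` fields, as shown in the printed output above. A minimal sketch of how you might pick only a sufficiently confident prediction from such results — the `WordPredictionResult` class below is a local stand-in (its fields mirror the printed output) so the snippet runs without downloading a model, and the `min_score` threshold is an illustrative choice, not part of the library:

```python
from typing import List, NamedTuple, Optional

# Local stand-in for the library's result type; the token/score fields
# match the output printed in the Quick Start above.
class WordPredictionResult(NamedTuple):
    token: str
    score: float

def best_token(results: List[WordPredictionResult],
               min_score: float = 0.05) -> Optional[str]:
    """Return the highest-scoring token, or None if nothing is confident enough."""
    if not results:
        return None
    top = max(results, key=lambda r: r.score)
    return top.token if top.score >= min_score else None

results = [WordPredictionResult(token="am", score=0.10172799974679947)]
print(best_token(results))  # am
```

The same pattern applies to real `predict_mask` output, since each element exposes the same `token` and `score` attributes.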
Maintainers
- Eric Fillion (Lead Maintainer)
- Ted Brownlow (Maintainer)
Tutorials
- Text classification (training)
- Text classification (hate speech detection)
- Text classification (sentiment analysis)
Download files
Source Distribution
happytransformer-2.3.2.tar.gz (25.5 kB)
Built Distribution
Hashes for happytransformer-2.3.2-py3-none-any.whl
Algorithm | Hash digest |
---|---|
SHA256 | 086b3274a3103a0ef246a43d1dc308439a73836b9e4e6d889acf1c5e7f7a6c27 |
MD5 | 0288bdc519618ff088fcc42733f79203 |
BLAKE2b-256 | 4e4c0fa91635cd73b994e099a0040f57279709302e98bedd86e5518d3de5a090 |