Easily deploy HuggingFace Transformers on a website
🚀 Deploy Transformers 🤗
Deploy a SOTA model for text-generation in just three lines of code 💻
Installation
PyTorch and Transformers are required.
pip install deploy-transformers
For deployment, the file structure needs to look like this:
├── static
│   ├── script.js
│   └── style.css
├── templates
│   ├── 404.html
│   └── index.html
└── your_file.py
You can either clone this repository to get the original files, use the function website.create_structure(),
or create the structure yourself.
website.create_structure()
will automatically create templates/, static/, and all the files inside them (.html, .js, .css).
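As an illustration, the layout that website.create_structure() produces can be sketched with the standard library. This is only a sketch of the folder layout: the real function also writes default HTML/JS/CSS content into the files, while here we create empty placeholders.

```python
from pathlib import Path

# Folder layout shown in the tree above; file contents are omitted here,
# whereas the real create_structure() writes default HTML/JS/CSS.
LAYOUT = {
    "static": ["script.js", "style.css"],
    "templates": ["404.html", "index.html"],
}

def create_structure(root="."):
    """Create the templates/ and static/ folders with placeholder files."""
    root = Path(root)
    for folder, files in LAYOUT.items():
        (root / folder).mkdir(parents=True, exist_ok=True)
        for name in files:
            (root / folder / name).touch()

create_structure("demo")
```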
Usage
Check the examples/ folder.
# Deployment
from deploy_transformers import Website
website = Website(model_type="gpt2", model_name="distilgpt2")
# website.create_folder(homepage_file="index.html", template_folder='templates', static_folder='static')
website.deploy()
You can change the homepage filename and the templates/ and static/ folder names in website.deploy(),
but it's best to keep the defaults.
# Only text generation
from deploy_transformers import ListModels, Model
# ListModels() to show available models
model = Model("gpt2", "distilgpt2", seed=42, verbose=False)
model.generate(length=20, prompt="The quick brown fox jumps over the lazy dog")
# If no prompt is given, you will be asked for input until you exit
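The fallback behaviour described in the last comment can be sketched as follows. Here `fake_generate` is a hypothetical stand-in for the real model call, and the list of inputs stands in for repeated `input()` prompts, so the sketch stays self-contained:

```python
# Sketch of the prompt loop Model.generate() falls back to when no
# prompt is supplied: keep asking for input until the user exits.
# fake_generate is a hypothetical stand-in for the real model call.
def fake_generate(prompt, length=20):
    return (prompt + " ...")[:length]  # placeholder "generated" text

def prompt_loop(inputs):
    """Consume inputs (stand-in for input()) until 'exit' is seen."""
    outputs = []
    for text in inputs:
        if text.strip().lower() == "exit":
            break
        outputs.append(fake_generate(text))
    return outputs
```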
Thanks
- Transformers package by HuggingFace
- gpt-2-cloudrun by minimaxir
Notes
- Support other tasks such as sentiment analysis and Q&A.
- Add Flask option?