Curated transformer models
🤖 Curated transformers
This Python package provides a curated set of transformer models for spaCy. It is focused on deep integration into spaCy and will support deployment-focused features such as distillation and quantization. Curated transformers currently supports the following model types:
- ALBERT
- BERT
- CamemBERT
- RoBERTa
- XLM-RoBERTa
Supporting a wide variety of transformer models is a non-goal. If you want to use another type of model, use spacy-transformers, which allows you to use Hugging Face transformers models with spaCy.
⚠️ Warning: experimental package
This package is experimental, and the models may still change in incompatible ways.
⏳ Install
pip install spacy-curated-transformers
🚀 Quickstart
An example project is provided in the project directory.
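Before diving into the example project, a small sanity check can confirm that the installation works. The sketch below assumes the package installs under the module name `spacy_curated_transformers` and exposes its pipeline component through spaCy's normal factory registry (for example a `curated_transformer` factory); these names are assumptions rather than taken from this page, so treat them as placeholders and refer to the example project for the exact configuration.

```python
# A quick post-install sanity check: confirm that spaCy can see the factories
# this package registers. The module name "spacy_curated_transformers" and the
# "curated_transformer" factory name are assumptions; check the example project
# for the exact component names and configuration.
import spacy
import spacy_curated_transformers  # noqa: F401  importing registers the factories

nlp = spacy.blank("en")

# List every available factory whose name mentions "curated".
curated_factories = [name for name in nlp.factory_names if "curated" in name]
print("Curated factories:", curated_factories)
```

If the list comes back empty, the package most likely did not register its components, which usually points to an installation or version mismatch.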
Download files
Source Distribution: spacy-curated-transformers-0.1.0.tar.gz
Built Distribution: spacy_curated_transformers-0.1.0-py2.py3-none-any.whl
Hashes for spacy-curated-transformers-0.1.0.tar.gz

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | c982d9cf9704cca191d518c0086159361fe7d20741f383c74c40c88226fee5c9 |
| MD5 | 4169242e1ea968f24243c6ca7ffb8cec |
| BLAKE2b-256 | 91f65ef9a82c378ffabb5d5b7efb8ecf22972d7ef314f8fe9d801704dd6afad8 |
Hashes for spacy_curated_transformers-0.1.0-py2.py3-none-any.whl

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 2b2cbf8f5c86b46a1c640b7d1cda9745395d2fdb3e4f256b921993c3c2ca74ac |
| MD5 | 85196e7ff0cbb041623a1d1bd98e7df9 |
| BLAKE2b-256 | 26bb01d376b233bcddd5041684ea156d5646bb33e7af038821c8560cdf7b7c15 |
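To double-check a manually downloaded archive against the digests listed above, the SHA256 value can be recomputed locally. The snippet below is a minimal sketch using Python's standard hashlib; the local filename is assumed to match the source distribution's name and should be adjusted to wherever the archive was actually saved.

```python
# Verify a downloaded release file against the SHA256 digest published above.
# The local path "spacy-curated-transformers-0.1.0.tar.gz" is an assumption;
# point it at the file you actually downloaded.
import hashlib
from pathlib import Path

EXPECTED_SHA256 = "c982d9cf9704cca191d518c0086159361fe7d20741f383c74c40c88226fee5c9"

def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
    """Stream the file in chunks so large archives are not loaded fully into memory."""
    digest = hashlib.sha256()
    with path.open("rb") as handle:
        for chunk in iter(lambda: handle.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

actual = sha256_of(Path("spacy-curated-transformers-0.1.0.tar.gz"))
print("match" if actual == EXPECTED_SHA256 else f"MISMATCH: {actual}")
```

The same check works for the wheel by swapping in its filename and the corresponding SHA256 digest from the second table.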