🤖 Curated transformers
This Python package provides a curated set of transformer models for spaCy. It is focused on deep integration into spaCy and will support deployment-focused features such as distillation and quantization. Curated transformers currently supports the following model types:
- ALBERT
- BERT
- CamemBERT
- RoBERTa
- XLM-RoBERTa
Supporting a wide variety of transformer models is a non-goal. If you want to use another type of model, use spacy-transformers, which allows you to use Hugging Face transformers models with spaCy.
⚠️ Warning: experimental package
This package is experimental, and the models may still change in incompatible ways.
⏳ Install
pip install git+https://github.com/explosion/curated-transformers.git
🚀 Quickstart
An example project is provided in the project directory.
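The quickstart above can be sketched as a few shell commands. This is a hedged sketch, not documented usage: it assumes the example lives in the repository's project directory and is a standard spaCy project; the workflow name all is hypothetical, so check the project's project.yml for the real asset and workflow names.

```shell
# Clone the repository that contains the example spaCy project
git clone https://github.com/explosion/curated-transformers.git
cd curated-transformers/project

# Fetch the project's declared assets, then run a workflow
# ("all" is a hypothetical workflow name; see project.yml for the real ones)
python -m spacy project assets
python -m spacy project run all
```

The spacy project commands require spaCy to be installed in the active environment (it is pulled in when installing this package).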
Hashes for curated-transformers-0.0.4.tar.gz (source distribution)

Algorithm | Hash digest
---|---
SHA256 | e744b3fc1de53a8dde30cba6abb25b42b7b103747c793a32be1ea193e5cfa20d
MD5 | 09d95942cc3a1cfa4d0468e9ad2dc09f
BLAKE2b-256 | 850d6155e3c90161c3c4787f2f1980968d4de3678316276a6e44f4a69a3302bc
Hashes for curated_transformers-0.0.4-py2.py3-none-any.whl (built distribution)

Algorithm | Hash digest
---|---
SHA256 | b7dc95c1c2eaea8db668e6644eebb57255ba7d415243f951e6c9bd939a2ed5d8
MD5 | e2845260b2ef58c04260ff2203d28041
BLAKE2b-256 | 15ec2d0df852c167a15659aa46dcedcbcde12400330c7b5041e26861a1b64938