🤖 Curated transformers
This Python package provides a curated set of transformer models for spaCy. It focuses on deep integration with spaCy and will support deployment-focused features such as distillation and quantization. Curated transformers currently supports the following model types:
- ALBERT
- BERT
- CamemBERT
- RoBERTa
- XLM-RoBERTa
Supporting a wide variety of transformer models is a non-goal. If you want to use another type of model, use spacy-transformers, which allows you to use Hugging Face transformers models with spaCy.
⚠️ Warning: experimental package
This package is experimental, and the models may still change in incompatible ways.
⏳ Install
pip install curated-transformers
🚀 Quickstart
An example project is provided in the project directory.
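As a rough illustration of how a curated transformer is wired into a spaCy pipeline, the fragment below sketches the relevant part of a spaCy training config. The component factory and architecture names shown here (`curated_transformer`, `curated-transformers.XlmrTransformer.v1`) are illustrative assumptions, not the package's documented registry entries; consult the example project for the actual configuration.

```ini
# Hypothetical sketch of a spaCy training config using a curated
# transformer component. Factory and architecture names below are
# assumed for illustration only.
[nlp]
lang = "en"
pipeline = ["transformer","tagger"]

[components.transformer]
factory = "curated_transformer"  ; assumed factory name

[components.transformer.model]
@architectures = "curated-transformers.XlmrTransformer.v1"  ; assumed name

[components.tagger]
factory = "tagger"
```

In a real project this config would be filled out with `spacy init fill-config` and trained with `spacy train`, as in any spaCy v3 pipeline.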
Hashes for curated-transformers-0.0.7.tar.gz (source distribution)

Algorithm | Hash digest
---|---
SHA256 | 68526f0efaaa94fdc10e7181e21c723637a898e1c2b80550cb7220395654bfc6
MD5 | 56028e79dbc16a9bd8d0d73b9eb6cf12
BLAKE2b-256 | d9be199fbf7e32cb56eddfefda079216075756eec7a00744eb25acf5d1e41802
Hashes for curated_transformers-0.0.7-py2.py3-none-any.whl (built distribution)

Algorithm | Hash digest
---|---
SHA256 | 814f897469abba96132aa382815059dabcd607f5867e2b856a1d725e8cd75c94
MD5 | 09c6bbc7bf84af485838db7dc340010b
BLAKE2b-256 | 60fd5a9d1db1e85cc8a8334f9426f077476b7adb7396850dea7e22ab04a91d5b