🤖 Curated transformers
This Python package provides a curated set of transformer models for spaCy. It focuses on deep integration with spaCy and will support deployment-focused features such as distillation and quantization. Curated transformers currently supports the following model types:
- ALBERT
- BERT
- CamemBERT
- RoBERTa
- XLM-RoBERTa
Supporting a wide variety of transformer models is a non-goal. If you want to use another type of model, use spacy-transformers, which allows you to use Hugging Face transformers models with spaCy.
⚠️ Warning: experimental package
This package is experimental, and the models may still change in incompatible ways.
⏳ Install
pip install git+https://github.com/explosion/curated-transformers.git
🚀 Quickstart
An example project is provided in the project directory.
Hashes for curated-transformers-0.0.6.tar.gz

| Algorithm | Hash digest |
|---|---|
| SHA256 | dfe4fc34df2266b6b76291f1a226347fd57eec460c9738b7a0aed2e5facff0fb |
| MD5 | 76a9424e4cd0cfa6ccee1c111998a144 |
| BLAKE2b-256 | 773afba900077e19218f0e9d58b90d4ba4db99d1ad3d0d54564c64e8bb3ea14d |
Hashes for curated_transformers-0.0.6-py2.py3-none-any.whl

| Algorithm | Hash digest |
|---|---|
| SHA256 | 622b2473f52e24b2acfd99ec4d961817485e1b1ae886a4e43c7238c7b86c9d55 |
| MD5 | 672885bc900a1ce891baa497c587dd15 |
| BLAKE2b-256 | eabb8b4f2b8971aa7021c3080244287254ce33461706e6ebc158d78cf9c03a2b |
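If you download a distribution file manually, you can check it against the SHA256 digests above. A minimal sketch using only the Python standard library (`sha256_of` is our own helper name, not part of this package):

```python
import hashlib


def sha256_of(path, chunk_size=8192):
    """Return the hex SHA256 digest of the file at `path`, read in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()


# Expected digest for the source distribution of this release:
EXPECTED_SDIST_SHA256 = (
    "dfe4fc34df2266b6b76291f1a226347fd57eec460c9738b7a0aed2e5facff0fb"
)

# Example (assumes the file was downloaded to the current directory):
# assert sha256_of("curated-transformers-0.0.6.tar.gz") == EXPECTED_SDIST_SHA256
```

Chunked reading keeps memory usage constant regardless of file size, which matters for larger wheels.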