Curated transformer models for spaCy pipelines
Reason this release was yanked: missing wheel
Project description
💫 🤖 spaCy Curated Transformers
This package provides spaCy components and architectures to use a curated set of transformer models via curated-transformers in spaCy.
Features
- Use pretrained models based on one of the following architectures to power your spaCy pipeline:
  - ALBERT
  - BERT
  - CamemBERT
  - RoBERTa
  - XLM-RoBERTa
- All the nice features supported by spacy-transformers, such as support for the Hugging Face Hub, multi-task learning, the extensible config system, and out-of-the-box serialization
- Deep integration into spaCy, which lays the groundwork for deployment-focused features such as distillation and quantization
- Minimal dependencies
⏳ Installation
Installing the package with pip will automatically install all dependencies.
pip install spacy-curated-transformers
🚀 Quickstart
An example project is provided in the project directory.
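Once the package is installed, the transformer component is declared in the spaCy training config like any other pipeline component. The sketch below assumes the factory is registered as "curated_transformer" and that architectures follow the "spacy-curated-transformers.*" naming convention; check the API reference below for the exact registered names:

```ini
# Minimal sketch of a training-config fragment (names are assumptions;
# verify against the Transformer architectures reference).
[components.transformer]
factory = "curated_transformer"

[components.transformer.model]
@architectures = "spacy-curated-transformers.XlmrTransformer.v1"
```

Downstream components can then connect to the transformer's output through spaCy's usual listener mechanism.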
📖 Documentation
- 📘 Layers and Model Architectures: Power spaCy components with custom neural networks
- 📗 CuratedTransformer: Pipeline component API reference
- 📗 Transformer architectures: Architectures and registered functions
Bug reports and other issues
Please use spaCy's issue tracker to report a bug, or open a new thread on the discussion board for any other issue.
Download files
Source Distribution
Hashes for spacy_curated_transformers-2.1.0.tar.gz
Algorithm | Hash digest
---|---
SHA256 | 25b386b023e21d7a87fd40d6fd519aeb054748c89dddcdd3bbd81e1650f12433
MD5 | b5ca3488665b69de31f1766ab168187f
BLAKE2b-256 | 693791834f5efe7bdf6e6ff99471f616ecdeb6a62530f557aac4764f1fc54c73