
🤖 Curated transformers

This Python package provides a curated set of transformer models for spaCy. It is focused on deep integration into spaCy and will support deployment-focused features such as distillation and quantization. Curated transformers currently supports the following model types:

  • ALBERT
  • BERT
  • CamemBERT
  • RoBERTa
  • XLM-RoBERTa

Supporting a wide variety of transformer models is a non-goal. If you want to use another type of model, use spacy-transformers instead, which allows you to use any Hugging Face transformers model with spaCy.

⚠️ Warning: experimental package

This package is experimental, and the models may still change in incompatible ways.

⏳ Install

pip install spacy-curated-transformers

🚀 Quickstart

An example project is provided in the project directory.
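Like other spaCy transformer integrations, the package is typically wired into a pipeline through a spaCy training config. The fragment below is a minimal sketch only: the `curated_transformer` factory and the `spacy-curated-transformers.XlmrTransformer.v1` / `spacy-curated-transformers.WithStridedSpans.v1` architecture names are assumptions about the package's registered components; check the bundled example project for the exact names and required settings.

```ini
# Sketch of a [components] block for a curated transformer (names assumed).
[components.transformer]
factory = "curated_transformer"

[components.transformer.model]
@architectures = "spacy-curated-transformers.XlmrTransformer.v1"

[components.transformer.model.with_spans]
@architectures = "spacy-curated-transformers.WithStridedSpans.v1"
```

Downstream components would then connect to this transformer via a listener layer, as in the example project.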

Download files

Source distribution: spacy-curated-transformers-0.0.8.tar.gz (47.7 kB)


File hashes for spacy-curated-transformers-0.0.8.tar.gz:

  Algorithm    Hash digest
  SHA256       179e8497c73589ce3df86401dad9301bbd7a0d1a70979777354fb4d21bc27a15
  MD5          eeeb75dc9550b857bfeddfde000375cb
  BLAKE2b-256  b06699c785ec1d4ec244661ca2ce887e997c32ce9c1a1301f865205e168afc28

You can use these hashes to verify the integrity of a downloaded file.
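To check a download against the published SHA256 digest, you can compute the file's hash locally with Python's standard-library `hashlib`. This is a generic sketch; the file path and expected digest in the comment are taken from the listing above and assume the archive sits in the current directory.

```python
import hashlib


def sha256_of_file(path: str) -> str:
    """Return the hex SHA-256 digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()


# Compare against the published digest, e.g.:
# expected = "179e8497c73589ce3df86401dad9301bbd7a0d1a70979777354fb4d21bc27a15"
# assert sha256_of_file("spacy-curated-transformers-0.0.8.tar.gz") == expected
```

Alternatively, `pip` can enforce hashes at install time via its hash-checking mode (`--require-hashes` with a pinned requirements file).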
