Parallel and distributed training with spaCy and Ray
⚠️ This repo is still a work in progress and requires the new spaCy v3.0.
Ray is a fast and simple framework for building and running distributed applications. This lightweight extension package lets you use Ray for parallel and distributed training with spaCy. If `spacy-ray` is installed in the same environment as spaCy, it automatically adds `spacy ray` commands to your spaCy CLI.
The main command is `spacy ray train` for parallel and distributed training, but we expect to add `spacy ray pretrain` and `spacy ray parse` as well.
🚀 Quickstart
You can install `spacy-ray` via pip:

```bash
pip install spacy-ray
```
To check that the command has been registered successfully:

```bash
python -m spacy ray --help
```
Train a model using the same API as `spacy train`:

```bash
python -m spacy ray train config.cfg --n-workers 2
```
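The `config.cfg` passed here is a standard spaCy v3 training config. A heavily trimmed sketch is shown below for orientation only; the paths, pipeline components, and values are illustrative, and a complete, valid config is best generated with `python -m spacy init config config.cfg`:

```ini
# Illustrative fragment of a spaCy v3 training config (not complete).
# Generate a full config with: python -m spacy init config config.cfg

[paths]
train = "./train.spacy"   # training data in spaCy's binary .spacy format
dev = "./dev.spacy"       # development/evaluation data

[nlp]
lang = "en"
pipeline = ["tok2vec", "ner"]

[training]
seed = 0
max_steps = 2000
```

Because `spacy ray train` mirrors the `spacy train` API, the same config works for both; the Ray-specific behavior is controlled by CLI options such as `--n-workers`.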
Hashes for spacy_ray-0.1.4-py2.py3-none-any.whl

| Algorithm | Hash digest |
|---|---|
| SHA256 | `ebbe8268ee642eb460a18dc243269bb340c31214d32409c5b9c21beffecf0bce` |
| MD5 | `8634b8732970c4c690cee6315c808b33` |
| BLAKE2b-256 | `2f6526795a2901c38466b64e0a17447d987ec31b50145ef4d3fc16c94f9480ea` |