spacy-ray: Parallel and distributed training with spaCy and Ray
⚠️ This repo is still a work in progress and requires the new spaCy v3.0.
Ray is a fast and simple framework for building and running distributed applications. This very lightweight extension package lets you use Ray for parallel and distributed training with spaCy. If spacy-ray is installed in the same environment as spaCy, it will automatically add the spacy ray commands to your spaCy CLI.
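The automatic registration works through Python package entry points: installing spacy-ray exposes its commands so that spaCy's CLI can discover them at import time. As a rough, hedged illustration of that discovery mechanism (the group name "spacy_cli" is an assumption here; check the spacy-ray package metadata for the exact group it registers under):

```python
from importlib.metadata import entry_points

def spacy_cli_extensions():
    """Return the names of installed packages that register extra
    spaCy CLI commands via entry points.

    The entry-point group name "spacy_cli" is assumed for this sketch;
    the real group used by spacy-ray may differ.
    """
    try:
        # Python 3.10+ supports selecting a group directly
        eps = entry_points(group="spacy_cli")
    except TypeError:
        # Older Pythons return a dict keyed by group name
        eps = entry_points().get("spacy_cli", [])
    return [ep.name for ep in eps]
```

In an environment without spacy-ray installed this simply returns an empty list; after installation the registered command entry points would appear here.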
The main command is spacy ray train for parallel and distributed training, but we expect to add spacy ray pretrain and spacy ray parse as well.
You can install spacy-ray from pip:
pip install spacy-ray
To check if the command has been registered successfully:
python -m spacy ray --help
Train a model using the same API as spacy train:
python -m spacy ray train config.cfg --n-workers 2
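The config.cfg passed here is a standard spaCy v3 training config, the same file you would hand to spacy train. A minimal fragment might look like the sketch below; the paths and pipeline components are illustrative only, and a complete, valid config should be generated with python -m spacy init config:

```ini
[paths]
train = "corpus/train.spacy"
dev = "corpus/dev.spacy"

[nlp]
lang = "en"
pipeline = ["tok2vec", "ner"]

[training]
max_epochs = 10
```

The --n-workers flag controls how many Ray workers share the training run.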
Available files:
- spacy_ray-0.0.0-py2.py3-none-any.whl (13.6 kB, wheel, py2.py3)
- spacy_ray-0.0.0.tar.gz (12.8 kB, source)