Zelda Rose
A trainer for transformer-based models.
Installation
Simply install with pip (preferably in a virtual env, you know the drill)
pip install zeldarose
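For instance, assuming a standard Python 3 setup (the .venv directory name is just a convention, not something the project requires), a fresh virtual environment can be set up like this:

python3 -m venv .venv
source .venv/bin/activate
pip install zeldarose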
Train a model
Here is a short example:
TOKENIZERS_PARALLELISM=true zeldarose-tokenizer --vocab-size 4096 --out-path local/tokenizer --model-name "my-muppet" tests/fixtures/raw.txt
zeldarose-transformer --tokenizer local/tokenizer --pretrained-model flaubert/flaubert_small_cased --out-dir local/muppet --val-text tests/fixtures/raw.txt tests/fixtures/raw.txt
There are other parameters (see zeldarose-transformer --help for a comprehensive list); the one you are probably most interested in is --config (for which there is an example in examples/). The parameters --pretrained-model, --tokenizer and --model-config are all fed directly to Huggingface's transformers and can be either pretrained model names or local paths.
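For instance, the following variation on the example above (same flags, same fixture files) reuses the flaubert/flaubert_small_cased Hub identifier for both the tokenizer and the pretrained model instead of a local tokenizer path; any valid model name or local directory should work the same way:

zeldarose-transformer --tokenizer flaubert/flaubert_small_cased --pretrained-model flaubert/flaubert_small_cased --out-dir local/muppet --val-text tests/fixtures/raw.txt tests/fixtures/raw.txt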
Distributed training
This is somewhat tricky; you have several options:
- If you are running on a SLURM cluster, use --strategy ddp and invoke via srun (a sketch of such a job script follows this list).
- Otherwise, you have two options:
  - Run with --strategy ddp_spawn, which uses multiprocessing.spawn to start the process swarm (tested, but possibly slower and more limited, see the pytorch-lightning doc).
  - Run with --strategy ddp and start with torch.distributed.launch with --use_env and --no_python (untested).
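As a rough sketch of the SLURM route, assuming a cluster with GPUs (the job name, node count and GPU counts below are illustrative placeholders, not values prescribed by the project), a batch script could look like this:

#!/bin/bash
#SBATCH --job-name=zeldarose-train
#SBATCH --nodes=2
#SBATCH --ntasks-per-node=4
#SBATCH --gres=gpu:4

# srun launches one task per requested process; with --strategy ddp,
# pytorch-lightning should detect the SLURM environment and coordinate
# the DDP workers across nodes.
srun zeldarose-transformer \
    --strategy ddp \
    --tokenizer local/tokenizer \
    --pretrained-model flaubert/flaubert_small_cased \
    --out-dir local/muppet \
    --val-text tests/fixtures/raw.txt \
    tests/fixtures/raw.txt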
Other hints
- Data management relies on 🤗 datasets and uses their cache management system. To run in a clean environment, you might have to check the cache directory pointed to by the HF_DATASETS_CACHE environment variable.
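For example, to point that cache at a throwaway location for a single run (the directory below is just an illustration), you can set the variable inline:

HF_DATASETS_CACHE=/tmp/hf_datasets_cache zeldarose-transformer --tokenizer local/tokenizer --pretrained-model flaubert/flaubert_small_cased --out-dir local/muppet --val-text tests/fixtures/raw.txt tests/fixtures/raw.txt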
Inspirations