A task queue for parallel processing
tc4v-tasq
A parallel-safe task queue on disk.
About
tasq maintains a database of tasks and lets you claim tasks and mark them as done
in a multi-process-safe way. It has two backends: an SQLite-based one for general
single-machine multi-process use, and a file-based one for multi-machine multi-process
use (on a shared filesystem).
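To see why a file-based backend can be safe across machines, consider that shared POSIX filesystems make `rename` atomic: at most one process can successfully rename a given file. The sketch below illustrates that general pattern only; it is an assumption about how such backends commonly work, not tasq's actual implementation (the `claim_task` function and the `.pending`/`.claimed` naming are invented for illustration).

```python
import os
import tempfile

def claim_task(queue_dir: str, task_id: str) -> bool:
    """Try to claim a task by atomically renaming its 'pending' marker.

    os.rename is atomic on POSIX filesystems, so at most one process
    succeeds; every other contender gets FileNotFoundError and moves on.
    Illustrative pattern only, not tasq's actual code.
    """
    pending = os.path.join(queue_dir, f"{task_id}.pending")
    claimed = os.path.join(queue_dir, f"{task_id}.claimed")
    try:
        os.rename(pending, claimed)
        return True
    except FileNotFoundError:
        return False

# Two claim attempts racing for the same task: exactly one wins.
queue_dir = tempfile.mkdtemp()
open(os.path.join(queue_dir, "abc123.pending"), "w").close()
results = [claim_task(queue_dir, "abc123"), claim_task(queue_dir, "abc123")]
print(results)  # [True, False]
```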
Installation
pip install tc4v-tasq
or
git clone https://git.sr.ht/~lattay/tasq.py
cd tasq.py
pip install .
Usage
A task is defined only by a string of text, its name, which should be unique. The interpretation of the name is up to the user.
To create a new task queue, use init and provide the initial list of tasks from a file or stdin:
$ tasq init tasks.db - <<EOF
./work/task1 3
./work/task2 9
# comments and empty lines are ignored
./work/task3 0 # inline comments work too
EOF
Tasks can be added later in a similar fashion with add:
$ tasq add tasks.db - <<EOF
./work/task4 blublublu
./work/task5 "it uses shlex so spaces are ok"
./work/task6 blablabla
EOF
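As the task5 line above suggests, input lines are tokenized with shell-style quoting via Python's standard shlex module, so quoted arguments containing spaces stay intact. A quick illustration (whether tasq uses shlex's own comment handling for the `#` comments, or strips them separately, is an assumption; the README only says comments are ignored):

```python
import shlex

# A quoted argument survives as a single token.
line = './work/task5 "it uses shlex so spaces are ok"'
print(shlex.split(line))
# ['./work/task5', 'it uses shlex so spaces are ok']

# shlex can also drop '#' comments, matching the behavior described above.
commented = shlex.split('./work/task3 0 # inline comments work too', comments=True)
print(commented)
# ['./work/task3', '0']
```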
To see the current state of the queue, use list:
$ tasq list tasks.db
a6cc64a4401b4e0a94489b43bd22ac72 pending 2025-11-22 21:53:31 ['./work/task1', '3']
390dda92b89f4b4a9ed519fc86356fd7 pending 2025-11-22 21:53:31 ['./work/task2', '9']
a2380735393e4f07a7b4b67eb7a969eb pending 2025-11-22 21:53:31 ['./work/task3', '0']
482955a8421940a5b4874d4987affb30 pending 2025-11-22 21:53:37 ['./work/task4', 'blublublu']
1d875235a5a64d16921fa9d5e0d54309 pending 2025-11-22 21:53:37 ['./work/task5', 'it uses shlex so spaces are ok']
b030397203a94303b1a6a1960ba78aaf pending 2025-11-22 21:53:37 ['./work/task6', 'blablabla']
The hex string at the beginning of each line is a unique identifier for the task; it is used to interact with the task in later commands.
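The identifiers shown above are 32-character lowercase hex strings, the same shape Python's `uuid.uuid4().hex` produces. Whether tasq actually generates them this way is an assumption; the snippet only shows what that shape looks like:

```python
import uuid

# A random UUID rendered as 32 hex characters, e.g.
# 'a6cc64a4401b4e0a94489b43bd22ac72' in the listing above.
task_id = uuid.uuid4().hex
print(task_id)
print(len(task_id))  # 32
```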
Tasks can be claimed using claim:
$ tasq claim tasks.db -n 2 --data
390dda92b89f4b4a9ed519fc86356fd7 ["./work/task2", "9"]
a6cc64a4401b4e0a94489b43bd22ac72 ["./work/task1", "3"]
and marked done with done:
$ tasq done tasks.db a6cc64a4401b4e0a94489b43bd22ac72
See also tasq help for more commands.
There is also a second script, pll, that does something similar to GNU parallel
but driven by a task queue.
For example the following command:
$ pll tasks.db -n 2 ./process.sh '{0}' '{1}'
will process all tasks by calling the command ./process.sh with each task's
arguments (placeholders follow str.format syntax) using two parallel workers.
The outputs are printed concatenated, with a # <TASK_ID> line above each
task's output.
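The `{0}` and `{1}` placeholders are standard Python str.format positional fields, filled from the task's arguments. How pll assembles the final argv internally is an assumption; this sketch only demonstrates the placeholder substitution for the task `['./work/task1', '3']` from the listing above:

```python
# Command template as given on the pll command line.
task_args = ["./work/task1", "3"]
template = ["./process.sh", "{0}", "{1}"]

# Each element is formatted with the task's arguments as positional fields;
# elements without placeholders pass through unchanged.
argv = [part.format(*task_args) for part in template]
print(argv)  # ['./process.sh', './work/task1', '3']
```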
Because it uses the task queue, you can have more than one pll running in
parallel (for example, when using Slurm, each Slurm job can run its own pll).
File details
Details for the file tc4v_tasq-0.1.1.tar.gz.
File metadata
- Download URL: tc4v_tasq-0.1.1.tar.gz
- Upload date:
- Size: 15.9 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: uv/0.7.22
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `344524d2ccf62e9597bb72e73ff0b82380a46fa46d64a0101c67c312d9bfa34d` |
| MD5 | `d4df456a17b0c6b1044428f928946f2a` |
| BLAKE2b-256 | `73f4043ec17d6f4d7b8f60c400d3aa7e5bc2d25d8ec9e9f6d90afbf005e53c9b` |
File details
Details for the file tc4v_tasq-0.1.1-py3-none-any.whl.
File metadata
- Download URL: tc4v_tasq-0.1.1-py3-none-any.whl
- Upload date:
- Size: 17.1 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: uv/0.7.22
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `dc3ac1b144877c43ae031bf254d030f42ef432a5415051e8cea6ff5e5cd602dc` |
| MD5 | `7c9b07a99e88191414656fd44cd0d02b` |
| BLAKE2b-256 | `919e424fe4321c509b0fd05c7611cf7d6c418383c90d01e0fcbd764e4a94607a` |