
A queue service for quickly running scripts sequentially to use all your CPUs efficiently

Project description

Inspired by code from https://github.com/fastai/fastgpu

fastcpu

A queue service for quickly developing scripts that use all your CPUs efficiently

fastcpu provides a single command, fastcpu_poll, which polls a directory for scripts to run and runs each one on the first available CPU. If no CPU is available, it waits until one is. If more than one CPU is available, multiple scripts are run in parallel, one per CPU. (Note: CPU load checking is not yet implemented, so scripts are currently run sequentially at the polling interval.)

Installation

pip install fastcpu

How to use

--help provides command help:

$ fastcpu_poll --help

optional arguments:

  -h, --help                         show this help message and exit
  --path PATH                        Path containing `to_run` directory (default: .)
  --exit_when_empty EXIT_WHEN_EMPTY  Exit when `to_run` is empty (default: 1)
  --poll_interval POLL_INTERVAL      The duration between polls (default: 0.1)

If installed via pip, a command line entry point is available:

fastcpu_poll --path /path/to/scripts --exit_when_empty 0 --poll_interval 60

If running as a module

python -m fastcpu.cliu --path /path/to/scripts --exit_when_empty 0 --poll_interval 60

The above examples run scripts located in the to_run subdirectory of the directory being monitored. Because exit_when_empty is set to 0, the program will not exit when there are no scripts left to run; it will keep polling. The polling interval here is 60 seconds; fractional values such as 0.1 are also accepted.

Once the program starts it creates the following directory structure. You can then place your scripts in the to_run folder, and the scripts are run sequentially:

.
├── complete
├── fail
├── out
├── running
└── to_run
    ├── script_example1.sh
    └── script_example2.sh

fastcpu_poll will run each script in to_run in sorted order. Assigning each script to its own CPU is planned for a future release.

Once a script is selected to run, it is moved into the running directory. When it finishes, it is moved into complete or fail as appropriate. stdout and stderr are captured to files with the same name as the script, with stdout or stderr appended.

If exit_when_empty is 1 (which is the default), then once all scripts are run, fastcpu_poll will exit. If it is 0 then fastcpu_poll will continue running until it is killed; it will keep polling for any new scripts that are added to to_run.

Download files


Source Distribution

fastcpu-1.0.1.tar.gz (5.2 kB)

Uploaded Source

Built Distribution

fastcpu-1.0.1-py3-none-any.whl (6.4 kB)

Uploaded Python 3

File details

Details for the file fastcpu-1.0.1.tar.gz.

File metadata

  • Download URL: fastcpu-1.0.1.tar.gz
  • Size: 5.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.3.0 pkginfo/1.7.0 requests/2.25.1 setuptools/53.0.0 requests-toolbelt/0.9.1 tqdm/4.58.0 CPython/3.8.5

File hashes

Hashes for fastcpu-1.0.1.tar.gz:

  • SHA256: bb252e00715e24a7f4a1f5e19ef8a956ae244813d79a9c3287fc516fb668d130
  • MD5: a3bae3aa51059be1abf41b2c19b6d5c1
  • BLAKE2b-256: a780b7e692c3a6a5c2a210425adc5776800780fe5db46cfd138b24abd931ade1


File details

Details for the file fastcpu-1.0.1-py3-none-any.whl.

File metadata

  • Download URL: fastcpu-1.0.1-py3-none-any.whl
  • Size: 6.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.3.0 pkginfo/1.7.0 requests/2.25.1 setuptools/53.0.0 requests-toolbelt/0.9.1 tqdm/4.58.0 CPython/3.8.5

File hashes

Hashes for fastcpu-1.0.1-py3-none-any.whl:

  • SHA256: 7414096520441cca14d2705f25bcd2d942c66b0c2e2fb08b171986039f4068c7
  • MD5: 7c29537939f5ba03f75dc5c4fb480e13
  • BLAKE2b-256: 4ddedc1c45b0388b637e44e8b9ee7be2f648983ee5cc5dbbf79f676bf1e507a7

