
fastcpu

Inspired by, and adapted from, https://github.com/fastai/fastgpu

A queue service for quickly developing scripts that use all your CPUs efficiently

fastcpu provides a single command, fastcpu_poll, which polls a directory for scripts to run and then runs them on the first available CPU. If no CPU is available, it waits until one is; if more than one is available, multiple scripts are run in parallel, one per CPU. (Note: CPU load checking is not currently implemented, so scripts are simply run sequentially at each polling interval.)
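The polling behaviour described above can be sketched roughly as follows. This is a hypothetical illustration, not the actual fastcpu source: it runs each script found in `to_run` one at a time and, for simplicity, deletes it afterwards instead of moving it into the `complete`/`fail` directories that fastcpu maintains.

```python
import subprocess
import time
from pathlib import Path

def poll(path=".", exit_when_empty=True, poll_interval=0.1):
    """Run scripts from `to_run` sequentially, polling until it is empty
    (or forever, if exit_when_empty is False)."""
    to_run = Path(path) / "to_run"
    while True:
        # Scripts are picked up in sorted order, as fastcpu_poll does.
        scripts = sorted(p for p in to_run.iterdir() if p.is_file())
        for script in scripts:
            subprocess.run(["sh", str(script)])  # one script at a time
            script.unlink()  # simplified: fastcpu moves it to complete/fail
        if not scripts and exit_when_empty:
            return
        time.sleep(poll_interval)  # wait before polling again
```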

Installation

pip install fastcpu

How to use

--help provides command help:

$ fastcpu_poll --help

optional arguments:

  -h, --help                         show this help message and exit
  --path PATH                        Path containing `to_run` directory (default: .)
  --exit_when_empty EXIT_WHEN_EMPTY  Exit when `to_run` is empty (default: 1)
  --poll_interval POLL_INTERVAL      The duration between polls (default: 0.1)

If installed via pip, a handy command-line entry point is available:

fastcpu_poll --path /path/to/scripts --exit_when_empty 0 --poll_interval 60

If running as a module:

python -m fastcpu.cliu --path /path/to/scripts --exit_when_empty 0 --poll_interval 60

The above examples will run scripts located in the to_run subdirectory of the directory being monitored. Because exit_when_empty is set to 0, the program will not exit when there are no scripts left to run; it will keep polling. The polling interval here is 60 seconds, but it can also be set to a fraction of a second, e.g. 0.1.

Once the program starts, it creates the following directory structure. You can then place your scripts in the to_run folder, and the scripts are run sequentially:

.
├── complete
├── fail
├── out
├── running
└── to_run
    ├── script_example1.sh
    └── script_example2.sh

fastcpu_poll will run each script in to_run in sorted order. Assigning each script to its own CPU is planned for a future release.

Once a script is selected to be run, it is moved into a directory called running. Once it's finished, it's moved into complete or fail as appropriate. stdout and stderr are captured to files with the same name as the script, plus stdout or stderr appended.

If exit_when_empty is 1 (which is the default), then once all scripts are run, fastcpu_poll will exit. If it is 0 then fastcpu_poll will continue running until it is killed; it will keep polling for any new scripts that are added to to_run.
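The per-script lifecycle described above (move to running, capture output, move to complete or fail) can be sketched like this. This is a hypothetical illustration rather than the actual fastcpu source, and it assumes the output files land in the out/ directory from the tree above, which the README does not state explicitly:

```python
import shutil
import subprocess
from pathlib import Path

def run_one(script: Path, base: Path) -> bool:
    """Move `script` through running -> complete/fail, capturing its output."""
    running = base / "running" / script.name
    shutil.move(str(script), str(running))  # mark the script as running
    # Capture stdout and stderr to files named after the script.
    with open(base / "out" / (script.name + ".stdout"), "wb") as out, \
         open(base / "out" / (script.name + ".stderr"), "wb") as err:
        result = subprocess.run(["sh", str(running)], stdout=out, stderr=err)
    # Success or failure decides the script's final resting place.
    dest = "complete" if result.returncode == 0 else "fail"
    shutil.move(str(running), str(base / dest / running.name))
    return result.returncode == 0
```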
