A queue service for quickly running scripts sequentially to use all your CPUs efficiently

Project description

Inspired by code from https://github.com/fastai/fastgpu

fastcpu

A queue service for quickly developing scripts that use all your CPUs efficiently

PULL REQUESTS WELCOME!

fastcpu provides a single command, fastcpu_poll, which polls a directory for scripts to run and then runs them on the first available CPU. If no CPUs are available, it waits until one is; if more than one CPU is available, multiple scripts are run in parallel, one per CPU. (Note: CPU load checking is not yet implemented, so for now scripts are run sequentially at the polling interval.)

Installation

pip install fastcpu

How to use

--help provides command help:

$ fastcpu_poll --help

optional arguments:

  -h, --help                         show this help message and exit
  --path PATH                        Path containing `to_run` directory (default: .)
  --exit_when_empty EXIT_WHEN_EMPTY  Exit when `to_run` is empty (default: 1)
  --poll_interval POLL_INTERVAL      The duration between polls (default: 0.1)

If installed via pip, a handy command line entry point is available:

fastcpu_poll --path /path/to/scripts --exit_when_empty 0 --poll_interval 60

If running as a module:

python -m fastcpu.cliu --path /path/to/scripts --exit_when_empty 0 --poll_interval 60

The above examples will run scripts located in the to_run subdirectory of the directory being monitored. Because exit_when_empty is set to 0, the program will not exit when there are no scripts left to run; it will keep polling. The polling interval here is 60 seconds; it can also be set to a fraction of a second, e.g. 0.1.

Once the program starts, it creates the following directory structure. You can then place your scripts in the to_run folder, and the scripts are run sequentially:

.
├── complete
├── fail
├── out
├── running
└── to_run
    ├── script_example1.sh
    └── script_example2.sh
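The same layout can also be created by hand before the first run. A minimal sketch (the work/ path and the script contents here are illustrative, not part of fastcpu):

```shell
# Create the queue layout by hand (fastcpu_poll also creates it on startup)
mkdir -p work/to_run work/running work/complete work/fail work/out

# Queue a trivial job; scripts in to_run are picked up in sorted order
cat > work/to_run/script_example1.sh <<'EOF'
#!/bin/sh
echo "example job"
EOF
chmod +x work/to_run/script_example1.sh
```

With the queue populated, `fastcpu_poll --path work` would start draining it.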

fastcpu_poll will run each script in to_run in sorted order. Each script will be assigned to one CPU (planned for a future release).

Once a script is selected to be run, it is moved into a directory called running. Once it's finished, it's moved into complete or fail as appropriate. stdout and stderr are captured to files with the same name as the script, plus stdout or stderr appended.
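That lifecycle can be sketched in plain shell. This is an illustration of the mechanism, not fastcpu's actual code, and the assumption that the captured output lands in the out directory is mine:

```shell
# Simulate one pass over the queue: pick a script, run it, file the result
mkdir -p demo/to_run demo/running demo/complete demo/fail demo/out
printf '#!/bin/sh\necho hello\n' > demo/to_run/job.sh
chmod +x demo/to_run/job.sh

# Move the script into running, capture stdout/stderr to job.sh.stdout /
# job.sh.stderr, then move it to complete or fail based on its exit status
mv demo/to_run/job.sh demo/running/job.sh
if demo/running/job.sh > demo/out/job.sh.stdout 2> demo/out/job.sh.stderr; then
    mv demo/running/job.sh demo/complete/job.sh
else
    mv demo/running/job.sh demo/fail/job.sh
fi
```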

If exit_when_empty is 1 (which is the default), then once all scripts are run, fastcpu_poll will exit. If it is 0 then fastcpu_poll will continue running until it is killed; it will keep polling for any new scripts that are added to to_run.

Download files

Download the file for your platform.

Source Distribution

fastcpu-1.0.3.tar.gz (5.2 kB)


Built Distribution

fastcpu-1.0.3-py3-none-any.whl (6.4 kB)


