
A utility to transport files and execute programs on remote nodes in parallel

Project Description

This is a framework to execute programs on multiple nodes concurrently.
It consists of two modules: a controller and an executor. The controller drives the whole process: it first dispatches the executor
and the user-defined program files to all the nodes specified by the user. Then it starts the executors remotely and waits for all of them
to finish. Each executor runs the program specified by the user and reports to the controller periodically. When it finishes, it notifies the
controller and then exits. After all the executors have reported their status, or a specified timeout is reached, the controller exits.
The user can then send another request to pull the results.
To monitor each executor's status, the executor reports to the controller periodically. When an executor fails to report within a
predefined timeout, the controller marks it as dead. If the controller receives a report from that node later, it marks the node as alive
again. The controller never waits for a dead node.
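The liveness tracking described above can be sketched as follows (a minimal illustration; the class and method names are hypothetical, not corunner's actual API):

```python
import time

class HeartbeatMonitor:
    """Marks a node dead when no report arrives within the timeout,
    and alive again when a late report comes in (illustrative sketch)."""

    def __init__(self, timeout):
        self.timeout = timeout   # seconds without a report before a node is considered dead
        self.last_report = {}    # node -> timestamp of its last heartbeat

    def report(self, node, now=None):
        # Called whenever an executor's heartbeat arrives; a late report
        # from a node previously marked dead revives it.
        self.last_report[node] = now if now is not None else time.time()

    def is_alive(self, node, now=None):
        now = now if now is not None else time.time()
        last = self.last_report.get(node)
        return last is not None and (now - last) <= self.timeout

monitor = HeartbeatMonitor(timeout=30)
monitor.report("192.168.0.101", now=100)
monitor.is_alive("192.168.0.101", now=120)   # True: last report was 20s ago
monitor.is_alive("192.168.0.101", now=140)   # False: 40s exceeds the 30s timeout
monitor.report("192.168.0.101", now=150)     # a late report makes the node alive again
monitor.is_alive("192.168.0.101", now=160)   # True again
```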

Heartbeat monitoring
Fully parallel execution
Easy tool for copying files
Agile and simple:
unlike tools written in other languages, this is simple and the executor is a single file.
Minimal resource consumption:
the execution is distributed across all nodes, and the controller consumes few resources (mainly for heartbeat monitoring)

This has only been tested with Python 2.6 and 2.7 in a Linux environment.
It is based on ssh and scp for network communication, so before execution you should authenticate your controller with all executor nodes (e.g. via key-based ssh login).
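Since the tool rides on ssh and scp, dispatching a file and launching a program on every node boils down to commands like the following (a simplified sketch using Python's subprocess; the helper names are illustrative, not corunner's internals, and key-based auth is assumed to be in place):

```python
import subprocess

def scp_command(local_path, node, remote_path):
    # Command line that copies a local file to one node.
    return ["scp", local_path, "%s:%s" % (node, remote_path)]

def ssh_command(node, command):
    # Command line that starts `command` on one node.
    return ["ssh", node, command]

def run_on_nodes(nodes, command):
    """Start `command` on every node concurrently and wait for all of them,
    returning one exit status per node -- the essence of what corunner-run
    automates (plus heartbeat monitoring and timeouts)."""
    procs = [subprocess.Popen(ssh_command(node, command)) for node in nodes]
    return [p.wait() for p in procs]
```

`run_on_nodes` launches all ssh sessions before waiting on any of them, which is what makes the execution concurrent rather than one node at a time.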

corunner-run: runs a program on multiple nodes concurrently.
corunner-cp: dispatches files to or from multiple nodes concurrently.

1. Execute myscript on all machines from 192.168.101 to 192.168.200 concurrently with heartbeat switched off:
Run on the controller: corunner-run -n -f myscript -r "/tmp/corunner" -i 1 python /tmp/corunner/myscript
2. Collect all the output files from the above to the controller and put them in separate directories:
Run on the controller: corunner-cp -n -i -s /tmp/corunner/output -d /tmp/corunner/all --divide

If you have any problems or suggestions, you are welcome to contact me; my email is zwsun<>. I would be pleased if this helps
you and improves your efficiency when working with many machines.

Download Files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Filename & Size | File Type | Python Version | Upload Date
(23.2 kB) | Source | None | Nov 5, 2013
