
A pure Python remote job scheduler system

Project description

Description: pscheduler



* The pscheduler service depends critically on a shared filesystem across the remote machines. Set up NFS on all the remote machines.
* Use the same login credentials on all the remote machines.
* Basic Linux utilities such as *ssh* and *nohup* must be installed on all the remote machines.


**pscheduler** is a Python 3 only module and can be easily installed using pip:

``pip install pscheduler``


* Start the service using ``pscheduler service start``. This will create a directory **.pscheduler** under your home directory.
* Edit the file **~/.pscheduler/hosts.cfg** and add the IP address or domain alias of each remote machine, one per line. Save and exit. Alternatively, if you are in an HPC environment where LSF is already installed, you can use the provided script ```` to populate the host list directly. You should then edit the list to remove the head node and any other hosts you do not want used.
* If you have not already set up password-less login to the remote machines, use the provided script ```` to do so.
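For illustration, a **hosts.cfg** populated with three remote machines could look like the following (the addresses below are placeholders, one host per line):

```
192.168.1.11
192.168.1.12
node03.cluster.local
```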


* There are four basic subcommands:

| Subcommand | Purpose |
| --- | --- |
| **service** | start/stop the background service |
| **hosts** | get information on remote machines |
| **sub** | submit jobs |
| **jobs** | monitor running/pending jobs |

*See the help text of the individual subcommands for further details.*

* It is critical that the background service is started before submitting any job

* Examples:

* Start service: ``pscheduler service start``
* Submit job: ``pscheduler sub "sleep 10"``

* Pending and running job configuration files are stored under **~/.pscheduler/jobs** in the PEND and RUN directories, respectively. By default, finished job configuration files, which contain the job's standard output in JSON format, are stored under **~/.pscheduler/jobs/FINISH**; users can choose a custom location for this file by passing the **-o** flag with an output path when submitting the job.
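As a sketch of how finished-job output could be consumed, the snippet below scans a FINISH-style directory and summarizes each JSON file. The field names (`command`, `host`, `status`) are assumptions for illustration only, since the exact schema of pscheduler's finished-job files is not documented here; the sample file is fabricated so the example is self-contained.

```python
import json
import tempfile
from pathlib import Path

# Fabricated FINISH directory with one sample job file; real files
# live under ~/.pscheduler/jobs/FINISH and may use a different schema.
finish_dir = Path(tempfile.mkdtemp())
sample = {"command": "sleep 10", "host": "node01", "status": "DONE"}
job_file = finish_dir / "job_0001.json"
job_file.write_text(json.dumps(sample))

# Scan the directory and print a one-line summary per finished job
for path in sorted(finish_dir.glob("*.json")):
    job = json.loads(path.read_text())
    print(f"{path.name}: {job['command']!r} on {job['host']} -> {job['status']}")
```

To inspect real output, point `finish_dir` at `~/.pscheduler/jobs/FINISH` (or the custom `-o` location) instead of the temporary directory.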

* **~/.pscheduler/DAEMON.log** contains the log of the background service. Check this file to see whether any error messages have been reported.
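A quick way to surface problems is to filter the log for error lines. The sketch below uses a fabricated log file so it runs standalone; the actual format of DAEMON.log is not specified here, so the `"ERROR"` marker is an assumption.

```python
import tempfile
from pathlib import Path

# Fabricated sample log; the real DAEMON.log format may differ.
log = Path(tempfile.mkdtemp()) / "DAEMON.log"
log.write_text(
    "2024-01-01 10:00:00 INFO service started\n"
    "2024-01-01 10:00:05 ERROR ssh to node02 failed\n"
    "2024-01-01 10:00:06 INFO job dispatched to node01\n"
)

# Collect and print any lines that look like errors
errors = [line for line in log.read_text().splitlines() if "ERROR" in line]
for line in errors:
    print(line)
```

Replace the temporary path with `Path.home() / ".pscheduler" / "DAEMON.log"` to scan the real service log.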

* Note that pscheduler *will ignore* scheduling by other schedulers and simply launch jobs based on the availability of resources (currently only the number of CPU cores).
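To make the "launch on core availability" idea concrete, here is a simplified, self-contained sketch. This is not pscheduler's actual code: the load-average heuristic and threshold logic are assumptions, and the real daemon tracks cores per remote host internally.

```python
import os

def free_cores(reserved: int = 0) -> int:
    """Estimate free CPU cores on this machine from the 1-minute load average.

    Illustration only; a scheduler would track per-host core usage itself.
    """
    total = os.cpu_count() or 1
    load1, _, _ = os.getloadavg()  # POSIX-only
    return max(0, total - int(load1) - reserved)

def can_launch(job_cores: int) -> bool:
    # Dispatch only if enough cores appear to be free right now
    return free_cores() >= job_cores

print(f"free cores: {free_cores()}")
print(f"can launch a 1-core job: {can_launch(1)}")
```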

Roadmap for future versions

* Implement ``jobs`` subcommand
* Job restart in case of resource spike
* Improve code for catching fatal exceptions


* 0.0.1:
* 0.0.2:
* BHOST wrapper script separated from phosts module
* Introduced hosts.cfg: A user editable list of hosts
* Batch script made for creating login keys
* PSUB now saves in JSON format
* DEV: Submit to PyPi using python script
* DEV: Automatic update of version in
* 0.0.3:
* Added ``pscheduler`` in scripts for command line invocation
* 0.0.4:
* Deployment fix
* 0.0.5:
* Import fix
* 0.0.6:
* Json fix
* 0.0.7:
* jobs subcommand implemented
* phosts doublehost check issue rectified
* submission process improved in daemon
* class naming convention changed
* Default host fixed to localhost; password-less login to localhost set up
* DEV: Travis CI now being used for testing
* 0.1.0:
* pdaemon fix
* Rolledback SSH keygen

Parashar Dhapola

Keywords: scheduler
Platform: UNKNOWN
Classifier: Development Status :: 3 - Alpha
Classifier: License :: OSI Approved :: MIT License
Classifier: Natural Language :: English
Classifier: Operating System :: POSIX :: Linux
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.5
Classifier: Topic :: System :: Distributed Computing


Download files


Source Distribution

pscheduler-0.1.0rc3.tar.gz (11.3 kB)

Built Distribution

pscheduler-0.1.0rc3-py3-none-any.whl (14.4 kB)
