
tasks
===

__tasks.py__ is a simple and fast task queue for executing multiple tasks in parallel. All you need to do is write the task as a simple function that takes a string argument, and you get instant parallelism.

Based on eventlet, multiprocessing and redis.

It is ideal for executing many network-bound tasks in parallel from a single node, without going through the pain of setting up a map-reduce cluster. It uses both processes and green threads to extract the most from a single-node setup.

Installation
-----------

1. Install redis and start the server; **tasks** uses redis for queueing jobs. If you already have a redis server set up, call `tasks.set_redis` and pass a redis connection object that uses a different database/namespace from what your application normally uses (see the sketch after this list).

2. Install the redis-py and eventlet libraries.

`pip install redis eventlet`

3. Install **tasks**, or copy this package into your source tree.

`pip install tasks-py`
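
If you went with the `tasks.set_redis` route from step 1, a minimal sketch looks like this (the host, port, and database number here are assumptions; pick any database your application doesn't already use):

    import redis
    import tasks

    # A dedicated database number keeps tasks' queue keys separate
    # from your application's own data.
    tasks.set_redis(redis.StrictRedis(host='localhost', port=6379, db=5))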

Usage
-----
Import `tasks`, and call eventlet's `monkey_patch` function at the very top of your module, before any other imports. Call `tasks.set_func` to register your function: it will receive a single string argument, and its return value is ignored. To indicate that a task failed, raise an exception inside the function. Finally, call `tasks.main()` to get the interactive command-line options.

    import eventlet
    eventlet.monkey_patch()
    import tasks

    from urllib2 import urlopen

    def fetch(url):
        # Any exception raised here (for example by urlopen) marks the task as failed.
        body = urlopen(url).read()
        # Write to a per-URL file so parallel tasks don't overwrite each other's output.
        with open('/tmp/download-%s' % abs(hash(url)), 'w') as f:
            f.write(body)

    tasks.set_func(fetch)
    tasks.main()

Now, to add jobs, create a file with one argument per line and use this command:

`$ python yourfile.py add <list_of_jobs.txt>`
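
For the `fetch` example above, the jobs file is simply one URL per line; a hypothetical `list_of_jobs.txt` might look like:

    http://example.com/a.html
    http://example.com/b.html
    http://example.org/c.html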

To start (or restart) job processing, run the following (do this in a **screen** session, or close the input stream):

`$ python yourfile.py run`
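
For example, with GNU screen you can start it detached in a named session, or redirect stdin from `/dev/null` to close the input stream:

`$ screen -dmS tasks python yourfile.py run`

`$ python yourfile.py run < /dev/null`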

**tasks** has resume support, so it will pick up where you left off the last time.

To view the current status while it is running:

`$ python yourfile.py status`

Once you are done, you can clear the logs and the completed tasks by calling `reset`:

`$ python yourfile.py reset`

See the code or the `test.py` file for more information. Feel free to fork and modify it.

----

**Author** : Vivek Narayanan <<vivek@vivekn.com>>

**License** : BSD

(C) Vivek Narayanan (2014)
