
Queue service built on top of mongo.

Project description

mongo_queue

Task queue built on mongo with channels and unique job ids.

Website: autobotAI Automation Platform

Inspired by kapilt/mongoqueue

Change Log:

v0.0.5

  • Added the depends_on feature. You can create dependencies between jobs by supplying depends_on with a list of previously created job ids.

v0.0.3

  • Added a unique index on job_id and channel to ensure that the same job is not added multiple times. If no job_id is provided, a unique id is generated by default (see the sketch below).
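
A minimal sketch of that de-duplication guarantee, reusing the Queue setup from the Usage section below. Exactly how a duplicate put is surfaced (silently ignored or rejected) is not specified in this description, so the comments only state what the unique index guarantees.

from mongo_queue.queue import Queue
from pymongo import MongoClient

queue = Queue(MongoClient('localhost', 27017).task_queue, consumer_id="consumer-1", timeout=300, max_attempts=3)

# Two puts with the same job_id and channel: the unique (job_id, channel)
# index ensures only one queued copy of this job exists.
queue.put({"task_id": 1}, priority=1, channel="channel_1", job_id="nightly_sync")
queue.put({"task_id": 1}, priority=1, channel="channel_1", job_id="nightly_sync")  # de-duplicated by the index

# Without a job_id, a unique id is generated by default, so each put becomes a separate job.
queue.put({"task_id": 2}, priority=1, channel="channel_1")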

Usage

Install the package.

pip install mongo_queue_service

Usage Examples (an end-to-end worker sketch follows this list):

  • Create a Queue object
from mongo_queue.queue import Queue
from pymongo import MongoClient

queue = Queue(MongoClient('localhost', 27017).task_queue, consumer_id="consumer-1", timeout=300, max_attempts=3)
  • Add a task to the default channel
queue.put({"task_id": 1})
  • Add a task with a priority to the default channel
queue.put({"task_id": 1}, priority=1)
  • Add a task to a specific channel
queue.put({"task_id": 1}, priority=1, channel="channel_1")
  • Add a task with a unique job_id
queue.put({"task_id": 1}, priority=1, channel="channel_1", job_id="x_job")
  • Add a task with a dependency on another job
job1 = queue.put({"task_id": 1}, priority=1, channel="channel_1", job_id="x_job")
job2 = queue.put({"task_id": 2}, priority=1, channel="channel_1", job_id="y_job", depends_on=[job1])
  • Get the next job to be executed from the default channel
job = queue.next()
  • Get the next job to be executed from a specific channel
job = queue.next(channel="channel_1")
  • Update job progress for long-running jobs
job.progress(count=10)
  • Put the job back in the queue so it is picked up again later. This increments the job's attempts; after max_attempts the job will not be picked up again.
job.release()
  • Put the job back in the queue with an error message. It will be picked up again later; this increments attempts, and after max_attempts the job will not be picked up again.
job.error("Some error occured")
  • Complete the job. This deletes the job from the database.
job.complete()
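
Putting these calls together, a minimal worker loop might look like the sketch below. It uses only the operations shown above; the job.payload attribute, the assumption that next() returns None when no job is available, and the do_work helper are illustrative assumptions rather than documented behaviour.

import time

from mongo_queue.queue import Queue
from pymongo import MongoClient


def do_work(payload, step):
    # Hypothetical task body; replace with real work (not part of mongo_queue).
    time.sleep(0.1)


queue = Queue(MongoClient('localhost', 27017).task_queue, consumer_id="worker-1", timeout=300, max_attempts=3)

while True:
    job = queue.next(channel="channel_1")
    if job is None:  # assumption: next() returns None when the channel has no runnable job
        time.sleep(5)
        continue
    try:
        payload = job.payload  # assumption: the queued document is exposed as job.payload
        for step in range(10):
            do_work(payload, step)
            job.progress(count=step + 1)  # report progress on the long-running job
        job.complete()  # success: delete the job from the database
    except Exception as exc:
        job.error(str(exc))  # failure: requeue with an error until max_attempts is reached

Because release() and error() both count toward max_attempts, a job that keeps failing eventually stops being picked up, as described above.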

Build Steps

# Set up and activate a virtual environment using Python 3.6 or above
python3 -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
python -m pip install --upgrade twine
python setup.py sdist bdist_wheel
python -m twine upload --repository-url https://upload.pypi.org/legacy/ dist/*



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

mongo_queue_service-0.0.5.tar.gz (8.3 kB)

Uploaded Source

Built Distribution

mongo_queue_service-0.0.5-py3-none-any.whl (8.6 kB)

Uploaded Python 3

File details

Details for the file mongo_queue_service-0.0.5.tar.gz.

File metadata

  • Download URL: mongo_queue_service-0.0.5.tar.gz
  • Upload date:
  • Size: 8.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.8.0 pkginfo/1.8.2 readme-renderer/32.0 requests/2.27.1 requests-toolbelt/0.9.1 urllib3/1.26.8 tqdm/4.62.3 importlib-metadata/4.10.1 keyring/23.5.0 rfc3986/2.0.0 colorama/0.4.4 CPython/3.9.9

File hashes

Hashes for mongo_queue_service-0.0.5.tar.gz
  • SHA256: 8f153b30ec38babc5898f351783dbd4292be46922ec4722e92655998793203b5
  • MD5: f390efd2db46f5c347ec145a1c73f50f
  • BLAKE2b-256: 427ad87e8028d9f72caf8966faa1b2ef901918262513e77419b873164fc15747


File details

Details for the file mongo_queue_service-0.0.5-py3-none-any.whl.

File metadata

  • Download URL: mongo_queue_service-0.0.5-py3-none-any.whl
  • Upload date:
  • Size: 8.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.8.0 pkginfo/1.8.2 readme-renderer/32.0 requests/2.27.1 requests-toolbelt/0.9.1 urllib3/1.26.8 tqdm/4.62.3 importlib-metadata/4.10.1 keyring/23.5.0 rfc3986/2.0.0 colorama/0.4.4 CPython/3.9.9

File hashes

Hashes for mongo_queue_service-0.0.5-py3-none-any.whl
  • SHA256: 2b299d91f7c144417a38dc7e46aff97f0b22b06e891c3c151de4df1dcd9adea5
  • MD5: ed9e80df9fb1bdf3e8d1d7b32e4aff8a
  • BLAKE2b-256: 98de0206a9685307072e395cc211ba4c34fe53e542ffa51178a7b7103d8369ac

