
A simple queue for Python tasks

Project description

pytask

A simple sqlite3-based job queue with a worker. Its main purpose is to run jobs from a queue. Jobs are not popped from the queue after processing, so the queue can also act as a history.
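The queue-as-history idea can be illustrated with a small, self-contained sqlite3 sketch (this is a conceptual illustration, not pytask's actual table schema): finished jobs are marked done rather than deleted, so every processed job remains queryable.

```python
# Conceptual sketch (not pytask's actual schema): a sqlite3 table where
# finished jobs are marked 'done' instead of being deleted, so the table
# doubles as a processing history.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE jobs (id INTEGER PRIMARY KEY, payload TEXT, "
    "status TEXT DEFAULT 'pending')"
)
conn.execute("INSERT INTO jobs (payload) VALUES ('job-1'), ('job-2')")

# "Process" the next pending job: update its status instead of popping it.
(job_id,) = conn.execute(
    "SELECT id FROM jobs WHERE status = 'pending' ORDER BY id LIMIT 1"
).fetchone()
conn.execute("UPDATE jobs SET status = 'done' WHERE id = ?", (job_id,))

# Both rows are still present: the queue is also a history.
history = conn.execute("SELECT payload, status FROM jobs ORDER BY id").fetchall()
print(history)  # [('job-1', 'done'), ('job-2', 'pending')]
```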

Installation

pip install pytask-queue

Usage

The worker runs the function func for each job, passing it a Job object. This means you can modify the job object inside the function, and the updated job will be saved back to the queue.

# python process 1
from pytask import Queue, Job, SQLDataType, SQLColumnConditions

queue = Queue(schema=[
    ("foo", SQLDataType.INTEGER, [SQLColumnConditions.NOT_NULL]), 
    ("bar", SQLDataType.TEXT, [SQLColumnConditions.NOT_NULL]), 
    ("baz", SQLDataType.JSON, [SQLColumnConditions.NOT_NULL])
])
queue.insert(Job(data={"foo": 1, "bar": "test", "baz": {"foo": "bar"}}))
# python process 2
from <relative_file> import queue
from pytask import Job, Worker

def func(job: Job):
    # Do something with job
    job.data["foo"] += 1

worker = Worker(queue, func)
worker.run()

Creating multiple queues or multiple workers is possible. Creating a new queue object won't actually create a new queue; it just creates a new connection to it. This means you can have multiple queue objects pointing to the same queue, or use the same queue object for multiple workers.

Be careful to avoid race conditions when using the same queue object for multiple workers.
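One common way to avoid such races in a sqlite-backed queue is to claim a job with a single atomic UPDATE before running it, so only one worker can win the claim. The sketch below uses only the standard library and is a conceptual illustration, not pytask's actual implementation:

```python
# Conceptual sketch (stdlib only, not pytask's implementation): two workers
# sharing one sqlite database avoid double-processing by atomically
# claiming a job before running it.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE jobs (id INTEGER PRIMARY KEY, status TEXT DEFAULT 'pending')"
)
conn.execute("INSERT INTO jobs (status) VALUES ('pending')")

def try_claim(conn, job_id):
    # The WHERE clause makes the claim atomic: only one worker's UPDATE
    # can flip the status from 'pending' to 'claimed'.
    cur = conn.execute(
        "UPDATE jobs SET status = 'claimed' WHERE id = ? AND status = 'pending'",
        (job_id,),
    )
    return cur.rowcount == 1

first = try_claim(conn, 1)   # first worker wins the claim
second = try_claim(conn, 1)  # second worker finds no pending row
print(first, second)  # True False
```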

Flags

Flags are used to configure the behavior of the queue and worker.

Current flags:

  • auto_convert_json_keys: If True, the queue will automatically convert JSON keys to strings. Useful for retrieving and manipulating JSON data.
  • pop_after_processing: If True, the job will be popped from the queue after processing.

from pytask import Queue, Worker, Job, SQLDataType, SQLColumnConditions, Flags
import logging

logger = logging.getLogger(__name__)

def func(job: Job):
    # Do something with job
    job.data["foo"] += 1

flags = Flags(auto_convert_json_keys=True, pop_after_processing=True)
queue = Queue(schema=[("foo", SQLDataType.INTEGER, [SQLColumnConditions.NOT_NULL])], flags=flags)

worker = Worker(queue, func, logger=logger)
worker.run()

Concurrent Worker

The concurrent worker runs jobs in parallel, using a thread pool to execute them.

from pytask import Queue, ConcurrentWorker, Job, SQLDataType, SQLColumnConditions

# Reuses the queue, func, and logger defined above.
worker = ConcurrentWorker(queue, func, logger=logger, interval=1, max_workers=16)
worker.run()



Download files

Download the file for your platform.

Source Distribution

pytask_queue-1.0.2.tar.gz (8.9 kB)

Uploaded Source

Built Distribution


pytask_queue-1.0.2-py3-none-any.whl (11.2 kB)

Uploaded Python 3

File details

Details for the file pytask_queue-1.0.2.tar.gz.

File metadata

  • Download URL: pytask_queue-1.0.2.tar.gz
  • Upload date:
  • Size: 8.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: pdm/2.22.1 CPython/3.10.5 Linux/6.5.0-1025-azure

File hashes

Hashes for pytask_queue-1.0.2.tar.gz:

  • SHA256: 3795235f982063040bc14c897944d043e37e4d3513c8cc8d60abe87c7eefca6e
  • MD5: c6e36491ec5ebba0648afd39c8b33a9d
  • BLAKE2b-256: 7f24942a3e504ba9710958d06178ad1e39859f84d93c9085734333f431dc73c7


File details

Details for the file pytask_queue-1.0.2-py3-none-any.whl.

File metadata

  • Download URL: pytask_queue-1.0.2-py3-none-any.whl
  • Upload date:
  • Size: 11.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: pdm/2.22.1 CPython/3.10.5 Linux/6.5.0-1025-azure

File hashes

Hashes for pytask_queue-1.0.2-py3-none-any.whl:

  • SHA256: 6e7074237d0bd1962a6267bd60989b4299b95004dd852a661180f871bae77892
  • MD5: b5bdfa80b5c3a373611fa7de0de7e059
  • BLAKE2b-256: 306f4a7c41a3a81e2d8a6ee068711d167bda676fda8f5e4872d602c5e2a7ca2d

