Background Workflows for task processing

Reason this release was yanked: old

Project description

Background Workflows

A pluggable framework for managing background tasks in Python. Supports both local task execution with threading and distributed task execution using Celery.

Features

  • Task orchestration with support for Azure Table Storage or SQLite for task persistence.
  • Queue management for local in-memory queues or Celery for distributed task processing.
  • Dynamic task creation and registration for extensible workflow management.
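The dynamic registration feature can be pictured as a name-to-class registry. The following is a minimal concept sketch only, under the assumption that activity types are plain strings; `TASK_REGISTRY` and `register_task` are hypothetical names, not the library's actual API:

```python
# Concept sketch of dynamic task registration (hypothetical names,
# not background_workflows' actual API).
TASK_REGISTRY = {}

def register_task(activity_type):
    """Decorator that maps an activity-type string to a task class."""
    def decorator(cls):
        TASK_REGISTRY[activity_type] = cls
        return cls
    return decorator

@register_task("MY_CUSTOM_TASK")
class MyCustomTask:
    def run(self, payload):
        # A real task would do meaningful work here.
        return {"echo": payload}

# Look up and run a task by its registered name.
task = TASK_REGISTRY["MY_CUSTOM_TASK"]()
result = task.run({"data": "value"})
```

A registry like this is what lets new task types be added without changing the dispatch code.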

Additional Documentation

For more detailed documentation, please refer to the following sections:

  • Summary: summary.md – A brief summary of the framework and its key components.
  • Controller Documentation: controller.md – Detailed information about the controller components and how they work.
  • Storage Documentation: storage.md – Information on task storage options, including Azure and SQLite.
  • Task Documentation: tasks.md – Overview of task management, including how to define and run tasks.
  • Utils Documentation: utils.md – A guide to the utility functions and classes used in the framework.
  • Overall Framework Diagram: overal_diagram.md – A high-level diagram illustrating the components and their relationships.

Installation

You can install the background_workflows library via pip:

pip install background_workflows

Usage

1. Initialize Task Store and Queue Backend

from background_workflows.storage.tables.task_store_factory import TaskStoreFactory
from background_workflows.storage.queue.local_queue_backend import LocalQueueBackend

# Initialize task store and queue backend
factory = TaskStoreFactory(store_mode="sqlite")
task_store = factory.get_task_store()
queue_backend = LocalQueueBackend()
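To make the role of the task store concrete, here is a sketch of the kind of record a SQLite-backed store might persist. The schema below is an illustrative assumption, not the library's actual tables:

```python
import sqlite3

# Concept sketch of what a SQLite-backed task store persists
# (hypothetical schema, not the library's actual tables).
conn = sqlite3.connect(":memory:")
conn.execute(
    """CREATE TABLE tasks (
           row_key TEXT PRIMARY KEY,
           resource_id TEXT,
           activity_type TEXT,
           status TEXT,
           payload TEXT
       )"""
)
conn.execute(
    "INSERT INTO tasks VALUES (?, ?, ?, ?, ?)",
    ("unique-row-id", "1234", "MY_CUSTOM_TASK", "PENDING", '{"data": "value"}'),
)
status = conn.execute(
    "SELECT status FROM tasks WHERE row_key = ?", ("unique-row-id",)
).fetchone()[0]
```

Whatever the real schema looks like, the store's job is the same: keep each task's identity, payload, and current status durable across process restarts.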

2. Start a Background Activity

from background_workflows.utils.workflow_client import WorkflowClient

workflow_client = WorkflowClient(task_store, queue_backend)
workflow_client.start_activity("MY_CUSTOM_TASK", resource_id="1234", payload={"data": "value"})

3. Monitor Task Status

# Look up the task record using its row key and resource id.
status = workflow_client.get_status(row_key="unique-row-id", resource_id="1234")
print(f"Task Status: {status}")
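Callers typically poll `get_status` until the task reaches a terminal state. The helper below is a sketch of that pattern against a stub client; `wait_for_completion` and the `"DONE"`/`"FAILED"`/`"RUNNING"` status strings are assumptions for illustration, not the library's documented values:

```python
import time

def wait_for_completion(client, row_key, resource_id,
                        poll_interval=0.01, timeout=1.0):
    """Poll get_status until a terminal state or until timeout expires."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status = client.get_status(row_key=row_key, resource_id=resource_id)
        if status in ("DONE", "FAILED"):  # assumed terminal states
            return status
        time.sleep(poll_interval)
    raise TimeoutError(f"task {row_key} did not finish within {timeout}s")

# Stub standing in for WorkflowClient: reports DONE on the third poll.
class StubClient:
    def __init__(self):
        self.calls = 0
    def get_status(self, row_key, resource_id):
        self.calls += 1
        return "DONE" if self.calls >= 3 else "RUNNING"

final = wait_for_completion(StubClient(), "unique-row-id", "1234")
```

In production code you would also cap the polling rate or use exponential backoff to avoid hammering the store.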

Configuration

Environment Variables

To configure the library, set the following environment variables:

  • STORE_MODE: Choose between "azure" or "sqlite" for task storage.
  • AZURE_STORAGE_CONNECTION_STRING: Required for Azure storage mode.
  • SQLITE_DB_PATH: Path to the SQLite database (default is local_tasks.db).
  • CELERY_BROKER_URL: The URL for the Celery message broker (e.g., Redis).
  • CELERY_BACKEND_URL: The URL for the Celery result backend (e.g., Redis).

Example:

export STORE_MODE=azure
export AZURE_STORAGE_CONNECTION_STRING="your_connection_string_here"
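An application might gather these settings as follows. The `load_config` helper is a hypothetical sketch built from the variables and defaults listed above, not part of the library:

```python
import os

def load_config(env=os.environ):
    """Collect the documented settings from environment variables,
    falling back to the documented defaults."""
    store_mode = env.get("STORE_MODE", "sqlite")
    if store_mode not in ("azure", "sqlite"):
        raise ValueError(f"unsupported STORE_MODE: {store_mode!r}")
    config = {
        "store_mode": store_mode,
        "sqlite_db_path": env.get("SQLITE_DB_PATH", "local_tasks.db"),
        "celery_broker_url": env.get("CELERY_BROKER_URL"),
        "celery_backend_url": env.get("CELERY_BACKEND_URL"),
    }
    if store_mode == "azure":
        # Azure mode requires a connection string; fail fast if missing.
        config["azure_connection_string"] = env["AZURE_STORAGE_CONNECTION_STRING"]
    return config

cfg = load_config({"STORE_MODE": "sqlite"})
```

Failing fast on a missing `AZURE_STORAGE_CONNECTION_STRING` at startup is usually preferable to a connection error deep inside a worker.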

Controllers

The framework provides two key controllers for managing task execution:

1. MainController

  • Purpose: Polls queues and executes tasks in a local thread pool.
  • Configuration: Can be configured with a specific thread count and a CPU usage threshold.
  • Usage:
from background_workflows.controller.main.main_controller import MainController

controller = MainController(task_store, queue_backend)
controller.run()  # Starts the continuous polling and task execution loop
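Conceptually, each iteration of that loop drains the queue and hands tasks to a thread pool. The sketch below models one such pass with an in-memory queue and a hypothetical handler map; it illustrates the idea, not the controller's internals:

```python
import queue
from concurrent.futures import ThreadPoolExecutor

def poll_and_execute(task_queue, handlers, max_workers=4):
    """One polling pass: drain the queue and run each task in a thread pool.
    Conceptual stand-in for what a continuous run() loop repeats forever."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        futures = []
        while True:
            try:
                message = task_queue.get_nowait()
            except queue.Empty:
                break  # queue drained; wait for the next polling pass
            handler = handlers[message["activity_type"]]
            futures.append(pool.submit(handler, message["payload"]))
        # Collect results in submission order.
        return [f.result() for f in futures]

q = queue.Queue()
q.put({"activity_type": "MY_CUSTOM_TASK", "payload": 2})
q.put({"activity_type": "MY_CUSTOM_TASK", "payload": 3})
out = poll_and_execute(q, {"MY_CUSTom_TASK".upper(): lambda p: p * 10})
```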

2. CeleryController

  • Purpose: Delegates task processing to Celery workers, removing the need for local polling.
  • Configuration: Requires a Celery setup and message broker.
  • Usage:
from background_workflows.controller.celery.celery_controller import CeleryController

controller = CeleryController(task_store, celery_queue_backend)
controller.run_once()  # Executes a single pass of task handling
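The key difference from MainController is that a pass here forwards work to remote workers rather than executing it locally. A minimal sketch of that delegation idea, with the broker modeled as a plain list (hypothetical `run_once` helper, not the controller's implementation):

```python
# Sketch of the delegation idea behind CeleryController: a single pass
# forwards each pending message to a broker for remote workers to pick up,
# instead of running it in a local thread pool.
def run_once(local_messages, broker):
    """Forward every pending message to the broker; return the count."""
    forwarded = 0
    while local_messages:
        broker.append(local_messages.pop(0))
        forwarded += 1
    return forwarded

pending = [{"activity_type": "MY_CUSTOM_TASK", "payload": {"data": "value"}}]
broker = []
count = run_once(pending, broker)
```

In a real deployment the "broker" is the message transport named by CELERY_BROKER_URL, and Celery workers consume from it independently of this process.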

Contributing

We welcome contributions to the background_workflows library! To contribute:

  1. Fork the repository.
  2. Create a new branch for your feature or bug fix.
  3. Make your changes.
  4. Run tests to ensure everything works.
  5. Submit a pull request with a description of the changes you’ve made.

Running Tests

To run the tests, you can use pytest:

pytest tests/

License

This project is licensed under the MIT License - see the LICENSE file for details.


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

background_workflows-0.1.0.tar.gz (40.8 kB view details)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

background_workflows-0.1.0-py3-none-any.whl (66.7 kB view details)

Uploaded Python 3

File details

Details for the file background_workflows-0.1.0.tar.gz.

File metadata

  • Download URL: background_workflows-0.1.0.tar.gz
  • Upload date:
  • Size: 40.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.10.4

File hashes

Hashes for background_workflows-0.1.0.tar.gz
Algorithm Hash digest
SHA256 c86d2a9b21f5bdb51be077d2bd3c658b10ddcfb8026423d9cf0b5ac2be0c18f2
MD5 00620f56e4abc2d08a4f9c9c05793ad5
BLAKE2b-256 5b071fb8402b40c54d54253adb9a31fde9b556bb025a21402bdea3f3b58377a4

See more details on using hashes here.

File details

Details for the file background_workflows-0.1.0-py3-none-any.whl.

File metadata

File hashes

Hashes for background_workflows-0.1.0-py3-none-any.whl
Algorithm Hash digest
SHA256 a34710158defff7a2cacadaf7da5feb05080d17102bd49e14d8cb6c499e0bdab
MD5 71282861e3a8b768b0e7cb21a8a4d08d
BLAKE2b-256 fbb7604b0dcb8d86a18753cb383d89b5d6baaf9b02d054f3e09eecd037a4853d