
AI for Science Job Queue - A distributed job queue system for large-scale embarrassingly parallel workloads

Project description

AI for Science JobQ

The main documentation is hosted at microsoft.github.io/ai4s-jobq.

Installation

To install, run

pip install ai4s-jobq

# or, if you log data to app insights and want to use a local dashboard:
pip install ai4s-jobq[track]

The ai4s.jobq package enables multiple users to push work items to an Azure Storage Queue or an Azure Service Bus queue, while one or more workers pull and process tasks asynchronously. This approach is useful in scenarios where:

  • Tasks are too small to justify the overhead of launching an Azure ML job for each one.
  • Workloads need to be distributed across diverse environments (e.g., Azure ML clusters in different regions).
  • Throughput control is desired, scaling workers up or down as needed.

By decoupling job creation from execution, ai4s.jobq allows users to queue up tasks in advance and process them at a controlled rate based on resource availability.
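The push-now/process-later decoupling can be sketched with a local stand-in, using Python's `queue.Queue` in place of an Azure queue. All names below are illustrative, not the ai4s.jobq API: the point is only that enqueueing happens up front, and the worker count chosen later sets the processing rate.

```python
import queue
import threading

def run(num_workers: int, items):
    """Push all work items first, then drain them with a worker pool.

    Local analogue of the ai4s.jobq pattern: producers enqueue work
    long before any worker starts, and throughput is controlled simply
    by how many workers are launched.
    """
    q = queue.Queue()
    results = []
    lock = threading.Lock()

    # "Push" phase: can happen hours before any worker exists.
    for item in items:
        q.put(item)

    def worker():
        while True:
            try:
                item = q.get_nowait()
            except queue.Empty:
                return  # queue drained; worker exits
            with lock:
                results.append(item * 2)  # stand-in for the real task
            q.task_done()

    threads = [threading.Thread(target=worker) for _ in range(num_workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results
```

Scaling throughput up or down is then just a matter of changing `num_workers`, which mirrors adding or removing worker processes against a shared Azure queue.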

Key Features

  • Native Azure Queues: Uses Azure Storage Queues or Service Bus; no additional infrastructure required.
  • Robustness: Jobs automatically reappear in the queue if a worker fails to complete them (for example, after preemptions or crashes).
  • Simple CLI Usage:
    # Azure Storage Queue
    export QUEUE=my_storage_account_name/my_queue_name
    # ...or Azure Service Bus (pick one!)
    export QUEUE=sb://my_service_bus/my_queue_name
    
    ai4s-jobq $QUEUE push -c "echo hello"
    ai4s-jobq $QUEUE worker
    
    (Requires the Storage Queue Data Contributor role on the selected storage account for Azure Storage Queues, or the Azure Service Bus Data Owner role for Service Bus.)
  • Advanced Python API: Efficient handling of I/O-bound tasks, minimizing overhead in blob storage interactions and reducing the need for manual multi-threading/multi-processing.
  • Scalability & Efficiency: Enables large-scale distributed batch processing while relying on cheap, readily available preemptible compute.
  • Observability: Workers can emit telemetry that powers a Grafana or local dashboard for monitoring queue progress.
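The redelivery guarantee behind the Robustness bullet follows the standard visibility-timeout pattern used by Azure Storage Queues: a received message is hidden rather than deleted, and reappears unless the worker explicitly completes it. A toy in-memory sketch of that pattern (the `VisibilityQueue` class and its methods are invented for illustration; this is not the ai4s.jobq or Azure SDK API):

```python
import time

class VisibilityQueue:
    """Toy at-least-once queue: a received message is hidden for
    `visibility_timeout` seconds and reappears unless completed."""

    def __init__(self, visibility_timeout: float):
        self.timeout = visibility_timeout
        self._messages = {}  # message id -> (payload, visible_at)
        self._next_id = 0

    def push(self, payload):
        self._messages[self._next_id] = (payload, 0.0)
        self._next_id += 1

    def receive(self):
        """Return a visible message and hide it, or None if none is visible."""
        now = time.monotonic()
        for mid, (payload, visible_at) in self._messages.items():
            if visible_at <= now:
                # Hide the message instead of deleting it: if the worker
                # crashes, the message becomes visible again after the timeout.
                self._messages[mid] = (payload, now + self.timeout)
                return mid, payload
        return None

    def complete(self, mid):
        """Only an explicit completion removes the message for good."""
        del self._messages[mid]
```

A worker that crashes mid-task simply never calls `complete`, so the job resurfaces for another worker once the timeout elapses, which is how preempted compute can be used safely.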

AI for Science: Powering Large-Scale Research

ai4s.jobq is a critical tool in Microsoft Research -- AI for Science, enabling researchers to handle massive computational workloads with ease. It plays a key role in:

🔹 Generating large-scale synthetic datasets for AI-driven simulations.
🔹 Efficiently pre- and post-processing vast amounts of scientific data.
🔹 Scaling model evaluation by managing high-throughput inference workloads.

Why AI for Science Relies on ai4s.jobq

🚀 Maximizing Compute Efficiency By seamlessly leveraging preemptible compute across diverse environments, ai4s.jobq significantly boosts scalability while reducing costs, accelerating scientific discovery without wasted resources.

🛠 Focusing on Science, Not Infrastructure Researchers can stay focused on their work instead of dealing with unreliable infrastructure. ai4s.jobq abstracts away system failures and optimizes task execution, freeing up valuable time for breakthroughs in AI and science.

Trademarks

This project may contain trademarks or logos for projects, products, or services. Authorized use of Microsoft trademarks or logos is subject to and must follow Microsoft's Trademark & Brand Guidelines. Use of Microsoft trademarks or logos in modified versions of this project must not cause confusion or imply Microsoft sponsorship. Any use of third-party trademarks or logos is subject to those third parties' policies.

Download files

Download the file for your platform.

Source Distribution

ai4s_jobq-3.2.0.tar.gz (1.8 MB)

Uploaded: Source

Built Distribution


ai4s_jobq-3.2.0-py2.py3-none-any.whl (121.6 kB)

Uploaded: Python 2, Python 3

File details

Details for the file ai4s_jobq-3.2.0.tar.gz.

File metadata

  • Download URL: ai4s_jobq-3.2.0.tar.gz
  • Upload date:
  • Size: 1.8 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for ai4s_jobq-3.2.0.tar.gz:

  • SHA256: c0bb7c8bd025e4e1e59a8e0c9689c2226d4ced21579bead6ea3590284d94e7f9
  • MD5: 379ef1c5b9bd9dddb24c5d7c76c82949
  • BLAKE2b-256: 7a8e0d78098fef5d7af047181e9222f531026cd9736349769cfa0056b7be8232


Provenance

The following attestation bundles were made for ai4s_jobq-3.2.0.tar.gz:

Publisher: pypi-deployment.yml on microsoft/ai4s-jobq

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file ai4s_jobq-3.2.0-py2.py3-none-any.whl.

File metadata

  • Download URL: ai4s_jobq-3.2.0-py2.py3-none-any.whl
  • Upload date:
  • Size: 121.6 kB
  • Tags: Python 2, Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for ai4s_jobq-3.2.0-py2.py3-none-any.whl:

  • SHA256: a090560808da99d4f3f77ac674cd86edcccf7ae005f33f2e1bc7ba4e8ba6db00
  • MD5: 22da4a469819f5000874074df5c5278f
  • BLAKE2b-256: f0b56a7eac7d8e8d08f42ac1d0b6c4972eee55a2d999131a21785af390cc0d42


Provenance

The following attestation bundles were made for ai4s_jobq-3.2.0-py2.py3-none-any.whl:

Publisher: pypi-deployment.yml on microsoft/ai4s-jobq

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
