
LbAPCommon


Common utilities for LHCb DPA WP2 related software, including Analysis Productions workflow parsing, validation, and conversion tools.

Features

  • Workflow Parsing: Parse and render Analysis Productions YAML configuration files
  • DIRAC Conversion: Convert workflow definitions to DIRAC production requests
  • CWL Conversion: Convert production requests to Common Workflow Language (CWL) format
  • Validation & Linting: Validate workflow configurations and check for common issues
  • Data Models: Pydantic v2 models for workflow and production data structures
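As an illustration of what a Pydantic v2 workflow model looks like, here is a minimal sketch. The class and field names below are hypothetical and chosen for this example only; the actual models shipped in LbAPCommon differ.

```python
from pydantic import BaseModel, Field

# Hypothetical sketch of a Pydantic v2 job model. Field names here are
# illustrative only; the real data models in LbAPCommon are different.
class JobSpec(BaseModel):
    name: str = Field(pattern=r"^\w{2,200}$")
    application: str           # e.g. "DaVinci/v64r5"
    options: list[str] = []    # option files passed to the application
    output: list[str] = []     # output file patterns

job = JobSpec(name="my_job", application="DaVinci/v64r5", output=["DATA.ROOT"])
print(job.model_dump()["application"])
```

Pydantic v2 validates fields at construction time, so an invalid `name` raises a `ValidationError` immediately rather than surfacing later in the workflow.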

Installation

pip install LbAPCommon

For development:

# Using pixi (recommended)
pixi install
pixi run test

# Using conda/mamba
mamba env create --file environment.yaml
conda activate lbaplocal-dev
pip install -e '.[testing]'

Usage

DIRAC Production Request Generation

Convert Analysis Productions info.yaml files to DIRAC production requests. This is the primary CLI used by the Analysis Productions system:

python -m LbAPCommon <production_name> --input info.yaml --output output.json

# With specific AP package version
python -m LbAPCommon my_production --input info.yaml --output result.json --ap-pkg-version v1r0

# Process only a specific job from the workflow
python -m LbAPCommon my_production --input info.yaml --output result.json --only-include job_name

# Dump individual YAML request files for each production
python -m LbAPCommon my_production --input info.yaml --output result.json --dump-requests

# With DIRAC server credentials
python -m LbAPCommon my_production --input info.yaml --output result.json --server-credentials user password

Arguments:

  • production_name - Name of the production (alphanumeric + underscore, 2-200 chars)
  • --input - Path to the info.yaml workflow definition file
  • --output - Path for the output JSON file containing production requests
  • --ap-pkg-version - AnalysisProductions package version (default: v999999999999)
  • --only-include - Only process the workflow chain containing this job name
  • --dump-requests - Also write individual .yaml files for each production request
  • --server-credentials - DIRAC server credentials (username and password)
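The `production_name` constraint above (alphanumeric plus underscore, 2-200 characters) can be checked with a simple regular expression. This is an illustrative sketch; the exact validation logic inside LbAPCommon may differ.

```python
import re

# Illustrative check mirroring the documented constraint on
# production_name: alphanumeric + underscore, 2-200 characters.
# (The actual pattern used inside LbAPCommon may differ.)
PRODUCTION_NAME_RE = re.compile(r"^\w{2,200}$", re.ASCII)

def is_valid_production_name(name: str) -> bool:
    return bool(PRODUCTION_NAME_RE.fullmatch(name))

print(is_valid_production_name("my_production"))  # True
print(is_valid_production_name("x"))              # False: too short
```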

The output JSON contains:

  • rendered_yaml - The rendered workflow YAML after Jinja2 processing
  • productions - Dictionary of production requests, each containing:
    • request - The DIRAC production request in LbAPI format
    • input-dataset - Input dataset specification from Bookkeeping query
    • dynamic_files - Auto-generated configuration files
    • raw-yaml - Original YAML for the jobs in this production
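The output layout above can be traversed like any other JSON document. The stub below mirrors the documented keys; the placeholder values are invented for this sketch, and real output files contain full DIRAC request dictionaries.

```python
import json

# Stub mirroring the documented output layout. The values are
# placeholders; real files contain full DIRAC request dictionaries.
output = {
    "rendered_yaml": "checks: {}\n",
    "productions": {
        "my_production": {
            "request": {},
            "input-dataset": {},
            "dynamic_files": {},
            "raw-yaml": "my_job: {}\n",
        }
    },
}

# Round-trip through JSON and list the per-production keys.
for name, prod in json.loads(json.dumps(output))["productions"].items():
    print(name, sorted(prod))
```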

CWL Conversion CLI

Convert DIRAC production request YAML files to CWL workflows:

# List productions in a file
lhcb-production-yaml-to-cwl list_productions production.yaml

# Generate CWL workflow files
lhcb-production-yaml-to-cwl generate production.yaml --output-dir ./cwl-output

# Convert a specific production
lhcb-production-yaml-to-cwl generate production.yaml --production "MyProduction"

Python API

from LbAPCommon import parse_yaml, render_yaml, validate_yaml
from LbAPCommon.dirac_conversion import group_in_to_requests, step_to_production_request
from LbAPCommon.prod_request_to_cwl import fromProductionRequestYAMLToCWL

# Parse and validate workflow YAML
rendered = render_yaml(yaml_content)
jobs_data = parse_yaml(rendered, production_name, wg)
validate_yaml(jobs_data, wg, production_name)

# Group jobs into production requests
for job_names in group_in_to_requests(jobs_data):
    request = step_to_production_request(
        production_name, jobs_data, job_names, input_spec, ap_pkg_version
    )

# Convert DIRAC output to CWL
workflow, inputs, metadata = fromProductionRequestYAMLToCWL(yaml_path)

Development

Running Tests

pixi run test              # Run all tests
pixi run test-dirac        # Run DIRAC conversion tests
pixi run test-cov          # Run with coverage report

Updating Test Fixtures

When CWL output changes, update reference fixtures:

pixi run update-fixtures   # Regenerate reference .cwl files

Code Quality

pixi run pre-commit        # Run all pre-commit hooks (linting, formatting, etc.)

Documentation

Documentation for this project is maintained in the LbAPDoc project.

Related Projects

  • LbAPI - REST API server for Analysis Productions
  • LbAPLocal - Local testing tools
  • LbAPWeb - Web interface for Analysis Productions

License

GPL-3.0 - See COPYING for details.
