
LbAPCommon


Common utilities for LHCb DPA WP2-related software, including Analysis Productions workflow parsing, validation, and conversion tools.

Features

  • Workflow Parsing: Parse and render Analysis Productions YAML configuration files
  • DIRAC Conversion: Convert workflow definitions to DIRAC production requests
  • CWL Conversion: Convert production requests to Common Workflow Language (CWL) format
  • Validation & Linting: Validate workflow configurations and check for common issues
  • Data Models: Pydantic v2 models for workflow and production data structures

Installation

pip install LbAPCommon

For development:

# Using pixi (recommended)
pixi install
pixi run test

# Using conda/mamba
mamba env create --file environment.yaml
conda activate lbaplocal-dev
pip install -e '.[testing]'

Usage

DIRAC Production Request Generation

Convert Analysis Productions info.yaml files to DIRAC production requests. This is the primary CLI used by the Analysis Productions system:

python -m LbAPCommon <production_name> --input info.yaml --output output.json

# With specific AP package version
python -m LbAPCommon my_production --input info.yaml --output result.json --ap-pkg-version v1r0

# Process only a specific job from the workflow
python -m LbAPCommon my_production --input info.yaml --output result.json --only-include job_name

# Dump individual YAML request files for each production
python -m LbAPCommon my_production --input info.yaml --output result.json --dump-requests

# With DIRAC server credentials
python -m LbAPCommon my_production --input info.yaml --output result.json --server-credentials user password
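The `--input` file is an Analysis Productions `info.yaml` workflow definition. A hypothetical minimal example for orientation only (the job name, application version, bookkeeping path, and entrypoint below are invented; consult the Analysis Productions documentation for the actual schema):

```yaml
defaults:
  application: DaVinci/v64r5   # illustrative version
  wg: Charm
  inform:
    - someone@cern.ch

My_Job:
  input:
    bk_query: /LHCb/Collision24/.../CHARM.DST   # placeholder path
  output: TUPLE.ROOT
  options:
    entrypoint: my_package.options:main          # hypothetical module
```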

Arguments:

  • production_name - Name of the production (alphanumeric + underscore, 2-200 chars)
  • --input - Path to the info.yaml workflow definition file
  • --output - Path for the output JSON file containing production requests
  • --ap-pkg-version - AnalysisProductions package version (default: v999999999999)
  • --only-include - Only process the workflow chain containing this job name
  • --dump-requests - Also write individual .yaml files for each production request
  • --server-credentials - DIRAC server credentials (username and password)
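The production-name rule above (alphanumeric plus underscore, 2-200 characters) can be sketched as a simple check; the library's internal validation may be stricter, so treat this as illustrative:

```python
import re

# Documented rule: alphanumeric characters plus underscore, 2-200 chars.
# \w with re.ASCII matches exactly [A-Za-z0-9_].
PRODUCTION_NAME_RE = re.compile(r"\w{2,200}", re.ASCII)

def is_valid_production_name(name: str) -> bool:
    """Return True if `name` satisfies the documented naming rule."""
    return PRODUCTION_NAME_RE.fullmatch(name) is not None
```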

The output JSON contains:

  • rendered_yaml - The rendered workflow YAML after Jinja2 processing
  • productions - Dictionary of production requests, each containing:
    • request - The DIRAC production request in LbAPI format
    • input-dataset - Input dataset specification from Bookkeeping query
    • dynamic_files - Auto-generated configuration files
    • raw-yaml - Original YAML for the jobs in this production
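A short sketch of consuming that JSON from Python. The payload below is a dummy stand-in containing only the documented keys, with empty placeholder values; real output is produced by the CLI above:

```python
import json

# Dummy payload mimicking the documented output structure.
output = json.loads("""
{
  "rendered_yaml": "...",
  "productions": {
    "my_production_job1": {
      "request": {},
      "input-dataset": {},
      "dynamic_files": {},
      "raw-yaml": "..."
    }
  }
}
""")

for name, prod in output["productions"].items():
    request = prod["request"]        # DIRAC production request (LbAPI format)
    dataset = prod["input-dataset"]  # Bookkeeping input dataset spec
    print(name, sorted(prod))
```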

CWL Conversion CLI

Convert DIRAC production request YAML files to CWL workflows:

# List productions in a file
lhcb-production-yaml-to-cwl list_productions production.yaml

# Generate CWL workflow files
lhcb-production-yaml-to-cwl generate production.yaml --output-dir ./cwl-output

# Convert a specific production
lhcb-production-yaml-to-cwl generate production.yaml --production "MyProduction"

Python API

from LbAPCommon import parse_yaml, render_yaml, validate_yaml
from LbAPCommon.dirac_conversion import group_in_to_requests, step_to_production_request
from LbAPCommon.prod_request_to_cwl import fromProductionRequestYAMLToCWL

# Parse and validate workflow YAML
rendered = render_yaml(yaml_content)
jobs_data = parse_yaml(rendered, production_name, wg)
validate_yaml(jobs_data, wg, production_name)

# Group jobs into production requests
for job_names in group_in_to_requests(jobs_data):
    request = step_to_production_request(
        production_name, jobs_data, job_names, input_spec, ap_pkg_version
    )

# Convert DIRAC output to CWL
workflow, inputs, metadata = fromProductionRequestYAMLToCWL(yaml_path)

Development

Running Tests

pixi run test              # Run all tests
pixi run test-dirac        # Run DIRAC conversion tests
pixi run test-cov          # Run with coverage report

Updating Test Fixtures

When CWL output changes, update reference fixtures:

pixi run update-fixtures   # Regenerate reference .cwl files

Code Quality

pixi run pre-commit        # Run all pre-commit hooks (linting, formatting, etc.)

Documentation

Documentation lives in the LbAPDoc project.

Related Projects

  • LbAPI - REST API server for Analysis Productions
  • LbAPLocal - Local testing tools
  • LbAPWeb - Web interface for Analysis Productions

License

GPL-3.0 - See COPYING for details.
