
LbAPCommon


Common utilities for LHCb DPA WP2-related software, including Analysis Productions workflow parsing, validation, and conversion tools.

Features

  • Workflow Parsing: Parse and render Analysis Productions YAML configuration files
  • DIRAC Conversion: Convert workflow definitions to DIRAC production requests
  • CWL Conversion: Convert production requests to Common Workflow Language (CWL) format
  • Validation & Linting: Validate workflow configurations and check for common issues
  • Data Models: Pydantic v2 models for workflow and production data structures
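As a rough illustration of what a Pydantic v2 data model for a workflow job might look like, here is a minimal sketch; the class and field names below are invented for the example and are not the actual LbAPCommon API:

```python
# Hypothetical sketch of a Pydantic v2 model in the spirit of
# LbAPCommon's data models; names and fields are illustrative only.
from pydantic import BaseModel, Field

class JobSpec(BaseModel):
    # Names follow the same 2-200 character constraint as production names
    name: str = Field(min_length=2, max_length=200)
    application: str
    options: list[str] = []

# model_validate parses and validates a plain dict in one step
job = JobSpec.model_validate(
    {"name": "my_job", "application": "DaVinci/v64r5", "options": ["opts.py"]}
)
```

Pydantic v2 raises a `ValidationError` at parse time for out-of-range or mistyped fields, which is why it suits strict workflow validation.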

Installation

pip install LbAPCommon

For development:

# Using pixi (recommended)
pixi install
pixi run test

# Using conda/mamba
mamba env create --file environment.yaml
conda activate lbaplocal-dev
pip install -e '.[testing]'

Usage

DIRAC Production Request Generation

Convert Analysis Productions info.yaml files to DIRAC production requests. This is the primary CLI used by the Analysis Productions system:

python -m LbAPCommon <production_name> --input info.yaml --output output.json

# With specific AP package version
python -m LbAPCommon my_production --input info.yaml --output result.json --ap-pkg-version v1r0

# Process only a specific job from the workflow
python -m LbAPCommon my_production --input info.yaml --output result.json --only-include job_name

# Dump individual YAML request files for each production
python -m LbAPCommon my_production --input info.yaml --output result.json --dump-requests

# With DIRAC server credentials
python -m LbAPCommon my_production --input info.yaml --output result.json --server-credentials user password

Arguments:

  • production_name - Name of the production (alphanumeric + underscore, 2-200 chars)
  • --input - Path to the info.yaml workflow definition file
  • --output - Path for the output JSON file containing production requests
  • --ap-pkg-version - AnalysisProductions package version (default: v999999999999)
  • --only-include - Only process the workflow chain containing this job name
  • --dump-requests - Also write individual .yaml files for each production request
  • --server-credentials - DIRAC server credentials (username and password)
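The production-name constraint above (alphanumeric plus underscore, 2-200 characters) can be checked with a simple regular expression. This is a sketch of the rule as documented; the exact pattern used internally by LbAPCommon is an assumption:

```python
# Sketch of the documented production-name rule:
# alphanumeric + underscore, 2-200 characters.
import re

NAME_RE = re.compile(r"^[A-Za-z0-9_]{2,200}$")

def is_valid_production_name(name: str) -> bool:
    """Return True if the name satisfies the documented constraint."""
    return bool(NAME_RE.fullmatch(name))
```

For example, `my_production` passes, while `x` (too short) and `bad-name` (hyphen) are rejected.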

The output JSON contains:

  • rendered_yaml - The rendered workflow YAML after Jinja2 processing
  • productions - Dictionary of production requests, each containing:
    • request - The DIRAC production request in LbAPI format
    • input-dataset - Input dataset specification from Bookkeeping query
    • dynamic_files - Auto-generated configuration files
    • raw-yaml - Original YAML for the jobs in this production
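A minimal sketch of navigating the output JSON described above; the structure matches the keys listed, but the concrete values are made up for the example:

```python
# Illustrative walk over the output JSON structure described above;
# the field values here are invented placeholders, not real output.
import json

output = json.loads(json.dumps({
    "rendered_yaml": "jobs: ...",
    "productions": {
        "my_production": {
            "request": {"type": "AnalysisProduction"},
            "input-dataset": {"conditions_dict": {}},
            "dynamic_files": {"extra_options.py": "..."},
            "raw-yaml": "my_job: ...",
        }
    },
}))

for name, prod in output["productions"].items():
    request = prod["request"]        # DIRAC production request (LbAPI format)
    dataset = prod["input-dataset"]  # Bookkeeping input specification
```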

CWL Conversion CLI

Convert DIRAC production request YAML files to CWL workflows:

# List productions in a file
lhcb-production-yaml-to-cwl list_productions production.yaml

# Generate CWL workflow files
lhcb-production-yaml-to-cwl generate production.yaml --output-dir ./cwl-output

# Convert a specific production
lhcb-production-yaml-to-cwl generate production.yaml --production "MyProduction"

Python API

from LbAPCommon import parse_yaml, render_yaml, validate_yaml
from LbAPCommon.dirac_conversion import group_in_to_requests, step_to_production_request
from LbAPCommon.prod_request_to_cwl import fromProductionRequestYAMLToCWL

# Parse and validate workflow YAML
rendered = render_yaml(yaml_content)
jobs_data = parse_yaml(rendered, production_name, wg)
validate_yaml(jobs_data, wg, production_name)

# Group jobs into production requests
for job_names in group_in_to_requests(jobs_data):
    request = step_to_production_request(
        production_name, jobs_data, job_names, input_spec, ap_pkg_version
    )

# Convert DIRAC output to CWL
workflow, inputs, metadata = fromProductionRequestYAMLToCWL(yaml_path)

Development

Running Tests

pixi run test              # Run all tests
pixi run test-dirac        # Run DIRAC conversion tests
pixi run test-cov          # Run with coverage report

Updating Test Fixtures

When CWL output changes, update reference fixtures:

pixi run update-fixtures   # Regenerate reference .cwl files

Code Quality

pixi run pre-commit        # Run all pre-commit hooks (linting, formatting, etc.)

Documentation

Documentation is maintained in the LbAPDoc project.

Related Projects

  • LbAPI - REST API server for Analysis Productions
  • LbAPLocal - Local testing tools
  • LbAPWeb - Web interface for Analysis Productions

License

GPL-3.0 - See COPYING for details.
