
LbAPCommon


Common utilities for LHCb DPA WP2-related software, including Analysis Productions workflow parsing, validation, and conversion tools.

Features

  • Workflow Parsing: Parse and render Analysis Productions YAML configuration files
  • DIRAC Conversion: Convert workflow definitions to DIRAC production requests
  • CWL Conversion: Convert production requests to Common Workflow Language (CWL) format
  • Validation & Linting: Validate workflow configurations and check for common issues
  • Data Models: Pydantic v2 models for workflow and production data structures

Installation

pip install LbAPCommon

For development:

# Using pixi (recommended)
pixi install
pixi run test

# Using conda/mamba
mamba env create --file environment.yaml
conda activate lbaplocal-dev
pip install -e '.[testing]'

Usage

DIRAC Production Request Generation

Convert Analysis Productions info.yaml files to DIRAC production requests. This is the primary CLI used by the Analysis Productions system:

python -m LbAPCommon <production_name> --input info.yaml --output output.json

# With specific AP package version
python -m LbAPCommon my_production --input info.yaml --output result.json --ap-pkg-version v1r0

# Process only a specific job from the workflow
python -m LbAPCommon my_production --input info.yaml --output result.json --only-include job_name

# Dump individual YAML request files for each production
python -m LbAPCommon my_production --input info.yaml --output result.json --dump-requests

# With DIRAC server credentials
python -m LbAPCommon my_production --input info.yaml --output result.json --server-credentials user password

Arguments:

  • production_name - Name of the production (alphanumeric + underscore, 2-200 chars)
  • --input - Path to the info.yaml workflow definition file
  • --output - Path for the output JSON file containing production requests
  • --ap-pkg-version - AnalysisProductions package version (default: v999999999999)
  • --only-include - Only process the workflow chain containing this job name
  • --dump-requests - Also write individual .yaml files for each production request
  • --server-credentials - DIRAC server credentials (username and password)
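The production_name constraint above (alphanumeric plus underscore, 2-200 characters) can be sketched as a simple regex check. This is an illustrative sketch of the documented rule, not LbAPCommon's own validation code:

```python
import re

# Illustrative check for the documented production_name constraint:
# alphanumeric characters plus underscore, 2-200 characters long.
# (Our sketch of the rule, not LbAPCommon's implementation.)
PRODUCTION_NAME_RE = re.compile(r"[A-Za-z0-9_]{2,200}")

def is_valid_production_name(name: str) -> bool:
    """Return True if `name` satisfies the documented constraint."""
    return PRODUCTION_NAME_RE.fullmatch(name) is not None

print(is_valid_production_name("my_production"))  # True
print(is_valid_production_name("x"))              # False: too short
print(is_valid_production_name("bad-name"))       # False: hyphen not allowed
```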

The output JSON contains:

  • rendered_yaml - The rendered workflow YAML after Jinja2 processing
  • productions - Dictionary of production requests, each containing:
    • request - The DIRAC production request in LbAPI format
    • input-dataset - Input dataset specification from Bookkeeping query
    • dynamic_files - Auto-generated configuration files
    • raw-yaml - Original YAML for the jobs in this production
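The snippet below sketches how a consumer might walk that JSON. The sample dictionary mirrors the key names documented above; the inline values are placeholders, not real LbAPCommon output, and in practice you would load the structure with json.load from the --output file:

```python
# Sample mirroring the documented output structure of `python -m LbAPCommon`.
# In practice: output = json.load(open("output.json")). The values below are
# placeholders standing in for real content.
output = {
    "rendered_yaml": "# workflow YAML after Jinja2 rendering",
    "productions": {
        "my_production": {
            "request": {"type": "AnalysisProduction"},  # LbAPI-format request
            "input-dataset": {"bk_query": "..."},       # Bookkeeping query spec
            "dynamic_files": {},                        # auto-generated configs
            "raw-yaml": "# original YAML for these jobs",
        }
    },
}

for name, prod in output["productions"].items():
    print(name, sorted(prod.keys()))
```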

CWL Conversion CLI

Convert DIRAC production request YAML files to CWL workflows:

# List productions in a file
lhcb-production-yaml-to-cwl list_productions production.yaml

# Generate CWL workflow files
lhcb-production-yaml-to-cwl generate production.yaml --output-dir ./cwl-output

# Convert a specific production
lhcb-production-yaml-to-cwl generate production.yaml --production "MyProduction"

Python API

from LbAPCommon import parse_yaml, render_yaml, validate_yaml
from LbAPCommon.dirac_conversion import group_in_to_requests, step_to_production_request
from LbAPCommon.prod_request_to_cwl import fromProductionRequestYAMLToCWL

# Parse and validate workflow YAML
rendered = render_yaml(yaml_content)
jobs_data = parse_yaml(rendered, production_name, wg)
validate_yaml(jobs_data, wg, production_name)

# Group jobs into production requests
for job_names in group_in_to_requests(jobs_data):
    request = step_to_production_request(
        production_name, jobs_data, job_names, input_spec, ap_pkg_version
    )

# Convert DIRAC output to CWL
workflow, inputs, metadata = fromProductionRequestYAMLToCWL(yaml_path)

Development

Running Tests

pixi run test              # Run all tests
pixi run test-dirac        # Run DIRAC conversion tests
pixi run test-cov          # Run with coverage report

Updating Test Fixtures

When CWL output changes, update reference fixtures:

pixi run update-fixtures   # Regenerate reference .cwl files

Code Quality

pixi run pre-commit        # Run all pre-commit hooks (linting, formatting, etc.)

Documentation

Full documentation lives in the LbAPDoc project.

Related Projects

  • LbAPI - REST API server for Analysis Productions
  • LbAPLocal - Local testing tools
  • LbAPWeb - Web interface for Analysis Productions

License

GPL-3.0 - See COPYING for details.
