
Utilities needed for analysis productions in LHCb


ap_utilities

For documentation specific to MVA lines of the RD group, check this

Environment and installation

To run this, one has to be in an environment with access to DIRAC:

  1. Set up the LHCb environment:
. /cvmfs/lhcb.cern.ch/lib/LbEnv

  2. Get a grid proxy:
# Token valid for 100 hours
lhcb-proxy-init -v 100:00

  3. Access a shell with DIRAC:
lb-dirac bash

  4. Install this project:
pip install ap-utilities
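
The steps above rely on a few commands being available. As a quick sanity check (a minimal sketch, not part of this package), one can verify that the required tools are on the PATH before proceeding:

```python
import shutil

def missing_tools(tools):
    """Return the subset of `tools` not found on the PATH."""
    return [t for t in tools if shutil.which(t) is None]

# Commands assumed by the installation steps above
required = ["lhcb-proxy-init", "lb-dirac", "pip"]
print(missing_tools(required))
```

An empty list means the environment is ready; otherwise the missing commands are listed.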

Decay nicknames

Accessing table with DecFiles sample nicknames

These nicknames can be accessed from Python scripts with:

import ap_utilities.physics.utilities as aput

# To get exactly what was saved
literal = aput.read_decay_name(event_type=event_type, style='literal')

# To get a representation with special symbols like "," or "-" replaced
safe_1  = aput.read_decay_name(event_type=event_type, style='safe_1')
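
The exact substitutions are defined inside the package; purely as an illustration (hypothetical rules, not the package's actual mapping), a "safe" style could replace characters that are awkward in file names:

```python
# Hypothetical substitution table, for illustration only; the real
# mapping used by ap_utilities may differ.
SUBSTITUTIONS = {",": "_cm_", "-": "_mn_", "=": "_eq_", " ": ""}

def to_safe(name: str) -> str:
    """Replace characters that are problematic in file names."""
    for char, repl in SUBSTITUTIONS.items():
        name = name.replace(char, repl)
    return name

print(to_safe("Bu_K1ee,eq=DPC"))  # -> Bu_K1ee_cm_eq_eq_DPC
```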

Update table with nicknames and event types

This is most likely not needed, unless a new sample has been created and a new nickname needs to be added. The following lines:

export DECPATH=/home/acampove/Packages/DecFiles

update_decinfo

will:

  1. Set the path to the DecFiles root directory such that update_decinfo can use it.
  2. Read the event types and nicknames and save them to a YAML file.
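
A minimal sketch of the kind of scan this implies, assuming each .dec file carries an "EventType:" line in its header and that the nickname is taken from the file name (both are simplifying assumptions, not the tool's actual logic):

```python
import re

# Simplified assumption: the event type appears in the .dec header as
# "# EventType: NNNNNNNN" and the nickname is the file stem.
EVT_RE = re.compile(r"EventType:\s*(\d+)")

def scan_decfile(text, stem):
    """Return (event_type, nickname) from a .dec file's text, or None."""
    match = EVT_RE.search(text)
    return (match.group(1), stem) if match else None

print(scan_decfile("# EventType: 11102211\n# Descriptor: ...", "Bd_Kstee_eq_DPC"))
```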

Check for sample existence

Given a set of MC samples specified in a YAML file like:

settings_common: &common
  year      : 2024
  mc_path   : 2024.W31.34
  polarity  : MagUp
  nu_path   : Nu6.3
  sim_vers  : Sim10d
  generator : Pythia8
  ctags     : sim10-2024.Q3.4-v1.3-mu100
  dtags     : dddb-20240427
# -------------------------------------------
sections:
  one:
    settings:
      <<: *common
    evt_type:
      - '11102211'
      - '11102202'
  two:
    settings:
      <<        : *common
      sim_vers  : Sim10d-SplitSim02
    evt_type:
      - '11102211'
      - '11102202'
  three:
    settings:
      <<        : *common
      generator : BcVegPyPythia8
    evt_type:
      - '14143013'
      - '14113032'

run:

check_samples -i samples.yaml -n 6

to check whether the samples exist, using 6 threads (the default is 1). The script will produce info_SECTION_NAME.yaml and validation_SECTION_NAME.yaml files, one pair per section in the file above, i.e. one, two and three.

Important: Given that most settings are the same between sections, one can use anchors and aliases to override only what is different between them.
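
The merge-key pattern above is the YAML equivalent of starting each section from a shared dictionary and overriding only what differs. In plain Python the same idea looks like:

```python
# Shared settings, analogous to the &common anchor above (abridged)
common = {
    "year": 2024,
    "sim_vers": "Sim10d",
    "generator": "Pythia8",
}

# Each section copies `common` and overrides only what differs,
# mirroring `<<: *common` plus per-section keys.
sections = {
    "one":   {**common},
    "two":   {**common, "sim_vers": "Sim10d-SplitSim02"},
    "three": {**common, "generator": "BcVegPyPythia8"},
}

print(sections["two"]["sim_vers"], sections["three"]["generator"])
```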

Validate outputs of pipelines

To do this, first mount EOS locally, then run the validation script:

Mount EOS in laptop

# Install sshfs
...
# Check that it is installed
which sshfs

# Make a directory where EOS will be mounted

MNT_DIR=/eos/lhcb/wg/dpa/wp2/ci/
sudo mkdir -p $MNT_DIR
sudo chown $USER:$USER $MNT_DIR

# Mount EOS
sshfs -o idmap=user USERNAME@lxplus.cern.ch:$MNT_DIR $MNT_DIR

Run Validation

# This project is on PyPI
pip install ap-utilities

validate_ap_tuples -p PIPELINE -f ntuple_scheme.yaml -t 5

where:

  -p: the pipeline number, needed to find the ROOT files in EOS
  -f: the YAML configuration file
  -t: the number of threads to use; defaults to 1
  -l: the logging level; 20 (info) by default, can be 10 (debug) or 30 (warning)
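
These flags map onto a standard command-line interface; a sketch of how such a CLI might be declared with argparse (illustrative, not the package's actual code):

```python
import argparse

def build_parser():
    """Declare the flags described above (illustrative sketch)."""
    parser = argparse.ArgumentParser(description="Validate AP ntuples")
    parser.add_argument("-p", "--pipeline", required=True,
                        help="Pipeline number, used to locate ROOT files in EOS")
    parser.add_argument("-f", "--config", required=True,
                        help="YAML file with the validation configuration")
    parser.add_argument("-t", "--threads", type=int, default=1,
                        help="Number of threads (default 1)")
    parser.add_argument("-l", "--log-level", type=int, default=20,
                        choices=[10, 20, 30],
                        help="10=debug, 20=info, 30=warning")
    return parser

args = build_parser().parse_args(["-p", "123", "-f", "ntuple_scheme.yaml", "-t", "5"])
print(args.pipeline, args.threads, args.log_level)
```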

The configuration file passed with -f looks like:

# -----------------------------------------
# Needed to find where the files are in EOS
# -----------------------------------------
paths:
  pipeline_dir : /eos/lhcb/wg/dpa/wp2/ci
  analysis_dir : rd_ap_2024
# -----------------------------------------
# Each key corresponds to an MC sample; the value is a list of HLT2 lines that
# must be found as trees in the file. If the list is "any", the sample is not
# signal for any of the HLT2 lines, so no tree (equivalent to a line) is
# required to be made.
# -----------------------------------------
samples:
  # This is a sample without a dedicated trigger
  Bu_K1ee_eq_DPC:
    - any
  # This is a sample with two triggers targeting it
  Bd_Kpiee_eq_DPC:
    - Hlt2RD_B0ToKpPimEE
    - Hlt2RD_B0ToKpPimEE_MVA
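
The per-sample rule the comments describe can be sketched as follows (an assumed reading of the check, not the tool's actual implementation): a sample passes if every expected HLT2 line appears as a tree, and samples marked "any" need no tree at all.

```python
# Sketch of the per-sample check implied by the config above.
def sample_ok(expected_lines, trees_in_file):
    """True if the sample's trees satisfy the expected HLT2 lines."""
    if expected_lines == ["any"]:
        return True  # no dedicated trigger, so no tree is required
    return all(line in trees_in_file for line in expected_lines)

trees = {"Hlt2RD_B0ToKpPimEE", "Hlt2RD_B0ToKpPimEE_MVA"}
print(sample_ok(["Hlt2RD_B0ToKpPimEE", "Hlt2RD_B0ToKpPimEE_MVA"], trees))  # True
print(sample_ok(["any"], set()))  # True
```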

A few examples of config files can be found here.
