
ap_utilities

Utilities needed for analysis productions in LHCb

  • For instructions on how to install this project, check this
  • For documentation specific to MVA lines of the RD group, check this
  • For tools to deal with nicknames, check this
  • For instructions on how to mount EOS on your laptop, check this
  • For instructions on how to make the decay descriptor fields, check this

How to add a decay

Add the decay matching lines

This is done in rd_ap_2024/tupling/config/mcfuntuple.yaml. Each section in this file looks like:

# This is a nickname for the sample
Bd_Denu_Kstenu_eq_VisibleInAcceptance_HighVisMass_EGDWC:
# This is the decay descriptor and how to match particles to branch names
  Bd   : '[B0  ==>   (  D-  ==>   (  K*(892)0  ==>  K+  pi-  )   e-  anti-nu_e  )   e+  nu_e  ]CC'
  D    : '[B0  ==>  ^(  D-  ==>   (  K*(892)0  ==>  K+  pi-  )   e-  anti-nu_e  )   e+  nu_e  ]CC'
  Em   : '[B0  ==>   (  D-  ==>   (  K*(892)0  ==>  K+  pi-  )  ^e-  anti-nu_e  )   e+  nu_e  ]CC'
  Ep   : '[B0  ==>   (  D-  ==>   (  K*(892)0  ==>  K+  pi-  )   e-  anti-nu_e  )  ^e+  nu_e  ]CC'
  Kp   : '[B0  ==>   (  D-  ==>   (  K*(892)0  ==> ^K+  pi-  )   e-  anti-nu_e  )   e+  nu_e  ]CC'
  Kst  : '[B0  ==>   (  D-  ==>  ^(  K*(892)0  ==>  K+  pi-  )   e-  anti-nu_e  )   e+  nu_e  ]CC'
  nu   : '[B0  ==>   (  D-  ==>   (  K*(892)0  ==>  K+  pi-  )   e-  anti-nu_e  )   e+ ^nu_e  ]CC'
  pim  : '[B0  ==>   (  D-  ==>   (  K*(892)0  ==>  K+ ^pi-  )   e-  anti-nu_e  )   e+  nu_e  ]CC'
  • To get the sample nickname, follow this
  • To get the descriptors, one can either write them by hand (for a few samples) or follow these instructions (for many samples).

Add the list of samples

The list goes in rd_ap_2024/info.yaml. For this, an installation that provides access to DIRAC is needed. Given a set of MC samples specified in a YAML file like:

settings_common: &common
  year      : 2024
  mc_path   : 2024.W31.34
  polarity  : MagUp
  nu_path   : Nu6.3
  sim_vers  : Sim10d
  generator : Pythia8
  ctags     : sim10-2024.Q3.4-v1.3-mu100
  dtags     : dddb-20240427
# -------------------------------------------
sections:
  one:
    settings:
      <<: *common
    evt_type:
      - '11102211'
      - '11102202'
  two:
    settings:
      <<        : *common
      sim_vers  : Sim10d-SplitSim02
    evt_type:
      - '11102211'
      - '11102202'
  three:
    settings:
      <<        : *common
      generator : BcVegPyPythia8
    evt_type:
      - '14143013'
      - '14113032'
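The `<<: *common` entries are YAML merge keys: each section starts from `settings_common` and overrides individual fields. A pure-Python sketch of the resulting semantics (values copied from the example above):

```python
def resolve(common: dict, overrides: dict) -> dict:
    """Mimic '<<: *common': start from the shared settings and
    apply the section-local overrides on top."""
    return {**common, **overrides}

common = {
    'year'      : 2024,
    'sim_vers'  : 'Sim10d',
    'generator' : 'Pythia8',
}

two   = resolve(common, {'sim_vers': 'Sim10d-SplitSim02'})
three = resolve(common, {'generator': 'BcVegPyPythia8'})

print(two['sim_vers'])     # Sim10d-SplitSim02
print(three['generator'])  # BcVegPyPythia8
print(two['year'])         # 2024
```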

run:

check_samples -i samples.yaml -n 6

to check if the samples exist using 6 threads (default is 1). The script will produce:

  • info_SECTION_NAME.yaml: Where SECTION_NAME corresponds to each section above, i.e. one, two, three
  • validation_SECTION_NAME.yaml: Which will be needed for validation later.

Once this has been done, the lines needed for the info.yaml can be obtained by concatenating the partial outputs with:

cat info_*.yaml > samples.yaml

Updating tupling/config/samples.yaml

This file lists the samples together with the analyses that use them, e.g.:

Bc_Dsst2573mumu_KKpi_eq_BcVegPy_DPC: # This is a sample
- Bc_lines                           # This is an analysis
Bc_Jpsipi_mm_eq_WeightedBcVegPy_DPC:
- Bc_lines
Bc_pimumu_eq_PHSP_BcVegPy_DPC:
- Bc_lines

Where the analyses are sets of HLT2 lines described in tupling/config/analyses.yaml.
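A quick consistency check one could run here: every analysis referenced in samples.yaml should be defined in analyses.yaml. A hypothetical sketch (the sample names come from the example above; the HLT2 line name is only illustrative):

```python
def undefined_analyses(samples: dict, analyses: dict) -> set:
    """Return analysis names referenced in samples.yaml but not
    defined in analyses.yaml."""
    used = {name for alist in samples.values() for name in alist}
    return used - analyses.keys()

samples = {
    'Bc_Dsst2573mumu_KKpi_eq_BcVegPy_DPC': ['Bc_lines'],
    'Bc_pimumu_eq_PHSP_BcVegPy_DPC'      : ['Bc_lines'],
}
analyses = {'Bc_lines': ['Hlt2RD_B0ToKpPimEE']}  # line name is illustrative

print(undefined_analyses(samples, analyses))  # set()
```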

Checks

Before pipelines

In order to do these checks run:

check_production -p /home/acampove/Packages/AnalysisProductions/rd_ap_2024

Where the path points to the production directory. This script checks:

  1. Whether any sample nicknames in info.yaml are duplicated; duplicated nicknames are not expected.
  2. Whether any entries in mcfuntuple.yaml are duplicated.
  3. Whether there are samples in info.yaml that are not found in mcfuntuple.yaml; for those samples no MCDecayTree will be made.

The second argument is a list of strings representing samples; these are inclusive samples, which should be skipped. This argument is optional.

This script will produce report.yaml, which looks like:

# Nicknames of samples longer than 100 characters
long_nicknames: ['105', 'some_long_sample_name']
missing:
  info.yaml_mcfuntuple.yaml:
    only info.yaml:
      - Bd_KstPi0gamma_Kpi_eq_DPC_SS
      - Bd_Ksteta_gg_eq_DPC_SS
    only mcfuntuple.yaml:
      - Bd_KplKmn_eq_DPC
      - Bd_Kplpimn_eq_CPV2017_DPC
  info.yaml_samples.yaml:
    only info.yaml:
      - Bd_Denu_Kstenu_eq_VIA_HVM_EGDWC
      - Bd_Dmunu_Kstmunu_eq_DPC
    only samples.yaml:
      - Bd_KplKmn_eq_DPC
      - Bd_Kplpimn_eq_CPV2017_DPC
  mcfuntuple.yaml_samples.yaml:
    only mcfuntuple.yaml:
      - Bd_Denu_Kstenu_eq_VIA_HVM_EGDWC
      - Bd_Dmunu_Kstmunu_eq_DPC
    only samples.yaml:
      - Dst_D0pi_KK_TightCut
      - Dst_D0pi_KPi_TightCut
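The missing section above is a pairwise set difference between the sample keys of each pair of files; a sketch of the underlying logic (hypothetical helper, not part of ap_utilities):

```python
def pairwise_diff(a_name: str, a_keys: list, b_name: str, b_keys: list) -> dict:
    """Samples present in one file but not in the other, shaped like
    one entry of the 'missing' section of report.yaml."""
    a, b = set(a_keys), set(b_keys)
    return {
        f'only {a_name}': sorted(a - b),
        f'only {b_name}': sorted(b - a),
    }

info  = ['Bd_KstPi0gamma_Kpi_eq_DPC_SS', 'Bd_KplKmn_eq_DPC']
mcfun = ['Bd_KplKmn_eq_DPC', 'Bd_Kplpimn_eq_CPV2017_DPC']

print(pairwise_diff('info.yaml', info, 'mcfuntuple.yaml', mcfun))
```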

After pipelines

These checks require an environment with access to EOS. To gain EOS access from outside of LXPLUS (e.g. from a laptop), follow these instructions. After that, run:

# This project is distributed through PyPI
pip install ap_utilities

validate_ap_tuples -p PIPELINE -f ntuple_scheme.yaml -t 5

Where:

  • -p: Pipeline number, needed to find the ROOT files in EOS
  • -f: File with the configuration, like the one below
  • -t: Number of threads to use; if not passed, one thread is used
  • -l: Logging level, 20 (info) by default; can be 10 (debug) or 30 (warning)

# -----------------------------------------
# Needed to find where files are in EOS
# -----------------------------------------
paths:
  pipeline_dir : /eos/lhcb/wg/dpa/wp2/ci
  analysis_dir : rd_ap_2024
# -----------------------------------------
# Each key corresponds to an MC sample; the value is the list of lines that
# must be found as trees in the file. If the list is 'any', the sample is not
# signal for any of the HLT2 lines, so no specific tree (equivalent to a line)
# is required
# -----------------------------------------
samples:
  # This is a sample without a dedicated trigger
  Bu_K1ee_eq_DPC:
    - any
  # This is a sample with two triggers targeting it
  Bd_Kpiee_eq_DPC:
    - Hlt2RD_B0ToKpPimEE
    - Hlt2RD_B0ToKpPimEE_MVA
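The any convention above can be resolved as in this hypothetical snippet (sample and line names taken from the example):

```python
def expected_trees(sample: str, samples_cfg: dict) -> list:
    """Return the HLT2 trees that must exist for a sample; a value of
    ['any'] means the sample is not signal for any HLT2 line, so no
    specific tree is required."""
    lines = samples_cfg[sample]
    return [] if lines == ['any'] else lines

cfg = {
    'Bu_K1ee_eq_DPC' : ['any'],
    'Bd_Kpiee_eq_DPC': ['Hlt2RD_B0ToKpPimEE', 'Hlt2RD_B0ToKpPimEE_MVA'],
}

print(expected_trees('Bu_K1ee_eq_DPC', cfg))   # []
print(expected_trees('Bd_Kpiee_eq_DPC', cfg))  # ['Hlt2RD_B0ToKpPimEE', 'Hlt2RD_B0ToKpPimEE_MVA']
```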

A few examples of config files can be found here.
