
Project with code used to ease data manipulation tasks

Project description

Description

This project is used to carry out checks on Run3 data. Check this for installation instructions. Regardless, the code needs the following variables to be defined:

export LXNAME=$USER # This is the username when running in LXPLUS
# This is the value of VENVS used to create the virtual environment that will be used
export VENVS=/afs/ihep.ac.cn/users/c/campoverde/VENVS

such that the code that is run is taken from a tarball in the grid and associated with a specific user.

Instead of running in a virtual environment, one will have to run in an environment with DIRAC:

. /cvmfs/lhcb.cern.ch/lib/LbEnv

# This will open a new shell; work from it
lb-dirac

# Make locally installed scripts available
export PATH+=:$HOME/.local/bin

Specifying configuration for filtering and slimming

For this to work, configs need to be uploaded to the grid with the scripts below. The scripts need to know the place in the grid where the user LFNs live. For that, the following line needs to be issued:

export LXNAME=$USER # This is the username when running in LXPLUS

The configuration file is updated with:

update_config -u 1

The -u flag will update the config file if its LFN is already in the grid. The script needs to run with:

  1. The LHCb environment set up.
  2. A valid grid token.
  3. The working virtual environment active. Both lb-dirac and the script need to be used, and no conflict between the virtual environment and the LHCb environment seems to arise. A sketch of this setup follows the list.
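
A minimal sketch of such a setup, assuming the grid token is obtained with lhcb-proxy-init and that the virtual environment lives under $VENVS (both assumptions):

. /cvmfs/lhcb.cern.ch/lib/LbEnv       # 1. LHCb environment
lhcb-proxy-init                       # 2. valid grid token
source $VENVS/post_ap/bin/activate    # 3. working virtual environment; the venv name here is an assumption
update_config -u 1                    # upload or refresh the config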

Save lists of PFNs

The PFNs to be processed are stored once with the AP API and are read as package data when processing ntuples. The list of PFNs is created with, e.g.:

save_pfns -c dt_2024_turbo_comp

where -c specifies the config file.
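
As a rough guide (the ordering here is an assumption), the config is first uploaded or refreshed and the PFN list is then saved for it:

update_config -u 1                 # upload/refresh the config in the grid
save_pfns -c dt_2024_turbo_comp    # store the list of PFNs for that config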

Submitting jobs


All the jobs below require code that lives in a virtual environment. There may be multiple versions of this environment; the latest one should be obtained by running:

lb-dirac dirac-dms-user-lfns -w dcheck.tar -b /lhcb/user/${LXNAME:0:1}/$LXNAME/run3/venv

The instructions below need to be carried out outside the virtual environment, in an environment with access to DIRAC, and in the post_ap_grid directory.
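
A minimal sketch of this setup, assuming the post_ap_grid directory sits inside the current working copy:

. /cvmfs/lhcb.cern.ch/lib/LbEnv    # LHCb environment
lb-dirac                           # opens a new shell with access to DIRAC
cd post_ap_grid                    # directory with the job scripts; its location is an assumption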

First run a test job with:

./job_filter -d dt_2024_turbo -c comp -j 1211 -e 003 -m local -n test_flt

where -j specifies the number of jobs; for tests this is the number of files to process, so the test job processes only one file. The -n flag is the name of the job; only one job will be done/sent, i.e. the job is treated as a test, if either:

  1. Its name has the substring test.
  2. It is a local job.

Thus one can run local or grid tests over a single file.
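
For illustration, both test variants could look as follows; the job names are made up, the remaining options are copied from the example above:

./job_filter -d dt_2024_turbo -c comp -j 1211 -e 003 -m local -n flt_local    # local job, treated as a test
./job_filter -d dt_2024_turbo -c comp -j 1211 -e 003 -m wms   -n test_grid    # name contains "test", single-file test on the grid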

For real jobs:

./job_filter -d dt_2024_turbo -c comp -j 200 -e 003 -m wms -n flt_001

Downloading ntuples

A test would look like:

run3_download_ntuples -j flt_004 -n 3 [-d $PWD/files]

where:

-j: Name of the job, which has to coincide with the name of the directory in EOS where the ntuples are, e.g. /eos/lhcb/grid/user/lhcb/user/a/acampove/flt_004.
-n: Number of ntuples to download; if not passed, everything will be downloaded.
-d: Directory where the output ntuples will go; if not passed, the directory pointed to by DOWNLOAD_NTUPPATH will be used.
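
For example, to make the default destination a directory under $HOME (the path below is an assumption), the variable can be exported before downloading:

export DOWNLOAD_NTUPPATH=$HOME/ntuples    # default destination for downloaded ntuples
run3_download_ntuples -j flt_004 -n 3     # ntuples end up under $DOWNLOAD_NTUPPATH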

A real download would look like:

run3_download_ntuples -j flt_001 -m 40

where -m denotes the number of threads used to download and -j the name of the job.

Notes

  • The downloads can be run many times; if a file has already been downloaded, it will not be downloaded again.

Linking and merging

Once the ntuples are downloaded, they need to be linked and merged with:

link_merge -j flt_002 -v v1

where -j is the name of the job and -v names the directory (here v1) that the files are linked to. For tests run:

link_merge -j flt_002 -d 1 -m 10 -v v1

which will do the same with at most 10 files; debug messages can be enabled with -l 10.
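
For instance, the same test with debug messages enabled, combining the flags mentioned above, would be:

link_merge -j flt_002 -d 1 -m 10 -v v1 -l 10    # at most 10 files, debug-level logging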

Making basic plots

For this run:

plot_vars -y 2024 -v v2 -c bukee_opt -d data_ana_cut_bp_ee:Data ctrl_BuToKpEE_ana_ee:Simulation

which will plot the variables in the config specified by bukee_opt, where the axes, names, ranges, etc. are also specified. This config is in post_ap_data. The script above will overlay data and MC.

Project details


Download files

Download the file for your platform.

Source Distribution

post_ap-0.0.4.tar.gz (53.9 kB)

Uploaded Source

Built Distribution


post_ap-0.0.4-py3-none-any.whl (58.3 kB)

Uploaded Python 3

File details

Details for the file post_ap-0.0.4.tar.gz.

File metadata

  • Download URL: post_ap-0.0.4.tar.gz
  • Size: 53.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/5.1.1 CPython/3.12.7

File hashes

Hashes for post_ap-0.0.4.tar.gz

  • SHA256: b489ff2185a6f43a28565167e836e0c0108a05279123cdf736dd97b49f204ccf
  • MD5: 4fea0fb36ce585cd51d68eaecaea293d
  • BLAKE2b-256: f98a621801cacd89a4fc025584a6bef68360c978880bb579971865bcf3263c5b


Provenance

The following attestation bundles were made for post_ap-0.0.4.tar.gz:

Publisher: publish.yaml on acampove/post_ap

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file post_ap-0.0.4-py3-none-any.whl.

File metadata

  • Download URL: post_ap-0.0.4-py3-none-any.whl
  • Size: 58.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/5.1.1 CPython/3.12.7

File hashes

Hashes for post_ap-0.0.4-py3-none-any.whl

  • SHA256: 69bc4c9d34a5568b7293b5db3b9ae3123b5b6a242d5342a7e77caa55fe550d5f
  • MD5: 4dd8f973aa3eea51d460c33f951a0b49
  • BLAKE2b-256: 06a1d43f6bea37cd92af72b21022dc6e570d15f37a77a8c150f52123f7f6b5f1


Provenance

The following attestation bundles were made for post_ap-0.0.4-py3-none-any.whl:

Publisher: publish.yaml on acampove/post_ap

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
