
Project with code needed to filter, trim, and slim ntuples produced by AP (Analysis Productions)


Description

This project is used to:

  • Filter, slim, and trim the trees from a given AP production
  • Rename branches
  • Download the outputs

This is done using configurations in a YAML file and through DIRAC jobs.
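
What to keep and rename lives in that YAML file. As a rough sketch only: the option descriptions below imply a section named after the production and a `samples` section listing sample nicknames; every other key here is a hypothetical placeholder, not the project's actual schema:

# Hypothetical sketch of a config file; only the production-name and
# `samples` sections are implied by the option descriptions below
cat > v3.yaml <<'EOF'
rd_ap_2024:          # production name, matched by the -p option
  samples:           # sample nicknames, matched by the -s option
    data: {}         # placeholder; real per-sample settings go here
    simulation: {}
EOF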

Installation

This project can be installed from PyPI with:

pip install post_ap

In order for it to run, one needs to set up a shell with the right environment; this is done with:

post_shell -u X -c Y

where:

  • X: Username on LXPLUS (or the current user), needed to find the place in EOS where to put the outputs.
  • Y: Username on LXPLUS of the user whose virtual environment will be used. If you haven't made any virtual environment, use acampove.
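
For example, for a hypothetical LXPLUS user jsmith reusing the shared environment:

# jsmith is a hypothetical LXPLUS username; the virtual environment
# of acampove is reused, as suggested above
post_shell -u jsmith -c acampove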

Submitting jobs

Check latest version of virtual environment

The jobs below will run with code from a virtual environment that is already in the grid. One should use the latest version of this environment; to list the available versions, run:

list_venvs

Note that post_shell should have been run before using list_venvs.
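
Putting both steps together (the username jsmith is hypothetical, and the version index printed by list_venvs is what the -e option of job_filter expects below):

post_shell -u jsmith -c acampove   # set up the environment first
list_venvs                         # lists the available versions, e.g. 025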

Submit jobs

To run the filtering, after installing the project as shown above, do:

# local will create a local sandbox; use wms to send to the grid

# For data, this will process a single PFN locally
job_filter -n test_job -p rd_ap_2024 -s       data -c /home/acampove/Packages/config_files/post_ap/v3.yaml -e 025 -u acampove -m local -t

# For data, this will process all the PFNs in the grid 
job_filter -n data_job -p rd_ap_2024 -s       data -c /home/acampove/Packages/config_files/post_ap/v3.yaml -e 025 -u acampove -m wms

# For MC, this will process all the PFNs in the grid 
job_filter -n mc_job   -p rd_ap_2024 -s simulation -c /home/acampove/Packages/config_files/post_ap/v3.yaml -e 025 -u acampove -m wms

where the options mean:

  -h, --help            show this help message and exit
  -n NAME, --name NAME  Name of the job, needed for DIRAC naming and to name the output
  -p PROD, --prod PROD  Name of the production, e.g. rd_ap_2024; this should be the same as in the config section.
  -s SAMP, --samp SAMP  Sample nickname found in the config section `samples`
  -c CONF, --conf CONF  Path to the config file, which should be a YAML file; a few examples are linked below.
  -e VENV, --venv VENV  Index of the virtual environment, e.g. 023
  -u USER, --user USER  User associated to the venv; currently acampove should be the only choice, but if you author and upload your own virtual environment, this should be your user name
  -d DRYR, --dryr DRYR  If used, submission will be skipped; useful for debugging.
  -M MAXJ, --maxj MAXJ  Maximum number of jobs, default 500; if 1000 PFNs are found, 500 jobs will be created, if 100 PFNs are found, 100 jobs
  -m {local,wms}, --mode {local,wms}
                        Run locally (for tests) or in the grid
  -t, --test            If used, only one job will be sent

Regarding the name: the output will go to a directory in EOS named JOBNAME_SAMPLENAME, e.g. test_001_data if -n test_001 is used with the -s data sample. Some config files can be found here.
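
As a further illustration, the documented -M flag can cap a full grid submission; this reuses the job, config, and venv names from the examples above:

# Cap the grid submission at 200 jobs; all other options as in the examples above
job_filter -n data_job -p rd_ap_2024 -s data -c /home/acampove/Packages/config_files/post_ap/v3.yaml -e 025 -u acampove -m wms -M 200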

Downloading ntuples

A test would look like:

run3_download_ntuples -j dec_06_2024_data -n 20 -r 1 -m 5 [-d $PWD/files]

where:

options:
  -h, --help            show this help message and exit
  -j JOBN, --jobn JOBN  Job name, used to find directory, e.g. flt_001
  -n NFILE, --nfile NFILE
                        Number of files to download
  -d DEST, --dest DEST  Destination directory; overrides whatever is in DOWNLOAD_NTUPPATH
  -t, --test            Runs a test
  -l {10,20,30,40}, --log {10,20,30,40}
                        Log level, default 20
  -r {0,1}, --ran {0,1}
                        When picking a subset of files, with -n, pick them randomly (1) or the first files (0 default)
  -m MTH, --mth MTH     Number of threads to use for downloading, default 1

A real download would look like:

run3_download_ntuples -j dec_06_2024_data -m 40

where -m denotes the number of threads used to download and -j the name of the job.
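
If DOWNLOAD_NTUPPATH is set in the environment (the -d description above suggests it provides the default destination; treating it as an environment variable is an assumption here), the destination only needs to be configured once:

# Assumption: DOWNLOAD_NTUPPATH is read from the environment; -d would
# override it for a single invocation. The path is a hypothetical example.
export DOWNLOAD_NTUPPATH=/eos/user/j/jsmith/ntuples
run3_download_ntuples -j dec_06_2024_data -m 40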

Removing old outputs

If outputs of old jobs need to be removed, it can be done with:

remove_job -n job_name -s sample_name

Following the pattern above, this could look like:

remove_job -n dec_08_2024 -s simulation 
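
Tying this back to the submission examples above, cleaning up both the data and the MC outputs would look like:

# Remove the outputs of the data and MC jobs submitted earlier
remove_job -n data_job -s data
remove_job -n mc_job   -s simulation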
