Project with code needed to filter, trim and slim ntuples produced by AP

Description

This project is used to:

  • Filter, slim, trim the trees from a given AP production
  • Rename branches
  • Download the outputs

This is done using configurations in a YAML file and through DIRAC jobs.
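As a purely hypothetical illustration of the idea (the real schema should be taken from the example config files mentioned below, not from this sketch), such a YAML groups a production and its samples, which the job options refer to:

```yaml
# Hypothetical sketch only -- not the actual post_ap schema
rd_ap_2024:          # production name, as passed with -p/--prod
  samples:
    data: ...        # sample nicknames, as passed with -s/--samp
    simulation: ...
```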

Installation

This project can be installed from PyPI with:

pip install post_ap

In order for it to run, one needs to set up a shell with the right environment. This is done with:

post_shell -u X -c Y

where:

  • X: Username in LXPLUS, or the current user; needed to find the place in EOS where the outputs will be put.
  • Y: Username in LXPLUS of the user whose virtual environment will be used. If you haven't made any virtual environment, use acampove.

It is recommended that you make a grid proxy afterwards with:

# Validity of 100 hours
dirac-proxy-init -v 100:00

Submitting jobs

Check latest version of virtual environment

The jobs below will run with code from a virtual environment that is already in the grid. One should use the latest version of this environment. To know the latest versions, run:

list_venvs

Note that post_shell should have been run before using list_venvs.

Submit jobs

To run the filtering, after properly installing the project as shown above, do:

# local will create a local sandbox; use wms to send to the grid

# For data, this will process a single PFN locally
job_filter -n test_job -p rd_ap_2024 -s       data -c /home/acampove/Packages/config_files/post_ap/v3.yaml -e 025 -u acampove -m local -t

# For data, this will process all the PFNs in the grid 
job_filter -n data_job -p rd_ap_2024 -s       data -c /home/acampove/Packages/config_files/post_ap/v3.yaml -e 025 -u acampove -m wms

# For MC, this will process all the PFNs in the grid 
job_filter -n mc_job   -p rd_ap_2024 -s simulation -c /home/acampove/Packages/config_files/post_ap/v3.yaml -e 025 -u acampove -m wms

where the options mean:

  -h, --help            show this help message and exit
  -n NAME, --name NAME  Name of the job, needed for DIRAC naming and to name the output
  -p PROD, --prod PROD  Name of the production, e.g. rd_ap_2024; this should be the same as in the config section.
  -s SAMP, --samp SAMP  Sample nickname found in the config section `samples`
  -c CONF, --conf CONF  Path to the config file, which should be a YAML file; a few examples are linked below.
  -e VENV, --venv VENV  Index of the virtual environment, e.g. 023
  -u USER, --user USER  User associated to the venv; currently acampove should be the only choice, but if you author your own virtual environment and upload it, then this should be your user name
  -d DRYR, --dryr DRYR  If used, submission will be skipped; needed for debugging.
  -M MAXJ, --maxj MAXJ  Maximum number of jobs, default 500. If 1000 PFNs are found, 500 jobs will be made; if 100 PFNs are found, 100 jobs will be made
  -m {local,wms}, --mode {local,wms} Run locally (for tests) or in the grid
  -t, --test            If used, will send only one job

Regarding the name, the output will go to a directory in EOS named JOBNAME_SAMPLENAME, e.g. test_001_data if -n test_001 is used with the -s data sample. Some example config files can be found here.
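The -M/--maxj cap and the output naming described above can be sketched as follows. This is a minimal illustration of the described behaviour, not the project's actual code; the helper names are hypothetical:

```python
# Hypothetical sketch of the job-splitting and naming logic described above.

def n_jobs(n_pfns: int, maxj: int = 500) -> int:
    """Number of DIRAC jobs: one per PFN, capped by -M/--maxj."""
    return min(n_pfns, maxj)

def output_dir(job_name: str, sample: str) -> str:
    """EOS output directory name, JOBNAME_SAMPLENAME."""
    return f'{job_name}_{sample}'
```

With 1000 PFNs and the default cap, n_jobs(1000) gives 500; with 100 PFNs it gives 100, matching the -M description. output_dir('test_001', 'data') gives test_001_data.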

Downloading ntuples

A test would look like:

run3_download_ntuples -j dec_06_2024_data -n 20 -r 1 -m 5 [-d $PWD/files]

where:

options:
  -h, --help            show this help message and exit
  -j JOBN, --jobn JOBN  Job name, used to find the directory, e.g. flt_001
  -n NFILE, --nfile NFILE
                        Number of files to download
  -d DEST, --dest DEST  Destination directory; will override whatever is in DOWNLOAD_NTUPPATH
  -e EOSN, --eosn EOSN  Username of the user from whose EOS area to download the ntuples, e.g. acampove
  -t, --test            Runs a test
  -l {10,20,30,40}, --log {10,20,30,40}
                        Log level, default 20
  -r {0,1}, --ran {0,1}
                        When picking a subset of files with -n, pick them randomly (1) or take the first files (0, default)
  -m MTH, --mth MTH     Number of threads to use for downloading, default 1
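The -n/-r combination above (take the first N files, or a random subset of N) can be sketched with the standard library. pick_files is a hypothetical helper for illustration, not the tool's real code:

```python
import random

def pick_files(pfns: list, nfile: int, ran: int = 0, seed=None) -> list:
    """Pick a subset of files: a random sample of nfile (ran=1)
    or the first nfile (ran=0, default), mirroring -n and -r."""
    if ran == 1:
        return random.Random(seed).sample(pfns, nfile)
    return pfns[:nfile]
```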

A real download would look like:

run3_download_ntuples -j dec_06_2024_data -m 40 -e username

Where -m denotes the number of threads used to download and -j the name of the job. If acampove made these ntuples, they will be in his directory in EOS, so -e acampove should be used.
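The -m/--mth option parallelises the downloads over several threads. A minimal sketch of that pattern with Python's standard library, where download_one is a hypothetical stand-in for the actual transfer:

```python
from concurrent.futures import ThreadPoolExecutor

def download_all(paths: list, download_one, mth: int = 1) -> list:
    """Download every path using mth worker threads, as with -m/--mth.
    Results are returned in the same order as the input paths."""
    with ThreadPoolExecutor(max_workers=mth) as pool:
        return list(pool.map(download_one, paths))
```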

Removing old outputs

If outputs of old jobs need to be removed, it can be done with:

remove_job -n job_name -s sample_name

from the examples above this could look like:

remove_job -n dec_08_2024 -s simulation 

Download files


Source Distribution

post_ap-0.2.1.tar.gz (35.1 kB)

Uploaded Source

Built Distribution


post_ap-0.2.1-py3-none-any.whl (41.6 kB)

Uploaded Python 3

File details

Details for the file post_ap-0.2.1.tar.gz.

File metadata

  • Download URL: post_ap-0.2.1.tar.gz
  • Size: 35.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.0.1 CPython/3.12.8

File hashes

Hashes for post_ap-0.2.1.tar.gz:

  • SHA256: e6c7329f5caed26e885e57bc990cef4bc4f3a2fb3387df225cdbe1217c1e4458
  • MD5: 0ae79ba28d4a08402ea38e6dee3792ee
  • BLAKE2b-256: c036252bad51cef5b71b6594e17ce1e3d9488c0da1fad915863516ffaf57cae0


Provenance

The following attestation bundles were made for post_ap-0.2.1.tar.gz:

Publisher: publish.yaml on acampove/post_ap

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file post_ap-0.2.1-py3-none-any.whl.

File metadata

  • Download URL: post_ap-0.2.1-py3-none-any.whl
  • Size: 41.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.0.1 CPython/3.12.8

File hashes

Hashes for post_ap-0.2.1-py3-none-any.whl:

  • SHA256: 9d77590e218a51a4aa46461dc4655fecd59e32c0b1e06a6196703591212c6d04
  • MD5: f666f8a206637bd24e7cc5fca280d173
  • BLAKE2b-256: f76b66418be4951a1be786092f1da255521530ce9ef5676dc4aac62a24b61685


Provenance

The following attestation bundles were made for post_ap-0.2.1-py3-none-any.whl:

Publisher: publish.yaml on acampove/post_ap

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
