Project with code used to ease data manipulation tasks

Description

This project is used to:

  • Filter, slim, and trim the tree files from a given AP production
  • Rename branches
  • Download the outputs

This is done using configurations in a YAML file and through DIRAC jobs.

Check this for installation instructions and for instructions on how to set up an environment to use this project.

Submitting jobs

Check latest version of virtual environment

All the jobs below require code that lives in a virtual environment. There may be multiple versions of this environment; list them by running:

dirac-dms-user-lfns -w dcheck.tar -b /lhcb/user/${LXNAME:0:1}/$LXNAME/run3/venv

and use the latest one. Unless you have made your own tarballs, LXNAME=acampove.
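The path in the command above relies on bash substring expansion: ${LXNAME:0:1} extracts the first letter of the username, matching the grid convention of grouping user directories by initial. A minimal illustration:

```shell
# bash substring expansion: ${var:offset:length}
# ${LXNAME:0:1} is the first letter of the username, so the path
# matches the grid layout /lhcb/user/<initial>/<user>/...
LXNAME=acampove
echo "/lhcb/user/${LXNAME:0:1}/$LXNAME/run3/venv"
# -> /lhcb/user/a/acampove/run3/venv
```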

Submit jobs

Run a test job with:

job_filter -d dt_2024_turbo -c comp -j 1211 -e 003 -m local -n test_flt -u acampove

where -u specifies the user who authored the environment that the job will use, and -j specifies the number of jobs. For tests, this is the number of files to process, so the test job above processes only one file. The -n flag is the name of the job; only a single job will be run or sent if either:

  1. Its name has the substring test.
  2. It is a local job.

Thus one can run local or grid tests over a single file.
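The single-file rule above can be sketched as a small shell check; the function name is hypothetical and not part of the project, it only restates the stated logic (name contains "test", or the job runs locally):

```shell
# Hypothetical sketch of the rule above: a job processes a single file
# if its name contains the substring "test" OR its mode is "local".
is_single_file_job() {
    local name=$1 mode=$2
    case "$name" in *test*) return 0 ;; esac
    [ "$mode" = "local" ]
}

is_single_file_job test_flt wms   && echo "test_flt on wms: single file"
is_single_file_job flt_001 local  && echo "flt_001 locally: single file"
is_single_file_job flt_001 wms    || echo "flt_001 on wms: all files"
```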

For real jobs:

job_filter -d dt_2024_turbo -c comp -j 200 -e 003 -m wms -n flt_001 -u acampove

Downloading ntuples

A test would look like:

run3_download_ntuples -j flt_004 -n 3 [-d $PWD/files]

where:

  • -j: The name of the job, which has to coincide with the directory name where the ntuples are in EOS, e.g. /eos/lhcb/grid/user/lhcb/user/a/acampove/flt_004.
  • -n: The number of ntuples to download; if not passed, everything will be downloaded.
  • -d: The directory where the output ntuples will go; if not passed, the directory pointed to by DOWNLOAD_NTUPPATH will be used.
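The EOS directory follows the pattern shown above, /eos/lhcb/grid/user/lhcb/user/&lt;initial&gt;/&lt;user&gt;/&lt;job&gt;. Assuming that layout is fixed, the directory for a given user and job name can be built like this (a sketch, not a project utility):

```shell
# Build the EOS directory for a user/job pair, following the layout above.
# ${user:0:1} gives the one-letter initial subdirectory.
user=acampove
job=flt_004
eos_dir="/eos/lhcb/grid/user/lhcb/user/${user:0:1}/${user}/${job}"
echo "$eos_dir"
# -> /eos/lhcb/grid/user/lhcb/user/a/acampove/flt_004
```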

A real download would look like:

run3_download_ntuples -j flt_001 -m 40

where -m denotes the number of threads used to download and -j the name of the job.

Project details


Download files

Download the file for your platform.

Source Distribution

post_ap-0.1.0.tar.gz (23.0 kB)

Uploaded Source

Built Distribution

post_ap-0.1.0-py3-none-any.whl (27.1 kB)

Uploaded Python 3

File details

Details for the file post_ap-0.1.0.tar.gz.

File metadata

  • Download URL: post_ap-0.1.0.tar.gz
  • Size: 23.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/5.1.1 CPython/3.12.7

File hashes

Hashes for post_ap-0.1.0.tar.gz
Algorithm Hash digest
SHA256 f4cc0eb73e535808f5db6a651ff2d0905c1fd3226c06bebcf1596950bc566b7e
MD5 e258d06dd0752890c575058c93410286
BLAKE2b-256 bd53bebcffecf753dcc765f9644b06b49307abd03e0150e4cfa99614947a92d6


Provenance

The following attestation bundles were made for post_ap-0.1.0.tar.gz:

Publisher: publish.yaml on acampove/post_ap

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file post_ap-0.1.0-py3-none-any.whl.

File metadata

  • Download URL: post_ap-0.1.0-py3-none-any.whl
  • Size: 27.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/5.1.1 CPython/3.12.7

File hashes

Hashes for post_ap-0.1.0-py3-none-any.whl
Algorithm Hash digest
SHA256 3be4bfbdc952202f4a8a62a1affd71e83fbc221ab1a2ba12379fd0201f4fafec
MD5 0faf113d692e0fd2526c96096e647421
BLAKE2b-256 01828899c26f84f7273fba3c36734f7969ffb3c31ae98a940e6e204cd6750dce


Provenance

The following attestation bundles were made for post_ap-0.1.0-py3-none-any.whl:

Publisher: publish.yaml on acampove/post_ap

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
