PyProfQueue

A Python package to inject profiling initialisation into bash scripts, translate queue options, and submit jobs.

The PyProfQueue package provides the Script class, which takes the path and name of the bash script to be submitted and profiled, reads in its queue options, and adds any initialisations needed to profile the run with Prometheus and LIKWID.

The Script class provides the following methods:

  • add_likwid(likwid_req: list, likwid_output: str = './')
    • Adds the initialisation necessary to use LIKWID to create a roofline model.
    • likwid_req is a list that should contain the lines the system in use needs in order to be able to run LIKWID, for example loading the likwid module:
likwid_req = ['module load oneAPI_comp', 'module load likwid']
  • add_prometheus(prometheus_req: list, prometheus_output: str = './')
    • Adds the initialisation necessary to use Prometheus to profile the job.
    • prometheus_req is a list that should contain the lines the system in use needs in order to be able to run Prometheus. It must include at least the following two entries:
prometheus_req = [
    'export PROMETHEUS_SOFTWARE=<Path to Prometheus software>',
    'export PROMPYTHON=<Path to Python file for read_prometheus.py>'
]
  • change_options(queue_options: dict)
    • Allows queue options to be changed after a Script object has been initialised, in case the options given at initialisation are no longer desired.

Beyond these three, it is also possible to call the method 'create_profilefile' if one wants to create the bash files themselves. This method is called automatically by the submit option, and is provided separately so that the content of the files can be validated before they are submitted.
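As an illustration of how these pieces fit together, here is a minimal usage sketch. The import path and the constructor arguments shown (queue_system, work_script and queue_options) are assumptions based on the description above, not the verified signature; check the Script class in script.py for the exact parameter names.

import PyProfQueue

# Create a Script object from an existing bash job script.
# NOTE: the keyword names below are illustrative assumptions.
script = PyProfQueue.Script(
    queue_system='slurm',
    work_script='./my_job.sh',
    queue_options={'time': '01:00:00', 'nodes': '1'}
)

# Add LIKWID profiling to build a roofline model.
script.add_likwid(
    likwid_req=['module load oneAPI_comp', 'module load likwid'],
    likwid_output='./likwid_results/'
)

# Add Prometheus monitoring.
script.add_prometheus(
    prometheus_req=[
        'export PROMETHEUS_SOFTWARE=<Path to Prometheus software>',
        'export PROMPYTHON=<Path to Python file for read_prometheus.py>'
    ],
    prometheus_output='./prometheus_results/'
)

# Write out the bash files so their contents can be inspected before
# the job is submitted via the package's submit option.
script.create_profilefile()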

Requirements

Python Packages

PyProfQueue requires Python 3.10 or newer, as the package makes use of the match statement. In order to run read_prometheus.py, the path to a Python implementation/environment needs to be provided. This implementation needs to have the following packages:

  • promql_http_api
  • numpy
  • pytz
  • pandas
  • datetime

This Python implementation does not need to be the same one used to submit the scripts, but it does need to be present on the system to which the job is submitted.
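As a concrete, purely hypothetical example: if these packages are installed in a separate environment at /opt/prof-env, the PROMPYTHON entry passed to add_prometheus could point at that environment's interpreter. The path below is illustrative only, and the exact value PROMPYTHON expects should be checked against the package documentation.

prometheus_req = [
    'export PROMETHEUS_SOFTWARE=<Path to Prometheus software>',
    # Hypothetical path: an environment providing promql_http_api,
    # numpy, pytz, pandas and datetime for read_prometheus.py.
    'export PROMPYTHON=/opt/prof-env/bin/python'
]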

Non-Python requirements

In addition to the Python requirements listed above, PyProfQueue also needs the following software to be available on the system to which the job will be submitted:

  • Prometheus
  • LIKWID

For Prometheus, it is enough to download the software, as long as Prometheus can be launched by the user without sudo rights. LIKWID needs to be installed or loaded in such a way that a user can run the following command without sudo rights:

likwid-perfctr -g MEM_DP -f <executable>

Adding new queue system compatibility

In order to add compatibility for a new queue system, refer to the block comments in the Script class in script.py. Each of the four sections that requires changes is marked with "Queue System specifics" followed by {1}, {2}, {3} or {4}.
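As a purely illustrative sketch (not the actual contents of script.py), one of those sections could resemble the following match-based translation of a generic option into the directive of a specific queue system; the systems, option names and directive formats below are assumptions used only to show where a new case would be added:

# Hypothetical sketch only, not code from PyProfQueue.
def option_directive(queue_system: str, option: str, value: str) -> str:
    match queue_system:
        case 'slurm':
            return f'#SBATCH --{option}={value}'
        case 'pbs':
            return f'#PBS -l {option}={value}'
        case _:
            # A new queue system would add its own case here.
            raise ValueError(f'Unknown queue system: {queue_system}')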
