aind-hpc-client

Utility methods to interface with the HPC. Auto-generated using OpenAPI tools.

Usage

from aind_hpc_client import ApiClient as Client
from aind_hpc_client import Configuration as Config
from aind_hpc_client.api.slurm_api import SlurmApi
from aind_hpc_client.models.v0036_job_submission import V0036JobSubmission
from aind_hpc_client.models.v0036_job_properties import V0036JobProperties

host = "http://slurm/api"
username = "*****"  # Change this
# Ideally, the password and access_token are set as secrets and read in using a secrets manager
password = "*****"  # Change this
access_token = "*****"  # Change this
config = Config(host=host, password=password, username=username, access_token=access_token)
slurm = SlurmApi(Client(config))
slurm.api_client.set_default_header(header_name='X-SLURM-USER-NAME', header_value=username)
slurm.api_client.set_default_header(header_name='X-SLURM-USER-PASSWORD', header_value=password)
slurm.api_client.set_default_header(header_name='X-SLURM-USER-TOKEN', header_value=access_token)

command_str = [
    "#!/bin/bash",
    "\necho",
    "'Hello World?'",
    "&&",
    "sleep",
    "120",
    "&&",
    "echo",
    "'Example json string'",
    "&&",
    "echo",
    "'",
    '{"input_source":"/path/to/directory","output_directory":"/path/to/another_directory"}',
    "'",
    "&&",
    "echo",
    "'Goodbye!'"
]
script = " ".join(command_str)

hpc_env = {
    "PATH": "/bin:/usr/bin/:/usr/local/bin/",
    "LD_LIBRARY_PATH": "/lib/:/lib64/:/usr/local/lib",
}

job_props = V0036JobProperties(
  partition = "aind",  # Change this if needed
  name = "test_job1",
  environment = hpc_env,
  standard_out = "/path/for/logs/test_job1.out",  # Change this
  standard_error = "/path/for/logs/test_job1_error.out",  # Change this
  memory_per_cpu = 500,
  tasks = 1,
  minimum_cpus_per_node = 1,
  nodes = [1, 1],
  time_limit = 5  # In minutes
)

job_submission = V0036JobSubmission(script=script, job=job_props)
submit_response = slurm.slurmctld_submit_job_0(v0036_job_submission=job_submission)
job_id = submit_response.job_id
job_response = slurm.slurmctld_get_job_0(job_id=submit_response.job_id)
print(job_response.jobs[0].job_state)
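As a side note, the script submitted above is just a single shell-script string. A hypothetical alternative (not part of this package's API) is to build it as a multi-line string instead of joining a token list; note that newline-separated commands run unconditionally, whereas the && chain above stops at the first failure:

```python
import textwrap

# Hypothetical alternative to joining command_str: write the batch script
# as a multi-line string. The first line must be the shebang.
script = textwrap.dedent("""\
    #!/bin/bash
    echo 'Hello World?'
    sleep 120
    echo 'Example json string'
    echo '{"input_source":"/path/to/directory","output_directory":"/path/to/another_directory"}'
    echo 'Goodbye!'
""")

print(script.splitlines()[0])  # -> #!/bin/bash
```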

Installation

The code is automatically generated using OpenAPI tools and the OpenAPI specification published by Slurm.

To get the specification from Slurm:

curl -s -H X-SLURM-USER-NAME:$SLURM_USER_NAME \
 -H X-SLURM-USER-PASSWORD:$SLURM_USER_PASSWORD \
 -H X-SLURM-USER-TOKEN:$SLURM_USER_TOKEN \
 -X GET 'http://slurm/api/openapi/v3' > openapi.json
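After downloading, a quick sanity check of openapi.json with the standard json module can catch a truncated or malformed download. The spec fragment below is illustrative only; the real document served by Slurm is far larger:

```python
import json

# Illustrative fragment only, not the real Slurm specification.
sample = '{"openapi": "3.0.2", "paths": {"/slurm/v0.0.36/job/submit": {"post": {}}}}'

spec = json.loads(sample)       # raises ValueError if the JSON is malformed
print(spec["openapi"])          # -> 3.0.2
print(sorted(spec["paths"]))    # the available endpoint paths
```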

Update schema

The original specification has some validation issues, so the output is modified. The changes are tracked in schema_changes.json.

To generate the Python code, OpenAPI Generator is used. Setting generateSourceCodeOnly to false in configs.json will also generate tests and additional files.

docker run --rm \
  -u "$(id -u):$(id -g)" \
  -v ${PWD}:/local openapitools/openapi-generator-cli generate \
  --skip-validate-spec \
  --config /local/configs.json \
  -i /local/openapi.json \
  -g python \
  -o /local/src
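The exact format of schema_changes.json is not shown here, but the idea of patching the specification before generation can be sketched as follows. The dotted-path patch format and the apply_changes helper are assumptions for illustration, not the file's real structure:

```python
# Hypothetical patch format: a mapping of dotted JSON paths to replacement
# values. The real schema_changes.json may use a different structure.
def apply_changes(spec: dict, changes: dict) -> dict:
    for dotted_path, new_value in changes.items():
        node = spec
        *parents, leaf = dotted_path.split(".")
        for key in parents:          # walk down to the parent object
            node = node[key]
        node[leaf] = new_value       # overwrite the leaf value in place
    return spec

spec = {"info": {"title": "Slurm Rest API", "version": "0.0.36"}}
patched = apply_changes(spec, {"info.version": "0.0.36-patched"})
print(patched["info"]["version"])    # -> 0.0.36-patched
```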

Contributing

If validation errors are raised, we can update the openapi.json specification accordingly.

Pull requests

For internal members, please create a branch. For external members, please fork the repository and open a pull request from the fork. We'll primarily use Angular style for commit messages. Roughly, they should follow the pattern:

<type>(<scope>): <short summary>

where scope (optional) describes the packages affected by the code changes and type (mandatory) is one of:

  • build: Changes that affect build tools or external dependencies (example scopes: pyproject.toml, setup.py)
  • ci: Changes to our CI configuration files and scripts (examples: .github/workflows/ci.yml)
  • docs: Documentation only changes
  • feat: A new feature
  • fix: A bugfix
  • perf: A code change that improves performance
  • refactor: A code change that neither fixes a bug nor adds a feature
  • test: Adding missing tests or correcting existing tests
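A rough regular-expression check of this commit-message pattern (a hypothetical helper, not part of this repository) looks like:

```python
import re

# Matches <type>(<scope>): <short summary>; the (<scope>) part is optional.
COMMIT_RE = re.compile(
    r"^(build|ci|docs|feat|fix|perf|refactor|test)"  # type (mandatory)
    r"(\([\w.\-/]+\))?"                              # (scope) (optional)
    r": .+"                                          # short summary
)

print(bool(COMMIT_RE.match("feat(pencil): add 'graphiteWidth' option")))  # -> True
print(bool(COMMIT_RE.match("Added a new feature")))                       # -> False
```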

Semantic Release

The table below, from semantic release, shows which commit message gets you which release type when semantic-release runs (using the default configuration):

  • fix(pencil): stop graphite breaking when too much pressure applied
    → Patch Fix Release (the default release type)
  • feat(pencil): add 'graphiteWidth' option
    → Minor Feature Release
  • perf(pencil): remove graphiteWidth option, with the footer:
    BREAKING CHANGE: The graphiteWidth option has been removed.
    The default graphite width of 10mm is always used for performance reasons.
    → Major Breaking Release

(Note that the BREAKING CHANGE: token must be in the footer of the commit.)

Project details

Download files

Source distribution: aind-hpc-client-0.0.2.tar.gz (178.9 kB), uploaded via twine/5.0.0 on CPython/3.12.2.

Hashes for aind-hpc-client-0.0.2.tar.gz:
  SHA256       c112841c1bcf0566843f002f49f97955fd643b9ecc904bc48828831c7a35a5d2
  MD5          58ccb8ffa34471d6b678536dd0f4a2be
  BLAKE2b-256  6834531549d903d2bd637dadf47852ae4e26cee5da6454b48a1fbef9a34529ac

Built distribution: aind_hpc_client-0.0.2-py3-none-any.whl (550.3 kB).

Hashes for aind_hpc_client-0.0.2-py3-none-any.whl:
  SHA256       41768ba20c795b57aecf2331ceeb1104cbea438e308fb83de3a7b37ffb1c709c
  MD5          6763fda02a058194d5a0fe9fb26d819a
  BLAKE2b-256  012572d7de59c8c023d6bb62023de528f6b2bb6a1451263f4e8acfdd4cdde2f2
