HKube Python Wrapper

The HKube Python wrapper provides a simple interface for integrating algorithms into HKube.

For general information on HKube, see hkube.io.

Installation

pip install hkube-python-wrapper

Download the latest hkubectl release:

curl -Lo hkubectl https://github.com/kube-HPC/hkubectl/releases/download/$(curl -s https://api.github.com/repos/kube-HPC/hkubectl/releases/latest | grep -oP '"tag_name": "\K(.*)(?=")')/hkubectl-linux \
&& chmod +x hkubectl \
&& sudo mv hkubectl /usr/local/bin/

For macOS, replace hkubectl-linux with hkubectl-macos.
For Windows, download hkubectl-win.exe.

Configure hkubectl to work with your running Kubernetes cluster:

hkubectl config # and follow the prompts

Basic Usage (using the HKube build feature)

Create a file for the algorithm entry points (alg.py):

from typing import Dict
from hkube_python_wrapper import Algorunner, HKubeApi
def start(args: Dict, hkubeApi: HKubeApi):
    return 1

Build the algorithm with hkubectl:

hkubectl algorithm apply algorithm-name --codePath ./folder_of_alg_py --codeEntryPoint alg.py --env python --setCurrent

Basic Usage (manual build)

from typing import Dict
from hkube_python_wrapper import Algorunner, HKubeApi
def start(args: Dict, hkubeApi: HKubeApi):
    return 1
if __name__ == "__main__":
    Algorunner.Run(start=start)

The start method accepts two arguments:

args: dict of the invocation input, with the following keys:

key              type    description
input            Array   algorithm input, as defined in the pipeline descriptor
jobId            string  the job ID of the pipeline run
taskId           string  the task ID of the algorithm invocation
nodeName         string  the name of the node in the pipeline descriptor
pipelineName     string  the name of the pipeline
batchIndex       int     for a batch instance, the index in the batch array
parentNodeName   string  for a child (code-api) algorithm, the name of the invoking node
info.rootJobId   string  for a sub-pipeline, the jobId of the invoking pipeline

hkubeApi: instance of HKubeApi for code-api operations
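
Putting the two arguments together, here is a minimal sketch of an entry point that reads a few of the documented args fields and returns a value as the node result (the field names follow the table above; the summing logic is purely illustrative and not part of the wrapper):

from typing import Dict
from hkube_python_wrapper import Algorunner, HKubeApi

def start(args: Dict, hkubeApi: HKubeApi):
    # fields documented in the table above; 'input' holds the node input array
    job_id = args.get('jobId')
    node_name = args.get('nodeName')
    values = args.get('input', [])
    print('running node {} in job {}'.format(node_name, job_id))
    # the returned value becomes this node's result in the pipeline
    return sum(v for v in values if isinstance(v, (int, float)))

if __name__ == "__main__":
    Algorunner.Run(start=start)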

Class HKubeApi


Method start_algorithm

def start_algorithm(
    self,
    algorithmName,
    input=[],
    includeResult=True,
    blocking=False
)

Starts an invocation of an algorithm with the given input, and optionally waits for the results.

Args

algorithmName : string : The name of the algorithm to start.

input : array : Optional input for the algorithm.

includeResult : bool : If True, returns the result of the algorithm execution. Default: True.

blocking : bool : If True, blocks until the algorithm finishes and returns the results. If False, returns an awaiter object that can be awaited (blocking) at a later time. Default: False.

Returns

If blocking is False, returns an awaiter; if True, returns the result of the algorithm.

Example:

hkubeApi.start_algorithm('some_algorithm',input=[3], blocking=True)
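
The non-blocking form fans out work: each call returns an awaiter immediately, and the results can be collected later. A hedged sketch, inside a start entry point, assuming the returned awaiter exposes a blocking get() accessor (this accessor is an assumption, not documented above; check the wrapper source for the exact awaiter interface):

# start several invocations in parallel without blocking
waiters = [
    hkubeApi.start_algorithm('some_algorithm', input=[i], blocking=False)
    for i in range(3)
]
# collect the results; get() on the awaiter is an assumption, not documented above
results = [waiter.get() for waiter in waiters]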

Method start_stored_subpipeline

def start_stored_subpipeline(
    self,
    name,
    flowInput={},
    includeResult=True,
    blocking=False
)

Starts an invocation of a stored sub-pipeline with the given flowInput, and optionally waits for the results.

Args

name : string : The name of the pipeline to start.

flowInput : dict : Optional flowInput for the pipeline.

includeResult : bool : If True, returns the result of the pipeline execution. Default: True.

blocking : bool : If True, blocks until the pipeline finishes and returns the results. If False, returns an awaiter object that can be awaited (blocking) at a later time. Default: False.

Returns

If blocking is False, returns an awaiter; if True, returns the result of the pipeline.

Example:

hkubeApi.start_stored_subpipeline('simple',flowInput={'foo':3},blocking=True)

Method start_raw_subpipeline

def start_raw_subpipeline(
    self,
    name,
    nodes,
    flowInput,
    options={},
    webhooks={},
    includeResult=True,
    blocking=False
)

Starts an invocation of a raw sub-pipeline with the given nodes, flowInput, and options, and optionally waits for the results.

Args

name : string : The name of the pipeline to start.

nodes : array : Array of node definitions. See the example below.

flowInput : dict : FlowInput for the pipeline.

options : dict : Pipeline options (like in the pipeline descriptor).

webhooks : dict : Webhook options (like in the pipeline descriptor).

includeResult : bool : If True, returns the result of the pipeline execution. Default: True.

blocking : bool : If True, blocks until the pipeline finishes and returns the results. If False, returns an awaiter object that can be awaited (blocking) at a later time. Default: False.

Returns

If blocking is False, returns an awaiter; if True, returns the result of the pipeline.

Example:

nodes=[{'nodeName': 'd1', 'algorithmName': 'green-alg', 'input': ['@flowInput.foo']}]
flowInput={'foo':3}
hkubeApi.start_raw_subpipeline('ddd',nodes, flowInput,webhooks={}, options={}, blocking=True)

Project details


Version: 2.4.2

Download files


Source Distribution

hkube_python_wrapper-2.4.2.tar.gz (45.6 kB)

Built Distribution

hkube_python_wrapper-2.4.2-py2.py3-none-any.whl (69.7 kB)

File details

Details for the file hkube_python_wrapper-2.4.2.tar.gz.

File metadata

  • Download URL: hkube_python_wrapper-2.4.2.tar.gz
  • Upload date:
  • Size: 45.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.9.16

File hashes

Hashes for hkube_python_wrapper-2.4.2.tar.gz
Algorithm Hash digest
SHA256 6c8c54d8624fc4f7bbbd686fb1ee7bbb264b4f3a145ff064ca62dd6f6e275b23
MD5 4d287780c6996171bff8014af9676399
BLAKE2b-256 a11d981075a575145e730fa7b586bb32c95c7f26f38190fcd0f9d8ff59668b8b


File details

Details for the file hkube_python_wrapper-2.4.2-py2.py3-none-any.whl.

File hashes

Hashes for hkube_python_wrapper-2.4.2-py2.py3-none-any.whl
Algorithm Hash digest
SHA256 6d3f81f84a8a14f9f2d251e34254c30706ed2307da1ececccd49cfc1aefe2608
MD5 6c7f57e954af17ed20f90209cf290459
BLAKE2b-256 8d51214ecc3758ec5c8d33eb8a277c223b56e568619d6bbc33bc239556711675

