
HKube Python Wrapper


The HKube Python wrapper provides a simple interface for integrating an algorithm into HKube.

For general information on HKube, see hkube.io.

Installation

pip install hkube-python-wrapper

Download the latest version of hkubectl:

curl -Lo hkubectl https://github.com/kube-HPC/hkubectl/releases/download/$(curl -s https://api.github.com/repos/kube-HPC/hkubectl/releases/latest | grep -oP '"tag_name": "\K(.*)(?=")')/hkubectl-linux \
&& chmod +x hkubectl \
&& sudo mv hkubectl /usr/local/bin/

For macOS, replace hkubectl-linux with hkubectl-macos.
For Windows, download hkubectl-win.exe.

Configure hkubectl against your running Kubernetes cluster.

hkubectl config # and follow the prompts

Basic Usage (using hkube build feature)

Create a file containing the algorithm entry point (alg.py):

from typing import Dict
from hkube_python_wrapper import Algorunner, HKubeApi

# start() is the algorithm entry point invoked by HKube
def start(args: Dict, hkubeApi: HKubeApi):
    return 1

Build the algorithm with hkubectl:

hkubectl algorithm apply algorithm-name --codePath ./folder_of_alg_py --codeEntryPoint alg.py --env python --setCurrent

Basic Usage (manual build)

from typing import Dict
from hkube_python_wrapper import Algorunner, HKubeApi

# start() is the algorithm entry point invoked by HKube
def start(args: Dict, hkubeApi: HKubeApi):
    return 1

if __name__ == "__main__":
    # Bootstraps the wrapper and registers start() as the entry point
    Algorunner.Run(start=start)

The start method accepts two arguments:

args: dict of invocation input

key             type    description
input           Array   Algorithm input as defined in the pipeline descriptor
jobId           string  The job ID of the pipeline run
taskId          string  The task ID of the algorithm invocation
nodeName        string  The name of the node in the pipeline descriptor
pipelineName    string  The name of the pipeline
batchIndex      int     For a batch instance, the index in the batch array
parentNodeName  string  For a child (code-api) algorithm, the name of the invoking node
info.rootJobId  string  For a sub-pipeline, the jobId of the invoking pipeline

hkubeApi: instance of HKubeApi for code-api operations
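As a minimal sketch, a start method might read the fields listed in the table above from args. The field names follow the table; outside a real HKube build the hkubeApi argument is unused, so it defaults to None here:

```python
# Sketch of a start() entry point that reads the documented invocation fields.
# The return value is illustrative, not a required shape.

def start(args, hkubeApi=None):
    # 'input' holds the algorithm input as defined in the pipeline descriptor
    algorithm_input = args.get('input', [])
    job_id = args.get('jobId')
    node_name = args.get('nodeName')
    # Present only for batch instances
    batch_index = args.get('batchIndex')
    return {
        'received': algorithm_input,
        'jobId': job_id,
        'nodeName': node_name,
        'batchIndex': batch_index,
    }
```

Using .get() keeps the function tolerant of fields that are only present in some invocation contexts (for example, batchIndex outside a batch).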

Class HKubeApi


Method start_algorithm

def start_algorithm(
    self,
    algorithmName,
    input=[],
    includeResult=True,
    blocking=False
)

Starts an invocation of an algorithm with the given input, and optionally waits for the results.

Args

algorithmName : string : The name of the algorithm to start.

input : array : Optional input for the algorithm. Default: [].

includeResult : bool : If True, returns the result of the algorithm execution. Default: True.

blocking : bool : If True, blocks until the algorithm finishes and returns the results. If False, returns an awaiter object that can be awaited (blocking) at a later time. Default: False.

Returns

If blocking==False, returns an awaiter; if True, returns the result of the algorithm.

Example:

hkubeApi.start_algorithm('some_algorithm', input=[3], blocking=True)

Method start_stored_subpipeline

def start_stored_subpipeline(
    self,
    name,
    flowInput={},
    includeResult=True,
    blocking=False
)

Starts an invocation of a stored sub-pipeline with the given flowInput, and optionally waits for the results.

Args

name : string : The name of the pipeline to start.

flowInput : dict : Optional flowInput for the pipeline. Default: {}.

includeResult : bool : If True, returns the result of the pipeline execution. Default: True.

blocking : bool : If True, blocks until the pipeline finishes and returns the results. If False, returns an awaiter object that can be awaited (blocking) at a later time. Default: False.

Returns

If blocking==False, returns an awaiter; if True, returns the result of the pipeline.

Example:

hkubeApi.start_stored_subpipeline('simple', flowInput={'foo': 3}, blocking=True)

Method start_raw_subpipeline

def start_raw_subpipeline(
    self,
    name,
    nodes,
    flowInput,
    options={},
    webhooks={},
    includeResult=True,
    blocking=False
)

Starts an invocation of a raw sub-pipeline with the given nodes, flowInput, and options, and optionally waits for the results.

Args

name : string : The name of the pipeline to start.

nodes : array : Array of node definitions. See the example below.

flowInput : dict : FlowInput for the pipeline.

options : dict : Pipeline options (as in the pipeline descriptor). Default: {}.

webhooks : dict : Webhook options (as in the pipeline descriptor). Default: {}.

includeResult : bool : If True, returns the result of the pipeline execution. Default: True.

blocking : bool : If True, blocks until the pipeline finishes and returns the results. If False, returns an awaiter object that can be awaited (blocking) at a later time. Default: False.

Returns

If blocking==False, returns an awaiter; if True, returns the result of the pipeline.

Example:

nodes = [{'nodeName': 'd1', 'algorithmName': 'green-alg', 'input': ['@flowInput.foo']}]
flowInput = {'foo': 3}
hkubeApi.start_raw_subpipeline('ddd', nodes, flowInput, webhooks={}, options={}, blocking=True)
