
A Python package to get details from OceanProtocol jobs


Installation

pip install oceanprotocol-job-details

Usage

As a simple library, we only need to import JobDetails and load it. Loading will:

  1. Populate the JobDetails instance from environment variables, or from values passed to the load() method.
  2. Look up the files corresponding to the passed DIDs in the filesystem, following the Ocean Protocol structure, and load them into the JobDetails instance.
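
Step 1 amounts to an "explicit value wins, environment as fallback" lookup. The sketch below only illustrates that pattern; resolve is a hypothetical helper, not part of the oceanprotocol-job-details API:

```python
import os
from typing import Optional

def resolve(name: str, override: Optional[str] = None) -> Optional[str]:
    # Prefer an explicitly passed value; otherwise fall back to the
    # environment variable of the same name. Hypothetical helper used
    # only to illustrate step 1 above.
    return override if override is not None else os.environ.get(name)

os.environ["DIDS"] = '["17feb...e42"]'  # as the runtime environment might set it
print(resolve("DIDS"))        # falls back to the environment variable
print(resolve("DIDS", "[]"))  # an explicitly passed value wins
```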

Minimal Example

from oceanprotocol_job_details import JobDetails

job_details = JobDetails.load()

Custom Input Parameters

If our algorithm takes custom input parameters, we can load them as follows:

from dataclasses import dataclass
from oceanprotocol_job_details import JobDetails


@dataclass
class InputParameters:
    foobar: str


job_details = JobDetails[InputParameters].load(InputParameters)

# Usage
job_details.input_parameters.foobar

Nested dataclasses are also supported:

from dataclasses import dataclass
from oceanprotocol_job_details import JobDetails


@dataclass
class Foo:
    bar: str


@dataclass
class InputParameters:
    # Allows for nested types
    foo: Foo


job_details = JobDetails[InputParameters].load(InputParameters)

# Usage
job_details.input_parameters.foo.bar

The values to fill the custom InputParameters will be parsed from the algoCustomData.json located next to the input data directories.
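
For the nested InputParameters above, a matching algoCustomData.json could look like this (illustrative content; the value is made up):

```json
{
    "foo": {
        "bar": "some value"
    }
}
```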

Iterating Input Files the Clean Way

from oceanprotocol_job_details import JobDetails


job_details = JobDetails.load()

for idx, file_path in job_details.next_file():
    ...

# Or, to get just the first file path from the iterator
_, file_path = next(job_details.next_file())
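
Conceptually, this iteration walks the inputs directory described under OceanProtocol Structure below. The plain-pathlib sketch here mimics the shape of next_file() but is hypothetical, not the library's actual implementation:

```python
import tempfile
from pathlib import Path

def iter_input_files(root: Path):
    # Yield (index, path) pairs for each data file under inputs/<did>/,
    # mirroring the shape of JobDetails.next_file() (illustrative only).
    for did_dir in sorted(p for p in (root / "inputs").iterdir() if p.is_dir()):
        for data_file in sorted(did_dir.iterdir()):
            yield int(data_file.name), data_file

# Build a toy /data tree with one dataset directory and one file named 0
root = Path(tempfile.mkdtemp())
did_dir = root / "inputs" / "exampledid"  # placeholder DID directory name
did_dir.mkdir(parents=True)
(did_dir / "0").write_text("payload")

files = list(iter_input_files(root))
print(files)  # one (index, path) pair for the single input file
```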

OceanProtocol Structure

data        # Root /data directory
├── ddos    # Contains the loaded dataset's DDO
│   ├── 17feb...e42 # DDO file
│   └── ...         # One DDO per loaded dataset
├── inputs  # Datasets dir
│   ├── 17feb...e42 # Dir holding the data of its name DID, contains files named 0..X
│   │   └── 0       # Data file
│   └── algoCustomData.json # Custom algorithm input data
├── logs    # Algorithm output logs dir
└── outputs # Algorithm output files dir

Note: Even though an algorithm may be passed multiple datasets, the current implementation only supports one dataset per algorithm execution, so the executing job will normally have one DDO, one directory inside inputs, and one data file named 0.
