A Python package to get details from OceanProtocol jobs


Installation

pip install oceanprotocol-job-details

or, with uv:

uv add oceanprotocol-job-details

Usage

As a library, you only need to import load_job_details and call it. It will:

  1. Populate the JobDetails instance from environment variables, or from the values passed to the function.
  2. Locate the files for the given DIDs in the filesystem according to the Ocean Protocol structure (see below) and load them into the JobDetails instance.

Minimal Example

from pydantic import BaseModel

from oceanprotocol_job_details import load_job_details


class InputParameters(BaseModel): ...


job_details = load_job_details({}, InputParameters)

Custom Input Parameters

If the algorithm takes custom input parameters, they can be declared and loaded as follows:

from pydantic import BaseModel
from oceanprotocol_job_details import load_job_details


class Foo(BaseModel):
    bar: str


class InputParameters(BaseModel):
    # Allows for nested types
    foo: Foo


job_details = load_job_details({}, InputParameters)

# Usage
job_details.input_parameters.foo
job_details.input_parameters.foo.bar

The values for the custom InputParameters are parsed from the algoCustomData.json file located next to the input data directories.
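For illustration, the validation step can be sketched with pydantic directly. This assumes pydantic v2 (model_validate_json); the raw JSON below is a hypothetical algoCustomData.json payload matching the model above, not a real job's data:

```python
from pydantic import BaseModel


class Foo(BaseModel):
    bar: str


class InputParameters(BaseModel):
    foo: Foo


# Hypothetical algoCustomData.json contents for the model above.
raw = '{"foo": {"bar": "hello"}}'

# load_job_details presumably performs an equivalent validation step
# when it reads algoCustomData.json.
params = InputParameters.model_validate_json(raw)
print(params.foo.bar)  # hello
```

If the JSON does not match the declared model, pydantic raises a ValidationError, so malformed custom data fails loudly rather than silently.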

Iterating Input Files

from oceanprotocol_job_details import load_job_details


# InputParameters as defined above
job_details = load_job_details({}, InputParameters)

# Iterate over all (index, file path) pairs
for idx, file_path in job_details.inputs():
    ...

# Or take only the first input file
_, file_path = next(job_details.inputs())

OceanProtocol Structure

data        # Root /data directory
├── ddos    # Contains the loaded datasets' DDOs
│   ├── 17feb...e42 # DDO file
│   └── ... # One DDO per loaded dataset
├── inputs  # Datasets dir
│   ├── 17feb...e42 # Dir holding the data of its name DID, contains files named 0..X
│   │   └── 0 # Data file
│   └── algoCustomData.json # Custom algorithm input data
├── logs    # Algorithm output logs dir
└── outputs # Algorithm output files dir

Note: Even though an algorithm may be passed multiple datasets, the current implementation supports only one dataset per execution, so an executing job will normally have one DDO, one directory inside inputs, and one data file named 0.
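As a self-contained sketch (no Ocean Protocol runtime required), the layout above can be recreated and scanned with pathlib. The DID directory name is the placeholder from the tree, and the enumeration mirrors what job_details.inputs() is expected to yield; it is an illustration of the structure, not the library's actual implementation:

```python
import tempfile
from pathlib import Path

# Recreate the documented /data layout in a temporary directory.
# "17feb...e42" is the placeholder DID name from the tree above.
root = Path(tempfile.mkdtemp())
did_dir = root / "inputs" / "17feb...e42"
did_dir.mkdir(parents=True)
(did_dir / "0").write_text("dataset contents")
(root / "inputs" / "algoCustomData.json").write_text('{"foo": {"bar": "hi"}}')

# Enumerate (index, path) pairs for each dataset directory's files,
# skipping algoCustomData.json, which sits next to the DID dirs.
input_files = [
    (int(f.name), f)
    for d in (root / "inputs").iterdir() if d.is_dir()
    for f in sorted(d.iterdir())
]
print(input_files[0][0])  # 0
```

With a single dataset, the list holds exactly one entry: index 0 pointing at the file named 0 inside the DID directory.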
