OceanProtocol Job Details

A Python package to get details from OceanProtocol jobs

Installation

pip install oceanprotocol-job-details
# or
uv add oceanprotocol-job-details

Usage

As a simple library, we only need to import load_job_details and call it. It will:

  1. Read the needed parameters from disk to populate the JobDetails from the given base_dir, looking up the files corresponding to the passed DIDs in the filesystem according to the Ocean Protocol structure.
  2. If given an InputParameters type that inherits from pydantic.BaseModel, create an instance of it from the environment variables.

Minimal Example

from oceanprotocol_job_details import load_job_details

job_details = load_job_details({"base_dir": "...", "transformation_did": "..."})

Custom Input Parameters

If our algorithm takes custom input parameters, we can load them as follows:

from pydantic import BaseModel
from oceanprotocol_job_details import load_job_details


class Foo(BaseModel):
    bar: str


class InputParameters(BaseModel):
    # Allows for nested types
    foo: Foo


job_details = load_job_details({"base_dir": "...", "transformation_did": "..."}, InputParameters)

# Usage
parameters = await job_details.input_parameters()
parameters.foo
parameters.foo.bar

The values for the custom InputParameters are parsed from the algoCustomData.json file located next to the input data directories.
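For the InputParameters model defined above, a matching algoCustomData.json would look roughly like this (a sketch: the keys simply mirror the model field names, and the value "hello" is made up for illustration):

```python
import json

# Hypothetical contents of algoCustomData.json for the nested
# InputParameters/Foo models shown above; keys mirror the field names.
algo_custom_data = '{"foo": {"bar": "hello"}}'

# Pydantic would validate this into InputParameters; here we just parse it
# with the standard library to show the expected shape.
data = json.loads(algo_custom_data)
print(data["foo"]["bar"])  # -> hello
```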

Iterating Input Files the Clean Way

from oceanprotocol_job_details import load_job_details


job_details = load_job_details(...)

for idx, file_path in job_details.inputs():
    ...

_, file_path = next(job_details.inputs())

OceanProtocol Structure

data        # Root /data directory
├── ddos    # Contains the loaded dataset's DDO (metadata)
│   ├── 17feb...e42 # DDO file
│   └── ... # One DDO per loaded dataset
├── inputs  # Datasets dir
│   ├── 17feb...e42 # Dir holding the data of its name DID, contains files named 0..X
│   │   └── 0 # Data file
│   └── algoCustomData.json # Custom algorithm input data
├── logs    # Algorithm output logs dir
└── outputs # Algorithm output files dir

Note: Even though an algorithm can be passed multiple datasets, the current implementation only supports one dataset per algorithm execution, so the executing job will normally have only one DDO, one directory inside inputs, and one data file named 0.
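Under that one-dataset assumption, locating the single data file by hand is straightforward. A minimal sketch using only the standard library (the helper first_input_file and the DID directory name are hypothetical, not part of the package):

```python
import tempfile
from pathlib import Path


def first_input_file(base_dir: str) -> Path:
    """Return the single data file under base_dir/inputs/<did>/0.

    Sketch only: assumes the one-dataset layout described above and
    skips algoCustomData.json, which sits next to the DID directory.
    """
    inputs = Path(base_dir) / "inputs"
    did_dirs = sorted(p for p in inputs.iterdir() if p.is_dir())
    return did_dirs[0] / "0"


# Demo against a throwaway directory mimicking the layout above.
with tempfile.TemporaryDirectory() as base:
    data_file = Path(base) / "inputs" / "did_example" / "0"
    data_file.parent.mkdir(parents=True)
    data_file.write_text("payload")
    result = first_input_file(base).read_text()
print(result)  # -> payload
```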


