OceanProtocol Job Details
A Python package to get details from OceanProtocol jobs
Installation
pip install oceanprotocol-job-details
#or
uv add oceanprotocol-job-details
Usage
As a simple library, we only need to import `load_job_details` and run it. It will:
- Read the needed parameters from disk to populate the `JobDetails` from the given `base_dir`, looking up the files for the passed DIDs in the filesystem according to the Ocean Protocol structure.
- If given an `InputParameters` type that inherits from `pydantic.BaseModel`, create an instance of it from the environment variables.
Minimal Example
from oceanprotocol_job_details import load_job_details
job_details = load_job_details({"base_dir": "...", "transformation_did": "..."})
Custom Input Parameters
If our algorithm takes custom input parameters and we want to load them, we can do so as follows:
from pydantic import BaseModel
from oceanprotocol_job_details import load_job_details
class Foo(BaseModel):
bar: str
class InputParameters(BaseModel):
# Allows for nested types
foo: Foo
job_details = load_job_details({"base_dir": "...", "transformation_did": "..."}, InputParameters)
# Usage
parameters = await job_details.input_parameters()
parameters.foo
parameters.foo.bar
The values used to fill the custom `InputParameters` are parsed from the `algoCustomData.json` file located next to the input data directories.
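For the `InputParameters` model above, `algoCustomData.json` would contain matching keys. A hypothetical example (the values are illustrative, not taken from a real job):

```json
{
  "foo": {
    "bar": "hello"
  }
}
```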
Iterating Input Files the Clean Way
from oceanprotocol_job_details import load_job_details
job_details = load_job_details(...)
for idx, file_path in job_details.inputs():
...
_, file_path = next(job_details.inputs())
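The snippet above suggests that `inputs()` yields `(index, path)` pairs. As an illustration of that pattern only, here is a stand-in iterator over the documented `inputs/` layout; this is a sketch of the inferred behavior, not the library's actual implementation:

```python
from pathlib import Path
from tempfile import mkdtemp
from typing import Iterator, Tuple

def iter_input_files(inputs_dir: Path) -> Iterator[Tuple[int, Path]]:
    """Yield (index, path) for each data file under inputs/<did>/.

    Stand-in for job_details.inputs(); behavior inferred from the
    snippet above, not taken from the library's source.
    """
    did_dirs = sorted(p for p in inputs_dir.iterdir() if p.is_dir())
    for did_dir in did_dirs:
        # Data files are named 0..X inside each DID directory
        for idx, data_file in enumerate(sorted(did_dir.iterdir())):
            yield idx, data_file

# Demo on a throwaway layout with one hypothetical DID directory
inputs_dir = Path(mkdtemp())
(inputs_dir / "exampledid").mkdir()
(inputs_dir / "exampledid" / "0").write_text("data")

pairs = list(iter_input_files(inputs_dir))
print(pairs[0])  # (0, .../exampledid/0)
```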
OceanProtocol Structure
data # Root /data directory
├── ddos # Contains the loaded dataset's DDO (metadata)
│ ├── 17feb...e42 # DDO file
│ └── ... # One DDO per loaded dataset
├── inputs # Datasets dir
│ ├── 17feb...e42 # Dir named after its dataset's DID; contains files named 0..X
│ │ └── 0 # Data file
│ └── algoCustomData.json # Custom algorithm input data
├── logs # Algorithm output logs dir
└── outputs # Algorithm output files dir
Note: Even though the algorithm may be passed multiple datasets, the current implementation only allows using one dataset per algorithm execution, so the executing job will normally have only one DDO, one dir inside inputs, and one data file named 0.
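The note above can be made concrete with a short `pathlib` sketch that rebuilds this layout in a temporary directory and locates the single data file and the custom-data JSON. The directory name and file contents are hypothetical placeholders, not real DIDs:

```python
import json
import tempfile
from pathlib import Path

# Hypothetical DID directory name; a real job uses the dataset's DID hash
did = "exampledid"

# Recreate the documented /data layout in a temporary directory
base = Path(tempfile.mkdtemp())
(base / "ddos").mkdir()
(base / "ddos" / did).write_text("{}")  # DDO (metadata) file
(base / "inputs" / did).mkdir(parents=True)
(base / "inputs" / did / "0").write_text("dataset contents")  # the single data file
(base / "inputs" / "algoCustomData.json").write_text(json.dumps({"foo": {"bar": "baz"}}))
(base / "logs").mkdir()
(base / "outputs").mkdir()

# Locate the single data file: the one directory under inputs/, file "0" inside it
did_dir = next(p for p in (base / "inputs").iterdir() if p.is_dir())
data_file = did_dir / "0"
print(data_file.read_text())  # -> dataset contents

# The custom parameters sit next to the input data directories
custom = json.loads((base / "inputs" / "algoCustomData.json").read_text())
print(custom["foo"]["bar"])  # -> baz
```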
File details
Details for the file oceanprotocol_job_details-0.4.5.tar.gz.
File metadata
- Download URL: oceanprotocol_job_details-0.4.5.tar.gz
- Upload date:
- Size: 9.0 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.0.1 CPython/3.12.8
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 821d0b88c210344781b3dc1d100658c399e30cac747638807f2a61c5dbf3ef88 |
| MD5 | 5be2c928f56141cdc1de009a64b53d8d |
| BLAKE2b-256 | 3f70b89167e82d40a1bec53d455c175997e372a51ec740cdb5248d14b2273349 |
File details
Details for the file oceanprotocol_job_details-0.4.5-py3-none-any.whl.
File metadata
- Download URL: oceanprotocol_job_details-0.4.5-py3-none-any.whl
- Upload date:
- Size: 15.8 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.0.1 CPython/3.12.8
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 99085ad1772ba36e37e7d0bd11e2ff6fc1878a5289adbb58b2f859a273475e51 |
| MD5 | c4a2e0ee22daa772566c3bfd23bba5f0 |
| BLAKE2b-256 | b71f64a75452aff5936ce89b2b9999a4eeb21eab892c40f20b70d90df55295d9 |