osiris-sdk

Python SDK for Osiris (Energinet DataPlatform).

Installing

$ pip install osiris-sdk

The SDK requires Python 3.
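
You can verify your interpreter version before installing (this assumes python resolves to your Python 3 installation):

$ python --version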

Getting Started

To get started with the SDK you will need the URL of the Osiris Ingress API and the tenant ID of the organisation that runs the API. Furthermore, you will need to register your application within the tenant using Azure App Registration. You will also need to create a dataset in the DataPlatform.

Data application registration

An App Registration with credentials is required to upload data to the DataPlatform through the Osiris Ingress API.

Prerequisites

  • The dataset has been created through the Data Platform.
  • The Azure CLI is installed on your workstation.

Steps

Log in with the Azure CLI using the following command:

az login

You can also specify a username and password with:

az login -u <username> -p <password>

Create an App Registration

The App Registration serves as a registration of trust for your application (or data publishing service) towards the Microsoft Identity Platform (allowing authentication).

This is the "identity" of your application. Note that an App Registration is globally unique.

Run the following command:

az ad app create --display-name "<YOUR APP NAME>"

The application name should be descriptive and correlate to the application/service you intend to upload data with.

Take note of the appId GUID in the returned object.
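
If you only need the appId, the Azure CLI's global --query and --output arguments can extract it directly (an optional convenience):

az ad app create --display-name "<YOUR APP NAME>" --query appId --output tsv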

Create a Service Principal and credentials

The Service Principal and credentials are what enable authorization to the Data Platform.

Create a Service Principal using the appId GUID from when creating the App Registration:

az ad sp create --id "<appID>"

Then create a credential for the App Registration:

az ad app credential reset --id "<appID>"

NOTE: Save the output somewhere secure. The credentials you receive are required to authenticate with the Osiris Ingress API.
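
The returned JSON looks roughly like the following; the exact fields vary with the Azure CLI version, and the values here are placeholders:

{
  "appId": "<appID>",
  "password": "<generated client secret>",
  "tenant": "<TENANT_ID>"
}

The password is the client secret you will use to authenticate against the Osiris Ingress API.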

Grant access to the dataset

The application must be granted read and write access to the dataset on the Data Platform.

Add the application you created earlier, using its <YOUR APP NAME> name, to the read- and write-access lists.

Usage

Here are some simple examples of how to use the SDK.

Upload

The following is a simple example which shows how you can upload files using the Osiris SDK:

from osiris.apis.ingress import Ingress

ingress = Ingress(ingress_url=<INGRESS_URL>,
                  tenant_id=<TENANT_ID>,
                  client_id=<CLIENT_ID>,
                  client_secret=<CLIENT_SECRET>,
                  dataset_guid=<DATASET_GUID>)

# Each upload reads the file stream to the end, so open a fresh handle
# (or seek back to the start) for every call.

# Without schema validation and a JSON file
with open('test_file.json', 'rb') as file:
    ingress.upload_json_file(file, False)

# With schema validation and a JSON file
with open('test_file.json', 'rb') as file:
    ingress.upload_json_file(file, True)

# Arbitrary file
with open('test_file.json', 'rb') as file:
    ingress.upload_file(file)

# Save state file
with open('state.json', 'r') as state:
    ingress.save_state(state)

# Retrieve state
state = ingress.retrieve_state()
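
The state file is typically used to track how far an ingestion has progressed between runs. A minimal sketch of that pattern, assuming retrieve_state returns the previously saved state as a dict and using a hypothetical last_processed key:

import json

state = ingress.retrieve_state()              # assumption: returns the saved JSON state as a dict
last_processed = state.get('last_processed')  # hypothetical key tracking ingestion progress

# ... upload everything newer than last_processed ...

state['last_processed'] = '2021-04-08T12:00:00Z'
with open('state.json', 'w') as state_file:
    json.dump(state, state_file)
with open('state.json', 'r') as state_file:
    ingress.save_state(state_file)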

Download

The following is a simple example which shows how you can download files using the Osiris SDK:

from datetime import date, datetime

from osiris.apis.egress import Egress

egress = Egress(egress_url=<EGRESS_URL>,
                tenant_id=<TENANT_ID>,
                client_id=<CLIENT_ID>,
                client_secret=<CLIENT_SECRET>,
                dataset_guid=<DATASET_GUID>)

# JSON file
file_date: date = datetime.utcnow().date()
content_json = egress.download_json_file(file_date)

# Arbitrary file
content_arbitrary = egress.download_file(file_date)
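
Since files are addressed by date, fetching a range of days is just a loop over the call shown above (a sketch, not separate SDK functionality):

from datetime import date, timedelta

# Download the JSON files for the last 7 days
for offset in range(7):
    file_date = date.today() - timedelta(days=offset)
    content = egress.download_json_file(file_date)
    # process the content for this day here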

Time series pipeline

The following is a simple example which shows how you can create a time series pipeline.

import datetime

from osiris.pipelines.pipeline_timeseries import PipelineTimeSeries

pipeline = PipelineTimeSeries(storage_account_url=<AZURE_STORAGE_ACCOUNT_URL>,
                              filesystem_name=<CONTAINER_NAME>,
                              tenant_id=<TENANT_ID>,
                              client_id=<CLIENT_ID>,
                              client_secret=<CLIENT_SECRET>,
                              source_dataset_guid=<DATASET_GUID>,
                              destination_dataset_guid=<DATASET_GUID>,
                              date_format=<DATE_FORMAT_FOR_DATE_FIELD>,  # Example: "%Y-%m-%dT%H:%M:%S.%fZ"
                              date_key_name=<FIELD_NAME_FOR_EVENT_TIME>)

# Running the pipeline with current time
pipeline.transform_ingest_time_to_event_time_daily()

# Running the pipeline with specific time
ingest_time = datetime.datetime(2021, 4, 8, 12, 0, 0)  # April 8th, 2021 at 12:00:00
pipeline.transform_ingest_time_to_event_time_daily(ingest_time=ingest_time)
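
Under a scheduler you would normally invoke the pipeline once per ingest period. Backfilling a range of days is then just a loop over the same call (a sketch reusing the pipeline and import above):

# Re-run the daily transformation for the first 8 days of April 2021
start = datetime.datetime(2021, 4, 1)
for day in range(8):
    pipeline.transform_ingest_time_to_event_time_daily(ingest_time=start + datetime.timedelta(days=day))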

Data conversion pipeline

This is an example of using the data conversion classes to transform structured data from one format to another.

import datetime

from osiris.pipelines.pipeline_conversion import PipelineConversion

pipeline = PipelineConversion(storage_account_url=<AZURE_STORAGE_ACCOUNT_URL>,
                              filesystem_name=<CONTAINER_NAME>,
                              tenant_id=<TENANT_ID>,
                              client_id=<CLIENT_ID>,
                              client_secret=<CLIENT_SECRET>,
                              source_dataset_guid=<DATASET_GUID>,
                              destination_dataset_guid=<DATASET_GUID>)

# Running the pipeline with current time, using method defaults
pipeline.transform_convert_csv_to_json()

# Running the pipeline with specific time and tab as CSV separator
ingest_time = datetime.datetime(2021, 4, 8, 12, 0, 0)  # April 8th, 2021 at 12:00:00
pipeline.transform_convert_csv_to_json(ingest_time=ingest_time, separator='\t')
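
Many European CSV exports use a semicolon rather than a comma; the separator parameter shown above covers that case as well:

pipeline.transform_convert_csv_to_json(separator=';')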

Ingress Adapter

The following is a simple example which shows how you can create a new ingress adapter.

import json
from osiris.adapters.ingress_adapter import IngressAdapter


class MyAdapter(IngressAdapter):
    # retrieve_data supplies the bytes that upload_json_data/upload_data will send
    def retrieve_data(self) -> bytes:
        return json.dumps('Hello World').encode('UTF-8')


def main():
    adapter = MyAdapter(ingress_url=<INGRESS_URL>,
                        tenant_id=<TENANT_ID>,
                        client_id=<CLIENT_ID>,
                        client_secret=<CLIENT_SECRET>,
                        dataset_guid=<DATASET_GUID>)

    # as json data
    adapter.upload_json_data(schema_validate=False)

    # or as arbitrary data
    adapter.upload_data('bin')


if __name__ == '__main__':
    main()
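
A more realistic adapter follows the same shape. The sketch below pulls JSON from a hypothetical HTTP endpoint using the third-party requests library; the endpoint URL and payload are assumptions, not part of the SDK:

import json

import requests

from osiris.adapters.ingress_adapter import IngressAdapter


class RestAdapter(IngressAdapter):
    def retrieve_data(self) -> bytes:
        # Hypothetical endpoint; replace with your actual data source
        response = requests.get('https://example.com/api/measurements')
        response.raise_for_status()
        return json.dumps(response.json()).encode('UTF-8')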
