
Horizon Data Core SDK

The Horizon Data Core SDK provides a simple interface for working with both PostgreSQL and Iceberg tables in the Horizon system.

Getting started

Install the SDK:

uv add horizon-data-core

Features

  • Pydantic BaseModels: Type-safe data models for all entities
  • PostgreSQL Operations: Full CRUD operations for PostgreSQL tables
  • Iceberg Operations: Write and read operations for Iceberg tables
  • Automatic Conversion: Seamless conversion between Pydantic models and SQLAlchemy ORM models

Quick Start

1. Initialize the SDK

from horizon_data_core.api import initialize_sdk
from horizon_data_core.client import PostgresClient
from pyiceberg.catalog import load_catalog
from uuid import uuid4

# This should come from your user context
organization_id = uuid4()

# Set up PostgreSQL client
postgres_client = PostgresClient(
    organization_id=organization_id,
    user="postgres",
    passwd="password",
    host="localhost",
    port=5432,
    database_name="horizon",
)

# Set up Iceberg catalog
iceberg_catalog = load_catalog("rest", uri="http://localhost:8181")

# Initialize the SDK with the same organization_id
initialize_sdk(postgres_client, iceberg_catalog, organization_id)

2. Working with PostgreSQL Tables

from horizon_data_core.api import create_platform, read_platform, update_platform, delete_platform, list_platforms
from horizon_data_core.base_types import Platform
from uuid import uuid4
from datetime import datetime

# Create a new platform
platform = Platform(
    id=uuid4(),
    name="My Platform",
    kind_id=uuid4(),
    # organization_id will be automatically set by the SDK
)
created_platform = create_platform(platform)

# Read the platform
retrieved_platform = read_platform(created_platform.id)

# Update the platform
retrieved_platform.name = "Updated Platform Name"
updated_platform = update_platform(retrieved_platform)

# List platforms with filters (organization_id is automatically applied)
platforms = list_platforms()

# Delete the platform
delete_platform(updated_platform.id)

3. Working with Iceberg Tables

from horizon_data_core.api import create_data_row, create_metadata_row, list_data_rows, list_metadata_rows
from horizon_data_core.base_types import DataRow, MetadataRow
from datetime import datetime

# Create a data row
data_row = DataRow(
    data_stream_id="stream-123",
    datetime=datetime.now(),
    vector=[1.0, 2.0, 3.0, 4.0, 5.0],
    data_type="sensor_data",
    vector_start_bound=0.0,
    vector_end_bound=10.0
)
create_data_row(data_row)

# Create a metadata row
metadata_row = MetadataRow(
    data_stream_id="stream-123",
    datetime=datetime.now(),
    latitude=40.7128,
    longitude=-74.0060,
    altitude=10.5,
    speed=25.0,
    heading=90.0
)
create_metadata_row(metadata_row)

# List data rows with filters
data_rows = list_data_rows(data_stream_id="stream-123")
metadata_rows = list_metadata_rows(data_stream_id="stream-123")

Available Models

PostgreSQL Models

  • PlatformKind: A descriptive class of which a platform is a physical instantiation
  • Platform: Core platform instances
  • DataStream: Data streams associated with platforms
  • Mission: Mission definitions
  • MissionEntity: Mission-entity relationships
  • Ontology: Ontology definitions
  • OntologyClass: Classes within ontologies
  • BeamgramSpecification: The set of parameters used to specify how a beamgram is constructed
  • BearingTimeRecordSpecification: The set of parameters used to specify how a bearing time record is constructed
  • DataRow: Time-series data with vector information
  • MetadataRow: Location and movement metadata

Iceberg Models

  • DataRow: Time-series data with vector information
  • MetadataRow: Location and movement metadata

API Reference

PostgreSQL Operations

For each PostgreSQL model, the following operations are available:

  • create_[model](model_instance): Create a new record
  • read_[model](id): Read a record by ID
  • read_[model](model_instance): Read a record by matching non-id fields
  • update_[model](model_instance): Update an existing record
  • delete_[model](id): Delete a record by ID
  • list_[models](**filters): List records with optional filters
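Conceptually, each keyword argument passed to `list_[models](**filters)` acts as an equality constraint on the corresponding model field, with `organization_id` applied automatically. The sketch below illustrates that filtering semantics in a self-contained way; it uses plain dictionaries as hypothetical stand-ins for the SDK's Pydantic models and is not the SDK's actual implementation.

```python
# Hypothetical stand-in records; the real SDK returns Pydantic models.
platforms = [
    {"id": 1, "name": "Alpha", "kind_id": 10},
    {"id": 2, "name": "Bravo", "kind_id": 10},
    {"id": 3, "name": "Alpha", "kind_id": 20},
]

def list_platforms_sketch(**filters):
    """Mimic list_platforms(**filters): each kwarg is an equality filter."""
    return [
        p for p in platforms
        if all(p.get(field) == value for field, value in filters.items())
    ]

all_rows = list_platforms_sketch()                       # no filters: 3 rows
by_name = list_platforms_sketch(name="Alpha")            # 2 rows
combined = list_platforms_sketch(name="Alpha", kind_id=10)  # 1 row
```

Filters combine with AND semantics: adding a second keyword can only narrow the result set.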

Iceberg Operations

For Iceberg models, the following operations are available:

  • create_[model](model_instance): Create a new record
  • list_[models](**filters): List records with optional filters

Note: Update and delete operations for Iceberg tables require table-specific implementation due to the nature of Iceberg's data model.

Error Handling

The SDK includes proper error handling for:

  • Invalid model data
  • Database connection issues
  • Missing records
  • Iceberg catalog connectivity
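The SDK's specific exception classes are not listed here, so the pattern below is a generic sketch: wrap a read in try/except and fall back to `None`. `safe_read` and `fake_read` are hypothetical helpers (not part of the SDK), and the caught exception families are assumptions; substitute the SDK's real exception types in production code.

```python
from uuid import uuid4

def safe_read(read_fn, record_id):
    """Wrap a read_[model] call so a missing record or a connection
    problem yields None instead of an unhandled exception."""
    try:
        return read_fn(record_id)
    except LookupError:      # assumed family for missing records
        return None
    except ConnectionError:  # assumed family for database/catalog connectivity
        return None

# Stand-in reader that simulates a missing record; in real code,
# pass an SDK function such as read_platform instead.
def fake_read(record_id):
    raise LookupError(f"no record with id {record_id}")

result = safe_read(fake_read, uuid4())  # returns None
```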

Examples

See below for complete working examples of SDK operations.

"""Example usage of the Horizon Data Core SDK."""

from datetime import datetime
from uuid import uuid4

from horizon_data_core.api import initialize_sdk
from horizon_data_core.base_types import DataRow, DataStream, Platform, MetadataRow, Mission
from horizon_data_core.client import PostgresClient
from horizon_data_core.helpers import name_to_uuid


def example_usage() -> None:
    """Example of how to use the Horizon Data Core SDK."""
    # Initialize the SDK
    postgres_client = PostgresClient(
        user="postgres",
        passwd="password",
        host="localhost",
        port=5432,
        database_name="horizon",
    )

    # Initialize the SDK with organization_id
    organization_id = uuid4()  # This should come from your user context
    sdk = initialize_sdk(postgres_client, {}, organization_id)

    # Create a new platform
    platform = Platform(
        id=uuid4(),
        name="Example Platform",
        kind_id=uuid4(),
        free_text="This is an example platform",
        # organization_id will be automatically set by the SDK
    )
    created_platform = sdk.create_platform(platform)
    print(f"Created platform: {created_platform}")

    # Read the platform back
    assert created_platform.id is not None
    retrieved_platform = sdk.read_platform(created_platform.id)
    print(f"Retrieved platform: {retrieved_platform}")

    # List platforms with filters (organization_id is automatically applied)
    platforms = sdk.list_platforms()
    print(f"Found {len(platforms)} platforms")

    # Create a data stream
    data_stream = DataStream(
        id=uuid4(),
        platform_id=created_platform.id,
        # organization_id will be automatically set by the SDK
    )
    created_data_stream = sdk.create_data_stream(data_stream)
    print(f"Created data stream: {created_data_stream}")
    assert created_data_stream.id is not None

    # Create a mission
    mission = Mission(
        id=uuid4(),
        name="Example Mission",
        start_datetime=datetime.now(),
        # organization_id will be automatically set by the SDK
    )
    created_mission = sdk.create_mission(mission)
    print(f"Created mission: {created_mission}")

    # Create a data row in Postgres
    data_row = DataRow(
        data_stream_id=created_data_stream.id,
        datetime=datetime.now(),
        vector=[1.0, 2.0, 3.0, 4.0, 5.0],
        data_type="example_data",
        track_id=name_to_uuid(
            "example_track"
        ),  # This should be a valid specification/track_id that has been created previously
        vector_start_bound=0.0,
        vector_end_bound=10.0,
    )
    created_data_row = sdk.create_data_row(data_row)
    print(f"Created data row: {created_data_row}")

    # Create a metadata row in Iceberg table
    metadata_row = MetadataRow(
        data_stream_id=created_data_stream.id,
        datetime=datetime.now(),
        latitude=40.7128,
        longitude=-74.0060,
        altitude=10.5,
        speed=25.0,
        heading=90.0,
    )
    created_metadata_row = sdk.create_metadata_row(metadata_row)
    print(f"Created metadata row: {created_metadata_row}")


if __name__ == "__main__":
    example_usage()



Download files

Download the file for your platform.

Source Distribution

horizon_data_core-4.5.0.tar.gz (122.8 kB)

Uploaded Source

Built Distribution


horizon_data_core-4.5.0-py3-none-any.whl (42.8 kB)

Uploaded Python 3

File details

Details for the file horizon_data_core-4.5.0.tar.gz.

File metadata

  • Download URL: horizon_data_core-4.5.0.tar.gz
  • Upload date:
  • Size: 122.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.9.15 {"installer":{"name":"uv","version":"0.9.15","subcommand":["publish"]},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"Ubuntu","version":"24.04","id":"noble","libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":true}

File hashes

Hashes for horizon_data_core-4.5.0.tar.gz

  • SHA256: f1b4f557617e84accf937866facc37d1baeab1aa1b2e0a10b419535683ddf121
  • MD5: 37276ad899a3af26d5c58925a7edb2d5
  • BLAKE2b-256: 237b8c0116e8980a09e021d465671a9b291eeedb2f3b934e1ead9e8750e0bcc6


File details

Details for the file horizon_data_core-4.5.0-py3-none-any.whl.

File metadata

  • Download URL: horizon_data_core-4.5.0-py3-none-any.whl
  • Upload date:
  • Size: 42.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.9.15 {"installer":{"name":"uv","version":"0.9.15","subcommand":["publish"]},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"Ubuntu","version":"24.04","id":"noble","libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":true}

File hashes

Hashes for horizon_data_core-4.5.0-py3-none-any.whl

  • SHA256: 49e7b983436807a788cbb5d49f550d0dc6799a93a9550bff8fd15085916fc463
  • MD5: 68cc020b0578f749921535399204d128
  • BLAKE2b-256: 2f877527c13dc6804d3e5835d2ea2b59e354b137e1aaf98747a9996c95e82e69

