
Python client library for accessing dClimate weather and climate data


dClimate-Client-Py


Retrieve dClimate GIS Zarr datasets stored on IPFS.

Uses py-hamt to access Zarr data structures stored efficiently on IPFS.

Filtering and aggregation are packaged into convenience functions optimized for flexibility and performance.

Looking for JavaScript? Check out our JavaScript client for Node.js and browser environments.

Usage

from datetime import datetime
import dclimate_client_py as client
from dclimate_client_py import dClimateClient

# --- Recommended: Using dClimateClient (async context manager) ---

async def main():
    # The client manages IPFS connections automatically
    # No need to import or configure KuboCAS directly!
    async with dClimateClient() as dclimate:
        # Load datasets by name from the internal catalog
        # For datasets with multiple variants, you must specify which variant
        dataset = await dclimate.load_dataset(
            dataset="2m_temperature",
            collection="era5",
            variant="finalized",  # Required for multi-variant datasets
            return_xarray=False   # Returns GeotemporalData wrapper (default)
        )

        # Apply queries using the GeotemporalData interface
        dataset_filtered = dataset.point(latitude=40.875, longitude=-104.875)
        dataset_filtered = dataset_filtered.time_range(
            datetime(2023, 1, 1),
            datetime(2023, 1, 5)
        )
        data_dict = dataset_filtered.as_dict()
        print(data_dict['data'])

# Custom IPFS endpoints (optional)
async def main_custom_ipfs():
    async with dClimateClient(
        gateway_base_url="https://ipfs.io",
        rpc_base_url="http://localhost:5001"
    ) as dclimate:
        dataset = await dclimate.load_dataset(
            dataset="2m_temperature",
            collection="era5",
            variant="finalized"
        )
        # Query dataset...

# Get raw xarray.Dataset directly
async def main_xarray():
    async with dClimateClient() as dclimate:
        xr_dataset = await dclimate.load_dataset(
            dataset="2m_temperature",
            collection="era5",
            variant="finalized",
            return_xarray=True  # Returns xarray.Dataset
        )
        print(xr_dataset)

# List available datasets in the catalog (synchronous helper)
catalog = client.list_dataset_catalog()
for collection in catalog:
    print(f"Collection: {collection['collection']}")
    for dataset in collection['datasets']:
        print(f"  Dataset: {dataset['dataset']}")
        for variant in dataset['variants']:
            print(f"    Variant: {variant['variant']}")
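The examples above define coroutines (main, main_custom_ipfs, main_xarray), which must be driven by an event loop rather than called directly. A minimal sketch of the runner pattern, using a stand-in coroutine so it runs without network access:

```python
import asyncio

# The client examples are coroutines; drive one with asyncio.run().
# demo() is a stand-in here -- substitute main(), main_custom_ipfs(),
# or main_xarray() from the examples above.
async def demo():
    return "ok"

result = asyncio.run(demo())
print(result)
```

Inside an environment that already runs an event loop (e.g. Jupyter), `await main()` is used instead of `asyncio.run(main())`.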

More examples can be found at dClimate Jupyter Notebooks. To run your own IPFS gateway, follow the instructions for installing IPFS. For additional assistance, find us on Discord; if you are an organization or business, reach out to us at community at dclimate dot net.

Development

Create and activate a virtual environment:

uv venv .venv
source .venv/bin/activate  # macOS/Linux
.\.venv\Scripts\activate   # Windows

Install Dependencies

uv sync --extra dev --extra testing

Run tests for your local environment

uv run pytest tests/

Use Coverage

uv run pytest --cov=dclimate_client_py tests/ --cov-report=xml

Environment requirements

  • Optionally, run your own IPFS server to host your own datasets or connect to other nodes.

File breakdown:

client.py

Entry point to the code; contains geo_temporal_query, which combines all subsetting and aggregation logic in a single function. It can output the data either as a dict or as bytes representing an xarray dataset.
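As a rough offline sketch of this single-entry-point pattern (not the library's actual signature -- the function and parameter names below are hypothetical), one function can chain whichever optional subsetting and aggregation steps the caller passes:

```python
# Hypothetical sketch of the geo_temporal_query pattern: one entry point
# that applies optional point subsetting, time-range subsetting, and
# aggregation in sequence. The real function operates on xarray datasets.
def geo_temporal_query_sketch(records, point=None, time_range=None, agg=None):
    # records: list of dicts with "lat", "lon", "time", "value" keys
    out = records
    if point is not None:
        lat, lon = point
        out = [r for r in out if r["lat"] == lat and r["lon"] == lon]
    if time_range is not None:
        start, end = time_range
        out = [r for r in out if start <= r["time"] <= end]
    if agg == "mean":
        vals = [r["value"] for r in out]
        return sum(vals) / len(vals) if vals else None
    return out
```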


dclimate_zarr_errors.py

Various exceptions raised for bad or invalid user input.
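The validate-then-raise pattern this module supports looks roughly like the sketch below; the class names here are hypothetical stand-ins, not the library's real exception names:

```python
# Hypothetical stand-ins illustrating the pattern; the real exception
# classes live in dclimate_zarr_errors.py under their own names.
class DclimateZarrError(Exception):
    """Base class for client errors."""

class InvalidTimeRangeError(DclimateZarrError):
    """Raised when the requested end time precedes the start time."""

def check_time_range(start, end):
    # Validate user input before querying, raising a specific error on failure.
    if end < start:
        raise InvalidTimeRangeError(f"end {end!r} precedes start {start!r}")

try:
    check_time_range(2, 1)
except InvalidTimeRangeError as err:
    print(err)
```

Catching the base class lets callers handle all client-side validation errors in one place.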


geo_utils.py

Functions to manipulate xarray datasets. Contains polygon, rectangle, circle, and point spatial subsetting options, as well as temporal subsetting. Also supports both spatial and temporal aggregations.
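As a rough illustration of what circle subsetting involves (a sketch, not the library's implementation, which operates on xarray grids): keep the grid points that fall within a given radius of a center, using great-circle distance.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    # Great-circle distance between two (lat, lon) points, in kilometers.
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 \
        + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def circle_subset(points, center_lat, center_lon, radius_km):
    # points: iterable of (lat, lon) pairs; keep those within radius_km.
    return [
        (lat, lon)
        for lat, lon in points
        if haversine_km(lat, lon, center_lat, center_lon) <= radius_km
    ]
```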


ipfs_retrieval.py

Functions for resolving IPNS names, traversing the dClimate STAC catalog stored on IPFS, and loading Zarr datasets using py-hamt. Handles interaction with IPFS gateways and RPC endpoints.

