dClimate-Client-Py

Python client library for accessing dClimate weather and climate data.
Retrieve dClimate GIS Zarr datasets stored on IPFS.

Uses py-hamt to access Zarr data structures stored efficiently on IPFS.

Filtering and aggregation are packaged into convenience functions optimized for flexibility and performance.

Looking for JavaScript? Check out our JavaScript client for Node.js and browser environments.

Usage

from datetime import datetime
import dclimate_client_py as client
from dclimate_client_py import dClimateClient

# --- Recommended: Using dClimateClient (async context manager) ---

async def main():
    # The client manages IPFS connections automatically
    # No need to import or configure KuboCAS directly!
    async with dClimateClient() as dclimate:
        # Load datasets by name from the internal catalog
        # For datasets with multiple variants, you must specify which variant
        # Returns a tuple: (dataset, metadata)
        dataset, metadata = await dclimate.load_dataset(
            dataset="2m_temperature",
            collection="era5",
            variant="finalized",  # Required for multi-variant datasets
            return_xarray=False   # Returns GeotemporalData wrapper (default)
        )

        # Check metadata about what was loaded
        print(f"Loaded: {metadata['slug']}")
        print(f"CID: {metadata['cid']}")
        print(f"Timestamp: {metadata.get('timestamp')}")  # If available from URL fetch
        print(f"Source: {metadata['source']}")  # 'catalog' or 'direct_cid'

        # Apply queries using the GeotemporalData interface
        dataset_filtered = dataset.point(latitude=40.875, longitude=-104.875)
        dataset_filtered = dataset_filtered.time_range(
            datetime(2023, 1, 1),
            datetime(2023, 1, 5)
        )
        data_dict = dataset_filtered.as_dict()
        print(data_dict['data'])

# Custom IPFS endpoints (optional)
async def main_custom_ipfs():
    async with dClimateClient(
        gateway_base_url="https://ipfs.io",
        rpc_base_url="http://localhost:5001"
    ) as dclimate:
        dataset, metadata = await dclimate.load_dataset(
            dataset="2m_temperature",
            collection="era5",
            variant="finalized"
        )
        # Query dataset...

# Get raw xarray.Dataset directly
async def main_xarray():
    async with dClimateClient() as dclimate:
        xr_dataset, metadata = await dclimate.load_dataset(
            dataset="2m_temperature",
            collection="era5",
            variant="finalized",
            return_xarray=True  # Returns xarray.Dataset
        )
        print(xr_dataset)
        print(f"Dataset CID: {metadata['cid']}")

# List available datasets in the catalog (synchronous helper)
catalog = client.list_dataset_catalog()
for collection in catalog:
    print(f"Collection: {collection['collection']}")
    for dataset in collection['datasets']:
        print(f"  Dataset: {dataset['dataset']}")
        for variant in dataset['variants']:
            print(f"    Variant: {variant['variant']}")
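Since the catalog returned by `list_dataset_catalog()` is a plain nested structure (collections containing datasets containing variants), ordinary dict/list code is enough to search it. A small stdlib-only sketch, run here against a hand-written sample in the same shape as the loop above (the sample entries, including the "preliminary" variant name, are invented for illustration, not fetched from the catalog):

```python
def variants_for(catalog, collection_name, dataset_name):
    """Return the variant names for a dataset, or [] if not found."""
    for collection in catalog:
        if collection["collection"] != collection_name:
            continue
        for dataset in collection["datasets"]:
            if dataset["dataset"] == dataset_name:
                return [v["variant"] for v in dataset["variants"]]
    return []

# Illustrative sample in the catalog's shape -- not real catalog data.
sample_catalog = [
    {
        "collection": "era5",
        "datasets": [
            {
                "dataset": "2m_temperature",
                "variants": [{"variant": "finalized"}, {"variant": "preliminary"}],
            }
        ],
    }
]

print(variants_for(sample_catalog, "era5", "2m_temperature"))
# -> ['finalized', 'preliminary']
```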

More examples can be found in the dClimate Jupyter Notebooks. To run your own IPFS gateway, follow the instructions for installing IPFS. For additional assistance, find us on Discord; if you are an organization or business, reach out to us at community at dclimate dot net.

Development

Create and activate a virtual environment:

uv venv .venv
source .venv/bin/activate  # macOS/Linux
.\.venv\Scripts\activate   # Windows

Install Dependencies

uv sync --extra dev --extra testing

Run tests for your local environment

uv run pytest tests/

Use Coverage

uv run pytest --cov=dclimate_client_py tests/ --cov-report=xml

Environment requirements

  • Optionally, you can run your own IPFS server to host your own datasets or connect to others.

File breakdown:

client.py

Entry point to the code; contains geo_temporal_query, which combines all possible subsetting and aggregation logic in a single function. It can output the data either as a dict or as bytes representing an xarray dataset.
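When the dict output is chosen, downstream handling needs only plain Python. As a sketch, pairing values with timestamps and writing CSV via the stdlib (the "data" key follows the as_dict() usage shown earlier; the "times" key and all values here are invented for illustration):

```python
import csv
import io

# Illustrative result in the shape of the dict output ("data" as in the
# usage example above); the "times" key and these values are made up.
result = {
    "times": ["2023-01-01T00:00:00", "2023-01-01T01:00:00"],
    "data": [271.4, 271.9],
}

buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["time", "value"])
for t, v in zip(result["times"], result["data"]):
    writer.writerow([t, v])

print(buf.getvalue())
```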


dclimate_zarr_errors.py

Various exceptions to be raised for bad or invalid user input.


geo_utils.py

Functions to manipulate xarray datasets. Contains polygon, rectangle, circle, and point spatial subsetting options, as well as temporal subsetting. Also supports both spatial and temporal aggregations.
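For intuition, circle subsetting amounts to a great-circle distance filter over grid points. Here is a stdlib-only sketch of that idea; it is illustrative of the concept, not the actual geo_utils.py implementation (which operates on xarray datasets), and the grid below is a small made-up 0.25-degree patch around the point used in the usage example:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in kilometers."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def circle_subset(points, center_lat, center_lon, radius_km):
    """Keep only (lat, lon) grid points within radius_km of the center."""
    return [
        (lat, lon)
        for lat, lon in points
        if haversine_km(lat, lon, center_lat, center_lon) <= radius_km
    ]

# A tiny illustrative 0.25-degree grid containing the usage example's point.
grid = [(40.125 + 0.25 * i, -105.125 + 0.25 * j) for i in range(8) for j in range(8)]
near = circle_subset(grid, 40.875, -104.875, 50.0)
print(len(near), "of", len(grid), "points within 50 km")
```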


ipfs_retrieval.py

Functions for resolving IPNS names, traversing the dClimate STAC catalog stored on IPFS, and loading Zarr datasets using py-hamt. Handles interaction with IPFS gateways and RPC endpoints.
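As background for the gateway interaction: path-style IPFS gateways expose content at `/ipfs/<cid>` (and IPNS names at `/ipns/<name>`). A small sketch of URL construction under that standard convention; the CID and subpath below are made-up placeholders, not a real dataset root:

```python
def gateway_url(base_url, cid, path=""):
    """Build a path-gateway URL for a CID, optionally with a subpath."""
    url = f"{base_url.rstrip('/')}/ipfs/{cid}"
    if path:
        url += "/" + path.lstrip("/")
    return url

# Placeholder CID and subpath for illustration only.
print(gateway_url("https://ipfs.io/", "bafyexamplecid", "some/subpath"))
# -> https://ipfs.io/ipfs/bafyexamplecid/some/subpath
```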

