SpatialOperations

A geospatial analysis environment.

This repo leverages cloud-native geospatial tooling alongside Kubernetes and S3 infrastructure (currently geared toward NRP Nautilus) to create high-performance spatial workflows.

It is designed to:

  1. Publish the spatialoperations package, which supports cloud-native raster and vector operations
  2. Abstract compute and storage, so that high-throughput frameworks like Dask can run on cloud-native data
  3. Deploy a Jupyter notebook server for analysis
  4. Manage exports and APIs for visualizing and sharing data via REST APIs, COGs, and PMTiles

Distribution

The package is distributed on PyPI and built/published using uv. It is intended to be installed via conda, which does a good job of managing GDAL as a dependency.

In the future we'll build a conda recipe that manages this better, but for now installing via conda + pip has been effective. Examples of installation can be found in the environment YAML files.
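As a sketch, the conda + pip flow might look like the following (the file path environments/base.yml and environment name are assumptions; use whichever environment YAML file applies):

```
# Create and activate a conda environment that pins GDAL (path is an assumption)
conda env create -f environments/base.yml
conda activate spatialoperations

# Install the package itself from PyPI
pip install spatialoperations
```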

Environments

The environments directory contains the base environment and any other environments that are needed.

The base environment contains the core dependencies for all later tooling and environments.

Publishing Base Environment

Setting up the config.mk

See the example config.mk. At the moment it is expected to export two variables, VOLUME_MOUNTS and ENV_FILES; the rest of the file exists to construct them.
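A minimal config.mk might look like the following (all paths and values here are illustrative assumptions, not the repo's actual example):

```make
# config.mk (sketch; paths are illustrative assumptions)
DATA_DIR := $(HOME)/data
ENV_DIR  := ./environments

# The two variables the Makefiles actually consume
export VOLUME_MOUNTS := -v $(DATA_DIR):/data
export ENV_FILES := --env-file $(ENV_DIR)/.env.s3
```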

Adding a new dependency

When adding a new dependency to the project:

  1. Add the package to pyproject.toml:

dependencies = [
    "new-package>=1.0.0",
]

  2. Update the package version:

[project]
name = "spatialoperations"
version = "0.1.1"

Publishing the base environment to PyPI

This requires a UV_PUBLISH_TOKEN in .env.publish. Then build the base environment and publish:

# Build and run the container
make publisher-build
make publisher-run

# Publish to PyPI
make publish

This publishes the spatialoperations package to PyPI.
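The .env.publish file can be as small as the following sketch (the token value is a placeholder; generate a real API token on PyPI):

```
# .env.publish (sketch; replace with a real PyPI API token)
UV_PUBLISH_TOKEN=pypi-XXXXXXXXXXXX
```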

Building downstream environments

We use a two-stage build:

  1. analysis, which builds a base image that has everything we need, particularly:
  • spatialoperations, installed from PyPI
  • pmtiles
  2. jupyter, which includes the Jupyter-specific dependencies. This relies on the analysis base image.

These can be created with make analysis-build and make jupyter-build, and run with make analysis-run and make jupyter-run.
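The two-stage relationship can be sketched as a downstream Dockerfile (the image tag, file path, and install command here are assumptions, not the repo's actual build files):

```dockerfile
# environments/jupyter/Dockerfile (sketch)
# Build on the analysis base image, which already contains
# spatialoperations and pmtiles
FROM analysis:latest

# Add the Jupyter-specific dependencies on top
RUN pip install jupyterlab

CMD ["jupyter", "lab", "--ip=0.0.0.0", "--no-browser"]
```

Keeping the heavy geospatial stack in the analysis image means the jupyter image rebuilds quickly when only notebook tooling changes.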

Example rasterops environment

name: rasterops
channels:
  - conda-forge
dependencies:
  - python>=3.12.0,<3.13.0
  - gdal>=3.10.0
  - pip:
    - geospatial-analysis-environment>=0.1.9
    - coastal_resilience_utilities>=0.1.35

Building the analysis environment

  1. Update any dependencies in environments/analysis/analysis.yml

  2. Build the analysis environment:

make analysis-build
make analysis-run

Building the jupyter environment

  1. Update any dependencies in environments/jupyter/jupyter.yml

  2. Build the jupyter environment:

make jupyter-build
make jupyter-run

Prerequisites

  1. Install helm (on macOS):

brew install helm

See https://helm.sh/docs/intro/install/ for other systems.

  2. Configure AWS credentials: create a file named .env.s3 with your Nautilus Ceph S3 credentials. See .env.s3.example, as there may be other variables needed.
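A minimal .env.s3 might look like the following (the variable names and endpoint are assumptions based on common S3 tooling; treat .env.s3.example as the authoritative list):

```
# .env.s3 (sketch; values are placeholders)
AWS_ACCESS_KEY_ID=your-access-key
AWS_SECRET_ACCESS_KEY=your-secret-key
AWS_S3_ENDPOINT=https://s3-west.nrp-nautilus.io
```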

Deployment

Create a deployment with a pod, ingress, and persistent volume unique to you:

make jupyter-push
make jupyter-deploy

Release resources when you're done:

make jupyter-teardown

Formatting

You can use ruff to format your code before committing. The easiest way is to ensure uv is installed and run make format. To format files as you save them, install the relevant Ruff editor extension (https://marketplace.cursorapi.com/items?itemName=charliermarsh.ruff for VSCode/Cursor).

Developing Dependencies on a deployed Jupyter server

You will need fswatch installed (brew install fswatch). To develop spatialoperations, just run:

make dev-spatialoperations

This command will ensure that there is a server running at https://dev-jupyter.nrp-nautilus.io.

Don't forget to use importlib to reload dependencies from disk:

import importlib
import rasterops

# If you change a file locally, wait for it to be synced, then run:
importlib.reload(rasterops)

If you want to make sure that the dev server is shut down, run:

helm uninstall dev
