Starlet

Spatial tiling, MVT generation, and tile serving for geospatial data.

Setup

python -m venv .venv
source .venv/bin/activate
pip install -e .

CLI

All commands are available through the starlet CLI.

starlet --help

starlet tile — Partition a dataset

starlet tile --input data.parquet --outdir datasets/mydata --num-tiles 40
Flag              Default     Description
--input           (required)  Path to GeoParquet or GeoJSON file
--outdir          (required)  Output dataset directory
--num-tiles       40          Target number of spatial partitions
--partition-size  1gb         Target partition size (e.g. 512mb, 1gb)
--sort            zorder      Sort order: zorder, hilbert, columns, none
--sample-cap      10000       Reservoir-sampling cap for centroids
--compression     zstd        Parquet compression codec
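The default zorder sort orders features along a Z-order (Morton) space-filling curve so that spatially nearby features tend to land in the same partition. As a rough illustration of the idea (a sketch of the general technique, not Starlet's actual implementation; `morton_key` is a hypothetical helper), the curve position comes from interleaving the bits of a quantized x/y coordinate:

```python
def morton_key(x: int, y: int, bits: int = 16) -> int:
    """Interleave the bits of quantized x and y into a single Z-order key."""
    key = 0
    for i in range(bits):
        key |= ((x >> i) & 1) << (2 * i)       # even bit positions carry x
        key |= ((y >> i) & 1) << (2 * i + 1)   # odd bit positions carry y
    return key

# The four cells of a 2x2 grid get consecutive keys, tracing the "Z" shape:
cells = [(0, 0), (1, 0), (0, 1), (1, 1)]
keys = [morton_key(x, y) for x, y in cells]
```

Sorting rows by such a key before writing partitions is what makes each partition cover a compact spatial region.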

starlet mvt — Generate vector tiles

starlet mvt --dir datasets/mydata --zoom 7 --threshold 100000
Flag         Default     Description
--dir        (required)  Dataset directory with parquet_tiles/ and histograms/
--zoom       7           Maximum zoom level
--threshold  0           Minimum feature count per tile
--outdir     <dir>/mvt/  MVT output directory
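MVT output is addressed by the standard XYZ (slippy-map) scheme, so the tile covering a given coordinate at a given zoom can be computed with the usual Web Mercator formula. A small helper (`lonlat_to_tile` is a hypothetical name, not part of the CLI):

```python
import math

def lonlat_to_tile(lon: float, lat: float, zoom: int) -> tuple[int, int]:
    """Return the (x, y) XYZ tile indices covering lon/lat at the given zoom."""
    n = 2 ** zoom  # number of tiles along each axis
    x = int((lon + 180.0) / 360.0 * n)
    lat_rad = math.radians(lat)
    y = int((1.0 - math.asinh(math.tan(lat_rad)) / math.pi) / 2.0 * n)
    return x, y

# Tile covering central Texas at the default --zoom of 7:
x, y = lonlat_to_tile(-97.7, 30.3, 7)
```

This is handy for spot-checking which generated tile should contain a known feature.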

starlet build — Full pipeline (tile + MVT)

starlet build --input data.parquet --outdir datasets/mydata

starlet serve — Launch the tile server

starlet serve --dir datasets --port 8765
Flag          Default     Description
--dir         (required)  Root directory containing dataset subdirectories
--host        0.0.0.0     Host to bind
--port        8765        Port to bind
--cache-size  256         In-memory tile cache size

starlet info — Inspect a dataset

starlet info --dir datasets/mydata

Make Targets

Convenience wrappers around the CLI:

make tiles INPUT=path/to/data.parquet
make mvt   INPUT=path/to/data.parquet
make build INPUT=path/to/data.parquet   # tiles + mvt
make server                              # starts on port 8765
make clean                               # removes datasets/*

API Endpoints

Once the server is running:

Method  Path                                             Description
GET     /                                                Interactive dataset selector
GET     /api/datasets                                    List all datasets
GET     /datasets.json                                   Search datasets by name
GET     /datasets/<dataset>.json                         Dataset metadata
GET     /datasets/<dataset>.html                         Dataset detail page
GET     /<dataset>/<z>/<x>/<y>.mvt                       Mapbox Vector Tile
GET     /datasets/<dataset>/features.<fmt>               Download features (csv/geojson)
POST    /datasets/<dataset>/features.<fmt>               Download with geometry filter
GET     /datasets/<dataset>/features/sample.json         Sample attributes
GET     /datasets/<dataset>/features/sample.geojson      Sample record with geometry
GET     /api/datasets/<dataset>/stats                    Attribute statistics
POST    /datasets/<dataset>/styles.json                  LLM-generated styling suggestions
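A client can compose tile and metadata URLs directly from the routing table above. A minimal sketch (the base URL and dataset name are placeholders; `tile_url` and `metadata_url` are hypothetical helpers, not part of Starlet):

```python
def tile_url(base: str, dataset: str, z: int, x: int, y: int) -> str:
    """Build the MVT endpoint URL for one tile."""
    return f"{base}/{dataset}/{z}/{x}/{y}.mvt"

def metadata_url(base: str, dataset: str) -> str:
    """Build the dataset metadata endpoint URL."""
    return f"{base}/datasets/{dataset}.json"

url = tile_url("http://localhost:8765", "mydata", 7, 29, 52)
meta = metadata_url("http://localhost:8765", "mydata")
```

The same pattern with literal {z}/{x}/{y} placeholders is what a web map client would use as its tile URL template.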

LLM Styling Suggestions

The POST /datasets/<dataset>/styles.json endpoint uses an LLM to generate map styling rules from dataset attribute statistics.

Provider Selection

Set the LLM_PROVIDER environment variable to choose a provider:

export LLM_PROVIDER=gemini   # default
export LLM_PROVIDER=ollama   # local Ollama

Starlet falls back to Gemini if the variable is unset or set to an unrecognized value.
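The fallback behavior can be pictured as a lookup with a default (a sketch of the described behavior, not Starlet's actual code; `resolve_provider` is a hypothetical helper):

```python
import os

SUPPORTED_PROVIDERS = {"gemini", "ollama"}

def resolve_provider(default: str = "gemini") -> str:
    """Read LLM_PROVIDER, falling back to the default when unset or invalid."""
    provider = os.environ.get("LLM_PROVIDER", "").strip().lower()
    return provider if provider in SUPPORTED_PROVIDERS else default

provider = resolve_provider()
```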

Gemini (default)

Requires a Google AI Studio API key:

export GEMINI_API_KEY=your-key-here
starlet serve --dir datasets

Ollama (local)

Requires a running Ollama instance on the default port (11434):

ollama serve                  # start Ollama
ollama pull llama3            # pull a model (once)

export LLM_PROVIDER=ollama
starlet serve --dir datasets

To use a different model:

export OLLAMA_MODEL=mistral

See starlet/_internal/server/llm/README.md for full LLM provider documentation.

Example

# Full pipeline
starlet build --input ../data/TIGER2018_COUNTY.parquet --outdir datasets/TIGER2018_COUNTY

# Or via Make
make build INPUT=../data/TIGER2018_COUNTY.parquet

# Start the server
make server

Then open http://localhost:8765 and select a dataset to visualize.

Prerequisites

  • Python 3.10+
  • make (optional, for convenience targets)

Download files

Source distribution: starlet-0.2.0.tar.gz (72.4 kB)
Built distribution: starlet-0.2.0-py3-none-any.whl (85.3 kB)

File details

starlet-0.2.0.tar.gz (Source, 72.4 kB), uploaded via twine/6.2.0 on CPython 3.12.0, without Trusted Publishing.

Algorithm    Hash digest
SHA256       96117c3ce1864fe60650222e8b6b713beb6c5d704eda953073c6ac827a3ccc1b
MD5          9ce9978ab5048bb0b2e81dd7f8b2af42
BLAKE2b-256  8c6184e8fc95c78045fecbdb66578ec63b6d16baed63f363542f4fc2f5382679

starlet-0.2.0-py3-none-any.whl (Python 3, 85.3 kB), uploaded via twine/6.2.0 on CPython 3.12.0, without Trusted Publishing.

Algorithm    Hash digest
SHA256       a4e0b3cc4e1d0924531c0ff73870750847c9173fc8e309212be5385dabb891d5
MD5          9733a6df6b797a7142011b551b303dc0
BLAKE2b-256  ac85a03cbf0367d9f31ec5de0161cbad6b7222354540c47df8dc33ff1e7e920f
