
Starlet

Spatial tiling, MVT generation, and tile serving for geospatial data.

Setup

python -m venv .venv
source .venv/bin/activate
pip install -e .

CLI

All commands are available through the starlet CLI.

starlet --help

starlet tile — Partition a dataset

starlet tile --input data.parquet --outdir datasets/mydata --num-tiles 40
Flag              Default     Description
--input           (required)  Path to GeoParquet or GeoJSON file
--outdir          (required)  Output dataset directory
--num-tiles       40          Target number of spatial partitions
--partition-size  1gb         Target partition size (e.g. 512mb, 1gb)
--sort            zorder      Sort order: zorder, hilbert, columns, none
--sample-cap      10000       Reservoir sampling cap for centroids
--compression     zstd        Parquet compression codec
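The default --sort zorder orders features along a Z-order (Morton) curve, which tends to keep spatially nearby features in the same partition. As a rough illustration of the idea (this is a generic sketch of Z-order keying, not Starlet's actual implementation, which may quantize coordinates differently):

```python
def interleave_bits(x: int, y: int, bits: int = 16) -> int:
    """Interleave the bits of x and y into a Morton (Z-order) code."""
    code = 0
    for i in range(bits):
        code |= ((x >> i) & 1) << (2 * i)       # x bits at even positions
        code |= ((y >> i) & 1) << (2 * i + 1)   # y bits at odd positions
    return code

def zorder_key(lon: float, lat: float, bits: int = 16) -> int:
    """Quantize a centroid onto a 2^bits grid and return its Z-order key."""
    scale = (1 << bits) - 1
    x = int((lon + 180.0) / 360.0 * scale)
    y = int((lat + 90.0) / 180.0 * scale)
    return interleave_bits(x, y, bits)

# Sorting centroids by this key groups nearby features together.
points = [(-122.4, 37.8), (2.35, 48.86), (-122.3, 37.7)]
points.sort(key=lambda p: zorder_key(*p))
```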

starlet mvt — Generate vector tiles

starlet mvt --dir datasets/mydata --zoom 7 --threshold 100000
Flag         Default     Description
--dir        (required)  Dataset directory with parquet_tiles/ and histograms/
--zoom       7           Maximum zoom level
--threshold  0           Minimum feature count per tile
--outdir     <dir>/mvt/  MVT output directory
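Tiles follow the standard XYZ (slippy-map) addressing in Web Mercator, so --zoom 7 produces up to a 128×128 grid of tiles. The standard lon/lat → tile conversion looks like this (the general formula, not Starlet-specific code):

```python
import math

def lonlat_to_tile(lon: float, lat: float, zoom: int) -> tuple[int, int]:
    """Convert WGS84 lon/lat to XYZ tile coordinates at a given zoom."""
    n = 2 ** zoom
    x = int((lon + 180.0) / 360.0 * n)
    lat_rad = math.radians(lat)
    y = int((1.0 - math.asinh(math.tan(lat_rad)) / math.pi) / 2.0 * n)
    return x, y

# San Francisco at zoom 7
x, y = lonlat_to_tile(-122.4194, 37.7749, 7)  # → (20, 49)
```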

starlet build — Full pipeline (tile + MVT)

starlet build --input data.parquet --outdir datasets/mydata

starlet serve — Launch the tile server

starlet serve --dir datasets --port 8765
Flag          Default     Description
--dir         (required)  Root directory containing dataset subdirectories
--host        0.0.0.0     Host to bind
--port        8765        Port to bind
--cache-size  256         In-memory tile cache size
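--cache-size bounds an in-memory cache of recently served tiles. A typical LRU cache for this purpose can be sketched as follows (purely illustrative; the server's actual cache implementation is not shown here):

```python
from collections import OrderedDict

class TileCache:
    """Minimal LRU cache for rendered tile bytes (illustrative sketch)."""

    def __init__(self, max_size: int = 256):
        self.max_size = max_size
        self._store: OrderedDict = OrderedDict()

    def get(self, key):
        if key not in self._store:
            return None
        self._store.move_to_end(key)  # mark as most recently used
        return self._store[key]

    def put(self, key, value: bytes) -> None:
        self._store[key] = value
        self._store.move_to_end(key)
        if len(self._store) > self.max_size:
            self._store.popitem(last=False)  # evict least recently used

cache = TileCache(max_size=256)
cache.put(("mydata", 7, 20, 49), b"tile-bytes")
```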

starlet info — Inspect a dataset

starlet info --dir datasets/mydata

Make Targets

Convenience wrappers around the CLI:

make tiles INPUT=path/to/data.parquet
make mvt   INPUT=path/to/data.parquet
make build INPUT=path/to/data.parquet   # tiles + mvt
make server                              # starts on port 8765
make clean                               # removes datasets/*

API Endpoints

Once the server is running:

Method  Path                                             Description
GET     /                                                Interactive dataset selector
GET     /api/datasets                                    List all datasets
GET     /datasets.json                                   Search datasets by name
GET     /datasets/<dataset>.json                         Dataset metadata
GET     /datasets/<dataset>.html                         Dataset detail page
GET     /<dataset>/<z>/<x>/<y>.mvt                       Mapbox Vector Tile
GET     /datasets/<dataset>/features.<fmt>               Download features (csv/geojson)
POST    /datasets/<dataset>/features.<fmt>               Download with geometry filter
GET     /datasets/<dataset>/features/sample.json         Sample attributes
GET     /datasets/<dataset>/features/sample.geojson      Sample record with geometry
GET     /api/datasets/<dataset>/stats                    Attribute statistics
POST    /datasets/<dataset>/styles.json                  LLM-generated styling suggestions
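Tiles can be fetched directly over HTTP once the server is up. For example (assuming a dataset named mydata and the default port; tile_url is just a helper for this sketch):

```python
import urllib.request

BASE = "http://localhost:8765"  # assumes `starlet serve` is running locally

def tile_url(dataset: str, z: int, x: int, y: int) -> str:
    """Build the MVT tile URL for a dataset."""
    return f"{BASE}/{dataset}/{z}/{x}/{y}.mvt"

req = urllib.request.Request(tile_url("mydata", 7, 20, 49))
# with urllib.request.urlopen(req) as resp:
#     tile_bytes = resp.read()  # raw Mapbox Vector Tile protobuf
```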

LLM Styling Suggestions

The POST /datasets/<dataset>/styles.json endpoint uses an LLM to generate map styling rules from dataset attribute statistics.

Provider Selection

Set the LLM_PROVIDER environment variable to choose a provider:

export LLM_PROVIDER=gemini   # default
export LLM_PROVIDER=ollama   # local Ollama

Starlet falls back to Gemini if the variable is unset or set to an unrecognized value.
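Illustratively, the fallback amounts to something like this sketch (not the actual server code):

```python
import os

KNOWN_PROVIDERS = {"gemini", "ollama"}

def resolve_provider() -> str:
    """Pick the LLM provider from LLM_PROVIDER, defaulting to Gemini
    when the variable is unset or not a known provider name."""
    provider = os.environ.get("LLM_PROVIDER", "").lower()
    return provider if provider in KNOWN_PROVIDERS else "gemini"
```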

Gemini (default)

Requires a Google AI Studio API key:

export GEMINI_API_KEY=your-key-here
starlet serve --dir datasets

Ollama (local)

Requires a running Ollama instance on the default port (11434):

ollama serve                  # start Ollama
ollama pull llama3            # pull a model (once)

export LLM_PROVIDER=ollama
starlet serve --dir datasets

To use a different model:

export OLLAMA_MODEL=mistral

See starlet/_internal/server/llm/README.md for full LLM provider documentation.

Example

# Full pipeline
starlet build --input ../data/TIGER2018_COUNTY.parquet --outdir datasets/TIGER2018_COUNTY

# Or via Make
make build INPUT=../data/TIGER2018_COUNTY.parquet

# Start the server
make server

Then open http://localhost:8765 and select a dataset to visualize.

Prerequisites

  • Python 3.10+
  • make (optional, for convenience targets)
