Project description

Starlet

Spatial tiling, MVT generation, and tile serving for geospatial data.

Setup

python -m venv .venv
source .venv/bin/activate
pip install -e .

CLI

All commands are available through the starlet CLI.

starlet --help

starlet tile — Partition a dataset

starlet tile --input data.parquet --outdir datasets/mydata --num-tiles 40
Flag              Default     Description
--input           (required)  Path to a GeoParquet or GeoJSON file
--outdir          (required)  Output dataset directory
--num-tiles       40          Target number of spatial partitions
--partition-size  1gb         Target partition size (e.g. 512mb, 1gb)
--sort            zorder      Sort order: zorder, hilbert, columns, none
--sample-cap      10000       Reservoir-sampling cap for centroids
--compression     zstd        Parquet compression codec
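Starlet's sorting internals are not shown here, but the idea behind --sort zorder can be sketched with a standard Morton-code key (this is a generic illustration, not starlet's implementation): interleaving the bits of a feature's grid coordinates produces a key whose sort order keeps spatially close features close together, so they tend to land in the same partition.

```python
def morton_key(x: int, y: int, bits: int = 16) -> int:
    """Interleave the bits of two grid coordinates into one sort key.

    Sorting points by this key traces a Z-order (Morton) curve, so
    features that are near each other in space stay near each other
    in the sorted output.
    """
    key = 0
    for i in range(bits):
        key |= ((x >> i) & 1) << (2 * i)       # x bits go to even positions
        key |= ((y >> i) & 1) << (2 * i + 1)   # y bits go to odd positions
    return key

# Sorting a 4x4 grid of cells by Morton key keeps neighbours adjacent:
cells = sorted(((x, y) for x in range(4) for y in range(4)),
               key=lambda c: morton_key(c[0], c[1]))
```

A Hilbert curve (--sort hilbert) serves the same purpose with slightly better locality, at the cost of a more involved key computation.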

starlet mvt — Generate vector tiles

starlet mvt --dir datasets/mydata --zoom 7 --threshold 100000
Flag         Default     Description
--dir        (required)  Dataset directory containing parquet_tiles/ and histograms/
--zoom       7           Maximum zoom level
--threshold  0           Minimum feature count per tile
--outdir     <dir>/mvt/  MVT output directory
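MVT output is addressed by z/x/y tile coordinates. Assuming the standard slippy-map (XYZ/Web Mercator) scheme, the tile containing a given longitude/latitude at a given zoom can be computed as follows (an illustration of the scheme, not starlet's code):

```python
import math

def lonlat_to_tile(lon: float, lat: float, zoom: int) -> tuple:
    """Map a WGS84 lon/lat to XYZ tile coordinates (slippy-map scheme).

    Valid for latitudes within the Web Mercator bounds (about +/-85.05 deg).
    """
    n = 2 ** zoom                                   # tiles per axis at this zoom
    x = int((lon + 180.0) / 360.0 * n)              # linear in longitude
    lat_rad = math.radians(lat)
    y = int((1.0 - math.asinh(math.tan(lat_rad)) / math.pi) / 2.0 * n)
    return x, y
```

At zoom 7 (the default --zoom), the world is a 128 x 128 grid of tiles; each extra zoom level quadruples the tile count, which is why --zoom and --threshold together control how much tile data gets generated.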

starlet build — Full pipeline (tile + MVT)

starlet build --input data.parquet --outdir datasets/mydata

starlet serve — Launch the tile server

starlet serve --dir datasets --port 8765
Flag          Default     Description
--dir         (required)  Root directory containing dataset subdirectories
--host        0.0.0.0     Host to bind
--port        8765        Port to bind
--cache-size  256         In-memory tile cache size
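The kind of bounded in-memory cache that --cache-size suggests can be sketched with a small LRU structure (a hypothetical sketch; starlet's actual cache is not shown here):

```python
from collections import OrderedDict

class TileCache:
    """Bounded LRU cache for rendered tiles, keyed by (dataset, z, x, y)."""

    def __init__(self, max_entries: int = 256):
        self.max_entries = max_entries
        self._store = OrderedDict()

    def get(self, key):
        if key not in self._store:
            return None
        self._store.move_to_end(key)        # mark as most recently used
        return self._store[key]

    def put(self, key, tile_bytes: bytes):
        self._store[key] = tile_bytes
        self._store.move_to_end(key)
        if len(self._store) > self.max_entries:
            self._store.popitem(last=False)  # evict the least recently used
```

With an LRU policy, hot tiles (low zoom levels, popular areas) stay resident while rarely requested tiles are evicted first.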

starlet info — Inspect a dataset

starlet info --dir datasets/mydata

Make Targets

Convenience wrappers around the CLI:

make tiles INPUT=path/to/data.parquet
make mvt   INPUT=path/to/data.parquet
make build INPUT=path/to/data.parquet   # tiles + mvt
make server                              # starts on port 8765
make clean                               # removes datasets/*

API Endpoints

Once the server is running:

Method  Path                                             Description
GET     /                                                Interactive dataset selector
GET     /api/datasets                                    List all datasets
GET     /datasets.json                                   Search datasets by name
GET     /datasets/<dataset>.json                         Dataset metadata
GET     /datasets/<dataset>.html                         Dataset detail page
GET     /<dataset>/<z>/<x>/<y>.mvt                       Mapbox Vector Tile
GET     /datasets/<dataset>/features.<fmt>               Download features (csv/geojson)
POST    /datasets/<dataset>/features.<fmt>               Download with geometry filter
GET     /datasets/<dataset>/features/sample.json         Sample attributes
GET     /datasets/<dataset>/features/sample.geojson      Sample record with geometry
GET     /api/datasets/<dataset>/stats                    Attribute statistics
POST    /datasets/<dataset>/styles.json                  LLM-generated styling suggestions
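For scripting against the server, the routes above can be assembled with small helpers. Only the URL patterns come from the table; the helper names and the default base URL (the documented default host/port) are illustrative:

```python
BASE = "http://localhost:8765"  # assumption: `starlet serve` defaults

def tile_url(dataset: str, z: int, x: int, y: int, base: str = BASE) -> str:
    """URL for one Mapbox Vector Tile (GET /<dataset>/<z>/<x>/<y>.mvt)."""
    return f"{base}/{dataset}/{z}/{x}/{y}.mvt"

def features_url(dataset: str, fmt: str = "geojson", base: str = BASE) -> str:
    """URL for a feature download (GET /datasets/<dataset>/features.<fmt>)."""
    return f"{base}/datasets/{dataset}/features.{fmt}"
```

Either URL can be passed to any HTTP client, e.g. urllib.request.urlopen(tile_url("mydata", 7, 22, 51)).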

LLM Styling Suggestions

The POST /datasets/<dataset>/styles.json endpoint uses an LLM to generate map styling rules from dataset attribute statistics.

Provider Selection

Set the LLM_PROVIDER environment variable to choose a provider:

export LLM_PROVIDER=gemini   # default
export LLM_PROVIDER=ollama   # local Ollama

If LLM_PROVIDER is unset or set to an unrecognized value, the server falls back to Gemini.
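The selection rule can be sketched as follows (a hypothetical helper mirroring the documented behaviour, not starlet's code):

```python
import os

# Assumption: these names mirror the documented LLM_PROVIDER values.
VALID_PROVIDERS = {"gemini", "ollama"}

def resolve_provider(env=None) -> str:
    """Return the configured LLM provider, falling back to gemini."""
    env = os.environ if env is None else env
    choice = env.get("LLM_PROVIDER", "").strip().lower()
    return choice if choice in VALID_PROVIDERS else "gemini"
```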

Gemini (default)

Requires a Google AI Studio API key:

export GEMINI_API_KEY=your-key-here
starlet serve --dir datasets

Ollama (local)

Requires a running Ollama instance on the default port (11434):

ollama serve                  # start Ollama
ollama pull llama3            # pull a model (once)

export LLM_PROVIDER=ollama
starlet serve --dir datasets

To use a different model:

export OLLAMA_MODEL=mistral

See starlet/_internal/server/llm/README.md for full LLM provider documentation.

Example

# Full pipeline
starlet build --input ../data/TIGER2018_COUNTY.parquet --outdir datasets/TIGER2018_COUNTY

# Or via Make
make build INPUT=../data/TIGER2018_COUNTY.parquet

# Start the server
make server

Then open http://localhost:8765 and select a dataset to visualize.

Prerequisites

  • Python 3.10+
  • make (optional, for convenience targets)

Download files

Download the file for your platform.

Source Distribution

starlet-0.1.0.tar.gz (54.0 kB)

Built Distribution

starlet-0.1.0-py3-none-any.whl (67.1 kB)

File details

Details for the file starlet-0.1.0.tar.gz.

File metadata

  • Download URL: starlet-0.1.0.tar.gz
  • Size: 54.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for starlet-0.1.0.tar.gz
Algorithm    Hash digest
SHA256       5d63b78c1305ba1aa23645fb61e18858bac8153ed7dea39c1f414ba3265c4531
MD5          8c5f77fb12b24c3e98d01c33b3b718db
BLAKE2b-256  8edecaa5a9b8f01063c1a8c5ebc0685abc61ba47d5d17183b073d29067bbc0f7


Provenance

The following attestation bundles were made for starlet-0.1.0.tar.gz:

Publisher: publish.yml on ucr-bdlab/starlet

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file starlet-0.1.0-py3-none-any.whl.

File metadata

  • Download URL: starlet-0.1.0-py3-none-any.whl
  • Size: 67.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for starlet-0.1.0-py3-none-any.whl
Algorithm    Hash digest
SHA256       6d3b315cd51a4e98d8ef7f323d1817b244b7948d8be93b95b7d42263afd06a28
MD5          b0bedbeaac97221bf3190c1ec0242753
BLAKE2b-256  ce885eb885a13debf9288fea7158ae92585aba077372ac0243be8fc32588606c


Provenance

The following attestation bundles were made for starlet-0.1.0-py3-none-any.whl:

Publisher: publish.yml on ucr-bdlab/starlet

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
