Starlet

Spatial tiling, MVT generation, and tile serving for geospatial data.

Setup

python -m venv .venv
source .venv/bin/activate
pip install -e .

CLI

All commands are available through the starlet CLI.

starlet --help

starlet tile — Partition a dataset

starlet tile --input data.parquet --outdir datasets/mydata --num-tiles 40
Flag              Default     Description
--input           (required)  Path to GeoParquet or GeoJSON file
--outdir          (required)  Output dataset directory
--num-tiles       40          Target number of spatial partitions
--partition-size  1gb         Target partition size (e.g. 512mb, 1gb)
--sort            zorder      Sort order: zorder, hilbert, columns, none
--sample-cap      10000       Reservoir sampling cap for centroids
--compression     zstd        Parquet compression codec
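
For example, the partitioning flags can be combined in a single invocation. This is only a sketch using the flags documented above; the input and output paths are placeholders:

# Partition into ~512 MB tiles ordered along a Hilbert curve
starlet tile \
  --input data.parquet \
  --outdir datasets/mydata \
  --partition-size 512mb \
  --sort hilbert \
  --compression zstd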

starlet mvt — Generate vector tiles

starlet mvt --dir datasets/mydata --zoom 7 --threshold 100000
Flag         Default      Description
--dir        (required)   Dataset directory with parquet_tiles/ and histograms/
--zoom       7            Maximum zoom level
--threshold  0            Minimum feature count per tile
--outdir     <dir>/mvt/   MVT output directory
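
A fuller invocation, again using only the flags listed above (zoom, threshold, and output values are illustrative):

# Build tiles up to zoom 10 with a minimum feature count of 50,000 per tile
starlet mvt \
  --dir datasets/mydata \
  --zoom 10 \
  --threshold 50000 \
  --outdir datasets/mydata/mvt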

starlet build — Full pipeline (tile + MVT)

starlet build --input data.parquet --outdir datasets/mydata

starlet serve — Launch the tile server

starlet serve --dir datasets --port 8765
Flag          Default     Description
--dir         (required)  Root directory containing dataset subdirectories
--host        0.0.0.0     Host to bind
--port        8765        Port to bind
--cache-size  256         In-memory tile cache size
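
For example, to bind the server to localhost only, on a non-default port, with a larger tile cache (values are illustrative):

starlet serve \
  --dir datasets \
  --host 127.0.0.1 \
  --port 9000 \
  --cache-size 1024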

starlet info — Inspect a dataset

starlet info --dir datasets/mydata

Make Targets

Convenience wrappers around the CLI:

make tiles INPUT=path/to/data.parquet
make mvt   INPUT=path/to/data.parquet
make build INPUT=path/to/data.parquet   # tiles + mvt
make server                              # starts on port 8765
make clean                               # removes datasets/*

API Endpoints

Once the server is running:

Method  Path                                           Description
GET     /                                              Interactive dataset selector
GET     /api/datasets                                  List all datasets
GET     /datasets.json                                 Search datasets by name
GET     /datasets/<dataset>.json                       Dataset metadata
GET     /datasets/<dataset>.html                       Dataset detail page
GET     /<dataset>/<z>/<x>/<y>.mvt                     Mapbox Vector Tile
GET     /datasets/<dataset>/features.<fmt>             Download features (csv/geojson)
POST    /datasets/<dataset>/features.<fmt>             Download with geometry filter
GET     /datasets/<dataset>/features/sample.json       Sample attributes
GET     /datasets/<dataset>/features/sample.geojson    Sample record with geometry
GET     /api/datasets/<dataset>/stats                  Attribute statistics
POST    /datasets/<dataset>/styles.json                LLM-generated styling suggestions
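
With the server running on the default port, the read-only endpoints can be exercised with curl; the dataset name mydata below is a placeholder for one of your dataset directories:

# List all datasets
curl http://localhost:8765/api/datasets

# Fetch metadata for one dataset
curl http://localhost:8765/datasets/mydata.json

# Fetch a single vector tile (zoom 7, x=20, y=49)
curl -o tile.mvt http://localhost:8765/mydata/7/20/49.mvt

# Download all features as GeoJSON
curl -o features.geojson http://localhost:8765/datasets/mydata/features.geojson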

LLM Styling Suggestions

The POST /datasets/<dataset>/styles.json endpoint uses an LLM to generate map styling rules from dataset attribute statistics.
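
The request body for this endpoint is not documented here; assuming it can be called with an empty POST body, a request might look like this (mydata is a placeholder dataset name):

curl -X POST http://localhost:8765/datasets/mydata/styles.json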

Provider Selection

Set the LLM_PROVIDER environment variable to choose a provider:

export LLM_PROVIDER=gemini   # default
export LLM_PROVIDER=ollama   # local Ollama

If the variable is unset or set to an unrecognized value, the server falls back to Gemini.

Gemini (default)

Requires a Google AI Studio API key:

export GEMINI_API_KEY=your-key-here
starlet serve --dir datasets

Ollama (local)

Requires a running Ollama instance on the default port (11434):

ollama serve                  # start Ollama
ollama pull llama3            # pull a model (once)

export LLM_PROVIDER=ollama
starlet serve --dir datasets

To use a different model:

export OLLAMA_MODEL=mistral

See starlet/_internal/server/llm/README.md for full LLM provider documentation.

Example

# Full pipeline
starlet build --input ../data/TIGER2018_COUNTY.parquet --outdir datasets/TIGER2018_COUNTY

# Or via Make
make build INPUT=../data/TIGER2018_COUNTY.parquet

# Start the server
make server

Then open http://localhost:8765 and select a dataset to visualize.

Prerequisites

  • Python 3.10+
  • make (optional, for convenience targets)

