vibeSpatial
GPU-first spatial analytics for Python — drop-in GeoPandas replacement backed by CUDA kernels
vibeSpatial is a GPU-first spatial analytics library for Python. Change one import line and your existing GeoPandas code runs on CUDA — binary predicates, buffer, overlay, dissolve, make-valid, spatial joins, and I/O all dispatch to GPU kernels automatically, with explicit, observable CPU compatibility fallback only when the native GPU path is unavailable or unsupported.
> [!WARNING]
> vibeSpatial is still early, but the public GPU path is now the design center: the April 20, 2026 local GPU health gate reports 95.09% value-weighted GPU acceleration across tracked public dispatches. File an issue if you hit a fallback, correctness mismatch, or unexpected host transfer.
The repository enforces fallback observability: once a workflow is on device, hidden host exits are treated as bugs, and strict-native tests fail if a path materializes to host without first recording an explicit fallback or compatibility boundary. The maintained warmed 10k public shootout suite under benchmarks/shootout/ currently passes with matching fingerprints on local RTX 4090 runs; heavier workflows show clear wins, while tiny CPU-shaped workloads are treated as crossover signals rather than benchmark theater.
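The "no silent fallback" policy can be illustrated with a small, library-agnostic sketch. Everything here (`gpu_dispatch`, `FALLBACK_LOG`, `STRICT_NATIVE`) is hypothetical and not vibeSpatial's actual API; it only shows the shape of the policy: every dispatch either stays on the GPU path or records an observable fallback event, which a strict mode turns into a hard failure.

```python
# Illustrative sketch of "no silent fallback". The names gpu_dispatch,
# FALLBACK_LOG, and STRICT_NATIVE are hypothetical, not vibeSpatial API.
import functools

FALLBACK_LOG: list[str] = []   # observable record of every host exit
STRICT_NATIVE = False          # strict mode: a fallback raises instead of logging


class GpuUnsupported(Exception):
    """Raised by a GPU path that cannot handle the given inputs."""


def gpu_dispatch(cpu_impl):
    """Wrap a GPU implementation with an explicit, recorded CPU fallback."""
    def decorator(gpu_impl):
        @functools.wraps(gpu_impl)
        def wrapper(*args, **kwargs):
            try:
                return gpu_impl(*args, **kwargs)
            except GpuUnsupported as exc:
                event = f"{gpu_impl.__name__}: fallback ({exc})"
                if STRICT_NATIVE:
                    raise RuntimeError(event) from exc
                FALLBACK_LOG.append(event)   # explicit, never silent
                return cpu_impl(*args, **kwargs)
        return wrapper
    return decorator


def buffer_cpu(xs, dist):
    return [x + dist for x in xs]


@gpu_dispatch(buffer_cpu)
def buffer(xs, dist):
    # Pretend negative buffer distances lack a native kernel.
    if dist < 0:
        raise GpuUnsupported("negative distance")
    return [x + dist for x in xs]


print(buffer([1.0, 2.0], 10.0))   # GPU path, nothing recorded
print(buffer([1.0, 2.0], -1.0))   # CPU path, fallback recorded
print(FALLBACK_LOG)
```

In this framing, a strict-native test simply flips `STRICT_NATIVE` on and asserts the workload completes, which is roughly the guarantee the repository's tests enforce.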
Install
pip install vibespatial # CPU-only (GeoPandas drop-in)
pip install vibespatial[cu12] # CUDA 12 GPU acceleration
pip install vibespatial[cu13] # CUDA 13 GPU acceleration
Quick start
import vibespatial as gpd
gdf = gpd.read_file("my_data.gpkg")
buffered = gdf.buffer(100)
joined = gpd.sjoin(gdf, buffered)
gdf.to_parquet("out.parquet")
Real-world example: 7.2 million buildings
Load every building footprint in Florida, reproject to UTM, find all buildings
within 1 km of a random pick, and export to GeoParquet. The full script is
at examples/nearby_buildings.py.
import random

import vibespatial as gpd
# Read 7.2M buildings from Microsoft US Building Footprints
gdf = gpd.read_file("Florida.geojson")
# Reproject to UTM for metric distances
gdf_utm = gdf.to_crs(gdf.geometry.estimate_utm_crs())
# Pick a random building and find everything within 1 km
seed = gdf_utm.geometry.iloc[random.randrange(len(gdf_utm))]
nearby = gdf_utm[gdf_utm.geometry.dwithin(seed.centroid, 1_000)]
# Export to GeoParquet
nearby.to_crs(epsg=4326).to_parquet("nearby_buildings.parquet")
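For context, `estimate_utm_crs()` picks the UTM zone covering the data's extent. The zone arithmetic behind it can be sketched for a single point (this is the standard WGS 84 / UTM EPSG numbering, a simplification of the extent-based logic, and not vibeSpatial internals):

```python
import math

def utm_epsg(lon: float, lat: float) -> int:
    """Standard WGS 84 UTM EPSG code for a point: 326xx north, 327xx south."""
    zone = int(math.floor((lon + 180.0) / 6.0)) + 1
    zone = min(max(zone, 1), 60)            # clamp at the antimeridian edge
    return (32600 if lat >= 0 else 32700) + zone

# Central Florida (~28.5N, 81.4W) falls in UTM zone 17 north.
print(utm_epsg(-81.4, 28.5))   # → 32617
```

So for the Florida dataset above, `to_crs(gdf.geometry.estimate_utm_crs())` effectively reprojects into EPSG:32617, where distances are in metres and the 1 km `dwithin` query is meaningful.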
vibeSpatial is a drop-in replacement for GeoPandas. Here is the only diff:
-import geopandas as gpd
+import vibespatial as gpd
gdf = gpd.read_file("Florida.geojson")
gdf_utm = gdf.to_crs(gdf.geometry.estimate_utm_crs())
seed = gdf_utm.geometry.iloc[random.randrange(len(gdf_utm))]
nearby = gdf_utm[gdf_utm.geometry.dwithin(seed.centroid, 1_000)]
nearby.to_crs(epsg=4326).to_parquet("nearby_buildings.parquet")
Performance on 7.2M polygons (GeoPandas CPU baseline vs current public vibeSpatial run on local RTX 4090 / i9-13900K):
| Step | GeoPandas | vibeSpatial | Speedup |
|---|---|---|---|
| Read GeoJSON | 57.7 s | 6.7 s | 8.6x |
| Reproject to UTM | 8.2 s | 0.1 s | 82x |
| Select within 1 km | 0.2 s | 0.2 s | 1.0x |
| End-to-end including GeoParquet export | 66.3 s | 8.0 s | 8.3x |
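The speedup column is just the ratio of wall times, reproducible from the table:

```python
# Recompute the speedup column from the measured wall times above.
steps = {
    "read_geojson":  (57.7, 6.7),
    "reproject_utm": (8.2, 0.1),
    "select_1km":    (0.2, 0.2),
    "end_to_end":    (66.3, 8.0),
}

for name, (cpu_s, gpu_s) in steps.items():
    print(f"{name}: {cpu_s / gpu_s:.1f}x")
```

Note the 1.0x row is the crossover regime mentioned earlier: at 0.2 s the selection step is too small for the GPU to matter, and it barely affects the end-to-end ratio, which is dominated by the read and reprojection steps.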
The vibeSpatial column is the public
examples/nearby_buildings.py path, not a
private benchmark hook. GeoJSON reading uses GPU byte-classification: NVRTC
kernels parse JSON structure, detect geometry families, extract coordinates,
and assemble geometry into owned device buffers. Property payloads are decoded
through a narrowed host seam. Reprojection uses
vibeProj fused GPU kernels via
transform_buffers(), with no host round-trip. Spatial queries use a
device-resident bounding-box prefilter plus GPU distance kernels.
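The prefilter-then-refine pattern is standard: cheap axis-aligned bounding-box comparisons cull most candidates before exact distance math runs. A NumPy sketch of the same two-stage dwithin query, on points for brevity (vibeSpatial's kernels operate on full geometries on device):

```python
import numpy as np

rng = np.random.default_rng(0)
pts = rng.uniform(0, 100_000, size=(1_000_000, 2))   # 1M points, metres
seed = np.array([50_000.0, 50_000.0])
radius = 1_000.0

# Stage 1: bounding-box prefilter -- cheap per-axis comparisons.
lo, hi = seed - radius, seed + radius
candidates = np.all((pts >= lo) & (pts <= hi), axis=1)

# Stage 2: exact Euclidean distance, only on the surviving candidates.
cand_pts = pts[candidates]
within = np.linalg.norm(cand_pts - seed, axis=1) <= radius

print(int(candidates.sum()), "bbox candidates ->", int(within.sum()), "within 1 km")
```

Both stages map naturally onto data-parallel GPU kernels: the prefilter is an element-wise comparison plus a stream compaction, and the refine stage only touches the compacted survivors.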
Current GPU coverage
The April 20, 2026 local GPU health gate reports 95.09% value-weighted GPU acceleration across tracked public dispatches:
| Surface | GPU work coverage |
|---|---|
| I/O write | 99.88% |
| Query | 95.51% |
| I/O read | 94.71% |
| Other public APIs | 94.48% |
| Constructive | 85.92% |
| Overlay | 76.14% |
| Dissolve | 54.43% |
The remaining work is concentrated in exact constructive/overlay/dissolve paths and uncommon compatibility boundaries. Silent CPU fallback is not an accepted success mode.
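"Value-weighted" here means coverage is weighted by how much work each dispatch represents rather than counting dispatches equally, so a fallback on a heavy overlay costs more than one on a trivial query. A sketch of the metric under that reading (the exact weighting scheme is an assumption; the health gate's real formula lives in the repo):

```python
def value_weighted_coverage(dispatches):
    """Weight each dispatch's GPU fraction by its total work (e.g. wall time)."""
    total = sum(work for work, _ in dispatches)
    return 100.0 * sum(work * gpu_frac for work, gpu_frac in dispatches) / total

# (work_units, fraction of that work done on GPU) per tracked dispatch
sample = [(10.0, 1.0), (5.0, 0.8), (1.0, 0.0)]
print(f"{value_weighted_coverage(sample):.2f}%")   # → 87.50%
```

Under this kind of metric, the dissolve row's 54.43% means roughly half of dissolve's work still runs on host, which is why it is called out as concentrated remaining work.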
Tech stack
| Layer | Technology |
|---|---|
| GPU kernels | NVRTC (runtime-compiled CUDA C via cuda-python) |
| GPU primitives | CCCL (cccl — scan, sort, reduce, select) |
| GPU arrays | CuPy (device memory, element-wise ops, prefix sums) |
| GPU JSON parse | Custom byte-classification kernels (ADR-0038) |
| GPU projection | vibeProj |
| GPU Parquet/Arrow | pylibcudf (WKB decode, GeoArrow codec) |
| CPU compatibility | GeoPandas API (vendored upstream test suite) |
| JSON parsing | orjson (property extraction) |
| File I/O | native GPU/hybrid routes for GeoJSON, Shapefile, FlatGeobuf, GeoJSONSeq, OSM PBF; pyogrio for GDAL compatibility |
| Packaging | uv, hatchling |
All GPU kernels ship as pure Python: CUDA C source strings compiled at
runtime via NVRTC with background warmup (ADR-0034). Compiled CUBINs are
cached on disk so the JIT cost is paid only once per install. No compiled
extensions, no nvcc build step. The entire suite ships as pure-Python
wheels:
| Package | Wheel size |
|---|---|
| vibespatial | 612 KB |
| vibeproj | 57 KB |
| vibespatial-raster | 51 KB |
| Total | 720 KB |
Pre-compilation
The first time a GPU operation runs, CUDA kernels are JIT-compiled in the background (~2-3 s wall time on 8 threads). Compiled CUBINs are cached on disk so subsequent process starts are near-instant. To pre-populate the caches (e.g. in CI or after install):
from vibespatial.cccl_precompile import precompile_all
precompile_all() # compiles all 21 CCCL specs + 61 NVRTC kernels, blocks until done
Or from the command line:
uv run python -c "from vibespatial.cccl_precompile import precompile_all; precompile_all()"
See GPU Kernel Caching for the full design and environment variables.
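The disk cache follows the usual content-addressed pattern: key the compiled artifact by a hash of the kernel source plus compile options, compile on miss, reuse on hit. A stdlib-only sketch of that pattern (file names, layout, and the stand-in compile step are illustrative, not vibeSpatial's actual cache format):

```python
import hashlib
import tempfile
from pathlib import Path

CACHE_DIR = Path(tempfile.gettempdir()) / "kernel-cache-demo"

def cache_key(source: str, options: tuple[str, ...]) -> str:
    """Content-address the artifact: same source + options -> same key."""
    h = hashlib.sha256()
    h.update(source.encode())
    for opt in options:
        h.update(opt.encode())
    return h.hexdigest()

def compile_kernel(source: str) -> bytes:
    # Stand-in for the NVRTC compile step; returns fake "CUBIN" bytes.
    return f"cubin:{len(source)}".encode()

def get_kernel(source: str, options: tuple[str, ...] = ("-arch=sm_89",)) -> bytes:
    CACHE_DIR.mkdir(parents=True, exist_ok=True)
    path = CACHE_DIR / (cache_key(source, options) + ".cubin")
    if path.exists():                       # hit: JIT cost already paid
        return path.read_bytes()
    cubin = compile_kernel(source)          # miss: compile once, persist
    path.write_bytes(cubin)
    return cubin

src = 'extern "C" __global__ void axpy(float a, float* x, float* y) { }'
first = get_kernel(src)    # compiles and writes to the cache
second = get_kernel(src)   # served from disk
print(first == second)     # → True
```

Keying on source plus options is what makes the cache safe across upgrades: any change to the kernel text or compile flags produces a new key, so stale CUBINs are never reused.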
Documentation
See the documentation for the full API reference, GPU acceleration guide, and I/O format support matrix.
Contributing
uv sync --group dev
uv run python scripts/check_docs.py --refresh
uv run python scripts/vendor_geopandas_tests.py
uv run pytest tests/upstream/geopandas/tests/test_config.py
Dependency groups
- dev: local development and pytest tooling
- upstream-optional: heavier I/O and visualization extras for broader coverage
- gpu-optional: CUDA runtime, CuPy, pylibcudf
Layout
- src/vibespatial/: package code
- src/geopandas/: GeoPandas compatibility shim
- tests/: repo-owned tests
- tests/upstream/geopandas/: vendored upstream GeoPandas test suite
- docs/: architecture docs and ADRs
- examples/: benchmarks and usage examples