spark-viewer-tui
A terminal UI for browsing and querying Delta Lake and Parquet tables with Apache Spark.
Built with Textual and PySpark.
GitHub: https://github.com/eritondev-stack/spark-viewer-tui
Features
- Catalog Browser - Sidebar tree with databases and tables
- SQL Editor - Write and execute Spark SQL queries with syntax highlighting
- Results Table - View query results with column types and row count
- Scan Paths - Auto-register Delta/Parquet folders as Spark tables
- Rescan - Refresh tables on demand (folders are live, Ctrl+R rescans)
- Save/Load Queries - Persist frequently used queries
- Themes - Multiple color themes (Transparent, Dracula, Gruvbox)
- Maximize - Focus on editor or results in full screen
Requirements
- Python 3.12+
- Java 17 (for PySpark), available via `JAVA_HOME` or as `java` on your `PATH`
Java Setup
macOS (Homebrew):
```shell
brew install openjdk@17
export JAVA_HOME=$(/usr/libexec/java_home -v 17)
```
Linux (Debian/Ubuntu):
```shell
sudo apt install openjdk-17-jdk
export JAVA_HOME=/usr/lib/jvm/java-17-openjdk-amd64
```
Add the `export JAVA_HOME=...` line to your `~/.bashrc` or `~/.zshrc` to make it persistent.
Verify:
```shell
java -version
```
Installation
```shell
pip install spark-viewer-tui
```
Or with uv:
```shell
uv pip install spark-viewer-tui
```
Usage
```shell
spark-viewer
```
Or run directly from source:
```shell
uv run spark-viewer
```
Keyboard Shortcuts
| Key | Action |
|---|---|
| `F2` | Spark Configuration (metastore, warehouse, scan paths) |
| `F3` | Save current query |
| `F4` | Load saved query |
| `Ctrl+R` | Start Spark session / Rescan paths |
| `Ctrl+E` | Execute SQL query |
| `Ctrl+T` | Change theme |
| `Ctrl+W` | Maximize editor or results |
| `Ctrl+C` | Exit |
Getting Started
1. Run `spark-viewer`
2. Press `F2` to configure:
   - Metastore DB Path - where Spark stores metadata (e.g. `/tmp/metastore_db`)
   - Warehouse Dir Path - Spark warehouse directory (e.g. `/tmp/spark-warehouse`)
   - Scan Paths - folders to scan for Delta/Parquet tables
3. Press `Ctrl+R` to start the Spark session
4. Click a table in the sidebar or write SQL in the editor
5. Press `Ctrl+E` to run the query
Seed (Example Data)
The package includes a seed command that creates 6 Delta tables with 500 rows each (employees, products, orders, customers, logs, metrics). Useful for testing and exploring the tool.
```shell
# Uses paths from spark_config.json
spark-viewer-seed

# Or specify paths manually
spark-viewer-seed --metastore-db ./metastore_db --warehouse-dir ./spark-warehouse
```
After seeding, run `spark-viewer` and press `Ctrl+R` to load the tables.
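Once the seed tables are loaded, any Spark SQL works in the editor. A minimal example, assuming the seed command registers the tables in the default database (the table name comes from the seed list above; the database placement is an assumption):

```sql
-- Row count for one of the seed tables; each seed table holds 500 rows
SELECT COUNT(*) AS row_count FROM employees;
```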
Scan Paths
Scan paths auto-register Delta and Parquet tables from a directory. Each scan path has a database name and a folder path.
```text
db_name: vendas
path: /data/warehouse
```
Subfolders are registered as tables:
- a subfolder containing `_delta_log/` -> Delta table
- a subfolder containing `.parquet` files -> Parquet table
Every `Ctrl+R` (Refresh Catalog) drops and recreates the databases from scan paths, keeping tables in sync with the filesystem.
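The detection rule above can be sketched in plain Python. This is an illustrative helper (`classify_subfolders` is a hypothetical name, not the package's actual code), showing how a scan path's subfolders map to table types:

```python
from pathlib import Path


def classify_subfolders(scan_path: str) -> dict[str, str]:
    """Map each subfolder name to "delta" or "parquet".

    Mirrors the scan-path rules: a subfolder with a _delta_log/
    directory is a Delta table; otherwise, a subfolder containing
    .parquet files is a Parquet table. Anything else is skipped.
    """
    tables: dict[str, str] = {}
    for sub in sorted(Path(scan_path).iterdir()):
        if not sub.is_dir():
            continue
        if (sub / "_delta_log").is_dir():
            tables[sub.name] = "delta"
        elif any(sub.glob("*.parquet")):
            tables[sub.name] = "parquet"
    return tables
```

Checking `_delta_log/` first matters: a Delta table's folder also contains `.parquet` data files, so the Parquet rule alone would misclassify it.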
Configuration
Settings are saved in `spark_config.json` in the project directory:
```json
{
  "metastore_db": "/tmp/metastore_db",
  "warehouse_dir": "/tmp/spark-warehouse",
  "scan_paths": [
    { "path": "/data/warehouse", "db_name": "vendas" },
    { "path": "/data/lake", "db_name": "analytics" }
  ]
}
```
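Reading this file needs nothing beyond the standard library. A minimal sketch (`load_config` and the empty-defaults fallback are illustrative, not the package's actual API):

```python
import json
from pathlib import Path


def load_config(path: str = "spark_config.json") -> dict:
    """Load spark_config.json, falling back to empty defaults
    so a first run without a config file still works."""
    p = Path(path)
    if not p.exists():
        return {"metastore_db": "", "warehouse_dir": "", "scan_paths": []}
    return json.loads(p.read_text())
```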
Themes are stored in `~/.config/spark-viewer-tui/themes.json`. The file is created automatically on first run with the default themes. Edit it to customize colors or add new themes.
Project Structure
```text
spark-viewer-tui/
├── src/
│   └── spark_viewer_tui/
│       ├── app.py               # Main application
│       ├── seed.py              # Seed example Delta tables
│       ├── config.py            # Configuration management
│       ├── spark_manager.py     # Spark session and table registration
│       ├── queries.py           # Query persistence
│       ├── themes.py            # Theme system
│       └── screens/
│           ├── spark_config.py  # Spark config modal (F2)
│           ├── save_query.py    # Save query modal (F3)
│           ├── load_query.py    # Load query modal (F4)
│           └── theme_selector.py  # Theme selector modal (Ctrl+T)
└── pyproject.toml
```
License
MIT