
wren-engine


Wren Engine CLI and Python SDK — semantic SQL layer for 20+ data sources.

Translate standard SQL queries through an MDL (Modeling Definition Language) semantic layer and execute them against your database. Powered by Apache DataFusion and Ibis.

Installation

pip install wren-engine                 # Core (DuckDB included)
pip install 'wren-engine[postgres]'     # PostgreSQL
pip install 'wren-engine[mysql]'        # MySQL
pip install 'wren-engine[bigquery]'     # BigQuery
pip install 'wren-engine[snowflake]'    # Snowflake
pip install 'wren-engine[clickhouse]'   # ClickHouse
pip install 'wren-engine[trino]'        # Trino
pip install 'wren-engine[mssql]'        # SQL Server
pip install 'wren-engine[databricks]'   # Databricks
pip install 'wren-engine[redshift]'     # Redshift
pip install 'wren-engine[spark]'        # Spark
pip install 'wren-engine[athena]'       # Athena
pip install 'wren-engine[oracle]'       # Oracle
pip install 'wren-engine[memory]'       # Schema & query memory (LanceDB)
pip install 'wren-engine[ui]'           # Browser-based profile form (starlette + uvicorn)
pip install 'wren-engine[main]'         # memory + interactive prompts + ui
pip install 'wren-engine[all]'          # All connectors + main

Requires Python 3.11+.

Quick start

1. Initialize a project — scaffolds a YAML-based MDL project:

mkdir my-project && cd my-project
wren context init

This creates wren_project.yml, models/, and views/. Edit wren_project.yml to set your data_source and add models under models/:

# wren_project.yml
schema_version: 2
name: my_project
catalog: wren
schema: public
data_source: postgres

# models/orders/metadata.yml
name: orders
table_reference:
  schema: mydb
  table: orders
columns:
  - name: order_id
    type: integer
  - name: customer_id
    type: integer
  - name: total
    type: double
  - name: status
    type: varchar
primary_key: order_id

Already have an MDL JSON? Import it directly: wren context init --from-mdl path/to/mdl.json
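
For orientation, an MDL JSON carries the same information as the YAML project above. A minimal sketch of what such a file might look like (the exact field names and casing, e.g. tableReference vs. table_reference, are assumptions here; treat a file exported by the engine as the authoritative shape):

```json
{
  "catalog": "wren",
  "schema": "public",
  "models": [
    {
      "name": "orders",
      "tableReference": { "schema": "mydb", "table": "orders" },
      "columns": [
        { "name": "order_id", "type": "integer" },
        { "name": "total", "type": "double" }
      ],
      "primaryKey": "order_id"
    }
  ]
}
```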

2. Configure a connection profile:

# Browser form (recommended, requires wren-engine[ui])
wren profile add my-db --ui

# Interactive terminal prompts
wren profile add my-db --interactive

# Import from an existing connection file
wren profile add my-db --from-file connection_info.json
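
The shape of connection_info.json depends on the data source; see docs/connections.md for the authoritative field list. A hypothetical PostgreSQL-style example, for illustration only:

```json
{
  "host": "localhost",
  "port": 5432,
  "user": "app_user",
  "password": "secret",
  "database": "mydb"
}
```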

3. Build the manifest:

wren context build

This compiles YAML files into target/mdl.json. The CLI auto-discovers this file when you run queries from within the project directory.

4. Run queries:

wren --sql 'SELECT order_id FROM "orders" LIMIT 10'

wren walks up from the current directory to find wren_project.yml and uses target/mdl.json. You can also pass --mdl path/to/mdl.json explicitly.

For the full CLI reference and the per-data-source connection fields, see docs/cli.md and docs/connections.md.

5. (Optional) Configure security policy — create ~/.wren/config.json:

{
  "strict_mode": true,
  "denied_functions": ["pg_read_file", "dblink", "lo_import"]
}

Key               Default   Description
strict_mode       false     When true, every table in a query must be defined in the MDL. Queries referencing undeclared tables are rejected before execution.
denied_functions  []        List of function names (case-insensitive) that are forbidden in queries.
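
Conceptually, the denied_functions check behaves like the following sketch (illustrative only, not the engine's actual implementation; is_denied is a hypothetical helper using a naive textual scan rather than a real SQL parser):

```python
import re

def is_denied(sql: str, denied_functions: list[str]) -> bool:
    """Return True if any denied function name appears as a call in `sql`.
    Matching is case-insensitive, mirroring the config's semantics."""
    for name in denied_functions:
        # \b<name>\s*( matches a function-call form of the name.
        if re.search(rf"\b{re.escape(name)}\s*\(", sql, re.IGNORECASE):
            return True
    return False
```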

6. (Optional) Index schema for semantic search (requires wren-engine[memory]):

wren memory index                              # index MDL schema
wren memory fetch -q "customer order price"    # fetch relevant schema context
wren memory store --nl "top customers" --sql "SELECT ..."  # store NL→SQL pair
wren memory recall -q "best customers"         # retrieve similar past queries

Connection profiles

Profiles let you store named connection configurations in ~/.wren/profiles.yml and switch between them easily — useful when working across multiple databases or environments.
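
The file's exact schema is not documented here, but a profiles store of this kind typically looks something like the following (a hypothetical sketch; key names are assumptions):

```yaml
# ~/.wren/profiles.yml (illustrative shape only)
active: prod
profiles:
  prod:
    data_source: postgres
    connection:
      host: prod-db.internal
      port: 5432
      user: app_user
      database: mydb
  local:
    data_source: duckdb
    connection:
      path: local.duckdb
```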

# Add a profile (browser form, interactive prompts, or file import)
wren profile add prod --ui                        # opens http://localhost:<port>
wren profile add staging --interactive            # terminal prompts
wren profile add local --from-file conn.json      # import existing file

# List and switch profiles
wren profile list                                 # * marks the active profile
wren profile switch prod

# Inspect a profile (sensitive fields masked)
wren profile debug prod

# Remove a profile
wren profile rm old-profile --force

The --ui flag opens a browser-based form that auto-derives fields from each datasource's schema — including file upload for BigQuery credentials, variant selection for Databricks/Redshift, and sensible defaults for all 20+ supported sources. Requires pip install 'wren-engine[ui]'.

Once a profile is active, wren uses it automatically:

wren profile switch prod
wren --sql 'SELECT COUNT(*) FROM "orders"'        # connects using prod profile

Python SDK

import base64, orjson
from wren import WrenEngine, DataSource

manifest = { ... }  # your MDL dict
manifest_str = base64.b64encode(orjson.dumps(manifest)).decode()

with WrenEngine(manifest_str, DataSource.mysql, {"host": "...", ...}) as engine:
    result = engine.query('SELECT * FROM "orders" LIMIT 10')
    print(result.to_pandas())
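
The manifest-encoding step can be exercised with the standard library alone (stdlib json with compact separators standing in for orjson in this sketch):

```python
import base64
import json

# Toy manifest; a real MDL dict carries catalog/schema/models as in Quick start.
manifest = {"catalog": "wren", "schema": "public", "models": []}

# Same serialize-then-base64 shape as the orjson-based snippet above.
manifest_str = base64.b64encode(
    json.dumps(manifest, separators=(",", ":")).encode()
).decode()

# Decoding the base64 string recovers the original dict.
roundtrip = json.loads(base64.b64decode(manifest_str))
assert roundtrip == manifest
```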

Development

just install-dev    # Install with dev dependencies
just lint           # Ruff format check + lint
just format         # Auto-fix

Command             What it runs                                                  Docker needed
just test-unit      Unit tests (engine, CTE rewriter, field registry, profiles)   No
just test-duckdb    DuckDB connector tests                                        No
just test-postgres  PostgreSQL connector tests                                    Yes
just test-mysql     MySQL connector tests                                         Yes
just test           All tests                                                     Yes

Profile web tests (test_profile_web.py) require wren-engine[ui]:

uv sync --extra dev --extra ui --find-links ../wren-core-py/target/wheels/
uv run pytest tests/test_profile_web.py -v

Publishing

./scripts/publish.sh            # Build + publish to PyPI
./scripts/publish.sh --test     # Build + publish to TestPyPI
./scripts/publish.sh --build    # Build only

License

Apache-2.0

Download files


Source Distribution

wren_engine-0.2.1rc0.tar.gz (278.5 kB)


Built Distribution


wren_engine-0.2.1rc0-py3-none-any.whl (88.1 kB)


File details

Details for the file wren_engine-0.2.1rc0.tar.gz.

File metadata

  • Download URL: wren_engine-0.2.1rc0.tar.gz
  • Size: 278.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for wren_engine-0.2.1rc0.tar.gz
Algorithm     Hash digest
SHA256        675fd3a899f2ef6b2743588104122cacf0a0d12f06273f21fa92be1e52aaca47
MD5           d15e4f63f85f9510305e6f5d84a8faea
BLAKE2b-256   cb373455a2dbe56595226b775a55925d896cf4c7649b22dfb4fc7b4f886ee9ad


Provenance

The following attestation bundles were made for wren_engine-0.2.1rc0.tar.gz:

Publisher: wren-pypi-publish.yml on Canner/wren-engine


File details

Details for the file wren_engine-0.2.1rc0-py3-none-any.whl.

File metadata

  • Download URL: wren_engine-0.2.1rc0-py3-none-any.whl
  • Size: 88.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for wren_engine-0.2.1rc0-py3-none-any.whl
Algorithm     Hash digest
SHA256        8298d9486adb7a98f4dc2eb3c05024357441f53594a3906381658f247bec3d8a
MD5           fce2ff576136a01f13d1125ee6f0c23b
BLAKE2b-256   cecfdbbf1acecb653bd07bc00af801daf714073f396135733e5734f4d498b29e


Provenance

The following attestation bundles were made for wren_engine-0.2.1rc0-py3-none-any.whl:

Publisher: wren-pypi-publish.yml on Canner/wren-engine

