DataLex
Git-native data modeling for dbt users.
Point us at your dbt project and warehouse — we produce versioned, reviewable YAML with contracts, lineage, ERDs, and clean round-trip back to dbt.
Quickstart — two commands
```bash
pip install 'datalex-cli[serve]'   # CLI + bundled Node — one command, no prereqs
datalex serve                      # opens http://localhost:3030
```
That's it. No Node install, no Docker, no database. The `[serve]` extra pulls a portable Node runtime, so Python alone is enough. If you already have Node 20+ on PATH, plain `pip install datalex-cli` works too.
Point it at your dbt repo:
```bash
cd ~/my-dbt-project   # folder containing dbt_project.yml
datalex serve --project-dir .
```
The folder auto-registers as your active project; the browser opens
straight into your real file tree. Every UI edit writes back to the
original .yml files — git status shows real diffs.
Build your first ER diagram:
- Click Import dbt repo → Local folder → pick your project root.
- In the Explorer, right-click any folder → New diagram here… (or use the Explorer toolbar's New Diagram button for the default `datalex/diagrams/` location).
- Open the new `.diagram.yaml` and click Add Entities on the canvas toolbar — multi-select with search + domain filter, then confirm. Entities auto-layout via ELK on add. You can also drag any `schema.yml`/`.model.yaml` from the Explorer onto the canvas as an alternative — FK edges from `tests: - relationships:` render automatically.
- Drag to reposition → Save All → positions persist in the diagram file; `git commit` picks them up. Save All is merge-safe: multiple in-memory docs targeting the same `schema.yml` are merged through the core-engine `merge_models_preserving_docs` helper instead of clobbering siblings.
See docs/getting-started.md for the full path matrix (demo → local dbt → git URL → live warehouse).
Want your warehouse drivers too?
```bash
pip install 'datalex-cli[serve,postgres]'   # or snowflake, bigquery, databricks…
pip install 'datalex-cli[serve,all]'        # every driver + Node
```
Pick a tutorial
Once datalex serve is running, follow the path that matches what you
have in hand:
| You have... | Tutorial | Time |
|---|---|---|
| Nothing — just want the demo | Jaffle-shop one-click walkthrough | 3 min |
| An existing dbt project (folder or git) | Import an existing dbt project | 5 min |
| A live warehouse (Snowflake/Postgres/…) | Pull a warehouse schema | 7 min |
| CLI-only, no UI | CLI dbt-sync tutorial | 5 min |
New here? Start with docs/getting-started.md — it's the map across all four paths plus the mental model.
60-second demo (offline, no warehouse)
```bash
pip install 'datalex-cli[duckdb]'
git clone https://github.com/duckcode-ai/DataLex.git
cd DataLex

# 1. Build a local DuckDB warehouse (no external credentials)
python examples/jaffle_shop_demo/setup.py

# 2. Sync the dbt project into DataLex YAML
datalex dbt sync examples/jaffle_shop_demo \
  --out-root examples/jaffle_shop_demo/datalex-out

# 3. Emit dbt-parseable YAML back, with contracts enforced
datalex dbt emit examples/jaffle_shop_demo/datalex-out \
  --out-dir examples/jaffle_shop_demo/dbt-out
```
Open examples/jaffle_shop_demo/datalex-out/sources/jaffle_shop_raw.yaml —
every column has its warehouse type, descriptions from the manifest, and a
meta.datalex.dbt.unique_id stamp so re-running the sync never clobbers
anything you've hand-authored.
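The no-clobber guarantee can be pictured with a small sketch. The function and field choices below are illustrative assumptions, not DataLex's actual internals: warehouse-derived fields are refreshed from a fresh sync, while hand-authored fields win on conflict, so running sync twice is a no-op for your edits.

```python
# Hedged sketch of an idempotent sync merge (illustrative only; the real
# behavior lives in DataLex's core engine and differs in detail).
# Warehouse-derived fields (data_type) are refreshed on every sync,
# while hand-authored fields (description, tags, tests) survive.

def merge_synced(existing: dict, synced: dict) -> dict:
    """Merge a fresh sync result into a hand-edited entity document."""
    merged = dict(synced)  # start from the fresh warehouse/manifest view
    # User-authored top-level fields always win over the sync output.
    for key in ("description", "tags", "sensitivity", "tests"):
        if key in existing:
            merged[key] = existing[key]
    # Merge columns by name: refresh types, keep hand-written docs/tests.
    old_cols = {c["name"]: c for c in existing.get("columns", [])}
    merged["columns"] = [
        {**col, **{k: v for k, v in old_cols.get(col["name"], {}).items()
                   if k in ("description", "tests")}}
        for col in synced.get("columns", [])
    ]
    return merged

existing = {"description": "Hand-written docs",
            "columns": [{"name": "id", "data_type": "integer",
                         "description": "Primary key"}]}
synced = {"columns": [{"name": "id", "data_type": "bigint"}]}

result = merge_synced(existing, synced)
# data_type is refreshed from the warehouse...
assert result["columns"][0]["data_type"] == "bigint"
# ...but the hand-authored descriptions survive the re-sync.
assert result["description"] == "Hand-written docs"
assert result["columns"][0]["description"] == "Primary key"
```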
What it does
DataLex treats your data models as code. On top of a stricter YAML substrate (the DataLex layout — one file per entity, `kind:`-dispatched, streaming-safe for 10K+ entities), it gives you:

- `datalex dbt sync <project>` — reads `target/manifest.json` + your `profiles.yml`, introspects live column types, and merges them into DataLex YAML. Idempotent: user-authored `description:`, `tags:`, `sensitivity:`, and `tests:` survive re-sync.
- `datalex dbt emit` — writes `sources.yml` and `schema.yml` with `contract.enforced: true` and `data_type:` on every column. `dbt parse` succeeds out of the box.
- `datalex emit ddl --dialect ...` — Postgres, Snowflake, BigQuery, Databricks, MySQL, SQL Server, Redshift. Same source, all dialects.
- `datalex diff` — semantic diff with explicit rename tracking (`previous_name:`), breaking-change gate for CI.
- Cross-repo package imports — pin `acme/warehouse-core@1.4.0` in `imports:`, lockfile + content-hash drift detection, Git-or-path resolution, on-disk parse cache for large projects.
- Visual studio — React Flow UI for editing entities, relationships, and metadata; same YAML files as the CLI.
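The rename-tracking idea behind the semantic diff can be sketched in a few lines. This is a hypothetical reconstruction of the concept, not DataLex's diff engine: a column that disappears is breaking, unless a column in the new version declares `previous_name:` pointing back at it.

```python
# Hedged sketch of a semantic diff with rename tracking (illustrative;
# the real `datalex diff` works on the DataLex YAML layout, not raw dicts).

def diff_columns(old: list, new: list) -> dict:
    old_names = {c["name"] for c in old}
    new_names = {c["name"] for c in new}
    # previous_name: declares an explicit rename, so it is not a removal.
    renames = {c["previous_name"]: c["name"]
               for c in new if c.get("previous_name")}
    added = new_names - old_names - set(renames.values())
    removed = old_names - new_names - set(renames)  # truly gone = breaking
    return {"added": sorted(added), "renamed": renames,
            "breaking": sorted(removed)}

old = [{"name": "customer_id"}, {"name": "email"}]
new = [{"name": "customer_id"},
       {"name": "email_address", "previous_name": "email"},  # tracked rename
       {"name": "signup_date"}]

report = diff_columns(old, new)
assert report["renamed"] == {"email": "email_address"}
assert report["added"] == ["signup_date"]
assert report["breaking"] == []  # a tracked rename is not a breaking removal
```

A CI gate in this spirit would exit non-zero whenever `breaking` is non-empty, which is what `--exit-on-breaking` does for the real command.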
Supported warehouses
| Warehouse | dbt sync introspection | Forward DDL | Reverse engineering |
|---|---|---|---|
| DuckDB | ✓ | — | — |
| PostgreSQL | ✓ | ✓ | ✓ |
| Snowflake | (fallback) | ✓ | ✓ |
| BigQuery | (fallback) | ✓ | ✓ |
| Databricks | (fallback) | ✓ | ✓ |
| MySQL | (fallback) | ✓ | ✓ |
| SQL Server / Azure SQL | (fallback) | ✓ | ✓ |
| Redshift | (fallback) | ✓ | ✓ |
"Fallback" = uses the existing full-schema connector (slower than the per-table path but already works today; a narrow introspection path ships per-dialect over time).
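What "introspects live column types" means in practice can be shown with a stand-in: below, SQLite's catalog plays the warehouse purely for illustration. DataLex itself talks to DuckDB, Postgres, and the rest through their own catalogs; only the shape of the operation is the same.

```python
# Stand-in sketch of live column-type introspection. SQLite is used here
# only because it ships with Python; a real sync queries the warehouse's
# catalog (information_schema or equivalent) per dialect.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE customers (
        id INTEGER PRIMARY KEY,
        email TEXT NOT NULL,
        signup_date DATE
    )
""")

# PRAGMA table_info returns (cid, name, type, notnull, default, pk) rows.
columns = [
    {"name": row[1], "data_type": row[2], "nullable": not row[3]}
    for row in conn.execute("PRAGMA table_info(customers)")
]

assert [c["name"] for c in columns] == ["id", "email", "signup_date"]
assert columns[1] == {"name": "email", "data_type": "TEXT", "nullable": False}
```

These per-column records are exactly the shape that gets merged into the entity YAML as `data_type:` during sync.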
Install
For users — from PyPI:
```bash
pip install 'datalex-cli[serve]'            # CLI + UI (recommended)
pip install 'datalex-cli[serve,postgres]'   # add a warehouse driver
pip install 'datalex-cli[serve,all]'        # every driver + UI
pip install datalex-cli                     # CLI-only, no UI
```
Available extras: serve, duckdb, postgres, mysql, snowflake,
bigquery, databricks, sqlserver, redshift, all.
Prereqs: Python 3.9+ and Git. That's it — [serve] bundles Node.
For contributors — from source:
```bash
git clone https://github.com/duckcode-ai/DataLex.git
cd DataLex
python3 -m venv .venv && source .venv/bin/activate
pip install -e '.[serve,duckdb]'
datalex serve   # auto-builds the UI on first run
```
Project layout
```
DataLex/
  packages/
    core_engine/          # Python: loader, dialects, dbt integration, packages
      src/datalex_core/
        _schemas/datalex/ # JSON Schema per `kind:` — bundled with the package
    cli/                  # `datalex` entry point
    api-server/           # Node.js API (UI backend)
    web-app/              # React Flow studio
  examples/
    jaffle_shop_demo/     # zero-setup dbt-sync demo (DuckDB)
    model-examples/       # sample projects and scenario walkthroughs
  docs/                   # architecture, specs, runbooks
  tests/                  # unittest suite (core engine + datalex)
```
Visual Studio
datalex serve ships the full UI — no extra setup. If you're hacking
on the web app itself and want hot-reload, run the two dev servers from
a source checkout:
```bash
# Terminal 1 — api (port 3030)
npm --prefix packages/api-server run dev

# Terminal 2 — web (port 5173)
npm --prefix packages/web-app run dev
```
The UI reads and writes the same YAML files the CLI does — no database, no hosted service.
CI / GitOps
DataLex is designed to live in your repo next to your dbt project. A typical CI step:
```bash
datalex validate datalex/
datalex diff datalex-main/ datalex/ --exit-on-breaking
datalex dbt emit datalex/ --out-dir dbt/
dbt parse
```
Documentation
Onboarding
- Getting started — the one-page map covering install, the three GUI paths, and the mental model.
- Jaffle-shop walkthrough — 3-minute offline demo of every UI feature.
- Import an existing dbt project — 5-minute bring-your-own-repo flow (local folder or git URL).
- Pull a warehouse schema — 7-minute live-connection flow with inferred PKs/FKs and streaming progress.
- CLI dbt-sync tutorial — original CLI-only jaffle_shop walkthrough.
Reference
- DataLex layout reference — what each `kind:` file looks like and how the loader discovers them.
- CLI cheat sheet — every `datalex …` subcommand on one page.
- API contracts — HTTP API reference for integrators.
- Architecture — core engine modules and end-to-end data flow.
- Pre-DataLex specs have moved to docs/archive/.