drt — data reverse tool

Reverse ETL for the code-first data stack.


drt syncs data from your data warehouse to external services — declaratively, via YAML and CLI. Think dbt run → drt run. Same developer experience, opposite data direction.

[demo GIF: drt quickstart]

pip install drt-core          # core (DuckDB included)
drt init && drt run

Why drt?

| Problem | drt's answer |
| --- | --- |
| Census/Hightouch are expensive SaaS | Free, self-hosted OSS |
| GUI-first tools don't fit CI/CD | CLI + YAML, Git-native |
| dbt/dlt ecosystem has no reverse leg | Same philosophy, same DX |
| LLM/MCP era makes GUI SaaS overkill | LLM-native by design |

Quickstart

1. Install

pip install drt-core[bigquery]
# or
uv add drt-core[bigquery]

2. Initialize a project

mkdir my-drt-project && cd my-drt-project
drt init

This creates:

my-drt-project/
├── drt_project.yml   # project config
└── syncs/            # put your sync definitions here

drt init prompts for source type: bigquery, duckdb, or postgres.

3. Create a sync

# syncs/notify_slack.yml
name: notify_slack
description: "Notify Slack on new users"
model: ref('new_users')
destination:
  type: rest_api
  url: "https://hooks.slack.com/services/YOUR/WEBHOOK/URL"
  method: POST
  headers:
    Content-Type: "application/json"
  body_template: |
    { "text": "New user: {{ row.name }} ({{ row.email }})" }
sync:
  mode: full
  batch_size: 100
  rate_limit:
    requests_per_second: 5
  on_error: skip
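
The body_template above is rendered once per row, with {{ row.field }} placeholders filled from that row's columns. A minimal sketch of that kind of substitution (a hypothetical render_template helper for illustration, not drt's actual renderer, which may use a full template engine):

```python
import re

def render_template(template: str, row: dict) -> str:
    """Replace {{ row.<field> }} placeholders with values from a row dict.

    Illustrative only -- drt's real renderer may differ.
    """
    def repl(match: re.Match) -> str:
        field = match.group(1)
        # Missing fields render as an empty string in this sketch.
        return str(row.get(field, ""))

    return re.sub(r"\{\{\s*row\.(\w+)\s*\}\}", repl, template)

body_template = '{ "text": "New user: {{ row.name }} ({{ row.email }})" }'
row = {"name": "Ada", "email": "ada@example.com"}
print(render_template(body_template, row))
# → { "text": "New user: Ada (ada@example.com)" }
```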

4. Run

drt run --dry-run        # preview, no data written
drt run                  # run all syncs
drt run --select notify_slack  # run one sync
drt status               # check recent sync results
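
To see what the batch_size and requests_per_second settings from step 3 imply operationally, here is a minimal pacing sketch (hypothetical, not drt's internals): rows are chunked into fixed-size batches, and batch emission is throttled to the configured rate.

```python
import time

def paced_batches(rows, batch_size, requests_per_second):
    """Yield rows in fixed-size batches, sleeping so that at most
    requests_per_second batches are emitted per second.

    Illustrative only -- drt's scheduler may behave differently.
    """
    interval = 1.0 / requests_per_second
    last = 0.0
    for i in range(0, len(rows), batch_size):
        wait = interval - (time.monotonic() - last)
        if wait > 0:
            time.sleep(wait)
        last = time.monotonic()
        yield rows[i:i + batch_size]

rows = list(range(250))
batches = list(paced_batches(rows, batch_size=100, requests_per_second=5))
print([len(b) for b in batches])  # → [100, 100, 50]
```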

CLI Reference

drt init                    # initialize project
drt list                    # list sync definitions
drt run                     # run all syncs
drt run --select <name>     # run a specific sync
drt run --dry-run           # dry run
drt run --verbose           # show row-level error details
drt validate                # validate sync YAML configs
drt status                  # show recent sync status
drt status --verbose        # show per-row error details
drt mcp run                 # start MCP server (requires drt-core[mcp])

MCP Server

Connect drt to Claude, Cursor, or any MCP-compatible client so you can run syncs, check status, and validate configs without leaving your AI environment.

pip install drt-core[mcp]
drt mcp run

Claude Desktop (~/Library/Application Support/Claude/claude_desktop_config.json):

{
  "mcpServers": {
    "drt": {
      "command": "drt",
      "args": ["mcp", "run"]
    }
  }
}

Available MCP tools:

| Tool | What it does |
| --- | --- |
| drt_list_syncs | List all sync definitions |
| drt_run_sync | Run a sync (supports dry_run) |
| drt_get_status | Get last run result(s) |
| drt_validate | Validate sync YAML configs |
| drt_get_schema | Return JSON Schema for config files |

AI Skills for Claude Code

Install the official Claude Code skills to generate YAML, debug failures, and migrate from other tools — all from the chat interface.

Install via Plugin Marketplace (recommended)

/plugin marketplace add drt-hub/drt
/plugin install drt@drt-hub

Tip: Enable auto-update so you always get the latest skills when drt is updated: /plugin → Marketplaces → drt-hub → Enable auto-update

Manual install (slash commands)

Copy the files from .claude/commands/ into your drt project's .claude/commands/ directory.

| Skill | Trigger | What it does |
| --- | --- | --- |
| /drt-create-sync | "create a sync" | Generates valid sync YAML from your intent |
| /drt-debug | "sync failed" | Diagnoses errors and suggests fixes |
| /drt-init | "set up drt" | Guides through project initialization |
| /drt-migrate | "migrate from Census" | Converts existing configs to drt YAML |

Connectors

| Type | Name | Status | Install |
| --- | --- | --- | --- |
| Source | BigQuery | ✅ v0.1 | pip install drt-core[bigquery] |
| Source | DuckDB | ✅ v0.1 | (core) |
| Source | PostgreSQL | ✅ v0.1 | pip install drt-core[postgres] |
| Source | Snowflake | 🗓 planned | pip install drt-core[snowflake] |
| Source | Redshift | ✅ v0.3.4 | pip install drt-core[redshift] |
| Source | MySQL | 🗓 planned | pip install drt-core[mysql] |
| Destination | REST API | ✅ v0.1 | (core) |
| Destination | Slack Incoming Webhook | ✅ v0.1 | (core) |
| Destination | GitHub Actions (workflow_dispatch) | ✅ v0.1 | (core) |
| Destination | HubSpot (Contacts / Deals / Companies) | ✅ v0.1 | (core) |
| Destination | Google Sheets | 🗓 v0.4 | pip install drt-core[sheets] |
| Destination | CSV / JSON file | 🗓 v0.5 | (core) |
| Destination | Salesforce | 🗓 v0.6 | pip install drt-core[salesforce] |
| Destination | Notion | 🗓 planned | (core) |
| Destination | Linear | 🗓 planned | (core) |
| Destination | SendGrid | 🗓 planned | (core) |

Roadmap

Detailed plans & progress → GitHub Milestones. Looking to contribute? → Good First Issues.

| Version | Focus |
| --- | --- |
| v0.1 | BigQuery / DuckDB / Postgres sources · REST API / Slack / GitHub Actions / HubSpot destinations · CLI · dry-run |
| v0.2 | Incremental sync (cursor_field watermark) · retry config per-sync |
| v0.3 | MCP Server (drt mcp run) · AI Skills for Claude Code · LLM-readable docs · row-level errors · security hardening · Redshift source |
| v0.4 | Dagster integration · Google Sheets destination · dbt post-hook · examples |
| v0.5 | Snowflake source · CSV/JSON destination · test coverage |
| v0.6 | Salesforce destination · Airflow integration |
| v1.x | Rust engine (PyO3) |

Ecosystem

drt is designed to work alongside, not against, the modern data stack:

[diagram: drt ecosystem — dlt loads, dbt transforms, drt activates]


Contributing

See CONTRIBUTING.md.

Disclaimer

drt is an independent open-source project and is not affiliated with, endorsed by, or sponsored by dbt Labs, dlt-hub, or any other company.

"dbt" is a registered trademark of dbt Labs, Inc. "dlt" is a project maintained by dlt-hub.

drt is designed to complement these tools as part of the modern data stack, but is a separate project with its own codebase and maintainers.

License

Apache 2.0 — see LICENSE.
