
Database Tycoon — local-first analytics CLI that adapts to your existing data stack


tycoon

A pip-installable CLI that wires dlt → DuckDB → dbt → Rill into a working local analytics stack. No Docker, no cloud account required.

Adaptable: bring your own ingestion (Airbyte, Fivetran), warehouse (Snowflake, BigQuery, MotherDuck), dbt project, BI tool, or orchestrator — tycoon init asks before it assumes.


Install

Requires Python >= 3.12.

pip install database-tycoon
# or
uv add database-tycoon

Optional extras:

pip install "database-tycoon[dagster]"   # Dagster orchestration
pip install "database-tycoon[ask]"       # AI natural language queries (Ollama supported)

Quickstart

The fastest path to a working dashboard uses the PokéAPI — no credentials, no signup.

# 1. Create a project
mkdir my-project && cd my-project
tycoon init --template csv-import

# 2. Add the PokéAPI as a source (press Enter twice for defaults)
tycoon data sources add rest_api

# 3. Ingest into DuckDB
tycoon data sources run pokeapi

# 4. Scaffold dbt models and generate Rill dashboards
tycoon data analyze pokeapi --rill

# 5. Open Rill
tycoon start --only rill

Rill opens at http://localhost:9009 with pokemon, berry, and type tables ready to explore.

Already have a pipeline? tycoon init will ask about your ingestion tool, warehouse, dbt project, BI tool, and orchestrator — and configure itself around what you already have.


CLI Reference

Command                           Description
tycoon init                       Scaffold a new project
tycoon data sources catalog       Browse available source integrations
tycoon data sources add <type>    Register a new data source
tycoon data sources list          List sources configured in this project
tycoon data sources show <name>   Show detailed config for a source
tycoon data sources run <name>    Run ingestion for a named source
tycoon data sources run-all       Run ingestion for all sources
tycoon data transform run         Run dbt transformations
tycoon data analyze <source>      Scaffold dbt staging models; add --rill to also generate dashboards
tycoon data db query <sql>        Run a SQL query against the warehouse
tycoon data run-all               Ingest all sources, then run dbt build
tycoon data status                Show freshness and row counts for each source
tycoon start                      Start Rill, Dagster, Nao, and the web UI
tycoon stop                       Stop all services
tycoon ask chat                   Query your data in natural language (Nao)
tycoon run <tool>                 Pass through to dbt, dlt, rill, or dagster

tycoon.yml Reference

name: my-project
version: 0.1.0

database:
  raw: data/raw.duckdb              # dlt output (or md: URI for MotherDuck)
  warehouse: data/warehouse.duckdb  # dbt output — read by Rill and Nao

dbt_project_dir: dbt_project   # path to dbt project (yours or tycoon-scaffolded)
rill_dir: rill                 # path to Rill dashboard definitions

stack:                         # generated by tycoon init — edit as needed
  ingestion: dlt               # dlt | airbyte | fivetran | meltano | none
  ingestion_managed: true      # false = tycoon won't run `data sources run`
  warehouse: duckdb            # duckdb | motherduck | snowflake | bigquery | other
  transformation_managed: true # false = tycoon won't scaffold or overwrite dbt
  bi: rill                     # rill | metabase | looker | tableau | other | none
  bi_managed: true             # false = tycoon won't start Rill
  orchestrator: dagster        # dagster | airflow | prefect | other | none
  orchestrator_managed: true   # false = tycoon won't start Dagster

sources:
  my-github:
    type: github               # matches a catalog source name
    schema: raw_github         # schema name in the raw DuckDB file
    config:
      access_token: ${GITHUB_TOKEN}   # env vars are interpolated
      owner: my-org
      repo: my-repo

ask:                           # optional — requires tycoon[ask]
  llm:
    provider: ollama           # fully local, no API key required
  port: 5005

Each source produces its own raw DuckDB file: data/raw_<source>.duckdb. All sources write into data/warehouse.duckdb after transformation.
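The ${GITHUB_TOKEN} syntax above is interpolated from environment variables at load time. A minimal sketch of how such interpolation can work, using only the standard library (this is illustrative, not tycoon's actual implementation — the helper name is made up):

```python
import os
import re

# Matches ${VAR_NAME} placeholders in config values.
_ENV_PATTERN = re.compile(r"\$\{([A-Za-z_][A-Za-z0-9_]*)\}")

def interpolate_env(value: str) -> str:
    """Replace ${VAR} placeholders with values from the environment.

    Raises KeyError for an unset variable, so a missing secret fails
    loudly instead of silently becoming an empty string.
    """
    def _lookup(match: re.Match) -> str:
        name = match.group(1)
        if name not in os.environ:
            raise KeyError(f"environment variable {name!r} is not set")
        return os.environ[name]

    return _ENV_PATTERN.sub(_lookup, value)

os.environ["GITHUB_TOKEN"] = "ghp_example"
print(interpolate_env("access_token: ${GITHUB_TOKEN}"))
```

Failing loudly on unset variables is the safer default for credentials, since an empty token usually surfaces later as a confusing API error.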


Catalog Sources

These sources are available via tycoon data sources add <type>, where <type> matches a source name below. They are downloaded on demand via dlt init rather than bundled in the package.

Source    Category        Key Tables
github    Developer       commits, issues, pull_requests, repositories
slack     Communication   channels, messages, users
stripe    Finance         customers, invoices, products, subscriptions
hubspot   CRM             companies, contacts, deals, tickets
notion    Knowledge       databases, pages, users

Data Directory

Raw DuckDB files follow the naming convention raw_<source>.duckdb (written by ingestion), while warehouse.duckdb is the single transformed database read by Rill and Nao. See data/README.md for details.
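The convention above can be expressed as two small path helpers. These are hypothetical names for illustration, not part of tycoon's public API:

```python
from pathlib import Path

def raw_db_path(source: str, data_dir: str = "data") -> Path:
    """Per-source raw DuckDB file, per the raw_<source>.duckdb convention."""
    return Path(data_dir) / f"raw_{source}.duckdb"

def warehouse_path(data_dir: str = "data") -> Path:
    """The single transformed warehouse read by Rill and Nao."""
    return Path(data_dir) / "warehouse.duckdb"

print(raw_db_path("pokeapi").as_posix())   # data/raw_pokeapi.duckdb
print(warehouse_path().as_posix())         # data/warehouse.duckdb
```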


Rill Dashboards

Rill is a local-first BI tool. Dashboard definitions are YAML files in the rill/ directory. Launch Rill via tycoon start or tycoon start --only rill.

Auto-generate dashboards for a source after ingestion:

tycoon data analyze my-source --rill

This exports each raw table to Parquet, then generates Rill source, metrics view, and explore files — one dashboard per table. The --rill flag is opt-in; dashboard generation is skipped by default since it requires a Rill project directory (rill/) to already exist.

Architecture: sources read from Parquet via Rill's local_file connector into its built-in in-memory OLAP. Dashboards are immediately usable without a dbt run.
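A sketch of what a generated Rill source file might look like, rendered from Python. The field names follow Rill's local_file connector convention, but treat this as an approximation of what tycoon data analyze --rill emits, not its exact output:

```python
from textwrap import dedent

def rill_source_yaml(table: str, parquet_dir: str = "data/parquet") -> str:
    """Render a minimal Rill source definition reading a Parquet export."""
    return dedent(f"""\
        type: source
        connector: local_file
        path: {parquet_dir}/{table}.parquet
        """)

print(rill_source_yaml("pokemon"))
```

One such file per table, alongside a metrics view and an explore file, is what makes each table appear as its own dashboard in Rill.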


Optional Extras

Dagster orchestration (tycoon[dagster])

Installs Dagster, dagster-dbt, and dagster-dlt. Provides a full asset graph covering ingestion and transformation. Run the Dagster UI with:

dagster dev

The workspace is defined in workspace.yaml at the project root.
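For reference, a minimal Dagster workspace.yaml looks like the following; the module path here is a placeholder, so check the file tycoon init actually generates in your project:

```yaml
load_from:
  - python_module: my_project.definitions   # module exposing the Dagster Definitions object
```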

AI queries (tycoon[ask])

Installs Nao and Ibis for natural language querying of the warehouse. Requires a running LLM — Ollama (local) is supported out of the box with no API key.

tycoon ask init
tycoon ask sync
tycoon ask chat
