
Project description

chatdbt

What is this?

chatdbt is an OpenAI-based dbt documentation bot. You describe your data query requirements to the bot in natural language, and chatdbt will help you select the dbt models you need, or generate SQL based on those models to meet your needs. Of course, you need to set up your dbt documentation for chatdbt in advance.

Quick Install

pip install chatdbt

package extras:

  • nomic: use nomic/atlas as vector storage backend
  • pgvector: use pgvector as vector storage backend
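The extras can be pulled in with pip's standard extras syntax (the extra names are taken from the list above):

```shell
# install chatdbt together with the pgvector storage backend
pip install "chatdbt[pgvector]"

# or with the nomic/atlas storage backend
pip install "chatdbt[nomic]"
```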

Internals

chatdbt uses OpenAI's text-embedding-ada-002 model to embed your dbt documentation and saves the resulting vectors to the vector storage you provide. When you ask chatdbt a question, it retrieves the models and metrics (todo 😊) that are semantically similar to your question, then uses OpenAI's gpt-3.5-turbo model to produce an appropriate answer from the retrieved content and your question. This approach is similar to langchain or llama_index.
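The retrieval step described above can be sketched in a few lines. This is an illustration of the general technique (rank stored embeddings by cosine similarity to the query embedding), not chatdbt's actual code; the function names and toy vectors are made up for the example, and real ada-002 embeddings have 1536 dimensions.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def top_k_similar(query_vec, doc_vecs, k=2):
    """Return the indices of the k stored vectors most similar to the query."""
    ranked = sorted(
        range(len(doc_vecs)),
        key=lambda i: cosine_similarity(query_vec, doc_vecs[i]),
        reverse=True,
    )
    return ranked[:k]

# Toy 3-dimensional "embeddings" standing in for embedded dbt docs.
docs = [[1.0, 0.0, 0.0], [0.9, 0.1, 0.0], [0.0, 1.0, 0.0]]
query = [1.0, 0.05, 0.0]
print(top_k_similar(query, docs))  # → [0, 1]
```

The two most similar documents would then be fed, together with the original question, into the chat model prompt.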

How does chatdbt integrate with my dbt doc, and where is my embedding data stored?

There are several interfaces within chatdbt:

  • VectorStorage is responsible for storing embedding vectors. Currently supporting:
    • atlas

      Set up your api_key and project_name to use Nomic Atlas for storing and retrieving the vector data.

    • pgvector

      Set up your connect_string and table_name to use pgvector for storing and retrieving the vector data.

  • DBTDocResolver is responsible for providing dbt manifest and catalog data. Currently supporting:
    • localfs

Set up manifest_json_path and catalog_json_path, and chatdbt will read the dbt manifest and catalog from the local file system.

  • TikTokenProvider is responsible for estimating the number of tokens consumed by OpenAI. Currently supporting:
    • tiktoken_http_server

Set up a tiktoken-http-server api_base (for example: http://localhost:8080), and chatdbt will use tiktoken-http-server to estimate the number of tokens consumed by OpenAI.

You can also implement the above interfaces yourself and integrate them into your own system.
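As a sketch of what such a custom implementation might look like, here is a minimal resolver in the spirit of localfs that loads the dbt manifest and catalog JSON from disk. The class and method names are assumptions for illustration, not chatdbt's published interface.

```python
import json
import tempfile
from pathlib import Path

class JsonFileDocResolver:
    """Hypothetical DBTDocResolver-style class: read dbt artifacts from disk."""

    def __init__(self, manifest_json_path, catalog_json_path):
        self.manifest_json_path = Path(manifest_json_path)
        self.catalog_json_path = Path(catalog_json_path)

    def get_manifest(self):
        return json.loads(self.manifest_json_path.read_text())

    def get_catalog(self):
        return json.loads(self.catalog_json_path.read_text())

# Demo with throwaway files standing in for real dbt artifacts.
with tempfile.TemporaryDirectory() as tmp:
    manifest = Path(tmp, "manifest.json")
    catalog = Path(tmp, "catalog.json")
    manifest.write_text(json.dumps({"nodes": {"model.demo.users": {}}}))
    catalog.write_text(json.dumps({"nodes": {}}))

    resolver = JsonFileDocResolver(manifest, catalog)
    print(list(resolver.get_manifest()["nodes"]))  # → ['model.demo.users']
```

A real implementation would match whatever method signatures chatdbt's DBTDocResolver interface actually requires.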

Quick Start

You can initialize a chatdbt instance manually:

import os

your_pgvector_connect_string = "postgresql+psycopg://postgres:foobar@localhost:5432/chatdbt"
your_pgvector_table_name = "chatdbt"
your_manifest_json_path = "data/manifest.json"
your_catalog_json_path = "data/catalog.json"
your_openai_key = "sk-foobar"

# chatdbt reads the OpenAI key from the environment
os.environ["OPENAI_API_KEY"] = your_openai_key

from chatdbt import ChatBot
from chatdbt.vector_storage.pgvector import PGVectorStorage
from chatdbt.dbt_doc_resolver.localfs import LocalfsDBTDocResolver

vector_storage = PGVectorStorage(
    connect_string=your_pgvector_connect_string,
    table_name=your_pgvector_table_name,
)
dbt_doc_resolver = LocalfsDBTDocResolver(
    manifest_json_path=your_manifest_json_path,
    catalog_json_path=your_catalog_json_path,
)

bot = ChatBot(
    doc_resolver=dbt_doc_resolver,
    vector_storage=vector_storage,
    tiktoken_provider=None,
)

bot.suggest_table("query the number of users who have purchased a product")
bot.suggest_sql("query the number of users who have purchased a product")

or initialize a chatdbt instance with environment variables:

import os

os.environ["CHATDBT_I18N"] = "zh-cn"
os.environ["CHATDBT_VECTOR_STORAGE_TYPE"] = "pgvector"
os.environ[
    "CHATDBT_VECTOR_STORAGE_CONFIG_CONNECT_STRING"
] = your_pgvector_connect_string
os.environ["CHATDBT_VECTOR_STORAGE_CONFIG_TABLE_NAME"] = your_pgvector_table_name

os.environ["CHATDBT_DBT_DOC_RESOLVER_TYPE"] = "localfs"
os.environ["CHATDBT_DBT_DOC_RESOLVER_CONFIG_MANIFEST_JSON_PATH"] = your_manifest_json_path
os.environ["CHATDBT_DBT_DOC_RESOLVER_CONFIG_CATALOG_JSON_PATH"] = your_catalog_json_path

os.environ["OPENAI_API_KEY"] = your_openai_key

import chatdbt

chatdbt.suggest_table("query the number of users who have purchased a product")

chatdbt.suggest_sql("query the number of users who have purchased a product")
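The environment variables above follow a CHATDBT_&lt;COMPONENT&gt;_CONFIG_&lt;KEY&gt; naming convention, where each &lt;KEY&gt; maps to a lowercase keyword argument for that component. The helper below is a guess at how such parsing could work, shown only to make the convention concrete; it is not chatdbt's actual implementation.

```python
import os

def collect_config(component, environ=None):
    """Collect CHATDBT_<component>_CONFIG_* variables into a kwargs dict."""
    environ = os.environ if environ is None else environ
    prefix = f"CHATDBT_{component}_CONFIG_"
    return {
        key[len(prefix):].lower(): value
        for key, value in environ.items()
        if key.startswith(prefix)
    }

# Sample environment mirroring the variables set above.
env = {
    "CHATDBT_VECTOR_STORAGE_TYPE": "pgvector",
    "CHATDBT_VECTOR_STORAGE_CONFIG_CONNECT_STRING": "postgresql+psycopg://postgres:foobar@localhost:5432/chatdbt",
    "CHATDBT_VECTOR_STORAGE_CONFIG_TABLE_NAME": "chatdbt",
}
print(collect_config("VECTOR_STORAGE", env))
# → {'connect_string': 'postgresql+psycopg://postgres:foobar@localhost:5432/chatdbt', 'table_name': 'chatdbt'}
```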

Download files

Download the file for your platform.

Source Distribution

chatdbt-0.0.3.tar.gz (12.9 kB)

Uploaded Source

Built Distribution

chatdbt-0.0.3-py3-none-any.whl (14.8 kB)

Uploaded Python 3

File details

Details for the file chatdbt-0.0.3.tar.gz.

File metadata

  • Download URL: chatdbt-0.0.3.tar.gz
  • Upload date:
  • Size: 12.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.4.1 CPython/3.8.13 Darwin/22.3.0

File hashes

Hashes for chatdbt-0.0.3.tar.gz
  • SHA256: d417b6d3d557839fb9de0e41ea7ae43ea2c0f3187ef4035715d4d6e74438fef3
  • MD5: aed400456fa16d14e15c561d25cc57c8
  • BLAKE2b-256: 6cab335bab82c899c05266d01ee9b1b632884442ee3b11e493db18e06ef5d761


File details

Details for the file chatdbt-0.0.3-py3-none-any.whl.

File metadata

  • Download URL: chatdbt-0.0.3-py3-none-any.whl
  • Upload date:
  • Size: 14.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.4.1 CPython/3.8.13 Darwin/22.3.0

File hashes

Hashes for chatdbt-0.0.3-py3-none-any.whl
  • SHA256: 6cbbecc55bb28b5979fd23348707b062c9eb1b94cd1531bdc43f9126a5123534
  • MD5: d9ee39300e72e6ed0af482f7c4770ac2
  • BLAKE2b-256: 8111d72f45ba1d1f64b44add70a8ad2ce61c1f95ec63a5af6d2b20dd02d526ed

