Unified toolkit for building production agents on Databricks - zero-frontend with official UI templates (Streamlit, Gradio, Dash)
Databricks Agent Toolkit
Build production agents on Databricks in minutes, not days.
Generate production-ready agent scaffolds that follow official Databricks patterns and work with any of the 6 official UI templates.
pip install databricks-agent-toolkit
# Generate a chatbot (use the short alias!)
dbat generate chatbot my-bot
cd my-bot
python start_server.py
# Done! Agent running at http://localhost:8000
Why This Toolkit?
Philosophy: "On Top Of, Not Instead Of"
We don't create custom frameworks. We generate production-ready agent backends that follow:
- OpenAI API standard (industry-wide compatibility)
- Official Databricks patterns (FastAPI + OpenAPI)
- MLflow best practices (auto-tracing, experiments)
- Databricks integrations (Model Serving, Lakebase, Unity Catalog)
Your agents work with:
- All 6 official Databricks UI templates
- Any OpenAI-compatible UI framework
- Custom frontends (React, Streamlit, Gradio, etc.)
Quick Start
1. Install
pip install databricks-agent-toolkit
2. Generate an Agent
# Use the short alias (much easier to type!)
dbat generate chatbot my-bot
# Or the full command:
# databricks-agent-toolkit generate chatbot my-bot
Available aliases:
- dbat: Databricks Agent Toolkit (recommended)
- dat: Even shorter!
- databricks-agent-toolkit: Full command
3. What You Get
my-bot/
├── agent.py            # Your agent logic (OpenAI API compatible)
├── start_server.py     # FastAPI server with auto-docs
├── chatbot.py          # CLI interface for testing
├── config.yaml         # Easy configuration
├── requirements.txt    # All dependencies
├── databricks.yml      # Deploy to Databricks Apps
├── app.yaml            # App configuration
└── README.md           # Setup instructions
4. Run Locally
cd my-bot
pip install -r requirements.txt
# Start the server
python start_server.py
# Backend:  http://localhost:8000
# API Docs: http://localhost:8000/docs
# Health:   http://localhost:8000/health
# Or test in CLI
python chatbot.py
5. Deploy to Databricks
# One command deployment
databricks bundle deploy
# Your app is live at: https://<workspace>/apps/my-bot
What's Included
L1: Chatbot (Simple Conversational AI)
dbat generate chatbot my-bot
Features:
- OpenAI API compatible via MLflow AgentServer (/invocations)
- Streaming with Server-Sent Events (SSE)
- MLflow auto-tracing (requires mlflow>=3.6.0)
- Configuration-driven (no code changes to switch models)
- One-command deploy to Databricks Apps
- Built-in web UI (or use official templates)
- Auto-generated OpenAPI docs
Perfect for:
- Quick prototypes and demos
- Simple Q&A bots
- Customer support assistants
- Internal tools
L2: Assistant (With Memory + RAG)
dbat generate assistant my-assistant --enable-rag
Everything in L1, plus:
- Conversation memory (Lakebase/PostgreSQL)
- RAG with pgvector or Databricks Vector Search
- Auto-index documents from Unity Catalog Volumes
- Session management
- Multi-user support
Perfect for:
- Knowledge base assistants
- Document Q&A
- Support bots with history
- Enterprise applications
Configuration
Edit config.yaml to customize your agent:
# config.yaml
model:
endpoint: databricks-meta-llama-3-1-70b-instruct # Any Databricks Model Serving endpoint
temperature: 0.7
max_tokens: 500
streaming: true
token_delay_ms: 50 # Streaming speed (lower = faster)
system_prompt: "You are a helpful AI assistant."
mlflow:
experiment: /Shared/my-bot
auto_trace: true # Automatic tracing of all LLM calls
# L2 only: Memory configuration
memory:
enabled: true
backend: lakebase
host: ${LAKEBASE_HOST}
database: ${LAKEBASE_DATABASE}
# L2 only: RAG configuration
rag:
enabled: true
source: /Volumes/main/default/docs # Unity Catalog Volume
backend: pgvector # or vector_search
embedding_model: databricks-bge-large-en
No code changes needed! Just edit config and redeploy.
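The `${LAKEBASE_HOST}`-style placeholders in the config suggest environment-variable substitution. A minimal sketch of how such values could be resolved at startup, using only the standard library (the `expand_env` helper name is ours, not part of the toolkit):

```python
import os
import string

def expand_env(value: str) -> str:
    """Substitute ${VAR} placeholders from the environment.

    Raises KeyError if a referenced variable is unset, which
    surfaces configuration mistakes early instead of at request time.
    """
    return string.Template(value).substitute(os.environ)

os.environ["LAKEBASE_HOST"] = "db.example.com"  # for demonstration only
print(expand_env("${LAKEBASE_HOST}"))
```

`string.Template` accepts both `$VAR` and `${VAR}` syntax, matching the placeholders shown in `config.yaml`.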
Official UI Templates
Your agent backend is 100% compatible with all official Databricks UI templates:
| Framework | Best For | Template |
|---|---|---|
| Streamlit | Quick prototypes, data apps | streamlit-chatbot-app |
| Gradio | ML demos, simple interfaces | gradio-chatbot-app |
| Plotly Dash | Data dashboards with chat | dash-chatbot-app |
| Shiny | Statistical apps, R users | shiny-chatbot-app |
| React | Production, enterprise | e2e-chatbot-app |
| Next.js | Modern production, SSR | e2e-chatbot-app-next |
Why compatible?
We follow the OpenAI API standard via MLflow AgentServer (/invocations endpoint).
Using official UIs:
# 1. Generate our backend
dbat generate chatbot my-bot
# 2. Clone official UI
git clone https://github.com/databricks/app-templates.git
cp -r app-templates/streamlit-chatbot-app my-bot/frontend
# 3. Point UI to backend (http://localhost:8000)
# 4. Deploy together!
Coming in v0.3.0: One-command integration!
dbat generate chatbot my-bot --ui=streamlit
# → Backend + official Streamlit UI, pre-configured!
See UI Integration Guide for details.
API Documentation
Your agent comes with auto-generated OpenAPI documentation:
- Swagger UI: http://localhost:8000/docs
- OpenAPI JSON: http://localhost:8000/openapi.json
- Health Check: http://localhost:8000/health
Example API Call
import requests
response = requests.post(
"http://localhost:8000/invocations",
json={
"input": [{"role": "user", "content": "Hello!"}],
"stream": False
}
)
# ResponsesAgent format: data.output[0].content[0].text
print(response.json()["output"][0]["content"][0]["text"])
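The nested indexing above is easy to get wrong, so it can be factored into a small helper that tolerates malformed responses (a sketch; it assumes only the ResponsesAgent shape shown in the comment):

```python
def extract_text(payload: dict) -> str:
    """Pull the assistant text out of a ResponsesAgent-style response.

    Expects: {"output": [{"content": [{"text": "..."}]}]}
    Returns "" if the payload does not match that shape.
    """
    try:
        return payload["output"][0]["content"][0]["text"]
    except (KeyError, IndexError, TypeError):
        return ""

print(extract_text({"output": [{"content": [{"text": "Hello!"}]}]}))  # Hello!
print(repr(extract_text({})))  # ''
```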
Streaming Example
import requests
import json
response = requests.post(
"http://localhost:8000/invocations",
json={
"input": [{"role": "user", "content": "Tell me a story"}],
"stream": True
},
stream=True
)
for line in response.iter_lines():
if line.startswith(b"data: "):
data = line[6:].decode('utf-8')
if data != "[DONE]":
chunk = json.loads(data)
# ResponsesAgent streaming format: chunk.content
print(chunk.get("content", ""), end="", flush=True)
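The SSE handling in the loop above can be isolated into a pure function, which makes it easy to unit-test without a running server (a sketch; the `[DONE]` sentinel and `data: ` framing follow the streaming example above):

```python
import json

def parse_sse_line(line: bytes):
    """Decode one Server-Sent Events line into a chunk dict.

    Returns None for non-data lines (e.g. keep-alive blanks)
    and for the [DONE] sentinel that ends the stream.
    """
    if not line.startswith(b"data: "):
        return None
    data = line[6:].decode("utf-8")
    if data == "[DONE]":
        return None
    return json.loads(data)

# Example: feed lines as they would arrive from response.iter_lines()
for raw in [b'data: {"content": "Once upon a time"}', b"", b"data: [DONE]"]:
    chunk = parse_sse_line(raw)
    if chunk:
        print(chunk.get("content", ""), end="", flush=True)
```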
Features
Production-Ready
- FastAPI for performance and reliability
- OpenAPI schema for API documentation
- Health endpoints for monitoring
- Error handling and logging
- CORS configured for web UIs
Databricks-Native
- Auto-authentication with Databricks
- MLflow auto-tracing (track all LLM calls)
- Unity Catalog for data governance
- Model Serving integration
- Lakebase (PostgreSQL) for memory
- Vector Search for RAG
Developer Experience
- Configuration-driven (YAML, no code changes)
- CLI for quick testing
- Local development with hot-reload
- One-command deployment
- Comprehensive documentation
Standards-Based
- OpenAI API format (universal compatibility)
- Server-Sent Events (SSE) for streaming
- OpenAPI 3.0 schema
- REST best practices
Architecture
Your Application
┌─────────────────────────────────────────────────┐
│                                                 │
│   ┌─────────────┐       ┌──────────────────┐    │
│   │   Your UI   │──────▶│  Agent Backend   │    │
│   │             │       │  (our toolkit)   │    │
│   │ - Streamlit │       │                  │    │
│   │ - React     │       │ - agent.py       │    │
│   │ - Gradio    │       │ - FastAPI server │    │
│   │ - Custom    │       │ - OpenAI API     │    │
│   └─────────────┘       └────────┬─────────┘    │
│                                  │              │
└──────────────────────────────────┼──────────────┘
                                   │
                    ┌──────────────┴──────────────┐
                    │    Databricks Platform      │
                    │                             │
                    │  - Model Serving (LLMs)     │
                    │  - MLflow (tracing)         │
                    │  - Lakebase (memory)        │
                    │  - Vector Search (RAG)      │
                    │  - Unity Catalog (data)     │
                    └─────────────────────────────┘
We provide the agent backend. You choose the UI.
Examples
Basic Chatbot
dbat generate chatbot hello-bot
cd hello-bot
python start_server.py
Assistant with Memory
dbat generate assistant support-bot
cd support-bot
# Configure Lakebase in databricks.yml
databricks bundle deploy
RAG-Powered Assistant
dbat generate assistant doc-bot --enable-rag
cd doc-bot
# Edit config.yaml:
# rag:
# enabled: true
# source: /Volumes/main/default/docs
# backend: pgvector
databricks bundle deploy
Custom Model
dbat generate chatbot custom-bot
cd custom-bot
# Edit config.yaml:
# model:
# endpoint: my-custom-endpoint
# temperature: 0.9
# max_tokens: 1000
python start_server.py
Requirements
- Python: 3.9+
- Databricks: Workspace access (for deployment)
- Model Serving: At least one LLM endpoint
- Optional:
- Lakebase (for L2 memory)
- Vector Search (for L2 RAG)
- Unity Catalog Volumes (for RAG documents)
Documentation
- UI Integration Guide - Using official Databricks UI templates
- App Templates Compliance - Compatibility verification
- Upstream Sync Strategy - How we stay compatible
FAQ
Q: Do I need to use Databricks? A: For deployment, yes. For local development, you just need access to Databricks Model Serving endpoints.
Q: Can I use my own UI? A: Absolutely! Your agent backend follows the OpenAI API standard, so any OpenAI-compatible UI works.
Q: What about LangChain/LangGraph? A: Coming in v0.3.0+. For now, our agents use a simple, lightweight pattern. You can integrate LangChain yourself if needed.
Q: Is this production-ready? A: Yes! L1 (chatbot) is production-ready in v0.2.0. L2 (assistant) is in active testing.
Q: How do I switch models?
A: Just edit config.yaml โ model.endpoint. No code changes needed!
Q: Can I customize the agent logic?
A: Yes! Edit agent.py - it's your code, do whatever you want.
Q: How do I add custom tools/functions?
A: Modify the predict() method in agent.py to call your functions before/after the LLM.
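To illustrate that FAQ answer, here is a hypothetical sketch of a predict() that short-circuits to a local tool before falling through to the LLM. The signature, tool trigger, and response shape are our assumptions for illustration; the real predict() in your generated agent.py may differ:

```python
from datetime import datetime, timezone

def get_current_time() -> str:
    """Example tool: return the current UTC time as ISO 8601."""
    return datetime.now(timezone.utc).isoformat()

def predict(messages: list) -> dict:
    """Check the last user message for a tool trigger.

    A real agent.py would call the Model Serving endpoint in the
    else-branch instead of returning a canned placeholder.
    """
    last = messages[-1]["content"].lower()
    if "time" in last:
        text = f"The current UTC time is {get_current_time()}."
    else:
        text = "(Model Serving call would happen here)"
    # Mirror the ResponsesAgent output shape used elsewhere in this README
    return {"output": [{"content": [{"text": text}]}]}

result = predict([{"role": "user", "content": "What time is it?"}])
print(result["output"][0]["content"][0]["text"])
```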
Roadmap
v0.3.0 (Coming Soon)
- --ui=streamlit|gradio|react flag for one-command UI integration
- L3: API agents with custom tools
- L4: Multi-step workflows
- Template upgrade commands
v0.4.0+ (Future)
- L5: Multi-agent systems
- LangGraph integration
- Custom tool marketplace
- Performance benchmarking
Contributing
We welcome contributions! See CONTRIBUTING.md for guidelines.
Areas we'd love help with:
- Additional UI framework integrations
- More example agents
- Documentation improvements
- Bug reports and feature requests
Philosophy
"On Top Of, Not Instead Of"
We don't reinvent wheels. We integrate official Databricks patterns:
- Official app-templates for UI
- OpenAI API standard
- MLflow for tracing
- FastAPI for servers
- Databricks services for infrastructure
We add:
- Scaffold generation (save time)
- Configuration management (no code changes)
- Pre-wired integrations (batteries included)
- Best practices (production-ready)
License
Apache 2.0
Support
- GitHub Issues: https://github.com/databricks/agent-toolkit/issues
- Documentation: See docs/ folder
- Examples: See examples/ folder
Built with ❤️ for the Databricks community
Start building agents today:
pip install databricks-agent-toolkit
dbat generate chatbot my-bot