pydantic-deep
Looking for a full-stack template? Check out fastapi-fullstack - a production-ready project generator for AI/LLM applications with FastAPI, Next.js, and pydantic-deep integration.
Need just the todo toolset? Check out pydantic-ai-todo - standalone task planning toolset that works with any pydantic-ai agent.
Need just the backends? Check out pydantic-ai-backend - file storage and sandbox backends that work with any pydantic-ai agent.
Deep agent framework built on pydantic-ai with planning, filesystem, and subagent capabilities.
Demo
See the full demo application - a complete example showing how to build a chat interface with file uploads, skills, and streaming responses.
Features
- Multiple Backends: StateBackend (in-memory), LocalBackend, DockerSandbox, CompositeBackend - via pydantic-ai-backend
- Rich Toolsets: TodoToolset (via pydantic-ai-todo), Console Toolset (via pydantic-ai-backend), SubAgentToolset, SkillsToolset
- File Uploads: Upload files for agent processing with run_with_files() or deps.upload_file()
- Skills System: Extensible skill definitions with markdown prompts
- Structured Output: Type-safe responses with Pydantic models via output_type
- Context Management: Automatic conversation summarization for long sessions
- Human-in-the-Loop: Built-in support for human confirmation workflows
- Streaming: Full streaming support for agent responses
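Streamed responses arrive as an async iterator of text deltas. A minimal pure-Python sketch of the consumption pattern — the chunked source here is a stand-in for illustration, not pydantic-deep's API:

```python
import asyncio


async def fake_stream():
    # Stand-in for an agent's streamed text deltas.
    for chunk in ["Organizing ", "your ", "tasks..."]:
        await asyncio.sleep(0)  # yield control, as a real network stream would
        yield chunk


async def main() -> str:
    # Accumulate streamed chunks into the full response.
    parts = []
    async for delta in fake_stream():
        parts.append(delta)
    return "".join(parts)


result = asyncio.run(main())
print(result)
```

The same accumulate-as-you-iterate loop applies when rendering partial output in a UI: display each delta immediately, keep the joined string as the final answer.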
Modular Architecture
pydantic-deep is built with modular, reusable components:
| Component | Package | Description |
|---|---|---|
| Backends | pydantic-ai-backend | File storage and Docker sandbox |
| Todo Toolset | pydantic-ai-todo | Task planning and tracking |
| Summarization | Built-in | Automatic context management* |
*Note: Summarization will be added to pydantic-ai core in late January 2025 (pydantic-ai#3780). We will migrate to use it once available.
Installation
```shell
pip install pydantic-deep
```
Or with uv:
```shell
uv add pydantic-deep
```
Optional dependencies
```shell
# Docker sandbox support
pip install pydantic-deep[sandbox]
```
Quick Start
```python
import asyncio

from pydantic_ai_backends import StateBackend

from pydantic_deep import create_deep_agent, create_default_deps


async def main():
    # Create a deep agent with an in-memory state backend
    backend = StateBackend()
    deps = create_default_deps(backend)
    agent = create_deep_agent()

    # Run the agent
    result = await agent.run("Help me organize my tasks", deps=deps)
    print(result.output)


asyncio.run(main())
```
Structured Output
Get type-safe responses with Pydantic models:
```python
from pydantic import BaseModel

from pydantic_deep import create_deep_agent, create_default_deps


class TaskAnalysis(BaseModel):
    summary: str
    priority: str
    estimated_hours: float


agent = create_deep_agent(output_type=TaskAnalysis)
deps = create_default_deps()

result = await agent.run("Analyze this task: implement user auth", deps=deps)
print(result.output.priority)  # Type-safe access
```
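The practical guarantee of an output type is that the model's final answer is parsed and validated before your code sees it. A rough pure-Python illustration of that guarantee — this is a conceptual sketch, not pydantic-deep internals:

```python
import json
from dataclasses import dataclass


@dataclass
class TaskAnalysis:
    summary: str
    priority: str
    estimated_hours: float


def validate_output(raw: str) -> TaskAnalysis:
    # Parse the model's JSON answer and fail loudly if a field is
    # missing or the wrong type -- roughly what output_type buys you.
    data = json.loads(raw)
    result = TaskAnalysis(**data)
    if not isinstance(result.estimated_hours, (int, float)):
        raise TypeError("estimated_hours must be a number")
    return result


raw = '{"summary": "Add user auth", "priority": "high", "estimated_hours": 8.0}'
analysis = validate_output(raw)
print(analysis.priority)  # high
```

With validation done up front, downstream code can use `analysis.estimated_hours` arithmetically without defensive checks.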
File Uploads
Process user-uploaded files with the agent:
```python
from pydantic_ai_backends import StateBackend

from pydantic_deep import create_deep_agent, DeepAgentDeps, run_with_files

agent = create_deep_agent()
deps = DeepAgentDeps(backend=StateBackend())

# Upload and process files
with open("sales.csv", "rb") as f:
    result = await run_with_files(
        agent,
        "Analyze this sales data and find top products",
        deps,
        files=[("sales.csv", f.read())],
    )
```
Or upload files directly to deps:
```python
deps.upload_file("config.json", b'{"key": "value"}')
# File is now at /uploads/config.json and the agent sees it in the system prompt
```
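Conceptually, a state backend is a path-to-bytes map, and uploading a file writes it under /uploads so the agent can find it. A toy in-memory sketch of that assumed behavior — not the pydantic-ai-backend implementation:

```python
class ToyStateBackend:
    """In-memory file store keyed by absolute path."""

    def __init__(self):
        self.files: dict[str, bytes] = {}

    def write(self, path: str, data: bytes) -> None:
        self.files[path] = data

    def read(self, path: str) -> bytes:
        return self.files[path]


class ToyDeps:
    """Holds a backend and mirrors an upload_file-style convenience method."""

    def __init__(self, backend: ToyStateBackend):
        self.backend = backend

    def upload_file(self, name: str, data: bytes) -> str:
        # Place uploads under a well-known prefix the agent is told about.
        path = f"/uploads/{name}"
        self.backend.write(path, data)
        return path


deps = ToyDeps(ToyStateBackend())
path = deps.upload_file("config.json", b'{"key": "value"}')
print(path)  # /uploads/config.json
```

Because the backend is just a store behind an interface, the same upload call works whether files live in memory, on local disk, or inside a Docker sandbox.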
Context Management
Automatically summarize long conversations to manage token limits:
```python
from pydantic_deep import create_deep_agent
from pydantic_deep.processors import create_summarization_processor

processor = create_summarization_processor(
    trigger=("tokens", 100000),  # Summarize when reaching 100k tokens
    keep=("messages", 20),       # Keep the last 20 messages
)

agent = create_deep_agent(history_processors=[processor])
```
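The trigger/keep pattern itself is simple: once history crosses a threshold, collapse everything except the most recent messages into one summary entry. A pure-Python sketch of that logic, with a stand-in summarizer — illustrative only, not the packaged implementation:

```python
def summarize(messages: list[str]) -> str:
    # Stand-in for an LLM-generated summary of the older messages.
    return f"[summary of {len(messages)} earlier messages]"


def process_history(messages: list[str], trigger: int, keep: int) -> list[str]:
    # Leave short histories untouched; otherwise replace everything
    # except the last `keep` messages with a single summary entry.
    if len(messages) <= trigger:
        return messages
    old, recent = messages[:-keep], messages[-keep:]
    return [summarize(old)] + recent


history = [f"msg {i}" for i in range(30)]
trimmed = process_history(history, trigger=25, keep=20)
print(len(trimmed))  # 21: one summary entry plus 20 recent messages
```

The real processor triggers on token counts rather than message counts, but the shape is the same: the summary stands in for the discarded prefix so the model keeps long-range context at a fraction of the tokens.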
Documentation
- Full Documentation - Complete guides and API reference
- PyPI Package - Package information and releases
- GitHub Repository - Source code and issues
Related Projects
- pydantic-ai - The foundation: Agent framework by Pydantic
- pydantic-ai-backend - File storage and sandbox backends (extracted from pydantic-deep)
- pydantic-ai-todo - Task planning toolset (extracted from pydantic-deep)
- fastapi-fullstack - Full-stack AI app template with pydantic-deep
Development
```shell
# Clone the repository
git clone https://github.com/vstorm-co/pydantic-deepagents.git
cd pydantic-deepagents

# Install dependencies
make install

# Run tests
make test

# Run all checks (lint, typecheck, test, coverage)
make all
```
License
MIT License - see LICENSE for details.