
pydantic-deep

Looking for a full-stack template? Check out fastapi-fullstack - a production-ready project generator for AI/LLM applications with FastAPI, Next.js, and pydantic-deep integration.

Need just the todo toolset? Check out pydantic-ai-todo - standalone task planning toolset that works with any pydantic-ai agent.

Need just the backends? Check out pydantic-ai-backend - file storage and sandbox backends that work with any pydantic-ai agent.

Requires Python 3.10+ · MIT License

Deep agent framework built on pydantic-ai with planning, filesystem, and subagent capabilities.

Demo


See the full demo application - a complete example showing how to build a chat interface with file uploads, skills, and streaming responses.

Features

  • Multiple Backends: StateBackend (in-memory), LocalBackend, DockerSandbox, CompositeBackend - via pydantic-ai-backend
  • Rich Toolsets: TodoToolset (via pydantic-ai-todo), Console Toolset (via pydantic-ai-backend), SubAgentToolset, SkillsToolset
  • File Uploads: Upload files for agent processing with run_with_files() or deps.upload_file()
  • Skills System: Extensible skill definitions with markdown prompts
  • Structured Output: Type-safe responses with Pydantic models via output_type
  • Context Management: Automatic conversation summarization for long sessions
  • Human-in-the-Loop: Built-in support for human confirmation workflows
  • Streaming: Full streaming support for agent responses

Modular Architecture

pydantic-deep is built with modular, reusable components:

| Component     | Package             | Description                      |
|---------------|---------------------|----------------------------------|
| Backends      | pydantic-ai-backend | File storage and Docker sandbox  |
| Todo Toolset  | pydantic-ai-todo    | Task planning and tracking       |
| Summarization | Built-in            | Automatic context management*    |

*Note: Summarization will be added to pydantic-ai core in late January 2025 (pydantic-ai#3780). We will migrate to use it once available.

Installation

pip install pydantic-deep

Or with uv:

uv add pydantic-deep

Optional dependencies

# Docker sandbox support
pip install pydantic-deep[sandbox]

Quick Start

import asyncio
from pydantic_ai_backends import StateBackend
from pydantic_deep import create_deep_agent, create_default_deps

async def main():
    # Create a deep agent with state backend
    backend = StateBackend()
    deps = create_default_deps(backend)
    agent = create_deep_agent()

    # Run the agent
    result = await agent.run("Help me organize my tasks", deps=deps)
    print(result.output)

asyncio.run(main())

Structured Output

Get type-safe responses with Pydantic models:

from pydantic import BaseModel
from pydantic_deep import create_deep_agent, create_default_deps

class TaskAnalysis(BaseModel):
    summary: str
    priority: str
    estimated_hours: float

agent = create_deep_agent(output_type=TaskAnalysis)
deps = create_default_deps()

result = await agent.run("Analyze this task: implement user auth", deps=deps)
print(result.output.priority)  # Type-safe access
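Under the hood, the structured result is a validated Pydantic model, so the usual validation and coercion rules apply. A quick illustration using plain Pydantic, independent of the agent:

```python
from pydantic import BaseModel

class TaskAnalysis(BaseModel):
    summary: str
    priority: str
    estimated_hours: float

# Validate a dict shaped like the model's fields; note that the string
# "6" is coerced to the float 6.0 by Pydantic's type coercion.
analysis = TaskAnalysis.model_validate(
    {"summary": "Implement user auth", "priority": "high", "estimated_hours": "6"}
)
print(analysis.estimated_hours)  # 6.0
```

Invalid or missing fields raise a `ValidationError`, which is what makes `output_type` access type-safe.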

File Uploads

Process user-uploaded files with the agent:

from pydantic_ai_backends import StateBackend
from pydantic_deep import create_deep_agent, DeepAgentDeps, run_with_files

agent = create_deep_agent()
deps = DeepAgentDeps(backend=StateBackend())

# Upload and process files
with open("sales.csv", "rb") as f:
    result = await run_with_files(
        agent,
        "Analyze this sales data and find top products",
        deps,
        files=[("sales.csv", f.read())],
    )

Or upload files directly to deps:

deps.upload_file("config.json", b'{"key": "value"}')
# File is now at /uploads/config.json and agent sees it in system prompt
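To make the path convention concrete, here is a toy in-memory upload store. This is an illustrative sketch only; `InMemoryUploads` is a hypothetical name, not part of the pydantic-ai-backend API:

```python
# Toy sketch of an upload store: files land under /uploads/<name>,
# mirroring the convention described above.
class InMemoryUploads:
    def __init__(self):
        self.files: dict[str, bytes] = {}

    def upload_file(self, name: str, data: bytes) -> str:
        path = f"/uploads/{name}"  # canonical upload location
        self.files[path] = data
        return path

store = InMemoryUploads()
path = store.upload_file("config.json", b'{"key": "value"}')
print(path)  # /uploads/config.json
```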

Context Management

Automatically summarize long conversations to manage token limits:

from pydantic_deep import create_deep_agent
from pydantic_deep.processors import create_summarization_processor

processor = create_summarization_processor(
    trigger=("tokens", 100000),  # Summarize when reaching 100k tokens
    keep=("messages", 20),       # Keep last 20 messages
)

agent = create_deep_agent(history_processors=[processor])

Note: This feature will be added to pydantic-ai core in late January 2025 (pydantic-ai#3780). Once available, we will migrate to use the upstream implementation.
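To illustrate the `(trigger, keep)` semantics, here is a self-contained sketch (not the library's implementation) of how such a policy can be applied: once the token budget is exceeded, older messages are collapsed into a summary and only the most recent `keep_messages` are retained verbatim.

```python
# Illustrative policy sketch; `apply_keep_policy` and `count_tokens`
# are hypothetical names, not pydantic-deep APIs.
def apply_keep_policy(messages, trigger_tokens, keep_messages, count_tokens):
    total = sum(count_tokens(m) for m in messages)
    if total < trigger_tokens:
        return messages  # under budget: leave history untouched
    dropped = len(messages) - keep_messages
    summary = f"[summary of {dropped} older messages]"
    return [summary] + messages[-keep_messages:]

history = [f"msg {i}" for i in range(30)]
trimmed = apply_keep_policy(history, trigger_tokens=100, keep_messages=20,
                            count_tokens=len)
print(len(trimmed))  # 21: one summary entry plus the last 20 messages
```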


Development

# Clone the repository
git clone https://github.com/vstorm-co/pydantic-deepagents.git
cd pydantic-deepagents

# Install dependencies
make install

# Run tests
make test

# Run all checks (lint, typecheck, test, coverage)
make all


License

MIT License - see LICENSE for details.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

pydantic_deep-0.2.13.tar.gz (266.3 kB)


Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

pydantic_deep-0.2.13-py3-none-any.whl (24.9 kB)


File details

Details for the file pydantic_deep-0.2.13.tar.gz.

File metadata

  • Download URL: pydantic_deep-0.2.13.tar.gz
  • Size: 266.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for pydantic_deep-0.2.13.tar.gz:

| Algorithm   | Hash digest                                                      |
|-------------|------------------------------------------------------------------|
| SHA256      | bce4f262d581aaab4dd62b410d4188de61bd77918fe88ee7742b99ddec36c2a6 |
| MD5         | 3a4d022f9ad6ea19aa3fdbdf829e3607                                 |
| BLAKE2b-256 | b2050f91499c7aeb990d57db4d4f4ab15bd00f7cccdec01a7fefb274cf72d227 |
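Verifying a downloaded artifact against a published digest needs only the standard library. The input below is the well-known `"abc"` test vector; substitute the file's bytes and the SHA256 value listed above for a real check:

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    # Compute the hex-encoded SHA256 digest of the given bytes.
    return hashlib.sha256(data).hexdigest()

digest = sha256_hex(b"abc")
print(digest)  # ba7816bf8f01cfea414140de5dae2223b00361a396177a9cb410ff61f20015ad
```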


Provenance

The following attestation bundles were made for pydantic_deep-0.2.13.tar.gz:

Publisher: publish.yml on vstorm-co/pydantic-deepagents

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file pydantic_deep-0.2.13-py3-none-any.whl.

File metadata

  • Download URL: pydantic_deep-0.2.13-py3-none-any.whl
  • Size: 24.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for pydantic_deep-0.2.13-py3-none-any.whl:

| Algorithm   | Hash digest                                                      |
|-------------|------------------------------------------------------------------|
| SHA256      | 3b82625a92543e03937e7c0dfa997c2c26b86ff5d7bea212836225ef37574f8c |
| MD5         | 254306f0adf3e4d8b3768b441d2b294a                                 |
| BLAKE2b-256 | e5dfa3d2254c739140cbaaab1029c88912af3e88a4ed5a4899c4363906aed54f |


Provenance

The following attestation bundles were made for pydantic_deep-0.2.13-py3-none-any.whl:

Publisher: publish.yml on vstorm-co/pydantic-deepagents

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
