# Agentic AI SDK
A production-ready SDK for building AI agents with planning, workspace management, and observability.
## Installation

```bash
# Basic installation
pip install agentic-ai-sdk

# With AG-UI protocol support
pip install agentic-ai-sdk[ag-ui]

# With observability (OpenTelemetry)
pip install agentic-ai-sdk[observability]

# All features
pip install agentic-ai-sdk[all]
```
## Quick Start

### 1. Basic Agent

```python
from agentic_ai import (
    DeepAgentSession,
    build_agent,
    create_workspace,
    LLMClientFactory,
)

# Create workspace
workspace = create_workspace(".ws/my_agent")

# Create LLM factory
llm_factory = LLMClientFactory.from_config({
    "default": {
        "provider": "azure_openai",
        "model": "gpt-4o",
        "endpoint": "https://your-endpoint.openai.azure.com",
    }
})

# Build agent session
session = build_agent(
    chat_client=llm_factory.get_client("default"),
    workspace=workspace,
    agent_id="my_agent",
    tools=[your_tool_1, your_tool_2],
    instructions="You are a helpful assistant.",
)

# Run the agent (from within an async context)
response = await session.run("Hello, what can you do?")
print(response.text)
```
### 2. Using Configuration Files

Create an `env.yaml`:
```yaml
llm_profiles:
  - name: default
    provider: azure_openai
    model: gpt-4o
    endpoint: ${AZURE_OPENAI_ENDPOINT}
    api_key: ${AZURE_OPENAI_API_KEY}

agents:
  - id: analyst
    llm_profile_name: default
    system_prompt_file: manifest/prompts/analyst.md
    planning_enabled: true
    max_tool_iterations: 30
```
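The `${VAR}` placeholders are resolved from environment variables when the file is loaded. A minimal sketch of that kind of interpolation (illustrative only; the SDK's actual loader may behave differently, e.g. for missing variables):

```python
import os
import re

def interpolate_env(text: str) -> str:
    """Replace ${VAR} placeholders with environment variable values.

    Unset variables are left as-is here; the SDK may instead raise.
    """
    return re.sub(
        r"\$\{([A-Za-z_][A-Za-z0-9_]*)\}",
        lambda m: os.environ.get(m.group(1), m.group(0)),
        text,
    )

os.environ["AZURE_OPENAI_ENDPOINT"] = "https://example.openai.azure.com"
print(interpolate_env("endpoint: ${AZURE_OPENAI_ENDPOINT}"))
# endpoint: https://example.openai.azure.com
```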
Then load and use it:
```python
from agentic_ai import (
    BaseAppContext,
    build_app_context,
    build_agent_from_store,
    create_workspace,
)

# Load configuration
ctx = build_app_context("env.yaml")

# Create workspace
workspace = create_workspace(".ws/analyst")

# Build agent from config store
session = build_agent_from_store(
    agent_store=ctx.agent_store,
    llm_factory=ctx.llm_factory,
    agent_id="analyst",
    workspace=workspace,
    tools=[...],
)

# Run the agent (from within an async context)
response = await session.run("Analyze the sales data")
```
response = await session.run("Analyze the sales data")
### 3. Streaming Responses

```python
async for update in session.run_stream("Generate a report"):
    if update.text:
        print(update.text, end="", flush=True)
```
### 4. With AG-UI Protocol

```python
from agentic_ai import build_ag_ui_agent, DeepAgentProtocolAdapter

# Wrap an existing session for AG-UI
adapter = DeepAgentProtocolAdapter(session)

# Or use the convenience function
agent = build_ag_ui_agent(
    agent_store=ctx.agent_store,
    llm_factory=ctx.llm_factory,
    agent_id="analyst",
    workspace=workspace,
    tools=[...],
)
```
## Core Concepts

### DeepAgentSession
The main entry point for interacting with an agent. It wraps a `ChatAgent` and provides:
- Workspace Management: Persistent storage for artifacts and state
- Planning: Built-in planning tool for complex tasks
- Context Compaction: Automatic history management for long conversations
- Observability: Logging and tracing integration
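Context compaction trims long conversation histories so they stay within the model's context window. The SDK handles this automatically; a minimal sketch of the general idea (not the SDK's actual algorithm) is to keep the system prompt and only the most recent turns:

```python
def compact_history(messages, max_messages=6):
    """Keep the system message plus the most recent turns.

    A deliberately simple strategy; real compaction may summarize
    dropped turns instead of discarding them outright.
    """
    system = [m for m in messages if m["role"] == "system"]
    rest = [m for m in messages if m["role"] != "system"]
    return system + rest[-max_messages:]

history = [{"role": "system", "content": "You are helpful."}]
history += [{"role": "user", "content": f"turn {i}"} for i in range(20)]
compacted = compact_history(history)
print(len(compacted))  # 7: the system message + the last 6 turns
```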
### Configuration

The SDK uses a hierarchical configuration system:

- `LLMConfig`: LLM provider settings (model, endpoint, temperature, etc.)
- `AgentConfig`: Agent-specific settings (prompt, tools, planning, etc.)
- `BaseAppConfig`: Application-wide configuration
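The hierarchy can be pictured with plain dataclasses. The field names below simply mirror the `env.yaml` example above; the SDK's real classes almost certainly carry more fields and validation:

```python
from dataclasses import dataclass, field

@dataclass
class LLMConfig:
    # Mirrors one entry under llm_profiles in env.yaml
    name: str
    provider: str
    model: str
    endpoint: str = ""
    api_key: str = ""

@dataclass
class AgentConfig:
    # Mirrors one entry under agents in env.yaml
    id: str
    llm_profile_name: str
    system_prompt_file: str = ""
    planning_enabled: bool = False
    max_tool_iterations: int = 30

@dataclass
class BaseAppConfig:
    # Application-wide: profiles plus the agents that reference them
    llm_profiles: list = field(default_factory=list)
    agents: list = field(default_factory=list)

cfg = BaseAppConfig(
    llm_profiles=[LLMConfig("default", "azure_openai", "gpt-4o")],
    agents=[AgentConfig("analyst", "default", planning_enabled=True)],
)
```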
### Tools Manifest & Config Injection

Tools can be declared in `manifest/tools.yaml` with two config layers:

- `config_section`: Injects a runtime config section (from `env.yaml`)
- `config`: Declarative per-tool overrides in `tools.yaml`
Inside tools, use `get_effective_tool_config()` to read both:

```python
from agentic_ai.tool_runtime import get_effective_tool_config

tool_cfg = get_effective_tool_config("openmetadata")
section = tool_cfg["section"]
overrides = tool_cfg["config"]
```
Recommended precedence: tool overrides > injected section values > runtime defaults.
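That precedence amounts to a layered dictionary merge; a hypothetical sketch (`resolve_config` is not an SDK function, just an illustration of the rule):

```python
def resolve_config(defaults: dict, section: dict, overrides: dict) -> dict:
    """Merge config layers; later arguments win on key conflicts."""
    return {**defaults, **section, **overrides}

merged = resolve_config(
    defaults={"timeout": 30, "retries": 3},     # runtime defaults
    section={"timeout": 60, "host": "om.internal"},  # injected section
    overrides={"retries": 5},                   # per-tool overrides
)
print(merged)  # {'timeout': 60, 'retries': 5, 'host': 'om.internal'}
```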
### Workspace
Each agent session has a workspace for storing:
- Artifacts: Tool outputs, generated files
- Plans: Task plans and progress tracking
- State: Session state and context
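A workspace is essentially a directory tree with one area per concern. A hypothetical sketch of such a layout (the SDK's actual on-disk structure may differ):

```python
from pathlib import Path
import tempfile

def create_workspace_dirs(root: str) -> Path:
    """Create a workspace directory with a subfolder per concern.

    The subfolder names here are assumptions for illustration.
    """
    base = Path(root)
    for sub in ("artifacts", "plans", "state"):
        (base / sub).mkdir(parents=True, exist_ok=True)
    return base

ws = create_workspace_dirs(tempfile.mkdtemp() + "/my_agent")
print(sorted(p.name for p in ws.iterdir()))  # ['artifacts', 'plans', 'state']
```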
### Middleware

Extend agent behavior with middleware:

```python
from agentic_ai import ToolResultPersistenceMiddleware

# Auto-persist tool results to workspace
middleware = ToolResultPersistenceMiddleware(auto_persist=True)

session = build_agent(
    ...,
    function_middlewares=[middleware],
)
```
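The middleware pattern itself is just a wrapper around tool calls. A minimal generic sketch, independent of the SDK's actual interfaces (which take middleware objects, not decorators):

```python
results_log = []

def persistence_middleware(tool_fn):
    """Wrap a tool so every result is recorded before being returned."""
    def wrapped(*args, **kwargs):
        result = tool_fn(*args, **kwargs)
        # A real middleware would persist to the workspace, not a list
        results_log.append({"tool": tool_fn.__name__, "result": result})
        return result
    return wrapped

@persistence_middleware
def add(a, b):
    return a + b

print(add(2, 3))       # 5
print(results_log[0])  # {'tool': 'add', 'result': 5}
```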
## API Reference

### Agent Builders

| Function | Description |
|---|---|
| `build_agent()` | Create a session with a chat client |
| `build_agent_with_llm()` | Create a session with an LLM factory |
| `build_agent_from_config()` | Create a session from an `AgentConfig` |
| `build_agent_from_store()` | Create a session from a config store |
| `build_agent_session()` | Low-level session builder |
### Configuration

| Class | Description |
|---|---|
| `AgentConfig` | Agent configuration schema |
| `LLMConfig` | LLM provider configuration |
| `BaseAppConfig` | Base application config |
| `AgentConfigStore` | Store for multiple agent configs |
| `LLMClientFactory` | Factory for creating LLM clients |
### Workspace & Artifacts

| Class/Function | Description |
|---|---|
| `WorkspaceHandle` | Handle to a workspace directory |
| `WorkspaceManager` | Manages multiple workspaces |
| `create_workspace()` | Create a new workspace |
| `ArtifactStore` | Store for tool artifacts |
| `persist_artifact()` | Save an artifact to the workspace |
### Planning

| Class | Description |
|---|---|
| `PlanStore` | Stores task plans |
| `PlanRecord` | A plan with steps |
| `PlanStep` | A single step in a plan |
| `build_update_plan_tool()` | Creates the `update_plan` tool |
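The plan data model can be pictured as nested records; a hypothetical sketch in which the field names and the `progress()` helper are assumptions, not the SDK's actual schema:

```python
from dataclasses import dataclass, field

@dataclass
class PlanStep:
    # A single step in a plan
    description: str
    done: bool = False

@dataclass
class PlanRecord:
    # A plan with steps
    goal: str
    steps: list = field(default_factory=list)

    def progress(self) -> float:
        """Fraction of steps completed."""
        if not self.steps:
            return 0.0
        return sum(s.done for s in self.steps) / len(self.steps)

plan = PlanRecord(
    goal="Analyze the sales data",
    steps=[PlanStep("Load CSV", done=True), PlanStep("Summarize trends")],
)
print(plan.progress())  # 0.5
```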
### Observability

| Function | Description |
|---|---|
| `setup_logging()` | Configure logging |
| `enable_observability()` | Enable tracing |
| `get_tracer()` | Get an OpenTelemetry tracer |
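`setup_logging()` presumably builds on Python's standard `logging` module; a minimal stdlib-only sketch of what such a helper might do (signature and defaults are assumptions):

```python
import logging

def setup_logging(level=logging.INFO, name="agentic_ai"):
    """Configure a namespaced logger with a timestamped formatter."""
    logger = logging.getLogger(name)
    logger.setLevel(level)
    if not logger.handlers:  # avoid duplicate handlers on repeat calls
        handler = logging.StreamHandler()
        handler.setFormatter(
            logging.Formatter("%(asctime)s %(name)s %(levelname)s %(message)s")
        )
        logger.addHandler(handler)
    return logger

log = setup_logging()
log.info("agent session started")
```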
## Examples

See the `examples/` directory for complete examples:

- `basic_agent.py` - Simple agent setup
- `config_based_agent.py` - Configuration-driven agent
- `streaming_agent.py` - Streaming responses
- `multi_agent.py` - Multiple coordinated agents
## License
MIT License - see LICENSE for details.