Python SDK and runtime to serve AI agents with FastAPI, LangGraph, and observability.
Idun Agent Engine
Turn any LangGraph-based agent into a production-grade API in minutes.
Idun Agent Engine is a lightweight runtime and SDK that wraps your agent with a FastAPI server, adds streaming, structured responses, config validation, and optional observability — with zero boilerplate. Use a YAML file or a fluent builder to configure and run.
Installation
pip install idun-agent-engine
- Requires Python 3.13
- Ships with FastAPI, Uvicorn, LangGraph, SQLite checkpointing, and optional observability hooks
Quickstart
1) Minimal one-liner (from a YAML config)
from idun_agent_engine.core.server_runner import run_server_from_config
run_server_from_config("config.yaml")
Example config.yaml:
server:
  api:
    port: 8000

agent:
  type: "langgraph"
  config:
    name: "My Example LangGraph Agent"
    graph_definition: "./examples/01_basic_config_file/example_agent.py:app"

    # Optional: conversation persistence
    checkpointer:
      type: "sqlite"
      db_url: "sqlite:///example_checkpoint.db"

    # Optional: provider-agnostic observability
    observability:
      provider: langfuse  # or phoenix
      enabled: true
      options:
        host: ${LANGFUSE_HOST}
        public_key: ${LANGFUSE_PUBLIC_KEY}
        secret_key: ${LANGFUSE_SECRET_KEY}
        run_name: "idun-langgraph-run"
Run and open docs at http://localhost:8000/docs.
2) Programmatic setup with the fluent builder
from pathlib import Path
from idun_agent_engine import ConfigBuilder, create_app, run_server
config = (
ConfigBuilder()
.with_api_port(8000)
.with_langgraph_agent(
name="Programmatic Example Agent",
graph_definition=str(Path("./examples/02_programmatic_config/smart_agent.py:app")),
sqlite_checkpointer="programmatic_example.db",
)
.build()
)
app = create_app(engine_config=config)
run_server(app, reload=True)
Endpoints
All servers expose these by default:
- POST /agent/invoke: single request/response
- POST /agent/stream: server-sent events stream of ag-ui protocol events
- GET /health: service health with engine version
- GET /: root landing with links
Invoke example:
curl -X POST "http://localhost:8000/agent/invoke" \
-H "Content-Type: application/json" \
-d '{"query": "Hello!", "session_id": "user-123"}'
Stream example:
curl -N -X POST "http://localhost:8000/agent/stream" \
-H "Content-Type: application/json" \
-d '{"query": "Tell me a story", "session_id": "user-123"}'
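The same invoke call can be issued from Python with the standard library alone. This is a minimal client sketch; the helper name and base URL are illustrative, not part of the SDK:

```python
import json
import urllib.request

def build_invoke_request(base_url: str, query: str, session_id: str) -> urllib.request.Request:
    """Build a POST request for the /agent/invoke endpoint."""
    return urllib.request.Request(
        f"{base_url}/agent/invoke",
        data=json.dumps({"query": query, "session_id": session_id}).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_invoke_request("http://localhost:8000", "Hello!", "user-123")
# Send with: urllib.request.urlopen(req)  (requires a running server)
```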
LangGraph integration
Point the engine to a StateGraph variable in your file using graph_definition:
# examples/01_basic_config_file/example_agent.py
import operator
from typing import Annotated, TypedDict
from langgraph.graph import END, StateGraph
class AgentState(TypedDict):
messages: Annotated[list, operator.add]
def greeting_node(state):
user_message = state["messages"][-1] if state["messages"] else ""
return {"messages": [("ai", f"Hello! You said: '{user_message}'")]}
graph = StateGraph(AgentState)
graph.add_node("greet", greeting_node)
graph.set_entry_point("greet")
graph.add_edge("greet", END)
# This variable name is referenced by graph_definition
app = graph
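Because the node function is plain Python, it can be smoke-tested without running the engine or compiling the graph. A sketch (the node is reproduced here so the snippet is standalone):

```python
# greeting_node as defined in the example agent above
def greeting_node(state):
    user_message = state["messages"][-1] if state["messages"] else ""
    return {"messages": [("ai", f"Hello! You said: '{user_message}'")]}

out = greeting_node({"messages": ["Hi there"]})
print(out["messages"][0])  # ('ai', "Hello! You said: 'Hi there'")
```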
Then reference it in config:
agent:
type: "langgraph"
config:
graph_definition: "./examples/01_basic_config_file/example_agent.py:app"
Behind the scenes, the engine:
- Validates config with Pydantic models
- Loads your StateGraph from disk
- Optionally wires a SQLite checkpointer via langgraph.checkpoint.sqlite
- Exposes invoke and stream endpoints
- Bridges LangGraph events to ag-ui stream events
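The graph_definition string is a path/to/file.py:variable pair. Resolving it can be sketched with importlib; this mirrors the documented behavior, not the engine's actual implementation:

```python
import importlib.util
from pathlib import Path

def load_graph(graph_definition: str):
    """Resolve a 'path/to/file.py:variable' string to the object it names."""
    path_str, var_name = graph_definition.rsplit(":", 1)
    path = Path(path_str).resolve()
    # Import the file as a throwaway module, then pull out the named variable
    spec = importlib.util.spec_from_file_location(path.stem, path)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    return getattr(module, var_name)
```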
Observability (optional)
Enable provider-agnostic observability via the observability block in your agent config. Langfuse and Arize Phoenix (OpenInference) are supported today; more providers are planned.
agent:
  type: "langgraph"
  config:
    observability:
      provider: langfuse  # or phoenix
      enabled: true
      options:
        host: ${LANGFUSE_HOST}
        public_key: ${LANGFUSE_PUBLIC_KEY}
        secret_key: ${LANGFUSE_SECRET_KEY}
        run_name: "idun-langgraph-run"
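The ${...} placeholders are resolved from the environment, so the matching variables must be set before starting the server. The values below are placeholders for your own Langfuse credentials:

```shell
export LANGFUSE_HOST="https://cloud.langfuse.com"
export LANGFUSE_PUBLIC_KEY="pk-lf-..."
export LANGFUSE_SECRET_KEY="sk-lf-..."
```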
Configuration reference
- server.api.port (int): HTTP port (default 8000)
- agent.type (enum): currently langgraph (a CrewAI placeholder exists but is not implemented)
- agent.config.name (str): human-readable name
- agent.config.graph_definition (str): absolute or relative path/to/file.py:variable
- agent.config.checkpointer (sqlite): { type: "sqlite", db_url: "sqlite:///file.db" }
- agent.config.observability (optional): provider options as shown above
- mcp_servers (list, optional): collection of MCP servers made available to your agent runtime. Each entry matches the fields supported by langchain-mcp-adapters (name, transport, url/command, headers, etc.)
Config can be sourced from:
- engine_config (preferred): pass a validated EngineConfig to create_app
- config_dict: a dict validated at runtime
- config_path: path to a YAML file; defaults to config.yaml
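As a sketch, a config_dict mirroring the YAML above can be assembled in plain Python and handed to create_app. The create_app call is commented out here because it requires the package to be installed and the referenced agent file to exist on disk:

```python
# Plain-dict equivalent of the YAML config; validated by Pydantic at runtime
config_dict = {
    "server": {"api": {"port": 8000}},
    "agent": {
        "type": "langgraph",
        "config": {
            "name": "Dict-Configured Agent",
            "graph_definition": "./examples/01_basic_config_file/example_agent.py:app",
            "checkpointer": {"type": "sqlite", "db_url": "sqlite:///dict_example.db"},
        },
    },
}

# from idun_agent_engine import create_app
# app = create_app(config_dict=config_dict)
```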
MCP Servers
You can mount MCP servers directly in your engine config. The engine will automatically
create a MultiServerMCPClient and expose it on app.state.mcp_registry.
mcp_servers:
  - name: "math"
    transport: "stdio"
    command: "python"
    args:
      - "/path/to/math_server.py"
  - name: "weather"
    transport: "streamable_http"
    url: "http://localhost:8000/mcp"
Inside your FastAPI dependencies or handlers:
from fastapi import APIRouter, Depends
from idun_agent_engine.server.dependencies import get_mcp_registry

router = APIRouter()

@router.get("/mcp/{server}/tools")
async def list_tools(server: str, registry=Depends(get_mcp_registry)):
    return await registry.get_tools(server)
Or outside of FastAPI:
from langchain_mcp_adapters.tools import load_mcp_tools
registry = app.state.mcp_registry
async with registry.get_session("math") as session:
tools = await load_mcp_tools(session)
Examples
The examples/ folder contains complete projects:
- 01_basic_config_file: YAML config + simple agent
- 02_programmatic_config: ConfigBuilder usage and advanced flows
- 03_minimal_setup: one-line server from config
Run any example with Python 3.13 installed.
CLI and runtime helpers
Top-level imports for convenience:
from idun_agent_engine import (
create_app,
run_server,
run_server_from_config,
run_server_from_builder,
ConfigBuilder,
)
- create_app(...) builds the FastAPI app and registers routes
- run_server(app, ...) runs with Uvicorn
- run_server_from_config(path, ...) loads config, builds the app, and runs
- run_server_from_builder(builder, ...) builds from a builder and runs
Production notes
- Use a process manager (e.g., multiple Uvicorn workers behind a gateway). Note: reload=True is for development and is incompatible with multi-worker mode.
- Mount behind a reverse proxy and enable TLS where appropriate.
- Persist conversations with the SQLite checkpointer, or swap in a custom checkpointer when one becomes available.
Roadmap
- CrewAI adapter (placeholder exists, not yet implemented)
- Additional stores and checkpointers
- First-class CLI for idun commands
Contributing
Issues and PRs are welcome. See the repository:
- Repo: https://github.com/geoffreyharrazi/idun-agent-platform
- Package path: libs/idun_agent_engine
- Open an issue: https://github.com/geoffreyharrazi/idun-agent-platform/issues
Run locally:
cd libs/idun_agent_engine
poetry install
poetry run pytest -q
License
MIT — see LICENSE in the repo root.