A LangGraph agent API server that exposes LangGraph agents over HTTP and can interact with a chatbot client
Project description
agent-api-server
agent-api-server is an agent runtime that now supports both:
- API Server mode for HTTP/SSE access
- SDK mode for direct in-process agent execution
It keeps compatibility with existing LangGraph-based agents and introduces a framework abstraction so additional runtimes such as OpenClaw can be added behind the same interface.
Features
- FastAPI application factory for embedding or standalone deployment
- Direct SDK entry via the root module `agent_api_server.py`
- Framework-neutral agent loading with LangGraph and OpenClaw adapters
- Backward-compatible LangGraph-oriented API routes under `/api/v1`
- Built-in static client assets bundled in both sdist and wheel artifacts
- Redis-backed thread storage and PostgreSQL checkpoint integration
- Optional integration with model management registration and listener services
- Layered package structure centered on `adapters/`, `core/`, `common/`, `api/`, `integration/`, and `sdk/`
Requirements
- Python 3.11 to 3.13
- Redis
- PostgreSQL
- Access to all runtime dependencies declared in `pyproject.toml`
Installation
Install from a package index that provides all required dependencies:
```shell
pip install agent-api-server
```
This project depends on llm-sdk and model-manage-client. If those packages are hosted on a private index in your environment, configure pip or Poetry to use that index before installation.
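One way to point pip at such an index is a supplemental entry in `pip.conf`; the index URL below is a placeholder for illustration, not a real endpoint:

```ini
# ~/.config/pip/pip.conf -- substitute your private index URL
[global]
extra-index-url = https://pypi.example.internal/simple
```

With recent Poetry versions, the rough equivalent is `poetry source add --priority=supplemental private https://pypi.example.internal/simple`.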
Configuration
The runtime reads configuration from environment variables. Common settings include:
```shell
REDIS_URL=redis://localhost:6379/0
POSTGRES_URL=postgresql://postgres:postgres@localhost:5432/postgres
MODEL_MANAGER_SERVICE_URL=http://127.0.0.1:10053
CLIENT_TOKEN=
SERVER_PORT=8080
SERVER_WORKER_AMOUNT=1
LOG_LEVEL=INFO
```
See `.env_example` for a more complete example.
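As a rough sketch of how these variables are consumed, the snippet below reads the documented names with the sample values above as defaults. This is illustrative only; the runtime's actual settings loader may differ.

```python
import os


def load_settings() -> dict:
    """Read the documented environment variables with illustrative defaults.

    The variable names come from this README; the defaults mirror the
    sample configuration shown above.
    """
    return {
        "redis_url": os.environ.get("REDIS_URL", "redis://localhost:6379/0"),
        "model_manager_url": os.environ.get(
            "MODEL_MANAGER_SERVICE_URL", "http://127.0.0.1:10053"
        ),
        "server_port": int(os.environ.get("SERVER_PORT", "8080")),
        "server_workers": int(os.environ.get("SERVER_WORKER_AMOUNT", "1")),
        "log_level": os.environ.get("LOG_LEVEL", "INFO"),
    }
```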
Agent Config
`agents.json` is now the primary agent registry. Existing deployments that still use `langgraph.json` remain compatible via fallback loading. LangGraph string entries still work:
```json
{
  "graphs": {
    "demo-agent": "./agents/demo.py:graph"
  }
}
```
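A string entry of the form `<path>:<attribute>` is resolved by importing the module file and pulling the named object off it. The helper below is an illustrative sketch of that resolution, not the loader actually shipped in `core/`:

```python
import importlib.util
from pathlib import Path


def load_entrypoint(entry: str):
    """Load an object from a "<path>:<attribute>" entrypoint string,
    e.g. "./agents/demo.py:graph"."""
    # Split on the last colon so relative paths with dots still parse.
    path_part, _, attr = entry.rpartition(":")
    module_path = Path(path_part).resolve()

    # Import the module directly from its file location.
    spec = importlib.util.spec_from_file_location(module_path.stem, module_path)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)

    # Return the named attribute (e.g. the compiled graph object).
    return getattr(module, attr)
```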
You can also use the extended form to declare the framework explicitly and attach adapter-specific settings:
```json
{
  "graphs": {
    "demo-agent": {
      "framework": "langgraph",
      "entrypoint": "./agents/demo.py:graph"
    },
    "openclaw-agent": {
      "framework": "openclaw",
      "agent_id": "openclaw-agent",
      "create_agent": true,
      "agent_config": {
        "name": "OpenClaw Agent"
      },
      "client": {
        "gateway_ws_url": "ws://127.0.0.1:18789/gateway",
        "api_key": "your_gateway_token",
        "scopes": ["operator.read", "operator.write"],
        "device_identity_path": "~/.openclaw/identity/device.json",
        "timeout": 60
      },
      "input_schema": {
        "type": "object",
        "properties": {
          "query": {
            "type": "string"
          }
        },
        "required": ["query"]
      },
      "context_schema": {
        "type": "object",
        "properties": {
          "CHAT_PROVIDER": {
            "type": "string"
          }
        }
      }
    }
  }
}
```
For OpenClaw agents, API and SDK calls map the local `thread_id` to the OpenClaw SDK `session_name`, so each Chatbot conversation uses an isolated OpenClaw session by default. When `create_agent` is enabled, thread creation calls `list_agents()` and creates the remote OpenClaw agent if it is missing. If `workspace` is omitted, the adapter creates the agent under the OpenClaw client work directory using the OpenClaw workspace naming convention, for example `.openclaw/workspace-openclaw-agent`.

The Gateway protocol requires both `auth.token` and a signed device identity during connect. In practice that means `client.api_key` alone is not enough for a normal WS connection: you also need a paired device identity (default `~/.openclaw/identity/device.json`) or a gateway configured for insecure local auth.

When `client.api_key` is configured, this project now auto-generates a local Ed25519 device identity if `client.device_identity_path` does not exist yet. The first connection may still require device approval on the gateway before `hello-ok.auth.deviceToken` is issued.
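The session and workspace mapping described above can be sketched as two small helpers. Both names are hypothetical, and the one-to-one `thread_id` to `session_name` mapping is an assumption based on the description; the adapter's real implementation may differ:

```python
def session_name_for_thread(thread_id: str) -> str:
    """Map a local thread to its own OpenClaw session so conversations
    stay isolated by default (assumed one-to-one mapping)."""
    return thread_id


def default_workspace(agent_id: str) -> str:
    """Default workspace path used when "workspace" is omitted, following
    the naming convention described above."""
    return f".openclaw/workspace-{agent_id}"
```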
Request Payloads
Thread run and stream endpoints accept the current payload shape:
```json
{
  "query": "hello",
  "attachments": []
}
```
They also accept the legacy Chatbot payload shape:
```json
{
  "inputs": {
    "user_input": "hello"
  },
  "attachments": []
}
```
Internally, both forms are normalized to `{"query": "hello"}` plus an attachment list before they reach the adapter. OpenClaw SSE events include `conversation_id` and use `model` / `tools` nodes for `node_message`, `tools_message`, and `token_stream` events.
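The normalization step can be sketched roughly as follows; this is an illustrative reconstruction of the behavior described above, not the server's actual code:

```python
def normalize_payload(payload: dict) -> tuple[dict, list]:
    """Normalize both accepted payload shapes to ({"query": ...}, attachments)."""
    attachments = payload.get("attachments") or []
    if "query" in payload:
        # Current shape: {"query": ..., "attachments": [...]}
        query = payload["query"]
    else:
        # Legacy Chatbot shape: {"inputs": {"user_input": ...}}
        query = payload.get("inputs", {}).get("user_input", "")
    return {"query": query}, attachments
```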
Running the server
Run the application with Uvicorn:
```shell
uvicorn service:create_fastapi_app --factory --host 0.0.0.0 --port 8080
```
After startup:
- API root: `http://127.0.0.1:8080/api/v1`
- OpenAPI docs: `http://127.0.0.1:8080/docs`
- Built-in client: `http://127.0.0.1:8080/site`
SDK Usage
For OpenClaw agents, a package consumer can configure the gateway directly in the SDK constructor instead of shipping an agents.json file:
```python
import asyncio

from agent_api_server import AgentSDK


async def main():
    sdk = AgentSDK(
        agent_name="demo-openclaw-agent",
        agent_id="your-openclaw-agent-id",
        gateway_ws_url="ws://127.0.0.1:18789/gateway",
        api_key="your_gateway_token",
        scopes=["operator.read", "operator.write"],
        device_identity_path="~/.openclaw/identity/device.json",
        timeout=60,
    )
    result = await sdk.run(query="hello", thread_id="sdk-run-thread")
    print(result.content)


asyncio.run(main())
```
To print each streamed chunk to the console as it arrives:
```python
import asyncio

from agent_api_server import AgentSDK


async def main():
    sdk = AgentSDK(
        agent_name="demo-openclaw-agent",
        agent_id="your-openclaw-agent-id",
        gateway_ws_url="ws://127.0.0.1:18789/gateway",
        api_key="your_gateway_token",
        scopes=["operator.read", "operator.write"],
        device_identity_path="~/.openclaw/identity/device.json",
        timeout=60,
    )
    async for chunk in sdk.stream(query="hello", thread_id="sdk-stream-thread"):
        print(chunk, end="", flush=True)


asyncio.run(main())
```
The existing config-file style remains supported:
```python
import asyncio

from agent_api_server import AgentSDK


async def main():
    sdk = AgentSDK()
    result = await sdk.run(
        "demo-agent",
        {"query": "hello"},
        thread_id="sdk-thread",
    )
    print(result.content)


asyncio.run(main())
```
For synchronous usage with direct OpenClaw settings:
```python
from agent_api_server import AgentSDK

sdk = AgentSDK(
    agent_name="demo-openclaw-agent",
    agent_id="your-openclaw-agent-id",
    gateway_ws_url="ws://127.0.0.1:18789/gateway",
    api_key="your_gateway_token",
    scopes=["operator.read", "operator.write"],
    device_identity_path="~/.openclaw/identity/device.json",
    timeout=60,
)
result = sdk.run_sync(query="hello", thread_id="sdk-sync-thread")
print(result.content)
```
For synchronous usage with config-file-managed agents:
```python
from agent_api_server import AgentSDK

sdk = AgentSDK()
result = sdk.run_sync("demo-agent", {"query": "hello"}, thread_id="sync-thread")
print(result.content)
```
`attachments` is the public file/URL metadata channel. `ts_tenant`, `ei_token`, `runtime_config`, and `use_system_llm` are optional advanced parameters for deployments that need tenant-specific model credentials or LangGraph runtime configuration.
Build
Build source and wheel distributions with Poetry:
```shell
poetry build
```
Layout
The repository is now organized as layered packages:
- `adapters/` for framework-specific agent adapters
- `core/` for agent runtime, loading, and shared execution models
- `common/` for config, logging, crypto, Redis, Postgres, NATS, and formatting helpers
- `api/` for FastAPI route modules
- `integration/` for registration and model-update listener integrations
- `sdk/` for the direct SDK client
Download files
File details
Details for the file `agent_api_server-2.2.1a2.tar.gz`.
File metadata
- Download URL: agent_api_server-2.2.1a2.tar.gz
- Upload date:
- Size: 1.1 MB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.3
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `cadfc64345f87ea501b93c3dd3d1c56729094cc95a348be3b0566c3065cc715b` |
| MD5 | `cb877f83140dfc712886b449f435c19d` |
| BLAKE2b-256 | `ddc06ef75544869f0e0ef1cb65f7ba2bf54b916a77ad42dc608155e889006bd6` |
File details
Details for the file `agent_api_server-2.2.1a2-py3-none-any.whl`.
File metadata
- Download URL: agent_api_server-2.2.1a2-py3-none-any.whl
- Upload date:
- Size: 1.2 MB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.3
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `f2d8a97c70bd1ee055edc3069b2ebfd37b5814b13e37c50206760d2e72a52761` |
| MD5 | `a207e3fdf9ff6ab711643ac22f4c841b` |
| BLAKE2b-256 | `d0b9a61976a1b431122c7de16b8cffc0b47681aac7bba4bc53a6c48f289f4db4` |