FtAi Agent Hub adapter for LangChain / LangGraph streaming

ftai-langchain

A universal adapter that connects any LangChain / LangGraph model or graph to FtAi Agent Hub.

Provide a single async generator (it takes OpenAI-format messages and yields AIMessageChunk) and you get streaming output, tool-call reporting, human-in-the-loop interaction, automatic reconnection, and everything else for free.

Installation

uv add ftai-langchain

Quick start

Connecting a standalone Chat Model

import asyncio
import os

from langchain_anthropic import ChatAnthropic
from ftai_langchain import LangChainAgentHubClient, openai_to_langchain

model = ChatAnthropic(model="claude-sonnet-4-6")


async def handler(messages):
    """Take OpenAI-format messages, yield LangChain AIMessageChunk objects."""
    lc_messages = openai_to_langchain(messages)
    async for chunk in model.astream(lc_messages):
        yield chunk


async def main():
    client = LangChainAgentHubClient(secret=os.environ["AGENT_SECRET"])
    await client.run(handler)


asyncio.run(main())
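Because a ChatHandler is just an async generator, you can exercise the contract locally before connecting to Agent Hub. A dependency-free sketch: StubChunk, echo_handler, and collect are illustrative stand-ins, not part of ftai-langchain.

```python
import asyncio
from dataclasses import dataclass


@dataclass
class StubChunk:
    """Stand-in for AIMessageChunk; only the content field matters here."""
    content: str


async def echo_handler(messages):
    """Toy handler: streams the last user message back word by word."""
    last = messages[-1]["content"]
    for word in last.split():
        yield StubChunk(content=word + " ")


async def collect(handler, messages):
    """Drive a handler the way the client would, joining the streamed text."""
    return "".join([chunk.content async for chunk in handler(messages)])


text = asyncio.run(collect(echo_handler, [{"role": "user", "content": "hello world"}]))
print(text)  # "hello world "
```

Any real handler that satisfies this shape can be swapped in for echo_handler.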

Connecting a LangGraph graph

import asyncio
import os

from langchain.agents import create_agent
from ftai_langchain import LangChainAgentHubClient, openai_to_langchain


def get_weather(city: str) -> str:
    """Get the weather for a city."""
    return f"{city}: sunny, 25°C"


agent = create_agent(
    model="anthropic:claude-sonnet-4-6",
    tools=[get_weather],
)


async def handler(messages):
    lc_messages = openai_to_langchain(messages)
    async for msg, _metadata in agent.astream(
        {"messages": lc_messages},
        stream_mode="messages",
    ):
        yield msg  # AIMessageChunk or ToolMessage; both are handled


async def main():
    client = LangChainAgentHubClient(secret=os.environ["AGENT_SECRET"])
    await client.run(handler)


asyncio.run(main())

Caveat: streaming output from LangGraph subgraphs

If you embed the graph returned by create_agent() as a node in a larger StateGraph, the outer graph's astream(stream_mode="messages") does not propagate the subgraph's internal streaming chunks by default; it only returns the subgraph's final result. Pass subgraphs=True:

# ❗ With subgraphs=True the stream format changes from (msg, metadata)
# to (namespace, (msg, metadata)); `app` below is the outer compiled graph
async def handler(messages):
    lc_messages = openai_to_langchain(messages)
    async for _namespace, (msg, _metadata) in app.astream(
        {"messages": lc_messages},
        stream_mode="messages",
        subgraphs=True,     # ← pierce subgraphs to receive every streaming chunk
    ):
        yield msg

See examples/3_langgraph_react/ for a complete example.
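With subgraphs=True, every streamed item carries a namespace tuple identifying which (sub)graph node produced it, so you can filter rather than forward everything. A dependency-free sketch over stub (namespace, (msg, metadata)) tuples; the node names and namespace strings here are made up for illustration:

```python
def chunks_from_node(stream_items, node_prefix):
    """Keep only messages whose namespace starts with the given node name.

    stream_items: iterable of (namespace, (msg, metadata)) pairs, shaped like
    the output of astream(..., stream_mode="messages", subgraphs=True).
    """
    for namespace, (msg, _metadata) in stream_items:
        if namespace and namespace[0].startswith(node_prefix):
            yield msg


# Stub stream: each namespace element is typically "node_name:<task-id>"
items = [
    (("agent:abc123",), ("chunk-1", {})),
    (("summarizer:def456",), ("chunk-2", {})),
    (("agent:abc123",), ("chunk-3", {})),
]
print(list(chunks_from_node(items, "agent")))  # ['chunk-1', 'chunk-3']
```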

Core concepts

ChatHandler

async def handler(messages: list[dict]) -> AsyncIterator[BaseMessage]:
    ...

| Parameter | Type | Description |
| --- | --- | --- |
| messages | list[dict[str, Any]] | Full conversation history in OpenAI format |
| yield | AIMessageChunk | Text / thinking / tool-call fragments → converted to protocol messages automatically |
| yield | ToolMessage | (Optional) a tool finished executing → triggers a tool_call report |

Automatic event mapping

| Yielded value | Agent Hub protocol message |
| --- | --- |
| AIMessageChunk.content (text) | stream_text |
| AIMessageChunk.additional_kwargs["reasoning_content"] | stream_thinking |
| AIMessageChunk.tool_call_chunks | accumulated internally; reported when the ToolMessage arrives |
| ToolMessage | tool_call (reports tool name + arguments) |
| Handler returns normally | message_end (automatic) |
| Handler is cancelled | message_end(cancel) (automatic) |
| Handler raises an exception | error(internal_error) (automatic) |
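ftai-langchain exports ToolCallAccumulator for the accumulation step above; its exact interface isn't shown in this README, so here is a dependency-free sketch of the reassembly it describes. LangChain's tool_call_chunks are dicts with name, args (a JSON string fragment), id, and index, where index ties the fragments of one call together:

```python
import json


def accumulate_tool_calls(chunks):
    """Merge streamed tool_call_chunk dicts into complete tool-call records.

    Fragments sharing an index belong to one call; name and id arrive on the
    first fragment, args arrives as concatenable JSON string pieces.
    """
    calls = {}
    for chunk in chunks:
        slot = calls.setdefault(chunk["index"], {"name": None, "id": None, "args": ""})
        if chunk.get("name"):
            slot["name"] = chunk["name"]
        if chunk.get("id"):
            slot["id"] = chunk["id"]
        slot["args"] += chunk.get("args") or ""
    return [
        {"name": c["name"], "id": c["id"], "args": json.loads(c["args"] or "{}")}
        for _, c in sorted(calls.items())
    ]


chunks = [
    {"name": "get_weather", "id": "call_1", "args": '{"ci', "index": 0},
    {"name": None, "id": None, "args": 'ty": "Paris"}', "index": 0},
]
print(accumulate_tool_calls(chunks))
# [{'name': 'get_weather', 'id': 'call_1', 'args': {'city': 'Paris'}}]
```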

API reference

LangChainAgentHubClient

from ftai_langchain import LangChainAgentHubClient

client = LangChainAgentHubClient(
    secret="sk-ftai-ag-...",        # Agent secret (required)
    # agent_hub_url="wss://...",    # optional; defaults to the AGENT_HUB_URL env var or the built-in address
    reconnect_initial=2.0,          # initial reconnect interval (seconds)
    reconnect_max=60.0,             # maximum reconnect interval (seconds)
)

| Attribute / method | Description |
| --- | --- |
| client.agent_id | Agent ID after successful authentication (read-only) |
| await client.run(handler) | Connect to the gateway and serve requests (blocking, auto-reconnect) |
| await client.stop() | Gracefully close the connection |
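The reconnect_initial / reconnect_max pair suggests the familiar exponential-backoff-with-cap pattern. The client's actual schedule isn't documented here, but a typical doubling schedule under these two knobs would look like:

```python
def backoff_delays(initial, maximum, attempts):
    """Exponential backoff: double the delay each attempt, capped at maximum."""
    delay = initial
    for _ in range(attempts):
        yield delay
        delay = min(delay * 2, maximum)


# With the defaults above (2.0s initial, 60.0s cap):
print(list(backoff_delays(2.0, 60.0, 7)))
# [2.0, 4.0, 8.0, 16.0, 32.0, 60.0, 60.0]
```

Raising reconnect_max trades reconnection latency for less load on the gateway during long outages.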

Utility functions

from ftai_langchain import openai_to_langchain, ToolCallAccumulator
| Function / class | Description |
| --- | --- |
| openai_to_langchain(messages) | OpenAI-format dicts → list of LangChain BaseMessage |
| ToolCallAccumulator | Reassembles streamed tool_call_chunks into complete tool-call records |
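The core of openai_to_langchain is a role-to-message-class mapping. A sketch using a stand-in dataclass in place of LangChain's real SystemMessage / HumanMessage / AIMessage classes; the real function presumably also handles tool results and richer content parts:

```python
from dataclasses import dataclass


@dataclass
class Msg:
    """Stand-in for LangChain's message classes; real code would construct
    SystemMessage / HumanMessage / AIMessage instances instead."""
    kind: str
    content: str


ROLE_TO_KIND = {"system": "system", "user": "human", "assistant": "ai"}


def to_langchain_like(messages):
    """Convert OpenAI-format dicts into typed message objects."""
    return [Msg(kind=ROLE_TO_KIND[m["role"]], content=m["content"]) for m in messages]


msgs = to_langchain_like([
    {"role": "system", "content": "Be brief."},
    {"role": "user", "content": "Hi"},
])
print([(m.kind, m.content) for m in msgs])
# [('system', 'Be brief.'), ('human', 'Hi')]
```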

Relationship to ftai-deep-agent

ftai-deep-agent is built on top of this package. It is a thin DeepAgent-oriented wrapper: it packages agent.astream(stream_mode="messages") as a ChatHandler and delegates to LangChainAgentHubClient.

If you use DeepAgent, use ftai-deep-agent directly; for any other LangChain / LangGraph setup, use this package.
