# pydantic-ai-chat-ui
A lightweight adapter that lets Pydantic AI agents speak Vercel's Data Stream format via `useChat`, and more specifically the LlamaIndex Chat UI flavour. The adapter supports:

- streaming real-time messages (SSE) from Pydantic AI agents to your frontend, including intermediate tool calls
- converting historical Pydantic AI messages, should you store them and want to return a full message history
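For orientation, the Data Stream format the adapter targets is a sequence of Server-Sent Events frames, each carrying one JSON-typed stream part. A minimal sketch of the framing (the part names here follow the AI SDK v5 UI message stream protocol as commonly documented; treat the exact shapes emitted by this adapter as authoritative):

```python
import json


def sse_frame(part: dict) -> str:
    """Encode one stream part as a Server-Sent Events frame."""
    return f"data: {json.dumps(part)}\n\n"


# A streamed text response is typically split into start/delta/end parts.
frames = [
    sse_frame({"type": "text-start", "id": "msg-1"}),
    sse_frame({"type": "text-delta", "id": "msg-1", "delta": "Hello"}),
    sse_frame({"type": "text-end", "id": "msg-1"}),
]

print("".join(frames))
```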
Check out:

- Vercel AI SDK (TypeScript, v5+): https://github.com/vercel/ai
- LlamaIndex Chat UI (React): https://github.com/run-llama/chat-ui
## Installation

```shell
uv add pydantic-ai-chat-ui
# or
poetry add pydantic-ai-chat-ui
# or
pip install pydantic-ai-chat-ui
```
## Quickstart Example

Not all logic is implemented, but hopefully it's enough of a guide to point you in the right direction.

### Route (FastAPI)
```python
import logging

from fastapi import APIRouter
from fastapi.responses import StreamingResponse
from pydantic_ai_chat_ui import ChatRequest, stream_results
from pydantic_ai_chat_ui.tools import DataPartState
from voyageai.client_async import AsyncClient as AsyncVoyageClient

from your_app.agents import your_agent
from your_app.agents.operative_report_state import OperativeReportDeps
from your_app.app.security import UserIdDep
from your_app.config import settings
from your_app.database import SessionDep
from your_app.threads import create_or_get_thread, store_message

logger = logging.getLogger(__name__)
router = APIRouter()


@router.post("/chat")
async def agent_chat(
    chat_request: ChatRequest, user_id: UserIdDep, db_session: SessionDep
) -> StreamingResponse:
    thread = create_or_get_thread(chat_request.id, user_id, db_session=db_session)
    deps = OperativeReportDeps(
        thread_id=thread.id,
        voyage_client=AsyncVoyageClient(settings.VOYAGE_API_KEY),
        db_session=db_session,
    )

    return StreamingResponse(
        stream_results(
            chat_request.messages[0],
            your_agent.agent,
            deps,
            message_history=thread.messages,
            tool_messages={
                "validate_data": {
                    DataPartState.PENDING: "Validating data...",
                    DataPartState.SUCCESS: "Data validated.",
                    DataPartState.ERROR: "Encountered an error during validation.",
                },
                "identify_stuff": {
                    DataPartState.PENDING: "Identifying and analysing stuff...",
                    DataPartState.SUCCESS: "Stuff identified.",
                    DataPartState.ERROR: "Encountered an error during identification.",
                },
            },
            store_message_history=lambda message: store_message(
                thread.id, message, db_session=db_session
            ),
        ),
        media_type="text/event-stream",
        headers={
            "Cache-Control": "no-cache",
            "Connection": "keep-alive",
            # this is an important header for having types picked up
            "X-Vercel-AI-UI-Message-Stream": "v1",
        },
    )
```
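The `tool_messages` argument above is just a lookup from tool name and part state to a user-facing status string, shown in the UI while a tool runs. The idea can be sketched with a stand-in enum (the real `DataPartState` lives in `pydantic_ai_chat_ui.tools`; this mirror and the fallback helper are illustrative only):

```python
from enum import Enum


class DataPartState(Enum):
    """Stand-in for pydantic_ai_chat_ui.tools.DataPartState."""

    PENDING = "pending"
    SUCCESS = "success"
    ERROR = "error"


TOOL_MESSAGES = {
    "validate_data": {
        DataPartState.PENDING: "Validating data...",
        DataPartState.SUCCESS: "Data validated.",
        DataPartState.ERROR: "Encountered an error during validation.",
    },
}


def status_label(tool: str, state: DataPartState) -> str:
    # Fall back to a generic label for tools without custom copy.
    return TOOL_MESSAGES.get(tool, {}).get(state, f"Running {tool}...")


print(status_label("validate_data", DataPartState.PENDING))  # Validating data...
print(status_label("unknown_tool", DataPartState.PENDING))   # Running unknown_tool...
```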
### Frontend
```tsx
import { useChat } from "@ai-sdk/react";
import {
  ChatCanvas,
  ChatInput,
  ChatMessage,
  ChatMessages,
  ChatSection,
  useChatUI,
} from "@llamaindex/chat-ui";
import { createFileRoute } from "@tanstack/react-router";
import { DefaultChatTransport, type UIMessage } from "ai";
import { toast } from "sonner";
import { v4 as uuid4 } from "uuid";

import "@llamaindex/chat-ui/styles/editor.css";
import "@llamaindex/chat-ui/styles/markdown.css";
import "@llamaindex/chat-ui/styles/pdf.css";
import "@mdxeditor/editor/style.css";

export const Route = createFileRoute("/(app)/chat")({
  component: ChatRoute,
});

function ChatRoute() {
  const initialMessages: UIMessage[] = [];
  const { messages, status, sendMessage, stop, regenerate, setMessages } =
    useChat({
      transport: new DefaultChatTransport({
        api: `${import.meta.env.VITE_BASE_API_URL}/chat`,
      }),
      generateId: () => uuid4(),
      messages: initialMessages,
      onError: (err) => toast.error(err.message),
    });

  const CustomChatMessages = () => {
    const { messages } = useChatUI();

    return (
      <>
        {messages.map((message, idx) => (
          <ChatMessage
            key={`message-${message.id}`}
            message={message}
            isLast={idx === messages.length - 1}
          >
            <ChatMessage.Avatar />
            <ChatMessage.Content>
              <ChatMessage.Part.Markdown />
              <ChatMessage.Part.Artifact />
              <ChatMessage.Part.Event />
              <ChatMessage.Part.Suggestion />
            </ChatMessage.Content>
            <ChatMessage.Actions />
          </ChatMessage>
        ))}
      </>
    );
  };

  return (
    <ChatSection
      handler={{
        messages,
        status,
        sendMessage,
        stop,
        regenerate,
        setMessages,
      }}
      className="h-full flex-row gap-4 p-0 flex md:p-5"
    >
      <div className="mx-auto flex h-full min-w-0 max-w-full flex-1 flex-col gap-4">
        <ChatMessages>
          <ChatMessages.List>
            <CustomChatMessages />
          </ChatMessages.List>
          <ChatMessages.Empty />
          <ChatMessages.Loading />
        </ChatMessages>
        <ChatInput>
          <ChatInput.Form>
            <ChatInput.Field className="max-h-32" />
            <ChatInput.Submit />
          </ChatInput.Form>
        </ChatInput>
      </div>
      <ChatCanvas className="w-full md:w-2/3" />
    </ChatSection>
  );
}
```