# pig-web-ui

Web UI components for AI chat interfaces with a FastAPI backend.
## Features
- 🚀 FastAPI Backend: High-performance async web server
- 💬 Chat Interface: Ready-to-use chat UI
- ⚡ SSE Streaming: Server-Sent Events for real-time responses
- 🎨 Modern UI: Clean, responsive design
- 🔌 Easy Integration: Works with pig-agent-core
- 📱 Mobile Friendly: Responsive design
## Installation

```bash
pip install pig-web-ui
```
## Quick Start

### Simple Chat Server

```python
from pig_web_ui import ChatServer
from pig_llm import LLM

# Create server
server = ChatServer(
    llm=LLM(provider="openai"),
    title="My AI Assistant",
    port=8000,
)

# Run server
server.run()
```
Then open http://localhost:8000 in your browser!
### With Agent and Tools

```python
from pig_web_ui import ChatServer
from pig_llm import LLM
from pig_agent_core import Agent, tool

@tool(description="Get current time")
def get_time() -> str:
    from datetime import datetime
    return datetime.now().strftime("%H:%M:%S")

# Create agent
agent = Agent(
    llm=LLM(),
    tools=[get_time],
    system_prompt="You are a helpful assistant.",
)

# Create server with agent
server = ChatServer(agent=agent, title="Time Assistant")
server.run()
```
### Custom Routes

```python
from pig_web_ui import ChatServer
from pig_llm import LLM

server = ChatServer(llm=LLM())

@server.app.get("/custom")
async def custom_route():
    return {"message": "Custom endpoint"}

server.run()
```
## API Usage

### Chat Endpoint

```bash
# Send message (SSE streaming)
curl -N http://localhost:8000/api/chat \
  -H "Content-Type: application/json" \
  -d '{"message": "Hello!"}'

# Response streams as SSE:
# data: {"type": "start"}
# data: {"type": "token", "content": "Hi"}
# data: {"type": "token", "content": " there"}
# data: {"type": "done"}
```
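Each SSE frame is a `data:` line carrying a JSON payload. A minimal client-side parsing sketch, using the event shapes shown above (this helper is an illustration, not part of the package):

```python
import json

def parse_sse_events(raw: str) -> list[dict]:
    """Parse `data:` lines from an SSE stream into event dicts."""
    events = []
    for line in raw.splitlines():
        if line.startswith("data:"):
            events.append(json.loads(line[len("data:"):].strip()))
    return events

# Sample stream, as emitted by the chat endpoint above
stream = (
    'data: {"type": "start"}\n'
    'data: {"type": "token", "content": "Hi"}\n'
    'data: {"type": "token", "content": " there"}\n'
    'data: {"type": "done"}\n'
)
events = parse_sse_events(stream)
reply = "".join(e["content"] for e in events if e["type"] == "token")
print(reply)  # Hi there
```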
### WebSocket Support

```python
from pig_web_ui import ChatServer
from pig_llm import LLM

server = ChatServer(llm=LLM(), use_websocket=True)
server.run()
```

```javascript
// Client-side: wait for the connection to open before sending
const ws = new WebSocket('ws://localhost:8000/ws');
ws.onopen = () => ws.send(JSON.stringify({message: "Hello"}));
ws.onmessage = (event) => {
  const data = JSON.parse(event.data);
  console.log(data.content);
};
```
## Configuration

### Server Options

```python
from pig_web_ui import ChatServer
from pig_llm import LLM

server = ChatServer(
    llm=LLM(),
    title="My Assistant",   # Page title
    port=8000,              # Server port
    host="0.0.0.0",         # Bind address
    cors=True,              # Enable CORS
    api_prefix="/api",      # Prefix for API routes
    static_dir="./static",  # Static assets directory
)
```
### Environment Variables

```bash
export OPENAI_API_KEY=your-key
export WEB_UI_PORT=8000
export WEB_UI_HOST=0.0.0.0
```
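How the server consumes these variables at startup is not documented here; a plausible sketch of the lookup, using the variable names above (the fallback defaults are assumptions):

```python
import os

def load_server_config() -> dict:
    """Read web UI settings from the environment, falling back to defaults.

    WEB_UI_PORT / WEB_UI_HOST / OPENAI_API_KEY names come from the docs
    above; the default values here are assumptions for illustration.
    """
    return {
        "api_key": os.environ.get("OPENAI_API_KEY"),
        "port": int(os.environ.get("WEB_UI_PORT", "8000")),
        "host": os.environ.get("WEB_UI_HOST", "127.0.0.1"),
    }

os.environ["WEB_UI_PORT"] = "8080"
config = load_server_config()
print(config["port"])  # 8080
```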
## UI Customization

### Custom Theme

```python
from pig_web_ui import ChatServer
from pig_llm import LLM

theme = {
    "primary_color": "#007bff",
    "background_color": "#ffffff",
    "message_user_bg": "#007bff",
    "message_assistant_bg": "#f0f0f0",
}

server = ChatServer(llm=LLM(), theme=theme)
```
### Custom Templates

```python
from pig_web_ui import ChatServer
from pig_llm import LLM

server = ChatServer(
    llm=LLM(),
    template_dir="./my_templates",
)
```

Create `my_templates/chat.html`:

```html
<!DOCTYPE html>
<html>
<head>
    <title>{{ title }}</title>
    <!-- Your custom HTML -->
</head>
<body>
    <!-- Your custom chat UI -->
</body>
</html>
```
## CLI Usage

```bash
# Start server with default settings
pig-webui

# Specify model and port
pig-webui --model gpt-4 --port 8080

# With agent configuration
pig-webui --agent-config agent.json
```
## Deployment

### Docker

```dockerfile
FROM python:3.11-slim
WORKDIR /app
COPY . .
RUN pip install pig-web-ui
CMD ["pig-webui", "--host", "0.0.0.0"]
```
### Production

```bash
# Use gunicorn with uvicorn workers; create_app is invoked as an app factory
gunicorn "pig_web_ui.app:create_app()" -k uvicorn.workers.UvicornWorker
```
## Components

### ChatServer

Main server class:

```python
from pig_web_ui import ChatServer

server = ChatServer(
    llm=None,          # LLM instance
    agent=None,        # Or Agent instance
    title="Chat",      # Page title
    port=8000,         # Server port
    host="127.0.0.1",  # Server host
    cors=False,        # Enable CORS
)
```
### Endpoints

- `GET /` - Chat interface
- `POST /api/chat` - Send message (SSE stream)
- `GET /api/history` - Get chat history
- `DELETE /api/history` - Clear history
- `WS /ws` - WebSocket connection (if enabled)
## Examples

See `examples/web-ui/`:

- `basic_server.py` - Basic chat server
- `agent_server.py` - Server with agent and tools
- `custom_ui.py` - Custom UI example
- `websocket_demo.py` - WebSocket example
## Architecture

```
Browser
   ↓
FastAPI Server
   ↓
ChatServer
   ↓
Agent/LLM (pig-agent-core / pig-llm)
```
## License

MIT