# textual-chat

LLM chat for humans. Add AI to your terminal app in a few lines of code.
```python
from textual.app import App, ComposeResult
from textual_chat import Chat


class MyApp(App):
    def compose(self) -> ComposeResult:
        yield Chat()


MyApp().run()
```
That's it. No configuration, no boilerplate.
## Features
- Zero-config - Auto-detects your LLM setup and just works
- ACP agents - Works with Claude Code, OpenCode, and custom agents
- Function calling - Decorate Python functions as tools
- Fully customizable - It's a Textual widget, style it however you want
## Install

```shell
uv add textual-chat
```

Or with pip: `pip install textual-chat`

For ACP agent support: `uv add textual-chat[acp]`
## Quick Start

Set an API key:

```shell
export ANTHROPIC_API_KEY=sk-ant-...  # or OPENAI_API_KEY
```
Then run:
```python
from textual.app import App, ComposeResult
from textual_chat import Chat


class MyApp(App):
    def compose(self) -> ComposeResult:
        yield Chat(
            model="claude-sonnet-4-20250514",       # Optional
            system="You are a helpful assistant.",  # Optional
        )


MyApp().run()
```
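The zero-config behavior suggests the widget picks a provider from whichever API key is set. A minimal sketch of how such detection could work (the function name and default model IDs here are hypothetical, not textual-chat's actual logic):

```python
import os


def detect_default_model(env=None):
    """Pick a default model based on which API key is present.

    Illustrative only: real auto-detection likely checks more
    providers and configuration sources than this.
    """
    env = env if env is not None else os.environ
    if env.get("ANTHROPIC_API_KEY"):
        return "claude-sonnet-4-20250514"
    if env.get("OPENAI_API_KEY"):
        return "gpt-4o"
    return None  # no key found; the widget would have to prompt or error


print(detect_default_model({"ANTHROPIC_API_KEY": "sk-ant-test"}))
```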
## Tools

Pass any FastMCP tool directly:
```python
from fastmcp import FastMCP

mcp = FastMCP("My Tools")


@mcp.tool
def get_weather(city: str) -> str:
    """Get the weather for a city."""
    return f"72°F and sunny in {city}"


chat = Chat(tools=mcp.tools)
```
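For function calling to work, a typed function like `get_weather` has to be turned into a tool schema the model can read. A rough sketch of that introspection using only the standard library (the real FastMCP/textual-chat conversion is more thorough; `tool_schema` is a hypothetical helper):

```python
import inspect

# Minimal mapping from Python annotations to JSON Schema type names.
PY_TO_JSON = {str: "string", int: "integer", float: "number", bool: "boolean"}


def tool_schema(fn):
    """Build a JSON-Schema-style tool description from a typed function."""
    sig = inspect.signature(fn)
    props = {
        name: {"type": PY_TO_JSON.get(p.annotation, "string")}
        for name, p in sig.parameters.items()
    }
    return {
        "name": fn.__name__,
        "description": inspect.getdoc(fn) or "",
        "parameters": {
            "type": "object",
            "properties": props,
            "required": list(props),
        },
    }


def get_weather(city: str) -> str:
    """Get the weather for a city."""
    return f"72°F and sunny in {city}"


schema = tool_schema(get_weather)
```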
Or use the `@chat.tool` decorator for quick one-offs:
```python
chat = Chat()


@chat.tool
def search(query: str) -> str:
    """Search the web."""
    return run_search(query)  # call your own search backend here
```
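A decorator like `@chat.tool` can be understood as a small registry that records the function by name and hands it back unchanged. A sketch under that assumption (`ToolRegistry` is hypothetical, not the library's class):

```python
class ToolRegistry:
    """Toy stand-in for an object exposing a .tool decorator."""

    def __init__(self):
        self.tools = {}

    def tool(self, fn):
        # Register the function under its own name, return it unmodified
        # so the decorated function stays directly callable.
        self.tools[fn.__name__] = fn
        return fn


registry = ToolRegistry()


@registry.tool
def search(query: str) -> str:
    """Search the web (stub)."""
    return f"results for {query!r}"


print(registry.tools["search"]("textual"))  # → results for 'textual'
```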
## Examples

See the `examples/` folder for complete examples:

| Example | Description |
|---|---|
| `basic.py` | Minimal chat app |
| `with_tools.py` | Function calling |
| `with_thinking.py` | Extended thinking (Claude) |
| `with_mcp.py` | MCP server tools |
| `custom_model.py` | Custom model and system prompt |
| `in_larger_app.py` | Sidebar integration with tools |
| `chatbot_modal.py` | Modal dialog pattern |
| `chatbot_sidebar.py` | Toggleable sidebar |
| `with_tabs.py` | Tabbed interface |
| `acp_chat.py` | ACP agent integration |

Run any example:

```shell
uv run examples/basic.py
uv run examples/acp_chat.py examples/echo_agent.py
```
## Configuration

```python
Chat(
    model="claude-sonnet-4-20250514",  # Model ID or agent command
    adapter="litellm",                 # "litellm" or "acp"
    system="You are a pirate.",        # System prompt
    temperature=0.9,                   # Response randomness
    thinking=True,                     # Extended thinking (Claude)
    tools=[fn1, fn2],                  # Tool functions
    cwd="/path/to/project",            # Working directory
    show_token_usage=True,             # Show token counts
    show_model_selector=True,          # Allow /model switching
)
```
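The `show_model_selector` option implies the chat input parses slash commands like `/model`. A minimal sketch of such parsing, assuming a simple "command plus argument" shape (not the library's actual implementation):

```python
def parse_slash_command(line: str):
    """Return (command, argument) for slash commands, else (None, line)."""
    if line.startswith("/"):
        # Split "/model gpt-4o" into the command name and the rest.
        cmd, _, arg = line[1:].partition(" ")
        return cmd, arg.strip()
    return None, line


print(parse_slash_command("/model gpt-4o"))  # → ('model', 'gpt-4o')
print(parse_slash_command("hello there"))    # → (None, 'hello there')
```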
## License

MIT