
textual-chat

LLM chat for humans. Add AI to your terminal app in a few lines of code.

from textual.app import App, ComposeResult
from textual_chat import Chat

class MyApp(App):
    def compose(self) -> ComposeResult:
        yield Chat()

MyApp().run()

That's it. No configuration, no boilerplate.

Features

  • Zero-config - Auto-detects your LLM setup and just works
  • ACP agents - Works with Claude Code, OpenCode, and custom agents
  • Function calling - Decorate Python functions as tools
  • Fully customizable - It's a Textual widget, style it however you want

Install

uv add textual-chat

Or with pip: pip install textual-chat

For ACP agent support: uv add "textual-chat[acp]"

Quick Start

Set an API key:

export ANTHROPIC_API_KEY=sk-ant-...  # or OPENAI_API_KEY

Then run:

from textual.app import App, ComposeResult
from textual_chat import Chat

class MyApp(App):
    def compose(self) -> ComposeResult:
        yield Chat(
            model="claude-sonnet-4-20250514",  # Optional
            system="You are a helpful assistant.",  # Optional
        )

MyApp().run()

Tools

Pass any FastMCP tool directly:

from fastmcp import FastMCP

mcp = FastMCP("My Tools")

@mcp.tool
def get_weather(city: str) -> str:
    """Get the weather for a city."""
    return f"72°F and sunny in {city}"

chat = Chat(tools=mcp.tools)

Or use the @chat.tool decorator for quick one-offs:

chat = Chat()

@chat.tool
def search(query: str) -> str:
    """Search the web."""
    # Stub result -- replace with a call to your real search backend
    return f"Top result for {query!r}: ..."
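Function calling works because the tool's type hints and docstring carry everything the model needs. Conceptually, a decorator can derive a JSON-schema-like description from the signature; a rough sketch of that mechanism (`describe_tool` is a hypothetical helper, not the library's implementation):

```python
import inspect

# Map Python annotations to JSON schema type names
TYPE_MAP = {str: "string", int: "integer", float: "number", bool: "boolean"}

def describe_tool(fn) -> dict:
    """Build a minimal tool description from a function's hints and docstring.

    Illustrative only; textual-chat's real schema generation may differ.
    """
    sig = inspect.signature(fn)
    params = {
        name: {"type": TYPE_MAP.get(p.annotation, "string")}
        for name, p in sig.parameters.items()
    }
    return {
        "name": fn.__name__,
        "description": (fn.__doc__ or "").strip(),
        "parameters": params,
    }

def get_weather(city: str) -> str:
    """Get the weather for a city."""
    return f"72°F and sunny in {city}"
```

This is why tools need type hints and a docstring: `describe_tool(get_weather)` yields a name, a description, and one `string` parameter, which is exactly what gets sent to the model.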

Examples

See the examples/ folder for complete examples:

Example              Description
basic.py             Minimal chat app
with_tools.py        Function calling
with_thinking.py     Extended thinking (Claude)
with_mcp.py          MCP server tools
custom_model.py      Custom model and system prompt
in_larger_app.py     Sidebar integration with tools
chatbot_modal.py     Modal dialog pattern
chatbot_sidebar.py   Toggleable sidebar
with_tabs.py         Tabbed interface
acp_chat.py          ACP agent integration

Run any example:

uv run examples/basic.py
uv run examples/acp_chat.py examples/echo_agent.py

Configuration

Chat(
    model="claude-sonnet-4-20250514",  # Model ID or agent command
    adapter="litellm",                 # "litellm" or "acp"
    system="You are a pirate.",        # System prompt
    temperature=0.9,                   # Response randomness
    thinking=True,                     # Extended thinking (Claude)
    tools=[fn1, fn2],                  # Tool functions
    cwd="/path/to/project",            # Working directory
    show_token_usage=True,             # Show token counts
    show_model_selector=True,          # Allow /model switching
)
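Since model accepts either a model ID or an agent command, a sensible default for adapter can be guessed from the model value: a plain ID routes through LiteLLM, while something that looks like an executable suggests an ACP agent. A hypothetical sketch of that dispatch (not the library's actual rule; pass adapter= explicitly to be unambiguous):

```python
import shutil

def guess_adapter(model: str) -> str:
    """Guess "acp" for agent commands, "litellm" for model IDs.

    Hypothetical heuristic for illustration only.
    """
    if "/" in model or shutil.which(model):
        return "acp"      # looks like a path or executable agent command
    return "litellm"      # looks like a provider model ID
```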

License

MIT


Built with Textual and LiteLLM

