
textual-chat

LLM chat for humans. Add AI to your terminal app in a few lines of code.

from textual.app import App, ComposeResult
from textual_chat import Chat

class MyApp(App):
    def compose(self) -> ComposeResult:
        yield Chat()

MyApp().run()

That's it. No configuration, no boilerplate.

Features

  • Zero-config - Auto-detects your LLM setup and just works
  • ACP agents - Works with Claude Code, OpenCode, and custom agents
  • Function calling - Decorate Python functions as tools
  • Fully customizable - It's a Textual widget, style it however you want
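Because the widget participates in Textual's normal styling system, it can be themed from your app's stylesheet. A purely illustrative Textual CSS fragment (the `Chat` selector assumes the widget's class name; the rules themselves are examples, not defaults shipped by the library):

```css
/* Illustrative Textual CSS -- adjust to taste */
Chat {
    height: 1fr;            /* fill the available vertical space */
    border: round $accent;  /* rounded border in the theme accent color */
    padding: 1 2;           /* 1 row / 2 columns of inner spacing */
}
```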

Install

uv add textual-chat

Or with pip: pip install textual-chat

For ACP agent support: uv add textual-chat[acp]

Quick Start

Set an API key:

export ANTHROPIC_API_KEY=sk-ant-...  # or OPENAI_API_KEY

Then run:

from textual.app import App, ComposeResult
from textual_chat import Chat

class MyApp(App):
    def compose(self) -> ComposeResult:
        yield Chat(
            model="claude-sonnet-4-20250514",  # Optional
            system="You are a helpful assistant.",  # Optional
        )

MyApp().run()

Tools

Pass any FastMCP tool directly:

from fastmcp import FastMCP

mcp = FastMCP("My Tools")

@mcp.tool
def get_weather(city: str) -> str:
    """Get the weather for a city."""
    return f"72°F and sunny in {city}"

chat = Chat(tools=mcp.tools)

Or use the @chat.tool decorator for quick one-offs:

chat = Chat()

@chat.tool
def search(query: str) -> str:
    """Search the web."""
    return results  # placeholder -- wire up your own search backend here
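How a plain Python function becomes an LLM tool isn't spelled out above. Libraries in this space typically read the function's signature and docstring to build a JSON schema for the model. A self-contained sketch of that mapping using only the standard library (the schema shape follows the common OpenAI-style convention; textual-chat's internals may differ):

```python
import inspect

# Map Python annotations to JSON-schema types (simplified)
_JSON_TYPES = {str: "string", int: "integer", float: "number", bool: "boolean"}

def tool_schema(fn) -> dict:
    """Build an OpenAI-style tool schema from a function's signature and docstring."""
    params = {
        name: {"type": _JSON_TYPES.get(p.annotation, "string")}
        for name, p in inspect.signature(fn).parameters.items()
    }
    return {
        "name": fn.__name__,
        "description": inspect.getdoc(fn) or "",
        "parameters": {
            "type": "object",
            "properties": params,
            "required": list(params),
        },
    }

def get_weather(city: str) -> str:
    """Get the weather for a city."""
    return f"72°F and sunny in {city}"

schema = tool_schema(get_weather)
print(schema["name"], schema["parameters"]["properties"])
# get_weather {'city': {'type': 'string'}}
```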

Examples

See the examples/ folder for complete, runnable apps:

Example              Description
basic.py             Minimal chat app
with_tools.py        Function calling
with_thinking.py     Extended thinking (Claude)
with_mcp.py          MCP server tools
custom_model.py      Custom model and system prompt
in_larger_app.py     Sidebar integration with tools
chatbot_modal.py     Modal dialog pattern
chatbot_sidebar.py   Toggleable sidebar
with_tabs.py         Tabbed interface

Run any example:

uv run examples/basic.py
uv run examples/acp_chat.py examples/echo_agent.py

Configuration

Chat(
    model="claude-sonnet-4-20250514",  # Model ID or agent command
    adapter="litellm",                  # "litellm" or "acp"
    system="You are a pirate.",         # System prompt
    temperature=0.9,                    # Response randomness
    thinking=True,                      # Extended thinking (Claude)
    tools=[fn1, fn2],                   # Tool functions
    cwd="/path/to/project",             # Working directory
    show_token_usage=True,              # Show token counts
    show_model_selector=True,           # Allow /model switching
)

License

MIT


Built with Textual and LiteLLM
