
textual-chat

LLM chat for humans. Add AI to your terminal app in a few lines of code.

from textual.app import App, ComposeResult
from textual_chat import Chat

class MyApp(App):
    def compose(self) -> ComposeResult:
        yield Chat()

MyApp().run()

That's it. No configuration, no boilerplate.

Features

  • Zero-config - Auto-detects your LLM setup and just works
  • ACP agents - Works with Claude Code, OpenCode, and custom agents
  • Function calling - Decorate Python functions as tools
  • Fully customizable - It's a Textual widget, style it however you want
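
Zero-config detection presumably keys off provider environment variables. A minimal sketch of that idea, where the helper name, model IDs, and fallback order are assumptions for illustration, not the library's actual logic:

```python
import os

# Hypothetical sketch of zero-config model detection: pick a default
# model based on which provider API key is present in the environment.
# The model IDs and the fallback order are assumptions, not textual-chat's
# real implementation.
def detect_default_model():
    if os.environ.get("ANTHROPIC_API_KEY"):
        return "claude-sonnet-4-20250514"
    if os.environ.get("OPENAI_API_KEY"):
        return "gpt-4o"
    return None  # no key found; the widget would need explicit config
```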

Install

uv add textual-chat

Or with pip: pip install textual-chat

For ACP agent support: uv add "textual-chat[acp]" (quoted so shells like zsh don't expand the brackets)

Quick Start

Set an API key:

export ANTHROPIC_API_KEY=sk-ant-...  # or OPENAI_API_KEY

Then run:

from textual.app import App, ComposeResult
from textual_chat import Chat

class MyApp(App):
    def compose(self) -> ComposeResult:
        yield Chat(
            model="claude-sonnet-4-20250514",  # Optional
            system="You are a helpful assistant.",  # Optional
        )

MyApp().run()

Tools

Pass any FastMCP tool directly:

from fastmcp import FastMCP

mcp = FastMCP("My Tools")

@mcp.tool
def get_weather(city: str) -> str:
    """Get the weather for a city."""
    return f"72°F and sunny in {city}"

chat = Chat(tools=mcp.tools)

Or use the @chat.tool decorator for quick one-offs:

chat = Chat()

@chat.tool
def search(query: str) -> str:
    """Search the web."""
    # Placeholder result; call your real search backend here
    return f"Results for {query!r}"
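
Under the hood, a tool decorator can derive everything the model needs from the plain function itself. A self-contained sketch of that mechanism, where the schema shape and the `tool_schema` helper are assumptions for illustration (textual-chat's actual format may differ):

```python
import inspect

# Map Python annotations to JSON-schema-ish type names (illustrative subset).
_PY_TO_JSON = {str: "string", int: "integer", float: "number", bool: "boolean"}

def tool_schema(fn):
    """Build a function-calling schema from a function's signature and docstring.

    Hypothetical helper showing how type hints and the docstring can become
    the tool's name, description, and parameter schema.
    """
    sig = inspect.signature(fn)
    params = {
        name: {"type": _PY_TO_JSON.get(p.annotation, "string")}
        for name, p in sig.parameters.items()
    }
    return {
        "name": fn.__name__,
        "description": (fn.__doc__ or "").strip(),
        "parameters": {
            "type": "object",
            "properties": params,
            "required": list(params),
        },
    }

def get_weather(city: str) -> str:
    """Get the weather for a city."""
    return f"72°F and sunny in {city}"

schema = tool_schema(get_weather)
```

This is why a docstring and type hints matter on tool functions: they become the description and parameter schema the model sees when deciding whether and how to call the tool.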

Examples

See the examples/ folder for complete examples:

Example              Description
-------------------  ------------------------------
basic.py             Minimal chat app
with_tools.py        Function calling
with_thinking.py     Extended thinking (Claude)
with_mcp.py          MCP server tools
custom_model.py      Custom model and system prompt
in_larger_app.py     Sidebar integration with tools
chatbot_modal.py     Modal dialog pattern
chatbot_sidebar.py   Toggleable sidebar
with_tabs.py         Tabbed interface
acp_chat.py          ACP agent integration

Run any example:

uv run examples/basic.py
uv run examples/acp_chat.py examples/echo_agent.py

Configuration

Chat(
    model="claude-sonnet-4-20250514",  # Model ID or agent command
    adapter="litellm",                  # "litellm" or "acp"
    system="You are a pirate.",         # System prompt
    temperature=0.9,                    # Response randomness
    thinking=True,                      # Extended thinking (Claude)
    tools=[fn1, fn2],                   # Tool functions
    cwd="/path/to/project",             # Working directory
    show_token_usage=True,              # Show token counts
    show_model_selector=True,           # Allow /model switching
)

License

MIT


Built with Textual and LiteLLM
