
Chatnificent

LLM chat app framework. Minimally complete. Maximally hackable.


Pre-built chat UIs give you a working app but almost no way to customize it. Building from scratch gives you full control but means wiring up a UI, LLM client, message store, streaming, auth, and tool calling yourself.

Chatnificent is a Python framework where each of those concerns is an independent, swappable component. You get a working app immediately. When you need to change something — the LLM provider, the database, the entire UI — you swap one component, instead of rewriting the whole app.

Quickstart

pip install chatnificent
import chatnificent as chat

app = chat.Chatnificent()
app.run()  # http://127.0.0.1:7777

No API keys, no extras, no configuration. You get a working chat UI with the built-in Echo LLM, a stdlib HTTP server, and an HTML/JS frontend — all with zero dependencies.

One Install Away from Real LLM Responses

pip install openai
export OPENAI_API_KEY="sk-..."

Run the same code. Chatnificent auto-detects the installed OpenAI SDK and your API key — no code change needed.
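The detection logic can be pictured roughly like this. This is an illustrative sketch of the idea, not Chatnificent's actual code: a provider is used only when both its SDK and its API key are available, otherwise the dependency-free Echo pillar takes over.

```python
import importlib.util


def pick_llm_pillar(env: dict) -> str:
    """Sketch of provider auto-detection (illustrative, not the real implementation):
    select a provider only when its SDK is importable AND its API key is set;
    otherwise fall back to the zero-dependency Echo pillar."""
    openai_installed = importlib.util.find_spec("openai") is not None
    if openai_installed and env.get("OPENAI_API_KEY"):
        return "OpenAI"
    return "Echo"
```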

Swap Anything

Every component is a pillar you can swap independently:

import chatnificent as chat

# Different LLM providers
app = chat.Chatnificent(llm=chat.llm.Anthropic())   # pip install anthropic
app = chat.Chatnificent(llm=chat.llm.Gemini())       # pip install google-genai
app = chat.Chatnificent(llm=chat.llm.Ollama())       # pip install ollama (local)

# Persistent storage
app = chat.Chatnificent(store=chat.store.SQLite(db_path="chats.db"))
app = chat.Chatnificent(store=chat.store.File(base_dir="./conversations"))

# Mix and match
app = chat.Chatnificent(
    llm=chat.llm.Anthropic(),
    store=chat.store.SQLite(db_path="conversations.db"),
    layout=chat.layout.Bootstrap(),  # Requires: pip install "chatnificent[dash]"
)

Streaming by Default

All LLM providers stream by default — token-by-token delivery via Server-Sent Events. Opt out with stream=False:

app = chat.Chatnificent(llm=chat.llm.OpenAI(stream=False))
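On the wire, each token arrives as its own `data:` frame terminated by a blank line. A minimal parser for that generic SSE framing, shown as a stdlib sketch of the transport format rather than Chatnificent's client code:

```python
def parse_sse_tokens(raw: str) -> list[str]:
    # SSE events are separated by blank lines; each "data: ..." line
    # carries one payload (here, one streamed token).
    tokens = []
    for event in raw.split("\n\n"):
        for line in event.splitlines():
            if line.startswith("data: "):
                tokens.append(line[len("data: "):])
    return tokens


print("".join(parse_sse_tokens("data: Hel\n\ndata: lo!\n\n")))  # Hello!
```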

The Architecture: 9 Pillars

Every major function is handled by an independent pillar with an abstract interface:

Pillar     | Purpose             | Default                 | Implementations
-----------|---------------------|-------------------------|----------------
Server     | HTTP transport      | DevServer (stdlib)      | DevServer, DashServer
Layout     | UI rendering        | DefaultLayout (HTML/JS) | DefaultLayout, Bootstrap, Mantine, Minimal
LLM        | LLM API calls       | OpenAI / Echo           | OpenAI, Anthropic, Gemini, OpenRouter, DeepSeek, Ollama, Echo
Store      | Persistence         | InMemory                | InMemory, File, SQLite
Engine     | Orchestration       | Orchestrator            | Orchestrator
Auth       | User identification | Anonymous               | Anonymous, SingleUser
Tools      | Function calling    | NoTool                  | PythonTool, NoTool
Retrieval  | RAG / context       | NoRetrieval             | NoRetrieval
URL        | Route parsing       | PathBased               | PathBased, QueryParams

Dash-based layouts (Bootstrap, Mantine, Minimal) require pip install "chatnificent[dash]" and the DashServer.

Customize the Engine

The Orchestrator manages the full request lifecycle: conversation resolution, RAG retrieval, the agentic tool-calling loop, and persistence. Override hooks (for monitoring) and seams (for logic):

import chatnificent as chat
from typing import Any, Optional

class CustomEngine(chat.engine.Orchestrator):

    def _after_llm_call(self, llm_response: Any) -> None:
        tokens = getattr(llm_response, 'usage', 'N/A')
        print(f"Tokens: {tokens}")

    def _prepare_llm_payload(self, conversation, retrieval_context: Optional[str]):
        payload = super()._prepare_llm_payload(conversation, retrieval_context)
        if not any(m['role'] == 'system' for m in payload):
            payload.insert(0, {"role": "system", "content": "Be concise."})
        return payload

app = chat.Chatnificent(engine=CustomEngine())

Build Your Own Pillars

Implement the abstract interface and inject it:

import chatnificent as chat
from chatnificent.models import Conversation

class MongoStore(chat.store.Store):
    def save_conversation(self, user_id, conversation: Conversation): ...
    def load_conversation(self, user_id, convo_id): ...
    def list_conversations(self, user_id): ...

app = chat.Chatnificent(store=MongoStore())

Every pillar works the same way: subclass the ABC, implement the required methods, pass it in.
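The dependency-injection pattern behind all nine pillars is plain Python `abc`. A toy, self-contained illustration of that pattern (the real `Store` interface lives in `chat.store`; this sketch simplifies the method set and uses a plain dict for storage):

```python
from abc import ABC, abstractmethod


class Store(ABC):
    """Simplified stand-in for a storage pillar's abstract interface."""

    @abstractmethod
    def save_conversation(self, user_id, conversation): ...

    @abstractmethod
    def load_conversation(self, user_id, convo_id): ...


class DictStore(Store):
    """Toy in-memory implementation: anything satisfying the ABC can be injected."""

    def __init__(self):
        self._data = {}

    def save_conversation(self, user_id, conversation):
        # Key each conversation by (user, conversation id).
        self._data[(user_id, conversation["id"])] = conversation

    def load_conversation(self, user_id, convo_id):
        return self._data.get((user_id, convo_id))
```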

Can't Wait? Try It Right Now

No cloning, no project setup: install uv and run any example directly from GitHub:

Note: Most examples require LLM provider API keys. Set the ones you need before running:

export OPENAI_API_KEY="sk-..."
export ANTHROPIC_API_KEY="sk-ant-..."
export GOOGLE_API_KEY="AI..."
export OPENROUTER_API_KEY="sk-or-v1-..."

quickstart.py and persistent_storage.py work with zero keys (Echo LLM).

# Zero-dep — works immediately
uv run --script https://raw.githubusercontent.com/eliasdabbas/chatnificent/main/examples/quickstart.py

# LLM providers
uv run --script https://raw.githubusercontent.com/eliasdabbas/chatnificent/main/examples/llm_providers.py
uv run --script https://raw.githubusercontent.com/eliasdabbas/chatnificent/main/examples/ollama_local.py
uv run --script https://raw.githubusercontent.com/eliasdabbas/chatnificent/main/examples/openrouter_models.py

# Features
uv run --script https://raw.githubusercontent.com/eliasdabbas/chatnificent/main/examples/persistent_storage.py
uv run --script https://raw.githubusercontent.com/eliasdabbas/chatnificent/main/examples/tool_calling.py
uv run --script https://raw.githubusercontent.com/eliasdabbas/chatnificent/main/examples/system_prompt.py
uv run --script https://raw.githubusercontent.com/eliasdabbas/chatnificent/main/examples/multi_tool_agent.py
uv run --script https://raw.githubusercontent.com/eliasdabbas/chatnificent/main/examples/memory_tool.py
uv run --script https://raw.githubusercontent.com/eliasdabbas/chatnificent/main/examples/memory_tool_multi_user.py

# Customization
uv run --script https://raw.githubusercontent.com/eliasdabbas/chatnificent/main/examples/single_user.py
uv run --script https://raw.githubusercontent.com/eliasdabbas/chatnificent/main/examples/auto_title.py
uv run --script https://raw.githubusercontent.com/eliasdabbas/chatnificent/main/examples/custom_branding.py

# Display enrichment
uv run --script https://raw.githubusercontent.com/eliasdabbas/chatnificent/main/examples/usage_display.py
uv run --script https://raw.githubusercontent.com/eliasdabbas/chatnificent/main/examples/usage_display_multi_provider.py
uv run --script https://raw.githubusercontent.com/eliasdabbas/chatnificent/main/examples/conversation_title.py
uv run --script https://raw.githubusercontent.com/eliasdabbas/chatnificent/main/examples/conversation_summary.py
uv run --script https://raw.githubusercontent.com/eliasdabbas/chatnificent/main/examples/display_redaction.py
uv run --script https://raw.githubusercontent.com/eliasdabbas/chatnificent/main/examples/web_search.py

# Starlette server (requires OPENAI_API_KEY)
uv run --script https://raw.githubusercontent.com/eliasdabbas/chatnificent/main/examples/starlette_quickstart.py
uv run --script https://raw.githubusercontent.com/eliasdabbas/chatnificent/main/examples/starlette_server_options.py
uv run --script https://raw.githubusercontent.com/eliasdabbas/chatnificent/main/examples/starlette_uvicorn_options.py
uv run --script https://raw.githubusercontent.com/eliasdabbas/chatnificent/main/examples/starlette_multi_mount.py

# OpenAI Responses API (requires OPENAI_API_KEY)
uv run --script https://raw.githubusercontent.com/eliasdabbas/chatnificent/main/examples/openai_responses.py
uv run --script https://raw.githubusercontent.com/eliasdabbas/chatnificent/main/examples/openai_responses_website_search.py
uv run --script https://raw.githubusercontent.com/eliasdabbas/chatnificent/main/examples/openai_responses_image_generator.py
uv run --script https://raw.githubusercontent.com/eliasdabbas/chatnificent/main/examples/openai_responses_image_studio.py

# UI Interactions (requires OPENAI_API_KEY)
uv run --script https://raw.githubusercontent.com/eliasdabbas/chatnificent/main/examples/ui_interactions.py
uv run --script https://raw.githubusercontent.com/eliasdabbas/chatnificent/main/examples/openai_responses_interactive_search.py
uv run --script https://raw.githubusercontent.com/eliasdabbas/chatnificent/main/examples/single_app_multi_chat_mode.py

# OpenAI Cookbook — From Cookbook to Production (requires OPENAI_API_KEY)
uv run --script https://raw.githubusercontent.com/eliasdabbas/chatnificent/main/examples/How_to_call_functions_with_chat_models.py

Examples

The examples/ directory has 31 standalone scripts covering basics, tool calling, display enrichment, web search, and more — each runnable with a single command:

uv run --script examples/quickstart.py

See the examples README for the full list.

Project details

Source distribution: chatnificent-0.0.22.tar.gz (53.1 kB)

  SHA256       bb06429718c1e06d5bf5628d0d9ef28fdbb1d73f079f7afe50141764502299ed
  MD5          360cd94f972223ad64c1035a6669c9d5
  BLAKE2b-256  6b8b732b0781ec277e4b91ca01d909be4dd4518ff1297879a428cdde1a7b1526

Built distribution: chatnificent-0.0.22-py3-none-any.whl (57.9 kB)

  SHA256       4af69cde1a3f1d9b53c6d5b9bb6ae84f19836794bb1f79947aeca9c36d439deb
  MD5          1386f8c61796ec882fe23b5e02a0d68a
  BLAKE2b-256  7a4cfb35b8340e919ece3a3561e8f79a2b5aa58e984fa401143bd126ac9388ff
