
Chatnificent

LLM chat app framework. Minimally complete. Maximally hackable.


Pre-built chat UIs give you a working app but almost no way to customize it. Building from scratch gives you full control but means wiring up a UI, LLM client, message store, streaming, auth, and tool calling yourself.

Chatnificent is a Python framework where each of those concerns is an independent, swappable component. You get a working app immediately. When you need to change something (the LLM provider, the database, the entire UI), you swap one component instead of rewriting the whole app.

Quickstart

pip install chatnificent

import chatnificent as chat

app = chat.Chatnificent()
app.run()  # http://127.0.0.1:7777

No API keys, no extras, no configuration. You get a working chat UI with the built-in Echo LLM, a stdlib HTTP server, and an HTML/JS frontend — all with zero dependencies.

One Install Away from Real LLM Responses

pip install openai
export OPENAI_API_KEY="sk-..."

Run the same code. Chatnificent auto-detects the installed OpenAI SDK and your API key — no code change needed.
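This kind of detection is not magic: a framework can probe for an installed SDK and a configured key at startup and fall back to a stub otherwise. A minimal sketch of the pattern, assuming stdlib-only checks (the function name is ours, not Chatnificent's API):

```python
import importlib.util
import os

def pick_default_llm() -> str:
    """Pick an LLM backend: the OpenAI SDK if it is installed and
    an API key is configured, otherwise a dependency-free echo stub."""
    sdk_installed = importlib.util.find_spec("openai") is not None
    key_present = bool(os.environ.get("OPENAI_API_KEY"))
    return "openai" if sdk_installed and key_present else "echo"

print(pick_default_llm())
```

The key point is that the decision happens once, at construction time, so the rest of the app never branches on which backend is active.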

Swap Anything

Every component is a pillar you can swap independently:

import chatnificent as chat

# Different LLM providers
app = chat.Chatnificent(llm=chat.llm.Anthropic())   # pip install anthropic
app = chat.Chatnificent(llm=chat.llm.Gemini())       # pip install google-genai
app = chat.Chatnificent(llm=chat.llm.Ollama())       # pip install ollama (local)

# Persistent storage
app = chat.Chatnificent(store=chat.store.SQLite(db_path="chats.db"))
app = chat.Chatnificent(store=chat.store.File(directory="./conversations"))

# Mix and match
app = chat.Chatnificent(
    llm=chat.llm.Anthropic(),
    store=chat.store.SQLite(db_path="conversations.db"),
    layout=chat.layout.Bootstrap(),  # Requires: pip install "chatnificent[dash]"
)

Streaming by Default

All LLM providers stream by default — token-by-token delivery via Server-Sent Events. Opt out with stream=False:

app = chat.Chatnificent(llm=chat.llm.OpenAI(stream=False))
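On the wire, Server-Sent Events frame each token as one or more data: lines terminated by a blank line. A minimal encoder for that format, for illustration only (not Chatnificent's internal code):

```python
def sse_event(token: str) -> str:
    """Encode one token as a Server-Sent Events message.
    Each payload line gets its own 'data:' prefix; a blank
    line terminates the event."""
    lines = token.split("\n")
    return "".join(f"data: {line}\n" for line in lines) + "\n"

# A stream of tokens becomes a sequence of SSE frames:
frames = [sse_event(t) for t in ["Hel", "lo", "!"]]
print("".join(frames))
```

Because the browser's EventSource API reassembles these frames automatically, the frontend only has to append each payload to the message bubble as it arrives.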

The Architecture: 9 Pillars

Every major function is handled by an independent pillar with an abstract interface:

| Pillar    | Purpose             | Default                 | Implementations                                               |
|-----------|---------------------|-------------------------|---------------------------------------------------------------|
| Server    | HTTP transport      | DevServer (stdlib)      | DevServer, DashServer                                         |
| Layout    | UI rendering        | DefaultLayout (HTML/JS) | DefaultLayout, Bootstrap, Mantine, Minimal                    |
| LLM       | LLM API calls       | OpenAI / Echo           | OpenAI, Anthropic, Gemini, OpenRouter, DeepSeek, Ollama, Echo |
| Store     | Persistence         | InMemory                | InMemory, File, SQLite                                        |
| Engine    | Orchestration       | Orchestrator            | Orchestrator                                                  |
| Auth      | User identification | Anonymous               | Anonymous, SingleUser                                         |
| Tools     | Function calling    | NoTool                  | PythonTool, NoTool                                            |
| Retrieval | RAG / context       | NoRetrieval             | NoRetrieval                                                   |
| URL       | Route parsing       | PathBased               | PathBased, QueryParams                                        |

Dash-based layouts (Bootstrap, Mantine, Minimal) require pip install "chatnificent[dash]" and the DashServer.

Customize the Engine

The Orchestrator manages the full request lifecycle: conversation resolution, RAG retrieval, the agentic tool-calling loop, and persistence. Override hooks (for monitoring) and seams (for logic):

import chatnificent as chat
from typing import Any, Optional

class CustomEngine(chat.engine.Orchestrator):

    def _after_llm_call(self, llm_response: Any) -> None:
        tokens = getattr(llm_response, 'usage', 'N/A')
        print(f"Tokens: {tokens}")

    def _prepare_llm_payload(self, conversation, retrieval_context: Optional[str]):
        payload = super()._prepare_llm_payload(conversation, retrieval_context)
        if not any(m['role'] == 'system' for m in payload):
            payload.insert(0, {"role": "system", "content": "Be concise."})
        return payload

app = chat.Chatnificent(engine=CustomEngine())
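The lifecycle the Orchestrator manages can be pictured as a loop: call the model, execute any requested tool, feed the result back, and stop when the model answers in plain text. A self-contained caricature of that agentic loop (the names and message shapes are illustrative assumptions, not the Orchestrator's actual code):

```python
from typing import Callable

def agentic_loop(
    call_llm: Callable[[list[dict]], dict],
    run_tool: Callable[[str], str],
    messages: list[dict],
    max_rounds: int = 5,
) -> list[dict]:
    """Call the model; if it requests a tool, execute it, append the
    result, and ask again -- until it answers in plain text or the
    round budget runs out."""
    for _ in range(max_rounds):
        reply = call_llm(messages)
        messages.append(reply)
        if "tool_call" not in reply:
            break  # final assistant answer, no more tools requested
        result = run_tool(reply["tool_call"])
        messages.append({"role": "tool", "content": result})
    return messages
```

The hooks and seams on the real Orchestrator sit at the obvious points in such a loop: before and after the LLM call, and around payload preparation.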

Build Your Own Pillars

Implement the abstract interface and inject it:

import chatnificent as chat
from chatnificent.models import Conversation

class MongoStore(chat.store.Store):
    def save_conversation(self, user_id: str, conversation: Conversation) -> None: ...
    def load_conversation(self, user_id, convo_id): ...
    def list_conversations(self, user_id): ...

app = chat.Chatnificent(store=MongoStore())

Every pillar works the same way: subclass the ABC, implement the required methods, pass it in.
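The mechanism is plain Python: an abstract base class defines the contract, concrete classes fulfill it, and the app only ever talks to the interface. A generic sketch of the pattern with a toy store of our own (not Chatnificent's actual ABC):

```python
from abc import ABC, abstractmethod

class Store(ABC):
    """Contract every storage pillar must satisfy."""

    @abstractmethod
    def save_conversation(self, user_id: str, convo_id: str, messages: list) -> None: ...

    @abstractmethod
    def load_conversation(self, user_id: str, convo_id: str) -> list: ...

class InMemoryStore(Store):
    """Dict-backed implementation -- the simplest thing that satisfies the ABC."""

    def __init__(self) -> None:
        self._data: dict[tuple[str, str], list] = {}

    def save_conversation(self, user_id, convo_id, messages) -> None:
        self._data[(user_id, convo_id)] = messages

    def load_conversation(self, user_id, convo_id) -> list:
        return self._data.get((user_id, convo_id), [])

store = InMemoryStore()
store.save_conversation("alice", "1", [{"role": "user", "content": "hi"}])
print(store.load_conversation("alice", "1"))
```

Because the ABC refuses to instantiate until every abstract method is implemented, a half-finished pillar fails loudly at construction time rather than mid-request.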
