
Kivi — Unified AI chat interface. Provider-agnostic streaming chat with tools, sessions, and auto-compaction. Supports OpenAI, vLLM, the GitHub Copilot SDK, and the Claude Agent SDK.

Project description

๐Ÿฅ Kivi โ€” Unified AI Chat Interface

PyPI · License: MIT · Python 3.10+

Provider-agnostic AI chat with token-level streaming, server-side tools, session persistence, and auto-compaction. One beautiful UI, any backend.

✨ Features

  • 🔀 Provider Switching — Switch between OpenAI, vLLM, Copilot, Claude mid-conversation
  • ⚡ Token-Level Streaming — Real-time token streaming for all providers (not word-level)
  • 🛠️ Server-Side Tools — bash, read, write, edit, glob, grep, web_search, web_fetch
  • 💾 Session Persistence — SQLite-backed sessions with full message history
  • 📦 Auto-Compaction — Automatically compacts context at 75% of the provider's window
  • 🎨 Claude-like UI — Dark/light themes, thinking blocks, tool blocks, markdown, code highlighting, LaTeX, SVG preview
  • 📊 Token Dashboard — Usage tracking with Plotly charts
  • 🔧 Git Dashboard — Built-in diff viewer, commit & push
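The 75% auto-compaction rule above can be sketched in a few lines. The threshold value comes from this README; the `should_compact` helper and its signature are hypothetical, not Kivi's actual API:

```python
# Hypothetical sketch of the 75% auto-compaction trigger.
# `context_window` would come from the provider's config; names are illustrative.
COMPACTION_THRESHOLD = 0.75

def should_compact(used_tokens: int, context_window: int) -> bool:
    """Return True once the session has consumed 75% of the context window."""
    return used_tokens >= context_window * COMPACTION_THRESHOLD

print(should_compact(96_000, 128_000))  # True: 96k is exactly 75% of 128k
```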

🚀 Quick Start

pip install kivi-ai
kivi

Open http://localhost:8899 in your browser. That's it.

📦 Install with SDK support

# With GitHub Copilot SDK support
pip install kivi-ai[copilot]

# With Claude Agent SDK support
pip install kivi-ai[claude]

# Everything
pip install kivi-ai[all]

⚙️ Configuration

All configuration via environment variables or CLI flags:

# Set vLLM backend URL
kivi --vllm-url http://your-server:8000

# Custom port
kivi --port 9000

# Or use environment variables
export VLLM_URL=http://your-server:8000
export OPENAI_API_KEY=sk-...
export KIVI_PORT=9000
kivi

CLI Options

kivi                    Start server (default: 0.0.0.0:8899)
kivi --port 9000        Custom port
kivi --host 127.0.0.1   Bind to localhost only
kivi --vllm-url URL     vLLM backend URL
kivi --reload           Dev mode with auto-reload
kivi --help             Show help

๐Ÿ—๏ธ Architecture

kivi_ai/
├── core/               # Types, interfaces, registry
│   ├── types.py        # Message, StreamChunk, ToolCall, ModelInfo, etc.
│   ├── interfaces.py   # BaseProvider ABC, ToolInterface, SessionStore
│   └── registry.py     # Provider & tool registry singleton
├── providers/          # Provider implementations
│   ├── openai_provider.py   # OpenAI & vLLM (OpenAI-compatible)
│   ├── copilot_provider.py  # GitHub Copilot SDK
│   ├── claude_provider.py   # Claude Agent SDK
│   └── config.py            # Context windows, costs, defaults
├── streaming/          # Stream processing
│   ├── adapter.py      # Normalize streams (filter empty, ensure DONE)
│   └── sse.py          # StreamChunk → SSE text
├── sessions/           # Session management
│   ├── store.py        # SQLite store (async, WAL mode)
│   ├── manager.py      # Session lifecycle & provider switching
│   └── compaction.py   # Auto-compact at 75% context window
├── tools/              # Server-side tool system
│   └── builtins.py     # bash, read, write, edit, glob, grep, web_search, web_fetch
├── frontend/
│   └── index.html      # Claude-like chat UI (single file)
├── server.py           # FastAPI app — unified streaming endpoint
└── cli.py              # CLI entry point

Key Design Decisions

  • All providers normalize to AsyncIterator[StreamChunk] — a single streaming format
  • Sessions are provider-agnostic — switch providers without losing history
  • SQLite with WAL mode + ThreadPoolExecutor — non-blocking async I/O
  • Atomic message sequencing — no race conditions on concurrent writes
  • Single unified endpoint — POST /api/chat/stream handles all providers
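The first design decision (every provider yields an `AsyncIterator[StreamChunk]`) can be illustrated with a toy provider. The real `StreamChunk` lives in `core/types.py`; the field names and the echo provider below are illustrative assumptions, not Kivi's actual types:

```python
import asyncio
from dataclasses import dataclass
from typing import AsyncIterator

@dataclass
class StreamChunk:
    type: str       # e.g. "text", "tool_call", "done" (assumed values)
    text: str = ""

class EchoProvider:
    """Toy provider that streams the prompt back token by token."""
    async def stream(self, prompt: str) -> AsyncIterator[StreamChunk]:
        for token in prompt.split():
            yield StreamChunk(type="text", text=token)
        yield StreamChunk(type="done")  # adapter guarantees a terminal chunk

async def main() -> str:
    out = []
    async for chunk in EchoProvider().stream("hello from kivi"):
        if chunk.type == "text":
            out.append(chunk.text)
    return " ".join(out)

print(asyncio.run(main()))  # hello from kivi
```

Because every provider emits this one shape, the SSE layer and the UI never need provider-specific branches.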

🔌 Providers

| Provider     | Streaming | Tools | Thinking | Backend            |
|--------------|-----------|-------|----------|--------------------|
| vllm         | ✅ Token  | ✅    | ✅       | Local vLLM server  |
| openai       | ✅ Token  | ✅    | ❌       | OpenAI API         |
| copilot      | ✅ Token  | ✅    | ❌       | GitHub Copilot SDK |
| claude       | ✅ Token  | ✅    | ✅       | Claude Agent SDK   |
| qwen-copilot | ✅ Token  | ✅    | ❌       | Copilot SDK → vLLM |
| qwen-claude  | ✅ Token  | ✅    | ✅       | Claude SDK → vLLM  |

๐ŸŒ API

All endpoints are served by the FastAPI app:

| Method | Endpoint                   | Description             |
|--------|----------------------------|-------------------------|
| POST   | /api/chat/stream           | Unified streaming (SSE) |
| GET    | /api/sessions              | List all sessions       |
| POST   | /api/sessions              | Create session          |
| GET    | /api/sessions/:id/messages | Get messages            |
| DELETE | /api/sessions/:id          | Delete session          |
| GET    | /api/providers             | List providers          |
| GET    | /api/models/:provider      | List models             |
| GET    | /api/tools                 | List available tools    |
| GET    | /api/usage                 | Token usage stats       |
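A minimal client for the streaming endpoint has to parse SSE lines. The stream adapter guarantees a terminating DONE event; the exact JSON chunk shape and the `data: [DONE]` sentinel below are assumptions for illustration, so check `streaming/sse.py` for the real wire format:

```python
import json

def parse_sse_lines(lines):
    """Yield decoded chunks from an SSE body (assumed payload shape)."""
    for line in lines:
        if not line.startswith("data: "):
            continue  # skip comments, event names, blank keep-alives
        payload = line[len("data: "):]
        if payload == "[DONE]":
            break     # assumed terminal sentinel
        yield json.loads(payload)

# A hypothetical response from POST /api/chat/stream:
sample = [
    'data: {"type": "text", "text": "Hel"}',
    'data: {"type": "text", "text": "lo"}',
    "data: [DONE]",
]
print("".join(c["text"] for c in parse_sse_lines(sample)))  # Hello
```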

📄 License

MIT — see LICENSE

Download files

Download the file for your platform.

Source Distribution

kivi_ai-0.1.0.tar.gz (59.1 kB)

Built Distribution

kivi_ai-0.1.0-py3-none-any.whl (64.2 kB)

File details

Details for the file kivi_ai-0.1.0.tar.gz.

File metadata

  • Download URL: kivi_ai-0.1.0.tar.gz
  • Size: 59.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.3

File hashes

| Algorithm   | Hash digest                                                      |
|-------------|------------------------------------------------------------------|
| SHA256      | 2cc76e37efa332504a6012ecd5f3ebf484a3182a93f002e9f7dce821d42b4c66 |
| MD5         | 3dc8361af933192fea919f650c5a0b8c                                 |
| BLAKE2b-256 | 50e31b3cf16cf211503e60162b51059a5c1e0abb8e0a3d090a24a7b1ff821b83 |
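To check a downloaded file against the digests listed here, you can hash it locally. This is a generic `hashlib` sketch, not part of Kivi:

```python
import hashlib

def sha256_of(path: str) -> str:
    """Stream a file through SHA-256 and return the hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(65536), b""):
            h.update(block)
    return h.hexdigest()

# Compare against the SHA256 published for the sdist:
expected = "2cc76e37efa332504a6012ecd5f3ebf484a3182a93f002e9f7dce821d42b4c66"
# assert sha256_of("kivi_ai-0.1.0.tar.gz") == expected
```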

File details

Details for the file kivi_ai-0.1.0-py3-none-any.whl.

File metadata

  • Download URL: kivi_ai-0.1.0-py3-none-any.whl
  • Size: 64.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.3

File hashes

| Algorithm   | Hash digest                                                      |
|-------------|------------------------------------------------------------------|
| SHA256      | 3651c40bfa9255c21aeb6e43e2e356e568940b7072bfae959ea9d7e214d14298 |
| MD5         | f5df6fe7e38596966b1b1c2ac46a08f9                                 |
| BLAKE2b-256 | dd9f88a60b8c6cd4c97464c9d5be46cf87487ba733b6c7bd7d81bb6c4f3a0efe |
