Koder

An intuitive AI coding assistant and interactive CLI tool that boosts developer productivity with intelligent automation and context-aware support.
Koder is an experimental, universal AI coding assistant designed to explore how to build an advanced terminal-based AI coding assistant. Written entirely in Python, it serves as both a functional tool and a learning playground for AI agent development.
🎯 Project Status: Under active vibe coding! This is a learning-focused project where we explore building AI coding agents.
✨ Features
- 🤖 Universal AI Support: Works with OpenAI, Anthropic, Google, GitHub Copilot, and 100+ providers via LiteLLM with intelligent auto-detection
- 💾 Smart Context Management: Persistent sessions with SQLite storage and automatic token-aware compression (50k token limit)
- 🔄 Real-time Streaming: Rich Live displays with intelligent terminal cleanup for responsive user experience
- 🛠️ Comprehensive Toolset: File operations, search, shell commands, task delegation, and todo tracking
- 🔌 MCP Integration: Model Context Protocol support with stdio, SSE, and HTTP transports for extensible tool ecosystem
- 🛡️ Enterprise Security: SecurityGuard validation, output filtering, permission system, and input sanitization
- 🎯 Zero Configuration: Automatic provider detection with fallback defaults
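The token-aware compression mentioned above can be sketched roughly as follows. This is a simplified illustration, not Koder's actual implementation; the 4-characters-per-token estimate and the function names are assumptions made for the example.

```python
# Hypothetical sketch of token-aware history compression (not Koder's real code).
# Uses a crude ~4 characters-per-token estimate instead of a real tokenizer.

TOKEN_LIMIT = 50_000

def estimate_tokens(text: str) -> int:
    """Very rough token estimate: ~4 characters per token."""
    return max(1, len(text) // 4)

def compress_history(messages: list[dict], limit: int = TOKEN_LIMIT) -> list[dict]:
    """Drop the oldest messages until the history fits under the token limit,
    always keeping the first (system) message."""
    system, rest = messages[:1], messages[1:]
    total = sum(estimate_tokens(m["content"]) for m in system + rest)
    while rest and total > limit:
        dropped = rest.pop(0)  # drop the oldest non-system message first
        total -= estimate_tokens(dropped["content"])
    return system + rest
```

Real implementations typically use the provider's tokenizer and may summarize dropped messages rather than discarding them outright.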
🛠️ Installation
Using uv (Recommended)
```shell
uv tool install koder
```
Using pip
```shell
pip install koder
```
⚡ Quick Start
Simply run Koder with your question or request:
```shell
# Configure one provider (example: OpenAI)
export OPENAI_API_KEY="your-openai-api-key"
export KODER_MODEL="gpt-4o"

# Run in interactive mode
koder

# Run with a one-off prompt
koder "create a Python function to calculate fibonacci numbers"

# Run a prompt in a named session
koder -s my-project "Help me implement a new feature"

# Use high reasoning effort for complex problems (OpenAI reasoning models)
koder --reasoning high "Solve this complex algorithm problem"

# Use low reasoning effort for simple tasks
koder --reasoning low "Add a print statement"
```
🤖 Configuration
Koder supports flexible configuration through three mechanisms, in order of priority:
- CLI Arguments - highest priority, for runtime overrides
- Environment Variables - for secrets and runtime configuration
- Config File - for persistent defaults (~/.koder/config.yaml)
Quick Setup
```shell
# Minimal setup - just set your API key and go
export OPENAI_API_KEY="your-api-key"
koder

# Or use a different provider
export ANTHROPIC_API_KEY="your-api-key"
export KODER_MODEL="claude-opus-4-20250514"
koder
```
Config File
Koder uses a YAML config file at ~/.koder/config.yaml for persistent settings.
Config CLI Commands
```shell
# Initialize config file with defaults
koder config init

# Show current configuration
koder config show

# Show config file path
koder config path

# Open config file in editor (respects $EDITOR)
koder config edit

# Set specific values (supports dot notation)
koder config set model.name gpt-4o
koder config set model.provider anthropic
koder config set cli.stream false
```
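Dot notation maps each key path onto the nested YAML structure. Conceptually, it works like this illustrative sketch (not Koder's actual implementation):

```python
def set_by_dot_path(config: dict, path: str, value) -> dict:
    """Set a nested config value from a dot-notation path,
    e.g. "model.name" maps to config["model"]["name"]."""
    keys = path.split(".")
    node = config
    for key in keys[:-1]:
        node = node.setdefault(key, {})  # create intermediate sections as needed
    node[keys[-1]] = value
    return config

# "koder config set model.name gpt-4o" would conceptually do:
cfg = set_by_dot_path({}, "model.name", "gpt-4o")
# cfg == {"model": {"name": "gpt-4o"}}
```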
Config File Format
```yaml
# ~/.koder/config.yaml

# Model configuration
model:
  name: "gpt-4.1"            # Model name (default: gpt-4.1)
  provider: "openai"         # Provider name (default: openai)
  api_key: null              # API key (prefer env vars for security)
  base_url: null             # Custom API endpoint (optional)

  # Azure-specific settings
  azure_api_version: null    # e.g., "2025-04-01-preview"

  # Vertex AI-specific settings
  vertex_ai_location: null          # e.g., "us-central1"
  vertex_ai_credentials_path: null  # Path to service account JSON

  # Reasoning effort for OpenAI reasoning models (o1, o3, gpt-5.1, etc.)
  reasoning_effort: "medium" # none, minimal, low, medium, high, or null

# CLI defaults
cli:
  session: null  # Default session name (auto-generated if null)
  stream: true   # Enable streaming output (default: true)

# MCP servers for extended functionality
mcp_servers: []
```
Environment Variables
Core Variables
| Variable | Purpose | Example |
|---|---|---|
| `KODER_MODEL` | Model selection (highest priority) | `gpt-4o`, `claude-opus-4-20250514` |
| `KODER_REASONING_EFFORT` | Reasoning effort for reasoning models | `medium`, `high`, `low`, `null` |
| `EDITOR` | Editor for `koder config edit` | `vim`, `code` |
Provider API Keys
| Provider | API Key Variable | Additional Variables |
|---|---|---|
| OpenAI | `OPENAI_API_KEY` | `OPENAI_BASE_URL` |
| Anthropic | `ANTHROPIC_API_KEY` | - |
| Google/Gemini | `GOOGLE_API_KEY` or `GEMINI_API_KEY` | - |
| Azure | `AZURE_API_KEY` | `AZURE_API_BASE`, `AZURE_API_VERSION` |
| Vertex AI | `GOOGLE_APPLICATION_CREDENTIALS` | `VERTEXAI_LOCATION` |
| GitHub Copilot | `GITHUB_TOKEN` | - |
| Groq | `GROQ_API_KEY` | - |
| Together AI | `TOGETHERAI_API_KEY` | - |
| OpenRouter | `OPENROUTER_API_KEY` | - |
| Mistral | `MISTRAL_API_KEY` | - |
| Cohere | `COHERE_API_KEY` | - |
| Bedrock | `AWS_ACCESS_KEY_ID` | `AWS_SECRET_ACCESS_KEY` |
Supported Providers
OpenAI
```shell
export OPENAI_API_KEY=your-api-key
export KODER_MODEL="gpt-4o"  # Optional, default: gpt-4.1

# Optional: Custom endpoint
export OPENAI_BASE_URL=https://your-endpoint.com/v1
koder
```
Anthropic
```shell
export ANTHROPIC_API_KEY=your-api-key
export KODER_MODEL="claude-opus-4-20250514"
koder
```
Google Gemini
```shell
export GOOGLE_API_KEY=your-api-key
export KODER_MODEL="gemini/gemini-2.5-pro"
koder
```
GitHub Copilot
```shell
export KODER_MODEL="github_copilot/claude-sonnet-4"
koder
```
On first run you will see a device code in the terminal. Visit https://github.com/login/device and enter the code to authenticate.
Azure OpenAI
```shell
export AZURE_API_KEY="your-azure-api-key"
export AZURE_API_BASE="https://your-resource.openai.azure.com"
export AZURE_API_VERSION="2025-04-01-preview"
export KODER_MODEL="azure/gpt-4"
koder
```
Or configure in ~/.koder/config.yaml:
```yaml
model:
  name: "gpt-4"
  provider: "azure"
  azure_api_version: "2025-04-01-preview"
```
Google Vertex AI
```shell
export GOOGLE_APPLICATION_CREDENTIALS="path/to/service-account.json"
export VERTEXAI_LOCATION="us-central1"
export KODER_MODEL="vertex_ai/claude-sonnet-4@20250514"
koder
```
Or configure in ~/.koder/config.yaml:
```yaml
model:
  name: "claude-sonnet-4@20250514"
  provider: "vertex_ai"
  vertex_ai_location: "us-central1"
  vertex_ai_credentials_path: "path/to/service-account.json"
```
Other Providers (100+ via LiteLLM)
LiteLLM supports 100+ providers. Use the format provider/model:
```shell
# Groq
export GROQ_API_KEY=your-key
export KODER_MODEL="groq/llama-3.3-70b-versatile"

# Together AI
export TOGETHERAI_API_KEY=your-key
export KODER_MODEL="together_ai/meta-llama/Llama-3-70b-chat-hf"

# OpenRouter
export OPENROUTER_API_KEY=your-key
export KODER_MODEL="openrouter/anthropic/claude-3-opus"

# Custom OpenAI-compatible endpoints
export OPENAI_API_KEY="your-key"
export OPENAI_BASE_URL="https://your-custom-endpoint.com/v1"
export KODER_MODEL="openai/your-model-name"
koder
```
MCP Server Configuration
Model Context Protocol (MCP) servers extend Koder's capabilities with additional tools.
MCP CLI Commands
```shell
# Add an MCP server (stdio transport)
koder mcp add myserver "python -m my_mcp_server" --transport stdio

# Add with environment variables
koder mcp add myserver "python -m server" -e API_KEY=xxx -e DEBUG=true

# Add HTTP/SSE server
koder mcp add webserver --transport http --url http://localhost:8000

# List all MCP servers
koder mcp list

# Get server details
koder mcp get myserver

# Remove a server
koder mcp remove myserver
```
MCP Config Format
```yaml
# In ~/.koder/config.yaml
mcp_servers:
  # stdio transport (runs a local command)
  - name: "filesystem"
    transport_type: "stdio"
    command: "python"
    args: ["-m", "mcp.server.filesystem"]
    env_vars:
      ROOT_PATH: "/home/user/projects"
    cache_tools_list: true
    allowed_tools:  # Optional: whitelist specific tools
      - "read_file"
      - "write_file"

  # HTTP transport (connects to a remote server)
  - name: "web-tools"
    transport_type: "http"
    url: "http://localhost:8000"
    headers:
      Authorization: "Bearer token123"

  # SSE transport (server-sent events)
  - name: "streaming-server"
    transport_type: "sse"
    url: "http://localhost:9000/sse"
```
Example Configurations
Minimal (OpenAI)
```yaml
# ~/.koder/config.yaml
model:
  name: "gpt-4o"
  provider: "openai"
```

```shell
export OPENAI_API_KEY="sk-..."
koder
```
Enterprise Azure Setup
```yaml
# ~/.koder/config.yaml
model:
  name: "gpt-4"
  provider: "azure"
  azure_api_version: "2025-04-01-preview"

cli:
  session: "enterprise-project"
  stream: true

mcp_servers:
  - name: "company-tools"
    transport_type: "http"
    url: "https://internal-mcp.company.com"
    headers:
      X-API-Key: "${COMPANY_API_KEY}"
```

```shell
export AZURE_API_KEY="..."
export AZURE_API_BASE="https://your-resource.openai.azure.com"
koder
```
Multi-Provider Development
```yaml
# ~/.koder/config.yaml - set a default
model:
  name: "gpt-4o"
  provider: "openai"
```

```shell
# Override at runtime with KODER_MODEL
export OPENAI_API_KEY="..."
export ANTHROPIC_API_KEY="..."

# Use the default (OpenAI)
koder

# Switch to Claude for specific tasks
KODER_MODEL="claude-opus-4-20250514" koder "complex reasoning task"
```
Configuration Priority
When the same setting is defined in multiple places, the priority is:
CLI Arguments > Environment Variables > Config File > Defaults
Example:
```yaml
# ~/.koder/config.yaml
model:
  name: "gpt-4o"
```

```shell
# Environment variable overrides the config file
export KODER_MODEL="claude-opus-4-20250514"
koder  # Uses claude-opus-4-20250514
```
🛠️ Development
Setup Development Environment
```shell
# Clone the repository
git clone https://github.com/feiskyer/koder.git
cd koder

# Install dependencies and run from source
uv sync
uv run koder
```
Code Quality
```shell
# Code formatting
black .

# Linting
ruff check --fix

# pylint (errors only)
pylint koder_agent/ --disable=C,R,W --errors-only
```
🔒 Security
- API Keys: All API keys are stored in environment variables and never in code.
- Local Storage: Sessions are stored locally in your home directory.
- No Telemetry: Koder doesn't send any data besides API requests to your chosen provider.
- Code Execution: Shell commands require explicit user confirmation.
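The confirmation step can be sketched as a gate around command execution. This is illustrative only; the function name, prompt wording, and injectable `confirm` callback are assumptions made for the example, not Koder's actual code.

```python
import subprocess

def run_with_confirmation(command: str, confirm=input):
    """Ask the user before executing a shell command; return None if declined.

    The confirm callback defaults to input() but can be swapped out
    (e.g. for testing or a non-interactive policy)."""
    answer = confirm(f"Run shell command? [y/N] {command!r} ").strip().lower()
    if answer not in ("y", "yes"):
        return None  # command rejected, nothing executed
    return subprocess.run(command, shell=True, capture_output=True, text=True)
```

Anything other than an explicit "y"/"yes" is treated as a refusal, so the safe path is the default.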
🤝 Contributing
Contributions are welcome! Here's how you can help:
1. Fork the repository
2. Create a feature branch (`git checkout -b feature/amazing-feature`)
3. Commit your changes (`git commit -m 'Add amazing feature'`)
4. Push to the branch (`git push origin feature/amazing-feature`)
5. Open a Pull Request
Please read our Contributing Guidelines for more details.
🌐 Code of Conduct
This project follows a Code of Conduct based on the Contributor Covenant. Be kind and respectful. If you observe unacceptable behavior, please open an issue.
📄 License
This project is licensed under the MIT License - see the LICENSE file for details.
Use of third-party AI services is governed by their respective provider terms.
File details

Details for the file koder-0.4.0.tar.gz.

- Download URL: koder-0.4.0.tar.gz
- Size: 385.7 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: uv/0.5.29

| Algorithm | Hash digest |
|---|---|
| SHA256 | `fb19a27b08211f6edbd16d42aada6e4b3981b7d65f73164c4f734aab6027e081` |
| MD5 | `8f71816c6d03fb3e8d2e7a356409b599` |
| BLAKE2b-256 | `6846e1828ff7d337e23722062f535893b51ef83e69a63dd53d6289a1bdac35ce` |
File details

Details for the file koder-0.4.0-py3-none-any.whl.

- Download URL: koder-0.4.0-py3-none-any.whl
- Size: 85.1 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: uv/0.5.29

| Algorithm | Hash digest |
|---|---|
| SHA256 | `ff5d79dfd5885757c5f29f7d4ab87ffa6c484ba1991bb1dd921904ffd4e77da2` |
| MD5 | `53dcb9f9781ebaecc51aa8180273f9c8` |
| BLAKE2b-256 | `88f310805ddfdae4c729262b847c048285d20a127290c7092374b5f00a207bd5` |