An intuitive AI coding assistant and interactive CLI tool that boosts developer productivity with intelligent automation and context-aware support.
Koder
Koder is an experimental, universal AI coding assistant for the terminal. Written entirely in Python, it serves as both a functional tool and a learning playground for AI agent development.
🎯 Project Status: Under active vibe coding! This is a learning-focused project where we explore building AI coding agents.
✨ Features
- 🤖 Universal AI Support: Works with OpenAI, Anthropic, Google, GitHub Copilot, and 100+ providers via LiteLLM with intelligent auto-detection
- 💾 Smart Context Management: Persistent sessions with SQLite storage and automatic token-aware compression (50k token limit)
- 🔄 Real-time Streaming: Rich Live displays with intelligent terminal cleanup for responsive user experience
- 🛠️ Comprehensive Toolset: File operations, search, shell commands, task delegation, and todo management
- 🔌 MCP Integration: Model Context Protocol support with stdio, SSE, and HTTP transports for extensible tool ecosystem
- 🛡️ Enterprise Security: SecurityGuard validation, output filtering, permission system, and input sanitization
- 🎯 Zero Configuration: Automatic provider detection with fallback defaults
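To illustrate the kind of token-aware compression the feature list describes, here is a minimal sketch. It is not Koder's actual implementation: the 4-characters-per-token estimate and the `compress_history` helper are illustrative assumptions standing in for a real tokenizer and the project's real session logic.

```python
# Hypothetical sketch of token-aware history compression (not Koder's actual
# code). Approximates token counts, then drops the oldest turns until the
# conversation fits under the documented 50k-token budget.

TOKEN_LIMIT = 50_000  # the context budget mentioned in the feature list

def estimate_tokens(text: str) -> int:
    """Cheap approximation: roughly 4 characters per token for English text."""
    return max(1, len(text) // 4)

def compress_history(messages: list[dict], limit: int = TOKEN_LIMIT) -> list[dict]:
    """Trim the oldest user/assistant turns (preserving a leading system
    prompt, if present) until the estimated total fits under the limit."""
    kept = list(messages)
    total = sum(estimate_tokens(m["content"]) for m in kept)
    start = 1 if kept and kept[0].get("role") == "system" else 0
    while total > limit and len(kept) > start + 1:
        removed = kept.pop(start)  # oldest non-system message
        total -= estimate_tokens(removed["content"])
    return kept
```

A real implementation would use the provider's tokenizer and might summarize dropped turns instead of discarding them, but the shape of the loop is the same.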
🛠️ Installation
Using uv (Recommended)
uv tool install koder
Using pip
pip install koder
⚡ Quick Start
Simply run Koder with your question or request:
# Configure one provider (example: OpenAI)
export OPENAI_API_KEY="your-openai-api-key"
export KODER_MODEL="gpt-4o"
# Run in interactive mode
koder
# Run with prompt
koder "create a Python function to calculate fibonacci numbers"
# Run a prompt in a named session (-s keeps context across runs)
koder -s my-project "Help me implement a new feature"
🤖 Configuration
Environment Variables
Koder automatically detects your AI provider based on available environment variables. The KODER_MODEL environment variable controls which model to use:
# OpenAI models
export KODER_MODEL="gpt-4.1"
koder
# Claude models (via LiteLLM)
export KODER_MODEL="claude-opus-4-20250514"
export ANTHROPIC_API_KEY=your-api-key
koder
# Google Gemini models (via LiteLLM)
export KODER_MODEL="gemini/gemini-2.5-pro"
export GOOGLE_API_KEY=your-api-key
koder
# GitHub Copilot (via LiteLLM)
export KODER_MODEL="github_copilot/claude-sonnet-4"
koder
Supported Providers
OpenAI
export OPENAI_API_KEY=your-api-key
# Optional: Use custom endpoint
export OPENAI_API_BASE=https://your-endpoint.com
# Optional: Specify model (default: gpt-4.1)
export KODER_MODEL="gpt-4o"
koder
Anthropic
export KODER_MODEL="claude-opus-4-20250514"
export ANTHROPIC_API_KEY=your-api-key
koder
Google Gemini
export KODER_MODEL="gemini/gemini-2.5-pro"
export GOOGLE_API_KEY=your-api-key
koder
GitHub Copilot
export KODER_MODEL="github_copilot/claude-sonnet-4"
koder
On first run you will see a device code in the terminal. Visit https://github.com/login/device and enter the code to authenticate.
Azure OpenAI
export KODER_MODEL="azure/gpt-5"
export AZURE_API_KEY="your-azure-api-key"
export AZURE_API_BASE="https://your-resource.openai.azure.com"
export AZURE_API_VERSION="2025-04-01-preview"
koder
Other AI providers (via LiteLLM)
LiteLLM supports 100+ providers including Anthropic, Google, Cohere, Hugging Face, and more:
# Google Vertex AI
export KODER_MODEL="vertex_ai/claude-sonnet-4@20250514"
export GOOGLE_APPLICATION_CREDENTIALS="your-sa-path.json"
export VERTEXAI_LOCATION="<your-region>"
koder
# Custom OpenAI-compatible endpoints
export KODER_MODEL="openai/<your-model-name>"
export OPENAI_API_KEY="your-key"
export OPENAI_BASE_URL="https://your-custom-endpoint.com/v1"
koder
🛠️ Development
Setup Development Environment
# Clone the repository
git clone https://github.com/feiskyer/koder.git
cd koder
# Install dependencies
uv sync
# Run from source
uv run koder
Code Quality
# Code formatting
black .
# Linting
ruff check --fix
# pylint
pylint koder_agent/ --disable=C,R,W --errors-only
🔒 Security
- API Keys: All API keys are stored in environment variables and never in code.
- Local Storage: Sessions are stored locally in your home directory.
- No Telemetry: Koder doesn't send any data besides API requests to your chosen provider.
- Code Execution: Shell commands require explicit user confirmation.
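One way such a confirmation gate can be structured is to screen each command against dangerous patterns before prompting the user. This is a hypothetical sketch, not Koder's actual SecurityGuard; the pattern list and `needs_confirmation` helper are illustrative assumptions.

```python
# Hypothetical sketch of a shell-command confirmation gate (not Koder's
# actual SecurityGuard). Commands matching a dangerous pattern are flagged
# so the CLI can demand explicit user approval before executing them.
import re

DANGEROUS_PATTERNS = [
    r"\brm\s+-rf?\b",        # recursive/forced deletes
    r"\bsudo\b",             # privilege escalation
    r"\bcurl\b.*\|\s*sh\b",  # pipe-to-shell installs
    r">\s*/dev/sd",          # raw writes to block devices
]

def needs_confirmation(command: str) -> bool:
    """Return True if the command matches any dangerous pattern."""
    return any(re.search(p, command) for p in DANGEROUS_PATTERNS)
```

A production guard would also handle quoting, command chaining (`&&`, `;`), and allow/deny lists, but a pattern screen like this is a common first layer.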
🤝 Contributing
Contributions are welcome! Here's how you can help:
- Fork the repository
- Create a feature branch (git checkout -b feature/amazing-feature)
- Commit your changes (git commit -m 'Add amazing feature')
- Push to the branch (git push origin feature/amazing-feature)
- Open a Pull Request
Please read our Contributing Guidelines for more details.
🌐 Code of Conduct
This project follows a Code of Conduct based on the Contributor Covenant. Be kind and respectful. If you observe unacceptable behavior, please open an issue.
📄 License
This project is licensed under the MIT License - see the LICENSE file for details.
Use of third-party AI services is governed by their respective provider terms.
File details
Details for the file koder-0.3.3.tar.gz.
File metadata
- Download URL: koder-0.3.3.tar.gz
- Upload date:
- Size: 250.5 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: uv/0.5.29
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | e7d553c7872083500426b7d4a05f9c9aadb55df8aff9fb8907710403c436ab50 |
| MD5 | beeb5c6f21f4f3c5fb44ef1b56c524b3 |
| BLAKE2b-256 | 444c7904d8e0cac0c960dafe10f0b93f470f0658c63a626317f28d2810d24e3c |
File details
Details for the file koder-0.3.3-py3-none-any.whl.
File metadata
- Download URL: koder-0.3.3-py3-none-any.whl
- Upload date:
- Size: 59.0 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: uv/0.5.29
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 872da8c2ad930fda8f5b55d9eb8ac046f37d6421664f12fb912b5e68b9d8b326 |
| MD5 | ee4313fb01a3b8f4309dff2fb064c207 |
| BLAKE2b-256 | 2cc8a55cd9b3b7696a6c3c4369e56e3b3437ddbd5ed42e8b641483b46ae52e83 |