
An intuitive AI coding assistant and interactive CLI tool that boosts developer productivity with intelligent automation and context-aware support.


Koder


An experimental, universal AI coding assistant for the terminal. Built in Python, Koder works with OpenAI, Anthropic, Google, GitHub Copilot, and 100+ providers via LiteLLM.

🎯 Status: Alpha - This is a learning-focused project exploring AI agent development.

Features

  • Universal AI Support - Works with ChatGPT/Gemini/Claude subscriptions and API keys via OpenAI, Anthropic, Google, GitHub Copilot, and 100+ providers
  • Smart Context - Persistent sessions with SQLite storage and automatic token-aware compression
  • Real-time Streaming - Rich terminal displays with live output
  • Comprehensive Tools - File operations, search, shell, task delegation, todos, and skills
  • MCP Integration - Extensible tool ecosystem via Model Context Protocol
  • Zero Config - Automatic provider detection with sensible defaults
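The "token-aware compression" above can be pictured with a minimal sketch (the function names and the chars-per-token heuristic are illustrative assumptions, not Koder's actual internals):

```python
def estimate_tokens(text: str) -> int:
    # Rough heuristic: roughly 4 characters per token.
    return max(1, len(text) // 4)

def compress_history(messages: list[str], budget: int) -> list[str]:
    """Drop the oldest messages until the estimated total fits the budget."""
    kept = list(messages)
    while len(kept) > 1 and sum(estimate_tokens(m) for m in kept) > budget:
        kept.pop(0)  # discard the oldest message first
    return kept

history = ["a" * 400, "b" * 400, "c" * 400]  # ~100 estimated tokens each
print(len(compress_history(history, budget=150)))  # → 1 (only the newest fits)
```

Real implementations typically summarize dropped turns rather than discarding them outright; the budget check is the part that matters here.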

Installation

Using uv (Recommended)

uv tool install koder

Using pip

pip install koder

Quick Start

# 1. Set your API key (works with any provider)
export KODER_API_KEY="your-api-key"

# 2. Run Koder
koder

That's it! KODER_API_KEY works with any provider - no need to remember provider-specific variable names.
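Routing a single universal key to the right backend implies some provider detection. A hypothetical sketch keyed off the model name (the rules below are assumptions for illustration, not Koder's actual logic):

```python
def detect_provider(model: str) -> str:
    if "/" in model:                      # explicit prefix, e.g. "gemini/gemini-2.5-pro"
        return model.split("/", 1)[0]
    if model.startswith("gpt"):
        return "openai"
    if model.startswith("claude"):
        return "anthropic"
    return "unknown"

print(detect_provider("claude-opus-4-20250514"))  # → anthropic
```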

Basic Usage

# Interactive mode
koder

# Single prompt
koder "create a Python function to calculate fibonacci numbers"

# Named session (persists conversation)
koder -s my-project "help me implement a new feature"

# Use a different model
KODER_MODEL="claude-opus-4-20250514" koder "your prompt"

Configuration

Koder can be configured via (in priority order):

  1. CLI arguments - Highest priority
  2. Environment variables - Universal KODER_* vars override provider-specific variables and the config file
  3. Config file - ~/.koder/config.yaml
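The precedence rules amount to a layered merge; a minimal sketch (illustrative only, not Koder's config loader):

```python
def resolve_config(cli: dict, env: dict, file_cfg: dict) -> dict:
    """Later updates win: file < environment < CLI arguments."""
    merged = dict(file_cfg)   # lowest priority
    merged.update(env)        # environment overrides the config file
    merged.update(cli)        # CLI arguments win over everything
    return merged

cfg = resolve_config(
    cli={"model": "gpt-4o"},
    env={"model": "claude-opus-4-20250514", "stream": "true"},
    file_cfg={"model": "gemini/gemini-2.5-pro", "session": "default"},
)
print(cfg["model"])  # → gpt-4o
```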

Environment Variables

| Variable | Description | Example |
|---|---|---|
| KODER_API_KEY | Universal API key (works with any provider) | sk-... |
| KODER_MODEL | Model to use | gpt-4o, claude-opus-4-20250514 |
| KODER_BASE_URL | Custom API endpoint | http://localhost:8080/v1 |
| KODER_REASONING_EFFORT | For reasoning models | low, medium, high |

KODER_API_KEY and KODER_BASE_URL take priority over provider-specific variables (like OPENAI_API_KEY) and config file settings.

Providers

Use KODER_API_KEY for any provider, or provider-specific variables:

| Provider | Environment Variable | Model Example |
|---|---|---|
| Any | KODER_API_KEY | works with all providers |
| OpenAI | OPENAI_API_KEY | gpt-4o, gpt-4.1 |
| Anthropic | ANTHROPIC_API_KEY | claude-opus-4-20250514 |
| Google | GOOGLE_API_KEY | gemini/gemini-2.5-pro |
| GitHub Copilot | (device auth) | github_copilot/claude-sonnet-4 |
| Azure | AZURE_API_KEY | azure/gpt-4 |

See Configuration Guide for all 100+ providers.

OAuth Providers (Subscription-Based)

Use your existing subscriptions (Claude Max, ChatGPT Plus/Pro, Google Gemini) without API keys:

# Authenticate with a provider
koder auth login google      # Google/Gemini CLI (free with Google account)
koder auth login claude      # Claude Max subscription
koder auth login chatgpt     # ChatGPT Plus/Pro subscription
koder auth login antigravity # Antigravity (Gemini/Claude models)

# Check authentication status
koder auth list              # List configured providers
koder auth status            # Show detailed token status

# Revoke access
koder auth revoke google     # Remove stored tokens

OAuth Providers:

| OAuth Provider | Subscription | Description |
|---|---|---|
| google | Google Gemini | Access to Gemini models via Google account |
| claude | Claude Pro/Max | Access to Claude models via subscription |
| chatgpt | ChatGPT Plus/Pro | Access to GPT models via subscription |
| antigravity | Antigravity | Access to Gemini/Claude models |

Available models are fetched from each provider's API after login and cached locally (1 day TTL). Use koder auth list to see all available models for your authenticated providers.

Usage after authentication:

# Use OAuth-authenticated models (OAuth provider prefix required)
KODER_MODEL="google/gemini-3-pro-preview" koder "your prompt"                # Google OAuth
KODER_MODEL="claude/claude-opus-4-5-20250514" koder "your prompt"            # Claude Max
KODER_MODEL="chatgpt/gpt-5.2" koder "your prompt"                            # ChatGPT OAuth
KODER_MODEL="antigravity/gemini-3-pro-high" koder "your prompt"              # Antigravity (Gemini 3.0)
KODER_MODEL="antigravity/claude-opus-4-5-thinking" koder "your prompt"       # Antigravity (Claude Opus 4.5)

OAuth tokens and cached models are stored in ~/.koder/tokens/ and automatically refreshed before expiry.

Config File Example

# ~/.koder/config.yaml

model:
  name: "gpt-4o"
  provider: "openai"
  reasoning_effort: null    # For reasoning models: low, medium, high

cli:
  session: null             # Default session name
  stream: true              # Enable streaming output

mcp_servers: []             # MCP server configurations

Commands

koder config show          # Show current config
koder config edit          # Edit config file
koder -s SESSION_NAME      # Use named session

MCP Servers

Model Context Protocol (MCP) servers extend Koder with additional tools.

CLI Commands

# Add servers
koder mcp add myserver "python -m my_mcp_server"
koder mcp add myserver "python -m server" -e API_KEY=xxx

# HTTP/SSE transport
koder mcp add webserver --transport http --url http://localhost:8000

# Manage servers
koder mcp list
koder mcp get myserver
koder mcp remove myserver

Config Example

# In ~/.koder/config.yaml
mcp_servers:
  # stdio transport (local command)
  - name: "filesystem"
    transport_type: "stdio"
    command: "python"
    args: ["-m", "mcp.server.filesystem"]
    env_vars:
      ROOT_PATH: "/home/user/projects"
    cache_tools_list: true
    allowed_tools:
      - "read_file"
      - "write_file"

  # HTTP transport (remote server)
  - name: "web-tools"
    transport_type: "http"
    url: "http://localhost:8000"
    headers:
      Authorization: "Bearer token123"

  # SSE transport
  - name: "streaming-server"
    transport_type: "sse"
    url: "http://localhost:9000/sse"

Skills

Skills provide specialized knowledge loaded on demand; progressive disclosure keeps 90%+ of their tokens out of the prompt until a skill is actually used.

Directory Structure

Skills are loaded from (project skills take priority):

  1. Project: .koder/skills/
  2. User: ~/.koder/skills/

Example layout:

.koder/skills/
├── api-design/
│   └── SKILL.md
├── code-review/
│   ├── SKILL.md
│   └── checklist.md
└── testing/
    └── SKILL.md

Creating a Skill

Create a SKILL.md with YAML frontmatter:

---
name: api-design
description: Best practices for designing RESTful APIs
allowed_tools:
  - read_file
  - write_file
---

# API Design Guidelines

## RESTful Principles

Use nouns for resources, HTTP verbs for actions...

How Skills Work

  1. Startup: Only skill names and descriptions are loaded (minimal tokens)
  2. On-demand: Full content fetched when needed via get_skill(name)
  3. Supplementary: Skills can reference additional files
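The split between cheap startup metadata and on-demand content falls out of the SKILL.md format above: the frontmatter can be read without touching the body. A minimal sketch (the `---` splitting is simplified; real YAML handling would be more robust):

```python
def split_skill(text: str) -> tuple[str, str]:
    """Return (frontmatter, body) from a SKILL.md document."""
    _, frontmatter, body = text.split("---", 2)
    return frontmatter.strip(), body.strip()

skill_md = """---
name: api-design
description: Best practices for designing RESTful APIs
---

# API Design Guidelines
"""

# Startup: only the frontmatter is parsed for name/description.
frontmatter, body = split_skill(skill_md)
print(frontmatter.splitlines()[0])  # → name: api-design
```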

Architecture

koder_agent/
├── agentic/        # Agent creation, hooks, and approval system
├── cli.py          # Main CLI entry point
├── config/         # Configuration management
├── core/           # Scheduler, context, streaming, security
├── mcp/            # Model Context Protocol integration
├── tools/          # Tool implementations
└── utils/          # Helpers and utilities

Core Flow

  1. CLI (cli.py) parses arguments, initializes session
  2. AgentScheduler (core/scheduler.py) orchestrates execution with streaming
  3. Agent (agentic/agent.py) builds agent with tools, MCP servers, model settings
  4. Tools (tools/engine.py) register tools, validate inputs, filter output
  5. Context (core/context.py) persists conversations in SQLite
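Step 5's SQLite persistence can be sketched with the standard `sqlite3` module (the schema and function names are hypothetical, not Koder's actual koder.db layout):

```python
import sqlite3

def open_store(path: str = ":memory:") -> sqlite3.Connection:
    conn = sqlite3.connect(path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS messages ("
        "session TEXT, role TEXT, content TEXT)"
    )
    return conn

def append(conn: sqlite3.Connection, session: str, role: str, content: str) -> None:
    conn.execute("INSERT INTO messages VALUES (?, ?, ?)",
                 (session, role, content))
    conn.commit()

def history(conn: sqlite3.Connection, session: str) -> list[tuple[str, str]]:
    cur = conn.execute(
        "SELECT role, content FROM messages WHERE session = ?", (session,))
    return cur.fetchall()

conn = open_store()
append(conn, "my-project", "user", "help me implement a new feature")
print(history(conn, "my-project"))
```

Keying every row by session name is what lets `koder -s my-project` resume a conversation later.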

Data Storage

  • Database: ~/.koder/koder.db (SQLite)
  • Config: ~/.koder/config.yaml
  • Skills: ~/.koder/skills/ or .koder/skills/

Development

Setup

git clone https://github.com/feiskyer/koder.git
cd koder
uv sync
uv run koder

Code Quality

uv run black .              # Format
uv run ruff check --fix     # Lint
uv run pytest               # Test

Security

  • API Keys: Stored in environment variables, never in code
  • Local Storage: Sessions stored in ~/.koder/
  • No Telemetry: Only API requests to your chosen provider
  • Shell Commands: Require explicit user confirmation

Contributing

Contributions are welcome!

  1. Fork the repository
  2. Create a feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

See CONTRIBUTING.md for guidelines.

License

MIT License - see LICENSE for details.


Built with Python and curiosity
