# 🤖 Nanocoder
A minimal, multi-provider coding agent with a beautiful Textual TUI. Supports Google Gemini, OpenAI, and Anthropic Claude via provider-native APIs.
## ✨ Features
- Multi-Provider Support: Switch between Gemini, OpenAI, and Anthropic with a single environment variable
- Unified Agent Loop: Same UX across all providers
- Streaming Responses: Real-time text and thought/reasoning display
- Parallel Tool Execution: Concurrent file operations and commands
- Trajectory Logging: JSONL traces for debugging, reproducibility, and training data
- Beautiful TUI: Modern terminal interface with Textual
## 🧭 Overview
Nanocoder pairs a Textual front-end with a provider-agnostic agent core so the UX stays identical whether you call Gemini, OpenAI, or Anthropic. The codebase is intentionally compact and organized around a few focused modules:
- **CLI + TUI** (`nanocoder/cli.py`, `nanocoder/app_tui.py`): boots the Textual application, renders streaming text/thought panes, and wires up keyboard shortcuts plus chat commands like `/trace`.
- **Agent core** (`nanocoder/agent/core.py`): streams `LLMSession` events, manages multi-iteration tool calling, and records trajectories for every turn.
- **Tooling layer** (`nanocoder/agent/tools.py`, `nanocoder/agent/exec.py`): exposes filesystem/shell helpers with JSON Schema contracts and executes them in parallel, with callback hooks for the UI.
- **LLM adapters** (`nanocoder/llm/*.py`): wrap provider-native SDKs and convert their streaming outputs into the unified event model.
- **Tracing** (`nanocoder/tracing/*`): produces JSONL traces in `.nanocoder_traces/` for debugging and reproducibility.
## 🚀 Quick Start

### Installation

```bash
# Using uv (recommended)
uv tool install nanocoder

# Or clone and install
git clone https://github.com/yuxiang-wu/nanocoder
cd nanocoder
uv sync
```
### Setup API Key

```bash
# For Gemini (default)
export GEMINI_API_KEY="your-key"

# Or for OpenAI
export OPENAI_API_KEY="your-key"
export NANOCODER_PROVIDER="openai"

# Or for Anthropic
export ANTHROPIC_API_KEY="your-key"
export NANOCODER_PROVIDER="anthropic"
```
### Run

```bash
nanocoder
```
## Repo at a Glance

| Area | What lives there |
|---|---|
| `nanocoder/__init__.py` | Package init, version (single source of truth) |
| `nanocoder/app_tui.py` | Textual widgets, streaming UI, keyboard bindings |
| `nanocoder/agent/` | Provider-agnostic loop, tool registry, parallel executor |
| `nanocoder/llm/` | Adapters for Gemini, OpenAI Responses, Anthropic Messages |
| `nanocoder/tracing/` | JSONL schema + logger used for `.nanocoder_traces/` |
## 🛠️ Available Tools

| Tool | Description |
|---|---|
| `read_file` | Read file contents with optional line ranges |
| `edit_file` | Create or edit files using search & replace |
| `run_command` | Execute shell commands with a persistent working directory |
| `web_search` | Search the web using Exa (requires `EXA_API_KEY`) |
| `web_read` | Fetch full content from specific URLs |
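These tools are exposed to the model via JSON Schema contracts (see the tooling layer in the overview). As a rough illustration, a declaration for `read_file` could look like the sketch below; the actual field names and schema in `nanocoder/agent/tools.py` may differ.

```python
# Hypothetical JSON Schema contract for the read_file tool.
# Field names here are illustrative, not Nanocoder's actual schema.
READ_FILE_SCHEMA = {
    "name": "read_file",
    "description": "Read file contents, optionally restricted to a line range.",
    "parameters": {
        "type": "object",
        "properties": {
            "path": {"type": "string", "description": "Path to the file"},
            "start_line": {"type": "integer", "description": "1-based first line (optional)"},
            "end_line": {"type": "integer", "description": "1-based last line (optional)"},
        },
        "required": ["path"],
    },
}
```

A contract like this is what lets every provider adapter advertise the same tools, even though Gemini, OpenAI, and Anthropic each wrap the schema in their own function-calling envelope.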
## ⚙️ Configuration

All configuration is via environment variables:

| Variable | Default | Description |
|---|---|---|
| `NANOCODER_PROVIDER` | `gemini` | Provider: `gemini`, `openai`, or `anthropic` |
| `NANOCODER_MODEL` | (per provider) | Model identifier (see defaults below) |
| `NANOCODER_SHOW_THOUGHTS` | `1` | Show the thought panel (`0` or `1`) |
| `NANOCODER_TRACE_DIR` | `.nanocoder_traces` | Trace output directory |
| `NANOCODER_MAX_TOOL_WORKERS` | `10` | Max concurrent tool executions |
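The env-var resolution above is simple enough to sketch. The helper below is hypothetical (Nanocoder's real code may structure this differently), but the variable names and defaults mirror the table:

```python
import os

def load_config(env=os.environ):
    """Resolve Nanocoder-style settings from environment variables.

    Hypothetical sketch: variable names and defaults follow the table above,
    but this is not Nanocoder's actual implementation.
    """
    return {
        "provider": env.get("NANOCODER_PROVIDER", "gemini"),
        "model": env.get("NANOCODER_MODEL"),  # None -> per-provider default
        "show_thoughts": env.get("NANOCODER_SHOW_THOUGHTS", "1") == "1",
        "trace_dir": env.get("NANOCODER_TRACE_DIR", ".nanocoder_traces"),
        "max_tool_workers": int(env.get("NANOCODER_MAX_TOOL_WORKERS", "10")),
    }
```

Passing `env` explicitly (rather than reading `os.environ` inline) keeps the resolution testable without mutating the process environment.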
### Provider-Specific Keys

| Provider | API Key Variable |
|---|---|
| Gemini | `GEMINI_API_KEY` |
| OpenAI | `OPENAI_API_KEY` |
| Anthropic | `ANTHROPIC_API_KEY` |
| Exa (web search) | `EXA_API_KEY` (optional) |
### Default Models

| Provider | Default Model | Thinking/Reasoning |
|---|---|---|
| Gemini | `gemini-3-pro-preview` | Thought summaries (high effort) |
| OpenAI | `gpt-5.1-codex` | Detailed reasoning summaries |
| Anthropic | `claude-opus-4-5-20251101` | Extended thinking (10k-token budget) |

All providers show their internal reasoning/thinking in the UI's thought panel.
## ⌨️ Keyboard Shortcuts

| Key | Action |
|---|---|
| `Ctrl+C` | Quit |
| `Ctrl+L` | Clear chat history |
| `Escape` | Focus the input |
## 💬 Commands

| Command | Description |
|---|---|
| `/quit` | Exit the application |
| `/clear` | Clear chat history |
| `/help` | Show help information |
| `/provider` | Show the current provider and model |
| `/trace` | Show the trace file path |
## 📊 Trajectory Logging

Every session creates a JSONL trace file for debugging and reproducibility:

```
.nanocoder_traces/20251227_143022_a1b2c3d4_gemini.jsonl
```
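The filename packs the session timestamp, a short run identifier, and the provider into one string. A hypothetical parser (the components are inferred from the example above, not from Nanocoder's source):

```python
import re

# Assumed pattern: <YYYYMMDD>_<HHMMSS>_<run id>_<provider>.jsonl
TRACE_NAME = re.compile(
    r"^(?P<date>\d{8})_(?P<time>\d{6})_(?P<run_id>[0-9a-f]+)_(?P<provider>\w+)\.jsonl$"
)

def parse_trace_name(name):
    """Split a trace filename into its components, or return None if it doesn't match."""
    m = TRACE_NAME.match(name)
    return m.groupdict() if m else None
```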
### Trace Contents
Each trace includes:
- Machine metadata: hostname, platform, Python version, Nanocoder version
- Model responses: full text, reasoning/thinking, start/end timestamps, token usage
- Tool calls: arguments and results
- Timing: sufficient for replay without streaming
Trace events: `run.start`, `run.end`, `turn.start`, `turn.end`, `model.request`, `model.response`, `tool.start`, `tool.end`, `error`
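Because traces are plain JSONL, they are easy to post-process. The reader below assumes each line is a JSON object whose event kind is stored under a `"type"` key matching the names above; the real schema lives in `nanocoder/tracing/schema.py` and may use different field names.

```python
import json

def load_events(path, kinds=None):
    """Parse a JSONL trace file, optionally keeping only the given event kinds.

    Assumes one JSON object per line with the event name under a "type" key;
    this is an assumption about the schema, not Nanocoder's documented format.
    """
    events = []
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            line = line.strip()
            if not line:
                continue
            event = json.loads(line)
            if kinds is None or event.get("type") in kinds:
                events.append(event)
    return events
```

For example, filtering to `tool.start`/`tool.end` pairs gives a quick per-tool timeline of a session.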
## 🏗️ Architecture

```
nanocoder/
├── __init__.py        # Package init + version
├── cli.py             # Entry point
├── app_tui.py         # Textual TUI
├── agent/
│   ├── core.py        # Provider-agnostic agent loop
│   ├── tools.py       # Tool definitions + registry
│   └── exec.py        # Parallel tool execution
├── llm/
│   ├── base.py        # Event types & LLMSession protocol
│   ├── gemini.py      # Gemini adapter (google.genai)
│   ├── openai.py      # OpenAI adapter (Responses API)
│   └── anthropic.py   # Anthropic adapter (Messages API)
└── tracing/
    ├── logger.py      # TrajectoryLogger + JSONL output
    └── schema.py      # Trace event dataclasses
```
### Design Principles
- Unify on events, not messages: Provider adapters maintain native conversation state
- Provider correctness first: No cross-provider message normalization
- Trace everything: Consistent event schema for debugging, reproducibility, and training
- Minimal surface area: Thin adapters, provider-agnostic core loop
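The "unify on events, not messages" principle can be made concrete with a small sketch: adapters emit a shared set of event types while keeping provider-native conversation state to themselves, so the UI dispatches only on event type. These dataclasses are illustrative, not the actual types in `nanocoder/llm/base.py`:

```python
from dataclasses import dataclass

# Hypothetical unified event types; the real ones live in nanocoder/llm/base.py.
@dataclass
class TextDelta:
    text: str

@dataclass
class ThoughtDelta:
    text: str

@dataclass
class ToolCall:
    name: str
    arguments: dict

def render(event):
    """A UI dispatches on the unified event type, never on the provider."""
    if isinstance(event, ThoughtDelta):
        return f"[thought] {event.text}"
    if isinstance(event, TextDelta):
        return event.text
    if isinstance(event, ToolCall):
        return f"[tool] {event.name}({event.arguments})"
    raise TypeError(f"unknown event: {event!r}")
```

Any provider adapter that can translate its native stream into these events plugs into the same agent loop and TUI unchanged.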
## 📄 License

MIT License - see LICENSE for details.
## 🙏 Acknowledgments
Built with:
## File Details

### nanocoder-0.2.7.tar.gz

- Download URL: nanocoder-0.2.7.tar.gz
- Upload date:
- Size: 33.1 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: uv/0.8.13

| Algorithm | Hash digest |
|---|---|
| SHA256 | `1530659ec0968ab5463e336d3e6ae57458c3ed34b5ea2a0c307ff59cbca16eea` |
| MD5 | `419a31cbaa38bc136e468eef92c130ab` |
| BLAKE2b-256 | `4b5fbea087863249c5e02dcc5f4bd7f59bc26350d12611b93849991373c36c7c` |
### nanocoder-0.2.7-py3-none-any.whl

- Download URL: nanocoder-0.2.7-py3-none-any.whl
- Upload date:
- Size: 39.9 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: uv/0.8.13

| Algorithm | Hash digest |
|---|---|
| SHA256 | `d64ed4d2778eaefc403e8c1d82124d2fff9d323947ad4f0dbd1e5e406c689e0c` |
| MD5 | `aa590c940a26ee6aa34c705169229463` |
| BLAKE2b-256 | `c8b7609811e2cfd6e89ebb09218e7a1935bcc66b19eace5f9a146312df5b93f2` |