
🤖 Nanocoder

A minimal, multi-provider coding agent with a beautiful Textual TUI. Supports Google Gemini, OpenAI, and Anthropic Claude via provider-native APIs.

Python 3.13+ · MIT License

✨ Features

  • Multi-Provider Support: Switch between Gemini, OpenAI, and Anthropic with a single environment variable
  • Unified Agent Loop: Same UX across all providers
  • Streaming Responses: Real-time text and thought/reasoning display
  • Parallel Tool Execution: Concurrent file operations and commands
  • Trajectory Logging: JSONL traces for debugging, reproducibility, and training data
  • Beautiful TUI: Modern terminal interface with Textual

🧭 Overview

Nanocoder pairs a Textual front-end with a provider-agnostic agent core so the UX stays identical whether you call Gemini, OpenAI, or Anthropic. The codebase is intentionally compact and organized around a few focused modules:

  • CLI + TUI (`nanocoder/cli.py`, `nanocoder/app_tui.py`) – boots the Textual application, renders streaming text/thought panes, and wires up keyboard shortcuts plus chat commands like `/trace`.
  • Agent core (`nanocoder/agent/core.py`) – streams `LLMSession` events, manages multi-iteration tool calling, and records trajectories for every turn.
  • Tooling layer (`nanocoder/agent/tools.py`, `nanocoder/agent/exec.py`) – exposes filesystem/shell helpers with JSON Schema contracts and executes them in parallel with callback hooks for the UI.
  • LLM adapters (`nanocoder/llm/*.py`) – wrap provider-native SDKs while converting their streaming outputs into the unified event model.
  • Tracing (`nanocoder/tracing/*`) – produces JSONL traces in `.nanocoder_traces/` for debugging and reproducibility.
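To make the unified event model concrete, here is a minimal sketch of how a front-end might fold a provider-agnostic event stream into rendered output. The event type names and fields here are illustrative assumptions; the real definitions live in `nanocoder/llm/base.py` and may differ.

```python
from dataclasses import dataclass

# Hypothetical event types standing in for the unified streaming model
# that all provider adapters converge on.
@dataclass
class TextDelta:
    text: str

@dataclass
class ThoughtDelta:
    text: str

@dataclass
class ToolCall:
    name: str
    args: dict

def render(events):
    """Fold a stream of events into (visible_text, thoughts, tool_calls)."""
    text, thoughts, calls = [], [], []
    for ev in events:
        if isinstance(ev, TextDelta):
            text.append(ev.text)
        elif isinstance(ev, ThoughtDelta):
            thoughts.append(ev.text)
        elif isinstance(ev, ToolCall):
            calls.append(ev)
    return "".join(text), "".join(thoughts), calls
```

Because every adapter emits this one event vocabulary, the TUI and the agent loop never need to know which provider produced the stream.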

🚀 Quick Start

Installation

```bash
# Using uv (recommended)
uv tool install nanocoder

# Or clone and install
git clone https://github.com/yuxiang-wu/nanocoder
cd nanocoder
uv sync
```

Setup API Key

```bash
# For Gemini (default)
export GEMINI_API_KEY="your-key"

# Or for OpenAI
export OPENAI_API_KEY="your-key"
export NANOCODER_PROVIDER="openai"

# Or for Anthropic
export ANTHROPIC_API_KEY="your-key"
export NANOCODER_PROVIDER="anthropic"
```

Run

```bash
nanocoder
```

Repo at a Glance

| Area | What lives there |
|------|------------------|
| `nanocoder/__init__.py` | Package init, version (single source of truth) |
| `nanocoder/app_tui.py` | Textual widgets, streaming UI, keyboard bindings |
| `nanocoder/agent/` | Provider-agnostic loop, tool registry, parallel executor |
| `nanocoder/llm/` | Adapters for Gemini, OpenAI Responses, Anthropic Messages |
| `nanocoder/tracing/` | JSONL schema + logger used for `.nanocoder_traces/` |

๐Ÿ› ๏ธ Available Tools

| Tool | Description |
|------|-------------|
| `read_file` | Read file contents with optional line ranges |
| `edit_file` | Create or edit files using search & replace |
| `run_command` | Execute shell commands with persistent working directory |
| `web_search` | Search the web using Exa (requires `EXA_API_KEY`) |
| `web_read` | Fetch full content from specific URLs |
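Each tool is exposed to the model through a JSON Schema contract. The exact declaration format lives in `nanocoder/agent/tools.py`; the shape and parameter names below are an illustrative assumption of what a `read_file` contract could look like.

```python
# Hypothetical tool declaration: a name, a description the model sees,
# and a JSON Schema describing the accepted arguments.
READ_FILE_TOOL = {
    "name": "read_file",
    "description": "Read file contents with optional line ranges",
    "parameters": {
        "type": "object",
        "properties": {
            "path": {"type": "string", "description": "File to read"},
            "start_line": {"type": "integer", "minimum": 1},
            "end_line": {"type": "integer", "minimum": 1},
        },
        "required": ["path"],
    },
}
```

Declaring tools as data like this lets each provider adapter translate the same contract into its own native function-calling format.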

โš™๏ธ Configuration

All configuration is via environment variables:

| Variable | Default | Description |
|----------|---------|-------------|
| `NANOCODER_PROVIDER` | `gemini` | Provider: `gemini`, `openai`, `anthropic` |
| `NANOCODER_MODEL` | (per provider) | Model identifier (see defaults below) |
| `NANOCODER_SHOW_THOUGHTS` | `1` | Show thought panel (`0` or `1`) |
| `NANOCODER_TRACE_DIR` | `.nanocoder_traces` | Trace output directory |
| `NANOCODER_MAX_TOOL_WORKERS` | `10` | Max concurrent tool executions |
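The table above translates directly into a small env-driven config loader. This is a minimal sketch with the documented defaults, not the project's actual config code.

```python
import os

def load_config(env=os.environ):
    """Read Nanocoder settings from environment variables,
    falling back to the documented defaults."""
    return {
        "provider": env.get("NANOCODER_PROVIDER", "gemini"),
        "show_thoughts": env.get("NANOCODER_SHOW_THOUGHTS", "1") == "1",
        "trace_dir": env.get("NANOCODER_TRACE_DIR", ".nanocoder_traces"),
        "max_tool_workers": int(env.get("NANOCODER_MAX_TOOL_WORKERS", "10")),
    }
```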

Provider-Specific Keys

| Provider | API Key Variable |
|----------|------------------|
| Gemini | `GEMINI_API_KEY` |
| OpenAI | `OPENAI_API_KEY` |
| Anthropic | `ANTHROPIC_API_KEY` |
| Exa (web search) | `EXA_API_KEY` (optional) |

Default Models

| Provider | Default Model | Thinking/Reasoning |
|----------|---------------|--------------------|
| Gemini | `gemini-3-pro-preview` | Thought summaries (high effort) |
| OpenAI | `gpt-5.1-codex` | Detailed reasoning summaries |
| Anthropic | `claude-opus-4-5-20251101` | Extended thinking (10k token budget) |

All providers show their internal reasoning/thinking in the UI's thought panel.
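Model resolution follows from the two tables above: `NANOCODER_MODEL` wins when set, otherwise the per-provider default applies. A sketch (the helper name is hypothetical):

```python
import os

# Defaults per provider, as documented above.
DEFAULT_MODELS = {
    "gemini": "gemini-3-pro-preview",
    "openai": "gpt-5.1-codex",
    "anthropic": "claude-opus-4-5-20251101",
}

def resolve_model(provider, env=os.environ):
    """NANOCODER_MODEL overrides; otherwise use the provider default."""
    return env.get("NANOCODER_MODEL") or DEFAULT_MODELS[provider]
```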

โŒจ๏ธ Keyboard Shortcuts

| Key | Action |
|-----|--------|
| `Ctrl+C` | Quit |
| `Ctrl+L` | Clear chat history |
| `Escape` | Focus input |

💬 Commands

| Command | Description |
|---------|-------------|
| `/quit` | Exit the application |
| `/clear` | Clear chat history |
| `/help` | Show help information |
| `/provider` | Show current provider and model |
| `/trace` | Show trace file path |
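The command set above amounts to a small slash-command dispatcher in front of the agent. This sketch is an assumption about the shape of that dispatch; the real handling lives in `nanocoder/app_tui.py` and the `state` fields here are illustrative.

```python
def dispatch(line, state):
    """Route a /command; return a reply string, or None for a plain
    chat message that should go to the agent."""
    if not line.startswith("/"):
        return None
    cmd = line.split()[0]
    if cmd == "/provider":
        return f"{state['provider']} / {state['model']}"
    if cmd == "/trace":
        return state["trace_path"]
    if cmd == "/clear":
        state["history"].clear()
        return "history cleared"
    if cmd in ("/quit", "/help"):
        return cmd[1:]  # handled by the app shell
    return f"unknown command: {cmd}"
```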

📊 Trajectory Logging

Every session creates a JSONL trace file for debugging and reproducibility:

.nanocoder_traces/20251227_143022_a1b2c3d4_gemini.jsonl

Trace Contents

Each trace includes:

  • Machine metadata: hostname, platform, Python version, Nanocoder version
  • Model responses: full text, reasoning/thinking, start/end timestamps, token usage
  • Tool calls: arguments and results
  • Timing: sufficient for replay without streaming

Trace events: `run.start`, `run.end`, `turn.start`, `turn.end`, `model.request`, `model.response`, `tool.start`, `tool.end`, `error`
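Because traces are plain JSONL (one JSON object per line), they are easy to post-process. A minimal sketch that tallies events by type; the `"event"` field name is an assumption about the schema in `nanocoder/tracing/schema.py`.

```python
import json

def summarize_trace(lines):
    """Count trace events by type from an iterable of JSONL lines."""
    counts = {}
    for line in lines:
        event = json.loads(line)["event"]
        counts[event] = counts.get(event, 0) + 1
    return counts
```

Pointing this at a file in `.nanocoder_traces/` gives a quick sanity check that every `tool.start` has a matching `tool.end`.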

๐Ÿ—๏ธ Architecture

```text
nanocoder/
├── __init__.py      # Package init + version
├── cli.py           # Entry point
├── app_tui.py       # Textual TUI
├── agent/
│   ├── core.py      # Provider-agnostic agent loop
│   ├── tools.py     # Tool definitions + registry
│   └── exec.py      # Parallel tool execution
├── llm/
│   ├── base.py      # Event types & LLMSession protocol
│   ├── gemini.py    # Gemini adapter (google.genai)
│   ├── openai.py    # OpenAI adapter (Responses API)
│   └── anthropic.py # Anthropic adapter (Messages API)
└── tracing/
    ├── logger.py    # TrajectoryLogger + JSONL output
    └── schema.py    # Trace event dataclasses
```
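The parallel executor in `agent/exec.py` can be sketched with a thread pool capped at `NANOCODER_MAX_TOOL_WORKERS`. This is a minimal illustration of the idea, not the project's actual code; the real executor also fires UI callbacks on tool start and end.

```python
from concurrent.futures import ThreadPoolExecutor

def run_tools(calls, registry, max_workers=10):
    """Execute (name, args) tool calls concurrently.

    Results are returned in call order even though execution overlaps,
    which keeps the transcript deterministic for tracing.
    """
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        futures = [pool.submit(registry[name], **args) for name, args in calls]
        return [f.result() for f in futures]
```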

Design Principles

  1. Unify on events, not messages: Provider adapters maintain native conversation state
  2. Provider correctness first: No cross-provider message normalization
  3. Trace everything: Consistent event schema for debugging, reproducibility, and training
  4. Minimal surface area: Thin adapters, provider-agnostic core loop

๐Ÿ“ License

MIT License - see LICENSE for details.

๐Ÿ™ Acknowledgments

Built with Textual and the provider-native SDKs (google-genai, openai, anthropic).
