NTN - Minimal AI Coding Agent

A minimal AI agent that helps with coding tasks in a workspace. Supports multiple LLM providers (OpenAI GPT-5.2, Anthropic Claude).

Features

  • Multi-provider support: GPT-5.2 (default), Claude Opus/Sonnet/Haiku
  • Docker-first file operations: All file operations run in a Docker container with Unix tools
  • Web search: Search using DuckDuckGo (ddgs package)
  • Web fetching: Fetch and read webpage content
  • Terminal execution: Run Windows commands when needed
  • Persistent container: Single container per session, auto-starts on launch
  • Command denylist: Dangerous commands require user confirmation
  • Colored output: Easy-to-read console with color-coded messages
  • Debug logging: Incremental logging to debug/ folder (crash-resilient)
  • Resume sessions: Continue previous conversations with -r flag
  • Mid-turn resume: Automatically recovers from crashes mid-tool-execution
  • Auto-compact: Automatically summarizes context when approaching token limit
  • Auto-cleanup: Empty conversations (no user messages) are automatically deleted
  • Rate limit handling: Automatically waits and retries using retry-after header
  • Prompt caching: System prompt and tools are cached to reduce costs
  • Model selection: Choose between GPT and Claude models with -m flag
  • Streaming output: Real-time response display (always enabled)
  • Cost tracking: Shows per-request and session costs with token usage
  • Extended thinking: Enable deep reasoning for complex tasks with -t flag
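
As an illustration of the rate-limit handling described above, here is a minimal sketch of waiting out a rate limit and retrying. This is not the package's actual code; `RateLimitError`, `call_with_retry`, and the `retry_after` attribute are hypothetical names standing in for whatever the provider SDK raises.

```python
import time

class RateLimitError(Exception):
    """Stand-in for a provider's HTTP 429 error carrying a Retry-After value."""
    def __init__(self, retry_after):
        super().__init__(f"rate limited, retry after {retry_after}s")
        self.retry_after = retry_after  # seconds to wait before retrying

def call_with_retry(send, max_retries=5):
    """Call `send()`, sleeping for the server-provided retry-after on failure."""
    for attempt in range(max_retries):
        try:
            return send()
        except RateLimitError as exc:
            if attempt == max_retries - 1:
                raise  # give up after the final attempt
            time.sleep(exc.retry_after)
```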

Installation

Install from PyPI:

pip install ntn

Or install from source:

git clone https://github.com/ntrnghia/coding-agent.git
cd coding-agent
pip install -e .

Setup

Set your API key based on the model you want to use:

For GPT-5.2 (default):

export OPENAI_API_KEY='your-api-key-here'

For Claude models:

export ANTHROPIC_API_KEY='your-api-key-here'

(Optional) Install Docker for sandbox functionality.

Usage

Run the agent:

ntn

Resume a previous session:

# Resume most recent session
ntn -r

# Resume specific session
ntn -r debug/debug_20251210_120000.txt

Enable extended thinking (better for complex reasoning):

ntn -t

Use a different model:

ntn -m gpt     # Use GPT-5.2 (default)
ntn -m opus    # Use Claude Opus 4.5
ntn -m sonnet  # Use Claude Sonnet 4.5
ntn -m haiku   # Use Claude Haiku 4.5

Combine flags:

ntn -t -r           # Resume with extended thinking
ntn -m opus -t      # Opus with extended thinking

Alternative: Run as Python module:

python -m ntn

Input controls:

  • Shift+Enter - New line (shows \)
  • Enter - Submit message
  • Ctrl+C - Exit the agent

Example prompts:

  • "Create a new Python project with main.py and tests/"
  • "Search for PyTorch distributed training docs"
  • "List all Python files in this directory"
  • "Run pytest on my tests"
  • "Tell me what the code in D:\Downloads\some-project does" (uses Docker sandbox)

Package Structure

ntn/
├── src/ntn/
│   ├── __init__.py    # Package exports
│   ├── __main__.py    # Entry for `python -m ntn`
│   ├── agent.py       # Main agent with auto-compact and resume support
│   ├── tools.py       # Tool implementations (Terminal, Web, Docker)
│   ├── providers.py   # LLM provider abstraction (OpenAI, Anthropic)
│   ├── config.py      # Configuration loader
│   ├── config.yaml    # Configuration values
│   └── cli.py         # CLI entry point
├── pyproject.toml     # Package configuration
├── LICENSE            # MIT License
└── README.md          # This file

Tools

Terminal Tool

Executes shell commands in your workspace. Dangerous commands (rm, sudo, curl, etc.) require user confirmation before execution.
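
A denylist check like the one above might be sketched as follows. This is illustrative only: the real denylist and matching rules live in the package's config, and `needs_confirmation` is a hypothetical name.

```python
import shlex

DENYLIST = {"rm", "sudo", "curl"}  # illustrative subset of dangerous commands

def needs_confirmation(command: str) -> bool:
    """Return True if any token of the command matches the denylist."""
    try:
        tokens = shlex.split(command)
    except ValueError:
        return True  # unparseable input is treated as dangerous
    return any(tok in DENYLIST for tok in tokens)
```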

Web Search Tool

Searches the web using DuckDuckGo and returns the top 10 results.

Fetch Web Tool

Fetches and extracts text content from URLs.
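
Text extraction from a fetched page can be done with the standard-library HTML parser; the sketch below is an assumption about the approach, not the package's implementation (which may use a third-party library instead).

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect visible text, skipping <script> and <style> contents."""
    SKIP = {"script", "style"}

    def __init__(self):
        super().__init__()
        self.parts = []
        self._skip_depth = 0

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIP:
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in self.SKIP and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if not self._skip_depth and data.strip():
            self.parts.append(data.strip())

def extract_text(html: str) -> str:
    parser = TextExtractor()
    parser.feed(html)
    return " ".join(parser.parts)
```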

Docker Sandbox Tool

All file operations run in a Docker container for a consistent Unix environment:

  • Auto-starts on launch with workspace pre-mounted
  • Single persistent container per session (named agent_<timestamp>)
  • Directories mounted at Unix-style paths: D:\Downloads\project → /d/downloads/project
  • Read-write access to all mounted directories
  • Multiple directories can be mounted dynamically
  • Container persists across prompts and survives resume
  • Lazy recovery: If the container stops, it auto-restarts on the next command
  • Uses python:slim image by default
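
The Windows-to-container path mapping shown above can be sketched as a small translation function. This is a plausible reconstruction, not the package's actual code; `to_container_path` is an illustrative name.

```python
import re

def to_container_path(windows_path: str) -> str:
    """Map a Windows path like D:\\Downloads\\project to /d/downloads/project."""
    m = re.match(r"^([A-Za-z]):[\\/](.*)$", windows_path)
    if not m:
        return windows_path  # already Unix-style; pass through unchanged
    drive, rest = m.groups()
    return "/" + drive.lower() + "/" + rest.replace("\\", "/").lower()
```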

Context Management

The agent automatically manages context when approaching token limits:

  1. Auto-compact triggers: Summarizes older conversation turns
  2. Preserves current task: Summary includes your current question
  3. Seamless continuation: You won't notice the compaction
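
The compaction loop above can be sketched roughly as follows. `count_tokens` and `summarize` stand in for the real tokenizer and the LLM summarization call; the two-turns-at-a-time folding is an illustrative choice, not necessarily what the agent does.

```python
def auto_compact(turns, token_limit, count_tokens, summarize):
    """Fold the oldest turns into a summary until the conversation fits.

    `turns` is a list of message strings. Recent turns are preserved so the
    current task stays in context.
    """
    while len(turns) > 2 and sum(count_tokens(t) for t in turns) > token_limit:
        summary = summarize(turns[:2])   # condense the two oldest turns
        turns = [summary] + turns[2:]    # keep everything more recent
    return turns
```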

Debug file shows compaction events:

=== COMPACTION EVENT ===
Reason: Exceeded context (180000 tokens attempted)
Removed turns: 1-3
Summary content: [condensed conversation]

Resume Sessions

Sessions are logged incrementally to debug/debug_<timestamp>.txt. To resume:

# Resume most recent session
ntn -r

# Resume specific session
ntn -r debug/debug_20251210_120000.txt

On resume:

  • Previous conversation is displayed (including tool operations)
  • Context is restored (including any compacted summaries)
  • Container state is restored (mounts preserved)
  • New messages append to the same debug file
  • Crash recovery: If the agent crashed mid-turn, it will automatically continue from where it left off
  • Multi-model support: Can resume with a different model than originally used
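
Because the timestamped filenames sort lexicographically, picking the most recent session for `ntn -r` can be as simple as the sketch below (a hypothetical helper, not the package's actual function).

```python
from pathlib import Path

def latest_session(debug_dir="debug"):
    """Return the newest debug_<timestamp>.txt, or None if none exist."""
    files = sorted(Path(debug_dir).glob("debug_*.txt"))
    return files[-1] if files else None
```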

Debug Log Format

Debug files use an incremental format for crash resilience:

=== TURN 1 ===
--- USER ---
<user message>
--- ASSISTANT ---
<JSON response>
--- USAGE: {"model": "gpt", "input": 1000, "output": 50, ...} ---
--- TOOL_RESULT ---
<JSON tool results>
--- END_TURN ---

Each block is written immediately, so even if the agent crashes, the debug file contains all completed operations.
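
A parser for this format can detect a mid-turn crash by checking whether the final turn reached its END_TURN marker. The sketch below assumes the markers shown above; `parse_debug_log` is an illustrative name.

```python
import re

def parse_debug_log(text: str):
    """Split an incremental debug log into turns.

    Returns dicts with the turn number, raw body, and whether the turn
    completed (reached END_TURN). An incomplete final turn indicates a
    mid-turn crash that resume should continue from.
    """
    turns = []
    pattern = r"=== TURN (\d+) ===\n(.*?)(?=(?:=== TURN \d+ ===)|\Z)"
    for match in re.finditer(pattern, text, re.S):
        number, body = int(match.group(1)), match.group(2)
        turns.append({
            "turn": number,
            "body": body.strip(),
            "complete": "--- END_TURN ---" in body,
        })
    return turns
```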

Output Format

The agent uses colored output for readability:

  • 🟢 Green: Agent messages
  • 🟡 Yellow: Tool operations (📂 List files, 📄 Read file, ✏️ Edit file, 🐳 Docker, etc.)
  • 🟣 Magenta: Thinking indicator (when extended thinking enabled)
  • 🔵 Cyan: System messages, user prompts
  • 🔴 Red: Errors

Full JSON input/output is logged to debug/debug_<timestamp>.txt for debugging.
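
Color-coding like this is typically done with ANSI escape codes; a minimal helper might look like the sketch below (illustrative names, not the package's API).

```python
# ANSI escape codes for the colors used in the console output.
COLORS = {"green": "\033[32m", "yellow": "\033[33m", "magenta": "\033[35m",
          "cyan": "\033[36m", "red": "\033[31m"}
RESET = "\033[0m"

def colorize(text: str, color: str) -> str:
    """Wrap text in the ANSI code for `color`, resetting afterwards."""
    return COLORS[color] + text + RESET
```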

Security Notes

  • Commands run without timeout (for long-running processes)
  • Dangerous commands require explicit user confirmation
  • Docker sandbox provides isolated environment for external directories
  • All commands run in the specified workspace directory
  • Never commit API keys to version control
