
Microcode

Minimal Terminal Agent powered by RLMs


Microcode is an efficient terminal-based AI agent with an internal REPL environment for coding assistance. It leverages Reasoning Language Models (RLMs) to help developers with coding tasks directly from the command line. Because it relies solely on RLMs, it can handle extra-large code snippets, file contents, and pasted content without dumping them directly into the context window. Run with the --verbose flag to view the trajectories, or "internal monologue", of the agent.

Warning: Microcode is currently in beta and does not yet have the standard guardrails (such as asking the user to accept changes) that production coding agents provide. Use at your own risk.

Features

  • Verbose Output - Detailed trajectory logging for debugging
  • Interactive CLI - Seamless conversational interface with AI models
  • Multiple Model Support - Choose from various AI providers (OpenAI, Anthropic, Google, Qwen, etc.)
  • MCP Integration - Model Context Protocol server support for extended capabilities
  • Smart Caching - Persistent settings, API keys, and model configurations
  • Rich Terminal UI - Beautiful output with markdown rendering, gradient banners, and status indicators
  • Paste Support - Handle large code snippets and file contents with ease
  • Configurable - Environment variables and CLI flags for full customization

Installation

Prerequisites

  • Python 3.11 or higher
  • uv (recommended) or pip
  • Deno (RLM runtime for the REPL)

Install Deno with one of the following:

curl -fsSL https://deno.land/install.sh | sh   # shell installer (macOS/Linux)
brew install deno                              # or via Homebrew

Install via uv (recommended)

uv tool install microcode

Upgrade via uv

uv tool upgrade microcode

Install via pip

pip install microcode

Install from source

git clone https://github.com/modaic-ai/microcode.git
cd microcode
uv sync  # or: pip install -e .

Configuration

API Key Setup

Microcode uses OpenRouter for model access. Set your API key using one of these methods:

  1. Environment Variable (recommended for CI/CD):

    export OPENROUTER_API_KEY="your-api-key"
    
  2. Interactive Setup (persisted to cache):

    microcode
    /key  # Then enter your API key when prompted
    

Environment Variables

| Variable | Description | Default |
|----------|-------------|---------|
| OPENROUTER_API_KEY | OpenRouter API key | - |
| MICROCODE_MODEL | Primary model ID | Auto-selected |
| MICROCODE_SUB_LM | Sub-model for auxiliary tasks | Auto-selected |
| MICROCODE_MAX_ITERATIONS | Max iterations per task | - |
| MICROCODE_MAX_TOKENS | Max tokens per response | - |
| MICROCODE_MAX_OUTPUT_CHARS | Max output characters | - |
| MICROCODE_API_BASE | Custom API base URL | - |
| MICROCODE_VERBOSE | Enable verbose logging (1/0) | 0 |
| MODAIC_ENV / MICROCODE_ENV | Environment (dev/prod) | prod |

Usage

Starting the CLI

microcode

CLI Options

microcode --help
| Flag | Description |
|------|-------------|
| --model, -m | Override primary model |
| --sub-lm, -s | Override sub-model |
| --api-key, -k | Provide API key directly |
| --max-iterations | Set max iterations |
| --max-tokens | Set max tokens |
| --max-output-chars | Set max output characters |
| --api-base | Custom API base URL |
| --verbose, -v | Enable verbose output |
| --env | Set environment (dev/prod) |
| --history-limit | Conversation history limit |
| --no-banner | Disable startup banner |
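The flag table above can be mirrored with argparse as a sketch of how the options compose; this is not microcode's actual parser, just a faithful restatement of the table.

```python
# Sketch of the CLI surface using argparse; mirrors the flag table only.
import argparse

parser = argparse.ArgumentParser(prog="microcode")
parser.add_argument("--model", "-m", help="Override primary model")
parser.add_argument("--sub-lm", "-s", help="Override sub-model")
parser.add_argument("--api-key", "-k", help="Provide API key directly")
parser.add_argument("--max-iterations", type=int, help="Set max iterations")
parser.add_argument("--max-tokens", type=int, help="Set max tokens")
parser.add_argument("--max-output-chars", type=int, help="Set max output characters")
parser.add_argument("--api-base", help="Custom API base URL")
parser.add_argument("--verbose", "-v", action="store_true", help="Enable verbose output")
parser.add_argument("--env", choices=["dev", "prod"], default="prod")
parser.add_argument("--history-limit", type=int, help="Conversation history limit")
parser.add_argument("--no-banner", action="store_true", help="Disable startup banner")

# Example invocation (flag values are illustrative):
args = parser.parse_args(["--verbose", "--max-iterations", "25", "--no-banner"])
```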

Interactive Commands

Once in the CLI, use these commands:

| Command | Description |
|---------|-------------|
| /help, /h, ? | Show help menu |
| /q, exit | Exit the CLI |
| /clear, /cls | Clear the terminal screen |
| /c | Clear conversation history |
| /key [key] | Set OpenRouter API key (or enter interactively) |
| /key clear | Remove stored API key |
| /model | Change primary model via TUI selector |
| /model <id> | Set primary model directly |
| /mcp add <name> <command> | Add an MCP server |

Available Models

| # | Model | Provider |
|---|-------|----------|
| 1 | GPT-5.2 Codex | OpenAI |
| 2 | GPT-5.2 | OpenAI |
| 3 | Claude Opus 4.5 | Anthropic |
| 4 | Claude Opus 4 | Anthropic |
| 5 | Qwen 3 Coder | Qwen |
| 6 | Gemini 3 Flash Preview | Google |
| 7 | Kimi K2 0905 | Moonshot AI |
| 8 | Minimax M2.1 | Minimax |

Project Structure

microcode/
├── main.py              # Entry point and interactive CLI loop
├── pyproject.toml       # Project configuration and dependencies
├── utils/
│   ├── __init__.py
│   ├── cache.py         # API key and settings persistence
│   ├── constants.py     # Colors, models, paths, and banner art
│   ├── display.py       # Terminal rendering and UI utilities
│   ├── mcp.py           # MCP server integration
│   ├── models.py        # Model selection and configuration
│   └── paste.py         # Clipboard and paste handling
└── tests/
    └── test_main_settings.py

Key Components

  • main.py - Orchestrates the interactive session, handles user input, manages conversation history, and invokes the AI agent via Modaic's AutoProgram
  • utils/cache.py - Secure storage for API keys and user preferences using JSON files
  • utils/constants.py - Centralized configuration including available models, ANSI color codes, and file paths
  • utils/display.py - Terminal output formatting, markdown rendering, and the startup banner
  • utils/models.py - Model selection TUI using Textual, model ID normalization, and agent reconfiguration
  • utils/mcp.py - Model Context Protocol server registration and management
  • utils/paste.py - Handles large text inputs via placeholder replacement
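The placeholder-replacement idea behind the paste handling can be sketched as follows: a large paste is stored under a short token that stands in for it on screen, and the token is expanded back to the full text just before the prompt is sent to the model. Function names, the token format, and the size threshold are assumptions, not utils/paste.py's actual interface.

```python
# Sketch of paste handling via placeholder replacement (assumed interface).
_PASTES: dict[str, str] = {}

def stash_paste(text: str, threshold: int = 500) -> str:
    """Replace a large paste with a short placeholder token; pass small text through."""
    if len(text) <= threshold:
        return text
    token = f"[pasted #{len(_PASTES) + 1}: {len(text)} chars]"
    _PASTES[token] = text
    return token

def expand_pastes(prompt: str) -> str:
    """Substitute stored paste contents back in before sending to the model."""
    for token, text in _PASTES.items():
        prompt = prompt.replace(token, text)
    return prompt
```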

Development

Setting Up Development Environment

git clone https://github.com/modaic-ai/microcode.git
cd microcode
uv sync --dev

Running Tests

uv run pytest tests/

Code Style

The project follows standard Python conventions. Use type hints for all function signatures.

Dependencies

Core dependencies:

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

  1. Fork the repository
  2. Create your feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add some amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

License

This project is licensed under the MIT License - see the LICENSE file for details.

Links


Built with ❤️ by Modaic

