Production-grade TUI for AI-assisted development with RLM context management


opencode-rlm

AI-powered development tool with a TUI and Recursive Language Model (RLM) context management, enabling conversations that extend beyond the model's context window.

Features

  • RLM Context Management: Automatic archival and retrieval of context enable conversations beyond the model's context window
  • Multi-Provider Support: Anthropic (Claude), OpenAI (GPT-4), and more
  • Rich TUI Interface: Full-featured terminal UI with Textual
  • 14 Built-in Tools: bash, read, write, edit, glob, grep, todo, webfetch, memory, and more
  • Session Management: Fork, export, and manage conversation sessions
  • Slash Commands: /context, /search, /compact, /models, /agents

Installation

Via npm (recommended)

npm install -g opencode-rlm

Via pip

pip install opencode-rlm

From source

git clone https://github.com/tekcin/opencode-rlm.git
cd opencode-rlm
pip install -e .

Requirements

  • Python 3.11+
  • Node.js 16+ (for npm installation)

Usage

# Start the TUI
opencode

# Start with a specific project
opencode /path/to/project

# Continue last session
opencode --continue

# Run a single command (non-interactive)
opencode run "explain this codebase"

# List available models
opencode models

Configuration

Create ~/.config/opencode/config.json:

{
  "model": "anthropic/claude-sonnet-4",
  "theme": "opencode",
  "rlm": {
    "enabled": true,
    "threshold_ratio": 0.33,
    "auto_retrieve": true
  }
}

Environment Variables

export ANTHROPIC_API_KEY=sk-ant-...
export OPENAI_API_KEY=sk-...

RLM (Recursive Language Model)

RLM extends conversations beyond the model's context window by automatically:

  1. Tracking token usage across the conversation
  2. Archiving older context when threshold is reached (default: 33%)
  3. Retrieving relevant archived context for new queries
  4. Injecting context into the conversation seamlessly
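The four steps above can be sketched as a simple loop. This is a minimal illustration of the idea, with a crude stand-in token counter and naive keyword retrieval; none of these class or function names are the package's real API:

```python
# Sketch of the RLM archival/retrieval cycle described above.
# Token counting, storage, and retrieval are all simplified.

CONTEXT_WINDOW = 200_000   # model context window, in tokens (assumed)
THRESHOLD_RATIO = 0.33     # matches the default in the config example

def count_tokens(message: str) -> int:
    # Crude stand-in: real counting would use the model's tokenizer.
    return max(1, len(message) // 4)

class RLMContext:
    def __init__(self):
        self.active: list[str] = []    # messages still in the prompt
        self.archive: list[str] = []   # messages moved out of the prompt

    def tokens_used(self) -> int:
        """Step 1: track token usage across the active conversation."""
        return sum(count_tokens(m) for m in self.active)

    def add(self, message: str) -> None:
        """Step 2: archive oldest context once usage crosses the threshold."""
        self.active.append(message)
        while (self.tokens_used() > THRESHOLD_RATIO * CONTEXT_WINDOW
               and len(self.active) > 1):
            self.archive.append(self.active.pop(0))  # oldest first

    def retrieve(self, query: str) -> list[str]:
        """Step 3: naive keyword match over archived context."""
        words = set(query.lower().split())
        return [m for m in self.archive if words & set(m.lower().split())]

    def build_prompt(self, query: str) -> list[str]:
        """Step 4: inject retrieved context ahead of the active messages."""
        return self.retrieve(query) + self.active + [query]
```

In practice, retrieval would use embeddings or a smarter index rather than keyword overlap, but the archive-then-retrieve shape is the same.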

Manual RLM Commands

  • /context - Show current token usage
  • /search <query> - Search archived context
  • /compact - Manually trigger context archival

RLM Tools

  • memory - Search and retrieve archived context
  • rlm_query - Execute Python queries on archived context

License

MIT

Author

Michael Thornton (tekcin@yahoo.com)

Project details


Download files

Download the file for your platform.

Source Distribution

opencode_rlm-0.1.0.tar.gz (81.7 kB)


Built Distribution


opencode_rlm-0.1.0-py3-none-any.whl (115.7 kB)


File details

Details for the file opencode_rlm-0.1.0.tar.gz.

File metadata

  • Download URL: opencode_rlm-0.1.0.tar.gz
  • Upload date:
  • Size: 81.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.11.14

File hashes

Hashes for opencode_rlm-0.1.0.tar.gz:

  • SHA256: 48e858c7a9795156a61f11215fee9f9b49741db6892738afdf63e1215e213add
  • MD5: a6370c2f35cd6701213160667dfe830c
  • BLAKE2b-256: 561bfed322dd042690a1c122537afa24ba56a67107c39308107325078e1bc750
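The published digests can be checked against a downloaded file locally. A minimal sketch using Python's standard hashlib (the file path below is illustrative):

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Compute the SHA-256 digest of a file, reading in 1 MiB chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Example: compare against the published digest for the sdist.
# expected = "48e858c7a9795156a61f11215fee9f9b49741db6892738afdf63e1215e213add"
# assert sha256_of("opencode_rlm-0.1.0.tar.gz") == expected
```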


File details

Details for the file opencode_rlm-0.1.0-py3-none-any.whl.

File metadata

  • Download URL: opencode_rlm-0.1.0-py3-none-any.whl
  • Upload date:
  • Size: 115.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.11.14

File hashes

Hashes for opencode_rlm-0.1.0-py3-none-any.whl:

  • SHA256: b4393b88d0760950b8fa99973a5a4daff3c4ecb86ad1520da8fcc7e95fc499ad
  • MD5: fe4a97fdc5b8042c496a0a9ad05eff2c
  • BLAKE2b-256: dc6b3b45cdc00eef372a62a306b0e5ac268e7da997a09ad79939283ca81397fc

