
Official Unofficial MiniMax Platform CLI

Reason this release was yanked: too early

Project description

MiniMax CLI

 ███╗   ███╗██╗███╗   ██╗██╗███╗   ███╗ █████╗ ██╗  ██╗
 ████╗ ████║██║████╗  ██║██║████╗ ████║██╔══██╗╚██╗██╔╝
 ██╔████╔██║██║██╔██╗ ██║██║██╔████╔██║███████║ ╚███╔╝
 ██║╚██╔╝██║██║██║╚██╗██║██║██║╚██╔╝██║██╔══██║ ██╔██╗
 ██║ ╚═╝ ██║██║██║ ╚████║██║██║ ╚═╝ ██║██║  ██║██╔╝ ██╗
 ╚═╝     ╚═╝╚═╝╚═╝  ╚═══╝╚═╝╚═╝     ╚═╝╚═╝  ╚═╝╚═╝  ╚═╝

Unofficial CLI for the MiniMax M2.1 API - Not affiliated with MiniMax Inc.

A powerful, feature-rich Rust CLI for interacting with MiniMax's M2.1 model via the Anthropic-compatible API. Supports text chat, agentic workflows, image/video/audio/music generation, and a Codex-like RLM sandbox mode.

Features

  • Interactive TUI - Beautiful ratatui-based terminal UI with multiple modes
  • Agent Mode - Autonomous task execution with built-in tools
  • RLM Sandbox - Recursive Language Model REPL for large context processing
  • Multi-Modal Generation - Text, image, video, audio, and music
  • MCP Integration - Model Context Protocol server support
  • Memory System - Persistent long-term memory across sessions
  • Skills System - Extensible prompt augmentation
  • Prompt Caching - Efficient token usage with cache controls
  • Context Compaction - Auto-summarization for long conversations

Quick Start

# Install
cargo install --path .

# Configure
export MINIMAX_API_KEY=your_api_key_here
# Or create ~/.minimax/config.toml

# Interactive TUI (recommended)
minimax-cli tui

# Simple chat
minimax-cli text chat --prompt "Hello, MiniMax!"

# Agent mode with tools
minimax-cli agent run --allow-shell --prompt "List files in current directory"

# RLM sandbox for large files
minimax-cli rlm repl --load large_file.txt

Modes

Normal Mode (Chat)

Standard conversational interaction with the M2.1 model.

minimax-cli text chat
# Or in TUI: /mode normal

Agent Mode

Autonomous task execution with built-in tools:

  • list_dir - List directory contents
  • read_file - Read files from workspace
  • write_file - Write files to workspace
  • edit_file - Search/replace in files
  • exec_shell - Execute shell commands (requires --allow-shell)
  • note - Append to notes file
  • mcp_call - Call MCP server tools

minimax-cli agent run --workspace ./project --allow-shell
# Or in TUI: /mode agent or /yolo (enables shell)
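
The edit_file tool performs a search/replace in a file. A minimal Python sketch of that idea (illustrative only; the CLI's actual implementation is in Rust):

```python
from pathlib import Path

def edit_file(path: str, search: str, replace: str) -> bool:
    """Replace the first occurrence of `search` with `replace` in a file.

    Illustrative sketch of an agent-style edit_file tool, not the CLI's code.
    Returns False (and leaves the file untouched) if `search` is not found.
    """
    p = Path(path)
    text = p.read_text()
    if search not in text:
        return False
    p.write_text(text.replace(search, replace, 1))
    return True
```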

Plan Mode

Design-first workflow - plan your implementation before coding.

# In TUI: /mode plan

Edit Mode

File modification focus with AI assistance.

# In TUI: /mode edit

RLM Sandbox Mode

Recursive Language Model sandbox for processing large contexts. Based on the RLM paradigm that enables LMs to programmatically examine, decompose, and recursively process inputs of near-infinite length.

# Interactive REPL
minimax-cli rlm repl --load myfile.txt

# Command-line operations
minimax-cli rlm load --path myfile.txt --context-id main
minimax-cli rlm search --context-id main --pattern "TODO|FIXME"
minimax-cli rlm exec --context-id main --code "lines(0, 50)"

RLM Expressions:

  • len - Character count
  • line_count - Line count
  • head / tail - First/last 10 lines
  • peek(start, end) - Character slice
  • lines(start, end) - Line range
  • search("pattern") - Regex search
  • chunk(size, overlap) - Split into chunks
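
The chunk(size, overlap) expression splits a context into overlapping windows so that content spanning a chunk boundary is not lost. A minimal Python sketch of that splitting scheme (illustrative; the sandbox itself is implemented in Rust):

```python
def chunk(text: str, size: int, overlap: int) -> list[str]:
    """Split text into windows of `size` characters, each sharing
    `overlap` characters with its predecessor.

    Illustrative of the RLM chunk(size, overlap) expression.
    """
    if overlap >= size:
        raise ValueError("overlap must be smaller than size")
    out, start = [], 0
    while start < len(text):
        out.append(text[start:start + size])
        start += size - overlap
    return out
```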

Interactive TUI

The TUI provides a rich terminal interface with:

  • MiniMax branded header with logo
  • Mode indicator (Normal/Edit/Agent/Plan/RLM)
  • Chat history with scrolling
  • Status bar with keybindings
  • Help popup (F1)

TUI Commands:

Command              Description
/mode <n/e/a/p/r>    Switch modes
/yolo                Enable Agent + Shell
/help                Show help
/clear               Clear conversation
/model <name>        Change model
/compact             Toggle auto-compaction
/save <path>         Save session
/load <path>         Load session
/exit                Exit application

Keybindings:

Key                  Action
F1                   Help
Ctrl+C               Exit
Esc                  Clear input / Normal mode
Alt+Up/Down          Scroll chat
PageUp/PageDown      Fast scroll
Up/Down              History navigation

Configuration

Create ~/.minimax/config.toml:

api_key = "YOUR_MINIMAX_API_KEY"
anthropic_api_key = "YOUR_ANTHROPIC_COMPAT_API_KEY"
base_url = "https://api.minimax.io"
anthropic_base_url = "https://api.minimax.io/anthropic"

default_text_model = "MiniMax-M2.1"
output_dir = "./outputs"
allow_shell = false

[retry]
enabled = true
max_retries = 3

[compaction]
enabled = false
token_threshold = 50000
message_threshold = 50

[rlm]
max_context_chars = 10000000
session_dir = "~/.minimax/rlm"

[profiles.work]
api_key = "WORK_API_KEY"

Environment Variables:

  • MINIMAX_API_KEY - API key
  • ANTHROPIC_API_KEY - Anthropic-compatible API key
  • MINIMAX_BASE_URL - Base URL
  • MINIMAX_PROFILE - Config profile name
  • MINIMAX_ALLOW_SHELL - Enable shell (true/false)

Media Generation

Images

minimax-cli image generate --prompt "A sunset over mountains"

Videos

minimax-cli video generate --prompt "A timelapse of clouds" --wait
minimax-cli video query --task-id <id>

Audio (TTS)

minimax-cli audio t2a --text "Hello, world!" --voice-id english-1
minimax-cli audio voice list
minimax-cli audio voice clone --clone-audio sample.wav

Music

minimax-cli music generate --prompt "Upbeat electronic track"

MCP Integration

Manage Model Context Protocol servers:

# List servers
minimax-cli mcp list

# Add server
minimax-cli mcp add --name myserver --command "python" --arg "-m" --arg "mcp_server"

# Remove server
minimax-cli mcp remove --name myserver

Use MCP tools in agent mode via the mcp_call tool.

Memory & Skills

Long-term Memory

minimax-cli memory show
minimax-cli memory add --content "Important fact to remember"
minimax-cli memory clear

Use the --memory flag in agent or TUI mode to include memory in prompts.

Skills

Skills are prompt templates stored in ~/.minimax/skills/<name>/SKILL.md:

minimax-cli skills list
minimax-cli skills show --name coding

Load skills with --skill coding --skill writing in agent mode.

Prompt Caching

Optimize token usage with Anthropic-style prompt caching:

# Cache system prompt
minimax-cli text chat --cache-system

# Cache tools
minimax-cli text chat --cache-tools

# Cache user message
minimax-cli text chat --cache

# Agent with caching
minimax-cli agent run --cache-system --cache-tools --cache-memory
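
In Anthropic's Messages API, caching works by attaching a cache_control marker to a content block; --cache-system presumably marks the system prompt this way. A hedged sketch of such a request body (field names follow Anthropic's API; the CLI's exact payload may differ):

```python
# Illustrative Anthropic-style request with a cacheable system prompt.
# The cache_control marker tells the API this block may be cached and
# reused across requests; everything else here is a placeholder.
request = {
    "model": "MiniMax-M2.1",
    "system": [
        {
            "type": "text",
            "text": "You are a helpful assistant.",
            "cache_control": {"type": "ephemeral"},
        }
    ],
    "messages": [{"role": "user", "content": "Hello, MiniMax!"}],
    "max_tokens": 1024,
}
```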

Context Compaction

Auto-summarize long conversations to stay within context limits:

# Enable in config.toml:
# [compaction]
# enabled = true
# token_threshold = 50000

# Or toggle in TUI:
# /compact

When enabled, older messages are summarized when the conversation exceeds thresholds.
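
That trigger-and-summarize behavior can be sketched as follows (illustrative Python, not the CLI's Rust code; `summarize` stands in for a model call, and the defaults mirror the [compaction] config keys):

```python
def compact(messages, summarize, keep_recent=10,
            token_threshold=50_000, message_threshold=50):
    """Summarize older messages once either threshold is exceeded.

    Illustrative sketch of auto-compaction. `summarize` is a stand-in
    for a model call that condenses a list of messages into a string.
    """
    tokens = sum(len(m["content"]) // 4 for m in messages)  # rough estimate
    over = tokens > token_threshold or len(messages) > message_threshold
    if not over or len(messages) <= keep_recent:
        return messages
    older, recent = messages[:-keep_recent], messages[-keep_recent:]
    summary = {"role": "user",
               "content": "Summary of earlier conversation: " + summarize(older)}
    return [summary] + recent
```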

File Management

minimax-cli files upload --path file.txt --purpose "assistants"
minimax-cli files list --purpose "assistants"
minimax-cli files retrieve --file-id <id>
minimax-cli files retrieve-content --file-id <id> --output local.txt
minimax-cli files delete --file-id <id>

Models Registry

minimax-cli models list
minimax-cli models list --json

Session Management

Save and restore chat/agent sessions:

# In TUI or agent mode:
/save session.json
/load session.json
/export session.md

Release

GitHub Release

  1. Update version in Cargo.toml
  2. Tag release: git tag v0.1.0 && git push --tags
  3. GitHub Actions builds and publishes

PyPI (Python wrapper)

cd python && uv build && uv publish

Crates.io

cargo publish

Architecture

src/
├── main.rs          # CLI entry point (clap)
├── client.rs        # HTTP clients (MiniMax, Anthropic)
├── config.rs        # TOML config with profiles
├── models.rs        # API data structures
├── agent.rs         # Agentic tool loop
├── rlm.rs           # RLM sandbox REPL
├── compaction.rs    # Context auto-compaction
├── tui/             # Ratatui TUI
│   ├── app.rs       # App state & modes
│   └── ui.rs        # Rendering
├── modules/         # Media generation
│   ├── text.rs      # Chat
│   ├── image.rs     # Image gen
│   ├── video.rs     # Video gen
│   ├── audio.rs     # TTS
│   └── music.rs     # Music gen
├── mcp.rs           # MCP integration
├── skills.rs        # Skills system
├── memory.rs        # Long-term memory
└── session.rs       # Session save/load

Requirements

  • Rust 1.89+ (edition 2024)
  • MiniMax API key
  • macOS/Linux (Windows untested)

License

MIT


This is an unofficial, community-maintained project. MiniMax and the MiniMax logo are trademarks of MiniMax Inc.

Download files

Download the file for your platform.

Source Distribution

minimax_cli-0.1.1.tar.gz (6.1 kB)


Built Distribution


minimax_cli-0.1.1-py3-none-any.whl (6.4 kB)


File details

Details for the file minimax_cli-0.1.1.tar.gz.

File metadata

  • Download URL: minimax_cli-0.1.1.tar.gz
  • Upload date:
  • Size: 6.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for minimax_cli-0.1.1.tar.gz
Algorithm Hash digest
SHA256 16a6e944d16d7ae586de441861c10ad83ce6270382d5b5e55030507cf62a469f
MD5 ec23b7bd8663074a9676a3a9f59bc1ef
BLAKE2b-256 29f0c068d28842e283f2ca976013a8e59ec272392f95e84670727786497418e2


Provenance

The following attestation bundles were made for minimax_cli-0.1.1.tar.gz:

Publisher: publish.yml on Hmbown/MiniMax-CLI

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file minimax_cli-0.1.1-py3-none-any.whl.

File metadata

  • Download URL: minimax_cli-0.1.1-py3-none-any.whl
  • Upload date:
  • Size: 6.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for minimax_cli-0.1.1-py3-none-any.whl
Algorithm Hash digest
SHA256 097036787d57ed8b7be469ec4a3907798a3a8e84d5643f3e477f8d5f0b5a40fb
MD5 c8c0bd8bf01d3d5304aeab16a4ec4bab
BLAKE2b-256 43e17e53ec8b85dc2911eaba2e0a4492fe4b32d5f2d9c3d78f3a6f176795e812


Provenance

The following attestation bundles were made for minimax_cli-0.1.1-py3-none-any.whl:

Publisher: publish.yml on Hmbown/MiniMax-CLI

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
