
Project description

MahanAI Super

Terminal AI agent (Super 2.0) with multi-model support, streaming chat, tools, and a built-in Claude CLI mode. Docs: MahanAI.

Install

pip install mahanai
mahanai

Launch Options

| Flag | Description |
| --- | --- |
| --compact | Compact mode: renders a small MAI ASCII logo and a trimmed header (no streaming hint, no /api-key reminder) |
mahanai --compact

The compact banner looks like:

===================================
  Super 2.0  |  <Model>  |
  /help  /exit  /quit
===================================

Models

MahanAI Super supports multiple backends selectable at runtime via /models.

NVIDIA NIM

| Pretty Name | Model ID | Backend |
| --- | --- | --- |
| Llama 3.3 70B | meta/llama-3.3-70b-instruct | NVIDIA NIM (direct) |

Note: A legacy server mode (mahanai/mahanai) exists in the model selector but is undocumented and not recommended for use.

Claude

| Pretty Name | Model ID | Backend |
| --- | --- | --- |
| Claude Opus 4 | claude-opus-4-7 | Claude CLI |
| Claude Sonnet 4.6 | claude-sonnet-4-6 | Claude CLI |
| Claude Haiku 4.5 | claude-haiku-4-5-20251001 | Claude CLI |

OpenAI Codex

Seven models are available, each accessible in Direct and Indirect mode (see OpenAI Codex below):

| Pretty Name | Model ID |
| --- | --- |
| GPT-5.4 | gpt-5.4 |
| GPT-5.2-Codex | gpt-5.2-codex |
| GPT-5.1-Codex-Max | gpt-5.1-codex-max |
| GPT-5.4-Mini | gpt-5.4-mini |
| GPT-5.3-Codex | gpt-5.3-codex |
| GPT-5.2 | gpt-5.2 |
| GPT-5.1-Codex-Mini | gpt-5.1-codex-mini |

Switch models interactively with /models (arrow-key selector), or quick-switch with /mode claude and /mode default.

Custom Endpoint

Point MahanAI at any OpenAI-compatible API (Ollama, LM Studio, vLLM, OpenRouter, etc.):

/custom http://localhost:11434/v1 llama3 [optional-api-key]

Once saved, select Custom Endpoint from /models to start using it. The config persists across sessions.

Commands

| Command | Description |
| --- | --- |
| /models | Interactive model selector (↑↓ arrows, Enter to confirm, Esc to cancel) |
| /mode claude | Quick-switch to Claude Sonnet 4.6 |
| /mode default | Quick-switch back to MahanAI Super (server) |
| /api-key [key] | Save server API key (omit key for hidden prompt) |
| /api-key clear | Remove saved server key |
| /api-key-nvidia [key] | Save NVIDIA direct API key |
| /api-key-nvidia clear | Remove NVIDIA key, switch back to server |
| /codex-login | Sign in to OpenAI via browser (Codex Direct mode) |
| /codex-logout | Remove saved OpenAI Codex credentials |
| /custom [url [model [key]]] | Configure a custom OpenAI-compatible endpoint |
| /custom clear | Remove saved custom endpoint |
| /help | Show help |
| /exit or /quit | Leave |

API Keys

Server / NVIDIA NIM

  1. Environment: MAHANAI_API_KEY=...
  2. Project .env: MAHANAI_API_KEY=...
  3. In-app: /api-key your-key

Keys are stored under %APPDATA%\MahanAI\config.json on Windows or ~/.config/mahanai/config.json on Linux/macOS.
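
The platform-specific path logic can be sketched as a small helper — a hypothetical reconstruction, not MahanAI's actual code — which also honors the MAHANAI_CONFIG_DIR override from the Environment Variables section:

```python
import os
import sys
from pathlib import Path

def config_path() -> Path:
    """Resolve the MahanAI config file location (hypothetical sketch).

    Checks the MAHANAI_CONFIG_DIR override first, then falls back to the
    platform defaults documented above.
    """
    override = os.environ.get("MAHANAI_CONFIG_DIR")
    if override:
        return Path(override) / "config.json"
    if sys.platform == "win32":
        # %APPDATA%\MahanAI\config.json
        base = Path(os.environ.get("APPDATA", str(Path.home()))) / "MahanAI"
    else:
        # ~/.config/mahanai/config.json
        base = Path.home() / ".config" / "mahanai"
    return base / "config.json"
```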

Claude CLI mode

Claude models use your local claude CLI installation. Make sure Claude Code is installed and on your PATH. No extra API key configuration needed inside MahanAI — it uses whatever account Claude CLI is authenticated with.
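
Conceptually, the integration only needs to locate the claude binary and shell out to it. The sketch below is an assumption about how such a wrapper could look; the -p/--print and --model flags are guesses at the CLI's interface, not MahanAI's confirmed invocation:

```python
import shutil

def claude_available() -> bool:
    """True if the `claude` binary is on PATH; auth is handled by the CLI itself."""
    return shutil.which("claude") is not None

def build_claude_argv(prompt: str, model: str = "claude-sonnet-4-6") -> list[str]:
    """Build argv for a hypothetical one-shot Claude CLI call.

    The flags here are illustrative assumptions, not documented behavior.
    """
    return ["claude", "-p", prompt, "--model", model]
```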

OpenAI Codex

MahanAI supports two Codex authentication modes:

Direct mode

Signs in to your OpenAI account via a browser-based OAuth PKCE flow — no API key needed.

/codex-login

This opens your browser to auth.openai.com. After you approve, MahanAI receives and stores the access token automatically. Tokens are refreshed silently before they expire (saved to the same config.json as other keys).
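
The PKCE part of the flow is standardized (RFC 7636): the client generates a random code_verifier and sends its S256 hash as the code_challenge, proving possession later without a client secret. A minimal sketch of that pair generation:

```python
import base64
import hashlib
import secrets

def make_pkce_pair() -> tuple[str, str]:
    """Generate a PKCE code_verifier and its S256 code_challenge (RFC 7636)."""
    # 32 random bytes -> 43-char base64url verifier (padding stripped)
    verifier = base64.urlsafe_b64encode(secrets.token_bytes(32)).rstrip(b"=").decode()
    digest = hashlib.sha256(verifier.encode("ascii")).digest()
    challenge = base64.urlsafe_b64encode(digest).rstrip(b"=").decode()
    return verifier, challenge
```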

Indirect mode

Reads credentials from a locally installed and signed-in OpenAI Codex CLI. MahanAI looks for auth.json in these locations:

| Platform | Paths checked |
| --- | --- |
| Windows | %LOCALAPPDATA%\OpenAI\Codex\auth.json, ~\.codex\auth.json |
| macOS / Linux | ~/.codex/auth.json, ~/.config/codex/auth.json |

If no token file is found, MahanAI falls back to running the codex CLI as a subprocess (requires Codex CLI on your PATH).

To use indirect mode, install and sign in to the Codex CLI first:

npm i -g @openai/codex
codex login

Then select any OpenAI Codex (Indirect) model from /models.

Custom Endpoint

Use /custom to connect to any OpenAI-compatible server — Ollama, LM Studio, vLLM, OpenRouter, or your own deployment.

Interactive setup (prompts for each field):

/custom

One-liner:

/custom <base-url> [model] [api-key]

Examples:

/custom http://localhost:11434/v1 llama3
/custom http://localhost:1234/v1 mistral-7b
/custom https://openrouter.ai/api/v1 openai/gpt-4o sk-or-...
  • base-url — the /v1 base URL of the server
  • model — model ID to send in requests (defaults to gpt-3.5-turbo if omitted)
  • api-key — leave blank if the server doesn't require one
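
Under the hood, a custom endpoint simply receives standard OpenAI-style chat-completion requests. A minimal sketch of the request a client like MahanAI might build (the real client also handles streaming, tools, and retries):

```python
def build_chat_request(base_url: str, model: str, prompt: str, api_key: str = ""):
    """Assemble URL, headers, and JSON body for a POST to /chat/completions."""
    url = f"{base_url.rstrip('/')}/chat/completions"
    headers = {"Content-Type": "application/json"}
    if api_key:
        # Only sent when the server requires authentication
        headers["Authorization"] = f"Bearer {api_key}"
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }
    return url, headers, body
```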

After saving, run /models and select Custom Endpoint; if you haven't switched yet, the agent will remind you. To remove the config:

/custom clear

Environment Variables

| Variable | Purpose |
| --- | --- |
| MAHANAI_API_KEY | Override saved server API key |
| MAHANAI_MODEL | Override default model ID |
| MAHANAI_STREAM | Set to 0/false/no/off to disable streaming |
| MAHANAI_CONFIG_DIR | Override config file directory |
| NO_COLOR | Disable terminal colors |

Tools

MahanAI can execute tools on your behalf:

  • run_command — run shell commands (asks confirmation before destructive ops)
  • read_file — read a file
  • write_file — write a file
  • append_file — append to a file
  • list_directory — list directory contents
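
The confirmation gate on run_command implies some classifier for destructive operations. MahanAI's actual check is undocumented; a hypothetical pattern-based version might look like this:

```python
import re

# Hypothetical patterns; MahanAI's real destructive-command check is not documented.
DESTRUCTIVE_PATTERNS = [
    r"\brm\b", r"\brmdir\b", r"\bdel\b",
    r"\bmkfs\b", r"\bdd\b",
    r">\s*\S",  # shell redirection that overwrites a file
]

def needs_confirmation(command: str) -> bool:
    """Return True if a shell command looks destructive and should be confirmed."""
    return any(re.search(p, command) for p in DESTRUCTIVE_PATTERNS)
```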

Develop

pip install -e .
python -m mahanai

Publish to PyPI

Bump version in both pyproject.toml and mahanai/__init__.py, then:

pip install build twine
python -m build
python -m twine check dist/*

Windows (PowerShell):

$env:TWINE_USERNAME = "__token__"
$env:TWINE_PASSWORD = "pypi-YOUR_TOKEN_HERE"
python -m twine upload dist/mahanai-4.0.0*

macOS / Linux:

export TWINE_USERNAME=__token__
export TWINE_PASSWORD=pypi-YOUR_TOKEN_HERE
python -m twine upload dist/mahanai-4.0.0*

twine cannot publish without your token; keep it out of git.

Download files


Source Distribution

mahanai-4.4.11.tar.gz (22.8 kB)

Built Distribution


mahanai-4.4.11-py3-none-any.whl (22.5 kB)

File details

Details for the file mahanai-4.4.11.tar.gz.

File metadata

  • Download URL: mahanai-4.4.11.tar.gz
  • Upload date:
  • Size: 22.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.13

File hashes

Hashes for mahanai-4.4.11.tar.gz

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | fa15ec24b9b2d7946871d895547c205ae728abb4141855ad098259942b91562e |
| MD5 | 26e7c2afe6efe5b9ed58bbd14a70c6fc |
| BLAKE2b-256 | 772743fb9aabd96da7416fbc7a40a8bab6925f0d95898af71276d7eabed3d019 |

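
A downloaded artifact can be checked against the digests above, for example with a small streaming SHA256 helper:

```python
import hashlib
from pathlib import Path

def sha256_matches(path: Path, expected_hex: str) -> bool:
    """Check a downloaded file against its published SHA256 digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in 64 KiB chunks so large files don't load into memory at once
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest() == expected_hex.lower()
```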

File details

Details for the file mahanai-4.4.11-py3-none-any.whl.

File metadata

  • Download URL: mahanai-4.4.11-py3-none-any.whl
  • Upload date:
  • Size: 22.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.13

File hashes

Hashes for mahanai-4.4.11-py3-none-any.whl

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | d308ab2e0a371eba29c277a6cee6a7782f97fa62ae5bad78303b94991a734301 |
| MD5 | 41731b57494f93cee5deaaec4d1f8ccb |
| BLAKE2b-256 | aa6951134a299168f7b9250e41d3079d35c88933562992ee0c7ebefb0a9b9d37 |

