
A lightweight CLI tool for AI integration in your terminal.

Project description

Termai

Termai is a lightweight, zero-dependency CLI wrapper for Google's Gemini AI, built for Termux on Android and general Linux environments. It brings the power of Large Language Models (LLMs) directly to your command line, following the Unix philosophy of piping and standard streams.

⚡ Features

  • 🚀 Lightweight: Uses standard Python requests. No heavy SDKs or complex dependencies.
  • 🟢 Unix Compatible: Supports piping (stdin). Feed logs, code, or text files directly into the AI.
  • 🛠 Configurable: Built-in JSON configuration system (ai --config) to edit System Prompts, Temperature, and Models.
  • ⚡ Fast: Defaults to gemini-2.5-flash for instant responses.
  • 🎨 Clean UI: Minimalist output with syntax-highlighted green text.
  • 🧹 Auto-Cleanup: The installer sets everything up and deletes the repository to save space.
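The requests-based design described in the first bullet can be sketched roughly as follows. The endpoint follows Google's public Gemini REST API; the function names are illustrative, not Termai's actual internals:

```python
import requests

GEMINI_URL = ("https://generativelanguage.googleapis.com/"
              "v1beta/models/{model}:generateContent")

def build_payload(prompt, system_instruction=None, temperature=0.7):
    """Assemble the JSON body for a generateContent request."""
    payload = {
        "contents": [{"parts": [{"text": prompt}]}],
        "generationConfig": {"temperature": temperature},
    }
    if system_instruction:
        payload["systemInstruction"] = {"parts": [{"text": system_instruction}]}
    return payload

def ask_gemini(prompt, api_key, model="gemini-2.5-flash"):
    """POST the prompt and return the first candidate's text."""
    resp = requests.post(
        GEMINI_URL.format(model=model),
        params={"key": api_key},
        json=build_payload(prompt),
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()["candidates"][0]["content"]["parts"][0]["text"]
```

Because the request is a single JSON POST, plain requests is enough; no provider SDK is needed.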

📥 Installation

Method 1: Global Install (Recommended)

From the project directory, you can install Termai using pip or pipx:

# Using pip
pip install .

# Using pipx (cleanest)
pipx install .

This will make the ai command available globally.

Method 2: Manual Setup (for Development)

If you want to contribute or run Termai in a development environment, follow the steps in the 🛠 Development section below.

🔑 Setup

On the very first run, Termai will ask for your Google Gemini API Key.

  1. Get a free API key from Google AI Studio.
  2. Run the command:

  ai "hello"

  3. Paste your key when prompted. It will be saved locally.

💻 Usage

  1. Basic Questions

Ask anything directly from the terminal:

ai "How do I untar a file in Linux?"

  2. Piping (The Power Move)

Feed the output of other commands into Termai. Debug an error log:

cat error.log | ai "Explain what caused this crash"

Explain a script:

cat install.sh | ai "What does this script do?"

Generate code and save it:

ai "Write a Python hello world script" > hello.py

⚙️ Configuration

Termai comes with a built-in configuration editor. You can change the AI provider, model, and personality. Run:

ai --config

This opens config.json in your preferred editor. The editor is chosen based on the following priority:

  1. The $EDITOR environment variable.
  2. vim (if installed).
  3. nano (as a fallback).
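The priority order above amounts to a short lookup; a sketch of the selection logic, assuming `shutil.which` is used to detect vim (the helper name is hypothetical):

```python
import os
import shutil

def pick_editor(env=None):
    """Resolve the editor using the documented priority order."""
    env = os.environ if env is None else env
    if env.get("EDITOR"):        # 1. honour $EDITOR if set
        return env["EDITOR"]
    if shutil.which("vim"):      # 2. prefer vim when installed
        return "vim"
    return "nano"                # 3. nano as the fallback
```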

The configuration file looks like this:

{
    "provider": "gemini",
    "proxy": "http://user:pass@127.0.0.1:1080",
    "gemini_config": {
        "api_key": "YOUR_GEMINI_KEY",
        "model_name": "gemini-2.5-flash",
        "system_instruction": "You are a CLI assistant for Termux...",
        "generation_config": {
            "temperature": 0.7,
            "maxOutputTokens": 1024
        }
    },
    "openai_config": {
        "api_key": "YOUR_OPENAI_KEY",
        "model_name": "gpt-4o",
        "system_instruction": "You are a helpful assistant.",
        "temperature": 0.7,
        "max_tokens": 1024
    }
}
  • provider: Set to "gemini" or "openai" to choose your AI provider.
  • proxy: (Optional) Set an HTTP or HTTPS proxy for all requests.
  • gemini_config: Settings for when provider is "gemini".
    • model_name: Change to gemini-2.5-pro or other available models.
    • system_instruction: Give the AI a persona.
    • temperature: Set to 1.0 for creative answers, 0.1 for precise logic.
  • openai_config: Settings for when provider is "openai".
    • model_name: Change to gpt-3.5-turbo, etc.
    • system_instruction: A different persona for ChatGPT.
    • temperature: Controls randomness.
    • max_tokens: The maximum number of tokens to generate.
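To make the dispatch concrete, here is one way these fields might be consumed. The helper names are hypothetical, not Termai's actual code:

```python
def active_settings(config):
    """Return the settings block matching the selected provider."""
    provider = config.get("provider", "gemini")
    try:
        return config[f"{provider}_config"]
    except KeyError:
        raise ValueError(f"no settings for provider {provider!r}")

def proxy_settings(config):
    """Build a proxies mapping suitable for an HTTP client."""
    proxy = config.get("proxy")
    return {"http": proxy, "https": proxy} if proxy else {}
```

A mapping like the one `proxy_settings` returns is what requests accepts via its `proxies` argument, which is how a single `proxy` field can cover all outgoing calls.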

🛠 Development

If you want to contribute or run Termai in a development environment:

  1. Clone and set up a virtual environment:

    git clone https://github.com/estiaksoyeb/termai
    cd termai
    python -m venv .venv
    source .venv/bin/activate
    
  2. Install dependencies:

    pip install -r requirements.txt
    
  3. Run in development mode:

    pip install -e .
    # Now the 'ai' command will use your local development version.
    

❓ Help & Troubleshooting

Command List:

ai --help

Re-configure API Keys:

To reset and re-enter your API keys, use the --reinstall flag.

ai --reinstall

Debug Mode:

If the AI isn't responding or you are getting errors, run:

ai --debug "your question"

This will print the raw server response and error codes.

Debug Configuration:

If you are having issues with your configuration, use the --debug-config flag to print the loaded configuration. API keys are redacted for security.

ai --debug-config
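Redaction of this kind can be done with a small recursive walk over the config dictionary; a sketch under that assumption (not Termai's actual implementation):

```python
def redact(config):
    """Return a copy of the config with any api_key values masked."""
    redacted = {}
    for key, value in config.items():
        if isinstance(value, dict):
            redacted[key] = redact(value)   # recurse into nested blocks
        elif "api_key" in key and value:
            redacted[key] = "<redacted>"
        else:
            redacted[key] = value
    return redacted
```

Returning a copy rather than mutating in place keeps the live configuration intact while the masked version is printed.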

🗑 Uninstallation

If installed via pip/pipx:

pip uninstall termai
# or
pipx uninstall termai

📄 License

This project is licensed under the MIT License. You are free to use, modify, and distribute this software. See the LICENSE file for more details.

Made with ❤️ for CLI enthusiasts

Download files

Download the file for your platform.

Source Distribution

termux_ai-0.2.0.tar.gz (12.0 kB)

Uploaded Source

Built Distribution


termux_ai-0.2.0-py3-none-any.whl (12.7 kB)

Uploaded Python 3

File details

Details for the file termux_ai-0.2.0.tar.gz.

File metadata

  • Download URL: termux_ai-0.2.0.tar.gz
  • Upload date:
  • Size: 12.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for termux_ai-0.2.0.tar.gz:

  • SHA256: b86b6a28811b269dcc14cfa977370105c41e5af6bba2f62bcd7dadcf898defe7
  • MD5: 833b75976dfa7f880e9f2385ad1a7bb0
  • BLAKE2b-256: 1d896f497759000ef77e69c85f4eaf30d54398b9455f8da5a104c7b22722053a


Provenance

The following attestation bundles were made for termux_ai-0.2.0.tar.gz:

Publisher: release.yml on attajak/termai

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file termux_ai-0.2.0-py3-none-any.whl.

File metadata

  • Download URL: termux_ai-0.2.0-py3-none-any.whl
  • Upload date:
  • Size: 12.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for termux_ai-0.2.0-py3-none-any.whl:

  • SHA256: 260c28c536c27e3daaa4c8b6006731b4847531add83601fad04f5484f655b2c1
  • MD5: 75cf541445cca4bedb13f9b26eeb590c
  • BLAKE2b-256: 6dec3bbcaa7810bdb5f6bb589c2163276c091147b101c69a8509b0bc27d51b41


Provenance

The following attestation bundles were made for termux_ai-0.2.0-py3-none-any.whl:

Publisher: release.yml on attajak/termai

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
