Termai

A lightweight CLI tool for AI integration in your terminal.

Termai is a lightweight, zero-dependency CLI wrapper for Google's Gemini AI, built for Termux on Android and general Linux environments. It brings the power of Large Language Models (LLMs) directly to your command line, following the Unix philosophy of piping and standard streams.
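Because Termai is just a thin wrapper over HTTP, the core request it makes can be sketched with nothing but `requests` and the public Gemini REST endpoint. The snippet below is illustrative, not Termai's actual source; `build_request` and `call_gemini` are hypothetical helper names.

```python
import requests

GEMINI_URL = "https://generativelanguage.googleapis.com/v1beta/models/{model}:generateContent"

def build_request(prompt, api_key, model="gemini-2.5-flash"):
    """Build the URL, headers, and JSON body for a Gemini generateContent call."""
    url = GEMINI_URL.format(model=model)
    headers = {"x-goog-api-key": api_key, "Content-Type": "application/json"}
    payload = {"contents": [{"parts": [{"text": prompt}]}]}
    return url, headers, payload

def call_gemini(prompt, api_key):
    """Send the prompt and extract the reply text from the response JSON."""
    url, headers, payload = build_request(prompt, api_key)
    resp = requests.post(url, headers=headers, json=payload, timeout=30)
    resp.raise_for_status()
    # The reply text lives under candidates[0].content.parts[0].text
    return resp.json()["candidates"][0]["content"]["parts"][0]["text"]
```

This is the whole trick behind "zero-dependency": one POST request and one JSON lookup, no SDK required.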

⚡ Features

  • 🚀 Lightweight: Uses standard Python requests. No heavy SDKs or complex dependencies.
  • 🟢 Unix Compatible: Supports piping (stdin). Feed logs, code, or text files directly into the AI.
  • 🛠 Configurable: Built-in JSON configuration system (ai --config) to edit System Prompts, Temperature, and Models.
  • ⚡ Fast: Defaults to gemini-2.5-flash for instant responses.
  • 🎨 Clean UI: Minimalist output with syntax-highlighted green text.
  • 🧹 Auto-Cleanup: The installer sets everything up and deletes the repository to save space.

📥 Installation

Method 1: Global Install (Recommended)

You can install Termai directly using pip or pipx:

# Using pip
pip install termux-ai
# Using pipx (isolated environment)
pipx install termux-ai

This will make the ai command available globally.

Method 2: Manual Setup (for Development)

If you want to contribute or run Termai in a development environment:

git clone https://github.com/attajak/termux-ai.git
cd termux-ai
python -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
pip install -e .

Alternatively, install an editable copy directly from Git (the #egg fragment is required for editable VCS installs):

pip install -e "git+https://github.com/attajak/termux-ai.git#egg=termux-ai"

🔑 Setup

On the very first run, Termai will ask for your Google Gemini API Key.

  • Get a free API key here: Google AI Studio
  • Run the command:
  ai "hello"
  • Paste your key when prompted. It will be saved locally.
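The prompt-once-then-cache flow can be sketched as below. The storage path and JSON layout here are assumptions for illustration; Termai's real location and format may differ.

```python
import json
import os

def load_or_ask_key(path, ask=input):
    """Return the saved API key, prompting once and caching it if absent.

    `path` and the {"api_key": ...} layout are illustrative only.
    """
    if os.path.exists(path):
        with open(path) as f:
            return json.load(f)["api_key"]
    key = ask("Enter your Gemini API key: ").strip()
    os.makedirs(os.path.dirname(path) or ".", exist_ok=True)
    with open(path, "w") as f:
        json.dump({"api_key": key}, f)
    return key
```

On every run after the first, the file exists and the user is never prompted again.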

💻 Usage

  1. Basic Questions: Ask anything directly from the terminal.

  ai "How do I untar a file in Linux?"

  2. Piping (The Power Move): Feed output from other commands into Termai. Debug an error log:

cat error.log | ai "Explain what caused this crash"

Explain a script:

cat install.sh | ai "What does this script do?"

Generate code and save it:

ai "Write a Python hello world script" > hello.py

⚙️ Configuration

Termai comes with a built-in configuration editor. You can change the AI provider, model, and personality. Run:

ai --config

This opens config.json in your preferred editor. The editor is chosen based on the following priority:

  1. The $EDITOR environment variable.
  2. vim (if installed).
  3. nano (as a fallback).
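The three-step priority above amounts to a few lines of lookup logic. A minimal sketch (illustrative, not Termai's source):

```python
import os
import shutil

def pick_editor(env=os.environ):
    """Mirror the documented priority: $EDITOR, then vim if installed, then nano."""
    if env.get("EDITOR"):
        return env["EDITOR"]
    if shutil.which("vim"):
        return "vim"
    return "nano"
```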

The configuration file looks like this:

{
    "provider": "gemini",
    "proxy": "http://user:pass@127.0.0.1:1080",
    "gemini_config": {
        "api_key": "YOUR_GEMINI_KEY",
        "model_name": "gemini-2.5-flash",
        "system_instruction": "You are a CLI assistant for Termux...",
        "generation_config": {
            "temperature": 0.7,
            "maxOutputTokens": 1024
        }
    },
    "openai_config": {
        "api_key": "YOUR_OPENAI_KEY",
        "model_name": "gpt-4o",
        "system_instruction": "You are a helpful assistant.",
        "temperature": 0.7,
        "max_tokens": 1024
    }
}
  • provider: Set to "gemini" or "openai" to choose your AI provider.
  • proxy: (Optional) Set an HTTP or HTTPS proxy for all requests.
  • gemini_config: Settings for when provider is "gemini".
    • model_name: Change to gemini-2.5-pro or other available models.
    • system_instruction: Give the AI a persona.
    • temperature: Set to 1.0 for creative answers, 0.1 for precise logic.
  • openai_config: Settings for when provider is "openai".
    • model_name: Change to gpt-3.5-turbo, etc.
    • system_instruction: A different persona for ChatGPT.
    • temperature: Controls randomness.
    • max_tokens: The maximum number of tokens to generate.
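The relationship between provider and the two sub-config blocks can be sketched as a simple dispatch: the provider value selects which `*_config` section is read. This is an illustrative helper, not Termai's actual code.

```python
def active_settings(config):
    """Return the sub-config block matching the `provider` field."""
    provider = config.get("provider", "gemini")
    key = f"{provider}_config"
    if key not in config:
        raise ValueError(f"unknown provider: {provider!r}")
    return config[key]
```

With the sample config above, setting "provider": "openai" makes the tool read gpt-4o and max_tokens; switching back to "gemini" reads gemini-2.5-flash and maxOutputTokens.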

❓ Help & Troubleshooting

Command List:

ai --help

Re-configure API Keys:

To reset and re-enter your API keys, use the --reinstall flag.

ai --reinstall

Debug Mode:

If the AI isn't responding or you are getting errors, run:

ai --debug "your question"

This will print the raw server response and error codes.

Debug Configuration: If you are having issues with your configuration, you can use the --debug-config flag to print the loaded configuration. API keys will be redacted for security.

ai --debug-config
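Redaction like this is typically a recursive walk that masks every api_key value before printing. A sketch of the idea (not Termai's implementation):

```python
import copy

def redact_keys(config):
    """Return a deep copy of the config with every api_key value masked."""
    redacted = copy.deepcopy(config)

    def walk(node):
        if isinstance(node, dict):
            for k, v in node.items():
                if k == "api_key" and isinstance(v, str):
                    node[k] = "***REDACTED***"
                else:
                    walk(v)
        elif isinstance(node, list):
            for item in node:
                walk(item)

    walk(redacted)
    return redacted
```

The deep copy matters: the in-memory config keeps its real keys, and only the printed copy is masked.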

🗑 Uninstallation

If installed via pip/pipx:

pip uninstall termux-ai
# or
pipx uninstall termux-ai

📄 License

This project is licensed under the MIT License. You are free to use, modify, and distribute this software. See the LICENSE file for more details.

Made with ❤️ for CLI enthusiasts


Source code: https://github.com/estiaksoyeb/termai.git

Project details

Release files (termux-ai 0.3.0), uploaded via twine/6.1.0 (CPython/3.13.12) using Trusted Publishing, with attestations from release.yml on attajak/termux-ai:

Source distribution: termux_ai-0.3.0.tar.gz (12.3 kB)
  • SHA256: b45a4bb504cacd2f73bd217a7088d3fadc03758b3d2c95a1d4919a738e81ab5e
  • MD5: c0a8c6595499823ce9299b3554961d63
  • BLAKE2b-256: d3d694dc443547fb98283669d80c4eaa90b5509be1a015938d23bea6742b1627

Built distribution: termux_ai-0.3.0-py3-none-any.whl (12.8 kB, Python 3)
  • SHA256: eccc7888717a23a266e52275413f87051813d66bfbaaac0a28bd910ea4ab7310
  • MD5: 9154f1754c4f2e9e31bb937127f51509
  • BLAKE2b-256: 58e96f31f9a36211b741ce08add972401586dc6ee63ea947089b4e0318bd85b9
