Termux-AI

A lightweight CLI tool for AI integration in your terminal.

Termux-AI is a lightweight, zero-dependency CLI wrapper for Google's Gemini AI, built for Termux on Android and general Linux environments. It brings the power of Large Language Models (LLMs) directly to your command line, following the Unix philosophy of piping and standard streams.

⚡ Features

  • 🚀 Lightweight: Uses standard Python requests. No heavy SDKs or complex dependencies.
  • 🟢 Unix Compatible: Supports piping (stdin). Feed logs, code, or text files directly into the AI.
  • 🛠 Configurable: Built-in JSON configuration system (ai --config) to edit System Prompts, Temperature, and Models.
  • ⚡ Fast: Defaults to gemini-2.5-flash for instant responses.
  • 🎨 Clean UI: Minimalist output with syntax-highlighted green text.
  • 🧹 Auto-Cleanup: The installer sets everything up and deletes the repository to save space.

📥 Installation

Method 1: Global Install (Recommended)

You can install Termux-AI directly using pip or pipx:

# Using pip
pip install termux-ai
# Using pipx (isolated environment)
pipx install termux-ai

This will make the ai command available globally.

Method 2: Manual Setup (for Development)

If you want to contribute or run Termux-AI in a development environment:

git clone https://github.com/attajak/termux-ai.git
cd termux-ai
python -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt

Or install it as an editable package straight from the repository (editable VCS installs require the #egg fragment):

pip install -e "git+https://github.com/attajak/termux-ai.git#egg=termux-ai"

🔑 Setup

On the very first run, Termux-AI will ask for your Google Gemini API key.

  • Get a free API key here: Google AI Studio
  • Run the command:
  ai "hello"
  • Paste your key when prompted. It will be saved locally.
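The first-run flow above can be sketched as follows. This is an illustrative assumption, not the actual Termux-AI internals: the config path, the `ensure_api_key` name, and the saved JSON shape are all hypothetical, though the JSON mirrors the config format shown later in this page.

```python
import json
import os

# Hypothetical config location; the real tool may store its key elsewhere.
CONFIG_PATH = os.path.expanduser("~/.config/termux-ai/config.json")

def ensure_api_key(config_path: str = CONFIG_PATH) -> str:
    """Return the saved Gemini API key, prompting for one on first run."""
    if os.path.exists(config_path):
        with open(config_path) as f:
            config = json.load(f)
        key = config.get("gemini_config", {}).get("api_key")
        if key:
            return key
    # First run: ask the user and persist the key locally.
    key = input("Enter your Google Gemini API key: ").strip()
    os.makedirs(os.path.dirname(config_path), exist_ok=True)
    with open(config_path, "w") as f:
        json.dump({"provider": "gemini",
                   "gemini_config": {"api_key": key}}, f, indent=4)
    return key
```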

💻 Usage

  1. Basic questions. Ask anything directly from the terminal:

ai "How do I untar a file in Linux?"

  2. Piping (the power move). Feed the output of other commands into Termux-AI. Debug an error log:

cat error.log | ai "Explain what caused this crash"

Explain a script:

cat install.sh | ai "What does this script do?"

Generate code and save it:

ai "Write a Python hello world script" > hello.py
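The piped examples above work because a stdin-aware CLI reads standard input when it is not attached to a terminal and prepends it to the question. A minimal sketch of that pattern (the `build_prompt` name and the exact joining format are assumptions, not Termux-AI's actual code):

```python
import sys

def build_prompt(question: str, stdin=sys.stdin) -> str:
    """Combine piped stdin (if any) with the user's question."""
    if not stdin.isatty():
        # Input is piped or redirected, not typed interactively.
        piped = stdin.read().strip()
        if piped:
            return f"{piped}\n\n{question}"
    return question
```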

⚙️ Configuration

Termux-AI comes with a built-in configuration editor. You can change the AI provider, model, and personality. Run:

ai --config

This opens config.json in your preferred editor. The editor is chosen based on the following priority:

  1. The $EDITOR environment variable.
  2. vim (if installed).
  3. nano (as a fallback).
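The editor-selection priority above maps to a few lines of standard-library Python. A sketch (the `pick_editor` name is an assumption; only the priority order comes from this page):

```python
import os
import shutil

def pick_editor() -> str:
    """Resolve the editor: $EDITOR, then vim if installed, then nano."""
    editor = os.environ.get("EDITOR")
    if editor:
        return editor
    if shutil.which("vim"):
        return "vim"
    return "nano"
```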

The configuration file looks like this:

{
    "provider": "gemini",
    "proxy": "http://user:pass@127.0.0.1:1080",
    "gemini_config": {
        "api_key": "YOUR_GEMINI_KEY",
        "model_name": "gemini-2.5-flash",
        "system_instruction": "You are a CLI assistant for Termux...",
        "generation_config": {
            "temperature": 0.7,
            "maxOutputTokens": 1024
        }
    },
    "openai_config": {
        "api_key": "YOUR_OPENAI_KEY",
        "model_name": "gpt-4o",
        "system_instruction": "You are a helpful assistant.",
        "temperature": 0.7,
        "max_tokens": 1024
    }
}

  • provider: Set to "gemini" or "openai" to choose your AI provider.
  • proxy: (Optional) Set an HTTP or HTTPS proxy for all requests.
  • gemini_config: Settings for when provider is "gemini".
    • model_name: Change to gemini-2.5-pro or other available models.
    • system_instruction: Give the AI a persona.
    • temperature: Set to 1.0 for creative answers, 0.1 for precise logic.
  • openai_config: Settings for when provider is "openai".
    • model_name: Change to gpt-3.5-turbo, etc.
    • system_instruction: A different persona for ChatGPT.
    • temperature: Controls randomness.
    • max_tokens: The maximum number of tokens to generate.
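To see how these fields might come together at request time, here is a sketch that turns the config above into keyword arguments for requests.post(). The endpoint and payload shape follow Google's public Gemini REST API (generateContent), and the proxy field maps to the standard requests "proxies" dict; treat the wiring itself as an illustration, not Termux-AI's actual implementation.

```python
def build_request(config: dict, prompt: str) -> dict:
    """Translate the config file into kwargs for requests.post()."""
    g = config["gemini_config"]
    kwargs = {
        "url": (
            "https://generativelanguage.googleapis.com/v1beta/models/"
            f"{g['model_name']}:generateContent"
        ),
        "params": {"key": g["api_key"]},
        "json": {
            "contents": [{"parts": [{"text": prompt}]}],
            "systemInstruction": {"parts": [{"text": g["system_instruction"]}]},
            "generationConfig": g["generation_config"],
        },
    }
    if config.get("proxy"):
        # requests takes one proxy URL per scheme.
        kwargs["proxies"] = {"http": config["proxy"], "https": config["proxy"]}
    return kwargs
```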

🛠 Development

If you want to contribute or run Termux-AI in a development environment:

  1. Clone and set up a virtual environment:

    git clone https://github.com/attajak/termux-ai.git
    cd termux-ai
    python -m venv .venv
    source .venv/bin/activate
    
  2. Install runtime dependencies:

    pip install -r requirements.txt
    
  3. Install development tools (linters & test runner):

    pip install -r dev-requirements.txt
    
  4. Install editable package for development:

    pip install -e .
    # 'ai' will use the local development version
    

Testing & linting

  • Run all tests:
pytest
  • Run a single test:
pytest tests/test_providers.py::test_gemini_success -q
  • Run the linter (ruff):
ruff check .
  • Install pre-commit hooks:
pre-commit install

❓ Help & Troubleshooting

Command List:

ai --help

Re-configure API Keys:

To reset and re-enter your API keys, use the --reinstall flag.

ai --reinstall

Debug Mode:

If the AI isn't responding or you are getting errors, run:

ai --debug "your question"

This will print the raw server response and error codes.

Debug Configuration: If you are having issues with your configuration, you can use the --debug-config flag to print the loaded configuration. API keys will be redacted for security.

ai --debug-config
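Redacting keys before printing a config is a small, self-contained pattern. A sketch of how --debug-config's masking might work (the `redact_keys` name and mask string are assumptions; only the "API keys are redacted" behavior comes from this page):

```python
import copy

def redact_keys(config: dict) -> dict:
    """Return a copy of the config with every api_key field masked."""
    redacted = copy.deepcopy(config)
    for section in redacted.values():
        if isinstance(section, dict) and section.get("api_key"):
            section["api_key"] = "***REDACTED***"
    return redacted
```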

🗑 Uninstallation

If installed via pip/pipx:

pip uninstall termux-ai
# or
pipx uninstall termux-ai

📄 License

This project is licensed under the MIT License. You are free to use, modify, and distribute this software. See the LICENSE file for more details.

Made with ❤️ for CLI enthusiasts


Source code: https://github.com/estiaksoyeb/termai.git
