TerminalAI: Command-line AI assistant
Project description
Terminal AI
Bring the power of AI directly to your command line!
TerminalAI is your intelligent command-line assistant. Ask questions in natural language, get shell command suggestions, and execute them safely and interactively. It streamlines your workflow by translating your requests into actionable commands.
(ASCII art banner: TERMINAL AI)
Key Features
- Natural Language Interaction: Ask questions or request actions naturally.
- Intelligent Command Suggestion: Get relevant shell commands based on your query.
- File Reading & Explanation:
  - Use `--read-file <filepath>` along with your query to have the AI consider a file's content (any plain text file).
  - Use `--explain <filepath>` for a direct summary and contextual explanation of a file (predefined query, ignores general query).
  - Supports any plain text file; the AI attempts to interpret the content.
- Multiple AI Backends: Supports OpenRouter, Gemini, Mistral, and local Ollama models.
- Interactive Execution: Review and confirm commands before they run.
- Context-Aware: Includes OS and current directory information in prompts to the AI.
- Safe Command Handling (a sketch of the confirmation flow follows this list):
  - Non-stateful commands run directly after confirmation.
  - Risky commands require explicit confirmation.
  - Stateful commands (`cd`, `export`, etc.) are handled safely (see below).
- Multiple Modes:
  - Direct Query (`ai "..."`): Get a single response and command suggestions.
  - Single Interaction (`ai`): Ask one question, get a response, and return to the shell.
  - Chat Mode (`ai --chat` or `ai -c`): Persistent conversation with the AI.
- Easy Configuration: `ai setup` provides a menu for API keys and settings.
- Optional Shell Integration: For seamless execution of stateful commands in direct query mode.
- Syntax Highlighting: Uses `rich` for formatted output.
- Ollama Model Selection:
  - When configuring Ollama, you now select a model by number or 'c' to cancel. Invalid input is rejected for safety.
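To make the confirmation flow concrete, here is a rough sketch of a direct query and its interactive confirmation (the suggested command and the exact prompt wording are illustrative, not literal TerminalAI output):
ai "show current disk usage"
# The AI might suggest a command such as:
#   df -h
# TerminalAI then asks before executing it, e.g. with a [Y/n] prompt;
# answering Y runs the command, answering n skips it.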
Installation
Option 1: Install from PyPI (Recommended)
pip install coaxial-terminal-ai
Option 2: Install from Source
git clone https://github.com/coaxialdolor/terminalai.git
cd terminalai
pip install -e .
This automatically adds the ai command to your PATH.
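After either method you can verify that the entry point is on your PATH; the `--help` flag shown here is an assumption rather than something documented above:
# Confirm the ai command is available
which ai
# Print usage information (assumes a standard --help flag)
ai --help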
Quick Setup
- Install: Use one of the methods above.
- Configure API Keys: Run `ai setup` and select option `5` to add API keys for your chosen provider(s) (e.g., Mistral, Ollama, OpenRouter, Gemini). A short sketch of this flow follows below.
- Set Default Provider: In `ai setup`, select option `1` to choose which provider `ai` uses by default.
- (Optional) Install Shell Integration: See "Handling Stateful Commands" below if you want direct execution for commands like `cd` when using `ai "..."`.
- Start Using: You're ready!
See the Quick Setup Guide for more detailed instructions.
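As a sketch, a typical first-time setup could look like this; the option numbers come from the steps above, and the menu text itself is not reproduced here:
# Open the interactive setup menu
ai setup
# In the menu: pick option 5 to add an API key for your provider (e.g., Mistral),
# then pick option 1 to make that provider the default.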
Usage Examples
1. Single Interaction Mode (`ai`): Ask one question, get an answer/commands, then return to the shell.
Flags like `-v` or `-l` can be used here.
# Basic usage
ai
AI:(mistral)> how do I list files by size?
# Request a long response
ai -l
AI:(mistral)> explain the history of Unix shells in detail
2. Direct Query Mode (`ai "..."`): Provide the query directly. This is where most flags are useful.
# Simple query
ai "find all python files modified in the last day"
# Auto-confirm non-risky command execution
ai -y "show current disk usage"
# (Example: If AI suggests 'df -h', it will run without a [Y/n] prompt)
# Request verbose output
ai -v "explain the concept of inodes"
# Request long output
ai -l "explain the difference between TCP and UDP"
# Combine flags: Auto-confirm and Verbose
ai -y -v "create a new directory called 'test_project' and list its contents"
# (Example: If AI suggests 'mkdir test_project && ls test_project', it will run without a prompt)
# Read and explain a file
ai --read-file ./my_script.py "Summarize this Python script and what it does"
# Get an automatic explanation of a file
ai --explain ./config/app_settings.yaml
# Ollama model selection (example):
# ai --set-ollama
# (Choose a model number, or 'c' to cancel)
3. Chat Mode (`ai --chat` or `ai -c`): Have a persistent conversation.
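A chat session might look roughly like the following; the prompt format mirrors the single-interaction example above, and how you leave the chat (for example typing exit or pressing Ctrl+C) is an assumption, not documented behavior:
ai --chat
AI:(mistral)> how do I find the largest files in this directory?
# ...the AI answers and may suggest a command such as 'du -ah . | sort -rh | head'...
AI:(mistral)> and how would I archive the largest one?
# ...the conversation continues, keeping the earlier turns as context...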
Project details
Download files
Download the file for your platform.
File details
Details for the file coaxial_terminal_ai-0.3.2.tar.gz.
File metadata
- Download URL: coaxial_terminal_ai-0.3.2.tar.gz
- Upload date:
- Size: 46.0 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.12.5
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 59901d8f1e3649ce8b591dcc2ff75349c12c35ff85ec8777d42cffac55ecc475 |
| MD5 | e284a3ddf294a04e0a62145bb388ec03 |
| BLAKE2b-256 | 31e2b02e0a91c1b5aa2512c1ba2b6f97010d6d41979599b51a2b5d055fb49628 |
File details
Details for the file coaxial_terminal_ai-0.3.2-py3-none-any.whl.
File metadata
- Download URL: coaxial_terminal_ai-0.3.2-py3-none-any.whl
- Upload date:
- Size: 50.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.12.5
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 15ec6f274fa9c5e45cbdb302bfd6a43bcb2f18d895a1c78c85fd0d0d85c1ce37 |
| MD5 | 086249db41400734ea8fffe7e442f7b9 |
| BLAKE2b-256 | 2c8d2b24d29a08541487b882a97a7a1421f2b0caf357c27d921589eb3d86f0a0 |
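If you want to check a downloaded file against the digests above, something along these lines works on most Linux systems (`sha256sum` comes from GNU coreutils; on macOS, `shasum -a 256` is the equivalent):
# Fetch the source distribution without installing it
pip download coaxial-terminal-ai==0.3.2 --no-deps --no-binary :all:
# Compare the printed digest with the SHA256 value listed above
sha256sum coaxial_terminal_ai-0.3.2.tar.gz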