tux-gpt

An interactive terminal tool using GPT, with web search capabilities.

tux-gpt is an interactive command-line assistant (not a background AI agent) that leverages GPT-based language models to provide intelligent, conversational help directly in your terminal. It enables on-the-fly code generation, debugging support, technical explanations, and more—all without leaving the command-line environment.

Designed for developers and tech enthusiasts, tux-gpt integrates AI assistance directly into terminal sessions, making complex tasks faster to accomplish through context-aware command-line interactions. On startup the tool gathers details about the host operating system (distribution, architecture, etc.) so responses and suggested commands are tailored to your environment.
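For illustration, the kind of environment details that can be collected on Linux looks like this (a sketch; the exact probes tux-gpt runs at startup are an assumption, not documented here):

```shell
# Illustrative probes for OS details; tux-gpt's actual implementation may differ
uname -s -m    # kernel and architecture, e.g. "Linux x86_64"
if [ -r /etc/os-release ]; then
  . /etc/os-release
  echo "$PRETTY_NAME"    # distribution name, e.g. "Ubuntu 24.04 LTS"
fi
```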


Prerequisites

  • Python 3.7+
  • Pip (Python package manager)
  • An OpenAI API key (see next section)

Setup and Configuration

  1. Install from PyPI (for most platforms):

    pip install tux-gpt
    

    On newer Debian/Ubuntu releases where pip targets the system Python, prefer pipx to keep the install isolated:

    pipx install tux-gpt
    

    From source:

    git clone https://github.com/fberbert/tux-gpt.git
    cd tux-gpt
    pip install -r requirements.txt
    pip install .
    
  2. Get an OpenAI API key from your OpenAI account dashboard.

  3. Configure your environment variable:

    • Linux/macOS (bash/zsh):
      echo 'export OPENAI_API_KEY="<your_api_key>"' >> ~/.bashrc   # use ~/.zshrc for zsh
      source ~/.bashrc
      
    • Windows (PowerShell):
      [Environment]::SetEnvironmentVariable('OPENAI_API_KEY', '<your_api_key>', 'User')
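Before moving on, it can help to confirm the variable is actually visible to your shell on Linux/macOS. A minimal check (this prints only the key's length, never the key itself):

```shell
# Report whether the key is set, without leaking its value
if [ -n "$OPENAI_API_KEY" ]; then
  echo "OPENAI_API_KEY is set (${#OPENAI_API_KEY} chars)"
else
  echo "OPENAI_API_KEY is not set"
fi
```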
      

Note: On first run, tux-gpt will create the directory ~/.config/tux-gpt/ (or %APPDATA%\tux-gpt\ on Windows) containing:

  • config.json: CLI configuration (e.g., default model);
  • history.json: persistence of the last 20 messages (user + assistant);
  • input_history: command history for navigation with ↑/↓ arrow keys.

Usage

Start the interactive session:

tux-gpt

Press Ctrl+J to send your message (Enter only inserts a new line).


Example commands

  • Search the web for current news:

    > Find the latest headlines about OpenAI
    
  • Look up technical documentation:

    > What is the syntax for Python's list comprehensions?
    
  • Fetch real-time data (e.g., stock price):

    > What's the current stock price of AAPL?
    
  • Summarize a web article:

    > Summarize the top result for "machine learning trends 2025"
    
  • Run a single prompt from the shell:

    tux-gpt -q "What's the weather forecast for Rio das Ostras today?"
    
  • Request JSON output for scripting:

    tux-gpt --json -q "Summarize the latest news about OpenAI"
    

    The response is a JSON object in the form:

    {
      "answer": "Short summary text...",
      "sources": ["https://example.com/article", "..."]
    }
    
  • Generate and execute commands directly:

    tux-gpt -c "create a zip archive with every jpg or png in this folder"
    

    -c/--command asks tux-gpt for a shell command. The assistant returns a JSON payload with the command plus a safety flag. If danger is false, the command runs immediately and the CLI prints Executando comando: ... ("Executing command: ..."). If danger is true, the CLI shows the command and waits for confirmation before executing.
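The --json mode above makes tux-gpt easy to drive from scripts. A minimal sketch of consuming the documented response shape with python3 (the response here is a literal sample; in practice it would come from `tux-gpt --json -q "..."`):

```shell
# Sample response in the documented shape; substitute the real CLI output
response='{"answer": "Short summary text...", "sources": ["https://example.com/article"]}'
# Extract the "answer" field using Python's stdlib json module
answer=$(echo "$response" | python3 -c 'import json,sys; print(json.load(sys.stdin)["answer"])')
echo "$answer"
```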


Memory & Command History

tux-gpt persists your conversation and command history locally in the ~/.config/tux-gpt/ directory (or %APPDATA%\tux-gpt\ on Windows). The files created are:

  • config.json: CLI configuration, such as the default model.
  • history.json: stores the last 20 messages (user + assistant) to maintain context between sessions and limit token usage.
  • input_history: command history used by readline for navigation with ↑/↓ arrow keys.

Features:

  • On startup, the conversation history is automatically reloaded from history.json, limited to the last 20 messages to prevent token overload.
  • You can navigate previous commands using the ↑ and ↓ arrow keys at the prompt.
  • To reset the conversation or command history, simply remove the corresponding files in that directory (~/.config/tux-gpt/ or %APPDATA%\tux-gpt\).
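For example, clearing only the conversation context on Linux/macOS while keeping your settings could look like this (a sketch using the paths documented above):

```shell
# Remove conversation context and arrow-key history; config.json is untouched
rm -f ~/.config/tux-gpt/history.json ~/.config/tux-gpt/input_history
echo "tux-gpt history cleared"
```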

Customization

You can configure the default model or terminal spinner settings by editing the configuration file at ~/.config/tux-gpt/config.json. Example:

{
  "model": "gpt-4o-mini"
}

Troubleshooting

  • "OPENAI_API_KEY not set": Ensure you exported the variable correctly and restarted your shell.
  • Slow responses: Check your internet connection or change to a faster model in the config.

License

MIT © 2025 tux-gpt contributors


Configuration File (~/.config/tux-gpt/config.json)

On the first run, tux-gpt creates a configuration file at ~/.config/tux-gpt/config.json (or %APPDATA%\tux-gpt\config.json on Windows). Edit this file to set the default model and other preferences.

Example config file to set the model:

{
  "model": "gpt-5-mini"
}

The default model is gpt-5-mini.

Model Compatibility

Important: The model you choose must support web search capability. Currently, only the following models support the web search tool:

  • gpt-5-mini
  • gpt-4.1
  • gpt-4.1-mini

For more details, see the official OpenAI documentation on web search tools and limitations:

https://platform.openai.com/docs/guides/tools-web-search?api-mode=responses#limitations


Author

Fábio Berbert de Paula

Official Repository

https://github.com/fberbert/tux-gpt

