
TRMX - Terminal Chat Interface

A powerful terminal-based chat interface that lets you interact with various AI models directly from your command line. TRMX stores your conversations locally and makes it easy to manage multiple chat sessions.

Features

  • Multiple LLM providers: OpenAI, Anthropic, TogetherAI, Groq, Fireworks, Cerebras, Google
  • Chat history: Store and retrieve conversations locally
  • Dynamic model listing: Uses airtrain's ListModelsSkillFactory to fetch available models for each provider
  • Environment variable support: Load API keys from .env files
  • Multiple prompt formats: System prompts and multi-turn chat
  • Intuitive command-line interface: Easy to use and navigate
  • Advanced session management: Titles and session IDs for easy reference
  • Display options: Configurable timestamp formats (ISO, human-readable, or relative)
  • Auto-update capability: Stay up-to-date with the latest features

Installation

pip install trmx

Quick Start

# Start a new chat using your default provider and model
trmx

# List your saved chat sessions
trmx --list

# Update TRMX to the latest version
trmx --update

Detailed Usage Guide

Managing Chat Sessions

# Start a new chat session
trmx

# List all previous chat sessions
# Shows title, ID, creation date, message count, provider, and model
trmx --list

# Continue a previous session (multiple ways)
trmx 92f31c          # Using a partial session ID directly
trmx 92              # Even just a few characters will work
trmx --continue 92   # Using the --continue or -c flag
trmx -c 92           # Short form

# Delete a chat session (by its number in the list)
trmx --delete 3      # Deletes the 3rd session in the list

# Show information about chat storage location
trmx --info

Configuring Models and Providers

TRMX supports multiple AI providers, including OpenAI, Anthropic, TogetherAI, Groq, Fireworks, Cerebras, and Google.

# List all available providers and their status
trmx --list-providers

# List available models for the current provider
trmx --list-models

# List models for a specific provider
trmx --list-models --provider openai   # GPT models
trmx --list-models --provider anthropic # Claude models
trmx --list-models --provider groq     # Llama and other models

TRMX fetches the current model list from each provider at runtime using AirTrain's ListModelsSkillFactory, so the available options stay up to date. For some providers (such as OpenAI), listing models works even without credentials.

# Use a specific provider and model for a single chat session
trmx --provider openai --model gpt-4
trmx --provider anthropic --model claude-3-opus-20240229
trmx --provider groq --model llama-3-70b-8192

# Set a new default provider/model configuration
trmx --add --provider openai --model gpt-4-turbo
trmx --add --provider anthropic --model claude-3-haiku-20240307

Display Settings

# Set the time display style for chat sessions
trmx --set-timestyle iso      # Display times in ISO format (2025-03-17T22:55:28)
trmx --set-timestyle human    # Display times in human-readable format (2025-03-17 22:55:28)
trmx --set-timestyle relative # Display times in relative format (2 hours ago)

# Show model's thinking process (for supported models like DeepSeek)
trmx --provider fireworks --model fireworks/deepseek-r1 --show-thinking

Maintenance

# Check the current version
trmx --version

# Update to the latest version
trmx --update

# Show help information
trmx --help

Chat Interface Features

During a chat session:

  • The provider and model information are displayed prominently
  • Chat history is shown when continuing a session
  • Type exit, quit, or q to end the session
  • Multi-line input is supported:
    • Use /m, /multiline, /multi, /p, or /paste (end with /end)
    • Use triple quotes """ or ''' (end with corresponding triple quotes)
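For example, entering a multi-line message with the /m command might look like this (an illustrative sketch; the prompt character shown is an assumption, and the exact interface text may differ):

```text
> /m
Summarize the following notes.
- meeting moved to Thursday
- budget review pending
/end
```

The triple-quote form works the same way: open with """ or ''', type your lines, and close with the matching triple quotes to send.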

Configuration

TRMX can be configured using environment variables:

  • TRMX_DIR: Path to store chat history, credentials, and configuration (default: ~/.trmx)
  • API key variables for each provider (e.g., OPENAI_API_KEY, ANTHROPIC_API_KEY, etc.)

You can set these in your shell or create a .env file in your working directory.
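For instance, a minimal .env file could be created like this (the key values shown are placeholders, not real credentials):

```shell
# Write a minimal .env file in the working directory.
# TRMX_DIR and the provider API key variables are read at startup.
cat > .env <<'EOF'
TRMX_DIR=~/.trmx
OPENAI_API_KEY=your-key-here
ANTHROPIC_API_KEY=your-key-here
EOF
```

Remember to keep .env files out of version control (e.g. add .env to your .gitignore).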

API Keys

TRMX will search for API keys in this order:

  1. Environment variables
  2. Credential files in ~/.trmx/credentials/
  3. Interactive prompt (if not found, TRMX will ask if you want to enter and save the key)
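The lookup order above can be sketched as a small shell function. This is a hypothetical illustration, not TRMX's actual implementation; in particular, the credential file name openai.key is an assumption:

```shell
# Sketch of the API-key lookup order: environment variable first,
# then a credential file under $TRMX_DIR/credentials/, else fall
# back to prompting the user.
find_openai_key() {
    trmx_dir="${TRMX_DIR:-$HOME/.trmx}"
    if [ -n "$OPENAI_API_KEY" ]; then
        echo "$OPENAI_API_KEY"                     # 1. environment variable
    elif [ -f "$trmx_dir/credentials/openai.key" ]; then
        cat "$trmx_dir/credentials/openai.key"     # 2. credential file
    else
        echo "no key found: prompt the user" >&2   # 3. interactive prompt
        return 1
    fi
}
```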

Example Configuration

For OpenAI:

export OPENAI_API_KEY=your-key-here

For Anthropic:

export ANTHROPIC_API_KEY=your-key-here

Session Information

When you list your sessions with trmx --list, you'll see:

  • Session title (auto-generated from the conversation)
  • Session ID (unique identifier)
  • Creation time
  • Message count
  • Provider (which AI service was used)
  • Model (which specific model was used)
  • Preview of the conversation

Requirements

  • Python 3.8 or higher
  • Internet connection for AI model access

Download files

Download the file for your platform.

Source Distribution

trmx-0.3.14.tar.gz (19.9 kB)

Built Distribution

trmx-0.3.14-py3-none-any.whl (21.3 kB)

File details

Details for the file trmx-0.3.14.tar.gz.

File metadata

  • Download URL: trmx-0.3.14.tar.gz
  • Size: 19.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.10.14

File hashes

Hashes for trmx-0.3.14.tar.gz:

  • SHA256: c648bb4dd67b14b14121d454104ff93e27db30ff4ab15c140cb8ed47aba8b351
  • MD5: 474e218dd226b3ace3a99cc8387d4f33
  • BLAKE2b-256: f2a998596b231b77463a044b13cc1af49fe7fbe01d9004026067021d1492c991
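The digest above can be used to verify a downloaded sdist, for example one fetched with `pip download --no-deps --no-binary :all: trmx==0.3.14` into the current directory:

```shell
# Verify the downloaded sdist against the SHA256 digest listed above.
expected="c648bb4dd67b14b14121d454104ff93e27db30ff4ab15c140cb8ed47aba8b351"
file="trmx-0.3.14.tar.gz"
if [ -f "$file" ]; then
    # sha256sum -c reads "<digest>  <filename>" lines and reports OK/FAILED
    echo "$expected  $file" | sha256sum -c -
else
    echo "file not found: $file (download it first)"
fi
```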


File details

Details for the file trmx-0.3.14-py3-none-any.whl.

File metadata

  • Download URL: trmx-0.3.14-py3-none-any.whl
  • Size: 21.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.10.14

File hashes

Hashes for trmx-0.3.14-py3-none-any.whl:

  • SHA256: 12ab1da28ffe87f3ce8aaec399b7075deeb8ae84c60ee2a4b6d39bd017f8780d
  • MD5: a13fb20657068050831ecdb8291f4e11
  • BLAKE2b-256: ec6597e60b7af327b00752aaa99a2b750299aef8716fdd4e9a98d96ff383cb4f

