
A command line interface for ChatGPT

Project description

aici 🚀


A command line interface tool for AI models like OpenAI's ChatGPT and DeepSeek AI. 🤖💬

Use case: you want to use AI models from editors such as Emacs, or from automated tools and scripts.

(Screenshots: command-line and Emacs usage)

📦 Installation:

pip install aici

📖 Overview:

AICI (AI Chat Interface) is a Python🐍 command-line tool for interacting with AI models from OpenAI or DeepSeek. It takes a user's prompt as input and outputs the response from the selected AI model. The output can be directed to either standard output or the clipboard📋.

Key Features:

  • Support for OpenAI and DeepSeek models
  • Default model: gpt-4o-mini (updated from deprecated gpt-3.5-turbo)
  • Streaming responses (or complete responses with -c)
  • Custom system messages via direct input or file
  • Clipboard output support
  • Environment variable configuration
  • JSON conversation format support
  • Automatic warnings for deprecated models

💻 Command-Line Description:

| Argument | Env var | Default | Type | Description |
| --- | --- | --- | --- | --- |
| `-v, --version` | - | - | - | Show version and exit |
| `prompt` | - | - | str | Prompt to send to the AI. Specify `-` to read from stdin |
| `-m, --model` | `AICI_MODEL` | `gpt-4o-mini` | str | Model name to use (`gpt-4o`, `gpt-4o-mini`, `gpt-4-turbo`, `deepseek-chat`, etc.) ⚠️ `gpt-3.5-turbo` deprecated in Feb 2026 |
| `-c, --complete` | - | `False` (default: streaming) | bool | Get the complete response at once instead of streaming |
| `-s, --system` | `AICI_SYSTEM` | "You are a helpful assistant." | str | Specify the system message |
| `-S, --system-file` | `AICI_SYSTEM_FILE` | - | str | Specify a file containing the system message |
| `-V, --verbose` | - | `False` | bool | Show detailed debug information |
| `-o, --output` | - | stdout | str | Specify the output destination. Use `clip` to copy to the clipboard |

See also:

  • OpenAI models documentation
  • DeepSeek models documentation
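The options in the table above could be mirrored with a minimal `argparse` sketch. This is a hypothetical reconstruction of the interface for illustration, not aici's actual source:

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    """Hypothetical argparse mirror of the aici option table."""
    parser = argparse.ArgumentParser(prog="aici")
    parser.add_argument("prompt", help='Prompt to send; "-" reads from stdin')
    parser.add_argument("-m", "--model", default="gpt-4o-mini")
    parser.add_argument("-c", "--complete", action="store_true",
                        help="Return the whole response instead of streaming")
    parser.add_argument("-s", "--system", default="You are a helpful assistant.")
    parser.add_argument("-S", "--system-file")
    parser.add_argument("-V", "--verbose", action="store_true")
    parser.add_argument("-o", "--output", default="stdout")
    return parser

args = build_parser().parse_args(["-m", "gpt-4o", "Hello"])
print(args.model, args.prompt)  # gpt-4o Hello
```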

📝 Logging

Log File Location:

Aici automatically creates a log file to record API calls, errors, and debug information:

  • If config file exists: Log file is created in the same directory as your config file

    • Linux/macOS: ~/.config/aici/aici.log or ~/.aici/aici.log
    • Windows: %USERPROFILE%\AppData\Local\aici\aici.log
  • If no config file: Default location is ~/.config/aici/aici.log

Log Levels:

  • INFO: Records all API calls and responses (always logged to file)
  • WARNING: Shows warnings in console (e.g., deprecated models)
  • DEBUG: Detailed information for troubleshooting (enabled with -V flag)

Viewing Logs:

# View recent log entries
tail -f ~/.config/aici/aici.log

# Enable verbose mode to see debug info in console
aici "Hello" -V

Note: If aici cannot write to the log file (e.g., permission denied), it will show a warning and continue without file logging.
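The logging behavior described above (INFO to file, warnings to console, DEBUG with `-V`, graceful fallback when the file is unwritable) can be sketched with the standard `logging` module. Logger names and structure here are illustrative, not aici's actual implementation:

```python
import logging
import os

def setup_logging(log_path: str, verbose: bool = False) -> logging.Logger:
    """Log INFO+ to a file; warnings (or DEBUG when verbose) to the console."""
    logger = logging.getLogger("aici-example")
    logger.setLevel(logging.DEBUG if verbose else logging.INFO)
    try:
        os.makedirs(os.path.dirname(log_path), exist_ok=True)
        file_handler = logging.FileHandler(log_path)
        file_handler.setLevel(logging.INFO)
        logger.addHandler(file_handler)
    except OSError:
        # Mirror the documented fallback: warn and continue without file logging.
        print("warning: cannot write log file; continuing without file logging")
    console = logging.StreamHandler()
    console.setLevel(logging.DEBUG if verbose else logging.WARNING)
    logger.addHandler(console)
    return logger
```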

📥 Input

  • 💻 standard input
  • 💬 command-line parameter

📤 Output

  • 💻 standard output (streaming or buffered)
  • 📋 clipboard

🔧 Config Environment Variables or File:

🔑 API keys can be set using environment variables or config files

API Key Setup

Getting API Keys

OpenAI API Key:

  1. Visit OpenAI Platform
  2. Sign up or log in to your account
  3. Navigate to API Keys
  4. Click "Create new secret key"
  5. Copy the key (starts with sk-)
  6. Store it securely - you won't be able to see it again

DeepSeek API Key:

  1. Visit DeepSeek Platform
  2. Sign up or log in to your account
  3. Navigate to API Keys section
  4. Generate a new API key
  5. Copy the key (starts with sk-)
  6. Store it securely

Setting Up API Keys

You can configure API keys in two ways:

Method 1: Environment Variables (Temporary - current session only)

# Linux/macOS
export AICI_OPENAI_KEY=sk-your-openai-key-here

# Windows
set AICI_OPENAI_KEY=sk-your-openai-key-here

Method 2: Config File (Permanent - persists across sessions)

Create a config file at one of these locations:

  • Linux/macOS: ~/.config/aici/config or ~/.aici
  • Windows: %USERPROFILE%\AppData\Local\aici\config

Add your API key to the file:

AICI_OPENAI_KEY=sk-your-openai-key-here
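A config file in this format is a simple KEY=VALUE list. Parsing it could look like the following sketch (aici's actual parser may differ):

```python
def parse_config(text: str) -> dict:
    """Parse KEY=VALUE lines, skipping blank lines and # comments."""
    settings = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, sep, value = line.partition("=")
        if sep:  # only keep lines that actually contain '='
            settings[key.strip()] = value.strip()
    return settings
```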

Verification:

# Check if aici can find your API key
aici --version  # Should not show API key errors
aici "Hello" -V  # Verbose mode shows configuration loaded

Environment Variables

Priority Order

API Keys (in order of priority):

  1. AICI_OPENAI_KEY (highest priority for OpenAI models)
  2. OPENAI_API_KEY (fallback for OpenAI models)
  3. AICI_DEEPSEEK_KEY (highest priority for DeepSeek models)
  4. DEEPSEEK_API_KEY (fallback for DeepSeek models)
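The API key priority above amounts to a first-match lookup over an ordered list of variable names. A minimal sketch (helper name is hypothetical):

```python
def resolve_api_key(env: dict, provider: str):
    """Return the first non-empty key following the documented priority order."""
    order = {
        "openai": ["AICI_OPENAI_KEY", "OPENAI_API_KEY"],
        "deepseek": ["AICI_DEEPSEEK_KEY", "DEEPSEEK_API_KEY"],
    }
    for name in order[provider]:
        if env.get(name):
            return env[name]
    return None
```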

Model Selection (in order of priority):

  1. -m command line option (highest priority)
  2. AICI_MODEL environment variable
  3. AICI_OPENAI_MODEL or AICI_DEEPSEEK_MODEL (provider-specific)
  4. Default: gpt-4o-mini

System Message (in order of priority):

  1. -s command line option (highest priority)
  2. -S file specified via command line
  3. AICI_SYSTEM_FILE environment variable
  4. AICI_SYSTEM environment variable
  5. Default: "You are a helpful assistant."
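The system-message chain can be sketched the same way: the command-line value wins, then files, then the environment variable, then the default. This is an illustrative reconstruction, not aici's code:

```python
from pathlib import Path

DEFAULT_SYSTEM = "You are a helpful assistant."

def resolve_system_message(cli_system=None, cli_file=None, env=None) -> str:
    """Apply the documented priority: -s, -S file, env file, env value, default."""
    env = env or {}
    if cli_system:
        return cli_system
    for file_path in (cli_file, env.get("AICI_SYSTEM_FILE")):
        if file_path and Path(file_path).is_file():
            return Path(file_path).read_text().strip()
    return env.get("AICI_SYSTEM") or DEFAULT_SYSTEM
```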

Note on File Paths (Windows):

  • File paths are automatically normalized for cross-platform compatibility
  • You can use forward slashes (/) even on Windows - they will be converted to backslashes (\)
  • Examples that work on Windows:
    set AICI_SYSTEM_FILE=C:/Users/YourName/system.txt
    set AICI_SYSTEM_FILE=~/Documents/system.txt
    
    Both will be correctly processed as Windows paths
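The forward-slash normalization described above can be reproduced with the standard library; `ntpath` applies Windows path rules on any platform. This is illustrative, not necessarily aici's exact approach:

```python
import ntpath

def normalize_windows_path(path: str) -> str:
    """Convert forward slashes to the Windows-native backslash form."""
    return ntpath.normpath(path)

print(normalize_windows_path("C:/Users/YourName/system.txt"))
# C:\Users\YourName\system.txt
```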

Configuration Examples

Linux/macOS:

# OpenAI API Key (AICI_OPENAI_KEY takes priority over OPENAI_API_KEY)
export AICI_OPENAI_KEY=sk-xxxxxxxxxxxxxxxxx
export OPENAI_API_KEY=sk-xxxxxxxxxxxxxxxxx

# DeepSeek API Key (AICI_DEEPSEEK_KEY takes priority over DEEPSEEK_API_KEY)
export AICI_DEEPSEEK_KEY=sk-xxxxxxxxxxxxxxxxx
export DEEPSEEK_API_KEY=sk-xxxxxxxxxxxxxxxxx

# Model Selection
export AICI_MODEL=gpt-4o-mini           # General model selection
export AICI_OPENAI_MODEL=gpt-4o-mini    # OpenAI specific model
export AICI_DEEPSEEK_MODEL=deepseek-chat # DeepSeek specific model

# System Message
export AICI_SYSTEM="You are a helpful assistant."
export AICI_SYSTEM_FILE=~/path/to/system_message.txt

Windows (Command Prompt):

# OpenAI API Key (AICI_OPENAI_KEY takes priority over OPENAI_API_KEY)
set AICI_OPENAI_KEY=sk-xxxxxxxxxxxxxxxxx
set OPENAI_API_KEY=sk-xxxxxxxxxxxxxxxxx

# DeepSeek API Key (AICI_DEEPSEEK_KEY takes priority over DEEPSEEK_API_KEY)
set AICI_DEEPSEEK_KEY=sk-xxxxxxxxxxxxxxxxx
set DEEPSEEK_API_KEY=sk-xxxxxxxxxxxxxxxxx

# Model Selection
set AICI_MODEL=gpt-4o-mini
set AICI_OPENAI_MODEL=gpt-4o-mini
set AICI_DEEPSEEK_MODEL=deepseek-chat

# System Message
set AICI_SYSTEM="You are a helpful assistant."
set AICI_SYSTEM_FILE=C:\path\to\system_message.txt

Config Files

It checks the following files, in the order listed (the first one found wins):

  • ~/.config/aici/config
  • ~/.aici

# API Keys
AICI_OPENAI_KEY=sk-xxxxxxxxxxxxxxxxx
OPENAI_API_KEY=sk-xxxxxxxxxxxxxxxxx
AICI_DEEPSEEK_KEY=sk-xxxxxxxxxxxxxxxxx
DEEPSEEK_API_KEY=sk-xxxxxxxxxxxxxxxxx

# Model Selection
AICI_MODEL=gpt-4o-mini
AICI_OPENAI_MODEL=gpt-4o-mini
AICI_DEEPSEEK_MODEL=deepseek-chat

# System Message
AICI_SYSTEM=You are a helpful assistant.
AICI_SYSTEM_FILE=~/path/to/system_message.txt  # Load system message from file

🖥️ On Windows, the config file path expands to the following candidates (highest priority first):

  • C:\Users\{USERNAME}\AppData\Local\aici\config
  • C:\Users\{USERNAME}\AppData\Roaming\aici\config
  • C:\Users\{USERNAME}\.config\aici\config
  • C:\Users\{USERNAME}\.aici
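The search order above is a first-existing-file lookup over a priority-ordered list. A minimal sketch (helper name is illustrative):

```python
from pathlib import Path

def find_config(candidates):
    """Return the first existing config file from the priority-ordered list."""
    for candidate in candidates:
        path = Path(candidate).expanduser()
        if path.is_file():
            return path
    return None
```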

👋 Examples:

💨 Basic input from CLI

$ aici Hello

💨 Read from stdin

$ echo Hello | aici -

💨 Specify a model

$ aici -m gpt-4o "What's the weather like today?"
$ aici -m gpt-4o-mini "What's the weather like today?"
$ aici -m deepseek-chat "Tell me about quantum computing"

💨 Use a system message from a file

$ echo "You are a helpful coding assistant." > system.txt
$ aici -S system.txt "How do I write a Python function?"

💨 Enable debug mode

$ aici -V "Hello there"

💨 Output to clipboard 📋

$ echo Hello | aici - --output clip

👋 Emacs

Emacs Lisp Code Example

Below is the content of emacs/aici.el

(defun aici-call ()
  "Send the selected region (or prompt for input if no region is active) to the `aici' command and insert the output in real time."
  (interactive)
  (let* ((text (if (use-region-p)
                   (buffer-substring-no-properties (region-beginning) (region-end))
                 (read-string "Enter text: ")))
         ;; Attempt to create or get the output buffer
         (output-buffer (get-buffer-create "*AICI Output*")))

    ;; Check if the buffer creation was successful
    (if (not output-buffer)
        (error "Failed to create or access the output buffer")
      ;; Clear the output buffer
      (with-current-buffer output-buffer
        (erase-buffer)
        ;; Set the buffer to markdown-mode
        (markdown-mode))

      ;; Display a message indicating that processing is ongoing
      (message "Processing...")

      ;; Start the process and stream the output to the buffer
      (let ((process (start-process "aici-process" output-buffer "sh" "-c"
                                    (format "echo %s | aici -" (shell-quote-argument text)))))
        ;; Set a process filter to handle output streaming
        (set-process-filter process
                            (lambda (proc output)
                              ;; Explicitly reference the output-buffer by capturing it in the lambda
                              (let ((buffer (process-buffer proc)))
                                (when (buffer-live-p buffer)
                                  (with-current-buffer buffer
                                    (goto-char (point-max))
                                    (insert output)
                                    ;; Optionally display the buffer in real-time
                                    (display-buffer buffer))))))

        ;; Set a sentinel to handle process completion
        (set-process-sentinel process
                              (lambda (proc event)
                                ;; Again, ensure that output-buffer is properly referenced
                                (let ((buffer (process-buffer proc)))
                                  (when (buffer-live-p buffer)
                                    (if (string= event "finished\n")
                                        (message "Processing complete.")
                                      (message "Processing interrupted: %s" event))))))))

      ;; Ensure the output buffer is displayed after starting the process
      (display-buffer output-buffer)))

License

This project is licensed under the MIT License - see the LICENSE file for details.


Download files

Download the file for your platform.

Source Distribution

aici-0.0.11.tar.gz (486.1 kB)

Uploaded Source

Built Distribution


aici-0.0.11-py3-none-any.whl (21.8 kB)

Uploaded Python 3

File details

Details for the file aici-0.0.11.tar.gz.

File metadata

  • Download URL: aici-0.0.11.tar.gz
  • Upload date:
  • Size: 486.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.11.9

File hashes

Hashes for aici-0.0.11.tar.gz

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 3cf12c32fba0343f28aaf894b115b4ea11d173a532cdc6a41083e83cd75b23d1 |
| MD5 | d98adf6d053673dadeef0fcce25ae66b |
| BLAKE2b-256 | f605b07f3cfa4207e35d5100cbad7bfcc2ba72dbf4803f0c7899924f1dc109ab |


File details

Details for the file aici-0.0.11-py3-none-any.whl.

File metadata

  • Download URL: aici-0.0.11-py3-none-any.whl
  • Upload date:
  • Size: 21.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.11.9

File hashes

Hashes for aici-0.0.11-py3-none-any.whl

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | fdcb1ec4b0aa17add372b6fe0aae26e19843c5cf2d83ac8b06afeb230ea02f2a |
| MD5 | c0cbac05de48dd128c3bb04123432420 |
| BLAKE2b-256 | 801a4dc5201f52bd185ba4b37be1862b1ac8caef7dccf24114e9ea9805c1eaa6 |

