
A command line interface for ChatGPT

Project description

aici 🚀


A command line interface tool for AI models like OpenAI's ChatGPT and DeepSeek AI. 🤖💬

Use case: you want to use AI models from editors such as Emacs, or from automated tools.

[Screenshot: command line]

[Screenshot: Emacs]

📦 Installation:

pip install aici

📖 Overview:

AICI (AI Chat Interface) is a Python🐍 command-line tool for interacting with AI models from OpenAI or DeepSeek. It takes a user's prompt as input and outputs the response from the selected AI model. The output can be directed to either standard output or the clipboard📋.

Key Features:

  • Support for OpenAI and DeepSeek models
  • Streaming responses (or complete responses with -c)
  • Custom system messages via direct input or file
  • Clipboard output support
  • Environment variable configuration
  • JSON conversation format support

💻 Command-Line Description:

| Argument | Env val | Default | Type | Description |
| --- | --- | --- | --- | --- |
| -v, --version | - | - | - | Show the version and exit |
| prompt | - | - | str | Prompt to send to the AI; pass "-" to read from standard input |
| -m, --model | AICI_MODEL | gpt-3.5-turbo | str | Model name to use (gpt-3.5-turbo, gpt-4, gpt-4o, deepseek-chat, etc.) |
| -c, --complete | - | False | bool | Get the complete response at once instead of streaming |
| -s, --system | AICI_SYSTEM | You are a helpful assistant. | str | System message to use |
| -S, --system-file | AICI_SYSTEM_FILE | - | str | File containing the system message |
| -V, --verbose | - | False | bool | Show verbose debug output |
| -o, --output | - | stdout | str | Output destination; "clip" copies the response to the clipboard |

See the OpenAI models documentation and the DeepSeek models documentation for available model names.

📥 Input:

  • 💻 standard input
  • 💬 command parameter

📤 Output:

  • 💻 standard output (streaming or buffered)
  • 📋 clipboard

🔧 Config Environment Variables or File:

🔑 API keys can be set using environment variables or config files

Environment Variables

# OpenAI API Key (either one can be used)
set AICI_OPENAI_KEY=sk-xxxxxxxxxxxxxxxxx
set OPENAI_API_KEY=sk-xxxxxxxxxxxxxxxxx

# DeepSeek API Key (either one can be used)
set AICI_DEEPSEEK_KEY=sk-xxxxxxxxxxxxxxxxx
set DEEPSEEK_API_KEY=sk-xxxxxxxxxxxxxxxxx

# Model Selection
set AICI_MODEL=gpt-4o                # General model selection
set AICI_OPENAI_MODEL=gpt-4o         # OpenAI specific model
set AICI_DEEPSEEK_MODEL=deepseek-chat # DeepSeek specific model

# System Message
set AICI_SYSTEM="You are a helpful assistant."
set AICI_SYSTEM_FILE=~/path/to/system_message.txt  # Read the system message from a file

Config Files

Aici checks for config files in the following locations, in this order:

  1. ~/.config/aici/config
  2. ~/.aici

# API Keys
AICI_OPENAI_KEY=sk-xxxxxxxxxxxxxxxxx
OPENAI_API_KEY=sk-xxxxxxxxxxxxxxxxx
AICI_DEEPSEEK_KEY=sk-xxxxxxxxxxxxxxxxx
DEEPSEEK_API_KEY=sk-xxxxxxxxxxxxxxxxx

# Model Selection
AICI_MODEL=gpt-4o
AICI_OPENAI_MODEL=gpt-4o
AICI_DEEPSEEK_MODEL=deepseek-chat

# System Message
AICI_SYSTEM=You are a helpful assistant.
AICI_SYSTEM_FILE=~/path/to/system_message.txt  # Read the system message from a file
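A minimal sketch of how this file-based lookup could work, assuming simple KEY=VALUE parsing and that a key found in an earlier file wins (the function names are hypothetical, not aici's actual implementation):

```python
import os

def parse_config_file(path):
    """Parse simple KEY=VALUE lines; blank lines and # comments are ignored."""
    settings = {}
    try:
        with open(os.path.expanduser(path)) as f:
            for line in f:
                line = line.split("#", 1)[0].strip()
                if "=" in line:
                    key, _, value = line.partition("=")
                    settings[key.strip()] = value.strip()
    except OSError:
        pass  # a missing config file is simply skipped
    return settings

def load_config(paths=("~/.config/aici/config", "~/.aici")):
    """Merge config files in the order given; earlier files take
    priority (assumed from the documented ordering)."""
    merged = {}
    for path in paths:
        for key, value in parse_config_file(path).items():
            merged.setdefault(key, value)  # keep the earlier file's value
    return merged
```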

🖥️ On Windows, the config file paths expand as follows (applied in the order listed, top to bottom):

  1. C:\Users\{USERNAME}\AppData\Local\aici\config
  2. C:\Users\{USERNAME}\AppData\Roaming\aici\config
  3. C:\Users\{USERNAME}\.config\aici\config
  4. C:\Users\{USERNAME}\.aici

👋 Examples:

💨 Basic input from CLI

$ aici Hello

💨 Read from stdin

$ echo Hello | aici -

💨 Specify a model

$ aici -m gpt-4o "What's the weather like today?"
$ aici -m deepseek-chat "Tell me about quantum computing"

💨 Use a system message file

$ aici -S system.txt "Tell me about quantum computing"

💨 Create a system message file, then use it

$ echo "You are a helpful coding assistant." > system.txt
$ aici -S system.txt "How do I write a Python function?"

💨 Enable debug mode

$ aici -V "Hello there"

🔄 Advanced Input Formats

JSON Input Format

Aici supports advanced input formats through stdin, allowing you to provide conversation context and complex prompts using JSON. When using the - parameter to read from stdin, aici will automatically detect if the input is JSON and process it accordingly.

JSON Conversation Format

You can provide a complete conversation context using the following JSON format:

{
  "prompts": [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello, how are you?"},
    {"role": "assistant", "content": "I'm doing well, thank you for asking!"},
    {"role": "user", "content": "Tell me a joke."}
  ]
}

Each prompt in the array should contain a role and content field. The supported roles are:

  • system: Sets the system instructions for the AI
  • user: Represents messages from the user
  • assistant: Represents previous responses from the AI

Alternative JSON Format

For convenience, aici also supports an alternative format where the role is implied by the key name:

{
  "prompts": [
    {"system": "You are a helpful assistant."},
    {"user": "Hello, how are you?"},
    {"assistant": "I'm doing well, thank you for asking!"},
    {"user": "Tell me a joke."}
  ]
}
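Both shapes can be normalized to the standard role/content form. A minimal sketch (the helper name is hypothetical, not part of aici's API):

```python
def normalize_prompts(prompts):
    """Convert both supported prompt shapes into
    {"role": ..., "content": ...} dictionaries."""
    messages = []
    for item in prompts:
        if "role" in item and "content" in item:
            # Standard format: explicit role and content fields.
            messages.append({"role": item["role"], "content": item["content"]})
        else:
            # Alternative format: the key itself names the role.
            for role, content in item.items():
                messages.append({"role": role, "content": content})
    return messages
```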

How JSON Input is Processed

When a JSON input is detected:

  1. If the JSON contains a prompts array, aici will extract the conversation context
  2. System messages are used to set the system instructions
  3. The last user message is used as the primary prompt
  4. All messages are preserved in the conversation context
  5. The AI response will consider the entire conversation history

Example Usage

# Using a JSON file with conversation context
$ cat conversation.json | aici -

# Creating a JSON conversation inline
$ echo '{"prompts": [{"system": "You are a helpful assistant."}, {"user": "Tell me a joke about programming."}]}' | aici -

Fallback Behavior

If the input starts with { and ends with } but cannot be parsed as valid JSON, or if the JSON doesn't contain the expected structure, aici will treat the entire input as plain text.
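The detection logic described above can be sketched like this (a simplified illustration, not aici's actual code):

```python
import json

def read_prompt(raw):
    """Return a messages list, falling back to plain text when the
    input only looks like JSON or lacks the expected "prompts" array."""
    text = raw.strip()
    if text.startswith("{") and text.endswith("}"):
        try:
            data = json.loads(text)
            if isinstance(data, dict) and isinstance(data.get("prompts"), list):
                return data["prompts"]
        except json.JSONDecodeError:
            pass  # not valid JSON; fall through to plain text
    return [{"role": "user", "content": raw}]
```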

💨 output to clipboard 📋

$ echo Hello | aici - --output clip

👋 emacs

Emacs Lisp Code Example

Below is the content of emacs/aici.el

(defun aici-call ()
  "Send selected region or prompt for input if no region is selected to the 'aici' command and insert the output in real-time."
  (interactive)
  (let* ((text (if (use-region-p)
                   (buffer-substring-no-properties (region-beginning) (region-end))
                 (read-string "Enter text: ")))
         ;; Attempt to create or get the output buffer
         (output-buffer (get-buffer-create "*AICI Output*")))

    ;; Check if the buffer creation was successful
    (if (not output-buffer)
        (error "Failed to create or access the output buffer")
      ;; Clear the output buffer
      (with-current-buffer output-buffer
        (erase-buffer)
        ;; Set the buffer to markdown-mode
        (markdown-mode))

      ;; Display a message indicating that processing is ongoing
      (message "Processing...")

      ;; Start the process and stream the output to the buffer
      (let ((process (start-process "aici-process" output-buffer "sh" "-c"
                                    (format "echo %s | aici -" (shell-quote-argument text)))))
        ;; Set a process filter to handle output streaming
        (set-process-filter process
                            (lambda (proc output)
                              ;; Explicitly reference the output-buffer by capturing it in the lambda
                              (let ((buffer (process-buffer proc)))
                                (when (buffer-live-p buffer)
                                  (with-current-buffer buffer
                                    (goto-char (point-max))
                                    (insert output)
                                    ;; Optionally display the buffer in real-time
                                    (display-buffer buffer))))))

        ;; Set a sentinel to handle process completion
        (set-process-sentinel process
                              (lambda (proc event)
                                ;; Again, ensure that output-buffer is properly referenced
                                (let ((buffer (process-buffer proc)))
                                  (when (buffer-live-p buffer)
                                    (if (string= event "finished\n")
                                        (message "Processing complete.")
                                      (message "Processing interrupted: %s" event))))))))

      ;; Ensure the output buffer is displayed after starting the process
      (display-buffer output-buffer)))

License

This project is licensed under the MIT License - see the LICENSE file for details.



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

aici-0.0.10.tar.gz (481.9 kB view details)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

aici-0.0.10-py3-none-any.whl (21.7 kB view details)

Uploaded Python 3

File details

Details for the file aici-0.0.10.tar.gz.

File metadata

  • Download URL: aici-0.0.10.tar.gz
  • Upload date:
  • Size: 481.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.11.9

File hashes

Hashes for aici-0.0.10.tar.gz
Algorithm Hash digest
SHA256 2eaf8bd0e715e9bedd29a8882cda4722c2d94d6f1134c26eb5a45897cc312cea
MD5 83593041879b821ebe69af7f17a65316
BLAKE2b-256 37b4a5daaaf2be961b35682b4dd9454dc545514fdfdcd22f932e86494e63b2d9


File details

Details for the file aici-0.0.10-py3-none-any.whl.

File metadata

  • Download URL: aici-0.0.10-py3-none-any.whl
  • Upload date:
  • Size: 21.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.11.9

File hashes

Hashes for aici-0.0.10-py3-none-any.whl
Algorithm Hash digest
SHA256 7b780dd44332890106ccfe05ea5b0b0daa90935d9a57f8d24b6d2bb304e74440
MD5 f3e7d7cfdd1b83aa7d71b9722439ad2e
BLAKE2b-256 861bd95b0846f7489bd0c1e128673e9566ea226d742f05b1e36276cb7a3a7e38

