
Command-line interface for a number of AI models

Project description

A (yet another) GNU Readline-based application for interaction with chat-oriented AI models.

Supported model providers:

  • GPT4All (local models)
  • OpenAI

Contents

  • Install
  • Usage
  • Vim integration

Install

The following installation options are available:

Pip

$ pip install sm_aicli

Nix

$ git clone --depth=1 https://github.com/sergei-mironov/aicli && cd aicli
# Optionally, change the 'nixpkgs' input of the flake.nix to a more suitable one
$ nix profile install ".#python-aicli"
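
Either way, the installation can be checked by printing the tool's version:

$ aicli --version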

Usage

usage: aicli [-h] [--model-dir MODEL_DIR] [--model [STR1:]STR2]
             [--num-threads NUM_THREADS] [--model-apikey STR]
             [--model-temperature MODEL_TEMPERATURE] [--device DEVICE]
             [--readline-key-send READLINE_KEY_SEND]
             [--readline-prompt READLINE_PROMPT] [--readline-history FILE]
             [--verbose NUM] [--revision] [--version] [--no-rc]

Command-line arguments

options:
  -h, --help            show this help message and exit
  --model-dir MODEL_DIR
                        Model directory to prepend to model file names
  --model [STR1:]STR2, -m [STR1:]STR2
                        Model to use. STR1 is 'gpt4all' (the default) or
                        'openai'. STR2 is the model name
  --num-threads NUM_THREADS, -t NUM_THREADS
                        Number of threads to use
  --model-apikey STR    Model provider-specific API key
  --model-temperature MODEL_TEMPERATURE
                        Temperature parameter of the model
  --device DEVICE, -d DEVICE
                        Device to use for the chatbot, e.g. gpu, amd, nvidia,
                        intel. Defaults to CPU
  --readline-key-send READLINE_KEY_SEND
                        Terminal code to treat as Ctrl+Enter (default: \C-k)
  --readline-prompt READLINE_PROMPT
                        Input prompt (default: >>>)
  --readline-history FILE
                        History file name (default is '_sm_aicli_history'; set
                        empty to disable)
  --verbose NUM         Set the verbosity level: 0 - none, 1 - full
  --revision            Print the revision
  --version             Print the version
  --no-rc               Do not read configuration files
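
For instance, a local GPT4All model could be selected by combining --model-dir
with --model; the directory and file name below are only illustrative (they
match the example session further down):

$ aicli --readline-prompt "ai> " --num-threads 8 \
        --model-dir ~/.local/share/nomic.ai/GPT4All \
        --model gpt4all:Meta-Llama-3-8B-Instruct.Q4_0.gguf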

The console accepts a language defined by the following grammar:

start: (command | escape | text)? (command | escape | text)*
escape.3: /\\./
command.2: /\/ask|\/exit|\/help|\/reset/ | \
           /\/model/ / +/ (/"/ model_string /"/ | /"/ /"/) | \
           /\/apikey/ / +/ (/"/ apikey_string /"/ | /"/ /"/) | \
           /\/nthreads/ / +/ (number | def) | \
           /\/verbose/ / +/ (number | def) | \
           /\/temp/ / +/ (float | def ) | \
           /\/echo/ | /\/echo/ / /
model_string: (model_provider ":")? model_name
model_provider: "gpt4all" -> mp_gpt4all | "openai" -> mp_openai | "dummy" -> mp_dummy
model_name: /[^"]+/
apikey_string: (apikey_schema ":")? apikey_value
apikey_schema: "verbatim" -> as_verbatim | "file" -> as_file
apikey_value: /[^"]+/
number: /[0-9]+/
float: /[0-9]+\.[0-9]*/
def: "default"
text: /(.(?!\/|\\))*./s
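
For example, all of the following console inputs are valid under this grammar
(the OpenAI model name and the key file path are placeholders, not
recommendations):

>>> /model "openai:gpt-4o"
>>> /apikey "file:~/.openai-apikey"
>>> /temp 0.7
>>> /nthreads default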

Example session

$ aicli
Type /help or a question followed by the /ask command (or by pressing the `C-k` key).
>>> /model "~/.local/share/nomic.ai/GPT4All/Meta-Llama-3-8B-Instruct.Q4_0.gguf"
>>> Hi!
>>> /ask
Hello! I'm happy to help you. What's on your mind?^C
>>> What's your name?
>>> /ask
I don't really have a personal name, but you can call me "Assistant"
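
A remote provider is driven the same way. The session below is only a sketch:
it assumes an OpenAI API key stored in a file and uses an illustrative model
name.

$ aicli
>>> /apikey "file:~/.openai-apikey"
>>> /model "openai:gpt-4o"
>>> What is GNU Readline?
>>> /ask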

Vim integration

Aicli is supported by the Litrepl text processor.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

sm_aicli-1.7.0.tar.gz (12.0 kB view details)

Uploaded Source

Built Distribution

sm_aicli-1.7.0-py3-none-any.whl (11.6 kB view details)

Uploaded Python 3

File details

Details for the file sm_aicli-1.7.0.tar.gz.

File metadata

  • Download URL: sm_aicli-1.7.0.tar.gz
  • Upload date:
  • Size: 12.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.11.9

File hashes

Hashes for sm_aicli-1.7.0.tar.gz

Algorithm    Hash digest
SHA256       b1daa26277e423fb8a9a538e5dbc46ae36c2c3be083cee594a2e3b93c85276bf
MD5          f2b3cad1fcb03128644a5f83c40d220e
BLAKE2b-256  24753543ec13ccdaa6dcf7c844468697a955c2dcf828628bb23b8a557e576ce9

See more details on using hashes here.
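
As a quick sketch, the source archive's SHA256 digest can be compared against
the value above after fetching it with pip (the download directory is
arbitrary):

$ pip download sm_aicli==1.7.0 --no-deps --no-binary :all: -d /tmp/aicli-dl
$ sha256sum /tmp/aicli-dl/sm_aicli-1.7.0.tar.gz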

File details

Details for the file sm_aicli-1.7.0-py3-none-any.whl.

File metadata

  • Download URL: sm_aicli-1.7.0-py3-none-any.whl
  • Upload date:
  • Size: 11.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.11.9

File hashes

Hashes for sm_aicli-1.7.0-py3-none-any.whl

Algorithm    Hash digest
SHA256       b624dc5a0c85c75f3a1dc17c5c9c220dcb73ab0ca443af51fb682e8945ea5a42
MD5          4a8964f11d496c5c3e91713e1920508f
BLAKE2b-256  1e7dee2667614bacadc56c06591f4b6158e8444c2879330d38d198083afbcece

See more details on using hashes here.
