Command-line interface for a number of AI models
Yet another GNU Readline-based application for interacting with chat-oriented AI models.
Supported model providers:
- gpt4all
- openai
- dummy (a stub provider, as listed in the console grammar below)
Install
The following installation options are available:
Pip
$ pip install sm_aicli
Nix
$ git clone --depth=1 https://github.com/sergei-mironov/aicli && cd aicli
# Optionally, change the 'nixpkgs' input of the flake.nix to a more suitable one
$ nix profile install ".#python-aicli"
Usage
usage: aicli [-h] [--model-dir MODEL_DIR] [--model [STR1:]STR2]
[--num-threads NUM_THREADS] [--model-apikey STR]
[--model-temperature MODEL_TEMPERATURE] [--device DEVICE]
[--readline-key-send READLINE_KEY_SEND]
[--readline-prompt READLINE_PROMPT] [--readline-history FILE]
[--verbose NUM] [--revision] [--version] [--no-rc]
Command-line arguments
options:
-h, --help show this help message and exit
--model-dir MODEL_DIR
Model directory to prepend to model file names
--model [STR1:]STR2, -m [STR1:]STR2
Model to use. STR1 is 'gpt4all' (the default) or
'openai'. STR2 is the model name
--num-threads NUM_THREADS, -t NUM_THREADS
Number of threads to use
--model-apikey STR Model provider-specific API key
--model-temperature MODEL_TEMPERATURE
Temperature parameter of the model
--device DEVICE, -d DEVICE
Device to use for chatbot, e.g. gpu, amd, nvidia,
intel. Defaults to CPU
--readline-key-send READLINE_KEY_SEND
Terminal code to treat as Ctrl+Enter (default: \C-k)
--readline-prompt READLINE_PROMPT
Input prompt (default: >>>)
--readline-history FILE
History file name (default is '_sm_aicli_history'; set
empty to disable)
--verbose NUM Set the verbosity level (0 - none, 1 - full)
--revision Print the revision
--version Print the version
--no-rc Do not read configuration files
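The `--model [STR1:]STR2` option takes an optional provider prefix before the model name, falling back to `gpt4all` when the prefix is omitted. A minimal sketch of that splitting logic (a hypothetical illustration, not the actual aicli code; the real parser may handle quoting and edge cases differently):

```python
def parse_model_spec(spec, default_provider="gpt4all"):
    """Split a '[provider:]name' model specifier into its two parts.

    If no colon is present, the whole string is treated as the model
    name and the default provider is assumed.
    """
    provider, sep, name = spec.partition(":")
    if not sep:  # no colon: the whole string is the model name
        return default_provider, spec
    return provider, name
```

For example, `parse_model_spec("openai:gpt-4o")` yields the `openai` provider, while a bare GGUF file name is attributed to `gpt4all`.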
The console accepts language defined by the following grammar:
start: (command | escape | text)? (command | escape | text)*
escape.3: /\\./
command.2: /\/ask|\/exit|\/help|\/reset/ | \
/\/model/ / +/ (/"/ model_string /"/ | /"/ /"/) | \
/\/apikey/ / +/ (/"/ apikey_string /"/ | /"/ /"/) | \
/\/nthreads/ / +/ (number | def) | \
/\/verbose/ / +/ (number | def) | \
/\/temp/ / +/ (float | def ) | \
/\/echo/ | /\/echo/ / /
model_string: (model_provider ":")? model_name
model_provider: "gpt4all" -> mp_gpt4all | "openai" -> mp_openai | "dummy" -> mp_dummy
model_name: /[^"]+/
apikey_string: (apikey_schema ":")? apikey_value
apikey_schema: "verbatim" -> as_verbatim | "file" -> as_file
apikey_value: /[^"]+/
number: /[0-9]+/
float: /[0-9]+\.[0-9]*/
def: "default"
text: /(.(?!\/|\\))*./s
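To make the grammar concrete, here is a simplified tokenizer that recognizes the three top-level token kinds (escape, command, text). This is a hypothetical sketch for illustration only, not the parser aicli actually uses; command arguments and quoted strings are deliberately left out:

```python
import re

# One alternative per top-level rule: escape (backslash + any char),
# command (a known /verb), or a run of plain text.
TOKEN = re.compile(
    r"(?P<escape>\\.)"
    r"|(?P<command>/(?:ask|exit|help|reset|model|apikey|nthreads|verbose|temp|echo))"
    r"|(?P<text>[^/\\]+)"
)

def tokenize(line):
    """Split one console input line into (kind, value) pairs."""
    return [(m.lastgroup, m.group()) for m in TOKEN.finditer(line)]
```

Feeding it `Hi!/ask` produces a text token followed by the `/ask` command token, matching how the example session below is interpreted.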
Example session
$ aicli
Type /help or a question followed by the /ask command (or by pressing the `C-k` key).
>>> /model "~/.local/share/nomic.ai/GPT4All/Meta-Llama-3-8B-Instruct.Q4_0.gguf"
>>> Hi!
>>> /ask
Hello! I'm happy to help you. What's on your mind?^C
>>> What's your name?
>>> /ask
I don't really have a personal name, but you can call me "Assistant"
Vim integration
Aicli is supported by the Litrepl text processor.