Command-line interface using GPT4All bindings
Project description
A simple GNU Readline-based application for interaction with chat-oriented AI models using GPT4All Python bindings.
Install
The following installation options are available:
Pip
$ pip install git+https://github.com/sergei-mironov/gpt4all-cli.git
Note: pip install gpt4all-cli might also work, but the git+https method brings the most recent version.
Nix
$ git clone --depth=1 https://github.com/sergei-mironov/gpt4all-cli && cd gpt4all-cli
# Optionally, change the 'nixpkgs' input of the flake.nix to a more suitable version
$ nix profile install ".#python-gpt4all-cli"
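With either installation method, a quick sanity check is to ask the tool for its revision via the --revision flag documented in the usage below; it should print the revision and exit:

$ gpt4all-cli --revision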
Usage
usage: gpt4all-cli [-h] [--model-dir MODEL_DIR] [--model MODEL]
[--num-threads NUM_THREADS]
[--model-temperature MODEL_TEMPERATURE] [--device DEVICE]
[--readline-key-send READLINE_KEY_SEND]
[--readline-prompt READLINE_PROMPT]
[--readline-history FILE] [--revision]
Command-line arguments
options:
-h, --help show this help message and exit
--model-dir MODEL_DIR
Model directory to prepend to model file names
--model MODEL, -m MODEL
Model to use for chatbot
--num-threads NUM_THREADS, -t NUM_THREADS
Number of threads to use for chatbot
--model-temperature MODEL_TEMPERATURE
Temperature parameter of the model
--device DEVICE, -d DEVICE
Device to use for chatbot, e.g. gpu, amd, nvidia,
intel. Defaults to CPU.
--readline-key-send READLINE_KEY_SEND
Terminal code to treat as Ctrl+Enter (default: \C-k)
--readline-prompt READLINE_PROMPT
Input prompt (default: >>>)
--readline-history FILE
History file name (default is '_gpt4all_cli_history';
set empty to disable)
--revision Print the revision
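For example, a typical invocation combines a model directory, a model file name, and optionally a device and thread count. This is only a sketch: the directory and model file are taken from the example session further below and the thread count is arbitrary; substitute whatever model you have downloaded.

$ gpt4all-cli --model-dir ~/.local/share/nomic.ai/GPT4All \
              --model Meta-Llama-3-8B-Instruct.Q4_0.gguf \
              --num-threads 8 --device gpu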
The console accepts a language defined by the following grammar; a few example inputs follow the grammar.
start: (command | escape | text)? (command | escape | text)*
escape.3: /\\./
command.2: /\/help|\/reset|\/exit|\/ask/ | \
/\/model/ / +/ string | \
/\/nthreads/ / +/ (number | def) | \
/\/temp/ / +/ (float | def ) | \
/\/echo/ | /\/echo/ / /
string: /"[^\"]+"/ | /""/
number: /[0-9]+/
float: /[0-9]+\.[0-9]*/
def: "default"
text: /(.(?!\/|\\))*./s
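For illustration, the lines below are inputs this grammar accepts; the values are arbitrary. Text that does not start with a slash is collected as the question and sent to the model by /ask, as the example session below shows.

>>> /temp 0.7
>>> /nthreads default
>>> /reset
>>> /exit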
Example session
$ gpt4all-cli
Type /help or a question followed by the /ask command (or by pressing `C-k` key).
>>> /model "~/.local/share/nomic.ai/GPT4All/Meta-Llama-3-8B-Instruct.Q4_0.gguf"
>>> Hi!
>>> /ask
Hello! I'm happy to help you. What's on your mind?^C
>>> What's your name?
>>> /ask
I don't really have a personal name, but you can call me "Assistant"
Vim integration
Gpt4all-cli is supported by the Litrepl text processor.
Download files
Source Distribution
gpt4all_cli-1.2.1.tar.gz (7.5 kB)
Built Distribution
gpt4all_cli-1.2.1-py3-none-any.whl (7.5 kB)
File details
Details for the file gpt4all_cli-1.2.1.tar.gz.
File metadata
- Download URL: gpt4all_cli-1.2.1.tar.gz
- Upload date:
- Size: 7.5 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.0.0 CPython/3.11.9
File hashes
Algorithm | Hash digest
---|---
SHA256 | f1eb289d33347217bda89ab35a5d26462fe0502a01e2b2b730db6d3044b5190a
MD5 | 27c56dc1b415c3eedbd94286b913bcb5
BLAKE2b-256 | e5fb6a27e0723e54687d652cfda7b55fad584a57cf8d9fbf2ded91edb0c78a20
File details
Details for the file gpt4all_cli-1.2.1-py3-none-any.whl.
File metadata
- Download URL: gpt4all_cli-1.2.1-py3-none-any.whl
- Upload date:
- Size: 7.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.0.0 CPython/3.11.9
File hashes
Algorithm | Hash digest
---|---
SHA256 | efbd374aa71552458eafd5888d2786ba411591fbc324fc87280946a4364dda53
MD5 | ba788eda59b9f3c0afd3d02567b17e7c
BLAKE2b-256 | c055a169c6df67a8aacea99add104356c1ee9cfee3c4a0447652f5af9bfc5b2c