Command-line interface using GPT4All bindings
Project description
A simple GNU Readline-based application for interaction with chat-oriented AI models using GPT4All Python bindings.
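The application is essentially a Readline front-end over the GPT4All Python bindings. As a rough illustration of the kind of call it builds on (not the application's actual code; the model file name, thread count, and temperature below are placeholders):

from gpt4all import GPT4All  # GPT4All Python bindings

# Load a local GGUF model; the file name and n_threads value are illustrative.
model = GPT4All("Meta-Llama-3-8B-Instruct.Q4_0.gguf", n_threads=4)

# A chat session keeps conversation context between turns.
with model.chat_session():
    reply = model.generate("Hi!", max_tokens=200, temp=0.9)
    print(reply)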
Install
The following installation options are available:
Pip
$ pip install git+https://github.com/sergei-mironov/gpt4all-cli.git
Note: pip install gpt4all-cli might also work, but the git+https method would bring the most recent version.
Nix
$ git clone --depth=1 https://github.com/sergei-mironov/gpt4all-cli && cd gpt4all-cli
# Optionally, change the 'nixpkgs' input of the flake.nix to a more suitable one
$ nix profile install ".#python-gpt4all-cli"
Usage
usage: gpt4all-cli [-h] [--model-dir MODEL_DIR] [--model MODEL]
[--num-threads NUM_THREADS]
[--model-temperature MODEL_TEMPERATURE] [--device DEVICE]
[--readline-key-send READLINE_KEY_SEND]
[--readline-prompt READLINE_PROMPT]
[--readline-history FILE] [--revision]
Command-line arguments
options:
-h, --help show this help message and exit
--model-dir MODEL_DIR
Model directory to prepend to model file names
--model MODEL, -m MODEL
Model to use for chatbot
--num-threads NUM_THREADS, -t NUM_THREADS
Number of threads to use for chatbot
--model-temperature MODEL_TEMPERATURE
Temperature parameter of the model
--device DEVICE, -d DEVICE
Device to use for chatbot, e.g. gpu, amd, nvidia,
intel. Defaults to CPU.
--readline-key-send READLINE_KEY_SEND
Terminal code to treat as Ctrl+Enter (default: \C-k)
--readline-prompt READLINE_PROMPT
Input prompt (default: >>>)
--readline-history FILE
History file name (default is '_gpt4all_cli_history';
set empty to disable)
--revision Print the revision
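For example, to load a model file from a model directory and run it on the GPU with four threads (the directory and model file name below are illustrative):

$ gpt4all-cli --model-dir ~/.local/share/nomic.ai/GPT4All \
    --model Meta-Llama-3-8B-Instruct.Q4_0.gguf --device gpu --num-threads 4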
The console accepts a language defined by the following grammar:
start: (command | escape | text)? (command | escape | text)*
escape.3: /\\./
command.2: /\/exit|\/reset|\/help|\/ask/ | \
           /\/model/ / +/ string | \
           /\/nthreads/ / +/ (number | def) | \
           /\/temp/ / +/ (float | def) | \
           /\/echo/ | /\/echo/ / /
string: /"[^\"]+"/ | /""/
number: /[0-9]+/
float: /[0-9]+\.[0-9]*/
def: "default"
text: /(.(?!\/|\\))*./s
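For instance, each of the following input lines is a valid command under this grammar (the values shown are illustrative):

/model "~/.local/share/nomic.ai/GPT4All/Meta-Llama-3-8B-Instruct.Q4_0.gguf"
/temp 0.9
/nthreads default
/reset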
Example session
$ gpt4all-cli
Type /help or a question followed by the /ask command (or by pressing the `C-k` key).
>>> /model "~/.local/share/nomic.ai/GPT4All/Meta-Llama-3-8B-Instruct.Q4_0.gguf"
>>> Hi!
>>> /ask
Hello! I'm happy to help you. What's on your mind?^C
>>> What's your name?
>>> /ask
I don't really have a personal name, but you can call me "Assistant"
Vim integration
Gpt4all-cli is supported by the Litrepl text processor.
Project details
Download files
Download the file for your platform.
Source Distribution
gpt4all_cli-1.2.2.tar.gz (7.7 kB)
Built Distribution
gpt4all_cli-1.2.2-py3-none-any.whl (7.5 kB)
File details
Details for the file gpt4all_cli-1.2.2.tar.gz.
File metadata
- Download URL: gpt4all_cli-1.2.2.tar.gz
- Upload date:
- Size: 7.7 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.0.0 CPython/3.11.9
File hashes
Algorithm | Hash digest
---|---
SHA256 | 0fdb257633c5a54ef758a9f50c72840eda219db4987005de49e9dd83d3b02e77
MD5 | 0cdc16169997e0c41f0e666374786bf0
BLAKE2b-256 | a622a735785fb6ea6c962445c918b80769be80fc359aad5ced12dfa3c9d569e2
File details
Details for the file gpt4all_cli-1.2.2-py3-none-any.whl.
File metadata
- Download URL: gpt4all_cli-1.2.2-py3-none-any.whl
- Upload date:
- Size: 7.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.0.0 CPython/3.11.9
File hashes
Algorithm | Hash digest
---|---
SHA256 | b46069010830bbb92283c4e8f8e0e0cbfd4a013cc2ceb2e61ced1200f4502a9c
MD5 | d4b42fa8d4f5e172a8a231e140d205af
BLAKE2b-256 | 7c442e2598c4f4f8c114a7c72fd3016c69cc46825e5ebbb1734e649168ca91e8