Interact with the OpenAI ChatGPT API (and other text generators)
Project description
chap - a Python interface to ChatGPT and other LLMs, including a terminal user interface (TUI)
System requirements
Chap is primarily developed on Linux with Python 3.11. Moderate effort will be made to support versions back to Python 3.9 (Debian oldstable).
Installation
If you want chap available as a command, just install it with `pipx install chap` or `pip install chap`.
Use a virtual environment unless you want it installed globally.
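For example, either of the following installs works; the virtual-environment path shown is only illustrative:

```sh
# Option 1: isolated install as a command
pipx install chap

# Option 2: install into a virtual environment instead of globally
python -m venv .venv        # any venv location works; .venv is only an example
. .venv/bin/activate
pip install chap
```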
Installation for development
Use one of the following two methods to run `chap` as a command, with the ability to edit the source files. You are welcome to submit valuable changes as a pull request.
Via `pip install --editable .`
This is an "editable install", as recommended by the Python Packaging Authority.
Change directory to the root of the `chap` project.
Activate your virtual environment, then install `chap` in development mode:
pip install --editable .
In this mode, you get the `chap` command-line program installed, but you are able to edit the source files in the `src` directory in place.
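Put together, the editable install might look like this; the clone location and virtual-environment name are illustrative:

```sh
cd ~/src/chap               # root of your clone of the chap project (path is an example)
python -m venv .venv
. .venv/bin/activate
pip install --editable .
# changes made under src/ take effect the next time you run chap
```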
Via `chap-dev.py`
A simple shim script called `chap-dev.py` is included to demonstrate how to load and run the `chap` library without installing `chap` in development mode. This method may be more familiar to some developers.
Change directory to the root of the `chap` project.
Activate your virtual environment, then install requirements:
pip install -r requirements.txt
Run the shim script (with optional command flags as appropriate):
./chap-dev.py
In this mode, you can edit the source files in the `src` directory in place, and the shim script will pick up the changes via the `import` directive.
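For example, the shim can be exercised like this; the `ask` invocation is shown only as an illustration and assumes the shim accepts the same subcommands as the installed `chap` command:

```sh
pip install -r requirements.txt
./chap-dev.py ask "Hello from the development shim"
```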
Contributing
See CONTRIBUTING.md.
Code of Conduct
See CODE_OF_CONDUCT.md.
Configuration
Put your OpenAI API key in the platform configuration directory for chap, e.g., on Linux/Unix systems at `~/.config/chap/openai_api_key`.
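A minimal setup on a Linux/Unix system might look like the following; the key value is a placeholder:

```sh
mkdir -p ~/.config/chap
printf '%s\n' 'sk-your-api-key-here' > ~/.config/chap/openai_api_key
chmod 600 ~/.config/chap/openai_api_key   # keep the key private
```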
Command-line usage
- `chap ask "What advice would you give a 20th century human visiting the 21st century for the first time?"`
- `chap render --last` / `chap cat --last`
- `chap import chatgpt-style-chatlog.json` (for files from pionxzh/chatgpt-exporter)
- `chap grep needle`
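A short session combining these commands might look like this; the question and search term are illustrative, and output is elided:

```sh
chap ask "What advice would you give a 20th century human visiting the 21st century?"
chap render --last        # pretty-print the conversation that was just saved
chap grep century         # search saved sessions for a keyword
```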
Interactive terminal usage
The interactive terminal mode is accessed via `chap tui`.
There are a variety of keyboard shortcuts to be aware of:
- tab/shift-tab to move between the entry field and the conversation, or between conversation items
- while in the text box, F9 or (if supported by your terminal) alt+enter to submit multiline text
- while on a conversation item:
  - ctrl+x to re-draft the message. This
    - saves a copy of the session in an auto-named file in the conversations folder
    - removes the conversation from this message to the end
    - puts the user's message in the text box to edit
  - ctrl+r to re-submit the message. This
    - saves a copy of the session in an auto-named file in the conversations folder
    - removes the conversation from this message to the end
    - puts the user's message in the text box
    - and submits it immediately
  - ctrl+y to yank the message. This places the response part of the current interaction in the operating system clipboard to be pasted (e.g., with ctrl+v or command+v in other software)
  - ctrl+q to toggle whether this message may be included in the contextual history for a future query. The exact way history is submitted is determined by the back-end, often by counting messages or tokens, but the ctrl+q toggle ensures this message (both the user and assistant parts) is not considered
Sessions & Command-line Parameters
Details of session handling & command-line arguments are in flux.
By default, a new session is created. It is saved to the user's state directory (e.g., `~/.local/state/chap` on Linux/Unix systems).
You can specify the session filename for a new session with `-n`, or re-open an existing session with `-s`. Or, you can continue the last session with `--last`.
You can set the "system message" with the `-S` flag.
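For example, a named session might be started, re-opened, and reviewed like this; the session name and system message are illustrative, and placing the flags after the subcommand follows the `chap render --last` pattern above rather than documented behavior:

```sh
chap ask -n kitchen-remodel -S "You are a terse home-renovation expert." "Where do I start?"
chap ask -s kitchen-remodel "What about permits?"   # re-open the same session
chap render --last                                  # show the most recent session
```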
You can select the text-generating backend with the `-b` flag:
- openai-chatgpt: the default, paid API, best quality results
- llama-cpp: Works with llama.cpp's HTTP server and can run locally with various models, though it is optimized for models that use llama2-style prompting. Set the server URL with `-B url:...` (see the sketch after this list).
- textgen: Works with https://github.com/oobabooga/text-generation-webui and can run locally with various models. Needs the server URL in `$configuration_directory/textgen_url`.
- lorem: local non-AI lorem generator for testing
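For instance, a local llama.cpp server might be used like this; the host, port, and flag placement are assumptions, with the `-B url:...` form taken from the list above:

```sh
# start llama.cpp's HTTP server separately, then point chap at it
chap ask -b llama-cpp -B url:http://localhost:8080/completion "Hello from a local model"
```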
Environment variables
The backend can be set with the `CHAP_BACKEND` environment variable.
Backend settings can be set with `CHAP_<backend_name>_<parameter_name>`, with `backend_name` and `parameter_name` all in caps.
For instance, `CHAP_LLAMA_CPP_URL=http://server.local:8080/completion` changes the default server URL for the llama-cpp back-end.
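For example, the same choice can be made for a whole shell session; the server address is a placeholder:

```sh
export CHAP_BACKEND=llama-cpp
export CHAP_LLAMA_CPP_URL=http://server.local:8080/completion
chap ask "Which back-end am I talking to now?"
```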
Importing from ChatGPT
The userscript https://github.com/pionxzh/chatgpt-exporter can export chat logs from chat.openai.com in a JSON format.
This format is different from chap's, especially since `chap` currently only represents a single branch of conversation in one log.
You can use the `chap import` command to import all the branches of a chatgpt-style chatlog in JSON format into a series of `chap`-style chat logs.
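A typical import, followed by searching the resulting logs, might look like this; the file name matches the example above and the search term is illustrative:

```sh
chap import chatgpt-style-chatlog.json   # writes one chap-style log per conversation branch
chap grep needle                         # then search across the saved logs
```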
Plug-ins
Chap supports back-end and command plug-ins.
"Back-ends" add additional text generators.
"Commands" add new ways to interact with text generators, session data, and so forth.
Install a plug-in with `pip install` or `pipx inject` (depending on how you installed chap) and then use it as normal.
chap-backend-replay is an example back-end plug-in. It replays answers from a previous session.
chap-command-explain is an example command plug-in. It is similar to `chap ask`.
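For example, if chap was installed with pipx, a plug-in is injected into chap's environment; with a pip/virtualenv install it goes into the same environment as chap. The plug-in is assumed to be published under the name shown:

```sh
# chap installed via pipx
pipx inject chap chap-backend-replay

# chap installed via pip inside a virtual environment
pip install chap-backend-replay
```

Once installed, a back-end plug-in should then be selectable with the `-b` flag like the built-in back-ends.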
At this time, there is no stability guarantee for the API of commands or backends.
File details
Details for the file `chap-0.8.1.tar.gz`.
File metadata
- Download URL: chap-0.8.1.tar.gz
- Upload date:
- Size: 363.7 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.0.0 CPython/3.11.8
File hashes
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 73601e166e44d252b44bb7a31ab273ba655484364ffe157097a4ebba1e1edc98 |
| MD5 | b68f2f4681c64496d5f215241c39c343 |
| BLAKE2b-256 | cdcf69d36857ef53ff601887c432865787d07621f35c813506b24a5d857f70d6 |
File details
Details for the file `chap-0.8.1-py3-none-any.whl`.
File metadata
- Download URL: chap-0.8.1-py3-none-any.whl
- Upload date:
- Size: 28.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.0.0 CPython/3.11.8
File hashes
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 6050790cf71628448fffc545a2dd5040a73f9f714601e4ad47bfe69f8bcf9133 |
| MD5 | 81320bebb97e775645012bc4e5b8e8e1 |
| BLAKE2b-256 | 4d488525df481507b9c5939976710b5ffe76d38f3d28676989312f3cae3eff68 |