Interact with the OpenAI ChatGPT API (and other text generators)
Project description
chap - A Python interface to chatgpt, including a terminal user interface (tui)
System requirements
Chap is developed on Linux with Python 3.11. Due to its use of the `list[int]` style of type hints, it is known not to work on Python 3.8 and older; the target minimum Python version is 3.9 (Debian oldstable).
installation
Install with, e.g., `pipx install chap`.
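For example, using pipx (any tool that installs packages from PyPI should work equally well):

```
# pipx keeps chap and its dependencies in an isolated environment
pipx install chap
# later, to pick up a newer release:
pipx upgrade chap
```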
configuration
Put your OpenAI API key in the platform configuration directory for chap, e.g. on Linux/Unix systems at `~/.config/chap/openai_api_key`.
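A minimal sketch of creating that file on a Linux/Unix system (the path is the one given above; the key value is a placeholder you must replace):

```
mkdir -p ~/.config/chap
# paste your real key in place of the placeholder, and keep the file private
printf '%s\n' 'sk-your-api-key-here' > ~/.config/chap/openai_api_key
chmod 600 ~/.config/chap/openai_api_key
```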
commandline usage
- `chap ask "What advice would you give a 20th century human visiting the 21st century for the first time?"`
- `chap render --last`
- `chap import chatgpt-style-chatlog.json` (for files from pionxzh/chatgpt-exporter)
interactive terminal usage
- `chap tui`
Sessions & Commandline Parameters
Details of session handling & commandline arguments are in flux.
By default, a new session is created. It is saved to the user's state directory (e.g., `~/.local/state/chap` on Linux/Unix systems).
You can specify the session filename for a new session with `-n`, or re-open an existing session with `-s`. Or, you can continue the last session with `--last`.
You can set the "system message" with the `-S` flag.
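For illustration, a few invocations of `chap ask` combining these flags (flag placement is assumed to follow the `chap render --last` example above, and the session name and prompts are made up):

```
# start a new session under a chosen filename
chap ask -n travel-notes "What should I pack for a week in Reykjavik?"
# continue the most recent session with a follow-up question
chap ask --last "And what if I go in winter?"
# ask a one-off question with a custom system message
chap ask -S "You are a terse travel agent." "Suggest three hotels in Reykjavik."
```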
You can select the text-generating backend with the `-b` flag:
- `openai_chatgpt`: the default; uses the paid API and gives the best quality results
- `llama_cpp`: works with [llama.cpp's http server](https://github.com/ggerganov/llama.cpp/blob/master/examples/server/README.md) and can run locally with various models. Set the server URL with `-B url:...` (see the sketch after this list).
- `textgen`: works with https://github.com/oobabooga/text-generation-webui and can run locally with various models. Needs the server URL in `$configuration_directory/textgen_url`.
- `lorem`: local non-AI lorem generator for testing
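As a sketch, selecting the llama.cpp backend and pointing it at a locally running server (the URL is an example value, and the flag placement is an assumption; check `chap --help` for the exact syntax):

```
chap ask -b llama_cpp -B url:http://localhost:8080/completion "Hello, who are you?"
```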
Environment variables
The backend can be set with the environment variable `CHAP_BACKEND`.
Backend settings can be set with `CHAP_<backend_name>_<parameter_name>`, with `backend_name` and `parameter_name` in all caps.
For instance, `CHAP_LLAMA_CPP_URL=http://server.local:8080/completion` changes the default server URL for the llama_cpp backend.
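The same llama_cpp setup can be expressed entirely through the environment, which is convenient in scripts; for example:

```
# one-off invocation; the server address is an example value
CHAP_BACKEND=llama_cpp \
CHAP_LLAMA_CPP_URL=http://server.local:8080/completion \
chap ask "Summarize the plot of Hamlet in two sentences."
```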
Importing from ChatGPT
The userscript https://github.com/pionxzh/chatgpt-exporter can export chat logs from chat.openai.com in a JSON format.
This format is different from chap's, especially since chap currently only represents a single branch of conversation in one log.
You can use the `chap import` command to import all the branches of a ChatGPT-style chat log in JSON format into a series of chap-style chat logs.
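For example, assuming the import writes fresh session files into the state directory so that `--last` picks up the newest one:

```
chap import chatgpt-style-chatlog.json
# review the most recently written session
chap render --last
```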
Project details
Download files
- Source Distribution: chap-0.5.0.tar.gz
- Built Distribution: chap-0.5.0-py3-none-any.whl
File details
Details for the file `chap-0.5.0.tar.gz`.
File metadata
- Download URL: chap-0.5.0.tar.gz
- Upload date:
- Size: 347.5 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.9.18
File hashes
Algorithm | Hash digest
---|---
SHA256 | 9e6b91ebc46e23c137b1f11578231e5ed05d96626bca3029f730f9bdd1449163
MD5 | b30f19216ba250169caf00463b75992e
BLAKE2b-256 | a444f09fbabae4633e2c6afed21b25035aa394284a7be5ebf07d79596f3ed8aa
File details
Details for the file `chap-0.5.0-py3-none-any.whl`.
File metadata
- Download URL: chap-0.5.0-py3-none-any.whl
- Upload date:
- Size: 20.6 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.9.18
File hashes
Algorithm | Hash digest
---|---
SHA256 | 2c728e4ec05712a89f3e3d15f7ff2402f31ebbdae4c707830c9076519650afd5
MD5 | ec4dfd5871a520021ca7083322f8ad85
BLAKE2b-256 | 917681e9f6ec0ea2e0f67f521584f74ccf2cd452aa5dcf02bd8bfdc36df69134