
Interact with the OpenAI ChatGPT API (and other text generators)

Project description


chap - A Python interface to ChatGPT and other LLMs, including a terminal user interface (TUI)

Chap screencast

System requirements

Chap is primarily developed on Linux with Python 3.11. Moderate effort will be made to support versions back to Python 3.9 (Debian oldstable).

Installation

If you want chap available as a command, install it with pipx install chap or pip install chap.

Use a virtual environment unless you want it installed globally.
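For example, a minimal install into a fresh virtual environment might look like this (the venv path is just an illustration):

python -m venv ~/.venvs/chap
~/.venvs/chap/bin/pip install chap
~/.venvs/chap/bin/chap --help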

Installation for development

Use one of the following two methods to run chap as a command, with the ability to edit the source files. You are welcome to submit valuable changes as a pull request.

Via pip install --editable .

This is an "editable install", as recommended by the Python Packaging Authority.

Change directory to the root of the chap project.

Activate your virtual environment, then install chap in development mode:

pip install --editable .

In this mode, you get the chap command-line program installed, but you are able to edit the source files in the src directory in place.
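Putting those steps together, one possible sequence (the checkout path and virtual-environment name are placeholders):

cd path/to/chap           # root of your chap checkout
python -m venv .venv
. .venv/bin/activate
pip install --editable .
chap --help               # the installed command now runs your working copy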

Via chap-dev.py

A simple shim script called chap-dev.py is included to demonstrate how to load and run the chap library without installing chap in development mode. This method may be more familiar to some developers.

Change directory to the root of the chap project.

Activate your virtual environment, then install requirements:

pip install -r requirements.txt

Run the shim script (with optional command flags as appropriate):

./chap-dev.py

In this mode, you can edit the source files in the src directory in place, and the shim script will pick up the changes via the import directive.
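For example (the prompt is illustrative; the shim should accept the same subcommands and flags as the installed chap command):

pip install -r requirements.txt
./chap-dev.py ask "What is a terminal user interface?"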

Contributing

See CONTRIBUTING.md.

Code of Conduct

See CODE_OF_CONDUCT.md.

Configuration

Put your OpenAI API key in the platform configuration directory for chap, e.g., on Linux/Unix systems at ~/.config/chap/openai_api_key.
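For example, on a Linux/Unix system you could create the key file like this (replace the placeholder with your real key):

mkdir -p ~/.config/chap
echo "sk-your-key-here" > ~/.config/chap/openai_api_key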

Command-line usage

  • chap ask "What advice would you give a 20th century human visiting the 21st century for the first time?"

  • chap render --last / chap cat --last

  • chap import chatgpt-style-chatlog.json (for files from pionxzh/chatgpt-exporter)

  • chap grep needle

@FILE arguments

It's useful to set a bunch of related arguments together, for instance to fully configure a back-end. This functionality is implemented via @FILE arguments.

Before any other command-line argument parsing is performed, @FILE arguments are expanded:

  • An @FILE argument is searched relative to the current directory
  • An @:FILE argument is searched relative to the configuration directory (e.g., $HOME/.config/chap)
  • If an argument starts with a literal @, double it: @@
  • @. stops processing any further @FILE arguments and leaves them unchanged

The contents of an @FILE are parsed with shlex.split(comments=True), so comments are supported. A typical file might look like this:
# gpt-3.5.txt: Use cheaper gpt 3.5 and custom prompt
--backend openai-chatgpt
-B model:gpt-3.5-turbo
-s my-custom-system-message.txt

and you might use it with

chap @:gpt-3.5.txt ask what version of gpt is this
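For the @: form to find this file, it needs to live in the configuration directory (the Linux/Unix path below is the one mentioned above):

mkdir -p ~/.config/chap
cp gpt-3.5.txt ~/.config/chap/gpt-3.5.txt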

Interactive terminal usage

The interactive terminal mode is accessed via chap tui.

There are a variety of keyboard shortcuts to be aware of:

  • tab/shift-tab to move between the entry field and the conversation, or between conversation items
  • while in the text box, F9 or (if supported by your terminal) alt+enter to submit multiline text
  • while on a conversation item:
    • ctrl+x to re-draft the message. This
      • saves a copy of the session in an auto-named file in the conversations folder
      • removes the conversation from this message to the end
      • puts the user's message in the text box to edit
    • ctrl+r to re-submit the message. This
      • saves a copy of the session in an auto-named file in the conversations folder
      • removes the conversation from this message to the end
      • puts the user's message in the text box
      • and submits it immediately
    • ctrl+y to yank the message. This places the response part of the current interaction in the operating system clipboard, ready to be pasted (e.g., with ctrl+v or command+v) into other software
    • ctrl+q to toggle whether this message may be included in the contextual history for a future query. The exact way history is submitted is determined by the back-end, often by counting messages or tokens, but the ctrl+q toggle ensures this message (both the user and assistant parts) is not considered.

Sessions & Command-line Parameters

Details of session handling & command-line arguments are in flux.

By default, a new session is created. It is saved to the user's state directory (e.g., ~/.local/state/chap on Linux/Unix systems).

You can specify the session filename for a new session with -n, or re-open an existing session with -s. Or you can continue the last session with --last.

You can set the "system message" with the -S flag.

You can select the text-generating backend with the -b flag.
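A hypothetical walk-through of these flags (the session name is made up, and the exact placement of options may differ slightly in your version of chap):

chap -n my-session ask "Plan a week of workouts"      # start a new, named session
chap -s my-session ask "Make Wednesday a rest day"    # re-open that session later
chap render --last                                    # show the most recent session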

Environment variables

The backend can be set with the CHAP_BACKEND environment variable.

Backend settings can be set with CHAP_<backend_name>_<parameter_name>, with backend_name and parameter_name in all caps and any dashes replaced by underscores.

For instance, CHAP_LLAMA_CPP_URL=http://server.local:8080/completion changes the default server URL for the llama-cpp back-end.
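For example, to run a single command against that server (assuming the backend is selected by the same llama-cpp name, and the host is a placeholder):

CHAP_BACKEND=llama-cpp \
CHAP_LLAMA_CPP_URL=http://server.local:8080/completion \
  chap ask "Summarize the plot of Hamlet"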

Importing from ChatGPT

The userscript https://github.com/pionxzh/chatgpt-exporter can export chat logs from chat.openai.com in a JSON format.

This format is different from chap's, notably because chap currently represents only a single branch of conversation in each log.

You can use the chap import command to import all the branches of a chatgpt-style chatlog in JSON format into a series of chap-style chat logs.
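For example (the export filename is whatever chatgpt-exporter saved for you):

chap import my-chatgpt-export.json
chap grep "that prompt I remember"    # then search your saved chat logs for a phrase you remember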

Plug-ins

Chap supports back-end and command plug-ins.

"Back-ends" add additional text generators.

"Commands" add new ways to interact with text generators, session data, and so forth.

Install a plug-in with pip install or pipx inject (depending on how you installed chap) and then use it as normal.

chap-backend-replay is an example back-end plug-in. It replays answers from a previous session.

chap-command-explain is an example command plug-in. It is similar to chap ask.
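For example, assuming the replay back-end above is published under that name:

pipx inject chap chap-backend-replay    # if chap was installed with pipx
pip install chap-backend-replay         # if chap was installed with pip into a virtual environment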

At this time, there is no stability guarantee for the API of commands or backends.

