A Python 2.7+ REPL for interacting with LLMs through an OpenAI Chat Completions-compatible API.
Features
- Minimal Dependencies: Works with stock Python 2.7+ and 3.x
- Python-Powered REPL: Native interactive shell with special chat commands as Python functions
- Streaming Responses: See model replies as they're generated
- Multiline Editing: Use your default editor for long/structured prompts
- File & Image Support: Attach text files or images to your conversation
- Conversation Persistence: Save/load full conversations as JSON
- Markdown Export: Archive conversations in Markdown for easy reference
- Enhanced Input: Optional readline support for history and line editing
- Dual Modes: Both interactive REPL and pipe-friendly CLI
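Streaming responses rely on the Chat Completions streaming format, in which the server emits incremental `data:` lines terminated by a `[DONE]` sentinel. A minimal sketch of consuming such a stream, using only the standard library (the helper name `iter_stream_content` and the sample lines are illustrative, not part of chatrepl):

```python
import json

def iter_stream_content(lines):
    """Yield text fragments from Chat Completions-style SSE lines.

    Each streamed line looks like 'data: {...}'; the stream ends
    with the sentinel 'data: [DONE]'.
    """
    for line in lines:
        line = line.strip()
        if not line.startswith("data:"):
            continue  # skip keep-alives and blank lines
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":
            break
        chunk = json.loads(payload)
        delta = chunk["choices"][0]["delta"]
        if "content" in delta:
            yield delta["content"]

# Example with a fake stream:
fake = [
    'data: {"choices": [{"delta": {"content": "Hel"}}]}',
    'data: {"choices": [{"delta": {"content": "lo"}}]}',
    'data: [DONE]',
]
print("".join(iter_stream_content(fake)))  # Hello
```

Printing each fragment as it arrives, rather than joining them, is what produces the "see replies as they're generated" effect.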
Installation
pip install chatrepl
After installation, run it as chatrepl.
Interactive Mode (CLI)
chatrepl \
--api-key "your-api-key" \
--base-url "https://api.openai.com/v1" \
--model "gpt-4o"
You'll get a Python shell preloaded with chat helper functions:
| Function | Description |
|---|---|
| send(text) | Send a message and stream the response |
| append(text) | Append text to the conversation (don't send) |
| multiline() | Append multiline input via your editor |
| img(img_file_path) | Append an image file |
| txt(txt_file_path) | Append a text file |
| load(json_file_path) | Load a conversation from JSON |
| save(json_file_path) | Save the conversation to JSON |
| export(md_file_path) | Export the conversation to Markdown |
| correct() | Correct (edit) the last model response |
exit() or EOF (Ctrl-D on Unix) leaves the REPL.
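Under the hood, helpers like these typically keep a running list of Chat Completions messages and serialize it into each request body. A sketch of that pattern, under the assumption that chatrepl follows the standard messages schema (the function names here are hypothetical, not chatrepl's internals):

```python
import json

# A conversation is a list of Chat Completions messages.
conversation = []

def append_user_message(messages, text):
    """What append(text) conceptually does: add a user turn without sending."""
    messages.append({"role": "user", "content": text})

def build_request(model, messages):
    """Build the JSON body for a POST to {base_url}/chat/completions."""
    return json.dumps({
        "model": model,
        "messages": messages,
        "stream": True,  # request incremental delta chunks
    })

append_user_message(conversation, "Explain recursion.")
body = build_request("gpt-4o", conversation)
print(body)
```

send(text) would then POST this body with the `Authorization: Bearer <api-key>` header and append the assistant's reply to the same list, so every later request carries the full history.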
Basic Example
Welcome to ChatREPL! Use the following commands to interact with gpt-4o:
>>> send('Explain recursion.')
Assistant: Sure! Recursion is...
>>> multiline()
(opens your editor for multiline input)
>>> img('diagram.png')
(appends the given image to the conversation)
>>> send('Describe this image.')
Assistant: ...
>>> save('chat.json')
(saves the entire chat so far)
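Because a conversation is just a list of role/content dicts, persistence reduces to a JSON round trip. A minimal sketch of what save/load could look like (the on-disk schema shown is an assumption; chatrepl's actual format may differ):

```python
import json
import os
import tempfile

def save(messages, path):
    """Write the conversation (a list of role/content dicts) as JSON."""
    with open(path, "w") as f:
        json.dump(messages, f, indent=2)

def load(path):
    """Read a previously saved conversation back into a list."""
    with open(path) as f:
        return json.load(f)

chat = [
    {"role": "user", "content": "Explain recursion."},
    {"role": "assistant", "content": "Sure! Recursion is..."},
]
path = os.path.join(tempfile.mkdtemp(), "chat.json")
save(chat, path)
assert load(path) == chat  # lossless round trip
```

A plain-JSON format like this is also what makes the Markdown export straightforward: each message maps to a heading plus its content.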
Non-interactive Mode (Piped Input)
$ uname -a | chatrepl --api-key <your_api_key> --base-url <your_base_url> --model <model_name>
The output you've provided appears to be system information from ... [output streamed to STDOUT]
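Dual-mode tools commonly decide between REPL and pipe behavior by checking whether stdin is a terminal. A sketch of that dispatch (hypothetical helper names; not chatrepl's actual code):

```python
import sys

def input_mode():
    """Pick interactive REPL vs pipe mode based on whether stdin is a TTY."""
    return "repl" if sys.stdin.isatty() else "pipe"

def read_piped_prompt():
    """In pipe mode, the whole of stdin becomes a single user message."""
    return sys.stdin.read()
```

In pipe mode the reply is streamed straight to STDOUT, which keeps the command composable with further shell pipelines.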
Programmatic Usage (API)
See chat-completions-conversation.
Contributing
Contributions are welcome! Please submit pull requests or open issues on the GitHub repository.
License
This project is licensed under the MIT License.
Project details
Download files
Download the file for your platform.
Source Distribution
Built Distribution
File details
Details for the file chatrepl-0.2.0a2.tar.gz.
File metadata
- Download URL: chatrepl-0.2.0a2.tar.gz
- Upload date:
- Size: 5.3 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.13.12
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 195622cdd94e7930773d5f0b8cc5a4852aae3396d565bed1a1550adeb06ad363 |
| MD5 | fa3bf2f2a17475716b7b44c7f3ea9ef1 |
| BLAKE2b-256 | 521814f283de4e5855b398be4bc8d14c7b23c82daab3f1cde58a0c81fb27f064 |
File details
Details for the file chatrepl-0.2.0a2-py2.py3-none-any.whl.
File metadata
- Download URL: chatrepl-0.2.0a2-py2.py3-none-any.whl
- Upload date:
- Size: 5.8 kB
- Tags: Python 2, Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.13.12
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | dffaf666cdd6385faf0e497e4785694afbd59bdbbbee0a8607a1a8d9117dc09e |
| MD5 | c92c56ffa8b3a324e74356be5c39fef6 |
| BLAKE2b-256 | e300fe75f35582eca22d727b43595e6fd5ea0f648a8c96817b5503c6759e9e17 |