A Python 2.7+ REPL for interacting with LLMs through an OpenAI Chat Completions-compatible API.
chatrepl
A minimal single-file coding agent and chat REPL for OpenAI-compatible Chat Completions APIs.
chatrepl.py is no longer just a plain chat interface. It now exposes a small tool-enabled agent loop with four built-in tools:
`read`, `write`, `edit`, `bash`
The model can call these tools automatically, receive their results, and continue for multiple steps until it finishes.
Features
- Minimal coding agent with automatic tool calling
- Four local tools: `read`, `write`, `edit`, `bash`
- Streaming assistant responses
- Interactive Python-powered REPL
- Non-interactive CLI mode for piped input or one-shot prompts
- Conversation save/load as JSON
- Markdown export
- Editable multiline input through your editor
- Optional discovery of `AGENTS.md` and `CLAUDE.md` from the working directory up to `/`
- Works with OpenAI-compatible `/chat/completions` endpoints
- Python 2.7+ and Python 3 compatible
Tool model
The agent is intentionally small and constrained.
read
Reads a text file with optional offset and limit arguments.
- Truncates output to 2000 lines or 50 KB
- Detects binary files and refuses to display them
- Resolves paths relative to the current tool working directory
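A minimal sketch of the kind of truncation `read` applies. The helper name and exact logic here are illustrative, not the script's actual code; only the 2000-line / 50 KB limits come from the description above:

```python
MAX_LINES = 2000       # cap output at 2000 lines...
MAX_BYTES = 50 * 1024  # ...or 50 KB, whichever is hit first

def read_file(path, offset=0, limit=None):
    """Hypothetical helper: read a text slice with offset/limit and size caps."""
    with open(path, "rb") as f:
        data = f.read(MAX_BYTES + 1)
    if b"\x00" in data:                 # crude binary-file detection
        return "[binary file, refusing to display]"
    text = data[:MAX_BYTES].decode("utf-8", errors="replace")
    lines = text.splitlines()[offset:]  # apply optional offset...
    if limit is not None:
        lines = lines[:limit]           # ...and optional limit
    return "\n".join(lines[:MAX_LINES])
```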
write
Writes full file contents.
- Creates parent directories automatically
- Rewrites the destination file completely
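Conceptually, `write` behaves like this hypothetical helper (a sketch, not the script's implementation):

```python
import os

def write_file(path, content):
    """Hypothetical helper: create parent dirs, then rewrite the file whole."""
    parent = os.path.dirname(os.path.abspath(path))
    if not os.path.isdir(parent):
        os.makedirs(parent)        # create parent directories automatically
    with open(path, "w") as f:
        f.write(content)           # replace the destination completely
```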
edit
Applies exact text replacements to an existing file.
Rules:
- each `oldText` must match exactly once
- edits must not overlap
- all edits are matched against the original file
Returns a unified diff after a successful edit.
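The rules above can be sketched as follows. This is an illustrative reimplementation, not the script's own code; it assumes edits are `(oldText, newText)` pairs and uses `difflib` to produce the unified diff:

```python
import difflib

def apply_edits(original, edits):
    """Hypothetical sketch: apply [(oldText, newText), ...] exact replacements."""
    spans = []
    for old, new in edits:
        # each oldText must match the ORIGINAL text exactly once
        if original.count(old) != 1:
            raise ValueError("oldText must match exactly once: %r" % old)
        start = original.index(old)
        spans.append((start, start + len(old), new))
    spans.sort()
    # edits must not overlap
    for (_, end1, _), (start2, _, _) in zip(spans, spans[1:]):
        if start2 < end1:
            raise ValueError("edits must not overlap")
    out, pos = [], 0
    for start, end, new in spans:
        out.append(original[pos:start])
        out.append(new)
        pos = end
    out.append(original[pos:])
    result = "".join(out)
    diff = "\n".join(difflib.unified_diff(
        original.splitlines(), result.splitlines(), lineterm=""))
    return result, diff
```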
bash
Runs a shell command in the current working directory.
- live output is streamed to the terminal
- output returned to the model is truncated to the last 2000 lines or 50 KB
- default timeout is 30 seconds
- long outputs are saved to a temporary log file
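A Python 3 sketch of the execution and truncation behavior described above (the helper name is hypothetical; the real script also streams output live and logs long output to a temp file, which this sketch omits):

```python
import subprocess

def run_bash(command, timeout=30):
    """Hypothetical helper: run a shell command, cap what the model sees."""
    # raises subprocess.TimeoutExpired if the 30-second default is exceeded
    proc = subprocess.run(command, shell=True, capture_output=True,
                          text=True, timeout=timeout)
    out = proc.stdout + proc.stderr
    lines = out.splitlines()[-2000:]    # keep only the last 2000 lines...
    text = "\n".join(lines)
    return text[-50 * 1024:], proc.returncode  # ...and at most 50 KB
```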
Installation
From source
git clone <repo-url>
cd chatrepl
pip install -r requirements.txt
Or run the script directly if its dependencies are already available.
Usage
Interactive REPL
python chatrepl.py \
--api-key "your-api-key" \
--base-url "https://api.openai.com/v1" \
--model "gpt-4o"
You enter a Python interactive console with helper functions preloaded.
Available commands:
| Function | Description |
|---|---|
| `send(text='')` | Send a message and let the agent complete tool calls |
| `append(text)` | Append a user message without sending |
| `multiline()` | Append multiline input from your editor |
| `txt(path)` | Append a UTF-8 text file as a user message |
| `load(path)` | Load a conversation from JSON |
| `save(path)` | Save the current conversation to JSON |
| `export(path)` | Export the conversation to Markdown |
| `correct()` | Edit the last assistant response |
| `show()` | Print all conversation messages |
| `reset()` | Reset to only the system prompt |
| `cwd(path=None)` | Show or change the tool working directory |
| `model(name=None)` | Show or change the model ID |
| `base_url(url=None)` | Show or change the API base URL |
| `stream(enabled=None)` | Show or change streaming mode |
| `context_files(enabled=None)` | Enable or disable AGENTS.md / CLAUDE.md discovery |
| `context_paths()` | Show discovered context file paths |
Exit with `exit()` or EOF.
One-shot prompt
python chatrepl.py \
--api-key "your-api-key" \
--base-url "https://api.openai.com/v1" \
--model "gpt-4o" \
"Inspect this repository and summarize the build system"
Piped input
cat prompt.txt | python chatrepl.py \
--api-key "your-api-key" \
--base-url "https://api.openai.com/v1" \
--model "gpt-4o"
CLI options
-k, --api-key API key for the OpenAI-compatible endpoint
-u, --base-url Base URL, e.g. http://localhost:11434/v1
-m, --model Model ID
-l, --load Load a conversation JSON file
--no-stream Disable streaming
--no-context-files Disable AGENTS.md and CLAUDE.md discovery
Agent behavior
The core agent loop is implemented in `AgentConversation.send()`.
For each turn it:
- sends the current messages and tool schemas to the model
- receives assistant content and optional tool calls
- executes tool calls locally
- appends tool results as `tool` messages
- asks the model again
- repeats until there are no more tool calls or `MAX_AGENT_STEPS` is reached
Current limit: `MAX_AGENT_STEPS = 32`
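The loop above can be sketched roughly like this. `chat_completion` and `execute_tool` are hypothetical stand-ins for the script's internals; the message shapes follow the OpenAI tool-calling convention:

```python
MAX_AGENT_STEPS = 32

def agent_send(messages, tools, chat_completion, execute_tool):
    """Hypothetical sketch of the loop in AgentConversation.send()."""
    for _ in range(MAX_AGENT_STEPS):
        reply = chat_completion(messages, tools)   # ask the model
        messages.append(reply)
        tool_calls = reply.get("tool_calls") or []
        if not tool_calls:
            return messages                        # no tool calls: done
        for call in tool_calls:                    # run each tool locally
            result = execute_tool(call)
            messages.append({                      # feed result back as a
                "role": "tool",                    # `tool` message
                "tool_call_id": call["id"],
                "content": result,
            })
    return messages                                # step limit reached
```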
Context files
When enabled, the script searches from the current working directory up to the filesystem root for:
`AGENTS.md` and `CLAUDE.md`
Their contents are appended to the base system prompt.
This lets you keep project-local instructions outside the script itself.
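The upward search can be sketched as follows (an illustrative helper, not the script's own function):

```python
import os

def find_context_files(start_dir):
    """Hypothetical sketch: collect AGENTS.md / CLAUDE.md from start_dir
    up to the filesystem root."""
    found, d = [], os.path.abspath(start_dir)
    while True:
        for name in ("AGENTS.md", "CLAUDE.md"):
            path = os.path.join(d, name)
            if os.path.isfile(path):
                found.append(path)
        parent = os.path.dirname(d)
        if parent == d:        # reached the root: stop
            return found
        d = parent
```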
Notes and limitations
- This is a minimal agent, not a full task planner framework.
- Tool execution is local and unsandboxed. `bash` can modify the current machine and repository.
- Tool support depends on your model and server correctly implementing OpenAI-compatible tool calling.
- `write` overwrites full files; use `edit` for precise changes.
- Output truncation is intentional to keep context bounded.
Example session
$ python chatrepl.py --api-key ... --base-url http://localhost:11434/v1 --model your-model
Welcome to pi-single-chatrepl. Use one of the following commands to interact with your-model:
>>> send('Inspect chatrepl.py and tell me whether it is agentic now.')
The script now has agentic behavior.
[tool read]
#!/usr/bin/env python
...
It defines tools, sends them to the model, executes tool calls, and loops until completion.
[tokens in=... out=... total_in=... total_out=...]
License
This project is licensed under the MIT License.