A Python 2.7+ chat REPL for interacting with LLMs through an OpenAI Chat Completions-compatible API.

chatrepl

A minimal single-file coding agent and chat REPL for OpenAI-compatible Chat Completions APIs.

chatrepl.py is no longer just a plain chat interface. It now exposes a small tool-enabled agent loop with four built-in tools:

  • read
  • write
  • edit
  • bash

The model can call these tools automatically, receive their results, and continue for multiple steps until it finishes.
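Tools reach the model as standard OpenAI function-calling schemas. The exact schema in chatrepl.py may differ; as a hypothetical sketch, the read tool's declaration could look like this:

```python
# Hypothetical schema for the "read" tool in OpenAI function-calling format.
# Field names below follow the Chat Completions tools spec; the descriptions
# and parameter names are illustrative, not copied from chatrepl.py.
READ_TOOL = {
    "type": "function",
    "function": {
        "name": "read",
        "description": "Read a text file, optionally from an offset, up to a line limit.",
        "parameters": {
            "type": "object",
            "properties": {
                "path": {"type": "string", "description": "Path relative to the tool working directory"},
                "offset": {"type": "integer", "description": "1-based line to start from"},
                "limit": {"type": "integer", "description": "Maximum number of lines to return"},
            },
            "required": ["path"],
        },
    },
}
```

A list of such objects is passed as the `tools` field of each Chat Completions request.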

Features

  • Minimal coding agent with automatic tool calling
  • Four local tools: read, write, edit, bash
  • Streaming assistant responses
  • Interactive Python-powered REPL
  • Non-interactive CLI mode for piped input or one-shot prompts
  • Conversation save/load as JSON
  • Markdown export
  • Editable multiline input through your editor
  • Optional discovery of AGENTS.md and CLAUDE.md from the working directory up to /
  • Works with OpenAI-compatible /chat/completions endpoints
  • Python 2.7+ and Python 3 compatible
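"OpenAI-compatible" here means a plain HTTP POST to `/chat/completions`. A minimal sketch of building such a request with the standard library (Python 3 shown for brevity; the helper name and arguments are assumptions, not chatrepl.py's API):

```python
import json
import urllib.request

def chat_request(base_url, api_key, model, messages, tools=None):
    """Build (but do not send) a POST request for an OpenAI-compatible
    /chat/completions endpoint."""
    payload = {"model": model, "messages": messages}
    if tools:
        payload["tools"] = tools
    return urllib.request.Request(
        base_url.rstrip("/") + "/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer " + api_key,
        },
    )

req = chat_request("http://localhost:11434/v1", "sk-test", "gpt-4o",
                   [{"role": "user", "content": "hello"}])
# urllib.request.urlopen(req) would send it and return the JSON response.
```

Any server that accepts this request shape (OpenAI, Ollama, vLLM, and others) should work.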

Tool model

The agent is intentionally small and constrained.

read

Reads a text file with optional offset and limit arguments.

  • Truncates output to 2000 lines or 50 KB
  • Detects binary files and refuses to display them
  • Resolves paths relative to the current tool working directory
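The truncation and binary-detection behavior might be implemented roughly like this sketch (the NUL-byte heuristic and function name are assumptions; chatrepl.py's actual logic may differ):

```python
MAX_LINES = 2000
MAX_BYTES = 50 * 1024

def read_file(path, offset=1, limit=MAX_LINES):
    """Sketch of a bounded read: cap output at `limit` lines or MAX_BYTES,
    and refuse to display binary files."""
    with open(path, "rb") as f:
        data = f.read(MAX_BYTES + 1)
    if b"\x00" in data:  # crude binary detection: NUL byte present
        return "[binary file, not displayed]"
    text = data[:MAX_BYTES].decode("utf-8", "replace")
    lines = text.splitlines()[offset - 1 : offset - 1 + limit]
    return "\n".join(lines)
```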

write

Writes full file contents.

  • Creates parent directories automatically
  • Rewrites the destination file completely
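In sketch form, assuming the behavior listed above (the function name is illustrative):

```python
import os

def write_file(path, content):
    """Sketch: create missing parent directories, then replace the file wholesale."""
    parent = os.path.dirname(os.path.abspath(path))
    if not os.path.isdir(parent):
        os.makedirs(parent)
    with open(path, "w") as f:
        f.write(content)
```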

edit

Applies exact text replacements to an existing file.

Rules:

  • each oldText must match exactly once
  • edits must not overlap
  • all edits are matched against the original file

Returns a unified diff after a successful edit.
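The exactly-once matching rule and the diff output can be sketched for a single replacement using the standard library's difflib (names and error wording are assumptions, not chatrepl.py's):

```python
import difflib

def edit_file_text(original, old_text, new_text):
    """Sketch of one exact replacement: old_text must occur exactly once
    in the original, and a unified diff of the change is returned."""
    count = original.count(old_text)
    if count != 1:
        raise ValueError("oldText matched %d times, expected exactly 1" % count)
    updated = original.replace(old_text, new_text)
    diff = difflib.unified_diff(
        original.splitlines(True), updated.splitlines(True),
        fromfile="before", tofile="after",
    )
    return updated, "".join(diff)
```

Requiring a unique match is what makes the edit unambiguous: the model never has to guess which of several identical snippets it changed.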

bash

Runs a shell command in the current working directory.

  • live output is streamed to the terminal
  • output returned to the model is truncated to the last 2000 lines or 50 KB
  • default timeout is 30 seconds
  • long outputs are saved to a temporary log file
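A simplified sketch of the command runner, assuming the limits above (this version only captures output after the fact, whereas the real tool also streams it live and logs long output to a temp file):

```python
import subprocess

TAIL_LINES = 2000

def run_bash(command, timeout=30):
    """Sketch: run a shell command, return the last TAIL_LINES lines of
    combined output, or a timeout notice."""
    try:
        proc = subprocess.run(
            command, shell=True, capture_output=True, text=True, timeout=timeout
        )
        output = proc.stdout + proc.stderr
    except subprocess.TimeoutExpired:
        return "[command timed out after %ds]" % timeout
    lines = output.splitlines()
    if len(lines) > TAIL_LINES:
        lines = lines[-TAIL_LINES:]
    return "\n".join(lines)
```

Keeping only the tail is the pragmatic choice for a build or test command, where the failure summary usually appears at the end.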

Installation

From source

git clone <repo-url>
cd chatrepl
pip install -r requirements.txt

Or run the script directly if its dependencies are already available.

Usage

Interactive REPL

python chatrepl.py \
  --api-key "your-api-key" \
  --base-url "https://api.openai.com/v1" \
  --model "gpt-4o"

You enter a Python interactive console with helper functions preloaded.

Available commands:

Function                      Description
send(text='')                 Send a message and let the agent complete tool calls
append(text)                  Append a user message without sending
multiline()                   Append multiline input from your editor
txt(path)                     Append a UTF-8 text file as a user message
load(path)                    Load a conversation from JSON
save(path)                    Save the current conversation to JSON
export(path)                  Export the conversation to Markdown
correct()                     Edit the last assistant response
show()                        Print all conversation messages
reset()                       Reset to only the system prompt
cwd(path=None)                Show or change the tool working directory
model(name=None)              Show or change the model ID
base_url(url=None)            Show or change the API base URL
stream(enabled=None)          Show or change streaming mode
context_files(enabled=None)   Enable or disable AGENTS.md / CLAUDE.md discovery
context_paths()               Show discovered context file paths

Exit with exit() or EOF.

One-shot prompt

python chatrepl.py \
  --api-key "your-api-key" \
  --base-url "https://api.openai.com/v1" \
  --model "gpt-4o" \
  "Inspect this repository and summarize the build system"

Piped input

cat prompt.txt | python chatrepl.py \
  --api-key "your-api-key" \
  --base-url "https://api.openai.com/v1" \
  --model "gpt-4o"

CLI options

-k, --api-key           API key for the OpenAI-compatible endpoint
-u, --base-url          Base URL, e.g. http://localhost:11434/v1
-m, --model             Model ID
-l, --load              Load a conversation JSON file
--no-stream             Disable streaming
--no-context-files      Disable AGENTS.md and CLAUDE.md discovery

Agent behavior

The core agent loop is implemented in AgentConversation.send().

For each turn it:

  1. sends the current messages and tool schemas to the model
  2. receives assistant content and optional tool calls
  3. executes tool calls locally
  4. appends tool results as tool messages
  5. asks the model again
  6. repeats until there are no more tool calls or MAX_AGENT_STEPS is reached

Current limit:

  • MAX_AGENT_STEPS = 32
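The steps above can be sketched as a plain loop. The `call_model` and `run_tool` signatures here are hypothetical stand-ins, not AgentConversation's actual interface:

```python
MAX_AGENT_STEPS = 32

def agent_send(messages, call_model, run_tool):
    """Sketch of the agent loop described above.

    call_model(messages) -> (content, tool_calls), where tool_calls is a
    list of (call_id, name, args) tuples; run_tool(name, args) -> str.
    """
    content = None
    for _ in range(MAX_AGENT_STEPS):
        content, tool_calls = call_model(messages)
        messages.append({"role": "assistant", "content": content})
        if not tool_calls:
            return content          # model finished without requesting tools
        for call_id, name, args in tool_calls:
            result = run_tool(name, args)
            messages.append(
                {"role": "tool", "tool_call_id": call_id, "content": result}
            )
    return content                  # step limit reached
```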

Context files

When enabled, the script searches from the current working directory up to the filesystem root for:

  • AGENTS.md
  • CLAUDE.md

Their contents are appended to the base system prompt.

This lets you keep project-local instructions outside the script itself.
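The walk-up search can be sketched like this (function name assumed; the order in which chatrepl.py concatenates multiple matches may differ):

```python
import os

CONTEXT_NAMES = ("AGENTS.md", "CLAUDE.md")

def find_context_files(start_dir):
    """Sketch: walk from start_dir up to the filesystem root, collecting
    any AGENTS.md / CLAUDE.md files along the way."""
    found = []
    directory = os.path.abspath(start_dir)
    while True:
        for name in CONTEXT_NAMES:
            path = os.path.join(directory, name)
            if os.path.isfile(path):
                found.append(path)
        parent = os.path.dirname(directory)
        if parent == directory:     # reached the root
            return found
        directory = parent
```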

Notes and limitations

  • This is a minimal agent, not a full task planner framework.
  • Tool execution is local and unsandboxed.
  • bash can modify the current machine and repository.
  • Tool support depends on your model and server correctly implementing OpenAI-compatible tool calling.
  • write overwrites full files; use edit for precise changes.
  • Output truncation is intentional to keep context bounded.

Example session

$ python chatrepl.py --api-key ... --base-url http://localhost:11434/v1 --model your-model
Welcome to pi-single-chatrepl. Use one of the following commands to interact with your-model:

>>> send('Inspect chatrepl.py and tell me whether it is agentic now.')
The script now has agentic behavior.

[tool read]
#!/usr/bin/env python
...

It defines tools, sends them to the model, executes tool calls, and loops until completion.
[tokens in=... out=... total_in=... total_out=...]

License

This project is licensed under the MIT License.

