A Python 2.7+ REPL for interacting with LLMs through any OpenAI Chat Completions-compatible API.

Features

  • Zero Dependencies: Works with stock Python 2.7+ and 3.x
  • Interactive Chat: Natural REPL interface with conversation tracking
  • Streaming Responses: See responses as they're generated
  • Multiline Input: Support for complex prompts with :multiline
  • File Integration: Load prompts from text files with :send <textfile>
  • Conversation Persistence: Save/load complete conversations in JSON format
  • Enhanced Input: Optional readline support for history and line editing
  • Dual Modes: Both interactive REPL and pipe-friendly CLI
  • API Ready: Can be imported as a module for programmatic use

Installation

pip install chatrepl

Interactive Mode (CLI)

$ python -m chatrepl \
  --api-key "your-api-key" \
  --base-url "https://api.openai.com/v1" \
  --model "gpt-4o"

Basic Conversation

User [1]: Explain recursion to a 5-year-old

Assistant [1]: Imagine you're holding a doll that has...

Using Files

User [2]: :send code.py
Assistant [2]: I notice this Python code could be improved...

User [3]: :save review_chat.json

Multiline Input

User [4]: :multiline
Enter EOF on a blank line to finish input:
> Compare these programming languages:
> 1. Python
> 2. Rust
> 3. Go
> [Ctrl-D]

Assistant [4]: Here's a comparison:
1. Python - High-level, interpreted...
2. Rust - Systems programming...
3. Go - Compiled, concurrent...

Non-interactive Mode (Piped Input)

$ uname -a | python -m chatrepl --api-key <your_api_key> --base-url <your_base_url> --model <model_name>
The output you've provided appears to be system information from ... [output streamed to STDOUT]

Print Saved Conversations

$ python -m chatrepl --print conversation.json
User [1]: ...

Assistant [1]: ...

Interactive Commands

  • :multiline - Enter multiline input mode (end with blank line + Ctrl-D)
  • :send TEXTFILE - Send contents of TEXTFILE
  • :load JSONFILE - Load conversation from JSONFILE
  • :save JSONFILE - Save conversation to JSONFILE
  • :help - Show help
  • :quit or Ctrl-D - Exit the program

Best Practices

  1. For long sessions, periodically save with :save
  2. Use :multiline for structured prompts (lists, code, etc.)
  3. JSON files can be edited manually for prompt engineering
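Because saved conversations are plain JSON, you can seed or edit them before a session. The sketch below assumes the file holds an OpenAI-style list of {"role", "content"} messages; verify the exact schema against a file produced by :save before relying on it:

```python
import json

# Hypothetical seed conversation in the OpenAI chat-message shape.
# Check this against a real :save output, since chatrepl's schema may differ.
messages = [
    {"role": "system", "content": "You are a concise code reviewer."},
    {"role": "user", "content": "Review the next file I send."},
]

with open("seed.json", "w") as f:
    json.dump(messages, f, indent=2)

# Round-trip check: the file parses back into the same structure,
# ready to be loaded in the REPL with :load seed.json.
with open("seed.json") as f:
    loaded = json.load(f)
```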

Programmatic Usage (API)

from __future__ import print_function  # keeps the example valid on Python 2

from chatrepl import Conversation

# Initialize conversation
conv = Conversation(
    api_key="your-api-key",
    base_url="https://api.openai.com/v1",
    model="gpt-4o"
)

# Load conversation
conv.load_messages_from_file("conversation.json")

# Access message history (str.format keeps Python 2.7 compatibility)
for msg in conv.messages:
    print("{0}: {1}".format(msg['role'], msg['content']))

# Append a message to the conversation history without requesting a response
conv.add_message('Help me with the following tasks:')

# Send message and stream response
print("Assistant: ", end="")
for chunk in conv.send_message_to_model_and_stream_response("Hello!"):
    print(chunk, end="")
print()

# Save conversation
conv.save_messages_to_file("conversation.json")
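If you want to stream a reply to the terminal and also keep the full text afterwards, a small wrapper around the generator shown above does both. stream_and_collect is a hypothetical helper for illustration, not part of chatrepl:

```python
import sys

def stream_and_collect(chunks, out=sys.stdout):
    """Echo each text chunk as it arrives and return the joined reply."""
    parts = []
    for chunk in chunks:
        out.write(chunk)
        out.flush()  # show partial output immediately, as the REPL does
        parts.append(chunk)
    return "".join(parts)

# Works with any iterable of text chunks, such as the generator returned
# by conv.send_message_to_model_and_stream_response("Hello!").
reply = stream_and_collect(iter(["Hel", "lo", "!"]))
```

The joined string can then be logged or post-processed while the user still saw the reply appear incrementally.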

Contributing

Contributions are welcome! Please submit pull requests or open issues on the GitHub repository.

License

This project is licensed under the MIT License.
