# Chatline

A pretty command-line interface for LLM chat.

A lightweight CLI library for building terminal-based LLM chat interfaces with minimal effort. It provides rich text styling, animations, and conversation state management.
## Installation

```bash
pip install chatline
```

With Poetry:

```bash
poetry add chatline
```
## Usage

### Embedded Mode (AWS Bedrock)

For quick prototyping with AWS Bedrock:

```python
from chatline import Interface

# Initialize in embedded mode (uses AWS Bedrock)
chat = Interface(logging_enabled=True)

# Add an optional welcome message
chat.preface("Welcome to the Demo", title="My App", border_color="green")

# Start the conversation
chat.start()
```
### Remote Mode (Custom Backend)

Connect to your own FastAPI/HTTP backend:

```python
from chatline import Interface

# Initialize in remote mode by pointing at your endpoint
chat = Interface(endpoint="http://localhost:8000/chat")

# Start the conversation with custom system and user messages
chat.start([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello, how can you help me today?"}
])
```
### Setting Up a Backend Server

Example FastAPI server:

```python
# server.py
import json

import uvicorn
from fastapi import FastAPI, Request
from fastapi.responses import StreamingResponse

from chatline import generate_stream

app = FastAPI()


@app.post("/chat")
async def stream_chat(request: Request):
    body = await request.json()
    state = body.get("conversation_state", {})
    messages = state.get("messages", [])

    # Process the request and update state as needed
    state["server_turn"] = state.get("server_turn", 0) + 1

    # Return a streaming response, carrying the updated state in a header.
    # media_type sets the Content-Type header, so it isn't repeated here.
    headers = {"X-Conversation-State": json.dumps(state)}
    return StreamingResponse(
        generate_stream(messages),
        headers=headers,
        media_type="text/event-stream",
    )


if __name__ == "__main__":
    uvicorn.run("server:app", host="127.0.0.1", port=8000)
```
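`generate_stream` handles the model call for you, but `StreamingResponse` accepts any async iterator, so you can substitute your own generator to serve responses from a different source. A minimal stand-in sketch (the `data: ...` server-sent-event framing here is an assumption based on the `text/event-stream` media type above; check chatline's actual stream format before relying on it):

```python
import asyncio


async def fake_stream(messages):
    """Yield a canned reply as server-sent events, one word per chunk."""
    reply = "Hello from the custom backend"
    for word in reply.split():
        # Each SSE event is a "data: ..." line followed by a blank line.
        yield f"data: {word}\n\n"
        await asyncio.sleep(0)  # yield control, as a real model client would


async def collect(gen):
    return [chunk async for chunk in gen]


chunks = asyncio.run(collect(fake_stream([])))
print(chunks[0])  # → "data: Hello\n\n"
```

Passing `fake_stream(messages)` in place of `generate_stream(messages)` in the server above lets you exercise the endpoint without AWS credentials.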
## Features

- **Terminal UI**: Rich text formatting with styled quotes, brackets, emphasis, and more
- **Response Streaming**: Real-time streamed responses with loading animations
- **State Management**: Conversation history with edit and retry functionality
- **Dual Modes**: Run with embedded AWS Bedrock or connect to a custom backend
- **Keyboard Shortcuts**: Ctrl+E to edit the previous message, Ctrl+R to retry
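The edit and retry shortcuts operate on the conversation history: conceptually, a retry drops the last assistant reply so the preceding user turn can be resent. This is not chatline's internal API, just a rough illustration of that history manipulation:

```python
def retry(history):
    """Drop a trailing assistant reply so the last user turn can be re-sent."""
    if history and history[-1]["role"] == "assistant":
        return history[:-1]
    return history


history = [
    {"role": "user", "content": "Hello"},
    {"role": "assistant", "content": "Hi there"},
]
print(retry(history))  # → [{'role': 'user', 'content': 'Hello'}]
```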
## Dependencies

- Python ≥ 3.12
- AWS credentials configured (for embedded mode with Bedrock)
- boto3, httpx, rich, prompt-toolkit

## License

MIT