A terminal-based chat application that works with local language models.
Charla: Terminal-Based Chat Application with Ollama Backend Integration
Charla is a terminal-based chat application that integrates with Ollama, a backend server for running language models. To use Charla, ensure that the Ollama server is running and at least one language model is installed.
Installation
Install Charla using pipx:

pipx install charla
Usage
Launch the chat console by typing charla in your terminal, or view all available command-line options with charla -h.
Features
- Terminal-based chat system that supports context-aware conversations using local language models.
- Chat sessions are saved as markdown files in the user's documents directory when ending a chat.
- Prompt history is saved and previously entered prompts are auto-suggested.
- Mode switching between single-line and multi-line input without interrupting your chat session.
Development
Run the command-line interface directly from the project source without installing the package:
python -m charla.cli
Ollama API
Installed models:
curl http://localhost:11434/api/tags
Model info:
curl http://localhost:11434/api/show -d '{"name": "phi3"}'
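The /api/tags endpoint above returns JSON with a models array, each entry carrying a name field, per the Ollama API documentation. A minimal sketch of extracting the installed model names from such a response, using an illustrative sample payload rather than a live server:

```python
import json

# Parse the JSON body returned by Ollama's /api/tags endpoint.
# The sample payload is illustrative; a real response comes from
# GET http://localhost:11434/api/tags on a running Ollama server.

def installed_models(payload: str) -> list[str]:
    """Extract model names from an /api/tags response body."""
    return [m["name"] for m in json.loads(payload).get("models", [])]

sample = '{"models": [{"name": "phi3:latest"}, {"name": "llama3:latest"}]}'
print(installed_models(sample))  # → ['phi3:latest', 'llama3:latest']
```

This is the same check Charla's prerequisite implies: if the list is empty, no model is installed and a chat cannot start.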
License
Charla is distributed under the terms of the MIT license.
Project details
Download files
Source distribution: charla-1.0.0.tar.gz (6.7 kB)
Built distribution: charla-1.0.0-py3-none-any.whl (5.9 kB)