
A terminal-based chat application that works with language models.

Charla: Chat with Language Models in a Terminal

Charla is a terminal-based application for chatting with language models. It integrates with Ollama and GitHub Models to exchange messages with model services.

Features

  • Terminal-based chat system that supports context-aware conversations with language models.
  • Support for local models via Ollama and remote models via GitHub Models.
  • Chat sessions are saved as Markdown files in the user's documents directory when a chat ends.
  • Prompt history is saved and previously entered prompts are auto-suggested.
  • Switch between single-line and multi-line input modes without interrupting the chat session.
  • Store default user preferences in a settings file.
  • Provide a system prompt for a chat session.
  • Load content from local files and web pages to append to prompts.

Installation

To use Charla with models on your computer, you need a running Ollama server with at least one supported language model installed. For GitHub Models, you need access to the service and a GitHub token. Refer to the documentation of the service provider you want to use for installation and setup instructions.

Install Charla using pipx:

pipx install charla

For GitHub Models, set the environment variable GITHUB_TOKEN to your token. In Bash:

export GITHUB_TOKEN=YOUR_GITHUB_TOKEN
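If the variable is missing, a provider call will fail at runtime. A minimal sketch of reading the token defensively from Python (the helper name `github_token` is hypothetical, not part of Charla's API):

```python
import os


def github_token() -> str:
    """Return the GitHub token from the environment, or raise a helpful error."""
    token = os.environ.get("GITHUB_TOKEN")
    if not token:
        raise RuntimeError(
            "GITHUB_TOKEN is not set. Export it first, e.g. "
            "export GITHUB_TOKEN=YOUR_GITHUB_TOKEN"
        )
    return token
```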

Usage

After successful installation and setup, you can launch the chat console with the charla command in your terminal.

If you use Charla with Ollama, the default provider, you only need to specify the model to use, e.g.:

charla -m phi3

If you want to use GitHub Models, you have to set the provider:

charla -m gpt-4o --provider github

You can set a default model and change the default provider in your user settings file.

Settings

Settings can be specified as command-line arguments and in the settings file. Command-line arguments have the highest priority. The location of your settings file depends on your operating system. Use the following command to show the location:

charla settings --location

Example settings for using OpenAI's GPT-4o model and the GitHub Models service by default:

{
    "model": "gpt-4o",
    "chats_path": "./chats",
    "prompt_history": "./prompt-history.txt",
    "provider": "github",
    "message_limit": 20,
    "multiline": false
}
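As an illustration of the precedence described above (command-line arguments override the settings file, which overrides built-in defaults), here is a minimal sketch of how such merging can be modeled. This is not Charla's actual implementation, and the default values shown are illustrative:

```python
import json
from pathlib import Path

# Illustrative built-in defaults, not necessarily Charla's actual defaults.
DEFAULTS = {"provider": "ollama", "multiline": False, "message_limit": 20}


def effective_settings(settings_path: Path, cli_args: dict) -> dict:
    """Merge defaults, settings file, and CLI arguments; later sources win."""
    settings = dict(DEFAULTS)
    if settings_path.exists():
        settings.update(json.loads(settings_path.read_text()))
    # CLI arguments have the highest priority; options left unset (None) are skipped.
    settings.update({key: value for key, value in cli_args.items() if value is not None})
    return settings
```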

CLI Help

Output of charla -h with information on all available command-line options:

usage: charla [-h] [--model MODEL] [--chats-path CHATS_PATH] [--prompt-history PROMPT_HISTORY]
              [--provider PROVIDER] [--message-limit MESSAGE_LIMIT] [--multiline]
              [--system-prompt SYSTEM_PROMPT] [--version]
              {settings} ...

Chat with language models.

positional arguments:
  {settings}            Sub Commands
    settings            Show current settings.

options:
  -h, --help            show this help message and exit
  --model MODEL, -m MODEL
                        Name of language model to chat with.
  --chats-path CHATS_PATH
                        Directory to store chats.
  --prompt-history PROMPT_HISTORY
                        File to store prompt history.
  --provider PROVIDER   Name of the provider to use.
  --message-limit MESSAGE_LIMIT
                        Maximum number of messages to send to GitHub Models service.
  --multiline           Use multiline mode.
  --system-prompt SYSTEM_PROMPT, -sp SYSTEM_PROMPT
                        File that contains system prompt to use.
  --version             show program's version number and exit

Development

Run the command-line interface directly from the project source without installing the package:

python -m charla.cli

Ollama API

Installed models:

curl http://localhost:11434/api/tags

Model info:

curl http://localhost:11434/api/show -d '{"name": "phi3"}'
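The same endpoint can be queried from Python using only the standard library. A small sketch, assuming a local Ollama server on the default port (the helper names are hypothetical):

```python
import json
from urllib.request import urlopen

OLLAMA_URL = "http://localhost:11434"


def model_names(tags_payload: dict) -> list:
    """Extract model names from an /api/tags response payload."""
    return [model["name"] for model in tags_payload.get("models", [])]


def installed_models() -> list:
    """Query a running Ollama server for its installed models."""
    with urlopen(f"{OLLAMA_URL}/api/tags") as response:
        return model_names(json.load(response))


if __name__ == "__main__":
    print(installed_models())
```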

License

Charla is distributed under the terms of the MIT license.
