ollama-chat
An Ollama chat web application
Ollama Chat is a web chat client for Ollama that allows you to chat locally (and privately) with Large Language Models (LLMs).
Features
- Platform independent - tested on macOS, Windows, and Linux
- Chat with any local Ollama model
- Save conversations for later viewing and interaction
- Single and multiline prompts
- Regenerate the most recent conversation response
- Delete the most recent conversation exchange
- View responses as Markdown or text
- Save conversations as Markdown text
- Multiple concurrent chats
- Prompt commands for including file and URL content
- Conversation templates for repeating prompts with variable substitutions
Installation
To get up and running with Ollama Chat, follow these steps:
- Install and start Ollama
- Install Ollama Chat:
  pip install ollama-chat
Updating
To update Ollama Chat:
pip install -U ollama-chat
Start Ollama Chat
To start Ollama Chat, open a terminal prompt and run the Ollama Chat application:
ollama-chat
A web browser is launched and opens the Ollama Chat web application.
By default, a configuration file, "ollama-chat.json", is created in the user's home directory.
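The configuration file stores your saved conversations and templates. As a minimal sketch, a freshly created configuration is essentially empty, along the lines below (the top-level fields are inferred from the conversation template example later in this document; the actual file may contain additional fields):
  {
      "conversations": [],
      "templates": []
  }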
To start a conversation from the command line, use the -m argument:
ollama-chat -m "Why is the sky blue?"
Prompt Commands
Ollama Chat supports special prompt commands that allow you to include file and URL content in your prompt, among other things. The following prompt commands are available:
- /file - include a file:
  /file README.md
  Please summarize the README file.
- /dir - include files from a directory:
  /dir src/ollama_chat py
  Please provide a summary for each Ollama Chat source file.
- /url - include a URL resource:
  /url https://craigahobbs.github.io/ollama-chat/README.md
  Please summarize the README file.
- /do - execute a conversation template by name or title:
  /do city-report -v CityState "Seattle, WA"
To get prompt command help, use the -h option:
/file -h
Conversation Templates
Conversation templates allow you to repeat the same prompts with different models. Templates can define variables that may be included in the template title and prompt text ({{var}}). For example:
{
    "conversations": [],
    "templates": [
        {
            "title": "City Report for {{CityState}}",
            "prompts": [
                "Tell me about {{CityState}}",
                "What is the average cost of living in {{CityState}}?"
            ],
            "variables": [
                {
                    "label": "City, State",
                    "name": "CityState"
                }
            ]
        }
    ]
}
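Assuming the template above is also given the name city-report (the example defines only a title; the /do prompt command accepts either a name or a title), it can be run from a conversation prompt with the variable substituted, as described in the Prompt Commands section:
  /do city-report -v CityState "Seattle, WA"
This substitutes "Seattle, WA" for {{CityState}} in the title and both prompts, then runs the prompts in order.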
File Format and API Documentation
Development
This package is developed using python-build. It was started using python-template as follows:
template-specialize python-template/template/ ollama-chat/ -k package ollama-chat -k name 'Craig A. Hobbs' -k email 'craigahobbs@gmail.com' -k github 'craigahobbs' -k noapi 1
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution: ollama_chat-0.9.27.tar.gz
Built Distribution: ollama_chat-0.9.27-py3-none-any.whl
File details
Details for the file ollama_chat-0.9.27.tar.gz.
File metadata
- Download URL: ollama_chat-0.9.27.tar.gz
- Upload date:
- Size: 20.0 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.12.6
File hashes
Algorithm | Hash digest
---|---
SHA256 | 2d29c978a1e305c8f3922a01d34c3076d4d1dc1746a753da80c900a6046877f6
MD5 | d74e0dc6ece826ac9eb9b1f7460636d2
BLAKE2b-256 | eabb5e085faeda4a881e4607c8a3fed7c4ea00bbd828aab4fe2bdf64728481a3
File details
Details for the file ollama_chat-0.9.27-py3-none-any.whl.
File metadata
- Download URL: ollama_chat-0.9.27-py3-none-any.whl
- Upload date:
- Size: 21.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.12.6
File hashes
Algorithm | Hash digest
---|---
SHA256 | e1e81be7ef2839937871db0b821fa9392baba3b9acb99758b3b0a8d56fc7d882
MD5 | 1104f56271985f2d32b4114f5f79aa1a
BLAKE2b-256 | 235f95c4d940c532b4677dd3f58c16ca600550c105c758ae9165cb121c7f1bf4