# ollama-chat

Ollama Chat is a web chat client for Ollama that allows you to chat locally (and privately) with Large Language Models (LLMs).
## Features
- Platform independent - tested on macOS, Windows, and Linux
- Chat with any local Ollama model
- Save conversations for later viewing and interaction
- Single and multiline prompts
- Regenerate the most recent conversation response
- Delete the most recent conversation exchange
- View responses as Markdown or text
- Save conversations as Markdown text
- Multiple concurrent chats
- Prompt commands for including file and URL content
- Conversation templates for repeating prompts with variable substitutions
## Installation

To get up and running with Ollama Chat, follow these steps:

1. Install and start Ollama
2. Install Ollama Chat:

   ```
   pip install ollama-chat
   ```
### Updating

To update Ollama Chat:

```
pip install -U ollama-chat
```
## Start Ollama Chat

To start Ollama Chat, open a terminal prompt and run the Ollama Chat application:

```
ollama-chat
```

A web browser is launched and opens the Ollama Chat web application.

By default, a configuration file, "ollama-chat.json", is created in the user's home directory.

To start a conversation from the command line, use the `-m` argument:

```
ollama-chat -m "Why is the sky blue?"
```
## Prompt Commands

Ollama Chat supports special prompt commands that allow you to include file and URL content in your prompt, among other things. The following prompt commands are available:

- `/file` - include a file

  `/file README.md Please summarize the README file.`

- `/dir` - include files from a directory

  `/dir src/ollama_chat py Please provide a summary for each Ollama Chat source file.`

- `/url` - include a URL resource

  `/url https://craigahobbs.github.io/ollama-chat/README.md Please summarize the README file.`

- `/do` - execute a conversation template by name or title

  `/do city-report -v CityState "Seattle, WA"`

To get prompt command help, use the `-h` option:

```
/file -h
```
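Conceptually, a prompt command like `/file` is a text substitution performed on the prompt before it is sent to the model. The sketch below illustrates that idea only; the function name and regex are illustrative and are not Ollama Chat's actual implementation:

```python
import re


def expand_file_commands(prompt, read_file=open):
    # Replace each "/file <path>" token with that file's contents,
    # leaving the rest of the prompt text untouched. (Illustrative
    # sketch only -- not Ollama Chat's real command parser.)
    def replace(match):
        path = match.group(1)
        with read_file(path) as fh:
            return fh.read()

    return re.sub(r'/file\s+(\S+)', replace, prompt)
```

Using a replacement function (rather than a replacement string) avoids `re.sub` interpreting backslashes in the file's contents.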
## Conversation Templates

Conversation templates allow you to repeat the same prompts with different models. Templates can define variables that may be included in the template title and prompt text (`{{var}}`). For example:
```json
{
    "conversations": [],
    "templates": [
        {
            "title": "City Report for {{CityState}}",
            "prompts": [
                "Tell me about {{CityState}}",
                "What is the average cost of living in {{CityState}}?"
            ],
            "variables": [
                {
                    "label": "City, State",
                    "name": "CityState"
                }
            ]
        }
    ]
}
```
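The `{{var}}` placeholders above are plain string templating. A minimal sketch of how a template's title and prompts could be rendered with variable values (illustrative only, not Ollama Chat's code):

```python
import re


def render_template(template, values):
    # Substitute each {{name}} placeholder in the template's title and
    # prompts with the corresponding entry from the values mapping.
    # (Illustrative sketch -- not Ollama Chat's actual renderer.)
    def sub(text):
        return re.sub(r'\{\{(\w+)\}\}', lambda m: values[m.group(1)], text)

    return {
        'title': sub(template['title']),
        'prompts': [sub(prompt) for prompt in template['prompts']],
    }
```

For the example template above, rendering with `{'CityState': 'Seattle, WA'}` would produce the title "City Report for Seattle, WA" and the corresponding substituted prompts.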
## File Format and API Documentation
## Development
This package is developed using python-build. It was started using python-template as follows:

```
template-specialize python-template/template/ ollama-chat/ -k package ollama-chat -k name 'Craig A. Hobbs' -k email 'craigahobbs@gmail.com' -k github 'craigahobbs' -k noapi 1
```
## File details

Details for the file `ollama_chat-0.9.33.tar.gz`:

- Download URL: ollama_chat-0.9.33.tar.gz
- Upload date:
- Size: 21.8 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.13.0

| Algorithm | Hash digest |
|---|---|
| SHA256 | f517f25dc3d6aa1d159603bdc5adbc98f3c3b3a327c71cb98733b81d292c8ca5 |
| MD5 | be4485689a359db6157fa36d637d44a5 |
| BLAKE2b-256 | 76db19e33bcade8105c1b822c271ca083dca710bdf13b44f947d9f9e8a59bdb1 |
Details for the file `ollama_chat-0.9.33-py3-none-any.whl`:

- Download URL: ollama_chat-0.9.33-py3-none-any.whl
- Upload date:
- Size: 24.1 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.13.0

| Algorithm | Hash digest |
|---|---|
| SHA256 | fa1a4e623053021a84f1984bb5763a7038ea36b030dada5fe3e8df7cc91b8d2d |
| MD5 | b194072019eabb5820f31d9378bc35ed |
| BLAKE2b-256 | 5c9830cef2938acf79406d32b12d67dab3fd851dd5427d96fb2c6c4b075b826a |