# ollama-chat

An Ollama chat web application
Ollama Chat is a web chat client for Ollama that allows you to chat locally (and privately) with Large Language Models (LLMs).
## Features
- Select local model to chat with
- Saves conversations for later viewing and interaction
- Enter single or multiline prompts
- Regenerate the most recent conversation response
- Delete the most recent conversation exchange
- View responses as Markdown text
- Save conversations as Markdown text
- Multiple concurrent chat responses (with proper Ollama configuration)
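Concurrent chat responses depend on how the Ollama server itself is configured, not on Ollama Chat. A hedged sketch of the relevant server settings (assumption: your Ollama version supports these environment variables; the values shown are illustrative):

```shell
# Ollama server settings that allow concurrent responses (assumed
# environment variables; adjust values to your hardware).
# OLLAMA_NUM_PARALLEL: number of requests each model handles in parallel.
export OLLAMA_NUM_PARALLEL=4
# OLLAMA_MAX_LOADED_MODELS: how many models may stay loaded at once.
export OLLAMA_MAX_LOADED_MODELS=2
# Start (or restart) the Ollama server with these settings in effect, e.g.:
#   ollama serve
```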
## Installation
To get up and running with Ollama Chat, follow these steps:
1. Install and start Ollama

2. Install Ollama Chat:

   ```
   pip install ollama-chat
   ```
### Updating

To update Ollama Chat:

```
pip install -U ollama-chat
```
## Start Ollama Chat

To start Ollama Chat, open a terminal prompt and run the Ollama Chat application:

```
ollama-chat
```

A web browser is launched and opens the Ollama Chat web application.

By default, a configuration file, `ollama-chat.json`, is created in the user's home directory.
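Because the configuration file's name and location are fixed by convention, it can be located programmatically. A minimal Python sketch (the filename and home-directory location come from the text above; the read logic is illustrative, since the file's schema is not documented here):

```python
import json
from pathlib import Path

# Default Ollama Chat configuration file path (home directory, per above)
config_path = Path.home() / "ollama-chat.json"

# Illustrative: read the config if it already exists
if config_path.exists():
    config = json.loads(config_path.read_text())
    print(f"Loaded config with {len(config)} top-level keys")
else:
    print(f"No config yet; it will be created at {config_path}")
```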
## Start Conversation from CLI

To start a conversation from the command line, use the `-m` argument:

```
ollama-chat -m "Why is the sky blue?"
```
## File Format and API Documentation
## Development

This package is developed using python-build. It was started using python-template as follows:

```
template-specialize python-template/template/ ollama-chat/ \
    -k package ollama-chat \
    -k name 'Craig A. Hobbs' \
    -k email 'craigahobbs@gmail.com' \
    -k github 'craigahobbs' \
    -k noapi 1
```