ollama-chat
Ollama Chat is a web chat client for Ollama that allows you to chat locally (and privately) with Large Language Models (LLMs).
Features
- Select local model to chat with
- Saves conversations for later viewing and interaction
- Enter single or multiline prompts
- Regenerate the most recent conversation response
- Delete the most recent conversation exchange
- View responses as Markdown text
- Save conversations as Markdown text
- Multiple concurrent chat responses (with proper Ollama configuration; see the example below)
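Out of the box, Ollama serves one request at a time, so concurrent chat responses require configuring the Ollama server itself. A minimal sketch, assuming a recent Ollama release that honors the OLLAMA_NUM_PARALLEL environment variable (the value shown is illustrative):

# Allow the Ollama server to process up to 4 requests in parallel
export OLLAMA_NUM_PARALLEL=4
ollama serve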
Installation
To get up and running with Ollama Chat, follow these steps:
- Install and start Ollama
- Install Ollama Chat:
  pip install ollama-chat
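For example, a complete first-time setup might look like the following; the model name is illustrative, and the virtual environment is optional:

# Start the Ollama server (leave this running)
ollama serve

# In another terminal, pull a model to chat with
ollama pull llama3

# Install Ollama Chat, optionally inside a virtual environment
python3 -m venv venv
. venv/bin/activate
pip install ollama-chat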
Updating
To update Ollama Chat:
pip install -U ollama-chat
Start Ollama Chat
To start Ollama Chat, open a terminal prompt and run the Ollama Chat application:
ollama-chat
Your web browser launches and opens the Ollama Chat web application.
By default, a configuration file, "ollama-chat.json", is created in the user's home directory.
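Assuming saved conversations live in this configuration file at its default location, it can be inspected or backed up with ordinary shell commands:

# Inspect the configuration file (default location)
cat ~/ollama-chat.json

# Keep a backup copy of your conversations
cp ~/ollama-chat.json ~/ollama-chat.json.bak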
Start Conversation from CLI
To start a conversation from the command line, use the -m argument:
ollama-chat -m "Why is the sky blue?"
File Format and API Documentation
Development
This package is developed using python-build. It was started using python-template as follows:
template-specialize python-template/template/ ollama-chat/ -k package ollama-chat -k name 'Craig A. Hobbs' -k email 'craigahobbs@gmail.com' -k github 'craigahobbs' -k noapi 1
Hashes for ollama_chat-0.9.16-py3-none-any.whl
Algorithm | Hash digest
---|---
SHA256 | b1aac3894c338ac64f7e48c730267d15986782f0b8ffa698f998465c40fcf7c3
MD5 | 18341ab902fd62b9a5dfda68abf54490
BLAKE2b-256 | 97d5e7974cbd10961d88c5b5fdb07a900c8a0886f13dbed2d75bfd136413c366