ollama-chat
Ollama Chat is a simple yet useful web chat client for Ollama that allows you to chat locally (and privately) with open-source LLMs.
Installation
To get up and running with Ollama Chat, follow these steps:
- Install and start Ollama
- Install Ollama Chat:

  pip install ollama-chat
Starting Ollama Chat
To start Ollama Chat, open a terminal prompt and run the Ollama Chat application:
ollama-chat
A web browser launches and opens the Ollama Chat web application.
By default, a configuration file, "ollama-chat.json", is created in the current directory to save your conversations.
Future Features
In no particular order...
- Save conversation as Markdown file
- Conversation title edit
- File / Directory / URL text inclusion in prompt
- Local model management (pull, rm)
- Prompt library
Development
This package is developed using python-build. It was started using python-template as follows:
template-specialize python-template/template/ ollama-chat/ -k package ollama-chat -k name 'Craig A. Hobbs' -k email 'craigahobbs@gmail.com' -k github 'craigahobbs' -k noapi 1
Hashes for ollama_chat-0.9.3-py3-none-any.whl

Algorithm | Hash digest
---|---
SHA256 | 3c786d754eef26755b046b817977a90261c8ed0dfd3a4dca5953a0e7c7864860
MD5 | 0f20d3b465c5f8120171d2f1ee6754d2
BLAKE2b-256 | 752fa2dbe1f5d625bdd04644f568f69148cb8aea2fd7fda74e519c9a660d9875