
ollama-chat


Ollama Chat is a simple yet useful web chat client for Ollama that allows you to chat locally (and privately) with open-source LLMs.

Installation

To get up and running with Ollama Chat, follow these steps:

  1. Install and start Ollama

  2. Install Ollama Chat

    pip install ollama-chat
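
For example, a complete setup from a terminal might look like the following sketch (the ollama serve step is unnecessary if Ollama is already running as a service, and the model name is just an example):

    # Start the Ollama server (skip if Ollama is already running)
    ollama serve &

    # Pull a model to chat with (llama3 is just an example; any Ollama model works)
    ollama pull llama3

    # Install Ollama Chat
    pip install ollama-chat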
    

Starting Ollama Chat

To start Ollama Chat, open a terminal prompt and run the Ollama Chat application:

    ollama-chat

A web browser launches automatically and opens the Ollama Chat web application.

By default, a configuration file, "ollama-chat.json", is created in the current directory to save your conversations.
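
Because the configuration file is created in the current directory, one simple way to keep your conversations in one place is to always launch Ollama Chat from a dedicated directory, for example:

    # Keep ollama-chat.json (and the conversations it saves) together
    # by launching from a dedicated directory
    mkdir -p ~/ollama-chat
    cd ~/ollama-chat
    ollama-chat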

Future Features

In no particular order...

  • Save conversation as Markdown file

  • Conversation title edit

  • File / Directory / URL text inclusion in prompt

  • Local model management (pull, rm)

  • Prompt library

Development

This package is developed using python-build. It was started using python-template as follows:

    template-specialize python-template/template/ ollama-chat/ -k package ollama-chat -k name 'Craig A. Hobbs' -k email 'craigahobbs@gmail.com' -k github 'craigahobbs' -k noapi 1
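
To work on Ollama Chat itself, a typical loop might look like the following sketch. It assumes the GitHub repository location implied by the template arguments above, and that python-build provides its usual Makefile-driven workflow (the make target shown is illustrative):

    # Clone the repository (location assumed from the template arguments)
    git clone https://github.com/craigahobbs/ollama-chat.git
    cd ollama-chat

    # Run the default python-build target (illustrative; see the
    # repository's Makefile for the actual targets)
    make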
