
ollama-chat


Ollama Chat is a simple yet useful web chat client for Ollama that allows you to chat locally (and privately) with open-source LLMs.

Installation

To get up and running with Ollama Chat, follow these steps:

  1. Install and start Ollama

  2. Install Ollama Chat

    pip install ollama-chat
    

Updating

To update Ollama Chat:

pip install -U ollama-chat

Start Ollama Chat

To start Ollama Chat, open a terminal prompt and run the Ollama Chat application:

ollama-chat

Your web browser launches and opens the Ollama Chat web application.

By default, a configuration file, "ollama-chat.json", is created in the user's home directory.
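If you want to inspect or back up the configuration, the default path can be built in a shell script. A minimal sketch (the file's internal schema is not documented here, so the contents are only printed, not interpreted):

```shell
# Path of the default Ollama Chat configuration file in the home directory
CONFIG_PATH="$HOME/ollama-chat.json"

# Print it if it exists (it is created on the application's first run)
if [ -f "$CONFIG_PATH" ]; then
    cat "$CONFIG_PATH"
fi
```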

Start Conversation from CLI

To start a conversation from the command line, use the -m argument:

ollama-chat -m "Why is the sky blue?"
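Because -m takes an ordinary string, the prompt can also come from a file via command substitution. A small sketch (question.txt is a hypothetical file name, not part of Ollama Chat):

```shell
# Write a prompt to a file, then read it back into a shell variable
printf 'Why is the sky blue?' > question.txt
PROMPT="$(cat question.txt)"

# Start the conversation with the file's contents as the prompt
# (requires ollama-chat to be installed and Ollama running):
# ollama-chat -m "$PROMPT"
```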

File Format and API Documentation

  • Ollama Chat File Format

  • Ollama Chat API

Future

  • Save conversation as Markdown file

    • Save link on index/conversation page
  • Markdown text view on conversation page

  • Auto-title task on start conversation

    • Update conversation title API
    • Update title link on index/conversation page
  • Prompts part 1

    • Prompts config collection (name, title, prompt)
    • Index links start new conversation with current model
    • -t command-line argument starts prompt by name
  • File / Directory / URL text inclusion in prompt

  • Multi-line text input

  • Prompts part 2

    • Prompt editor
    • Create link on index page
    • Delete links on index page
    • Index links open the template editor if the prompt contains any template markers (e.g. "{Name}")
  • Local model management (pull, rm)

Development

This package is developed using python-build. It was started using python-template as follows:

template-specialize python-template/template/ ollama-chat/ -k package ollama-chat -k name 'Craig A. Hobbs' -k email 'craigahobbs@gmail.com' -k github 'craigahobbs' -k noapi 1

