
An Ollama chat web application

Project description

ollama-chat

Ollama Chat is a web chat client for Ollama that allows you to chat locally (and privately) with Large Language Models (LLMs).

Features

  • Platform independent - tested on macOS, Windows, and Linux
  • Chat with any local Ollama model
  • Save conversations for later viewing and interaction
  • Single and multiline prompts
  • Regenerate the most recent conversation response
  • Delete the most recent conversation exchange
  • View responses as Markdown or text
  • Save conversations as Markdown text
  • Multiple concurrent chats
  • Prompt commands for including file and URL content
  • Conversation templates for repeating prompts with variable substitutions

Installation

To get up and running with Ollama Chat, follow these steps:

  1. Install and start Ollama

  2. Install Ollama Chat

    pip install ollama-chat
    
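Ollama Chat is a regular Python package, so it can also be installed into a Python virtual environment if you prefer to keep it isolated. A minimal sketch using the standard venv module (the environment path is just an example):

    # create a virtual environment and install Ollama Chat into it
    python3 -m venv ~/ollama-chat-venv
    ~/ollama-chat-venv/bin/pip install ollama-chat

    # run the application from the virtual environment
    ~/ollama-chat-venv/bin/ollama-chat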

Updating

To update Ollama Chat:

pip install -U ollama-chat

Start Ollama Chat

To start Ollama Chat, open a terminal prompt and run the Ollama Chat application:

ollama-chat

A web browser launches and opens the Ollama Chat web application.

By default, a configuration file, "ollama-chat.json", is created in the user's home directory.
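The configuration file stores your saved conversations and templates; its exact schema is documented under "Ollama Chat File Format" below. A freshly created file is presumably little more than an empty conversation list, roughly:

{
    "conversations": []
}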

To start a conversation from the command line, use the -m argument:

ollama-chat -m "Why is the sky blue?"

Prompt Commands

Ollama Chat supports special prompt commands that allow you to include file and URL content in your prompt, among other things. The following prompt commands are available:

  • /file - include a file

    /file README.md
    
    Please summarize the README file.
    
  • /dir - include files from a directory

    /dir src/ollama_chat py
    
    Please provide a summary for each Ollama Chat source file.
    
  • /url - include a URL resource

    /url https://craigahobbs.github.io/ollama-chat/README.md
    
    Please summarize the README file.
    
  • /do - execute a conversation template by name or title

    /do city-report -v CityState "Seattle, WA"
    

To get help for a prompt command, use the -h option:

/file -h

Conversation Templates

Conversation Templates let you re-run the same prompts with different models or with different variable values. A template can define variables that are substituted into its title and prompt text using the {{var}} syntax. For example:

{
    "conversations": [],
    "templates": [
        {
            "title": "City Report for {{CityState}}",
            "prompts": [
                "Tell me about {{CityState}}",
                "What is the average cost of living in {{CityState}}?"
            ],
            "variables": [
                {
                    "label": "City, State",
                    "name": "CityState"
                }
            ]
        }
    ]
}
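To run a template like this from the chat prompt, use the /do command shown above. The earlier example, repeated below, assumes the template also defines a short name (for instance, "name": "city-report") in addition to its title; each variable value is supplied with -v:

/do city-report -v CityState "Seattle, WA"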

File Format and API Documentation

  • Ollama Chat File Format
  • Ollama Chat API

Development

This package is developed using python-build. It was started using python-template as follows:

template-specialize python-template/template/ ollama-chat/ -k package ollama-chat -k name 'Craig A. Hobbs' -k email 'craigahobbs@gmail.com' -k github 'craigahobbs' -k noapi 1

Download files


Source Distribution

  • ollama_chat-0.9.31.tar.gz (19.9 kB, Source)

Built Distribution

  • ollama_chat-0.9.31-py3-none-any.whl (21.6 kB, Python 3)

File details

Details for the file ollama_chat-0.9.31.tar.gz.

File metadata

  • Download URL: ollama_chat-0.9.31.tar.gz
  • Size: 19.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.13.0rc3

File hashes

Hashes for ollama_chat-0.9.31.tar.gz:

  • SHA256: aa0df8756e42211d32b6038acbf27c7f2598cc8539495fb61e5eaca8cfc0bcd7
  • MD5: 21796c435fb382fc83bf4d5ea49ce7e7
  • BLAKE2b-256: 221676b389aba3f070be3f2ccb075a63fa79a027a63ed739abcdda9a53ed9b38


File details

Details for the file ollama_chat-0.9.31-py3-none-any.whl.

File metadata

  • Download URL: ollama_chat-0.9.31-py3-none-any.whl
  • Size: 21.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.13.0rc3

File hashes

Hashes for ollama_chat-0.9.31-py3-none-any.whl:

  • SHA256: 289f8612cc778247d88b134531df9d2d6837f757b3d4fc53b051fef35bffaf39
  • MD5: 83e59bc06b3904559767b61ed1bb9565
  • BLAKE2b-256: 509f8a9fa19351b7e3e29d6b469f0a57a7333f2a6af62569b27015ef308f5282

