
An Ollama chat web application

Project description

ollama-chat


Ollama Chat is a web chat client for Ollama that allows you to chat locally (and privately) with Large Language Models (LLMs).

Features

  • Platform independent - tested on macOS, Windows, and Linux
  • Chat with any local Ollama model
  • Save conversations for later viewing and interaction
  • Single and multiline prompts
  • Regenerate the most recent conversation response
  • Delete the most recent conversation exchange
  • View responses as Markdown or text
  • Save conversations as Markdown text
  • Multiple concurrent chats

Installation

To get up and running with Ollama Chat, follow these steps:

  1. Install and start Ollama (a quick reachability check is sketched after these steps)

  2. Install Ollama Chat

    pip install ollama-chat
    
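Ollama Chat talks to a locally running Ollama server. If you want to confirm that Ollama is up before installing or launching the chat client, a small check like the following works; it is only a sketch and assumes Ollama's default listen address of http://localhost:11434:

    # Quick reachability check for a local Ollama server (a sketch, not part of Ollama Chat).
    # Assumes Ollama's default listen address, http://localhost:11434.
    import urllib.error
    import urllib.request

    OLLAMA_URL = "http://localhost:11434"

    def ollama_is_running(url=OLLAMA_URL, timeout=2):
        """Return True if an Ollama server answers at the given URL."""
        try:
            with urllib.request.urlopen(url, timeout=timeout) as response:
                return response.getcode() == 200
        except (urllib.error.URLError, OSError):
            return False

    if __name__ == "__main__":
        state = "running" if ollama_is_running() else "not reachable"
        print(f"Ollama is {state}")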

Updating

To update Ollama Chat:

pip install -U ollama-chat

Start Ollama Chat

To start Ollama Chat, open a terminal prompt and run the Ollama Chat application:

ollama-chat

A web browser launches and opens the Ollama Chat web application.

By default, a configuration file, "ollama-chat.json", is created in the user's home directory.
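The file's schema is described in the Ollama Chat File Format reference listed under "File Format and API Documentation" below. As a minimal sketch, assuming the file has already been created by a first run, it can be located and inspected like this:

    # Minimal sketch: locate and inspect the Ollama Chat configuration file.
    # Assumes the default location described above (the user's home directory).
    import json
    from pathlib import Path

    config_path = Path.home() / "ollama-chat.json"

    if config_path.is_file():
        # The file is JSON; see the Ollama Chat File Format documentation for its schema.
        config = json.loads(config_path.read_text(encoding="utf-8"))
        print(f"Loaded {config_path} with top-level keys: {sorted(config)}")
    else:
        print(f"No configuration file yet at {config_path}")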

Start Conversation from CLI

To start a conversation from the command line, use the -m argument:

ollama-chat -m "Why is the sky blue?"

File Format and API Documentation

  • Ollama Chat File Format
  • Ollama Chat API

Development

This package is developed using python-build. It was started using python-template as follows:

template-specialize python-template/template/ ollama-chat/ -k package ollama-chat -k name 'Craig A. Hobbs' -k email 'craigahobbs@gmail.com' -k github 'craigahobbs' -k noapi 1

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

ollama_chat-0.9.21.tar.gz (15.4 kB, Source)

Built Distribution

ollama_chat-0.9.21-py3-none-any.whl (16.2 kB, Python 3)

File details

Details for the file ollama_chat-0.9.21.tar.gz.

File metadata

  • Download URL: ollama_chat-0.9.21.tar.gz
  • Upload date:
  • Size: 15.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.12.5

File hashes

Hashes for ollama_chat-0.9.21.tar.gz

  • SHA256: 5149c0209fc646c9c8bbe4d96e0017f42bfe69d87b5c946c75b9552871cd238f
  • MD5: ff10512af693ee730e1707c5d4ea25cb
  • BLAKE2b-256: a47dc4b87a47963849ce71b9b1057c7447fbd83f9020b01bb79f28feff94f41c

See more details on using hashes here.
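
As an illustrative sketch (not a PyPI-provided tool), the downloaded source archive can be verified against the SHA256 digest listed above with Python's standard hashlib module; the archive is assumed to be in the current directory:

    # Sketch: verify a downloaded file against its published SHA256 digest.
    import hashlib
    from pathlib import Path

    # SHA256 digest published above for ollama_chat-0.9.21.tar.gz.
    EXPECTED_SHA256 = "5149c0209fc646c9c8bbe4d96e0017f42bfe69d87b5c946c75b9552871cd238f"

    # Assumed download location of the source distribution.
    archive = Path("ollama_chat-0.9.21.tar.gz")

    digest = hashlib.sha256(archive.read_bytes()).hexdigest()
    if digest == EXPECTED_SHA256:
        print("SHA256 digest matches the published value")
    else:
        print(f"SHA256 mismatch: got {digest}")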

File details

Details for the file ollama_chat-0.9.21-py3-none-any.whl.

File metadata

  • Download URL: ollama_chat-0.9.21-py3-none-any.whl
  • Upload date:
  • Size: 16.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.12.5

File hashes

Hashes for ollama_chat-0.9.21-py3-none-any.whl

  • SHA256: 46e06142ebf6cb3ecd4c7b0c03be5ebed78c164d4eb3f94c7e064d1ed4b43544
  • MD5: 81ad0cb3d69cd2a157840a8bb28e1523
  • BLAKE2b-256: 1ae8744547ad8f1dc67085b05bff1e56c483839d3d70dc341b51ff958e737c3f

See more details on using hashes here.
