
Project description

oterm

the text-based terminal client for Ollama.

Features

  • intuitive and simple terminal UI; no need to run servers or frontends, just type oterm in your terminal.
  • multiple persistent chat sessions, stored together with context embeddings and template/system prompt customizations in SQLite.
  • can use any of the models you have pulled in Ollama, or your own custom models.
  • allows for easy customization of the model's template, system prompt and parameters.

Installation

Using brew on macOS:

brew tap ggozad/formulas
brew install ggozad/formulas/oterm

Using pip:

pip install oterm

Usage

In order to use oterm you will need to have the Ollama server running. By default it expects to find the Ollama API at http://0.0.0.0:11434/api. If you are running Ollama inside Docker or on a different host/port, use the OLLAMA_HOST environment variable to customize the host/port. Alternatively, you can use OLLAMA_URL to specify the full http(s) URL. Setting OTERM_VERIFY_SSL to False will disable SSL verification.

OLLAMA_URL=http://host:port/api
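
For example, assuming Ollama is reachable on a non-default host and port (the host, port and URL below are placeholders, not values shipped with oterm), you could start oterm like this:

OLLAMA_HOST=192.168.1.100:11434 oterm

or, pointing at a full URL while skipping certificate verification for a self-signed certificate:

OLLAMA_URL=https://ollama.example.com/api OTERM_VERIFY_SSL=False oterm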

The following keyboard shortcuts are supported:

  • Ctrl+N - create a new chat session
  • Ctrl+E - edit the chat session (change template, system prompt or format)
  • Ctrl+R - rename the current chat session
  • Ctrl+S - export the current chat session as markdown
  • Ctrl+X - delete the current chat session
  • Ctrl+T - toggle between dark/light theme
  • Ctrl+Q - quit
  • Ctrl+L - switch to multiline input mode
  • Ctrl+P - select an image to include with the next message
  • Up/Down - navigate through the history of previous prompts

While Ollama is inferring the next message, you can press Esc to cancel the inference.

Note that some of the shortcuts may not work in certain contexts, for example pressing the up arrow while the prompt is in multi-line mode.

Customizing models

When creating a new chat, you may not only select the model, but also customize the template as well as the system instruction to pass to the model. Checking the JSON output checkbox will cause the model to reply in JSON format. Please note that oterm will not (yet) pull models for you; use ollama to do that. All the models you have pulled or created will be available to oterm.

You can also "edit" the chat to change the template, system prompt or format. Note that the model cannot be changed once the chat has started. In addition, whatever "context" the chat had (an embedding of the previous messages) will be kept.

Chat session storage

All your chat sessions are stored locally in a SQLite database. You can find the location of the database by running oterm --db.
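
For example, a quick check of where your sessions live (the path shown is only illustrative; the actual location depends on your operating system and user):

oterm --db
/home/user/.local/share/oterm/store.db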

Screenshots

[Screenshots: chat, model selection, image selection]

License

This project is licensed under the MIT License.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

oterm-0.2.3.tar.gz (19.2 kB)

Uploaded Source

Built Distribution

oterm-0.2.3-py3-none-any.whl (25.6 kB)

Uploaded Python 3

File details

Details for the file oterm-0.2.3.tar.gz.

File metadata

  • Download URL: oterm-0.2.3.tar.gz
  • Upload date:
  • Size: 19.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.8.1 CPython/3.10.13 Linux/6.5.0-1015-azure

File hashes

Hashes for oterm-0.2.3.tar.gz

  • SHA256: 37eab2a2185e1cf6df9ebeda67e09e52844a32bf4c8b1cc37e9862e3b59659b4
  • MD5: 6e49ef2c0eb94c07eccfe78675a9255e
  • BLAKE2b-256: 1aaaa7b030847a5a8d10d0311c4d9c6d29e1cb122bdb7021f1a46a3b783a0911

See more details on using hashes here.

File details

Details for the file oterm-0.2.3-py3-none-any.whl.

File metadata

  • Download URL: oterm-0.2.3-py3-none-any.whl
  • Upload date:
  • Size: 25.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.8.1 CPython/3.10.13 Linux/6.5.0-1015-azure

File hashes

Hashes for oterm-0.2.3-py3-none-any.whl

  • SHA256: b6c7b8660642a9699fb7ae60bdccaf53d5c20d63562b4614d5b480a9651c6b02
  • MD5: e7f2a88ffe6aaa31244fb0e3c3e23f82
  • BLAKE2b-256: 52d06b1b0e6b3d8f9fc236a967127b6e21d7b9205de22c67f337cc3682d103f9

See more details on using hashes here.
