
Project description

oterm

the text-based terminal client for Ollama.

Features

  • intuitive and simple terminal UI; no need to run servers or frontends, just type oterm in your terminal.
  • multiple persistent chat sessions, stored together with the context embeddings and template/system prompt customizations in SQLite.
  • can use any of the models you have pulled in Ollama, or your own custom models.
  • allows for easy customization of the model's template, system prompt and parameters.

Installation

Using Homebrew on macOS:

brew tap ggozad/formulas
brew install ggozad/formulas/oterm

Using pip:

pip install oterm

Usage

In order to use oterm you will need to have the Ollama server running. By default it expects to find the Ollama API at http://localhost:11434/api. If you are running Ollama inside Docker or on a different host/port, use the OLLAMA_URL environment variable to customize the API URL. Setting OTERM_VERIFY_SSL to False will disable SSL verification.

OLLAMA_URL=http://host:port/api
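
For example, to point oterm at an Ollama instance running on a different host (the address below is only an illustration, not a value taken from this project), you could export the variables before launching:

export OLLAMA_URL=http://192.168.1.10:11434/api
export OTERM_VERIFY_SSL=False   # only needed if you want to skip SSL verification
oterm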

The following keyboard shortcuts are available:

  • ctrl+n - create a new chat session

  • ctrl+r - rename the current chat session

  • ctrl+x - delete the current chat session

  • ctrl+t - toggle between dark/light theme

  • ctrl+q - quit

  • ctrl+l - switch to multiline input mode

  • ctrl+p - select an image to include with the next message

Customizing models

When creating a new chat, you may not only select the model, but also customize the template as well as the system instruction to pass to the model. Checking the JSON output checkbox will cause the model to reply in JSON format. Please note that oterm will not (yet) pull models for you; use ollama to do that. All the models you have pulled or created will be available to oterm.
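
For instance, a model is made available to oterm by pulling it with the Ollama CLI first; the model name below is just an example:

ollama pull mistral
ollama list   # anything listed here will show up in oterm's model selection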

Chat session storage

All your chat sessions are stored locally in a SQLite database. You can find the location of the database by running oterm --db.
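
For example (the path below is a placeholder; use whatever location oterm --db reports on your machine):

oterm --db                                  # prints the location of the SQLite database
sqlite3 /path/to/oterm/store.db ".tables"   # optional: inspect the schema with the sqlite3 CLI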

Screenshots

Screenshots of the Chat, Model selection and Image selection screens.

License

This project is licensed under the MIT License.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

oterm-0.1.17.tar.gz (16.0 kB)

Uploaded Source

Built Distribution

oterm-0.1.17-py3-none-any.whl (20.6 kB)

Uploaded Python 3

File details

Details for the file oterm-0.1.17.tar.gz.

File metadata

  • Download URL: oterm-0.1.17.tar.gz
  • Upload date:
  • Size: 16.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.7.1 CPython/3.10.13 Linux/6.2.0-1018-azure

File hashes

Hashes for oterm-0.1.17.tar.gz
  • SHA256: 5b82054b23ba236dc42f6bc8d547d338249d772cf0f6b54397a2f46713db3b9f
  • MD5: cbe977f1f0f4cd0d9ca2e4604f24b5b4
  • BLAKE2b-256: 6bb2379e58a8251a674f4833101625a29ca02e76f5f2d78f4334b5f78e19ddd0

See more details on using hashes here.
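
For example, you could check the integrity of the downloaded source distribution by computing its SHA256 digest with the standard sha256sum tool (shasum -a 256 on macOS) and comparing it against the value above:

sha256sum oterm-0.1.17.tar.gz
# expected: 5b82054b23ba236dc42f6bc8d547d338249d772cf0f6b54397a2f46713db3b9f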

File details

Details for the file oterm-0.1.17-py3-none-any.whl.

File metadata

  • Download URL: oterm-0.1.17-py3-none-any.whl
  • Upload date:
  • Size: 20.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.7.1 CPython/3.10.13 Linux/6.2.0-1018-azure

File hashes

Hashes for oterm-0.1.17-py3-none-any.whl
  • SHA256: 0d6ba1e57635e723ad46612dcd3ccff5f22a5b41cedbff14008af98fea5b5120
  • MD5: 950b665b137b711e5e1de0b5f0d9d995
  • BLAKE2b-256: 75a821a8ff5518b973045a673321d37b9f4ecd1a40abe318f87934d4518cf498

See more details on using hashes here.
