
oterm

the text-based terminal client for Ollama.

Features

  • intuitive and simple terminal UI; no need to run servers or frontends, just type oterm in your terminal.
  • multiple persistent chat sessions, stored together with the context embeddings in SQLite.
  • can use any of the models you have pulled in Ollama, or your own custom models.
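To illustrate the idea of persisting chat sessions with their context in SQLite, here is a minimal sketch using Python's standard sqlite3 module. The table and column names are hypothetical, chosen for illustration; they are not oterm's actual schema.

```python
import json
import sqlite3

# Hypothetical schema: one row per chat session, with the model name and
# the session's context stored alongside it (NOT oterm's real schema).
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE chat (id INTEGER PRIMARY KEY, name TEXT, model TEXT, context TEXT)"
)

# Persist a session; the context is serialized to JSON for storage.
conn.execute(
    "INSERT INTO chat (name, model, context) VALUES (?, ?, ?)",
    ("my session", "mistral", json.dumps([1, 2, 3])),
)

# Restoring a session is a simple SELECT plus JSON decode.
name, model, context = conn.execute(
    "SELECT name, model, context FROM chat"
).fetchone()
print(name, model, json.loads(context))
```

Keeping everything in a single SQLite file is what lets sessions survive restarts without running any external service.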

Installation

Using Homebrew on macOS:

brew tap ggozad/formulas
brew install ggozad/formulas/oterm

Using pip:

pip install oterm

Usage

In order to use oterm you will need to have the Ollama server running. By default it expects to find the Ollama API at http://localhost:11434/api. If you are running Ollama inside Docker or on a different host/port, use the OLLAMA_URL environment variable to customize the API URL:

OLLAMA_URL=http://host:port/api

oterm will not (yet) pull models for you; please use ollama to do that. All the models you have pulled or created will be available to oterm.
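Putting the steps above together, a first session might look like the following. The URL shown is the default; adjust the host/port for your setup, and "mistral" is only an example model name.

```shell
# Point oterm at your Ollama instance (example URL; change host/port as needed)
export OLLAMA_URL=http://localhost:11434/api

# Pull a model with ollama first; oterm does not pull models itself
ollama pull mistral

# Start the client
oterm
```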

Screenshots

Chat (screenshot)
Model selection (screenshot)

License

This project is licensed under the MIT License.

Download files

Source distribution: oterm-0.1.2.tar.gz (10.1 kB)
Built distribution: oterm-0.1.2-py3-none-any.whl (12.9 kB)
