
Project description

oterm

the text-based terminal client for Ollama.

Features

  • intuitive and simple terminal UI, no need to run servers or frontends; just type oterm in your terminal.
  • multiple persistent chat sessions, stored together with system prompt & parameter customizations in SQLite.
  • can use any of the models you have pulled in Ollama, or your own custom models.
  • allows for easy customization of the model's system prompt and parameters.

Installation

Using brew for macOS:

brew tap ggozad/formulas
brew install ggozad/formulas/oterm

Using yay (or any AUR helper) for Arch Linux:

yay -S oterm

Using pip:

pip install oterm

Usage

In order to use oterm you will need to have the Ollama server running. By default it expects to find the Ollama API at http://127.0.0.1:11434. If you are running Ollama inside Docker or on a different host/port, use the OLLAMA_HOST environment variable to customize the host/port. Alternatively, you can use OLLAMA_URL to specify the full http(s) URL. Setting OTERM_VERIFY_SSL to False will disable SSL verification.

OLLAMA_URL=http://host:port/api
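
For example, to point oterm at an Ollama instance on another machine, you could start it with something like the following (the hostname and port here are placeholders, not defaults):

OLLAMA_HOST=my-ollama-host:11434 oterm

or, using a full https URL with SSL verification disabled:

OTERM_VERIFY_SSL=False OLLAMA_URL=https://my-ollama-host:11434/api oterm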

To start oterm simply run:

oterm

Commands

By pressing Ctrl+p you can access the command palette, from which you can perform most chat management. The following commands are available:

  • New chat - create a new chat session
  • Edit chat parameters - edit the current chat session (change system prompt, parameters or format)
  • Rename chat - rename the current chat session
  • Export chat - export the current chat session as markdown
  • Delete chat - delete the current chat session

Keyboard shortcuts

The following keyboard shortcuts are supported:

  • Ctrl+t - toggle between dark/light theme

  • Ctrl+q - quit

  • Ctrl+l - switch to multiline input mode

  • Ctrl+i - select an image to include with the next message

  • ↑ / ↓ - navigate through the history of previous prompts

  • Ctrl+Tab - open the next chat

  • Ctrl+Shift+Tab - open the previous chat

In multiline mode, you can press Enter to send the message, or Shift+Enter to add a new line at the cursor.

While Ollama is inferring the next message, you can press Esc to cancel the inference.

Note that some of the shortcuts may not work in certain contexts, for example pressing ↑ while the prompt is in multi-line mode.

Copy / Paste

It is difficult to properly support copy/paste in terminal applications. You can copy blocks to your clipboard as follows:

  • clicking a message will copy it to the clipboard.
  • clicking a code block will only copy the code block to the clipboard.

For most terminals there exists a key modifier you can use to click and drag to manually select text. For example:

  • iTerm: Option key.
  • GNOME Terminal: Shift key.
  • Windows Terminal: Shift key.

Customizing models

When creating a new chat, you may not only select the model, but also customize the system instruction as well as the parameters (such as context length, seed, temperature, etc.) passed to the model. For a list of all supported parameters refer to the Ollama documentation. Checking the JSON output checkbox will force the model to reply in JSON format. Please note that oterm will not (yet) pull models for you; use ollama to do that. All the models you have pulled or created will be available to oterm.
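
For example, you could make a model available to oterm by pulling it with the Ollama CLI before starting a chat (the model name below is just an illustration):

ollama pull llama3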

You can also "edit" the chat to change the system prompt, parameters or format. Note that the model cannot be changed once the chat has started.

Chat session storage

All your chat sessions are stored locally in a SQLite database. You can customize the directory where the database is stored by setting the OTERM_DATA_DIR environment variable.

You can find the location of the database by running oterm --db.
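
For example, to keep the database in a custom directory and then print its location (the path below is only a placeholder):

OTERM_DATA_DIR=~/oterm-data oterm
oterm --db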

Screenshots

  • Chat
  • Model selection
  • Image selection

License

This project is licensed under the MIT License.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

oterm-0.5.0.tar.gz (4.3 MB)

Uploaded Source

Built Distribution

oterm-0.5.0-py3-none-any.whl (29.5 kB)

Uploaded Python 3

File details

Details for the file oterm-0.5.0.tar.gz.

File metadata

  • Download URL: oterm-0.5.0.tar.gz
  • Size: 4.3 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.10.14

File hashes

Hashes for oterm-0.5.0.tar.gz:

  • SHA256: 3e8e821c557d9ebffab044f0ca03e73a4b4a5f29def98766812e3bcb9874a94e
  • MD5: 9d625cc1af3182c7ceeaf693d37c7e12
  • BLAKE2b-256: 4f719b2ac8428578cb732963c74e1f9ff4bea53f11f2da3c236b100c66616306


File details

Details for the file oterm-0.5.0-py3-none-any.whl.

File metadata

  • Download URL: oterm-0.5.0-py3-none-any.whl
  • Size: 29.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.10.14

File hashes

Hashes for oterm-0.5.0-py3-none-any.whl:

  • SHA256: 3b7a0ccf502a5fde6377ca1861ea1557b662b6b73f30bddcea5ad7c135386b36
  • MD5: 5a4a8533d387a3891bea45595fceba83
  • BLAKE2b-256: 016a4a3b77efbe2baaf031c16f5112128bf9830e69bd6166c1b1062338903307

