
Project description

sai

Simple AI interface to chat with your Ollama models from the terminal

Features

  • Pretty-prints real-time responses in Markdown, using the rich library.
  • Keeps conversation context across turns.
  • Autodetects available models, with an option to select one.
  • Supports custom prompts.
  • Supports custom roles (reusable prompts).
  • Preloads models to improve response time.
  • Persists conversations across sessions.

Requirements

An Ollama instance is required to access local models. By default, the URL is set to http://localhost:11434.
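If you are unsure whether Ollama is reachable, a quick check against the default URL can be sketched in Python (the helper name and timeout are illustrative and not part of sai):

```python
import urllib.request
import urllib.error

OLLAMA_URL = "http://localhost:11434"  # sai's default Ollama endpoint

def ollama_is_up(url: str = OLLAMA_URL, timeout: float = 2.0) -> bool:
    """Return True if a server answers with HTTP 200 at `url`."""
    try:
        # A running Ollama instance responds to GET / on its base URL
        with urllib.request.urlopen(url, timeout=timeout) as response:
            return response.status == 200
    except (urllib.error.URLError, OSError):
        return False
```

With `ollama serve` running locally, `ollama_is_up()` should return True; otherwise run `/setup` inside sai to point it at a different URL.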

Install

You can install it with any package manager you prefer, such as pip, but the recommended way is the uv tool.

Recommended

Using uv:

uv tool install sai-chat

Usage

Start using it in your terminal by running the sai command:

luis@laptop:~ $ sai
╭───────────────────────────────────────────────────────╮
│ Welcome to Sai. Chat with your local LLM models.      │
│                                                       │
│ Available commands:                                   │
│                                                       │
│   /setup : Setup Ollama URL and preferences           │
│   /model : Select a model                             │
│   /roles : List and select a role                     │
│   /role add : Create a new custom role                │
│   /role delete : Delete a custom role                 │
│   /help : Show this help message                      │
│   /quit : Exit the application                        │
╰───────────────────────────────────────────────────────╯
> hi
╭───────────────────────────────── Virtual Assistant  ─╮
│ Hi there! How can I help you today? 😊                │
╰────────────────────── gemma3:1b ──────────────────────╯
> 

Status

This project is under development. Feel free to contribute or provide feedback!

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

sai_chat-0.1.5.tar.gz (7.4 kB)

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

sai_chat-0.1.5-py3-none-any.whl (9.9 kB)

File details

Details for the file sai_chat-0.1.5.tar.gz.

File metadata

  • Download URL: sai_chat-0.1.5.tar.gz
  • Upload date:
  • Size: 7.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.5.13

File hashes

Hashes for sai_chat-0.1.5.tar.gz
  • SHA256: 7f77a71820272c0543de7076edcab4f3ef549ec9eeb2707fed62e3f813ee8cd2
  • MD5: 7045e3c26b02bdfc8c2f282edea50ea0
  • BLAKE2b-256: d77264d670f0bcf084bc761412751f558db20516b942e2eac85b5c6da0678844

See more details on using hashes here.
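One way to check a downloaded file against the digests published above is a short hashlib sketch (the file path argument is whatever you downloaded; the helper name is illustrative):

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 8192) -> str:
    """Compute the SHA256 hex digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk_size), b""):
            digest.update(block)
    return digest.hexdigest()

# Compare the result against the published SHA256 before installing, e.g.:
# sha256_of("sai_chat-0.1.5.tar.gz") == "<digest from this page>"
```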

File details

Details for the file sai_chat-0.1.5-py3-none-any.whl.

File metadata

  • Download URL: sai_chat-0.1.5-py3-none-any.whl
  • Upload date:
  • Size: 9.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.5.13

File hashes

Hashes for sai_chat-0.1.5-py3-none-any.whl
  • SHA256: cdc97575870d6c4091209d1c7f781749dd8c7191e9956c0a053619816eb53353
  • MD5: 0c248c07b5550d16c3a4f6b972885990
  • BLAKE2b-256: 237633cb36d40de35ba7815751cc5b8373fd9133ea95901ea85f366af2af8a2e

See more details on using hashes here.
