Project description

sai

Simple AI interface to chat with your Ollama models from the terminal

Features

  • Pretty-prints real-time responses in Markdown, using the rich library.
  • Keeps conversation context across turns.
  • Auto-detects available models, with an option to select one.
  • Supports custom prompts.
  • Supports custom roles (reusable prompts).
  • Preloads models to improve response time.
  • Persists conversations across sessions.

Requirements

An Ollama instance is required to get access to local models. By default, the URL is set to http://localhost:11434.
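sai handles the connection itself, but if you want to confirm that your Ollama instance is reachable before launching, you can query Ollama's /api/tags endpoint, which lists the locally installed models. A minimal sketch (`ollama_reachable` is an illustrative helper, not part of sai):

```python
import json
import urllib.request
import urllib.error


def ollama_reachable(base_url: str = "http://localhost:11434",
                     timeout: float = 2.0) -> bool:
    """Return True if an Ollama server answers at base_url.

    /api/tags is Ollama's endpoint for listing installed models,
    so a valid JSON response with a "models" key means the server
    is up and usable.
    """
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags",
                                    timeout=timeout) as resp:
            data = json.load(resp)
            return "models" in data
    except (urllib.error.URLError, OSError, ValueError):
        # Connection refused, timeout, or non-JSON response.
        return False
```

If this returns False, start Ollama (or point sai at the correct URL via /setup) before chatting.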

Install

You can install it with any package manager you prefer, such as pip, but the recommended way is the uv tool.

Recommended

Using uv:

uv tool install sai-chat

Usage

Start it from your terminal by running the sai command:

luis@laptop:~ $ sai
╭───────────────────────────────────────────────────────╮
│ Welcome to Sai. Chat with your local LLM models.      │
│                                                       │
│ Available commands:                                   │
│                                                       │
│   /setup : Setup Ollama URL and preferences           │
│   /model : Select a model                             │
│   /roles : List and select a role                     │
│   /role add : Create a new custom role                │
│   /role delete : Delete a custom role                 │
│   /help : Show this help message                      │
│   /quit : Exit the application                        │
╰───────────────────────────────────────────────────────╯
> hi
╭────────────────────────────────── Virtual Assistant ──╮
│ Hi there! How can I help you today? 😊                │
╰────────────────────── gemma3:1b ──────────────────────╯
> 
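Under the hood, a client like sai streams Ollama's chat responses as newline-delimited JSON and renders the text as it arrives. A minimal sketch of accumulating such a stream (the NDJSON shape shown is Ollama's /api/chat streaming format; `collect_stream` is an illustrative helper, not sai's actual code):

```python
import json
from typing import Iterable


def collect_stream(lines: Iterable[bytes]) -> str:
    """Accumulate assistant text from an Ollama NDJSON chat stream.

    Each line is one JSON object; the partial text lives at
    message.content and the final object carries "done": true.
    """
    parts = []
    for raw in lines:
        obj = json.loads(raw)
        parts.append(obj.get("message", {}).get("content", ""))
        if obj.get("done"):
            break
    return "".join(parts)


# Example with two stream chunks, as Ollama would send them:
sample = [
    b'{"message":{"role":"assistant","content":"Hi "},"done":false}',
    b'{"message":{"role":"assistant","content":"there!"},"done":true}',
]
print(collect_stream(sample))  # Hi there!
```

A real client would render each chunk as it arrives (sai does this through rich) rather than waiting for the full reply.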

Status

This project is under development. Feel free to contribute or provide feedback!

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

sai_chat-0.1.4.tar.gz (6.9 kB)


Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

sai_chat-0.1.4-py3-none-any.whl (9.4 kB)


File details

Details for the file sai_chat-0.1.4.tar.gz.

File metadata

  • Download URL: sai_chat-0.1.4.tar.gz
  • Upload date:
  • Size: 6.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.5.13

File hashes

Hashes for sai_chat-0.1.4.tar.gz

Algorithm    Hash digest
SHA256       800596c370ab83e8c652e33864cbcdb85a4ab6e428ba4ac9bdd304cacd38c0aa
MD5          18d4f22d6ec7e838e1eb25a47ad64438
BLAKE2b-256  92495267a7e76b757d558ba9a5ec57e773b5579f38d8d84a026400730b4dee6c

See more details on using hashes here.
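To verify a downloaded file against the digests above, you can hash it locally with Python's standard hashlib module. A minimal sketch (`sha256_of` is an illustrative helper):

```python
import hashlib


def sha256_of(path: str, chunk_size: int = 65536) -> str:
    """Compute the SHA256 hex digest of a file, reading in chunks
    so large files never need to fit in memory at once."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()
```

Compare the result of `sha256_of("sai_chat-0.1.4.tar.gz")` to the SHA256 digest listed above; a mismatch means the download is corrupted or tampered with.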

File details

Details for the file sai_chat-0.1.4-py3-none-any.whl.

File metadata

  • Download URL: sai_chat-0.1.4-py3-none-any.whl
  • Upload date:
  • Size: 9.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.5.13

File hashes

Hashes for sai_chat-0.1.4-py3-none-any.whl

Algorithm    Hash digest
SHA256       0e7d32414be932d78f4d33d860acfb09906362a7a12799d20d6564521510a0ce
MD5          294020fa7294750efd34f022fbd857c2
BLAKE2b-256  e7dad6e68781a3c6aae4caf922dcc771a1a7758c164001736b22436e4e095490

See more details on using hashes here.
