
Project description

sai

Simple AI interface to chat with your Ollama models from the terminal

Features

  • Pretty-prints real-time responses in Markdown, using the rich library.
  • Keeps conversation context.
  • Auto-detects available models, with an option to select one.
  • Supports custom prompts (roles).
  • Persists conversations across sessions.

Requirements

An Ollama instance is required to get access to local models. By default, the URL is set to http://localhost:11434.
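Assuming Ollama is running at the default URL, you can verify it is reachable before launching sai; Ollama's `/api/tags` endpoint lists the models installed locally:

```shell
# Check the Ollama server at the default URL (prints "Ollama is running" on success)
curl -fsS http://localhost:11434 || echo "Ollama is not reachable"

# List locally installed models as JSON via Ollama's /api/tags endpoint
curl -fsS http://localhost:11434/api/tags || echo "Ollama is not reachable"
```

If Ollama listens on a different host or port, adjust the URL here and in `/setup` accordingly.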

Install

You can install it with any package manager you prefer, such as pip, but the recommended way is as a uv tool.

Recommended

Using uv:

uv tool install sai-chat

Usage

Start using it in your terminal simply by running the sai command:

luis@laptop:~ $ sai
╭───────────────────────────────────────────────────────╮
│ Welcome to Sai. Chat with your local LLM models.      │
│                                                       │
│ Available commands:                                   │
│                                                       │
│   /setup : Setup Ollama URL and preferences           │
│   /model : Select a model                             │
│   /roles : List and select a role                     │
│   /role add : Create a new custom role                │
│   /role delete : Delete a custom role                 │
│   /help : Show this help message                      │
│   /quit : Exit the application                        │
╰───────────────────────────────────────────────────────╯
> hi
╭────────────────────────────────────── LLM Response  ─╮
│ Hi there! How can I help you today? 😊                │
╰────────────────────── gemma3:1b ──────────────────────╯
> 

Status

This project is under development. Feel free to contribute or provide feedback!

Download files

Download the file for your platform.

Source Distribution

sai_chat-0.1.3.tar.gz (6.4 kB)

Built Distribution

sai_chat-0.1.3-py3-none-any.whl (8.8 kB)

File details

Details for the file sai_chat-0.1.3.tar.gz.

File metadata

  • Download URL: sai_chat-0.1.3.tar.gz
  • Upload date:
  • Size: 6.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.5.13

File hashes

Hashes for sai_chat-0.1.3.tar.gz

  • SHA256: 1060d06782b1064d3427a437f02caed31ec94063da47c4dcecc8d1c7b8eb2f54
  • MD5: 28abda0b5bb428d5e419190722c7b384
  • BLAKE2b-256: f59b4b1286a951db36eb64af7cccdfa41b89d00b2702bdb9058eac67a5ca281c
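If you download the sdist manually, you can check it against the SHA256 digest published above; a minimal sketch using coreutils `sha256sum` (on macOS, `shasum -a 256` is the equivalent):

```shell
# Compute the SHA256 digest of the downloaded archive and compare it to the
# published value; sha256sum prints "<digest>  <filename>"
[ -f sai_chat-0.1.3.tar.gz ] && sha256sum sai_chat-0.1.3.tar.gz \
  || echo "download sai_chat-0.1.3.tar.gz first"
```

The printed digest should match the SHA256 value listed for the file; a mismatch means the download is corrupt or has been tampered with.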


File details

Details for the file sai_chat-0.1.3-py3-none-any.whl.

File metadata

  • Download URL: sai_chat-0.1.3-py3-none-any.whl
  • Upload date:
  • Size: 8.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.5.13

File hashes

Hashes for sai_chat-0.1.3-py3-none-any.whl

  • SHA256: d5f7f8db061dc032db4986b6b344101216987f80289d6e5b042574c3560e4d4f
  • MD5: 0307652e6913a36d16370c8c1635a956
  • BLAKE2b-256: 315c17637b79f2fed211b35cf7995372ec721a663fe299687744dfae3481a5d1

