ollama-rich
A feature-rich Ollama client with enhanced terminal UI using the Rich library.
Features
- List available Ollama models in a beautiful table
- Chat with models directly from the terminal
- Stream responses live with markdown rendering
- Easy-to-use CLI interface
- More coming soon
Requirements
- Python 3
- A running Ollama server
- The Rich library
Clone the Repository
To get the source code, clone this repository:
git clone https://github.com/yourusername/ollama-rich.git
cd ollama-rich
Installation
pip install .
Usage
List Models
ollama-rich models
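Under the hood, a client like this typically queries the Ollama server's `/api/tags` endpoint and formats the result as table rows. A minimal sketch of that step, assuming the documented Ollama REST API and a server on the default port; the function names are illustrative, not the package's actual internals:

```python
import json
from urllib.request import urlopen

OLLAMA_URL = "http://localhost:11434"  # default Ollama server address

def model_rows(tags_response: dict) -> list[tuple[str, str]]:
    """Turn an /api/tags JSON response into (name, size) table rows."""
    rows = []
    for model in tags_response.get("models", []):
        size_mb = model.get("size", 0) / 1_000_000
        rows.append((model["name"], f"{size_mb:.0f} MB"))
    return rows

def list_models() -> list[tuple[str, str]]:
    """Fetch the installed model list from a locally running Ollama server."""
    with urlopen(f"{OLLAMA_URL}/api/tags") as resp:
        return model_rows(json.load(resp))
```

Rows like these can then be fed into a Rich `Table` for the formatted terminal output.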
Chat with a Model
ollama-rich chat <model> "Your message here"
Stream Chat Response
ollama-rich chat <model> "Your message here" --stream
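With `--stream`, the client consumes Ollama's newline-delimited JSON chat stream and renders text as it arrives. A hedged sketch of that loop, assuming the documented `/api/chat` streaming format; the helper name is hypothetical:

```python
import json
from typing import Iterable, Iterator

def stream_chunks(lines: Iterable[bytes]) -> Iterator[str]:
    """Yield assistant text pieces from Ollama's NDJSON chat stream."""
    for line in lines:
        if not line.strip():
            continue  # skip keep-alive blank lines
        payload = json.loads(line)
        if payload.get("done"):
            break  # final object carries stats, no new text
        yield payload.get("message", {}).get("content", "")

# The CLI would POST {"model": ..., "messages": [...], "stream": true}
# to http://localhost:11434/api/chat, pass the response lines here, and
# re-render the accumulated text as live markdown with Rich.
```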
Download files
Download the file for your platform.
Source Distributions
No source distribution files are available for this release.
Built Distribution
File details
Details for the file ollama_rich-0.1.0-py3-none-any.whl.
File metadata
- Download URL: ollama_rich-0.1.0-py3-none-any.whl
- Upload date:
- Size: 4.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.13.3
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 9c07020e8024a492ae8b6f9ab2450752d4279cb1bf01df3e2fcf40b60ff1d67a |
| MD5 | 03eec3751b957f352f6e28c6e54b1faf |
| BLAKE2b-256 | a46a20a239b212b3a8e354dcca2deba7ab2af941a0f75b6182116d205aeeec67 |