Natter
A terminal-based Ollama chat interface
A wee experimental terminal-based chat client, built to mess around with the Python Ollama library.
Motivation
There are plenty of good, very comprehensive terminal-based LLM clients out there. Most of them are feature-rich and busy, and most concentrate on cloud-hosted backend engines that raise privacy concerns.
I wanted to toy with something that was simple, direct, to the point, and locally based.
This is also a tinker toy: a way to experiment with the Python Ollama library it's built around, and with how I can make use of a locally-hosted LLM.
Project details
Download files
Download the file for your platform.
Source Distribution
natter-0.0.4.tar.gz (10.1 kB)
Built Distribution
natter-0.0.4-py3-none-any.whl (14.8 kB)