# Haiku
A clean, elegant command-line interface for chatting with local AI models via Ollama, featuring real-time Markdown rendering.
## Features
- 💬 Interactive chat interface with Ollama models
- 📝 Real-time Markdown rendering (code blocks, tables, lists, etc.)
- 🔄 Full conversation context preservation
- 🎯 Customizable system prompts
- 🌡️ Adjustable temperature settings
- 💾 Conversation saving to files
## Installation

```shell
pip install haiku-ollama
```

Make sure you have Ollama installed and running before using Haiku.
## Usage

Start a conversation with the default model (llama3.1:8b):

```shell
haiku
```
### Command Line Options
| Option | Description |
|---|---|
| `--model` | Specify which Ollama model to use (default: `llama3.1:8b`) |
| `--keep-context`, `-k` | Maintain full conversation history between prompts |
| `--system`, `-s` | Set a custom system prompt to guide the model's behavior |
| `--temperature`, `-t` | Set the sampling temperature (0.0-1.0); lower values are more deterministic |
| `--save` | Save the conversation to the specified file |
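The context-preservation behavior behind `--keep-context` can be sketched roughly as follows. This is a hypothetical illustration, not the package's actual source; only the message-dict shape (`{"role": ..., "content": ...}`) is taken from the `ollama` Python library's chat API.

```python
# Sketch of how --keep-context might assemble the message list for each
# model call. The {"role": ..., "content": ...} shape is the format the
# ollama Python library's chat() API expects; build_messages itself is a
# hypothetical helper, not Haiku's actual implementation.

def build_messages(history, user_prompt, system=None, keep_context=True):
    """Assemble the message list sent to the model for the next turn."""
    messages = []
    if system:
        messages.append({"role": "system", "content": system})
    if keep_context:
        messages.extend(history)  # replay prior user/assistant turns
    messages.append({"role": "user", "content": user_prompt})
    return messages

history = [
    {"role": "user", "content": "What is a haiku?"},
    {"role": "assistant", "content": "A three-line poem..."},
]
msgs = build_messages(history, "Write one about autumn.",
                      system="You are a poet.")
print(len(msgs))  # 4: system prompt + two history turns + new user prompt

# A real call would then look something like:
# response = ollama.chat(model="llama3.1:8b", messages=msgs, stream=True)
```

Without `--keep-context`, `history` would simply be left out, so each prompt stands alone.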
## Examples
Using a specific model:

```shell
haiku --model mistral:7b
```

Preserving conversation context:

```shell
haiku --keep-context
```

Setting a system prompt:

```shell
haiku --system "You are an expert programmer who explains code concisely"
```

Adjusting temperature:

```shell
haiku --temperature 0.2
```

Saving your conversation:

```shell
haiku --save conversation.md
```

Combining multiple options:

```shell
haiku --model codellama --keep-context --system "You write Python code" --temperature 0.3 --save coding_session.md
```
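For readers curious how a CLI like this wires up its flags, here is a minimal `argparse` sketch covering the options from the table above. The option names and `--model`'s default come from this document; the other defaults (e.g. the fallback temperature) are guesses, and none of this is Haiku's actual source.

```python
# Hypothetical argparse wiring for Haiku's command-line options.
# Option names match the table above; defaults other than --model's
# are assumptions for illustration.
import argparse

parser = argparse.ArgumentParser(prog="haiku")
parser.add_argument("--model", default="llama3.1:8b",
                    help="Ollama model to use")
parser.add_argument("--keep-context", "-k", action="store_true",
                    help="maintain full conversation history between prompts")
parser.add_argument("--system", "-s", default=None,
                    help="custom system prompt")
parser.add_argument("--temperature", "-t", type=float, default=0.7,
                    help="sampling temperature (0.0-1.0)")
parser.add_argument("--save", default=None,
                    help="file to save the conversation to")

# Parsing the combined example from above:
args = parser.parse_args(
    ["--model", "codellama", "--keep-context",
     "--system", "You write Python code",
     "--temperature", "0.3", "--save", "coding_session.md"])
print(args.model, args.keep_context, args.temperature, args.save)
```

Note that `argparse` maps `--keep-context` to the attribute `args.keep_context`, and `type=float` converts the `"0.3"` string before it reaches the program.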
## Exiting

To exit the program, type `exit` or `bye`, or press Ctrl+C.
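The exit-word check is simple enough to sketch in a few lines. This is a hypothetical reimplementation (the document only says `exit` and `bye` end the session; case-insensitive matching is an assumption, and Ctrl+C would be handled separately via `KeyboardInterrupt`):

```python
# Hypothetical sketch of the exit-word check; case-insensitive and
# whitespace-tolerant matching is an assumption, not documented behavior.
def is_exit_command(line: str) -> bool:
    return line.strip().lower() in {"exit", "bye"}

print(is_exit_command("exit"))   # True
print(is_exit_command("haiku"))  # False
```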
## Requirements

- Python 3.8+
- Ollama installed and running
- Python packages: `ollama`, `rich`
## Contributing

Contributions are welcome! Feel free to submit issues or pull requests.

## License

MIT License
## Download files

- Source distribution: haiku_ollama-0.1.0.tar.gz
- Built distribution: haiku_ollama-0.1.0-py3-none-any.whl
### File details: haiku_ollama-0.1.0.tar.gz
- Download URL: haiku_ollama-0.1.0.tar.gz
- Size: 4.7 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.11.7
File hashes:

| Algorithm | Hash digest |
|---|---|
| SHA256 | e7d1464105dab07b77d4d0ecd8ebaeb1630394a4a5e804f9a359031e8c775286 |
| MD5 | 9ff53c13cef3408a88684f8e4c74b590 |
| BLAKE2b-256 | 31ea08af70e7a19af38c2d0500e7a077b6c94c213830ac2a230340e23fd57e8d |
### File details: haiku_ollama-0.1.0-py3-none-any.whl
- Download URL: haiku_ollama-0.1.0-py3-none-any.whl
- Size: 5.0 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.11.7
File hashes:

| Algorithm | Hash digest |
|---|---|
| SHA256 | be404b05eda6c53b2f6573d81e0c0714c4830aca339b249476f91e30a2be622d |
| MD5 | 3908520759f987eec4a130f13b72741f |
| BLAKE2b-256 | 1b6d2da0057f0e8bea2a3270f57fd275f02dc52dfb1ac235e13d48637c628bf4 |