Ollama CLI prompt tool for local LLM code analysis
ollama-prompt
Quick Start • Documentation • Use Cases • Contributing
What is ollama-prompt?
A lightweight Python CLI that transforms Ollama into a powerful analysis tool with:
- Session persistence - Multi-turn conversations with full context
- Structured JSON output - Token counts, timing, and metadata
- File references - Inline local files with @file syntax
- Multi-agent orchestration - Perfect for subprocess workflows
Perfect for: Code review, analysis pipelines, agent systems, and cost-aware LLM workflows
Features
- Session Management - Persistent conversations across CLI invocations
- Rich Metadata - Full JSON output with token counts, timing, and cost tracking
- File References - Reference local files with @./path/to/file.py syntax
- Subprocess-Friendly - Designed for agent orchestration and automation
- Cloud & Local Models - Works with both Ollama cloud models and local instances
- Cross-Platform - Windows, macOS, Linux with Python 3.7+
Quick Start
Prerequisites: Ollama CLI installed (server starts automatically)
# 1. Install
pip install ollama-prompt
# 2. First question (creates session automatically)
ollama-prompt --prompt "What is 2+2?"
# 3. Follow-up with context
ollama-prompt --session-id <id-from-output> --prompt "What about 3+3?"
Session created automatically! See session_id in output.
Next steps: 5-Minute Tutorial | Full CLI Reference
Installation
PyPI (Recommended)
pip install ollama-prompt
Development Install
git clone https://github.com/dansasser/ollama-prompt.git
cd ollama-prompt
pip install -e .
Prerequisites
- Python 3.7 or higher
- Ollama installed and running
- For cloud models: ollama signin (one-time authentication)
Verify installation:
ollama-prompt --help
ollama list # Check available models
Full setup guide: Prerequisites Documentation
Usage
Basic Example
ollama-prompt --prompt "Explain Python decorators" \
--model deepseek-v3.1:671b-cloud
Multi-Turn Conversation
# First question
ollama-prompt --prompt "Who wrote Hamlet?" > out.json
# Follow-up (remembers context)
SESSION_ID=$(jq -r '.session_id' out.json)
ollama-prompt --session-id $SESSION_ID --prompt "When was he born?"
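The same chaining works from Python. A minimal sketch, assuming the JSON output contains a session_id field as shown above (any other fields in the sample are hypothetical):

```python
import json

def build_followup_cmd(prev_output_json: str, prompt: str) -> list:
    """Build the follow-up ollama-prompt invocation from a prior run's JSON output."""
    session_id = json.loads(prev_output_json)["session_id"]
    return ["ollama-prompt", "--session-id", session_id, "--prompt", prompt]

# Hypothetical first-run output; only session_id is documented here.
sample = '{"session_id": "abc123", "response": "Shakespeare"}'
cmd = build_followup_cmd(sample, "When was he born?")
```

Pass cmd to subprocess.run to execute the follow-up with full context.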
File Analysis
ollama-prompt --prompt "Review @./src/auth.py for security issues"
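For multi-file reviews, the @./path references can be assembled programmatically; the review_prompt helper below is a hypothetical convenience, not part of the CLI:

```python
def review_prompt(paths):
    """Join file paths into a single prompt using the @./path reference syntax."""
    refs = ", ".join(f"@./{p}" for p in paths)
    return f"Review {refs} for security issues"

prompt = review_prompt(["src/auth.py", "src/session.py"])
# prompt is then passed as: ollama-prompt --prompt "<prompt>"
```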
Stateless Mode
ollama-prompt --prompt "Quick question" --no-session
More examples: Use Cases Guide with 12 real-world scenarios
Documentation
Complete Documentation - Full guide navigation and reference
Use Cases
Software Development:
- Multi-file code review with shared context
- Iterative debugging sessions
- Architecture analysis across modules
Multi-Agent Systems:
- Subprocess-based agent orchestration
- Context-aware analysis pipelines
- Cost tracking for LLM operations
Data Analysis:
- Sequential data exploration with memory
- Research workflows with source tracking
- Report generation with conversation history
See all 12 scenarios: Use Cases Guide
Why ollama-prompt?
vs. Direct Ollama API:
- Session persistence (no manual context management)
- Structured JSON output (token counts, timing, metadata)
- File reference syntax (no manual file reading)
vs. Other CLI Tools:
- Session-first design (context by default)
- Subprocess-optimized (perfect for agent orchestration)
- Local-first (SQLite, no cloud dependency)
Built for:
- Developers building agent systems
- Code analysis automation
- Cost-aware LLM workflows
- Multi-turn conversations at scale
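To illustrate the subprocess-first design, here is a hedged sketch of an agent-side wrapper. The flags match those documented above; the ask helper and the assumption that output is JSON on stdout are illustrative, not part of the package:

```python
import json
import subprocess

def build_cmd(prompt, session_id=None, no_session=False):
    """Assemble an ollama-prompt argument vector from the documented flags."""
    cmd = ["ollama-prompt", "--prompt", prompt]
    if session_id:
        cmd += ["--session-id", session_id]
    if no_session:
        cmd.append("--no-session")
    return cmd

def ask(prompt, session_id=None):
    """Run ollama-prompt as a subprocess and parse its JSON output."""
    result = subprocess.run(build_cmd(prompt, session_id),
                            capture_output=True, text=True, check=True)
    return json.loads(result.stdout)
```

An orchestrator can call ask repeatedly with the same session_id to keep shared context across agents.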
Architecture: Subprocess Best Practices | Architectural Comparison
Troubleshooting
- If you get ModuleNotFoundError: ollama, ensure you ran pip install ollama in the correct Python environment.
- Ensure the Ollama CLI is installed (ollama --version should work). The server starts automatically when needed.
- For maximum context windows, check your model's max token support.
- Unexpected session_id in output? Sessions are auto-created by default in v1.2.0+. This is normal behavior. Use --no-session for stateless operation.
- Session context not persisting? Ensure you're using the same --session-id value across invocations. Use --list-sessions to see available sessions.
Contributing
We welcome contributions! Here's how to get started:
Development Setup:
git clone https://github.com/dansasser/ollama-prompt.git
cd ollama-prompt
pip install -e .
Running Tests:
pytest
Contribution Guidelines:
- Fork the repo and create a branch
- Write tests for new features
- Follow existing code style
- Submit PR with clear description
Areas We Need Help:
- Documentation improvements
- New use case examples
- Bug reports and fixes
- Feature suggestions
Questions? Open an issue or discussion.
Community & Support
- Bug Reports: GitHub Issues
- Discussions: GitHub Discussions
- Documentation: docs/README.md
- Troubleshooting: Reference Guide
License
MIT License - see LICENSE file for details.
Third-Party Licenses:
- Uses Ollama (separate licensing)
Credits
Author: Daniel T. Sasser II
- GitHub: github.com/dansasser
- Blog: dansasser.me
Acknowledgments:
- Inspired by the need for structured, cost-aware LLM workflows
- Built for the AI agent orchestration community
Project details
Download files
Source Distribution: ollama_prompt-1.1.8.tar.gz
Built Distribution: ollama_prompt-1.1.8-py3-none-any.whl
File details
Details for the file ollama_prompt-1.1.8.tar.gz.
File metadata
- Download URL: ollama_prompt-1.1.8.tar.gz
- Size: 24.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | e7e2f258a5287f3e9f05852408cd0c5d62bc8b699eeb0c16342ddf33d4ee7b47 |
| MD5 | 25bcb237cf647028ab0d5afee5f672a0 |
| BLAKE2b-256 | 36615ad668a490e6a8fd988198ca18aa9d1cee4ee33ce7721426a4c1f08b264d |
Provenance
The following attestation bundles were made for ollama_prompt-1.1.8.tar.gz:
Publisher: publish.yml on dansasser/ollama-prompt
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: ollama_prompt-1.1.8.tar.gz
- Subject digest: e7e2f258a5287f3e9f05852408cd0c5d62bc8b699eeb0c16342ddf33d4ee7b47
- Sigstore transparency entry: 660802045
- Permalink: dansasser/ollama-prompt@04c184cba694b61e1b5cd7eb9bf83caad6bf6833
- Branch / Tag: refs/tags/v1.1.8
- Owner: https://github.com/dansasser
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@04c184cba694b61e1b5cd7eb9bf83caad6bf6833
- Trigger Event: release
File details
Details for the file ollama_prompt-1.1.8-py3-none-any.whl.
File metadata
- Download URL: ollama_prompt-1.1.8-py3-none-any.whl
- Size: 25.2 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 6fbe45eb33b086164407a416bcd8e7126cfcad58373c78fe4c87b8065227cc97 |
| MD5 | 0ddef6c02933cdbca32fb83ed5e11a30 |
| BLAKE2b-256 | 441c7601715bdb999ea33205807a21ce836c4e6ae0f35d796f538dd4b8aa6eed |
Provenance
The following attestation bundles were made for ollama_prompt-1.1.8-py3-none-any.whl:
Publisher: publish.yml on dansasser/ollama-prompt
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: ollama_prompt-1.1.8-py3-none-any.whl
- Subject digest: 6fbe45eb33b086164407a416bcd8e7126cfcad58373c78fe4c87b8065227cc97
- Sigstore transparency entry: 660802047
- Permalink: dansasser/ollama-prompt@04c184cba694b61e1b5cd7eb9bf83caad6bf6833
- Branch / Tag: refs/tags/v1.1.8
- Owner: https://github.com/dansasser
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@04c184cba694b61e1b5cd7eb9bf83caad6bf6833
- Trigger Event: release