
Ollama CLI prompt tool for local LLM code analysis


ollama-prompt




What is ollama-prompt?

A lightweight Python CLI that transforms Ollama into a powerful analysis tool with:

  • Session persistence - Multi-turn conversations with full context
  • Structured JSON output - Token counts, timing, and metadata (see the example below)
  • File references - Inline local files with @file syntax
  • Multi-agent orchestration - Perfect for subprocess workflows

Perfect for: Code review, analysis pipelines, agent systems, and cost-aware LLM workflows
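
A single call prints one JSON document to stdout. A sketch of the shape; session_id is the documented key used throughout this page, while the remaining keys stand in for the token, timing, and metadata fields and may differ by version:

ollama-prompt --prompt "What is 2+2?"
# Illustrative output shape; only session_id is shown verbatim in these docs:
# {
#   "session_id": "a1b2c3d4",
#   "response": "2 + 2 = 4.",
#   ... token counts, timing, and other metadata ...
# }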

Features

  • Session Management - Persistent conversations across CLI invocations
  • Rich Metadata - Full JSON output with token counts, timing, and cost tracking
  • File References - Reference local files with @./path/to/file.py syntax
  • Subprocess-Friendly - Designed for agent orchestration and automation
  • Cloud & Local Models - Works with both Ollama cloud models and local instances
  • Cross-Platform - Windows, macOS, Linux with Python 3.7+

Quick Start

Prerequisites: Ollama CLI installed (server starts automatically)

# 1. Install
pip install ollama-prompt

# 2. First question (creates session automatically)
ollama-prompt --prompt "What is 2+2?"

# 3. Follow-up with context
ollama-prompt --session-id <id-from-output> --prompt "What about 3+3?"

Session created automatically! See session_id in output.

Next steps: 5-Minute Tutorial | Full CLI Reference


Installation

PyPI (Recommended)

pip install ollama-prompt

Development Install

git clone https://github.com/dansasser/ollama-prompt.git
cd ollama-prompt
pip install -e .

Prerequisites

  • Python 3.7 or higher
  • Ollama installed and running
  • For cloud models: ollama signin (one-time authentication)

Verify installation:

ollama-prompt --help
ollama list  # Check available models

Full setup guide: Prerequisites Documentation


Usage

Basic Example

ollama-prompt --prompt "Explain Python decorators" \
              --model deepseek-v3.1:671b-cloud

Multi-Turn Conversation

# First question
ollama-prompt --prompt "Who wrote Hamlet?" > out.json

# Follow-up (remembers context)
SESSION_ID=$(jq -r '.session_id' out.json)
ollama-prompt --session-id $SESSION_ID --prompt "When was he born?"

File Analysis

ollama-prompt --prompt "Review @./src/auth.py for security issues"

Stateless Mode

ollama-prompt --prompt "Quick question" --no-session

More examples: Use Cases Guide with 12 real-world scenarios


Documentation

Complete Documentation - Full guide navigation and reference



Use Cases

Software Development:

  • Multi-file code review with shared context
  • Iterative debugging sessions
  • Architecture analysis across modules

Multi-Agent Systems:

  • Subprocess-based agent orchestration (see the sketch after this list)
  • Context-aware analysis pipelines
  • Cost tracking for LLM operations
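
A minimal orchestration sketch in shell; the file list and prompts are illustrative, and every worker call reuses the coordinator's session so later steps see earlier context:

#!/usr/bin/env bash
set -euo pipefail

# Coordinator: open a session that frames the task
SID=$(ollama-prompt --prompt "You are reviewing a Python service for security issues." | jq -r '.session_id')

# Workers: each subprocess call shares the same session context
for f in src/auth.py src/session.py; do
  ollama-prompt --session-id "$SID" --prompt "Review @./$f" > "review_$(basename "$f").json"
done

# Aggregator: summarize with the whole conversation in context
ollama-prompt --session-id "$SID" --prompt "Summarize the most severe findings across all files."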

Data Analysis:

  • Sequential data exploration with memory
  • Research workflows with source tracking
  • Report generation with conversation history

See all 12 scenarios: Use Cases Guide


Why ollama-prompt?

vs. Direct Ollama API:

  • Session persistence (no manual context management)
  • Structured JSON output (token counts, timing, metadata)
  • File reference syntax (no manual file reading)

vs. Other CLI Tools:

  • Session-first design (context by default)
  • Subprocess-optimized (perfect for agent orchestration)
  • Local-first (SQLite, no cloud dependency)

Built for:

  • Developers building agent systems
  • Code analysis automation
  • Cost-aware LLM workflows
  • Multi-turn conversations at scale

Architecture: Subprocess Best Practices | Architectural Comparison


Troubleshooting

  • If you get ModuleNotFoundError: No module named 'ollama', ensure the ollama Python package is installed in the same Python environment that runs ollama-prompt (pip install ollama).
  • Ensure Ollama CLI is installed (ollama --version should work). The server starts automatically when needed.
  • For maximum context windows, check your model's max token support.
  • Unexpected session_id in output? Sessions are auto-created by default in v1.2.0+. This is normal behavior. Use --no-session for stateless operation.
  • Session context not persisting? Ensure you're using the same --session-id value across invocations. Use --list-sessions to see available sessions.
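
A quick sanity-check sequence using only the commands and flags documented above:

ollama --version                # Ollama CLI installed?
ollama list                     # models available?
ollama-prompt --help            # package installed and on PATH?
ollama-prompt --list-sessions   # session store readable?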

Contributing

We welcome contributions! Here's how to get started:

Development Setup:

git clone https://github.com/dansasser/ollama-prompt.git
cd ollama-prompt
pip install -e .

Running Tests:

pytest

Contribution Guidelines:

  • Fork the repo and create a branch
  • Write tests for new features
  • Follow existing code style
  • Submit PR with clear description

Areas We Need Help:

  • Documentation improvements
  • New use case examples
  • Bug reports and fixes
  • Feature suggestions

Questions? Open an issue or discussion.



License

MIT License - see LICENSE file for details.

Third-Party Licenses:

  • Uses Ollama (separate licensing)

Credits

Author: Daniel T. Sasser II

Built With:

  • Ollama - Local LLM runtime
  • Python - Language and ecosystem

Acknowledgments:

  • Inspired by the need for structured, cost-aware LLM workflows
  • Built for the AI agent orchestration community



Download files

Download the file for your platform.

Source Distribution

ollama_prompt-1.1.8.tar.gz (24.2 kB)


Built Distribution


ollama_prompt-1.1.8-py3-none-any.whl (25.2 kB)


File details

Details for the file ollama_prompt-1.1.8.tar.gz.

File metadata

  • Download URL: ollama_prompt-1.1.8.tar.gz
  • Size: 24.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for ollama_prompt-1.1.8.tar.gz:

  • SHA256: e7e2f258a5287f3e9f05852408cd0c5d62bc8b699eeb0c16342ddf33d4ee7b47
  • MD5: 25bcb237cf647028ab0d5afee5f672a0
  • BLAKE2b-256: 36615ad668a490e6a8fd988198ca18aa9d1cee4ee33ce7721426a4c1f08b264d
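
To verify a download against the published digest (sha256sum from GNU coreutils; on macOS, shasum -a 256 -c accepts the same input):

echo "e7e2f258a5287f3e9f05852408cd0c5d62bc8b699eeb0c16342ddf33d4ee7b47  ollama_prompt-1.1.8.tar.gz" \
  | sha256sum -c -

The same check applies to the wheel below with its own SHA256.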


Provenance

The following attestation bundles were made for ollama_prompt-1.1.8.tar.gz:

Publisher: publish.yml on dansasser/ollama-prompt

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file ollama_prompt-1.1.8-py3-none-any.whl.

File metadata

  • Download URL: ollama_prompt-1.1.8-py3-none-any.whl
  • Size: 25.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for ollama_prompt-1.1.8-py3-none-any.whl:

  • SHA256: 6fbe45eb33b086164407a416bcd8e7126cfcad58373c78fe4c87b8065227cc97
  • MD5: 0ddef6c02933cdbca32fb83ed5e11a30
  • BLAKE2b-256: 441c7601715bdb999ea33205807a21ce836c4e6ae0f35d796f538dd4b8aa6eed


Provenance

The following attestation bundles were made for ollama_prompt-1.1.8-py3-none-any.whl:

Publisher: publish.yml on dansasser/ollama-prompt

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
