Ollama CLI prompt tool for local LLM code analysis

Project description

ollama-prompt

Local Ollama CLI Tool for Deep Analysis

Overview

ollama-prompt is a cross-platform Python command-line utility for interacting with a local Ollama server, built for advanced code analysis, prompt evaluation, and cost tracking. Send custom prompts to your preferred Ollama model and receive a structured JSON response with all server-side metadata: prompt, output, token counts, durations, and more.

Ideal for:

  • AGI agent orchestration
  • Cost-aware code review workflows
  • Analytics on token usage
  • Integrating structured LLM output into your developer pipeline

Features

  • Flexible CLI flags: set prompt, model, temperature, and token count
  • Prints full verbose JSON: includes response text, token usage (prompt_eval_count, eval_count), and engine stats
  • Integrates easily into developer pipelines (PowerShell, Bash, agent loops)
  • Works on Windows, macOS, and Linux (Python 3.7+) with Ollama installed

Installation

Recommended (PyPI):

pip install ollama-prompt

Requirements:

  • Python 3.7 or higher
  • Local Ollama server running (ollama serve)

Alternative: Development/Manual Install

Clone the repository and install in editable mode:

git clone https://github.com/dansasser/ollama-prompt.git
cd ollama-prompt
pip install -e .

Usage

Quick Start:

You must have the Ollama server running locally:

ollama serve

Basic Example:

ollama-prompt --prompt "Summarize the architecture in src/modules." --model deepseek-v3.1:671b-cloud

Custom Flags:

ollama-prompt --prompt "Evaluate performance of sorting algorithms." --model deepseek-v3.1:671b-cloud --temperature 0.05 --max_tokens 4096

Output Example (JSON):

{
  "model": "deepseek-v3.1:671b-cloud",
  "prompt_eval_count": 38,
  "eval_count": 93,
  "response": "...",
  "total_duration": 13300000,
  "prompt_eval_duration": 1000000,
  "eval_duration": 12200000,
  "done": true
}
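The duration fields in the response are reported by Ollama in nanoseconds, so throughput and totals can be derived straight from the JSON. A minimal Python sketch using the sample values above (field names follow the response shown; the derived metrics are illustrative):

```python
import json

# Sample ollama-prompt output (values from the example above).
raw = """{
  "model": "deepseek-v3.1:671b-cloud",
  "prompt_eval_count": 38,
  "eval_count": 93,
  "response": "...",
  "total_duration": 13300000,
  "prompt_eval_duration": 1000000,
  "eval_duration": 12200000,
  "done": true
}"""

stats = json.loads(raw)

# Total billable tokens = prompt tokens + generated tokens.
total_tokens = stats["prompt_eval_count"] + stats["eval_count"]

# Ollama reports durations in nanoseconds; convert before dividing.
eval_seconds = stats["eval_duration"] / 1e9
tokens_per_second = stats["eval_count"] / eval_seconds

print(f"total tokens: {total_tokens}")
print(f"generation speed: {tokens_per_second:.0f} tok/s")
```

The same two lines of arithmetic are all an analytics dashboard needs to turn raw responses into cost and throughput metrics.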

Advanced:

  • Pipe results with jq:
    ollama-prompt --prompt "Critical design flaws in utils.py?" | jq .eval_count
    
  • Integrate into agent loops or analytics dashboards via JSON output.
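For agent loops, invoking the CLI via subprocess and decoding its stdout as JSON keeps the integration language-agnostic. A sketch of that pattern (the build_cmd and run_prompt helpers are illustrative, not part of the package; flags match those documented above):

```python
import json
import subprocess

DEFAULT_MODEL = "deepseek-v3.1:671b-cloud"

def build_cmd(prompt, model=DEFAULT_MODEL, temperature=0.05):
    """Assemble an ollama-prompt invocation from the documented flags."""
    return [
        "ollama-prompt",
        "--prompt", prompt,
        "--model", model,
        "--temperature", str(temperature),
    ]

def run_prompt(prompt, **kwargs):
    """Run ollama-prompt and return its JSON output as a dict."""
    result = subprocess.run(build_cmd(prompt, **kwargs),
                            capture_output=True, text=True, check=True)
    return json.loads(result.stdout)

# Example (requires a running Ollama server and ollama-prompt on PATH):
# reply = run_prompt("Critical design flaws in utils.py?")
# print(reply["eval_count"], "tokens generated")
```

Because the tool emits a single JSON object, `check=True` plus `json.loads` is enough error handling for a simple loop; richer agents can inspect `done` and the token counts before acting on `response`.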

Troubleshooting

  • If you get ModuleNotFoundError: ollama, make sure the ollama Python package is installed (pip install ollama) in the environment you are running from.
  • The Ollama server must be running locally for requests to succeed (ollama serve).
  • For long prompts, check your model’s maximum context window (token limit).
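Most connection failures come down to the server not listening on its default port (11434). A quick standard-library reachability check (the ollama_reachable helper is illustrative, not part of the package):

```python
import urllib.request
import urllib.error

def ollama_reachable(host="http://localhost:11434"):
    """Return True if an Ollama server answers on the given address."""
    try:
        with urllib.request.urlopen(host, timeout=2) as resp:
            # A running server responds 200 ("Ollama is running") at the root.
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False

if __name__ == "__main__":
    print("server up" if ollama_reachable() else "run `ollama serve` first")
```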

Development & Contributing

Editable Install:

git clone https://github.com/dansasser/ollama-prompt.git
cd ollama-prompt
pip install -e .

To contribute:

  • Fork the repo, create a branch, submit PRs.
  • Open issues for bugs/feature requests.

License

MIT License (see Ollama license for server terms).

Credits

Developed by Daniel T Sasser II for robust code offload workflows, AGI agent orchestration, and token/cost analytics.



Download files

Download the file for your platform.

Source Distribution

ollama_prompt-1.1.2.tar.gz (4.1 kB)

Uploaded Source

Built Distribution


ollama_prompt-1.1.2-py3-none-any.whl (4.5 kB)

Uploaded Python 3

File details

Details for the file ollama_prompt-1.1.2.tar.gz.

File metadata

  • Download URL: ollama_prompt-1.1.2.tar.gz
  • Upload date:
  • Size: 4.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for ollama_prompt-1.1.2.tar.gz:

  • SHA256: 731318a5fd1396fe8266fae7ffd82a1dabd635ed658a653c7f594bdd98477608
  • MD5: 50bb30fe06e419575070661a3e052aba
  • BLAKE2b-256: 1b03efa3bc0dad4f22a8c2ec03fcbb29c87b5ea26f6ef0835f097a8e2ca7f046


Provenance

The following attestation bundles were made for ollama_prompt-1.1.2.tar.gz:

Publisher: publish.yml on dansasser/ollama-prompt

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file ollama_prompt-1.1.2-py3-none-any.whl.

File metadata

  • Download URL: ollama_prompt-1.1.2-py3-none-any.whl
  • Upload date:
  • Size: 4.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for ollama_prompt-1.1.2-py3-none-any.whl:

  • SHA256: 2ee1b9ce78a0b3e8b78fb14df2703e7cb897e223b75c7f86d0e4740c01020d44
  • MD5: 234f22eaebad77eee4a9d3cea17817ec
  • BLAKE2b-256: d54d90fcc48a4693972b6d408ecbf57dddf625b9e49b1e3f7b584a10e80a5e6b


Provenance

The following attestation bundles were made for ollama_prompt-1.1.2-py3-none-any.whl:

Publisher: publish.yml on dansasser/ollama-prompt

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
