
orun-py

A Python CLI agent wrapper for Ollama. It combines chat capabilities with autonomous tools (file I/O, shell execution, web fetching), built-in screenshot analysis, and 200+ prompt and strategy templates.

Features

  • Autonomous Agent: Can read/write files, run shell commands, search the web, and fetch URLs (with user confirmation).
  • Consensus Systems: Multiple models working together in sequential pipelines or parallel aggregation.
  • Web Search: Google Custom Search API (with DuckDuckGo fallback) for internet searches.
  • URL Fetching: Jina AI Reader converts web pages to clean markdown optimized for LLM analysis.
  • arXiv Integration: Search and retrieve academic papers directly from arXiv.
  • Screenshot Analysis: Auto-detects and attaches recent screenshots from your Pictures folder.
  • Prompt Templates: 200+ pre-defined templates for coding, analysis, writing, and more.
  • Strategy Templates: Chain-of-Thought, Tree-of-Thought, and other reasoning strategies.
  • Conversation History: SQLite-backed history lets you resume any session.
  • Model Management: Sync models from Ollama and manage shortcuts.

Installation

pip install orun-py

Usage

  • Running orun without a prompt starts an interactive chat session.
  • Running orun "your prompt" answers a single prompt and exits (single-shot mode).

Agent & Query

Ask a question or give a task. The AI will use tools if necessary.

orun "Why is the sky blue?"
orun "Scan the current directory and list all Python files"
orun "Read src/main.py and explain how it works"

Interactive Chat

Start a continuous session:

orun

Start chat with a specific model:

orun -m coder

Prompt & Strategy Templates

Use a prompt template:

orun "Review this code" -p review_code
orun "Analyze this paper" -p analyze_paper

Use a reasoning strategy:

orun "Explain step by step" -s cot
orun "Explore multiple approaches" -s tot

Combine prompt and strategy:

orun "Debug this issue" -p analyze_incident -s cod

List available templates:

orun prompts      # List all prompt templates
orun strategies   # List all strategy templates

Preview a specific template:

orun prompts --show review_code
orun strategies --show cot

In chat mode, apply templates dynamically:

/prompt analyze_paper
/strategy cot

Consensus Systems (Multi-Model)

Let multiple models collaborate for better results:

# List available consensus pipelines
orun consensus

# Code review: generate → review → refine
orun "Create a REST API for users" -C code_review

# Multi-expert: 3 models analyze, then synthesize
orun "Compare React vs Vue" -C multi_expert

# Vision + text: analyze image → refine response
orun "Explain this diagram" -i -C vision_consensus

# Vision + code: analyze UI → generate code
orun "Convert this mockup to React" -i -C vision_code

7 Built-in Pipelines:

  1. best_of_three - Same model 3 times, show all results
  2. code_review - Generate code → Review → Refine (3 models)
  3. iterative_improve - Draft → Critique → Improve (3 models)
  4. multi_expert - 3 models analyze, then synthesizer combines
  5. research_paper - Research → Outline → Write (3 models)
  6. vision_consensus - Vision analysis → Text refinement
  7. vision_code - Vision analysis → Code generation

Create Custom Pipelines:

# Edit ~/.orun/config.json
{
  "consensus": {
    "pipelines": {
      "my_workflow": {
        "type": "sequential",
        "models": [
          {"name": "model1", "role": "analyzer"},
          {"name": "model2", "role": "synthesizer"}
        ]
      }
    }
  }
}

User-defined pipelines automatically override defaults with the same name.
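As an illustrative sketch of what a sequential pipeline does, each stage feeds its output to the next model in the list. This is not orun's actual implementation: run_sequential and the injected call function are hypothetical names, and in practice call would wrap a real Ollama chat request.

```python
def run_sequential(models, prompt, call):
    """Run a sequential consensus pipeline: each model receives the
    previous stage's output as the input for its own role."""
    result = prompt
    for spec in models:
        # Each stage sees the running result plus its role instruction.
        result = call(spec["name"], f"As the {spec['role']}: {result}")
    return result

# Example with a stub standing in for a real Ollama call:
pipeline = [
    {"name": "model1", "role": "analyzer"},
    {"name": "model2", "role": "synthesizer"},
]
echo = lambda model, prompt: f"[{model}] {prompt}"
print(run_sequential(pipeline, "Compare React vs Vue", echo))
```

Because the model caller is injected, the same loop works for any backend; a "parallel aggregation" pipeline would instead fan the prompt out to all models and pass the collected answers to a synthesizer.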

Analyze Screenshots

Attach the most recent screenshot:

orun "What is this error?" -i

Attach the last 3 screenshots:

orun "Compare these images" -i 3x
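Screenshot auto-detection can be approximated with a few lines of standard-library Python. This is a sketch of the general idea, not orun's actual detection code; latest_screenshots is a hypothetical helper.

```python
from pathlib import Path

def latest_screenshots(folder, count=1, exts=(".png", ".jpg", ".jpeg")):
    """Return the `count` most recently modified image files in `folder`."""
    images = [p for p in Path(folder).iterdir()
              if p.is_file() and p.suffix.lower() in exts]
    # Newest first, by filesystem modification time.
    images.sort(key=lambda p: p.stat().st_mtime, reverse=True)
    return images[:count]
```

Called with count=3, this mirrors the -i 3x behavior of attaching the last three screenshots.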

arXiv Integration

Search for academic papers and let the AI analyze them:

orun "Find recent papers about transformers in NLP"
orun "Get details about arXiv paper 1706.03762"
orun "Search for papers by Geoffrey Hinton and summarize his latest work"

In interactive chat, use the /arxiv command for direct access:

orun
> /arxiv quantum computing
> /arxiv 1706.03762
> /arxiv https://arxiv.org/abs/2301.07041

The AI can autonomously:

  • Search arXiv by keywords, topics, or authors
  • Retrieve full paper details (title, abstract, authors, PDF links)
  • Analyze and summarize research papers
  • Find relevant literature for your projects

The /arxiv command automatically detects whether you're searching or requesting a specific paper, fetches the data, and provides AI analysis without showing raw output.
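For reference, arXiv's public API lives at export.arxiv.org/api/query; whether orun calls it directly is not shown here, so treat the following as a hypothetical sketch of the search-versus-ID detection the /arxiv command performs. build_arxiv_query is an illustrative helper, not orun's API.

```python
import re
from urllib.parse import urlencode

ARXIV_API = "http://export.arxiv.org/api/query"  # public arXiv API endpoint

def build_arxiv_query(query, max_results=5):
    """Turn user input into an arXiv API URL: a bare ID such as
    1706.03762 is fetched via id_list, anything else becomes a
    keyword search across all fields."""
    if re.fullmatch(r"\d{4}\.\d{4,5}(v\d+)?", query):
        params = {"id_list": query}
    else:
        params = {"search_query": f"all:{query}", "start": 0,
                  "max_results": max_results}
    return f"{ARXIV_API}?{urlencode(params)}"
```

The API returns an Atom feed containing each paper's title, abstract, authors, and PDF link, which is the data the AI then analyzes.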

Web Search & URL Fetching

Search the web or fetch specific web pages in interactive chat:

Web Search (DuckDuckGo with Language Detection):

orun
> /search Python asyncio tutorials
> /search latest news about AI

Fetch URL (via Jina AI Reader):

orun
> /fetch https://example.com
> /fetch github.com/user/repo

Features:

  • Web Search: DuckDuckGo with automatic language detection for region-appropriate results
  • Language Detection: Automatically detects query language (Ukrainian, Russian, English, etc.) and sets appropriate region
  • URL Fetching: Jina AI Reader converts pages to clean markdown optimized for LLM analysis
  • No Configuration Required: Works out of the box with unlimited free searches
  • AI Analysis: All results are analyzed and summarized by the AI

The AI can also autonomously call web_search() and fetch_url() tools during conversations.
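Jina AI Reader works by prefixing https://r.jina.ai/ to the target URL and returning the page as markdown. Here is a minimal sketch of how a fetch_url-style tool could build that request URL; jina_reader_url is a hypothetical helper, and orun's own implementation may differ.

```python
def jina_reader_url(target):
    """Wrap a URL with the Jina AI Reader prefix, which serves the
    page back as LLM-friendly markdown. Scheme-less inputs like
    'github.com/user/repo' get https:// prepended first."""
    if not target.startswith(("http://", "https://")):
        target = "https://" + target
    return "https://r.jina.ai/" + target
```

Fetching the resulting URL with any HTTP client yields the cleaned markdown that gets handed to the model.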

Model Management

Models are stored in ~/.orun/config.json with support for multiple shortcuts per model and custom options.

Sync models from Ollama:

orun refresh

List available models with all their aliases:

orun models

Set default active model:

orun set-active llama3.1

Add shortcuts to models (multiple shortcuts per model supported):

orun shortcut llama3.1:8b llama
orun shortcut llama3.1:8b l3
# Now llama3.1:8b has shortcuts: ["llama3.1", "llama", "l3"]

Model Configuration Structure:

{
  "models": {
    "llama3.1:8b": {
      "shortcuts": ["llama3.1", "llama", "l3"],
      "options": {"temperature": 0.7}
    }
  },
  "active_model": "llama3.1:8b"
}
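Resolving a shortcut back to its canonical model name is a straightforward lookup over this structure. A sketch assuming the config shape shown above; resolve_model is a hypothetical helper, not part of orun's API.

```python
def resolve_model(config, name):
    """Resolve a model name or shortcut against the config's models map."""
    if name in config["models"]:
        return name
    for model, entry in config["models"].items():
        if name in entry.get("shortcuts", []):
            return model
    raise KeyError(f"unknown model or shortcut: {name}")

config = {
    "models": {
        "llama3.1:8b": {
            "shortcuts": ["llama3.1", "llama", "l3"],
            "options": {"temperature": 0.7},
        }
    },
    "active_model": "llama3.1:8b",
}
print(resolve_model(config, "l3"))  # llama3.1:8b
```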

Conversation History

Conversation history is stored in ~/.orun/history.db (SQLite database).

List recent conversations:

orun history

Continue a conversation by ID:

orun c 1

Continue the last conversation:

orun last

Optional Robyn MCP Server

Expose orun through a lightweight Robyn MCP server for tools that prefer HTTP access.

  1. Install the optional dependency:
    pip install "orun-py[mcp]"
    
  2. Start the server (defaults to 127.0.0.1:8000 and the active model):
    orun mcp-server --host 0.0.0.0 --port 8000 -m llama3.1
    
  3. Available endpoints:
    • GET /health returns {"status": "ok"}
    • POST /chat accepts JSON {"prompt": "...", "system_prompt": "...", "options": {...}} and returns {"response": "..."}
      Tools are enabled by default; pass --disable-tools when starting the server to force text-only responses.
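A client can talk to the server with nothing but the standard library. This is a sketch assuming the endpoint shapes listed above; build_chat_request is a hypothetical helper, not part of orun.

```python
import json
from urllib import request

def build_chat_request(prompt, system_prompt=None, options=None,
                       base="http://127.0.0.1:8000"):
    """Build a POST /chat request for a running orun MCP server."""
    payload = {"prompt": prompt}
    if system_prompt:
        payload["system_prompt"] = system_prompt
    if options:
        payload["options"] = options
    return request.Request(
        f"{base}/chat",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )

req = build_chat_request("Why is the sky blue?", options={"temperature": 0.2})

# To actually call a running server:
# with request.urlopen(req) as resp:
#     print(json.loads(resp.read())["response"])
```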

Requirements

  • Python 3.10+
  • Ollama running locally
