
MCP server for running Ollama models with async support


Ollama MCP Server

A comprehensive Model Context Protocol (MCP) server for Ollama integration with advanced features including script management, multi-agent workflows, and process leak prevention.

🌟 Features

  • 🔄 Async Job Management: Execute long-running tasks in the background
  • 📝 Script Templates: Create reusable prompt templates with variable substitution
  • 🤖 Fast-Agent Integration: Multi-agent workflows (chain, parallel, router, evaluator)
  • 🛡️ Process Leak Prevention: Proper cleanup and resource management
  • 📊 Comprehensive Monitoring: Job tracking, status monitoring, and output management
  • 🎯 Built-in Prompts: Interactive guidance templates for common tasks
  • ⚡ Multiple Model Support: Work with any locally installed Ollama model

🚀 Quick Start

Prerequisites

  • Ollama installed and running locally
  • Python 3.12 and the uv package manager
  • Claude Desktop (to use the server as an MCP integration)

Installation

  1. Set up the environment (be advised: this README was revised by a less-than-conscientious AI, so double-check the commands):
cd /path/to/ollama-mcp-server
uv venv --python 3.12 --seed
source .venv/bin/activate
uv add "mcp[cli]" python-dotenv
  2. Configure Claude Desktop: Copy the configuration from example_claude_desktop_config.json (not example_of_bad_ai_gen_mcp_config_do_not_use.json) to your Claude Desktop config file (see the example entry below):
  • Linux: ~/.config/Claude/claude_desktop_config.json
  • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  • Windows: %APPDATA%\Claude\claude_desktop_config.json
  3. Update the paths in the config to match your system

  4. Restart Claude Desktop
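
For reference, a Claude Desktop MCP entry generally has the shape below. This is a minimal sketch that assumes the server is launched with uv from its project directory; take the actual server name, command, and paths from example_claude_desktop_config.json in this repository.

{
  "mcpServers": {
    "ollama-mcp-server": {
      "command": "uv",
      "args": [
        "--directory",
        "/path/to/ollama-mcp-server",
        "run",
        "python",
        "-m",
        "ollama_mcp_server.server"
      ]
    }
  }
}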

๐Ÿ› ๏ธ Available Tools

Core Operations

  • list_ollama_models - Show all available Ollama models
  • run_ollama_prompt - Execute prompts with any model (sync/async)
  • get_job_status - Check job completion status
  • list_jobs - View all running and completed jobs
  • cancel_job - Stop running jobs
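
Inside Claude Desktop these tools are called for you, but they can also be exercised directly with the MCP Python SDK client. A minimal sketch (the tool names come from the list above; the run_ollama_prompt argument names are assumptions, so check the tool schema reported by the server):

import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Launch the server over stdio; run this from the ollama-mcp-server directory
server = StdioServerParameters(
    command="uv",
    args=["run", "python", "-m", "ollama_mcp_server.server"],
)

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            models = await session.call_tool("list_ollama_models", arguments={})
            print(models)
            # Argument names here are illustrative -- inspect the tool schema for the real ones
            result = await session.call_tool(
                "run_ollama_prompt",
                arguments={"model": "llama3.2", "prompt": "Say hello"},
            )
            print(result)

asyncio.run(main())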

Script Management

  • save_script - Create reusable prompt templates
  • list_scripts - View saved templates
  • get_script - Read template content
  • run_script - Execute templates with variables

Fast-Agent Workflows

  • create_fastagent_script - Single-agent scripts
  • create_fastagent_workflow - Multi-agent workflows
  • run_fastagent_script - Execute agent workflows
  • list_fastagent_scripts - View available workflows

System Integration

  • run_bash_command - Execute system commands safely
  • run_workflow - Multi-step workflow execution

📖 Built-in Prompts

Interactive prompts to guide common tasks:

  • ollama_guide - Interactive user guide
  • ollama_run_prompt - Simple prompt execution
  • model_comparison - Compare multiple models
  • fast_agent_workflow - Multi-agent workflows
  • script_executor - Template execution
  • batch_processing - Multiple prompt processing
  • iterative_refinement - Content improvement workflows

๐Ÿ“ Directory Structure

ollama-mcp-server/
├── src/ollama_mcp_server/
│   └── server.py                 # Main server code
├── outputs/                      # Generated output files
├── scripts/                      # Saved script templates
├── workflows/                    # Workflow definitions
├── fast-agent-scripts/           # Fast-agent Python scripts
├── prompts/                      # Usage guides
│   ├── tool_usage_guide.md
│   ├── prompt_templates_guide.md
│   └── setup_guide.md
├── example_mcp_config.json       # Claude Desktop config
└── README.md

🔧 Development

Run Development Server

cd ollama-mcp-server
uv run python -m ollama_mcp_server.server

Debug with MCP Inspector

mcp dev src/ollama_mcp_server/server.py

๐Ÿ›ก๏ธ Process Management

The server includes comprehensive process leak prevention:

  • Signal Handling: Proper SIGTERM/SIGINT handling
  • Background Task Tracking: All async tasks monitored
  • Resource Cleanup: Automatic process termination
  • Memory Management: Prevents accumulation of zombie processes
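
A minimal sketch of what this pattern typically looks like in asyncio (illustrative only, not the server's actual implementation):

import asyncio
import signal

background_tasks: set[asyncio.Task] = set()
child_processes: set[asyncio.subprocess.Process] = set()

def track(task: asyncio.Task) -> None:
    # Keep a strong reference to every background task so none is lost or leaked
    background_tasks.add(task)
    task.add_done_callback(background_tasks.discard)

async def shutdown() -> None:
    # Cancel pending async jobs and terminate any still-running child processes
    for task in background_tasks:
        task.cancel()
    for proc in child_processes:
        if proc.returncode is None:
            proc.terminate()
    await asyncio.gather(*background_tasks, return_exceptions=True)

def install_signal_handlers() -> None:
    # Translate SIGTERM/SIGINT into an orderly shutdown instead of leaving zombies
    loop = asyncio.get_running_loop()
    for sig in (signal.SIGTERM, signal.SIGINT):
        loop.add_signal_handler(sig, lambda: asyncio.ensure_future(shutdown()))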

Monitor health with:

ps aux | grep mcp | wc -l  # Should show <10 processes

📊 Usage Examples

Simple Prompt Execution

1. Use "ollama_run_prompt" prompt in Claude
2. Specify model and prompt text
3. Get immediate results

Multi-Agent Workflow

1. Use "fast_agent_workflow" prompt
2. Choose workflow type (chain/parallel/router/evaluator; see the sketch after this list)
3. Define agents and initial prompt
4. Monitor execution
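
Conceptually, the workflow types differ in how agent outputs are wired together. The toy sketch below (plain Python, not the fast-agent API) contrasts "chain", where each agent consumes the previous agent's output, with "parallel", where every agent gets the same prompt concurrently:

import asyncio

async def agent(name: str, prompt: str) -> str:
    # Stand-in for a model call; the real workflows run Ollama models
    return f"[{name}] processed: {prompt}"

async def chain(prompt: str) -> str:
    # Each agent receives the previous agent's output
    out = await agent("researcher", prompt)
    out = await agent("writer", out)
    return out

async def parallel(prompt: str) -> list[str]:
    # All agents receive the same prompt concurrently
    return await asyncio.gather(agent("optimist", prompt), agent("skeptic", prompt))

print(asyncio.run(chain("Explain MCP")))
print(asyncio.run(parallel("Explain MCP")))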

Script Templates

1. Create template with save_script
2. Use variables: {variable_name}
3. Execute with run_script
4. Pass a JSON variables object (see the sketch below)
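
The substitution itself is ordinary placeholder filling. A minimal, self-contained sketch of the idea (plain Python; the template text and variable names are made up for illustration, and this is not the server's internal code):

import json

# A saved template with {placeholders}, as created by save_script
template = "Summarize the following {language} code:\n\n{code}"

# The JSON variables object passed to run_script
variables = json.loads('{"language": "Python", "code": "def add(a, b): return a + b"}')

# The placeholders are filled and the resulting prompt is sent to the chosen model
prompt = template.format(**variables)
print(prompt)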

🚨 Troubleshooting

  • Model not found: Use list_ollama_models for exact names
  • Connection issues: Start Ollama with ollama serve
  • High process count: Server now prevents leaks automatically
  • Job stuck: Use cancel_job to stop problematic tasks

๐Ÿค Contributing

  1. Follow the MCP Python SDK development guidelines
  2. Use proper type hints and docstrings
  3. Test all new features thoroughly
  4. Ensure process cleanup in all code paths

📄 License

This project follows the same license terms as the MCP Python SDK.

๐Ÿ™ Acknowledgments

Built on the Model Context Protocol and Ollama with process management patterns from MCP best practices.


Ready to get started? Check the prompts/setup_guide.md for detailed installation instructions!
