MCP server for running Ollama models with async support
# Ollama MCP Server
A comprehensive Model Context Protocol (MCP) server for Ollama integration with advanced features including script management, multi-agent workflows, and process leak prevention.
## Features

- **Async Job Management**: Execute long-running tasks in the background
- **Script Templates**: Create reusable prompt templates with variable substitution
- **Fast-Agent Integration**: Multi-agent workflows (chain, parallel, router, evaluator)
- **Process Leak Prevention**: Proper cleanup and resource management
- **Comprehensive Monitoring**: Job tracking, status monitoring, and output management
- **Built-in Prompts**: Interactive guidance templates for common tasks
- **Multiple Model Support**: Work with any locally installed Ollama model
## Quick Start

### Prerequisites

- Python 3.8+ with the uv package manager
- Ollama installed and running
- Claude Desktop for MCP integration
### Installation

1. **Set up the environment** (be advised: this README was revised by a less-than-conscientious AI, so double-check commands before running them):

   ```bash
   cd /path/to/ollama-mcp-server
   uv venv --python 3.12 --seed
   source .venv/bin/activate
   uv add "mcp[cli]" python-dotenv
   ```

2. **Configure Claude Desktop**: copy the configuration from `example_claude_desktop_config.json` (not from `example_of_bad_ai_gen_mcp_config_do_not_use.json`, as its name warns) into your Claude Desktop config file:

   - Linux: `~/.config/Claude/claude_desktop_config.json`
   - macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
   - Windows: `%APPDATA%\Claude\claude_desktop_config.json`

3. **Update the paths** in the config to match your system.

4. **Restart Claude Desktop.**
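For reference, a minimal server entry in the Claude Desktop config might look like the sketch below. The `command` and `args` shown are assumptions based on the uv setup above; `example_claude_desktop_config.json` in the repository is the authoritative version:

```json
{
  "mcpServers": {
    "ollama-mcp-server": {
      "command": "uv",
      "args": [
        "--directory", "/path/to/ollama-mcp-server",
        "run", "python", "-m", "ollama_mcp_server.server"
      ]
    }
  }
}
```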
## Available Tools

### Core Operations

- `list_ollama_models` - Show all available Ollama models
- `run_ollama_prompt` - Execute prompts with any model (sync/async)
- `get_job_status` - Check job completion status
- `list_jobs` - View all running and completed jobs
- `cancel_job` - Stop running jobs
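As a sketch of the async flow these tools enable, a long-running prompt is started as a background job, polled, and cancelled if needed. The argument names below are illustrative assumptions, not the tools' actual schemas:

```json
[
  { "tool": "run_ollama_prompt",
    "arguments": { "model": "llama3.2", "prompt": "Draft a long report", "async": true } },
  { "tool": "get_job_status", "arguments": { "job_id": "<id returned by the first call>" } },
  { "tool": "cancel_job", "arguments": { "job_id": "<id returned by the first call>" } }
]
```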
### Script Management

- `save_script` - Create reusable prompt templates
- `list_scripts` - View saved templates
- `get_script` - Read template content
- `run_script` - Execute templates with variables
### Fast-Agent Workflows

- `create_fastagent_script` - Single-agent scripts
- `create_fastagent_workflow` - Multi-agent workflows
- `run_fastagent_script` - Execute agent workflows
- `list_fastagent_scripts` - View available workflows
### System Integration

- `run_bash_command` - Execute system commands safely
- `run_workflow` - Multi-step workflow execution
## Built-in Prompts

Interactive prompts to guide common tasks:

- `ollama_guide` - Interactive user guide
- `ollama_run_prompt` - Simple prompt execution
- `model_comparison` - Compare multiple models
- `fast_agent_workflow` - Multi-agent workflows
- `script_executor` - Template execution
- `batch_processing` - Multiple prompt processing
- `iterative_refinement` - Content improvement workflows
## Directory Structure

```
ollama-mcp-server/
├── src/ollama_mcp_server/
│   └── server.py               # Main server code
├── outputs/                    # Generated output files
├── scripts/                    # Saved script templates
├── workflows/                  # Workflow definitions
├── fast-agent-scripts/         # Fast-agent Python scripts
├── prompts/                    # Usage guides
│   ├── tool_usage_guide.md
│   ├── prompt_templates_guide.md
│   └── setup_guide.md
├── example_mcp_config.json     # Claude Desktop config
└── README.md
```
## Development

### Run the development server

```bash
cd ollama-mcp-server
uv run python -m ollama_mcp_server.server
```

### Debug with MCP Inspector

```bash
mcp dev src/ollama_mcp_server/server.py
```
## Process Management

The server includes comprehensive process leak prevention:

- **Signal Handling**: Proper SIGTERM/SIGINT handling
- **Background Task Tracking**: All async tasks are monitored
- **Resource Cleanup**: Automatic process termination
- **Memory Management**: Prevents accumulation of zombie processes

Monitor health with:

```bash
ps aux | grep '[m]cp' | wc -l   # should show fewer than 10 processes
```

(The `[m]cp` pattern keeps the `grep` process itself out of the count.)
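The leak-prevention pattern above can be sketched roughly as follows. This is an illustrative sketch of the described approach (task tracking plus signal handlers), not the server's actual implementation:

```python
import asyncio
import signal

class TaskTracker:
    """Track background tasks so they can all be cancelled on shutdown."""

    def __init__(self) -> None:
        self._tasks: set[asyncio.Task] = set()

    def spawn(self, coro) -> asyncio.Task:
        """Start a background task and remember it until it finishes."""
        task = asyncio.ensure_future(coro)
        self._tasks.add(task)
        # Drop the reference when the task completes, so finished
        # tasks do not accumulate in the set.
        task.add_done_callback(self._tasks.discard)
        return task

    async def shutdown(self) -> None:
        """Cancel every tracked task and wait for them to finish."""
        for task in list(self._tasks):
            task.cancel()
        await asyncio.gather(*self._tasks, return_exceptions=True)

def install_signal_handlers(tracker: TaskTracker) -> None:
    """On SIGTERM/SIGINT, cancel tracked tasks instead of leaking them."""
    loop = asyncio.get_running_loop()
    for sig in (signal.SIGTERM, signal.SIGINT):
        loop.add_signal_handler(
            sig, lambda: asyncio.ensure_future(tracker.shutdown())
        )
```

Spawning every background job through such a tracker means a single `shutdown()` call can reclaim everything when the server exits.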
## Usage Examples

### Simple Prompt Execution

1. Use the `ollama_run_prompt` prompt in Claude
2. Specify the model and prompt text
3. Get immediate results

### Multi-Agent Workflow

1. Use the `fast_agent_workflow` prompt
2. Choose a workflow type (chain/parallel/router/evaluator)
3. Define the agents and the initial prompt
4. Monitor execution
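Conceptually, a chain workflow pipes each agent's output into the next agent's input. A minimal sketch of that idea, with hypothetical stand-in agent functions rather than the fast-agent API:

```python
from typing import Callable

# An "agent" here is just any function from text to text.
Agent = Callable[[str], str]

def run_chain(agents: list[Agent], initial_prompt: str) -> str:
    """Feed each agent's output to the next agent in sequence."""
    text = initial_prompt
    for agent in agents:
        text = agent(text)
    return text

# Hypothetical stand-ins for model-backed agents:
summarizer = lambda s: f"summary({s})"
critic = lambda s: f"critique({s})"
print(run_chain([summarizer, critic], "draft"))  # critique(summary(draft))
```

Parallel, router, and evaluator workflows differ only in how they combine or select agent outputs instead of chaining them linearly.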
### Script Templates

1. Create a template with `save_script`
2. Use variables: `{variable_name}`
3. Execute with `run_script`
4. Pass a JSON variables object
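The substitution step can be sketched as below. This is an assumption about how `{variable_name}` placeholders behave, not the server's actual template engine:

```python
import json
import re

def render_template(template: str, variables: dict) -> str:
    """Replace each {name} placeholder with its value from `variables`."""
    def sub(match: re.Match) -> str:
        name = match.group(1)
        if name not in variables:
            raise KeyError(f"missing template variable: {name}")
        return str(variables[name])
    return re.sub(r"\{(\w+)\}", sub, template)

# Example: a saved template plus a JSON variables object.
template = "Summarize the following {language} code in {style} style:\n{code}"
variables = json.loads('{"language": "Python", "style": "bullet-point", "code": "print(1)"}')
print(render_template(template, variables))
```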
## Troubleshooting

- **Model not found**: Use `list_ollama_models` to get exact model names
- **Connection issues**: Start Ollama with `ollama serve`
- **High process count**: The server now prevents leaks automatically
- **Job stuck**: Use `cancel_job` to stop problematic tasks
## Contributing
- Follow the MCP Python SDK development guidelines
- Use proper type hints and docstrings
- Test all new features thoroughly
- Ensure process cleanup in all code paths
## License

This project follows the same license terms as the MCP Python SDK.
## Acknowledgments

Built on the Model Context Protocol and Ollama, with process-management patterns drawn from MCP best practices.

Ready to get started? Check `prompts/setup_guide.md` for detailed installation instructions!
## File details

### iflow_mcp_angrysky56_ollama_mcp_server-0.1.0.tar.gz (source distribution)

- Size: 403.9 kB
- Uploaded using Trusted Publishing: No
- Uploaded via: uv/0.10.2 (Debian GNU/Linux 13 "trixie")

| Algorithm | Hash digest |
|---|---|
| SHA256 | `a1bd58e242462a71f9538b3b43a064c181e0a067441aa748d922c5befc509d3a` |
| MD5 | `8047c37c541407482dd8920c504c3280` |
| BLAKE2b-256 | `a4544faeb01d536327347e7b0a1490fb8f3769dca2440db08d21e1c5957c6743` |
### iflow_mcp_angrysky56_ollama_mcp_server-0.1.0-py3-none-any.whl (built distribution, Python 3)

- Size: 28.2 kB
- Uploaded using Trusted Publishing: No
- Uploaded via: uv/0.10.2 (Debian GNU/Linux 13 "trixie")

| Algorithm | Hash digest |
|---|---|
| SHA256 | `531bda5783ecdb22467ccfce74d2a65d50f638c6029cb7925989dc99b938f943` |
| MD5 | `7a0a33ed2ddb5c24f731e4fe29aa4314` |
| BLAKE2b-256 | `b5edd1235f47d5703b4f7b9a14e27b27f7684e3913328f2ad1c5163a0373c76e` |