# MCP Server Ollama Bridge

Use local Ollama models from any MCP-compatible AI client.

Part of the HumoticaOS "MCP for Any AI" initiative: no vendor lock-in, your choice of AI, same powerful tools.
## Why?
MCP (Model Context Protocol) lets AI assistants use external tools. But what if you want to use Ollama's local models as part of your workflow? This bridge exposes Ollama to any MCP client.
Use cases:
- Let Claude Desktop query your local Qwen/Llama for sensitive data
- Have Gemini use your local models for embeddings
- Run hybrid workflows: cloud AI for creativity, local for privacy
## Installation

```bash
pip install mcp-server-ollama-bridge
```
## Configuration

Add to your `claude_desktop_config.json`:
```json
{
  "mcpServers": {
    "ollama": {
      "command": "python3",
      "args": ["-m", "mcp_server_ollama_bridge"],
      "env": {
        "OLLAMA_URL": "http://localhost:11434"
      }
    }
  }
}
```
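The `env` block above is how the bridge learns where Ollama listens. As a hypothetical sketch (the function name and fallback behavior are assumptions, not the package's documented internals), the lookup might read the variable and fall back to Ollama's default port:

```python
import os

def resolve_ollama_url() -> str:
    """Return the Ollama base URL from OLLAMA_URL, defaulting to the
    standard local Ollama port; trailing slashes are stripped so
    endpoint paths can be appended safely."""
    return os.environ.get("OLLAMA_URL", "http://localhost:11434").rstrip("/")
```

Pointing `OLLAMA_URL` at another host also lets the bridge talk to an Ollama instance running elsewhere on your network.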
## Available Tools

| Tool | Description |
|---|---|
| `ollama_chat` | Chat with a local model (conversation style) |
| `ollama_generate` | Text completion (good for code) |
| `ollama_list_models` | List available local models |
| `ollama_embeddings` | Generate embeddings for semantic search |
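These tool names line up one-to-one with Ollama's documented REST endpoints, so presumably the bridge forwards each call accordingly (an assumption; the package's actual routing isn't shown here). A minimal sketch of that mapping, building but not sending the HTTP request:

```python
import json

# Assumed tool-to-endpoint mapping, using Ollama's documented REST API.
TOOL_ENDPOINTS = {
    "ollama_chat": "/api/chat",
    "ollama_generate": "/api/generate",
    "ollama_list_models": "/api/tags",
    "ollama_embeddings": "/api/embeddings",
}

def build_request(tool: str, base_url: str = "http://localhost:11434", **params) -> dict:
    """Describe the HTTP request a tool call would translate to.
    Listing models is a GET with no body; the other tools POST their
    parameters as JSON."""
    if tool not in TOOL_ENDPOINTS:
        raise ValueError(f"unknown tool: {tool}")
    is_list = tool == "ollama_list_models"
    return {
        "url": base_url + TOOL_ENDPOINTS[tool],
        "method": "GET" if is_list else "POST",
        "body": None if is_list else json.dumps(params),
    }
```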
## Examples

Chat:

```python
ollama_chat(message="Explain quantum computing", model="qwen2.5:7b")
```

Code generation:

```python
ollama_generate(prompt="Write a Python function to sort a list", model="qwen2.5:32b")
```

Embeddings for RAG:

```python
ollama_embeddings(text="HumoticaOS is an AI orchestration platform")
```
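To show where the embeddings tool fits in a RAG workflow: assuming it returns plain float vectors (as Ollama's embeddings endpoint does), ranking documents against a query reduces to cosine similarity. The helper below is illustrative, not part of this package:

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors:
    1.0 for identical directions, 0.0 for orthogonal ones."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)
```

In practice you would embed each document once, embed the query at search time, and return the documents with the highest cosine scores.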
## Part of HumoticaOS

This is one of our published MCP servers:

- `mcp-server-rabel` - AI Memory & Communication
- `mcp-server-tibet` - Trust & Provenance
- `mcp-server-inject-bender` - Security Through Absurdity
- `mcp-server-ollama-bridge` - Local LLM Bridge
## License

MIT - By Claude & Jasper from HumoticaOS, Christmas 2025
One love, one fAmIly
## File details

Details for the file `mcp_server_ollama_bridge-1.0.0.tar.gz` (source distribution).

File metadata:

- Download URL: mcp_server_ollama_bridge-1.0.0.tar.gz
- Upload date:
- Size: 3.8 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.13.5

File hashes:

| Algorithm | Hash digest |
|---|---|
| SHA256 | `4cbcf4462d7a02b6418017274e0baf0fd410206d4f10c9781522953971fd7ad8` |
| MD5 | `4a8a6c99a0f1703c380dc9e6220370af` |
| BLAKE2b-256 | `6431e6e584ff72033f6256f0643dda9013def225ba2b9c54ce0f606b96753000` |
Details for the file `mcp_server_ollama_bridge-1.0.0-py3-none-any.whl` (built distribution).

File metadata:

- Download URL: mcp_server_ollama_bridge-1.0.0-py3-none-any.whl
- Upload date:
- Size: 4.7 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.13.5

File hashes:

| Algorithm | Hash digest |
|---|---|
| SHA256 | `b955bea53df2e3b983b9206ee71254c1504174cff4a5774d0b4ddb1b320c8eb2` |
| MD5 | `e01b74a2b9b43681610bbbd435944aa0` |
| BLAKE2b-256 | `f71a0b31caa572f70095e876fa5b7f0664f9d4efa4d0f8667314281291f7a88f` |