# MCP Ollama
A Model Context Protocol (MCP) server for integrating Ollama with Claude Desktop or other MCP clients.
## Requirements

- Python 3.10 or higher
- Ollama installed and running (https://ollama.com/download)
- At least one model pulled with Ollama (e.g., `ollama pull llama2`)
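To confirm Ollama is running and has at least one model pulled, you can query its local HTTP API (`GET /api/tags` on the default port 11434, which returns a `models` list). A minimal sketch in Python — the sample payload at the end is illustrative, not a real server response:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default local endpoint

def model_names(tags_json: str) -> list[str]:
    """Extract model names from the JSON body of GET /api/tags."""
    data = json.loads(tags_json)
    return [m["name"] for m in data.get("models", [])]

def list_local_models(base_url: str = OLLAMA_URL) -> list[str]:
    """Ask the running Ollama server which models it has pulled."""
    with urllib.request.urlopen(f"{base_url}/api/tags") as resp:
        return model_names(resp.read().decode())

# Abridged example of the /api/tags response shape:
sample = '{"models": [{"name": "llama2:latest"}]}'
print(model_names(sample))  # → ['llama2:latest']
```

If `list_local_models()` raises a connection error, Ollama is not running; if it returns an empty list, pull a model first.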
## Configure Claude Desktop

Add to your Claude Desktop configuration (`~/Library/Application Support/Claude/claude_desktop_config.json` on macOS, `%APPDATA%\Claude\claude_desktop_config.json` on Windows):

```json
{
  "mcpServers": {
    "ollama": {
      "command": "uvx",
      "args": [
        "mcp-ollama"
      ]
    }
  }
}
```
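A malformed config file is a common reason the server fails to appear in Claude Desktop. A quick sanity check that the JSON parses and the `ollama` entry matches the snippet above can save a restart cycle — this helper is illustrative, not part of the package:

```python
import json

def check_config(text: str) -> bool:
    """Return True if the config text declares the 'ollama' MCP server as above."""
    cfg = json.loads(text)  # raises json.JSONDecodeError on malformed JSON
    server = cfg.get("mcpServers", {}).get("ollama", {})
    return server.get("command") == "uvx" and "mcp-ollama" in server.get("args", [])

sample = '{"mcpServers": {"ollama": {"command": "uvx", "args": ["mcp-ollama"]}}}'
print(check_config(sample))  # → True
```

To check your real file, read it first, e.g. `check_config(open(path).read())` with the platform-specific path above.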
## Development

Install in development mode:

```sh
git clone https://github.com/yourusername/mcp-ollama.git
cd mcp-ollama
uv sync
```

Test with MCP Inspector:

```sh
mcp dev src/mcp_ollama/server.py
```
## Features

The server provides three main tools:

- `list_models` - List all downloaded Ollama models
- `show_model` - Get detailed information about a specific model
- `ask_model` - Ask a question to a specified model
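Conceptually, a tool like `ask_model` forwards the prompt to Ollama's REST API (`POST /api/generate`). The request shape below follows Ollama's documented API, but the helper name and this round-trip sketch are assumptions about the implementation, not the package's actual code:

```python
import json
import urllib.request

def build_generate_request(model: str, prompt: str,
                           base_url: str = "http://localhost:11434"):
    """Build the POST request a tool like ask_model would send to Ollama."""
    payload = {"model": model, "prompt": prompt, "stream": False}
    return urllib.request.Request(
        f"{base_url}/api/generate",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_generate_request("llama2", "Why is the sky blue?")
print(req.full_url)                   # → http://localhost:11434/api/generate
print(json.loads(req.data)["model"])  # → llama2
```

Sending the request with `urllib.request.urlopen(req)` returns a JSON body whose `response` field holds the model's answer (with `"stream": False`, the full answer arrives in one object).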
## License

MIT
## File details

Details for the file `mcp_ollama-0.1.3.tar.gz`.

### File metadata
- Download URL: mcp_ollama-0.1.3.tar.gz
- Upload date:
- Size: 42.9 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.12.8
### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | 9e3017047721cc43da7192118e35d9cae76ff828fe08a27fcac56be64a12145f |
| MD5 | 85735da1da7c3ddd19bbf7a4835641bb |
| BLAKE2b-256 | 8026577af2c24a4beda8f8691683df6f313fdd5fc7280c37533f0d72987a159d |
### Provenance

The following attestation bundles were made for `mcp_ollama-0.1.3.tar.gz`:

Publisher: publish.yml on emgeee/mcp-ollama

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: mcp_ollama-0.1.3.tar.gz
- Subject digest: 9e3017047721cc43da7192118e35d9cae76ff828fe08a27fcac56be64a12145f
- Sigstore transparency entry: 168769643
- Sigstore integration time:
- Permalink: emgeee/mcp-ollama@b2f1d289c30725c0af36481244d024c711c9351d
- Branch / Tag: refs/heads/main
- Owner: https://github.com/emgeee
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@b2f1d289c30725c0af36481244d024c711c9351d
- Trigger Event: push
## File details

Details for the file `mcp_ollama-0.1.3-py3-none-any.whl`.

### File metadata
- Download URL: mcp_ollama-0.1.3-py3-none-any.whl
- Upload date:
- Size: 4.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.12.8
### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | 57ec190b903a8ca78f3fd6e9fa77f56f3951741fa5fb0df29348ad40595f8ec6 |
| MD5 | 0052a4347e33bfe0fc3bd929f6b23dd4 |
| BLAKE2b-256 | f854cdaacf2c878d2aa6c21b06318e6ee3307b767fe75695266d85c469f44072 |
### Provenance

The following attestation bundles were made for `mcp_ollama-0.1.3-py3-none-any.whl`:

Publisher: publish.yml on emgeee/mcp-ollama

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: mcp_ollama-0.1.3-py3-none-any.whl
- Subject digest: 57ec190b903a8ca78f3fd6e9fa77f56f3951741fa5fb0df29348ad40595f8ec6
- Sigstore transparency entry: 168769649
- Sigstore integration time:
- Permalink: emgeee/mcp-ollama@b2f1d289c30725c0af36481244d024c711c9351d
- Branch / Tag: refs/heads/main
- Owner: https://github.com/emgeee
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@b2f1d289c30725c0af36481244d024c711c9351d
- Trigger Event: push