MCP server for Ollama integration

Project description

MCP Ollama

A Model Context Protocol (MCP) server for integrating Ollama with Claude Desktop or other MCP clients.

Requirements

  • Python 3.10 or higher
  • Ollama installed and running (https://ollama.com/download)
  • At least one model pulled with Ollama (e.g., ollama pull llama2)
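To confirm the last two requirements are met, you can query Ollama's REST API directly. A minimal sketch, assuming Ollama's default port 11434 (the helper name is illustrative, not part of this package):

```python
import json
import urllib.request

def list_local_models(base_url: str = "http://localhost:11434") -> list[str]:
    """Return the names of locally pulled Ollama models, or [] if unreachable."""
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=5) as resp:
            data = json.load(resp)
        return [m["name"] for m in data.get("models", [])]
    except OSError:
        # Covers connection refused, timeouts, and DNS failures.
        return []
```

An empty list means either Ollama is not running or no model has been pulled yet.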

Configure Claude Desktop

Add to your Claude Desktop configuration (~/Library/Application Support/Claude/claude_desktop_config.json on macOS, %APPDATA%\Claude\claude_desktop_config.json on Windows):

{
  "mcpServers": {
    "ollama": {
      "command": "uvx",
      "args": [
        "mcp-ollama"
      ]
    }
  }
}

Development

Install in development mode:

git clone https://github.com/yourusername/mcp-ollama.git
cd mcp-ollama
uv sync

Test with MCP Inspector:

mcp dev src/mcp_ollama/server.py

Features

The server provides three main tools:

  • list_models - List all downloaded Ollama models
  • show_model - Get detailed information about a specific model
  • ask_model - Ask a question to a specified model
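Under the hood, a tool like ask_model amounts to a POST against Ollama's /api/generate endpoint. A rough sketch of the equivalent direct call (the helper names and non-streaming payload shape follow Ollama's REST API, not this package's internals):

```python
import json
import urllib.request

def build_generate_payload(model: str, prompt: str) -> dict:
    """Non-streaming request body for Ollama's /api/generate endpoint."""
    return {"model": model, "prompt": prompt, "stream": False}

def ask_ollama(model: str, prompt: str, base_url: str = "http://localhost:11434") -> str:
    """Send a one-shot prompt to a local Ollama model and return its response text."""
    req = urllib.request.Request(
        f"{base_url}/api/generate",
        data=json.dumps(build_generate_payload(model, prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=120) as resp:
        return json.load(resp)["response"]
```

Calling `ask_ollama("llama2", "Why is the sky blue?")` requires a running Ollama instance with that model pulled.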

License

MIT

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

mcp_ollama-0.1.3.tar.gz (42.9 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

mcp_ollama-0.1.3-py3-none-any.whl (4.5 kB)

Uploaded Python 3

File details

Details for the file mcp_ollama-0.1.3.tar.gz.

File metadata

  • Download URL: mcp_ollama-0.1.3.tar.gz
  • Upload date:
  • Size: 42.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.8

File hashes

Hashes for mcp_ollama-0.1.3.tar.gz:

  • SHA256: 9e3017047721cc43da7192118e35d9cae76ff828fe08a27fcac56be64a12145f
  • MD5: 85735da1da7c3ddd19bbf7a4835641bb
  • BLAKE2b-256: 8026577af2c24a4beda8f8691683df6f313fdd5fc7280c37533f0d72987a159d

See more details on using hashes here.

Provenance

The following attestation bundles were made for mcp_ollama-0.1.3.tar.gz:

Publisher: publish.yml on emgeee/mcp-ollama

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file mcp_ollama-0.1.3-py3-none-any.whl.

File metadata

  • Download URL: mcp_ollama-0.1.3-py3-none-any.whl
  • Upload date:
  • Size: 4.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.8

File hashes

Hashes for mcp_ollama-0.1.3-py3-none-any.whl:

  • SHA256: 57ec190b903a8ca78f3fd6e9fa77f56f3951741fa5fb0df29348ad40595f8ec6
  • MD5: 0052a4347e33bfe0fc3bd929f6b23dd4
  • BLAKE2b-256: f854cdaacf2c878d2aa6c21b06318e6ee3307b767fe75695266d85c469f44072

See more details on using hashes here.

Provenance

The following attestation bundles were made for mcp_ollama-0.1.3-py3-none-any.whl:

Publisher: publish.yml on emgeee/mcp-ollama

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
