

MediaLLM MCP Server

An MCP server that exposes AI-powered media processing for FFmpeg through natural language commands. MediaLLM converts natural-language requests into precise FFmpeg commands and scans workspaces for media files.

Full Documentation

Installation

# Using pip
pip install mediallm-mcp

# Using uv (recommended)
uv add mediallm-mcp

Usage

# STDIO (default)
mediallm-mcp

# Streamable HTTP (default path: /mcp)
mediallm-mcp --http --port 3001

# SSE
mediallm-mcp --sse --port 3001

# Optional: customize MCP HTTP endpoint path (default: /mcp)
mediallm-mcp --http --port 3001 --path /api/mcp
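When the server runs over Streamable HTTP, clients speak JSON-RPC 2.0 over POST to the endpoint path. As a rough sketch (the field values below follow the generic MCP initialize handshake, not this package's specific API; the host and port match the example commands above), the first request a client would send to `http://localhost:3001/mcp` might look like:

```python
import json

# Sketch of the JSON-RPC 2.0 "initialize" request an MCP client POSTs
# to the server's HTTP endpoint (e.g. http://localhost:3001/mcp).
# "example-client" and the chosen protocolVersion are illustrative.
initialize_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1.0"},
    },
}

body = json.dumps(initialize_request)
```

In practice an MCP client library handles this handshake for you; the payload is shown only to make the transport concrete.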

Running in Docker

# Build image
cd packages/mediallm-mcp
docker build -t mediallm-mcp .

# Run with media directory mounted and HTTP port exposed
docker run -it --rm \
  -p 8080:8080 \
  -v /path/to/media:/workspace \
  mediallm-mcp

# MCP endpoint (default): http://localhost:8080/mcp

Accessing from Claude Desktop

Add to claude_desktop_config.json:

{
  "mcpServers": {
    "mediallm-mcp": {
      "command": "uvx",
      "args": ["mediallm-mcp"],
      "env": {}
    }
  }
}

Config file location:

  • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  • Windows: %APPDATA%\Claude\claude_desktop_config.json
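If you already have other servers in `claude_desktop_config.json`, the new entry must be merged rather than overwritten. A minimal sketch of such a merge (a hypothetical helper, not part of mediallm-mcp) looks like:

```python
import json
from pathlib import Path

# Hypothetical helper: merge the mediallm-mcp server entry into an
# existing claude_desktop_config.json without clobbering other servers.
def add_mediallm_server(config_path: Path) -> dict:
    config = json.loads(config_path.read_text()) if config_path.exists() else {}
    # setdefault keeps any servers that are already configured.
    config.setdefault("mcpServers", {})["mediallm-mcp"] = {
        "command": "uvx",
        "args": ["mediallm-mcp"],
        "env": {},
    }
    config_path.write_text(json.dumps(config, indent=2))
    return config
```

The same entry works verbatim for the Claude Code and Cursor configs shown below, since all three use the `mcpServers` shape.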

Accessing from Claude Code

Add to .mcp.json in project root:

{
  "mcpServers": {
    "mediallm-mcp": {
      "command": "uvx",
      "args": ["mediallm-mcp"],
      "env": {}
    }
  }
}

Accessing from Cursor

Use the "Add to Cursor" button in the documentation, or manually add to .cursor/mcp.json:

{
  "mcpServers": {
    "mediallm-mcp": {
      "command": "uvx",
      "args": ["mediallm-mcp"],
      "env": {}
    }
  }
}

Optional Environment Variables for MCP Configuration

  • MEDIALLM_WORKSPACE - Specify media directory (default: current working directory)
  • MEDIALLM_MODEL - Override LLM model (default: llama3.1:latest)
  • MEDIALLM_OLLAMA_HOST - Ollama server URL (default: http://localhost:11434)
  • MEDIALLM_OUTPUT_DIR - Output directory (default: current working directory)
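The resolution order implied above (environment value wins, documented default otherwise) can be sketched as follows. This mirrors the documented defaults; it is not mediallm's actual implementation:

```python
import os

# Documented defaults from the list above (sketch, not package code).
DEFAULTS = {
    "MEDIALLM_WORKSPACE": os.getcwd(),
    "MEDIALLM_MODEL": "llama3.1:latest",
    "MEDIALLM_OLLAMA_HOST": "http://localhost:11434",
    "MEDIALLM_OUTPUT_DIR": os.getcwd(),
}

def resolve(name: str) -> str:
    # Environment value wins; otherwise fall back to the documented default.
    return os.environ.get(name, DEFAULTS[name])
```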

Debugging

Use MCP inspector to test the connection:

  1. Activate the virtual environment:
cd packages/mediallm-mcp && uv sync --all-extras && source .venv/bin/activate
  2. Run the MCP inspector:
npx @modelcontextprotocol/inspector mediallm-mcp

MCP Inspector: Request timed out (-32001)

If testing long-running tools such as generate_command fails with "MCP error -32001: Request timed out", increase the Inspector client timeouts in its Configuration panel:

  • Request Timeout: 300000 (5 minutes)
  • Reset Timeout on Progress: true
  • Maximum Total Timeout: 900000 (15 minutes)

These values follow MCP guidance to allow configurable per-request timeouts and to optionally reset timeouts on progress while still enforcing a maximum overall timeout. See the MCP spec section on timeouts and the Inspector discussion about default client timeouts for context.
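The interaction between the three settings can be illustrated with a toy simulation (this is not Inspector code, just a sketch of the semantics): each progress notification pushes the per-request deadline forward, while the maximum total timeout still bounds the request overall.

```python
# Toy model of "reset timeout on progress" with a maximum total timeout.
# events: list of (timestamp, kind) pairs, kind is "progress" or "done".
def run_with_timeouts(events, request_timeout, max_total_timeout):
    start = 0.0
    deadline = start + request_timeout
    for ts, kind in events:
        # A request dies if it crosses the current deadline or the hard cap.
        if ts > min(deadline, start + max_total_timeout):
            return "timed out"
        if kind == "progress":
            deadline = ts + request_timeout  # reset on progress
        elif kind == "done":
            return "completed"
    return "timed out"
```

With steady progress a slow tool stays alive well past the per-request timeout, which is exactly why raising "Request Timeout" alone is not enough for tools that report progress slowly.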

Download files


Source Distribution

mediallm_mcp-0.0.4.tar.gz (7.8 kB, Source)

Built Distribution


mediallm_mcp-0.0.4-py3-none-any.whl (6.3 kB, Python 3)

File details

Details for the file mediallm_mcp-0.0.4.tar.gz.

File metadata

  • Download URL: mediallm_mcp-0.0.4.tar.gz
  • Upload date:
  • Size: 7.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for mediallm_mcp-0.0.4.tar.gz:

  • SHA256: 87d84b2701d2f9ef9fe5af245797fa948959d1515da06979eeec4f00fc210fc2
  • MD5: e14110bb5a596038cb34dde6ad02bce6
  • BLAKE2b-256: e4880f3ce5904ca3822e603ca23400ce589df041ebc3497b18121d5d93c3c95b
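To verify a download against the published digest, hash the file with Python's standard hashlib and compare. The bytes below are placeholder content for illustration; in practice you would read the downloaded mediallm_mcp-0.0.4.tar.gz:

```python
import hashlib

# Compare a file's SHA256 digest against a published hex digest.
def sha256_matches(data: bytes, expected_hex: str) -> bool:
    return hashlib.sha256(data).hexdigest() == expected_hex

# Placeholder bytes; substitute the real archive contents, e.g.
# Path("mediallm_mcp-0.0.4.tar.gz").read_bytes()
data = b"example archive bytes"
ok = sha256_matches(data, hashlib.sha256(data).hexdigest())
```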


Provenance

The following attestation bundles were made for mediallm_mcp-0.0.4.tar.gz:

Publisher: publish-mediallm-mcp.yml on iamarunbrahma/mediallm

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file mediallm_mcp-0.0.4-py3-none-any.whl.

File metadata

  • Download URL: mediallm_mcp-0.0.4-py3-none-any.whl
  • Upload date:
  • Size: 6.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for mediallm_mcp-0.0.4-py3-none-any.whl:

  • SHA256: 2dabbf263c4994d52ea9acf635c432d12fa02f82062344624d03da02f6efba10
  • MD5: bf72c835ea60e2eed15c32833684b13d
  • BLAKE2b-256: 310f51c06df388aec1497231f7ab932ebdbef6ec88230023016be495109b56ec


Provenance

The following attestation bundles were made for mediallm_mcp-0.0.4-py3-none-any.whl:

Publisher: publish-mediallm-mcp.yml on iamarunbrahma/mediallm

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
