# MediaLLM MCP Server
An MCP server that provides AI-powered media processing for FFmpeg operations through natural language. MediaLLM converts natural-language requests into precise FFmpeg commands and scans the workspace for media files.
## Installation

```shell
# Using pip
pip install mediallm-mcp

# Using uv (recommended)
uv add mediallm-mcp
```
## Usage

```shell
# STDIO (default)
mediallm-mcp

# Streamable HTTP (default path: /mcp)
mediallm-mcp --http --port 3001

# SSE
mediallm-mcp --sse --port 3001

# Optional: customize the MCP HTTP endpoint path (default: /mcp)
mediallm-mcp --http --port 3001 --path /api/mcp
```
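Whichever transport is used, the wire format is JSON-RPC as defined by the MCP specification. As a rough sketch of what a Streamable HTTP client POSTs to the `/mcp` endpoint first (the payload shape follows the MCP spec; the `clientInfo` values and protocol version here are illustrative, not specific to mediallm-mcp):

```python
import json

# JSON-RPC "initialize" request an MCP client sends first.
# Shape per the MCP specification; clientInfo values are illustrative.
initialize_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2025-03-26",
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1.0"},
    },
}

body = json.dumps(initialize_request)
print(body)
```

A client would POST this body with `Content-Type: application/json` and then continue the MCP handshake over the same endpoint.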
## Running in Docker

```shell
# Build image
cd packages/mediallm-mcp
docker build -t mediallm-mcp .

# Run with the media directory mounted and the HTTP port exposed
docker run -it --rm \
  -p 8080:8080 \
  -v /path/to/media:/workspace \
  mediallm-mcp
```

The MCP endpoint defaults to http://localhost:8080/mcp.
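For repeatable setups, the same run command can be expressed as a Compose file. This is a sketch, not shipped with the project: the service name, host path, and the `host.docker.internal` address (a Docker Desktop convention for reaching a host-side Ollama) are assumptions you should adjust; the environment variable names are the ones documented in the Environment Variables section below.

```yaml
# docker-compose.yml - illustrative sketch; adjust paths and ports
services:
  mediallm-mcp:
    image: mediallm-mcp
    ports:
      - "8080:8080"
    volumes:
      - /path/to/media:/workspace
    environment:
      MEDIALLM_WORKSPACE: /workspace
      MEDIALLM_OLLAMA_HOST: http://host.docker.internal:11434
```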
## Accessing from Claude Desktop

Add to `claude_desktop_config.json`:

```json
{
  "mcpServers": {
    "mediallm-mcp": {
      "command": "uvx",
      "args": ["mediallm-mcp"],
      "env": {}
    }
  }
}
```
Config file location:

- macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
- Windows: `%APPDATA%\Claude\claude_desktop_config.json`
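If the media library lives somewhere other than the directory the client launches the server from, the `env` block can point MediaLLM at it explicitly. A sketch using the documented `MEDIALLM_WORKSPACE` variable (the path is a placeholder):

```json
{
  "mcpServers": {
    "mediallm-mcp": {
      "command": "uvx",
      "args": ["mediallm-mcp"],
      "env": {
        "MEDIALLM_WORKSPACE": "/path/to/media"
      }
    }
  }
}
```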
## Accessing from Claude Code

Add to `.mcp.json` in the project root:

```json
{
  "mcpServers": {
    "mediallm-mcp": {
      "command": "uvx",
      "args": ["mediallm-mcp"],
      "env": {}
    }
  }
}
```
## Accessing from Cursor

Add to `.cursor/mcp.json` in the project root:

```json
{
  "mcpServers": {
    "mediallm-mcp": {
      "command": "uvx",
      "args": ["mediallm-mcp"],
      "env": {}
    }
  }
}
```
## Environment Variables (Optional) for MCP Configuration

- `MEDIALLM_WORKSPACE` - media directory to scan (default: current working directory)
- `MEDIALLM_MODEL` - override the LLM model (default: `llama3.1:latest`)
- `MEDIALLM_OLLAMA_HOST` - Ollama server URL (default: `http://localhost:11434`)
- `MEDIALLM_OUTPUT_DIR` - output directory (default: current working directory)
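A minimal sketch of how a launcher script might resolve these settings, using the variable names and defaults listed above (the `resolve_settings` helper is hypothetical, not part of mediallm-mcp):

```python
import os

def resolve_settings(env=os.environ):
    """Resolve MediaLLM settings from the environment, falling back to
    the documented defaults."""
    cwd = os.getcwd()
    return {
        "workspace": env.get("MEDIALLM_WORKSPACE", cwd),
        "model": env.get("MEDIALLM_MODEL", "llama3.1:latest"),
        "ollama_host": env.get("MEDIALLM_OLLAMA_HOST", "http://localhost:11434"),
        "output_dir": env.get("MEDIALLM_OUTPUT_DIR", cwd),
    }

settings = resolve_settings({"MEDIALLM_MODEL": "qwen2.5:7b"})
print(settings["model"])        # overridden value
print(settings["ollama_host"])  # documented default
```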
## Debugging

Use the MCP Inspector to test the connection:

```shell
# Activate the virtual environment
cd packages/mediallm-mcp && uv sync --all-extras && source .venv/bin/activate

# Run the MCP Inspector
npx @modelcontextprotocol/inspector mediallm-mcp
```
### MCP Inspector: Request timed out (-32001)

If testing long-running tools such as `generate_command` fails with "MCP error -32001: Request timed out", increase the Inspector client timeouts in its Configuration panel:

- Request Timeout: 300000 (5 minutes)
- Reset Timeout on Progress: true
- Maximum Total Timeout: 900000 (15 minutes)
These values follow MCP guidance: per-request timeouts should be configurable, may be reset when progress notifications arrive, and should still be bounded by a maximum overall timeout. See the MCP spec section on timeouts and the Inspector discussion of default client timeouts for context.
- Spec: MCP Lifecycle – Timeouts
- Inspector discussion: Set longer request times for Inspector client
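The timeout behaviour those three settings describe can be sketched as a small policy object: a per-request deadline that is pushed back on each progress notification, but never past a hard cap on the total elapsed time. This is an illustration of the MCP guidance, not the Inspector's actual implementation:

```python
class RequestTimeoutPolicy:
    """Per-request timeout that can reset on progress, capped by a
    maximum total timeout (all times in milliseconds)."""

    def __init__(self, request_timeout=300_000, reset_on_progress=True,
                 max_total_timeout=900_000):
        self.request_timeout = request_timeout
        self.reset_on_progress = reset_on_progress
        self.max_total_timeout = max_total_timeout

    def start(self, now_ms):
        self.started = now_ms
        self.deadline = now_ms + self.request_timeout

    def on_progress(self, now_ms):
        # Progress pushes the deadline back, but never beyond the hard cap.
        if self.reset_on_progress:
            hard_cap = self.started + self.max_total_timeout
            self.deadline = min(now_ms + self.request_timeout, hard_cap)

    def timed_out(self, now_ms):
        return now_ms >= self.deadline

policy = RequestTimeoutPolicy()
policy.start(0)
policy.on_progress(250_000)        # progress at ~4 min resets the clock
print(policy.timed_out(400_000))   # False: deadline moved to 550_000
print(policy.timed_out(600_000))   # True: past the extended deadline
```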
## File details

Details for the file `mediallm_mcp-0.0.4.tar.gz`.

### File metadata

- Download URL: mediallm_mcp-0.0.4.tar.gz
- Upload date:
- Size: 7.8 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7
### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `87d84b2701d2f9ef9fe5af245797fa948959d1515da06979eeec4f00fc210fc2` |
| MD5 | `e14110bb5a596038cb34dde6ad02bce6` |
| BLAKE2b-256 | `e4880f3ce5904ca3822e603ca23400ce589df041ebc3497b18121d5d93c3c95b` |
### Provenance

The following attestation bundles were made for `mediallm_mcp-0.0.4.tar.gz`:

Publisher: publish-mediallm-mcp.yml on iamarunbrahma/mediallm

Statement:

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: mediallm_mcp-0.0.4.tar.gz
- Subject digest: 87d84b2701d2f9ef9fe5af245797fa948959d1515da06979eeec4f00fc210fc2
- Sigstore transparency entry: 903088784
- Sigstore integration time:
- Permalink: iamarunbrahma/mediallm@60f82364c1066f310fa0df3926b238bbc71f2a13
- Branch / Tag: refs/heads/main
- Owner: https://github.com/iamarunbrahma
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish-mediallm-mcp.yml@60f82364c1066f310fa0df3926b238bbc71f2a13
- Trigger Event: workflow_dispatch
## File details

Details for the file `mediallm_mcp-0.0.4-py3-none-any.whl`.

### File metadata

- Download URL: mediallm_mcp-0.0.4-py3-none-any.whl
- Upload date:
- Size: 6.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7
### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `2dabbf263c4994d52ea9acf635c432d12fa02f82062344624d03da02f6efba10` |
| MD5 | `bf72c835ea60e2eed15c32833684b13d` |
| BLAKE2b-256 | `310f51c06df388aec1497231f7ab932ebdbef6ec88230023016be495109b56ec` |
### Provenance

The following attestation bundles were made for `mediallm_mcp-0.0.4-py3-none-any.whl`:

Publisher: publish-mediallm-mcp.yml on iamarunbrahma/mediallm

Statement:

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: mediallm_mcp-0.0.4-py3-none-any.whl
- Subject digest: 2dabbf263c4994d52ea9acf635c432d12fa02f82062344624d03da02f6efba10
- Sigstore transparency entry: 903088827
- Sigstore integration time:
- Permalink: iamarunbrahma/mediallm@60f82364c1066f310fa0df3926b238bbc71f2a13
- Branch / Tag: refs/heads/main
- Owner: https://github.com/iamarunbrahma
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish-mediallm-mcp.yml@60f82364c1066f310fa0df3926b238bbc71f2a13
- Trigger Event: workflow_dispatch