
Langfuse MCP server for accessing and analyzing telemetry data via natural language


Langfuse MCP Server

Requires Python 3.10+ · MIT License

A comprehensive Model Context Protocol (MCP) server for Langfuse observability. It provides 18 tools for AI agents to query traces, debug errors, analyze sessions, and manage prompts.

Quick Start

uvx langfuse-mcp --public-key YOUR_KEY --secret-key YOUR_SECRET --host https://cloud.langfuse.com

Or set environment variables and run without flags:

export LANGFUSE_PUBLIC_KEY=pk-...
export LANGFUSE_SECRET_KEY=sk-...
export LANGFUSE_HOST=https://cloud.langfuse.com
uvx langfuse-mcp

Why langfuse-mcp?

|                        | langfuse-mcp | Official Langfuse MCP |
|------------------------|--------------|-----------------------|
| Tools                  | 18           | 2-4                   |
| Traces & Observations  | Yes          | No                    |
| Sessions & Users       | Yes          | No                    |
| Exception Tracking     | Yes          | No                    |
| Prompt Management      | Yes          | Yes                   |
| Language               | Python       | TypeScript            |
| Selective Tool Loading | Yes          | No                    |

This project provides a full observability toolkit — traces, observations, sessions, exceptions, and prompts — while the official Langfuse MCP focuses on prompt management only.

Available Tools

Traces

| Tool | Description |
|------|-------------|
| `fetch_traces` | Search/filter traces with pagination |
| `fetch_trace` | Fetch a specific trace by ID |

Observations

| Tool | Description |
|------|-------------|
| `fetch_observations` | Search/filter observations with pagination |
| `fetch_observation` | Fetch a specific observation by ID |

Sessions

| Tool | Description |
|------|-------------|
| `fetch_sessions` | List recent sessions with pagination |
| `get_session_details` | Get detailed session info by ID |
| `get_user_sessions` | Get all sessions for a user |

Exceptions

| Tool | Description |
|------|-------------|
| `find_exceptions` | Find exceptions grouped by file/function/type |
| `find_exceptions_in_file` | Find exceptions in a specific file |
| `get_exception_details` | Get detailed info about a specific exception |
| `get_error_count` | Get total error count |

Prompts

| Tool | Description |
|------|-------------|
| `get_prompt` | Fetch prompt with resolved dependencies |
| `get_prompt_unresolved` | Fetch prompt with dependency tags intact |
| `list_prompts` | List/filter prompts with pagination |
| `create_text_prompt` | Create new text prompt version |
| `create_chat_prompt` | Create new chat prompt version |
| `update_prompt_labels` | Update labels for a prompt version |

Schema

| Tool | Description |
|------|-------------|
| `get_data_schema` | Get schema information for response structures |

Installation

Using uvx (recommended)

uvx langfuse-mcp --help

Using pip

pip install langfuse-mcp
langfuse-mcp --help

Using Docker

docker pull ghcr.io/avivsinai/langfuse-mcp:latest
docker run --rm -i \
  -e LANGFUSE_PUBLIC_KEY=pk-... \
  -e LANGFUSE_SECRET_KEY=sk-... \
  -e LANGFUSE_HOST=https://cloud.langfuse.com \
  ghcr.io/avivsinai/langfuse-mcp:latest

Configuration

Claude Code

Create .mcp.json in your project root:

{
  "mcpServers": {
    "langfuse": {
      "command": "uvx",
      "args": ["langfuse-mcp"],
      "env": {
        "LANGFUSE_PUBLIC_KEY": "pk-...",
        "LANGFUSE_SECRET_KEY": "sk-...",
        "LANGFUSE_HOST": "https://cloud.langfuse.com"
      }
    }
  }
}

Codex CLI

Add to ~/.codex/config.toml:

[mcp_servers.langfuse]
command = "uvx"
args = ["langfuse-mcp"]

[mcp_servers.langfuse.env]
LANGFUSE_PUBLIC_KEY = "pk-..."
LANGFUSE_SECRET_KEY = "sk-..."
LANGFUSE_HOST = "https://cloud.langfuse.com"

Or via CLI:

codex mcp add langfuse \
  --env LANGFUSE_PUBLIC_KEY=pk-... \
  --env LANGFUSE_SECRET_KEY=sk-... \
  --env LANGFUSE_HOST=https://cloud.langfuse.com \
  -- uvx langfuse-mcp

Cursor

Create .cursor/mcp.json in your project:

{
  "mcpServers": {
    "langfuse": {
      "command": "uvx",
      "args": ["langfuse-mcp"],
      "env": {
        "LANGFUSE_PUBLIC_KEY": "pk-...",
        "LANGFUSE_SECRET_KEY": "sk-...",
        "LANGFUSE_HOST": "https://cloud.langfuse.com"
      }
    }
  }
}

Or use the deeplink for quick setup:

cursor://anysphere.cursor-deeplink/mcp/install?name=langfuse-mcp&config=eyJjb21tYW5kIjoidXZ4IiwiYXJncyI6WyJsYW5nZnVzZS1tY3AiXX0=
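The deeplink's `config` query parameter is just the base64-encoded server definition. A quick Python sketch to decode and inspect it:

```python
import base64
import json

# Decode the `config` parameter from the Cursor deeplink above.
encoded = "eyJjb21tYW5kIjoidXZ4IiwiYXJncyI6WyJsYW5nZnVzZS1tY3AiXX0="
config = json.loads(base64.b64decode(encoded))
print(config)  # {'command': 'uvx', 'args': ['langfuse-mcp']}
```

Encoding your own `config` (e.g. with extra `args` or `env`) works the same way in reverse with `base64.b64encode`.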

Claude Desktop

Add to Claude Desktop settings:

{
  "mcpServers": {
    "langfuse": {
      "command": "uvx",
      "args": ["langfuse-mcp"],
      "env": {
        "LANGFUSE_PUBLIC_KEY": "pk-...",
        "LANGFUSE_SECRET_KEY": "sk-...",
        "LANGFUSE_HOST": "https://cloud.langfuse.com"
      }
    }
  }
}

Usage

Selective Tool Loading

Load only the tool groups you need to reduce token overhead:

# Load only trace and prompt tools
langfuse-mcp --tools traces,prompts

# Available groups: traces, observations, sessions, exceptions, prompts, schema

Or via environment variable:

export LANGFUSE_MCP_TOOLS=traces,prompts
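In MCP client configs, the flag can be passed through `args`. For example, a Claude Code `.mcp.json` (same shape as in the Configuration section) that loads only trace and prompt tools:

```json
{
  "mcpServers": {
    "langfuse": {
      "command": "uvx",
      "args": ["langfuse-mcp", "--tools", "traces,prompts"],
      "env": {
        "LANGFUSE_PUBLIC_KEY": "pk-...",
        "LANGFUSE_SECRET_KEY": "sk-...",
        "LANGFUSE_HOST": "https://cloud.langfuse.com"
      }
    }
  }
}
```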

Output Modes

Each tool supports different output modes:

| Mode | Description |
|------|-------------|
| `compact` | Summary with large values truncated (default) |
| `full_json_string` | Complete data as JSON string |
| `full_json_file` | Save to file, return summary with path |
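The server's actual truncation logic is internal; as a rough illustration of what `compact` mode does (the function name and character limit here are hypothetical), large string values in a result are cut down to a preview so responses stay token-cheap:

```python
# Hypothetical sketch of "compact" output: recursively truncate long
# strings in a nested tool result. The 200-char limit is illustrative,
# not the server's actual setting.
MAX_CHARS = 200

def compact(value):
    """Return a copy of `value` with long strings truncated."""
    if isinstance(value, str) and len(value) > MAX_CHARS:
        return value[:MAX_CHARS] + f"... [truncated {len(value) - MAX_CHARS} chars]"
    if isinstance(value, dict):
        return {k: compact(v) for k, v in value.items()}
    if isinstance(value, list):
        return [compact(v) for v in value]
    return value

trace = {"id": "trace-123", "output": "x" * 1000}
summary = compact(trace)
print(len(summary["output"]) < len(trace["output"]))  # True
```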

Logging

# Debug logging to console
langfuse-mcp --log-level DEBUG --log-to-console

# Custom log file location
export LANGFUSE_MCP_LOG_FILE=/var/log/langfuse_mcp.log

Default log location: /tmp/langfuse_mcp.log

Timeout Configuration

The Langfuse Python SDK defaults to a 5-second API timeout, which can be too aggressive for cloud APIs experiencing latency. This MCP server uses a more reasonable default of 30 seconds.

# Set via CLI
langfuse-mcp --timeout 60

# Or via environment variable
export LANGFUSE_TIMEOUT=60

If you experience timeout errors, try increasing the timeout value. The Langfuse cloud API occasionally experiences latency spikes.
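The precedence between the flag, the environment variable, and the default can be sketched as follows (function and variable names are hypothetical, not the server's internals):

```python
import os

# Hypothetical sketch of timeout resolution: an explicit CLI value wins,
# then the LANGFUSE_TIMEOUT environment variable, then the 30s default.
DEFAULT_TIMEOUT = 30.0

def resolve_timeout(cli_value=None):
    """Return the API timeout in seconds."""
    if cli_value is not None:
        return float(cli_value)
    env_value = os.environ.get("LANGFUSE_TIMEOUT")
    return float(env_value) if env_value else DEFAULT_TIMEOUT

print(resolve_timeout(cli_value=60))  # 60.0
```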

Development

git clone https://github.com/avivsinai/langfuse-mcp.git
cd langfuse-mcp

# Create virtual environment
uv venv --python 3.11 .venv
source .venv/bin/activate

# Install with dev dependencies
uv pip install -e ".[dev]"

# Run tests
pytest

# Lint and format
ruff check --fix . && ruff format .

Version Management

This project uses Git tags for versioning:

  1. Tag: git tag v1.0.0
  2. Push: git push --tags
  3. GitHub Actions builds and publishes to PyPI

See CHANGELOG.md for release history.

Contributing

Contributions are welcome! Please submit a pull request.

License

MIT License - see LICENSE for details.
