Langfuse MCP server for accessing and analyzing telemetry data via natural language

Langfuse MCP (Model Context Protocol)

This project provides a Model Context Protocol (MCP) server for Langfuse, allowing AI agents to query Langfuse trace data for better debugging and observability.

Quick Start with Cursor

Add to Cursor

Installation Options

🎯 From Cursor IDE: Click the button above (works seamlessly!)
🌐 From GitHub Web: Copy this deeplink and paste into your browser address bar:

cursor://anysphere.cursor-deeplink/mcp/install?name=langfuse-mcp&config=eyJjb21tYW5kIjoidXZ4IiwiYXJncyI6WyJsYW5nZnVzZS1tY3AiLCItLXB1YmxpYy1rZXkiLCJZT1VSX1BVQkxJQ19LRVkiLCItLXNlY3JldC1rZXkiLCJZT1VSX1NFQ1JFVF9LRVkiLCItLWhvc3QiLCJodHRwczovL2Nsb3VkLmxhbmdmdXNlLmNvbSJdfQ==

⚙️ Manual Setup: See Configuration section below

💡 Note: The "Add to Cursor" button only works from within Cursor IDE due to browser security restrictions on custom protocols (cursor://). This is normal and expected behavior per Cursor's documentation.

After installation: Replace YOUR_PUBLIC_KEY and YOUR_SECRET_KEY with your actual Langfuse credentials in Cursor's MCP settings.
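
The `config` query parameter in the deeplink above is just base64-encoded JSON describing the server command. If you want to inspect what the button will write into Cursor's MCP settings before installing, you can decode it yourself; this short sketch does exactly that:

```python
import base64
import json

# Decode the `config` parameter from the Cursor deeplink shown above.
# It expands to the same command/args pair used in the manual setup.
encoded = "eyJjb21tYW5kIjoidXZ4IiwiYXJncyI6WyJsYW5nZnVzZS1tY3AiLCItLXB1YmxpYy1rZXkiLCJZT1VSX1BVQkxJQ19LRVkiLCItLXNlY3JldC1rZXkiLCJZT1VSX1NFQ1JFVF9LRVkiLCItLWhvc3QiLCJodHRwczovL2Nsb3VkLmxhbmdmdXNlLmNvbSJdfQ=="
config = json.loads(base64.b64decode(encoded))
print(config["command"])   # uvx
print(config["args"][0])   # langfuse-mcp
```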

Features

  • Integration with Langfuse for trace and observation data
  • Tool suite for AI agents to query trace data
  • Exception and error tracking capabilities
  • Session and user activity monitoring

Available Tools

The MCP server provides the following tools for AI agents:

  • fetch_traces - Find traces based on criteria like user ID, session ID, etc.
  • fetch_trace - Get a specific trace by ID
  • fetch_observations - Get observations filtered by type
  • fetch_observation - Get a specific observation by ID
  • fetch_sessions - List sessions in the current project
  • get_session_details - Get detailed information about a session
  • get_user_sessions - Get all sessions for a user
  • find_exceptions - Find exceptions and errors in traces
  • find_exceptions_in_file - Find exceptions in a specific file
  • get_exception_details - Get detailed information about an exception
  • get_error_count - Get the count of errors
  • get_data_schema - Get schema information for the data structures
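
MCP clients invoke these tools over JSON-RPC with the `tools/call` method. As a rough sketch of the wire format, here is what a request for `fetch_traces` might look like; the argument names (`user_id`, `limit`) are illustrative assumptions, not the server's exact schema (use `get_data_schema` for that):

```python
import json

# Hypothetical sketch of the JSON-RPC message an MCP client sends to invoke
# a tool. The tool name comes from the list above; the arguments shown are
# assumptions for illustration only.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "fetch_traces",
        "arguments": {"user_id": "user-123", "limit": 10},
    },
}
print(json.dumps(request, indent=2))
```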

Setup

Install uv

First, make sure uv is installed. For installation instructions, see the uv installation docs.

If you already have an older version of uv installed, you might need to update it with uv self update.

Installation

Requirement: The server now depends on the Langfuse Python SDK v3. Installations automatically pull langfuse>=3.0.0 and require Python 3.10–3.13 while upstream SDK support for 3.14 is pending.

uv pip install langfuse-mcp

If you're iterating on this repository, install the local checkout instead of PyPI:

# from the repo root
uv pip install --editable .

Recommended local environment

For development we suggest creating an isolated environment pinned to Python 3.11 (the version used in CI):

uv venv --python 3.11 .venv
source .venv/bin/activate  # On Windows use: .venv\Scripts\activate
uv pip install --python .venv/bin/python -e .

All subsequent examples assume the virtual environment is activated.

Obtain Langfuse credentials

You'll need your Langfuse public key, secret key, and host URL (found in your Langfuse project settings).

You can store these in a local .env file instead of passing CLI flags each time:

LANGFUSE_PUBLIC_KEY=your_public_key
LANGFUSE_SECRET_KEY=your_secret_key
LANGFUSE_HOST=https://cloud.langfuse.com

When present, the MCP server reads these values automatically. CLI arguments still override the environment if provided.

Running the Server

Run the server using uvx or the project virtual environment:

uvx langfuse-mcp --public-key YOUR_KEY --secret-key YOUR_SECRET --host https://cloud.langfuse.com

# or, once inside the repo virtual environment
langfuse-mcp --public-key YOUR_KEY --secret-key YOUR_SECRET --host https://cloud.langfuse.com

Local checkout tip: During development run uv run --from /path/to/langfuse-mcp langfuse-mcp ... (or uv run python -m langfuse_mcp ...) so uv executes the code in your working tree. Using the PyPI shortcut skips repository-only changes such as the new environment-based credential defaults and logging tweaks.

The server writes diagnostic logs to /tmp/langfuse_mcp.log. Remove the --host switch if you are targeting the default Cloud endpoint. Use --log-level (e.g., --log-level DEBUG) and --log-to-console to control verbosity during debugging.

Run with Docker

Option 1: Pull from GitHub Container Registry (Recommended)

Pull and run the pre-built image:

docker pull ghcr.io/avivsinai/langfuse-mcp:latest
docker run --rm -i \
  -e LANGFUSE_PUBLIC_KEY=YOUR_PUBLIC_KEY \
  -e LANGFUSE_SECRET_KEY=YOUR_SECRET_KEY \
  -e LANGFUSE_HOST=https://cloud.langfuse.com \
  -e LANGFUSE_MCP_LOG_FILE=/logs/langfuse_mcp.log \
  -v "$(pwd)/logs:/logs" \
  ghcr.io/avivsinai/langfuse-mcp:latest

Available tags:

  • latest - Most recent release
  • v0.2.0 - Specific version
  • 0.2 - Major.minor version

Option 2: Build from source

Build the image from the repository root so the container installs the current checkout instead of the latest PyPI release:

docker build -t langfuse-logs-mcp .
docker run --rm -i \
  -e LANGFUSE_PUBLIC_KEY=YOUR_PUBLIC_KEY \
  -e LANGFUSE_SECRET_KEY=YOUR_SECRET_KEY \
  -e LANGFUSE_HOST=https://cloud.langfuse.com \
  -e LANGFUSE_MCP_LOG_FILE=/logs/langfuse_mcp.log \
  -v "$(pwd)/logs:/logs" \
  langfuse-logs-mcp

Why no -t? Allocating a pseudo-TTY can interfere with MCP stdio clients. Use -i only so the server communicates over plain stdin/stdout.

The Dockerfile copies the local source tree and installs it with pip install ., so the container always runs your latest commits - a must while testing features that have not shipped on PyPI.

Configuration with MCP clients

Configure for Cursor

Create a .cursor/mcp.json file in your project root:

{
  "mcpServers": {
    "langfuse": {
      "command": "uvx",
      "args": ["langfuse-mcp", "--public-key", "YOUR_KEY", "--secret-key", "YOUR_SECRET", "--host", "https://cloud.langfuse.com"]
    }
  }
}

Configure for Claude Desktop

Add to your Claude settings:

{
  "command": "uvx",
  "args": ["langfuse-mcp"],
  "type": "stdio",
  "env": {
    "LANGFUSE_PUBLIC_KEY": "YOUR_KEY",
    "LANGFUSE_SECRET_KEY": "YOUR_SECRET",
    "LANGFUSE_HOST": "https://cloud.langfuse.com"
  }
}

Output Modes

Each tool supports different output modes to control the level of detail in responses:

  • compact (default): Returns a summary with large values truncated
  • full_json_string: Returns the complete data as a JSON string
  • full_json_file: Saves the complete data to a file and returns a summary with file information
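
To make the difference concrete, here is an illustrative sketch of what compact mode does to a large field; the truncation threshold and suffix are assumptions, not the server's actual values:

```python
import json

# Illustrative sketch: "compact" truncates large string values in the
# summary, while "full_json_string" serializes everything. The threshold
# (MAX_LEN) is an assumed value for demonstration.
MAX_LEN = 50

def compact(data: dict) -> dict:
    out = {}
    for key, value in data.items():
        if isinstance(value, str) and len(value) > MAX_LEN:
            out[key] = value[:MAX_LEN] + "... [truncated]"
        else:
            out[key] = value
    return out

trace = {"id": "trace-1", "output": "x" * 500}
print(compact(trace)["output"])    # truncated summary value
print(len(json.dumps(trace)))      # full_json_string keeps everything
```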

Development

Clone the repository

git clone https://github.com/avivsinai/langfuse-mcp.git
cd langfuse-mcp

Create a virtual environment and install dependencies

uv venv --python 3.11 .venv
source .venv/bin/activate  # On Windows: .venv\Scripts\activate
uv pip install --python .venv/bin/python -e ".[dev]"

Set up environment variables

export LANGFUSE_SECRET_KEY="your-secret-key"
export LANGFUSE_PUBLIC_KEY="your-public-key"
export LANGFUSE_HOST="https://cloud.langfuse.com"  # Or your self-hosted URL

Testing

Run the unit test suite (mirrors CI):

pytest

To run the demo client:

uv run examples/langfuse_client_demo.py --public-key YOUR_PUBLIC_KEY --secret-key YOUR_SECRET_KEY

Version Management

This project uses dynamic versioning based on Git tags:

  1. The version is automatically determined from git tags using uv-dynamic-versioning
  2. To create a new release:
    • Tag your commit with git tag v0.1.2 (following semantic versioning)
    • Push the tag with git push --tags
    • Create a GitHub release from the tag
  3. The GitHub workflow will automatically build and publish the package with the correct version to PyPI

For a detailed history of changes, please see the CHANGELOG.md file.

Langfuse 3.x migration notes

  • The MCP server now uses the Langfuse Python SDK v3 resource clients (langfuse.api.trace.list, langfuse.api.observations.get_many, etc.) and must currently run on Python 3.10–3.13 because the upstream SDK still relies on Pydantic v1 internals.
  • Unit tests use a v3-style fake client that fails if legacy fetch_* helpers are invoked, helping catch regressions early.
  • Tool responses now include pagination metadata when the Langfuse API returns cursors, while retaining the existing MCP interface.
  • Diagnostic logs continue to stream to /tmp/langfuse_mcp.log; this is useful when verifying the upgraded integration against a live Langfuse deployment.
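
The pagination behavior can be sketched with a stub in place of the real v3 client. The `next_page` field and `list(page=...)` signature here are illustrative assumptions about the API shape, not the exact upstream interface:

```python
# Sketch of consuming cursor/page metadata, using a stub instead of the
# real Langfuse v3 resource client. Field names are assumptions.
class StubTraceApi:
    """Returns two pages of fake traces, then signals the end."""
    def list(self, page: int = 1):
        pages = {
            1: {"data": ["trace-a", "trace-b"], "meta": {"next_page": 2}},
            2: {"data": ["trace-c"], "meta": {"next_page": None}},
        }
        return pages[page]

def iter_all_traces(api):
    page = 1
    while page is not None:
        result = api.list(page=page)
        yield from result["data"]
        page = result["meta"]["next_page"]

print(list(iter_all_traces(StubTraceApi())))  # ['trace-a', 'trace-b', 'trace-c']
```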

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

License

This project is licensed under the MIT License - see the LICENSE file for details.

Cache Management

We use the cachetools library to implement efficient caching with proper size limits:

  • Uses cachetools.LRUCache for better reliability
  • Configurable cache size via the CACHE_SIZE constant
  • Automatically evicts the least recently used items when caches exceed their size limits
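
The project uses cachetools.LRUCache for this; the following dependency-free sketch just illustrates the eviction behavior described above (the CACHE_SIZE value is assumed for demonstration):

```python
from collections import OrderedDict

# Minimal LRU sketch: writing a key marks it most recently used, and the
# least recently used entry is evicted once the size limit is exceeded.
CACHE_SIZE = 2

class TinyLRU(OrderedDict):
    def __setitem__(self, key, value):
        super().__setitem__(key, value)
        self.move_to_end(key)
        if len(self) > CACHE_SIZE:
            self.popitem(last=False)  # evict least recently used entry

cache = TinyLRU()
cache["a"] = 1
cache["b"] = 2
cache["a"] = 1     # touch "a", so "b" becomes least recently used
cache["c"] = 3     # exceeds CACHE_SIZE: evicts "b"
print(list(cache))  # ['a', 'c']
```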
