Langfuse MCP (Model Context Protocol)
Langfuse MCP server for accessing and analyzing telemetry data via natural language.
This project provides a Model Context Protocol (MCP) server for Langfuse, allowing AI agents to query Langfuse trace data for better debugging and observability.
Quick Start with Cursor
Installation Options
🎯 From Cursor IDE: Click the button above (works seamlessly!)
🌐 From GitHub Web: Copy this deeplink and paste into your browser address bar:
cursor://anysphere.cursor-deeplink/mcp/install?name=langfuse-mcp&config=eyJjb21tYW5kIjoidXZ4IiwiYXJncyI6WyJsYW5nZnVzZS1tY3AiLCItLXB1YmxpYy1rZXkiLCJZT1VSX1BVQkxJQ19LRVkiLCItLXNlY3JldC1rZXkiLCJZT1VSX1NFQ1JFVF9LRVkiLCItLWhvc3QiLCJodHRwczovL2Nsb3VkLmxhbmdmdXNlLmNvbSJdfQ==
⚙️ Manual Setup: See Configuration section below
💡 Note: The "Add to Cursor" button only works from within Cursor IDE due to browser security restrictions on custom protocols (cursor://). This is normal and expected behavior per Cursor's documentation.
After installation: Replace YOUR_PUBLIC_KEY and YOUR_SECRET_KEY with your actual Langfuse credentials in Cursor's MCP settings.
Features
- Integration with Langfuse for trace and observation data
- Prompt management - get, list, create, and update prompts
- Tool suite for AI agents to query trace data
- Exception and error tracking capabilities
- Session and user activity monitoring
Available Tools
The MCP server exposes 18 tools, grouped by domain:
Traces
- fetch_traces - Search/filter traces with pagination
- fetch_trace - Fetch a specific trace by ID
Observations
- fetch_observations - Search/filter observations with pagination
- fetch_observation - Fetch a specific observation by ID
Sessions
- fetch_sessions - List recent sessions with pagination
- get_session_details - Get detailed session info by ID
- get_user_sessions - Get all sessions for a user
Exceptions
- find_exceptions - Find exceptions grouped by file/function/type
- find_exceptions_in_file - Find exceptions in a specific file
- get_exception_details - Get detailed info about a specific exception
- get_error_count - Get total error count
Prompts
- get_prompt - Fetch prompt with resolved dependencies
- get_prompt_unresolved - Fetch prompt with dependency tags intact (falls back to resolved content if the SDK lacks resolve support)
- list_prompts - List/filter prompts with pagination
- create_text_prompt - Create new text prompt version
- create_chat_prompt - Create new chat prompt version
- update_prompt_labels - Update labels for a prompt version
Schema
- get_data_schema - Get schema information for the data structures used in responses
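To picture what the exception-grouping tools do conceptually, here is a small stdlib-only sketch. The event fields and output shape below are illustrative assumptions, not the server's actual response schema:

```python
from collections import Counter

# Hypothetical exception events, as they might appear in observation metadata.
events = [
    {"file": "app/db.py", "function": "connect", "type": "TimeoutError"},
    {"file": "app/db.py", "function": "connect", "type": "TimeoutError"},
    {"file": "app/api.py", "function": "handler", "type": "ValueError"},
]

# Group by (file, function, type), the dimensions find_exceptions groups on.
groups = Counter((e["file"], e["function"], e["type"]) for e in events)
for (file, func, exc_type), count in groups.most_common():
    print(f"{file}:{func} {exc_type} x{count}")
```

The real tools apply the same idea to Langfuse observation data, with filters and pagination on top.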
Setup
Install uv
First, make sure uv is installed. For installation instructions, see the uv installation docs.
If you already have an older version of uv installed, you might need to update it with uv self update.
Installation
Requirement: The server depends on the Langfuse Python SDK v3. Installation automatically pulls langfuse>=3.11.2 and requires Python 3.10–3.13 while upstream SDK support for 3.14 is pending.
uv pip install langfuse-mcp
If you're iterating on this repository, install the local checkout instead of PyPI:
# from the repo root
uv pip install --editable .
Recommended local environment
For development we suggest creating an isolated environment pinned to Python 3.11 (the version used in CI):
uv venv --python 3.11 .venv
source .venv/bin/activate # On Windows use: .venv\Scripts\activate
uv pip install --python .venv/bin/python -e .
All subsequent examples assume the virtual environment is activated.
Obtain Langfuse credentials
You'll need your Langfuse credentials:
- Public key
- Secret key
- Host URL (usually https://cloud.langfuse.com or your self-hosted URL)
You can store these in a local .env file instead of passing CLI flags each time:
LANGFUSE_PUBLIC_KEY=your_public_key
LANGFUSE_SECRET_KEY=your_secret_key
LANGFUSE_HOST=https://cloud.langfuse.com
When present, the MCP server reads these values automatically. CLI arguments still override the environment if provided.
Running the Server
Run the server using uvx or the project virtual environment:
uvx langfuse-mcp --public-key YOUR_KEY --secret-key YOUR_SECRET --host https://cloud.langfuse.com
# or, once inside the repo virtual environment
langfuse-mcp --public-key YOUR_KEY --secret-key YOUR_SECRET --host https://cloud.langfuse.com
Local checkout tip: During development, run uv run --from /path/to/langfuse-mcp langfuse-mcp ... (or uv run python -m langfuse_mcp ...) so uv executes the code in your working tree. Using the PyPI shortcut skips repository-only changes such as the new environment-based credential defaults and logging tweaks.
The server writes diagnostic logs to /tmp/langfuse_mcp.log. Remove the --host switch if you are targeting the default Cloud endpoint.
Use --log-level (e.g., --log-level DEBUG) and --log-to-console to control verbosity during debugging.
Selective Tool Loading
Use --tools to load only specific tool groups, reducing token overhead:
# Load only trace and prompt tools
langfuse-mcp --tools traces,prompts
# Available groups: traces, observations, sessions, exceptions, prompts, schema
# Default: all
Or set via environment: LANGFUSE_MCP_TOOLS=traces,prompts
Run with Docker
Option 1: Pull from GitHub Container Registry (Recommended)
Pull and run the pre-built image:
docker pull ghcr.io/avivsinai/langfuse-mcp:latest
docker run --rm -i \
-e LANGFUSE_PUBLIC_KEY=YOUR_PUBLIC_KEY \
-e LANGFUSE_SECRET_KEY=YOUR_SECRET_KEY \
-e LANGFUSE_HOST=https://cloud.langfuse.com \
-e LANGFUSE_MCP_LOG_FILE=/logs/langfuse_mcp.log \
-v "$(pwd)/logs:/logs" \
ghcr.io/avivsinai/langfuse-mcp:latest
Available tags:
- latest - Most recent release
- v0.2.0 - Specific version
- 0.2 - Major.minor version
Option 2: Build from source
Build the image from the repository root so the container installs the current checkout instead of the latest PyPI release:
docker build -t langfuse-logs-mcp .
docker run --rm -i \
-e LANGFUSE_PUBLIC_KEY=YOUR_PUBLIC_KEY \
-e LANGFUSE_SECRET_KEY=YOUR_SECRET_KEY \
-e LANGFUSE_HOST=https://cloud.langfuse.com \
-e LANGFUSE_MCP_LOG_FILE=/logs/langfuse_mcp.log \
-v "$(pwd)/logs:/logs" \
langfuse-logs-mcp
Why no -t? Allocating a pseudo-TTY can interfere with MCP stdio clients. Use -i only so the server communicates over plain stdin/stdout.
The Dockerfile copies the local source tree and installs it with pip install ., so the container always runs your latest commits - a must while testing features that have not shipped on PyPI.
Configuration with MCP clients
Configure for Cursor
Create a .cursor/mcp.json file in your project root:
{
"mcpServers": {
"langfuse": {
"command": "uvx",
"args": ["langfuse-mcp", "--public-key", "YOUR_KEY", "--secret-key", "YOUR_SECRET", "--host", "https://cloud.langfuse.com"]
}
}
}
Configure for Claude Desktop
Add to your Claude settings:
{
"command": ["uvx"],
"args": ["langfuse-mcp"],
"type": "stdio",
"env": {
"LANGFUSE_PUBLIC_KEY": "YOUR_KEY",
"LANGFUSE_SECRET_KEY": "YOUR_SECRET",
"LANGFUSE_HOST": "https://cloud.langfuse.com"
}
}
Output Modes
Each tool supports different output modes to control the level of detail in responses:
- compact (default): Returns a summary with large values truncated
- full_json_string: Returns the complete data as a JSON string
- full_json_file: Saves the complete data to a file and returns a summary with file information
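A compact-mode summary boils down to truncating oversized values before returning them. A minimal sketch (the 200-character threshold and the truncation marker are assumptions, not the server's actual limits):

```python
def truncate_values(data: dict, max_len: int = 200) -> dict:
    """Return a copy of data with long string values truncated, as a compact mode might do."""
    out = {}
    for key, value in data.items():
        if isinstance(value, str) and len(value) > max_len:
            out[key] = value[:max_len] + f"... [truncated {len(value) - max_len} chars]"
        else:
            out[key] = value
    return out

trace = {"id": "tr-123", "output": "x" * 1000}
compact = truncate_values(trace)
print(len(compact["output"]))  # far shorter than the original 1000 characters
```

full_json_string and full_json_file would instead serialize the untruncated data, trading token cost for completeness.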
Development
Clone the repository
git clone https://github.com/yourusername/langfuse-mcp.git
cd langfuse-mcp
Create a virtual environment and install dependencies
uv venv --python 3.11 .venv
source .venv/bin/activate # On Windows: .venv\Scripts\activate
uv pip install --python .venv/bin/python -e ".[dev]"
Set up environment variables
export LANGFUSE_SECRET_KEY="your-secret-key"
export LANGFUSE_PUBLIC_KEY="your-public-key"
export LANGFUSE_HOST="https://cloud.langfuse.com" # Or your self-hosted URL
Testing
Run the unit test suite (mirrors CI):
pytest
To run the demo client:
uv run examples/langfuse_client_demo.py --public-key YOUR_PUBLIC_KEY --secret-key YOUR_SECRET_KEY
Version Management
This project uses dynamic versioning based on Git tags:
- The version is automatically determined from git tags using uv-dynamic-versioning
- To create a new release:
  - Tag your commit with git tag v0.1.2 (following semantic versioning)
  - Push the tag with git push --tags
  - Create a GitHub release from the tag
- The GitHub workflow will automatically build and publish the package with the correct version to PyPI
For a detailed history of changes, please see the CHANGELOG.md file.
Update Notes
- Prompt management - get, list, create, and update prompts directly from your AI agent
- SDK floor - langfuse>=3.11.2 (capped at <4.0.0)
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
License
This project is licensed under the MIT License - see the LICENSE file for details.
Cache Management
We use the cachetools library to implement efficient caching with proper size limits:
- Uses cachetools.LRUCache for better reliability
- Configurable cache size via the CACHE_SIZE constant
- Automatically evicts the least recently used items when caches exceed their size limits