
TOMPo — Power BI & Fabric Lineage Intelligence. MCP Server + Streamlit Web App. Trace lineage from columns to visuals, cross-workspace. Governance health scores.

Project description

TOMPo MCP — Power BI & Fabric Lineage Intelligence

Trace lineage from semantic models → tables → reports → pages → visuals → columns/measures.

An MCP (Model Context Protocol) server that brings Power BI lineage intelligence directly into your AI assistant — GitHub Copilot, Claude Desktop, Cursor, or any MCP-compatible client.

What it does

  • Full Lineage: See exactly which columns and measures appear in which visuals, across all reports bound to a semantic model
  • Impact Analysis: "What breaks if I rename Employee.StartDate?" — instantly shows every affected visual
  • Interactive Visualization: Export a self-contained D3 tree (HTML file) with expand/collapse, zoom, and search
  • Zero Infrastructure: Runs locally on your machine using your own Azure identity. No App Service, no Docker, no backend.
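Impact analysis is, at its core, a depth-first walk over the lineage tree, collecting every visual that references a field. A minimal sketch of the idea — the node shape here is hypothetical, not TOMPo's internal model:

```python
# Minimal impact-analysis sketch: walk a lineage tree and collect every
# visual that references a given "Table.Column" field.
# The dict node shape is illustrative, not TOMPo's internal model.

def find_affected_visuals(node, field, path=()):
    """Depth-first search; yields the model > report > page > visual path for each hit."""
    here = path + (node["name"],)
    if node.get("type") == "visual" and field in node.get("fields", []):
        yield " > ".join(here)
    for child in node.get("children", []):
        yield from find_affected_visuals(child, field, here)

lineage = {
    "name": "Sales Model", "type": "model", "children": [
        {"name": "Exec Report", "type": "report", "children": [
            {"name": "Overview", "type": "page", "children": [
                {"name": "Headcount by Dept", "type": "visual",
                 "fields": ["Employee.Department", "Employee.StartDate"]},
                {"name": "Revenue Trend", "type": "visual",
                 "fields": ["Sales.Amount"]},
            ]},
        ]},
    ],
}

hits = list(find_affected_visuals(lineage, "Employee.StartDate"))
```

Renaming `Employee.StartDate` would break exactly the visuals this walk returns.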

Quick Start

1. Install

```shell
pip install tompo-mcp
```

2. Login to Azure

```shell
az login
```

3. Add to VS Code

Option A — mcp.json (recommended)

Create or edit .vscode/mcp.json in your workspace (or ~/.config/Code/User/mcp.json globally):

```json
{
  "servers": {
    "tompo": {
      "command": "python",
      "args": ["-m", "tompo_mcp"],
      "type": "stdio"
    }
  }
}
```

Option B — settings.json

Add to your VS Code settings.json (Ctrl+Shift+P → "Preferences: Open User Settings (JSON)"):

```json
{
  "mcp": {
    "servers": {
      "tompo": {
        "command": "python",
        "args": ["-m", "tompo_mcp"],
        "type": "stdio"
      }
    }
  }
}
```

4. Use in Copilot Chat

Open GitHub Copilot Chat and start asking:

> List my Fabric workspaces
> Generate lineage for dataset abc-123 in workspace xyz-456
> What visuals use the Employee.Department column?
> Export the lineage as an interactive HTML file

MCP Tools

| Tool | Description |
| --- | --- |
| `list_workspaces` | List all Fabric/Power BI workspaces you have access to |
| `generate_lineage` | Full lineage: Model → Tables → Reports → Pages → Visuals → Fields |
| `impact_analysis` | Find all visuals where a specific column or measure is used |
| `describe_semantic_model` | Detailed metadata: tables, columns, measures, relationships, roles |
| `export_lineage_html` | Generate an interactive D3 visualization as a self-contained HTML file |
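Under the hood, your assistant invokes each tool with a standard MCP `tools/call` JSON-RPC request. An illustrative payload for `impact_analysis` — the argument names here are assumptions; check the actual schema via the server's `tools/list` response:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "impact_analysis",
    "arguments": {
      "workspace_id": "xyz-456",
      "dataset_id": "abc-123",
      "field": "Employee.StartDate"
    }
  }
}
```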

How It Works

```
You type in Copilot Chat
        │
        ▼
Copilot calls TOMPo MCP tools
        │
        ▼
TOMPo runs locally on your machine:
  → Uses your az login identity
  → Calls Fabric REST APIs (getDefinition, Scanner, DAX)
  → Parses model + report definitions
  → Builds lineage tree
  → Returns data to Copilot
        │
        ▼
Copilot shows the lineage tree / impact table
(or opens interactive HTML in your browser)
```

Authentication

TOMPo uses DefaultAzureCredential, which automatically picks up the first credential source that works:

  1. Azure CLI (az login) — most common for developers
  2. VS Code Azure Account — if you're signed into the Azure extension
  3. Environment variables — for CI/CD pipelines
  4. Managed Identity — for Azure-hosted scenarios

You need access to the Fabric workspaces you want to analyze. No extra app registrations or service principals required.
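The credential chain behaves like a simple first-success fallback: try each source in order, use the first one that yields a token. A stdlib-only sketch of the pattern — the real work is done by azure-identity's `DefaultAzureCredential`, and these source functions are stand-ins:

```python
# First-success credential chain, mimicking how DefaultAzureCredential
# walks its sources. Each stand-in source returns a token string or raises.

class CredentialUnavailable(Exception):
    pass

def from_cli():
    raise CredentialUnavailable("az login has not been run")

def from_env():
    return "token-from-environment"  # pretend env vars are set

def get_token(sources):
    errors = []
    for source in sources:
        try:
            return source()
        except CredentialUnavailable as exc:
            errors.append(f"{source.__name__}: {exc}")
    raise RuntimeError("no credential available: " + "; ".join(errors))

token = get_token([from_cli, from_env])
```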

Metadata Extraction (3-tier fallback)

  1. Fabric getDefinition API — returns full model.bim / TMDL / PBIR definitions
  2. Admin Scanner API — fallback if getDefinition fails (requires admin permissions)
  3. DAX executeQueries — last resort using INFO.TABLES(), INFO.COLUMNS(), etc.
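The tier order above is a graceful-degradation chain: try the richest API first, fall back on permission errors. A hedged sketch of the control flow — the tier functions below are stand-ins, not TOMPo's actual Fabric REST calls:

```python
# Three-tier metadata extraction: try richer APIs first, degrade gracefully.
# The tier functions are stand-ins for the real Fabric REST / DAX calls.

def get_definition(dataset_id):
    raise PermissionError(f"getDefinition: 403 for {dataset_id}")

def admin_scanner(dataset_id):
    raise PermissionError("scanner: caller is not a Fabric admin")

def dax_info_functions(dataset_id):
    return {"tables": ["Employee", "Sales"], "source": "INFO.TABLES()"}

def extract_metadata(dataset_id):
    tiers = [("getDefinition", get_definition),
             ("scanner", admin_scanner),
             ("dax", dax_info_functions)]
    for name, fn in tiers:
        try:
            return name, fn(dataset_id)
        except PermissionError:
            continue  # fall through to the next tier
    raise RuntimeError("all extraction tiers failed")

tier, metadata = extract_metadata("abc-123")
```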

If a sensitivity label blocks access, TOMPo temporarily downgrades to "General", extracts metadata, then restores the original label.
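That downgrade-extract-restore flow is a save/restore pattern: the original label must come back even if extraction fails partway. A sketch of the shape, with the label API mocked as a dict (the real calls go to Fabric/MIP endpoints):

```python
# Save/restore pattern for sensitivity labels: downgrade only for the
# duration of the extraction, restore in a finally block even on failure.

labels = {"abc-123": "Highly Confidential"}  # stand-in label store

def get_label(dataset_id):
    return labels[dataset_id]

def set_label(dataset_id, label):
    labels[dataset_id] = label

def extract_with_label_downgrade(dataset_id, extract):
    original = get_label(dataset_id)
    set_label(dataset_id, "General")
    try:
        return extract(dataset_id)
    finally:
        set_label(dataset_id, original)  # restore even if extract raises

result = extract_with_label_downgrade("abc-123", lambda d: f"metadata for {d}")
```

The `finally` block is the important part: without it, a failed extraction would leave the dataset stuck at "General".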

Interactive Visualization

The export_lineage_html tool generates a single HTML file with:

  • D3 horizontal tree with expand/collapse nodes
  • Color-coded by type (model, table, report, page, visual, column, measure)
  • Impact Analysis tab with searchable data grid
  • Semantic Model tab with tables, columns, measures, relationships
  • Zoom, pan, fullscreen — all interactive
  • Works offline — all JavaScript and CSS inlined, no server needed
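Offline self-containment comes from serializing the lineage data as JSON and inlining it — along with all JS and CSS — into one HTML string. A toy sketch of the idea (the real exporter also embeds D3 and its own scripts; this one just inlines the data):

```python
import json

# Toy self-contained HTML export: inline the lineage data as JSON so the
# file works offline with no server. The real tool also inlines D3 + CSS.

def export_html(tree):
    data = json.dumps(tree)
    return ("<!doctype html><html><head><meta charset='utf-8'>"
            "<style>body{font-family:sans-serif}</style></head>"
            "<body><script>const lineage = " + data + ";"
            "document.body.textContent = lineage.name;</script></body></html>")

html = export_html({"name": "Sales Model", "children": []})
```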

Claude Desktop

Add to claude_desktop_config.json:

```json
{
  "mcpServers": {
    "tompo": {
      "command": "python",
      "args": ["-m", "tompo_mcp"]
    }
  }
}
```

Cursor / Windsurf

```json
{
  "mcpServers": {
    "tompo": {
      "command": "python",
      "args": ["-m", "tompo_mcp"]
    }
  }
}
```

Development

```shell
git clone https://github.com/microsoft/tompo-mcp.git
cd tompo-mcp
pip install -e ".[dev]"
python -m pytest tests/
```

Requirements

  • Python 3.10+
  • Azure CLI (az login) or any Azure credential
  • Access to Fabric/Power BI workspaces

License

MIT

Download files

Download the file for your platform.

Source Distribution

tompo_mcp-0.3.16.tar.gz (98.4 kB)


Built Distribution


tompo_mcp-0.3.16-py3-none-any.whl (74.1 kB)


File details

Details for the file tompo_mcp-0.3.16.tar.gz.

File metadata

  • Download URL: tompo_mcp-0.3.16.tar.gz
  • Upload date:
  • Size: 98.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.10

File hashes

Hashes for tompo_mcp-0.3.16.tar.gz

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 4a2ca35c0d7f9ab80ab6e6627bad2cd6944994f7a7853ea84531781ee277eda4 |
| MD5 | 4e38f9beaf7515ed89bb249ec3ef023d |
| BLAKE2b-256 | 51eaa4e7cd5ecb9b1a2c8c003e80c18f8b9205304f067527d8a91d8df6caaf1a |


File details

Details for the file tompo_mcp-0.3.16-py3-none-any.whl.

File metadata

  • Download URL: tompo_mcp-0.3.16-py3-none-any.whl
  • Upload date:
  • Size: 74.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.10

File hashes

Hashes for tompo_mcp-0.3.16-py3-none-any.whl

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | b897447ce8b0518b15439ca87eae7e882952e759ea2f46311c6d60a1cb983539 |
| MD5 | e817f7f21000b4e7db855340a1cebde2 |
| BLAKE2b-256 | dc78882bf983ab23eefec8a7e003f20f54497b3df410e5d5899dc5804c990886 |

