TOMPo MCP — Power BI & Fabric Lineage Intelligence

MCP server + Streamlit web app. Trace lineage from semantic models → tables → reports → pages → visuals → columns/measures, cross-workspace, with governance health scores.

An MCP (Model Context Protocol) server that brings Power BI lineage intelligence directly into your AI assistant — GitHub Copilot, Claude Desktop, Cursor, or any MCP-compatible client.

What it does

  • Full Lineage: See exactly which columns and measures appear in which visuals, across all reports bound to a semantic model
  • Impact Analysis: "What breaks if I rename Employee.StartDate?" — instantly shows every affected visual
  • Interactive Visualization: Export a self-contained D3 tree (HTML file) with expand/collapse, zoom, and search
  • Zero Infrastructure: Runs locally on your machine using your own Azure identity. No App Service, no Docker, no backend.

Quick Start

1. Install

pip install tompo-mcp

2. Login to Azure

az login

3. Add to VS Code

Option A — mcp.json (recommended)

Create or edit .vscode/mcp.json in your workspace (or ~/.config/Code/User/mcp.json globally):

{
  "servers": {
    "tompo": {
      "command": "python",
      "args": ["-m", "tompo_mcp"],
      "type": "stdio"
    }
  }
}

Option B — settings.json

Add to your VS Code settings.json (Ctrl+Shift+P → "Preferences: Open User Settings (JSON)"):

{
  "mcp": {
    "servers": {
      "tompo": {
        "command": "python",
        "args": ["-m", "tompo_mcp"],
        "type": "stdio"
      }
    }
  }
}

4. Use in Copilot Chat

Open GitHub Copilot Chat and start asking:

> List my Fabric workspaces
> Generate lineage for dataset abc-123 in workspace xyz-456
> What visuals use the Employee.Department column?
> Export the lineage as an interactive HTML file

MCP Tools

Tool                      Description
list_workspaces           List all Fabric/Power BI workspaces you have access to
generate_lineage          Full lineage: Model → Tables → Reports → Pages → Visuals → Fields
impact_analysis           Find all visuals where a specific column or measure is used
describe_semantic_model   Detailed metadata: tables, columns, measures, relationships, roles
export_lineage_html       Generate an interactive D3 visualization as a self-contained HTML file
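To make the impact_analysis tool concrete, here is a minimal sketch of the kind of query it answers: walk a lineage tree and collect every visual that references a given column or measure. The nested-dict shape (`reports` → `pages` → `visuals` → `fields`) is an assumption for illustration, not TOMPo's actual schema.

```python
# Hypothetical lineage shape; field names are illustrative only.
def find_impacted_visuals(lineage: dict, field: str) -> list[str]:
    """Return 'Report / Page / Visual' paths where `field` is used."""
    hits = []
    for report in lineage.get("reports", []):
        for page in report.get("pages", []):
            for visual in page.get("visuals", []):
                if field in visual.get("fields", []):
                    hits.append(f"{report['name']} / {page['name']} / {visual['name']}")
    return hits

lineage = {
    "reports": [
        {"name": "HR Dashboard", "pages": [
            {"name": "Overview", "visuals": [
                {"name": "Tenure by Dept",
                 "fields": ["Employee.StartDate", "Employee.Department"]},
                {"name": "Headcount", "fields": ["Employee.EmployeeID"]},
            ]},
        ]},
    ]
}

print(find_impacted_visuals(lineage, "Employee.StartDate"))
# → ['HR Dashboard / Overview / Tenure by Dept']
```

Renaming Employee.StartDate would therefore break exactly the visuals this query returns.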

How It Works

You type in Copilot Chat
        │
        ▼
Copilot calls TOMPo MCP tools
        │
        ▼
TOMPo runs locally on your machine:
  → Uses your az login identity
  → Calls Fabric REST APIs (getDefinition, Scanner, DAX)
  → Parses model + report definitions
  → Builds lineage tree
  → Returns data to Copilot
        │
        ▼
Copilot shows the lineage tree / impact table
(or opens interactive HTML in your browser)
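The "builds lineage tree" step above can be sketched as follows: merge model metadata (tables and columns) with parsed report definitions into one nested structure. The input and output shapes here are assumptions for illustration, not TOMPo's real data model.

```python
# Hypothetical shapes: `model` from a model definition, `reports` from
# parsed report definitions. Combine them into one lineage tree.
def build_lineage_tree(model: dict, reports: list[dict]) -> dict:
    return {
        "model": model["name"],
        "tables": [{"name": t["name"], "columns": t["columns"]}
                   for t in model["tables"]],
        "reports": [
            {"name": r["name"],
             "pages": [
                 {"name": p["name"],
                  "visuals": [{"name": v["name"], "fields": v["fields"]}
                              for v in p["visuals"]]}
                 for p in r["pages"]]}
            for r in reports
        ],
    }

model = {"name": "HR Model",
         "tables": [{"name": "Employee", "columns": ["StartDate", "Department"]}]}
reports = [{"name": "HR Dashboard",
            "pages": [{"name": "Overview",
                       "visuals": [{"name": "Tenure",
                                    "fields": ["Employee.StartDate"]}]}]}]
tree = build_lineage_tree(model, reports)
print(tree["model"], len(tree["reports"]))  # → HR Model 1
```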

Authentication

TOMPo uses DefaultAzureCredential, which automatically tries credentials in this order:

  1. Azure CLI (az login) — most common for developers
  2. VS Code Azure Account — if you're signed into the Azure extension
  3. Environment variables — for CI/CD pipelines
  4. Managed Identity — for Azure-hosted scenarios

You need access to the Fabric workspaces you want to analyze. No extra app registrations or service principals required.

Metadata Extraction (3-tier fallback)

  1. Fabric getDefinition API — returns full model.bim / TMDL / PBIR definitions
  2. Admin Scanner API — fallback if getDefinition fails (requires admin permissions)
  3. DAX executeQueries — last resort using INFO.TABLES(), INFO.COLUMNS(), etc.

If a sensitivity label blocks access, TOMPo temporarily downgrades the label to "General", extracts the metadata, then restores the original label.
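The 3-tier fallback above amounts to "try each extractor in order, return the first success". A minimal sketch, with stand-in functions in place of the real getDefinition, Scanner, and DAX calls:

```python
# Each tier is a (name, callable) pair; callables stand in for real API calls.
def extract_metadata(extractors):
    errors = []
    for name, fn in extractors:
        try:
            return name, fn()  # first tier that succeeds wins
        except Exception as exc:  # a real client would catch specific errors
            errors.append(f"{name}: {exc}")
    raise RuntimeError("all tiers failed: " + "; ".join(errors))

def get_definition():
    # Simulate tier 1 failing, e.g. a 403 on a label-protected model.
    raise PermissionError("403 Forbidden")

tiers = [
    ("getDefinition", get_definition),
    ("scanner", lambda: {"tables": ["Employee"]}),
    ("dax", lambda: {"tables": []}),
]
tier, meta = extract_metadata(tiers)
print(tier)  # → scanner
```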

Interactive Visualization

The export_lineage_html tool generates a single HTML file with:

  • D3 horizontal tree with expand/collapse nodes
  • Color-coded by type (model, table, report, page, visual, column, measure)
  • Impact Analysis tab with searchable data grid
  • Semantic Model tab with tables, columns, measures, relationships
  • Zoom, pan, fullscreen — all interactive
  • Works offline — all JavaScript and CSS inlined, no server needed
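The "works offline" property comes from inlining everything into one file. A toy sketch of the idea (the real export embeds a full D3 tree plus styles; this template is illustrative only):

```python
import json

def render_lineage_html(tree: dict) -> str:
    """Emit a single HTML string with the lineage data inlined as JSON."""
    data = json.dumps(tree)
    return f"""<!DOCTYPE html>
<html><head><meta charset="utf-8"><title>Lineage</title></head>
<body>
<div id="tree"></div>
<script>
const lineage = {data};  // data inlined, so no fetch and no server needed
document.getElementById("tree").textContent = JSON.stringify(lineage);
</script>
</body></html>"""

html = render_lineage_html({"model": "Sales", "tables": ["Employee"]})
print("offline-ready:", "fetch(" not in html and "<script>" in html)
# → offline-ready: True
```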

Claude Desktop

Add to claude_desktop_config.json:

{
  "mcpServers": {
    "tompo": {
      "command": "python",
      "args": ["-m", "tompo_mcp"]
    }
  }
}

Cursor / Windsurf

Use the same stdio server configuration in your client's MCP settings:

{
  "mcpServers": {
    "tompo": {
      "command": "python",
      "args": ["-m", "tompo_mcp"]
    }
  }
}

Development

git clone https://github.com/microsoft/tompo-mcp.git
cd tompo-mcp
pip install -e ".[dev]"
python -m pytest tests/

Requirements

  • Python 3.10+
  • Azure CLI (az login) or any Azure credential
  • Access to Fabric/Power BI workspaces

License

MIT
