
Fabric Model Reader MCP


MCP server for reading and analyzing Fabric semantic models. Available as standalone server and Desktop Extension (DXT).

Requires authentication to access Power BI workspaces (Azure CLI, a personal access token via environment variable, or the system keyring).

Quick Start

For Claude Desktop Users (Easiest)

Prerequisites:

  1. Python 3.8+ must be installed and in your PATH

    • Mac/Linux: usually pre-installed as python3. Use the python3 variant of the .dxt file.
    • Windows: download from python.org and check "Add Python to PATH" during installation. Use the python variant of the .dxt file.
  2. Install required Python packages:

    # For Mac/Linux:
    pip3 install fastmcp requests keyring
    
    # For Windows:
    pip install fastmcp requests keyring
    

Installation:

  1. Download the appropriate .dxt file from the releases page
  2. Double-click to install in Claude Desktop
  3. Configure authentication (see Authentication section)

For Other MCP Clients

  1. Install dependencies:

    pip install -r requirements.txt

  2. Configure your MCP client to use this server. Example for VS Code:

    {
      "mcpServers": {
        "fabric-model-reader": {
          "command": "python3",
          "args": ["path/to/fabric-model-reader-mcp.py"]
        }
      }
    }

  3. Authenticate using one of the supported methods:

    • Personal Access Token
    • Azure CLI

  4. Start using the server through your MCP client.

  5. For enhanced functionality, use with related MCP servers.

Authentication

The extension supports three authentication methods:

  • Azure CLI: az login (recommended)
  • Environment Variable: Set POWERBI_TOKEN
  • System Keyring: Store token securely in system keyring
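
A client-side helper that resolves a token from these sources might look like the following sketch. The function name, the lookup order, and the keyring service/username names are all illustrative assumptions, not the extension's actual implementation:

```python
import os
import subprocess
from typing import Optional

def resolve_powerbi_token() -> Optional[str]:
    """Try each supported credential source and return the first token found."""
    # 1. Explicit environment variable takes precedence when set.
    token = os.environ.get("POWERBI_TOKEN")
    if token:
        return token
    # 2. System keyring (service/username names here are hypothetical).
    try:
        import keyring
        token = keyring.get_password("powerbi", "token")
        if token:
            return token
    except ImportError:
        pass
    # 3. Azure CLI: request a token scoped to the Power BI API.
    try:
        result = subprocess.run(
            ["az", "account", "get-access-token",
             "--resource", "https://analysis.windows.net/powerbi/api",
             "--query", "accessToken", "--output", "tsv"],
            capture_output=True, text=True, check=True,
        )
        return result.stdout.strip() or None
    except (FileNotFoundError, subprocess.CalledProcessError):
        return None
```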

Available Tools

get_model_definition

Get the TMDL definition of a semantic model with pagination and filtering support.

Parameters:

  • workspace_id (required): The workspace ID
  • dataset_id (required): The dataset/semantic model ID
  • file_filter (optional): Filter for specific files (e.g., 'measures', 'tables/', 'relationships.tmdl')
  • page (optional): Page number for pagination. Use either page or file_range, not both.
  • page_size (optional, default: 10): Number of files per page
  • metadata_only (optional, default: false): If true, returns only file paths without content
  • file_range (optional): File range to retrieve (e.g., '1-10', '11-20'). Use either page or file_range, not both.

Example usage:

# Get files 1-10 using file range (recommended for complete retrieval)
get_model_definition(workspace_id="...", dataset_id="...", file_range="1-10")

# Get files 11-20
get_model_definition(workspace_id="...", dataset_id="...", file_range="11-20")

# Get first page of model definition
get_model_definition(workspace_id="...", dataset_id="...")

# Get only measures with file range
get_model_definition(workspace_id="...", dataset_id="...", file_filter="measure", file_range="1-5")

# View all available files without content
get_model_definition(workspace_id="...", dataset_id="...", metadata_only=True)

# Navigate to specific page
get_model_definition(workspace_id="...", dataset_id="...", page=2, page_size=15)
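
To retrieve a complete model, the metadata_only and file_range parameters can be combined in a simple client loop. A sketch, assuming a generic call_tool helper and assuming the metadata response exposes file paths under a "files" key (neither is guaranteed by the server):

```python
def fetch_all_files(call_tool, workspace_id, dataset_id, batch_size=10):
    """Retrieve every TMDL file in fixed-size batches using file_range."""
    # Step 1: metadata-only call to learn how many files the model has.
    meta = call_tool("get_model_definition", workspace_id=workspace_id,
                     dataset_id=dataset_id, metadata_only=True)
    total = len(meta["files"])  # assumed response shape
    # Step 2: walk the file list in 1-based inclusive ranges: 1-10, 11-20, ...
    pages = []
    for start in range(1, total + 1, batch_size):
        end = min(start + batch_size - 1, total)
        pages.append(call_tool("get_model_definition",
                               workspace_id=workspace_id, dataset_id=dataset_id,
                               file_range=f"{start}-{end}"))
    return pages
```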

execute_dax_query

Execute a DAX query against a Power BI dataset.

Parameters:

  • workspace_id (required): The workspace ID
  • dataset_id (required): The dataset/semantic model ID
  • query (required): The DAX query to execute

Example usage:

execute_dax_query(
    workspace_id="...", 
    dataset_id="...", 
    query="EVALUATE SUMMARIZECOLUMNS('Product'[Category], \"@TotalSales\", SUM('Sales'[Amount]))"
)
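
For context, DAX execution against a published dataset typically goes through the Power BI REST executeQueries endpoint. A minimal sketch of a direct call with requests (the endpoint shown is the documented Power BI REST API; the wrapper function itself is illustrative and not part of this server):

```python
import requests

def execute_dax(token: str, workspace_id: str, dataset_id: str, query: str) -> dict:
    """POST a DAX query to the Power BI executeQueries REST endpoint."""
    url = (f"https://api.powerbi.com/v1.0/myorg/groups/{workspace_id}"
           f"/datasets/{dataset_id}/executeQueries")
    response = requests.post(
        url,
        headers={"Authorization": f"Bearer {token}"},
        json={"queries": [{"query": query}],
              "serializerSettings": {"includeNulls": True}},
        timeout=60,
    )
    response.raise_for_status()  # surface 401/403/404 as exceptions
    return response.json()
```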

How to Contribute

  1. Fork the repository
  2. Create a feature branch (git checkout -b feature/amazing-feature)
  3. Make your changes following the existing patterns
  4. Test your changes with sample models
  5. Commit your changes (git commit -m 'Add amazing feature')
  6. Push to the branch (git push origin feature/amazing-feature)
  7. Open a Pull Request

License

This project is licensed under a Non-Commercial License. See LICENSE for details.

Troubleshooting

Common Issues

Authentication failures:

  • Run az login to authenticate with Azure CLI
  • Verify you have access to the target Power BI workspace
  • Check that POWERBI_TOKEN environment variable is set correctly

Model access denied:

  • Ensure you have read permissions for the semantic model
  • Verify the workspace ID and dataset ID are correct
  • Check that the model is published and accessible

Large model performance:

  • Use file filtering to focus on specific TMDL components
  • Implement pagination for models with many files
  • Use metadata_only option to browse structure before downloading content

Security & Privacy Disclaimer

This software was created by me for me. I am sharing it for educational and personal use.

THIS SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. THE USER ASSUMES ALL RESPONSIBILITY AND RISK FOR THE USE OF THIS SOFTWARE.

DATA SECURITY AND PRIVACY: This extension accesses Microsoft Fabric and Power BI data using your provided credentials. The author assumes NO responsibility for data security, privacy, or confidentiality. Users are SOLELY responsible for:

  • Protecting their authentication credentials
  • Ensuring compliance with their organization's data policies
  • Managing access to sensitive or confidential data
  • Any data breaches or unauthorized access resulting from use of this extension

By using this code, you acknowledge that you are fully responsible for all data security and privacy implications.

AI Disclaimer

The code and docs in this repo were generated with the help of Claude Sonnet 4, Claude Opus 4, and Gemini 2.5 Pro using various agentic coding tools.



File details

Details for the file iflow_mcp_data_goblin_fabric_model_reader_mcp-0.1.0.tar.gz.

File metadata

  • Download URL: iflow_mcp_data_goblin_fabric_model_reader_mcp-0.1.0.tar.gz
  • Upload date:
  • Size: 480.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.10.2 {"installer":{"name":"uv","version":"0.10.2","subcommand":["publish"]},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"Debian GNU/Linux","version":"13","id":"trixie","libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":null}

File hashes

Hashes for iflow_mcp_data_goblin_fabric_model_reader_mcp-0.1.0.tar.gz
Algorithm Hash digest
SHA256 4ce5a735b76d47fe70480b20f2305bb92553b8c487974bb42fd30699121bbb91
MD5 af74b3753524f805d63e45e67af4efd9
BLAKE2b-256 d53734d063d87443498bf4909cf4194822a67a60ef7b2cc9d4c3ea81a9cb1fa0


File details

Details for the file iflow_mcp_data_goblin_fabric_model_reader_mcp-0.1.0-py3-none-any.whl.

File metadata

  • Download URL: iflow_mcp_data_goblin_fabric_model_reader_mcp-0.1.0-py3-none-any.whl
  • Upload date:
  • Size: 10.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.10.2 {"installer":{"name":"uv","version":"0.10.2","subcommand":["publish"]},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"Debian GNU/Linux","version":"13","id":"trixie","libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":null}

File hashes

Hashes for iflow_mcp_data_goblin_fabric_model_reader_mcp-0.1.0-py3-none-any.whl
Algorithm Hash digest
SHA256 9e346ac8610a05268be168811ef548748214e20924821579fc301ed7bad7d1b8
MD5 625daac4dc7289321075fa0da2d1f509
BLAKE2b-256 d29eae10cd6f4d59e4677dbee2fe0a07f6cffb65fa780906b87d98d34cf61a43

