
CLI for Power BI semantic models - direct .NET connection for token-efficient AI agent usage


pbi-cli

Give Claude Code the Power BI skills it needs. Install once, then just ask Claude to work with your semantic models.


Get Started · Skills · All Commands · REPL Mode · Contributing


What is this?

pbi-cli gives Claude Code (and other AI agents) the ability to manage Power BI semantic models. It ships with 7 skills that Claude discovers automatically. You ask in plain English, and Claude runs the right pbi commands.

You                        Claude Code              pbi-cli              Power BI
 "Add a YTD measure   --->  Uses Power BI    --->   CLI commands   --->  Desktop
  to the Sales table"       skills

Get Started

Fastest way: Just give Claude the repo URL and let it handle everything:

Install and set up pbi-cli from https://github.com/MinaSaad1/pbi-cli.git

Or install manually (two commands):

pipx install pbi-cli-tool    # 1. Install (handles PATH automatically)
pbi connect                  # 2. Auto-detects Power BI Desktop and installs skills

That's it. Open Power BI Desktop with a .pbix file, run pbi connect, and everything is set up automatically. Open Claude Code and start asking.

You can also specify the port manually: pbi connect -d localhost:54321

Requires: Windows with Python 3.10+ and Power BI Desktop running.

Using pip instead of pipx?
pip install pbi-cli-tool

On Windows, pip install often places the pbi command in a directory that isn't on your PATH.

Fix: Add the Scripts directory to PATH

Find the directory:

python -c "import site; print(site.getusersitepackages().replace('site-packages','Scripts'))"

Add the printed path to your system PATH:

setx PATH "%PATH%;C:\Users\YourName\AppData\Roaming\Python\PythonXXX\Scripts"

Then restart your terminal. We recommend pipx instead to avoid this entirely.


Skills

After running pbi connect, Claude Code discovers 7 Power BI skills. Each skill teaches Claude a different area of Power BI development. You don't need to memorize commands. Just describe what you want.

You: "Set up RLS for regional managers"
  |
  v
Claude Code --> Picks the right skill
                  |
                  +-- Modeling
                  +-- DAX
                  +-- Deployment
                  +-- Security
                  +-- Documentation
                  +-- Diagnostics
                  +-- Partitions

Modeling

"Create a star schema with Sales, Products, and Calendar tables"

Claude creates the tables, sets up relationships, marks the date table, and adds formatted measures. Covers tables, columns, measures, relationships, hierarchies, and calculation groups.

Example: what Claude runs behind the scenes
pbi table create Sales --mode Import
pbi table create Products --mode Import
pbi table create Calendar --mode Import
pbi relationship create --from-table Sales --from-column ProductKey --to-table Products --to-column ProductKey
pbi relationship create --from-table Sales --from-column DateKey --to-table Calendar --to-column DateKey
pbi table mark-date Calendar --date-column Date
pbi measure create "Total Revenue" -e "SUM(Sales[Revenue])" -t Sales --format-string "$#,##0"

DAX

"What are the top 10 products by revenue this year?"

Claude writes and executes DAX queries, validates syntax, and creates measures with time intelligence patterns like YTD, previous year, and rolling averages.

Example: what Claude runs behind the scenes
pbi dax execute "
EVALUATE
TOPN(
    10,
    ADDCOLUMNS(VALUES(Products[Name]), \"Revenue\", CALCULATE(SUM(Sales[Amount]))),
    [Revenue], DESC
)
"

Deployment

"Export the model to Git for version control"

Claude exports your model as TMDL files for version control and imports them back. Handles transactions for safe multi-step changes.

Example: what Claude runs behind the scenes
pbi database export-tmdl ./model/
# ... you commit to git ...
pbi database import-tmdl ./model/

Security

"Set up row-level security so regional managers only see their region"

Claude creates RLS roles with descriptions, sets up perspectives for different user groups, and exports the model for version control.

Example: what Claude runs behind the scenes
pbi security-role create "Regional Manager" --description "Users see only their region's data"
pbi perspective create "Executive Dashboard"
pbi perspective create "Regional Detail"
pbi database export-tmdl ./model-backup/

Documentation

"Document everything in this model"

Claude catalogs every table, measure, column, and relationship. Generates data dictionaries, measure inventories, and can export the full model as TMDL for human-readable reference.

Example: what Claude runs behind the scenes
pbi --json model get
pbi --json model stats
pbi --json table list
pbi --json measure list
pbi --json relationship list
pbi database export-tmdl ./model-docs/
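The `--json` output above can also feed a simple data-dictionary generator. A minimal sketch in Python, assuming `pbi --json measure list` returns a JSON array of objects with `name`, `table`, and `expression` fields (the field names here are an assumption; check the actual output of your version):

```python
import json

def measures_to_markdown(json_text: str) -> str:
    """Render measure-list JSON as a markdown data dictionary table."""
    measures = json.loads(json_text)
    lines = ["| Measure | Table | Expression |", "|---|---|---|"]
    for m in measures:
        expr = m.get("expression", "").replace("\n", " ")
        lines.append(f"| {m.get('name', '')} | {m.get('table', '')} | `{expr}` |")
    return "\n".join(lines)

# Illustrative payload; real field names may differ.
sample = '[{"name": "Total Revenue", "table": "Sales", "expression": "SUM(Sales[Revenue])"}]'
print(measures_to_markdown(sample))
```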

Diagnostics

"Why is this DAX query so slow?"

Claude traces query execution, clears caches for clean benchmarks, checks model health, and verifies the environment.

Example: what Claude runs behind the scenes
pbi dax clear-cache
pbi trace start
pbi dax execute "EVALUATE SUMMARIZECOLUMNS(...)" --timeout 300
pbi trace stop
pbi trace export ./trace.json
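Once exported, the trace file can be post-processed to surface the slowest events. A minimal sketch, assuming a hypothetical trace shape of a JSON array of `{"event": ..., "duration_ms": ...}` objects (the real `pbi trace export` format may differ; adapt the keys to what you see in the file):

```python
import json

def slowest_events(trace_json: str, top: int = 3) -> list[dict]:
    """Return the top N trace events ranked by duration.

    Assumes each event object carries a "duration_ms" key; events
    without one sort last.
    """
    events = json.loads(trace_json)
    return sorted(events, key=lambda e: e.get("duration_ms", 0), reverse=True)[:top]

# Illustrative trace content; the real export format may differ.
sample = '[{"event": "VertiPaq Scan", "duration_ms": 820}, {"event": "Formula Engine", "duration_ms": 40}]'
for e in slowest_events(sample):
    print(f'{e["event"]}: {e["duration_ms"]} ms')
```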

Partitions & Expressions

"Set up partitions for incremental refresh on the Sales table"

Claude manages table partitions, shared M/Power Query expressions, and calendar table configuration.

Example: what Claude runs behind the scenes
pbi partition list --table Sales
pbi partition create "Sales_2024" --table Sales --expression "..." --mode Import
pbi expression create "ServerURL" --expression '"https://api.example.com"'
pbi calendar mark Calendar --date-column Date

All Commands

22 command groups covering the full Power BI Tabular Object Model. You rarely need these directly when using Claude Code, but they're available for scripting, CI/CD, or manual use.

Category Commands
Queries dax execute, dax validate, dax clear-cache
Model table, column, measure, relationship, hierarchy, calc-group
Deploy database export-tmdl, database import-tmdl, database export-tmsl, transaction
Security security-role, perspective
Connect connect, disconnect, connections list, connections last
Data partition, expression, calendar, advanced culture
Diagnostics trace start, trace stop, trace fetch, trace export, model stats
Tools setup, repl, skills install, skills list

Use --json for machine-readable output (for scripts and AI agents):

pbi --json measure list
pbi --json dax execute "EVALUATE Sales"

Run pbi <command> --help for full options.
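From a Python script, the `--json` flag pairs naturally with `subprocess`. A minimal wrapper sketch, assuming `pbi` is on PATH and that `--json` writes a single JSON document to stdout:

```python
import json
import subprocess

def build_pbi_cmd(*args: str) -> list[str]:
    """Assemble a pbi invocation with machine-readable output."""
    return ["pbi", "--json", *args]

def pbi_json(*args: str):
    """Run a pbi command and return its parsed JSON output.

    Requires pbi on PATH and a live Power BI Desktop connection;
    raises CalledProcessError if the command fails.
    """
    result = subprocess.run(
        build_pbi_cmd(*args), capture_output=True, text=True, check=True
    )
    return json.loads(result.stdout)

# Example (needs Power BI Desktop running and pbi connected):
# measures = pbi_json("measure", "list")
```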


REPL Mode

For interactive work, the REPL keeps a persistent connection alive between commands:

$ pbi repl

pbi> connect --data-source localhost:54321
Connected: localhost-54321

pbi(localhost-54321)> measure list
pbi(localhost-54321)> dax execute "EVALUATE TOPN(5, Sales)"
pbi(localhost-54321)> exit

Tab completion, command history, and a dynamic prompt showing your active connection.


How It Works

pbi-cli connects directly to Power BI Desktop's Analysis Services engine via pythonnet and the .NET Tabular Object Model (TOM). No external binaries or MCP servers needed. Everything runs in-process for sub-second command execution.

+------------------+         +---------------------+         +------------------+
|     pbi-cli      |         |   Bundled TOM DLLs  |         |    Power BI      |
|   (Python CLI)   |pythonnet|  (.NET in-process)  |  XMLA   |     Desktop      |
|  Click commands  |-------->|  TOM / ADOMD.NET    |-------->|   msmdsrv.exe    |
+------------------+         +---------------------+         +------------------+

Why a CLI? When an AI agent uses an MCP server directly, each tool schema consumes roughly 4,000 tokens in the context window. A pbi command costs about 30 tokens. Same capabilities, roughly 100x less context.
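The savings compound with the number of tools. A back-of-envelope calculation using the rough figures above (illustrative estimates, not measurements):

```python
# Approximate context cost of exposing the full command surface to an agent.
TOKENS_PER_MCP_TOOL_SCHEMA = 4_000   # rough schema cost per MCP tool
TOKENS_PER_CLI_COMMAND = 30          # rough cost of one pbi command string
NUM_TOOLS = 22                       # pbi-cli command groups

mcp_cost = NUM_TOOLS * TOKENS_PER_MCP_TOOL_SCHEMA
cli_cost = NUM_TOOLS * TOKENS_PER_CLI_COMMAND
print(f"MCP schemas: {mcp_cost:,} tokens; CLI: {cli_cost:,} tokens "
      f"({mcp_cost // cli_cost}x difference)")
```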

Configuration details

All config lives in ~/.pbi-cli/:

~/.pbi-cli/
  config.json          # Default connection preference
  connections.json     # Named connections
  repl_history         # REPL command history

Bundled DLLs ship inside the Python package (pbi_cli/dlls/):

  • Microsoft.AnalysisServices.Tabular.dll
  • Microsoft.AnalysisServices.AdomdClient.dll
  • Microsoft.AnalysisServices.Core.dll
  • Microsoft.AnalysisServices.Tabular.Json.dll
  • Microsoft.AnalysisServices.dll

Development

git clone https://github.com/MinaSaad1/pbi-cli.git
cd pbi-cli
pip install -e ".[dev]"
ruff check src/ tests/         # Lint
mypy src/                      # Type check
pytest -m "not e2e"            # Run tests

Contributing

Contributions are welcome! Please open an issue first to discuss what you'd like to change.

  1. Fork the repository
  2. Create a feature branch
  3. Make your changes with tests
  4. Open a pull request

GitHub · PyPI

MIT License

