Fabric Automation Bundles

Declarative project definitions for Microsoft Fabric — inspired by Databricks Asset Bundles (DABs).

Define your entire Fabric project in a single fabric.yml — lakehouses, notebooks, pipelines, semantic models, Data Agents, security roles, and environment targets — then validate, plan, and deploy with a single command.

fab-bundle init --template medallion --name my-project
fab-bundle validate
fab-bundle plan
fab-bundle deploy -t prod

CLI naming: The standalone CLI is fab-bundle. The long-term goal is integration as a fab bundle subcommand in the Fabric CLI. Both syntaxes are shown in this documentation — use whichever applies to your installation.

The Problem

Databricks has Asset Bundles (DABs) — a single YAML file that defines your entire project and deploys it consistently across dev, staging, and prod.

Microsoft Fabric has nothing equivalent. The Fabric CLI can export/import items, fabric-cicd can deploy across workspaces, and Terraform can provision infrastructure — but there is no single declarative project definition that describes:

  • What resources your project needs (lakehouses, notebooks, pipelines, semantic models, Data Agents)
  • How those resources depend on each other
  • How configuration varies across environments (dev/staging/prod)
  • What security roles and permissions are required
  • How to deploy everything in the correct order

Fabric Automation Bundles fills that gap.

Quick Start

Install

pip install fabric-automation-bundles

Create a New Project

# Medallion lakehouse architecture (bronze/silver/gold)
fab-bundle init --template medallion --name my-analytics

# OSDU + Fabric for Oil, Gas & Energy
fab-bundle init --template osdu_analytics --name chevron-osdu

Or Generate from an Existing Workspace

fab-bundle generate --workspace "My Existing Workspace"

This scans the workspace and produces a fabric.yml you can customize — the fastest on-ramp for existing projects.

Validate

fab-bundle validate

Validates all resource references, dependency chains, and target configurations.
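At its core, reference validation just checks that every cross-resource reference (a notebook's environment, lakehouse, and so on) names a resource defined somewhere in the bundle. A minimal sketch of that idea — hypothetical helper and field names, not the actual fab-bundle implementation:

```python
# Sketch of cross-reference validation (hypothetical, not fab-bundle's
# actual code): every reference a resource makes must name a resource
# defined elsewhere in the bundle.
def validate_references(bundle: dict) -> list[str]:
    resources = bundle.get("resources", {})
    known = {name for group in resources.values() for name in group}
    errors = []
    # Fields whose values point at other resources in the bundle.
    ref_fields = ("environment", "default_lakehouse", "semantic_model")
    for group, items in resources.items():
        for name, spec in items.items():
            for field in ref_fields:
                target = spec.get(field)
                if target is not None and target not in known:
                    errors.append(f"{group}.{name}: unknown {field} '{target}'")
    return errors

bundle = {
    "resources": {
        "lakehouses": {"bronze": {}, "gold": {}},
        "notebooks": {"etl": {"default_lakehouse": "bronze",
                              "environment": "spark-env"}},
    }
}
print(validate_references(bundle))  # spark-env is never defined, so this flags it
```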

Plan (Dry-Run)

fab-bundle plan -t dev

Shows exactly what would change:

Deployment Plan: my-analytics
  Target:    dev
  Workspace: my-analytics-dev

  +  bronze-lakehouse      Lakehouse      create    New resource
  +  silver-lakehouse      Lakehouse      create    New resource
  +  gold-lakehouse        Lakehouse      create    New resource
  +  spark-env             Environment    create    New resource
  +  etl-bronze            Notebook       create    New resource
  +  etl-silver            Notebook       create    New resource
  +  daily-refresh         DataPipeline   create    New resource
  ~  analytics-model       SemanticModel  update    Definition updated

  Summary: 7 to create, 1 to update

Deploy

fab-bundle deploy -t dev        # Deploy to dev (default)
fab-bundle deploy -t staging    # Deploy to staging
fab-bundle deploy -t prod -y    # Deploy to prod (skip confirmation)

Destroy

fab-bundle destroy -t dev       # Tear down dev environment

The fabric.yml Format

bundle:
  name: my-analytics
  version: "1.0.0"

workspace:
  capacity: F64

resources:
  environments:
    spark-env:
      runtime: "1.3"
      libraries: [semantic-link-labs]

  lakehouses:
    bronze:
      description: "Raw data landing zone"
    gold:
      description: "Business-ready datasets"

  notebooks:
    etl-pipeline:
      path: ./notebooks/etl.py
      environment: spark-env
      default_lakehouse: bronze

  pipelines:
    daily-refresh:
      schedule:
        cron: "0 6 * * *"
        timezone: America/Chicago
      activities:
        - notebook: etl-pipeline

  semantic_models:
    analytics-model:
      path: ./semantic_model/
      default_lakehouse: gold

  reports:
    dashboard:
      path: ./reports/dashboard/
      semantic_model: analytics-model

  data_agents:
    my-agent:
      sources: [gold]
      instructions: ./agent/instructions.md
      few_shot_examples: ./agent/examples.yaml

security:
  roles:
    - name: engineers
      entra_group: sg-data-eng
      workspace_role: contributor
    - name: analysts
      entra_group: sg-analysts
      workspace_role: viewer

targets:
  dev:
    default: true
    workspace:
      name: my-analytics-dev
      capacity: F2

  prod:
    workspace:
      name: my-analytics-prod
    run_as:
      service_principal: sp-fabric-prod

How It Works

Dependency Resolution

Fabric Automation Bundles automatically determines deployment order using topological sorting. You never have to think about what goes first:

environments → lakehouses → notebooks → pipelines
                          → warehouses
                          → semantic_models → reports
                          → data_agents
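The ordering above falls out of a standard topological sort (Kahn's algorithm). A sketch with an illustrative dependency graph — the graph shape mirrors the diagram, but this is not fab-bundle's actual resolver code:

```python
from collections import deque

# Kahn's algorithm: repeatedly emit nodes whose dependencies are all satisfied.
def topo_order(deps: dict[str, set[str]]) -> list[str]:
    pending = {node: set(d) for node, d in deps.items()}
    ready = deque(sorted(n for n, d in pending.items() if not d))
    order = []
    while ready:
        node = ready.popleft()
        order.append(node)
        for other, remaining in pending.items():
            if node in remaining:
                remaining.remove(node)
                if not remaining:
                    ready.append(other)
    if len(order) != len(pending):
        raise ValueError("dependency cycle detected")
    return order

# Each resource type lists what must exist before it (mirrors the diagram).
deps = {
    "environments": set(),
    "lakehouses": {"environments"},
    "notebooks": {"lakehouses"},
    "pipelines": {"notebooks"},
    "semantic_models": {"lakehouses"},
    "reports": {"semantic_models"},
}
print(topo_order(deps))  # environments first, reports after semantic_models
```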

Variable Substitution

Use ${var.name} in any string value:

variables:
  adme_endpoint:
    description: "ADME endpoint"
    default: "https://dev.energy.azure.com"

targets:
  prod:
    variables:
      adme_endpoint: "https://prod.energy.azure.com"
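Substitution itself is plain string interpolation, with target-level values overriding bundle-level defaults. A sketch of the idea (hypothetical helper, not fab-bundle's loader):

```python
import re

# Sketch of ${var.name} interpolation: target-level overrides win over
# bundle-level defaults (hypothetical, not fab-bundle's internals).
def substitute(text: str, defaults: dict[str, str],
               overrides: dict[str, str]) -> str:
    values = {**defaults, **overrides}
    def repl(m: re.Match) -> str:
        name = m.group(1)
        if name not in values:
            raise KeyError(f"undefined variable '{name}'")
        return values[name]
    return re.sub(r"\$\{var\.([A-Za-z_][A-Za-z0-9_]*)\}", repl, text)

defaults = {"adme_endpoint": "https://dev.energy.azure.com"}
prod = {"adme_endpoint": "https://prod.energy.azure.com"}

print(substitute("endpoint: ${var.adme_endpoint}", defaults, {}))    # dev default
print(substitute("endpoint: ${var.adme_endpoint}", defaults, prod))  # prod override
```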

Include Files

Split large bundles across multiple files:

include:
  - resources/notebooks.yml
  - resources/pipelines.yml
  - security.yml
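Conceptually, included files are parsed and deep-merged into the root document before validation, so resources can be spread across files but behave as one bundle. A sketch of that merge using plain dicts in place of parsed YAML (hypothetical, not the actual loader):

```python
# Deep-merge an included document into the root bundle (sketch only).
# Nested mappings merge key-by-key; on a scalar/list conflict, the
# included (later) value wins.
def deep_merge(base: dict, extra: dict) -> dict:
    merged = dict(base)
    for key, value in extra.items():
        if isinstance(merged.get(key), dict) and isinstance(value, dict):
            merged[key] = deep_merge(merged[key], value)
        else:
            merged[key] = value
    return merged

root = {"bundle": {"name": "my-analytics"},
        "resources": {"lakehouses": {"bronze": {}}}}
notebooks_yml = {"resources": {"notebooks": {"etl": {"path": "./etl.py"}}}}

merged = deep_merge(root, notebooks_yml)
print(sorted(merged["resources"]))  # both resource groups survive the merge
```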

CI/CD Integration

GitHub Actions

Copy cicd/github-actions.yml to .github/workflows/fabric-bundle.yml:

- name: Deploy to Fabric
  run: |
    pip install fabric-automation-bundles
    fab-bundle deploy -t prod -y
  env:
    AZURE_TENANT_ID: ${{ secrets.AZURE_TENANT_ID }}
    AZURE_CLIENT_ID: ${{ secrets.AZURE_CLIENT_ID }}
    AZURE_CLIENT_SECRET: ${{ secrets.AZURE_CLIENT_SECRET }}

Azure DevOps

Copy cicd/azure-devops.yml to your repo as a YAML pipeline — includes validate, staging, and production stages with approval gates.

CLI Reference

Command                    Description
fab-bundle init            Create a new project from a template
fab-bundle validate        Validate the bundle definition
fab-bundle plan            Preview changes (dry-run)
fab-bundle deploy          Deploy to a target workspace
fab-bundle destroy         Tear down bundle resources
fab-bundle generate        Generate fabric.yml from an existing workspace
fab-bundle run <resource>  Run a notebook or pipeline
fab-bundle list            List available templates
fab-bundle bind            Bind an existing workspace item

Common Flags

Flag                Description
-f, --file          Path to fabric.yml (default: auto-detect)
-t, --target        Target environment (dev, staging, prod)
-y, --auto-approve  Skip confirmation prompts
--dry-run           Preview without making changes

Templates

medallion

Bronze/Silver/Gold lakehouse architecture with:

  • Three lakehouses with ETL notebooks
  • Data pipeline with dependency chaining
  • Semantic model and dashboard
  • Data Agent with few-shot examples
  • Security roles for engineers and analysts
  • Dev/Staging/Prod targets

osdu_analytics

OSDU on Fabric for Oil, Gas & Energy:

  • ADME integration with OSDU Search API ingestion
  • Well/Wellbore/Production entity flattening
  • SQL views for BI (well master, production trends, field rollups)
  • Data Agent with petroleum engineering context
  • Industry-specific few-shot examples (GOR, water cut, decline analysis)
  • ADME connection config per environment

Custom Templates

Create your own templates by adding a directory to fab_bundle/templates/ with a template.yml and a fabric.yml.

Comparison: Databricks vs Fabric Automation Bundles

Feature                 Databricks (DABs)           Fabric Automation Bundles
Project definition      databricks.yml              fabric.yml
CLI (standalone)        databricks bundle           fab-bundle
CLI (integrated)        databricks bundle           fab bundle (planned)
Validate                databricks bundle validate  fab-bundle validate
Deploy                  databricks bundle deploy    fab-bundle deploy
Dry-run / Plan          databricks bundle deploy    fab-bundle plan
Run a resource          databricks bundle run       fab-bundle run
Generate from existing  databricks bundle generate  fab-bundle generate
Init from template      databricks bundle init      fab-bundle init
Targets/Environments    ✅ YAML targets             ✅ YAML targets
Dependency ordering     ✅ Automatic                ✅ Automatic (topological sort)
Variable substitution   ${var.name}                 ${var.name}
Include files           ✅                          ✅
Service principal auth  ✅                          ✅
GitHub Actions          ✅                          ✅ (template provided)
Azure DevOps            ✅                          ✅ (template provided)
Workspace security      Via Unity Catalog           ✅ Entra + OneLake roles
Data Agents             N/A                         ✅ First-class resource
Semantic Models         N/A                         ✅ First-class resource
Custom templates        ✅                          ✅

Authentication

Fabric Automation Bundles uses azure-identity for authentication:

# Interactive (development)
az login
fab-bundle deploy -t dev

# Service Principal (CI/CD)
export AZURE_TENANT_ID=...
export AZURE_CLIENT_ID=...
export AZURE_CLIENT_SECRET=...
fab-bundle deploy -t prod -y

Architecture

fab_bundle/
├── cli.py                 # Click CLI (init, validate, plan, deploy, destroy, generate, run)
├── models/
│   └── bundle.py          # 30+ Pydantic models for fabric.yml schema
├── engine/
│   ├── loader.py          # YAML parser with includes + variable substitution
│   ├── resolver.py        # Topological dependency sort
│   ├── planner.py         # Diff engine (desired state vs workspace state)
│   └── deployer.py        # Executes plans via Fabric REST API
├── providers/
│   └── fabric_api.py      # Fabric REST API client with retry logic
├── generators/
│   ├── reverse.py         # Generate fabric.yml from existing workspace
│   └── templates.py       # Template engine with Jinja2
└── templates/
    ├── medallion/          # Bronze/Silver/Gold template
    └── osdu_analytics/     # OSDU + Fabric for OGE
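The planner's diff step boils down to comparing desired state (from fabric.yml) against the workspace's current state and classifying each resource as create, update, or delete. A sketch of that comparison with hypothetical resource shapes — not the actual planner.py:

```python
# Sketch of the plan/diff idea: desired state (fabric.yml) vs. actual
# state (the workspace). Hypothetical shapes, not fab-bundle's planner.
def plan(desired: dict[str, dict], actual: dict[str, dict]) -> list[tuple[str, str]]:
    actions = []
    for name, spec in desired.items():
        if name not in actual:
            actions.append(("create", name))       # new resource
        elif actual[name] != spec:
            actions.append(("update", name))       # definition changed
    for name in actual:
        if name not in desired:
            actions.append(("delete", name))       # no longer in the bundle
    return actions

desired = {"bronze": {"type": "Lakehouse"},
           "analytics-model": {"type": "SemanticModel", "version": 2}}
actual = {"analytics-model": {"type": "SemanticModel", "version": 1},
          "old-notebook": {"type": "Notebook"}}
print(plan(desired, actual))  # one create, one update, one delete
```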

Contributing

Contributions welcome. See CONTRIBUTING.md for details.

git clone https://github.com/microsoft/fabric-automation-bundles.git
cd fabric-automation-bundles
pip install -e ".[dev]"
pytest

License

MIT
