A Python REST client for making requests to the Microsoft Fabric API

FabRest

FabRest is a Python SDK for Microsoft Fabric REST APIs. It provides a consistent, typed surface for managing workspaces and Fabric items with sync and async clients.

Features

  • Authentication via azure.identity, with a resource owner password credential (ROPC) fallback
  • Workspace and item management (create, update, delete, list)
  • Data operations for pipelines, notebooks, and more
  • Broad API coverage across Fabric resources
  • Async support for concurrent workloads

Installation

pip install fabrest

From source:

git clone https://github.com/billybillysss/fabrest.git
cd fabrest
pip install .

Quick start

FabRest exposes FabricClient and AsyncFabricClient. Workspace-scoped resources are accessed via client.workspace("id").

Authentication

With DefaultAzureCredential:

from azure.identity import DefaultAzureCredential
from fabrest import FabricClient

credential = DefaultAzureCredential()
client = FabricClient(credential)

The ROPC fallback takes tenant, client, and user credentials directly:

from fabrest.api.auth import ResourceOwnerPasswordCredential
from fabrest import FabricClient

credential = ResourceOwnerPasswordCredential(
    tenant_id="your-tenant-id",
    client_id="your-client-id",
    client_secret="your-client-secret",
    username="your-username",
    password="your-password",
)
client = FabricClient(credential)
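
FabricClient simply takes the credential object, so any azure.identity credential should work the same way; for unattended runs a service principal via ClientSecretCredential is a common choice (this sketch uses only the standard azure.identity API and is not FabRest-specific):

from azure.identity import ClientSecretCredential
from fabrest import FabricClient

# Service principal authentication via standard azure.identity.
credential = ClientSecretCredential(
    tenant_id="your-tenant-id",
    client_id="your-client-id",
    client_secret="your-client-secret",
)
client = FabricClient(credential)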

Core patterns

workspace = client.workspace("workspace-id")

# Workspaces
workspaces = client.workspaces.list()
ws = client.workspaces.get("workspace-id")

# Items (generic)
items = workspace.items.list()
lakehouses = workspace.items_for("Lakehouse").list()
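
The feature list above also mentions create, update, and delete. Only create is demonstrated elsewhere in this README (see the LRO section below); the delete call sketched here is an assumption based on that feature list, not a confirmed signature:

# Create a lakehouse (the same call shown in the LRO section below).
lakehouse = workspace.lakehouses.create({"displayName": "My Lakehouse"})

# Deleting by item id is assumed from the feature list; the exact method
# name and signature are not confirmed in this README.
# workspace.lakehouses.delete("lakehouse-id")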

Resource examples

Item resources and actions

# Lakehouse tables
tables = workspace.lakehouses.list_tables("lakehouse-id")

# SQL endpoint connection string
conn = workspace.sql_endpoints.get_connection_string("sql-endpoint-id")

# Warehouse restore points
restore_points = workspace.warehouses.list_restore_points("warehouse-id")

Data pipeline and notebook actions

# Data pipeline
pipeline_run = workspace.data_pipelines.run("pipeline-id")

# Notebook
notebook_run = workspace.notebooks.run("notebook-id")

Payload interfaces (simple and advanced)

from fabrest.models import dataflow

# Simple payload using a TypedDict interface
payload: dataflow.ExecuteQueryRequest = {
    "queryName": "GetCustomers",
}

result = workspace.dataflows.execute_query("dataflow-id", payload)

Create a dataflow with inline definition parts:

from fabrest.models import dataflow

# Payload with definition parts
payload: dataflow.CreateDataflowRequest = {
    "displayName": "Customer Dataflow",
    "definition": {
        "parts": [
            {
                "path": "model.json",
                "payloadType": "InlineBase64",
                "payload": "eyJ2ZXJzaW9uIjogIjEiLCAiZW50aXRpZXMiOiBbXX0=",
            }
        ]
    },
}

dataflow_item = workspace.dataflows.create(payload)

Create a warehouse restore point:

from fabrest.models import warehouse

# Restore point payload using a TypedDict interface
payload: warehouse.CreateRestorePointRequest = {
    "displayName": "Before schema change",
    "description": "Restore point before ETL migration",
    "retentionDays": 14,
}

restore_point = workspace.warehouses.create_restore_point("warehouse-id", payload)

Async usage

import asyncio
from azure.identity import DefaultAzureCredential
from fabrest import AsyncFabricClient

async def main():
    client = AsyncFabricClient(DefaultAzureCredential())
    workspace = client.workspace("workspace-id")
    reports = await workspace.reports.async_list()
    await client.close()

asyncio.run(main())

Run pipeline and notebook jobs from the async client:

import asyncio
from azure.identity import DefaultAzureCredential
from fabrest import AsyncFabricClient

async def main():
    client = AsyncFabricClient(DefaultAzureCredential())
    workspace = client.workspace("workspace-id")

    pipeline_run = await workspace.data_pipelines.async_run("pipeline-id")
    notebook_run = await workspace.notebooks.async_run("notebook-id")

    await client.close()

asyncio.run(main())

Async methods accept the same typed payloads as the sync client:

import asyncio
from azure.identity import DefaultAzureCredential
from fabrest import AsyncFabricClient
from fabrest.models import dataflow, warehouse

async def main():
    client = AsyncFabricClient(DefaultAzureCredential())
    workspace = client.workspace("workspace-id")

    simple_payload: dataflow.ExecuteQueryRequest = {"queryName": "GetCustomers"}
    await workspace.dataflows.async_execute_query("dataflow-id", simple_payload)

    advanced_payload: warehouse.CreateRestorePointRequest = {
        "displayName": "Before schema change",
        "description": "Restore point before ETL migration",
        "retentionDays": 14,
    }
    await workspace.warehouses.async_create_restore_point("warehouse-id", advanced_payload)

    await client.close()

asyncio.run(main())
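
Because the async client exposes awaitable methods, independent actions can run concurrently with asyncio.gather; this sketch reuses only calls shown above:

import asyncio
from azure.identity import DefaultAzureCredential
from fabrest import AsyncFabricClient

async def main():
    client = AsyncFabricClient(DefaultAzureCredential())
    workspace = client.workspace("workspace-id")

    # Start the pipeline and notebook runs concurrently.
    pipeline_run, notebook_run = await asyncio.gather(
        workspace.data_pipelines.async_run("pipeline-id"),
        workspace.notebooks.async_run("notebook-id"),
    )

    await client.close()

asyncio.run(main())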

Pagination

page = workspace.lakehouses.list(recursive=True)
next_page = workspace.lakehouses.list(continuation_token="token")

List endpoints return items aggregated across pages; the SDK normalizes item lists from both the value and data fields in API responses.
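
Because the list is already aggregated, results can be iterated directly. A minimal sketch, assuming each item is returned as a dict with a displayName field (the returned item shape is an assumption here):

# Iterate all lakehouses across pages; the dict item shape is an assumption.
for lakehouse in workspace.lakehouses.list(recursive=True):
    print(lakehouse.get("displayName"))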

LRO and raw responses

from fabrest.transport import RequestOptions

options = RequestOptions(wait_for_completion=False, raw_response=True)
response = workspace.lakehouses.create({"displayName": "My Lakehouse"}, options=options)
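
Judging by the parameter names, wait_for_completion=False returns before the long-running operation (LRO) finishes and raw_response=True exposes the raw HTTP response rather than a parsed body; the default would then be a blocking call that returns the created item (defaults assumed, not confirmed):

# Default call (no options): assumed to wait for the LRO and return the item.
lakehouse = workspace.lakehouses.create({"displayName": "My Lakehouse"})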

Error handling

from fabrest.errors import HttpError, ThrottledError

try:
    workspace.sql_endpoints.refresh_metadata("sql-endpoint-id")
except ThrottledError as exc:
    print(exc.status_code, exc.payload)
except HttpError as exc:
    print(exc.status_code, exc.payload)
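
ThrottledError presumably maps to rate-limited (HTTP 429) responses, so a backoff-and-retry loop around throttle-prone calls is a reasonable pattern; the delays below are illustrative, and the SDK may already retry internally (not confirmed here):

import time

from fabrest.errors import ThrottledError

# Retry with exponential backoff when throttled; delay values are illustrative.
for attempt in range(5):
    try:
        workspace.sql_endpoints.refresh_metadata("sql-endpoint-id")
        break
    except ThrottledError:
        time.sleep(2 ** attempt)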

Documentation

Detailed documentation is under development. For now, refer to the source code and inline comments for usage of modules and functions.

Contributing

Contributions are welcome. Please open an Issue or submit a Pull Request.

License

This project is licensed under the MIT License - see the LICENSE file for details.

Contact

For support, open an issue on the GitHub repository or contact the maintainers directly.


Note: This project is not officially affiliated with Microsoft or the Fabric team.


File details

Details for the file fabrest-1.0.1.tar.gz.

File metadata

  • Download URL: fabrest-1.0.1.tar.gz
  • Upload date:
  • Size: 49.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.11.12

File hashes

Hashes for fabrest-1.0.1.tar.gz
Algorithm Hash digest
SHA256 f8a13bfa2de920136484efb763c6acade1734b3136b2aa84d1156f6ae1041521
MD5 fb9da401540155a0dbd68ed84c47dc96
BLAKE2b-256 bb338b14afa28b48ffec18d0f4d65007a7d33cc1a56b346ed4fe7fe9710cb082

See more details on using hashes here.

File details

Details for the file fabrest-1.0.1-py3-none-any.whl.

File metadata

  • Download URL: fabrest-1.0.1-py3-none-any.whl
  • Upload date:
  • Size: 91.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.11.12

File hashes

Hashes for fabrest-1.0.1-py3-none-any.whl
Algorithm Hash digest
SHA256 6bcde51f46becee61c485f3a17d0df544a0f93c10fc99de103d9a6b340ec4c87
MD5 985a5ce3273c7d6c8a63a79c54929e78
BLAKE2b-256 8e5e6cd934b60c2224c5405e588be8322f5d199983a7fc65ed85b641cc72dd25

See more details on using hashes here.

Supported by

AWS Cloud computing and Security Sponsor Datadog Monitoring Depot Continuous Integration Fastly CDN Google Download Analytics Pingdom Monitoring Sentry Error logging StatusPage Status page