A Python REST client for making requests to the Microsoft Fabric API

FabRest

FabRest is a Python SDK for Microsoft Fabric REST APIs. It provides a consistent, typed surface for managing workspaces and Fabric items with sync and async clients.

Features

  • Authentication via azure.identity, with a resource owner password credential (ROPC) fallback
  • Workspace and item management (create, update, delete, list)
  • Data operations for pipelines, notebooks, and more
  • Broad API coverage across Fabric resources
  • Async support for concurrent workloads

Installation

pip install fabrest

From source:

git clone https://github.com/billybillysss/fabrest.git
cd fabrest
pip install .

Quick start

FabRest exposes FabricClient and AsyncFabricClient. Workspace-scoped resources are accessed via client.workspace("id").

Authentication

from azure.identity import DefaultAzureCredential
from fabrest import FabricClient

credential = DefaultAzureCredential()
client = FabricClient(credential)

ROPC fallback (username and password):

from fabrest.api.auth import ResourceOwnerPasswordCredential
from fabrest import FabricClient

credential = ResourceOwnerPasswordCredential(
    tenant_id="your-tenant-id",
    client_id="your-client-id",
    client_secret="your-client-secret",
    username="your-username",
    password="your-password",
)
client = FabricClient(credential)

Core patterns

workspace = client.workspace("workspace-id")

# Workspaces
workspaces = client.workspaces.list()
ws = client.workspaces.get("workspace-id")

# Items (generic)
items = workspace.items.list()
lakehouses = workspace.items_for("Lakehouse").list()

Resource examples

Item resources and actions

# Lakehouse tables
tables = workspace.lakehouses.list_tables("lakehouse-id")

# SQL endpoint connection string
conn = workspace.sql_endpoints.get_connection_string("sql-endpoint-id")

# Warehouse restore points
restore_points = workspace.warehouses.list_restore_points("warehouse-id")

Data pipeline and notebook actions

# Data pipeline
pipeline_run = workspace.data_pipelines.run("pipeline-id")

# Notebook
notebook_run = workspace.notebooks.run("notebook-id")

Payload interfaces (simple and advanced)

from fabrest.models import dataflow

# Simple payload using a TypedDict interface
payload: dataflow.ExecuteQueryRequest = {
    "queryName": "GetCustomers",
}

result = workspace.dataflows.execute_query("dataflow-id", payload)

Creating a dataflow with definition parts:

from fabrest.models import dataflow

# Payload with definition parts
payload: dataflow.CreateDataflowRequest = {
    "displayName": "Customer Dataflow",
    "definition": {
        "parts": [
            {
                "path": "model.json",
                "payloadType": "InlineBase64",
                "payload": "eyJ2ZXJzaW9uIjogIjEiLCAiZW50aXRpZXMiOiBbXX0=",
            }
        ]
    },
}

dataflow_item = workspace.dataflows.create(payload)
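The InlineBase64 payload above is just a base64 encoding of the part's JSON content. A small helper like the following (a hypothetical convenience, not part of fabrest) makes the encoding reproducible instead of hand-pasting strings:

```python
import base64
import json

def to_inline_base64(obj: dict) -> str:
    """Encode a JSON-serializable object as an InlineBase64 payload string."""
    raw = json.dumps(obj).encode("utf-8")
    return base64.b64encode(raw).decode("ascii")

# The model.json content from the example above
model = {"version": "1", "entities": []}
encoded = to_inline_base64(model)
```

Decoding `encoded` with base64 and parsing it as JSON yields the original dict, so the round trip is easy to verify in tests.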

Advanced payload:

from fabrest.models import warehouse

# Warehouse restore point payload
payload: warehouse.CreateRestorePointRequest = {
    "displayName": "Before schema change",
    "description": "Restore point before ETL migration",
    "retentionDays": 14,
}

restore_point = workspace.warehouses.create_restore_point("warehouse-id", payload)

Async usage

import asyncio
from azure.identity import DefaultAzureCredential
from fabrest import AsyncFabricClient

async def main():
    client = AsyncFabricClient(DefaultAzureCredential())
    workspace = client.workspace("workspace-id")
    reports = await workspace.reports.async_list()
    await client.close()

asyncio.run(main())

Async pipeline and notebook runs:

import asyncio
from azure.identity import DefaultAzureCredential
from fabrest import AsyncFabricClient

async def main():
    client = AsyncFabricClient(DefaultAzureCredential())
    workspace = client.workspace("workspace-id")

    pipeline_run = await workspace.data_pipelines.async_run("pipeline-id")
    notebook_run = await workspace.notebooks.async_run("notebook-id")

    await client.close()

asyncio.run(main())

Typed payloads with async calls:

import asyncio
from azure.identity import DefaultAzureCredential
from fabrest import AsyncFabricClient
from fabrest.models import dataflow, warehouse

async def main():
    client = AsyncFabricClient(DefaultAzureCredential())
    workspace = client.workspace("workspace-id")

    simple_payload: dataflow.ExecuteQueryRequest = {"queryName": "GetCustomers"}
    await workspace.dataflows.async_execute_query("dataflow-id", simple_payload)

    advanced_payload: warehouse.CreateRestorePointRequest = {
        "displayName": "Before schema change",
        "description": "Restore point before ETL migration",
        "retentionDays": 14,
    }
    await workspace.warehouses.async_create_restore_point("warehouse-id", advanced_payload)

    await client.close()

asyncio.run(main())
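The examples above await each call in turn, but the point of the async client is concurrency: independent operations can be launched together with asyncio.gather. A minimal sketch of that pattern, where run_item is a stub standing in for SDK calls such as workspace.notebooks.async_run:

```python
import asyncio

async def run_item(name: str) -> str:
    # Stand-in for an SDK coroutine, e.g. workspace.data_pipelines.async_run(...)
    await asyncio.sleep(0.01)
    return f"{name}: done"

async def main() -> list:
    # Launch both runs concurrently instead of awaiting one after the other;
    # gather preserves the order of its arguments in the result list.
    return await asyncio.gather(run_item("pipeline"), run_item("notebook"))

results = asyncio.run(main())
```

The same shape applies to any mix of async_run, async_list, or payload-based calls, as long as the operations do not depend on each other's results.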

Pagination

page = workspace.lakehouses.list(recursive=True)
next_page = workspace.lakehouses.list(continuation_token="token")

List endpoints return items aggregated across pages. The SDK normalizes item lists from both the value and data fields of API responses.
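To illustrate the normalization described above: some Fabric endpoints wrap their items in a value field and others in data. A helper like the following (a hypothetical sketch mirroring what the SDK does internally, not a fabrest API) shows the idea:

```python
def extract_items(response: dict) -> list:
    """Return the item list from a Fabric-style response body,
    whichever of 'value' or 'data' the endpoint used."""
    for key in ("value", "data"):
        if key in response:
            return response[key]
    return []

# Both response shapes normalize to a plain list of items
page_a = extract_items({"value": [{"id": "1"}], "continuationToken": "abc"})
page_b = extract_items({"data": [{"id": "2"}]})
```

Because the SDK performs this normalization for you, list calls return plain item lists regardless of which field the endpoint used.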

LRO and raw responses

from fabrest.transport import RequestOptions

options = RequestOptions(wait_for_completion=False, raw_response=True)
response = workspace.lakehouses.create({"displayName": "My Lakehouse"}, options=options)
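With wait_for_completion=False the call returns before the long-running operation finishes, so the caller is responsible for polling until it reaches a terminal state. A generic polling loop sketch, where get_status is a stand-in for however you fetch the operation's state (for example, from the operation URL in the raw response):

```python
import time

def poll_until_done(get_status, interval: float = 0.01, timeout: float = 5.0) -> str:
    """Call get_status() until it reports a terminal state or the timeout expires."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status = get_status()
        if status in ("Succeeded", "Failed"):
            return status
        time.sleep(interval)
    raise TimeoutError("operation did not complete in time")
```

The terminal state names here are illustrative; check the actual operation payload for the states your endpoint reports.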

Error handling

from fabrest.errors import HttpError, ThrottledError

try:
    workspace.sql_endpoints.refresh_metadata("sql-endpoint-id")
except ThrottledError as exc:
    print(exc.status_code, exc.payload)
except HttpError as exc:
    print(exc.status_code, exc.payload)
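A ThrottledError usually means the request should be retried after a delay. A generic exponential-backoff sketch of that pattern; the ThrottledError class below is a local stand-in (so the example is self-contained), not the fabrest.errors type:

```python
import time

class ThrottledError(Exception):
    """Stand-in for fabrest.errors.ThrottledError, for illustration only."""

def with_retries(call, max_attempts: int = 3, base_delay: float = 0.01):
    """Invoke call(), retrying on throttling with exponential backoff."""
    for attempt in range(max_attempts):
        try:
            return call()
        except ThrottledError:
            if attempt == max_attempts - 1:
                raise  # out of attempts; surface the error to the caller
            time.sleep(base_delay * (2 ** attempt))
```

If the real error exposes a server-suggested delay (e.g. a Retry-After value in its payload), prefer that over a fixed backoff schedule.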

Documentation

Detailed documentation is under development. For now, refer to the source code and inline comments for module and function usage.

Contributing

Contributions are welcome. Please open an Issue or submit a Pull Request.

License

This project is licensed under the MIT License - see the LICENSE file for details.

Contact

For support, open an issue on the GitHub repository or contact the maintainers directly.


Note: This project is not officially affiliated with Microsoft or the Fabric team.
