
Simplify usage of the Power BI REST API

Project description

SimplePBI


Donate: your help is appreciated.

This is a simple library that makes it easy to use the Power BI REST API and the Fabric REST API. We have covered more than 90% of the Power BI requests, and now we are working on Microsoft Fabric too. We hope that one day SimplePBI will cover every category in both APIs, or at least that's our dream. Feel free to check the docs to get a deeper understanding of a specific request:

Also make sure to check the DeepWiki doc for this project (AI-generated): https://deepwiki.com/ladataweb/SimplePBI

We are doing our best to make this library useful for the community. This project is not paid work for us; it is a public, open-source community project and a way to express our passion for sharing knowledge. Please be patient if you submit an issue and it is not fixed right away.

Each category is an object, so we need to initialize an object before using its methods. Creating one requires a Bearer token, which can be obtained from a Token object. Let's see how to create an Admin object to try the requests in that category.

# Import library
from simplepbi import token
from simplepbi import admin
from simplepbi.fabric import adminfab

We always need to import the token object to create the object that runs requests. Then we can pick the object for the Power BI REST API category we need, for instance "admin". The token can be created in two ways, regular authentication or service principal authentication; the arguments you fill in depend on which one you pick. These are the arguments needed to get a token:

  • tenant_id (you can get it from subscription resource in azure portal or ask for it to the IT department)
  • app_client_id (your app_id/client_id from the App registered in Azure with permissions to Power Bi Service)
  • username (professional email account in Azure AD)
  • password (professional password)
  • app_secret_key (secret key generated for the client id)
  • use_service_principal (True to authenticate with a Service Principal, False to continue with user credentials)

NOTE: if you want to use service principal, be sure to have your tenant ready for that.
Register app example: https://blog.ladataweb.com.ar/post/188045227735/get-access-token
Service Principal permissions for admin api: https://docs.microsoft.com/en-us/power-bi/admin/read-only-apis-service-principal-authentication

# Creating objects

#Regular Login
tok = token.Token(tenant_id, app_client_id, username, password, None, use_service_principal=False)

#Service Principal
tok = token.Token(tenant_id, app_client_id, None, None, app_secret_key, use_service_principal=True)

ad = admin.Admin(tok.token)

it = adminfab.Items(tok.token)

As you can see, the Token object contains a token attribute holding the Bearer token Azure uses to run REST methods. That attribute is used to create the category objects, like admin. Once we create an object like admin, we can start using its requests, adding the correct parameters where needed.

# Getting objects

All_Datasets = ad.get_datasets()

Datasets_In_Groups = ad.get_datasets_in_group(workspace_id)

Items_In_Workspace = it.list_items(workspace_id)

Each library get request returns the response object's .json(), which Python reads as a dict.
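For instance, a get_datasets-style response can be handled like any Python dict. This is a sketch with made-up sample data; real responses carry more fields:

```python
# A typical get response, as returned by response.json(), is a plain dict.
# The list of objects usually sits under the "value" key.
sample_response = {
    "value": [
        {"id": "abc-123", "name": "Sales"},
        {"id": "def-456", "name": "Finance"},
    ]
}

# Iterate the datasets like any Python list of dicts.
dataset_names = [d["name"] for d in sample_response["value"]]
print(dataset_names)  # ['Sales', 'Finance']
```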

The Core category requests in Fabric

The core requests in Fabric are the operational requests for regular users. Some examples are workspaces, folders, items, git integration, schedules, and gateways. Core contains an items subcategory that can handle any item created in Fabric. You can list items and filter by a type, such as report, when getting them. Items can be anything: notebooks, pipelines, semantic models, etc.
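Picking items of one type out of a list_items-style response is plain dict filtering. A minimal sketch with invented sample data (real Fabric items carry more fields):

```python
# Sketch: filtering a Fabric list_items-style response by item type.
items_response = {
    "value": [
        {"id": "1", "displayName": "Daily ETL", "type": "Notebook"},
        {"id": "2", "displayName": "Sales Report", "type": "Report"},
        {"id": "3", "displayName": "Sales Model", "type": "SemanticModel"},
    ]
}

# Keep only the items whose type is Report.
reports = [i for i in items_response["value"] if i["type"] == "Report"]
print([r["displayName"] for r in reports])  # ['Sales Report']
```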

Preview methods

There are some methods in the classes that still need more testing. Those have "preview" at the end of their names. Please let us know if something goes wrong with them. All Fabric requests are in preview, even when their names don't say so.

Current Categories

Right now the library is consuming endpoints from:

Complex requests

If you want a deeper look at complex Admin methods and unique methods, check this doc

Azure Pause Resume Resources

We have added a new feature covering part of the Azure Resource Manager API. The new "azpause" class lets you pause or resume Azure tabular or capacity resources. With SimplePBI you can pause and resume Fabric, Power BI Embedded, or Azure Analysis Services resources. Check this doc

Additional content

There is an additional Utils library for transformations. It helps handle the different values some requests return. The most useful method in the Utils class might be to_pandas. You can use it to convert simple dicts to pandas. It needs the dict and the parent key of the list of dicts in the response. The usual get responses use "value" as that key. We are also adding new methods alongside the requests to enable new actions. Examples:
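A minimal sketch of what that conversion looks like, using made-up sample data with the list under the parent key "value" (the real helper is the to_pandas method in the Utils class; this just illustrates the idea):

```python
import pandas as pd

# A response dict whose "value" key (the parent key) holds a list of dicts.
response = {
    "value": [
        {"id": "a1", "name": "Sales", "configuredBy": "ana@contoso.com"},
        {"id": "b2", "name": "Finance", "configuredBy": "leo@contoso.com"},
    ]
}

# Turn the list under the parent key into a DataFrame,
# roughly what a to_pandas(response, "value") helper would do.
df = pd.DataFrame(response["value"])
print(df.shape)  # (2, 3)
```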

Example of our amazing unique requests

  • get_orphan_dataflows_preview: get dataflows without dataset
  • simple_import_pbix: makes publishing a pbix file easier
  • simple_import_pbix_as_parameter: import a pbix from api response content
  • simple_import_pbix_folder_in_group_preview: posts all pbix files in a local folder
  • simple_import_from_devops: import a pbix from an azure devops repo
  • simple_import_from_github: import a pbix from a github repo
  • simple_copy_reports_between_groups: copy a report from one workspace to another
  • enhanced_refresh_dataset_in_group: a special request feature that not only eliminates the need for synchronous client connections to perform a refresh, but also unlocks enterprise-grade refresh capabilities.
  • get_activity_events_preview (already iterating): makes getting the activity events for a specified date easier
  • get_user_artifact_access_preview (already iterating): makes getting a user's artifact access easier
  • get_widely_shared_artifacts_published_to_web (already iterating): makes getting the published-to-web artifacts info easier
  • get_dataset_roles_in_group: get all the roles from a single dataset in a specific workspace
  • get_datasets_roles_in_groups: get all the roles from all datasets in a list of workspaces
  • create_doc_by_table_semantic_model_in_group(workspace_id, dataset_id, doc_type, path): generate an html code file or text with a document of semantic model in a workspace organized by tables
  • create_html_semantic_model_documentation(workspace_id, semantic_model_id, output_html_path): a new version generating an html code file or text with a document of semantic model in a workspace organized by tables
  • list_roles_from_semantic_model(workspace_id, semantic_model_id): returns the roles of the specified semantic model
  • get_tables_schema_from_semantic_model(workspace_id, semantic_model_id): returns the tables schema of the specified semantic model
  • get_tables_partitions_from_semantic_model(workspace_id, semantic_model_id): returns the tables partitions of the specified semantic model
  • save_semantic_model_definition_local("xxxxxx", "xxxxxx", path): locally stores a semantic model definition in TMDL format
  • save_report_definition_local("xxxxxx", "xxxxxx", path, report_connection=None): locally stores a report definition in PBIR format

Small categories

Small categories like Dataflow Storage Accounts and Available Features were moved to Groups and Admin.

Missing endpoints

We are still developing the library. The following endpoints are still missing:

Fabric API

  • Admin (External data shares, labels and tenants)
  • Core (Capacities, deployment pipelines, external data shares, gateways, managed private endpoints)
  • All other categories except SemanticModels, Reports and DataPipelines

Power Bi Rest API

  • Admin (Set and Remove LabelsAsAdmin)
  • Groups (Update group User)
  • Reports
    • Export To File (full request, there is a smaller simpler one)
    • Get Export To File Status (regular and in groups)
    • Get File Of Export To File (regular and in groups)
    • Update Datasources (rdl files regular and in groups)
    • Update Report Content (regular and in groups)
  • Imports
    • Create Temporary Upload Location
    • Create Temporary Upload Location In Group
    • Post Import (for xlsx, json and rdl)
    • Post Import In Group (for xlsx, json and rdl)
  • Gateways
    • Create Datasource (looks like there is a bug on the API)
    • Update Datasource
    • Delete Datasource
  • Embed Token (All requests)

Next Steps (planned items)

  • Complete the Fabric API requests for admin and core.
  • Create new awesome ideas.
  • Keep completing the missing endpoint categories.
  • Focus on the Fabric REST API and wind down Power BI REST API development.
