
REST client for Databricks




A REST client for the Databricks REST API.

This module is a thin layer for building HTTP requests. Rather than exposing each API operation as a distinct method, it provides generic methods for building arbitrary API calls.

The Databricks API sometimes returns an HTTP 200 status code with HTML content when the request is not properly authenticated. The client intercepts such occurrences (by detecting non-JSON response content) and wraps them in an exception.
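This failure mode can be illustrated generically. The following is a sketch of the detection pattern, not the client's actual internals; the function name and exception wording are hypothetical:

```python
import json

def parse_response(status_code, body):
    # A 200 status does not guarantee a valid API response: an
    # unauthenticated request may get back an HTML login page.
    # Detect that by attempting to parse the body as JSON.
    try:
        return json.loads(body)
    except ValueError:
        raise RuntimeError(
            "Expected JSON from the Databricks API but got non-JSON "
            "content (HTTP %d): %r" % (status_code, body[:60]))
```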

This open-source project is not developed by nor affiliated with Databricks.


pip install databricks-client


import databricks_client

client = databricks_client.create("")
clusters_list = client.get('clusters/list')
for cluster in clusters_list["clusters"]:
    print(cluster["cluster_name"])

Usage with a newly provisioned workspace

If using this module as part of a provisioning job, you need to call client.ensure_available().

When the first user logs in to a new Databricks workspace, workspace provisioning is triggered, and the API is not available until provisioning has completed (this usually takes under a minute, but can take longer depending on the network configuration). Until then, calling the API returns an error such as the following:

"Succeeded{"error_code":"INVALID_PARAMETER_VALUE","message":"Unknown worker environment WorkerEnvId(workerenv-4312344789891641)"}

The method client.ensure_available(url="instance-pools/list", retries=100, delay_seconds=6) prevents this error by polling the provided URL, retrying for as long as the workspace is in the provisioning state or until the given number of retries is exhausted.
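The retry behaviour can be sketched as a generic polling loop. This illustrates the pattern, not the library's implementation; the function name and error-matching string are assumptions based on the sample error above:

```python
import time

def wait_until_provisioned(fetch, retries=100, delay_seconds=6):
    # Poll `fetch` (e.g. a GET on instance-pools/list) until it stops
    # failing with the provisioning error, or until retries run out.
    last_error = None
    for _ in range(retries):
        try:
            return fetch()
        except RuntimeError as exc:
            if "Unknown worker environment" not in str(exc):
                raise  # a different error: don't mask it
            last_error = exc
            time.sleep(delay_seconds)
    raise TimeoutError("workspace still provisioning: %s" % last_error)
```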

Usage with Azure Active Directory

Note: Azure AD authentication for Databricks is currently in preview.

The client generates short-lived Azure AD tokens. If you need to use the client for longer than the token lifetime (typically around 30 minutes), rerun client.auth_azuread periodically.
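One way to handle this is to track the token's age and re-authenticate before each call once it nears expiry. The helper below is hypothetical (not part of databricks-client); the 30-minute default reflects the typical lifetime mentioned above:

```python
import time

class TokenRefresher:
    """Hypothetical helper: re-runs an auth callback (e.g. a function
    wrapping client.auth_azuread) before the token lifetime lapses."""

    def __init__(self, reauth, lifetime_seconds=30 * 60, margin_seconds=5 * 60):
        self.reauth = reauth
        self.refresh_after = lifetime_seconds - margin_seconds
        self.reauth()  # initial authentication
        self.last_auth = time.monotonic()

    def before_call(self):
        # Call this before each API request; it re-authenticates only
        # when the current token is close to expiring.
        if time.monotonic() - self.last_auth >= self.refresh_after:
            self.reauth()
            self.last_auth = time.monotonic()
```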

Azure AD authentication with Azure CLI

Install the Azure CLI.

pip install databricks-client[azurecli]
az login
import databricks_client

client = databricks_client.create("")
client.auth_azuread("/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/my-rg/providers/Microsoft.Databricks/workspaces/my-workspace")
# or client.auth_azuread(resource_group="my-rg", workspace_name="my-workspace")
clusters_list = client.get('clusters/list')
for cluster in clusters_list["clusters"]:
    print(cluster["cluster_name"])

This is recommended with Azure DevOps Pipelines using the Azure CLI task.

Azure AD authentication with ADAL

pip install databricks-client
pip install adal
import databricks_client
import adal

authority_host_uri = 'https://login.microsoftonline.com'
authority_uri = authority_host_uri + '/' + tenant_id  # tenant_id: your Azure AD tenant ID
context = adal.AuthenticationContext(authority_uri)

def token_callback(resource):
    # client_id and client_secret identify your Azure AD service principal
    return context.acquire_token_with_client_credentials(resource, client_id, client_secret)["accessToken"]

client = databricks_client.create("")
client.auth_azuread("/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/my-rg/providers/Microsoft.Databricks/workspaces/my-workspace", token_callback)
# or client.auth_azuread(resource_group="my-rg", workspace_name="my-workspace", token_callback=token_callback)
clusters_list = client.get('clusters/list')
for cluster in clusters_list["clusters"]:
    print(cluster["cluster_name"])

Example usages

Generating a PAT token

response = client.post(
    'token/create',
    json={"lifetime_seconds": 60, "comment": "Unit Test Token"}
)
pat_token = response['token_value']

Uploading a notebook

import base64

with open(notebook_file, "rb") as f:
    file_content = f.read()

client.post(
    'workspace/import',
    json={
        "content": base64.b64encode(file_content).decode('ascii'),
        "path": notebook_path,
        "overwrite": False,
        "language": "PYTHON",
        "format": "SOURCE"
    }
)

Files for databricks-client, version 0.0.3:

databricks_client-0.0.3-py3-none-any.whl (4.9 kB, wheel, py3)
databricks_client-0.0.3.tar.gz (4.6 kB, source)
