
Databricks API client auto-generated from the official databricks-cli package

Project description


[This documentation is auto-generated]

This package provides a simplified interface to the Databricks REST API. The interface is auto-generated on instantiation using the underlying client library from the official databricks-cli Python package.

Install using

pip install databricks-api

The docs here describe the interface for version 0.17.0 of the databricks-cli package for API version 2.0.

The databricks-api package contains a DatabricksAPI class, which provides instance attributes for the databricks-cli ApiClient as well as for each of the available service instances. The attributes of a DatabricksAPI instance are:

  • DatabricksAPI.client <databricks_cli.sdk.api_client.ApiClient>

  • DatabricksAPI.jobs <databricks_cli.sdk.service.JobsService>

  • DatabricksAPI.cluster <databricks_cli.sdk.service.ClusterService>

  • DatabricksAPI.policy <databricks_cli.sdk.service.PolicyService>

  • DatabricksAPI.managed_library <databricks_cli.sdk.service.ManagedLibraryService>

  • DatabricksAPI.dbfs <databricks_cli.sdk.service.DbfsService>

  • DatabricksAPI.workspace <databricks_cli.sdk.service.WorkspaceService>

  • DatabricksAPI.secret <databricks_cli.sdk.service.SecretService>

  • DatabricksAPI.groups <databricks_cli.sdk.service.GroupsService>

  • DatabricksAPI.token <databricks_cli.sdk.service.TokenService>

  • DatabricksAPI.instance_pool <databricks_cli.sdk.service.InstancePoolService>

  • DatabricksAPI.delta_pipelines <databricks_cli.sdk.service.DeltaPipelinesService>

  • DatabricksAPI.repos <databricks_cli.sdk.service.ReposService>

To instantiate the client, provide the Databricks host and either a token or a user and password. The full signature of the underlying ApiClient.__init__ is also shown below.

from databricks_api import DatabricksAPI

# Provide a host and token
db = DatabricksAPI(
    host="example.cloud.databricks.com",
    token="dpapi123..."
)

# OR a host and user and password
db = DatabricksAPI(
    host="example.cloud.databricks.com",
    user="me@example.com",
    password="password"
)

# Full __init__ signature
db = DatabricksAPI(
    user=None,
    password=None,
    host=None,
    token=None,
    api_version='2.0',
    default_headers={},
    verify=True,
    command_name='',
    jobs_api_version=None
)

Refer to the official Databricks REST API documentation for the functionality and required arguments of each method listed below.

Each of the service instance attributes provides the following public methods:

DatabricksAPI.jobs

db.jobs.cancel_run(
    run_id,
    headers=None,
    version=None,
)

db.jobs.create_job(
    name=None,
    existing_cluster_id=None,
    new_cluster=None,
    libraries=None,
    email_notifications=None,
    timeout_seconds=None,
    max_retries=None,
    min_retry_interval_millis=None,
    retry_on_timeout=None,
    schedule=None,
    notebook_task=None,
    spark_jar_task=None,
    spark_python_task=None,
    spark_submit_task=None,
    max_concurrent_runs=None,
    tasks=None,
    headers=None,
    version=None,
)

db.jobs.delete_job(
    job_id,
    headers=None,
    version=None,
)

db.jobs.delete_run(
    run_id=None,
    headers=None,
    version=None,
)

db.jobs.export_run(
    run_id,
    views_to_export=None,
    headers=None,
    version=None,
)

db.jobs.get_job(
    job_id,
    headers=None,
    version=None,
)

db.jobs.get_run(
    run_id=None,
    headers=None,
    version=None,
)

db.jobs.get_run_output(
    run_id,
    headers=None,
    version=None,
)

db.jobs.list_jobs(
    job_type=None,
    expand_tasks=None,
    limit=None,
    offset=None,
    headers=None,
    version=None,
)

db.jobs.list_runs(
    job_id=None,
    active_only=None,
    completed_only=None,
    offset=None,
    limit=None,
    headers=None,
    version=None,
)

db.jobs.reset_job(
    job_id,
    new_settings,
    headers=None,
    version=None,
)

db.jobs.run_now(
    job_id=None,
    jar_params=None,
    notebook_params=None,
    python_params=None,
    spark_submit_params=None,
    python_named_params=None,
    idempotency_token=None,
    headers=None,
    version=None,
)

db.jobs.submit_run(
    run_name=None,
    existing_cluster_id=None,
    new_cluster=None,
    libraries=None,
    notebook_task=None,
    spark_jar_task=None,
    spark_python_task=None,
    spark_submit_task=None,
    timeout_seconds=None,
    tasks=None,
    headers=None,
    version=None,
)
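
For illustration, an existing job might be triggered and its recent runs listed as sketched below; the job ID and notebook parameters are placeholders, not values from this documentation:

# Trigger an existing job with notebook parameters (placeholder values)
run = db.jobs.run_now(
    job_id=123,
    notebook_params={"env": "staging"},
)

# List the five most recent runs of that job
runs = db.jobs.list_runs(job_id=123, limit=5)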

DatabricksAPI.cluster

db.cluster.create_cluster(
    num_workers=None,
    autoscale=None,
    cluster_name=None,
    spark_version=None,
    spark_conf=None,
    aws_attributes=None,
    node_type_id=None,
    driver_node_type_id=None,
    ssh_public_keys=None,
    custom_tags=None,
    cluster_log_conf=None,
    spark_env_vars=None,
    autotermination_minutes=None,
    enable_elastic_disk=None,
    cluster_source=None,
    instance_pool_id=None,
    headers=None,
)

db.cluster.delete_cluster(
    cluster_id,
    headers=None,
)

db.cluster.edit_cluster(
    cluster_id,
    num_workers=None,
    autoscale=None,
    cluster_name=None,
    spark_version=None,
    spark_conf=None,
    aws_attributes=None,
    node_type_id=None,
    driver_node_type_id=None,
    ssh_public_keys=None,
    custom_tags=None,
    cluster_log_conf=None,
    spark_env_vars=None,
    autotermination_minutes=None,
    enable_elastic_disk=None,
    cluster_source=None,
    instance_pool_id=None,
    headers=None,
)

db.cluster.get_cluster(
    cluster_id,
    headers=None,
)

db.cluster.get_events(
    cluster_id,
    start_time=None,
    end_time=None,
    order=None,
    event_types=None,
    offset=None,
    limit=None,
    headers=None,
)

db.cluster.list_available_zones(headers=None)

db.cluster.list_clusters(headers=None)

db.cluster.list_node_types(headers=None)

db.cluster.list_spark_versions(headers=None)

db.cluster.permanent_delete_cluster(
    cluster_id,
    headers=None,
)

db.cluster.pin_cluster(
    cluster_id,
    headers=None,
)

db.cluster.resize_cluster(
    cluster_id,
    num_workers=None,
    autoscale=None,
    headers=None,
)

db.cluster.restart_cluster(
    cluster_id,
    headers=None,
)

db.cluster.start_cluster(
    cluster_id,
    headers=None,
)

db.cluster.unpin_cluster(
    cluster_id,
    headers=None,
)
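
A minimal sketch of creating and pinning a small cluster; the Spark version and node type strings are placeholders that depend on your cloud and workspace:

# Create a small auto-terminating cluster (placeholder runtime and node type)
cluster = db.cluster.create_cluster(
    cluster_name="example-cluster",
    spark_version="9.1.x-scala2.12",
    node_type_id="i3.xlarge",
    num_workers=2,
    autotermination_minutes=30,
)

# The create response includes the new cluster_id
db.cluster.pin_cluster(cluster["cluster_id"])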

DatabricksAPI.policy

db.policy.create_policy(
    policy_name,
    definition,
    headers=None,
)

db.policy.delete_policy(
    policy_id,
    headers=None,
)

db.policy.edit_policy(
    policy_id,
    policy_name,
    definition,
    headers=None,
)

db.policy.get_policy(
    policy_id,
    headers=None,
)

db.policy.list_policies(headers=None)
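
A cluster policy might be created as follows; the definition is a JSON document serialized to a string, and the single rule shown is only a placeholder:

import json

# Placeholder policy: pin the auto-termination timeout to 30 minutes
definition = json.dumps({
    "autotermination_minutes": {"type": "fixed", "value": 30},
})

db.policy.create_policy(
    policy_name="example-policy",
    definition=definition,
)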

DatabricksAPI.managed_library

db.managed_library.all_cluster_statuses(headers=None)

db.managed_library.cluster_status(
    cluster_id,
    headers=None,
)

db.managed_library.install_libraries(
    cluster_id,
    libraries=None,
    headers=None,
)

db.managed_library.uninstall_libraries(
    cluster_id,
    libraries=None,
    headers=None,
)
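
For example, a PyPI library could be installed on a running cluster and its status checked; the cluster ID and package name are placeholders:

# Install a PyPI package on a cluster (placeholder cluster ID)
db.managed_library.install_libraries(
    cluster_id="1234-567890-abcde123",
    libraries=[{"pypi": {"package": "requests"}}],
)

status = db.managed_library.cluster_status("1234-567890-abcde123")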

DatabricksAPI.dbfs

db.dbfs.add_block(
    handle,
    data,
    headers=None,
)

db.dbfs.add_block_test(
    handle,
    data,
    headers=None,
)

db.dbfs.close(
    handle,
    headers=None,
)

db.dbfs.close_test(
    handle,
    headers=None,
)

db.dbfs.create(
    path,
    overwrite=None,
    headers=None,
)

db.dbfs.create_test(
    path,
    overwrite=None,
    headers=None,
)

db.dbfs.delete(
    path,
    recursive=None,
    headers=None,
)

db.dbfs.delete_test(
    path,
    recursive=None,
    headers=None,
)

db.dbfs.get_status(
    path,
    headers=None,
)

db.dbfs.get_status_test(
    path,
    headers=None,
)

db.dbfs.list(
    path,
    headers=None,
)

db.dbfs.list_test(
    path,
    headers=None,
)

db.dbfs.mkdirs(
    path,
    headers=None,
)

db.dbfs.mkdirs_test(
    path,
    headers=None,
)

db.dbfs.move(
    source_path,
    destination_path,
    headers=None,
)

db.dbfs.move_test(
    source_path,
    destination_path,
    headers=None,
)

db.dbfs.put(
    path,
    contents=None,
    overwrite=None,
    headers=None,
    src_path=None,
)

db.dbfs.put_test(
    path,
    contents=None,
    overwrite=None,
    headers=None,
    src_path=None,
)

db.dbfs.read(
    path,
    offset=None,
    length=None,
    headers=None,
)

db.dbfs.read_test(
    path,
    offset=None,
    length=None,
    headers=None,
)
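
A small file might be written to and read back from DBFS as sketched below; the REST API exchanges file contents as base64-encoded strings, and the path is a placeholder:

import base64

# Upload a small file (contents must be base64-encoded)
db.dbfs.put(
    path="/tmp/example.txt",
    contents=base64.b64encode(b"hello").decode("ascii"),
    overwrite=True,
)

# Read it back; the "data" field in the response is base64-encoded
resp = db.dbfs.read("/tmp/example.txt", offset=0, length=1024)
data = base64.b64decode(resp["data"])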

DatabricksAPI.workspace

db.workspace.delete(
    path,
    recursive=None,
    headers=None,
)

db.workspace.export_workspace(
    path,
    format=None,
    direct_download=None,
    headers=None,
)

db.workspace.get_status(
    path,
    headers=None,
)

db.workspace.import_workspace(
    path,
    format=None,
    language=None,
    content=None,
    overwrite=None,
    headers=None,
)

db.workspace.list(
    path,
    headers=None,
)

db.workspace.mkdirs(
    path,
    headers=None,
)
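
Importing a small Python notebook might look like the sketch below; the content must be base64-encoded, and the workspace paths are placeholders:

import base64

source = b"print('hello from databricks')"

# Import the source as a Python notebook (placeholder workspace path)
db.workspace.import_workspace(
    path="/Users/me@example.com/example",
    format="SOURCE",
    language="PYTHON",
    content=base64.b64encode(source).decode("ascii"),
    overwrite=True,
)

items = db.workspace.list("/Users/me@example.com")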

DatabricksAPI.secret

db.secret.create_scope(
    scope,
    initial_manage_principal=None,
    scope_backend_type=None,
    backend_azure_keyvault=None,
    headers=None,
)

db.secret.delete_acl(
    scope,
    principal,
    headers=None,
)

db.secret.delete_scope(
    scope,
    headers=None,
)

db.secret.delete_secret(
    scope,
    key,
    headers=None,
)

db.secret.get_acl(
    scope,
    principal,
    headers=None,
)

db.secret.list_acls(
    scope,
    headers=None,
)

db.secret.list_scopes(headers=None)

db.secret.list_secrets(
    scope,
    headers=None,
)

db.secret.put_acl(
    scope,
    principal,
    permission,
    headers=None,
)

db.secret.put_secret(
    scope,
    key,
    string_value=None,
    bytes_value=None,
    headers=None,
)
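
For example, a secret scope might be created and a secret stored in it; the scope name, key, and value are placeholders:

# Create a Databricks-backed scope manageable by all users (placeholder names)
db.secret.create_scope(
    scope="example-scope",
    initial_manage_principal="users",
)

db.secret.put_secret(
    scope="example-scope",
    key="db-password",
    string_value="s3cret",
)

keys = db.secret.list_secrets("example-scope")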

DatabricksAPI.groups

db.groups.add_to_group(
    parent_name,
    user_name=None,
    group_name=None,
    headers=None,
)

db.groups.create_group(
    group_name,
    headers=None,
)

db.groups.get_group_members(
    group_name,
    headers=None,
)

db.groups.get_groups(headers=None)

db.groups.get_groups_for_principal(
    user_name=None,
    group_name=None,
    headers=None,
)

db.groups.remove_from_group(
    parent_name,
    user_name=None,
    group_name=None,
    headers=None,
)

db.groups.remove_group(
    group_name,
    headers=None,
)
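
A minimal sketch of creating a group and adding a user to it; the group and user names are placeholders:

db.groups.create_group("data-engineers")

db.groups.add_to_group(
    parent_name="data-engineers",
    user_name="me@example.com",
)

members = db.groups.get_group_members("data-engineers")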

DatabricksAPI.token

db.token.create_token(
    lifetime_seconds=None,
    comment=None,
    headers=None,
)

db.token.list_tokens(headers=None)

db.token.revoke_token(
    token_id,
    headers=None,
)
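
For illustration, a short-lived personal access token might be created and later revoked; the lifetime and comment are placeholders:

resp = db.token.create_token(
    lifetime_seconds=3600,
    comment="temporary token for a batch script",
)

# The create response includes token_info with the token_id needed for revocation
db.token.revoke_token(resp["token_info"]["token_id"])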

DatabricksAPI.instance_pool

db.instance_pool.create_instance_pool(
    instance_pool_name=None,
    min_idle_instances=None,
    max_capacity=None,
    aws_attributes=None,
    node_type_id=None,
    custom_tags=None,
    idle_instance_autotermination_minutes=None,
    enable_elastic_disk=None,
    disk_spec=None,
    preloaded_spark_versions=None,
    headers=None,
)

db.instance_pool.delete_instance_pool(
    instance_pool_id=None,
    headers=None,
)

db.instance_pool.edit_instance_pool(
    instance_pool_id,
    instance_pool_name=None,
    min_idle_instances=None,
    max_capacity=None,
    aws_attributes=None,
    node_type_id=None,
    custom_tags=None,
    idle_instance_autotermination_minutes=None,
    enable_elastic_disk=None,
    disk_spec=None,
    preloaded_spark_versions=None,
    headers=None,
)

db.instance_pool.get_instance_pool(
    instance_pool_id=None,
    headers=None,
)

db.instance_pool.list_instance_pools(headers=None)
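
An instance pool might be created and read back as sketched below; the pool name and node type are placeholders that depend on your cloud:

pool = db.instance_pool.create_instance_pool(
    instance_pool_name="example-pool",
    node_type_id="i3.xlarge",
    min_idle_instances=1,
    max_capacity=10,
    idle_instance_autotermination_minutes=30,
)

# The create response includes the new instance_pool_id
info = db.instance_pool.get_instance_pool(pool["instance_pool_id"])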

DatabricksAPI.delta_pipelines

db.delta_pipelines.create(
    id=None,
    name=None,
    storage=None,
    configuration=None,
    clusters=None,
    libraries=None,
    trigger=None,
    filters=None,
    allow_duplicate_names=None,
    headers=None,
)

db.delta_pipelines.delete(
    pipeline_id=None,
    headers=None,
)

db.delta_pipelines.deploy(
    pipeline_id=None,
    id=None,
    name=None,
    storage=None,
    configuration=None,
    clusters=None,
    libraries=None,
    trigger=None,
    filters=None,
    allow_duplicate_names=None,
    headers=None,
)

db.delta_pipelines.get(
    pipeline_id=None,
    headers=None,
)

db.delta_pipelines.list(
    pagination=None,
    headers=None,
)

db.delta_pipelines.reset(
    pipeline_id=None,
    headers=None,
)

db.delta_pipelines.run(
    pipeline_id=None,
    headers=None,
)

db.delta_pipelines.start_update(
    pipeline_id=None,
    full_refresh=None,
    headers=None,
)

db.delta_pipelines.stop(
    pipeline_id=None,
    headers=None,
)
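
For example, the existing pipelines might be listed and a full-refresh update triggered on one of them; the pipeline ID is a placeholder:

pipelines = db.delta_pipelines.list()

db.delta_pipelines.start_update(
    pipeline_id="<pipeline-id>",
    full_refresh=True,
)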

DatabricksAPI.repos

db.repos.create_repo(
    url,
    provider,
    path=None,
    headers=None,
)

db.repos.delete_repo(
    id,
    headers=None,
)

db.repos.get_repo(
    id,
    headers=None,
)

db.repos.list_repos(
    path_prefix=None,
    next_page_token=None,
    headers=None,
)

db.repos.update_repo(
    id,
    branch=None,
    tag=None,
    headers=None,
)
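
A minimal sketch of creating a repo from a Git URL and switching it to a branch; the URL, provider string, and paths are placeholders:

repo = db.repos.create_repo(
    url="https://github.com/example/example-repo.git",
    provider="gitHub",
    path="/Repos/me@example.com/example-repo",
)

# The create response includes the numeric repo id
db.repos.update_repo(repo["id"], branch="main")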



Download files

Download the file for your platform.

Source Distribution

databricks_api-0.8.0.tar.gz (10.0 kB)

Built Distribution

databricks_api-0.8.0-py3-none-any.whl (7.2 kB)

File details

Details for the file databricks_api-0.8.0.tar.gz.

File metadata

  • Download URL: databricks_api-0.8.0.tar.gz
  • Size: 10.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.1.11 CPython/3.8.2 Darwin/20.1.0

File hashes

Hashes for databricks_api-0.8.0.tar.gz
  • SHA256: 49cbf99a4faef50b57ec79e74820bf9ff1f690090359b78eb82889f9962714e6
  • MD5: 0a5b1a95ea4915bc06f01dadeb36fdda
  • BLAKE2b-256: 675bfdaaa9ce90d45686b44ad048fe01cea6efab2fb8529f58df0380a540def6


File details

Details for the file databricks_api-0.8.0-py3-none-any.whl.

File metadata

  • Download URL: databricks_api-0.8.0-py3-none-any.whl
  • Size: 7.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.1.11 CPython/3.8.2 Darwin/20.1.0

File hashes

Hashes for databricks_api-0.8.0-py3-none-any.whl
  • SHA256: d689b9453f4772277f7df179a9e0abd0d5389590eb790cdfa14d714fd1541a9e
  • MD5: 8bcdca4a2d6ab3f26d069f5830f4fa35
  • BLAKE2b-256: c2690d7f361d655eda7dcaf0f3e41bd0e44c4c7dc41ae40911906723200a3883

