Databricks Jobs API 2.1 Client
databricks-jobs
The Jobs API allows you to create, edit, and delete jobs. You should never hard code secrets or store them in plain text. Use the Secrets API to manage secrets in the Databricks CLI. Use the Secrets utility to reference secrets in notebooks and jobs.
This Python package is automatically generated by the OpenAPI Generator project:
- API version: 2.1
- Package version: 1.0.0
- Build package: org.openapitools.codegen.languages.PythonNextgenClientCodegen
Requirements
Python 3.7+
Installation & Usage

pip install

If the Python package is hosted on a Git repository, you can install it directly with:

```shell
pip install git+https://github.com/GIT_USER_ID/GIT_REPO_ID.git
```

(You may need to run pip with root permission: `sudo pip install git+https://github.com/GIT_USER_ID/GIT_REPO_ID.git`.)

Then import the package:

```python
import databricks_jobs
```
Setuptools

Install via Setuptools:

```shell
python setup.py install --user
```

(or `sudo python setup.py install` to install the package for all users)

Then import the package:

```python
import databricks_jobs
```
Getting Started
Please follow the installation procedure and then run the following:

```python
import os
from pprint import pprint

import databricks_jobs
from databricks_jobs.rest import ApiException

# Defining the host is optional and defaults to https://<databricks-instance>/api
# See configuration.py for a list of all supported configuration parameters.
# The client must configure the authentication and authorization parameters
# in accordance with the API server security policy. Here, Bearer
# authorization (api_token): bearerAuth is configured, with the token read
# from an environment variable rather than hard-coded.
configuration = databricks_jobs.Configuration(
    host="https://<databricks-instance>/api",
    access_token=os.environ["BEARER_TOKEN"],
)

# Enter a context with an instance of the API client
with databricks_jobs.ApiClient(configuration) as api_client:
    # Create an instance of the API class
    api_instance = databricks_jobs.DefaultApi(api_client)
    jobs_create_request = databricks_jobs.JobsCreateRequest()  # JobsCreateRequest

    try:
        # Create a new job
        api_response = api_instance.jobs_create(jobs_create_request)
        print("The response of DefaultApi->jobs_create:\n")
        pprint(api_response)
    except ApiException as e:
        print("Exception when calling DefaultApi->jobs_create: %s\n" % e)
```
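After triggering a run (for example with jobs_run_now), a common follow-up is to poll jobs_runs_get until the run's life-cycle state is terminal. The sketch below shows only that control flow; FakeApi is a stand-in for the generated client so the example is self-contained, and the state names follow the RunLifeCycleState model (PENDING, RUNNING, TERMINATED, SKIPPED, INTERNAL_ERROR):

```python
import time

# Terminal values of RunLifeCycleState: once a run reaches one of these,
# polling can stop.
TERMINAL_STATES = {"TERMINATED", "SKIPPED", "INTERNAL_ERROR"}

def wait_for_run(api, run_id, poll_seconds=30, max_polls=100):
    """Poll jobs_runs_get until the run's life_cycle_state is terminal."""
    for _ in range(max_polls):
        run = api.jobs_runs_get(run_id)
        state = run["state"]["life_cycle_state"]
        if state in TERMINAL_STATES:
            return run
        time.sleep(poll_seconds)
    raise TimeoutError(f"run {run_id} did not reach a terminal state")

# Stand-in for the real client: reports PENDING, RUNNING, then TERMINATED.
class FakeApi:
    def __init__(self):
        self._states = iter(["PENDING", "RUNNING", "TERMINATED"])

    def jobs_runs_get(self, run_id):
        return {"run_id": run_id, "state": {"life_cycle_state": next(self._states)}}

result = wait_for_run(FakeApi(), 42, poll_seconds=0)
print(result["state"]["life_cycle_state"])
```

With the real client, `api` would be the `DefaultApi` instance from the example above, and a longer `poll_seconds` keeps the loop polite to the workspace.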
Documentation for API Endpoints
All URIs are relative to https://<databricks-instance>/api
Class | Method | HTTP request | Description |
---|---|---|---|
DefaultApi | jobs_create | POST /2.1/jobs/create | Create a new job |
DefaultApi | jobs_delete | POST /2.1/jobs/delete | Delete a job |
DefaultApi | jobs_get | GET /2.1/jobs/get | Get a single job |
DefaultApi | jobs_list | GET /2.1/jobs/list | List all jobs |
DefaultApi | jobs_reset | POST /2.1/jobs/reset | Overwrite all settings for a job |
DefaultApi | jobs_run_now | POST /2.1/jobs/run-now | Trigger a new job run |
DefaultApi | jobs_runs_cancel | POST /2.1/jobs/runs/cancel | Cancel a job run |
DefaultApi | jobs_runs_cancel_all | POST /2.1/jobs/runs/cancel-all | Cancel all runs of a job |
DefaultApi | jobs_runs_delete | POST /2.1/jobs/runs/delete | Delete a job run |
DefaultApi | jobs_runs_export | GET /2.0/jobs/runs/export | Export and retrieve a job run |
DefaultApi | jobs_runs_get | GET /2.1/jobs/runs/get | Get a single job run |
DefaultApi | jobs_runs_get_output | GET /2.1/jobs/runs/get-output | Get the output for a single run |
DefaultApi | jobs_runs_list | GET /2.1/jobs/runs/list | List runs for a job |
DefaultApi | jobs_runs_repair | POST /2.1/jobs/runs/repair | Repair a job run |
DefaultApi | jobs_runs_submit | POST /2.1/jobs/runs/submit | Create and trigger a one-time run |
DefaultApi | jobs_update | POST /2.1/jobs/update | Partially update a job |
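Every endpoint in the table above is a plain HTTP call against the workspace, so a request can be sketched without the generated package. The helper below, using only the standard library, builds (but does not send) a request for one of the paths in the table; the workspace URL, token, and job settings are placeholder assumptions:

```python
import json
import urllib.request

def build_jobs_request(host, token, method, path, payload=None):
    """Build an (unsent) HTTP request for a Jobs API path from the table above."""
    url = f"{host}/api{path}"
    headers = {"Authorization": f"Bearer {token}"}
    data = None
    if payload is not None:
        headers["Content-Type"] = "application/json"
        data = json.dumps(payload).encode("utf-8")
    return urllib.request.Request(url, data=data, headers=headers, method=method)

# Placeholder workspace URL, token, and job name for illustration only.
req = build_jobs_request(
    "https://my-workspace.cloud.databricks.com",
    "dapi-example-token",
    "POST",
    "/2.1/jobs/create",
    {"name": "Nightly ETL"},
)
print(req.method, req.full_url)
```

The generated client wraps exactly this kind of request, adding serialization into the model classes listed below.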
Documentation For Models
- AccessControlList
- AccessControlRequest
- AccessControlRequestForGroup
- AccessControlRequestForServicePrincipal
- AccessControlRequestForUser
- Adlsgen2Info
- AutoScale
- AwsAttributes
- AzureAttributes
- CanManage
- CanManageRun
- CanView
- ClusterAttributes
- ClusterCloudProviderNodeInfo
- ClusterCloudProviderNodeStatus
- ClusterEvent
- ClusterEventType
- ClusterInfo
- ClusterInstance
- ClusterLibraryStatuses
- ClusterLogConf
- ClusterSize
- ClusterSource
- ClusterSpec
- ClusterState
- CronSchedule
- DbfsStorageInfo
- DbtOutput
- DbtTask
- DockerBasicAuth
- DockerImage
- Error
- EventDetails
- FileStorageInfo
- GcpAttributes
- GitBranchSource
- GitCommitSource
- GitProvider
- GitSnapshot
- GitSource
- GitTagSource
- InitScriptInfo
- IsOwner
- Job
- JobCluster
- JobEmailNotifications
- JobSettings
- JobTask
- JobTaskSettings
- JobsCreate200Response
- JobsCreateRequest
- JobsDeleteRequest
- JobsGet200Response
- JobsList200Response
- JobsResetRequest
- JobsRunNow200Response
- JobsRunNowRequest
- JobsRunsCancelAllRequest
- JobsRunsCancelRequest
- JobsRunsDeleteRequest
- JobsRunsExport200Response
- JobsRunsGet200Response
- JobsRunsGetOutput200Response
- JobsRunsList200Response
- JobsRunsRepair200Response
- JobsRunsRepairRequest
- JobsRunsSubmit200Response
- JobsRunsSubmitRequest
- JobsUpdateRequest
- Library
- LibraryFullStatus
- LibraryInstallStatus
- ListOrder
- LogSyncStatus
- MavenLibrary
- NewCluster
- NewTaskCluster
- NodeType
- NotebookOutput
- NotebookTask
- PermissionLevel
- PermissionLevelForGroup
- PipelineTask
- PoolClusterTerminationCode
- PythonPyPiLibrary
- PythonWheelTask
- RCranLibrary
- RepairHistory
- RepairHistoryItem
- RepairRunInput
- ResizeCause
- Run
- RunLifeCycleState
- RunNowInput
- RunParameters
- RunParametersPipelineParams
- RunResultState
- RunState
- RunSubmitSettings
- RunSubmitTaskSettings
- RunTask
- RunType
- S3StorageInfo
- SparkJarTask
- SparkNode
- SparkNodeAwsAttributes
- SparkPythonTask
- SparkSubmitTask
- SparkVersion
- SqlAlertOutput
- SqlDashboardOutput
- SqlDashboardWidgetOutput
- SqlOutput
- SqlOutputError
- SqlQueryOutput
- SqlStatementOutput
- SqlTask
- SqlTaskAlert
- SqlTaskDashboard
- SqlTaskQuery
- TaskDependenciesInner
- TaskSparkSubmitTask
- TerminationCode
- TerminationParameter
- TerminationReason
- TerminationType
- TriggerType
- ViewItem
- ViewType
- ViewsToExport
- WebhookNotifications
- WebhookNotificationsOnStartInner
Documentation For Authorization
bearerAuth
- Type: Bearer authentication (api_token)
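Under bearerAuth, each request carries the api_token in an `Authorization: Bearer …` header. A minimal sketch of building that header, with the token taken from an environment variable (the variable name and value here are demo placeholders):

```python
import os

# Demo placeholder; in real use, export BEARER_TOKEN in your shell
# instead of assigning it in code.
os.environ["BEARER_TOKEN"] = "dapi-example-token"

def auth_headers():
    # Every request under bearerAuth carries the token like this.
    return {"Authorization": f"Bearer {os.environ['BEARER_TOKEN']}"}

print(auth_headers())
```

This is the same header the client's `Configuration(access_token=…)` setting produces for you.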
Author