
Update libraries on Databricks

Project description

Stork

Command line helpers for Databricks!


Why we built this

When our team started setting up CI/CD for the various packages we maintain, we encountered some difficulties integrating Jenkins with Databricks.

We write a lot of Python + PySpark packages in our data science work, and we often deploy these as batch jobs run on a schedule using Databricks. However, each time we merged a change into one of these libraries, we would have to manually create an egg, upload it using the Databricks GUI, track down all the jobs that used the library, and update each one to point to the new version. As our team and our set of libraries and jobs grew, this became unsustainable (not to mention a big break from the CI/CD philosophy...).

As we set out to automate this with the Databricks library API, we realized the task required using two versions of the API and making many dependent API calls. Instead of trying to recreate that logic in each Jenkinsfile, we wrote stork. Now you can enjoy the magic as well!
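
To give a sense of what that automation involves, here is a rough sketch of the kind of chained REST calls needed to repoint existing jobs at a newly uploaded egg. The endpoint paths, payload fields, and the 1.2 library call are assumptions based on the public Databricks REST APIs, not stork's actual implementation:

# Illustrative sketch only; not stork's implementation.
# Endpoints and payloads are assumptions based on the public Databricks REST APIs.
import requests

HOST = "https://my-organization.cloud.databricks.com"
HEADERS = {"Authorization": "Bearer dapi..."}  # a Databricks access token

OLD_EGG = "dbfs:/FileStore/jars/.../new_library-1.0.0-py3.6.egg"
NEW_EGG = "dbfs:/FileStore/jars/.../new_library-1.0.1-py3.6.egg"

# 1. List every job (jobs API, version 2.0).
jobs = requests.get(f"{HOST}/api/2.0/jobs/list", headers=HEADERS).json().get("jobs", [])

# 2. Repoint any job that attaches the old egg to the new one.
for job in jobs:
    settings = job["settings"]
    libraries = settings.get("libraries", [])
    if any(lib.get("egg") == OLD_EGG for lib in libraries):
        for lib in libraries:
            if lib.get("egg") == OLD_EGG:
                lib["egg"] = NEW_EGG
        requests.post(f"{HOST}/api/2.0/jobs/reset", headers=HEADERS,
                      json={"job_id": job["job_id"], "new_settings": settings})

# 3. Remove the superseded library via the older 1.2 library API
#    (endpoint and payload assumed; the library id would come from a prior list call).
requests.post(f"{HOST}/api/1.2/libraries/delete", headers=HEADERS,
              json={"libraryId": "1234-example-id"})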

Stork now works with both .egg and .jar files, so it supports Python + PySpark as well as Scala + Spark libraries. To take advantage of stork's ability to update jobs, make sure you follow one of these naming conventions:

new_library-1.0.0-py3.6.egg
new_library-1.0.0-SNAPSHOT-py3.6.egg
new_library-1.0.0-SNAPSHOT-my-branch-py3.6.egg
new_library-1.0.0.egg
new_library-1.0.0-SNAPSHOT.egg
new_library-1.0.0-SNAPSHOT-my-branch.egg
new_library-1.0.0.jar
new_library-1.0.0-SNAPSHOT.jar
new_library-1.0.0-SNAPSHOT-my-branch.jar

Where the first number in the version (in this case 1) is a major version signaling breaking changes.
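
These conventions boil down to a name, a semantic version, an optional SNAPSHOT/branch suffix, an optional Python tag, and an extension. The pattern below is only an illustration of that structure, not stork's actual parser:

# Illustration of the naming conventions above; not stork's actual parser.
import re

LIBRARY_PATTERN = re.compile(
    r"^(?P<name>.+?)"                     # library name, e.g. new_library
    r"-(?P<version>\d+\.\d+\.\d+)"        # semantic version, e.g. 1.0.0
    r"(?:-SNAPSHOT(?:-.+?)?)?"            # optional -SNAPSHOT or -SNAPSHOT-my-branch
    r"(?:-py\d\.\d)?"                     # optional Python tag, e.g. -py3.6
    r"\.(?P<ext>egg|jar)$"
)

match = LIBRARY_PATTERN.match("new_library-1.0.0-SNAPSHOT-my-branch-py3.6.egg")
print(match.group("name"))                   # new_library
print(match.group("version"))                # 1.0.0
print(match.group("version").split(".")[0])  # 1, the major version signaling breaking changes
print(match.group("ext"))                    # egg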

What it does

Stork is a set of command line helpers for Databricks. Some commands are for managing libraries in Databricks in an automated fashion. This allows you to move away from the point-and-click interface for your development work and for deploying production-level libraries for use in scheduled Databricks jobs. Another command allows you to create an interactive cluster that replicates the settings used on a job cluster.
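
As a concept sketch of that last command, replicating a job cluster amounts to reading the job's cluster spec and reusing it for an interactive cluster. The snippet below uses the public Databricks 2.0 jobs and clusters endpoints to show the idea; it is not stork's implementation, and the field handling is simplified:

# Concept sketch: copy a job's cluster spec into an interactive cluster.
# Not stork's implementation; field handling is simplified.
import requests

HOST = "https://my-organization.cloud.databricks.com"
HEADERS = {"Authorization": "Bearer dapi..."}

# Fetch the job definition and pull out its new_cluster spec.
job = requests.get(f"{HOST}/api/2.0/jobs/get", headers=HEADERS,
                   params={"job_id": 42}).json()
cluster_spec = job["settings"]["new_cluster"]

# Interactive clusters need a name; otherwise reuse the job's settings as-is.
cluster_spec["cluster_name"] = "replica-of-job-42"
resp = requests.post(f"{HOST}/api/2.0/clusters/create", headers=HEADERS, json=cluster_spec)
print(resp.json()["cluster_id"])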

For a more detailed API and tutorials, check out the docs.

Installation

Note: stork requires Python 3 and currently only works on Databricks accounts that run on AWS (not Azure).

Stork is hosted on PyPI, so to get the latest version simply install it via pip:

pip install stork

You can also install from source by cloning the git repository https://github.com/ShopRunner/stork.git and installing via easy_install:

git clone https://github.com/ShopRunner/stork.git
cd stork
easy_install .

Setup

Configuration

Stork uses a .storkcfg file to store information about your Databricks account and setup. To create this file, run:

stork configure

You will be asked for your Databricks host name (the URL you use to access the account, something like https://my-organization.cloud.databricks.com), an access token, and your production folder. This should be a folder your team creates to hold production-ready libraries. By isolating production-ready libraries in their own folder, you ensure that stork will never update a job to use a library that is still in development or testing.
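
The values you enter are written to the .storkcfg file, which stork reads on each run. As a rough illustration of inspecting it (the file location, section, and key names here are assumptions, so check the generated file rather than relying on these names):

# Rough illustration of inspecting .storkcfg; location, section, and key names are assumptions.
import configparser
import os

config = configparser.ConfigParser()
config.read(os.path.expanduser("~/.storkcfg"))  # assumed to live in your home directory

settings = config["DEFAULT"]                    # assumed section name
print(settings.get("host"))                     # e.g. https://my-organization.cloud.databricks.com
print(settings.get("prod_folder"))              # e.g. /Users/shared/production-libraries

Because the access token is stored in this file, keep it out of version control.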

Databricks API token

API tokens can be generated in Databricks under Account Settings -> Access Tokens. Any token can be used to upload an egg to any folder in Databricks, but updating jobs requires a token with admin permissions, which an admin on the account can create in the same way.

Usage notes

While libraries can be uploaded to folders other than your specified production folder, no libraries outside of that folder will ever be deleted, and no jobs using libraries outside of that folder will ever be updated.

If you try to upload a library that already exists in Databricks with the same version, a warning is printed instructing you to bump the version if you have made a change. Without a version change, the new library will not be uploaded.

Contributing

See a way for stork to improve? We welcome contributions in the form of issues or pull requests!

Please check out the contributing page for more information.

License

Copyright (c) 2018, ShopRunner

Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met:

  1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer.

  2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution.

  3. Neither the name of the copyright holder nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

Download files

Download the file for your platform.

Source Distribution

stork-3.2.1.tar.gz (15.0 kB)

Uploaded Source

Built Distribution

stork-3.2.1-py3-none-any.whl (16.7 kB)

Uploaded Python 3

File details

Details for the file stork-3.2.1.tar.gz.

File metadata

  • Download URL: stork-3.2.1.tar.gz
  • Upload date:
  • Size: 15.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.1 importlib_metadata/3.10.0 pkginfo/1.7.0 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.59.0 CPython/3.9.2

File hashes

Hashes for stork-3.2.1.tar.gz

  • SHA256: b690bd367dc1788f6a3a9e2146bb1aba3711f9ce90171c1cdcf8261504bd1f99
  • MD5: aadd639ab5bf5335d5aae17759a5d082
  • BLAKE2b-256: 53e3a36571a3062f549b03b3ba31262b3909d913a0f3d26f1893407cc19a692d


File details

Details for the file stork-3.2.1-py3-none-any.whl.

File metadata

  • Download URL: stork-3.2.1-py3-none-any.whl
  • Upload date:
  • Size: 16.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.1 importlib_metadata/3.10.0 pkginfo/1.7.0 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.59.0 CPython/3.9.2

File hashes

Hashes for stork-3.2.1-py3-none-any.whl

  • SHA256: 55cb7ea852077d1f67d11066a5849ee70016909f0a1154a082dc21a8947f9790
  • MD5: fe3f92e3ac0da4d1d7a00d5b2bd47fee
  • BLAKE2b-256: fea6f9a5b5780b78d483ade3b53d62b55946e89761280fb8e1c9293e2d2aa933

