# pyspark-me

PySpark and Databricks tools for everyday life.

## Synopsis
### Create a Databricks connection

```python
import pysparkme.databricks

# Get a Databricks workspace connection
dbc = pysparkme.databricks.connect(
    bearer_token='dapixyzabcd09rasdf',
    url='https://westeurope.azuredatabricks.net')
```
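Hard-coding the bearer token is fine for exploration but risky in shared code. A minimal sketch of pulling the settings from the environment instead; the `connection_kwargs` helper and the `DATABRICKS_URL` variable are hypothetical names introduced here, not part of the package:

```python
import os

# Hypothetical helper: assemble the connect() arguments from environment
# variables so the bearer token never lands in source control.
def connection_kwargs(env=os.environ):
    return {
        'bearer_token': env['DATABRICKS_BEARER_TOKEN'],
        'url': env.get('DATABRICKS_URL',
                       'https://westeurope.azuredatabricks.net'),
    }

# dbc = pysparkme.databricks.connect(**connection_kwargs())
```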
### Databricks workspace

```python
# List the workspace root directory
dbc.workspace.ls('/')

# Check if a workspace item exists
dbc.workspace.exists('/explore')

# Check if a workspace item is a directory
dbc.workspace.is_directory('/')

# Export a notebook in the default (SOURCE) format
dbc.workspace.export('/my_notebook')

# Export a notebook in HTML format
dbc.workspace.export('/my_notebook', 'HTML')
```
### Databricks CLI

Get CLI help:

```shell
python -m pysparkme.databricks.cli --help
```

Export the whole Databricks workspace into the directory `explore/export`. The bearer token is taken from the `DATABRICKS_BEARER_TOKEN` environment variable:

```shell
python -m pysparkme.databricks.cli workspace export -o explore/export ''
```
### Build and publish

```shell
python setup.py sdist bdist_wheel
python -m twine upload dist/*
```