DVC remote plugin for Databricks Unity Catalog Volumes

Project description

dvc-databricks

A DVC remote storage plugin that enables data versioning on Databricks Unity Catalog Volumes.

Store large data files on Databricks Volumes (e.g. backed by S3 or ADLS), keep only lightweight .dvc pointer files in your git repository, and use standard DVC commands — no custom code required.

dvc push   # uploads data to Databricks Volume via Databricks SDK
dvc pull   # downloads data from Databricks Volume

Why this plugin?

Databricks Unity Catalog Volumes cannot be accessed like a plain S3 bucket; all I/O must go through the Databricks Files API. This plugin bridges DVC and the Databricks SDK so you can version and share datasets stored on Volumes without ever leaving the standard DVC workflow.
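
For a sense of what the plugin does under the hood, a direct upload through the Files API with the Databricks SDK looks roughly like this (a sketch; the profile name and paths are placeholders, and the plugin issues equivalent calls for you):

from databricks.sdk import WorkspaceClient

w = WorkspaceClient(profile="my-profile")  # reads ~/.databrickscfg

# Every byte written to a Volume goes through the Files API.
with open("dataset.csv", "rb") as f:
    w.files.upload(
        "/Volumes/ml_catalog/datasets/storage/dataset.csv",
        f,
        overwrite=True,
    )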


Requirements

  • Python >= 3.11
  • DVC >= 3.0
  • Databricks CLI configured with a profile in ~/.databrickscfg
  • Access to a Databricks Unity Catalog Volume

Installation

pip install dvc-databricks

Once installed, the dbvol:// remote protocol is automatically available to DVC in every process — no imports or additional configuration needed.
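
A quick sanity check that the plugin was picked up (a sketch that assumes the plugin registers its filesystem under the dbvol protocol in fsspec's registry, which its I/O layer builds on):

import fsspec

# Raises ValueError if nothing is registered for the dbvol scheme.
fs_cls = fsspec.get_filesystem_class("dbvol")
print(fs_cls.__name__)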


Setup

1. Initialize DVC in your repository (if not already done)

dvc init
git add .dvc
git commit -m "initialize DVC"

2. Add the Databricks Volume as a DVC remote

dvc remote add -d myremote \
    dbvol:///Volumes/<catalog>/<schema>/<volume>/<path>

Example:

dvc remote add -d myremote \
    dbvol:///Volumes/ml_catalog/datasets/storage/dvc_cache
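
The command records the remote in .dvc/config, which should afterwards contain something like:

['remote "myremote"']
    url = dbvol:///Volumes/ml_catalog/datasets/storage/dvc_cache
[core]
    remote = myremote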

3. Set your Databricks profile

export DATABRICKS_CONFIG_PROFILE=<your-profile-name>

Note: DVC remotes do not support arbitrary config keys, so the Databricks profile must be provided via this environment variable — it cannot be stored in .dvc/config. Add the export to your ~/.zshrc or ~/.bashrc to make it permanent.
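
For reference, a profile in ~/.databrickscfg has this shape (host and token values are placeholders):

[my-profile]
host  = https://my-workspace.cloud.databricks.com
token = dapi...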


Usage

Track a data file

dvc add data/dataset.csv

This creates data/dataset.csv.dvc, a small pointer file that goes into git. DVC adds the actual data file to .gitignore automatically, so only the pointer is ever committed.
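
The pointer file itself is a few lines of YAML, roughly like this (the hash and size values are illustrative):

outs:
- md5: ab56b4d92b40713acc5af89985d4b786
  size: 10485760
  hash: md5
  path: dataset.csv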

Push data to the Volume

dvc push

Uploads the file to your Databricks Volume via the Databricks SDK.

Commit the pointer to git

git add data/dataset.csv.dvc .gitignore
git commit -m "track dataset v1 with DVC"
git push

Pull data in another environment

git clone <your-repo>
pip install dvc-databricks
export DATABRICKS_CONFIG_PROFILE=<your-profile-name>
dvc pull

How it works

Your git repo                   Databricks Volume (S3 / ADLS)
──────────────────              ───────────────────────────────────
data/dataset.csv.dvc  ──────►  /Volumes/catalog/schema/vol/
.dvc/config                     └── files/md5/
                                    ├── ab/cdef1234...   ← actual data
                                    └── 9f/123abc...     ← actual data

dvc add hashes the file and stores it in the local DVC cache (.dvc/cache). A .dvc pointer file containing the MD5 hash is created next to your data file.

dvc push uploads from the local cache to the Volume using the Databricks Files API (WorkspaceClient.files.upload). Files are stored content-addressed: <volume_path>/files/md5/<hash[:2]>/<hash[2:]>.
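
The scheme is simple enough to illustrate in a few lines (build_remote_path is a hypothetical helper for this sketch, not part of the plugin's API):

# Sketch of the content-addressed layout described above.
def build_remote_path(volume_path: str, md5: str) -> str:
    return f"{volume_path}/files/md5/{md5[:2]}/{md5[2:]}"

print(build_remote_path(
    "/Volumes/ml_catalog/datasets/storage/dvc_cache",
    "ab56b4d92b40713acc5af89985d4b786",
))
# -> /Volumes/ml_catalog/datasets/storage/dvc_cache/files/md5/ab/56b4d92b40713acc5af89985d4b786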

dvc pull downloads from the Volume into the local cache, then restores the file to its original path.

Only .dvc pointer files are ever committed to git — the data stays on the Volume.


Architecture

The plugin follows the same pattern as official DVC plugins:

Class                        Base                       Role
DatabricksVolumesFileSystem  dvc_objects.FileSystem     DVC-facing layer: config, checksum strategy, dependency check
_DatabricksVolumesFS         fsspec.AbstractFileSystem  I/O layer: all Databricks SDK calls
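
In outline, the I/O layer follows a pattern like this (a simplified sketch of the two-layer design, not the plugin's actual source):

from fsspec import AbstractFileSystem
from databricks.sdk import WorkspaceClient

class _DatabricksVolumesFS(AbstractFileSystem):
    """Sketch: route fsspec file operations through the Databricks SDK."""

    protocol = "dbvol"

    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        self.client = WorkspaceClient()  # honours DATABRICKS_CONFIG_PROFILE

    def put_file(self, lpath, rpath, **kwargs):
        # All writes go through the Files API.
        with open(lpath, "rb") as f:
            self.client.files.upload(rpath, f, overwrite=True)

    def get_file(self, rpath, lpath, **kwargs):
        # Downloads stream back through the same API.
        resp = self.client.files.download(rpath)
        with open(lpath, "wb") as f:
            f.write(resp.contents.read())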

A .pth file installed into site-packages ensures the plugin is loaded at Python startup in every process (including DVC CLI subprocesses), without requiring any manual imports.
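
This relies on standard CPython behaviour: any line in a site-packages .pth file that begins with import is executed by site.py at interpreter startup. The installed file therefore needs only a single line along these lines (module name illustrative):

import dvc_databricks  # registers the dbvol:// filesystem at startup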


Environment variables

Variable                   Description
DATABRICKS_CONFIG_PROFILE  Databricks CLI profile name from ~/.databrickscfg. Falls back to the default profile if not set.

License

MIT © Óscar Reyes

Download files

Download the file for your platform.

Source Distribution

dvc_databricks-1.3.0.tar.gz (20.2 kB)

Built Distribution

dvc_databricks-1.3.0-py3-none-any.whl (10.6 kB)

File details

Details for the file dvc_databricks-1.3.0.tar.gz.

File metadata

  • Size: 20.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.3

File hashes

Hashes for dvc_databricks-1.3.0.tar.gz

Algorithm    Hash digest
SHA256       8df73369bdf2cfb7d46e44456342c42e78142db24a1438239e0c7e227ec3e084
MD5          8627b7422bb31ae15aa48013112bb800
BLAKE2b-256  10c6f942053b472d7f7d138fa066bc0353fdfebf28ec17ab400350d22b92a26d

File details

Details for the file dvc_databricks-1.3.0-py3-none-any.whl.

File metadata

  • Size: 10.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.3

File hashes

Hashes for dvc_databricks-1.3.0-py3-none-any.whl

Algorithm    Hash digest
SHA256       3c3823b39e598d6f5ddf466f12e0bd20ab2b3c476d2fa97bdfbc3f5947e6ed5f
MD5          ebe2edca27b0b1d91f899f1876bff469
BLAKE2b-256  7bf013c33f628bd3340643633b5db005aaa890db8f705ae8f7d50906ead478fa
