
Unity Catalog PySpark fixtures

Project description

pytest-mock-unity-catalog

Pytest plugin that provides PySpark fixtures for testing code that reads and writes Unity Catalog tables — without a live Databricks cluster. Table operations are redirected to a local Delta directory so tests run fully offline.

Installation

pip install pytest-mock-unity-catalog

Pytest discovers the plugin automatically via its entry point. No imports or conftest.py changes are needed in the consuming project.

Fixtures

spark

A session-scoped SparkSession configured for local testing with Delta Lake enabled.

def test_something(spark):
    df = spark.createDataFrame([(1, "a")], ["id", "value"])
    assert df.count() == 1

By default the fixture uses delta-spark_4.1_2.13:4.1.0 (PySpark 4.1, Scala 2.13). Override this via the SPARK_VERSION environment variable to target other versions:

# PySpark 3.5 / Scala 2.12
SPARK_VERSION=2.12:3.2.1 pytest

# PySpark 4.0 / Scala 2.13
SPARK_VERSION=4.0_2.13:4.0.0 pytest
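One plausible way the override is wired up — a sketch only, not the plugin's actual code; delta_package is a hypothetical helper that turns the SPARK_VERSION value into the Maven coordinate passed to spark.jars.packages:

```python
import os

# Plugin default: PySpark 4.1 / Scala 2.13 (see above).
DEFAULT_SUFFIX = "4.1_2.13:4.1.0"

def delta_package(env=None):
    """Build the io.delta Maven coordinate from SPARK_VERSION, if set."""
    env = os.environ if env is None else env
    suffix = env.get("SPARK_VERSION", DEFAULT_SUFFIX)
    return f"io.delta:delta-spark_{suffix}"

print(delta_package({"SPARK_VERSION": "4.0_2.13:4.0.0"}))
# io.delta:delta-spark_4.0_2.13:4.0.0
```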

mock_save_as_table

Patches DataFrameWriter.saveAsTable so that df.write.saveAsTable(...) writes a Delta table to a local temp directory instead of Unity Catalog. The Unity Catalog-style three-part name (catalog.schema.table) is mapped to a directory path.

def test_write(spark, mock_save_as_table):
    df = spark.createDataFrame([(1, "a")], ["id", "value"])
    df.write.saveAsTable("my_catalog.my_schema.my_table")  # writes locally

mock_read_table

Patches both spark.read.table and spark.table to read from the same local Delta path that mock_save_as_table writes to. Use both fixtures together to round-trip through a table.

def test_read(spark, mock_read_table):
    # Assumes my_catalog.my_schema.my_table was written earlier in the session,
    # e.g. by a test using mock_save_as_table.
    df = spark.read.table("my_catalog.my_schema.my_table")
    assert df.count() == 2

    df2 = spark.table("my_catalog.my_schema.my_table")
    assert df2.count() == 2

local_table_base_path

The Path to the session-scoped temp directory used as the root for all table storage. Useful for asserting on the filesystem directly or for sharing the path in custom fixtures.

def test_path(local_table_base_path):
    # Assumes my_catalog.my_schema.my_table was written earlier in the session,
    # e.g. by a test using mock_save_as_table.
    assert (local_table_base_path / "my_catalog" / "my_schema" / "my_table").exists()

mock_volume

Redirects all /Volumes/... filesystem access to a local temp directory for the duration of the test. The fixture yields the local base Path so tests can seed files before exercising the code under test.

Intercepted access patterns:

Pattern                                             Mechanism
open("/Volumes/...")                                patches builtins.open
open(Path("/Volumes/..."))                          patches builtins.open via PathLike
Path("/Volumes/...").read_text()                    patches Path.__fspath__
Path("/Volumes/...").write_text(...)                patches Path.__fspath__
Path("/Volumes/...").exists() / .stat() / .mkdir()  patches Path.__fspath__
pd.read_csv("/Volumes/...")                         pandas delegates to open()
pd.DataFrame.to_csv("/Volumes/...")                 pandas delegates to open()

Limitation: binary/columnar readers that bypass Python's open() — e.g. pandas.read_parquet backed by pyarrow — are not intercepted.

Parent directories under the temp root are created automatically, so no explicit mkdir is needed before writing.
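The redirection rule itself can be sketched as a pure path rewrite (a hypothetical helper illustrating the prefix mapping described above, not the plugin's code):

```python
from pathlib import Path

VOLUMES_PREFIX = "/Volumes/"

def redirect(path: str, base: Path) -> Path:
    """Map a /Volumes/... path onto the local temp base; leave other paths alone."""
    if path.startswith(VOLUMES_PREFIX):
        return base / path[len(VOLUMES_PREFIX):]
    return Path(path)

print(redirect("/Volumes/cat/schema/vol/data.csv", Path("/tmp/base")))
# /tmp/base/cat/schema/vol/data.csv
```

In the plugin this rewrite is applied inside the patched open and Path.__fspath__, which is why any API that eventually calls one of those is redirected.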

def test_read_volume(mock_volume):
    # Seed a file at the equivalent of /Volumes/cat/schema/vol/data.csv
    seed = mock_volume / "cat" / "schema" / "vol" / "data.csv"
    seed.parent.mkdir(parents=True, exist_ok=True)
    seed.write_text("id,value\n1,a\n2,b\n")

    # Code under test uses the real /Volumes path — it is transparently redirected
    import pandas as pd
    df = pd.read_csv("/Volumes/cat/schema/vol/data.csv")
    assert len(df) == 2

Works with pathlib.Path too:

def test_write_volume(mock_volume):
    from pathlib import Path

    Path("/Volumes/cat/schema/vol/out.txt").write_text("hello")

    result = Path("/Volumes/cat/schema/vol/out.txt").read_text()
    assert result == "hello"

volume_base_path

The session-scoped Path used as the root for all volume storage. Injected automatically into mock_volume; only needed directly when building custom fixtures on top of the volume base.

Example: full round-trip

def test_round_trip(spark, mock_save_as_table, mock_read_table):
    df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "value"])
    df.write.saveAsTable("my_catalog.my_schema.my_table")

    result = spark.read.table("my_catalog.my_schema.my_table")
    assert result.count() == 2

Databricks / on-cluster usage

When tests run inside a Databricks notebook or job (i.e. DATABRICKS_RUNTIME_VERSION is set), the plugin detects this automatically:

  • spark returns the active SparkSession instead of creating a local one.
  • mock_read_table is a no-op — spark.read.table hits Unity Catalog as normal.
  • mock_save_as_table is a no-op — df.write.saveAsTable writes to Unity Catalog as normal. The table is dropped with DROP TABLE IF EXISTS in teardown.
  • mock_volume is a no-op — /Volumes/... paths reach the real Unity Catalog volume.

No code changes are needed; the same tests run locally (mocked) and on Databricks (real).
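The detection check described above amounts to a one-line environment test (the helper name is illustrative, not part of the plugin's API):

```python
import os

def on_databricks() -> bool:
    # Databricks Runtime sets this variable in notebooks and jobs.
    return "DATABRICKS_RUNTIME_VERSION" in os.environ
```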

How it works

mock_save_as_table and mock_read_table patch PySpark's DataFrameWriter.saveAsTable and DataFrameReader.table for the duration of the test. The Unity Catalog table name is converted to a filesystem path by replacing . separators with /:

my_catalog.my_schema.my_table  →  <tmp>/my_catalog/my_schema/my_table

The temp directory is managed by pytest (tmp_path_factory) and lives under the OS temp space (e.g. /var/folders/.../pytest-of-<user>/pytest-<N>/). Pytest retains the last three runs before pruning.
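The name-to-path conversion can be sketched in a few lines (an illustrative helper, not the plugin's actual code):

```python
from pathlib import Path

def table_path(base: Path, name: str) -> Path:
    """Convert a three-part Unity Catalog name into a local directory path."""
    catalog, schema, table = name.split(".")  # raises if not exactly three parts
    return base / catalog / schema / table

print(table_path(Path("/tmp/tables"), "my_catalog.my_schema.my_table"))
# /tmp/tables/my_catalog/my_schema/my_table
```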

Download files


Source Distribution

pytest_mock_unity_catalog-0.0.4.tar.gz (25.6 kB)


Built Distribution


pytest_mock_unity_catalog-0.0.4-py3-none-any.whl (5.9 kB)


File details

Details for the file pytest_mock_unity_catalog-0.0.4.tar.gz.

File metadata

File hashes

Hashes for pytest_mock_unity_catalog-0.0.4.tar.gz
Algorithm Hash digest
SHA256 10716fa28238d612db130906dbcffeaf80796d12c34fd5b84193bb03345b5c2d
MD5 11ceb9de15d79dc1038a4f1d5a9ccbb0
BLAKE2b-256 03ceec6857939baf38436f08efbe23e7da0fe0501dfc2c2f1da1cca0975abea9


Provenance

The following attestation bundles were made for pytest_mock_unity_catalog-0.0.4.tar.gz:

Publisher: run_build.yml on marianreuss/pytest-mock-unity-catalog

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file pytest_mock_unity_catalog-0.0.4-py3-none-any.whl.

File metadata

File hashes

Hashes for pytest_mock_unity_catalog-0.0.4-py3-none-any.whl
Algorithm Hash digest
SHA256 6fafffcdc76dc32cf987686abb109cbbb5275ad56efe5c35c64d44774f30b9b7
MD5 e5654f2b9eaf6151d30ef1edfea2731f
BLAKE2b-256 cb8c8f501003e3df26fe2e6c5d3231a258a66c9b97177edf574c44356b093f35


Provenance

The following attestation bundles were made for pytest_mock_unity_catalog-0.0.4-py3-none-any.whl:

Publisher: run_build.yml on marianreuss/pytest-mock-unity-catalog

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
