
pytest plugin to run tests with support for pyspark (Apache Spark).

Project description



This plugin allows you to specify the SPARK_HOME directory in pytest.ini, making “pyspark” importable in the tests executed by pytest.

You can also define “spark_options” in pytest.ini to customize pyspark, including the “spark.jars.packages” option, which lets you load external libraries (e.g. “com.databricks:spark-xml”).

pytest-spark provides the session-scoped fixtures spark_context and spark_session, which can be used in your tests.

Install

$ pip install pytest-spark

Usage

Set Spark location

To run tests, pytest-spark needs the spark_home location. Define it using one of the following methods:

  1. Specify the command line option “--spark_home”:

    $ pytest --spark_home=/opt/spark
  2. Add “spark_home” value to pytest.ini in your project directory:

    [pytest]
    spark_home = /opt/spark
  3. Set the “SPARK_HOME” environment variable.

pytest-spark will try to import pyspark from the provided location.
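The lookup above can be sketched in plain Python. The precedence assumed here (command line option over pytest.ini over the environment variable) and the function name are illustrative assumptions, not the plugin's actual API:

```python
import os


def resolve_spark_home(cli_option=None, ini_value=None, environ=None):
    """Resolve spark_home from the three documented sources.

    Hypothetical helper: assumes the command line option wins over
    pytest.ini, which wins over the SPARK_HOME environment variable.
    """
    environ = environ if environ is not None else os.environ
    if cli_option:
        return cli_option
    if ini_value:
        return ini_value
    return environ.get("SPARK_HOME")


# The command line option takes precedence over the environment variable.
print(resolve_spark_home(cli_option="/opt/spark",
                         environ={"SPARK_HOME": "/usr/lib/spark"}))
```

With no command line option or ini value, the sketch falls back to the SPARK_HOME environment variable, mirroring method 3 above.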

Customize spark_options

Just define “spark_options” in your pytest.ini, e.g.:

[pytest]
spark_home = /opt/spark
spark_options =
    spark.app.name: my-pytest-spark-tests
    spark.executor.instances: 1
    spark.jars.packages: com.databricks:spark-xml_2.12:0.5.0
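Each line of the spark_options block is one Spark configuration key and its value, separated by the first colon. A minimal sketch of splitting such a block into key/value pairs (the helper name is hypothetical, and pytest-spark's own parsing may differ):

```python
def parse_spark_options(raw):
    """Split a multi-line "key: value" block into a dict.

    Hypothetical helper. Splits only on the first colon, so values
    that themselves contain colons (like Maven coordinates in
    spark.jars.packages) survive intact.
    """
    options = {}
    for line in raw.strip().splitlines():
        key, _, value = line.strip().partition(":")
        if key:
            options[key.strip()] = value.strip()
    return options


raw = """
    spark.app.name: my-pytest-spark-tests
    spark.executor.instances: 1
    spark.jars.packages: com.databricks:spark-xml_2.12:0.5.0
"""
print(parse_spark_options(raw)["spark.jars.packages"])
```

Splitting on the first colon only is the important detail here: the “com.databricks:spark-xml_2.12:0.5.0” coordinate contains colons of its own and must be kept whole.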

Using the spark_context fixture

Use the spark_context fixture in your tests like any other pytest fixture. The SparkContext instance will be created once and reused for the whole test session.

Example:

def test_my_case(spark_context):
    test_rdd = spark_context.parallelize([1, 2, 3, 4])
    # ...

Using the spark_session fixture (Spark 2.0 and above)

Use the spark_session fixture in your tests like any other pytest fixture. A SparkSession instance with Hive support enabled will be created once and reused for the whole test session.

Example:

def test_spark_session_dataframe(spark_session):
    test_df = spark_session.createDataFrame([[1, 3], [2, 4]], "a: int, b: int")
    # ...
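The second argument to createDataFrame above is a schema string shorthand listing each column as "name: type". A minimal sketch of how such a string breaks down into fields (the parser here is purely illustrative, not pyspark's own):

```python
def parse_schema_shorthand(schema):
    """Split an "a: int, b: int" style schema string into
    (name, type) pairs. Illustrative parser, not pyspark's."""
    fields = []
    for part in schema.split(","):
        name, _, typ = part.partition(":")
        fields.append((name.strip(), typ.strip()))
    return fields


print(parse_schema_shorthand("a: int, b: int"))
# [('a', 'int'), ('b', 'int')]
```

So the example DataFrame above gets two integer columns, a and b, with rows (1, 3) and (2, 4).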

Development

Tests

Run tests locally:

$ docker-compose up --build
