pytest plugin to run tests with pyspark support.
Project description
pytest plugin to run tests with support for pyspark (Apache Spark).
This plugin lets you specify the SPARK_HOME directory in pytest.ini, making "pyspark" importable in the tests that pytest executes.
It also defines a session-scoped spark_context fixture that can be used in your tests.
Install
$ pip install pytest-spark
Usage
Set Spark location
To point the tests at the required Spark installation, add a spark_home value to pytest.ini in your project directory:
```ini
[pytest]
spark_home = /opt/spark
```
pytest-spark will try to import pyspark from the specified location.
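Under the hood, making pyspark importable amounts to putting Spark's bundled Python sources on sys.path. A minimal sketch of that idea, assuming the standard Spark layout (python/ and python/lib/py4j-*-src.zip under SPARK_HOME); the function name add_pyspark_path is illustrative, not the plugin's actual API:

```python
import glob
import os
import sys


def add_pyspark_path(spark_home):
    """Make `pyspark` importable from a Spark installation directory.

    Illustrative sketch only -- not the plugin's real implementation.
    """
    os.environ["SPARK_HOME"] = spark_home
    # Spark ships its Python sources under SPARK_HOME/python ...
    sys.path.insert(0, os.path.join(spark_home, "python"))
    # ... and bundles the py4j bridge as a zip under python/lib.
    for zip_path in glob.glob(
            os.path.join(spark_home, "python", "lib", "py4j-*-src.zip")):
        sys.path.insert(0, zip_path)
```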
Using fixture
Use the spark_context fixture in your tests like any regular pytest fixture. The SparkContext instance is created once and reused for the whole test session.
Example:
```python
def test_my_case(spark_context):
    test_rdd = spark_context.parallelize([1, 2, 3, 4])
    # ...
```
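The fixture itself ships with the plugin, so you never have to define it. For illustration only, an equivalent session-scoped fixture could be sketched as below; the master and appName values are assumptions, not what the plugin actually passes:

```python
import pytest


@pytest.fixture(scope="session")
def spark_context():
    # Deferred import: pyspark only needs to be importable once tests run.
    from pyspark import SparkContext
    # Hypothetical settings -- a local two-core master for fast test runs.
    sc = SparkContext(master="local[2]", appName="pytest-spark-example")
    yield sc   # the same instance is handed to every test in the session
    sc.stop()  # shut Spark down once, after the last test
```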
Hashes for pytest_spark-0.2.0-py2.py3-none-any.whl

| Algorithm | Hash digest |
|---|---|
| SHA256 | 374510b0fdf7dc0b52b31af7b05858bb76b58ca5ddb56247ca2d2bcc5b2e29c5 |
| MD5 | c020d34e21dd9cf004e0c374778e029c |
| BLAKE2b-256 | 09d14b244fe7c2109574f8f2ffa2f427bf75710b904b3a7167e55ceef0036bae |