nose2 plugin to run tests with support of pyspark (Apache Spark).
Features:
Makes “pyspark” importable in your code executed by nose2.
Adds a list of py-files dependencies of your pyspark application (usually supplied to spark-submit via the --py-files ... option).
Install
$ pip install nose2-spark
Usage
Load “nose2-spark” plugin into nose2 by creating nose2.cfg in your project directory:
[unittest]
plugins = nose2_spark
Run tests with nose2-spark activated (pyspark and friends are added to pythonpath):
$ nose2 --pyspark
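As a sketch, here is the kind of test module nose2 could collect once pyspark is importable. The file contents and test names are illustrative, not part of nose2-spark itself; the skip guard assumes pyspark is only on the path when the plugin is active:

```python
# test_wordcount.py -- illustrative test that relies on pyspark being importable
import unittest

try:
    from pyspark import SparkContext
except ImportError:  # pyspark is only importable when nose2-spark set up the path
    SparkContext = None


@unittest.skipIf(SparkContext is None, "pyspark not importable; run nose2 --pyspark")
class WordCountTest(unittest.TestCase):
    def test_count(self):
        sc = SparkContext.getOrCreate()
        try:
            rdd = sc.parallelize(["a", "b", "a"])
            # countByValue returns a dict-like mapping of element -> occurrences
            self.assertEqual(rdd.countByValue()["a"], 2)
        finally:
            sc.stop()
```

Without the plugin the test is skipped rather than erroring out, so the same suite also runs in environments that have no Spark installed.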
nose2-spark will try to locate pyspark by checking:
the SPARK_HOME environment variable
some common Spark installation locations.
If none of these methods finds Spark, you can set the location manually. Add a “nose2-spark” section to nose2.cfg:
[nose2-spark]
spark_home = /opt/spark
You can add a list of required py-files to run your code:
[nose2-spark]
pyfiles = package1.zip
    package2.zip
Example
Example of nose2.cfg with spark_home defined, one py-files dependency, and the nose2-spark plugin always on:
[unittest]
plugins = nose2_spark

[nose2-spark]
always-on = True
spark_home = /opt/spark
pyfiles = package1.zip
This allows running the tests with a single command:
$ nose2