A nose2 plugin to run tests with pyspark support. It will:
- Make "pyspark" importable in your code executed by nose2.
- Add a list of py-files dependencies of your pyspark application (the ones usually supplied to spark-submit via the --py-files option).
Install it with pip:

$ pip install nose2-spark
Load the "nose2-spark" plugin into nose2 by creating a nose2.cfg file in your project directory:
[unittest]
plugins = nose2_spark
Run tests with nose2-spark activated (pyspark and friends are added to PYTHONPATH):
$ nose2 --pyspark
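With pyspark importable, tests can use Spark directly. Here is a minimal sketch of such a test (the file, class, and app names are illustrative, not part of nose2-spark):

# test_wordcount.py -- relies on nose2-spark putting pyspark on the path
import unittest

from pyspark import SparkContext


class WordCountTest(unittest.TestCase):
    @classmethod
    def setUpClass(cls):
        # local[2] runs Spark in-process, so no cluster is needed
        cls.sc = SparkContext("local[2]", "nose2-spark-test")

    @classmethod
    def tearDownClass(cls):
        cls.sc.stop()

    def test_count(self):
        rdd = self.sc.parallelize(["a", "b", "a"])
        counts = rdd.map(lambda w: (w, 1)).reduceByKey(lambda a, b: a + b)
        self.assertEqual(dict(counts.collect()), {"a": 2, "b": 1})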
nose2-spark will try to locate pyspark by checking:
- SPARK_HOME environment variable
- Some common Spark locations.
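For example, you can point nose2-spark at a Spark installation through the environment (assuming Spark is unpacked under /opt/spark):

$ SPARK_HOME=/opt/spark nose2 --pyspark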
If none of these methods finds Spark, you can set its location manually. Add a "nose2-spark" section to nose2.cfg:
[nose2-spark]
spark_home = /opt/spark
You can also add a list of py-files required to run your code:
[nose2-spark]
pyfiles = package1.zip
          package2.zip
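If you still need to build such an archive, a plain zip of the package directory is enough (package1 here is an illustrative name for a directory containing your package's Python modules):

$ zip -r package1.zip package1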
An example nose2.cfg with spark_home defined, one py-files dependency, and the nose2-spark plugin always activated:
[unittest]
plugins = nose2_spark

[nose2-spark]
always-on = True
spark_home = /opt/spark
pyfiles = package1.zip
Since the plugin is always on, this allows running the tests with a single command:

$ nose2