

nose2 plugin to run tests with pyspark (Apache Spark) support.


The plugin does two things:

  1. Makes “pyspark” importable in the code executed by nose2.

  2. Adds a list of py-files dependencies of your pyspark application (usually supplied via the spark-submit --py-files ... option) to the Python path.


Install the plugin from PyPI:

$ pip install nose2-spark


Load the “nose2-spark” plugin into nose2 by creating a nose2.cfg in your project directory:

[unittest]
plugins = nose2_spark

Run tests with nose2-spark activated (pyspark and friends are added to the Python path):

$ nose2 --pyspark
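For example, a test module can keep its Spark-agnostic logic in plain Python, so the core can be unit-tested directly while still being usable inside an RDD transformation. The names mylib and tokenize below are hypothetical; this is a minimal sketch, not part of nose2-spark itself:

```python
# mylib.py - hypothetical helper of the kind you would later ship via py-files.
# The tokenizing logic is plain Python, so it works both inside an RDD
# transformation (e.g. rdd.flatMap(tokenize)) and in an ordinary unit test.
def tokenize(line):
    """Split a line into lowercase, purely alphanumeric words."""
    return [w for w in line.lower().split() if w.isalnum()]


# test_mylib.py - collected by nose2 when you run `nose2 --pyspark`.
import unittest

class TokenizeTest(unittest.TestCase):
    def test_tokenize_lowercases_and_filters(self):
        self.assertEqual(tokenize("Hello Spark  hello"),
                         ["hello", "spark", "hello"])
```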

nose2-spark will try to import pyspark by looking into:

  1. SPARK_HOME environment variable

  2. Some common Spark locations.
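The lookup above can be sketched roughly like this (this is not the plugin's actual code, and the candidate paths are illustrative):

```python
import os

# Illustrative candidates only; the plugin's real list may differ.
COMMON_SPARK_LOCATIONS = ["/opt/spark", "/usr/local/spark", "/usr/lib/spark"]

def find_spark_home():
    """Return SPARK_HOME from the environment if set, else the first
    existing common location, else None (forcing manual configuration)."""
    env = os.environ.get("SPARK_HOME")
    if env:
        return env
    for path in COMMON_SPARK_LOCATIONS:
        if os.path.isdir(path):
            return path
    return None
```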

You can set the location manually if none of these methods finds Spark. Add a “[nose2-spark]” section to nose2.cfg:

[nose2-spark]
spark_home = /opt/spark

In the same section, you can add the list of py-files required to run your code:

[nose2-spark]
pyfiles =


Example of a nose2.cfg with spark_home defined, a py-files dependency list, and the nose2-spark plugin auto-activated:

[unittest]
plugins = nose2_spark

[nose2-spark]
always-on = True
spark_home = /opt/spark
pyfiles =

This allows running the tests with a single command:

$ nose2
