
nose2 plugin to run tests with pyspark support.

Project description

nose2 plugin to run tests with support for pyspark (Apache Spark).

Features:

  1. Makes “pyspark” importable in your code executed by nose2.
  2. Adds a list of py-file dependencies of your pyspark application (which is usually supplied to spark-submit via the --py-files ... option).

Install

$ pip install nose2-spark

Usage

Load “nose2-spark” plugin into nose2 by creating nose2.cfg in your project directory:

[unittest]
plugins = nose2_spark

Run tests with nose2-spark activated (pyspark and its dependencies are added to the Python path):

$ nose2 --pyspark

nose2-spark will try to locate pyspark by checking:

  1. The SPARK_HOME environment variable.
  2. Some common Spark install locations.
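This lookup can be sketched roughly as follows (the candidate paths and helper names here are illustrative assumptions, not the plugin's actual implementation):

```python
import os

# Assumed examples of common install locations; the plugin's real list may differ.
COMMON_SPARK_LOCATIONS = ["/opt/spark", "/usr/local/spark", "/usr/lib/spark"]


def find_spark_home():
    """Return a Spark home directory: SPARK_HOME first, then common locations."""
    spark_home = os.environ.get("SPARK_HOME")
    if spark_home and os.path.isdir(spark_home):
        return spark_home
    for candidate in COMMON_SPARK_LOCATIONS:
        if os.path.isdir(candidate):
            return candidate
    return None


def pyspark_python_paths(spark_home):
    """Paths a plugin would prepend to sys.path to make pyspark importable.

    pyspark lives under $SPARK_HOME/python; its py4j dependency ships as a
    zip archive under $SPARK_HOME/python/lib.
    """
    python_dir = os.path.join(spark_home, "python")
    lib_dir = os.path.join(python_dir, "lib")
    paths = [python_dir]
    if os.path.isdir(lib_dir):
        paths += [
            os.path.join(lib_dir, name)
            for name in sorted(os.listdir(lib_dir))
            if name.startswith("py4j") and name.endswith(".zip")
        ]
    return paths
```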

You can set it manually if none of these methods finds Spark. Add a “nose2-spark” section to nose2.cfg:

[nose2-spark]
spark_home = /opt/spark

You can add a list of py-files required to run your code:

[nose2-spark]
pyfiles = package1.zip
          package2.zip
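Each pyfiles entry is a zip archive of your application's modules, the same kind of archive you would pass to spark-submit --py-files. A sketch of building one with the standard library (the function name and the "package1.zip" layout are assumptions for illustration):

```python
import os
import zipfile


def build_pyfiles_zip(package_dir, zip_path):
    """Bundle a package directory's .py files into a --py-files style zip."""
    with zipfile.ZipFile(zip_path, "w") as zf:
        for root, _dirs, files in os.walk(package_dir):
            for name in files:
                if name.endswith(".py"):
                    full = os.path.join(root, name)
                    # Store paths relative to the package's parent directory
                    # so "import <package>" works from inside the archive.
                    arcname = os.path.relpath(full, os.path.dirname(package_dir))
                    zf.write(full, arcname)
```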

Example

Example of nose2.cfg with spark_home defined, one py-files dependency, and the nose2-spark plugin always on:

[unittest]
plugins = nose2_spark

[nose2-spark]
always-on = True
spark_home = /opt/spark
pyfiles = package1.zip

This allows running the tests with a single command:

$ nose2

Download files


nose2_spark-0.3-py2.py3-none-any.whl (4.4 kB): Wheel, py2.py3
nose2-spark-0.3.tar.gz (2.7 kB): Source
