
nose2 plugin to run tests with pyspark (Apache Spark) support.

Features:

  1. Make “pyspark” importable in your code executed by nose2.

  2. Add a list of py-file dependencies for your pyspark application (the files usually supplied to spark-submit via --py-files ...).

Install

$ pip install nose2-spark

Usage

Load the “nose2-spark” plugin into nose2 by creating a nose2.cfg in your project directory:

[unittest]
plugins = nose2_spark

Run the tests with nose2-spark activated (pyspark and its dependencies are added to the Python path):

$ nose2 --pyspark
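
With the plugin active, test code can import pyspark like any other module. A minimal sketch of such a test (the module below is illustrative, not part of nose2-spark):

import unittest

from pyspark import SparkContext


class ParallelizeTest(unittest.TestCase):
    def setUp(self):
        # Local in-process Spark; no cluster required.
        self.sc = SparkContext("local[1]", "nose2-spark-example")

    def tearDown(self):
        self.sc.stop()

    def test_count(self):
        rdd = self.sc.parallelize(["a", "b", "a"])
        self.assertEqual(rdd.count(), 3)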

nose2-spark will try to locate pyspark by checking:

  1. SPARK_HOME environment variable

  2. Some common Spark locations.
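
For example, since the lookup honors the environment variable, you can point a single run at an installation like so:

$ SPARK_HOME=/opt/spark nose2 --pyspark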

If none of these methods finds Spark, you can set its location manually. Add a “nose2-spark” section to nose2.cfg:

[nose2-spark]
spark_home = /opt/spark

You can also list the py-files required to run your code:

[nose2-spark]
pyfiles = package1.zip
          package2.zip
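
The listed files are ordinary zip archives of Python packages. Assuming your dependency lives in a package1/ directory, one way to produce such an archive is:

$ zip -r package1.zip package1/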

Example

Example of a nose2.cfg with spark_home defined, one py-files dependency, and the nose2-spark plugin always on:

[unittest]
plugins = nose2_spark

[nose2-spark]
always-on = True
spark_home = /opt/spark
pyfiles = package1.zip

This allows the tests to be run with a single command:

$ nose2
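
End to end, a test can then use both pyspark and code shipped in the archive. A hypothetical sketch, assuming package1.zip contains a module package1.words exposing a count_words(rdd) helper:

import unittest

from pyspark import SparkContext

# "package1.words" and "count_words" are hypothetical names standing in
# for whatever package1.zip actually ships.
from package1.words import count_words


class PackagedCodeTest(unittest.TestCase):
    def test_count_words(self):
        sc = SparkContext("local[1]", "nose2-spark-pyfiles-example")
        try:
            rdd = sc.parallelize(["spark", "nose2", "spark"])
            self.assertEqual(count_words(rdd), 3)
        finally:
            sc.stop()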
