A CLI to configure pyspark for use with S3 on localstack

localstack-s3-pyspark

This package provides a CLI for configuring pyspark to use localstack for the S3 file system. It is intended for testing, locally or in your CI/CD pipeline, packages which you intend to deploy on an Amazon EMR cluster.

Installation

Execute the following command, replacing pip3 with the executable appropriate for the environment where you want to configure pyspark to use localstack:

pip3 install localstack-s3-pyspark

Configure Spark's Defaults

If you've installed localstack-s3-pyspark in a Dockerfile or virtual environment, just run the following command:

localstack-s3-pyspark configure-defaults

If you've installed localstack-s3-pyspark in an environment with multiple Python 3.x versions, you may instead want to run an appropriate variation of the following command (replacing python3 with the command used to access the Python executable for which you want to configure pyspark):

python3 -m localstack_s3_pyspark configure-defaults
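
This command writes S3A settings into Spark's defaults so that s3:// paths resolve against localstack. As an illustration only of the kind of configuration involved (the exact keys and values the CLI writes may differ; the endpoint and credentials below are localstack's documented defaults):

```python
# Illustrative sketch of S3A settings pointing Spark at localstack.
# These keys are standard Hadoop S3A properties; whether configure-defaults
# writes exactly this set is an assumption.
LOCALSTACK_S3_CONF = {
    "spark.hadoop.fs.s3a.endpoint": "http://localhost:4566",  # localstack edge port
    "spark.hadoop.fs.s3a.access.key": "test",  # localstack accepts any credentials
    "spark.hadoop.fs.s3a.secret.key": "test",
    "spark.hadoop.fs.s3a.path.style.access": "true",
    "spark.hadoop.fs.s3a.impl": "org.apache.hadoop.fs.s3a.S3AFileSystem",
}


def spark_defaults_lines(conf):
    """Render the settings in spark-defaults.conf format ("key value" per line)."""
    return [f"{key} {value}" for key, value in conf.items()]
```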

Tox

Please note that if you are testing your packages with tox (highly recommended), you will need to:

  • Include "localstack-s3-pyspark" in your tox deps
  • Include localstack-s3-pyspark configure-defaults in your tox commands_pre (or by other means execute this command prior to your tests)

Here is an example tox.ini which starts up localstack using the localstack CLI. (You could also use docker-compose or plain docker run if you need greater control or fewer python dependencies; see the "Getting Started" page of the localstack documentation for details.)

[tox]
envlist = pytest

[testenv:pytest]
deps =
    localstack-s3-pyspark
    localstack
allowlist_externals =
    sleep
commands_pre =
    localstack-s3-pyspark configure-defaults
    localstack start -d
    sleep 20
commands =
    py.test
commands_post =
    localstack stop
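
The sleep 20 above is a crude fixed wait for localstack to finish starting. If that delay proves flaky, polling for readiness is one alternative; a minimal sketch, assuming localstack's HTTP health endpoint (the path has varied across localstack versions):

```python
# Poll until localstack responds, instead of sleeping a fixed interval.
# The health endpoint path is an assumption and may differ by version.
import time
import urllib.request


def wait_for_localstack(url="http://localhost:4566/_localstack/health", timeout=60):
    """Return True once `url` answers 200, or False after `timeout` seconds."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with urllib.request.urlopen(url, timeout=2) as resp:
                if resp.status == 200:
                    return True
        except OSError:  # connection refused / not yet listening
            pass
        time.sleep(1)
    return False
```

Such a function could replace the sleep in a small setup script invoked from commands_pre.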

Patch boto3

If your tests interact with S3 using boto3, you can patch boto3 from within your unit tests as follows:

from localstack_s3_pyspark.boto3 import use_localstack
use_localstack()
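
For comparison, the manual approach this patch spares you is passing an endpoint override to each boto3 client. A sketch using localstack's default edge endpoint (localstack accepts arbitrary credentials; the helper name and parameter values are illustrative, not part of this package):

```python
# Hypothetical helper: keyword arguments for boto3.client(...) targeting
# localstack, e.g. boto3.client(**localstack_client_kwargs()).
def localstack_client_kwargs(service_name="s3"):
    return {
        "service_name": service_name,
        "endpoint_url": "http://localhost:4566",  # localstack's default edge port
        "aws_access_key_id": "test",  # localstack accepts any credentials
        "aws_secret_access_key": "test",
        "region_name": "us-east-1",
    }
```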
