
Spectrify


A simple yet powerful tool to move your data from Redshift to Redshift Spectrum.

Features

One-liners to:

  • Export a Redshift table to S3 (CSV)

  • Convert exported CSVs to Parquet files in parallel

  • Create the Spectrum table on your Redshift cluster

  • Perform all 3 steps in sequence, essentially “copying” a Redshift table to Spectrum in one command.

S3 credentials are supplied via boto3’s standard configuration mechanisms; see http://boto3.readthedocs.io/en/latest/guide/configuration.html for details.
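
For example, one way to provide credentials is through the standard AWS environment variables that boto3 reads (the values below are placeholders):

$ export AWS_ACCESS_KEY_ID=<your-access-key-id>
$ export AWS_SECRET_ACCESS_KEY=<your-secret-access-key>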

Redshift credentials are supplied via environment variables, command-line parameters, or interactive prompt.

Install

$ pip install spectrify

Command-line Usage

Export Redshift table my_table to a folder of CSV files on S3:

$ spectrify --host=example-url.redshift.aws.com --user=myuser --db=mydb export my_table \
    's3://example-bucket/my_table'

Convert exported CSVs to Parquet:

$ spectrify --host=example-url.redshift.aws.com --user=myuser --db=mydb convert my_table \
    's3://example-bucket/my_table'

Create Spectrum table from S3 folder:

$ spectrify --host=example-url.redshift.aws.com --user=myuser --db=mydb create_table \
    's3://example-bucket/my_table' my_table my_spectrum_table

Transform Redshift table by performing all 3 steps in sequence:

$ spectrify --host=example-url.redshift.aws.com --user=myuser --db=mydb transform my_table \
    's3://example-bucket/my_table'

Python Usage

Currently, you’ll have to supply your own SQLAlchemy engine to each of the commands below (pull requests welcome to make this easier).
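
For reference, a minimal sketch of constructing such an engine; the connection URL below is a placeholder and assumes the psycopg2 driver is installed:

from sqlalchemy import create_engine
# Redshift speaks the Postgres protocol; 5439 is Redshift's default port
sa_engine = create_engine('postgresql://myuser:mypassword@example-url.redshift.aws.com:5439/mydb')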

Export to S3:

from spectrify.export import export_to_csv

# Export the contents of table_name to CSV files under s3_csv_dir
export_to_csv(sa_engine, table_name, s3_csv_dir)

Convert exported CSVs to Parquet:

from spectrify.convert import convert_redshift_manifest_to_parquet
from spectrify.utils.schema import get_table_schema

# Read the source table's schema, then convert each CSV listed in the
# manifest into a Parquet file under s3_spectrum_dir
sa_table = get_table_schema(sa_engine, source_table_name)
convert_redshift_manifest_to_parquet(s3_csv_manifest_path, sa_table, s3_spectrum_dir)

Create Spectrum table from S3 parquet folder:

from spectrify.create import create_external_table
from spectrify.utils.schema import get_table_schema

# Create an external (Spectrum) table pointing at the Parquet files in s3_spectrum_path
sa_table = get_table_schema(sa_engine, source_table_name)
create_external_table(sa_engine, dest_schema, dest_table_name, sa_table, s3_spectrum_path)

Transform Redshift table by performing all 3 steps in sequence:

from spectrify.transform import transform_table

# Run the export, convert, and create steps in sequence
transform_table(sa_engine, table_name, s3_base_path, dest_schema, dest_table, num_workers)
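
Putting it all together, a hypothetical end-to-end invocation (all argument values are placeholders, and the worker count is an assumption):

from sqlalchemy import create_engine
from spectrify.transform import transform_table

sa_engine = create_engine('postgresql://myuser:mypassword@example-url.redshift.aws.com:5439/mydb')

# Export my_table to CSV, convert it to Parquet, and create spectrum.my_spectrum_table
transform_table(sa_engine, 'my_table', 's3://example-bucket/my_table',
                'spectrum', 'my_spectrum_table', 4)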

Contribute

Contributions are always welcome! Read our guide on contributing here: http://spectrify.readthedocs.io/en/latest/contributing.html

License

MIT License. Copyright (c) 2017, The Narrativ Company, Inc.

History

0.2.0 (2017-09-27)

  • First release on PyPI.

0.1.0 (2017-09-13)

  • Didn’t even make it to PyPI.
