
Project description

AWS Data Wrangler (beta)

Utility belt to handle data on AWS.


Contents: Use Cases | Installation | Examples


Use Cases

  • Pandas -> Parquet (S3)
  • Pandas -> CSV (S3)
  • Pandas -> Glue Catalog
  • Pandas -> Athena
  • Pandas -> Redshift
  • CSV (S3) -> Pandas
  • Athena -> Pandas
  • PySpark -> Redshift

Installation

pip install awswrangler
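Since this is a beta release, you may want to pin the exact version shown on this page:

pip install awswrangler==0.0b23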

Runs only with Python 3.6 and above.

Runs anywhere (AWS Lambda, AWS Glue, EMR, EC2, on-premises, local, etc).

P.S. A Lambda Layer bundle and a Glue egg are available to download. Just upload one to your account and run! :rocket:
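For the Lambda side, here is a minimal boto3 sketch of attaching an already-published layer to a function (the function name and layer ARN are illustrative placeholders, not values from this project):

import boto3

lambda_client = boto3.client("lambda")

# Placeholder function name and layer ARN - replace them with your own
# function and the layer version you published from the downloaded bundle.
lambda_client.update_function_configuration(
    FunctionName="my-etl-function",
    Layers=["arn:aws:lambda:us-east-1:123456789012:layer:awswrangler:1"],
)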

Examples

Writing a Pandas DataFrame to S3 + Glue Catalog

session = awswrangler.Session()
session.pandas.to_parquet(
    dataframe=dataframe,
    database="database",
    path="s3://...",
    partition_cols=["col_name"],
)

If a Glue database name is passed, all the metadata will be created in the Glue Catalog. If not, only the S3 data write will be performed.
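For example, omitting the database argument should result in a plain S3 write, with no catalog metadata (a minimal sketch based on the behavior described above):

session = awswrangler.Session()
session.pandas.to_parquet(  # No database passed: S3 write only
    dataframe=dataframe,
    path="s3://...",
)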

Reading from AWS Athena to Pandas

session = awswrangler.Session()
dataframe = session.pandas.read_sql_athena(
    sql="select * from table",
    database="database"
)

Reading from S3 (CSV) to Pandas

session = awswrangler.Session()
dataframe = session.pandas.read_csv(path="s3://...")

Typical Pandas ETL

import pandas
import awswrangler

df = pandas.read_...  # Read from anywhere

# Typical Pandas, Numpy or Pyarrow transformation HERE!

session = awswrangler.Session()
session.pandas.to_parquet(  # Storing the data and metadata to the data lake
    dataframe=df,
    database="database",
    path="s3://...",
    partition_cols=["col_name"],
)

Loading a PySpark DataFrame to Redshift

session = awswrangler.Session(spark_session=spark)
session.spark.to_redshift(
    dataframe=df,
    path="s3://...",
    connection=conn,
    schema="public",
    table="table",
    iam_role="IAM_ROLE_ARN",
    mode="append",
)


Download files

Download the file for your platform. If you're not sure which to choose, see the Python Packaging User Guide on installing packages.

Source Distribution

awswrangler-0.0b23.tar.gz (21.3 kB)

Uploaded Source

Built Distribution

awswrangler-0.0b23-py36.py37-none-any.whl (24.4 kB)

Uploaded Python 3.6, Python 3.7

File details

Details for the file awswrangler-0.0b23.tar.gz.

File metadata

  • Download URL: awswrangler-0.0b23.tar.gz
  • Upload date:
  • Size: 21.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.13.0 pkginfo/1.5.0.1 requests/2.20.1 setuptools/41.0.1 requests-toolbelt/0.9.1 tqdm/4.32.2 CPython/3.7.3

File hashes

Hashes for awswrangler-0.0b23.tar.gz

  • SHA256: 09e90eebbe7367706e36f590f9fdd92845910279c5fc79432c66a39bc3a6c6fa
  • MD5: 9e048d7400782b576d9ccdb6636b122a
  • BLAKE2b-256: 95350362591d2df9e5c65ad6cb47c9cd3a9a041014a65ee470cdd62fbaf14429

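A minimal sketch of verifying a downloaded file against the published SHA256 before installing, using only the Python standard library (the local file path is an assumption):

import hashlib

# Assumed path to the downloaded archive in the current directory.
path = "awswrangler-0.0b23.tar.gz"
expected = "09e90eebbe7367706e36f590f9fdd92845910279c5fc79432c66a39bc3a6c6fa"

with open(path, "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

assert digest == expected, "SHA256 mismatch - do not install this file"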

File details

Details for the file awswrangler-0.0b23-py36.py37-none-any.whl.

File metadata

  • Download URL: awswrangler-0.0b23-py36.py37-none-any.whl
  • Upload date:
  • Size: 24.4 kB
  • Tags: Python 3.6, Python 3.7
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.13.0 pkginfo/1.5.0.1 requests/2.20.1 setuptools/41.0.1 requests-toolbelt/0.9.1 tqdm/4.32.2 CPython/3.7.3

File hashes

Hashes for awswrangler-0.0b23-py36.py37-none-any.whl

  • SHA256: f5b0d686417fc7e6e19e4d760f3634c417be53a9a7dc725f74dbd2d3956ee41f
  • MD5: 0586ef59253abf378beae900bb7bdfdc
  • BLAKE2b-256: 09ab76b6ae2d5cdc7d07b5c67f4214d0ef6d77527ae248bc256667e88a91c1b0
