Project description

AWS Data Wrangler (beta)

Utility belt to handle data on AWS.


Contents: Use Cases | Installation | Examples


Use Cases

  • Pandas -> Parquet (S3)
  • Pandas -> CSV (S3)
  • Pandas -> Glue Catalog
  • Pandas -> Athena
  • Pandas -> Redshift
  • CSV (S3) -> Pandas
  • Athena -> Pandas
  • PySpark -> Redshift

Installation

pip install awswrangler

Requires Python 3.6 or later.

Runs anywhere (AWS Lambda, AWS Glue, EMR, EC2, on-premises, local, etc.).

P.S. A Lambda Layer bundle and a Glue egg are available for download. Just upload them to your account and run! :rocket:
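
For example, the downloaded Layer bundle could be published to your account with boto3 (a minimal sketch; the bundle file name, layer name and runtime list below are placeholders, not part of this package):

import boto3

lambda_client = boto3.client("lambda")

# Publish the downloaded bundle as a new Lambda Layer version.
# "awswrangler-layer.zip" and the runtime list are assumptions for illustration.
with open("awswrangler-layer.zip", "rb") as f:
    response = lambda_client.publish_layer_version(
        LayerName="awswrangler",
        Content={"ZipFile": f.read()},
        CompatibleRuntimes=["python3.6"],
    )

print(response["LayerVersionArn"])  # attach this ARN to your Lambda function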

Examples

Writing a Pandas DataFrame to S3 + Glue Catalog

import awswrangler

session = awswrangler.Session()
session.pandas.to_parquet(
    dataframe=dataframe,
    database="database",
    path="s3://...",
    partition_cols=["col_name"],
)

If a Glue database name is passed, all the metadata will be created in the Glue Catalog. If not, only the S3 data write will be performed.
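
For an S3-only write (no Glue Catalog metadata), simply omit the database argument (the same call as above, sketched without the catalog step):

import awswrangler

session = awswrangler.Session()
session.pandas.to_parquet(  # writes only the Parquet files to S3
    dataframe=dataframe,
    path="s3://...",
    partition_cols=["col_name"],
)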

Reading from AWS Athena to Pandas

import awswrangler

session = awswrangler.Session()
dataframe = session.pandas.read_sql_athena(
    sql="select * from table",
    database="database"
)

Reading from S3 (CSV) to Pandas

import awswrangler

session = awswrangler.Session()
dataframe = session.pandas.read_csv(path="s3://...")

Typical Pandas ETL

import pandas
import awswrangler

df = pandas.read_...  # Read from anywhere

# Typical Pandas, Numpy or Pyarrow transformation HERE!

session = awswrangler.Session()
session.pandas.to_parquet(  # Storing the data and metadata to Data Lake
    dataframe=df,
    database="database",
    path="s3://...",
    partition_cols=["col_name"],
)

Loading a PySpark DataFrame to Redshift

import awswrangler

session = awswrangler.Session(spark_session=spark)
session.spark.to_redshift(
    dataframe=df,
    path="s3://...",
    connection=conn,
    schema="public",
    table="table",
    iam_role="IAM_ROLE_ARN",
    mode="append",
)

Project details


Release history

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

awswrangler-0.0b21.tar.gz (20.9 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

awswrangler-0.0b21-py36,py37-none-any.whl (23.8 kB)

Uploaded Python 3.6,py37

File details

Details for the file awswrangler-0.0b21.tar.gz.

File metadata

  • Download URL: awswrangler-0.0b21.tar.gz
  • Upload date:
  • Size: 20.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.13.0 pkginfo/1.5.0.1 requests/2.20.1 setuptools/41.0.1 requests-toolbelt/0.9.1 tqdm/4.32.2 CPython/3.7.3

File hashes

Hashes for awswrangler-0.0b21.tar.gz
Algorithm Hash digest
SHA256 b0fd3cae3f19238fe41cdf65c8aaf72bb4f3763e24349f37ff79a97902d2d73b
MD5 fbaf61f27ec3eb3c97e60cc34693f579
BLAKE2b-256 e5377c72ff6a46bdfcf940389e1d22defd56eae3005d185ec0e12c51cb480c93

See more details on using hashes here.
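
As a quick sanity check, the SHA256 digest above can be recomputed locally with Python's hashlib (a minimal sketch; the local file path is an assumption):

import hashlib

# Expected digest copied from the table above; the local path is an assumption.
expected = "b0fd3cae3f19238fe41cdf65c8aaf72bb4f3763e24349f37ff79a97902d2d73b"

with open("awswrangler-0.0b21.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

assert digest == expected, "SHA256 mismatch - do not install this file"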

File details

Details for the file awswrangler-0.0b21-py36,py37-none-any.whl.

File metadata

  • Download URL: awswrangler-0.0b21-py36,py37-none-any.whl
  • Upload date:
  • Size: 23.8 kB
  • Tags: Python 3.6,py37
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.13.0 pkginfo/1.5.0.1 requests/2.20.1 setuptools/41.0.1 requests-toolbelt/0.9.1 tqdm/4.32.2 CPython/3.7.3

File hashes

Hashes for awswrangler-0.0b21-py36,py37-none-any.whl
Algorithm Hash digest
SHA256 c9e93037e119f636e82911810b4e438d97bf0621994de12b3a126bd6ee704bbe
MD5 23da34b60d15351a71b94e1aa066ef8e
BLAKE2b-256 b99bc6b829981a83166d62ac26531cf133b87ea7103ed2438c9c8c06d4d1b7e5

See more details on using hashes here.
