
Project description

AWS Data Wrangler

Pandas on AWS

Easy integration with Athena, Glue, Redshift, Timestream, QuickSight, Chime, CloudWatch Logs, DynamoDB, EMR, Secrets Manager, PostgreSQL, MySQL, SQL Server and S3 (Parquet, CSV, JSON and Excel).

AWS Data Wrangler

An AWS Professional Service open source initiative | aws-proserve-opensource@amazon.com


Installation commands by source:

  • PyPI: pip install awswrangler
  • Conda: conda install -c conda-forge awswrangler

⚠️ For platforms without PyArrow 3 support (e.g. EMR, Glue PySpark Job, MWAA):
➡️ pip install pyarrow==2 awswrangler


Quick Start

Installation command: pip install awswrangler

⚠️ For platforms without PyArrow 3 support (e.g. EMR, Glue PySpark Job, MWAA):
➡️ pip install pyarrow==2 awswrangler

import awswrangler as wr
import pandas as pd
from datetime import datetime

df = pd.DataFrame({"id": [1, 2], "value": ["foo", "boo"]})

# Storing data on Data Lake
wr.s3.to_parquet(
    df=df,
    path="s3://bucket/dataset/",
    dataset=True,
    database="my_db",
    table="my_table"
)

# Retrieving the data directly from Amazon S3
df = wr.s3.read_parquet("s3://bucket/dataset/", dataset=True)

# Retrieving the data from Amazon Athena
df = wr.athena.read_sql_query("SELECT * FROM my_table", database="my_db")

# Getting a Redshift connection from the Glue Catalog and retrieving data from Redshift Spectrum
con = wr.redshift.connect("my-glue-connection")
df = wr.redshift.read_sql_query("SELECT * FROM external_schema.my_table", con=con)
con.close()

# Amazon Timestream Write
df = pd.DataFrame({
    "time": [datetime.now(), datetime.now()],   
    "my_dimension": ["foo", "boo"],
    "measure": [1.0, 1.1],
})
rejected_records = wr.timestream.write(
    df,
    database="sampleDB",
    table="sampleTable",
    time_col="time",
    measure_col="measure",
    dimensions_cols=["my_dimension"],
)

# Amazon Timestream Query
wr.timestream.query("""
SELECT time, measure_value::double, my_dimension
FROM "sampleDB"."sampleTable" ORDER BY time DESC LIMIT 3
""")

Read The Docs

Community Resources

Please send a Pull Request with your resource reference and @githubhandle.

Logging

Examples of enabling internal logging:

import logging
logging.basicConfig(level=logging.INFO, format="[%(name)s][%(funcName)s] %(message)s")
logging.getLogger("awswrangler").setLevel(logging.DEBUG)
logging.getLogger("botocore.credentials").setLevel(logging.CRITICAL)

Inside AWS Lambda:

import logging
logging.getLogger("awswrangler").setLevel(logging.DEBUG)

Who uses AWS Data Wrangler?

Knowing which companies are using this library is important to help prioritize the project internally.

Please send a Pull Request with your company name and @githubhandle if you wish.

What is Amazon SageMaker Data Wrangler?

Amazon SageMaker Data Wrangler is a SageMaker Studio feature that has a similar name but a different purpose from the AWS Data Wrangler open source project.

  • AWS Data Wrangler is open source, runs anywhere, and is focused on code.

  • Amazon SageMaker Data Wrangler is specific for the SageMaker Studio environment and is focused on a visual interface.


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

awswrangler-2.7.0.tar.gz (141.6 kB)

Uploaded Source

Built Distributions

If you're not sure about the file name format, learn more about wheel file names.

awswrangler-2.7.0-py3.6.egg (383.9 kB)

Uploaded Egg

awswrangler-2.7.0-py3-none-any.whl (176.2 kB)

Uploaded Python 3

File details

Details for the file awswrangler-2.7.0.tar.gz.

File metadata

  • Download URL: awswrangler-2.7.0.tar.gz
  • Upload date:
  • Size: 141.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.1 importlib_metadata/3.10.0 pkginfo/1.7.0 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.59.0 CPython/3.7.7

File hashes

Hashes for awswrangler-2.7.0.tar.gz

  • SHA256: 1669002e8d44537a8d02c93ed7ce3cae8196733e9ab3a50036cd386b8336b542
  • MD5: 8ae40552912ab32ba2eabc224ad4ab8e
  • BLAKE2b-256: 6f612489e914f5e9e480bc0784696cef3b0b5aa7c43c9ad910b313622c8cc738

See more details on using hashes here.
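
As an illustration, a small sketch that checks a locally downloaded copy of the source distribution against the SHA256 value above (the local file name assumes the file was saved to the current directory):

import hashlib

# Expected SHA256 digest copied from the table above.
EXPECTED = "1669002e8d44537a8d02c93ed7ce3cae8196733e9ab3a50036cd386b8336b542"

# Assumes awswrangler-2.7.0.tar.gz was downloaded to the current directory.
with open("awswrangler-2.7.0.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

assert digest == EXPECTED, "SHA256 mismatch: the download may be corrupted"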

File details

Details for the file awswrangler-2.7.0-py3.6.egg.

File metadata

  • Download URL: awswrangler-2.7.0-py3.6.egg
  • Upload date:
  • Size: 383.9 kB
  • Tags: Egg
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.1 importlib_metadata/3.10.0 pkginfo/1.7.0 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.59.0 CPython/3.7.7

File hashes

Hashes for awswrangler-2.7.0-py3.6.egg

  • SHA256: 8693587b537166540d4e9dd9a5693acb50762dcd9f41b5b9824a23b27cbb1137
  • MD5: c67f9968e92d0436a627f60f29b2a6e8
  • BLAKE2b-256: 5f22ccf20ad6debefd5bf9aff3e009e24152ec1dcb7d55f65904fa32560288fa

See more details on using hashes here.

File details

Details for the file awswrangler-2.7.0-py3-none-any.whl.

File metadata

  • Download URL: awswrangler-2.7.0-py3-none-any.whl
  • Upload date:
  • Size: 176.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.1 importlib_metadata/3.10.0 pkginfo/1.7.0 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.59.0 CPython/3.7.7

File hashes

Hashes for awswrangler-2.7.0-py3-none-any.whl

  • SHA256: 6e13a34a2cdc28a9a26103fe72ccf1997e4465af5c875a0f8e58f98493810e8e
  • MD5: 78119bec88618719ba496d5259a683c4
  • BLAKE2b-256: 98cf406eba8a9cf2bcb89a5e8bb366555269f47a73358be73d6e2e7441d99520

See more details on using hashes here.
