
Project description

AWS Data Wrangler

Pandas on AWS

Easy integration with Athena, Glue, Redshift, Timestream, QuickSight, Chime, CloudWatch Logs, DynamoDB, EMR, Secrets Manager, PostgreSQL, MySQL, SQL Server and S3 (Parquet, CSV, JSON and Excel).

An AWS Professional Service open source initiative | aws-proserve-opensource@amazon.com


Source  Installation Command
PyPI    pip install awswrangler
Conda   conda install -c conda-forge awswrangler

⚠️ For platforms without PyArrow 3 support (e.g. EMR, Glue PySpark Job, MWAA):
➡️ pip install pyarrow==2 awswrangler
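
To confirm which versions actually landed in your environment (useful after pinning PyArrow as above), both packages expose a standard __version__ attribute; a minimal check:

import awswrangler as wr
import pyarrow

# Print the installed versions to confirm the PyArrow pin took effect
print(f"awswrangler {wr.__version__}, pyarrow {pyarrow.__version__}")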

Quick Start

Installation command: pip install awswrangler

⚠️ For platforms without PyArrow 3 support (e.g. EMR, Glue PySpark Job, MWAA):
➡️ pip install pyarrow==2 awswrangler

import awswrangler as wr
import pandas as pd
from datetime import datetime

df = pd.DataFrame({"id": [1, 2], "value": ["foo", "boo"]})

# Storing data on the Data Lake
wr.s3.to_parquet(
    df=df,
    path="s3://bucket/dataset/",
    dataset=True,
    database="my_db",
    table="my_table"
)

# Retrieving the data directly from Amazon S3
df = wr.s3.read_parquet("s3://bucket/dataset/", dataset=True)

# Retrieving the data from Amazon Athena
df = wr.athena.read_sql_query("SELECT * FROM my_table", database="my_db")

# Get a Redshift connection from the Glue Catalog and retrieve data from Redshift Spectrum
con = wr.redshift.connect("my-glue-connection")
df = wr.redshift.read_sql_query("SELECT * FROM external_schema.my_table", con=con)
con.close()

# Amazon Timestream Write
df = pd.DataFrame({
    "time": [datetime.now(), datetime.now()],
    "my_dimension": ["foo", "boo"],
    "measure": [1.0, 1.1],
})
rejected_records = wr.timestream.write(
    df=df,
    database="sampleDB",
    table="sampleTable",
    time_col="time",
    measure_col="measure",
    dimensions_cols=["my_dimension"],
)

# Amazon Timestream Query
wr.timestream.query("""
SELECT time, measure_value::double, my_dimension
FROM "sampleDB"."sampleTable" ORDER BY time DESC LIMIT 3
""")

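Beyond Parquet, the same S3 module reads and writes flat files; a minimal sketch of a CSV round trip (the bucket path is a placeholder, as above):

# Write the DataFrame as a single CSV object and read it back
wr.s3.to_csv(df=df, path="s3://bucket/file.csv", index=False)
df = wr.s3.read_csv("s3://bucket/file.csv")
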
Read The Docs

Community Resources

Please send a Pull Request with your resource reference and @githubhandle.

Logging

Examples of enabling internal logging:

import logging
logging.basicConfig(level=logging.INFO, format="[%(name)s][%(funcName)s] %(message)s")
logging.getLogger("awswrangler").setLevel(logging.DEBUG)
logging.getLogger("botocore.credentials").setLevel(logging.CRITICAL)

Inside an AWS Lambda function:

import logging
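# The Lambda Python runtime already attaches a handler to the root logger,
# so adjusting the awswrangler logger level is enough here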
logging.getLogger("awswrangler").setLevel(logging.DEBUG)

Who uses AWS Data Wrangler?

Knowing which companies are using this library is important to help prioritize the project internally.

If you would like your company to be listed, please send a Pull Request with your company name and @githubhandle.

What is Amazon SageMaker Data Wrangler?

Amazon SageMaker Data Wrangler is a new SageMaker Studio feature that shares a similar name with, but serves a different purpose from, the AWS Data Wrangler open source project.

  • AWS Data Wrangler is open source, runs anywhere, and is focused on code.

  • Amazon SageMaker Data Wrangler is specific for the SageMaker Studio environment and is focused on a visual interface.


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

awswrangler-2.9.0.tar.gz (146.9 kB)

Uploaded Source

Built Distributions

If you're not sure about the file name format, learn more about wheel file names.

awswrangler-2.9.0-py3.8.egg (404.6 kB)

Uploaded Egg

awswrangler-2.9.0-py3-none-any.whl (183.6 kB)

Uploaded Python 3

File details

Details for the file awswrangler-2.9.0.tar.gz.

File metadata

  • Download URL: awswrangler-2.9.0.tar.gz
  • Upload date:
  • Size: 146.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.1 importlib_metadata/4.5.0 pkginfo/1.7.0 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.61.1 CPython/3.8.9

File hashes

Hashes for awswrangler-2.9.0.tar.gz
Algorithm    Hash digest
SHA256       fba54ffb153b372431af36e420330e87e0d09f8e27ecd56a17976e4d4b61bafd
MD5          afa5ba22051e71e08d8f772789f40f7a
BLAKE2b-256  4e8082d06c5b5aaf8a40890582f23e5130729be6726171e386806388a58e83ef
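
To check a downloaded file against the digests above, Python's standard hashlib is enough; a minimal sketch, assuming the sdist sits in the current directory:

import hashlib

# Compare the local file's SHA256 digest with the published one before installing
expected = "fba54ffb153b372431af36e420330e87e0d09f8e27ecd56a17976e4d4b61bafd"
with open("awswrangler-2.9.0.tar.gz", "rb") as f:
    actual = hashlib.sha256(f.read()).hexdigest()
assert actual == expected, "SHA256 mismatch: do not install this file"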


File details

Details for the file awswrangler-2.9.0-py3.8.egg.

File metadata

  • Download URL: awswrangler-2.9.0-py3.8.egg
  • Upload date:
  • Size: 404.6 kB
  • Tags: Egg
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.1 importlib_metadata/4.5.0 pkginfo/1.7.0 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.61.1 CPython/3.8.9

File hashes

Hashes for awswrangler-2.9.0-py3.8.egg
Algorithm    Hash digest
SHA256       195d0a8661b21186661fa366a50b2fd5a9a3c239992bb056b2b7a132ce060c3a
MD5          4a3055b8bc61e017fe3b594f1f94047f
BLAKE2b-256  8a468e9cbc32b9ac5b87e3246fe6aa7b8a9109e752dc7743e54b83dde55eb4fb


File details

Details for the file awswrangler-2.9.0-py3-none-any.whl.

File metadata

  • Download URL: awswrangler-2.9.0-py3-none-any.whl
  • Upload date:
  • Size: 183.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.1 importlib_metadata/4.5.0 pkginfo/1.7.0 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.61.1 CPython/3.8.9

File hashes

Hashes for awswrangler-2.9.0-py3-none-any.whl
Algorithm    Hash digest
SHA256       ee78ffb824de325424472538b36a85f487b1fdd314ae1c60d60111db340da879
MD5          c494367c0255f33070ae2e2139924890
BLAKE2b-256  65433c1b5fced4d3caa31a17f4cae5ec6bc656becbcf492041c5caa809825ec1

