AWS Data Wrangler
Pandas on AWS
| Source | Page | Installation Command |
|---|---|---|
| PyPI | Link | pip install awswrangler |
| Conda | Link | conda install -c conda-forge awswrangler |
Quick Start

Install the Wrangler with: pip install awswrangler

```python
import awswrangler as wr
import pandas as pd

df = pd.DataFrame({"id": [1, 2], "value": ["foo", "boo"]})

# Store the data on the data lake as a Parquet dataset
wr.s3.to_parquet(
    df=df,
    path="s3://bucket/dataset/",
    dataset=True,
    database="my_db",
    table="my_table"
)

# Retrieve the data directly from Amazon S3
df = wr.s3.read_parquet("s3://bucket/dataset/", dataset=True)

# Retrieve the data from Amazon Athena
df = wr.athena.read_sql_query("SELECT * FROM my_table", database="my_db")

# Get a Redshift connection (SQLAlchemy) from the Glue Catalog and retrieve data from Redshift Spectrum
engine = wr.catalog.get_engine("my-redshift-connection")
df = wr.db.read_sql_query("SELECT * FROM external_schema.my_table", con=engine)

# Create a QuickSight data source and dataset to reflect the new table
wr.quicksight.create_athena_data_source("athena-source", allowed_to_manage=["username"])
wr.quicksight.create_athena_dataset(
    name="my-dataset",
    database="my_db",
    table="my_table",
    data_source_name="athena-source",
    allowed_to_manage=["username"]
)

# Get a MySQL connection (SQLAlchemy) from the Glue Catalog and load the data into MySQL
engine = wr.catalog.get_engine("my-mysql-connection")
wr.db.to_sql(df, engine, schema="test", name="my_table")

# Get a PostgreSQL connection (SQLAlchemy) from the Glue Catalog and load the data into PostgreSQL
engine = wr.catalog.get_engine("my-postgresql-connection")
wr.db.to_sql(df, engine, schema="test", name="my_table")
```
Read The Docs
- 001 - Introduction
- 002 - Sessions
- 003 - Amazon S3
- 004 - Parquet Datasets
- 005 - Glue Catalog
- 006 - Amazon Athena
- 007 - Databases (Redshift, MySQL and PostgreSQL)
- 008 - Redshift - Copy & Unload
- 009 - Redshift - Append, Overwrite and Upsert
- 010 - Parquet Crawler
- 011 - CSV Datasets
- 012 - CSV Crawler
- 013 - Merging Datasets on S3
- 014 - Schema Evolution
- 015 - EMR
- 016 - EMR & Docker
- 017 - Partition Projection
- 018 - QuickSight
- 019 - Athena Cache
Download files
Download the file for your platform.
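To pin this release against the published digests, pip's hash-checking mode can be used. The sketch below writes a requirements file carrying the SHA256 digests listed in the "File hashes" tables on this page (wheel first, then sdist); the install command itself is left commented since it needs network access:

```shell
# requirements.txt pinning awswrangler 1.6.0 to the published SHA256 digests
cat > requirements.txt <<'EOF'
awswrangler==1.6.0 \
    --hash=sha256:48000bf1d80a2548af71f519f90747b16a26ab5af7e6ea1ef7896d9a323ee76e \
    --hash=sha256:20fee3c4b5350b88fe73f88b456366122abf07e588e39f3c5dd610df8b2dc587
EOF

# With hashes present, pip verifies every downloaded file before installing:
# pip install --require-hashes -r requirements.txt
```

pip refuses to install any file whose digest does not match one of the `--hash` entries, so a tampered or substituted archive fails loudly instead of installing silently.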
Source Distribution
- awswrangler-1.6.0.tar.gz (74.7 kB)

Built Distributions
- awswrangler-1.6.0-py3.6.egg (198.3 kB)
- awswrangler-1.6.0-py3-none-any.whl (90.0 kB)
File details
Details for the file awswrangler-1.6.0.tar.gz.
File metadata
- Download URL: awswrangler-1.6.0.tar.gz
- Upload date:
- Size: 74.7 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.24.0 setuptools/40.6.2 requests-toolbelt/0.9.1 tqdm/4.46.1 CPython/3.6.10
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 20fee3c4b5350b88fe73f88b456366122abf07e588e39f3c5dd610df8b2dc587 |
| MD5 | 6edaea19929136cfa533ef167216cc68 |
| BLAKE2b-256 | 9aea71c9dc6df0397c52bfa56d583d0e2fa0d31d1c255e8d05a3bc62d6586b07 |
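To confirm that an already-downloaded archive matches a published digest, a stdlib-only check is enough. This is a minimal sketch: the local path is an assumption, and the expected value is the sdist SHA256 from the table above.

```python
import hashlib


def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Stream a file through SHA-256 and return the hex digest."""
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()


# SHA256 of awswrangler-1.6.0.tar.gz, copied from the table above.
EXPECTED = "20fee3c4b5350b88fe73f88b456366122abf07e588e39f3c5dd610df8b2dc587"

# Assumes the sdist was downloaded to the current directory:
# assert sha256_of("awswrangler-1.6.0.tar.gz") == EXPECTED
```

Streaming in chunks keeps memory flat regardless of archive size, which matters more for large wheels than for this 74.7 kB sdist.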
File details
Details for the file awswrangler-1.6.0-py3.6.egg.
File metadata
- Download URL: awswrangler-1.6.0-py3.6.egg
- Upload date:
- Size: 198.3 kB
- Tags: Egg
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.24.0 setuptools/40.6.2 requests-toolbelt/0.9.1 tqdm/4.46.1 CPython/3.6.10
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | a2d8a6876c9566d4c13b6118f845a3cd9f389870f4838470eebb510e3162f9a0 |
| MD5 | 0ac8c8f51e57c669bbe98d5c179d87f1 |
| BLAKE2b-256 | b907406f5a3858097b34050de7308936f284b0cfccd6ad6e13f137ad32fdef87 |
File details
Details for the file awswrangler-1.6.0-py3-none-any.whl.
File metadata
- Download URL: awswrangler-1.6.0-py3-none-any.whl
- Upload date:
- Size: 90.0 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.24.0 setuptools/40.6.2 requests-toolbelt/0.9.1 tqdm/4.46.1 CPython/3.6.10
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 48000bf1d80a2548af71f519f90747b16a26ab5af7e6ea1ef7896d9a323ee76e |
| MD5 | 36aebed5e57dfc234c0f38e335148b85 |
| BLAKE2b-256 | 8ef1c40e08aafce1e63a37a4d097161387c3177ec5c706772716edf05c55a9ec |