
# parquet2hive
Hive import statement generator for Parquet datasets.

## Installation
```bash
pip install parquet2hive
```

## Example usage
```bash
➜ ~ parquet2hive -d s3://telemetry-test-bucket/churn/telemetry-2/
create external table churn(clientId string, sampleId int, channel string, normalizedChannel string, country string, profileCreationDate int, subsessionStartDate int, subsessionLength int, distributionId string, submissionDate string, syncConfigured boolean, syncCountDesktop int, syncCountMobile int, version string, timestamp bigint) partitioned by (submissionDateS3 string) stored as parquet location 's3://telemetry-test-bucket/churn/telemetry-2/';

```
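
parquet2hive only prints the `create external table` statement; it does not execute it. A minimal sketch of one way to apply the output, assuming the `hive` CLI is available (the file name and the `msck repair` step are illustrative and not part of parquet2hive):

```bash
# Save the generated DDL and run it with the Hive CLI
# (assumes `hive` is on PATH; the file name is arbitrary)
parquet2hive -d s3://telemetry-test-bucket/churn/telemetry-2/ > create_churn.hql
hive -f create_churn.hql

# The table is partitioned, so register the existing partitions before querying
hive -e "msck repair table churn;"
```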

