CDK Constructs for streaming data from an Amazon Kinesis Data Stream to a custom AWS Glue ETL job
aws-kinesisstreams-gluejob module
All classes are under active development and subject to non-backward-compatible changes or removal in any future version. These are not subject to the Semantic Versioning model. This means that while you may use them, you may need to update your source code when upgrading to a newer version of this package.
Reference Documentation: https://docs.aws.amazon.com/solutions/latest/constructs/

| Language | Package |
|---|---|
| Python | `aws_solutions_constructs.aws_kinesis_streams_gluejob` |
| Typescript | `@aws-solutions-constructs/aws-kinesisstreams-gluejob` |
| Java | `software.amazon.awsconstructs.services.kinesisstreamgluejob` |
This AWS Solutions Construct deploys a Kinesis Data Stream and configures an AWS Glue Job to perform a custom ETL transformation, with the appropriate resources and properties for interaction and security. It also creates an S3 bucket where the Python script for the AWS Glue Job can be uploaded.
Here is a minimal deployable pattern definition in TypeScript:

```typescript
import * as glue from '@aws-cdk/aws-glue';
import * as s3assets from '@aws-cdk/aws-s3-assets';
import { KinesisstreamsToGluejob } from '@aws-solutions-constructs/aws-kinesisstreams-gluejob';

const fieldSchema: glue.CfnTable.ColumnProperty[] = [
  {
    name: 'id',
    type: 'int',
    comment: 'Identifier for the record',
  },
  {
    name: 'name',
    type: 'string',
    comment: 'Name for the record',
  },
  {
    name: 'address',
    type: 'string',
    comment: 'Address for the record',
  },
  {
    name: 'value',
    type: 'int',
    comment: 'Value for the record',
  },
];

const customEtlJob = new KinesisstreamsToGluejob(this, 'CustomETL', {
  glueJobProps: {
    command: {
      name: 'gluestreaming',
      pythonVersion: '3',
      scriptLocation: new s3assets.Asset(this, 'ScriptLocation', {
        path: `${__dirname}/../etl/transform.py`,
      }).s3ObjectUrl,
    },
  },
  fieldSchema: fieldSchema,
});
```
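For users of this Python package, the same pattern can be sketched as follows. This is an untested configuration sketch, not a verified sample: it assumes CDK v1-style packages matching this 1.x release, passes property structs as plain dictionaries (which jsii accepts, using snake_case keys), and the `scriptLocation` S3 path is a hypothetical placeholder for your uploaded ETL script.

```python
# Sketch of the minimal pattern in Python (assumptions noted in the lead-in).
from aws_cdk import core
from aws_solutions_constructs.aws_kinesis_streams_gluejob import KinesisstreamsToGluejob


class CustomEtlStack(core.Stack):
    def __init__(self, scope, construct_id, **kwargs):
        super().__init__(scope, construct_id, **kwargs)

        # Same four-column schema as the TypeScript example above
        field_schema = [
            {"name": "id", "type": "int", "comment": "Identifier for the record"},
            {"name": "name", "type": "string", "comment": "Name for the record"},
            {"name": "address", "type": "string", "comment": "Address for the record"},
            {"name": "value", "type": "int", "comment": "Value for the record"},
        ]

        KinesisstreamsToGluejob(
            self, "CustomETL",
            glue_job_props={
                "command": {
                    "name": "gluestreaming",
                    "python_version": "3",
                    # Hypothetical location: upload your ETL script and point here
                    "script_location": "s3://my-script-bucket/etl/transform.py",
                }
            },
            field_schema=field_schema,
        )
```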
Initializer

```typescript
new KinesisstreamsToGluejob(scope: Construct, id: string, props: KinesisstreamsToGluejobProps);
```
Parameters

- scope `Construct`
- id `string`
- props `KinesisstreamsToGluejobProps`
Pattern Construct Props

| Name | Type | Description |
|---|---|---|
| kinesisStreamProps? | `kinesis.StreamProps` | Optional user-provided props to override the default props for the Kinesis stream. |
| existingStreamObj? | `kinesis.Stream` | Existing instance of a Kinesis stream; if set, `kinesisStreamProps` is ignored. |
| glueJobProps? | `cfnJob.CfnJobProps` | User-provided props to override the default props for the CfnJob. |
| existingGlueJob? | `cfnJob.CfnJob` | Existing CfnJob configuration; if set, `glueJobProps` is ignored. |
| existingDatabase? | `CfnDatabase` | Existing Glue Database to be used with this construct. If not provided, the construct creates a new Glue Database. The database is where the schema for the data in the Kinesis Data Stream is stored. |
| databaseProps? | `CfnDatabaseProps` | User-provided Glue Database props to override the defaults used to create the Glue Database. If not provided, a database is created with default props. |
| existingTable? | `CfnTable` | Existing Glue Table to be used with this construct. This table should define the schema for the records in the Kinesis Data Stream. One of `existingTable`, `tableProps`, or `fieldSchema` is mandatory. If not provided, the construct creates a new Table in the database. If `existingTable` is provided, `tableProps` and `fieldSchema` are ignored. |
| tableProps? | `CfnTableProps` | User-provided Glue Table props to override the defaults used to create a Glue Table. If `existingTable` is provided, `tableProps` is ignored. If `tableProps` is set, `fieldSchema` is ignored. |
| fieldSchema? | `CfnTable.ColumnProperty[]` | One of `existingTable`, `tableProps`, or `fieldSchema` must be provided; if none is, the construct throws an error. |
| outputDataStore? | `SinkDataStoreProps` | Properties of the output datastore where the ETL output should be written. Currently the only supported datastore type is S3. |
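As an illustration of the override props above, the Python sketch below reuses an existing Kinesis stream (so `kinesisStreamProps` is ignored) and directs ETL output to an existing bucket via `SinkDataStoreProps`. This is a hedged configuration sketch, not a verified sample: the construct IDs and S3 path are hypothetical, and it assumes `SinkStoreType` is exported from the Python module alongside the construct, mirroring the TypeScript API.

```python
# Illustrative sketch only; assumptions noted in the lead-in.
from aws_cdk import aws_kinesis as kinesis
from aws_cdk import aws_s3 as s3
from aws_solutions_constructs.aws_kinesis_streams_gluejob import (
    KinesisstreamsToGluejob,
    SinkStoreType,
)

# Inside a Stack's __init__ (self is the Stack):
existing_stream = kinesis.Stream(self, "ExistingStream")  # -> existingStreamObj
output_bucket = s3.Bucket(self, "EtlOutputBucket")        # -> existingS3OutputBucket

KinesisstreamsToGluejob(
    self, "CustomETL",
    existing_stream_obj=existing_stream,  # kinesisStreamProps is then ignored
    glue_job_props={
        "command": {
            "name": "gluestreaming",
            "python_version": "3",
            "script_location": "s3://my-script-bucket/etl/transform.py",
        }
    },
    # One of existingTable / tableProps / fieldSchema is mandatory
    field_schema=[
        {"name": "id", "type": "int", "comment": "Identifier for the record"},
    ],
    output_data_store={
        "existing_s3_output_bucket": output_bucket,
        "datastore_type": SinkStoreType.S3,
    },
)
```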
SinkDataStoreProps

| Name | Type | Description |
|---|---|---|
| existingS3OutputBucket? | `Bucket` | Existing S3 bucket where the ETL output should be written; its location is passed to the ETL script as an argument to the AWS Glue job. If not provided, the construct checks `outputBucketProps`; if those are also absent, it creates a new bucket when `datastoreType` is S3. |
| outputBucketProps | `BucketProps` | If `existingS3OutputBucket` is not provided, these props are used to create an S3 bucket to store the ETL output. If this property is not set, a bucket with default properties is created. |
| datastoreType | `SinkStoreType` | Sink data store type. |
SinkStoreType

Enumeration of data store types that could include S3, DynamoDB, DocumentDB, RDS, or Redshift. The current construct implementation supports only S3, with the potential to add other output types in the future.

| Name | Type | Description |
|---|---|---|
| S3 | `string` | S3 storage type |
Default settings
An out-of-the-box implementation of the construct, without any overrides, sets the following defaults:
Amazon Kinesis Stream
- Configure least privilege access IAM role for Kinesis Stream
- Enable server-side encryption for Kinesis Stream using AWS Managed KMS Key
- Deploy best practices CloudWatch Alarms for the Kinesis Stream
Glue Job
- Create a Glue Security Config that configures encryption for CloudWatch, Job Bookmarks, and S3. CloudWatch and Job Bookmarks are encrypted using AWS Managed KMS Key created for AWS Glue Service. The S3 bucket is configured with SSE-S3 encryption mode
- Configure service role policies that allow AWS Glue to read from Kinesis Data Streams.
Glue Database
- Create an AWS Glue database. An AWS Glue Table will be added to the database. This table defines the schema for the records buffered in the Amazon Kinesis Data Streams
Glue Table
- Create an AWS Glue table. The table schema definition is based on the JSON structure of the records buffered in the Amazon Kinesis Data Streams
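To make the relationship between the table schema and the buffered records concrete, here is a small dependency-free Python check. It is illustrative only: the sample record is hypothetical, and the Glue-type-to-Python-type mapping covers just the two types used in the example schema.

```python
import json

# Glue column types from the example schema mapped to the Python types that
# json.loads produces (illustration only; Glue supports many more types).
GLUE_TO_PY = {"int": int, "string": str}

field_schema = [
    {"name": "id", "type": "int"},
    {"name": "name", "type": "string"},
    {"name": "address", "type": "string"},
    {"name": "value", "type": "int"},
]

# A hypothetical record as it might arrive on the Kinesis Data Stream
record = json.loads('{"id": 1, "name": "Jane", "address": "1 Main St", "value": 42}')

# Every schema column must be present in the record with the expected type
assert all(isinstance(record[c["name"]], GLUE_TO_PY[c["type"]]) for c in field_schema)
```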
IAM Role
- A job execution role that has privileges to read the ETL script from the S3 bucket location, read from the Kinesis Stream, and execute the Glue Job. Permissions to write to a specific output location are not configured by the construct.
Output S3 Bucket
- An S3 bucket to store the output of the ETL transformation. This bucket is passed as an argument to the created Glue job so that the ETL script can use it to write output data
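Inside the ETL script itself, a Glue job typically reads such arguments with `awsglue.utils.getResolvedOptions`. The stdlib-only stand-in below sketches the idea without requiring the `awsglue` runtime; the argument name `output_path` and the bucket URL are assumptions for illustration, so check the generated job's default arguments for the actual name.

```python
import sys

def get_resolved_option(argv, name):
    """Minimal stand-in for awsglue.utils.getResolvedOptions for a single
    option: returns the value that follows --<name> in the argument list."""
    flag = "--" + name
    for i, arg in enumerate(argv):
        if arg == flag and i + 1 < len(argv):
            return argv[i + 1]
    raise KeyError("missing required argument: " + flag)

# Simulated invocation; a real Glue job would use sys.argv here.
simulated_argv = ["transform.py", "--output_path", "s3://my-etl-output-bucket/output/"]
output_path = get_resolved_option(simulated_argv, "output_path")
print(output_path)  # s3://my-etl-output-bucket/output/
```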
Architecture
Reference Implementation
A sample use case which uses this pattern is available under use_cases/aws-custom-glue-etl.
© Copyright 2020 Amazon.com, Inc. or its affiliates. All Rights Reserved.