CDK constructs for defining an interaction between an Amazon Kinesis Data Firehose delivery stream, an Amazon S3 bucket, and an Amazon Kinesis Data Analytics application.

Project description

aws-kinesisfirehose-s3-and-kinesisanalytics module

---

Stability: Experimental

All classes are under active development and subject to non-backward compatible changes or removal in any future version. These are not subject to the Semantic Versioning model. This means that while you may use them, you may need to update your source code when upgrading to a newer version of this package.


Reference Documentation: https://docs.aws.amazon.com/solutions/latest/constructs/
Language Package
Python aws_solutions_constructs.aws_kinesisfirehose_s3_and_kinesisanalytics
Typescript @aws-solutions-constructs/aws-kinesisfirehose-s3-and-kinesisanalytics
Java software.amazon.awsconstructs.services.kinesisfirehoses3kinesisanalytics

This AWS Solutions Construct implements an Amazon Kinesis Firehose delivery stream connected to an Amazon S3 bucket, and an Amazon Kinesis Analytics application.

Here is a minimal deployable pattern definition in Typescript:

import { KinesisFirehoseToAnalyticsAndS3 } from '@aws-solutions-constructs/aws-kinesisfirehose-s3-and-kinesisanalytics';

new KinesisFirehoseToAnalyticsAndS3(this, 'FirehoseToS3AndAnalyticsPattern', {
    kinesisAnalyticsProps: {
        inputs: [{
            inputSchema: {
                recordColumns: [{
                    name: 'ticker_symbol',
                    sqlType: 'VARCHAR(4)',
                    mapping: '$.ticker_symbol'
                }, {
                    name: 'sector',
                    sqlType: 'VARCHAR(16)',
                    mapping: '$.sector'
                }, {
                    name: 'change',
                    sqlType: 'REAL',
                    mapping: '$.change'
                }, {
                    name: 'price',
                    sqlType: 'REAL',
                    mapping: '$.price'
                }],
                recordFormat: {
                    recordFormatType: 'JSON'
                },
                recordEncoding: 'UTF-8'
            },
            namePrefix: 'SOURCE_SQL_STREAM'
        }]
    }
});

Initializer

new KinesisFirehoseToAnalyticsAndS3(scope: Construct, id: string, props: KinesisFirehoseToAnalyticsAndS3Props);

Parameters

Pattern Construct Props

Name Type Description
kinesisFirehoseProps? kinesisFirehose.CfnDeliveryStreamProps Optional user-provided props to override the default props for the Kinesis Firehose delivery stream.
kinesisAnalyticsProps? kinesisAnalytics.CfnApplicationProps Optional user-provided props to override the default props for the Kinesis Analytics application.
existingBucketObj? s3.IBucket Existing instance of S3 Bucket object. If this is provided, then also providing bucketProps is an error.
bucketProps? s3.BucketProps Optional user-provided props to override the default props for the S3 Bucket.
logGroupProps? logs.LogGroupProps Optional user-provided props to override the default props for the CloudWatch Logs LogGroup.
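For instance, the delivery stream defaults can be adjusted through kinesisFirehoseProps. A minimal sketch, assuming the CfnDeliveryStreamProps property names from the CDK API (the buffering values are illustrative, and the construct merges these overrides with its defaults):

```typescript
import { KinesisFirehoseToAnalyticsAndS3 } from '@aws-solutions-constructs/aws-kinesisfirehose-s3-and-kinesisanalytics';

new KinesisFirehoseToAnalyticsAndS3(this, 'CustomizedFirehosePattern', {
    kinesisAnalyticsProps: {
        inputs: [ /* as in the minimal example above */ ]
    },
    kinesisFirehoseProps: {
        extendedS3DestinationConfiguration: {
            bufferingHints: {
                intervalInSeconds: 60,  // illustrative buffering override
                sizeInMBs: 5
            }
        }
    }
});
```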

Pattern Properties

Name Type Description
kinesisAnalytics kinesisAnalytics.CfnApplication Returns an instance of the Kinesis Analytics application created by the pattern.
kinesisFirehose kinesisFirehose.CfnDeliveryStream Returns an instance of the Kinesis Firehose delivery stream created by the pattern.
kinesisFirehoseRole iam.Role Returns an instance of the iam.Role created by the construct for the Kinesis Data Firehose delivery stream.
kinesisFirehoseLogGroup logs.LogGroup Returns an instance of the LogGroup created by the construct for the Kinesis Data Firehose delivery stream.
s3Bucket? s3.Bucket Returns an instance of the S3 bucket created by the pattern.
s3LoggingBucket? s3.Bucket Returns an instance of s3.Bucket created by the construct as the logging bucket for the primary bucket.
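These properties let you reference the created resources after construction. A sketch, assuming CfnOutput from the CDK core library (the output names are illustrative):

```typescript
import * as cdk from '@aws-cdk/core';
import { KinesisFirehoseToAnalyticsAndS3 } from '@aws-solutions-constructs/aws-kinesisfirehose-s3-and-kinesisanalytics';

const pattern = new KinesisFirehoseToAnalyticsAndS3(this, 'Pattern', {
    kinesisAnalyticsProps: {
        inputs: [ /* as in the minimal example above */ ]
    }
});

// Surface the delivery stream name as a stack output.
new cdk.CfnOutput(this, 'DeliveryStreamName', { value: pattern.kinesisFirehose.ref });

// s3Bucket is optional: it is undefined when existingBucketObj was supplied.
if (pattern.s3Bucket) {
    new cdk.CfnOutput(this, 'BucketName', { value: pattern.s3Bucket.bucketName });
}
```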

Default settings

An out-of-the-box implementation of the construct without any overrides will set the following defaults:

Amazon Kinesis Firehose

  • Enable CloudWatch logging for Kinesis Firehose
  • Configure least privilege access IAM role for Amazon Kinesis Firehose

Amazon S3 Bucket

  • Configure access logging for the S3 Bucket
  • Enable server-side encryption for the S3 Bucket using an AWS managed KMS key
  • Enforce encryption of data in transit
  • Turn on versioning for the S3 Bucket
  • Don't allow public access for the S3 Bucket
  • Retain the S3 Bucket when deleting the CloudFormation stack
  • Apply a lifecycle rule to move noncurrent object versions to Glacier storage after 90 days

Amazon Kinesis Data Analytics

  • Configure least privilege access IAM role for Amazon Kinesis Analytics
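Any of these defaults can be overridden through the corresponding props. A sketch for a development stack, assuming RemovalPolicy from the CDK core library (DESTROY is illustrative and unsafe for production data):

```typescript
import * as cdk from '@aws-cdk/core';
import { KinesisFirehoseToAnalyticsAndS3 } from '@aws-solutions-constructs/aws-kinesisfirehose-s3-and-kinesisanalytics';

new KinesisFirehoseToAnalyticsAndS3(this, 'DevPattern', {
    kinesisAnalyticsProps: {
        inputs: [ /* as in the minimal example above */ ]
    },
    bucketProps: {
        removalPolicy: cdk.RemovalPolicy.DESTROY  // overrides the retain-on-delete default; dev only
    }
});
```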

Architecture

Architecture Diagram


© Copyright 2021 Amazon.com, Inc. or its affiliates. All Rights Reserved.


File details

Details for the file aws-solutions-constructs.aws-kinesis-firehose-s3-kinesis-analytics-1.101.0.tar.gz.

File metadata

File hashes

Hashes for aws-solutions-constructs.aws-kinesis-firehose-s3-kinesis-analytics-1.101.0.tar.gz
Algorithm Hash digest
SHA256 30eaee5ddbaf8d37e6d76606b60834eeabaa1e9383720ef6b4e96d2e66d9a5bb
MD5 e9bc9286bfd7aae63b4d44e6d079388b
BLAKE2b-256 34cbcefe2369aa88eb6676a7d1c869a013fe46615110aed799ffe54539547630


File details

Details for the file aws_solutions_constructs.aws_kinesis_firehose_s3_kinesis_analytics-1.101.0-py3-none-any.whl.

File metadata

File hashes

Hashes for aws_solutions_constructs.aws_kinesis_firehose_s3_kinesis_analytics-1.101.0-py3-none-any.whl
Algorithm Hash digest
SHA256 95a4050456cab4cb3ce9edcabe5fd9aed85b3cce1249dde71a3ace1a1fea6250
MD5 969625a922199724702f7ba66227be89
BLAKE2b-256 a1d7d82d8d1140b8500855b2c763fcbf328c94521d4fb87267a156515d8328da

