
CDK constructs for defining an interaction between an Amazon Kinesis Data Firehose delivery stream and (1) an Amazon S3 bucket, and (2) an Amazon Kinesis Data Analytics application.

Project description

aws-kinesisfirehose-s3-and-kinesisanalytics module

---

Stability: Experimental

All classes are under active development and subject to non-backward compatible changes or removal in any future version. These are not subject to the Semantic Versioning model. This means that while you may use them, you may need to update your source code when upgrading to a newer version of this package.


Reference Documentation: https://docs.aws.amazon.com/solutions/latest/constructs/
| Language | Package |
|----------|---------|
| Python | aws_solutions_constructs.aws_kinesisfirehose_s3_and_kinesisanalytics |
| TypeScript | @aws-solutions-constructs/aws-kinesisfirehose-s3-and-kinesisanalytics |
| Java | software.amazon.awsconstructs.services.kinesisfirehoses3kinesisanalytics |

This AWS Solutions Construct implements an Amazon Kinesis Firehose delivery stream connected to an Amazon S3 bucket, and an Amazon Kinesis Analytics application.

Here is a minimal deployable pattern definition in TypeScript:

```typescript
import { KinesisFirehoseToAnalyticsAndS3 } from '@aws-solutions-constructs/aws-kinesisfirehose-s3-and-kinesisanalytics';

new KinesisFirehoseToAnalyticsAndS3(this, 'FirehoseToS3AndAnalyticsPattern', {
    kinesisAnalyticsProps: {
        inputs: [{
            inputSchema: {
                recordColumns: [{
                    name: 'ticker_symbol',
                    sqlType: 'VARCHAR(4)',
                    mapping: '$.ticker_symbol'
                }, {
                    name: 'sector',
                    sqlType: 'VARCHAR(16)',
                    mapping: '$.sector'
                }, {
                    name: 'change',
                    sqlType: 'REAL',
                    mapping: '$.change'
                }, {
                    name: 'price',
                    sqlType: 'REAL',
                    mapping: '$.price'
                }],
                recordFormat: {
                    recordFormatType: 'JSON'
                },
                recordEncoding: 'UTF-8'
            },
            namePrefix: 'SOURCE_SQL_STREAM'
        }]
    }
});
```
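
The `recordColumns` entries above bind JSON fields to SQL columns via simple JSONPath expressions of the form `$.field`. As a standalone illustration (the `resolveMapping` helper is hypothetical and not part of the construct or the Kinesis Analytics runtime), here is how such mappings resolve against an incoming record:

```typescript
// Illustrative sketch: how '$.field' record-column mappings from the
// inputSchema resolve against a parsed JSON record. Hypothetical helper,
// not part of the construct.
interface RecordColumn {
  name: string;
  sqlType: string;
  mapping: string;
}

// The same columns defined in the minimal example above.
const columns: RecordColumn[] = [
  { name: 'ticker_symbol', sqlType: 'VARCHAR(4)', mapping: '$.ticker_symbol' },
  { name: 'sector', sqlType: 'VARCHAR(16)', mapping: '$.sector' },
  { name: 'change', sqlType: 'REAL', mapping: '$.change' },
  { name: 'price', sqlType: 'REAL', mapping: '$.price' },
];

// Resolve a top-level '$.key' mapping against a parsed JSON record.
function resolveMapping(record: Record<string, unknown>, mapping: string): unknown {
  const key = mapping.replace(/^\$\./, '');
  return record[key];
}

// A sample record as Firehose might deliver it to the application's input stream.
const sample = { ticker_symbol: 'AMZN', sector: 'TECH', change: 1.5, price: 183.2 };

// Build one row of the SOURCE_SQL_STREAM from the sample record.
const row: Record<string, unknown> = {};
for (const c of columns) {
  row[c.name] = resolveMapping(sample, c.mapping);
}
console.log(row);
```

In the deployed pattern, this resolution is performed by Kinesis Data Analytics itself according to the `inputSchema`; the sketch only shows what the mapping expressions mean.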

Initializer

```typescript
new KinesisFirehoseToAnalyticsAndS3(scope: Construct, id: string, props: KinesisFirehoseToAnalyticsAndS3Props);
```

Parameters

  • scope Construct
  • id string
  • props KinesisFirehoseToAnalyticsAndS3Props

Pattern Construct Props

| Name | Type | Description |
|------|------|-------------|
| kinesisFirehoseProps? | kinesisFirehose.CfnDeliveryStreamProps | Optional user-provided props to override the default props for the Kinesis Firehose delivery stream. |
| kinesisAnalyticsProps? | kinesisAnalytics.CfnApplicationProps | Optional user-provided props to override the default props for the Kinesis Analytics application. |
| existingBucketObj? | s3.IBucket | Existing instance of an S3 bucket; if this is provided, bucketProps is ignored. |
| bucketProps? | s3.BucketProps | Optional user-provided props to override the default props for the S3 bucket. |
| logGroupProps? | logs.LogGroupProps | Optional user-provided props to override the default props for the CloudWatch Logs log group. |
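
As a hedged sketch of how these optional props combine (the prop names come from the table above; the specific override values are illustrative assumptions, not documented defaults), a deployment that customizes the bucket and log group might look like:

```typescript
// Sketch only: overriding selected defaults via the optional props.
// logs.RetentionDays comes from the AWS CDK logs module; the chosen
// values here are illustrative, not the construct's defaults.
new KinesisFirehoseToAnalyticsAndS3(this, 'FirehoseToS3AndAnalyticsPattern', {
  kinesisAnalyticsProps: {
    // ... input schema as in the minimal example ...
  },
  bucketProps: {
    // Override default bucket behavior here, e.g. lifecycle rules.
  },
  logGroupProps: {
    retention: logs.RetentionDays.ONE_WEEK,
  },
});
```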

Pattern Properties

| Name | Type | Description |
|------|------|-------------|
| kinesisAnalytics | kinesisAnalytics.CfnApplication | Returns an instance of the Kinesis Analytics application created by the pattern. |
| kinesisFirehose | kinesisFirehose.CfnDeliveryStream | Returns an instance of the Kinesis Firehose delivery stream created by the pattern. |
| kinesisFirehoseRole | iam.Role | Returns an instance of the iam.Role created by the construct for the Kinesis Data Firehose delivery stream. |
| kinesisFirehoseLogGroup | logs.LogGroup | Returns an instance of the logs.LogGroup created by the construct for the Kinesis Data Firehose delivery stream. |
| s3Bucket? | s3.Bucket | Returns an instance of the S3 bucket created by the pattern. |
| s3LoggingBucket? | s3.Bucket | Returns an instance of the s3.Bucket created by the construct as the logging bucket for the primary bucket. |

Default settings

An out-of-the-box deployment of the construct, without any overrides, sets the following defaults:

Amazon Kinesis Firehose

  • Enable CloudWatch logging for Kinesis Firehose
  • Configure least privilege access IAM role for Amazon Kinesis Firehose

Amazon S3 Bucket

  • Configure Access logging for S3 Bucket
  • Enable server-side encryption for S3 Bucket using AWS managed KMS Key
  • Enforce encryption of data in transit
  • Turn on the versioning for S3 Bucket
  • Don't allow public access for S3 Bucket
  • Retain the S3 Bucket when deleting the CloudFormation stack
  • Apply a lifecycle rule to move noncurrent object versions to Glacier storage after 90 days

Amazon Kinesis Data Analytics

  • Configure least privilege access IAM role for Amazon Kinesis Analytics

Architecture

Architecture Diagram


© Copyright 2021 Amazon.com, Inc. or its affiliates. All Rights Reserved.


Download files

Download the file for your platform.

Source Distribution

Built Distribution


File details

Details for the file aws-solutions-constructs.aws-kinesis-firehose-s3-kinesis-analytics-1.98.0.tar.gz.

File metadata

File hashes

Hashes for aws-solutions-constructs.aws-kinesis-firehose-s3-kinesis-analytics-1.98.0.tar.gz
Algorithm Hash digest
SHA256 3b412b78f50ef91ab0cc90c5b8a6ef20f86a8e20f4be7bcf012061954b29eba1
MD5 e680f59d9956812d51602cb570081d80
BLAKE2b-256 e8c17e72e74ad173a4e59e7e06a78f05a7c5d90374ff54e4281d9b2f75759f62


File details

Details for the file aws_solutions_constructs.aws_kinesis_firehose_s3_kinesis_analytics-1.98.0-py3-none-any.whl.

File metadata

File hashes

Hashes for aws_solutions_constructs.aws_kinesis_firehose_s3_kinesis_analytics-1.98.0-py3-none-any.whl
Algorithm Hash digest
SHA256 5ce2f3696cd0de1ad6ad3cee8f2f1c995891a79936d77f1a3cf4242ad2e88898
MD5 9647643b342a3833e66783d8fdb0757d
BLAKE2b-256 590bd7aa40024e6a49dec74a28bd546e13f482d40e16c9f49bcb3c4ca200eebd

