A CDK Construct Library for Kinesis Analytics Flink applications
Kinesis Analytics Flink
The APIs of higher level constructs in this module are experimental and under active development. They are subject to non-backward compatible changes or removal in any future version. These are not subject to the Semantic Versioning model and breaking changes will be announced in the release notes. This means that while you may use them, you may need to update your source code when upgrading to a newer version of this package.
This package provides constructs for creating Kinesis Analytics Flink applications. To learn more about using managed Flink applications, see the AWS developer guide.
Creating Flink Applications
To create a new Flink application, use the Application construct:
import os.path as path

import aws_cdk.aws_cloudwatch as cloudwatch
import aws_cdk as core
import aws_cdk.integ_tests_alpha as integ
import aws_cdk.aws_kinesisanalytics_flink_alpha as flink

app = core.App()
stack = core.Stack(app, "FlinkAppTest")

flink_runtimes = [
    flink.Runtime.FLINK_1_6,
    flink.Runtime.FLINK_1_8,
    flink.Runtime.FLINK_1_11,
    flink.Runtime.FLINK_1_13,
    flink.Runtime.FLINK_1_15,
    flink.Runtime.FLINK_1_18,
    flink.Runtime.FLINK_1_19,
    flink.Runtime.FLINK_1_20,
]

for runtime in flink_runtimes:
    # Create one application per runtime from a local code asset
    flink_app = flink.Application(stack, f"App-{runtime.value}",
        code=flink.ApplicationCode.from_asset(path.join(path.dirname(__file__), "code-asset")),
        runtime=runtime
    )

    # Alarm on the application's fullRestarts metric
    cloudwatch.Alarm(stack, f"Alarm-{runtime.value}",
        metric=flink_app.metric_full_restarts(),
        evaluation_periods=1,
        threshold=3
    )

integ.IntegTest(app, "ApplicationTest",
    test_cases=[stack]
)

app.synth()
The code property can use fromAsset as shown above to reference a local jar file, or fromBucket to reference a file in S3.
import os.path as path
import aws_cdk.aws_s3_assets as assets
import aws_cdk as core
import aws_cdk.aws_kinesisanalytics_flink_alpha as flink
app = core.App()
stack = core.Stack(app, "FlinkAppCodeFromBucketTest")
asset = assets.Asset(stack, "CodeAsset",
path=path.join(path.dirname(__file__), "code-asset")
)
bucket = asset.bucket
file_key = asset.s3_object_key
flink.Application(stack, "App",
code=flink.ApplicationCode.from_bucket(bucket, file_key),
runtime=flink.Runtime.FLINK_1_19
)
app.synth()
The propertyGroups
property provides a way of passing arbitrary runtime
properties to your Flink application. You can use the
aws-kinesisanalytics-runtime library to retrieve these
properties.
# bucket: s3.Bucket
flink_app = flink.Application(self, "Application",
property_groups={
"FlinkApplicationProperties": {
"input_stream_name": "my-input-kinesis-stream",
"output_stream_name": "my-output-kinesis-stream"
}
},
# ...
runtime=flink.Runtime.FLINK_1_20,
code=flink.ApplicationCode.from_bucket(bucket, "my-app.jar")
)
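On the job side, the Java aws-kinesisanalytics-runtime library exposes these groups through KinesisAnalyticsRuntime.getApplicationProperties(). For a PyFlink job, a minimal sketch is shown below; it assumes the managed runtime makes the property groups available as a JSON file at /etc/flink/application_properties.json, which is an assumption about the runtime environment rather than part of this construct library:
# Hypothetical PyFlink-side helper; assumes the runtime writes property groups
# to /etc/flink/application_properties.json as a list of
# {"PropertyGroupId": ..., "PropertyMap": {...}} objects.
import json
import os

APPLICATION_PROPERTIES_PATH = "/etc/flink/application_properties.json"

def get_property_group(group_id):
    # Return the PropertyMap for one property group, or an empty dict.
    if not os.path.isfile(APPLICATION_PROPERTIES_PATH):
        return {}
    with open(APPLICATION_PROPERTIES_PATH) as f:
        groups = json.load(f)
    for group in groups:
        if group.get("PropertyGroupId") == group_id:
            return group.get("PropertyMap", {})
    return {}

props = get_property_group("FlinkApplicationProperties")
input_stream = props.get("input_stream_name")
output_stream = props.get("output_stream_name")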
Flink applications also have specific configuration for passing parameters when the Flink job starts. These include parameters for checkpointing, snapshotting, monitoring, and parallelism.
# bucket: s3.Bucket
flink_app = flink.Application(self, "Application",
code=flink.ApplicationCode.from_bucket(bucket, "my-app.jar"),
runtime=flink.Runtime.FLINK_1_20,
checkpointing_enabled=True, # default is true
checkpoint_interval=Duration.seconds(30), # default is 1 minute
min_pause_between_checkpoints=Duration.seconds(10), # default is 5 seconds
log_level=flink.LogLevel.ERROR, # default is INFO
metrics_level=flink.MetricsLevel.PARALLELISM, # default is APPLICATION
auto_scaling_enabled=False, # default is true
parallelism=32, # default is 1
parallelism_per_kpu=2, # default is 1
snapshots_enabled=False, # default is true
log_group=logs.LogGroup(self, "LogGroup")
)
Flink applications can optionally be deployed in a VPC:
# bucket: s3.Bucket
# vpc: ec2.Vpc
flink_app = flink.Application(self, "Application",
code=flink.ApplicationCode.from_bucket(bucket, "my-app.jar"),
runtime=flink.Runtime.FLINK_1_20,
vpc=vpc
)
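The construct creates an execution role for the application. If the running job needs to read other resources, such as a data bucket, the usual grant methods can target the application, assuming it exposes a grant principal the way other role-backed constructs do (an assumption, not something stated in this README):
# Hypothetical sketch: grant the application's execution role read access to a
# bucket. Assumes flink.Application is grantable (exposes a grant principal);
# verify against the construct's API before relying on this.
# bucket: s3.Bucket
# flink_app: flink.Application
bucket.grant_read(flink_app)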