A CDK Construct Library for Kinesis Analytics Flink applications
Project description
Kinesis Analytics Flink
---
The APIs of higher level constructs in this module are experimental and under active development. They are subject to non-backward compatible changes or removal in any future version. These are not subject to the Semantic Versioning model and breaking changes will be announced in the release notes. This means that while you may use them, you may need to update your source code when upgrading to a newer version of this package.
This package provides constructs for creating Kinesis Analytics Flink applications. To learn more about using managed Flink applications, see the AWS developer guide.
Creating Flink Applications
To create a new Flink application, use the Application
construct:
import os.path as path
import aws_cdk.aws_cloudwatch as cloudwatch
import aws_cdk as core
import aws_cdk.integ_tests_alpha as integ
import aws_cdk.aws_kinesisanalytics_flink_alpha as flink

app = core.App()
stack = core.Stack(app, "FlinkAppTest")

flink_runtimes = [
    flink.Runtime.FLINK_1_6,
    flink.Runtime.FLINK_1_8,
    flink.Runtime.FLINK_1_11,
    flink.Runtime.FLINK_1_13,
    flink.Runtime.FLINK_1_15,
    flink.Runtime.FLINK_1_18,
    flink.Runtime.FLINK_1_19,
    flink.Runtime.FLINK_1_20,
]

# Create one application per runtime version and alarm on its fullRestarts metric.
for runtime in flink_runtimes:
    flink_app = flink.Application(stack, f"App-{runtime.value}",
        code=flink.ApplicationCode.from_asset(path.join(path.dirname(__file__), "code-asset")),
        runtime=runtime
    )
    cloudwatch.Alarm(stack, f"Alarm-{runtime.value}",
        metric=flink_app.metric_full_restarts(),
        evaluation_periods=1,
        threshold=3
    )

integ.IntegTest(app, "ApplicationTest",
    test_cases=[stack]
)
The code property can use fromAsset, as shown above, to reference a local jar file, or fromBucket to reference a file in S3:
import os.path as path
import aws_cdk.aws_s3_assets as assets
import aws_cdk as core
import aws_cdk.aws_kinesisanalytics_flink_alpha as flink

app = core.App()
stack = core.Stack(app, "FlinkAppCodeFromBucketTest")

# Stage the local application jar as an S3 asset, then reference it by bucket and key.
asset = assets.Asset(stack, "CodeAsset",
    path=path.join(path.dirname(__file__), "code-asset")
)
bucket = asset.bucket
file_key = asset.s3_object_key

flink.Application(stack, "App",
    code=flink.ApplicationCode.from_bucket(bucket, file_key),
    runtime=flink.Runtime.FLINK_1_19
)

app.synth()
The propertyGroups
property provides a way of passing arbitrary runtime
properties to your Flink application. You can use the
aws-kinesisanalytics-runtime library to retrieve these
properties.
# bucket: s3.Bucket
flink_app = flink.Application(self, "Application",
property_groups={
"FlinkApplicationProperties": {
"input_stream_name": "my-input-kinesis-stream",
"output_stream_name": "my-output-kinesis-stream"
}
},
# ...
runtime=flink.Runtime.FLINK_1_20,
code=flink.ApplicationCode.from_bucket(bucket, "my-app.jar")
)
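On the job side there is no direct Python equivalent of the Java aws-kinesisanalytics-runtime library; AWS's Python samples instead read the runtime-properties JSON that the managed service places in the job environment. The following is a minimal sketch of that consumer side, assuming the /etc/flink/application_properties.json location used in those samples (verify the path and file shape for your runtime version):
import json

# Location where AWS's Python samples expect Managed Flink to expose runtime
# properties; this path is an assumption taken from those samples.
APPLICATION_PROPERTIES_PATH = "/etc/flink/application_properties.json"

def get_property_group(group_id):
    # The file is assumed to hold a list of
    # {"PropertyGroupId": ..., "PropertyMap": {...}} entries.
    with open(APPLICATION_PROPERTIES_PATH) as f:
        for group in json.load(f):
            if group["PropertyGroupId"] == group_id:
                return group["PropertyMap"]
    return {}

# Matches the property group defined in the CDK snippet above.
props = get_property_group("FlinkApplicationProperties")
input_stream = props.get("input_stream_name")
output_stream = props.get("output_stream_name")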
Flink applications also have specific configuration for passing parameters when the Flink job starts. These include parameters for checkpointing, snapshotting, monitoring, and parallelism.
# bucket: s3.Bucket
flink_app = flink.Application(self, "Application",
code=flink.ApplicationCode.from_bucket(bucket, "my-app.jar"),
runtime=flink.Runtime.FLINK_1_20,
checkpointing_enabled=True, # default is true
checkpoint_interval=Duration.seconds(30), # default is 1 minute
min_pause_between_checkpoints=Duration.seconds(10), # default is 5 seconds
log_level=flink.LogLevel.ERROR, # default is INFO
metrics_level=flink.MetricsLevel.PARALLELISM, # default is APPLICATION
auto_scaling_enabled=False, # default is true
parallelism=32, # default is 1
parallelism_per_kpu=2, # default is 1
snapshots_enabled=False, # default is true
log_group=logs.LogGroup(self, "LogGroup")
)
Flink applications can optionally be deployed in a VPC:
# bucket: s3.Bucket
# vpc: ec2.Vpc
flink_app = flink.Application(self, "Application",
code=flink.ApplicationCode.from_bucket(bucket, "my-app.jar"),
runtime=flink.Runtime.FLINK_1_20,
vpc=vpc
)
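The VPC does not have to be defined in the same stack. As a minimal sketch, an existing VPC can be imported with the standard aws_cdk.aws_ec2 lookup and passed in (the VPC ID below is a placeholder); note that Vpc.from_lookup requires an explicit account and region on the stack's env:
import aws_cdk.aws_ec2 as ec2
import aws_cdk.aws_kinesisanalytics_flink_alpha as flink

# bucket: s3.Bucket

# Look up an existing VPC by ID (placeholder value) instead of creating one in this stack.
vpc = ec2.Vpc.from_lookup(self, "ImportedVpc", vpc_id="vpc-0123456789abcdef0")

flink.Application(self, "VpcApplication",
    code=flink.ApplicationCode.from_bucket(bucket, "my-app.jar"),
    runtime=flink.Runtime.FLINK_1_20,
    vpc=vpc
)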
File details
Details for the file aws_cdk_aws_kinesisanalytics_flink_alpha-2.160.0a0.tar.gz.
File metadata
- Download URL: aws_cdk_aws_kinesisanalytics_flink_alpha-2.160.0a0.tar.gz
- Upload date:
- Size: 105.3 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.11.9
File hashes
Algorithm | Hash digest
---|---
SHA256 | 82ac4018f3e93e12abb4f2deca45df02d2e66342bc9c024d76a5fec4e156c00d
MD5 | 451e5205cae8d763725a1461be8af4fa
BLAKE2b-256 | b2f3022ec20191581fb3e28e2356a40e87e777721f551ed4f4317eda513dd963
File details
Details for the file aws_cdk.aws_kinesisanalytics_flink_alpha-2.160.0a0-py3-none-any.whl.
File metadata
- Download URL: aws_cdk.aws_kinesisanalytics_flink_alpha-2.160.0a0-py3-none-any.whl
- Upload date:
- Size: 104.8 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.11.9
File hashes
Algorithm | Hash digest
---|---
SHA256 | 82893038ef6a7b8bbfc7f3dab930df81bec5d2a241bc3edb9efad626c2177fc1
MD5 | 74b34990fd1312177b4aaf92dbc8f77b
BLAKE2b-256 | d504aed660922aaf4245faacea878bf5cce049f76c309dc573b0d67b8c037119