A CDK Construct Library for Kinesis Analytics Flink applications
Kinesis Analytics Flink
The APIs of higher level constructs in this module are experimental and under active development. They are subject to non-backward compatible changes or removal in any future version. These are not subject to the Semantic Versioning model and breaking changes will be announced in the release notes. This means that while you may use them, you may need to update your source code when upgrading to a newer version of this package.
This package provides constructs for creating Kinesis Analytics Flink applications. To learn more about using managed Flink applications, see the AWS developer guide.
Creating Flink Applications
To create a new Flink application, use the Application construct:
import os.path
import aws_cdk.aws_cloudwatch as cloudwatch
import aws_cdk as core
import aws_cdk.integ_tests_alpha as integ
import aws_cdk.aws_kinesisanalytics_flink_alpha as flink

app = core.App()
stack = core.Stack(app, "FlinkAppTest")

flink_runtimes = [
    flink.Runtime.FLINK_1_6,
    flink.Runtime.FLINK_1_8,
    flink.Runtime.FLINK_1_11,
    flink.Runtime.FLINK_1_13,
    flink.Runtime.FLINK_1_15,
    flink.Runtime.FLINK_1_18,
    flink.Runtime.FLINK_1_19,
    flink.Runtime.FLINK_1_20,
]

# Create one application per runtime and alarm on its fullRestarts metric
for runtime in flink_runtimes:
    flink_app = flink.Application(stack, f"App-{runtime.value}",
        code=flink.ApplicationCode.from_asset(os.path.join(os.path.dirname(__file__), "code-asset")),
        runtime=runtime
    )
    cloudwatch.Alarm(stack, f"Alarm-{runtime.value}",
        metric=flink_app.metric_full_restarts(),
        evaluation_periods=1,
        threshold=3
    )

integ.IntegTest(app, "ApplicationTest",
    test_cases=[stack]
)
The code property can use fromAsset as shown above to reference a local jar file, or fromBucket to reference a file in S3.
import os.path
import aws_cdk.aws_s3_assets as assets
import aws_cdk as core
import aws_cdk.aws_kinesisanalytics_flink_alpha as flink

app = core.App()
stack = core.Stack(app, "FlinkAppCodeFromBucketTest")

asset = assets.Asset(stack, "CodeAsset",
    path=os.path.join(os.path.dirname(__file__), "code-asset")
)
bucket = asset.bucket
file_key = asset.s3_object_key

flink.Application(stack, "App",
    code=flink.ApplicationCode.from_bucket(bucket, file_key),
    runtime=flink.Runtime.FLINK_1_19
)

app.synth()
The propertyGroups property provides a way of passing arbitrary runtime properties to your Flink application. You can use the aws-kinesisanalytics-runtime library to retrieve these properties from within your Flink job.
# bucket: s3.Bucket
flink_app = flink.Application(self, "Application",
property_groups={
"FlinkApplicationProperties": {
"input_stream_name": "my-input-kinesis-stream",
"output_stream_name": "my-output-kinesis-stream"
}
},
# ...
runtime=flink.Runtime.FLINK_1_20,
code=flink.ApplicationCode.from_bucket(bucket, "my-app.jar")
)
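These property groups are delivered to the running job by the service. For a Python (PyFlink) job, a minimal sketch of reading them is shown below; it assumes the /etc/flink/application_properties.json location used in AWS's PyFlink examples, and the group name FlinkApplicationProperties simply matches the snippet above.

import json

# Sketch only: Managed Service for Apache Flink conventionally exposes the
# configured property groups to Python jobs at this path (assumption).
APPLICATION_PROPERTIES_PATH = "/etc/flink/application_properties.json"

def get_property_group(group_id):
    # The file holds a list of {"PropertyGroupId": ..., "PropertyMap": {...}} entries.
    with open(APPLICATION_PROPERTIES_PATH) as f:
        for group in json.load(f):
            if group["PropertyGroupId"] == group_id:
                return group["PropertyMap"]
    return {}

props = get_property_group("FlinkApplicationProperties")
input_stream = props.get("input_stream_name")
output_stream = props.get("output_stream_name")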
Flink applications also have specific configuration for passing parameters when the Flink job starts. These include parameters for checkpointing, snapshotting, monitoring, and parallelism.
# bucket: s3.Bucket
flink_app = flink.Application(self, "Application",
code=flink.ApplicationCode.from_bucket(bucket, "my-app.jar"),
runtime=flink.Runtime.FLINK_1_20,
checkpointing_enabled=True, # default is true
checkpoint_interval=Duration.seconds(30), # default is 1 minute
min_pause_between_checkpoints=Duration.seconds(10), # default is 5 seconds
log_level=flink.LogLevel.ERROR, # default is INFO
metrics_level=flink.MetricsLevel.PARALLELISM, # default is APPLICATION
auto_scaling_enabled=False, # default is true
parallelism=32, # default is 1
parallelism_per_kpu=2, # default is 1
snapshots_enabled=False, # default is true
log_group=logs.LogGroup(self, "LogGroup")
)
Flink applications can optionally be deployed in a VPC:
# bucket: s3.Bucket
# vpc: ec2.Vpc
flink_app = flink.Application(self, "Application",
code=flink.ApplicationCode.from_bucket(bucket, "my-app.jar"),
runtime=flink.Runtime.FLINK_1_20,
vpc=vpc
)
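The vpc value above can be any ec2.IVpc. As a sketch (names and IDs are illustrative), you could create one in the same stack or import an existing one:

import aws_cdk.aws_ec2 as ec2

# Create a new VPC in the same stack for the Flink application...
vpc = ec2.Vpc(self, "FlinkVpc", max_azs=2)

# ...or reference an existing VPC by ID (the ID here is a placeholder;
# Vpc.from_lookup requires an explicit account/region on the stack's env).
existing_vpc = ec2.Vpc.from_lookup(self, "ExistingVpc", vpc_id="vpc-1234567890abcdef0")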