@aws-prototyping-sdk/pipeline

Project description

BREAKING CHANGES (pre-release)

  • > v0.16.1: Refactored PDKPipeline to be a construct, so accessing CodePipeline methods now requires going through the public codePipeline property, e.g.: pdkPipeline.codePipeline.XXX

The pipeline module vends an extension to CDK's CodePipeline construct, named PDKPipeline. It additionally creates a CodeCommit repository and, by default, is configured to build the project assuming an nx-monorepo project structure (although this can be changed). A SonarQube scanner can also be configured to trigger a scan whenever the synth build job completes successfully. This scanner is non-blocking and as such is not instrumented as part of the pipeline.

The architecture for the PDKPipeline is as follows:

CodeCommit repository -> CodePipeline
                             |-> EventBridge Rule (On Build Succeeded) -> CodeBuild (Sonar Scan)
                             |-> Secret (sonarqube token)

This module additionally vends multiple Projen Projects, one for each of the supported languages. These projects aim to bootstrap your project by providing sample code which uses the PDKPipeline construct.

For example, in .projenrc.ts:

new PDKPipelineTsProject({
  cdkVersion: "2.1.0",
  defaultReleaseBranch: "mainline",
  devDeps: ["aws-prototyping-sdk"],
  name: "my-pipeline",
});

This will generate a TypeScript package containing CDK boilerplate for a pipeline stack (which instantiates PDKPipeline), and set up a Dev stage with an Application Stage containing an empty ApplicationStack (to be implemented). Once this package is synthesized, you can run npx projen and projen will synthesize your CloudFormation templates.

Alternatively, you can initialize a project using the CLI (in an empty directory) for each of the supported languages as follows:

# TypeScript
npx projen new --from @aws-prototyping-sdk/pdk-pipeline-ts
# Python
npx projen new --from @aws-prototyping-sdk/pdk-pipeline-py
# Java
npx projen new --from @aws-prototyping-sdk/pdk-pipeline-java

CDK Nag

To keep CDK Nag happy, make sure you build the pipeline before synthesizing the app, as per https://github.com/aws/aws-cdk/issues/18440.
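In practice this means calling buildPipeline() before app.synth() in your CDK entry point, so CDK Nag can see the fully built pipeline resources. A sketch, assuming a pipelineStack exposing the PDKPipeline as a pipeline property (the exact call site depends on your app):

// pipeline.ts - after all stages have been added
pipelineStack.pipeline.buildPipeline(); // force pipeline assembly before synth
app.synth();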

Multi-branch pipeline management

If your team follows GitHub flow, the pipeline module can optionally help you create independent environments to test and validate changes before merging. When you create a new branch, a new pipeline stack (and any stages you configure) is created automatically. Once you have finished testing and delete the branch, the stacks created in that branch's environment are automatically cleaned up.

The feature is enabled and configured by setting the branchNamePrefixes property of the PDKPipeline construct. Any branch whose name matches one of these prefixes will get its own pipeline and stacks.

When your PDKPipeline is run, the current branch will be available in the BRANCH environment variable. You can use this to give unique names to the stacks and stages created by that branch. You can also enable and disable stages based on the branch name. For example, you may want the PipelineStack and Dev stage to get created for any branch and only create the Prod stage in the default branch.
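To illustrate the naming idea only (this is not the PDKPipeline source; the real API is PDKPipeline.getBranchPrefix), a branch-based prefix helper might sanitize the branch name into something CloudFormation stack names allow. The branchPrefix function name and the "mainline" default branch are assumptions for this sketch:

```typescript
// Hypothetical sketch: derive a stack-name prefix from a branch name.
// The default branch gets an empty prefix so its stack names stay stable;
// other branches get a sanitized, capitalized prefix like "Featurelogin-".
function branchPrefix(branch?: string): string {
  if (!branch || branch === "mainline") {
    return "";
  }
  // Strip characters that CloudFormation stack names disallow (e.g. "/").
  const sanitized = branch.replace(/[^A-Za-z0-9]/g, "");
  return sanitized.charAt(0).toUpperCase() + sanitized.slice(1) + "-";
}
```

With this, a branch named feature/login would yield stacks like Featurelogin-PipelineStack, while the default branch keeps the plain PipelineStack name.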

PDKPipeline configuration

Example: All Branches

pipeline-stack.ts

this.pipeline = new PDKPipeline(this, "ApplicationPipeline", {
  primarySynthDirectory: "packages/backend/cdk.out",
  repositoryName: this.node.tryGetContext("repositoryName") || "monorepo",
  branchNamePrefixes: PDKPipeline.ALL_BRANCHES,
});

Example: Branches starting with "feature/" or "fix/"

pipeline-stack.ts

this.pipeline = new PDKPipeline(this, "ApplicationPipeline", {
  primarySynthDirectory: "packages/backend/cdk.out",
  repositoryName: this.node.tryGetContext("repositoryName") || "monorepo",
  branchNamePrefixes: ["feature/", "fix/"],
});

Pipeline Definition

When you define your pipeline, you decide which stages get created for a given branch and how to name your stacks uniquely. The PipelineStack itself must always be included.

pipeline.ts

const branchPrefix = PDKPipeline.getBranchPrefix({ node: app.node });

const pipelineStack = new PipelineStack(app, branchPrefix + "PipelineStack", {
  env: {
    account: process.env.CDK_DEFAULT_ACCOUNT!,
    region: process.env.CDK_DEFAULT_REGION!,
  },
});

const devStage = new ApplicationStage(app, branchPrefix + "Dev", {
  env: {
    account: process.env.CDK_DEFAULT_ACCOUNT!, // Replace with Dev account
    region: process.env.CDK_DEFAULT_REGION!, // Replace with Dev region
  },
});

pipelineStack.pipeline.addStage(devStage);

// Only create the Prod stage in the default branch
if (PDKPipeline.isDefaultBranch({ node: app.node })) {
  const prodStage = new ApplicationStage(app, "Prod", {
    env: {
      account: process.env.CDK_DEFAULT_ACCOUNT!, // Replace with Prod account
      region: process.env.CDK_DEFAULT_REGION!, // Replace with Prod region
    },
  });

  pipelineStack.pipeline.addStage(prodStage);
}
