
@aws-prototyping-sdk/pipeline


BREAKING CHANGES (pre-release)

  • > v0.16.1: PDKPipeline was refactored into a construct, so CodePipeline methods must now be accessed via the public codePipeline property, e.g.: pdkPipeline.codePipeline.XXX

The pipeline module vends an extension to CDK's CodePipeline construct, named PDKPipeline. It additionally creates a CodeCommit repository and by default is configured to build the project assuming nx-monorepo is being used (although this can be changed). A Sonarqube Scanner can also be configured to trigger a scan whenever the synth build job completes successfully. This scanner is non-blocking and as such is not instrumented as part of the pipeline.

The architecture for the PDKPipeline is as follows:

CodeCommit repository -> CodePipeline
                             |-> EventBridge Rule (On Build Succeeded) -> CodeBuild (Sonar Scan)
                             |-> Secret (sonarqube token)

This module additionally vends multiple Projen Projects, one for each of the supported languages. These projects aim to bootstrap your project by providing sample code which uses the PDKPipeline construct.

For example, in .projenrc.ts:

new PDKPipelineTsProject({
  cdkVersion: "2.1.0",
  defaultReleaseBranch: "mainline",
  devDeps: ["aws-prototyping-sdk"],
  name: "my-pipeline",
});

This will generate a TypeScript package containing CDK boilerplate for a pipeline stack (which instantiates PDKPipeline) and set up a Dev stage with an Application Stage containing an empty ApplicationStack (to be implemented). Once this package is synthesized, run npx projen and projen will synthesize your CloudFormation templates.

Alternatively, you can initialize a project using the CLI (in an empty directory) for each of the supported languages as follows:

# TypeScript
npx projen new --from @aws-prototyping-sdk/pdk-pipeline-ts
# Python
npx projen new --from @aws-prototyping-sdk/pdk-pipeline-py
# Java
npx projen new --from @aws-prototyping-sdk/pdk-pipeline-java

CDK Nag

To keep CDK Nag happy, make sure you build the pipeline before synthesizing, as described in https://github.com/aws/aws-cdk/issues/18440.
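A minimal sketch of that ordering in a CDK app entrypoint (the PipelineStack class and its public pipeline property are assumptions based on the generated boilerplate described above):

```typescript
import { App } from "aws-cdk-lib";
import { PipelineStack } from "./pipeline-stack"; // hypothetical stack exposing a public `pipeline` property

const app = new App();
const pipelineStack = new PipelineStack(app, "PipelineStack");

// Build the pipeline explicitly before synthesis so CDK Nag can inspect
// the generated CodeBuild/CodePipeline resources (see aws-cdk issue #18440).
pipelineStack.pipeline.buildPipeline();

app.synth();
```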

Multi-branch pipeline management

If your team follows GitHub flow, the pipeline module can optionally help you create independent environments to test and validate changes before merging. When you create a new branch, the module automatically creates a new pipeline stack along with any stages you configure. Once you have finished testing and delete the branch, the stacks created for that branch's environment are automatically cleaned up.

The feature is enabled and configured by setting the branchNamePrefixes property of the PDKPipeline construct. Any branches created matching this list of prefixes will create a new pipeline and stack.
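The matching rule is a simple prefix check: a branch qualifies when its name starts with any configured prefix. A minimal sketch of that check (the helper name is hypothetical, not part of the library's API):

```typescript
// A branch triggers pipeline creation when its name starts with
// any of the configured branchNamePrefixes.
function matchesBranchPrefixes(branch: string, prefixes: string[]): boolean {
  return prefixes.some((prefix) => branch.startsWith(prefix));
}

console.log(matchesBranchPrefixes("feature/login", ["feature/", "fix/"])); // true
console.log(matchesBranchPrefixes("docs/readme", ["feature/", "fix/"]));   // false
```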

When your PDKPipeline is run, the current branch will be available in the BRANCH environment variable. You can use this to give unique names to the stacks and stages created by that branch. You can also enable and disable stages based on the branch name. For example, you may want the PipelineStack and Dev stage to get created for any branch and only create the Prod stage in the default branch.
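The library's PDKPipeline.getBranchPrefix (used in the Pipeline Definition example below) handles this naming for you; as an illustration of the idea only, a sketch of deriving a stack-name prefix from BRANCH (the sanitization logic here is an assumption, not the library's implementation):

```typescript
// Derive a CloudFormation-safe stack-name prefix from the current branch.
// The default branch yields an empty prefix so its stacks keep plain names.
function stackPrefixForBranch(
  branch: string | undefined,
  defaultBranch = "mainline"
): string {
  if (!branch || branch === defaultBranch) return "";
  // Stack names only allow alphanumerics and hyphens, so PascalCase
  // each alphanumeric run of the branch name and drop the separators.
  return branch
    .split(/[^a-zA-Z0-9]+/)
    .filter(Boolean)
    .map((part) => part[0].toUpperCase() + part.slice(1))
    .join("");
}

console.log(stackPrefixForBranch("feature/login-page") + "PipelineStack"); // "FeatureLoginPagePipelineStack"
console.log(stackPrefixForBranch("mainline") + "PipelineStack");           // "PipelineStack"
```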

PDKPipeline configuration

Example: All Branches

pipeline-stack.ts

this.pipeline = new PDKPipeline(this, "ApplicationPipeline", {
  primarySynthDirectory: "packages/backend/cdk.out",
  repositoryName: this.node.tryGetContext("repositoryName") || "monorepo",
  branchNamePrefixes: PDKPipeline.ALL_BRANCHES,
});

Example: Branches starting with "feature/" or "fix/"

pipeline-stack.ts

this.pipeline = new PDKPipeline(this, "ApplicationPipeline", {
  primarySynthDirectory: "packages/backend/cdk.out",
  repositoryName: this.node.tryGetContext("repositoryName") || "monorepo",
  branchNamePrefixes: ["feature/", "fix/"],
});

Pipeline Definition

When you define your pipeline, you define which stages get created for a given branch and how to name your stacks uniquely. PipelineStack must be included.

pipeline.ts

const branchPrefix = PDKPipeline.getBranchPrefix({ node: app.node });

const pipelineStack = new PipelineStack(app, branchPrefix + "PipelineStack", {
  env: {
    account: process.env.CDK_DEFAULT_ACCOUNT!,
    region: process.env.CDK_DEFAULT_REGION!,
  },
});

const devStage = new ApplicationStage(app, branchPrefix + "Dev", {
  env: {
    account: process.env.CDK_DEFAULT_ACCOUNT!, // Replace with Dev account
    region: process.env.CDK_DEFAULT_REGION!, // Replace with Dev region
  },
});

pipelineStack.pipeline.addStage(devStage);

// Only create the Prod stage in the default branch
if (PDKPipeline.isDefaultBranch({ node: app.node })) {
  const prodStage = new ApplicationStage(app, "Prod", {
    env: {
      account: process.env.CDK_DEFAULT_ACCOUNT!, // Replace with Prod account
      region: process.env.CDK_DEFAULT_REGION!, // Replace with Prod region
    },
  });

  pipelineStack.pipeline.addStage(prodStage);
}
