This is the AWS CDK v2 Developer Guide. The older CDK v1 entered maintenance on June 1, 2022 and ended support on June 1, 2023.
Use the CDK Pipelines module from the AWS Construct Library to configure continuous delivery of AWS CDK applications. When you commit your CDK app's source code into AWS CodeCommit, GitHub, or AWS CodeStar, CDK Pipelines can automatically build, test, and deploy your new version.
CDK Pipelines are self-updating. If you add application stages or stacks, the pipeline automatically reconfigures itself to deploy those new stages or stacks.
Note
CDK Pipelines supports two APIs. One is the original API that was made available in the CDK Pipelines Developer Preview. The other is a modern API that incorporates feedback from CDK customers received during the preview phase. The examples in this topic use the modern API. For details on the differences between the two supported APIs, see CDK Pipelines original API.
Bootstrap your AWS environments
Before you can use CDK Pipelines, you must bootstrap the AWS environment that you will deploy your stacks to.
A CDK Pipeline involves at least two environments. The first environment is where the pipeline is provisioned. The second environment is where you want to deploy the application's stacks or stages to (stages are groups of related stacks). These environments can be the same, but a best practice recommendation is to isolate stages from each other in different environments.
Note
See AWS CDK bootstrapping for more information on the kinds of resources created by bootstrapping and how to customize the bootstrap stack.
Continuous deployment with CDK Pipelines requires the following to be included in the CDK Toolkit stack:
- An Amazon Simple Storage Service (Amazon S3) bucket.
- An Amazon ECR repository.
- IAM roles to give the various parts of a pipeline the permissions they need.
The CDK Toolkit upgrades your existing bootstrap stack or creates a new one if necessary.
To bootstrap an environment that can provision an AWS CDK pipeline, invoke `cdk bootstrap` as shown in the following example. Invoking the AWS CDK Toolkit via the `npx` command temporarily installs it if necessary. It will also use the version of the Toolkit installed in the current project, if one exists.
`--cloudformation-execution-policies` specifies the ARN of a policy under which future CDK Pipelines deployments will execute. The default `AdministratorAccess` policy makes sure that your pipeline can deploy every type of AWS resource. If you use this policy, make sure you trust all the code and dependencies that make up your AWS CDK app.
Most organizations mandate stricter controls on what kinds of resources can be deployed by automation. Check with the appropriate department within your organization to determine the policy your pipeline should use.
You can omit the `--profile` option if your default AWS profile contains the necessary authentication configuration and AWS Region.
```shell
npx cdk bootstrap aws://ACCOUNT-NUMBER/REGION --profile ADMIN-PROFILE \
    --cloudformation-execution-policies arn:aws:iam::aws:policy/AdministratorAccess
```
To bootstrap additional environments into which AWS CDK applications will be deployed by the pipeline, use the following commands instead. The `--trust` option indicates which other account should have permissions to deploy AWS CDK applications into this environment. For this option, specify the pipeline's AWS account ID.

Again, you can omit the `--profile` option if your default AWS profile contains the necessary authentication configuration and AWS Region.
```shell
npx cdk bootstrap aws://ACCOUNT-NUMBER/REGION --profile ADMIN-PROFILE \
    --cloudformation-execution-policies arn:aws:iam::aws:policy/AdministratorAccess \
    --trust PIPELINE-ACCOUNT-NUMBER
```
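If your organization mandates a narrower execution policy, `--cloudformation-execution-policies` also accepts a comma-separated list of policy ARNs instead of `AdministratorAccess`. A sketch, where `MyDeployPolicy` is a hypothetical customer-managed policy you would create beforehand:

```shell
# Bootstrap with a stricter execution policy set (PowerUserAccess is an AWS
# managed policy; MyDeployPolicy is a hypothetical customer-managed policy)
npx cdk bootstrap aws://ACCOUNT-NUMBER/REGION --profile ADMIN-PROFILE \
    --cloudformation-execution-policies \
    "arn:aws:iam::aws:policy/PowerUserAccess,arn:aws:iam::ACCOUNT-NUMBER:policy/MyDeployPolicy"
```

Deployments through the pipeline will then fail if the app tries to create a resource the combined policies don't allow.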
Tip
Use administrative credentials only to bootstrap and to provision the initial pipeline. Afterward, use the pipeline itself, not your local machine, to deploy changes.
If you are upgrading a legacy bootstrapped environment, the previous Amazon S3 bucket is orphaned when the new bucket is created. Delete it manually by using the Amazon S3 console.
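If you prefer the AWS CLI over the console, the orphaned bucket can also be removed there. A sketch, where `BUCKET-NAME` stands for the legacy staging bucket's name (which you can read from the old stack's resources):

```shell
# rb --force first deletes all remaining objects, then removes the bucket itself
aws s3 rb s3://BUCKET-NAME --force
```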
Protecting your bootstrap stack from deletion
If a bootstrap stack is deleted, the AWS resources that were originally provisioned in the environment to support CDK deployments will also be deleted. This will cause the pipeline to stop working. If this happens, there is no general solution for recovery.
After your environment is bootstrapped, do not delete and recreate the environment's bootstrap stack. Instead, try to update the bootstrap stack to a new version by running the `cdk bootstrap` command again.
To protect against accidental deletion of your bootstrap stack, we recommend that you provide the `--termination-protection` option with the `cdk bootstrap` command to enable termination protection. You can enable termination protection on new or existing bootstrap stacks. To learn more about this option, see `--termination-protection`.
After enabling termination protection, you can use the AWS CLI or CloudFormation console to verify that it is enabled.
To enable termination protection
- Run the following command to enable termination protection on a new or existing bootstrap stack:

  ```shell
  $ cdk bootstrap --termination-protection
  ```
- Use the AWS CLI or CloudFormation console to verify. The following is an example, using the AWS CLI. If you modified your bootstrap stack name, replace `CDKToolkit` with your stack name:

  ```shell
  $ aws cloudformation describe-stacks --stack-name CDKToolkit --query "Stacks[0].EnableTerminationProtection"
  true
  ```
Initialize a project
Create a new, empty GitHub project and clone it to your workstation in the `my-pipeline` directory. (Our code examples in this topic use GitHub. You can also use AWS CodeStar or AWS CodeCommit.)

```shell
git clone GITHUB-CLONE-URL my-pipeline
cd my-pipeline
```
Note
You can use a name other than `my-pipeline` for your app's main directory. However, if you do so, you will have to tweak the file and class names later in this topic. This is because the AWS CDK Toolkit bases some file and class names on the name of the main directory.
After cloning, initialize the project as usual.
```shell
$ cdk init app --language typescript
```
Important
Be sure to commit your `cdk.json` and `cdk.context.json` files to source control. The context information (such as feature flags and cached values retrieved from your AWS account) is part of your project's state. The values may be different in another environment, which can cause unexpected changes in your results. For more information, see Context values and the AWS CDK.
Define a pipeline
Your CDK Pipelines application will include at least two stacks: one that represents the pipeline itself, and one or more stacks that represent the application deployed through it. Stacks can also be grouped into stages, which you can use to deploy copies of infrastructure stacks to different environments. For now, we'll consider the pipeline, and later delve into the application it will deploy.
`CodePipeline` is the construct that represents a CDK Pipeline that uses AWS CodePipeline as its deployment engine. When you instantiate `CodePipeline` in a stack, you define the source location for the pipeline (such as a GitHub repository). You also define the commands to build the app.
For example, the following defines a pipeline whose source is stored in a GitHub repository. It also includes a build step for a TypeScript CDK application. Fill in the information about your GitHub repo where indicated.
Note
By default, the pipeline authenticates to GitHub using a personal access token stored in Secrets Manager under the name `github-token`.
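If you haven't stored the token yet, one way to do it is with the AWS CLI. A sketch: the secret must be a plaintext secret named `github-token`, and the token needs the `repo` and `admin:repo_hook` scopes so the pipeline can read the repository and create webhooks.

```shell
# Store the GitHub personal access token under the default name the pipeline expects
aws secretsmanager create-secret --name github-token \
    --description "GitHub PAT for CDK Pipelines" \
    --secret-string GITHUB-PERSONAL-ACCESS-TOKEN
```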
You'll also need to update the instantiation of the pipeline stack to specify the AWS account and Region.
In `lib/my-pipeline-stack.ts` (may vary if your project folder isn't named `my-pipeline`):
```typescript
import * as cdk from 'aws-cdk-lib';
import { Construct } from 'constructs';
import { CodePipeline, CodePipelineSource, ShellStep } from 'aws-cdk-lib/pipelines';

export class MyPipelineStack extends cdk.Stack {
  constructor(scope: Construct, id: string, props?: cdk.StackProps) {
    super(scope, id, props);

    const pipeline = new CodePipeline(this, 'Pipeline', {
      pipelineName: 'MyPipeline',
      synth: new ShellStep('Synth', {
        input: CodePipelineSource.gitHub('OWNER/REPO', 'main'),
        commands: ['npm ci', 'npm run build', 'npx cdk synth']
      })
    });
  }
}
```
In `bin/my-pipeline.ts` (may vary if your project folder isn't named `my-pipeline`):
```typescript
#!/usr/bin/env node
import * as cdk from 'aws-cdk-lib';
import { MyPipelineStack } from '../lib/my-pipeline-stack';

const app = new cdk.App();
new MyPipelineStack(app, 'MyPipelineStack', {
  env: {
    account: '111111111111',
    region: 'eu-west-1',
  }
});

app.synth();
```
You must deploy a pipeline manually once. After that, the pipeline keeps itself up to date from the source code repository. So be sure that the code in the repo is the code you want deployed. Check in your changes and push to GitHub, then deploy:
```shell
git add --all
git commit -m "initial commit"
git push
cdk deploy
```
Tip
Now that you've done the initial deployment, your local AWS account no longer needs administrative access. This is because all changes to your app will be deployed via the pipeline. All you need to be able to do is push to GitHub.
Application stages
To define a multi-stack AWS application that can be added to the pipeline all at once, define a subclass of `Stage`. (This is different from `CdkStage` in the CDK Pipelines module.)

The stage contains the stacks that make up your application. If there are dependencies between the stacks, the stacks are automatically added to the pipeline in the right order. Stacks that don't depend on each other are deployed in parallel. You can add a dependency relationship between stacks by calling `stack1.addDependency(stack2)`.
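As a sketch of that ordering mechanism: assuming hypothetical `DatabaseStack` and `ApplicationStack` classes defined elsewhere in your app, a stage could force the database to deploy before the application that uses it.

```typescript
import * as cdk from 'aws-cdk-lib';
import { Construct } from 'constructs';
// DatabaseStack and ApplicationStack are hypothetical stacks defined elsewhere
import { DatabaseStack } from './database-stack';
import { ApplicationStack } from './application-stack';

export class MyOrderedStage extends cdk.Stage {
  constructor(scope: Construct, id: string, props?: cdk.StageProps) {
    super(scope, id, props);

    const dbStack = new DatabaseStack(this, 'Database');
    const appStack = new ApplicationStack(this, 'Application');

    // Deploy Database before Application; without this call,
    // the two stacks would be deployed in parallel
    appStack.addDependency(dbStack);
  }
}
```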
Stages accept a default `env` argument, which becomes the default environment for the stacks inside it. (Stacks can still have their own environment specified.)

An application is added to the pipeline by calling `addStage()` with instances of `Stage`. A stage can be instantiated and added to the pipeline multiple times to define different stages of your DTAP or multi-Region application pipeline.
We will create a stack containing a simple Lambda function and place that stack in a stage. Then we will add the stage to the pipeline so it can be deployed.
Create the new file `lib/my-pipeline-lambda-stack.ts` to hold our application stack containing a Lambda function.
```typescript
import * as cdk from 'aws-cdk-lib';
import { Construct } from 'constructs';
import { Function, InlineCode, Runtime } from 'aws-cdk-lib/aws-lambda';

export class MyLambdaStack extends cdk.Stack {
  constructor(scope: Construct, id: string, props?: cdk.StackProps) {
    super(scope, id, props);

    new Function(this, 'LambdaFunction', {
      runtime: Runtime.NODEJS_18_X,
      handler: 'index.handler',
      code: new InlineCode('exports.handler = _ => "Hello, CDK";')
    });
  }
}
```
Create the new file `lib/my-pipeline-app-stage.ts` to hold our stage.
```typescript
import * as cdk from 'aws-cdk-lib';
import { Construct } from "constructs";
import { MyLambdaStack } from './my-pipeline-lambda-stack';

export class MyPipelineAppStage extends cdk.Stage {
  constructor(scope: Construct, id: string, props?: cdk.StageProps) {
    super(scope, id, props);

    const lambdaStack = new MyLambdaStack(this, 'LambdaStack');
  }
}
```
Edit `lib/my-pipeline-stack.ts` to add the stage to our pipeline.
```typescript
import * as cdk from 'aws-cdk-lib';
import { Construct } from 'constructs';
import { CodePipeline, CodePipelineSource, ShellStep } from 'aws-cdk-lib/pipelines';
import { MyPipelineAppStage } from './my-pipeline-app-stage';

export class MyPipelineStack extends cdk.Stack {
  constructor(scope: Construct, id: string, props?: cdk.StackProps) {
    super(scope, id, props);

    const pipeline = new CodePipeline(this, 'Pipeline', {
      pipelineName: 'MyPipeline',
      synth: new ShellStep('Synth', {
        input: CodePipelineSource.gitHub('OWNER/REPO', 'main'),
        commands: ['npm ci', 'npm run build', 'npx cdk synth']
      })
    });

    pipeline.addStage(new MyPipelineAppStage(this, "test", {
      env: { account: "111111111111", region: "eu-west-1" }
    }));
  }
}
```
Every application stage added by `addStage()` results in the addition of a corresponding pipeline stage, represented by a `StageDeployment` instance returned by the `addStage()` call. You can add pre-deployment or post-deployment actions to the stage by calling its `addPre()` or `addPost()` method.
```typescript
// import { ManualApprovalStep } from 'aws-cdk-lib/pipelines';

const testingStage = pipeline.addStage(new MyPipelineAppStage(this, 'testing', {
  env: { account: '111111111111', region: 'eu-west-1' }
}));

testingStage.addPost(new ManualApprovalStep('approval'));
```
You can add stages to a Wave to deploy them in parallel, for example when deploying a stage to multiple accounts or Regions.
```typescript
const wave = pipeline.addWave('wave');

wave.addStage(new MyApplicationStage(this, 'MyAppEU', {
  env: { account: '111111111111', region: 'eu-west-1' }
}));

wave.addStage(new MyApplicationStage(this, 'MyAppUS', {
  env: { account: '111111111111', region: 'us-west-1' }
}));
```
Testing deployments
You can add steps to a CDK Pipeline to validate the deployments that you're performing. For example, you can use the CDK Pipeline library's `ShellStep` to perform tasks such as the following:
- Trying to access a newly deployed Amazon API Gateway backed by a Lambda function
- Checking a setting of a deployed resource by issuing an AWS CLI command
In its simplest form, adding validation actions looks like this:
```typescript
// stage was returned by pipeline.addStage
stage.addPost(new ShellStep("validate", {
  commands: ['../tests/validate.sh'],
}));
```
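The contents of `tests/validate.sh` are up to you. A minimal sketch, assuming a hypothetical `ENDPOINT_URL` environment variable supplied to the step (for example via `envFromCfnOutputs`): the step fails, and so stops the pipeline, if the script exits nonzero.

```shell
#!/bin/bash
# Hypothetical smoke test: fail the pipeline step if the endpoint is unhealthy.
set -euo pipefail

status=$(curl -s -o /dev/null -w "%{http_code}" "$ENDPOINT_URL")
if [ "$status" -ne 200 ]; then
  echo "Health check failed with HTTP $status"
  exit 1
fi
echo "Health check passed"
```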
Many AWS CloudFormation deployments result in the generation of resources with unpredictable names. Because of this, CDK Pipelines provide a way to read AWS CloudFormation outputs after a deployment. This makes it possible to pass (for example) the generated URL of a load balancer to a test action.
To use outputs, expose the `CfnOutput` object you're interested in. Then, pass it in a step's `envFromCfnOutputs` property to make it available as an environment variable within that step.
```typescript
// given a stack lbStack that exposes a load balancer construct as loadBalancer
this.loadBalancerAddress = new cdk.CfnOutput(lbStack, 'LbAddress', {
  value: `https://${lbStack.loadBalancer.loadBalancerDnsName}/`
});

// pass the load balancer address to a shell step
stage.addPost(new ShellStep("lbaddr", {
  envFromCfnOutputs: {lb_addr: lbStack.loadBalancerAddress},
  commands: ['echo $lb_addr']
}));
```
You can write simple validation tests right in the `ShellStep`, but this approach becomes unwieldy when the test is more than a few lines. For more complex tests, you can bring additional files (such as complete shell scripts, or programs in other languages) into the `ShellStep` via the `inputs` property. The inputs can be any step that has an output, including a source (such as a GitHub repo) or another `ShellStep`.
Bringing in files from the source repository is appropriate if the files are directly usable in the test (for example, if they are themselves executable). In this example, we declare our GitHub repo as `source` (rather than instantiating it inline as part of the `CodePipeline`). Then, we pass this fileset to both the pipeline and the validation test.
```typescript
const source = CodePipelineSource.gitHub('OWNER/REPO', 'main');

const pipeline = new CodePipeline(this, 'Pipeline', {
  pipelineName: 'MyPipeline',
  synth: new ShellStep('Synth', {
    input: source,
    commands: ['npm ci', 'npm run build', 'npx cdk synth']
  })
});

const stage = pipeline.addStage(new MyPipelineAppStage(this, 'test', {
  env: { account: '111111111111', region: 'eu-west-1' }
}));

stage.addPost(new ShellStep('validate', {
  input: source,
  commands: ['sh ../tests/validate.sh']
}));
```
Getting the additional files from the synth step is appropriate if your tests need to be compiled, which is done as part of synthesis.
```typescript
const synthStep = new ShellStep('Synth', {
  input: CodePipelineSource.gitHub('OWNER/REPO', 'main'),
  commands: ['npm ci', 'npm run build', 'npx cdk synth'],
});

const pipeline = new CodePipeline(this, 'Pipeline', {
  pipelineName: 'MyPipeline',
  synth: synthStep
});

const stage = pipeline.addStage(new MyPipelineAppStage(this, 'test', {
  env: { account: '111111111111', region: 'eu-west-1' }
}));

// run a script that was transpiled from TypeScript during synthesis
stage.addPost(new ShellStep('validate', {
  input: synthStep,
  commands: ['node tests/validate.js']
}));
```
Security notes
Any form of continuous delivery has inherent security risks. Under the AWS Shared Responsibility Model, you are responsible for the security of your application code and pipeline configuration. However, by its very nature, a library that needs a high level of access to fulfill its intended purpose cannot assure complete security. There are many attack vectors outside of AWS and your organization.
In particular, keep in mind the following:
- Be mindful of the software you depend on. Vet all third-party software you run in your pipeline, because it can change the infrastructure that gets deployed.
- Use dependency locking to prevent accidental upgrades. CDK Pipelines respects `package-lock.json` and `yarn.lock` to make sure that your dependencies are the ones you expect.
- CDK Pipelines runs on resources created in your own account, and the configuration of those resources is controlled by developers submitting code through the pipeline. Therefore, CDK Pipelines by itself cannot protect against malicious developers trying to bypass compliance checks. If your threat model includes developers writing CDK code, you should have external compliance mechanisms in place like AWS CloudFormation Hooks (preventive) or AWS Config (reactive) that the AWS CloudFormation Execution Role does not have permissions to disable.
- Credentials for production environments should be short-lived. After bootstrapping and initial provisioning, there is no need for developers to have account credentials at all. Changes can be deployed through the pipeline. Reduce the possibility of credentials leaking by not needing them in the first place.
Troubleshooting
The following issues are commonly encountered while getting started with CDK Pipelines.
- Pipeline: Internal Failure

  CREATE_FAILED | AWS::CodePipeline::Pipeline | Pipeline/Pipeline Internal Failure

  Check your GitHub access token. It might be missing, or might not have the permissions to access the repository.
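One way to check whether the token is actually readable is with the AWS CLI. A sketch, assuming the default secret name `github-token`:

```shell
# Confirm the secret exists and print its value; an error here means the
# pipeline cannot read the token either
aws secretsmanager get-secret-value --secret-id github-token \
    --query SecretString --output text
```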
- Key: Policy contains a statement with one or more invalid principals

  CREATE_FAILED | AWS::KMS::Key | Pipeline/Pipeline/ArtifactsBucketEncryptionKey Policy contains a statement with one or more invalid principals.

  One of the target environments has not been bootstrapped with the new bootstrap stack. Make sure all your target environments are bootstrapped.
- Stack is in ROLLBACK_COMPLETE state and can not be updated.

  Stack STACK_NAME is in ROLLBACK_COMPLETE state and can not be updated. (Service: AmazonCloudFormation; Status Code: 400; Error Code: ValidationError; Request ID: ...)

  The stack failed its previous deployment and is in a non-retryable state. Delete the stack from the AWS CloudFormation console and retry the deployment.