

# CodePipeline tutorials
<a name="tutorials"></a>

After you complete the steps in [Getting started with CodePipeline](getting-started-codepipeline.md), you can try one of the AWS CodePipeline tutorials in this user guide.

**Topics**
+ [Tutorial: Deploy to Amazon EC2 instances with CodePipeline](tutorials-ec2-deploy.md)
+ [Tutorial: Build and push a Docker image to Amazon ECR with CodePipeline (V2 type)](tutorials-ecr-build-publish.md)
+ [Tutorial: Deploy to Amazon EKS with CodePipeline](tutorials-eks-deploy.md)
+ [Tutorial: Create a pipeline that runs commands with compute (V2 type)](tutorials-commands.md)
+ [Tutorial: Use Git tags to start your pipeline](tutorials-github-tags.md)
+ [Tutorial: Filter on branch names for pull requests to start your pipeline (V2 type)](tutorials-github-featurebranches.md)
+ [Tutorial: Use pipeline-level variables](tutorials-pipeline-variables.md)
+ [Tutorial: Create a simple pipeline (S3 bucket)](tutorials-simple-s3.md)
+ [Tutorial: Create a simple pipeline (CodeCommit repository)](tutorials-simple-codecommit.md)
+ [Tutorial: Create a four-stage pipeline](tutorials-four-stage-pipeline.md)
+ [Tutorial: Set up a CloudWatch Events rule to receive email notifications for pipeline state changes](tutorials-cloudwatch-sns-notifications.md)
+ [Tutorial: Create a pipeline that builds and tests your Android app with AWS Device Farm](tutorials-codebuild-devicefarm.md)
+ [Tutorial: Create a pipeline that tests your iOS app with AWS Device Farm](tutorials-codebuild-devicefarm-S3.md)
+ [Tutorial: Create a pipeline that deploys to Service Catalog](tutorials-S3-servicecatalog.md)
+ [Tutorial: Create a pipeline with AWS CloudFormation](tutorials-cloudformation.md)
+ [Tutorial: Create a pipeline that uses variables from AWS CloudFormation deployment actions](tutorials-cloudformation-action.md)
+ [Tutorial: Amazon ECS Standard Deployment with CodePipeline](ecs-cd-pipeline.md)
+ [Tutorial: Create a pipeline with an Amazon ECR source and ECS-to-CodeDeploy deployment](tutorials-ecs-ecr-codedeploy.md)
+ [Tutorial: Create a pipeline that deploys an Amazon Alexa skill](tutorials-alexa-skills-kit.md)
+ [Tutorial: Create a pipeline that uses Amazon S3 as a deployment provider](tutorials-s3deploy.md)
+ [Tutorial: Create a pipeline that publishes your serverless application to the AWS Serverless Application Repository](tutorials-serverlessrepo-auto-publish.md)
+ [Tutorial: Lambda function deployments with CodePipeline](tutorials-lambda-deploy.md)
+ [Tutorial: Using variables with Lambda invoke actions](tutorials-lambda-variables.md)
+ [Tutorial: Use an AWS Step Functions invoke action in a pipeline](tutorials-step-functions.md)
+ [Tutorial: Create a pipeline that uses AWS AppConfig as a deployment provider](tutorials-AppConfig.md)
+ [Tutorial: Use full clone with a GitHub pipeline source](tutorials-github-gitclone.md)
+ [Tutorial: Use full clone with a CodeCommit pipeline source](tutorials-codecommit-gitclone.md)
+ [Tutorial: Create a pipeline with AWS CloudFormation StackSets deployment actions](tutorials-stackset-deployment.md)
+ [Tutorial: Create a variable check rule for a pipeline as an entry condition](tutorials-varcheckrule.md)

# Tutorial: Deploy to Amazon EC2 instances with CodePipeline
<a name="tutorials-ec2-deploy"></a>

This tutorial helps you to create a deploy action in CodePipeline that deploys your code to instances you have configured in Amazon EC2.

**Note**  
As part of creating a pipeline in the console, an S3 artifact bucket will be used by CodePipeline for artifacts. (This is different from the bucket used for an S3 source action.) If the S3 artifact bucket is in a different account from the account for your pipeline, make sure that the S3 artifact bucket is owned by an AWS account that you trust.

**Note**  
The `EC2` deploy action is only available for V2 type pipelines.

## Prerequisites
<a name="tutorials-ec2-deploy-prereqs"></a>

There are a few resources that you must have in place before you can use this tutorial to create your CD pipeline. Here are the things you need to get started:

**Note**  
All of these resources should be created within the same AWS Region.
+ A source control repository (this tutorial uses GitHub) where you will add a sample `script.sh` file.
+ You must use an existing CodePipeline service role that has been updated with the permissions for this action. To update your service role, see [Service role policy permissions for the EC2 deploy action](action-reference-EC2Deploy.md#action-reference-EC2Deploy-permissions-action).

After you have satisfied these prerequisites, you can proceed with the tutorial and create your CD pipeline.

## Step 1: Create Amazon EC2 Linux instances
<a name="tutorials-ec2-deploy-instances"></a>

In this step, you create the Amazon EC2 instances where you will deploy a sample application. As part of this process, create an instance role in IAM, if you have not already created an instance role in the Region where you want to create resources.

**To create an instance role**

1. Open the IAM console at [https://console.aws.amazon.com/iam/](https://console.aws.amazon.com/iam/).

1. From the console dashboard, choose **Roles**.

1. Choose **Create role**.

1. Under **Select type of trusted entity**, select **AWS service**. Under **Use case**, choose **EC2**. Choose **Next**.

1. Search for and select the policy named **`AmazonSSMManagedEC2InstanceDefaultPolicy`**. 

1. Search for and select the policy named **`AmazonSSMManagedInstanceCore`**. Choose **Next: Tags**.

1. Choose **Next: Review**. Enter a name for the role (for example, **EC2InstanceRole**).
**Note**  
Make a note of your role name for the next step. You choose this role when you are creating your instance.
**Note**  
You will add permissions to this role to allow access to the S3 artifact bucket for your pipeline after pipeline creation.

   Choose **Create role**.

**To launch instances**

1. Open the Amazon EC2 console at [https://console.aws.amazon.com/ec2/](https://console.aws.amazon.com/ec2/).

1. From the side navigation, choose **Instances**, and select **Launch instances** from the top of the page.

1. In **Name**, enter **MyInstances**. This assigns the instance a tag **Key** of **Name** and a tag **Value** of **MyInstances**. 

1. Under **Application and OS Images (Amazon Machine Image)**, locate the **Amazon Linux** AMI option with the AWS logo, and make sure it is selected. (This AMI is described as the Amazon Linux 2 AMI (HVM) and is labeled "Free tier eligible".)

1. Under **Instance type**, choose the free tier eligible `t2.micro` type as the hardware configuration for your instance.

1. Under **Key pair (login)**, choose a key pair or create one. 

1. Under **Network settings**, make sure that **Auto-assign public IP** is set to **Enable**. 

1. Expand **Advanced details**. In **IAM instance profile**, choose the IAM role you created in the previous procedure (for example, **EC2InstanceRole**).
**Note**  
Do not leave the instance role blank as this creates a default role and does not select the role you created.

1. Under **Summary**, under **Number of instances**, enter `2`.

1. Choose **Launch instance**. 

1. You can view the status of the launch on the **Instances** page. When you launch an instance, its initial state is `pending`. After the instance starts, its state changes to `running`, and it receives a public DNS name. (If the **Public DNS** column is not displayed, choose the **Show/Hide** icon, and then select **Public DNS**.)

## Step 2: Add artifact bucket permissions to the EC2 instance role
<a name="tutorials-ec2-deploy-role-s3"></a>

You must update the EC2 instance role you created for your instance to allow it access to your pipeline's artifact bucket. 

**Note**  
When you create the instance, you create or use an existing EC2 instance role. To avoid `Access Denied` errors, you must add S3 bucket permissions to the instance role to give the instance permissions to the CodePipeline artifact bucket. Create a default role or update your existing role with the `s3:GetObject` permission scoped down to the artifact bucket for your pipeline's Region.

1. Navigate to your pipeline in the CodePipeline console. Choose **Settings**. View the name and location of the artifact store for an existing pipeline. Make a note of the artifact bucket Amazon Resource Name (ARN) and copy it.

1. Navigate to the IAM console and choose **Roles**. Choose the instance role you created in Step 1 of this tutorial.

1. On the **Permissions** tab, choose **Add inline policy**.

1. Add the following JSON to the policy document, replacing the value in the `Resource` field with the ARN of your artifact bucket. (Because `s3:GetObject` applies to objects, append `/*` to the bucket ARN.)

   ```
   {
       "Version": "2012-10-17",
       "Statement": [
           {
               "Effect": "Allow",
               "Action": "s3:GetObject",
               "Resource": "arn:aws:s3:::BucketName/*"
           }
       ]
   }
   ```

1. Choose **Update**.
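If you prefer to script this step, the following sketch writes an equivalent inline policy to a file and validates it before you attach it. The bucket name `my-pipeline-artifact-bucket`, role name `EC2InstanceRole`, and policy name `ArtifactBucketAccess` are placeholders for this example, and the final `aws iam put-role-policy` call is shown commented out because it requires your AWS credentials.

```shell
# Write the inline policy document, scoping s3:GetObject to objects in the
# artifact bucket. "my-pipeline-artifact-bucket" is a placeholder name.
cat > artifact-bucket-policy.json <<'EOF'
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::my-pipeline-artifact-bucket/*"
        }
    ]
}
EOF

# Confirm the document is valid JSON before attaching it.
python3 -m json.tool artifact-bucket-policy.json > /dev/null && echo "policy OK"

# To attach it to the instance role (requires credentials), you would run:
# aws iam put-role-policy --role-name EC2InstanceRole \
#   --policy-name ArtifactBucketAccess \
#   --policy-document file://artifact-bucket-policy.json
```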

## Step 3: Add a script file to your repository
<a name="tutorials-ec2-deploy-file"></a>

Paste this sample text to create your `script.sh` file for the post-script step in the deployment.

```
echo "Hello World!" 
```

**To add a `script.sh` file to your source repository**

1. Open a text editor and then copy and paste the file above into a new file.

1. Commit and push your `script.sh` file to your source repository.

   1. Add the file.

      ```
      git add .
      ```

   1. Commit the change.

      ```
      git commit -m "Adding script.sh."
      ```

   1. Push the commit.

      ```
      git push
      ```

   Make a note of the path in your repository.

   ```
   /MyDemoRepo/test/script.sh
   ```
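Before committing, you can sanity-check the script locally. A minimal sketch, assuming a local `test/` directory matching the repository path above:

```shell
# Recreate the sample script locally (same content as the sample above).
mkdir -p test
printf 'echo "Hello World!"\n' > test/script.sh

# Check the script for syntax errors without executing it.
bash -n test/script.sh && echo "syntax OK"

# Run it once to confirm the expected output.
bash test/script.sh   # prints: Hello World!
```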

## Step 4: Create your pipeline
<a name="tutorials-ec2-deploy-pipeline"></a>

Use the CodePipeline wizard to create your pipeline stages and connect your source repository.

**To create your pipeline**

1. Open the CodePipeline console at [https://console.aws.amazon.com/codepipeline/](https://console.aws.amazon.com/codepipeline/).

1. On the **Welcome** page, **Getting started** page, or the **Pipelines** page, choose **Create pipeline**.

1. On the **Step 1: Choose creation option** page, under **Creation options**, choose the **Build custom pipeline** option. Choose **Next**.

1. In **Step 2: Choose pipeline settings**, in **Pipeline name**, enter **MyPipeline**.

1. CodePipeline provides V1 and V2 type pipelines, which differ in characteristics and price. The V2 type is the only type you can choose in the console. For more information, see [pipeline types](https://docs.aws.amazon.com/codepipeline/latest/userguide/pipeline-types-planning.html?icmpid=docs_acp_help_panel). For information about pricing for CodePipeline, see [Pricing](https://aws.amazon.com/codepipeline/pricing/).

1. In **Service role**, choose **Use existing service role**, and then choose the CodePipeline service role that has been updated with the required permissions for this action. To configure your CodePipeline service role for this action, see [Service role policy permissions for the EC2 deploy action](action-reference-EC2Deploy.md#action-reference-EC2Deploy-permissions-action).

1. Leave the settings under **Advanced settings** at their defaults, and then choose **Next**.

1. On the **Step 3: Add source stage** page, add a source stage:

   1. In **Source provider**, choose **GitHub (via GitHub App)**.

   1. Under **Connection**, choose an existing connection or create a new one. To create or manage a connection for your GitHub source action, see [GitHub connections](connections-github.md).

   1. In **Repository name**, choose the name of your GitHub repository.

   Choose **Next**.

1. On the **Step 4: Add build stage** page, choose **Skip**.

1. On the **Step 5: Add deploy stage** page, choose **EC2**.  
![\[\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/ec2deploy-action.png)

   1. For **Target directory**, enter the directory on the instance that you want to deploy to, such as `/home/ec2-user/testhelloworld`.
**Note**  
Specify the deployment directory that you want the action to use on the instance. The action automatically creates the specified directory on the instance as part of the deployment.

   1. For **PostScript**, enter the path and file name for your script, such as `test/script.sh`.

   1. Choose **Next**.

1. On the **Step 6: Review** page, review your pipeline configuration and choose **Create pipeline** to create the pipeline.  
![\[\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/ec2deploy-pipeline.png)

1. After the pipeline runs successfully, choose **View details** on the action to view its logs and the managed compute action output.  
![\[\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/ec2deploy-logs.png)  
![\[\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/ec2deploy-logs2.png)

## Step 5: Test your pipeline
<a name="tutorials-ec2-deploy-test"></a>

Your pipeline now has everything you need to run an end-to-end native AWS continuous deployment. Test its functionality by pushing a code change to your source repository.

**To test your pipeline**

1. Make a code change to your configured source repository, commit, and push the change.

1. Open the CodePipeline console at [https://console.aws.amazon.com/codepipeline/](https://console.aws.amazon.com/codepipeline/).

1. Choose your pipeline from the list.

1. Watch the pipeline progress through its stages. Your pipeline should complete and your action deploys the script on your instances.

1. For more troubleshooting information, see [EC2 Deploy action fails with an error message `No such file`](troubleshooting.md#troubleshooting-ec2-deploy).

# Tutorial: Build and push a Docker image to Amazon ECR with CodePipeline (V2 type)
<a name="tutorials-ecr-build-publish"></a>

This tutorial helps you to create a build action in CodePipeline that runs and pushes your Docker image to Amazon ECR after a change to your source code. This tutorial also shows you how to add an Amazon ECS deploy action that deploys your pushed image.

**Important**  
As part of creating a pipeline in the console, an S3 artifact bucket will be used by CodePipeline for artifacts. (This is different from the bucket used for an S3 source action.) If the S3 artifact bucket is in a different account from the account for your pipeline, make sure that the S3 artifact bucket is owned by an AWS account that you trust.

**Note**  
This tutorial is for the ECRBuildAndPublish build action for a CodePipeline pipeline with a GitHub source repository and an Amazon ECS standard action for deploying to an Amazon ECS cluster. For a tutorial that uses a pipeline with an ECR image repository as the source for an Amazon ECS to CodeDeploy blue/green deployment action in CodePipeline, see [Tutorial: Create a pipeline with an Amazon ECR source and ECS-to-CodeDeploy deployment](tutorials-ecs-ecr-codedeploy.md).

**Important**  
This action uses CodePipeline managed CodeBuild compute to run commands in a build environment. Running this action will incur separate charges in AWS CodeBuild.

## Prerequisites
<a name="tutorials-ecr-build-publish-prereqs"></a>

There are a few resources that you must have in place before you can use this tutorial to create your CD pipeline. Here are the things you need to get started:

**Note**  
All of these resources should be created within the same AWS Region.
+ A source control repository (this tutorial uses GitHub) where you will add the following for this tutorial:
  + In Step 1, you will add a sample Dockerfile to your source repository as the input artifact for the ECRBuildAndPublish build action in CodePipeline.
  + In Step 2, you will add a sample imagedefinitions.json file to your source repository as a requirement for the Amazon ECS standard deploy action in CodePipeline.
+ An Amazon ECR image repository that contains an image you have built from your Dockerfile. For more information, see [Creating a Repository](https://docs.aws.amazon.com/AmazonECR/latest/userguide/repository-create.html) and [Pushing an Image](https://docs.aws.amazon.com/AmazonECR/latest/userguide/docker-push-ecr-image.html) in the *Amazon Elastic Container Registry User Guide*.
+ An Amazon ECS cluster and service created in the same Region as the image repository. For more information, see [Creating a Cluster](https://docs.aws.amazon.com/AmazonECS/latest/developerguide/create_cluster.html) and [Creating a Service](https://docs.aws.amazon.com/AmazonECS/latest/developerguide/create-service.html) in the *Amazon Elastic Container Service Developer Guide*.

After you have satisfied these prerequisites, you can proceed with the tutorial and create your CD pipeline.

## Step 1: Add a Dockerfile to your source repository
<a name="tutorials-ecr-build-publish-file"></a>

This tutorial uses the ECRBuildAndPublish action to build your Docker image and push the image to Amazon ECR. The managed compute action in CodePipeline uses CodeBuild to run the commands for the ECR login and image push. You do not need to add a `buildspec.yml` file to your source code repository to tell CodeBuild how to do that. You only provide the Dockerfile in your repository as follows for this example.

Paste this sample text to create your `Dockerfile` file. This sample Dockerfile is the same as the sample used in the ECR image instructions in the prerequisites.

```
FROM public.ecr.aws/amazonlinux/amazonlinux:latest

# Install dependencies
RUN yum update -y && \
 yum install -y httpd

# Write the hello world message
RUN echo 'Hello World!' > /var/www/html/index.html

# Configure apache
RUN echo 'mkdir -p /var/run/httpd' >> /root/run_apache.sh && \
 echo 'mkdir -p /var/lock/httpd' >> /root/run_apache.sh && \
 echo '/usr/sbin/httpd -D FOREGROUND' >> /root/run_apache.sh && \
 chmod 755 /root/run_apache.sh

EXPOSE 80

CMD /root/run_apache.sh
```
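If you have Docker installed locally, you can sanity-check the image before committing. The following sketch saves the local build-and-test commands to a helper script (the image name `hello-world` and host port `8080` are arbitrary choices for this example) and syntax-checks it; actually running the helper requires a local Docker daemon.

```shell
# Save the local build-and-test commands to a helper script.
cat > build_and_test.sh <<'EOF'
#!/bin/bash
set -e
docker build -t hello-world .          # build from the Dockerfile above
docker run -d -p 8080:80 hello-world   # serve apache on host port 8080
curl -s http://localhost:8080          # should print: Hello World!
EOF
chmod +x build_and_test.sh

# Syntax-check the helper without running it (running requires Docker).
bash -n build_and_test.sh && echo "helper OK"
```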

**To add a `Dockerfile` file to your source repository**

1. Open a text editor and then copy and paste the Dockerfile above into a new file.

1. Commit and push your `Dockerfile` file to your source repository.

   1. Add the file.

      ```
      git add .
      ```

   1. Commit the change.

      ```
      git commit -m "Adding Dockerfile."
      ```

   1. Push the commit.

      ```
      git push
      ```

   Be sure to place the file at the root level of your repository.

   ```
   /Dockerfile
   ```

## Step 2: Add an imagedefinitions.json file to your source repository
<a name="w2aac13b9c15"></a>

This tutorial uses the Amazon ECS standard deploy action in CodePipeline to deploy your container to your Amazon ECS cluster. The Amazon ECS standard deploy action requires an `imagedefinitions.json` file containing your image name and URI. For more information about the `imagedefinitions.json` file, see [imagedefinitions.json file for Amazon ECS standard deployment actions](file-reference.md#pipelines-create-image-definitions).

Paste this sample text to create your `imagedefinitions.json` file. Use the name in your Dockerfile, such as `hello-world`, and use the URI from your Amazon ECR repository where the image is stored.

```
[
  {
    "name": "hello-world",
    "imageUri": "ACCOUNT-ID.dkr.ecr.us-east-1.amazonaws.com/actions/image-repo"
  }
]
```
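Before committing, you can verify that the file parses and contains the fields the Amazon ECS standard deploy action requires. A minimal sketch; the account ID `111122223333` and the repository URI are placeholders for your own values.

```shell
# Recreate the sample file (substitute your own account ID and repository URI).
cat > imagedefinitions.json <<'EOF'
[
  {
    "name": "hello-world",
    "imageUri": "111122223333.dkr.ecr.us-east-1.amazonaws.com/actions/image-repo"
  }
]
EOF

# Verify it is valid JSON and that each entry has "name" and "imageUri".
python3 - <<'EOF'
import json
defs = json.load(open("imagedefinitions.json"))
assert all("name" in d and "imageUri" in d for d in defs), "missing required keys"
print("imagedefinitions.json OK")
EOF
```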

**To add an `imagedefinitions.json` file to your source repository**

1. Open a text editor and then copy and paste the example above into a new file.

1. Commit and push your `imagedefinitions.json` file to your source repository.

   1. Add the file.

      ```
      git add .
      ```

   1. Commit the change.

      ```
      git commit -m "Adding imagedefinitions.json."
      ```

   1. Push the commit.

      ```
      git push
      ```

   Be sure to place the file at the root level of your repository.

   ```
   /imagedefinitions.json
   ```

## Step 3: Create your pipeline
<a name="tutorials-ecr-build-publish-pipeline"></a>

Use the CodePipeline wizard to create your pipeline stages and connect your source repository.

**To create your pipeline**

1. Open the CodePipeline console at [https://console.aws.amazon.com/codepipeline/](https://console.aws.amazon.com/codepipeline/).

1. On the **Welcome** page, **Getting started** page, or the **Pipelines** page, choose **Create pipeline**.

1. On the **Step 1: Choose creation option** page, under **Creation options**, choose the **Build custom pipeline** option. Choose **Next**.

1. In **Step 2: Choose pipeline settings**, in **Pipeline name**, enter **MyPipeline**.

1. CodePipeline provides V1 and V2 type pipelines, which differ in characteristics and price. The V2 type is the only type you can choose in the console. For more information, see [pipeline types](https://docs.aws.amazon.com/codepipeline/latest/userguide/pipeline-types-planning.html?icmpid=docs_acp_help_panel). For information about pricing for CodePipeline, see [Pricing](https://aws.amazon.com/codepipeline/pricing/).

1. In **Service role**, choose **New service role** to allow CodePipeline to create a service role in IAM.

1. Leave the settings under **Advanced settings** at their defaults, and then choose **Next**.

1. On the **Step 3: Add source stage** page, add a source stage:

   1. In **Source provider**, choose **GitHub (via GitHub App)**.

   1. Under **Connection**, choose an existing connection or create a new one. To create or manage a connection for your GitHub source action, see [GitHub connections](connections-github.md).

   1. In **Repository name**, choose the name of your GitHub repository.

   1. In **Default branch**, choose the branch that you want to specify when the pipeline is started manually or with a source event that is not a Git tag. If the source of the change is not the trigger or if a pipeline execution was started manually, then the change used will be the HEAD commit from the default branch.

   Choose **Next**.

1. On the **Step 4: Add build stage** page, choose **Other build providers**, and then choose **ECRBuildAndPublish**.  
![\[\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/ecrbuild-wizard.png)

   1. For **ECR repository name**, choose your image repository.

   1. Choose **Next**.

1. In **Step 5: Add test stage**, choose **Skip test stage**, and then accept the warning message by choosing **Skip** again. 

   Choose **Next**.

1. On the **Step 6: Add deploy stage** page, choose **Skip deploy stage**. You will add the ECS action in the following step.

1. On the **Step 7: Review** page, review your pipeline configuration and choose **Create pipeline** to create the pipeline.

1. Edit your pipeline to add the Amazon ECS deploy action to your pipeline:

   1. In the upper right, choose **Edit**.

   1. At the bottom of the diagram, choose **+ Add stage**. In **Stage name**, enter a name, such as **Deploy**.

   1. Choose **+ Add action group**.

   1. In **Action name**, enter a name. 

   1. In **Action provider**, choose **Amazon ECS**. Allow **Region** to default to the pipeline Region.

   1. In **Input artifacts**, choose the input artifact from the source stage, such as `SourceArtifact`. 

   1. For **Cluster name**, choose the Amazon ECS cluster in which your service is running.

   1. For **Service name**, choose the service to update.

   1. Choose **Save**.

   1. On the stage you are editing, choose **Done**. In the AWS CodePipeline pane, choose **Save**, and then choose **Save** on the warning message.

   1. To submit your changes and start a pipeline build, choose **Release change**, and then choose **Release**.

1. After the pipeline runs, view the pipeline structure and status.  
![\[\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/ecrbuild-ecsaction.png)

1. After the pipeline runs successfully, choose **View details** on the action to view its logs and the managed compute action output.  
![\[\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/ecrbuild-logs.png)

1. Troubleshoot any failed actions. For example, the ECS deploy action can fail if the imagedefinitions.json file is not in the source repository. The following is an example of the error message that displays when the imagedefinitions.json file is missing.   
![\[\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/ecrbuild-ecsdebug.png)

## Step 4: Test your pipeline
<a name="tutorials-ecr-build-publish-test"></a>

Your pipeline now has everything you need to run an end-to-end native AWS continuous deployment. Test its functionality by pushing a code change to your source repository.

**To test your pipeline**

1. Make a code change to your configured source repository, commit, and push the change.

1. Open the CodePipeline console at [https://console.aws.amazon.com/codepipeline/](https://console.aws.amazon.com/codepipeline/).

1. Choose your pipeline from the list.

1. Watch the pipeline progress through its stages. Your pipeline should complete, and your action pushes the Docker image created from your code change to Amazon ECR.

# Tutorial: Deploy to Amazon EKS with CodePipeline
<a name="tutorials-eks-deploy"></a>

This tutorial helps you to create a deploy action in CodePipeline that deploys your code to a cluster you have configured in Amazon EKS.

The EKS action supports both public and private EKS clusters. EKS recommends private clusters, but both types are supported.

**Note**  
As part of creating a pipeline in the console, an S3 artifact bucket will be used by CodePipeline for artifacts. (This is different from the bucket used for an S3 source action.) If the S3 artifact bucket is in a different account from the account for your pipeline, make sure that the S3 artifact bucket is owned by an AWS account that you trust.

**Note**  
This action uses CodePipeline managed CodeBuild compute to run commands in a build environment. Running this action will incur separate charges in AWS CodeBuild.

**Note**  
The `EKS` deploy action is only available for V2 type pipelines.

## Prerequisites
<a name="tutorials-eks-deploy-prereqs"></a>

There are a few resources that you must have in place before you can use this tutorial to create your CD pipeline. Here are the things you need to get started:

**Note**  
All of these resources should be created within the same AWS Region.
+ A source control repository (this tutorial uses GitHub) where you will add a sample `deployment.yaml` file.
+ You must use an existing CodePipeline service role that you will update with the permissions for this action using  [Step 3: Update the CodePipeline service role policy in IAM](#tutorials-eks-deploy-role) below. The permissions needed are based on the type of cluster you create. For more information, see [Service role policy permissions](action-reference-EKS.md#action-reference-EKS-service-role).
+ A working image and repository tag that you have pushed to ECR or your image repository.

After you have satisfied these prerequisites, you can proceed with the tutorial and create your CD pipeline.

## Step 1: (Optional) Create a cluster in Amazon EKS
<a name="tutorials-eks-deploy-cluster"></a>

You can choose to create an EKS cluster with a public or private endpoint. 

In the following steps, you create a public or a private cluster in EKS. This step is optional if you have already created your cluster.

### Create a public cluster in Amazon EKS
<a name="tutorials-eks-deploy-cluster-public"></a>

In this step, you create a cluster in EKS.

**Create a public cluster**

1. Open the EKS console, and then choose **Create cluster**.

1. In **Name**, name your cluster. Choose **Next**.

1. Choose **Create**.

### Create a private cluster in Amazon EKS
<a name="tutorials-eks-deploy-cluster-private"></a>

If you choose to create a cluster with a private endpoint, make sure to attach only the private subnets, and make sure that they have internet access.

Follow the next five sub-steps for creating a cluster with a private endpoint.

**Create a VPC in the console**

1. Open the VPC console, and then choose **Create VPC**.

1. Under **VPC settings**, choose **VPC and more**.

1. Choose to create one public subnet and four private subnets. Choose **Create VPC**.

1. On the subnets page, choose **Private**. 

**Determine the private subnets in your VPC**

1. Navigate to your VPC and choose the VPC ID to open the VPC details page.

1. On the VPC details page, choose the **Resource map** tab.

1. View the diagram and make a note of your private subnets. The subnets display with labels to indicate public or private status, and each subnet is mapped to a route table.  
![\[\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/eks-deploy-subnets.png)

   Note that a private cluster will have all private subnets.

1. Create a public subnet to host the NAT gateway. You can attach only one internet gateway to a VPC at a time.

**Create a NAT gateway in the public subnet**

1. In the public subnet, create a NAT gateway. The NAT gateway requires an internet gateway attached to your VPC. Navigate to the VPC console, and then choose **Internet gateways**. Choose **Create internet gateway**.

1. In **Name**, enter a name for your internet gateway. Choose **Create internet gateway**.

Update the route table for the private subnet to direct traffic to the NAT gateway.

**Add the NAT gateway to your route tables for your private subnets**

1. Navigate to the VPC console, and then choose **Subnets**.

1. For each private subnet, choose it, and then on the details page, choose the route table for that subnet. Choose **Edit route table**. 

1. Update the route table for the private subnet to direct internet traffic to the NAT gateway. Choose **Add route**, choose **NAT gateway** as the target, and then choose the gateway that you created.

1. For the public subnet, create a custom route table. Verify that the network access control list (ACL) for your public subnet allows inbound traffic from the private subnet.

1. Choose **Save changes**.

In this step, you create a cluster in EKS.

**Create a private cluster**

1. Open the EKS console, and then choose **Create cluster**.

1. In **Name**, name your cluster. Choose **Next**.

1. Specify your VPC and other configuration information. Choose **Create**.

Your EKS cluster can be a public or a private cluster. If your cluster has only a private endpoint, make sure to complete the configuration in the next step.

## Step 2: Configure your private cluster in Amazon EKS
<a name="tutorials-eks-deploy-cluster-private-configure"></a>

This step applies only if you have created a cluster with only a private endpoint. 

**Configure your cluster**

1. Attach private subnets only in the EKS cluster under the **Networking** tab. Attach the private subnets captured in the **Determine the private subnets in your VPC** section under [Step 1: (Optional) Create a cluster in Amazon EKS](#tutorials-eks-deploy-cluster).

1. Make sure that the private subnets have access to the internet since CodePipeline stores and retrieves artifacts from the S3 artifact bucket for your pipeline.

## Step 3: Update the CodePipeline service role policy in IAM
<a name="tutorials-eks-deploy-role"></a>

In this step, you will update an existing CodePipeline service role, such as `cp-service-role`, with the permissions required by CodePipeline to connect to your cluster. If you do not have an existing role, create a new one.

Update your CodePipeline service role with the following steps.

**To update your CodePipeline service role policy**

1. Open the IAM console at [https://console.aws.amazon.com/iam/](https://console.aws.amazon.com/iam/).

1. From the console dashboard, choose **Roles**.

1. Look up your CodePipeline service role, such as `cp-service-role`.

1. Add a new inline policy.

1. In the **Policy editor**, enter the following.
   + For a public cluster, add the following permissions.

------
#### [ JSON ]

****  

     ```
     {
         "Version":"2012-10-17",		 	 	 
     
         "Statement": [
             {
                 "Sid": "EksClusterPolicy",
                 "Effect": "Allow",
                 "Action": "eks:DescribeCluster",
                 "Resource": "arn:aws:eks:us-east-1:111122223333:cluster/my-cluster"
             },
             {
                 "Sid": "EksVpcClusterPolicy",
                 "Effect": "Allow",
                 "Action": [
                     "ec2:DescribeDhcpOptions",
                     "ec2:DescribeNetworkInterfaces",
                     "ec2:DescribeRouteTables",
                     "ec2:DescribeSubnets",
                     "ec2:DescribeSecurityGroups",
                     "ec2:DescribeVpcs"
                 ],
                 "Resource": [
                     "*"
                 ]
             }
         ]
     }
     ```

------
   + For a private cluster, add the following permissions. Private clusters require additional permissions for your VPC, where applicable.

------
#### [ JSON ]

****  

     ```
     {
         "Version":"2012-10-17",		 	 	 
     
         "Statement": [
             {
                 "Sid": "EksClusterPolicy",
                 "Effect": "Allow",
                 "Action": "eks:DescribeCluster",
                 "Resource": "arn:aws:eks:us-east-1:111122223333:cluster/my-cluster"
             },
             {
                 "Sid": "EksVpcClusterPolicy",
                 "Effect": "Allow",
                 "Action": [
                     "ec2:DescribeDhcpOptions",
                     "ec2:DescribeNetworkInterfaces",
                     "ec2:DescribeRouteTables",
                     "ec2:DescribeSubnets",
                     "ec2:DescribeSecurityGroups",
                     "ec2:DescribeVpcs"
                 ],
                 "Resource": [
                     "*"
                 ]
             },
             {
                 "Effect": "Allow",
                 "Action": "ec2:CreateNetworkInterface",
                 "Resource": "*",
                 "Condition": {
                     "StringEqualsIfExists": {
                         "ec2:Subnet": [
                             "arn:aws:ec2:us-east-1:ACCOUNT-ID:subnet/subnet-03ebd65daeEXAMPLE",
                             "arn:aws:ec2:us-east-1:ACCOUNT-ID:subnet/subnet-0e377f6036EXAMPLE",
                             "arn:aws:ec2:us-east-1:ACCOUNT-ID:subnet/subnet-0db658ba1cEXAMPLE",
                             "arn:aws:ec2:us-east-1:ACCOUNT-ID:subnet/subnet-0db658ba1cEXAMPLE"
                         ]
                     }
                 }
             },
             {
                 "Effect": "Allow",
                 "Action": "ec2:CreateNetworkInterfacePermission",
                 "Resource": "*",
                 "Condition": {
                     "ArnEquals": {
                         "ec2:Subnet": [
                             "arn:aws:ec2:us-east-1:111122223333:subnet/subnet-03ebd65daeEXAMPLE",
                             "arn:aws:ec2:us-east-1:111122223333:subnet/subnet-0e377f6036EXAMPLE",
                             "arn:aws:ec2:us-east-1:111122223333:subnet/subnet-0db658ba1cEXAMPLE",
                             "arn:aws:ec2:us-east-1:111122223333:subnet/subnet-0db658ba1cEXAMPLE"
                         ]
                     }
                 }
             },
             {
                 "Effect": "Allow",
                 "Action": "ec2:DeleteNetworkInterface",
                 "Resource": "*",
                 "Condition": {
                     "StringEqualsIfExists": {
                         "ec2:Subnet": [
                             "arn:aws:ec2:us-east-1:ACCOUNT-ID:subnet/subnet-03ebd65daeEXAMPLE",
                             "arn:aws:ec2:us-east-1:ACCOUNT-ID:subnet/subnet-0e377f6036EXAMPLE",
                             "arn:aws:ec2:us-east-1:ACCOUNT-ID:subnet/subnet-0db658ba1cEXAMPLE",
                             "arn:aws:ec2:us-east-1:ACCOUNT-ID:subnet/subnet-0db658ba1cEXAMPLE"
                         ]
                     }
                 }
             }
         ]
     }
     ```

------

1. Choose **Update policy**.
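As an alternative to the console, the inline policy can be attached with the AWS CLI; this sketch assumes the policy JSON above is saved locally as `eks-policy.json` and that your role is named `cp-service-role`.

```shell
# Attach the inline policy to the CodePipeline service role.
aws iam put-role-policy \
    --role-name cp-service-role \
    --policy-name EksClusterPolicy \
    --policy-document file://eks-policy.json
```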

## Step 4: Create an access entry for the CodePipeline service role
<a name="tutorials-eks-deploy-access-entry"></a>

In this step, you create an access entry on your cluster that will add the CodePipeline service role you updated in Step 3, along with a managed access policy.

1. Open the EKS console and navigate to your cluster.

1. Choose the **Access** tab.

1. Under **IAM access entries**, choose **Create access entry**.

1. In **IAM principal ARN**, enter the role you just updated for the action, such as `cp-service-role`. Choose **Next**.

1. On the **Step 2: Add access policy** page, in **Policy name**, choose the managed policy for access, such as `AmazonEKSClusterAdminPolicy`. Choose **Add policy**. Choose **Next**.
**Note**  
This is the policy that the CodePipeline action uses to communicate with Kubernetes. As a best practice, attach a custom policy that scopes permissions down to least privilege instead of the administrative policy.

1. On the review page, choose **Create**.
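The access entry and policy association can also be sketched with the AWS CLI; the cluster name and role ARN below are placeholders.

```shell
# Create an access entry for the CodePipeline service role.
aws eks create-access-entry \
    --cluster-name my-cluster \
    --principal-arn arn:aws:iam::111122223333:role/cp-service-role

# Associate the managed cluster admin access policy with the entry.
aws eks associate-access-policy \
    --cluster-name my-cluster \
    --principal-arn arn:aws:iam::111122223333:role/cp-service-role \
    --policy-arn arn:aws:eks::aws:cluster-access-policy/AmazonEKSClusterAdminPolicy \
    --access-scope type=cluster
```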

## Step 5: Create a source repository and add the Helm chart config files
<a name="tutorials-eks-deploy-source"></a>

In this step, you create a config file that is appropriate for your action (Kubernetes manifest files or Helm chart) and store the config file in your source repository. Use the appropriate file for your configuration. For more information, see [https://kubernetes.io/docs/reference/kubectl/quick-reference/](https://kubernetes.io/docs/reference/kubectl/quick-reference/) or [https://helm.sh/docs/topics/charts/](https://helm.sh/docs/topics/charts/).
+ For Kubernetes, use a manifest file.
+ For Helm, use a Helm chart.

1. Create or use an existing GitHub repository.

1. Create a new structure in your repository for your Helm chart files as shown in the example below.

   ```
   mychart
   |-- Chart.yaml
   |-- charts
   |-- templates
   |   |-- NOTES.txt
   |   |-- _helpers.tpl
   |   |-- deployment.yaml
   |   |-- ingress.yaml
   |   `-- service.yaml
   `-- values.yaml
   ```

1. Add the file to the root level of your repository.
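If you don't have chart files yet, `helm create` scaffolds a chart with the structure shown above; `mychart` matches the example name used in this tutorial.

```shell
# Scaffold a chart skeleton (Chart.yaml, values.yaml, templates/, charts/).
helm create mychart

# Commit the chart at the root level of your repository.
git add mychart
git commit -m "Add Helm chart"
git push
```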

## Step 6: Create your pipeline
<a name="tutorials-eks-deploy-pipeline"></a>

Use the CodePipeline wizard to create your pipeline stages and connect your source repository.

**To create your pipeline**

1. Open the CodePipeline console at [https://console.aws.amazon.com/codepipeline/](https://console.aws.amazon.com/codepipeline/).

1. On the **Welcome** page, **Getting started** page, or the **Pipelines** page, choose **Create pipeline**.

1. On the **Step 1: Choose creation option** page, under **Creation options**, choose the **Build custom pipeline** option. Choose **Next**.

1. In **Step 2: Choose pipeline settings**, in **Pipeline name**, enter **MyEKSPipeline**.

1. CodePipeline provides V1 and V2 type pipelines, which differ in characteristics and price. The V2 type is the only type you can choose in the console. For more information, see [pipeline types](https://docs.aws.amazon.com/codepipeline/latest/userguide/pipeline-types-planning.html?icmpid=docs_acp_help_panel). For information about pricing for CodePipeline, see [Pricing](https://aws.amazon.com/codepipeline/pricing/).

1. In **Service role**, choose the service role that you updated in Step 3.

1. Leave the settings under **Advanced settings** at their defaults, and then choose **Next**.

1. On the **Step 3: Add source stage** page, for **Source provider**, choose **GitHub (via GitHub App)**, and then create or choose a connection to your GitHub repository.

1. On the **Step 4: Add build stage** page, choose **Skip**.

1. On the **Step 5: Add deploy stage** page, choose **Amazon EKS**.  
![\[Deploy configuration form with Helm selected, showing fields for release name and chart location.\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/eks-action-example.png)

   1. Under **Deploy configuration type**, choose **Helm**.

   1. In **Release name**, enter the release name, such as `my-release`. For **Helm chart location**, enter the path for your Helm chart files, such as `mychart`.

   1. Choose **Next**.

1. On the **Step 6: Review** page, review your pipeline configuration and choose **Create pipeline** to create the pipeline.  
![\[\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/eks-deploy-pipeline.png)

1. After the pipeline runs successfully, choose **View details** on the action to view its logs and output.
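You can also verify the run from the AWS CLI; the pipeline name matches the one created in this tutorial.

```shell
# Show the latest status of each stage and action in the pipeline.
aws codepipeline get-pipeline-state --name MyEKSPipeline
```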

# Tutorial: Create a pipeline that runs commands with compute (V2 type)
<a name="tutorials-commands"></a>

In this tutorial, you configure a pipeline that continuously runs provided build commands using the Commands action in a build stage. For more information about the Commands action, see [Commands action reference](action-reference-Commands.md).

**Important**  
As part of creating a pipeline, CodePipeline uses an S3 artifact bucket provided by the customer for artifacts. (This is different from the bucket used for an S3 source action.) If the S3 artifact bucket is in a different account from the account for your pipeline, make sure that the bucket is owned by an AWS account that you trust.

## Prerequisites
<a name="tutorials-commands-prereq"></a>

You must already have the following:
+ A GitHub repository. You can use the GitHub repository you created in [Tutorial: Use full clone with a GitHub pipeline source](tutorials-github-gitclone.md).

## Step 1: Create source files and push to your GitHub repository
<a name="tutorials-commands-push"></a>

In this section, you create and push your example source files to the repository that the pipeline uses for your source stage. For this example, you produce and push the following: 
+ A `README.txt` file.

**To create source files**

1. Create a file with the following text:

   ```
   Sample readme file
   ```

1. Save the file as `README.txt`.

**To push files to your GitHub repository**

1. Push or upload the files to your repository. The **Create Pipeline** wizard uses these files as the source artifact for your pipeline in AWS CodePipeline. Your files should look like this in your local directory:

   ```
   README.txt
   ```

1. To use the Git command line from a cloned repository on your local computer:

   1. Run the following command to stage all of your files at once:

      ```
      git add -A
      ```

   1. Run the following command to commit the files with a commit message.

      ```
      git commit -m "Added source files"
      ```

   1. Run the following command to push the files from your local repo to your repository:

      ```
      git push
      ```

## Step 2: Create your pipeline
<a name="tutorials-commands-pipeline"></a>

In this section, you create a pipeline with the following actions:
+ A source stage with a GitHub (via GitHub App) action for the repository where the source files are stored.
+ A build stage with the Commands action.

**To create a pipeline with the wizard**

1. Sign in to the AWS Management Console and open the CodePipeline console at [http://console.aws.amazon.com/codesuite/codepipeline/home](http://console.aws.amazon.com/codesuite/codepipeline/home).

1. On the **Welcome** page, **Getting started** page, or the **Pipelines** page, choose **Create pipeline**.

1. On the **Step 1: Choose creation option** page, under **Creation options**, choose the **Build custom pipeline** option. Choose **Next**.

1. In **Step 2: Choose pipeline settings**, in **Pipeline name**, enter **MyCommandsPipeline**.

1. CodePipeline provides V1 and V2 type pipelines, which differ in characteristics and price. The V2 type is the only type you can choose in the console. For more information, see [pipeline types](https://docs.aws.amazon.com/codepipeline/latest/userguide/pipeline-types-planning.html?icmpid=docs_acp_help_panel). For information about pricing for CodePipeline, see [Pricing](https://aws.amazon.com/codepipeline/pricing/).

1. In **Service role**, choose **New service role** to allow CodePipeline to create a service role in IAM.
**Note**  
If you are using an existing service role, to use the Commands action, you will need to add the following permissions for the service role. Scope down the permissions to the pipeline resource level by using resource-based permissions in the service role policy statement. For more information, see the policy example in [Service role policy permissions](action-reference-Commands.md#action-reference-Commands-policy).  
logs:CreateLogGroup
logs:CreateLogStream
logs:PutLogEvents

1. Leave the settings under **Advanced settings** at their defaults, and then choose **Next**.

1. On the **Step 3: Add source stage** page, add a source stage:

   1. In **Source provider**, choose **GitHub (via GitHub App)**.

   1. Under **Connection**, choose an existing connection or create a new one. To create or manage a connection for your GitHub source action, see [GitHub connections](connections-github.md).

   1. In **Repository name**, choose the name of your GitHub.com repository. 

   1. In **Default branch**, choose the branch that you want to specify when the pipeline is started manually or with a source event that is not a Git tag. If the source of the change is not the trigger or if a pipeline execution was started manually, then the change used will be the HEAD commit from the default branch. Optionally, you can also specify webhooks with filtering (triggers). For more information, see [Automate starting pipelines using triggers and filtering](pipelines-triggers.md).

   Choose **Next**.

1. In **Step 4: Add build stage**, choose **Commands**.
**Note**  
Running the Commands action will incur separate charges in AWS CodeBuild.

   Enter the following commands: 

   ```
   ls
   echo hello world
   cat README.txt
   echo pipeline Execution Id is #{codepipeline.PipelineExecutionId}
   ```

   Choose **Next**.  
![\[The Step 4: Add build stage page for a new pipeline with the Commands action\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/commands-wizard-screen.png)

1. In **Step 5: Add test stage**, choose **Skip test stage**, and then accept the warning message by choosing **Skip** again. 

   Choose **Next**.

1. In **Step 6: Add deploy stage**, choose **Skip deploy stage**, and then accept the warning message by choosing **Skip** again. 

   Choose **Next**.

1. In **Step 7: Review**, review the information, and then choose **Create pipeline**.

1. As a final step, edit the action to add an environment variable that produces an output variable for the action. On the Commands action, choose **Edit**. On the **Edit** screen, specify a variable namespace for your action by entering `compute` in the **Variable namespace** field.

   Add the CodeBuild output variable `AWS_DEFAULT_REGION`, and then choose **Add variable**.  
![\[The Edit page for the Commands action\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/commands-output-edit-var.png)

## Step 3: Run your pipeline and verify build commands
<a name="tutorials-commands-update"></a>

Release a change to run your pipeline. Verify that the build commands ran by viewing the execution history, the build logs, and the output variables.

**To view action logs and output variables**

1. After the pipeline runs successfully, you can view the logs and output for the action.

1. To view the output variables for the action, choose **History**, and then choose **Timeline**. 

   View the output variable that was added to the action. The output for the Commands action shows the output variable resolved to the action Region.  
![\[The output for the Commands action showing the output variable resolved to the action Region\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/commands-output-variable.png)

1. To view the logs for the action, choose **View details** on the successful Commands action, and then review the Commands action logs.  
![\[Example logs for the Commands action\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/commands-output-logs.png)
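The same execution details, including resolved output variables, are available from the AWS CLI; the pipeline name matches the one created in this tutorial.

```shell
# List recent action executions, including output variables, for the pipeline.
aws codepipeline list-action-executions --pipeline-name MyCommandsPipeline
```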

# Tutorial: Use Git tags to start your pipeline
<a name="tutorials-github-tags"></a>

In this tutorial, you will create a pipeline that connects to your GitHub repository where the source action is configured for the Git tags trigger type. When a Git tag is created on a commit, your pipeline starts. This example shows you how to create a pipeline that allows filtering for tags based on the syntax of the tag name. For more information about filtering with glob patterns, see [Working with glob patterns in syntax](syntax-glob.md).

**Important**  
As part of creating a pipeline, CodePipeline uses an S3 artifact bucket provided by the customer for artifacts. (This is different from the bucket used for an S3 source action.) If the S3 artifact bucket is in a different account from the account for your pipeline, make sure that the bucket is owned by an AWS account that you trust.

This tutorial connects to GitHub through the `CodeStarSourceConnection` action type.

**Note**  
This feature is not available in the Asia Pacific (Hong Kong), Africa (Cape Town), Middle East (Bahrain), or Europe (Zurich) Regions. To reference other available actions, see [Product and service integrations with CodePipeline](integrations.md). For considerations with this action in the Europe (Milan) Region, see the note in [CodeStarSourceConnection for Bitbucket Cloud, GitHub, GitHub Enterprise Server, GitLab.com, and GitLab self-managed actions](action-reference-CodestarConnectionSource.md).

**Topics**
+ [

## Prerequisites
](#tutorials-github-tags-prereq)
+ [

## Step 1: Open CloudShell and clone your repository
](#w2aac13c16c15)
+ [

## Step 2: Create a pipeline to trigger on Git tags
](#tutorials-github-tags-pipeline)
+ [

## Step 3: Tag your commits for release
](#w2aac13c16c19)
+ [

## Step 4: Release change and view logs
](#tutorials-github-tags-view)

## Prerequisites
<a name="tutorials-github-tags-prereq"></a>

Before you begin, you must do the following:
+ Create a GitHub repository with your GitHub account.
+ Have your GitHub credentials ready. When you use the AWS Management Console to set up a connection, you are asked to sign in with your GitHub credentials. 

## Step 1: Open CloudShell and clone your repository
<a name="w2aac13c16c15"></a>

You can use a command line interface to clone your repository, make commits, and add tags. This tutorial launches a CloudShell instance for the command line interface.

1. Sign in to the AWS Management Console.

1. In the top navigation bar, choose the AWS icon. The main page of the AWS Management Console displays.

1. In the top navigation bar, choose the AWS CloudShell icon. CloudShell opens. Wait while the CloudShell environment is created.
**Note**  
If you don't see the CloudShell icon, make sure that you're in a [Region supported by CloudShell](https://docs.aws.amazon.com/cloudshell/latest/userguide/faq-list.html#regions-available). This tutorial assumes you are in the US West (Oregon) Region.

1. In GitHub, navigate to your repository. Choose **Code**, and then choose **HTTPS**. Copy the path. The address to clone your Git repository is copied to your clipboard.

1. Run the following command to clone the repository.

   ```
   git clone https://github.com/<account>/MyGitHubRepo.git
   ```

1. Enter your GitHub account `Username` and `Password` when prompted. For the `Password` entry, you must use a personal access token rather than your account password.

## Step 2: Create a pipeline to trigger on Git tags
<a name="tutorials-github-tags-pipeline"></a>

In this section, you create a pipeline with the following actions:
+ A source stage with a connection to your GitHub repository and action.
+ A build stage with an AWS CodeBuild build action.

**To create a pipeline with the wizard**

1. Sign in to the CodePipeline console at [https://console.aws.amazon.com/codepipeline/](https://console.aws.amazon.com/codepipeline/).

1. On the **Welcome** page, **Getting started** page, or **Pipelines** page, choose **Create pipeline**.

1. On the **Step 1: Choose creation option** page, under **Creation options**, choose the **Build custom pipeline** option. Choose **Next**.

1. In **Step 2: Choose pipeline settings**, in **Pipeline name**, enter **MyGitHubTagsPipeline**.

1. In **Pipeline type**, keep the default selection at **V2**. Pipeline types differ in characteristics and price. For more information, see [Pipeline types](pipeline-types.md).

1. In **Service role**, choose **New service role**.
**Note**  
If you choose instead to use your existing CodePipeline service role, make sure that you have added the `codestar-connections:UseConnection` IAM permission to your service role policy. For instructions for the CodePipeline service role, see [Add permissions to the CodePipeline service role](https://docs.aws.amazon.com/codepipeline/latest/userguide/security-iam.html#how-to-update-role-new-services).

1. Under **Advanced settings**, leave the defaults. In **Artifact store**, choose **Default location** to use the default artifact store, such as the Amazon S3 artifact bucket designated as the default, for your pipeline in the Region you selected for your pipeline.
**Note**  
This is not the source bucket for your source code. This is the artifact store for your pipeline. A separate artifact store, such as an S3 bucket, is required for each pipeline.

   Choose **Next**.

1. On the **Step 3: Add source stage** page, add a source stage:

   1. In **Source provider**, choose **GitHub (via GitHub App)**.

   1. Under **Connection**, choose an existing connection or create a new one. To create or manage a connection for your GitHub source action, see [GitHub connections](connections-github.md).

   1. In **Repository name**, choose the name of your GitHub repository.

   1. In **Default branch**, choose the branch that you want to specify when the pipeline is started manually or with a source event that is not a Git tag. If the source of the change is not the trigger or if a pipeline execution was started manually, then the change used will be the HEAD commit from the default branch.

   1. Under **Webhook events**, in **Filter type**, choose **Tags**.

      In the **Tags or patterns** field, enter `release*`.
**Important**  
Pipelines that start with a trigger type of Git tags will be configured for WebhookV2 events and will not use the Webhook event (change detection on all push events) to start the pipeline.

   Choose **Next**.

1. In **Add build stage**, add a build stage:

   1. In **Build provider**, choose **AWS CodeBuild**. Allow **Region** to default to the pipeline Region.

   1. Choose **Create project**.

   1. In **Project name**, enter a name for this build project.

   1. In **Environment image**, choose **Managed image**. For **Operating system**, choose **Ubuntu**.

   1. For **Runtime**, choose **Standard**. For **Image**, choose **aws/codebuild/standard:5.0**.

   1. For **Service role**, choose **New service role**.
**Note**  
Note the name of your CodeBuild service role. You will need the role name for the final step in this tutorial.

   1. Under **Buildspec**, for **Build specifications**, choose **Insert build commands**. Choose **Switch to editor**, and paste the following under **Build commands**.

      ```
      version: 0.2
      #env:
        #variables:
           # key: "value"
           # key: "value"
        #parameter-store:
           # key: "value"
           # key: "value"
        #git-credential-helper: yes
      phases:
        install:
          #If you use the Ubuntu standard image 2.0 or later, you must specify runtime-versions.
          #If you specify runtime-versions and use an image other than Ubuntu standard image 2.0, the build fails.
          runtime-versions:
            nodejs: 12
          #commands:
            # - command
            # - command
        #pre_build:
          #commands:
            # - command
            # - command
        build:
          commands:
            - 
        #post_build:
          #commands:
            # - command
            # - command
      artifacts:
        files:
           - '*'
          # - location
        name: $(date +%Y-%m-%d)
        #discard-paths: yes
        #base-directory: location
      #cache:
        #paths:
          # - paths
      ```

   1. Choose **Continue to CodePipeline**. This returns you to the CodePipeline console and creates a CodeBuild project that uses your build commands for configuration. The build project uses a service role to manage AWS service permissions. This step might take a couple of minutes.

   1. Choose **Next**.

1. In **Step 5: Add test stage**, choose **Skip test stage**, and then accept the warning message by choosing **Skip** again. 

   Choose **Next**.

1. On the **Step 6: Add deploy stage** page, choose **Skip deploy stage**, and then accept the warning message by choosing **Skip** again. Choose **Next**.

1. On **Step 7: Review**, choose **Create pipeline**.

## Step 3: Tag your commits for release
<a name="w2aac13c16c19"></a>

After you create your pipeline and specify the Git tags trigger, you can tag commits in your GitHub repository. In these steps, you will tag a commit with the `release-1` tag. Each Git tag in a repository must have a unique name. Choosing the commit to tag allows you to incorporate changes from different branches into your pipeline deployment. Note that the tag name `release-1` is unrelated to the concept of a release in GitHub.

1. Reference the copied commit IDs you want to tag. To view commits in each branch, in the CloudShell terminal, enter the following command to capture the commit IDs you want to tag: 

   ```
   git log
   ```

1. In the CloudShell terminal, enter the following commands to tag your commit and push the tag to origin. In this example, the `release-1` tag is applied to the second commit, with ID `49366bd`. This tag matches the pipeline's `release*` tag filter and starts the pipeline.

   ```
   git tag release-1 49366bd
   ```

   ```
   git push origin release-1
   ```  
![\[\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/git-tags-pipeline.png)
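To confirm that the tag reached the remote and matches the pipeline's `release*` filter, you can list the matching tags on origin:

```shell
# List tags on origin that match the pipeline's filter pattern.
git ls-remote --tags origin 'release*'
```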

## Step 4: Release change and view logs
<a name="tutorials-github-tags-view"></a>

1. After the pipeline runs successfully, on your successful build stage, choose **View log**.

   Under **Logs**, view the CodeBuild build output. The commands output the value of the entered variable.

1. In the **History** page, view the **Triggers** column. View the trigger type **GitTag : release-1**.
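Trigger details are also visible from the AWS CLI; the pipeline name matches the one created in this tutorial.

```shell
# Each execution's trigger (for example, a Git tag) appears in the output.
aws codepipeline list-pipeline-executions --pipeline-name MyGitHubTagsPipeline
```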

# Tutorial: Filter on branch names for pull requests to start your pipeline (V2 type)
<a name="tutorials-github-featurebranches"></a>

In this tutorial, you will create a pipeline that connects to your GitHub.com repository where the source action is configured to start your pipeline with a trigger configuration that filters on pull requests. When a specified pull request event occurs for a specified branch, your pipeline starts. This example shows you how to create a pipeline that allows filtering for branch names. For more information about working with triggers, see [Add filters for push and pull request event types (CLI)](pipelines-filter.md#pipelines-filter-cli). For more information about filtering with glob patterns, see [Working with glob patterns in syntax](syntax-glob.md).

**Important**  
As part of creating a pipeline, CodePipeline uses an S3 artifact bucket provided by the customer for artifacts. (This is different from the bucket used for an S3 source action.) If the S3 artifact bucket is in a different account from the account for your pipeline, make sure that the bucket is owned by an AWS account that you trust.

This tutorial connects to GitHub.com through the `CodeStarSourceConnection` action type.

**Topics**
+ [

## Prerequisites
](#tutorials-github-featurebranches-prereq)
+ [

## Step 1: Create a pipeline to start on pull request for specified branches
](#tutorials-github-featurebranches-pipeline)
+ [

## Step 2: Create and merge a pull request in GitHub.com to start your pipeline executions
](#tutorials-github-featurebranches-pullrequest)

## Prerequisites
<a name="tutorials-github-featurebranches-prereq"></a>

Before you begin, you must do the following:
+ Create a GitHub.com repository with your GitHub.com account.
+ Have your GitHub credentials ready. When you use the AWS Management Console to set up a connection, you are asked to sign in with your GitHub credentials. 

## Step 1: Create a pipeline to start on pull request for specified branches
<a name="tutorials-github-featurebranches-pipeline"></a>

In this section, you create a pipeline with the following actions:
+ A source stage with a connection to your GitHub.com repository and action.
+ A build stage with an AWS CodeBuild build action.

**To create a pipeline with the wizard**

1. Sign in to the CodePipeline console at [https://console.aws.amazon.com/codepipeline/](https://console.aws.amazon.com/codepipeline/).

1. On the **Welcome** page, **Getting started** page, or **Pipelines** page, choose **Create pipeline**.

1. On the **Step 1: Choose creation option** page, under **Creation options**, choose the **Build custom pipeline** option. Choose **Next**.

1. In **Step 2: Choose pipeline settings**, in **Pipeline name**, enter **MyFilterBranchesPipeline**.

1. In **Pipeline type**, keep the default selection at **V2**. Pipeline types differ in characteristics and price. For more information, see [Pipeline types](pipeline-types.md).

1. In **Service role**, choose **New service role**.
**Note**  
If you choose instead to use your existing CodePipeline service role, make sure that you have added the `codeconnections:UseConnection` IAM permission to your service role policy. For instructions for the CodePipeline service role, see [Add permissions to the CodePipeline service role](https://docs.aws.amazon.com/codepipeline/latest/userguide/security-iam.html#how-to-update-role-new-services).

1. Under **Advanced settings**, leave the defaults. In **Artifact store**, choose **Default location** to use the default artifact store, such as the Amazon S3 artifact bucket designated as the default, for your pipeline in the Region you selected for your pipeline.
**Note**  
This is not the source bucket for your source code. This is the artifact store for your pipeline. A separate artifact store, such as an S3 bucket, is required for each pipeline.

   Choose **Next**.

1. On the **Step 3: Add source stage** page, add a source stage:

   1. In **Source provider**, choose **GitHub (via GitHub App)**.

   1. Under **Connection**, choose an existing connection or create a new one. To create or manage a connection for your GitHub source action, see [GitHub connections](connections-github.md).

   1. In **Repository name**, choose the name of your GitHub.com repository.

   1. Under **Trigger type**, choose **Specify filter**.

      Under **Event type**, choose **Pull request**. Select all of the events under pull request so that the event occurs for created, updated, or closed pull requests.

      Under **Branches**, in the **Include** field, enter `main*`.  
![\[Image showing the Include branches option selected with a value of main* for a trigger with an event type of Pull request\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/pullreq-example-triggers-edit.png)
**Important**  
Pipelines that start with this trigger type will be configured for WebhookV2 events and will not use the Webhook event (change detection on all push events) to start the pipeline.

   Choose **Next**.

1. In **Step 4: Add build stage**, in **Build provider**, choose **AWS CodeBuild**. Allow **Region** to default to the pipeline Region. Choose or create the build project as instructed in [Tutorial: Use Git tags to start your pipeline](tutorials-github-tags.md). This action will only be used in this tutorial as the second stage needed to create your pipeline.

1. In **Step 5: Add test stage**, choose **Skip test stage**, and then accept the warning message by choosing **Skip** again. 

   Choose **Next**.

1. On the **Step 6: Add deploy stage** page, choose **Skip deploy stage**, and then accept the warning message by choosing **Skip** again. Choose **Next**.

1. On **Step 7: Review**, choose **Create pipeline**.
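
The pull request filter you configured in the source stage is stored as a trigger in the pipeline definition. As a sketch (the source action name reflects the wizard default; other fields are omitted), the trigger section looks roughly like this:

```
"triggers": [
    {
        "providerType": "CodeStarSourceConnection",
        "gitConfiguration": {
            "sourceActionName": "Source",
            "pullRequest": [
                {
                    "events": ["OPEN", "UPDATED", "CLOSED"],
                    "branches": {
                        "includes": ["main*"]
                    }
                }
            ]
        }
    }
]
```

You can view this section of your pipeline by running `aws codepipeline get-pipeline --name MyFilterBranchesPipeline`.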

## Step 2: Create and merge a pull request in GitHub.com to start your pipeline executions
<a name="tutorials-github-featurebranches-pullrequest"></a>

In this section, you create and merge a pull request. This starts your pipeline, with one execution for the opened pull request and one execution for the closed pull request.
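
If you prefer to prepare the change from the command line, the feature branch in the next procedure can be created with Git before you open the pull request on GitHub.com (the branch name here is illustrative):

```
git checkout -b feature/update-readme
echo "Update for PR" >> README.md
git add README.md
git commit -m "Update README.md for PR"
git push -u origin feature/update-readme
```

After pushing, open a pull request from `feature/update-readme` to `main` in the GitHub.com UI, and later merge it to produce the second pipeline execution.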

**To create a pull request and start your pipeline**

1. In GitHub.com, create a pull request by making a change to the README.md on a feature branch and raising a pull request to the `main` branch. Commit the change with a message like `Update README.md for PR`.

1. The pipeline starts with the source revision showing the **Source** message for the pull request as **Update README.md for PR**.  
![\[Image showing the source message for the Pull request with the following text: Update README.md for PR\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/pullreq-example.png)

1. Choose **History**. In the pipeline execution history, view the CREATED and MERGED pull request status events that started the pipeline executions.  
![\[Image showing the pipeline execution history that shows the CREATED and MERGED pull request status events that started the pipeline executions\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/pullreq-example-history.png)

# Tutorial: Use pipeline-level variables
<a name="tutorials-pipeline-variables"></a>

In this tutorial, you will create a pipeline where you add a variable at the pipeline level and run a CodeBuild build action that outputs your variable value.

**Important**  
As part of creating a pipeline, CodePipeline uses an S3 artifact bucket that you provide for artifacts. (This is different from the bucket used for an S3 source action.) If the S3 artifact bucket is in a different account from the account for your pipeline, make sure that the bucket is owned by an AWS account that you trust.

**Topics**
+ [

## Prerequisites
](#tutorials-pipeline-variables-prereq)
+ [

## Step 1: Create your pipeline and build project
](#tutorials-pipeline-variables-pipeline)
+ [

## Step 2: Release change and view logs
](#tutorials-pipeline-variables-view)

## Prerequisites
<a name="tutorials-pipeline-variables-prereq"></a>

Before you begin, you must do the following:
+ Create a CodeCommit repository.
+ Add a .txt file to the repository.
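
As a sketch, the prerequisites can also be completed from the AWS CLI (the repository and file names are illustrative):

```
aws codecommit create-repository --repository-name MyVariablesRepo
git clone <your-repository-clone-URL>
cd MyVariablesRepo
echo "sample" > sample.txt
git add sample.txt
git commit -m "Add sample.txt"
git push
```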

## Step 1: Create your pipeline and build project
<a name="tutorials-pipeline-variables-pipeline"></a>

In this section, you create a pipeline with the following actions:
+ A source stage with a connection to your CodeCommit repository.
+ A build stage with an AWS CodeBuild build action.

**To create a pipeline with the wizard**

1. Sign in to the CodePipeline console at [https://console.aws.amazon.com/codepipeline/](https://console.aws.amazon.com/codepipeline/).

1. On the **Welcome** page, **Getting started** page, or **Pipelines** page, choose **Create pipeline**.

1. On the **Step 1: Choose creation option** page, under **Creation options**, choose the **Build custom pipeline** option. Choose **Next**.

1. In **Step 2: Choose pipeline settings**, in **Pipeline name**, enter **MyVariablesPipeline**.

1. In **Pipeline type**, keep the default selection at **V2**. Pipeline types differ in characteristics and price. For more information, see [Pipeline types](pipeline-types.md).

1. In **Service role**, choose **New service role**.
**Note**  
If you choose instead to use your existing CodePipeline service role, make sure that you have added the `codeconnections:UseConnection` IAM permission to your service role policy. For instructions for the CodePipeline service role, see [Add permissions to the CodePipeline service role](https://docs.aws.amazon.com/codepipeline/latest/userguide/security-iam.html#how-to-update-role-new-services).

1. Under **Variables**, choose **Add variable**. In **Name**, enter `timeout`. In **Default**, enter `1000`. In **Description**, enter **Timeout**.

   This will create a variable where you can declare the value when the pipeline execution starts. Variable names must match `[A-Za-z0-9@\-_]+` and can be anything except an empty string.

1. Under **Advanced settings**, leave the defaults. In **Artifact store**, choose **Default location** to use the default artifact store (such as the S3 artifact bucket designated as the default) for your pipeline in the Region you selected for your pipeline.
**Note**  
This is not the source bucket for your source code. This is the artifact store for your pipeline. A separate artifact store, such as an S3 bucket, is required for each pipeline.

   Choose **Next**.

1. On the **Step 3: Add source stage** page, add a source stage:

   1. In **Source provider**, choose **AWS CodeCommit**.

   1. In **Repository name** and **Branch name**, choose your repository and branch.

   Choose **Next**.

1. In **Step 4: Add build stage**, add a build stage:

   1. In **Build provider**, choose **AWS CodeBuild**. Allow **Region** to default to the pipeline Region.

   1. Choose **Create project**.

   1. In **Project name**, enter a name for this build project.

   1. In **Environment image**, choose **Managed image**. For **Operating system**, choose **Ubuntu**.

   1. For **Runtime**, choose **Standard**. For **Image**, choose **aws/codebuild/standard:5.0**.

   1. For **Service role**, choose **New service role**.
**Note**  
Note the name of your CodeBuild service role. You will need the role name for the final step in this tutorial.

   1. Under **Buildspec**, for **Build specifications**, choose **Insert build commands**. Choose **Switch to editor**, and paste the following under **Build commands**. In the buildspec, the custom variable `$CUSTOM_VAR1` will be used to output the pipeline variable in the build log. You will create `$CUSTOM_VAR1` as an environment variable in the following step.

      ```
      version: 0.2
      #env:
        #variables:
           # key: "value"
           # key: "value"
        #parameter-store:
           # key: "value"
           # key: "value"
        #git-credential-helper: yes
      phases:
        install:
          #If you use the Ubuntu standard image 2.0 or later, you must specify runtime-versions.
          #If you specify runtime-versions and use an image other than Ubuntu standard image 2.0, the build fails.
          runtime-versions:
            nodejs: 12
          #commands:
            # - command
            # - command
        #pre_build:
          #commands:
            # - command
            # - command
        build:
          commands:
            - echo $CUSTOM_VAR1
        #post_build:
          #commands:
            # - command
            # - command
      artifacts:
        files:
           - '*'
          # - location
        name: $(date +%Y-%m-%d)
        #discard-paths: yes
        #base-directory: location
      #cache:
        #paths:
          # - paths
      ```

   1. Choose **Continue to CodePipeline**. This returns to the CodePipeline console and creates a CodeBuild project that uses your build commands for configuration. The build project uses a service role to manage AWS service permissions. This step might take a couple of minutes.

   1. Under **Environment variables** (*optional*), to create an environment variable as an input variable for the build action that will be resolved by the pipeline-level variable, choose **Add environment variable**. This creates the variable specified in the buildspec as `$CUSTOM_VAR1`. In **Name**, enter `CUSTOM_VAR1`. In **Value**, enter `#{variables.timeout}`. In **Type**, choose `Plaintext`.

      The `#{variables.timeout}` value for the environment variable is based on the pipeline-level variable namespace `variables` and the pipeline-level variable `timeout` created for the pipeline in step 7.

   1. Choose **Next**.

1. In **Step 5: Add test stage**, choose **Skip test stage**, and then accept the warning message by choosing **Skip** again. 

   Choose **Next**.

1. On the **Step 6: Add deploy stage** page, choose **Skip deploy stage**, and then accept the warning message by choosing **Skip** again. Choose **Next**.

1. On **Step 7: Review**, choose **Create pipeline**.
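
In the pipeline's JSON definition, the environment variable you added to the build action is stored in the action configuration, and the pipeline resolves `#{variables.timeout}` when the execution starts. A rough sketch (the project name is illustrative; other fields are omitted):

```
"configuration": {
    "ProjectName": "MyVariablesProject",
    "EnvironmentVariables": "[{\"name\":\"CUSTOM_VAR1\",\"value\":\"#{variables.timeout}\",\"type\":\"PLAINTEXT\"}]"
}
```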

## Step 2: Release change and view logs
<a name="tutorials-pipeline-variables-view"></a>

1. After the pipeline runs successfully, on your successful build stage, choose **View details**.

   On the details page, choose the **Logs** tab. View the CodeBuild build output. The commands output the value of the entered variable.

1. In the navigation pane, choose **History**.

   Choose the recent execution, and then choose the **Variables** tab. View the resolved value for the pipeline variable.
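
You can also start the pipeline from the AWS CLI and override the variable's default value for that execution; for example:

```
aws codepipeline start-pipeline-execution \
    --name MyVariablesPipeline \
    --variables name=timeout,value=2000
```

With this override, the build log for that execution shows `2000` instead of the default value.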

# Tutorial: Create a simple pipeline (S3 bucket)
<a name="tutorials-simple-s3"></a>

The easiest way to create a pipeline is to use the **Create pipeline** wizard in the AWS CodePipeline console. 

In this tutorial, you create a two-stage pipeline that uses a versioned S3 source bucket and CodeDeploy to release a sample application. 

**Note**  
When Amazon S3 is the source provider for your pipeline, you may zip your source file or files into a single .zip and upload the .zip to your source bucket. You may also upload a single unzipped file; however, downstream actions that expect a .zip file will fail.
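
For example, from the folder that contains your source files, you might package and upload the source as follows (the file and bucket names are illustrative):

```
zip -r MyApp.zip .
aws s3 cp MyApp.zip s3://your-source-bucket/
```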

**Important**  
As part of creating a pipeline, CodePipeline uses an S3 artifact bucket that you provide for artifacts. (This is different from the bucket used for an S3 source action.) If the S3 artifact bucket is in a different account from the account for your pipeline, make sure that the bucket is owned by an AWS account that you trust.

After you create this simple pipeline, you add another stage and then disable and enable the transition between stages.

**Important**  
Many of the actions you add to your pipeline in this procedure involve AWS resources that you need to create before you create the pipeline. AWS resources for your source actions must always be created in the same AWS Region where you create your pipeline. For example, if you create your pipeline in the US East (Ohio) Region, your CodeCommit repository must be in the US East (Ohio) Region.   
You can add cross-region actions when you create your pipeline. AWS resources for cross-region actions must be in the same AWS Region where you plan to execute the action. For more information, see [Add a cross-Region action in CodePipeline](actions-create-cross-region.md).

Before you begin, you should complete the prerequisites in [Getting started with CodePipeline](getting-started-codepipeline.md).

**Topics**
+ [

## Step 1: Create an S3 source bucket for your application
](#s3-create-s3-bucket)
+ [

## Step 2: Create Amazon EC2 Windows instances and install the CodeDeploy agent
](#S3-create-instances)
+ [

## Step 3: Create an application in CodeDeploy
](#S3-create-deployment)
+ [

## Step 4: Create your first pipeline in CodePipeline
](#s3-create-pipeline)
+ [

## (Optional) Step 5: Add another stage to your pipeline
](#s3-add-stage)
+ [

## (Optional) Step 6: Disable and enable transitions between stages in CodePipeline
](#s3-configure-transitions)
+ [

## Step 7: Clean up resources
](#s3-clean-up)

## Step 1: Create an S3 source bucket for your application
<a name="s3-create-s3-bucket"></a>

You can store your source files or applications in any versioned location. In this tutorial, you create an S3 bucket for the sample application files and enable versioning on that bucket. After you have enabled versioning, you copy the sample applications to that bucket. 

**To create an S3 bucket**

1. Sign in to the AWS Management Console and open the Amazon S3 console at [https://console.aws.amazon.com/s3/](https://console.aws.amazon.com/s3/).

1. Choose **Create bucket**.

1. In **Bucket name**, enter a name for your bucket (for example, **awscodepipeline-demobucket-example-date**).
**Note**  
Because all bucket names in Amazon S3 must be unique, use one of your own, not the name shown in the example. You can change the example name just by adding the date to it. Make a note of this name because you need it for the rest of this tutorial.

   In **Region**, choose the Region where you intend to create your pipeline, such as **US West (Oregon)**, and then choose **Create bucket**.

1. After the bucket is created, a success banner displays. Choose **Go to bucket details**.

1. On the **Properties** tab, choose **Versioning**. Choose **Enable versioning**, and then choose **Save**.

   When versioning is enabled, Amazon S3 saves every version of every object in the bucket.

1. On the **Permissions** tab, leave the defaults. For more information about S3 bucket and object permissions, see [Specifying Permissions in a Policy](https://docs.aws.amazon.com/AmazonS3/latest/dev/using-with-s3-actions.html).

1. Next, download a sample and save it into a folder or directory on your local computer.

   1. Choose one of the following. Choose `SampleApp_Windows.zip` if you want to follow the steps in this tutorial for Windows Server instances.
      + If you want to deploy to Amazon Linux instances using CodeDeploy, download the sample application here: [SampleApp_Linux.zip](samples/SampleApp_Linux.zip).
      + If you want to deploy to Windows Server instances using CodeDeploy, download the sample application here: [SampleApp_Windows.zip](samples/SampleApp_Windows.zip).

      The sample application contains the following files for deploying with CodeDeploy: 
      + `appspec.yml` – The application specification file (AppSpec file) is a [YAML](http://www.yaml.org)-formatted file used by CodeDeploy to manage a deployment. For more information about the AppSpec file, see [CodeDeploy AppSpec File reference](https://docs.aws.amazon.com/codedeploy/latest/userguide/reference-appspec-file.html) in the *AWS CodeDeploy User Guide*.
      + `index.html` – The index file contains the home page for the deployed sample application.
      + `LICENSE.txt` – The license file contains license information for the sample application.
      + Files for scripts – The sample application uses scripts to write text files to a location on your instance. One file is written for each of several CodeDeploy deployment lifecycle events as follows:
        + (Linux sample only) `scripts` folder – The folder contains the following shell scripts to install dependencies and start and stop the sample application for the automated deployment: `install_dependencies`, `start_server`, and `stop_server`.
        + (Windows sample only) `before-install.bat` – This is a batch script for the `BeforeInstall` deployment lifecycle event, which will run to remove old files written during previous deployments of this sample and create a location on your instance to which to write the new files.

   1. Download the compressed (zipped) file. Do not unzip the file.

1. In the Amazon S3 console, for your bucket, upload the file: 

   1. Choose **Upload**. 

   1. Drag and drop the file or choose **Add files** and browse for the file.

   1. Choose **Upload**.
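
The steps above can also be performed with the AWS CLI; as a sketch (the bucket name and Region are illustrative):

```
aws s3 mb s3://awscodepipeline-demobucket-example-date --region us-west-2
aws s3api put-bucket-versioning \
    --bucket awscodepipeline-demobucket-example-date \
    --versioning-configuration Status=Enabled
aws s3 cp SampleApp_Windows.zip s3://awscodepipeline-demobucket-example-date/
```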

## Step 2: Create Amazon EC2 Windows instances and install the CodeDeploy agent
<a name="S3-create-instances"></a>

**Note**  
This tutorial provides sample steps for creating Amazon EC2 Windows instances. For sample steps to create Amazon EC2 Linux instances, see [Step 3: Create an Amazon EC2 Linux instance and install the CodeDeploy agent](tutorials-simple-codecommit.md#codecommit-create-deployment). When prompted for the number of instances to create, specify **2** instances.

In this step, you create the Windows Server Amazon EC2 instances to which you will deploy a sample application. As part of this process, you create an instance role with policies that allow installation and management of the CodeDeploy agent on the instances. The CodeDeploy agent is a software package that enables an instance to be used in CodeDeploy deployments. You also attach policies that allow the instance to fetch files that the CodeDeploy agent uses to deploy your application and to be managed by SSM.

**To create an instance role**

1. Open the IAM console at [https://console.aws.amazon.com/iam/](https://console.aws.amazon.com/iam/).

1. From the console dashboard, choose **Roles**.

1. Choose **Create role**.

1. Under **Select type of trusted entity**, select **AWS service**. Under **Choose a use case**, select **EC2**, and then choose **Next: Permissions**.

1. Search for and select the policy named **`AmazonEC2RoleforAWSCodeDeploy`**.

1. Search for and select the policy named **`AmazonSSMManagedInstanceCore`**. Choose **Next: Tags**.

1. Choose **Next: Review**. Enter a name for the role (for example, **EC2InstanceRole**).
**Note**  
Make a note of your role name for the next step. You choose this role when you are creating your instance.

   Choose **Create role**.
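
As a sketch, the same instance role can be created with the AWS CLI; note that EC2 also requires an instance profile to attach the role to instances (the role name is illustrative):

```
aws iam create-role --role-name EC2InstanceRole \
    --assume-role-policy-document '{"Version":"2012-10-17","Statement":[{"Effect":"Allow","Principal":{"Service":"ec2.amazonaws.com"},"Action":"sts:AssumeRole"}]}'
aws iam attach-role-policy --role-name EC2InstanceRole \
    --policy-arn arn:aws:iam::aws:policy/service-role/AmazonEC2RoleforAWSCodeDeploy
aws iam attach-role-policy --role-name EC2InstanceRole \
    --policy-arn arn:aws:iam::aws:policy/AmazonSSMManagedInstanceCore
aws iam create-instance-profile --instance-profile-name EC2InstanceRole
aws iam add-role-to-instance-profile \
    --instance-profile-name EC2InstanceRole --role-name EC2InstanceRole
```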

**To launch instances**

1. Open the Amazon EC2 console at [https://console.aws.amazon.com/ec2/](https://console.aws.amazon.com/ec2/).

1. From the side navigation, choose **Instances**, and select **Launch instances** from the top of the page.

1. Under **Name and tags**, in **Name**, enter **MyCodePipelineDemo**. This assigns the instances a tag **Key** of **Name** and a tag **Value** of **MyCodePipelineDemo**. Later, you create a CodeDeploy application that deploys the sample application to the instances. CodeDeploy selects instances to deploy based on the tags.

1. Under **Application and OS Images (Amazon Machine Image)**, choose the **Windows** option. (This AMI is described as **Microsoft Windows Server 2019 Base**, is labeled "Free tier eligible", and can be found under **Quick Start**.)

1. Under **Instance type**, choose the free tier eligible `t2.micro` type as the hardware configuration for your instance.

1. Under **Key pair (login)**, choose a key pair or create one. 

   You can also choose **Proceed without a key pair**.
**Note**  
For the purposes of this tutorial, you can proceed without a key pair. To use SSH to connect to your instances, create or use a key pair.

1. Under **Network settings**, do the following:
   + In **Auto-assign Public IP**, make sure the status is **Enable**.
   + Next to **Assign a security group**, choose **Create a new security group**.
   + In the row for **SSH**, under **Source type**, choose **My IP**.
   + Choose **Add security group**, choose **HTTP**, and then under **Source type**, choose **My IP**.

1. Expand **Advanced details**. In **IAM instance profile**, choose the IAM role you created in the previous procedure (for example, **EC2InstanceRole**).

1. Under **Summary**, under **Number of instances**, enter `2`.

1. Choose **Launch instance**.

1. Choose **View all instances** to close the confirmation page and return to the console.

1. You can view the status of the launch on the **Instances** page. When you launch an instance, its initial state is `pending`. After the instance starts, its state changes to `running`, and it receives a public DNS name. (If the **Public DNS** column is not displayed, choose the **Show/Hide** icon, and then select **Public DNS**.)

1. It can take a few minutes for the instance to be ready for you to connect to it. Check that your instance has passed its status checks. You can view this information in the **Status Checks** column.
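
As a sketch, an equivalent launch from the AWS CLI would look like the following; the AMI ID is a placeholder for the current Windows Server 2019 Base image in your Region:

```
aws ec2 run-instances \
    --image-id <Windows-Server-2019-Base-AMI-ID> \
    --instance-type t2.micro \
    --count 2 \
    --iam-instance-profile Name=EC2InstanceRole \
    --tag-specifications 'ResourceType=instance,Tags=[{Key=Name,Value=MyCodePipelineDemo}]'
```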

## Step 3: Create an application in CodeDeploy
<a name="S3-create-deployment"></a>

In CodeDeploy, an *application* is an identifier, in the form of a name, for the code you want to deploy. CodeDeploy uses this name to ensure the correct combination of revision, deployment configuration, and deployment group are referenced during a deployment. You select the name of the CodeDeploy application you create in this step when you create your pipeline later in this tutorial.

You first create a service role for CodeDeploy to use. If you have already created a service role, you do not need to create another one.

**To create a CodeDeploy service role**

1. Open the IAM console at [https://console.aws.amazon.com/iam/](https://console.aws.amazon.com/iam/).

1. From the console dashboard, choose **Roles**.

1. Choose **Create role**.

1. Under **Select trusted entity**, choose **AWS service**. Under **Use case**, choose **CodeDeploy**, and then choose **CodeDeploy** from the use cases listed. Choose **Next**. The `AWSCodeDeployRole` managed policy is already attached to the role.

1. Choose **Next**.

1. Enter a name for the role (for example, **CodeDeployRole**), and then choose **Create role**.

**To create an application in CodeDeploy**

1. Open the CodeDeploy console at [https://console.aws.amazon.com/codedeploy](https://console.aws.amazon.com/codedeploy).

1. If the **Applications** page does not appear, on the AWS CodeDeploy menu, choose **Applications**.

1. Choose **Create application**.

1. In **Application name**, enter `MyDemoApplication`. 

1. In **Compute Platform**, choose **EC2/On-premises**.

1. Choose **Create application**.

**To create a deployment group in CodeDeploy**

1. On the page that displays your application, choose **Create deployment group**.

1. In **Deployment group name**, enter **MyDemoDeploymentGroup**.

1. In **Service role**, choose the service role you created earlier. You must use a service role that trusts AWS CodeDeploy with, at minimum, the trust and permissions described in [Create a Service Role for CodeDeploy](https://docs.aws.amazon.com/codedeploy/latest/userguide/getting-started-create-service-role.html). To get the service role ARN, see [Get the Service Role ARN (Console)](https://docs.aws.amazon.com/codedeploy/latest/userguide/how-to-create-service-role.html#getting-started-get-service-role-console).

1. Under **Deployment type**, choose **In-place**.

1. Under **Environment configuration**, choose **Amazon EC2 Instances**. Choose **Name** in the **Key** field, and in the **Value** field, enter **MyCodePipelineDemo**. 
**Important**  
You must choose the same value for the **Name** key here that you assigned to your EC2 instances when you created them. If you tagged your instances with something other than **MyCodePipelineDemo**, be sure to use it here.

1. Under **Agent configuration with AWS Systems Manager**, choose **Now and schedule updates**. This installs the agent on the instance. The Windows instance is already configured with the SSM agent and will now be updated with the CodeDeploy agent.

1. Under **Deployment settings**, choose `CodeDeployDefault.OneAtATime`.

1. Under **Load Balancer**, make sure the **Enable load balancing** box is not selected. You do not need to set up a load balancer or choose a target group for this example. After you de-select the checkbox, the load balancer options do not display.

1. In the **Advanced** section, leave the defaults.

1. Choose **Create deployment group**.
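
As a sketch, the application and deployment group can also be created with the AWS CLI (substitute the ARN of the CodeDeploy service role you created earlier):

```
aws deploy create-application --application-name MyDemoApplication
aws deploy create-deployment-group \
    --application-name MyDemoApplication \
    --deployment-group-name MyDemoDeploymentGroup \
    --service-role-arn <CodeDeploy-service-role-ARN> \
    --deployment-config-name CodeDeployDefault.OneAtATime \
    --ec2-tag-filters Key=Name,Value=MyCodePipelineDemo,Type=KEY_AND_VALUE
```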

## Step 4: Create your first pipeline in CodePipeline
<a name="s3-create-pipeline"></a>

In this part of the tutorial, you create the pipeline. The sample runs automatically through the pipeline.

**To create a CodePipeline automated release process**

1. Sign in to the AWS Management Console and open the CodePipeline console at [https://console.aws.amazon.com/codesuite/codepipeline/home](https://console.aws.amazon.com/codesuite/codepipeline/home).

1. On the **Welcome** page, **Getting started** page, or the **Pipelines** page, choose **Create pipeline**.

1. On the **Step 1: Choose creation option** page, under **Creation options**, choose the **Build custom pipeline** option. Choose **Next**.

1. In **Step 2: Choose pipeline settings**, in **Pipeline name**, enter **MyFirstPipeline**. 
**Note**  
If you choose another name for your pipeline, be sure to use that name instead of **MyFirstPipeline** for the rest of this tutorial. After you create a pipeline, you cannot change its name. Pipeline names are subject to some limitations. For more information, see [Quotas in AWS CodePipeline](limits.md). 

1. CodePipeline provides V1 and V2 type pipelines, which differ in characteristics and price. The V2 type is the only type you can choose in the console. For more information, see [Pipeline types](https://docs.aws.amazon.com/codepipeline/latest/userguide/pipeline-types-planning.html). For information about pricing for CodePipeline, see [Pricing](https://aws.amazon.com/codepipeline/pricing/).

1. In **Service role**, do one of the following:
   + Choose **New service role** to allow CodePipeline to create a new service role in IAM.
   + Choose **Existing service role** to use a service role already created in IAM. In **Role name**, choose your service role from the list.

1. Leave the settings under **Advanced settings** at their defaults, and then choose **Next**.

1. In **Step 3: Add source stage**, in **Source provider**, choose **Amazon S3**. In **Bucket**, enter the name of the S3 bucket you created in [Step 1: Create an S3 source bucket for your application](#s3-create-s3-bucket). In **S3 object key**, enter the object key with or without a file path, and remember to include the file extension. For example, for `SampleApp_Windows.zip`, enter the sample file name as shown in this example:

   ```
   SampleApp_Windows.zip
   ```

   Under **Change detection options**, leave the defaults. This allows CodePipeline to use Amazon CloudWatch Events to detect changes in your source bucket.

   Choose **Next**.

1. In **Step 4: Add build stage**, choose **Skip build stage**, and then accept the warning message by choosing **Skip** again. Choose **Next**.

1. In **Step 5: Add test stage**, choose **Skip test stage**, and then accept the warning message by choosing **Skip** again. 

   Choose **Next**.

1. In **Step 6: Add deploy stage**, in **Deploy provider**, choose **CodeDeploy**. The **Region** field defaults to the same AWS Region as your pipeline. In **Application name**, enter `MyDemoApplication`, or choose the **Refresh** button, and then choose the application name from the list. In **Deployment group**, enter **MyDemoDeploymentGroup**, or choose it from the list, and then choose **Next**.
**Note**  
The name Deploy is the name given by default to the stage created in the **Step 6: Add deploy stage** step, just as Source is the name given to the first stage of the pipeline.

1. In **Step 7: Review**, review the information, and then choose **Create pipeline**.

1. The pipeline starts to run. You can view progress and success and failure messages as the CodePipeline sample deploys a webpage to each of the Amazon EC2 instances in the CodeDeploy deployment.

Congratulations! You just created a simple pipeline in CodePipeline. The pipeline has two stages:
+ A source stage named **Source**, which detects changes in the versioned sample application stored in the S3 bucket and pulls those changes into the pipeline.
+ A **Deploy** stage that deploys those changes to EC2 instances with CodeDeploy. 

Now, verify the results.

**To verify your pipeline ran successfully**

1. View the initial progress of the pipeline. The status of each stage changes from **No executions yet** to **In Progress**, and then to either **Succeeded** or **Failed**. The pipeline should complete the first run within a few minutes.

1. After **Succeeded** is displayed for the action status, in the status area for the **Deploy** stage, choose **Details**. This opens the CodeDeploy console.

1. In the **Deployment group** tab, under **Deployment lifecycle events**, choose an instance ID. This opens the EC2 console.

1. On the **Description** tab, in **Public DNS**, copy the address, and then paste it into the address bar of your web browser.

   The web page displays the index page for the sample application you uploaded to your S3 bucket.

For more information about stages, actions, and how pipelines work, see [CodePipeline concepts](concepts.md).
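
You can also check the state of each stage and action from the AWS CLI:

```
aws codepipeline get-pipeline-state --name MyFirstPipeline
```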

## (Optional) Step 5: Add another stage to your pipeline
<a name="s3-add-stage"></a>

Now add another stage in the pipeline to deploy from staging servers to production servers using CodeDeploy. First, you create another deployment group in the `MyDemoApplication` application in CodeDeploy. Then you add a stage that includes an action that uses this deployment group. To add another stage, you can use the CodePipeline console, or you can use the AWS CLI to retrieve and manually edit the structure of the pipeline in a JSON file and then run the **update-pipeline** command to update the pipeline with your changes.
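
If you use the AWS CLI for this step, the retrieve-edit-update round trip looks roughly like this:

```
# Retrieve the pipeline structure as JSON
aws codepipeline get-pipeline --name MyFirstPipeline > pipeline.json
# Edit pipeline.json to add the new stage, and remove the "metadata"
# section before updating, because update-pipeline does not accept it
aws codepipeline update-pipeline --cli-input-json file://pipeline.json
```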

**Topics**
+ [

### Create a second deployment group in CodeDeploy
](#s3-add-stage-part-1)
+ [

### Add the deployment group as another stage in your pipeline
](#s3-add-stage-part-2)

### Create a second deployment group in CodeDeploy
<a name="s3-add-stage-part-1"></a>

**Note**  
In this part of the tutorial, you create a second deployment group, but deploy to the same Amazon EC2 instances as before. This is for demonstration purposes only. It is purposely designed to fail to show you how errors are displayed in CodePipeline.

**To create a second deployment group in CodeDeploy**

1. Open the CodeDeploy console at [https://console.aws.amazon.com/codedeploy](https://console.aws.amazon.com/codedeploy).

1. Choose **Applications**, and in the list of applications, choose `MyDemoApplication`.

1. Choose the **Deployment groups** tab, and then choose **Create deployment group**.

1. On the **Create deployment group** page, in **Deployment group name**, enter a name for the second deployment group (for example, **CodePipelineProductionFleet**).

1. In **Service Role**, choose the same CodeDeploy service role you used for the initial deployment (not the CodePipeline service role).

1. Under **Deployment type**, choose **In-place**.

1. Under **Environment configuration**, choose **Amazon EC2 Instances**. Choose **Name** in the **Key** box, and in the **Value** box, choose `MyCodePipelineDemo` from the list. Leave the default configuration for **Deployment settings**. 

1. Under **Deployment configuration**, choose `CodeDeployDefault.OneAtATime`.

1. Under **Load Balancer**, clear **Enable load balancing**.

1.  Choose **Create deployment group**.

### Add the deployment group as another stage in your pipeline
<a name="s3-add-stage-part-2"></a>

Now that you have another deployment group, you can add a stage that uses this deployment group to deploy to the same EC2 instances you used earlier. You can use the CodePipeline console or the AWS CLI to add this stage. 

**Topics**
+ [

#### Create a third stage (console)
](#s3-add-stage-part-2-console)
+ [

#### Create a third stage (CLI)
](#s3-add-stage-part-2-cli)

#### Create a third stage (console)
<a name="s3-add-stage-part-2-console"></a>

You can use the CodePipeline console to add a new stage that uses the new deployment group. Because this deployment group is deploying to the EC2 instances you've already used, the deploy action in this stage fails. 

1. Sign in to the AWS Management Console and open the CodePipeline console at [http://console.aws.amazon.com/codesuite/codepipeline/home](http://console.aws.amazon.com/codesuite/codepipeline/home).

1. In **Name**, choose the name of the pipeline you created, MyFirstPipeline. 

1. On the pipeline details page, choose **Edit**. 

1. On the **Edit** page, choose **\+ Add stage** to add a stage immediately after the Deploy stage.  
![\[Image showing the + Add stage button on the edit screen\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/edit-pipeline-console-pol.png)

1. In **Add stage**, in **Stage name**, enter **Production**. Choose **Add stage**.

1. In the new stage, choose **\+ Add action group**.

1. In **Edit action**, in **Action name**, enter **Deploy-Second-Deployment**. In **Action provider**, under **Deploy**, choose **CodeDeploy**.

1. In the CodeDeploy section, in **Application name**, choose `MyDemoApplication` from the drop-down list, as you did when you created the pipeline. In **Deployment group**, choose the deployment group you just created, **CodePipelineProductionFleet**. In **Input artifacts**, choose the input artifact from the source action. Choose **Save**.

1. On the **Edit** page, choose **Save**. In **Save pipeline changes**, choose **Save**.

1. Although the new stage has been added to your pipeline, a status of **No executions yet** is displayed because no changes have triggered another run of the pipeline. You must manually rerun the last revision to see how the edited pipeline runs. On the pipeline details page, choose **Release change**, and then choose **Release** when prompted. This runs the most recent revision available in each source location specified in a source action through the pipeline. 

   Alternatively, to use the AWS CLI to rerun the pipeline, from a terminal on your local Linux, macOS, or Unix machine, or a command prompt on your local Windows machine, run the **start-pipeline-execution** command, specifying the name of the pipeline. This runs the application in your source bucket through the pipeline for a second time.

   ```
   aws codepipeline start-pipeline-execution --name MyFirstPipeline
   ```

   This command returns a `pipelineExecutionId` object.

1. Return to the CodePipeline console and in the list of pipelines, choose **MyFirstPipeline** to open the view page.

   The pipeline shows three stages and the state of the artifact running through those three stages. It might take up to five minutes for the pipeline to run through all stages. You see the deployment succeeds on the first two stages, just as before, but the **Production** stage shows the **Deploy-Second-Deployment** action failed.

1. In the **Deploy-Second-Deployment** action, choose **Details**. You are redirected to the page for the CodeDeploy deployment. In this case, the failure is the result of the first instance group deploying to all of the EC2 instances, leaving no instances for the second deployment group.
**Note**  
This failure is by design, to demonstrate what happens when there is a failure in a pipeline stage.

#### Create a third stage (CLI)
<a name="s3-add-stage-part-2-cli"></a>

Although using the AWS CLI to add a stage to your pipeline is more complex than using the console, it provides more visibility into the structure of the pipeline.

**To create a third stage for your pipeline**

1. Open a terminal session on your local Linux, macOS, or Unix machine, or a command prompt on your local Windows machine, and run the **get-pipeline** command to display the structure of the pipeline you just created. For **MyFirstPipeline**, you would type the following command: 

   ```
   aws codepipeline get-pipeline --name "MyFirstPipeline"
   ```

   This command returns the structure of MyFirstPipeline. The first part of the output should look similar to the following:

   ```
   {
       "pipeline": {
           "roleArn": "arn:aws:iam::80398EXAMPLE:role/AWS-CodePipeline-Service",
           "stages": [
       ...
   ```

   The final part of the output includes the pipeline metadata and should look similar to the following:

   ```
       ...
           ],
           "artifactStore": {
               "type": "S3",
               "location": "amzn-s3-demo-bucket"
           },
           "name": "MyFirstPipeline",
           "version": 4
       },
       "metadata": {
           "pipelineArn": "arn:aws:codepipeline:us-east-2:80398EXAMPLE:MyFirstPipeline",
           "updated": 1501626591.112,
           "created": 1501626591.112
       }
   }
   ```

1. Copy and paste this structure into a plain-text editor, and save the file as **pipeline.json**. For convenience, save this file in the same directory where you run the **aws codepipeline** commands.
**Note**  
You can pipe the JSON directly into a file with the **get-pipeline** command as follows:  

   ```
   aws codepipeline get-pipeline --name MyFirstPipeline >pipeline.json
   ```

1. Copy the **Deploy** stage section and paste it after the first two stages. Because the new stage also deploys with CodeDeploy, you use the **Deploy** stage as a template for the third stage. 

1. Change the name of the stage and the deployment group details. 

   The following example shows the JSON you add to the pipeline.json file after the **Deploy** stage. Edit the stage name, action name, and deployment group values to match the following. Remember to include a comma to separate the **Deploy** and **Production** stage definitions.

   ```
   ,
   {
       "name": "Production",
       "actions": [
           {
               "inputArtifacts": [
                   {
                       "name": "MyApp"
                   }
               ],
               "name": "Deploy-Second-Deployment",
               "actionTypeId": {
                   "category": "Deploy",
                   "owner": "AWS",
                   "version": "1",
                   "provider": "CodeDeploy"
               },
               "outputArtifacts": [],
               "configuration": {
                   "ApplicationName": "MyDemoApplication",
                   "DeploymentGroupName": "CodePipelineProductionFleet"
               },
               "runOrder": 1
           }
       ]
   }
   ```

1. If you are working with the pipeline structure retrieved using the **get-pipeline** command, you must remove the `metadata` section from the JSON file. Otherwise, the **update-pipeline** command cannot use it. Remove the entire `"metadata"` block, including the `"pipelineArn"`, `"created"`, and `"updated"` fields.

   For example, remove the following lines from the structure:

   ```
   "metadata": {
       "pipelineArn": "arn:aws:codepipeline:region:account-ID:pipeline-name",
       "created": "date",
       "updated": "date"
   }
   ```

   Save the file.

1. Run the **update-pipeline** command, specifying the pipeline JSON file, similar to the following:

   ```
   aws codepipeline update-pipeline --cli-input-json file://pipeline.json
   ```

   This command returns the entire structure of the updated pipeline.
**Important**  
Be sure to include `file://` before the file name. It is required in this command.

1.  Run the **start-pipeline-execution** command, specifying the name of the pipeline. This runs the application in your source bucket through the pipeline for a second time.

   ```
   aws codepipeline start-pipeline-execution --name MyFirstPipeline
   ```

   This command returns a `pipelineExecutionId` object.

1. Open the CodePipeline console and choose **MyFirstPipeline** from the list of pipelines.

   The pipeline shows three stages and the state of the artifact running through those three stages. It might take up to five minutes for the pipeline to run through all stages. Although the deployment succeeds on the first two stages, just as before, the **Production** stage shows that the **Deploy-Second-Deployment** action failed. 

1. In the **Deploy-Second-Deployment** action, choose **Details** to see details of the failure. You are redirected to the details page for the CodeDeploy deployment. In this case, the failure is the result of the first instance group deploying to all of the EC2 instances, leaving no instances for the second deployment group. 
**Note**  
This failure is by design, to demonstrate what happens when there is a failure in a pipeline stage.
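
If you expect to edit the pipeline structure more than once, the JSON edits in steps 3 through 5 above can be scripted. The following Python sketch is illustrative only (it assumes the pipeline.json file retrieved with **get-pipeline** and the stage, action, and deployment group names used in this tutorial): it copies the **Deploy** stage as a template, renames it, points it at the second deployment group, and removes the `metadata` block that **update-pipeline** rejects.

```python
import copy
import json


def add_production_stage(doc: dict) -> dict:
    """Return a copy of the get-pipeline output with a Production stage
    appended and the metadata block removed."""
    doc = copy.deepcopy(doc)
    doc.pop("metadata", None)  # update-pipeline cannot accept these fields
    stages = doc["pipeline"]["stages"]
    # Use the existing Deploy stage as a template for the new stage.
    deploy = next(stage for stage in stages if stage["name"] == "Deploy")
    production = copy.deepcopy(deploy)
    production["name"] = "Production"
    action = production["actions"][0]
    action["name"] = "Deploy-Second-Deployment"
    action["configuration"]["DeploymentGroupName"] = "CodePipelineProductionFleet"
    stages.append(production)
    return doc
```

You would load pipeline.json with `json.load`, pass the result through this function, write it back with `json.dump`, and then run the **update-pipeline** command as shown above.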

## (Optional) Step 6: Disable and enable transitions between stages in CodePipeline
<a name="s3-configure-transitions"></a>

You can enable or disable the transition between stages in a pipeline. Disabling the transition between stages allows you to manually control transitions between one stage and another. For example, you might want to run the first two stages of a pipeline, but disable transitions to the third stage until you are ready to deploy to production, or while you troubleshoot a problem or failure with that stage.

**To disable and enable transitions between stages in a CodePipeline pipeline**

1. Open the CodePipeline console and choose **MyFirstPipeline** from the list of pipelines.

1. On the details page for the pipeline, choose the **Disable transition** button between the second stage (**Deploy**) and the third stage that you added in the previous section (**Production**).

1. In **Disable transition**, enter a reason for disabling the transition between the stages, and then choose **Disable**.

   The arrow between stages displays an icon and color change, and the **Enable transition** button.  
![\[Image showing the entered reason for disabling the transition as "Disabling transition while I troubleshoot the failure"\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/codepipeline-disabled-transition-pol.png)

1. Upload your sample again to the S3 bucket. Because the bucket is versioned, this change starts the pipeline. 

1. Return to the details page for your pipeline and watch the status of the stages. The pipeline view changes to show progress and success on the first two stages, but no changes occur on the third stage. This process might take a few minutes.

1. Enable the transition by choosing the **Enable transition** button between the two stages. In the **Enable transition** dialog box, choose **Enable**. The stage starts running in a few minutes and attempts to process the artifact that has already been run through the first two stages of the pipeline.
**Note**  
If you want this third stage to succeed, edit the CodePipelineProductionFleet deployment group before you enable the transition, and specify a different set of EC2 instances where the application is deployed. For more information about how to do this, see [Change deployment group settings](http://docs.aws.amazon.com/codedeploy/latest/userguide/how-to-change-deployment-group-settings.html). If you create more EC2 instances, you might incur additional costs. 

## Step 7: Clean up resources
<a name="s3-clean-up"></a>

You can use some of the resources you created in this tutorial for the [Tutorial: Create a four-stage pipeline](tutorials-four-stage-pipeline.md). For example, you can reuse the CodeDeploy application and deployment. You can configure a build action with a provider such as CodeBuild, which is a fully managed build service in the cloud. You can also configure a build action that uses a provider with a build server or system, such as Jenkins.

However, after you complete this and any other tutorials, you should delete the pipeline and the resources it uses, so that you are not charged for the continued use of those resources. First, delete the pipeline, then the CodeDeploy application and its associated Amazon EC2 instances, and finally, the S3 bucket.

**To clean up the resources used in this tutorial**

1. To clean up your CodePipeline resources, follow the instructions in [Delete a pipeline in AWS CodePipeline](pipelines-delete.md).

1. To clean up your CodeDeploy resources, follow the instructions in [To clean up resources (console)](https://docs.aws.amazon.com/codedeploy/latest/userguide/tutorials-wordpress-clean-up.html#tutorials-wordpress-clean-up-console).

1. To delete the S3 bucket, follow the instructions in [Deleting or emptying a bucket](https://docs.aws.amazon.com/AmazonS3/latest/userguide/delete-or-empty-bucket.html). If you do not intend to create more pipelines, delete the S3 bucket created for storing your pipeline artifacts. For more information about this bucket, see [CodePipeline concepts](concepts.md).

# Tutorial: Create a simple pipeline (CodeCommit repository)
<a name="tutorials-simple-codecommit"></a>

In this tutorial, you use CodePipeline to deploy code maintained in a CodeCommit repository to a single Amazon EC2 instance. Your pipeline is triggered when you push a change to the CodeCommit repository. The pipeline deploys your changes to an Amazon EC2 instance using CodeDeploy as the deployment service.

**Important**  
As part of creating a pipeline, CodePipeline uses an S3 artifact bucket provided by the customer for artifacts. (This is different from the bucket used for an S3 source action.) If the S3 artifact bucket is in a different account from the account for your pipeline, make sure that the bucket is owned by an AWS account that you trust.

The pipeline has two stages:
+ A source stage (**Source**) for your CodeCommit source action.
+ A deployment stage (**Deploy**) for your CodeDeploy deployment action.

The easiest way to get started with AWS CodePipeline is to use the **Create Pipeline** wizard in the CodePipeline console.

**Note**  
Before you begin, make sure you've set up your Git client to work with CodeCommit. For instructions, see [Setting up for CodeCommit](https://docs.aws.amazon.com/codecommit/latest/userguide/setting-up.html).

## Step 1: Create a CodeCommit repository
<a name="codecommit-create-repository"></a>

First, you create a repository in CodeCommit. Your pipeline gets source code from this repository when it runs. You also create a local repository where you maintain and update code before you push it to the CodeCommit repository.

**To create a CodeCommit repository**

1. Open the CodeCommit console at [https://console.aws.amazon.com/codecommit/](https://console.aws.amazon.com/codecommit/).

1. In the Region selector, choose the AWS Region where you want to create the repository and pipeline. For more information, see [AWS Regions and Endpoints](https://docs.aws.amazon.com/general/latest/gr/rande.html).

1. On the **Repositories** page, choose **Create repository**.

1. On the **Create repository** page, in **Repository name**, enter a name for your repository (for example, **MyDemoRepo**).

1. Choose **Create**.

**Note**  
The remaining steps in this tutorial use **MyDemoRepo** for the name of your CodeCommit repository. If you choose a different name, be sure to use it throughout this tutorial.

**To set up a local repository**

In this step, you set up a local repository to connect to your remote CodeCommit repository.
**Note**  
You are not required to set up a local repository. You can also use the console to upload files as described in [Step 2: Add sample code to your CodeCommit repository](#codecommit-add-code).

1. With your new repository open in the console, choose **Clone URL** on the top right of the page, and then choose **Clone SSH**. The address to clone your Git repository is copied to your clipboard.

1. In your terminal or command line, navigate to a local directory where you'd like your local repository to be stored. In this tutorial, we use `/tmp`.

1. Run the following command to clone the repository, replacing the SSH address with the one you copied in the previous step. This command creates a directory called `MyDemoRepo`. You copy a sample application to this directory.

   ```
   git clone ssh://git-codecommit.us-west-2.amazonaws.com/v1/repos/MyDemoRepo
   ```

## Step 2: Add sample code to your CodeCommit repository
<a name="codecommit-add-code"></a>

In this step, you download code for a sample application that was created for a CodeDeploy sample walkthrough, and add it to your CodeCommit repository.

1. Download a sample application and save it into a folder or directory on your local computer.

   1. Choose one of the following. Choose `SampleApp_Linux.zip` if you want to follow the steps in this tutorial for Linux instances.
      + If you want to deploy to Amazon Linux instances using CodeDeploy, download the sample application here: [SampleApp\_Linux.zip](samples/SampleApp_Linux.zip).
      + If you want to deploy to Windows Server instances using CodeDeploy, download the sample application here: [SampleApp\_Windows.zip](samples/SampleApp_Windows.zip).

      The sample application contains the following files for deploying with CodeDeploy: 
      + `appspec.yml` – The application specification file (AppSpec file) is a [YAML](http://www.yaml.org)-formatted file used by CodeDeploy to manage a deployment. For more information about the AppSpec file, see [CodeDeploy AppSpec File reference](https://docs.aws.amazon.com/codedeploy/latest/userguide/reference-appspec-file.html) in the *AWS CodeDeploy User Guide*.
      + `index.html` – The index file contains the home page for the deployed sample application.
      + `LICENSE.txt` – The license file contains license information for the sample application.
      + Files for scripts – The sample application uses scripts to write text files to a location on your instance. One file is written for each of several CodeDeploy deployment lifecycle events as follows:
        + (Linux sample only) `scripts` folder – The folder contains the following shell scripts to install dependencies and start and stop the sample application for the automated deployment: `install_dependencies`, `start_server`, and `stop_server`.
        + (Windows sample only) `before-install.bat` – This is a batch script for the `BeforeInstall` deployment lifecycle event, which will run to remove old files written during previous deployments of this sample and create a location on your instance to which to write the new files.

   1. Download the compressed (zipped) file.

1. Unzip the files from [SampleApp\_Linux.zip](samples/SampleApp_Linux.zip) into the local directory you created earlier (for example, `/tmp/MyDemoRepo` or `c:\temp\MyDemoRepo`).

   Be sure to place the files directly into your local repository. Do not include a `SampleApp_Linux` folder. On your local Linux, macOS, or Unix machine, for example, your directory and file hierarchy should look like this:

   ```
   /tmp
      └-- MyDemoRepo
          │-- appspec.yml
          │-- index.html
          │-- LICENSE.txt
          └-- scripts
              │-- install_dependencies
              │-- start_server
              └-- stop_server
   ```

1. To upload files to your repository, use one of the following methods.

   1. To use the CodeCommit console to upload your files: 

      1. Open the CodeCommit console, and choose your repository from the **Repositories** list.

      1. Choose **Add file**, and then choose **Upload file**. 

      1. Select **Choose file**, and then browse for your file. To add a file under a folder, choose **Create file** and then enter the folder name with the file name, such as `scripts/install_dependencies`. Paste the file contents into the new file.

         Commit the change by entering your user name and email address. 

         Choose **Commit changes**.

      1. Repeat this step for each file.

         Your repository contents should look like this:

         ```
                │-- appspec.yml
                │-- index.html
                │-- LICENSE.txt
                └-- scripts
                    │-- install_dependencies
                    │-- start_server
                    └-- stop_server
         ```

   1. To use git commands to upload your files: 

      1. Change directories to your local repo:

         ```
         (For Linux, macOS, or Unix) cd /tmp/MyDemoRepo
         (For Windows) cd c:\temp\MyDemoRepo
         ```

      1. Run the following command to stage all of your files at once:

         ```
         git add -A
         ```

      1. Run the following command to commit the files with a commit message:

         ```
         git commit -m "Add sample application files"
         ```

      1. Run the following command to push the files from your local repo to your CodeCommit repository:

         ```
         git push
         ```

1. The files you downloaded and added to your local repo have now been added to the `main` branch in your CodeCommit `MyDemoRepo` repository and are ready to be included in a pipeline.
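
Before you commit and push, you can sanity-check that the local repository contains the files in the layout shown earlier. This short Python sketch is a convenience only; the file list matches the Linux sample described above, and the path is the tutorial's example location.

```python
import os

# Files the Linux sample application is expected to contain.
EXPECTED_FILES = [
    "appspec.yml",
    "index.html",
    "LICENSE.txt",
    "scripts/install_dependencies",
    "scripts/start_server",
    "scripts/stop_server",
]


def missing_sample_files(repo_dir: str) -> list:
    """Return the expected sample files that are not present in repo_dir."""
    return [path for path in EXPECTED_FILES
            if not os.path.isfile(os.path.join(repo_dir, path))]


# Example: list anything missing from the tutorial's local repo location.
print(missing_sample_files("/tmp/MyDemoRepo"))
```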

## Step 3: Create an Amazon EC2 Linux instance and install the CodeDeploy agent
<a name="codecommit-create-deployment"></a>

In this step, you create the Amazon EC2 instance where you deploy a sample application. As part of this process, you create an instance role that allows installation and management of the CodeDeploy agent on the instance. The CodeDeploy agent is a software package that enables an instance to be used in CodeDeploy deployments. You also attach policies that allow the instance to fetch files that the CodeDeploy agent uses to deploy your application and that allow the instance to be managed by SSM.

**To create an instance role**

1. Open the IAM console at [https://console.aws.amazon.com/iam/](https://console.aws.amazon.com/iam/).

1. From the console dashboard, choose **Roles**.

1. Choose **Create role**.

1. Under **Select type of trusted entity**, select **AWS service**. Under **Choose a use case**, select **EC2**. Under **Select your use case**, choose **EC2**. Choose **Next: Permissions**.

1. Search for and select the policy named **`AmazonEC2RoleforAWSCodeDeploy`**. 

1. Search for and select the policy named **`AmazonSSMManagedInstanceCore`**. Choose **Next: Tags**.

1. Choose **Next: Review**. Enter a name for the role (for example, **EC2InstanceRole**).
**Note**  
Make a note of your role name for the next step. You choose this role when you are creating your instance.

   Choose **Create role**.

**To launch an instance**

1. Open the Amazon EC2 console at [https://console.aws.amazon.com/ec2/](https://console.aws.amazon.com/ec2/).

1. From the side navigation, choose **Instances**, and select **Launch instances** from the top of the page.

1. In **Name**, enter **MyCodePipelineDemo**. This assigns the instance a tag **Key** of **Name** and a tag **Value** of **MyCodePipelineDemo**. Later, you create a CodeDeploy application that deploys the sample application to this instance. CodeDeploy selects instances to deploy based on the tags.

1. Under **Application and OS Images (Amazon Machine Image)**, locate the **Amazon Linux** AMI option with the AWS logo, and make sure it is selected. (This AMI is described as the Amazon Linux 2 AMI (HVM) and is labeled "Free tier eligible".)

1. Under **Instance type**, choose the free tier eligible `t2.micro` type as the hardware configuration for your instance.

1. Under **Key pair (login)**, choose a key pair or create one. 

   You can also choose **Proceed without a key pair**.
**Note**  
For the purposes of this tutorial, you can proceed without a key pair. To use SSH to connect to your instances, create or use a key pair.

1. Under **Network settings**, do the following.

   In **Auto-assign Public IP**, make sure the status is **Enable**.

   For the created security group, choose **HTTP**, and then under **Source type**, choose **My IP**.

1. Expand **Advanced details**. In **IAM instance profile**, choose the IAM role you created in the previous procedure (for example, **EC2InstanceRole**).

1. Under **Summary**, under **Number of instances**, enter `1`.

1. Choose **Launch instance**. 

1. You can view the status of the launch on the **Instances** page. When you launch an instance, its initial state is `pending`. After the instance starts, its state changes to `running`, and it receives a public DNS name. (If the **Public DNS** column is not displayed, choose the **Show/Hide** icon, and then select **Public DNS**.)

## Step 4: Create an application in CodeDeploy
<a name="codecommit-create-codedeploy-app"></a>

In CodeDeploy, an [application](https://docs.aws.amazon.com/codedeploy/latest/userguide/applications.html) is a resource that contains the software application you want to deploy. Later, you use this application with CodePipeline to automate deployments of the sample application to your Amazon EC2 instance.

First, you create a role that allows CodeDeploy to perform deployments. Then, you create a CodeDeploy application.

**To create a CodeDeploy service role**

1. Open the IAM console at [https://console.aws.amazon.com/iam/](https://console.aws.amazon.com/iam/).

1. From the console dashboard, choose **Roles**.

1. Choose **Create role**.

1. Under **Select trusted entity**, choose **AWS service**. Under **Use case**, choose **CodeDeploy**. Choose **CodeDeploy** from the options listed. Choose **Next**. The `AWSCodeDeployRole` managed policy is already attached to the role.

1. Choose **Next**.

1. Enter a name for the role (for example, **CodeDeployRole**), and then choose **Create role**.

**To create an application in CodeDeploy**

1. Open the CodeDeploy console at [https://console.aws.amazon.com/codedeploy](https://console.aws.amazon.com/codedeploy).

1. If the **Applications** page does not appear, on the menu, choose **Applications**.

1. Choose **Create application**.

1. In **Application name**, enter **MyDemoApplication**. 

1. In **Compute Platform**, choose **EC2/On-premises**.

1. Choose **Create application**.

**To create a deployment group in CodeDeploy**

A [deployment group](https://docs.aws.amazon.com/codedeploy/latest/userguide/deployment-groups.html) is a resource that defines deployment-related settings, such as which instances to deploy to and how fast to deploy to them.

1. On the page that displays your application, choose **Create deployment group**.

1. In **Deployment group name**, enter **MyDemoDeploymentGroup**.

1. In **Service role**, choose the ARN of the service role you created earlier (for example, **`arn:aws:iam::account_ID:role/CodeDeployRole`**).

1. Under **Deployment type**, choose **In-place**.

1. Under **Environment configuration**, choose **Amazon EC2 Instances**. In the **Key** field, enter **Name**. In the **Value** field, enter the name you used to tag the instance (for example, **MyCodePipelineDemo**).

1. Under **Agent configuration with AWS Systems Manager**, choose **Now and schedule updates**. This installs the agent on the instance. The Linux instance is already configured with the SSM agent and will now be updated with the CodeDeploy agent.

1. Under **Deployment configuration**, choose `CodeDeployDefault.OneAtATime`.

1. Under **Load Balancer**, make sure **Enable load balancing** is not selected. You do not need to set up a load balancer or choose a target group for this example.

1. Choose **Create deployment group**.

## Step 5: Create your first pipeline in CodePipeline
<a name="codecommit-create-pipeline"></a>

You're now ready to create and run your first pipeline. In this step, you create a pipeline that runs automatically when code is pushed to your CodeCommit repository.

**To create a CodePipeline pipeline**

1. Sign in to the AWS Management Console and open the CodePipeline console at [http://console.aws.amazon.com/codesuite/codepipeline/home](http://console.aws.amazon.com/codesuite/codepipeline/home).

1. On the **Welcome** page, **Getting started** page, or the **Pipelines** page, choose **Create pipeline**.

1. On the **Step 1: Choose creation option** page, under **Creation options**, choose the **Build custom pipeline** option. Choose **Next**.

1. In **Step 2: Choose pipeline settings**, in **Pipeline name**, enter **MyFirstPipeline**.

1. CodePipeline provides V1 and V2 type pipelines, which differ in characteristics and price. The V2 type is the only type you can choose in the console. For more information, see [pipeline types](https://docs.aws.amazon.com/codepipeline/latest/userguide/pipeline-types-planning.html?icmpid=docs_acp_help_panel). For information about pricing for CodePipeline, see [Pricing](https://aws.amazon.com/codepipeline/pricing/).

1. In **Service role**, choose **New service role** to allow CodePipeline to create a service role in IAM.

1. Leave the settings under **Advanced settings** at their defaults, and then choose **Next**.

1. In **Step 3: Add source stage**, in **Source provider**, choose **CodeCommit**. In **Repository name**, choose the name of the CodeCommit repository you created in [Step 1: Create a CodeCommit repository](#codecommit-create-repository). In **Branch name**, choose `main`.

   After you select the repository name and branch, a message displays the Amazon CloudWatch Events rule to be created for this pipeline. 

   Under **Change detection options**, leave the defaults. This allows CodePipeline to use Amazon CloudWatch Events to detect changes in your source repository.

   Choose **Next**.

1. In **Step 4: Add build stage**, choose **Skip build stage**, and then accept the warning message by choosing **Skip** again. Choose **Next**.
**Note**  
In this tutorial, you are deploying code that requires no build service, so you can skip this step. However, if your source code needs to be built before it is deployed to instances, you can configure [CodeBuild](http://aws.amazon.com/codebuild/) in this step.

1. In **Step 5: Add test stage**, choose **Skip test stage**, and then accept the warning message by choosing **Skip** again. 

   Choose **Next**.

1. In **Step 6: Add deploy stage**, in **Deploy provider**, choose **CodeDeploy**. In **Application name**, choose **MyDemoApplication**. In **Deployment group**, choose **MyDemoDeploymentGroup**, and then choose **Next step**.

1. In **Step 7: Review**, review the information, and then choose **Create pipeline**.

1. The pipeline starts running after it is created. It downloads the code from your CodeCommit repository and creates a CodeDeploy deployment to your EC2 instance. You can view progress and success and failure messages as the CodePipeline sample deploys the webpage to the Amazon EC2 instance in the CodeDeploy deployment.  
![\[A view of a pipeline starting to run in the CodePipeline console.\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/codepipeline-firstpipeline-codecommit-pol.png)

Congratulations! You just created a simple pipeline in CodePipeline. 

Next, you verify the results.

**To verify that your pipeline ran successfully**

1. View the initial progress of the pipeline. The status of each stage changes from **No executions yet** to **In Progress**, and then to either **Succeeded** or **Failed**. The pipeline should complete the first run within a few minutes.

1. After **Succeeded** is displayed for the pipeline status, in the status area for the **Deploy** stage, choose **CodeDeploy**. This opens the CodeDeploy console. If **Succeeded** is not displayed, see [Troubleshooting CodePipeline](troubleshooting.md).

1.  On the **Deployments** tab, choose the deployment ID. On the page for the deployment, under **Deployment lifecycle events**, choose the instance ID. This opens the EC2 console.

1. On the **Description** tab, in **Public DNS**, copy the address (for example, `ec2-192-0-2-1.us-west-2.compute.amazonaws.com`), and then paste it into the address bar of your web browser.

   The web page displays for the sample application you downloaded and pushed to your CodeCommit repository.
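   If you prefer a terminal, you can check the page with `curl` (a sketch; `<public-dns>` is a placeholder, so replace it with the Public DNS name you copied):

   ```
   # Fetch the deployed page; the sample webpage HTML should appear in the output.
   # Replace <public-dns> with your instance's Public DNS name.
   curl -s http://<public-dns>/
   ```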

For more information about stages, actions, and how pipelines work, see [CodePipeline concepts](concepts.md).

## Step 6: Modify code in your CodeCommit repository
<a name="codecommit-push-code"></a>

Your pipeline is configured to run whenever code changes are made to your CodeCommit repository. In this step, you make changes to the HTML file that is part of the sample CodeDeploy application in the CodeCommit repository. When you push these changes, your pipeline runs again, and the changes you make are visible at the web address you accessed earlier.

1. Change directories to your local repo:

   ```
   (For Linux, macOS, or Unix) cd /tmp/MyDemoRepo
   (For Windows) cd c:\temp\MyDemoRepo
   ```

1. Use a text editor to modify the `index.html` file:

   ```
   (For Linux or Unix) gedit index.html
   (For macOS) open -e index.html
   (For Windows) notepad index.html
   ```

1. Revise the contents of the `index.html` file to change the background color and some of the text on the webpage, and then save the file. 

   ```
   <!DOCTYPE html>
   <html>
   <head>
     <title>Updated Sample Deployment</title>
     <style>
       body {
         color: #000000;
         background-color: #CCFFCC;
         font-family: Arial, sans-serif;  
         font-size:14px;
       }
           
       h1 {
         font-size: 250%;
         font-weight: normal;
         margin-bottom: 0;
       }
       
       h2 {
         font-size: 175%;
         font-weight: normal;
         margin-bottom: 0;
       }
     </style>
   </head>
   <body>
     <div align="center"><h1>Updated Sample Deployment</h1></div>
     <div align="center"><h2>This application was updated using CodePipeline, CodeCommit, and CodeDeploy.</h2></div>
     <div align="center">
       <p>Learn more:</p> 
     <p><a href="https://docs.aws.amazon.com/codepipeline/latest/userguide/">CodePipeline User Guide</a></p>
     <p><a href="https://docs.aws.amazon.com/codecommit/latest/userguide/">CodeCommit User Guide</a></p>
     <p><a href="https://docs.aws.amazon.com/codedeploy/latest/userguide/">CodeDeploy User Guide</a></p>
     </div>
   </body>
   </html>
   ```

1. Commit and push your changes to your CodeCommit repository by running the following commands, one at a time:

   ```
   git commit -am "Updated sample application files"
   ```

   ```
   git push
   ```

**To verify your pipeline ran successfully**

1. View the initial progress of the pipeline. The status of each stage changes from **No executions yet** to **In Progress**, and then to either **Succeeded** or **Failed**. The pipeline should complete the run within a few minutes.

1. After **Succeeded** is displayed for the action status, refresh the demo page you accessed earlier in your browser.

   The updated webpage is displayed.

## Step 7: Clean up resources
<a name="codecommit-clean-up"></a>

You can use some of the resources you created in this tutorial for other tutorials in this guide. For example, you can reuse the CodeDeploy application and deployment. However, after you complete this and any other tutorials, you should delete the pipeline and the resources it uses so that you are not charged for the continued use of those resources. First, delete the pipeline, then the CodeDeploy application and its associated Amazon EC2 instance, and finally, the CodeCommit repository.

**To clean up the resources used in this tutorial**

1. To clean up your CodePipeline resources, follow the instructions in [Delete a pipeline in AWS CodePipeline](pipelines-delete.md).

1. To clean up your CodeDeploy resources, follow the instructions in [Clean Up Deployment Walkthrough Resources](https://docs.aws.amazon.com/codedeploy/latest/userguide/tutorials-simple-s3alkthrough.html#tutorials-simple-s3alkthrough-clean-up).

1. To delete the CodeCommit repository, follow the instructions in [Delete a CodeCommit repository](https://docs.aws.amazon.com/codecommit/latest/userguide/how-to-delete-repository.html).
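The same cleanup can be sketched with the AWS CLI (the resource names below are this tutorial's examples, so substitute your own; any EC2 instances used by CodeDeploy must be terminated separately):

```
# Delete the pipeline ("MyFirstPipeline" is an example name).
aws codepipeline delete-pipeline --name MyFirstPipeline

# Delete the CodeDeploy application, which also removes its deployment groups.
aws deploy delete-application --application-name MyDemoApplication

# Delete the CodeCommit repository.
aws codecommit delete-repository --repository-name MyDemoRepo
```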

## Step 8: Further reading
<a name="codecommit-optional-tasks"></a>

Learn more about how CodePipeline works:
+ For more information about stages, actions, and how pipelines work, see [CodePipeline concepts](concepts.md).
+ For information about the actions you can perform using CodePipeline, see [Integrations with CodePipeline action types](integrations-action-type.md).
+ Try this more advanced tutorial, [Tutorial: Create a four-stage pipeline](tutorials-four-stage-pipeline.md). It creates a multi-stage pipeline that includes a step that builds code before it's deployed.

# Tutorial: Create a four-stage pipeline
<a name="tutorials-four-stage-pipeline"></a>

Now that you've created your first pipeline in [Tutorial: Create a simple pipeline (S3 bucket)](tutorials-simple-s3.md) or [Tutorial: Create a simple pipeline (CodeCommit repository)](tutorials-simple-codecommit.md), you can start creating more complex pipelines. This tutorial will walk you through the creation of a four-stage pipeline that uses a GitHub repository for your source, a Jenkins build server to build the project, and a CodeDeploy application to deploy the built code to a staging server. The following diagram shows the initial three-stage pipeline.

![\[A diagram showing the source stage with the source action, a build stage with the Jenkins action, and a deploy stage with the deploy action.\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/flow-codepipeline-codecommit-jenkins.png)


After the pipeline is created, you will edit it to add a stage with a test action to test the code, also using Jenkins. 

Before you can create this pipeline, you must configure the required resources. For example, if you want to use a GitHub repository for your source code, you must create the repository before you can add it to a pipeline. As part of setting up, this tutorial walks you through setting up Jenkins on an EC2 instance for demonstration purposes. 

**Important**  
Many of the actions you add to your pipeline in this procedure involve AWS resources that you need to create before you create the pipeline. AWS resources for your source actions must always be created in the same AWS Region where you create your pipeline. For example, if you create your pipeline in the US East (Ohio) Region, your CodeCommit repository must be in the US East (Ohio) Region.   
You can add cross-region actions when you create your pipeline. AWS resources for cross-region actions must be in the same AWS Region where you plan to execute the action. For more information, see [Add a cross-Region action in CodePipeline](actions-create-cross-region.md).

**Important**  
As part of creating a pipeline, CodePipeline uses an S3 artifact bucket that you provide for artifacts. (This is different from the bucket used for an S3 source action.) If the S3 artifact bucket is in a different account from the account for your pipeline, make sure that the bucket is owned by an AWS account that you know and trust.

Before you begin this tutorial, you should have already completed the general prerequisites in [Getting started with CodePipeline](getting-started-codepipeline.md).

**Topics**
+ [

## Step 1: Complete prerequisites
](#tutorials-four-stage-pipeline-prerequisites)
+ [

## Step 2: Create a pipeline in CodePipeline
](#tutorials-four-stage-pipeline-pipeline-create)
+ [

## Step 3: Add another stage to your pipeline
](#tutorials-four-stage-pipeline-add-stage)
+ [

## Step 4: Clean up resources
](#tutorials-four-stage-pipeline-clean-up)

## Step 1: Complete prerequisites
<a name="tutorials-four-stage-pipeline-prerequisites"></a>

To integrate with Jenkins, AWS CodePipeline requires that you install the CodePipeline Plugin for Jenkins on any Jenkins instance you want to use with CodePipeline. You should also configure a dedicated IAM user or role for permissions between your Jenkins project and CodePipeline. The easiest way to integrate Jenkins and CodePipeline is to install Jenkins on an EC2 instance that uses an IAM instance role you create for Jenkins integration. For links in the pipeline to Jenkins actions to connect successfully, you must configure proxy and firewall settings on the server or EC2 instance to allow inbound connections on the port used by your Jenkins project. Make sure you have configured Jenkins to authenticate users and enforce access control before you allow connections on those ports (for example, 443 and 8443 if you have secured Jenkins to use only HTTPS connections, or 80 and 8080 if you allow HTTP connections). For more information, see [Securing Jenkins](https://wiki.jenkins.io/display/JENKINS/Securing+Jenkins).

**Note**  
This tutorial uses a code sample and configures build steps that convert the sample from Haml to HTML. You can download the open-source sample code from the GitHub repository by following the steps in [Copy or clone the sample into a GitHub repository](#tutorials-four-stage-pipeline-prerequisites-github). You will need the entire sample in your GitHub repository, not just the .zip file.   
This tutorial also assumes that:  
+ You are familiar with installing and administering Jenkins and creating Jenkins projects.
+ You have installed Rake and the Haml gem for Ruby on the same computer or instance that hosts your Jenkins project.
+ You have set the required system environment variables so that Rake commands can be run from the terminal or command line (for example, on Windows systems, modifying the PATH variable to include the directory where you installed Rake).

**Topics**
+ [

### Copy or clone the sample into a GitHub repository
](#tutorials-four-stage-pipeline-prerequisites-github)
+ [

### Create an IAM role to use for Jenkins integration
](#tutorials-four-stage-pipeline-prerequisites-jenkins-iam-role)
+ [

### Install and configure Jenkins and the CodePipeline Plugin for Jenkins
](#tutorials-four-stage-pipeline-prerequisites-jenkins-configure)

### Copy or clone the sample into a GitHub repository
<a name="tutorials-four-stage-pipeline-prerequisites-github"></a>

**To clone the sample and push to a GitHub repository**

1. Download the sample code from the GitHub repository, or clone the repositories to your local computer. There are two sample packages: 
   + If you will be deploying your sample to Amazon Linux, RHEL, or Ubuntu Server instances, choose [codepipeline-jenkins-aws-codedeploy\_linux.zip](https://github.com/awslabs/aws-codepipeline-jenkins-aws-codedeploy_linux). 
   + If you will be deploying your sample to Windows Server instances, choose [AWSCodePipeline-Jenkins-AWSCodeDeploy\_windows.zip](https://github.com/awslabs/AWSCodePipeline-Jenkins-AWSCodeDeploy_windows).

1. From the repository, choose **Fork** to fork the sample repo into a repo in your GitHub account. For more information, see the [GitHub documentation](https://help.github.com/articles/create-a-repo/).

### Create an IAM role to use for Jenkins integration
<a name="tutorials-four-stage-pipeline-prerequisites-jenkins-iam-role"></a>

As a best practice, consider launching an EC2 instance to host your Jenkins server and using an IAM role to grant the instance the required permissions for interacting with CodePipeline.

1. Sign in to the AWS Management Console and open the IAM console at [https://console.aws.amazon.com/iam/](https://console.aws.amazon.com/iam/).

1. In the IAM console, in the navigation pane, choose **Roles**, and then choose **Create role**.

1. Under **Select type of trusted entity**, choose **AWS service**. Under **Choose the service that will use this role**, choose **EC2**. Under **Select your use case**, choose **EC2**. 

1. Choose **Next: Permissions**. On the **Attach permissions policies** page, select the `AWSCodePipelineCustomActionAccess` managed policy, and then choose **Next: Tags**. Choose **Next: Review**.

1. On the **Review** page, in **Role name**, enter the name of the role to create specifically for Jenkins integration (for example, *JenkinsAccess*), and then choose **Create role**.

When you create the EC2 instance where you will install Jenkins, in **Step 3: Configure Instance Details**, make sure you choose the instance role (for example, *JenkinsAccess*).

For more information about instance roles and Amazon EC2, see [IAM roles for Amazon EC2](https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/iam-roles-for-amazon-ec2.html), [Using IAM Roles to Grant Permissions to Applications Running on Amazon EC2 Instances](https://docs.aws.amazon.com/IAM/latest/UserGuide/roles-usingrole-ec2instance.html), and [Creating a role to delegate permissions to an AWS service](https://docs.aws.amazon.com/IAM/latest/UserGuide/roles-creatingrole-service.html).
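The console steps above can also be sketched with the AWS CLI. The role and instance profile names below are this tutorial's examples, and the trust policy simply lets EC2 instances assume the role:

```
# Trust policy that lets EC2 instances assume the role.
cat > jenkins-trust.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [{
    "Effect": "Allow",
    "Principal": { "Service": "ec2.amazonaws.com" },
    "Action": "sts:AssumeRole"
  }]
}
EOF

aws iam create-role --role-name JenkinsAccess \
  --assume-role-policy-document file://jenkins-trust.json

# Attach the managed policy that CodePipeline integration requires.
aws iam attach-role-policy --role-name JenkinsAccess \
  --policy-arn arn:aws:iam::aws:policy/AWSCodePipelineCustomActionAccess

# EC2 consumes roles through an instance profile.
aws iam create-instance-profile --instance-profile-name JenkinsAccess
aws iam add-role-to-instance-profile \
  --instance-profile-name JenkinsAccess --role-name JenkinsAccess
```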

### Install and configure Jenkins and the CodePipeline Plugin for Jenkins
<a name="tutorials-four-stage-pipeline-prerequisites-jenkins-configure"></a>

**To install Jenkins and the CodePipeline Plugin for Jenkins**

1. Create an EC2 instance where you will install Jenkins, and in **Step 3: Configure Instance Details**, make sure you choose the instance role you created (for example, *JenkinsAccess*). For more information about creating EC2 instances, see [Launch an Amazon EC2 instance](https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/ec2-launch-instance_linux.html) in the *Amazon EC2 User Guide*. 
**Note**  
If you already have Jenkins resources you want to use, you can do so, but you must create a special IAM user, apply the `AWSCodePipelineCustomActionAccess` managed policy to that user, and then configure and use the access credentials for that user on your Jenkins resource. If you want to use the Jenkins UI to supply the credentials, configure Jenkins to only allow HTTPS. For more information, see [Troubleshooting CodePipeline](troubleshooting.md).

1. Install Jenkins on the EC2 instance. For more information, see the Jenkins documentation for [installing Jenkins](https://www.jenkins.io/doc/book/installing/linux/) and [starting and accessing Jenkins](https://wiki.jenkins.io/JENKINS/Starting-and-Accessing-Jenkins.html), as well as [details of integration with Jenkins](integrations-action-type.md#JenkinsInt_2) in [Product and service integrations with CodePipeline](integrations.md).

1. Launch Jenkins, and on the home page, choose **Manage Jenkins**.

1. On the **Manage Jenkins** page, choose **Manage Plugins**.

1. Choose the **Available** tab, and in the **Filter** search box, enter **AWS CodePipeline**. Choose **CodePipeline Plugin for Jenkins** from the list and choose **Download now and install after restart**.

1. On the **Installing Plugins/Upgrades** page, select **Restart Jenkins when installation is complete and no jobs are running**.

1. Choose **Back to Dashboard**.

1. On the main page, choose **New Item**.

1. In **Item Name**, enter a name for the Jenkins project (for example, *MyDemoProject*). Choose **Freestyle project**, and then choose **OK**.
**Note**  
Make sure that the name for your project meets the requirements for CodePipeline. For more information, see [Quotas in AWS CodePipeline](limits.md).

1. On the configuration page for the project, select the **Execute concurrent builds if necessary** check box. In **Source Code Management**, choose **AWS CodePipeline**. If you have installed Jenkins on an EC2 instance and configured the AWS CLI with the profile for the IAM user you created for integration between CodePipeline and Jenkins, leave all of the other fields empty.

1. Choose **Advanced**, and in **Provider**, enter a name for the provider of the action as it will appear in CodePipeline (for example, *MyJenkinsProviderName*). Make sure that this name is unique and easy to remember. You will use it when you add a build action to your pipeline later in this tutorial, and again when you add a test action.
**Note**  
This action name must meet the naming requirements for actions in CodePipeline. For more information, see [Quotas in AWS CodePipeline](limits.md).

1. In **Build Triggers**, clear any check boxes, and then select **Poll SCM**. In **Schedule**, enter five asterisks separated by spaces, as follows:

   ```
   * * * * *
   ```

   This polls CodePipeline every minute. 

1. In **Build**, choose **Add build step**. Choose **Execute shell** (Amazon Linux, RHEL, or Ubuntu Server) or **Execute batch command** (Windows Server), and then enter the following:

   ```
   rake
   ```
**Note**  
Make sure that your environment is configured with the variables and settings required to run rake; otherwise, the build will fail.
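   Before saving the project, you can confirm from a shell on the Jenkins instance that the build tools are on the `PATH` (a minimal check; it only reports, it does not install anything):

   ```
   # Report whether each tool the rake build step needs is on PATH.
   status=""
   for tool in rake haml; do
     if command -v "$tool" >/dev/null 2>&1; then
       status="${status}${tool}=found "
     else
       status="${status}${tool}=missing "
     fi
   done
   echo "$status"
   ```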

1. Choose **Add post-build action**, and then choose **AWS CodePipeline Publisher**. Choose **Add**, and in **Build Output Locations**, leave the location blank. This configuration is the default. It will create a compressed file at the end of the build process.

1. Choose **Save** to save your Jenkins project.

## Step 2: Create a pipeline in CodePipeline
<a name="tutorials-four-stage-pipeline-pipeline-create"></a>

In this part of the tutorial, you create the pipeline using the **Create Pipeline** wizard. 

**To create a CodePipeline automated release process**

1. Sign in to the AWS Management Console and open the CodePipeline console at [http://console.aws.amazon.com/codesuite/codepipeline/home](http://console.aws.amazon.com/codesuite/codepipeline/home).

1. If necessary, use the Region selector to change the Region to the one where your pipeline resources are located. For example, if you created resources for the previous tutorial in `us-east-2`, make sure that the Region selector is set to US East (Ohio).

   For more information about the Regions and endpoints available for CodePipeline, see [AWS CodePipeline endpoints and quotas](https://docs.aws.amazon.com/general/latest/gr/codepipeline.html).

1. On the **Welcome** page, **Getting started** page, or the **Pipelines** page, choose **Create pipeline**.

1. On the **Step 1: Choose creation option** page, under **Creation options**, choose the **Build custom pipeline** option. Choose **Next**.

1. On the **Step 2: Choose pipeline settings** page, in **Pipeline name**, enter the name for your pipeline.

1. CodePipeline provides V1 and V2 type pipelines, which differ in characteristics and price. The V2 type is the only type you can choose in the console. For more information, see [pipeline types](https://docs.aws.amazon.com/codepipeline/latest/userguide/pipeline-types-planning.html?icmpid=docs_acp_help_panel). For information about pricing for CodePipeline, see [Pricing](https://aws.amazon.com/codepipeline/pricing/).

1. In **Service role**, choose **New service role** to allow CodePipeline to create a service role in IAM.

1. Leave the settings under **Advanced settings** at their defaults, and choose **Next**.

1. On the **Step 3: Add source stage** page, in **Source provider**, choose **GitHub**.

1. Under **Connection**, choose an existing connection or create a new one. To create or manage a connection for your GitHub source action, see [GitHub connections](connections-github.md).

1. In **Step 4: Add build stage**, choose **Add Jenkins**. In **Provider name**, enter the name of the action you provided in the CodePipeline Plugin for Jenkins (for example, *MyJenkinsProviderName*). This name must exactly match the name in the CodePipeline Plugin for Jenkins. In **Server URL**, enter the URL of the EC2 instance where Jenkins is installed. In **Project name**, enter the name of the project you created in Jenkins, such as *MyDemoProject*, and then choose **Next**.

1. In **Step 5: Add test stage**, choose **Skip test stage**, and then accept the warning message by choosing **Skip** again. 

   Choose **Next**.

1. In **Step 6: Add deploy stage**, reuse the CodeDeploy application and deployment group you created in [Tutorial: Create a simple pipeline (S3 bucket)](tutorials-simple-s3.md). In **Deploy provider**, choose **CodeDeploy**. In **Application name**, enter **CodePipelineDemoApplication**, or choose the refresh button, and then choose the application name from the list. In **Deployment group**, enter **CodePipelineDemoFleet**, or choose it from the list, and then choose **Next**.
**Note**  
You can use your own CodeDeploy resources or create new ones, but you might incur additional costs.

1. In **Step 7: Review**, review the information, and then choose **Create pipeline**.

1. The pipeline starts automatically and runs the sample through it. You can view progress and success and failure messages as the pipeline builds the Haml sample to HTML and deploys it as a webpage to each of the Amazon EC2 instances in the CodeDeploy deployment.

## Step 3: Add another stage to your pipeline
<a name="tutorials-four-stage-pipeline-add-stage"></a>

Now you will add a test stage and then a test action to that stage that uses the Jenkins test included in the sample to determine whether the webpage has any content. This test is for demonstration purposes only.

**Note**  
If you did not want to add another stage to your pipeline, you could add a test action to the Staging stage of the pipeline, before or after the deployment action.

### Add a test stage to your pipeline
<a name="tutorials-four-stage-pipeline-add-stage-console"></a>

**Topics**
+ [

#### Look up the IP address of an instance
](#tutorials-four-stage-pipeline-instance-ip-lookup)
+ [

#### Create a Jenkins project for testing the deployment
](#tutorials-four-stage-pipeline-create-jenkins-project)
+ [

#### Create a fourth stage
](#tutorials-four-stage-pipeline-create-fourth-stage)

#### Look up the IP address of an instance
<a name="tutorials-four-stage-pipeline-instance-ip-lookup"></a>

**To verify the IP address of an instance where you deployed your code**

1. After **Succeeded** is displayed for the pipeline status, in the status area for the Staging stage, choose **Details**. 

1. In the **Deployment Details** section, in **Instance ID**, choose the instance ID of one of the successfully deployed instances. 

1. Copy the IP address of the instance (for example, *192.168.0.4*). You will use this IP address in your Jenkins test.

#### Create a Jenkins project for testing the deployment
<a name="tutorials-four-stage-pipeline-create-jenkins-project"></a>

**To create the Jenkins project**

1. On the instance where you installed Jenkins, open Jenkins and from the main page, choose **New Item**.

1.  In **Item Name**, enter a name for the Jenkins project (for example, *MyTestProject*). Choose **Freestyle project**, and then choose **OK**.
**Note**  
Make sure that the name for your project meets the CodePipeline requirements. For more information, see [Quotas in AWS CodePipeline](limits.md).

1. On the configuration page for the project, select the **Execute concurrent builds if necessary** check box. In **Source Code Management**, choose **AWS CodePipeline**. If you have installed Jenkins on an EC2 instance and configured the AWS CLI with the profile for the IAM user you created for integration between CodePipeline and Jenkins, leave all the other fields empty. 
**Important**  
If you are configuring a Jenkins project and it is not installed on an Amazon EC2 instance, or it is installed on an EC2 instance that is running a Windows operating system, complete the fields as required by your proxy host and port settings, and provide the credentials of the IAM user or role you configured for integration between Jenkins and CodePipeline.

1. Choose **Advanced**, and in **Category**, choose **Test**. 

1. In **Provider**, enter the same name you used for the build project (for example, *MyJenkinsProviderName*). You will use this name when you add the test action to your pipeline later in this tutorial.
**Note**  
This name must meet the CodePipeline naming requirements for actions. For more information, see [Quotas in AWS CodePipeline](limits.md).

1. In **Build Triggers**, clear any check boxes, and then select **Poll SCM**. In **Schedule**, enter five asterisks separated by spaces, as follows:

   ```
   * * * * *
   ```

   This polls CodePipeline every minute. 

1. In **Build**, choose **Add build step**. If you are deploying to Amazon Linux, RHEL, or Ubuntu Server instances, choose **Execute shell**. Then enter the following, where the IP address is the address of the EC2 instance you copied earlier:

   ```
   TEST_IP_ADDRESS=192.168.0.4 rake test
   ```

   If you are deploying to Windows Server instances, choose **Execute batch command**, and then enter the following, where the IP address is the address of the EC2 instance you copied earlier:

   ```
   set TEST_IP_ADDRESS=192.168.0.4
   rake test
   ```
**Note**  
The test assumes a default port of 80. If you want to specify a different port, add a test port statement, as follows:   

   ```
   TEST_IP_ADDRESS=192.168.0.4 TEST_PORT=8000 rake test
   ```

1. Choose **Add post-build action**, and then choose **AWS CodePipeline Publisher**. Do not choose **Add**.

1. Choose **Save** to save your Jenkins project.

#### Create a fourth stage
<a name="tutorials-four-stage-pipeline-create-fourth-stage"></a>

**To add a stage to your pipeline that includes the Jenkins test action**

1. Sign in to the AWS Management Console and open the CodePipeline console at [http://console.aws.amazon.com/codesuite/codepipeline/home](http://console.aws.amazon.com/codesuite/codepipeline/home).

1. In **Name**, choose the name of the pipeline you created, MySecondPipeline. 

1. On the pipeline details page, choose **Edit**. 

1. On the **Edit** page, choose **+ Add stage** to add a stage immediately after the Build stage. 

1. In the name field for the new stage, enter a name (for example, **Testing**), and then choose **+ Add action group**. 

1. In **Action name**, enter *MyJenkinsTest-Action*. In **Test provider**, choose the provider name you specified in Jenkins (for example, *MyJenkinsProviderName*). In **Project name**, enter the name of the project you created in Jenkins (for example, *MyTestProject*). In **Input artifacts**, choose the artifact from the Jenkins build whose default name is *BuildArtifact*, and then choose **Done**.
**Note**  
Because the Jenkins test action operates on the application built in the Jenkins build step, use the build artifact for the input artifact to the test action.

   For more information about input and output artifacts and the structure of pipelines, see [CodePipeline pipeline structure reference](reference-pipeline-structure.md).

1. On the **Edit** page, choose **Save pipeline changes**. In the **Save pipeline changes** dialog box, choose **Save and continue**.

1. Although the new stage has been added to your pipeline, a status of **No executions yet** is displayed for that stage because no changes have triggered another run of the pipeline. To run the sample through the revised pipeline, on the pipeline details page, choose **Release change**. 

   The pipeline view shows the stages and actions in your pipeline and the state of the revision running through those four stages. The time it takes for the pipeline to run through all stages will depend on the size of the artifacts, the complexity of your build and test actions, and other factors. 

## Step 4: Clean up resources
<a name="tutorials-four-stage-pipeline-clean-up"></a>

After you complete this tutorial, you should delete the pipeline and the resources it uses so you will not be charged for continued use of those resources. If you do not intend to keep using CodePipeline, delete the pipeline, then the CodeDeploy application and its associated Amazon EC2 instances, and finally, the Amazon S3 bucket used to store artifacts. You should also consider whether to delete other resources, such as the GitHub repository, if you do not intend to keep using them.

**To clean up the resources used in this tutorial**

1. Open a terminal session on your local Linux, macOS, or Unix machine, or a command prompt on your local Windows machine, and run the **delete-pipeline** command to delete the pipeline you created. For **MySecondPipeline**, you would enter the following command: 

   ```
   aws codepipeline delete-pipeline --name "MySecondPipeline"
   ```

   This command returns nothing.
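   To confirm the deletion, you can request the pipeline again (a sketch; once the pipeline is gone, this call fails with a `PipelineNotFoundException` error):

   ```
   # Expect an error once the pipeline has been deleted.
   aws codepipeline get-pipeline --name "MySecondPipeline"
   ```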

1. To clean up your CodeDeploy resources, follow the instructions in [Cleaning Up](http://docs.aws.amazon.com/codedeploy/latest/userguide/getting-started-walkthrough.html#getting-started-walkthrough-clean-up).

1. To clean up your instance resources, delete the EC2 instance where you installed Jenkins. For more information, see [Clean up your instance](https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/ec2-clean-up-your-instance.html).

1. If you do not intend to create more pipelines or use CodePipeline again, delete the Amazon S3 bucket used to store artifacts for your pipeline. To delete the bucket, follow the instructions in [Deleting a bucket](http://docs.aws.amazon.com/AmazonS3/latest/UG/DeletingaBucket.html).

1. If you do not intend to use the other resources for this pipeline again, consider deleting them by following the guidance for that particular resource. For example, if you want to delete the GitHub repository, follow the instructions in [Deleting a repository](https://help.github.com/articles/deleting-a-repository/) on the GitHub website.

# Tutorial: Set up a CloudWatch Events rule to receive email notifications for pipeline state changes
<a name="tutorials-cloudwatch-sns-notifications"></a>

After you set up a pipeline in AWS CodePipeline, you can set up a CloudWatch Events rule to send notifications whenever there are changes to the execution state of your pipelines, or in the stages or actions in your pipelines. For more information on using CloudWatch Events to set up notifications for pipeline state changes, see [Monitoring CodePipeline events](detect-state-changes-cloudwatch-events.md).

In this tutorial, you configure a notification to send an email when a pipeline's state changes to FAILED. This tutorial uses an input transformer method when creating the CloudWatch Events rule. It transforms the message schema details to deliver the message in human-readable text.
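For reference, the rule you build in the console can be sketched with the AWS CLI; the event pattern matches only pipeline executions that enter the FAILED state (`MyPipelineFailureRule` is an example name):

```
# Create a CloudWatch Events rule that matches failed pipeline executions.
aws events put-rule --name MyPipelineFailureRule \
  --event-pattern '{
    "source": ["aws.codepipeline"],
    "detail-type": ["CodePipeline Pipeline Execution State Change"],
    "detail": { "state": ["FAILED"] }
  }'
```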

**Note**  
As you create the resources for this tutorial, such as the Amazon SNS notification and the CloudWatch Events rule, make sure the resources are created in the same AWS Region as your pipeline.

**Topics**
+ [

## Step 1: Set up an email notification using Amazon SNS
](#create-filter-for-target)
+ [

## Step 2: Create a rule and add the SNS topic as the target
](#create-notification-rule)
+ [

## Step 3: Clean up resources
](#notifications-clean-up-resources)

## Step 1: Set up an email notification using Amazon SNS
<a name="create-filter-for-target"></a>

Amazon SNS coordinates the use of topics to deliver messages to subscribing endpoints or clients. Use Amazon SNS to create a notification topic, and then subscribe to the topic with your email address. The Amazon SNS topic is added as a target to your CloudWatch Events rule. For more information, see the [Amazon Simple Notification Service Developer Guide](https://docs.aws.amazon.com/sns/latest/dg/).

Create or identify a topic in Amazon SNS. CodePipeline will use CloudWatch Events to send notifications to this topic through Amazon SNS. To create a topic:

1. Open the Amazon SNS console at [https://console.aws.amazon.com/sns](https://console.aws.amazon.com/sns).

1. Choose **Create topic**. 

1. In the **Create new topic** dialog box, for **Topic name**, type a name for the topic (for example, **PipelineNotificationTopic**).   
![\[Create the notification topic using Amazon SNS.\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/tutorial-SNS-topic.png)

1. Choose **Create topic**.

   For more information, see [Create a Topic](https://docs.aws.amazon.com/sns/latest/dg/CreateTopic.html) in the *Amazon SNS Developer Guide*.

Subscribe one or more recipients to the topic to receive email notifications. To subscribe a recipient to a topic:

1. In the Amazon SNS console, from the **Topics** list, select the check box next to your new topic. Choose **Actions**, and then choose **Subscribe to topic**.

1. In the **Create subscription** dialog box, verify that an ARN appears in **Topic ARN**.

1. For **Protocol**, choose **Email**.

1. For **Endpoint**, type the recipient's full email address.

1. Choose **Create Subscription**.

1. Amazon SNS sends a subscription confirmation email to the recipient. To receive email notifications, the recipient must choose the **Confirm subscription** link in this email. After the recipient chooses the link, Amazon SNS displays a confirmation message in the recipient's web browser.

   For more information, see [Subscribe to a Topic](https://docs.aws.amazon.com/sns/latest/dg/SubscribeTopic.html) in the *Amazon SNS Developer Guide*.

## Step 2: Create a rule and add the SNS topic as the target
<a name="create-notification-rule"></a>

Create a CloudWatch Events notification rule with CodePipeline as the event source.

1. Open the CloudWatch console at [https://console.aws.amazon.com/cloudwatch/](https://console.aws.amazon.com/cloudwatch/).

1. In the navigation pane, choose **Events**.

1. Choose **Create rule**. Under **Event source**, choose **AWS CodePipeline**. For **Event Type**, choose **Pipeline Execution State Change**.

1. Choose **Specific state(s)**, and choose **FAILED**.

1. Choose **Edit** to open the JSON editor for the **Event Pattern Preview** pane. Add the **pipeline** parameter with the name of your pipeline as shown in the following example for a pipeline named "myPipeline."

   You can copy the event pattern here and paste it into the console:

   ```
   {
     "source": [
       "aws.codepipeline"
     ],
     "detail-type": [
       "CodePipeline Pipeline Execution State Change"
     ],
     "detail": {
       "state": [
         "FAILED"
       ],
       "pipeline": [
         "myPipeline"
       ]
     }
   }
   ```

1. For **Targets**, choose **Add target**. 

1. In the list of targets, choose **SNS topic**. For **Topic**, choose the topic you created.

1. Expand **Configure input**, and then choose **Input Transformer**. 

1. In the **Input Path** box, type the following key-value pair.

   ```
   { "pipeline" : "$.detail.pipeline" }
   ```

   In the **Input Template** box, type the following: 

   ```
   "The Pipeline <pipeline> has failed."
   ```

1. Choose **Configure details**.

1. On the **Configure rule details** page, type a name and an optional description. For **State**, leave the **Enabled** box selected.

1. Choose **Create rule**. 

1. Confirm that CodePipeline is now sending pipeline notifications. For example, check to see whether the notification emails arrive in your inbox after a pipeline fails.

1. To change a rule's behavior, in the CloudWatch console, choose the rule, and then choose **Actions**, **Edit**. Edit the rule, choose **Configure details**, and then choose **Update rule**.

   To stop using a rule to send build notifications, in the CloudWatch console, choose the rule, and then choose **Actions**, **Disable**.

   To delete a rule, in the CloudWatch console, choose the rule, and then choose **Actions**, **Delete**.
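
You can preview the message the rule delivers before waiting for a real failure. The following sketch emulates the input transformer locally with a sample event; the pipeline name and the abbreviated event shape are illustrative, and `python3` is used here only to parse the JSON the way the `$.detail.pipeline` input path selects the field:

```
# A sample CodePipeline state-change event, abbreviated to the fields the rule uses.
event='{"source":"aws.codepipeline","detail-type":"CodePipeline Pipeline Execution State Change","detail":{"pipeline":"myPipeline","state":"FAILED"}}'

# Emulate the input path { "pipeline" : "$.detail.pipeline" }: select the pipeline name.
pipeline=$(printf '%s' "$event" | python3 -c 'import json,sys; print(json.load(sys.stdin)["detail"]["pipeline"])')

# Emulate the input template: substitute the selected value into the message text.
message="The Pipeline $pipeline has failed."
echo "$message"
```

With the input transformer configured as described in this step, the email body for a pipeline named `myPipeline` reads `The Pipeline myPipeline has failed.`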

## Step 3: Clean up resources
<a name="notifications-clean-up-resources"></a>

After you complete this tutorial, you should delete the pipeline and the resources it uses so you will not be charged for continued use of those resources. 

For information about how to clean up the SNS notification and delete the Amazon CloudWatch Events rule, see [Clean Up (Unsubscribe from an Amazon SNS Topic)](http://docs.aws.amazon.com/sns/latest/dg/CleanUp.html) and the `DeleteRule` action in the [Amazon CloudWatch Events API Reference](https://docs.aws.amazon.com/AmazonCloudWatchEvents/latest/APIReference/).

# Tutorial: Create a pipeline that builds and tests your Android app with AWS Device Farm
<a name="tutorials-codebuild-devicefarm"></a>

You can use AWS CodePipeline to configure a continuous integration flow in which your app is built and tested each time a commit is pushed. This tutorial shows how to create and configure a pipeline to build and test your Android app with source code in a GitHub repository. The pipeline detects the arrival of a new GitHub commit and then uses [CodeBuild](https://docs.aws.amazon.com/codebuild/latest/userguide/welcome.html) to build the app and [Device Farm](https://docs.aws.amazon.com/devicefarm/latest/developerguide/welcome.html) to test it.

**Important**  
As part of creating a pipeline in the console, CodePipeline uses an S3 artifact bucket for pipeline artifacts. (This is different from the bucket used for an S3 source action.) If the S3 artifact bucket is in a different account from the account for your pipeline, make sure that the bucket is owned by an AWS account that you trust.

**Important**  
Many of the actions you add to your pipeline in this procedure involve AWS resources that you need to create before you create the pipeline. AWS resources for your source actions must always be created in the same AWS Region where you create your pipeline. For example, if you create your pipeline in the US East (Ohio) Region, your CodeCommit repository must be in the US East (Ohio) Region.   
You can add cross-region actions when you create your pipeline. AWS resources for cross-region actions must be in the same AWS Region where you plan to execute the action. For more information, see [Add a cross-Region action in CodePipeline](actions-create-cross-region.md).

You can try this out using your existing Android app and test definitions, or you can use the [sample app and test definitions provided by Device Farm](https://github.com/aws-samples/aws-device-farm-sample-app-for-android).

**Before you begin**

1. Sign in to the AWS Device Farm console and choose **Create a new project**.

1. Choose your project. In the browser, copy the URL of your new project. The URL contains the project ID. 

1. Copy and retain this project ID. You use it when you create your pipeline in CodePipeline.

   Here is an example URL for a project. To extract the project ID, copy the value after `projects/`. In this example, the project ID is `eec4905f-98f8-40aa-9afc-4c1cfexample`.

   ```
   https://<region-URL>/devicefarm/home?region=us-west-2#/projects/eec4905f-98f8-40aa-9afc-4c1cfexample/runs
   ```
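
If you prefer the command line, the project ID can be cut out of the copied URL with standard shell string operations instead of by hand. This is a convenience sketch using the example URL from this tutorial; substitute the URL of your own project:

```
# Example project URL; replace with the URL you copied from the Device Farm console.
url="https://console.aws.amazon.com/devicefarm/home?region=us-west-2#/projects/eec4905f-98f8-40aa-9afc-4c1cfexample/runs"

# Strip everything through "projects/", then drop everything after the next "/".
project_id="${url##*projects/}"
project_id="${project_id%%/*}"
echo "$project_id"
```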

## Configure CodePipeline to use your Device Farm tests
<a name="codepipeline-configure-tests"></a>

1. Add and commit a file called [buildspec.yml](https://docs.aws.amazon.com/codebuild/latest/userguide/build-spec-ref.html) in the root of your app code, and push it to your repository. CodeBuild uses this file to perform commands and access artifacts required to build your app.

   ```
   version: 0.2
   
   phases:
     build:
       commands:
         - chmod +x ./gradlew
         - ./gradlew assembleDebug
   artifacts:
     files:
        - './android/app/build/outputs/**/*.apk'
     discard-paths: yes
   ```

1. (Optional) If you [use Calabash or Appium to test your app](https://docs.aws.amazon.com/devicefarm/latest/developerguide/test-types-intro.html), add the test definition file to your repository. In a later step, you can configure Device Farm to use the definitions to carry out your test suite.

   If you use Device Farm built-in tests, you can skip this step.

1. To create your pipeline and add a source stage, do the following:

   1. Sign in to the AWS Management Console and open the CodePipeline console at [https://console.aws.amazon.com/codepipeline/](https://console.aws.amazon.com/codepipeline/).

   1. On the **Welcome** page, **Getting started** page, or the **Pipelines** page, choose **Create pipeline**.

   1. On the **Step 1: Choose creation option** page, under **Creation options**, choose the **Build custom pipeline** option. Choose **Next**.

   1. On the **Step 2: Choose pipeline settings** page, in **Pipeline name**, enter the name for your pipeline.

   1. CodePipeline provides V1 and V2 type pipelines, which differ in characteristics and price. The V2 type is the only type you can choose in the console. For more information, see [pipeline types](https://docs.aws.amazon.com/codepipeline/latest/userguide/pipeline-types-planning.html). For information about pricing for CodePipeline, see [Pricing](https://aws.amazon.com/codepipeline/pricing/).

   1. In **Service role**, leave **New service role** selected, and leave **Role name** unchanged. You can also choose to use an existing service role, if you have one.
**Note**  
If you use a CodePipeline service role that was created before July 2018, you need to add permissions for Device Farm. To do this, open the IAM console, find the role, and then add the following permissions to the role's policy. For more information, see [Add permissions to the CodePipeline service role](how-to-custom-role.md#how-to-update-role-new-services).  

      ```
      {
           "Effect": "Allow",
           "Action": [
              "devicefarm:ListProjects",
              "devicefarm:ListDevicePools",
              "devicefarm:GetRun",
              "devicefarm:GetUpload",
              "devicefarm:CreateUpload",
              "devicefarm:ScheduleRun"
           ],
           "Resource": "*"
      }
      ```

   1. Leave the settings under **Advanced settings** at their defaults, and then choose **Next**.

   1. On the **Step 3: Add source stage** page, in **Source provider**, choose **GitHub (via GitHub App)**.

   1. Under **Connection**, choose an existing connection or create a new one. To create or manage a connection for your GitHub source action, see [GitHub connections](connections-github.md).

   1. In **Repository**, choose the source repository.

   1. In **Branch**, choose the branch that you want to use.

   1. Leave the remaining defaults for the source action. Choose **Next**.

1. In **Step 4: Add build stage**, add a build stage:

   1. In **Build provider**, choose **Other build providers**, and then choose **AWS CodeBuild**. Allow **Region** to default to the pipeline Region.

   1. Choose **Create project**.

   1. In **Project name**, enter a name for this build project.

   1. In **Environment image**, choose **Managed image**. For **Operating system**, choose **Ubuntu**.

   1. For **Runtime**, choose **Standard**. For **Image**, choose **aws/codebuild/standard:5.0**.

      CodeBuild uses this OS image, which has Android Studio installed, to build your app.

   1. For **Service role**, choose your existing CodeBuild service role or create a new one.

   1. For **Build specifications**, choose **Use a buildspec file**.

   1. Choose **Continue to CodePipeline**. This returns you to the CodePipeline console and creates a CodeBuild project that uses the `buildspec.yml` in your repository for configuration. The build project uses a service role to manage AWS service permissions. This step might take a couple of minutes.

   1. Choose **Next**.

1. In **Step 5: Add test stage**, choose **Skip test stage**, and then accept the warning message by choosing **Skip** again. 

   Choose **Next**.

1. On the **Step 6: Add deploy stage** page, choose **Skip deploy stage**, and then accept the warning message by choosing **Skip** again. Choose **Next**.

1. On **Step 7: Review**, choose **Create pipeline**. You should see a diagram that shows the source and build stages.

1. Add a Device Farm test action to your pipeline:

   1. In the upper right, choose **Edit**.

   1. At the bottom of the diagram, choose **+ Add stage**. In **Stage name**, enter a name, such as **Test**.

   1. Choose **+ Add action group**.

   1. In **Action name**, enter a name. 

   1. In **Action provider**, choose **AWS Device Farm**. Allow **Region** to default to the pipeline Region.

   1. In **Input artifacts**, choose the input artifact that matches the output artifact of the stage that comes before the test stage, such as `BuildArtifact`. 

      In the AWS CodePipeline console, you can find the name of the output artifact for each stage by hovering over the information icon in the pipeline diagram. If your pipeline tests your app directly from the **Source** stage, choose **SourceArtifact**. If the pipeline includes a **Build** stage, choose **BuildArtifact**.

   1. In **ProjectId**, enter your Device Farm project ID. Use the steps at the start of this tutorial to retrieve your project ID.

   1. In **DevicePoolArn**, enter the ARN for the device pool. To get the available device pool ARNs for the project, including the ARN for Top Devices, use the AWS CLI to enter the following command: 

      ```
      aws devicefarm list-device-pools --arn arn:aws:devicefarm:us-west-2:account_ID:project:project_ID
      ```

   1. In **AppType**, enter **Android**.

      The following is a list of valid values for **AppType**:
      + **iOS**
      + **Android**
      + **Web**

   1. In **App**, enter the path of the compiled app package. The path is relative to the root of the input artifact for the test stage. Typically, this path is similar to `app-release.apk`.

   1. In **TestType**, enter your type of test, and then in **Test**, enter the path of the test definition file. The path is relative to the root of the input artifact for your test.

      The following is a list of valid values for **TestType**:
      + **APPIUM\_JAVA\_JUNIT**
      + **APPIUM\_JAVA\_TESTNG**
      + **APPIUM\_NODE**
      + **APPIUM\_RUBY**
      + **APPIUM\_PYTHON**
      + **APPIUM\_WEB\_JAVA\_JUNIT**
      + **APPIUM\_WEB\_JAVA\_TESTNG**
      + **APPIUM\_WEB\_NODE**
      + **APPIUM\_WEB\_RUBY**
      + **APPIUM\_WEB\_PYTHON**
      + **BUILTIN\_FUZZ**
      + **INSTRUMENTATION**
      + **XCTEST**
      + **XCTEST\_UI**
**Note**  
Custom environment nodes are not supported.

   1. In the remaining fields, provide the configuration that is appropriate for your test and application type.

   1. (Optional) In **Advanced**, provide configuration information for your test run.

   1. Choose **Save**.

   1. On the stage you are editing, choose **Done**. In the AWS CodePipeline pane, choose **Save**, and then choose **Save** on the warning message.

   1. To submit your changes and start a pipeline build, choose **Release change**, and then choose **Release**.
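
For reference, the completed Device Farm action is stored in your pipeline structure roughly as shown in the following fragment (for example, in the output of `aws codepipeline get-pipeline`). This is an illustrative sketch, not output from a real pipeline: the project ID, device pool ARN, and file names are placeholders, and the configuration keys correspond to the fields described in the steps above.

```
{
  "name": "DeviceFarmTest",
  "actionTypeId": {
    "category": "Test",
    "owner": "AWS",
    "provider": "DeviceFarm",
    "version": "1"
  },
  "inputArtifacts": [ { "name": "BuildArtifact" } ],
  "configuration": {
    "ProjectId": "eec4905f-98f8-40aa-9afc-4c1cfexample",
    "DevicePoolArn": "arn:aws:devicefarm:us-west-2:account_ID:devicepool:project_ID/pool_ID",
    "AppType": "Android",
    "App": "app-release.apk",
    "TestType": "APPIUM_JAVA_JUNIT",
    "Test": "tests.zip"
  },
  "runOrder": 1
}
```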

# Tutorial: Create a pipeline that tests your iOS app with AWS Device Farm
<a name="tutorials-codebuild-devicefarm-S3"></a>

You can use AWS CodePipeline to easily configure a continuous integration flow in which your app is tested each time the source bucket changes. This tutorial shows you how to create and configure a pipeline to test your built iOS app from an S3 bucket. The pipeline detects the arrival of a saved change through Amazon CloudWatch Events, and then uses [Device Farm](https://docs.aws.amazon.com/devicefarm/latest/developerguide/welcome.html) to test the built application.

**Important**  
As part of creating a pipeline, CodePipeline uses an S3 artifact bucket that you provide for pipeline artifacts. (This is different from the bucket used for an S3 source action.) If the S3 artifact bucket is in a different account from the account for your pipeline, make sure that the bucket is owned by an AWS account that you trust.

**Important**  
Many of the actions you add to your pipeline in this procedure involve AWS resources that you need to create before you create the pipeline. AWS resources for your source actions must always be created in the same AWS Region where you create your pipeline. For example, if you create your pipeline in the US East (Ohio) Region, your CodeCommit repository must be in the US East (Ohio) Region.   
You can add cross-region actions when you create your pipeline. AWS resources for cross-region actions must be in the same AWS Region where you plan to execute the action. For more information, see [Add a cross-Region action in CodePipeline](actions-create-cross-region.md).

You can try this out using your existing iOS app, or you can use the [sample iOS app](samples/s3-ios-test-1.zip).

**Before you begin**

1. Sign in to the AWS Device Farm console and choose **Create a new project**.

1. Choose your project. In the browser, copy the URL of your new project. The URL contains the project ID.

1. Copy and retain this project ID. You use it when you create your pipeline in CodePipeline.

   Here is an example URL for a project. To extract the project ID, copy the value after `projects/`. In this example, the project ID is `eec4905f-98f8-40aa-9afc-4c1cfexample`.

   ```
   https://<region-URL>/devicefarm/home?region=us-west-2#/projects/eec4905f-98f8-40aa-9afc-4c1cfexample/runs
   ```

## Configure CodePipeline to use your Device Farm tests (Amazon S3 example)
<a name="codepipeline-configure-tests-S3"></a>

1. Create or use an S3 bucket with versioning enabled. Follow the instructions in [Step 1: Create an S3 source bucket for your application](tutorials-simple-s3.md#s3-create-s3-bucket) to create an S3 bucket.

1. In the Amazon S3 console for your bucket, choose **Upload**, and follow the instructions to upload your .zip file.

   Your sample application must be packaged in a .zip file.

1. To create your pipeline and add a source stage, do the following:

   1. Sign in to the AWS Management Console and open the CodePipeline console at [https://console.aws.amazon.com/codepipeline/](https://console.aws.amazon.com/codepipeline/).

   1. On the **Welcome** page, **Getting started** page, or the **Pipelines** page, choose **Create pipeline**.

   1. On the **Step 1: Choose creation option** page, under **Creation options**, choose the **Build custom pipeline** option. Choose **Next**.

   1. On the **Step 2: Choose pipeline settings** page, in **Pipeline name**, enter the name for your pipeline.

   1. CodePipeline provides V1 and V2 type pipelines, which differ in characteristics and price. The V2 type is the only type you can choose in the console. For more information, see [pipeline types](https://docs.aws.amazon.com/codepipeline/latest/userguide/pipeline-types-planning.html). For information about pricing for CodePipeline, see [Pricing](https://aws.amazon.com/codepipeline/pricing/).

   1. In **Service role**, leave **New service role** selected, and leave **Role name** unchanged. You can also choose to use an existing service role, if you have one.
**Note**  
If you use a CodePipeline service role that was created before July 2018, you must add permissions for Device Farm. To do this, open the IAM console, find the role, and then add the following permissions to the role's policy. For more information, see [Add permissions to the CodePipeline service role](how-to-custom-role.md#how-to-update-role-new-services).  

      ```
      {
           "Effect": "Allow",
           "Action": [
              "devicefarm:ListProjects",
              "devicefarm:ListDevicePools",
              "devicefarm:GetRun",
              "devicefarm:GetUpload",
              "devicefarm:CreateUpload",
              "devicefarm:ScheduleRun"
           ],
           "Resource": "*"
      }
      ```

   1. Leave the settings under **Advanced settings** at their defaults, and then choose **Next**.

   1. On the **Step 3: Add source stage** page, in **Source provider**, choose **Amazon S3**.

   1. In **Amazon S3 location**, enter the bucket, such as `my-storage-bucket`, and object key, such as `s3-ios-test-1.zip` for your .zip file.

   1. Choose **Next**.

1. In **Step 4: Add build stage**, create a placeholder build stage for your pipeline. A build stage allows you to complete the wizard; after the pipeline is created, you delete this placeholder stage and add a test stage in its place (in step 5).

   1. In **Build provider**, choose **Add Jenkins**. This build selection is a placeholder; it is not used.

   1. In **Provider name**, **Server URL**, and **Project name**, enter any placeholder text. These values are not used.

   1. Choose **Next**.

   1. In **Step 5: Add test stage**, choose **Skip test stage**, and then accept the warning message by choosing **Skip** again. 

      Choose **Next**.

   1. On the **Step 6: Add deploy stage** page, choose **Skip deploy stage**, and then accept the warning message by choosing **Skip** again.

   1. On **Step 7: Review**, choose **Create pipeline**. You should see a diagram that shows the source and build stages.  
![\[\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/codepipeline-view-pipeline-S3.png)

1. Add a Device Farm test action to your pipeline as follows:

   1. In the upper right, choose **Edit**. 

   1. On the placeholder build stage, choose **Edit stage**, and then choose **Delete**. This deletes the placeholder stage now that you no longer need it for pipeline creation.

   1. At the bottom of the diagram, choose **+ Add stage**.

   1. In **Stage name**, enter a name for the stage, such as **Test**, and then choose **Add stage**.

   1. Choose **+ Add action group**.

   1. In **Action name**, enter a name, such as **DeviceFarmTest**.

   1. In **Action provider**, choose **AWS Device Farm**. Allow **Region** to default to the pipeline Region.

   1. In **Input artifacts**, choose the input artifact that matches the output artifact of the stage that comes before the test stage, such as `SourceArtifact`. 

      In the AWS CodePipeline console, you can find the name of the output artifact for each stage by hovering over the information icon in the pipeline diagram. If your pipeline tests your app directly from the **Source** stage, choose **SourceArtifact**. If the pipeline includes a **Build** stage, choose **BuildArtifact**.

   1. In **ProjectId**, choose your Device Farm project ID. Use the steps at the start of this tutorial to retrieve your project ID.

   1. In **DevicePoolArn**, enter the ARN for the device pool. To get the available device pool ARNs for the project, including the ARN for Top Devices, use the AWS CLI to enter the following command: 

      ```
      aws devicefarm list-device-pools --arn arn:aws:devicefarm:us-west-2:account_ID:project:project_ID
      ```

   1. In **AppType**, enter **iOS**.

      The following is a list of valid values for **AppType**:
      + **iOS**
      + **Android**
      + **Web**

   1. In **App**, enter the path of the compiled app package. The path is relative to the root of the input artifact for the test stage. Typically, this path is similar to `ios-test.ipa`.

   1. In **TestType**, enter your type of test, and then in **Test**, enter the path of the test definition file. The path is relative to the root of the input artifact for your test.

      If you're using one of the built-in Device Farm tests, enter the type of test configured in your Device Farm project, such as **BUILTIN\_FUZZ**. In **FuzzEventCount**, enter the number of fuzz events to perform, such as **6000**. In **FuzzEventThrottle**, enter the time in milliseconds between events, such as **50**.

      If you aren't using one of the built-in Device Farm tests, enter your type of test, and then in **Test**, enter the path of the test definition file. The path is relative to the root of the input artifact for your test. 

      The following is a list of valid values for **TestType**:
      + **APPIUM\_JAVA\_JUNIT**
      + **APPIUM\_JAVA\_TESTNG**
      + **APPIUM\_NODE**
      + **APPIUM\_RUBY**
      + **APPIUM\_PYTHON**
      + **APPIUM\_WEB\_JAVA\_JUNIT**
      + **APPIUM\_WEB\_JAVA\_TESTNG**
      + **APPIUM\_WEB\_NODE**
      + **APPIUM\_WEB\_RUBY**
      + **APPIUM\_WEB\_PYTHON**
      + **BUILTIN\_FUZZ**
      + **INSTRUMENTATION**
      + **XCTEST**
      + **XCTEST\_UI**
**Note**  
Custom environment nodes are not supported.

   1. In the remaining fields, provide the configuration that is appropriate for your test and application type.

   1. (Optional) In **Advanced**, provide configuration information for your test run.

   1. Choose **Save**.

   1. On the stage you are editing, choose **Done**. In the AWS CodePipeline pane, choose **Save**, and then choose **Save** on the warning message.

   1. To submit your changes and start a pipeline execution, choose **Release change**, and then choose **Release**.
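
For reference, the completed Device Farm action is stored in your pipeline structure roughly as shown in the following fragment (for example, in the output of `aws codepipeline get-pipeline`). This is an illustrative sketch with placeholder values, using the built-in fuzz test fields described in the steps above; it is not output from a real pipeline.

```
{
  "name": "DeviceFarmTest",
  "actionTypeId": {
    "category": "Test",
    "owner": "AWS",
    "provider": "DeviceFarm",
    "version": "1"
  },
  "inputArtifacts": [ { "name": "SourceArtifact" } ],
  "configuration": {
    "ProjectId": "eec4905f-98f8-40aa-9afc-4c1cfexample",
    "DevicePoolArn": "arn:aws:devicefarm:us-west-2:account_ID:devicepool:project_ID/pool_ID",
    "AppType": "iOS",
    "App": "ios-test.ipa",
    "TestType": "BUILTIN_FUZZ",
    "FuzzEventCount": "6000",
    "FuzzEventThrottle": "50"
  },
  "runOrder": 1
}
```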

# Tutorial: Create a pipeline that deploys to Service Catalog
<a name="tutorials-S3-servicecatalog"></a>

Service Catalog enables you to create and provision products based on AWS CloudFormation templates. 

**Important**  
As part of creating a pipeline, CodePipeline uses an S3 artifact bucket that you provide for pipeline artifacts. (This is different from the bucket used for an S3 source action.) If the S3 artifact bucket is in a different account from the account for your pipeline, make sure that the bucket is owned by an AWS account that you trust.

This tutorial shows you how to create and configure a pipeline to deploy your product template to Service Catalog and deliver changes you have made in your source repository (already created in GitHub, CodeCommit, or Amazon S3).

**Note**  
When Amazon S3 is the source provider for your pipeline, you must upload to your bucket all source files packaged as a single .zip file. Otherwise, the source action fails.

First, you create a product in Service Catalog, and then you create a pipeline in AWS CodePipeline. This tutorial provides two options for setting up the deployment configuration:
+ Create a product in Service Catalog and upload a template file to your source repository. Provide product version and deployment configuration in the CodePipeline console (without a separate configuration file). See [Option 1: Deploy to Service Catalog without a configuration file](#tutorials-S3-servicecatalog-ex1-configure).
**Note**  
The template file can be created in YAML or JSON format.
+ Create a product in Service Catalog and upload a template file to your source repository. Provide product version and deployment configuration in a separate configuration file. See [Option 2: Deploy to Service Catalog using a configuration file](#tutorials-S3-servicecatalog-ex2-configure).

## Option 1: Deploy to Service Catalog without a configuration file
<a name="tutorials-S3-servicecatalog-ex1-configure"></a>

In this example, you upload the sample AWS CloudFormation template file for an S3 bucket, and then create your product in Service Catalog. Next, you create your pipeline and specify deployment configuration in the CodePipeline console.

### Step 1: Upload sample template file to source repository
<a name="tutorials-S3-servicecatalog-configure"></a>

1. Open a text editor. Create a sample template by pasting the following into the file. Save the file as `S3_template.json`.

   ```
   {
     "AWSTemplateFormatVersion": "2010-09-09",
     "Description": "CloudFormation Sample Template S3_Bucket: Sample template showing how to create a privately accessible S3 bucket. **WARNING** This template creates an S3 bucket. You will be billed for the resources used if you create a stack from this template.",
     "Resources": {
       "S3Bucket": {
         "Type": "AWS::S3::Bucket",
         "Properties": {}
       }
     },
     "Outputs": {
       "BucketName": {
         "Value": {
           "Ref": "S3Bucket"
         },
         "Description": "Name of Amazon S3 bucket to hold website content"
       }
     }
   }
   ```

   This template allows AWS CloudFormation to create an S3 bucket that can be used by Service Catalog.

1. Upload the `S3_template.json` file to your AWS CodeCommit repository.

### Step 2: Create a product in Service Catalog
<a name="tutorials-S3-servicecatalog-product"></a>

1. As an IT administrator, sign in to the Service Catalog console, go to the **Products** page, and then choose **Upload new product**.

1. On the **Upload new product** page, complete the following:

   1. In **Product name**, enter the name you want to use for your new product.

   1. In **Description**, enter the product catalog description. This description is shown in the product listing to help the user choose the correct product. 

   1. In **Provided by**, enter the name of your IT department or administrator.

   1. Choose **Next**.

1. (Optional) In **Enter support details**, enter contact information for product support, and choose **Next**.

1. In **Version details**, complete the following:

   1. Choose **Upload a template file**. Browse for your `S3_template.json` file and upload it.

   1. In **Version title**, enter the name of the product version (for example, **devops S3 v2**).

   1. In **Description**, enter details that distinguish this version from other versions.

   1. Choose **Next**.

1. On the **Review** page, verify that the information is correct, and then choose **Create**. 

1. On the **Products** page, in the browser, copy the URL of your new product. This contains the product ID. Copy and retain this product ID. You use it when you create your pipeline in CodePipeline.

   Here is the URL for a product named `my-product`. To extract the product ID, copy the value between the equals sign (`=`) and the ampersand (`&`). In this example, the product ID is `prod-example123456`.

   ```
   https://<region-URL>/servicecatalog/home?region=<region>#/admin-products?productCreated=prod-example123456&createdProductTitle=my-product
   ```
**Note**  
Copy the URL for your product before you navigate away from the page. Once you navigate away from this page, you must use the CLI to obtain your product ID.

   After a few seconds, your product appears on the **Products** page. You might need to refresh your browser to see the product in the list.
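
The product ID can also be cut out of the copied URL with standard shell string operations instead of by hand. This is a convenience sketch using the example URL above; substitute the URL of your own product:

```
# Example product URL; replace with the URL you copied from the Service Catalog console.
url="https://console.aws.amazon.com/servicecatalog/home?region=us-west-2#/admin-products?productCreated=prod-example123456&createdProductTitle=my-product"

# Take the value between "productCreated=" and the next "&".
product_id="${url##*productCreated=}"
product_id="${product_id%%&*}"
echo "$product_id"
```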

### Step 3: Create your pipeline
<a name="tutorials-S3-servicecatalog-pipeline"></a>

1. To name your pipeline and select parameters for your pipeline, do the following:

   1. Sign in to the AWS Management Console and open the CodePipeline console at [https://console.aws.amazon.com/codepipeline/](https://console.aws.amazon.com/codepipeline/).

   1. On the **Welcome** page, **Getting started** page, or the **Pipelines** page, choose **Create pipeline**.

   1. On the **Step 1: Choose creation option** page, under **Creation options**, choose the **Build custom pipeline** option. Choose **Next**.

   1. In **Step 2: Choose pipeline settings**, in **Pipeline name**, enter a name for your pipeline.

   1. CodePipeline provides V1 and V2 type pipelines, which differ in characteristics and price. The V2 type is the only type you can choose in the console. For more information, see [pipeline types](https://docs.aws.amazon.com/codepipeline/latest/userguide/pipeline-types-planning.html?icmpid=docs_acp_help_panel). For information about pricing for CodePipeline, see [Pricing](https://aws.amazon.com/codepipeline/pricing/).

   1. In **Service role**, choose **New service role** to allow CodePipeline to create a service role in IAM.

   1. Leave the settings under **Advanced settings** at their defaults, and then choose **Next**.

1. To add a source stage on the **Step 3: Add source stage** page, do the following:

   1. In **Source provider**, choose **AWS CodeCommit**.

   1. In **Repository name** and **Branch name**, enter the repository and branch you want to use for your source action.

   1. Choose **Next**.

1. In **Step 4: Add build stage**, choose **Skip build stage**, and then accept the warning message by choosing **Skip** again.

1. In **Step 5: Add test stage**, choose **Skip test stage**, and then accept the warning message by choosing **Skip** again. 

   Choose **Next**.

1. In **Step 6: Add deploy stage**, complete the following:

   1. In **Deploy provider**, choose **AWS Service Catalog**.

   1. For deployment configuration, choose **Enter deployment configuration**.

   1. In **Product ID**, paste the product ID you copied from the Service Catalog console.

   1. In **Template file path**, enter the relative path where the template file is stored.

   1. In **Product type**, choose **CloudFormation template**.

   1. In **Product version name**, enter the name of the product version you specified in Service Catalog. If you want to have the template change deployed to a new product version, enter a product version name that has not been used for any previous product version in the same product.

   1. For **Input artifact**, choose the source input artifact.

   1. Choose **Next**.

1. In **Step 7: Review**, review your pipeline settings, and then choose **Create**.

1. After your pipeline runs successfully, on the deployment stage, choose **Details**. This opens your product in Service Catalog.  
![\[\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/deploy-servicecatalog-pipeline.png)

1. Under your product information, choose your version name to open the product template. View the template deployment.

### Step 4: Push a change and verify your product in Service Catalog
<a name="tutorials-S3-servicecatalog-change"></a>

1. View your pipeline in the CodePipeline console, and on your source stage, choose **Details**. Your source AWS CodeCommit repository opens in the console. Choose **Edit**, and make a change in the file (for example, to the description). 

   ```
   "Description": "Name of Amazon S3 bucket to hold and version website content"
   ```

1. Commit and push your change. Your pipeline starts after you push the change. When the run of the pipeline is complete, on the deployment stage, choose **Details** to open your product in Service Catalog.

1. Under your product information, choose the new version name to open the product template. View the deployed template change.

## Option 2: Deploy to Service Catalog using a configuration file
<a name="tutorials-S3-servicecatalog-ex2-configure"></a>

In this example, you upload the sample AWS CloudFormation template file for an S3 bucket, and then create your product in Service Catalog. You also upload a separate configuration file that specifies your deployment configuration. Next, you create your pipeline and specify the location of your configuration file.

### Step 1: Upload sample template file to source repository
<a name="tutorials-S3-servicecatalog-upload2"></a>

1. Open a text editor. Create a sample template by pasting the following into the file. Save the file as `S3_template.json`.

   ```
   {
     "AWSTemplateFormatVersion": "2010-09-09",
     "Description": "CloudFormation Sample Template S3_Bucket: Sample template showing how to create a privately accessible S3 bucket. **WARNING** This template creates an S3 bucket. You will be billed for the resources used if you create a stack from this template.",
     "Resources": {
       "S3Bucket": {
         "Type": "AWS::S3::Bucket",
         "Properties": {}
       }
     },
     "Outputs": {
       "BucketName": {
         "Value": {
           "Ref": "S3Bucket"
         },
         "Description": "Name of Amazon S3 bucket to hold website content"
       }
     }
   }
   ```

   This template allows AWS CloudFormation to create an S3 bucket that can be used by Service Catalog.

1. Upload the `S3_template.json` file to your AWS CodeCommit repository.

### Step 2: Create your product deployment configuration file
<a name="tutorials-S3-servicecatalog-configure2"></a>

1. Open a text editor. Create the configuration file for your product. The configuration file defines your Service Catalog deployment parameters and preferences. You use this file when you create your pipeline.

   This sample provides a `ProductVersionName` of `devops S3 v2` and a `ProductVersionDescription` of `MyProductVersionDescription`. If you want to have the template change deployed to a new product version, enter a product version name that has not been used for any previous product version in the same product.

    Save the file as `sample_config.json`.

   ```
   {
       "SchemaVersion": "1.0",
       "ProductVersionName": "devops S3 v2",
       "ProductVersionDescription": "MyProductVersionDescription",
       "ProductType": "CLOUD_FORMATION_TEMPLATE",
       "Properties": {
           "TemplateFilePath": "/S3_template.json"
       }
   }
   ```

   This file creates the product version information for you each time your pipeline runs.

1. Upload the `sample_config.json` file to your AWS CodeCommit repository. Make sure you upload this file to your source repository.
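Before you push `sample_config.json`, you can sanity-check it locally. This sketch only verifies the keys used in the sample above; the required-field list is an illustrative assumption, not the full Service Catalog configuration schema:

```python
import json

# Assumption for illustration: the keys shown in the sample configuration above.
REQUIRED_KEYS = {"SchemaVersion", "ProductVersionName", "ProductType", "Properties"}

def validate_config(text: str) -> dict:
    """Parse the deployment configuration and check for the sample's keys."""
    config = json.loads(text)
    missing = REQUIRED_KEYS - config.keys()
    if missing:
        raise ValueError(f"missing keys: {sorted(missing)}")
    if "TemplateFilePath" not in config.get("Properties", {}):
        raise ValueError("Properties.TemplateFilePath is required")
    return config

sample = """{
    "SchemaVersion": "1.0",
    "ProductVersionName": "devops S3 v2",
    "ProductVersionDescription": "MyProductVersionDescription",
    "ProductType": "CLOUD_FORMATION_TEMPLATE",
    "Properties": {"TemplateFilePath": "/S3_template.json"}
}"""
config = validate_config(sample)
print(config["ProductVersionName"])  # devops S3 v2
```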

### Step 3: Create a product in Service Catalog
<a name="tutorials-S3-servicecatalog-product2"></a>

1. As an IT administrator, sign in to the Service Catalog console, go to the **Products** page, and then choose **Upload new product**.

1. On the **Upload new product** page, complete the following:

   1. In **Product name**, enter the name you want to use for your new product.

   1. In **Description**, enter the product catalog description. This description appears in the product listing to help the user choose the correct product. 

   1. In **Provided by**, enter the name of your IT department or administrator.

   1. Choose **Next**.

1. (Optional) In **Enter support details**, enter product support contact information, and then choose **Next**.

1. In **Version details**, complete the following:

   1. Choose **Upload a template file**. Browse for your `S3_template.json` file and upload it.

   1. In **Version title**, enter the name of the product version (for example, "devops S3 v2").

   1. In **Description**, enter details that distinguish this version from other versions.

   1. Choose **Next**.

1. On the **Review** page, verify that the information is correct, and then choose **Confirm and upload**. 

1. On the **Products** page, in the browser, copy the URL of your new product. The URL contains the product ID. Copy and retain the product ID; you use it when you create your pipeline in CodePipeline.

   Here is the URL for a product named `my-product`. To extract the product ID, copy the value between the equals sign (`=`) and the ampersand (`&`). In this example, the product ID is `prod-example123456`. 

   ```
   https://<region-URL>/servicecatalog/home?region=<region>#/admin-products?productCreated=prod-example123456&createdProductTitle=my-product
   ```
**Note**  
Copy the URL for your product before you navigate away from the page. Once you navigate away from this page, you must use the CLI to obtain your product ID.

   After a few seconds, your product appears on the **Products** page. You might need to refresh your browser to see the product in the list.

### Step 4: Create your pipeline
<a name="tutorials-S3-servicecatalog-pipeline2"></a>

1. To name your pipeline and select parameters for your pipeline, do the following:

   1. Sign in to the AWS Management Console and open the CodePipeline console at [https://console.aws.amazon.com/codepipeline/](https://console.aws.amazon.com/codepipeline/).

   1. Choose **Getting started**. Choose **Create pipeline**, and then enter a name for your pipeline.

   1. In **Service role**, choose **New service role** to allow CodePipeline to create a service role in IAM.

   1. Leave the settings under **Advanced settings** at their defaults, and then choose **Next**.

1. To add a source stage, do the following:

   1. In **Source provider**, choose **AWS CodeCommit**.

   1. In **Repository name** and **Branch name**, enter the repository and branch you want to use for your source action.

   1. Choose **Next**.

1. In **Add build stage**, choose **Skip build stage**, and then accept the warning message by choosing **Skip** again.

1. In **Add deploy stage**, complete the following:

   1. In **Deploy provider**, choose **AWS Service Catalog**.

   1. Choose **Use configuration file**.

   1. In **Product ID**, paste the product ID you copied from the Service Catalog console.

   1. In **Configuration file path**, enter the file path of the configuration file in your repository.

   1. Choose **Next**.

1. In **Review**, review your pipeline settings, and then choose **Create**.

1. After your pipeline runs successfully, on your deployment stage, choose **Details** to open your product in Service Catalog.  
![\[\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/deploy-servicecatalog-pipeline.png)

1. Under your product information, choose your version name to open the product template. View the template deployment.

### Step 5: Push a change and verify your product in Service Catalog
<a name="tutorials-S3-servicecatalog-change2"></a>

1. View your pipeline in the CodePipeline console, and on the source stage, choose **Details**. Your source AWS CodeCommit repository opens in the console. Choose **Edit**, and then make a change in the file (for example, to the description).

   ```
   "Description": "Name of Amazon S3 bucket to hold and version website content"
   ```

1. Commit and push your change. Your pipeline starts after you push the change. When the run of the pipeline is complete, on the deployment stage, choose **Details** to open your product in Service Catalog.

1. Under your product information, choose the new version name to open the product template. View the deployed template change.

# Tutorial: Create a pipeline with AWS CloudFormation
<a name="tutorials-cloudformation"></a>

The examples provide sample templates that allow you to use AWS CloudFormation to create a pipeline that deploys your application to your instances each time the source code changes. The sample template creates a pipeline that you can view in AWS CodePipeline. The pipeline detects the arrival of a saved change through Amazon CloudWatch Events.

**Important**  
As part of creating a pipeline, an S3 artifact bucket provided by the customer is used by CodePipeline for artifacts. (This is different from the bucket used for an S3 source action.) If the S3 artifact bucket is in a different account from the account for your pipeline, make sure that the bucket is owned by an AWS account that you trust.

**Topics**
+ [

# Example 1: Create an AWS CodeCommit pipeline with AWS CloudFormation
](tutorials-cloudformation-codecommit.md)
+ [

# Example 2: Create an Amazon S3 pipeline with AWS CloudFormation
](tutorials-cloudformation-s3.md)

# Example 1: Create an AWS CodeCommit pipeline with AWS CloudFormation
<a name="tutorials-cloudformation-codecommit"></a>

This walkthrough shows you how to use the AWS CloudFormation console to create infrastructure that includes a pipeline connected to a CodeCommit source repository. In this tutorial, you use the provided sample template file to create your resource stack, which includes your artifact store, pipeline, and change-detection resources, such as your Amazon CloudWatch Events rule. After you create your resource stack in AWS CloudFormation, you can view your pipeline in the AWS CodePipeline console. The pipeline is a two-stage pipeline with a CodeCommit source stage and a CodeDeploy deployment stage.

**Prerequisites:**

You must have created the following resources to use with the AWS CloudFormation sample template:
+ You must have created a source repository. You can use the AWS CodeCommit repository you created in [Tutorial: Create a simple pipeline (CodeCommit repository)](tutorials-simple-codecommit.md).
+ You must have created a CodeDeploy application and deployment group. You can use the CodeDeploy resources you created in [Tutorial: Create a simple pipeline (CodeCommit repository)](tutorials-simple-codecommit.md).
+ Choose one of these links to download the sample AWS CloudFormation template file for creating a pipeline: [YAML](samples/codepipeline-codecommit-events-yaml.zip) | [JSON](samples/codepipeline-codecommit-events-json.zip)

  Unzip the file and place it on your local computer.
+ Download the [SampleApp_Linux.zip](samples/SampleApp_Linux.zip) sample application file.



**Create your pipeline in AWS CloudFormation**

1. Unzip the files from [SampleApp_Linux.zip](samples/SampleApp_Linux.zip) and upload the files to your AWS CodeCommit repository. You must upload the unzipped files to the root directory of your repository. You can follow the instructions in [Step 2: Add sample code to your CodeCommit repository](tutorials-simple-codecommit.md#codecommit-add-code) to push the files to your repository.

1. Open the AWS CloudFormation console and choose **Create Stack**. Choose **With new resources (standard)**.

1. Under **Specify template**, choose **Upload a template**. Select **Choose file** and then choose the template file from your local computer. Choose **Next**.

1. In **Stack name**, enter a name for your pipeline. Parameters specified by the sample template are displayed. Enter the following parameters: 

   1. In **ApplicationName**, enter the name of your CodeDeploy application.

   1. In **BetaFleet**, enter the name of your CodeDeploy deployment group.

   1. In **BranchName**, enter the repository branch you want to use.

   1. In **RepositoryName**, enter the name of your CodeCommit source repository.

1. Choose **Next**. Accept the defaults on the following page, and then choose **Next**.

1. In **Capabilities**, select **I acknowledge that AWS CloudFormation might create IAM resources**, and then choose **Create stack**.

1. After your stack creation is complete, view the event list to check for any errors.

   **Troubleshooting**

   The IAM user who is creating the pipeline in AWS CloudFormation might require additional permissions to create resources for the pipeline. The following permissions are required in the policy to allow AWS CloudFormation to create the required Amazon CloudWatch Events resources for the CodeCommit pipeline:

   ```
   {
        "Effect": "Allow",
        "Action": [
           "events:PutRule",
           "events:PutEvents",
           "events:PutTargets",
           "events:DeleteRule",
           "events:RemoveTargets",
           "events:DescribeRule"
        ],
        "Resource": "resource_ARN"
   }
   ```

1. Sign in to the AWS Management Console and open the CodePipeline console at [https://console.aws.amazon.com/codepipeline/](https://console.aws.amazon.com/codepipeline/).

   Under **Pipelines**, choose your pipeline and choose **View**. The diagram shows your pipeline source and deployment stages.
**Note**  
To view the pipeline that was created, find the **Logical ID** column under the **Resources** tab for your stack in CloudFormation. Note the name in the **Physical ID** column for the pipeline. In CodePipeline, you can view the pipeline with the same Physical ID (pipeline name) in the Region where you created your stack.

1. In your source repository, commit and push a change. Your change-detection resources pick up the change, and your pipeline starts.

# Example 2: Create an Amazon S3 pipeline with AWS CloudFormation
<a name="tutorials-cloudformation-s3"></a>

This walkthrough shows you how to use the AWS CloudFormation console to create infrastructure that includes a pipeline connected to an Amazon S3 source bucket. In this tutorial, you use the provided sample template file to create your resource stack, which includes your source bucket, artifact store, pipeline, and change-detection resources, such as your Amazon CloudWatch Events rule and CloudTrail trail. After you create your resource stack in AWS CloudFormation, you can view your pipeline in the AWS CodePipeline console. The pipeline is a two-stage pipeline with an Amazon S3 source stage and a CodeDeploy deployment stage.

**Prerequisites:**

You must have the following resources to use with the AWS CloudFormation sample template:
+ You must have created the Amazon EC2 instances, where you installed the CodeDeploy agent on the instances. You must have created a CodeDeploy application and deployment group. Use the Amazon EC2 and CodeDeploy resources you created in [Tutorial: Create a simple pipeline (CodeCommit repository)](tutorials-simple-codecommit.md).
+ Choose the following links to download the sample AWS CloudFormation template files for creating a pipeline with an Amazon S3 source: 
  + Download the sample template for your pipeline: [YAML](samples/codepipeline-s3-events-yaml.zip) | [JSON](samples/codepipeline-s3-events-json.zip)
  + Download the sample template for your CloudTrail bucket and trail: [YAML](samples/codepipeline-s3-cloudtrail-yaml.zip) | [JSON](samples/codepipeline-s3-cloudtrail-json.zip)
  + Unzip the files and place them on your local computer.
+ Download the sample application from [SampleApp_Linux.zip](samples/SampleApp_Linux.zip).

  Save the .zip file on your local computer. You upload the .zip file after the stack is created.

**Create your pipeline in AWS CloudFormation**

1. Open the AWS CloudFormation console, and choose **Create Stack**. Choose **With new resources (standard)**.

1. In **Choose a template**, choose **Upload a template**. Select **Choose file**, and then choose the template file from your local computer. Choose **Next**.

1. In **Stack name**, enter a name for your pipeline. Parameters specified by the sample template are displayed. Enter the following parameters: 

   1. In **ApplicationName**, enter the name of your CodeDeploy application. You can replace the `DemoApplication` default name.

   1. In **BetaFleet**, enter the name of your CodeDeploy deployment group. You can replace the `DemoFleet` default name.

   1. In **SourceObjectKey**, enter `SampleApp_Linux.zip`. You upload this file to your bucket after the template creates the bucket and pipeline.

1. Choose **Next**. Accept the defaults on the following page, and then choose **Next**.

1. In **Capabilities**, select **I acknowledge that AWS CloudFormation might create IAM resources**, and then choose **Create stack**.

1. After your stack creation is complete, view the event list to check for any errors.

   **Troubleshooting**

   The IAM user who is creating the pipeline in AWS CloudFormation might require additional permissions to create resources for the pipeline. The following permissions are required in the policy to allow AWS CloudFormation to create the required Amazon CloudWatch Events resources for the Amazon S3 pipeline:

   ```
   {
        "Effect": "Allow",
        "Action": [
           "events:PutRule",
           "events:PutEvents",
           "events:PutTargets",
           "events:DeleteRule",
           "events:RemoveTargets",
           "events:DescribeRule"
        ],
        "Resource": "resource_ARN"
   }
   ```

1. In CloudFormation, in the **Resources** tab for your stack, view the resources that were created for your stack. 
**Note**  
To view the pipeline that was created, find the **Logical ID** column under the **Resources** tab for your stack in CloudFormation. Note the name in the **Physical ID** column for the pipeline. In CodePipeline, you can view the pipeline with the same Physical ID (pipeline name) in the Region where you created your stack.

   Choose the S3 bucket with a `sourcebucket` label in the name, such as `s3-cfn-codepipeline-sourcebucket-y04EXAMPLE`. Do not choose the pipeline artifact bucket.

   The source bucket is empty because the resource is newly created by CloudFormation. Open the Amazon S3 console and locate your `sourcebucket` bucket. Choose **Upload**, and follow the instructions to upload your `SampleApp_Linux.zip` .zip file.
**Note**  
When Amazon S3 is the source provider for your pipeline, you must upload to your bucket all source files packaged as a single .zip file. Otherwise, the source action fails.

1. Sign in to the AWS Management Console and open the CodePipeline console at [https://console.aws.amazon.com/codepipeline/](https://console.aws.amazon.com/codepipeline/).

   Under **Pipelines**, choose your pipeline, and then choose **View**. The diagram shows your pipeline source and deployment stages.

1. Complete the steps in the following procedure to create your AWS CloudTrail resources.
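As the note above says, an S3 source action requires all source files packaged in a single .zip file. A minimal local packaging sketch, assuming a hypothetical directory layout (the file names are placeholders); you would then upload the resulting archive with the console or the AWS CLI:

```python
import os
import zipfile

def package_source(src_dir: str, zip_path: str) -> list:
    """Zip every file under src_dir at the archive root, as the S3 source action requires."""
    names = []
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for root, _dirs, files in os.walk(src_dir):
            for name in files:
                full = os.path.join(root, name)
                arcname = os.path.relpath(full, src_dir)  # keep paths relative to the app root
                zf.write(full, arcname)
                names.append(arcname)
    return sorted(names)

# Hypothetical sample app: an appspec file and an index page.
os.makedirs("sample_app", exist_ok=True)
for fname in ("appspec.yml", "index.html"):
    with open(os.path.join("sample_app", fname), "w") as f:
        f.write("placeholder\n")
print(package_source("sample_app", "SampleApp_Linux.zip"))  # ['appspec.yml', 'index.html']
```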

**Create your AWS CloudTrail resources in AWS CloudFormation**

1. Open the AWS CloudFormation console, and choose **Create Stack**.

1. In **Choose a template**, choose **Upload a template to Amazon S3**. Choose **Browse**, and then select the template file for the AWS CloudTrail resources from your local computer. Choose **Next**.

1. In **Stack name**, enter a name for your resource stack. Parameters specified by the sample template are displayed. Enter the following parameters: 

   1. In **SourceObjectKey**, accept the default for the sample application's zip file.

1. Choose **Next**. Accept the defaults on the following page, and then choose **Next**.

1. In **Capabilities**, select **I acknowledge that AWS CloudFormation might create IAM resources**, and then choose **Create**.

1. After your stack creation is complete, view the event list to check for any errors.

   The following permissions are required in the policy to allow AWS CloudFormation to create the required CloudTrail resources for the Amazon S3 pipeline:

   ```
   {
        "Effect": "Allow",
        "Action": [
           "cloudtrail:CreateTrail",
           "cloudtrail:DeleteTrail",
           "cloudtrail:StartLogging",
           "cloudtrail:StopLogging",
           "cloudtrail:PutEventSelectors"
        ],
        "Resource": "resource_ARN"
   }
   ```

1. Sign in to the AWS Management Console and open the CodePipeline console at [https://console.aws.amazon.com/codepipeline/](https://console.aws.amazon.com/codepipeline/).

   Under **Pipelines**, choose your pipeline, and then choose **View**. The diagram shows your pipeline source and deployment stages.

1. In your source bucket, commit and push a change. Your change-detection resources pick up the change and your pipeline starts.

# Tutorial: Create a pipeline that uses variables from AWS CloudFormation deployment actions
<a name="tutorials-cloudformation-action"></a>

In this tutorial, you use the AWS CodePipeline console to create a pipeline with a deployment action. When the pipeline runs, the template creates a stack and also creates an `outputs` file. Outputs generated by the stack template are the variables generated by the AWS CloudFormation action in CodePipeline.

In the action where you create the stack from the template, you designate a variable namespace. The variables produced by the `outputs` file can then be consumed by subsequent actions. In this example, you create a change set based on the `StackName` variable produced by the AWS CloudFormation action. After a manual approval, you execute the change set and then create a delete stack action that deletes the stack based on the `StackName` variable.

**Important**  
As part of creating a pipeline, an S3 artifact bucket provided by the customer is used by CodePipeline for artifacts. (This is different from the bucket used for an S3 source action.) If the S3 artifact bucket is in a different account from the account for your pipeline, make sure that the bucket is owned by an AWS account that you trust.

**Topics**
+ [

## Prerequisites: Create an AWS CloudFormation service role and a CodeCommit repository
](#tutorials-cloudformation-action-prereq)
+ [

## Step 1: Download, edit, and upload the sample AWS CloudFormation template
](#tutorials-cloudformation-action-upload)
+ [

## Step 2: Create your pipeline
](#tutorials-cloudformation-action-pipeline)
+ [

## Step 3: Add a CloudFormation deployment action to create the change set
](#tutorials-cloudformation-action-changeset)
+ [

## Step 4: Add a manual approval action
](#tutorials-cloudformation-action-approval)
+ [

## Step 5: Add a CloudFormation deployment action to execute the change set
](#tutorials-cloudformation-action-deployment)
+ [

## Step 6: Add a CloudFormation deployment action to delete the stack
](#tutorials-cloudformation-action-delete)

## Prerequisites: Create an AWS CloudFormation service role and a CodeCommit repository
<a name="tutorials-cloudformation-action-prereq"></a>

You must already have the following:
+ A CodeCommit repository. You can use the AWS CodeCommit repository you created in [Tutorial: Create a simple pipeline (CodeCommit repository)](tutorials-simple-codecommit.md).
+ This example creates an Amazon DocumentDB stack from a template. You must use AWS Identity and Access Management (IAM) to create an AWS CloudFormation service role with the following permissions for Amazon DocumentDB.

  ```
  "rds:DescribeDBClusters",
  "rds:CreateDBCluster",
  "rds:DeleteDBCluster",
  "rds:CreateDBInstance"
  ```
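In a service role policy, those permissions sit in a standard policy statement. A sketch, following the format of the other policy snippets in this guide (scope `resource_ARN` to your own cluster resources):

```
{
     "Effect": "Allow",
     "Action": [
        "rds:DescribeDBClusters",
        "rds:CreateDBCluster",
        "rds:DeleteDBCluster",
        "rds:CreateDBInstance"
     ],
     "Resource": "resource_ARN"
}
```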

## Step 1: Download, edit, and upload the sample AWS CloudFormation template
<a name="tutorials-cloudformation-action-upload"></a>

Download the sample AWS CloudFormation template file and upload it to your CodeCommit repository.

1. Navigate to the sample template for your Region. For example, use the table at [https://docs.aws.amazon.com/documentdb/latest/developerguide/quick_start_cfn.html#quick_start_cfn-launch_stack](https://docs.aws.amazon.com/documentdb/latest/developerguide/quick_start_cfn.html#quick_start_cfn-launch_stack) to choose the Region and download the template. Download the template for an Amazon DocumentDB Cluster. The file name is `documentdb_full_stack.yaml`.

1. Open the `documentdb_full_stack.yaml` file in a text editor. Make the following changes.

   1. For this example, add the following `Purpose:` parameter to your `Parameters` section in the template.

      ```
        Purpose:
          Type: String
          Default: testing
          AllowedValues:
            - testing
            - production
          Description: The purpose of this instance.
      ```

   1. For this example, add the following `StackName` output to your `Outputs:` section in the template.

      ```
        StackName:
          Value: !Ref AWS::StackName
      ```

1. Upload the template file to your AWS CodeCommit repository. You must upload the edited template file to the root directory of your repository. 

   To use the CodeCommit console to upload your files: 

   1. Open the CodeCommit console, and choose your repository from the **Repositories** list.

   1. Choose **Add file**, and then choose **Upload file**. 

   1. Select **Choose file**, and then browse for your file. Commit the change by entering your user name and email address. Choose **Commit changes**.

   Your file should look like this at the root level in your repository:

   ```
   documentdb_full_stack.yaml
   ```

## Step 2: Create your pipeline
<a name="tutorials-cloudformation-action-pipeline"></a>

In this section, you create a pipeline with the following actions:
+ A source stage with a CodeCommit action where the source artifact is your template file.
+ A deployment stage with a CloudFormation deployment action.

Each action in the source and deployment stages created by the wizard is assigned a variable namespace, `SourceVariables` and `DeployVariables`, respectively. Because the actions have a namespace assigned, the variables configured in this example are available to downstream actions. For more information, see [Variables reference](reference-variables.md).
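Downstream actions consume these variables with the `#{namespace.variable}` syntax. For example, a later action's **Stack name** field can reference the stack name produced by the deployment action:

```
#{DeployVariables.StackName}
```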

**To create a pipeline with the wizard**

1. Sign in to the AWS Management Console and open the CodePipeline console at [http://console.aws.amazon.com/codesuite/codepipeline/home](http://console.aws.amazon.com/codesuite/codepipeline/home).

1. On the **Welcome** page, **Getting started** page, or the **Pipelines** page, choose **Create pipeline**.

1. On the **Step 1: Choose creation option** page, under **Creation options**, choose the **Build custom pipeline** option. Choose **Next**.

1. In **Step 2: Choose pipeline settings**, in **Pipeline name**, enter **MyCFNDeployPipeline**.

1. CodePipeline provides V1 and V2 type pipelines, which differ in characteristics and price. The V2 type is the only type you can choose in the console. For more information, see [pipeline types](https://docs.aws.amazon.com/codepipeline/latest/userguide/pipeline-types-planning.html?icmpid=docs_acp_help_panel). For information about pricing for CodePipeline, see [Pricing](https://aws.amazon.com/codepipeline/pricing/).

1. In **Service role**, do one of the following:
   + Choose **New service role** to allow CodePipeline to create a service role in IAM.
   + Choose **Existing service role**. In **Role name**, choose your service role from the list.

1. In **Artifact store**: 

   1. Choose **Default location** to use the default artifact store, such as the Amazon S3 artifact bucket designated as the default, for your pipeline in the Region you selected for your pipeline.

   1. Choose **Custom location** if you already have an artifact store, such as an Amazon S3 artifact bucket, in the same Region as your pipeline.
**Note**  
This is not the source bucket for your source code. This is the artifact store for your pipeline. A separate artifact store, such as an S3 bucket, is required for each pipeline. When you create or edit a pipeline, you must have an artifact bucket in the pipeline Region and one artifact bucket per AWS Region where you are running an action.  
For more information, see [Input and output artifacts](welcome-introducing-artifacts.md) and [CodePipeline pipeline structure reference](reference-pipeline-structure.md).

   Choose **Next**.

1. In **Step 3: Add source stage**: 

   1. In **Source provider**, choose **AWS CodeCommit**.

   1. In **Repository name**, choose the name of the CodeCommit repository that you created in [Step 1: Create a CodeCommit repository](tutorials-simple-codecommit.md#codecommit-create-repository).

   1. In **Branch name**, choose the name of the branch that contains your latest code update.

   After you select the repository name and branch, the Amazon CloudWatch Events rule to be created for this pipeline is displayed. 

   Choose **Next**.

1. In **Step 4: Add build stage**, choose **Skip build stage**, and then accept the warning message by choosing **Skip** again. 

   Choose **Next**.

1. In **Step 5: Add test stage**, choose **Skip test stage**, and then accept the warning message by choosing **Skip** again. 

   Choose **Next**.

1. In **Step 6: Add deploy stage**:

   1. In **Action name**, enter **Deploy**. In **Deploy provider**, choose **CloudFormation**.

   1. In **Action mode**, choose **Create or update a stack**.

   1. In **Stack name**, enter a name for the stack. This is the name of the stack that the template will create.

   1. In **Output file name**, enter a name for the outputs file, such as **outputs**. This is the name of the file that will be created by the action after the stack is created.

   1. Expand **Advanced**. Under **Parameter overrides**, enter your template overrides as key-value pairs. For example, this template requires the following overrides.

      ```
      {
        "DBClusterName": "MyDBCluster",
        "DBInstanceName": "MyDBInstance",
        "MasterUser": "UserName",
        "MasterPassword": "Password",
        "DBInstanceClass": "db.r4.large",
        "Purpose": "testing"
      }
      ```

      If you don't enter overrides, the template creates a stack with default values.

   1. Choose **Next**.

   1. On **Step 7: Review**, choose **Create pipeline**. You should see a diagram that shows the pipeline stages. Allow your pipeline to run. Your two-stage pipeline is complete and ready for the additional stages to be added.
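For reference, the deploy action that the wizard creates corresponds to an action declaration in the pipeline structure similar to the following sketch. The stack name, template path, and role ARN are placeholders; the configuration field names follow the CloudFormation action reference.

```
{
  "name": "Deploy",
  "actionTypeId": {
    "category": "Deploy",
    "owner": "AWS",
    "provider": "CloudFormation",
    "version": "1"
  },
  "configuration": {
    "ActionMode": "CREATE_UPDATE",
    "StackName": "my-stack",
    "TemplatePath": "SourceArtifact::template.yaml",
    "OutputFileName": "outputs",
    "RoleArn": "role_ARN"
  },
  "inputArtifacts": [
    { "name": "SourceArtifact" }
  ],
  "namespace": "DeployVariables"
}
```

The `namespace` value is what makes the `#{DeployVariables.StackName}` variable syntax available to later actions in this tutorial.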

## Step 3: Add a CloudFormation deployment action to create the change set
<a name="tutorials-cloudformation-action-changeset"></a>

Create the next action in your pipeline, which allows CloudFormation to create the change set before the manual approval action.



1. Open the CodePipeline console at [https://console.aws.amazon.com/codepipeline/](https://console.aws.amazon.com/codepipeline/).

   Under **Pipelines**, choose your pipeline and choose **View**. The diagram shows your pipeline source and deployment stages.

1. Choose to edit the pipeline, or continue to display the pipeline in **Edit** mode.

1. Choose to edit the **Deploy** stage.

1. Add a deployment action that will create a change set for the stack that was created in the previous action. You add this action after the existing action in the stage.

   1. In **Action name**, enter **ChangeSet**. In **Action provider**, choose **AWS CloudFormation**.

   1. In **Input artifact**, choose **SourceArtifact**.

   1. In **Action mode**, choose **Create or replace a change set**.

   1. In **Stack name**, enter the variable syntax as shown. This is the name of the stack that the change set is created for, where the default namespace `DeployVariables` is assigned to the action.

      ```
      #{DeployVariables.StackName}
      ```

   1. In **Change set name**, enter the name of the change set.

      ```
      my-changeset
      ```

   1. In **Parameter Overrides**, change the `Purpose` parameter from `testing` to `production`.

      ```
      {
        "DBClusterName": "MyDBCluster",
        "DBInstanceName": "MyDBInstance",
        "MasterUser": "UserName",
        "MasterPassword": "Password",
        "DBInstanceClass": "db.r4.large",
        "Purpose": "production"
      }
      ```

   1. Choose **Done** to save the action.
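In the pipeline structure, the resulting change set action looks something like the following sketch (the action name is assumed from this tutorial; the configuration field names follow the CloudFormation action reference):

```
{
  "name": "ChangeSet",
  "actionTypeId": {
    "category": "Deploy",
    "owner": "AWS",
    "provider": "CloudFormation",
    "version": "1"
  },
  "configuration": {
    "ActionMode": "CHANGE_SET_REPLACE",
    "StackName": "#{DeployVariables.StackName}",
    "ChangeSetName": "my-changeset",
    "ParameterOverrides": "{\"Purpose\": \"production\"}"
  },
  "inputArtifacts": [
    { "name": "SourceArtifact" }
  ]
}
```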

## Step 4: Add a manual approval action
<a name="tutorials-cloudformation-action-approval"></a>

Create a manual approval action in your pipeline.



1. Choose to edit the pipeline, or continue to display the pipeline in **Edit** mode.

1. Choose to edit the **Deploy** stage.

1. Add a manual approval action after the deploy action that creates the change set. This action allows you to verify the created resource change set in CloudFormation before the pipeline executes the change set.
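You can also respond to the approval from the AWS CLI with the **put-approval-result** command. The pipeline, stage, and action names below are assumed from this tutorial, and `your-token-value` is a placeholder for the token shown in the pending approval's state.

```
# Find the token for the pending approval action.
aws codepipeline get-pipeline-state --name MyCFNDeployPipeline

# Approve, passing the token from the pending approval's state.
aws codepipeline put-approval-result \
  --pipeline-name MyCFNDeployPipeline \
  --stage-name Deploy \
  --action-name Approval \
  --result "summary=Change set reviewed in CloudFormation,status=Approved" \
  --token your-token-value
```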

## Step 5: Add a CloudFormation deployment action to execute the change set
<a name="tutorials-cloudformation-action-deployment"></a>

Create the next action in your pipeline, which allows CloudFormation to execute the change set after the manual approval action.



1. Open the CodePipeline console at [https://console.aws.amazon.com/codepipeline/](https://console.aws.amazon.com/codepipeline/).

   Under **Pipelines**, choose your pipeline and choose **View**. The diagram shows your pipeline source and deployment stages.

1. Choose to edit the pipeline, or continue to display the pipeline in **Edit** mode.

1. Choose to edit the **Deploy** stage.

1. Add a deployment action that will execute the change set that was approved in the previous manual action:

   1. In **Action name**, enter **ExecuteChangeSet**. In **Action provider**, choose **AWS CloudFormation**.

   1. In **Input artifact**, choose **SourceArtifact**.

   1. In **Action mode**, choose **Execute a change set**.

   1. In **Stack name**, enter the variable syntax as shown. This is the name of the stack that the change set is created for.

      ```
      #{DeployVariables.StackName}
      ```

   1. In **Change set name**, enter the name of the change set you created in the previous action.

      ```
      my-changeset
      ```

   1. Choose **Done** to save the action.

   1. Continue the pipeline run.

## Step 6: Add a CloudFormation deployment action to delete the stack
<a name="tutorials-cloudformation-action-delete"></a>

Create a final action in your pipeline that allows CloudFormation to get the stack name from the variable in the outputs file and delete the stack.



1. Open the CodePipeline console at [https://console.aws.amazon.com/codepipeline/](https://console.aws.amazon.com/codepipeline/).

   Under **Pipelines**, choose your pipeline and choose **View**. The diagram shows your pipeline source and deployment stages.

1. Choose to edit the pipeline.

1. Choose to edit the **Deploy** stage.

1. Add a deployment action that will delete the stack:

   1. In **Action name**, enter **DeleteStack**. In **Deploy provider**, choose **CloudFormation**.

   1. In **Action mode**, choose **Delete a stack**.

   1. In **Stack name**, enter the variable syntax **#{DeployVariables.StackName}**. This is the name of the stack that the action will delete.

   1. Choose **Done** to save the action.

   1. Choose **Save** to save the pipeline.

   The pipeline runs when it is saved.
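In the pipeline structure, the delete action configuration looks something like the following sketch (the action name is assumed from this tutorial; `DELETE_ONLY` is the action mode that corresponds to **Delete a stack**):

```
{
  "name": "DeleteStack",
  "actionTypeId": {
    "category": "Deploy",
    "owner": "AWS",
    "provider": "CloudFormation",
    "version": "1"
  },
  "configuration": {
    "ActionMode": "DELETE_ONLY",
    "StackName": "#{DeployVariables.StackName}"
  }
}
```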

# Tutorial: Amazon ECS Standard Deployment with CodePipeline
<a name="ecs-cd-pipeline"></a>

This tutorial helps you create a complete, end-to-end continuous deployment (CD) pipeline to Amazon ECS using CodePipeline.

**Important**  
As part of creating a pipeline in the console, CodePipeline uses an S3 artifact bucket for artifacts. (This is different from the bucket used for an S3 source action.) If the S3 artifact bucket is in a different account from your pipeline's account, make sure that the bucket is owned by an AWS account that you trust.

**Note**  
This tutorial is for the Amazon ECS standard deployment action for CodePipeline. For a tutorial that uses the Amazon ECS to CodeDeploy blue/green deployment action in CodePipeline, see [Tutorial: Create a pipeline with an Amazon ECR source and ECS-to-CodeDeploy deployment](tutorials-ecs-ecr-codedeploy.md).

**Note**  
This tutorial is for the Amazon ECS standard deployment action for CodePipeline with a source action. For a tutorial that uses the Amazon ECS standard deployment action along with the ECRBuildAndPublish build action in CodePipeline to push your image, see [Tutorial: Build and push a Docker image to Amazon ECR with CodePipeline (V2 type)](tutorials-ecr-build-publish.md).

## Prerequisites
<a name="ecs-cd-prereqs"></a>

You must have a few resources in place before you can use this tutorial to create your CD pipeline. Here is what you need to get started:

**Note**  
All of these resources should be created within the same AWS Region.
+ A source control repository (this tutorial uses CodeCommit) with your Dockerfile and application source. For more information, see [Create a CodeCommit Repository](https://docs.aws.amazon.com/codecommit/latest/userguide/how-to-create-repository.html) in the *AWS CodeCommit User Guide*.
+ A Docker image repository (this tutorial uses Amazon ECR) that contains an image you have built from your Dockerfile and application source. For more information, see [Creating a Repository](https://docs.aws.amazon.com/AmazonECR/latest/userguide/repository-create.html) and [Pushing an Image](https://docs.aws.amazon.com/AmazonECR/latest/userguide/docker-push-ecr-image.html) in the *Amazon Elastic Container Registry User Guide*.
+ An Amazon ECS task definition that references the Docker image hosted in your image repository. For more information, see [Creating a Task Definition](https://docs.aws.amazon.com/AmazonECS/latest/developerguide/create-task-definition.html) in the *Amazon Elastic Container Service Developer Guide*.
**Important**  
The Amazon ECS standard deployment action for CodePipeline creates its own revision of the task definition based on the revision used by the Amazon ECS service. If you create new revisions for the task definition without updating the Amazon ECS service, the deployment action will ignore those revisions.

  Below is a sample task definition used for this tutorial. The values you use for `name` and `family` are used in the next step for your build specification file.

  ```
  {
    "ipcMode": null,
    "executionRoleArn": "role_ARN",
    "containerDefinitions": [
      {
        "dnsSearchDomains": null,
        "environmentFiles": null,
        "logConfiguration": {
          "logDriver": "awslogs",
          "secretOptions": null,
          "options": {
            "awslogs-group": "/ecs/hello-world",
            "awslogs-region": "us-west-2",
            "awslogs-stream-prefix": "ecs"
          }
        },
        "entryPoint": null,
        "portMappings": [
          {
            "hostPort": 80,
            "protocol": "tcp",
            "containerPort": 80
          }
        ],
        "command": null,
        "linuxParameters": null,
        "cpu": 0,
        "environment": [],
        "resourceRequirements": null,
        "ulimits": null,
        "dnsServers": null,
        "mountPoints": [],
        "workingDirectory": null,
        "secrets": null,
        "dockerSecurityOptions": null,
        "memory": null,
        "memoryReservation": 128,
        "volumesFrom": [],
        "stopTimeout": null,
        "image": "image_name",
        "startTimeout": null,
        "firelensConfiguration": null,
        "dependsOn": null,
        "disableNetworking": null,
        "interactive": null,
        "healthCheck": null,
        "essential": true,
        "links": null,
        "hostname": null,
        "extraHosts": null,
        "pseudoTerminal": null,
        "user": null,
        "readonlyRootFilesystem": null,
        "dockerLabels": null,
        "systemControls": null,
        "privileged": null,
        "name": "hello-world"
      }
    ],
    "placementConstraints": [],
    "memory": "2048",
    "taskRoleArn": null,
    "compatibilities": [
      "EC2",
      "FARGATE"
    ],
    "taskDefinitionArn": "ARN",
    "family": "hello-world",
    "requiresAttributes": [],
    "pidMode": null,
    "requiresCompatibilities": [
      "FARGATE"
    ],
    "networkMode": "awsvpc",
    "cpu": "1024",
    "revision": 1,
    "status": "ACTIVE",
    "inferenceAccelerators": null,
    "proxyConfiguration": null,
    "volumes": []
  }
  ```
+ An Amazon ECS cluster that is running a service that uses your previously mentioned task definition. For more information, see [Creating a Cluster](https://docs.aws.amazon.com/AmazonECS/latest/developerguide/clusters.html) and [Creating a Service](https://docs.aws.amazon.com/AmazonECS/latest/developerguide/create-service-console-v2.html) in the *Amazon Elastic Container Service Developer Guide*.

After you have satisfied these prerequisites, you can proceed with the tutorial and create your CD pipeline.

## Step 1: Add a Build Specification File to Your Source Repository
<a name="cd-buildspec"></a>

This tutorial uses CodeBuild to build your Docker image and push the image to Amazon ECR. Add a `buildspec.yml` file to your source code repository to tell CodeBuild how to do that. The example build specification below does the following:
+ Pre-build stage:
  + Log in to Amazon ECR.
  + Set the repository URI to your ECR image and add an image tag with the first seven characters of the Git commit ID of the source.
+ Build stage:
  + Build the Docker image and tag the image both as `latest` and with the Git commit ID.
+ Post-build stage:
  + Push the image to your ECR repository with both tags.
  + Write a file called `imagedefinitions.json` in the build root that has your Amazon ECS service's container name and the image and tag. The deployment stage of your CD pipeline uses this information to create a new revision of your service's task definition, and then it updates the service to use the new task definition. The `imagedefinitions.json` file is required for the ECS job worker.
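For example, based on the `printf` command in the build specification, the generated `imagedefinitions.json` file contains a single-element array like the following (the repository URI and commit-based tag shown here are illustrative):

```
[
  {
    "name": "hello-world",
    "imageUri": "111122223333.dkr.ecr.us-west-2.amazonaws.com/hello-world:0a1b2c3"
  }
]
```

The `name` value must match the container name in your Amazon ECS service's task definition.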

Paste this sample text to create your `buildspec.yml` file, and replace the values for your image and task definition. This text uses the example account ID 111122223333.

```
version: 0.2

phases:
  pre_build:
    commands:
      - echo Logging in to Amazon ECR...
      - aws --version
      - aws ecr get-login-password --region $AWS_DEFAULT_REGION | docker login --username AWS --password-stdin 111122223333.dkr.ecr.us-west-2.amazonaws.com
      - REPOSITORY_URI=111122223333.dkr.ecr.us-west-2.amazonaws.com/hello-world
      - COMMIT_HASH=$(echo $CODEBUILD_RESOLVED_SOURCE_VERSION | cut -c 1-7)
      - IMAGE_TAG=${COMMIT_HASH:=latest}
  build:
    commands:
      - echo Build started on `date`
      - echo Building the Docker image...
      - docker build -t $REPOSITORY_URI:latest .
      - docker tag $REPOSITORY_URI:latest $REPOSITORY_URI:$IMAGE_TAG
  post_build:
    commands:
      - echo Build completed on `date`
      - echo Pushing the Docker images...
      - docker push $REPOSITORY_URI:latest
      - docker push $REPOSITORY_URI:$IMAGE_TAG
      - echo Writing image definitions file...
      - printf '[{"name":"hello-world","imageUri":"%s"}]' $REPOSITORY_URI:$IMAGE_TAG > imagedefinitions.json
artifacts:
    files: imagedefinitions.json
```

The build specification was written for the sample task definition that was provided in [Prerequisites](#ecs-cd-prereqs), used by the Amazon ECS service for this tutorial. The `REPOSITORY_URI` value corresponds to the `image` repository (without any image tag), and the `hello-world` value near the end of the file corresponds to the container name in the service's task definition. 
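You can dry-run the tagging and image definitions logic from the build specification locally before committing it. The commit ID and repository URI below are stand-ins for the values CodeBuild normally provides during a build:

```
# Stand-ins for values CodeBuild sets during a real build.
CODEBUILD_RESOLVED_SOURCE_VERSION=0a1b2c3d4e5f60718293a4b5c6d7e8f901234567
REPOSITORY_URI=111122223333.dkr.ecr.us-west-2.amazonaws.com/hello-world

# Same commands as the pre_build and post_build phases above.
COMMIT_HASH=$(echo $CODEBUILD_RESOLVED_SOURCE_VERSION | cut -c 1-7)
IMAGE_TAG=${COMMIT_HASH:=latest}
printf '[{"name":"hello-world","imageUri":"%s"}]' $REPOSITORY_URI:$IMAGE_TAG > imagedefinitions.json
cat imagedefinitions.json
```

Confirm that the image tag is the first seven characters of the commit ID and that the file references your container name before you rely on the buildspec in the pipeline.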

**To add a `buildspec.yml` file to your source repository**

1. Open a text editor and then copy and paste the build specification above into a new file.

1. Replace the `REPOSITORY_URI` value with the URI of your Amazon ECR repository (without any image tag) for your Docker image. Replace `hello-world` with the container name in your service's task definition that references your Docker image.

1. Commit and push your `buildspec.yml` file to your source repository.

   1. Add the file.

      ```
      git add .
      ```

   1. Commit the change.

      ```
      git commit -m "Adding build specification."
      ```

   1. Push the commit.

      ```
      git push
      ```

## Step 2: Creating Your Continuous Deployment Pipeline
<a name="pipeline-wizard"></a>

Use the CodePipeline wizard to create your pipeline stages and connect your source repository to your ECS service.

**To create your pipeline**

1. Open the CodePipeline console at [https://console.aws.amazon.com/codepipeline/](https://console.aws.amazon.com/codepipeline/).

1. On the **Welcome** page, choose **Create pipeline**. 

   If this is your first time using CodePipeline, an introductory page appears instead of **Welcome**. Choose **Get Started Now**.

1. On the **Step 1: Choose creation option** page, under **Creation options**, choose the **Build custom pipeline** option. Choose **Next**.

1. In **Step 2: Choose pipeline settings**, in **Pipeline name**, type the name for your pipeline. For this tutorial, the pipeline name is **hello-world**.

1. In **Pipeline type**, keep the default selection at **V2**. Pipeline types differ in characteristics and price. For more information, see [Pipeline types](pipeline-types.md). Choose **Next**.

1. On the **Step 3: Add source stage** page, for **Source provider**, choose **AWS CodeCommit**.

   1. For **Repository name**, choose the name of the CodeCommit repository to use as the source location for your pipeline.

   1. For **Branch name**, choose the branch to use and choose **Next**.

1. On the **Step 4: Add build stage** page, for **Build provider** choose **AWS CodeBuild**, and then choose **Create project**.

   1. For **Project name**, choose a unique name for your build project. For this tutorial, the project name is **hello-world**.

   1. For **Environment image**, choose **Managed image**.

   1. For **Operating system**, choose **Amazon Linux 2**.

   1. For **Runtime(s)**, choose **Standard**.

   1. For **Image**, choose **`aws/codebuild/amazonlinux2-x86_64-standard:3.0`**.

   1. For **Image version** and **Environment type**, use the default values.

   1. Select **Enable this flag if you want to build Docker images or want your builds to get elevated privileges**.

   1. Deselect **CloudWatch logs**. You might need to expand **Advanced**.

   1. Choose **Continue to CodePipeline**.

   1. Choose **Next**.
**Note**  
The wizard creates a CodeBuild service role for your build project, called **codebuild-*build-project-name*-service-role**. Note this role name, as you add Amazon ECR permissions to it later.

1. On the **Step 5: Add deploy stage** page, for **Deployment provider**, choose **Amazon ECS**.

   1. For **Cluster name**, choose the Amazon ECS cluster in which your service is running. For this tutorial, the cluster is **default**.

   1. For **Service name**, choose the service to update and choose **Next**. For this tutorial, the service name is **hello-world**.

1. On the **Step 6: Review** page, review your pipeline configuration and choose **Create pipeline** to create the pipeline.
**Note**  
Now that the pipeline has been created, it attempts to run through the different pipeline stages. However, the default CodeBuild role created by the wizard does not have permissions to execute all of the commands contained in the `buildspec.yml` file, so the build stage fails. The next section adds the permissions for the build stage.

## Step 3: Add Amazon ECR Permissions to the CodeBuild Role
<a name="code-build-perms"></a>

The CodePipeline wizard created an IAM role for the CodeBuild build project, called **codebuild-*build-project-name*-service-role**. For this tutorial, the name is **codebuild-hello-world-service-role**. Because the `buildspec.yml` file makes calls to Amazon ECR API operations, the role must have a policy that allows permissions to make these Amazon ECR calls. The following procedure helps you attach the proper permissions to the role.

**To add Amazon ECR permissions to the CodeBuild role**

1. Open the IAM console at [https://console.aws.amazon.com/iam/](https://console.aws.amazon.com/iam/).

1. In the left navigation pane, choose **Roles**.

1. In the search box, type **codebuild-** and choose the role that was created by the CodePipeline wizard. For this tutorial, the role name is **codebuild-hello-world-service-role**.

1. On the **Summary** page, choose **Attach policies**.

1. Select the box to the left of the **AmazonEC2ContainerRegistryPowerUser** policy, and choose **Attach policy**.
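If you prefer the AWS CLI, you can attach the same managed policy with a single command. The role name below assumes this tutorial's project name; substitute the role name the wizard created for your project.

```
aws iam attach-role-policy \
  --role-name codebuild-hello-world-service-role \
  --policy-arn arn:aws:iam::aws:policy/AmazonEC2ContainerRegistryPowerUser
```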

## Step 4: Test Your Pipeline
<a name="commit-change"></a>

Your pipeline now has everything it needs to run an end-to-end native AWS continuous deployment. Test its functionality by pushing a code change to your source repository.

**To test your pipeline**

1. Make a code change to your configured source repository, commit, and push the change.

1. Open the CodePipeline console at [https://console.aws.amazon.com/codepipeline/](https://console.aws.amazon.com/codepipeline/).

1. Choose your pipeline from the list.

1. Watch the pipeline progress through its stages. Your pipeline should complete and your Amazon ECS service runs the Docker image that was created from your code change.

# Tutorial: Create a pipeline with an Amazon ECR source and ECS-to-CodeDeploy deployment
<a name="tutorials-ecs-ecr-codedeploy"></a>

In this tutorial, you configure a pipeline in AWS CodePipeline that deploys container applications using a blue/green deployment that supports Docker images. In a blue/green deployment, you can launch the new version of your application alongside the old version and test the new version before you reroute traffic. You can also monitor the deployment process and rapidly roll back if there is an issue.

**Important**  
As part of creating a pipeline, CodePipeline uses a customer-provided S3 artifact bucket for artifacts. (This is different from the bucket used for an S3 source action.) If the S3 artifact bucket is in a different account from your pipeline's account, make sure that the bucket is owned by an AWS account that you trust.

**Note**  
This tutorial is for the Amazon ECS to CodeDeploy blue/green deployment action for CodePipeline. For a tutorial that uses the Amazon ECS standard deployment action in CodePipeline, see [Tutorial: Amazon ECS Standard Deployment with CodePipeline](ecs-cd-pipeline.md).

The completed pipeline detects changes to your image, which is stored in an image repository such as Amazon ECR, and uses CodeDeploy to route and deploy traffic to an Amazon ECS cluster and load balancer. CodeDeploy uses a listener to reroute traffic to the port of the updated container specified in the AppSpec file. For information about how the load balancer, production listener, target groups, and your Amazon ECS application are used in a blue/green deployment, see [Tutorial: Deploy an Amazon ECS Service](https://docs.aws.amazon.com/codedeploy/latest/userguide/tutorial-ecs-deployment.html).

The pipeline is also configured to use a source location, such as CodeCommit, where your Amazon ECS task definition is stored. In this tutorial, you configure each of these AWS resources and then create your pipeline with stages that contain actions for each resource.

Your continuous delivery pipeline will automatically build and deploy container images whenever source code is changed or a new base image is uploaded to Amazon ECR.

This flow uses the following artifacts:
+ A Docker image file that specifies the container name and repository URI of your Amazon ECR image repository.
+ An Amazon ECS task definition that lists your Docker image name, container name, Amazon ECS service name, and load balancer configuration.
+ A CodeDeploy AppSpec file that specifies the name of the Amazon ECS task definition file, the name of the updated application's container, and the container port where CodeDeploy reroutes production traffic. It can also specify optional network configuration and Lambda functions you can run during deployment lifecycle event hooks.

**Note**  
When you commit a change to your Amazon ECR image repository, the pipeline source action creates an `imageDetail.json` file for that commit. For information about the `imageDetail.json` file, see [imageDetail.json file for Amazon ECS blue/green deployment actions](file-reference.md#file-reference-ecs-bluegreen).

When you create or edit your pipeline and update or specify source artifacts for your deployment stage, make sure to point to the source artifacts with the latest name and version you want to use. After you set up your pipeline, as you make changes to your image or task definition, you might need to update your source artifact files in your repositories and then edit the deployment stage in your pipeline.

**Topics**
+ [

## Prerequisites
](#tutorials-ecs-ecr-codedeploy-prereq)
+ [

## Step 1: Create image and push to an Amazon ECR repository
](#tutorials-ecs-ecr-codedeploy-imagerepository)
+ [

## Step 2: Create task definition and AppSpec source files and push to a CodeCommit repository
](#tutorials-ecs-ecr-codedeploy-taskdefinition)
+ [

## Step 3: Create your Application Load Balancer and target groups
](#tutorials-ecs-ecr-codedeploy-loadbal)
+ [

## Step 4: Create your Amazon ECS cluster and service
](#tutorials-ecs-ecr-codedeploy-cluster)
+ [

## Step 5: Create your CodeDeploy application and deployment group (ECS compute platform)
](#tutorials-ecs-ecr-codedeploy-deployment)
+ [

## Step 6: Create your pipeline
](#tutorials-ecs-ecr-codedeploy-pipeline)
+ [

## Step 7: Make a change to your pipeline and verify deployment
](#tutorials-ecs-ecr-codedeploy-update)

## Prerequisites
<a name="tutorials-ecs-ecr-codedeploy-prereq"></a>

You must have already created the following resources:
+ A CodeCommit repository. You can use the AWS CodeCommit repository you created in [Tutorial: Create a simple pipeline (CodeCommit repository)](tutorials-simple-codecommit.md).
+ An Amazon EC2 Linux instance with Docker installed, which you use to create an image as shown in this tutorial. If you already have an image you want to use, you can skip this prerequisite.

## Step 1: Create image and push to an Amazon ECR repository
<a name="tutorials-ecs-ecr-codedeploy-imagerepository"></a>

In this section, you use Docker to create an image and then use the AWS CLI to create an Amazon ECR repository and push the image to the repository.

**Note**  
If you already have an image you want to use, you can skip this step.

**To create an image**

1. Sign in to your Linux instance where you have Docker installed.

   Pull down an image for `nginx`. This command provides the `nginx:latest` image:

   ```
   docker pull nginx
   ```

1. Run **docker images**. You should see the image in the list.

   ```
   docker images
   ```

**To create an Amazon ECR repository and push your image**

1. Create an Amazon ECR repository to store your image. Make a note of the `repositoryUri` in the output.

   ```
   aws ecr create-repository --repository-name nginx
   ```

   Output:

   ```
   {
       "repository": {
           "registryId": "aws_account_id",
           "repositoryName": "nginx",
           "repositoryArn": "arn:aws:ecr:us-east-1:aws_account_id:repository/nginx",
           "createdAt": 1505337806.0,
           "repositoryUri": "aws_account_id.dkr.ecr.us-east-1.amazonaws.com/nginx"
       }
   }
   ```

1. Tag the image with the `repositoryUri` value from the previous step.

   ```
   docker tag nginx:latest aws_account_id.dkr.ecr.us-east-1.amazonaws.com/nginx:latest
   ```

1. Run the **aws ecr get-login-password** command to authenticate Docker with your registry, as shown in this example for the `us-east-1` Region.

   ```
   aws ecr get-login-password --region us-east-1 | docker login --username AWS --password-stdin aws_account_id.dkr.ecr.us-east-1.amazonaws.com
   ```

1. Push the image to Amazon ECR using the `repositoryUri` from the earlier step.

   ```
   docker push aws_account_id.dkr.ecr.us-east-1.amazonaws.com/nginx:latest
   ```

## Step 2: Create task definition and AppSpec source files and push to a CodeCommit repository
<a name="tutorials-ecs-ecr-codedeploy-taskdefinition"></a>

In this section, you create a task definition JSON file and register it with Amazon ECS. You then create an AppSpec file for CodeDeploy and use your Git client to push the files to your CodeCommit repository.

**To create a task definition for your image**

1. Create a file named `taskdef.json` with the following contents. For `image`, enter your image name, such as nginx. This value is updated when your pipeline runs.
**Note**  
Make sure that the execution role specified in the task definition contains the `AmazonECSTaskExecutionRolePolicy`. For more information, see [Amazon ECS Task Execution IAM Role](https://docs.aws.amazon.com/AmazonECS/latest/developerguide/task_execution_IAM_role.html) in the *Amazon ECS Developer Guide*.

   ```
   {
       "executionRoleArn": "arn:aws:iam::account_ID:role/ecsTaskExecutionRole",
       "containerDefinitions": [
           {
               "name": "sample-website",
               "image": "nginx",
               "essential": true,
               "portMappings": [
                   {
                       "hostPort": 80,
                       "protocol": "tcp",
                       "containerPort": 80
                   }
               ]
           }
       ],
       "requiresCompatibilities": [
           "FARGATE"
       ],
       "networkMode": "awsvpc",
       "cpu": "256",
       "memory": "512",
       "family": "ecs-demo"
   }
   ```

1. Register your task definition with the `taskdef.json` file.

   ```
   aws ecs register-task-definition --cli-input-json file://taskdef.json
   ```

1. After the task definition is registered, edit your file to replace the image name with the `<IMAGE1_NAME>` placeholder text in the image field.

   ```
   {
       "executionRoleArn": "arn:aws:iam::account_ID:role/ecsTaskExecutionRole",
       "containerDefinitions": [
           {
               "name": "sample-website",
               "image": "<IMAGE1_NAME>",
               "essential": true,
               "portMappings": [
                   {
                       "hostPort": 80,
                       "protocol": "tcp",
                       "containerPort": 80
                   }
               ]
           }
       ],
       "requiresCompatibilities": [
           "FARGATE"
       ],
       "networkMode": "awsvpc",
       "cpu": "256",
       "memory": "512",
       "family": "ecs-demo"
   }
   ```

**To create an AppSpec file**
+ The AppSpec file is used for CodeDeploy deployments. The file, which includes optional fields, uses this format:

  ```
  version: 0.0
  Resources:
    - TargetService:
        Type: AWS::ECS::Service
        Properties:
          TaskDefinition: "task-definition-ARN"
          LoadBalancerInfo:
            ContainerName: "container-name"
            ContainerPort: container-port-number
  # Optional properties
          PlatformVersion: "LATEST"
          NetworkConfiguration:
              AwsvpcConfiguration:
                Subnets: ["subnet-name-1", "subnet-name-2"]
                SecurityGroups: ["security-group"]
                AssignPublicIp: "ENABLED"
  Hooks:
  - BeforeInstall: "BeforeInstallHookFunctionName"
  - AfterInstall: "AfterInstallHookFunctionName"
  - AfterAllowTestTraffic: "AfterAllowTestTrafficHookFunctionName"
  - BeforeAllowTraffic: "BeforeAllowTrafficHookFunctionName"
  - AfterAllowTraffic: "AfterAllowTrafficHookFunctionName"
  ```

  For more information about the AppSpec file, including examples, see [CodeDeploy AppSpec File Reference](https://docs.aws.amazon.com/codedeploy/latest/userguide/reference-appspec-file.html).

  Create a file named `appspec.yaml` with the following contents. For `TaskDefinition`, do not change the `<TASK_DEFINITION>` placeholder text. This value is updated when your pipeline runs.

  ```
  version: 0.0
  Resources:
    - TargetService:
        Type: AWS::ECS::Service
        Properties:
          TaskDefinition: <TASK_DEFINITION>
          LoadBalancerInfo:
            ContainerName: "sample-website"
            ContainerPort: 80
  ```
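
  Because the pipeline substitutes the placeholder at run time, a quick check before you commit confirms that the placeholder survived any edits. This is a small sketch (`check_appspec_placeholder` is a hypothetical helper):

  ```shell
  # check_appspec_placeholder: succeed only if the AppSpec file still
  # contains the <TASK_DEFINITION> placeholder that CodePipeline replaces
  # at deployment time.
  check_appspec_placeholder() {
    grep -q '<TASK_DEFINITION>' "$1" && echo "placeholder intact: $1"
  }
  ```

  For example, run `check_appspec_placeholder appspec.yaml` before you push your changes.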

**To push files to your CodeCommit repository**

1. Push or upload the files to your CodeCommit repository. These files are the source artifact created by the **Create pipeline** wizard for your deployment action in CodePipeline. Your files should look like this in your local directory:

   ```
   /tmp
     |my-demo-repo
       |-- appspec.yaml
       |-- taskdef.json
   ```
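
   Before you push, you can confirm that both files are in place with a sketch like the following (`check_source_files` is a hypothetical helper; pass it the path to your local repository):

   ```shell
   # check_source_files: verify that the deployment source files exist in
   # the given directory before committing them to the repository.
   check_source_files() {
     for f in appspec.yaml taskdef.json; do
       [ -f "$1/$f" ] || { echo "missing: $f"; return 1; }
     done
     echo "all source files present"
   }
   ```

   For example, `check_source_files /tmp/my-demo-repo` prints a message naming any missing file.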

1. Choose the method you want to use to upload your files:

   1. To use your git command line from a cloned repository on your local computer:

      1. Change directories to your local repository:

         ```
         (For Linux, macOS, or Unix) cd /tmp/my-demo-repo
         (For Windows) cd c:\temp\my-demo-repo
         ```

      1. Run the following command to stage all of your files at once:

         ```
         git add -A
         ```

      1. Run the following command to commit the files with a commit message:

         ```
         git commit -m "Added task definition files"
         ```

      1. Run the following command to push the files from your local repo to your CodeCommit repository:

         ```
         git push
         ```

   1. To use the CodeCommit console to upload your files:

      1. Open the CodeCommit console, and choose your repository from the **Repositories** list.

      1. Choose **Add file**, and then choose **Upload file**.

      1. Choose **Choose file**, and then browse for your file. Commit the change by entering your user name and email address. Choose **Commit changes**.

      1. Repeat this step for each file you want to upload.

## Step 3: Create your Application Load Balancer and target groups
<a name="tutorials-ecs-ecr-codedeploy-loadbal"></a>

In this section, you create an Amazon EC2 Application Load Balancer. You use the subnet names and target group values you create with your load balancer later, when you create your Amazon ECS service. You can create an Application Load Balancer or a Network Load Balancer. The load balancer must use a VPC with two public subnets in different Availability Zones. In these steps, you confirm your default VPC, create a load balancer, and then create two target groups for your load balancer. For more information, see [Target Groups for Your Network Load Balancers](https://docs.aws.amazon.com/elasticloadbalancing/latest/network/load-balancer-target-groups.html).

**To verify your default VPC and public subnets**

1. Sign in to the AWS Management Console and open the Amazon VPC console at [https://console.aws.amazon.com/vpc/](https://console.aws.amazon.com/vpc/).

1. Verify the default VPC to use. In the navigation pane, choose **Your VPCs**. Note which VPC shows **Yes** in the **Default VPC** column. This is the default VPC. It contains default subnets for you to select.

1. Choose **Subnets**. Choose two subnets that show **Yes** in the **Default subnet** column.
**Note**  
Make a note of your subnet IDs. You need them later in this tutorial.

1. Choose the subnets, and then choose the **Description** tab. Verify that the subnets you want to use are in different Availability Zones.

1. Choose the subnets, and then choose the **Route Table** tab. To verify that each subnet you want to use is a public subnet, confirm that its route table contains a route to an internet gateway.

**To create an Amazon EC2 Application Load Balancer**

1. Sign in to the AWS Management Console and open the Amazon EC2 console at [https://console.aws.amazon.com/ec2/](https://console.aws.amazon.com/ec2/).

1. In the navigation pane, choose **Load Balancers**.

1. Choose **Create Load Balancer**.

1. Choose **Application Load Balancer**, and then choose **Create**.

1. In **Name**, enter the name of your load balancer.

1. In **Scheme**, choose **internet-facing**.

1. In **IP address type**, choose **ipv4**.

1. Configure two listener ports for your load balancer:

   1. Under **Load Balancer Protocol**, choose **HTTP**. Under **Load Balancer Port**, enter **80**.

   1. Choose **Add listener**.

   1. Under **Load Balancer Protocol** for the second listener, choose **HTTP**. Under **Load Balancer Port**, enter **8080**.

1. Under **Availability Zones**, in **VPC**, choose the default VPC. Next, choose the two default subnets you want to use.

1. Choose **Next: Configure Security Settings**.

1. Choose **Next: Configure Security Groups**.

1. Choose **Select an existing security group**, and make a note of the security group ID.

1. Choose **Next: Configure Routing**.

1. In **Target group**, choose **New target group** and configure your first target group:

   1. In **Name**, enter a target group name (for example, **target-group-1**).

   1. In **Target type**, choose **IP**.

   1. In **Protocol** choose **HTTP**. In **Port**, enter **80**.

   1. Choose **Next: Register Targets**.

1. Choose **Next: Review**, and then choose **Create**.

**To create a second target group for your load balancer**

1. After your load balancer is provisioned, open the Amazon EC2 console. In the navigation pane, choose **Target Groups**.

1. Choose **Create target group**.

1. In **Name**, enter a target group name (for example, **target-group-2**).

1. In **Target type**, choose **IP**.

1. In **Protocol** choose **HTTP**. In **Port**, enter **8080**.

1. In **VPC**, choose the default VPC.

1. Choose **Create**.
**Note**  
Your deployment requires two target groups created for your load balancer in order to run. You only need to make a note of the ARN of your first target group. This ARN is used in the `create-service` JSON file in the next step.
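
If you prefer the AWS CLI, you can look up the ARN instead of copying it from the console, for example by piping `aws elbv2 describe-target-groups` output through a small parser. This is a sketch (`tg_arn_from_json` is a hypothetical helper, and `target-group-1` is the example name used above):

```shell
# tg_arn_from_json: read `aws elbv2 describe-target-groups` output on stdin
# and print the ARN of the first target group in the result.
tg_arn_from_json() {
  python3 -c "import json, sys; print(json.load(sys.stdin)['TargetGroups'][0]['TargetGroupArn'])"
}

# Example usage:
#   aws elbv2 describe-target-groups --names target-group-1 | tg_arn_from_json
```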

**To update your load balancer to include your second target group**

1. Open the Amazon EC2 console. In the navigation pane, choose **Load Balancers**.

1. Choose your load balancer, and then choose the **Listeners** tab. Choose the listener with port 8080, and then choose **Edit**.

1. Choose the pencil icon next to **Forward to**. Choose your second target group, and then choose the check mark. Choose **Update** to save the updates.

## Step 4: Create your Amazon ECS cluster and service
<a name="tutorials-ecs-ecr-codedeploy-cluster"></a>

In this section, you create an Amazon ECS cluster and service where CodeDeploy routes traffic during deployment (to an Amazon ECS cluster rather than EC2 instances). To create your Amazon ECS service, you use the subnet names, security group, and target group value that you created with your load balancer.

**Note**  
When you use these steps to create your Amazon ECS cluster, you use the **Networking only** cluster template, which provisions AWS Fargate containers. AWS Fargate is a technology that manages your container instance infrastructure for you. You do not need to choose or manually create Amazon EC2 instances for your Amazon ECS cluster.

**To create an Amazon ECS cluster**

1. Open the Amazon ECS classic console at [https://console.aws.amazon.com/ecs/](https://console.aws.amazon.com/ecs/).

1. In the navigation pane, choose **Clusters**.

1. Choose **Create cluster**.

1. Choose the **Networking only** cluster template that uses AWS Fargate, and then choose **Next step**.

1. Enter a cluster name on the **Configure cluster** page. You can add an optional tag for your resource. Choose **Create**.

**To create an Amazon ECS service**

Use the AWS CLI to create your service in Amazon ECS.

1. Create a JSON file and name it `create-service.json`. Paste the following into the JSON file.

   For the `taskDefinition` field, when you register a task definition in Amazon ECS, you give it a family. This is similar to a name for multiple versions of the task definition, specified with a revision number. In this example, use "`ecs-demo:1`" for the family and revision number in your file. Use the subnet names, security group, and target group value you created with your load balancer in [Step 3: Create your Application Load Balancer and target groups](#tutorials-ecs-ecr-codedeploy-loadbal).
**Note**  
You need to include your target group ARN in this file. Open the Amazon EC2 console and from the navigation pane, under **LOAD BALANCING**, choose **Target Groups**. Choose your first target group. Copy your ARN from the **Description** tab.

   ```
   {
       "taskDefinition": "family:revision-number",
       "cluster": "my-cluster",
       "loadBalancers": [
           {
               "targetGroupArn": "target-group-arn",
               "containerName": "sample-website",
               "containerPort": 80
           }
       ],
       "desiredCount": 1,
       "launchType": "FARGATE",
       "schedulingStrategy": "REPLICA",
       "deploymentController": {
           "type": "CODE_DEPLOY"
       },
       "networkConfiguration": {
           "awsvpcConfiguration": {
               "subnets": [
                   "subnet-1",
                   "subnet-2"
               ],
               "securityGroups": [
                   "security-group"
               ],
               "assignPublicIp": "ENABLED"
           }
       }
   }
   ```
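
   A malformed file causes `create-service` to fail with a parsing error, so it can help to validate the JSON first. This sketch uses Python's built-in `json.tool` module (`validate_cli_json` is a hypothetical helper):

   ```shell
   # validate_cli_json: print a confirmation if the file parses as JSON;
   # otherwise json.tool prints the parse error and the function fails.
   validate_cli_json() {
     python3 -m json.tool "$1" > /dev/null && echo "$1 is valid JSON"
   }
   ```

   For example, run `validate_cli_json create-service.json` before the next step.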

1. Run the **create-service** command, specifying the JSON file:
**Important**  
Be sure to include `file://` before the file name. It is required in this command.

   This example creates a service named `my-service`.
**Note**  
If you already have a service with this name, the command returns an error.

   ```
   aws ecs create-service --service-name my-service --cli-input-json file://create-service.json
   ```

   The output returns the description fields for your service.

1. Run the **describe-services** command to verify that your service was created.

   ```
   aws ecs describe-services --cluster cluster-name --services service-name
   ```
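
   The `describe-services` output is verbose. To check just the service status, you can filter the output with a small parser, as in this sketch (`service_status_from_json` is a hypothetical helper; the CLI `--query` option or `jq` work as well):

   ```shell
   # service_status_from_json: read `aws ecs describe-services` output on
   # stdin and print the status of the first service (for example, ACTIVE).
   service_status_from_json() {
     python3 -c "import json, sys; print(json.load(sys.stdin)['services'][0]['status'])"
   }

   # Example usage:
   #   aws ecs describe-services --cluster cluster-name --services service-name | service_status_from_json
   ```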

## Step 5: Create your CodeDeploy application and deployment group (ECS compute platform)
<a name="tutorials-ecs-ecr-codedeploy-deployment"></a>

When you create a CodeDeploy application and deployment group for the Amazon ECS compute platform, the application is used during a deployment to reference the correct deployment group, target groups, listeners, and traffic rerouting behavior.

**To create a CodeDeploy application**

1. Open the CodeDeploy console and choose **Create application**.

1. In **Application name**, enter the name you want to use.

1. In **Compute platform**, choose **Amazon ECS**.

1. Choose **Create application**.

**To create a CodeDeploy deployment group**

1. On your application page's **Deployment groups** tab, choose **Create deployment group**.

1. In **Deployment group name**, enter a name that describes the deployment group.

1. In **Service role**, choose a service role that grants CodeDeploy access to Amazon ECS. To create a new service role, follow these steps:

   1. Open the IAM console at [https://console.aws.amazon.com/iam/](https://console.aws.amazon.com/iam/).

   1. From the console dashboard, choose **Roles**.

   1. Choose **Create role**.

   1. Under **Select type of trusted entity**, select **AWS service**. Under **Choose a use case**, select **CodeDeploy**. Under **Select your use case**, select **CodeDeploy - ECS**. Choose **Next: Permissions**. The `AWSCodeDeployRoleForECS` managed policy is already attached to the role.

   1. Choose **Next: Tags**, and **Next: Review**.

   1. Enter a name for the role (for example, **CodeDeployECSRole**), and then choose **Create role**.

1. In **Environment configuration**, choose your Amazon ECS cluster name and service name.

1. From **Load balancers**, choose the name of the load balancer that serves traffic to your Amazon ECS service.

1. From **Production listener port**, choose the port and protocol for the listener that serves production traffic to your Amazon ECS service. From **Test listener port**, choose the port and protocol for the test listener.

1. From **Target group 1 name** and **Target group 2 name**, choose the target groups used to route traffic during your deployment. Make sure that these are the target groups you created for your load balancer.

1. Choose **Reroute traffic immediately** to reroute traffic to your updated Amazon ECS task as soon as the deployment succeeds.

1. Choose **Create deployment group**.

## Step 6: Create your pipeline
<a name="tutorials-ecs-ecr-codedeploy-pipeline"></a>

In this section, you create a pipeline with the following actions:
+ A source stage with a CodeCommit action where the source artifacts are the task definition and the AppSpec file.
+ A source stage with an Amazon ECR source action where the source artifact is the image file.
+ A deployment stage with an Amazon ECS deploy action where the deployment runs with a CodeDeploy application and deployment group.

**To create a two-stage pipeline with the wizard**

1. Sign in to the AWS Management Console and open the CodePipeline console at [http://console.aws.amazon.com/codesuite/codepipeline/home](http://console.aws.amazon.com/codesuite/codepipeline/home).

1. On the **Welcome** page, **Getting started** page, or the **Pipelines** page, choose **Create pipeline**.

1. On the **Step 1: Choose creation option** page, under **Creation options**, choose the **Build custom pipeline** option. Choose **Next**.

1. In **Step 2: Choose pipeline settings**, in **Pipeline name**, enter **MyImagePipeline**.

1. CodePipeline provides V1 and V2 type pipelines, which differ in characteristics and price. The V2 type is the only type you can choose in the console. For more information, see [pipeline types](https://docs.aws.amazon.com/codepipeline/latest/userguide/pipeline-types-planning.html?icmpid=docs_acp_help_panel). For information about pricing for CodePipeline, see [Pricing](https://aws.amazon.com/codepipeline/pricing/).

1. In **Service role**, choose **New service role** to allow CodePipeline to create a service role in IAM.

1. Leave the settings under **Advanced settings** at their defaults, and then choose **Next**.

1. In **Step 3: Add source stage**, in **Source provider**, choose **AWS CodeCommit**. In **Repository name**, choose the name of the CodeCommit repository you created in [Step 1: Create a CodeCommit repository](tutorials-simple-codecommit.md#codecommit-create-repository). In **Branch name**, choose the name of the branch that contains your latest code update.

   Choose **Next**.

1. In **Step 4: Add build stage**, choose **Skip build stage**, and then accept the warning message by choosing **Skip** again. Choose **Next**.

1. In **Step 5: Add test stage**, choose **Skip test stage**, and then accept the warning message by choosing **Skip** again. 

   Choose **Next**.

1. In **Step 6: Add deploy stage**:

   1. In **Deploy provider**, choose **Amazon ECS (Blue/Green)**. In **Application name**, enter or choose the application name from the list, such as `codedeployapp`. In **Deployment group**, enter or choose the deployment group name from the list, such as `codedeploydeplgroup`.

       
**Note**  
The name "Deploy" is the name given by default to the stage created in the **Step 6: Add deploy stage** step, just as "Source" is the name given to the first stage of the pipeline.

   1. Under **Amazon ECS task definition**, choose **SourceArtifact**. In the field, enter **taskdef.json**.

   1. Under **AWS CodeDeploy AppSpec file**, choose **SourceArtifact**. In the field, enter **appspec.yaml**.
**Note**  
At this point, do not fill in any information under **Dynamically update task definition image**.

   1. Choose **Next**.

1. In **Step 7: Review**, review the information, and then choose **Create pipeline**.

**To add an Amazon ECR source action to your pipeline**

View your pipeline and add an Amazon ECR source action to your pipeline.

1. Choose your pipeline. In the upper left, choose **Edit**.

1. In the source stage, choose **Edit stage**.

1. Add a parallel action by choosing **+ Add action** next to your CodeCommit source action.

1. In **Action name**, enter a name (for example, **Image**).

1. In **Action provider**, choose **Amazon ECR**.  
![\[\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/ECR-source-action.png)

1. In **Repository name**, choose the name of your Amazon ECR repository.

1. In **Image tag**, specify the image name and version, if different from latest.

1. In **Output artifacts**, keep the default output artifact name (for example, `MyImage`). This artifact contains the image name and repository URI information that you want the next stage to use.

1. Choose **Save** on the action screen. Choose **Done** on the stage screen. Choose **Save** on the pipeline. A message shows the Amazon CloudWatch Events rule to be created for the Amazon ECR source action.

**To wire your source artifacts to the deploy action**

1. Choose **Edit** on your Deploy stage and choose the icon to edit the **Amazon ECS (Blue/Green)** action.

1. Scroll to the bottom of the pane. In **Input artifacts**, choose **Add**. Add the source artifact from your new Amazon ECR repository (for example, `MyImage`).

1. In **Task Definition**, choose **SourceArtifact**, and then verify **taskdef.json** is entered.

1. In **AWS CodeDeploy AppSpec File**, choose **SourceArtifact**, and then verify **appspec.yaml** is entered.

1. In **Dynamically update task definition image**, in **Input Artifact with Image URI**, choose **MyImage**, and then enter the placeholder text that is used in the `taskdef.json` file: **IMAGE1_NAME**. Choose **Save**.

1. In the AWS CodePipeline pane, choose **Save pipeline change**, and then choose **Save change**. View your updated pipeline.

   After this example pipeline is created, the action configuration for the console entries appears in the pipeline structure as follows:

   ```
   "configuration": {
     "AppSpecTemplateArtifact": "SourceArtifact",
     "AppSpecTemplatePath": "appspec.yaml",
     "TaskDefinitionTemplateArtifact": "SourceArtifact",
     "TaskDefinitionTemplatePath": "taskdef.json",
     "ApplicationName": "codedeployapp",
     "DeploymentGroupName": "codedeploydeplgroup",
     "Image1ArtifactName": "MyImage",
     "Image1ContainerName": "IMAGE1_NAME"
   },
   ```

1. To submit your changes and start a pipeline build, choose **Release change**, and then choose **Release**.

1. Choose the deployment action to view it in CodeDeploy and see the progress of the traffic shifting.
**Note**  
You might see a deployment step that shows an optional wait time. By default, CodeDeploy waits one hour after a successful deployment before it terminates the original task set. You can use this time to roll back or terminate the task, but your deployment otherwise completes when the task set is terminated.

## Step 7: Make a change to your pipeline and verify deployment
<a name="tutorials-ecs-ecr-codedeploy-update"></a>

Make a change to your image and then push the change to your Amazon ECR repository. This triggers your pipeline to run. Verify that your image source change is deployed.

# Tutorial: Create a pipeline that deploys an Amazon Alexa skill
<a name="tutorials-alexa-skills-kit"></a>

In this tutorial, you configure a pipeline that continuously delivers your Alexa skill using the Alexa Skills Kit as the deployment provider in your deployment stage. The completed pipeline detects changes to your skill when you make a change to the source files in your source repository. The pipeline then uses the Alexa Skills Kit to deploy to the Alexa skill development stage.

**Important**  
As part of creating a pipeline, an S3 artifact bucket provided by the customer will be used by CodePipeline for artifacts. (This is different from the bucket used for an S3 source action.) If the S3 artifact bucket is in a different account from the account for your pipeline, make sure that the S3 artifact bucket is owned by an AWS account that you trust.

**Note**  
This feature is not available in the Asia Pacific (Hong Kong) or Europe (Milan) Region. To use other deploy actions available in that Region, see [Deploy action integrations](integrations-action-type.md#integrations-deploy).

To create your custom skill as a Lambda function, see [Host a Custom Skill as an AWS Lambda Function](https://developer.amazon.com/docs/custom-skills/host-a-custom-skill-as-an-aws-lambda-function.html). You can also create a pipeline that uses Lambda source files and a CodeBuild project to deploy changes to Lambda for your skill.

## Prerequisites
<a name="tutorials-alexa-skills-kit-prereq"></a>

You must already have the following:
+ A CodeCommit repository. You can use the AWS CodeCommit repository you created in [Tutorial: Create a simple pipeline (CodeCommit repository)](tutorials-simple-codecommit.md).
+ An Amazon developer account. This is the account that owns your Alexa skills. You can create an account for free at [Alexa Skills Kit](https://developer.amazon.com/alexa-skills-kit). 
+ An Alexa skill. You can create a sample skill using the [Get Custom Skill Sample Code](https://developer.amazon.com/docs/custom-skills/use-the-alexa-skills-kit-samples.html) tutorial.
+ The ASK CLI, installed and configured with your AWS credentials by using `ask init`. See [Install and initialize ASK CLI](https://developer.amazon.com/docs/smapi/quick-start-alexa-skills-kit-command-line-interface.html#install-initialize).

## Step 1: Create an Alexa developer services LWA security profile
<a name="tutorials-alexa-skills-kit-profile"></a>

In this section, you create a security profile to use with Login with Amazon (LWA). If you already have a profile, you can skip this step.
+ Use the steps in [generate-lwa-tokens](https://developer.amazon.com/docs/smapi/ask-cli-command-reference.html#generate-lwa-tokens) to create a Security Profile.
+ After you create the profile, make a note of the **Client ID** and **Client Secret**.
+ Make sure you enter the **Allowed Return URLs** as provided in the instructions. The URLs allow the ASK CLI command to redirect refresh token requests.

## Step 2: Create Alexa skill source files and push to your CodeCommit repository
<a name="tutorials-alexa-skills-kit-push"></a>

In this section, you create and push your Alexa skill source files to the repository that the pipeline uses for your source stage. For the skill you have created in the Amazon developer console, you produce and push the following: 
+ A `skill.json` file.
+ An `interactionModel/custom` folder.
**Note**  
This directory structure complies with Alexa Skills Kit skill package format requirements, as outlined in [Skill package format](https://developer.amazon.com/docs/smapi/skill-package-api-reference.html#skill-package-format). If your directory structure does not use the correct skill package format, changes do not successfully deploy to the Alexa Skills Kit console.

**To create source files for your skill**

1. Retrieve the ID of the skill you created in the Alexa Skills Kit developer console. Use this command:

   ```
   ask api list-skills
   ```

   Locate your skill by name and then copy the associated ID in the `skillId` field.

1. Generate a `skill.json` file that contains your skill details. Use this command:

   ```
   ask api get-skill -s skill-ID > skill.json
   ```

1. (Optional) Create an `interactionModel/custom` folder.

   Use this command to generate the interaction model file within the folder. For *locale*, this tutorial uses `en-US`, which also appears in the file name.

   ```
   ask api get-model --skill-id skill-ID --locale locale \
       > ./interactionModel/custom/locale.json
   ```

**To push files to your CodeCommit repository**

1. Push or upload the files to your CodeCommit repository. These files are the source artifact created by the **Create Pipeline** wizard for your deployment action in AWS CodePipeline. Your files should look like this in your local directory:

   ```
   skill.json
   /interactionModel
     /custom
       |en-US.json
   ```
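
   Because an incorrect layout prevents changes from deploying, you can verify the skill package structure before pushing with a sketch like this (`check_skill_package` is a hypothetical helper; pass it the path to your local repository):

   ```shell
   # check_skill_package: verify the skill package layout that the Alexa
   # Skills Kit deploy action expects: skill.json plus an
   # interactionModel/custom folder for the locale model files.
   check_skill_package() {
     [ -f "$1/skill.json" ] || { echo "missing skill.json"; return 1; }
     [ -d "$1/interactionModel/custom" ] || { echo "missing interactionModel/custom"; return 1; }
     echo "skill package layout OK"
   }
   ```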

1. Choose the method you want to use to upload your files:

   1. To use the Git command line from a cloned repository on your local computer:

      1. Run the following command to stage all of your files at once:

         ```
         git add -A
         ```

      1. Run the following command to commit the files with a commit message:

         ```
         git commit -m "Added Alexa skill files"
         ```

      1. Run the following command to push the files from your local repo to your CodeCommit repository:

         ```
         git push
         ```

   1. To use the CodeCommit console to upload your files: 

      1. Open the CodeCommit console, and choose your repository from the **Repositories** list.

      1. Choose **Add file**, and then choose **Upload file**. 

      1. Choose **Choose file**, and then browse for your file. Commit the change by entering your user name and email address. Choose **Commit changes**.

      1. Repeat this step for each file you want to upload.

## Step 3: Use ASK CLI commands to create a refresh token
<a name="tutorials-alexa-skills-kit-token"></a>

CodePipeline uses a refresh token based on the client ID and secret in your Amazon developer account to authorize actions it performs on your behalf. In this section, you use the ASK CLI to create the token. You use these credentials when you use the **Create Pipeline** wizard.

**To create a refresh token with your Amazon developer account credentials**

1. Use the following command: 

   ```
   ask util generate-lwa-tokens
   ```

1. When prompted, enter your client ID and secret as shown in this example: 

   ```
   ? Please type in the client ID: 
   amzn1.application-client.example112233445566
   ? Please type in the client secret:
   example112233445566
   ```

1. The browser displays a sign-in page. Sign in with your Amazon developer account credentials.

1. Return to the command line screen. The access token and refresh token are generated in the output. Copy the refresh token returned in the output.

## Step 4: Create your pipeline
<a name="tutorials-alexa-skills-kit-pipeline"></a>

In this section, you create a pipeline with the following actions:
+ A source stage with a CodeCommit action where the source artifacts are the Alexa skill files that support your skill.
+ A deployment stage with an Alexa Skills Kit deploy action.

**To create a pipeline with the wizard**

1. Sign in to the AWS Management Console and open the CodePipeline console at [http://console.aws.amazon.com/codesuite/codepipeline/home](http://console.aws.amazon.com/codesuite/codepipeline/home).

1. Choose the AWS Region where you want to create the project and its resources. The Alexa skill runtime is available only in the following Regions:
   + Asia Pacific (Tokyo)
   + Europe (Ireland)
   + US East (N. Virginia)
   + US West (Oregon)

1. On the **Welcome** page, **Getting started** page, or the **Pipelines** page, choose **Create pipeline**.

1. On the **Step 1: Choose creation option** page, under **Creation options**, choose the **Build custom pipeline** option. Choose **Next**.

1. In **Step 2: Choose pipeline settings**, in **Pipeline name**, enter **MyAlexaPipeline**.

1. CodePipeline provides V1 and V2 type pipelines, which differ in characteristics and price. The V2 type is the only type you can choose in the console. For more information, see [pipeline types](https://docs.aws.amazon.com/codepipeline/latest/userguide/pipeline-types-planning.html?icmpid=docs_acp_help_panel). For information about pricing for CodePipeline, see [Pricing](https://aws.amazon.com/codepipeline/pricing/).

1. In **Service role**, choose **New service role** to allow CodePipeline to create a service role in IAM.

1. Leave the settings under **Advanced settings** at their defaults, and then choose **Next**.

1. In **Step 3: Add source stage**, in **Source provider**, choose **AWS CodeCommit**. In **Repository name**, choose the name of the CodeCommit repository you created in [Step 1: Create a CodeCommit repository](tutorials-simple-codecommit.md#codecommit-create-repository). In **Branch name**, choose the name of the branch that contains your latest code update.

   After you select the repository name and branch, a message shows the Amazon CloudWatch Events rule to be created for this pipeline. 

   Choose **Next**.

1. In **Step 4: Add build stage**, choose **Skip build stage**, and then accept the warning message by choosing **Skip** again.

   Choose **Next**.

1. In **Step 5: Add test stage**, choose **Skip test stage**, and then accept the warning message by choosing **Skip** again. 

   Choose **Next**.

1. In **Step 6: Add deploy stage**:

   1. In **Deploy provider**, choose **Alexa Skills Kit**. 

   1. In **Alexa skill ID**, enter the skill ID assigned to your skill in the Alexa Skills Kit developer console.

   1. In **Client ID**, enter the ID of the application you registered.

   1. In **Client secret**, enter the secret you chose when you registered.

   1. In **Refresh token**, enter the token you generated in step 3.  
![\[The Step 6: Deploy page for an Alexa Skills Kit action\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/alexa-deploy.png)

   1. Choose **Next**.

1. In **Step 7: Review**, review the information, and then choose **Create pipeline**.

## Step 5: Make a change to any source file and verify deployment
<a name="tutorials-alexa-skills-kit-update"></a>

Make a change to your skill and then push the change to your repository. This triggers your pipeline to run. Verify that your skill is updated in the [Alexa Skills Kit developer console](https://developer.amazon.com/alexa/console/ask).

# Tutorial: Create a pipeline that uses Amazon S3 as a deployment provider
<a name="tutorials-s3deploy"></a>

In this tutorial, you configure a pipeline that continuously delivers files using Amazon S3 as the deployment action provider in your deployment stage. The completed pipeline detects changes when you make a change to the source files in your source repository. The pipeline then uses Amazon S3 to deploy the files to your bucket. Each time you modify or add your website files in your source location, the deployment creates the website with your latest files. 

**Important**  
As part of creating a pipeline, an S3 artifact bucket provided by the customer will be used by CodePipeline for artifacts. (This is different from the bucket used for an S3 source action.) If the S3 artifact bucket is in a different account from the account for your pipeline, make sure that the S3 artifact bucket is owned by an AWS account that you trust.

**Note**  
Even if you delete files from the source repository, the S3 deploy action does not delete S3 objects corresponding to deleted files.

This tutorial provides two options:
+ Create a pipeline that deploys a static website to your S3 public bucket. This example creates a pipeline with an AWS CodeCommit source action and an Amazon S3 deployment action. See [Option 1: Deploy static website files to Amazon S3](#tutorials-s3deploy-acc).
+ Create a pipeline that compiles sample TypeScript code into JavaScript and deploys the CodeBuild output artifact to your S3 bucket for archive. This example creates a pipeline with an Amazon S3 source action, a CodeBuild build action, and an Amazon S3 deployment action. See [Option 2: Deploy built archive files to Amazon S3 from an S3 source bucket](#tutorials-s3deploy-s3source).

**Important**  
Many of the actions you add to your pipeline in this procedure involve AWS resources that you need to create before you create the pipeline. AWS resources for your source actions must always be created in the same AWS Region where you create your pipeline. For example, if you create your pipeline in the US East (Ohio) Region, your CodeCommit repository must be in the US East (Ohio) Region.   
You can add cross-region actions when you create your pipeline. AWS resources for cross-region actions must be in the same AWS Region where you plan to execute the action. For more information, see [Add a cross-Region action in CodePipeline](actions-create-cross-region.md).

## Option 1: Deploy static website files to Amazon S3
<a name="tutorials-s3deploy-acc"></a>

In this example, you download the sample static website template file, upload the files to your AWS CodeCommit repository, create your bucket, and configure it for hosting. Next, you use the AWS CodePipeline console to create your pipeline and specify an Amazon S3 deployment configuration.

### Prerequisites
<a name="tutorials-s3deploy-acc-prereq"></a>

You must already have the following:
+ A CodeCommit repository. You can use the AWS CodeCommit repository you created in [Tutorial: Create a simple pipeline (CodeCommit repository)](tutorials-simple-codecommit.md).
+ Source files for your static website. Use this link to download a [sample static website](samples/sample-website.zip). The sample-website.zip download produces the following files: 
  + An `index.html` file
  + A `main.css` file
  + A `graphic.jpg` file
+ An S3 bucket configured for website hosting. See [Hosting a static website on Amazon S3](https://docs.aws.amazon.com/AmazonS3/latest/userguide/WebsiteHosting.html). Make sure you create your bucket in the same Region as the pipeline.
**Note**  
To host a website, your bucket must have public read access, which gives everyone read access. With the exception of website hosting, you should keep the default access settings that block public access to S3 buckets.
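As a reference point, a bucket policy granting the public read access needed for website hosting typically looks like the following (`my-bucket` is a placeholder; see the S3 hosting documentation linked above for the authoritative steps):

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "PublicReadGetObject",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::my-bucket/*"
        }
    ]
}
```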

### Step 1: Push source files to your CodeCommit repository
<a name="tutorials-s3deploy-acc-push"></a>

In this section, you push your source files to the repository that the pipeline uses for your source stage.

**To push files to your CodeCommit repository**

1. Extract the downloaded sample files. Do not upload the ZIP file to your repository.

1. Push or upload the files to your CodeCommit repository. These files are the source artifact created by the **Create Pipeline** wizard for your deployment action in CodePipeline. Your files should look like this in your local directory:

   ```
   index.html
   main.css
   graphic.jpg
   ```

1. You can use Git or the CodeCommit console to upload your files:

   1. To use the Git command line from a cloned repository on your local computer:

      1. Run the following command to stage all of your files at once:

         ```
         git add -A
         ```

      1. Run the following command to commit the files with a commit message:

         ```
         git commit -m "Added static website files"
         ```

      1. Run the following command to push the files from your local repo to your CodeCommit repository:

         ```
         git push
         ```

   1. To use the CodeCommit console to upload your files: 

      1. Open the CodeCommit console, and choose your repository from the **Repositories** list.

      1. Choose **Add file**, and then choose **Upload file**. 

      1. Select **Choose file**, and then browse for your file. Commit the change by entering your user name and email address. Choose **Commit changes**.

      1. Repeat this step for each file you want to upload.

### Step 2: Create your pipeline
<a name="tutorials-s3deploy-acc-pipeline"></a>

In this section, you create a pipeline with the following actions:
+ A source stage with a CodeCommit action where the source artifacts are the files for your website.
+ A deployment stage with an Amazon S3 deployment action.

**To create a pipeline with the wizard**

1. Sign in to the AWS Management Console and open the CodePipeline console at [http://console.aws.amazon.com/codesuite/codepipeline/home](http://console.aws.amazon.com/codesuite/codepipeline/home).

1. On the **Welcome** page, **Getting started** page, or **Pipelines** page, choose **Create pipeline**.

1. On the **Step 1: Choose creation option** page, under **Creation options**, choose the **Build custom pipeline** option. Choose **Next**.

1. In **Step 2: Choose pipeline settings**, in **Pipeline name**, enter **MyS3DeployPipeline**.

1. In **Pipeline type**, choose **V2**. For more information, see [Pipeline types](pipeline-types.md). Choose **Next**.

1. In **Service role**, choose **New service role** to allow CodePipeline to create a service role in IAM.

1. Leave the settings under **Advanced settings** at their defaults, and then choose **Next**.

1. In **Step 3: Add source stage**, in **Source provider**, choose **AWS CodeCommit**. In **Repository name**, choose the name of the CodeCommit repository you created in [Step 1: Create a CodeCommit repository](tutorials-simple-codecommit.md#codecommit-create-repository). In **Branch name**, choose the name of the branch that contains your latest code update. Unless you created a different branch on your own, only `main` is available. 

   After you select the repository name and branch, the Amazon CloudWatch Events rule to be created for this pipeline is displayed. 

   Choose **Next**.

1. In **Step 4: Add build stage**, choose **Skip build stage**, and then accept the warning message by choosing **Skip** again. 

   Choose **Next**.

1. In **Step 5: Add test stage**, choose **Skip test stage**, and then accept the warning message by choosing **Skip** again. 

   Choose **Next**.

1. In **Step 6: Add deploy stage**:

   1. In **Deploy provider**, choose **Amazon S3**. 

   1. In **Bucket**, enter the name of your public bucket.

   1. Select **Extract file before deploy**.
**Note**  
The deployment fails if you do not select **Extract file before deploy**. This is because the AWS CodeCommit action in your pipeline zips source artifacts and your file is a ZIP file.

      When **Extract file before deploy** is selected, **Deployment path** is displayed. If you enter a path, the deployment creates that folder structure in Amazon S3 and extracts the files into it. For this tutorial, leave this field blank.  
![\[The Step 6: Add deploy stage page for an S3 deploy action with an AWS CodeCommit source\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/tutorial-s3deploy-stage-codecommit.png)

   1. (Optional) In **Canned ACL**, you can apply a set of predefined grants, known as a [canned ACL](https://docs.aws.amazon.com/AmazonS3/latest/userguide/acl-overview.html#canned-acl), to the uploaded artifacts. 

   1. (Optional) In **Cache control**, enter the caching parameters. You can set this to control caching behavior for requests and responses. For valid values, see the [`Cache-Control`](http://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html#sec14.9) header field for HTTP operations.

   1. Choose **Next**.

1. In **Step 7: Review**, review the information, and then choose **Create pipeline**.

1. After your pipeline runs successfully, open the Amazon S3 console and verify that your files appear in your public bucket as shown:

   ```
   index.html
   main.css
   graphic.jpg
   ```

1. Access your endpoint to test the website. Your endpoint follows this format: `http://bucket-name.s3-website-region.amazonaws.com/`.

   Example endpoint: `http://my-bucket.s3-website-us-west-2.amazonaws.com/`. 

   The sample web page appears.
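The endpoint format above can be sketched as a simple string template (the bucket name and Region are placeholders; note that some Regions use a dot instead of a dash after `s3-website`):

```
# Illustration only: construct the S3 static website endpoint for a bucket.
# The canonical format is documented in the Amazon S3 user guide.
def website_endpoint(bucket, region):
    return f"http://{bucket}.s3-website-{region}.amazonaws.com/"

print(website_endpoint("my-bucket", "us-west-2"))
```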

### Step 3: Make a change to any source file and verify deployment
<a name="tutorials-s3deploy-acc-update"></a>

Make a change to your source files and then push the change to your repository. This triggers your pipeline to run. Verify that your website is updated.

## Option 2: Deploy built archive files to Amazon S3 from an S3 source bucket
<a name="tutorials-s3deploy-s3source"></a>

In this option, the build commands in your build stage compile TypeScript code into JavaScript code, and the deployment stage deploys the output to your S3 target bucket under a separate timestamped folder. First, you create the TypeScript code and a buildspec.yml file. After you combine the source files in a ZIP file, you upload the source ZIP file to your S3 source bucket. A CodeBuild stage builds the application, and an Amazon S3 deploy action uploads the built application ZIP file to your S3 target bucket, where the compiled code is retained as an archive.

### Prerequisites
<a name="tutorials-s3deploy-s3source-prereq"></a>

You must already have the following:
+ An S3 source bucket. You can use the bucket you created in [Tutorial: Create a simple pipeline (S3 bucket)](tutorials-simple-s3.md).
+ An S3 target bucket. See [Hosting a static website on Amazon S3](https://docs.aws.amazon.com/AmazonS3/latest/userguide/WebsiteHosting.html). Make sure you create your bucket in the same AWS Region as the pipeline you want to create.
**Note**  
This example demonstrates deploying files to a private bucket. Do not enable your target bucket for website hosting or attach any policies that make the bucket public.

### Step 1: Create and upload source files to your S3 source bucket
<a name="tutorials-s3deploy-s3source-upload"></a>

In this section, you create and upload your source files to the bucket that the pipeline uses for your source stage. This section provides instructions for creating the following source files:
+ A `buildspec.yml` file, which is used for CodeBuild build projects.
+ An `index.ts` file. 

**To create a buildspec.yml file**
+ Create a file named `buildspec.yml` with the following contents. These build commands install TypeScript and use the TypeScript compiler to rewrite the code in `index.ts` to JavaScript code.

  ```
  version: 0.2
  
  phases:
    install:
      commands:
        - npm install -g typescript
    build:
      commands:
        - tsc index.ts
  artifacts:
    files:
      - index.js
  ```

**To create an index.ts file**
+ Create a file named `index.ts` with the following contents.

  ```
  interface Greeting {
      message: string;
  }
  
  class HelloGreeting implements Greeting {
      message = "Hello!";
  }
  
  function greet(greeting: Greeting) {
      console.log(greeting.message);
  }
  
  let greeting = new HelloGreeting();
  
  greet(greeting);
  ```

**To upload files to your S3 source bucket**

1. Your files should look like this in your local directory:

   ```
   buildspec.yml
   index.ts
   ```

   Zip the files and name the file `source.zip`.

1. In the Amazon S3 console, for your source bucket, choose **Upload**. Choose **Add files**, and then browse for the ZIP file you created.

1. Choose **Upload**. These files are the source artifact created by the **Create Pipeline** wizard for your deployment action in CodePipeline. Your file should look like this in your bucket:

   ```
   source.zip
   ```

### Step 2: Create your pipeline
<a name="tutorials-s3deploy-s3source-pipeline"></a>

In this section, you create a pipeline with the following actions:
+ A source stage with an Amazon S3 action where the source artifacts are the files for your downloadable application.
+ A deployment stage with an Amazon S3 deployment action.

**To create a pipeline with the wizard**

1. Sign in to the AWS Management Console and open the CodePipeline console at [http://console.aws.amazon.com/codesuite/codepipeline/home](http://console.aws.amazon.com/codesuite/codepipeline/home).

1. On the **Welcome** page, **Getting started** page, or **Pipelines** page, choose **Create pipeline**.

1. On the **Step 1: Choose creation option** page, under **Creation options**, choose the **Build custom pipeline** option. Choose **Next**.

1. In **Step 2: Choose pipeline settings**, in **Pipeline name**, enter **MyS3DeployPipeline**.

1. In **Service role**, choose **New service role** to allow CodePipeline to create a service role in IAM.

1. Leave the settings under **Advanced settings** at their defaults, and then choose **Next**.

1. In **Step 3: Add source stage**, in **Source provider**, choose **Amazon S3**. In **Bucket**, choose the name of your source bucket. In **S3 object key**, enter the name of your source ZIP file. Make sure you include the .zip file extension.

   Choose **Next**.

1. In **Step 4: Add build stage**:

   1. In **Build provider**, choose **CodeBuild**.

   1. Choose **Create build project**. On the **Create project** page:

   1. In **Project name**, enter a name for this build project.

   1. In **Environment**, choose **Managed image**. For **Operating system**, choose **Ubuntu**.

   1. For **Runtime**, choose **Standard**. For **Runtime version**, choose **aws/codebuild/standard:1.0**.

   1. In **Image version**, choose **Always use the latest image for this runtime version**.

   1. For **Service role**, choose your CodeBuild service role, or create one.

   1. For **Build specifications**, choose **Use a buildspec file**.

   1. Choose **Continue to CodePipeline**. A message is displayed if the project was created successfully.

   1. Choose **Next**.

1. In **Step 5: Add deploy stage**:

   1. In **Deploy provider**, choose **Amazon S3**. 

   1. In **Bucket**, enter the name of your S3 target bucket.

   1. Make sure that **Extract file before deploy** is cleared.

      When **Extract file before deploy** is cleared, **S3 object key** is displayed. Enter the name of the path you want to use: `js-application/{datetime}.zip`.

      This creates a `js-application` folder in Amazon S3 to which the files are extracted. In this folder, the `{datetime}` variable creates a timestamp on each output file when your pipeline runs.  
![\[The Step 5: Deploy page for an Amazon S3 deploy action with an Amazon S3 source\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/tutorial-s3deploy-stage-s3source.png)

   1. (Optional) In **Canned ACL**, you can apply a set of predefined grants, known as a [canned ACL](https://docs.aws.amazon.com/AmazonS3/latest/userguide/acl-overview.html#canned-acl), to the uploaded artifacts. 

   1. (Optional) In **Cache control**, enter the caching parameters. You can set this to control caching behavior for requests and responses. For valid values, see the [`Cache-Control`](http://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html#sec14.9) header field for HTTP operations.

   1. Choose **Next**.

1. In **Step 6: Review**, review the information, and then choose **Create pipeline**.

1. After your pipeline runs successfully, view your bucket in the Amazon S3 console. Verify that your deployed ZIP file is displayed in your target bucket under the `js-application` folder. The JavaScript file contained in the ZIP file should be `index.js`. The `index.js` file contains the following compiled JavaScript:

   ```
   var HelloGreeting = /** @class */ (function () {
       function HelloGreeting() {
           this.message = "Hello!";
       }
       return HelloGreeting;
   }());
   function greet(greeting) {
       console.log(greeting.message);
   }
   var greeting = new HelloGreeting();
   greet(greeting);
   ```
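CodePipeline resolves the `{datetime}` variable itself when the pipeline runs; the following sketch only illustrates the shape of the resulting object key, and the exact timestamp format CodePipeline uses may differ:

```
# Illustration only: show how a {datetime} placeholder in the S3 object key
# might resolve to a timestamped archive name at run time.
from datetime import datetime, timezone

def resolved_key(template, now):
    return template.replace("{datetime}", now.strftime("%Y-%m-%dT%H-%M-%SZ"))

key = resolved_key("js-application/{datetime}.zip",
                   datetime(2024, 1, 1, 12, 0, 0, tzinfo=timezone.utc))
print(key)  # js-application/2024-01-01T12-00-00Z.zip
```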

### Step 3: Make a change to any source file and verify deployment
<a name="tutorials-s3deploy-s3source-update"></a>

Make a change to your source files and then upload them to your source bucket. This triggers your pipeline to run. View your target bucket and verify that the deployed output files are available in the `js-application` folder as shown:

![\[Sample ZIP download\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/tutorial-s3deploy-pipeline-appzip.png)


# Tutorial: Create a pipeline that publishes your serverless application to the AWS Serverless Application Repository
<a name="tutorials-serverlessrepo-auto-publish"></a>

You can use AWS CodePipeline to continuously deliver your AWS SAM serverless application to the AWS Serverless Application Repository.

**Important**  
As part of creating a pipeline, an S3 artifact bucket that you provide is used by CodePipeline for artifacts. (This is different from the bucket used for an S3 source action.) If the S3 artifact bucket is in a different account from the account for your pipeline, make sure that the bucket is owned by an AWS account that you trust.

This tutorial shows how to create and configure a pipeline to build your serverless application that is hosted in GitHub and publish it to the AWS Serverless Application Repository automatically. The pipeline uses GitHub as the source provider and CodeBuild as the build provider. To publish your serverless application to the AWS Serverless Application Repository, you deploy an [application](https://serverlessrepo.aws.amazon.com/applications/arn:aws:serverlessrepo:us-east-1:077246666028:applications~aws-serverless-codepipeline-serverlessrepo-publish ) (from the AWS Serverless Application Repository) and associate the Lambda function created by that application as an Invoke action provider in your pipeline. Then you can continuously deliver application updates to the AWS Serverless Application Repository, without writing any code.

**Important**  
Many of the actions you add to your pipeline in this procedure involve AWS resources that you need to create before you create the pipeline. AWS resources for your source actions must always be created in the same AWS Region where you create your pipeline. For example, if you create your pipeline in the US East (Ohio) Region, your CodeCommit repository must be in the US East (Ohio) Region.   
You can add cross-region actions when you create your pipeline. AWS resources for cross-region actions must be in the same AWS Region where you plan to execute the action. For more information, see [Add a cross-Region action in CodePipeline](actions-create-cross-region.md).

## Before you begin
<a name="tutorials-serverlessrepo-auto-publish-prereq"></a>

In this tutorial, we assume the following. 
+ You are familiar with [AWS Serverless Application Model (AWS SAM)](https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/) and the [AWS Serverless Application Repository](https://docs.aws.amazon.com/serverlessrepo/latest/devguide/).
+ You have a serverless application hosted in GitHub that you have published to the AWS Serverless Application Repository using the AWS SAM CLI. To publish an example application to the AWS Serverless Application Repository, see [Quick Start: Publishing Applications](https://docs.aws.amazon.com/serverlessrepo/latest/devguide/serverlessrepo-quick-start.html) in the *AWS Serverless Application Repository Developer Guide*. To publish your own application to the AWS Serverless Application Repository, see [Publishing Applications Using the AWS SAM CLI](https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/serverless-sam-template-publishing-applications.html) in the *AWS Serverless Application Model Developer Guide*.

## Step 1: Create a buildspec.yml file
<a name="serverlessrepo-auto-publish-create-buildspec"></a>

Create a `buildspec.yml` file with the following contents, and add it to your serverless application's GitHub repository. Replace *template.yml* with your application's AWS SAM template and *bucketname* with the S3 bucket where your packaged application is stored.

```
version: 0.2
phases:
  install:
    runtime-versions:
      python: 3.8
  build:
    commands:
      - sam package --template-file template.yml --s3-bucket bucketname --output-template-file packaged-template.yml
artifacts:
  files:
    - packaged-template.yml
```
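Conceptually, the `sam package` command in the buildspec uploads your local code to the S3 bucket and rewrites the template to point at it. A simplified sketch of that rewrite, with hypothetical resource names (the real command also handles uploads, content hashing, and many other resource types):

```
# Conceptual sketch only: sam package replaces local CodeUri paths with the
# S3 location of the uploaded artifact.
def rewrite_code_uri(template, bucket, key):
    for resource in template["Resources"].values():
        if resource["Type"] == "AWS::Serverless::Function":
            resource["Properties"]["CodeUri"] = f"s3://{bucket}/{key}"
    return template

template = {"Resources": {"Fn": {"Type": "AWS::Serverless::Function",
                                 "Properties": {"CodeUri": "./src"}}}}
rewrite_code_uri(template, "bucketname", "abc123")
print(template["Resources"]["Fn"]["Properties"]["CodeUri"])
```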

## Step 2: Create and configure your pipeline
<a name="serverlessrepo-auto-publish-create-pipeline"></a>

Follow these steps to create your pipeline in the AWS Region where you want to publish your serverless application.

1. Sign in to the AWS Management Console and open the CodePipeline console at [https://console.aws.amazon.com/codepipeline/](https://console.aws.amazon.com/codepipeline/).

1. If necessary, switch to the AWS Region where you want to publish your serverless application.

1. On the **Welcome** page, **Getting started** page, or the **Pipelines** page, choose **Create pipeline**.

1. On the **Step 1: Choose creation option** page, under **Creation options**, choose the **Build custom pipeline** option. Choose **Next**.

1. On the **Step 2: Choose pipeline settings** page, in **Pipeline name**, enter the name for your pipeline.

1. In **Pipeline type**, choose **V2**. For more information, see [Pipeline types](pipeline-types.md). Choose **Next**.

1. In **Service role**, choose **New service role** to allow CodePipeline to create a service role in IAM.

1. Leave the settings under **Advanced settings** at their defaults, and then choose **Next**.

1. On the **Step 3: Add source stage** page, in **Source provider**, choose **GitHub**.

1. Under **Connection**, choose an existing connection or create a new one. To create or manage a connection for your GitHub source action, see [GitHub connections](connections-github.md).

1. In **Repository**, choose your GitHub source repository.

1. In **Branch**, choose your GitHub branch.

1. Leave the remaining defaults for the source action. Choose **Next**.

1. On the **Step 4: Add build stage** page, add a build stage:

   1. In **Build provider**, choose **AWS CodeBuild**. For **Region**, use the pipeline Region.

   1. Choose **Create project**.

   1. In **Project name**, enter a name for this build project.

   1. In **Environment image**, choose **Managed image**. For **Operating system**, choose **Ubuntu**.

   1. For **Runtime** and **Runtime version**, choose the runtime and version required for your serverless application.

   1. For **Service role**, choose **New service role**.

   1. For **Build specifications**, choose **Use a buildspec file**.

   1. Choose **Continue to CodePipeline**. This opens the CodePipeline console and creates a CodeBuild project that uses the `buildspec.yml` in your repository for configuration. The build project uses a service role to manage AWS service permissions. This step might take a couple of minutes.

   1. Choose **Next**.

1. In **Step 5: Add test stage**, choose **Skip test stage**, and then accept the warning message by choosing **Skip** again. 

   Choose **Next**.

1. On the **Step 6: Add deploy stage** page, choose **Skip deploy stage**, and then accept the warning message by choosing **Skip** again. Choose **Next**.

1. On **Step 7: Review**, choose **Create pipeline**. You should see a diagram that shows the stages.

1. Grant the CodeBuild service role permission to access the S3 bucket where your packaged application is stored.

   1. In the **Build** stage of your new pipeline, choose **CodeBuild**.

   1. Choose the **Build details** tab.

   1. In **Environment**, choose the CodeBuild service role to open the IAM console.

   1. Expand the selection for `CodeBuildBasePolicy`, and choose **Edit policy**.

   1. Choose **JSON**.

   1. Add a new policy statement with the following contents. The statement allows CodeBuild to put objects into the S3 bucket where your packaged application is stored. Replace *bucketname* with the name of your S3 bucket.

      ```
              {
                  "Effect": "Allow",
                  "Resource": [
                      "arn:aws:s3:::bucketname/*"
                  ],
                  "Action": [
                      "s3:PutObject"
                  ]
              }
      ```

   1. Choose **Review policy**.

   1. Choose **Save changes**.
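In the JSON editor, the new statement is appended to the policy's `Statement` array. A sketch of that edit as data (the bucket name is a placeholder):

```
# Illustration of where the new statement sits in the policy document; in
# the console, you paste the statement into the Statement array directly.
import json

policy = {"Version": "2012-10-17", "Statement": []}
policy["Statement"].append({
    "Effect": "Allow",
    "Resource": ["arn:aws:s3:::bucketname/*"],  # replace bucketname
    "Action": ["s3:PutObject"],
})
print(json.dumps(policy, indent=2))
```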

## Step 3: Deploy the publish application
<a name="serverlessrepo-auto-publish-deploy-app"></a>

Follow these steps to deploy the application that contains the Lambda function that performs the publish to the AWS Serverless Application Repository. This application is **aws-serverless-codepipeline-serverlessrepo-publish**.

**Note**  
You must deploy the application to the same AWS Region as your pipeline.

1. Go to the [application](https://serverlessrepo.aws.amazon.com/applications/arn:aws:serverlessrepo:us-east-1:077246666028:applications~aws-serverless-codepipeline-serverlessrepo-publish ) page, and choose **Deploy**.

1. Select **I acknowledge that this app creates custom IAM roles**. 

1. Choose **Deploy**.

1. Choose **View CloudFormation Stack** to open the CloudFormation console.

1. Expand the **Resources** section. You see **ServerlessRepoPublish**, which is of the type **AWS::Lambda::Function**. Make a note of the physical ID of this resource for the next step. You use this physical ID when you create the new publish action in CodePipeline.

## Step 4: Create the publish action
<a name="serverlessrepo-auto-publish-create-action"></a>

Follow these steps to create the publish action in your pipeline.

1. Open the CodePipeline console at [https://console.aws.amazon.com/codepipeline/](https://console.aws.amazon.com/codepipeline/).

1. In the left navigation section, choose the pipeline that you want to edit.

1. Choose **Edit**.

1. After the last stage of your current pipeline, choose **+ Add stage**. In **Stage name**, enter a name, such as **Publish**, and then choose **Add stage**.

1. In the new stage, choose **+ Add action group**.

1. Enter an action name. From **Action provider**, in **Invoke**, choose **AWS Lambda**.

1. From **Input artifacts**, choose **BuildArtifact**.

1. From **Function name**, choose the physical ID of the Lambda function that you noted in the previous step.

1. Choose **Save** for the action.

1. Choose **Done** for the stage.

1. In the upper right, choose **Save**.

1. To verify your pipeline, make a change to your application in GitHub. For example, change the application's description in the `Metadata` section of your AWS SAM template file. Commit the change and push it to your GitHub branch. This triggers your pipeline to run. When the pipeline is complete, check that your application has been updated with your change in the [AWS Serverless Application Repository](https://console.aws.amazon.com/serverlessrepo/home).

# Tutorial: Lambda function deployments with CodePipeline
<a name="tutorials-lambda-deploy"></a>

This tutorial helps you create a deploy action in CodePipeline that deploys your code to a function that you have configured in Lambda. In this tutorial, you create a sample Lambda function with an alias and a published version, add the zipped Lambda function code to your source location, and run the Lambda deploy action in your pipeline.

**Note**  
As part of creating a pipeline in the console, an S3 artifact bucket is used by CodePipeline for artifacts. (This is different from the bucket used for an S3 source action.) If the S3 artifact bucket is in a different account from the account for your pipeline, make sure that the bucket is owned by an AWS account that you trust.

**Note**  
The `Lambda` deploy action is only available for V2 type pipelines.

## Prerequisites
<a name="tutorials-lambda-deploy-prereqs"></a>

Before you can use this tutorial to create your CD pipeline, you must have the following resources in place:

**Note**  
All of these resources should be created within the same AWS Region.
+ A source control repository, such as GitHub, or a source S3 bucket (this tutorial uses S3) where you will store a  `.zip` file that you create for your Lambda function.
+ You must use an existing CodePipeline service role that has been updated with the permissions for this action. To update your service role, see [Service role policy permissions for the Lambda deploy action](action-reference-LambdaDeploy.md#action-reference-LambdaDeploy-permissions-action).

After you have satisfied these prerequisites, you can proceed with the tutorial and create your CD pipeline.

## Step 1: Create your sample Lambda function
<a name="tutorials-lambda-deploy-instances"></a>

In this step, you create the Lambda function to which you will deploy.

**To create your Lambda function**

1. Open the Lambda console.

1. Choose **Create function**, and then choose **Author from scratch**.

1. In **Function name**, enter **MyLambdaFunction**. 

1. Publish a new version. This will be the version that the alias will point to.

   1. Select your function.

   1. Choose the **Actions** dropdown.

   1. Choose **Publish new version**.

   1. (Optional) Add to the description in **Description**.

   1. Choose **Publish**.

1. Create an alias for your function, such as `aliasV1`.

1. Make sure the alias points to the version that you just created (such as **1**).
**Note**  
If you choose **$LATEST**, you cannot use traffic shifting because Lambda does not support routing configuration on an alias that points to $LATEST.

## Step 2: Upload the function file to your repository
<a name="tutorials-lambda-deploy-file"></a>

Download your function code and save it as a `.zip` file. Then upload the zipped file to your S3 bucket using the following steps.

**To add a `.zip` file to your source repository**

1. Open your S3 bucket.

1. Choose **Upload**.

1. Upload the `.zip` file that contains your function code, such as `sample_lambda_source.zip`, to your source bucket.

   Make a note of the object key for the uploaded file. You enter it when you create your pipeline.

## Step 3: Creating your pipeline
<a name="tutorials-lambda-deploy-pipeline"></a>

Use the CodePipeline wizard to create your pipeline stages and connect your source repository.

**To create your pipeline**

1. Open the CodePipeline console at [https://console.aws.amazon.com/codepipeline/](https://console.aws.amazon.com/codepipeline/).

1. On the **Welcome** page, **Getting started** page, or the **Pipelines** page, choose **Create pipeline**.

1. On the **Step 1: Choose creation option** page, under **Creation options**, choose the **Build custom pipeline** option. Choose **Next**.

1. In **Step 2: Choose pipeline settings**, in **Pipeline name**, enter **MyPipeline**.

1. CodePipeline provides V1 and V2 type pipelines, which differ in characteristics and price. The V2 type is the only type that you can choose in the console. For more information, see [Pipeline types](https://docs.aws.amazon.com/codepipeline/latest/userguide/pipeline-types-planning.html?icmpid=docs_acp_help_panel). For information about CodePipeline pricing, see [Pricing](https://aws.amazon.com/codepipeline/pricing/).

1. In **Service role**, choose **Use existing service role**, and then choose the CodePipeline service role that has been updated with the required permissions for this action. To configure your CodePipeline service role for this action, see [Service role policy permissions for the Lambda deploy action](action-reference-LambdaDeploy.md#action-reference-LambdaDeploy-permissions-action).

1. Leave the settings under **Advanced settings** at their defaults, and then choose **Next**.

1. On the **Step 3: Add source stage** page, add a source stage:

   1. In **Source provider**, choose **Amazon S3**.

   1. In **Bucket**, choose the name of your S3 source bucket.

   1. In **S3 object key**, enter the name of your .zip file, including the file extension: `sample_lambda_source.zip`.

   Choose **Next**.

1. On the **Step 4: Add build stage** page, choose **Skip**.

1. On the **Step 5: Add test stage** page, choose **Skip**.

1. On the **Step 6: Add deploy stage** page, choose **Lambda**.  
![](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/lambdadeploy-edit-screen.png)

   1. Add your function name and alias. 

   1. Choose your deploy strategy.

   1. Choose **Next**.

1. On the **Step 7: Review** page, review your pipeline configuration and choose **Create pipeline** to create the pipeline.  
![](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/lambdadeploy-pipeline-screen.png)

## Step 4: Test your pipeline
<a name="tutorials-lambda-deploy-test"></a>

Your pipeline now contains everything needed for end-to-end continuous deployment on AWS. Test its functionality by pushing a change to your source.

**To test your pipeline**

1. Upload a new version of your source .zip file to your S3 bucket. Uploading a change to the versioned bucket starts your pipeline.

1. Open the CodePipeline console at [https://console.aws.amazon.com/codepipeline/](https://console.aws.amazon.com/codepipeline/).

1. Choose your pipeline from the list.

1. Watch the pipeline progress through its stages. When the pipeline completes, the deploy action updates your Lambda function.

## Learn more
<a name="tutorials-lambda-deploy-learn"></a>

The Lambda deploy action supports two deployment methods. The first shifts traffic only, without an input artifact from the source action. The second updates the function code using an input artifact from the source action and then publishes a new version based on the updated code; if you provide an alias, CodePipeline also shifts traffic for it. This tutorial demonstrated the second method: updating your function using a source artifact.

To learn more about the action, see the action reference page at [AWS Lambda deploy action reference](action-reference-LambdaDeploy.md).

# Tutorial: Using variables with Lambda invoke actions
<a name="tutorials-lambda-variables"></a>

A Lambda invoke action can use variables from another action as part of its input and return new variables along with its output. For information about variables for actions in CodePipeline, see [Variables reference](reference-variables.md).

**Important**  
As part of creating a pipeline, CodePipeline uses an S3 artifact bucket that you provide for artifacts. (This is different from the bucket used for an S3 source action.) If the S3 artifact bucket is in a different account from the account for your pipeline, make sure that the bucket is owned by an AWS account that you trust.

At the end of this tutorial, you will have:
+ A Lambda invoke action that:
  + Consumes the `CommitId` variable from a CodeCommit source action
  + Outputs three new variables: `dateTime`, `testRunId`, and `region`
+ A manual approval action that consumes the new variables from your Lambda invoke action to provide a test URL and a test run ID
+ A pipeline updated with the new actions
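
In the pipeline's JSON structure, the invoke action you add in this tutorial corresponds to a declaration similar to the following sketch. The field values match the names used later in this tutorial; treat this as an illustration rather than output copied from `get-pipeline`.

```
{
    "name": "Test_Commit",
    "actionTypeId": {
        "category": "Invoke",
        "owner": "AWS",
        "provider": "Lambda",
        "version": "1"
    },
    "configuration": {
        "FunctionName": "myInvokeFunction",
        "UserParameters": "#{SourceVariables.CommitId}"
    },
    "namespace": "TestVariables",
    "inputArtifacts": [{ "name": "SourceArtifact" }],
    "outputArtifacts": [{ "name": "LambdaArtifact" }],
    "runOrder": 1
}
```

Note how `UserParameters` references the source action's `CommitId` variable, and the `namespace` field makes this action's output variables available to later actions.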

**Topics**
+ [Prerequisites](#lambda-variables-prereqs)
+ [Step 1: Create a Lambda function](#lambda-variables-function)
+ [Step 2: Add a Lambda invoke action and manual approval action to your pipeline](#lambda-variables-pipeline)

## Prerequisites
<a name="lambda-variables-prereqs"></a>

Before you begin, you must have the following: 
+ A pipeline with a CodeCommit source action. To create one, see [Tutorial: Create a simple pipeline (CodeCommit repository)](tutorials-simple-codecommit.md).
+ A namespace on the CodeCommit source action. Edit your existing pipeline and assign the namespace `SourceVariables` to the source action.

## Step 1: Create a Lambda function
<a name="lambda-variables-function"></a>

Use the following steps to create a Lambda function and a Lambda execution role. You add the Lambda action to your pipeline after you create the Lambda function.

**To create a Lambda function and execution role**

1. Sign in to the AWS Management Console and open the AWS Lambda console at [https://console.aws.amazon.com/lambda/](https://console.aws.amazon.com/lambda/).

1. Choose **Create function**. Leave **Author from scratch** selected.

1. In **Function name**, enter the name of your function, such as **myInvokeFunction**. In **Runtime**, leave the default option selected.

1. Expand **Choose or create an execution role**. Choose **Create a new role with basic Lambda permissions**.

1. Choose **Create function**.

1. To use a variable from another action, you pass it in `UserParameters` in the Lambda invoke action configuration. You configure the action in your pipeline later in this tutorial, but add the code now on the assumption that the variable will be passed.

   To produce new variables, set a property called `outputVariables` on the input to `putJobSuccessResult`. Note that you cannot produce variables as part of a `putJobFailureResult`.

   ```
    const putJobSuccess = async (message) => {
           const params = {
               jobId: jobId,
               outputVariables: {
                   testRunId: Math.floor(Math.random() * 1000).toString(),
                   dateTime: Date(Date.now()).toString(),
                   region: lambdaRegion
               }
           };
   ```

   In your new function, on the **Code** tab, paste the following example code into `index.mjs`.

   ```
   import { CodePipeline } from '@aws-sdk/client-codepipeline';
   
   export const handler = async (event, context) => {
       const codepipeline = new CodePipeline({});
       
       // Retrieve the Job ID from the Lambda action
       const jobId = event["CodePipeline.job"].id;
       
       // Retrieve UserParameters
       const params = event["CodePipeline.job"].data.actionConfiguration.configuration.UserParameters;
       
       // The region from where the lambda function is being executed
       const lambdaRegion = process.env.AWS_REGION;
       
       // Notify CodePipeline of a successful job
       const putJobSuccess = async (message) => {
           const params = {
               jobId: jobId,
               outputVariables: {
                   testRunId: Math.floor(Math.random() * 1000).toString(),
                   dateTime: Date(Date.now()).toString(),
                   region: lambdaRegion
               }
           };
           
           try {
               await codepipeline.putJobSuccessResult(params);
               return message;
           } catch (err) {
               throw err;
           }
       };
       
       // Notify CodePipeline of a failed job
       const putJobFailure = async (message) => {
           const params = {
               jobId: jobId,
               failureDetails: {
                   message: JSON.stringify(message),
                   type: 'JobFailed',
                   externalExecutionId: context.invokeid
               }
           };
           
           try {
               await codepipeline.putJobFailureResult(params);
               throw message;
           } catch (err) {
               throw err;
           }
       };
       
       try {
           console.log("Testing commit - " + params);
           
           // Your tests here
           
           // Succeed the job
           return await putJobSuccess("Tests passed.");
       } catch (ex) {
           // If any of the assertions failed then fail the job
           return await putJobFailure(ex);
       }
   };
   ```

1. The console saves your function code automatically.

1. Copy the Amazon Resource Name (ARN) contained in the **Function ARN** field at the top of the screen.

1. As a last step, open the AWS Identity and Access Management (IAM) console at [https://console.aws.amazon.com/iam/](https://console.aws.amazon.com/iam/). Modify the Lambda execution role to add the following policy: [AWSCodePipelineCustomActionAccess](https://console.aws.amazon.com/iam/home?region=us-west-2#/policies/arn%3Aaws%3Aiam%3A%3Aaws%3Apolicy%2FAWSCodePipelineCustomActionAccess). For the steps to create a Lambda execution role or modify the role policy, see [Step 2: Create the Lambda function](actions-invoke-lambda-function.md#actions-invoke-lambda-function-create-function).
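
For reference, the event that CodePipeline sends to a Lambda invoke action has the shape the handler above reads. The following sketch uses made-up values (the job ID and `UserParameters` value here are illustrative) to show the same field access the handler performs:

```
// Illustrative CodePipeline.job event; the IDs and parameter values are made up.
const sampleEvent = {
    "CodePipeline.job": {
        id: "11111111-abcd-1111-abcd-111111abcdef",
        data: {
            actionConfiguration: {
                configuration: {
                    FunctionName: "myInvokeFunction",
                    // At run time this holds the resolved value of #{SourceVariables.CommitId}
                    UserParameters: "f7f3ed38"
                }
            }
        }
    }
};

// The same lookups the handler performs:
const jobId = sampleEvent["CodePipeline.job"].id;
const userParams =
    sampleEvent["CodePipeline.job"].data.actionConfiguration.configuration.UserParameters;

console.log(`Job ${jobId} testing commit ${userParams}`);
```

Because the job ID arrives in the event, the function can report success or failure for exactly that job with `putJobSuccessResult` or `putJobFailureResult`.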

## Step 2: Add a Lambda invoke action and manual approval action to your pipeline
<a name="lambda-variables-pipeline"></a>

In this step, you add a Lambda invoke action to your pipeline. You add the action as part of a stage named **Test**. The action type is an invoke action. You then add a manual approval action after the invoke action.

**To add a Lambda action and a manual approval action to the pipeline**

1. Open the CodePipeline console at [https://console.aws.amazon.com/codepipeline/](https://console.aws.amazon.com/codepipeline/).

   The names of all pipelines that are associated with your AWS account are displayed. Choose the pipeline where you want to add the action.

1. Add the Lambda test action to your pipeline.

   1. To edit your pipeline, choose **Edit**. Add a stage after the source action in the existing pipeline. Enter a name for the stage, such as **Test**.

   1. In the new stage, choose **Add action group** to add an action. In **Action name**, enter the name of the invoke action, such as **Test_Commit**.

   1. In **Action provider**, choose **AWS Lambda**.

   1. In **Input artifacts**, choose the name of your source action's output artifact, such as `SourceArtifact`.

   1. In **FunctionName**, add the ARN of the Lambda function that you created.

   1. In **Variable namespace**, add the namespace name, such as **TestVariables**.

   1. In **Output artifacts**, add the output artifact name, such as **LambdaArtifact**.

   1. Choose **Done**.

1. Add the manual approval action to your pipeline.

   1. With your pipeline still in editing mode, add a stage after the invoke action. Enter a name for the stage, such as **Approval**.

   1. In the new stage, choose the icon to add an action. In **Action name**, enter the name of the approval action, such as **Change_Approval**.

   1. In **Action provider**, choose **Manual approval**.

   1. In **URL for review**, construct the URL by adding the variable syntax for the `region` variable and the `CommitId` variable. Make sure that you use the namespaces assigned to the actions that provide the output variables. 

      For this example, the CodeCommit source action's `CommitId` variable uses the `SourceVariables` namespace, and the Lambda action's `region` output variable uses the `TestVariables` namespace. The URL looks like the following.

      ```
      https://#{TestVariables.region}.console.aws.amazon.com/codesuite/codecommit/repositories/MyDemoRepo/commit/#{SourceVariables.CommitId}
      ```

      In **Comments**, construct the approval message by adding the variable syntax for the `testRunId` variable. For this example, the Lambda `testRunId` output variable uses the `TestVariables` namespace. Enter the following message.

      ```
      Make sure to review the code before approving this action. Test Run ID: #{TestVariables.testRunId}
      ```

1. Choose **Done** to close the edit screen for the action, and then choose **Done** to close the edit screen for the stage. To save the pipeline, choose **Save**. The completed pipeline now contains a structure with source, test, approval, and deploy stages.

   Choose **Release change** to run the latest change through the pipeline structure.

1. When the pipeline reaches the manual approval stage, choose **Review**. The resolved variables appear as the URL for the commit ID. Your approver can choose the URL to view the commit.

1. After the pipeline runs successfully, you can also view the variable values on the action execution history page.
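
Before the approval action runs, CodePipeline replaces each `#{namespace.variable}` reference with the corresponding variable's value. The following sketch illustrates that substitution; it is not CodePipeline's implementation, and the sample values are made up:

```
// Illustration only: resolve #{namespace.variable} references against known values.
const resolve = (template, vars) =>
    template.replace(/#\{(\w+)\.(\w+)\}/g, (match, ns, name) =>
        vars[ns] && vars[ns][name] !== undefined ? vars[ns][name] : match);

// Sample values standing in for the real action outputs.
const vars = {
    SourceVariables: { CommitId: "f7f3ed38" },               // from the CodeCommit source action
    TestVariables: { region: "us-west-2", testRunId: "42" }  // from the Lambda invoke action
};

const url = resolve(
    "https://#{TestVariables.region}.console.aws.amazon.com/codesuite/codecommit/repositories/MyDemoRepo/commit/#{SourceVariables.CommitId}",
    vars
);
console.log(url);
// -> https://us-west-2.console.aws.amazon.com/codesuite/codecommit/repositories/MyDemoRepo/commit/f7f3ed38
```

Unresolved references are left as-is here, which mirrors why you must use the correct namespace for each variable.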

# Tutorial: Use an AWS Step Functions invoke action in a pipeline
<a name="tutorials-step-functions"></a>

You can use AWS Step Functions to create and configure state machines. This tutorial shows you how to add an invoke action to a pipeline that starts state machine executions from your pipeline.

**Important**  
As part of creating a pipeline, CodePipeline uses an S3 artifact bucket that you provide for artifacts. (This is different from the bucket used for an S3 source action.) If the S3 artifact bucket is in a different account from the account for your pipeline, make sure that the bucket is owned by an AWS account that you trust.

In this tutorial, you do the following tasks:
+ Create a standard state machine in AWS Step Functions.
+ Enter the state machine input JSON directly. You can also upload the state machine input file to an Amazon Simple Storage Service (Amazon S3) bucket.
+ Update your pipeline by adding the state machine action.

**Topics**
+ [Prerequisite: Create or choose a simple pipeline](#tutorials-step-functions-prereq)
+ [Step 1: Create the sample state machine](#tutorials-step-functions-sample)
+ [Step 2: Add a Step Functions invoke action to your pipeline](#tutorials-step-functions-pipeline)

## Prerequisite: Create or choose a simple pipeline
<a name="tutorials-step-functions-prereq"></a>

In this tutorial, you add an invoke action to an existing pipeline. You can use the pipeline you created in [Tutorial: Create a simple pipeline (S3 bucket)](tutorials-simple-s3.md) or [Tutorial: Create a simple pipeline (CodeCommit repository)](tutorials-simple-codecommit.md).

You need an existing pipeline with a source action and at least two stages; you do not use the source artifacts in this example.

**Note**  
You might need to update the service role used by your pipeline with additional permissions required to run this action. To do this, open the AWS Identity and Access Management (IAM) console, find the role, and then add the permissions to the role's policy. For more information, see [Add permissions to the CodePipeline service role](how-to-custom-role.md#how-to-update-role-new-services).

## Step 1: Create the sample state machine
<a name="tutorials-step-functions-sample"></a>

In the Step Functions console, create a state machine using the `HelloWorld` sample template. For instructions, see [Create a State Machine](https://docs.aws.amazon.com/step-functions/latest/dg/getting-started.html#create-state-machine) in the *AWS Step Functions Developer Guide*.
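
If you want to see what a state machine definition looks like before using the console template, the following Amazon States Language snippet defines a minimal pass-through machine with a single `Pass` state. It is an illustration only, not the console's `HelloWorld` sample:

```
{
    "Comment": "Minimal pass-through state machine (illustration only)",
    "StartAt": "HelloPass",
    "States": {
        "HelloPass": {
            "Type": "Pass",
            "End": true
        }
    }
}
```

A `Pass` state simply forwards its input to its output, which makes it useful for verifying that a pipeline invoke action delivers the expected input.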

## Step 2: Add a Step Functions invoke action to your pipeline
<a name="tutorials-step-functions-pipeline"></a>

Add a Step Functions invoke action to your pipeline as follows:

1. Sign in to the AWS Management Console and open the CodePipeline console at [https://console.aws.amazon.com/codesuite/codepipeline/home](https://console.aws.amazon.com/codesuite/codepipeline/home).

   The names of all pipelines associated with your AWS account are displayed.

1. In **Name**, choose the name of the pipeline you want to edit. This opens a detailed view of the pipeline, including the state of each of the actions in each stage of the pipeline.

1. On the pipeline details page, choose **Edit**.

1. On the second stage of your simple pipeline, choose **Edit stage**, and then choose **Delete**. This deletes the second stage, which you no longer need.

1. At the bottom of the diagram, choose **+ Add stage**.

1. In **Stage name**, enter a name for the stage, such as **Invoke**, and then choose **Add stage**.

1. Choose **+ Add action group**.

1. In **Action name**, enter a name, such as **Invoke**.

1. In **Action provider**, choose **AWS Step Functions**. Allow **Region** to default to the pipeline Region.

1. In **Input artifacts**, choose `SourceArtifact`.

1. In **State machine ARN**, choose the Amazon Resource Name (ARN) for the state machine that you created earlier.

1. (Optional) In **Execution name prefix**, enter a prefix to be added to the state machine execution ID.

1. In **Input type**, choose **Literal**.

1. In **Input**, enter the input JSON that the `HelloWorld` sample state machine expects.
**Note**  
The input for the state machine execution is separate from the input artifacts that CodePipeline provides to actions.

   For this example, enter the following JSON:

   ```
   {"IsHelloWorldExample": true}
   ```

1. Choose **Done**.

1. On the stage that you're editing, choose **Done**. In the AWS CodePipeline pane, choose **Save**, and then choose **Save** on the warning message.

1. To submit your changes and start a pipeline execution, choose **Release change**, and then choose **Release**.

1. On your completed pipeline, choose **AWS Step Functions** in your invoke action. In the AWS Step Functions console, view your state machine execution ID. The ID shows your state machine name `HelloWorld` and the state machine execution ID with the prefix `my-prefix`.

   ```
   arn:aws:states:us-west-2:account-ID:execution:HelloWorld:my-prefix-0d9a0900-3609-4ebc-925e-83d9618fcca1
   ```

# Tutorial: Create a pipeline that uses AWS AppConfig as a deployment provider
<a name="tutorials-AppConfig"></a>

In this tutorial, you configure a pipeline that continuously delivers configuration files using AWS AppConfig as the deployment action provider in your deployment stage.

**Important**  
As part of creating a pipeline, CodePipeline uses an S3 artifact bucket that you provide for artifacts. (This is different from the bucket used for an S3 source action.) If the S3 artifact bucket is in a different account from the account for your pipeline, make sure that the bucket is owned by an AWS account that you trust.

**Topics**
+ [Prerequisites](#tutorials-AppConfig-prereq)
+ [Step 1: Create your AWS AppConfig resources](#tutorials-AppConfig-application)
+ [Step 2: Upload files to your S3 source bucket](#tutorials-AppConfig-bucket)
+ [Step 3: Create your pipeline](#tutorials-AppConfig-pipeline)
+ [Step 4: Make a change to any source file and verify deployment](#tutorials-AppConfig-verify)

## Prerequisites
<a name="tutorials-AppConfig-prereq"></a>

Before you begin, you must complete the following:
+ This example uses an S3 source for your pipeline. Create or use an Amazon S3 bucket with versioning enabled. Follow the instructions in [Step 1: Create an S3 source bucket for your application](tutorials-simple-s3.md#s3-create-s3-bucket) to create an S3 bucket.

## Step 1: Create your AWS AppConfig resources
<a name="tutorials-AppConfig-application"></a>

In this section, you create the following resources:
+ An *application* in AWS AppConfig is a logical unit of code that provides capabilities for your customers.
+ An *environment* in AWS AppConfig is a logical deployment group of AppConfig targets, such as applications in a beta or production environment.
+ A *configuration profile* is a collection of settings that influence the behavior of your application. The configuration profile enables AWS AppConfig to access your configuration in its stored location.
+ (Optional) A *deployment strategy* in AWS AppConfig defines the behavior of a configuration deployment, such as what percentage of clients should receive the new deployed config at any given time during a deployment.

**To create an application, environment, configuration profile, and deployment strategy**

1. Sign in to the AWS Management Console.

1. Use the steps in the following topics to create your resources in AWS AppConfig.
   + [Create an application](https://docs.aws.amazon.com/systems-manager/latest/userguide/appconfig-creating-application.html).
   + [Create an environment](https://docs.aws.amazon.com/systems-manager/latest/userguide/appconfig-creating-environment.html).
   + [Create an AWS CodePipeline configuration profile](https://docs.aws.amazon.com/systems-manager/latest/userguide/appconfig-creating-configuration-and-profile.html).
   + (Optional) [Choose a predefined deployment strategy or create your own](https://docs.aws.amazon.com/systems-manager/latest/userguide/appconfig-creating-deployment-strategy.html).

## Step 2: Upload files to your S3 source bucket
<a name="tutorials-AppConfig-bucket"></a>

In this section, you create your configuration files, and then zip and upload them to the bucket that the pipeline uses for your source stage.

**To create configuration files**

1. Create a `configuration.json` file for each configuration in each Region. Include the following contents:

   ```
   Hello World!
   ```

1. Use the following steps to zip and upload your configuration files.

**To zip and upload source files**

1. Create a .zip file with your files and name the .zip file `configuration-files.zip`. As an example, your .zip file can use the following structure:

   ```
   .
   └── appconfig-configurations
       └── MyConfigurations
           ├── us-east-1
           │   └── configuration.json
           └── us-west-2
               └── configuration.json
   ```

1. In the Amazon S3 console for your bucket, choose **Upload**, and follow the instructions to upload your .zip file.

## Step 3: Create your pipeline
<a name="tutorials-AppConfig-pipeline"></a>

In this section, you create a pipeline with the following actions:
+ A source stage with an Amazon S3 action where the source artifacts are the files for your configuration.
+ A deployment stage with an AppConfig deployment action.

**To create a pipeline with the wizard**

1. Sign in to the AWS Management Console and open the CodePipeline console at [https://console.aws.amazon.com/codesuite/codepipeline/home](https://console.aws.amazon.com/codesuite/codepipeline/home).

1. On the **Welcome** page, **Getting started** page, or the **Pipelines** page, choose **Create pipeline**.

1. On the **Step 1: Choose creation option** page, under **Creation options**, choose the **Build custom pipeline** option. Choose **Next**.

1. In **Step 2: Choose pipeline settings**, in **Pipeline name**, enter **MyAppConfigPipeline**.

1. CodePipeline provides V1 and V2 type pipelines, which differ in characteristics and price. The V2 type is the only type you can choose in the console. For more information, see [Pipeline types](https://docs.aws.amazon.com/codepipeline/latest/userguide/pipeline-types-planning.html?icmpid=docs_acp_help_panel). For information about pricing for CodePipeline, see [Pricing](https://aws.amazon.com/codepipeline/pricing/).

1. In **Service role**, choose **New service role** to allow CodePipeline to create a service role in IAM.

1. Leave the settings under **Advanced settings** at their defaults, and then choose **Next**.

1. In **Step 3: Add source stage**, in **Source provider**, choose **Amazon S3**. In **Bucket**, choose the name of your S3 source bucket. 

   In **S3 object key**, enter the name of your .zip file: `configuration-files.zip`.

   Choose **Next**.

1. In **Step 4: Add build stage**, choose **Skip build stage**, and then accept the warning message by choosing **Skip** again. 

   Choose **Next**.

1. In **Step 5: Add test stage**, choose **Skip test stage**, and then accept the warning message by choosing **Skip** again. 

   Choose **Next**.

1. In **Step 6: Add deploy stage**:

   1. In **Deploy provider**, choose **AWS AppConfig**. 

   1. In **Application**, choose the name of the application you created in AWS AppConfig. The field shows the ID for your application.

   1. In **Environment**, choose the name of the environment you created in AWS AppConfig. The field shows the ID for your environment.

   1. In **Configuration profile**, choose the name of the configuration profile you created in AWS AppConfig. The field shows the ID for your configuration profile.

   1. In **Deployment strategy**, choose the name of your deployment strategy. This can be either a deployment strategy you created in AppConfig or one you have chosen from predefined deployment strategies in AppConfig. The field shows the ID for your deployment strategy.

   1. In **Input artifact configuration path**, enter the file path. Make sure that your input artifact configuration path matches the directory structure in your S3 bucket .zip file. For this example, enter the following file path: `appconfig-configurations/MyConfigurations/us-west-2/configuration.json`. 

   1. Choose **Next**.

1. In **Step 7: Review**, review the information, and then choose **Create pipeline**.

## Step 4: Make a change to any source file and verify deployment
<a name="tutorials-AppConfig-verify"></a>

Make a change to your source files and upload the change to your bucket. This triggers your pipeline to run. Verify that your configuration is available by viewing the version.

# Tutorial: Use full clone with a GitHub pipeline source
<a name="tutorials-github-gitclone"></a>

You can choose the full clone option for your GitHub source action in CodePipeline. Use this option to run CodeBuild commands for Git metadata in your pipeline build action.

**Note**  
The full clone option described here specifies whether CodePipeline clones the repository's Git metadata, which can be used only by CodeBuild commands. To use a GitHub [user access token](https://docs.github.com/en/apps/creating-github-apps/authenticating-with-a-github-app/generating-a-user-access-token-for-a-github-app) with CodeBuild projects, follow the steps here to install the AWS Connector for GitHub app, and then leave the **App installation** field empty. CodeConnections uses the user access token for the connection.



**Important**  
As part of creating a pipeline, CodePipeline uses an S3 artifact bucket that you provide for artifacts. (This is different from the bucket used for an S3 source action.) If the S3 artifact bucket is in a different account from the account for your pipeline, make sure that the bucket is owned by an AWS account that you trust.

In this tutorial, you create a pipeline that connects to your GitHub repository, uses the full clone option for source data, and runs a CodeBuild build that clones your repository and performs Git commands against it.

**Note**  
This feature is not available in the Asia Pacific (Hong Kong), Africa (Cape Town), Middle East (Bahrain), Europe (Zurich), or AWS GovCloud (US-West) Regions. To reference other available actions, see [Product and service integrations with CodePipeline](integrations.md). For considerations with this action in the Europe (Milan) Region, see the note in [CodeStarSourceConnection for Bitbucket Cloud, GitHub, GitHub Enterprise Server, GitLab.com, and GitLab self-managed actions](action-reference-CodestarConnectionSource.md).

**Topics**
+ [Prerequisites](#tutorials-github-gitclone-prereq)
+ [Step 1: Create a README file](#tutorials-github-gitclone-file)
+ [Step 2: Create your pipeline and build project](#tutorials-github-gitclone-pipeline)
+ [Step 3: Update the CodeBuild service role policy to use connections](#tutorials-github-gitclone-rolepolicy)
+ [Step 4: View repository commands in build output](#tutorials-github-gitclone-view)

## Prerequisites
<a name="tutorials-github-gitclone-prereq"></a>

Before you begin, you must do the following:
+ Create a GitHub repository with your GitHub account.
+ Have your GitHub credentials ready. When you use the AWS Management Console to set up a connection, you are asked to sign in with your GitHub credentials. 

## Step 1: Create a README file
<a name="tutorials-github-gitclone-file"></a>

After you create your GitHub repository, use these steps to add a README file.

1. Sign in to GitHub and choose your repository.

1. To create a new file, choose **Add file**, **Create new file**. Name the file `README.md` and add the following text.

   ```
   This is a GitHub repository!
   ```

1. Choose **Commit changes**.

   Make sure the `README.md` file is at the root level of your repository.

## Step 2: Create your pipeline and build project
<a name="tutorials-github-gitclone-pipeline"></a>

In this section, you create a pipeline with the following actions:
+ A source stage with a GitHub source action that uses a connection to your repository.
+ A build stage with an AWS CodeBuild build action.

**To create a pipeline with the wizard**

1. Sign in to the CodePipeline console at [https://console.aws.amazon.com/codepipeline/](https://console.aws.amazon.com/codepipeline/).

1. On the **Welcome** page, **Getting started** page, or **Pipelines** page, choose **Create pipeline**.

1. On the **Step 1: Choose creation option** page, under **Creation options**, choose the **Build custom pipeline** option. Choose **Next**.

1. In **Step 2: Choose pipeline settings**, in **Pipeline name**, enter **MyGitHubPipeline**.

1. In **Pipeline type**, choose **V1** for the purposes of this tutorial. You can also choose **V2**; however, note that pipeline types differ in characteristics and price. For more information, see [Pipeline types](pipeline-types.md).

1. In **Service role**, choose **New service role**.
**Note**  
If you choose instead to use your existing CodePipeline service role, make sure that you have added the `codestar-connections:UseConnection` IAM permission to your service role policy. For instructions for the CodePipeline service role, see [Add permissions to the CodePipeline service role](https://docs.aws.amazon.com/codepipeline/latest/userguide/security-iam.html#how-to-update-role-new-services).

1. Under **Advanced settings**, leave the defaults. In **Artifact store**, choose **Default location** to use the default artifact store, such as the Amazon S3 artifact bucket designated as the default, for your pipeline in the Region you selected for your pipeline.
**Note**  
This is not the source bucket for your source code. This is the artifact store for your pipeline. A separate artifact store, such as an S3 bucket, is required for each pipeline.

   Choose **Next**.

1. On the **Step 3: Add source stage** page, add a source stage:

   1. In **Source provider**, choose **GitHub (via GitHub App)**.

   1. Under **Connection**, choose an existing connection or create a new one. To create or manage a connection for your GitHub source action, see [GitHub connections](connections-github.md). 

      You install one app for all of your connections to a particular provider. If you have already installed the AWS Connector for GitHub app, choose it and skip this step.
**Note**  
If you want to create a [user access token](https://docs.github.com/en/apps/creating-github-apps/authenticating-with-a-github-app/generating-a-user-access-token-for-a-github-app), make sure that you've already installed the AWS Connector for GitHub app and then leave the **App installation** field empty. CodeConnections uses the user access token for the connection. For more information, see [Access your source provider in CodeBuild](https://docs.aws.amazon.com/codebuild/latest/userguide/access-tokens.html).

   1. In **Repository name**, choose the name of your GitHub repository.

   1. In **Branch name**, choose the repository branch you want to use.

   1. Make sure the **Start the pipeline on source code change** option is selected.

   1. Under **Output artifact format**, choose **Full clone** to enable the Git clone option for the source repository. Only actions provided by CodeBuild can use the Git clone option. In [Step 3: Update the CodeBuild service role policy to use connections](#tutorials-github-gitclone-rolepolicy) in this tutorial, you update the permissions for your CodeBuild project service role to use this option.

   Choose **Next**.

1. In **Step 4: Add build stage**, add a build stage:

   1. In **Build provider**, choose **AWS CodeBuild**. Allow **Region** to default to the pipeline Region.

   1. Choose **Create project**.

   1. In **Project name**, enter a name for this build project.

   1. In **Environment image**, choose **Managed image**. For **Operating system**, choose **Ubuntu**.

   1. For **Runtime**, choose **Standard**. For **Image**, choose **aws/codebuild/standard:5.0**.

   1. For **Service role**, choose **New service role**.
**Note**  
Note the name of your CodeBuild service role. You will need the role name for the final step in this tutorial.

   1. Under **Buildspec**, for **Build specifications**, choose **Insert build commands**. Choose **Switch to editor**, and paste the following under **Build commands**.
**Note**  
In the `env` section of the build spec, make sure the credential helper for git commands is enabled as shown in this example.

      ```
      version: 0.2
      
      env:
        git-credential-helper: yes
      phases:
        install:
          #If you use the Ubuntu standard image 2.0 or later, you must specify runtime-versions.
          #If you specify runtime-versions and use an image other than Ubuntu standard image 2.0, the build fails.
          runtime-versions:
            nodejs: 12
            # name: version
          #commands:
            # - command
            # - command
        pre_build:
          commands:
            - ls -lt
            - cat README.md
        build:
          commands:
            - git log | head -100
            - git status
            - ls
            - git archive --format=zip HEAD > application.zip
        #post_build:
          #commands:
            # - command
            # - command
      artifacts:
        files:
          - application.zip
          # - location
        #name: $(date +%Y-%m-%d)
        #discard-paths: yes
        #base-directory: location
      #cache:
        #paths:
          # - paths
      ```

   1. Choose **Continue to CodePipeline**. This returns you to the CodePipeline console and creates a CodeBuild project that uses your build commands for configuration. The build project uses a service role to manage AWS service permissions. This step might take a couple of minutes.

   1. Choose **Next**.

1. In **Step 5: Add test stage**, choose **Skip test stage**, and then accept the warning message by choosing **Skip** again. 

   Choose **Next**.

1. On the **Step 6: Add deploy stage** page, choose **Skip deploy stage**, and then accept the warning message by choosing **Skip** again. Choose **Next**.

1. On **Step 7: Review**, choose **Create pipeline**.

## Step 3: Update the CodeBuild service role policy to use connections
<a name="tutorials-github-gitclone-rolepolicy"></a>

The initial pipeline run will fail because the CodeBuild service role must be updated with permissions to use connections. Add the `codestar-connections:UseConnection` IAM permission to your service role policy. For instructions to update the policy in the IAM console, see [Add CodeBuild GitClone permissions for connections to Bitbucket, GitHub, GitHub Enterprise Server, or GitLab.com](troubleshooting.md#codebuild-role-connections).
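The permission to add can be sketched as a minimal IAM policy statement. This is an illustrative sketch, not the complete service role policy, and the connection ARN is a placeholder that you would replace with the ARN of your own connection:

```python
import json

# Placeholder connection ARN -- replace with the ARN of your own connection.
CONNECTION_ARN = "arn:aws:codestar-connections:us-east-1:111122223333:connection/example-id"

# Minimal statement granting permission to use the connection.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "codestar-connections:UseConnection",
            "Resource": CONNECTION_ARN,
        }
    ],
}

print(json.dumps(policy, indent=2))
```

You would attach a statement like this to the CodeBuild service role policy by following the troubleshooting instructions linked above.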

## Step 4: View repository commands in build output
<a name="tutorials-github-gitclone-view"></a>

1. When your service role is successfully updated, choose **Retry** on the failed CodeBuild stage.

1. After the pipeline runs successfully, on your successful build stage, choose **View details**.

   On the details page, choose the **Logs** tab. View the CodeBuild build output. The commands output the `README.md` file contents, list the files in the directory, clone the repository, view the log, and archive the repository as a ZIP file.

# Tutorial: Use full clone with a CodeCommit pipeline source
<a name="tutorials-codecommit-gitclone"></a>

You can choose the full clone option for your CodeCommit source action in CodePipeline. Use this option to allow CodeBuild to access Git metadata in your pipeline build action.

In this tutorial, you create a pipeline that accesses your CodeCommit repository, uses the full clone option for source data, and runs a CodeBuild build that clones your repository and performs Git commands for the repository.

**Note**  
CodeBuild actions are the only downstream actions that support use of the Git metadata available with the Git clone option. Also, while your pipeline can contain cross-account actions, the CodeCommit action and the CodeBuild action must be in the same account for the full clone option to succeed.

**Important**  
As part of creating a pipeline, CodePipeline uses an S3 artifact bucket provided by the customer for artifacts. (This is different from the bucket used for an S3 source action.) If the S3 artifact bucket is in a different account from the account for your pipeline, make sure that it is owned by an AWS account that you trust.

**Topics**
+ [

## Prerequisites
](#tutorials-codecommit-gitclone-prereq)
+ [

## Step 1: Create a README file
](#tutorials-codecommit-gitclone-file)
+ [

## Step 2: Create your pipeline and build project
](#tutorials-codecommit-gitclone-pipeline)
+ [

## Step 3: Update the CodeBuild service role policy to clone the repository
](#tutorials-codecommit-gitclone-rolepolicy)
+ [

## Step 4: View repository commands in build output
](#tutorials-codecommit-gitclone-view)

## Prerequisites
<a name="tutorials-codecommit-gitclone-prereq"></a>

Before you begin, you must create a CodeCommit repository in the same AWS account and Region as your pipeline.

## Step 1: Create a README file
<a name="tutorials-codecommit-gitclone-file"></a>

Use these steps to add a README file to your source repository. The README file provides an example source file for the CodeBuild downstream action to read.

**To add a README file**

1. Log in to your repository and choose your repository.

1. To create a new file, choose **Add file > Create file**. Name the file `README.md` and add the following text.

   ```
   This is a CodeCommit repository!
   ```

1. Choose **Commit changes**.

   Make sure the `README.md` file is at the root level of your repository.

## Step 2: Create your pipeline and build project
<a name="tutorials-codecommit-gitclone-pipeline"></a>

In this section, you create a pipeline with the following actions:
+ A source stage with a CodeCommit source action.
+ A build stage with an AWS CodeBuild build action.

**To create a pipeline with the wizard**

1. Sign in to the CodePipeline console at [https://console.aws.amazon.com/codepipeline/](https://console.aws.amazon.com/codepipeline/).

1. On the **Welcome** page, **Getting started** page, or **Pipelines** page, choose **Create pipeline**.

1. On the **Step 1: Choose creation option** page, under **Creation options**, choose the **Build custom pipeline** option. Choose **Next**.

1. In **Step 2: Choose pipeline settings**, in **Pipeline name**, enter **MyCodeCommitPipeline**.

1. CodePipeline provides V1 and V2 type pipelines, which differ in characteristics and price. The V2 type is the only type you can choose in the console. For more information, see [Pipeline types](https://docs.aws.amazon.com/codepipeline/latest/userguide/pipeline-types-planning.html?icmpid=docs_acp_help_panel). For information about pricing for CodePipeline, see [Pricing](https://aws.amazon.com/codepipeline/pricing/).

1. In **Service role**, choose **Existing service role**, and then choose your existing CodePipeline service role. Make sure that your service role policy includes the `codecommit:GetRepository` IAM permission. See [Add permissions to the CodePipeline service role](https://docs.aws.amazon.com/codepipeline/latest/userguide/security-iam.html#how-to-update-role-new-services).

1. Under **Advanced settings**, leave the defaults. Choose **Next**.

1. On the **Step 3: Add source stage** page, do the following:

   1. In **Source provider**, choose **CodeCommit**.

   1. In **Repository name**, choose the name of your repository.

   1. In **Branch name**, choose your branch name.

   1. Make sure the **Start the pipeline on source code change** option is selected.

   1. Under **Output artifact format**, choose **Full clone** to enable the Git clone option for the source repository. Only actions provided by CodeBuild can use the Git clone option. 

   Choose **Next**.

1. In **Step 4: Add build stage**, do the following:

   1. In **Build provider**, choose **AWS CodeBuild**. Allow **Region** to default to the pipeline Region.

   1. Choose **Create project**.

   1. In **Project name**, enter a name for this build project.

   1. In **Environment image**, choose **Managed image**. For **Operating system**, choose **Ubuntu**.

   1. For **Runtime**, choose **Standard**. For **Image**, choose **aws/codebuild/standard:5.0**.

   1. For **Service role**, choose **New service role**.
**Note**  
Note the name of your CodeBuild service role. You will need the role name for the final step in this tutorial.

   1. Under **Buildspec**, for **Build specifications**, choose **Insert build commands**. Choose **Switch to editor**, and then under **Build commands** paste the following code.

      ```
      version: 0.2
      
      env:
        git-credential-helper: yes
      phases:
        install:
          #If you use the Ubuntu standard image 2.0 or later, you must specify runtime-versions.
          #If you specify runtime-versions and use an image other than Ubuntu standard image 2.0, the build fails.
          runtime-versions:
            nodejs: 12
            # name: version
          #commands:
            # - command
            # - command
        pre_build:
          commands:
            - ls -lt
            - cat README.md
        build:
          commands:
            - git log | head -100
            - git status
            - ls
            - git describe --all
        #post_build:
          #commands:
            # - command
            # - command
      #artifacts:
        #files:
          # - location
        #name: $(date +%Y-%m-%d)
        #discard-paths: yes
        #base-directory: location
      #cache:
        #paths:
          # - paths
      ```

   1. Choose **Continue to CodePipeline**. This returns you to the CodePipeline console and creates a CodeBuild project that uses your build commands for configuration. The build project uses a service role to manage AWS service permissions. This step might take a couple of minutes.

   1. Choose **Next**.

1. In **Step 5: Add test stage**, choose **Skip test stage**, and then accept the warning message by choosing **Skip** again. 

   Choose **Next**.

1. On the **Step 6: Add deploy stage** page, choose **Skip deploy stage**, and then accept the warning message by choosing **Skip** again. Choose **Next**.

1. On **Step 7: Review**, choose **Create pipeline**.

## Step 3: Update the CodeBuild service role policy to clone the repository
<a name="tutorials-codecommit-gitclone-rolepolicy"></a>

The initial pipeline run will fail because you need to update the CodeBuild service role with permissions to pull from your repository.

Add the `codecommit:GitPull` IAM permission to your service role policy. For instructions to update the policy in the IAM console, see [Add CodeBuild GitClone permissions for CodeCommit source actions](troubleshooting.md#codebuild-role-codecommitclone).
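The statement to add can be sketched as follows. This is an illustrative sketch only; the repository ARN is a placeholder that you would replace with your repository's Region, account ID, and name:

```python
import json

# Placeholder repository ARN -- substitute your own Region, account, and repository name.
REPO_ARN = "arn:aws:codecommit:us-east-1:111122223333:MyCodeCommitRepo"

# Minimal statement granting CodeBuild permission to pull from the repository.
statement = {
    "Effect": "Allow",
    "Action": "codecommit:GitPull",
    "Resource": REPO_ARN,
}

print(json.dumps({"Version": "2012-10-17", "Statement": [statement]}, indent=2))
```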

## Step 4: View repository commands in build output
<a name="tutorials-codecommit-gitclone-view"></a>

**To view the build output**

1. When your service role is successfully updated, choose **Retry** on the failed CodeBuild stage.

1. After the pipeline runs successfully, on your successful build stage, choose **View details**.

   On the details page, choose the **Logs** tab. View the CodeBuild build output. The commands output the `README.md` file contents, list the files in the directory, clone the repository, view the log, and run `git describe --all`.

# Tutorial: Create a pipeline with AWS CloudFormation StackSets deployment actions
<a name="tutorials-stackset-deployment"></a>

In this tutorial, you use the AWS CodePipeline console to create a pipeline with deployment actions for creating a stack set and creating stack instances. When the pipeline runs, the template creates a stack set and also creates and updates the instances where the stack set is deployed.

**Important**  
As part of creating a pipeline, CodePipeline uses an S3 artifact bucket provided by the customer for artifacts. (This is different from the bucket used for an S3 source action.) If the S3 artifact bucket is in a different account from the account for your pipeline, make sure that it is owned by an AWS account that you trust.

There are two ways to manage permissions for a stack set: self-managed and AWS-managed IAM roles. This tutorial provides examples with self-managed permissions.

To most effectively use StackSets in CodePipeline, you should have a clear understanding of the concepts behind AWS CloudFormation StackSets and how they work. See [StackSets concepts](https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/stacksets-concepts.html) in the *AWS CloudFormation User Guide*.

**Topics**
+ [

## Prerequisites
](#tutorials-stackset-deployment-prereq)
+ [

## Step 1: Upload the sample AWS CloudFormation template and parameter file
](#tutorials-stackset-deployment-upload)
+ [

## Step 2: Create your pipeline
](#tutorials-stackset-action-pipeline)
+ [

## Step 3: View initial deployment
](#tutorials-stackset-action-initial)
+ [

## Step 4: Add a CloudFormationStackInstances action
](#tutorials-stacksets-instances)
+ [

## Step 5: View stack set resources for your deployment
](#tutorials-stacksets-view)
+ [

## Step 6: Make an update to your stack set
](#tutorials-stacksets-update)

## Prerequisites
<a name="tutorials-stackset-deployment-prereq"></a>

For stack set operations, you use two different accounts: an administrator account and a target account. You create stack sets in the administrator account. You create the individual stacks that belong to a stack set in the target account.

**To create an administrator role with your administrator account**
+ Follow the instructions in [Set up basic permissions for stack set operations](https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/stacksets-prereqs-self-managed.html#stacksets-prereqs-accountsetup). Your role must be named **`AWSCloudFormationStackSetAdministrationRole`**.

**To create a service role in the target account**
+ Create a service role in the target account that trusts the administrator account. Follow the instructions in [Set up basic permissions for stack set operations](https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/stacksets-prereqs-self-managed.html#stacksets-prereqs-accountsetup). Your role must be named **`AWSCloudFormationStackSetExecutionRole`**. 

## Step 1: Upload the sample AWS CloudFormation template and parameter file
<a name="tutorials-stackset-deployment-upload"></a>

Create a source bucket for your stack set template and parameters files. Download the sample AWS CloudFormation template file, set up a parameters file, and then zip the files before uploading them to your S3 source bucket.

**Note**  
Make sure to ZIP the source files before you upload them to your S3 source bucket, even if the only source file is the template.



**To create an S3 source bucket**

1. Sign in to the AWS Management Console and open the Amazon S3 console at [https://console.aws.amazon.com/s3/](https://console.aws.amazon.com/s3/).

1. Choose **Create bucket**.

1. In **Bucket name**, enter a name for your bucket.

   In **Region**, choose the Region where you want to create your pipeline. Choose **Create bucket**.

1. After the bucket is created, a success banner displays. Choose **Go to bucket details**.

1. On the **Properties** tab, choose **Versioning**. Choose **Enable versioning**, and then choose **Save**.

**To create the AWS CloudFormation template file**

1. Download the following sample template file for generating CloudTrail configuration for stack sets: [https://s3.amazonaws.com/cloudformation-stackset-sample-templates-us-east-1/EnableAWSCloudtrail.yml](https://s3.amazonaws.com/cloudformation-stackset-sample-templates-us-east-1/EnableAWSCloudtrail.yml).

1. Save the file as `template.yml`.

**To create the parameters.txt file**

1. Create a file with the parameters for your deployment. Parameters are values that you want to update in your stack at runtime. The following sample file updates the template parameters for your stack set to enable logging validation and global events.

   ```
   [
     {
       "ParameterKey": "EnableLogFileValidation",
       "ParameterValue": "true"
     }, 
     {
       "ParameterKey": "IncludeGlobalEvents",
       "ParameterValue": "true"
     }
   ]
   ```

1. Save the file as `parameters.txt`.
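The JSON file above is equivalent to the literal `ParameterKey=...,ParameterValue=...` form that the Create Pipeline wizard also accepts. As a minimal sketch of that mapping (with the file contents inlined here for illustration):

```python
import json

# Sample parameters.txt contents, inlined for illustration.
parameters_txt = """
[
  {"ParameterKey": "EnableLogFileValidation", "ParameterValue": "true"},
  {"ParameterKey": "IncludeGlobalEvents", "ParameterValue": "true"}
]
"""

# Render each entry in the ParameterKey=...,ParameterValue=... literal form,
# one parameter per line.
lines = [
    f"ParameterKey={p['ParameterKey']},ParameterValue={p['ParameterValue']}"
    for p in json.loads(parameters_txt)
]
print("\n".join(lines))
```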

**To create the accounts.txt file**

1. Create a file with the accounts where you want to create instances, as shown in the following sample file.

   ```
   [
       "111111222222","333333444444"
   ]
   ```

1. Save the file as `accounts.txt`.
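A malformed account ID only surfaces later as a failed stack set operation, so it can help to validate the file locally before uploading. A minimal sketch using the sample values above (AWS account IDs are exactly 12 digits):

```python
import json
import re

# Sample accounts.txt contents, inlined for illustration.
accounts_txt = '["111111222222","333333444444"]'

accounts = json.loads(accounts_txt)

# Catch typos before uploading: every entry must be a 12-digit string.
for account in accounts:
    assert re.fullmatch(r"\d{12}", account), f"Invalid account ID: {account}"

print(accounts)
```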

**To create and upload source files**

1. Combine the files into a single ZIP file. Your files should look like this in your ZIP file.

   ```
   template.yml
   parameters.txt
   accounts.txt
   ```

1. Upload the ZIP file to your S3 bucket. This file is the source artifact created by the **Create Pipeline** wizard for your deployment action in CodePipeline.
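The steps above can be sketched with Python's `zipfile` module. The file contents here are abbreviated placeholders for the real template, parameters, and accounts files; the point is that all three files sit at the top level of a single flat archive:

```python
import zipfile

# Abbreviated placeholder contents standing in for the real source files.
sources = {
    "template.yml": "AWSTemplateFormatVersion: '2010-09-09'\n",
    "parameters.txt": '[{"ParameterKey": "EnableLogFileValidation", "ParameterValue": "true"}]\n',
    "accounts.txt": '["111111222222"]\n',
}

# Write the files at the root of the archive -- not inside a folder.
with zipfile.ZipFile("MyFiles.zip", "w", zipfile.ZIP_DEFLATED) as zf:
    for name, body in sources.items():
        zf.writestr(name, body)

with zipfile.ZipFile("MyFiles.zip") as zf:
    print(sorted(zf.namelist()))
```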

## Step 2: Create your pipeline
<a name="tutorials-stackset-action-pipeline"></a>

In this section, you create a pipeline with the following actions:
+ A source stage with an S3 source action where the source artifact is your template file and any supporting source files.
+ A deployment stage with a CloudFormation stack set deployment action that creates the stack set.
+ A deployment stage with a CloudFormation stack instances deployment action that creates the stacks and instances within the target accounts.

**To create a pipeline with a CloudFormationStackSet action**

1. Sign in to the AWS Management Console and open the CodePipeline console at [http://console.aws.amazon.com/codesuite/codepipeline/home](http://console.aws.amazon.com/codesuite/codepipeline/home).

1. On the **Welcome** page, **Getting started** page, or **Pipelines** page, choose **Create pipeline**.

1. On the **Step 1: Choose creation option** page, under **Creation options**, choose the **Build custom pipeline** option. Choose **Next**.

1. In **Step 2: Choose pipeline settings**, in **Pipeline name**, enter **MyStackSetsPipeline**.

1. In **Pipeline type**, choose **V1** for the purposes of this tutorial. You can also choose **V2**; however, note that pipeline types differ in characteristics and price. For more information, see [Pipeline types](pipeline-types.md).

1. In **Service role**, choose **New service role** to allow CodePipeline to create a service role in IAM.

1. In **Artifact store**, leave the defaults.
**Note**  
This is not the source bucket for your source code. This is the artifact store for your pipeline. A separate artifact store, such as an S3 bucket, is required for each pipeline. When you create or edit a pipeline, you must have an artifact bucket in the pipeline Region and one artifact bucket per AWS Region where you are running an action.  
For more information, see [Input and output artifacts](welcome-introducing-artifacts.md) and [CodePipeline pipeline structure reference](reference-pipeline-structure.md).

   Choose **Next**.

1. On the **Step 3: Add source stage** page, in **Source provider**, choose **Amazon S3**.

1. In **Bucket**, enter the S3 source bucket you created for this tutorial, such as `BucketName`. In **S3 object key**, enter the file path and file name for your ZIP file, such as `MyFiles.zip`.

1. Choose **Next**.

1. In **Step 4: Add build stage**, choose **Skip build stage**, and then accept the warning message by choosing **Skip** again. 

   Choose **Next**.

1. In **Step 5: Add test stage**, choose **Skip test stage**, and then accept the warning message by choosing **Skip** again. 

   Choose **Next**.

1. In **Step 6: Add deploy stage**:

   1. In **Deploy provider**, choose **AWS CloudFormation Stack Set**.

   1. In **Stack set name**, enter a name for the stack set. This is the name of the stack set that the template creates.
**Note**  
Make a note of your stack set name. You will use it when you add the second StackSets deployment action to your pipeline.

   1. In **Template path**, enter the artifact name and file path where you uploaded your template file. For example, enter the following using the default source artifact name `SourceArtifact`.

      ```
      SourceArtifact::template.yml
      ```

   1. In **Deployment targets**, enter the artifact name and file path where you uploaded your accounts file. For example, enter the following using the default source artifact name `SourceArtifact`.

      ```
      SourceArtifact::accounts.txt
      ```

   1. In **Deployment target AWS Regions**, enter one Region for deployment of your initial stack instance, such as `us-east-1`.

   1. Expand **Deployment options**. In **Parameters**, enter the artifact name and file path where you uploaded your parameters file. For example, enter the following using the default source artifact name `SourceArtifact`.

      ```
      SourceArtifact::parameters.txt
      ```

      To enter the parameters as a literal input rather than a file path, enter the following:

      ```
      ParameterKey=EnableLogFileValidation,ParameterValue=true
      ParameterKey=IncludeGlobalEvents,ParameterValue=true
      ```

   1. In **Capabilities**, choose **CAPABILITY\_IAM** and **CAPABILITY\_NAMED\_IAM**.

   1. In **Permission model**, choose **SELF\_MANAGED**.

   1. In **Failure tolerance percentage**, enter `20`.

   1. In **Max concurrent percentage**, enter `25`.

   1. Choose **Next**.

   1. In **Step 7: Review**, choose **Create pipeline**. Your pipeline displays. 

   1. Allow your pipeline to run. 
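As a rough sketch of how the failure tolerance and max concurrent percentage settings translate into account counts, assuming CloudFormation's documented behavior of rounding percentages down to whole numbers while keeping concurrency at a minimum of one:

```python
import math

def stack_set_counts(num_accounts: int, failure_tolerance_pct: int, max_concurrent_pct: int):
    # Percentages are converted to counts by rounding down; concurrency is
    # kept at a minimum of 1 so the operation can always make progress.
    failure_tolerance = math.floor(num_accounts * failure_tolerance_pct / 100)
    max_concurrent = max(1, math.floor(num_accounts * max_concurrent_pct / 100))
    return failure_tolerance, max_concurrent

# With 10 target accounts, 20% tolerance and 25% concurrency:
print(stack_set_counts(10, 20, 25))  # (2, 2)
```

So with the tutorial's values and 10 target accounts, up to 2 account failures are tolerated and at most 2 accounts are deployed to at a time.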

## Step 3: View initial deployment
<a name="tutorials-stackset-action-initial"></a>

View the resources and status for your initial deployment. After verifying the deployment successfully created your stack set, you can add the second action to your **Deploy** stage.

**To view the resources**

1. Open the CodePipeline console at [https://console.aws.amazon.com/codepipeline/](https://console.aws.amazon.com/codepipeline/).

1. Under **Pipelines**, choose your pipeline and choose **View**. The diagram shows your pipeline source and deployment stages.

1. On the **CloudFormationStackSet** action in your pipeline, choose the **AWS CloudFormation** link. The template, resources, and events for your stack set are shown in the CloudFormation console.

1. In the left navigation panel, choose **StackSets**. In the list, choose the new stack set.

1. Choose the **Stack instances** tab. Verify that one stack instance for each account you provided was created in the us-east-1 Region. Verify that the status for each stack instance is `CURRENT`.

## Step 4: Add a CloudFormationStackInstances action
<a name="tutorials-stacksets-instances"></a>

Create a next action in your pipeline that allows CloudFormation StackSets to create the remaining stack instances.

**To create a next action in your pipeline**

1. Open the CodePipeline console at [https://console.aws.amazon.com/codepipeline/](https://console.aws.amazon.com/codepipeline/).

   Under **Pipelines**, choose your pipeline and choose **View**. The diagram shows your pipeline source and deployment stages.

1. Choose to edit the pipeline. The pipeline displays in **Edit** mode. 

1. On the **Deploy** stage, choose **Edit**.

1. Under the **AWS CloudFormation Stack Set** deploy action, choose **Add action group**.

1. On the **Edit action** page, add the action details:

   1. In **Action name**, enter a name for the action.

   1. In **Action provider**, choose **AWS CloudFormation Stack Instances**.

   1. Under **Input artifacts**, choose **SourceArtifact**.

   1. In **Stack set name**, enter the name for the stack set. This is the name of the stack set that you provided in the first action.

   1. In **Deployment targets**, enter the artifact name and file path where you uploaded your accounts file. For example, enter the following using the default source artifact name `SourceArtifact`.

      ```
      SourceArtifact::accounts.txt
      ```

   1. In **Deployment target AWS Regions**, enter the Regions for deployment of your remaining stack instances, such as `us-east-2` and `eu-central-1` as follows:

      ```
      us-east-2, eu-central-1
      ```

   1. In **Failure tolerance percentage**, enter `20`.

   1. In **Max concurrent percentage**, enter `25`.

   1. Choose **Save**.

   1. Manually release a change. Your updated pipeline displays with two actions in the Deploy stage.

## Step 5: View stack set resources for your deployment
<a name="tutorials-stacksets-view"></a>

You can view the resources and status for your stack set deployment.

**To view the resources**

1. Open the CodePipeline console at [https://console.aws.amazon.com/codepipeline/](https://console.aws.amazon.com/codepipeline/).

1. Under **Pipelines**, choose your pipeline and then choose **View**. The diagram shows your pipeline source and deployment stages.

1. On the **AWS CloudFormation Stack Instances** action in your pipeline, choose the **AWS CloudFormation** link. The template, resources, and events for your stack set are shown in the CloudFormation console.

1. In the left navigation panel, choose **StackSets**. In the list, choose your stack set.

1. Choose the **Stack instances** tab. Verify that all remaining stack instances for each account you provided were created or updated in the expected Regions. Verify that the status for each stack instance is `CURRENT`.

## Step 6: Make an update to your stack set
<a name="tutorials-stacksets-update"></a>

Make an update to your stack set and deploy the update to instances. In this example, you also make a change to the deployment targets you want to designate for update. The instances that are not part of the update move to an outdated status.

1. Open the CodePipeline console at [https://console.aws.amazon.com/codepipeline/](https://console.aws.amazon.com/codepipeline/).

1. Under **Pipelines**, choose your pipeline and then choose **Edit**. On the **Deploy** stage, choose **Edit**.

1. Choose to edit the **AWS CloudFormation Stack Set** action in your pipeline. In **Description**, replace the existing description with a new description for the stack set.

1. Choose to edit the **AWS CloudFormation Stack Instances** action in your pipeline. In **Deployment target AWS Regions**, delete the `us-east-2` value that was entered when the action was created.

1. Save the changes. Choose **Release change** to run your pipeline.

1. Open your action in CloudFormation. Choose the **StackSet info** tab. In **StackSet description**, verify that the new description is shown.

1. Choose the **Stack instances** tab. Under **Status**, verify that the status for the stack instances in us-east-2 is `OUTDATED`.

# Tutorial: Create a variable check rule for a pipeline as an entry condition
<a name="tutorials-varcheckrule"></a>

In this tutorial, you configure a pipeline that continuously delivers files using GitHub as the source action provider in your source stage. The completed pipeline detects changes when you make a change to the source files in your source repository. The pipeline runs and then checks the output variables against the source repository name and branch name provided in the condition for entry to the build stage.

**Important**  
As part of creating a pipeline, CodePipeline uses an S3 artifact bucket provided by the customer for artifacts. (This is different from the bucket used for an S3 source action.) If the S3 artifact bucket is in a different account from the account for your pipeline, make sure that it is owned by an AWS account that you trust.

**Important**  
Many of the actions you add to your pipeline in this procedure involve AWS resources that you need to create before you create the pipeline. AWS resources for your source actions must always be created in the same AWS Region where you create your pipeline. For example, if you create your pipeline in the US East (Ohio) Region, your CodeCommit repository must be in the US East (Ohio) Region.   
You can add cross-region actions when you create your pipeline. AWS resources for cross-region actions must be in the same AWS Region where you plan to execute the action. For more information, see [Add a cross-Region action in CodePipeline](actions-create-cross-region.md).

This example uses a pipeline with a GitHub (via GitHub App) source action and a CodeBuild build action, where the entry condition for the build stage checks for variables.

## Prerequisites
<a name="tutorials-varcheckrule-prereq"></a>

Before you begin, you must do the following:
+ Create a GitHub repository with your GitHub account.
+ Have your GitHub credentials ready. When you use the AWS Management Console to set up a connection, you are asked to sign in with your GitHub credentials. 
+ Create a connection to your repository to set up GitHub (via GitHub App) as the source action for your pipeline. To create a connection to your GitHub repository, see [GitHub connections](connections-github.md).

## Step 1: Create sample source file and add to your GitHub repository
<a name="tutorials-varcheckrule-push"></a>

In this section, you create and add your example source file to the repository that the pipeline uses for your source stage. For this example, you produce and add the following: 
+ A `README.md` file.

After you create your GitHub repository, use these steps to add your README file.

1. Log in to your GitHub repository and choose your repository.

1. To create a new file, choose **Add file**, and then choose **Create new file**. Name the file `README.md` and add the following text.

   ```
   This is a GitHub repository!
   ```

1. Choose **Commit changes**. For the purposes of this tutorial, add a commit message that contains the capitalized word "Update" as in the following example:

   ```
   Update to source files
   ```
**Note**  
The rule check for strings is case-sensitive.

   Make sure the `README.md` file is at the root level of your repository.
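The case-sensitive rule check behaves like a plain substring match on the commit message. A minimal sketch of that behavior (`commit_message_matches` is a hypothetical helper for illustration, not part of CodePipeline; the actual check is configured in the console):

```python
def commit_message_matches(message: str, required: str = "Update") -> bool:
    # The comparison is case-sensitive: "update" does not match "Update".
    return required in message

print(commit_message_matches("Update to source files"))   # True
print(commit_message_matches("update to source files"))   # False
```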

## Step 2: Create your pipeline
<a name="tutorials-varcheckrule-create-pipeline"></a>

In this section, you create a pipeline with the following actions:
+ A source stage with a connection to your GitHub repository and action.
+ A CodeBuild build stage where the stage has an On Entry condition configured for the variable check rule.

**To create a pipeline with the wizard**

1. Sign in to the CodePipeline console at [https://console.aws.amazon.com/codepipeline/](https://console.aws.amazon.com/codepipeline/).

1. On the **Welcome** page, **Getting started** page, or **Pipelines** page, choose **Create pipeline**.

1. On the **Step 1: Choose creation option** page, under **Creation options**, choose the **Build custom pipeline** option. Choose **Next**.

1. In **Step 2: Choose pipeline settings**, in **Pipeline name**, enter **MyVarCheckPipeline**.

1. CodePipeline provides V1 and V2 type pipelines, which differ in characteristics and price. The V2 type is the only type you can choose in the console. For more information, see [pipeline types](https://docs.aws.amazon.com/codepipeline/latest/userguide/pipeline-types-planning.html?icmpid=docs_acp_help_panel). For information about pricing for CodePipeline, see [Pricing](https://aws.amazon.com/codepipeline/pricing/).

1. In **Service role**, choose **New service role**.
**Note**  
If you choose instead to use your existing CodePipeline service role, make sure that you have added the `codeconnections:UseConnection` IAM permission to your service role policy. For instructions for the CodePipeline service role, see [Add permissions to the CodePipeline service role](https://docs.aws.amazon.com/codepipeline/latest/userguide/security-iam.html#how-to-update-role-new-services).

1. Under **Advanced settings**, leave the defaults.

   Choose **Next**.

1. On the **Step 3: Add source stage** page, add a source stage:

   1. In **Source provider**, choose **GitHub (via GitHub App)**.

   1. Under **Connection**, choose an existing connection or create a new one. To create or manage a connection for your GitHub source action, see [GitHub connections](connections-github.md).

   1. In **Repository name**, choose the name of your GitHub repository.

   1. In **Branch name**, choose the repository branch you want to use.

   1. Make sure the **No trigger** option is selected.

   Choose **Next**.

1. In **Step 4: Add build stage**, add a build stage:

   1. In **Build provider**, choose **AWS CodeBuild**. Allow **Region** to default to the pipeline Region.

   1. Choose **Create project**.

   1. In **Project name**, enter a name for this build project.

   1. In **Environment image**, choose **Managed image**. For **Operating system**, choose **Ubuntu**.

   1. For **Runtime**, choose **Standard**. For **Image**, choose **aws/codebuild/standard:5.0**.

   1. For **Service role**, choose **New service role**.
**Note**  
Note the name of your CodeBuild service role. You will need the role name for the final step in this tutorial.

   1. Under **Buildspec**, for **Build specifications**, choose **Insert build commands**. Choose **Switch to editor**, and paste the following under **Build commands**.

      ```
      version: 0.2
      #env:
        #variables:
           # key: "value"
           # key: "value"
        #parameter-store:
           # key: "value"
           # key: "value"
        #git-credential-helper: yes
      phases:
        install:
          #If you use the Ubuntu standard image 2.0 or later, you must specify runtime-versions.
          #If you specify runtime-versions and use an image other than Ubuntu standard image 2.0, the build fails.
          runtime-versions:
            nodejs: 12
          #commands:
            # - command
            # - command
        #pre_build:
          #commands:
            # - command
            # - command
        build:
          commands:
            - echo "Build started"
        #post_build:
          #commands:
            # - command
            # - command
      artifacts:
        files:
           - '*'
          # - location
        name: $(date +%Y-%m-%d)
        #discard-paths: yes
        #base-directory: location
      #cache:
        #paths:
          # - paths
      ```

   1. Choose **Continue to CodePipeline**. This returns you to the CodePipeline console and creates a CodeBuild project that uses your build commands for its configuration. The build project uses a service role to manage AWS service permissions. This step might take a couple of minutes.

   1. Choose **Next**.

1. In **Step 5: Add test stage**, choose **Skip test stage**, and then accept the warning message by choosing **Skip** again. 

   Choose **Next**.

1. On the **Step 6: Add deploy stage** page, choose **Skip deploy stage**, and then accept the warning message by choosing **Skip** again. Choose **Next**.

1. On **Step 7: Review**, choose **Create pipeline**.

## Step 3: Edit the build stage to add the condition and rule
<a name="tutorials-varcheckrule-create-condition"></a>

In this step, you edit the stage to add an On Entry condition for the variable check rule.

1. Choose your pipeline, and then choose **Edit**. On the build stage, choose to add an entry rule. 

   In **Rule provider**, choose **VariableCheck**.

1. In **Variable**, enter the variable or variables that you want to check. In **Value**, enter the string value to check against the resolved variable. In the following example screens, a rule is created for an "equals" check, and another rule is created for a "contains" check.  
![\[The rule creation page for the "equals" variable check\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/varcheck-tut-create-rule-equals.png)  
![\[The rule creation page for the "contains" variable check\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/varcheck-tut-create-rule-contains.png)

1. Choose **Save**. 

   Choose **Done**.
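In the pipeline's JSON structure, the entry condition you configured appears as a `beforeEntry` block on the build stage. The following is an illustrative sketch, not the exact output for your pipeline; the rule name, variable reference, and operator are placeholders that you would adapt to the rules you created:

```
{
    "name": "Build",
    "beforeEntry": {
        "conditions": [
            {
                "result": "FAIL",
                "rules": [
                    {
                        "name": "MyVariableCheckRule",
                        "ruleTypeId": {
                            "category": "Rule",
                            "owner": "AWS",
                            "provider": "VariableCheck",
                            "version": "1"
                        },
                        "configuration": {
                            "Variable": "#{SourceVariables.CommitMessage}",
                            "Value": "Update",
                            "Operator": "CONTAINS"
                        }
                    }
                ]
            }
        ]
    }
}
```

With `"result": "FAIL"`, the stage does not start if the rule check does not pass.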

## Step 4: Run the pipeline and view resolved variables
<a name="tutorials-varcheckrule-run"></a>

In this step, you view the resolved values and results of the variable check rule.

1. After the rule check succeeds, view the completed run, as shown in the following example.  
![\[The successful run\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/varcheck-tut-run-succeeded.png)

1. View the variable information on the **Timeline** tab.   
![\[The history page showing the Timeline tab with variables succeeded\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/varcheck-tut-history.png)
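The comparisons behind the rule results shown above are plain, case-sensitive string checks. The following shell sketch mirrors the "equals" and "contains" semantics; the message value is illustrative:

```shell
MSG="Update to source files"

# "equals" check: the whole resolved value must match exactly
if [ "$MSG" = "Update to source files" ]; then
  echo "equals: matched"
fi

# "contains" check: the resolved value must include the string, case-sensitively
case "$MSG" in
  *Update*) echo "contains: matched 'Update'" ;;
esac
case "$MSG" in
  *update*) echo "contains: matched 'update'" ;;
  *)        echo "contains: lowercase 'update' did not match" ;;
esac
```

This is why the commit message in Step 1 must contain the capitalized word "Update": a commit message containing only the lowercase "update" would not satisfy the check.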