

# Cross-service samples for CodeBuild
<a name="cross-service-samples"></a>

You can use these cross-service samples to experiment with AWS CodeBuild:

[Amazon ECR sample](sample-ecr.md)  
Uses a Docker image in an Amazon ECR repository to build a sample Go project. The sample instructions will show you how to create and push a Docker image to Amazon ECR, create and build a Go project, and set up permissions that allow CodeBuild to pull the image from Amazon ECR.

[Amazon EFS sample](sample-efs.md)  
Shows how to configure a buildspec file so that a CodeBuild project mounts and builds on an Amazon EFS file system. The sample instructions will show you how to create an Amazon VPC, create a file system in the VPC, create and build a project that uses the VPC, and then review the generated project files and variables. 

[AWS CodePipeline samples](sample-codepipeline.md)  
Shows how to use AWS CodePipeline to create a build with batch builds as well as multiple input sources and multiple output artifacts. Included in this section are example JSON files that show pipeline structures that create batch builds with separate artifacts and with combined artifacts. An additional JSON sample shows a pipeline structure with multiple input sources and multiple output artifacts.

[AWS Config sample](how-to-integrate-config.md)  
Shows how to set up AWS Config. Lists which CodeBuild resources are tracked and describes how to look up CodeBuild projects in AWS Config. The sample instructions will show you the prerequisites for integrating with AWS Config, the steps to set up AWS Config, and the steps to look up CodeBuild projects and data in AWS Config. 

[Build notifications sample](sample-build-notifications.md)  
Uses Apache Maven to produce a single JAR file. Sends a build notification to subscribers of an Amazon SNS topic. The sample instructions show you how to set up permissions so that CodeBuild can communicate with Amazon SNS and CloudWatch, how to create and identify CodeBuild topics in Amazon SNS, how to subscribe recipients to the topic, and how to set up rules in CloudWatch.

# Amazon ECR sample for CodeBuild
<a name="sample-ecr"></a>

This sample uses a Docker image in an Amazon Elastic Container Registry (Amazon ECR) image repository to build a sample Go project.

**Important**  
Running this sample might result in charges to your AWS account. These include possible charges for AWS CodeBuild and for AWS resources and actions related to Amazon S3, AWS KMS, CloudWatch Logs, and Amazon ECR. For more information, see [CodeBuild pricing](http://aws.amazon.com/codebuild/pricing), [Amazon S3 pricing](http://aws.amazon.com/s3/pricing), [AWS Key Management Service pricing](http://aws.amazon.com/kms/pricing), [Amazon CloudWatch pricing](http://aws.amazon.com/cloudwatch/pricing), and [Amazon Elastic Container Registry pricing](http://aws.amazon.com/ecr/pricing).

**Topics**
+ [Run the Amazon ECR sample](#sample-ecr-running)

## Run the Amazon ECR sample
<a name="sample-ecr-running"></a>

Use the following instructions to run the Amazon ECR sample for CodeBuild.

**To run this sample**

1. To create and push the Docker image to your image repository in Amazon ECR, complete the steps in the [Run the 'Publish Docker image to Amazon ECR' sample](sample-docker.md#sample-docker-running) section of the ['Publish Docker image to Amazon ECR' sample](sample-docker.md).

1. Create a Go project: 

   1. Create the files as described in the [Go project structure](#ecr-sample-go-project-file-structure) and [Go project files](#sample-ecr-go-project-files) sections of this topic, and then upload them to an S3 input bucket or an AWS CodeCommit, GitHub, or Bitbucket repository. 
**Important**  
Do not upload `(root directory name)`, just the files inside of `(root directory name)`.   
If you are using an S3 input bucket, be sure to create a ZIP file that contains the files, and then upload it to the input bucket. Do not add `(root directory name)` to the ZIP file, just the files inside of `(root directory name)`.

   1. Create a build project, run the build, and view related build information.

      If you use the AWS CLI to create the build project, the JSON-formatted input to the `create-project` command might look similar to this. (Replace the placeholders with your own values.)

      ```
      {
        "name": "sample-go-project",
        "source": {
          "type": "S3",
          "location": "codebuild-region-ID-account-ID-input-bucket/GoSample.zip"
        },
        "artifacts": {
          "type": "S3",
          "location": "codebuild-region-ID-account-ID-output-bucket",
          "packaging": "ZIP",
          "name": "GoOutputArtifact.zip"
        },
        "environment": {
          "type": "LINUX_CONTAINER",
          "image": "aws/codebuild/standard:5.0",
          "computeType": "BUILD_GENERAL1_SMALL"
        },
        "serviceRole": "arn:aws:iam::account-ID:role/role-name",
        "encryptionKey": "arn:aws:kms:region-ID:account-ID:key/key-ID"
      }
      ```

   1. To get the build output artifact, open your S3 output bucket.

   1. Download the `GoOutputArtifact.zip` file to your local computer or instance, and then extract the contents of the file. In the extracted contents, get the `hello` file. 

1.  If one of the following is true, you must add permissions to your image repository in Amazon ECR so that AWS CodeBuild can pull its Docker image into the build environment. 
   +  Your project uses CodeBuild credentials to pull Amazon ECR images. This is denoted by a value of `CODEBUILD` in the `imagePullCredentialsType` attribute of your `ProjectEnvironment`. 
   +  Your project uses a cross-account Amazon ECR image. In this case, your project must use its service role to pull Amazon ECR images. To enable this behavior, set the `imagePullCredentialsType` attribute of your `ProjectEnvironment` to `SERVICE_ROLE`. 

   1. Open the Amazon ECR console at [https://console.aws.amazon.com/ecr/](https://console.aws.amazon.com/ecr/).

   1. In the list of repository names, choose the name of the repository you created or selected.

   1. From the navigation pane, choose **Permissions**, choose **Edit**, and then choose **Add statement**.

   1. For **Statement name**, enter an identifier (for example, **CodeBuildAccess**).

   1. For **Effect**, leave **Allow** selected. This indicates that you want to allow access to another AWS account.

   1. For **Principal**, do one of the following:
      + If your project uses CodeBuild credentials to pull an Amazon ECR image, in **Service principal**, enter **codebuild.amazonaws.com**. 
      + If your project uses a cross-account Amazon ECR image, for **AWS account IDs**, enter IDs of the AWS accounts that you want to give access.

   1. Skip the **All IAM entities** list.

   1. For **Action**, select the pull-only actions: **ecr:GetDownloadUrlForLayer**, **ecr:BatchGetImage**, and **ecr:BatchCheckLayerAvailability**.

   1. For **Conditions**, add the following:

      ```
      {
         "StringEquals":{
            "aws:SourceAccount":"<AWS-account-ID>",
            "aws:SourceArn":"arn:aws:codebuild:<region>:<AWS-account-ID>:project/<project-name>"
         }
      }
      ```

   1. Choose **Save**.

      This policy is displayed in **Permissions**. The principal is what you entered for **Principal** in step 3 of this procedure:
      + If your project uses CodeBuild credentials to pull an Amazon ECR image, `"codebuild.amazonaws.com"` appears under **Service principals**.
      + If your project uses a cross-account Amazon ECR image, the ID of the AWS account that you want to give access appears under **AWS Account IDs**.

        The following sample policy uses both CodeBuild credentials and a cross-account Amazon ECR image.

      ```
      {
          "Version":"2012-10-17",
          "Statement":[
              {
                  "Sid":"CodeBuildAccessPrincipal",
                  "Effect":"Allow",
                  "Principal":{
                      "Service":"codebuild.amazonaws.com"
                  },
                  "Action":[
                      "ecr:GetDownloadUrlForLayer",
                      "ecr:BatchGetImage",
                      "ecr:BatchCheckLayerAvailability"
                  ],
                  "Condition":{
                      "StringEquals":{
                          "aws:SourceArn":"arn:aws:codebuild:us-east-1:111122223333:project/MyProject",
                          "aws:SourceAccount":"111122223333"
                      }
                  }
              },
              {
                  "Sid":"CodeBuildAccessCrossAccount",
                  "Effect":"Allow",
                  "Principal":{
                      "AWS":"arn:aws:iam::<AWS-account-ID>:root"
                  },
                  "Action":[
                      "ecr:GetDownloadUrlForLayer",
                      "ecr:BatchGetImage",
                      "ecr:BatchCheckLayerAvailability"
                  ]
              }
          ]
      }
      ```

      + If your projects use CodeBuild credentials and you would like your CodeBuild projects to have open access to the Amazon ECR repository, you can omit the `Condition` keys and add the following sample policy.

      ```
      {
          "Version":"2012-10-17",
          "Statement":[
              {
                  "Sid":"CodeBuildAccessPrincipal",
                  "Effect":"Allow",
                  "Principal":{
                      "Service":"codebuild.amazonaws.com"
                  },
                  "Action":[
                      "ecr:GetDownloadUrlForLayer",
                      "ecr:BatchGetImage",
                      "ecr:BatchCheckLayerAvailability"
                  ]
              },
              {
                  "Sid":"CodeBuildAccessCrossAccount",
                  "Effect":"Allow",
                  "Principal":{
                      "AWS":"arn:aws:iam::<AWS-account-ID>:root"
                  },
                  "Action":[
                      "ecr:GetDownloadUrlForLayer",
                      "ecr:BatchGetImage",
                      "ecr:BatchCheckLayerAvailability"
                  ]
              }
          ]
      }
      ```


1. Create a build project, run the build, and view build information.

   If you use the AWS CLI to create the build project, the JSON-formatted input to the `create-project` command might look similar to this. (Replace the placeholders with your own values.)

   ```
   {
     "name": "amazon-ecr-sample-project",
     "source": {
       "type": "S3",
       "location": "codebuild-region-ID-account-ID-input-bucket/GoSample.zip"
     },
     "artifacts": {
       "type": "S3",
       "location": "codebuild-region-ID-account-ID-output-bucket",
       "packaging": "ZIP",
       "name": "GoOutputArtifact.zip"
     },
     "environment": {
       "type": "LINUX_CONTAINER",
       "image": "account-ID.dkr.ecr.region-ID.amazonaws.com/your-Amazon-ECR-repo-name:tag",
       "computeType": "BUILD_GENERAL1_SMALL"
     },
     "serviceRole": "arn:aws:iam::account-ID:role/role-name",
     "encryptionKey": "arn:aws:kms:region-ID:account-ID:key/key-ID"
   }
   ```

1. To get the build output artifact, open your S3 output bucket.

1. Download the `GoOutputArtifact.zip` file to your local computer or instance, and then extract the contents of the `GoOutputArtifact.zip` file. In the extracted contents, get the `hello` file.

### Go project structure
<a name="ecr-sample-go-project-file-structure"></a>

This sample assumes this directory structure.

```
(root directory name)
├── buildspec.yml
└── hello.go
```

### Go project files
<a name="sample-ecr-go-project-files"></a>

This sample uses these files.

`buildspec.yml` (in `(root directory name)`)

```
version: 0.2

phases:
  install:
    runtime-versions:
      golang: 1.13
  build:
    commands:
      - echo Build started on `date`
      - echo Compiling the Go code
      - go build hello.go 
  post_build:
    commands:
      - echo Build completed on `date`
artifacts:
  files:
    - hello
```

`hello.go` (in `(root directory name)`)

```
package main
import "fmt"

func main() {
  fmt.Println("hello world")
  fmt.Println("1+1 =", 1+1)
  fmt.Println("7.0/3.0 =", 7.0/3.0)
  fmt.Println(true && false)
  fmt.Println(true || false)
  fmt.Println(!true)
}
```

# Amazon Elastic File System sample for AWS CodeBuild
<a name="sample-efs"></a>

 You might want to create your AWS CodeBuild builds on Amazon Elastic File System, a scalable, shared file service for Amazon EC2 instances. The storage capacity with Amazon EFS is elastic, so it grows or shrinks as files are added and removed. It has a simple web services interface that you can use to create and configure file systems. It also manages all of the file storage infrastructure for you, so you do not need to worry about deploying, patching, or maintaining file system configurations. For more information, see [What is Amazon Elastic File System?](https://docs.aws.amazon.com/efs/latest/ug/whatisefs.html) in the *Amazon Elastic File System User Guide*. 

 This sample shows you how to configure a CodeBuild project so that it mounts and then builds a Java application to an Amazon EFS file system. Before you begin, you must have a Java application ready to build that is uploaded to an S3 input bucket or an AWS CodeCommit, GitHub, GitHub Enterprise Server, or Bitbucket repository. 

Data in transit for your file system is encrypted. To encrypt data in transit using a different build image, see [Encrypting data in transit](https://docs.aws.amazon.com/efs/latest/ug/encryption-in-transit.html). 

**Topics**
+ [Use AWS CodeBuild with Amazon Elastic File System](#sample-efs-high-level-steps)
+ [Troubleshoot the Amazon EFS integration](sample-efs-troubleshooting.md)

## Use AWS CodeBuild with Amazon Elastic File System
<a name="sample-efs-high-level-steps"></a>

The sample covers the four high-level steps required to use Amazon EFS with AWS CodeBuild. They are: 

1. Create a virtual private cloud (VPC) in your AWS account. 

1. Create a file system that uses this VPC. 

1. Create and build a CodeBuild project that uses the VPC. The CodeBuild project uses the following to identify the file system:
   +  A unique file system identifier. You choose the identifier when you specify the file system in your build project.
   + The file system ID. The ID is displayed when you view your file system in the Amazon EFS console.
   +  A mount point. This is a directory in your Docker container that mounts the file system. 
   + Mount options. These include details about how to mount the file system.

1. Review the build project to ensure that the correct project files and variables were generated.

**Note**  
 A file system created in Amazon EFS is supported on Linux platforms only. 

 

**Topics**
+ [Step 1: Create a VPC using CloudFormation](#sample-efs-create-vpc)
+ [Step 2: Create an Amazon Elastic File System file system with your VPC](#sample-efs-create-efs)
+ [Step 3: Create a CodeBuild project to use with Amazon EFS](#sample-efs-create-acb)
+ [Step 4: Review the build project](#sample-efs-summary)

### Step 1: Create a VPC using CloudFormation
<a name="sample-efs-create-vpc"></a>

 Create your VPC with a CloudFormation template. 

1.  Follow the instructions in [CloudFormation VPC template](cloudformation-vpc-template.md) to use CloudFormation to create a VPC. 
**Note**  
 The VPC created by this CloudFormation template has two private subnets and two public subnets. You must only use private subnets when you use AWS CodeBuild to mount the file system you created in Amazon EFS. If you use one of the public subnets, the build fails. 

1. Sign in to the AWS Management Console and open the Amazon VPC console at [https://console.aws.amazon.com/vpc/](https://console.aws.amazon.com/vpc/).

1.  Choose the VPC you created with CloudFormation.

1. On the **Description** tab, make a note of the name of your VPC and its ID. Both are required when you create your AWS CodeBuild project later in this sample. 

### Step 2: Create an Amazon Elastic File System file system with your VPC
<a name="sample-efs-create-efs"></a>

 Create a simple Amazon EFS file system for this sample using the VPC you created earlier. 

1. Sign in to the AWS Management Console and open the Amazon EFS console at [https://console.aws.amazon.com/efs/](https://console.aws.amazon.com/efs/).

1.  Choose **Create file system**. 

1.  From **VPC**, choose the VPC name you noted earlier in this sample. 

1.  Leave the Availability Zones associated with your subnets selected. 

1.  Choose **Next Step**. 

1.  In **Add tags**, for the default **Name** key, in **Value**, enter the name of your Amazon EFS file system. 

1.  Keep **Bursting** and **General Purpose** selected as your default performance and throughput modes, and then choose **Next Step**. 

1. For **Configure client access**, choose **Next Step**.

1.  Choose **Create File System**. 

1.  (Optional) We recommend adding a policy to your Amazon EFS file system that enforces encryption of data in transit. In the Amazon EFS console, choose **File system policy**, choose **Edit**, select the box labeled **Enforce in-transit encryption for all clients**, and then choose **Save**.

### Step 3: Create a CodeBuild project to use with Amazon EFS
<a name="sample-efs-create-acb"></a>

 Create an AWS CodeBuild project that uses the VPC you created earlier in this sample. When the build is run, it mounts the Amazon EFS file system created earlier. Next, it stores the .jar file created by your Java application in your file system's mount point directory.

1. Open the AWS CodeBuild console at [https://console.aws.amazon.com/codesuite/codebuild/home](https://console.aws.amazon.com/codesuite/codebuild/home).

1.  From the navigation pane, choose **Build projects**, and then choose **Create build project**. 

1.  In **Project name**, enter a name for your project. 

1.  From **Source provider**, choose the repository that contains the Java application you want to build. 

1.  Enter information, such as a repository URL, that CodeBuild uses to locate your application. The options are different for each source provider. For more information, see [Choose source provider](create-project.md#create-project-source-provider). 

1.  From **Environment image**, choose **Managed image**. 

1.  From **Operating system**, choose **Amazon Linux 2**. 

1. From **Runtime(s)**, choose **Standard**. 

1.  From **Image**, choose **aws/codebuild/amazonlinux-x86\_64-standard:4.0**. 

1.  From **Environment type**, choose **Linux**. 

1.  Under **Service role**, choose **New service role**. In **Role name**, enter a name for the role CodeBuild creates for you. 

1. Expand **Additional configuration**.

1.  Select **Enable this flag if you want to build Docker images or want your builds to get elevated privileges**.
**Note**  
By default, the Docker daemon is enabled for non-VPC builds. If you would like to use Docker containers for VPC builds, see [Runtime Privilege and Linux Capabilities](https://docs.docker.com/engine/reference/run/#runtime-privilege-and-linux-capabilities) on the Docker Docs website and enable privileged mode. Also, Windows does not support privileged mode.

1.  From **VPC**, choose the VPC ID. 

1.  From **Subnets**, choose one or more of the private subnets associated with your VPC. You must use private subnets in a build that mounts an Amazon EFS file system. If you use a public subnet, the build fails. 

1.  From **Security Groups**, choose the default security group.

1.  In **File systems**, enter the following information:
   + For **Identifier**, enter a unique file system identifier. It must be fewer than 129 characters and contain only alphanumeric characters and underscores. CodeBuild uses this identifier to create an environment variable that identifies the elastic file system. The environment variable format is `CODEBUILD_<file_system_identifier>` in capital letters. For example, if you enter `my_efs`, the environment variable is `CODEBUILD_MY_EFS`. 
   + For **ID**, choose the file system ID. 
   + (Optional) Enter a directory in the file system. CodeBuild mounts this directory. If you leave **Directory path** blank, CodeBuild mounts the entire file system. The path is relative to the root of the file system. 
   + For **Mount point**, enter the absolute path of the directory in your build container where the file system is mounted. If this directory does not exist, CodeBuild creates it during the build. 
   + (Optional) Enter mount options. If you leave **Mount options** blank, CodeBuild uses its default mount options:

     ```
     nfsvers=4.1
     rsize=1048576
     wsize=1048576
     hard
     timeo=600
     retrans=2
     ```

     For more information, see [Recommended NFS Mount Options](https://docs.aws.amazon.com/efs/latest/ug/mounting-fs-nfs-mount-settings.html) in the *Amazon Elastic File System User Guide*. 

1.  For **Build specification**, choose **Insert build commands**, and then choose **Switch to editor**. 

1.  Enter the following build spec commands into the editor. Replace `<file_system_identifier>` with the identifier you entered in step 17. Use capital letters (for example, `CODEBUILD_MY_EFS`).

   ```
   version: 0.2
   phases:
     install:
       runtime-versions:
         java: corretto11    
     build:
       commands:
         - mvn compile -Dgpg.skip=true -Dmaven.repo.local=$CODEBUILD_<file_system_identifier>
   ```

1.  Use the default values for all other settings, and then choose **Create build project**. When your build is complete, the console page for your project is displayed. 

1.  Choose **Start build**. 
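The identifier-to-variable naming rule from the **File systems** step can be sketched as follows. This is a minimal illustration; `efs_env_var` is a hypothetical helper, not a CodeBuild API.

```python
def efs_env_var(identifier: str) -> str:
    # CodeBuild exposes the file system as CODEBUILD_<IDENTIFIER>, in capital letters.
    return "CODEBUILD_" + identifier.upper()

print(efs_env_var("my_efs"))  # CODEBUILD_MY_EFS
```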

### Step 4: Review the build project
<a name="sample-efs-summary"></a>



 After your AWS CodeBuild project is built: 
+  You have a .jar file created by your Java application that is built to your Amazon EFS file system under your mount point directory. 
+  An environment variable that identifies your file system is created using the file system identifier you entered when you created the project. 

 For more information, see [Mounting file systems](https://docs.aws.amazon.com/efs/latest/ug/mounting-fs.html) in the *Amazon Elastic File System User Guide*. 

# Troubleshoot the Amazon EFS integration
<a name="sample-efs-troubleshooting"></a>

The following are errors you might encounter when setting up Amazon EFS with CodeBuild.

**Topics**
+ [CLIENT\_ERROR: mounting '127.0.0.1:/' failed. permission denied](#sample-efs-troubleshooting.permission-denied)
+ [CLIENT\_ERROR: mounting '127.0.0.1:/' failed. connection reset by peer](#sample-efs-troubleshooting.connection-reset)
+ [VPC\_CLIENT\_ERROR: Unexpected EC2 error: UnauthorizedOperation](#sample-efs-troubleshooting.unauthorized-operation)

## CLIENT\_ERROR: mounting '127.0.0.1:/' failed. permission denied
<a name="sample-efs-troubleshooting.permission-denied"></a>

IAM authorization is not supported for mounting Amazon EFS with CodeBuild. If you are using a custom Amazon EFS file system policy, you will need to grant read and write access to all IAM principals. For example:

```
"Principal": {
  "AWS": "*"
}
```
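For example, a complete file system policy built around this principal might look like the following sketch. This is an assumption for illustration: the `elasticfilesystem:ClientMount` and `elasticfilesystem:ClientWrite` actions grant the read and write access described above; adjust the statement to match your own security requirements.

```
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "*"
      },
      "Action": [
        "elasticfilesystem:ClientMount",
        "elasticfilesystem:ClientWrite"
      ]
    }
  ]
}
```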

## CLIENT\_ERROR: mounting '127.0.0.1:/' failed. connection reset by peer
<a name="sample-efs-troubleshooting.connection-reset"></a>

There are two possible causes for this error:
+ The CodeBuild VPC subnet is in a different Availability Zone than the Amazon EFS mount target. You can resolve this by adding a VPC subnet in the same Availability Zone as the Amazon EFS mount target.
+ The security group does not have permissions to communicate with Amazon EFS. You can resolve this by adding an inbound rule to allow all traffic from either the VPC (add the primary CIDR block for your VPC), or the security group itself.

## VPC\_CLIENT\_ERROR: Unexpected EC2 error: UnauthorizedOperation
<a name="sample-efs-troubleshooting.unauthorized-operation"></a>

This error occurs when all of the subnets in your VPC configuration for the CodeBuild project are public subnets. You must have at least one private subnet in the VPC to ensure network connectivity. 

# AWS CodePipeline samples for CodeBuild
<a name="sample-codepipeline"></a>

This section describes sample integrations between CodePipeline and CodeBuild.


| Sample | Description | 
| --- | --- | 
|  [Samples of CodePipeline/CodeBuild integrations and batch builds](#sample-pipeline-batch)  |  These samples demonstrate how to use AWS CodePipeline to create a build project that uses batch builds.  | 
|  [Sample of a CodePipeline/CodeBuild integration with multiple input sources and output artifacts](#sample-pipeline-multi-input-output)  |  This sample demonstrates how to use AWS CodePipeline to create a build project that uses multiple input sources to create multiple output artifacts.  | 

## Samples of CodePipeline/CodeBuild integrations and batch builds
<a name="sample-pipeline-batch"></a>

AWS CodeBuild supports batch builds. The following samples demonstrate how to use AWS CodePipeline to create a build project that uses batch builds.

You can use a JSON-formatted file that defines the structure of your pipeline, and then use it with the AWS CLI to create the pipeline. For more information, see [AWS CodePipeline Pipeline structure reference](https://docs.aws.amazon.com/codepipeline/latest/userguide/reference-pipeline-structure.html) in the *AWS CodePipeline User Guide*.

### Batch build with individual artifacts
<a name="sample-pipeline-batch.separate-artifacts"></a>

Use the following JSON file as an example of a pipeline structure that creates a batch build with separate artifacts. To enable batch builds in CodePipeline, set the `BatchEnabled` parameter of the `configuration` object to `true`.

```
{
  "pipeline": {
    "roleArn": "arn:aws:iam::account-id:role/my-AWS-CodePipeline-service-role-name",
    "stages": [
      {
        "name": "Source",
        "actions": [
          {
            "inputArtifacts": [],
            "name": "Source1",
            "actionTypeId": {
              "category": "Source",
              "owner": "AWS",
              "version": "1",
              "provider": "S3"
            },
            "outputArtifacts": [
              {
                "name": "source1"
              }
            ],
            "configuration": {
              "S3Bucket": "<my-input-bucket-name>",
              "S3ObjectKey": "my-source-code-file-name.zip"
            },
            "runOrder": 1
          },
          {
            "inputArtifacts": [],
            "name": "Source2",
            "actionTypeId": {
              "category": "Source",
              "owner": "AWS",
              "version": "1",
              "provider": "S3"
            },
            "outputArtifacts": [
              {
                "name": "source2"
              }
            ],
            "configuration": {
              "S3Bucket": "<my-other-input-bucket-name>",
              "S3ObjectKey": "my-other-source-code-file-name.zip"
            },
            "runOrder": 1
          }
        ]
      },
      {
        "name": "Build",
        "actions": [
          {
            "inputArtifacts": [
              {
                "name": "source1"
              },
              {
                "name": "source2"
              }
            ],
            "name": "Build",
            "actionTypeId": {
              "category": "Build",
              "owner": "AWS",
              "version": "1",
              "provider": "CodeBuild"
            },
            "outputArtifacts": [
              {
                "name": "build1"
              },
              {
                "name": "build1_artifact1"
              },
              {
                "name": "build1_artifact2"
              },
              {
                "name": "build2_artifact1"
              },
              {
                "name": "build2_artifact2"
              }
            ],
            "configuration": {
              "ProjectName": "my-build-project-name",
              "PrimarySource": "source1",
              "BatchEnabled": "true"
            },
            "runOrder": 1
          }
        ]
      }
    ],
    "artifactStore": {
      "type": "S3",
      "location": "<AWS-CodePipeline-internal-bucket-name>"
    },
    "name": "my-pipeline-name",
    "version": 1
  }
}
```

The following is an example of a CodeBuild buildspec file that will work with this pipeline configuration.

```
version: 0.2
batch:
  build-list:
    - identifier: build1
      env:
        compute-type: BUILD_GENERAL1_SMALL
    - identifier: build2
      env:
        compute-type: BUILD_GENERAL1_MEDIUM

phases:
  build:
    commands:
      - echo 'file' > output_file

artifacts:
  files:
    - output_file
  secondary-artifacts:
    artifact1:
      files:
        - output_file
    artifact2:
      files:
        - output_file
```

The names of the output artifacts specified in the pipeline's JSON file must match the identifiers of the builds and artifacts defined in your buildspec file. The syntax is *buildIdentifier* for the primary artifacts, and *buildIdentifier*\_*artifactIdentifier* for the secondary artifacts.

For example, for output artifact name `build1`, CodeBuild will upload the primary artifact of `build1` to the location of `build1`. For output name `build1_artifact1`, CodeBuild will upload the secondary artifact `artifact1` of `build1` to the location of `build1_artifact1`, and so on. If only one output location is specified, the name should be *buildIdentifier* only.
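As an illustration, the mapping between identifiers and output artifact names can be sketched as a small helper function (hypothetical; CodeBuild applies this rule internally when it uploads artifacts):

```python
def output_artifact_name(build_identifier, artifact_identifier=None):
    """Return the pipeline output artifact name for a batch build artifact."""
    # Primary artifact: buildIdentifier; secondary: buildIdentifier_artifactIdentifier.
    if artifact_identifier is None:
        return build_identifier
    return f"{build_identifier}_{artifact_identifier}"

print(output_artifact_name("build1"))               # build1
print(output_artifact_name("build1", "artifact1"))  # build1_artifact1
```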

After you create the JSON file, you can create your pipeline. Use the AWS CLI to run the **create-pipeline** command and pass the file to the `--cli-input-json` parameter. For more information, see [Create a pipeline (CLI)](https://docs.aws.amazon.com/codepipeline/latest/userguide/pipelines-create.html#pipelines-create-cli) in the *AWS CodePipeline User Guide*. 

### Batch build with combined artifacts
<a name="sample-pipeline-batch.combined-artifacts"></a>

Use the following JSON file as an example of a pipeline structure that creates a batch build with combined artifacts. To enable batch builds in CodePipeline, set the `BatchEnabled` parameter of the `configuration` object to `true`. To combine the build artifacts into the same location, set the `CombineArtifacts` parameter of the `configuration` object to `true`.

```
{
 "pipeline": {
  "roleArn": "arn:aws:iam::account-id:role/my-AWS-CodePipeline-service-role-name",
  "stages": [
    {
      "name": "Source",
      "actions": [
        {
          "inputArtifacts": [],
          "name": "Source1",
          "actionTypeId": {
            "category": "Source",
            "owner": "AWS",
            "version": "1",
            "provider": "S3"
          },
          "outputArtifacts": [
            {
              "name": "source1"
            }
          ],
          "configuration": {
            "S3Bucket": "<my-input-bucket-name>",
            "S3ObjectKey": "my-source-code-file-name.zip"
          },
          "runOrder": 1
        },
        {
          "inputArtifacts": [],
          "name": "Source2",
          "actionTypeId": {
            "category": "Source",
            "owner": "AWS",
            "version": "1",
            "provider": "S3"
          },
          "outputArtifacts": [
            {
              "name": "source2"
            }
          ],
          "configuration": {
            "S3Bucket": "<my-other-input-bucket-name>",
            "S3ObjectKey": "my-other-source-code-file-name.zip"
          },
          "runOrder": 1
        }
      ]
    },
    {
      "name": "Build",
      "actions": [
        {
          "inputArtifacts": [
            {
              "name": "source1"
            },
            {
              "name": "source2"
            }
          ],
          "name": "Build",
          "actionTypeId": {
            "category": "Build",
            "owner": "AWS",
            "version": "1",
            "provider": "CodeBuild"
          },
          "outputArtifacts": [
            {
              "name": "output1 "
            }
          ],
          "configuration": {
            "ProjectName": "my-build-project-name",
            "PrimarySource": "source1",
             "BatchEnabled": "true",
             "CombineArtifacts": "true"
          },
          "runOrder": 1
        }
      ]
    }
  ],
  "artifactStore": {
    "type": "S3",
    "location": "<AWS-CodePipeline-internal-bucket-name>"
  },
  "name": "my-pipeline-name",
  "version": 1
 }
}
```
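Note that `BatchEnabled` and `CombineArtifacts` are JSON strings (`"true"`), not booleans. A quick local check of those flags before calling **create-pipeline** might look like the following sketch, where the pipeline definition is abbreviated down to the build action's `configuration` object:

```python
import json

# Abbreviated build-action configuration from the pipeline definition above.
pipeline_json = '''
{
  "configuration": {
    "ProjectName": "my-build-project-name",
    "PrimarySource": "source1",
    "BatchEnabled": "true",
    "CombineArtifacts": "true"
  }
}
'''

config = json.loads(pipeline_json)["configuration"]

# CodePipeline expects these flags as the strings "true"/"false",
# not as JSON booleans.
for flag in ("BatchEnabled", "CombineArtifacts"):
    value = config.get(flag)
    assert value in ("true", "false"), (
        f"{flag} must be the string 'true' or 'false', got {value!r}"
    )

print("batch flags look valid")
```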

The following is an example of a CodeBuild buildspec file that will work with this pipeline configuration.

```
version: 0.2
batch:
  build-list:
    - identifier: build1
      env:
        compute-type: BUILD_GENERAL1_SMALL
    - identifier: build2
      env:
        compute-type: BUILD_GENERAL1_MEDIUM

phases:
  build:
    commands:
      - echo 'file' > output_file

artifacts:
  files:
    - output_file
```

If combined artifacts are enabled for the batch build, only one output location is allowed. CodeBuild combines the primary artifacts of all of the builds into a single ZIP file.

After you create the JSON file, you can create your pipeline. Use the AWS CLI to run the **create-pipeline** command and pass the file to the `--cli-input-json` parameter. For more information, see [Create a pipeline (CLI)](https://docs.aws.amazon.com/codepipeline/latest/userguide/pipelines-create.html#pipelines-create-cli) in the *AWS CodePipeline User Guide*. 

## Sample of a CodePipeline/CodeBuild integration with multiple input sources and output artifacts
<a name="sample-pipeline-multi-input-output"></a>

An AWS CodeBuild project can take more than one input source. It can also create more than one output artifact. This sample demonstrates how to use AWS CodePipeline to create a build project that uses multiple input sources to create multiple output artifacts. For more information, see [Multiple input sources and output artifacts sample](sample-multi-in-out.md).

You can use a JSON-formatted file that defines the structure of your pipeline, and then use it with the AWS CLI to create the pipeline. Use the following JSON file as an example of a pipeline structure that creates a build with more than one input source and more than one output artifact. Later in this sample you see how this file specifies the multiple inputs and outputs. For more information, see [CodePipeline pipeline structure reference](https://docs.aws.amazon.com/codepipeline/latest/userguide/reference-pipeline-structure.html) in the *AWS CodePipeline User Guide*.

```
{
 "pipeline": {
  "roleArn": "arn:aws:iam::account-id:role/my-AWS-CodePipeline-service-role-name",
  "stages": [
    {
      "name": "Source",
      "actions": [
        {
          "inputArtifacts": [],
          "name": "Source1",
          "actionTypeId": {
            "category": "Source",
            "owner": "AWS",
            "version": "1",
            "provider": "S3"
          },
          "outputArtifacts": [
            {
              "name": "source1"
            }
          ],
          "configuration": {
            "S3Bucket": "my-input-bucket-name",
            "S3ObjectKey": "my-source-code-file-name.zip"
          },
          "runOrder": 1
        },
        {
          "inputArtifacts": [],
          "name": "Source2",
          "actionTypeId": {
            "category": "Source",
            "owner": "AWS",
            "version": "1",
            "provider": "S3"
          },
          "outputArtifacts": [
            {
              "name": "source2"
            }
          ],
          "configuration": {
            "S3Bucket": "my-other-input-bucket-name",
            "S3ObjectKey": "my-other-source-code-file-name.zip"
          },
          "runOrder": 1
        }
      ]
    },
    {
      "name": "Build",
      "actions": [
        {
          "inputArtifacts": [
            {
              "name": "source1"
            },
            {
              "name": "source2"
            }
          ],
          "name": "Build",
          "actionTypeId": {
            "category": "Build",
            "owner": "AWS",
            "version": "1",
            "provider": "AWS CodeBuild"
          },
          "outputArtifacts": [
            {
              "name": "artifact1"
            },
            {
              "name": "artifact2"
            }
          ],
          "configuration": {
            "ProjectName": "my-build-project-name",
            "PrimarySource": "source1"
          },
          "runOrder": 1
        }
      ]
    }
  ],
  "artifactStore": {
    "type": "S3",
    "location": "AWS-CodePipeline-internal-bucket-name"
  },
  "name": "my-pipeline-name",
  "version": 1
 }
}
```

 In this JSON file: 
+ One of your input sources must be designated the `PrimarySource`. This source is the directory where CodeBuild looks for and runs your buildspec file. The keyword `PrimarySource` is used to specify the primary source in the `configuration` section of the CodeBuild stage in the JSON file. 
+ Each input source is installed in its own directory. This directory is stored in the built-in environment variable `$CODEBUILD_SRC_DIR` for the primary source and `$CODEBUILD_SRC_DIR_yourInputArtifactName` for all other sources. For the pipeline in this sample, the two input source directories are `$CODEBUILD_SRC_DIR` and `$CODEBUILD_SRC_DIR_source2`. For more information, see [Environment variables in build environments](build-env-ref-env-vars.md). 
+ The names of the output artifacts specified in the pipeline's JSON file must match the names of the secondary artifacts defined in your buildspec file. This pipeline uses the following buildspec file. For more information, see [Buildspec syntax](build-spec-ref.md#build-spec-ref-syntax). 

  ```
  version: 0.2
  
  phases:
    build:
      commands:
        - touch source1_file
        - cd $CODEBUILD_SRC_DIR_source2
        - touch source2_file
  
  artifacts:
    files:
      - '**/*'
    secondary-artifacts:
      artifact1:
        base-directory: $CODEBUILD_SRC_DIR
        files:
          - source1_file
      artifact2:
        base-directory: $CODEBUILD_SRC_DIR_source2
        files:
          - source2_file
  ```
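The directory-naming rule for input sources can be sketched as a small lookup (the helper name is hypothetical):

```python
def source_dir_env_var(input_artifact_name, primary_source):
    """Return the name of the CodeBuild environment variable that holds
    the directory for a given input artifact.

    The primary source lives in CODEBUILD_SRC_DIR; every other source
    lives in CODEBUILD_SRC_DIR_<inputArtifactName>.
    """
    if input_artifact_name == primary_source:
        return "CODEBUILD_SRC_DIR"
    return f"CODEBUILD_SRC_DIR_{input_artifact_name}"

print(source_dir_env_var("source1", primary_source="source1"))  # CODEBUILD_SRC_DIR
print(source_dir_env_var("source2", primary_source="source1"))  # CODEBUILD_SRC_DIR_source2
```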

 After you create the JSON file, you can create your pipeline. Use the AWS CLI to run the **create-pipeline** command and pass the file to the `--cli-input-json` parameter. For more information, see [Create a pipeline (CLI)](https://docs.aws.amazon.com/codepipeline/latest/userguide/pipelines-create.html#pipelines-create-cli) in the *AWS CodePipeline User Guide*. 

# AWS Config sample with CodeBuild
<a name="how-to-integrate-config"></a>

AWS Config provides an inventory of your AWS resources and a history of configuration changes to these resources. AWS Config supports AWS CodeBuild as an AWS resource, which means the service can track your CodeBuild projects. For more information about AWS Config, see [What is AWS Config?](https://docs.aws.amazon.com/config/latest/developerguide/WhatIsConfig.html) in the *AWS Config Developer Guide*.

You can see the following information about CodeBuild resources on the **Resource Inventory** page in the AWS Config console:
+ A timeline of your CodeBuild configuration changes.
+ Configuration details for each CodeBuild project.
+ Relationships with other AWS resources.
+ A list of changes to your CodeBuild projects.

**Topics**
+ [Use CodeBuild with AWS Config](#how-to-integrate-config-run)
+ [Step 3: View AWS CodeBuild data in the AWS Config console](#viewing-config-details)

## Use CodeBuild with AWS Config
<a name="how-to-integrate-config-run"></a>

The procedures in this topic show you how to set up AWS Config and look up CodeBuild projects.

**Topics**
+ [Prerequisites](#how-to-create-a-build-project)
+ [Step 1: Set up AWS Config](#setup-config)
+ [Step 2: Look up AWS CodeBuild projects](#lookup-projects)

### Prerequisites
<a name="how-to-create-a-build-project"></a>

Create your AWS CodeBuild project. For instructions, see [Create a build project](create-project.md).

### Step 1: Set up AWS Config
<a name="setup-config"></a>
+ [Setting up AWS Config (console)](https://docs.aws.amazon.com/config/latest/developerguide/gs-console.html)
+ [Setting up AWS Config (AWS CLI)](https://docs.aws.amazon.com/config/latest/developerguide/gs-cli.html)

**Note**  
After you complete setup, it might take up to 10 minutes before you can see AWS CodeBuild projects in the AWS Config console.

### Step 2: Look up AWS CodeBuild projects
<a name="lookup-projects"></a>

1. Sign in to the AWS Management Console and open the AWS Config console at [https://console.aws.amazon.com/config](https://console.aws.amazon.com/config). 

1. On the **Resource inventory** page, select **AWS CodeBuild Project** under **Resource type**. Scroll down and select the **CodeBuild project** check box.

1. Choose **Look up**.

1. After the list of CodeBuild projects appears, choose the CodeBuild project name link in the **Config timeline** column.

## Step 3: View AWS CodeBuild data in the AWS Config console
<a name="viewing-config-details"></a>

When you look up resources on the **Resource inventory** page, you can choose the AWS Config timeline to view details about your CodeBuild project. The details page for a resource provides information about the configuration, relationships, and number of changes made to that resource. 

The blocks at the top of the page are collectively called the timeline. The timeline shows the date and time that the recording was made.

For more information, see [Viewing configuration details in the AWS Config console](https://docs.aws.amazon.com/config/latest/developerguide/view-manage-resource-console.html) in the *AWS Config Developer Guide*.

# Build notifications sample for CodeBuild
<a name="sample-build-notifications"></a>

Amazon CloudWatch Events has built-in support for AWS CodeBuild. CloudWatch Events is a stream of system events describing changes in your AWS resources. With CloudWatch Events, you write declarative rules to associate events of interest with automated actions to be taken. This sample uses Amazon CloudWatch Events and Amazon Simple Notification Service (Amazon SNS) to send build notifications to subscribers whenever builds succeed, fail, go from one build phase to another, or any combination of these events.

**Important**  
Running this sample might result in charges to your AWS account. These include possible charges for CodeBuild and for AWS resources and actions related to Amazon CloudWatch and Amazon SNS. For more information, see [CodeBuild pricing](http://aws.amazon.com/codebuild/pricing), [Amazon CloudWatch pricing](http://aws.amazon.com/cloudwatch/pricing), and [Amazon SNS pricing](http://aws.amazon.com/sns/pricing).

**Topics**
+ [Run the build notifications sample](#sample-build-notifications-running)
+ [Build notifications input format reference](sample-build-notifications-ref.md)

## Run the build notifications sample
<a name="sample-build-notifications-running"></a>

Use the following procedure to run the build notifications sample.

**To run this sample**

1. If you already have a topic set up and subscribed to in Amazon SNS that you want to use for this sample, skip ahead to step 4. Otherwise, if you are using an IAM user instead of an AWS root account or an administrator user to work with Amazon SNS, add the following statement to the policy attached to the user (or to the IAM group the user is associated with). Using an AWS root account is not recommended. This statement enables viewing, creating, subscribing, and testing the sending of notifications to topics in Amazon SNS. Do not remove any existing statements from the policy.

------
#### [ JSON ]


   ```
   {
       "Version":"2012-10-17",		 	 	 
       "Statement": [
           {
               "Effect": "Allow",
               "Action": [
                   "sns:CreateTopic",
                   "sns:GetTopicAttributes",
                   "sns:List*",
                   "sns:Publish",
                   "sns:SetTopicAttributes",
                   "sns:Subscribe"
               ],
               "Resource": "*"
           }
       ]
   }
   ```

------
**Note**  
The IAM entity that modifies this policy must have permission in IAM to modify policies.  
For more information, see [Editing customer managed policies](https://docs.aws.amazon.com/IAM/latest/UserGuide/access_policies_managed-using.html#edit-managed-policy-console) or the "To edit or delete an inline policy for a group, user, or role" section in [Working with inline policies (console)](https://docs.aws.amazon.com/IAM/latest/UserGuide/access_policies_inline-using.html#AddingPermissions_Console) in the *IAM User Guide*.

1. Create or identify a topic in Amazon SNS. AWS CodeBuild uses CloudWatch Events to send build notifications to this topic through Amazon SNS. 

   To create a topic:

   1. Open the Amazon SNS console at [https://console.aws.amazon.com/sns](https://console.aws.amazon.com/sns).

   1. Choose **Create topic**. 

   1. In **Create new topic**, for **Topic name**, enter a name for the topic (for example, **CodeBuildDemoTopic**). (If you choose a different name, substitute it throughout this sample.) 

   1. Choose **Create topic**.

   1. On the **Topic details: CodeBuildDemoTopic** page, copy the **Topic ARN** value. You need this value for the next step. 

        
![\[The Topic ARN value.\]](http://docs.aws.amazon.com/codebuild/latest/userguide/images/topic-arn.png)

      

   For more information, see [Create a topic](https://docs.aws.amazon.com/sns/latest/dg/CreateTopic.html) in the *Amazon SNS Developer Guide*.

1. Subscribe one or more recipients to the topic to receive email notifications. 

   To subscribe a recipient to a topic:

   1. With the Amazon SNS console open from the previous step, in the navigation pane, choose **Subscriptions**, and then choose **Create subscription**.

   1. In **Create subscription**, for **Topic ARN**, paste the topic ARN you copied from the previous step.

   1. For **Protocol**, choose **Email**.

   1. For **Endpoint**, enter the recipient's full email address. 

        
![\[The subscription configuration.\]](http://docs.aws.amazon.com/codebuild/latest/userguide/images/create-subscription.png)

      

   1. Choose **Create Subscription**.

   1. Amazon SNS sends a subscription confirmation email to the recipient. To begin receiving email notifications, the recipient must choose the **Confirm subscription** link in that email. After the recipient chooses the link, Amazon SNS displays a confirmation message in the recipient's web browser.

   For more information, see [Subscribe to a topic](https://docs.aws.amazon.com/sns/latest/dg/SubscribeTopic.html) in the *Amazon SNS Developer Guide*.

1. If you are using an IAM user instead of an AWS root account or an administrator user to work with CloudWatch Events, add the following statement to the policy attached to the user (or to the IAM group the user is associated with). Using an AWS root account is not recommended. This statement allows the user to work with CloudWatch Events. Do not remove any existing statements from the policy.

------
#### [ JSON ]


   ```
   {
       "Version":"2012-10-17",		 	 	 
       "Statement": [
           {
               "Effect": "Allow",
               "Action": [
                   "events:*",
                   "iam:PassRole"
               ],
               "Resource": "arn:aws:iam::*:role/Service*"
           }
       ]
   }
   ```

------
**Note**  
The IAM entity that modifies this policy must have permission in IAM to modify policies.  
For more information, see [Editing customer managed policies](https://docs.aws.amazon.com/IAM/latest/UserGuide/access_policies_managed-using.html#edit-managed-policy-console) or the "To edit or delete an inline policy for a group, user, or role" section in [Working with inline policies (console)](https://docs.aws.amazon.com/IAM/latest/UserGuide/access_policies_inline-using.html#AddingPermissions_Console) in the *IAM User Guide*.

1. Create a rule in CloudWatch Events. To do this, open the CloudWatch console, at [https://console.aws.amazon.com/cloudwatch](https://console.aws.amazon.com/cloudwatch).

1. In the navigation pane, under **Events**, choose **Rules**, and then choose **Create rule**. 

1. On the **Step 1: Create rule** page, **Event Pattern** and **Build event pattern to match events by service** should already be selected. 

1. For **Service Name**, choose **CodeBuild**. For **Event Type**, **All Events** should already be selected.

1. The following code should be displayed in **Event Pattern Preview**:

   ```
   {
     "source": [ 
       "aws.codebuild"
     ]
   }
   ```

1. Choose **Edit** and replace the code in **Event Pattern Preview** with one of the following two rule patterns.

   This first rule pattern triggers an event when a build starts or completes for the specified build projects in AWS CodeBuild.

   ```
   {
     "source": [ 
       "aws.codebuild"
     ], 
     "detail-type": [
       "CodeBuild Build State Change"
     ],
     "detail": {
       "build-status": [
         "IN_PROGRESS",
         "SUCCEEDED", 
         "FAILED",
         "STOPPED" 
       ],
       "project-name": [
         "my-demo-project-1",
         "my-demo-project-2"
       ]
     }  
   }
   ```

   In the preceding rule, make the following code changes as needed.
   + To trigger an event when a build starts or completes, either leave all of the values as shown in the `build-status` array, or remove the `build-status` array altogether. 
   + To trigger an event only when a build completes, remove `IN_PROGRESS` from the `build-status` array. 
   + To trigger an event only when a build starts, remove all of the values except `IN_PROGRESS` from the `build-status` array.
   + To trigger events for all build projects, remove the `project-name` array altogether.
   + To trigger events only for individual build projects, specify the name of each build project in the `project-name` array. 

   This second rule pattern triggers an event whenever a build moves from one build phase to another for the specified build projects in AWS CodeBuild.

   ```
   {
     "source": [ 
       "aws.codebuild"
     ], 
     "detail-type": [
       "CodeBuild Build Phase Change" 
     ],
     "detail": {
       "completed-phase": [
         "SUBMITTED",
         "PROVISIONING",
         "DOWNLOAD_SOURCE",
         "INSTALL",
         "PRE_BUILD",
         "BUILD",
         "POST_BUILD",
         "UPLOAD_ARTIFACTS",
         "FINALIZING"
       ],
       "completed-phase-status": [
         "TIMED_OUT",
         "STOPPED",
         "FAILED", 
         "SUCCEEDED",
         "FAULT",
         "CLIENT_ERROR"
       ],
       "project-name": [
         "my-demo-project-1",
         "my-demo-project-2"
       ]
     }  
   }
   ```

   In the preceding rule, make the following code changes as needed.
   + To trigger an event for every build phase change (which might send up to nine notifications for each build), either leave all of the values as shown in the `completed-phase` array, or remove the `completed-phase` array altogether.
   + To trigger events only for individual build phase changes, remove from the `completed-phase` array each build phase that you do not want to trigger an event for.
   + To trigger an event for every build phase status change, either leave all of the values as shown in the `completed-phase-status` array, or remove the `completed-phase-status` array altogether.
   + To trigger events only for individual build phase status changes, remove from the `completed-phase-status` array each build phase status that you do not want to trigger an event for.
   + To trigger events for all build projects, remove the `project-name` array.
   + To trigger events for individual build projects, specify the name of each build project in the `project-name` array. 

   For more information about event patterns, see [Event Patterns](https://docs.aws.amazon.com/eventbridge/latest/userguide/filtering-examples-structure.html) in the Amazon EventBridge User Guide.

   For more information about filtering with event patterns, see [Content-based Filtering with Event Patterns](https://docs.aws.amazon.com/eventbridge/latest/userguide/content-filtering-with-event-patterns.html) in the Amazon EventBridge User Guide.
**Note**  
If you want to trigger events for both build state changes and build phase changes, you must create two separate rules: one for build state changes and another for build phase changes. If you try to combine both rules into a single rule, the combined rule might produce unexpected results or stop working altogether.

   When you have finished replacing the code, choose **Save**.

1. For **Targets**, choose **Add target**. 

1. In the list of targets, choose **SNS topic**. 

1. For **Topic**, choose the topic you identified or created earlier. 

1. Expand **Configure input**, and then choose **Input Transformer**. 

1. In the **Input Path** box, enter one of the following input paths.

   For a rule with a `detail-type` value of `CodeBuild Build State Change`, enter the following.

   ```
   {"build-id":"$.detail.build-id","project-name":"$.detail.project-name","build-status":"$.detail.build-status"}
   ```

   For a rule with a `detail-type` value of `CodeBuild Build Phase Change`, enter the following.

   ```
   {"build-id":"$.detail.build-id","project-name":"$.detail.project-name","completed-phase":"$.detail.completed-phase","completed-phase-status":"$.detail.completed-phase-status"}
   ```

   To get other types of information, see the [Build notifications input format reference](sample-build-notifications-ref.md).

1. In the **Input Template** box, enter one of the following input templates.

   For a rule with a `detail-type` value of `CodeBuild Build State Change`, enter the following.

   ```
   "Build '<build-id>' for build project '<project-name>' has reached the build status of '<build-status>'."
   ```

   For a rule with a `detail-type` value of `CodeBuild Build Phase Change`, enter the following.

   ```
   "Build '<build-id>' for build project '<project-name>' has completed the build phase of '<completed-phase>' with a status of '<completed-phase-status>'."
   ```

1. Choose **Configure details**.

1. On the **Step 2: Configure rule details** page, enter a name and an optional description. For **State**, leave **Enabled** selected.

1. Choose **Create rule**. 

1. Create build projects, run the builds, and view build information.

1. Confirm that CodeBuild is now successfully sending build notifications. For example, check to see if the build notification emails are now in your inbox.
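The input path and input template in the steps above work as a pair: the path pulls values out of the event, and the template formats them into the notification message. The transformation can be sketched locally like this; it is a simplified stand-in for the CloudWatch Events input transformer, using plain string substitution on a sample event:

```python
# Simplified stand-in for the CloudWatch Events input transformer.
sample_event = {
    "detail-type": "CodeBuild Build State Change",
    "detail": {
        "build-id": "arn:aws:codebuild:us-west-2:123456789012:build/my-sample-project:8745a7a9",
        "project-name": "my-sample-project",
        "build-status": "SUCCEEDED",
    },
}

# Input path: maps placeholder names to values pulled from the event
# (the real service uses JSONPath expressions such as $.detail.build-id).
input_path = {
    "build-id": sample_event["detail"]["build-id"],
    "project-name": sample_event["detail"]["project-name"],
    "build-status": sample_event["detail"]["build-status"],
}

# Input template: placeholders in angle brackets are replaced with the
# extracted values.
template = ("Build '<build-id>' for build project '<project-name>' "
            "has reached the build status of '<build-status>'.")

message = template
for key, value in input_path.items():
    message = message.replace(f"<{key}>", value)

print(message)
```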

To change a rule's behavior, in the CloudWatch console, choose the rule you want to change, choose **Actions**, and then choose **Edit**. Make changes to the rule, choose **Configure details**, and then choose **Update rule**.

To stop using a rule to send build notifications, in the CloudWatch console, choose the rule you want to stop using, choose **Actions**, and then choose **Disable**.

To delete a rule altogether, in the CloudWatch console, choose the rule you want to delete, choose **Actions**, and then choose **Delete**.
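An event pattern like the ones in this sample matches an event when, for every key in the pattern, the event's value appears in the pattern's array of allowed values. The following simplified matcher illustrates the idea; real EventBridge matching supports additional operators, and this sketch covers only the literal-value case used here:

```python
def pattern_matches(pattern, event):
    """Return True if the event satisfies the pattern.

    For each key in the pattern: if the pattern value is a dict, recurse
    into the corresponding event object; if it is a list, the event's
    value must be one of its members.
    """
    for key, expected in pattern.items():
        if key not in event:
            return False
        if isinstance(expected, dict):
            if not pattern_matches(expected, event[key]):
                return False
        else:  # list of allowed literal values
            if event[key] not in expected:
                return False
    return True

rule = {
    "source": ["aws.codebuild"],
    "detail-type": ["CodeBuild Build State Change"],
    "detail": {
        "build-status": ["SUCCEEDED", "FAILED"],
        "project-name": ["my-demo-project-1"],
    },
}

event = {
    "source": "aws.codebuild",
    "detail-type": "CodeBuild Build State Change",
    "detail": {"build-status": "SUCCEEDED", "project-name": "my-demo-project-1"},
}

print(pattern_matches(rule, event))  # True
```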

# Build notifications input format reference
<a name="sample-build-notifications-ref"></a>

CloudWatch Events delivers build notifications in JSON format.

Build state change notifications use the following format:

```
{
  "version": "0",
  "id": "c030038d-8c4d-6141-9545-00ff7b7153EX",
  "detail-type": "CodeBuild Build State Change",
  "source": "aws.codebuild",
  "account": "123456789012",
  "time": "2017-09-01T16:14:28Z",
  "region": "us-west-2",
  "resources":[
    "arn:aws:codebuild:us-west-2:123456789012:build/my-sample-project:8745a7a9-c340-456a-9166-edf953571bEX"
  ],
  "detail":{
    "build-status": "SUCCEEDED",
    "project-name": "my-sample-project",
    "build-id": "arn:aws:codebuild:us-west-2:123456789012:build/my-sample-project:8745a7a9-c340-456a-9166-edf953571bEX",
    "additional-information": {
      "artifact": {
        "md5sum": "da9c44c8a9a3cd4b443126e823168fEX",
        "sha256sum": "6ccc2ae1df9d155ba83c597051611c42d60e09c6329dcb14a312cecc0a8e39EX",
        "location": "arn:aws:s3:::codebuild-123456789012-output-bucket/my-output-artifact.zip"
      },
      "environment": {
        "image": "aws/codebuild/standard:5.0",
        "privileged-mode": false,
        "compute-type": "BUILD_GENERAL1_SMALL",
        "type": "LINUX_CONTAINER",
        "environment-variables": []
      },
      "timeout-in-minutes": 60,
      "build-complete": true,
      "initiator": "MyCodeBuildDemoUser",
      "build-start-time": "Sep 1, 2017 4:12:29 PM",
      "source": {
        "location": "codebuild-123456789012-input-bucket/my-input-artifact.zip",
        "type": "S3"
      },
      "logs": {
        "group-name": "/aws/codebuild/my-sample-project",
        "stream-name": "8745a7a9-c340-456a-9166-edf953571bEX",
        "deep-link": "https://console.aws.amazon.com/cloudwatch/home?region=us-west-2#logEvent:group=/aws/codebuild/my-sample-project;stream=8745a7a9-c340-456a-9166-edf953571bEX"
      },
      "phases": [
        {
          "phase-context": [],
          "start-time": "Sep 1, 2017 4:12:29 PM",
          "end-time": "Sep 1, 2017 4:12:29 PM",
          "duration-in-seconds": 0,
          "phase-type": "SUBMITTED",
          "phase-status": "SUCCEEDED"
        },
        {
          "phase-context": [],
          "start-time": "Sep 1, 2017 4:12:29 PM",
          "end-time": "Sep 1, 2017 4:13:05 PM",
          "duration-in-seconds": 36,
          "phase-type": "PROVISIONING",
          "phase-status": "SUCCEEDED"
        },
        {
          "phase-context": [],
          "start-time": "Sep 1, 2017 4:13:05 PM",
          "end-time": "Sep 1, 2017 4:13:10 PM",
          "duration-in-seconds": 4,
          "phase-type": "DOWNLOAD_SOURCE",
          "phase-status": "SUCCEEDED"
        },
        {
          "phase-context": [],
          "start-time": "Sep 1, 2017 4:13:10 PM",
          "end-time": "Sep 1, 2017 4:13:10 PM",
          "duration-in-seconds": 0,
          "phase-type": "INSTALL",
          "phase-status": "SUCCEEDED"
        },
        {
          "phase-context": [],
          "start-time": "Sep 1, 2017 4:13:10 PM",
          "end-time": "Sep 1, 2017 4:13:10 PM",
          "duration-in-seconds": 0,
          "phase-type": "PRE_BUILD",
          "phase-status": "SUCCEEDED"
        },
        {
          "phase-context": [],
          "start-time": "Sep 1, 2017 4:13:10 PM",
          "end-time": "Sep 1, 2017 4:14:21 PM",
          "duration-in-seconds": 70,
          "phase-type": "BUILD",
          "phase-status": "SUCCEEDED"
        },
        {
          "phase-context": [],
          "start-time": "Sep 1, 2017 4:14:21 PM",
          "end-time": "Sep 1, 2017 4:14:21 PM",
          "duration-in-seconds": 0,
          "phase-type": "POST_BUILD",
          "phase-status": "SUCCEEDED"
        },
        {
          "phase-context": [],
          "start-time": "Sep 1, 2017 4:14:21 PM",
          "end-time": "Sep 1, 2017 4:14:21 PM",
          "duration-in-seconds": 0,
          "phase-type": "UPLOAD_ARTIFACTS",
          "phase-status": "SUCCEEDED"
        },
        {
          "phase-context": [],
          "start-time": "Sep 1, 2017 4:14:21 PM",
          "end-time": "Sep 1, 2017 4:14:26 PM",
          "duration-in-seconds": 4,
          "phase-type": "FINALIZING",
          "phase-status": "SUCCEEDED"
        },
        {
          "start-time": "Sep 1, 2017 4:14:26 PM",
          "phase-type": "COMPLETED"
        }
      ]
    },
    "current-phase": "COMPLETED",
    "current-phase-context": "[]",
    "version": "1"
  }
}
```

Build phase change notifications use the following format:

```
{
  "version": "0",
  "id": "43ddc2bd-af76-9ca5-2dc7-b695e15adeEX",
  "detail-type": "CodeBuild Build Phase Change",
  "source": "aws.codebuild",
  "account": "123456789012",
  "time": "2017-09-01T16:14:21Z",
  "region": "us-west-2",
  "resources":[
    "arn:aws:codebuild:us-west-2:123456789012:build/my-sample-project:8745a7a9-c340-456a-9166-edf953571bEX"
  ],
  "detail":{
    "completed-phase": "COMPLETED",
    "project-name": "my-sample-project",
    "build-id": "arn:aws:codebuild:us-west-2:123456789012:build/my-sample-project:8745a7a9-c340-456a-9166-edf953571bEX",
    "completed-phase-context": "[]",
    "additional-information": {
      "artifact": {
        "md5sum": "da9c44c8a9a3cd4b443126e823168fEX",
        "sha256sum": "6ccc2ae1df9d155ba83c597051611c42d60e09c6329dcb14a312cecc0a8e39EX",
        "location": "arn:aws:s3:::codebuild-123456789012-output-bucket/my-output-artifact.zip"
      },
      "environment": {
        "image": "aws/codebuild/standard:5.0",
        "privileged-mode": false,
        "compute-type": "BUILD_GENERAL1_SMALL",
        "type": "LINUX_CONTAINER",
        "environment-variables": []
      },
      "timeout-in-minutes": 60,
      "build-complete": true,
      "initiator": "MyCodeBuildDemoUser",
      "build-start-time": "Sep 1, 2017 4:12:29 PM",
      "source": {
        "location": "codebuild-123456789012-input-bucket/my-input-artifact.zip",
        "type": "S3"
      },
      "logs": {
        "group-name": "/aws/codebuild/my-sample-project",
        "stream-name": "8745a7a9-c340-456a-9166-edf953571bEX",
        "deep-link": "https://console.aws.amazon.com/cloudwatch/home?region=us-west-2#logEvent:group=/aws/codebuild/my-sample-project;stream=8745a7a9-c340-456a-9166-edf953571bEX"
      },
      "phases": [
        {
          "phase-context": [],
          "start-time": "Sep 1, 2017 4:12:29 PM",
          "end-time": "Sep 1, 2017 4:12:29 PM",
          "duration-in-seconds": 0,
          "phase-type": "SUBMITTED",
          "phase-status": "SUCCEEDED"
        },
        {
          "phase-context": [],
          "start-time": "Sep 1, 2017 4:12:29 PM",
          "end-time": "Sep 1, 2017 4:13:05 PM",
          "duration-in-seconds": 36,
          "phase-type": "PROVISIONING",
          "phase-status": "SUCCEEDED"
        },
        {
          "phase-context": [],
          "start-time": "Sep 1, 2017 4:13:05 PM",
          "end-time": "Sep 1, 2017 4:13:10 PM",
          "duration-in-seconds": 4,
          "phase-type": "DOWNLOAD_SOURCE",
          "phase-status": "SUCCEEDED"
        },
        {
          "phase-context": [],
          "start-time": "Sep 1, 2017 4:13:10 PM",
          "end-time": "Sep 1, 2017 4:13:10 PM",
          "duration-in-seconds": 0,
          "phase-type": "INSTALL",
          "phase-status": "SUCCEEDED"
        },
        {
          "phase-context": [],
          "start-time": "Sep 1, 2017 4:13:10 PM",
          "end-time": "Sep 1, 2017 4:13:10 PM",
          "duration-in-seconds": 0,
          "phase-type": "PRE_BUILD",
          "phase-status": "SUCCEEDED"
        },
        {
          "phase-context": [],
          "start-time": "Sep 1, 2017 4:13:10 PM",
          "end-time": "Sep 1, 2017 4:14:21 PM",
          "duration-in-seconds": 70,
          "phase-type": "BUILD",
          "phase-status": "SUCCEEDED"
        },
        {
          "phase-context": [],
          "start-time": "Sep 1, 2017 4:14:21 PM",
          "end-time": "Sep 1, 2017 4:14:21 PM",
          "duration-in-seconds": 0,
          "phase-type": "POST_BUILD",
          "phase-status": "SUCCEEDED"
        },
        {
          "phase-context": [],
          "start-time": "Sep 1, 2017 4:14:21 PM",
          "end-time": "Sep 1, 2017 4:14:21 PM",
          "duration-in-seconds": 0,
          "phase-type": "UPLOAD_ARTIFACTS",
          "phase-status": "SUCCEEDED"
        },
        {
          "phase-context": [],
          "start-time": "Sep 1, 2017 4:14:21 PM",
          "end-time": "Sep 1, 2017 4:14:26 PM",
          "duration-in-seconds": 4,
          "phase-type": "FINALIZING",
          "phase-status": "SUCCEEDED"
        },
        {
          "start-time": "Sep 1, 2017 4:14:26 PM",
          "phase-type": "COMPLETED"
        }
      ]  
    },
    "completed-phase-status": "SUCCEEDED",
    "completed-phase-duration-seconds": 4,
    "version": "1",
    "completed-phase-start": "Sep 1, 2017 4:14:21 PM",
    "completed-phase-end": "Sep 1, 2017 4:14:26 PM"
  }
}
```
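The `phases` array in either payload shows where build time goes. The following sketch totals per-phase durations from a notification; the embedded payload is abbreviated for illustration, while the real messages carry the full phase list shown above:

```python
import json

# Abbreviated notification payload; only the fields this sketch reads.
notification = json.loads('''
{
  "detail": {
    "additional-information": {
      "phases": [
        {"phase-type": "PROVISIONING", "duration-in-seconds": 36},
        {"phase-type": "DOWNLOAD_SOURCE", "duration-in-seconds": 4},
        {"phase-type": "BUILD", "duration-in-seconds": 70},
        {"phase-type": "FINALIZING", "duration-in-seconds": 4},
        {"phase-type": "COMPLETED"}
      ]
    }
  }
}
''')

phases = notification["detail"]["additional-information"]["phases"]

# The trailing COMPLETED entry has no duration, so default to 0.
total = sum(p.get("duration-in-seconds", 0) for p in phases)
for p in phases:
    print(f"{p['phase-type']}: {p.get('duration-in-seconds', 0)}s")
print(f"total: {total}s")  # total: 114s
```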