

# Integrating with Amazon Redshift
<a name="RedshiftforDynamoDB"></a>

Amazon Redshift is a fast, fully managed, petabyte-scale data warehouse service that makes it simple and cost-effective to efficiently analyze all your data using your existing business intelligence tools.

DynamoDB and Amazon Redshift can be used together to address different data storage and processing needs within an application or data ecosystem.

See the following topics for details on how to integrate DynamoDB with Amazon Redshift.

**Topics**
+ [Cross-account integration considerations with CMK](cross-account-integration-considerations.md)
+ [DynamoDB zero-ETL integration with Amazon Redshift](RedshiftforDynamoDB-zero-etl.md)
+ [Loading data from DynamoDB into Amazon Redshift with the COPY command](RedshiftforDynamoDB-copy-data.md)

# Cross-account integration considerations with CMK
<a name="cross-account-integration-considerations"></a>

When you integrate from DynamoDB to Amazon Redshift, the initial action is launched from Amazon Redshift. Without the proper permissions, the integration can fail silently. The following sections detail the permissions required for this cross-account integration.

## Required AWS KMS policies and permissions
<a name="required-kms-policies-permissions"></a>

Replace the following placeholders in the examples:
+ `111122223333`: The AWS account ID where Amazon Redshift is hosted
+ `444455556666`: The AWS account ID where DynamoDB is hosted
+ `REDSHIFT_ROLE_NAME`: The IAM role name used by Amazon Redshift
+ `REGION`: The AWS Region where your resources are located
+ `TABLE_NAME`: The name of your DynamoDB table
+ `KMS_KEY_ID`: The ID of your KMS key

### KMS key policy in the DynamoDB account
<a name="kms-key-policy-dynamodb-account"></a>

The following AWS KMS key policy enables cross-account access between your DynamoDB and Amazon Redshift services. In this example, account 444455556666 contains the DynamoDB table and AWS KMS key, while account 111122223333 contains the Amazon Redshift cluster that needs access to decrypt the data.

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "Enable IAM User Permissions",
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::444455556666:root"
            },
            "Action": "kms:*",
            "Resource": "*"
        },
        {
            "Sid": "Allow Redshift to use the key",
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::111122223333:role/REDSHIFT_ROLE_NAME"
            },
            "Action": [
                "kms:Decrypt",
                "kms:DescribeKey",
                "kms:GenerateDataKey",
                "kms:GenerateDataKeyWithoutPlaintext"
            ],
            "Resource": "*"
        }
    ]
}
```

------
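
If you manage key policies in code, the same document can be assembled programmatically. The following Python sketch builds the key policy above from the example account IDs and role-name placeholder; the `build_kms_key_policy` helper is illustrative, and the commented-out `put_key_policy` call shows where the AWS SDK for Python (boto3) would apply it:

```python
import json

def build_kms_key_policy(dynamodb_account, redshift_account, redshift_role_name):
    """Build the cross-account key policy granting the Redshift role decrypt access."""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "Enable IAM User Permissions",
                "Effect": "Allow",
                "Principal": {"AWS": f"arn:aws:iam::{dynamodb_account}:root"},
                "Action": "kms:*",
                "Resource": "*",
            },
            {
                "Sid": "Allow Redshift to use the key",
                "Effect": "Allow",
                "Principal": {
                    "AWS": f"arn:aws:iam::{redshift_account}:role/{redshift_role_name}"
                },
                "Action": [
                    "kms:Decrypt",
                    "kms:DescribeKey",
                    "kms:GenerateDataKey",
                    "kms:GenerateDataKeyWithoutPlaintext",
                ],
                "Resource": "*",
            },
        ],
    }

# Placeholders mirror the example account IDs and role name above.
policy_json = json.dumps(build_kms_key_policy(
    "444455556666", "111122223333", "REDSHIFT_ROLE_NAME"))
# To apply it in the DynamoDB account:
# import boto3
# boto3.client("kms").put_key_policy(
#     KeyId="KMS_KEY_ID", PolicyName="default", Policy=policy_json)
```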

### IAM Policy for the Amazon Redshift role (in Amazon Redshift account)
<a name="iam-policy-redshift-role"></a>

The following IAM policy allows the Amazon Redshift service to access DynamoDB tables and their associated AWS KMS encryption keys in a cross-account scenario. In this example, account 444455556666 contains the DynamoDB resources and AWS KMS keys that the Amazon Redshift service needs to access.

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowDynamoDBAccess",
            "Effect": "Allow",
            "Action": [
                "dynamodb:DescribeTable",
                "dynamodb:BatchGetItem",
                "dynamodb:Scan",
                "dynamodb:Query",
                "dynamodb:GetItem",
                "dynamodb:GetRecords",
                "dynamodb:GetShardIterator",
                "dynamodb:DescribeStream",
                "dynamodb:ListStreams"
            ],
            "Resource": [
                "arn:aws:dynamodb:*:444455556666:table/TABLE_NAME",
                "arn:aws:dynamodb:*:444455556666:table/TABLE_NAME/stream/*"
            ]
        },
        {
            "Sid": "AllowKMSAccess",
            "Effect": "Allow",
            "Action": [
                "kms:Decrypt",
                "kms:DescribeKey",
                "kms:GenerateDataKey",
                "kms:GenerateDataKeyWithoutPlaintext"
            ],
            "Resource": "arn:aws:kms:REGION:444455556666:key/KMS_KEY_ID"
        }
    ]
}
```

------

### Trust relationship for the Amazon Redshift role
<a name="trust-relationship-redshift-role"></a>

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "Service": "redshift.amazonaws.com"
            },
            "Action": "sts:AssumeRole"
        }
    ]
}
```

------
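
The trust policy above is supplied when the role is created. A minimal sketch, with the role name as a placeholder and the boto3 `create_role` call shown as a comment:

```python
import json

# Trust policy allowing the Amazon Redshift service to assume the role.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {"Service": "redshift.amazonaws.com"},
            "Action": "sts:AssumeRole",
        }
    ],
}
trust_policy_json = json.dumps(trust_policy)

# To create the role in the Amazon Redshift account:
# import boto3
# boto3.client("iam").create_role(
#     RoleName="REDSHIFT_ROLE_NAME",
#     AssumeRolePolicyDocument=trust_policy_json,
# )
```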

### DynamoDB Table policy (if using resource-based policies)
<a name="dynamodb-table-policy"></a>

The following resource-based policy allows the Amazon Redshift service in account 111122223333 to access DynamoDB tables and streams in account 444455556666. Attach this policy to your DynamoDB table to enable cross-account access.

------
#### [ JSON ]

****  

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowRedshiftAccess",
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::111122223333:role/REDSHIFT_ROLE_NAME"
            },
            "Action": [
                "dynamodb:DescribeTable",
                "dynamodb:BatchGetItem",
                "dynamodb:Scan",
                "dynamodb:Query",
                "dynamodb:GetItem",
                "dynamodb:GetRecords",
                "dynamodb:GetShardIterator",
                "dynamodb:DescribeStream",
                "dynamodb:ListStreams"
            ],
            "Resource": [
                "arn:aws:dynamodb:*:444455556666:table/TABLE_NAME",
                "arn:aws:dynamodb:*:444455556666:table/TABLE_NAME/stream/*"
            ]
        }
    ]
}
```

------
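
A resource-based policy like the one above can be attached with DynamoDB's `PutResourcePolicy` API. The following sketch builds the document from the example placeholders; the boto3 call is shown as a comment:

```python
import json

def build_table_policy(redshift_account, role_name, dynamodb_account, table_name):
    """Build the table policy granting the Redshift role read access to the table and its stream."""
    principal = f"arn:aws:iam::{redshift_account}:role/{role_name}"
    table_arn = f"arn:aws:dynamodb:*:{dynamodb_account}:table/{table_name}"
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "AllowRedshiftAccess",
                "Effect": "Allow",
                "Principal": {"AWS": principal},
                "Action": [
                    "dynamodb:DescribeTable",
                    "dynamodb:BatchGetItem",
                    "dynamodb:Scan",
                    "dynamodb:Query",
                    "dynamodb:GetItem",
                    "dynamodb:GetRecords",
                    "dynamodb:GetShardIterator",
                    "dynamodb:DescribeStream",
                    "dynamodb:ListStreams",
                ],
                "Resource": [table_arn, f"{table_arn}/stream/*"],
            }
        ],
    }

policy = build_table_policy("111122223333", "REDSHIFT_ROLE_NAME",
                            "444455556666", "TABLE_NAME")
policy_json = json.dumps(policy)
# To attach it to the table in the DynamoDB account:
# import boto3
# boto3.client("dynamodb").put_resource_policy(
#     ResourceArn="arn:aws:dynamodb:REGION:444455556666:table/TABLE_NAME",
#     Policy=policy_json,
# )
```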

## Important considerations
<a name="important-considerations"></a>

1. Ensure the KMS key is in the same region as your DynamoDB table.

1. The KMS key must be a customer managed key (CMK), not an AWS managed key.

1. If you're using DynamoDB global tables, configure permissions for all relevant regions.

1. Consider adding condition statements to restrict access based on VPC endpoints or IP ranges.

1. For enhanced security, consider using `aws:PrincipalOrgID` condition to restrict access to your organization.

1. Monitor KMS key usage through CloudTrail and CloudWatch metrics.
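
For example, the `aws:PrincipalOrgID` recommendation can be applied by adding a condition to the statement that grants the Redshift role access. A minimal sketch, assuming a hypothetical organization ID:

```python
# Example statement from the key policy above (trimmed to two actions).
statement = {
    "Sid": "Allow Redshift to use the key",
    "Effect": "Allow",
    "Principal": {"AWS": "arn:aws:iam::111122223333:role/REDSHIFT_ROLE_NAME"},
    "Action": ["kms:Decrypt", "kms:DescribeKey"],
    "Resource": "*",
}

def restrict_to_org(stmt, org_id):
    """Add an aws:PrincipalOrgID condition to a policy statement in place."""
    cond = stmt.setdefault("Condition", {})
    cond.setdefault("StringEquals", {})["aws:PrincipalOrgID"] = org_id
    return stmt

# "o-exampleorgid" is a placeholder organization ID.
restrict_to_org(statement, "o-exampleorgid")
```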

# DynamoDB zero-ETL integration with Amazon Redshift
<a name="RedshiftforDynamoDB-zero-etl"></a>

Amazon DynamoDB zero-ETL integration with Amazon Redshift enables seamless analytics on DynamoDB data without any coding. This fully-managed feature automatically replicates DynamoDB tables into an Amazon Redshift database so users can run SQL queries and analytics on their DynamoDB data without having to set up complex ETL processes. The integration works by replicating data from the DynamoDB table to the Amazon Redshift database. 

To set up the integration, simply specify a DynamoDB table as the source and an Amazon Redshift database as the target. On activation, the integration exports the full DynamoDB table to populate the Amazon Redshift database. The time it takes for this initial process to complete depends on the DynamoDB table size. The zero-ETL integration then incrementally replicates updates from DynamoDB to Amazon Redshift every 15-30 minutes using DynamoDB incremental exports. This means the replicated DynamoDB data in Amazon Redshift is kept up-to-date automatically. 

Once configured, users can analyze the DynamoDB data in Amazon Redshift using standard SQL clients and tools, without impacting DynamoDB table performance. By eliminating cumbersome ETL, this zero-ETL integration provides a fast, easy way to unlock insights from DynamoDB through Amazon Redshift analytics and machine learning capabilities. 

**Topics**
+ [Prerequisites before creating a DynamoDB zero-ETL integration with Amazon Redshift](#RedshiftforDynamoDB-zero-etl-prereqs)
+ [Limitations when using DynamoDB zero-ETL integrations with Amazon Redshift](#RedshiftforDynamoDB-zero-etl-limitations)
+ [Creating a DynamoDB zero-ETL integration with Amazon Redshift](RedshiftforDynamoDB-zero-etl-getting-started.md)
+ [Viewing DynamoDB zero-ETL integrations with Amazon Redshift](RedshiftforDynamoDB-zero-etl-viewing.md)
+ [Deleting DynamoDB zero-ETL integrations with Amazon Redshift](RedshiftforDynamoDB-zero-etl-deleting.md)

## Prerequisites before creating a DynamoDB zero-ETL integration with Amazon Redshift
<a name="RedshiftforDynamoDB-zero-etl-prereqs"></a>

1.  You must have your source DynamoDB table and target Amazon Redshift cluster created before creating an integration. This information is covered in [Step 1: Configuring a source DynamoDB table](RedshiftforDynamoDB-zero-etl-getting-started.md#RedshiftforDynamoDB-zero-etl-getting-started-configuring) and [Step 2: Creating an Amazon Redshift data warehouse](RedshiftforDynamoDB-zero-etl-getting-started.md#RedshiftforDynamoDB-zero-etl-getting-started-creating). 

1.  A zero-ETL integration between Amazon DynamoDB and Amazon Redshift requires your source DynamoDB table to have [Point-in-time recovery (PITR)](Point-in-time-recovery.md) enabled.

1. For **resource-based policies**, the zero-ETL integration requires a resource-based policy attached directly to your DynamoDB table. This inline policy grants the Amazon Redshift service permission to access your table data for replication. For more information about resource-based policies for DynamoDB, see [Using resource-based policies for DynamoDB](access-control-resource-based.md).

   If you create the integration where your DynamoDB table and Amazon Redshift data warehouse are in the same account, you can use the **Fix it for me** option during the create integration step to automatically apply the required resource policies to both DynamoDB and Amazon Redshift.

   If you create an integration where your DynamoDB table and Amazon Redshift data warehouse are in different AWS accounts, you will need to manually apply the following resource policy on your DynamoDB table.

------
#### [ JSON ]

****  

   ```
   {
       "Version": "2012-10-17",
       "Statement": [
           {
               "Sid": "StatementthatallowsAmazonRedshiftservicetoDescribeTableandExportTable",
               "Effect": "Allow",
               "Principal": {
                   "Service": "redshift.amazonaws.com"
               },
               "Action": [
                   "dynamodb:ExportTableToPointInTime",
                   "dynamodb:DescribeTable"
               ],
               "Resource": "*",
               "Condition": {
                   "StringEquals": {
                       "aws:SourceAccount": "111122223333"
                   },
                   "ArnEquals": {
                       "aws:SourceArn": "arn:aws:redshift:us-east-1:111122223333:integration:*"
                   }
               }
           },
           {
               "Sid": "StatementthatallowsAmazonRedshiftservicetoDescribeExport",
               "Effect": "Allow",
               "Principal": {
                   "Service": "redshift.amazonaws.com"
               },
               "Action": "dynamodb:DescribeExport",
               "Resource": "arn:aws:dynamodb:us-east-1:111122223333:table/table-name/export/*",
               "Condition": {
                   "StringEquals": {
                       "aws:SourceAccount": "111122223333"
                   },
                   "ArnEquals": {
                       "aws:SourceArn": "arn:aws:redshift:us-east-1:111122223333:integration:*"
                   }
               }
           }
       ]
   }
   ```

------

   You may also need to configure the resource policy on your Amazon Redshift data warehouse. For more information, see [Configure authorization using the Amazon Redshift API](https://docs.aws.amazon.com/redshift/latest/mgmt/zero-etl-using.redshift-iam.html#zero-etl-using.resource-policies).

1. For **identity-based policies**:

   1.  The user creating the integration requires an identity-based policy that authorizes the following actions: `GetResourcePolicy`, `PutResourcePolicy`, and `UpdateContinuousBackups`.
**Note**  
 The following policy examples show the resource as `arn:aws:redshift{-serverless}`. This indicates that the ARN can be either `arn:aws:redshift` or `arn:aws:redshift-serverless`, depending on whether your namespace is an Amazon Redshift cluster or an Amazon Redshift Serverless namespace. 

------
#### [ JSON ]

****  

      ```
      {
          "Version": "2012-10-17",
          "Statement": [
              {
                  "Effect": "Allow",
                  "Action": [
                      "dynamodb:ListTables"
                  ],
                  "Resource": "*"
              },
              {
                  "Effect": "Allow",
                  "Action": [
                      "dynamodb:GetResourcePolicy",
                      "dynamodb:PutResourcePolicy",
                      "dynamodb:UpdateContinuousBackups"
                  ],
                  "Resource": [
                      "arn:aws:dynamodb:us-east-1:111122223333:table/table-name"
                  ]
              },
              {
                  "Sid": "AllowRedshiftDescribeIntegration",
                  "Effect": "Allow",
                  "Action": [
                      "redshift:DescribeIntegrations"
                  ],
                  "Resource": "*"
              },
              {
                  "Sid": "AllowRedshiftCreateIntegration",
                  "Effect": "Allow",
                  "Action": "redshift:CreateIntegration",
                  "Resource": "arn:aws:redshift:us-east-1:111122223333:integration:*"
              },
              {
                  "Sid": "AllowRedshiftModifyDeleteIntegration",
                  "Effect": "Allow",
                  "Action": [
                      "redshift:ModifyIntegration",
                      "redshift:DeleteIntegration"
                  ],
                  "Resource": "arn:aws:redshift:us-east-1:111122223333:integration:uuid"
              },
              {
                  "Sid": "AllowRedshiftCreateInboundIntegration",
                  "Effect": "Allow",
                  "Action": "redshift:CreateInboundIntegration",
                  "Resource": "arn:aws:redshift:us-east-1:111122223333:namespace:uuid"
              }
          ]
      }
      ```

------

   1.  The user responsible for configuring the destination Amazon Redshift namespace requires an identity-based policy that authorizes the following actions: `PutResourcePolicy`, `DeleteResourcePolicy`, and `GetResourcePolicy`.

------
#### [ JSON ]

****  

      ```
      {
          "Version": "2012-10-17",
          "Statement": [
              {
                  "Effect": "Allow",
                  "Action": [
                      "redshift:PutResourcePolicy",
                      "redshift:DeleteResourcePolicy",
                      "redshift:GetResourcePolicy"
                  ],
                  "Resource": [
                      "arn:aws:redshift:us-east-1:111122223333:cluster:cluster-name"
                  ]
              },
              {
                  "Effect": "Allow",
                  "Action": [
                      "redshift:DescribeInboundIntegrations"
                  ],
                  "Resource": [
                      "arn:aws:redshift:us-east-1:111122223333:cluster:cluster-name"
                  ]
              }
          ]
      }
      ```

------

1. **Encryption key permissions**  
   If the source DynamoDB table is encrypted with a customer managed AWS KMS key, add the following statement to your KMS key policy. This statement allows Amazon Redshift to export data from your encrypted table using your KMS key.

   ```
   {
       "Sid": "Statement to allow Amazon Redshift service to perform Decrypt operation on the source DynamoDB Table",
       "Effect": "Allow",
       "Principal": {
           "Service": [
               "redshift.amazonaws.com"
           ]
       },
       "Action": "kms:Decrypt",
       "Resource": "*",
       "Condition": {
           "StringEquals": {
               "aws:SourceAccount": "<account>"
           },
           "ArnEquals": {
               "aws:SourceArn": "arn:aws:redshift:<region>:<account>:integration:*"
           }
       }
   }
   ```

 You can also follow the steps in [Getting started with zero-ETL integrations](https://docs.aws.amazon.com/redshift/latest/mgmt/zero-etl-using.setting-up.html#zero-etl-using.redshift-iam) in the *Amazon Redshift Management Guide* to configure the permissions of the Amazon Redshift namespace. 

## Limitations when using DynamoDB zero-ETL integrations with Amazon Redshift
<a name="RedshiftforDynamoDB-zero-etl-limitations"></a>

 The following general limitations apply to the current release of this integration. These limitations can change in subsequent releases. 

**Note**  
In addition to the limitations below, review the general considerations for zero-ETL integrations in [Considerations when using zero-ETL integrations with Amazon Redshift](https://docs.aws.amazon.com/redshift/latest/mgmt/zero-etl.reqs-lims.html) in the *Amazon Redshift Management Guide*.
+ The DynamoDB table and Amazon Redshift cluster need to be in the same Region.
+ The source DynamoDB table must be encrypted with either an AWS owned key or a customer managed AWS KMS key. AWS managed keys are not supported for the source DynamoDB table.

# Creating a DynamoDB zero-ETL integration with Amazon Redshift
<a name="RedshiftforDynamoDB-zero-etl-getting-started"></a>

 Before creating a zero-ETL integration, you must first set up your source DynamoDB table and then the target Amazon Redshift data warehouse. 

## Step 1: Configuring a source DynamoDB table
<a name="RedshiftforDynamoDB-zero-etl-getting-started-configuring"></a>

To create a zero-ETL integration with Amazon Redshift, you need to enable point-in-time recovery (PITR) on your table. If you do not have PITR turned on, the console can fix this for you during the integration setup process. For details on how to enable PITR, see [Point-in-time recovery](https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/PointInTimeRecovery_Howitworks.html).
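
PITR can also be enabled with the SDK. A minimal sketch of the `UpdateContinuousBackups` request, where `"Music"` is a hypothetical table name and the boto3 call is shown as a comment:

```python
# Request payload that turns on point-in-time recovery for a table.
pitr_request = {
    "TableName": "Music",  # hypothetical table name
    "PointInTimeRecoverySpecification": {"PointInTimeRecoveryEnabled": True},
}
# import boto3
# boto3.client("dynamodb").update_continuous_backups(**pitr_request)
```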

## Step 2: Creating an Amazon Redshift data warehouse
<a name="RedshiftforDynamoDB-zero-etl-getting-started-creating"></a>

If you don't already have an Amazon Redshift data warehouse, you can create one. To create an Amazon Redshift Serverless workgroup, see [Creating a workgroup with a namespace](https://docs.aws.amazon.com/redshift/latest/mgmt/serverless-console-workgroups-create-workgroup-wizard.html). To create an Amazon Redshift cluster, see [Creating a cluster](https://docs.aws.amazon.com/redshift/latest/mgmt/create-cluster.html). 

The target Amazon Redshift workgroup or cluster must have the `enable_case_sensitive_identifier` parameter turned on for the integration to be successful. For more information on enabling case sensitivity, see [Turn on case sensitivity for your data warehouse](https://docs.aws.amazon.com/redshift/latest/mgmt/zero-etl-setting-up.case-sensitivity.html) in the *Amazon Redshift Management Guide*. 
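
For a provisioned cluster, the parameter is set on the cluster's parameter group. A minimal sketch of the request, assuming a hypothetical parameter group name (`modify_cluster_parameter_group` is the boto3 operation; Amazon Redshift Serverless sets the same parameter on the workgroup instead):

```python
# Parameter change enabling case-sensitive identifiers on a cluster
# parameter group. The group name below is a placeholder.
case_sensitivity_params = {
    "ParameterGroupName": "my-integration-params",  # hypothetical
    "Parameters": [
        {
            "ParameterName": "enable_case_sensitive_identifier",
            "ParameterValue": "true",
        }
    ],
}
# import boto3
# boto3.client("redshift").modify_cluster_parameter_group(**case_sensitivity_params)
```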

 After the Amazon Redshift workgroup or cluster setup is complete, you need to configure your data warehouse. See [Zero-ETL integrations](https://docs.aws.amazon.com/redshift/latest/mgmt/zero-etl-using.html) in the *Amazon Redshift Management Guide* for more information.

## Step 3: Creating a DynamoDB zero-ETL integration
<a name="RedshiftforDynamoDB-zero-etl-getting-started-creating-zetl"></a>

Before you create a zero-ETL integration, make sure to complete the tasks in [Prerequisites before creating a DynamoDB zero-ETL integration with Amazon Redshift](RedshiftforDynamoDB-zero-etl.md#RedshiftforDynamoDB-zero-etl-prereqs). Creating an integration between DynamoDB and Amazon Redshift is a two-step process: first create an integration from the DynamoDB console, and then attach an Amazon Redshift database to the newly created integration. 

**Create a zero-ETL integration**

1.  Sign in to the AWS Management Console and open the Amazon DynamoDB console at [https://console.aws.amazon.com/dynamodbv2](https://console.aws.amazon.com/dynamodbv2). 

1.  In the navigation pane, choose **Integrations**. 

1. Select **Create zero-ETL integration** and choose **Amazon Redshift**.

1. This will take you to the **Amazon Redshift console**. To continue with the procedure, see the **DynamoDB section** in [Create a zero-ETL integration for DynamoDB](https://docs.aws.amazon.com/redshift/latest/mgmt/zero-etl-setting-up.create-integration-ddb.html).

# Viewing DynamoDB zero-ETL integrations with Amazon Redshift
<a name="RedshiftforDynamoDB-zero-etl-viewing"></a>

 You can view the details of a zero-ETL integration to see its configuration information and current status. 

**To view the details of a zero-ETL integration in the Amazon DynamoDB console:**

1.  Sign in to the AWS Management Console and open the Amazon DynamoDB console at [https://console.aws.amazon.com/dynamodbv2](https://console.aws.amazon.com/dynamodbv2). 

1.  In the DynamoDB console, choose **Integrations**. 

1.  In the **Zero-ETL integration** pane, select the zero-ETL integration you want to view. 

**To view the details of a zero-ETL integration in the Amazon Redshift console:**

1.  Sign in to the AWS Management Console and open the Amazon Redshift console at [https://console.aws.amazon.com/redshiftv2](https://console.aws.amazon.com/redshiftv2). 

1. Follow the steps at [Viewing zero-ETL integrations](https://docs.aws.amazon.com/redshift/latest/mgmt/zero-etl-using.describing.html).

**Note**  
 The possible statuses of a zero-ETL integration with Amazon Redshift are listed in [Viewing zero-ETL integrations](https://docs.aws.amazon.com/redshift/latest/mgmt/zero-etl-using.describing.html) in the *Amazon Redshift Management Guide*.

# Deleting DynamoDB zero-ETL integrations with Amazon Redshift
<a name="RedshiftforDynamoDB-zero-etl-deleting"></a>

 When you delete a zero-ETL integration, your data isn't deleted from DynamoDB or Amazon Redshift, but DynamoDB stops sending data from your source table to the Amazon Redshift target. 

**To delete a zero-ETL integration**

1.  Sign in to the AWS Management Console and open the Amazon DynamoDB console at [https://console.aws.amazon.com/dynamodbv2](https://console.aws.amazon.com/dynamodbv2). 

1.  In the DynamoDB console, choose **Integrations**. 

1.  In the **Zero-ETL integration** pane, select the zero-ETL integration you want to delete. 

1.  Choose **Manage**. This will take you to the integration details page.

1.  To confirm the deletion, choose **Delete**. 

# Loading data from DynamoDB into Amazon Redshift with the COPY command
<a name="RedshiftforDynamoDB-copy-data"></a>



Amazon Redshift complements Amazon DynamoDB with advanced business intelligence capabilities and a powerful SQL-based interface. When you copy data from a DynamoDB table into Amazon Redshift, you can perform complex data analysis queries on that data, including joins with other tables in your Amazon Redshift cluster.

In terms of provisioned throughput, a copy operation from a DynamoDB table counts against that table's read capacity. After the data is copied, your SQL queries in Amazon Redshift do not affect DynamoDB in any way. This is because your queries act upon a copy of the data from DynamoDB, rather than upon DynamoDB itself.

Before you can load data from a DynamoDB table, you must first create an Amazon Redshift table to serve as the destination for the data. Keep in mind that you are copying data from a NoSQL environment into a SQL environment, and that there are certain rules in one environment that do not apply in the other. Here are some of the differences to consider:
+ DynamoDB table names can contain up to 255 characters, including '.' (dot) and '-' (dash) characters, and are case-sensitive. Amazon Redshift table names are limited to 127 characters, cannot contain dots or dashes, and are not case-sensitive. In addition, table names cannot conflict with any Amazon Redshift reserved words.
+ DynamoDB does not support the SQL concept of NULL. You need to specify how Amazon Redshift interprets empty or blank attribute values in DynamoDB, treating them either as NULLs or as empty fields.
+ DynamoDB data types do not correspond directly with those of Amazon Redshift. You need to ensure that each column in the Amazon Redshift table is of the correct data type and size to accommodate the data from DynamoDB.

Here is an example COPY command from Amazon Redshift SQL:

```
copy favoritemovies from 'dynamodb://my-favorite-movies-table'
credentials 'aws_access_key_id=<Your-Access-Key-ID>;aws_secret_access_key=<Your-Secret-Access-Key>'
readratio 50;
```

In this example, the source table in DynamoDB is `my-favorite-movies-table`. The target table in Amazon Redshift is `favoritemovies`. The `readratio 50` clause regulates the percentage of provisioned throughput that is consumed; in this case, the COPY command will use no more than 50 percent of the read capacity units provisioned for `my-favorite-movies-table`. We highly recommend setting this ratio to a value less than the average unused provisioned throughput.
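
If you assemble COPY statements programmatically, the pieces above can be combined in a short helper. This sketch uses an `iam_role` clause, which Amazon Redshift's COPY command supports as an alternative to embedding access keys; the role ARN and table names are placeholders:

```python
def build_copy_command(target_table, dynamodb_table, iam_role_arn, readratio):
    """Assemble a COPY-from-DynamoDB statement using IAM role credentials."""
    return (
        f"copy {target_table} from 'dynamodb://{dynamodb_table}' "
        f"iam_role '{iam_role_arn}' "
        f"readratio {readratio};"
    )

# Placeholders mirror the example above; the role ARN is hypothetical.
sql = build_copy_command(
    "favoritemovies",
    "my-favorite-movies-table",
    "arn:aws:iam::111122223333:role/RedshiftCopyRole",
    50,
)
```

The resulting string would be submitted through any SQL client connected to the cluster, or through the Amazon Redshift Data API.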

For detailed instructions on loading data from DynamoDB into Amazon Redshift, refer to the following sections in the *Amazon Redshift Database Developer Guide*:
+ [Loading data from a DynamoDB table](https://docs.aws.amazon.com/redshift/latest/dg/t_Loading-data-from-dynamodb.html)
+ [The COPY command](https://docs.aws.amazon.com/redshift/latest/dg/r_COPY.html)
+ [COPY examples](https://docs.aws.amazon.com/redshift/latest/dg/r_COPY_command_examples.html)