
Run an ETL/ELT workflow using Step Functions and the Amazon Redshift API

This sample project demonstrates how to use Step Functions and the Amazon Redshift Data API to run an ETL/ELT workflow that loads data into the Amazon Redshift data warehouse.

In this project, Step Functions uses an AWS Lambda function and the Amazon Redshift Data API to create the required database objects and generate a set of example data, then runs two jobs in parallel that load the dimension tables. Once both dimension load jobs complete successfully, Step Functions runs the load job for the fact table, runs the validation job, and then pauses the Amazon Redshift cluster.
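The workflow talks to Amazon Redshift through the Redshift Data API, which runs SQL statements asynchronously. The following is a minimal sketch, not the sample project's actual Lambda code, of how a Lambda function in such a workflow might submit a load statement and poll for its completion. The cluster identifier, database name, and secret ARN shown are placeholders for your own resources.

```python
import time
import boto3

# Hypothetical identifiers -- replace with your own cluster, database, and secret.
CLUSTER_ID = "my-redshift-cluster"
DATABASE = "dev"
SECRET_ARN = "arn:aws:secretsmanager:us-east-1:123456789012:secret:redshift-creds"

redshift_data = boto3.client("redshift-data")

def run_statement(sql: str) -> str:
    """Submit a SQL statement through the Redshift Data API and wait for it to finish."""
    response = redshift_data.execute_statement(
        ClusterIdentifier=CLUSTER_ID,
        Database=DATABASE,
        SecretArn=SECRET_ARN,
        Sql=sql,
    )
    statement_id = response["Id"]

    # The Data API is asynchronous, so poll describe_statement until the run ends.
    while True:
        status = redshift_data.describe_statement(Id=statement_id)["Status"]
        if status in ("FINISHED", "FAILED", "ABORTED"):
            return status
        time.sleep(2)
```

Each dimension load, the fact table load, and the validation query could call a helper like this; the parallel branching itself is expressed in the state machine definition rather than in Lambda code.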

Note

You can modify the ETL logic to receive data from other sources, such as Amazon S3, in which case you can use the COPY command to load data from Amazon S3 into an Amazon Redshift table.
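If you adapt the workflow to load from Amazon S3, the same Data API call can carry a COPY statement. A minimal sketch, assuming a table, bucket, and IAM role of your own (every name below is a placeholder):

```python
import boto3

redshift_data = boto3.client("redshift-data")

# Hypothetical table, S3 location, and IAM role -- substitute your own values.
copy_sql = """
    COPY sales_staging
    FROM 's3://my-etl-bucket/staging/sales/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftCopyRole'
    FORMAT AS CSV
    IGNOREHEADER 1;
"""

redshift_data.execute_statement(
    ClusterIdentifier="my-redshift-cluster",  # placeholder cluster
    Database="dev",                           # placeholder database
    SecretArn="arn:aws:secretsmanager:us-east-1:123456789012:secret:redshift-creds",
    Sql=copy_sql,
)
```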

For more information about Amazon Redshift and Step Functions service integrations, see the following guides:

For more information about IAM policies for Lambda and Amazon Redshift, see the following guides:

Note

This sample project may incur charges.

For new AWS users, a free usage tier is available. On this tier, services are free below a certain level of usage. For more information about AWS costs and the Free Tier, see AWS Step Functions pricing.

Step 1: Create the state machine

  1. Open the Step Functions console and choose Create state machine.

  2. Find and choose the starter template you want to work with. Choose Next to continue.

  3. Choose Run a demo to create a read-only and ready-to-deploy workflow, or choose Build on it to create an editable state machine definition that you can build on and later deploy.

  4. Choose Use template to continue with your selection.

Next steps depend on your previous choice:

  1. Run a demo – You can review the state machine before you create a read-only project with resources deployed by AWS CloudFormation to your AWS account.

    You can view the state machine definition, and when you are ready, choose Deploy and run to deploy the project and create the resources.

    Deployment can take up to 10 minutes while resources and permissions are created. You can use the Stack ID link to monitor progress in AWS CloudFormation.

    After deployment completes, you should see your new state machine in the console.

  2. Build on it – You can review and edit the workflow definition. You might need to set values for placeholders in the sample project before attempting to run your custom workflow.
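If you build on the sample, you can also deploy your edited definition programmatically instead of through the console. A minimal sketch, assuming a locally saved definition file and an execution role of your own (both names below are placeholders):

```python
import boto3

sfn = boto3.client("stepfunctions")

# Hypothetical file and role -- replace with your edited definition and an IAM role
# that allows Step Functions to invoke Lambda and call the Redshift Data API.
with open("etl_workflow.asl.json") as f:
    definition = f.read()

response = sfn.create_state_machine(
    name="RedshiftETLWorkflow",
    definition=definition,
    roleArn="arn:aws:iam::123456789012:role/StepFunctionsRedshiftETLRole",
)
print(response["stateMachineArn"])
```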

Note

Standard charges might apply for services deployed to your account.

Step 2: Run the state machine

  1. On the State machines page, choose your sample project.

  2. On the sample project page, choose Start execution.

  3. In the Start execution dialog box, do the following:

    1. (Optional) Enter a custom execution name to override the generated default.

      Non-ASCII names and logging

      Step Functions accepts names for state machines, executions, activities, and labels that contain non-ASCII characters. Because such characters don't work with Amazon CloudWatch, we recommend using only ASCII characters so you can track metrics in CloudWatch.

    2. (Optional) In the Input box, enter input values as JSON. You can skip this step if you are running a demo.

    3. Choose Start execution.

    The Step Functions console directs you to the Execution Details page, where you can choose states in the Graph view to explore related information in the Step details pane.
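You can also start the same execution from code, for example to pass input without using the console. A minimal sketch, assuming the ARN of your deployed state machine (the ARN, execution name, and input below are placeholders):

```python
import json
import boto3

sfn = boto3.client("stepfunctions")

# Hypothetical ARN and input -- replace with your deployed state machine's ARN.
response = sfn.start_execution(
    stateMachineArn="arn:aws:states:us-east-1:123456789012:stateMachine:RedshiftETLWorkflow",
    name="etl-run-example",                    # optional custom execution name
    input=json.dumps({"comment": "example input"}),
)
print(response["executionArn"])
```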

Congratulations!

You should now have either a running demo or a state machine definition that you can customize.