Step 4: Transfer data from a SaaS source to Amazon S3
Suppose you now want to transfer your data from Salesforce to Amazon S3. With Amazon S3, you can synchronize and replicate customer relationship management (CRM) data into data lakes to analyze or to drive machine learning. To keep this information up to date, you can create an event-triggered flow from Salesforce to Amazon S3. An event-triggered flow runs when Amazon AppFlow detects a change to the source data in the CRM application.
After you create an S3 bucket, you can set up and run a flow with Amazon AppFlow to transfer data from a supported source to the S3 bucket. You can use one S3 bucket as both a source and destination, so you don't need to create a new S3 bucket if you already created one for this tutorial. In this step, you use the AWS Management Console to create and run a flow from Salesforce or another software as a service (SaaS) application to Amazon S3.
Prerequisites
Before you begin, you need an S3 bucket to receive the data. You can use the same S3 bucket as both a source and destination for different flows. This tutorial uses Salesforce as the SaaS application, but you can use another supported source application. Note that some flow options in this tutorial work only with Salesforce.
- Amazon S3 setup — If you don't already have an S3 bucket, complete Create an S3 bucket to prepare Amazon S3 to receive your data.
- Salesforce setup (Optional) — If you already have a Salesforce account, or you want to complete this tutorial with a different SaaS application, you can skip this step. Otherwise, sign up for a free Salesforce developer account here.
- Transfer data to Salesforce (Optional) — If you use Salesforce for this tutorial, we recommend that you complete Step 3: Transfer data from Amazon S3 before you continue.
Change data capture in Salesforce
To run an event-triggered flow, Amazon AppFlow needs to receive a notification when a record changes. When you use the change data capture feature in Salesforce, you can generate change event notifications for selected entities. If you don't have administrator-level credentials, you might not be able to select entities to generate change notifications. However, the free developer account has administrator privileges.
To enable change data capture
- Open Salesforce at www.salesforce.com and log in to your account.
- Navigate to the Change Data Capture page.
- If you use the sample data, select Account (Account) to generate change event notifications. Otherwise, select the appropriate entity for your data.
For more information about Salesforce change data capture, see Change Data Capture in the Salesforce documentation.
Create a flow
The following procedures detail how to create a flow from Salesforce to Amazon S3, but you can follow the steps with any supported source. Some flow options that this tutorial uses don't work for SaaS applications other than Salesforce, but alternate steps appear where they differ.
To complete Step 1: Specify flow details
- Open the Amazon AppFlow console at https://console.aws.amazon.com/appflow/.
- Choose Create flow.
- For Flow name, enter SaaS-to-s3. For example, if your source is Salesforce, enter salesforce-to-s3.
- Under Data encryption, you have the option to activate custom encryption settings. By default, Amazon AppFlow encrypts your data with a key in AWS Key Management Service (AWS KMS). AWS creates, uses, and manages this key for you. Amazon AppFlow always encrypts your data during transit and at rest. The default encryption is adequate for this tutorial, so don't select custom encryption settings. For more information, see Data protection in the Amazon AppFlow User Guide.
- Under Tags, you have the option to add tags to your flow. Tags are key-value pairs that assign metadata to resources that you create. Tags aren't necessary for this tutorial. For more information, see Tagging AWS resources in the AWS General Reference.
- To continue to Step 2: Configure flow, choose Next.
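If you later want to script this step instead of using the console, the choices above map onto top-level parameters of the Amazon AppFlow CreateFlow API. The following is a minimal sketch in Python (boto3 naming) using this tutorial's example flow name; leaving out kmsArn keeps the default AWS managed key, matching the tutorial's recommendation:

```python
# Sketch only: how the "Specify flow details" choices map onto
# CreateFlow API parameters (boto3 naming). Omitting "kmsArn" keeps
# the default AWS managed KMS key, as this tutorial recommends.
flow_details = {
    "flowName": "salesforce-to-s3",
    "description": "Transfer CRM data from Salesforce to Amazon S3",
    # "kmsArn": "arn:aws:kms:...",  # only if you chose custom encryption settings
    "tags": {},                     # optional key-value metadata; empty for this tutorial
}
print(flow_details["flowName"])
```

These parameters combine with the source, destination, and task settings that you configure in the later steps to form one CreateFlow request.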
To complete Step 2: Configure flow
- Configure the Source details. These details vary based on the source that you want to transfer data from.
  - If you want to transfer data from Salesforce, do the following:
    - For Source name, choose Salesforce.
    - For Choose Salesforce connection, select your connection. For example, select my-salesforce-connection, the connection that you created in a previous step.
      Tip: If you don't have a connection, you can choose Connect to create one now.
    - Select Salesforce events.
    - If you use the sample data, for Choose Salesforce event, select Account Change Event. Otherwise, select the event that matches your data.
  - If you want to transfer data from another supported application besides Salesforce, do the following:
    - For Source name, select the source that you want for your data.
    - For Choose connection, select the connection that you created, or create one.
    - Select object, and specify the correct object type for your data.
    - If there are any other source details, configure the required fields.
- For Destination name, choose Amazon S3.
- In Bucket details, for Choose an S3 bucket, select your S3 bucket. Use the same S3 bucket that contains the source folder from the previous step.
- For Enter bucket prefix, enter destination. Bucket prefixes are folders.
  Tip: If you don't have a folder that matches the name that you entered, the flow automatically creates one when it runs.
- Configure the Flow trigger. This varies based on the source that you want to transfer data from.
  - If you want to transfer data from Salesforce, leave the default selection Run flow on event.
  - If you want to transfer data from another supported application besides Salesforce, leave the default selection Run on demand. This option allows you to run the flow with the selection of one button in the console.
  Tip: You can also run flows on a schedule. Amazon AppFlow bases the time zone for this schedule on your web browser. For more information, see Schedule-triggered flows in the Amazon AppFlow User Guide.
- To continue to Step 3: Map data fields, choose Next.
To complete Step 3: Map data fields
- Under Mapping method, leave the default selection Manually map fields.
- In the Source to destination field mapping section, open the Choose source fields dropdown list and select Map all fields directly.
- Under Validations, specify what happens to invalid data within the flow. For this step, you don't need any validations.
- To continue to Step 4: Add filters, choose Next.
To complete Step 4: Add filters
- Under Filters, specify what data the flow transfers. With this setting, you can ensure that the flow transfers data only when it meets certain criteria. For this tutorial, you don't need any filters.
- To continue to Step 5: Review and create, choose Next.
To complete Step 5: Review and create
- Review the flow settings, then choose Create flow.
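For reference, here is a hedged sketch of how the flow you just configured could be expressed as a single CreateFlow request with the AWS SDK for Python (boto3). The bucket name is a placeholder, and the Salesforce event object name (AccountChangeEvent) and the minimal Map_all task are assumptions based on the sample data; the service may require additional task properties in practice:

```python
# Sketch of the console configuration as a CreateFlow request (boto3
# naming). Placeholder and assumed values are marked in comments.
create_flow_params = {
    "flowName": "salesforce-to-s3",
    "triggerConfig": {"triggerType": "Event"},  # use "OnDemand" for other sources
    "sourceFlowConfig": {
        "connectorType": "Salesforce",
        "connectorProfileName": "my-salesforce-connection",
        "sourceConnectorProperties": {
            # Assumed object name for the Account Change Event sample data
            "Salesforce": {"object": "AccountChangeEvent"}
        },
    },
    "destinationFlowConfigList": [
        {
            "connectorType": "S3",
            "destinationConnectorProperties": {
                "S3": {
                    "bucketName": "amzn-s3-demo-bucket",  # placeholder; use your bucket
                    "bucketPrefix": "destination",
                }
            },
        }
    ],
    # "Map_all" mirrors the "Map all fields directly" console choice.
    "tasks": [{"taskType": "Map_all", "sourceFields": [], "taskProperties": {}}],
}
# boto3.client("appflow").create_flow(**create_flow_params)  # needs AWS credentials
print(create_flow_params["triggerConfig"]["triggerType"])
```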
Run a flow
You now have a flow. The source that you use determines how you run this flow.
Your event-triggered flow runs when a change occurs to a record that you've set up to generate change event notifications. Here, you change a record within your Salesforce account to activate a flow run.
To run an event-triggered flow with Salesforce
- Open the Amazon AppFlow console at https://console.aws.amazon.com/appflow/.
- In Flows, select the salesforce-to-s3 flow.
- Choose Activate flow.
- Open Salesforce at www.salesforce.com and log in to your account.
- Navigate to the page where Salesforce stores your records. For the sample data, this is the Accounts page.
- Edit one of the records. For example, in the sample data, change the Rating in Example3 from Cold to Hot.
After about a minute, refresh your flow page in Amazon AppFlow. When the flow successfully runs, a timestamp from the last flow run appears.

Your on-demand flow runs when you choose the Run flow button in the console.
To run an on-demand flow
- In Flows, select your flow from the list.
- Choose Run flow.
When the flow successfully runs, a banner appears.

View transferred data
The data from your source now resides in your S3 bucket. From there, multiple AWS services can consume the data, for example, for analysis. In this step, you download and view the data on your computer.
To retrieve the transferred data
- Open the Amazon S3 console at https://console.aws.amazon.com/s3/.
- In Buckets, choose your S3 bucket from the list.
- In your S3 bucket, choose the destination folder. Then choose the flow folder, for example, salesforce-to-s3.
- The folder contains one file. Select this file and choose Download.
- Navigate to the file in your Downloads folder and rename it with a descriptive name.
- Open the file to view the updated record.
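If you'd rather inspect the file programmatically, the following sketch assumes Amazon AppFlow's default JSON output format, where each line of the file is one JSON record. The sample record below is illustrative, not actual tutorial output:

```python
# Sketch: parse a downloaded AppFlow output file, assuming the default
# JSON output format (one JSON object per line). The sample content
# below is illustrative only.
import io
import json

def read_records(fileobj):
    """Return one dict per non-empty line of JSON."""
    return [json.loads(line) for line in fileobj if line.strip()]

sample_file = io.StringIO('{"Name": "Example3", "Rating": "Hot"}\n')
for record in read_records(sample_file):
    print(record["Name"], record["Rating"])  # Example3 Hot
```

For a real download, pass an open file instead of the StringIO stand-in, for example `read_records(open("salesforce-output.json"))`.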
You've now transferred data from Salesforce, or the SaaS application that you chose, to Amazon S3. If you used Salesforce, you also set up an event-triggered flow that keeps your destination up to date with changing data.