

# Creating flows in Amazon AppFlow
<a name="create-flow"></a>

There are several ways to create flows in Amazon AppFlow. You can use the AWS Management Console, AWS CLI commands, the Amazon AppFlow API, or CloudFormation.

**Topics**
+ [Create a flow using the AWS console](create-flow-console.md)
+ [Create a flow using the AWS CLI](create-flow-cli.md)
+ [Create a flow using the Amazon AppFlow APIs](create-flow-api.md)
+ [Create a flow using CloudFormation resources](create-flow-cfn.md)

# Create a flow using the AWS console
<a name="create-flow-console"></a>

The AWS console provides a guided user interface for creating your first flow. The console enables you to enter basic information for your flow and connect as a user of the associated SaaS application.

**To create a flow using the console**

The following procedure provides the steps to create and configure a flow using the Amazon AppFlow console user interface.

1. Sign in to the AWS Management Console and open the Amazon AppFlow console at [https://console.aws.amazon.com/appflow/](https://console.aws.amazon.com/appflow/).

1. Choose **Create flow**.

1. For **Flow details**, enter a name and description for the flow. A valid flow name is a combination of alphanumeric characters and the following special characters: ! @ # . - _

1. (Optional) To use a customer managed CMK instead of the default AWS managed CMK, choose **Data encryption**, **Customize encryption settings** and then select an existing CMK or create a new one.

1. (Optional) To add a tag, choose **Tags**, **Add tag** and then enter the key name and value. The following basic restrictions apply to tags:
   + Maximum number of tags per resource – 50 
   + For each resource, each tag key must be unique, and each tag key can have only one value. 
   + Maximum key length – 128 Unicode characters in UTF-8 
   + Maximum value length – 256 Unicode characters in UTF-8 
   + Use letters, numbers, and spaces representable in UTF-8, and the following characters: + - = . _ : / @.
   + Tag keys and values are case-sensitive.
   + The `aws:` prefix is reserved for AWS use. If a tag has a tag key with this prefix, then you can't edit or delete the tag's key or value. Tags with the `aws:` prefix do not count against your tags per resource limit.

1. Choose **Next**.
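Tags can also be managed outside the console. As a sketch, the AWS CLI supports tagging an existing flow with the **tag-resource** command; the Region, account ID, and flow name in the ARN below are placeholders:

```
aws appflow tag-resource \
    --resource-arn arn:aws:appflow:<region>:<account-id>:flow/<flow-name> \
    --tags project=demo
```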

**To configure the flow**

1. For **Source details**, select the source and provide the requested information. For example, provide connection information and select objects or events. For more information, look up your source application on the [Supported source and destination applications](app-specific.md) page where you can find application-specific connection instructions.
**Note**  
To successfully configure a connection for a flow, the user or role you use to create the flow must have permission to use the `UseConnectorProfile` permission-only action for the connection (connectorprofile) that you choose for the flow. This permission is included in the AmazonAppFlowFullAccess managed policy. If you are using a custom policy, you must add the permission to the policy and specify the connectorprofile resource in the policy.

1. For **Destination details**, select the destination and provide the requested information about the location. For more information, look up your destination application on the [Supported source and destination applications](app-specific.md) page where you can find application-specific connection instructions.

1. For **Flow trigger**, choose how to trigger the flow. The following are the flow trigger options:
   + **Run on demand** - Run the flow manually.
   + **Run on event** - Run the flow based on the specified change event.
     + This option is available only for SaaS applications that provide change events. You must choose the event when you choose the source.
   + **Run on schedule** - Run the flow on the specified schedule and transfer the specified data.
     + You can choose either full or incremental transfer for schedule-triggered flows. 
     + When you select full transfer, Amazon AppFlow transfers a snapshot of all records at the time of the flow run from the source to the destination.
     + When you select incremental transfer, Amazon AppFlow transfers only the records that have been added or changed since the last successful flow run. You can also select a timestamp field to specify how Amazon AppFlow identifies new or changed records. For example, if you have a **Created Date** timestamp field, choose it to instruct Amazon AppFlow to transfer only newly created records (and not changed records) since the last successful flow run. The first run of a schedule-triggered flow pulls up to 30 days of past records.
     + The scheduling frequency depends on the frequency supported by the source application. 

1. Choose **Next**.
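For reference, a schedule-triggered flow with incremental transfer corresponds to a `triggerConfig` like the following sketch in the `CreateFlow` request; the rate expression here is a placeholder assumption:

```
{
    "triggerType": "Scheduled",
    "triggerProperties": {
        "Scheduled": {
            "scheduleExpression": "rate(1hours)",
            "dataPullMode": "Incremental"
        }
    }
}
```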

**Tip**  
Attempting a connection with an expired user login can return a 'status code 400' error. If you encounter this error, we recommend creating a new connection and deleting the old one, or using an existing connection with valid credentials. For more information on setting up a connection, look up your source application on the [Supported source and destination applications](app-specific.md) page.
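To illustrate the `UseConnectorProfile` requirement noted earlier, a custom IAM policy might include a statement like this sketch; the Region, account ID, and profile name are placeholders:

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "appflow:UseConnectorProfile",
            "Resource": "arn:aws:appflow:<region>:<account-id>:connectorprofile/<profile-name>"
        }
    ]
}
```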

**To map data fields**

1. For **Mapping method**, choose how to map the fields and complete the field mapping. The following are the field mapping options:
   + **Manually map fields** - Use the Amazon AppFlow user interface to specify the field mapping. To map all fields, choose **Source field name**, **Bulk actions**, **Map all fields directly**. Otherwise, select one or more fields from **Source field name**, **Source fields**, and then choose **Map fields directly**.
   + **Upload a .csv file with mapped fields** - Use a comma-separated values (CSV) file to specify the field mappings. Each line in the CSV file contains the source field name, followed by a comma, which is followed by the destination field name. For more information on how to create the CSV file for upload, see the note that follows this procedure.

1. (Optional) To add a formula that concatenates fields, select two fields from **Mapped fields** and then choose **Add formula**.

1. (Optional) To mask or truncate field values, select one or more fields from **Mapped fields** and then choose **Modify values**.

1. (Optional) For **Validations**, add validations to check whether a field has bad data. For each field, choose the condition that indicates bad data and what action Amazon AppFlow should take when a field in a record is bad.

1. Choose **Next**.

**Tip**  
When manually mapping between a source and destination, you must select compatible fields and be sure not to exceed the number of records supported by the destination. For more information on supported record quotas, see [Quotas for Amazon AppFlow](https://docs.aws.amazon.com/appflow/latest/userguide/service-quotas.html) in the *Amazon AppFlow User Guide*.

**Note**  
When creating a CSV file to upload to Amazon AppFlow, you must specify each source field and destination field pair in a single line separated by a comma. For example, if you want to map source fields SF1, SF2, and SF3 to destination fields DFa, DFb, and DFc respectively, the CSV file should contain three lines as follows:  
SF1, DFa   
SF2, DFb   
SF3, DFc   
Save your file with a .csv extension and then upload this file to import the mapping into Amazon AppFlow.
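If you maintain your field mappings in version control, the file can be generated from a script. This sketch (using the hypothetical field names from the example above) writes the same three-line file:

```
# Write the example field mappings to a CSV file.
cat > field-mapping.csv <<'EOF'
SF1, DFa
SF2, DFb
SF3, DFc
EOF
cat field-mapping.csv
```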

**To add filters**

Specify a filter to determine which records to transfer. Amazon AppFlow enables you to filter data fields by adding multiple filters and by adding criteria to a filter.
**Note**  
When you select field names with string values, OR logic allows you to combine two or more criteria into a broader condition. When you add multiple filters, AND logic allows you to combine your filters into a narrower condition.

1. To add a filter, choose **Add filter**, select the field name, select a condition, and then specify the criteria.

1. (Optional) To add further criteria to your filter, choose **Add criteria**. Depending on the field and the condition, you can add up to 10 criteria per filter.

1. (Optional) To add another filter, choose **Add filter** again. You can create up to 10 filters to specify which data fields you want to use in your flow. Amazon AppFlow applies the filters in the order in which you specify them, and transfers only the records that meet all filter criteria.

1. To remove a filter, choose **Remove** next to the filter.

1. When you are finished adding filters, choose **Next**.

   Review the information for your flow. To change the information for a step, choose **Edit**. When you are finished, choose **Create flow**.
**Tip**  
If the flow creation fails, review the error message. Confirm that you entered all required fields, and that the user or role you are using has permission to use the `UseConnectorProfile` action for the connection selected for the flow.

# Create a flow using the AWS CLI
<a name="create-flow-cli"></a>

You can also use the [AWS CLI](https://docs.aws.amazon.com/cli/latest/reference/appflow/index.html) to create a connector profile and configure a flow with the **create-connector-profile** and **create-flow** commands. Because authentication methods vary across target applications, the specific information required to create a connection also varies. Two examples are provided here as a comparison: Salesforce and ServiceNow.

Run the **create-connector-profile** command to create the connector profile for your flow. The following example creates a new Amazon AppFlow connection to Salesforce. Note that this leverages a Salesforce Connected App, which itself requires several steps to configure across AWS and Salesforce. See [Salesforce global connected app](https://docs.aws.amazon.com/appflow/latest/userguide/salesforce.html#salesforce-global-connected-app) for details.

Create Salesforce connection:

```
aws appflow create-connector-profile \
    --connector-profile-name MySalesforceConnection \
    --connector-type Salesforce \
    --connection-mode Public \
    --connector-profile-config ' { 
                "connectorProfileProperties": {
                    "Salesforce": {
                        "instanceUrl": "https://<instance-name>.my.salesforce.com",
                        "isSandboxEnvironment": false
                    }
                },
                "connectorProfileCredentials": {
                    "Salesforce": {
                        "accessToken": "<access-token-value>",
                        "refreshToken": "<refresh-token-value>",
                        "oAuthRequest": {
                            "authCode": "<auth-code-value>",
                            "redirectUri": "https://login.salesforce.com/"
                        },
                        "clientCredentialsArn": "<secret-arn-value>"
                    }
                }
            }'
```

Run the **create-connector-profile** command again to create a ServiceNow connector profile. The following example creates a new Amazon AppFlow connection to ServiceNow. Note that, unlike Salesforce, there is no prerequisite configuration for either AWS or ServiceNow.

Create ServiceNow connection:

```
aws appflow create-connector-profile \
    --connector-profile-name MyServiceNowConnection \
    --connector-type Servicenow \
    --connection-mode Public \
    --connector-profile-config ' { 
                "connectorProfileProperties": {
                    "ServiceNow": {
                        "instanceUrl": "https://<instance-name>.service-now.com"
                    }
                },
                "connectorProfileCredentials": {
                    "ServiceNow": {
                        "username": "<username-value>",
                        "password": "<password-value>"
                    }
                }
            }'
```

Run the **create-flow** command to create your flow. The following example implements a flow from Salesforce to Amazon S3 using a previously created Salesforce connection and S3 bucket, delivering the data in CSV format with all Salesforce source fields mapped directly.

Create Salesforce to S3 flow:

```
aws appflow create-flow \
    --flow-name MySalesforceToS3Flow \
    --trigger-config '{
                "triggerType": "OnDemand"
            }' \
    --source-flow-config '{
                  "connectorType": "Salesforce",
                  "connectorProfileName": "MySalesforceConnection",
                  "sourceConnectorProperties": {
                      "Salesforce": {
                          "object": "Account"
                    }
                }
            }' \
    --destination-flow-config '[{
                "connectorType": "S3",
                "destinationConnectorProperties": {
                    "S3": {
                        "bucketName": "<s3-bucket-name>",
                        "s3OutputFormatConfig": {
                            "fileType": "CSV"
                        }
                    }
                }
           }]' \
    --tasks '[
                {
                    "sourceFields": [],
                    "taskType": "Map_all",
                    "taskProperties": {}
                }
            ]'
```

Run the **start-flow** command to start your flow. For on-demand flows, this operation runs the flow immediately. For schedule and event-triggered flows, this operation activates the flow. The following starts the flow `MySalesforceToS3Flow` which was created in the previous step.

```
aws appflow start-flow --flow-name MySalesforceToS3Flow
```
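After starting a flow, you can check on its runs. The **describe-flow-execution-records** command returns the status and results of recent runs of the named flow:

```
aws appflow describe-flow-execution-records --flow-name MySalesforceToS3Flow
```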

The **describe-flow** command is helpful for understanding how previously created flows, including flows created through the console, are structured.

Describe a flow:

```
aws appflow describe-flow --flow-name MySalesforceToS3Flow
```

Refer to the [AWS CLI Command Reference for Amazon AppFlow](https://awscli.amazonaws.com/v2/documentation/api/latest/reference/appflow/index.html) for the complete list of commands available for Amazon AppFlow.

# Create a flow using the Amazon AppFlow APIs
<a name="create-flow-api"></a>

You can also create a connector profile and configure a flow by calling the `CreateConnectorProfile` and `CreateFlow` API operations. Because authentication methods vary across target applications, the specific information required to create a connection also varies. Two examples are provided below as a comparison: Salesforce and ServiceNow.

Call the `CreateConnectorProfile` API to create a connector profile associated with your AWS account. There is a soft quota of 100 connector profiles per AWS account. If you need more connector profiles than this quota allows, you can submit a request to the Amazon AppFlow team through the Amazon AppFlow support channel. The following example creates a new Amazon AppFlow connection to Salesforce. Note that this leverages a Salesforce Connected App, which itself requires several steps to configure across AWS and Salesforce. See [Salesforce global connected app](https://docs.aws.amazon.com/appflow/latest/userguide/salesforce.html#salesforce-global-connected-app) for details.

Create Salesforce connection:

```
POST /create-connector-profile HTTP/1.1
Content-type: application/json

{
    "connectorProfileName": "MySalesforceConnection",
    "connectorType": "Salesforce",
    "connectionMode": "Public",
    "connectorProfileConfig": {
        "connectorProfileProperties": { 
            "Salesforce": { 
                "instanceUrl": "https://<instance-name>.my.salesforce.com",
                "isSandboxEnvironment": false
            }
        },
        "connectorProfileCredentials": { 
            "Salesforce": { 
                "accessToken": "<access-token-value>",
                "refreshToken": "<refresh-token-value>",
                "oAuthRequest": {
                    "authCode": "<auth-code-value>",
                    "redirectUri": "https://login.salesforce.com/"
                },
                "clientCredentialsArn": "<secret-arn-value>"
            }
        }
    }
}
```

The following example creates a new Amazon AppFlow connection to ServiceNow. Note that, unlike Salesforce, there is no prerequisite configuration for either AWS or ServiceNow.

Create ServiceNow connection:

```
POST /create-connector-profile HTTP/1.1
Content-type: application/json

{
    "connectorProfileName": "MyServiceNowConnection",
    "connectorType": "Servicenow",
    "connectionMode": "Public",
    "connectorProfileConfig": {
        "connectorProfileProperties": { 
            "ServiceNow": { 
                "instanceUrl": "https://<instance-name>.service-now.com"
            }
        },
        "connectorProfileCredentials": { 
            "ServiceNow": {
                "username": "<username-value>",
                "password": "<password-value>"
            }
        }
    }
}
```

The following implements a flow from Salesforce to S3 using a previously created Salesforce connection and S3 bucket, delivering the data in CSV format with all Salesforce source fields mapped directly.

Create Salesforce to S3 flow:

```
POST /create-flow HTTP/1.1
Content-type: application/json

{
    "flowName": "MySalesforceToS3Flow",
    "triggerConfig": {
        "triggerType": "OnDemand"
    },
    "sourceFlowConfig": {
          "connectorType": "Salesforce",
          "connectorProfileName": "MySalesforceConnection",
          "sourceConnectorProperties": {
              "Salesforce": {
                  "object": "Account"
            }
        }
    },
    "destinationFlowConfigList": [{
        "connectorType": "S3",
        "destinationConnectorProperties": {
            "S3": {
                "bucketName": "appflow-demo-destination",
                "s3OutputFormatConfig": {
                    "fileType": "CSV"
                }
            }
        }
    }],
    "tasks": [
        {
            "sourceFields": [],
            "taskType": "Map_all",
            "taskProperties": {}
        }
    ]
}
```

The following starts the flow `MySalesforceToS3Flow` which was created in the previous step.

Start a flow:

```
POST /start-flow HTTP/1.1
Content-type: application/json

{
   "flowName": "MySalesforceToS3Flow"
}
```

Refer to the [Amazon AppFlow API Reference](https://docs.aws.amazon.com/appflow/1.0/APIReference/API_CreateConnectorProfile.html) for details about the complete set of Amazon AppFlow APIs.

# Create a flow using CloudFormation resources
<a name="create-flow-cfn"></a>

You may also use CloudFormation to create a connector profile and configure a flow using the `AWS::AppFlow::ConnectorProfile` and `AWS::AppFlow::Flow` resources. The following example creates a new Amazon AppFlow connection to Salesforce. Note that this leverages a Salesforce Connected App, which itself requires several steps to configure across AWS and Salesforce. See [Salesforce global connected app](https://docs.aws.amazon.com/appflow/latest/userguide/salesforce.html#salesforce-global-connected-app) for details.

Declare the `AWS::AppFlow::ConnectorProfile` entity in your CloudFormation template with the following JSON syntax:

```
{
  "AWSTemplateFormatVersion":"2010-09-09",
  "Resources": {
    "MySalesforceConnection": {
      "Type" : "AWS::AppFlow::ConnectorProfile",
      "Properties": {
        "ConnectorProfileName": "MySalesforceConnection",
        "ConnectorType": "Salesforce",
        "ConnectionMode": "Public",
        "ConnectorProfileConfig": {
          "ConnectorProfileProperties": {
            "Salesforce": {
              "InstanceUrl": "https://<instance-name>.my.salesforce.com",
              "IsSandboxEnvironment": false
            }
          },
          "ConnectorProfileCredentials": {
            "Salesforce": {
              "AccessToken": "<access-token-value>",
              "RefreshToken": "<refresh-token-value>",
              "ConnectorOAuthRequest": {
                "AuthCode": "<auth-code-value>",
                "RedirectUri": "https://login.salesforce.com/"
              },
              "ClientCredentialsArn": "<secret-arn-value>"
            }
          }
        }
      }    
    }
  }
}
```

Following is an example of YAML syntax:

```
AWSTemplateFormatVersion: '2010-09-09'
Resources:
  MySalesforceConnection:
    Type: AWS::AppFlow::ConnectorProfile
    Properties: 
      ConnectorProfileName: MySalesforceConnection
      ConnectorType: Salesforce
      ConnectionMode: Public
      ConnectorProfileConfig:
        ConnectorProfileProperties:
          Salesforce:
            InstanceUrl: https://<instance-name>.my.salesforce.com
            IsSandboxEnvironment: false
        ConnectorProfileCredentials:
          Salesforce:
            AccessToken: <access-token-value>
            RefreshToken: <refresh-token-value>
            ConnectorOAuthRequest:
              AuthCode: <auth-code-value>
              RedirectUri: https://login.salesforce.com/
            ClientCredentialsArn: <secret-arn-value>
```

The following example creates a new Amazon AppFlow connection to ServiceNow.

Create ServiceNow connection - JSON:

```
{
  "AWSTemplateFormatVersion":"2010-09-09",
  "Resources": {
    "MyServiceNowConnection": {
      "Type" : "AWS::AppFlow::ConnectorProfile",
      "Properties": {
        "ConnectorProfileName": "MyServiceNowConnection",
        "ConnectorType": "Servicenow",
        "ConnectionMode": "Public",
        "ConnectorProfileConfig": {
          "ConnectorProfileProperties": {
            "ServiceNow": {
              "InstanceUrl": "https://<instance-name>.service-now.com"
            }
          },
          "ConnectorProfileCredentials": {
            "ServiceNow": {
              "Username": "<username-value>",
              "Password": "<password-value>"
            }
          }
        }
      }    
    }
  }
}
```

The following is an example of YAML syntax that creates a new Amazon AppFlow connection to ServiceNow.

Create ServiceNow connection - YAML:

```
AWSTemplateFormatVersion: '2010-09-09'
Resources:
  MyServiceNowConnection:
    Type: AWS::AppFlow::ConnectorProfile
    Properties: 
      ConnectorProfileName: MyServiceNowConnection
      ConnectorType: Servicenow
      ConnectionMode: Public
      ConnectorProfileConfig:
        ConnectorProfileProperties:
          ServiceNow:
            InstanceUrl: https://<instance-name>.service-now.com
        ConnectorProfileCredentials:
          ServiceNow:
            Username: <username-value>
            Password: <password-value>
```

The following implements a flow from Salesforce to S3 using a previously created Salesforce connection and S3 bucket, delivering the data in CSV format with all Salesforce source fields mapped directly.

Create Salesforce to S3 flow - JSON:

```
{
  "AWSTemplateFormatVersion":"2010-09-09",
  "Resources": {
    "MySalesforceToS3Flow": {
      "Type" : "AWS::AppFlow::Flow",
      "Properties": {
        "FlowName": "MySalesforceToS3Flow",
        "TriggerConfig": {
          "TriggerType": "OnDemand"
        },
        "SourceFlowConfig": {
          "ConnectorType": "Salesforce",
          "ConnectorProfileName": "MySalesforceConnection",
          "SourceConnectorProperties": {
            "Salesforce": {
                "Object": "Account"
            }
          }
        },
        "DestinationFlowConfigList" : [{
          "ConnectorType": "S3",
          "DestinationConnectorProperties": {
            "S3": {
              "BucketName": "<s3-bucket-name>",
              "S3OutputFormatConfig": {
                "FileType": "CSV"
              }
            }
          }
        }],
        "Tasks": [
          {
            "TaskType": "Map_all",
            "SourceFields": [],
            "TaskProperties": [{
              "Key": "EXCLUDE_SOURCE_FIELDS_LIST",
              "Value": "[]"
            }],
            "ConnectorOperator": {
              "Salesforce": "NO_OP"
            }
          }
        ]
      }
    }
  }
}
```

The following is an example of YAML syntax that implements the same flow.

Create Salesforce to S3 flow - YAML:

```
AWSTemplateFormatVersion: '2010-09-09'
Resources:
  MySalesforceToS3Flow:
    Type: AWS::AppFlow::Flow
    Properties:
      FlowName: MySalesforceToS3Flow
      TriggerConfig:
        TriggerType: OnDemand
      SourceFlowConfig:
        ConnectorType: Salesforce
        ConnectorProfileName: MySalesforceConnection
        SourceConnectorProperties:
          Salesforce:
            Object: Account
      DestinationFlowConfigList:
        - ConnectorType: S3
          DestinationConnectorProperties:
            S3:
              BucketName: <s3-bucket-name>
              S3OutputFormatConfig:
                FileType: CSV
      Tasks:
        - TaskType: Map_all
          SourceFields: []
          TaskProperties:
          - Key: EXCLUDE_SOURCE_FIELDS_LIST
            Value: '[]'
          ConnectorOperator:
            Salesforce: NO_OP
```
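A template like the one above can be provisioned with the AWS CLI. This sketch assumes the YAML template is saved as `appflow-flow.yaml`; the stack name is a placeholder:

```
aws cloudformation deploy \
    --template-file appflow-flow.yaml \
    --stack-name my-appflow-flow
```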

Refer to the [Amazon AppFlow chapter of the AWS CloudFormation User Guide](https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/AWS_AppFlow.html) for details about the complete set of resource options for all sources and destinations.