CfnFlowProps
- class aws_cdk.aws_appflow.CfnFlowProps(*, destination_flow_config_list, flow_name, source_flow_config, tasks, trigger_config, description=None, flow_status=None, kms_arn=None, metadata_catalog_config=None, tags=None)
Bases:
object
Properties for defining a CfnFlow.
- Parameters:
  - destination_flow_config_list (Union[IResolvable, Sequence[Union[IResolvable, DestinationFlowConfigProperty, Dict[str, Any]]]]) – The configuration that controls how Amazon AppFlow places data in the destination connector.
  - flow_name (str) – The specified name of the flow. Spaces are not allowed. Use underscores (_) or hyphens (-) only.
  - source_flow_config (Union[IResolvable, SourceFlowConfigProperty, Dict[str, Any]]) – Contains information about the configuration of the source connector used in the flow.
  - tasks (Union[IResolvable, Sequence[Union[IResolvable, TaskProperty, Dict[str, Any]]]]) – A list of tasks that Amazon AppFlow performs while transferring the data in the flow run.
  - trigger_config (Union[IResolvable, TriggerConfigProperty, Dict[str, Any]]) – The trigger settings that determine how and when Amazon AppFlow runs the specified flow.
  - description (Optional[str]) – A user-entered description of the flow.
  - flow_status (Optional[str]) – Sets the status of the flow. You can specify one of the following values:
    - Active – The flow runs based on the trigger settings that you defined. Active scheduled flows run as scheduled, and active event-triggered flows run when the specified change event occurs. However, active on-demand flows run only when you manually start them by using Amazon AppFlow.
    - Suspended – You can use this option to deactivate an active flow. Scheduled and event-triggered flows will cease to run until you reactivate them. This value only affects scheduled and event-triggered flows. It has no effect for on-demand flows.
    If you omit the FlowStatus parameter, Amazon AppFlow creates the flow with a default status. The default status for on-demand flows is Active. The default status for scheduled and event-triggered flows is Draft, which means they’re not yet active.
  - kms_arn (Optional[str]) – The ARN (Amazon Resource Name) of the Key Management Service (KMS) key you provide for encryption. This is required if you do not want to use the Amazon AppFlow-managed KMS key. If you don’t provide anything here, Amazon AppFlow uses the Amazon AppFlow-managed KMS key.
  - metadata_catalog_config (Union[IResolvable, MetadataCatalogConfigProperty, Dict[str, Any], None]) – Specifies the configuration that Amazon AppFlow uses when it catalogs your data. When Amazon AppFlow catalogs your data, it stores metadata in a data catalog.
  - tags (Optional[Sequence[Union[CfnTag, Dict[str, Any]]]]) – The tags used to organize, track, or control access for your flow.
- See:
http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-appflow-flow.html
- ExampleMetadata:
fixture=_generated
Example:
```python
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk import CfnTag
from aws_cdk import aws_appflow as appflow

cfn_flow_props = appflow.CfnFlowProps(
    destination_flow_config_list=[appflow.CfnFlow.DestinationFlowConfigProperty(
        connector_type="connectorType",
        destination_connector_properties=appflow.CfnFlow.DestinationConnectorPropertiesProperty(
            custom_connector=appflow.CfnFlow.CustomConnectorDestinationPropertiesProperty(
                entity_name="entityName",

                # the properties below are optional
                custom_properties={
                    "custom_properties_key": "customProperties"
                },
                error_handling_config=appflow.CfnFlow.ErrorHandlingConfigProperty(
                    bucket_name="bucketName",
                    bucket_prefix="bucketPrefix",
                    fail_on_first_error=False
                ),
                id_field_names=["idFieldNames"],
                write_operation_type="writeOperationType"
            ),
            event_bridge=appflow.CfnFlow.EventBridgeDestinationPropertiesProperty(
                object="object",

                # the properties below are optional
                error_handling_config=appflow.CfnFlow.ErrorHandlingConfigProperty(
                    bucket_name="bucketName",
                    bucket_prefix="bucketPrefix",
                    fail_on_first_error=False
                )
            ),
            lookout_metrics=appflow.CfnFlow.LookoutMetricsDestinationPropertiesProperty(
                object="object"
            ),
            marketo=appflow.CfnFlow.MarketoDestinationPropertiesProperty(
                object="object",

                # the properties below are optional
                error_handling_config=appflow.CfnFlow.ErrorHandlingConfigProperty(
                    bucket_name="bucketName",
                    bucket_prefix="bucketPrefix",
                    fail_on_first_error=False
                )
            ),
            redshift=appflow.CfnFlow.RedshiftDestinationPropertiesProperty(
                intermediate_bucket_name="intermediateBucketName",
                object="object",

                # the properties below are optional
                bucket_prefix="bucketPrefix",
                error_handling_config=appflow.CfnFlow.ErrorHandlingConfigProperty(
                    bucket_name="bucketName",
                    bucket_prefix="bucketPrefix",
                    fail_on_first_error=False
                )
            ),
            s3=appflow.CfnFlow.S3DestinationPropertiesProperty(
                bucket_name="bucketName",

                # the properties below are optional
                bucket_prefix="bucketPrefix",
                s3_output_format_config=appflow.CfnFlow.S3OutputFormatConfigProperty(
                    aggregation_config=appflow.CfnFlow.AggregationConfigProperty(
                        aggregation_type="aggregationType",
                        target_file_size=123
                    ),
                    file_type="fileType",
                    prefix_config=appflow.CfnFlow.PrefixConfigProperty(
                        path_prefix_hierarchy=["pathPrefixHierarchy"],
                        prefix_format="prefixFormat",
                        prefix_type="prefixType"
                    ),
                    preserve_source_data_typing=False
                )
            ),
            salesforce=appflow.CfnFlow.SalesforceDestinationPropertiesProperty(
                object="object",

                # the properties below are optional
                data_transfer_api="dataTransferApi",
                error_handling_config=appflow.CfnFlow.ErrorHandlingConfigProperty(
                    bucket_name="bucketName",
                    bucket_prefix="bucketPrefix",
                    fail_on_first_error=False
                ),
                id_field_names=["idFieldNames"],
                write_operation_type="writeOperationType"
            ),
            sapo_data=appflow.CfnFlow.SAPODataDestinationPropertiesProperty(
                object_path="objectPath",

                # the properties below are optional
                error_handling_config=appflow.CfnFlow.ErrorHandlingConfigProperty(
                    bucket_name="bucketName",
                    bucket_prefix="bucketPrefix",
                    fail_on_first_error=False
                ),
                id_field_names=["idFieldNames"],
                success_response_handling_config=appflow.CfnFlow.SuccessResponseHandlingConfigProperty(
                    bucket_name="bucketName",
                    bucket_prefix="bucketPrefix"
                ),
                write_operation_type="writeOperationType"
            ),
            snowflake=appflow.CfnFlow.SnowflakeDestinationPropertiesProperty(
                intermediate_bucket_name="intermediateBucketName",
                object="object",

                # the properties below are optional
                bucket_prefix="bucketPrefix",
                error_handling_config=appflow.CfnFlow.ErrorHandlingConfigProperty(
                    bucket_name="bucketName",
                    bucket_prefix="bucketPrefix",
                    fail_on_first_error=False
                )
            ),
            upsolver=appflow.CfnFlow.UpsolverDestinationPropertiesProperty(
                bucket_name="bucketName",
                s3_output_format_config=appflow.CfnFlow.UpsolverS3OutputFormatConfigProperty(
                    prefix_config=appflow.CfnFlow.PrefixConfigProperty(
                        path_prefix_hierarchy=["pathPrefixHierarchy"],
                        prefix_format="prefixFormat",
                        prefix_type="prefixType"
                    ),

                    # the properties below are optional
                    aggregation_config=appflow.CfnFlow.AggregationConfigProperty(
                        aggregation_type="aggregationType",
                        target_file_size=123
                    ),
                    file_type="fileType"
                ),

                # the properties below are optional
                bucket_prefix="bucketPrefix"
            ),
            zendesk=appflow.CfnFlow.ZendeskDestinationPropertiesProperty(
                object="object",

                # the properties below are optional
                error_handling_config=appflow.CfnFlow.ErrorHandlingConfigProperty(
                    bucket_name="bucketName",
                    bucket_prefix="bucketPrefix",
                    fail_on_first_error=False
                ),
                id_field_names=["idFieldNames"],
                write_operation_type="writeOperationType"
            )
        ),

        # the properties below are optional
        api_version="apiVersion",
        connector_profile_name="connectorProfileName"
    )],
    flow_name="flowName",
    source_flow_config=appflow.CfnFlow.SourceFlowConfigProperty(
        connector_type="connectorType",
        source_connector_properties=appflow.CfnFlow.SourceConnectorPropertiesProperty(
            amplitude=appflow.CfnFlow.AmplitudeSourcePropertiesProperty(
                object="object"
            ),
            custom_connector=appflow.CfnFlow.CustomConnectorSourcePropertiesProperty(
                entity_name="entityName",

                # the properties below are optional
                custom_properties={
                    "custom_properties_key": "customProperties"
                },
                data_transfer_api=appflow.CfnFlow.DataTransferApiProperty(
                    name="name",
                    type="type"
                )
            ),
            datadog=appflow.CfnFlow.DatadogSourcePropertiesProperty(
                object="object"
            ),
            dynatrace=appflow.CfnFlow.DynatraceSourcePropertiesProperty(
                object="object"
            ),
            google_analytics=appflow.CfnFlow.GoogleAnalyticsSourcePropertiesProperty(
                object="object"
            ),
            infor_nexus=appflow.CfnFlow.InforNexusSourcePropertiesProperty(
                object="object"
            ),
            marketo=appflow.CfnFlow.MarketoSourcePropertiesProperty(
                object="object"
            ),
            pardot=appflow.CfnFlow.PardotSourcePropertiesProperty(
                object="object"
            ),
            s3=appflow.CfnFlow.S3SourcePropertiesProperty(
                bucket_name="bucketName",
                bucket_prefix="bucketPrefix",

                # the properties below are optional
                s3_input_format_config=appflow.CfnFlow.S3InputFormatConfigProperty(
                    s3_input_file_type="s3InputFileType"
                )
            ),
            salesforce=appflow.CfnFlow.SalesforceSourcePropertiesProperty(
                object="object",

                # the properties below are optional
                data_transfer_api="dataTransferApi",
                enable_dynamic_field_update=False,
                include_deleted_records=False
            ),
            sapo_data=appflow.CfnFlow.SAPODataSourcePropertiesProperty(
                object_path="objectPath",

                # the properties below are optional
                pagination_config=appflow.CfnFlow.SAPODataPaginationConfigProperty(
                    max_page_size=123
                ),
                parallelism_config=appflow.CfnFlow.SAPODataParallelismConfigProperty(
                    max_parallelism=123
                )
            ),
            service_now=appflow.CfnFlow.ServiceNowSourcePropertiesProperty(
                object="object"
            ),
            singular=appflow.CfnFlow.SingularSourcePropertiesProperty(
                object="object"
            ),
            slack=appflow.CfnFlow.SlackSourcePropertiesProperty(
                object="object"
            ),
            trendmicro=appflow.CfnFlow.TrendmicroSourcePropertiesProperty(
                object="object"
            ),
            veeva=appflow.CfnFlow.VeevaSourcePropertiesProperty(
                object="object",

                # the properties below are optional
                document_type="documentType",
                include_all_versions=False,
                include_renditions=False,
                include_source_files=False
            ),
            zendesk=appflow.CfnFlow.ZendeskSourcePropertiesProperty(
                object="object"
            )
        ),

        # the properties below are optional
        api_version="apiVersion",
        connector_profile_name="connectorProfileName",
        incremental_pull_config=appflow.CfnFlow.IncrementalPullConfigProperty(
            datetime_type_field_name="datetimeTypeFieldName"
        )
    ),
    tasks=[appflow.CfnFlow.TaskProperty(
        source_fields=["sourceFields"],
        task_type="taskType",

        # the properties below are optional
        connector_operator=appflow.CfnFlow.ConnectorOperatorProperty(
            amplitude="amplitude",
            custom_connector="customConnector",
            datadog="datadog",
            dynatrace="dynatrace",
            google_analytics="googleAnalytics",
            infor_nexus="inforNexus",
            marketo="marketo",
            pardot="pardot",
            s3="s3",
            salesforce="salesforce",
            sapo_data="sapoData",
            service_now="serviceNow",
            singular="singular",
            slack="slack",
            trendmicro="trendmicro",
            veeva="veeva",
            zendesk="zendesk"
        ),
        destination_field="destinationField",
        task_properties=[appflow.CfnFlow.TaskPropertiesObjectProperty(
            key="key",
            value="value"
        )]
    )],
    trigger_config=appflow.CfnFlow.TriggerConfigProperty(
        trigger_type="triggerType",

        # the properties below are optional
        trigger_properties=appflow.CfnFlow.ScheduledTriggerPropertiesProperty(
            schedule_expression="scheduleExpression",

            # the properties below are optional
            data_pull_mode="dataPullMode",
            first_execution_from=123,
            flow_error_deactivation_threshold=123,
            schedule_end_time=123,
            schedule_offset=123,
            schedule_start_time=123,
            time_zone="timeZone"
        )
    ),

    # the properties below are optional
    description="description",
    flow_status="flowStatus",
    kms_arn="kmsArn",
    metadata_catalog_config=appflow.CfnFlow.MetadataCatalogConfigProperty(
        glue_data_catalog=appflow.CfnFlow.GlueDataCatalogProperty(
            database_name="databaseName",
            role_arn="roleArn",
            table_prefix="tablePrefix"
        )
    ),
    tags=[CfnTag(
        key="key",
        value="value"
    )]
)
```
Attributes
- description
A user-entered description of the flow.
- destination_flow_config_list
The configuration that controls how Amazon AppFlow places data in the destination connector.
- flow_name
The specified name of the flow.
Spaces are not allowed. Use underscores (_) or hyphens (-) only.
- flow_status
Sets the status of the flow. You can specify one of the following values:
- Active – The flow runs based on the trigger settings that you defined. Active scheduled flows run as scheduled, and active event-triggered flows run when the specified change event occurs. However, active on-demand flows run only when you manually start them by using Amazon AppFlow.
- Suspended – You can use this option to deactivate an active flow. Scheduled and event-triggered flows will cease to run until you reactivate them. This value only affects scheduled and event-triggered flows. It has no effect for on-demand flows.
If you omit the FlowStatus parameter, Amazon AppFlow creates the flow with a default status. The default status for on-demand flows is Active. The default status for scheduled and event-triggered flows is Draft, which means they’re not yet active.
- kms_arn
The ARN (Amazon Resource Name) of the Key Management Service (KMS) key you provide for encryption.
This is required if you do not want to use the Amazon AppFlow-managed KMS key. If you don’t provide anything here, Amazon AppFlow uses the Amazon AppFlow-managed KMS key.
- metadata_catalog_config
Specifies the configuration that Amazon AppFlow uses when it catalogs your data.
When Amazon AppFlow catalogs your data, it stores metadata in a data catalog.
- source_flow_config
Contains information about the configuration of the source connector used in the flow.
- tags
The tags used to organize, track, or control access for your flow.
- tasks
A list of tasks that Amazon AppFlow performs while transferring the data in the flow run.
- trigger_config
The trigger settings that determine how and when Amazon AppFlow runs the specified flow.