CfnWorkflow
- class aws_cdk.aws_transfer.CfnWorkflow(scope, id, *, steps, description=None, on_exception_steps=None, tags=None)
Bases:
CfnResource
Allows you to create a workflow with specified steps and step details that the workflow invokes after file transfer completes.
After creating a workflow, you can associate it with any transfer server by specifying the workflow-details field in CreateServer and UpdateServer operations.
- See:
http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-transfer-workflow.html
- CloudformationResource:
AWS::Transfer::Workflow
- ExampleMetadata:
fixture=_generated
Example:
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk import aws_transfer as transfer

# copy_step_details: Any
# custom_step_details: Any
# delete_step_details: Any
# tag_step_details: Any

cfn_workflow = transfer.CfnWorkflow(self, "MyCfnWorkflow",
    steps=[transfer.CfnWorkflow.WorkflowStepProperty(
        copy_step_details=copy_step_details,
        custom_step_details=custom_step_details,
        decrypt_step_details=transfer.CfnWorkflow.DecryptStepDetailsProperty(
            destination_file_location=transfer.CfnWorkflow.InputFileLocationProperty(
                efs_file_location=transfer.CfnWorkflow.EfsInputFileLocationProperty(
                    file_system_id="fileSystemId",
                    path="path"
                ),
                s3_file_location=transfer.CfnWorkflow.S3InputFileLocationProperty(
                    bucket="bucket",
                    key="key"
                )
            ),
            type="type",

            # the properties below are optional
            name="name",
            overwrite_existing="overwriteExisting",
            source_file_location="sourceFileLocation"
        ),
        delete_step_details=delete_step_details,
        tag_step_details=tag_step_details,
        type="type"
    )],

    # the properties below are optional
    description="description",
    on_exception_steps=[transfer.CfnWorkflow.WorkflowStepProperty(
        copy_step_details=copy_step_details,
        custom_step_details=custom_step_details,
        decrypt_step_details=transfer.CfnWorkflow.DecryptStepDetailsProperty(
            destination_file_location=transfer.CfnWorkflow.InputFileLocationProperty(
                efs_file_location=transfer.CfnWorkflow.EfsInputFileLocationProperty(
                    file_system_id="fileSystemId",
                    path="path"
                ),
                s3_file_location=transfer.CfnWorkflow.S3InputFileLocationProperty(
                    bucket="bucket",
                    key="key"
                )
            ),
            type="type",

            # the properties below are optional
            name="name",
            overwrite_existing="overwriteExisting",
            source_file_location="sourceFileLocation"
        ),
        delete_step_details=delete_step_details,
        tag_step_details=tag_step_details,
        type="type"
    )],
    tags=[CfnTag(
        key="key",
        value="value"
    )]
)
- Parameters:
scope (Construct) – Scope in which this resource is defined.
id (str) – Construct identifier for this resource (unique in its scope).
steps (Union[IResolvable, Sequence[Union[IResolvable, WorkflowStepProperty, Dict[str, Any]]]]) – Specifies the details for the steps that are in the specified workflow.
description (Optional[str]) – Specifies the text description for the workflow.
on_exception_steps (Union[IResolvable, Sequence[Union[IResolvable, WorkflowStepProperty, Dict[str, Any]]], None]) – Specifies the steps (actions) to take if errors are encountered during execution of the workflow.
tags (Optional[Sequence[Union[CfnTag, Dict[str, Any]]]]) – Key-value pairs that can be used to group and search for workflows. Tags are metadata attached to workflows for any purpose.
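As a complement to the generated placeholder example above, the following sketch wires the constructor parameters together for a workflow with a single DECRYPT step. The stack class, bucket name, key prefix, and tag values are assumptions for illustration, not values from this reference.
from aws_cdk import Stack, CfnTag
from aws_cdk import aws_transfer as transfer
from constructs import Construct

class TransferWorkflowStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # One DECRYPT step that writes the decrypted file to an assumed S3 location.
        transfer.CfnWorkflow(self, "DecryptWorkflow",
            description="Decrypt uploaded files",
            steps=[transfer.CfnWorkflow.WorkflowStepProperty(
                type="DECRYPT",
                decrypt_step_details=transfer.CfnWorkflow.DecryptStepDetailsProperty(
                    type="PGP",
                    source_file_location="${original.file}",
                    destination_file_location=transfer.CfnWorkflow.InputFileLocationProperty(
                        s3_file_location=transfer.CfnWorkflow.S3InputFileLocationProperty(
                            bucket="my-decrypted-files",  # assumed bucket name
                            key="decrypted/"              # assumed key prefix
                        )
                    )
                )
            )],
            tags=[CfnTag(key="project", value="example")]
        )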
Methods
- add_deletion_override(path)
Syntactic sugar for addOverride(path, undefined).
- Parameters:
path (str) – The path of the value to delete.
- Return type:
None
- add_dependency(target)
Indicates that this resource depends on another resource and cannot be provisioned unless the other resource has been successfully provisioned.
This can be used for resources across stacks (or nested stack) boundaries and the dependency will automatically be transferred to the relevant scope.
- Parameters:
target (CfnResource) –
- Return type:
None
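A minimal usage sketch, assuming cfn_workflow is a CfnWorkflow created as shown above and cfn_role is some other CfnResource in the same app:
# Provision the workflow only after cfn_role has been created.
cfn_workflow.add_dependency(cfn_role)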
- add_depends_on(target)
(deprecated) Indicates that this resource depends on another resource and cannot be provisioned unless the other resource has been successfully provisioned.
- Parameters:
target (CfnResource) –
- Deprecated:
use addDependency
- Stability:
deprecated
- Return type:
None
- add_metadata(key, value)
Add a value to the CloudFormation Resource Metadata.
- Parameters:
key (str) –
value (Any) –
- See:
- Return type:
None
Note that this is a different set of metadata from CDK node metadata; this metadata ends up in the stack template under the resource, whereas CDK node metadata ends up in the Cloud Assembly.
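A minimal sketch of attaching template-level metadata; the key and value are arbitrary assumptions and cfn_workflow is a previously created CfnWorkflow:
# Adds a Metadata entry under the AWS::Transfer::Workflow resource in the synthesized template.
cfn_workflow.add_metadata("Documentation", {"owner": "data-platform", "purpose": "example"})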
- add_override(path, value)
Adds an override to the synthesized CloudFormation resource.
To add a property override, either use addPropertyOverride or prefix path with “Properties.” (i.e. Properties.TopicName).
If the override is nested, separate each nested level using a dot (.) in the path parameter. If there is an array as part of the nesting, specify the index in the path.
To include a literal . in the property name, prefix it with a \. In most programming languages you will need to write this as "\\." because the \ itself will need to be escaped.
For example:
cfn_resource.add_override("Properties.GlobalSecondaryIndexes.0.Projection.NonKeyAttributes", ["myattribute"])
cfn_resource.add_override("Properties.GlobalSecondaryIndexes.1.ProjectionType", "INCLUDE")
would add the following overrides:
"Properties": {
  "GlobalSecondaryIndexes": [
    {
      "Projection": {
        "NonKeyAttributes": [ "myattribute" ]
        ...
      }
      ...
    },
    {
      "ProjectionType": "INCLUDE"
      ...
    },
  ]
  ...
}
The value argument to addOverride will not be processed or translated in any way. Pass raw JSON values in here with the correct capitalization for CloudFormation. If you pass CDK classes or structs, they will be rendered with lowercased key names, and CloudFormation will reject the template.
- Parameters:
path (str) – The path of the property. You can use dot notation to override values in complex types. Any intermediate keys will be created as needed.
value (Any) – The value. Could be primitive or complex.
- Return type:
None
- add_property_deletion_override(property_path)
Adds an override that deletes the value of a property from the resource definition.
- Parameters:
property_path (str) – The path to the property.
- Return type:
None
- add_property_override(property_path, value)
Adds an override to a resource property.
Syntactic sugar for addOverride("Properties.<...>", value).
- Parameters:
property_path (str) – The path of the property.
value (Any) – The value.
- Return type:
None
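A hedged sketch of a property override; Description is a top-level property of AWS::Transfer::Workflow documented here, while the replacement value is a placeholder:
# Equivalent to add_override("Properties.Description", "Overridden description").
cfn_workflow.add_property_override("Description", "Overridden description")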
- apply_removal_policy(policy=None, *, apply_to_update_replace_policy=None, default=None)
Sets the deletion policy of the resource based on the removal policy specified.
The Removal Policy controls what happens to this resource when it stops being managed by CloudFormation, either because you’ve removed it from the CDK application or because you’ve made a change that requires the resource to be replaced.
The resource can be deleted (RemovalPolicy.DESTROY), or left in your AWS account for data recovery and cleanup later (RemovalPolicy.RETAIN). In some cases, a snapshot can be taken of the resource prior to deletion (RemovalPolicy.SNAPSHOT). A list of resources that support this policy can be found in the following link:
- Parameters:
policy (Optional[RemovalPolicy]) –
apply_to_update_replace_policy (Optional[bool]) – Apply the same deletion policy to the resource’s “UpdateReplacePolicy”. Default: true
default (Optional[RemovalPolicy]) – The default policy to apply in case the removal policy is not defined. Default: - Default value is resource specific. To determine the default value for a resource, please consult that specific resource’s documentation.
- See:
- Return type:
None
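A minimal sketch, assuming you want the workflow left in the account when the stack stops managing it:
from aws_cdk import RemovalPolicy

# Retain the workflow (and its configuration) instead of deleting it.
cfn_workflow.apply_removal_policy(RemovalPolicy.RETAIN)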
- get_att(attribute_name, type_hint=None)
Returns a token for a runtime attribute of this resource.
Ideally, use generated attribute accessors (e.g. resource.arn), but this can be used for future compatibility in case there is no generated attribute.
- Parameters:
attribute_name (str) – The name of the attribute.
type_hint (Optional[ResolutionTypeHint]) –
- Return type:
- get_metadata(key)
Retrieve a value from the CloudFormation Resource Metadata.
- Parameters:
key (str) –
- See:
- Return type:
Any
Note that this is a different set of metadata from CDK node metadata; this metadata ends up in the stack template under the resource, whereas CDK node metadata ends up in the Cloud Assembly.
- inspect(inspector)
Examines the CloudFormation resource and discloses attributes.
- Parameters:
inspector (TreeInspector) – Tree inspector to collect and process attributes.
- Return type:
None
- obtain_dependencies()
Retrieves an array of resources this resource depends on.
This assembles dependencies on resources across stacks (including nested stacks) automatically.
- Return type:
List[Union[Stack, CfnResource]]
- obtain_resource_dependencies()
Get a shallow copy of dependencies between this resource and other resources in the same stack.
- Return type:
List[CfnResource]
- override_logical_id(new_logical_id)
Overrides the auto-generated logical ID with a specific ID.
- Parameters:
new_logical_id (str) – The new logical ID to use for this stack element.
- Return type:
None
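A short sketch; the replacement logical ID is an arbitrary assumption:
# Pin the logical ID so the synthesized template always uses "TransferWorkflow".
cfn_workflow.override_logical_id("TransferWorkflow")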
- remove_dependency(target)
Indicates that this resource no longer depends on another resource.
This can be used for resources across stacks (including nested stacks) and the dependency will automatically be removed from the relevant scope.
- Parameters:
target (CfnResource) –
- Return type:
None
- replace_dependency(target, new_target)
Replaces one dependency with another.
- Parameters:
target (CfnResource) – The dependency to replace.
new_target (CfnResource) – The new dependency to add.
- Return type:
None
- to_string()
Returns a string representation of this construct.
- Return type:
str
- Returns:
a string representation of this resource
Attributes
- CFN_RESOURCE_TYPE_NAME = 'AWS::Transfer::Workflow'
- attr_arn
Specifies the unique Amazon Resource Name (ARN) for the workflow.
- CloudformationAttribute:
Arn
- attr_workflow_id
A unique identifier for a workflow.
- CloudformationAttribute:
WorkflowId
- cfn_options
Options for this resource, such as condition, update policy etc.
- cfn_resource_type
AWS resource type.
- creation_stack
- Returns:
the stack trace of the point where this Resource was created from, sourced from the +metadata+ entry typed +aws:cdk:logicalId+, and with the bottom-most node +internal+ entries filtered.
- description
Specifies the text description for the workflow.
- logical_id
The logical ID for this CloudFormation stack element.
The logical ID of the element is calculated from the path of the resource node in the construct tree.
To override this value, use
overrideLogicalId(newLogicalId)
.- Returns:
the logical ID as a stringified token. This value will only get resolved during synthesis.
- node
The tree node.
- on_exception_steps
Specifies the steps (actions) to take if errors are encountered during execution of the workflow.
- ref
Return a string that will be resolved to a CloudFormation { Ref } for this element.
If, by any chance, the intrinsic reference of a resource is not a string, you could coerce it to an IResolvable through Lazy.any({ produce: resource.ref }).
- stack
The stack in which this element is defined.
CfnElements must be defined within a stack scope (directly or indirectly).
- steps
Specifies the details for the steps that are in the specified workflow.
- tags
Tag Manager which manages the tags for this resource.
- tags_raw
Key-value pairs that can be used to group and search for workflows.
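A hedged sketch showing how these attributes can be surfaced as stack outputs; the output names are assumptions and cfn_workflow is a previously created CfnWorkflow:
from aws_cdk import CfnOutput

# Both attr_* values and ref resolve at deployment time.
CfnOutput(self, "WorkflowArn", value=cfn_workflow.attr_arn)
CfnOutput(self, "WorkflowId", value=cfn_workflow.attr_workflow_id)
CfnOutput(self, "WorkflowRef", value=cfn_workflow.ref)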
Static Methods
- classmethod is_cfn_element(x)
Returns true if a construct is a stack element (i.e. part of the synthesized CloudFormation template).
Uses duck-typing instead of instanceof to allow stack elements from different versions of this library to be included in the same stack.
- Parameters:
x (Any) –
- Return type:
bool
- Returns:
The construct as a stack element or undefined if it is not a stack element.
- classmethod is_cfn_resource(x)
Check whether the given object is a CfnResource.
- Parameters:
x (Any) –
- Return type:
bool
- classmethod is_construct(x)
Checks if x is a construct.
Use this method instead of instanceof to properly detect Construct instances, even when the construct library is symlinked.
Explanation: in JavaScript, multiple copies of the constructs library on disk are seen as independent, completely different libraries. As a consequence, the class Construct in each copy of the constructs library is seen as a different class, and an instance of one class will not test as instanceof the other class. npm install will not create installations like this, but users may manually symlink construct libraries together or use a monorepo tool: in those cases, multiple copies of the constructs library can be accidentally installed, and instanceof will behave unpredictably. It is safest to avoid using instanceof and to use this type-testing method instead.
- Parameters:
x (Any) – Any object.
- Return type:
bool
- Returns:
true if x is an object created from a class which extends Construct.
CopyStepDetailsProperty
- class CfnWorkflow.CopyStepDetailsProperty(*, destination_file_location=None, name=None, overwrite_existing=None, source_file_location=None)
Bases:
object
Details for a step that performs a file copy.
Consists of the following values:
- A description
- An Amazon S3 location for the destination of the file copy.
- A flag that indicates whether to overwrite an existing file of the same name. The default is FALSE.
- Parameters:
destination_file_location (Union[IResolvable, S3FileLocationProperty, Dict[str, Any], None]) – Specifies the location for the file being copied. Use ${Transfer:UserName} or ${Transfer:UploadDate} in this field to parametrize the destination prefix by username or uploaded date. - Set the value of DestinationFileLocation to ${Transfer:UserName} to copy uploaded files to an Amazon S3 bucket that is prefixed with the name of the Transfer Family user that uploaded the file. - Set the value of DestinationFileLocation to ${Transfer:UploadDate} to copy uploaded files to an Amazon S3 bucket that is prefixed with the date of the upload. The system resolves UploadDate to a date format of YYYY-MM-DD, based on the date the file is uploaded in UTC.
name (Optional[str]) – The name of the step, used as an identifier.
overwrite_existing (Optional[str]) – A flag that indicates whether to overwrite an existing file of the same name. The default is FALSE. If the workflow is processing a file that has the same name as an existing file, the behavior is as follows: - If OverwriteExisting is TRUE, the existing file is replaced with the file being processed. - If OverwriteExisting is FALSE, nothing happens, and the workflow processing stops.
source_file_location (Optional[str]) – Specifies which file to use as input to the workflow step: either the output from the previous step, or the originally uploaded file for the workflow. - To use the previous file as the input, enter ${previous.file}. In this case, this workflow step uses the output file from the previous workflow step as input. This is the default value. - To use the originally uploaded file location as input for this step, enter ${original.file}.
- See:
- ExampleMetadata:
fixture=_generated
Example:
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk import aws_transfer as transfer

copy_step_details_property = transfer.CfnWorkflow.CopyStepDetailsProperty(
    destination_file_location=transfer.CfnWorkflow.S3FileLocationProperty(
        s3_file_location=transfer.CfnWorkflow.S3InputFileLocationProperty(
            bucket="bucket",
            key="key"
        )
    ),
    name="name",
    overwrite_existing="overwriteExisting",
    source_file_location="sourceFileLocation"
)
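As a hedged variant of the generated example, the sketch below parametrizes the destination prefix with ${Transfer:UserName}, as described for destination_file_location; the bucket name and step name are assumptions:
from aws_cdk import aws_transfer as transfer

copy_to_user_prefix = transfer.CfnWorkflow.CopyStepDetailsProperty(
    name="copy-to-user-prefix",
    destination_file_location=transfer.CfnWorkflow.S3FileLocationProperty(
        s3_file_location=transfer.CfnWorkflow.S3InputFileLocationProperty(
            bucket="my-archive-bucket",    # assumed bucket
            key="${Transfer:UserName}/"    # prefix resolves to the uploading user's name
        )
    ),
    overwrite_existing="FALSE",
    source_file_location="${original.file}"
)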
Attributes
- destination_file_location
Specifies the location for the file being copied.
Use ${Transfer:UserName} or ${Transfer:UploadDate} in this field to parametrize the destination prefix by username or uploaded date.
- Set the value of DestinationFileLocation to ${Transfer:UserName} to copy uploaded files to an Amazon S3 bucket that is prefixed with the name of the Transfer Family user that uploaded the file.
- Set the value of DestinationFileLocation to ${Transfer:UploadDate} to copy uploaded files to an Amazon S3 bucket that is prefixed with the date of the upload.
The system resolves UploadDate to a date format of YYYY-MM-DD, based on the date the file is uploaded in UTC.
- name
The name of the step, used as an identifier.
- overwrite_existing
A flag that indicates whether to overwrite an existing file of the same name. The default is FALSE.
If the workflow is processing a file that has the same name as an existing file, the behavior is as follows:
- If OverwriteExisting is TRUE, the existing file is replaced with the file being processed.
- If OverwriteExisting is FALSE, nothing happens, and the workflow processing stops.
- source_file_location
Specifies which file to use as input to the workflow step: either the output from the previous step, or the originally uploaded file for the workflow.
- To use the previous file as the input, enter ${previous.file}. In this case, this workflow step uses the output file from the previous workflow step as input. This is the default value.
- To use the originally uploaded file location as input for this step, enter ${original.file}.
- See:
CustomStepDetailsProperty
- class CfnWorkflow.CustomStepDetailsProperty(*, name=None, source_file_location=None, target=None, timeout_seconds=None)
Bases:
object
Details for a step that invokes an AWS Lambda function.
Consists of the Lambda function’s name, target, and timeout (in seconds).
- Parameters:
name (Optional[str]) – The name of the step, used as an identifier.
source_file_location (Optional[str]) – Specifies which file to use as input to the workflow step: either the output from the previous step, or the originally uploaded file for the workflow. - To use the previous file as the input, enter ${previous.file}. In this case, this workflow step uses the output file from the previous workflow step as input. This is the default value. - To use the originally uploaded file location as input for this step, enter ${original.file}.
target (Optional[str]) – The ARN for the Lambda function that is being called.
timeout_seconds (Union[int, float, None]) – Timeout, in seconds, for the step.
- See:
- ExampleMetadata:
fixture=_generated
Example:
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk import aws_transfer as transfer

custom_step_details_property = transfer.CfnWorkflow.CustomStepDetailsProperty(
    name="name",
    source_file_location="sourceFileLocation",
    target="target",
    timeout_seconds=123
)
Attributes
- name
The name of the step, used as an identifier.
- source_file_location
Specifies which file to use as input to the workflow step: either the output from the previous step, or the originally uploaded file for the workflow.
- To use the previous file as the input, enter ${previous.file}. In this case, this workflow step uses the output file from the previous workflow step as input. This is the default value.
- To use the originally uploaded file location as input for this step, enter ${original.file}.
- See:
- target
The ARN for the Lambda function that is being called.
- timeout_seconds
Timeout, in seconds, for the step.
DecryptStepDetailsProperty
- class CfnWorkflow.DecryptStepDetailsProperty(*, destination_file_location, type, name=None, overwrite_existing=None, source_file_location=None)
Bases:
object
Details for a step that decrypts an encrypted file.
Consists of the following values:
- A descriptive name
- An Amazon S3 or Amazon Elastic File System (Amazon EFS) location for the source file to decrypt.
- An S3 or Amazon EFS location for the destination of the file decryption.
- A flag that indicates whether to overwrite an existing file of the same name. The default is FALSE.
- The type of encryption that’s used. Currently, only PGP encryption is supported.
- Parameters:
destination_file_location (Union[IResolvable, InputFileLocationProperty, Dict[str, Any]]) – Specifies the location for the file being decrypted. Use ${Transfer:UserName} or ${Transfer:UploadDate} in this field to parametrize the destination prefix by username or uploaded date. - Set the value of DestinationFileLocation to ${Transfer:UserName} to decrypt uploaded files to an Amazon S3 bucket that is prefixed with the name of the Transfer Family user that uploaded the file. - Set the value of DestinationFileLocation to ${Transfer:UploadDate} to decrypt uploaded files to an Amazon S3 bucket that is prefixed with the date of the upload. The system resolves UploadDate to a date format of YYYY-MM-DD, based on the date the file is uploaded in UTC.
type (str) – The type of encryption used. Currently, this value must be PGP.
name (Optional[str]) – The name of the step, used as an identifier.
overwrite_existing (Optional[str]) – A flag that indicates whether to overwrite an existing file of the same name. The default is FALSE. If the workflow is processing a file that has the same name as an existing file, the behavior is as follows: - If OverwriteExisting is TRUE, the existing file is replaced with the file being processed. - If OverwriteExisting is FALSE, nothing happens, and the workflow processing stops.
source_file_location (Optional[str]) – Specifies which file to use as input to the workflow step: either the output from the previous step, or the originally uploaded file for the workflow. - To use the previous file as the input, enter ${previous.file}. In this case, this workflow step uses the output file from the previous workflow step as input. This is the default value. - To use the originally uploaded file location as input for this step, enter ${original.file}.
- See:
- ExampleMetadata:
fixture=_generated
Example:
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk import aws_transfer as transfer

decrypt_step_details_property = transfer.CfnWorkflow.DecryptStepDetailsProperty(
    destination_file_location=transfer.CfnWorkflow.InputFileLocationProperty(
        efs_file_location=transfer.CfnWorkflow.EfsInputFileLocationProperty(
            file_system_id="fileSystemId",
            path="path"
        ),
        s3_file_location=transfer.CfnWorkflow.S3InputFileLocationProperty(
            bucket="bucket",
            key="key"
        )
    ),
    type="type",

    # the properties below are optional
    name="name",
    overwrite_existing="overwriteExisting",
    source_file_location="sourceFileLocation"
)
Attributes
- destination_file_location
Specifies the location for the file being decrypted.
Use ${Transfer:UserName} or ${Transfer:UploadDate} in this field to parametrize the destination prefix by username or uploaded date.
- Set the value of DestinationFileLocation to ${Transfer:UserName} to decrypt uploaded files to an Amazon S3 bucket that is prefixed with the name of the Transfer Family user that uploaded the file.
- Set the value of DestinationFileLocation to ${Transfer:UploadDate} to decrypt uploaded files to an Amazon S3 bucket that is prefixed with the date of the upload.
The system resolves UploadDate to a date format of YYYY-MM-DD, based on the date the file is uploaded in UTC.
- name
The name of the step, used as an identifier.
- overwrite_existing
A flag that indicates whether to overwrite an existing file of the same name. The default is FALSE.
If the workflow is processing a file that has the same name as an existing file, the behavior is as follows:
- If OverwriteExisting is TRUE, the existing file is replaced with the file being processed.
- If OverwriteExisting is FALSE, nothing happens, and the workflow processing stops.
- source_file_location
Specifies which file to use as input to the workflow step: either the output from the previous step, or the originally uploaded file for the workflow.
- To use the previous file as the input, enter ${previous.file}. In this case, this workflow step uses the output file from the previous workflow step as input. This is the default value.
- To use the originally uploaded file location as input for this step, enter ${original.file}.
- See:
- type
The type of encryption used.
Currently, this value must be PGP.
DeleteStepDetailsProperty
- class CfnWorkflow.DeleteStepDetailsProperty(*, name=None, source_file_location=None)
Bases:
object
An object that contains the name and file location for a file being deleted by a workflow.
- Parameters:
name (Optional[str]) – The name of the step, used as an identifier.
source_file_location (Optional[str]) – Specifies which file to use as input to the workflow step: either the output from the previous step, or the originally uploaded file for the workflow. - To use the previous file as the input, enter ${previous.file}. In this case, this workflow step uses the output file from the previous workflow step as input. This is the default value. - To use the originally uploaded file location as input for this step, enter ${original.file}.
- See:
- ExampleMetadata:
fixture=_generated
Example:
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk import aws_transfer as transfer

delete_step_details_property = transfer.CfnWorkflow.DeleteStepDetailsProperty(
    name="name",
    source_file_location="sourceFileLocation"
)
Attributes
- name
The name of the step, used as an identifier.
- source_file_location
Specifies which file to use as input to the workflow step: either the output from the previous step, or the originally uploaded file for the workflow.
- To use the previous file as the input, enter ${previous.file}. In this case, this workflow step uses the output file from the previous workflow step as input. This is the default value.
- To use the originally uploaded file location as input for this step, enter ${original.file}.
- See:
EfsInputFileLocationProperty
- class CfnWorkflow.EfsInputFileLocationProperty(*, file_system_id=None, path=None)
Bases:
object
Specifies the Amazon EFS identifier and the path for the file being used.
- Parameters:
file_system_id (Optional[str]) – The identifier of the file system, assigned by Amazon EFS.
path (Optional[str]) – The pathname for the folder being used by a workflow.
- See:
- ExampleMetadata:
fixture=_generated
Example:
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk import aws_transfer as transfer

efs_input_file_location_property = transfer.CfnWorkflow.EfsInputFileLocationProperty(
    file_system_id="fileSystemId",
    path="path"
)
Attributes
- file_system_id
The identifier of the file system, assigned by Amazon EFS.
- path
The pathname for the folder being used by a workflow.
InputFileLocationProperty
- class CfnWorkflow.InputFileLocationProperty(*, efs_file_location=None, s3_file_location=None)
Bases:
object
Specifies the location for the file that’s being processed.
- Parameters:
efs_file_location (Union[IResolvable, EfsInputFileLocationProperty, Dict[str, Any], None]) – Specifies the details for the Amazon Elastic File System (Amazon EFS) file that’s being decrypted.
s3_file_location (Union[IResolvable, S3InputFileLocationProperty, Dict[str, Any], None]) – Specifies the details for the Amazon S3 file that’s being copied or decrypted.
- See:
- ExampleMetadata:
fixture=_generated
Example:
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk import aws_transfer as transfer

input_file_location_property = transfer.CfnWorkflow.InputFileLocationProperty(
    efs_file_location=transfer.CfnWorkflow.EfsInputFileLocationProperty(
        file_system_id="fileSystemId",
        path="path"
    ),
    s3_file_location=transfer.CfnWorkflow.S3InputFileLocationProperty(
        bucket="bucket",
        key="key"
    )
)
Attributes
- efs_file_location
Specifies the details for the Amazon Elastic File System (Amazon EFS) file that’s being decrypted.
- s3_file_location
Specifies the details for the Amazon S3 file that’s being copied or decrypted.
S3FileLocationProperty
- class CfnWorkflow.S3FileLocationProperty(*, s3_file_location=None)
Bases:
object
Specifies the S3 details for the file being used, such as bucket, ETag, and so forth.
- Parameters:
s3_file_location (Union[IResolvable, S3InputFileLocationProperty, Dict[str, Any], None]) – Specifies the details for the file location for the file that’s being used in the workflow. Only applicable if you are using Amazon S3 storage.
- See:
- ExampleMetadata:
fixture=_generated
Example:
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk import aws_transfer as transfer

s3_file_location_property = transfer.CfnWorkflow.S3FileLocationProperty(
    s3_file_location=transfer.CfnWorkflow.S3InputFileLocationProperty(
        bucket="bucket",
        key="key"
    )
)
Attributes
- s3_file_location
Specifies the details for the file location for the file that’s being used in the workflow.
Only applicable if you are using Amazon S3 storage.
S3InputFileLocationProperty
- class CfnWorkflow.S3InputFileLocationProperty(*, bucket=None, key=None)
Bases:
object
Specifies the details for the Amazon S3 location for an input file to a workflow.
- Parameters:
bucket (Optional[str]) – Specifies the S3 bucket for the customer input file.
key (Optional[str]) – The name assigned to the file when it was created in Amazon S3. You use the object key to retrieve the object.
- See:
- ExampleMetadata:
fixture=_generated
Example:
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk import aws_transfer as transfer

s3_input_file_location_property = transfer.CfnWorkflow.S3InputFileLocationProperty(
    bucket="bucket",
    key="key"
)
Attributes
- bucket
Specifies the S3 bucket for the customer input file.
- key
The name assigned to the file when it was created in Amazon S3.
You use the object key to retrieve the object.
S3TagProperty
- class CfnWorkflow.S3TagProperty(*, key, value)
Bases:
object
Specifies a key-value pair that is assigned to a file during the execution of a Tagging step.
- Parameters:
key (str) – The name assigned to the tag that you create.
value (str) – The value that corresponds to the key.
- See:
- ExampleMetadata:
fixture=_generated
Example:
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk import aws_transfer as transfer

s3_tag_property = transfer.CfnWorkflow.S3TagProperty(
    key="key",
    value="value"
)
Attributes
- key
The name assigned to the tag that you create.
- value
The value that corresponds to the key.
WorkflowStepProperty
- class CfnWorkflow.WorkflowStepProperty(*, copy_step_details=None, custom_step_details=None, decrypt_step_details=None, delete_step_details=None, tag_step_details=None, type=None)
Bases:
object
The basic building block of a workflow.
- Parameters:
copy_step_details (Any) – Details for a step that performs a file copy. Consists of the following values: - A description - An Amazon S3 location for the destination of the file copy. - A flag that indicates whether to overwrite an existing file of the same name. The default is FALSE.
custom_step_details (Any) – Details for a step that invokes an AWS Lambda function. Consists of the Lambda function’s name, target, and timeout (in seconds).
decrypt_step_details (Union[IResolvable, DecryptStepDetailsProperty, Dict[str, Any], None]) – Details for a step that decrypts an encrypted file. Consists of the following values: - A descriptive name - An Amazon S3 or Amazon Elastic File System (Amazon EFS) location for the source file to decrypt. - An S3 or Amazon EFS location for the destination of the file decryption. - A flag that indicates whether to overwrite an existing file of the same name. The default is FALSE. - The type of encryption that’s used. Currently, only PGP encryption is supported.
delete_step_details (Any) – Details for a step that deletes the file.
tag_step_details (Any) – Details for a step that creates one or more tags. You specify one or more tags. Each tag contains a key-value pair.
type (Optional[str]) – Currently, the following step types are supported. - COPY - Copy the file to another location. - CUSTOM - Perform a custom step with an AWS Lambda function target. - DECRYPT - Decrypt a file that was encrypted before it was uploaded. - DELETE - Delete the file. - TAG - Add a tag to the file.
- See:
- ExampleMetadata:
fixture=_generated
Example:
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk import aws_transfer as transfer

# copy_step_details: Any
# custom_step_details: Any
# delete_step_details: Any
# tag_step_details: Any

workflow_step_property = transfer.CfnWorkflow.WorkflowStepProperty(
    copy_step_details=copy_step_details,
    custom_step_details=custom_step_details,
    decrypt_step_details=transfer.CfnWorkflow.DecryptStepDetailsProperty(
        destination_file_location=transfer.CfnWorkflow.InputFileLocationProperty(
            efs_file_location=transfer.CfnWorkflow.EfsInputFileLocationProperty(
                file_system_id="fileSystemId",
                path="path"
            ),
            s3_file_location=transfer.CfnWorkflow.S3InputFileLocationProperty(
                bucket="bucket",
                key="key"
            )
        ),
        type="type",

        # the properties below are optional
        name="name",
        overwrite_existing="overwriteExisting",
        source_file_location="sourceFileLocation"
    ),
    delete_step_details=delete_step_details,
    tag_step_details=tag_step_details,
    type="type"
)
Attributes
- copy_step_details
Details for a step that performs a file copy.
Consists of the following values:
- A description
- An Amazon S3 location for the destination of the file copy.
- A flag that indicates whether to overwrite an existing file of the same name. The default is FALSE.
- custom_step_details
Details for a step that invokes an AWS Lambda function.
Consists of the Lambda function’s name, target, and timeout (in seconds).
- decrypt_step_details
Details for a step that decrypts an encrypted file.
Consists of the following values:
- A descriptive name
- An Amazon S3 or Amazon Elastic File System (Amazon EFS) location for the source file to decrypt.
- An S3 or Amazon EFS location for the destination of the file decryption.
- A flag that indicates whether to overwrite an existing file of the same name. The default is FALSE.
- The type of encryption that’s used. Currently, only PGP encryption is supported.
- delete_step_details
Details for a step that deletes the file.
- tag_step_details
Details for a step that creates one or more tags.
You specify one or more tags. Each tag contains a key-value pair.
- type
Currently, the following step types are supported.
COPY - Copy the file to another location.
CUSTOM - Perform a custom step with an AWS Lambda function target.
DECRYPT - Decrypt a file that was encrypted before it was uploaded.
DELETE - Delete the file.
TAG - Add a tag to the file.
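A hedged sketch pairing the type value with its matching details field. Because copy_step_details is typed Any in this reference, the raw mapping below uses CloudFormation-style key casing; that shape, the bucket name, and the step name are assumptions:
from aws_cdk import aws_transfer as transfer

# type="COPY" pairs with copy_step_details; the untyped details pass through
# to the template as-is, so CloudFormation-style keys are assumed here.
copy_step = transfer.CfnWorkflow.WorkflowStepProperty(
    type="COPY",
    copy_step_details={
        "Name": "archive-upload",
        "DestinationFileLocation": {
            "S3FileLocation": {
                "Bucket": "my-archive-bucket",     # assumed bucket
                "Key": "${Transfer:UploadDate}/"
            }
        },
        "OverwriteExisting": "FALSE",
        "SourceFileLocation": "${original.file}"
    }
)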