CfnApplicationProps
- class aws_cdk.aws_kinesisanalytics.CfnApplicationProps(*, inputs, application_code=None, application_description=None, application_name=None)
Bases:
object
Properties for defining a CfnApplication.
- Parameters:
inputs (Union[IResolvable, Sequence[Union[InputProperty, Dict[str, Any], IResolvable]]]) – Use this parameter to configure the application input. You can configure your application to receive input from a single streaming source. In this configuration, you map this streaming source to an in-application stream that is created. Your application code can then query the in-application stream like a table (you can think of it as a constantly updating table). For the streaming source, you provide its Amazon Resource Name (ARN) and the format of data on the stream (for example, JSON or CSV). You also must provide an IAM role that Amazon Kinesis Analytics can assume to read this stream on your behalf. To create the in-application stream, you need to specify a schema to transform your data into a schematized version used in SQL. In the schema, you provide the necessary mapping of the data elements in the streaming source to record columns in the in-app stream.
application_code (Optional[str]) – One or more SQL statements that read input data, transform it, and generate output. For example, you can write a SQL statement that reads data from one in-application stream, generates a running average of the number of advertisement clicks by vendor, and inserts the resulting rows in another in-application stream using pumps. For more information about the typical pattern, see Application Code. You can provide a series of such SQL statements, where the output of one statement can be used as the input for the next statement. You store intermediate results by creating in-application streams and pumps. Note that the application code must create the streams with the names specified in the Outputs. For example, if your Outputs defines output streams named ExampleOutputStream1 and ExampleOutputStream2, then your application code must create these streams.
application_description (Optional[str]) – Summary description of the application.
application_name (Optional[str]) – Name of your Amazon Kinesis Analytics application (for example, sample-app).
- ExampleMetadata:
fixture=_generated
Example:
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
import aws_cdk.aws_kinesisanalytics as kinesisanalytics

cfn_application_props = kinesisanalytics.CfnApplicationProps(
    inputs=[kinesisanalytics.CfnApplication.InputProperty(
        input_schema=kinesisanalytics.CfnApplication.InputSchemaProperty(
            record_columns=[kinesisanalytics.CfnApplication.RecordColumnProperty(
                name="name",
                sql_type="sqlType",

                # the properties below are optional
                mapping="mapping"
            )],
            record_format=kinesisanalytics.CfnApplication.RecordFormatProperty(
                record_format_type="recordFormatType",

                # the properties below are optional
                mapping_parameters=kinesisanalytics.CfnApplication.MappingParametersProperty(
                    csv_mapping_parameters=kinesisanalytics.CfnApplication.CSVMappingParametersProperty(
                        record_column_delimiter="recordColumnDelimiter",
                        record_row_delimiter="recordRowDelimiter"
                    ),
                    json_mapping_parameters=kinesisanalytics.CfnApplication.JSONMappingParametersProperty(
                        record_row_path="recordRowPath"
                    )
                )
            ),

            # the properties below are optional
            record_encoding="recordEncoding"
        ),
        name_prefix="namePrefix",

        # the properties below are optional
        input_parallelism=kinesisanalytics.CfnApplication.InputParallelismProperty(
            count=123
        ),
        input_processing_configuration=kinesisanalytics.CfnApplication.InputProcessingConfigurationProperty(
            input_lambda_processor=kinesisanalytics.CfnApplication.InputLambdaProcessorProperty(
                resource_arn="resourceArn",
                role_arn="roleArn"
            )
        ),
        kinesis_firehose_input=kinesisanalytics.CfnApplication.KinesisFirehoseInputProperty(
            resource_arn="resourceArn",
            role_arn="roleArn"
        ),
        kinesis_streams_input=kinesisanalytics.CfnApplication.KinesisStreamsInputProperty(
            resource_arn="resourceArn",
            role_arn="roleArn"
        )
    )],

    # the properties below are optional
    application_code="applicationCode",
    application_description="applicationDescription",
    application_name="applicationName"
)
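In Python, the same keyword arguments are usually passed directly to CfnApplication rather than through a separate props object. The sketch below wires a single Kinesis stream input with a JSON schema; the scope (self), stream ARN, role ARN, and column names are placeholder assumptions, not values from this reference.

import aws_cdk.aws_kinesisanalytics as kinesisanalytics

# `self` is assumed to be an existing Stack or Construct in your app.
kinesisanalytics.CfnApplication(self, "ExampleApplication",
    application_name="sample-app",
    application_description="Reads a Kinesis stream and exposes it as an in-application stream",
    inputs=[kinesisanalytics.CfnApplication.InputProperty(
        # Kinesis Analytics derives in-application stream names from this prefix.
        name_prefix="SOURCE_SQL_STREAM",
        kinesis_streams_input=kinesisanalytics.CfnApplication.KinesisStreamsInputProperty(
            resource_arn="arn:aws:kinesis:us-east-1:111122223333:stream/example-stream",  # placeholder
            role_arn="arn:aws:iam::111122223333:role/example-analytics-role"  # placeholder
        ),
        input_schema=kinesisanalytics.CfnApplication.InputSchemaProperty(
            record_format=kinesisanalytics.CfnApplication.RecordFormatProperty(
                record_format_type="JSON",
                mapping_parameters=kinesisanalytics.CfnApplication.MappingParametersProperty(
                    json_mapping_parameters=kinesisanalytics.CfnApplication.JSONMappingParametersProperty(
                        record_row_path="$"
                    )
                )
            ),
            record_columns=[kinesisanalytics.CfnApplication.RecordColumnProperty(
                name="ticker_symbol", sql_type="VARCHAR(4)", mapping="$.ticker_symbol"
            )]
        )
    )]
)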
Attributes
- application_code
One or more SQL statements that read input data, transform it, and generate output.
For example, you can write a SQL statement that reads data from one in-application stream, generates a running average of the number of advertisement clicks by vendor, and inserts the resulting rows in another in-application stream using pumps. For more information about the typical pattern, see Application Code.
You can provide a series of such SQL statements, where the output of one statement can be used as the input for the next statement. You store intermediate results by creating in-application streams and pumps.
Note that the application code must create the streams with the names specified in the Outputs. For example, if your Outputs defines output streams named ExampleOutputStream1 and ExampleOutputStream2, then your application code must create these streams.
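The code typically declares one in-application stream per configured output and a pump that populates it. A minimal sketch of an application_code value follows; the stream name, columns, and window are illustrative assumptions, the destination stream name must match the one declared in your Outputs, and the selected columns must exist in the input schema.

# Illustrative application_code value; identifiers here are assumptions.
application_code = """
CREATE OR REPLACE STREAM "ExampleOutputStream1" (ticker_symbol VARCHAR(4), click_count INTEGER);

CREATE OR REPLACE PUMP "EXAMPLE_PUMP" AS
    INSERT INTO "ExampleOutputStream1"
        SELECT STREAM ticker_symbol, COUNT(*) AS click_count
        FROM "SOURCE_SQL_STREAM_001"
        GROUP BY ticker_symbol,
                 STEP("SOURCE_SQL_STREAM_001".ROWTIME BY INTERVAL '60' SECOND);
"""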
- application_description
Summary description of the application.
- application_name
Name of your Amazon Kinesis Analytics application (for example, sample-app).
- inputs
Use this parameter to configure the application input.
You can configure your application to receive input from a single streaming source. In this configuration, you map this streaming source to an in-application stream that is created. Your application code can then query the in-application stream like a table (you can think of it as a constantly updating table).
For the streaming source, you provide its Amazon Resource Name (ARN) and the format of data on the stream (for example, JSON or CSV). You also must provide an IAM role that Amazon Kinesis Analytics can assume to read this stream on your behalf.
To create the in-application stream, you need to specify a schema to transform your data into a schematized version used in SQL. In the schema, you provide the necessary mapping of the data elements in the streaming source to record columns in the in-app stream.
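For a CSV source, the mapping is positional: record columns are declared in the order the fields appear in each record, and the delimiters are supplied through CSVMappingParametersProperty. A minimal schema sketch follows; the column names and SQL types are illustrative assumptions.

import aws_cdk.aws_kinesisanalytics as kinesisanalytics

# Illustrative schema for a CSV stream of "vendor,clicks" records.
csv_schema = kinesisanalytics.CfnApplication.InputSchemaProperty(
    record_format=kinesisanalytics.CfnApplication.RecordFormatProperty(
        record_format_type="CSV",
        mapping_parameters=kinesisanalytics.CfnApplication.MappingParametersProperty(
            csv_mapping_parameters=kinesisanalytics.CfnApplication.CSVMappingParametersProperty(
                record_column_delimiter=",",
                record_row_delimiter="\n"
            )
        )
    ),
    record_columns=[
        kinesisanalytics.CfnApplication.RecordColumnProperty(name="vendor", sql_type="VARCHAR(32)"),
        kinesisanalytics.CfnApplication.RecordColumnProperty(name="clicks", sql_type="INTEGER")
    ]
)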