

After careful consideration, we have decided to discontinue Amazon Kinesis Data Analytics for SQL applications:

1. Starting **September 1, 2025**, we will not provide bug fixes for Amazon Kinesis Data Analytics for SQL applications, because support will be limited ahead of the discontinuation.

2. Starting **October 15, 2025**, you will not be able to create new Kinesis Data Analytics for SQL applications.

3. Starting **January 27, 2026**, we will delete your applications. You will not be able to start or operate your Amazon Kinesis Data Analytics for SQL applications, and support will no longer be available from that time. For more information, see [Amazon Kinesis Data Analytics for SQL Applications discontinuation](discontinuation.md).

# Step 3: Create Your Starter Amazon Kinesis Data Analytics Application
<a name="get-started-exercise"></a>

By following the steps in this section, you can create your first Kinesis Data Analytics application using the console. 

**Note**  
We suggest that you review [Amazon Kinesis Data Analytics for SQL Applications: How It Works](how-it-works.md) before trying the Getting Started exercise.

For this Getting Started exercise, you can use the console to work with either the demo stream or templates with application code.
+ If you choose to use the demo stream, the console creates a Kinesis data stream in your account that is called `kinesis-analytics-demo-stream`.

  A Kinesis Data Analytics application requires a streaming source. For this source, several SQL examples in this guide use the demo stream `kinesis-analytics-demo-stream`. The console also runs a script that continuously adds sample data (simulated stock trade records) to this stream, as shown following.  
![\[Formatted stream sample table showing stock symbols, sectors, and prices.\]](http://docs.aws.amazon.com/kinesisanalytics/latest/dev/images/gs-v2-30.png)

  You can use `kinesis-analytics-demo-stream` as the streaming source for your application in this exercise.
**Note**  
The demo stream remains in your account. You can use it to test other examples in this guide. However, when you leave the console, the script that the console uses stops populating the data. When needed, the console provides the option to start populating the stream again. 
+ If you choose to use the templates with example application code, you use template code that the console provides to perform simple analytics on the demo stream. 
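The script that populates the demo stream isn't published, but conceptually it emits JSON records shaped like the simulated stock trade records shown above. The following Python sketch is a hypothetical approximation of that shape; the field names match the demo stream's columns, while the ticker lists and value ranges are assumptions for illustration:

```python
import json
import random

# Hypothetical approximation of the console's demo-stream script.
# Field names match the demo stream's columns; the ticker lists and
# value ranges below are assumptions, not the actual script's values.
SECTORS = ["TECHNOLOGY", "RETAIL", "ENERGY", "HEALTHCARE", "FINANCIAL"]
TICKERS = {"TECHNOLOGY": ["AMZN", "INTC", "MSFT"], "RETAIL": ["TGT", "WMT"],
           "ENERGY": ["XOM"], "HEALTHCARE": ["JNJ"], "FINANCIAL": ["JPM"]}

def make_trade_record(rng=random):
    """Build one simulated stock trade record."""
    sector = rng.choice(SECTORS)
    return {
        "TICKER_SYMBOL": rng.choice(TICKERS[sector]),
        "SECTOR": sector,
        "CHANGE": round(rng.uniform(-5.0, 5.0), 2),
        "PRICE": round(rng.uniform(10.0, 200.0), 2),
    }

# Each record would be serialized as a JSON blob before being put on
# the kinesis-analytics-demo-stream Kinesis data stream.
record = make_trade_record()
payload = json.dumps(record)
```

Because Kinesis Data Analytics infers the input schema from sampled records, keeping the field names and types stable, as this generator does, is what makes schema discovery work on the demo stream.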

You use these features to quickly set up your first application as follows:

1. **Create an application** – You only need to provide a name. The console creates the application and the service sets the application state to `READY`.

    

1. **Configure input** – First, you add a streaming source, the demo stream. (You must create the demo stream in the console before you can use it.) Then, the console takes a random sample of records from the demo stream and infers a schema for the in-application input stream that it creates. The console names the in-application stream `SOURCE_SQL_STREAM_001`.

   The console uses the discovery API to infer the schema. If necessary, you can edit the inferred schema. For more information, see [DiscoverInputSchema](API_DiscoverInputSchema.md). Kinesis Data Analytics uses this schema to create an in-application stream.

    

   When you start the application, Kinesis Data Analytics reads the demo stream continuously on your behalf and inserts rows in the `SOURCE_SQL_STREAM_001` in-application input stream. 

    

1. **Specify application code** – You use a template (called **Continuous filter**) that provides the following code:

   ```
   CREATE OR REPLACE STREAM "DESTINATION_SQL_STREAM" 
     (ticker_symbol VARCHAR(4), sector VARCHAR(12), change DOUBLE, price DOUBLE);
    
   -- Create pump to insert into output. 
   CREATE OR REPLACE PUMP "STREAM_PUMP" AS 
      INSERT INTO "DESTINATION_SQL_STREAM"  
         SELECT STREAM ticker_symbol, sector, change, price
         FROM "SOURCE_SQL_STREAM_001"
         WHERE sector SIMILAR TO '%TECH%';
   ```

   The application code queries the in-application stream `SOURCE_SQL_STREAM_001`. The code then inserts the resulting rows into another in-application stream, `DESTINATION_SQL_STREAM`, using a pump. For more information about this coding pattern, see [Application Code](how-it-works-app-code.md). 

   For information about the SQL language elements that are supported by Kinesis Data Analytics, see [Amazon Kinesis Data Analytics SQL Reference](https://docs.aws.amazon.com/kinesisanalytics/latest/sqlref/analytics-sql-reference.html).

    

1. **Configure output** – In this exercise, you don't configure any output. That is, you don't persist data from the in-application stream that your application creates to any external destination. Instead, you verify the query results in the console. Additional examples in this guide show how to configure output. For one example, see [Example: Creating Simple Alerts](app-simple-alerts.md).
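The **Continuous filter** template in step 3 is just a predicate applied to every row as it streams by. As a rough offline illustration (plain Python standing in for the SQL engine, with made-up sample rows), the same `WHERE sector SIMILAR TO '%TECH%'` logic looks like this:

```python
# Offline emulation of the Continuous filter template's logic:
# keep only rows whose sector contains "TECH", projecting the same
# four columns the template selects. Sample rows are made up.
def continuous_filter(rows):
    """SELECT ticker_symbol, sector, change, price WHERE sector contains TECH."""
    return [
        {k: row[k] for k in ("ticker_symbol", "sector", "change", "price")}
        for row in rows
        if "TECH" in row["sector"]  # sector SIMILAR TO '%TECH%'
    ]

sample = [
    {"ticker_symbol": "AMZN", "sector": "TECHNOLOGY", "change": 1.5, "price": 120.0},
    {"ticker_symbol": "TGT", "sector": "RETAIL", "change": -0.3, "price": 95.2},
    {"ticker_symbol": "INTC", "sector": "TECHNOLOGY", "change": 0.7, "price": 44.1},
]
filtered = continuous_filter(sample)
```

The difference in the real service is that the pump evaluates this predicate continuously against rows arriving on `SOURCE_SQL_STREAM_001`, rather than once over a finite list.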

   



**Important**  
The exercise uses the US East (N. Virginia) Region (us-east-1) to set up the application. You can use any of the supported AWS Regions.

**Next Step**  
[Step 3.1: Create an Application](get-started-create-app.md)

# Step 3.1: Create an Application
<a name="get-started-create-app"></a>

In this section, you create an Amazon Kinesis Data Analytics application. You configure application input in the next step.

**To create a data analytics application**

1. Sign in to the AWS Management Console and open the Managed Service for Apache Flink console at [https://console.aws.amazon.com/kinesisanalytics](https://console.aws.amazon.com/kinesisanalytics).

1. Choose **Create application**.

1. On the **Create application** page, type an application name, type a description, choose **SQL** for the application's **Runtime** setting, and then choose **Create application**.  
![\[Screenshot of New application page with application name and description.\]](http://docs.aws.amazon.com/kinesisanalytics/latest/dev/images/gs-v2-10.png)

   Doing this creates a Kinesis Data Analytics application with a status of `READY`. The console shows the application hub, where you can configure input and output.
**Note**  
To create an application, the [CreateApplication](API_CreateApplication.md) operation requires only the application name. You can add input and output configuration after you create an application in the console.

   

   In the next step, you configure input for the application. In the input configuration, you add a streaming data source to the application and discover a schema for an in-application input stream by sampling data on the streaming source.

**Next Step**  
[Step 3.2: Configure Input](get-started-configure-input.md)

# Step 3.2: Configure Input
<a name="get-started-configure-input"></a>

Your application needs a streaming source. To help you get started, the console can create a demo stream (called `kinesis-analytics-demo-stream`). The console also runs a script that populates records in the stream.

**To add a streaming source to your application**

1. On the application hub page in the console, choose **Connect streaming data**.  
![\[Screenshot of the example app and the connect to a source button.\]](http://docs.aws.amazon.com/kinesisanalytics/latest/dev/images/gs-v2-20.png)

1. On the page that appears, review the following:
   + **Source** section, where you specify a streaming source for your application. You can select an existing stream source or create one. In this exercise, you create a new stream, the demo stream. 

      

     By default, the console names the in-application input stream that it creates `SOURCE_SQL_STREAM_001`. For this exercise, keep this name as it appears.

      
     + **Stream reference name** – This option shows the name of the in-application input stream that is created, `SOURCE_SQL_STREAM_001`. You can change the name, but for this exercise, keep this name.

        

       In the input configuration, you map the demo stream to the in-application input stream that is created. When you start the application, Amazon Kinesis Data Analytics continuously reads the demo stream and inserts rows in the in-application input stream. You query this in-application input stream in your application code. 

        
     + **Record pre-processing with AWS Lambda** – This option is where you specify an AWS Lambda function that modifies the records in the input stream before your application code executes. In this exercise, leave the **Disabled** option selected. For more information about Lambda preprocessing, see [Preprocessing Data Using a Lambda Function](lambda-preprocessing.md).

   After you provide all the information on this page, the console sends an update request (see [UpdateApplication](API_UpdateApplication.md)) to add the input configuration to the application. 

1. On the **Source** page, choose **Configure a new stream**.

1. Choose **Create demo stream**. The console configures the application input by doing the following:
   + The console creates a Kinesis data stream called `kinesis-analytics-demo-stream`. 
   + The console populates the stream with sample stock ticker data.
   + Using the [DiscoverInputSchema](API_DiscoverInputSchema.md) API action, the console infers a schema by reading sample records on the stream. The inferred schema is the schema for the in-application input stream that is created. For more information, see [Configuring Application Input](how-it-works-input.md).
   + The console shows the inferred schema and the sample data it read from the streaming source to infer the schema.

   The console displays the sample records on the streaming source.  
![\[Formatted stream sample tab showing stock symbols, sectors, and prices in tabular format.\]](http://docs.aws.amazon.com/kinesisanalytics/latest/dev/images/gs-v2-30.png)

   The following appear on the **Stream sample** console page:
   + The **Raw stream sample** tab shows the raw stream records sampled by the [DiscoverInputSchema](API_DiscoverInputSchema.md) API action to infer the schema.
   + The **Formatted stream sample** tab shows the tabular version of the data in the **Raw stream sample** tab.
   + If you choose **Edit schema**, you can edit the inferred schema. For this exercise, don't change the inferred schema. For more information about editing a schema, see [Working with the Schema Editor](console-summary-edit-schema.md).

     If you choose **Rediscover schema**, you can request the console to run [DiscoverInputSchema](API_DiscoverInputSchema.md) again and infer the schema. 

     

   

1. Choose **Save and continue**.

   You now have an application with input configuration added to it. In the next step, you add SQL code to perform analytics on the data in the in-application input stream.
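The schema discovery that the console performed in this procedure boils down to sampling records and mapping each field to a SQL type. The following simplified Python sketch shows the general idea only; the real [DiscoverInputSchema](API_DiscoverInputSchema.md) rules are more involved, and this type mapping is an assumption for illustration:

```python
import json

# Simplified illustration of schema inference: parse sampled JSON
# records, then map each field to a SQL type based on the sampled
# values. The real DiscoverInputSchema rules are more involved.
def infer_sql_type(values):
    if all(isinstance(v, bool) for v in values):
        return "BOOLEAN"
    if all(isinstance(v, int) and not isinstance(v, bool) for v in values):
        return "INTEGER"
    if all(isinstance(v, (int, float)) and not isinstance(v, bool) for v in values):
        return "DOUBLE"
    # Fall back to VARCHAR sized to the widest sampled value.
    return f"VARCHAR({max(len(str(v)) for v in values)})"

def infer_schema(json_records):
    rows = [json.loads(r) for r in json_records]
    return {field: infer_sql_type([row[field] for row in rows])
            for field in rows[0]}

samples = [
    '{"TICKER_SYMBOL": "AMZN", "SECTOR": "TECHNOLOGY", "CHANGE": 1.5, "PRICE": 120.0}',
    '{"TICKER_SYMBOL": "TGT", "SECTOR": "RETAIL", "CHANGE": -0.3, "PRICE": 95.2}',
]
schema = infer_schema(samples)
```

This is also why **Rediscover schema** can produce a different result: a different random sample can change the widths and types the inference settles on, which is when editing the schema by hand becomes useful.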

**Next Step**  
[Step 3.3: Add Real-Time Analytics (Add Application Code)](get-started-add-realtime-analytics.md)

# Step 3.3: Add Real-Time Analytics (Add Application Code)
<a name="get-started-add-realtime-analytics"></a>

You can write your own SQL queries against the in-application stream, but for the following step you use one of the templates that provides sample code.

1. On the application hub page, choose **Go to SQL editor**.   
![\[Screenshot of the example application page with Go to SQL editor button.\]](http://docs.aws.amazon.com/kinesisanalytics/latest/dev/images/gs-v2-40.png)

1. In the **Would you like to start running "ExampleApp"?** dialog box, choose **Yes, start application**.

   The console sends a request to start the application (see [StartApplication](API_StartApplication.md)), and then the SQL editor page appears.

   

1. Review the SQL editor page, including the buttons (**Add SQL from templates**, **Save and run SQL**) and the various tabs.

1. In the SQL editor, choose **Add SQL from templates**.

1. From the available template list, choose **Continuous filter**. The sample code reads data from one in-application stream (the `WHERE` clause filters the rows) and inserts it into another in-application stream, as follows:
   + It creates the in-application stream `DESTINATION_SQL_STREAM`.
   + It creates a pump `STREAM_PUMP`, and uses it to select rows from `SOURCE_SQL_STREAM_001` and insert them in the `DESTINATION_SQL_STREAM`. 

   

1. Choose **Add this SQL to editor**. 

1. Test the application code as follows:

   Remember, you already started the application (status is RUNNING). Therefore, Amazon Kinesis Data Analytics is already continuously reading from the streaming source and adding rows to the in-application stream `SOURCE_SQL_STREAM_001`.

   1. In the SQL editor, choose **Save and run SQL**. The console first sends an update request to save the application code. Then the code executes continuously.

   1. You can see the results in the **Real-time analytics** tab.   
![\[Screenshot of the SQL editor with results shown in the real-time analytics tab.\]](http://docs.aws.amazon.com/kinesisanalytics/latest/dev/images/gs-v2-50.png)

      The SQL editor has the following tabs:
      + The **Source data** tab shows an in-application input stream that is mapped to the streaming source. Choose the in-application stream, and you can see data coming in. Note the additional columns in the in-application input stream that weren't specified in the input configuration. These include the following timestamp columns:

         
        + **ROWTIME** – Each row in an in-application stream has a special column called `ROWTIME`. This column is the timestamp when Amazon Kinesis Data Analytics inserted the row in the first in-application stream (the in-application input stream that is mapped to the streaming source).

           
        + **Approximate_Arrival_Time** – Each record on the streaming source includes a value called `Approximate_Arrival_Time`. This value is the approximate arrival timestamp that is set when the streaming source successfully receives and stores the record. When Kinesis Data Analytics reads records from a streaming source, it fetches this column into the in-application input stream. 

        These timestamp values are useful in windowed queries that are time-based. For more information, see [Windowed Queries](windowed-sql.md).

         
      + The **Real-time analytics** tab shows all the other in-application streams created by your application code. It also includes the error stream. Kinesis Data Analytics sends any rows it cannot process to the error stream. For more information, see [Error Handling](error-handling.md).

         

        Choose `DESTINATION_SQL_STREAM` to view the rows your application code inserted. Note the additional columns that your application code didn't create. These columns include the `ROWTIME` timestamp column. Kinesis Data Analytics simply copies these values from the source (`SOURCE_SQL_STREAM_001`).

         
      + The **Destination** tab shows the external destination where Kinesis Data Analytics writes the query results. You haven't configured any external destination for your application output yet.
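The `ROWTIME` column described above is what makes time-based windowed queries possible. As a rough offline illustration (plain Python standing in for the SQL engine, with made-up epoch-second timestamps), a one-minute tumbling window groups rows by their `ROWTIME` truncated to the minute and aggregates within each group:

```python
from collections import Counter

# Rough offline illustration of a one-minute tumbling window:
# group rows by ROWTIME truncated to the window boundary and count
# them. In Kinesis Data Analytics SQL this corresponds to grouping
# by STEP("SOURCE_SQL_STREAM_001".ROWTIME BY INTERVAL '60' SECOND).
def tumbling_window_counts(rows, window_seconds=60):
    counts = Counter()
    for row in rows:
        window_start = (row["ROWTIME"] // window_seconds) * window_seconds
        counts[window_start] += 1
    return dict(counts)

rows = [
    {"ROWTIME": 0, "ticker_symbol": "AMZN"},
    {"ROWTIME": 15, "ticker_symbol": "TGT"},
    {"ROWTIME": 75, "ticker_symbol": "AMZN"},
]
per_window = tumbling_window_counts(rows)
```

The real engine emits each window's aggregate as soon as the window closes rather than holding all rows in memory; see [Windowed Queries](windowed-sql.md) for the SQL forms.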

      

**Next Step**  
[Step 3.4: (Optional) Update the Application Code](get-started-update-appcode.md)

# Step 3.4: (Optional) Update the Application Code
<a name="get-started-update-appcode"></a>

In this step, you explore how to update the application code. 

**To update application code**

1. Create another in-application stream as follows:
   + Create another in-application stream called `DESTINATION_SQL_STREAM_2`.
   + Create a pump, and then use it to insert rows in the newly created stream by selecting rows from the `DESTINATION_SQL_STREAM`.

   In the SQL editor, append the following code to the existing application code:

   ```
   CREATE OR REPLACE STREAM "DESTINATION_SQL_STREAM_2" 
              (ticker_symbol VARCHAR(4), 
               change        DOUBLE, 
               price         DOUBLE);
   
   CREATE OR REPLACE PUMP "STREAM_PUMP_2" AS 
      INSERT INTO "DESTINATION_SQL_STREAM_2"
         SELECT STREAM ticker_symbol, change, price 
         FROM   "DESTINATION_SQL_STREAM";
   ```

   Save and run the code. Additional in-application streams appear on the **Real-time analytics** tab.

1. Create two in-application streams. Filter rows in the `SOURCE_SQL_STREAM_001` based on the stock ticker, and then insert them into these separate streams. 

   Append the following SQL statements to your application code:

   ```
   CREATE OR REPLACE STREAM "AMZN_STREAM" 
              (ticker_symbol VARCHAR(4), 
               change        DOUBLE, 
               price         DOUBLE);
   
   CREATE OR REPLACE PUMP "AMZN_PUMP" AS 
      INSERT INTO "AMZN_STREAM"
         SELECT STREAM ticker_symbol, change, price 
         FROM   "SOURCE_SQL_STREAM_001"
         WHERE  ticker_symbol SIMILAR TO '%AMZN%';
   
   CREATE OR REPLACE STREAM "TGT_STREAM" 
              (ticker_symbol VARCHAR(4), 
               change        DOUBLE, 
               price         DOUBLE);
   
   CREATE OR REPLACE PUMP "TGT_PUMP" AS 
      INSERT INTO "TGT_STREAM"
         SELECT STREAM ticker_symbol, change, price 
         FROM   "SOURCE_SQL_STREAM_001"
         WHERE  ticker_symbol SIMILAR TO '%TGT%';
   ```

   Save and run the code. Notice the additional in-application streams on the **Real-time analytics** tab.
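The two pumps above fan the source stream out by ticker: each pump copies the rows that match its `SIMILAR TO` predicate into its own stream. As an offline Python sketch of the same routing logic (an illustration only, with made-up rows; the service evaluates the pumps continuously):

```python
# Offline sketch of the AMZN_PUMP / TGT_PUMP routing logic: each
# pump copies matching rows into its own in-application stream.
def route_by_ticker(rows, tickers=("AMZN", "TGT")):
    streams = {t: [] for t in tickers}
    for row in rows:
        for ticker in tickers:
            if ticker in row["ticker_symbol"]:  # SIMILAR TO '%AMZN%' etc.
                streams[ticker].append(row)
    return streams

rows = [
    {"ticker_symbol": "AMZN", "change": 1.2, "price": 130.0},
    {"ticker_symbol": "TGT", "change": -0.5, "price": 98.0},
    {"ticker_symbol": "MSFT", "change": 0.9, "price": 310.0},
]
streams = route_by_ticker(rows)
```

Note that rows matching neither predicate (MSFT here) are simply not copied anywhere, and a row matching several predicates would be copied into every matching stream.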

You now have your first working Amazon Kinesis Data Analytics application. In this exercise, you did the following: 
+ Created your first Kinesis Data Analytics application.

   
+ Configured application input that identified the demo stream as the streaming source and mapped it to an in-application stream (`SOURCE_SQL_STREAM_001`) that the console created. Kinesis Data Analytics continuously reads the demo stream and inserts records in the in-application stream.

   
+ Added application code that queried the `SOURCE_SQL_STREAM_001` and wrote output to another in-application stream called `DESTINATION_SQL_STREAM`. 



Now you can optionally configure application output to write records in the `DESTINATION_SQL_STREAM` to an external destination. For this exercise, this step is optional. To learn how to configure the destination, go to the next step.

**Next Step**  
[Step 4 (Optional) Edit the Schema and SQL Code Using the Console](console-feature-summary.md)