

For similar capabilities to Amazon Timestream for LiveAnalytics, consider Amazon Timestream for InfluxDB. It offers simplified data ingestion and single-digit millisecond query response times for real-time analytics. Learn more [here](https://docs.aws.amazon.com//timestream/latest/developerguide/timestream-for-influxdb.html).

# Accessing Timestream for LiveAnalytics
<a name="accessing"></a>

You can access Timestream for LiveAnalytics using the console, the AWS CLI, or the API. For details, review the following topics:

**Topics**
+ [Sign up for an AWS account](#sign-up-for-aws)
+ [Create a user with administrative access](#create-an-admin)
+ [Provide Timestream for LiveAnalytics access](#getting-started.prereqs.iam-user)
+ [Grant programmatic access](#programmatic-access)
+ [Using the console](console_timestream.md)
+ [Accessing Amazon Timestream for LiveAnalytics using the AWS CLI](Tools.CLI.md)
+ [Using the API](Using.API.md)
+ [Using the AWS SDKs](getting-started-sdks.md)

## Sign up for an AWS account
<a name="sign-up-for-aws"></a>

If you do not have an AWS account, complete the following steps to create one.

**To sign up for an AWS account**

1. Open [https://portal.aws.amazon.com/billing/signup](https://portal.aws.amazon.com/billing/signup).

1. Follow the online instructions.

   Part of the sign-up procedure involves receiving a phone call or text message and entering a verification code on the phone keypad.

   When you sign up for an AWS account, an *AWS account root user* is created. The root user has access to all AWS services and resources in the account. As a security best practice, assign administrative access to a user, and use only the root user to perform [tasks that require root user access](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_root-user.html#root-user-tasks).

AWS sends you a confirmation email after the sign-up process is complete. At any time, you can view your current account activity and manage your account by going to [https://aws.amazon.com/](https://aws.amazon.com/) and choosing **My Account**.

## Create a user with administrative access
<a name="create-an-admin"></a>

After you sign up for an AWS account, secure your AWS account root user, enable AWS IAM Identity Center, and create an administrative user so that you don't use the root user for everyday tasks.

**Secure your AWS account root user**

1.  Sign in to the [AWS Management Console](https://console.aws.amazon.com/) as the account owner by choosing **Root user** and entering your AWS account email address. On the next page, enter your password.

   For help signing in by using the root user, see [Signing in as the root user](https://docs.aws.amazon.com/signin/latest/userguide/console-sign-in-tutorials.html#introduction-to-root-user-sign-in-tutorial) in the *AWS Sign-In User Guide*.

1. Turn on multi-factor authentication (MFA) for your root user.

   For instructions, see [Enable a virtual MFA device for your AWS account root user (console)](https://docs.aws.amazon.com/IAM/latest/UserGuide/enable-virt-mfa-for-root.html) in the *IAM User Guide*.

**Create a user with administrative access**

1. Enable IAM Identity Center.

   For instructions, see [Enabling AWS IAM Identity Center](https://docs.aws.amazon.com//singlesignon/latest/userguide/get-set-up-for-idc.html) in the *AWS IAM Identity Center User Guide*.

1. In IAM Identity Center, grant administrative access to a user.

   For a tutorial about using the IAM Identity Center directory as your identity source, see [Configure user access with the default IAM Identity Center directory](https://docs.aws.amazon.com//singlesignon/latest/userguide/quick-start-default-idc.html) in the *AWS IAM Identity Center User Guide*.

**Sign in as the user with administrative access**
+ To sign in with your IAM Identity Center user, use the sign-in URL that was sent to your email address when you created the IAM Identity Center user.

  For help signing in using an IAM Identity Center user, see [Signing in to the AWS access portal](https://docs.aws.amazon.com/signin/latest/userguide/iam-id-center-sign-in-tutorial.html) in the *AWS Sign-In User Guide*.

**Assign access to additional users**

1. In IAM Identity Center, create a permission set that follows the best practice of applying least-privilege permissions.

   For instructions, see [Create a permission set](https://docs.aws.amazon.com//singlesignon/latest/userguide/get-started-create-a-permission-set.html) in the *AWS IAM Identity Center User Guide*.

1. Assign users to a group, and then assign single sign-on access to the group.

   For instructions, see [Add groups](https://docs.aws.amazon.com//singlesignon/latest/userguide/addgroups.html) in the *AWS IAM Identity Center User Guide*.

## Provide Timestream for LiveAnalytics access
<a name="getting-started.prereqs.iam-user"></a>

The permissions required to access Timestream for LiveAnalytics are already granted to the administrator. For other users, grant Timestream for LiveAnalytics access using the following policy:

------
#### [ JSON ]


```
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "timestream:*",
        "kms:DescribeKey",
        "kms:CreateGrant",
        "kms:Decrypt",
        "dbqms:CreateFavoriteQuery",
        "dbqms:DescribeFavoriteQueries",
        "dbqms:UpdateFavoriteQuery",
        "dbqms:DeleteFavoriteQueries",
        "dbqms:GetQueryString",
        "dbqms:CreateQueryHistory",
        "dbqms:UpdateQueryHistory",
        "dbqms:DeleteQueryHistory",
        "dbqms:DescribeQueryHistory",
        "s3:ListAllMyBuckets"
      ],
      "Resource": "*"
    }
  ]
}
```

------

**Note**  
For information about `dbqms`, see [Actions, resources, and condition keys for Database Query Metadata Service](https://docs.aws.amazon.com/service-authorization/latest/reference/list_databasequerymetadataservice.html). For information about `kms`, see [Actions, resources, and condition keys for AWS Key Management Service](https://docs.aws.amazon.com/service-authorization/latest/reference/list_awskeymanagementservice.html).
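If you manage permissions as code, the same policy document can be built programmatically before creating it in IAM. A minimal sketch in Python using only the standard library; the boto3 `create_policy` call is shown commented out because it requires credentials, and the policy name is hypothetical:

```python
import json

# The same permissions as the policy above, expressed as a Python dict.
timestream_access_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "timestream:*",
                "kms:DescribeKey",
                "kms:CreateGrant",
                "kms:Decrypt",
                "dbqms:CreateFavoriteQuery",
                "dbqms:DescribeFavoriteQueries",
                "dbqms:UpdateFavoriteQuery",
                "dbqms:DeleteFavoriteQueries",
                "dbqms:GetQueryString",
                "dbqms:CreateQueryHistory",
                "dbqms:UpdateQueryHistory",
                "dbqms:DeleteQueryHistory",
                "dbqms:DescribeQueryHistory",
                "s3:ListAllMyBuckets",
            ],
            "Resource": "*",
        }
    ],
}

policy_document = json.dumps(timestream_access_policy, indent=2)

# With boto3 installed and credentials configured, the policy could then be created:
# import boto3
# iam = boto3.client("iam")
# iam.create_policy(
#     PolicyName="TimestreamLiveAnalyticsAccess",  # hypothetical name
#     PolicyDocument=policy_document,
# )
```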

## Grant programmatic access
<a name="programmatic-access"></a>

Users need programmatic access if they want to interact with AWS outside of the AWS Management Console. The way to grant programmatic access depends on the type of user that's accessing AWS.

To grant users programmatic access, choose one of the following options.



| Which user needs programmatic access? | To | By | 
| --- | --- | --- | 
| IAM | (Recommended) Use console credentials as temporary credentials to sign programmatic requests to the AWS CLI, AWS SDKs, or AWS APIs. |  Following the instructions for the interface that you want to use. [\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/timestream/latest/developerguide/accessing.html)  | 
|  Workforce identity (Users managed in IAM Identity Center)  | Use temporary credentials to sign programmatic requests to the AWS CLI, AWS SDKs, or AWS APIs. |  Following the instructions for the interface that you want to use. [\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/timestream/latest/developerguide/accessing.html)  | 
| IAM | Use temporary credentials to sign programmatic requests to the AWS CLI, AWS SDKs, or AWS APIs. | Following the instructions in [Using temporary credentials with AWS resources](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_credentials_temp_use-resources.html) in the IAM User Guide. | 
| IAM | (Not recommended) Use long-term credentials to sign programmatic requests to the AWS CLI, AWS SDKs, or AWS APIs. |  Following the instructions for the interface that you want to use. [\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/timestream/latest/developerguide/accessing.html)  | 

# Using the console
<a name="console_timestream"></a>

You can use the AWS Management Console for Timestream for LiveAnalytics to create, edit, delete, describe, and list databases and tables. You can also use the console to run queries.

**Topics**
+ [Tutorial](#console_timestream.db-w-sample-data)
+ [Create a database](#console_timestream.db.using-console)
+ [Create a table](#console_timestream.table.using-console)
+ [Run a query](#console_timestream.queries.using-console)
+ [Create a scheduled query](#console_timestream.scheduledquery.using-console)
+ [Delete a scheduled query](#console_timestream.scheduledquerydeletedisable.using-console)
+ [Delete a table](#console_timestream.delete-table.using-console)
+ [Delete a database](#console_timestream.delete-db.using-console)
+ [Edit a table](#console_timestream.edit-table.using-console)
+ [Edit a database](#console_timestream.edit-db.using-console)

## Tutorial
<a name="console_timestream.db-w-sample-data"></a>

This tutorial shows you how to create a database populated with sample datasets and run sample queries. The sample datasets used in this tutorial are frequently seen in IoT and DevOps scenarios. The IoT dataset contains time series data such as the speed, location, and load of a truck, to streamline fleet management and identify optimization opportunities. The DevOps dataset contains EC2 instance metrics such as CPU, network, and memory utilization to improve application performance and availability. A [video tutorial](https://www.youtube.com/watch?v=YBWCGDd4ChQ) covering the instructions in this section is also available.

Follow these steps to create a database populated with the sample datasets and run sample queries using the AWS Console.

1. Open the [AWS Console](https://console.aws.amazon.com/timestream).

1. In the navigation pane, choose **Databases**.

1. Click on **Create database**.

1. On the create database page, enter the following:
   + **Choose configuration**—Select **Sample database**.
   + **Name**—Enter a database name of your choice.
   + **Choose sample datasets**—Select **IoT** and **DevOps**.
   + Click on **Create database** to create a database containing two tables, IoT and DevOps, populated with sample data.

1. In the navigation pane, choose **Query editor**.

1. Select **Sample queries** from the top menu.

1. Click on one of the sample queries. This takes you back to the query editor, populated with the sample query.

1. Click **Run** to run the query and see query results.

## Create a database
<a name="console_timestream.db.using-console"></a>

Follow these steps to create a database using the AWS Console.

1. Open the [AWS Console](https://console.aws.amazon.com/timestream).

1. In the navigation pane, choose **Databases**.

1. Click on **Create database**.

1. On the create database page, enter the following.
   + **Choose configuration**—Select **Standard database**.
   + **Name**—Enter a database name of your choice.
   + **Encryption**—Choose a KMS key, or use the default option, where Timestream for LiveAnalytics creates a KMS key in your account if one does not already exist.

1. Click on **Create database**.

## Create a table
<a name="console_timestream.table.using-console"></a>

Follow these steps to create a table using the AWS Console.

1. Open the [AWS Console](https://console.aws.amazon.com/timestream).

1. In the navigation pane, choose **Tables**.

1. Click on **Create table**.

1. On the create table page, enter the following.
   + **Database name**—Select the name of the database created in [Create a database](#console_timestream.db.using-console).
   + **Table name**—Enter a table name of your choice.
   + **Memory store retention**—Specify how long you want to retain data in the memory store. The memory store processes incoming data, including late arriving data (data with a timestamp earlier than the current time) and is optimized for fast point-in-time queries.
   + **Magnetic store retention**—Specify how long you want to retain data in the magnetic store. The magnetic store is meant for long term storage and is optimized for fast analytical queries.

1.  Click on **Create table**.
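The console fields above map directly to the Write API's `CreateTable` parameters. A sketch of the request in Python, assuming hypothetical database and table names and illustrative retention values; the boto3 call is commented out because it requires an AWS account:

```python
# Retention is expressed in hours for the memory store and in days for the
# magnetic store (the values here are illustrative).
create_table_params = {
    "DatabaseName": "mydb",    # hypothetical name
    "TableName": "mytable",    # hypothetical name
    "RetentionProperties": {
        "MemoryStoreRetentionPeriodInHours": 24,
        "MagneticStoreRetentionPeriodInDays": 365,
    },
}

# With boto3 installed and credentials configured:
# import boto3
# write_client = boto3.client("timestream-write")
# write_client.create_table(**create_table_params)
```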

## Run a query
<a name="console_timestream.queries.using-console"></a>

Follow these steps to run queries using the AWS Console.

1. Open the [AWS Console](https://console.aws.amazon.com/timestream).

1. In the navigation pane, choose **Query editor**.

1. In the left pane, select the database created in [Create a database](#console_timestream.db.using-console).

1. In the left pane, select the table created in [Create a table](#console_timestream.table.using-console).

1. In the query editor, you can run a query. To see the latest 10 rows in the table, run: 

   ```
   SELECT * FROM <database_name>.<table_name> ORDER BY time DESC LIMIT 10
   ```

1. (Optional) Turn on **Enable Insights** to get insights about the efficiency of your queries. 
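The same query can be issued outside the console with the Query SDK. A sketch using boto3's `query` paginator, with the call commented out because it requires an AWS account; the database and table names are hypothetical:

```python
# Placeholders for the database and table created earlier (hypothetical names).
database_name = "mydb"
table_name = "mytable"

# Latest 10 rows, matching the console example above.
query_string = (
    f'SELECT * FROM "{database_name}"."{table_name}" '
    "ORDER BY time DESC LIMIT 10"
)

# With boto3 installed and credentials configured:
# import boto3
# query_client = boto3.client("timestream-query")
# for page in query_client.get_paginator("query").paginate(QueryString=query_string):
#     for row in page["Rows"]:
#         print(row)
```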

## Create a scheduled query
<a name="console_timestream.scheduledquery.using-console"></a>

Follow these steps to create a scheduled query using the AWS Console.

1. Open the [AWS Console](https://console.aws.amazon.com/timestream).

1. In the navigation pane, choose **Scheduled queries**.

1. Click on **Create scheduled query**.

1. In the **Query Name** and **Destination Table** sections, enter the following.
   + **Name**—Enter a query name.
   + **Database name**—Select the name of the database created in [Create a database](#console_timestream.db.using-console).
   + **Table name**—Select the name of the table created in [Create a table](#console_timestream.table.using-console).

1. In the **Query Statement** section, enter a valid query statement. Then click **Validate query**.

1. From **Destination table model**, define the model for any undefined attributes. You can use **Visual builder** or JSON.

1. In the **Run schedule** section, choose **Fixed rate** or **Cron expression**. For details on schedule expressions, see [Schedule Expressions for Scheduled Queries](https://docs.aws.amazon.com/timestream/latest/developerguide/scheduledqueries-schedule.html).

1. In the **SNS topic** section, enter the SNS topic that will be used for notifications.

1. In the **Error log report** section, enter the S3 location that will be used to report errors.

   Choose the **Encryption key type**.

1. In the **Security settings** section, from **AWS KMS key**, choose the type of AWS KMS key.

   Enter the **IAM role** that Timestream for LiveAnalytics will use to run the scheduled query. Refer to the [IAM policy examples for scheduled queries](https://docs.aws.amazon.com/timestream/latest/developerguide/security_iam_id-based-policy-examples.html#security_iam_id-based-policy-examples-sheduledqueries) for details on the required permissions and trust relationship for the role.

1.  Click **Create scheduled query**.
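For reference, the two schedule types map to expressions of the following shape (values are illustrative; see the schedule expression documentation for exact syntax):

```
rate(5 minutes)
cron(0/15 * * * ? *)
```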

## Delete a scheduled query
<a name="console_timestream.scheduledquerydeletedisable.using-console"></a>

Follow these steps to delete or disable a scheduled query using the AWS Console.

1. Open the [AWS Console](https://console.aws.amazon.com/timestream).

1. In the navigation pane, choose **Scheduled queries**.

1. Select the scheduled query created in [Create a scheduled query](#console_timestream.scheduledquery.using-console).

1. Select **Actions**.

1. Choose **Disable** or **Delete**.

1. If you selected **Delete**, confirm the action and select **Delete**.

## Delete a table
<a name="console_timestream.delete-table.using-console"></a>

Follow these steps to delete a table using the AWS Console.

1. Open the [AWS Console](https://console.aws.amazon.com/timestream).

1. In the navigation pane, choose **Tables**.

1. Select the table that you created in [Create a table](#console_timestream.table.using-console).

1. Click **Delete**.

1. Type *delete* in the confirmation box.

## Delete a database
<a name="console_timestream.delete-db.using-console"></a>

Follow these steps to delete a database using the AWS Console.

1. Open the [AWS Console](https://console.aws.amazon.com/timestream).

1. In the navigation pane, choose **Databases**.

1. Select the database that you created in [Create a database](#console_timestream.db.using-console).

1. Click **Delete**.

1. Type *delete* in the confirmation box.

## Edit a table
<a name="console_timestream.edit-table.using-console"></a>

Follow these steps to edit a table using the AWS Console.

1. Open the [AWS Console](https://console.aws.amazon.com/timestream).

1. In the navigation pane, choose **Tables**.

1. Select the table that you created in [Create a table](#console_timestream.table.using-console).

1. Click **Edit**.

1. Edit the table details and save.
   + **Memory store retention**—Specify how long you want to retain data in the memory store. The memory store processes incoming data, including late arriving data (data with a timestamp earlier than the current time) and is optimized for fast point-in-time queries.
   + **Magnetic store retention**—Specify how long you want to retain data in the magnetic store. The magnetic store is meant for long term storage and is optimized for fast analytical queries.

## Edit a database
<a name="console_timestream.edit-db.using-console"></a>

Follow these steps to edit a database using the AWS Console.

1. Open the [AWS Console](https://console.aws.amazon.com/timestream).

1. In the navigation pane, choose **Databases**.

1. Select the database that you created in [Create a database](#console_timestream.db.using-console).

1. Click **Edit**.

1. Edit the database details and save.

# Accessing Amazon Timestream for LiveAnalytics using the AWS CLI
<a name="Tools.CLI"></a>

 You can use the AWS Command Line Interface (AWS CLI) to control multiple AWS services from the command line and automate them through scripts. You can use the AWS CLI for ad hoc operations. You can also use it to embed Amazon Timestream for LiveAnalytics operations within utility scripts.

 Before you can use the AWS CLI with Timestream for LiveAnalytics, you must set up programmatic access. For more information, see [Grant programmatic access](accessing.md#programmatic-access). 

For a complete listing of all the commands available for the Timestream for LiveAnalytics Query API in the AWS CLI, see the [AWS CLI Command Reference](https://docs.aws.amazon.com/cli/latest/reference/timestream-query/index.html).

For a complete listing of all the commands available for the Timestream for LiveAnalytics Write API in the AWS CLI, see the [AWS CLI Command Reference](https://docs.aws.amazon.com/cli/latest/reference/timestream-write/index.html).

**Topics**
+ [Downloading and configuring the AWS CLI](#Tools.CLI.DownloadingAndRunning)
+ [Using the AWS CLI with Timestream for LiveAnalytics](#Tools.CLI.UsingWithQLDB)

## Downloading and configuring the AWS CLI
<a name="Tools.CLI.DownloadingAndRunning"></a>

The AWS CLI runs on Windows, macOS, or Linux. To download, install, and configure it, follow these steps:

1. Download the AWS CLI at [http://aws.amazon.com/cli](https://aws.amazon.com/cli).

1. Follow the instructions for [Installing the AWS CLI](https://docs.aws.amazon.com/cli/latest/userguide/installing.html) and [Configuring the AWS CLI](https://docs.aws.amazon.com/cli/latest/userguide/cli-chap-getting-started.html) in the *AWS Command Line Interface User Guide*.

## Using the AWS CLI with Timestream for LiveAnalytics
<a name="Tools.CLI.UsingWithQLDB"></a>

The command line format consists of an Amazon Timestream for LiveAnalytics operation name, followed by the parameters for that operation. The AWS CLI supports a shorthand syntax for the parameter values, in addition to JSON.

 Use `help` to list all available commands in Timestream for LiveAnalytics. For example: 

```
aws timestream-write help
```

```
aws timestream-query help
```

 You can also use `help` to describe a specific command and learn more about its usage: 

```
aws timestream-write create-database help
```

 For example, to create a database: 

```
aws timestream-write create-database --database-name myFirstDatabase
```

 To create a table with magnetic store writes enabled: 

```
aws timestream-write create-table \
--database-name metricsdb \
--table-name metrics \
--magnetic-store-write-properties "{\"EnableMagneticStoreWrites\": true}"
```

To write data using single-measure records:

```
aws timestream-write write-records \
--database-name metricsdb \
--table-name metrics \
--common-attributes "{\"Dimensions\":[{\"Name\":\"asset_id\", \"Value\":\"100\"}], \"Time\":\"1631051324000\",\"TimeUnit\":\"MILLISECONDS\"}" \
--records "[{\"MeasureName\":\"temperature\", \"MeasureValueType\":\"DOUBLE\",\"MeasureValue\":\"30\"},{\"MeasureName\":\"windspeed\", \"MeasureValueType\":\"DOUBLE\",\"MeasureValue\":\"7\"},{\"MeasureName\":\"humidity\", \"MeasureValueType\":\"DOUBLE\",\"MeasureValue\":\"15\"},{\"MeasureName\":\"brightness\", \"MeasureValueType\":\"DOUBLE\",\"MeasureValue\":\"17\"}]"
```
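The escaped JSON above is easier to read when the payload is built as a data structure. A sketch of the same single-measure write in Python, with the boto3 call commented out because it requires an AWS account; names and values mirror the CLI example:

```python
# Attributes shared by all records in this write.
common_attributes = {
    "Dimensions": [{"Name": "asset_id", "Value": "100"}],
    "Time": "1631051324000",
    "TimeUnit": "MILLISECONDS",
}

# One single-measure record per measure.
measures = {"temperature": "30", "windspeed": "7", "humidity": "15", "brightness": "17"}
records = [
    {"MeasureName": name, "MeasureValueType": "DOUBLE", "MeasureValue": value}
    for name, value in measures.items()
]

# With boto3 installed and credentials configured:
# import boto3
# boto3.client("timestream-write").write_records(
#     DatabaseName="metricsdb",
#     TableName="metrics",
#     CommonAttributes=common_attributes,
#     Records=records,
# )
```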

To write data using multi-measure records:

```
# wide model helper method to create Multi-measure records
function ingest_multi_measure_records {
  epoch=`date +%s`
  epoch+=$i

  # multi-measure records
  aws timestream-write write-records \
  --database-name $src_db_wide \
  --table-name $src_tbl_wide \
  --common-attributes "{\"Dimensions\":[{\"Name\":\"device_id\", \
              \"Value\":\"12345678\"},\
            {\"Name\":\"device_type\", \"Value\":\"iPhone\"}, \
            {\"Name\":\"os_version\", \"Value\":\"14.8\"}, \
            {\"Name\":\"region\", \"Value\":\"us-east-1\"} ], \
            \"Time\":\"$epoch\",\"TimeUnit\":\"MILLISECONDS\"}" \
--records "[{\"MeasureName\":\"video_metrics\", \"MeasureValueType\":\"MULTI\", \
  \"MeasureValues\": \
  [{\"Name\":\"video_startup_time\",\"Value\":\"0\",\"Type\":\"BIGINT\"}, \
  {\"Name\":\"rebuffering_ratio\",\"Value\":\"0.5\",\"Type\":\"DOUBLE\"}, \
  {\"Name\":\"video_playback_failures\",\"Value\":\"0\",\"Type\":\"BIGINT\"}, \
  {\"Name\":\"average_frame_rate\",\"Value\":\"0.5\",\"Type\":\"DOUBLE\"}]}]" \
--endpoint-url $ingest_endpoint \
  --region  $region
}

# create 5 records
for i in {100..105};
  do ingest_multi_measure_records $i;
done
```
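The multi-measure record in the script above can likewise be built as a data structure. A sketch in Python with the boto3 call commented out; the dimension values are taken from the CLI example, and the database and table names are hypothetical:

```python
import time

# Dimensions shared by all records in this write.
common_attributes = {
    "Dimensions": [
        {"Name": "device_id", "Value": "12345678"},
        {"Name": "device_type", "Value": "iPhone"},
        {"Name": "os_version", "Value": "14.8"},
        {"Name": "region", "Value": "us-east-1"},
    ],
    "Time": str(int(time.time() * 1000)),
    "TimeUnit": "MILLISECONDS",
}

# A single multi-measure record groups several measures under one measure name.
records = [{
    "MeasureName": "video_metrics",
    "MeasureValueType": "MULTI",
    "MeasureValues": [
        {"Name": "video_startup_time", "Value": "0", "Type": "BIGINT"},
        {"Name": "rebuffering_ratio", "Value": "0.5", "Type": "DOUBLE"},
        {"Name": "video_playback_failures", "Value": "0", "Type": "BIGINT"},
        {"Name": "average_frame_rate", "Value": "0.5", "Type": "DOUBLE"},
    ],
}]

# With boto3 installed and credentials configured:
# import boto3
# boto3.client("timestream-write").write_records(
#     DatabaseName="metricsdb",   # hypothetical name
#     TableName="metrics",        # hypothetical name
#     CommonAttributes=common_attributes,
#     Records=records,
# )
```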

To query a table: 

```
aws timestream-query query \
--query-string "SELECT time, device_id, device_type, os_version, \
region, video_startup_time, rebuffering_ratio, video_playback_failures, \
average_frame_rate \
FROM metricsdb.metrics \
WHERE time >= ago(15m)"
```

To create a scheduled query: 

```
aws timestream-query create-scheduled-query \
  --name scheduled_query_name \
  --query-string "select bin(time, 1m) as time, \
          avg(measure_value::double) as avg_cpu, min(measure_value::double) as min_cpu, region \
          from $src_db.$src_tbl where measure_name = 'cpu' \
          and time BETWEEN @scheduled_runtime - (interval '5' minute)  AND @scheduled_runtime \
          group by region, bin(time, 1m)" \
  --schedule-configuration "{\"ScheduleExpression\":\"$cron_exp\"}" \
  --notification-configuration "{\"SnsConfiguration\":{\"TopicArn\":\"$sns_topic_arn\"}}" \
  --scheduled-query-execution-role-arn "arn:aws:iam::452360119086:role/TimestreamSQExecutionRole" \
  --target-configuration "{\"TimestreamConfiguration\":{\
          \"DatabaseName\": \"$dest_db\",\
          \"TableName\": \"$dest_tbl\",\
          \"TimeColumn\":\"time\",\
          \"DimensionMappings\":[{\
            \"Name\": \"region\", \"DimensionValueType\": \"VARCHAR\"
          }],\
          \"MultiMeasureMappings\":{\
            \"TargetMultiMeasureName\": \"mma_name\",
            \"MultiMeasureAttributeMappings\":[{\
              \"SourceColumn\": \"avg_cpu\", \"MeasureValueType\": \"DOUBLE\", \"TargetMultiMeasureAttributeName\": \"target_avg_cpu\"
            },\
            { \
              \"SourceColumn\": \"min_cpu\", \"MeasureValueType\": \"DOUBLE\", \"TargetMultiMeasureAttributeName\": \"target_min_cpu\"
            }] \
          }\
          }}" \
  --error-report-configuration "{\"S3Configuration\": {\
        \"BucketName\": \"$s3_err_bucket\",\
        \"ObjectKeyPrefix\": \"scherrors\",\
        \"EncryptionOption\": \"SSE_S3\"\
        }\
      }"
```

# Using the API
<a name="Using.API"></a>

 In addition to the [SDKs](getting-started-sdks.md), Amazon Timestream for LiveAnalytics provides direct REST API access via the *endpoint discovery pattern*. The endpoint discovery pattern is described below, along with its use cases. 

## The endpoint discovery pattern
<a name="Using-API.endpoint-discovery"></a>

Because the Timestream for LiveAnalytics SDKs are designed to work transparently with the service's architecture, including the management and mapping of the service endpoints, we recommend that you use the SDKs for most applications. However, there are a few instances where using the Timestream for LiveAnalytics REST API endpoint discovery pattern is necessary: 
+ You are using [VPC endpoints (AWS PrivateLink) with Timestream for LiveAnalytics](VPCEndpoints.md)
+ Your application uses a programming language that does not yet have SDK support
+ You require better control over the client-side implementation

This section includes information on how the endpoint discovery pattern works, how to implement the endpoint discovery pattern, and usage notes. Select a topic below to learn more. 

**Topics**
+ [The endpoint discovery pattern](#Using-API.endpoint-discovery)
+ [How the endpoint discovery pattern works](Using-API.endpoint-discovery.how-it-works.md)
+ [Implementing the endpoint discovery pattern](Using-API.endpoint-discovery.describe-endpoints.implementation.md)

# How the endpoint discovery pattern works
<a name="Using-API.endpoint-discovery.how-it-works"></a>

Timestream is built using a [cellular architecture](architecture.md#cells) to ensure better scaling and traffic isolation. Because each customer account is mapped to a specific cell in a Region, your application must use the cell-specific endpoints to which your account has been mapped. When you use the SDKs, this mapping is handled transparently and you do not need to manage the cell-specific endpoints. However, when directly accessing the REST API, you need to manage and map the correct endpoints yourself. This process, the *endpoint discovery pattern*, is described below: 

1.  The endpoint discovery pattern starts with a call to the `DescribeEndpoints` action, described in the [API Reference](https://docs.aws.amazon.com/timestream/latest/developerguide/API_Reference.html). 

1.  The endpoint should be cached and reused for the amount of time specified by the returned time-to-live (TTL) value (the [CachePeriodInMinutes](https://docs.aws.amazon.com/timestream/latest/developerguide/API_Endpoint.html#timestream-Type-Endpoint-CachePeriodInMinutes) value). Calls to the Timestream for LiveAnalytics API can then be made for the duration of the TTL. 

1.  After the TTL expires, make a new call to `DescribeEndpoints` to refresh the endpoint (in other words, start over at step 1). 

**Note**  
Syntax, parameters, and other usage information for the `DescribeEndpoints` action are described in the [API Reference](https://docs.aws.amazon.com/timestream/latest/developerguide/API_DescribeEndpoints.html). Note that the `DescribeEndpoints` action is available in both the Write and Query SDKs, and is identical in each. 

For implementation of the endpoint discovery pattern, see [Implementing the endpoint discovery pattern](Using-API.endpoint-discovery.describe-endpoints.implementation.md).

# Implementing the endpoint discovery pattern
<a name="Using-API.endpoint-discovery.describe-endpoints.implementation"></a>

 To implement the endpoint discovery pattern, choose an API (Write or Query), create a **DescribeEndpoints** request, and use the returned endpoint(s) for the duration of the returned TTL value(s). The implementation procedure is described below. 

**Note**  
Ensure you are familiar with the [usage notes](#Using-API.endpoint-discovery.describe-endpoints.usage-notes).

## Implementation procedure
<a name="Using-API.endpoint-discovery.describe-endpoints.implementation.procedure"></a>

1.  Acquire the endpoint for the API you would like to make calls against ([Write](https://docs.aws.amazon.com/timestream/latest/developerguide/API_Operations_Amazon_Timestream_Write.html) or [Query](https://docs.aws.amazon.com/timestream/latest/developerguide/API_Operations_Amazon_Timestream_Query.html)) using the [DescribeEndpoints](https://docs.aws.amazon.com/timestream/latest/developerguide/API_DescribeEndpoints.html) request. 

   1. Create a [DescribeEndpoints](https://docs.aws.amazon.com/timestream/latest/developerguide/API_DescribeEndpoints.html) request that corresponds to the API of interest ([Write](https://docs.aws.amazon.com/timestream/latest/developerguide/API_Operations_Amazon_Timestream_Write.html) or [Query](https://docs.aws.amazon.com/timestream/latest/developerguide/API_Operations_Amazon_Timestream_Query.html)) using one of the two endpoints described below. There are no input parameters for the request. Ensure that you read the notes below.

      *Write SDK:*

      ```
      ingest.timestream.<region>.amazonaws.com
      ```

      *Query SDK:*

      ```
      query.timestream.<region>.amazonaws.com
      ```

      An example CLI call for region `us-east-1` follows.

      ```
      REGION_ENDPOINT="https://query.timestream.us-east-1.amazonaws.com"
      REGION=us-east-1
      aws timestream-query describe-endpoints \
      --endpoint-url $REGION_ENDPOINT \
      --region $REGION
      ```
**Note**  
 The HTTP "Host" header *must* also contain the API endpoint. The request will fail if the header is not populated. This is a standard requirement for all HTTP/1.1 requests. If you use an HTTP library supporting 1.1 or later, the HTTP library should automatically populate the header for you.
**Note**  
Substitute *<region>* with the identifier of the AWS Region in which the request is being made, for example, `us-east-1`.

   1. Parse the response to extract the endpoint(s) and cache the TTL value(s). The response is an array of one or more [`Endpoint` objects](https://docs.aws.amazon.com/timestream/latest/developerguide/API_Endpoint.html). Each `Endpoint` object contains an endpoint address (`Address`) and the TTL for that endpoint (`CachePeriodInMinutes`). 

1.  Cache the endpoint for up to the specified TTL. 

1.  When the TTL expires, retrieve a new endpoint by starting over at step 1 of the Implementation. 
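The steps above can be sketched as a small cache that wraps any `DescribeEndpoints`-style callable and refreshes the endpoint only after the TTL elapses. The discovery call here is a stand-in function, not a real SDK call, and the address it returns is hypothetical:

```python
import time

class EndpointCache:
    """Caches a discovered endpoint until its TTL expires (steps 1-3 above)."""

    def __init__(self, describe_endpoints, clock=time.monotonic):
        # describe_endpoints: any callable returning a DescribeEndpoints-style
        # response: {"Endpoints": [{"Address": ..., "CachePeriodInMinutes": ...}]}
        self._describe = describe_endpoints
        self._clock = clock
        self._address = None
        self._expires_at = 0.0

    def endpoint(self):
        now = self._clock()
        if self._address is None or now >= self._expires_at:
            first = self._describe()["Endpoints"][0]
            self._address = first["Address"]
            self._expires_at = now + first["CachePeriodInMinutes"] * 60
        return self._address

# Stand-in for the real DescribeEndpoints call; the address is hypothetical.
calls = {"count": 0}
def fake_describe_endpoints():
    calls["count"] += 1
    return {"Endpoints": [{"Address": "cell0.timestream.us-east-1.example.com",
                           "CachePeriodInMinutes": 1440}]}

cache = EndpointCache(fake_describe_endpoints)
address = cache.endpoint()   # first call triggers discovery
address = cache.endpoint()   # served from the cache until the TTL expires
```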

## Usage notes for the endpoint discovery pattern
<a name="Using-API.endpoint-discovery.describe-endpoints.usage-notes"></a>
+ The **DescribeEndpoints** action is the only action that the Timestream for LiveAnalytics regional endpoints recognize.
+ The response contains a list of endpoints against which to make Timestream for LiveAnalytics API calls.
+ A successful response contains at least one endpoint. If the list contains more than one endpoint, all of them are equally usable, and the caller may choose one at random.
+ In addition to the DNS address of the endpoint, each endpoint in the list specifies a time to live (TTL), in minutes, for which the endpoint may be used.
+ Cache and reuse the endpoint for the amount of time specified by the returned TTL value (in minutes). After the TTL expires, the endpoint no longer works, so make a new call to **DescribeEndpoints** to refresh the endpoint to use.

# Using the AWS SDKs
<a name="getting-started-sdks"></a>

You can access Amazon Timestream using the AWS SDKs. Timestream supports two SDKs per language: the Write SDK and the Query SDK. The Write SDK is used to perform CRUD operations and to insert your time series data into Timestream. The Query SDK is used to query your existing time series data stored in Timestream.

Once you've completed the necessary prerequisites for your SDK of choice, you can get started with the [Code samples](code-samples.md).
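
As an illustration of this Write/Query split, the following Python sketch uses Boto3's `timestream-write` and `timestream-query` clients. The database and table names are hypothetical placeholders, and the calls assume Boto3 is installed and AWS credentials and a Region are configured:

```python
import time

def build_record(measure_name, measure_value, dimensions):
    """Build a record dict for WriteRecords; Timestream expects string values."""
    return {
        "MeasureName": measure_name,
        "MeasureValue": str(measure_value),
        "MeasureValueType": "DOUBLE",
        "Time": str(int(time.time() * 1000)),  # epoch milliseconds (the default TimeUnit)
        "Dimensions": [{"Name": k, "Value": v} for k, v in dimensions.items()],
    }

def write_then_query(database, table):
    # boto3 is imported here so the helper above stays usable without it installed.
    import boto3

    write_client = boto3.client("timestream-write")   # Write SDK: CRUD and ingestion
    query_client = boto3.client("timestream-query")   # Query SDK: reads only

    record = build_record("cpu_utilization", 42.5, {"host": "host-1"})
    write_client.write_records(DatabaseName=database, TableName=table, Records=[record])
    return query_client.query(QueryString=f'SELECT * FROM "{database}"."{table}" LIMIT 10')
```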

**Topics**
+ [

# Java
](getting-started.java.md)
+ [

# Java v2
](getting-started.java-v2.md)
+ [

# Go
](getting-started.go.md)
+ [

# Python
](getting-started.python.md)
+ [

# Node.js
](getting-started.node-js.md)
+ [

# .NET
](getting-started.dot-net.md)

# Java
<a name="getting-started.java"></a>

To get started with the [Java 1.0 SDK](https://aws.amazon.com/sdk-for-java/) and Amazon Timestream, complete the prerequisites described below.

Once you've completed the necessary prerequisites for the Java SDK, you can get started with the [Code samples](code-samples.md).

## Prerequisites
<a name="getting-started.java.prereqs"></a>

Before you get started with Java, you must do the following:

1. Follow the AWS setup instructions in [Accessing Timestream for LiveAnalytics](accessing.md).

1. Set up a Java development environment by downloading and installing the following:
   + Java SE Development Kit 8 (such as [Amazon Corretto 8](https://docs.aws.amazon.com/corretto/latest/corretto-8-ug/downloads-list.html)).
   + Java IDE (such as [Eclipse](http://www.eclipse.org) or [IntelliJ](https://www.jetbrains.com/idea/)).

      For more information, see [Getting Started with the AWS SDK for Java](https://docs.aws.amazon.com/sdk-for-java/latest/developer-guide/get-started.html).

1. Configure your AWS credentials and Region for development:
   + Set up your AWS security credentials for use with the AWS SDK for Java.
   + Set your AWS Region to determine your default Timestream for LiveAnalytics endpoint.

## Using Apache Maven
<a name="getting-started.java.with-maven"></a>

You can use [Apache Maven](https://maven.apache.org/) to configure and build AWS SDK for Java projects.

**Note**  
To use Apache Maven, ensure your Java SDK and runtime are 1.8 or higher.

You can configure the AWS SDK as a Maven dependency as described in [Using the SDK with Apache Maven](https://docs.aws.amazon.com/sdk-for-java/v1/developer-guide/setup-project-maven.html).

You can compile and run your source code with the following commands:

```
mvn clean compile
mvn exec:java -Dexec.mainClass=<your source code Main class>
```

**Note**  
 `<your source code Main class>` is the fully qualified name of your source code's main class.

## Setting your AWS credentials
<a name="getting-started.java.credentials"></a>

The [AWS SDK for Java](https://aws.amazon.com/sdk-for-java) requires that you provide AWS credentials to your application at runtime. The code examples in this guide assume that you are using an AWS credentials file, as described in [Set up AWS Credentials and Region for Development](https://docs.aws.amazon.com/sdk-for-java/latest/developer-guide/setup-credentials.html) in the *AWS SDK for Java Developer Guide*.

The following is an example of an AWS credentials file named `~/.aws/credentials`, where the tilde character (`~`) represents your home directory.

```
[default] 
aws_access_key_id = AWS access key ID goes here 
aws_secret_access_key = Secret key goes here
```

# Java v2
<a name="getting-started.java-v2"></a>

To get started with the [Java 2.0 SDK](https://aws.amazon.com/sdk-for-java/) and Amazon Timestream, complete the prerequisites described below.

Once you've completed the necessary prerequisites for the Java 2.0 SDK, you can get started with the [Code samples](code-samples.md).

## Prerequisites
<a name="getting-started.java-v2.prereqs"></a>

Before you get started with Java, you must do the following:

1. Follow the AWS setup instructions in [Accessing Timestream for LiveAnalytics](accessing.md).

1. You can configure the AWS SDK as a Maven dependency as described in [Using the SDK with Apache Maven](https://docs.aws.amazon.com/sdk-for-java/v2/developer-guide/welcome.html).

1. Set up a Java development environment by downloading and installing the following:
   + Java SE Development Kit 8 (such as [Amazon Corretto 8](https://docs.aws.amazon.com/corretto/latest/corretto-8-ug/downloads-list.html)).
   + Java IDE (such as [Eclipse](http://www.eclipse.org) or [IntelliJ](https://www.jetbrains.com/idea/)).

      For more information, see [Getting Started with the AWS SDK for Java](https://docs.aws.amazon.com/sdk-for-java/latest/developer-guide/get-started.html).

## Using Apache Maven
<a name="getting-started.java-v2.with-maven"></a>

You can use [Apache Maven](https://maven.apache.org/) to configure and build AWS SDK for Java projects.

**Note**  
To use Apache Maven, ensure your Java SDK and runtime are 1.8 or higher.

You can configure the AWS SDK as a Maven dependency as described in [Using the SDK with Apache Maven](https://docs.aws.amazon.com/sdk-for-java/v2/developer-guide/welcome.html). The changes required to the `pom.xml` file are described [here](https://docs.aws.amazon.com/sdk-for-java/v2/migration-guide/whats-different.html#adding-v2).

You can compile and run your source code with the following commands:

```
mvn clean compile
mvn exec:java -Dexec.mainClass=<your source code Main class>
```

**Note**  
 `<your source code Main class>` is the fully qualified name of your source code's main class.

# Go
<a name="getting-started.go"></a>

To get started with the [Go SDK](https://aws.amazon.com/sdk-for-go/) and Amazon Timestream, complete the prerequisites described below.

Once you've completed the necessary prerequisites for the Go SDK, you can get started with the [Code samples](code-samples.md).

## Prerequisites
<a name="getting-started.prereqs.go"></a>

1.  [Download and install Go](https://golang.org/doc/install) (version 1.14 or later).

1.  [Configure the AWS SDK for Go](https://docs.aws.amazon.com/sdk-for-go/v1/developer-guide/configuring-sdk.html).

1.  [Construct your client](https://docs.aws.amazon.com/sdk-for-go/v1/developer-guide/configuring-sdk.html).

# Python
<a name="getting-started.python"></a>

To get started with the [Python SDK](https://aws.amazon.com/sdk-for-python/) and Amazon Timestream, complete the prerequisites described below.

Once you've completed the necessary prerequisites for the Python SDK, you can get started with the [Code samples](code-samples.md).

## Prerequisites
<a name="getting-started.python.prereqs"></a>

To use Python, install and configure Boto3, following the instructions [here](https://boto3.amazonaws.com/v1/documentation/api/latest/index.html).
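
Once Boto3 is installed, a minimal sketch for creating the two Timestream clients might look like the following. The retry settings shown are illustrative defaults rather than required values, and creating the clients makes no network call:

```python
def client_config(region="us-east-1", max_attempts=10):
    """Plain dict of shared client settings; no AWS calls are made here."""
    return {
        "region_name": region,
        "retries": {"max_attempts": max_attempts, "mode": "standard"},
    }

def make_timestream_clients(region="us-east-1"):
    # Requires boto3 to be installed and AWS credentials to be configured.
    import boto3
    from botocore.config import Config

    config = Config(**client_config(region))
    write_client = boto3.client("timestream-write", config=config)
    query_client = boto3.client("timestream-query", config=config)
    return write_client, query_client
```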

# Node.js
<a name="getting-started.node-js"></a>

To get started with the [Node.js SDK](https://aws.amazon.com/sdk-for-node-js/) and Amazon Timestream, complete the prerequisites described below.

Once you've completed the necessary prerequisites for the Node.js SDK, you can get started with the [Code samples](code-samples.md).

## Prerequisites
<a name="getting-started.node-js.prereqs"></a>

Before you get started with Node.js, you must do the following:

1. [Install Node.js](https://nodejs.org/en/).

1.  [Install the AWS SDK for JavaScript](https://aws.amazon.com/sdk-for-node-js/). 

# .NET
<a name="getting-started.dot-net"></a>

To get started with the [.NET SDK](https://aws.amazon.com/sdk-for-net/) and Amazon Timestream, complete the prerequisites described below.

Once you've completed the necessary prerequisites for the .NET SDK, you can get started with the [Code samples](code-samples.md).

## Prerequisites
<a name="getting-started.dot-net.prereqs"></a>

Before you get started with .NET, install the required NuGet packages and ensure that the AWSSDK.Core version is 3.3.107 or newer by running the following commands:

```
dotnet add package AWSSDK.Core
dotnet add package AWSSDK.TimestreamWrite
dotnet add package AWSSDK.TimestreamQuery
```