Tune a DeepAR Model - Amazon SageMaker


Automatic model tuning, also known as hyperparameter tuning, finds the best version of a model by running many jobs that test a range of hyperparameter values on your dataset. You choose the tunable hyperparameters, a range of values for each, and an objective metric from the metrics that the algorithm computes. Automatic model tuning then searches the chosen ranges for the combination of values that produces the model that optimizes the objective metric.

For more information about model tuning, see Automatic model tuning with SageMaker.
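As a concrete illustration, the tuning setup described above can be expressed as the HyperParameterTuningJobConfig structure accepted by the CreateHyperParameterTuningJob API. This is a minimal sketch: the strategy and resource limits shown are illustrative choices, not recommendations from this page.

```python
# Sketch of a tuning job configuration (HyperParameterTuningJobConfig shape).
# The objective metric here is one of the DeepAR metrics listed below; all of
# them are minimized. Strategy and resource limits are example values.
tuning_job_config = {
    "Strategy": "Bayesian",
    "HyperParameterTuningJobObjective": {
        "Type": "Minimize",                       # every DeepAR metric is minimized
        "MetricName": "test:mean_wQuantileLoss",  # recommended: a test-channel metric
    },
    "ResourceLimits": {
        "MaxNumberOfTrainingJobs": 20,  # illustrative budget
        "MaxParallelTrainingJobs": 3,
    },
}
```

The same structure can also be built through the SageMaker Python SDK, which assembles it from a HyperparameterTuner object.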

Metrics Computed by the DeepAR Algorithm

The DeepAR algorithm reports three metrics, which are computed during training. When tuning a model, choose one of them as the objective: either a forecast accuracy metric computed on a provided test channel (recommended) or the training loss. For recommendations on the training/test split for the DeepAR algorithm, see Best Practices for Using the DeepAR Algorithm.

| Metric Name | Description | Optimization Direction |
| --- | --- | --- |
| test:RMSE | The root mean square error between the forecast and the actual target, computed on the test set. | Minimize |
| test:mean_wQuantileLoss | The average over all quantile losses computed on the test set. To control which quantiles are used, set the test_quantiles hyperparameter. | Minimize |
| train:final_loss | The training negative log-likelihood loss, averaged over the last training epoch. | Minimize |
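The two test-channel metrics can be reproduced by hand for a toy forecast. The snippet below is an illustrative re-implementation of the formulas (RMSE and the weighted quantile loss as described in the DeepAR documentation), not the algorithm's internal code.

```python
import math

def rmse(actual, forecast):
    """Root mean square error between forecast and actual target (test:RMSE)."""
    return math.sqrt(sum((a - f) ** 2 for a, f in zip(actual, forecast)) / len(actual))

def mean_weighted_quantile_loss(actual, quantile_forecasts, quantiles):
    """Average of the weighted quantile losses (test:mean_wQuantileLoss).

    For each quantile tau, wQL[tau] = 2 * sum_t P_tau(y_t, q_t) / sum_t |y_t|,
    where P_tau(y, q) = tau*(y - q) if y >= q else (1 - tau)*(q - y).
    quantile_forecasts maps each quantile tau to its forecast series.
    """
    denom = sum(abs(y) for y in actual)
    losses = []
    for tau in quantiles:
        q = quantile_forecasts[tau]
        num = sum(tau * (y - qt) if y >= qt else (1 - tau) * (qt - y)
                  for y, qt in zip(actual, q))
        losses.append(2 * num / denom)
    return sum(losses) / len(losses)

# Toy example: 3 time steps, forecasts at the 0.5 and 0.9 quantiles.
actual = [10.0, 12.0, 14.0]
p50 = [9.0, 12.0, 15.0]
p90 = [12.0, 14.0, 16.0]
print(rmse(actual, p50))                                                      # ~0.8165
print(mean_weighted_quantile_loss(actual, {0.5: p50, 0.9: p90}, [0.5, 0.9]))  # ~0.0444
```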

Tunable Hyperparameters for the DeepAR Algorithm

Tune a DeepAR model with the following hyperparameters. The hyperparameters with the greatest impact on DeepAR objective metrics, listed from most to least impactful, are: epochs, context_length, mini_batch_size, learning_rate, and num_cells.

| Parameter Name | Parameter Type | Recommended Ranges |
| --- | --- | --- |
| epochs | IntegerParameterRanges | MinValue: 1, MaxValue: 1000 |
| context_length | IntegerParameterRanges | MinValue: 1, MaxValue: 200 |
| mini_batch_size | IntegerParameterRanges | MinValue: 32, MaxValue: 1028 |
| learning_rate | ContinuousParameterRange | MinValue: 1e-5, MaxValue: 1e-1 |
| num_cells | IntegerParameterRanges | MinValue: 30, MaxValue: 200 |
| num_layers | IntegerParameterRanges | MinValue: 1, MaxValue: 8 |
| dropout_rate | ContinuousParameterRange | MinValue: 0.00, MaxValue: 0.2 |
| embedding_dimension | IntegerParameterRanges | MinValue: 1, MaxValue: 50 |
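The recommended ranges above can be written as the ParameterRanges structure of a tuning request. This is a sketch following the CreateHyperParameterTuningJob request shape, in which MinValue and MaxValue are passed as strings; it is not the only way to express these ranges.

```python
# The recommended ranges from the table, as a ParameterRanges structure.
# The API groups ranges by type and takes Min/MaxValue as strings.
parameter_ranges = {
    "IntegerParameterRanges": [
        {"Name": "epochs", "MinValue": "1", "MaxValue": "1000"},
        {"Name": "context_length", "MinValue": "1", "MaxValue": "200"},
        {"Name": "mini_batch_size", "MinValue": "32", "MaxValue": "1028"},
        {"Name": "num_cells", "MinValue": "30", "MaxValue": "200"},
        {"Name": "num_layers", "MinValue": "1", "MaxValue": "8"},
        {"Name": "embedding_dimension", "MinValue": "1", "MaxValue": "50"},
    ],
    "ContinuousParameterRanges": [
        {"Name": "learning_rate", "MinValue": "1e-5", "MaxValue": "1e-1"},
        {"Name": "dropout_rate", "MinValue": "0.00", "MaxValue": "0.2"},
    ],
}
```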