
Amazon Nova Understanding model customization hyperparameters


The Amazon Nova Lite, Amazon Nova Micro, and Amazon Nova Pro models support the following three hyperparameters for model customization. For more information, see Customize your model to improve its performance for your use case.

For information about fine-tuning Amazon Nova models, see Fine-tuning Amazon Nova models.

| Hyperparameter (console) | Hyperparameter (API) | Definition | Type | Minimum | Maximum | Default |
| --- | --- | --- | --- | --- | --- | --- |
| Epochs | epochCount | The number of iterations through the entire training dataset | integer | 1 | 5 | 2 |
| Learning rate | learningRate | The rate at which model parameters are updated after each batch | float | 1.00E-6 | 1.00E-4 | 1.00E-5 |
| Learning rate warmup steps | learningRateWarmupSteps | The number of iterations over which the learning rate is gradually increased to the specified rate | integer | 0 | 100 | 10 |
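As a sketch of how these values are supplied in practice, the snippet below builds the hyperparameter map for a model customization job. It assumes (based on the Bedrock CreateModelCustomizationJob API, where hyperparameter values are passed as strings) a boto3 call shape; the job name, model name, role ARN, and S3 URIs in the comments are hypothetical placeholders, so verify the exact parameters against the current API reference before use.

```python
# Hyperparameters for an Amazon Nova customization job, using the defaults
# from the table above. The Bedrock API expects string values (assumption;
# verify against the current CreateModelCustomizationJob documentation).
hyper_parameters = {
    "epochCount": "2",                # integer range 1-5, default 2
    "learningRate": "1.00E-5",        # float range 1e-6 to 1e-4, default 1e-5
    "learningRateWarmupSteps": "10",  # integer range 0-100, default 10
}

# With boto3, this map would be passed as the hyperParameters argument
# (all identifiers below are illustrative placeholders):
#
# import boto3
# bedrock = boto3.client("bedrock")
# bedrock.create_model_customization_job(
#     jobName="my-nova-ft-job",
#     customModelName="my-nova-custom",
#     roleArn="arn:aws:iam::111122223333:role/MyBedrockRole",
#     baseModelIdentifier="amazon.nova-lite-v1:0",
#     trainingDataConfig={"s3Uri": "s3://my-bucket/train.jsonl"},
#     outputDataConfig={"s3Uri": "s3://my-bucket/output/"},
#     hyperParameters=hyper_parameters,
# )

print(hyper_parameters["epochCount"])
```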

The default epoch count is 2, which works for most cases. In general, larger datasets require fewer epochs to converge, while smaller datasets require more. You can also speed up convergence by increasing the learning rate, but this is less desirable because it can cause training instability near convergence. We recommend starting with the default hyperparameters, which are based on our assessment across tasks of varying complexity and dataset sizes.

The learning rate gradually increases to the set value during warmup. Therefore, we recommend that you avoid a large warmup value when the training dataset is small, because the learning rate might never reach the set value during training. We recommend setting the warmup steps by dividing the dataset size by 640 for Amazon Nova Micro, 160 for Amazon Nova Lite, and 320 for Amazon Nova Pro.

© 2025, Amazon Web Services, Inc. or its affiliates. All rights reserved.