Meta Llama 3.1 model customization hyperparameters

The Meta Llama 3.1 8B and 70B models support the following hyperparameters for model customization. For more information, see Customize your model to improve its performance for your use case.

For information about fine-tuning Meta Llama models, see the Meta documentation at https://ai.meta.com/llama/get-started/#fine-tuning.

Note

The epochCount quota is adjustable.

| Hyperparameter (console) | Hyperparameter (API) | Definition | Minimum | Maximum | Default |
| --- | --- | --- | --- | --- | --- |
| Epochs | epochCount | The number of iterations through the entire training dataset | 1 | 10 | 5 |
| Batch size | batchSize | The number of samples processed before updating model parameters | 1 | 1 | 1 |
| Learning rate | learningRate | The rate at which model parameters are updated after each batch | 5.00E-6 | 0.1 | 1.00E-4 |
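
The following is a minimal sketch of passing these hyperparameters when starting a model customization job with the AWS SDK for Python (Boto3). The job name, custom model name, IAM role ARN, base model identifier, and S3 URIs are placeholders for illustration; the hyperparameter values shown are the defaults from the table above, with the learning rate written in decimal form.

```python
import boto3

# Client for the Amazon Bedrock control-plane API, which owns
# model customization jobs. The Region is an example.
bedrock = boto3.client("bedrock", region_name="us-west-2")

# Hyperparameter values are passed as strings, keyed by the API names
# from the table above. These are the documented defaults
# (learningRate 0.0001 corresponds to 1.00E-4).
hyperparameters = {
    "epochCount": "5",
    "batchSize": "1",
    "learningRate": "0.0001",
}

# All names, ARNs, the base model identifier, and S3 URIs below are
# placeholders; replace them with values from your own account.
response = bedrock.create_model_customization_job(
    jobName="llama-3-1-8b-fine-tuning-job",
    customModelName="my-llama-3-1-8b-custom",
    roleArn="arn:aws:iam::111122223333:role/MyBedrockCustomizationRole",
    baseModelIdentifier="meta.llama3-1-8b-instruct-v1:0",
    trainingDataConfig={"s3Uri": "s3://amzn-s3-demo-bucket/train/train.jsonl"},
    outputDataConfig={"s3Uri": "s3://amzn-s3-demo-bucket/output/"},
    hyperParameters=hyperparameters,
)

print(response["jobArn"])
```

You can then track the job with get_model_customization_job using the returned job ARN; if you omit the hyperParameters map entirely, the job runs with the default values listed in the table.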