Class CfnFlow.PromptModelInferenceConfigurationProperty.Builder
java.lang.Object
software.amazon.awscdk.services.bedrock.CfnFlow.PromptModelInferenceConfigurationProperty.Builder
- All Implemented Interfaces:
software.amazon.jsii.Builder<CfnFlow.PromptModelInferenceConfigurationProperty>
- Enclosing interface:
CfnFlow.PromptModelInferenceConfigurationProperty
@Stability(Stable)
public static final class CfnFlow.PromptModelInferenceConfigurationProperty.Builder
extends Object
implements software.amazon.jsii.Builder<CfnFlow.PromptModelInferenceConfigurationProperty>
A builder for
CfnFlow.PromptModelInferenceConfigurationProperty
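The setters below return the builder itself, so they can be chained. A minimal usage sketch (assuming `aws-cdk-lib` is on the classpath; the concrete parameter values are illustrative, not recommendations):

```java
import java.util.List;
import software.amazon.awscdk.services.bedrock.CfnFlow;

public class InferenceConfigExample {
    public static void main(String[] args) {
        // Build an inference configuration for a Bedrock flow prompt node.
        // All setters are optional; unset values fall back to the model's defaults.
        CfnFlow.PromptModelInferenceConfigurationProperty config =
            new CfnFlow.PromptModelInferenceConfigurationProperty.Builder()
                .maxTokens(512)                       // cap on response tokens (illustrative value)
                .temperature(0.7)                     // higher = more varied output
                .topP(0.9)                            // nucleus-sampling cutoff
                .stopSequences(List.of("\n\nHuman:")) // stop generating at this marker
                .build();
    }
}
```

`build()` throws `NullPointerException` only if a required attribute is missing; all attributes of this property are optional, so an empty builder also succeeds.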
-
Constructor Summary
-
Method Summary
Modifier and Type / Method / Description
build()
Builds the configured instance.
maxTokens(Number maxTokens)
Sets the value of CfnFlow.PromptModelInferenceConfigurationProperty.getMaxTokens()
stopSequences(List<String> stopSequences)
Sets the value of CfnFlow.PromptModelInferenceConfigurationProperty.getStopSequences()
temperature(Number temperature)
Sets the value of CfnFlow.PromptModelInferenceConfigurationProperty.getTemperature()
topK(Number topK)
Sets the value of CfnFlow.PromptModelInferenceConfigurationProperty.getTopK()
topP(Number topP)
Sets the value of CfnFlow.PromptModelInferenceConfigurationProperty.getTopP()
-
Constructor Details
-
Builder
public Builder()
-
-
Method Details
-
maxTokens
@Stability(Stable) public CfnFlow.PromptModelInferenceConfigurationProperty.Builder maxTokens(Number maxTokens)
Sets the value of CfnFlow.PromptModelInferenceConfigurationProperty.getMaxTokens()
- Parameters:
maxTokens
- The maximum number of tokens to return in the response.
- Returns:
this
-
stopSequences
@Stability(Stable) public CfnFlow.PromptModelInferenceConfigurationProperty.Builder stopSequences(List<String> stopSequences)
Sets the value of CfnFlow.PromptModelInferenceConfigurationProperty.getStopSequences()
- Parameters:
stopSequences
- A list of strings that define sequences after which the model will stop generating.
- Returns:
this
-
temperature
@Stability(Stable) public CfnFlow.PromptModelInferenceConfigurationProperty.Builder temperature(Number temperature)
Sets the value of CfnFlow.PromptModelInferenceConfigurationProperty.getTemperature()
- Parameters:
temperature
- Controls the randomness of the response. Choose a lower value for more predictable outputs and a higher value for more surprising outputs.
- Returns:
this
-
topK
@Stability(Stable) public CfnFlow.PromptModelInferenceConfigurationProperty.Builder topK(Number topK)
Sets the value of CfnFlow.PromptModelInferenceConfigurationProperty.getTopK()
- Parameters:
topK
- The number of most-likely candidates that the model considers for the next token during generation.
- Returns:
this
-
topP
@Stability(Stable) public CfnFlow.PromptModelInferenceConfigurationProperty.Builder topP(Number topP)
Sets the value of CfnFlow.PromptModelInferenceConfigurationProperty.getTopP()
- Parameters:
topP
- The percentage of most-likely candidates that the model considers for the next token.
- Returns:
this
-
build
@Stability(Stable) public CfnFlow.PromptModelInferenceConfigurationProperty build()
Builds the configured instance.
- Specified by:
build
in interface software.amazon.jsii.Builder<CfnFlow.PromptModelInferenceConfigurationProperty>
- Returns:
- a new instance of
CfnFlow.PromptModelInferenceConfigurationProperty
- Throws:
NullPointerException
- if any required attribute was not provided