Interface CfnFlowVersion.PromptFlowNodeConfigurationProperty

All Superinterfaces:
software.amazon.jsii.JsiiSerializable
All Known Implementing Classes:
CfnFlowVersion.PromptFlowNodeConfigurationProperty.Jsii$Proxy
Enclosing class:
CfnFlowVersion

@Stability(Stable) public static interface CfnFlowVersion.PromptFlowNodeConfigurationProperty extends software.amazon.jsii.JsiiSerializable
Contains configurations for a prompt node in the flow.

You can use a prompt from Prompt management, or you can define one inline in this node. If the prompt contains variables, the inputs into this node fill in the variables. The output from this node is the response generated by the model. For more information, see Node types in a flow in the Amazon Bedrock User Guide.

Example:

 // The code below shows an example of how to instantiate this type.
 // The values are placeholders you should change.
 // Note: sourceConfiguration accepts either inline or resource, not both;
 // both are shown here only to illustrate the available shapes.
 import software.amazon.awscdk.services.bedrock.*;
 PromptFlowNodeConfigurationProperty promptFlowNodeConfigurationProperty = PromptFlowNodeConfigurationProperty.builder()
         .sourceConfiguration(PromptFlowNodeSourceConfigurationProperty.builder()
                 .inline(PromptFlowNodeInlineConfigurationProperty.builder()
                         .modelId("modelId")
                         .templateConfiguration(PromptTemplateConfigurationProperty.builder()
                                 .text(TextPromptTemplateConfigurationProperty.builder()
                                         .text("text")
                                         // the properties below are optional
                                         .inputVariables(List.of(PromptInputVariableProperty.builder()
                                                 .name("name")
                                                 .build()))
                                         .build())
                                 .build())
                         .templateType("templateType")
                         // the properties below are optional
                         .inferenceConfiguration(PromptInferenceConfigurationProperty.builder()
                                 .text(PromptModelInferenceConfigurationProperty.builder()
                                         .maxTokens(123)
                                         .stopSequences(List.of("stopSequences"))
                                         .temperature(123)
                                         .topK(123)
                                         .topP(123)
                                         .build())
                                 .build())
                         .build())
                 .resource(PromptFlowNodeResourceConfigurationProperty.builder()
                         .promptArn("promptArn")
                         .build())
                 .build())
         .build();
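
 In practice you supply only one branch of the source configuration. As a minimal sketch (assumed usage, with a placeholder ARN), referencing an existing prompt from Prompt management looks like this:

 ```java
 import software.amazon.awscdk.services.bedrock.*;

 // Reference a prompt managed outside the flow by its ARN instead of
 // defining the prompt inline. "promptArn" is a placeholder value.
 CfnFlowVersion.PromptFlowNodeConfigurationProperty byResource =
         CfnFlowVersion.PromptFlowNodeConfigurationProperty.builder()
                 .sourceConfiguration(CfnFlowVersion.PromptFlowNodeSourceConfigurationProperty.builder()
                         .resource(CfnFlowVersion.PromptFlowNodeResourceConfigurationProperty.builder()
                                 .promptArn("promptArn")
                                 .build())
                         .build())
                 .build();
 ```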
 
