PromptVariant
- class aws_cdk.aws_bedrock_alpha.PromptVariant(*args: Any, **kwargs)
- Bases: object
- (experimental) Factory class for creating prompt variants. Provides static methods to create different types of prompt variants with proper configuration and type safety.
- Stability:
- experimental 
- Example:

      cmk = kms.Key(self, "cmk")
      claude_model = bedrock.BedrockFoundationModel.ANTHROPIC_CLAUDE_SONNET_V1_0

      variant1 = bedrock.PromptVariant.text(
          variant_name="variant1",
          model=claude_model,
          prompt_variables=["topic"],
          prompt_text="This is my first text prompt. Please summarize our conversation on: {{topic}}.",
          inference_configuration=bedrock.PromptInferenceConfiguration.text(
              temperature=1,
              top_p=0.999,
              max_tokens=2000
          )
      )

      prompt1 = bedrock.Prompt(self, "prompt1",
          prompt_name="prompt1",
          description="my first prompt",
          default_variant=variant1,
          variants=[variant1],
          kms_key=cmk
      )

Static Methods

- classmethod agent(*, agent_alias, prompt_text, model, variant_name, prompt_variables=None)
- (experimental) Creates an agent prompt template variant (an illustrative usage sketch follows below).
- Parameters:
- agent_alias (IAgentAlias) – (experimental) An alias pointing to the agent version to be used.
- prompt_text (str) – (experimental) The text prompt. Variables are referenced by enclosing the variable name in double curly braces, as in {{variable_name}}.
- model (IBedrockInvokable) – (experimental) The model used to run the prompt. The model can be a foundation model, a custom model, or a provisioned model.
- variant_name (str) – (experimental) The name of the prompt variant.
- prompt_variables (Optional[Sequence[str]]) – (experimental) The variables in the prompt template that can be filled in at runtime. Default: no variables defined.
 
- Return type:
- PromptVariant
- Returns:
- A PromptVariant configured for agent interactions.
- Stability:
- experimental 
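
For illustration, here is a minimal sketch of an agent variant. It is not taken from the official docs: the Agent and AgentAlias construct arguments, the prompt text, and the "input" variable are assumptions, and claude_model is reused from the class-level example above.

      # Sketch only: construct names, instruction text, and the "input"
      # variable are illustrative assumptions.
      agent = bedrock.Agent(self, "MyAgent",
          foundation_model=bedrock.BedrockFoundationModel.ANTHROPIC_CLAUDE_SONNET_V1_0,
          instruction="You are a helpful assistant that answers user questions."
      )
      agent_alias = bedrock.AgentAlias(self, "MyAgentAlias", agent=agent)

      agent_variant = bedrock.PromptVariant.agent(
          variant_name="agent-variant",
          model=claude_model,
          agent_alias=agent_alias,
          prompt_text="Answer this question from the user: {{input}}",
          prompt_variables=["input"]
      )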
 
 - classmethod chat(*, messages, inference_configuration=None, system=None, tool_configuration=None, model, variant_name, prompt_variables=None)
- (experimental) Creates a chat template variant (an illustrative usage sketch follows below). Use this template type when the model supports the Converse API or the Anthropic Claude Messages API. This lets you include a System prompt and previous User and Assistant messages for context.
- Parameters:
- messages (Sequence[ChatMessage]) – (experimental) The messages in the chat prompt. Must include at least one User message; messages should alternate between User and Assistant.
- inference_configuration (Optional[PromptInferenceConfiguration]) – (experimental) Inference configuration for the chat prompt. Default: no inference configuration provided.
- system (Optional[str]) – (experimental) Context or instructions for the model to consider before generating a response. Default: no system message provided.
- tool_configuration (Union[ToolConfiguration, Dict[str, Any], None]) – (experimental) The configuration of tools available to the model and how it must use them. Default: no tool configuration provided.
- model (IBedrockInvokable) – (experimental) The model used to run the prompt. The model can be a foundation model, a custom model, or a provisioned model.
- variant_name (str) – (experimental) The name of the prompt variant.
- prompt_variables (Optional[Sequence[str]]) – (experimental) The variables in the prompt template that can be filled in at runtime. Default: no variables defined.
 
- Return type:
- PromptVariant
- Returns:
- A PromptVariant configured for chat interactions.
- Stability:
- experimental 
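
For illustration, here is a minimal sketch of a chat variant, again reusing claude_model from the class-level example. The message content, the system prompt, and the "language" variable are illustrative assumptions.

      # Sketch only: the conversation below and the "language" variable
      # are illustrative assumptions.
      chat_variant = bedrock.PromptVariant.chat(
          variant_name="chat-variant",
          model=claude_model,
          messages=[
              bedrock.ChatMessage.user("From now on, you reply in {{language}}."),
          ],
          system="You are a concise, helpful assistant.",
          prompt_variables=["language"]
      )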
 
 - classmethod text(*, prompt_text, inference_configuration=None, model, variant_name, prompt_variables=None)
- (experimental) Creates a text template variant.
- Parameters:
- prompt_text (str) – (experimental) The text prompt. Variables are referenced by enclosing the variable name in double curly braces, as in {{variable_name}}.
- inference_configuration (Optional[PromptInferenceConfiguration]) – (experimental) Inference configuration for the text prompt. Default: no inference configuration provided.
- model (IBedrockInvokable) – (experimental) The model used to run the prompt. The model can be a foundation model, a custom model, or a provisioned model.
- variant_name (str) – (experimental) The name of the prompt variant.
- prompt_variables (Optional[Sequence[str]]) – (experimental) The variables in the prompt template that can be filled in at runtime. Default: no variables defined.
 
- Return type:
- PromptVariant
- Returns:
- A PromptVariant configured for text processing.
- Stability:
- experimental