interface PromptVariantProperty
| Language | Type name | 
|---|---|
|  .NET | Amazon.CDK.AWS.Bedrock.CfnPromptVersion.PromptVariantProperty | 
|  Go | github.com/aws/aws-cdk-go/awscdk/v2/awsbedrock#CfnPromptVersion_PromptVariantProperty | 
|  Java | software.amazon.awscdk.services.bedrock.CfnPromptVersion.PromptVariantProperty | 
|  Python | aws_cdk.aws_bedrock.CfnPromptVersion.PromptVariantProperty | 
|  TypeScript | aws-cdk-lib.aws_bedrock.CfnPromptVersion.PromptVariantProperty | 
Contains details about a variant of the prompt.
Example
```ts
// The code below shows an example of how to instantiate this type.
// The values are placeholders you should change.
import { aws_bedrock as bedrock } from 'aws-cdk-lib';
declare const additionalModelRequestFields: any;
declare const any: any;
declare const auto: any;
declare const json: any;
const promptVariantProperty: bedrock.CfnPromptVersion.PromptVariantProperty = {
  name: 'name',
  templateConfiguration: {
    chat: {
      messages: [{
        content: [{
          cachePoint: {
            type: 'type',
          },
          text: 'text',
        }],
        role: 'role',
      }],
      // the properties below are optional
      inputVariables: [{
        name: 'name',
      }],
      system: [{
        cachePoint: {
          type: 'type',
        },
        text: 'text',
      }],
      toolConfiguration: {
        tools: [{
          cachePoint: {
            type: 'type',
          },
          toolSpec: {
            inputSchema: {
              json: json,
            },
            name: 'name',
            // the properties below are optional
            description: 'description',
          },
        }],
        // the properties below are optional
        toolChoice: {
          any: any,
          auto: auto,
          tool: {
            name: 'name',
          },
        },
      },
    },
    text: {
      text: 'text',
      // the properties below are optional
      cachePoint: {
        type: 'type',
      },
      inputVariables: [{
        name: 'name',
      }],
    },
  },
  templateType: 'templateType',
  // the properties below are optional
  additionalModelRequestFields: additionalModelRequestFields,
  genAiResource: {
    agent: {
      agentIdentifier: 'agentIdentifier',
    },
  },
  inferenceConfiguration: {
    text: {
      maxTokens: 123,
      stopSequences: ['stopSequences'],
      temperature: 123,
      topP: 123,
    },
  },
  metadata: [{
    key: 'key',
    value: 'value',
  }],
  modelId: 'modelId',
};
```
Properties
| Name | Type | Description |
|---|---|---|
| name | string | The name of the prompt variant. |
| templateConfiguration | IResolvable \| PromptTemplateConfigurationProperty | Contains configurations for the prompt template. |
| templateType | string | The type of prompt template to use. |
| additionalModelRequestFields? | any | Contains model-specific inference configurations that aren't in the inferenceConfiguration field. |
| genAiResource? | IResolvable \| PromptGenAiResourceProperty | Specifies a generative AI resource with which to use the prompt. |
| inferenceConfiguration? | IResolvable \| PromptInferenceConfigurationProperty | Contains inference configurations for the prompt variant. |
| metadata? | IResolvable \| (IResolvable \| PromptMetadataEntryProperty)[] | An array of objects, each containing a key-value pair that defines a metadata tag and value to attach to a prompt variant. |
| modelId? | string | The unique identifier of the model or inference profile with which to run inference on the prompt. |
name
Type: string
The name of the prompt variant.
templateConfiguration
Type: IResolvable | PromptTemplateConfigurationProperty
Contains configurations for the prompt template.
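As a rough sketch, the same shape can also describe a CHAT template. The role value, the system prompt, and the {{city}} input variable below are illustrative choices, not requirements of the type:
```ts
import { aws_bedrock as bedrock } from 'aws-cdk-lib';

// A CHAT-type variant sketch mirroring the structure of the example above.
// All string values are placeholders chosen for illustration.
const chatVariant: bedrock.CfnPromptVersion.PromptVariantProperty = {
  name: 'chat-variant',
  templateType: 'CHAT',
  templateConfiguration: {
    chat: {
      system: [{ text: 'You are a concise travel assistant.' }],
      messages: [{
        role: 'user',
        content: [{ text: 'Suggest three things to do in {{city}}.' }],
      }],
      inputVariables: [{ name: 'city' }],
    },
  },
};
```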
templateType
Type: string
The type of prompt template to use.
additionalModelRequestFields?
Type: any (optional)
Contains model-specific inference configurations that aren't in the inferenceConfiguration field.
To see model-specific inference parameters, see Inference request parameters and response fields for foundation models.
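A hedged sketch of passing such a field: top_k is a parameter accepted by some Anthropic models and has no counterpart in inferenceConfiguration; substitute whatever parameters your model documents.
```ts
import { aws_bedrock as bedrock } from 'aws-cdk-lib';

// Sketch: forwarding a model-specific parameter (top_k, assumed to be
// supported by the chosen model) alongside a simple TEXT template.
const variantWithExtraFields: bedrock.CfnPromptVersion.PromptVariantProperty = {
  name: 'extra-fields-variant',
  templateType: 'TEXT',
  templateConfiguration: {
    text: {
      text: 'Answer briefly: {{question}}',
      inputVariables: [{ name: 'question' }],
    },
  },
  additionalModelRequestFields: {
    top_k: 250, // model-specific; check the model's inference request parameters
  },
};
```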
genAiResource?
Type: IResolvable | PromptGenAiResourceProperty (optional)
Specifies a generative AI resource with which to use the prompt.
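A sketch of a variant that targets an agent rather than a foundation model; the agentIdentifier value below is a made-up placeholder ARN:
```ts
import { aws_bedrock as bedrock } from 'aws-cdk-lib';

// Sketch: pointing the variant at an agent via genAiResource. The ARN is a
// placeholder; supply your own agent (alias) identifier.
const agentVariant: bedrock.CfnPromptVersion.PromptVariantProperty = {
  name: 'agent-variant',
  templateType: 'TEXT',
  templateConfiguration: {
    text: {
      text: '{{input}}',
      inputVariables: [{ name: 'input' }],
    },
  },
  genAiResource: {
    agent: {
      agentIdentifier: 'arn:aws:bedrock:us-east-1:111122223333:agent-alias/AGENTID/ALIASID',
    },
  },
};
```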
inferenceConfiguration?
Type: IResolvable | PromptInferenceConfigurationProperty (optional)
Contains inference configurations for the prompt variant.
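A sketch that replaces the 123 placeholders from the example above with plausible values; the numbers themselves are arbitrary:
```ts
import { aws_bedrock as bedrock } from 'aws-cdk-lib';

// Sketch: text inference settings for a variant, with illustrative values.
const tunedVariant: bedrock.CfnPromptVersion.PromptVariantProperty = {
  name: 'tuned-variant',
  templateType: 'TEXT',
  templateConfiguration: {
    text: {
      text: 'Summarize the following document: {{document}}',
      inputVariables: [{ name: 'document' }],
    },
  },
  inferenceConfiguration: {
    text: {
      maxTokens: 512,
      temperature: 0.2,
      topP: 0.9,
      stopSequences: ['###'],
    },
  },
};
```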
metadata?
Type: IResolvable | (IResolvable | PromptMetadataEntryProperty)[] (optional)
An array of objects, each containing a key-value pair that defines a metadata tag and value to attach to a prompt variant.
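A sketch of attaching metadata entries to a variant; the keys and values are arbitrary examples:
```ts
import { aws_bedrock as bedrock } from 'aws-cdk-lib';

// Sketch: key-value metadata entries attached to a variant.
const taggedVariant: bedrock.CfnPromptVersion.PromptVariantProperty = {
  name: 'tagged-variant',
  templateType: 'TEXT',
  templateConfiguration: {
    text: {
      text: '{{input}}',
      inputVariables: [{ name: 'input' }],
    },
  },
  metadata: [
    { key: 'team', value: 'search' },
    { key: 'experiment', value: 'v2' },
  ],
};
```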
modelId?
Type: string (optional)
The unique identifier of the model or inference profile with which to run inference on the prompt.
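A sketch of pinning a variant to a specific model; the model ID shown is only an example, and an inference profile ID could be used instead:
```ts
import { aws_bedrock as bedrock } from 'aws-cdk-lib';

// Sketch: selecting the model that runs inference for this variant.
const modelVariant: bedrock.CfnPromptVersion.PromptVariantProperty = {
  name: 'model-variant',
  templateType: 'TEXT',
  templateConfiguration: {
    text: {
      text: '{{input}}',
      inputVariables: [{ name: 'input' }],
    },
  },
  modelId: 'anthropic.claude-3-haiku-20240307-v1:0', // example ID; any available model or inference profile
};
```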
