interface PromptVariantProperty
Language | Type name
---|---
.NET | Amazon.CDK.aws_bedrock.CfnPrompt.PromptVariantProperty
Go | github.com/aws/aws-cdk-go/awscdk/v2/awsbedrock#CfnPrompt_PromptVariantProperty
Java | software.amazon.awscdk.services.bedrock.CfnPrompt.PromptVariantProperty
Python | aws_cdk.aws_bedrock.CfnPrompt.PromptVariantProperty
TypeScript | aws-cdk-lib » aws_bedrock » CfnPrompt » PromptVariantProperty
Contains details about a variant of the prompt.
Example
```ts
// The code below shows an example of how to instantiate this type.
// The values are placeholders you should change.
import { aws_bedrock as bedrock } from 'aws-cdk-lib';

declare const any: any;
declare const auto: any;
declare const json: any;

const promptVariantProperty: bedrock.CfnPrompt.PromptVariantProperty = {
  name: 'name',
  templateConfiguration: {
    chat: {
      messages: [{
        content: [{
          text: 'text',
        }],
        role: 'role',
      }],

      // the properties below are optional
      inputVariables: [{
        name: 'name',
      }],
      system: [{
        text: 'text',
      }],
      toolConfiguration: {
        tools: [{
          toolSpec: {
            inputSchema: {
              json: json,
            },
            name: 'name',

            // the properties below are optional
            description: 'description',
          },
        }],

        // the properties below are optional
        toolChoice: {
          any: any,
          auto: auto,
          tool: {
            name: 'name',
          },
        },
      },
    },
    text: {
      inputVariables: [{
        name: 'name',
      }],
      text: 'text',
      textS3Location: {
        bucket: 'bucket',
        key: 'key',

        // the properties below are optional
        version: 'version',
      },
    },
  },
  templateType: 'templateType',

  // the properties below are optional
  genAiResource: {
    agent: {
      agentIdentifier: 'agentIdentifier',
    },
  },
  inferenceConfiguration: {
    text: {
      maxTokens: 123,
      stopSequences: ['stopSequences'],
      temperature: 123,
      topP: 123,
    },
  },
  modelId: 'modelId',
};
```
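The snippet above defines the variant object in isolation. As a minimal sketch (assuming a surrounding stack, and using placeholder values for the construct ID, prompt name, and default-variant name), such an object is typically attached to a prompt through the variants property of CfnPrompt:

```ts
import { Stack, aws_bedrock as bedrock } from 'aws-cdk-lib';

declare const stack: Stack;
declare const promptVariantProperty: bedrock.CfnPrompt.PromptVariantProperty;

// Placeholder construct ID and prompt name; adjust for your stack.
new bedrock.CfnPrompt(stack, 'MyPrompt', {
  name: 'my-prompt',
  // The variant defined above becomes one entry in the prompt's variant list.
  variants: [promptVariantProperty],
  // defaultVariant should match the `name` of one of the supplied variants.
  defaultVariant: 'name',
});
```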
Properties
Name | Type | Description
---|---|---
name | string | The name of the prompt variant.
templateConfiguration | IResolvable \| PromptTemplateConfigurationProperty | Contains configurations for the prompt template.
templateType | string | The type of prompt template to use.
genAiResource? | IResolvable \| PromptGenAiResourceProperty | Specifies a generative AI resource with which to use the prompt.
inferenceConfiguration? | IResolvable \| PromptInferenceConfigurationProperty | Contains inference configurations for the prompt variant.
modelId? | string | The unique identifier of the model or inference profile with which to run inference on the prompt.
name
Type: string
The name of the prompt variant.
templateConfiguration
Type: IResolvable | PromptTemplateConfigurationProperty
Contains configurations for the prompt template.
templateType
Type: string
The type of prompt template to use.
genAiResource?
Type: IResolvable | PromptGenAiResourceProperty (optional)
Specifies a generative AI resource with which to use the prompt.
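As a hedged illustration of this property, a variant can reference an existing agent rather than naming a model directly; every value below is a placeholder, and the 'TEXT' template type is an assumption based on the underlying AWS::Bedrock::Prompt resource:

```ts
import { aws_bedrock as bedrock } from 'aws-cdk-lib';

// Sketch only: a variant whose generative AI resource is an existing agent.
const agentBackedVariant: bedrock.CfnPrompt.PromptVariantProperty = {
  name: 'agent-variant',
  templateType: 'TEXT', // assumed template type value
  templateConfiguration: {
    text: { text: 'text' },
  },
  genAiResource: {
    agent: {
      agentIdentifier: 'agentIdentifier', // placeholder for the agent's identifier/ARN
    },
  },
};
```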
inferenceConfiguration?
Type: IResolvable | PromptInferenceConfigurationProperty (optional)
Contains inference configurations for the prompt variant.
modelId?
Type: string (optional)
The unique identifier of the model or inference profile with which to run inference on the prompt.
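For contrast with the chat-style example above, the following is a minimal sketch of a text-template variant. The variant name, template text, variable name, inference values, and model ID are illustrative placeholders, and the 'TEXT' template type value is an assumption based on the underlying AWS::Bedrock::Prompt resource:

```ts
import { aws_bedrock as bedrock } from 'aws-cdk-lib';

// Minimal sketch of a text-template variant (placeholder values throughout).
const textVariant: bedrock.CfnPrompt.PromptVariantProperty = {
  name: 'text-variant',
  templateType: 'TEXT', // assumed value; CHAT is the other template type shown above
  templateConfiguration: {
    text: {
      // {{topic}} is referenced by the input variable declared below.
      text: 'Summarize the following topic in two sentences: {{topic}}',
      inputVariables: [{ name: 'topic' }],
    },
  },
  inferenceConfiguration: {
    text: {
      maxTokens: 256,
      temperature: 0.2,
    },
  },
  // Placeholder; supply a valid foundation model ID or inference profile identifier.
  modelId: 'modelId',
};
```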