Test a prompt using Prompt management
To learn how to test a prompt you created in Prompt management, select the tab corresponding to your method of choice and follow the steps:
- Console
To test a prompt in Prompt management
- Sign in to the AWS Management Console using an IAM role with Amazon Bedrock permissions, and open the Amazon Bedrock console (see Getting Started with the AWS Management Console).
- Select Prompt management from the left navigation pane. Then, choose a prompt in the Prompts section.
- Choose Edit in Prompt builder in the Prompt draft section, or choose a version of the prompt in the Versions section.
- (Optional) To provide values for variables in your prompt, first select a model in the Configurations pane. Then, enter a Test value for each variable in the Test variables pane.
Note
These test values are temporary and aren't saved if you save your prompt.
- To test your prompt, choose Run in the Test window pane.
- Modify your prompt or its configurations and then run your prompt again as necessary. If you're satisfied with your prompt, you can choose Create version to create a snapshot of your prompt that can be used in production. For more information, see Deploy a prompt to your application using versions in Prompt management.
You can also test the prompt in the following ways:
- To test the prompt in a flow, include a prompt node in the flow. For more information, see Create a flow in Amazon Bedrock and Node types in a flow.
- If you didn't configure your prompt with an agent, you can still test the prompt with an agent by importing it when testing an agent. For more information, see Test and troubleshoot agent behavior.
- API
You can test your prompt in the following ways:
- To run inference on the prompt, send an InvokeModel, InvokeModelWithResponseStream, Converse, or ConverseStream request (see link for request and response formats and field details) to an Amazon Bedrock runtime endpoint and specify the ARN of the prompt in the modelId parameter. A sketch of a Converse call appears after this list. Note the following restrictions on using a Prompt management prompt with Converse or ConverseStream:
  - You can't include the additionalModelRequestFields, inferenceConfig, system, or toolConfig fields.
  - If you include the messages field, the messages are appended after the messages defined in the prompt.
  - If you include the guardrailConfig field, the guardrail is applied to the entire prompt. If you include guardContent blocks in the ContentBlock field, the guardrail is applied only to those blocks.
- To test your prompt in a flow, create or edit a flow by sending a CreateFlow or UpdateFlow request (see link for request and response formats and field details) to an Agents for Amazon Bedrock build-time endpoint. Include a node of the PromptNode type and include the ARN of the prompt in the promptArn field. Then, send an InvokeFlow request (see link for request and response formats and field details) to an Agents for Amazon Bedrock runtime endpoint. For more information, see Create a flow in Amazon Bedrock and Node types in a flow. A sketch of the node definition and the InvokeFlow call appears after this list.
- To test your prompt with an agent, use the Amazon Bedrock console (see the Console tab), or enter the text of the prompt into the inputText field of an InvokeAgent request. A sketch of this call also appears after this list.
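As a rough illustration of the Converse option above, the following Python (boto3) sketch sends a Converse request with the prompt's ARN as the modelId. The Region, prompt ARN, and the genre variable are placeholders, and promptVariables is only needed if your prompt defines variables; treat this as a sketch rather than a definitive implementation.

```python
import boto3

# Amazon Bedrock runtime client (Region is a placeholder).
bedrock_runtime = boto3.client("bedrock-runtime", region_name="us-east-1")

# Placeholder ARN of a prompt created in Prompt management.
prompt_arn = "arn:aws:bedrock:us-east-1:111122223333:prompt/PROMPT1234"

# Pass the prompt ARN as the modelId. Per the restrictions above, don't set
# additionalModelRequestFields, inferenceConfig, system, or toolConfig.
response = bedrock_runtime.converse(
    modelId=prompt_arn,
    # Only needed if the prompt defines variables; "genre" is a hypothetical
    # variable name.
    promptVariables={"genre": {"text": "pop"}},
)

print(response["output"]["message"]["content"][0]["text"])
```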
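For the flow option above, the following boto3 sketch shows only the node entry you might include in a CreateFlow or UpdateFlow definition to reference the prompt by ARN, plus an InvokeFlow call against an already prepared flow alias. The node name, flow and alias IDs, and input-node wiring are assumptions; a complete flow also needs input and output nodes, connections, and an execution role.

```python
import boto3

prompt_arn = "arn:aws:bedrock:us-east-1:111122223333:prompt/PROMPT1234"

# Fragment of a flow definition: a prompt node that references the prompt by
# ARN. The rest of the definition (input/output nodes, connections) and the
# CreateFlow/UpdateFlow call itself are omitted in this sketch.
prompt_node = {
    "name": "MyPromptNode",  # hypothetical node name
    "type": "Prompt",
    "configuration": {
        "prompt": {
            "sourceConfiguration": {
                "resource": {"promptArn": prompt_arn}
            }
        }
    },
}

# Runtime client for invoking a prepared flow through one of its aliases.
bedrock_agent_runtime = boto3.client("bedrock-agent-runtime", region_name="us-east-1")

response = bedrock_agent_runtime.invoke_flow(
    flowIdentifier="FLOW1234",            # placeholder flow ID
    flowAliasIdentifier="ALIAS1234",      # placeholder flow alias ID
    inputs=[
        {
            "nodeName": "FlowInputNode",  # assumed name of the flow's input node
            "nodeOutputName": "document",
            "content": {"document": "value passed into the flow"},
        }
    ],
)

# The response is streamed back as events; print any flow output documents.
for event in response["responseStream"]:
    if "flowOutputEvent" in event:
        print(event["flowOutputEvent"]["content"]["document"])
```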
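For the agent option above, here is a minimal boto3 sketch of an InvokeAgent call; the agent ID, alias ID, Region, and prompt text are placeholders, and TSTALIASID refers to the test alias for an agent's working draft.

```python
import uuid

import boto3

bedrock_agent_runtime = boto3.client("bedrock-agent-runtime", region_name="us-east-1")

# Placeholder agent and alias IDs; the text of your prompt goes into inputText.
response = bedrock_agent_runtime.invoke_agent(
    agentId="AGENT1234",
    agentAliasId="TSTALIASID",  # test alias that points to the working draft
    sessionId=str(uuid.uuid4()),
    inputText="Paste the text of your prompt here",
)

# The completion is streamed back as chunk events.
completion = ""
for event in response["completion"]:
    if "chunk" in event:
        completion += event["chunk"]["bytes"].decode("utf-8")
print(completion)
```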