Version 4 (V4) of the SDK for .NET is in preview. For information about this new version in preview, see the AWS SDK for .NET (version 4 preview) Developer Guide.
Note that because V4 of the SDK is in preview, its content is subject to change.
Amazon Bedrock Runtime examples using the AWS SDK for .NET
The following code examples show you how to perform actions and implement common scenarios by using the AWS SDK for .NET with Amazon Bedrock Runtime.
Scenarios are code examples that show you how to accomplish a specific task by calling multiple functions within a service or combined with other AWS services.
Each example includes a link to the complete source code, where you can find instructions on how to set up and run the code.
Topics
Scenarios
The following code example shows how to create playgrounds to interact with Amazon Bedrock foundation models through different modalities.
SDK for .NET
The .NET Foundation Model (FM) Playground is a .NET MAUI Blazor sample application that showcases how to use Amazon Bedrock from C# code. This example shows how .NET and C# developers can use Amazon Bedrock to build generative AI-enabled applications. You can test and interact with Amazon Bedrock foundation models by using the following four playgrounds:
- Text playground.
- Chat playground.
- Voice chat playground.
- Image playground.
The example also lists and displays the foundation models you have access to and their characteristics (a minimal sketch of such a model listing follows below). For source code and deployment instructions, see the project on GitHub.
Services used in this example
Amazon Bedrock Runtime
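The playground's model list is retrieved through the Amazon Bedrock control plane rather than the runtime client. The following is a minimal sketch of such a listing, not code from the sample itself; it assumes the Amazon.Bedrock package and its ListFoundationModels operation, and model access granted in your account and Region:

// A minimal sketch: list the foundation models available to your account.
// Assumes the Amazon.Bedrock (control plane) package, not Amazon.BedrockRuntime.

using System;
using Amazon;
using Amazon.Bedrock;
using Amazon.Bedrock.Model;

var client = new AmazonBedrockClient(RegionEndpoint.USEast1);
var response = await client.ListFoundationModelsAsync(new ListFoundationModelsRequest());

foreach (var model in response.ModelSummaries)
{
    // Print each model's ID, provider, and supported modalities.
    Console.WriteLine($"{model.ModelId} ({model.ProviderName}): " +
        $"input [{string.Join(", ", model.InputModalities)}], " +
        $"output [{string.Join(", ", model.OutputModalities)}]");
}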
The following code example shows how to build a typical interaction between an application, a generative AI model, and connected tools or APIs to mediate interactions between the AI and the outside world. It uses the example of connecting an external weather API to the AI model so it can provide real-time weather information based on user input.
SDK for .NET
Note
There's more on GitHub. Find the complete example and learn how to set up and run it in the AWS Code Examples Repository.
Primary execution of the scenario flow. This scenario orchestrates the conversation between the user, the Amazon Bedrock Converse API, and the weather tool.
using Amazon;
using Amazon.BedrockRuntime;
using Amazon.BedrockRuntime.Model;
using Amazon.Runtime.Documents;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.DependencyInjection.Extensions;
using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.Http;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Logging.Console;

namespace ConverseToolScenario;

public static class ConverseToolScenario
{
    /*
    Before running this .NET code example, set up your development environment, including your credentials.

    This demo illustrates a tool use scenario using Amazon Bedrock's Converse API and a weather tool.
    The script interacts with a foundation model on Amazon Bedrock to provide weather information based on user input.
    It uses the Open-Meteo API (https://open-meteo.com) to retrieve current weather data for a given location.
    */

    public static BedrockActionsWrapper _bedrockActionsWrapper = null!;
    public static WeatherTool _weatherTool = null!;
    public static bool _interactive = true;

    // Change this string to use a different model with Converse API.
    private static string model_id = "amazon.nova-lite-v1:0";

    private static string system_prompt = @"
        You are a weather assistant that provides current weather data for user-specified locations using only
        the Weather_Tool, which expects latitude and longitude. Infer the coordinates from the location yourself.
        If the user provides coordinates, infer the approximate location and refer to it in your response.
        To use the tool, you strictly apply the provided tool specification.

        - Explain your step-by-step process, and give brief updates before each step.
        - Only use the Weather_Tool for data. Never guess or make up information.
        - Repeat the tool use for subsequent requests if necessary.
        - If the tool errors, apologize, explain weather is unavailable, and suggest other options.
        - Report temperatures in °C (°F) and wind in km/h (mph). Keep weather reports concise. Sparingly use
          emojis where appropriate.
        - Only respond to weather queries. Remind off-topic users of your purpose.
        - Never claim to search online, access external data, or use tools besides Weather_Tool.
        - Complete the entire process until you have all required data before sending the complete response.
    ";

    private static string default_prompt = "What is the weather like in Seattle?";

    // The maximum number of recursive calls allowed in the tool use function.
    // This helps prevent infinite loops and potential performance issues.
    private static int max_recursions = 5;

    public static async Task Main(string[] args)
    {
        // Set up dependency injection for the Amazon service.
        using var host = Host.CreateDefaultBuilder(args)
            .ConfigureLogging(logging =>
                logging.AddFilter("System", LogLevel.Error)
                    .AddFilter<ConsoleLoggerProvider>("Microsoft", LogLevel.Trace))
            .ConfigureServices((_, services) =>
                services.AddHttpClient()
                    .AddSingleton<IAmazonBedrockRuntime>(_ => new AmazonBedrockRuntimeClient(RegionEndpoint.USEast1)) // Specify a region that has access to the chosen model.
                    .AddTransient<BedrockActionsWrapper>()
                    .AddTransient<WeatherTool>()
                    .RemoveAll<IHttpMessageHandlerBuilderFilter>()
            )
            .Build();

        ServicesSetup(host);

        try
        {
            await RunConversationAsync();
        }
        catch (Exception ex)
        {
            Console.WriteLine(new string('-', 80));
            Console.WriteLine($"There was a problem running the scenario: {ex.Message}");
            Console.WriteLine(new string('-', 80));
        }
        finally
        {
            Console.WriteLine("Amazon Bedrock Converse API with Tool Use Feature Scenario is complete.");
            Console.WriteLine(new string('-', 80));
        }
    }

    /// <summary>
    /// Populate the services for use within the console application.
    /// </summary>
    /// <param name="host">The services host.</param>
    private static void ServicesSetup(IHost host)
    {
        _bedrockActionsWrapper = host.Services.GetRequiredService<BedrockActionsWrapper>();
        _weatherTool = host.Services.GetRequiredService<WeatherTool>();
    }

    /// <summary>
    /// Starts the conversation with the user and handles the interaction with Bedrock.
    /// </summary>
    /// <returns>The conversation array.</returns>
    public static async Task<List<Message>> RunConversationAsync()
    {
        // Print the greeting and a short user guide
        PrintHeader();

        // Start with an empty conversation
        var conversation = new List<Message>();

        // Get the first user input
        var userInput = await GetUserInputAsync();

        while (userInput != null)
        {
            // Create a new message with the user input and append it to the conversation
            var message = new Message
            {
                Role = ConversationRole.User,
                Content = new List<ContentBlock> { new ContentBlock { Text = userInput } }
            };
            conversation.Add(message);

            // Send the conversation to Amazon Bedrock
            var bedrockResponse = await SendConversationToBedrock(conversation);

            // Recursively handle the model's response until the model has returned
            // its final response or the recursion counter has reached 0
            await ProcessModelResponseAsync(bedrockResponse, conversation, max_recursions);

            // Repeat the loop until the user decides to exit the application
            userInput = await GetUserInputAsync();
        }

        PrintFooter();
        return conversation;
    }

    /// <summary>
    /// Sends the conversation, the system prompt, and the tool spec to Amazon Bedrock, and returns the response.
    /// </summary>
    /// <param name="conversation">The conversation history including the next message to send.</param>
    /// <returns>The response from Amazon Bedrock.</returns>
    private static async Task<ConverseResponse> SendConversationToBedrock(List<Message> conversation)
    {
        Console.WriteLine("\tCalling Bedrock...");

        // Send the conversation, system prompt, and tool configuration, and return the response
        return await _bedrockActionsWrapper.SendConverseRequestAsync(model_id, system_prompt, conversation, _weatherTool.GetToolSpec());
    }

    /// <summary>
    /// Processes the response received via Amazon Bedrock and performs the necessary actions based on the stop reason.
    /// </summary>
    /// <param name="modelResponse">The model's response returned via Amazon Bedrock.</param>
    /// <param name="conversation">The conversation history.</param>
    /// <param name="maxRecursion">The maximum number of recursive calls allowed.</param>
    private static async Task ProcessModelResponseAsync(ConverseResponse modelResponse, List<Message> conversation, int maxRecursion)
    {
        if (maxRecursion <= 0)
        {
            // Stop the process, the number of recursive calls could indicate an infinite loop
            Console.WriteLine("\tWarning: Maximum number of recursions reached. Please try again.");
        }

        // Append the model's response to the ongoing conversation
        conversation.Add(modelResponse.Output.Message);

        if (modelResponse.StopReason == "tool_use")
        {
            // If the stop reason is "tool_use", forward everything to the tool use handler
            await HandleToolUseAsync(modelResponse.Output, conversation, maxRecursion - 1);
        }

        if (modelResponse.StopReason == "end_turn")
        {
            // If the stop reason is "end_turn", print the model's response text, and finish the process
            PrintModelResponse(modelResponse.Output.Message.Content[0].Text);
            if (!_interactive)
            {
                default_prompt = "x";
            }
        }
    }

    /// <summary>
    /// Handles the tool use case by invoking the specified tool and sending the tool's response back to Bedrock.
    /// The tool response is appended to the conversation, and the conversation is sent back to Amazon Bedrock for further processing.
    /// </summary>
    /// <param name="modelResponse">The model's response containing the tool use request.</param>
    /// <param name="conversation">The conversation history.</param>
    /// <param name="maxRecursion">The maximum number of recursive calls allowed.</param>
    public static async Task HandleToolUseAsync(ConverseOutput modelResponse, List<Message> conversation, int maxRecursion)
    {
        // Initialize an empty list of tool results
        var toolResults = new List<ContentBlock>();

        // The model's response can consist of multiple content blocks
        foreach (var contentBlock in modelResponse.Message.Content)
        {
            if (!String.IsNullOrEmpty(contentBlock.Text))
            {
                // If the content block contains text, print it to the console
                PrintModelResponse(contentBlock.Text);
            }

            if (contentBlock.ToolUse != null)
            {
                // If the content block is a tool use request, forward it to the tool
                var toolResponse = await InvokeTool(contentBlock.ToolUse);

                // Add the tool use ID and the tool's response to the list of results
                toolResults.Add(new ContentBlock
                {
                    ToolResult = new ToolResultBlock()
                    {
                        ToolUseId = toolResponse.ToolUseId,
                        Content = new List<ToolResultContentBlock>()
                        {
                            new ToolResultContentBlock { Json = toolResponse.Content }
                        }
                    }
                });
            }
        }

        // Embed the tool results in a new user message
        var message = new Message() { Role = ConversationRole.User, Content = toolResults };

        // Append the new message to the ongoing conversation
        conversation.Add(message);

        // Send the conversation to Amazon Bedrock
        var response = await SendConversationToBedrock(conversation);

        // Recursively handle the model's response until the model has returned
        // its final response or the recursion counter has reached 0
        await ProcessModelResponseAsync(response, conversation, maxRecursion);
    }

    /// <summary>
    /// Invokes the specified tool with the given payload and returns the tool's response.
    /// If the requested tool does not exist, an error message is returned.
    /// </summary>
    /// <param name="payload">The payload containing the tool name and input data.</param>
    /// <returns>The tool's response or an error message.</returns>
    public static async Task<ToolResponse> InvokeTool(ToolUseBlock payload)
    {
        var toolName = payload.Name;

        if (toolName == "Weather_Tool")
        {
            var inputData = payload.Input.AsDictionary();
            PrintToolUse(toolName, inputData);

            // Invoke the weather tool with the input data provided
            var weatherResponse = await _weatherTool.FetchWeatherDataAsync(inputData["latitude"].ToString(), inputData["longitude"].ToString());
            return new ToolResponse { ToolUseId = payload.ToolUseId, Content = weatherResponse };
        }
        else
        {
            var errorMessage = $"\tThe requested tool with name '{toolName}' does not exist.";
            return new ToolResponse { ToolUseId = payload.ToolUseId, Content = new { error = true, message = errorMessage } };
        }
    }

    /// <summary>
    /// Prompts the user for input and returns the user's response.
    /// Returns null if the user enters 'x' to exit.
    /// </summary>
    /// <param name="prompt">The prompt to display to the user.</param>
    /// <returns>The user's input or null if the user chooses to exit.</returns>
    private static async Task<string?> GetUserInputAsync(string prompt = "\tYour weather info request:")
    {
        var userInput = default_prompt;

        if (_interactive)
        {
            Console.WriteLine(new string('*', 80));
            Console.WriteLine($"{prompt} (x to exit): \n\t");
            userInput = Console.ReadLine();
        }

        if (string.IsNullOrWhiteSpace(userInput))
        {
            prompt = "\tPlease enter your weather info request, e.g. the name of a city";
            return await GetUserInputAsync(prompt);
        }

        if (userInput.ToLowerInvariant() == "x")
        {
            return null;
        }

        return userInput;
    }

    /// <summary>
    /// Logs the welcome message and usage guide for the tool use demo.
    /// </summary>
    public static void PrintHeader()
    {
        Console.WriteLine(@"
        =================================================
        Welcome to the Amazon Bedrock Tool Use demo!
        =================================================

        This assistant provides current weather information for user-specified locations.
        You can ask for weather details by providing the location name or coordinates.

        Weather information will be provided using a custom Tool and open-meteo API.

        Example queries:
        - What's the weather like in New York?
        - Current weather for latitude 40.70, longitude -74.01
        - Is it warmer in Rome or Barcelona today?

        To exit the program, simply type 'x' and press Enter.

        P.S.: You're not limited to single locations, or even to using English!
        Have fun and experiment with the app!
        ");
    }

    /// <summary>
    /// Logs the footer information for the tool use demo.
    /// </summary>
    public static void PrintFooter()
    {
        Console.WriteLine(@"
        =================================================
        Thank you for checking out the Amazon Bedrock Tool Use demo. We hope you
        learned something new, or got some inspiration for your own apps today!

        For more Bedrock examples in different programming languages, have a look at:
        https://docs.aws.amazon.com/bedrock/latest/userguide/service_code_examples.html
        =================================================
        ");
    }

    /// <summary>
    /// Logs information about the tool use.
    /// </summary>
    /// <param name="toolName">The name of the tool being used.</param>
    /// <param name="inputData">The input data for the tool.</param>
    public static void PrintToolUse(string toolName, Dictionary<string, Document> inputData)
    {
        Console.WriteLine($"\n\tInvoking tool: {toolName} with input: {inputData["latitude"].ToString()}, {inputData["longitude"].ToString()}...\n");
    }

    /// <summary>
    /// Logs the model's response.
    /// </summary>
    /// <param name="message">The model's response message.</param>
    public static void PrintModelResponse(string message)
    {
        Console.WriteLine("\tThe model's response:\n");
        Console.WriteLine(message);
        Console.WriteLine();
    }
}
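The listing above references a ToolResponse type that isn't shown; it's a small data container defined elsewhere in the example project. The following is a minimal sketch under the assumption that it only needs to carry the tool use ID and an arbitrary result payload:

namespace ConverseToolScenario;

/// <summary>
/// Hypothetical container for a tool invocation result, as used by InvokeTool
/// and HandleToolUseAsync above. Content is declared dynamic so it can hold
/// either a Document with weather data or an anonymous error object; the
/// sample project's actual definition may differ.
/// </summary>
public class ToolResponse
{
    public string ToolUseId { get; set; } = string.Empty;
    public dynamic? Content { get; set; }
}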
The weather tool used by the demo. This file defines the tool specification and implements the logic to retrieve weather data by using the Open-Meteo API.
using Amazon.BedrockRuntime.Model;
using Amazon.Runtime.Documents;
using Microsoft.Extensions.Logging;

namespace ConverseToolScenario;

/// <summary>
/// Weather tool that will be invoked when requested by the Bedrock response.
/// </summary>
public class WeatherTool
{
    private readonly ILogger<WeatherTool> _logger;
    private readonly IHttpClientFactory _httpClientFactory;

    public WeatherTool(ILogger<WeatherTool> logger, IHttpClientFactory httpClientFactory)
    {
        _logger = logger;
        _httpClientFactory = httpClientFactory;
    }

    /// <summary>
    /// Returns the JSON Schema specification for the Weather tool. The tool specification
    /// defines the input schema and describes the tool's functionality.
    /// For more information, see https://json-schema.org/understanding-json-schema/reference.
    /// </summary>
    /// <returns>The tool specification for the Weather tool.</returns>
    public ToolSpecification GetToolSpec()
    {
        ToolSpecification toolSpecification = new ToolSpecification();

        toolSpecification.Name = "Weather_Tool";
        toolSpecification.Description = "Get the current weather for a given location, based on its WGS84 coordinates.";

        Document toolSpecDocument = Document.FromObject(
            new
            {
                type = "object",
                properties = new
                {
                    latitude = new
                    {
                        type = "string",
                        description = "Geographical WGS84 latitude of the location."
                    },
                    longitude = new
                    {
                        type = "string",
                        description = "Geographical WGS84 longitude of the location."
                    }
                },
                required = new[] { "latitude", "longitude" }
            });

        toolSpecification.InputSchema = new ToolInputSchema() { Json = toolSpecDocument };
        return toolSpecification;
    }

    /// <summary>
    /// Fetches weather data for the given latitude and longitude using the Open-Meteo API.
    /// Returns the weather data or an error message if the request fails.
    /// </summary>
    /// <param name="latitude">The latitude of the location.</param>
    /// <param name="longitude">The longitude of the location.</param>
    /// <returns>The weather data or an error message.</returns>
    public async Task<Document> FetchWeatherDataAsync(string latitude, string longitude)
    {
        string endpoint = "https://api.open-meteo.com/v1/forecast";

        try
        {
            var httpClient = _httpClientFactory.CreateClient();
            var response = await httpClient.GetAsync($"{endpoint}?latitude={latitude}&longitude={longitude}&current_weather=True");
            response.EnsureSuccessStatusCode();

            var weatherData = await response.Content.ReadAsStringAsync();
            Document weatherDocument = Document.FromObject(new { weather_data = weatherData });

            return weatherDocument;
        }
        catch (HttpRequestException e)
        {
            _logger.LogError(e, "Error fetching weather data: {Message}", e.Message);
            throw;
        }
        catch (Exception e)
        {
            _logger.LogError(e, "Unexpected error fetching weather data: {Message}", e.Message);
            throw;
        }
    }
}
A Converse API action with a tool configuration.
/// <summary>
/// Wrapper class for interacting with the Amazon Bedrock Converse API.
/// </summary>
public class BedrockActionsWrapper
{
    private readonly IAmazonBedrockRuntime _bedrockClient;
    private readonly ILogger<BedrockActionsWrapper> _logger;

    /// <summary>
    /// Initializes a new instance of the <see cref="BedrockActionsWrapper"/> class.
    /// </summary>
    /// <param name="bedrockClient">The Bedrock Converse API client.</param>
    /// <param name="logger">The logger instance.</param>
    public BedrockActionsWrapper(IAmazonBedrockRuntime bedrockClient, ILogger<BedrockActionsWrapper> logger)
    {
        _bedrockClient = bedrockClient;
        _logger = logger;
    }

    /// <summary>
    /// Sends a Converse request to the Amazon Bedrock Converse API.
    /// </summary>
    /// <param name="modelId">The Bedrock Model Id.</param>
    /// <param name="systemPrompt">A system prompt instruction.</param>
    /// <param name="conversation">The array of messages in the conversation.</param>
    /// <param name="toolSpec">The specification for a tool.</param>
    /// <returns>The response of the model.</returns>
    public async Task<ConverseResponse> SendConverseRequestAsync(string modelId, string systemPrompt, List<Message> conversation, ToolSpecification toolSpec)
    {
        try
        {
            var request = new ConverseRequest()
            {
                ModelId = modelId,
                System = new List<SystemContentBlock>()
                {
                    new SystemContentBlock() { Text = systemPrompt }
                },
                Messages = conversation,
                ToolConfig = new ToolConfiguration()
                {
                    Tools = new List<Tool>()
                    {
                        new Tool() { ToolSpec = toolSpec }
                    }
                }
            };

            var response = await _bedrockClient.ConverseAsync(request);
            return response;
        }
        catch (ModelNotReadyException ex)
        {
            _logger.LogError(ex, "Model not ready, please wait and try again.");
            throw;
        }
        catch (AmazonBedrockRuntimeException ex)
        {
            _logger.LogError(ex, "Error occurred while sending Converse request.");
            throw;
        }
    }
}
For API details, see Converse in the AWS SDK for .NET API Reference.
AI21 Labs Jurassic-2
The following code example shows how to send a text message to AI21 Labs Jurassic-2, using Bedrock's Converse API.
SDK for .NET
Note
There's more on GitHub. Find the complete example and learn how to set up and run it in the AWS Code Examples Repository.
Send a text message to AI21 Labs Jurassic-2, using Bedrock's Converse API.
// Use the Converse API to send a text message to AI21 Labs Jurassic-2.

using System;
using System.Collections.Generic;
using Amazon;
using Amazon.BedrockRuntime;
using Amazon.BedrockRuntime.Model;

// Create a Bedrock Runtime client in the AWS Region you want to use.
var client = new AmazonBedrockRuntimeClient(RegionEndpoint.USEast1);

// Set the model ID, e.g., Jurassic-2 Mid.
var modelId = "ai21.j2-mid-v1";

// Define the user message.
var userMessage = "Describe the purpose of a 'hello world' program in one line.";

// Create a request with the model ID, the user message, and an inference configuration.
var request = new ConverseRequest
{
    ModelId = modelId,
    Messages = new List<Message>
    {
        new Message
        {
            Role = ConversationRole.User,
            Content = new List<ContentBlock> { new ContentBlock { Text = userMessage } }
        }
    },
    InferenceConfig = new InferenceConfiguration()
    {
        MaxTokens = 512,
        Temperature = 0.5F,
        TopP = 0.9F
    }
};

try
{
    // Send the request to the Bedrock Runtime and wait for the result.
    var response = await client.ConverseAsync(request);

    // Extract and print the response text.
    string responseText = response?.Output?.Message?.Content?[0]?.Text ?? "";
    Console.WriteLine(responseText);
}
catch (AmazonBedrockRuntimeException e)
{
    Console.WriteLine($"ERROR: Can't invoke '{modelId}'. Reason: {e.Message}");
    throw;
}
For API details, see Converse in the AWS SDK for .NET API Reference.
The following code example shows how to send a text message to AI21 Labs Jurassic-2, using the Invoke Model API.
SDK for .NET
Note
There's more on GitHub. Find the complete example and learn how to set up and run it in the AWS Code Examples Repository.
Use the Invoke Model API to send a text message.
// Use the native inference API to send a text message to AI21 Labs Jurassic-2.

using System;
using System.IO;
using System.Text.Json;
using System.Text.Json.Nodes;
using Amazon;
using Amazon.BedrockRuntime;
using Amazon.BedrockRuntime.Model;

// Create a Bedrock Runtime client in the AWS Region you want to use.
var client = new AmazonBedrockRuntimeClient(RegionEndpoint.USEast1);

// Set the model ID, e.g., Jurassic-2 Mid.
var modelId = "ai21.j2-mid-v1";

// Define the user message.
var userMessage = "Describe the purpose of a 'hello world' program in one line.";

// Format the request payload using the model's native structure.
var nativeRequest = JsonSerializer.Serialize(new
{
    prompt = userMessage,
    maxTokens = 512,
    temperature = 0.5
});

// Create a request with the model ID and the model's native request payload.
var request = new InvokeModelRequest()
{
    ModelId = modelId,
    Body = new MemoryStream(System.Text.Encoding.UTF8.GetBytes(nativeRequest)),
    ContentType = "application/json"
};

try
{
    // Send the request to the Bedrock Runtime and wait for the response.
    var response = await client.InvokeModelAsync(request);

    // Decode the response body.
    var modelResponse = await JsonNode.ParseAsync(response.Body);

    // Extract and print the response text.
    var responseText = modelResponse["completions"]?[0]?["data"]?["text"] ?? "";
    Console.WriteLine(responseText);
}
catch (AmazonBedrockRuntimeException e)
{
    Console.WriteLine($"ERROR: Can't invoke '{modelId}'. Reason: {e.Message}");
    throw;
}
For API details, see InvokeModel in the AWS SDK for .NET API Reference.
Amazon Nova
The following code example shows how to send a text message to Amazon Nova, using Bedrock's Converse API.
SDK for .NET
Note
There's more on GitHub. Find the complete example and learn how to set up and run it in the AWS Code Examples Repository.
Send a text message to Amazon Nova, using Bedrock's Converse API.
// Use the Converse API to send a text message to Amazon Nova.

using System;
using System.Collections.Generic;
using Amazon;
using Amazon.BedrockRuntime;
using Amazon.BedrockRuntime.Model;

// Create a Bedrock Runtime client in the AWS Region you want to use.
var client = new AmazonBedrockRuntimeClient(RegionEndpoint.USEast1);

// Set the model ID, e.g., Amazon Nova Lite.
var modelId = "amazon.nova-lite-v1:0";

// Define the user message.
var userMessage = "Describe the purpose of a 'hello world' program in one line.";

// Create a request with the model ID, the user message, and an inference configuration.
var request = new ConverseRequest
{
    ModelId = modelId,
    Messages = new List<Message>
    {
        new Message
        {
            Role = ConversationRole.User,
            Content = new List<ContentBlock> { new ContentBlock { Text = userMessage } }
        }
    },
    InferenceConfig = new InferenceConfiguration()
    {
        MaxTokens = 512,
        Temperature = 0.5F,
        TopP = 0.9F
    }
};

try
{
    // Send the request to the Bedrock Runtime and wait for the result.
    var response = await client.ConverseAsync(request);

    // Extract and print the response text.
    string responseText = response?.Output?.Message?.Content?[0]?.Text ?? "";
    Console.WriteLine(responseText);
}
catch (AmazonBedrockRuntimeException e)
{
    Console.WriteLine($"ERROR: Can't invoke '{modelId}'. Reason: {e.Message}");
    throw;
}
Send a conversation of messages to Amazon Nova, using Bedrock's Converse API with a tool configuration. This request is implemented by the BedrockActionsWrapper class shown in the tool use scenario earlier in this topic.
For API details, see Converse in the AWS SDK for .NET API Reference.
The following code example shows how to send a text message to Amazon Nova, using Bedrock's Converse API, and process the response stream in real-time.
SDK for .NET
Note
There's more on GitHub. Find the complete example and learn how to set up and run it in the AWS Code Examples Repository.
Send a text message to Amazon Nova, using Bedrock's Converse API, and process the response stream in real-time.
// Use the Converse API to send a text message to Amazon Nova
// and print the response stream.

using System;
using System.Collections.Generic;
using System.Linq;
using Amazon;
using Amazon.BedrockRuntime;
using Amazon.BedrockRuntime.Model;

// Create a Bedrock Runtime client in the AWS Region you want to use.
var client = new AmazonBedrockRuntimeClient(RegionEndpoint.USEast1);

// Set the model ID, e.g., Amazon Nova Lite.
var modelId = "amazon.nova-lite-v1:0";

// Define the user message.
var userMessage = "Describe the purpose of a 'hello world' program in one line.";

// Create a request with the model ID, the user message, and an inference configuration.
var request = new ConverseStreamRequest
{
    ModelId = modelId,
    Messages = new List<Message>
    {
        new Message
        {
            Role = ConversationRole.User,
            Content = new List<ContentBlock> { new ContentBlock { Text = userMessage } }
        }
    },
    InferenceConfig = new InferenceConfiguration()
    {
        MaxTokens = 512,
        Temperature = 0.5F,
        TopP = 0.9F
    }
};

try
{
    // Send the request to the Bedrock Runtime and wait for the result.
    var response = await client.ConverseStreamAsync(request);

    // Extract and print the streamed response text in real-time.
    foreach (var chunk in response.Stream.AsEnumerable())
    {
        if (chunk is ContentBlockDeltaEvent)
        {
            Console.Write((chunk as ContentBlockDeltaEvent).Delta.Text);
        }
    }
}
catch (AmazonBedrockRuntimeException e)
{
    Console.WriteLine($"ERROR: Can't invoke '{modelId}'. Reason: {e.Message}");
    throw;
}
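The loop above prints only text deltas. The Converse stream emits several other event types as well. The following is a rough sketch of a fuller handler; the event type names (MessageStartEvent, MessageStopEvent, ConverseStreamMetadataEvent) and their members come from the Amazon.BedrockRuntime.Model namespace and should be verified against the SDK version you use:

// A sketch of handling more Converse stream event types, not only text deltas.
// Assumes 'client' and 'request' are the objects built in the example above.
var streamResponse = await client.ConverseStreamAsync(request);

foreach (var chunk in streamResponse.Stream.AsEnumerable())
{
    switch (chunk)
    {
        case MessageStartEvent start:
            // The assistant's turn begins; the role is typically "assistant".
            Console.WriteLine($"[message start, role: {start.Role}]");
            break;
        case ContentBlockDeltaEvent delta:
            // Incremental text: print it as it arrives.
            Console.Write(delta.Delta.Text);
            break;
        case MessageStopEvent stop:
            // The model finished its turn; inspect the stop reason (e.g., "end_turn").
            Console.WriteLine($"\n[stop reason: {stop.StopReason}]");
            break;
        case ConverseStreamMetadataEvent metadata:
            // The final event carries token usage for the request.
            Console.WriteLine($"[tokens in/out: {metadata.Usage.InputTokens}/{metadata.Usage.OutputTokens}]");
            break;
    }
}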
For API details, see ConverseStream in the AWS SDK for .NET API Reference.
The tool use scenario shown earlier in this topic also runs on Amazon Nova (model ID amazon.nova-lite-v1:0). See Scenarios above for the complete scenario flow, weather tool, and Converse API wrapper code.
Amazon Nova Canvas
The following code example shows how to invoke Amazon Nova Canvas on Amazon Bedrock to generate an image.
SDK for .NET
Note
There's more on GitHub. Find the complete example and learn how to set up and run it in the AWS Code Examples Repository.
Create an image with Amazon Nova Canvas.
// Use the native inference API to create an image with Amazon Nova Canvas.

using System;
using System.IO;
using System.Text.Json;
using System.Text.Json.Nodes;
using Amazon;
using Amazon.BedrockRuntime;
using Amazon.BedrockRuntime.Model;

// Create a Bedrock Runtime client in the AWS Region you want to use.
var client = new AmazonBedrockRuntimeClient(RegionEndpoint.USEast1);

// Set the model ID.
var modelId = "amazon.nova-canvas-v1:0";

// Define the image generation prompt for the model.
var prompt = "A stylized picture of a cute old steampunk robot.";

// Create a random seed between 0 and 858,993,459
int seed = new Random().Next(0, 858993460);

// Format the request payload using the model's native structure.
var nativeRequest = JsonSerializer.Serialize(new
{
    taskType = "TEXT_IMAGE",
    textToImageParams = new
    {
        text = prompt
    },
    imageGenerationConfig = new
    {
        seed,
        quality = "standard",
        width = 512,
        height = 512,
        numberOfImages = 1
    }
});

// Create a request with the model ID and the model's native request payload.
var request = new InvokeModelRequest()
{
    ModelId = modelId,
    Body = new MemoryStream(System.Text.Encoding.UTF8.GetBytes(nativeRequest)),
    ContentType = "application/json"
};

try
{
    // Send the request to the Bedrock Runtime and wait for the response.
    var response = await client.InvokeModelAsync(request);

    // Decode the response body.
    var modelResponse = await JsonNode.ParseAsync(response.Body);

    // Extract the image data.
    var base64Image = modelResponse["images"]?[0].ToString() ?? "";

    // Save the image in a local folder
    string savedPath = AmazonNovaCanvas.InvokeModel.SaveBase64Image(base64Image);
    Console.WriteLine($"Image saved to: {savedPath}");
}
catch (AmazonBedrockRuntimeException e)
{
    Console.WriteLine($"ERROR: Can't invoke '{modelId}'. Reason: {e.Message}");
    throw;
}
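The SaveBase64Image helper in this listing belongs to the example project, not the SDK. A minimal sketch of what such a helper might look like, assuming the model returns a base64-encoded PNG and that the class name, file name, and output folder are free choices:

// Hypothetical helper: decode a base64 string and write it to a PNG file.
// Not the sample project's actual implementation; names here are assumptions.
using System;
using System.IO;

public static class ImageSaver
{
    public static string SaveBase64Image(string base64Image, string folder = "generated-images")
    {
        Directory.CreateDirectory(folder);

        // Decode the base64 payload into raw image bytes.
        byte[] imageBytes = Convert.FromBase64String(base64Image);

        // Write the bytes to a uniquely named PNG file and return its path.
        string path = Path.Combine(folder, $"nova-canvas-{DateTime.Now:yyyyMMdd-HHmmss}.png");
        File.WriteAllBytes(path, imageBytes);
        return path;
    }
}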
For API details, see InvokeModel in the AWS SDK for .NET API Reference.
Amazon Titan Text
The following code example shows how to send a text message to Amazon Titan Text, using Bedrock's Converse API.
SDK for .NET
Note
There's more on GitHub. Find the complete example and learn how to set up and run it in the AWS Code Examples Repository.
Send a text message to Amazon Titan Text, using Bedrock's Converse API.
// Use the Converse API to send a text message to Amazon Titan Text.

using System;
using System.Collections.Generic;
using Amazon;
using Amazon.BedrockRuntime;
using Amazon.BedrockRuntime.Model;

// Create a Bedrock Runtime client in the AWS Region you want to use.
var client = new AmazonBedrockRuntimeClient(RegionEndpoint.USEast1);

// Set the model ID, e.g., Titan Text Premier.
var modelId = "amazon.titan-text-premier-v1:0";

// Define the user message.
var userMessage = "Describe the purpose of a 'hello world' program in one line.";

// Create a request with the model ID, the user message, and an inference configuration.
var request = new ConverseRequest
{
    ModelId = modelId,
    Messages = new List<Message>
    {
        new Message
        {
            Role = ConversationRole.User,
            Content = new List<ContentBlock> { new ContentBlock { Text = userMessage } }
        }
    },
    InferenceConfig = new InferenceConfiguration()
    {
        MaxTokens = 512,
        Temperature = 0.5F,
        TopP = 0.9F
    }
};

try
{
    // Send the request to the Bedrock Runtime and wait for the result.
    var response = await client.ConverseAsync(request);

    // Extract and print the response text.
    string responseText = response?.Output?.Message?.Content?[0]?.Text ?? "";
    Console.WriteLine(responseText);
}
catch (AmazonBedrockRuntimeException e)
{
    Console.WriteLine($"ERROR: Can't invoke '{modelId}'. Reason: {e.Message}");
    throw;
}
For API details, see Converse in the AWS SDK for .NET API Reference.
The following code example shows how to send a text message to Amazon Titan Text, using Bedrock's Converse API, and process the response stream in real-time.
SDK for .NET
Note
There's more on GitHub. Find the complete example and learn how to set up and run it in the AWS Code Examples Repository.
Send a text message to Amazon Titan Text, using Bedrock's Converse API, and process the response stream in real-time.
// Use the Converse API to send a text message to Amazon Titan Text
// and print the response stream.

using System;
using System.Collections.Generic;
using System.Linq;
using Amazon;
using Amazon.BedrockRuntime;
using Amazon.BedrockRuntime.Model;

// Create a Bedrock Runtime client in the AWS Region you want to use.
var client = new AmazonBedrockRuntimeClient(RegionEndpoint.USEast1);

// Set the model ID, e.g., Titan Text Premier.
var modelId = "amazon.titan-text-premier-v1:0";

// Define the user message.
var userMessage = "Describe the purpose of a 'hello world' program in one line.";

// Create a request with the model ID, the user message, and an inference configuration.
var request = new ConverseStreamRequest
{
    ModelId = modelId,
    Messages = new List<Message>
    {
        new Message
        {
            Role = ConversationRole.User,
            Content = new List<ContentBlock> { new ContentBlock { Text = userMessage } }
        }
    },
    InferenceConfig = new InferenceConfiguration()
    {
        MaxTokens = 512,
        Temperature = 0.5F,
        TopP = 0.9F
    }
};

try
{
    // Send the request to the Bedrock Runtime and wait for the result.
    var response = await client.ConverseStreamAsync(request);

    // Extract and print the streamed response text in real-time.
    foreach (var chunk in response.Stream.AsEnumerable())
    {
        if (chunk is ContentBlockDeltaEvent)
        {
            Console.Write((chunk as ContentBlockDeltaEvent).Delta.Text);
        }
    }
}
catch (AmazonBedrockRuntimeException e)
{
    Console.WriteLine($"ERROR: Can't invoke '{modelId}'. Reason: {e.Message}");
    throw;
}
For API details, see ConverseStream in the AWS SDK for .NET API Reference.
The following code example shows how to send a text message to Amazon Titan Text, using the Invoke Model API.
SDK for .NET
Note
There's more on GitHub. Find the complete example and learn how to set up and run it in the AWS Code Examples Repository.
Use the Invoke Model API to send a text message.
// Use the native inference API to send a text message to Amazon Titan Text.

using System;
using System.IO;
using System.Text.Json;
using System.Text.Json.Nodes;
using Amazon;
using Amazon.BedrockRuntime;
using Amazon.BedrockRuntime.Model;

// Create a Bedrock Runtime client in the AWS Region you want to use.
var client = new AmazonBedrockRuntimeClient(RegionEndpoint.USEast1);

// Set the model ID, e.g., Titan Text Premier.
var modelId = "amazon.titan-text-premier-v1:0";

// Define the user message.
var userMessage = "Describe the purpose of a 'hello world' program in one line.";

// Format the request payload using the model's native structure.
var nativeRequest = JsonSerializer.Serialize(new
{
    inputText = userMessage,
    textGenerationConfig = new
    {
        maxTokenCount = 512,
        temperature = 0.5
    }
});

// Create a request with the model ID and the model's native request payload.
var request = new InvokeModelRequest()
{
    ModelId = modelId,
    Body = new MemoryStream(System.Text.Encoding.UTF8.GetBytes(nativeRequest)),
    ContentType = "application/json"
};

try
{
    // Send the request to the Bedrock Runtime and wait for the response.
    var response = await client.InvokeModelAsync(request);

    // Decode the response body.
    var modelResponse = await JsonNode.ParseAsync(response.Body);

    // Extract and print the response text.
    var responseText = modelResponse["results"]?[0]?["outputText"] ?? "";
    Console.WriteLine(responseText);
}
catch (AmazonBedrockRuntimeException e)
{
    Console.WriteLine($"ERROR: Can't invoke '{modelId}'. Reason: {e.Message}");
    throw;
}
For API details, see InvokeModel in the AWS SDK for .NET API Reference.
The following code example shows how to send a text message to Amazon Titan Text models, using the Invoke Model API, and print the response stream.
SDK for .NET
Note
There's more on GitHub. Find the complete example and learn how to set up and run it in the AWS Code Examples Repository.
Use the Invoke Model API to send a text message and process the response stream in real-time.
// Use the native inference API to send a text message to Amazon Titan Text
// and print the response stream.

using System;
using System.IO;
using System.Text.Json;
using System.Text.Json.Nodes;
using Amazon;
using Amazon.BedrockRuntime;
using Amazon.BedrockRuntime.Model;

// Create a Bedrock Runtime client in the AWS Region you want to use.
var client = new AmazonBedrockRuntimeClient(RegionEndpoint.USEast1);

// Set the model ID, e.g., Titan Text Premier.
var modelId = "amazon.titan-text-premier-v1:0";

// Define the user message.
var userMessage = "Describe the purpose of a 'hello world' program in one line.";

// Format the request payload using the model's native structure.
var nativeRequest = JsonSerializer.Serialize(new
{
    inputText = userMessage,
    textGenerationConfig = new
    {
        maxTokenCount = 512,
        temperature = 0.5
    }
});

// Create a request with the model ID and the model's native request payload.
var request = new InvokeModelWithResponseStreamRequest()
{
    ModelId = modelId,
    Body = new MemoryStream(System.Text.Encoding.UTF8.GetBytes(nativeRequest)),
    ContentType = "application/json"
};

try
{
    // Send the request to the Bedrock Runtime and wait for the response.
    var streamingResponse = await client.InvokeModelWithResponseStreamAsync(request);

    // Extract and print the streamed response text in real-time.
    foreach (var item in streamingResponse.Body)
    {
        var chunk = JsonSerializer.Deserialize<JsonObject>((item as PayloadPart).Bytes);
        var text = chunk["outputText"] ?? "";
        Console.Write(text);
    }
}
catch (AmazonBedrockRuntimeException e)
{
    Console.WriteLine($"ERROR: Can't invoke '{modelId}'. Reason: {e.Message}");
    throw;
}
For API details, see InvokeModelWithResponseStream in the AWS SDK for .NET API Reference.
Anthropic Claude
The following code example shows how to send a text message to Anthropic Claude, using Bedrock's Converse API.
SDK for .NET
Note
There's more on GitHub. Find the complete example and learn how to set up and run it in the AWS Code Examples Repository.
Send a text message to Anthropic Claude, using Bedrock's Converse API.
// Use the Converse API to send a text message to Anthropic Claude.

using System;
using System.Collections.Generic;
using Amazon;
using Amazon.BedrockRuntime;
using Amazon.BedrockRuntime.Model;

// Create a Bedrock Runtime client in the AWS Region you want to use.
var client = new AmazonBedrockRuntimeClient(RegionEndpoint.USEast1);

// Set the model ID, e.g., Claude 3 Haiku.
var modelId = "anthropic.claude-3-haiku-20240307-v1:0";

// Define the user message.
var userMessage = "Describe the purpose of a 'hello world' program in one line.";

// Create a request with the model ID, the user message, and an inference configuration.
var request = new ConverseRequest
{
    ModelId = modelId,
    Messages = new List<Message>
    {
        new Message
        {
            Role = ConversationRole.User,
            Content = new List<ContentBlock> { new ContentBlock { Text = userMessage } }
        }
    },
    InferenceConfig = new InferenceConfiguration()
    {
        MaxTokens = 512,
        Temperature = 0.5F,
        TopP = 0.9F
    }
};

try
{
    // Send the request to the Bedrock Runtime and wait for the result.
    var response = await client.ConverseAsync(request);

    // Extract and print the response text.
    string responseText = response?.Output?.Message?.Content?[0]?.Text ?? "";
    Console.WriteLine(responseText);
}
catch (AmazonBedrockRuntimeException e)
{
    Console.WriteLine($"ERROR: Can't invoke '{modelId}'. Reason: {e.Message}");
    throw;
}
For API details, see Converse in the AWS SDK for .NET API Reference.
The following code example shows how to send a text message to Anthropic Claude, using Bedrock's Converse API, and process the response stream in real time.
- SDK for .NET
-
Note
There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.
Send a text message to Anthropic Claude, using Bedrock's Converse API, and process the response stream in real time.
// Use the Converse API to send a text message to Anthropic Claude
// and print the response stream.

using System;
using System.Collections.Generic;
using System.Linq;
using Amazon;
using Amazon.BedrockRuntime;
using Amazon.BedrockRuntime.Model;

// Create a Bedrock Runtime client in the AWS Region you want to use.
var client = new AmazonBedrockRuntimeClient(RegionEndpoint.USEast1);

// Set the model ID, e.g., Claude 3 Haiku.
var modelId = "anthropic.claude-3-haiku-20240307-v1:0";

// Define the user message.
var userMessage = "Describe the purpose of a 'hello world' program in one line.";

// Create a request with the model ID, the user message, and an inference configuration.
var request = new ConverseStreamRequest
{
    ModelId = modelId,
    Messages = new List<Message>
    {
        new Message
        {
            Role = ConversationRole.User,
            Content = new List<ContentBlock> { new ContentBlock { Text = userMessage } }
        }
    },
    InferenceConfig = new InferenceConfiguration()
    {
        MaxTokens = 512,
        Temperature = 0.5F,
        TopP = 0.9F
    }
};

try
{
    // Send the request to the Bedrock Runtime and wait for the result.
    var response = await client.ConverseStreamAsync(request);

    // Extract and print the streamed response text in real-time.
    foreach (var chunk in response.Stream.AsEnumerable())
    {
        if (chunk is ContentBlockDeltaEvent)
        {
            Console.Write((chunk as ContentBlockDeltaEvent).Delta.Text);
        }
    }
}
catch (AmazonBedrockRuntimeException e)
{
    Console.WriteLine($"ERROR: Can't invoke '{modelId}'. Reason: {e.Message}");
    throw;
}
-
For API details, see ConverseStream in the AWS SDK for .NET API Reference.
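If you also need the complete text after streaming finishes (for logging or post-processing, say), you can collect the deltas while printing them. This is a minimal sketch built on the same response object and types as the example above, not part of the original sample.

// Hypothetical variant: accumulate streamed deltas into a single string.
var sb = new System.Text.StringBuilder();
foreach (var chunk in response.Stream.AsEnumerable())
{
    if (chunk is ContentBlockDeltaEvent delta)
    {
        sb.Append(delta.Delta.Text);     // collect each text fragment
        Console.Write(delta.Delta.Text); // and still print in real time
    }
}
string fullText = sb.ToString(); // complete response after the stream ends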
-
The following code example shows how to send a text message to Anthropic Claude, using the Invoke Model API.
- SDK for .NET
-
Note
There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.
Use the Invoke Model API to send a text message.
// Use the native inference API to send a text message to Anthropic Claude.

using System;
using System.IO;
using System.Text.Json;
using System.Text.Json.Nodes;
using Amazon;
using Amazon.BedrockRuntime;
using Amazon.BedrockRuntime.Model;

// Create a Bedrock Runtime client in the AWS Region you want to use.
var client = new AmazonBedrockRuntimeClient(RegionEndpoint.USEast1);

// Set the model ID, e.g., Claude 3 Haiku.
var modelId = "anthropic.claude-3-haiku-20240307-v1:0";

// Define the user message.
var userMessage = "Describe the purpose of a 'hello world' program in one line.";

// Format the request payload using the model's native structure.
var nativeRequest = JsonSerializer.Serialize(new
{
    anthropic_version = "bedrock-2023-05-31",
    max_tokens = 512,
    temperature = 0.5,
    messages = new[]
    {
        new { role = "user", content = userMessage }
    }
});

// Create a request with the model ID and the model's native request payload.
var request = new InvokeModelRequest()
{
    ModelId = modelId,
    Body = new MemoryStream(System.Text.Encoding.UTF8.GetBytes(nativeRequest)),
    ContentType = "application/json"
};

try
{
    // Send the request to the Bedrock Runtime and wait for the response.
    var response = await client.InvokeModelAsync(request);

    // Decode the response body.
    var modelResponse = await JsonNode.ParseAsync(response.Body);

    // Extract and print the response text.
    var responseText = modelResponse["content"]?[0]?["text"] ?? "";
    Console.WriteLine(responseText);
}
catch (AmazonBedrockRuntimeException e)
{
    Console.WriteLine($"ERROR: Can't invoke '{modelId}'. Reason: {e.Message}");
    throw;
}
-
For API details, see InvokeModel in the AWS SDK for .NET API Reference.
-
The following code example shows how to use the Invoke Model API to send a text message to Anthropic Claude models and print the response stream.
- SDK for .NET
-
Note
There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.
Use the Invoke Model API to send a text message and process the response stream in real time.
// Use the native inference API to send a text message to Anthropic Claude
// and print the response stream.

using System;
using System.IO;
using System.Text.Json;
using System.Text.Json.Nodes;
using Amazon;
using Amazon.BedrockRuntime;
using Amazon.BedrockRuntime.Model;

// Create a Bedrock Runtime client in the AWS Region you want to use.
var client = new AmazonBedrockRuntimeClient(RegionEndpoint.USEast1);

// Set the model ID, e.g., Claude 3 Haiku.
var modelId = "anthropic.claude-3-haiku-20240307-v1:0";

// Define the user message.
var userMessage = "Describe the purpose of a 'hello world' program in one line.";

// Format the request payload using the model's native structure.
var nativeRequest = JsonSerializer.Serialize(new
{
    anthropic_version = "bedrock-2023-05-31",
    max_tokens = 512,
    temperature = 0.5,
    messages = new[]
    {
        new { role = "user", content = userMessage }
    }
});

// Create a request with the model ID and the model's native request payload.
var request = new InvokeModelWithResponseStreamRequest()
{
    ModelId = modelId,
    Body = new MemoryStream(System.Text.Encoding.UTF8.GetBytes(nativeRequest)),
    ContentType = "application/json"
};

try
{
    // Send the request to the Bedrock Runtime and wait for the response.
    var streamingResponse = await client.InvokeModelWithResponseStreamAsync(request);

    // Extract and print the streamed response text in real-time.
    foreach (var item in streamingResponse.Body)
    {
        var chunk = JsonSerializer.Deserialize<JsonObject>((item as PayloadPart).Bytes);
        var text = chunk["delta"]?["text"] ?? "";
        Console.Write(text);
    }
}
catch (AmazonBedrockRuntimeException e)
{
    Console.WriteLine($"ERROR: Can't invoke '{modelId}'. Reason: {e.Message}");
    throw;
}
-
For API details, see InvokeModelWithResponseStream in the AWS SDK for .NET API Reference.
-
Cohere Command
The following code example shows how to send a text message to Cohere Command, using Bedrock's Converse API.
- SDK for .NET
-
Note
There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.
Send a text message to Cohere Command, using Bedrock's Converse API.
// Use the Converse API to send a text message to Cohere Command.

using System;
using System.Collections.Generic;
using Amazon;
using Amazon.BedrockRuntime;
using Amazon.BedrockRuntime.Model;

// Create a Bedrock Runtime client in the AWS Region you want to use.
var client = new AmazonBedrockRuntimeClient(RegionEndpoint.USEast1);

// Set the model ID, e.g., Command R.
var modelId = "cohere.command-r-v1:0";

// Define the user message.
var userMessage = "Describe the purpose of a 'hello world' program in one line.";

// Create a request with the model ID, the user message, and an inference configuration.
var request = new ConverseRequest
{
    ModelId = modelId,
    Messages = new List<Message>
    {
        new Message
        {
            Role = ConversationRole.User,
            Content = new List<ContentBlock> { new ContentBlock { Text = userMessage } }
        }
    },
    InferenceConfig = new InferenceConfiguration()
    {
        MaxTokens = 512,
        Temperature = 0.5F,
        TopP = 0.9F
    }
};

try
{
    // Send the request to the Bedrock Runtime and wait for the result.
    var response = await client.ConverseAsync(request);

    // Extract and print the response text.
    string responseText = response?.Output?.Message?.Content?[0]?.Text ?? "";
    Console.WriteLine(responseText);
}
catch (AmazonBedrockRuntimeException e)
{
    Console.WriteLine($"ERROR: Can't invoke '{modelId}'. Reason: {e.Message}");
    throw;
}
-
For API details, see Converse in the AWS SDK for .NET API Reference.
-
The following code example shows how to send a text message to Cohere Command, using Bedrock's Converse API, and process the response stream in real time.
- SDK for .NET
-
Note
There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.
Send a text message to Cohere Command, using Bedrock's Converse API, and process the response stream in real time.
// Use the Converse API to send a text message to Cohere Command
// and print the response stream.

using System;
using System.Collections.Generic;
using System.Linq;
using Amazon;
using Amazon.BedrockRuntime;
using Amazon.BedrockRuntime.Model;

// Create a Bedrock Runtime client in the AWS Region you want to use.
var client = new AmazonBedrockRuntimeClient(RegionEndpoint.USEast1);

// Set the model ID, e.g., Command R.
var modelId = "cohere.command-r-v1:0";

// Define the user message.
var userMessage = "Describe the purpose of a 'hello world' program in one line.";

// Create a request with the model ID, the user message, and an inference configuration.
var request = new ConverseStreamRequest
{
    ModelId = modelId,
    Messages = new List<Message>
    {
        new Message
        {
            Role = ConversationRole.User,
            Content = new List<ContentBlock> { new ContentBlock { Text = userMessage } }
        }
    },
    InferenceConfig = new InferenceConfiguration()
    {
        MaxTokens = 512,
        Temperature = 0.5F,
        TopP = 0.9F
    }
};

try
{
    // Send the request to the Bedrock Runtime and wait for the result.
    var response = await client.ConverseStreamAsync(request);

    // Extract and print the streamed response text in real-time.
    foreach (var chunk in response.Stream.AsEnumerable())
    {
        if (chunk is ContentBlockDeltaEvent)
        {
            Console.Write((chunk as ContentBlockDeltaEvent).Delta.Text);
        }
    }
}
catch (AmazonBedrockRuntimeException e)
{
    Console.WriteLine($"ERROR: Can't invoke '{modelId}'. Reason: {e.Message}");
    throw;
}
-
For API details, see ConverseStream in the AWS SDK for .NET API Reference.
-
The following code example shows how to send a text message to Cohere Command R and R+, using the Invoke Model API.
- SDK for .NET
-
Note
There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.
Use the Invoke Model API to send a text message.
// Use the native inference API to send a text message to Cohere Command R.

using System;
using System.IO;
using System.Text.Json;
using System.Text.Json.Nodes;
using Amazon;
using Amazon.BedrockRuntime;
using Amazon.BedrockRuntime.Model;

// Create a Bedrock Runtime client in the AWS Region you want to use.
var client = new AmazonBedrockRuntimeClient(RegionEndpoint.USEast1);

// Set the model ID, e.g., Command R.
var modelId = "cohere.command-r-v1:0";

// Define the user message.
var userMessage = "Describe the purpose of a 'hello world' program in one line.";

// Format the request payload using the model's native structure.
var nativeRequest = JsonSerializer.Serialize(new
{
    message = userMessage,
    max_tokens = 512,
    temperature = 0.5
});

// Create a request with the model ID and the model's native request payload.
var request = new InvokeModelRequest()
{
    ModelId = modelId,
    Body = new MemoryStream(System.Text.Encoding.UTF8.GetBytes(nativeRequest)),
    ContentType = "application/json"
};

try
{
    // Send the request to the Bedrock Runtime and wait for the response.
    var response = await client.InvokeModelAsync(request);

    // Decode the response body.
    var modelResponse = await JsonNode.ParseAsync(response.Body);

    // Extract and print the response text.
    var responseText = modelResponse["text"] ?? "";
    Console.WriteLine(responseText);
}
catch (AmazonBedrockRuntimeException e)
{
    Console.WriteLine($"ERROR: Can't invoke '{modelId}'. Reason: {e.Message}");
    throw;
}
-
For API details, see InvokeModel in the AWS SDK for .NET API Reference.
-
The following code example shows how to send a text message to Cohere Command, using the Invoke Model API.
- SDK for .NET
-
Note
There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.
Use the Invoke Model API to send a text message.
// Use the native inference API to send a text message to Cohere Command.

using System;
using System.IO;
using System.Text.Json;
using System.Text.Json.Nodes;
using Amazon;
using Amazon.BedrockRuntime;
using Amazon.BedrockRuntime.Model;

// Create a Bedrock Runtime client in the AWS Region you want to use.
var client = new AmazonBedrockRuntimeClient(RegionEndpoint.USEast1);

// Set the model ID, e.g., Command Light.
var modelId = "cohere.command-light-text-v14";

// Define the user message.
var userMessage = "Describe the purpose of a 'hello world' program in one line.";

// Format the request payload using the model's native structure.
var nativeRequest = JsonSerializer.Serialize(new
{
    prompt = userMessage,
    max_tokens = 512,
    temperature = 0.5
});

// Create a request with the model ID and the model's native request payload.
var request = new InvokeModelRequest()
{
    ModelId = modelId,
    Body = new MemoryStream(System.Text.Encoding.UTF8.GetBytes(nativeRequest)),
    ContentType = "application/json"
};

try
{
    // Send the request to the Bedrock Runtime and wait for the response.
    var response = await client.InvokeModelAsync(request);

    // Decode the response body.
    var modelResponse = await JsonNode.ParseAsync(response.Body);

    // Extract and print the response text.
    var responseText = modelResponse["generations"]?[0]?["text"] ?? "";
    Console.WriteLine(responseText);
}
catch (AmazonBedrockRuntimeException e)
{
    Console.WriteLine($"ERROR: Can't invoke '{modelId}'. Reason: {e.Message}");
    throw;
}
-
For API details, see InvokeModel in the AWS SDK for .NET API Reference.
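As the two preceding examples show, the native payloads differ between Cohere model generations: Command R and R+ take a message field and return the text at the top-level "text" key, while the older Command models take a prompt field and return it under "generations[0].text". A side-by-side sketch of just the payload construction, using the userMessage variable from the examples above (the variable names here are ours, added for illustration):

// Command R / R+ native request; read the reply from modelResponse["text"].
var commandRRequestBody = JsonSerializer.Serialize(new
{
    message = userMessage, max_tokens = 512, temperature = 0.5
});

// Legacy Command native request; read the reply from
// modelResponse["generations"]?[0]?["text"].
var commandRequestBody = JsonSerializer.Serialize(new
{
    prompt = userMessage, max_tokens = 512, temperature = 0.5
});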
-
The following code example shows how to send a text message to Cohere Command R and R+, using the Invoke Model API with a response stream.
- SDK for .NET
-
Note
There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.
Use the Invoke Model API to send a text message and process the response stream in real time.
// Use the native inference API to send a text message to Cohere Command R
// and print the response stream.

using System;
using System.IO;
using System.Text.Json;
using System.Text.Json.Nodes;
using Amazon;
using Amazon.BedrockRuntime;
using Amazon.BedrockRuntime.Model;

// Create a Bedrock Runtime client in the AWS Region you want to use.
var client = new AmazonBedrockRuntimeClient(RegionEndpoint.USEast1);

// Set the model ID, e.g., Command R.
var modelId = "cohere.command-r-v1:0";

// Define the user message.
var userMessage = "Describe the purpose of a 'hello world' program in one line.";

// Format the request payload using the model's native structure.
var nativeRequest = JsonSerializer.Serialize(new
{
    message = userMessage,
    max_tokens = 512,
    temperature = 0.5
});

// Create a request with the model ID and the model's native request payload.
var request = new InvokeModelWithResponseStreamRequest()
{
    ModelId = modelId,
    Body = new MemoryStream(System.Text.Encoding.UTF8.GetBytes(nativeRequest)),
    ContentType = "application/json"
};

try
{
    // Send the request to the Bedrock Runtime and wait for the response.
    var streamingResponse = await client.InvokeModelWithResponseStreamAsync(request);

    // Extract and print the streamed response text in real-time.
    foreach (var item in streamingResponse.Body)
    {
        var chunk = JsonSerializer.Deserialize<JsonObject>((item as PayloadPart).Bytes);
        var text = chunk["text"] ?? "";
        Console.Write(text);
    }
}
catch (AmazonBedrockRuntimeException e)
{
    Console.WriteLine($"ERROR: Can't invoke '{modelId}'. Reason: {e.Message}");
    throw;
}
-
For API details, see InvokeModelWithResponseStream in the AWS SDK for .NET API Reference.
-
The following code example shows how to send a text message to Cohere Command, using the Invoke Model API with a response stream.
- SDK for .NET
-
Note
There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.
Use the Invoke Model API to send a text message and process the response stream in real time.
// Use the native inference API to send a text message to Cohere Command
// and print the response stream.

using System;
using System.IO;
using System.Text.Json;
using System.Text.Json.Nodes;
using Amazon;
using Amazon.BedrockRuntime;
using Amazon.BedrockRuntime.Model;

// Create a Bedrock Runtime client in the AWS Region you want to use.
var client = new AmazonBedrockRuntimeClient(RegionEndpoint.USEast1);

// Set the model ID, e.g., Command Light.
var modelId = "cohere.command-light-text-v14";

// Define the user message.
var userMessage = "Describe the purpose of a 'hello world' program in one line.";

// Format the request payload using the model's native structure.
var nativeRequest = JsonSerializer.Serialize(new
{
    prompt = userMessage,
    max_tokens = 512,
    temperature = 0.5
});

// Create a request with the model ID and the model's native request payload.
var request = new InvokeModelWithResponseStreamRequest()
{
    ModelId = modelId,
    Body = new MemoryStream(System.Text.Encoding.UTF8.GetBytes(nativeRequest)),
    ContentType = "application/json"
};

try
{
    // Send the request to the Bedrock Runtime and wait for the response.
    var streamingResponse = await client.InvokeModelWithResponseStreamAsync(request);

    // Extract and print the streamed response text in real-time.
    foreach (var item in streamingResponse.Body)
    {
        var chunk = JsonSerializer.Deserialize<JsonObject>((item as PayloadPart).Bytes);
        var text = chunk["generations"]?[0]?["text"] ?? "";
        Console.Write(text);
    }
}
catch (AmazonBedrockRuntimeException e)
{
    Console.WriteLine($"ERROR: Can't invoke '{modelId}'. Reason: {e.Message}");
    throw;
}
-
For API details, see InvokeModelWithResponseStream in the AWS SDK for .NET API Reference.
-
Meta Llama
The following code example shows how to send a text message to Meta Llama, using Bedrock's Converse API.
- SDK for .NET
-
Note
There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.
Send a text message to Meta Llama, using Bedrock's Converse API.
// Use the Converse API to send a text message to Meta Llama.

using System;
using System.Collections.Generic;
using Amazon;
using Amazon.BedrockRuntime;
using Amazon.BedrockRuntime.Model;

// Create a Bedrock Runtime client in the AWS Region you want to use.
var client = new AmazonBedrockRuntimeClient(RegionEndpoint.USEast1);

// Set the model ID, e.g., Llama 3 8b Instruct.
var modelId = "meta.llama3-8b-instruct-v1:0";

// Define the user message.
var userMessage = "Describe the purpose of a 'hello world' program in one line.";

// Create a request with the model ID, the user message, and an inference configuration.
var request = new ConverseRequest
{
    ModelId = modelId,
    Messages = new List<Message>
    {
        new Message
        {
            Role = ConversationRole.User,
            Content = new List<ContentBlock> { new ContentBlock { Text = userMessage } }
        }
    },
    InferenceConfig = new InferenceConfiguration()
    {
        MaxTokens = 512,
        Temperature = 0.5F,
        TopP = 0.9F
    }
};

try
{
    // Send the request to the Bedrock Runtime and wait for the result.
    var response = await client.ConverseAsync(request);

    // Extract and print the response text.
    string responseText = response?.Output?.Message?.Content?[0]?.Text ?? "";
    Console.WriteLine(responseText);
}
catch (AmazonBedrockRuntimeException e)
{
    Console.WriteLine($"ERROR: Can't invoke '{modelId}'. Reason: {e.Message}");
    throw;
}
-
For API details, see Converse in the AWS SDK for .NET API Reference.
-
The following code example shows how to send a text message to Meta Llama, using Bedrock's Converse API, and process the response stream in real time.
- SDK for .NET
-
Note
There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.
Send a text message to Meta Llama, using Bedrock's Converse API, and process the response stream in real time.
// Use the Converse API to send a text message to Meta Llama
// and print the response stream.

using System;
using System.Collections.Generic;
using System.Linq;
using Amazon;
using Amazon.BedrockRuntime;
using Amazon.BedrockRuntime.Model;

// Create a Bedrock Runtime client in the AWS Region you want to use.
var client = new AmazonBedrockRuntimeClient(RegionEndpoint.USEast1);

// Set the model ID, e.g., Llama 3 8b Instruct.
var modelId = "meta.llama3-8b-instruct-v1:0";

// Define the user message.
var userMessage = "Describe the purpose of a 'hello world' program in one line.";

// Create a request with the model ID, the user message, and an inference configuration.
var request = new ConverseStreamRequest
{
    ModelId = modelId,
    Messages = new List<Message>
    {
        new Message
        {
            Role = ConversationRole.User,
            Content = new List<ContentBlock> { new ContentBlock { Text = userMessage } }
        }
    },
    InferenceConfig = new InferenceConfiguration()
    {
        MaxTokens = 512,
        Temperature = 0.5F,
        TopP = 0.9F
    }
};

try
{
    // Send the request to the Bedrock Runtime and wait for the result.
    var response = await client.ConverseStreamAsync(request);

    // Extract and print the streamed response text in real-time.
    foreach (var chunk in response.Stream.AsEnumerable())
    {
        if (chunk is ContentBlockDeltaEvent)
        {
            Console.Write((chunk as ContentBlockDeltaEvent).Delta.Text);
        }
    }
}
catch (AmazonBedrockRuntimeException e)
{
    Console.WriteLine($"ERROR: Can't invoke '{modelId}'. Reason: {e.Message}");
    throw;
}
-
For API details, see ConverseStream in the AWS SDK for .NET API Reference.
-
The following code example shows how to send a text message to Meta Llama 3, using the Invoke Model API.
- SDK for .NET
-
Note
There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.
Use the Invoke Model API to send a text message.
// Use the native inference API to send a text message to Meta Llama 3.

using System;
using System.IO;
using System.Text.Json;
using System.Text.Json.Nodes;
using Amazon;
using Amazon.BedrockRuntime;
using Amazon.BedrockRuntime.Model;

// Create a Bedrock Runtime client in the AWS Region you want to use.
var client = new AmazonBedrockRuntimeClient(RegionEndpoint.USWest2);

// Set the model ID, e.g., Llama 3 70b Instruct.
var modelId = "meta.llama3-70b-instruct-v1:0";

// Define the prompt for the model.
var prompt = "Describe the purpose of a 'hello world' program in one line.";

// Embed the prompt in Llama 3's instruction format.
var formattedPrompt = $@"
<|begin_of_text|><|start_header_id|>user<|end_header_id|>
{prompt}
<|eot_id|>
<|start_header_id|>assistant<|end_header_id|>
";

// Format the request payload using the model's native structure.
var nativeRequest = JsonSerializer.Serialize(new
{
    prompt = formattedPrompt,
    max_gen_len = 512,
    temperature = 0.5
});

// Create a request with the model ID and the model's native request payload.
var request = new InvokeModelRequest()
{
    ModelId = modelId,
    Body = new MemoryStream(System.Text.Encoding.UTF8.GetBytes(nativeRequest)),
    ContentType = "application/json"
};

try
{
    // Send the request to the Bedrock Runtime and wait for the response.
    var response = await client.InvokeModelAsync(request);

    // Decode the response body.
    var modelResponse = await JsonNode.ParseAsync(response.Body);

    // Extract and print the response text.
    var responseText = modelResponse["generation"] ?? "";
    Console.WriteLine(responseText);
}
catch (AmazonBedrockRuntimeException e)
{
    Console.WriteLine($"ERROR: Can't invoke '{modelId}'. Reason: {e.Message}");
    throw;
}
-
For API details, see InvokeModel in the AWS SDK for .NET API Reference.
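The special tokens in Llama 3's instruction template are easy to mistype when repeated across call sites. A small helper like the following keeps them in one place; the helper name is hypothetical, not part of the SDK or the original sample, and it reproduces exactly the template used above.

// Hypothetical helper: wrap a user prompt in Llama 3's instruction format.
static string FormatLlama3Prompt(string prompt) =>
    "<|begin_of_text|><|start_header_id|>user<|end_header_id|>\n" +
    prompt + "\n<|eot_id|>\n<|start_header_id|>assistant<|end_header_id|>\n";

// Usage: var formattedPrompt = FormatLlama3Prompt(prompt);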
-
The following code example shows how to use the Invoke Model API to send a text message to Meta Llama 3 and print the response stream.
- SDK for .NET
-
Note
There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.
Use the Invoke Model API to send a text message and process the response stream in real time.
// Use the native inference API to send a text message to Meta Llama 3
// and print the response stream.

using System;
using System.IO;
using System.Text.Json;
using System.Text.Json.Nodes;
using Amazon;
using Amazon.BedrockRuntime;
using Amazon.BedrockRuntime.Model;

// Create a Bedrock Runtime client in the AWS Region you want to use.
var client = new AmazonBedrockRuntimeClient(RegionEndpoint.USWest2);

// Set the model ID, e.g., Llama 3 70b Instruct.
var modelId = "meta.llama3-70b-instruct-v1:0";

// Define the prompt for the model.
var prompt = "Describe the purpose of a 'hello world' program in one line.";

// Embed the prompt in Llama 3's instruction format.
var formattedPrompt = $@"
<|begin_of_text|><|start_header_id|>user<|end_header_id|>
{prompt}
<|eot_id|>
<|start_header_id|>assistant<|end_header_id|>
";

// Format the request payload using the model's native structure.
var nativeRequest = JsonSerializer.Serialize(new
{
    prompt = formattedPrompt,
    max_gen_len = 512,
    temperature = 0.5
});

// Create a request with the model ID and the model's native request payload.
var request = new InvokeModelWithResponseStreamRequest()
{
    ModelId = modelId,
    Body = new MemoryStream(System.Text.Encoding.UTF8.GetBytes(nativeRequest)),
    ContentType = "application/json"
};

try
{
    // Send the request to the Bedrock Runtime and wait for the response.
    var streamingResponse = await client.InvokeModelWithResponseStreamAsync(request);

    // Extract and print the streamed response text in real-time.
    foreach (var item in streamingResponse.Body)
    {
        var chunk = JsonSerializer.Deserialize<JsonObject>((item as PayloadPart).Bytes);
        var text = chunk["generation"] ?? "";
        Console.Write(text);
    }
}
catch (AmazonBedrockRuntimeException e)
{
    Console.WriteLine($"ERROR: Can't invoke '{modelId}'. Reason: {e.Message}");
    throw;
}
-
For API details, see InvokeModelWithResponseStream in the AWS SDK for .NET API Reference.
-
Mistral AI
The following code example shows how to send a text message to Mistral, using Bedrock's Converse API.
- SDK for .NET
-
Note
There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.
Send a text message to Mistral, using Bedrock's Converse API.
// Use the Converse API to send a text message to Mistral.

using System;
using System.Collections.Generic;
using Amazon;
using Amazon.BedrockRuntime;
using Amazon.BedrockRuntime.Model;

// Create a Bedrock Runtime client in the AWS Region you want to use.
var client = new AmazonBedrockRuntimeClient(RegionEndpoint.USEast1);

// Set the model ID, e.g., Mistral Large.
var modelId = "mistral.mistral-large-2402-v1:0";

// Define the user message.
var userMessage = "Describe the purpose of a 'hello world' program in one line.";

// Create a request with the model ID, the user message, and an inference configuration.
var request = new ConverseRequest
{
    ModelId = modelId,
    Messages = new List<Message>
    {
        new Message
        {
            Role = ConversationRole.User,
            Content = new List<ContentBlock> { new ContentBlock { Text = userMessage } }
        }
    },
    InferenceConfig = new InferenceConfiguration()
    {
        MaxTokens = 512,
        Temperature = 0.5F,
        TopP = 0.9F
    }
};

try
{
    // Send the request to the Bedrock Runtime and wait for the result.
    var response = await client.ConverseAsync(request);

    // Extract and print the response text.
    string responseText = response?.Output?.Message?.Content?[0]?.Text ?? "";
    Console.WriteLine(responseText);
}
catch (AmazonBedrockRuntimeException e)
{
    Console.WriteLine($"ERROR: Can't invoke '{modelId}'. Reason: {e.Message}");
    throw;
}
-
For API details, see Converse in the AWS SDK for .NET API Reference.
-
The following code example shows how to send a text message to Mistral, using Bedrock's Converse API, and process the response stream in real time.
- SDK for .NET
-
Note
There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.
Send a text message to Mistral, using Bedrock's Converse API, and process the response stream in real time.
// Use the Converse API to send a text message to Mistral
// and print the response stream.

using System;
using System.Collections.Generic;
using System.Linq;
using Amazon;
using Amazon.BedrockRuntime;
using Amazon.BedrockRuntime.Model;

// Create a Bedrock Runtime client in the AWS Region you want to use.
var client = new AmazonBedrockRuntimeClient(RegionEndpoint.USEast1);

// Set the model ID, e.g., Mistral Large.
var modelId = "mistral.mistral-large-2402-v1:0";

// Define the user message.
var userMessage = "Describe the purpose of a 'hello world' program in one line.";

// Create a request with the model ID, the user message, and an inference configuration.
var request = new ConverseStreamRequest
{
    ModelId = modelId,
    Messages = new List<Message>
    {
        new Message
        {
            Role = ConversationRole.User,
            Content = new List<ContentBlock> { new ContentBlock { Text = userMessage } }
        }
    },
    InferenceConfig = new InferenceConfiguration()
    {
        MaxTokens = 512,
        Temperature = 0.5F,
        TopP = 0.9F
    }
};

try
{
    // Send the request to the Bedrock Runtime and wait for the result.
    var response = await client.ConverseStreamAsync(request);

    // Extract and print the streamed response text in real-time.
    foreach (var chunk in response.Stream.AsEnumerable())
    {
        if (chunk is ContentBlockDeltaEvent)
        {
            Console.Write((chunk as ContentBlockDeltaEvent).Delta.Text);
        }
    }
}
catch (AmazonBedrockRuntimeException e)
{
    Console.WriteLine($"ERROR: Can't invoke '{modelId}'. Reason: {e.Message}");
    throw;
}
-
For API details, see ConverseStream in the AWS SDK for .NET API Reference.
-
The following code example shows how to send a text message to Mistral models, using the Invoke Model API.
- SDK for .NET
-
Note
There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.
Use the Invoke Model API to send a text message.
// Use the native inference API to send a text message to Mistral.

using System;
using System.IO;
using System.Text.Json;
using System.Text.Json.Nodes;
using Amazon;
using Amazon.BedrockRuntime;
using Amazon.BedrockRuntime.Model;

// Create a Bedrock Runtime client in the AWS Region you want to use.
var client = new AmazonBedrockRuntimeClient(RegionEndpoint.USEast1);

// Set the model ID, e.g., Mistral Large.
var modelId = "mistral.mistral-large-2402-v1:0";

// Define the prompt for the model.
var prompt = "Describe the purpose of a 'hello world' program in one line.";

// Embed the prompt in Mistral's instruction format.
var formattedPrompt = $"<s>[INST] {prompt} [/INST]";

// Format the request payload using the model's native structure.
var nativeRequest = JsonSerializer.Serialize(new
{
    prompt = formattedPrompt,
    max_tokens = 512,
    temperature = 0.5
});

// Create a request with the model ID and the model's native request payload.
var request = new InvokeModelRequest()
{
    ModelId = modelId,
    Body = new MemoryStream(System.Text.Encoding.UTF8.GetBytes(nativeRequest)),
    ContentType = "application/json"
};

try
{
    // Send the request to the Bedrock Runtime and wait for the response.
    var response = await client.InvokeModelAsync(request);

    // Decode the response body.
    var modelResponse = await JsonNode.ParseAsync(response.Body);

    // Extract and print the response text.
    var responseText = modelResponse["outputs"]?[0]?["text"] ?? "";
    Console.WriteLine(responseText);
}
catch (AmazonBedrockRuntimeException e)
{
    Console.WriteLine($"ERROR: Can't invoke '{modelId}'. Reason: {e.Message}");
    throw;
}
-
For API details, see InvokeModel in the AWS SDK for .NET API Reference.
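Mistral's native response nests the text under an "outputs" array, of which the example reads only the first element. If the array ever holds more than one entry, looping over it is the defensive way to read it. This is a minimal sketch built on the modelResponse node and the usings from the example above, not part of the original sample.

// Hypothetical variant: print every entry in the "outputs" array,
// not just the first one.
if (modelResponse?["outputs"] is JsonArray outputs)
{
    foreach (var output in outputs)
    {
        Console.WriteLine(output?["text"] ?? "");
    }
}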
-
The following code example shows how to use the Invoke Model API to send a text message to Mistral AI models and print the response stream.
- SDK for .NET
-
Note
There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.
Use the Invoke Model API to send a text message and process the response stream in real time.
// Use the native inference API to send a text message to Mistral
// and print the response stream.

using System;
using System.IO;
using System.Text.Json;
using System.Text.Json.Nodes;
using Amazon;
using Amazon.BedrockRuntime;
using Amazon.BedrockRuntime.Model;

// Create a Bedrock Runtime client in the AWS Region you want to use.
var client = new AmazonBedrockRuntimeClient(RegionEndpoint.USEast1);

// Set the model ID, e.g., Mistral Large.
var modelId = "mistral.mistral-large-2402-v1:0";

// Define the prompt for the model.
var prompt = "Describe the purpose of a 'hello world' program in one line.";

// Embed the prompt in Mistral's instruction format.
var formattedPrompt = $"<s>[INST] {prompt} [/INST]";

// Format the request payload using the model's native structure.
var nativeRequest = JsonSerializer.Serialize(new
{
    prompt = formattedPrompt,
    max_tokens = 512,
    temperature = 0.5
});

// Create a request with the model ID and the model's native request payload.
var request = new InvokeModelWithResponseStreamRequest()
{
    ModelId = modelId,
    Body = new MemoryStream(System.Text.Encoding.UTF8.GetBytes(nativeRequest)),
    ContentType = "application/json"
};

try
{
    // Send the request to the Bedrock Runtime and wait for the response.
    var streamingResponse = await client.InvokeModelWithResponseStreamAsync(request);

    // Extract and print the streamed response text in real-time.
    foreach (var item in streamingResponse.Body)
    {
        var chunk = JsonSerializer.Deserialize<JsonObject>((item as PayloadPart).Bytes);
        var text = chunk["outputs"]?[0]?["text"] ?? "";
        Console.Write(text);
    }
}
catch (AmazonBedrockRuntimeException e)
{
    Console.WriteLine($"ERROR: Can't invoke '{modelId}'. Reason: {e.Message}");
    throw;
}
-
For API details, see InvokeModelWithResponseStream in the AWS SDK for .NET API Reference.
-