

# APIs supported by Amazon Bedrock
<a name="apis"></a>

**Inference APIs supported**

Amazon Bedrock provides five main API patterns for performing [inference](inference.md).


| **API method** | **Service endpoint** | **Use-case best suited for** | **Key feature** | 
| --- | --- | --- | --- | 
| [Responses API](bedrock-mantle.md) (recommended) | bedrock-mantle.<suffix> | Stateful conversations | Use the Responses API for modern, agentic applications requiring built-in tool use (search, code interpreter), multimodal inputs, and stateful conversations | 
| [Chat completions](bedrock-mantle.md) | bedrock-mantle.<suffix> (recommended) and bedrock-runtime.<suffix> | Stateless multi-turn chat | Use the [Chat Completions API](inference-chat-completions.md) for lightweight, stateless, text-focused tasks where you need full control over chat history management and lower latency. | 
| Messages API | bedrock-mantle.<suffix> | Anthropic-native interface | Use the Messages API for direct access to Anthropic models using the Anthropic-native request and response format via the bedrock-mantle endpoint. | 
| [Converse method](conversation-inference.md) | bedrock-runtime.<suffix> | Multi-turn chat/standardizing | The [Converse API](conversation-inference.md) provides a unified interface for interacting with all models in Amazon Bedrock. | 
| [Invoke method](inference-invoke.md) | bedrock-runtime.<suffix> | Single transactions / Large payloads | The Invoke API provides direct access to models with full control over the request and response format. | 

Note: `<suffix>` is `{region}.amazonaws.com` (for example, `bedrock-runtime.us-east-1.amazonaws.com`).

Read more about the [APIs supported by Amazon Bedrock](inference-api.md).
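As the table notes, the Converse API uses one request shape across all supported models, so switching models only means changing the model ID. A minimal sketch of that shape in Python (the model ID, Region, and inference settings in the usage comment are illustrative assumptions; sending the request requires `boto3` and AWS credentials):

```python
def build_converse_request(model_id: str, prompt: str) -> dict:
    """Build keyword arguments for the bedrock-runtime Converse operation.

    The same message structure works for any Bedrock model that supports
    the Converse API, so only model_id changes between models.
    """
    return {
        "modelId": model_id,
        "messages": [
            {"role": "user", "content": [{"text": prompt}]},
        ],
        "inferenceConfig": {"maxTokens": 512, "temperature": 0.5},
    }

# With AWS credentials configured (Region and model ID are illustrative):
# import boto3
# client = boto3.client("bedrock-runtime", region_name="us-east-1")
# response = client.converse(**build_converse_request(
#     "anthropic.claude-3-haiku-20240307-v1:0",
#     "Explain Amazon Bedrock in one sentence."))
# print(response["output"]["message"]["content"][0]["text"])
```

Note that `content` is a list of blocks rather than a plain string; this is what lets the same interface carry text, images, and tool results.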

**Deciding between APIs**

Which API you should use depends on your use case.


| **Use Case** | **Recommended API** | 
| --- | --- | 
| Migrating from OpenAI API-compatible endpoint | Use OpenAI-compatible APIs: [Responses API](https://platform.openai.com/docs/api-reference/responses) or [Chat Completions API](inference-chat-completions.md). OpenAI's [recommended](https://platform.openai.com/docs/guides/migrate-to-responses) long-term API is the Responses API. | 
| Using models not compatible with OpenAI-compatible endpoint | Use native Amazon Bedrock APIs: [Converse](conversation-inference.md) and [Invoke](inference-invoke.md). For more information, see [Submit prompts and generate responses using the API](inference-api.md). | 
| Consistent interface across all models | [Converse API](https://docs.aws.amazon.com/bedrock/latest/APIReference/API_runtime_Converse.html) - Works with all models that support messages. Write code once and use it with different models. For example code, see [Converse API examples](https://docs.aws.amazon.com/bedrock/latest/userguide/conversation-inference.html#message-inference-examples). | 
| Direct model access with full control | [Invoke API](https://docs.aws.amazon.com/bedrock/latest/APIReference/API_runtime_InvokeModel.html) - Provides direct access to models with more control over request and response format. Use for generating text, images, and embeddings. For example code, see [Invoke model code examples](https://docs.aws.amazon.com/bedrock/latest/userguide/inference-invoke.html#inference-example-invoke). | 
| New to Amazon Bedrock | Start with the [Responses API](bedrock-mantle.md) | 
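For the OpenAI-compatible migration path, request bodies follow the familiar Chat Completions shape, so existing client code mostly needs a new endpoint. A sketch of the payload (the endpoint URL, model name, and auth in the usage comment are assumptions for illustration, not confirmed values; check the endpoint documentation for your Region):

```python
def build_chat_completions_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-compatible Chat Completions payload.

    Chat Completions is stateless: the caller appends each turn to
    the messages list to maintain conversation history.
    """
    return {
        "model": model,
        "messages": [
            {"role": "user", "content": prompt},
        ],
    }

# Hypothetical usage with the OpenAI SDK pointed at the Bedrock endpoint
# (base_url and model ID are illustrative assumptions):
# from openai import OpenAI
# client = OpenAI(base_url="https://bedrock-mantle.us-east-1.amazonaws.com/v1",
#                 api_key="<credentials>")
# completion = client.chat.completions.create(
#     **build_chat_completions_request("<model-id>", "Hello"))
# print(completion.choices[0].message.content)
```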

**Models supported by each API and endpoint**

First, browse our [models](models.md) to choose the model you want to use. Each model's entry lists the APIs it supports, which in turn determines the endpoint you call. The `bedrock-mantle` endpoint supports the Responses, Chat Completions, and Messages APIs. The `bedrock-runtime` endpoint supports the Converse and Invoke APIs.
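Unlike Converse, the Invoke API passes a provider-native JSON body, so the payload format varies by model family. A sketch using the Anthropic Messages body format as one example (the version string and model ID are illustrative assumptions; other providers expect different JSON fields):

```python
import json


def build_invoke_body(prompt: str, max_tokens: int = 512) -> str:
    """Serialize an Anthropic Messages-format body for InvokeModel.

    This body is provider-specific: swapping to a different model
    family means rewriting the JSON structure, not just the model ID.
    """
    return json.dumps({
        "anthropic_version": "bedrock-2023-05-31",  # illustrative version string
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    })

# With AWS credentials configured (Region and model ID are illustrative):
# import boto3
# client = boto3.client("bedrock-runtime", region_name="us-east-1")
# response = client.invoke_model(
#     modelId="anthropic.claude-3-haiku-20240307-v1:0",
#     body=build_invoke_body("Hello"))
# print(json.loads(response["body"].read())["content"][0]["text"])
```

This per-provider control is what makes Invoke the better fit for the single-transaction and large-payload cases in the table above.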