

# Endpoints supported by Amazon Bedrock
<a name="endpoints"></a>

Amazon Bedrock supports various endpoints depending on whether you want to perform control plane operations or [inference](inference.md) operations.

**Control Plane operations**

Amazon Bedrock control plane operations use the endpoint `bedrock.{region}.amazonaws.com`. Use this endpoint to manage resources, such as listing available [models](models.md), creating [custom model](custom-models.md) jobs, and managing [provisioned throughput](prov-throughput.md). Read more about the control plane API [here](https://docs.aws.amazon.com/general/latest/gr/bedrock.html).
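As a minimal sketch, the control plane endpoint can be exercised with the AWS SDK for Python (boto3), which resolves the endpoint automatically for the `bedrock` service. This assumes boto3 is installed and AWS credentials are configured; the Region shown is only an example.

```python
def control_plane_endpoint(region: str) -> str:
    """Build the control plane endpoint URL for a given Region."""
    return f"https://bedrock.{region}.amazonaws.com"


def list_model_ids(region: str) -> list[str]:
    """List available foundation model IDs via the control plane API.

    Requires boto3 and configured AWS credentials; the request goes to the
    endpoint built above.
    """
    import boto3  # imported here so the URL helper stays dependency-free

    bedrock = boto3.client("bedrock", region_name=region)
    response = bedrock.list_foundation_models()
    return [summary["modelId"] for summary in response["modelSummaries"]]


print(control_plane_endpoint("us-east-1"))
# https://bedrock.us-east-1.amazonaws.com
```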

**Inference operations**

Amazon Bedrock supports the following two primary endpoints for performing inference programmatically:


| **Endpoint** | **Supported APIs** | **Description** | 
| --- | --- | --- | 
| bedrock-mantle.{region}.api.aws | [Responses API](bedrock-mantle.md) / [Chat Completions API](bedrock-mantle.md) | Region-specific endpoints for making inference requests to models hosted in Amazon Bedrock using the OpenAI-compatible endpoints. | 
| bedrock-runtime.{region}.amazonaws.com | [InvokeModel](inference-invoke.md) / [Converse](conversation-inference.md) / [Chat Completions](inference-chat-completions.md) | Region-specific endpoints for making inference requests to models hosted in Amazon Bedrock using the InvokeModel, Converse, or Chat Completions APIs. Read more about the Amazon Bedrock Runtime APIs [here](https://docs.aws.amazon.com/bedrock/latest/APIReference/API_Operations_Amazon_Bedrock_Runtime.html). | 
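As an illustrative sketch, a single-turn request through the bedrock-runtime endpoint with the Converse API might look like the following (boto3 and configured AWS credentials assumed; the model ID is a placeholder you would replace with a model available in your Region):

```python
def runtime_endpoint(region: str) -> str:
    """Build the bedrock-runtime endpoint URL for a given Region."""
    return f"https://bedrock-runtime.{region}.amazonaws.com"


def converse_once(region: str, model_id: str, user_text: str) -> str:
    """Send one user message via the Converse API and return the reply text.

    Requires boto3 and configured AWS credentials; `model_id` is a
    placeholder for any model supported in your Region.
    """
    import boto3  # imported here so the URL helper stays dependency-free

    client = boto3.client("bedrock-runtime", region_name=region)
    response = client.converse(
        modelId=model_id,
        messages=[{"role": "user", "content": [{"text": user_text}]}],
    )
    return response["output"]["message"]["content"][0]["text"]


print(runtime_endpoint("us-east-1"))
# https://bedrock-runtime.us-east-1.amazonaws.com
```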

**Which service endpoint should you use for your inference?**

The endpoint you use depends on your use case.


| **Endpoint** | **Features and Use Cases** | 
| --- | --- | 
| bedrock-mantle |  Supports OpenAI-compatible APIs ([Responses API](bedrock-mantle.md), [Chat Completions API](bedrock-mantle.md)). **Use for:** Easily migrating from other models that use the OpenAI-compatible APIs. **Features:** Supports both client-side and server-side tool use with Lambda functions and comes pre-configured with ready-to-use tools for your applications. **Recommended for:** Users who are new to Amazon Bedrock.  | 
| bedrock-runtime |  Supports Amazon Bedrock native APIs ([Invoke API](inference-invoke.md), [Converse API](conversation-inference.md)). Also supports the OpenAI-compatible [Chat Completions API](inference-chat-completions.md). **Use for:** Running any model supported by Amazon Bedrock. The [Converse API](conversation-inference.md) provides a unified interface for interacting with all models in Amazon Bedrock, and the [Invoke API](inference-invoke.md) provides direct access to models with more control over the request and response format. You can also use the [Chat Completions API](inference-chat-completions.md) to interact with your models. **Features:** Supports only client-side tool use with Lambda functions and does not come pre-configured with ready-to-use tools. Allows you to track usage and costs when invoking a model.  | 
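For the OpenAI-compatible route on the bedrock-runtime endpoint, a sketch using the OpenAI Python SDK might look like this. The `/openai/v1` base path and the use of an Amazon Bedrock API key are assumptions to verify against the Chat Completions documentation, and the model ID is a placeholder:

```python
def openai_base_url(region: str) -> str:
    """Assumed OpenAI-compatible base URL on the bedrock-runtime endpoint."""
    return f"https://bedrock-runtime.{region}.amazonaws.com/openai/v1"


def chat_once(region: str, api_key: str, model_id: str, user_text: str) -> str:
    """Send one Chat Completions request with the OpenAI Python SDK.

    Assumes the `openai` package is installed and `api_key` is an Amazon
    Bedrock API key; `model_id` is a placeholder.
    """
    from openai import OpenAI  # imported here so the URL helper stays dependency-free

    client = OpenAI(base_url=openai_base_url(region), api_key=api_key)
    completion = client.chat.completions.create(
        model=model_id,
        messages=[{"role": "user", "content": user_text}],
    )
    return completion.choices[0].message.content


print(openai_base_url("us-west-2"))
# https://bedrock-runtime.us-west-2.amazonaws.com/openai/v1
```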

If you are new to Amazon Bedrock, we recommend you start with the `bedrock-mantle` endpoint.