

# Getting started
<a name="AgentCore-GettingStarted"></a>

Amazon Bedrock AgentCore provides built-in metrics, logs, and traces to monitor the performance of your AgentCore modular services. You can view this data in Amazon CloudWatch. To access the full range of observability data across all AgentCore modular services, instrument your code with the AWS Distro for OpenTelemetry (ADOT) SDK.

## Add observability to your agentic resources
<a name="add-observability-agentic-resources"></a>

Before you begin, enable CloudWatch Transaction Search. For more information, see [Enable Transaction Search](https://docs.aws.amazon.com/AmazonCloudWatch/latest/monitoring/Enable-TransactionSearch.html).

### Enable observability for AgentCore Runtime hosted agents
<a name="enable-observability-agentcore-runtime"></a>

You can host your agents on AgentCore Runtime, a secure, serverless runtime purpose-built for deploying and scaling dynamic AI agents and tools. AgentCore Runtime supports any open-source framework, including LangGraph, CrewAI, and Strands Agents, as well as any protocol and any model.

To enable observability for AgentCore Runtime hosted agents, see [Configure custom observability](https://docs.aws.amazon.com/bedrock-agentcore/latest/devguide/observability-configure.html#observability-configure-custom).

For a step-by-step tutorial, see [Enabling observability for AgentCore Runtime hosted agents](https://aws.github.io/bedrock-agentcore-starter-toolkit/user-guide/observability/quickstart.html#enabling-observability-for-agentcore-runtime-hosted-agents).
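As a rough illustration of the custom-observability setup the linked pages describe, the following is a minimal container sketch for a Python agent that uses ADOT auto-instrumentation. The file names and the `agent.py` entry module are hypothetical; only the `aws-opentelemetry-distro` package and its `opentelemetry-instrument` wrapper are real, and your actual configuration should follow the linked documentation.

```dockerfile
# Sketch only: a Python agent image with ADOT auto-instrumentation.
# File names and the entry module are placeholders.
FROM python:3.12-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt aws-opentelemetry-distro
COPY . .
# opentelemetry-instrument wraps the process and auto-instruments supported libraries
CMD ["opentelemetry-instrument", "python", "agent.py"]
```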

### Enable observability for non-AgentCore hosted agents
<a name="enable-observability-non-agentcore"></a>

You can host your agents outside of AgentCore and bring your observability data into CloudWatch for end-to-end monitoring in one location.

To enable observability for non-AgentCore Runtime hosted agents, see [Configure third-party observability](https://docs.aws.amazon.com/bedrock-agentcore/latest/devguide/observability-configure.html#observability-configure-3p).

For a step-by-step tutorial, see [Enabling observability for non-AgentCore hosted agents](https://aws.github.io/bedrock-agentcore-starter-toolkit/user-guide/observability/quickstart.html#enabling-observability-for-non-agentcore-hosted-agents).
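For an agent hosted outside AgentCore, telemetry export is typically configured through standard OpenTelemetry environment variables. The sketch below shows the general shape of that configuration; the variable names come from the OpenTelemetry specification, but the service name and endpoint values are placeholders — the exact values for shipping data into CloudWatch are in the linked configuration page.

```python
import os

# Sketch: standard OpenTelemetry environment variables an agent process
# reads so its OTLP exporter knows where to send telemetry.
# The service name and endpoint below are hypothetical placeholders.
otel_env = {
    "OTEL_RESOURCE_ATTRIBUTES": "service.name=my-agent",      # identifies your agent
    "OTEL_EXPORTER_OTLP_PROTOCOL": "http/protobuf",
    "OTEL_EXPORTER_OTLP_ENDPOINT": "https://localhost:4318",  # replace with your collector
}
os.environ.update(otel_env)

print(os.environ["OTEL_RESOURCE_ATTRIBUTES"])
```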

### Enable observability for AgentCore memory, gateway, and built-in tool resources
<a name="enable-observability-agentcore-resources"></a>

You can gain visibility into the metrics and traces of AgentCore modular services. For more information, see [Configure CloudWatch observability](https://docs.aws.amazon.com/bedrock-agentcore/latest/devguide/observability-configure.html#observability-configure-cloudwatch).

### Enable AgentCore Evaluations
<a name="enable-observability-agentcore-evaluations"></a>

You can gain visibility into AgentCore Evaluations, which provides capabilities to monitor and assess the performance, quality, and reliability of your AI agents. To enable observability for AgentCore Evaluations, see [AgentCore evaluations](https://docs.aws.amazon.com/bedrock-agentcore/latest/devguide/evaluations.html).

# View observability data in CloudWatch
<a name="view-observability-data-cloudwatch"></a>

After you enable observability for your agentic resources, you can view the collected data in CloudWatch.

## View the GenAI Observability dashboard
<a name="view-genai-observability-dashboard"></a>

1. Open the CloudWatch console.

1. On the GenAI Observability dashboard, view data related to model invocations and agents on Amazon Bedrock AgentCore.

1. In the Amazon Bedrock AgentCore sub-menu, you can choose the following views:
   + **Agents View** – Lists all your agents, both on and off runtime. Choose an agent to view runtime metrics, sessions, traces, and evaluations specific to that agent.
   + **Sessions View** – Navigate across all sessions associated with agents.
   + **Traces View** – View traces and span information for agents. Choose a trace to explore the trace trajectory and timeline.

## View logs
<a name="view-logs"></a>

1. Open the CloudWatch console.

1. In the navigation pane, expand **Logs** and choose **Log groups**.

1. Search for your agent's log group:
   + Standard logs (stdout/stderr) – `/aws/bedrock-agentcore/runtimes/<agent_id>-<endpoint_name>/[runtime-logs] <UUID>`
   + OTEL structured logs – `/aws/bedrock-agentcore/runtimes/<agent_id>-<endpoint_name>/runtime-logs`
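If you look up log groups programmatically rather than searching in the console, the OTEL structured log group name can be assembled from the agent's identifiers, as sketched below. The agent ID and endpoint name are hypothetical placeholders; note that the standard stdout/stderr log group additionally carries a `[runtime-logs] <UUID>` suffix, where the UUID is assigned at deployment and can't be derived locally.

```python
# Sketch: build the OTEL structured log group name shown above from an
# agent's identifiers. The id and endpoint values are placeholders.
agent_id = "myAgent-abc123"
endpoint_name = "DEFAULT"

otel_log_group = (
    f"/aws/bedrock-agentcore/runtimes/{agent_id}-{endpoint_name}/runtime-logs"
)
print(otel_log_group)
```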

## View traces and spans
<a name="view-traces-spans"></a>

1. Open the CloudWatch console.

1. In the navigation pane, choose **Transaction Search**.

1. Navigate to `/aws/spans/default`.

1. Filter by service name or other criteria.

1. Choose a trace to view the detailed execution graph.
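The same span data can be searched outside the console. As a sketch, the dictionary below has the shape of a request you could pass to boto3's `logs.start_query` (the call itself isn't made here). The log group name and query string are assumptions for illustration; check the Transaction Search documentation for the exact span log group and field schema in your account.

```python
import time

# Sketch: request parameters for a CloudWatch Logs Insights query over
# span data (as used by boto3's logs.start_query). The log group name
# and query string are illustrative assumptions, not verified values.
now = int(time.time())
params = {
    "logGroupName": "aws/spans",                              # assumed span log group
    "queryString": "fields @timestamp, @message | limit 20",  # simple starter query
    "startTime": now - 3600,                                  # last hour
    "endTime": now,
}
print(sorted(params))
```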

## View metrics
<a name="view-metrics"></a>

1. Open the CloudWatch console.

1. In the navigation pane, choose **Metrics**.

1. Navigate to the **bedrock-agentcore** namespace.

1. Explore the available metrics.
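You can also explore the same namespace programmatically, for example with `aws cloudwatch list-metrics --namespace bedrock-agentcore` or boto3's `cloudwatch.list_metrics`. The sketch below only builds the request shape; the namespace comes from this page, and no call is made.

```python
# Sketch: the request shape for listing AgentCore metrics with boto3's
# cloudwatch.list_metrics (call not executed here). Only the namespace
# "bedrock-agentcore" comes from this page; the rest is generic boto3 usage.
list_metrics_request = {"Namespace": "bedrock-agentcore"}
print(list_metrics_request["Namespace"])
```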

# Protect sensitive data
<a name="mask-sensitive-data"></a>

Amazon CloudWatch Logs uses data protection policies to identify sensitive data and define actions to protect that data. You use data identifiers to select the sensitive data of interest. Amazon CloudWatch Logs then detects the sensitive data using machine learning and pattern matching. You can define audit and masking operations to log sensitive data findings and mask sensitive data when viewing log events.

For more information, see [Protecting sensitive log data with masking](https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/cloudwatch-logs-data-protection-policies.html).

You can configure data protection for Amazon Bedrock AgentCore at the **account level** or at the **log group level**. With account-level data protection, rules apply to all logs in your account. With log-group-level data protection, rules apply only to the log groups you specify. This gives you granular control over how PII data is masked in your account.

**To configure data protection at the account level**

1. Open the Amazon CloudWatch console.

1. In the navigation pane, choose **Settings**.

1. Choose the **Logs** tab.

1. Choose **Configure the Data protection account policy**.

1. Specify the data identifiers that are relevant to your data.
   + To use a predefined data identifier, in the **Managed data identifiers** drop-down, select the data identifiers that are relevant to your data.
   + To use a custom data identifier, choose **Add custom data identifier**, and then specify a name for the identifier and a Regex pattern for the data to protect.

1. (*Optional*) Choose a destination for the audit findings.
   + To send audit findings to a CloudWatch log, choose **Amazon CloudWatch Logs** and then select the destination log group.
   + To send audit findings to a Firehose stream, choose **Amazon Data Firehose** and then select the destination Firehose stream.
   + To send audit findings to an Amazon S3 bucket, choose **Amazon S3** and then select the destination Amazon S3 bucket.

1. Choose **Activate data protection**.
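To make the custom-data-identifier step concrete, the sketch below applies the kind of regex pattern you might register, demonstrated locally with Python's `re` module. CloudWatch evaluates the pattern server-side and controls the actual masking behavior; the `EMP-\d{6}` employee-ID format here is entirely hypothetical.

```python
import re

# Sketch: a regex of the kind you might register as a custom data
# identifier, applied locally for illustration. CloudWatch performs the
# real matching and masking server-side. The format is hypothetical.
pattern = re.compile(r"EMP-\d{6}")  # hypothetical employee-ID format

log_line = "User EMP-123456 updated the record"
masked = pattern.sub("*" * 10, log_line)  # mask the 10-character match
print(masked)
```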

**To configure data protection at the log group level**

1. Open the Amazon CloudWatch console.

1. In the navigation pane, choose **Logs**, **Log Management**.

1. Choose the **Log groups** tab, select the log group you want to enable data protection on, and then choose **Create data protection policy**.

1. Specify the data identifiers that are relevant to your data.
   + To use a predefined data identifier, in the **Managed data identifiers** drop-down, select the data identifiers that are relevant to your data.
   + To use a custom data identifier, choose **Add custom data identifier**, and then specify a name for the identifier and a Regex pattern for the data to protect.

1. (*Optional*) Choose a destination for the audit findings.
   + To send audit findings to a CloudWatch log, choose **Amazon CloudWatch Logs** and then select the destination log group.
   + To send audit findings to a Firehose stream, choose **Amazon Data Firehose** and then select the destination Firehose stream.
   + To send audit findings to an Amazon S3 bucket, choose **Amazon S3** and then select the destination Amazon S3 bucket.

1. Choose **Activate data protection**.