Log groups and streams that Amazon SageMaker AI sends to Amazon CloudWatch Logs
To help you debug your compilation jobs, processing jobs, training jobs, endpoints, transform jobs, notebook instances, and notebook instance lifecycle configurations, anything that an algorithm container, a model container, or a notebook instance lifecycle configuration sends to stdout or stderr is also sent to Amazon CloudWatch Logs. In addition to debugging, you can use these logs for progress analysis.
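For example, once a job is running, you can read its container output programmatically through the CloudWatch Logs API. The following is a minimal sketch using boto3 that prints the logs of a training job; the job name my-training-job is a hypothetical placeholder, and the log group and stream naming follow the table below.

```python
import boto3

logs = boto3.client("logs")

# Training job logs land in the /aws/sagemaker/TrainingJobs log group, in
# streams named [training-job-name]/algo-[instance-number]-[epoch_timestamp],
# so the job name works as a stream-name prefix.
log_group = "/aws/sagemaker/TrainingJobs"
job_name = "my-training-job"  # hypothetical job name, for illustration

# Find every stream for this job (one per instance in the training cluster).
response = logs.describe_log_streams(
    logGroupName=log_group,
    logStreamNamePrefix=job_name,
)

for stream in response["logStreams"]:
    # Read each stream from its beginning.
    events = logs.get_log_events(
        logGroupName=log_group,
        logStreamName=stream["logStreamName"],
        startFromHead=True,
    )
    for event in events["events"]:
        print(event["message"])
```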
By default, log data is stored in CloudWatch Logs indefinitely. However, you can configure how long to store log data in a log group. For information, see Change Log Data Retention in CloudWatch Logs in the Amazon CloudWatch Logs User Guide.
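A minimal sketch of setting such a retention policy with boto3, assuming a 30-day retention period on the training job log group (both the group and the period are illustrative choices):

```python
import boto3

logs = boto3.client("logs")

# Retain training job logs for 30 days instead of the default indefinite
# retention. 30 is an illustrative value; CloudWatch Logs accepts only a
# fixed set of retention periods.
logs.put_retention_policy(
    logGroupName="/aws/sagemaker/TrainingJobs",
    retentionInDays=30,
)
```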
Logs
The following table lists all of the logs provided by Amazon SageMaker AI.
| Log Group Name | Log Stream Name |
|---|---|
| /aws/sagemaker/CompilationJobs | [compilation-job-name] |
| /aws/sagemaker/Endpoints/[EndpointName] | [production-variant-name]/[instance-id] |
| | (For Asynchronous Inference endpoints) [production-variant-name]/[instance-id]/data-log |
| | (For Inference Pipelines) [production-variant-name]/[instance-id]/[container-name provided in the SageMaker AI model] ² |
| /aws/sagemaker/groundtruth/WorkerActivity | aws/sagemaker/groundtruth/worker-activity/[requester-AWS-Id]-[region]/[timestamp] |
| /aws/sagemaker/InferenceRecommendationsJobs | [inference-recommendations-job-name]/execution |
| | [inference-recommendations-job-name]/CompilationJob/[compilation-job-name] |
| | [inference-recommendations-job-name]/Endpoint/[endpoint-name] |
| /aws/sagemaker/LabelingJobs | [labeling-job-name] |
| /aws/sagemaker/NotebookInstances | [notebook-instance-name]/[LifecycleConfigHook] ¹ |
| | [notebook-instance-name]/jupyter.log |
| /aws/sagemaker/ProcessingJobs | [processing-job-name]/[hostname]-[epoch_timestamp] |
| /aws/sagemaker/studio | [domain-id]/[user-profile-name]/[app-type]/[app-name] |
| | [domain-id]/domain-shared/rtc/[app-name] |
| /aws/sagemaker/TrainingJobs | [training-job-name]/algo-[instance-number-in-cluster]-[epoch_timestamp] |
| /aws/sagemaker/TransformJobs | [transform-job-name]/[instance-id]-[epoch_timestamp] |
| | [transform-job-name]/[instance-id]-[epoch_timestamp]/data-log |
Note
1. The [notebook-instance-name]/[LifecycleConfigHook] log stream in the /aws/sagemaker/NotebookInstances log group is created when you create a notebook instance with a lifecycle configuration. For more information, see Customization of a SageMaker notebook instance using an LCC script.
2. For Inference Pipelines, if you don't provide container names, the platform uses **container-1, container-2**, and so on, corresponding to the order in which the containers are provided in the SageMaker AI model.
For more information about logging events with CloudWatch Logs, see What is Amazon CloudWatch Logs? in the Amazon CloudWatch Logs User Guide.