Accessing Amazon Rekognition Custom Labels evaluation metrics (SDK)
The DescribeProjectVersions operation provides access to metrics beyond those provided in the console.
Like the console, DescribeProjectVersions provides access to the F1 score, precision, and recall metrics, both as summary information for the testing results and as testing results for each label. The average threshold for all labels and the threshold for each individual label are also returned.
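For example, the following is a minimal sketch, assuming boto3 and placeholder project and version identifiers, of calling DescribeProjectVersions and reading the evaluation results for a trained model version:

```python
import boto3

# Placeholder project ARN and version name -- replace with your own values.
PROJECT_ARN = "arn:aws:rekognition:us-east-1:111122223333:project/my-project/1234567890123"
VERSION_NAME = "my-project.2024-01-01T00.00.00"

client = boto3.client("rekognition")

# DescribeProjectVersions returns one description per matching model version.
response = client.describe_project_versions(
    ProjectArn=PROJECT_ARN,
    VersionNames=[VERSION_NAME],
)

for description in response["ProjectVersionDescriptions"]:
    evaluation = description.get("EvaluationResult", {})
    # The aggregate F1 score is returned inline; the remaining metrics are in
    # the summary file that EvaluationResult points to in Amazon S3.
    print("F1 score:", evaluation.get("F1Score"))
    print("Summary file location:", evaluation.get("Summary", {}).get("S3Object"))
```

Note that EvaluationResult is typically only present once training of the model version has completed successfully.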
DescribeProjectVersions also provides access to the following metrics for classification and image detection (object location on an image):
Confusion Matrix for image classification. For more information, see Viewing the confusion matrix for a model.
Mean Average Precision (mAP) for image detection.
Mean Average Recall (mAR) for image detection.
DescribeProjectVersions also provides access to true positive, false positive, false negative, and true negative values. For more information, see Metrics for evaluating your model.
The aggregate F1 score metric is returned directly by DescribeProjectVersions. Other metrics are accessible from a model summary file and from evaluation manifest snapshot files stored in an Amazon S3 bucket. For more information, see Accessing the summary file and evaluation manifest snapshot (SDK).
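As an illustration, here is a minimal sketch of downloading and reading the summary file from the Amazon S3 location returned in EvaluationResult.Summary. It again assumes boto3 and placeholder project and version identifiers, and the AggregatedEvaluationResults and LabelEvaluationResults key names are assumptions to verify against your own summary file:

```python
import json

import boto3

# Placeholder project ARN and version name -- replace with your own values.
PROJECT_ARN = "arn:aws:rekognition:us-east-1:111122223333:project/my-project/1234567890123"
VERSION_NAME = "my-project.2024-01-01T00.00.00"

rekognition = boto3.client("rekognition")
s3 = boto3.client("s3")

response = rekognition.describe_project_versions(
    ProjectArn=PROJECT_ARN,
    VersionNames=[VERSION_NAME],
)
description = response["ProjectVersionDescriptions"][0]

# S3 location of the summary file, returned in EvaluationResult.Summary.
summary_location = description["EvaluationResult"]["Summary"]["S3Object"]
obj = s3.get_object(Bucket=summary_location["Bucket"], Key=summary_location["Name"])
summary = json.loads(obj["Body"].read())

# The key names below are assumptions about the summary file layout; check
# them against your own file. Aggregated results hold metrics such as the
# confusion matrix (classification) or mAP/mAR (detection); label results
# hold per-label precision, recall, F1 score, and threshold.
print(json.dumps(summary.get("AggregatedEvaluationResults", {}), indent=2))
print(json.dumps(summary.get("LabelEvaluationResults", []), indent=2))
```

The evaluation manifest snapshot files can be downloaded in the same way from their own Amazon S3 locations.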