Monitor batch inference jobs
Apart from viewing the configurations you set for a batch inference job, you can also monitor its progress: its status, the number of records that have been processed, and the number of records that failed to process. For more information about the possible statuses for a job, see the status field in ModelInvocationJobSummary.
To learn how to view details about batch inference jobs, select the tab corresponding to your method of choice and follow the steps.
- Console
- API
To get information about a batch inference job, send a GetModelInvocationJob request (see link for request and response formats and field details) with an Amazon Bedrock control plane endpoint and provide the ID or ARN of the job in the jobIdentifier
field.
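As a sketch of this call, the helper below uses the AWS SDK for Python (boto3) control plane client. The function name `get_job_progress` is illustrative, not part of the SDK, and the example assumes your AWS credentials and region are already configured.

```python
def get_job_progress(bedrock_client, job_identifier):
    """Return the status of a batch inference job.

    bedrock_client is an Amazon Bedrock control plane client, e.g.
    boto3.client("bedrock"); job_identifier is the job's ID or ARN.
    """
    # GetModelInvocationJob returns the job's status along with a
    # message field and the job's configuration details.
    job = bedrock_client.get_model_invocation_job(jobIdentifier=job_identifier)
    return {
        "status": job["status"],
        "message": job.get("message", ""),
    }
```

You could poll this helper until the status reaches a terminal value such as Completed or Failed.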
To list information about multiple batch inference jobs, send a ListModelInvocationJobs request (see link for request and response formats and field details) with an Amazon Bedrock control plane endpoint. You can specify the following optional parameters:
| Field | Short description |
| --- | --- |
| maxResults | The maximum number of results to return in a response. |
| nextToken | If there are more results than the number you specified in the maxResults field, the response returns a nextToken value. To see the next batch of results, send the nextToken value in another request. |
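The maxResults and nextToken parameters can be combined into a pagination loop, sketched below with a boto3-style control plane client. The helper name `list_all_jobs` is illustrative; it assumes the response keys `invocationJobSummaries` and `nextToken` returned by ListModelInvocationJobs.

```python
def list_all_jobs(bedrock_client, page_size=10):
    """Collect summaries for all batch inference jobs, following nextToken.

    bedrock_client is an Amazon Bedrock control plane client, e.g.
    boto3.client("bedrock"); page_size maps to the maxResults parameter.
    """
    summaries = []
    next_token = None
    while True:
        kwargs = {"maxResults": page_size}
        if next_token:
            # Pass the previous response's nextToken to fetch the next page.
            kwargs["nextToken"] = next_token
        page = bedrock_client.list_model_invocation_jobs(**kwargs)
        summaries.extend(page.get("invocationJobSummaries", []))
        next_token = page.get("nextToken")
        if not next_token:
            # No nextToken in the response means there are no more results.
            break
    return summaries
```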
To list all the tags for a job, send a ListTagsForResource request (see link for request and response formats and field details) with an Amazon Bedrock control plane endpoint and include the Amazon Resource Name (ARN) of the job.
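A small hedged example of the tag call, again with a boto3-style client: the helper name `get_job_tags` is illustrative, and the example assumes ListTagsForResource returns tags as a list of objects with key and value fields.

```python
def get_job_tags(bedrock_client, job_arn):
    """Return a batch inference job's tags as a {key: value} dict.

    bedrock_client is an Amazon Bedrock control plane client, e.g.
    boto3.client("bedrock"); job_arn is the job's Amazon Resource Name.
    """
    response = bedrock_client.list_tags_for_resource(resourceARN=job_arn)
    # ListTagsForResource returns tags as a list of {"key": ..., "value": ...}.
    return {tag["key"]: tag["value"] for tag in response.get("tags", [])}
```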
You can also monitor batch inference jobs with Amazon EventBridge. For more information, see Monitor Amazon Bedrock job state changes using Amazon EventBridge.