/AWS1/CL_BDK=>CREATEMODELINVOCATIONJOB()
About CreateModelInvocationJob¶
Creates a batch inference job to invoke a model on multiple prompts. Format your data as described in Format your inference data and upload it to an Amazon S3 bucket. For more information, see Process multiple prompts with batch inference.
The response returns a jobArn that you can use to stop or get details about the job.
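The sketch below shows one way to call the method from ABAP, assuming the SDK's usual /AWS1/CL_BDK_FACTORY=>CREATE( ) pattern for obtaining the client. The profile name, ARNs, model ID, S3 URIs, the nested S3 configuration classes (/AWS1/CL_BDKMDELINVJOBS3INP00 and /AWS1/CL_BDKMDELINVJOBS3OUT00) and the GET_JOBARN( ) getter on the result are illustrative assumptions; only the method name and the argument classes listed under Method Signature below are taken from this page.

```abap
" Minimal sketch: create a batch inference job and capture the job ARN.
" Profile name, ARNs, S3 URIs, model ID and the nested S3 config class
" names (/aws1/cl_bdkmdelinvjobs3inp00, /aws1/cl_bdkmdelinvjobs3out00)
" are assumptions for illustration only.
DATA(lo_session) = /aws1/cl_rt_session_aws=>create( 'DEMO' ).
DATA(lo_bdk)     = /aws1/cl_bdk_factory=>create( lo_session ).

DATA(lo_job) = lo_bdk->createmodelinvocationjob(
  iv_jobname          = 'my-batch-inference-job'
  iv_rolearn          = 'arn:aws:iam::111122223333:role/MyBatchInferenceRole'
  iv_modelid          = 'anthropic.claude-3-5-sonnet-20240620-v1:0'
  io_inputdataconfig  = NEW /aws1/cl_bdkmdelinvjobinpdat00(
    io_s3inputdataconfig = NEW /aws1/cl_bdkmdelinvjobs3inp00(   " assumed class name
      iv_s3uri = 's3://amzn-s3-demo-bucket/input/' ) )
  io_outputdataconfig = NEW /aws1/cl_bdkmdelinvjoboutdat00(
    io_s3outputdataconfig = NEW /aws1/cl_bdkmdelinvjobs3out00(  " assumed class name
      iv_s3uri = 's3://amzn-s3-demo-bucket/output/' ) ) ).

" The job ARN can later be passed to the GETMODELINVOCATIONJOB or
" STOPMODELINVOCATIONJOB methods to inspect or stop the job.
DATA(lv_jobarn) = lo_job->get_jobarn( ).
```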
Method Signature¶
IMPORTING¶
Required arguments:¶
IV_JOBNAME TYPE /AWS1/BDKMODELINVCJOBNAME
A name to give the batch inference job.
IV_ROLEARN TYPE /AWS1/BDKROLEARN
The Amazon Resource Name (ARN) of the service role with permissions to carry out and manage batch inference. You can use the console to create a default service role or follow the steps at Create a service role for batch inference.
IV_MODELID TYPE /AWS1/BDKMODELID
The unique identifier of the foundation model to use for the batch inference job.
IO_INPUTDATACONFIG TYPE REF TO /AWS1/CL_BDKMDELINVJOBINPDAT00
Details about the location of the input to the batch inference job.
IO_OUTPUTDATACONFIG TYPE REF TO /AWS1/CL_BDKMDELINVJOBOUTDAT00
Details about the location of the output of the batch inference job.
Optional arguments:¶
IV_CLIENTREQUESTTOKEN TYPE /AWS1/BDKMDELINVIDEMPOTENCYTOK
A unique, case-sensitive identifier to ensure that the API request completes no more than one time. If this token matches a previous request, Amazon Bedrock ignores the request, but does not return an error. For more information, see Ensuring idempotency.
IO_VPCCONFIG TYPE REF TO /AWS1/CL_BDKVPCCONFIG
The configuration of the Virtual Private Cloud (VPC) for the data in the batch inference job. For more information, see Protect batch inference jobs using a VPC.
IV_TIMEOUTDURATIONINHOURS TYPE /AWS1/BDKMDELINVJOBTODURINHO00
The number of hours after which to force the batch inference job to time out.
IT_TAGS TYPE /AWS1/CL_BDKTAG=>TT_TAGLIST
Any tags to associate with the batch inference job. For more information, see Tagging Amazon Bedrock resources.
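As a sketch of supplying the optional arguments, the IT_TAGS table might be built as shown below and passed alongside the required arguments from the earlier example. The IV_KEY and IV_VALUE constructor parameters of /AWS1/CL_BDKTAG, the tag values, and the timeout value are assumptions based on the SDK's usual generated constructors, not confirmed by this page.

```abap
" Sketch of the optional IT_TAGS and IV_TIMEOUTDURATIONINHOURS arguments,
" assuming /aws1/cl_bdktag takes iv_key / iv_value constructor parameters.
DATA(lt_tags) = VALUE /aws1/cl_bdktag=>tt_taglist(
  ( NEW /aws1/cl_bdktag( iv_key = 'project' iv_value = 'batch-demo' ) )
  ( NEW /aws1/cl_bdktag( iv_key = 'owner'   iv_value = 'data-team' ) ) ).

" Pass them alongside the required arguments shown earlier, for example:
"   it_tags                   = lt_tags
"   iv_timeoutdurationinhours = 72
```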