Class CfnInferenceComponent.InferenceComponentSpecificationProperty.Builder
java.lang.Object
software.amazon.awscdk.services.sagemaker.CfnInferenceComponent.InferenceComponentSpecificationProperty.Builder
- All Implemented Interfaces:
software.amazon.jsii.Builder<CfnInferenceComponent.InferenceComponentSpecificationProperty>
- Enclosing interface:
CfnInferenceComponent.InferenceComponentSpecificationProperty
@Stability(Stable)
public static final class CfnInferenceComponent.InferenceComponentSpecificationProperty.Builder
extends Object
implements software.amazon.jsii.Builder<CfnInferenceComponent.InferenceComponentSpecificationProperty>
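As an example of how this builder is typically used, the following sketch assembles a specification for an inference component. The model name and numeric values are placeholders, and the field names on the nested requirements and startup-parameter builders are assumptions taken from their respective property pages rather than values documented here.

    import software.amazon.awscdk.services.sagemaker.CfnInferenceComponent;

    // Minimal sketch: build an InferenceComponentSpecificationProperty.
    // "my-sagemaker-model" and the numeric values are placeholders; the
    // nested builders' field names are assumed, not documented on this page.
    CfnInferenceComponent.InferenceComponentSpecificationProperty spec =
        CfnInferenceComponent.InferenceComponentSpecificationProperty.builder()
            .modelName("my-sagemaker-model")
            .computeResourceRequirements(
                CfnInferenceComponent.InferenceComponentComputeResourceRequirementsProperty.builder()
                    .numberOfCpuCoresRequired(2)
                    .minMemoryRequiredInMb(4096)
                    .build())
            .startupParameters(
                CfnInferenceComponent.InferenceComponentStartupParametersProperty.builder()
                    .containerStartupHealthCheckTimeoutInSeconds(600)
                    .build())
            .build();

Each setter returns the builder, so calls can be chained; build() produces the configured property instance.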
-
Constructor Summary
Builder()
-
Method Summary
All configuration methods return CfnInferenceComponent.InferenceComponentSpecificationProperty.Builder; build() returns CfnInferenceComponent.InferenceComponentSpecificationProperty.
baseInferenceComponentName(String baseInferenceComponentName)
computeResourceRequirements(IResolvable computeResourceRequirements)
computeResourceRequirements(CfnInferenceComponent.InferenceComponentComputeResourceRequirementsProperty computeResourceRequirements)
container(IResolvable container)
container(CfnInferenceComponent.InferenceComponentContainerSpecificationProperty container)
modelName(String modelName)
startupParameters(IResolvable startupParameters)
startupParameters(CfnInferenceComponent.InferenceComponentStartupParametersProperty startupParameters)
build()
Builds the configured instance.
-
Constructor Details
-
Builder
public Builder()
-
-
Method Details
-
baseInferenceComponentName
@Stability(Stable) public CfnInferenceComponent.InferenceComponentSpecificationProperty.Builder baseInferenceComponentName(String baseInferenceComponentName)
Sets the value of CfnInferenceComponent.InferenceComponentSpecificationProperty.getBaseInferenceComponentName()
- Parameters:
baseInferenceComponentName
- The name of an existing inference component that is to contain the inference component that you're creating with your request.
Specify this parameter only if your request is meant to create an adapter inference component. An adapter inference component contains the path to an adapter model. The purpose of the adapter model is to tailor the inference output of a base foundation model, which is hosted by the base inference component. The adapter inference component uses the compute resources that you assigned to the base inference component.
When you create an adapter inference component, use the Container parameter to specify the location of the adapter artifacts. In the parameter value, use the ArtifactUrl parameter of the InferenceComponentContainerSpecification data type.
Before you can create an adapter inference component, you must have an existing inference component that contains the foundation model that you want to adapt.
- Returns:
this
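For instance, a specification for an adapter inference component might be sketched as follows. The base component name and artifact URI are placeholders, the artifactUrl field name is an assumption based on the InferenceComponentContainerSpecificationProperty type, and computeResourceRequirements is omitted because the adapter uses the base component's resources.

    // Sketch of an adapter inference component specification (placeholder values).
    CfnInferenceComponent.InferenceComponentSpecificationProperty adapterSpec =
        CfnInferenceComponent.InferenceComponentSpecificationProperty.builder()
            .baseInferenceComponentName("my-base-inference-component")
            .container(CfnInferenceComponent.InferenceComponentContainerSpecificationProperty.builder()
                // artifactUrl is assumed to point to the adapter artifacts in Amazon S3.
                .artifactUrl("s3://amzn-s3-demo-bucket/adapters/model.tar.gz")
                .build())
            .build();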
-
computeResourceRequirements
@Stability(Stable) public CfnInferenceComponent.InferenceComponentSpecificationProperty.Builder computeResourceRequirements(IResolvable computeResourceRequirements)
Sets the value of CfnInferenceComponent.InferenceComponentSpecificationProperty.getComputeResourceRequirements()
- Parameters:
computeResourceRequirements
- The compute resources allocated to run the model, plus any adapter models, that you assign to the inference component. Omit this parameter if your request is meant to create an adapter inference component. An adapter inference component is loaded by a base inference component, and it uses the compute resources of the base inference component.
- Returns:
this
-
computeResourceRequirements
@Stability(Stable) public CfnInferenceComponent.InferenceComponentSpecificationProperty.Builder computeResourceRequirements(CfnInferenceComponent.InferenceComponentComputeResourceRequirementsProperty computeResourceRequirements)
Sets the value of CfnInferenceComponent.InferenceComponentSpecificationProperty.getComputeResourceRequirements()
- Parameters:
computeResourceRequirements
- The compute resources allocated to run the model, plus any adapter models, that you assign to the inference component. Omit this parameter if your request is meant to create an adapter inference component. An adapter inference component is loaded by a base inference component, and it uses the compute resources of the base inference component.
- Returns:
this
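As a sketch of the typed overload, the requirements object can be assembled with its own builder. The field names (numberOfCpuCoresRequired, minMemoryRequiredInMb) are assumptions based on the InferenceComponentComputeResourceRequirementsProperty type, and the values are placeholders.

    // Sketch: compute resources for a non-adapter inference component (placeholder values).
    CfnInferenceComponent.InferenceComponentComputeResourceRequirementsProperty requirements =
        CfnInferenceComponent.InferenceComponentComputeResourceRequirementsProperty.builder()
            .numberOfCpuCoresRequired(2)   // assumed field name
            .minMemoryRequiredInMb(4096)   // assumed field name
            .build();

The resulting object is then passed to computeResourceRequirements(...) on this builder.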
-
container
@Stability(Stable) public CfnInferenceComponent.InferenceComponentSpecificationProperty.Builder container(IResolvable container)
- Parameters:
container
- Defines a container that provides the runtime environment for a model that you deploy with an inference component.
- Returns:
this
-
container
@Stability(Stable) public CfnInferenceComponent.InferenceComponentSpecificationProperty.Builder container(CfnInferenceComponent.InferenceComponentContainerSpecificationProperty container)
- Parameters:
container
- Defines a container that provides the runtime environment for a model that you deploy with an inference component.
- Returns:
this
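A container specification might be sketched as follows. The image field name is an assumption based on the InferenceComponentContainerSpecificationProperty type, and the image URI is a placeholder; the result is passed to container(...) on this builder.

    // Sketch: container that provides the runtime environment for the model (placeholder image URI).
    CfnInferenceComponent.InferenceComponentContainerSpecificationProperty container =
        CfnInferenceComponent.InferenceComponentContainerSpecificationProperty.builder()
            .image("123456789012.dkr.ecr.us-east-1.amazonaws.com/my-inference-image:latest")  // assumed field name
            .build();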
-
modelName
@Stability(Stable) public CfnInferenceComponent.InferenceComponentSpecificationProperty.Builder modelName(String modelName)
- Parameters:
modelName
- The name of an existing SageMaker AI model object in your account that you want to deploy with the inference component.
- Returns:
this
-
startupParameters
@Stability(Stable) public CfnInferenceComponent.InferenceComponentSpecificationProperty.Builder startupParameters(IResolvable startupParameters)
Sets the value of CfnInferenceComponent.InferenceComponentSpecificationProperty.getStartupParameters()
- Parameters:
startupParameters
- Settings that take effect while the model container starts up.
- Returns:
this
-
startupParameters
@Stability(Stable) public CfnInferenceComponent.InferenceComponentSpecificationProperty.Builder startupParameters(CfnInferenceComponent.InferenceComponentStartupParametersProperty startupParameters)
Sets the value of CfnInferenceComponent.InferenceComponentSpecificationProperty.getStartupParameters()
- Parameters:
startupParameters
- Settings that take effect while the model container starts up.
- Returns:
this
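Startup settings might be sketched as follows. The timeout field names are assumptions based on the InferenceComponentStartupParametersProperty type, and the values are placeholders.

    // Sketch: settings applied while the model container starts up (placeholder values).
    CfnInferenceComponent.InferenceComponentStartupParametersProperty startup =
        CfnInferenceComponent.InferenceComponentStartupParametersProperty.builder()
            .containerStartupHealthCheckTimeoutInSeconds(600)   // assumed field name
            .modelDataDownloadTimeoutInSeconds(900)             // assumed field name
            .build();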
-
build
@Stability(Stable) public CfnInferenceComponent.InferenceComponentSpecificationProperty build()
Builds the configured instance.
- Specified by:
build in interface software.amazon.jsii.Builder<CfnInferenceComponent.InferenceComponentSpecificationProperty>
- Returns:
- a new instance of
CfnInferenceComponent.InferenceComponentSpecificationProperty
- Throws:
NullPointerException
- if any required attribute was not provided
-