

# InferenceComponentSpecificationSummary
<a name="API_InferenceComponentSpecificationSummary"></a>

Details about the resources that are deployed with this inference component.

## Contents
<a name="API_InferenceComponentSpecificationSummary_Contents"></a>

**BaseInferenceComponentName**   <a name="sagemaker-Type-InferenceComponentSpecificationSummary-BaseInferenceComponentName"></a>
The name of the base inference component that contains this inference component.  
Type: String  
Length Constraints: Minimum length of 0. Maximum length of 63.  
Pattern: `[a-zA-Z0-9]([\-a-zA-Z0-9]*[a-zA-Z0-9])?`   
Required: No

**ComputeResourceRequirements**   <a name="sagemaker-Type-InferenceComponentSpecificationSummary-ComputeResourceRequirements"></a>
The compute resources that you allocate to run the model, plus any adapter models, that you assign to the inference component.  
Type: [InferenceComponentComputeResourceRequirements](API_InferenceComponentComputeResourceRequirements.md) object  
Required: No

**Container**   <a name="sagemaker-Type-InferenceComponentSpecificationSummary-Container"></a>
Details about the container that provides the runtime environment for the model that is deployed with the inference component.  
Type: [InferenceComponentContainerSpecificationSummary](API_InferenceComponentContainerSpecificationSummary.md) object  
Required: No

**DataCacheConfig**   <a name="sagemaker-Type-InferenceComponentSpecificationSummary-DataCacheConfig"></a>
Settings that affect how the inference component caches data.  
Type: [InferenceComponentDataCacheConfigSummary](API_InferenceComponentDataCacheConfigSummary.md) object  
Required: No

**ModelName**   <a name="sagemaker-Type-InferenceComponentSpecificationSummary-ModelName"></a>
The name of the SageMaker AI model object that is deployed with the inference component.  
Type: String  
Length Constraints: Minimum length of 0. Maximum length of 63.  
Pattern: `[a-zA-Z0-9]([\-a-zA-Z0-9]*[a-zA-Z0-9])?`   
Required: No

**SchedulingConfig**   <a name="sagemaker-Type-InferenceComponentSpecificationSummary-SchedulingConfig"></a>
The scheduling configuration that determines how inference component copies are placed across available instances when copies are added or removed.  
Type: [InferenceComponentSchedulingConfig](API_InferenceComponentSchedulingConfig.md) object  
Required: No

**StartupParameters**   <a name="sagemaker-Type-InferenceComponentSpecificationSummary-StartupParameters"></a>
Settings that take effect while the model container starts up.  
Type: [InferenceComponentStartupParameters](API_InferenceComponentStartupParameters.md) object  
Required: No
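The fields above appear together in responses such as the output of the `DescribeInferenceComponent` API. The sketch below is a hypothetical example, in Python, of such a summary represented as a plain dict, along with a small helper that checks a name against the length and pattern constraints documented for `ModelName` and `BaseInferenceComponentName`. The field values and the nested keys under `ComputeResourceRequirements` and `StartupParameters` are illustrative assumptions, not values from a real endpoint.

```python
import re

# Pattern and length constraints as documented for ModelName and
# BaseInferenceComponentName in this data type.
NAME_PATTERN = re.compile(r"[a-zA-Z0-9]([\-a-zA-Z0-9]*[a-zA-Z0-9])?")

def is_valid_component_name(name: str) -> bool:
    """Return True if name satisfies the documented constraints:
    at most 63 characters, alphanumeric with interior hyphens."""
    return len(name) <= 63 and NAME_PATTERN.fullmatch(name) is not None

# Hypothetical InferenceComponentSpecificationSummary as returned in a
# DescribeInferenceComponent response (all values are made up).
summary = {
    "ModelName": "my-example-model",
    "BaseInferenceComponentName": "base-component",
    "ComputeResourceRequirements": {
        "NumberOfAcceleratorDevicesRequired": 1,
        "MinMemoryRequiredInMb": 1024,
    },
    "StartupParameters": {
        "ModelDataDownloadTimeoutInSeconds": 600,
    },
}

# Every field is optional, so guard each access when parsing a response.
model_name = summary.get("ModelName")
if model_name is not None:
    assert is_valid_component_name(model_name)
```

Because every member of this type is optional (`Required: No`), code that consumes the summary should treat each field as possibly absent, as the `dict.get` access above does.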

## See Also
<a name="API_InferenceComponentSpecificationSummary_SeeAlso"></a>

For more information about using this API in one of the language-specific AWS SDKs, see the following:
+  [AWS SDK for C\+\+](https://docs.aws.amazon.com/goto/SdkForCpp/sagemaker-2017-07-24/InferenceComponentSpecificationSummary) 
+  [AWS SDK for Java V2](https://docs.aws.amazon.com/goto/SdkForJavaV2/sagemaker-2017-07-24/InferenceComponentSpecificationSummary) 
+  [AWS SDK for Ruby V3](https://docs.aws.amazon.com/goto/SdkForRubyV3/sagemaker-2017-07-24/InferenceComponentSpecificationSummary) 