
MLPER-06: Explore alternatives for performance improvement - Machine Learning Lens


Perform benchmarks to improve machine learning model performance. Benchmarking in ML evaluates and compares ML workloads across different algorithms, features, and architecture resources, identifying the combination that delivers optimal performance.

Options you can use when benchmarking include:

  • Use more data to broaden the statistical coverage and improve the model's success metric.

  • Apply feature engineering to extract important signals in the data for the model.

  • Select alternative algorithms that better fit the specifics of the data.

  • Use ensemble methods to combine the different advantages of multiple models.

  • Tune the hyperparameters for a given algorithm to calibrate the model for the data.
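The options above can be benchmarked locally before scaling out. The sketch below (assuming scikit-learn is available; the dataset and model choices are illustrative, not prescribed by this guidance) compares a simple baseline against alternative algorithms and a voting ensemble using cross-validated accuracy as the success metric.

```python
# Benchmark several candidate algorithms, plus an ensemble that
# combines them, with 5-fold cross-validated accuracy.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for your training data.
X, y = make_classification(n_samples=500, n_features=20, random_state=0)

candidates = {
    "baseline_logistic": LogisticRegression(max_iter=1000),
    "decision_tree": DecisionTreeClassifier(random_state=0),
    "random_forest": RandomForestClassifier(n_estimators=100, random_state=0),
}
# Ensemble combining the individual candidates by majority vote.
candidates["voting_ensemble"] = VotingClassifier(
    estimators=[(name, model) for name, model in candidates.items()]
)

scores = {
    name: cross_val_score(model, X, y, cv=5).mean()
    for name, model in candidates.items()
}
for name, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {score:.3f}")
```

A real benchmark would substitute your own dataset and success metric, and track each run so results remain comparable across experiments.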

Implementation plan

  • Use Amazon SageMaker AI Experiments to optimize algorithms and features - Begin with a simple architecture, obvious features, and a simple algorithm to establish a baseline. Amazon SageMaker AI provides built-in algorithms for developing a baseline model. Use Amazon SageMaker AI Experiments to organize, track, compare, and evaluate your machine learning experiments. Test algorithms of increasing complexity and observe how performance changes. Combine models into an ensemble to increase accuracy, but weigh the potential loss of efficiency as a trade-off. Refine the feature set through selection and modify parameters to optimize model performance. Tune the model's hyperparameters using Amazon SageMaker AI Hyperparameter Optimization to automate the search.
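The automated hyperparameter search step can be illustrated with a local analogue: the sketch below uses scikit-learn's grid search, which performs the same kind of search that SageMaker AI Hyperparameter Optimization runs at scale. The algorithm, grid values, and dataset are assumptions for illustration only.

```python
# Local analogue of an automated hyperparameter search: try each
# combination in the grid and keep the best cross-validated score.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

# Synthetic stand-in for your training data.
X, y = make_classification(n_samples=500, n_features=20, random_state=0)

# Illustrative search space; a managed tuner would explore
# comparable ranges, typically with a smarter search strategy.
param_grid = {
    "n_estimators": [50, 100],
    "max_depth": [4, 8, None],
}
search = GridSearchCV(
    RandomForestClassifier(random_state=0),
    param_grid,
    cv=5,
    scoring="accuracy",
)
search.fit(X, y)
print("best params:", search.best_params_)
print("best CV accuracy:", round(search.best_score_, 3))
```

SageMaker AI Hyperparameter Optimization applies the same idea to training jobs, using Bayesian or random search over declared parameter ranges instead of exhaustive enumeration.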

