Integrating background filters into a client application for the Amazon Chime SDK

This section explains how to programmatically filter video backgrounds by using background blur 2.0 and background replacement 2.0. To add a background filter to a video stream, you create a VideoFxProcessor that contains a VideoFxConfig object. You then insert that processor into a VideoTransformDevice.
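
The following is a minimal sketch of that flow using the Amazon Chime SDK for JavaScript. The logger name, camera device ID, meeting session variable, and the specific configuration values are illustrative assumptions, not required settings.

```typescript
import {
  ConsoleLogger,
  DefaultVideoTransformDevice,
  LogLevel,
  MeetingSession,
  VideoFxConfig,
  VideoFxProcessor,
} from 'amazon-chime-sdk-js';

// Filter configuration: enable background blur 2.0 at medium strength and
// leave background replacement 2.0 disabled.
const videoFxConfig: VideoFxConfig = {
  backgroundBlur: {
    isEnabled: true,
    strength: 'medium',
  },
  backgroundReplacement: {
    isEnabled: false,
    backgroundImageURL: undefined,
    defaultColor: 'black',
  },
};

const logger = new ConsoleLogger('VideoFxDemo', LogLevel.INFO);

async function startFilteredVideo(
  meetingSession: MeetingSession, // an already-created meeting session (assumed)
  cameraDeviceId: string          // a device ID from listVideoInputDevices() (assumed)
): Promise<void> {
  // Creating the processor downloads the model, worker, and WebAssembly assets.
  const videoFxProcessor = await VideoFxProcessor.create(logger, videoFxConfig);

  // Wrap the raw camera and the processor in a video transform device.
  const transformDevice = new DefaultVideoTransformDevice(logger, cameraDeviceId, [
    videoFxProcessor,
  ]);

  // Start video with the transform device so each frame passes through the filter.
  await meetingSession.audioVideo.startVideoInput(transformDevice);
}
```

Passing the transform device to startVideoInput, rather than the raw camera device, is what routes each outgoing frame through the background filter.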

The background filter processor uses a TensorFlow Lite machine learning model, JavaScript Web Workers, and WebAssembly to apply a filter to the background of each frame in the video stream. These assets are downloaded at runtime when you create a VideoFxProcessor.
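
Because those assets are fetched when VideoFxProcessor.create is called, creation can fail at runtime, for example when the browser is unsupported or the assets cannot be downloaded. The following is a hedged sketch of a fallback path that reuses the names from the previous example and starts unfiltered video if the processor cannot be created.

```typescript
// If the runtime assets cannot be downloaded or the environment is unsupported,
// VideoFxProcessor.create rejects; in that case, fall back to the unfiltered camera.
async function startVideoWithOptionalFilter(
  meetingSession: MeetingSession,
  cameraDeviceId: string
): Promise<void> {
  let videoInput: string | DefaultVideoTransformDevice = cameraDeviceId;
  try {
    const processor = await VideoFxProcessor.create(logger, videoFxConfig);
    videoInput = new DefaultVideoTransformDevice(logger, cameraDeviceId, [processor]);
  } catch (error) {
    logger.warn(`Background filter unavailable, starting unfiltered video: ${error}`);
  }
  await meetingSession.audioVideo.startVideoInput(videoInput);
}
```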

The browser demo application on GitHub uses the new background blur and replacement filters. To try them, launch the demo with npm run start, join the meeting, then choose the camera icon to enable video. Open the Apply Filter menu (the button with a circle and a downward arrow) and choose a Background Blur 2.0 or Background Replacement 2.0 option.
