

# IVS Broadcast SDK: Android Guide | Real-Time Streaming
<a name="broadcast-android"></a>

The IVS real-time streaming Android broadcast SDK enables participants to send and receive video on Android.

The `com.amazonaws.ivs.broadcast` package implements the interface described in this document. The SDK supports the following operations:
+ Join a stage 
+ Publish media to other participants in the stage
+ Subscribe to media from other participants in the stage
+ Manage and monitor video and audio published to the stage
+ Get WebRTC statistics for each peer connection
+ All operations from the IVS low-latency streaming Android broadcast SDK

**Latest version of Android broadcast SDK:** 1.41.0 ([Release Notes](https://docs.aws.amazon.com/ivs/latest/RealTimeUserGuide/release-notes.html#apr09-26-broadcast-mobile-rt))

**Reference documentation:** For information on the most important methods available in the Amazon IVS Android broadcast SDK, see the reference documentation at [https://aws.github.io/amazon-ivs-broadcast-docs/1.41.0/android/](https://aws.github.io/amazon-ivs-broadcast-docs/1.41.0/android/).

**Sample code:** See the Android sample repository on GitHub: [https://github.com/aws-samples/amazon-ivs-real-time-streaming-android-samples](https://github.com/aws-samples/amazon-ivs-real-time-streaming-android-samples).

**Platform requirements:** Android 9.0 or later

# Getting Started with the IVS Android Broadcast SDK | Real-Time Streaming
<a name="broadcast-android-getting-started"></a>

This document takes you through the steps involved in getting started with the IVS real-time streaming Android broadcast SDK.

## Install the Library
<a name="broadcast-android-install"></a>

There are several ways to add the Amazon IVS Android broadcast library to your Android development environment: use Gradle directly, use Gradle version catalogs, or install the SDK manually.

**Use Gradle directly**: Add the library to your module’s `build.gradle` file, as shown here (for the latest version of the IVS broadcast SDK):

```
repositories {
    mavenCentral()
}
 
dependencies {
    implementation 'com.amazonaws:ivs-broadcast:1.41.0:stages@aar'
}
```

**Use Gradle version catalogs**: First include this in your module’s `build.gradle` file:

```
implementation(libs.ivs) {
    artifact {
        classifier = "stages"
        type = "aar"
    }
}
```

Then include the following in the `libs.versions.toml` file (for the latest version of the IVS broadcast SDK):

```
[versions]
ivs="1.41.0"

[libraries]
ivs = {module = "com.amazonaws:ivs-broadcast", version.ref = "ivs"}
```

**Install the SDK manually**: Download the latest version from this location:

[https://search.maven.org/artifact/com.amazonaws/ivs-broadcast](https://search.maven.org/artifact/com.amazonaws/ivs-broadcast)

Be sure to download the `aar` with `-stages` appended.

**Also allow SDK control over the speakerphone**: Regardless of which installation method you choose, also add the following permission to your manifest, to allow the SDK to enable and disable the speakerphone:

```
<uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS"/>
```

## Using the SDK with Debug Symbols
<a name="broadcast-android-using-debug-symbols-rt"></a>

We also publish a version of the Android broadcast SDK that includes debug symbols. You can use this version to improve the quality of debug reports (stack traces) in Firebase Crashlytics if you run into crashes in the IVS broadcast SDK's native library, `libbroadcastcore.so`. When you report these crashes to the IVS SDK team, the higher-quality stack traces make it easier to fix the issues.

To use this version of the SDK, put the following in your Gradle build files:

```
implementation "com.amazonaws:ivs-broadcast:$version:stages-unstripped@aar"
```

Use the above line instead of this:

```
implementation "com.amazonaws:ivs-broadcast:$version:stages@aar"
```

### Uploading Symbols to Firebase Crashlytics
<a name="android-debug-symbols-rt-firebase-crashlytics"></a>

Ensure that your Gradle build files are set up for Firebase Crashlytics. Follow Google’s instructions here:

[https://firebase.google.com/docs/crashlytics/ndk-reports](https://firebase.google.com/docs/crashlytics/ndk-reports)

Be sure to include `com.google.firebase:firebase-crashlytics-ndk` as a dependency.

When building your app for release, the Firebase Crashlytics plugin should upload symbols automatically. To upload symbols manually, run either of the following:

```
gradle uploadCrashlyticsSymbolFileRelease
```

```
./gradlew uploadCrashlyticsSymbolFileRelease
```

(It will not hurt if symbols are uploaded twice, both automatically and manually.)

### Preventing your Release .apk from Becoming Larger
<a name="android-debug-symbols-rt-sizing-apk"></a>

Before packaging the release `.apk` file, the Android Gradle Plugin automatically tries to strip debug information from shared libraries (including the IVS broadcast SDK's `libbroadcastcore.so` library). However, sometimes this does not happen. As a result, your `.apk` file could become larger and you could get a warning message from the Android Gradle Plugin that it’s unable to strip debug symbols and is packaging `.so` files as is. If this happens, do the following:
+ Install an Android NDK. Any recent version will work.
+ Add `ndkVersion <your_installed_ndk_version_number>` to your application’s `build.gradle` file. Do this even if your application itself does not contain native code.

For more information, see this [issue report](https://issuetracker.google.com/issues/353554169).

## Request Permissions
<a name="broadcast-android-permissions"></a>

Your app must request permission to access the user’s camera and mic. (This is not specific to Amazon IVS; it is required for any application that needs access to cameras and microphones.)

Here, we check whether the user has already granted permissions and, if not, ask for them:

```
final String[] requiredPermissions =
         { Manifest.permission.CAMERA, Manifest.permission.RECORD_AUDIO };

for (String permission : requiredPermissions) {
    if (ContextCompat.checkSelfPermission(this, permission) 
                != PackageManager.PERMISSION_GRANTED) {
        // If any permissions are missing we want to just request them all.
        ActivityCompat.requestPermissions(this, requiredPermissions, 0x100);
        break;
    }
}
```

Here, we get the user’s response:

```
@Override
public void onRequestPermissionsResult(int requestCode, 
                                      @NonNull String[] permissions,
                                      @NonNull int[] grantResults) {
    super.onRequestPermissionsResult(requestCode,
               permissions, grantResults);
    if (requestCode == 0x100) {
        for (int result : grantResults) {
            if (result == PackageManager.PERMISSION_DENIED) {
                return;
            }
        }
        setupBroadcastSession();
    }
}
```
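The `grantResults` check above can be factored into a small, testable helper. A minimal sketch in plain Java, assuming the Android convention that `PackageManager.PERMISSION_GRANTED` equals 0 (the helper name is illustrative, not part of the SDK):

```java
import java.util.Arrays;

class PermissionCheck {
    // Android's PackageManager.PERMISSION_GRANTED is 0; PERMISSION_DENIED is -1.
    static final int PERMISSION_GRANTED = 0;

    // Returns true only if every requested permission was granted.
    static boolean allGranted(int[] grantResults) {
        if (grantResults.length == 0) {
            return false; // an empty result means the request was cancelled
        }
        return Arrays.stream(grantResults).allMatch(r -> r == PERMISSION_GRANTED);
    }
}
```

In `onRequestPermissionsResult`, you could then call `setupBroadcastSession()` only when `allGranted(grantResults)` returns true.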

# Publishing & Subscribing with the IVS Android Broadcast SDK | Real-Time Streaming
<a name="android-publish-subscribe"></a>

This document takes you through the steps involved in publishing and subscribing to a stage using the IVS real-time streaming Android broadcast SDK.

## Concepts
<a name="android-publish-subscribe-concepts"></a>

Three core concepts underlie real-time functionality: [stage](#android-publish-subscribe-concepts-stage), [strategy](#android-publish-subscribe-concepts-strategy), and [renderer](#android-publish-subscribe-concepts-renderer). The design goal is minimizing the amount of client-side logic necessary to build a working product.

### Stage
<a name="android-publish-subscribe-concepts-stage"></a>

The `Stage` class is the main point of interaction between the host application and the SDK. It represents the stage itself and is used to join and leave the stage. Creating and joining a stage requires a valid, unexpired token string from the control plane (represented as `token`). Joining and leaving a stage are simple. 

```
Stage stage = new Stage(context, token, strategy);

try {
	stage.join();
} catch (BroadcastException exception) {
	// handle join exception
}

stage.leave();
```

The `Stage` class is also where the `StageRenderer` can be attached:

```
stage.addRenderer(renderer); // multiple renderers can be added
```

### Strategy
<a name="android-publish-subscribe-concepts-strategy"></a>

The `Stage.Strategy` interface provides a way for the host application to communicate the desired state of the stage to the SDK. Three functions need to be implemented: `shouldSubscribeToParticipant`, `shouldPublishFromParticipant`, and `stageStreamsToPublishForParticipant`. All are discussed below.

#### Subscribing to Participants
<a name="android-publish-subscribe-concepts-strategy-participants"></a>

```
Stage.SubscribeType shouldSubscribeToParticipant(@NonNull Stage stage, @NonNull ParticipantInfo participantInfo);
```

When a remote participant joins the stage, the SDK queries the host application about the desired subscription state for that participant. The options are `NONE`, `AUDIO_ONLY`, and `AUDIO_VIDEO`. When returning a value for this function, the host application does not need to worry about the publish state, current subscription state, or stage connection state. If `AUDIO_VIDEO` is returned, the SDK waits until the remote participant is publishing before subscribing, and it updates the host application through the renderer throughout the process.

Here is a sample implementation:

```
@Override
Stage.SubscribeType shouldSubscribeToParticipant(@NonNull Stage stage, @NonNull ParticipantInfo participantInfo) {
	return Stage.SubscribeType.AUDIO_VIDEO;
}
```

This is the complete implementation of this function for a host application that always wants all participants to see each other; e.g., a video chat application.

More advanced implementations also are possible. Use the `userInfo` property on `ParticipantInfo` to selectively subscribe to participants based on server-provided attributes:

```
@Override
Stage.SubscribeType shouldSubscribeToParticipant(@NonNull Stage stage, @NonNull ParticipantInfo participantInfo) {
	switch (participantInfo.userInfo.get("role")) {
		case "moderator":
			return Stage.SubscribeType.NONE;
		case "guest":
			return Stage.SubscribeType.AUDIO_VIDEO;
		default:
			return Stage.SubscribeType.NONE;
	}
}
```

This can be used to create a stage where moderators can monitor all guests without being seen or heard themselves. The host application could use additional business logic to let moderators see each other but remain invisible to guests.
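The role-based branching above can also be isolated from the SDK so it is easy to unit-test. A hedged sketch using a stand-in enum for `Stage.SubscribeType` (the real enum comes from the SDK; this class and its name are illustrative only):

```java
import java.util.Map;

class RoleSubscribePolicy {
    // Stand-in for Stage.SubscribeType; the real type is provided by the SDK.
    enum SubscribeType { NONE, AUDIO_ONLY, AUDIO_VIDEO }

    // Decide the subscription type from the participant's server-provided attributes.
    static SubscribeType subscribeTypeFor(Map<String, String> userInfo) {
        String role = userInfo.getOrDefault("role", "");
        switch (role) {
            case "moderator":
                return SubscribeType.NONE;        // moderators stay unseen and unheard
            case "guest":
                return SubscribeType.AUDIO_VIDEO; // guests are seen and heard
            default:
                return SubscribeType.NONE;
        }
    }
}
```

The strategy's `shouldSubscribeToParticipant` implementation would then delegate to this helper with `participantInfo.userInfo`.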

#### Configuration for Subscribing to Participants
<a name="android-publish-subscribe-concepts-strategy-participants-config"></a>

```
SubscribeConfiguration subscribeConfigurationForParticipant(@NonNull Stage stage, @NonNull ParticipantInfo participantInfo);
```

If a remote participant is being subscribed to (see [Subscribing to Participants](#android-publish-subscribe-concepts-strategy-participants)), the SDK queries the host application about a custom subscribe configuration for that participant. This configuration is optional and allows the host application to control certain aspects of subscriber behavior. For information on what can be configured, see [SubscribeConfiguration](https://aws.github.io/amazon-ivs-web-broadcast/docs/sdk-reference/interfaces/SubscribeConfiguration) in the SDK reference documentation.

Here is a sample implementation:

```
@Override
public SubscribeConfiguration subscribeConfigurationForParticipant(@NonNull Stage stage, @NonNull ParticipantInfo participantInfo) {
    SubscribeConfiguration config = new SubscribeConfiguration();

    config.jitterBuffer.setMinDelay(JitterBufferConfiguration.JitterBufferDelay.MEDIUM());

    return config;
}
```

This implementation updates the jitter-buffer minimum delay for all subscribed participants to a preset of `MEDIUM`.

As with `shouldSubscribeToParticipant`, more advanced implementations are possible. The given `ParticipantInfo` can be used to selectively update the subscribe configuration for specific participants.

We recommend using the default behaviors. Specify custom configuration only if there is a particular behavior you want to change.

#### Publishing
<a name="android-publish-subscribe-concepts-strategy-publishing"></a>

```
boolean shouldPublishFromParticipant(@NonNull Stage stage, @NonNull ParticipantInfo participantInfo);
```

Once connected to the stage, the SDK queries the host application to see if a particular participant should publish. This is invoked only on local participants that have permission to publish based on the provided token.

Here is a sample implementation:

```
@Override
boolean shouldPublishFromParticipant(@NonNull Stage stage, @NonNull ParticipantInfo participantInfo) {
	return true;
}
```

This is the implementation for a standard video chat application where users always want to publish. They can mute and unmute their audio and video to instantly hide or show themselves. (They also can publish and unpublish, but that is much slower. Mute/unmute is preferable for use cases where visibility changes often.)

#### Choosing Streams to Publish
<a name="android-publish-subscribe-concepts-strategy-streams"></a>

```
List<LocalStageStream> stageStreamsToPublishForParticipant(@NonNull Stage stage, @NonNull ParticipantInfo participantInfo);
```

When publishing, this is used to determine what audio and video streams should be published. This is covered in more detail later in [Publish a Media Stream](#android-publish-subscribe-publish-stream).

#### Updating the Strategy
<a name="android-publish-subscribe-concepts-strategy-updates"></a>

The strategy is intended to be dynamic: the values returned from any of the above functions can be changed at any time. For example, if the host application does not want to publish until the end user taps a button, you could return a variable from `shouldPublishFromParticipant` (something like `hasUserTappedPublishButton`). When that variable changes based on an interaction by the end user, call `stage.refreshStrategy()` to signal to the SDK that it should query the strategy for the latest values, applying only things that have changed. If the SDK observes that the `shouldPublishFromParticipant` value has changed, it will start the publish process. If the SDK queries and all functions return the same value as before, the `refreshStrategy` call will not perform any modifications to the stage.

If the return value of `shouldSubscribeToParticipant` changes from `AUDIO_VIDEO` to `AUDIO_ONLY`, the video stream is removed for every participant whose returned value changed, if a video stream existed previously.

Generally, the stage uses the strategy to efficiently apply the difference between the previous and current strategies, without the host application needing to track all the state required to manage this properly. Think of calling `stage.refreshStrategy()` as a cheap operation: it does nothing unless the strategy changes.
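The diffing behavior described above can be sketched without the SDK. A minimal model (all names here are illustrative, not SDK API) in which a refresh re-queries the strategy and acts only when a returned value has changed:

```java
class StrategyDiffModel {
    // Mutable flag the host application flips, e.g. when a button is tapped.
    boolean hasUserTappedPublishButton = false;

    private boolean lastPublishValue = false;
    int publishTransitions = 0; // counts how often a publish/unpublish would start

    // Models stage.refreshStrategy(): re-query the strategy, act only on changes.
    void refreshStrategy() {
        boolean current = hasUserTappedPublishButton; // shouldPublishFromParticipant()
        if (current != lastPublishValue) {
            lastPublishValue = current;
            publishTransitions++; // the SDK would start/stop publishing here
        }
        // Unchanged values: no modifications are made to the stage.
    }
}
```

Repeated refreshes with an unchanged flag are no-ops, which is why refreshing eagerly after any UI interaction is harmless.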

### Renderer
<a name="android-publish-subscribe-concepts-renderer"></a>

The `StageRenderer` interface communicates the state of the stage to the host application. Updates to the host application’s UI usually can be powered entirely by the events provided by the renderer. The renderer provides the following functions:

```
void onParticipantJoined(@NonNull Stage stage, @NonNull ParticipantInfo participantInfo);

void onParticipantLeft(@NonNull Stage stage, @NonNull ParticipantInfo participantInfo);

void onParticipantPublishStateChanged(@NonNull Stage stage, @NonNull ParticipantInfo participantInfo, @NonNull Stage.PublishState publishState);

void onParticipantSubscribeStateChanged(@NonNull Stage stage, @NonNull ParticipantInfo participantInfo, @NonNull Stage.SubscribeState subscribeState);

void onStreamsAdded(@NonNull Stage stage, @NonNull ParticipantInfo participantInfo, @NonNull List<StageStream> streams);

void onStreamsRemoved(@NonNull Stage stage, @NonNull ParticipantInfo participantInfo, @NonNull List<StageStream> streams);

void onStreamsMutedChanged(@NonNull Stage stage, @NonNull ParticipantInfo participantInfo, @NonNull List<StageStream> streams);

void onError(@NonNull BroadcastException exception);

void onConnectionStateChanged(@NonNull Stage stage, @NonNull Stage.ConnectionState state, @Nullable BroadcastException exception);
                
void onStreamAdaptionChanged(@NonNull Stage stage, @NonNull ParticipantInfo participantInfo, @NonNull RemoteStageStream stream, boolean adaption);

void onStreamLayersChanged(@NonNull Stage stage, @NonNull ParticipantInfo participantInfo, @NonNull RemoteStageStream stream, @NonNull List<RemoteStageStream.Layer> layers);

void onStreamLayerSelected(@NonNull Stage stage, @NonNull ParticipantInfo participantInfo, @NonNull RemoteStageStream stream, @Nullable RemoteStageStream.Layer layer, @NonNull RemoteStageStream.LayerSelectedReason reason);
```

For most of these methods, the corresponding `Stage` and `ParticipantInfo` are provided.

It is not expected that the information provided by the renderer impacts the return values of the strategy. For example, the return value of `shouldSubscribeToParticipant` is not expected to change when `onParticipantPublishStateChanged` is called. If the host application wants to subscribe to a particular participant, it should return the desired subscription type regardless of that participant’s publish state. The SDK is responsible for ensuring that the desired state of the strategy is acted on at the correct time based on the state of the stage.

The `StageRenderer` can be attached to the stage class:

```
stage.addRenderer(renderer); // multiple renderers can be added
```

Note that only publishing participants trigger `onParticipantJoined`, and whenever a participant stops publishing or leaves the stage session, `onParticipantLeft` is triggered.

## Publish a Media Stream
<a name="android-publish-subscribe-publish-stream"></a>

Local devices such as built-in microphones and cameras are discovered via `DeviceDiscovery`. Here is an example of selecting the front-facing camera and default microphone, then returning them as `LocalStageStream` objects to be published by the SDK:

```
DeviceDiscovery deviceDiscovery = new DeviceDiscovery(context);

List<Device> devices = deviceDiscovery.listLocalDevices();
List<LocalStageStream> publishStreams = new ArrayList<LocalStageStream>();

Device frontCamera = null;
Device microphone = null;

// Create streams using the front camera and first microphone
for (Device device : devices) {
	Device.Descriptor descriptor = device.getDescriptor();
	if (frontCamera == null && descriptor.type == Device.Descriptor.DeviceType.CAMERA && descriptor.position == Device.Descriptor.Position.FRONT) {
		frontCamera = device;
	}
	if (microphone == null && descriptor.type == Device.Descriptor.DeviceType.MICROPHONE) {
		microphone = device;
	}
}

ImageLocalStageStream cameraStream = new ImageLocalStageStream(frontCamera);
AudioLocalStageStream microphoneStream = new AudioLocalStageStream(microphone);

publishStreams.add(cameraStream);
publishStreams.add(microphoneStream);

// Provide the streams in Stage.Strategy
@Override
@NonNull List<LocalStageStream> stageStreamsToPublishForParticipant(@NonNull Stage stage, @NonNull ParticipantInfo participantInfo) {
	return publishStreams;
}
```

## Display and Remove Participants
<a name="android-publish-subscribe-participants"></a>

After subscribing is completed, you will receive a list of `StageStream` objects through the renderer’s `onStreamsAdded` function. You can retrieve the preview from an `ImageStageStream`:

```
ImagePreviewView preview = ((ImageStageStream)stream).getPreview();

// Add the view to your view hierarchy
LinearLayout previewHolder = findViewById(R.id.previewHolder);
preview.setLayoutParams(new LinearLayout.LayoutParams(
		LinearLayout.LayoutParams.MATCH_PARENT,
		LinearLayout.LayoutParams.MATCH_PARENT));
previewHolder.addView(preview);
```

You can retrieve the audio-level stats from an `AudioStageStream`:

```
((AudioStageStream)stream).setStatsCallback((peak, rms) -> {
	// handle statistics
});
```

When a participant stops publishing or is unsubscribed from, the `onStreamsRemoved` function is called with the streams that were removed. Host applications should use this as a signal to remove the participant’s video stream from the view hierarchy.

`onStreamsRemoved` is invoked for all scenarios in which a stream might be removed, including: 
+ The remote participant stops publishing.
+ The local participant unsubscribes or changes subscription from `AUDIO_VIDEO` to `AUDIO_ONLY`.
+ The remote participant leaves the stage.
+ The local participant leaves the stage.

Because `onStreamsRemoved` is invoked for all scenarios, no custom business logic is required around removing participants from the UI during remote or local leave operations.
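One common pattern is to key preview views by participant ID, so that `onStreamsAdded` and `onStreamsRemoved` map directly onto add/remove operations on the view hierarchy. A sketch of the bookkeeping only (view objects replaced with strings; none of these names are SDK API):

```java
import java.util.HashMap;
import java.util.Map;

class ParticipantViewRegistry {
    // participantId -> placeholder for the attached preview view
    private final Map<String, String> previews = new HashMap<>();

    void onStreamsAdded(String participantId, String previewView) {
        previews.put(participantId, previewView); // previewHolder.addView(...) in a real app
    }

    void onStreamsRemoved(String participantId) {
        previews.remove(participantId); // previewHolder.removeView(...) in a real app
    }

    int attachedCount() {
        return previews.size();
    }
}
```

Because `onStreamsRemoved` fires for every removal scenario, this single removal path covers remote leaves, local leaves, and unpublish events alike.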

## Mute and Unmute Media Streams
<a name="android-publish-subscribe-mute-streams"></a>

`LocalStageStream` objects have a `setMuted` function that controls whether the stream is muted. This function can be called on the stream before or after it is returned from the `stageStreamsToPublishForParticipant` strategy function.

**Important**: If a new `LocalStageStream` object instance is returned by `stageStreamsToPublishForParticipant` after a call to `refreshStrategy`, the mute state of the new stream object is applied to the stage. Be careful when creating new `LocalStageStream` instances, to make sure the expected mute state is maintained.

## Monitor Remote Participant Media Mute State
<a name="android-publish-subscribe-mute-state"></a>

When a participant changes the mute state of their video or audio stream, the renderer’s `onStreamsMutedChanged` function is invoked with a list of streams that have changed. Use the `getMuted` method on `StageStream` to update your UI accordingly.

```
@Override
void onStreamsMutedChanged(@NonNull Stage stage, @NonNull ParticipantInfo participantInfo, @NonNull List<StageStream> streams) {
	for (StageStream stream : streams) {
		boolean muted = stream.getMuted();
		// handle UI changes
	}
}
```

## Get WebRTC Statistics
<a name="android-publish-subscribe-webrtc-stats"></a>

To get the latest WebRTC statistics for a publishing stream or a subscribing stream, use `requestRTCStats` on `StageStream`. When a collection is completed, you will receive statistics through the `StageStream.Listener` which can be set on `StageStream`.

```
stream.requestRTCStats();

@Override
void onRTCStats(Map<String, Map<String, String>> statsMap) {
	for (Map.Entry<String, Map<String, String>> stat : statsMap.entrySet()) {
		for (Map.Entry<String, String> member : stat.getValue().entrySet()) {
			Log.i(TAG, stat.getKey() + " has member " + member.getKey() + " with value " + member.getValue());
		}
	}
		}
	}
}
```
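The nested stats map can be flattened into single strings for logging or upload. A small plain-Java helper mirroring the loop above (the class name is illustrative, not SDK API):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

class RtcStatsFormatter {
    // Flattens {statGroup -> {member -> value}} into "group.member=value" lines.
    static List<String> flatten(Map<String, Map<String, String>> statsMap) {
        List<String> lines = new ArrayList<>();
        for (Map.Entry<String, Map<String, String>> stat : statsMap.entrySet()) {
            for (Map.Entry<String, String> member : stat.getValue().entrySet()) {
                lines.add(stat.getKey() + "." + member.getKey() + "=" + member.getValue());
            }
        }
        return lines;
    }
}
```

In an `onRTCStats` callback, you could pass `statsMap` straight to `flatten` and emit one log line per entry.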

## Get Participant Attributes
<a name="android-publish-subscribe-participant-attributes"></a>

If you specify attributes in the `CreateParticipantToken` operation request, you can see the attributes in `ParticipantInfo` properties:

```
@Override
void onParticipantJoined(@NonNull Stage stage, @NonNull ParticipantInfo participantInfo) {
	for (Map.Entry<String, String> entry : participantInfo.userInfo.entrySet()) {
		Log.i(TAG, "attribute: " + entry.getKey() + " = " + entry.getValue());
	}
}
```

## Embed Messages
<a name="android-publish-subscribe-embed-messages"></a>

The `embedMessage` method on `ImageDevice` allows you to insert metadata payloads directly into video frames during publishing. This enables frame-synchronized messaging for real-time applications. Message embedding is available only when using the SDK for real-time publishing (not low-latency publishing).

Embedded messages are not guaranteed to arrive to subscribers because they are embedded directly within video frames and transmitted over UDP, which does not guarantee packet delivery. Packet loss during transmission can result in lost messages, especially in poor network conditions. To mitigate this, the `embedMessage` method includes a `repeatCount` parameter that duplicates the message across multiple consecutive frames, increasing delivery reliability. This capability is available only for video streams.

### Using embedMessage
<a name="android-embed-messages-using-embedmessage"></a>

Publishing clients can embed message payloads into their video stream using the `embedMessage` method on `ImageDevice`. The payload size must be greater than 0 KB and less than 1 KB. The number of embedded messages inserted per second must not exceed 10.

```
val surfaceSource: SurfaceSource = imageStream.device as SurfaceSource
val message = "hello world"
val messageBytes = message.toByteArray(StandardCharsets.UTF_8)

try {
    surfaceSource.embedMessage(messageBytes, 0)
} catch (e: BroadcastException) {
    Log.e("EmbedMessage", "Failed to embed message: ${e.message}")
}
```

### Repeating Message Payloads
<a name="android-embed-messages-repeat-payloads"></a>

Use `repeatCount` to duplicate the message across multiple frames for improved reliability. This value must be between 0 and 30. Receiving clients must have logic to de-duplicate the message.

```
try {
    surfaceSource.embedMessage(messageBytes, 5)
    // repeatCount: 0-30, receiving clients should handle duplicates
} catch (e: BroadcastException) {
    Log.e("EmbedMessage", "Failed to embed message: ${e.message}")
}
```
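Because repeated payloads arrive on consecutive frames, receivers can drop duplicates by remembering recently seen payloads. A minimal de-duplication sketch in plain Java (the class name and windowing approach are illustrative, not SDK API):

```java
import java.util.ArrayDeque;
import java.util.Arrays;
import java.util.Deque;

class EmbeddedMessageDeduplicator {
    private final Deque<String> recent = new ArrayDeque<>();
    private final int capacity;

    EmbeddedMessageDeduplicator(int capacity) {
        this.capacity = capacity;
    }

    // Returns true only the first time a payload is seen within the window.
    boolean accept(byte[] payload) {
        String key = Arrays.toString(payload);
        if (recent.contains(key)) {
            return false; // duplicate delivered by a repeated frame
        }
        recent.addLast(key);
        if (recent.size() > capacity) {
            recent.removeFirst(); // bound memory with a sliding window
        }
        return true;
    }
}
```

A window a bit larger than the maximum `repeatCount` (30) is enough to absorb all repeats of a message. If identical payloads are sent intentionally over time, include a sequence number or UUID in the payload so they are not discarded as duplicates.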

### Reading Embedded Messages
<a name="android-embed-messages-read-messages"></a>

See "Get Supplemental Enhancement Information (SEI)" below for how to read embedded messages from incoming streams.

## Get Supplemental Enhancement Information (SEI)
<a name="android-publish-subscribe-sei-attributes"></a>

The Supplemental Enhancement Information (SEI) NAL unit is used to store frame-aligned metadata alongside the video. Subscribing clients can read SEI payloads from a publisher who is publishing H.264 video by inspecting the `embeddedMessages` property on the `ImageDeviceFrame` objects coming out of the publisher’s `ImageDevice`. To do this, acquire a publisher’s `ImageDevice`, then observe each frame via a callback provided to `setOnFrameCallback`, as shown in the following example:

```
// in a StageRenderer's onStreamsAdded function, after acquiring the new ImageStream

val imageDevice = imageStream.device as ImageDevice
imageDevice.setOnFrameCallback(object : ImageDevice.FrameCallback {
    override fun onFrame(frame: ImageDeviceFrame) {
        for (message in frame.embeddedMessages) {
            if (message is UserDataUnregisteredSeiMessage) {
                val seiMessageBytes = message.data
                val seiMessageUUID = message.uuid

                // interpret the message's data based on the UUID
            }
        }
    }
})
```

## Continue Session in the Background
<a name="android-publish-subscribe-background-session"></a>

When the app enters the background, you may want to stop publishing or subscribe only to other remote participants’ audio. To accomplish this, update your `Strategy` implementation to stop publishing, and subscribe to `AUDIO_ONLY` (or `NONE`, if applicable).

```
// Local variables before going into the background
boolean shouldPublish = true;
Stage.SubscribeType subscribeType = Stage.SubscribeType.AUDIO_VIDEO;

// Stage.Strategy implementation
@Override
boolean shouldPublishFromParticipant(@NonNull Stage stage, @NonNull ParticipantInfo participantInfo) {
	return shouldPublish;
}

@Override
Stage.SubscribeType shouldSubscribeToParticipant(@NonNull Stage stage, @NonNull ParticipantInfo participantInfo) {
	return subscribeType;
}

// In our Activity, modify desired publish/subscribe when we go to background, then call refreshStrategy to update the stage
@Override
void onStop() {
	super.onStop();
	shouldPublish = false;
	subscribeType = Stage.SubscribeType.AUDIO_ONLY;
	stage.refreshStrategy();
}
```

## Layered Encoding with Simulcast
<a name="android-publish-subscribe-layered-encoding-simulcast"></a>

Layered encoding with simulcast is an IVS real-time streaming feature that allows publishers to send multiple different quality layers of video, and subscribers to dynamically or manually configure those layers. The feature is described more in the [Streaming Optimizations](real-time-streaming-optimization.md) document.

### Configuring Layered Encoding (Publisher)
<a name="android-layered-encoding-simulcast-configure-publisher"></a>

As a publisher, to enable layered encoding with simulcast, add the following configuration to your `LocalStageStream` on instantiation:

```
// Enable Simulcast
StageVideoConfiguration config = new StageVideoConfiguration();
config.simulcast.setEnabled(true);

ImageLocalStageStream cameraStream = new ImageLocalStageStream(frontCamera, config);

// Other Stage implementation code
```

Depending on the resolution you set on video configuration, a set number of layers will be encoded and sent as defined in the [Default Layers, Qualities, and Framerates](real-time-streaming-optimization.md#real-time-streaming-optimization-default-layers) section of *Streaming Optimizations*.

Also, you can optionally configure individual layers from within the simulcast configuration: 

```
// Enable Simulcast
StageVideoConfiguration config = new StageVideoConfiguration();
config.simulcast.setEnabled(true);

List<StageVideoConfiguration.Simulcast.Layer> simulcastLayers = new ArrayList<>();
simulcastLayers.add(StagePresets.SimulcastLocalLayer.DEFAULT_720);
simulcastLayers.add(StagePresets.SimulcastLocalLayer.DEFAULT_180);

config.simulcast.setLayers(simulcastLayers);

ImageLocalStageStream cameraStream = new ImageLocalStageStream(frontCamera, config);

// Other Stage implementation code
```

Alternatively, you can create your own custom layer configurations for up to three layers. If you provide an empty list or no value, the defaults described above are used. Layers are described with the following required property setters:
+ `setSize` (`Vec2`)
+ `setMaxBitrate` (integer)
+ `setMinBitrate` (integer)
+ `setTargetFramerate` (integer)

Starting from the presets, you can either override individual properties or create an entirely new configuration:

```
// Enable Simulcast
StageVideoConfiguration config = new StageVideoConfiguration();
config.simulcast.setEnabled(true);

List<StageVideoConfiguration.Simulcast.Layer> simulcastLayers = new ArrayList<>();

// Configure high quality layer with custom framerate
StageVideoConfiguration.Simulcast.Layer customHiLayer = StagePresets.SimulcastLocalLayer.DEFAULT_720;
customHiLayer.setTargetFramerate(15);

// Add layers to the list
simulcastLayers.add(customHiLayer);
simulcastLayers.add(StagePresets.SimulcastLocalLayer.DEFAULT_180);

config.simulcast.setLayers(simulcastLayers);

ImageLocalStageStream cameraStream = new ImageLocalStageStream(frontCamera, config);

// Other Stage implementation code
```

For maximum values, limits, and errors which can be triggered when configuring individual layers, see the SDK reference documentation.

### Configuring Layered Encoding (Subscriber)
<a name="android-layered-encoding-simulcast-configure-subscriber"></a>

As a subscriber, you do not need to do anything to enable layered encoding. If a publisher is sending simulcast layers, then by default the server dynamically adapts between the layers to choose the optimal quality based on the subscriber's device and network conditions.

Alternatively, to pick explicit layers that the publisher is sending, there are several options, described below.

### Option 1: Initial Layer Quality Preference
<a name="android-layered-encoding-simulcast-layer-quality-preference"></a>

Using the `subscribeConfigurationForParticipant` strategy, it is possible to choose what initial layer you want to receive as a subscriber:

```
@Override
public SubscribeConfiguration subscribeConfigurationForParticipant(@NonNull Stage stage, @NonNull ParticipantInfo participantInfo) {
    SubscribeConfiguration config = new SubscribeConfiguration();

    config.simulcast.setInitialLayerPreference(SubscribeSimulcastConfiguration.InitialLayerPreference.LOWEST_QUALITY);

    return config;
}
```

By default, subscribers are always sent the lowest quality layer first; delivery then ramps up to the highest quality layer. This optimizes end-user bandwidth consumption and provides the best time to video, reducing initial video freezes for users on weaker networks.

These options are available for `InitialLayerPreference`:
+ `LOWEST_QUALITY` — The server delivers the lowest quality layer of video first. This optimizes bandwidth consumption, as well as time to media. Quality is defined as the combination of size, bitrate, and framerate of the video. For example, 720p video is lower quality than 1080p video.
+ `HIGHEST_QUALITY` — The server delivers the highest quality layer of video first. This optimizes quality but may increase the time to media. Quality is defined as the combination of size, bitrate, and framerate of the video. For example, 1080p video is higher quality than 720p video.

**Note:** For initial layer preferences (the `setInitialLayerPreference` call) to take effect, a re-subscribe is necessary as these updates do not apply to the active subscription.
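
One way to force a re-subscribe is to stop subscribing to the participant, refresh the strategy, and then subscribe again. A minimal sketch, assuming your strategy consults a host-application flag (`resubscribing` is a hypothetical name):

```
// In your Stage.Strategy implementation:
@Override
public Stage.SubscribeType shouldSubscribeToParticipant(@NonNull Stage stage, @NonNull ParticipantInfo participantInfo) {
    return resubscribing ? Stage.SubscribeType.NONE : Stage.SubscribeType.AUDIO_VIDEO;
}

// Elsewhere in the host application:
resubscribing = true;
stage.refreshStrategy(); // unsubscribes
resubscribing = false;
stage.refreshStrategy(); // re-subscribes with the new initial layer preference
```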

### Option 2: Preferred Layer for Stream
<a name="android-layered-encoding-simulcast-preferred-layer"></a>

The `preferredLayerForStream` strategy method lets you select a layer after the stream has started. This strategy method receives the participant and the stream information, so you can select a layer on a participant-by-participant basis. The SDK calls this method in response to specific events, such as when stream layers change, the participant state changes, or the host application refreshes the strategy.

The strategy method returns a `RemoteStageStream.Layer` object, which can be one of the following:
+ A layer object, such as one returned by `RemoteStageStream.getLayers`.
+ null, which indicates that no layer should be selected and dynamic adaption is preferred.

For example, the following strategy always selects the lowest quality layer of video available for each participant:

```
@Nullable
@Override
public RemoteStageStream.Layer preferredLayerForStream(@NonNull Stage stage, @NonNull ParticipantInfo participantInfo, @NonNull RemoteStageStream stream) {
    return stream.getLowestQualityLayer();
}
```

To reset the layer selection and return to dynamic adaption, return null from the strategy. In this example, `appState` is a placeholder variable that represents the host application’s state.

```
@Nullable
@Override
public RemoteStageStream.Layer preferredLayerForStream(@NonNull Stage stage, @NonNull ParticipantInfo participantInfo, @NonNull RemoteStageStream stream) {
    if (appState.isAutoMode) {
        return null;
    } else {
        return appState.layerChoice;
    }
}
```

### Option 3: RemoteStageStream Layer Helpers
<a name="android-layered-encoding-simulcast-remotestagestream-helpers"></a>

`RemoteStageStream` has several helpers which can be used to make decisions about layer selection and display the corresponding selections to end users:
+ **Layer Events** — Alongside `StageRenderer`, the `RemoteStageStream.Listener` has events which communicate layer and simulcast adaption changes:
  + `void onAdaptionChanged(boolean adaption)`
  + `void onLayersChanged(@NonNull List<Layer> layers)`
  + `void onLayerSelected(@Nullable Layer layer, @NonNull LayerSelectedReason reason)`
+ **Layer Methods** — `RemoteStageStream` has several helper methods which can be used to get information about the stream and the layers being presented. These methods are available on the remote stream provided in the `preferredLayerForStream` strategy, as well as remote streams exposed via `StageRenderer.onStreamsAdded`.
  + `stream.getLayers`
  + `stream.getSelectedLayer`
  + `stream.getLowestQualityLayer`
  + `stream.getHighestQualityLayer`
  + `stream.getLayersWithConstraints`

For details, see the `RemoteStageStream` class in the [SDK reference documentation](https://aws.github.io/amazon-ivs-broadcast-docs/latest/android/). If the `LayerSelectedReason` is `UNAVAILABLE`, the requested layer could not be selected; a best-effort selection is made in its place, typically a lower quality layer, to maintain stream stability.
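
As a sketch of wiring these events up, a listener might be attached to a remote stream received in `StageRenderer.onStreamsAdded`. The `setListener` attachment method is assumed from the base `StageStream` API; verify it against the reference documentation.

```
// Hypothetical sketch: observing layer changes on a remote stream.
remoteStream.setListener(new RemoteStageStream.Listener() {
    @Override
    public void onAdaptionChanged(boolean adaption) {
        // true means the server is dynamically adapting between layers
    }

    @Override
    public void onLayersChanged(@NonNull List<RemoteStageStream.Layer> layers) {
        // e.g., rebuild a quality-selection menu for the end user
    }

    @Override
    public void onLayerSelected(@Nullable RemoteStageStream.Layer layer,
                                @NonNull RemoteStageStream.LayerSelectedReason reason) {
        // e.g., highlight the active layer; check for UNAVAILABLE here
    }
});
```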

## Video-Configuration Limitations
<a name="android-publish-subscribe-video-limits"></a>

The SDK does not support forcing portrait mode or landscape mode using `StageVideoConfiguration.setSize(BroadcastConfiguration.Vec2 size)`. In portrait orientation, the smaller dimension is used as the width; in landscape orientation, it is used as the height. This means that the following two calls to `setSize` have the same effect on the video configuration:

```
StageVideoConfiguration config = new StageVideoConfiguration();

config.setSize(new BroadcastConfiguration.Vec2(720f, 1280f));
config.setSize(new BroadcastConfiguration.Vec2(1280f, 720f));
```

## Handling Network Issues
<a name="android-publish-subscribe-network-issues"></a>

When the local device’s network connection is lost, the SDK internally tries to reconnect without any user action. In some cases, the SDK is not successful and user action is needed. There are two main errors related to losing the network connection:
+ Error code 1400, message: "PeerConnection is lost due to unknown network error"
+ Error code 1300, message: "Retry attempts are exhausted"

If the first error is received but the second is not, the SDK is still connected to the stage and will try to reestablish its connections automatically. As a safeguard, you can call `refreshStrategy` without any changes to the strategy method’s return values, to trigger a manual reconnect attempt.

If the second error is received, the SDK’s reconnect attempts have failed and the local device is no longer connected to the stage. In this case, try to rejoin the stage by calling `join` after your network connection has been reestablished.

In general, encountering errors after joining a stage successfully indicates that the SDK was unsuccessful in reestablishing a connection. Create a new `Stage` object and try to join when network conditions improve.
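
The steps above can be sketched in a `StageRenderer.onError` handler. The error-code accessor (`getCode()`) and the host-application helpers (`stage`, `token`, `isNetworkAvailable()`) are assumptions for illustration; check the `BroadcastException` API in the SDK reference documentation.

```
@Override
public void onError(@NonNull BroadcastException exception) {
    if (exception.getCode() == 1400) {
        // Still joined; the SDK retries on its own. Optionally nudge it:
        stage.refreshStrategy();
    } else if (exception.getCode() == 1300) {
        // Retry attempts are exhausted; rejoin once connectivity returns.
        if (isNetworkAvailable()) {
            try {
                stage.join(); // arguments, if any, depend on your integration
            } catch (BroadcastException e) {
                // Handle a malformed or expired token, etc.
            }
        }
    }
}
```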

## Using Bluetooth Microphones
<a name="android-publish-subscribe-bluetooth-microphones"></a>

To publish using Bluetooth microphone devices, you must start a Bluetooth SCO connection:

```
Bluetooth.startBluetoothSco(context);
// Now bluetooth microphones can be used
…
// Must also stop bluetooth SCO
Bluetooth.stopBluetoothSco(context);
```

# Known Issues & Workarounds in the IVS Android Broadcast SDK | Real-Time Streaming
<a name="broadcast-android-known-issues"></a>

This document lists known issues that you might encounter when using the Amazon IVS real-time streaming Android broadcast SDK and suggests potential workarounds.
+ When an Android device goes to sleep and wakes up, it is possible for the preview to be in a frozen state.

  **Workaround:** Create and use a new `Stage`.
+ When a participant joins with a token that is being used by another participant, the first connection is disconnected without a specific error.

  **Workaround:** None. 
+ There is a rare issue where the publisher is publishing but the publish state that subscribers receive is `inactive`.

  **Workaround:** Try leaving and then joining the session. If the issue remains, create a new token for the publisher.
+ A rare audio-distortion issue may occur intermittently during a stage session, typically on calls of longer durations.

  **Workaround:** The participant with distorted audio can either leave and rejoin the session, or unpublish and republish their audio to fix the issue.
+ External microphones are not supported when publishing to a stage.

  **Workaround:** Do not use an external microphone connected via USB for publishing to a stage.
+ Publishing to a stage with screen share using `createSystemCaptureSources` is not supported.

  **Workaround:** Manage the system capture manually, using custom image-input sources and custom audio-input sources.
+ When an `ImagePreviewView` is removed from a parent (e.g., `removeView()` is called at the parent), the `ImagePreviewView` is released immediately. The `ImagePreviewView` does not show any frames when it is added to another parent view.

  **Workaround:** Request another preview using `getPreview`.
+ When joining a stage with a Samsung Galaxy S22-series device running Android 12, you may encounter a 1401 error; the local device fails to join the stage, or joins but has no audio.

  **Workaround:** Upgrade to Android 13.
+ When joining a stage with a Nokia X20 on Android 13, the camera may fail to open and an exception is thrown.

  **Workaround:** None.
+ Devices with the MediaTek Helio chipset may not render video of remote participants properly.

  **Workaround:** None.
+ On a few devices, the device OS may choose a different microphone than what’s selected through the SDK. This is because the Amazon IVS Broadcast SDK cannot control how the `VOICE_COMMUNICATION` audio route is defined, as it varies according to different device manufacturers.

  **Workaround:** None.
+ Some Android video encoders cannot be configured with a video size less than 176x176. Configuring a smaller size causes an error and prevents streaming.

  **Workaround:** Do not configure the video size to be less than 176x176.

# Error Handling in the IVS Android Broadcast SDK | Real-Time Streaming
<a name="broadcast-android-error-handling"></a>

This section is an overview of error conditions, how the IVS real-time streaming Android broadcast SDK reports them to the application, and what an application should do when those errors are encountered.

## Fatal vs. Non-Fatal Errors
<a name="broadcast-android-fatal-vs-nonfatal-errors"></a>

The `BroadcastException` error object has a boolean `isFatal` field.

In general, fatal errors are related to connection to the Stages server (either a connection cannot be established or is lost and cannot be recovered). The application should re-create the stage and re-join, possibly with a new token or when the device’s connectivity recovers.

Non-fatal errors generally are related to the publish/subscribe state and are handled by the SDK, which retries the publish/subscribe operation.

You can check this property:

```
try {
  stage.join(...)
} catch (e: BroadcastException) {
  if (e.isFatal) {
    // the error is fatal
  }
}
```

## Join Errors
<a name="broadcast-android-stage-join-errors"></a>

### Malformed Token
<a name="broadcast-android-stage-join-errors-malformed-token"></a>

This happens when the stage token is malformed.

The SDK throws a Java exception from a call to `stage.join`, with error code = 1000 and fatal = true.

**Action**: Create a valid token and retry joining.

### Expired Token
<a name="broadcast-android-stage-join-errors-expired-token"></a>

This happens when the stage token is expired.

The SDK throws a Java exception from a call to `stage.join`, with error code = 1001 and fatal = true.

**Action**: Create a new token and retry joining.

### Invalid or Revoked Token
<a name="broadcast-android-stage-join-errors-invalid-token"></a>

This happens when the stage token is not malformed but is rejected by the Stages server. This is reported asynchronously through the application-supplied stage renderer.

The SDK calls `onConnectionStateChanged` with an exception, with error code = 1026 and fatal = true.

**Action**: Create a valid token and retry joining.

### Network Errors for Initial Join
<a name="broadcast-android-stage-join-errors-network-initial-join"></a>

This happens when the SDK cannot contact the Stages server to establish a connection. This is reported asynchronously through the application-supplied stage renderer.

The SDK calls `onConnectionStateChanged` with an exception, with error code = 1300 and fatal = true.

**Action**: Wait for the device’s connectivity to recover and retry joining.

### Network Errors when Already Joined
<a name="broadcast-android-stage-join-errors-network-already-joined"></a>

If the device’s network connection goes down, the SDK may lose its connection to the Stages server. This is reported asynchronously through the application-supplied stage renderer.

The SDK calls `onConnectionStateChanged` with an exception, with error code = 1300 and fatal = true.

**Action**: Wait for the device’s connectivity to recover and retry joining.

## Publish/Subscribe Errors
<a name="broadcast-android-publish-subscribe-errors"></a>

### Initial
<a name="broadcast-android-publish-subscribe-errors-initial"></a>

There are several errors:
+ MultihostSessionOfferCreationFailPublish (1020)
+ MultihostSessionOfferCreationFailSubscribe (1021)
+ MultihostSessionNoIceCandidates (1022)
+ MultihostSessionStageAtCapacity (1024)
+ SignallingSessionCannotRead (1201)
+ SignallingSessionCannotSend (1202)
+ SignallingSessionBadResponse (1203)

These are reported asynchronously through the application-supplied stage renderer.

The SDK retries the operation for a limited number of times. During retries, the publish/subscribe state is `ATTEMPTING_PUBLISH` / `ATTEMPTING_SUBSCRIBE`. If the retry attempts succeed, the state changes to `PUBLISHED` / `SUBSCRIBED`.

The SDK calls `onError` with the relevant error code and fatal = false.

**Action**: No action is needed, as the SDK retries automatically. Optionally, the application can refresh the strategy to force more retries.

### Already Established, Then Fail
<a name="broadcast-android-publish-subscribe-errors-established"></a>

A publish or subscribe can fail after it is established, most likely due to a network error. The error code for a "peer connection lost due to network error" is 1400.

This is reported asynchronously through the application-supplied stage renderer.

The SDK retries the publish/subscribe operation. During retries, the publish/subscribe state is `ATTEMPTING_PUBLISH` / `ATTEMPTING_SUBSCRIBE`. If the retry attempts succeed, the state changes to `PUBLISHED` / `SUBSCRIBED`.

The SDK calls `onError` with the error code = 1400 and fatal = false.

**Action**: No action is needed, as the SDK retries automatically. Optionally, the application can refresh the strategy to force more retries. In the event of total connectivity loss, it’s likely that the connection to Stages will fail too.