

# IVS Broadcast SDK: Android Guide (Low-Latency Streaming)
<a name="broadcast-android"></a>

The IVS Low-Latency Streaming Android Broadcast SDK provides the interfaces required to broadcast to IVS on Android.

The `com.amazonaws.ivs.broadcast` package implements the interface described in this document. The following operations are supported: 
+ Set up (initialize) a broadcast session. 
+ Manage broadcasting.
+ Attach and detach input devices.
+ Manage a composition session. 
+ Receive events. 
+ Receive errors. 

**Latest version of Android broadcast SDK:** 1.41.0 ([Release Notes](https://docs.aws.amazon.com/ivs/latest/LowLatencyUserGuide/release-notes.html#apr09-26-broadcast-mobile-ll))

**Reference documentation:** For information on the most important methods available in the Amazon IVS Android broadcast SDK, see the reference documentation at [https://aws.github.io/amazon-ivs-broadcast-docs/1.41.0/android/](https://aws.github.io/amazon-ivs-broadcast-docs/1.41.0/android/).

**Sample code:** See the Android sample repository on GitHub: [https://github.com/aws-samples/amazon-ivs-broadcast-android-sample](https://github.com/aws-samples/amazon-ivs-broadcast-android-sample).

**Platform requirements:** Android 9.0 or later

# Getting Started with the IVS Android Broadcast SDK (Low-Latency Streaming)
<a name="broadcast-android-getting-started"></a>

This document takes you through the steps involved in getting started with the Amazon IVS low-latency streaming Android broadcast SDK.

## Install the Library
<a name="broadcast-android-install"></a>

To add the Amazon IVS Android broadcast library to your Android development environment, add the library to your module’s `build.gradle` file, as shown here (for the latest version of the Amazon IVS broadcast SDK):

```
repositories {
    mavenCentral()
}
dependencies {
     implementation 'com.amazonaws:ivs-broadcast:1.41.0'
}
```

Alternately, to install the SDK manually, download the latest version from this location:
+ [https://search.maven.org/artifact/com.amazonaws/ivs-broadcast](https://search.maven.org/artifact/com.amazonaws/ivs-broadcast)

## Using the SDK with Debug Symbols
<a name="broadcast-android-using-debug-symbols-ll"></a>

We also publish a version of the Android broadcast SDK that includes debug symbols. Use this version to improve the quality of crash reports (stack traces) in Firebase Crashlytics if you run into crashes inside the IVS broadcast SDK's native library, `libbroadcastcore.so`. When you report these crashes to the IVS SDK team, the higher-quality stack traces make the issues easier to diagnose and fix.

To use this version of the SDK, put the following in your Gradle build files:

```
implementation "com.amazonaws:ivs-broadcast:$version:unstripped@aar"
```

Use the above line instead of this:

```
implementation "com.amazonaws:ivs-broadcast:$version@aar"
```

### Uploading Symbols to Firebase Crashlytics
<a name="android-debug-symbols-ll-firebase-crashlytics"></a>

Ensure that your Gradle build files are set up for Firebase Crashlytics. Follow Google’s instructions here:

[https://firebase.google.com/docs/crashlytics/ndk-reports](https://firebase.google.com/docs/crashlytics/ndk-reports)

Be sure to include `com.google.firebase:firebase-crashlytics-ndk` as a dependency.

When building your app for release, the Firebase Crashlytics plugin should upload symbols automatically. To upload symbols manually, run either of the following:

```
gradle uploadCrashlyticsSymbolFileRelease
```

```
./gradlew uploadCrashlyticsSymbolFileRelease
```

(It will not hurt if symbols are uploaded twice, both automatically and manually.)

### Preventing your Release .apk from Becoming Larger
<a name="android-debug-symbols-ll-sizing-apk"></a>

Before packaging the release `.apk` file, the Android Gradle Plugin automatically tries to strip debug information from shared libraries (including the IVS broadcast SDK's `libbroadcastcore.so` library). However, sometimes this does not happen. As a result, your `.apk` file could become larger and you could get a warning message from the Android Gradle Plugin that it’s unable to strip debug symbols and is packaging `.so` files as is. If this happens, do the following:
+ Install an Android NDK. Any recent version will work.
+ Add `ndkVersion <your_installed_ndk_version_number>` to your application’s `build.gradle` file. Do this even if your application itself does not contain native code.

For more information, see this [issue report](https://issuetracker.google.com/issues/353554169).

## Create the Event Listener
<a name="broadcast-android-create-event-listener"></a>

Setting up an event listener allows you to receive state updates, device-change notifications, errors, and session-audio information.

```
BroadcastSession.Listener broadcastListener = 
          new BroadcastSession.Listener() {
    @Override
    public void onStateChanged(@NonNull BroadcastSession.State state) {
        Log.d(TAG, "State=" + state);
    }

    @Override
    public void onError(@NonNull BroadcastException exception) {
        Log.e(TAG, "Exception: " + exception);
    }
};
```
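
The listener also exposes callbacks for device changes and session-audio information. Below is a sketch of device-change overrides you could add to the listener above; the callback names (`onDeviceAdded`, `onDeviceRemoved`) are referenced later in this guide, but the parameter type shown here (`Device.Descriptor`) is an assumption to verify against the reference documentation.

```
// Sketch: optional device-change overrides for the listener above.
// Note: the parameter type (Device.Descriptor) is assumed here; check the
// reference documentation for the exact signatures.
@Override
public void onDeviceAdded(@NonNull Device.Descriptor descriptor) {
    Log.d(TAG, "Device added: " + descriptor.type);
}

@Override
public void onDeviceRemoved(@NonNull Device.Descriptor descriptor) {
    Log.d(TAG, "Device removed: " + descriptor.type);
}
```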

## Request Permissions
<a name="broadcast-android-permissions"></a>

Your app must request permission to access the user’s camera and mic. (This is not specific to Amazon IVS; it is required for any application that needs access to cameras and microphones.)

Here, we check whether the user has already granted permissions and, if not, ask for them:

```
final String[] requiredPermissions =
         { Manifest.permission.CAMERA, Manifest.permission.RECORD_AUDIO };

for (String permission : requiredPermissions) {
    if (ContextCompat.checkSelfPermission(this, permission) 
                != PackageManager.PERMISSION_GRANTED) {
        // If any permissions are missing we want to just request them all.
        ActivityCompat.requestPermissions(this, requiredPermissions, 0x100);
        break;
    }
}
```

Here, we get the user’s response:

```
@Override
public void onRequestPermissionsResult(int requestCode, 
                                      @NonNull String[] permissions,
                                      @NonNull int[] grantResults) {
    super.onRequestPermissionsResult(requestCode,
               permissions, grantResults);
    if (requestCode == 0x100) {
        for (int result : grantResults) {
            if (result == PackageManager.PERMISSION_DENIED) {
                return;
            }
        }
        setupBroadcastSession();
    }
}
```

## Create the Broadcast Session
<a name="broadcast-android-create-session"></a>

The broadcast interface is `com.amazonaws.ivs.broadcast.BroadcastSession`. Initialize it with a preset, as shown below. If there are any errors during initialization (such as a failure to configure a codec), your broadcast listener will receive an error and `broadcastSession.isReady` will be `false`.

**Important:** All calls to the Amazon IVS Broadcast SDK for Android *must* be made on the thread on which the SDK is instantiated. *A call from a different thread will cause the SDK to throw a fatal error and stop broadcasting*.

```
// Create a broadcast-session instance and sign up to receive broadcast
// events and errors.
Context ctx = getApplicationContext();
broadcastSession = new BroadcastSession(ctx,
                       broadcastListener,
                       Presets.Configuration.STANDARD_PORTRAIT,
                       Presets.Devices.FRONT_CAMERA(ctx));
```
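
Because every SDK call must stay on the thread that created the session (see the **Important** note above), one simple pattern is to post calls made from other threads back onto the main thread with a `Handler` bound to the main `Looper`. This is a minimal sketch using standard Android APIs, not something the SDK requires:

```
// Sketch: the session above was created on the main thread, so marshal calls
// made from other threads back onto the main thread before touching the SDK.
Handler mainHandler = new Handler(Looper.getMainLooper());
mainHandler.post(() -> broadcastSession.stop());
```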

Also see [Create the Broadcast Session (Advanced Version)](broadcast-android-use-cases.md#broadcast-android-create-session-advanced).

## Set the ImagePreviewView for Preview
<a name="broadcast-android-set-imagepreviewview"></a>

If you want to display a preview for an active camera device, add a preview `ImagePreviewView` for the device to your view hierarchy.

```
// awaitDeviceChanges will fire on the main thread after all pending device
// attachments have been completed.
broadcastSession.awaitDeviceChanges(() -> {
    for(Device device: broadcastSession.listAttachedDevices()) {
        // Find the camera we attached earlier
        if(device.getDescriptor().type == Device.Descriptor.DeviceType.CAMERA) {
            LinearLayout previewHolder = findViewById(R.id.previewHolder);
            ImagePreviewView preview = ((ImageDevice)device).getPreviewView();
            preview.setLayoutParams(new LinearLayout.LayoutParams(
                    LinearLayout.LayoutParams.MATCH_PARENT,
                    LinearLayout.LayoutParams.MATCH_PARENT));
            previewHolder.addView(preview);
        }
    }
});
```

## Start a Broadcast
<a name="broadcast-android-start"></a>

The hostname that you receive in the `ingestEndpoint` response field of the `GetChannel` operation needs to have `rtmps://` prepended and `/app` appended. The complete URL should be in this format: `rtmps://{{ ingestEndpoint }}/app`
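
For example, a minimal sketch of assembling the URL in code (the endpoint value below is a placeholder, not a real endpoint):

```
// Sketch: build the full RTMPS ingest URL from the ingestEndpoint value
// returned by GetChannel. Replace the placeholder with your channel's value.
String ingestEndpoint = "<ingestEndpoint from GetChannel>";
String IVS_RTMPS_URL = "rtmps://" + ingestEndpoint + "/app";
```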

```
broadcastSession.start(IVS_RTMPS_URL, IVS_STREAMKEY);
```

The Android broadcast SDK supports only RTMPS ingest (not insecure RTMP ingest).

## Stop a Broadcast
<a name="broadcast-android-stop"></a>

```
broadcastSession.stop();
```

## Release the Broadcast Session
<a name="broadcast-android-release-session"></a>

You *must call* the `broadcastSession.release()` method when the broadcast session is no longer in use, to free the resources used by the library.

```
@Override
protected void onDestroy() {
    super.onDestroy();
    previewHolder.removeAllViews();
    broadcastSession.release();
}
```

# Advanced Use Cases for the IVS Android Broadcast SDK (Low-Latency Streaming)
<a name="broadcast-android-use-cases"></a>

Here we present some advanced use cases. Start with the basic setup above and continue here. 

## Create a Broadcast Configuration
<a name="broadcast-android-create-configuration"></a>

Here we create a custom configuration with two mixer slots that allow us to bind two video sources to the mixer. One (`custom`) is full screen and laid out behind the other (`camera`), which is smaller and in the bottom-right corner. Note that for the `custom` slot we do not set a position, size, or aspect mode. Because we do not set these parameters, the slot will use the video settings for size and position.

```
BroadcastConfiguration config = BroadcastConfiguration.with($ -> {
    $.audio.setBitrate(128_000);
    $.video.setMaxBitrate(3_500_000);
    $.video.setMinBitrate(500_000);
    $.video.setInitialBitrate(1_500_000);
    $.video.setSize(1280, 720);
    $.mixer.slots = new BroadcastConfiguration.Mixer.Slot[] {
            BroadcastConfiguration.Mixer.Slot.with(slot -> {
                // Do not automatically bind to a source
                slot.setPreferredAudioInput(
                           Device.Descriptor.DeviceType.UNKNOWN);
                // Bind to user image if unbound
                slot.setPreferredVideoInput(
                           Device.Descriptor.DeviceType.USER_IMAGE);
                slot.setName("custom");
                return slot;
            }),
            BroadcastConfiguration.Mixer.Slot.with(slot -> {
                slot.setzIndex(1);
                slot.setAspect(BroadcastConfiguration.AspectMode.FILL);
                slot.setSize(300, 300);
                slot.setPosition($.video.getSize().x - 350,
                        $.video.getSize().y - 350);
                slot.setName("camera");
                return slot;
            })
    };
    return $;
});
```

## Create the Broadcast Session (Advanced Version)
<a name="broadcast-android-create-session-advanced"></a>

Create a `BroadcastSession` as you did in the [basic example](broadcast-android-getting-started.md#broadcast-android-create-session), but provide your custom configuration here. Also provide `null` for the device array, as we will add those manually.

```
// Create a broadcast-session instance and sign up to receive broadcast
// events and errors.
Context ctx = getApplicationContext();
broadcastSession = new BroadcastSession(ctx,
                       broadcastListener,
                       config, // The configuration we created above
                       null); // We’ll manually attach devices after
```

## Iterate and Attach a Camera Device
<a name="broadcast-android-attach-camera"></a>

Here we iterate through input devices that the SDK has detected. On Android 7 (Nougat) this will only return default microphone devices, because the Amazon IVS Broadcast SDK does not support selecting non-default devices on this version of Android.

Once we find a device that we want to use, we call `attachDevice` to attach it. A lambda function is called on the main thread when attaching the input device has completed. If attachment fails, you receive an error in the listener.

```
for(Device.Descriptor desc: BroadcastSession.listAvailableDevices(getApplicationContext())) {
    if(desc.type == Device.Descriptor.DeviceType.CAMERA &&
            desc.position == Device.Descriptor.Position.FRONT) {
        session.attachDevice(desc, device -> {
            LinearLayout previewHolder = findViewById(R.id.previewHolder);
            ImagePreviewView preview = ((ImageDevice)device).getPreviewView();
            preview.setLayoutParams(new LinearLayout.LayoutParams(
                    LinearLayout.LayoutParams.MATCH_PARENT,
                    LinearLayout.LayoutParams.MATCH_PARENT));
            previewHolder.addView(preview);
            // Bind the camera to the mixer slot we created above.
            session.getMixer().bind(device, "camera");
        });
        break;
    }
}
```

## Swap Cameras
<a name="broadcast-android-swap-cameras"></a>

```
// This assumes you’ve kept a reference called "currentCamera" that points to
// a front-facing camera device.
for(Device.Descriptor device: BroadcastSession.listAvailableDevices(getApplicationContext())) {
   if(device.type == Device.Descriptor.DeviceType.CAMERA &&
          device.position != currentCamera.getDescriptor().position) {
        // Remove the preview view for the old device.
        // setImagePreviewView is an example function
        // that handles your view hierarchy.
        setImagePreviewView(null);
        session.exchangeDevices(currentCamera, device, camera -> {
             // Set the preview view for the new device.
             setImagePreviewView(((ImageDevice) camera).getPreviewView());
             currentCamera = camera;
        });
        break;
   }
}
```

## Create an Input Surface
<a name="broadcast-android-create-input-surface"></a>

To input sound or image data that your app generates, use `createImageInputSource` or `createAudioInputSource`. Both these methods create and attach virtual devices that can be bound to the mixer like any other device.

The `SurfaceSource` returned by `createImageInputSource` has a `getInputSurface` method, which gives you a `Surface` that you can use with the Camera2 API, OpenGL, Vulkan, or anything else that can write to a Surface.

The `AudioDevice` returned by `createAudioInputSource` can receive Linear PCM data generated by AudioRecorder or other means.

```
SurfaceSource source = session.createImageInputSource();
Surface surface = source.getInputSurface();
session.getMixer().bind(source, "custom");
```
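
As a minimal illustration of "anything else that can write to a Surface," the sketch below renders a frame with the standard Android `Canvas` API. This is an assumption-level example, not SDK-specific API; camera or GL/Vulkan producers are more typical in practice.

```
// Sketch: draw a single solid-color frame into the input Surface obtained
// above, using android.graphics.Canvas. Real apps typically attach a Camera2
// output or a GL/Vulkan renderer instead.
Canvas canvas = surface.lockCanvas(null);
try {
    canvas.drawColor(Color.BLUE); // any Canvas drawing calls work here
} finally {
    surface.unlockCanvasAndPost(canvas);
}
```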

## Detach a Device
<a name="broadcast-android-detach-device"></a>

If you want to detach a device without replacing it, detach it using its `Device` or `Device.Descriptor`.

```
session.detachDevice(currentCamera);
```

## Screen and System Audio Capture
<a name="broadcast-android-screen-audio-capture"></a>

The Amazon IVS Broadcast SDK for Android includes some helpers that simplify capturing the device’s screen (Android 6 and higher) and system audio (Android 10 and higher). If you want to manage these manually, you can create a custom image-input source and a custom audio-input source.

To create a screen and system audio-capture session, you must first create a permission-request intent:

```
public void startScreenCapture() {
    MediaProjectionManager manager =
                         (MediaProjectionManager) getApplicationContext()
                         .getSystemService(Context.MEDIA_PROJECTION_SERVICE);
    if(manager != null) {
        Intent intent = manager.createScreenCaptureIntent();
        startActivityIfNeeded(intent, SCREEN_CAPTURE_REQUEST_ID);
    }
}
```

To use this feature, you must provide a class that extends `com.amazonaws.ivs.broadcast.SystemCaptureService`. You do not have to override any of its methods; the class simply needs to exist, to avoid potential collisions between services.
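
For example, a minimal subclass (matching the class name used in the manifest and capture code below) can be completely empty:

```
import com.amazonaws.ivs.broadcast.SystemCaptureService;

// Minimal subclass: no methods are overridden. The class only needs to exist
// so the SDK's capture runs under a service entry belonging to your app.
public class ExampleSystemCaptureService extends SystemCaptureService {
}
```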

You also must add a couple of elements to your Android manifest:

```
<uses-permission android:name="android.permission.FOREGROUND_SERVICE" />
<application ...>
    <service android:name=".ExampleSystemCaptureService"
         android:foregroundServiceType="mediaProjection" 
         android:isolatedProcess="false" />
</application>
...
```

Your class that extends `SystemCaptureService` must be named in the `<service>` element. On Android 9 and later, the `foregroundServiceType` must be `mediaProjection`.

Once the permissions intent has returned, you may proceed with creating the screen and system audio-capture session. On Android 8 and later, you must provide a notification to be displayed in your user’s Notification Panel. The Amazon IVS Broadcast SDK for Android provides the convenience method `createServiceNotificationBuilder`. Alternately, you may provide your own notification. 

```
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    super.onActivityResult(requestCode, resultCode, data);
    if(requestCode != SCREEN_CAPTURE_REQUEST_ID
       || Activity.RESULT_OK != resultCode) {
        return;
    }
    Notification notification = null;
    if(Build.VERSION.SDK_INT >= 26) {
        Intent intent = new Intent(getApplicationContext(),
                                   NotificationActivity.class);
        notification = session
                         .createServiceNotificationBuilder("example",
                                            "example channel", intent)
                         .build();
    }
    session.createSystemCaptureSources(data,
                  ExampleSystemCaptureService.class,
                  notification,
                  devices -> {
        // This step is optional if the mixer slots have been given preferred
        // input device types SCREEN and SYSTEM_AUDIO
        for (Device device : devices) {
            session.getMixer().bind(device, "game");
        }
    });
}
```

## Get Recommended Broadcast Settings
<a name="broadcast-android-recommended-settings"></a>

To evaluate your user’s connection before starting a broadcast, use the `recommendedVideoSettings` method to run a brief test. As the test runs, you will receive several recommendations, ordered from most to least recommended. In this version of the SDK, it is not possible to reconfigure the current `BroadcastSession`, so you will need to `release()` it and then create a new one with the recommended settings. You will continue to receive `BroadcastSessionTest.Results` until the `Result.status` is `SUCCESS` or `ERROR`. You can check progress with `Result.progress`.

Amazon IVS supports a maximum bitrate of 8.5 Mbps (for channels whose `type` is `STANDARD` or `ADVANCED`), so the `maximumBitrate` returned by this method never exceeds 8.5 Mbps. To account for small fluctuations in network performance, the recommended `initialBitrate` returned by this method is slightly less than the true bitrate measured in the test. (Using 100% of the available bandwidth usually is inadvisable.)

```
void runBroadcastTest() {
    this.test = session.recommendedVideoSettings(RTMPS_ENDPOINT, RTMPS_STREAMKEY,
        result -> {
            if (result.status == BroadcastSessionTest.Status.SUCCESS) {
                this.recommendation = result.recommendations[0];
            }
        });
}
```
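
As a sketch, here is the same test with progress logging and explicit handling of the terminal statuses described above; it uses only the fields mentioned in this section (`status`, `progress`, `recommendations`):

```
// Sketch: run the connection test, log progress, and handle both terminal
// statuses (SUCCESS and ERROR).
void runBroadcastTestWithLogging() {
    this.test = session.recommendedVideoSettings(RTMPS_ENDPOINT, RTMPS_STREAMKEY,
        result -> {
            // Check progress while the test runs.
            Log.d(TAG, "Test progress: " + result.progress);
            if (result.status == BroadcastSessionTest.Status.SUCCESS) {
                // Recommendations are ordered from most to least recommended.
                this.recommendation = result.recommendations[0];
            } else if (result.status == BroadcastSessionTest.Status.ERROR) {
                Log.e(TAG, "Connection test failed");
            }
        });
}
```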

## Using Auto-Reconnect
<a name="broadcast-android-auto-reconnect"></a>

IVS supports automatic reconnection to a broadcast if the broadcast stops unexpectedly without calling the `stop` API; e.g., a temporary loss in network connectivity. To enable auto-reconnect, call `setEnabled(true)` on `BroadcastConfiguration.autoReconnect`.
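
A minimal sketch of doing this with the `BroadcastConfiguration.with` pattern used earlier in this guide:

```
// Sketch: enable auto-reconnect when building the broadcast configuration.
BroadcastConfiguration config = BroadcastConfiguration.with($ -> {
    $.autoReconnect.setEnabled(true);
    return $;
});
```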

When something causes the stream to unexpectedly stop, the SDK retries up to 5 times, following a linear backoff strategy. It notifies your application about the retry state through the `BroadcastSession.Listener.onRetryStateChanged` method.

Behind the scenes, auto-reconnect uses IVS [stream-takeover](streaming-config.md#streaming-config-stream-takeover) functionality by appending a priority number, starting with 1, to the end of the provided stream key. For the duration of the `BroadcastSession` instance, that number is incremented by 1 each time a reconnect is attempted. This means if the device’s connection is lost 4 times during a broadcast, and each loss requires 1-4 retry attempts, the priority of the last stream up could be anywhere between 5 and 17. Because of this, *we recommend you do not use IVS stream takeover from another device while auto-reconnect is enabled in the SDK for the same channel*. There are no guarantees what priority the SDK is using at the time, and the SDK will try to reconnect with a higher priority if another device takes over.

## Using Bluetooth Microphones
<a name="broadcast-android-bluetooth-microphones"></a>

To broadcast using Bluetooth microphone devices, you must start a Bluetooth SCO connection:

```
Bluetooth.startBluetoothSco(context);
// Now Bluetooth microphones can be used.
// ...
// When you are done, you must also stop the Bluetooth SCO connection.
Bluetooth.stopBluetoothSco(context);
```

# Known Issues & Workarounds in the IVS Android Broadcast SDK (Low-Latency Streaming)
<a name="broadcast-android-issues"></a>

This document lists known issues that you might encounter when using the Amazon IVS low-latency streaming Android broadcast SDK and suggests potential workarounds.
+ Using an external microphone connected through Bluetooth can be unstable. When a Bluetooth device is connected or disconnected during a broadcasting session, microphone input may stop working until the device is explicitly detached and reattached.

  **Workaround:** If you plan to use a Bluetooth headset, connect it before starting the broadcast and leave it connected throughout the broadcast.
+ The broadcast SDK does not support external cameras connected via USB.

  **Workaround:** Do not use external cameras connected via USB. 
+ Submitting audio data faster than realtime (using a custom audio source) results in audio drift.

  **Workaround:** Do not submit audio data faster than realtime. 
+ Android 6 and 7 devices cannot receive the broadcast SDK's `onDeviceAdded` and `onDeviceRemoved` callbacks for microphones, because these Android versions allow only the system’s default microphone.

  **Workaround:** For these devices, the broadcast SDK uses the system's default microphone.
+ When an `ImagePreviewView` is removed from a parent (e.g., `removeView()` is called on the parent), the `ImagePreviewView` is released immediately. The `ImagePreviewView` does not show any frames when it is added to another parent view.

  **Workaround:** Request another preview using `getPreview`.
+ Some Android video encoders cannot be configured with a video size less than 176x176. Configuring a smaller size causes an error and prevents streaming.

  **Workaround:** Do not configure the video size to be less than 176x176.
+ Enabling B-frames can improve compression quality; however, some encoders provide less precise bitrate control when B-frames are enabled, which may cause issues during network fluctuations.

  **Workaround:** Consider disabling B-frames if consistent bitrate adherence is more important than compression efficiency for your use case.