Amazon IVS in a Nutshell
Amazon IVS (Interactive Video Service) is a simple, customizable solution for building live streaming applications, based on the same technology that powers Twitch. It offers low latency, supports web and mobile platforms, and includes chat functionality out of the box, which makes integration easier.
This technology finds applications in eCommerce, video conferencing, eLearning, social media, and other niches. Some of the most prominent users of IVS are:
- Blackboard, a learning management system with over 150 million users
- GoPro, a video hardware and software company
- Hopin, a virtual event platform with more than 100 million users
- ScreenCloud, a live meeting app
- Pococha, a live streaming app with millions of users
Banuba Face AR Live Streaming SDK in a Nutshell
Banuba SDK is a set of developer tools that lets you create a highly engaging video communication experience in web, mobile, and desktop apps. It includes the following features:
- Photo and video recording
- Rendering engine
- Face tracking
- Avatars
- Portrait backdrops
- Full body backdrops
This feature set allows you to create and accurately place face masks, replace backgrounds with pictures or videos, remove skin imperfections, apply virtual makeup, and do a lot of other things that make camera-related apps more fun to use.
Over 1,000 effects are available in the asset store, and more can be made with the browser-based Banuba Studio. Moreover, to decrease the app size, you can store filters, masks, and other effects in the AR cloud so that users can download them whenever they want.
Face AR Live Streaming SDK is known for its solid optimization, allowing it to run at real-time camera FPS on devices as old as the iPhone 5S and on most Android smartphones.
Both IVS and Face AR SDK are designed for fast, streamlined integration, which is what we need.
Building a Live Streaming App Step-By-Step
Now, let’s get into the actual process.
Step 1. Follow the IVS setup guide
https://docs.aws.amazon.com/ivs/latest/userguide/getting-started.html

Step 2. Create an IVS channel
Once you’ve created a channel, adjust your settings.

You will get playback and stream configurations.
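If you prefer the command line, the channel can also be created with the AWS CLI. The channel name below is an example; LOW latency and STANDARD type mirror the console defaults:

[code]
# Create an IVS channel; the response includes the ingest endpoint,
# the stream key, and the playback URL you will need later
aws ivs create-channel --name my-demo-channel --latency-mode LOW --type STANDARD
[/code]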

Step 3. Broadcast from mobile devices (Android/iOS)
To start a broadcast from a mobile platform, you must pass the ingest endpoint URL and the stream key for your Amazon IVS channel to BroadcastSession:
The complete URL should be in this format:
[code]
// The ingest URL typically has the form rtmps://<ingest-endpoint>:443/app/
broadcastSession.start(IVS_RTMPS_URL, IVS_STREAMKEY)
[/code]
You can find IVS_RTMPS_URL and IVS_STREAMKEY on the Stream configuration tab of your IVS channel.
See the official Amazon IVS documentation if you have any questions.
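The setup above can be sketched as follows. This is a minimal example following the IVS Broadcast SDK for Android; the listener and preset names come from that SDK, while the log tag and token constants are assumptions:

[code]
// Minimal broadcast setup sketch; IVS_RTMPS_URL and IVS_STREAMKEY
// come from your channel's Stream configuration tab
val listener = object : BroadcastSession.Listener() {
    override fun onStateChanged(state: BroadcastSession.State) {
        Log.d("Broadcast", "State: $state")
    }
    override fun onError(error: BroadcastException) {
        Log.e("Broadcast", "Error: $error")
    }
}

val session = BroadcastSession(
    applicationContext,
    listener,
    Presets.Configuration.STANDARD_PORTRAIT, // built-in portrait preset
    Presets.Devices.FRONT_CAMERA(applicationContext)
)
session.start(IVS_RTMPS_URL, IVS_STREAMKEY)
[/code]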
Step 4. Stream Playback
An IVS stream URL looks like this: https://fcc3ddae59ed.us-west-2.playback.live-video.net/api/video/v1/us-west-2.893648527354.channel.DmumNckWFTqz.m3u8. Your stream URL comes from the Playback configuration section.
Unlike YouTube links, it can't be opened directly in the browser because it comes as a file with the M3U8 extension, a UTF-8 encoded playlist file that points to a stream on the internet. To open it, use https://debug.ivsdemos.com/ or any other M3U8 player.
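To play the stream inside your own Android app, you can use the Amazon IVS Player SDK. A rough sketch (surface handling is simplified, and IVS_PLAYBACK_URL is an assumed constant holding the .m3u8 URL from your Playback configuration):

[code]
// Minimal playback sketch with the IVS Player SDK
val player = MediaPlayer(applicationContext)
player.setSurface(surface)               // a Surface from your SurfaceView
player.load(Uri.parse(IVS_PLAYBACK_URL)) // the .m3u8 URL of your channel
player.play()
[/code]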


When you need extra effects
Now we are getting into optional features that won't take much time to set up but will significantly expand what your app can do.
Banuba Face Filters
To integrate face filters, you will need the Banuba Face AR SDK. Getting it is simple: shoot us a message through the contact form, and you will receive the SDK along with installation instructions and a trial token.

The flexibility of the mobile IVS SDK makes it possible to use custom video sources as stream inputs.
See https://github.com/aws-samples/amazon-ivs-broadcast-android-sample/blob/main/app/src/main/java/com/amazonaws/ivs/basicbroadcast/activities/CustomSourceActivity.kt for an IVS custom source example.
[code]
private fun attachCustomCamera() {
    CameraManager(applicationContext).apply {
        cameraManager = this
        // Create a surface-backed input source and open the camera on it
        viewModel.session?.createImageInputSource()?.let { surfaceSource ->
            this.open(surfaceSource)
        }
    }
}
[/code]
SurfaceSource.inputSurface is used as the input for the video stream broadcast with efficient hardware compression. A camera, a video, or a renderer can be configured to send frames to this surface. Using the Banuba SDK with its integrated camera and AR effects is easy: surfaceSource.inputSurface attaches to BanubaSdkManager:
[code]
private fun attachCustomSources() {
    Log.d(IVS_TAG, "Attaching custom sources")
    viewModel.session?.createImageInputSource()?.let { surfaceSource ->
        bnbSdkManager.apply {
            // Render Banuba's camera output (with AR effects) onto the IVS input surface
            attachSurface(surfaceSource.inputSurface)
            onSurfaceCreated()
            onSurfaceChanged(0, STREAM_PARAMS_WIDTH, STREAM_PARAMS_HEIGHT)
        }
    }
    attachCustomMicrophone()
}
[/code]
The full sample is available at https://github.com/Banuba/amazon-ivs-android-kotlin.
Don’t forget to set your Banuba client token in BANUBA_CLIENT_TOKEN and your Amazon IVS key/endpoint in AMAZON_IV_KEY/AMAZON_IV_ENDPOINT in TokensAndConfig.kt.
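A sketch of what TokensAndConfig.kt may look like (all values are placeholders to replace with your own):

[code]
// TokensAndConfig.kt -- replace the placeholders with your own values
const val BANUBA_CLIENT_TOKEN = "<your Banuba trial token>"
const val AMAZON_IV_KEY = "<your IVS stream key>"
const val AMAZON_IV_ENDPOINT = "rtmps://<your ingest endpoint>:443/app/"
[/code]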
Conclusion
With a combination of Amazon IVS and the Banuba Face AR SDK, you will be able to make a video streaming app with AR effects in a couple of hours and with little coding.
Despite the simplicity, it will provide solid quality and flexibility. Try it for yourself, and if you need AR features for another project, drop us a line.
