
Video Background Changer: How to Integrate It & Save Costs

Video background changers let users increase privacy, remove visual noise, reinforce brand identity, and support sales initiatives during video calls and conferences.

However, building a custom background changer app is pricey, while purchasing ready-made software may limit your scalability and leave little room for customization.

An SDK with the relevant functionality is the middle ground: it lets you add video backgrounds with effects and filters, or even create custom ones.

Today, we'll guide you through integrating a video background changer SDK into your app and discuss its core features and definitions.


Video Background Changer: What It Is and How It Works

Video background changers are ready-to-use tools based on green screen or background subtraction technologies. With the help of deep learning, they digitally extract people from the background.

For example, Banuba's video background changer uses a real-time background subtraction algorithm to replace backgrounds with both static and animated textures. Virtual backgrounds can be switched between multiple presets during a video conference, an event stream, or a virtual class, or used as a fun effect in advertising.

Power Your App with Real-Time Virtual Background Changer Get Free Trial


Also Read: Virtual Background For Video Conferencing To Improve Privacy and Add Fun

Video background removers and changers can be used together with face tracking to empower your video experience with AR face filters.

Banuba Real-time Video Background Changer with AR Filters

Deep Learning-Based Background Subtraction

Banuba has developed a deep convolutional neural network for background subtraction to separate a person from the background for iOS, Android, and Unity.

Video Background Remover Using Deep Learning

The convolutional neural network takes colour images as input and outputs a probability mask showing whether each pixel belongs to the class “person” or “background.” This approach ensures high performance and good results for real-time background subtraction. Initially, we took a small data set and expanded it through active learning and subsequent fine-tuning. A well-chosen background data set helps to obtain optimal results and a high-quality implementation.
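
As an illustration only (not Banuba's actual implementation), the compositing step implied by such a probability mask can be sketched in a few lines of JavaScript: each output pixel is a blend of the camera frame and the virtual background, weighted by the per-pixel "person" probability.

```javascript
// Illustrative sketch, not Banuba's code: blend a camera frame with a virtual
// background using a per-pixel "person" probability mask in [0, 1].
function compositeFrame(frame, background, mask) {
  // frame, background: arrays of [r, g, b] pixels; mask: person probabilities
  return frame.map(function (pixel, i) {
    var p = mask[i]; // 1.0 = definitely person, 0.0 = definitely background
    return pixel.map(function (channel, c) {
      return Math.round(p * channel + (1 - p) * background[i][c]);
    });
  });
}

// A pixel that is certainly "person" keeps the frame color; an uncertain one blends.
var out = compositeFrame([[255, 0, 0], [255, 0, 0]],
                         [[0, 0, 255], [0, 0, 255]],
                         [1.0, 0.5]);
// out[0] is [255, 0, 0]; out[1] is [128, 0, 128]
```

Soft probabilities near object edges are what make the blend look natural compared to a hard green-screen cutout.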

Additionally, we’ve been optimizing our real-time background subtraction for difficult lighting conditions such as flare spots or low light. These have been a weak spot for all real-time background detection and extraction software, but we’re optimizing our segmentation neural network to handle dynamically changing lighting.

Background subtraction is now available as part of our Face Filter SDK to enable virtual backgrounds with facial animation or as part of AR video conferencing or live streaming solutions to replace backgrounds only. 


The key features of our video background changer include:

  • Video or photo background remover
  • Portrait and landscape mode 
  • Bokeh effect
  • Real-time or post-processing: matting, filters, etc.
  • Compatible with other face tracking features like face filters or beautification
  • Respect or ignore gyroscope data to adjust the horizon line in virtual backgrounds
  • iOS, Android, and Unity support.

Real-time background subtraction performance (FPS)

  • Android: min 25 FPS on mid-range devices such as the Samsung Galaxy S7
  • iOS: min 20 FPS on the low-end iPhone 5s, 30 FPS on the iPhone 11, and 60 FPS on the iPad Pro.

3 Types of Virtual Backgrounds with Banuba SDK

Our Video Background Changer framework allows for three types of virtual backgrounds:

  • Static
  • Animated
  • 3D virtual background 

Static virtual backgrounds are represented as a 2D texture in .png format. This can be any image, either created in an image editor or downloaded from the web.

Animated backgrounds use video (.mp4) files and textures to animate objects in the background. This can be a custom branded video background or footage downloaded from a stock library.

3D virtual backgrounds use environment textures (.ktx) to create a 360-degree background that users can see when rotating their device. Unlike static or animated backgrounds, where users generally stay still in front of the camera, 3D virtual backgrounds assume that the user can move and rotate the device. An example of creating a 3D background using environment textures (.ktx) can be found here.

This tutorial will guide you through creating static and animated virtual backgrounds and enabling them with our real-time background subtraction algorithms.


Key Steps Before the Integration

Go to our Face Filter SDK page and submit a request for the Background Remover feature. You’ll get links to download the package, the Filter Editor, and the Viewer components. You'll have a free trial period to validate the feature's performance and test the integration within your app. The process looks as follows:

  1.  Unpack SDK archives
  2.  Activate token
  3.  Enable demo app examples
  4.  Create custom backgrounds and integrate them into your app

How to Integrate a Video Background Remover Into Your App

In these examples, we’ll learn how to use background subtraction for static and animated virtual backgrounds. 


Custom virtual backgrounds: Preparation stage 

At this step, we’ll set up the environment for the Filter Editor and the subsequent conversion of the background effect to the native format supported by the SDK.

  1.  Create a folder for your effect and give it the same name as the effect (bg_segmentation).
  2.  Open the Effect Constructor (EC), press the ‘Open’ button in the upper left corner, and select the ‘bg_segmentation’ folder created in step 1.
  3.  Press the ‘Construct’ button in the upper right corner to create the effect. We’ll add our background files later, so no notifications will appear at this point.
  4.  Check your folder. You'll see a new folder with the same name created inside the effect folder, containing the files “config.js”, “cfg.toml”, “config.json”, and “preview.png”.

Setting up background segmentation (manually):

1. Put your sample media files, along with the generated files, into the newly created effect folder.

a. For 2D texture background segmentation: 

[code]
bg.vert
bg_tex.frag
paper.png (or any texture(s) you want)
tri.bsm2
[/code]

b. For video background segmentation:

[code]
bg.vert
bg_video.frag
video.mp4 (or any video file you want)
tri.bsm2
[/code]

Sample files can be found in the downloaded effect example archive.

2. Edit cfg.toml:

a. For 2D texture background segmentation

Write the material section:

[code]
[materials.TriMat]
vs = "bg.vert"
fs = "bg_tex.frag"
blend = "off" # "alpha", "premul_alpha", "screen", "add", "multiply"
backfaces = false
colorwrite = true
zwrite = false
shadow = false
samplers = { tex=0 } # texture is set from config.js
[/code]

Add TriMat material from tri.bsm2 model to the draw order:

[code]
draw_order = ["TriMat"]
[/code]

b. For Video background segmentation

Write the material section:

[code]
[materials.TriMat]
vs = "bg.vert"
fs = "bg_video.frag"
blend = "off" # "alpha", "premul_alpha", "screen", "add", "multiply"
backfaces = false
colorwrite = true
zwrite = false
shadow = false
samplers = {}
[/code]

Add TriMat material from tri.bsm2 model to the draw order:

[code]
draw_order = ["TriMat"]
[/code]

3. For background segmentation edit config.json:

a. For 2D texture background segmentation

Enable the feature. Add it to the general section:

[code]
"recognizer":["background"]
[/code]

b. For Video background segmentation

Add media option to the frx section:

[code]
"frx":{
    "type": "3d_anim",
    "3d_anim": {
      "type": "vfx"
    },
    "media": {
      "type": "VIDEO",
      "file": "video.mp4"
    }
}
[/code]

Enable the feature. Add it to the general section:

[code]
"recognizer": [ "background"  ]
[/code]

4. Add the following code to the effect’s init method:

a. For 2D texture background segmentation

Add tri.bsm2 mesh spawn:

[code]
Api.meshfxMsg("spawn", 0, 0, "tri.bsm2");
[/code]

Set BG texture:

[code]
Api.meshfxMsg("tex", 0, 0, "paper.png");
[/code]

Note! You can change the background texture at runtime by calling Api.meshfxMsg("tex", …) with a new texture name.
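
For instance, a small helper in the effect's config.js could cycle through a set of bundled textures. The texture file names below are placeholders for illustration; use the textures actually shipped with your effect.

```javascript
// Hypothetical helper for config.js: cycle through bundled background textures.
// The file names are assumptions; substitute your own effect's texture files.
var backgrounds = ["paper.png", "beach.png", "office.png"];
var currentBg = 0;

function nextBackground() {
  currentBg = (currentBg + 1) % backgrounds.length;
  return backgrounds[currentBg];
}

// Inside the effect, apply the next texture with:
// Api.meshfxMsg("tex", 0, 0, nextBackground());
```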

Bonus: you may color your texture by sending additional color data to the shader:

[code]
Api.meshfxMsg("shaderVec4", 0, 0, "1.0 0.0 0.0 0.0"); // "1.0 0.0 0.0 0.0" - RGBA data.
[/code]
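
If your app exposes colors as hex strings, a small helper (hypothetical, not part of the SDK) can convert them into the space-separated RGBA string shown above:

```javascript
// Hypothetical helper: convert "#rrggbb" plus an alpha value into the
// space-separated RGBA string passed to shaderVec4.
function hexToVec4(hex, alpha) {
  var r = parseInt(hex.slice(1, 3), 16) / 255;
  var g = parseInt(hex.slice(3, 5), 16) / 255;
  var b = parseInt(hex.slice(5, 7), 16) / 255;
  return [r, g, b, alpha].map(function (v) { return v.toFixed(1); }).join(" ");
}

// Api.meshfxMsg("shaderVec4", 0, 0, hexToVec4("#ff0000", 0.0)); // red tint
```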

b. For Video background segmentation

Add tri.bsm2 mesh spawn:

[code]
Api.meshfxMsg("spawn", 0, 0, "tri.bsm2");
[/code]

Add playVideo function call:

[code]
Api.playVideo("frx", true, 1);
[/code]

The effect construction is complete. You can preview your effect in the Banuba Viewer application by dragging and dropping it into the Viewer window.

Summing up

Video background changers allow you to remove the background in videos (aka chroma key) and replace it with scenery of your choice for an enhanced user experience.

Let's summarize the core use cases of virtual background removers:

  • Replace backgrounds during business calls for privacy
  • Change an unsuitable background and remove visual noise
  • Remove unwanted objects or people from videos
  • Swap a boring background during a video call for a jungle or beach scene
  • Design custom backgrounds to promote your product
  • Shoot unique brand videos and design creative ad campaigns
  • Create 360-degree 3D backgrounds for education or entertainment, e.g., mixed reality
  • Edit “boring” backgrounds to create perfect videos.



Next read: Getting Started With Creating Face-based AR Experiences
