Blog
Face Filter SDK

How To Add Video Background Changer To Your App

Virtual backgrounds have become a must-have feature for video conferencing platforms. Want to integrate a video background changer into your app? Learn how to create and enable virtual backgrounds with our real-time background subtraction technology.


Test Video Background Changer as part of our Video Conferencing software package.


What is background subtraction?

Green screen or background subtraction technology uses deep learning to digitally separate you from your background. Our real-time background subtraction algorithm can replace backgrounds with both static and animated textures. Virtual backgrounds can be switched between multiple presets during a video conferencing meeting, event streaming, or a virtual class, or used as a fun effect in advertising.


Also Read: Virtual Background For Video Conferencing To Improve Privacy and Add Fun

Background subtraction can be used together with face tracking enabling face filters, animation controlled with the face.

Banuba demo: Real-time Background Subtraction To Add Virtual Backgrounds on Mobile

Background subtraction using deep learning

Banuba has developed a deep convolutional neural network for background subtraction to separate a person from the background for iOS, Android and Unity.

Background subtraction using deep learning

The convolutional neural network takes colour images as input and outputs a probability mask showing whether each pixel belongs to the class “person” or “background.” This approach allows us to ensure high performance and good results for real-time background subtraction. Initially, we took a small data set and grew it through active learning and subsequent fine-tuning. A correctly selected background data set helps to obtain optimal results and a high-quality implementation.
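As an illustration of how such a mask is applied (this is not Banuba's actual implementation), compositing the camera frame over a virtual background reduces to a per-pixel blend weighted by the network's "person" probability:

```javascript
// Illustrative sketch only, not Banuba SDK code: blend a camera frame over a
// virtual background using a per-pixel "person" probability mask.
// frame, background: flat arrays of pixel values (0..255, one channel for brevity)
// mask: per-pixel probability that the pixel belongs to the "person" class (0..1)
function compositeBackground(frame, background, mask) {
  return frame.map((pixel, i) =>
    Math.round(mask[i] * pixel + (1 - mask[i]) * background[i])
  );
}
```

A pixel with probability 1 keeps the camera value, a pixel with probability 0 takes the background value, and intermediate probabilities blend the two, which is what produces soft edges around hair and shoulders.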

Additionally, we’ve been working on optimizing our real-time background subtraction for difficult lighting conditions like flare spots or low lighting. These have been a weak spot for all real-time background detection and extraction software, but we’re optimizing our background segmentation neural network to work with dynamically changing lighting.

Background subtraction is now available as part of our Face Filter SDK to enable virtual backgrounds with facial animation or as part of AR video conferencing or live streaming solutions to replace backgrounds only. 


Also Read: AR Video Calls With Banuba And Agora Integration

Features

  • Video or photo background remover
  • Portrait and landscape mode 
  • Bokeh effect
  • Real-time or post-processing: matting, filters, etc.
  • Compatible with other face tracking features like face filters or beautification
  • Respect or ignore gyroscope data to adjust the horizon line in virtual backgrounds
  • iOS, Android, Unity support

Real-time background subtraction performance (FPS)

  • Android: min 25 FPS on mid-range devices like the Samsung Galaxy S7 and 40 FPS on 
  • iOS: min 20 FPS on the low-end iPhone 5s, 30 FPS on iPhone 11, and 60 FPS on iPad Pro

Reference: Banuba Face AR Technical Specification

Virtual backgrounds you can create with Banuba SDK

Our Video Background Changer framework allows for three types of virtual backgrounds:

  • Static
  • Animated
  • 3D virtual background 

Static virtual backgrounds are represented as a 2D texture in .png format. This can be any image, either created in an image editor or downloaded from the web. 

Animated backgrounds use video (.mp4) files and textures to animate objects in the background. This can be a custom branded video background or video footage downloaded from a stock library.

3D virtual backgrounds use environment textures (.ktx) to create a 360-degree background that a user can see when rotating his or her device. Unlike static or animated backgrounds, where users generally stay still in front of the camera, 3D virtual backgrounds assume that the user can move and rotate the device. An example of creating a 3D background using environment textures (.ktx) can be found here.

This tutorial will guide you through creating static and animated virtual backgrounds and enabling them with our real-time background subtraction algorithms.

What you’ll need

Go to our Face Filter SDK page and submit a request for the Background Remover feature. You’ll get links to download the package and the Filter Editor and Viewer components. You'll have a free trial period to validate the feature's performance and test the integration within your app. The process looks as follows:

  1. Unpack SDK archives
  2. Activate token
  3. Enable demo app examples
  4. Create custom backgrounds and integrate them into your app

How to add a video background changer to your app

In these examples we’ll learn how to use background subtraction for static and animated virtual backgrounds. 


Custom virtual backgrounds: Preparation stage 

In this step, we’ll set up the environment for the Filter Editor and the subsequent conversion of the background effect into the native format supported by the SDK.

  1.  Create a folder for your effect and give it the same name as the effect (bg_segmentation).
  2.  Open the Effect Constructor (EC), press the ‘Open’ button in the upper left corner, and select the ‘bg_segmentation’ folder created in step 1.
  3.  Press the ‘Construct’ button in the upper right corner to create the effect. We’ll add our background files later, so no notifications will appear at this point.
  4.  Check your folder. You'll see a new folder with the same name created inside the effect folder. Files such as “config.js”, “cfg.toml”, “config.json” and “preview.png” are added to the new folder.

Setting up background segmentation (manually):

1. Put your sample media files into the newly generated effect folder, alongside the generated files.

a. For 2D texture background segmentation: 

  • bg.vert
  • bg_tex.frag
  • paper.png (or any texture(s) you want)
  • tri.bsm2

b. For Video background segmentation:

  • bg.vert
  • bg_video.frag
  • video.mp4 (or any video file you want)
  • tri.bsm2

Sample files can be found in the downloaded effect example archive.

2. Edit cfg.toml:

a. For 2D texture background segmentation:

Write the material section:

    [materials.TriMat]
    vs = "bg.vert"
    fs = "bg_tex.frag"
    blend = "off" # "alpha", "premul_alpha", "screen", "add", "multiply"
    backfaces = false
    colorwrite = true
    zwrite = false
    shadow = false
    samplers = { tex=0 } # texture is set from config.js

Add the TriMat material from the tri.bsm2 model to the draw order:

    draw_order = ["TriMat"]

b. For Video background segmentation:

Write the material section:

    [materials.TriMat]
    vs = "bg.vert"
    fs = "bg_video.frag"
    blend = "off" # "alpha", "premul_alpha", "screen", "add", "multiply"
    backfaces = false
    colorwrite = true
    zwrite = false
    shadow = false
    samplers = {}

Add the TriMat material from the tri.bsm2 model to the draw order:

    draw_order = ["TriMat"]

More about cfg.toml here.

3. For background segmentation edit config.json:

a. For 2D texture background segmentation:

Enable the feature. Add to the general section:

    "recognizer": [
        "background"
    ]

b. For Video background segmentation:

Add the media option to the frx section:

    "frx": {
        "type": "3d_anim",
        "3d_anim": {
            "type": "vfx"
        },
        "media": {
            "type": "VIDEO",
            "file": "video.mp4"
        }
    }

Enable the feature. Add to the general section:

    "recognizer": [
        "background"
    ]

4. Add the following code to the effect’s init method:

a. For 2D texture background segmentation:

Add the tri.bsm2 mesh spawn:

    Api.meshfxMsg("spawn", 0, 0, "tri.bsm2");

Set the background texture:

    Api.meshfxMsg("tex", 0, 0, "paper.png");

Note: you can change the background texture at runtime by calling Api.meshfxMsg("tex", …) with a new texture name.

Bonus: you can tint your texture by sending additional color data to the shader:

    Api.meshfxMsg("shaderVec4", 0, 0, "1.0 0.0 0.0 0.0"); // "1.0 0.0 0.0 0.0" - RGBA data
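The init calls above can be gathered into small helpers for the 2D texture case. In the sketch below, the Api.meshfxMsg calls are the ones shown in the steps above, while the function names (initBackgroundEffect, setBackground, setTint) and the surrounding structure are our own illustration.

```javascript
// Sketch of the 2D texture case in config.js. Api is the effect scripting
// object used in the steps above; the helper names are illustrative.
function initBackgroundEffect() {
  Api.meshfxMsg("spawn", 0, 0, "tri.bsm2"); // spawn the background mesh
  Api.meshfxMsg("tex", 0, 0, "paper.png");  // bind the initial texture
}

// Swap the background texture at runtime (e.g. when the user picks a preset)
function setBackground(textureName) {
  Api.meshfxMsg("tex", 0, 0, textureName);
}

// Tint the background by passing RGBA data to the fragment shader
function setTint(r, g, b, a) {
  Api.meshfxMsg("shaderVec4", 0, 0, [r, g, b, a].join(" "));
}
```

A helper like setBackground is what a preset picker in your app would call to change the virtual background mid-call.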

b. For Video background segmentation:

Add the tri.bsm2 mesh spawn:

    Api.meshfxMsg("spawn", 0, 0, "tri.bsm2");

Add the playVideo function call:

    Api.playVideo("frx", true, 1);
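For the video case, the init method reduces to the two calls above; the sketch below simply wraps them in one function. The function name is our own, and the playVideo arguments are reproduced exactly as shown in the step above.

```javascript
// Sketch of the video case in config.js; Api is the effect scripting
// object referenced above, and the function name is illustrative.
function initVideoBackgroundEffect() {
  Api.meshfxMsg("spawn", 0, 0, "tri.bsm2"); // spawn the background mesh
  Api.playVideo("frx", true, 1);            // play the video declared in config.json
}
```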

More about config.js here.

The effect construction is now complete. You can preview your effect in the Banuba Viewer application by dragging and dropping it into the Viewer window.

Summing up

With Video Background Changer, you can not only remove the background in videos (aka chroma key) but also replace it with scenery of the user's choice for an enhanced experience.

Real-time background changer applications

  • Replace backgrounds during a business call for privacy purposes
  • Change an unsuitable background and remove noise
  • Remove unwanted objects or people from videos
  • Swap a boring background during a video call for the jungle or a beach
  • Design custom backgrounds to promote your product
  • Shoot unique brand videos and design creative ad campaigns
  • Create 360-degree 3D backgrounds for education or entertainment, e.g., for mixed reality
  • Edit “boring” backgrounds to create perfect videos

Want to add a background changer into your app? Get in touch!



Next read: Getting Started With Creating Face-based AR Experiences
