
JavaScript + WebGL Face Detection and Tracking to Bring Augmented Reality to the Web

Augmented reality can be a valuable addition to web apps. Inspire purchases with virtual try-on embedded in your e-commerce site. Make your online marketing campaign or offline photo booth experience far more engaging with Snapchat-like webcam filters. It's now possible with our JavaScript/WebGL face detection and tracking technology. Learn how to add face tracking to your website with our AR SDK for JavaScript.


Real-time face detection and tracking for the web

Our face tracking library for the web is a JavaScript API for detecting the user's face and rendering interactive 3D graphics in real time within a browser. It doesn't require third-party plugins: a browser and a camera are all users need to experience AR.

You can test it yourself straight away by heading to our Web AR technology page. The same camera face tracking features can be embedded into any web app or website.

Try Out Web AR

Technology

The core of our web face tracking is a neural network built with TensorFlow Lite, an open-source deep learning framework for on-device inference (not to be confused with TensorFlow.js).

On the surface, TensorFlow Lite might not seem like an obvious choice for webcam face tracking. Still, its advantages are hard to ignore: a small footprint and fast inference on devices with limited compute and memory resources. TensorFlow Lite also has a large community, and Google itself uses it for Web AR in Google Meet.

However, it is not the inference engine alone that determines the quality of the final result. The right neural network architecture, efficient data loading and effective processing also play a crucial role. Let's touch on each aspect and explain what makes our face tracking unique.

How we accelerated in-browser face tracking performance 

Neural network optimization

Unlike TensorFlow.js and other libraries, face tracking with TensorFlow Lite requires adapting the network architecture to the specifics of the XNNPACK delegate. We managed to optimize it and achieved a severalfold increase in performance.

The face tracking library that comes as part of the SDK delivers a stable 30 fps for both camera capture and rendering (Chrome, MacBook Pro 15", 2017). That's enough for real-time face filters, in-browser makeup try-on and virtual try-ons.
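
If you want to sanity-check the frame rate in your own integration, a simple way is to count requestAnimationFrame callbacks. The snippet below is a minimal, SDK-independent sketch using only standard browser APIs:

```javascript
// Minimal FPS meter (standard browser APIs only, not part of the SDK).
// Counts requestAnimationFrame callbacks and logs the rate once per second.
let frames = 0;
let last = performance.now();

function tick(now) {
  frames++;
  if (now - last >= 1000) {
    console.log(`~${frames} fps`); // around 30 fps is enough for real-time AR effects
    frames = 0;
    last = now;
  }
  requestAnimationFrame(tick);
}

requestAnimationFrame(tick);
```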

Fast data processing 

Our face tracking SDKs are written in C++ with a minimum of external dependencies. For the web, we provide a JavaScript wrapper generated with Emscripten, which compiles the C++ code to WebAssembly and runs it in the browser.

Emscripten supports fast data input and output via Canvas/WebGL and provides OpenGL support on top of WebGL. In Chrome, we additionally get a 2x speedup by parallelizing mathematical calculations with SIMD instructions.
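
For illustration, here is what loading an Emscripten-built module from JavaScript typically looks like. The module name `createFaceTracker` and the file names are hypothetical placeholders, and the snippet assumes the C++ core was compiled with Emscripten's MODULARIZE and ES6-export options; it is a sketch of the general pattern, not the SDK's actual loading code.

```javascript
// Hypothetical sketch: loading an Emscripten-built WebAssembly module.
// Assumes the C++ core was compiled with -s MODULARIZE=1 -s EXPORT_ES6=1,
// producing facetracker.js (JS glue code) and facetracker.wasm.
import createFaceTracker from './facetracker.js';

const tracker = await createFaceTracker({
  // Tell the glue code where to fetch the .wasm binary from (e.g. a CDN path).
  locateFile: (file) => `/wasm/${file}`,
});

// Exported C++ functions are now callable from JavaScript,
// for example through Embind bindings or Module.ccall / cwrap.
```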

Effect playback

Fast face tracking alone doesn't make a good user experience in Web AR. The face detection neural network is just one component of the pipeline that takes a frame from camera input to the drawn effect. For smooth 3D graphics rendering on the web we use the EffectPlayer, another SDK component, which runs on OpenGL.
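
Conceptually, the per-frame pipeline can be sketched as below. The names `faceTracker` and `effectPlayer` and their methods are illustrative placeholders rather than the SDK's actual API:

```javascript
// Hypothetical sketch of the per-frame Web AR pipeline.
// `faceTracker` and `effectPlayer` are placeholders for the SDK's
// tracking and rendering components; the method names are illustrative.
function startRenderLoop(video, faceTracker, effectPlayer) {
  function renderFrame() {
    // 1. The current <video> frame is the input image for this iteration.
    // 2. Run face detection and tracking (neural network inference).
    const faceData = faceTracker.process(video);
    // 3. Draw the AR effect on top of the frame with WebGL.
    effectPlayer.draw(video, faceData);
    // 4. Schedule the next iteration to stay in sync with the camera.
    requestAnimationFrame(renderFrame);
  }
  requestAnimationFrame(renderFrame);
}
```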

As a result, our face tracking library combines a lightweight core with fast performance on the web. This is achieved with:

  • fast input/output via WebGL
  • good performance of the adapted neural networks
  • fast effect playback on WebGL

How to embed face tracking features into your website

Our AR SDK for JavaScript gives you access to face detection and tracking technology through a JS library that exports different APIs for Web AR development.

Face tracking with JavaScript works in any browser with WebGL 2.0 support. It runs on any device, with no app, external dependencies or server required. Being GPU-accelerated and running its operations on a WebGL backend, the library is a great fit for mobile browsers.
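
Before initializing face tracking, it is worth checking that the browser meets these requirements. The check below uses only standard browser APIs and is independent of the SDK:

```javascript
// Capability check using standard browser APIs (not SDK-specific).
async function canRunWebAR() {
  // WebGL 2.0 is required for rendering.
  const gl = document.createElement('canvas').getContext('webgl2');
  if (!gl) return false;

  // A camera is required for face tracking; this call prompts for permission.
  try {
    const stream = await navigator.mediaDevices.getUserMedia({ video: true });
    stream.getTracks().forEach((track) => track.stop()); // release the camera again
    return true;
  } catch {
    return false; // no camera, or permission denied
  }
}

canRunWebAR().then((ok) => console.log('Web AR supported:', ok));
```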

Steps to integrate the face tracking features into your website with JavaScript 

  1.  Request the Web AR SDK by filling out the form on our website.
  2.  Apply the trial token. You'll receive it together with the Web AR package.
  3.  Download the face effect example. It includes a set of Face AR features like background removal, beautification and AR makeup. Each feature is based on face tracking and can be called with an AR JS method.
  4.  Refer to the JavaScript code samples in the API methods to design the AR face tracking experience you need within your web environment (a hypothetical sketch follows these steps).
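
As a rough idea of what step 4 can look like in practice, here is a hypothetical sketch. The package name, class and method names below are placeholders, not the SDK's actual API; the real identifiers are documented in the code samples shipped with the package.

```javascript
// Hypothetical integration sketch: the package, class and method names
// are placeholders, not the SDK's actual API.
import { FaceTrackingPlayer } from 'web-ar-sdk'; // placeholder package name

async function startFaceAR() {
  // 1. Initialize the player with the trial token received with the package.
  const player = await FaceTrackingPlayer.create({ token: 'YOUR_TRIAL_TOKEN' });

  // 2. Use the webcam as the video source.
  await player.useWebcam();

  // 3. Load a face effect from the downloaded example (e.g. AR makeup).
  await player.applyEffect('effects/makeup.zip');

  // 4. Render the result into a container element on the page.
  player.render(document.getElementById('webar-container'));
}

startFaceAR();
```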

Where to apply webcam face tracking 

Face tracking on the web opens up a myriad of augmented reality scenarios. Here are the most common ones.


Virtual try on

You can embed AR face tracking with JavaScript into your e-commerce website for real-time virtual try-on. Customers can test sunglasses, jewelry or hats at true-to-life scale before purchasing.

Makeup try on 

Combined with neural networks for lip or hair segmentation, the face tracking API lets you showcase lipstick and hair color products through the web camera.

Webcam filters like Snapchat

You can overlay 3D objects, face filters, beauty effects and virtual backgrounds letting users enjoy a high-quality web AR experience. The experiences can be captured and shared with in-browser photo and video recording features.
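
The capture part can be done with standard web APIs. The sketch below records the canvas the AR effect is rendered into, using the built-in captureStream and MediaRecorder features; the canvas id `webar-canvas` is an assumption for the example.

```javascript
// Recording the AR canvas with standard browser APIs (captureStream + MediaRecorder).
// Assumes the effect is rendered into a <canvas id="webar-canvas"> element.
const canvas = document.getElementById('webar-canvas');
const stream = canvas.captureStream(30); // capture at ~30 fps
const recorder = new MediaRecorder(stream, { mimeType: 'video/webm' });
const chunks = [];

recorder.ondataavailable = (event) => chunks.push(event.data);
recorder.onstop = () => {
  // Combine the recorded chunks into a video blob that can be downloaded or shared.
  const blob = new Blob(chunks, { type: 'video/webm' });
  const url = URL.createObjectURL(blob);
  console.log('Recorded clip available at:', url);
};

recorder.start();
setTimeout(() => recorder.stop(), 5000); // record a 5-second clip
```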

Photo booth

You can build AR photo booth experiences with real-time webcam video processing and photo capturing features. The best part: they can run offline.

With web AR face tracking, faces must be detected robustly under varied conditions like low lighting, different angles and occlusion. We achieve this by leveraging advanced computer vision technology to bring augmented reality to web platforms, applications and sites.

Want to build immersive web AR apps with face filters, beauty and virtual backgrounds? Test our SDK possibilities!

Try Out Web AR
