Real-time face detection and tracking for the web
You can test it yourself straight away by heading to our Web AR technology page. The same camera face tracking features can be embedded into any web app or website.
The core of our web face tracking is a neural network built with TensorFlow Lite, an open-source deep learning framework for on-device inference (not to be confused with TensorFlow.js).
On the surface, TensorFlow Lite might not seem a perfect fit for webcam face tracking. Still, its advantages are hard to ignore: a small footprint and fast inference on devices with limited compute and memory. TensorFlow Lite also has a large community, and Google itself uses it for web AR in Google Meet.
However, the inference engine alone does not determine the final result. The right neural network architecture, efficient data loading and effective processing also play a crucial role. Let's touch on each aspect and explain what makes our face tracking unique.
How we accelerated in-browser face tracking performance
Neural network optimization
Unlike TensorFlow.js and similar libraries, face tracking with TensorFlow Lite requires adapting the network architecture to the specifics of the XNNPACK delegate. We optimized our networks for it and achieved a manifold increase in performance.
The face tracking library that comes as part of the SDK delivers a stable 30 fps for camera capture and rendering (Chrome, MacBook Pro 15", 2017). That's enough for real-time face filters, in-browser makeup try-on and virtual try-ons.
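To put the 30 fps figure in context, frame rate in a web app is usually measured over a sliding window of frame timestamps. The sketch below is an illustrative FPS meter, not part of the SDK; in the browser you would call tick(performance.now()) once per rendered frame.

```javascript
// Minimal FPS meter: feed it frame timestamps (in ms) and read the
// average rate over a sliding window of recent frames.
function createFpsMeter(windowSize = 30) {
  const stamps = [];
  return {
    tick(nowMs) {
      stamps.push(nowMs);
      if (stamps.length > windowSize) stamps.shift();
    },
    fps() {
      if (stamps.length < 2) return 0;
      const elapsedMs = stamps[stamps.length - 1] - stamps[0];
      return ((stamps.length - 1) / elapsedMs) * 1000;
    },
  };
}

// Simulated stream: one frame every ~33.3 ms, i.e. 30 fps.
const meter = createFpsMeter();
for (let i = 0; i < 30; i++) meter.tick(i * (1000 / 30));
console.log(meter.fps().toFixed(1)); // "30.0"
```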
Fast data processing
Emscripten supports fast data input and output via Canvas/WebGL and provides OpenGL support on top of WebGL. In Chrome, we additionally get a 2x speedup by parallelizing mathematical calculations with SIMD instructions.
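Because not every browser build enables WebAssembly SIMD, apps typically feature-detect it before loading a SIMD-enabled module. A common approach (this snippet is a general technique, not the SDK's code) is to ask the engine to validate a tiny wasm module that contains a single v128 instruction; the byte sequence below is the one used by the wasm-feature-detect project.

```javascript
// Detect WebAssembly SIMD support before choosing which build to load.
// If the engine validates this tiny module (which uses a v128
// instruction), SIMD is available.
function wasmSimdSupported() {
  return WebAssembly.validate(new Uint8Array([
    0, 97, 115, 109, 1, 0, 0, 0, 1, 5, 1, 96, 0, 1, 123, 3, 2, 1, 0,
    10, 10, 1, 8, 0, 65, 0, 253, 15, 253, 98, 11,
  ]));
}

console.log(wasmSimdSupported()); // true in runtimes with SIMD enabled
```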
Fast face tracking performance is not all that makes a good user experience in Web AR. The face detection neural network is just one of the components of the pipeline for processing a frame from input to effect drawing. For smooth 3D graphics rendering on the web we use the EffectPlayer, another SDK component, running on OpenGL.
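Conceptually, the per-frame pipeline described above can be sketched as a chain of stages that a frame object passes through on its way from camera input to effect drawing. The stage names and data shapes below are illustrative assumptions, not the SDK's actual API.

```javascript
// Hypothetical per-frame pipeline: each stage is a pure function, and
// the frame accumulates results from input preprocessing, through
// neural-net face detection, to effect rendering.
const stages = [
  frame => ({ ...frame, tensor: 'preprocessed' }),         // resize/normalize input
  frame => ({ ...frame, faces: [{ x: 120, y: 80 }] }),     // face detection output
  frame => ({ ...frame, rendered: frame.faces.length > 0 }) // draw effect over faces
];

function processFrame(input) {
  return stages.reduce((frame, stage) => stage(frame), input);
}

const out = processFrame({ pixels: '...' });
console.log(out.rendered); // true
```

Structuring the pipeline this way keeps the detector just one replaceable stage among several, which matches the point above: tracking speed alone is not enough, every stage has to be fast.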
As a result, our face tracking library combines a lightweight core with fast performance on the web. It's achieved with:
fast input/output via WebGL
good performance of adapted neural networks
fast playback of effects on WebGL
How to embed face tracking features into your website
Apply the trial token. You'll receive it together with the Web AR package.
Download the face effect example. The example includes a set of Face AR features like background removal, beautification and AR makeup. Each feature is based on face tracking and can be called with a JavaScript method.
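The two steps above boil down to passing a token and an effect into the player's configuration. The sketch below only illustrates that wiring; the function and field names are assumptions for illustration, so consult the SDK documentation for the real API.

```javascript
// Hypothetical sketch: assemble a player config from the trial token
// and a face effect. Names here are illustrative, not the SDK's API.
function buildPlayerConfig(clientToken, effectUrl) {
  if (!clientToken) throw new Error('A trial/client token is required');
  return {
    clientToken,              // the trial token from the Web AR package
    effect: effectUrl,        // e.g. one of the downloaded face effects
    input: { camera: true },  // take frames from the webcam
  };
}

const config = buildPlayerConfig('TRIAL_TOKEN', 'effects/makeup.zip');
console.log(config.input.camera); // true
```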
Where to apply webcam face tracking
Face tracking within the web opens up a myriad of augmented reality scenarios. Here are the most common ones.
Virtual try on
You can embed JavaScript-based AR face tracking into an e-commerce website for real-time virtual try-on. Customers can test sunglasses, jewelry or hats at true-to-life scale before purchasing.
Makeup try on
Combined with neural networks for lips segmentation or hair segmentation, the face tracking API allows showcasing lipstick and hair color products via the web camera.
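To make the segmentation step concrete: a lips mask is typically a per-pixel weight that drives how strongly the product color is blended into the camera frame. The function below is an illustrative sketch of that blending, not the SDK's implementation.

```javascript
// Illustrative sketch: blend a lipstick color into RGBA pixels,
// weighted by a per-pixel segmentation mask value (0..255).
function applyLipColor(pixels, mask, [r, g, b], strength = 0.6) {
  const out = new Uint8ClampedArray(pixels);
  for (let i = 0; i < mask.length; i++) {
    const w = (mask[i] / 255) * strength; // blend weight for this pixel
    out[i * 4] = out[i * 4] * (1 - w) + r * w;
    out[i * 4 + 1] = out[i * 4 + 1] * (1 - w) + g * w;
    out[i * 4 + 2] = out[i * 4 + 2] * (1 - w) + b * w;
    // alpha (i * 4 + 3) is left untouched
  }
  return out;
}

// Two gray pixels: mask fully on for the first, off for the second.
const result = applyLipColor(
  new Uint8ClampedArray([100, 100, 100, 255, 100, 100, 100, 255]),
  new Uint8Array([255, 0]),
  [200, 30, 60]
);
console.log(result[0], result[4]); // 160 100
```

In a real app, pixels would come from an ImageData buffer of the camera frame and the mask from the segmentation network, so the tint only lands on the lips.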
Webcam filters like Snapchat
You can overlay 3D objects, face filters, beauty effects and virtual backgrounds letting users enjoy a high-quality web AR experience. The experiences can be captured and shared with in-browser photo and video recording features.
You can build AR photo booth experiences with real-time webcam video processing and photo capturing features. The best part: they can run offline.
With web AR face tracking, faces must be detected robustly under difficult conditions like low lighting, varying angles and occlusion. We achieved this by leveraging high-level computer vision technology to bring augmented reality to web platforms, applications and sites.
Want to build immersive web AR apps with face filters, beauty and virtual backgrounds? Test our SDK possibilities!