Face AR SDK v1.1.0: Better Full-Body And Skin Segmentation, plus Multiple Avatars
The new version of our AR SDK brings many new features and performance improvements, including support for additional platforms, better optimization, and several changes that will especially benefit clients who use the SDK for makeup try-on. Read on for the details!
We keep improving our Face AR SDK and Scene, its rendering engine. New features naturally get the most attention, but there is also a lot of work going on under the hood: improving stability, broadening compatibility with different platforms, and porting functionality from version 0.x.
Such invisible improvements are part of this release and will continue in all the following ones. Now, without further ado, let’s get to the major news.
Improved full-body segmentation
Banuba Face AR SDK is now closer to being a full-fledged virtual chromakey, as the neural network that cuts the entire person out of the picture has been improved. Firstly, it now works across all platforms: iOS, Android, Web, and desktop. Secondly, its performance is now much better, putting less strain on devices. Finally, it is more precise - see the video below for a comparison.
The main current use case for this is background replacement. It is especially useful for remote presentations: placing the speaker in a lab, on a production line, or in a boardroom.
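Conceptually, background replacement is alpha compositing with the segmentation mask. The sketch below is a generic NumPy illustration of that blend, not the SDK’s actual API; the function name and array layout are assumptions:

```python
import numpy as np

def replace_background(frame, background, person_mask):
    """Composite a person over a new background using a soft segmentation mask.

    frame, background: HxWx3 float arrays in [0, 1]
    person_mask: HxW float array in [0, 1], 1.0 where the person is
    """
    alpha = person_mask[..., None]  # broadcast the mask over the color channels
    return alpha * frame + (1.0 - alpha) * background

# Tiny 1x2 example: left pixel is the person, right pixel is background.
frame = np.array([[[1.0, 0.0, 0.0], [1.0, 0.0, 0.0]]])  # red frame
bg = np.array([[[0.0, 1.0, 0.0], [0.0, 1.0, 0.0]]])     # green background
mask = np.array([[1.0, 0.0]])
out = replace_background(frame, bg, mask)
# out[0, 0] keeps the red "person" pixel; out[0, 1] shows the green background
```

A real pipeline would feed the network’s per-frame mask into a blend like this, typically with some edge feathering for a cleaner cut-out.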
Improved skin segmentation
Skin segmentation is the basis for much of our virtual makeup: AR products like powders and foundations are applied to the segmented skin to achieve the desired look. We have streamlined the tech behind it. There is now one neural network to rule them all, and it works swimmingly across all platforms. It has also been optimized to provide better performance on mobile devices, web, and desktop alike.
Better lips tracking
Banuba’s lips tracking technology is among the best in the world, and with the latest update it has become even better: its performance on web and desktop has improved significantly.
Take note if you want to launch a virtual makeup try-on app and let your users check out various lipsticks and glosses on their screens.
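Under the hood, a lipstick try-on boils down to blending a product color into the frame wherever the lips mask is active. Here is a minimal, generic sketch of that idea; the function name, mask format, and opacity parameter are assumptions, not the SDK’s API:

```python
import numpy as np

def apply_lipstick(frame, lips_mask, color, opacity=0.5):
    """Blend a lipstick color into the frame where the lips mask is active.

    frame: HxWx3 float array in [0, 1]
    lips_mask: HxW float array in [0, 1] from lips tracking/segmentation
    color: RGB triple in [0, 1]
    """
    alpha = opacity * lips_mask[..., None]          # per-pixel blend weight
    return (1.0 - alpha) * frame + alpha * np.asarray(color)

frame = np.full((1, 1, 3), 0.5)  # one neutral gray "skin" pixel
mask = np.ones((1, 1))           # the pixel lies fully inside the lips
tinted = apply_lipstick(frame, mask, (1.0, 0.0, 0.0), opacity=0.5)
# the pixel moves halfway toward red: [0.75, 0.25, 0.25]
```

Gloss and shine effects would add specular highlights on top of this base tint, but the color pass is essentially this weighted blend.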
Multiple animated avatars
Our AR SDK software now supports several animated avatars on one screen. It can distinguish multiple people from the background and turn them into funny cartoon characters that mimic their real counterparts’ every move.
Such a feature would be useful for gaming or casual videoconferencing. Not only are the avatars funny, but they also provide a measure of privacy and security to the users.
Other improvements
Windows DLLs now come signed and with a product description, including the SDK version
Face trigger support in versions 1.x (mouth open, smile, etc.)
iOS: Effect info UI in SDK Demo app
iOS 15 support
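A face trigger such as “mouth open” can be illustrated as a simple ratio test over tracked landmarks. The sketch below is a generic, hypothetical example; the landmark keys and threshold are assumptions, not the SDK’s actual trigger API:

```python
def mouth_open(landmarks, threshold=0.35):
    """Detect a 'mouth open' trigger from normalized face landmarks.

    landmarks: dict of (x, y) points; the keys below are hypothetical
    names for the inner lips and mouth corners.
    """
    top = landmarks["upper_lip"]
    bottom = landmarks["lower_lip"]
    left = landmarks["mouth_left"]
    right = landmarks["mouth_right"]
    height = abs(bottom[1] - top[1])
    width = abs(right[0] - left[0])
    # Mouth aspect ratio: tall relative to wide means the mouth is open.
    return width > 0 and (height / width) > threshold

closed = {"upper_lip": (0.5, 0.60), "lower_lip": (0.5, 0.62),
          "mouth_left": (0.4, 0.61), "mouth_right": (0.6, 0.61)}
opened = {"upper_lip": (0.5, 0.58), "lower_lip": (0.5, 0.70),
          "mouth_left": (0.4, 0.62), "mouth_right": (0.6, 0.62)}
# closed: ratio 0.02/0.2 = 0.1 -> no trigger; opened: 0.12/0.2 = 0.6 -> trigger
```

Triggers like “smile” follow the same pattern with different landmark pairs and thresholds; in practice the values would come from the SDK’s per-frame face tracking output.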
What about versions 0.x?
At the moment, Banuba Face AR SDK exists in two branches: 1.x and 0.x. The former will continue to be developed and supported. The latter will also be supported for a time, but no new features will be added to it, as the rendering engine introduced in 1.0 is better in every aspect. Still, a few minor improvements and bug fixes will be made to the older version.
Want to try all the new features? Then sign up to check out our SDK at no charge!