Face AR SDK v0.37: Detailed Eye Segmentation, Silicon M1 Support, and Lips Morphing
With the new update, our Face AR SDK gets a few hot features, and those in the headline are only, well, the headlining ones. For details on them, and on what else has been added and changed, check out this article.
Why are there two versions of the SDK?
People following the development of our Face AR SDK might have noticed that just a few weeks ago we launched version 1.0. So why is a 0.37 release coming out now?
The thing is, we have to stay ahead of the curve when it comes to features. So until all of them are ported to the new rendering engine, we will keep shipping them on the old one as well.
Now that that’s out of the way, let’s get to the new additions themselves.
Detailed eye segmentation
We have added a new neural network that works with the eyes. It can now segment not just the eye as a whole (with separate detection of the left and right eye) but also its individual elements: pupil, iris, and sclera (the white). This means an effect can be applied to each part for truly outlandish looks.
This feature works on all platforms: iOS, Android, Web, Desktop.
Detailed eye segmentation
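To give an idea of what per-part masks enable, here is a rough, self-contained sketch of tinting only the iris using a segmentation mask. The mask format, frame representation, and the tint_region helper are all hypothetical simplifications for illustration, not the SDK's actual API.

```python
# Hypothetical sketch: blend a color into a frame only where a
# per-part segmentation mask (e.g. the iris) is active.
# The SDK's real output format differs; here masks are plain 2D
# float lists in [0, 1] and pixels are (R, G, B) tuples.

def tint_region(frame, mask, tint, strength=0.6):
    """Blend `tint` into `frame` pixels, weighted by `mask` * `strength`."""
    out = []
    for row_px, row_m in zip(frame, mask):
        out_row = []
        for (r, g, b), m in zip(row_px, row_m):
            a = m * strength  # per-pixel blend factor
            out_row.append((
                round(r * (1 - a) + tint[0] * a),
                round(g * (1 - a) + tint[1] * a),
                round(b * (1 - a) + tint[2] * a),
            ))
        out.append(out_row)
    return out

# 2x2 gray frame; the iris mask is active only in the top-left pixel
frame = [[(100, 100, 100), (100, 100, 100)],
         [(100, 100, 100), (100, 100, 100)]]
iris_mask = [[1.0, 0.0], [0.0, 0.0]]
purple = (180, 40, 200)
tinted = tint_region(frame, iris_mask, purple)
```

The same idea extends to pupil and sclera masks, each with its own tint or texture.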
Apple Silicon M1 support
Banuba Face AR SDK now works on devices that use the new line of Apple CPUs/GPUs - the latest generations of MacBooks, iMacs, and iPads. This means the potential audience you can reach with cool filters and effects just got bigger. It is also good for longevity, as Cupertino intends to use the M1 line of processors in its future devices.
Note that the SDK has a WebAR version that is platform-agnostic and works on pretty much anything with a browser.
Lips morphing
This is a fun new effect that changes the shape of the lips. It could be useful in short video apps, for masks that change users' facial expressions, or in more serious software (e.g. to preview the results of lip plastic surgery).
The 0.37 version also features another mouth-related update - a lips corrector for iOS. This is a neural network that makes all lip effects apply more precisely.
Before and after lips morphing
Other updates
macOS: a reworked implementation of the macOS framework that keeps in line with the latest Xcode packaging practices.
WebAR: API to set the number of faces to track
Unity: Action Units interface
Ability to show camera frames during effect initialization
Support for the YUV I420 pixel format
Ability to set the degree of neck smoothing with JS (in the effect)
Effect Player API using the C programming language
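A quick note on the I420 item in the list above: I420 is a fully planar YUV 4:2:0 layout, meaning an incoming buffer holds a full-resolution Y (luma) plane followed by half-resolution U and V (chroma) planes. The layout math below is a property of the format itself, not of the SDK:

```python
# I420 (YUV 4:2:0 planar): Y plane at full width x height, then U and
# V planes each at (width/2) x (height/2), stored one after another.

def i420_layout(width, height):
    """Return (total_size, y_offset, u_offset, v_offset) in bytes
    for a tightly packed I420 frame (even width and height assumed)."""
    y_size = width * height
    uv_size = (width // 2) * (height // 2)  # chroma subsampled 2x per axis
    return (y_size + 2 * uv_size, 0, y_size, y_size + uv_size)

size, y_off, u_off, v_off = i420_layout(640, 480)
# 640*480 luma bytes plus two 320*240 chroma planes
```

This is why a 640x480 I420 frame takes 1.5 bytes per pixel rather than 3: the chroma planes together are only half the size of the luma plane.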