Several changes have made the already impressive AR effects even more stunning and better optimized.
First of all, effect resources on all platforms are now loaded asynchronously. This slightly shortens loading times and lets the first effect run while the second is still being loaded.
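The loading pattern described above can be sketched as follows; `loadEffect` here is a stand-in for the SDK's real loader, not its actual API.

```typescript
interface Effect {
  name: string;
  ready: boolean;
}

// Stand-in for the SDK call that fetches an effect's resources
// asynchronously (the real name and signature will differ).
function loadEffect(name: string): Promise<Effect> {
  return Promise.resolve({ name, ready: true });
}

// Kick off both downloads at once: the first effect can be
// applied as soon as its own promise resolves, while the
// second one is still in flight.
async function loadEffects(names: string[]): Promise<Effect[]> {
  return Promise.all(names.map(loadEffect));
}
```

With sequential `await`s instead of `Promise.all`, the second effect would not start downloading until the first one finished, which is exactly what asynchronous loading avoids.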
Secondly, the Android version now has a neural network cache. As a result, every use of an effect after the first is much faster (under 1 second).
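The speed-up from the cache follows the usual memoization pattern; the sketch below only illustrates the idea (the real cache lives inside the Android SDK and is not exposed like this):

```typescript
let initCount = 0;

// Simulated expensive step: initializing an effect's neural network.
function initNetwork(effect: string): string {
  initCount += 1;
  return `network-for-${effect}`;
}

const networkCache = new Map<string, string>();

// The first request for an effect pays the full initialization
// cost; every later request is served from the cache.
function getNetwork(effect: string): string {
  let net = networkCache.get(effect);
  if (net === undefined) {
    net = initNetwork(effect);
    networkCache.set(effect, net);
  }
  return net;
}
```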
Thirdly, many existing effects received an update. Lip segmentation networks became more precise on all platforms except iOS and Mac, where they were already top-notch. Apple platforms, in turn, got a more efficient background segmentation model that also tracks hands better.
Finally, the AR avatars now work more smoothly and support the popular glTF file format. Eyebrow segmentation also works better across the board, and certain triggers (e.g. “smile” and “mouth open”) are detected more accurately. These improvements apply to all platforms.
Face Tracking Improvements
The face tracking system, which is at the core of the SDK, has also been updated. Not only has it become less jittery and less prone to false positives, it can now optionally track faces covered with medical masks. Note that this option isn’t enabled by default; contact your customer success manager to learn more.
Easy Web AR Integration
Since version 1.4, the SDKs for Android and iOS have been distributed as Maven packages and CocoaPods, which makes integration much quicker and easier. The Web version has now received the same treatment: it can be delivered to you as an NPM package.
It has also become more flexible and lightweight. By default, it includes no feature modules (e.g. the face tracker or background replacer), which shrinks the mandatory install to about 100 KB. You then connect only the modules you need.
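A modular setup might look roughly like the sketch below. The `Player` class and the module objects are illustrative stand-ins, not the SDK's actual exports; check the package documentation for the real names.

```typescript
interface FeatureModule {
  name: string;
}

// Illustrative player: it starts with no feature modules, so the
// base install stays small; modules are connected on demand.
class Player {
  private modules: FeatureModule[] = [];

  use(module: FeatureModule): this {
    this.modules.push(module);
    return this;
  }

  connectedModules(): string[] {
    return this.modules.map((m) => m.name);
  }
}

// Connect only what the app actually needs.
const player = new Player()
  .use({ name: "face-tracker" })
  .use({ name: "background-replacer" });
```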
Other Improvements
- OEP: Added support for BT.601 and BT.709 color standards, in both full and video ranges
- Android: Ability to record original video without effects in Demo app
- TFLite upgraded to version 2.9
- C API: eval_js method added
- Unity: More segmentation neural networks: body, face, face skin, hair, neck, skin, lips. SegmentationExample scene and ready-to-use resources are provided.
- Virtual Background: Background.getBackgroundVideo() method added
- Virtual Background: Ability to pause background video right after it's loaded
- Unity: Plugin and Demo scene refactoring
- Unity: Morphing refactoring and improvement
- C API build on Windows
Bug Fixes
- Crash on Viewer closing
- Various serialization issues with effects (broken physics, etc.)
- Crash on some devices with error: Resource deadlock would occur
- Unexpected line in the background video
- Deserialization of empty textures
- Crash or image freeze when a face goes outside the screen border
- OEP: Memory leak in FPS Draw
- OEP: Memory leak when loading effect synchronously
- OEP: Release the external surface before creating a new one (leak fix)
- Web AR: Upside down screenshot on Safari 14
- Web AR: Added better error messages for Outputs
- Web AR: Runtime error when using in iframe.srcdoc
- Web AR: Inability to reuse MediaStream
- Web AR: Player.destroy() memory leak
- Makeup API: Makeup.lashes affects Eyelashes.color
- Unity: Various effects issues
- Unity: Various startup errors
- Unity: Demo scene UI fixes
- Unity: Broken face in landscape
- Unity: Beautification scene
- Unity: Issue with multiple faces