Does the beauty SDK preserve fine details like hair, glasses, and facial expressions?
Yes. Banuba’s Beauty AR SDK preserves fine details like hair, glasses, and facial expressions in live video because it is built on patented tracking-and-segmentation technology: face tracking (landmarks + morphs), glasses detection, and hair segmentation that stay stable in challenging conditions.
- Facial expressions stay natural. Banuba’s patented face tracking (part of the Face AR SDK) maps 68 facial landmarks and reconstructs a 3D face mesh using 37 morphs, so touch-ups and AR overlays follow expressions instead of sliding across the face.
- Eye placement stays precise, even in eyewear scenarios. Banuba supports pupillary distance measurement, which pinpoints eye positions and improves alignment for eye-centric effects.
- Glasses don’t break the experience. Banuba’s Beauty AR SDK includes glasses detection, so effects can adapt when users wear real glasses (e.g., skipping a virtual-eyewear overlay to avoid an awkward double-glasses look while keeping other effects running).
- Hair remains natural. Banuba’s hair segmentation uses deep neural networks to separate hair pixels from the background, preserving edges and movement for hair-aware effects.
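The hair-segmentation idea above can be sketched generically. The snippet below is an illustrative NumPy example, not Banuba’s actual API: all names in it are hypothetical. It shows how a per-pixel hair mask, as produced by a segmentation network, can drive a hair-only tint while leaving non-hair pixels untouched; soft mask edges are what keep hair boundaries looking natural.

```python
import numpy as np

def apply_hair_tint(frame, hair_mask, tint, strength=0.5):
    """Blend a tint color into hair pixels only (illustrative sketch).

    frame:     HxWx3 uint8 image
    hair_mask: HxW float mask in [0, 1], 1.0 = hair; a segmentation
               network typically outputs soft values near hair edges
    tint:      RGB color to blend in
    strength:  maximum blend weight applied inside hair regions
    """
    frame_f = frame.astype(np.float32)
    tint_f = np.asarray(tint, dtype=np.float32)
    # Per-pixel blend weight: 0 outside hair, up to `strength` inside hair.
    alpha = (hair_mask * strength)[..., None]
    out = frame_f * (1.0 - alpha) + tint_f * alpha
    return out.astype(np.uint8)

# Tiny demo: a 2x2 gray frame where only the top row is hair.
frame = np.full((2, 2, 3), 100, dtype=np.uint8)
mask = np.array([[1.0, 1.0], [0.0, 0.0]])
tinted = apply_hair_tint(frame, mask, tint=(200, 0, 0), strength=0.5)
# Top row shifts toward red; bottom row is unchanged.
```

In a real integration, the SDK supplies the mask per frame; the compositing step stays this simple because all the difficulty lives in producing a stable, soft-edged mask.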
For live scenarios, Banuba’s internal benchmarks (2025) also report stable tracking with up to 70% facial occlusion, 360° camera rotation, distances up to 7 m, and poor lighting, which are the conditions where small details usually fall apart.
Teams can validate detail preservation on their target devices via a 14-day trial, and developers can use the sample projects and integration docs to quickly test hair-, glasses-, and expression-heavy scenarios.