
Liveness Detection for Secure Face Authentication: How It Works and Why It Matters

To prove that you’re human, you don’t have to select all the images with traffic lights in them. Looking into a camera is enough if that camera is connected to a liveness detection system. The point of this technology is to distinguish real people from images and other facsimiles. There are plenty of examples of companies falling victim to deepfakes.


Liveness detection is especially important for financial businesses, which have lost an average of $600,000 each to this kind of fraud. In this article, we will explain where this technology fits best, how it works in Banuba Face AR SDK, and how to implement liveness detection most effectively.


TL;DR:

  • Liveness detection is a technology that distinguishes real people from videos, pictures, or deepfakes;
  • There are active and passive liveness checks;
  • Banuba offers face tracking, head pose detection, gaze tracking and other foundational features;
  • Best practices for implementing liveness detection include combining several checks, randomizing them, and setting clear instructions for users.

What is liveness detection and how does it work?

Liveness detection is a technology that determines whether someone is a real person – not an artificial construct, a picture, or a prerecorded video. 

There are two main ways to achieve that.

Active challenges (explicit checks)

The system asks the user to perform simple tasks to prevent the use of a prerecorded video. 

  • Facial expressions. Blinking, smiling, making certain grimaces, etc. If someone does exactly what is asked, this means they likely aren’t fake.
  • Head pose challenges. Turning in a certain direction, raising or lowering one’s head. A prerecorded video or a picture probably won’t match the requested perspective correctly.
  • Gaze and focus tasks. Following an object on screen or steadily looking at something. With eye tracking, it is possible to determine whether it is a real person paying attention.

These challenges can and should be combined, as just one is rarely enough to confidently detect advanced GenAI. For example, an app might ask a user to smile, then turn their head from right to left, verifying each action via real-time face tracking. Moreover, a good practice is to randomize such challenges to make it harder to prepare prerecorded responses.

The more unpredictable and random challenges your app gives, the better. At a certain point, spoofing becomes nearly impossible as the number of potential combinations gets astronomical. Moreover, the liveness detection systems analyze each frame and raise red flags whenever there is an error (e.g. the person didn’t blink or smile when prompted). Even the most advanced deepfakes have a hard time meeting such a challenge.
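The randomized-challenge idea above can be sketched in a few lines. This is an illustrative Python sketch, not Banuba’s API: the challenge names and helper functions are hypothetical stand-ins for whatever signals a real face-tracking system would expose.

```python
import random

# Hypothetical pool of active challenges; a real app would map each
# name to a concrete signal from its face tracker.
CHALLENGES = ["blink", "smile", "turn_left", "turn_right", "look_up"]

def build_challenge_sequence(n=3, seed=None):
    """Pick n distinct challenges in a random order, so a prerecorded
    video is unlikely to match the exact sequence requested."""
    rng = random.Random(seed)
    return rng.sample(CHALLENGES, n)

def verify_sequence(requested, observed):
    """Pass only if the user performed every requested action,
    in the requested order."""
    return observed == requested
```

The sequence is generated fresh per session; an attacker replaying a fixed video would need footage matching every possible ordering.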

The active approach works well, but it introduces a degree of friction that users might find annoying.

Passive detection (implicit checks)

Passive liveness detection doesn’t require people to perform any specific actions. The system analyzes the minute details of the appearance to distinguish an actual person from a deepfake or a video.

  • Micro-movements & blinking. Most people’s faces aren’t perfectly still: expressions change slightly, eyes blink, the head shifts its angle. Generated copies can’t mimic all of that, and liveness check algorithms pick up the difference.
  • Pulse detection. When the heart beats, a real person’s skin tone changes slightly, following the flow of oxygenated blood – a level of detail that even deepfakes don’t reproduce. Specially trained neural networks can pick up these subtle changes and confirm that they aren’t being fooled.
  • 3D lighting and reflections. Some solutions (e.g. Banuba’s) can shine colored lights on a person’s face and judge whether it’s real by the reflections on the skin and in the eyes. Apple’s FaceID also uses a similar principle, projecting infrared light patterns to determine depth and liveness.
  • Texture and artifact analysis. Cameras have imperfections that affect the pictures and videos they take. They only get more noticeable when said pictures and videos are reproduced (e.g. printed or played on another screen). Such artifacts can be detected with a frame-by-frame analysis.
  • Device and network checks. With this method, the liveness detection system verifies the devices providing the feed, instead of the person. The process might include IP verification (e.g. flagging when the user claims to be in one country but the IP address is in another) or device fingerprinting (e.g. gathering information about a specific camera to ensure it’s a real one). Such checks aren’t exactly passive and often serve as an auxiliary method to ensure that the data stream isn’t tampered with.

Passive checks are more convenient for the users – simply looking at a camera for a few seconds isn’t a great challenge. However, they need cutting-edge technology to catch all spoofing attempts. Ideally, an app would use a combination of active and passive liveness detection approaches for the highest effectiveness. For example, it could analyze a brief video of a user and only if there are doubts request that the user perform a certain action.

Banuba’s approach to liveness detection

Among Banuba’s product line, Face AR SDK is the one that helps build liveness detection software. While it doesn’t offer a dedicated liveness API by default, it provides tracking of landmarks, expressions, and other signals that developers can use as a foundation for their apps. Simply put, Banuba gives you the information, while you create the flow around it.

Banuba’s liveness detection SDK can track the following:

  • Head pose
  • Eye openness
  • Mouth movements
  • Emotion expression
  • Gaze tracking
  • Pulse detection
  • Etc.

It also supports custom cues like reflection tests. Moreover, everything runs on the device (e.g. the mobile phone) for the best privacy, speed, and compliance with regulations like GDPR.

Using Banuba’s technology for liveness detection means processing the gathered data with your own custom checks. For example, you can ask the user to smile, then verify whether this happened (the appropriate mouth movement was detected). And add heartbeat detection for extra security. In short, Banuba provides the technology and you are free to customize its use as needed.
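That smile-verification step could look something like the Python sketch below. The FrameSignals structure and its field names are invented for illustration – they stand in for whatever per-frame tracking data the SDK actually provides, so consult Banuba’s documentation for the real interface.

```python
from dataclasses import dataclass

@dataclass
class FrameSignals:
    """Hypothetical per-frame tracking output (field names are made up)."""
    smile_score: float   # 0..1, from expression tracking
    mouth_open: float    # 0..1, could feed additional mouth-movement checks

def smile_detected(frames, score_thresh=0.7, min_frames=10):
    """Confirm a 'please smile' challenge: the smile score must stay
    above the threshold for a sustained run of frames, not a single
    spike, which filters out tracking noise."""
    run = 0
    for f in frames:
        run = run + 1 if f.smile_score >= score_thresh else 0
        if run >= min_frames:
            return True
    return False
```

Requiring a sustained run (rather than any single high-scoring frame) is a design choice that makes one-frame glitches or a briefly flashed photo insufficient to pass.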

Power Your App with Liveness Detection  Start Free Trial

Real-world use cases

Cryptocurrency exchanges

Anyone can submit documents and claim to be their owner. With Banuba’s liveness detection, the platform can verify whether this claim is true. A quick challenge (e.g. asking the user to blink) supported by micro-movement tracking can confirm that the person is who they claim to be and is not generated or prerecorded.

Brokerages and trading platforms

Businesses that offer forex or stock trading have to include KYC procedures in their registration processes. ID verification can be combined with liveness detection to make sure that the person providing the ID is the rightful owner. This will make fraudulent sign-ups a much harder task and ensure compliance.

Banking & financial services

Remote loans or account opening are convenient for the customers but create additional risks of fraud. Adding liveness detection coupled with ID verification (e.g. a customer submits a picture of their driver’s license and then performs a simple action) could make the application process much more reliable and transparent. 

Insurance and Healthcare

On the one hand, liveness detection can help prevent fraudulent claims – a short challenge could ensure that the documents are submitted by the person entitled to do so. On the other hand, such verification helps protect sensitive data, when discussing health issues. 

Access Control & Smart Devices

Electronic locks need to make sure that the person trying to open them is real – otherwise, they are a liability. Banuba’s SDK could be integrated with them and provide the necessary additional layer of security.

Best practices for implementing liveness detection

When integrating Banuba’s liveness detection technology, follow these suggestions for the best combination of security and user-friendliness:

  • Combine the tests. The more indicators you check, the harder it is to bypass them all. For example, ask the user to smile and at the same time monitor their heartbeat and light reflections. Banuba’s technology allows tracking dozens of parameters simultaneously, so don’t hesitate to use it to the fullest.
  • Use unpredictable challenges. Have the system issue different challenges with no pattern that can be discerned from the outside. Many different actions (blinks, smiles, frowns, etc.) presented at random make it much harder to prepare videos and try to fool the system.
  • Give clear guidelines. The better you can explain the challenges to the users, the more convenient the verification process. Banuba lets you use augmented reality overlays, so feel free to use emojis to show the necessary facial expressions, place arrows for the direction to turn, and so on. This way you will have a secure system and the legitimate users won’t be inconvenienced.
  • Set up the right conditions. Having good lighting, high-resolution cameras and no barriers to viewing the entire face prevents false negatives. Banuba’s liveness detection software works under worse conditions too, e.g. with facial occlusion and in low light. But the better the environment, the more reliable results you’ll get.
  • Mix active and passive methods. The combination of implicit and explicit challenges gives the best results. In addition to detecting gestures and expressions, Banuba can provide raw frames which your system can check for artifacts and other inconsistencies.
  • Ensure speed and reliability. Process the data without transferring it elsewhere, if possible. This eliminates the risk of interception and speeds up the process. In addition, create fallbacks (e.g. a second attempt at verification and a manual review process) to balance security and user experience.
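The fallback suggestion above can be sketched as a small policy function. This is a generic illustration, assuming a caller-supplied attempt_fn that runs one full liveness check and returns a pass/fail result; nothing here is specific to any SDK.

```python
def run_verification(attempt_fn, max_attempts=2):
    """Fallback policy sketch: retry a failed liveness check a limited
    number of times, then route the user to manual review instead of
    hard-rejecting them. attempt_fn is a hypothetical callable that
    performs one complete check and returns True on success."""
    for _ in range(max_attempts):
        if attempt_fn():
            return "verified"
    return "manual_review"
```

Routing persistent failures to manual review (rather than outright rejection) keeps legitimate users with bad lighting or cameras from being locked out, balancing security and user experience.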

Future of liveness detection

Authentication methods evolve to combat the new challenges posed by fraudsters, hackers, and other malicious actors. As attackers grow more sophisticated, companies respond by developing more robust protection measures.

Some of the latest trends include:

  • Multi-modal biometrics. Combining liveness detection with fingerprint scanning and voice recognition makes it very hard for attackers to bypass all the security measures at the same time.
  • Advanced deepfake detection. Specially designed and trained neural networks can spot tiny inconsistencies that aren’t present on a live video feed but give away generated images.
  • AI-powered anti-spoofing. Artificial intelligence can analyze contextual information (e.g. backgrounds) to detect hacking attempts more reliably.

Conclusion

Liveness detection helps protect you from fraudulent signups and plays a major role in KYC/AML compliance. There are two major ways to approach it: active and passive (explicit and implicit). These methods can and should be combined for the best result.

Banuba offers robust liveness detection as a part of its SDK that can serve as a technological foundation, providing reliable tracking data. It supports many checks and their combinations, making it a convenient tool for companies looking to build a biometric authentication solution. 

If you want to try it for free, don’t hesitate to sign up for a 14-day trial – no strings attached.

