Almost everyone who has ever used a smartphone knows about Snapchat AR lenses aka face filters. Filters have been around since 2015 and are still popular, as they are a source of great fun and entertainment for social media users. Want to learn more about how filters work? We’ve got you covered! Read on to understand the mechanics behind AR lenses and Snapchat filter technology.
Snapchat filters: How they evolved
The way AR lenses and filters are used has changed along with the evolution of Snapchat. Initially, the app only offered basic AR filters, which nonetheless quickly went viral, as they allowed for new ways of interaction on social media. However, it was not until 2015 that Snapchat first created AR lenses as we know them today. Continuing this trend, in December 2017, Snapchat released Lens Studio, where both users and advertisers were able to create custom filters and apply them to personal snaps and sponsored content.
Two years later, in 2019, Snapchat further enhanced its AR lenses functionality by letting users transform pets, hands, bodies and famous landmarks around the world with augmented reality. For example, Snapchat users can now turn the floor into lava with the “Ground Transformation” feature. Forbes argues that this feature is the next one to be exploited by brands, as any terrain can now be turned into a branded landscape.
AR lenses on Snapchat. Source: Destination Jeddah
Currently, Snapchat lenses have evolved from mere entertainment into a full-scale social environment and digital economy that facilitates user interaction and customer engagement. In fact, users now earn money by creating custom filters and selling them to the Snapchat community and brands (certain creators charge brands up to $30,000 for AR lenses).
The technology behind Snapchat filters
Snapchat’s AR filters first took off after the company acquired Looksery in 2015 for $150 million. This allowed the in-house team to build over 3000 AR filters that users could choose from. Looksery was a Ukrainian startup that focused on computer vision and developed an app where users could modify their facial appearance during a video chat. The technology developed by Looksery led to the emergence of modern-day Snapchat AR lenses and filters.
Less than a year after the acquisition, Snapchat lenses went viral with many celebrities, including Jessica Alba and Ariana Grande, playing with AR filters like the dog mask, bread face or golden goddess lenses. One of the most prominent campaigns that went viral happened during the 2016 Oscar awards, when celebrities tried Snapchat’s “Face Swap” filter to switch faces with Leonardo DiCaprio.
As the Snapchat app evolved, so did the technology behind the filters and lenses. Today, you can find a whole range of AR mechanics, complex computer vision algorithms and neural networks that make these features possible. Most filters are based on face recognition and tracking technology. Let’s take a closer look at how all this works.
Snapchat face recognition and tracking technology
Face recognition and tracking are based on computer vision technology. Okay, but what does that mean, exactly? Computers see images as grids of numbers — pixel brightness values — and recognize our faces as characteristic patterns of lighter and darker areas, such as eyebrows, noses and foreheads.
Face recognition and tracking technology. Source: Dev
Some areas of our faces are darker, such as the eyes, while others, like the cheeks, are lighter.
Face recognition and tracking technology. Source: JavaTPoint
When the same combinations of lighter and darker areas repeatedly show up in consistent relative positions while the camera scans an image, the computer is able to distinguish faces from other objects.
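This light-versus-dark comparison is the idea behind classic Haar-like features, as used in Viola–Jones-style face detectors. Below is a minimal, illustrative sketch (not Snapchat’s actual code): an integral image makes rectangle sums cheap, and a simple two-rectangle feature measures whether a dark band (eyes) sits above a lighter band (cheeks).

```python
import numpy as np

def integral_image(img):
    """Cumulative sums so any rectangle's pixel total costs O(1)."""
    return img.cumsum(axis=0).cumsum(axis=1)

def rect_sum(ii, top, left, h, w):
    """Sum of pixels in a rectangle, read off the integral image."""
    total = ii[top + h - 1, left + w - 1]
    if top > 0:
        total -= ii[top - 1, left + w - 1]
    if left > 0:
        total -= ii[top + h - 1, left - 1]
    if top > 0 and left > 0:
        total += ii[top - 1, left - 1]
    return total

def haar_feature(img, top, left, h, w):
    """Two-rectangle feature: brightness(bottom half) - brightness(top half).
    A large positive value hints at a dark band (eyes) over a light band (cheeks)."""
    ii = integral_image(img.astype(np.int64))
    top_half = rect_sum(ii, top, left, h // 2, w)
    bottom_half = rect_sum(ii, top + h // 2, left, h // 2, w)
    return bottom_half - top_half

# Toy "face patch": dark eye band (rows 0-1) over a bright cheek band (rows 2-3)
patch = np.array([[10, 10, 10, 10],
                  [10, 10, 10, 10],
                  [200, 200, 200, 200],
                  [200, 200, 200, 200]])
print(haar_feature(patch, 0, 0, 4, 4))  # 1520: strongly positive, eye-like pattern
```

A real detector evaluates thousands of such features at every position and scale, and a trained classifier decides which combinations mean “face.”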
How Snapchat adds filters to our faces
We now know how a computer can recognize a face in an image. But how does it apply a doggy nose to it, and how does the nose stay on our face as we move during a video? To apply a face filter, a computer has to perform more complex calculations based on the Active Shape Model (ASM).
To train computers in facial recognition based on this statistical model, people first had to manually mark the facial borders on multiple images, creating a dataset that computers can learn from.
Eventually, a computer takes into account all the points on your face to understand what parts of an image correspond to your ears, eyes, nose and other facial features. Knowing this information, a computer can create a 3D mask of your face that can be scaled, rotated and moved as more data comes in from your camera. In the same way, our 3D face tracking technology can instantly recognize the human face to precisely and automatically locate AR filters, try on items, virtual makeup and much more.
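The core of fitting such a landmark model each frame is an alignment step: find the scale, rotation and translation that map the model’s mean shape onto the landmarks detected in the camera image. Here is a simplified, numpy-only sketch of that step (standard Procrustes/Kabsch alignment, not Snapchat’s implementation) and how a filter element anchored in model coordinates — say, a doggy nose at the nose tip — gets placed in the frame:

```python
import numpy as np

def align_shape(mean_shape, detected):
    """Least-squares similarity transform (scale + rotation + translation)
    mapping the model's mean landmark shape onto detected landmarks.
    Both inputs are (N, 2) arrays of (x, y) points."""
    mu_m = mean_shape.mean(axis=0)
    mu_d = detected.mean(axis=0)
    src = mean_shape - mu_m
    dst = detected - mu_d
    # Optimal rotation via SVD of the cross-covariance (Procrustes analysis)
    u, s, vt = np.linalg.svd(src.T @ dst)
    rotation = (u @ vt).T
    scale = s.sum() / (src ** 2).sum()
    translation = mu_d - scale * mu_m @ rotation.T
    return scale, rotation, translation

def place_sticker(anchor_point, scale, rotation, translation):
    """Map a point given in mean-shape coordinates (e.g., the nose tip where
    a doggy-nose filter is drawn) into the current video frame."""
    return scale * anchor_point @ rotation.T + translation

# Mean shape: three toy landmarks (left eye, right eye, nose tip)
mean = np.array([[-1.0, 0.0], [1.0, 0.0], [0.0, 1.5]])
# The same face detected in a frame, twice as big and shifted
frame_landmarks = mean * 2 + np.array([100.0, 50.0])

s, r, t = align_shape(mean, frame_landmarks)
nose = place_sticker(np.array([0.0, 1.5]), s, r, t)
print(nose)  # [100. 53.]: lands exactly on the detected nose tip
```

Re-solving this alignment for every incoming frame is what keeps the filter glued to the face as it scales, rotates and moves; production systems extend the same idea to a full 3D face mesh.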
Banuba face tracking and filtering technology demo
Unlike most face detection and tracking algorithms that work well only for frontal face recognition, ours support 90-degree face rotation, head tilts and partial face occlusion. This way, the filters and lenses follow the face as the user is moving in a video flow.
Some of the best Snapchat filters explained
Let’s check out some of the most popular Snapchat filters to see what they do and learn more about the tech behind them.
Most popular Snapchat filters. Source: BuzzfeedNews
This filter includes a range of Beauty AR technologies and features that blur your skin, enlarge your eyes and make your features appear smoother and more delicate. It is very popular among users, as it motivates them to generate and share content.
Realistic try-on by Snapchat. Source: Snapchat for Business
According to Forbes, 70% of consumers report it is hard to find clothes online, while businesses lose about $550 million on returns. That is why physics-based try-on technology that allows for the realistic representation of objects became popular among both brands and consumers. Farfetch, Prada, Dior and others use Snapchat as a platform for customer engagement, creating filters that let users virtually try on their products like makeup, glasses, jewelry and clothes. During one promotional campaign, Dior’s sneaker try-on lenses generated 2.3 million views and a 6.2x return on ad spend.
Face swap by Snapchat. Source: TechAdvisor
This algorithm automatically detects faces and allows you to switch them when taking pictures or making a video with your smartphone. After the faces are switched, all expressions that you make will automatically appear on the swapped face. The faces become filters, which means they are applied using the same Active Shape Model as all other lenses are.
Aging filters on Snapchat. Source: Engadget
This feature is called “Time Machine” and uses a neural network trained to age faces. Users can appear younger or older by dragging a slider left or right. It went viral when it was launched in 2019 and remains one of the most popular Snapchat filters today.
Gender swap filters by Snapchat. Source: VRScout
This is another viral filter that allows users to visually switch their genders. It uses a neural network to recognize and augment faces. Unlike the “Face Swap” filter, this one can modify your face without switching it with someone else’s.
Disney filters by Snapchat. Source: Republicworld
Ever thought of becoming a Disney character? Well, this Snapchat filter has made that possible. Users can now appear as 3D cartoon characters, which is very engaging and fun. Such filters generate millions of views and shares. For example, the creator of one such filter reported that his Instagram following grew from roughly 6,000 to 600,000 during the first week of 2020.
Background removers by Snapchat. Source: Make it easy
These filters became popular during the COVID-19 pandemic, when different types of background replacement were in demand for Zoom calls and video chats. The filter is based on background subtraction technology, which detects moving objects in videos from static cameras. It separates the foreground (usually a moving person interacting with the camera) from the background (everything stationary behind them), so that everything not actively participating in the video call can be removed or replaced.
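The simplest form of background subtraction keeps a running model of what each pixel “usually” looks like and flags anything that deviates from it as foreground. The numpy-only sketch below illustrates this idea with an exponential running average; real apps use more robust statistical models (e.g., mixtures of Gaussians) or neural segmentation, but the principle is the same:

```python
import numpy as np

class RunningBackground:
    """Minimal background-subtraction sketch: keep an exponential running
    average of each pixel; anything that differs from it is foreground."""

    def __init__(self, first_frame, learning_rate=0.05, threshold=30):
        self.background = first_frame.astype(np.float64)
        self.learning_rate = learning_rate
        self.threshold = threshold

    def apply(self, frame):
        """Return a boolean foreground mask, then slowly blend the new
        frame into the background model so gradual changes are absorbed."""
        diff = np.abs(frame.astype(np.float64) - self.background)
        mask = diff > self.threshold
        self.background += self.learning_rate * (frame - self.background)
        return mask

# Static grey background; a bright "person" then enters the frame
bg = np.full((4, 4), 100, dtype=np.uint8)
subtractor = RunningBackground(bg)

frame = bg.copy()
frame[1:3, 1:3] = 220            # moving object in front of the camera
mask = subtractor.apply(frame)
print(mask.sum())  # 4: only the object's pixels are flagged as foreground
```

Once the mask is known, the app can blur, replace or remove every background pixel while leaving the person untouched.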
Filters for pets (cats and dogs)
Filters for cats and dogs on Snapchat. Source: RadioTimes
When changing your own appearance isn’t enough, you can apply filters to your pets as well. Why not feature your dog or cat in a Disney movie? Dog and cat pictures and videos have proven to be viral and addictive. Now, users can augment their pets with a variety of filters to boost even more engagement.
As you can see from this post, there is sophisticated tech behind Snapchat AR lenses that makes all kinds of user interactions possible. Beyond just fun, Snapchat filters are effective at boosting user engagement, which opens enormous app monetization opportunities. Cloning Snapchat features is a great way of making your app popular among both brands and users.