According to MarketsandMarkets, the gesture recognition market is expected to grow from USD 9.6 billion in 2020 to USD 32.3 billion in just five years. More and more companies are adopting the technology to ease their customers' lives and solve everyday problems. This is a win-win situation.
Gestures play an important role in our everyday communication and expression. Using them to communicate with tech devices therefore requires very little cognitive effort on our part. That means we can control devices such as vending machines almost without thinking, just by using our fingers and hands.
We at Banuba have developed a prototype of a touchless interface system that uses our hand tracking and gesture recognition technology. Users can interact with a screen without touching it, relying on specific gestures to browse items, select a particular option, take extra actions such as choosing how much sugar they would like, confirm actions, or tap and hold virtual AR buttons.
The video below demonstrates how our technology works in the case of a vending machine with a touchless interface, letting you order a coffee just by using gestures:
This article discusses hand gesture recognition technology. The first chapter explains why there is a need for it. We then describe how the technology works. We close by giving readers a few examples of how gesture recognition is used in vending machines, presenting our touchless vending machine prototype, and mentioning a few other cases involving smart homes and TVs as well as commercial displays in shopping centers. Lots of interesting information to cover, so let's start!
1. Why is There a Need for Hand Gesture Recognition Technology?
Do you remember when touchscreen technology first came out? Many of us were happy to stop using physical buttons to control our TVs, car navigation systems, smartphones and other devices, switching to smart screens instead. This saved users time and was more convenient and enjoyable.
Times have changed, and so have customer preferences. Nowadays, more and more people don't want to use touchscreens. Many of us consider it unhygienic to use the touchscreen of a public device such as an information kiosk, especially during a pandemic. For others, like car drivers, touchscreens are often unsafe because drivers have to take their attention off the road to operate the car's navigation system. Touchscreens are also inconvenient when you try to tap an element that is small. Last but not least, touchscreens can register touches you never intended.
Touchless interfaces can solve these and many other problems that touchscreens have. One important characteristic of touchless interfaces is gesture functionality: unlike touchscreen devices, touchless devices do not require us to touch the actual screen. We can instead use gestures and voice commands to control the device. This is more convenient, hygienic, innovative and appealing, especially to millennials.
So now that we've looked at why touchless interfaces and hand gesture recognition are needed, let us consider how the technology works.
2. How Does Hand Gesture Recognition Work?
Gesture recognition is a technology that provides real-time data to a computer so it can execute the commands the user wants. People do not need to type with keys or tap a touchscreen to perform a specific action. The device's motion sensor perceives and interprets the person's movements as the primary source of data input.
Our company has developed its hand tracking based on a stack of two deep-learning technologies. The first is a neural network trained and configured to identify hands in images taken with 2D mobile device cameras. The second precisely detects 11-21 hand points in real time for further hand tracking. Both networks are trained on manually annotated datasets. At runtime, the input frames are scaled down and sent to the first network, which detects and localizes a user's hand. If a hand is detected, the Skeleton Model is activated to predict key points and track the position and orientation of the hand. Additional regression-based models may be applied to predict gestures, e.g. fist or palm. The detected gestures may then be used as part of the user interface in mobile applications.
Below is a step-by-step description of how Banuba's gesture recognition algorithm works:
Step 1: The neural network responsible for identifying hands in images detects a user's hand in a region of interest. If a hand is detected, the Skeleton Model is activated to predict key points.
Step 2: A separate algorithm detects the position and orientation of the hand, and a regression model is used to predict gestures, e.g. fist or palm.
Step 3: Finally, the detected gestures are used as application control elements. The gestures performed by a person are compared to the gesture library stored on the computer, and once a match is found, the computer executes the command associated with that specific gesture.
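The three steps above can be sketched in code. This is a minimal illustration, not Banuba's actual implementation: the detector, skeleton model and gesture classifier below are hypothetical stand-ins for the neural networks described, and the gesture-to-command library is invented for the example.

```python
# Hedged sketch of the three-step pipeline. In a real system each
# stage would be a trained neural network; here they are stand-ins.

def detect_hand(frame):
    """Step 1 (stand-in): return a bounding box (x, y, w, h) if a hand
    is found in the region of interest, else None."""
    # Toy heuristic: treat any non-empty frame as containing a hand.
    return (10, 10, 64, 64) if frame else None

def predict_keypoints(frame, box):
    """Step 1, continued (stand-in): the skeleton model predicts hand
    key points inside the detected box."""
    x, y, w, h = box
    # Return 21 dummy (x, y) points clustered at the box center.
    return [(x + w // 2, y + h // 2)] * 21

def classify_gesture(keypoints):
    """Step 2 (stand-in): a regression model maps key points to a gesture."""
    return "palm"

# Step 3: an app-specific gesture library mapping gestures to commands.
COMMANDS = {"palm": "pause", "fist": "select"}

def process_frame(frame):
    box = detect_hand(frame)
    if box is None:
        return None                        # no hand, no command
    keypoints = predict_keypoints(frame, box)
    gesture = classify_gesture(keypoints)
    return COMMANDS.get(gesture)           # look up the matched command
```

With these stand-ins, `process_frame` returns `"pause"` for a frame containing a hand and `None` for an empty frame, mirroring how a detected gesture is translated into an application command.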
Our hand tracking system is capable of recognizing the most common gestures such as:
- Palm ✋
- Victory ✌️
- Rock 🤘
- Like 👍
- OK 👌
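To make the gesture list concrete, here is a hedged sketch of how such gestures could be told apart once the skeleton model has determined which fingers are extended. The function and the extension patterns are illustrative assumptions, not Banuba's classifier; a gesture like OK also depends on key-point distances (thumb and index tips touching), which this simplification ignores.

```python
# Illustrative only: classify a gesture from which fingers are extended.
# A real system would derive these booleans from the 21 hand key points.

def name_gesture(thumb, index, middle, ring, pinky):
    extended = (thumb, index, middle, ring, pinky)
    patterns = {
        (True, True, True, True, True): "Palm",       # all fingers open
        (False, True, True, False, False): "Victory", # index + middle
        (False, True, False, False, True): "Rock",    # index + pinky
        (True, False, False, False, False): "Like",   # thumb only
    }
    return patterns.get(extended, "Unknown")
```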
The short video presented below demonstrates how our hand tracking technology works in practice:
Now that we've considered what hand gesture recognition is and how it works, let us look at a few examples of where this technology has been applied.
3. Where Hand Gesture Recognition Is Used
Use Case 1. Vending Machines
Many of us have tried to force a vending machine to accept a rumpled dollar bill to buy a bottle of mineral water. A long and unpleasant experience. Was it convenient to take your gloves off and back on while digging for coins to pay for coffee on a cold winter day? Well, those days have come to an end thanks to intelligent vending machines.
These devices are a great example of where hand gesture recognition is used. Controlling such a device doesn't require interaction with its screen or buttons. All users have to do is show the specific hand gestures we discussed in the previous chapter or wave their hand to the left or right, up or down. Payment can be made by simply waving a smartphone in front of the reader, solving the problems described at the beginning of this chapter.
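A wave to the left or right can be detected from the tracked hand's position over recent frames. The sketch below is a simplified assumption of how that might work (the function name and the pixel threshold are invented for illustration), not the prototype's actual logic:

```python
# Hedged sketch: infer a left/right wave from the hand's horizontal
# positions across recent frames. Threshold is an arbitrary pixel count.

def detect_swipe(x_positions, threshold=100):
    """Return 'right', 'left', or None based on net horizontal movement."""
    if len(x_positions) < 2:
        return None                     # not enough history to decide
    delta = x_positions[-1] - x_positions[0]
    if delta > threshold:
        return "right"
    if delta < -threshold:
        return "left"
    return None                         # movement too small to count
```

A production system would also smooth the trajectory and debounce repeated detections, but the core idea, comparing net displacement against a threshold, stays the same.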
This solution is especially useful today, when customers often don't want to touch screens that other people might have used before them.
Banuba Touchless Vending Machine Prototype
Our hand gesture recognition system, mentioned in the introduction, can classify different types of gestures based on our internally trained hand skeleton model. Any command can be assigned to these hand signs, and the algorithm can be taught to identify any number of gestures within a couple of weeks to satisfy your requirements. The system is also capable of recognizing a person's presence, gestures and face at a distance of up to 2 meters under real-world conditions such as varying lighting, face occlusion and different skin tones, enabling comfortable remote interaction with the screen.
Use Case 2. Smart Homes
Apart from vending machines, there are other places where the technology is applied. Take the example of a smart home, where hand recognition software tracks a hand's 3D position, rotation and gesture. This allows smart home owners to operate the lights without ever touching a switch, simply by waving a hand in front of it.
Use Case 3. Smart TVs
Another example is the smart TV. Here, the human face and natural hand gestures are the main means of interacting with the system. Face recognition identifies the user, while hand gesture recognition controls the TV itself, for example, changing channels and adjusting the volume.
Use Case 4. Virtual In-Store Displays
Last but not least, hand gesture recognition is also used in commercial in-store displays, found in shopping malls to attract more visitor traffic. Retail is being increasingly digitized, including the introduction of multiple smart devices working together on a single IoT platform to deliver hyper-personalized, adaptive and context-specific experiences. While much of the technology is meant to be invisible to the consumer, shoppers will have the opportunity to interact digitally within the physical store environment, to find the information they are interested in and sometimes for entertainment.
So there you have it. Hand gesture recognition is one of the latest innovative technologies that can make using many smart devices a lot easier. A device with this technology uses sensors that monitor the user's movement, detect it and respond to a particular gesture with the appropriate output. Apart from vending machines, gesture recognition can also be found in smart homes, shopping malls, smart TVs and many other devices.
Are you interested in learning more about hand gesture recognition technology, or do you want to implement it in one of your devices? Contact our company representatives for a consultation and get a free quote for your next project.