Can you trust face recognition technology?

Face recognition: This is how the technology works under the hood

Dr. Andreas Wolf

Whether it's Face ID and similar features on your own smartphone or biometric check-in at the airport: face recognition should work reliably, but it must also respect data protection. We explain what it takes for recognition systems to combine security and privacy.

This is how facial recognition works (Photo: © Artem - AdobeStock)

You are probably already using face recognition at home: almost every current smartphone finds faces automatically - mostly to capture smiling people rather than strained expressions. Newer iPhones let you access the device simply by looking into the camera. Many notebooks use cameras with Windows Hello to unlock the operating system. Or you can buy a smart home camera as a replacement for the front door lock, letting you practically smile your way into your house or apartment.

But the technology is also catching on in public spaces: the federal police tested facial recognition in a pilot project at Berlin's Südkreuz station. Millions of travelers appreciate the convenience of automated border control: in Germany alone there are over 180 Easypass control lanes at seven airports, as well as thousands of e-gates worldwide. Fifty-one countries operate biometric automated border controls: one uses the iris to identify travelers, twelve use fingerprints, and the vast majority have opted for facial recognition.

At the Südkreuz train station in Berlin, the federal police tested three different facial recognition systems for reliability and performance for a year.

Is Big Brother already reality? Or does rejecting this technology amount to paranoid hostility to progress? The debate on facial recognition moves between these poles: proponents see it as a miracle cure that guarantees comprehensive security, while critics brand it as the devil's work that destroys privacy. But only those who understand what facial recognition can actually do can form a well-founded opinion.


What can be seen in one face

A face recognition system can assign a person standing in front of the camera to an existing data record based on a few characteristic features and thus authenticate them.

In order to recognize a person by their face, sensors have to record characteristic features - for example, the coordinates of prominent points, lines, or other measurable structures on the face. It is important that these features can be measured stably over a sufficiently long period of time with reasonable effort, and that all people possess them, but in individually different forms.
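What such a set of prominent points can look like in practice is sketched below in Python. The use of the open-source face_recognition package and the handling of only the first detected face are assumptions for illustration; the article does not name any particular tool.

```python
# Minimal sketch: extract prominent facial points as a feature vector.
# Assumes the open-source "face_recognition" package (pip install face_recognition);
# the article does not prescribe any particular library.
import numpy as np
import face_recognition

def landmark_vector(image_path: str) -> np.ndarray:
    """Return the (x, y) coordinates of detected facial landmarks as one flat vector."""
    image = face_recognition.load_image_file(image_path)
    faces = face_recognition.face_landmarks(image)   # one dict per detected face
    if not faces:
        raise ValueError("no face found in " + image_path)
    landmarks = faces[0]                             # regions such as "left_eye", "nose_tip", "chin"
    points = [pt for region in landmarks.values() for pt in region]
    return np.asarray(points, dtype=float).ravel()   # shape: (2 * number_of_points,)
```

The eye region and nose contribute many of these points, which is one reason why occlusions such as large sunglasses degrade recognition, as discussed below.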

A facial recognition system has to store the recorded features for every person it is supposed to recognize. This is where the problems begin: from a person's facial image you can derive not only their identity, but also more sensitive data such as age, gender, ethnic origin, mood, or body mass index.

The next step is to ensure that all relevant features can be captured correctly, which is not always easy with faces. The most common area from which features are extracted is the eye region. If it is hidden behind large sunglasses, recognition becomes problematic, as it does with a hat and scarf. Banning sunglasses and hats so that surveillance cameras can capture usable facial images, however, is hardly realistic. At the same time, how well face recognition works depends crucially on the quality of the stored features: the recognition algorithm will make more mistakes if the stored reference material is poor. There is a reason why passport photo requirements keep becoming more stringent.

In addition, the question arises whether this facial data is stored in a central database, in a distributed one, or with the bearer of the features, i.e. the citizen. If the owners of the features retain control over their data, no one else can use it without asking - provided the system is set up as intended. In return, they are responsible for protecting their data themselves.
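How a template could remain under the bearer's control is sketched below: the feature vector is encrypted with a key the bearer keeps and is stored locally instead of in a central database. The use of the cryptography package and the file layout are illustrative assumptions, not part of the article.

```python
# Minimal sketch: keep the biometric template with its bearer instead of in a central database.
# Assumes the "cryptography" package (pip install cryptography); file names are illustrative only.
import numpy as np
from cryptography.fernet import Fernet

def store_template_locally(template: np.ndarray, key: bytes, path: str = "my_face_template.bin") -> None:
    """Encrypt the feature vector with a key held by the bearer and write it to local storage."""
    token = Fernet(key).encrypt(template.astype(np.float64).tobytes())
    with open(path, "wb") as f:
        f.write(token)

def load_template_locally(key: bytes, path: str = "my_face_template.bin") -> np.ndarray:
    """Decrypt the locally stored template; without the bearer's key the file is useless."""
    with open(path, "rb") as f:
        token = f.read()
    return np.frombuffer(Fernet(key).decrypt(token), dtype=np.float64)

# key = Fernet.generate_key()   # stays with the bearer, e.g. in the device's secure element
```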

Photos or masks: How recognition systems can be outwitted

Recognition systems must not be tricked by artifacts such as a photo.

A facial recognition system can be defeated by presenting an artifact, such as a mask or a printed photo. For this, the system must accept the artifact and be able to capture the features presented on it. Alternatively, an attacker forces the real bearer of the features in front of the sensor, exploiting coercion or helplessness. This scenario, however, can be averted comparatively easily: face recognition algorithms can already detect moods, so it will not be long before procedures exist that reliably recognize coercion, sleep, or unconsciousness.

Protection against artifacts is more difficult, but solutions can be developed here too. If an attacker presents a two-dimensional object, this can be detected with any 3D technology. You don't have to dig deep into the electronics bag of tricks for this: two cameras for stereoscopic vision, a projector for structured light, or a so-called time-of-flight camera for distance measurement all deliver good results. A near-infrared (NIR) camera integrated into the biometric system detects an attack with a tablet or a color photo, because many pigments in prints and displays are invisible outside the visible spectrum. A camera for the near infrared does not have to cost more than a conventional webcam.
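The 2D-versus-3D check described above can be reduced to a very simple test on a depth image, as in the following sketch. The depth map is assumed to come from a stereo pair, structured-light projector, or time-of-flight camera, and the 2-centimetre threshold is an illustrative value, not one from the article.

```python
# Minimal sketch: reject flat artifacts (printed photos, tablet screens) using a depth map.
# depth_map is assumed to be in millimetres; the face bounding box is assumed to come
# from an upstream face detector. Threshold is illustrative.
import numpy as np

def looks_three_dimensional(depth_map: np.ndarray, face_box: tuple, min_relief_mm: float = 20.0) -> bool:
    """A real face has centimetres of relief (nose tip vs. cheeks); a photo is nearly flat."""
    top, left, bottom, right = face_box
    face_depth = depth_map[top:bottom, left:right]
    valid = face_depth[face_depth > 0]           # ignore pixels without a depth reading
    if valid.size == 0:
        return False
    relief = np.percentile(valid, 95) - np.percentile(valid, 5)
    return relief >= min_relief_mm               # roughly 2 cm of relief expected
```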

In general, numerous artifacts differ from human skin in their spectral response outside visible light; NIR is just one example. In addition, printed portraits never blink, move their eyes, or grimace.
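The observation that printed portraits never blink can be exploited with the well-known eye-aspect-ratio check sketched below. The six landmark points per eye and the 0.2 threshold follow the common formulation used in liveness demos and are assumptions, not values given in the article.

```python
# Minimal sketch: liveness hint via blinking, using the classic eye aspect ratio (EAR).
# "eye" is assumed to be the six landmark points of one eye from a landmark detector,
# ordered corner, upper lid (2x), corner, lower lid (2x); 0.2 is an illustrative threshold.
import numpy as np

def eye_aspect_ratio(eye: np.ndarray) -> float:
    """EAR = (|p2-p6| + |p3-p5|) / (2 * |p1-p4|); it drops sharply when the eye closes."""
    p1, p2, p3, p4, p5, p6 = eye
    vertical = np.linalg.norm(p2 - p6) + np.linalg.norm(p3 - p5)
    horizontal = np.linalg.norm(p1 - p4)
    return vertical / (2.0 * horizontal)

def has_blinked(ear_per_frame: list, closed_threshold: float = 0.2) -> bool:
    """A printed photo keeps the EAR constant; a live person dips below the threshold now and then."""
    return any(ear < closed_threshold for ear in ear_per_frame)
```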

Many airports use the Easypass system for border control.

Three-dimensional masks remain as a possible deception strategy. If an attacker takes care to select the right pigments, the mask will have the right "color" not only in white light but also in the NIR. However, attackers often forget eyebrows and lips. A mask that does not look like a botched botox treatment has to be thin, and it works best for an attacker whose facial structure resembles that of the person being impersonated - so not every attacker can attack every target person.

But a recognition system can also be protected against mask attacks: the mask must have holes for the eyes and nostrils, and probably also for the mouth - otherwise the attempted trick would stand out to image processing because of its unusual appearance. Around these holes there is an edge where material, texture, and temperature change. A thermal imaging camera shows very strong changes around the nostrils during breathing in and out, unless the attacker is in an environment at body temperature.

So a lot can be done against attacks. Whether this is necessary in a specific case is shown by a threat analysis, in which existing risks are named, assessed, and weighed against the costs of possible countermeasures. Every attack requires a point of attack and a capable attacker who can benefit from it.
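The breathing check with a thermal camera mentioned above could look roughly like the sketch below: the mean temperature in the nostril region is tracked over a few seconds and must swing noticeably between inhaling and exhaling. The region coordinates and the 0.3-degree threshold are illustrative assumptions.

```python
# Minimal sketch: detect breathing in thermal video around the nostrils.
# thermal_frames is assumed to be a sequence of temperature images (degrees),
# nostril_box the region below the nose from an upstream landmark detector;
# the 0.3-degree swing is an illustrative threshold, not a value from the article.
import numpy as np

def shows_breathing(thermal_frames: list, nostril_box: tuple, min_swing_deg: float = 0.3) -> bool:
    """Inhaled and exhaled air alternately cool and warm the nostril region of a real face."""
    top, left, bottom, right = nostril_box
    means = [float(np.mean(frame[top:bottom, left:right])) for frame in thermal_frames]
    return (max(means) - min(means)) >= min_swing_deg
```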


The worst errors of a recognition system

A face recognition system is intended to authenticate a person. To do this, it has to answer the question of whether the set of features stored for a claimed identity matches the data the person delivers in front of the sensor. The current data will only ever approximately match the stored reference data, because people change - over the course of their lives and even during the day. The algorithm must therefore decide whether the two data sets are similar enough to be attributed to the same person.

It can come to the conclusion that the records match, or that they do not. However, it can also happen that a person is incorrectly rated as not similar enough to their own stored record - a false non-match. This can usually be corrected by trying again, but the error remains inconvenient and embarrassing. The worst mistake for biometric security is to classify the wrong person as similar enough to the stored data - a false match. The two errors are linked: the proportion of false matches only decreases if the proportion of false non-matches increases at the same time.
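This trade-off between the two error types can be made tangible with score distributions, as in the sketch below: raising the decision threshold lowers the false match rate but raises the false non-match rate. The score distributions and thresholds here are purely made up for illustration.

```python
# Minimal sketch: the trade-off between false matches and false non-matches.
# genuine_scores are similarity scores from comparing a person with their own reference,
# impostor_scores from comparing with other people's references; all values are illustrative.
import numpy as np

def error_rates(genuine_scores: np.ndarray, impostor_scores: np.ndarray, threshold: float) -> tuple:
    fnmr = float(np.mean(genuine_scores < threshold))   # false non-match rate: genuine pairs rejected
    fmr = float(np.mean(impostor_scores >= threshold))  # false match rate: impostors accepted
    return fmr, fnmr

genuine = np.random.normal(0.8, 0.1, 10_000)    # made-up score distributions
impostor = np.random.normal(0.4, 0.1, 10_000)
for t in (0.5, 0.6, 0.7):
    fmr, fnmr = error_rates(genuine, impostor, t)
    print(f"threshold {t:.1f}: FMR {fmr:.4f}, FNMR {fnmr:.4f}")
    # A stricter threshold pushes FMR down and FNMR up: the two rates cannot both be minimised.
```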

The perfect system combines data protection and security