The Problem With Biometrics
People who are not all that knowledgeable about digital authentication often think biometrics are the answer to all our authentication problems. Hint: They are not. Many people think the Holy Grail of authentication is facial recognition, or maybe even DNA analysis, "when the technology gets here!" It will not be.
Biometrics (e.g., fingerprint, facial, iris, retina, vein, hand geometry, voice, keystroke dynamics, cursor movements) can be a good form of authentication, but you have to pick good implementations, and there are valid concerns no matter which biometric option you choose.
Biometric Challenges
Here are some of the common issues with biometric authentication:
Accuracy
Security/hacking
What to do if a biometric attribute is stolen
Shared systems can promote disease transmission
Privacy issues, government intrusion, etc.
Bias
Let's go into each one in a little bit more detail.
If you’re interested in watching a webinar on this instead, attend tomorrow’s 2PM EST presentation: https://blog.knowbe4.com/hacking-biometrics-webinar.
Accuracy
Most biometric vendors tout how incredibly accurate their biometric solution is or can be. In most cases, their quoted accuracy figures are overstated. What the vendor is really stating is some hypothetical measure of how unique the involved biometric attribute is (e.g., "Your fingerprint is unique in the world!") or what the maximum capability of the underlying hardware is (e.g., "It only has one false-negative error per 10 billion fingerprint submissions!").
None of that matters. The only accuracy fact that matters is how accurate the biometric solution is in practice in real-world conditions as deployed. And it turns out that most real-world deployments are a lot more inaccurate than the advertising.
The National Institute of Standards and Technology (NIST) has been reviewing the accuracy of different biometric solutions (mostly fingerprint and facial) for years. Any biometric vendor or algorithm creator can submit their algorithm for review. NIST received 733 submissions for its fingerprint review (https://nvlpubs.nist.gov/nistpubs/ir/2014/NIST.IR.8034.pdf) and over 450 submissions for its facial recognition reviews (https://www.nist.gov/programs-projects/face-recognition-vendor-test-frvt-ongoing).
NIST accuracy goals depend on the review and scenario being tested, but NIST typically evaluates algorithms tuned to a false-match rate of around 1:100,000, meaning one wrong person accepted per 100,000 impostor attempts. The catch is the other error type: at that setting, even the best submitted algorithms still falsely reject legitimate users around 1.9% of the time, meaning almost two mistakes for every 100 genuine attempts. That is a far cry from the near-perfect accuracy touted by most vendors.
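A biometric system really has two distinct error rates: the false-match rate (the wrong person accepted) and the false-non-match rate (the right person rejected), and vendor marketing often quotes one while staying quiet about the other. A quick arithmetic sketch makes the distinction concrete (all counts below are made up for illustration, not NIST data):

```python
# Hypothetical sketch: the two error rates that accuracy claims often
# conflate. All numbers are illustrative, not from any NIST report.

def error_rates(false_matches, impostor_attempts,
                false_non_matches, genuine_attempts):
    """Return (FMR, FNMR) as fractions."""
    fmr = false_matches / impostor_attempts      # wrong person accepted
    fnmr = false_non_matches / genuine_attempts  # right person rejected
    return fmr, fnmr

# A system can meet an impressive 1:100,000 false-match target while
# still rejecting legitimate users almost 2% of the time:
fmr, fnmr = error_rates(false_matches=1, impostor_attempts=100_000,
                        false_non_matches=19, genuine_attempts=1_000)
print(f"FMR:  {fmr:.6f} (1 in {1/fmr:,.0f})")
print(f"FNMR: {fnmr:.1%}")
```

Both numbers matter in practice: the first is a security problem, the second is a usability problem that drives help-desk calls and user frustration.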
I have been involved in many biometric deployments at scale, and we see far higher error rates (false positives and false negatives) than even what NIST sees in its best-case, lab-condition testing. I routinely see error rates of 1:500 or worse. Biometrics in the real world is a hard nut to crack.
The bottom line is that most biometric solutions are nowhere near as accurate as their vendors claim. That said, some biometric solutions are far more accurate than their competitors; there are solutions that rank at the top of their class and a bunch that rank at the bottom. If you are buying a biometric solution, try before you buy, and make sure you are getting the accuracy you thought you were getting. Ask the vendor for introductions to two or three of their largest existing customers, then ask those customers about the accuracy rates they see and whether they have any problems using the product in the real world. Real-world experience counts for more than any vendor's marketing attestation could.
Security/Hacking
Anything can be hacked. Any biometric solution can be hacked. Any biometric vendor telling you differently should be avoided. But some biometric solutions are more resilient than others, and the tough part is telling the difference when evaluating whether a particular biometric solution is more secure than its competitors.
If you have the ability to choose your biometric solution, choose a solution that is more resilient to attacks.
What to do if a biometric attribute is stolen
One of the most challenging problems is what to do if your biometric attribute is stolen. For example, all ten of my fingerprints were stolen, along with those of 5.6 million other people, in the infamous June 2015 OPM data breach (https://en.wikipedia.org/wiki/Office_of_Personnel_Management_data_breach). For the rest of my life, I know that my fingerprints have been stolen and are out there in the possession of attackers. How can any system that relies on my fingerprints truly know that the person submitting them is me?
Well, for one, it is better if biometric attributes are paired with a knowledge-based secret like a password or a PIN. An attacker with my fingerprints would also have to know my knowledge-based secret in order to access the system. The attacker might be able to steal that secret as well, but at least the attack is harder to accomplish.
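The pairing idea can be sketched in a few lines. This is a minimal, hypothetical illustration, not any vendor's API: `match_score` stands in for whatever similarity score a real biometric matcher would return, and the PIN is stored only as a salted, slow hash so a database theft does not reveal it.

```python
# Minimal sketch (hypothetical names) of pairing a biometric match with
# a knowledge-based secret. A real matcher would produce match_score.
import hashlib
import hmac
import os

def hash_pin(pin: str, salt: bytes) -> bytes:
    # Slow key derivation so a stolen hash database resists brute force.
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, 200_000)

def authenticate(match_score: float, pin: str,
                 stored_salt: bytes, stored_hash: bytes,
                 threshold: float = 0.95) -> bool:
    biometric_ok = match_score >= threshold
    # Constant-time comparison avoids timing side channels.
    pin_ok = hmac.compare_digest(hash_pin(pin, stored_salt), stored_hash)
    # Both factors must pass; stolen fingerprints alone are not enough.
    return biometric_ok and pin_ok

salt = os.urandom(16)
stored = hash_pin("4921", salt)
print(authenticate(0.98, "4921", salt, stored))  # both factors pass
print(authenticate(0.98, "0000", salt, stored))  # good scan, wrong PIN
```

The design point is simply that the two factors fail independently: an attacker holding a copy of the biometric still faces the full cost of cracking the secret.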
I like biometric systems that do not store my biometric attributes in "plaintext" form, meaning I do not like any biometric system that takes my fingerprints (or face, retina, iris, etc.) and stores them as the real, complete image in its database. I want biometric systems that read my biometric attributes and then transform them into something the biometric system can store and use, but that, if stolen, means nothing to the thief. Here are some ideas on how to do that: https://www.dhirubhai.net/pulse/protecting-mfa-shared-secrets-roger-grimes.
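One way to think about such a transform is a "cancelable" template: only a salted, one-way digest of the (quantized) features is stored, and a leaked template can be revoked by re-enrolling with a new salt. The toy below illustrates only the storage principle; real deployments need error-tolerant schemes (fuzzy extractors, secure sketches) because two scans of the same finger never match exactly, and the feature values here are invented for the example.

```python
# Toy sketch of a "cancelable" biometric template: the raw feature
# vector is never stored, only a salted one-way transform of a coarsely
# quantized version. Real systems use error-tolerant schemes (fuzzy
# extractors); this only illustrates the non-plaintext storage idea.
import hashlib
import os

def make_template(features, salt: bytes) -> bytes:
    # Coarse quantization absorbs a little sensor noise.
    quantized = bytes(int(f * 16) & 0xFF for f in features)
    return hashlib.sha256(salt + quantized).digest()

salt = os.urandom(16)
enrolled = make_template([0.12, 0.87, 0.45, 0.33], salt)

# A fresh scan with slight noise still quantizes to the same template:
probe = make_template([0.11, 0.86, 0.44, 0.34], salt)
print(probe == enrolled)

# If the database leaks, re-enroll with a new salt; the stolen digest
# reveals nothing about the fingerprint and no longer matches anything.
```

The thief who steals `enrolled` gets an opaque digest bound to one salt at one site, not an image that unlocks every fingerprint reader the victim will ever use.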
Shared systems can promote disease transmission
Keep in mind that shared biometric readers may be capable of transmitting communicable diseases. This became a bigger worry in the age of COVID-19, but it has been a problem for decades. For example, biometric eye readers have long raised worries about passing along conjunctivitis (i.e., pink eye). Shared touch-based readers should be cleaned between uses, and many vendors offer "touchless" systems that can read the biometric attribute when it is "hovered" over the reader or from afar.
Privacy issues, government intrusion, etc.
Many nations and businesses now store billions of fingerprints and faces. It may be to support legitimate law enforcement purposes, but many privacy advocates worry that any single entity holding billions of people's biometric attributes invites abuse. Only time will tell, but this is certainly a worry for a significant share of the population.
Bias
Lastly, many biometric solutions (really, any authentication solution) can have technical bias. This is not the same as a personal bias. This is a bias caused by the technology. For example, many studies have shown that biometric facial scanners have a harder time discerning people with certain skin tones because of how light reflects off the skin, which affects the ability to recognize features and geometry.
Biases can also develop because of socio-economic issues. For instance, any biometric solution requiring a cell phone cannot be used by people without cell phones. You may think that everyone in the world has a cell phone, but about 25% of people around the world do not, and many people share cell phones with others (which complicates authentication). Many people do not have a smartphone capable of running a biometric app. For instance, it has been noted that a larger percentage of certain types of workers in the U.S. use "flip" phones instead of smartphones than the general population. This can be due to economic constraints, limited experience with cell phones or simply an innate aversion to smartphones.
Physical differences matter, too. Some people are born without fingerprints (a condition called adermatoglyphia), and some are born without voices or eyes. Face tattoos, glasses, masks and hair can complicate facial recognition scans. Some labor-intensive jobs cause "micro-abrasions" that interfere with fingerprint scanners, and so on. The important point is to realize that some types of biometric solutions have built-in technical biases against some demographic groups. It is good to be aware of those biases and to avoid or mitigate them when the solution will be used across one of the affected demographics.
In Closing
Biometrics are a growing part of the digital authentication world. There are good biometric solutions and bad biometric solutions. Try to pick the more secure and more accurate solutions. Even then, no biometric solution is unhackable or perfect. The best any defender considering a biometric solution can do is to be aware of the good and bad of biometric solutions and pick the best one they can.