Using biometrics for authentication has always been a source of controversy. At face value, it seems like a foolproof way to authenticate users (everyone has unique fingerprints, right?). But dig a level deeper, and biometric access management systems store that fingerprint (or iris, or facial map, or walking gait) as data. And we all know what happens to data if it’s not protected properly. Which brings us to the big problem with biometrics: while passwords can be changed if there’s a data breach, fingerprints and other biometric data are permanent. One breach of a biometrics database is all it takes for someone to lose their identity for a lifetime.
But this concern did not stop the inexorable march of technology, and biometric access-management systems have continued to evolve over the years until, predictably, we finally had a big biometric data security event in the summer of 2019: the Suprema incident.
The First Big Biometric Data Incident
According to an article published by The Guardian on August 14, 2019, security researchers discovered an unprotected and mostly unencrypted database belonging to Suprema, a security company responsible for the web-based Biostar 2 biometrics lock system. This database contained over 27.8 million records and 23 gigabytes of data, including fingerprint data, facial recognition data, face photos of users and more.
Suprema’s customers use the system to manage building access across more than 1.5 million locations worldwide. Unfortunately, Suprema was storing actual biometric data in the Biostar 2 database, rather than hashes of that data. This meant that if threat actors could breach the database, they’d be able to change fingerprints (so they could, for example, swap out authorized employees’ fingerprints for their own and walk into buildings) and add new users (create phony employees with their own fingerprints or faces to access company facilities).
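To make the raw-data-versus-hashes distinction concrete, here is a minimal sketch of the difference in Python. It is deliberately simplified: real biometric matching is fuzzy, so production systems use protected templates or fuzzy extractors rather than a plain exact-match hash, and the function names below are illustrative, not from any Biostar 2 API. The point is only that the database holds a salted, one-way digest instead of the biometric itself.

```python
import hashlib
import hmac
import os

def store_template(raw_template: bytes) -> tuple[bytes, bytes]:
    """Store a salted one-way hash of a biometric template, not the raw bytes.

    If this record leaks, the attacker gets a digest they cannot invert
    back into a fingerprint, and cannot swap in their own print without
    also knowing the salt and recomputing the digest server-side.
    """
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", raw_template, salt, 200_000)
    return salt, digest

def verify_template(raw_template: bytes, salt: bytes, digest: bytes) -> bool:
    """Recompute the digest for a presented template and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", raw_template, salt, 200_000)
    return hmac.compare_digest(candidate, digest)

salt, digest = store_template(b"example-fingerprint-template")
assert verify_template(b"example-fingerprint-template", salt, digest)
assert not verify_template(b"attacker-fingerprint", salt, digest)
```

Had Biostar 2 stored only derived values like these, a database exposure would still have been a serious incident, but the records could not have been read back out as usable fingerprints and faces.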
Aside from the obvious security shortcomings of this situation, it raises some interesting legal issues, given the potential lifelong privacy problems a biometric data breach could cause people. And, as is often the case, law lags technology, so the regulatory situation around biometrics remains somewhat immature and confusing. While some states have specific legislation around biometric data (Illinois’ Biometric Information Privacy Act being the most robust), in many states people’s biometric privacy is left in the hands of enterprises, with no real regulatory oversight.
Regulations Fall Short
Even in states where there is biometric-specific regulation, the requirements of those regulations fall short of security best practices. Illinois, Texas and Washington all have specific biometric privacy statutes, and a number of other states include biometric data in the definition of protected personal information in their consumer privacy laws. None, however, requires encryption of biometric data, which is a standard best practice for protecting data this sensitive.
Even if regulations do not enforce modern best practices, organizations using biometrics should voluntarily adopt those practices to avoid class action suits and other legal exposure resulting from biometric data breaches. To date, these suits typically seek damages for things like “emotional suffering” (brought on, presumably, by the stress of not knowing how your biometric data is being used).
It is not too difficult, however, to see future situations where damages could be sought for more profound issues. For example, if an employee were to be falsely arrested for theft of company property due to a threat actor’s fraudulent use of the employee’s biometric data to access a building, that employee would likely have a significant claim for damages. It does not take an extraordinary imagination to think of scenarios where people would have claims based on the fraudulent use of their biometric signatures.
Adopting Best Practices
Companies can reduce their risk exposure from using or storing biometric data by adopting the following best practices*:
- Develop written policies covering how biometric data will be collected, used, distributed and destroyed. (Oh, and then follow those policies!)
- Inform all relevant populations (employees, customers, etc.) how your organization is handling biometric information, mapped to the established policies.
- Encrypt biometric data at rest and in motion.
- Limit access to the biometric data. If it must be accessed by a third party, create a contract detailing the parameters for how that third party is allowed to access and use the data.
- Consider storing less than 100 percent of biometric datasets. For example, store only enough fingerprint data to identify a person, not the entire fingerprint.
- Consider implementing two-factor authentication in conjunction with biometric data.
- Address legal and statutory obligations regarding biometric data in all contracts with customers, vendors, etc., as well as employee handbooks.
- Review the organization’s general commercial liability insurance coverage to determine whether it provides adequate coverage for biometric-related compliance and legal risks, and, if not, whether the carrier can help clients understand and manage these risks. (If neither, find a new carrier that does.)
*Source: ABA, 2019.
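As a sketch of the encryption-at-rest practice above, the snippet below uses the widely available third-party Python `cryptography` library's Fernet recipe, which bundles symmetric encryption with an integrity check. This is a minimal illustration under assumed names (`encrypt_record`, `decrypt_record` are hypothetical helpers, not part of any vendor product), and it deliberately leaves out the hard part in practice: key management, rotation and access control for the key itself.

```python
# Minimal sketch: encrypting a biometric record before it is written to
# storage, using the `cryptography` library's Fernet recipe (authenticated
# symmetric encryption). Helper names are illustrative, not a real API.
from cryptography.fernet import Fernet

def encrypt_record(key: bytes, template: bytes) -> bytes:
    # Returns an authenticated ciphertext token that is safe to persist;
    # without the key, a leaked database row reveals nothing usable.
    return Fernet(key).encrypt(template)

def decrypt_record(key: bytes, token: bytes) -> bytes:
    # Raises cryptography.fernet.InvalidToken if the stored ciphertext
    # was tampered with, so silent modification is also detected.
    return Fernet(key).decrypt(token)

key = Fernet.generate_key()  # in production, held in a KMS/HSM, not in code
token = encrypt_record(key, b"facial-recognition-template")
assert decrypt_record(key, token) == b"facial-recognition-template"
```

The same key-outside-the-database principle applies to data in motion: TLS between the reader hardware and the management server keeps templates from being captured on the wire.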
It is likely that states (and perhaps the federal government) will adopt similar practices as the basis for biometric data privacy regulations. For companies using these systems, adopting these practices now can reduce legal exposure today, and regulatory risk in the future. Organizations should also monitor the dark web for indicators of digital risk – whether it’s an insider offering access to a biometric database (or any database, for that matter), or a corpus of biometric data for sale.
Contact GroupSense to learn more about monitoring the dark web for digital risk. The faster organizations can identify and mitigate these threats and incidents, the more they can control any potential damage caused by biometric data breaches, and ensure that biometric security delivers more reward than risk!
This article was written collaboratively by Kurtis Minder, CEO at GroupSense and Joe Meadows, a litigation partner at Bean, Kinney & Korman PC with experience in cyber and privacy issues. This article is for informational purposes only and is not intended to convey legal advice.