Ask users what they think about biometrics and you’ll get a mixed response. On the one hand, many people love the speed and simplicity of the technology as well as, perhaps, its science-fiction image. On the other, however, many have serious fears for their privacy.
Surveys conducted by Imprints, a government-funded research project, indicate that the British public finds biometrics the most controversial and worrying of all means of authentication. Organisations aiming to exploit biometric technology have to be able to show that this, the most personal of all types of information, is safe in their hands.
Privacy fears tend to fall under two headings: that hackers could access biometric data to hijack an individual’s online identity, and that personal data provided for one purpose could end up elsewhere.
These fears are in many ways justified. The danger from identity thieves is all the more serious because, unlike a credit card, biometric data can’t be cancelled or replaced if it’s captured by a third party. And it’s constantly exposed, with fingerprints left everywhere and faces on permanent view.
However, contrary to widespread belief, most systems don’t store biometric data in the form of an image or recording, instead keeping a mathematical representation of the original characteristic. This representation is then hashed, or transformed by an algorithm, to create the authorisation code.
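In outline, the flow described above can be sketched in a few lines of Python. This is purely illustrative: the feature extractor below is a stand-in (real systems derive templates from fingerprint minutiae or voice spectra, not a checksum of the raw capture), and the function names are invented for the example.

```python
import hashlib

def extract_template(capture: bytes) -> list[int]:
    """Stand-in for a real feature extractor: reduce the raw capture to a
    small vector of numbers. Real systems use minutiae points, spectral
    features and so on -- this toy version just derives 8 small values."""
    digest = hashlib.sha256(capture).digest()
    return [b % 16 for b in digest[:8]]

def authorisation_code(template: list[int]) -> str:
    """Hash the mathematical representation; only this code is stored."""
    encoded = ",".join(str(v) for v in template).encode()
    return hashlib.sha256(encoded).hexdigest()

scan = b"raw fingerprint or voice capture..."
template = extract_template(scan)
stored = authorisation_code(template)
# The database keeps only `stored`: neither the image/recording nor the
# template itself, and the hash cannot be reversed to reconstruct either.
```

The point of the two-stage design is that what sits in the database is meaningless outside the matching system, exactly as Nuance describes below.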
This is the technique used by Nuance Communications, which supplies its FreeSpeech system to Barclays. Customers’ voiceprints are now used for authentication over the phone, rather than biographical information such as a mother’s maiden name.
“For the voice biometric to be able to identify you, we create a voiceprint akin to a fingerprint,” says Brett Beranek, head of voice biometrics for Nuance.
“But there’s a significant difference in that the voiceprint is just an alphanumeric string of numbers that only has meaning to the voice biometrics algorithm. It has no value anywhere other than in that system.”
This means, first, that it’s not possible to recreate a fingerprint or voice recording from the mathematical representation and, second, that if the data is ever compromised, new hashed indexes can be issued from the same fingerprint.
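The second property, re-issuing a new index from the same unchangeable fingerprint, can be illustrated by salting the hash per enrolment. This is a much-simplified sketch: production “cancelable biometrics” schemes also have to tolerate noisy readings (no two scans are bit-identical), which a plain hash does not, and the function names here are assumptions for the example.

```python
import hashlib
import secrets

def issue_code(template: bytes, salt: bytes) -> str:
    """Hashed index: the stored value depends on both the biometric
    template and a per-enrolment salt, so it can be replaced at will."""
    return hashlib.sha256(salt + template).hexdigest()

template = b"toy-fingerprint-template"   # the fixed biometric characteristic

salt_v1 = secrets.token_bytes(16)
code_v1 = issue_code(template, salt_v1)  # value held in the database

# Suppose code_v1 leaks in a breach. The finger can't be replaced,
# but the stored index can: re-enrol with a fresh salt.
salt_v2 = secrets.token_bytes(16)
code_v2 = issue_code(template, salt_v2)

assert code_v1 != code_v2  # the leaked code no longer matches anything
```

Because the salt changes, the leaked value is useless against the re-enrolled account, even though the underlying fingerprint is the same.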
“If a hacker got access to the database of voiceprints, there isn’t anything they could do with them; they couldn’t use them to authenticate anything,” says Mr Beranek.
The same applies to Apple’s Touch ID, now being used by Royal Bank of Scotland and NatWest customers to log in to mobile banking apps. The system encrypts fingerprint data and protects it with a key only available to the phone’s secure enclave.
“The secure enclave is walled off from the rest of the chip and the rest of iOS [mobile operating system],” says Apple. “Therefore, iOS and other apps never access your fingerprint data, it’s never stored on Apple servers and it’s never backed up to iCloud or anywhere else. Only Touch ID uses it and it can’t be used to match against other fingerprint databases.”
Other privacy concerns relate to the way data is used and shared. After Edward Snowden’s leak of classified information from the US National Security Agency, many consumers fear function creep, a concern highlighted by Sir John Adye, a former director of GCHQ, in his evidence to the government’s Science and Technology Committee.
“What happens to my personal data when I use them on a smartphone for proving my identity? Is Google going to use that data also to target advertising at me?” he asked.
“Is some other commercial company or maybe some hostile foreign government going to use it to target me in some other way? I don’t know. We need to find ways of getting that kind of system properly organised.”
This, though, is where data protection legislation comes in. “It’s personal data, so it needs to be compliant with the Data Protection Act. There should be transparency about what the data will be used for,” says Simon Rice, head of group technology at the Information Commissioner’s Office (ICO).
“It needs to be stored securely and destroyed when it’s no longer necessary so, if a member of a library cancels their membership, say, the library should destroy their fingerprint.”
All the same, there’s something uniquely personal about biometric data and the government’s Science and Technology Committee is currently examining the need for specialised guidelines. It’s received a submission from the Biometrics Institute, an impartial group set up by users rather than suppliers to advise on the safe use of biometric technology.
“We want to raise awareness of how biometrics actually works: for the public to understand what it means and that their data is handled in a responsible manner,” says the institute’s chief executive Isabelle Moeller.
“There are obviously data protection acts around the world with common criteria on how to handle personal information and biometric information should be handled in the same way as biographical information. But we could possibly add other criteria; one could be that the organisation has to conduct a privacy impact assessment.”
But perhaps one of the biggest misconceptions about biometric authentication is that it’s taking over completely from other security systems.
“Technology can always be hacked – we always say biometrics aren’t bullet proof,” says Ms Moeller. “They offer much higher security than a PIN and card, but we always take the line that the way forward is with multi-factor authentication.”
The ICO’s Mr Rice agrees. “We’ve only got ten fingers so there are only a certain number of possibilities. Passwords or personal questions such as your mother’s maiden name allow the individual a bit more control over what data they hand over,” he says.
“Responsible organisations will give people a range of options; if people want to use fingerprints, they can, or if they prefer a swipe card they could have that as an alternative. It’s something else to add to the mix of security measures – another weapon in the armoury.”