Should you be worried about voiceprints?
The technology has tremendous potential for growth, but it worries some privacy advocates.
Your voice carries all kinds of information about you: your approximate age, gender, attitude and emotional state. Most valuably, it identifies you. A 30-second recording is enough for a computer to create a biometric profile that can pick you out by measuring your unique vocal traits. The term of art for a biometric vocal profile is “voiceprint” – like a fingerprint.
A surging industry sells voiceprinting software to governments and companies that create databases to fight crime, prevent fraud and enhance personal security. One company’s research says most people in the world will have created a voice biometric profile by the end of this decade. But privacy advocates worry that voiceprint databases will raise problems of consent, anonymous speech and tracking, and they question how effective the technology really is.
“I don’t see a great security value in voice biometrics,” said Jeremy Gillula, staff technologist at the Electronic Frontier Foundation, a leading non-profit focused on protecting privacy amid new technologies. Although Gillula is not an expert in voice biometrics specifically, he is familiar with biometric technologies and the security and privacy issues they raise.
“Let me put it this way. I certainly wouldn’t put anything that I really wanted to keep secret or safe behind any sort of voice biometrics.”
But many people already have a voiceprint, perhaps without knowing it. The Associated Press reported in October that voice biometric companies have harvested and entered “more than 65 million voiceprints into corporate and government databases.”
The business of voiceprints
Madrid-based AGNITiO is one of those voice biometric companies. Its clients range from global surveillance markets – the US Department of Defense, the Royal Canadian Mounted Police and similar organizations in Spain, France and other countries – to financial markets – banks like BBVA and BNP Paribas – to Verint, an intelligence company that in turn serves North American clients like Verizon and IBM.
Jesus Aragon, vice president for corporate development at AGNITiO, said most voiceprints so far are for fraud prevention and criminal identification. Governments or banks create voiceprint databases of people who have committed crimes, or potential fraudsters, then use those biometric data to confirm a suspect’s identity.
“This is being used today to send people to jail,” Aragon said.
It takes a recording of about 30 seconds to make a reliable voiceprint, Aragon said. “You already called the bank several times, and those banks already recorded your voice. So they can create a voiceprint from those recordings that will be used to authenticate next time you call.”
Banks and software developers are just beginning to use this technology commercially, but “it’s about to explode,” Aragon said. “This is just the beginning.” AGNITiO’s research predicts that by 2019, 5 billion people will have created a voiceprint with their banks or their mobile phone applications.
AGNITiO is a member of the FIDO (“Fast Identity Online”) Alliance, which includes companies in technology, security and equipment manufacturing. The Los Angeles Times reported the 150-member non-profit group is developing common guidelines for new login systems, like biometrics, that would avoid storing user information in huge databases susceptible to hacking.
How will you interact with voiceprints?
The Associated Press reported in October on a few niches for voiceprints: securing an apartment, accessing a bank account, signing contracts, and monitoring parolees. Another niche for biometrics, including voiceprints, is in securing and accessing a smart device.
Todd Mozer is CEO of Sensory, a small Silicon Valley company that develops algorithms for voice biometrics that ship in Samsung products, among others.
“I think everybody realizes now that passwords aren’t secure,” Mozer said. Moreover, the cloud “has really heightened security concerns”—it’s susceptible to hacking—so his company avoids the cloud problem by storing encrypted data locally.
Mozer said, and other sources agreed, that using combinations of biometric data to unlock a device seems like the best future option for personal security. When Sensory combines two or more types of biometrics, for example using face and voice recognition to unlock a phone, its error rate drops to 0.03 percent. Sensory’s voice verification alone, according to Mozer, has an error rate of 1 percent when confirming a person’s identity. For identifying someone out of a large group, that number rises.
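The article gives only the voice-alone rate (1 percent) and the combined rate (0.03 percent). A back-of-envelope sketch shows how such a drop is plausible if the two checks fail independently: the combined false-accept rate is roughly the product of the individual rates. The 3 percent face-recognition rate below is an illustrative assumption, not a figure from Sensory.

```python
# Back-of-envelope: combining independent biometric checks.
# Reported figures: voice alone ~1% error, combined ~0.03%.
# The face-alone rate is an assumption chosen for illustration.

voice_error = 0.01   # 1% false-accept rate, voice alone (reported)
face_error = 0.03    # 3% false-accept rate, face alone (assumed)

# If the two checks fail independently, an impostor must fool both:
combined_error = voice_error * face_error

print(f"Combined error rate: {combined_error:.2%}")  # prints 0.03%
```

Under the independence assumption, multiplying the two rates reproduces the 0.03 percent figure Mozer cites; real biometric errors are not perfectly independent, so actual combined rates can be somewhat higher.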
Why do voiceprints worry privacy advocates?
Biometrics for smartphones raise few issues beyond how effective they are at stopping thieves and how securely the data are encrypted. But in large-scale security operations, like those run by the state, law enforcement and, increasingly, banks, questions of consent and privacy rights arise.
“Certainly with the report about the use of big banks, we see that there hasn’t been proper consent,” said Jay Stanley, senior policy analyst with the ACLU’s Speech, Privacy and Technology Project. Stanley also edits the ACLU’s Free Future blog.
He’s referring to the Associated Press’s report on banks’ use of voice biometrics to stop fraud. The AP reported seven American financial institutions are using blacklists created with voiceprints to alert them to fraudsters. In a document detailing Israel-based NICE Systems Ltd.’s creation of a blacklist for American banks, NICE advised American banks to change the message beginning each call to say: "This call may be monitored, recorded and processed for quality assurance and fraud prevention purposes." Playing this message—specifically the word “processed”—would establish customers’ consent to have a voiceprint created for them, the document said.
This suggested notice “seemed intentionally tailored,” Stanley said, “to avoid giving people a clear idea of what was going on. I don’t think that constitutes proper consent at all.”
Governments have used voice biometrics for years. The Intercept reported in August that the NSA has used voiceprinting to help Turkey kill rebels from the Kurdistan Workers’ Party (PKK). The NSA gave Turkey “access to a state-of-the-art speech recognition system that enabled real-time analysis of intercepted conversations,” and Turkey provided voice samples of targeted activists in exchange, The Intercept reported. A January 2007 document leaked by Edward Snowden said voice samples from PKK members “yielded actionable intelligence that led to the demise or capture of dozens of PKK members in the past year,” The Intercept reported.
Biometric dragnet: the problem of false positives
Clifford Neuman, director of the University of Southern California’s Center for Computer Systems Security, said he’s not aware of legal issues with voiceprinting, so long as there is consent before one’s biometrics are gathered and included in a database. One potential issue, Neuman said, is when voiceprints gathered for one purpose are then used for an unauthorized purpose, like surveillance. Presumably companies’ privacy policies would limit what voiceprints are used for, he said, but “that doesn’t mean they necessarily follow it.”
Jeremy Gillula from the Electronic Frontier Foundation said there is reason to worry about exactly that—using biometric data for an unauthorized purpose. This is an ongoing problem with facial recognition, according to the EFF. The FBI’s Next Generation Identification (NGI), an enormous biometric database, will include 52 million photos by 2015, according to documents EFF obtained through a FOIA lawsuit. The NGI database will comprise photos taken for both criminal and non-criminal purposes, like job applications.
Gillula said the issue with NGI is that no biometric is perfect – not facial recognition, and certainly not voiceprints. There will always be false positives, people mistakenly identified as other people. A false positive could mean that a person whose photo was taken for a job application, then swept into an FBI database, is wrongly identified in a criminal investigation.
“If there are thousands of people checked, then that can mean dozens of people every year who are dragged into this and have their lives disrupted, when there was no real good reason to do it.”
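Gillula’s “thousands checked, dozens disrupted” point is a base-rate calculation: even a small false-positive rate, multiplied across many searches, produces a steady stream of wrong matches. The 1 percent rate and 5,000 annual searches below are illustrative assumptions, not figures from the EFF or the FBI.

```python
# Back-of-envelope: false positives in a biometric dragnet.
# Both numbers below are assumptions for illustration; the article
# says only that thousands of checks can disrupt dozens of lives.

false_positive_rate = 0.01   # assumed 1% error per database search
searches_per_year = 5_000    # assumed annual checks against the database

# Expected number of innocent people wrongly flagged each year:
expected_false_matches = false_positive_rate * searches_per_year

print(f"Expected false matches per year: {expected_false_matches:.0f}")
```

At these assumed rates the database would flag roughly 50 innocent people a year, which is the order of magnitude (“dozens”) Gillula describes; lowering the error rate helps, but no biometric drives it to zero.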