Privacy Advocates' Concerns About Facial Recognition Tech Grow With Increased Usage
As facial recognition technology grows in volume and diversity of application, privacy advocates are increasingly concerned about consent and the heavy privacy risks the technology's flaws carry, they said in interviews with Privacy Daily.
Facial recognition technology can be used to unlock a phone and identify a mugging suspect, and travelers are increasingly seeing it before boarding aircraft or while at customs. In-person face scans at airports that verify passengers match photos on passports or other IDs “tend to be more accurate on the whole [than other uses] and are much less ... privacy and rights invasive,” said Jake Laperruque, deputy director of the security and surveillance project at the Center for Democracy & Technology (CDT).
Yet uses of facial recognition technology often go beyond matching a photo of a person with one printed on a driver's license. For instance, a more complex use of the technology involves applying an algorithm to match the photo with characteristics stored in a database of face prints.
In addition, the technology can help track a person’s location around a city or determine demographics, said Adam Schwartz, privacy litigation director at the Electronic Frontier Foundation (EFF). “We're seeing this with age verification. ... They're scanning a face and saying, ‘Are you 18?’”
Another growing use is for broad identification scans, which help determine who's entering a store or ticketed event and ensure that someone on a prohibited list hasn't gotten in, Laperruque said. Law enforcement can also use facial recognition on a photo of a crime scene to try to track down a suspect.
“A lot of the [privacy] risks depend on the scenario,” Laperruque said. For example, a facial recognition malfunction at an event may lead to someone being forced to procure a physical ticket for entry, while a miscue with law enforcement could result in someone being incorrectly jailed, he noted.
Consumer Reports (CR) Policy Analyst Matt Schwartz agreed, saying “there's a meaningful difference between something like Clearview AI, which allows anybody to instantly recognize anybody else in public,” and private, retail usage of the technology (see 2510080016).
EFF's Schwartz raised concerns about the technology's potential “massive intrusion” of privacy, the deterrent effect on people attending protests, and facial recognition's potential mistakes. In addition, the technology “is having disparate impacts against women and against people of color in misidentifying them as someone who is suspicious,” he said.
"Wherever we go, we ... show our face," said the EFF official. "And we can't change our face, and there are cameras everywhere, and those cameras are increasingly networked together, and so willy-nilly [facial scans] can intrude on our location privacy.” He argued that “part of being a free person" is the ability to "walk around your community and have some anonymity.”
CR's Schwartz said, “It's always concerning that even if we had assurances that technology isn't going to be used in these harmful ways, that those promises can easily go away.”
Privacy Issues
The advocates emphasized that there's a difference between law enforcement use and private, commercial deployment of the technology.
“In terms of law enforcement … the overarching issue is that facial recognition presents a whole set of issues when it's accurate and a whole different set of issues when it's wrong,” Laperruque said. For example, when the technology is incorrect and information is not verified, it can lead to wrongful convictions.
But “even when it is accurate, that creates a whole range of civil liberties risks,” like documenting protesters or people attending religious or political events, Laperruque said. While some states have rules requiring a warrant when the tech is used, “in general, the rules are far too lax, and there's no nationwide rules on it,” he added. “There are very severe risks" that this could be used "to comprehensively catalog sensitive activities in a way that's dangerous and really needs some guardrails to rein it in.”
Adam Schwartz said that generally, corporations "don't need someone's permission" to take a photograph of their face. Some states have laws providing consumers rights to know about and delete information, "but in terms of the initial [taking of photos], only in two states are there any limits at all,” he said, referring to Illinois and Texas.
It’s similar in the government sector, where “most police in most places" can deploy the technology "to their heart's content,” he said. There are, though, 20 cities and counties that have banned facial recognition technology and a few more that require a warrant before it's used.
Matt Schwartz noted that Somerville, Massachusetts, was one of the first localities to ban the technology, right around when it “first made [its] way into the public consciousness.” Deployment of the technology waned a bit, since “a lot of the outcry … affected the rollout” and “deployment of these tools.”
The response also may have "chastened companies a little bit” and “scared them from rolling out some of the more egregious deployments of these technologies,” he added.
The EFF position is that "government should not be using it [at] all,” Adam Schwartz said. “It is simply too dangerous in the hands of the state, especially at this moment of rising authoritarianism.” But in the private sector, EFF draws the line at consent, he said. “If you want to take my face, ask me for permission.”
Increased Use, Doorbells and Consent
Laperruque noted that the technology's use is far from new and that it has been "steadily expanding." Now, though, there’s “more visibility” when it's used. “The more common it [becomes], the more we get insights about real world impacts," he said. But they still "are quite often hidden," especially police uses of the technology.
CR's Schwartz said the real-world examples are drawing awareness. Recent use cases are getting attention "for good reason.”
Adam Schwartz agreed. “With every passing year, there's a big increase in its use,” he said. “It is quite alarming.”
For example, Amazon recently announced it would be adding facial recognition technology to its Ring doorbell camera. “It is very disappointing that Amazon apparently is going to start selling a product that, by design, is going to be taking face prints from innocent people who are walking on a porch to sell cookies or to register people to vote,” EFF's Schwartz said.
“For a century or more, we have -- to a degree -- been used to the idea that if you walk about in public, some rando with a camera might take your picture,” he added. “But taking someone's face print is a completely different thing, because it empowers this stranger to pretty quickly figure out who you are.”
Matt Schwartz advised that there are “things that Ring should be thinking about doing in order to protect people's privacy,” such as consent. Amazon should make sure there's a mechanism to affirmatively consent to be actively identified by the system and ensure that face templates aren't taken if the subject isn't on that list, or at least “deleted immediately and not used by Amazon or anyone else ... for any other purpose.”
“There's probably a technical solution to getting consent there,” he said. “People shouldn't have … their facial template added to a database for others to use as they will, just because they walked up to someone's house, or, even worse, just because they were walking on the sidewalk near somebody's house.”
“The way that this has been rolled out seems pretty suboptimal to me,” the CR policy analyst added. “It seems like there's a lot missing in terms of controls, or at least we don't know how the controls of this technology are actually going to work in practice, because it doesn't seem like there's been a ton of ... public-facing documentation about it.”
Additionally, consent issues arose recently when Home Depot (see 2508050063) and Kmart Australia (see 2509180023 and 2509190001) were slammed for using facial recognition technology in their stores. EFF's Adam Schwartz said that in these instances, consent and how the tech was implemented are crucial.
“If a business says you cannot get in the door to my store without submitting to a face print, then there's no consent here,” he said, and ditto if “the only way you can pay is if you submit to face printing.”

But “if there are two checkout lines, and they are both moving more or less at the same speed … but one of them is taking your face prints,” that’s OK, he said, because it’s based on consent: a customer can choose either line.
Laperruque also said it's important for people to know that facial recognition technology is “much more broadly used than you would expect” and that “its accuracy is very limited and very circumstance-dependent."