Consumer Advocates: Regulation of Facial Recognition Like 'Wild West'
As the scope and use of facial recognition technology expand, privacy advocates are increasingly concerned about the lack of regulation, as well as carve-outs where rules do exist, they said in interviews with Privacy Daily. But some contended that existing laws already cover the technology.
Regulation of facial-recognition technology is, “by and large ... the Wild West,” said Adam Schwartz, privacy litigation director at the Electronic Frontier Foundation (EFF).
“The people of the United States are woefully unprotected when it comes to their biometric privacy -- and especially with face printing -- and so the best thing we can do is pass new laws,” he argued. But in the absence of that, “we should be ... taking a fresh look at old privacy laws.”
Consumer Reports (CR) Policy Analyst Matt Schwartz agreed. “Regulation of tech writ large is probably more challenging now than it has been in the last couple of years.”
Illinois is among the few states with a standalone biometric privacy law, along with Texas, which has the Capture or Use of Biometric Identifier Act (CUBI), and Washington state, which has its own biometric identifier statute. Some other states have laws that regulate biometrics in certain instances.
Called the Biometric Information Privacy Act (BIPA), the Illinois law defines biometric identifiers and biometric information, said privacy attorney Peter Berk of Clark Hill. Identifiers include things like retina scans, iris scans, fingerprints, or scans of one’s hand or face geometry. Biometric information, on the other hand, is data based on a biometric identifier used to identify an individual.
“BIPA has some very specific consent and disclosure requirements in it, including what you're collecting, how it's going to be used, how long it can be kept [and] when it's going to be destroyed,” Berk said. “The subject's consent is also required.”
It's an older statute, from 2008, that has “only recently gotten notoriety, mainly because [of] the large settlements it created in lawsuits,” Berk said.
But just because a state lacks a law similar to BIPA doesn't mean other measures can't be invoked to regulate facial-recognition technology, Berk said. In a Clearview AI case at the U.S. District Court for the Northern District of Illinois (docket 21-cv-0013), for example, the plaintiff alleged violations of the Virginia Computer Crimes Act, California's Unfair Competition Law and the New York Civil Rights Law in addition to BIPA (see 2205090043).
“We're seeing plaintiffs be creative and use other laws surrounding privacy, name and image misappropriation, and those types of laws and try to make claims that what the company is doing is wrongful,” Berk said.
But CR’s Schwartz said that in states with comprehensive privacy laws, “there are often requirements that they get consent before collecting sensitive information,” and biometrics are sometimes included in the definition of sensitive data. Yet carve-outs for fraud and security can limit the need for businesses to obtain consent, he noted.
“A lot of it depends on how the … facial recognition technology [is] deployed,” said Berk. “Is it done surreptitiously? Is it done in a very public setting or very private setting?”
Jake Laperruque, deputy director of the security and surveillance project at the Center for Democracy & Technology (CDT), said there has been bipartisan progress in states recently, with “some of the most conservative and some of the most liberal states" acting on facial recognition issues. While comprehensive data privacy rules can address commercial uses, he recommended a more tailored approach for law enforcement usage.
“Privacy is a big interest,” but there are really two interests for legislators: consumer privacy and business privacy, Berk said. “Hopefully," states are "trying to find the right balance between the two.”
For example, Amazon recently announced it's adding facial recognition to its Ring doorbell camera, which disappointed EFF and CR (see 2510100041). And while Berk said, “Consumer sentiment plays" into lawmakers' calculus, he's uncertain that additional regulation will outweigh the needs of business.
CR is agnostic about approaches to regulating facial recognition technology, Schwartz said. “It's more a matter of strategy,” depending on “if it's more likely that a state's going to be able to pass a standalone BIPA bill” or “if there's a chance of doing it all in one fell swoop in a comprehensive bill.”
Berk predicted the possibility of seeing “biometric-type regulation through some of the new statutes dealing with AI.” He noted that “we have a lot of broad statutes being passed to regulate AI, and some of those may sweep in ... the use of AI in connection with biometrics.”
Additionally, some states have genetic privacy statutes that could potentially be used to combat facial recognition technology, he said, but “the key issue there is, what is genetic information? Is it purely the chromosomal information, or is it broader?”
Regulators abroad tend to be “stricter on privacy,” and “biometric facial recognition information is sometimes specially categorized,” Berk added.
Overall, EFF's Schwartz is “very optimistic about protecting people's privacy.” He reasoned that "more and more people are more and more afraid and angry about what companies and government are doing to them with their own face and are demanding legislation, and there's only so long that Big Tech can keep its fingers in the dike and hold back the inevitable enactment of strong privacy legislation.”