Consumer Self-Help 'Unlikely to Fix the Dark Patterns Problem,' Law Paper Finds
Consumers are unlikely to protect themselves against privacy dark patterns, so legislation and regulation are needed, according to a paper published Wednesday in the University of New Hampshire Law Review. “This paper strongly suggests that dark patterns do prompt consumers to surrender more privacy than they otherwise would,” the authors wrote.
“Dark patterns are online interfaces that manipulate, confuse, or trick consumers into purchasing goods or services that they do not want, or into surrendering personal information that they would prefer to keep private,” wrote professors Matthew Kugler from Northwestern University, Lior Strahilevitz and Marshini Chetty from the University of Chicago, Chirag Mahapatra of Harvard and Yaretzi Ulloa from Yale.
“As new laws and regulations to restrict dark patterns have emerged, skeptics have countered that motivated consumers can and will protect themselves against these manipulative interfaces, making government intervention unnecessary,” they said. However, the professors’ paper “provides experimental evidence showing that consumer self-help is unlikely to fix the dark patterns problem.”
Potentially effective government “interventions may take the form of prohibitions on dark pattern interfaces and/or mandates that web sites and apps respect browser-based privacy preference signals,” the paper said.
The academics noted that they found an opt-out required by the California Consumer Privacy Act to be effective. “As long as consumers see the Do Not Sell option, a super-majority of them will exercise their rights, and a substantial minority will even overcome dark patterns in order to do so.”
For the study, the academics integrated common dark patterns like obstruction, interface interference, preselection and confusion into a video-streaming website’s privacy settings, said the paper. They found that the patterns were “strikingly effective at manipulating consumers into surrendering private information even when consumers were charged with maximizing their privacy protections and understood that objective.”
Also, the professors found that “nagging dark patterns, which manipulate consumers but do not deceive them, are highly effective even when used sparingly,” the paper said. “These nagging dark patterns convince users to adopt privacy settings consumers do not prefer, not by persuading consumers to change their preferences, but by making it clear that the user interface will not take no for an answer.”
Dark patterns work on “all kinds of Americans,” the academics noted. “Only less technologically literate people and those exhibiting more authoritarian personality dispositions stood out as especially vulnerable, and those effects were somewhat inconsistent.”