EC: Meta and TikTok May Have Violated Digital Services Act
Meta and TikTok may have breached provisions of the Digital Services Act (DSA) on transparency and user empowerment, the European Commission said Friday. Neither company commented immediately.
The DSA governs intermediary service providers such as social media platforms, online marketplaces and search engines, with the strictest obligations reserved for very large online platforms (VLOPs) and search engines that have at least 45 million monthly active users in the EU (see 2311100001). Among other provisions, the law bars VLOPs from using minors' personal data for targeted advertising (see 2210040001).
The EC tentatively found that TikTok and Meta breached their obligation to give researchers adequate access to public data. In addition, it tentatively found that Meta's Instagram and Facebook failed to offer users simple mechanisms for notifying the companies of illegal content and challenging content moderation decisions.
Transparency is a key element of the DSA, EC officials said at a Friday background briefing. One important aspect is researchers' ability to request access to publicly available information on platforms so they can check, for example, whether users, including minors, are being exposed to illegal or harmful content.
However, Facebook, Instagram and TikTok may have implemented onerous procedures and tools that make it difficult for researchers to request access, preventing them from carrying out those checks.
The DSA requires platforms to install a "notice and action" mechanism that lets users flag illegal content such as child sexual abuse. EC officials said Facebook and Instagram have fallen short here, offering systems that are too difficult to use and that employ deceptive interface designs known as "dark patterns." The EC has received many complaints about the inadequacy of the companies' notice-and-action systems, officials said.
This isn't a case about how platforms address illegal content, officials said at the briefing, but about how they implement the DSA-required systems that let users report content and products they deem illegal.
The DSA also gives users the right to challenge content-moderation decisions when platforms remove their material or suspend their accounts, but Facebook's and Instagram's systems aren't user-friendly, making them ineffective, the EC said.
The companies can now respond to the findings, the EC said. If the VLOPs are ultimately found to have breached the DSA, they could be fined up to 6% of their total annual worldwide revenue.
DSA rules can make the difference in whether unsafe, illegal products and scam ads are taken down and whether kids are protected from content inciting self-harm, European Consumer Organisation Director-General Agustin Reyna said Friday. He urged the EC to focus on enforcement actions with deterrent effect.