Texas Joins List of States Suing Roblox for Harms, Exploitation of Children
Texas announced a lawsuit against Roblox on Thursday in state court, alleging the gaming platform violated state and federal online safety laws through its conduct toward children. The suit follows similar litigation filed against the platform by several other state attorneys general.
Texas alleged that Roblox misrepresented itself as a safe digital space, while in reality exposing children to sexually explicit content, exploitation and grooming. The platform has “intentionally marketed Roblox to pre-teen children,” the suit said, and there's no “type of age or identity verification to sign up.”
Roblox also failed “to require age-appropriate designations on the millions of experiences created by third parties,” the suit said, giving “children access to highly inappropriate experiences.” The complaint was brought under the Texas Deceptive Trade Practices Act.
“We cannot allow platforms like Roblox to continue operating as digital playgrounds for predators where the well-being of our kids is sacrificed on the altar of corporate greed,” said Texas Attorney General Ken Paxton (R) in a press release. “Roblox must do more to protect kids from sick and twisted freaks hiding behind a screen.”
In an email to Privacy Daily, a Roblox spokesperson said the platform was "disappointed" that "the AG has chosen to file a lawsuit based on misrepresentations and sensationalized claims," instead of "working collaboratively" with the company. The platform shares "Paxton’s commitment to keeping kids and teens safe online, which is why we have implemented industry-leading protocols in an effort to protect users and remove bad actors."
Roblox "prohibit[s] the sharing of images and videos in chat, use[s] filters designed to block the exchange of personal information" and ensures that "trained teams and automated tools continuously monitor communications to detect and remove harmful content," the spokesperson added.
Roblox also recently announced it's working to implement age estimation for all users accessing chat features, "which will help prevent adults from chatting with minors," the email said. Doing so will be "a first for the industry" (see 2507180059).
Kentucky (see 2510070042) and Louisiana (see 2508140051) have sued Roblox for harms to children. A class action with similar charges was also filed against Roblox in early 2024 (see 2402200046).
Florida’s AG has also subpoenaed Roblox for information about children’s safety on the platform, as well as whether its actions, or failures to act, aided bad actors (see 2510200041 and 2504160045).
In response to increasing scrutiny from state enforcers, Roblox announced on Oct. 29 that it would partner with AGs on a youth online safety initiative (see 2510290020). Despite this, a former FTC privacy attorney expressed skepticism regarding the platform’s motives following this announcement (see 2510310016).