Kentucky AG Says Roblox Harms Children in Op-Ed
Roblox's new safety features are the result of pressure from the states and may not actually reduce the harms to children that occur on the gaming platform, argued the Kentucky attorney general Wednesday in an op-ed for the New York Post.
In July, Roblox said it would begin using age-estimation technology to filter content available to children, teens and adults on the platform (see 2507180059). Following legal action from several state enforcers -- including a suit from Kentucky alleging the facilitation of child exploitation (see 2510070042) -- the platform also announced it would partner with attorneys general on a youth online safety initiative (see 2510290020).
But Roblox “has a history of implementing safety tweaks that sound good in a press release but fall short in practice,” AG Russell Coleman (R) said in the op-ed, adding that the company pushed for safety only “in the face of lawsuits like ours.”
Filed in October, Kentucky’s lawsuit claims Roblox “did not create the safeguards necessary to keep predators from pretending to be children.” In addition, it failed “to design and implement age-appropriate guardrails to protect young users.” Further, Coleman said in the op-ed that his 10-year-old son was able to “outsmart” the “lax” age-verification on Roblox and create an account without his knowledge or permission.
In response, however, Roblox called the suit “sensationalized” and based on out-of-date information (see 2510080005).
“The company’s splashy public-relations rollout -- which came several weeks before the platform will actually go live in the United States -- made much of features like ‘facial age checks’ meant to limit communication between adult users and kids,” Coleman said. But that will “do nothing to eliminate the problem” of “allowing children as young as 5 to create accounts and chat with people they don’t know, without any parental consent.”
Though “we can hope” the safety measures are “a step in the right direction that truly mitigates this platform’s harmful nature,” the platform “hasn’t earned parents’ trust, and we shouldn’t take executives’ word that they’ve seen the light,” Coleman said.