NY AG Issues NPRM for Kids Social Media Addictive Feeds Act
New York Attorney General Letitia James (D) issued a Notice of Proposed Rulemaking on Monday for the Stop Addictive Feeds Exploitation (SAFE) for Kids Act, which was signed into law in June 2024 (see 2406070065). The proposed rules offer guidance to social media companies on restricting their platforms' addictive features to avoid harming children's mental health.
The comment period on the rules is open until Dec. 1, 2025, the AG office said. Then, the office has a year to finalize the rules, which will go into effect 180 days after the final rules are released.
The SAFE Act requires that users younger than 18 see content from accounts in a set sequence, such as chronological order, unless a parent consents to a personalized algorithmic, or "addictive," feed. Social media platforms are also prohibited from sending notifications to underage users between midnight and 6 a.m. without parental consent.
Addictive feeds and nighttime notifications are "tied to depression, anxiety, eating and sleep disorders, and other mental health issues for children and teenagers," the AG office said.
The proposed rules include suggestions around age assurance, as social media platforms must verify that a user is an adult before the user can access algorithmic feeds or nighttime notifications.
The proposed rules suggest that users upload a photo or video, or submit an email address or phone number, to verify their age. However, "companies may confirm a user’s age using a number of existing methods, as long as the methods are shown to be effective and protect users’ data."
Companies must offer at least one age-assurance option that doesn't require uploading a government ID, and anything submitted cannot be used for another purpose and "must be deleted or de-identified immediately after its intended use," said the OAG. Companies must also test the accuracy of their age-assurance method annually.
If a minor wants an algorithmic feed or nighttime notifications, the minor must first authorize the social media company to seek parental consent; only then can the company obtain the parental OK. Consent from either the minor or the parent can be revoked at any time, and platforms aren't required to show parents the minor's search history to obtain consent.
The remainder of the 144-page collection of proposed rules contains standards for effective, secure, and privacy-protective age assurance, as well as standards operators must meet for valid parental consent.
Iain Corby, executive director of the Age Verification Providers Association, celebrated the rules in a press release. “The fact is that our members successfully completed more than one billion age checks in the last year," he said. "These age determinations are secure and protect user privacy at the highest levels."
State Senator Andrew Gounardes (D) also praised the Act and the release of the regulations. "I passed the SAFE for Kids Act in 2024 for one simple reason: I refuse to raise my children in a world where Big Tech profits at their expense," he said. "Big Tech spent millions last year to defeat this bill and continue trapping kids into addictive algorithms, leading to a youth mental health crisis and sky-high rates of depression, anxiety, suicidal ideation, and self-harm."