California parents whose children become addicted to social media apps would be able to sue for damages under a bill advanced Tuesday in the state Assembly by a bipartisan pair of lawmakers.
Assembly Bill 2408, or the Social Media Platform Duty to Children Act, was introduced by Republican Jordan Cunningham of Paso Robles and Democrat Buffy Wicks of Oakland with support from the University of San Diego School of Law Children’s Advocacy Institute. It’s the latest in a string of legislative and political efforts to crack down on social media platforms’ exploitation of their youngest users.
“Some of these companies do indeed intentionally design features in their apps — that they know children are using — that cause the children to use it more and more and more, [and] exhibit signs of addiction,” Cunningham said in an interview. “So the question to me becomes … who should pay the social cost of this? Should it be borne by the schools and the parents and the kids, or should it be borne in part by the companies that profited from creating these products?
“We do this with any product you sell to kids. You have to make sure it’s safe. Some sort of stuffed animal or something that you’re selling to parents that are going to put it in their 5-year-olds’ bed — you can’t have toxic chemicals in it.… We just haven’t done that as a society, yet, when it comes to social media. And I think the time is now to do that.”
Media materials from the Children’s Advocacy Institute explain that the bill would first obligate social media companies to not addict child users — if necessary amending their design features and data collection practices — and then empower parents and guardians to pursue legal action in the name of any children injured by companies that fail to comply.
Damages could include $1,000 or more per child in a class-action suit or as much as $25,000 per child per year in a civil penalty, the institute said.
However, it said, there would also be a safe harbor provision that would protect “responsible” social media platforms from being penalized if they took “basic steps to avoid addicting children.” Companies with less than $100 million a year in revenue would also be excluded.
“I suspect you’ll see a range of potential [compliance] solutions,” Cunningham said. “There might be some companies that stop letting kids sign up; that’s probably the safest thing to do. But I don’t know that they’re going to do that. Whatever features within their algorithms that are creating the addictions, especially in teenagers — they can disable those features. That could be another thing.”
Calls to regulate social media companies have grown louder over the last few years, buoyed by a mounting backlash against companies such as Twitter, TikTok and Meta (formerly Facebook). Critics have focused on problems including the companies’ collection of user data, their role in shaping public discourse and their largely unilateral decisions about how to moderate — and not moderate — user content.
But the effect they have on children has been a particularly charged issue, and one that has proved uniquely conducive to across-the-aisle collaboration. The issue reached a fever pitch late last year when whistleblower and former Facebook employee Frances Haugen leaked documents indicating that the company was aware of the extent to which its subsidiary platform Instagram can harm young users’ mental health, especially when it comes to teenage girls and body-image issues.
In the aftermath of Haugen’s leaks and subsequent testimony before Congress, extensive bipartisan criticism of Big Tech coalesced around the effect of social media on underage users.
This month, California Atty. Gen. Rob Bonta helped launch a multi-state investigation into how TikTok may be preying on children. A few months earlier, Bonta launched a similar investigation into Instagram, also focused on young users.
In November, Ohio’s attorney general sued Meta for allegedly misleading investors about the effect its products can have on children, inflating its stock price in violation of federal securities laws.
And in January, a Connecticut mother filed a lawsuit against both Meta and Snapchat owner Snap for “defective design, negligence and unreasonably dangerous features” after her daughter took her own life last summer.
Case documents reported on by the Washington Post allege that Meta and Snap are responsible for a “burgeoning mental health crisis perpetrated upon the children and teenagers in the United States” and, more specifically, for “the wrongful death of 11-year-old Selena Rodriguez caused by Selena’s addictive use of and exposure to” the platforms.
Efforts to launch an Instagram Kids spinoff app were paused in the wake of Haugen’s whistleblowing. A similar product launched by YouTube in 2015, YouTube Kids, has proved more durable, with human curation replacing the main platform’s algorithmic content recommendations.
The theme of protecting children from the harms of social media even made an appearance in President Biden’s most recent State of the Union address.
“We must hold social media platforms accountable for the national experiment they’re conducting on our children for profit,” the president said.
Cunningham called Haugen’s leaks a “catalyst” for the new bill, though not its sole motivator.
“It’s something that had been on my mind — and my joint author Buffy Wicks’ mind as well — for a number of years,” he said. “We come at it from the standpoint of being legislators that are also parents. I’ve got four kids: three teenagers and a first-grader. And I have many, many friends that have confided in me over the last couple years that their kids, through the use of TikTok or Instagram or both, were suffering psychiatric issues: depression, body-image issues, in some cases even anorexia.”
Representatives from Twitter and Reddit declined to comment on the bill. A TikTok spokesperson said the company had not yet had a chance to review it in depth but added that it already has tools in place that facilitate screen-time management and disable push notifications for underage users at night.
A representative from Meta did not say if or how the company would change its apps’ policies, features or algorithms if the bill passed, instead pointing to its past rebuttals of how Haugen characterized Instagram’s mental health effect on teens. The representative noted that Meta on Wednesday launched a new resource center to help connect parents with social media supervision tools, alongside other safety features that were already in place.
Thanks to Section 230, a provision of the federal Communications Decency Act, internet platforms enjoy broad legal protection: they can host content their users post without being held liable for it themselves. Some lawyers describe the provision as a “brick wall” preventing any meaningful lawsuits against the tech giants.
The Cunningham-Wicks bill attempts to sidestep that wall by targeting the platforms’ algorithms rather than any specific content.
According to the Children’s Advocacy Institute, the bill will be heard by the Assembly’s Judiciary Committee sometime this spring. Cunningham said he hopes to get it to Gov. Gavin Newsom by September.