
Meta was finally held accountable for harming teens. Now what?



Meta lost a lawsuit brought by the state of New Mexico last week, marking the first time the company has been held liable in court for endangering child safety. This was a landmark decision on its own — but the next day, Meta lost another case when a jury in Los Angeles found that the company knowingly designed its apps to be addictive to children and teens, thereby endangering the mental health of the plaintiff, a 20-year-old known as K.G.M.

These precedents open the floodgates for a wave of lawsuits over Meta's intentional pursuit of teen users, despite its knowledge that its apps can harm teens' mental health. Thousands of cases like K.G.M.'s are pending, while 40 state attorneys general have filed lawsuits against Meta similar to New Mexico's case.

While social media platforms are legally shielded from responsibility for what users post on them — the protection at the heart of Section 230 — this time, it wasn't the content on these platforms that was on trial. It was the design features themselves, like endless scroll and round-the-clock notifications.

“They took the model that was used against the tobacco industry many years ago, and instead of focusing on things like content, they focused on these addictive features — how the platform is designed, and issues with the design, which is different than content, where you have this First Amendment argument,” Allison Fitzpatrick, a digital media lawyer and partner at Davis+Gilbert, told TechCrunch. “It turned out to at least be, in these two cases, a winning argument.”

The jury in the New Mexico case, after a six-week trial, found Meta liable for violating the state's Unfair Practices Act, ordering the company to pay the maximum $5,000 per violation, for a total of $375 million. The Los Angeles jury, which found Meta 70% liable and YouTube 30% liable for plaintiff K.G.M.'s distress, awarded a combined $6 million in damages. (Snap and TikTok settled the case before trial.)

“That’s nothing to the Metas of the world,” Fitzpatrick said. “But when you take that $6 million and you multiply it by all of the cases that they have against them, that becomes a huge number.”

“We respectfully disagree with these verdicts and will appeal,” a Meta spokesperson told TechCrunch. “Reducing something as complex as teen mental health to a single cause risks leaving the many, broader issues teens face today unaddressed and overlooks the fact that many teens rely on digital communities to connect and find belonging.”


Over the course of litigation, new internal Meta documents were revealed, showing a pattern of inaction regarding its platforms' known negative impact on minors, as well as a concerted effort to boost the time teens spend on its apps — even during school, or via "finstas," the "fake Instagram" accounts teens create specifically to hide from parents or teachers.

One document showed a report with the results of a study from 2019, in which Meta conducted 24 in-person, one-on-one interviews with people whose usage of the product had been flagged as problematic — a designation that applies to an estimated 12.5% of users. 

“The best external research indicates that Facebook’s impact on people’s well-being is negative,” the report says.

Multiple documents referenced statements from Meta CEO Mark Zuckerberg and Instagram head Adam Mosseri about prioritizing teen engagement and time spent. Zuckerberg even commented that in order for Facebook Live to succeed with teens, his "guess is we'll need to be very good at not notifying parents / teachers."

In other documents, Meta employees spoke flippantly about the company’s goals for increasing teen user retention.

“We learned one of the things we need to optimize for is sneaking a look at your phone in the middle of Chemistry :),” one employee wrote in an email to Meta CPO Chris Cox.

“No one wakes up thinking they want to maximize the number of times they open Instagram that day,” Meta VP of Product Max Eulenstein wrote in an internal email in January 2021. “But that’s exactly what our product teams are trying to do.”

A Meta spokesperson told TechCrunch that many of the newly released documents are from nearly 10 years ago but that the company is listening to parents, experts, and law enforcement about how the platform can improve.

“We do not goal on teen time spent today,” the spokesperson said, citing Instagram Teen Accounts, introduced in 2024, which offer built-in safety features for teenage users. These protections include defaulting accounts to private and only allowing people they follow to tag or mention them in posts. Instagram will also send time limit reminders telling teens to leave the app after 60 minutes, which can only be changed for under-16s with parental permission. 

For Kelly Stonelake, a former director of product marketing at Meta who worked at the company from 2009 to 2024, these revelations are unsurprising. (Stonelake is currently suing Meta for alleged gender-based discrimination and harassment.)

“The mountain of unsealed evidence really demonstrates what I experienced firsthand,” she told TechCrunch. 

At Meta, Stonelake led “go-to-market” strategies for the VR social app Horizon Worlds as it rolled out to teenagers. She alleges that she raised concerns over a lack of effective content moderation tools in the metaverse, but her objections weren’t taken seriously.

The U.S. government has taken a strong interest in the issue of children’s online safety, especially after Meta whistleblower Frances Haugen leaked damning internal documents in 2021 that showed Meta knew that Instagram was harming teen girls. 

While Congress has proposed numerous bills aimed at addressing children's online safety, many of these efforts would do more to surveil adults and censor speech than they would to protect minors, some privacy activists say.

“There is no universe where passing censorship or ‘age verification’ law, under the guise of kids safety, doesn’t lead to massive online censorship of content and speech that Trump doesn’t like,” Fight for the Future director Evan Greer said in a statement.

Stonelake once lobbied on Capitol Hill for the Kids Online Safety Act, which has had the most momentum of any of these legislative efforts, garnering support from companies like Microsoft, Snap, X, and Apple. But as the bill has evolved and changed, she has grown critical of it.

“I am urging a ‘no’ vote on the current version,” she said, citing the bill’s preemption clauses, which would override state regulations on tech companies. “There is language in the latest version that would close the courthouse doors to school districts, to bereaved families, to states — and that’s wild.”

This language could, for example, preempt the very case that New Mexico brought against Meta. 

“We need folks to come to the table with solutions, instead of what they’re doing now, which is just telling a different story to both sides of the aisle to rile them up and get them freaked out,” Stonelake said. “The actual solution is going to need to be complex and nuanced and consider multiple priorities.”


