Meta Hit With $375 Million Verdict for Failing to Protect Kids Online


A New Mexico jury has hit Meta with a $375 million verdict for violating state consumer protection laws by failing to safeguard children from predators on Facebook and Instagram. The verdict marks one of the largest penalties ever imposed on a social media company over child safety failures.

The civil trial, which began in Santa Fe last month, centered on a damning 2023 undercover operation by New Mexico Attorney General Raúl Torrez. His office created a fake social media profile of a 13-year-old girl that was “simply inundated with images and targeted solicitations” from child abusers — exposing exactly how easily predators can reach minors on Meta’s platforms.

What the Jury Found

Jurors determined that Meta willfully violated New Mexico’s Unfair Practices Act across multiple counts. The $375 million penalty was calculated based on the number of individual violations — and prosecutors had initially pushed for damages exceeding $2 billion.

Perhaps most damaging were the internal Meta documents revealed during the trial. Company employees discussed how CEO Mark Zuckerberg’s 2019 decision to make Facebook Messenger end-to-end encrypted by default would affect the company’s ability to file approximately 7.5 million child sexual abuse material reports with law enforcement.

“Meta executives knew their products harmed children, disregarded warnings from their own employees, and lied to the public about what they knew,” Attorney General Torrez said in a statement following the verdict.

Meta’s Response

Meta has vowed to appeal. “We respectfully disagree with the verdict,” a Meta spokesperson said. “We work hard to keep people safe on our platforms and are clear about the challenges of identifying and removing bad actors or harmful content.”

The company also argued that prosecutors cherry-picked internal documents to paint an unfair picture, though Torrez countered: “All we’re doing is showing the world what they knew behind closed doors and weren’t willing to tell their users.”

The Big Tobacco Comparison

Legal experts have compared this case — and similar ones pending against other social media companies — to the landmark Big Tobacco lawsuits of the 1990s. Both involve allegations that companies knowingly misled the public about the safety and potential harms of their products.

A second phase of the trial is set for May 4, when a judge will determine whether Meta created a public nuisance and should fund programs to address the harms. Prosecutors are also pushing for Meta to implement meaningful changes to its platforms, including effective age verification, proactive removal of predators, and safeguards to prevent encrypted communications from shielding bad actors.

The Bottom Line

This verdict sends a clear message: the era of social media companies hiding behind Section 230 to avoid accountability for child safety is ending. Meta knew its platforms were dangerous for children and chose profits over protection. A $375 million fine is significant, but it’s a rounding error for a company worth over $1.5 trillion. The real question is whether the Phase 2 ruling forces actual product changes — that would hurt Meta far more than any fine.