Zuckerberg Testifies in Social Media Safety Trial: Beauty Filters, 4M Underage Users, and AI Glasses in Court

Mark Zuckerberg took the witness stand Wednesday in a Los Angeles courtroom, and his testimony in the landmark case was as revealing as it was dramatic. The trial, which experts are calling the social media industry's "Big Tobacco" moment, centers on whether Meta, YouTube, TikTok, and Snap deliberately designed their platforms to be addictive and harmful to young users.

The Tim Cook Outreach

Defense lawyers opened by painting Zuckerberg as proactive on teen safety, pointing to a 2018 email exchange between him and Apple CEO Tim Cook. "I thought there were opportunities that our company and Apple could be doing and I wanted to talk to Tim about that," Zuckerberg said. He added: "I care about the wellbeing of teens and kids who are using our services."

Beauty Filters: Free Expression vs. Harm

One of the sharpest exchanges came over Instagram's beauty and cosmetic surgery filters. A University of Chicago study involving 18 experts concluded that these filters cause harm to teenage girls. Despite this, Meta lifted a temporary ban on the filters.

Zuckerberg defended the decision: "I genuinely want to err on the side of giving people the ability to express themselves." He described a stricter approach as "paternalistic" and said it "feels a little overbearing."

A senior Meta employee, VP Margaret Stewart, had emailed internally that while she would support Zuckerberg's decision, she didn't believe it was the "right call given the risks" — citing a personal family situation that gave her "first-hand knowledge" of the alleged harms.

Engagement Goals: "Not Company Goals"

Lawyers presented a 2015 email thread in which Zuckerberg appeared to highlight improving engagement as an urgent company matter. He pushed back, saying the comments were "aspirational." But lawyers then presented evidence from Instagram chief Adam Mosseri showing internal targets to increase daily engagement time to 40 minutes in 2023 and 46 minutes by 2026.

Zuckerberg maintained the company uses internal milestones to measure performance against competitors, not to addict users.

4 Million Underage Users

Lawyers raised a document showing that 4 million children under 13 used Instagram in the U.S., despite the platform requiring users to be 13 or older. Zuckerberg acknowledged that some users lie about their age and said the company removes all underage users it identifies.

A plaintiff's lawyer fired back: "You expect a 9-year-old to read all of the fine print? That's your basis for swearing under oath that children under 13 are not allowed?" Zuckerberg also advocated pushing age-verification responsibility onto Apple and Google, which control mobile operating systems and app stores.

Meta AI Glasses in a Child Safety Trial

In one of the trial's more surreal moments, members of the team escorting Zuckerberg into the courthouse were photographed wearing Meta's own Ray-Ban AI glasses. Recording is prohibited in the courtroom.

Judge Carolyn B. Kuhl threatened contempt: "If you have done that, you must delete that, or you will be held in contempt of the court. This is very serious."

Can the Board Fire Him?

Lawyers also questioned whether Zuckerberg had previously lied about the board's ability to remove him. Referencing comments he made on Joe Rogan's podcast, Zuckerberg clarified in court: "If the board wants to fire me, I could elect a new board and reinstate myself." He also told the courtroom he is "very bad" at media.

Where the Trial Stands

Snap and TikTok settled with the plaintiff before the trial began. Meta and YouTube are still fighting the allegations. Meta's spokesperson told CNBC: "The question for the jury in Los Angeles is whether Instagram was a substantial factor in the plaintiff's mental health struggles."

Meta is also separately on trial in New Mexico, where the state's attorney general alleges the company failed to protect children from online predators. A third major trial is expected this summer in Northern California.

The Bottom Line

Wednesday's testimony exposed the gap between Meta's public messaging and its internal decision-making. Beauty filters stayed up despite harm warnings. Engagement targets existed despite denials. Millions of underage kids used the platform despite age requirements. Whether a jury finds Meta liable remains to be seen, but the testimony alone is damaging.