I Was a Meta Whistleblower. This Week's Landmark Verdicts Validate What Parents and Advocates Have Been Saying for Years.
Kelly Stonelake worked at Meta for nearly 15 years, including as Director of Product Marketing, and is a federal whistleblower and advocate for legislative reform. She also writes this Overturned newsletter on tech accountability and broken systems of power, and serves on the Advisory Council for ParentsRISE!, a survivor-parent-led movement demanding accountability from Big Tech. This article, which reflects the personal opinions of the author, is an excerpt of what originally appeared on the Overturned by Kelly Stonelake substack.
Juries in Los Angeles and New Mexico delivered what history will record as a turning point: accountability has arrived for Big Tech.
Car seat manufacturers and furniture companies can’t sell products that crush kids without consequences; neither can Mark Zuckerberg.
A California jury found Meta and YouTube responsible for the depression and anxiety of a young woman who compulsively used social media beginning in early childhood, awarding her $3 million in compensatory damages, and another $3 million in punitive damages. The jury found that both companies acted with malice, oppression, or fraud.
One day earlier, a separate jury in New Mexico ordered Meta to pay $375 million for failing to protect young users from child predators on Instagram and Facebook, finding that Meta had flouted state consumer protection laws.
I spent nearly 15 years at Meta. My career ended after I raised alarms about harm to children. I watched leadership prioritize protecting the company over protecting kids. I provided sworn testimony to the FTC, and my lawsuit against the company is currently in discovery in Federal Court.
What these juries saw in those internal documents is what I witnessed firsthand: a company that studied harm, documented harm, calculated the revenue from harm, and then tried to ensure those documents were never seen.
Those documents detailed things like a 13-year-old valued at $270 in lifetime advertising revenue. An internal Meta analysis titled “Long Term Retention: The Young Ones are the Best Ones” found that children who join Facebook as tweens have three times the long-term retention of adult users, and explicitly directed the company to “prioritize tweens over all other age groups.”
When Instagram CEO Adam Mosseri took the stand, he compared Instagram addiction to binge-watching Netflix, in a courtroom filled with families who lost children to Instagram. He testified that he earns approximately $10 million per year, much of it tied to performance indicators like stock price. When asked about teen safety, he claimed it was more important than growth. The jury got to weigh that claim against internal documents where his own employees described the platform as “a drug” and themselves as “basically pushers.” They got to weigh it against evidence that both Mosseri and Zuckerberg failed to act when whistleblower Arturo Béjar personally notified them of harm to teens, on body image, mental health, and exposure to predators.
Jurors found Meta and YouTube negligent in the design or operation of their platforms and determined that their negligence was a substantial factor in causing harm to the plaintiff, Kaley, a young woman who first started using YouTube at age 6. Instead of a case about content, this lawsuit was about the product design itself, the infinite scroll, the algorithmic feed, the engagement loops deliberately built to maximize time on platform regardless of the cost to children’s lives.
When Meta whistleblower Brian Boland, who spent more than a decade at the company as Vice President of Partnerships and later Vice President of Advertising Technology, testified that products were routinely launched without meaningful safety testing, and that senior leadership rebuffed him when he suggested studying the impact of News Feed on people who use it, he was describing a culture I recognized. One where I was told directly not to take notes on Meta’s knowledge of kids using Horizon Worlds or to create any discoverable record, and one where I was asked to silence other concerned employees as a test of my skill.
The verdict validates what whistleblowers, survivor parents, and advocates have been saying for years. The companies knew. They profited. They lied about it in public, in Congress, and in courtrooms. Mark Zuckerberg told the Senate Judiciary Committee in 2024 that the science doesn’t support a link between social media and harm to children. The jury in Los Angeles disagreed, finding that Meta bears 70% of the responsibility for Kaley’s harms.
And this is only the beginning. This bellwether verdict may influence the outcome of thousands of other pending lawsuits.
To the survivor parents who have been advocating for years, turning grief to action, many of whom traveled from across the country to sit in that courtroom holding photographs of their children: you led this. To Kaley, who had the courage to put her story on the public record, and the legal team supporting her: you changed history today.
Meta’s response to this trial has been like its response to every accountability moment. Announce a new safety feature. Generate thousands of pieces of press. Offer parents the appearance of control. A recent independent evaluation of Instagram’s Teen Accounts tested 47 of 53 listed safety features and found that 64% were either no longer available or ineffective. Only 17% worked as described. When the tobacco industry faced evidence that cigarettes caused cancer, it responded with light cigarettes and cartoon mascots. Teen Accounts are the modern equivalent: a sop to worried parents and regulators, designed to preserve profit while avoiding real accountability.
The science has said that teens struggle to self-regulate because their brains aren’t finished developing. The evidence has shown these companies won’t self-regulate because their business model depends on addiction. Today, 12 ordinary people in a Los Angeles courtroom answered the question we’ve been asking for years: whose side are we on? They chose the kids.