Meta Just Lost Big: Jury Says the Company Knowingly Harmed Kids for Profit

In a landmark victory for families and child safety advocates, a New Mexico jury just handed Meta a $375 million penalty after determining that the tech giant knowingly damaged children’s mental health and covered up child sexual exploitation on its platforms. This isn’t just another corporate fine; it’s a signal that the tide is finally turning against Big Tech.
After nearly seven weeks of testimony, jurors found that Meta, which owns Facebook, Instagram, and WhatsApp, prioritized profits over protecting young people. They agreed with prosecutors that the company made false or misleading statements about safety and engaged in “unconscionable” practices that took advantage of children’s vulnerabilities and inexperience.
The jury found thousands of violations, each counting toward the penalty. While $375 million might sound huge, it’s actually less than one-fifth of what prosecutors asked for. And honestly? Meta’s stock went up 5% after the verdict, suggesting Wall Street doesn’t think this will seriously hurt the company’s bottom line. Meta is valued at about $1.5 trillion, so this penalty is basically pocket change.
One juror, Linda Payton, revealed that the jury reached a compromise on how many teens were affected but decided to max out the penalty per violation. She said she believed each child deserved the maximum $5,000 penalty. That’s a powerful statement about valuing young people’s wellbeing over corporate interests.
The New Mexico case relied on an undercover investigation in which agents posed as children to document sexual solicitations and how Meta responded. The trial examined Meta’s internal documents and heard from whistleblowers, engineers, safety experts, and educators who’ve witnessed the real-world damage of these platforms, including sextortion schemes targeting kids.
Meta says it disagrees with the verdict and will appeal. The company says it works hard to keep people safe and that, while some harmful content slips through, it remains committed to platform security. But jurors weren’t convinced, especially after reviewing communications about platform safety from CEO Mark Zuckerberg, Instagram head Adam Mosseri, and Meta’s global safety chief Antigone Davis.
This case is part of a bigger wave: more than 40 state attorneys general have sued Meta for knowingly designing addictive features that contribute to a mental health crisis among young people. The verdict sends a message that companies can’t hide behind Section 230, the decades-old legal protection that shields tech platforms from liability for user-generated content. Prosecutors argued Meta should still be held responsible for using algorithms that deliberately push harmful material to kids to maximize engagement.
ParentsSOS, a coalition of families who’ve lost children to social media harms, called this a “watershed moment” in holding Big Tech accountable. There’s more to come, too: a judge will decide in May whether Meta created a public nuisance and should fund programs to address the harm it caused.
AUTHOR: mei
SOURCE: NBC Bay Area
























































