Meta Hit with $375 Million Fine Over Child Safety Failures
A New Mexico jury has delivered a landmark verdict against social media giant Meta, ordering the company to pay $375 million in damages for endangering children on its platforms. The ruling came after a six-week trial where the state accused Meta of failing to protect minors from sexual abuse, online solicitation, and human trafficking on Facebook and Instagram.
Historic Verdict in Child Safety Case
The jury found Meta liable under New Mexico's Unfair Practices Act, determining that the company misled users about platform safety for children. This case represents one of the first major jury rulings specifically addressing child safety concerns on social media platforms, setting a significant precedent for future litigation.
New Mexico Attorney General Raúl Torrez hailed the decision as a "historic victory," saying Meta executives had prioritized profits over child protection. "Company leadership was fully aware of the risks but chose not to implement adequate safeguards while misleading the public about their platforms' safety," Torrez stated following the verdict.
Trial Details and Testimony
The extensive trial featured testimony from approximately 40 witnesses, including whistleblowers who provided crucial insights into Meta's internal operations. The court reviewed hundreds of internal documents and reports that allegedly demonstrated Meta's awareness of the dangers facing young users.
While the state had sought $2.2 billion in damages, the jury awarded the lower amount of $375 million. The lawsuit, originally filed in 2023, specifically accused Meta and CEO Mark Zuckerberg of failing to protect minors, with allegations that the company's algorithms actively directed harmful content toward younger users.
Meta's Response and Appeal Plans
Meta has announced it will challenge the ruling, stating it disagrees with the jury's findings. The company emphasized its ongoing investments in safety measures while acknowledging the inherent challenges of identifying harmful content and users across its massive platforms.
"We continue to dedicate significant resources to platform safety and recognize the complexities involved in monitoring content at our scale," a Meta representative said following the verdict.
Ongoing Legal Proceedings
The legal battle is far from over, with a second phase of the case scheduled to begin on May 4. During this phase, the court will consider additional penalties and potential changes to Meta's platform operations and safety protocols.
Meanwhile, a separate case in California is examining whether Meta and YouTube should be held responsible for harm caused to children through their platforms. The outcome of these cases could significantly impact similar lawsuits across the United States and potentially reshape how social media companies approach child safety.
This verdict comes at a time of increasing scrutiny of social media platforms' effects on young users, with lawmakers and regulators worldwide examining how to better protect children in digital spaces. The New Mexico case specifically highlighted how platforms can inadvertently expose minors to predators and harmful content despite safety claims.