Meta Found Liable for Harming Children’s Mental Health: A Landmark Legal Victory in New Mexico
Jury Delivers Historic Verdict Against Social Media Giant
In a groundbreaking decision that could reshape how social media companies operate, a New Mexico jury found on Tuesday that Meta Platforms, the parent company of Facebook, Instagram, and WhatsApp, has been knowingly harming children’s mental health while violating state consumer protection laws. After weighing evidence presented over a seven-week trial, the jury sided firmly with state prosecutors, who argued that the tech giant consistently chose profits over the safety and wellbeing of its youngest users.

The verdict specifically found that Meta violated multiple provisions of New Mexico’s Unfair Practices Act, with particular emphasis on how the company deliberately concealed what it knew about rampant child sexual exploitation occurring on its platforms and about the detrimental effects its services have on young people’s mental health. The jury determined that Meta engaged in deceptive business practices by making false or misleading statements about platform safety and, perhaps most damning, that the company engaged in “unconscionable” trade practices that targeted and exploited the natural vulnerabilities and lack of life experience that characterize childhood.

With the jury identifying thousands of individual violations, each potentially carrying its own penalty, the financial consequences could reach $375 million, marking one of the most significant legal defeats for a major technology company in recent memory.
Meta’s Response and the Broader Context of Social Media Litigation
Meta wasted no time in expressing its disagreement with the jury’s findings. Company spokesperson Andy Stone released a statement shortly after the verdict was announced, saying, “We respectfully disagree with the verdict and will appeal. We work hard to keep people safe on our platforms and are clear about the challenges of identifying and removing bad actors or harmful content. We will continue to defend ourselves vigorously, and we remain confident in our record of protecting teens online.” During the trial, Meta’s legal team argued that the company does disclose the risks associated with platform use and invests significant resources in identifying and removing harmful content, while acknowledging that some dangerous material inevitably slips through even the most sophisticated safety systems. Meta attorney Kevin Huff told jurors that “evidence shows not only that Meta invests in safety because it’s the right thing to do but because it is good for business,” and that “Meta designs its apps to help people connect with friends and family, not to try to connect predators.”

This case, however, represents just the tip of an enormous legal iceberg confronting the social media industry. New Mexico’s lawsuit was among the first to reach trial in what has become a tidal wave of litigation over social media platforms and their documented impacts on children’s development and safety. The timing is particularly significant as school districts across the country implement increasingly strict restrictions on smartphone use in classrooms and state legislators nationwide push for tougher regulations on how tech companies can interact with minors.
Parallel Cases and Growing Legal Pressure
The New Mexico trial, which began on February 9, is just one front in an expanding legal war against Meta and other social media giants. In federal court in Southern California, another jury has been deliberating for more than a week over whether Meta and YouTube should be held liable for harms their platforms have caused to children. That case is one of three so-called “bellwether” trials: legal test cases whose outcomes could establish precedents that influence the resolution of thousands of similar lawsuits currently working their way through the court system. The stakes couldn’t be higher for the entire tech industry.

Meta CEO Mark Zuckerberg himself was compelled to testify in the Los Angeles trial last month, where he addressed one of the fundamental challenges facing social media platforms: age verification. Zuckerberg told jurors that while Instagram’s terms of service explicitly prohibit users under 13 from creating accounts, enforcing the rule proves extremely difficult in practice because of “a meaningful number of people who lie about their age to use our services.” His admission highlights the cat-and-mouse game that has developed between platforms trying to comply with child protection laws and young users determined to access social media regardless of age restrictions.

Beyond these individual trials, legal pressure on Meta is intensifying from multiple directions. More than 40 state attorneys general have joined forces to file lawsuits against the company, collectively claiming that Meta is actively contributing to a mental health crisis among young people by deliberately designing Instagram and Facebook features that are psychologically addictive, keeping young users scrolling and clicking far beyond what might be healthy or appropriate.
The Undercover Investigation That Built the Case
What made New Mexico’s case particularly compelling was the methodology prosecutors used to build their evidence. The state conducted an extensive undercover investigation in which law enforcement agents created fake social media accounts designed to appear as though they belonged to children. These agents then documented in painstaking detail the sexual solicitations these accounts received and, crucially, how Meta’s safety systems responded, or failed to respond, to these predatory approaches. This real-world testing provided the jury with concrete evidence rather than theoretical arguments about platform safety.

The lawsuit, filed in 2023 by New Mexico Attorney General Raúl Torrez, also accused Meta of failing to fully disclose or adequately address the dangers of social media addiction. Meta has not officially acknowledged that social media addiction exists as a clinical condition, though company executives who testified during the trial did concede the reality of what they termed “problematic use,” acknowledging that some users develop unhealthy relationships with their platforms. These same executives testified that the company wants users to “feel good about the time they spend on Meta’s platforms,” though prosecutors argued this statement rang hollow given the company’s algorithmic design choices.

The legal battlefield has been complicated by the long-standing protections tech companies enjoy under Section 230 of the Communications Decency Act, a nearly 30-year-old provision that has generally shielded social media platforms from liability for content that users post. First Amendment protections have likewise made it historically difficult to hold platforms responsible for the speech that appears on their services. New Mexico prosecutors, however, crafted their case to argue that Meta bears responsibility not simply for hosting problematic content, but for its active role in amplifying and distributing that content through sophisticated algorithms designed to maximize engagement, particularly among young, vulnerable users.
The Evidence: Internal Documents and Expert Testimony
Throughout the trial, prosecutors presented what they described as damning internal Meta correspondence and reports related to child safety, painting a picture of a company that knew about serious problems but chose not to address them adequately. The jury heard testimony from a diverse array of witnesses who offered multiple perspectives on the issues at hand. Meta executives and platform engineers took the stand to defend the company’s practices and explain the technical challenges of moderating content at massive scale. Their testimony was countered, however, by whistleblowers: former Meta employees willing to speak publicly about what they characterized as a corporate culture that prioritized growth and engagement metrics over user safety.

The jury also heard from psychiatric experts who explained the documented effects of social media use on developing brains, and from tech-safety consultants who could speak authoritatively about industry best practices and where Meta’s efforts fell short. Perhaps most emotionally powerful was testimony from local public school educators, who described firsthand the disruptions they have witnessed linked directly to social media use, including the devastating impact of sextortion schemes targeting students in their schools.

State Chief Deputy Attorney General James Grayson framed the case simply in his closing arguments: “What this case is about is one of the biggest tech companies in the world taking advantage of New Mexico teens.” Prosecution attorney Linda Singer emphasized that Meta’s algorithmic choices have direct consequences: “We know the output is meant to be engagement and time spent for kids. That choice that Meta made has profound negative impacts on kids.” The jury, drawn from residents of Santa Fe County, which includes New Mexico’s politically progressive state capital, was asked to consider whether social media users were misled by public statements about platform safety made by Meta’s highest-ranking officials, including CEO Mark Zuckerberg, Instagram head Adam Mosseri, and Meta’s global head of safety, Antigone Davis.
What Comes Next: Implications and Future Proceedings
The verdict delivered by the New Mexico jury represents only the first phase of this legal battle. A second phase of the trial, tentatively scheduled for May, will proceed before a judge without a jury. During this phase, the court will determine whether Meta’s practices created a “public nuisance” under New Mexico law, a finding that could result in court orders requiring the company to fundamentally change how it operates within the state. Beyond imposing financial penalties, the judge could order Meta to implement specific remedies designed to address the harms identified at trial, potentially including enhanced age verification systems, algorithms that no longer prioritize engagement over safety for young users, more robust content moderation focused on child protection, and greater transparency about what the company knows regarding the mental health impacts of its platforms.

The implications of this verdict extend far beyond New Mexico’s borders. As one of the first cases in this wave of litigation to reach a jury verdict, it will influence how other cases proceed, how other juries might view similar evidence, and potentially how Meta and other social media companies approach settlement negotiations in pending cases. The finding that Meta engaged in “unconscionable” trade practices and made false or misleading statements could embolden other state attorneys general to pursue similar theories in their own jurisdictions.

For parents, educators, and child advocates, the verdict validates concerns they have been raising for years about the impact of social media on young people. It may also accelerate legislative efforts to impose stricter rules on how tech companies can design features and deploy algorithms when they know children are using their services, regardless of stated age restrictions. For Meta and the broader tech industry, the verdict signals that the era of light-touch regulation and self-policing may be ending, replaced by meaningful legal accountability for the choices these companies make in designing products used by hundreds of millions of young people worldwide.