Social Media Giants Face Historic Day in Court Over Child Safety Concerns
A Reckoning for Big Tech
The social media industry is facing its most significant legal challenge yet, as technology giants Meta and Google find themselves in courtrooms defending their platforms against serious allegations of deliberately harming children. Opening statements began Monday in Los Angeles County Superior Court in what could become a defining moment for the tech industry, comparable to the landmark tobacco trials of the 1990s.

The lawsuit claims that Instagram and YouTube have intentionally designed their platforms to addict young users, prioritizing profits over the wellbeing of children. While TikTok and Snapchat were originally named in the suit, both companies have settled for undisclosed amounts, leaving Meta and Google to fight what could be a precedent-setting battle.

The trial centers on a 20-year-old woman identified only as “KGM,” whose case serves as a bellwether that could determine the fate of thousands of similar lawsuits against social media companies. With additional trials launching simultaneously in New Mexico and more scheduled throughout the year, 2025 may mark the beginning of a fundamental shift in how society holds tech companies accountable for their impact on young minds.
The Case Against Social Media: “Addicting the Brains of Children”
Attorney Mark Lanier, representing the plaintiffs, delivered a compelling opening statement that framed the case as being as straightforward as “ABC,” which he defined as “Addicting the Brains of Children.” Lanier characterized Meta and Google as “two of the richest corporations in history” that have deliberately “engineered addiction in children’s brains” to boost their bottom lines.

The heart of his argument rests on damning internal documents from both companies that appear to contradict their public messaging about child safety. Perhaps most striking was Meta’s own research project, dubbed “Project Myst,” which surveyed 1,000 teenagers and their parents about social media use. The study revealed two critical findings: children experiencing trauma and stress were particularly vulnerable to addiction, and parental supervision and controls made little meaningful difference in protecting them. Lanier also presented internal Google documents comparing YouTube to a casino, as well as Meta employee communications describing Instagram as being “like a drug,” with employees functioning as “basically pushers.”

The lawsuit draws explicit parallels to Big Tobacco, noting that the companies “borrowed heavily from the behavioral and neurobiological techniques used by slot machines and exploited by the cigarette industry” to create deliberately addictive products. Lanier warned jurors that the defense would “try to blame the little girl and her parents for the trap they built,” urging them to see through this strategy.
KGM’s Story: From “Creative Spark” to Social Media Casualty
At the center of this legal battle is KGM, a young woman whose childhood experience with social media platforms forms the foundation of the lawsuit. Lanier painted a picture of a vibrant child whom her mother described as a “creative spark” before social media entered her life. She began using YouTube at just six years old and Instagram by age nine, long before she had developed the cognitive tools to understand or resist their addictive features. By the time she graduated from elementary school, she had already posted 284 videos on YouTube.

The lawsuit claims that her early and extensive use of these platforms addicted her to the technology and significantly worsened her mental health, contributing to depression and suicidal thoughts. What makes this case particularly significant is that KGM was a minor throughout the period in question, which the plaintiffs argue undermines any defense that places responsibility on the user.

She made a brief appearance during Lanier’s opening statement and is expected to testify later in the trial. Her testimony, along with that of Meta CEO Mark Zuckerberg and other executives, will likely prove crucial in determining the outcome. The trial is expected to last six to eight weeks, during which jurors have been instructed not to change their social media habits but to use the platforms as they normally would, an unusual directive that underscores how deeply these technologies are woven into everyday life.
Internal Documents Reveal a Different Story Than Public Relations
The most damaging evidence against Meta and Google may come from their own files. While both companies have publicly positioned themselves as champions of child safety, implementing various safeguards and parental controls, their internal communications tell a very different story. Documents presented by Lanier show that young children were explicitly identified as target audiences, contradicting public statements suggesting the platforms weren’t designed for such young users. Some Meta employees apparently expressed concerns internally about the company’s failure to address potential harms to children and teenagers, with one communication comparing the situation to the tobacco industry’s well-documented deception.

Lanier used the phrase “for a teenager, social validation is survival” to explain why features like “like” buttons are so psychologically powerful for young users. The lawsuit argues that the companies deliberately engineered these features to exploit minors’ deep craving for social validation and peer approval.

This internal evidence could prove critical in overcoming two major legal shields that have traditionally protected tech companies: the First Amendment and Section 230 of the Communications Decency Act, which generally shields platforms from liability for user-generated content. By arguing that the harm comes from deliberate design choices rather than specific content, the plaintiffs hope to sidestep these protections.
The Defense’s Position: Safeguards and Responsibility
Meta and Google strongly dispute the allegations, maintaining that they have made extensive efforts to protect young users and create safer online environments. A Meta spokesperson stated that the company “strongly disagrees with the allegations outlined in the lawsuit” and expressed confidence that “the evidence will show our longstanding commitment to supporting young people.” Similarly, José Castañeda, a Google spokesperson, flatly declared that the allegations against YouTube are “simply not true,” adding that “providing young people with a safer, healthier experience has always been core to our work.”

The companies are expected to point to the numerous safety features they’ve implemented over the years, including age restrictions, parental controls, time limits, and content filtering systems. They will likely argue that parents bear primary responsibility for monitoring and managing their children’s technology use, and that the companies themselves cannot be held liable for content posted by third parties on their platforms.

Judge Carolyn B. Kuhl has instructed jurors to evaluate the liability of Meta and YouTube independently during deliberations, suggesting the evidence against each company may differ in significant ways. The outcome of this trial could have profound implications for how these companies operate and whether they’ll need to fundamentally redesign their platforms to truly protect young users.
A Global Movement Toward Protecting Children Online
The Los Angeles trial is just one front in a growing worldwide movement to protect children from the potential harms of social media. On the same day, another trial began in New Mexico, where Attorney General Raúl Torrez has sued Meta over allegations that the platform failed to protect young users from sexual exploitation. In June, a federal bellwether trial, the first of its kind, will begin in Oakland, California, on behalf of school districts that have sued social media platforms over harms to children.

More than 40 state attorneys general have filed lawsuits against Meta specifically, claiming Instagram and Facebook deliberately addict children and contribute to the youth mental health crisis. TikTok faces similar lawsuits in over a dozen states.

Beyond American courtrooms, countries around the world are taking legislative action. France has approved a ban on social media for children under 15, set to take effect in September. Australia has banned social media use for those under 16, and companies have already revoked access to approximately 4.7 million accounts identified as belonging to children. The British government is considering similar restrictions as it tightens laws around child protection and screen time.

Sacha Haworth, executive director of the nonprofit Tech Oversight Project, emphasized the scope of the issue: “This was only the first case—there are hundreds of parents and school districts in the social media addiction trials that start today, and sadly, new families every day who are speaking out and bringing Big Tech to court for its deliberately harmful products.” Whether through courts or legislation, it appears that society is finally demanding accountability from an industry that has profited enormously while the mental health of young people has declined significantly.