Instagram Boss Claims Social Media Isn’t Clinically Addictive During Landmark Trial
The Distinction Between Addiction and “Problematic Use”
In a groundbreaking legal case that could reshape how we understand social media’s impact on mental health, Adam Mosseri, the head of Instagram since 2018, has made a controversial statement that has sent ripples through the tech world and beyond. Taking the stand in a Los Angeles courtroom, Mosseri firmly stated his belief that people cannot become clinically addicted to social media platforms like the one he oversees. This testimony came as part of a landmark trial in which major social media companies are facing serious accusations of deliberately creating products that are both harmful and addictive to users, particularly young people.

Mosseri’s position centers on what he describes as a critical distinction between true clinical addiction, a recognized medical condition, and what he and his Instagram colleagues prefer to call “problematic use.” According to Mosseri, problematic use occurs when someone finds themselves spending more time scrolling through Instagram than they feel comfortable with or intended to. He acknowledged that this phenomenon “definitely happens” among Instagram’s massive user base, but maintained that it does not constitute clinical addiction in the medical sense.

When pressed about a specific case involving a 20-year-old plaintiff identified as KGM, who reportedly spent 16 hours in a single day on Instagram, Mosseri conceded that this would qualify as problematic use. However, he was careful to repeatedly remind the court that he is not a medical professional, potentially distancing himself from making definitive statements about addiction that could carry legal or medical weight.
The Body Image Filter Controversy Comes to Light
Perhaps even more revealing than his stance on addiction was what the trial exposed about internal debates at Instagram regarding features that could harm users’ mental health and body image. Court documents revealed that Meta, the parent company of both Instagram and Facebook, banned body image filters in 2019. These are filters that give users the appearance of having undergone plastic surgery or other cosmetic procedures.

However, emails presented as evidence painted a more complicated picture of Instagram’s commitment to this ban. The internal communications showed that Mosseri and others at Instagram, including Facebook founder Mark Zuckerberg himself, had seriously discussed reversing this ban, albeit with some modifications. The proposal on the table was to lift the ban entirely but remove these problematic filters from the app’s recommendation section, making them less visible but still available to users who actively sought them out.

This middle-ground approach was internally acknowledged as presenting a “notable wellbeing risk” to users, but the emails suggested it would have less negative impact on user growth than maintaining the complete ban. This admission is particularly significant because it appears to show executives weighing the mental health of users against the platform’s growth metrics, exactly the kind of calculation that critics of social media companies have long suspected takes place behind closed doors.
Internal Resistance from Instagram’s Own Teams
What makes this controversy even more interesting is that not everyone within Meta’s ecosystem was on board with lifting the filter ban. Several Instagram employees working in policy, communications, and wellbeing departments pushed back against the proposal to reverse the ban. Perhaps most notably, Nick Clegg, the former Deputy Prime Minister of the United Kingdom who had transitioned to become Meta’s vice president of global affairs, was among those arguing to maintain the prohibition on plastic surgery filters. In an email that was shown in court, Clegg warned that reversing the ban would lead to justified accusations that the company was “putting growth over responsibility.”

This internal pushback reveals a tension that exists within these massive tech companies between those focused on user growth and engagement and those tasked with protecting user wellbeing and the company’s reputation. The fact that these concerns were raised by senior figures like Clegg demonstrates that the potential harms were well understood at the highest levels of the organization. Ultimately, the ban on plastic surgery filters remained in place, and when questioned about this during his testimony, Mosseri attempted to frame his involvement as an effort to “balance all the different considerations” before arriving at what he agreed was the right decision.
The Revenue Versus Safety Debate
During his testimony, Mosseri also addressed the broader question of whether social media companies prioritize financial performance over user safety, a criticism that has dogged the industry for years and sits at the heart of this trial. He pushed back against what he characterized as a false choice, stating: “Often people try to frame things as you either prioritise safety or you prioritise revenue. It’s really hard to imagine any instance where prioritising safety isn’t good for revenue.” This argument suggests that keeping users safe and keeping them engaged on the platform are not contradictory goals but rather complementary ones. The logic would be that users who feel safe and have positive experiences on Instagram would be more likely to continue using it, thereby supporting the company’s revenue model.

However, this perspective has been met with skepticism from critics who point to the internal emails and documents revealed during the trial as evidence of a more complicated reality. The fact that Instagram executives were even considering reversing the plastic surgery filter ban despite acknowledging wellbeing risks suggests that growth considerations do sometimes take precedence over safety concerns, at least in internal deliberations if not in final decisions.
Criticism from Advocates for Bereaved Families
Mosseri’s testimony has drawn sharp criticism from those representing families who believe their children have been harmed by social media platforms. Matthew P. Bergman, the founding attorney of the Social Media Victims Law Center, issued a strongly worded statement characterizing Mosseri’s sworn testimony as revealing what bereaved families have suspected all along. According to Bergman, the testimony demonstrated that “Instagram’s executives made a conscious decision to put growth over the safety of minors.” He argued that Mosseri’s admission strikes at the very heart of what the trial is attempting to prove: that the harms caused by Instagram and similar platforms were not unfortunate accidents or unintended consequences, but rather “the result of deliberate design choices that prioritised engagement over children’s wellbeing.”

Bergman pointed to the evidence presented in court as showing that Instagram’s leadership was fully aware of the risks their product posed to young users. Despite this knowledge, he argues, they continued to deploy features that were specifically engineered to keep children and teenagers online for longer periods, even when those features exposed them to significant psychological and emotional danger. This perspective frames the issue not as one of well-meaning technology companies struggling to balance competing interests, but as a case of corporate negligence in which profit motives overrode the duty of care toward vulnerable young users.
The Broader Implications of This Landmark Case
This trial represents much more than just a legal battle over one company’s practices. It could fundamentally change how society understands and regulates social media’s impact on mental health, particularly for young people. The question of whether social media can be genuinely addictive in a clinical sense has enormous implications. If courts and medical professionals ultimately determine that these platforms can create true addiction, it could open the door to regulation similar to that applied to gambling, tobacco, or other potentially addictive products and activities.

The testimony and evidence presented in this case are providing an unprecedented glimpse into the internal decision-making processes of one of the world’s most influential social media companies. The revealed emails and discussions show executives grappling with difficult questions about user wellbeing, growth, and responsibility, and they suggest that these considerations don’t always point in the same direction, despite Mosseri’s assertions to the contrary.

As this trial continues, it will likely influence ongoing debates about whether and how to regulate social media companies, particularly regarding their youngest users. Several countries and jurisdictions are already considering or implementing restrictions on social media use by children and teenagers, and the evidence emerging from this courtroom could provide ammunition for those advocating for stricter controls. Whatever the ultimate outcome, this case has already succeeded in forcing a public conversation about the design choices that shape our digital experiences and the responsibility that tech companies bear for the mental health and wellbeing of their billions of users around the world.