Former TikTok Workers Speak Out: Inside Allegations of Workplace Bullying and Union-Busting
Breaking the Silence on Social Media’s Dark Side
For years, content moderators have worked behind the scenes of our favorite social media platforms, shielding users from the internet’s darkest corners. Now one former TikTok employee is stepping forward to describe what she calls a toxic workplace culture at one of the world’s most popular apps. Lynda Ouazar, who worked as both a content moderator and a quality control specialist at TikTok, is joining four former colleagues in legal action against the tech giant. It is the second lawsuit TikTok has faced from former UK employees in recent months, suggesting a growing pattern of workplace concerns at the company. Ouazar’s decision to speak publicly follows what she describes as months of sleepless nights, emotional flashbacks, exhaustion, and a complete loss of motivation: symptoms that point to the psychological toll the job took on her.
The Psychological Burden of Moderating Extreme Content
When Lynda first joined TikTok, she found the work meaningful and even enjoyable. She started as a moderator before moving into quality control, where she checked the work of external agency moderators, and initially felt she was making a positive contribution. Everything changed, however, when she was reassigned to a workflow handling the platform’s most disturbing content. Day after day, she was exposed to videos showing child sexual abuse, violence against women, self-harm, and hateful speech filled with slurs. “You don’t want to see children being sexually assaulted, you don’t want to see women going through all kinds of abuse, you don’t want to see people self-harming,” she explains. The constant exposure to this horrific material took a severe toll on her wellbeing. TikTok’s stated policies allow moderators to take breaks when needed and offer access to mental health support platforms, but Lynda and other moderators who have spoken with media outlets say the reality on the ground was dramatically different.
The Pressure Cooker: When Speed Trumps Safety
According to Lynda and other current and former moderators, the actual working conditions made taking necessary mental health breaks feel impossible. Workers describe being monitored by artificial intelligence systems throughout their workday, creating constant pressure to maintain speed and productivity regardless of how disturbing the content they encountered might be. “Moderators find themselves pressurised to deliver, so they have to carry on, even if you see something which really affects you and you feel like you have tears in your eyes,” Lynda reveals. She describes colleagues crying at their desks while continuing to review traumatic content, because missing productivity targets could affect their bonuses, job security, and salaries. This relentless pressure forces an impossible choice: moderators must weigh protecting their mental health against protecting their livelihoods. Even more concerning, Lynda argues that this pressure directly compromises user safety on the platform itself. When moderators work under intense time constraints while processing emotionally disturbing material, errors become inevitable, meaning harmful content that should be removed sometimes remains accessible to TikTok’s massive user base, which includes millions of children and teenagers. Despite these allegations, TikTok’s latest transparency report claims the platform removes more than 99% of harmful content before users even report it, and data collected under the EU’s Digital Services Act shows TikTok has the lowest error rates and highest moderation accuracy among major social media platforms.
When Union Membership Becomes a Target
After two years at TikTok, Lynda decided to join the United Tech and Allied Workers (UTAW) union and eventually became a union representative, a role she believes marked the beginning of systematic retaliation against her. The change wasn’t immediate, which made the pattern harder to recognize at first. “It took me some time, I would say a few months, to see the pattern,” she recalls. Gradually, she began experiencing what she characterizes as bullying, harassment, and deliberate exclusion from team activities and projects. Most strikingly, her performance ratings plummeted from the highest possible level to the lowest, yet management failed to provide an adequate explanation even after she formally filed a grievance. The situation became more troubling still when she noticed that other employees she had helped recruit into the union began experiencing identical treatment. What had initially seemed like isolated incidents began to look like a coordinated response to union organizing. Matters came to a head when TikTok launched a major restructuring program last year that changed how the company moderates content, and Lynda’s team was told it was at risk of redundancy. Of the 24 workers placed in this at-risk category, 11 ultimately lost their jobs. According to the lawsuit, every one of these terminated employees had been openly involved in union activities at TikTok, a statistical pattern the legal team argues is too significant to be coincidental.
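The “too significant to be coincidental” claim can be made concrete with a back-of-the-envelope calculation. The reporting gives only two figures, 24 at-risk workers and 11 dismissals; it does not say how many of the 24 were union members, so the membership counts in the sketch below are hypothetical assumptions chosen purely to illustrate the argument. Under purely random selection, the chance that all 11 dismissed workers happen to be union members follows a hypergeometric probability:

```python
from math import comb

AT_RISK = 24    # workers placed at risk of redundancy (figure from the article)
DISMISSED = 11  # workers who ultimately lost their jobs (figure from the article)

# The article does not report how many of the 24 were union members,
# so these counts are hypothetical, used only to illustrate the argument.
for union_members in (11, 14, 18, 22):
    # Hypergeometric probability: if 11 of 24 workers are dismissed
    # uniformly at random, what is the chance every one is a union member?
    p = comb(union_members, DISMISSED) / comb(AT_RISK, DISMISSED)
    print(f"{union_members}/{AT_RISK} union members -> "
          f"P(all {DISMISSED} dismissed are members) = {p:.1e}")
```

Even if 22 of the 24 had been union members, an all-union dismissal list would arise by chance only about 28% of the time, and at lower membership levels the probability collapses toward zero. None of this substitutes for the court’s analysis; it only shows why the pattern invites scrutiny.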
Legal Action and the Fight for Accountability
Stella Caram, head of legal at Foxglove, the organization helping to represent the former TikTok workers, frames the lawsuit as having implications far beyond these individuals. “In this case specifically, we want compensation for the workers. They have been unlawfully dismissed because they were engaging with union activities,” she explains. But the goals extend beyond financial compensation for those directly affected. “We wanted to make this a precedent because we’ve seen a lot of this happening across the world,” Caram adds, suggesting that the alleged union-busting and worker mistreatment at TikTok may reflect broader patterns in the tech industry. The legal action represents an attempt to hold one of the world’s most influential technology companies accountable for how it treats the workers who perform some of its most psychologically demanding labor. Content moderators operate in the shadows of the social media industry, doing essential work that protects users but often suffering in silence due to non-disclosure agreements, fear of retaliation, and the stigma surrounding mental health challenges. By speaking publicly and pursuing legal remedies, these former TikTok employees are challenging the notion that tech companies can treat moderators as disposable resources rather than valued employees deserving of proper support and protection.
TikTok’s Response and the Broader Implications
In response to these serious allegations, TikTok has issued a firm denial. A company spokesperson told media outlets: “We strongly reject these baseless and inaccurate claims. We have made ongoing enhancements to our safety technologies and content moderation, which are borne out by the facts: a record rate of violative content removed by automated technology (91%) and record volume of violative content removed in under 24 hours (95%).” The company’s response focuses on its technical performance metrics while not directly addressing the specific allegations about workplace culture, union-busting, or the psychological support provided to moderators. This disconnect highlights a fundamental tension in how we evaluate social media companies: should they be judged primarily by their content moderation statistics, or by how they treat the human workers who make those statistics possible? As this legal case proceeds, it may force a broader conversation about the true cost of keeping social media platforms safe and who ultimately bears that burden. The psychological trauma described by Lynda and her colleagues raises uncomfortable questions about whether the current model of content moderation is sustainable or ethical, and whether companies generating billions in revenue are adequately investing in the wellbeing of workers exposed to humanity’s worst behavior. The outcome of this case could set important precedents not just for TikTok, but for the entire social media industry’s approach to content moderation and labor practices.