Meta lawsuits Ghana: Content moderator mental health impact from violent content & poor conditions. (Image generated by DALL-E).


Meta Lawsuits Ghana: Worker Mental Health Impact

By Darius Spearman (africanelements)

Support African Elements at patreon.com/africanelements and hear recent news in a single playlist. Additionally, you can gain early access to ad-free video content.

Giant tech companies often present a sleek, modern image. Behind the screens, however, a hidden workforce faces immense pressure. Meta, the parent company of Facebook and Instagram, is now facing serious legal challenges in Ghana. These lawsuits stem from the severe mental health toll experienced by content moderators (Meta’s content moderators face worst conditions yet at secret Ghana operation). These workers, essential for keeping harmful material off platforms used by billions, are often left traumatized and unsupported.

This situation in Ghana isn’t an isolated incident. It echoes similar battles fought by moderators in Kenya, highlighting a disturbing pattern across the African continent. As Meta expands its operations, questions arise about its responsibility towards the well-being of its global workforce, particularly those in the diaspora performing hazardous digital jobs. Ultimately, the experiences of these moderators reveal the human cost behind our curated social media feeds.

Ghana’s Crisis: Content Moderator Mental Health

In Accra, Ghana, individuals hired by Teleperformance, a Meta contractor, perform the grueling task of content moderation. Their job involves sifting through graphic and violent material posted on Meta’s platforms. Many report suffering severe mental health consequences, including depression, anxiety, and even substance abuse (Meta’s content moderators face worst conditions yet at secret Ghana operation). The constant exposure to extreme content, like violence and child abuse imagery, takes a heavy toll.

The working conditions exacerbate this trauma significantly. Moderators allege they are given incredibly short timeframes, sometimes only 20 to 60 seconds, to decide whether a piece of content violates policy (Meta’s content moderators face worst conditions yet at secret Ghana operation). This intense pressure not only increases the psychological burden but also likely leads to higher error rates. Tragically, one moderator attempted suicide after prolonged exposure to horrific child abuse material, highlighting the dire lack of adequate, long-term psychiatric support provided by the company (Meta’s content moderators face worst conditions yet at secret Ghana operation). Meanwhile, Teleperformance claimed it offered “robust wellbeing programs,” yet reportedly fired workers who requested leave specifically for trauma recovery (Meta’s content moderators face worst conditions yet at secret Ghana operation).

Content Review Time Pressure in Ghana

20–60 seconds: Time allocated per content review decision, increasing trauma and error potential.
Data reflects conditions reported by moderators in Accra. Source: Meta’s content moderators face worst conditions yet at secret Ghana operation.

Echoes from Kenya: Precedent for Meta Lawsuits Ghana

The unfolding situation in Ghana mirrors events that have previously occurred in Kenya. In 2024, a significant lawsuit was filed against Meta and its then-contractor, Samasource, by 190 Kenyan content moderators (Content moderators sue Meta and outsourcing firm). These workers claimed they developed Post-Traumatic Stress Disorder (PTSD) from constantly reviewing violent and disturbing content. Crucially, medical professionals confirmed these diagnoses, adding weight to their claims.

An earlier, pivotal case was brought by Daniel Motaung in 2022. His lawsuit shed light on the shocking conditions faced by moderators in Nairobi, revealing they earned as little as $1.50 per hour while being exposed to horrific content, including beheadings and child abuse (Meta facing lawsuit over the poor working conditions of content moderators). Furthermore, Meta initially tried to argue that Kenyan courts lacked jurisdiction. However, the Kenyan judiciary ruled that Meta could indeed be sued locally (Meta & Sama lawsuit (re poor working conditions & human trafficking)). This landmark decision set a vital precedent, potentially paving the way for similar legal actions against tech giants across Africa, including the current cases in Ghana.

Kenyan Moderator Lawsuits Against Meta

190: Kenyan moderators sued Meta/Samasource in 2024 over PTSD from content review.
$1.50/hr: Reported hourly wage for Kenyan moderators viewing extreme content (2022 lawsuit).
Jurisdiction upheld: Kenyan courts ruled Meta could be sued locally, setting a precedent.

Key figures illustrating the scale and nature of the Kenyan content moderation disputes.

Exploitation Unveiled: Teleperformance Working Conditions

Beyond the psychological trauma, content moderators in Ghana report facing systemic financial exploitation. Workers allege that performance bonuses were often withheld for failing to meet opaque targets, effectively driving their hourly pay below $3 (Meta’s content moderators face worst conditions yet at secret Ghana operation). This practice constitutes wage theft, making it challenging for workers to earn a stable, livable income, despite the hazardous nature of their jobs.

Furthermore, invasive surveillance adds another layer of control and pressure. Meta utilizes sophisticated AI systems to closely monitor its moderators. These systems track details like screen time and even mouse movements (Meta facing lawsuit over the poor working conditions of content moderators). Workers have reported being penalized for taking necessary breaks, including trips to the bathroom. Such constant monitoring creates a hostile work environment, compounding the stress already induced by the content itself.

African Tech Worker Rights Under Scrutiny

Advocacy groups are shedding light on the broader patterns of exploitation within Meta’s labor chain across Africa. Foxglove Legal, an organization supporting tech workers, discovered disturbing living conditions for some moderators in Ghana. These workers were housed in monitored accommodations suffering from basic utility shortages, like inconsistent water and electricity (Meta’s content moderators face worst conditions yet at secret Ghana operation). This level of control extends beyond the workplace, creating a coercive environment.

Moreover, the precarious situation of many moderators, particularly those recruited from conflict zones or neighboring countries, is exploited. These individuals often fear deportation if they speak out or protest against the poor working conditions (Workers in Africa challenge Meta over digital exploitation). This fear silences dissent and allows exploitative practices to continue unchecked, raising serious questions about the rights and protections afforded to African tech workers in the global digital economy.

Reported Conditions for Moderators in Ghana

Low effective pay: Wage theft via missed opaque targets drives pay below $3/hour.
Intense surveillance: AI tracks screen time and mouse movements; breaks are penalized.
Poor housing: Monitored accommodations with water and electricity shortages reported.
Deportation fears: Migrant workers fear protesting conditions due to deportation risk.

Overview of challenging conditions faced by content moderators employed via contractors in Ghana.

Denials Amidst Evidence: Facebook Content Moderation Standards

Despite the mounting evidence and harrowing testimonies from workers, both Meta and its contractors often deny or downplay the allegations. Teleperformance, the contractor in Ghana, asserted that its moderators receive “robust wellbeing programs” (Meta’s content moderators face worst conditions yet at secret Ghana operation). This claim starkly contrasts with reports of workers being fired precisely for requesting time off to deal with work-induced trauma.

Similarly, Meta states that its contractors are required to provide “industry-leading pay, benefits and support” (Mental trauma: African content moderators push Big Tech on rights). Yet, the company has reportedly refused to disclose specific wage benchmarks or details that would allow independent verification of these claims. Consequently, this lack of transparency makes it difficult to assess whether contractors truly meet these alleged standards, especially when workers themselves report low pay, wage theft, and inadequate support systems.

The legal battles in Ghana and Kenya represent a crucial fight for accountability. They force us to confront the hidden human labor that powers social media and the responsibilities tech giants hold towards their entire global workforce. For workers in the African diaspora, these cases are about more than just compensation; they are about dignity, safety, and demanding fair treatment from some of the world’s wealthiest corporations. Indeed, the outcome of these lawsuits could have far-reaching implications for tech worker rights across the continent and beyond.

ABOUT THE AUTHOR

Darius Spearman is a professor of Black Studies at San Diego City College, where he has been teaching since 2007. He is the author of several books, including Between The Color Lines: A History of African Americans on the California Frontier Through 1890. You can visit Darius online at africanelements.org.