
Lawsuit by Kenyan Facebook Content Moderators Highlights Global Impact

by Madison Thomas

Nearly 200 former content moderators in Kenya are taking legal action against Facebook and its local contractor, Sama, alleging inhumane working conditions, in a case that could have far-reaching implications for social media moderators worldwide. The lawsuit, the first of its kind outside the United States, raises concerns about the toll the work takes on moderators’ mental health and the need for better support systems. The moderators, who were responsible for screening posts and removing harmful content, are seeking a compensation fund of $1.6 billion, citing inadequate mental health assistance and low wages as their key grievances. They also claim that both Facebook and Sama have disregarded a court order extending their contracts until the case is resolved.

Operating from Nairobi, Kenya’s capital, the group worked at Facebook’s outsourced content moderation hub, monitoring posts, videos, messages, and other user-generated content from across Africa. The job involved constant exposure to distressing material, such as child abuse and violence, which took a toll on their well-being. Initially proud to serve as protectors of online communities, the moderators eventually suffered trauma and received little support or transparency from the companies. Non-disclosure agreements restricted their ability to discuss their experiences, and personal items, including phones, were prohibited at work.

The moderators endured significant emotional strain, with some experiencing the resurfacing of past traumas related to political or ethnic violence in their home countries. Despite their dedication, the lack of mental health support and meager wages intensified their distress. Since Sama terminated them earlier this year, the moderators say their mental health has deteriorated further, leaving them grappling with the haunting images they encountered. Financial difficulties have also emerged, with many struggling to afford basic necessities or education for their children. Content moderators earned a monthly salary of $429, supplemented by a minor expat allowance for non-Kenyans.

Facebook’s contractor, Sama, based in the United States, allegedly provided insufficient post-traumatic counseling to moderators in their Nairobi office. Moderators reported that counselors were ill-equipped to address their specific needs. Consequently, without access to adequate mental health care, some moderators have sought solace in their religious beliefs. While Facebook’s parent company, Meta, maintains that its contractors are required to pay above-market wages and offer on-site support by trained practitioners, Meta declined to comment on the Kenyan case.

Sama, responding to the allegations, stated that the salaries provided in Kenya were four times the local minimum wage. The contractor further claimed that before their employment, a significant majority of male and female employees lived below the international poverty line, earning less than $1.90 per day. Sama also asserted that all employees had unrestricted access to one-on-one counseling without fear of reprisal. Regarding a recent court decision extending the moderators’ contracts, Sama expressed confusion and suggested that a subsequent ruling had paused the decision’s implementation.

Experts, such as Sarah Roberts from the University of California, Los Angeles, highlight the psychological damage content moderation work can cause. Despite the risks, workers in lower-income countries may accept such jobs for the chance to hold an office position in the tech industry. Roberts underscores the exploitative nature of outsourcing sensitive work to countries with abundant cheap labor, where firms often avoid responsibility by shifting blame to third-party contractors. Concerns have also been raised about the quality of mental health care provided and the confidentiality of therapy sessions.

This Kenyan court case represents a unique development as content moderators unite to challenge their working conditions, drawing unprecedented attention to the issue. In contrast to the common practice of settling such cases in the United States, the moderators’ resistance may pose challenges for companies facing similar legal actions in other countries. Facebook established moderation hubs globally in response to accusations of permitting hate speech in countries like Ethiopia and Myanmar, where ethnic conflicts led to numerous casualties. By employing moderators fluent in various African languages, Facebook aimed to address content related to local conflicts.

For moderators like Fasica Gebrekidan, who experienced firsthand the Ethiopian conflict while working as a content moderator, the job became a torturous ordeal. Fasica, haunted by the distressing videos and other content she encountered daily, now faces unemployment and an uncertain future. The impact on her mental health is profound, affecting her ability to write, while she fears that the graphic content will forever haunt her thoughts.

The fate of the moderators’ complaint now rests with the Kenyan court, with the next hearing scheduled for July 10. The uncertainty surrounding the case compounds the frustration felt by the moderators, some of whom are considering giving up and returning to their home countries. For individuals like Fasica, however, that option is not yet feasible.

Note: The original article was rephrased and condensed to provide a summary of the key points.

Frequently Asked Questions (FAQs) about Facebook content moderators in Kenya

What is the lawsuit about?

The lawsuit involves nearly 200 former Facebook content moderators in Kenya who are suing Facebook and its local contractor, Sama, over poor working conditions, including insufficient mental health support and low pay.

Where did the content moderators work?

The content moderators were employed at Facebook’s outsourced content moderation hub in Nairobi, Kenya’s capital.

What are the moderators seeking in the lawsuit?

The moderators are seeking a compensation fund of $1.6 billion for the alleged inadequate working conditions they experienced.

How are the moderators affected by their job?

The moderators face significant mental health challenges due to their exposure to disturbing and traumatic content, which takes a toll on their well-being. They also struggle with low wages and financial difficulties.

What support did the moderators receive?

According to the moderators, they received little support from Facebook and Sama. They claim that proper mental health care was lacking, and counselors were ill-prepared to handle their specific needs.

How does this lawsuit impact other content moderators?

The lawsuit has the potential to create implications for content moderators worldwide, highlighting the need for better working conditions and support systems in the industry.

What is the response from Facebook and Sama?

Facebook’s parent company, Meta, says its contractors are required to pay above-market wages and provide on-site support from trained practitioners, while Sama says the salaries it paid were four times the local minimum wage and that all employees had unrestricted access to counseling. Meta declined to comment specifically on the Kenyan case.
