Kenyan court draws line in the sand for Meta

  • The Employment and Labour Relations Court of Kenya has ordered that Meta cannot fire its moderation staff in the country amid a contentious legal battle.
  • The moderators allege that Meta and an outsourcing firm engaged in the unfair termination of their contracts and blacklisted them.
  • Meta has been ordered by the court to provide psychological support to the moderators.

The Employment and Labour Relations Court of Kenya has decreed that Meta and a content moderation company may not terminate any of the social media giant’s moderation staff in the East African country. At least not until an ongoing court case over allegedly unfair practices is finished.

In April, the same court dismissed a petition filed by Meta Platforms Inc., the owner of Facebook, WhatsApp and Instagram, to throw out the case brought against itself, its Irish branch and moderation outsourcing firm Samasource Kenya (Sama).

At least 260 moderators from across Africa allege that Sama, which supplied content moderators for Meta's platforms – people who sit through hours upon hours of videos, pictures and text on social media to determine whether the content complies with the platforms' policies – sought to terminate their employment without following the procedures set out in Kenyan employment law.

Sama reportedly wanted to fire the moderators after it lost its contract with Meta, and tried to do so without notice periods and other required procedures. The plaintiffs further allege that Meta handed the contract to another moderation company, one Majorel Kenya Limited, which it instructed to hire entirely new staff and to blacklist the ex-Sama moderators.

Meta's argument against the suit was that since it has no physical presence in Kenya – the moderators were working remotely – it is not subject to Kenyan jurisdiction. It also argued that since the staff were outsourced, they were not directly Meta's responsibility.

This didn't fly with Justice Mathews Nduma, who rejected Meta's application, ruling that since many of the spurned moderators are based in Kenya, Meta and Sama could indeed be sued.

“The court will consider the nature and extent of liability with regard to the alleged breaches and violations of the Constitution arising and or related to employment and Labour relations in Kenya,” Nduma explained at the time.

On 2nd June, the court via Justice Byram Ongaya issued a ruling (embedded below) that Meta Platforms Inc., Meta Platforms Ireland and Sama may not issue any notices or adjust the contracts of the moderators, “pending the hearing and determination of the petition.”

Furthermore, Ongaya concluded that the moderators are indeed part of Meta's staff, and the court ordered all three companies to extend any lapsed contracts at least until the case has been finalised. Meta has also been ordered by the court not to engage in any actions that could constitute blacklisting former Sama moderators from being hired by other firms, such as Majorel.

Finally, Meta will have to “provide proper medical, psychiatric and psychological care for the petitioners and other Facebook Content Moderators in place of ‘wellness counselling’” at least until the case is over.

The ruling goes in-depth into the psychological state of the petitioners, all former content moderators on Meta's platforms.

One content reviewer told the court that the “majority of the content I review daily includes mutilated or dismembered bodies, sadistic videos depicting man slaughter and burning of persons alive among others.”

They allege that Sama would often allow moderators who experienced distress to go to “wellness counselling” but that Sama’s counsellors “are not qualified psychiatrists or psychologists yet they are supposed to help us process such complex trauma.”

Meta's platforms, especially Facebook, have again and again come under fire for the content shared by their users and the company's inability to properly sift through it. Last year, another lawsuit accused Facebook of fuelling ethnic violence during the Ethiopian Civil War.

The most infamous case of Facebook and its damaging media was detailed in an Amnesty International report, which stated outright that the social media platform was used to substantially fuel the Rohingya genocide in Myanmar in 2017, in which thousands of ethnic Rohingya were killed by the Myanmar military.

[Image – Dima Solomin on Unsplash]
