FILE – Facebook’s Meta logo sign is seen at the company headquarters in Menlo Park, Calif., on Oct. 28, 2021. Attorneys for a man who once worked as a content moderator for Facebook have filed a lawsuit accusing the company of exploitative and unsafe working conditions. The case against Meta Platforms, Facebook’s parent company, and the outsourcing firm Sama was lodged Tuesday, May 10, 2022, with a court in the Kenyan capital, Nairobi. (AP Photo/Tony Avelar, File)

KAMPALA, Uganda (AP) — A man who says he is “destroyed” after working as a content moderator for Facebook has filed a lawsuit accusing the company of trafficking Africans to work in an exploitative and unsafe facility in Kenya.

The case against Meta Platforms, the Menlo Park, Calif., company that owns Facebook, and Sama, a San Francisco subcontractor, was lodged Tuesday with a court in the Kenyan capital, Nairobi.

Daniel Motaung’s petition “calls upon Kenya’s courts to order Facebook and its outsourcing companies to end exploitation in its Nairobi moderation hub, where content moderators work in dangerous conditions,” said a statement by Foxglove, a London-based legal nonprofit that supports Facebook content moderators.

The first video Motaung watched as a Facebook moderator showed someone being beheaded, he told reporters during a call Tuesday. He stayed on the job for roughly six months after relocating from South Africa to Nairobi in 2019 for the work. Motaung says he was dismissed after trying to spearhead efforts to unionize at the facility.

Motaung said his job was traumatizing and he now has a fear of death.

“I had potential,” Motaung said. “When I went to Kenya, I went to Kenya because I wanted to change my life. I wanted to change the life of my family. I came out a different person, a person who has been destroyed.”

Motaung says in his filing that once he arrived in Kenya for the work, he was told to sign a non-disclosure agreement and his pay was less than promised, with one monthly paycheck of 40,000 Kenyan shillings, or roughly $350.

The lawsuit notes that Sama targets people from poor families across Kenya, South Africa, Ethiopia, Somalia, Uganda and other countries in the region with “misleading job ads” that fail to disclose that recruits will be working as Facebook content moderators or viewing disturbing content that exposes them to mental health woes.

Applicants are recruited “through deceit,” said Mercy Mutemi, who filed the petition in court Tuesday morning. “We found a lot of Africans were forced into forced labor situations and human trafficking. When you leave your country for a job that you didn’t apply for, that amounts to human trafficking.”

Content moderators are not given enough medical coverage to seek mental health treatment, the filing alleges.

The lawsuit also seeks orders for Facebook and Sama to respect moderators’ right to unionize.

Meta’s office in Nairobi said it takes seriously its responsibility to people who review content for the company and requires its “partners to provide industry-leading pay, benefits and support,” according to a statement issued by the company’s spokeswoman.

“We also encourage content reviewers to raise issues when they become aware of them and regularly conduct independent audits to ensure our partners are meeting the high standards we expect of them,” the statement said.

In 2020, Facebook agreed to pay $52 million to U.S. content moderators who filed a class action lawsuit after they were repeatedly exposed to beheadings, child and sexual abuse, animal cruelty, terrorism and other disturbing content.

Sama, which describes itself as an ethical AI company, did not immediately provide comment.

Sama’s Nairobi location is the largest content moderation facility in Africa, with approximately 240 employees working on the effort, according to the filing.

“We are not animals,” Motaung said in the statement. “We are people — and we deserve to be treated as such.”

Seitz reported from Washington.

Copyright 2022 The Associated Press. All rights reserved.