Facebook moderators press for pandemic safety protections
“After months of allowing content moderators to work from home, faced with intense pressure to keep Facebook free of hate and disinformation, you have forced us back to the office,” said the open letter released by the British-based legal activist firm Foxglove.
The letter called on Facebook to “keep moderators and their families safe” by maintaining remote work as much as possible and offering “hazard pay” to those who do come into the office.
When the pandemic hit, Facebook sent home most of its content moderators — those responsible for filtering violent and hateful images as well as other content which violates platform rules.
But the social platform discovered limits on what remote employees could do and turned to automated systems using artificial intelligence, which had other shortcomings.
The letter said the current environment highlights the need for human moderators.
“The AI wasn’t up to the job. Important speech got swept into the maw of the Facebook filter — and risky content, like self-harm, stayed up,” the letter said.
“The lesson is clear. Facebook’s algorithms are years away from achieving the necessary level of sophistication to moderate content automatically. They may never get there.”
The petition said Facebook should consider making the moderators full employees, a group that in most cases may continue working remotely through mid-2021.
“By outsourcing our jobs, Facebook implies that the 35,000 of us who work in moderation are somehow peripheral to social media,” the letter said.
“Yet we are so integral to Facebook’s viability that we must risk our lives to come into work.”
Facebook did not immediately respond to an AFP request for comment.
Source: AFP.