Facebook content moderators protest low wages with mobile billboard
Facebook moderators are protesting low wages at contracting firm Accenture with a mobile billboard and an internal letter, according to a tip received by The Verge. Most of all, the moderators say they just want to be treated as humans.
The content moderators, who are contracted through Accenture, will run a mobile billboard today targeting Accenture CEO Julie Sweet. The billboard features a picture of Sweet alongside a prominent statement: “Julie Sweet stop exploiting your workers. Pay up. Clean it up. Fix it.” Sweet made a total of $17 million in 2020 alone, a sharp jump from her 2019 compensation. The billboard will start its route in Bethesda, Maryland, where Sweet lives, and end in Washington, DC.
Big tech companies like Facebook often outsource content moderation to third-party contracting firms, and Accenture is one of the largest doing so. A report by The New York Times first revealed Accenture’s $500 million contract with Facebook to perform content moderation. The billboard contrasts that figure with what Accenture moderators are actually paid.
Hourly rates at Accenture fall between $16.50 and $18.50, with Ukrainian- and Spanish-language moderators at the lower end of that scale. They are excluded from the $2 premium given to bilingual moderators, which prompted a demand for equal pay earlier this year; nothing ever came of it.
“In expensive cities like Austin, with prices driven up by the tech industry, this falls far short of a living wage,” the moderators’ letter to Sweet reads. “For every US-based content moderator, in the Times’ estimation, you pocket $50.”
Accenture has continually ignored moderators’ pleas for higher pay. Facebook moderators are exposed to deeply disturbing material; lawyers suing Facebook estimated that half of all moderators may develop mental health disorders. They are tasked with cleaning up objectionable content across Facebook’s services, including violent videos, sexual content, and hate speech. According to our source, Accenture offers moderators access only to behavioral coaches, rather than professional psychological care.
AI does play a role in moderating content on Facebook, but as an anonymous content moderator told us, it mainly deals with benign posts. Human moderators are the ones tasked with sifting through the more disturbing content. “For all the talk of algorithms and ‘AI’ catching harmful content to protect Facebook’s 2.8 billion users, we — and you — know the truth: moderators do the work,” the letter asserts.
In the end, moderators at Accenture just want to be recognized as humans and feel valued for the work they do. “People often don’t realize that we are quite essential workers,” said the anonymous moderator during an interview with The Verge. “Without us, all of social media wouldn’t be useful at all.”