Removing child exploitation is “priority #1,” Twitter’s new owner and CEO Elon Musk declared last week. Yet following widespread layoffs and resignations, just one staff member remains on a key team dedicated to removing child sexual abuse content from the site, according to two people with knowledge of the matter, both of whom requested anonymity.
It’s unclear how many people were on the team before Musk’s takeover. On LinkedIn, WIRED identified four Singapore-based employees who specialize in child safety who said publicly they left Twitter in November.
The importance of in-house child safety experts cannot be overstated, researchers say. Based in Twitter’s Asian headquarters in Singapore, the team enforces the company’s ban on child sexual abuse material (CSAM) across the Asia Pacific region, which is home to around 4.3 billion people, about 60 percent of the world’s population. Right now, that team has just one full-time employee.
The team in Singapore is responsible for some of the platform’s busiest markets, including Japan. Twitter has 59 million users in Japan, second only to the number of users in the United States, according to data aggregator Statista. Yet the Singapore office has also been impacted by widespread layoffs and resignations following Musk’s takeover of the business. In the past month, Twitter laid off half its workforce and then emailed remaining staff asking them to choose between committing to work “long hours at high intensity” or accepting a severance package of three months’ pay.
The impact of layoffs and resignations on Twitter’s ability to tackle CSAM is “very worrying,” says Carolina Christofoletti, a CSAM researcher at the University of São Paulo in Brazil. “It’s delusional to think that there will be no impact on the platform if people who were working on child safety inside of Twitter can be laid off or allowed to resign,” she says. Twitter did not immediately reply to a request for comment.
Twitter’s child safety experts do not fight CSAM on the platform alone. They get help from organizations such as the UK’s Internet Watch Foundation (IWF) and the US-based National Center for Missing & Exploited Children, which also search the internet to identify CSAM being shared across platforms like Twitter. The IWF says that content matching the data it sends to tech companies can be removed automatically by company systems, with no human moderation required. “This ensures that the blocking process is as efficient as possible,” says Emma Hardy, IWF communications director.
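The IWF does not publicly detail its pipeline, but matching uploads against a list of fingerprints of known material is the standard mechanism behind this kind of automated removal. A minimal sketch in Python follows; the hash-list name, the use of exact SHA-256 digests, and both helper functions are illustrative assumptions. Production systems typically rely on perceptual hashes, such as Microsoft’s PhotoDNA, which also catch re-encoded or resized copies of an image.

```python
import hashlib
from pathlib import Path

# Hypothetical hash list: in practice organizations like the IWF distribute
# fingerprints of known CSAM to platforms. The set name and its contents
# here are placeholders for illustration only.
KNOWN_CSAM_HASHES: set[str] = {
    # "a1b2c3...",  # entries would be hex digests of known material
}

def sha256_of_file(path: Path) -> str:
    """Compute the SHA-256 digest of a file, streaming it in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def should_auto_remove(path: Path) -> bool:
    """Flag an upload for automatic removal if it matches the hash list,
    with no human moderator in the loop."""
    return sha256_of_file(path) in KNOWN_CSAM_HASHES
```

Because a match against the list is treated as authoritative, a lookup like this can run at upload time and block content before it is ever published, which is what makes the process efficient even with a small human team.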
But these external organizations focus on the end product and lack access to internal Twitter data, says Christofoletti. She describes internal dashboards as critical for analyzing metadata to help the people writing detection code identify CSAM networks before content is shared. “The only people who are able to see that [metadata] is whoever is inside the platform,” she says.
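Christofoletti does not specify what those dashboards compute, but the kind of metadata analysis she describes can be sketched as linking accounts that share signals such as a content hash or an upload IP address. Everything below is a hypothetical illustration; the record fields, sample values, and function name are assumptions, since only platform insiders see the real data.

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical upload records with the sort of metadata only a platform
# insider could access; fields and values are illustrative.
uploads = [
    {"account": "a1", "content_hash": "h1", "upload_ip": "203.0.113.7"},
    {"account": "a2", "content_hash": "h1", "upload_ip": "203.0.113.7"},
    {"account": "a3", "content_hash": "h2", "upload_ip": "198.51.100.4"},
]

def link_accounts(records, keys=("content_hash", "upload_ip")):
    """Connect accounts that share metadata values: a crude network signal
    that could surface coordinated CSAM activity before content spreads."""
    by_value = defaultdict(set)
    for record in records:
        for key in keys:
            by_value[(key, record[key])].add(record["account"])
    edges = set()
    for accounts in by_value.values():
        for pair in combinations(sorted(accounts), 2):
            edges.add(pair)
    return edges

print(link_accounts(uploads))  # {('a1', 'a2')}
```

The point of the sketch is that this signal exists only upstream of publication: external watchdogs see the shared content, while the metadata linking the accounts behind it never leaves the platform.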
Twitter’s effort to crack down on CSAM is complicated by the fact that it allows people to share consensual pornography. The tools platforms use to scan for child abuse struggle to differentiate between a consenting adult and an unconsenting child, according to Arda Gerkens, who runs the Dutch foundation EOKM, which reports CSAM found online. “The technology is not good enough yet,” she says, adding that’s why human staff are so important.
Twitter’s battle to suppress the spread of child sexual abuse on its site predates Musk’s takeover. In its latest transparency report, which covers July to December 2021, the company said it suspended more than half a million accounts for CSAM, a 31 percent increase compared to the previous six months. In September, brands including Dyson and Forbes suspended advertising campaigns after their promotions appeared alongside child abuse content.
Twitter was also forced to delay its plans to monetize the consenting adult community and become an OnlyFans competitor due to concerns this would risk worsening the platform’s CSAM problem. “Twitter cannot accurately detect child sexual exploitation and nonconsensual nudity at scale,” read an internal April 2022 report obtained by The Verge.
Researchers are nervous about how Twitter will tackle the CSAM problem under its new ownership. Those concerns were only exacerbated when Musk asked his followers to “reply in comments” if they saw any issues on Twitter that needed addressing. “This question should not be a Twitter thread,” says Christofoletti. “That’s the very question that he should be asking to the child safety team that he laid off. That’s the contradiction here.”