Elon Musk’s job cuts decimated Twitter team tackling child sexual abuse
Elon Musk has dramatically reduced the size of the Twitter team devoted to tackling child sexual exploitation on the platform, cutting the global team of experts in half and leaving behind an overwhelmed skeleton crew, people familiar with the matter said.
The team now has fewer than 10 specialists to review and escalate reports of child sexual exploitation, three people familiar with the matter said, asking not to be identified for fear of retaliation. At the beginning of the year, Twitter had a team of about 20, they said.
The cuts come as lawmakers in the European Union and the U.K. plan far-reaching online safety rules that will require social media platforms to better protect children or face significant fines.
Twitter didn’t respond to a request for comment.
The team — a mix of former law enforcement officers and child safety experts based in the U.S., Ireland and Singapore — was stretched before the cuts, working long hours to respond to user reports and legal requests, the people said. They were responsible for stopping the distribution of child sexual abuse material, instances of online grooming and media that promoted attraction to minors as an identity or sexual orientation.
Last week, Musk tweeted that “removing child exploitation is priority #1” and called on people to “reply in the comments if you see anything that Twitter needs to address.”
Some prominent hashtags associated with child sexual exploitation have been removed since Musk took over, changes which had been in the works before he joined, the people said. Still, combating this type of messaging isn’t always as simple as removing tweets containing the offending hashtags since many have another, innocuous purpose, they said. Offenders also constantly change the terms they use to evade detection.
While artificial intelligence-based tools can be useful for identifying images that have already been reviewed and categorized as child sexual exploitation material by law enforcement, human review is particularly important for understanding the nuances of grooming and other exploitative behaviors, identifying previously unknown abusive images and videos, and understanding regional differences in the law, the people said. Humans are also required to respond to requests from law enforcement as part of criminal investigations.
Losing specialists in Europe and Singapore will make policing non-English speaking markets a particular challenge, the people said.
These specialists worked closely with dedicated product managers and engineers to build tools and automation to stop the spread of the material, as well as with third-party contractors who helped triage posts that users reported. While only a few employees were cut in the first round of layoffs, the team was decimated when Musk called on Twitter’s workers to commit to a “hardcore” culture or lose their jobs, the people said. Musk did not create an environment in which the team wanted to stay, they said.
The defections were part of a broader exodus from Twitter’s trust and safety team, many of whose members left after Musk sent the ultimatum this month, people familiar with the matter said previously. The company has also lost a significant number of the employees who block foreign disinformation campaigns on the platform, and entire swathes of Twitter’s audience have been left without content moderation, one of the people said. In the Asia-Pacific region, just one contractor hired to help with spam in the Korean market remained, the person said.
Twitter has also cut a number of contractors who helped moderate content, Axios has reported. Social media platforms including Facebook, TikTok and Twitter use third-party moderators to help sift through flagged posts for violations.
Unlike other types of egregious content that violate Twitter’s rules, child sexual abuse material is illegal for the platform to host, and, depending on the country, there are requirements to take down and report such material within specific time limits.
In the U.K., the Online Safety Bill gives regulators the power to fine platforms hosting user-generated content, including Twitter and other social media apps, as much as 10% of their revenue if they fail to police their platforms effectively.
The EU is also planning regulation that would require tech companies to take a more aggressive approach to detecting sexual abuse material.
The European Commission’s controversial proposal would give courts the power to require companies to scan for material in messages, even if they are end-to-end encrypted. The commission also wants companies to detect grooming via artificial intelligence and use age verification to find minors on their platforms.
“Elon Musk has been very vocal about his commitment to tackling online child sexual abuse,” said Ylva Johansson, the EU Commissioner in charge of the proposal. “I fully expect him to follow through on these public commitments.”
“Having experienced experts and teams in place, as well as those familiar with EU legislation seems to me an obvious baseline from which to scale up this fight,” she said.
- - -
Bloomberg’s Davey Alba, Jack Gillum and Margi Murphy contributed to this report.