Twitter wanted to monetise adult content, until it realised how much exploitative content it housed

The success of OnlyFans is tough to overstate. From 2020 to 2021 the firm’s net revenue grew from $375 million to $1.2 billion, and the platform expects it to top $2.5 billion later this year, according to an Axios report.

One has to wonder, then, why other social networks haven’t tried to capitalise on the boom in independent adult content creators. Aside from the obvious reluctance to be associated with that content, putting content behind a paywall carries real risk.

This is what Twitter discovered when it tried to implement a project called Adult Content Monetization (ACM).

The project looked to capitalise on the adult content creators who already use Twitter by giving them the ability to charge others for access to their content. The revelation comes from The Verge, which has published an article that makes Elon Musk’s issues with bots seem trivial.

Ahead of the launch of the ACM project, Twitter reportedly convened a team of 84 employees it dubbed the Red Team. The goal of this team was to pressure-test the project and determine how Twitter could roll it out safely and responsibly.

The project was derailed when the Red Team discovered that the company wasn’t effectively policing the content already on its platform.

“Twitter cannot accurately detect child sexual exploitation and non-consensual nudity at scale,” the Red Team reported in April 2022.

Worse still, Twitter doesn’t have a way to verify that users are of legal age, and once Musk announced he wanted to buy the firm, the idea was scrapped.

But the report reveals that Twitter has a serious problem with child sexual exploitation (CSE) material, a fact executives are aware of but have dragged their feet in addressing.

“Employees we spoke to reiterated that despite executives knowing about the company’s CSE problems, Twitter has not committed sufficient resources to detect, remove, and prevent harmful content from the platform,” writes The Verge.

The report details ageing technology that leaves cavernous gaps in Twitter’s detection methods alongside processes that create a backlog of cases to review.

What’s more, even when Twitter was told to create a single tool to process CSE reports, it dragged its feet and told the Health team to “chip away at these needs over time starting with the highest priority features to avoid the too-big-to-prioritize trap”.

We highly recommend reading the report penned by Zoe Schiffer and Casey Newton at The Verge. With all of these revelations, bots really are the least of Twitter’s problems.

How Twitter will address these revelations and tackle its CSE problem is unclear. We wouldn’t be surprised if the firm adopted the Tumblr stance and banned adult content on Twitter outright.
