Twitter’s woeful CSAM moderation is now costing it money

  • Adverts from more than 30 brands appeared alongside accounts peddling child sexual abuse material
  • Brands included Disney, Coca-Cola and NBCUniversal
  • Advertisers are suspending and cancelling campaigns on Twitter, which is bad news for a company that depends on advertising revenue

The world’s largest social networks hold the attention of millions, which allows them to make enormous amounts of money through advertising.

Unfortunately, as we saw with the Adpocalypse of 2017, where adverts appear can push advertisers away.

We bring all of this up because, according to an exclusive report from Reuters, adverts from more than 30 advertisers appeared alongside Twitter accounts peddling links to child sexual abuse material (CSAM). The brands include Disney, NBCUniversal, Coca-Cola and even a children’s hospital.

This information was brought to light by research conducted by cybersecurity group Ghost Data, and the details are alarming. For instance, the team found that adverts appeared alongside tweets containing keywords such as “rape” and “teens”. An advert for shoe and accessory brand Cole Haan appeared next to a tweet by a user seeking to trade teen and child content.

Needless to say, Cole Haan brand president David Maddocks was horrified at the revelation.

“Either Twitter is going to fix this, or we’ll fix it by any means we can, which includes not buying Twitter ads,” Maddocks told Reuters.

Ghost Data reportedly identified over 500 accounts openly requesting and sharing CSAM over a 20-day period in September. Twitter failed to remove 70 percent of those accounts despite stating that it has a zero-tolerance policy for CSAM. To its credit, after the 500 accounts were shared with Twitter, all were reviewed and permanently suspended.

While Twitter says it is investing more resources aimed at making children safer, this problem needs to be addressed rather quickly, and writing policies and hiring for new positions seems – to us – like the bare minimum here.

Indeed, in a Twitter thread, Ghost Data founder Andrea Stroppa alleges that Twitter has a hard time catching CSAM in English and that means it has an even harder time catching it in other languages.

Following these revelations, Dyson, Mazda, Forbes and PBS Kids have all suspended marketing campaigns or removed their advertising from Twitter entirely.

This problem isn’t new to Twitter. In fact, a recent report revealed that Twitter wanted to explore an OnlyFans-style feature, only for the idea to be scrapped once an internal team discovered just how much CSAM was available on the platform.

Of course, policing this sort of content isn’t easy, and we make no assumptions that it is. Apple infamously tried to battle the problem by scanning users’ devices for illegal content (it was more complicated than that, but this is what the solution boiled down to) before it was pointed out how that solution could be exploited.

Given that Twitter’s revenue is largely drawn from advertisers, perhaps the company will buckle down and get to solving the issue. It wouldn’t surprise us if Twitter took the Tumblr route and banned adult content from the platform altogether. While adult content creators have something of a safe haven on Twitter, the fact that CSAM is able to slip through the cracks could force Parag Agrawal’s hand in instituting a ban on all nudity and sexually explicit content.

There’s also the matter of Elon Musk’s proposed acquisition of Twitter. The billionaire has been trying to find a way out of the deal and we have to think that CSAM pushing advertisers away is of more consequence to the firm than bots are.

Twitter has yet to address this report, but we look forward to seeing how the company intends to tackle this problem.
