
Child abuse material is thriving on the Fediverse

  • The Fediverse offers a decentralised alternative to social media, but a lack of centralised oversight is a problem.
  • The Stanford Internet Observatory Cyber Policy Center uncovered a trove of child sexual abuse material on some of Mastodon’s top instances.
  • The researchers also discovered an alarming number of instances based in Japan selling computer-generated CSAM.

There is clearly disdain for social media at the moment. Elon Musk’s decisions at Twitter have caused many to leave the platform, and while Threads grabbed headlines, interest in it unravelled rather quickly.

Over the last few years there has been a growing appetite for an alternative to the big tech platforms, and decentralised networks offer exactly that. The likes of Mastodon make use of the ActivityPub protocol and are colloquially grouped under the banner of the Fediverse, or federated social media; Bluesky pursues a similar decentralised model, albeit on its own AT Protocol rather than ActivityPub.

Unlike Twitter, which is hosted on Twitter’s own servers, the Fediverse is made up of independently hosted and independently moderated servers, known as instances. Users on one instance can still communicate with users on another, but decentralisation is the core feature.
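
For a sense of how that cross-instance communication works under the hood, the snippet below sketches, in Python, the sort of ActivityStreams document one instance delivers to another when a post federates. The URLs are placeholders, and real delivery also involves HTTP-signature authentication; this illustrates the protocol’s general shape, not Mastodon’s exact wire format.

```python
# A minimal sketch of the ActivityStreams JSON that ActivityPub servers
# exchange when a post "federates" from one instance to another.
# All URLs are placeholders for illustration only.
create_activity = {
    "@context": "https://www.w3.org/ns/activitystreams",
    "id": "https://instance-a.example/activities/1",
    "type": "Create",
    "actor": "https://instance-a.example/users/alice",
    "to": ["https://www.w3.org/ns/activitystreams#Public"],
    "object": {
        "type": "Note",
        "attributedTo": "https://instance-a.example/users/alice",
        "content": "Hello from another instance!",
    },
}
# Instance A would POST this document (signed with an HTTP signature,
# omitted here) to a follower's inbox on instance B, e.g.
# https://instance-b.example/users/bob/inbox. Instance B then applies
# its own, entirely independent, moderation policy to the content.
```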

This week, David Thiel and Renee DiResta of the Stanford Internet Observatory Cyber Policy Center published the Child Safety on Federated Social Media study.

“At a time when the intersection of moderation and free speech is a fraught topic, decentralized social networks have gained significant attention and many millions of new users,” reads the study.

The issue at hand, however, is that without oversight from a central moderation body, some of the worst people in society can find a home.

The study ingested the public timelines of the top 25 Mastodon instances over the course of two days. During that time, JSON metadata was recorded and media was submitted for analysis to PhotoDNA, a tool that detects known child sexual abuse material (CSAM), and to Google’s SafeSearch API, which flags sexually explicit content.
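
For the technically curious, a minimal sketch of that collection step might look like the following, using Mastodon’s standard public timeline endpoint. The instance name is a placeholder, and the matching against PhotoDNA and SafeSearch is omitted since both require vetted or paid access; this approximates the approach, not the researchers’ actual pipeline.

```python
import requests

# Placeholder instance name for illustration; the study covered the
# public timelines of the top 25 Mastodon instances.
INSTANCE = "mastodon.example"


def fetch_public_timeline(instance: str, limit: int = 40) -> list[dict]:
    """Fetch recent public posts from a Mastodon instance.

    Uses Mastodon's public timeline endpoint, which needs no
    authentication on most instances; the API caps `limit` at 40.
    """
    resp = requests.get(
        f"https://{instance}/api/v1/timelines/public",
        params={"limit": limit},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()


def extract_media_urls(statuses: list[dict]) -> list[str]:
    """Pull media attachment URLs out of each status's JSON metadata."""
    return [
        attachment["url"]
        for status in statuses
        for attachment in status.get("media_attachments", [])
        if attachment.get("url")
    ]


if __name__ == "__main__":
    statuses = fetch_public_timeline(INSTANCE)
    media_urls = extract_media_urls(statuses)
    print(f"{len(statuses)} posts, {len(media_urls)} media attachments")
    # The researchers then checked collected media against PhotoDNA and
    # Google's SafeSearch API; both require vetted or paid access and
    # are deliberately omitted from this sketch.
```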

Over those two days, 325 000 posts were analysed and 112 instances of known CSAM were detected, while Google SafeSearch labelled a further 554 pieces of content as sexually explicit. The researchers also found 713 uses of the top 20 CSAM-related hashtags on posts that included media, and the same hashtags on 1 217 posts with no media.

Most disturbing of all, the researchers found instances of computer-generated CSAM (CG-CSAM).

“The difference in laws and server policies between Japan and much of the rest of the world means that communities dedicated to CG-CSAM—along with other illustrations of child sexual abuse—flourish on some Japanese servers, fostering an environment that also brings with it other forms of harm to children. These same primarily Japanese servers were the source of most detected known instances of non-computer-generated CSAM. We found that on one of the largest Mastodon instances in the Fediverse (based in Japan), 11 of the top 20 most commonly used hashtags were related to pedophilia (both in English and Japanese),” the researchers wrote.

In addition, the researchers discovered a high number of users selling CSAM and directing buyers to other platforms such as Session, Matrix or Telegram to complete sales, likely because Mastodon does not encrypt direct messages. Worse still, some users claiming to be children were selling self-made explicit content.

While there are tools that can be used to control the spread of CSAM on the Fediverse, implementing those tools would require a rethink of the decentralised network.

“Counterintuitively, to enable the scaling of the Fediverse as a whole, some centralized components will be required, particularly in the area of child safety. Investment in one or more centralized clearinghouses for performing content scanning (as well as investment in moderation tooling) would be beneficial to the Fediverse as a whole. Given new commercial entrants into the Fediverse such as WordPress, Tumblr and Threads, we suggest collaboration among these parties to help bring the trust and safety benefits currently enjoyed by centralized platforms to the wider Fediverse ecosystem,” the researchers concluded.

Should the Fediverse continue as it is, though, it will soon become even more of a haven for some of the worst people online.

You can read the full study, including the researchers’ recommendations, here.

[Image – Claudia from Pixabay]
