
The Metaverse, a hotbed for child abusers – report

  • A new report highlights how games built on the Metaverse are being used en masse by child abusers to access children.
  • The way these games facilitate cooperation between players, allow virtual gifts to be exchanged and use leaderboards creates a potent environment in which children can be groomed.
  • Abusers can leverage Metaverse games to groom children in an average of just 45 minutes between first contact and the sharing of illicit material.

Conversations between children and adult predators in the Metaverse can escalate into high-risk grooming situations within just 19 seconds.

This is according to a new report from WeProtect Global Alliance, an organisation against the exploitation of children on the internet.

The report, which draws on over 32 million reported cases globally, details how the prevalence of child sexual abuse material online has increased by 87 percent since 2019.

Online gaming remains a substantial risk for children, especially when social features are involved, such as the ability for players to chat with each other via text or audio. In these channels within popular online games, an adult can groom a young child in an average of 45 minutes.

However, the report makes it clear that not all online games carry the same risk. The riskiest online environments for children are those that facilitate adult-child intermingling, the exchange of virtual gifts and public ranking systems.

Chief among these are games that leverage what has become known as “the metaverse” – a persistent virtual world that exists online independently of the players who visit it. The metaverse can be experienced in games like Roblox and, more recently, Fortnite, both of which can include the three elements that elevate grooming risks. Both games are cited in the report and are supremely popular among young people.

Why do these three pillars of the Metaverse create such a potent environment for predators to abuse children? It’s all about access, trust and influence.

Games that provide environments where players are encouraged to play in the same spaces and cooperate with one another provide locations where predators can access children.

The exchange of virtual gifts or currencies is one way that offenders build trust with the children they are looking to groom, while publicly visible rankings, or players with high public standing, can heighten the risk of grooming as offenders use these systems to build influence.

“The shortest time recorded was 19 seconds, which involved only seven messages,” explains the report.

“This interaction typifies offenders using a volume-based approach to identifying and engaging victim-survivors. They contact multiple children simultaneously, knowing a small percentage will respond and likely become victim-survivors. This interaction included: an introduction, age identification, confirmation that the instigator had a strong interest in children, request for intimate imagery, then termination of the interaction by the potential victim-survivor,” it adds.

Typically, in these Metaverse gaming environments, offenders will first identify that the target is a child and then immediately seek to build trust.

Once trust is established, offenders turn the conversation sexual, asking about the minor’s sexual history and preferences. If the child answers these questions, offenders will try to move the conversation to a private messaging platform, usually an encrypted one such as WhatsApp or Telegram, to share images, videos, and voice and video calls.

Sometimes grooming will begin innocently, with offenders simply chatting with targets about the games they are playing before building a relationship.

“In these situations, the child is highly likely to think they are in a romantic relationship with the offender, and unlikely to recognise the abusive nature of the relationship,” the report continues.

Finally, the report highlights that new and developing technologies have heightened the threats that children face online. One notable example is generative AI, which is being used by perpetrators to create virtual child sexual abuse material.

These images can be incredibly photorealistic and depict children in suggestive situations even if no real children are involved.

Australia is among the first nations worldwide to introduce measures requiring big tech firms to ensure that AI products cannot be used to generate deepfake images and videos depicting child sexual abuse.

“Online-facilitated child sexual exploitation and abuse worldwide demands our attention and action right now. New technological capabilities further exacerbate existing risks, and the situation is no different in Africa. Children’s safety must be non-negotiable,” said Iain Drennan, Executive Director of WeProtect Global Alliance.

“To prevent more children from coming to harm, governments, online service providers, charities and companies must step up their efforts and work together to drive change and protect children.”

Remember, gaming itself is not the danger here; it is the adults who exploit these games to access children. Protect yourself and your children: take full advantage of built-in parental controls, and take some time to find out what your children's interests are and how they are using the internet.

To read the full 2023 report from WeProtect, click here.

[Image – Photo by Caleb Woods on Unsplash]
