ROOST is a new child safety group founded by Google, Roblox, Discord, and OpenAI

  • Google, Roblox, Discord, and OpenAI have founded a new online child safety group called ROOST.
  • The group says it will focus on creating open-source safety tools to better protect children as AI becomes more pervasive.
  • The announcement of the group’s founding was made at the AI Action Summit happening in Paris this week.

The AI Action Summit is underway in Paris, France this week, where regulators, policymakers, and big tech companies from across the globe are looking to tackle the pervasiveness of AI with an ethical and sustainable strategy. One of the efforts in this regard is a newly founded child safety group – ROOST.

The group has some notable names behind it, having been founded by Google, Roblox, Discord, and OpenAI.

“ROOST is a new non-profit organization that brings together the expertise, resources, and investments of major technology companies and philanthropies to build scalable, interoperable safety infrastructure suited for the AI era,” a joint press release regarding the organisation explains.

“Amongst the major announcements of the French AI Action Summit, ROOST addresses an important gap in digital safety—especially online child safety—by providing free, open-source safety tools to public and private organizations of all sizes across the globe,” it adds.

While the group's intentions are commendable, given who its founders are, we remain uncertain whether it can achieve some of its lofty ambitions. Roblox, for example, has long struggled to police its platform, with the Hindenburg report calling it a “paedophile hellscape for kids”.

That is a rather damning profile of a founding member of this newly formed group. Google also has a spotty history when it comes to ensuring child safety despite its best efforts, and it too, like many other big tech firms, is still trying to manage the creation and distribution of AI-generated explicit content on its platforms.

As such, the companies involved with ROOST will need to not only design open-source tools for others to implement, but also take a serious look at their own respective platforms in order to address the potential harms that AI could pose to young internet users.

While we wait to see how that will happen, it looks as if the companies involved with this group will be lending the power of their technologies to the cause of online child safety.

“ROOST will offer free, open-source, and easy-to-use tools to detect, review, and report child sexual abuse material (CSAM); leverage large language models (LLMs) to power safety infrastructure; and make core safety technologies more accessible and more user friendly,” it highlights.

“With dedicated technical teams providing hands-on support, ROOST will meet organizations where they are, helping them integrate robust safety measures while continuing to innovate,” it continues.
