- A new report from the Tech Transparency Project (TTP) found that young boys interested in videogames were served more videos about guns on YouTube thanks to recommendations.
- The nonprofit watchdog group posed as a 9-year-old and a 14-year-old boy on YouTube.
- It also found that several of the recommended videos violated YouTube’s own policies.
Much has been made of the YouTube algorithm in recent years, particularly how it can radicalise people who watch certain types of content. Now a new report from the Tech Transparency Project (TTP) shines a concerning light on recommendations on the platform.
More specifically, it shows how young boys are seemingly being targeted and served more violent and gun-related content through recommendations.
“YouTube’s algorithms are pushing boys interested in video games to scenes of school shootings, instructions on how to use and modify weapons, and even a movie about notorious serial killer Jeffrey Dahmer, according to a study by the Tech Transparency Project (TTP) that raises new questions about the safety of YouTube’s recommendation system,” researchers from the nonprofit watchdog group noted in its report.
For the purposes of the research, the TTP created four YouTube accounts – two posing as a 9-year-old boy and two as a 14-year-old boy. The accounts watched playlists of content for popular videogames including Roblox, Lego Star Wars, Halo, and Grand Theft Auto. The group then monitored recommendations over a 30-day period in November last year.
“We then logged and analyzed the videos that YouTube’s algorithm recommended to these minor accounts, with one of each age group watching the recommended videos and one not engaging with them. The study found that YouTube pushed content on shootings and weapons to all of the gamer accounts, but at a much higher volume to the users who clicked on the YouTube-recommended videos,” the report explained.
Perhaps most concerning is that a high volume of the recommended videos appeared to violate the terms of service that YouTube says it enforces for uploaded content.
“Our researchers found that YouTube did serve extreme content to kids, including videos that violated the platform’s own policies on violence, firearms, and child safety,” the TTP added.
“YouTube pushed more of this content—in some cases more than 10 times more—to the accounts that watched the recommendations (referred to in this report as ‘engagement accounts.’) In other words, if a gamer showed interest in the videos recommended by YouTube, YouTube’s algorithm served up more and more content related to real-world violence,” it continued.
Researchers also found that the 14-year-old account was heavily targeted with gun-related video recommendations, receiving nearly four times as many as the 9-year-old one.
“YouTube served 1,325 real firearms videos to the 14-year-old engagement account—an average of more than 44 per day. The videos featured shooting scenes and ‘how-tos’ for using or modifying firearms. By contrast, the 14-year-old account that did not click on the recommended content got 172 weapons videos,” the TTP highlighted.
While this report illustrates a clear issue with how content, particularly graphic and violent content, is recommended to young YouTube viewers, it is unclear how, or indeed if, the platform can tackle the problem on its hands.
In a 2021 blog post, YouTube outlined how recommendations work on its platform, paying specific attention to content featuring misinformation and what the company calls “borderline” content. If this latest report is anything to go by, a lot more needs to be done on that front.
To read the Tech Transparency Project’s report in full, head here.