YouTube’s algorithm removed a mammoth 11.4 million videos in Q2 this year


As one young man in India famously said, “Okay, first of all, the YouTube algorithm.” The system that the video platform has relied on for some time now has been linked to all manner of ills by viewers and creators alike, and is something YouTube itself is constantly trying to refine and improve.

We mention it today because YouTube recently revealed statistics for Q2 of 2020 in its Community Guidelines Enforcement report, with the standout figure being a massive 11.4 million videos removed between April and June.

This is up substantially from the 9 million videos YouTube removed in the same period last year, with a big contributor to the increase being its reliance on non-human moderation.

As for why non-human moderation played a bigger role, as with most things of late, COVID-19 played a part, according to YouTube.

“When reckoning with greatly reduced human review capacity due to COVID-19, we were forced to make a choice between potential under-enforcement or potential over-enforcement,” it explains in an accompanying blog post.

“Because responsibility is our top priority, we chose the latter—using technology to help with some of the work normally done by reviewers. The result was an increase in the number of videos removed from YouTube; more than double the number of videos we removed in the previous quarter,” it adds.

With Google enforcing a work from home policy for its employees and moderators alike, the company noted that it would be increasing its dependence on technology during the pandemic.

It was also acutely aware that more videos would be removed, and their removals appealed by creators as a result, which is why it refocused a large chunk of its workforce on handling those appeals instead of moderating flagged content.

“For certain sensitive policy areas, such as violent extremism and child safety, we accepted a lower level of accuracy to make sure that we were removing as many pieces of violative content as possible,” the platform notes.

Whether this approach was ultimately for the betterment of its creator community remains to be seen.

“We are continuing to improve the accuracy of our systems and, as reviewers are able to come back to work, we are deploying them to the highest impact areas. We’ll continue to regularly update the community on our progress,” YouTube’s blog concludes.

As such, it will be interesting to see whether this algorithmic approach will be as heavily adopted post-COVID-19.

[Image – Photo by Leon Bublitz on Unsplash]

Robin-Leigh Chetty

Editor of Hypertext. Covers smartphones, IoT, 5G, cloud computing and a few things in between. Also a keen photographer and dabbles in console games when not taking the hatchet to stories.
