Leaked document casts negative light on Facebook’s reporting of child abuse images

This week has been a less than ideal one for Facebook. A few days ago the company was alleged to have paid a Republican consulting firm to spread dangerous misinformation about TikTok, but a new report is even more concerning.

This comes as the New York Times has published a report on the ineffective procedures in place for moderators to flag underage explicit content on the platform.

According to a leaked company document, moderators are instructed to “err on the side of adult” when the age of a person in explicit content is not blatantly obvious.

Naturally this is a practice that moderators have taken issue with, especially given the intensity that the job demands. It also means that there are potentially many exploitative images of children slipping through the cracks, as the methods being employed are greatly outdated.

To that end, the New York Times says a 50-year-old method of gauging the “phase of puberty” of a person is being used by moderators. The problem with this method is that it has no discernible ability to determine age, which is the primary concern here.

Another issue raised in the report is the sheer volume of content that moderators must deal with, with only a few seconds given to assess each item and make a decision. Added to this is the fact that moderators are often employed via firms outside of Facebook, which means content on that platform is not the only thing they are moderating.

While Facebook, which goes by Meta these days, did not comment on the leaked document and the claims resulting from it, the reporting of child abuse imagery is something that the company is aware of.

On that front, citing comments made by Antigone Davis, head of safety for Meta, the decision to err on the side of adult is seemingly the result of privacy concerns.

“Meta employs a multilayered, rigorous review process that flags far more images than any other tech company. She said the consequences of erroneously flagging child sexual abuse could be ‘life-changing’ for users,” writes Michael K. Heller of the New York Times.

“The sexual abuse of children online is abhorrent,” added Davis.

While that is not a particularly difficult stance to take on child abuse, it is clear that moderating its own platform is still proving a massive challenge for Facebook, and it may be coming at the cost of underage users.

[Source – New York Times]