
Apple has seemingly scrubbed its website of its CSAM solution

Apple sparked controversy earlier this year when it announced plans to curb the spread of child sexual abuse material (CSAM).

While the technology was smart, there were concerns that the system Apple developed could be used by governments and ne’er-do-wells to scan any device for any sort of content.

The backlash was so severe that Apple hit pause on the feature so that it could gather additional input from customers, advocacy groups, researchers and others.

But head to the Apple website today and you’d be hard-pressed to find any mention of CSAM detection. As MacRumors reports, references to CSAM have been removed from Apple’s Child Safety page. While the technical summary of the solution is still available, it’s the only mention of CSAM we could find on the website.

The signs, then, point to CSAM detection being abandoned altogether. This is not surprising, if we’re honest. While the technology was, as we mentioned, very clever, the potential for abuse was considerable as well.

Apple’s CSAM detection didn’t inspect the photos themselves; instead, it compared image hashes, as we explained here.

This meant the system would only flag an account if an image hash matched one in a pre-existing database, but that is also where the concerns were raised. The fear of many was that this system could be repurposed by governments to prevent citizens from speaking out against tyranny and other ills.
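For a rough sense of how hash matching against a known database works, here is a minimal Python sketch. To be clear, this is not Apple’s system: Apple’s design used NeuralHash, a perceptual hash computed on-device, plus cryptographic safeguards before anything was flagged. The sketch below uses a plain SHA-256 file digest and a made-up hash database purely to illustrate the matching step.

```python
import hashlib
from pathlib import Path

# Hypothetical database of known hashes. In Apple's design this would be a set
# of NeuralHash (perceptual) values for known CSAM supplied by child-safety
# organisations; a placeholder SHA-256 digest is used here only for illustration.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def image_hash(path: Path) -> str:
    """Return a hash of the file's bytes (a stand-in for a perceptual hash)."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def flag_matches(paths: list[Path]) -> list[Path]:
    """Return only the files whose hashes appear in the known-hash database."""
    return [p for p in paths if image_hash(p) in KNOWN_HASHES]

if __name__ == "__main__":
    photo_dir = Path("photos")  # hypothetical folder of images to check
    paths = list(photo_dir.glob("*.jpg")) if photo_dir.is_dir() else []
    matches = flag_matches(paths)
    print(f"{len(matches)} file(s) matched the known-hash database")
```

The point critics made is visible even in this toy version: whoever controls the contents of the hash database controls what gets flagged, and nothing about the matching step itself restricts that database to CSAM.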

We’re hoping that Apple will provide more clarity on what is happening here. For now, though, it seems as if the feature is dead in the water. Here’s hoping that is indeed the case, as the last thing Silicon Valley needs is more access to more data.
