Apple’s CSAM detection tech is under fire — again

https://techcrunch.com/2021/08/18/apples-csam-detection-tech-is-under-fire-again/

Apple has encountered monumental backlash to a new child sexual abuse material (CSAM) detection technology it announced earlier this month. The system, which Apple calls NeuralHash, has yet to be activated for its billion-plus users, but the technology is already facing heat from security researchers who say the algorithm is producing flawed results.

NeuralHash is designed to identify known CSAM on a user’s device without Apple having to possess the image or know its contents. Because a user’s photos stored in iCloud are end-to-end encrypted so that even Apple can’t access the data, NeuralHash instead scans for known CSAM on the user’s device, which Apple claims is more privacy-friendly because it limits scanning to photos alone, whereas other companies scan all of a user’s files.
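NeuralHash itself is proprietary, but the general idea of perceptual-hash matching can be sketched with a toy “average hash.” Everything below — the `average_hash` function, the 8×8 grayscale images, and the `KNOWN_HASHES` set — is an illustrative assumption, not Apple’s actual algorithm. The sketch also shows the flaw researchers are pointing at: visually different images can land on the same fingerprint.

```python
# A toy "average hash" to illustrate perceptual-hash matching.
# This is NOT Apple's NeuralHash; the function names, the 8x8 image
# size, and KNOWN_HASHES are hypothetical, for illustration only.

def average_hash(pixels: list[list[int]]) -> int:
    """64-bit fingerprint of an 8x8 grayscale image (values 0-255):
    each bit is 1 if that pixel is at least as bright as the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of bits on which two fingerprints differ."""
    return bin(a ^ b).count("1")

# Hypothetical database of fingerprints of known images (a real system
# would ship a far larger database in a blinded form).
KNOWN_HASHES = {0xFFFFFFFF00000000}

def matches_known(pixels: list[list[int]], threshold: int = 4) -> bool:
    """Flag an image whose hash is within `threshold` bits of a known one."""
    h = average_hash(pixels)
    return any(hamming(h, k) <= threshold for k in KNOWN_HASHES)

if __name__ == "__main__":
    # Bright top half, dark bottom half: hashes to 0xFFFFFFFF00000000.
    photo_a = [[200] * 8] * 4 + [[30] * 8] * 4
    # A visually different image with the same bright-top/dark-bottom
    # structure collides with photo_a -- the kind of flawed result
    # researchers are warning about.
    photo_b = [[140 + c for c in range(8)] for _ in range(4)] + \
              [[10 + c for c in range(8)] for _ in range(4)]
    print(matches_known(photo_a))  # True
    print(matches_known(photo_b))  # True: collision with a different image
    print(hamming(average_hash(photo_a), average_hash(photo_b)))  # 0
```

The matching never needs the original image, only its fingerprint, which is the privacy property Apple is claiming; the collision between the two toy photos is the accuracy property researchers are contesting.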