Apple Recognizes It Jumped Too Quickly On Its CSAM Detection System; Delays Implementation

Sep 3, 2021, 9:13pm UTC

https://www.techdirt.com/articles/20210903/12031147494/apple-recognizes-it-jumped-too-quickly-csam-detection-system-delays-implementation.shtml

Sometimes speaking out works. A month ago, Apple announced a series of new offerings that it claimed would be useful in fighting back against CSAM (child sexual abuse material). This is a real problem, and it's commendable that Apple was exploring ways to fight it. However, the major concern was how Apple had decided to do this. Despite the fact that a ton of experts have been working on ways to deal with this extremely challenging problem, Apple (in Apple fashion) went it alone and just jumped right into the deep end, causing a lot more trouble than necessary -- both because its implementation had numerous serious risks that Apple didn't seem to account for, and (perhaps more importantly) because the plan could wipe away years of goodwill built up in conversations between technologists, security professionals, human rights advocates, and others trying to find solutions that better balance the risks.

Thankfully, with much of the security community, the human rights community, and others calling attention to Apple's dangerous approach, the company has now announced a plan to delay the implementation, gather more information, and actually talk to experts before deciding how to move forward. Apple posted an update (in tiny print...) on the page where it announced these features.
