Apple’s plan to track down abusive images could cause problems for employers

https://techmonitor.ai/policy/privacy-and-data-protection/apple-privacy-scanning-child-abuse

Apple’s latest plan to combat the spread of abusive images online will involve automatically scanning devices for evidence of child sexual abuse material (CSAM). The move has pleased some governments and worried privacy advocates, but businesses may also feel a knock-on effect: employers could find themselves implicated by their employees’ activity, or in breach of client privacy regulations.

Apple will implement the new scanning technology later this year; it will detect child abuse material and report it to law enforcement agencies. Many cloud service providers, such as Dropbox or Microsoft, already scan content once it has been uploaded to their servers. The difference here is that Apple’s CSAM detection technology, called NeuralHash, will run on the user’s device itself, rather than waiting for potentially harmful content to be uploaded.
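
The flow the article describes amounts to an on-device, pre-upload check against a database of known abuse imagery. The sketch below is purely illustrative, not Apple’s implementation: NeuralHash is a proprietary perceptual hash, and the real system matches blinded hashes via private set intersection, so a plain cryptographic hash and a set lookup stand in here; all names and hash values are hypothetical.

```python
# Illustrative sketch of on-device hash matching before cloud upload.
# Apple's actual system uses NeuralHash (a perceptual hash) and private
# set intersection; SHA-256 and a plain set are stand-ins for clarity.
import hashlib
from pathlib import Path

# Hypothetical database of hashes of known abusive images. In Apple's
# design the database ships with the OS in blinded form.
KNOWN_HASHES = {
    "3f786850e387550fdab836ed7e6dc881de23001b",  # placeholder entry
}

def image_hash(path: Path) -> str:
    """Hash an image file. A real system would use a perceptual hash so
    resized or re-encoded copies still match; SHA-256 only matches
    byte-identical files and is used here only for illustration."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def scan_before_upload(photo_paths: list[Path]) -> list[Path]:
    """On-device check run before photos are sent to the cloud, returning
    the files whose hashes match the known-image database."""
    return [p for p in photo_paths if image_hash(p) in KNOWN_HASHES]
```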
