Apple child safety CSAM scanning to expand on a per-country basis

Apple’s much-lauded privacy efforts hit a sour note a couple of days ago when it announced a new feature intended to protect children by reporting illegal content stored in a user’s iCloud Photos account. While both sides of the debate agree that children need to be protected by cracking down on Child Sexual Abuse Material, or CSAM, critics argue that governments could abuse the technology. Apple has now clarified that this won’t be the case, since it will consider the CSAM detection’s rollout on a case-by-case basis for each country.

Privacy advocates have unsurprisingly labeled Apple’s new CSAM scanning feature as spyware or surveillance software because of how it could potentially violate a person’s privacy, despite Apple’s assurances. The core of the contention is that the detection of CSAM in photos relies on machine learning and automated matching rather than having humans scan photos manually. While that in itself may be a form of privacy protection, it also opens the door to abuse and potential privacy violations.
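
For illustration, here is a minimal Swift sketch of the general flow such a system follows: fingerprint a photo on the device and compare it against a database of known illegal images. This is not Apple’s implementation; the real pipeline uses a perceptual “NeuralHash” and private set intersection with reporting thresholds, none of which is reproduced here. A plain SHA-256 digest stands in for the fingerprint, and the loadKnownFingerprints helper and its data source are hypothetical.

```swift
import Foundation
import CryptoKit

// Illustrative sketch only. A cryptographic SHA-256 digest stands in for
// Apple's perceptual NeuralHash so the overall flow is visible:
// hash on device, match against known fingerprints, flag without any
// human looking at the photo.

/// Hypothetical loader for the database of known fingerprints.
/// In practice this would ship as an opaque, vetted blob; here it is
/// just an empty placeholder set.
func loadKnownFingerprints() -> Set<String> {
    return []
}

let knownFingerprints = loadKnownFingerprints()

/// Returns true if the photo's fingerprint matches a known entry.
func matchesKnownFingerprint(_ photoData: Data) -> Bool {
    let digest = SHA256.hash(data: photoData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownFingerprints.contains(hex)
}

// Example: check a single photo before it would be uploaded to iCloud Photos.
if let data = FileManager.default.contents(atPath: "/tmp/photo.jpg"),
   matchesKnownFingerprint(data) {
    print("Photo matches a known fingerprint; it would be flagged.")
}
```

The critics’ concern maps directly onto this structure: whoever controls the fingerprint database controls what gets flagged, which is why the content of that database, not the matching code itself, is the focus of the dispute.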

Critics argue that the system could be fed other data, intentionally or accidentally, that would then be used to detect and report content unrelated to CSAM. The technology could, for instance, be turned into a mass surveillance tool against activists in countries with more repressive governments. Apple has already indicated its intention to expand CSAM detection to iPhones and iPads worldwide, adding fuel to the controversy.

Apple has now clarified that it won’t be making a blanket rollout without considering the specifics of each market’s laws. This might provide some comfort to citizens in China, Russia, and other countries with strong censorship laws. Instead, CSAM detection will roll out first in the US, where Apple has long positioned itself as a staunch privacy ally.

That might not satisfy privacy advocates, however, as they see the system itself as open to abuse. Apple has repeatedly denounced the creation of backdoors into strong security systems, but it is now being criticized for creating exactly that, regardless of how narrow or carefully targeted that backdoor may be.
