A high-profile Apple exec has conceded that the big reveal of its latest child protection tools could've gone more smoothly, with software chief Craig Federighi admitting that "in hindsight" confusion was inevitable. Apple announced plans earlier in August to scan iCloud Photos uploads for illegal images of children and to build new grooming protections into iMessage, and was met with instant pushback from privacy advocates and others.
Apple explained that the two systems would both focus on protecting younger users, though in different ways. iCloud Photos would scan backed-up images for child sexual abuse material (CSAM), comparing them to known illegal content; if matches were spotted, the user could be reported to the authorities after a human review.
In Apple's Messages app, meanwhile, parents would be able to activate new filters for younger users. Should someone then send them a picture the system flagged as explicit, that picture would be blurred out, and guidance offered on how to deal with potential grooming situations. If the child still opened the image, an alert could optionally be sent to their parents.
While Apple may have expected plaudits for the moves, what it received instead was a substantial degree of suspicion about the technology. Concerns were raised about whether governments could demand the expansion of the iCloud Photos scanning to cover pictures other than CSAM, which could be used to track political opposition, protesters, journalists, and more. Once Apple demonstrated such scanning was technically possible, it was argued, the technology could become a slippery slope.
Apple pushed back on that earlier in the week, with a new FAQ attempting to settle fears. It would resist pressure to expand the scanning beyond its current intended purpose, Apple insisted, and protections had been built in to avoid accidental flagging and reports. Still, weeks after the announcement, it's clear that the furor isn't going to settle any time soon.
Speaking to the WSJ, Craig Federighi, Apple senior vice president of software engineering, conceded that the launch could've gone better. Still, he frames it as a matter of confusion among the audience rather than Apple primarily being to blame.
"It's really clear a lot of messages got jumbled pretty badly in terms of how things were understood," Federighi said. "We wish that this would've come out a little more clearly for everyone because we feel very positive and strongly about what we're doing."
Contrary to concerns that this is a potential hit on user privacy, Federighi argues that the technology – which runs locally on users' devices, rather than in the cloud, using so-called "hashed" data from CSAM images provided by third parties rather than the pictures themselves – is more beneficial to privacy overall. There will be "multiple levels of auditability," he says, and the system will be tuned so that, say, innocent photos a parent might take of a child in the bathtub aren't flagged as child pornography.
"If and only if you meet a threshold of something on the order of 30 known child pornographic images matching, only then does Apple know anything about your account and know anything about those images, and at that point, only knows about those images, not about any of your other images," he explains.
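To make that threshold idea concrete, here is a minimal sketch of threshold-gated list matching, written in Swift. All names and values are hypothetical: plain strings stand in for Apple's actual NeuralHash output, and the sketch ignores the cryptographic machinery (safety vouchers, threshold secret sharing) that keeps the matching private on a real device.

```swift
import Foundation

// Illustrative placeholders only; real entries are NeuralHash values, not strings.
let knownCSAMHashes: Set<String> = ["hashA", "hashB", "hashC"]
let reportThreshold = 30  // "on the order of 30", per Federighi

/// Counts how many of the hashes queued for iCloud backup match the known list.
func matchCount(uploadHashes: [String], against known: Set<String>) -> Int {
    uploadHashes.filter { known.contains($0) }.count
}

let queuedUploads = ["hashX", "hashA", "hashY"]  // hashes of a user's queued photos
let matches = matchCount(uploadHashes: queuedUploads, against: knownCSAMHashes)

if matches >= reportThreshold {
    print("Threshold crossed: \(matches) matches surfaced for human review")
} else {
    print("Below threshold: nothing is learned about this account")
}
```

The key property Federighi describes is the gate at the end: until the match count crosses the threshold, no information about the account or its images is revealed.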
The software chief says multiple child-safety organizations will be responsible for collating the hashed image lists against which iCloud Photos uploads are compared. An independent auditor will also verify that the database contains only those images. Even so, it's not just external observers voicing concerns; according to a Reuters report, some Apple employees have also been speaking out against the system internally.
As close as Federighi gets to an admission that Apple's strategy might be problematic is a reference to the announcement, not its content. "In hindsight," he says, "introducing these two features at the same time was a recipe for this kind of confusion. By releasing them at the same time, people technically connected them and got very scared: What's happening with my messages? The answer is…nothing is happening with your messages."
The new features will go live later in the year, with the arrival of iOS 15, iPadOS 15, watchOS 8, and macOS Monterey. iCloud Photos upload scanning will only happen if users have enabled backups to the service; locally-saved images won't be scanned. However, any existing images marked for backup will be scanned too, Apple has confirmed.