US iPhone users’ photos will be scanned for images of child pornography and abuse by Apple’s automated “neuralMatch” system, according to reports. Security researchers are alarmed, warning the scheme threatens privacy and encryption.
The Financial Times reported on the plan Thursday, citing anonymous sources briefed on Apple’s plans. The scheme was reportedly shared with some US academics earlier in the week in a virtual meeting.
Apple plans to scan US iPhones for child abuse imagery https://t.co/wptzpVjEdN
— Financial Times (@FT) August 5, 2021
Dubbed “neuralMatch,” the system will reportedly scan every photo uploaded to iCloud in the US and tag it with a “safety voucher.” Once a certain number of photos – not specified – are labeled as suspect, Apple will decrypt the suspect photos and pass them to human reviewers, who can then contact the relevant authorities if the imagery is verified as illegal, the FT report said. The program is initially intended to be rolled out in the US only.
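The reported mechanism – per-photo vouchers plus a review threshold – can be sketched roughly as follows. This is a hypothetical illustration based only on the FT’s description, not Apple’s actual implementation; the names (`SafetyVoucher`, `REVIEW_THRESHOLD`) and the threshold value are invented for the example, since the real number was not specified.

```python
from dataclasses import dataclass

# Assumed threshold for illustration only; reports did not specify a number.
REVIEW_THRESHOLD = 10

@dataclass
class SafetyVoucher:
    """Hypothetical tag attached to each uploaded photo."""
    photo_id: str
    suspect: bool  # True if the photo matched the prohibited-image list

def needs_human_review(vouchers: list[SafetyVoucher]) -> bool:
    """Flag an account for human review once enough photos are marked suspect."""
    suspect_count = sum(1 for v in vouchers if v.suspect)
    return suspect_count >= REVIEW_THRESHOLD
```

The key design point, as reported, is that no single match triggers review; only crossing the (unspecified) threshold does.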
The plan was described as a compromise between Apple’s promise to protect customer privacy and demands from the US government, intelligence and law enforcement agencies, and child safety activists to help them battle terrorism and child pornography.
Researchers who found out about the plan were alarmed, however. Matthew Green, a security professor at Johns Hopkins University, was the first to tweet about the issue in a lengthy thread late on Wednesday.
Regardless of what Apple’s long term plans are, they’ve sent a very clear signal. In their (very influential) opinion, it is safe to build systems that scan users’ phones for prohibited content.
That’s the message they’re sending to governments, competing services, China, you.
— Matthew Green (@matthew_d_green) August 5, 2021
The problem with this approach, Green warned, is that whoever controls the list of prohibited imagery “can search for whatever content they want on your phone, and you don’t really have any way to know what’s on that list because it’s invisible to you.”
Depending on how the system works, “it might be possible for someone to make problematic images that ‘match’ entirely harmless images. Like political images shared by persecuted groups,” he added. While he could see internet trolls doing it as a prank, Green added “there are some really bad people in the world who would do it on purpose.”
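Green’s collision worry stems from how image-matching systems of this kind generally work: they compare *perceptual* hashes, which are deliberately designed so that visually similar images hash alike. The toy “average hash” below is a standard textbook technique, not Apple’s algorithm, used here only to show why two different images can produce an identical hash – and hence why a crafted, harmless-looking image could “match” a listed one.

```python
def average_hash(pixels: list[int]) -> int:
    """Hash 64 grayscale values (an 8x8 thumbnail) into a 64-bit fingerprint.

    Each pixel contributes one bit: 1 if at or above the image's mean
    brightness, else 0. Small pixel changes often leave the hash unchanged.
    """
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# Two visibly tweaked images yield the exact same hash:
img_a = [10] * 32 + [200] * 32
img_b = [12] * 32 + [198] * 32  # slightly altered pixels, same bright/dark pattern
assert average_hash(img_a) == average_hash(img_b)
```

This tolerance to small changes is a feature for catching re-encoded copies of known images, but it is also what makes deliberate collisions possible.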
“I don’t particularly want to be on the side of child porn and I’m not a terrorist. But the problem is that encryption is a powerful tool that provides privacy, and you can’t really have strong privacy while also surveilling every image anyone sends,” he tweeted.
Several other researchers echoed Green’s concerns. Apple’s move was “tectonic” and a “huge and regressive step for individual privacy,” Alec Muffett, a security researcher and privacy campaigner who has worked at Facebook and Deliveroo, told the FT.
“Apple are walking back privacy to enable 1984,” he added.
Ross Anderson, professor of security engineering at the University of Cambridge, called it “an absolutely appalling idea” that will lead to “distributed bulk surveillance” of people’s phones and laptops.
Word about Apple’s snooping plan comes just weeks after the revelation that iPhones around the world – but reportedly not in the US, for some reason – were targeted by Pegasus, spyware developed by the Israeli company NSO Group, with a leaked list of more than 50,000 phone numbers identifying potential surveillance targets, including journalists, dissidents and even heads of state.