Apple has tried to deflect criticism of its controversial CSAM protection system, but in doing so it has illustrated exactly what is at stake.
The great conversation
Apple announced last week that it would introduce a collection of child protection measures in iOS 15, iPadOS 15, and macOS Monterey when the operating systems launch this fall.
Among other protections, the system scans your iCloud Photos library on the device for evidence of illegal Child Sexual Abuse Material (CSAM) collections. It is entirely appropriate to protect children, of course, but privacy advocates remain concerned that Apple’s system could turn into full-blown surveillance.
In an attempt to mitigate criticism, Apple has posted new information in which it tries to explain in a little more detail how the technology works. As explained in an Apple white paper, the technology converts images on your device into numeric hashes that can be compared against a database of known CSAM images as you upload them to iCloud Photos.
While the image analysis is performed on the device using Apple’s hashing technology, only images that match known CSAM are flagged. Apple argues that this is actually an improvement, in the sense that the company at no point scans a user’s entire library.
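The match-against-known-hashes idea can be sketched in a few lines. Note that this is a deliberately simplified illustration: Apple’s actual system uses a perceptual "NeuralHash" plus a private set intersection protocol so that non-matching images reveal nothing to either side, whereas this sketch substitutes an ordinary cryptographic hash and a local set lookup, with a hypothetical blocklist, purely to show the matching concept.

```python
import hashlib

# Hypothetical blocklist of known-image hashes (hex digests).
# In Apple's real system this database is of perceptual hashes and the
# comparison happens via a cryptographic protocol, not a plain set lookup.
KNOWN_HASHES = {
    hashlib.sha256(b"known-image-bytes").hexdigest(),
}

def matches_known_database(image_bytes: bytes) -> bool:
    """Hash the image and check it against the set of known hashes."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_HASHES

print(matches_known_database(b"known-image-bytes"))  # True
print(matches_known_database(b"ordinary-photo"))     # False
```

The key property Apple emphasizes is that only images whose hashes match the database are ever flagged; everything else passes through without being inspected.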
“Existing techniques implemented by other companies scan all user photos stored in the cloud. This creates a privacy risk for all users,” says the company’s new FAQ. “Detecting CSAM in iCloud Photos provides significant privacy benefits over these techniques by preventing Apple from learning about photos, unless they both match known CSAM images and are included in an iCloud Photos account that includes a collection of known CSAM.”
Despite these assurances, there are still major concerns about the extent to which the system could be extended to monitor other forms of content. After all, if you can convert a collection of CSAM images into identifiable data, you can convert anything into data that can be scanned, including personal information. Privacy advocate Edward Snowden warns: “Make no mistake: if they can scan for child pornography today, they can scan for anything tomorrow.”
Apple says it has no intention of taking its system into other domains. In its FAQ, it writes:
“We have faced demands to build and implement government-required changes that degrade user privacy before, and we have firmly rejected those demands. We will continue to reject them in the future. Let’s be clear: this technology is limited to detecting CSAM stored in iCloud, and we will not accede to any government’s request to expand it.”
On the surface that seems reassuring. But it stands to reason that now that this technology exists, nations that want to force Apple to extend on-device surveillance to matters beyond CSAM will use whatever weapons they have to force the issue.
“All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content,” warned the Electronic Frontier Foundation.
Preventing this will be a struggle. Which means Apple has a fight ahead of it.
This may be a fight Apple wants to have. After all, we know the company has taken many significant steps to protect user privacy across its ecosystem, and we also know it supports changes in the law to protect privacy online.
“It is certainly time, not only for a comprehensive privacy law here in the US, but also for global laws and new international agreements that enshrine the principles of data minimization, user awareness, user access, and data security around the world,” CEO Tim Cook said earlier this year.
It could be argued that Apple’s high-profile introduction of child protection measures has sparked a broader conversation about rights and privacy in an online and connected world. The only way to prevent the system from spreading beyond CSAM is to help Apple resist the pressure to do so.
In the absence of such support, Apple is unlikely to prevail against every government alone. Without that backing, the question is when, not if, the company will be forced to yield. And yet governments could still reach an agreement on online privacy.
The stakes are high. The risk is that the bricks along the sunlit path to justice Cook has long advocated instead become bricks in a wall preventing that journey from taking place.
The opportunity is that a determined effort may enable the creation of frameworks that allow that journey to reach its end.
The controversy reflects how rocky that road appears to have become.