Apple has released an FAQ document detailing its response to privacy criticisms of its new iCloud Photos feature that scans for child sexual abuse images.
Apple’s suite of tools aimed at protecting children has elicited mixed reactions from security and privacy experts, with some going so far as to claim that Apple is abandoning its privacy stance. Now Apple has published a rebuttal in the form of an FAQ document.
“At Apple, our goal is to create technology that empowers people and enriches their lives, while helping them stay safe,” the document reads. “We want to protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material (CSAM).”
“Since we announced these features, many stakeholders, including privacy organizations and child safety organizations, have voiced support for this new solution,” it continues, “and some have reached out with questions.”
The document focuses on how the criticisms have conflated two features that Apple believes are completely separate.
“What are the differences between communication safety in Messages and CSAM detection in iCloud Photos?” the document asks. “These two features are not the same and do not use the same technology.”
Apple emphasizes that the new Messages features are “designed to give parents … additional tools to help protect their children.” Images sent or received via Messages are analyzed on the device “and, therefore, [the feature] does not change the privacy assurances of Messages.”
CSAM detection in iCloud Photos, meanwhile, sends no information to Apple about “any photos that do not match known CSAM images.”
One concern among privacy and security experts has been that this on-device image scanning could easily be extended at the behest of authoritarian governments demanding that Apple broaden what it scans for.
“Apple will refuse any such demands,” says the FAQ document. “We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future.”
“Let us be clear,” it continues, “this technology is limited to detecting CSAM stored in iCloud Photos and we will not accede to any government’s request to expand it.”
Apple’s new publication on the subject comes after an open letter circulated asking the company to reconsider its new features.