Apple publishes FAQ to address CSAM detection and Messages scanning concerns

Apple has published an FAQ titled “Expanded Protections for Children” that aims to allay users’ privacy concerns about the new CSAM detection in iCloud Photos and the communication safety features for Messages that the company announced last week.

“Since we announced these features, many stakeholders including privacy organizations and child safety organizations have expressed their support of this new solution, and some have reached out with questions,” reads the FAQ. “This document serves to address these questions and provide more clarity and transparency in the process.”

Some of the discussion has blurred the distinction between the two features, and Apple takes great care in the document to differentiate them, explaining that communication safety in Messages “only works on images sent or received in the Messages app for child accounts set up in Family Sharing,” while CSAM detection in ‌iCloud Photos‌ “only impacts users who have chosen to use ‌iCloud Photos‌ to store their photos… There is no impact to any other on-device data.”

From the FAQ:

These two features are not identical and do not use the same technology.

Communication safety in Messages is designed to give parents and children additional tools to help protect their children from sending and receiving sexually explicit images in the Messages app. It works only on images sent or received in the Messages app for child accounts set up in Family Sharing. It analyzes the images on-device, and so does not change the privacy assurances of Messages. When a child account sends or receives sexually explicit images, the photo will be blurred and the child will be warned, presented with helpful resources, and reassured it is okay if they do not want to view or send the photo. As an additional precaution, young children can also be told that, to make sure they are safe, their parents will get a message if they do view it.

The second feature, CSAM detection in iCloud Photos, is designed to keep CSAM off iCloud Photos without providing information to Apple about any photos other than those that match known CSAM images. CSAM images are illegal to possess in most countries, including the United States. This feature only impacts users who have chosen to use iCloud Photos to store their photos. It does not impact users who have not chosen to use iCloud Photos. There is no impact to any other on-device data. This feature does not apply to Messages.
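Apple has not published the matching code itself, but the mechanism the FAQ describes amounts to comparing a hash of each photo queued for upload to iCloud Photos against a fixed, device-resident set of hashes of known CSAM. The Swift sketch below is a simplified, hypothetical illustration of that idea only: the type and function names are placeholders, SHA-256 stands in for Apple’s perceptual NeuralHash, and the blinded database and private set intersection protocol described in Apple’s technical summary are deliberately omitted.

```swift
import Foundation
import CryptoKit

// Hypothetical sketch only; names, types, and the hash function are
// illustrative placeholders, not Apple's implementation.

struct KnownCSAMHashDatabase {
    // The same opaque set of hash digests ships to every device.
    let knownHashes: Set<Data>
}

// Stand-in for a perceptual hash. SHA-256 is used here only so the sketch
// compiles; a real perceptual hash tolerates resizing and re-encoding.
func imageHash(_ imageData: Data) -> Data {
    Data(SHA256.hash(data: imageData))
}

// Only photos queued for upload to iCloud Photos are checked, and a match
// is recorded alongside the upload rather than reported immediately.
func matchesKnownCSAM(_ imageData: Data,
                      against database: KnownCSAMHashDatabase) -> Bool {
    database.knownHashes.contains(imageHash(imageData))
}
```

In other words, nothing in this step tells Apple anything about photos that do not match the known-hash set, which is the property the FAQ repeatedly emphasizes.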

The remainder of the document is divided into three sections, with answers to the following frequently asked questions:

Communication safety in Messages
  • Who can use communication safety in Messages?
  • Does this mean Messages will share information with Apple or law enforcement?
  • Does this break end-to-end encryption in Messages?
  • Does this feature prevent children in abusive homes from seeking help?
  • Will parents be notified without children being warned and given a choice?

CSAM detection
  • Does this mean Apple will scan all the photos stored on my iPhone?
  • Will this download CSAM images to my ‌iPhone‌ to compare against my photos?
  • Why is Apple doing this now?

Security for CSAM detection for iCloud Photos
  • Can the CSAM detection system in ‌iCloud Photos‌ be used to detect things other than CSAM?
  • Could governments force Apple to add non-CSAM images to the hash list?
  • Can non-CSAM images be “injected” into the system to flag accounts for things other than CSAM?
  • Will CSAM detection in ‌iCloud Photos‌ falsely flag innocent people to law enforcement?

Interested readers should consult the document for Apple’s full answers to these questions. It is worth noting, however, that for the questions which can be answered with a binary yes/no, Apple begins every answer with “No” except for the following three, all from the section titled “Security for CSAM detection for iCloud Photos”:

Can the CSAM detection system in iCloud Photos be used to detect things other than CSAM?
Our process is designed to prevent that from happening. CSAM detection for iCloud Photos is built so that the system only works with CSAM image hashes provided by NCMEC and other child safety organizations. This set of image hashes is based on images acquired and validated to be CSAM by child safety organizations. There is no automated reporting to law enforcement, and Apple conducts human review before making a report to NCMEC. As a result, the system is only designed to report photos that are known CSAM in iCloud Photos. In most countries, including the United States, simply possessing these images is a crime and Apple is obligated to report any instances we learn of to the appropriate authorities.

Could governments force Apple to add non-CSAM images to the hash list?
Apple will refuse any such demands. Apple’s CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups. We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future. Let us be clear, this technology is limited to detecting CSAM stored in iCloud Photos and we will not accede to any government’s request to expand it. Furthermore, Apple conducts human review before making a report to NCMEC. In a case where the system flags photos that do not match known CSAM images, the account would not be disabled and no report would be filed with NCMEC.

Can non-CSAM images be “injected” into the system to flag accounts for things other than CSAM?
Our process is designed to prevent that from happening. The set of image hashes used for matching are from known, existing images of CSAM that have been acquired and validated by child safety organizations. Apple does not add to the set of known CSAM image hashes. The same set of hashes is stored in the operating system of every iPhone and iPad user, so targeted attacks against only specific individuals are not possible under our design. Finally, there is no automated reporting to law enforcement, and Apple conducts human review before making a report to NCMEC. In the unlikely event of the system flagging images that do not match known CSAM images, the account would not be disabled and no report would be filed with NCMEC.
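Both of these answers lean on the same pair of safeguards: the hash set is identical on every device, and a flag never goes straight to law enforcement, since a human reviewer at Apple must confirm the match before any report is filed with NCMEC. Below is a minimal, hypothetical sketch of that gate; the types, names, and review callback are invented for illustration, and Apple has not published this logic.

```swift
// Hypothetical illustration of the safeguards Apple describes: no automated
// reporting, and no report at all unless a human reviewer confirms that the
// flagged images really are known CSAM. All names here are placeholders.

struct FlaggedAccount {
    let accountID: String
    let flaggedImageIDs: [String]
}

enum ReviewOutcome {
    case noAction        // reviewer finds the flags do not match known CSAM
    case reportToNCMEC   // reviewer confirms known CSAM; a report is filed
}

func resolve(_ account: FlaggedAccount,
             humanReviewerConfirmsCSAM: (FlaggedAccount) -> Bool) -> ReviewOutcome {
    // There is no path from a machine-generated flag to law enforcement
    // that bypasses this check.
    humanReviewerConfirmsCSAM(account) ? .reportToNCMEC : .noAction
}

// Example: a false positive surfaced by the system is simply dropped.
let outcome = resolve(FlaggedAccount(accountID: "example", flaggedImageIDs: ["img-1"])) { _ in false }
// outcome == .noAction, the account stays active, and nothing is sent to NCMEC.
```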

Apple has faced significant criticism from privacy advocates, security researchers, cryptography experts, academics, and others for its decision to deploy the technology with the release of iOS 15 and iPadOS 15, expected in September.

The backlash includes an open letter criticizing Apple’s plan to scan iPhones for CSAM in ‌iCloud Photos‌ and for explicit images in children’s messages, which had garnered more than 5,500 signatures at the time of writing. Apple has also drawn criticism from Facebook-owned WhatsApp, whose chief Will Cathcart called the move “the wrong approach and a setback for the privacy of people all over the world.” Epic Games CEO Tim Sweeney also attacked the decision, saying he had “tried hard” to see it from Apple’s point of view but had concluded that, “inescapably, this is government spyware installed by Apple based on a presumption of guilt.”

“No matter how well-intentioned, Apple is rolling out mass surveillance to the entire world with this,” said prominent whistleblower Edward Snowden, adding that “if they can scan for kiddie porn today, they can scan for anything tomorrow.” The nonprofit Electronic Frontier Foundation also criticized Apple’s plans, stating that “even a thoroughly documented, carefully thought-out, and narrowly scoped backdoor is still a backdoor.”
