Apple details how its CSAM detection system is designed to prevent abuse

Apple has published a new document today that offers additional details on its recently announced child safety features. The company is responding to concerns about the potential of the new CSAM detection capability to turn into a backdoor, with details on the threshold it uses and more.

One of Apple’s most notable announcements today is that the system will be able to be audited by third parties. Apple explains that it will publish a Knowledge Base article containing the root hash of the encrypted CSAM hash database. Apple will also let users inspect the root hash of the encrypted database on their device and compare it to the hash published in the Knowledge Base article:

Apple will publish a Knowledge Base article containing a root hash of the encrypted CSAM hash database included with each version of every Apple operating system that supports this feature. Additionally, users will be able to inspect the root hash of the encrypted database present on their device and compare it to the root hash expected in the KB article. The correctness of the root hash calculation displayed to the user in the settings is subject to code inspection by security researchers like all other iOS device-side security claims.

This approach enables third-party technical audits: an auditor can confirm that for any given root hash of the encrypted CSAM database in the Knowledge Base article or on a device, the database was generated only from an intersection of hashes from participating child safety organizations, with no additions, removals, or changes. Facilitating the audit does not require the child safety organization to provide any sensitive information like raw hashes or the source images used to generate the hashes; it must provide only a non-sensitive attestation of the full database that it sent to Apple. Then, in a secure on-campus environment, Apple can provide technical proof to the auditor that the intersection and blinding were performed correctly. A participating child safety organization can decide to perform the audit as well.
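Apple doesn’t spell out how that root hash is constructed, but the verification it describes boils down to computing a digest of the on-device database and comparing it to the published value. Here is a minimal sketch, assuming a plain SHA-256 over the encrypted database bytes as a stand-in for Apple’s unspecified root hash construction; the function name and placeholder values are illustrative, not part of any Apple API:

```swift
import CryptoKit
import Foundation

// A minimal sketch, not Apple's implementation: Apple does not document how the
// published root hash is computed, so a plain SHA-256 digest over the encrypted
// database bytes stands in for it here. The point is the comparison step.
func rootHash(ofEncryptedDatabase data: Data) -> String {
    SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
}

let onDeviceDatabase = Data("encrypted-database-bytes".utf8)   // placeholder bytes
let publishedRootHash = "value-from-knowledge-base-article"    // placeholder value

if rootHash(ofEncryptedDatabase: onDeviceDatabase) == publishedRootHash {
    print("On-device database matches the published root hash")
} else {
    print("Mismatch: the on-device database differs from the published one")
}
```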

Apple also addressed the possibility that an organization could include something other than known CSAM content in the database. Apple says it will work with at least two child safety organizations, not under the control of the same government, to generate the database included in iOS:

Apple generates the on-device perceptual CSAM hash database through an intersection of hashes provided by at least two child safety organizations operating in separate sovereign jurisdictions, that is, not under the control of the same government. Any perceptual hashes appearing in only one participating child safety organization’s database, or only in databases from multiple agencies in a single sovereign jurisdiction, are discarded by this process and not included in the encrypted CSAM database that Apple includes in the operating system. This mechanism meets our source image correctness requirement.
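The intersection rule amounts to a simple filter: a perceptual hash survives only if organizations from at least two different jurisdictions independently provided it. A hypothetical sketch, with made-up types and hash strings standing in for Apple’s actual NeuralHash database format:

```swift
// Illustrative only: types, names, and hash values are assumptions, not Apple's.
struct ProviderDatabase {
    let jurisdiction: String          // e.g. "US", "UK"
    let perceptualHashes: Set<String> // placeholder for perceptual hash values
}

func buildIncludedDatabase(from providers: [ProviderDatabase]) -> Set<String> {
    var jurisdictionsByHash: [String: Set<String>] = [:]
    for provider in providers {
        for hash in provider.perceptualHashes {
            jurisdictionsByHash[hash, default: []].insert(provider.jurisdiction)
        }
    }
    // Keep only hashes vouched for by two or more separate jurisdictions.
    return Set(jurisdictionsByHash.filter { $0.value.count >= 2 }.keys)
}

let included = buildIncludedDatabase(from: [
    ProviderDatabase(jurisdiction: "US", perceptualHashes: ["a1", "b2", "c3"]),
    ProviderDatabase(jurisdiction: "UK", perceptualHashes: ["b2", "c3", "d4"]),
])
// included == ["b2", "c3"]; "a1" and "d4" appear in only one jurisdiction and are dropped.
print(included)
```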

Apple also offers new details about the manual review process that is performed after the threshold is reached:

Since Apple does not possess the CSAM images whose perceptual hashes make up the on-device database, it’s important to understand that the reviewers are not merely checking whether a given flagged image corresponds to an entry in Apple’s encrypted CSAM image database, that is, an entry in the intersection of hashes from at least two child safety organizations operating in separate sovereign jurisdictions. Instead, the reviewers confirm one thing only: that for an account that exceeded the match threshold, the positively matching images have visual derivatives that are CSAM. This means that if non-CSAM images were ever inserted into the on-device perceptual CSAM hash database, whether inadvertently or through coercion, there would be no effect unless Apple’s human reviewers were also informed which specific non-CSAM images they should flag (for accounts that exceed the match threshold), and were then coerced to do so.
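In other words, human review sits behind a gate: nothing is surfaced until an account crosses the match threshold, and even then reviewers see only the visual derivatives of the matching images. A rough sketch of that gate, with illustrative names and the initial threshold of roughly 30 matches that Apple has cited:

```swift
// A rough sketch of the review gate described above. The struct, function, and
// threshold value are illustrative assumptions, not taken from any real API.
struct FlaggedAccount {
    let positiveMatchCount: Int
    let visualDerivatives: [String]   // placeholders for the low-resolution derivatives
}

let matchThreshold = 30   // initial figure Apple has cited; subject to change

func derivativesEligibleForReview(_ account: FlaggedAccount) -> [String] {
    // Below the threshold, no review takes place and nothing is surfaced.
    guard account.positiveMatchCount >= matchThreshold else { return [] }
    // At or above it, reviewers confirm only that these derivatives are CSAM.
    return account.visualDerivatives
}
```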

You can find the full document released by Apple today, titled “Security Threat Model Review of Apple’s Child Safety Features,” here.
