Apple employees raise concerns internally over CSAM detection plans

Apple employees are now joining the chorus of people concerned about Apple’s plans to scan iPhone users’ photo libraries for CSAM, or child sexual abuse material, speaking out internally about how the technology could be used to scan user photos for other types of content, according to a report from Reuters.

According to Reuters, an unspecified number of Apple employees have taken to internal Slack channels to voice concerns about CSAM detection. Specifically, employees fear that governments could force Apple to use the technology for censorship by searching for content other than CSAM. Some also worry that Apple is damaging its reputation as a leader in privacy.

Apple employees have flooded an internal Slack channel with more than 800 messages about the plan, which was announced a week ago, workers who asked not to be identified told Reuters. Many expressed concern that the feature could be exploited by repressive governments seeking to find other material for censorship or arrests, according to workers who saw the days-long thread.

Apple’s past security changes have also prompted concern among employees, but the volume and duration of the new debate is surprising, the workers said. Some posters worried that Apple was damaging its reputation as a leader in privacy.

Apple employees in roles related to user security are not believed to have been part of the internal protest, according to the report.

Since the announcement last week, Apple has been bombarded with criticism of its CSAM detection plans, which are still expected to roll out with iOS 15 and iPadOS 15 this fall. Concerns mostly revolve around how the technology could become a slippery slope for future abuse by oppressive governments and regimes.

Apple has strongly rejected the idea that the on-device technology used to detect CSAM could be used for other purposes. In an FAQ document it published, the company said it would vehemently refuse any such request from governments.

Could governments force Apple to add non-CSAM images to the hash list?
Apple will refuse such requests. Apple’s CSAM detection capability is designed only to detect known CSAM images stored in iCloud Photos that have been identified by experts from NCMEC and other child safety groups. We have already faced requests to create and deploy government-mandated changes that degrade user privacy, and we have firmly refused those requests. We will continue to refuse them in the future. Let us be clear: this technology is limited to detecting CSAM stored in iCloud Photos, and we will not accede to any request from a government to extend it. Additionally, Apple performs a human review before reporting to NCMEC. In the event that the system flags photos that do not match known CSAM images, the account will not be deactivated and no report will be filed with NCMEC.
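For illustration only, the sketch below follows the flow the FAQ describes: on-device matching of photo hashes against a list of known CSAM hashes, with human review happening only after a match threshold is crossed. The hash type, threshold value, and helper names here are hypothetical stand-ins; Apple’s actual system relies on NeuralHash and cryptographic safety vouchers, which this sketch does not reproduce.

```swift
import Foundation

// Illustrative stand-in for a perceptual image hash (not NeuralHash).
typealias PerceptualHash = String

struct MatchVoucher {
    let photoID: UUID
    let hash: PerceptualHash
}

// Hypothetical hash list of known CSAM images, as supplied by NCMEC
// and other child safety groups (placeholder values).
let knownCSAMHashes: Set<PerceptualHash> = ["hash-a", "hash-b"]

// Hypothetical reporting threshold: nothing is escalated to human
// review until the number of matches exceeds this value.
let reviewThreshold = 30

// On-device step: compare each photo's hash against the known list
// and produce vouchers for matches only.
func scanLibrary(_ photos: [(id: UUID, hash: PerceptualHash)]) -> [MatchVoucher] {
    photos.compactMap { photo in
        knownCSAMHashes.contains(photo.hash)
            ? MatchVoucher(photoID: photo.id, hash: photo.hash)
            : nil
    }
}

// Follow-up step as described in the FAQ: human review happens only
// after the threshold is crossed; otherwise the account stays active
// and no report is filed with NCMEC.
func evaluateAccount(vouchers: [MatchVoucher]) {
    if vouchers.count > reviewThreshold {
        print("Escalating \(vouchers.count) matches for human review before any report to NCMEC.")
    } else {
        print("Below threshold: account remains active, no report filed.")
    }
}
```

The point critics raise, and that the sketch makes visible, is that nothing in the matching logic itself is specific to CSAM: swapping the contents of the hash list would change what gets flagged, which is why Apple’s assurance rests on policy rather than on the technology.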

An open letter criticizing Apple and calling on the company to immediately halt its plan to deploy CSAM detection has collected more than 7,000 signatures at the time of writing. The head of WhatsApp has also weighed in on the debate.
