Apple will use on-device matching technology that checks images against a database of known child abuse image hashes provided by NCMEC and other child safety organisations. Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes.
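
To make the hash lookup concrete, here is a minimal sketch in Python. SHA-256 and the sample byte strings are stand-ins of mine: the real system uses a perceptual hash computed from image content so that visually identical copies map to the same value, which an exact byte hash like SHA-256 does not do.

```python
import hashlib

# Hypothetical stand-in for the device's perceptual image hash. SHA-256 over
# raw bytes is used only to illustrate the database lookup, not the hashing.
def image_hash(image_bytes: bytes) -> str:
    return hashlib.sha256(image_bytes).hexdigest()

# Database of known CSAM hashes, as shipped to the device.
known_csam_hashes = {
    image_hash(b"known-bad-image-1"),
    image_hash(b"known-bad-image-2"),
}

def matches_known_csam(image_bytes: bytes) -> bool:
    """On-device check run before the image is stored in iCloud Photos."""
    return image_hash(image_bytes) in known_csam_hashes

print(matches_known_csam(b"known-bad-image-1"))  # True
print(matches_known_csam(b"holiday-photo.jpg"))  # False
```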

The matching process uses a cryptographic technology called private set intersection, which determines whether there is a match without revealing the result. The device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image. This voucher is uploaded to iCloud Photos together with the image.
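
Private set intersection can be illustrated with a toy Diffie-Hellman-style blinding scheme: because exponentiation commutes, two parties can double-blind their values and compare them without revealing the underlying sets. This is a generic textbook sketch, not Apple's published protocol, and in Apple's design the device itself does not learn the outcome; here, for simplicity, it does.

```python
import hashlib
import secrets

P = 2**127 - 1  # a Mersenne prime, used purely as a demo modulus (not secure)

def to_group_element(item: bytes) -> int:
    """Hash an item into the multiplicative group mod P."""
    return int.from_bytes(hashlib.sha256(item).digest(), "big") % P

def blind(element: int, key: int) -> int:
    return pow(element, key, P)

# Each party holds a secret blinding key.
device_key = secrets.randbelow(P - 2) + 2
server_key = secrets.randbelow(P - 2) + 2

# The server blinds its known hashes once with its own key.
server_set = [b"known-bad-image-1", b"known-bad-image-2"]
server_blinded = {blind(to_group_element(x), server_key) for x in server_set}

# The device blinds its item, the server re-blinds it, and the device
# re-blinds the server's values; equal items collide after double blinding.
device_item = b"known-bad-image-1"
once = blind(to_group_element(device_item), device_key)        # device -> server
twice = blind(once, server_key)                                # server -> device
server_double = {blind(v, device_key) for v in server_blinded}

print(twice in server_double)  # True: the item is in the server's set
```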

Using another technology called threshold secret sharing, the system ensures the contents of the safety vouchers cannot be interpreted by Apple unless the iCloud Photos account crosses a threshold of known CSAM content. Only when the threshold is exceeded does the cryptographic technology allow Apple to interpret the contents of the safety vouchers associated with the matching CSAM images. Apple then manually reviews each report to confirm there is a match, disables the user’s account, and sends a report to NCMEC.
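
The threshold mechanism can be sketched with standard Shamir secret sharing, a common way to implement threshold secret sharing (an illustrative assumption; Apple has not published its construction at this level of detail). Each matching voucher would carry one share of a decryption key; the key, and hence the voucher contents, only becomes recoverable once the number of shares reaches the threshold.

```python
import secrets

PRIME = 2**127 - 1  # field modulus for the demo

def split_secret(secret: int, threshold: int, num_shares: int):
    """Shares are points on a random degree-(threshold - 1) polynomial."""
    coeffs = [secret] + [secrets.randbelow(PRIME) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, num_shares + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * -xj % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

key = secrets.randbelow(PRIME)
shares = split_secret(key, threshold=5, num_shares=12)  # one share per voucher
assert reconstruct(shares[:5]) == key   # at the threshold: key recovered
assert reconstruct(shares[:4]) != key   # below it: reconstruction fails
print("threshold reached; vouchers can now be decrypted")
```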


