

Apple will scan Photos and Messages for Child Sexual Abuse Materials (CSAM)
source link: https://macdailynews.com/2021/08/05/apple-will-scan-photos-and-messages-for-child-sexual-abuse-materials-csam/

Thursday, August 5, 2021, 3:58 pm (updated Friday, August 6, 2021)
Apple has released the following information via a new webpage entitled “Expanded Protections for Children” in which the company explains it will scan Photos and Messages for Child Sexual Abuse Materials (CSAM).
Expanded Protections for Children
At Apple, our goal is to create technology that empowers people and enriches their lives — while helping them stay safe. We want to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material (CSAM).
Apple is introducing new child safety features in three areas, developed in collaboration with child safety experts. First, new communication tools will enable parents to play a more informed role in helping their children navigate communication online. The Messages app will use on-device machine learning to warn about sensitive content, while keeping private communications unreadable by Apple.
Next, iOS and iPadOS will use new applications of cryptography to help limit the spread of CSAM online, while designing for user privacy. CSAM detection will help Apple provide valuable information to law enforcement on collections of CSAM in iCloud Photos.
Finally, updates to Siri and Search provide parents and children expanded information and help if they encounter unsafe situations. Siri and Search will also intervene when users try to search for CSAM-related topics.
These features are coming later this year in updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey.*
This program is ambitious, and protecting children is an important responsibility. These efforts will evolve and expand over time.
Communication safety in Messages
The Messages app will add new tools to warn children and their parents when receiving or sending sexually explicit photos.
When receiving this type of content, the photo will be blurred and the child will be warned, presented with helpful resources, and reassured it is okay if they do not want to view this photo. As an additional precaution, the child can also be told that, to make sure they are safe, their parents will get a message if they do view it. Similar protections are available if a child attempts to send sexually explicit photos. The child will be warned before the photo is sent, and the parents can receive a message if the child chooses to send it.
Messages uses on-device machine learning to analyze image attachments and determine if a photo is sexually explicit. The feature is designed so that Apple does not get access to the messages.
This feature is coming in an update later this year to accounts set up as families in iCloud for iOS 15, iPadOS 15, and macOS Monterey.*
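To make the flow concrete, here is a minimal Python sketch of the warn-and-blur logic described above. Everything in it (the classifier stand-in, the threshold value, and the field names) is hypothetical and invented for illustration; Apple has not published its on-device model or APIs.

```python
# Hypothetical sketch of the Messages warn/blur flow described above.
# None of these names are Apple APIs; the classifier is a stand-in.
from dataclasses import dataclass

EXPLICIT_THRESHOLD = 0.9  # assumed confidence cutoff, chosen for illustration

@dataclass
class ChildAccount:
    age: int
    parental_notifications_enabled: bool

def classifier_score(image_bytes: bytes) -> float:
    """Stand-in for the on-device ML model; would return P(sexually explicit)."""
    return 0.0  # placeholder; a real system runs a local image classifier here

def handle_incoming_photo(image_bytes: bytes, account: ChildAccount) -> dict:
    """Decide whether to blur the photo and warn the child before it is shown."""
    if classifier_score(image_bytes) < EXPLICIT_THRESHOLD:
        return {"blur": False, "warn": False, "notify_parents_if_viewed": False}
    return {
        "blur": True,   # photo is shown blurred until the child chooses to view it
        "warn": True,   # child sees a warning plus helpful resources
        "notify_parents_if_viewed": account.parental_notifications_enabled,
    }
```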

Siri will provide resources and help around searches related to CSAM.

CSAM detection
Another important concern is the spread of Child Sexual Abuse Material (CSAM) online. CSAM refers to content that depicts sexually explicit activities involving a child.
To help address this, new technology in iOS and iPadOS* will allow Apple to detect known CSAM images stored in iCloud Photos. This will enable Apple to report these instances to the National Center for Missing and Exploited Children (NCMEC). NCMEC acts as a comprehensive reporting center for CSAM and works in collaboration with law enforcement agencies across the United States.
Apple’s method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.
Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes. This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result. The device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image. This voucher is uploaded to iCloud Photos along with the image.
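As a rough illustration of the per-image pipeline, here is a deliberately simplified Python sketch: a plain hash-set membership check that produces a voucher-like record. It is not NeuralHash and it is not private set intersection (real PSI keeps the match result hidden from both the device and the server until the threshold is crossed); every name and structure here is hypothetical.

```python
# Simplified sketch of per-image matching and voucher creation. This is NOT
# Apple's NeuralHash or PSI protocol; the database and field names are made up.
import hashlib
import os

# Hypothetical blinded hash database shipped with the OS (empty in this sketch).
KNOWN_HASH_DATABASE: set[bytes] = set()

def image_hash(image_bytes: bytes) -> bytes:
    """Stand-in for a perceptual hash; SHA-256 just keeps the sketch runnable."""
    return hashlib.sha256(image_bytes).digest()

def make_safety_voucher(image_bytes: bytes) -> dict:
    """Build the record that would be uploaded to iCloud Photos with the image."""
    matched = image_hash(image_bytes) in KNOWN_HASH_DATABASE
    # In the system Apple describes, private set intersection encrypts this
    # result so that neither the device nor the server can read it directly.
    return {"encrypted_payload": os.urandom(32), "match": matched}
```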
Using another technology called threshold secret sharing, the system ensures the contents of the safety vouchers cannot be interpreted by Apple unless the iCloud Photos account crosses a threshold of known CSAM content. The threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account.
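Threshold secret sharing itself is a standard cryptographic primitive. The textbook Shamir scheme below (not Apple's implementation, and using demo-grade randomness) illustrates the property the press release relies on: nothing about the secret can be recovered until the number of shares reaches the threshold.

```python
# Textbook Shamir threshold secret sharing over a prime field. Illustrative only;
# the `random` module is not cryptographically secure and this is not Apple's code.
import random

PRIME = 2**127 - 1  # a Mersenne prime, large enough for this small demo

def make_shares(secret: int, threshold: int, count: int) -> list[tuple[int, int]]:
    """Split `secret` into `count` shares; any `threshold` of them reconstruct it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    return [(x, sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME)
            for x in range(1, count + 1)]

def reconstruct(shares: list[tuple[int, int]]) -> int:
    """Lagrange interpolation at x = 0 recovers the secret from >= threshold shares."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

shares = make_shares(secret=123456789, threshold=3, count=5)
assert reconstruct(shares[:3]) == 123456789  # at the threshold: secret recovered
assert reconstruct(shares[:2]) != 123456789  # below it: result is garbage (w.h.p.)
```

In the design Apple describes, this is the mechanism that keeps voucher contents unreadable until an account accumulates more than the threshold number of matches.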
Only when the threshold is exceeded does the cryptographic technology allow Apple to interpret the contents of the safety vouchers associated with the matching CSAM images. Apple then manually reviews each report to confirm there is a match, disables the user’s account, and sends a report to NCMEC. If a user feels their account has been mistakenly flagged they can file an appeal to have their account reinstated.
This innovative new technology allows Apple to provide valuable and actionable information to NCMEC and law enforcement regarding the proliferation of known CSAM. And it does so while providing significant privacy benefits over existing techniques since Apple only learns about users’ photos if they have a collection of known CSAM in their iCloud Photos account. Even in these cases, Apple only learns about images that match known CSAM.
Expanding guidance in Siri and Search
Apple is also expanding guidance in Siri and Search by providing additional resources to help children and parents stay safe online and get help with unsafe situations. For example, users who ask Siri how they can report CSAM or child exploitation will be pointed to resources for where and how to file a report.
Siri and Search are also being updated to intervene when users perform searches for queries related to CSAM. These interventions will explain to users that interest in this topic is harmful and problematic, and provide resources from partners to get help with this issue.

These updates to Siri and Search are coming later this year in an update to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey.*
More Information
We have provided more information about these features in the documents below, including technical summaries, proofs, and independent assessments of the CSAM-detection system from cryptography and machine learning experts.
Expanded Protections for Children — Technology Summary (PDF)
CSAM Detection — Technical Summary (PDF)
Apple PSI System — Security Protocol and Analysis (PDF)
Technical Assessment of CSAM Detection — Benny Pinkas (PDF)
Technical Assessment of CSAM Detection — David Forsyth (PDF)
Technical Assessment of CSAM Detection — Mihir Bellare (PDF)
Alternative Security Proof of Apple PSI System — Mihir Bellare (PDF)
MacDailyNews Take: “These efforts will evolve and expand over time.” If you didn’t shudder at that line, you should have.
Regardless of the security and privacy protections of Apple’s system, this seems like a slippery slope. Images of the Trojan Horse also spring to mind as we quote ourselves:
Think of The Children™. Whenever you hear that line of horseshit, look for ulterior motives. — MacDailyNews, September 30, 2014
Again, as it’s likely meant to be, this sounds wonderful at first glance (detecting and rooting out purveyors of child pornography – why, everyone’s for that!) and horrible once you think about it for more than a second (massive, awful potential for misuse).
It’s a huge can of worms. And it will do much to negate Apple’s voluminous claims of protecting users’ privacy, regardless of the privacy protections in place.
All of the marketing money, TV ads, and solemn privacy pronouncements from Tim Cook over the last several years can’t stand up to the simple fact that now, in a nutshell, Apple will scan Photos and Messages.
It doesn’t matter what they’re scanning for, because if they can scan for one thing, they can scan for anything.
Smart people will begin looking for alternatives to Apple's iCloud Photos service and to its Photos and Messages apps.
To turn off your iCloud Photos everywhere, follow these steps:
• On your iPhone, iPad, or iPod touch, go to Settings > [your name] > iCloud > Manage Storage > Photos, then tap Disable & Delete.
• On your Mac, go to Apple menu > System Preferences, then click Apple ID. Choose iCloud, then click Manage. Select Photos, then click Turn Off and Delete.
If you change your mind, follow the steps above on your device, then select Undo Delete.
Photos and videos are stored in your account for 30 days. To download your photos and videos on your iOS device, go to Settings > [your name] > iCloud > Photos and select Download and Keep Originals. On your Mac, open Photos, choose Photos > Preferences, then click iCloud, then select Download Originals to this Mac. You can also select the photos and videos that you want to download from iCloud.com.
We hope Apple’s scanning system is never used for anything besides CSAM detection, but given the pressure that governments can wield and the extent to which Apple is beholden to China, we highly doubt it.
See also: How to jailbreak your iPhone
I’ve had independent confirmation from multiple people that Apple is releasing a client-side tool for CSAM scanning tomorrow. This is a really bad idea.
— Matthew Green (@matthew_d_green) August 4, 2021
Initially I understand this will be used to perform client side scanning for cloud-stored photos. Eventually it could be a key ingredient in adding surveillance to encrypted messaging systems.
— Matthew Green (@matthew_d_green) August 5, 2021
This sort of tool can be a boon for finding child pornography in people’s phones. But imagine what it could do in the hands of an authoritarian government? https://t.co/nB8S6hmLE3
— Matthew Green (@matthew_d_green) August 5, 2021
The way Apple is doing this launch, they’re going to start with non-E2E photos that people have already shared with the cloud. So it doesn’t “hurt” anyone’s privacy.
But you have to ask why anyone would develop a system like this if scanning E2E photos wasn’t the goal.
— Matthew Green (@matthew_d_green) August 5, 2021
But even if you believe Apple won’t allow these tools to be misused, there’s still a lot to be concerned about. These systems rely on a database of “problematic media hashes” that you, as a consumer, can’t review.
— Matthew Green (@matthew_d_green) August 5, 2021
The idea that Apple is a “privacy” company has bought them a lot of good press. But it’s important to remember that this is the same company that won’t encrypt your iCloud backups because the FBI put pressure on them. https://t.co/tylofPfV13
— Matthew Green (@matthew_d_green) August 5, 2021
The theory is that you will trust Apple to only include really bad images. Say, images curated by the National Center for Missing and Exploited Children (NCMEC). You’d better trust them, because trust is all you have.
— Matthew Green (@matthew_d_green) August 5, 2021
But there are worse things than worrying about Apple being malicious. I mentioned that these perceptual hash functions were “imprecise”. This is on purpose. They’re designed to find images that look like the bad images, even if they’ve been resized, compressed, etc.
— Matthew Green (@matthew_d_green) August 5, 2021
This means that, depending on how they work, it might be possible for someone to make problematic images that “match” entirely harmless images. Like political images shared by persecuted groups. These harmless images would be reported to the provider.
— Matthew Green (@matthew_d_green) August 5, 2021
Regardless of what Apple’s long term plans are, they’ve sent a very clear signal. In their (very influential) opinion, it is safe to build systems that scan users’ phones for prohibited content.
That’s the message they’re sending to governments, competing services, China, you.
— Matthew Green (@matthew_d_green) August 5, 2021
Whether they turn out to be right or wrong on that point hardly matters. This will break the dam — governments will demand it from everyone.
And by the time we find out it was a mistake, it will be way too late.
— Matthew Green (@matthew_d_green) August 5, 2021
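To ground Green’s point about perceptual hashes being “imprecise” by design, here is a generic difference-hash (dHash) sketch, one of the simplest members of that family and unrelated to Apple’s actual NeuralHash. Visually similar images land within a small Hamming distance of one another, which is what makes matching robust to resizing and recompression, and also what makes engineered collisions conceivable. The Pillow imaging library and the example file names are assumptions for illustration.

```python
# Generic dHash perceptual hashing, illustrative of the technique class only.
# Not Apple's NeuralHash. Requires the Pillow library (pip install Pillow).
from PIL import Image

def dhash(path: str, size: int = 8) -> int:
    """Shrink to (size+1) x size grayscale and encode left-vs-right brightness gradients."""
    img = Image.open(path).convert("L").resize((size + 1, size))
    pixels = list(img.getdata())
    bits = 0
    for row in range(size):
        for col in range(size):
            left = pixels[row * (size + 1) + col]
            right = pixels[row * (size + 1) + col + 1]
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Count of differing bits; a small distance means 'visually similar'."""
    return bin(a ^ b).count("1")

# A resized or re-encoded copy of an image typically lands within a few bits of
# the original, so matching uses a distance cutoff rather than exact equality:
# is_match = hamming(dhash("original.jpg"), dhash("resized_copy.jpg")) <= 10
```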