Apple will report images of child sexual abuse detected on iCloud to law enforcement
source link: https://www.cnbc.com/2021/08/05/apple-will-report-child-sexual-abuse-images-on-icloud-to-law.html
- Apple will report child exploitation images uploaded to iCloud in the U.S. to law enforcement, the company said on Thursday.
- The new system will detect images known as Child Sexual Abuse Material (CSAM) using a process called hashing, in which each image is transformed into a unique number corresponding to it.
- Apple says that its system is more private for users than previous approaches to eliminating illegal images of child sexual abuse, because it uses sophisticated cryptography on both Apple’s servers and user devices.
Apple will report images of child exploitation uploaded to iCloud in the U.S. to law enforcement, the company said on Thursday.
The new system will detect images known as Child Sexual Abuse Material (CSAM) using a process called hashing, in which each image is transformed into a unique number corresponding to it.
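The hash-then-compare idea can be sketched as follows. This is purely illustrative: it uses SHA-256, an exact cryptographic hash, whereas Apple's system uses a perceptual hash designed to survive resizing and re-encoding, and the byte strings below stand in for real image files.

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    # Map an image's bytes to a fixed-size number (here, a hex string).
    # Identical inputs always yield the identical fingerprint.
    return hashlib.sha256(image_bytes).hexdigest()

same_1 = image_hash(b"pixels-of-image-A")
same_2 = image_hash(b"pixels-of-image-A")
different = image_hash(b"pixels-of-image-B")

assert same_1 == same_2      # same image, same number
assert same_1 != different   # different image, different number
```

Because only these numbers are compared, the comparison itself never needs to look at picture content.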
Apple started testing the system on Thursday, but most U.S. iPhone users won’t be part of it until an iOS 15 update later this year, Apple said.
The move brings Apple in line with other cloud services which already scan user files, often using hashing systems, for content that violates their terms of service, including child exploitation images.
It also represents a test for Apple, which says that its system is more private for users than previous approaches to eliminating illegal images of child sexual abuse, because it uses sophisticated cryptography on both Apple’s servers and user devices and doesn’t scan actual images, only hashes.
But many privacy-sensitive users still recoil from software that notifies governments about the contents on a device or in the cloud, and may react negatively to this announcement, especially since Apple has vociferously defended device encryption and operates in countries with fewer speech protections than the U.S.
Law enforcement officials around the world have also pressured Apple to weaken its encryption for iMessage and other software services like iCloud to investigate child exploitation or terrorism. Thursday’s announcement is a way for Apple to address some of those issues without giving up some of its engineering principles around user privacy.
How it works
Before an image is stored in Apple’s iCloud, Apple matches the image’s hash against a database of hashes provided by the National Center for Missing and Exploited Children (NCMEC). That database will be distributed in the code of iOS beginning with an update to iOS 15. The matching process is done on the user’s iPhone, not in the cloud, Apple said.
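A minimal sketch of that on-device matching step, with made-up hash values standing in for the NCMEC database and ignoring the cryptographic blinding Apple layers on top (in the actual design, the device itself never learns the match result):

```python
import hashlib

def sha(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

# Hypothetical stand-in for the hash database shipped inside iOS.
ncmec_hashes = {sha(b"known-image-1"), sha(b"known-image-2")}

def matches_database(image_bytes: bytes) -> bool:
    # Hash the photo on-device and test membership in the known set;
    # only hashes are compared, never the image content itself.
    return sha(image_bytes) in ncmec_hashes

assert matches_database(b"known-image-1") is True
assert matches_database(b"new-family-photo") is False
```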
If Apple then detects a certain number of violating files in an iCloud account, the system will upload a file that allows Apple to decrypt and see the images on that account. A person will manually review the images to confirm whether or not there’s a match.
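The threshold behavior described above can be sketched as a simple counter. The threshold value here is invented (Apple did not publish the number), and the real system enforces the rule cryptographically, using threshold secret sharing, rather than with a plaintext count:

```python
THRESHOLD = 10  # hypothetical; Apple did not disclose the actual value

def review_unlocked(matching_uploads: int, threshold: int = THRESHOLD) -> bool:
    # Apple can decrypt and manually review an account's flagged images
    # only once the account accumulates at least `threshold` matches.
    return matching_uploads >= threshold

assert review_unlocked(3) is False   # below threshold: nothing decryptable
assert review_unlocked(10) is True   # threshold reached: manual review possible
```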
Apple will only be able to review images that match content that’s already known and reported to these databases — it won’t be able to detect parents’ photos of their kids in the bath, for example, as these images won’t be part of the NCMEC database.
If the person doing the manual review concludes the system did not make an error, then Apple will disable the user’s iCloud account, and send a report to NCMEC or notify law enforcement if necessary. Users can file an appeal to Apple if they think their account was flagged by mistake, an Apple representative said.
The system only works on images uploaded to iCloud, which users can turn off, Apple said. Photos or other images on a device that haven’t been uploaded to Apple servers won’t be part of the system.
Some security researchers have raised concerns that this technology could eventually be used to identify other kinds of images, such as photos of a political protest. Apple said that its system is built so that it only works, and can only work, with images cataloged by NCMEC or other child safety organizations, and that the way it built the cryptography prevents it from being used for other purposes.
Apple can’t add additional hashes to the database, it said. Apple said that it is presenting its system to cryptography experts to certify that it can detect illegal child exploitation images without compromising user privacy.
Apple unveiled the feature on Thursday alongside other features intended to protect children from predators. In a separate feature, Apple will use machine learning on a child’s iPhone with a family account to blur images that may contain nudity, and parents can choose to be alerted when a child under 13 receives sexual content in iMessage. Apple also updated Siri with information about how to report child exploitation.