

Apple criticised for system that detects child abuse
source link: https://www.bbc.com/news/technology-58124495

Apple is facing criticism over a new system that finds child sexual abuse material (CSAM) on US users' devices.
The technology will search for matches of known CSAM before an image is stored in iCloud Photos.
But there are concerns that the technology could be expanded and used by authoritarian governments to spy on their own citizens.
WhatsApp head Will Cathcart called Apple's move "very concerning".
Apple said that new versions of iOS and iPadOS - due to be released later this year - will have "new applications of cryptography to help limit the spread of CSAM online, while designing for user privacy".
The system will report a match, which is then reviewed by a human. Apple can then disable the user's account and report the case to law enforcement.
The company says that the new technology offers "significant" privacy benefits over existing techniques - as Apple only learns about users' photos if they have a collection of known child sex abuse material in their iCloud account.
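In broad terms, the matching step the article describes compares a fingerprint of each photo against a database of fingerprints of known abuse images. The following is a hypothetical, simplified sketch of that idea only: Apple's actual system uses a perceptual "NeuralHash" and cryptographic techniques so that neither side learns about non-matches, not a plain cryptographic hash lookup as shown here. All hashes and function names below are invented for illustration.

```python
import hashlib

# Placeholder digests standing in for a database of fingerprints of
# known images. In a real system these would be perceptual hashes
# supplied by a child-safety organisation, not SHA-256 digests.
KNOWN_HASHES = {
    hashlib.sha256(b"known-image-1").hexdigest(),
    hashlib.sha256(b"known-image-2").hexdigest(),
}

def matches_known_material(image_bytes: bytes) -> bool:
    """Return True if the image's digest appears in the known-hash set."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_HASHES

# A photo would be checked before upload; only matches are flagged
# for human review, as the article describes.
```

Note the key simplification: an exact-hash lookup like this misses images that are resized or re-encoded, which is why real systems use perceptual hashes that tolerate such changes.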
But WhatsApp's Mr Cathcart says the system "could very easily be used to scan private content for anything they or a government decides it wants to control. Countries where iPhones are sold will have different definitions on what is acceptable".
I read the information Apple put out yesterday and I'm concerned. I think this is the wrong approach and a setback for people's privacy all over the world.
People have asked if we'll adopt this system for WhatsApp. The answer is no. — Will Cathcart (@wcathcart) August 6, 2021
He argues that WhatsApp's system to tackle child sexual abuse material has reported more than 400,000 cases to the US National Center for Missing and Exploited Children without breaking encryption.
The Electronic Frontier Foundation, a digital rights group, has also criticised the move, labelling it "a fully-built system just waiting for external pressure to make the slightest change".
But some politicians have welcomed Apple's development.
Sajid Javid, UK Health Secretary, said it was time for others, especially Facebook, to follow suit.
US Senator Richard Blumenthal also praised Apple's move, calling it a "welcome, innovative and bold step".
"This shows that we can protect children and our fundamental privacy rights," he added.
Facebook and Apple don't like each other.
That dislike has come to a head in recent months over privacy.
Apple's Tim Cook has consistently beaten the drum of "privacy first".
He has not so subtly criticised Facebook's business model - that it essentially sells people's data to advertisers.
A recent feature of Apple's new iOS update asked users whether they wanted to be tracked around the internet when they downloaded a new app.
Facebook hated the move, and warned shareholders it could hurt its profits.
So it's not entirely surprising that Facebook-owned WhatsApp has come out so emphatically against Apple's new move.
Looking at it cynically, Apple's announcement is a chance for Facebook to tell the world that Apple isn't as keen on privacy as it likes to say.
But the WhatsApp chief isn't alone in his criticism. There are some very real concerns that this technology - in the wrong hands - could be used by governments to spy on their citizens.
Facebook has said in no uncertain terms that it thinks this vision of online safety is dangerous and should be canned.
Not for the first time, the two companies have illustrated totally different philosophical positions on one of the issues of our age - privacy.