Apple plans to make iOS detect child abuse photos: Report
05-August-2021

Apple is reportedly planning to announce photo identification tools that would detect child abuse images in iOS photo libraries.
Apple has previously removed individual apps from the App Store over child pornography concerns, but it is now said to be preparing a system-wide detection mechanism. Using photo hashing, iPhones could identify Child Sexual Abuse Material (CSAM) directly on the device.
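The hash-matching idea can be illustrated with a toy sketch. The snippet below is not Apple's system: real deployments use perceptual hashes (such as PhotoDNA or a neural hash) that match near-duplicate images, whereas a cryptographic hash like SHA-256, used here only for simplicity, matches exact file bytes. The blocklist digest set and function names are hypothetical.

```python
import hashlib
from pathlib import Path

# Hypothetical blocklist of known-image digests (illustrative only).
# Real systems use perceptual hashes that survive resizing/re-encoding;
# SHA-256 flags only byte-identical files.
KNOWN_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def file_hash(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's bytes."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def scan_library(paths):
    """Yield paths whose digest appears in the blocklist."""
    for p in paths:
        if file_hash(p) in KNOWN_HASHES:
            yield p
```

Because matching happens against precomputed digests, the scanner never needs to "see" the blocklisted images themselves, which is what makes a client-side design like the one Green describes feasible.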
Apple has not confirmed the plan, and so far the sole source is security expert Matthew Green, a cryptographer and associate professor at the Johns Hopkins Information Security Institute, AppleInsider reported on Thursday.
According to Green, the plan is initially to be client-side -- that is, to have all of the detection done on a user's iPhone. He cautions, however, that it could be the start of a process leading to surveillance of data traffic sent to and received from the phone.
"Eventually it could be a key ingredient in adding surveillance to encrypted messaging systems," Green said.
"The ability to add scanning systems like this to E2E [end to end encryption] messaging systems has been a major 'ask' by law enforcement the world over," he added.
According to Green, this sort of tool can be a boon for finding child pornography on people's phones.
Green and Johns Hopkins University have also previously worked with Apple to fix a security bug in Messages. -IANS