SAN FRANCISCO: Apple on Thursday said that iPhones and iPads will soon start detecting images containing child sexual abuse and reporting them as they are uploaded to iCloud.
The software tweak to Apple's operating systems will monitor pictures, allowing Apple to report findings to the National Center for Missing and Exploited Children, according to a statement by the Silicon Valley-based tech giant.
"We want to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of child sexual abuse material (CSAM)," Apple said.
The new technology will allow the phones' operating systems to match abusive photos on a user's phone against a database of known CSAM images provided by child safety organizations, then flag the images as they are uploaded to iCloud, Apple said.
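In outline, the matching step works like a set-membership check: each photo is reduced to a fingerprint and compared against fingerprints of known images. The sketch below is purely illustrative and not Apple's implementation — the real system reportedly uses a perceptual "NeuralHash" with cryptographic protections, whereas this toy version uses a plain SHA-256 digest and a hypothetical hash set to show the general idea.

```python
# Toy illustration of hash-based image matching (NOT Apple's actual system).
# Assumption: a set of digests of known images stands in for the real,
# organization-supplied CSAM hash database.
import hashlib

# Hypothetical database of known-image digests
KNOWN_HASHES = {
    hashlib.sha256(b"known-image-bytes-1").hexdigest(),
    hashlib.sha256(b"known-image-bytes-2").hexdigest(),
}

def flag_on_upload(image_bytes: bytes) -> bool:
    """Return True if the image's digest matches a known entry."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_HASHES
```

A key difference from this sketch is that a cryptographic hash only matches byte-identical files, while a perceptual hash like the one Apple describes is designed to also match resized or lightly edited copies of the same image.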
The feature is part of a series of tools heading to Apple mobile devices, according to the company.
Apple's iPhone messaging app will additionally use machine learning to recognize and warn children and their parents when receiving or sending sexually explicit photos, the company said in the statement.
And personal assistant Siri will be taught to "intervene" when users try to search topics related to child sex abuse, according to the company.