Apple Plans to Have iPhones Detect Child Pornography, Fueling Privacy Debate
Apple Inc. plans to introduce new iPhone software designed to identify and report collections of sexually exploitative images of children, aiming to bridge the yearslong divide between the company’s pledge to protect customer privacy and law enforcement’s desire to learn of illegal activity happening on the device.
The software, slated for release in an update for U.S. users later this year, is part of a series of changes Apple is preparing for the iPhone to protect children from sexual predators, the company said Thursday.
Apple, which has built much of its brand image in recent years on promises to safeguard users’ privacy, says that its new software will further enhance those protections by avoiding any need for widespread scanning of images on the company’s servers, something Apple currently doesn’t perform.
After news of Apple’s plans leaked out Wednesday, critics said they worried that by building software that can flag illegal content belonging to its users, Apple may be softening its stance on how it protects user data via encryption—a source of growing contention between the technology giant and law enforcement organizations over the past decade.
Apple’s system will use new techniques in cryptography and artificial intelligence to identify child sexual abuse material when it is stored using iCloud Photos, the company said. Using software that runs on both the iPhone and Apple’s cloud, Apple will detect whether images on the device match a known database of these illegal images. If a certain number of them—Apple declined to say exactly how many—are uploaded to iCloud Photos, Apple will review the images. If they are found to be illegal, Apple says it will report them to the National Center for Missing and Exploited Children.
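To make the described flow concrete, the sketch below shows, in simplified form, how on-device matching against a database of known image fingerprints followed by a review threshold could work. It is purely illustrative: Apple has not published the implementation details referenced in this article, and the hashing scheme, database format, and threshold value here are hypothetical stand-ins (Apple declined to say how many matches trigger review, and its real system is reported to use a perceptual image hash rather than a cryptographic one).

```swift
import Foundation
import CryptoKit

// Illustrative sketch only. The known-hash database, the hash function, and the
// review threshold are assumptions, not Apple's actual design.
struct KnownImageMatcher {
    let knownHashes: Set<String>   // fingerprints of known illegal images (hypothetical format)
    let reviewThreshold: Int       // number of matches before human review (unspecified by Apple)

    // Placeholder digest. A cryptographic hash only matches exact copies;
    // Apple's system is reported to use a perceptual hash so that
    // near-duplicate images still match.
    func fingerprint(of imageData: Data) -> String {
        SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
    }

    // Count uploads whose fingerprint appears in the known database and decide
    // whether the account should be escalated for review.
    func shouldFlagForReview(uploads: [Data]) -> Bool {
        let matches = uploads.filter { knownHashes.contains(fingerprint(of: $0)) }.count
        return matches >= reviewThreshold
    }
}
```

In this simplified model, no single match triggers anything; only an accumulation of matches above the (undisclosed) threshold would prompt the human review and subsequent reporting step the article describes.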