Shortly after reports today that Apple will start scanning iPhones for child-abuse images, the company confirmed its plan and provided details in a news release and technical summary.
"Apple’s method of detecting known CSAM (child sexual abuse material) is designed with user privacy in mind," Apple’s announcement said. "Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC (National Center for Missing and Exploited Children) and other child security organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices."
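The mechanism Apple describes, matching an image's hash against a stored database of known-CSAM hashes on the device itself, can be sketched at a very high level. The snippet below is a conceptual illustration only: Apple's actual system uses a perceptual hashing scheme (NeuralHash) so that visually similar images still match, plus cryptographic protocols so the database remains unreadable, whereas this sketch substitutes a plain cryptographic hash and a simple set lookup. All names and data here are hypothetical.

```python
import hashlib

# Hypothetical database of flagged-image hashes, standing in for the set
# Apple says is derived from NCMEC-provided CSAM hashes. In the real system
# these are perceptual hashes stored in an unreadable (blinded) form.
KNOWN_HASHES = {
    hashlib.sha256(b"example-flagged-image-bytes").hexdigest(),
}

def image_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash. SHA-256 matches only exact bytes;
    a real perceptual hash also matches resized or re-encoded copies."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_database(image_bytes: bytes) -> bool:
    """On-device check: does this image's hash appear in the known set?"""
    return image_hash(image_bytes) in KNOWN_HASHES
```

Note that a production deployment does not do a bare set lookup like this: Apple's technical summary describes private set intersection and threshold secret sharing, so the device learns nothing about which (if any) images matched until a reporting threshold is crossed. Those protections are omitted here for clarity.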
Apple unveiled a new slate of privacy features at WWDC, set to debut with iOS 15 this fall. The upcoming operating system will include a "privacy report" that shows users which applications collect their personal data, telling them when an app accessed their iPhone's photo library, microphone, or contacts. Apple also announced that its Mail app will offer more privacy protections and revealed a new service to hide online traffic from internet providers.
Apple also announced a new item-tracking device, AirTag, whose exact location can be tracked using nearby iPhones. While this is useful when someone has lost their keys, it raises privacy concerns because the item's location is shared, although Apple says the product was designed with privacy at its heart.
"We're not against digital advertising," Cook said. "I think digital advertising is going to thrive in any situation, because more and more time is spent online, less and less is spent on linear TV. And digital advertising will do well in any situation. The question is, do we allow the building of this detailed profile to exist without your consent?"
A study of App Privacy labels on the App Store found that Facebook's apps were the most invasive, collecting the most data from users.