Shortly after reports today that Apple will start scanning iPhones for child-abuse images, the company confirmed its plan and provided details in a news release and technical summary.
"Apple’s method of detecting known CSAM (child sexual abuse material) is designed with user privacy in mind," Apple’s announcement said. "Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC (National Center for Missing and Exploited Children) and other child security organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices."
Separately, UK ministers are reportedly considering forcing Facebook to build a backdoor that would let security agencies and police read the contents of messages sent across its Messenger, WhatsApp, and Instagram chat services.
Edward Snowden revealed that the US government had been recording citizens' phone calls without their consent, exposing a surveillance system that was watching people all the time.