WhatsApp’s claim that it cannot read your messages is simply not true, a new ProPublica report on the company’s content moderation system finds. We knew that WhatsApp moderators exist; that WhatsApp hands over metadata to law enforcement; and that the company has long shared user data amongst its ecosystem of data-thirsty apps. This report gives a clearer picture of the practices that, until now, Facebook has deliberately obscured in its attempt to sell users on a privacy-oriented platform: WhatsApp can read some of your messages if the recipient reports them.
WhatsApp has been fined €225 million for breaking the EU’s data privacy law by not telling its users how it was sharing their data with its parent company Facebook.
In one of the biggest fines relating to the General Data Protection Regulation (GDPR), the Irish data regulator, after coming under pressure from other European countries, imposed a penalty on the messaging service more than four times the level it had initially proposed.
The WhatsApp ruling came after Luxembourg fined Amazon a record €746 million in July for breaching GDPR and Ireland fined Twitter €450 million in December for not informing regulators about a data leak within 72 hours.
Apple has confirmed to me that it already scans iCloud Mail for CSAM, and has been doing so since 2019. It has not, however, been scanning iCloud Photos or iCloud backups.
The clarification came after I queried a rather odd statement by the company’s anti-fraud chief: that Apple was “the greatest platform for distributing child porn.” That immediately raised the question: if the company wasn’t scanning iCloud Photos, how could it know this?
Android has long had an Accessibility API that’s intended for developers to build apps and experiences aimed at helping persons with disabilities, though it isn’t always used for that purpose. Google intends for apps that use the Accessibility API to fall into a few categories, including screen readers, switch-based input systems, and voice-based input systems. The company’s very own “Android Accessibility Suite” app provides, as its name implies, a suite of accessibility tools so persons with disabilities can access their device. The latest update to the Android Accessibility Suite adds a new way for users to control their devices: “Camera Switches.”
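For context, the kind of app Google has in mind here is built on Android’s AccessibilityService class. The snippet below is a minimal sketch of such a service in Kotlin; the class name and log tag are illustrative and not taken from the Android Accessibility Suite or Camera Switches.

```kotlin
import android.accessibilityservice.AccessibilityService
import android.util.Log
import android.view.accessibility.AccessibilityEvent

// Minimal sketch of an accessibility service, the entry point Google expects
// screen readers, switch-based input, and voice-based input apps to use.
class ExampleSwitchService : AccessibilityService() {

    // Called for every accessibility event the system delivers to this
    // service, such as window changes or view focus changes.
    override fun onAccessibilityEvent(event: AccessibilityEvent?) {
        Log.d("ExampleSwitchService", "event type: ${event?.eventType}")
    }

    // Called when the system wants the service to stop providing feedback.
    override fun onInterrupt() {
        Log.d("ExampleSwitchService", "interrupted")
    }
}
```

To be picked up by the system, a service like this also has to be declared in the app manifest with the BIND_ACCESSIBILITY_SERVICE permission. That broad, system-level access is exactly why the API is attractive to apps that have nothing to do with accessibility.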
Shortly after reports today that Apple will start scanning iPhones for child-abuse images, the company confirmed its plan and provided details in a news release and technical summary.
"Apple’s method of detecting known CSAM (child sexual abuse material) is designed with user privacy in mind," Apple’s announcement said. "Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC (National Center for Missing and Exploited Children) and other child security organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices."