This has been a dreadful few weeks for Apple on the privacy front—not what the iPhone maker needs in the run-up to the launch of iPhone 13 and iOS 15. A week ago, the company awkwardly (albeit inevitably) backtracked on its ill-conceived plan to screen its users’ photos on their devices to weed out known child abuse imagery.
When it comes to cloud photo storage, Google Photos leads the pack—four trillion photos and videos across more than a billion users. Millions of Apple users have Google Photos on their iPhones, iPads, and Macs, but Apple has just flagged a serious warning about Google’s platform and given its users a reason to delete the app.
Screening for CSAM is not controversial. All the major cloud platforms—including Google Photos—have done so for years. “Child sexual abuse material has no place on our platforms,” Google told me. “As we’ve outlined, we utilize a range of industry standard scanning techniques including hash-matching technology and artificial intelligence to identify and remove CSAM that has been uploaded to our servers.”
But Apple, it transpires, has not been doing the same. The company has not yet applied any such screening to iCloud Photos, and its reasoning for this seemingly surprising decision once again highlights the different privacy philosophies at play.
Apple’s controversial (now stalled) decision to screen for CSAM on-device rather than in the cloud was driven, the company said, by a desire to flag known imagery “while not learning any information about non-CSAM images.” In other words, all users should not have to surrender the privacy of all their content in order to flag a tiny minority.
The principle itself is sound enough. If your private iPhone doesn’t flag any potential CSAM matches, Apple’s servers can ignore all your content. If your iPhone does flag potential matches (at least 30 of them), then the server knows exactly where to look.
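The reporting logic described above can be sketched in a few lines. This is a deliberate simplification: Apple’s actual system uses a perceptual hash (NeuralHash) and cryptographic techniques such as private set intersection so the server learns nothing below the threshold, whereas the plain hashing and function names here are illustrative assumptions, not Apple’s implementation.

```python
import hashlib

MATCH_THRESHOLD = 30  # Apple's stated reporting threshold


def image_digest(data: bytes) -> str:
    # Stand-in for a perceptual hash such as NeuralHash; a
    # cryptographic hash only matches byte-identical files.
    return hashlib.sha256(data).hexdigest()


def count_matches(images, known_hashes) -> int:
    # Compare each image's digest against the known-image database.
    return sum(1 for img in images if image_digest(img) in known_hashes)


def should_report(images, known_hashes, threshold=MATCH_THRESHOLD) -> bool:
    # Below the threshold, the library stays opaque to the server;
    # only at or above it can human review be triggered.
    return count_matches(images, known_hashes) >= threshold
```

The key design point is the threshold: a handful of false positives never surfaces anything, and only a library crossing the 30-match bar becomes visible for review.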
The issue, though, is that despite detailed technical explanations and assurances, the concept of on-device screening didn’t land well. That “private iPhone” filtering simply came across as on-device spyware, raising the specter of scope creep, of ever more content being flagged at the behest of U.S. and overseas governments. And so, Apple has gone back to the drawing board for a rethink.