By Jason Snell for Macworld
Apple’s out-of-the-blue announcement last week that it was adding a bunch of features to iOS involving child sexual abuse materials (CSAM) generated an entirely predictable reaction. Or, more accurately, reactions. Those on the law-enforcement side of the spectrum praised Apple for its work, and those on the civil-liberties side accused Apple of turning iPhones into surveillance devices.
It’s not surprising at all that Apple’s announcement would be met with scrutiny. If anything is surprising about this whole story, it’s that Apple doesn’t seem to have anticipated all the pushback its announcement received. The company had to post a Frequently Asked Questions file in response. If Q’s are being FA’d in the wake of your announcement, you probably botched your announcement.
Such an announcement deserves scrutiny. The problem for those seeking to drop their hot takes about this issue is that it’s extremely complicated and there are no easy answers. That doesn’t mean that Apple’s approach is fundamentally right or wrong, but it does mean that Apple has made some choices that are worth exploring and debating.