Lily Hay Newman / Wired:
Ahead of a child safety group's campaign, Apple details why it dropped iCloud CSAM scanning, saying it could be a “slippery slope of unintended consequences” — Child safety group Heat Initiative plans to launch a campaign pressing Apple on child sexual abuse material scanning and user reporting.