NEW DELHI: Tech giant Apple’s new child sexual abuse material (CSAM) detection features, announced on August 6, will include safeguards against manipulation by governments. These are meant to prevent any single government or law enforcement agency from tampering with the CSAM hash databases to surveil users. The iPhone maker announced two new automated features to improve child safety on its devices. In a technical paper, the company also noted that notifications are never sent to law enforcement directly. The new child safety features have drawn flak from privacy bodies such as the Electronic Frontier Foundation (EFF), which called them a backdoor into Apple’s systems, something law enforcement agencies and governments have long wanted.
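Apple’s published threat-model review describes one such safeguard: the on-device hash database includes only image hashes vouched for by at least two independent child-safety organizations operating under different governments, so no single government can unilaterally insert entries. A minimal sketch of that intersection rule, with all names and hash values hypothetical rather than taken from Apple’s implementation:

```python
# Illustrative sketch of the "database intersection" safeguard: only hashes
# that appear in the databases of two or more independent child-safety
# organizations make it into the on-device database. All names and hash
# values here are hypothetical, not Apple's actual data or code.
from collections import Counter

def build_ondevice_database(org_databases):
    """Keep only hashes supplied by at least two separate organizations."""
    counts = Counter()
    for db in org_databases.values():
        counts.update(set(db))  # each organization counts once per hash
    return {h for h, n in counts.items() if n >= 2}

# Hypothetical hash sets from two organizations in different jurisdictions.
org_a = {"a1", "b2", "c3"}
org_b = {"b2", "c3", "d4"}  # "d4" appears in only one database

ondevice = build_ondevice_database({"org_a": org_a, "org_b": org_b})
print(sorted(ondevice))  # only the shared hashes survive: ['b2', 'c3']
```

A hash contributed by only one organization, such as one planted by a single government, never reaches users’ devices under this rule.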
Source: Mint August 14, 2021 10:41 UTC