Apple unveiled plans to scan US iPhones for images of child sexual abuse, drawing applause from child protection groups but raising concern among some security researchers that the system could be misused by governments looking to surveil their citizens.

The tool, which Apple calls "neuralMatch", will detect known images of child sexual abuse without decrypting people's messages. If it finds a match, the image will be reviewed by a human, who can notify law enforcement if necessary.

Tech companies including Microsoft, Google and Facebook have for years been sharing "hash lists" of known images of child sexual abuse. Law enforcement, meanwhile, has long pressed for access to that information in order to investigate crimes such as terrorism and child sexual exploitation.
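The "hash lists" described above work by comparing a fingerprint of each image against a database of fingerprints of known material, so the images themselves never need to be shared. A minimal sketch of that lookup in Python, using a plain cryptographic hash for illustration (deployed systems such as Microsoft's PhotoDNA use perceptual hashes that tolerate resizing and re-encoding; the digest below is a hypothetical list entry, not real data):

```python
import hashlib

# Hypothetical hash list of known images (SHA-256 hex digests) - illustrative only.
KNOWN_HASHES = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def sha256_of(data: bytes) -> str:
    """Return the SHA-256 hex digest of raw image bytes."""
    return hashlib.sha256(data).hexdigest()

def matches_known_list(image_bytes: bytes) -> bool:
    """True if the image's exact fingerprint appears in the known list."""
    return sha256_of(image_bytes) in KNOWN_HASHES
```

Note that a cryptographic hash only matches byte-identical files; this is why real matching systems rely on perceptual hashing instead, which is more robust but also the source of the false-positive concerns researchers raise.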
Source: New Zealand Herald, August 05, 2021 19:41 UTC