Facebook is introducing a new AI tool that will detect and remove intimate pictures and videos posted without the subject's consent. The company says the machine learning tool will ensure such posts, commonly referred to as 'revenge porn', are taken down before the victim has to report them. At present, Facebook users or victims of unauthorised uploads must flag the inappropriate pictures before content moderators will review them. Social media sites across the board have struggled to monitor and contain abusive content uploaded by users, from violent threats to inappropriate photos. The technology, which will be used across Facebook and Instagram, is trained on pictures that Facebook has previously confirmed were revenge porn.
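Facebook has not published implementation details, but the approach described, a classifier trained on previously confirmed examples, matches standard supervised image classification. The sketch below (in PyTorch) shows what such a pipeline could look like in minimal form: fine-tuning a pretrained backbone on human-labelled images. The directory layout, model choice, and hyperparameters are all illustrative assumptions, not Facebook's actual system.

```python
# Hypothetical sketch of the supervised approach the article describes:
# fine-tune a pretrained image classifier on human-confirmed violating
# examples (label 1) versus benign images (label 0). Nothing here
# reflects Facebook's actual models, data, or infrastructure.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

# Standard ImageNet preprocessing for a pretrained backbone.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# Assumes a hypothetical layout of labelled/{violating,benign}/*.jpg,
# i.e. images that moderators have already confirmed either way.
train_set = datasets.ImageFolder("labelled", transform=preprocess)
train_loader = DataLoader(train_set, batch_size=32, shuffle=True)

# Swap the classification head of a pretrained ResNet for a single
# binary output (violating vs. not).
model = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 1)

criterion = nn.BCEWithLogitsLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

model.train()
for epoch in range(3):
    for images, labels in train_loader:
        optimizer.zero_grad()
        logits = model(images).squeeze(1)
        loss = criterion(logits, labels.float())
        loss.backward()
        optimizer.step()

# At inference time, a high sigmoid score on a new upload would queue it
# for takedown or human review rather than waiting for a user report.
```

In a deployed system of this kind, the classifier's output would typically gate a human review step rather than trigger removal outright, which is consistent with the article's note that flagged content currently goes to content moderators.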
Source: The Star, March 18, 2019, 06:45 UTC