The first two steps focus on identifying and removing videos that specifically encourage terrorism. Currently, YouTube uses a combination of video analysis software and human content flaggers to find and delete videos that break its community guidelines. “Machines can help identify problematic videos, but human experts still play a role in nuanced decisions about the line between violent propaganda and religious or newsworthy speech,” Walker wrote. Borderline videos that do not clearly violate those guidelines cannot be deleted outright, so the company’s basic plan is to hide them as best it can. “These will appear behind an interstitial warning and they will not be monetized, recommended or eligible for comments or user endorsements,” Walker wrote.
Source: Washington Post June 19, 2017 07:14 UTC
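The article describes a two-stage moderation pipeline: automated classification flags candidate videos, a human reviewer handles the nuanced calls, and borderline content is restricted rather than removed. A minimal sketch of that workflow in Python follows; every name, class, and threshold here (`triage`, `Video`, the 0.95 and 0.50 cutoffs) is a hypothetical illustration, not YouTube's actual system.

```python
from dataclasses import dataclass, field
from enum import Enum, auto
from typing import Optional

class Action(Enum):
    REMOVE = auto()    # clear guideline violation: delete the video
    RESTRICT = auto()  # borderline: hide behind an interstitial warning
    ALLOW = auto()     # no action needed

@dataclass
class Restrictions:
    # Mirrors the restrictions quoted in the article.
    interstitial_warning: bool = False
    monetized: bool = True
    recommendable: bool = True
    comments_enabled: bool = True

@dataclass
class Video:
    video_id: str
    classifier_score: float  # hypothetical ML score in [0, 1]
    restrictions: Restrictions = field(default_factory=Restrictions)

def triage(video: Video, human_says_violation: Optional[bool] = None) -> Action:
    """Hypothetical two-stage pipeline: machines flag candidates,
    humans decide the nuanced, borderline cases."""
    if video.classifier_score >= 0.95:  # assumed high-confidence threshold
        return Action.REMOVE
    if video.classifier_score >= 0.50:  # assumed "send to human review" threshold
        if human_says_violation is None:
            raise ValueError("borderline video requires human review")
        if human_says_violation:
            return Action.REMOVE
        # Borderline but not a violation: restrict instead of deleting,
        # per the policy quoted in the article.
        video.restrictions = Restrictions(
            interstitial_warning=True,
            monetized=False,
            recommendable=False,
            comments_enabled=False,
        )
        return Action.RESTRICT
    return Action.ALLOW
```

For example, `triage(Video("abc123", 0.70), human_says_violation=False)` returns `Action.RESTRICT` and flips the video's restriction flags, while a score of 0.97 is removed without human input; the split between automatic removal and human escalation is the design choice the quoted passage emphasizes.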