“We have always supported diverse points of view being represented on the App Store, but there is no place on our platform for threats of violence and illegal activity,” Apple said in a statement Saturday. “Parler has not taken adequate measures to address the proliferation of these threats to people’s safety.”

Apple had given Parler 24 hours on Thursday to establish restrictions on harmful speech, according to an email obtained by BuzzFeed. “We have received numerous complaints regarding objectionable content in your Parler service, accusations that the Parler app was used to plan, coordinate, and facilitate the illegal activities in Washington D.C. on January 6, 2021 that led (among other things) to loss of life, numerous injuries, and the destruction of property,” said the email. “The app also appears to continue to be used to plan and facilitate yet further illegal and dangerous activities.”

Google provided the same justification on Friday when it dropped Parler from its app offerings on the Google Play store. Company policies require that apps which display user-generated content have “moderation policies and enforcement that removes egregious content like posts that incite violence,” Google said in a statement.
Source: Huffington Post January 10, 2021 02:42 UTC