Demanding that platforms assume liability for their users’ speech would at best exacerbate the accidental removal of innocent speech. Flagged posts, whether reported by other users, human moderators, or algorithms, are placed in a queue for adjudication, and a platform that could be sued over anything in that queue has a strong incentive to err on the side of taking it down. Section 230 instead allows platforms to remain open by default and to deal with misuse when it occurs, giving a voice to everyone with an internet connection. The alternative is the traditional publisher model, in which everything is vetted before it appears. That approach lets publishers safely assume full ownership of the speech they publish, but it dramatically limits who can speak: it often produces consistently high-quality speech, yet it tends to favor some perspectives over others, offering only a narrow slice of elite sentiment.
Source: Daily Sun January 28, 2021 09:45 UTC