Why Facebook Should Change Its Content Moderation Policy

The First Amendment protects citizens from government interference with their speech, but its protections do not extend to infringements on free speech by private companies. Should private media giants such as Facebook adopt a similar framework to protect the free speech of their users? Facebook should follow the limits the First Amendment places on speech regulation and accordingly reinstate Trump’s account.

1) Facebook is not a state actor but should nevertheless protect free speech.

Modern private media platforms such as Facebook are the successors to the constitutionally protected public forums of the founding era. Facebook can be likened to a public square where groups of people chat and share information; users broadcasting headlines and other hot topics are akin to the town criers of the founding. The policy behind the First Amendment’s protection of public discourse should therefore apply with equal force. The First Amendment does not prevent a private company from filtering content. Nevertheless, Facebook’s role as the modern replacement for historical public forums counsels that it should adopt protections for speech similar to those enshrined in the First Amendment.

For example, from time immemorial, a park or a town square was a public forum where people would come to express their ideas. Today, this is no longer the case. In today’s parks, one may encounter the occasional leafleter, but the majority of ideas are expressed through social media. Another reason a town square can be compared to social media is that both are public forums that are freely accessible and cheap. This ease of access and low barrier to entry support the conclusion that social media has replaced parks and town squares as the venue for expressing ideas.

To be sure, one can also write a blog; however, unless a blog is well established, its audience is limited, and readers are difficult to acquire. Facebook allows for an extended following of friends, and it is easier to acquire friends on Facebook than readers of a blog. Overall, it takes considerably more time and resources to build a blog readership. Unless someone is already a prominent public figure, the chances of acquiring a following large enough to reach a broad audience are slim.

2) Removing social media profiles undermines First Amendment protections.

Among other things, Facebook’s content moderation policy removes “language that incites or facilitates serious violence,” which is further clarified to lead to the removal or disabling of accounts when there is a “genuine risk of physical harm or direct threats to public safety.” At first glance, this seems similar to the test courts use to determine whether speech is unprotected incitement. Facebook, however, completely disregards the first Brandenburg prong, which requires asking whether the speaker directed his speech toward causing this risk.

Facebook’s omission of this prong could allow hostile reactions to controversial posts to justify their removal. If someone posts an unpopular opinion, people opposing that opinion could react with violence, and that violence could in turn create a genuine risk of physical harm or a threat to public safety. This unmonitored chain reaction would give the opponents of an opinion a heckler’s veto over the speech they oppose. This is why the first Brandenburg prong is essential. Without it, anyone could engineer this result and thereby have the speech they object to removed. By keeping control over whether content is objectionable in the hands of the speaker rather than the heckler, Facebook can avoid this type of heckler’s veto.

This distinction is not immaterial. Under Facebook’s policy, any speech that could lead to violence could be removed even if it is well-meaning. For example, a post on an extremely controversial topic that is not itself a call to violence could be deemed likely to create a “genuine risk of physical harm” because hecklers may react violently to the poster’s supporters. In this way, a poster who is himself a victim of the violence his post provoked could end up being blocked.

A sounder approach would be for Facebook to follow the Supreme Court’s reasoning in Brandenburg and require that a user’s post be directed toward causing violence before removing it. Under such a test, President Trump’s video calling for protestors to go home would not be objectionable even if his repeated allegations of voter fraud and expressions of “love” for the protestors could be construed to create a risk of physical harm. Facebook may disagree with how the President chose to tell the protestors to “go home,” but the video was clearly directed toward the purpose of reducing violence rather than inciting it. By considering the result the speaker intended to cause, Facebook could still remove speech calling for violent action but would not remove controversial speech that might incidentally be construed as a call to action when it was not intended as one.

In Abrams v. United States, Justice Holmes commented on the importance of both good and bad ideas in a marketplace of ideas. In Whitney v. California, Justice Brandeis wrote in concurrence that “[i]f there be time to expose through discussion the falsehood and fallacies, to avert the evil by the processes of education, the remedy to be applied is more speech, not enforced silence.” He further noted “that the greatest menace to freedom is an inert people; that public discussion is a political duty; and that this should be a fundamental principle of the American government.” Our Founding Fathers believed in the power of reason as applied through public discussion. They recognized that “order cannot be secured merely through fear of punishment for its infraction; that it is hazardous to discourage thought, hope and imagination; that fear breeds repression; that repression breeds hate; that hate menaces stable government; that the path of safety lies in the opportunity to discuss freely supposed grievances and proposed remedies; and that the fitting remedy for evil counsels is good ones.” The only way to defeat bad speech, therefore, is to counter it with good speech. If people are unable to express bad ideas in public, there will be no counter, and they will harbor those ideas in secret without a public dialogue to disprove them. The First Amendment was intended to protect an open marketplace of ideas. Big media companies have replaced the town square, and it is thus incumbent on them to protect the public dialogue.

In sum, the policy rationale for the First Amendment has long been understood to be the protection of some degree of public discourse. Through the push and pull of debate, the public will eventually counter bad ideas and move together toward greater truth. Physical public spaces, however, are now scarcely used as a means of expressing opinions and engaging in debate. Today, individuals post their thoughts on social media, and debate takes place through comments and replies. This debate is meaningless, however, if there are not adequate speech protections to ensure that everyone is given an opportunity to engage.

3) Facebook should adopt the Brandenburg test for incitement or at least clarify and consistently apply its incitement standard.

Critics contend that some ideas are simply not worth sharing. The greatest example of such an idea is incitement to violence. Courts have long recognized that speech presenting a clear and present danger is not protected by the First Amendment. In these scenarios, courts reason, there will not be time for argument and debate to defeat bad ideas before violence occurs. The importance of ensuring an open marketplace of ideas, however, has led the Court to adopt a very high standard for what constitutes incitement. The Brandenburg test establishes when inflammatory speech advocating illegal action can be restricted. In Brandenburg v. Ohio, the Court explained that speech is unprotected incitement only when it is directed to inciting imminent lawless action and is likely to produce such action.

The facts of the case are revealing. The Court ruled that speech was protected when members of the KKK vowed vengeance against minority races and instructed listeners to march on Washington against elected officials who, they claimed, suppressed the white race. The Court held that this speech, though distasteful, was nevertheless protected because it was attenuated from any actual action and contained no direct call to violence. Subsequent cases have confirmed that speech generally constitutes incitement only when there is a direct call to violence in a scenario where violence can occur immediately or soon after. Such a high bar ensures that only in the gravest situations of potential danger will the drastic remedy of suppressing speech be permitted.

At a minimum, Facebook should clearly outline what standard it uses for incitement, as its current standard plainly would not satisfy the Brandenburg test. Although Facebook is not obligated to apply the Brandenburg test, it should nonetheless articulate what constitutes incitement and what standard it is applying, and then apply that standard evenly across the board. Facebook should, furthermore, consider the arguments built upon almost a century of United States First Amendment jurisprudence and ensure that only speech that calls for, and is likely to lead to, immediate violence is suppressed.

In this case, let us assume that Facebook was justified in its initial suspension of the former President’s account to prevent the incitement of imminent violence at the Capitol. It is not, however, justified in permanently banning his account when no lingering danger exists and there is no evidence suggesting that the former President will make a call to arms. If the circumstances change, Facebook can of course institute a new suspension, but permanent speech suppression in the absence of any dangerous situation is abhorrent to the values the First Amendment protects.

Facebook does not need to extend preferential treatment to the account of a current or former President; however, the need for clear standards of general applicability becomes more urgent when the consequence involves silencing the leader of a political party representing roughly half the country.

4) Facebook’s current lack of protection for free speech will harm vulnerable users.

Although some may argue that former President Trump has other platforms on which to broadcast his speech, that argument is inapposite. Facebook must consider the aggregate effect of its current lack of free speech protection on all users. Facebook’s policies must be applied consistently across all people so that Facebook does not engage in viewpoint favoritism. It must consider everyone to whom the policy has applied, not just the current case, when determining how to formulate that policy. For example, these same standards are being applied to vulnerable groups, many of which have seen their leaders shut out of speech with no other avenues. Facebook should evaluate the consequences of this standard not just as applied to the former President but also as applied to its vulnerable users.

5) When Facebook effectively overrides the support of democratic voters, it affects us all.

This case underscores the egregious harms of unwarranted censorship. Facebook’s actions create an acutely anti-democratic effect, silencing the representation of 74 million people. Regardless of former President Trump’s viewpoint, he represents these voters, and they are effectively silenced when he is banned. By unilaterally silencing the speech of a former elected official who led a party representing roughly half of the country, Facebook has decided that the 74 million Americans who voted for him are not worthy of any representation. President Trump remains a former elected official with a considerable bully pulpit. Facebook must take great care before barring those 74 million voters from exerting their democratic influence through even the bare minimum of social media posts delivered from that pulpit. To be sure, Facebook should not accord special protections to elected officials; it should instead update its policies to apply a Brandenburg standard to all users. The silencing of voters through the banning of the voice that speaks for them, however, shows how Facebook’s current policy has already led to an anti-democratic result and should be reformed.

Written by Roya L. Butler

Ms. Butler is a Data Privacy and Cybersecurity Attorney with vast experience as a technologist.
