International standards, not the First Amendment, govern the Facebook Oversight Board's decision on barring Donald Trump
In late 2020 I wrote a brief piece for the Boston University Law Review Online, Governing Elections Without Law, reflecting on Professor Rick Hasen’s book Election Meltdown. I opened, “I want to focus on those nonlegal reforms that work alongside the law—places where the law simply runs out, where legislation is worse than the existing problem, or where superior longer-term solutions reside.”
The Facebook Oversight Board released its decision reviewing Facebook’s move to bar former President Donald Trump from Facebook after the January 6, 2021 riot at the Capitol during the counting of electoral votes. It is precisely this type of private regulation that has become all the more crucial—but private regulation that looks very public in nature.
The framework is an interesting set-up, in my judgment:
The Board’s decisions do not concern the human rights obligations of states or application of national laws, but focus on Facebook’s content policies, its values and its human rights responsibilities as a business. The UN Guiding Principles on Business and Human Rights, which Facebook has endorsed (See Section 4), establish what businesses should do on a voluntary basis to meet these responsibilities. This includes avoiding causing or contributing to human rights harms, in part through identifying possible and actual harms and working to prevent or address them (UNGP Principles 11, 13, 15, 18). These responsibilities extend to harms caused by third parties (UNGP Principle 19).
Facebook has become a virtually indispensable medium for political discourse, and especially so in election periods. It has a responsibility both to allow political expression and to avoid serious risks to other human rights. Facebook, like other digital platforms and media companies, has been heavily criticized for distributing misinformation and amplifying controversial and inflammatory material. Facebook’s human rights responsibilities must be understood in the light of those sometimes competing considerations.

The Board analyzes Facebook’s human rights responsibilities through international standards on freedom of expression and the rights to life, security, and political participation. Article 19 of the ICCPR sets out the right to freedom of expression. Article 19 states that “everyone shall have the right to freedom of expression; this right shall include freedom to seek, receive and impart information and ideas of all kinds, regardless of frontiers, either orally, in writing or in print, in the form of art, or through any other media of his choice.” The Board does not apply the First Amendment of the U.S. Constitution, which does not govern the conduct of private companies. However, the Board notes that in many relevant respects the principles of freedom of expression reflected in the First Amendment are similar or analogous to the principles of freedom of expression in ICCPR Article 19.
Political speech receives high protection under human rights law because of its importance to democratic debate. The UN Human Rights Committee provided authoritative guidance on Article 19 ICCPR in General Comment No. 34, in which it states that “free communication of information and ideas about public and political issues between citizens, candidates and elected representatives is essential” (para. 20).
Facebook’s decision to suspend Mr. Trump’s Facebook page and Instagram account has freedom of expression implications not only for Mr. Trump but also for the rights of people to hear from political leaders, whether they support them or not. Although political figures do not have a greater right to freedom of expression than other people, restricting their speech can harm the rights of other people to be informed and participate in political affairs. However, international human rights standards expect state actors to condemn violence (Rabat Plan of Action), and to provide accurate information to the public on matters of public interest, while also correcting misinformation (2020 Joint Statement of international freedom of expression monitors on COVID-19).
International law allows for expression to be limited when certain conditions are met. Any restrictions must meet three requirements – rules must be clear and accessible, they must be designed for a legitimate aim, and they must be necessary and proportionate to the risk of harm. The Board uses this three-part test to analyze Facebook’s actions when it restricts content or accounts. First Amendment principles under U.S. law also insist that restrictions on freedom of speech imposed through state action may not be vague, must be for important governmental reasons and must be narrowly tailored to the risk of harm.
The Oversight Board is entirely right, of course, that the First Amendment “does not govern the conduct of private companies.” But neither do “international standards of freedom of expression” as set forth in the International Covenant on Civil and Political Rights, a treaty that binds nation-states and not private companies.
Instead, these international standards apply because earlier this year—but after it suspended Mr. Trump—Facebook announced that its corporate human rights policy commits it to the United Nations Guiding Principles on Business and Human Rights:
On March 16, 2021, Facebook announced its corporate human rights policy, where it commemorated its commitment to respecting rights in accordance with the UN Guiding Principles on Business and Human Rights (UNGPs). The UNGPs, endorsed by the UN Human Rights Council in 2011, establish a voluntary framework for the human rights responsibilities of private businesses. As a global corporation committed to the UNGPs, Facebook must respect international human rights standards wherever it operates. The Oversight Board is called to evaluate Facebook’s decision in view of international human rights standards as applicable to Facebook.
The Board analyzed Facebook’s human rights responsibilities in this case by considering human rights standards including:
- The right to freedom of expression: International Covenant on Civil and Political Rights (ICCPR), Articles 19 and 20; as interpreted in General Comment No. 34, Human Rights Committee (2011) (General Comment 34); the Rabat Plan of Action, OHCHR (2012); UN Special Rapporteur on freedom of opinion and expression report A/HRC/38/35 (2018); Joint Statement of international freedom of expression monitors on COVID-19 (March 2020).
- The right to life: ICCPR Article 6.
- The right to security of person: ICCPR Article 9, para. 1.
- The right to non-discrimination: ICCPR Articles 2 and 26; International Convention on the Elimination of All Forms of Racial Discrimination (ICERD), Articles 1 and 4.
- Participation in public affairs and the right to vote: ICCPR Article 25.
- The right to remedy: ICCPR Article 2; General Comment No. 31, Human Rights Committee (2004) (General Comment 31); UNGPs, Principle 22.
It is, of course, entirely within Facebook’s rights to choose whether the First Amendment, international law, or some other standard will govern how it operates. Perhaps there is little daylight between a strict First Amendment approach and this one. But it’s worth noting that while the First Amendment requires an “important” (or, at times, “compelling”) governmental reason to restrict speech, the test here is a “legitimate” aim, described as:
The requirement of legitimate aim means that any measure restricting expression must be for a purpose listed in Article 19, para. 3 of the ICCPR, and this list of aims is exhaustive. Legitimate aims include the protection of public order, as well as respect for the rights of others, including the rights to life, security, and to participate in elections and to have the outcome respected and implemented. An aim would not be legitimate where used as a pretext for suppressing expression, for example, to cite the aims of protecting security or the rights of others to censor speech simply because it is disagreeable or offensive (General Comment No. 34, paras. 11, 30, 46, 48). Facebook’s policy on praising and supporting individuals involved in “violating events,” violence or criminal activity was in accordance with the aims above.
“Legitimate,” then, is a term of art with an “exhaustive” fixed list of permissible aims. But it does appear to sweep more broadly than the First Amendment and would allow regulation of more speech.
There was also some dispute within the Board about how to assess Facebook’s human rights responsibilities:
A minority believes that it is important to outline some minimum criteria that reflect the Board’s assessment of Facebook’s human rights responsibilities. The majority prefers instead to provide this guidance as a policy recommendation. The minority explicitly notes that Facebook’s responsibilities to respect human rights include facilitating the remediation of adverse human rights impacts it has contributed to (UNGPs, Principle 22). Remedy is a fundamental component of the UNGP ‘Protect, Respect, Remedy’ framework, reflecting international human rights law more broadly (Article 2, para. 1, ICCPR, as interpreted by the Human Rights Committee in General Comment No. 31, paras. 15 - 18). To fulfil its responsibility to guarantee that the adverse impacts are not repeated, Facebook must assess whether reinstating Mr. Trump’s accounts would pose a serious risk of inciting imminent discrimination, violence or other lawless action. This assessment of risk should be based on the considerations the Board detailed in the analysis of necessity and proportionality in Section 8.3.III above, including context and conditions on and off Facebook and Instagram. Facebook should, for example, be satisfied that Mr. Trump has ceased making unfounded claims about election fraud in the manner that justified suspension on January 6. Facebook’s enforcement procedures aim to be rehabilitative, and the minority believes that this aim accords well with the principle of satisfaction in human rights law. A minority of the Board emphasizes that Facebook’s rules should ensure that users who seek reinstatement after suspension recognize their wrongdoing and commit to observing the rules in the future. In this case, the minority suggests that, before Mr. Trump’s account can be restored, Facebook must also aim to ensure the withdrawal of praise or support for those involved in the riots.
I don’t have strong thoughts at the moment on the overall framework or how Facebook ought to behave. The Board recognizes that Facebook’s actions play a significant role in democratic discourse and voting. Providing clear ex ante standards is important. And how the framework applies in future disputes remains to be seen.