The Facebook Community Standards: Content Regulation Policies and Procedures That Shape Freedom of Expression

As international regulations usually provide only a general framework with which platforms must comply, it is the practice of individual online platforms that draws the clearer definitions shaping the boundaries of fundamental rights such as freedom of expression. Past practice shows that such sites operate with great room for interpreting international guidelines, or even outmaneuver them[1]. In decisive situations, platforms like Facebook often play a quasi-judicial role that can determine the limits of free speech and influence individuals’ freedom of self-expression[2]. The slow and lengthy procedures of international legislation create an environment in which the law adapts only belatedly to changes in the online space; online platforms thereby gain an advantage, and their own regulations gain authority[3]. These self-generated community guidelines provide law-like frameworks that determine which content is permitted and how moderation and removal are carried out[4].

Facebook’s Community Guidelines were introduced to contribute to the creation of a safe space where opinion sharing is allowed and users feel secure in doing so[5]. Although most platforms tailor their community standards to their specific needs, these usually include a written policy on permitted and prohibited content that will be moderated or removed from the site[6]. This applies to Facebook’s Community Standards as well, which were first published in 2018, although the term had already been introduced in 2016[7]. The Standards emphasize the importance of fundamental rights, especially freedom of expression, while acknowledging the need to limit these rights. They also highlight that minority opinions require distinct protection, as the majority opinion tends to push minority views to the periphery. Facebook claims that its standards are diversity-focused, driven by user feedback, and respectful of fundamental human rights[8]. Facebook frequently deletes, limits, and bans content as well as user accounts, citing non-compliance with the Standards as justification for moderation. Before the publication of the Standards, Facebook had been accused several times of operating control mechanisms that were neither accessible nor easily understandable to the public[9]. Today, however, the Standards are available to anyone online and can be used as a manual explaining the fundamental principles according to which the site, at least in theory, moderates posted content[10].

To determine the relationship between the Community Standards and the international treaties declaring fundamental rights, one has to examine and compare their contents. Facebook states in the Introduction to the Standards that their purpose is to create an atmosphere in which users do not feel limited in sharing their views and opinions. At the same time, like most international regulations, Facebook points out the importance of limiting freedom of expression. What is not commonly found or highlighted in international regulatory documents is the emphasis on marginalized minority opinion. The Standards further state that Facebook will allow the publication of views and opinions it does not agree with when doing so is newsworthy and crucial to the public interest; in such cases, however, the company reserves the right to evaluate and decide how much public interest the content carries[11]. The European Convention on Human Rights (ECHR) contains a comparable principle, which requires a High Contracting Party to inform the Secretary General of the Council of Europe that its derogating measures are reasonable and necessary, although this provision applies only in time of war or other public emergency (Article 15)[12]. The similarity does not lie in the process of proving the required reasoning, since in Facebook’s case the burden of proof and the assessment of the decision rest with the company itself. Nevertheless, both processes are forms of subsequent norm control.

The screening mechanism rests on four fundamental principles: authenticity, safety, privacy, and dignity. These core values are not explicitly defined in the Standards, and international treaties have not yet provided generally applicable definitions of these terms either. The lack of definitions is compensated for by the extensive explanation of the “policy rationale” at the beginning of each section. The Standards divide problematic content, and the extent of the sanctions attached to it, into two distinct categories: content that is not allowed on the platform at all, and content that requires additional context or background information, is allowed only behind a warning screen, or is accessible only to users over the age of 18[13].

Using algorithms and artificial intelligence to collect data and filter content is common practice, but it can have a significant influence on freedom of expression. These algorithm-based systems apply theoretically impartial and unbiased criteria according to which the platform’s content is moderated. Such “filter bubbles”, however, can over- or under-moderate certain content, because they eliminate the possibility of individually tailored judgments that take context and additional information into account[14]. The rigidity of these systems therefore calls for a backup mechanism that allows secondary decisions to be made[15]. Such “put-back” mechanisms are also required by the Digital Services Act (DSA)[16], which provides various control mechanisms for dealing with misguided decisions[17]. Facebook had implemented all of these stages even before the regulation was introduced: its internal complaint-handling system allows users to appeal the site’s decisions directly[18].

Beyond the voluntary in-house system, Facebook separated its appeal forum and established an additional external, independent body that reviews the company’s decisions[19]: the Oversight Board (OB). The OB functions as a de facto “court of appeal” that reviews users’ requests for reconsideration and has the power to advise the reinstatement of content or accounts on Facebook. Its 40 members are chosen for three-year terms, with the possibility of reappointment after the mandate expires. Similar to other international bodies, the OB’s case law serves as precedent for future decisions, and, much like the verdicts of supreme courts, its rulings are binding for subsequent judgments[20]. The appeal process requires the user to submit the complaint within 15 days of the contested moderation decision. When content is removed or restricted, the user receives a reference number that must be included in the appeal. After submission, the OB selects a small number of appeals for review and, within three months, either upholds or overturns the decisions. The OB may also recommend changes to the policies. By June 2021, more than 100 recommendations had been issued, and the OB had overturned Facebook’s decision in 16 of 22 rulings[21]. However, the OB selects only a few cases out of thousands of appeals, so the chances of a case being selected are limited.

Even after the establishment of the Community Standards, several problems threaten freedom of expression on Facebook daily. From the unclear definitions, through the opaque process of algorithmic filtering, to the low selection rate of appeal cases, numerous loopholes in the system pose considerable danger to fundamental rights. Most of these issues have yet to be resolved, whether by further international regulation or by Facebook itself. The boundaries of freedom of expression still need to be determined and explained, as only a “fine line” separates what is permitted from what is to be banned.


[1] Magony Gellért Imre: A Facebook-on közzétett tartalmak szabályozása. In: Comparative Law Working Papers, 2021, Vol. 5, No. 3, p. 1. Accessed on 20 August 2023. https://www.ojji.u-szeged.hu/images/dokumentumok/CLWP/magony_face.pdf

[2] Nagy Krisztina: Facebook files – gyűlöletbeszéd törölve? A közösségi médiaplatformok tartalomellenőrzési tevékenységének alapjogi vonatkozásai. In: Pro Futuro, 2018, No. 2, p. 115.

[3] Magony Gellért Imre: A Facebook-on közzétett tartalmak szabályozása. In: Comparative Law Working Papers, 2021, Vol. 5, No. 3, p. 1. Accessed on 20 August 2023. https://www.ojji.u-szeged.hu/images/dokumentumok/CLWP/magony_face.pdf

[4] Bateman, Jon, Thompson, Natalie and Smith, Victoria: How Social Media Platforms’ Community Standards Address Influence Operations. In: Carnegie Endowment for International Peace. Published on 1 April 2021. https://carnegieendowment.org/2021/04/01/how-social-media-platforms-community-standards-address-influence-operations-pub-84201

[5] Meta: Facebook Community Standards. 2023. Accessed on 28 July 2023. https://transparency.fb.com/policies/community-standards/

[6] Bateman, Jon, Thompson, Natalie and Smith, Victoria: How Social Media Platforms’ Community Standards Address Influence Operations. In: Carnegie Endowment for International Peace. Published on 1 April 2021. https://carnegieendowment.org/2021/04/01/how-social-media-platforms-community-standards-address-influence-operations-pub-84201

[7] Meta: Our Commitment To Safety. 2020. In: Facebook Business. Accessed on 28 July 2023. https://www.facebook.com/business/news/our-commitment-to-safety#2016

[8] Meta: Our Commitment To Safety. 2020. In: Facebook Business. Accessed on 28 July 2023. https://www.facebook.com/business/news/our-commitment-to-safety#2016

[9] Nagy Krisztina: Facebook files – gyűlöletbeszéd törölve? A közösségi médiaplatformok tartalomellenőrzési tevékenységének alapjogi vonatkozásai. In: Pro Futuro, 2018, No. 2, p. 115.

[10] Nagy Krisztina: Facebook files – gyűlöletbeszéd törölve? A közösségi médiaplatformok tartalomellenőrzési tevékenységének alapjogi vonatkozásai. In: Pro Futuro, 2018, No. 2, p. 116.

[11] Meta: Facebook Community Standards. 2023. Accessed on 28 July 2023. https://transparency.fb.com/policies/community-standards/

[12] Council of Europe: European Convention on Human Rights – Article 15. Accessed on 19 August 2023. https://www.echr.coe.int/documents/d/echr/Convention_ENG

[13] Meta: Facebook Community Standards. 2023. Accessed on 28 July 2023. https://transparency.fb.com/policies/community-standards/

[14] Kolarević, Emina: The Influence of Artificial Intelligence on the Right of Freedom of Expression. In: Pravo – teorija i praksa, 2022, Vol. 39, No. 1, p. 111. Accessed on 13 August 2023. https://doi.org/10.5937/ptp2201111K

[15] Holznagel, Daniel: The Digital Services Act wants to “sue” Facebook over content decisions in private de facto courts. In: Verfassungsblog, June 2021. Accessed on 14 August 2023. https://verfassungsblog.de/dsa-art-21/

[16] European Union: Regulation (EU) 2022/2065 of the European Parliament and of the Council on a Single Market for Digital Services and amending Directive 2000/31/EC (Digital Services Act) – Article 34. 2022.

[17] Wagner, Ben and Janssen, Heleen: A First Impression of Regulatory Powers in the Digital Services Act. In: Verfassungsblog, January 2021. Accessed on 15 August 2023. https://verfassungsblog.de/regulatory-powers-dsa/

[18] Holznagel, Daniel: The Digital Services Act wants to “sue” Facebook over content decisions in private de facto courts. In: Verfassungsblog, June 2021. Accessed on 14 August 2023. https://verfassungsblog.de/dsa-art-21/

[19] Holznagel, Daniel: The Digital Services Act wants to “sue” Facebook over content decisions in private de facto courts. In: Verfassungsblog, June 2021. Accessed on 14 August 2023. https://verfassungsblog.de/dsa-art-21/

[20] Magony Gellért Imre: A Facebook-on közzétett tartalmak szabályozása. In: Comparative Law Working Papers, 2021, Vol. 5, No. 3, p. 6. Accessed on 20 August 2023. https://www.ojji.u-szeged.hu/images/dokumentumok/CLWP/magony_face.pdf

[21] Oversight Board: Appeal to shape the future of Facebook and Instagram. 2023. Accessed on 18 August 2023. https://www.oversightboard.com/appeals-process/


Dorina BOSITS is a law student at the Széchenyi István University of Győr, Hungary, and an international finance and accounting graduate of the University of Applied Sciences of Wiener Neustadt, Austria. Her main areas of research include freedom of speech, digitalization, data protection, and financial law. She is a student at the Law School of MCC and a member of ELSA Győr.
