
Human Nature, Hate, Speech Platforms, and the DSA. A “How To” Guide to Limit Harmful Content on VLOPs

As Bertrand Russell observed, long before the age of digital media, "few people can be happy unless they hate some other person, nation, or creed." The observation points to a troubling human inclination: many people derive a sense of fulfillment from hating others. If hate is indeed second nature to humans, how can it be combatted, especially in the online space?

With the worldwide surge in the use of online platforms, anonymous hate speech has grown into a large-scale phenomenon. One can easily hide behind an avatar or a fake profile and give in to the urge Russell described: to express hatred of others. Controlling and limiting hate speech is therefore not merely a right but an obligation, one that binds not only states and international organizations but also market participants such as online platforms, as well as individuals themselves. This obligation rests on the European legal tradition, in which freedom of expression is not an unlimited right: in a few sensitive cases it must give way to competing rights. The United States' First Amendment tradition, by contrast, does not limit the scope of the freedom in the constitutional text itself, but the Supreme Court has narrowed its breadth through doctrines such as the clear and present danger test.

The regulation of free speech, and with it of hate speech, has come a long way since the Second World War. I have written several articles on this blog about the history of this regulation and how perspectives have evolved over the decades. This progress, whose result is an ever more detailed and nuanced concept, has reached its current peak with the European Union's Digital Services Act (DSA). Constitutional Discourse has examined this act from various angles, including investigations conducted under its regime.

In this article, I focus on the DSA's expectations regarding the limitation of and awareness about hate speech, and on how Very Large Online Platforms (VLOPs), the main addressees of the DSA, must act to comply with these expectations and their platform responsibility obligations. VLOPs receive special attention because, by reaching a significant number of users, they pose a greater risk within the European Union. Accordingly, Articles 34 and 35 prescribe obligations of systemic risk assessment and risk mitigation.

The DSA entered into force in 2022 and regulates online platforms by introducing specific obligations covering, among many other issues, content moderation and the fight against the publication of illegal content. One of the most frequent types of illegal content on these platforms is hate speech. The DSA expects online platforms to take action and moderate content on their sites, which includes removing illegal content or limiting the visibility of a given post (e.g. Articles 20 and 35). It obliges them to introduce reporting mechanisms, such as an internal complaint-handling system, and clear procedures through which users can report hate speech and other illegal content. VLOPs must comply with stricter obligations, including regular risk assessments and reporting. Transparency is crucial for these platforms, and they must also educate users on how to spot hate speech, how to report it, and, most importantly, how to avoid engaging in it.

Regulating hate speech serves the purpose of combatting discrimination and racism among the wider population. Hate speech is commonly defined as a public expression inciting violence or hatred on the basis of a characteristic of a certain group, or of a specific member of such a group, that is capable of creating fear or a hostile environment for that group or person. The targeted groups are most often characterized by race, religion, descent, or ethnicity, but hatred against sexual orientations and gender identities has also appeared in recent decades. The set of protected characteristics has shown an expanding pattern, as more and more features are considered protected.

The European Union intends to combat hate speech and expects its Member States to act against the phenomenon. The term "hate speech" does not describe the phenomenon precisely enough: despite what the name suggests, hate speech does not necessarily involve speech. Hate can take many shapes and forms, including speech but also other acts, such as symbolic speech or other conduct. This also follows from the name of the underlying right. Freedom of expression is a broader concept than freedom of speech: expression can encompass not only words but also certain behavior or conduct, a drawing, or even a sound.

Meta’s Community Standards regulate hate speech and define it as any content attacking people on the basis of protected characteristics such as race, national origin, religion, gender, or disability. This definition is broader than the one used in most EU Member States, as it does not require that the expression be capable of creating fear in the targeted person or group. Meta has established an internal complaint-handling system that relies on user reporting as well as on automated systems, which apply artificial intelligence to monitor content. Meta still employs human moderators and has also established the Oversight Board, which can overrule earlier moderation decisions, but the Board hears only a small number of cases and its process consumes a significant amount of time. Meta uses several sanctions to limit hate speech: deleting the post or comment is the most common, but it can also limit the visibility of the content or remove the user’s account. Most large platforms have such a regulatory system in place, yet it does not seem to work properly, as the definitions are not clear and precise enough to protect individuals or groups of individuals. The deficiency runs in both directions: in some cases, platforms overregulate legitimate opinion, and in others, they underregulate hate speech. Both cause harm to individuals and should be avoided. That is what the DSA aims to accomplish.

The DSA, as mentioned above, requires online platforms to be transparent not only in their reporting but also towards their own users. This transparency is a two-sided obligation: on the one hand, it includes educating users about reporting and complaint handling, that is, about the platform’s operation; on the other hand, it obliges the platform to educate people about hate speech and illegal content. The definition of hate speech is relatively broad and does little to help users understand its essence. Educating them with short videos, or with a more comprehensive summary than the one currently available in Meta’s Transparency Center, would contribute to less hate speech on Meta’s platforms. When people are more conscientious in their daily decisions and more conscious of their consequences, they are also more likely to follow the rules.

To conclude, a revision of the Community Standards and of Meta’s Transparency Center would be needed to educate users better and to raise awareness of the lawful use of its platforms. The emergence of out-of-court dispute settlement bodies, established under Article 21(6) of the DSA, also serves this purpose, but their effectiveness will only become apparent over time. Such bodies have so far been established in only five countries, with Hungary being the third to do so.


Dorina BOSITS is a law student at the Széchenyi István University of Győr, Hungary, and an international finance and accounting graduate of the University of Applied Sciences Wiener Neustadt, Austria. She is currently spending an exchange semester at the Karl Franzens University of Graz, Austria. Her main research areas include freedom of speech, digitalization, space law, data protection, and financial law. She is a student at the Law School of MCC and a member of ELSA Győr.