
ACE-ing Free Speech? A Free Speech Perspective on Transparency and Accountability in Content Moderation
In an era where digital platforms have become central to public discourse, the transparency and accountability of content moderation decisions are crucial for safeguarding freedom of speech. Striking a balance between protecting users from harmful content and ensuring that speech is not unjustly restricted is a persistent challenge. The European Union has taken a significant step in this regard by establishing the Appeals Centre Europe (ACE), an independent body dedicated to resolving disputes between social media companies and users concerning content moderation decisions.
The Dublin-based ACE addresses disputes related to major platforms such as Facebook, TikTok, and YouTube. Its primary objective is to provide a swift and impartial resolution mechanism, allowing users to contest content moderation decisions before independent experts. This initiative is particularly significant in the context of freedom of speech, as it introduces an additional layer of scrutiny over corporate decisions that impact users’ ability to express themselves online.
The creation of this body stems from the EU’s Digital Services Act (DSA), which requires large online platforms to give users access to independent dispute resolution mechanisms. Certified by Ireland’s media regulator, Coimisiún na Meán, in October 2024, the Centre ensures a standardized approach to handling content moderation disputes across all 27 EU member states. By offering an independent review process, the Centre enhances the legitimacy of content moderation decisions and strengthens the accountability of digital platforms.
From a procedural standpoint, users can submit complaints online, which are then reviewed by independent experts. The process aims to deliver prompt and impartial decisions, thereby promoting greater transparency and accountability in content moderation. While the Centre’s rulings are not legally binding, platforms are required to justify any deviations from these decisions. This requirement places additional pressure on companies to act fairly and consistently in their moderation policies, reinforcing user rights and mitigating the risk of arbitrary censorship.
Hungarian Constitutional Jurisprudence and Content Moderation
Hungarian constitutional law provides fertile ground for analyzing freedom of speech and content moderation perspectives. Article IX of the Fundamental Law of Hungary[1] guarantees the right to freedom of expression while allowing for limitations in cases where speech incites hatred or violates human dignity. However, the application of these principles to digital platforms remains a contentious issue, particularly as Hungary has seen debates over the regulation of online content by both domestic and foreign tech companies.
Hungary has shifted from an autonomous regulatory approach to a content regulation framework aligned with the Digital Services Act (DSA). This transition has given rise to legal debates over the implementation and enforcement of content moderation rules. Additionally, cases concerning online content restrictions have been brought before supranational bodies such as the European Court of Human Rights (ECtHR) and the Court of Justice of the European Union (CJEU). In Delfi AS v. Estonia, for instance, the ECtHR examined the liability of an online news portal for unlawful comments posted by its users.
In recent years, the Hungarian Constitutional Court has issued several rulings that clarify the relationship between digital content moderation and constitutional speech protections. One key decision (Decision 3/2019. (III.7.) AB)[2] emphasized that any restriction of online speech must be proportionate and necessary in a democratic society. The Court ruled that private digital platforms, while not state actors, should ensure fair procedures when restricting users’ content, as their moderation decisions can have far-reaching consequences for public discourse.
A few years earlier, Decision 7/2014. (III.7.) AB[3] addressed the issue of “notice and takedown” by digital media companies, together with the question of responsibility for online comments. The Court established that online platforms exercising de facto control over public speech bear responsibilities similar to traditional media in ensuring that speech restrictions do not amount to arbitrary censorship. The ruling underscored the need for transparency in content removal and appeals mechanisms.
A more recent case (Decision 18/2021. (V.27.) AB)[4] dealt with the question of liability for defamatory speech on digital platforms. The Court found that while platforms are not automatically liable for user-generated content, they must provide effective redress mechanisms so that individuals can contest unfair moderation decisions. This decision aligns with EU directives on intermediary liability and underscores the importance of maintaining an avenue for free expression while preventing harmful speech.
How to ACE an Appeal from Hungary?
Given the structure of ACE, Hungarian users whose content has been removed or restricted by major platforms have an alternative route to challenge these decisions beyond the national legal system.
ACE provides an additional forum for users to challenge content moderation decisions once the platform’s internal complaint-handling process has run its course. Rather than replacing domestic legal procedures, ACE functions as a supranational review body that lets users escalate cases beyond national regulatory frameworks, without prejudice to their right to turn to the courts. This helps ensure that decisions affecting Hungarian users align with broader European digital rights principles. The process for Hungarian users to escalate a moderation dispute to the Appeals Centre Europe follows several steps:
- Platform’s Internal Review: Users must first submit a complaint directly to the platform (e.g., Facebook, TikTok, or YouTube). Platforms are required under the DSA to offer an internal review mechanism for content takedowns or account restrictions.
- National Legal Recourse: If the platform rejects the complaint, the user may also seek redress through Hungary’s regulatory and judicial system. The first port of call is the National Media and Infocommunications Authority (NMHH)[5], which oversees digital content disputes; a user dissatisfied with the NMHH’s decision may then appeal through the Hungarian courts. Importantly, the DSA does not make exhausting these domestic avenues a precondition for out-of-court dispute settlement, nor does turning to ACE preclude later court proceedings.
- Escalation to ACE: If the user remains dissatisfied with the platform’s decision, they can submit the case to ACE. This process involves:
- Submitting a formal complaint online, detailing the moderation decision and previous attempts to contest it.
- Independent review by legal experts specializing in digital rights and platform governance.
- A non-binding resolution by ACE, from which the platform may deviate only if it justifies its non-compliance.
Implications for Bypassing National Decision-Making
ACE provides Hungarian users with an additional layer of legal scrutiny, potentially allowing them to bypass certain domestic regulatory decisions. This mechanism can be particularly significant in cases where national courts or regulatory bodies are perceived as politically influenced or slow to act. By offering a supranational alternative, ACE strengthens accountability and ensures that content moderation disputes are assessed against broader European digital rights principles.
While the Centre’s decisions are not legally binding, the requirement for platforms to publicly justify any deviations from them creates pressure for compliance. This safeguard helps mitigate the risk of arbitrary censorship, particularly in countries where governmental influence over digital media is a concern.
Conclusion
The Appeals Centre Europe (ACE) represents a crucial innovation in digital governance, offering European users an alternative mechanism to challenge platform moderation decisions. By integrating national and EU-level legal remedies, this framework ensures a more balanced approach to protecting freedom of speech while maintaining the accountability of digital platforms.
[1] Hungarian Fundamental Law (Alaptörvény), Article IX – https://njt.hu
[2] Decision 3/2019. (III.7.) AB on Online Speech Restrictions – https://alkotmanybirosag.hu
[3] Decision 7/2014. (III.7.) AB on Content Removal – https://alkotmanybirosag.hu
[4] Decision 18/2021. (V.27.) AB on Platform Liability – https://alkotmanybirosag.hu
[5] NMHH Regulations on Digital Media – https://nmhh.hu
Réka Kérész is a fourth-year law student at the Széchenyi István University, Ferenc Deák Law School in Győr, Hungary. Her main interests include Constitutional Law, International Law, and the combination of the two.