
The First Formal Investigation under the DSA Regime

The European Commission has launched a formal investigation under the Digital Services Act (DSA) against Elon Musk’s X, the platform formerly known as Twitter. This marks the first formal enforcement action under the recently enacted DSA.

The investigation focuses on several suspected breaches of the DSA by “X.” Key areas of concern include the platform’s handling of illegal content, its content moderation practices, the use of dark patterns, transparency in advertising, and the accessibility of data to researchers. One feature under particular scrutiny is “Community Notes,” a crowd-sourced fact-checking system that allows users to flag false or misleading content and that has largely replaced a dedicated team of fact-checkers.

The European Commission’s move follows incidents in which “X” failed to effectively curb the dissemination of illegal content. For example, in the aftermath of the Hamas attacks on Israel, the platform was inundated with fake images and misleading information, raising concerns about its ability to handle such situations.

The opening of the investigation does not in itself imply that “X” has violated the DSA. However, the Commission has found sufficient grounds to examine these concerns in detail. If a breach is established, “X” could face significant penalties, including fines of up to 6% of its global annual turnover, periodic penalty payments based on its daily global turnover, and possibly even suspension from operating within the EU.

This case is particularly notable because it tests the new DSA framework, which aims to regulate large online platforms more stringently in order to safeguard public security and uphold digital rights within the European Union. The outcome of this investigation will be closely watched, as it could set a precedent for how the DSA is applied to other major tech companies operating in the EU.

The relationship between the EU and “X,” under Elon Musk’s ownership, has been marked by a series of challenges and confrontations, particularly in the context of content moderation and disinformation policies. This tension predates the recent formal investigation under the Digital Services Act.

A significant point of contention arose when “X” (then Twitter) withdrew from the EU’s Code of Practice on Disinformation. This voluntary code, which Twitter had initially joined in 2018, committed the platform to taking steps against the spread of false information: cutting off associated ad revenue, tackling bots and fake accounts, providing tools for reporting disinformation, and empowering researchers to study these phenomena. Under Musk’s leadership, however, “X” signaled a shift in its approach, choosing instead to rely on the crowdsourced “Community Notes” feature, a move that raised concerns about the platform’s effectiveness in managing disinformation.

The EU has been assertive in its stance against disinformation, and “X’s” departure from the Code signaled a growing rift between the platform and the EU’s regulatory framework. EU industry chief Thierry Breton warned “X” that, despite the voluntary nature of the Code, its obligations under the DSA remained. The DSA imposes stringent requirements on very large online platforms (VLOPs) such as “X” to assess and mitigate systemic risks to civic discourse, including disinformation, with potential penalties for non-compliance ranging from fines to a possible service ban.

Musk’s approach to content moderation and the management of disinformation on “X” has come under particular scrutiny in light of the EU’s tightening regulation of online platforms. This regulatory environment creates a complex landscape for “X,” which must reconcile its content policies with the requirements set out by the EU. The tension between “X” and the EU reflects the broader challenge major social media platforms face as they navigate divergent international regulatory frameworks, particularly concerning content moderation, user safety, and disinformation.

The formal procedure confirms what we pointed out in a previous blog post about the controversies surrounding Twitter’s role and responsibility as a platform for public discourse and expression. It is easy to see the risks of entrusting such an important service to a single person, whether in terms of regulatory compliance or of predictability and security for users. Moreover, at the end of the process, it will become clear how much power the DSA actually has to regulate online platforms.


János Tamás Papp JD, PhD is an assistant professor at Pázmány Péter Catholic University, Hungary, and a legal expert at the Department of Online Platforms of the National Media and Infocommunications Authority of Hungary. He has taught civil and constitutional law since 2015 and became a founding member of the Media Law Research Group of the Department of Private Law. He earned his JD and PhD in Law at the Faculty of Law and Political Sciences of the Pázmány Péter Catholic University. His main research fields are freedom of speech, media law, and issues related to freedom of expression on online platforms. He has a number of publications regarding social media and the law, including a book titled “Regulation of Social Media Platforms in Protection of Democratic Discourses”.
