
János Tamás PAPP: Self-Interest or Value Protection? How the EU Tries to Suppress Dangerous Media Content from Third Countries

The European Union (EU) is not just an economic bloc but also a beacon of shared principles such as democracy, human rights, and the rule of law. These values underpin the EU’s approach to media regulation, including outlets from third countries. The EU has established a legal and regulatory framework that seeks to foster media pluralism, protect freedom of expression, and guard against disinformation.

EU media regulation draws on a variety of legal instruments. The Audiovisual Media Services Directive (AVMSD) is a key piece of legislation regulating broadcast and on-demand media services in Europe. The AVMSD aims to create a single European market for audiovisual media while ensuring cultural diversity, the protection of minors, and the promotion of European works. Importantly, the directive applies to media service providers established within EU Member States, but it also affects third-country outlets seeking to operate in the region. Under the AVMSD, third-country media outlets must adhere to the ‘country of origin’ principle, meaning they have to follow the rules of the EU Member State from which they broadcast. This ensures uniformity and fairness but has also drawn criticism: some argue that it enables ‘forum shopping’, whereby outlets base their operations in countries with lighter regulation.

Furthermore, the EU has been tightening its regulatory grip in response to concerns over disinformation and foreign interference. The Code of Practice on Disinformation is a self-regulatory framework agreed upon by tech and media companies. Its primary aim is to enhance the transparency of political advertising and to reduce the spread of online disinformation. Although it primarily targets digital platforms, its scope extends indirectly to third-country media outlets.

Recently, there have been calls to extend EU jurisdiction to third-country media outlets that target EU citizens. Critics argue that the current rules do not adequately protect against foreign disinformation campaigns.

The European Democracy Action Plan, announced at the end of 2020, is a framework developed by the European Commission to strengthen democracy in the European Union. The plan aims, among other things, to ensure the integrity of elections and political advertising, increase transparency, support independent journalism, and improve the EU’s ability to detect and respond to disinformation. As part of the Action Plan, Ursula von der Leyen announced the European Media Freedom Act (EMFA) in 2021, which builds on the Audiovisual Media Services Directive, sets out rules on the independence of media regulators, promotes transparency in media ownership, and strengthens the independence of editorial decisions. The initiative focuses on removing obstacles to the creation and operation of media services and aims to establish a common framework to promote the internal market in the media sector, with a view to safeguarding media freedom and pluralism in that market. The draft is scheduled for adoption by the end of 2023, and at the time of writing the trilogue negotiations are ongoing.

The current Article 16 of the proposal deals with the coordination of measures for media services outside the EU and provides that: “The Board shall, upon request of the national regulatory authorities or bodies from at least two Member States, coordinate relevant measures by the national regulatory authorities or bodies concerned, related to the dissemination of or access to media services originating from outside the Union or provided by media service providers established outside the Union that, irrespective of their means of distribution or access, target or reach audiences in the Union where, inter alia in view of the control that may be exercised by third countries over them, such media services prejudice or present a serious and grave risk of prejudice to public security.”

The recitals to the Article highlight the specific task of media authorities to protect the internal market from activities of media services from outside the Union that target or reach audiences within the Union. “Such risks could take, for instance, the form of systematic, international campaigns of media manipulation and distortion of facts in view of destabilizing the Union as a whole or particular Member States. In this regard, the coordination between national regulatory authorities or bodies to face together possible public security […] threats stemming from such media services needs to be strengthened”. (EMFA Recital 30) To this end, the legislation aims, according to the preamble, to coordinate the national measures that may be adopted to counter threats to public security posed by media services originating or established outside the EU but aimed at an EU audience. It therefore proposes the establishment of a list of criteria, to be drawn up by the European Board for Media Services (itself to be set up by the EMFA). “Such a list would help national regulatory authorities or bodies in situations when a relevant media service provider seeks jurisdiction in a Member State, or when a media service provider already under the jurisdiction of a Member State, appears to pose serious and grave risks to public security. Elements to be covered in such a list could concern, inter alia, ownership, management, financing structures, editorial independence from third countries or adherence to a co-regulatory or self-regulatory mechanism governing editorial standards in one or more Member States.” (EMFA Recital 30b)

This part of the regulation was clearly prompted by the Russian–Ukrainian conflict. On 1 March 2022, the Council of the European Union adopted a Council Regulation imposing restrictions on the operation and broadcasting in the European Union of certain state-linked Russian media outlets, in response to Russia’s hybrid warfare. It stipulated that operators are prohibited from broadcasting, or enabling the broadcasting of, content from Russia Today and any associated service provider, including “transmission or distribution via cable, satellite, IPTV, ISPs, Internet video-sharing platforms or applications”. The suspension of the channel has sparked a major debate among journalistic organizations, as well as among lawyers and experts.

The Digital Services Act (DSA) also requires the development of crisis response mechanisms. It defines a crisis as a situation where “exceptional circumstances arise which could lead to a serious threat to public security or public health in the Union or a substantial part of it.” (DSA Art. 36.) In such situations, the Commission may adopt decisions requiring online platforms to take measures such as assessing the threats posed by the operation of their services and taking proportionate, concrete, and effective measures to prevent, eliminate or limit them. The DSA, while not specifically targeting media outlets, could have implications for third-country outlets that provide digital services within the EU. The DSA places significant emphasis on creating transparency obligations for digital services, especially those classed as ‘Very Large Online Platforms’ (VLOPs), with more than 45 million users. (DSA Art. 33.) This includes requirements for clear reporting of their content moderation policies, measures against illegal content, and robust advertising transparency. This is particularly crucial in managing propaganda, which can often be disguised as legitimate advertising or user content. These new rules will make it harder for foreign entities to manipulate the digital information space and will increase the traceability of such activities. Additionally, the DSA stipulates that VLOPs must conduct annual audits to assess systemic risks associated with their platform, including the dissemination of illegal content, negative effects on fundamental rights, and intentional manipulation of the platform. They must also assess and mitigate some of the risks arising from the design and use of their services. This includes, according to the Regulation, “any actual or foreseeable negative impact on civil discourse and on the electoral process and public security”. (DSA Art. 34.)
While the DSA does not explicitly target propaganda from outside the EU, it creates a comprehensive framework to increase the transparency and accountability of digital platforms, making it much more difficult for any entity, foreign or domestic, to use these platforms for propagandistic purposes. Moreover, the DSA requires platforms to provide researchers with access to key data in order to understand and mitigate risks associated with the dissemination of disinformation, which will support efforts to combat foreign propaganda. (DSA Art. 40.)

In conclusion, while the EU’s regulation of third-country media outlets is grounded in its legal and regulatory framework, it remains subject to ongoing debate. The balance between fostering media pluralism, protecting citizens, and mitigating disinformation is a complex and evolving challenge. The EU’s future steps in this area will be watched closely by regulators and media outlets alike.


János Tamás Papp JD, PhD is an assistant professor at Pázmány Péter Catholic University, Hungary, and a legal expert at the Department of Online Platforms of the National Media and Infocommunications Authority. He earned his JD and PhD in Law at the Faculty of Law and Political Sciences of the Pázmány Péter Catholic University. His main research fields are freedom of speech, media law, and issues related to freedom of expression on online platforms. He has a number of publications regarding social media and the law, including a book titled “Regulation of Social Media Platforms in Protection of Democratic Discourses”.
