
Meta’s GDPR Fine: Delayed Accountability, but Maybe a Shift Toward Real Consequences
The €251 million fine imposed on Meta by Ireland’s Data Protection Commission (DPC) for a Facebook security breach dating back to 2018 reflects a growing willingness by European regulators to hold tech giants accountable for failing to protect user data. While this is not the largest fine Meta has faced under the European Union’s General Data Protection Regulation (GDPR), it marks an important step in addressing data security failures. However, the broader picture reveals a significant flaw in the enforcement process: the response comes years too late. By the time fines are issued, the harm has already been done, and the affected users have long since moved on, often unaware of the depth of the intrusion into their privacy.
The breach that led to this fine stemmed from a vulnerability created by the interaction between Facebook’s “View As” feature and a video upload tool rolled out in 2017. By exploiting this design flaw, attackers were able to steal user access tokens, giving them full access to millions of Facebook profiles. The vulnerability went unnoticed until September 2018, by which time attackers had gained access to the personal data of nearly 29 million accounts worldwide, including approximately 3 million accounts in the EU. Exposed information included not just names and email addresses but also sensitive data such as work history, location, and religious beliefs. Given the broad scope of personal data affected, it is unsurprising that the DPC opted for a substantial fine.
Yet, while the scale of the fine may seem significant, the timing of the decision is problematic. The breach occurred more than six years ago, and while Meta did face regulatory inquiries during that time, the delay in reaching a decision undercuts the effectiveness of GDPR enforcement. The core principle of GDPR is to protect the rights and freedoms of individuals in real time, but if regulatory decisions take five or six years to materialize, the deterrent effect of enforcement is weakened. Companies like Meta may view delays as a reason to continue risky practices, knowing they will have years to address potential consequences.
Part of the delay stems from procedural inefficiencies. The GDPR requires regulators to consult with other EU supervisory authorities, a process that has often led to disputes and drawn-out negotiations. Historically, the DPC under its previous commissioner, Helen Dixon, faced criticism for being too lenient on major tech firms, and its enforcement proposals were frequently challenged by its EU counterparts. This procedural gridlock delayed decisions and allowed companies like Meta to continue operating with minimal disruption. Notably, however, the latest fine did not face objections from other regulators, suggesting that enforcement under the DPC’s new leadership is becoming more streamlined.
While delays are frustrating, the rising size of these fines could signal a shift in how tech giants approach user privacy and security. Companies like Meta have long been able to treat GDPR fines as a cost of doing business, especially when early fines were relatively small. However, the penalties have grown substantially over the years. In 2023, Meta was hit with a record €1.2 billion fine for transferring EU user data to the United States in violation of GDPR rules. The €251 million fine for the 2018 breach is well below that figure, but it reflects a clear upward trajectory in penalty size. These larger fines, combined with more efficient decision-making by regulators, could force companies to prioritize data protection in the development of their products.
The fine also illustrates a shift in focus from punishing breaches after they happen to holding companies accountable for the way their systems are designed. Under GDPR, businesses are required to adopt “data protection by design and by default,” which means privacy safeguards must be built into products from the outset. The DPC found that Meta failed on this front, with the larger portion of the fine (€240 million) attributed to this specific failure. This approach targets not just the immediate consequences of a breach but also the underlying system flaws that allow it to happen. Regulators are signaling that fines will be larger if companies neglect to integrate privacy protections into the architecture of their platforms.
For Meta, these growing fines are becoming harder to ignore. When the GDPR first came into force, many critics predicted that companies would simply absorb the fines as part of their operating costs. For companies like Meta, which generate tens of billions of dollars annually, small fines posed little financial threat. However, as the fines increase, so do the reputational risks. Privacy has become a competitive issue in the tech sector, with companies like Apple using privacy as a selling point for their products. If Meta continues to face large fines and negative press over its privacy practices, it may face increased pressure from users, investors, and regulators alike.
But fines alone are not enough. Financial penalties address the harm after the fact, but they do little to protect users while the breach is happening. The real goal of GDPR is to prevent breaches from occurring in the first place, and this can only be achieved if regulators act faster and more decisively. If companies know that it will take five or six years for fines to be imposed, there is little urgency for them to change their behavior. Swift enforcement would create a stronger deterrent, as companies would have less time to prepare for the financial impact of a potential penalty.
The lesson from this case is that while higher fines are a step in the right direction, the pace of enforcement must also improve. Regulatory processes must become more efficient, and procedural disputes between supervisory authorities must be resolved more quickly. Without timely enforcement, even large fines lose their effectiveness. For every major fine that makes headlines, there are likely multiple smaller breaches that go undetected or unaddressed. Regulators must close this gap if GDPR is to remain a meaningful tool for protecting individual rights.
Meta, for its part, responded to the fine by emphasizing that the breach occurred in 2018 and that it took “immediate” action to fix the problem. But this is precisely the problem. Immediate action in 2018 did not prevent the years-long regulatory process that culminated in this fine. Companies must be compelled to act before a breach occurs, not after. The GDPR’s emphasis on “data protection by design” offers a framework for this, but enforcement delays risk undermining its impact.
The DPC’s fine is a reminder that, for companies like Meta, violations of data protection laws have financial consequences. But for users, the harm has already occurred, and the personal information of millions has already been exposed. While the rising amounts of GDPR fines are likely to incentivize companies to improve their privacy practices, they should not be viewed as a complete solution. Accountability must be swift, not retrospective. The DPC’s more recent approach, marked by a lack of objections from other regulators, suggests that enforcement is becoming more efficient. If this trend continues, future violations could be met with faster and larger penalties, creating stronger incentives for companies to prioritize privacy from the outset.
János Tamás Papp, PhD, is an assistant professor at Pázmány Péter Catholic University, Hungary, and a research expert at the National Media and Infocommunications Authority of Hungary. He earned his JD and PhD in Law at the Faculty of Law and Political Sciences of Pázmány Péter Catholic University, where he has taught civil and constitutional law since 2015 and became a founding member of the Media Law Research Group of the Department of Private Law. His main research fields are freedom of speech, media law, and issues related to freedom of expression on online platforms. He has a number of publications on social media and the law, including a book titled “Regulation of Social Media Platforms in Protection of Democratic Discourses.”