At the end of August, the European Commission's new legislation, the Digital Services Act (DSA), became applicable to Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs). The DSA will bring significant changes to the regulation of online platforms in a number of areas, some of which are aimed at promoting more conscious user attitudes. The biggest online platforms have announced, in various communications, the new measures they are introducing to comply with the regulation.
In June, as part of its push toward transparency, Meta released a lengthy report on how its algorithms work across Facebook and Instagram. Meta has expressed support for the regulation's goals of transparency, accountability, and user empowerment, and has assembled a large team to adapt its systems, increase transparency, and offer more options to users. The company is expanding ad transparency, making its apps safer for teens, and releasing insights into its AI systems. New tools for researchers will provide comprehensive access to publicly available content, users in Europe can now customize their content experiences, and reporting tools for illegal content are more accessible.
Google has also modified its trust and safety measures, introducing initiatives that match the DSA's vision, such as the Priority Flagger program, appeal processes for YouTube content removals, restrictions on personalized ads for users under 18, and regular transparency reports on Community Guidelines enforcement. Tailoring its transparency and content moderation efforts to the DSA, Google is also expanding ads transparency, enhancing data access for researchers, and improving its reporting and appeals processes. The company has unveiled a new Transparency Center and plans to expand its transparency reports.
TikTok has published a report on how its recommender systems work and has also made some changes to the platform. From now on, TikTok accounts of users under 16 are set to private by default, and their content cannot be recommended in For You feeds. Users in Europe aged 13-17 will also no longer see personalized advertising based on their activities on or off TikTok. Users already have control over the ads they see and can toggle personalized ads on or off in their settings. Users in Europe can now turn off personalization so that their For You and LIVE feeds instead recommend both locally relevant and globally popular videos, rather than content based on their personal interests. Similarly, when using non-personalized search, users will see results made up of popular content from their region and in their preferred language. Users' Following and Friends feeds will show creators they follow in chronological order only.
Snapchat will also give users in the EU the option to opt out of personalized feeds on its Discover and Spotlight pages and has published reports on how it ranks the posts in these feeds. The company has committed to providing users with more information about why their posts or accounts have been removed and will give them the tools they need to appeal the decision.
Among the above, I would like to highlight the provisions that give users the right to make certain decisions, whether related to the fate of their data, the study of transparency information published by platforms, or, most importantly, their right to choose how content appears in their news feeds. Online platforms must provide direct and easy access to a function that enables users to select and modify their preferred option at any time. In other words, users must be given information explaining why particular content is recommended to them, that is, which criteria are the most significant from the point of view of the recommendations; and if the platform uses several such recommendation methods, it must ensure an easily accessible, direct, and easy-to-use option for users to choose the version they prefer at any time. Article 38 of the Regulation also defines stricter rules for very large online platforms and very large search engines: in connection with each of their recommender systems, they must provide at least one option in which the platform does not organize or offer the displayed information based on personal data collected from users. However, the point is that these are all just options.
Opt-out options on social media platforms are often positioned as giving users more freedom and control over their online experiences. In reality, however, the opt-out model has inherent flaws that make it less than effective in guaranteeing true freedom. Firstly, the very nature of an opt-out system starts from the premise that users are automatically enrolled in a certain feature or service. This means that the default setting may involve the collection of data or exposure to certain content unless a user actively takes steps to change it. Given that many users may not be technologically savvy or may overlook the importance of certain settings, the default option can lead to inadvertent consent.
Additionally, the opt-out model assumes that users are informed about all the implications of the settings they are enrolled in. But with the rapid evolution of technology and the intricate ways in which data is used, even well-intentioned users might not fully grasp what they are opting out of. Social media platforms, with their extensive terms of service and privacy policies, often contain language that is dense and difficult for the average user to understand.
Furthermore, even if a user successfully opts out of one feature, the interconnected nature of social media means that their data or preferences might still be influenced by other aspects of the platform. For instance, opting out of targeted ads doesn’t necessarily mean a user’s activities aren’t still being tracked and used in other ways. Lastly, freedom isn’t just about the ability to opt out; it’s also about the ability to have meaningful choices. An opt-out system presents a binary choice, often without giving users a spectrum of options that might better suit their individual needs and preferences.
Finally, we also have to consider whether users really want to use a given platform without certain functions. The personalized timeline is fundamentally useful: it spares users a great deal of information they would not be interested in anyway, and in return it may be worth allowing some of their data to be used for this purpose. The regulation of cookies offers a cautionary parallel: the user experience has become much worse than before, and many people grant access to far more data simply to dismiss the many intrusive pop-up windows. In addition, excessive reliance on consent can lead to consent fatigue, which makes those concerned insensitive to the risks of activities involving the processing of their personal data.
In conclusion, while opt-out options may seem like a step towards greater freedom for internet users, they often fall short of providing genuine control; or the control they do offer, even if it is perhaps more lawful, is less desired by users.
János Tamás Papp JD, PhD is an assistant professor at Pázmány Péter Catholic University, Hungary, and a legal expert at the Department of Online Platforms of the National Media and Infocommunications Authority of Hungary. He has taught civil and constitutional law since 2015 and became a founding member of the Media Law Research Group of the Department of Private Law. He earned his JD and PhD in Law at the Faculty of Law and Political Sciences of the Pázmány Péter Catholic University. His main research fields are freedom of speech, media law, and issues related to freedom of expression on online platforms. He has a number of publications regarding social media and the law, including a book titled "Regulation of Social Media Platforms in Protection of Democratic Discourses".