
Personal Intelligence: A Gateway Drug to All-knowing AI?—Part II.

Universal opt-out mechanisms such as Global Privacy Control (GPC) have also appeared in recent years. GPC is a specification that makes it easier for users to communicate their privacy preferences while navigating the Internet. In essence, users set their preference once in the browser, which then transmits it as a signal to every website they visit, indicating whether they consent to the use of cookies, data sharing, data sales, and targeted advertising.

Ideally, if a user opts out of the use of their data through GPC, the website will recognize the signal and cease all activities that involve the sale or sharing of personal data. Many web browsers, browser plug-ins, and tools support GPC, including the Firefox, Brave, and DuckDuckGo browsers and extensions such as Privacy Badger.
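Under the hood, the GPC signal is a simple HTTP mechanism: a participating browser attaches a `Sec-GPC: 1` header to its requests (and exposes the same preference to scripts as `navigator.globalPrivacyControl`). The Python sketch below is purely illustrative, not taken from any particular implementation, and shows how a site could detect the signal:

```python
def honors_gpc(headers: dict) -> bool:
    """Return True if the request carries an active GPC opt-out signal.

    Per the GPC proposal, a participating browser attaches the header
    "Sec-GPC: 1" to its requests; any other value (or its absence)
    means no opt-out preference was expressed.
    """
    return headers.get("Sec-GPC") == "1"

# A site honoring the signal would branch on this check, for example
# skipping third-party trackers and targeted-ad scripts for opted-out
# visitors.
print(honors_gpc({"Sec-GPC": "1"}))  # → True
print(honors_gpc({}))                # → False
```

The crucial point, as discussed below, is that nothing technically forces a website to run such a check: honoring the signal depends on the operator's goodwill or on local regulation.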

Regarding GPC, it should be noted that organizations subject to the GDPR are not legally bound to honor such universal opt-out mechanisms. While the GDPR provides that “natural persons should have control over their own personal data” (Recital 7), the GPC signal is not currently recognized as a formal expression of the data subject’s will. Although some regulations, such as California’s CCPA, already make it mandatory to process and respect these signals, adoption of the technology is far from universal.

As for third-party cookies, the reluctance of the service providers involved to retire the method is evident. Google, for example, has repeatedly postponed its phase-out, most recently by years, to 2025.

It is becoming increasingly well known what kinds of harmful abuses uncontrolled online data collection enables. However, efforts to eradicate such practices often face strong resistance, and it is common for the parties involved in data collection to look for loopholes or to delay necessary action until the last minute.

It is important to point out that third-party cookies are only one of many such methods. Mobile apps that access call logs, calendar entries, or worse, the complete file system of a device, as well as attempts at cross-device tracking, are at least as prominent in our daily lives.

Importantly, none of the technologies presented so far are inherently bad. As mentioned above, the original purpose of cookies was to improve the user experience (for example, so that you don’t have to set your preferred display language every time you visit a website). Likewise, in the case of mobile apps, much of the access granted is completely justified (a personal assistant would struggle to function, for example, if it couldn’t see your calendar entries).

At this point, the question may rightly arise: how does all this relate to Artificial Intelligence, and especially to the new generation of personal assistants mentioned in the first part of this post? The key is the trajectory that technologies designed to help users tend to follow: sooner or later, their main purpose becomes generating profit, often at the expense of their users’ privacy.

Announced at WWDC 2024, Apple Intelligence is an innovation in more ways than one. Looking behind the re-branding, Apple is essentially incorporating a Generative Artificial Intelligence (GenAI) feature set into its pre-existing personal assistant, Siri. The idea is that when communicating with Siri, the assistant will have access to all the data on the system (photos, calendar entries, private conversations, etc.), which the company’s marketing campaign calls “personal context”. In addition, it will be able to interact with applications on behalf of the user.

The company promises that some of the necessary processing will be done directly on the devices running the system, and some, when needed, on the company’s dedicated servers. It was also stated at the presentation that it will be possible for external experts to verify the privacy claims.

Of course, no one knows today which of these promises will come true and which will not. But that is not the point. What we can already see is that the leading players in the mobile market are pushing hard to make AI-based services part of everyday life. Alongside Apple, Samsung, another major player, has built much of its high-end device lineup this year around GenAI. Given their market position, these companies will also have the greatest influence on what becomes the new standard on the devices where we live most of our digital lives.

The emerging standard is that we voluntarily hand over our most personal data for analysis, processing, and use. Of course, this is not yet the present, but the very near future.

Since the launch of ChatGPT, the big companies have been announcing one model after another, each touted as the market leader. Even now, there is plenty of concern about the provenance of the data collected: a group of online news outlets is preparing to sue Microsoft and OpenAI for allegedly using millions of their articles to train their models, in violation of the publishers’ copyright. There are, of course, plenty of concerns beyond this. Some are urban legends, such as the notorious theory that the Facebook app secretly eavesdrops on users, but several are genuine.

Based on current trends, there is a high risk that data collection for GenAI will follow the same path we have already seen with cookies and ad targeting. The collection of people’s personal data has not fundamentally changed with the advent of new technologies, especially generative AI, but it has become more opaque. We have less control over deleting or correcting our data, and because GenAI is more data-intensive, its developers are driven to collect even more.

The point of the analogy is that the idea of controlling our digital footprint in the online space only became widespread years after an entire industry had been built on its almost uncontrolled use. The question is: what will be the consequences of voluntarily making our most personal data available to companies through cloud services, as a natural part of our lives?

It should be noted that much of the earlier data collection relied on implicit methods: it was largely a matter of piecing together data traces left scattered across the online space. By contrast, the data that Apple Intelligence can send to the cloud could previously be collected in such quantity and quality only by spyware. Of course, no one is claiming that Apple is abusing it; the presentation stressed the opposite. The problem begins when we simply get used to sharing such data, because it is almost certain that it is only a matter of time before services appear whose only real purpose is to collect it.

There is perhaps only one criticism to level at Apple here: why is there no mention of letting users decide whether they want to use the new services at all? Again, this can only set the stage for a harmful practice that accustoms users to having no real choice in such situations.

To sum up, the GenAI fever that started a few years ago has now arrived on our mobile phones and is demanding our most personal content in order to “provide a better user experience”. The biggest problem is that users do not seem to have any way of opting out even if they want to. Perhaps it is also time to ask ourselves: if we grow accustomed to such unrestricted transfer of personal data to big companies, when will we see the first applications and companies that use this data, for good or not-so-good purposes? Most likely… soon.


István ÜVEGES is a researcher in Computer Linguistics at MONTANA Knowledge Management Ltd. and a researcher at the HUN-REN Centre for Social Sciences, Political and Legal Text Mining and Artificial Intelligence Laboratory (poltextLAB). His main interests include practical applications of Automation, Artificial Intelligence (Machine Learning), Legal Language (legalese) studies and the Plain Language Movement.
