EU's planned data protection legislation raises questions about the nature of privacy

Colin Mackay asks whether the EU's new legislation on data privacy, the General Data Protection Regulation, will still be relevant by the time it is introduced.

By Colin Mackay

24 Aug 2015

Data privacy is a major political issue, and rightly so. Individuals should have a right to privacy, and the EU General Data Protection Regulation will, in theory, give them greater control over how their personal data is used.

Citizens will have to give explicit consent for their data to be collected. In addition, they will have to consent to the purpose for which their data will be used. As a further check, data controllers will need to prove that this consent was given.

While the principles behind the General Data Protection Regulation are genuinely well-intentioned, it is reasonable to ask whether they are realistic.

First, a great deal of personal data has already been collected, and the volume is growing exponentially. Practically every smartphone app asks for access to personal data, and it has become virtually routine for users to accept these terms.

Consent is given by inertia; individuals need to opt out, not opt in. To understand the scale of the problem, one need only look at the default privacy settings of the new Windows 10 operating system. Within a few years, Windows 10 is expected to be installed, mostly as a free upgrade, on one billion devices worldwide. Unless the settings are specifically changed, it will send Microsoft vast quantities of information about how you use your devices and what you look at. Presumably, data already collected in this way will not be covered by the new legislation.

The second issue is whether privacy and anonymity are still realistic goals. A recent study by the Massachusetts Institute of Technology (MIT) showed that it was possible to match individuals to anonymised credit card records using only four pieces of information.

This need not even be financial data; Twitter, Instagram and Facebook may provide sufficient information. This research highlighted how hard it is to make data anonymous, even where organisations work to strip it of personal information.
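The intuition behind the finding can be reproduced in miniature. The sketch below is a toy simulation with entirely made-up parameters, not the MIT study's dataset or method: it simply shows why four data points go so far. When each point can take thousands of possible values, the chance that two people's records share the same four points is vanishingly small, so a handful of observations pins most people down.

```python
import random

random.seed(0)

# Toy illustration (hypothetical data, not the MIT study's method):
# each simulated "person" leaves a trace of (shop, day) purchase records.
# We then check how often knowing just four of those points singles out
# exactly one person in the anonymised dataset.
NUM_PEOPLE, NUM_SHOPS, NUM_DAYS, RECORDS_EACH = 1000, 50, 90, 20

records = {
    person: {(random.randrange(NUM_SHOPS), random.randrange(NUM_DAYS))
             for _ in range(RECORDS_EACH)}
    for person in range(NUM_PEOPLE)
}

def candidates(known_points):
    """Return every person whose trace contains all the known points."""
    return [p for p, trace in records.items() if known_points <= trace]

unique = 0
for person, trace in records.items():
    known = set(random.sample(sorted(trace), 4))  # four observed data points
    if candidates(known) == [person]:
        unique += 1

print(f"{unique / NUM_PEOPLE:.0%} of simulated people are pinned down by 4 points")
```

Note that nothing in the simulated records is a name or account number; uniqueness alone is what makes the "anonymised" trace identifying.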

The head of the MIT study, Yves-Alexandre de Montjoye, observed: "We're building this body of evidence showing how hard it actually is to anonymise large sets of data like credit cards, mobile phones, and browsing information. We really need to think about what it means to make data truly anonymous and whether it's even possible."

Finally, there is the question of consent. In highly controlled situations, such as clinical trials or social science studies, researchers have the time to explain what happens to data and how it will be used. Individuals know what they are agreeing to and can give informed consent.

In addition, the data is collected with the concept of privacy in mind. However, data, once collected, lives on. How can individuals consent to secondary, unforeseen uses of their data?

Combined with other existing data, information gathered for one purpose may be sufficient to identify people as individuals. Technological advances, such as big data, may allow novel uses and applications that cannot currently be imagined. There is no real way to future-proof your privacy.

What does this mean for data privacy? Is it time for society to accept that the increasing generation of metadata means that there is no realistic data privacy? Perhaps legislators should take a different approach. Should EU citizens be made to opt in to personal data collection rather than be forced to opt out?

Alternatively, there may be technological solutions. The MIT study highlights tools that automatically constrain the data that you share.

Metadata is here to stay; perhaps data privacy is not. As policymakers and corporations realise the potential of big data, particularly in controlling the ever-spiralling costs of healthcare, the problem may resolve itself in ways as yet unforeseen.