Fundamental rights and big data: striking a balance
Citizens and institutions will only be able to reap the benefits of big data if there is public trust in the technology, writes Ana Gomes.
Earlier this month, the European Parliament voted on a non-legislative resolution which I drafted about the fundamental rights implications of big data: privacy, data protection, non-discrimination, security and law-enforcement.
The report came about because Parliament's civil liberties, justice and home affairs committee felt the urgency of examining the ever-growing use of big data from the perspective of fundamental rights, in particular with regard to the implementation of the general data protection regulation (GDPR) and the police directive, which covers the processing of personal data when investigating or combating crime, both adopted last year.
The report attempts to define big data as the "collection, analysis and the recurring accumulation of large amounts of data, including personal data, from a variety of sources, which are subject to automatic processing by computer algorithms and advanced data-processing techniques using both stored and streamed data in order to generate certain correlations, trends and patterns (big data analytics)".
We acknowledge that the data revolution is here, propelled by advances in communication technologies and the ubiquitous use of electronic devices, which have led to the development of massive, ever-growing data sets in the hands of private and public entities, providing unprecedented insight into human behaviour.
This new reality entails significant risks with regard to the protection of fundamental rights as guaranteed by the EU Charter of Fundamental Rights and Union law - namely involving the protection of our data, our privacy, our right to equal treatment and security.
From the analysis of these risks, the resolution sets out a series of recommendations on digital literacy, ethical frameworks and guidelines for algorithmic accountability and transparency, fostering cooperation among authorities, regulators and the private sector, use of security measures like privacy by design and by default, anonymisation techniques, encryption, and mandatory privacy impact assessments, to name a few.
We want to stress that the immense opportunities of big data can only be fully enjoyed by citizens and institutions if there is public trust in the use of these technologies. Indeed, as reported by the Commission, more than 90 per cent of EU citizens say that personal information stored on their electronic devices should be accessed only with their permission and that the confidentiality of their communications online should be guaranteed.
It is precisely with the goal of reinforcing trust and security that the Commission published a proposal, on 10 January, for the ePrivacy regulation, which aims to align the rules for electronic communications with the standards of the GDPR, thereby replacing the ePrivacy directive, in place since 2002 and last updated in 2009.
This reform was essential to bring much needed legal certainty over online consent and to ensure that all electronic communications are considered confidential, both in terms of the content of the communication as well as the metadata created by any such communication.
I welcome the fact that the Commission opted for a regulation instead of a directive; this way, citizens and businesses benefit from a single set of rules and avoid the discrepancies that arise when member states transpose a directive. Another positive aspect is that the uniform application of the rules will be enforced by the national data protection authorities already competent to enforce the GDPR.
The European Parliament will now analyse the Commission's proposal and verify whether the right balance was struck between the protection of citizens' fundamental rights and the interests of businesses and innovation.
In my view, it is particularly important that the scope of the regulation includes over-the-top services and data from devices such as those referred to as belonging to the Internet of Things, to reflect the market reality.
Informed consent should be requested from all users in a non-burdensome manner, and rules on cookies, which in the past resulted in an overload of authorisation requests bothering internet users, must be improved and simplified. It is also paramount to ensure responsibility of manufacturers to put in place appropriate safeguards to protect data through privacy by design and default.
Clearly, our dependency on technology has fundamentally altered our communications, putting enormous power in the hands of private entities and state agencies. With this proposal, there is a new opportunity for the EU to answer calls from citizens for better regulation and control over the use of our data derived from electronic communications and to establish a legal standard which ensures a balance of power. We should not waste it.