The urgent case for a new ePrivacy law
A swarm of misinformation and misunderstanding surrounds the case for revising our rules on the confidentiality of electronic communications, otherwise known as ePrivacy. It’s high time for some honest debunking.
We are now at the business end of the 2014-2019 term of office for MEPs and the current College of European Commissioners. The European Parliament and the Council are working hard to finalise negotiations on a number of Commission proposals which are important to the future of data protection and digital rights generally. These include the proposals on cross-border access to electronic evidence and terrorist content on the web. ePrivacy, however, is the indispensable missing piece of the jigsaw.
Here is why.
1. The GDPR regulates data protection, not the privacy of communications.
The adoption of the proposed ePrivacy Regulation is crucial to protect the fundamental rights to privacy and the protection of personal data in the digital age. Progress must be made quickly to ensure legal certainty and a level playing field for market operators. It is also necessary to complete the EU’s framework for data protection and confidentiality of communications. Privacy and data protection are core values of the European Union, recognised in the European Convention on Human Rights and in the EU Charter of Fundamental Rights, which must be respected in all policies of the EU pursuant to the Treaty. The right to privacy, articulated in Article 8 of the ECHR and Article 7 of the EU Charter, is also protected in the national constitutions of many EU Member States. The GDPR, however, formally concerns only Article 8 of the Charter, the right to data protection. Hence the EU’s framework is not complete without ePrivacy reform.
2. GDPR is not enough to change the predominant business model of surveillance.
2018 will go down as the year the world realised that data is not secure and that its use brings disadvantages, not just benefits. Data is not secure even when processed by the most technologically advanced and financially powerful companies on the planet. The Facebook/Cambridge Analytica revelations are still under investigation in Europe and America, but they are only the tip of the iceberg: a sign of a much wider problem and a symptom of many more problems still unnoticed.
A vast ecosystem, financed by advertising, has developed over recent years for exploiting these especially revealing types of personal data without meaningful consent. It has grown in the legal grey zone between electronic communications services and information society services. Traditional electronic communications services – fixed line and mobile telecommunications providers – have long been subject to clear limitations. They cannot snoop on conversations over their networks. They can use metadata – that is, data revealing the location, time and persons involved in the communication – for marketing and ‘value added services’ beyond the provision of the communications service itself only with the consent of the user or subscriber. Companies in the category of information society services have been able to grow rapidly thanks to loopholes in our current legal environment: in essence, they can justify their data use practices without any obligation to seek consent for using communications data. There is a clear and urgent need to close these gaps and to strengthen the protection of privacy and security of online communications.
While news of data breaches, each affecting millions of users, indicates a serious issue in itself, it also reveals something about the underlying culture and business practice: the functions used illegally by attackers were often provided for supposedly legitimate use by data brokers and aggregators. The high impact of these breaches and misuses is the result of the standard surveillance, or tracking, business model that has taken hold across the entire internet in recent years.
This must stop. In order to create a digital environment in which users can feel safe, it is necessary that abusive, trust-corroding practices are robustly tackled with clear rules.
3. Not all communications providers are required to give people control over their most intimate data.
The mass of our most private information is collected by communications services, whether provided separately or as part of social media or other internet platforms. These are the digital fountains from which very precise information about our lives is sourced. Location data reveals our every movement: where we live, work and shop, which bars and restaurants we frequent, which political events we attend, which medical services we need. Metadata on when, how often and with whom we exchange messages and calls reveals our entire social position: who our close friends are, how near we are to our family members, how much time we spend at work or with private contacts. All this information, and much more, can be collected without ever looking at the content of the communications. This is the information over which people are entitled to expect the most control.
The main legislative tool for giving people this control in EU law has been to require companies to seek consent. Consent is susceptible to abuse, so the GDPR (Article 4(11)) has clarified the term to mean ‘any freely given, specific, informed and unambiguous indication of the data subject's wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her’. Additional conditions for valid consent are spelt out in Article 7. Furthermore, in order to process sensitive data lawfully, such consent must also be ‘explicit’ (Article 9(2)(a)).
EU data protection rules have in principle applied to such data processing, and the GDPR clearly applies in its entirety to any processing of data concerning people in the EU, or any monitoring of people in the EU, regardless of where the controller is based. But the GDPR, like the EU’s previous 1995 Data Protection Directive, does not address itself specifically to communications or communications data as such. The Commission decided to propose reforms in two steps, first GDPR, and then ePrivacy (see Recital 173 of the GDPR). Without the ePrivacy rules applying to all providers of electronic communications, these service providers may argue that there is no need to ask permission - consent - from individuals to use their most private information. This is precisely the uncertainty which must be avoided. We cannot put data controllers in a position where they are required to apply simultaneously a modernised data protection regulation alongside outdated and fragmented rules on communications data which were designed to regulate a market and communication technologies which have changed beyond recognition in the last 17 years.
We welcomed the Report on the ePrivacy proposal adopted by the European Parliament last year as an attempt to strike a good balance among the interests at stake. Progress in the Council has been more challenging. Exploring avenues for a possible compromise, therefore, seems inevitable. There are points, however, where the scope for negotiation is limited without compromising the principles of communications privacy themselves. For example, the proposed clauses being considered to allow “further processing of metadata for compatible purposes” risk, in the case of the most sensitive personal data, widening the existing loophole and allowing the circumvention of the high level of protection provided for by the GDPR. Metadata could be used for any purpose that the service provider itself judges to meet the ‘compatibility’ clause. In effect this could devalue the limited safeguards of the GDPR, creating considerable additional risk for end-users and coming very close to introducing “legitimate interest” as a legal basis for processing metadata by the provider. This is something that has consistently been ruled out in all previous legislative procedures on the matter, from 1997 to 2009, for good reasons which remain valid and, if anything, are more important than ever given the growing speed and effectiveness of data processing technologies.
4. ePrivacy will help fix, not exacerbate, digital market imbalances.
The existing ePrivacy rules required national transposing laws which are inevitably divergent in many ways; the GDPR, meanwhile, is directly applicable. Again, it is unfair and economically unsustainable to expect controllers providing electronic communications services to be subject to a patchwork of data rules, some EU-wide, some national.
And yet, there has been a concerted and very effective lobbying effort to paint the updating of confidentiality of communications rules as driving more business to the already immensely powerful tech giants which sit at the centre of the adtech ecosystem. The argument goes that only intermediaries will be in a position to seek the consent of individuals under the new rules, driving customers further away from smaller players. The opposite is true.
In our Opinion on Online Manipulation this year, we tried to explain how concentrated markets and the power of a handful of platform intermediaries have driven a wedge between advertisers, publishers and other content producers and consumers. The new ePrivacy rules, like the GDPR, are in part motivated by the need to repair these direct relationships - crucial to restoring trust in online services. Meaningful, specific, legitimate consent can only take place between parties that trust each other. It is difficult to see how generalised user consent can legitimately be hoovered up by a gigantic intermediary on behalf of all services on the other side of the platform. Indeed, high profile complaints have been tabled to this effect under the GDPR.
Older arguments opposing the GDPR are being rehashed: the argument, for example, that freedom to harvest personal data is needed in order to develop new technologies, such as Artificial Intelligence tools. It is posited that the lack of general data protection and privacy rules in the US and China will allow businesses in those countries to take advantage and increase their competitive edge. This argument is flawed in several ways. A technology developed under the control of a surveillance system such as exists in China cannot be the solution for a democratic society based on the rule of law and respect for fundamental rights. Meanwhile, momentum is clearly growing for general rules and limits in the United States, as in most countries around the world, as concern increases about the unlimited use and abuse of personal data.
Eyes on the prize.
Last week, the Chair of the European Data Protection Board was invited to testify before the US Senate, which held a hearing on data protection law. Andrea Jelinek was invited alongside US experts, such as the sponsor of the California Consumer Privacy Act and the president of the Center for Democracy and Technology, a non-profit organisation, and she was asked about the first experiences with enforcing the GDPR. While the Senate hearing may be seen in the context of the coming US elections, it also reflects an increasing anxiety in the US that its lack of comprehensive privacy legislation may become a disadvantage in economic terms. The legal developments in the EU and, this year, in the State of California, and the growing number of GDPR-inspired data protection laws in Brazil and around the globe, as well as the stream of revelations of misuse and insecurity of personal data, are putting pressure on lawmakers.
To sum up: at the absolute minimum, the ePrivacy Regulation must not lower the level of protection foreseen in the GDPR. Instead, we urgently need a higher level of protection, given the importance of the principle of confidentiality of communications and the particularly sensitive nature of metadata. Nevertheless, the co-legislators may find that not all potentially controversial issues are of the same level of importance, and may find compromises on many of them which ensure the necessary protection of fundamental rights. From a data protection and privacy perspective, there is no excuse for failing to agree on the ePrivacy Regulation under the present legislature and risking an even less certain outcome in the year or years to come.
It is inevitable that the profound and worrying deficiencies of the current data harvesting economy will become more and more visible to more and more people, in Europe and elsewhere. Missing the chance to ensure better respect for privacy now would be a disservice to the citizens who look to the EU to safeguard fundamental rights regardless of a handful of powerful vested interests.
This is the moment for Europe to consolidate its lead in developing AI solutions for a democratic society and to provide a sustainable model for other economies. The world’s data protection and privacy commissioners have discussed privacy, data protection and AI over the last couple of years, and will issue guidance next week. Europe should press on and reconcile technology with the principles of democracy and the rule of law. A strong ePrivacy Regulation, adopted as soon as possible, will create legal certainty and incentivise investment in that direction.