BSI briefly reviews the impact of the General Data Protection Regulation (GDPR) since it came into full effect on 25 May 2018.
The introduction of the GDPR was widely seen as a hugely positive step for individual rights in the new digital paradigm. The Regulation was introduced to allow individuals to reclaim control over the use of their personal data and to enable citizens to better hold those processing and handling their data to account. Citizens and regulators were presented with improved tools, including administrative fines, if individual rights were not taken seriously. The GDPR was also supplemented by various pieces of country-specific legislation for EU Member States to modernise their data protection regimes, among them the Irish Data Protection Act 2018 and the UK Data Protection Act 2018.
A lot has happened in the intervening two years since the GDPR’s introduction, some of which makes the rush to “comply with” the GDPR pale in comparison. The world could never have foreseen the true complexities and implications of, for example, Brexit or the unprecedented Coronavirus pandemic (to cherry-pick two events that have dominated global headlines).
The run-up to 25 May 2018 saw organizations race to complete implementation programmes and make changes to their IT and data management processes. “GDPR compliance” became the aim. European regulators pledged to use their significantly strengthened powers sparingly, but warned that there would be no “grace period” (Information Commissioner's Office [ICO], 2017). Two years on, what has the true impact of the Regulation been on the lives of data subjects?
This blog does not cover everything that has happened in the intervening two years but rather discusses some carefully chosen highlights (both positive and negative) and looks towards the future of data protection in the post-GDPR era.
Enforcement of the GDPR
The first fine issued under the GDPR came in October 2018, when the Portuguese Data Protection Authority fined a hospital €400,000 for failure to comply with the Regulation. The first significant fine came from France in January 2019, when the French regulator – the Commission Nationale de l'Informatique et des Libertés (CNIL) – fined Google €50 million, which remains the largest fine in European data protection history. Whilst the size and scale of the fine are significant, what interested data protection and privacy experts at the time was that the CNIL found Google had failed to comply with the GDPR’s consent and transparency requirements.
The UK’s Information Commissioner’s Office (ICO) swiftly followed suit and stated its intention to fine British Airways and Marriott Hotels for respective breaches, which received widespread media attention. The UK authority also issued a cease-processing notice against AggregateIQ Data Services (AIQ), better known for its part in the Cambridge Analytica incident – the first enforcement notice of its kind under the GDPR. The notice ordered AIQ to stop processing personal data belonging to UK and EU citizens. Taken together, the ICO’s enforcement action could be considered the most drastic since the GDPR’s inception, not least in terms of its size and variety. To date, however, no fines have been issued. The BA and Marriott cases were due to be reviewed in 2020 to fully determine the extent of sanctions, but the UK regulator has deferred any conclusive action for now, due in part to the COVID-19 pandemic.
Overall, analysis of the 28 EU Member State regulators shows that the German, Czech, and Hungarian regulators have administered the most penalties in the past two years, with the most recent fine issued by the Swedish authority against Google for violations of data subject rights.
It is often said that data is the new oil, and many of the “data rich” companies have located their EU headquarters in Ireland. This has proved a challenge for the Data Protection Commission (DPC). However, as of this week (week commencing 18 May 2020, on the eve of the GDPR’s second anniversary), the DPC has issued its first fine, sanctioning the Irish Government’s child and family agency, Tusla, €75,000 for a series of security breaches. In addition, the DPC announced late on Friday 22 May progress in some of its ongoing regulatory investigations, including:
- Submission of a draft decision on its Twitter inquiry to the European Data Protection Board (EDPB)
- Preliminary draft decisions have been sent to WhatsApp and Instagram for their response
- An investigation into Facebook is now in the decision making phase
- A decision to fine Tusla for a second time in as many weeks
Dublin has become a hub for many of the largest data-heavy industries in the world, and the DPC has the unenviable task of monitoring and regulating these firms for GDPR compliance. There is a constant stream of criticism levelled at the DPC by regulators, privacy advocates, citizen rights groups, and individuals for its perceived inaction. However, this “inaction” must be weighed against the more than twenty open statutory investigations into big-tech companies for various complaint-driven and “own-volition” concerns.
Contrasting private with public sector enforcement, the DPC has so far been able to take a much more proactive approach against controllers in the Irish public service. In 2019, the DPC scrutinized the Department of Employment and Social Protection (DEASP) and its introduction of a public services card. The DPC found that there was no legal basis for requiring the card in order to access many public services. The DEASP strongly contests this decision and is currently appealing it to the High Court.
Other notable events
One of the most notable Court of Justice of the European Union (CJEU) judgements in relation to the GDPR is the Planet49 case. In it, the CJEU finally ended any notion that consent requirements under the e-Privacy Directive diverged from GDPR standards. Accordingly, cookie notices must require data subjects to actively “opt in” before cookies are placed on a user’s machine.
The EU-US Privacy Shield has come under the ever-increasing scrutiny of the EU Parliament and Commission, but any changes to that framework are likely to come about because of the Schrems case – an ongoing dispute between Max Schrems, Facebook and the Data Protection Commissioner. The failure to bring this dispute to a close means that many continue to question the validity of the Standard Contractual Clauses (SCCs) used by many organizations to transfer personal data from the EU to the US and other non-EU jurisdictions. The CJEU is to deliver its decision in July 2020: a decision that is eagerly awaited and could have a massive impact on companies that transfer data outside the EU.

A positive development arising from the GDPR is the European Data Protection Board’s (EDPB) ever-increasing coordination and cooperation. One tangible benefit is its repertoire of publicly available knowledge-base resources and useful tools for professionals and citizens. Worthy of note are the first standard contractual clauses for contracts between controllers and processors (Article 28, GDPR), drafted at the initiative of the Danish supervisory authority and approved by the EDPB. Previously, SCCs had to be drafted and approved by the EU Commission, and any amendment required the same lengthy process.
Notwithstanding the focus on reactive regulation, the COVID-19 pandemic is a perfect example of how data protection and privacy matters are viewed by innovators, regulators, governments, and citizens in real-time.
The GDPR provides a clear lawful basis to enable employers and public health authorities to process personal data in the context of an epidemic such as COVID-19. Recital 46 refers to the lawfulness of certain processing aimed at vital or public interest, “including for monitoring epidemics and their spread”. Provisions in both Article 6 and Article 9 also facilitate the collection, use and necessary sharing of personal data related to health in the context of a public health emergency.
There is a global push to develop and roll out “contact tracing apps” to inform and support public health policy responses. However, the use of modern and evolving technologies to track and analyse sensitive personal data will test the balance between privacy and societal benefits. Consideration of privacy and data protection obligations will be vitally important to ensure that any privacy impact is minimized.
Data Controllers are also advised to ensure that the fundamental principles of data protection law are adhered to and that the personal data of the affected subjects is protected. Specifically, Data Controllers should ensure that the processing of personal data:
- Has a clear lawful basis
- Is transparent
- Has a specific and explicit purpose
- Is limited to what is necessary
- Is kept for no longer than is necessary
- Is processed in a manner that ensures the security of the data.
Responding to, mitigating, and recovering from this global crisis will require extraordinary strength, significant resources, solidarity, and a collective effort to prove just how resilient we are as a global society and economy. Data protection rights may seem a secondary or even tertiary objective given the extent of the task facing the world in facing down a global pandemic. Nevertheless, to maintain the promise of the GDPR, all involved in app development should ensure privacy by design is embedded from the outset.
Data Protection Impact Assessments (DPIAs) for this technology require consultation with regulators due to the nature of the risks presented to data subjects. For example, Article 22 of the GDPR provides that individuals have the right not to be subject to automated decision-making which produces legal effects concerning them. It is therefore important to consider whether the data derived from these apps will be used to detain people, as provided for under section 38A of the Health Act 1947.
In the interest of transparency, regulators should be the voice of the data subject and ensure that they are consulted in accordance with Article 36, GDPR with a view to exercising their powers under Article 58, GDPR where necessary. Critically, to enable public trust and buy-in to technology solutions and apps for contact tracing, symptom tracking, or quarantine management, DPIAs should be made publicly available including as much information as possible about the technology.
What is clear is that although the GDPR provides for pandemic-specific processing of personal data, it is unclear how regulators will apply enforcement actions where required – not least to ensure that, as countries begin to relax heavy lockdowns, long-term impacts on privacy are not normalized.
The first two years of the GDPR were not what was expected, and the level of enforcement anticipated by many has not yet materialized. That said, these first two years of the GDPR have seen increased regulatory action and an acute awareness and expectation amongst the general public for privacy and data protection. The continued surge of technology and innovation enables humankind to better serve ourselves, respond to crises and further our development as a society and species. Against this backdrop, privacy and data protection matter now more than ever.
The findings of the DPC’s ongoing statutory inquiries are long awaited and will help reveal the true extent of the GDPR’s reach. However, the DPC’s ability to resource complex, legally challenging and technically intricate investigations is often hampered by funding and resourcing constraints. Until those investigations conclude, the true impact of the GDPR cannot be fully assessed.
The GDPR presents the opportunity for the EU data protection authorities to be more proactive by addressing the incursion into our individual affairs by State and private sector actors to safeguard our individual liberties. COVID-19 has resulted in a boom of technology “solutions” to the pandemic. The apps that many governments will officially endow or endorse could still present difficulties to data protection rights. These rights should be carefully balanced against the interests of the public good based on necessity and proportionality.
The Chair of the European Data Protection Board (EDPB) Andrea Jelinek, said “Data protection rules (such as GDPR) do not hinder measures taken in the fight against the coronavirus pandemic. However, I would like to underline that, even in these exceptional times, the data controller must ensure the protection of the personal data of the data subjects.”
Regulators continue to promise that investigations, enforcement actions and sanctions will continue apace, but fundamentally, the GDPR provides a springboard for the world to continue the careful balancing of data protection and privacy rights with societal benefits. An improved application of rights-based approaches to innovation will be the true mark of the GDPR’s success. As the GDPR matures and similar legislation takes shape all over the world, data protection rights may continue to appear as a secondary or even tertiary objective given the extent of the real tasks facing the world. Equally, safeguarding fundamental rights need not obstruct the critical tasks of governments and public authorities or hinder the innovation and ingenuity that drives our economic and technological advance. As the GDPR turns two years old on 25 May 2020, its continued application and enforcement will support societal resilience, strengthen fundamental rights, and empower citizens in the EU and beyond.
This blog was written by: